CN115509122A - Online optimization control method and system for unmanned line marking vehicle based on machine vision navigation - Google Patents

Online optimization control method and system for unmanned line marking vehicle based on machine vision navigation Download PDF

Info

Publication number
CN115509122A
CN115509122A CN202211451943.7A CN202211451943A CN115509122A CN 115509122 A CN115509122 A CN 115509122A CN 202211451943 A CN202211451943 A CN 202211451943A CN 115509122 A CN115509122 A CN 115509122A
Authority
CN
China
Prior art keywords
vehicle
deviation
error
rate
waterline
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211451943.7A
Other languages
Chinese (zh)
Other versions
CN115509122B (en
Inventor
辛公锋
石磊
龙关旭
王福海
潘为刚
王目树
李一鸣
秦石铭
张文亮
靳华磊
张泽军
康超
李帆
胡朋
潘立平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Innovation Research Institute Of Shandong Expressway Group Co ltd
Original Assignee
Innovation Research Institute Of Shandong Expressway Group Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Innovation Research Institute Of Shandong Expressway Group Co ltd filed Critical Innovation Research Institute Of Shandong Expressway Group Co ltd
Priority to CN202211451943.7A priority Critical patent/CN115509122B/en
Publication of CN115509122A publication Critical patent/CN115509122A/en
Application granted granted Critical
Publication of CN115509122B publication Critical patent/CN115509122B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B11/00Automatic controllers
    • G05B11/01Automatic controllers electric
    • G05B11/36Automatic controllers electric with provision for obtaining particular characteristics, e.g. proportional, integral, differential
    • G05B11/42Automatic controllers electric with provision for obtaining particular characteristics, e.g. proportional, integral, differential for obtaining a characteristic which is both proportional and time-dependent, e.g. P. I., P. I. D.
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/34Smoothing or thinning of the pattern; Morphological operations; Skeletonisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses an unmanned line marking vehicle online optimization control method and system based on machine vision navigation, belonging to the field of system control, and the unmanned line marking vehicle online optimization control method based on machine vision navigation comprises the following steps: collecting a water line image of a road surface; carrying out waterline detection to obtain a waterline; based on the detected waterline, selecting a monitoring point of the current position of the vehicle, calculating the running deviation and the running deviation change rate of the vehicle, selecting a monitoring point of the predicted position of the vehicle, and calculating the predicted running deviation and the predicted deviation change rate of the vehicle; designing nonlinear incremental PID control according to the predicted running deviation and the predicted deviation change rate of the vehicle; learning parameters of nonlinear PID control increment in the region; the method and the system realize the machine vision real-time navigation of the line marking vehicle, and design the feedforward controller and the nonlinear increment PID controller according to the tracking error and the error change rate, so as to realize the line marking vehicle path tracking control based on the machine vision navigation.

Description

Online optimization control method and system for unmanned line marking vehicle based on machine vision navigation
Technical Field
The invention belongs to the field of system control, and particularly relates to an unmanned line marking vehicle online optimization control method and system based on machine vision navigation.
Background
In the construction process of the highway marking, a waterline is generally manually drawn on the road surface, and the existing method is generally constructed along the waterline manually. The system adopts a machine vision technology, a computer automatically identifies a waterline, and then controls a marking device to carry out marking construction along the waterline, namely navigation construction based on machine vision, and a construction schematic diagram is shown in figure 1.
The existing system has the following problems in marking vehicle control based on machine vision navigation:
the existing advanced image processing method is long in time consumption, real-time control of a marking vehicle is difficult to meet, only the gray value of each pixel is concerned, and the overall planning of foreground and background information is difficult;
the existing control method based on the model is not suitable, because the structure of the line marking vehicle is too complex, an accurate motion model is difficult to obtain, and the control law of the line marking vehicle cannot be designed by adopting the method based on the model;
the vehicle navigation adopts an image navigation mode, only the degree of deviation of a construction vehicle from a planned route can be known, accurate vehicle global coordinates are difficult to provide, and the output generally given by vehicle modeling is the position information of the vehicle;
for such problems, model-free control methods such as PID control and fuzzy control are generally adopted, but such methods rely too much on manual experience and are difficult to optimize, and most optimization methods require accurate models of controlled objects.
Disclosure of Invention
In order to solve the problems in the prior art, the invention discloses an unmanned scribing vehicle on-line optimization control method based on machine vision navigation, which realizes the machine vision real-time navigation of a scribing vehicle, designs a feedforward controller and a nonlinear PID controller according to a tracking error and an error change rate, designs an on-line learning method of parameters of the nonlinear PID controller, and realizes the scribing vehicle path tracking control based on the machine vision navigation.
The invention adopts the scheme that:
the on-line optimizing control method of the unmanned line marking vehicle based on the machine vision navigation comprises the following steps:
s1, collecting a water line image of a road surface;
s2, carrying out waterline detection based on the waterline image to obtain a waterline;
s3, based on the detected waterline, selecting a monitoring point of the current position of the vehicle, and calculating the running deviation of the vehicle
Figure 976467DEST_PATH_IMAGE001
Rate of change of deviation from running
Figure 300132DEST_PATH_IMAGE002
Selecting the monitoring point of the predicted position of the vehicle, and calculating the predicted running deviation of the vehicle
Figure 17553DEST_PATH_IMAGE003
And predicting the rate of change of deviation
Figure 819287DEST_PATH_IMAGE004
S4, according to the predicted operation deviation of the vehicle
Figure 305763DEST_PATH_IMAGE005
And predicting the rate of change of deviation
Figure 800329DEST_PATH_IMAGE006
Obtaining a predicted feedforward control quantity
Figure 5045DEST_PATH_IMAGE007
Based on running deviation of the vehicle
Figure 610470DEST_PATH_IMAGE008
Rate of change of deviation from operation
Figure 685874DEST_PATH_IMAGE009
Obtaining an incremental nonlinear PID control law for a vehicle
Figure 85762DEST_PATH_IMAGE010
S5, controlling increment of nonlinear PID in the region
Figure 512195DEST_PATH_IMAGE011
Parameter (2) of
Figure 384293DEST_PATH_IMAGE012
Figure 579782DEST_PATH_IMAGE013
Figure 681731DEST_PATH_IMAGE014
And performing online learning.
The step S2 specifically includes the following steps:
s201, aiming at pixel points on the water line image
Figure 595460DEST_PATH_IMAGE015
Performing gray scale stretching, wherein the stretching formula is shown as formula (1):
Figure 542687DEST_PATH_IMAGE016
wherein,
Figure 858262DEST_PATH_IMAGE017
Figure 865533DEST_PATH_IMAGE018
Figure 266558DEST_PATH_IMAGE019
is a function of the stretch factor and is,
Figure 17476DEST_PATH_IMAGE020
Figure 921978DEST_PATH_IMAGE021
represents an image
Figure 834571DEST_PATH_IMAGE022
Go to the first
Figure 191734DEST_PATH_IMAGE021
Columns;
Figure 474905DEST_PATH_IMAGE023
is shown as
Figure 499492DEST_PATH_IMAGE024
Go to the first
Figure 848565DEST_PATH_IMAGE025
Gray values of the column pixel points; after stretching the gray scale of the water line image, selecting
Figure 240495DEST_PATH_IMAGE026
A personal HARR-like feature;
s202, aiming at the first step
Figure 333215DEST_PATH_IMAGE027
Generic HARR characteristics
Figure 477889DEST_PATH_IMAGE028
Figure 263442DEST_PATH_IMAGE029
The calculation method is as shown in formula (2):
Figure 595198DEST_PATH_IMAGE030
(2)
wherein,
Figure 491610DEST_PATH_IMAGE031
is as follows
Figure 219351DEST_PATH_IMAGE032
The sum of pixels of the individual HARR-like feature waterline regions,
Figure 910226DEST_PATH_IMAGE033
is as follows
Figure 729278DEST_PATH_IMAGE034
The sum of pixels of a road surface area with a HARR-like characteristic,
Figure 163801DEST_PATH_IMAGE035
Figure 17488DEST_PATH_IMAGE036
gray value based on stretched gray image
Figure 879265DEST_PATH_IMAGE037
Calculating to obtain;
after the similar HARR characteristics of each pixel point are described, normalizing each similar HARR characteristic, wherein the normalization formula is formula (3):
Figure 716771DEST_PATH_IMAGE038
wherein,
Figure 689406DEST_PATH_IMAGE039
Figure 663178DEST_PATH_IMAGE040
in order to normalize the HARR characteristics of the sample,
Figure 695856DEST_PATH_IMAGE041
Figure 755079DEST_PATH_IMAGE042
respectively the average value of the gray levels and the average value of the square of the gray levels in the detection window;
constructing pixel points based on normalized HARR-like characteristics
Figure 62564DEST_PATH_IMAGE043
Feature vector of
Figure 642841DEST_PATH_IMAGE044
Figure 112000DEST_PATH_IMAGE045
S203, aiming at each pixel point
Figure 392940DEST_PATH_IMAGE046
Feature vector of
Figure 504115DEST_PATH_IMAGE047
Performing identification to determine pixel points
Figure 186900DEST_PATH_IMAGE048
If the characteristic vector accords with the linear characteristic, setting 1 for the pixel point, and otherwise, setting 0 for realizing the binaryzation of the gray level image to obtain a binary image B;
performing expansion corrosion treatment on the binary image B to obtain a new binary image
Figure 561381DEST_PATH_IMAGE049
And obtaining an edge image E of the waterline by adopting a formula (6):
Figure 595196DEST_PATH_IMAGE050
Figure 713325DEST_PATH_IMAGE051
representing a new binary image
Figure 516196DEST_PATH_IMAGE052
To (1) a
Figure 61578DEST_PATH_IMAGE053
Go to the first
Figure 582689DEST_PATH_IMAGE054
Pixel values of the columns;
Figure 238929DEST_PATH_IMAGE055
represents the edge image E
Figure 896307DEST_PATH_IMAGE056
Go to the first
Figure 606731DEST_PATH_IMAGE057
Pixel values of the columns;
s204, identifying a straight line in the edge image E by adopting a HOUGH conversion method:
from new binary images
Figure 83979DEST_PATH_IMAGE058
Where the coordinates of the pixel having the pixel value of 1 are found to be
Figure 809490DEST_PATH_IMAGE059
Obtained by HOUGH conversion
Figure 321374DEST_PATH_IMAGE060
Obtained by the formula (7)
Figure 474138DEST_PATH_IMAGE061
A corresponding linear expression;
Figure 438683DEST_PATH_IMAGE062
wherein,
Figure 967884DEST_PATH_IMAGE063
and
Figure 599854DEST_PATH_IMAGE064
radius and angle detected by HOUGH transformation;
a group of
Figure 923519DEST_PATH_IMAGE065
Representing a straight line, given a set
Figure 640939DEST_PATH_IMAGE066
If at all
Figure 708252DEST_PATH_IMAGE067
Satisfies the expression of the formula (7)
Figure 929149DEST_PATH_IMAGE068
On the straight line represented by formula (7);
s205, eliminating interference straight lines;
the interference line removing method specifically comprises the following steps:
s205-1: selecting a plurality of straight lines with the longest length, wherein the length of the straight lines meets the length of a threshold value;
s205-2: front and back two-value image
Figure 689295DEST_PATH_IMAGE069
Detected straight line parameter
Figure 622572DEST_PATH_IMAGE070
Satisfies the requirement of formula (8):
Figure 493577DEST_PATH_IMAGE071
wherein,
Figure 834559DEST_PATH_IMAGE072
Figure 765606DEST_PATH_IMAGE073
are respectively as
Figure 192039DEST_PATH_IMAGE074
The maximum allowable deviation angle and radius,
Figure 601155DEST_PATH_IMAGE075
is a threshold value that is allowed for continuity,
Figure 531065DEST_PATH_IMAGE076
is the sampling time of the image;
s205-3: and if more than one straight line still meets the requirement after the process of S205-1 and S205-2, selecting the straight line with the highest pixel mean value on the straight line as the detected waterline.
The step S3 specifically includes the following steps: the height and width of the edge image E at which the waterline is located are H and
Figure 633013DEST_PATH_IMAGE077
selecting the intersection point of the horizontal line with the height of y and the waterline as the monitoring point of the current position of the vehicle: (
Figure 281163DEST_PATH_IMAGE078
Y) running deviation of the vehicle
Figure 697232DEST_PATH_IMAGE079
Rate of change of deviation from running
Figure 747228DEST_PATH_IMAGE080
Comprises the following steps:
Figure 488919DEST_PATH_IMAGE081
wherein,
Figure 884085DEST_PATH_IMAGE082
the actual length represented by each pixel is represented by the camera calibration;
selecting a height of
Figure 635003DEST_PATH_IMAGE083
The intersection point of the horizontal line and the waterline is used as a monitoring point of the predicted position of the vehicle (
Figure 805085DEST_PATH_IMAGE084
Figure 717677DEST_PATH_IMAGE085
) Obtaining a predicted running deviation of the vehicle
Figure 74840DEST_PATH_IMAGE086
And predicting the rate of change of deviation
Figure 629449DEST_PATH_IMAGE087
Figure 654037DEST_PATH_IMAGE088
Wherein,
Figure 3110DEST_PATH_IMAGE089
the distribution is the width of the edge image E.
S4 specifically comprises the following steps:
s401, according to the predicted operation deviation of the vehicle
Figure 113149DEST_PATH_IMAGE090
And predicting the rate of change of deviation
Figure 471449DEST_PATH_IMAGE091
Calculating a feedforward control amount of the vehicle
Figure 616122DEST_PATH_IMAGE092
Figure 136096DEST_PATH_IMAGE093
Figure 751009DEST_PATH_IMAGE094
In order to calculate the amount of feedforward control,
Figure 647421DEST_PATH_IMAGE095
and
Figure 912180DEST_PATH_IMAGE096
are two parameters of predictive control;
s402, in the area of the e-ec plane, based on the running deviation of the vehicle
Figure 868635DEST_PATH_IMAGE097
Rate of change of deviation from running
Figure 953265DEST_PATH_IMAGE098
Obtaining an incremental nonlinear PID control law for a vehicle
Figure 653368DEST_PATH_IMAGE099
Step S402 specifically includes the following steps: constructing a non-uniform division method of an error e and an error change rate ec based on a Gaussian function, wherein an e-ec plane takes the error e as a horizontal axis and the error change rate ec as a vertical axis; the error e and the error change rate ec refer to the running deviation of the vehicle
Figure 507055DEST_PATH_IMAGE100
Rate of change of deviation from running
Figure 634411DEST_PATH_IMAGE101
(ii) a The non-linear division method adopted by the e-ec plane division is as follows:
Figure 206338DEST_PATH_IMAGE102
when the division of the error change rate ec is calculated,
Figure 444552DEST_PATH_IMAGE103
is that
Figure 152745DEST_PATH_IMAGE104
When the division of the error e is calculated,
Figure 185423DEST_PATH_IMAGE103
is that
Figure 979067DEST_PATH_IMAGE105
Figure 749533DEST_PATH_IMAGE106
And
Figure 781074DEST_PATH_IMAGE107
respectively, the maximum of the absolute values of the error e and the error change rate ec, wherein,
Figure 250233DEST_PATH_IMAGE108
equal division points for error e or error change rate ec,
Figure 531173DEST_PATH_IMAGE109
in order to non-uniformly partition the points after mapping,
Figure 376769DEST_PATH_IMAGE110
adjusting a factor for the degree of non-linearity;
after the e-ec plane is divided in a non-uniform way, a divided area set is marked as
Figure 325134DEST_PATH_IMAGE111
In each divided region
Figure 965193DEST_PATH_IMAGE112
Internally performing PID control and non-linear PID control increment
Figure 733429DEST_PATH_IMAGE113
Comprises the following steps:
Figure 851558DEST_PATH_IMAGE114
wherein,
Figure 654429DEST_PATH_IMAGE115
to represent
Figure DEST_PATH_IMAGE116
Region of time of day
Figure 403073DEST_PATH_IMAGE117
Non-linear PID control increments of (1);
Figure 652746DEST_PATH_IMAGE118
which is indicative of the time of the sampling,
Figure 574565DEST_PATH_IMAGE119
indicating area
Figure 231943DEST_PATH_IMAGE120
The internal proportionality coefficient of the air-fuel ratio,
Figure 213805DEST_PATH_IMAGE121
the scale is shown to be that of,
Figure 956633DEST_PATH_IMAGE122
the lines are represented as a result of,
Figure 682144DEST_PATH_IMAGE123
a presentation column;
Figure 194028DEST_PATH_IMAGE124
indicating area
Figure 346792DEST_PATH_IMAGE125
Inner integral the coefficients of which are such that,
Figure 576916DEST_PATH_IMAGE126
indicating area
Figure 106117DEST_PATH_IMAGE127
The inner differential coefficient;
based on all regions
Figure 472508DEST_PATH_IMAGE128
Non-linear PID control increments of
Figure 796173DEST_PATH_IMAGE129
And calculating the weighted average increment of the nonlinear PID control:
Figure 716855DEST_PATH_IMAGE130
wherein,
Figure 247151DEST_PATH_IMAGE131
is a region
Figure 468048DEST_PATH_IMAGE132
The control law weight is incremented by one,
Figure 962614DEST_PATH_IMAGE133
is a region
Figure 901751DEST_PATH_IMAGE134
The error e and the radius of the error change rate ec,
Figure 507176DEST_PATH_IMAGE135
is a region
Figure 582579DEST_PATH_IMAGE136
The center of (a);
Figure 248047DEST_PATH_IMAGE137
time incremental nonlinear PID control law
Figure 408901DEST_PATH_IMAGE138
Comprises the following steps:
Figure 818017DEST_PATH_IMAGE139
wherein,
Figure 13506DEST_PATH_IMAGE140
is an incremental factor defined as:
Figure 584296DEST_PATH_IMAGE141
wherein,
Figure 515603DEST_PATH_IMAGE142
for the maximum value of the incremental factor,
Figure 993989DEST_PATH_IMAGE143
is an offset;
Figure 778405DEST_PATH_IMAGE144
for describing deviation and deviation rate of change from originTo the extent that (a) is present,
Figure 51255DEST_PATH_IMAGE145
is a scaling factor;
the method can make different errors e and error change rates ec have different increment factors.
Design of
Figure 921122DEST_PATH_IMAGE146
The method aims to provide the controller increment factors under the conditions that the error e and the error change rate ec are different, and has the effects of improving the response speed of a system and reducing the complexity of an optimization process.
The step S5 specifically includes the following steps:
there are a number of pairs
Figure 937619DEST_PATH_IMAGE147
In the divided region
Figure 107701DEST_PATH_IMAGE148
Inner, then to the area
Figure 285872DEST_PATH_IMAGE149
Nonlinear PID control increments within
Figure 908615DEST_PATH_IMAGE150
Parameter (2)
Figure 463224DEST_PATH_IMAGE151
Figure 222233DEST_PATH_IMAGE152
Figure 571305DEST_PATH_IMAGE153
Performing online learning;
increment control of nonlinear PID by adopting supervised Hebb learning rule
Figure 144326DEST_PATH_IMAGE154
The parameters of (2) are learned:
Figure 971468DEST_PATH_IMAGE155
(16)
wherein,
Figure 319404DEST_PATH_IMAGE156
is a region
Figure 839378DEST_PATH_IMAGE157
Internal learning rate, learning rate
Figure 436712DEST_PATH_IMAGE158
Based on online adjustment rules
Figure 333124DEST_PATH_IMAGE159
And (3) adjusting:
Figure 332304DEST_PATH_IMAGE160
(17)
wherein,
Figure 288759DEST_PATH_IMAGE161
is a region
Figure 842231DEST_PATH_IMAGE162
The matching coefficient in the learning rate range is used for adjusting the learning rate range,
Figure 276755DEST_PATH_IMAGE163
is a region
Figure 130441DEST_PATH_IMAGE164
Two weight coefficients within.
The unmanned line marking vehicle online optimization searching control system based on machine vision navigation comprises a vision sensor, a waterline detection unit, an operation error prediction unit, a prediction controller, a nonlinear increment PID controller and an online learning rule unit;
a visual sensor acquires a water line image of a road surface;
the waterline detection unit performs waterline detection based on the waterline image to obtain a waterline;
the operation error prediction unit selects a monitoring point of the current position of the vehicle based on the detected waterline, and calculates the operation deviation of the vehicle
Figure 251938DEST_PATH_IMAGE165
Rate of change of deviation from running
Figure 823865DEST_PATH_IMAGE166
Selecting the monitoring point of the predicted position of the vehicle, and calculating the predicted running deviation of the vehicle
Figure 530921DEST_PATH_IMAGE167
And predicting the rate of change of deviation
Figure 973534DEST_PATH_IMAGE168
The predictive controller operates the deviation according to the prediction of the vehicle
Figure 271792DEST_PATH_IMAGE169
And predicting the rate of change of deviation
Figure 65435DEST_PATH_IMAGE170
Predicting a predicted feedforward control amount of the vehicle
Figure 841761DEST_PATH_IMAGE171
The nonlinear incremental PID controller is in a nonlinear division area of an e-ec plane and is based on the running deviation of the vehicle
Figure 670040DEST_PATH_IMAGE172
Rate of change of deviation from operation
Figure 139199DEST_PATH_IMAGE173
Obtaining an incremental nonlinear PID control law for a vehicle
Figure 154559DEST_PATH_IMAGE174
On-line learning of rule units versus regionsNonlinear PID control increments
Figure 203418DEST_PATH_IMAGE175
The parameters of (2) are learned.
The working process of the waterline detection unit specifically comprises the following steps:
s201, aiming at pixel points on the water line image
Figure 614764DEST_PATH_IMAGE176
Performing gray scale stretching, wherein the stretching formula is shown as formula (1):
Figure 989245DEST_PATH_IMAGE177
wherein,
Figure 491902DEST_PATH_IMAGE178
Figure 610030DEST_PATH_IMAGE179
Figure 412901DEST_PATH_IMAGE180
as a result of the stretching factor,
Figure 223863DEST_PATH_IMAGE181
represents an image
Figure 948236DEST_PATH_IMAGE182
Go to the first
Figure 135635DEST_PATH_IMAGE183
Columns;
Figure 527433DEST_PATH_IMAGE184
denotes the first
Figure 509296DEST_PATH_IMAGE185
Go to the first
Figure 720965DEST_PATH_IMAGE183
Gray values of the column pixel points; water lineAfter the gray level of the image is stretched, selecting
Figure 446476DEST_PATH_IMAGE186
The individual HARR-like features express the linear features of each pixel point;
s202, aiming at the first step
Figure 179200DEST_PATH_IMAGE187
Characteristic of personal HARR
Figure 800805DEST_PATH_IMAGE188
The calculation method is as shown in formula (2):
Figure 30930DEST_PATH_IMAGE030
(2)
wherein,
Figure 294552DEST_PATH_IMAGE189
for the sum of pixels of the ith HARR-like feature waterline region,
Figure 926521DEST_PATH_IMAGE190
for the sum of pixels of the ith HARR-like feature road surface area,
Figure 250187DEST_PATH_IMAGE191
Figure 702028DEST_PATH_IMAGE192
gray value based on stretched gray image
Figure 34920DEST_PATH_IMAGE193
Calculating to obtain;
after the HARR-like feature description of each pixel point is obtained, normalizing each HARR-like feature, wherein the normalization formula is as shown in formula (3):
Figure 521396DEST_PATH_IMAGE194
wherein,
Figure 15962DEST_PATH_IMAGE195
is as follows
Figure 955100DEST_PATH_IMAGE196
The number of normalized HARR features is then determined,
Figure 826104DEST_PATH_IMAGE197
Figure 895648DEST_PATH_IMAGE198
respectively the average value of the gray levels and the average value of the square of the gray levels in the detection window;
constructing pixel points based on normalized HARR-like characteristics
Figure 295536DEST_PATH_IMAGE199
Feature vector of
Figure 659653DEST_PATH_IMAGE200
Figure 334347DEST_PATH_IMAGE201
S203, aiming at each pixel point
Figure 529837DEST_PATH_IMAGE202
Feature vector of
Figure 366206DEST_PATH_IMAGE203
Identifying and judging pixel points
Figure 279935DEST_PATH_IMAGE204
If the characteristic vector accords with the linear characteristic, setting 1 for the pixel point, and otherwise, setting 0 for realizing the binaryzation of the gray level image so as to obtain a binary image B;
carrying out expansion corrosion treatment on the binary image B to obtain a new binary image
Figure 470571DEST_PATH_IMAGE205
Obtaining an edge image E of the waterline by adopting a formula (6),
Figure 723828DEST_PATH_IMAGE206
Figure 668782DEST_PATH_IMAGE207
representing a new binary image
Figure 69807DEST_PATH_IMAGE208
To (1) a
Figure 820726DEST_PATH_IMAGE209
Go to the first
Figure 990807DEST_PATH_IMAGE210
Pixel values of the columns;
Figure 452136DEST_PATH_IMAGE211
represents the edge image E
Figure 74878DEST_PATH_IMAGE212
The rows of the image data are, in turn,
Figure 895067DEST_PATH_IMAGE213
pixel values of the columns;
s204, identifying a straight line in the edge image E by adopting a HOUGH conversion method:
from new binary images
Figure 654075DEST_PATH_IMAGE214
Finding the coordinates of the pixel with the pixel value of 1 as
Figure 268727DEST_PATH_IMAGE215
Obtained by HOUGH conversion
Figure 847607DEST_PATH_IMAGE216
Obtained by the formula (7)
Figure 205908DEST_PATH_IMAGE217
A corresponding linear expression;
Figure 553843DEST_PATH_IMAGE218
wherein,
Figure 73818DEST_PATH_IMAGE219
radius and angle detected by HOUGH transformation;
a group of
Figure 671152DEST_PATH_IMAGE220
Representing a straight line, given a set
Figure 567564DEST_PATH_IMAGE220
If at all
Figure 566744DEST_PATH_IMAGE221
Satisfies the formula (7)
Figure 517339DEST_PATH_IMAGE222
On the straight line represented by formula (7);
s205, eliminating interference straight lines;
the method for eliminating the interference straight line specifically comprises the following steps:
s205-1, selecting a plurality of straight lines with the longest length and the length meeting the threshold length;
s205-2, two-value images of front and back frames
Figure 70812DEST_PATH_IMAGE223
Detected straight line parameter
Figure 505335DEST_PATH_IMAGE224
Satisfies formula (8):
Figure 296705DEST_PATH_IMAGE225
wherein,
Figure 689640DEST_PATH_IMAGE226
Figure 995987DEST_PATH_IMAGE227
are respectively as
Figure 234202DEST_PATH_IMAGE228
The maximum allowable deviation angle and radius,
Figure 473553DEST_PATH_IMAGE229
is a threshold value that is allowed for continuity,
Figure 771811DEST_PATH_IMAGE230
is the sampling moment of the image;
s205-3, if 2 or more than 2 straight lines still exist after the S205-1 and the S205-2, selecting the straight line with the highest pixel average value on the straight line as the detected waterline.
The operation error prediction unit specifically comprises the following working processes: the height and width of the edge image E of the waterline are H and H respectively
Figure 299875DEST_PATH_IMAGE231
(ii) a Selecting the intersection point of the horizontal line with the height of y and the waterline as the monitoring point of the current position of the vehicle: (
Figure 341780DEST_PATH_IMAGE232
Y) calculating the running deviation of the vehicle
Figure 170059DEST_PATH_IMAGE233
Rate of change of deviation from running
Figure 633358DEST_PATH_IMAGE234
Comprises the following steps:
Figure 914298DEST_PATH_IMAGE235
wherein,
Figure 494315DEST_PATH_IMAGE236
the actual length represented by each pixel is represented by the camera calibration;
selecting a height of
Figure 442680DEST_PATH_IMAGE237
(the intersection point of the horizontal line and the waterline is taken as a monitoring point of the predicted position of the vehicle
Figure 817160DEST_PATH_IMAGE238
Figure 585396DEST_PATH_IMAGE239
) Obtaining a predicted running deviation of the vehicle
Figure 969104DEST_PATH_IMAGE240
And predicting the rate of change of deviation
Figure 771975DEST_PATH_IMAGE241
Figure 582936DEST_PATH_IMAGE242
(10)。
The prediction controller calculates a feedforward control amount of the vehicle
Figure 838468DEST_PATH_IMAGE243
Comprises the following steps:
Figure 25867DEST_PATH_IMAGE244
Figure 417665DEST_PATH_IMAGE245
in order to predict the feedforward control quantity of the controller,
Figure 862510DEST_PATH_IMAGE246
and
Figure 339759DEST_PATH_IMAGE247
are two parameters of the predictive controller;
the nonlinear incremental PID controller is in a nonlinear division area of an e-ec plane and is based on the running deviation of the vehicle
Figure 534111DEST_PATH_IMAGE248
Rate of change of deviation from operation
Figure 45995DEST_PATH_IMAGE249
Obtaining an incremental nonlinear PID control law for a vehicle
Figure 198758DEST_PATH_IMAGE250
Constructing a non-uniform division method of an error e and an error change rate ec based on a Gaussian function, wherein an e-ec plane takes the error e as a horizontal axis and the error change rate ec as a vertical axis; error or rate of change of error refers to deviation in the operation of the vehicle
Figure 428883DEST_PATH_IMAGE251
Rate of change of deviation from running
Figure 692505DEST_PATH_IMAGE252
The e-ec plane nonlinear division method comprises the following steps:
Figure 58895DEST_PATH_IMAGE253
when the division of the error change rate ec is calculated,
Figure 913719DEST_PATH_IMAGE254
is that
Figure 365560DEST_PATH_IMAGE255
When the division of the error e is calculated,
Figure 432873DEST_PATH_IMAGE256
is that
Figure 919349DEST_PATH_IMAGE257
Figure 413916DEST_PATH_IMAGE258
Figure 370631DEST_PATH_IMAGE259
Respectively, the maximum of the absolute values of the error e and the error rate of change ec, wherein,
Figure 241635DEST_PATH_IMAGE260
for equal uniform division points of the error e or the error change rate ec,
Figure 582617DEST_PATH_IMAGE261
in order to non-uniformly partition the points after mapping,
Figure 248085DEST_PATH_IMAGE262
the factor is adjusted for the degree of non-linearity.
According to the range of the error e and the error change rate ec, the e-ec plane is divided non-uniformly, and the divided area set is marked as
Figure 940098DEST_PATH_IMAGE263
In each divided region
Figure 349213DEST_PATH_IMAGE264
PID control is carried out by a nonlinear increment PID controller, and the increment is controlled by nonlinear PID
Figure 544702DEST_PATH_IMAGE265
Comprises the following steps:
Figure 646651DEST_PATH_IMAGE266
wherein,
Figure 560380DEST_PATH_IMAGE267
represent
Figure 976449DEST_PATH_IMAGE268
Region of time of day
Figure 292024DEST_PATH_IMAGE269
Non-linear PID control increments of (a);
Figure 33715DEST_PATH_IMAGE270
which is indicative of the time of the sampling,
Figure 163302DEST_PATH_IMAGE271
indicating area
Figure 648641DEST_PATH_IMAGE272
The internal proportionality coefficient of the air-fuel ratio,
Figure 553143DEST_PATH_IMAGE273
the scale is shown to be that of a scale,
Figure 668998DEST_PATH_IMAGE274
the lines are represented by a number of lines,
Figure 26161DEST_PATH_IMAGE275
the columns are indicated.
Figure 315191DEST_PATH_IMAGE276
The value of the integral coefficient is represented by,
Figure 339779DEST_PATH_IMAGE277
represents a differential coefficient;
based on all regions
Figure 954431DEST_PATH_IMAGE278
Non-linear PID control increments of
Figure 64469DEST_PATH_IMAGE279
Calculating a nonlinear PID control weighted average increment:
Figure 157190DEST_PATH_IMAGE280
wherein,
Figure 301864DEST_PATH_IMAGE281
is a region
Figure 81558DEST_PATH_IMAGE282
The control law weight is incremented by one,
Figure 678892DEST_PATH_IMAGE283
Figure 575304DEST_PATH_IMAGE284
is a region
Figure 308905DEST_PATH_IMAGE285
The error e, the radius of the error change rate ec,
Figure 999780DEST_PATH_IMAGE286
is a region
Figure 84411DEST_PATH_IMAGE287
The center of (a);
based on incremental control law weights
Figure 784514DEST_PATH_IMAGE288
Calculating
Figure 638200DEST_PATH_IMAGE289
Time incremental nonlinear PID control law
Figure 499977DEST_PATH_IMAGE290
Comprises the following steps:
Figure 71904DEST_PATH_IMAGE291
wherein,
Figure 310118DEST_PATH_IMAGE292
an incremental factor, which is defined as follows:
Figure 283890DEST_PATH_IMAGE293
wherein,
Figure 316568DEST_PATH_IMAGE294
for the maximum value of the incremental factor,
Figure 635511DEST_PATH_IMAGE295
is an offset;
Figure 677417DEST_PATH_IMAGE296
running deviation for vehicle
Figure 708958DEST_PATH_IMAGE297
Rate of change of deviation from running
Figure 912537DEST_PATH_IMAGE298
Degree of deviation from the origin;
Figure 193477DEST_PATH_IMAGE299
Figure 39073DEST_PATH_IMAGE300
is a scaling factor.
The method can lead the error change rate ec to have different increment factors with different errors e.
Design of
Figure 987437DEST_PATH_IMAGE301
Is a running deviation
Figure 361918DEST_PATH_IMAGE302
Rate of change of deviation from running
Figure 943203DEST_PATH_IMAGE303
The controller increment factors under different conditions have the functions of improving the response speed of the system and reducing the complexity of the optimization process.
The working process of the online learning rule unit specifically comprises the following steps:
there are several pairs
Figure 326911DEST_PATH_IMAGE304
In a divided region
Figure 881781DEST_PATH_IMAGE305
Inner, then to the area
Figure 692742DEST_PATH_IMAGE306
Internal non-linear PID control increments
Figure 948274DEST_PATH_IMAGE307
Parameter (2) of
Figure 135673DEST_PATH_IMAGE308
Figure 793050DEST_PATH_IMAGE309
Figure 774913DEST_PATH_IMAGE310
And performing online learning. Incremental nonlinear PID control using supervised Hebb learning rule
Figure 517741DEST_PATH_IMAGE311
The parameters of (2) are learned:
Figure 243252DEST_PATH_IMAGE155
(16)
wherein,
Figure 755135DEST_PATH_IMAGE312
is a region
Figure 907899DEST_PATH_IMAGE313
The inner learning rate. To improve learning efficiency, learning rate
Figure 872444DEST_PATH_IMAGE314
Based on online adjustment rules
Figure 667225DEST_PATH_IMAGE315
The adjustment is carried out, namely:
Figure 33615DEST_PATH_IMAGE316
(17)
wherein,
Figure 820263DEST_PATH_IMAGE317
is a region
Figure 272104DEST_PATH_IMAGE318
The matching coefficient in the learning rate range is adjusted,
Figure 73838DEST_PATH_IMAGE319
is a region
Figure 966838DEST_PATH_IMAGE320
Two weight coefficients within.
Compared with the prior art, the invention has the beneficial effects that:
the application discloses an unmanned line marking vehicle on-line optimization control method based on machine vision navigation, which adopts a new HARR-like feature and an image processing method to realize the machine vision real-time navigation of a line marking vehicle, designs a feedforward controller and a nonlinear PID controller according to a tracking error and an error change rate, and discloses an on-line learning method of parameters of the nonlinear PID controller to realize the line marking vehicle path tracking control based on the machine vision navigation.
The application discloses a novel HARR-like feature which is used for extracting a straight line feature of a waterline, so that later waterline detection is facilitated, the real-time requirement of image detection is met, background information around the waterline is fully considered, and the success rate of the waterline detection is improved;
the method is based on the detected waterline edge, the waterline in each frame of image is extracted by adopting a HOUGH conversion method, a new interference waterline filtering method is provided, and the robustness of waterline identification is realized; according to the respective conditions of the tracking error and the error change rate, the e-ec plane is divided into areas, an incremental PID controller is designed in each area, and finally a nonlinear incremental PID controller is constructed, so that the rapidity of the track tracking of the scribing car is realized, and the tracking precision of the scribing car is improved; the parameters of the nonlinear PID controller are adjusted by adopting a supervised Hebb learning rule, so that the online optimization of the nonlinear PID controller is realized; according to the prediction tracking deviation, a feedforward prediction controller is designed, the waterline adaptability of scribing vehicle control is improved, and the scribing vehicle control precision and the anti-interference capability are improved.
Drawings
FIG. 1 is a schematic view of a line marking vehicle construction;
FIG. 2 class HARR features;
FIG. 3 is a schematic of an operating deviation and predicted deviation calculation;
FIG. 4 is an online optimizing control system of an unmanned line marking vehicle based on machine vision navigation;
FIG. 5 e-ec plane non-linear division diagram;
FIG. 6 e-ec plane division area schematic diagram;
fig. 7 is a water line image schematic.
Detailed Description
The technical solutions of the present invention will be described clearly and completely with reference to the accompanying drawings, and it is obvious that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, belong to the protection scope of the present invention.
The on-line optimizing control method of the unmanned line marking vehicle based on the machine vision navigation comprises the following steps:
s1, collecting a water line image of a road surface to obtain a data sample, wherein the water line image is shown in FIG. 7:
the gray level image of the asphalt pavement is acquired through the vision sensor arranged on the line marking vehicle in the figure 1, the pavement is provided with a drawn waterline, and waterline information is arranged in the image. The vision sensor is arranged at the height of 35-45cm away from the asphalt pavement and ensures that the length of a view field in the advancing direction of the scribing vehicle is 30-40cm. In order to ensure that the visual sensor is resistant to the interference of external light, certain shading equipment is generally arranged outside the visual sensor.
S2, carrying out waterline detection based on the waterline image to obtain a waterline;
the step S2 specifically includes the following steps:
s201, aiming at pixel points on the water line image
Figure 726984DEST_PATH_IMAGE321
Performing gray scale stretching, wherein the stretching formula is shown as formula (1):
Figure 666121DEST_PATH_IMAGE322
wherein,
Figure 271546DEST_PATH_IMAGE323
Figure 346949DEST_PATH_IMAGE324
Figure 746838DEST_PATH_IMAGE325
is a stretch factor; m, n represent the m-th row and n-th column of the image;
Figure 173271DEST_PATH_IMAGE326
is shown as
Figure 310948DEST_PATH_IMAGE327
Go to the first
Figure 506437DEST_PATH_IMAGE328
After the gray value of the gray value water line image of the column pixel points is stretched, the linear characteristic of each pixel point is expressed based on the HARR-like characteristic, and the HARR-like characteristic is shown in FIG. 2: in the embodiment, 6 HARR-like features are selected according to engineering conditions, the number of the HARR-like features is large, and the HARR-like features are designed in the HARR-like features M1-M6 according to actual conditions to divide different regions. Assume that the width of the waterline area is 2w, where w is half the width of the waterlineIn the HARR-like features M1 to M3, the width of the road surface region and the transition region is w. In the HARR-like features M4-M6, the width of each region is 2w, and the height of all HARR-like features is 4w-6w. For the rotation features M2, M3, M5, M6, the rotation angle is 15 degrees, which is determined by the field waterline characteristics.
Because the boundary of the waterline area is easy to be fuzzy, in the embodiment, the transition area is arranged between the road surface area and the waterline area, so that the fuzzy interference of the waterline boundary on the feature extraction is avoided, namely, when the feature is extracted, the transition area is not considered, and the transition area is ignored.
S202, the HARR-like features are used for highlighting the straight line features of the waterline on the road surface, and the waterline identification in the later period is facilitated. To the first
Figure 608385DEST_PATH_IMAGE329
Generic HARR characteristics
Figure 522115DEST_PATH_IMAGE330
Figure 469342DEST_PATH_IMAGE331
The calculation method is as shown in formula (2):
Figure 784917DEST_PATH_IMAGE030
(2)
wherein, in the embodiment, Q =6,
Figure 57767DEST_PATH_IMAGE332
is as follows
Figure 458792DEST_PATH_IMAGE333
The sum of pixels of the individual HARR-like feature waterline regions,
Figure 209710DEST_PATH_IMAGE334
is a first
Figure 379792DEST_PATH_IMAGE335
The sum of pixels of a road surface area with a HARR-like characteristic,
Figure 557963DEST_PATH_IMAGE336
Figure 446285DEST_PATH_IMAGE337
gray value based on stretched gray image
Figure 266473DEST_PATH_IMAGE338
Calculating to obtain;
after the HARR-like feature description of each pixel point is obtained, normalizing each HARR-like feature, wherein the normalization formula is as shown in formula (3):
Figure 285202DEST_PATH_IMAGE339
(3)
wherein,
Figure 368695DEST_PATH_IMAGE340
in order to normalize the HARR characteristics of the sample,
Figure 213155DEST_PATH_IMAGE341
Figure 305876DEST_PATH_IMAGE342
respectively the average value of the gray levels and the average value of the square of the gray levels in the detection window;
based on the 6 normalized HARR-like features (see formula 3), a pixel point is constructed
Figure 450549DEST_PATH_IMAGE343
Feature vector of
Figure 236102DEST_PATH_IMAGE344
Figure 833437DEST_PATH_IMAGE345
S203, in order to detect the waterline in the image, each pixel point is subjected to
Figure 729849DEST_PATH_IMAGE346
Feature vector of
Figure 994608DEST_PATH_IMAGE347
Performing identification to determine pixel points
Figure 685484DEST_PATH_IMAGE346
And (3) judging whether the characteristic vector accords with the linear characteristic, if so, setting 1 to the pixel point, otherwise, setting 0 to realize binarization of the gray level image, and further obtaining a binary image B. The method is innovative in that the binarization problem of the image is changed into the pattern recognition problem of the features.
Using SVM method to process feature vector
Figure 770114DEST_PATH_IMAGE348
Identifying, wherein the kernel function is represented by formula (5):
Figure 204638DEST_PATH_IMAGE349
wherein,
Figure 58324DEST_PATH_IMAGE350
is a covariance matrix of the feature set.
Figure 15758DEST_PATH_IMAGE351
Is that
Figure 587685DEST_PATH_IMAGE352
The formula (5) is a kernel function in the SVM classification method, and the feature vector is obtained
Figure 825900DEST_PATH_IMAGE353
Mapping to a higher dimension, judging whether the feature is a straight line feature through the SVM, and performing two-classification; constructing a feature vector for each pixel point according to the HARR-like features, and then judging whether the pixel point represented by each feature vector is on a straight line or not by using an SVM (support vector machine);
after obtaining the binary image B, carrying out expansion corrosion treatment on the binary image B to obtain a new binary image
Figure 534093DEST_PATH_IMAGE354
Obtaining an edge image E of the waterline by adopting a formula (6),
Figure 566771DEST_PATH_IMAGE355
Figure 625994DEST_PATH_IMAGE356
representing a new binary image
Figure 136740DEST_PATH_IMAGE357
To (1)
Figure 965019DEST_PATH_IMAGE358
Go to the first
Figure 434178DEST_PATH_IMAGE359
Pixel values of the columns;
Figure 715118DEST_PATH_IMAGE360
represents the edge image E
Figure 295135DEST_PATH_IMAGE361
The rows of the image data are, in turn,
Figure 977920DEST_PATH_IMAGE362
pixel values of the columns;
s204, identifying a straight line in the edge image E by adopting a HOUGH conversion method:
from new binary images
Figure 612120DEST_PATH_IMAGE363
Where the coordinates of the pixel having the pixel value of 1 are found to be
Figure 380356DEST_PATH_IMAGE364
Obtained by HOUGH conversion
Figure 498485DEST_PATH_IMAGE365
Obtained by the formula (7)
Figure 504618DEST_PATH_IMAGE366
A corresponding linear expression;
Figure 863049DEST_PATH_IMAGE367
wherein,
Figure 321844DEST_PATH_IMAGE368
respectively, the radius and angle (variable representation of straight line) detected by the HOUGH transform.
A group of
Figure 243663DEST_PATH_IMAGE368
Representing a straight line, given a set
Figure 635462DEST_PATH_IMAGE369
If at all
Figure 617324DEST_PATH_IMAGE370
Satisfies the formula (7)
Figure 88713DEST_PATH_IMAGE371
On the straight line represented by formula (7);
s205, due to the influence of noise in the image, a plurality of straight lines are detected, and the interference straight lines are removed to obtain real straight lines. The interference line removing method specifically comprises the following steps:
s205-1: selecting 3-5 straight lines with the longest length, wherein the length of the straight lines meets the threshold length (the length is greater than or equal to 1/4 of the image height, and the threshold length is 1/4 of the image height in the embodiment);
s205-2: because the waterline is continuous, the linear parameters detected by the two frames of images before and after the waterline are continuous
Figure 548645DEST_PATH_IMAGE372
The requirement of formula (8) is satisfied, namely:
Figure 794949DEST_PATH_IMAGE373
wherein,
Figure 682134DEST_PATH_IMAGE374
Figure 115521DEST_PATH_IMAGE375
are respectively as
Figure 379143DEST_PATH_IMAGE376
The maximum allowable deviation angle and radius,
Figure 479954DEST_PATH_IMAGE377
is a threshold value that is allowed for continuity,
Figure 272461DEST_PATH_IMAGE378
is the sampling instant of the image.
S205-3: if a plurality of straight lines are still satisfied after passing through S205-1 and S205-2 (there are 2 or 2 straight lines in a line), the straight line with the highest pixel average value on the straight line is selected as the detected waterline, because the waterline is generally white and the brightness is highest in the image.
S3, selecting a monitoring point of the current position of the vehicle based on the detected waterline, and calculating the running deviation of the vehicle
Figure 989881DEST_PATH_IMAGE379
Rate of change of deviation from running
Figure 791615DEST_PATH_IMAGE380
Selecting the monitoring point of the predicted position of the vehicle, and calculating the predicted running deviation of the vehicle
Figure 6652DEST_PATH_IMAGE381
And predicting the rate of change of deviation
Figure 501219DEST_PATH_IMAGE382
The detected waterlines are shown in FIG. 3, H,
Figure 440356DEST_PATH_IMAGE383
The height and width of the edge image E (the height and width of the edge image E and the horizontal line image H and W are the same). Taking the intersection point of the horizontal line at the position of 0.7H and the waterline as a monitoring point of the current position of the vehicle, and taking the 0.7H as y in the formula (7) to obtain the abscissa of the monitoring point of the current position
Figure 249043DEST_PATH_IMAGE384
. At this time, the running deviation of the vehicle
Figure 58867DEST_PATH_IMAGE385
Rate of change of deviation from running
Figure 989914DEST_PATH_IMAGE386
Comprises the following steps:
0.7H is a value of this embodiment y, generally, the distance between two horizontal lines in fig. 3 should be at least the distance that the vehicle can travel in one control cycle, but the two horizontal lines cannot be too close to the upper and lower boundaries of the image, and according to the set vehicle speed and the visual field range of the vision sensor, the embodiment selects the intersection point of the horizontal line at 0.7H and the water line as the monitoring point of the current position of the vehicle;
Figure 88451DEST_PATH_IMAGE387
wherein,
Figure 966408DEST_PATH_IMAGE388
obtained by camera calibration, represents the actual length (the actual distance represented by a pixel unit) represented by each pixel. In the present embodiment, the first and second embodiments,
Figure 896318DEST_PATH_IMAGE389
the value is 0.3H, taking the intersection point of the horizontal line at the position of 0.3H and the waterline as a monitoring point of the predicted position of the vehicle to obtain the predicted running deviation of the vehicle
Figure 935950DEST_PATH_IMAGE390
And predicting the rate of change of deviation
Figure 336099DEST_PATH_IMAGE391
Figure 17747DEST_PATH_IMAGE392
(10)
This division is reasonable because the vehicle speed is approximately 40 m/min and the effective field of view of the vision sensor is 20-30 cm.
As seen in fig. 3, the current position error is on the left, but to the right by the predicted position. The error cannot be adjusted excessively at the current position and the current adjustment can be fine-tuned by the prediction error.
S4, according to the predicted operation deviation of the vehicle
Figure 536584DEST_PATH_IMAGE393
And predicting the rate of change of deviation
Figure 543854DEST_PATH_IMAGE394
Obtaining a predicted feedforward control quantity
Figure 148142DEST_PATH_IMAGE395
Based on running deviation of the vehicle
Figure 899060DEST_PATH_IMAGE396
Rate of change of deviation from running
Figure 803563DEST_PATH_IMAGE397
Obtaining an incremental nonlinear PID control law for a vehicle
Figure 981734DEST_PATH_IMAGE398
The step S4 specifically includes the following steps:
s401, according to the predicted operation deviation of the vehicle
Figure 870056DEST_PATH_IMAGE399
And predicting the rate of change of deviation
Figure 424665DEST_PATH_IMAGE400
Predicting a feedforward control amount of the vehicle
Figure 443393DEST_PATH_IMAGE401
Figure 792466DEST_PATH_IMAGE402
Figure 902505DEST_PATH_IMAGE403
In order to predict the amount of the feedforward control,
Figure 260805DEST_PATH_IMAGE404
and
Figure 874320DEST_PATH_IMAGE405
are two parameters of predictive control;
s402, in the area of the e-ec plane, based on the running deviation of the vehicle
Figure 394294DEST_PATH_IMAGE406
Rate of change of deviation from running
Figure 929312DEST_PATH_IMAGE407
Obtaining an incremental nonlinear PID control law for a vehicle
Figure 825724DEST_PATH_IMAGE408
The error e or the error change rate ec of the system conforms to Gaussian distribution, so that the non-uniform division method of the error and the error change rate is constructed based on a Gaussian function, and the e-ec plane takes the error e as the horizontal axis and the error change rate ec as the vertical axis; error or variation of errorThe change rate refers to a running deviation of the vehicle
Figure 824904DEST_PATH_IMAGE409
Rate of change of deviation from running
Figure 984621DEST_PATH_IMAGE410
(ii) a The e-ec plane division adopts a nonlinear division method:
Figure 538093DEST_PATH_IMAGE411
when the division of the error change rate ec is calculated,
Figure 701178DEST_PATH_IMAGE412
is that
Figure 554864DEST_PATH_IMAGE413
When the division of the error e is calculated,
Figure 682220DEST_PATH_IMAGE414
is that
Figure 519726DEST_PATH_IMAGE415
Figure 492362DEST_PATH_IMAGE416
Figure 466134DEST_PATH_IMAGE417
Respectively, the maximum of the absolute values of the error e and the error rate of change ec, wherein,
Figure 498812DEST_PATH_IMAGE418
for equal uniform division points of the error e or the error change rate ec,
Figure 292455DEST_PATH_IMAGE419
in order to non-uniformly partition the points after mapping,
Figure 334361DEST_PATH_IMAGE420
is non-linearA degree adjustment factor; the non-uniform partitioning is schematically shown in fig. 5.
According to the range of the error e and the error change rate ec, the e-ec plane is divided non-uniformly, and the set of divided areas (each square in fig. 6 is an area) is marked as
Figure 162640DEST_PATH_IMAGE421
As shown in fig. 6:
in each divided region
Figure 631798DEST_PATH_IMAGE422
PID control is carried out by a nonlinear increment PID controller, and the increment is controlled by nonlinear PID
Figure 647159DEST_PATH_IMAGE423
Comprises the following steps:
Figure 221316DEST_PATH_IMAGE424
wherein,
Figure 904102DEST_PATH_IMAGE425
to represent
Figure 481845DEST_PATH_IMAGE426
Region of time
Figure 718922DEST_PATH_IMAGE427
Non-linear PID control increments of (a);
Figure 837051DEST_PATH_IMAGE428
which is indicative of the time of the sampling,
Figure 639922DEST_PATH_IMAGE429
indicating area
Figure 450883DEST_PATH_IMAGE430
The ratio coefficient of the inner side of the outer ring,
Figure 440836DEST_PATH_IMAGE431
the scale is shown to be that of a scale,
Figure 628234DEST_PATH_IMAGE432
the lines are represented by a number of lines,
Figure 551191DEST_PATH_IMAGE433
the columns are represented.
Figure 267474DEST_PATH_IMAGE434
The value of the integral coefficient is represented by,
Figure 10302DEST_PATH_IMAGE435
represents a differential coefficient;
based on all regions
Figure 753391DEST_PATH_IMAGE436
Non-linear PID control increments of
Figure 265275DEST_PATH_IMAGE437
And calculating the weighted average increment of the nonlinear PID control:
Figure 418039DEST_PATH_IMAGE438
wherein,
Figure 648163DEST_PATH_IMAGE439
is a region
Figure 911785DEST_PATH_IMAGE440
The control law weight is incremented by one,
Figure 543755DEST_PATH_IMAGE441
is a region
Figure 867420DEST_PATH_IMAGE442
Error e (square in figure 6), radius of error rate of change ec,
Figure 319261DEST_PATH_IMAGE443
is a region
Figure 386574DEST_PATH_IMAGE444
Of the center of (c).
Synthesizing the above incremental control law weights
Figure 873050DEST_PATH_IMAGE445
Calculating
Figure 367617DEST_PATH_IMAGE446
Time incremental nonlinear PID control law
Figure 306754DEST_PATH_IMAGE447
Comprises the following steps:
Figure 177758DEST_PATH_IMAGE448
wherein,
Figure 512881DEST_PATH_IMAGE449
is an incremental factor, defined as:
Figure 178349DEST_PATH_IMAGE450
wherein,
Figure 604782DEST_PATH_IMAGE451
for the maximum value of the incremental factor,
Figure 545056DEST_PATH_IMAGE452
is an offset;
Figure 943808DEST_PATH_IMAGE453
for describing running deviations
Figure 780177DEST_PATH_IMAGE454
Rate of change of deviation from operation
Figure 631589DEST_PATH_IMAGE455
The degree of deviation from the origin is,
Figure 844396DEST_PATH_IMAGE456
is a scaling factor.
The method can enable different e and ec to have different increment factors.
Figure 628812DEST_PATH_IMAGE457
Is a running deviation
Figure 901662DEST_PATH_IMAGE458
Rate of change of deviation from running
Figure 302687DEST_PATH_IMAGE459
The controller increment factors under different conditions have the functions of improving the response speed of the system and reducing the complexity of the optimization process.
S5, controlling increment on the nonlinear PID in the region
Figure 47746DEST_PATH_IMAGE460
Parameter (2) of
Figure 952248DEST_PATH_IMAGE461
Figure 130420DEST_PATH_IMAGE462
Figure 753162DEST_PATH_IMAGE463
Performing online learning;
s5 specifically comprises the following steps: there are several pairs
Figure 42192DEST_PATH_IMAGE464
In the divided region
Figure 801201DEST_PATH_IMAGE465
Inner, then to the area
Figure 681432DEST_PATH_IMAGE466
Internal non-linear PID control increments
Figure 791470DEST_PATH_IMAGE467
Parameter (2) of
Figure 884191DEST_PATH_IMAGE468
Figure 763286DEST_PATH_IMAGE469
Figure 548839DEST_PATH_IMAGE470
And performing online learning.
Incremental nonlinear PID control using supervised Hebb learning rule
Figure 411753DEST_PATH_IMAGE471
The parameters of (2) are learned:
Figure 302305DEST_PATH_IMAGE155
(16)
wherein,
Figure 301485DEST_PATH_IMAGE472
is a region
Figure 992361DEST_PATH_IMAGE473
The learning rate in. To improve learning efficiency, learning rate
Figure 76991DEST_PATH_IMAGE474
Based on online adjustment rules
Figure 777094DEST_PATH_IMAGE475
The adjustment is carried out, namely:
Figure 834043DEST_PATH_IMAGE476
(17)
wherein,
Figure 695820DEST_PATH_IMAGE477
is a region
Figure 674271DEST_PATH_IMAGE478
The matching coefficient in the range, the learning rate range is adjusted,
Figure 912486DEST_PATH_IMAGE479
is a region
Figure 620679DEST_PATH_IMAGE480
Two weight coefficients within. The two weighting coefficients may be artificially given based on historical data.
As shown in fig. 4, the online optimizing control system for the unmanned line marking vehicle based on machine vision navigation comprises a vision sensor, a waterline detection unit, an operation error prediction unit, a prediction controller, a nonlinear incremental PID controller and an online learning rule unit;
the visual sensor collects a water line image (fig. 7) of a road surface, and a data sample is obtained:
the gray level image of the asphalt pavement is acquired through the vision sensor arranged on the line marking vehicle in the figure 1, a waterline drawn in advance is arranged on the pavement, and waterline information is required in the image. The visual sensor is arranged at a height of 35-45cm from the asphalt pavement, and the length of the visual field in the advancing direction of the marking vehicle is ensured to be 30-40cm. In order to ensure that the visual sensor is resistant to the interference of external light, a certain shading device is generally arranged outside the visual sensor.
The waterline detection unit performs waterline detection based on the waterline image to obtain a waterline;
the working process of the waterline detection unit specifically comprises the following steps:
s201, aiming at pixel points on the water line image
Figure 653357DEST_PATH_IMAGE481
Performing gray scale stretching, wherein the stretching formula is shown as formula (1):
Figure 198999DEST_PATH_IMAGE482
(1)
wherein,
Figure 506484DEST_PATH_IMAGE483
Figure 334763DEST_PATH_IMAGE484
Figure 803921DEST_PATH_IMAGE485
as a result of the stretching factor,
Figure 84861DEST_PATH_IMAGE486
represents an image
Figure 664878DEST_PATH_IMAGE487
Line, first
Figure 347663DEST_PATH_IMAGE488
Columns;
Figure 987723DEST_PATH_IMAGE489
is shown as
Figure 755959DEST_PATH_IMAGE490
Go to the first
Figure 139667DEST_PATH_IMAGE491
After the gray value of the gray value water line image of the column pixel points is stretched, the linear characteristic of each pixel point is expressed based on the HARR-like characteristic, and the HARR-like characteristic is shown in FIG. 2:
in the embodiment, 6 HARR-like features are selected according to engineering conditions, the number of the HARR-like features is large, and the HARR-like features are designed in the HARR-like features M1-M6 according to actual conditions to divide different regions. Assuming that the width of the water line region is 2w, where w is half the width of the water line, the width of the road surface region and the transition region in the HARR-like features M1-M3 is w. In the HARR-like features M4-M6, the width of each region is 2w, and the height of all HARR-like features is 4w-6w. For the rotation features M2, M3, M5, M6, the rotation angle is 15 degrees, which is determined by the site waterline characteristics.
S202, HARR-like feature for road projectionThe straight line characteristic of waterline on the face is favorable to the waterline discernment in later stage. For the ith HARR-like feature
Figure 676959DEST_PATH_IMAGE492
The calculation method is as formula (2):
Figure 487920DEST_PATH_IMAGE030
(2)
in the present embodiment, Q =6,
Figure 743452DEST_PATH_IMAGE493
for the sum of pixels of the ith HARR-like feature waterline region,
Figure 924991DEST_PATH_IMAGE494
the sum of the pixels of the ith HARR-like characteristic road surface area,
Figure 316789DEST_PATH_IMAGE495
Figure 298652DEST_PATH_IMAGE496
gray value based on stretched gray image
Figure 307059DEST_PATH_IMAGE497
Calculating to obtain;
after the description of the HARR-like features (straight line features) of each pixel point is obtained, normalizing each HARR-like feature, wherein the normalization formula is as shown in formula (3):
Figure 32570DEST_PATH_IMAGE339
(3)
wherein,
Figure 13295DEST_PATH_IMAGE498
in order to normalize the HARR signature after the normalization,
Figure 166059DEST_PATH_IMAGE499
Figure 68287DEST_PATH_IMAGE500
respectively the average value of the gray levels and the average value of the square of the gray levels in the detection window;
based on the 6 normalized HARR-like features (see formula 3), pixel points are constructed
Figure 331909DEST_PATH_IMAGE501
Feature vector of
Figure 698300DEST_PATH_IMAGE502
Figure 21965DEST_PATH_IMAGE503
S203, in order to detect the waterline in the image, each pixel point is subjected to detection
Figure 733526DEST_PATH_IMAGE504
Feature vector of
Figure 800839DEST_PATH_IMAGE505
Identifying and judging pixel points
Figure 287315DEST_PATH_IMAGE506
And (3) judging whether the characteristic vector accords with the linear characteristic, if so, setting 1 to the pixel point, otherwise, setting 0 to realize binarization of the gray level image, and further obtaining a binary image B. The method is innovative in that the binarization problem of the image is changed into the pattern recognition problem of the features.
The feature vector constructed by the SVM method is adopted for identification, and the kernel function is as follows:
Figure 781881DEST_PATH_IMAGE507
(5)
wherein,
Figure 721018DEST_PATH_IMAGE508
is a covariance matrix of the feature set.
Figure 592022DEST_PATH_IMAGE509
Is that
Figure 401847DEST_PATH_IMAGE510
The formula (5) is a kernel function to be designed in the SVM classification method, and is to use the feature vector
Figure 801735DEST_PATH_IMAGE511
Mapping to a higher dimension, judging whether the feature is a straight line feature through the SVM, and performing two-classification; constructing a linear feature vector for each pixel point according to the HARR-like features, and then judging whether the pixel point represented by each linear feature vector is on a straight line or not by using an SVM (support vector machine);
after obtaining the binary image, carrying out expansion corrosion treatment on the binary image B to obtain a new binary image
Figure 228168DEST_PATH_IMAGE512
Obtaining an edge image E of the waterline by adopting a formula (6);
Figure 637284DEST_PATH_IMAGE513
(6)
Figure 832773DEST_PATH_IMAGE514
representing a new binary image
Figure 934721DEST_PATH_IMAGE515
To (1) a
Figure 848451DEST_PATH_IMAGE516
Go to the first
Figure 789819DEST_PATH_IMAGE517
Pixel values of the columns;
Figure 105394DEST_PATH_IMAGE518
represents the edge image E
Figure 112664DEST_PATH_IMAGE519
The rows of the image data are, in turn,
Figure 716952DEST_PATH_IMAGE520
pixel values of the columns;
s204, identifying a straight line in the edge image E by adopting a HOUGH conversion method:
from new binary images
Figure 202291DEST_PATH_IMAGE521
Finding the coordinates of the pixel with the pixel value of 1 as
Figure 106793DEST_PATH_IMAGE522
Obtained by HOUGH conversion
Figure 550544DEST_PATH_IMAGE523
Obtained by the formula (7)
Figure 173286DEST_PATH_IMAGE524
Corresponding straight line expressions;
Figure 727895DEST_PATH_IMAGE525
(7)
wherein,
Figure 752483DEST_PATH_IMAGE526
respectively, the radius and angle (variable representation of straight line) detected by the HOUGH transform.
A group of
Figure 632714DEST_PATH_IMAGE527
Representing a straight line, given a set
Figure 742753DEST_PATH_IMAGE528
If at all
Figure 587473DEST_PATH_IMAGE529
Satisfy the formula(7) It is shown
Figure 732146DEST_PATH_IMAGE530
On the straight line represented by formula (7);
s205, due to the influence of noise in the image, a plurality of straight lines are detected, and the interference straight lines are removed in the embodiment to obtain real straight lines. The method for eliminating the interference straight line specifically comprises the following steps:
s205-1, selecting 3-5 straight lines with the longest length, wherein the length of the straight lines meets the threshold length (the length is greater than or equal to 1/4 of the image height, and the threshold length is 1/4 of the image height in the embodiment);
s205-2, because the waterline is continuous, the straight line parameters detected by the front frame image and the rear frame image
Figure 252120DEST_PATH_IMAGE531
The requirement of formula (8) is satisfied, namely:
Figure 849455DEST_PATH_IMAGE532
(8)
wherein,
Figure 480288DEST_PATH_IMAGE533
Figure 213888DEST_PATH_IMAGE534
are respectively as
Figure 170343DEST_PATH_IMAGE535
The maximum allowable deviation angle and radius,
Figure 254974DEST_PATH_IMAGE536
is a threshold value that is allowed for continuity,
Figure 689497DEST_PATH_IMAGE537
is the sampling instant of the image.
S205-3, if a plurality of straight lines meet the requirements after the process of S205-1 and S205-2, selecting the straight line with the highest pixel mean value on the straight line as the detected waterline, wherein the waterline is generally white and the brightness is highest in the image.
The operation error prediction unit selects a monitoring point of the current position of the vehicle based on the detected waterline, and calculates the operation deviation of the vehicle
Figure 808763DEST_PATH_IMAGE538
Rate of change of deviation from running
Figure 404961DEST_PATH_IMAGE539
Selecting the monitoring point of the predicted position of the vehicle, and calculating the predicted running deviation of the vehicle
Figure 242467DEST_PATH_IMAGE540
And predicting the rate of change of deviation
Figure 474822DEST_PATH_IMAGE541
The detected waterline is shown in FIG. 3:
as shown in FIG. 3, H,
Figure 448594DEST_PATH_IMAGE542
The distribution is the height and width of the edge image E (the height and width of the edge image E and the height and width of the horizontal line image H and W are the same). Taking the intersection point of the horizontal line at the position of 0.7H and the waterline as a monitoring point of the current position of the vehicle, and taking the 0.7H as y in the formula (7) to obtain the abscissa of the monitoring point of the current position
Figure 481272DEST_PATH_IMAGE543
. At this time, the running deviation of the vehicle
Figure 540495DEST_PATH_IMAGE544
Rate of change of deviation from operation
Figure 316821DEST_PATH_IMAGE545
Comprises the following steps:
0.7H is a value of this embodiment, the distance between the two horizontal lines in fig. 3 should be at least the distance that the vehicle can travel in one control cycle, but the two horizontal lines cannot be too close to the upper and lower boundaries of the image, and according to the set vehicle speed and the visual field range of the vision sensor, the present embodiment takes the intersection point of the horizontal line at 0.7H and the waterline as a monitoring point of the current position of the vehicle;
Figure 145100DEST_PATH_IMAGE546
(9)
wherein,
Figure 83100DEST_PATH_IMAGE547
obtained by camera calibration, represents the actual length (the actual distance represented by a pixel unit) represented by each pixel. The intersection point of the horizontal line at 0.3H and the waterline is used as a monitoring point of the predicted position of the vehicle to obtain the predicted running deviation of the vehicle
Figure 364039DEST_PATH_IMAGE548
And predicting the rate of change of deviation
Figure 147319DEST_PATH_IMAGE549
Figure 830104DEST_PATH_IMAGE550
(10)
This division is reasonable because the vehicle speed is approximately 40 m/min and the effective field of view of the vision sensor is 20-30 cm.
As seen in fig. 3, the current position error is on the left, but to the right by the predicted position. The error cannot be adjusted excessively at the current position and the current adjustment can be fine-tuned by the prediction error.
The predictive controller predicts an operating deviation of the vehicle based on the predicted operating deviation
Figure 470164DEST_PATH_IMAGE551
And predicting the rate of change of deviation
Figure 503979DEST_PATH_IMAGE552
Predicting a feedforward control amount of the vehicle
Figure 616248DEST_PATH_IMAGE553
Figure 419119DEST_PATH_IMAGE554
(11)
Figure 230080DEST_PATH_IMAGE555
In order to predict the feedforward control quantity of the controller,
Figure 485612DEST_PATH_IMAGE556
and with
Figure 141853DEST_PATH_IMAGE557
Are two parameters of the predictive controller;
the nonlinear incremental PID controller is in a nonlinear division area of an e-ec plane and is based on the running deviation of the vehicle
Figure 64810DEST_PATH_IMAGE558
Rate of change of deviation from running
Figure 46672DEST_PATH_IMAGE559
Obtaining an incremental nonlinear PID control law for a vehicle
Figure 789500DEST_PATH_IMAGE560
The error e or the error change rate ec of the system accords with Gaussian distribution, so that the non-uniform dividing method for constructing the error e and the error change rate ec based on the Gaussian function is adopted, and the e-ec plane takes the error e as a horizontal axis and takes the error change rate ec as a vertical axis; error or rate of change of error refers to deviation in the operation of the vehicle
Figure 249431DEST_PATH_IMAGE561
Rate of change of deviation from operation
Figure 230157DEST_PATH_IMAGE562
The e-ec plane division adopts a nonlinear division method, and the nonlinear division method comprises the following steps:
Figure 382921DEST_PATH_IMAGE563
(12)
when the division of the error change rate ec is calculated,
Figure 347466DEST_PATH_IMAGE564
is that
Figure 605228DEST_PATH_IMAGE565
When the division of the error e is calculated,
Figure 971619DEST_PATH_IMAGE566
is that
Figure 295284DEST_PATH_IMAGE567
Figure 12704DEST_PATH_IMAGE568
Figure 80017DEST_PATH_IMAGE569
Respectively, the maximum of the absolute values of the error e and the error rate of change ec, wherein,
Figure 35335DEST_PATH_IMAGE570
equal division points for error e or error change rate ec,
Figure 529901DEST_PATH_IMAGE571
in order to non-uniformly divide the points after mapping,
Figure 813246DEST_PATH_IMAGE572
adjusting a factor for the degree of non-linearity; the non-uniform partitioning diagram is shown in fig. 5.
According to the range of the error e and the error change rate ec, the e-ec plane is divided non-uniformly, and the set of divided areas (each square in fig. 6 is an area) is marked as
Figure 418671DEST_PATH_IMAGE573
As shown in fig. 6:
in each divided region
Figure 759654DEST_PATH_IMAGE574
PID control is carried out by a nonlinear increment PID controller, and the increment is controlled by the nonlinear PID
Figure 708279DEST_PATH_IMAGE575
Comprises the following steps:
Figure 134712DEST_PATH_IMAGE576
(13)
wherein,
Figure 809407DEST_PATH_IMAGE577
represent
Figure 4896DEST_PATH_IMAGE578
Region of time of day
Figure 841265DEST_PATH_IMAGE579
Non-linear PID control increments of (1);
Figure 489415DEST_PATH_IMAGE580
which is indicative of the time of the sampling,
Figure 436643DEST_PATH_IMAGE581
indicating area
Figure 752217DEST_PATH_IMAGE582
The internal proportionality coefficient of the air-fuel ratio,
Figure 493909DEST_PATH_IMAGE583
the scale is shown to be that of,
Figure 629355DEST_PATH_IMAGE584
the lines are represented by a number of lines,
Figure 380273DEST_PATH_IMAGE585
a presentation column;
Figure 284775DEST_PATH_IMAGE586
the value of the integral coefficient is represented by,
Figure 457087DEST_PATH_IMAGE587
representing the differential coefficient.
Based on all regions
Figure 345409DEST_PATH_IMAGE588
Non-linear PID control increments of
Figure 368860DEST_PATH_IMAGE589
Calculating a nonlinear PID control weighted average increment:
Figure 659027DEST_PATH_IMAGE590
(14)
wherein,
Figure 273679DEST_PATH_IMAGE591
is a region
Figure 383717DEST_PATH_IMAGE592
The control law weight is incremented by one,
Figure 945280DEST_PATH_IMAGE593
is a region
Figure 824374DEST_PATH_IMAGE594
Error e (square in figure 6), radius of error rate of change ec,
Figure 16452DEST_PATH_IMAGE595
is a region
Figure 82628DEST_PATH_IMAGE596
Of the center of (a).
Synthesizing the above incremental control law weights
Figure 979040DEST_PATH_IMAGE597
Calculating
Figure 441202DEST_PATH_IMAGE598
Time incremental nonlinear PID control law
Figure 397657DEST_PATH_IMAGE599
Comprises the following steps:
Figure 216708DEST_PATH_IMAGE600
(15)
wherein,
Figure 651232DEST_PATH_IMAGE601
is an incremental factor, which is defined as follows:
Figure 239339DEST_PATH_IMAGE602
wherein,
Figure 366695DEST_PATH_IMAGE603
for the maximum value of the incremental factor,
Figure 938622DEST_PATH_IMAGE604
is an offset;
Figure 176836DEST_PATH_IMAGE605
for describing the running deviation of the vehicle
Figure 619450DEST_PATH_IMAGE606
Rate of change of deviation from operation
Figure 917708DEST_PATH_IMAGE607
The degree of deviation from the origin;
Figure 711351DEST_PATH_IMAGE608
Figure 481818DEST_PATH_IMAGE609
is a scaling factor.
The method can lead the error change rate ec to have different increment factors according to different errors e.
Figure 778938DEST_PATH_IMAGE610
Gives out the running deviation
Figure 982518DEST_PATH_IMAGE611
Rate of change of deviation from running
Figure 997878DEST_PATH_IMAGE612
The controller increment factors under different conditions have the functions of improving the response speed of the system and reducing the complexity of the optimization process.
On-line learning rule unit for non-linear PID control increment in region
Figure 312316DEST_PATH_IMAGE613
Parameter (2)
Figure 729522DEST_PATH_IMAGE614
Figure 838423DEST_PATH_IMAGE615
Figure 341080DEST_PATH_IMAGE616
Performing online learning, and adopting supervised Hebb learning rule to control increment of nonlinear PID
Figure 459209DEST_PATH_IMAGE617
Learning the parameters;
the process of the online learning rule unit specifically comprises the following steps: there are several pairs
Figure 262080DEST_PATH_IMAGE618
In the divided region
Figure 73041DEST_PATH_IMAGE619
Inner, then to the area
Figure 346151DEST_PATH_IMAGE620
Internal non-linear PID control increments
Figure 2391DEST_PATH_IMAGE621
Parameter (2) of
Figure 659769DEST_PATH_IMAGE622
Figure 641631DEST_PATH_IMAGE623
Figure 118880DEST_PATH_IMAGE624
And performing online learning. Increment control of nonlinear PID by adopting supervised Hebb learning rule
Figure 109970DEST_PATH_IMAGE625
The parameters of (2) are learned:
Figure 621854DEST_PATH_IMAGE155
(16)
wherein,
Figure 774618DEST_PATH_IMAGE626
is a region
Figure 739163DEST_PATH_IMAGE627
The inner learning rate. To improve learning efficiency, learning rate
Figure 268364DEST_PATH_IMAGE628
Based on online adjustment rules
Figure 634754DEST_PATH_IMAGE629
The adjustment is carried out, namely:
Figure 958419DEST_PATH_IMAGE630
(17)
wherein,
Figure 675840DEST_PATH_IMAGE631
is a region
Figure 737293DEST_PATH_IMAGE632
The matching coefficient in the range, the learning rate range is adjusted,
Figure 958190DEST_PATH_IMAGE633
is a region
Figure 452757DEST_PATH_IMAGE634
Two weight coefficients within. The two weighting coefficients may be artificially given based on historical data.
The part adjusts the parameters of the controllers of each area on line, and realizes the on-line adjustment of the learning rate of each area.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that: rather, the invention as claimed requires more features than are expressly recited in each claim. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules or units or groups of devices in the examples disclosed herein may be arranged in a device as described in this embodiment, or alternatively may be located in one or more devices different from the devices in this example. The modules in the foregoing examples may be combined into one module or may additionally be divided into multiple sub-modules.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. Modules or units or groups in embodiments may be combined into one module or unit or group and may furthermore be divided into sub-modules or sub-units or sub-groups. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Moreover, those skilled in the art will appreciate that although some embodiments described herein include some features included in other embodiments, not others, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
Furthermore, some of the described embodiments are described herein as a method or combination of method elements that can be performed by a processor of a computer system or by other means of performing the described functions. A processor having the necessary instructions for carrying out the method or method elements thus forms a means for carrying out the method or method elements. Further, the elements of the apparatus embodiments described herein are examples of the following apparatus: the apparatus is used to implement the functions performed by the elements for the purpose of carrying out the invention.
The various techniques described herein may be implemented in connection with hardware or software or, alternatively, with a combination of both. Thus, the methods and apparatus of the present invention, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium, wherein, when the program is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the invention.
In the case of program code execution on programmable computers, the computing device will generally include a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. Wherein the memory is configured to store program code; the processor is configured to perform the method of the invention according to instructions in said program code stored in the memory.
By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer-readable media includes both computer storage media and communication media. Computer storage media stores information such as computer readable instructions, data structures, program modules or other data. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. Combinations of any of the above are also included within the scope of computer readable media.
As used herein, unless otherwise specified the use of the ordinal adjectives "first", "second", "third", etc., to describe a common object, merely indicate that different instances of like objects are being referred to, and are not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
While the invention has been described with respect to a limited number of embodiments, those skilled in the art, having benefit of this description, will appreciate that other embodiments can be devised which do not depart from the scope of the invention as disclosed herein. Moreover, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the appended claims. The present invention has been disclosed in an illustrative rather than a restrictive sense, and the scope of the present invention is defined by the appended claims.

Claims (10)

1. The on-line optimization control method of the unmanned line marking vehicle based on machine vision navigation is characterized by comprising the following steps:
s1, collecting a water line image of a road surface;
s2, carrying out waterline detection based on the waterline image to obtain a waterline;
s3, selecting a monitoring point of the current position of the vehicle based on the detected waterline, and calculating the running deviation of the vehicle
Figure 911791DEST_PATH_IMAGE001
Rate of change of deviation from running
Figure 173008DEST_PATH_IMAGE002
Selecting the monitoring point of the predicted position of the vehicle, and calculating the predicted running deviation of the vehicle
Figure 169783DEST_PATH_IMAGE003
And predicting the rate of change of deviation
Figure 350098DEST_PATH_IMAGE004
S4, according to the predicted operation deviation of the vehicle
Figure 697903DEST_PATH_IMAGE005
And predicting the rate of change of deviation
Figure 299828DEST_PATH_IMAGE006
Obtaining a predicted feedforward control quantity
Figure 151110DEST_PATH_IMAGE007
Based on running deviation of the vehicle
Figure 643271DEST_PATH_IMAGE008
Rate of change of deviation from operation
Figure 212792DEST_PATH_IMAGE009
Obtaining an incremental nonlinear PID control law for a vehicle
Figure 81391DEST_PATH_IMAGE010
S5, carrying out incremental nonlinear PID control law in the region
Figure 787179DEST_PATH_IMAGE011
The parameters in (1) are learned online.
2. The on-line optimizing control method for the unmanned line marking vehicle based on machine vision navigation as claimed in claim 1,
the step S2 specifically includes the following steps:
s201, aiming at pixel points on the water line image
Figure 715821DEST_PATH_IMAGE012
Performing gray scale stretching, wherein the stretching formula is shown as formula (1):
Figure 38218DEST_PATH_IMAGE013
(1)
wherein,
Figure 710508DEST_PATH_IMAGE014
Figure 67540DEST_PATH_IMAGE015
Figure 167083DEST_PATH_IMAGE016
is a stretch factor;
Figure 236496DEST_PATH_IMAGE017
Figure 712476DEST_PATH_IMAGE018
represents an image of
Figure 658436DEST_PATH_IMAGE019
Go to the first
Figure 663301DEST_PATH_IMAGE020
Columns;
Figure 694711DEST_PATH_IMAGE021
denotes the first
Figure 708803DEST_PATH_IMAGE022
Go to the first
Figure 509269DEST_PATH_IMAGE023
Gray values of the column pixel points; after the gray level of the water line image is stretched, selecting
Figure 950614DEST_PATH_IMAGE024
A personal HARR-like feature;
s202, aiming at the first
Figure 479DEST_PATH_IMAGE025
Characteristic of personal HARR
Figure 83841DEST_PATH_IMAGE026
Figure 4393DEST_PATH_IMAGE027
The calculation method is as shown in formula (2):
Figure 888078DEST_PATH_IMAGE028
(2)
wherein,
Figure 362922DEST_PATH_IMAGE029
is a first
Figure 249975DEST_PATH_IMAGE030
The sum of pixels of the individual HARR-like feature waterline regions,
Figure 290613DEST_PATH_IMAGE031
is as follows
Figure 339340DEST_PATH_IMAGE032
The sum of pixels of a road surface area with characteristics similar to HARR,
Figure 567059DEST_PATH_IMAGE033
Figure 992224DEST_PATH_IMAGE034
gray value based on stretched gray image
Figure 621789DEST_PATH_IMAGE035
Calculating to obtain;
after the HARR-like feature description of each pixel point is obtained, normalizing each HARR-like feature, wherein the normalization formula is as shown in formula (3):
Figure 372576DEST_PATH_IMAGE036
(3)
wherein,
Figure 87591DEST_PATH_IMAGE037
Figure 802867DEST_PATH_IMAGE038
in order to normalize the HARR signature after the normalization,
Figure 818096DEST_PATH_IMAGE039
Figure 208626DEST_PATH_IMAGE040
respectively the average value of the gray levels and the average value of the square of the gray levels in the detection window;
based on normalized HARR-like characteristics, constructing pixel points
Figure 676517DEST_PATH_IMAGE041
Feature vector of
Figure 912326DEST_PATH_IMAGE042
Figure 250903DEST_PATH_IMAGE043
(4)
S203, aiming at each pixel point
Figure 812335DEST_PATH_IMAGE044
Feature vector of
Figure 236363DEST_PATH_IMAGE045
Identifying and judging pixel points
Figure 541442DEST_PATH_IMAGE046
If the characteristic vector accords with the linear characteristic, setting 1 for the pixel point, and otherwise, setting 0 for realizing the binaryzation of the gray level image to obtain a binary image B;
carrying out expansion corrosion treatment on the binary image B to obtain a new binary image
Figure 661757DEST_PATH_IMAGE047
And obtaining an edge image E of the waterline by adopting a formula (6):
Figure 128510DEST_PATH_IMAGE048
Figure 39835DEST_PATH_IMAGE049
representing a new binary image
Figure 883026DEST_PATH_IMAGE050
To (1) a
Figure 196195DEST_PATH_IMAGE051
Go to the first
Figure 833850DEST_PATH_IMAGE052
Pixel values of the columns;
Figure 966891DEST_PATH_IMAGE053
represents the edge image E
Figure 348194DEST_PATH_IMAGE054
Go to the first
Figure 515870DEST_PATH_IMAGE055
Pixel values of the columns;
s204, identifying a straight line in the edge image E by adopting a HOUGH conversion method:
from new binary images
Figure 590005DEST_PATH_IMAGE056
Where the coordinates of the pixel having the pixel value of 1 are found to be
Figure 475922DEST_PATH_IMAGE057
Each straight line being obtained by HOUGH conversion
Figure 982952DEST_PATH_IMAGE058
Obtained by the formula (7)
Figure 536293DEST_PATH_IMAGE059
A corresponding linear expression;
Figure 515751DEST_PATH_IMAGE060
wherein,
Figure 623384DEST_PATH_IMAGE061
and
Figure 877648DEST_PATH_IMAGE062
radius and angle detected by HOUGH transformation;
a group of
Figure 285495DEST_PATH_IMAGE063
Representing a straight line, given a set
Figure 701433DEST_PATH_IMAGE064
If at all
Figure 296362DEST_PATH_IMAGE065
Satisfies the expression of the formula (7)
Figure 88738DEST_PATH_IMAGE066
On the straight line represented by formula (7);
s205, eliminating interference straight lines;
the method for eliminating the interference straight line specifically comprises the following steps:
s205-1: selecting a plurality of straight lines with the longest length, wherein the length of the straight lines meets the length of a threshold value;
s205-2: front and back two-value image
Figure 85513DEST_PATH_IMAGE067
Detected straight line parameter
Figure 937931DEST_PATH_IMAGE068
Satisfies the requirement of formula (8):
Figure 291595DEST_PATH_IMAGE069
wherein,
Figure 622082DEST_PATH_IMAGE070
Figure 4522DEST_PATH_IMAGE071
are respectively as
Figure 496683DEST_PATH_IMAGE072
The maximum allowable deviation angle and radius,
Figure 66205DEST_PATH_IMAGE073
is a threshold value that is allowed for continuity,
Figure 934804DEST_PATH_IMAGE074
is the sampling time of the image;
s205-3: and if more than one straight line meets the requirement after the steps S205-1 and S205-2, selecting the straight line with the highest pixel average value on the straight line as the detected waterline.
3. The on-line optimizing control method for the unmanned line marking vehicle based on machine vision navigation as claimed in claim 1,
the step S3 specifically includes the following steps:
the height and width of the edge image E at which the waterline is located are H and
Figure 906171DEST_PATH_IMAGE075
selecting the intersection point of the horizontal line with the height of y and the waterline as the monitor of the current position of the vehicleMeasuring point (
Figure 208714DEST_PATH_IMAGE076
Y) running deviation of the vehicle
Figure 345427DEST_PATH_IMAGE077
Rate of change of deviation from running
Figure 283296DEST_PATH_IMAGE078
Comprises the following steps:
Figure 843590DEST_PATH_IMAGE079
wherein,
Figure 943133DEST_PATH_IMAGE080
the actual length represented by each pixel is represented by the camera calibration;
selecting a height of
Figure 752826DEST_PATH_IMAGE081
The intersection point of the horizontal line and the waterline is used as a monitoring point of the predicted position of the vehicle (
Figure 228807DEST_PATH_IMAGE082
Figure 440345DEST_PATH_IMAGE083
) Obtaining a predicted running deviation of the vehicle
Figure 976369DEST_PATH_IMAGE084
And predicting the rate of change of deviation
Figure 601254DEST_PATH_IMAGE085
Figure 152364DEST_PATH_IMAGE086
Wherein,
Figure 687251DEST_PATH_IMAGE087
the distribution is the width of the edge image E.
4. The on-line optimizing control method for the unmanned line marking vehicle based on machine vision navigation as claimed in claim 1,
s4 specifically comprises the following steps:
s401, according to the predicted operation deviation of the vehicle
Figure 394176DEST_PATH_IMAGE088
And predicting the rate of change of deviation
Figure 912882DEST_PATH_IMAGE089
Calculating a feedforward control amount of the vehicle
Figure 465086DEST_PATH_IMAGE090
Figure 385637DEST_PATH_IMAGE091
Figure 732305DEST_PATH_IMAGE092
In order to calculate the feedforward control amount,
Figure 472728DEST_PATH_IMAGE093
and
Figure 94202DEST_PATH_IMAGE094
are two parameters of predictive control;
s402, in the area of the e-ec plane, based on the running deviation of the vehicle
Figure 869260DEST_PATH_IMAGE095
Rate of change of deviation from operation
Figure 652408DEST_PATH_IMAGE096
Obtaining an incremental nonlinear PID control law for a vehicle
Figure 874268DEST_PATH_IMAGE097
S402 specifically includes the following steps: the non-uniform dividing method for constructing the error e and the error change rate ec based on the Gaussian function is characterized in that an e-ec plane takes the error e as a horizontal axis and takes the error change rate ec as a vertical axis; the error e and the error change rate ec refer to the running deviation of the vehicle
Figure 768275DEST_PATH_IMAGE098
And rate of change of operating deviation
Figure 663418DEST_PATH_IMAGE099
(ii) a The e-ec plane division adopts a nonlinear division method as follows:
Figure 617468DEST_PATH_IMAGE100
when the division of the error change rate ec is calculated,
Figure 332483DEST_PATH_IMAGE101
is that
Figure 30181DEST_PATH_IMAGE102
When the division of the error e is calculated,
Figure 514252DEST_PATH_IMAGE103
is that
Figure 639202DEST_PATH_IMAGE104
Figure 841514DEST_PATH_IMAGE105
And
Figure 77323DEST_PATH_IMAGE106
respectively the maximum of the absolute values of the error e and the error rate of change ec,
Figure 947059DEST_PATH_IMAGE107
equal division points for error e or error change rate ec,
Figure 779929DEST_PATH_IMAGE108
non-uniform dividing points after mapping;
Figure 469536DEST_PATH_IMAGE109
adjusting a factor for the degree of non-linearity;
after the e-ec plane is divided in a non-uniform way, a divided area set is marked as
Figure 774616DEST_PATH_IMAGE110
In each divided region
Figure 233279DEST_PATH_IMAGE111
Internally performing PID control, and non-linear PID control increment
Figure 965611DEST_PATH_IMAGE112
Comprises the following steps:
Figure 408094DEST_PATH_IMAGE113
wherein,
Figure 516864DEST_PATH_IMAGE114
region representing time k
Figure 830034DEST_PATH_IMAGE115
Non-linear PID control increments of (a);
Figure 467689DEST_PATH_IMAGE116
which is indicative of the time of the sampling,
Figure 397467DEST_PATH_IMAGE117
indicating area
Figure 778770DEST_PATH_IMAGE118
The internal proportionality coefficient of the air-fuel ratio,
Figure 42104DEST_PATH_IMAGE119
the scale is shown to be that of a scale,
Figure 850660DEST_PATH_IMAGE120
the lines are represented as a result of,
Figure 2155DEST_PATH_IMAGE121
a presentation column;
Figure 921570DEST_PATH_IMAGE122
indicating area
Figure 209332DEST_PATH_IMAGE123
Inner integral the coefficients of which are such that,
Figure 719947DEST_PATH_IMAGE124
indicating area
Figure 93160DEST_PATH_IMAGE125
The inner differential coefficient;
based on all regions
Figure 816265DEST_PATH_IMAGE126
Non-linear PID control increments of
Figure 958534DEST_PATH_IMAGE127
And calculating the weighted average increment of the nonlinear PID control:
Figure 905630DEST_PATH_IMAGE128
wherein,
Figure 427790DEST_PATH_IMAGE129
is a region
Figure 220166DEST_PATH_IMAGE130
The weight of the control law is increased by an amount,
Figure 951361DEST_PATH_IMAGE131
is a region
Figure 272621DEST_PATH_IMAGE132
The error e and the radius of the error rate of change ec,
Figure 354847DEST_PATH_IMAGE133
is a region
Figure 419755DEST_PATH_IMAGE134
The center of (a);
calculating out
Figure 536615DEST_PATH_IMAGE135
Incremental non-linear PID control law of time
Figure 294356DEST_PATH_IMAGE136
Comprises the following steps:
Figure 863877DEST_PATH_IMAGE137
wherein,
Figure 466897DEST_PATH_IMAGE138
is an incremental factor, defined as:
Figure 438264DEST_PATH_IMAGE139
wherein,
Figure 366906DEST_PATH_IMAGE140
for the maximum value of the incremental factor,
Figure 417864DEST_PATH_IMAGE141
is an offset;
Figure 90154DEST_PATH_IMAGE142
for describing the degree to which the error e and the rate of change of error ec deviate from the origin,
Figure 916027DEST_PATH_IMAGE143
Figure 281150DEST_PATH_IMAGE144
is a scaling factor;
the method can enable different e and ec to have different increment factors.
5. The on-line optimizing control method for the unmanned line marking vehicle based on machine vision navigation as claimed in claim 1,
the step S5 specifically includes the following steps:
there are a number of pairs
Figure 559684DEST_PATH_IMAGE145
In a divided region
Figure 770086DEST_PATH_IMAGE146
Inner, then to the area
Figure 716045DEST_PATH_IMAGE147
Internal non-linear PID control increments
Figure 986489DEST_PATH_IMAGE148
Parameter (2)
Figure 752320DEST_PATH_IMAGE149
Figure 766412DEST_PATH_IMAGE150
Figure 566878DEST_PATH_IMAGE151
Performing online learning;
incremental nonlinear PID control using supervised Hebb learning rule
Figure 545242DEST_PATH_IMAGE152
The parameters of (2) are learned:
Figure 63948DEST_PATH_IMAGE153
(16)
wherein,
Figure 616152DEST_PATH_IMAGE154
is a region
Figure 5545DEST_PATH_IMAGE155
Internal learning rate, learning rate
Figure 883371DEST_PATH_IMAGE156
Based on online adjustment rules
Figure 889373DEST_PATH_IMAGE157
Adjusting:
Figure 776427DEST_PATH_IMAGE158
(17)
wherein,
Figure 941698DEST_PATH_IMAGE159
is a region
Figure 459267DEST_PATH_IMAGE160
The matching coefficient in the range is used for adjusting the learning rate range,
Figure 952565DEST_PATH_IMAGE161
is a region
Figure 395308DEST_PATH_IMAGE162
Two weight coefficients within.
6. The unmanned line marking vehicle online optimization control system based on machine vision navigation is characterized by comprising a vision sensor, a waterline detection unit, an operation error prediction unit, a prediction controller, a nonlinear increment PID controller and an online learning rule unit;
a vision sensor collects a water line image of a road surface;
the waterline detection unit performs waterline detection based on the waterline image to obtain a waterline;
the operation error prediction unit selects a monitoring point of the current position of the vehicle based on the detected waterline, and calculates the operation deviation of the vehicle
Figure 290452DEST_PATH_IMAGE163
Rate of change of deviation from operation
Figure 244501DEST_PATH_IMAGE164
Selecting the monitoring point of the predicted position of the vehicle, and calculating the predicted running deviation of the vehicle
Figure 225096DEST_PATH_IMAGE165
And predicting the rate of change of deviation
Figure 188372DEST_PATH_IMAGE166
The predictive controller predicts an operating deviation of the vehicle based on the predicted operating deviation
Figure 406864DEST_PATH_IMAGE167
And predicting the rate of change of deviation
Figure 531815DEST_PATH_IMAGE168
Calculating a feedforward control quantity
Figure 734126DEST_PATH_IMAGE169
The nonlinear incremental PID controller is in a nonlinear divided area on an e-ec plane and is based on the running deviation of the vehicle
Figure 969935DEST_PATH_IMAGE170
Rate of change of deviation from running
Figure 574092DEST_PATH_IMAGE171
Obtaining an incremental nonlinear PID control law for a vehicle
Figure 869944DEST_PATH_IMAGE172
Online learning rule Unit versus nonlinear PID control increments within a region
Figure 893307DEST_PATH_IMAGE173
The parameters of (2) are learned.
7. The on-line optimizing control system for an unmanned line marking vehicle based on machine vision navigation of claim 6,
the working process of the waterline detection unit specifically comprises the following steps:
s201, aiming at pixel points on the water line image
Figure 995124DEST_PATH_IMAGE174
Performing gray scale stretching, wherein the stretching formula is shown as formula (1):
Figure 453787DEST_PATH_IMAGE175
wherein,
Figure 920541DEST_PATH_IMAGE176
Figure 97444DEST_PATH_IMAGE177
Figure 675056DEST_PATH_IMAGE178
as a result of the stretching factor,
Figure 988226DEST_PATH_IMAGE179
represents an image of
Figure 625880DEST_PATH_IMAGE180
Go to the first
Figure 24501DEST_PATH_IMAGE181
Columns;
Figure 405803DEST_PATH_IMAGE182
denotes the first
Figure 573480DEST_PATH_IMAGE183
Go to the first
Figure 376176DEST_PATH_IMAGE184
Gray values of the column pixel points; after stretching the gray scale of the water line image, selecting
Figure 262093DEST_PATH_IMAGE185
The individual HARR-like features express the linear features of each pixel point;
s202, aiming at the first step
Figure 509403DEST_PATH_IMAGE186
Generic HARR characteristics
Figure 531586DEST_PATH_IMAGE187
The calculation method is as shown in formula (2):
Figure 511043DEST_PATH_IMAGE188
(2)
wherein,
Figure 884256DEST_PATH_IMAGE189
for the pixel sum of the i-th HARR-like feature waterline region,
Figure 138520DEST_PATH_IMAGE190
for the sum of pixels of the ith HARR-like feature road surface area,
Figure 811946DEST_PATH_IMAGE191
Figure 493463DEST_PATH_IMAGE192
from the gray value of the stretched gray image
Figure 353972DEST_PATH_IMAGE193
Calculating to obtain;
after the HARR-like feature description of each pixel point is obtained, normalizing each HARR-like feature, wherein the normalization formula is as shown in formula (3):
Figure 417786DEST_PATH_IMAGE194
wherein,
Figure 414561DEST_PATH_IMAGE195
is as follows
Figure 266979DEST_PATH_IMAGE196
The number of normalized HARR features is then determined,
Figure 880363DEST_PATH_IMAGE197
Figure 210850DEST_PATH_IMAGE198
respectively the average value of the gray levels and the average value of the square of the gray levels in the detection window;
based on normalized HARR-like characteristics, constructing pixel points
Figure 327711DEST_PATH_IMAGE199
Feature vector of
Figure 351031DEST_PATH_IMAGE200
Figure 920552DEST_PATH_IMAGE201
(4)
S203, aiming at each pixel point
Figure 585889DEST_PATH_IMAGE202
Feature vector of
Figure 557256DEST_PATH_IMAGE203
Identifying and judging pixel points
Figure 237897DEST_PATH_IMAGE204
If the characteristic vector accords with the linear characteristic, setting 1 for the pixel point, and otherwise, setting 0 for realizing the binaryzation of the gray level image so as to obtain a binary image B;
carrying out expansion corrosion treatment on the binary image B to obtain a new binary image
Figure 560293DEST_PATH_IMAGE205
Obtaining an edge image E of the waterline by adopting a formula (6),
Figure 232583DEST_PATH_IMAGE206
Figure 589615DEST_PATH_IMAGE207
representing a new binary image
Figure 423579DEST_PATH_IMAGE208
To (1) a
Figure 498851DEST_PATH_IMAGE209
Go to the first
Figure 974832DEST_PATH_IMAGE210
Pixel values of the columns;
Figure 186371DEST_PATH_IMAGE211
represents the edge image E
Figure 722394DEST_PATH_IMAGE212
The number of rows is such that,
Figure 753804DEST_PATH_IMAGE213
pixel values of the columns;
s204, identifying a straight line in the edge image E by adopting a HOUGH conversion method:
from new binary images
Figure 767896DEST_PATH_IMAGE214
Finding the coordinates of the pixel with the pixel value of 1 as
Figure 761172DEST_PATH_IMAGE215
Obtained by HOUGH conversion
Figure 468097DEST_PATH_IMAGE216
Obtained by the formula (7)
Figure 986803DEST_PATH_IMAGE217
A corresponding linear expression;
Figure 398062DEST_PATH_IMAGE218
wherein,
Figure 177668DEST_PATH_IMAGE219
radius and angle detected by HOUGH transformation;
a group of
Figure 377531DEST_PATH_IMAGE219
Representing a straight line, given a set
Figure 711429DEST_PATH_IMAGE220
If, if
Figure 660799DEST_PATH_IMAGE221
Satisfies the formula (7)
Figure 498174DEST_PATH_IMAGE222
On the straight line represented by formula (7);
s205, eliminating interference straight lines;
the interference line removing method specifically comprises the following steps:
s205-1, selecting a plurality of straight lines with the longest length and the length meeting the threshold length;
s205-2, two frames of binary images
Figure 812481DEST_PATH_IMAGE223
Detected straight line parameter
Figure 836937DEST_PATH_IMAGE224
Satisfies formula (8):
Figure 527682DEST_PATH_IMAGE225
wherein,
Figure 157246DEST_PATH_IMAGE226
Figure 179472DEST_PATH_IMAGE227
are respectively as
Figure 487963DEST_PATH_IMAGE228
The maximum allowable deviation angle and radius,
Figure 982398DEST_PATH_IMAGE229
is a threshold value that is allowed for continuity,
Figure 59944DEST_PATH_IMAGE230
is the sampling time of the image;
s205-3, if 2 or more than 2 straight lines still exist after the process of S205-1 and S205-2, selecting the straight line with the highest pixel mean value on the straight line as the detected waterline.
8. The on-line optimizing control system of the unmanned line marking vehicle based on machine vision navigation as claimed in claim 6,
the specific working process of the operation error prediction unit comprises the following steps: the height and width of the edge image E of the waterline are H and H respectively
Figure 434162DEST_PATH_IMAGE231
(ii) a Selecting the intersection point of the horizontal line with the height of y and the waterline as the monitoring point of the current position of the vehicle: (
Figure 450790DEST_PATH_IMAGE232
,y),Calculating a running deviation of a vehicle
Figure 748916DEST_PATH_IMAGE233
Rate of change of deviation from operation
Figure 149810DEST_PATH_IMAGE234
Comprises the following steps:
Figure 507979DEST_PATH_IMAGE235
wherein,
Figure 728745DEST_PATH_IMAGE236
the actual length represented by each pixel is represented by the camera calibration;
selecting a height of
Figure 768245DEST_PATH_IMAGE237
(the intersection point of the horizontal line and the waterline is taken as a monitoring point of the predicted position of the vehicle
Figure 492487DEST_PATH_IMAGE238
Figure 224820DEST_PATH_IMAGE239
) Obtaining a predicted running deviation of the vehicle
Figure 469900DEST_PATH_IMAGE240
And predicting the rate of change of deviation
Figure 77205DEST_PATH_IMAGE241
Figure 374063DEST_PATH_IMAGE242
9. The on-line optimizing control system for an unmanned line marking vehicle based on machine vision navigation of claim 6,
feedforward control amount of vehicle predicted by prediction controller
Figure 864913DEST_PATH_IMAGE243
Comprises the following steps:
Figure 60271DEST_PATH_IMAGE244
Figure 972732DEST_PATH_IMAGE245
in order to predict the feedforward control quantity of the controller,
Figure 937146DEST_PATH_IMAGE246
and
Figure 745702DEST_PATH_IMAGE247
are two parameters of the predictive controller;
the working process of the nonlinear incremental PID controller specifically comprises the following steps:
constructing a non-uniform division method of an error e and an error change rate ec based on a Gaussian function, wherein an e-ec plane takes the error e as a horizontal axis and the error change rate ec as a vertical axis; error or rate of change of error refers to deviation in the operation of the vehicle
Figure 631619DEST_PATH_IMAGE248
Rate of change of deviation from operation
Figure 613350DEST_PATH_IMAGE249
The e-ec plane nonlinear division method comprises the following steps:
Figure 166691DEST_PATH_IMAGE250
when the division of the error change rate ec is calculated,
Figure 11062DEST_PATH_IMAGE251
is that
Figure 446592DEST_PATH_IMAGE252
When the division of the error e is calculated,
Figure 435276DEST_PATH_IMAGE253
is that
Figure 108703DEST_PATH_IMAGE254
Figure 524641DEST_PATH_IMAGE255
Figure 385150DEST_PATH_IMAGE256
Respectively, the maximum of the absolute values of the error e and the error change rate ec, wherein,
Figure 911946DEST_PATH_IMAGE257
equal division points for error e or error change rate ec,
Figure 174300DEST_PATH_IMAGE258
non-uniform dividing points after mapping;
Figure 26718DEST_PATH_IMAGE259
adjusting a factor for the degree of non-linearity;
according to the range of the error e and the error change rate ec, the e-ec plane is divided non-uniformly, and the divided area set is marked as
Figure 392101DEST_PATH_IMAGE260
In each divided region
Figure 175118DEST_PATH_IMAGE261
Internal, non-linear PID control increments
Figure 291979DEST_PATH_IMAGE262
Comprises the following steps:
Figure 784140DEST_PATH_IMAGE263
wherein,
Figure 884820DEST_PATH_IMAGE264
to represent
Figure 753419DEST_PATH_IMAGE265
Region of time
Figure 990365DEST_PATH_IMAGE266
Non-linear PID control increments of (1);
Figure 184586DEST_PATH_IMAGE267
which is indicative of the time of the sampling,
Figure 241404DEST_PATH_IMAGE268
indicating area
Figure 919553DEST_PATH_IMAGE269
The ratio coefficient of the inner side of the outer ring,
Figure 604481DEST_PATH_IMAGE270
the scale is shown to be that of,
Figure 500762DEST_PATH_IMAGE271
the lines are represented by a number of lines,
Figure 310455DEST_PATH_IMAGE272
a presentation column;
Figure 520856DEST_PATH_IMAGE273
indicating area
Figure 466816DEST_PATH_IMAGE274
Inner integral the coefficients of which are such that,
Figure 737260DEST_PATH_IMAGE275
indicating area
Figure 503091DEST_PATH_IMAGE276
A differential coefficient of the inner;
based on all regions
Figure 48342DEST_PATH_IMAGE277
Non-linear PID control increments of
Figure 842948DEST_PATH_IMAGE278
And calculating the weighted average increment of the nonlinear PID control:
Figure 549873DEST_PATH_IMAGE279
wherein,
Figure 396475DEST_PATH_IMAGE280
is a region
Figure 214258DEST_PATH_IMAGE281
The weight of the control law is increased by an amount,
Figure 869230DEST_PATH_IMAGE282
Figure 747057DEST_PATH_IMAGE283
is a region
Figure 221900DEST_PATH_IMAGE284
Error e, error variationThe radius of the rate ec is such that,
Figure 577795DEST_PATH_IMAGE285
is a region
Figure 352853DEST_PATH_IMAGE286
The center of (a);
based on incremental control law weights
Figure 401581DEST_PATH_IMAGE287
Incremental non-linear PID control law of computing time
Figure 894879DEST_PATH_IMAGE288
Comprises the following steps:
Figure 60324DEST_PATH_IMAGE289
wherein,
Figure 486626DEST_PATH_IMAGE290
an incremental factor, which is defined as follows:
Figure 237414DEST_PATH_IMAGE291
wherein,
Figure 218008DEST_PATH_IMAGE292
in order to be the maximum value of the incremental factor,
Figure 446864DEST_PATH_IMAGE293
is an offset;
Figure 196514DEST_PATH_IMAGE294
Figure 587044DEST_PATH_IMAGE295
Figure 789355DEST_PATH_IMAGE296
is a scaling factor.
10. The on-line optimizing control system of the unmanned line marking vehicle based on machine vision navigation as claimed in claim 6,
the working process of the online learning rule unit specifically comprises the following steps:
there are several pairs
Figure 290744DEST_PATH_IMAGE297
In a divided region
Figure 912479DEST_PATH_IMAGE298
Inner, then to the area
Figure 208331DEST_PATH_IMAGE299
Internal non-linear PID control increments
Figure 632359DEST_PATH_IMAGE300
Parameter (2) of
Figure 937438DEST_PATH_IMAGE301
Figure 396101DEST_PATH_IMAGE302
Figure 128434DEST_PATH_IMAGE303
Performing online learning;
increment control of nonlinear PID by adopting supervised Hebb learning rule
Figure 39758DEST_PATH_IMAGE304
The parameters of (2) are learned:
Figure 882949DEST_PATH_IMAGE305
(16)
wherein,
Figure 727277DEST_PATH_IMAGE306
is a region
Figure 630511DEST_PATH_IMAGE307
Internal learning rate, learning rate
Figure 29132DEST_PATH_IMAGE308
Based on online adjustment rules
Figure 681873DEST_PATH_IMAGE309
And (3) adjusting:
Figure 380708DEST_PATH_IMAGE310
(17)
wherein,
Figure 579477DEST_PATH_IMAGE311
is a region
Figure 262131DEST_PATH_IMAGE312
The matching coefficient in the learning rate range is adjusted,
Figure 447124DEST_PATH_IMAGE313
is a region
Figure 734886DEST_PATH_IMAGE314
Two weight coefficients within.
CN202211451943.7A 2022-11-21 2022-11-21 Online optimization control method and system for unmanned line marking vehicle based on machine vision navigation Active CN115509122B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211451943.7A CN115509122B (en) 2022-11-21 2022-11-21 Online optimization control method and system for unmanned line marking vehicle based on machine vision navigation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211451943.7A CN115509122B (en) 2022-11-21 2022-11-21 Online optimization control method and system for unmanned line marking vehicle based on machine vision navigation

Publications (2)

Publication Number Publication Date
CN115509122A true CN115509122A (en) 2022-12-23
CN115509122B CN115509122B (en) 2023-03-21

Family

ID=84513924

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211451943.7A Active CN115509122B (en) 2022-11-21 2022-11-21 Online optimization control method and system for unmanned line marking vehicle based on machine vision navigation

Country Status (1)

Country Link
CN (1) CN115509122B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115760850A (en) * 2023-01-05 2023-03-07 长江勘测规划设计研究有限责任公司 Method for identifying water level without scale by using machine vision

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120179322A1 (en) * 2009-09-15 2012-07-12 Ross Hennessy System and method for autonomous navigation of a tracked or skid-steer vehicle
US20150275445A1 (en) * 2012-10-17 2015-10-01 Diane Lee WATSON Vehicle for line marking
CN106527119A (en) * 2016-11-03 2017-03-22 东华大学 Fuzzy control-based differentiation first PID (proportion integration differentiation) control system
CN109176519A (en) * 2018-09-14 2019-01-11 北京遥感设备研究所 A method of improving the Robot Visual Servoing control response time
CN110398979A (en) * 2019-06-25 2019-11-01 天津大学 A kind of unmanned engineer operation equipment tracking method and device that view-based access control model is merged with posture
AU2020104234A4 (en) * 2020-12-22 2021-03-11 Qingdao Agriculture University An Estimation Method and Estimator for Sideslip Angle of Straight-line Navigation of Agricultural Machinery
CN112706835A (en) * 2021-01-07 2021-04-27 济南北方交通工程咨询监理有限公司 Expressway unmanned marking method based on image navigation
CN113296518A (en) * 2021-05-25 2021-08-24 山东交通学院 Unmanned driving system and method for formation of in-place heat regeneration unit
CN113960921A (en) * 2021-10-19 2022-01-21 华南农业大学 Visual navigation control method and system for orchard tracked vehicle
CN114942641A (en) * 2022-06-06 2022-08-26 仲恺农业工程学院 Road bridge autonomous walking marking system controlled by multiple sensor data fusion stereoscopic vision
CN115082701A (en) * 2022-08-16 2022-09-20 山东高速集团有限公司创新研究院 Multi-water-line cross identification positioning method based on double cameras

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120179322A1 (en) * 2009-09-15 2012-07-12 Ross Hennessy System and method for autonomous navigation of a tracked or skid-steer vehicle
US20150275445A1 (en) * 2012-10-17 2015-10-01 Diane Lee WATSON Vehicle for line marking
CN106527119A (en) * 2016-11-03 2017-03-22 东华大学 Fuzzy control-based differentiation first PID (proportion integration differentiation) control system
CN109176519A (en) * 2018-09-14 2019-01-11 北京遥感设备研究所 A method of improving the Robot Visual Servoing control response time
CN110398979A (en) * 2019-06-25 2019-11-01 天津大学 A kind of unmanned engineer operation equipment tracking method and device that view-based access control model is merged with posture
AU2020104234A4 (en) * 2020-12-22 2021-03-11 Qingdao Agriculture University An Estimation Method and Estimator for Sideslip Angle of Straight-line Navigation of Agricultural Machinery
CN112706835A (en) * 2021-01-07 2021-04-27 济南北方交通工程咨询监理有限公司 Expressway unmanned marking method based on image navigation
CN113296518A (en) * 2021-05-25 2021-08-24 山东交通学院 Unmanned driving system and method for formation of in-place heat regeneration unit
CN113960921A (en) * 2021-10-19 2022-01-21 华南农业大学 Visual navigation control method and system for orchard tracked vehicle
CN114942641A (en) * 2022-06-06 2022-08-26 仲恺农业工程学院 Road bridge autonomous walking marking system controlled by multiple sensor data fusion stereoscopic vision
CN115082701A (en) * 2022-08-16 2022-09-20 山东高速集团有限公司创新研究院 Multi-water-line cross identification positioning method based on double cameras

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
FAN HONG,ET AL.: "A SEMI-FRAGILE WATERMARKING SCHEME BASED ON NEURAL NETWORK", 《PROCEEDINGS OF THE TBIRD INTERNATIONAL CONFERENCE ON MACHINE LEAMING AND CYBERNETICS》 *
王绍磊 等: "图基导航的高速公路划线车无人驾驶系统", 《电子世界》 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115760850A (en) * 2023-01-05 2023-03-07 长江勘测规划设计研究有限责任公司 Method for identifying water level without scale by using machine vision

Also Published As

Publication number Publication date
CN115509122B (en) 2023-03-21

Similar Documents

Publication Publication Date Title
CN113221905B (en) Semantic segmentation unsupervised domain adaptation method, device and system based on uniform clustering and storage medium
CN109375235B (en) Inland ship freeboard detection method based on deep reinforcement neural network
CN106372749B (en) Ultra-short term photovoltaic power prediction technique based on the analysis of cloud variation
CN109785385B (en) Visual target tracking method and system
CN108614994A (en) A kind of Human Head Region Image Segment extracting method and device based on deep learning
CN112818873B (en) Lane line detection method and system and electronic equipment
CN110889332A (en) Lie detection method based on micro expression in interview
CN112348849A (en) Twin network video target tracking method and device
CN107301657B (en) A kind of video target tracking method considering target movable information
CN111325711A (en) Chromosome split-phase image quality evaluation method based on deep learning
TW202022797A (en) Object detection method using cnn model and object detection apparatus using the same
CN112298194B (en) Lane changing control method and device for vehicle
CN115509122B (en) Online optimization control method and system for unmanned line marking vehicle based on machine vision navigation
CN106557173A (en) Dynamic gesture identification method and device
CN113516853B (en) Multi-lane traffic flow detection method for complex monitoring scene
CN112184655A (en) Wide and thick plate contour detection method based on convolutional neural network
CN116630748A (en) Rare earth electrolytic tank state multi-parameter monitoring method based on fused salt image characteristics
CN116476863A (en) Automatic driving transverse and longitudinal integrated decision-making method based on deep reinforcement learning
CN105225252B (en) Particle clouds motion Forecasting Methodology
CN112508851A (en) Mud rock lithology recognition system based on CNN classification algorithm
CN116994236A (en) Low-quality image license plate detection method based on deep neural network
CN116863353A (en) Electric power tower inclination degree detection method based on rotating target detection network
CN109993772B (en) Example level feature aggregation method based on space-time sampling
CN115170882A (en) Optimization method of rail wagon part detection network and guardrail breaking fault identification method
CN118379664A (en) Video identification method and system based on artificial intelligence

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant