CN115509122A - Online optimization control method and system for unmanned line marking vehicle based on machine vision navigation - Google Patents
- Publication number: CN115509122A (application CN202211451943.7A)
- Authority
- CN
- China
- Prior art keywords
- vehicle
- deviation
- error
- rate
- waterline
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B11/00—Automatic controllers
- G05B11/01—Automatic controllers electric
- G05B11/36—Automatic controllers electric with provision for obtaining particular characteristics, e.g. proportional, integral, differential
- G05B11/42—Automatic controllers electric with provision for obtaining particular characteristics, e.g. proportional, integral, differential for obtaining a characteristic which is both proportional and time-dependent, e.g. P. I., P. I. D.
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/34—Smoothing or thinning of the pattern; Morphological operations; Skeletonisation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/10—Internal combustion engine [ICE] based vehicles
- Y02T10/40—Engine management systems
Abstract
The invention discloses an online optimization control method and system for an unmanned line marking vehicle based on machine vision navigation, belonging to the field of system control. The method comprises the following steps: collect a waterline image of the road surface; perform waterline detection on the image to obtain the waterline; based on the detected waterline, select a monitoring point at the vehicle's current position and calculate the running deviation e and deviation change rate ec, then select a monitoring point at the vehicle's predicted position and calculate the predicted running deviation e_p and predicted deviation change rate ec_p; design a feedforward control law from e_p and ec_p and a nonlinear incremental PID control law from e and ec; and learn the parameters of the nonlinear PID increment online within each region. The method and system realize real-time machine-vision navigation of the line marking vehicle; the feedforward controller and nonlinear incremental PID controller, designed from the tracking error and error change rate, realize path tracking control of the line marking vehicle based on machine vision navigation.
Description
Technical Field
The invention belongs to the field of system control, and particularly relates to an unmanned line marking vehicle online optimization control method and system based on machine vision navigation.
Background
During highway-marking construction, a waterline is generally drawn manually on the road surface, and construction then conventionally proceeds along that waterline by hand. The present system instead adopts machine vision technology: a computer automatically identifies the waterline and then controls the marking device to carry out marking construction along it, i.e., navigation construction based on machine vision; the construction schematic is shown in FIG. 1.
The existing system has the following problems in marking vehicle control based on machine vision navigation:
existing advanced image processing methods are time-consuming and can hardly meet the real-time control requirements of a marking vehicle; they attend only to the gray value of each pixel and struggle to take foreground and background information into account jointly;
existing model-based control methods are unsuitable: the structure of the line marking vehicle is too complex for an accurate motion model to be obtained, so its control law cannot be designed with model-based methods;
the vehicle navigates from images, which only reveal how far the construction vehicle deviates from the planned route; accurate global vehicle coordinates are hard to provide, whereas vehicle modeling generally outputs the vehicle's position information;
for such problems, model-free control methods such as PID control and fuzzy control are generally adopted, but these rely too heavily on manual experience and are difficult to optimize, and most optimization methods require an accurate model of the controlled object.
Disclosure of Invention
To solve the problems in the prior art, the invention discloses an online optimization control method for an unmanned line marking vehicle based on machine vision navigation. It realizes real-time machine-vision navigation of the line marking vehicle, designs a feedforward controller and a nonlinear PID controller according to the tracking error and error change rate, designs an online learning method for the nonlinear PID controller's parameters, and thereby realizes path tracking control of the line marking vehicle based on machine vision navigation.
The invention adopts the scheme that:
the on-line optimizing control method of the unmanned line marking vehicle based on the machine vision navigation comprises the following steps:
s1, collecting a water line image of a road surface;
s2, carrying out waterline detection based on the waterline image to obtain a waterline;
s3, based on the detected waterline, select a monitoring point at the vehicle's current position and calculate the running deviation e and deviation change rate ec; select a monitoring point at the vehicle's predicted position and calculate the predicted running deviation e_p and predicted deviation change rate ec_p;
S4, obtain a predicted feedforward control quantity u_f from the predicted running deviation e_p and predicted deviation change rate ec_p, and obtain an incremental nonlinear PID control law for the vehicle from the running deviation e and deviation change rate ec;
S5, perform online learning of the nonlinear PID increment parameters K_p, K_i, K_d within each region.
The step S2 specifically includes the following steps:
s201, perform gray-scale stretching on each pixel (m, n) of the waterline image, with the stretching formula given as formula (1):
where the stretch factors set the mapping, m and n denote the m-th row and n-th column of the image, and g(m, n) denotes the gray value of the pixel in row m, column n. After the gray scale of the waterline image has been stretched, Q HARR-like features are selected;
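Since the formula-(1) image is not reproduced in the text, the idea can be conveyed with a minimal sketch of a piecewise-linear gray stretch: the mid-range gray band, where waterline/road contrast lives, is expanded while the extremes are compressed. All breakpoint values here are assumptions, not the patent's stretch factors.

```python
import numpy as np

def gray_stretch(img, x1=80, y1=30, x2=180, y2=230):
    """Piecewise-linear gray-scale stretch (illustrative stand-in for formula (1)).

    Pixels below x1 are compressed toward black, pixels above x2 toward white,
    and the mid-range [x1, x2] is expanded to [y1, y2], increasing the
    contrast between the waterline and the road surface.
    """
    img = img.astype(np.float64)
    out = np.empty_like(img)
    lo = img < x1
    hi = img > x2
    mid = ~lo & ~hi
    out[lo] = img[lo] * (y1 / x1)                                  # compress dark end
    out[hi] = y2 + (img[hi] - x2) * ((255.0 - y2) / (255.0 - x2))  # compress bright end
    out[mid] = y1 + (img[mid] - x1) * ((y2 - y1) / (x2 - x1))      # expand mid-range
    return out.astype(np.uint8)
```

With these breakpoints, a mid-gray pixel of value 80 maps to 30 while the extremes 0 and 255 are preserved.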
s202, for the i-th HARR-like feature, i = 1, …, Q, the calculation is as shown in formula (2):
where the two sums are, respectively, the pixel sum of the i-th HARR-like feature's waterline region and the pixel sum of its road-surface region, both computed from the gray values of the stretched gray image;
after the HARR-like features of each pixel point have been computed, each HARR-like feature is normalized; the normalization formula is formula (3):
where the normalization uses the mean gray level and the mean squared gray level within the detection window;
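As a sketch of how one HARR-like response and its window normalization might be computed, the code below follows the vertical pattern M1 described later in the embodiment (road strip, ignored transition strip, waterline band of width 2w); the region layout and the variance-style normalization are assumptions, since formulas (2) and (3) are images.

```python
import numpy as np

def haar_m1(win, w):
    """HARR-like feature M1 (vertical waterline), window laid out as
    road(w) | transition(w) | waterline(2w) | transition(w) | road(w),
    so the window must be 6*w columns wide.  Transition strips are ignored,
    as the patent does, to sidestep the blurred waterline boundary.
    Returns mean(waterline) - mean(road): large for a bright line on dark road."""
    assert win.shape[1] == 6 * w
    road = np.concatenate([win[:, :w], win[:, 5 * w:]], axis=1).astype(float)
    water = win[:, 2 * w:4 * w].astype(float)
    return water.mean() - road.mean()

def normalize_feature(f, window):
    """Variance normalization using the window's mean gray and mean squared
    gray (a plausible reading of formula (3))."""
    m1 = window.mean()
    m2 = (window.astype(float) ** 2).mean()
    return f / max(np.sqrt(m2 - m1 ** 2), 1e-6)
```

A synthetic window with a bright central band produces a strongly positive response, which is what the later SVM stage classifies.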
S203, classify the feature vector of each pixel point to judge whether it matches the straight-line characteristic; if it does, set the pixel to 1, otherwise to 0. This binarizes the gray image into a binary image B;
apply dilation-erosion processing to the binary image B to obtain a new binary image B', and obtain the waterline edge image E using formula (6):
where B'(m, n) denotes the pixel value in row m, column n of the new binary image B', and E(m, n) denotes the pixel value in row m, column n of the edge image E;
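A pure-NumPy sketch of the dilation-erosion step and one plausible reading of the formula-(6) edge extraction; the exact formula is an image and is not reproduced, so the morphological-gradient edge used here is an assumption.

```python
import numpy as np

def dilate(b):
    """3x3 binary dilation via shifted OR (zero-padded borders)."""
    p = np.pad(b, 1)
    out = np.zeros_like(b)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out |= p[1 + dy:1 + dy + b.shape[0], 1 + dx:1 + dx + b.shape[1]]
    return out

def erode(b):
    """3x3 binary erosion via shifted AND (zero-padded borders)."""
    p = np.pad(b, 1)
    out = np.ones_like(b)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out &= p[1 + dy:1 + dy + b.shape[0], 1 + dx:1 + dx + b.shape[1]]
    return out

def waterline_edge(B):
    """Dilate then erode (morphological closing) to fill pinholes in B, then
    take the morphological gradient B' & ~erode(B') as the edge image E."""
    B2 = erode(dilate(B))
    return B2 & ~erode(B2)
```

On a solid blob the result is its one-pixel boundary ring, which is what the HOUGH stage consumes.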
s204, identify straight lines in the edge image E with the HOUGH transform method:
from the new binary image B', collect the coordinates (x, y) of the pixels whose value is 1; the HOUGH transform yields the parameters (ρ, θ), and formula (7) gives the corresponding straight-line expression;
a pair (ρ, θ) represents a straight line: given (ρ, θ), any point (x, y) that satisfies formula (7) lies on the straight line that formula (7) represents;
s205, eliminate interfering straight lines;
the method for eliminating interfering lines is as follows:
S205-1: select the several longest straight lines whose lengths meet a threshold length;
S205-2: the straight-line parameters (ρ, θ) detected in the binary images of consecutive frames must satisfy formula (8):
where the parameters of formula (8) are, respectively, the maximum allowed deviations of angle and radius, the continuity threshold, and the sampling period of the image;
S205-3: if more than one straight line still meets the requirements after S205-1 and S205-2, select the line with the highest mean pixel value along it as the detected waterline.
The step S3 specifically includes the following steps: let H and W be the height and width of the edge image E in which the waterline lies;
select the intersection of the horizontal line at height y with the waterline as the monitoring point (x, y) of the vehicle's current position; the running deviation e and deviation change rate ec of the vehicle are:
select the intersection of the horizontal line at height y_p with the waterline as the monitoring point (x_p, y_p) of the vehicle's predicted position, and obtain the predicted running deviation e_p and the predicted deviation change rate ec_p:
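A sketch of the monitoring-point computation under the assumption that the waterline is held as Hough parameters (ρ, θ) and deviations are measured horizontally from a reference image column; the patent's exact formulas are unreproduced images, so this is one plausible reading.

```python
import math

def line_x_at(rho, theta, y):
    """x-coordinate where the line rho = x*cos(theta) + y*sin(theta)
    crosses the horizontal image row y (theta must not be +-90 degrees)."""
    return (rho - y * math.sin(theta)) / math.cos(theta)

def deviations(rho, theta, y_now, y_pred, x_center, prev_e, prev_ep, dt):
    """Deviation e at the current-position row y_now, predicted deviation e_p
    at the farther-ahead row y_pred, and their change rates over one sample
    period dt; x_center is the image column the vehicle should track."""
    e = line_x_at(rho, theta, y_now) - x_center
    e_p = line_x_at(rho, theta, y_pred) - x_center
    ec = (e - prev_e) / dt
    ec_p = (e_p - prev_ep) / dt
    return e, ec, e_p, ec_p
```

Evaluating the same line at two rows is what separates the current deviation e (fed to the PID) from the predicted deviation e_p (fed to the feedforward controller of S4).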
S4 specifically comprises the following steps:
s401, calculate the feedforward control quantity u_f of the vehicle from the predicted running deviation e_p and the predicted deviation change rate ec_p:
where u_f is the calculated feedforward control quantity and the two coefficients are the parameters of the predictive control;
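Read as a linear combination of the predicted deviation and its change rate, the feedforward law is a one-liner; the two gains are the predictive-control parameters named above, with illustrative values.

```python
def feedforward(e_p, ec_p, k1=0.4, k2=0.1):
    """Predicted feedforward control quantity u_f from the predicted running
    deviation e_p and predicted change rate ec_p; k1 and k2 stand in for the
    two predictive-control parameters (values are illustrative)."""
    return k1 * e_p + k2 * ec_p
```

Because u_f reacts to the deviation the vehicle is about to have rather than the one it already has, it gives the steering a head start that the feedback PID alone cannot.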
s402, within each region of the e-ec plane, obtain the incremental nonlinear PID control law of the vehicle from the running deviation e and deviation change rate ec;
step S402 specifically comprises the following: construct a non-uniform division of the error e and the error change rate ec based on a Gaussian function, the e-ec plane taking the error e as its horizontal axis and the error change rate ec as its vertical axis; the error e and error change rate ec here are the vehicle's running deviation and deviation change rate. The nonlinear division of the e-ec plane proceeds as follows:
when the division of the error change rate ec is computed, the range bound is the maximum absolute value of ec, and when the division of the error e is computed, it is the maximum absolute value of e; equally spaced division points of e or ec are mapped to non-uniform division points, with an adjustment factor setting the degree of nonlinearity;
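One plausible reading of the Gaussian-based non-uniform mapping (the exact expression is an unreproduced image): uniform split points are warped by a Gaussian-shaped function so that they crowd near zero error, giving the controller finer regions where precision matters most. The warp and the factor k are assumptions.

```python
import numpy as np

def gaussian_partition(E_max, n, k=2.0):
    """Map 2n+1 uniform split points on [-E_max, E_max] to non-uniform ones
    via a Gaussian-shaped warp; k is the nonlinearity adjustment factor.
    The endpoints and the origin stay fixed; interior points move toward
    zero, so regions near zero error are narrower."""
    u = np.linspace(-E_max, E_max, 2 * n + 1)            # uniform points
    warped = (np.sign(u) * E_max
              * (1 - np.exp(-k * (u / E_max) ** 2))
              / (1 - np.exp(-k)))
    return warped
```

The same function is applied independently along the e axis (with the maximum |e|) and the ec axis (with the maximum |ec|) to produce the grid of regions.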
Within each divided region R_j, PID control is performed, and the nonlinear PID control increment is:
where k denotes the sampling time, and the region's proportional coefficient, integral coefficient, and differential coefficient (indexed by the region's row and column in the partition) appear as the gains;
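The per-region increment is the standard discrete incremental PID, which matches the proportional/integral/differential coefficients named above; region indexing is omitted in this sketch and the gains are illustrative.

```python
class IncrementalPID:
    """Incremental PID for one region R_j of the e-ec plane; kp, ki, kd are
    the region's proportional, integral, and differential coefficients."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.e1 = 0.0   # e(k-1)
        self.e2 = 0.0   # e(k-2)

    def delta_u(self, e):
        """du(k) = kp*(e(k)-e(k-1)) + ki*e(k) + kd*(e(k)-2e(k-1)+e(k-2))."""
        du = (self.kp * (e - self.e1)
              + self.ki * e
              + self.kd * (e - 2 * self.e1 + self.e2))
        self.e2, self.e1 = self.e1, e
        return du
```

Returning an increment rather than an absolute output is what lets the region increments be blended by the weighted average described next.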
Based on the nonlinear PID control increments of all regions R_j, compute the weighted-average nonlinear PID control increment:
where w_j is the increment control-law weight of region R_j, r_j is the region's radius in the error e and error change rate ec, and c_j is the center of region R_j;
the weight describes how far the deviation and deviation change rate lie from the region center, with a scaling factor setting its spread;
this gives different increment factors for different errors e and error change rates ec.
The weights are designed to provide controller increment factors under different errors e and error change rates ec, which improves the system's response speed and reduces the complexity of the optimization process.
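A sketch of the weighted averaging, assuming the weight of each region decays like a Gaussian with the distance of the point (e, ec) from the region center c_j with radius r_j (the exact weight expression is an unreproduced image):

```python
import math

def blended_increment(e, ec, regions):
    """Gaussian-weighted blend of per-region PID increments.

    regions: iterable of (center_e, center_ec, radius, delta_u_j), where
    delta_u_j is that region's incremental PID output.  Nearby regions get
    large weights, so control hands over smoothly between them."""
    num = den = 0.0
    for ce, cec, r, du in regions:
        w = math.exp(-((e - ce) ** 2 + (ec - cec) ** 2) / (2 * r * r))
        num += w * du
        den += w
    return num / den if den else 0.0
```

A point sitting on a region's center recovers essentially that region's own increment, while points between centers receive an interpolated value, which avoids the switching chatter of a hard partition.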
The step S5 specifically includes the following steps:
for each sample pair (e, ec) falling within a divided region R_j, the parameters K_p, K_i, K_d of the nonlinear PID control increment in region R_j are learned online;
the parameters of the nonlinear PID control increment are learned with the supervised Hebb learning rule:
where η_j is the learning rate within region R_j; the learning rate is adjusted according to an online adjustment rule:
where the region's matching coefficient adjusts the learning-rate range, together with two weight coefficients within the region.
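The supervised Hebb rule described here resembles the classic single-neuron adaptive PID, sketched below for one region: the weight of each PID term grows with the product of error, control output, and that term's input, and the normalized weights act as K_p, K_i, K_d. Structure and values are assumptions; the patent's update and learning-rate-adjustment formulas are unreproduced images.

```python
class HebbPID:
    """Single-neuron adaptive PID for one region R_j, trained with the
    supervised Hebb rule dw_i = eta_i * e * u * x_i; normalized weights
    (wp, wi, wd) play the role of Kp, Ki, Kd.  K and eta values are
    illustrative."""
    def __init__(self, K=0.5, etas=(0.2, 0.1, 0.05)):
        self.K = K
        self.w = [0.3, 0.3, 0.3]       # initial P, I, D weights
        self.etas = etas               # per-term learning rates
        self.e1 = self.e2 = 0.0
        self.u = 0.0                   # accumulated control output

    def step(self, e):
        x = [e - self.e1, e, e - 2 * self.e1 + self.e2]   # P, I, D inputs
        for i in range(3):             # supervised Hebb update
            self.w[i] += self.etas[i] * e * self.u * x[i]
        s = sum(abs(v) for v in self.w) or 1.0
        wn = [v / s for v in self.w]   # normalized weights
        self.u += self.K * sum(wi * xi for wi, xi in zip(wn, x))
        self.e2, self.e1 = self.e1, e
        return self.u
```

Because the update is driven by measured error and control signals only, no model of the marking vehicle is needed, which is the point of S5: the PID parameters optimize themselves online.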
The online optimization control system for the unmanned line marking vehicle based on machine vision navigation comprises a vision sensor, a waterline detection unit, a running-error prediction unit, a predictive controller, a nonlinear incremental PID controller, and an online-learning-rule unit;
a visual sensor acquires a water line image of a road surface;
the waterline detection unit performs waterline detection based on the waterline image to obtain a waterline;
the running-error prediction unit, based on the detected waterline, selects the monitoring point of the vehicle's current position and calculates the running deviation e and deviation change rate ec, then selects the monitoring point of the vehicle's predicted position and calculates the predicted running deviation e_p and predicted deviation change rate ec_p.
The predictive controller computes the vehicle's predicted feedforward control quantity u_f from the predicted running deviation e_p and predicted deviation change rate ec_p;
the nonlinear incremental PID controller, within the nonlinear division regions of the e-ec plane, obtains the vehicle's incremental nonlinear PID control law from the running deviation e and deviation change rate ec;
the online-learning-rule unit learns the parameters of the nonlinear PID control increment of each region R_j.
The working process of the waterline detection unit specifically comprises the following steps:
s201, perform gray-scale stretching on each pixel (m, n) of the waterline image, with the stretching formula given as formula (1):
where the stretch factors set the mapping and m, n denote the m-th row and n-th column of the image; g(m, n) denotes the gray value of the pixel in row m, column n. After the gray scale of the waterline image has been stretched, Q HARR-like features are selected to express the straight-line character of each pixel point;
s202, for the i-th HARR-like feature, the calculation is as shown in formula (2):
where the two sums are the pixel sums of the i-th HARR-like feature's waterline region and road-surface region, computed from the gray values of the stretched gray image;
after the HARR-like feature description of each pixel point is obtained, each HARR-like feature is normalized; the normalization formula is formula (3):
where the i-th normalized HARR feature uses the mean gray level and the mean squared gray level within the detection window;
S203, classify the feature vector of each pixel point to judge whether it matches the straight-line characteristic; if it does, set the pixel to 1, otherwise to 0, binarizing the gray image into a binary image B;
apply dilation-erosion processing to the binary image B to obtain a new binary image B', and obtain the waterline edge image E using formula (6),
where B'(m, n) denotes the pixel value in row m, column n of the new binary image B', and E(m, n) denotes the pixel value in row m, column n of the edge image E;
s204, identify straight lines in the edge image E with the HOUGH transform method:
from the new binary image B', find the coordinates (x, y) of the pixels whose value is 1; the HOUGH transform yields the parameters (ρ, θ), and formula (7) gives the corresponding straight-line expression;
a pair (ρ, θ) represents a straight line: given (ρ, θ), any point (x, y) that satisfies formula (7) lies on the straight line that formula (7) represents;
s205, eliminate interfering straight lines;
the method for eliminating interfering straight lines is as follows:
S205-1, select the several longest straight lines whose lengths meet the threshold length;
S205-2, the straight-line parameters (ρ, θ) detected in the binary images of the preceding and current frames must satisfy formula (8):
where the parameters of formula (8) are, respectively, the maximum allowed deviations of angle and radius, the continuity threshold, and the sampling period of the image;
S205-3, if 2 or more straight lines remain after S205-1 and S205-2, select the line with the highest mean pixel value along it as the detected waterline.
The running-error prediction unit works as follows: let H and W be the height and width of the edge image E in which the waterline lies; select the intersection of the horizontal line at height y with the waterline as the monitoring point (x, y) of the vehicle's current position, and calculate the vehicle's running deviation e and deviation change rate ec as:
select the intersection of the horizontal line at height y_p with the waterline as the monitoring point (x_p, y_p) of the vehicle's predicted position, and obtain the predicted running deviation e_p and the predicted deviation change rate ec_p:
The predictive controller calculates the feedforward control quantity u_f of the vehicle as:
where u_f is the predictive controller's feedforward control quantity and the two coefficients are the parameters of the predictive controller;
the nonlinear incremental PID controller, within the nonlinear division regions of the e-ec plane, obtains the vehicle's incremental nonlinear PID control law from the running deviation e and deviation change rate ec;
a non-uniform division of the error e and error change rate ec is constructed on the basis of a Gaussian function, the e-ec plane taking the error e as its horizontal axis and the error change rate ec as its vertical axis; the error and error change rate refer to the vehicle's running deviation and deviation change rate;
the e-ec plane nonlinear division method is as follows:
when the division of the error change rate ec is computed, the range bound is the maximum absolute value of ec, and when the division of the error e is computed, it is the maximum absolute value of e; equally spaced division points of e or ec are mapped to non-uniform division points, with an adjustment factor setting the degree of nonlinearity.
According to the ranges of the error e and the error change rate ec, the e-ec plane is divided non-uniformly, and the set of divided regions is denoted R;
within each divided region R_j, the nonlinear incremental PID controller performs PID control, with the nonlinear PID control increment:
where k denotes the sampling time, and the region's proportional coefficient, integral coefficient, and differential coefficient (indexed by the region's row and column in the partition) appear as the gains;
Based on the nonlinear PID control increments of all regions R_j, the weighted-average nonlinear PID control increment is computed:
where w_j is the increment control-law weight of region R_j, r_j is the region's radius in the error e and error change rate ec, and c_j is the center of region R_j;
based on the increment control-law weights, the incremental nonlinear PID control law at sampling time k is:
the weight reflects how far the vehicle's running deviation e and deviation change rate ec lie from the region center;
this gives different increment factors for different errors e and error change rates ec.
The weights are designed to provide controller increment factors under different running deviations e and deviation change rates ec, which improves the system's response speed and reduces the complexity of the optimization process.
The working process of the online-learning-rule unit is as follows:
for each sample pair (e, ec) falling within a divided region R_j, the parameters K_p, K_i, K_d of the nonlinear PID control increment in region R_j are learned online. The parameters of the incremental nonlinear PID control are learned with the supervised Hebb learning rule:
where η_j is the learning rate within region R_j. To improve learning efficiency, the learning rate is adjusted according to an online adjustment rule, namely:
where the region's matching coefficient adjusts the learning-rate range, together with two weight coefficients within the region.
Compared with the prior art, the invention has the beneficial effects that:
the application discloses an unmanned line marking vehicle on-line optimization control method based on machine vision navigation, which adopts a new HARR-like feature and an image processing method to realize the machine vision real-time navigation of a line marking vehicle, designs a feedforward controller and a nonlinear PID controller according to a tracking error and an error change rate, and discloses an on-line learning method of parameters of the nonlinear PID controller to realize the line marking vehicle path tracking control based on the machine vision navigation.
The application discloses a novel HARR-like feature which is used for extracting a straight line feature of a waterline, so that later waterline detection is facilitated, the real-time requirement of image detection is met, background information around the waterline is fully considered, and the success rate of the waterline detection is improved;
the method extracts the waterline in each frame from the detected waterline edges with the HOUGH transform and provides a new interference-waterline filtering method, making waterline identification robust; according to the respective ranges of the tracking error and error change rate, the e-ec plane is divided into regions, an incremental PID controller is designed in each region, and a nonlinear incremental PID controller is finally constructed, which makes the line marking vehicle's trajectory tracking fast and improves its tracking precision; the parameters of the nonlinear PID controller are adjusted with the supervised Hebb learning rule, realizing online optimization of the nonlinear PID controller; a feedforward predictive controller designed from the predicted tracking deviation improves the waterline adaptability of the vehicle control as well as its control precision and anti-interference capability.
Drawings
FIG. 1 is a schematic view of a line marking vehicle construction;
FIG. 2 is a schematic of the HARR-like features;
FIG. 3 is a schematic of an operating deviation and predicted deviation calculation;
FIG. 4 is an online optimizing control system of an unmanned line marking vehicle based on machine vision navigation;
FIG. 5 is a schematic of the nonlinear division of the e-ec plane;
FIG. 6 is a schematic of the e-ec plane division regions;
FIG. 7 is a schematic of a waterline image.
Detailed Description
The technical solutions of the present invention will be described clearly and completely with reference to the accompanying drawings, and it is obvious that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, belong to the protection scope of the present invention.
The on-line optimizing control method of the unmanned line marking vehicle based on the machine vision navigation comprises the following steps:
s1, collect a waterline image of the road surface to obtain a data sample; the waterline image is shown in FIG. 7:
the gray-level image of the asphalt pavement is acquired by the vision sensor mounted on the line marking vehicle of FIG. 1; the waterline drawn on the pavement appears in the image. The vision sensor is mounted 35-45 cm above the asphalt pavement so that the field of view along the vehicle's direction of travel is 30-40 cm long. To resist interference from external light, shading equipment is generally installed around the vision sensor.
S2, carrying out waterline detection based on the waterline image to obtain a waterline;
the step S2 specifically includes the following steps:
S201, performing gray-scale stretching on the pixel points of the waterline image, the stretching formula being given by formula (1):
wherein the coefficients in formula (1) are the stretch factors; m and n denote the m-th row and n-th column of the image, and the stretched value replaces the gray value of the pixel at row m, column n. After the gray values of the waterline image are stretched, the linear characteristic of each pixel point is expressed using HARR-like features, shown in FIG. 2. In this embodiment, six HARR-like features M1-M6 are selected according to the engineering conditions; each feature divides the window into different regions according to the actual situation. Assume the width of the waterline region is 2w, where w is half the waterline width. In the HARR-like features M1-M3, the width of the road surface region and of the transition region is w. In the HARR-like features M4-M6, the width of each region is 2w; the height of all HARR-like features is 4w-6w. For the rotated features M2, M3, M5, and M6, the rotation angle is 15 degrees, determined by the on-site waterline characteristics.
Because the boundary of the waterline region is easily blurred, this embodiment places a transition region between the road surface region and the waterline region to avoid the blurred waterline boundary interfering with feature extraction; that is, the transition region is ignored when the features are extracted.
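The coefficients of the stretching formula (1) are not reproduced in this text, so as an illustration only, a common piecewise-linear gray stretch can be sketched as follows; the `low`/`high` thresholds are assumed values, not the patent's parameters:

```python
import numpy as np

def gray_stretch(img, low=50, high=200):
    """Piecewise-linear gray stretch: values at or below `low` map to 0,
    values at or above `high` map to 255, and the band in between is
    expanded linearly. Thresholds are illustrative assumptions; the
    patent's formula (1) coefficients are elided in the source."""
    out = (img.astype(np.float64) - low) * 255.0 / (high - low)
    return np.clip(out, 0, 255).astype(np.uint8)
```

Stretching widens the contrast between the dark asphalt and the bright waterline before the HARR-like features are computed.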
S202, the HARR-like features are used to highlight the straight-line characteristic of the waterline on the road surface, which benefits the later waterline identification. For the i-th HARR-like feature, the calculation is as in formula (2):
wherein, in this embodiment, Q = 6; the two sums in formula (2) are respectively the pixel sum of the i-th HARR-like feature's waterline region and the pixel sum of its road surface region, both computed from the gray values of the stretched gray image;
after the HARR-like feature description of each pixel point is obtained, normalizing each HARR-like feature, wherein the normalization formula is as shown in formula (3):
wherein the left-hand side of formula (3) is the normalized HARR-like feature, and the two statistics are respectively the mean gray level and the mean of the squared gray levels within the detection window;
Based on the six normalized HARR-like features (see formula (3)), a feature vector is constructed for each pixel point:
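As a sketch of formula (2) and the normalization of formula (3): the raw response is the pixel sum of the waterline band minus that of the road bands, and the normalizer below uses the window's gray statistics. The region layout (an upright M1-style window without transition bands) and the exact normalizer are assumptions, since the patent's formulas are elided here:

```python
import numpy as np

def harr_response(window, w):
    """M1-style upright HARR-like response: sum over the central
    'waterline' band of width 2w minus the sum over the two flanking
    'road' bands of width w each. Transition bands are omitted in this
    simplified layout; the true feature geometry follows FIG. 2."""
    H, W = window.shape
    c = W // 2
    water = window[:, c - w:c + w].sum()
    road = window[:, :w].sum() + window[:, W - w:].sum()
    return float(water - road)

def normalize_feature(mu, window):
    """Normalize a raw response by the window's mean gray level and mean
    squared gray level (formula (3) is elided in the source, so a
    standard-deviation normalizer is assumed)."""
    m = window.mean()
    m2 = (window.astype(np.float64) ** 2).mean()
    sigma = np.sqrt(max(m2 - m * m, 1e-12))
    return mu / (sigma * window.size)
```

A bright central band yields a large positive response; inverted contrast yields a negative one, which is what lets the SVM separate line pixels from road pixels.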
S203, in order to detect the waterline in the image, the feature vector of each pixel point is classified to judge whether it conforms to the straight-line characteristic; if so, the pixel point is set to 1, otherwise to 0, realizing binarization of the gray image and yielding a binary image B. The innovation of the method is that the binarization problem of the image is turned into a pattern recognition problem on the features.
The feature vectors are classified by the SVM method, with the kernel function given by formula (5):
Formula (5) is the kernel function of the SVM classification method; it maps the feature vectors to a higher dimension, and the SVM then judges whether each feature is a straight-line feature, performing binary classification. A feature vector is constructed for each pixel point from the HARR-like features, and the SVM then judges whether the pixel point represented by each feature vector lies on a straight line;
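The kernel of formula (5) is not reproduced in this text; the Gaussian (RBF) kernel is a common choice for such a mapping, so the per-pixel decision of step S203 can be sketched as below. The support vectors, multipliers, bias, and `gamma` are placeholders that would come from offline training:

```python
import math

def rbf_kernel(x, z, gamma=0.5):
    """Gaussian (RBF) kernel — an assumed stand-in for the patent's
    elided formula (5); gamma is an illustrative parameter."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, z)))

def svm_decide(x, support_vectors, alphas, labels, bias, gamma=0.5):
    """Binary SVM decision for one feature vector:
    1 = pixel conforms to the straight-line characteristic, 0 = not."""
    s = sum(a * y * rbf_kernel(x, sv, gamma)
            for a, y, sv in zip(alphas, labels, support_vectors))
    return 1 if s + bias >= 0 else 0
```

Applying `svm_decide` to every pixel's feature vector produces the binary image B directly.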
After the binary image B is obtained, dilation and erosion are applied to B to obtain a new binary image, and the edge image E of the waterline is obtained by formula (6);
where the two sides of formula (6) denote, respectively, the pixel value at row m, column n of the new binary image and the pixel value at row m, column n of the edge image E;
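Formula (6) itself is elided in the source; a plausible reading is a morphological cleanup (dilation then erosion, i.e. closing) followed by an edge map taken as the difference between the cleaned image and its erosion. That assumption can be sketched with 3x3 structuring elements:

```python
import numpy as np

def dilate(b):
    """3x3 binary dilation via shifted maxima (zero-padded borders)."""
    p = np.pad(b, 1)
    return np.max([p[i:i + b.shape[0], j:j + b.shape[1]]
                   for i in range(3) for j in range(3)], axis=0)

def erode(b):
    """3x3 binary erosion via shifted minima (zero-padded borders)."""
    p = np.pad(b, 1)
    return np.min([p[i:i + b.shape[0], j:j + b.shape[1]]
                   for i in range(3) for j in range(3)], axis=0)

def waterline_edge(B):
    """Closing (dilate then erode) to fill small gaps, then an edge image
    as the morphological gradient of the cleaned result. The exact
    formula (6) is elided in the source, so this form is an assumption."""
    B2 = erode(dilate(B))
    return B2 - erode(B2)
```

The closing removes pinholes inside the waterline blob; the final difference leaves only its one-pixel boundary, which is what the HOUGH transform in S204 consumes.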
S204, identifying the straight lines in the edge image E by the HOUGH transform method:
The coordinates of the pixels with value 1 are found in the new binary image; the HOUGH transform applied to them yields, via formula (7), the corresponding straight-line expressions;
where the two parameters are respectively the radius and the angle detected by the HOUGH transform (the parametric representation of a straight line).
A pair of radius and angle values represents one straight line; given such a pair, if a pixel's coordinates satisfy formula (7), the pixel lies on the straight line represented by formula (7);
S205, owing to noise in the image, several straight lines are detected, and the interfering lines must be removed to obtain the real one. The interference-removal method comprises the following steps:
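Formula (7) is elided in the source; the standard normal form of the HOUGH line, radius = x*cos(angle) + y*sin(angle), is the usual parameterization, so the membership test can be sketched under that assumption:

```python
import math

def on_hough_line(x, y, rho, theta, tol=1.0):
    """Check whether pixel (x, y) lies on the line with HOUGH parameters
    (rho, theta), assuming the standard normal form
    rho = x*cos(theta) + y*sin(theta); `tol` is an assumed pixel
    tolerance, since the patent's formula (7) is not reproduced here."""
    return abs(x * math.cos(theta) + y * math.sin(theta) - rho) <= tol
```

This same relation, solved for x at y = 0.7H, gives the abscissa of the monitoring point used in step S3.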
S205-1: selecting the 3-5 longest straight lines whose length meets the threshold length (length greater than or equal to 1/4 of the image height; in this embodiment the threshold length is 1/4 of the image height);
S205-2: because the waterline is continuous, the straight-line parameters detected in two successive frames must satisfy formula (8), namely:
where the first two quantities are respectively the maximum allowed deviations of angle and radius, the third is the continuity threshold, and the last is the sampling instant of the image.
S205-3: if several straight lines (two or more) still remain after S205-1 and S205-2, the line with the highest mean pixel value along it is selected as the detected waterline, because the waterline is generally white and has the highest brightness in the image.
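The three filters of step S205 can be sketched as one selection function; the candidate-line fields (`length`, `theta`, `rho`, `mean_gray`) and the tolerance values are illustrative names, not the patent's notation:

```python
def select_waterline(lines, prev, img_h, max_dtheta=0.1, max_drho=10.0):
    """Pick the real waterline among candidates, mimicking S205:
    (1) keep lines at least 1/4 of the image height long;
    (2) keep lines whose (rho, theta) stay close to the previous
        frame's detection (waterline continuity, formula (8));
    (3) among survivors, choose the highest mean brightness."""
    cands = [l for l in lines if l["length"] >= img_h / 4]           # S205-1
    cands = [l for l in cands
             if abs(l["theta"] - prev["theta"]) <= max_dtheta
             and abs(l["rho"] - prev["rho"]) <= max_drho]            # S205-2
    return max(cands, key=lambda l: l["mean_gray"]) if cands else None  # S205-3
```

Returning `None` when every candidate is rejected lets the caller fall back to the previous frame's line rather than steer on noise.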
S3, selecting a monitoring point at the current position of the vehicle based on the detected waterline and calculating the running deviation and the running deviation change rate of the vehicle; selecting a monitoring point at the predicted position of the vehicle and calculating the predicted running deviation and the predicted deviation change rate;
The detected waterline is shown in FIG. 3, where H and W are the height and width of the edge image E (the same as those of the original image). The intersection of the horizontal line at 0.7H with the waterline is taken as the monitoring point of the current position of the vehicle; substituting 0.7H for y in formula (7) gives the abscissa of this monitoring point. The running deviation and the running deviation change rate of the vehicle are then:
0.7H is the value chosen in this embodiment. In general, the distance between the two horizontal lines in FIG. 3 should be at least the distance the vehicle can travel in one control cycle, but the two lines must not be too close to the upper and lower boundaries of the image. According to the set vehicle speed and the field of view of the vision sensor, this embodiment selects the intersection of the horizontal line at 0.7H with the waterline as the monitoring point of the current position;
wherein the scale factor, obtained by camera calibration, represents the actual length corresponding to each pixel (the real distance represented by one pixel unit). In this embodiment the predicted-position line is placed at 0.3H; the intersection of the horizontal line at 0.3H with the waterline is taken as the monitoring point of the predicted position of the vehicle, giving the predicted running deviation and the predicted deviation change rate:
This division is reasonable because the vehicle speed is approximately 40 m/min and the effective field of view of the vision sensor is 20-30 cm.
As seen in FIG. 3, the error at the current position is to the left, but at the predicted position it is to the right. The error therefore must not be over-corrected at the current position, and the current adjustment can be fine-tuned using the prediction error.
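The deviation expressions of step S3 are elided in the source; under the assumption that the deviation is the signed pixel offset of the waterline abscissa from the image centerline, converted to metric units by the calibration factor, and that the change rate is a finite difference over the sampling period, they can be sketched as:

```python
def monitor_deviations(x_cur, x_prev_cur, x_pred, x_prev_pred, W, k, T):
    """Deviations at the two monitoring lines of FIG. 3.
    x_cur / x_pred: waterline abscissas (pixels) at 0.7H and 0.3H in the
    current frame; x_prev_*: the same abscissas in the previous frame.
    W: image width in pixels; k: metric length per pixel (from camera
    calibration); T: sampling period. The center-referenced form and the
    finite-difference rate are assumptions, as the exact formulas are
    elided in the source."""
    e = (x_cur - W / 2) * k             # running deviation
    ec = (x_cur - x_prev_cur) * k / T   # running deviation change rate
    pe = (x_pred - W / 2) * k           # predicted running deviation
    pec = (x_pred - x_prev_pred) * k / T  # predicted deviation change rate
    return e, ec, pe, pec
```

With this sign convention, e < 0 and pe > 0 would reproduce the FIG. 3 situation where the current error is left of the line but the predicted error is right of it.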
S4, obtaining a predicted feedforward control quantity from the predicted running deviation and predicted deviation change rate of the vehicle, and obtaining an incremental nonlinear PID control law for the vehicle from the running deviation and running deviation change rate;
The step S4 specifically includes the following steps:
S401, predicting the feedforward control quantity of the vehicle from the predicted running deviation and predicted deviation change rate:
wherein the left-hand side is the predicted feedforward control quantity and the two coefficients are the two parameters of the predictive control;
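The feedforward expression of S401 is elided in the source; the natural reading — a linear combination of the predicted deviation and its change rate through the two predictive-control parameters — can be sketched as below, with assumed gain values:

```python
def feedforward(pe, pec, kp_f=0.4, kd_f=0.1):
    """Predicted feedforward control quantity from the predicted running
    deviation pe and predicted deviation change rate pec. kp_f and kd_f
    stand for the 'two parameters of predictive control' in the text;
    both the linear form and the values are assumptions."""
    return kp_f * pe + kd_f * pec
```

This term lets the controller start correcting for the 0.3H-line deviation before the vehicle reaches it, which is what allows the fine-tuning described for FIG. 3.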
S402, in the regions of the e-ec plane, obtaining the incremental nonlinear PID control law of the vehicle from the running deviation and the running deviation change rate. The error e and the error change rate ec of the system follow Gaussian distributions, so a non-uniform division of the error and the error change rate is constructed from a Gaussian function; the e-ec plane takes the error e as the horizontal axis and the error change rate ec as the vertical axis. Here the error and the error change rate refer to the running deviation and the running deviation change rate of the vehicle. The e-ec plane is divided by a nonlinear division method:
When the division of the error change rate ec is computed, the maximum in the mapping is that of ec; when the division of the error e is computed, it is that of e. These maxima are respectively the maxima of the absolute values of the error e and of the error change rate ec. In the mapping, the equally spaced uniform division points of the error e or the error change rate ec are mapped to non-uniform division points, with a factor adjusting the degree of nonlinearity. The non-uniform division is shown schematically in FIG. 5.
According to the ranges of the error e and the error change rate ec, the e-ec plane is divided non-uniformly; the set of divided regions (each square in FIG. 6 is one region) is denoted as shown in FIG. 6:
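The Gaussian-based mapping from uniform to non-uniform division points is elided in the source; one warp with the described behavior (points dense near the origin, sparse toward the extremes, controlled by a nonlinearity factor) can be sketched as follows. The particular warp and the `alpha` factor are assumptions:

```python
import math

def nonuniform_points(x_max, n, alpha=1.0):
    """Map the 2n+1 uniform division points of [-x_max, x_max] to
    non-uniform points via a Gaussian-style warp: small values are
    compressed toward 0 (fine regions near zero error), endpoints stay
    at +/-x_max. alpha adjusts the degree of nonlinearity. The exact
    mapping in the patent is elided, so this form is illustrative."""
    pts = []
    norm = 1.0 - math.exp(-alpha)       # so the endpoints map to x_max
    for i in range(-n, n + 1):
        x = x_max * i / n               # uniform division point
        g = 1.0 - math.exp(-alpha * (x / x_max) ** 2)
        pts.append(math.copysign(x_max * g / norm, x))
    return pts
```

Applying the warp separately along e (with the maximum |e|) and along ec (with the maximum |ec|) produces the grid of squares in FIG. 6, with smaller regions near the origin where finer control is wanted.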
In each divided region, PID control is performed by the nonlinear incremental PID controller, whose control increment is:
where the first variable denotes the sampling instant; within the region at a given row and column of the grid, the three coefficients are respectively the proportional coefficient, the integral coefficient, and the differential coefficient;
based on all regionsNon-linear PID control increments ofAnd calculating the weighted average increment of the nonlinear PID control:
wherein,is a regionThe control law weight is incremented by one,is a regionError e (square in figure 6), radius of error rate of change ec,is a regionOf the center of (c).
Combining the above incremental control law weights, the incremental nonlinear PID control law at the current time is calculated as:
where the additional factor describes the degree to which the running deviation and the deviation change rate depart from the origin, scaled by a proportionality coefficient.
This method gives different increment factors for different e and ec.
The increment factor under different running deviations and deviation change rates serves to improve the response speed of the system and reduce the complexity of the optimization process.
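The per-region increment and the weighted average of S402 can be sketched as below. The standard incremental PID form is well known; the Gaussian distance weight, the region tuples, and `sigma` are assumptions standing in for the elided weight formula:

```python
import math

def pid_increment(e, e_prev, e_prev2, kp, ki, kd):
    """Standard incremental PID law for one region:
    du = kp*(e - e_prev) + ki*e + kd*(e - 2*e_prev + e_prev2)."""
    return kp * (e - e_prev) + ki * e + kd * (e - 2 * e_prev + e_prev2)

def weighted_increment(e, ec, history, regions, sigma=1.0):
    """Weighted average of per-region PID increments. Each region is a
    tuple (ce, cec, kp, ki, kd): its center in the e-ec plane and its
    gains. The weight decays with the distance of (e, ec) from the
    region center; the Gaussian form is an assumed stand-in for the
    patent's elided weight function."""
    e_prev, e_prev2 = history
    num = den = 0.0
    for (ce, cec, kp, ki, kd) in regions:
        w = math.exp(-((e - ce) ** 2 + (ec - cec) ** 2) / (2 * sigma ** 2))
        num += w * pid_increment(e, e_prev, e_prev2, kp, ki, kd)
        den += w
    return num / den if den else 0.0
```

Blending neighboring regions' increments avoids the control jumps that a hard switch between gain sets would cause at region boundaries.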
S5, performing online learning of the parameters (the proportional, integral, and differential coefficients) of the in-region nonlinear PID control increment;
S5 specifically comprises: when samples fall within a divided region, online learning is performed on the parameters of that region's nonlinear PID control increment.
The parameters of the incremental nonlinear PID control are learned with the supervised Hebb learning rule:
wherein the learning rate belongs to the region. To improve learning efficiency, the learning rate is adjusted according to an online adjustment rule, namely:
wherein the matching coefficient within the region adjusts the learning-rate range, and the region has two weight coefficients, which may be set manually based on historical data.
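The exact supervised-Hebb rule of S5 is elided in the source; a common variant nudges each gain by learning rate * tracking error * control increment * that gain's own input signal. That shape, with illustrative per-gain learning rates and signal choices, can be sketched as:

```python
def hebb_update(params, etas, e, de, d2e, du):
    """One supervised-Hebb-style update of a region's (kp, ki, kd).
    e: current error; de: first difference of the error; d2e: second
    difference; du: last control increment. Each gain moves by
    eta * e * du * (its own input signal). Both the rule's exact form
    and the learning rates are assumptions, since the patent's formulas
    are elided here."""
    kp, ki, kd = params
    ep, ei, ed = etas
    kp += ep * e * du * de    # proportional channel driven by de
    ki += ei * e * du * e     # integral channel driven by e
    kd += ed * e * du * d2e   # differential channel driven by d2e
    return kp, ki, kd
```

Because the update runs only for the region the sample falls in, each square of FIG. 6 accumulates gains tuned to its own corner of the e-ec plane.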
As shown in fig. 4, the online optimizing control system for the unmanned line marking vehicle based on machine vision navigation comprises a vision sensor, a waterline detection unit, an operation error prediction unit, a prediction controller, a nonlinear incremental PID controller and an online learning rule unit;
The vision sensor collects a waterline image of the road surface (FIG. 7) to obtain a data sample:
The gray-level image of the asphalt pavement is acquired by the vision sensor mounted on the line marking vehicle of FIG. 1; a waterline drawn in advance is on the pavement, so the image contains the required waterline information. The vision sensor is mounted at a height of 35-45 cm above the asphalt pavement, ensuring that the field of view along the travel direction of the marking vehicle is 30-40 cm long. To make the vision sensor resistant to interference from external light, a shading device is generally arranged outside it.
The waterline detection unit performs waterline detection based on the waterline image to obtain a waterline;
the working process of the waterline detection unit specifically comprises the following steps:
S201, performing gray-scale stretching on the pixel points of the waterline image, the stretching formula being given by formula (1):
wherein the coefficients in formula (1) are the stretch factors; m and n denote the m-th row and n-th column of the image, and the stretched value replaces the gray value of the pixel at row m, column n. After the gray values of the waterline image are stretched, the linear characteristic of each pixel point is expressed using HARR-like features, shown in FIG. 2:
In this embodiment, six HARR-like features M1-M6 are selected according to the engineering conditions; each feature divides the window into different regions according to the actual situation. Assuming the waterline region has width 2w, where w is half the waterline width, the width of the road surface region and of the transition region in the HARR-like features M1-M3 is w. In the HARR-like features M4-M6, the width of each region is 2w; the height of all HARR-like features is 4w-6w. For the rotated features M2, M3, M5, and M6, the rotation angle is 15 degrees, determined by the on-site waterline characteristics.
S202, the HARR-like features are used to highlight the straight-line characteristic of the waterline on the road surface, which benefits the later waterline identification. For the i-th HARR-like feature, the calculation is as in formula (2):
In this embodiment Q = 6; the two sums in formula (2) are respectively the pixel sum of the i-th HARR-like feature's waterline region and the pixel sum of its road surface region, both computed from the gray values of the stretched gray image;
after the description of the HARR-like features (straight line features) of each pixel point is obtained, normalizing each HARR-like feature, wherein the normalization formula is as shown in formula (3):
wherein the left-hand side of formula (3) is the normalized HARR-like feature, and the two statistics are respectively the mean gray level and the mean of the squared gray levels within the detection window;
Based on the six normalized HARR-like features (see formula (3)), a feature vector is constructed for each pixel point:
S203, in order to detect the waterline in the image, the feature vector of each pixel point is classified to judge whether it conforms to the straight-line characteristic; if so, the pixel point is set to 1, otherwise to 0, realizing binarization of the gray image and yielding a binary image B. The innovation of the method is that the binarization problem of the image is turned into a pattern recognition problem on the features.
The constructed feature vectors are classified by the SVM method, with the kernel function:
Formula (5) is the kernel function to be designed in the SVM classification method; it maps the feature vectors to a higher dimension, and the SVM then judges whether each feature is a straight-line feature, performing binary classification. A straight-line feature vector is constructed for each pixel point from the HARR-like features, and the SVM then judges whether the pixel point represented by each feature vector lies on a straight line;
After the binary image is obtained, dilation and erosion are applied to the binary image B to obtain a new binary image, and the edge image E of the waterline is obtained by formula (6);
where the two sides of formula (6) denote, respectively, the pixel value at row m, column n of the new binary image and the pixel value at row m, column n of the edge image E;
S204, identifying the straight lines in the edge image E by the HOUGH transform method:
The coordinates of the pixels with value 1 are found in the new binary image; the HOUGH transform applied to them yields, via formula (7), the corresponding straight-line expressions;
where the two parameters are respectively the radius and the angle detected by the HOUGH transform (the parametric representation of a straight line).
A pair of radius and angle values represents one straight line; given such a pair, if a pixel's coordinates satisfy formula (7), the pixel lies on the straight line represented by formula (7);
S205, owing to noise in the image, several straight lines are detected; in this embodiment the interfering lines are removed to obtain the real one. The interference-removal method comprises the following steps:
S205-1, selecting the 3-5 longest straight lines whose length meets the threshold length (length greater than or equal to 1/4 of the image height; in this embodiment the threshold length is 1/4 of the image height);
S205-2, because the waterline is continuous, the straight-line parameters detected in two successive frames must satisfy formula (8), namely:
where the first two quantities are respectively the maximum allowed deviations of angle and radius, the third is the continuity threshold, and the last is the sampling instant of the image.
S205-3, if several straight lines still meet the requirements after S205-1 and S205-2, the line with the highest mean pixel value along it is selected as the detected waterline; the waterline is generally white and has the highest brightness in the image.
The operation error prediction unit selects a monitoring point at the current position of the vehicle based on the detected waterline and calculates the running deviation and running deviation change rate of the vehicle; it selects a monitoring point at the predicted position of the vehicle and calculates the predicted running deviation and predicted deviation change rate;
The detected waterline is shown in FIG. 3:
As shown in FIG. 3, H and W are the height and width of the edge image E (the same as those of the original image). The intersection of the horizontal line at 0.7H with the waterline is taken as the monitoring point of the current position of the vehicle; substituting 0.7H for y in formula (7) gives the abscissa of this monitoring point. The running deviation and the running deviation change rate of the vehicle are then:
0.7H is the value chosen in this embodiment. The distance between the two horizontal lines in FIG. 3 should be at least the distance the vehicle can travel in one control cycle, but the two lines must not be too close to the upper and lower boundaries of the image. According to the set vehicle speed and the field of view of the vision sensor, this embodiment takes the intersection of the horizontal line at 0.7H with the waterline as the monitoring point of the current position;
wherein the scale factor, obtained by camera calibration, represents the actual length corresponding to each pixel (the real distance represented by one pixel unit). The intersection of the horizontal line at 0.3H with the waterline is taken as the monitoring point of the predicted position, giving the predicted running deviation and the predicted deviation change rate of the vehicle:
This division is reasonable because the vehicle speed is approximately 40 m/min and the effective field of view of the vision sensor is 20-30 cm.
As seen in FIG. 3, the error at the current position is to the left, but at the predicted position it is to the right. The error therefore must not be over-corrected at the current position, and the current adjustment can be fine-tuned using the prediction error.
The predictive controller predicts the feedforward control quantity of the vehicle from the predicted running deviation and predicted deviation change rate;
wherein the left-hand side is the feedforward control quantity of the predictive controller and the two coefficients are the two parameters of the predictive controller;
The nonlinear incremental PID controller operates in the nonlinearly divided regions of the e-ec plane and obtains the incremental nonlinear PID control law of the vehicle from the running deviation and running deviation change rate;
The error e and the error change rate ec of the system follow Gaussian distributions, so a non-uniform division of the error e and the error change rate ec is constructed from a Gaussian function; the e-ec plane takes the error e as the horizontal axis and the error change rate ec as the vertical axis. The error and the error change rate refer to the running deviation and the running deviation change rate of the vehicle;
The e-ec plane is divided by a nonlinear division method, as follows:
When the division of the error change rate ec is computed, the maximum in the mapping is that of ec; when the division of the error e is computed, it is that of e. These maxima are respectively the maxima of the absolute values of the error e and of the error change rate ec. In the mapping, the equally spaced division points of the error e or the error change rate ec are mapped to non-uniform division points, with a factor adjusting the degree of nonlinearity. The non-uniform division is shown in FIG. 5.
According to the ranges of the error e and the error change rate ec, the e-ec plane is divided non-uniformly; the set of divided regions (each square in FIG. 6 is one region) is denoted as shown in FIG. 6:
In each divided region, PID control is performed by the nonlinear incremental PID controller, whose control increment is:
where the first variable denotes the sampling instant; within the region at a given row and column of the grid, the three coefficients are respectively the proportional coefficient, the integral coefficient, and the differential coefficient.
Based on the nonlinear PID control increments of all regions, the weighted-average nonlinear PID control increment is calculated:
wherein the first quantity is the region's incremental control law weight, the second is the radius of the region (a square in FIG. 6) in error e and error change rate ec, and the third is the center of the region.
Combining the above incremental control law weights, the incremental nonlinear PID control law at the current time is calculated as:
where the additional factor describes the degree to which the running deviation and the deviation change rate of the vehicle depart from the origin;
This method gives different increment factors for different errors e and error change rates ec.
The increment factor under different running deviations and deviation change rates serves to improve the response speed of the system and reduce the complexity of the optimization process.
The online learning rule unit performs online learning of the parameters of the in-region nonlinear PID control increment, using the supervised Hebb learning rule to learn the parameters of the nonlinear PID control increment;
The process of the online learning rule unit is as follows: when samples fall within a divided region, online learning is performed on the parameters of that region's nonlinear PID control increment. The parameters of the incremental nonlinear PID control are learned with the supervised Hebb learning rule:
wherein the learning rate belongs to the region. To improve learning efficiency, the learning rate is adjusted according to an online adjustment rule, namely:
wherein the matching coefficient within the region adjusts the learning-rate range, and the region has two weight coefficients, which may be set manually based on historical data.
This part adjusts the controller parameters of each region online, realizing online adjustment of the learning rate of each region.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding the understanding of one or more of the various inventive aspects. However, this disclosed method should not be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Thus, the claims following this detailed description are hereby expressly incorporated into it, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules or units or groups of devices in the examples disclosed herein may be arranged in a device as described in this embodiment, or alternatively may be located in one or more devices different from the devices in this example. The modules in the foregoing examples may be combined into one module or may additionally be divided into multiple sub-modules.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. Modules or units or groups in embodiments may be combined into one module or unit or group and may furthermore be divided into sub-modules or sub-units or sub-groups. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Moreover, those skilled in the art will appreciate that although some embodiments described herein include some features included in other embodiments, not others, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
Furthermore, some of the described embodiments are described herein as a method or combination of method elements that can be performed by a processor of a computer system or by other means of performing the described functions. A processor having the necessary instructions for carrying out the method or method elements thus forms a means for carrying out the method or method elements. Further, the elements of the apparatus embodiments described herein are examples of the following apparatus: the apparatus is used to implement the functions performed by the elements for the purpose of carrying out the invention.
The various techniques described herein may be implemented in connection with hardware or software or, alternatively, with a combination of both. Thus, the methods and apparatus of the present invention, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium, wherein, when the program is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the invention.
In the case of program code execution on programmable computers, the computing device will generally include a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. Wherein the memory is configured to store program code; the processor is configured to perform the method of the invention according to instructions in said program code stored in the memory.
By way of example, and not limitation, computer-readable media may comprise computer storage media and communication media. Computer storage media store information such as computer-readable instructions, data structures, program modules, or other data. Communication media typically embody computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism and include any information delivery media. Combinations of any of the above are also included within the scope of computer-readable media.
As used herein, unless otherwise specified the use of the ordinal adjectives "first", "second", "third", etc., to describe a common object, merely indicate that different instances of like objects are being referred to, and are not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
While the invention has been described with respect to a limited number of embodiments, those skilled in the art, having benefit of this description, will appreciate that other embodiments can be devised which do not depart from the scope of the invention as disclosed herein. Moreover, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the appended claims. The present invention has been disclosed in an illustrative rather than a restrictive sense, and the scope of the present invention is defined by the appended claims.
Claims (10)
1. The on-line optimization control method of the unmanned line marking vehicle based on machine vision navigation is characterized by comprising the following steps:
s1, collecting a water line image of a road surface;
s2, carrying out waterline detection based on the waterline image to obtain a waterline;
S3, selecting a monitoring point at the current position of the vehicle based on the detected waterline and calculating the running deviation and running deviation change rate of the vehicle; selecting a monitoring point at the predicted position of the vehicle and calculating the predicted running deviation and predicted deviation change rate;
S4, obtaining a predicted feedforward control quantity from the predicted running deviation and predicted deviation change rate of the vehicle, and obtaining an incremental nonlinear PID control law for the vehicle from the running deviation and running deviation change rate;
2. The online optimization control method for the unmanned line marking vehicle based on machine vision navigation as claimed in claim 1, wherein step S2 specifically comprises the following steps:
S201, performing gray-scale stretching on each pixel point of the waterline image, the stretching formula being as shown in formula (1):
wherein the stretch factors are parameters of formula (1); (i, j) denotes the pixel in row i, column j of the image, whose gray value is stretched; after the gray levels of the waterline image are stretched, a set of Haar-like features is selected;
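The gray-scale stretch of S201 can be illustrated by a minimal sketch. Since formula (1) is not reproduced above, a generic min-max linear stretch stands in for it; the `low`/`high` parameters are assumptions, not the patent's stretch factors.

```python
import numpy as np

def gray_stretch(img, low=None, high=None):
    """Linearly stretch gray values to the full [0, 255] range.

    A stand-in for the patent's formula (1), which is not reproduced;
    `low`/`high` (default: image min/max) are illustrative assumptions.
    """
    img = img.astype(np.float64)
    lo = img.min() if low is None else low
    hi = img.max() if high is None else high
    if hi <= lo:  # flat image: nothing to stretch
        return np.zeros_like(img, dtype=np.uint8)
    out = (img - lo) / (hi - lo) * 255.0
    return np.clip(out, 0, 255).astype(np.uint8)
```

Stretching before feature extraction widens the contrast between the bright waterline and the darker road surface, which makes the subsequent Haar-like responses stronger.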
S202, for the i-th Haar-like feature, the calculation is as shown in formula (2):
wherein the two terms of formula (2) are, respectively, the pixel sum of the waterline region and the pixel sum of the road-surface region of the i-th Haar-like feature, both computed from the gray values of the stretched gray image;
after the Haar-like feature description of each pixel point is obtained, each Haar-like feature is normalized; the normalization formula is as shown in formula (3):
wherein the normalized Haar-like feature is computed from the gray-level mean and the mean of the squared gray levels within the detection window;
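The Haar-like response of formula (2) and its normalization per formula (3) can be sketched as follows. The two-rectangle layout (left half minus right half) and the exact normalizer are assumptions, since the formulas appear only symbolically in the claim.

```python
import numpy as np

def haar_vertical(img, r, c, h, w):
    """Two-rectangle Haar-like response at (r, c): left half minus right
    half of an h-by-w window.

    A stand-in for formula (2); which rectangle is the 'waterline region'
    and which the 'road-surface region' is an assumption.
    """
    win = img[r:r + h, c:c + w].astype(np.float64)
    half = w // 2
    return win[:, :half].sum() - win[:, half:].sum()

def normalize_feature(feat, window):
    """Normalize a Haar-like response by the window's gray-level
    statistics, in the spirit of formula (3): divide by the standard
    deviation derived from the mean and mean-of-squares of the window."""
    win = window.astype(np.float64)
    mean = win.mean()
    mean_sq = (win ** 2).mean()
    std = np.sqrt(max(mean_sq - mean ** 2, 1e-12))
    return feat / (std * win.size)
```

Normalizing by window statistics makes the response roughly illumination-invariant, so the same threshold works across brighter and darker parts of the road image.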
S203, identifying the feature vector of each pixel point; if the feature vector of a pixel point conforms to the linear feature, the pixel is set to 1, otherwise to 0, thereby binarizing the gray image to obtain a binary image B;
carrying out dilation and erosion on the binary image B to obtain a new binary image, and obtaining the edge image E of the waterline using formula (6):
wherein one term of formula (6) denotes the pixel value at a given row and column of the new binary image, and the other denotes the pixel value at the same row and column of the edge image E;
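The dilation-erosion step and the edge extraction of formula (6) might look like the following sketch. The 3x3 structuring element and the morphological-gradient reading of formula (6) are assumptions; the claim gives the formula only symbolically.

```python
import numpy as np

def dilate(b):
    """3x3 binary dilation: a pixel becomes True if any 3x3 neighbour is."""
    p = np.pad(b, 1)
    out = np.zeros_like(b)
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            out |= p[1 + dr:1 + dr + b.shape[0], 1 + dc:1 + dc + b.shape[1]]
    return out

def erode(b):
    """3x3 binary erosion: a pixel stays True only if all neighbours are."""
    p = np.pad(b, 1, constant_values=1)
    out = np.ones_like(b)
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            out &= p[1 + dr:1 + dr + b.shape[0], 1 + dc:1 + dc + b.shape[1]]
    return out

def waterline_edge(binary):
    """Close small gaps (dilate then erode), then take the morphological
    gradient as the edge image -- one plausible reading of formula (6)."""
    closed = erode(dilate(binary))
    return closed & ~erode(closed)
```

Closing before edge extraction suppresses one-pixel holes in the binarized waterline, so the Hough step that follows sees a cleaner edge.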
S204, identifying straight lines in the edge image E using the Hough transform:
the coordinates of the pixels with pixel value 1 are found in the new binary image, and each straight line obtained by the Hough transform yields a corresponding linear expression through formula (7);
a pair of parameters represents a straight line: given such a pair, if a point's coordinates satisfy formula (7), the point lies on the straight line represented by formula (7);
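A minimal sketch of the Hough identification in S204. The standard rho = x*cos(theta) + y*sin(theta) line parameterization is assumed to be the concrete form of formula (7), which is not reproduced in the text.

```python
import numpy as np

def hough_lines(points, n_theta=180, rho_res=1.0):
    """Vote in (theta, rho) space for the given pixels with value 1.

    Each point (x, y) votes for every line rho = x*cos(theta) +
    y*sin(theta) passing through it. Returns the (theta, rho) pair with
    the most votes. A sketch of the idea behind formula (7).
    """
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    pts = np.asarray(points, dtype=np.float64)
    # rho for every (point, theta) combination, shape (n_points, n_theta)
    rhos = pts[:, 0, None] * np.cos(thetas) + pts[:, 1, None] * np.sin(thetas)
    rho_idx = np.round(rhos / rho_res).astype(int)
    rho_min = rho_idx.min()
    acc = np.zeros((rho_idx.max() - rho_min + 1, n_theta), dtype=int)
    for t in range(n_theta):
        np.add.at(acc[:, t], rho_idx[:, t] - rho_min, 1)
    r, t = np.unravel_index(acc.argmax(), acc.shape)
    return thetas[t], (r + rho_min) * rho_res
```

Note that a vertical line x = c can come out as (theta ~ 0, rho ~ c) or the equivalent (theta ~ pi, rho ~ -c); downstream comparisons should treat these as the same line.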
S205, eliminating interference straight lines;
the method for eliminating interference straight lines specifically comprises:
S205-1: selecting the several longest straight lines whose lengths meet the threshold length;
S205-2: requiring that the line parameters detected in two successive binary images satisfy formula (8):
wherein the parameters of formula (8) are, respectively, the maximum allowable deviations of angle and radius, the threshold allowed for continuity, and the sampling time of the image;
S205-3: if more than one straight line still meets the requirements after steps S205-1 and S205-2, selecting the straight line with the highest average pixel value along the line as the detected waterline.
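The three-stage filtering of S205 can be sketched as a simple selection routine. The candidate representation (a dict per line) and the threshold names are illustrative assumptions, not the patent's notation.

```python
def select_waterline(candidates, prev, max_dtheta, max_drho, min_len):
    """Filter candidate lines as in S205 (a sketch; formula (8)'s exact
    thresholds are assumptions).

    `candidates`: list of dicts with 'theta', 'rho', 'length',
    'mean_gray'. `prev`: (theta, rho) detected in the previous frame.
    """
    # S205-1: keep only sufficiently long lines
    kept = [c for c in candidates if c['length'] >= min_len]
    # S205-2: enforce frame-to-frame continuity with the previous detection
    kept = [c for c in kept
            if abs(c['theta'] - prev[0]) <= max_dtheta
            and abs(c['rho'] - prev[1]) <= max_drho]
    # S205-3: among survivors, take the line with the highest mean gray value
    return max(kept, key=lambda c: c['mean_gray']) if kept else None
```

The continuity test is what rejects transient reflections: a spurious line rarely reappears at nearly the same (theta, rho) in two consecutive frames.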
3. The online optimization control method for the unmanned line marking vehicle based on machine vision navigation as claimed in claim 1, wherein step S3 specifically comprises the following steps:
selecting the intersection point of the horizontal line at height y with the waterline as the monitoring point of the current position of the vehicle, the running deviation of the vehicle and the rate of change of the running deviation being:
selecting the intersection point of a horizontal line at the prediction height with the waterline as the monitoring point of the predicted position of the vehicle, and obtaining the predicted running deviation of the vehicle and the predicted rate of change of the deviation:
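One plausible reading of the monitoring-point deviations in claim 3: intersect the detected line with the two horizontal rows and measure the pixel offset from the image centerline. The centerline reference and the finite-difference rate of change are assumptions; the claim's formulas are not reproduced.

```python
import math

def line_x_at_row(theta, rho, y):
    """x-coordinate where the line rho = x*cos(theta) + y*sin(theta)
    crosses image row y (assumes the line is not horizontal)."""
    return (rho - y * math.sin(theta)) / math.cos(theta)

def deviations(theta, rho, y, y_pred, width, x_prev, x_pred_prev, dt):
    """Running and predicted deviation (pixels from the image centerline)
    and their rates of change, via finite differences over the sampling
    time dt. A sketch under the stated assumptions."""
    x = line_x_at_row(theta, rho, y)            # current monitoring point
    xp = line_x_at_row(theta, rho, y_pred)      # predicted monitoring point
    e = x - width / 2.0
    e_pred = xp - width / 2.0
    ec = (x - x_prev) / dt                      # rate of change of deviation
    ec_pred = (xp - x_pred_prev) / dt
    return e, ec, e_pred, ec_pred
```

Taking the predicted monitoring point higher in the image (smaller y in image coordinates) looks further ahead along the road, which is what gives the feedforward term its preview effect.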
4. The online optimization control method for the unmanned line marking vehicle based on machine vision navigation as claimed in claim 1, wherein step S4 specifically comprises the following steps:
S401, calculating the feedforward control quantity of the vehicle according to the predicted running deviation of the vehicle and the predicted rate of change of the deviation:
S402, in the regions of the e-ec plane, obtaining an incremental nonlinear PID control law for the vehicle based on the running deviation of the vehicle and the rate of change of the running deviation;
S402 specifically comprises the following steps: constructing a non-uniform division of the error e and the error change rate ec based on a Gaussian function, the e-ec plane taking the error e as the horizontal axis and the error change rate ec as the vertical axis; the error e and the error change rate ec refer to the running deviation of the vehicle and the rate of change of the running deviation; the e-ec plane is divided by the following nonlinear method:
when computing the division of the error change rate ec, the variable is the error change rate ec; when computing the division of the error e, the variable is the error e; the maxima of the absolute values of the error e and the error change rate ec define the ranges; the equal division points of the error e or the error change rate ec are mapped to non-uniform division points; a nonlinearity adjustment factor controls the degree of non-uniformity;
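The Gaussian-based non-uniform division can be sketched as a mapping of equal division points. The exact mapping function is not reproduced in the claim, so the Gaussian-type form below, with `alpha` as the nonlinearity adjustment factor, is an assumption.

```python
import numpy as np

def nonuniform_partition(E, n, alpha):
    """Map 2n+1 equally spaced division points of [-E, E] through a
    Gaussian-type nonlinearity so that divisions cluster near the origin
    (fine control for small errors, coarse for large ones).

    `E` is the maximum absolute value of the error (or error change
    rate); `alpha` is the nonlinearity adjustment factor. The mapping
    form is an assumption; it is normalized so that +/-E map to +/-E.
    """
    pts = np.linspace(-E, E, 2 * n + 1)          # equal division points
    mapped = (np.sign(pts) * E
              * (1.0 - np.exp(-alpha * (pts / E) ** 2))
              / (1.0 - np.exp(-alpha)))
    return mapped
```

Because 1 - exp(-alpha*x^2) grows only quadratically near zero, the mapped points bunch up around the origin: the controller gets finer-grained regions exactly where small steering corrections matter most.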
PID control is performed within each divided region, and the nonlinear PID control increment is:
wherein one symbol denotes the sampling instant; for each region, indexed by its row and column in the partition, a proportional coefficient, an integral coefficient, and a differential coefficient are defined within that region;
based on the nonlinear PID control increments of all regions, the weighted average nonlinear PID control increment is calculated:
wherein each region has a control-law increment weight, radii in the error e and error change rate ec directions, and a center;
a quantity describing the degree to which the error e and the error change rate ec deviate from the origin is used, together with a scaling factor;
this method enables different values of e and ec to produce different increment factors.
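The per-region incremental PID law and its weighted averaging might be sketched as follows. The standard incremental PID form and the Gaussian distance weight are assumptions consistent with the claim's description, not its exact formulas.

```python
import math

def pid_increment(kp, ki, kd, e_hist):
    """Standard incremental PID law on the last three error samples
    e_hist = [e(t-2), e(t-1), e(t)] (assumed form of the per-region
    increment, which the claim gives only symbolically)."""
    e2, e1, e0 = e_hist
    return kp * (e0 - e1) + ki * e0 + kd * (e0 - 2 * e1 + e2)

def weighted_increment(regions, e_hist, e, ec):
    """Gaussian-weighted average of per-region PID increments.

    Each region is a dict with gains 'kp', 'ki', 'kd', center
    ('ce', 'cec') and radii ('re', 'rec') in the e and ec directions.
    The Gaussian weight form is an assumption.
    """
    num = den = 0.0
    for r in regions:
        # weight decays with distance of (e, ec) from the region center
        w = math.exp(-((e - r['ce']) / r['re']) ** 2
                     - ((ec - r['cec']) / r['rec']) ** 2)
        num += w * pid_increment(r['kp'], r['ki'], r['kd'], e_hist)
        den += w
    return num / den if den > 0 else 0.0
```

Blending neighbouring regions with smooth weights avoids the control-quantity jumps that a hard switch between per-region PID gains would cause.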
5. The online optimization control method for the unmanned line marking vehicle based on machine vision navigation as claimed in claim 1, wherein step S5 specifically comprises the following steps:
when several pairs of the error e and the error change rate ec fall within a divided region, the parameters of the nonlinear PID control increment within that region, namely the proportional, integral, and differential coefficients, are learned online;
the parameters of the incremental nonlinear PID control are learned using the supervised Hebb learning rule:
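The supervised Hebb learning of the per-region PID parameters can be sketched with the classic single-neuron adaptive-PID rule. This is an assumption: the patent's exact update and its online learning-rate adjustment are not reproduced in the text.

```python
def hebb_update(gains, etas, e_hist, u):
    """One supervised-Hebb update of a region's (kp, ki, kd).

    Classic single-neuron adaptive PID rule: each gain moves by
    eta * e(t) * u(t) * x_i(t), where x_i are the incremental-PID inputs
    and u(t) is the applied control. e_hist = [e(t-2), e(t-1), e(t)].
    A sketch under stated assumptions.
    """
    e2, e1, e0 = e_hist
    x = (e0 - e1,                # proportional input
         e0,                     # integral input
         e0 - 2 * e1 + e2)       # differential input
    return tuple(g + eta * e0 * u * xi
                 for g, eta, xi in zip(gains, etas, x))
```

The update is "supervised" because the error e(t) itself acts as the teaching signal: gains grow when the control action and the error are correlated, and the per-region learning rates (the `etas`) control how fast each coefficient adapts.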
6. An online optimization control system for an unmanned line marking vehicle based on machine vision navigation, characterized by comprising a vision sensor, a waterline detection unit, an operation error prediction unit, a predictive controller, a nonlinear incremental PID controller, and an online learning rule unit;
the vision sensor collects a waterline image of the road surface;
the waterline detection unit performs waterline detection based on the waterline image to obtain a waterline;
the operation error prediction unit selects a monitoring point at the current position of the vehicle based on the detected waterline and calculates the running deviation of the vehicle and the rate of change of the running deviation; it also selects a monitoring point at the predicted position of the vehicle and calculates the predicted running deviation of the vehicle and the predicted rate of change of the deviation;
the predictive controller calculates a feedforward control quantity according to the predicted running deviation of the vehicle and the predicted rate of change of the deviation;
the nonlinear incremental PID controller, in the nonlinearly divided regions of the e-ec plane, obtains an incremental nonlinear PID control law for the vehicle based on the running deviation of the vehicle and the rate of change of the running deviation;
7. The online optimization control system for the unmanned line marking vehicle based on machine vision navigation as claimed in claim 6, wherein the working process of the waterline detection unit specifically comprises the following steps:
S201, performing gray-scale stretching on each pixel point of the waterline image, the stretching formula being as shown in formula (1):
wherein the stretch factors are parameters of formula (1); (i, j) denotes the pixel in row i, column j of the image, whose gray value is stretched; after the gray levels of the waterline image are stretched, a set of Haar-like features is selected to express the linear feature of each pixel point;
S202, for the i-th Haar-like feature, the calculation is as shown in formula (2):
wherein the two terms of formula (2) are, respectively, the pixel sum of the waterline region and the pixel sum of the road-surface region of the i-th Haar-like feature, both computed from the gray values of the stretched gray image;
after the Haar-like feature description of each pixel point is obtained, each Haar-like feature is normalized; the normalization formula is as shown in formula (3):
wherein the i-th normalized Haar-like feature is computed from the gray-level mean and the mean of the squared gray levels within the detection window;
S203, identifying the feature vector of each pixel point; if the feature vector of a pixel point conforms to the linear feature, the pixel is set to 1, otherwise to 0, thereby binarizing the gray image to obtain a binary image B;
carrying out dilation and erosion on the binary image B to obtain a new binary image, and obtaining the edge image E of the waterline using formula (6),
wherein one term of formula (6) denotes the pixel value at a given row and column of the new binary image, and the other denotes the pixel value at the same row and column of the edge image E;
S204, identifying straight lines in the edge image E using the Hough transform:
the coordinates of the pixels with pixel value 1 are found in the new binary image, and each straight line obtained by the Hough transform yields a corresponding linear expression through formula (7);
a pair of parameters represents a straight line: given such a pair, if a point's coordinates satisfy formula (7), the point lies on the straight line represented by formula (7);
S205, eliminating interference straight lines;
the method for eliminating interference straight lines specifically comprises:
S205-1, selecting the several longest straight lines whose lengths meet the threshold length;
S205-2, requiring that the line parameters detected in two successive binary images satisfy formula (8):
wherein the parameters of formula (8) are, respectively, the maximum allowable deviations of angle and radius, the threshold allowed for continuity, and the sampling time of the image;
S205-3, if 2 or more straight lines remain after steps S205-1 and S205-2, selecting the straight line with the highest average pixel value along the line as the detected waterline.
8. The online optimization control system for the unmanned line marking vehicle based on machine vision navigation as claimed in claim 6, wherein
the specific working process of the operation error prediction unit comprises: the height and width of the edge image E of the waterline are H and W, respectively; the intersection point of the horizontal line at height y with the waterline is selected as the monitoring point of the current position of the vehicle, and the running deviation of the vehicle and the rate of change of the running deviation are calculated as:
the intersection point of a horizontal line at the prediction height with the waterline is selected as the monitoring point of the predicted position of the vehicle, and the predicted running deviation of the vehicle and the predicted rate of change of the deviation are obtained as:
9. The online optimization control system for the unmanned line marking vehicle based on machine vision navigation as claimed in claim 6, wherein
the feedforward control quantity of the vehicle predicted by the predictive controller is:
wherein the feedforward control quantity of the predictive controller is computed from two parameters of the predictive controller;
the working process of the nonlinear incremental PID controller specifically comprises the following steps:
constructing a non-uniform division of the error e and the error change rate ec based on a Gaussian function, the e-ec plane taking the error e as the horizontal axis and the error change rate ec as the vertical axis; the error e and the error change rate ec refer to the running deviation of the vehicle and the rate of change of the running deviation;
the nonlinear division method for the e-ec plane is as follows:
when computing the division of the error change rate ec, the variable is the error change rate ec; when computing the division of the error e, the variable is the error e; the maxima of the absolute values of the error e and the error change rate ec define the ranges; the equal division points of the error e or the error change rate ec are mapped to non-uniform division points; a nonlinearity adjustment factor controls the degree of non-uniformity;
according to the ranges of the error e and the error change rate ec, the e-ec plane is divided non-uniformly, and the resulting set of divided regions is recorded;
wherein one symbol denotes the sampling instant; for each region, indexed by its row and column in the partition, a proportional coefficient, an integral coefficient, and a differential coefficient are defined within that region;
based on the nonlinear PID control increments of all regions, the weighted average nonlinear PID control increment is calculated:
wherein each region has a control-law increment weight, radii in the error e and error change rate ec directions, and a center;
the incremental nonlinear PID control law at the current sampling instant is computed based on the incremental control-law weights:
10. The online optimization control system for the unmanned line marking vehicle based on machine vision navigation as claimed in claim 6, wherein the working process of the online learning rule unit specifically comprises the following steps:
when several pairs of the error e and the error change rate ec fall within a divided region, the parameters of the nonlinear PID control increment within that region, namely the proportional, integral, and differential coefficients, are learned online;
the parameters of the incremental nonlinear PID control are learned using the supervised Hebb learning rule:
wherein each region has an internal learning rate, and the learning rate is adjusted based on an online adjustment rule:
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211451943.7A CN115509122B (en) | 2022-11-21 | 2022-11-21 | Online optimization control method and system for unmanned line marking vehicle based on machine vision navigation |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115509122A true CN115509122A (en) | 2022-12-23 |
CN115509122B CN115509122B (en) | 2023-03-21 |
Family
ID=84513924
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211451943.7A Active CN115509122B (en) | 2022-11-21 | 2022-11-21 | Online optimization control method and system for unmanned line marking vehicle based on machine vision navigation |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115509122B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115760850A (en) * | 2023-01-05 | 2023-03-07 | 长江勘测规划设计研究有限责任公司 | Method for identifying water level without scale by using machine vision |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120179322A1 (en) * | 2009-09-15 | 2012-07-12 | Ross Hennessy | System and method for autonomous navigation of a tracked or skid-steer vehicle |
US20150275445A1 (en) * | 2012-10-17 | 2015-10-01 | Diane Lee WATSON | Vehicle for line marking |
CN106527119A (en) * | 2016-11-03 | 2017-03-22 | 东华大学 | Fuzzy control-based differentiation first PID (proportion integration differentiation) control system |
CN109176519A (en) * | 2018-09-14 | 2019-01-11 | 北京遥感设备研究所 | A method of improving the Robot Visual Servoing control response time |
CN110398979A (en) * | 2019-06-25 | 2019-11-01 | 天津大学 | A kind of unmanned engineer operation equipment tracking method and device that view-based access control model is merged with posture |
AU2020104234A4 (en) * | 2020-12-22 | 2021-03-11 | Qingdao Agriculture University | An Estimation Method and Estimator for Sideslip Angle of Straight-line Navigation of Agricultural Machinery |
CN112706835A (en) * | 2021-01-07 | 2021-04-27 | 济南北方交通工程咨询监理有限公司 | Expressway unmanned marking method based on image navigation |
CN113296518A (en) * | 2021-05-25 | 2021-08-24 | 山东交通学院 | Unmanned driving system and method for formation of in-place heat regeneration unit |
CN113960921A (en) * | 2021-10-19 | 2022-01-21 | 华南农业大学 | Visual navigation control method and system for orchard tracked vehicle |
CN114942641A (en) * | 2022-06-06 | 2022-08-26 | 仲恺农业工程学院 | Road bridge autonomous walking marking system controlled by multiple sensor data fusion stereoscopic vision |
CN115082701A (en) * | 2022-08-16 | 2022-09-20 | 山东高速集团有限公司创新研究院 | Multi-water-line cross identification positioning method based on double cameras |
Non-Patent Citations (2)
Title |
---|
FAN HONG,ET AL.: "A SEMI-FRAGILE WATERMARKING SCHEME BASED ON NEURAL NETWORK", 《PROCEEDINGS OF THE THIRD INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS》 * |
WANG SHAOLEI ET AL.: "Image-navigation-based unmanned driving system for expressway line marking vehicles", 《电子世界》 (Electronics World) * |
Also Published As
Publication number | Publication date |
---|---|
CN115509122B (en) | 2023-03-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113221905B (en) | Semantic segmentation unsupervised domain adaptation method, device and system based on uniform clustering and storage medium | |
CN109375235B (en) | Inland ship freeboard detection method based on deep reinforcement neural network | |
CN106372749B (en) | Ultra-short term photovoltaic power prediction technique based on the analysis of cloud variation | |
CN109785385B (en) | Visual target tracking method and system | |
CN108614994A (en) | A kind of Human Head Region Image Segment extracting method and device based on deep learning | |
CN112818873B (en) | Lane line detection method and system and electronic equipment | |
CN110889332A (en) | Lie detection method based on micro expression in interview | |
CN112348849A (en) | Twin network video target tracking method and device | |
CN107301657B (en) | A kind of video target tracking method considering target movable information | |
CN111325711A (en) | Chromosome split-phase image quality evaluation method based on deep learning | |
TW202022797A (en) | Object detection method using cnn model and object detection apparatus using the same | |
CN112298194B (en) | Lane changing control method and device for vehicle | |
CN115509122B (en) | Online optimization control method and system for unmanned line marking vehicle based on machine vision navigation | |
CN106557173A (en) | Dynamic gesture identification method and device | |
CN113516853B (en) | Multi-lane traffic flow detection method for complex monitoring scene | |
CN112184655A (en) | Wide and thick plate contour detection method based on convolutional neural network | |
CN116630748A (en) | Rare earth electrolytic tank state multi-parameter monitoring method based on fused salt image characteristics | |
CN116476863A (en) | Automatic driving transverse and longitudinal integrated decision-making method based on deep reinforcement learning | |
CN105225252B (en) | Particle clouds motion Forecasting Methodology | |
CN112508851A (en) | Mud rock lithology recognition system based on CNN classification algorithm | |
CN116994236A (en) | Low-quality image license plate detection method based on deep neural network | |
CN116863353A (en) | Electric power tower inclination degree detection method based on rotating target detection network | |
CN109993772B (en) | Example level feature aggregation method based on space-time sampling | |
CN115170882A (en) | Optimization method of rail wagon part detection network and guardrail breaking fault identification method | |
CN118379664A (en) | Video identification method and system based on artificial intelligence |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||