CN111598911A - Autonomous line patrol method and device for robot platform and storage medium - Google Patents

Autonomous line patrol method and device for robot platform and storage medium

Info

Publication number
CN111598911A
CN111598911A (application CN202010724361.6A)
Authority
CN
China
Prior art keywords
line
lane line
robot
lane
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010724361.6A
Other languages
Chinese (zh)
Other versions
CN111598911B (en)
Inventor
陈挺任
马子昂
卢维
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd filed Critical Zhejiang Dahua Technology Co Ltd
Publication of CN111598911A publication Critical patent/CN111598911A/en
Application granted granted Critical
Publication of CN111598911B publication Critical patent/CN111598911B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/168 Segmentation; Edge detection involving transform domain methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588 Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20048 Transform domain processing
    • G06T2207/20061 Hough transform

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses an autonomous line patrol method and device for a robot platform, and a storage medium. The method comprises the following steps: obtaining a ground top view of the robot line patrol scene and detecting all line segments in the ground top view; screening all the line segments to obtain an initial lane line of the robot meeting the requirements of preset line segment angle, energy value and distance parameters, and controlling the robot to run along the initial lane line; and, in the running process of the robot, acquiring a predicted lane line meeting the requirements of a preset angle change rate and a preset distance change rate in the next frame image, screening all line segments according to the predicted lane line and the set line segment angles, energy values and distance parameters to obtain a new lane line, and controlling the robot to run along the new lane line. The invention improves the accuracy of lane line detection and tracking.

Description

Autonomous line patrol method and device for robot platform and storage medium
Technical Field
The invention relates to the technical field of visual navigation, and in particular to an autonomous line patrol method and device for a robot platform, and a storage medium.
Background
With the increasing maturity of robot technology, the application scenarios of robots have become more and more extensive, for example: military reconnaissance and surveillance, unmanned driving and food delivery robots in service and consumer settings, fire fighting, environmental monitoring, power inspection and the like. As robots play more and more important roles in various fields, the public demand for robot intelligence grows higher and higher.
In recent years, the autonomous line patrol function of robots has become a research hotspot. Existing autonomous line patrol algorithms for robots mainly comprise line patrol algorithms based on deep learning and line patrol algorithms based on traditional vision. A deep-learning-based line patrol algorithm needs a large amount of manually labeled training data from different scenes and a high-performance GPU, so its scene generalization is poor and its requirements on device hardware are high. A line patrol algorithm based on traditional vision can reach real-time processing speed on an ordinary CPU and has the characteristics of high precision, low cost and convenient deployment on embedded devices. However, the inventors of the present invention found that existing line patrol algorithms based on traditional vision still have defects such as low lane line detection precision and the inability to recognize broken lines at arbitrary angles.
Disclosure of Invention
The invention provides an autonomous line patrol method and device for a robot platform, and a storage medium, which can to a certain extent overcome the defects of existing robot line patrol algorithms, such as low lane line detection precision and the inability to recognize broken lines at arbitrary angles.
In order to solve the technical problems, the technical scheme adopted by the invention is as follows:
an autonomous line patrol method for a robot platform comprises the following steps:
the method comprises the steps of obtaining a ground top view of a robot line patrol scene, and detecting all line segments in the ground top view;
screening all the line segments to obtain an initial lane line meeting the requirements of preset line segment angles, energy values and distance parameters, and controlling the robot to run along the initial lane line;
and in the running process of the robot, acquiring a predicted lane line meeting the requirements of a preset angle change rate and a preset distance change rate in the next frame of image, screening all line segments according to the predicted lane line and set line segment angles, energy values and distance parameters to obtain a new lane line, and controlling the robot to run along the new lane line.
The technical scheme adopted by the invention also comprises the following steps: and in the running process of the robot, detecting whether a broken line exists in the next frame of image, tracking the broken line when the broken line exists, controlling the robot to rotate to be parallel to the broken line, and controlling the robot to run along the broken line.
The technical scheme adopted by the invention also comprises the following steps: the detecting all line segments in the ground top view comprises:
converting a current frame image of a robot line patrol scene into a gray map;
carrying out Gaussian blur processing on the gray level image;
performing perspective transformation on the gray-scale image subjected to the Gaussian blur processing by using a perspective transformation matrix to obtain a corresponding ground top view;
and carrying out edge detection and Hough transformation on the ground top view to obtain all line segments in the ground top view.
The technical scheme adopted by the invention also comprises the following steps: the step of obtaining the initial lane line of the robot by screening all the line segments according to the set line segment angle, the set energy value and the set distance parameter comprises the following steps:
screening all the line segments to obtain two left and right adjacent lane line segments which accord with a first screening rule, and setting a lane formed by the two screened lane line segments as an initial lane line of the robot; the first filtering rule comprises:
the angle difference between the two lane line segments cannot exceed a set degree;
the energy values of the two lane segments are greater than a set threshold, the energy value of the left lane segment is negative, and the energy value of the right lane segment is positive;
the actual distance between the two lane segments is within the actual width of the lane line.
The technical scheme adopted by the invention also comprises the following steps: the controlling the robot to travel along the initial lane line includes:
converting the initial lane line inverse perspective into an image coordinate system, and converting the image coordinate system into a camera coordinate system;
calculating the positions of the four vertexes of the left lane line segment and the right lane line segment under the camera coordinate system, and then calculating the distance and the included angle between the robot and the initial lane line;
and controlling the robot to run along the initial lane line according to the calculation result of the distance and the included angle.
The technical scheme adopted by the invention also comprises the following steps: the step of screening all the line segments to obtain a new lane line according to the predicted lane line and the set line segment angle, energy value and distance parameter comprises the following steps:
setting an angle change rate Δθ as the average angle change of the first five frames of the initial lane line;
setting a distance change rate Δd as the average pixel displacement of the initial lane line along the x-axis direction of the image coordinate system over the first five frames;
obtaining the position of the lane line in the next frame image through prediction with Δθ and Δd, and defining the lane line at this moment as a predicted lane line;
screening out lane line segments, the angle difference between which and the predicted lane line does not exceed a set degree, the real distance between which and the left line segment or the real distance between which and the right line segment of the predicted lane line does not exceed a set distance, from all the line segments of the top view, wherein the energy value of the lane line segments is greater than a set threshold value;
and defining a lane line formed by the predicted lane line and the screened lane line segment as a new lane line.
The technical scheme adopted by the invention also comprises the following steps: the step of screening all the line segments to obtain a new lane line according to the predicted lane line and the set line segment angle, energy value and distance parameter further comprises:
if no new lane line is tracked in the next frame image, judging whether a new lane line is tracked in the images with continuously set frame numbers;
and if the new lane line is not tracked in the images with the continuously set frame number, judging that the robot deviates from the lane line, and controlling the robot to stop running.
The technical scheme adopted by the invention also comprises the following steps: the detecting, by the broken line detection module, whether a broken line exists in the next frame image comprises:
eliminating, from all the line segments of the top view, the line segments whose included angle with the current lane line is smaller than a second set degree, to obtain broken line candidate line segments;
selecting two line segments which accord with a third screening rule from the broken line candidate line segments, namely, the broken line obtained by detection; the third screening rule is as follows:
the angle difference between the two line segments is less than a first set degree;
the difference between the actual distance between the two line segments and the actual width of the current lane line is smaller than a set value;
and the difference between the energy values of the two line segments and the energy values of the two line segments of the current lane line is less than a set threshold value.
The technical scheme adopted by the invention also comprises the following steps: the tracking of the broken line comprises:
calculating the distance and the included angle between the robot and the broken line;
judging whether the distance between the robot and the broken line is within a set distance range;
and if the distance is within the set distance range, controlling the robot to stop running and rotate on the spot, and updating the broken line into a new lane line when the robot rotates to be parallel to the broken line.
The invention adopts another technical scheme: an autonomous line patrol device of a robot platform, the device comprising:
a line segment detection module, used for acquiring a ground top view of a robot line patrol scene and detecting all line segments in the ground top view;
an initial lane line screening module, used for screening all the line segments to obtain an initial lane line of the robot meeting the requirements of preset line segment angle, energy value and distance parameters, and controlling the robot to run along the initial lane line;
a lane line tracking module, used for acquiring, in the running process of the robot, a predicted lane line meeting the requirements of a preset angle change rate and a preset distance change rate in the next frame image, screening all line segments according to the predicted lane line and the set line segment angles, energy values and distance parameters to obtain a new lane line, and controlling the robot to run along the new lane line.
In order to solve the technical problems, the invention adopts another technical scheme: an autonomous line patrol device of a robot platform is provided, comprising a processor and a memory coupled to the processor, wherein
the memory stores program instructions for implementing the autonomous line patrol method of the robot platform as described above;
the processor is used for executing the program instructions stored in the memory to control the robot to patrol autonomously.
In order to solve the technical problems, the invention adopts another technical scheme: a storage medium storing program instructions executable by a processor to perform the autonomous line patrol method of the robot platform as described above.
The invention has the beneficial effects that:
firstly, in the processes of initial lane line selection and lane line tracking, angle and energy criteria are added and the distance between line segments is judged, while inter-frame information is used to select and track lane lines; the obtained lane line angle has high precision and small error, which improves the accuracy of lane line detection.
Secondly, lane line tracking is adopted: the candidate lane line segments of the current frame are screened with inter-frame information, then further screened by distance, angle and energy values and complemented to obtain a new lane line, which improves the robustness of the robot in environments with damaged lane lines, occlusion, weak illumination and the like.
Thirdly, while tracking the lane line, a broken line detection module detects in real time whether a broken line exists; when a broken line is detected it is tracked, and when the distance between the robot and the broken line is within a set range the robot is controlled to stop and turn until parallel to the broken line, after which the robot continues to travel along the broken line. With this scheme, broken lines at arbitrary angles can be recognized and turned along, which improves the line patrol capability of the robot on complex curved paths.
Drawings
Fig. 1 is a schematic flow chart of an autonomous line patrol method of a robot platform according to a first embodiment of the present invention;
FIG. 2 is a flow chart illustrating an autonomous patrol method of a robot platform according to a second embodiment of the present invention;
FIG. 3a is an original current frame image obtained in a second embodiment of the present invention;
FIG. 3b is a top view of the ground with perspective transformation according to the second embodiment of the present invention;
FIG. 4 is a diagram illustrating a line segment energy value calculation method according to a second embodiment of the present invention;
FIG. 5 is a schematic diagram of a first configuration of an autonomous line patrol apparatus of a robotic platform according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a second configuration of an autonomous patrol device of a robot platform according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of a storage medium structure according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first", "second" and "third" in the present invention are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first", "second" or "third" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise. All directional indicators (such as up, down, left, right, front and rear, etc.) in the embodiments of the present invention are only used to explain the relative positional relationship, movement and the like between the components in a specific posture (as shown in the drawings); if the specific posture changes, the directional indicator changes accordingly. Furthermore, the terms "include" and "have", as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the invention. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
Example one
Fig. 1 is a schematic flow chart of an autonomous line patrol method for a robot platform according to a first embodiment of the present invention. The autonomous line patrol method for the robot platform according to the first embodiment of the present invention includes the steps of:
s100: acquiring a ground top view of a robot line patrol scene, and detecting all line segments in the ground top view;
in S100, the ground plan view is obtained by: converting a current frame image of a robot line patrol scene from a color image to a gray image, then carrying out Gaussian blur processing on the gray image, and carrying out perspective transformation on the gray image subjected to the Gaussian blur processing according to a perspective transformation matrix to obtain a corresponding ground top view.
The line segment detection is specifically: Canny edge detection and Hough transform are performed on the ground top view to detect all line segments in the ground top view.
S110: screening all line segments according to the set line segment angles, the set energy values and the set distance parameters to obtain an initial lane line of the robot, and controlling the robot to run along the initial lane line;
in S110, the screening method of the initial lane line specifically includes: screening all the detected line segments to obtain two left and right adjacent lane line segments which accord with a first screening rule, and setting a lane formed by the two screened lane line segments as an initial lane line of the robot; the first filtering rule includes:
(1) because the left line segment and the right line segment of one lane line are parallel, the angle difference between the two screened lane line segments cannot exceed a first set degree (set to 10 degrees in the invention; adjustable according to actual operation);
(2) the energy values of the two lane line segments need to be larger than a set threshold, the energy value of the left lane line segment being negative and that of the right lane line segment positive;
(3) the actual distance between the two lane line segments should be within the actual width range of a lane line; in general, the actual width of a lane line ranges from 0.15 m to 0.4 m; and the real distance between the two screened lane line segments is taken as the actual width of the initial lane line.
S120: in the running process of the robot, obtaining a predicted lane line in the next frame of image according to the angle change rate and the distance change rate, screening all line segments according to the predicted lane line and set line segment angles, energy values and distance parameters to obtain a new lane line, and controlling the robot to run along the new lane line;
in S120, the new lane line screening method specifically includes: in the moving process of the robot, the angle and the position of the lane line in the top view can be changed, and the angle change rate is set
Δθ is set as the average angle change of the initial lane line over the first five frames, and the distance change rate Δd is set as the average pixel displacement of the initial lane line along the x-axis of the image coordinate system over the first five frames. The position of the lane line in the next frame image is predicted from the two parameters Δθ and Δd, the lane line at this moment is defined as the predicted lane line, and lane line segments meeting a second screening rule are screened from all line segments of the ground top view according to the predicted lane line. The second screening rule specifically comprises:
(1) screening out lane line segments whose angle difference with the predicted lane line does not exceed a first set degree (set to 10 degrees in the invention; adjustable according to actual operation) and whose real distance to the left or right line segment of the predicted lane line does not exceed a set distance (set to 0.05 m in the invention; adjustable according to actual operation);
(2) the energy value of the screened lane line segment must be greater than a set threshold; adding the line segment energy value to the lane line tracking judgment improves the accuracy of lane line tracking.
S130: and in the running process of the robot, detecting whether a broken line exists in the next frame of image, tracking the broken line when the broken line exists, controlling the robot to rotate to be parallel to the broken line, and controlling the robot to run along the broken line.
In S130, the broken line detection method is specifically: from all line segments in the top view, eliminating those whose included angle with the current lane line is smaller than a second set degree (set to 30 degrees in the invention; adjustable according to actual operation) to obtain the broken line candidate line segments; and selecting from the candidates the two line segments that conform to a third screening rule, these being the detected broken line. It is then judged whether the distance between the robot and the broken line is within a set distance range (set to 1 meter in the application; adjustable according to actual operation); if so, a control command is issued through the ROS to control the robot to stop and rotate in place, and the broken line is updated to a new lane line when the robot has rotated parallel to it.
In the autonomous line patrol method of the robot platform according to the first embodiment of the invention, angle and energy criteria and inter-segment distance discrimination are added in the processes of initial lane line selection and lane line tracking, which improves the accuracy of lane line detection;
meanwhile, broken lines at arbitrary angles are detected and recognized through broken line detection and the robot turns along them, which improves the line patrol capability of the robot on complex curved paths.
Example two
Fig. 2 is a schematic flow chart of an autonomous line patrol method for a robot platform according to a second embodiment of the present invention. The autonomous line patrol method for the robot platform according to the second embodiment of the present invention includes the steps of:
S200: acquiring a current frame image of the line patrol scene of the robot;
S210: processing the current frame image to obtain a ground top view corresponding to the current frame image, and detecting all line segments in the ground top view;
in S210, the processing of the current frame image specifically includes: converting the current frame image from a color image to a gray image, then carrying out Gaussian blur processing on the gray image, and carrying out perspective transformation on the gray image subjected to the Gaussian blur processing according to a perspective transformation matrix to obtain a corresponding ground top view; and canny detection + hough transformation is carried out on the ground top view, and all line segments in the ground top view are detected. Specifically, as shown in fig. 3a and 3b, fig. 3a is an original current frame image, and fig. 3b is a ground top view obtained by perspective transformation. Wherein, the perspective transformation formula is as follows:
$$\begin{bmatrix} x' \\ y' \\ w' \end{bmatrix} = H \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}, \qquad (x, y) = \left( \frac{x'}{w'}, \frac{y'}{w'} \right)$$

where $(x, y)$ are the coordinates of a point in the ground top view after perspective transformation, $(u, v)$ are the coordinates in the image coordinate system before perspective transformation, and $H$ is the perspective transformation matrix.
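The patent specifies this pipeline only at the algorithm level; a minimal OpenCV sketch of S210 might look as follows, where the homography H, the blur kernel and the Canny/Hough parameters are illustrative assumptions rather than values given in the patent:

```python
import cv2
import numpy as np

def detect_segments(frame_bgr, H, top_view_size=(400, 400)):
    """Gray conversion -> Gaussian blur -> perspective warp -> Canny -> Hough segments."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)           # kernel size is an assumption
    top_view = cv2.warpPerspective(blurred, H, top_view_size)
    edges = cv2.Canny(top_view, 50, 150)                  # Canny thresholds are assumptions
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=30,
                            minLineLength=40, maxLineGap=10)  # Hough parameters are assumptions
    if lines is None:
        return top_view, []
    return top_view, [tuple(l[0]) for l in lines]         # list of (x1, y1, x2, y2)
```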
S220: screening all the detected line segments to obtain two left and right adjacent lane line segments which accord with a first screening rule, and setting a lane formed by the two screened lane line segments as an initial lane line of the robot;
in S220, the first filtering rule is specifically:
(1) because the left line segment and the right line segment of one lane line are parallel, the angle difference between the two screened lane line segments cannot exceed a first set degree (set to 10 degrees in the invention; adjustable according to actual operation);
(2) the energy values of the two lane line segments need to be larger than a set threshold, the energy value of the left lane line segment being negative and that of the right lane line segment positive. The line segment energy value is calculated as shown in fig. 4: in fig. 4(a), when the included angle between the line segment and the horizontal line is greater than 45 degrees, a point C is taken on the line segment AB and a dashed triangle is placed on each of its left and right sides, with a base of 40 pixels and a height of 20 pixels; in fig. 4(b), when the absolute value of the line segment angle is less than 45 degrees, the energy is calculated over the triangular regions above and below point C. The energy value of point C is:

$$E_C = \frac{S_1 - S_2}{N}$$

where $S_1$ is the sum of the pixel values of all pixels in the left (upper) triangle, $S_2$ is the sum of the pixel values of all pixels in the right (lower) triangle, and $N = 400$ is a constant representing the area of the triangular region. Ten points are taken uniformly on the line segment AB, and the average of their energy values gives the energy value of the line segment. Adding the line segment energy value to the lane line selection judgment improves the detection precision of the lane line.
(3) the actual distance between the two lane line segments should be within the actual width range of a lane line; in general, the actual width of a lane line ranges from 0.15 m to 0.4 m; and the real distance between the two screened lane line segments is taken as the actual width of the initial lane line.
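A sketch of the energy value and first-rule screening under stated assumptions: the (S1 - S2)/N sign convention follows the reconstruction above, and the energy threshold e_thresh and the pixel-to-metre scale m_per_px are hypothetical parameters introduced for illustration:

```python
import cv2
import numpy as np

def point_energy(top_view, cx, cy, angle_deg, base=40, height=20, N=400.0):
    """Energy of point C per fig. 4: (S1 - S2) / N over the two dashed triangles."""
    sums = []
    for side in (-1, +1):                      # -1: left/upper triangle, +1: right/lower
        mask = np.zeros_like(top_view, dtype=np.uint8)
        if abs(angle_deg) > 45:                # fig. 4(a): triangles to the left and right of C
            tri = np.array([[cx, cy],
                            [cx + side * height, cy - base // 2],
                            [cx + side * height, cy + base // 2]], dtype=np.int32)
        else:                                  # fig. 4(b): triangles above and below C
            tri = np.array([[cx, cy],
                            [cx - base // 2, cy + side * height],
                            [cx + base // 2, cy + side * height]], dtype=np.int32)
        cv2.fillConvexPoly(mask, tri, 1)
        sums.append(float(top_view[mask == 1].sum()))
    return (sums[0] - sums[1]) / N             # sign convention is a reconstruction

def segment_energy(top_view, seg):
    """Average the point energy over ten points taken uniformly on segment AB."""
    x1, y1, x2, y2 = seg
    angle = np.degrees(np.arctan2(y2 - y1, x2 - x1))
    return float(np.mean([point_energy(top_view,
                                       int(x1 + t * (x2 - x1)),
                                       int(y1 + t * (y2 - y1)), angle)
                          for t in np.linspace(0.05, 0.95, 10)]))

def passes_first_rule(top_view, left, right, m_per_px, e_thresh=5.0):
    """First screening rule: near-parallel, signed energies above threshold, width in [0.15, 0.4] m."""
    angle = lambda s: np.degrees(np.arctan2(s[3] - s[1], s[2] - s[0]))
    e_left, e_right = segment_energy(top_view, left), segment_energy(top_view, right)
    width_m = abs(0.5 * (right[0] + right[2]) - 0.5 * (left[0] + left[2])) * m_per_px
    return (abs(angle(left) - angle(right)) <= 10.0
            and e_left < -e_thresh and e_right > e_thresh
            and 0.15 <= width_m <= 0.4)
```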
S230: calculating the distance and the included angle between the robot and the initial lane line, and controlling the robot to run along the initial lane line according to the calculation result of the distance and the included angle;
in S230, the calculation method of the distance and angle between the robot and the initial lane line is specifically: converting the initial lane line inverse perspective into an image coordinate system, and converting the image coordinate system into a camera coordinate system; suppose that two vertexes of the left lane segment detected in the ground top view are
$P_1$ and $P_2$, and the two vertexes of the right lane line segment are $P_3$ and $P_4$. The calculation formula for converting the initial lane line into the image coordinate system is:

$$\begin{bmatrix} u' \\ v' \\ w' \end{bmatrix} = H^{-1} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}, \qquad (u, v) = \left( \frac{u'}{w'}, \frac{v'}{w'} \right) \quad (1)$$

The calculation formula for converting the image coordinates of the four vertexes $P_1$, $P_2$, $P_3$ and $P_4$ to camera coordinates is:

$$\begin{bmatrix} X_i \\ Y_i \\ Z_i \end{bmatrix} = \frac{h}{\big[ K^{-1} (u_i, v_i, 1)^{\mathrm T} \big]_Y} \, K^{-1} \begin{bmatrix} u_i \\ v_i \\ 1 \end{bmatrix} \quad (2)$$

In the above formulas, the image coordinate system coordinates of point $P_i$ are $(u_i, v_i)$ and its camera coordinate system coordinates are $(X_i, Y_i, Z_i)$, for $i = 1, \dots, 4$; $K$ is the internal reference matrix of the camera; $h$ is the height of the camera, i.e. the $Y$ value of the ground in the camera coordinate system, which is obtained by measurement.
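Both conversions admit a compact sketch under a pinhole camera model with the ground plane at Y = h in camera coordinates; the axis convention is an assumption consistent with the reconstruction above:

```python
import numpy as np

def top_view_to_image(pt_top, H):
    """Formula (1): inverse-perspective a top-view point back to image coordinates."""
    p = np.linalg.inv(H) @ np.array([pt_top[0], pt_top[1], 1.0])
    return p[:2] / p[2]

def image_to_camera(pt_img, K, h):
    """Formula (2): back-project an image point onto the ground plane (Y = h) in camera coordinates."""
    ray = np.linalg.inv(K) @ np.array([pt_img[0], pt_img[1], 1.0])
    return (h / ray[1]) * ray      # scale the viewing ray so its Y component equals the camera height
```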
The distance and included angle between the robot and the initial lane line are calculated according to the following formulas, using the ground-plane vertexes $(X_i, Z_i)$ obtained from formula (2):

$$\theta = \arctan \frac{X_2 - X_1}{Z_2 - Z_1} \quad (3)$$

$$d_l = \frac{X_1 Z_2 - X_2 Z_1}{\sqrt{(X_2 - X_1)^2 + (Z_2 - Z_1)^2}} \quad (4)$$

$$d_r = \frac{X_3 Z_4 - X_4 Z_3}{\sqrt{(X_4 - X_3)^2 + (Z_4 - Z_3)^2}} \quad (5)$$

$$D = \frac{d_l + d_r}{2} \quad (6)$$

$$W = \lvert d_r - d_l \rvert \quad (7)$$

In the formulas, $\theta$ is the included angle between the orientation of the robot and the initial lane line; $d_l$ is the signed distance from the robot to the left line segment of the initial lane line; $d_r$ is the signed distance from the robot to the right line segment of the initial lane line; $D$ is the distance from the robot to the initial lane line; $W$ is the actual width of the initial lane line.
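Under the signed-distance reading of formulas (3) to (7) above, the computation might be sketched as:

```python
import numpy as np

def lane_pose(P1, P2, P3, P4):
    """Formulas (3)-(7): each P is a ground-plane vertex (X, Z) in camera coordinates."""
    def signed_dist(a, b):                  # signed lateral distance from the robot (origin) to line ab
        (xa, za), (xb, zb) = a, b
        return (xa * zb - xb * za) / np.hypot(xb - xa, zb - za)
    theta = np.arctan2(P2[0] - P1[0], P2[1] - P1[1])   # (3) angle to the lane direction
    d_l = signed_dist(P1, P2)                          # (4) distance to the left line segment
    d_r = signed_dist(P3, P4)                          # (5) distance to the right line segment
    D = 0.5 * (d_l + d_r)                              # (6) offset from the lane line centreline
    W = abs(d_r - d_l)                                 # (7) actual width of the lane line
    return theta, D, W
```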
After the distance and the included angle between the robot and the initial lane line are calculated, the calculated distance and included angle are issued to the robot control system through ROS (Robot Operating System), and the robot control system controls the robot to run along the initial lane line through the DWA (Dynamic Window Approach) path planning algorithm.
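The patent does not specify the ROS interface; a minimal rospy sketch of issuing the calculated distance and included angle might look as follows, where the topic name and the reuse of Pose2D as a message carrier are assumptions:

```python
import rospy
from geometry_msgs.msg import Pose2D

rospy.init_node('lane_patrol')
pose_pub = rospy.Publisher('/lane_pose_error', Pose2D, queue_size=1)   # topic name is an assumption

def publish_lane_error(distance, angle):
    """Publish the lateral distance D and included angle theta for the DWA-based controller."""
    pose_pub.publish(Pose2D(x=distance, y=0.0, theta=angle))           # Pose2D reused as a plain carrier
```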
S240: in the running process of the robot, carrying out lane line tracking on each frame of image, judging whether a lane line segment meeting a second screening rule exists in the top view, and if so, executing S250; if not, go to S260;
in S240, in the moving process of the robot, the angle and the position of the lane line in the top view can be changed, and the angle change rate is set
Δθ is set as the average angle change of the initial lane line over the first five frames, and the distance change rate Δd is set as the average pixel displacement of the initial lane line along the x-axis of the image coordinate system over the first five frames. The position of the lane line in the next frame image is predicted from the two parameters Δθ and Δd, the lane line at this moment is defined as the predicted lane line, and lane line segments meeting a second screening rule are screened from all line segments of the ground top view according to the predicted lane line. The second screening rule is specifically:
(1) screening out lane line segments whose angle difference with the predicted lane line does not exceed a first set degree (set to 10 degrees in the invention; adjustable according to actual operation) and whose real distance to the left or right line segment of the predicted lane line does not exceed a set distance (set to 0.05 m in the invention; adjustable according to actual operation);
(2) the energy value of the screened lane line segment must be greater than a set threshold; adding the line segment energy value to the lane line tracking judgment improves the accuracy of lane line tracking.
If a lane line segment conforming to the second screening rule is screened out, i.e. a new lane line is tracked, the initial lane line of the robot is updated to the new lane line formed by the screened lane line segments, and the parameter values of Δθ and Δd are updated in real time as new lane lines are tracked. Conversely, if no line segment meeting the second screening rule is screened out, no new lane line is tracked in this frame image.
In the second embodiment of the invention, the two parameters Δθ and Δd compensate for the inter-frame change of the lane line position; the lane line is tracked with inter-frame information instead of being re-detected in every frame, which improves the tracking accuracy of the lane line.
S250: updating the initial lane line of the robot to a new lane line, calculating the distance and the included angle between the robot and the new lane line, controlling the robot to run along the new lane line according to the calculation result of the distance and the included angle, and executing S280;
in S250, after the lane line is updated, the positions of four vertexes of the left line segment and the right line segment of the new lane line under a camera coordinate system are worked out through the formula (1) and the formula (2), the distance and the included angle between the robot and the new lane line are worked out according to the formulas (3) to (7), then the distance and the included angle calculation result is issued to a robot control system through ROS, and the robot control system controls the robot to run along the new lane line through a DWA path planning algorithm.
S260: judging whether a lane line is tracked in the images with the continuously set frame number, and if the lane line is tracked, executing S200 again; if the lane line is not tracked, executing S270;
in S260, if the lane line is not tracked in the images with continuously set frame number (in the present invention, the set frame number is 10 frames, and the setting can be specifically performed according to the actual operation), it is determined that the robot has deviated from the lane line, and the robot is controlled to stop operating.
S270: controlling the robot to stop running, and ending;
S280: in the running process of the robot, detecting whether a broken line exists in the next frame of image in real time through the broken line detection module; if no broken line exists, continuing to execute S240; if a broken line exists, executing S290;
in S280, the method for detecting the polyline specifically includes: eliminating all line segments with included angles smaller than a second set degree (the degree is set to be 30 degrees by the invention and can be set according to actual operation) from all line segments in the top view to obtain a broken line candidate line segment; and selecting two line segments which accord with a third screening rule from the broken line candidate line segments, namely the broken line obtained by detection.
The third screening rule is specifically:
(1) the angle difference between the two line segments is smaller than a first set degree (set to 10 degrees in the invention; adjustable according to actual operation);
(2) the difference between the actual distance between the two line segments and the actual width of the current lane line is smaller than a set value (set to 0.05 m in the invention; adjustable according to actual operation);
(3) the difference between the energy values of the two line segments and the energy values of the two line segments of the current lane line is smaller than a set threshold.
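A sketch of the candidate elimination and third-rule check; the energy tolerance and the midpoint-based width estimate are assumptions, while the 30 degree, 10 degree and 0.05 m values follow the stated defaults:

```python
import numpy as np

def seg_angle(seg):
    x1, y1, x2, y2 = seg
    return np.degrees(np.arctan2(y2 - y1, x2 - x1))

def detect_broken_line(segments, energies, lane_angle, lane_width_m, lane_energies, m_per_px):
    """Third screening rule over candidates that turn >= 30 degrees away from the current lane."""
    cand = [(s, e) for s, e in zip(segments, energies)
            if abs(seg_angle(s) - lane_angle) >= 30.0]         # candidate elimination step
    for i in range(len(cand)):
        for j in range(i + 1, len(cand)):
            (s1, e1), (s2, e2) = cand[i], cand[j]
            width_m = abs(0.5 * (s1[0] + s1[2]) - 0.5 * (s2[0] + s2[2])) * m_per_px
            if (abs(seg_angle(s1) - seg_angle(s2)) < 10.0      # rule (1): near-parallel
                    and abs(width_m - lane_width_m) < 0.05     # rule (2): width matches current lane
                    and abs(e1 - lane_energies[0]) < 5.0       # rule (3): energies match, tolerance assumed
                    and abs(e2 - lane_energies[1]) < 5.0):
                return s1, s2                                  # the detected broken line
    return None
```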
S290: detecting a broken line, and calculating the distance and the included angle between the robot and the broken line;
in S290, the method for detecting the broken line and the method for calculating the distance and the included angle between the robot and the broken line are the same as the method for tracking the lane line in S240 and the method for calculating the distance and the included angle between the robot and the lane line in S230, and this step is not repeated.
S300: controlling the robot to rotate along the fold line according to the calculation result of the distance between the robot and the fold line and the included angle, updating the fold line into a new lane line when the robot rotates to be parallel to the fold line, and controlling the robot to run along the new lane line;
in S300, the method for updating the polyline specifically includes: judging whether the distance between the robot and the broken line is within a set distance range (the distance range is set to be 1 m in the application, and the distance can be set according to actual operation in particular), if so, issuing a control command through an ROS to control the robot to stop and rotate on site, and updating the broken line into a new lane line when the control command is rotated to be parallel to the broken line; and if the distance is not within the set distance range, ending the process. According to the invention, the robot can perform autonomous navigation in a complex environment by matching with the broken line through adding the broken line detection module.
Compared with the prior art, the autonomous line patrol method for the robot platform in the second embodiment of the invention at least has the following beneficial effects:
firstly, in the initial lane line selection process, angle, energy value and inter-segment distance discrimination are added, and inter-frame information is used to track the lane line, so the tracked lane line angle has high precision and small error, which improves the accuracy of lane line detection and tracking.
Secondly, lane line tracking is adopted: the candidate lane line segments of the current frame are screened with inter-frame information, then further screened by distance, angle and energy values and complemented to obtain a new lane line, which improves the robustness of the robot in environments with damaged lane lines, occlusion, weak illumination and the like.
Thirdly, while tracking the lane line, a broken line detection module detects in real time whether a broken line exists; when a broken line is detected it is tracked, and when the distance between the robot and the broken line is within a set range the robot is controlled to stop and turn until parallel to the broken line, after which the robot continues to travel along the broken line. With this scheme, broken lines at arbitrary angles can be recognized and turned along, which improves the line patrol capability of the robot on complex curved paths.
Referring to fig. 5, fig. 5 is a schematic diagram illustrating a first structure of an autonomous line patrol apparatus of a robot platform according to an embodiment of the present invention. The apparatus 40 comprises:
the line segment detection module 41: used for acquiring a ground top view of the robot line patrol scene and detecting all line segments in the ground top view;
the initial lane line screening module 42: used for screening all the line segments according to the set line segment angles, energy values and distance parameters to obtain the initial lane line of the robot, and controlling the robot to run along the initial lane line;
the lane line tracking module 43: used for obtaining, in the running process of the robot, a predicted lane line in the next frame image according to the angle change rate and the distance change rate, screening all line segments according to the predicted lane line and the set line segment angles, energy values and distance parameters to obtain a new lane line, and controlling the robot to run along the new lane line.
Referring to fig. 6, fig. 6 is a schematic diagram illustrating a second structure of the autonomous line patrol apparatus of the robot platform according to the present invention. As shown in fig. 6, the apparatus 50 includes a processor 51, and a memory 52 coupled to the processor 51.
The memory 52 stores program instructions for implementing the autonomous line patrol method of the robot platform described above.
The processor 51 is configured to execute program instructions stored in the memory 52 to control the autonomous patrolling of the robot.
The processor 51 may also be referred to as a CPU (Central Processing Unit). The processor 51 may be an integrated circuit chip having signal processing capabilities. The processor 51 may also be a general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
Referring to fig. 7, fig. 7 is a schematic structural diagram of a storage medium according to an embodiment of the invention. The storage medium of the embodiment of the present invention stores a program file 61 capable of implementing all the methods described above, wherein the program file 61 may be stored in the storage medium in the form of a software product, and includes several instructions to enable a computer device (which may be a personal computer, a server, or a network device) or a processor (processor) to execute all or part of the steps of the methods of the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a mobile hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, or terminal devices, such as a computer, a server, a mobile phone, and a tablet.
In the embodiments provided in the present invention, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a unit is merely a logical division, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit. The above description is only an embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (12)

1. An autonomous line patrol method of a robot platform is characterized by comprising the following steps:
the method comprises the steps of obtaining a ground top view of a robot line patrol scene, and detecting all line segments in the ground top view;
screening all the line segments to obtain an initial lane line of the robot meeting the requirements of preset line segment angles, energy values and distance parameters, and controlling the robot to run along the initial lane line;
and in the running process of the robot, acquiring a predicted lane line meeting the requirements of a preset angle change rate and a preset distance change rate in the next frame of image, screening all line segments according to the predicted lane line and set line segment angles, energy values and distance parameters to obtain a new lane line, and controlling the robot to run along the new lane line.
2. The autonomous line patrol method of a robot platform according to claim 1, wherein said controlling the robot to run along the new lane line further comprises:
and in the running process of the robot, detecting whether a broken line exists in the next frame of image, tracking the broken line when the broken line exists, controlling the robot to rotate to be parallel to the broken line, and controlling the robot to run along the broken line.
3. The autonomous line patrol method of a robot platform of claim 1, wherein said detecting all line segments in the ground top view comprises:
converting a current frame image of a robot line patrol scene into a gray map;
carrying out Gaussian blur processing on the gray level image;
performing perspective transformation on the gray-scale image subjected to the Gaussian blur processing by using a perspective transformation matrix to obtain a corresponding ground top view;
and carrying out edge detection and Hough transformation on the ground top view to obtain all line segments in the ground top view.
4. The autonomous line patrol method of a robot platform according to claim 3,
the step of obtaining the initial lane line of the robot by screening all the line segments according to the set line segment angle, the set energy value and the set distance parameter comprises the following steps:
screening all the line segments to obtain two left and right adjacent lane line segments which accord with a first screening rule, and setting a lane formed by the two screened lane line segments as an initial lane line of the robot; the first filtering rule comprises:
the angle difference between the two lane line segments cannot exceed a set degree;
the energy values of the two lane segments are greater than a set threshold, the energy value of the left lane segment is negative, and the energy value of the right lane segment is positive;
the actual distance between the two lane segments is within the actual width of the lane line.
5. The autonomous line patrol method of a robot platform according to claim 4, wherein said controlling the robot to run along the initial lane line comprises:
converting the initial lane line inverse perspective into an image coordinate system, and converting the image coordinate system into a camera coordinate system;
calculating the positions of the four vertexes of the left lane line segment and the right lane line segment under the camera coordinate system, and then calculating the distance and the included angle between the robot and the initial lane line;
and controlling the robot to run along the initial lane line according to the calculation result of the distance and the included angle.
6. The autonomous line patrol method of a robot platform of claim 1, wherein the screening of all line segments to obtain a new lane line according to the predicted lane line and the set line segment angles, energy values and distance parameters comprises:
setting an angle change rate Δθ as the average angle change of the first five frames of the initial lane line;
setting a distance change rate Δd as the average pixel displacement of the initial lane line along the x-axis direction of the image coordinate system over the first five frames;
obtaining the position of the lane line in the next frame image through prediction with Δθ and Δd, and defining the lane line at this moment as a predicted lane line;
screening out lane line segments, the angle difference between which and the predicted lane line does not exceed a set degree, the real distance between which and the left line segment or the real distance between which and the right line segment of the predicted lane line does not exceed a set distance, from all the line segments of the top view, wherein the energy value of the lane line segments is greater than a set threshold value;
and defining a lane line formed by the predicted lane line and the screened lane line segment as a new lane line.
7. The autonomous line patrol method of a robot platform of claim 6, wherein the screening of all line segments to obtain a new lane line according to the predicted lane line and the set line segment angles, energy values and distance parameters further comprises:
if no new lane line is tracked in the next frame image, judging whether a new lane line is tracked in the images with continuously set frame numbers;
and if the new lane line is not tracked in the images with the continuously set frame number, judging that the robot deviates from the lane line, and controlling the robot to stop running.
8. The autonomous line patrol method of a robot platform of claim 2, wherein the detecting, by the broken line detection module, whether a broken line exists in the next frame image comprises:
eliminating, from all the line segments of the top view, the line segments whose included angle with the current lane line is smaller than a second set degree, to obtain broken line candidate line segments;
selecting two line segments which accord with a third screening rule from the broken line candidate line segments, namely, the broken line obtained by detection; the third screening rule is as follows:
the angle difference between the two line segments is less than a first set degree;
the difference between the actual distance between the two line segments and the actual width of the current lane line is smaller than a set value;
and the difference between the energy values of the two line segments and the energy values of the two line segments of the current lane line is less than a set threshold value.
9. The autonomous line patrol method of a robot platform of claim 8, wherein tracking the broken line comprises:
calculating the distance and the included angle between the robot and the broken line;
judging whether the distance between the robot and the broken line is within a set distance range;
and if the distance is within the set distance range, controlling the robot to stop running and rotate on the spot, and updating the broken line into a new lane line when the robot rotates to be parallel to the broken line.
10. An autonomous line patrol device of a robot platform, the device comprising:
a line segment detection module, used for acquiring a ground top view of a robot line patrol scene and detecting all line segments in the ground top view;
an initial lane line screening module, used for screening all the line segments to obtain an initial lane line of the robot meeting the requirements of preset line segment angle, energy value and distance parameters, and controlling the robot to run along the initial lane line;
a lane line tracking module, used for acquiring, in the running process of the robot, a predicted lane line meeting the requirements of a preset angle change rate and a preset distance change rate in the next frame image, screening all line segments according to the predicted lane line and the set line segment angles, energy values and distance parameters to obtain a new lane line, and controlling the robot to run along the new lane line.
11. An autonomous line patrol device of a robot platform, the device comprising a processor and a memory coupled to the processor, wherein
the memory stores program instructions for implementing the autonomous line patrol method of the robot platform of any one of claims 1 to 9;
the processor is used for executing the program instructions stored in the memory to control the robot to patrol autonomously.
12. A storage medium having stored thereon program instructions executable by a processor to perform the autonomous line patrol method of the robot platform of any one of claims 1 to 9.
CN202010724361.6A 2020-07-14 2020-07-24 Autonomous line patrol method and device for robot platform and storage medium Active CN111598911B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2020106772684 2020-07-14
CN202010677268 2020-07-14

Publications (2)

Publication Number Publication Date
CN111598911A true CN111598911A (en) 2020-08-28
CN111598911B CN111598911B (en) 2020-12-04

Family

ID=72183061

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010724361.6A Active CN111598911B (en) 2020-07-14 2020-07-24 Autonomous line patrol method and device for robot platform and storage medium

Country Status (1)

Country Link
CN (1) CN111598911B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114001738A (en) * 2021-09-28 2022-02-01 浙江大华技术股份有限公司 Visual line patrol positioning method, system and computer readable storage medium
CN115271402A (en) * 2022-07-19 2022-11-01 中环洁环境有限公司 Sanitation vehicle selection method, system, medium and equipment based on road environment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107909007A (en) * 2017-10-27 2018-04-13 上海识加电子科技有限公司 Method for detecting lane lines and device
CN109583280A (en) * 2017-09-29 2019-04-05 比亚迪股份有限公司 Lane detection method, apparatus, equipment and storage medium
CN109785291A (en) * 2018-12-20 2019-05-21 南京莱斯电子设备有限公司 A kind of lane line self-adapting detecting method
CN111209770A (en) * 2018-11-21 2020-05-29 北京三星通信技术研究有限公司 Lane line identification method and device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109583280A (en) * 2017-09-29 2019-04-05 比亚迪股份有限公司 Lane detection method, apparatus, equipment and storage medium
CN107909007A (en) * 2017-10-27 2018-04-13 上海识加电子科技有限公司 Method for detecting lane lines and device
CN111209770A (en) * 2018-11-21 2020-05-29 北京三星通信技术研究有限公司 Lane line identification method and device
CN109785291A (en) * 2018-12-20 2019-05-21 南京莱斯电子设备有限公司 A kind of lane line self-adapting detecting method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
孙宇飞: "Research on Lane Line Detection in Advanced Driver Assistance" (高级辅助驾驶中的车道线检测研究), China Masters' Theses Full-text Database, Information Science and Technology Series (中国优秀硕士学位论文全文数据库 信息科技辑) *
王志 et al.: "Lane Line Detection Method and Implementation in a Mobile Robot Navigation System" (移动机器人导航系统中的车道线检测方法及实现), Computer Measurement & Control (计算机测量与控制) *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114001738A (en) * 2021-09-28 2022-02-01 浙江大华技术股份有限公司 Visual line patrol positioning method, system and computer readable storage medium
CN115271402A (en) * 2022-07-19 2022-11-01 中环洁环境有限公司 Sanitation vehicle selection method, system, medium and equipment based on road environment
CN115271402B (en) * 2022-07-19 2023-06-27 中环洁环境有限公司 Sanitation vehicle selection method, system, medium and equipment based on road environment

Also Published As

Publication number Publication date
CN111598911B (en) 2020-12-04

Similar Documents

Publication Publication Date Title
US11816991B2 (en) Vehicle environment modeling with a camera
CN109635685B (en) Target object 3D detection method, device, medium and equipment
Zhou et al. Automated evaluation of semantic segmentation robustness for autonomous driving
CN111665842B (en) Indoor SLAM mapping method and system based on semantic information fusion
Huang et al. A fast point cloud ground segmentation approach based on coarse-to-fine Markov random field
CN111598911B (en) Autonomous line patrol method and device for robot platform and storage medium
CN110136058B (en) Drawing construction method based on overlook spliced drawing and vehicle-mounted terminal
Zou et al. Real-time full-stack traffic scene perception for autonomous driving with roadside cameras
CN112947419B (en) Obstacle avoidance method, device and equipment
CN111508272B (en) Method and apparatus for providing robust camera-based object distance prediction
CN111931764A (en) Target detection method, target detection framework and related equipment
US10902610B2 (en) Moving object controller, landmark, and moving object control method
CN114119659A (en) Multi-sensor fusion target tracking method
CN114494329B (en) Guide point selection method for autonomous exploration of mobile robot in non-planar environment
Carrera et al. Lightweight SLAM and Navigation with a Multi-Camera Rig.
Küçükyildiz et al. Development and optimization of a DSP-based real-time lane detection algorithm on a mobile platform
CN114581678A (en) Automatic tracking and re-identifying method for template feature matching
CN114549562A (en) UNet-based semi-automated oblique photography live-action model building singulation method, system, equipment and storage medium
CN114972492A (en) Position and pose determination method and device based on aerial view and computer storage medium
CN112405526A (en) Robot positioning method and device, equipment and storage medium
CN111553342A (en) Visual positioning method and device, computer equipment and storage medium
CN115902977A (en) Transformer substation robot double-positioning method and system based on vision and GPS
CN113409268B (en) Method and device for detecting passable area based on monocular camera and storage medium
Kuprešak et al. Solution for autonomous vehicle parking
CN113792634B (en) Target similarity score calculation method and system based on vehicle-mounted camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant