Disclosure of Invention
The invention aims to provide an unmanned aerial vehicle inspection method that improves endurance and realizes automatic inspection of a highway.
In order to solve the above problems, the invention provides an automatic expressway inspection method for an unmanned aerial vehicle, characterized by comprising the following steps:
S1, in the initial state, or after an inspection task is finished, the unmanned aerial vehicle returns to a ground charging station for charging; it enters a charging mode while charging and an inspection preparation mode once charging is finished; when an inspection task is received in the charging mode, step S2 is executed, and when an inspection task is received in the inspection preparation mode, step S3 is executed;
S2, the electric quantity needed by the inspection task is determined; if the current electric quantity meets the requirement of the inspection task, step S3 is executed, otherwise charging continues;
S3, the unmanned aerial vehicle takes off, starts to execute the inspection task, inspects the highway, and shoots a live video;
S4, the live video shot by the unmanned aerial vehicle is divided into video segments of a set duration and cached;
S5, at least one frame of live image is extracted from a cached video segment;
S6, the live image is preprocessed to obtain a binary image of the highway surface, in which the vehicle contours and lane lines are white as the foreground and the road surface is black as the background;
S7, the pixel points of the binary image are scanned in sequence, a region with M continuous foreground pixels in a neighborhood range is defined as a single vehicle, and the number of vehicles, i.e. the current traffic flow, is obtained and transmitted to the ground control center, where M is the area threshold range of a vehicle in the binary image;
S8, the binary image is scanned transversely, a region of N transversely continuous foreground pixel points is defined as a lane line, the lane line at a distance P from the scanning start position is defined as the emergency lane line, and the emergency lane is checked for vehicles from the scanning start position; if a vehicle is present, step S10 is executed, otherwise step S9 is executed, where N is the width threshold range of a lane line on the binary image and P is the width threshold range of the emergency lane on the binary image;
S9, the called video segment is deleted from the cache, and step S5 is executed;
S10, the called video segment in the cache is transmitted to the ground control center, and step S5 is executed;
S11, after the inspection task is finished, the unmanned aerial vehicle returns to the ground charging station, and step S1 is executed.
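As an illustrative sketch only (the mode names and the function `accept_task` are not from the disclosure), the task-acceptance logic of steps S1 to S3 can be modeled as a small state check:

```python
from enum import Enum, auto

class Mode(Enum):
    CHARGING = auto()   # charging mode (step S1, while charging)
    READY = auto()      # inspection preparation mode (charging finished)

def accept_task(mode, current_charge, required_charge):
    """Decide whether a newly received inspection task may start.

    In CHARGING mode the charge needed by the task is compared first
    (step S2); in READY mode the task starts immediately (step S3).
    """
    if mode is Mode.READY:
        return True
    if mode is Mode.CHARGING:
        return current_charge >= required_charge
    return False
```

A task received mid-charge is thus deferred until the battery can cover it, while a fully charged vehicle launches at once.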
As a further improvement of the present invention, the step S6 includes:
S6.1, performing semantic segmentation on the live image obtained in step S5, separating the lane area from the background area, and deleting the pixel points of the background area from the live image;
S6.2, performing binarization processing on the image obtained in step S6.1 to obtain a binarized image in which the road surface is displayed as the background in black and the vehicle contours and lane lines are displayed as the foreground in white.
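A minimal sketch of the binarization in step S6.2, assuming a fixed gray threshold (the value 128 is illustrative; the disclosure does not specify a threshold, and the lane area is assumed to have already been separated by step S6.1):

```python
def binarize(gray, threshold=128):
    """Binarize a grayscale lane region: pixels brighter than the
    threshold (vehicle contours, lane lines) become foreground (255),
    the darker road surface becomes background (0)."""
    return [[255 if px > threshold else 0 for px in row] for row in gray]
```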
As a further improvement of the present invention, the step S7 includes:
S7.1, transversely scanning the binary image, defining a region of N transversely continuous foreground pixel points as a lane line, and deleting the lane line from the binary image;
S7.2, scanning the binary image, and defining a region with M continuous foreground pixel values in a neighborhood range as a single vehicle contour;
S7.3, counting the number of single vehicle contours obtained in step S7.2.
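The counting of steps S7.1 to S7.3 can be sketched as a connected-component count over the binary image, treating any 8-connected foreground region of at least M pixels as a single vehicle (the flood-fill approach and the helper name are assumptions, not the disclosed implementation):

```python
def count_vehicles(img, min_area):
    """Count connected foreground regions of at least min_area pixels
    (the area threshold M of step S7); smaller blobs are noise."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if img[y][x] == 255 and not seen[y][x]:
                # flood-fill the 8-connected region, measuring its area
                stack, area = [(y, x)], 0
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    area += 1
                    for dy in (-1, 0, 1):
                        for dx in (-1, 0, 1):
                            ny, nx = cy + dy, cx + dx
                            if (0 <= ny < h and 0 <= nx < w
                                    and img[ny][nx] == 255
                                    and not seen[ny][nx]):
                                seen[ny][nx] = True
                                stack.append((ny, nx))
                if area >= min_area:
                    count += 1
    return count
```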
As a further improvement of the present invention, in step S7.2, the binary image is scanned, if the current pixel is a foreground pixel, pixels in a neighborhood setting range are scanned, and if the number of foreground pixels in the neighborhood is greater than or equal to a set threshold, all the neighborhoods are filled into foreground pixels to obtain an independent automobile contour, and the independent automobile contour is given a set gray value α, where the α value is between 0 and 255.
As a further improvement of the present invention, in step S7.2, if the number of foreground pixels in the filled vehicle contour is smaller than the set area threshold M, the foreground pixels are deleted from the binarized image.
As a further improvement of the present invention, the step S8 includes:
S8.1, transversely scanning the binary image along the outer side of the lane, defining a region of N transversely continuous foreground pixel points as a lane line, defining the lane line at a distance P from the scanning start position as the emergency lane line, and recording the coordinates (Xm, Yn) of the first foreground pixel points scanned on the emergency lane line;
S8.2, performing curve fitting on the coordinates (Xm, Yn) of the longitudinally continuous foreground pixel points by the least square method to obtain a quadratic parabolic equation of the emergency lane line, drawing the fitted emergency lane line on the image obtained in step S7.2 according to the obtained equation, and giving the emergency lane line a set gray value β, where β is between 0 and 255;
S8.3, transversely scanning the image obtained in step S8.2 along the outer side of the lane; if the pixel value of the first scanned foreground pixel point is α, the emergency lane is occupied and step S10 is executed; if the pixel value of the first scanned foreground pixel point is β, the emergency lane is unoccupied and step S9 is executed.
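The least-squares fitting of step S8.2 can be sketched as solving the normal equations of a quadratic polynomial y = ax² + bx + c over the recorded coordinates (the pure-Python solver below is illustrative; any least-squares routine would serve):

```python
def fit_parabola(points):
    """Least-squares fit of y = a*x**2 + b*x + c to lane-line pixel
    coordinates (Xm, Yn), solving the 3x3 normal equations by
    Gaussian elimination with partial pivoting."""
    sx = [sum(x**k for x, _ in points) for k in range(5)]
    ty = [sum(y * x**k for x, y in points) for k in range(3)]
    # Augmented normal-equation matrix for coefficients [c, b, a]
    A = [[sx[0], sx[1], sx[2], ty[0]],
         [sx[1], sx[2], sx[3], ty[1]],
         [sx[2], sx[3], sx[4], ty[2]]]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            for col in range(i, 4):
                A[r][col] -= f * A[i][col]
    coef = [0.0] * 3
    for i in range(2, -1, -1):
        coef[i] = (A[i][3] - sum(A[i][j] * coef[j]
                                 for j in range(i + 1, 3))) / A[i][i]
    c, b, a = coef
    return a, b, c
```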
The invention has the following advantages. The aerial vehicle automatically returns to the ground charging station for charging after each task. When a new inspection task arrives during charging, the electric quantity required by the task is compared with the current electric quantity, which ensures that each flight can complete its inspection task and realizes automatic inspection. In addition, the video shot by the unmanned aerial vehicle is divided into segments of a set duration and pre-analyzed on board; only the video segments in which the emergency lane may be occupied are transmitted to the ground control center. This reduces the amount of data transmitted to the ground control center and improves the endurance of the unmanned aerial vehicle main body.
Detailed Description
The technical scheme of the invention is further explained below through specific embodiments in combination with the attached drawings.
As shown in fig. 1 and 2, the present invention includes an unmanned aerial vehicle main body 2000, a ground charging station 4000, and a ground control center 6000, and is characterized in that the unmanned aerial vehicle main body 2000 includes:
the flight control module 2100 receives and executes the inspection tasks sent by the ground control center 6000, and returns to the ground charging station 4000 after an inspection task is executed;
a charging module 2200, including a power source 2201 for providing energy to the aircraft body, a charging mating unit 2202 for providing mating for charging the power source 2201, and a charging control unit 2203, wherein when the flight control module 2100 receives a flight signal of the ground control center 6000 in the charging process of the unmanned aerial vehicle body 2000, the charging control unit 2203 compares the electric quantity required by the inspection task with the current electric quantity, and only when the current electric quantity is greater than the electric quantity required by the inspection task, the flight control module 2100 is allowed to execute the inspection task, otherwise, the charging is continued;
a video processing module 2300, including a video segment creation unit 2301, a temporary buffer unit 2302, and a video segment processing unit 2303, wherein the video segment creation unit 2301 divides the live road video recorded while the unmanned aerial vehicle main body 2000 performs an inspection task into video segments of a set duration and stores them in the temporary buffer unit 2302; the video segment processing unit 2303 can call a video segment stored in the temporary buffer unit 2302 and extract at least one frame of live image from it, and the temporary buffer unit 2302 can delete a video segment or transmit it to the ground control center 6000;
the picture processing module 2400 includes a traffic flow statistics sub-module 2403 and an emergency lane detection sub-module 2408, and is configured to obtain, by identifying the live image, the vehicle quantity information in the live image and the vehicle quantity information in the emergency lane respectively, and to feed the obtained information back to the ground control center 6000; when a vehicle exists in the emergency lane, the picture processing module 2400 controls the temporary cache unit 2302 to transmit the corresponding video segment to the ground control center 6000, and otherwise controls the temporary cache unit 2302 to delete the corresponding video segment.
As a further improvement of the present invention, the picture processing module 2400 further includes:
a semantic segmentation unit 2401, performing semantic segmentation on the live image to obtain a lane region and a background region, and removing the background region in the image;
a binarization processing unit 2402 that performs binarization processing on the image from which the background region is removed to obtain a binarized image of the image in which the road surface is displayed as a background in black and the vehicles and the lane lines are displayed as a foreground in white;
the traffic flow statistics sub-module 2403 includes:
a lane line eliminating unit 2404, configured to transversely scan the binarized image, define a region of N transversely continuous foreground pixel points as a lane line, where N is the width threshold of a lane line, and eliminate these pixel points from the binarized image to obtain a first processed image;
a vehicle contour extraction unit 2406, which scans the first processed image, captures the contours of individual vehicles, and gives each contour a set gray value α, where α is between 0 and 255, to obtain a second processed image;
a vehicle contour counting unit 2407 for counting the number of independent vehicle contours in the second processed image.
As a further improvement of the present invention, the emergency lane detection sub-module 2408 includes:
the emergency lane line contour coordinate extraction unit 2409, which transversely scans the binarized image along the outer side of the lane and, if N transversely continuous foreground pixel points are scanned and the distance from these pixel points to the side edge of the binarized image is within the set threshold range, records the coordinates (Xm, Yn) of the first scanned foreground pixel points;
an emergency lane line fitting unit 2410 for performing curve fitting on the coordinates (Xm, Yn) of the longitudinally continuous foreground pixel points, drawing the fitted lane line on the second processed image, and giving the lane line a set gray value β, where β is between 0 and 255, to obtain a third processed image;
the emergency lane scanning unit 2411, which transversely scans the third processed image from the outer side and determines whether the emergency lane is occupied according to the gray value of the first scanned foreground pixel point.
As a further improvement of the present invention, the vehicle contour filling unit sequentially scans the pixels of the first processed image; if the current pixel is a foreground pixel, the pixels within a set neighborhood range are scanned, and if the number of foreground pixels in the neighborhood is greater than or equal to a set threshold, the whole neighborhood is filled with foreground pixels to obtain an independent vehicle contour, which is given a set gray value α, so as to obtain the second processed image.
As a further improvement of the present invention, the emergency lane line fitting unit 2410 uses the least square method to perform curve fitting on the coordinates (Xm, Yn) of the longitudinally continuous foreground pixel points to obtain a quadratic parabolic equation of the lane line, and draws the fitted lane line on the second processed image according to the obtained equation.
As a further improvement of the present invention, the temporary cache unit 2302 assigns priorities to the video segments temporarily cached therein: a video segment stored earlier has a higher priority than one stored later, and the video segment processing unit 2303 always calls the video segment with the highest priority first.
As a further improvement of the present invention, when there is a vehicle in the emergency lane, the picture processing module 2400 controls the temporary cache unit 2302 to transmit the video segment with the highest priority to the ground control center 6000, otherwise controls the temporary cache unit 2302 to delete the video segment with the highest priority.
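The priority behaviour of the temporary cache unit 2302 can be sketched as a first-in-first-out queue, where the decision to transmit or delete always applies to the oldest (highest-priority) segment (the class and method names below are illustrative, not from the disclosure):

```python
from collections import deque

class SegmentCache:
    """Temporary buffer sketch: segments cached earlier have higher
    priority (FIFO). transmit-or-delete always acts on the oldest."""

    def __init__(self):
        self._q = deque()
        self.sent = []      # segments transmitted to the control center

    def store(self, segment):
        self._q.append(segment)

    def handle_oldest(self, emergency_lane_occupied):
        seg = self._q.popleft()
        if emergency_lane_occupied:
            self.sent.append(seg)   # transmit to the ground control center
        return seg                  # otherwise the segment is simply dropped
```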
The specific principle of the invention is as follows:
(I) Inspection by the unmanned aerial vehicle main body 2000
(1) The ground control center 6000 sends an inspection instruction to the unmanned aerial vehicle main body 2000;
(2) after receiving the inspection instruction, the flight control module controls the unmanned aerial vehicle main body 2000 to execute the inspection task;
(3) after the inspection task is finished, the unmanned aerial vehicle returns to the ground charging station 4000 for charging.
(II) The unmanned aerial vehicle main body 2000 receives an inspection instruction from the ground control center 6000 during charging
(1) After the unmanned aerial vehicle main body 2000 finishes a task, it returns to the ground charging station 4000 for charging;
(2) after charging is finished, it enters the inspection preparation mode and waits for an inspection instruction from the ground control center 6000; an inspection task is executed as soon as such an instruction is received;
(3) when charging is not yet finished and the ground control center 6000 sends an inspection instruction, the charging control unit 2203 compares the current electric quantity with the electric quantity required by the inspection task (this electric quantity can be calculated by the ground control center 6000, or by the flight control system from the onboard navigation data); when the current electric quantity exceeds the electric quantity required by the inspection task, the flight control system is allowed to start the unmanned aerial vehicle main body 2000 to execute the task, otherwise charging continues and the comparison is repeated at each set time interval.
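The periodic comparison in (3) can be sketched as follows, assuming the charge grows linearly while charging (the rate and interval parameters are illustrative assumptions, not from the disclosure):

```python
def wait_until_ready(required, charge, rate, interval):
    """Simulate the charging control unit's periodic check: charge
    grows by rate * interval each tick until it exceeds the charge
    required by the inspection task; returns the number of ticks
    (comparisons) before take-off is allowed."""
    ticks = 0
    while charge <= required:       # take-off only when charge exceeds need
        charge += rate * interval   # keep charging for one more interval
        ticks += 1
    return ticks
```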
(III) data transmission between unmanned aerial vehicle main body 2000 and ground control center 6000
(1) During the inspection task, the unmanned aerial vehicle main body 2000 starts its onboard camera to shoot video of the ground;
(2) during shooting, the video segment creation unit 2301 divides the live video into video segments of a set duration;
(3) the created video segments are cached in the temporary caching unit 2302; when several video segments are cached, the caching unit assigns priorities to them, the earlier a segment is stored, the higher its priority;
(4) the picture processing module 2400 preferentially calls the video segment with the highest priority for analysis; when the analysis shows that the emergency lane may be occupied, the video segment is transmitted to the ground control center 6000 for further confirmation, otherwise the video segment is deleted to save storage space.
(IV) Picture processing module pre-processing
(1) The video segment processing unit 2303 extracts several frames of live images from the video segment with the highest priority in the temporary buffer unit 2302;
(2) a semantic segmentation unit 2401 performs semantic segmentation on the live image so as to acquire a background region and a lane region in the live image;
(3) the lane area of the image is binarized to obtain a binarized image, in which the road surface is displayed as the background in black and the vehicles and lane lines are displayed as the foreground in white.
(V) Traffic flow statistics
(1) The lane line removing unit 2404 transversely scans the binarized image, defines a region of N transversely continuous foreground pixel points as a lane line, where N is defined as the width threshold of a lane line, and removes the lane lines from the binarized image to obtain a first processed image; the lane lines are removed because they belong to the foreground, like the vehicles, and would otherwise interfere with vehicle identification;
(2) the vehicle contour extraction unit 2406 scans the pixels of the first processed image in sequence, if the current pixel is a foreground pixel, scans pixels in a neighborhood set range, and if the number of foreground pixels in the neighborhood is greater than or equal to a set threshold, fills all the neighborhoods into foreground pixels so as to obtain an independent vehicle contour, and gives a set gray value alpha to the independent vehicle contour so as to obtain the second processed image, wherein the alpha value is between 0 and 255;
(3) the vehicle contour counting unit 2407 counts the number of independent vehicle contours in the second processed image, thereby obtaining the current traffic flow.
(VI) Emergency lane detection
(1) The emergency lane line contour coordinate extraction unit 2409 transversely scans the binarized image along the outer side of the lane and, if N transversely continuous foreground pixel points are scanned and the distance from these pixel points to the side edge of the binarized image is within the set threshold range, records the coordinates (Xm, Yn) of the first scanned foreground pixel points;
(2) the emergency lane line fitting unit 2410 performs curve fitting on the coordinates (Xm, Yn) of the longitudinally continuous foreground pixel points by the least square method to obtain a quadratic parabolic equation of the emergency lane line, draws the fitted lane line on the second processed image according to the obtained equation, and gives it a set gray value β, where β is between 0 and 255, to obtain a third processed image;
(3) the emergency lane scanning unit 2411 transversely scans the third processed image from the outer side and determines whether the emergency lane is occupied according to the gray value of the first scanned foreground pixel point: if the gray value is α, the emergency lane is occupied; if the gray value is β, the emergency lane is not occupied.
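The gray-value decision of the emergency lane scanning unit 2411 can be sketched as a scan of one image row from the outer side inward (the α and β values below are arbitrary placeholders in the 0–255 range, not values from the disclosure):

```python
ALPHA, BETA = 200, 100  # illustrative gray values: vehicle / lane line

def emergency_lane_occupied(row):
    """Scan one image row from the outer side inward: the first
    non-background pixel decides the result. Hitting the vehicle
    gray value ALPHA before the lane-line gray value BETA means a
    vehicle stands between the road edge and the emergency lane
    line, i.e. the emergency lane is occupied."""
    for px in row:
        if px == ALPHA:
            return True
        if px == BETA:
            return False
    return False
```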
The technical principle of the present invention is described above in connection with specific embodiments. The description is made for the purpose of illustrating the principles of the invention and should not be construed in any way as limiting the scope of the invention. Based on the explanations herein, those skilled in the art will be able to conceive of other embodiments of the present invention without inventive effort, which would fall within the scope of the present invention.