CN109131860B - Plant protection unmanned aerial vehicle based on vision - Google Patents

Plant protection unmanned aerial vehicle based on vision

Info

Publication number
CN109131860B
CN109131860B (application CN201811085167.7A)
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
shell
height
laser
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811085167.7A
Other languages
Chinese (zh)
Other versions
CN109131860A (en)
Inventor
马子领
张瑞珠
吴高峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
North China University of Water Resources and Electric Power
Original Assignee
North China University of Water Resources and Electric Power
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by North China University of Water Resources and Electric Power filed Critical North China University of Water Resources and Electric Power
Priority to CN201811085167.7A
Publication of CN109131860A
Application granted
Publication of CN109131860B
Legal status: Active
Anticipated expiration


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C 27/00 Rotorcraft; Rotors peculiar thereto
    • B64C 27/04 Helicopters
    • B64C 27/08 Helicopters with two or more rotors
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENTS OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D 47/00 Equipment not otherwise provided for
    • B64D 47/08 Arrangements of cameras
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 10/00 Type of UAV
    • B64U 10/10 Rotorcrafts
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 50/00 Propulsion; Power supply
    • B64U 50/10 Propulsion
    • B64U 50/19 Propulsion using electrically powered motors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/10 Terrestrial scenes
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 2101/00 UAVs specially adapted for particular uses or applications
    • B64U 2101/30 UAVs specially adapted for particular uses or applications for imaging, photography or videography

Abstract

The invention discloses a vision-based plant protection unmanned aerial vehicle comprising a shell to which a plurality of cantilevers are uniformly hinged. Front and rear landing gears extend downward from the front and rear sides of the shell, a push rod extends upward from the upper surface of the shell, and a GPS module is mounted at the top of the push rod. A flight controller and a storage battery are arranged inside the shell; the free end of each cantilever carries a motor whose output shaft is connected upward to a propeller. A suspension hangs from the shell between the front and rear landing gears; a laser emitter is mounted at the left end of the suspension and a camera at the right end. The optical axis of the camera lens intersects the straight line along which the laser is emitted, and the included angle θ between the optical axis and the horizontal plane is 50 to 75 degrees. A miniPC is mounted in the middle of the suspension. The invention is simple in structure, guarantees that the light spot formed by the laser emitter always lies on the vertical center line of the photograph taken by the camera, yields images with little distortion, and thereby provides a good basis for image processing.

Description

Plant protection unmanned aerial vehicle based on vision
Technical Field
The invention relates to the field of agricultural plant protection, in particular to a pesticide spraying technology of a plant protection unmanned aerial vehicle.
Background
With the shortage of rural labor, the rapid rise of labor costs, and the rapid development of specialized, organized pest prevention and control work in China, society has an increasingly urgent need for large and medium-sized plant protection machinery and agricultural aviation plant protection machinery that operate efficiently over a wide range of applications, save water and pesticide, protect the environment, and require little labor. Plant protection unmanned aerial vehicles offer good maneuverability, high operating efficiency, a small amount of liquid pesticide applied per unit area, no need for a dedicated take-off and landing airfield, and remote-controlled operation, which reduces the probability of operator pesticide injury and improves the effective utilization rate of pesticides. They can adapt to the plant protection requirements of crops in every growth period and accord with the development trend of agricultural machinery.
Applying unmanned aerial vehicle technology to farmland spraying raises technical difficulties specific to plant protection unmanned aerial vehicles. One of the key problems unsolved in the prior art is the lack of an agricultural unmanned aerial vehicle autonomous flight control system with high stability and high reliability. A plant protection unmanned aerial vehicle must work at 2 to 5 m, or even lower, above the crop canopy while maintaining a speed of 1 to 6 m/s, and must remain stable during this low-altitude, low-speed operation. However, heights of 2 to 5 m and below lie in the inefficient operating region of a helicopter, where stability is not easily achieved.
Accurate and stable height measurement is one of the basic requirements for stable unmanned aerial vehicle flight. Current flight control systems measure height with GPS, with a radio altimeter, or with a pressure sensor. GPS data are unstable and strongly disturbed by external factors such as surrounding building heights and terrain; the received satellite data change frequently and the measured height jumps severely, so a large amount of data processing is needed to reduce the error, yet the stability remains unsatisfactory while the cost and volume are high, making GPS unsuitable for fixed-altitude flight of a small agricultural unmanned aerial vehicle. A radio altimeter meets the accuracy requirement but is relatively complex to use, requires antennas on both the unmanned aerial vehicle and the ground station, and is relatively expensive, so it too is unsuitable for a small agricultural unmanned aerial vehicle. A pressure sensor overcomes many defects of GPS, being better in precision, cost, and volume and easy to use, but it is easily affected by weather changes; for a micro-miniature agricultural unmanned aerial vehicle in particular, the complex airflow produced by the farmland crop canopy during low-altitude flight strongly disturbs the barometric altimeter, making flight stability difficult to guarantee.
To achieve highly stable and reliable autonomous flight control of a micro-miniature agricultural unmanned aerial vehicle operating in complex farmland environments (different altitudes, geographic locations, and crop types), and to meet such special requirements as beyond-visual-range field operation, ultra-low-altitude flight, and take-off and landing at any time, the invention provides a vision-based adaptive fixed-altitude flight control technology.
Disclosure of Invention
The invention aims to provide a vision-based plant protection unmanned aerial vehicle which ensures that the light spot formed by the laser in the photograph taken by the camera lies on the vertical center line of the image, providing a basis for quickly obtaining the real-time height of the unmanned aerial vehicle from the photographic image.
To achieve this aim, the vision-based plant protection unmanned aerial vehicle comprises a horizontally arranged disc-shaped shell; a plurality of cantilevers are evenly hinged to the shell around its circumference, the other end of each cantilever being a free end. Each cantilever extends along the radial direction of the shell, and the free end of each cantilever is higher than its hinged end;
the front and rear sides of the shell are respectively provided with a front landing gear and a rear landing gear downwards, the upper surface of the shell is connected with a push rod upwards, and the top end of the push rod is provided with a GPS module; the flying control device comprises a shell, a flying controller and a storage battery, wherein the flying controller and the storage battery are arranged in the shell, the storage battery is connected with the flying controller and supplies power for the flying controller, the free end of each cantilever is provided with a motor, and the output shaft of each motor is upwards connected with a propeller;
the lower surface of the shell between the front landing gear and the rear landing gear is downwards connected with a horizontally arranged suspension, the left end of the suspension is provided with a laser emitter, and the laser emitted by the laser emitter is perpendicular to the suspension and faces downwards; the right end of the suspension is provided with a camera which is obliquely arranged leftwards and downwards; the diameter R of the laser beam emitted by the laser emitter is 10 to 15 mm;
the preset flying height of the unmanned aerial vehicle is H1 to H2 meters, H2 is more than H1, an intersection point is formed between a lens optical axis of the camera and a straight line where laser emitted by the laser emitter is located, the intersection point is lower than the laser emitters H1 to (H1+H2)/2 meters, an included angle theta between the lens optical axis of the camera and the horizontal plane is 50 to 75 degrees, and the unmanned aerial vehicle is provided with a remote controller in wireless communication with a flying controller; the middle part of the suspension is provided with a miniPC which is connected with the flight controller, the camera and the laser transmitter.
The push rod is hinged to the upper surface of the shell, in which a through hole is formed; the connecting line of the GPS module extends down along the push rod, passes through the through hole into the shell, and is connected to the flight controller. The connecting lines of the laser emitter and the camera are likewise connected to the flight controller.
The front landing gear and the rear landing gear are symmetrically arranged; each comprises a diagonal bracing rod and a horizontal bracing rod, the lower end of the diagonal bracing rod being fixedly connected to the middle of the horizontal bracing rod and its upper end to the shell.
The intersection point of the camera lens optical axis and the straight line along which the laser emitted by the laser emitter travels is (H1+H2)/2 meters below the laser emitter.
The invention has the following advantages:
The vision-based plant protection unmanned aerial vehicle is simple in structure. The included angle θ between the optical axis of the camera lens and the horizontal plane is 50 to 75 degrees, and the intersection point is 1 to 1.5 meters below the laser emitter, which lets the camera photograph the vicinity of the intersection point with little image distortion and so facilitates subsequent image recognition. If θ were too large or too small, the measured height would occupy too few pixels in the image, degrading the measurement accuracy.
Because the optical axis of the camera lens intersects the straight line along which the laser emitted by the laser emitter travels, the light spot formed by the laser emitter necessarily lies on the vertical center line of the photograph taken by the camera.
The miniPC is relatively heavy; mounting it in the middle of the suspension keeps the overall center of gravity of the unmanned aerial vehicle as close as possible to its geometric center. The front and rear landing gears are simple in structure, light, and convenient to manufacture, install, and use.
The intersection point of the camera lens optical axis and the straight line along which the laser travels is (H1+H2)/2 meters below the laser emitter. With this arrangement, when the flight height of the unmanned aerial vehicle is exactly in the middle of its range, the light spot in the photograph taken by the camera lies at the center of the image.
The user only needs to control the normal take-off of the vision-based plant protection unmanned aerial vehicle; the vehicle then automatically keeps its flying height within the preset range (H1 to H2 meters, inclusive of both endpoints). The invention is therefore very convenient to use and provides a basis for conveniently and stably spraying crops at a fixed height.
The ultra-low-altitude adaptive fixed-altitude flight control method obtains the real-time height of the unmanned aerial vehicle by directly querying the height data table. Compared with computing the real-time height anew each time, this greatly reduces the amount of computation, increases the processing speed (the unmanned aerial vehicle demands high real-time performance, so algorithm speed is an important condition), and reduces power consumption (the larger the computational load, the more power the CPU of the miniPC draws).
The height data table is convenient to build: one array pairing an image coordinate with an actual flying height is recorded every 5 cm. This strikes a balance between the two goals of reducing the table-building workload and table size on the one hand and obtaining the real-time height of the unmanned aerial vehicle accurately on the other, so the table stays small while the real-time height can still be obtained accurately.
The invention is intended for ultra-low-altitude flight. The first step, the table-building step, is complete once the accumulated ascent of the unmanned aerial vehicle reaches 5 meters, because the invention keeps the flying height well below 5 meters; the case of an excessively high flying height therefore need not be considered.
Acquiring a region-of-interest image from the photograph concentrates the computation on the effective middle part of the image, reducing the computational load, increasing the computation speed, and yielding the real-time flying height of the unmanned aerial vehicle more quickly.
Because the image has been median filtered, binarized, and morphologically filtered, essentially only the spot remains: pixel values outside the spot are essentially 0 and pixel values inside it are essentially 255. Median filtering, binarization, and morphological filtering are conventional image-processing techniques and are not described in detail here. Consequently, while the search frame has not yet met the spot, SUMi is essentially zero; the moment it meets the spot, SUMi increases many times over, at least by more than a factor of 3, so the first judgment step can decide whether the search frame has met the spot. Once the spot is encountered, a refined search moves the search frame one pixel at a time, so the spot coordinates obtained are very accurate. The average spot diameter is L2 pixels; before the search frame meets the spot, one third of L2 pixels is used as the step length, which speeds up the search while ensuring the frame cannot skip over the spot.
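Purely as an illustration (not the patent's code), the following Python sketch shows why this works: after median filtering and binarization, isolated noise disappears while the spot survives as a run of 255-valued pixels, so a windowed pixel sum jumps sharply where the spot begins. The window size, threshold, and toy pixel row are all assumptions.

```python
# Illustrative sketch: after median filtering and binarization, pixels
# outside the laser spot are ~0 and pixels inside are ~255, so a
# windowed pixel sum jumps sharply when a window first overlaps the spot.

def median_filter_1d(row, k=3):
    """Median-filter a 1-D pixel row with odd window size k."""
    h = k // 2
    padded = [row[0]] * h + list(row) + [row[-1]] * h
    return [sorted(padded[i:i + k])[h] for i in range(len(row))]

def binarize(row, threshold=128):
    """Map each pixel to 0 or 255 by thresholding."""
    return [255 if p >= threshold else 0 for p in row]

# Toy row: dark background with one noise pixel and a bright 4-pixel spot.
row = [0, 0, 200, 0, 0, 0, 240, 250, 255, 245, 0, 0]
clean = binarize(median_filter_1d(row))

# The isolated noise pixel (value 200) is removed; the spot survives.
print(clean)
```

The same effect in two dimensions is what makes the factor-of-3 jump test in the first judgment step reliable.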
The laser spot coordinate positioning algorithm function processes the region-of-interest image with a very concise procedure and obtains the image coordinates of the spot quickly and accurately.
Drawings
FIG. 1 is a general flow diagram of an ultra-low altitude adaptive fixed-altitude flight control method according to the present invention;
FIG. 2 is a flow chart of processing an image of a region of interest by a laser spot coordinate positioning algorithm function;
FIG. 3 is a schematic flow chart of the control height step;
fig. 4 is a schematic diagram of a conversion principle of converting the unmanned aerial vehicle height value y measured on the image coordinate system into an actual height value;
FIG. 5 is a schematic view of a vision-based plant protection drone according to the present invention;
fig. 6 is a left side view of fig. 5.
Detailed Description
As shown in figs. 1 to 6, the vision-based plant protection unmanned aerial vehicle of the present invention comprises a horizontally arranged disc-shaped housing 1; a plurality of cantilevers 2 are uniformly hinged to the housing 1 around its circumference, the other end of each cantilever 2 being a free end. Each cantilever 2 extends along the radial direction of the housing 1, and the free end of each cantilever 2 is higher than its hinged end;
a front landing gear 3 and a rear landing gear 4 extend downward from the front and rear sides of the housing 1 respectively; a push rod 5 extends upward from the upper surface of the housing 1, and a GPS module 6 is mounted at its top. A flight controller, and a storage battery connected to the flight controller that supplies it with power, are arranged inside the housing 1; the free end of each cantilever 2 carries a motor 7 whose output shaft is connected upward to a propeller. Mounting the GPS module 6 at the top of the push rod 5 minimizes interference from other components, allows better satellite communication, and gives a more accurate real-time position of the unmanned aerial vehicle. The flight controller, battery, and propellers are existing devices and are not shown. The flight controller is preferably the A3 model product of SZ DJI Technology Co., Ltd. The miniPC is preferably an Intel NUC7i3BNH product manufactured by Intel Corporation.
The lower surface of the housing 1 between the front landing gear 3 and the rear landing gear 4 is connected downward to a horizontally arranged suspension 8. A laser emitter 9 is mounted at the left end of the suspension 8, its emission direction perpendicular to the suspension 8 and facing downward; a camera 10, inclined downward and to the left, is mounted at the right end of the suspension 8. The diameter R of the laser beam emitted by the laser emitter 9 is 10 to 15 mm (all numerical ranges in the present invention include both endpoints);
the preset flying height of the unmanned aerial vehicle is H1 to H2 meters, H2 is more than H1, an intersection point 13 is formed between a lens optical axis 11 of the camera 10 and a straight line 12 where laser emitted by the laser emitter 9 is located, the distance between the intersection point 13 and the laser emitter 9 is H1 to (H1 + H2)/2 meters, an included angle theta between the lens optical axis 11 of the camera 10 and the horizontal plane is 50 to 75 degrees, and the unmanned aerial vehicle is provided with a remote controller in wireless communication with a flying controller; the preferred value of H1 is 1 meter and the preferred value of H2 is 2 meters.
The included angle θ between the lens optical axis 11 of the camera 10 and the horizontal plane is 50 to 75 degrees, and the preferred distance between the intersection point and the laser emitter 9 is 1 to 1.5 meters. If θ were too large or too small, the measured height would occupy too few pixels in the image, degrading the measurement accuracy.
Because the lens optical axis 11 of the camera 10 intersects the straight line along which the laser from the laser emitter 9 travels, the light spot formed by the laser emitter 9 necessarily lies on the vertical center line of the photograph taken by the camera 10.
A miniPC connected to the flight controller, the camera 10, and the laser emitter 9 is mounted in the middle of the suspension 8; the miniPC is preferably an Intel NUC7i3BNH product manufactured by Intel Corporation. The miniPC is relatively heavy, and mounting it in the middle of the suspension 8 keeps the overall center of gravity of the unmanned aerial vehicle as close as possible to its geometric center. The laser emitter 9 may be connected to and controlled by the miniPC, or connected directly to the storage battery so that it emits laser light downward whenever the unmanned aerial vehicle is powered on.
The push rod 5 is hinged to the upper surface of the housing 1, in which a through hole is formed; the connecting line of the GPS module 6 extends down along the push rod 5, passes through the through hole into the housing 1, and is connected to the flight controller. The connecting lines of the laser emitter 9 and the camera 10 are likewise connected to the flight controller. Because the push rod 5 is hinged to the upper surface of the housing 1, it can be folded down for transport to save space and erected again for use. The friction between the push rod 5 and the housing 1 is large enough that the rod cannot fall by itself without external force.
The front landing gear 3 and the rear landing gear 4 are symmetrically arranged; each comprises a diagonal bracing rod 14 and a horizontal bracing rod 15, the lower end of the diagonal bracing rod 14 being fixedly connected to the middle of the horizontal bracing rod 15 and its upper end to the housing 1. The horizontal bracing rods 15 support the entire unmanned aerial vehicle on a fixed surface such as the ground.
The intersection point 13 of the lens optical axis 11 of the camera 10 and the straight line along which the laser from the laser emitter 9 travels is (H1+H2)/2 meters below the laser emitter. With this arrangement, when the flight height of the unmanned aerial vehicle is exactly moderate, the light spot in the photograph taken by the camera 10 lies at the center of the image.
The invention also discloses an ultra-low-altitude adaptive fixed-altitude flight control method for the vision-based plant protection unmanned aerial vehicle. It includes a table-building step performed before the unmanned aerial vehicle leaves the factory: a height data table is built in advance that puts the y value of the spot image coordinate in one-to-one correspondence with the actual flying height of the unmanned aerial vehicle. The spot is the area of higher brightness formed where the laser strikes the crop canopy or the ground. The height data table holds a plurality of groups of spot image coordinates; the parameter y in these groups takes m values, m being a natural number.
the ultra-low altitude self-adaptive fixed-altitude flight control method comprises the following steps:
the first step is the take-off step: the flight controller receives the instruction from the remote controller and controls each motor 7 to drive its propeller so that the vehicle takes off; a ground operator then uses the remote controller, judging by eye, to bring the flying height of the unmanned aerial vehicle to H1 to H2 meters (for example 1 to 2 meters), after which visual estimation of the flying height stops;
the second step is continuous height judgment: an operation judging the flight height of the unmanned aerial vehicle is performed every 0.05 seconds;
the third step is the height control step: the flying height of the unmanned aerial vehicle is controlled to remain between H1 and H2 meters.
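The three steps can be sketched as a simple 20 Hz supervisory loop. This is a hypothetical outline only: measure_height and command_climb stand in for the miniPC table lookup and the flight-controller interface, which the patent does not specify at this level of detail.

```python
import time

# Hypothetical sketch of the three-step flow: after take-off, judge the
# height every 0.05 s and correct it back into the [H1, H2] band.
# measure_height() and command_climb() are stand-ins, not real APIs.

H1, H2 = 1.0, 2.0          # preset flight-height band, metres
PERIOD = 0.05              # judging interval, seconds (20 Hz)

def control_step(height_m):
    """Return the climb command for one judging cycle:
    +1 = climb, -1 = descend, 0 = hold."""
    if height_m < H1:
        return +1
    if height_m > H2:
        return -1
    return 0

def control_loop(measure_height, command_climb, cycles=100):
    """Run the judging/correcting cycle at the 0.05 s interval."""
    for _ in range(cycles):
        command_climb(control_step(measure_height()))
        time.sleep(PERIOD)
```

The bang-bang correction shown is the simplest policy consistent with "controlled to be H1 to H2 meters"; a real controller would smooth the commands.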
The specific operation of the table building step is as follows:
the photograph taken by the camera 10 is x0 pixels wide and y0 pixels high, x0 and y0 being integers. Taking the upper-left corner of the photograph as the origin of image coordinates, the horizontal line through the origin, positive to the right, as the X axis, and the vertical line through the origin, positive downward, as the Y axis, establishes the coordinate system of the photographic image;
because the lens optical axis 11 of the camera 10 intersects the straight line along which the laser from the laser emitter 9 travels, whenever a spot is formed on the photographic image it necessarily lies on the vertical line x = x0/2 in image coordinates;
the image coordinates of the spot formed by the laser light emitted from the laser emitter 9 are the image coordinates at the center of the spot area (i.e., the image coordinates of the spot center point);
when the height of the unmanned aerial vehicle is zero, a photograph is taken; the y value of the spot's image coordinate then corresponds to a height of 0 m, forming one array pairing an image coordinate with an actual flying height, which is stored in the height data table in the miniPC. The sequence number of the parameter y of the spot image coordinate in this array is 1;
the unmanned aerial vehicle is then flown vertically upward, hovering once every 5 cm of ascent; each time, 1 is added to the sequence number of the parameter y of the spot image coordinate and one operation of acquiring a corresponding array is performed, until the accumulated ascent of the unmanned aerial vehicle reaches 5 meters;
in the operation of acquiring a corresponding array, the camera 10 takes a photograph, and the y value of the spot's image coordinate in that photograph (in pixels) together with the actual height of the unmanned aerial vehicle at the moment of shooting (in meters) form one array pairing an image coordinate with an actual flying height, which is stored in the miniPC;
the camera parameters used in this method are the same as those used in the derivation formula (see the description of the conversion process shown in fig. 4 below).
There are two methods for obtaining a corresponding array. The first computes the image coordinate y value for a given actual height according to the formula (see the description of the conversion process shown in fig. 4 below) and records the pair. The second takes a photograph at a given actual height, obtains the y value by image processing, and records the pair.
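The conversion formula itself accompanies fig. 4 and is not reproduced in this passage. Purely as an assumed stand-in, a generic pinhole model can show how method one might precompute one (y, height) pair every 5 cm: the baseline b, tilt θ, focal length f_px, and image height y0 below are illustrative values, not the patent's parameters.

```python
import math

# ASSUMED pinhole model, not the patent's formula: the ray from the
# camera to the spot makes angle phi = atan(height/b) below horizontal;
# its offset from the optical axis (tilted theta below horizontal) maps
# to the image as y = y0/2 + f_px * tan(phi - theta).
# (With these toy numbers, very low heights map off-frame; real camera
# parameters keep the working band [H1, H2] in view.)

def spot_y(height_m, b=0.6, theta_deg=60.0, f_px=800.0, y0=720):
    """Image row of the laser spot for a given height below the emitter."""
    phi = math.atan2(height_m, b)
    return y0 / 2 + f_px * math.tan(phi - math.radians(theta_deg))

# One (y, height) pair every 5 cm from 0 to 5 m, as in the table step.
height_table = [(spot_y(k * 0.05), k * 0.05) for k in range(101)]
```

With this model, the height equal to the intersection depth b·tan θ maps exactly to the image center y0/2, matching the statement that a moderate flight height places the spot at the center of the image.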
Once the accumulated ascent of the unmanned aerial vehicle reaches 5 meters, the first step is complete; that is, the height data table putting the spot image coordinates in one-to-one correspondence with the actual flight heights of the unmanned aerial vehicle has been established. In the last operation of acquiring a corresponding array, the sequence number of the parameter y of the spot image coordinate is m.
The operation of judging the flight height of the unmanned aerial vehicle in the second step is specifically as follows:
the camera 10 takes one photograph and transmits it to the miniPC; the miniPC extracts a region-of-interest image from the photograph; the coordinate system of the region-of-interest image follows that of the photograph;
the vertical center line of the region of interest coincides with the vertical center line of the photographic image; the region of interest is L1 pixels wide and, like the photograph, y0 pixels high. L1 is 1.5 ± 0.3 times the diameter R of the laser beam emitted by the laser emitter 9, rounded to an integer. This prevents the spot from falling outside the region of interest because of factors such as too narrow a width or camera-body shake at the moment of shooting, while not enlarging the region so much that the pixel-sum computation grows excessively.
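A minimal sketch of the crop, assuming the image is held as a list of pixel rows: the region of interest is the vertical strip of width L1 centred on x = x0/2, at full image height.

```python
# Minimal region-of-interest crop: keep a vertical strip of width l1
# centred on the image's vertical centre line, full height y0.

def crop_roi(image, l1):
    """Return the centred strip of width l1 from each pixel row."""
    x0 = len(image[0])                 # image width in pixels
    left = x0 // 2 - l1 // 2           # strip starts l1/2 left of centre
    return [row[left:left + l1] for row in image]

# 6x8 toy image; crop a strip of width 4 around column x0/2 = 4.
img = [list(range(8)) for _ in range(6)]
roi = crop_roi(img, 4)
```

Because the spot always lies on x = x0/2, this strip is guaranteed to contain it as long as L1 comfortably exceeds the spot diameter.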
The miniPC calls the laser spot coordinate positioning algorithm function to process the region-of-interest image and obtains the y value of the current spot image coordinate, denoted yd. It then compares yd in turn with the y value of each entry in the height data table and takes, as the current height of the unmanned aerial vehicle, the height value paired with the y whose absolute difference from yd is smallest. This completes one flight-height judgment operation.
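The query amounts to a nearest-neighbour search over the stored y values. A minimal sketch, assuming the table is stored as (y, height) pairs:

```python
# Height query sketch: given the current spot coordinate yd, pick the
# table entry whose y value minimises |yd - y| and return its height.
# The (y, height) pair layout is an assumption about the table format.

def lookup_height(height_table, yd):
    y, height = min(height_table, key=lambda pair: abs(pair[0] - yd))
    return height

# Toy table: y grows as the drone climbs in this illustrative data.
table = [(100, 0.0), (180, 0.5), (260, 1.0), (340, 1.5), (420, 2.0)]
print(lookup_height(table, 265))   # nearest y is 260
```

A linear scan over the roughly one hundred entries recorded every 5 cm is already cheap, which is the source of the method's speed advantage over recomputing the height each cycle.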
The specific operation of the miniPC for calling the laser spot coordinate positioning algorithm function to process the image of the region of interest is as follows:
the first substep is to establish a rectangular search frame at the top of the region-of-interest image; the search frame is as wide as the region-of-interest image; the diameters of light spots at 3 to 10 different positions are measured on photo images, their average diameter is calculated as L2 pixels, and the height of the search frame is set to L2 pixels; when the rectangular search frame is established, a variable i denoting the position of the search frame is also created, with initial value 1; the vertical center line of the search frame coincides with the vertical center line of the region-of-interest image; the image coordinates of the center point of the search frame are taken as the image coordinates of the search frame;
the second substep is to calculate the sum of the pixel values of all pixel points in the search frame, obtaining the pixel-value sum SUMi for the search frame at the ith position, and to record the image coordinate value of the search frame at the ith position; pixel values normally range from 0 to 255, where 0 represents black and 255 represents white, and a larger pixel value means a brighter pixel; the pixel values at the light spot are necessarily higher than those in other areas of the image.
The third substep is: let N be the integer part of L2/3, N being a positive integer; move the search frame down by N pixels and add 1 to i;
the fourth substep is to calculate the sum of the pixel values of all pixel points in the search frame, obtaining the pixel-value sum SUMi for the search frame at the ith position, and to record the image coordinate value of the search frame at the ith position;
the fifth substep is a first judgment step of judging whether sumi/sum (i-1) is greater than 3; if not, jumping to re-execute the third sub-step; if yes, a sixth sub-step is performed; since the image is median filtered, binarized and morphologically filtered, substantially only the flare image remains on the image, the pixel values of the pixels of the image area outside the flare are substantially zero, and the pixel values of the pixels of the image area within the flare are substantially 255. The median filtering, binarization and morphological filtering are all conventional picture processing techniques, and are not described in detail. Therefore, when the search box does not meet the light spot, sumi is basically zero, and when the search box just meets the light spot, sumi can be increased by a plurality of times, at least more than 3 times, so that whether the search box meets the light spot can be judged by the first judging step. If the light spot is encountered, a refined search is performed, and the search frame is moved by taking one pixel as a step length, so that the obtained light spot coordinates are very accurate. The average diameter of the light spot is L2 pixels, and before the search frame meets the light spot, one third of L2 pixels are used as the step length of the moving search frame, so that the search speed is accelerated while the search frame is ensured not to miss the light spot.
The sixth substep is to move the search frame down by 1 pixel, add 1 to the i value, and then perform the seventh substep;
a seventh substep is to calculate a SUM of pixel values of all pixel points in the search frame to obtain a SUM of pixel values SUMi when the search frame is at the i-th position, and record an image coordinate value of the search frame when the search frame is at the i-th position;
the eighth substep is a second judgment step: judge whether SUMi is smaller than SUM(i-1); if not, jump back and re-execute the sixth substep; if yes, take the image coordinate value of the search frame at position (i-1) as the image coordinate value of the light spot, i.e. the return value of the laser spot coordinate positioning algorithm function.
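The eight substeps can be sketched as follows. This is an illustrative reading of the algorithm, not the patent's code: it assumes the ROI has already been preprocessed to a 0/255 image, handles the zero-denominator case explicitly (the patent text leaves it implicit, since SUM(i-1) is "basically zero" before the spot is met), and returns only the y coordinate of the frame centre, since only y is used downstream.

```python
import numpy as np

def locate_spot_y(roi, spot_diameter_px):
    """Coarse-then-fine scan of a binarized ROI (spot pixels = 255,
    background = 0).  Returns the y image coordinate of the frame
    centre at the spot, or None if no spot is found."""
    height = roi.shape[0]
    box_h = spot_diameter_px                 # search-frame height L2
    coarse = max(1, spot_diameter_px // 3)   # N = integer part of L2/3

    def box_sum(top):
        return int(roi[top:top + box_h].sum())

    # Coarse search (substeps 2-5): step down N pixels at a time until
    # the pixel sum jumps, meaning the frame has met the spot.
    top = 0
    prev = box_sum(top)
    found = False
    while not found:
        nxt = top + coarse
        if nxt + box_h > height:
            return None                      # no spot in the ROI
        cur = box_sum(nxt)
        if (prev == 0 and cur > 0) or (prev > 0 and cur > 3 * prev):
            found = True
        top, prev = nxt, cur

    # Fine search (substeps 6-8): 1-pixel steps; stop when the sum
    # starts decreasing, i.e. the frame has passed the spot centre.
    while True:
        nxt = top + 1
        if nxt + box_h > height:
            break
        cur = box_sum(nxt)
        if cur < prev:
            break
        top, prev = nxt, cur
    return top + box_h // 2                  # centre y of the frame

roi = np.zeros((120, 30), dtype=np.uint8)
roi[60:72, 9:21] = 255                       # synthetic 12-px spot
print(locate_spot_y(roi, spot_diameter_px=12))  # 66
```

On this synthetic spot (rows 60-71, centre 65.5) the fine search settles on the frame whose centre row is 66, i.e. within one pixel of the true centre, which matches the accuracy claim above.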
Before the miniPC calls the laser spot coordinate positioning algorithm function to process the region-of-interest image, it preprocesses that image;
the preprocessing operation is specifically: median filtering, binarization and morphological filtering of the region-of-interest image.
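A toy sketch of the first two preprocessing stages (3×3 median filter and fixed-threshold binarization), assuming NumPy arrays; edge pixels are skipped for brevity, and real deployments would call a library such as OpenCV rather than loop in Python:

```python
import numpy as np

def median3(img):
    """3x3 median filter; border pixels are left unchanged for brevity."""
    out = img.copy()
    for y in range(1, img.shape[0] - 1):
        for x in range(1, img.shape[1] - 1):
            out[y, x] = np.median(img[y-1:y+2, x-1:x+2])
    return out

def binarize(img, thresh=128):
    """Fixed-threshold binarization: >= thresh -> 255, else 0."""
    return np.where(img >= thresh, 255, 0).astype(np.uint8)

roi = np.zeros((12, 12), dtype=np.uint8)
roi[2, 2] = 255          # isolated (salt) noise pixel
roi[5:9, 5:9] = 200      # 4x4 bright spot
clean = binarize(median3(roi))
print(clean[2, 2], clean[6, 6])  # 0 255
```

The isolated noise pixel is removed by the median filter (its neighbourhood median is 0), while the compact spot survives and is mapped to 255 by the fixed threshold, which is exactly the property the search-frame sums rely on.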
The specific operation of the third step is as follows:
when the flying height of the unmanned aerial vehicle is lower than H1 meters, the miniPC sends a command to the flight controller, and the flight controller increases the rotating speed of the propellers so that the unmanned aerial vehicle climbs;
when the flying height is greater than or equal to H1 meters and less than or equal to H2 meters, the propeller speed is kept unchanged;
when the flying height is greater than H2 meters, the miniPC sends a command to the flight controller, and the flight controller reduces the rotating speed of the propellers so that the unmanned aerial vehicle descends.
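The three cases of the third step amount to a dead-band controller. A minimal sketch (command names are illustrative; the actual commands pass between the miniPC and the flight controller):

```python
def altitude_command(height_m, h1=1.0, h2=2.0):
    """Three-band height controller: climb below H1, hold inside
    [H1, H2], descend above H2.  H1=1 and H2=2 are the example bounds
    used elsewhere in the description."""
    if height_m < h1:
        return "climb"      # flight controller raises propeller speed
    if height_m > h2:
        return "descend"    # flight controller lowers propeller speed
    return "hold"           # keep the current propeller speed

print(altitude_command(0.8), altitude_command(1.5), altitude_command(2.3))
```

The hold band between H1 and H2 prevents the propeller speed from being adjusted on every frame, which keeps the flight steady over mildly uneven canopy.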
The invention obtains the real-time height of the unmanned aerial vehicle by direct lookup in the height data table. Compared with computing the height afresh each time, this greatly reduces the amount of computation, increases calculation speed and lowers power consumption (the larger the computational load, the more power the miniPC's CPU draws).
Of course, besides querying the height data table, the real-time height of the unmanned aerial vehicle may also be obtained by the following calculation method:
firstly, the miniPC calls a laser spot coordinate positioning algorithm function to process the image of the region of interest, and a y value of the current spot image coordinate is obtained.
In the height-measurement process, for convenience of calculation a reference point is first set on the image; it corresponds to an actual standard height, for example 1.5 m (the preset flying height of the unmanned aerial vehicle being 1 to 2 meters), and is placed at the center of the image. Since the image center is where the lens optical axis 11 of the camera 10 images, the angle θ between the lens optical axis 11 and the horizontal plane is thereby also determined. That is, the image center point (x0/2, y0/2), where x0 and y0 are respectively the width and height of each frame of image read from the camera 10, corresponds to an actual height of 1.5 m; since only the value in the y direction is considered, one can also say that y0/2 corresponds to 1.5 m. The positions y1 and y2 of the light spot on the image corresponding to heights of 1 m and 2 m are calculated in advance from optics. Because the image coordinate system takes the y axis as pointing downward, the reference zero point, y1, y0/2 and y2 are arranged from small to large along the y axis. The reference zero point is the coordinate position at which the laser emitter 9 itself would image in the image coordinate system.
The height-adjustment strategy of the unmanned aerial vehicle is: when the unmanned aerial vehicle in flight encounters raised ground or a higher crop canopy, the distance between the laser light spot and the unmanned aerial vehicle shortens; once it is less than 1 m, the unmanned aerial vehicle must climb so that its distance to the ground or crop canopy always stays within 1 to 2 m. When it encounters sunken ground or a lower crop canopy, the distance between the laser light spot and the unmanned aerial vehicle lengthens; once it exceeds 2 m, the unmanned aerial vehicle must descend so that the distance again stays within 1 to 2 m.
To implement this strategy, when the miniPC calls the laser spot coordinate positioning algorithm function to process the region-of-interest image and obtains the y value of the current spot image coordinate, y is compared with y1 and y2. When y < y1, the height of the unmanned aerial vehicle is below 1 m, and a climb command is sent so that the height is adjusted above 1 m; when y > y2, the height exceeds 2 m, and a descend command is sent so that the height is adjusted below 2 m; when y1 < y < y2, the unmanned aerial vehicle is between 1 and 2 m and no adjustment is made. While the unmanned aerial vehicle is climbing or descending back into the 1-2 m range, once |y - y0/2| < ε (ε being an error value close to 0), the height has reached the 1.5 m position and a command to stop climbing or descending is sent, so that the unmanned aerial vehicle flies as steadily as possible at a height of 1.5 m.
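The image-coordinate version of the strategy, including the ε stopping condition, can be sketched as follows. The numeric values (y1 = 180, y2 = 300, centre 240, ε = 2 pixels) are assumed calibration numbers for illustration, not values from the patent:

```python
def height_action(y, y1, y2, y_center, adjusting, eps=2):
    """Map the spot's y image coordinate to a command.  y increases
    downward; y1 and y2 are the precomputed spot positions for 1 m and
    2 m, and y_center (= y0/2) corresponds to the 1.5 m standard
    height.  While a climb/descend is in progress ('adjusting'), stop
    once the spot is within eps pixels of the centre."""
    if adjusting and abs(y - y_center) < eps:
        return "stop"        # settled at ~1.5 m
    if y < y1:
        return "climb"       # height below 1 m
    if y > y2:
        return "descend"     # height above 2 m
    return "hold"            # within the 1-2 m band

# hypothetical calibration: y1=180, y2=300, centre 240 (480-px image)
print(height_action(150, 180, 300, 240, adjusting=False))  # climb
print(height_action(241, 180, 300, 240, adjusting=True))   # stop
```

Note that the stop test is only applied while an adjustment is in progress, so normal flight inside the 1-2 m band does not oscillate around the 1.5 m point.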
To reflect the actual height of the unmanned aerial vehicle in real time, the height value y measured in the image coordinate system must be converted into an actual height; the conversion process is shown in FIG. 4, in which the letters and symbols have the following meanings:
h: actual height of the unmanned aerial vehicle in the body coordinate system, unit: meters;
y: coordinate value of the laser spot in the image coordinate system;
θ: the included angle between the lens optical axis 11 of the camera and the horizontal plane;
f: the focal length of the camera 10;
d: the horizontal distance of the camera 10 to the laser transmitter 9, this data being a structural design value.
The values y1 and y2 on the image coordinates corresponding to the actual heights of 1 m and 2 m can be obtained directly by solving the formula given in FIG. 4.
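The formula itself appears only as a drawing (FIG. 4) in the original and did not survive extraction. From the variable definitions above (h, y, θ, f, d) and ordinary pinhole-camera triangulation, a plausible reconstruction is the following; this is an assumption consistent with the described geometry, not the patent's verbatim formula:

```latex
% The line of sight to the spot lies at angle \arctan(h/d) below the
% horizontal; its angular offset from the optical axis (angle \theta)
% maps through the focal length f (in pixel units) to a displacement
% from the image centre y_0/2:
y - \frac{y_0}{2} = f \tan\!\left(\arctan\frac{h}{d} - \theta\right),
\qquad \text{equivalently} \qquad
h = d \tan\!\left(\theta + \arctan\frac{y - y_0/2}{f}\right).
```

Substituting h = 1 and h = 2 yields y1 and y2, and h = 1.5 returns y = y0/2 when θ = arctan(1.5/d), consistent with the reference point described above and with the ordering y1 < y0/2 < y2 along the downward y axis.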
The region-of-interest image is preprocessed as follows. Median filtering is a nonlinear smoothing technique that sets the gray value of each pixel to the median of the gray values of all pixels within a neighborhood window around that point. It is a nonlinear signal-processing technique, based on order statistics, that effectively suppresses noise: replacing the value of a point with the median of the values in its neighborhood brings surrounding pixel values close to the true value and eliminates isolated noise points. In practice, a two-dimensional sliding template of a certain structure is used; the pixels inside the template are sorted by pixel value, producing a monotonically ascending (or descending) two-dimensional data sequence. The two-dimensional median filter output is g(x, y) = med{f(x - k, y - l), (k, l) ∈ W}, where f(x, y) and g(x, y) are the original and processed images respectively, and W is a two-dimensional template, usually a 3×3 or 5×5 region, although other shapes such as lines, circles, crosses and rings are possible. Median filtering is a mature existing technique; it is invoked directly, with the image to be processed and the template parameters as inputs.

Binarization of an image sets the gray value of each pixel to 0 or 255, so that the whole image presents an unmistakable black-and-white appearance. Binarization is one of the simplest methods of image segmentation and converts a gray-scale image into a binary image: pixels with gray level above a chosen critical value are set to the maximum gray level, and pixels below it to the minimum gray level.
Depending on how the threshold is selected, binarization algorithms are divided into fixed-threshold and adaptive-threshold methods. The invention adopts a fixed threshold.
Morphological filtering is an image-processing method for binary images developed from the set-theoretic methods of mathematical morphology. In general, morphological image processing takes the form of a neighborhood operation with a specially defined neighborhood called a "structuring element": at each pixel position, a specific logical operation is performed between the structuring element and the corresponding region of the binary image, and the result of that operation becomes the corresponding pixel of the output image. In brief, morphological operations are a family of shape-based image-processing operations that produce an output image by applying a structuring element to an input image.
The most basic morphological operations are erosion and dilation. They are widely used for noise elimination, segmentation of individual image elements, connection of neighboring elements, and finding distinct maximum or minimum regions in an image. In the present invention, morphological operations are used to eliminate small regions and highlight the target area to be detected, namely the laser spot image.
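A toy NumPy sketch of erosion and dilation with a 3×3 structuring element, and of their composition (opening) removing a small region while keeping the spot; border handling is simplified, and a real pipeline would use a library implementation:

```python
import numpy as np

def erode(img):
    """3x3 erosion on a 0/255 image: a pixel stays 255 only if its
    whole 3x3 neighbourhood is 255 (border set to 0 for brevity)."""
    out = np.zeros_like(img)
    for y in range(1, img.shape[0] - 1):
        for x in range(1, img.shape[1] - 1):
            if img[y-1:y+2, x-1:x+2].min() == 255:
                out[y, x] = 255
    return out

def dilate(img):
    """3x3 dilation: a pixel becomes 255 if any neighbour is 255."""
    out = np.zeros_like(img)
    for y in range(1, img.shape[0] - 1):
        for x in range(1, img.shape[1] - 1):
            if img[y-1:y+2, x-1:x+2].max() == 255:
                out[y, x] = 255
    return out

img = np.zeros((16, 16), dtype=np.uint8)
img[2, 2] = 255              # 1-pixel noise region
img[6:12, 6:12] = 255        # 6x6 "laser spot"
opened = dilate(erode(img))  # opening = erosion then dilation
print(opened[2, 2], opened[8, 8])  # 0 255
```

Erosion deletes any region smaller than the structuring element (the 1-pixel noise vanishes), and the subsequent dilation restores the surviving spot approximately to size, which is why opening is the standard tool for "eliminating small areas" as described above.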
The above embodiments only illustrate the technical solution of the present invention. It should be understood by those skilled in the art that, although the present invention has been described in detail with reference to the above embodiments, modifications and equivalents may be made without departing from the spirit and scope of the invention, which is intended to be encompassed by the claims.

Claims (4)

1. The vision-based plant protection unmanned aerial vehicle comprises a disc-shaped shell which is horizontally arranged, wherein a plurality of cantilevers are evenly hinged on the shell in the circumferential direction of the shell, and the other ends of the cantilevers are free ends; it is characterized in that: each cantilever is arranged along the radial direction of the shell, and the free end of each cantilever is higher than the hinged end;
the front and rear sides of the shell are respectively provided with a front landing gear and a rear landing gear downwards, the upper surface of the shell is connected with a push rod upwards, and the top end of the push rod is provided with a GPS module; the flying control device comprises a shell, a flying controller and a storage battery, wherein the flying controller and the storage battery are arranged in the shell, the storage battery is connected with the flying controller and supplies power for the flying controller, the free end of each cantilever is provided with a motor, and the output shaft of each motor is upwards connected with a propeller;
the lower surface of the shell between the front landing gear and the rear landing gear is downwards connected with a horizontally arranged suspension, the left end of the suspension is provided with a laser emitter, and the laser emitted by the laser emitter is perpendicular to the suspension and faces downwards; the right end of the suspension is provided with a camera which is obliquely arranged leftwards and downwards; the diameter R of the laser beam emitted by the laser emitter is 10 to 15 mm;
the preset flying height of the unmanned aerial vehicle is H1 to H2 meters, H2 is more than H1, an intersection point is formed between a lens optical axis of the camera and a straight line where laser emitted by the laser emitter is located, the intersection point is lower than the laser emitters H1 to (H1+H2)/2 meters, an included angle theta between the lens optical axis of the camera and the horizontal plane is 50 to 75 degrees, and the unmanned aerial vehicle is provided with a remote controller in wireless communication with a flying controller; the middle part of the suspension is provided with a miniPC which is connected with the flight controller, the camera and the laser transmitter;
when the heights of the unmanned aerial vehicle are calculated to be 1m and 2m in advance, the positions y1 and y2 of the light spots formed by the laser transmitters on the image shot by the camera are corresponding to each other;
when the miniPC calls a laser spot coordinate positioning algorithm function to process an image of an interested region, after a y value of a current spot image coordinate is obtained, comparing y with y1 and y2, and when y is less than y1, indicating that the height of the unmanned aerial vehicle is lower than 1m, sending a height pulling instruction to the unmanned aerial vehicle at the moment to enable the height of the unmanned aerial vehicle to be adjusted to be more than 1 m; when y > y2, indicating that the height of the unmanned aerial vehicle exceeds 2m, sending a height reducing instruction to the unmanned aerial vehicle to adjust the height of the unmanned aerial vehicle to be less than 2 m; when y1< y < y2, the unmanned aerial vehicle is between 1 and 2m, and the height of the unmanned aerial vehicle is not adjusted.
2. The vision-based plant protection drone of claim 1, wherein: the ejector rod is hinged with the upper surface of the shell, a through hole is formed in the upper surface of the shell, and a connecting line of the GPS module extends downwards along the ejector rod and is connected with the flight controller after extending into the shell along the through hole; and the connecting lines of the laser transmitter and the camera are respectively connected with the flight controller.
3. The vision-based plant protection drone of claim 1 or 2, wherein: the front landing gear and the rear landing gear are symmetrically arranged and respectively comprise a diagonal bracing rod and a horizontal bracing rod, the lower ends of the diagonal bracing rods are fixedly connected with the middle parts of the horizontal bracing rods, and the upper ends of the diagonal bracing rods are connected with the shell.
4. The vision-based plant protection drone of claim 1 or 2, wherein: the intersection point of the lens optical axis of the camera and the straight line where the laser emitted by the laser emitter is located is lower than that of the laser emitter (H1 +H2)/2 meters.
CN201811085167.7A 2018-09-18 2018-09-18 Plant protection unmanned aerial vehicle based on vision Active CN109131860B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811085167.7A CN109131860B (en) 2018-09-18 2018-09-18 Plant protection unmanned aerial vehicle based on vision


Publications (2)

Publication Number Publication Date
CN109131860A CN109131860A (en) 2019-01-04
CN109131860B true CN109131860B (en) 2024-01-23

Family

ID=64814675


Country Status (1)

Country Link
CN (1) CN109131860B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109765931B (en) * 2019-01-31 2021-03-16 交通运输部天津水运工程科学研究所 Near-infrared video automatic navigation method suitable for breakwater inspection unmanned aerial vehicle
IT202100024602A1 (en) 2021-09-27 2023-03-27 Securesi Srl HORIZONTAL SPRAY SYSTEM OF LIQUID PRODUCTS WITH DRONE

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102495522A (en) * 2011-12-01 2012-06-13 天津曙光敬业科技有限公司 Method for manufacturing 360-degree air panoramic interactive roam system based on unmanned helicopter aerial shooting
CN104309803A (en) * 2014-10-27 2015-01-28 广州极飞电子科技有限公司 Automatic landing system and method of rotor aircraft
CN106454228A (en) * 2016-09-20 2017-02-22 朱海燕 Human face identification based video monitor intelligentizing network system
CN106585965A (en) * 2016-12-30 2017-04-26 苏州曾智沃德智能科技有限公司 Unmanned aerial vehicle used for highway surveying
CN107422743A (en) * 2015-09-12 2017-12-01 深圳九星智能航空科技有限公司 The unmanned plane alignment system of view-based access control model
CA3005289A1 (en) * 2016-11-23 2018-05-23 Ibs Of America A monitoring system, control system, actuation assembly of a paper machine, and a method of controlling



Similar Documents

Publication Publication Date Title
US11771076B2 (en) Flight control method, information processing device, program and recording medium
CN110244766B (en) Planning method and system for unmanned aerial vehicle routing inspection route of photovoltaic power station
CN110879601B (en) Unmanned aerial vehicle inspection method for unknown fan structure
CN110618691B (en) Machine vision-based method for accurately landing concentric circle targets of unmanned aerial vehicle
CN105867397B (en) A kind of unmanned plane exact position landing method based on image procossing and fuzzy control
CN106969730B (en) A kind of top fruit sprayer volume measuring method based on unmanned plane Detection Techniques
CN109131860B (en) Plant protection unmanned aerial vehicle based on vision
CN112197766B (en) Visual gesture measuring device for tethered rotor platform
CN106774405B (en) Orchard plant protection drone obstacle avoidance apparatus and method based on three-level avoidance mechanism
CN113093772B (en) Method for accurately landing hangar of unmanned aerial vehicle
CN108873944B (en) Ultra-low altitude self-adaptive fixed-height flight control method
CN110455258A (en) A kind of unmanned plane Terrain Clearance Measurement method based on monocular vision
CN102538782A (en) Helicopter landing guide device and method based on computer vision
CN110879617A (en) Infrared-guided unmanned aerial vehicle landing method and device
CN112947526B (en) Unmanned aerial vehicle autonomous landing method and system
CN114743021A (en) Fusion method and system of power transmission line image and point cloud data
CN106940734A (en) A kind of Migrating Insects monitor recognition methods and device in the air
CN109765931B (en) Near-infrared video automatic navigation method suitable for breakwater inspection unmanned aerial vehicle
CN211060910U Image control point target system for unmanned aerial vehicle surveying and airborne laser radar
CN213987269U System for unmanned aerial vehicle inspection of wind turbine blades
Li et al. Prediction of wheat gains with imagery from four-rotor UAV
Zhao et al. Theoretical design and first test in laboratory of a composite visual servo-based target spray robotic system
CN110618696B (en) Air-ground integrated surveying and mapping unmanned aerial vehicle
CN112327891A (en) Unmanned aerial vehicle autonomous landing system and method
CN111232234A (en) Method for real-time positioning system of aircraft space

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant