CN109446955B - Image processing method and device, unmanned aerial vehicle and server - Google Patents


Info

Publication number
CN109446955B
CN109446955B CN201811208671.1A
Authority
CN
China
Prior art keywords
image
target area
sub
identity
unmanned aerial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811208671.1A
Other languages
Chinese (zh)
Other versions
CN109446955A (en)
Inventor
刘小军
温宏愿
刘增元
周军
牛绿原
张朋
刘磊
苏洋洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Taizhou Institute Of Sci&tech Nust
Original Assignee
Taizhou Institute Of Sci&tech Nust
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Taizhou Institute Of Sci&tech Nust filed Critical Taizhou Institute Of Sci&tech Nust
Priority to CN201811208671.1A priority Critical patent/CN109446955B/en
Publication of CN109446955A publication Critical patent/CN109446955A/en
Application granted granted Critical
Publication of CN109446955B publication Critical patent/CN109446955B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/188Vegetation
    • AHUMAN NECESSITIES
    • A01AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01MCATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M7/00Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass
    • A01M7/0025Mechanical sprayers
    • AHUMAN NECESSITIES
    • A01AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01MCATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M7/00Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass
    • A01M7/0089Regulating or controlling systems
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/08Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0808Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Pest Control & Pesticides (AREA)
  • Environmental Sciences (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Zoology (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Wood Science & Technology (AREA)
  • Insects & Arthropods (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Image Processing (AREA)

Abstract

The application provides an image processing method and device, an unmanned aerial vehicle and a server, used to solve the prior-art problem that an unmanned aerial vehicle cannot judge, from an image of a sub-target area, whether that area needs to be sprayed. The method is applied to the unmanned aerial vehicle and comprises the following steps: dividing the received target area to obtain at least one sub-target area; selecting an image acquisition coordinate from each of the at least one sub-target area to obtain a plurality of image acquisition coordinates; sequentially flying to each of the plurality of image acquisition coordinates to capture an image, and adding an identity to the sub-target area corresponding to that image acquisition coordinate; and sending the captured images and the identities of the corresponding sub-target areas to a server.

Description

Image processing method and device, unmanned aerial vehicle and server
Technical Field
The application relates to the technical field of unmanned aerial vehicles and image recognition, and relates to an image processing method and device, an unmanned aerial vehicle and a server.
Background
At present, the traditional multi-rotor unmanned plant protection machine in the industry generates torque through its motor-driven blades to control actions of the aircraft such as advancing, turning, climbing and rolling. The unmanned plant protection machine carries a pesticide spraying device and uses the strong downdraft produced by the aircraft blades to spray pesticide droplets evenly and effectively onto the crop surface. However, the traditional unmanned plant protection machine has a single function: it can only spray pesticide and cannot perform crop disaster detection. The prior art therefore has the problem that the unmanned aerial vehicle cannot judge, from images of sub-target areas, whether spraying is needed.
Disclosure of Invention
In view of this, the present application provides an image processing method and apparatus, an unmanned aerial vehicle, and a server, which are used to solve the problem that in the prior art, an unmanned aerial vehicle cannot determine whether a sub-target area needs to be sprayed according to an image of the area.
The application provides an image processing method, which is applied to an unmanned aerial vehicle, and comprises the following steps: dividing the received target area to obtain at least one sub-target area; selecting image acquisition coordinates from each of the at least one sub-target area to obtain a plurality of image acquisition coordinates; sequentially flying to each image acquisition coordinate in the plurality of image acquisition coordinates to shoot an image, and adding an identity mark to a sub-target area corresponding to the image acquisition coordinate; and sending the shot image and the identity of the corresponding sub-target area to a server.
Optionally, in this embodiment of the application, before the dividing the received target area to obtain at least one sub-target area, the method further includes: and receiving the target area transmitted by the ground station.
Optionally, in this embodiment of the application, after the sending the captured image and the identifiers of the corresponding sub-target areas to the server, the method further includes: receiving the identity of at least one sub-target area needing pesticide spraying, which is sent by the server; flying to a sub-target area corresponding to the identification mark, and spraying liquid to the sub-target area through a spraying device.
Optionally, in this embodiment of the present application, the dividing the received target area to obtain at least one sub-target area includes: calculating, by using the Pythagorean theorem, the area of the target area according to the view angle of the unmanned aerial vehicle and the height of the unmanned aerial vehicle; if the area of the target area is larger than a preset area, dividing the target area into first areas whose area is not larger than the preset area, and taking the first areas as the sub-target areas; and if the area of the target area is smaller than the preset area, taking the target area as a sub-target area.
The application also provides an image processing method, which is applied to a server and comprises the following steps: receiving an image and a corresponding identity sent by the unmanned aerial vehicle; performing preset processing on the image of the sub-target area corresponding to the identity to obtain the state information of the sub-target area; if the state information meets a preset condition, taking the sub-target area as a spraying area; and sending the identity of at least one spraying area to the unmanned aerial vehicle.
Optionally, in this embodiment of the application, the image of the sub-target area is an image of a crop growing in the ground, and the preset processing is performed on the image of the sub-target area corresponding to the identifier to obtain the status information of the sub-target area, where the preset processing includes: converting a first color space of the image into a second color space, thereby obtaining a first image; removing a first interference factor in the first image to obtain a second image; and calculating the color distribution of the crops in the second image, and acquiring state information reflecting the growth conditions of the crops according to the colors of the crops.
The application also provides an image processing device, is applied to unmanned aerial vehicle, includes: a target area obtaining module, configured to divide the received target area to obtain at least one sub-target area; an acquisition coordinate obtaining module, configured to select an image acquisition coordinate from each of the at least one sub-target region, and obtain a plurality of image acquisition coordinates; the image identification obtaining module is used for sequentially flying to each image acquisition coordinate in the plurality of image acquisition coordinates to shoot an image and adding an identity identification to a sub-target area corresponding to the image acquisition coordinate; and the image identifier sending module is used for sending the shot image and the identity identifier of the corresponding sub-target area to the server.
The present application also provides an image processing apparatus applied to a server, including: the image identifier receiving module is used for receiving the image and the corresponding identity identifier sent by the unmanned aerial vehicle; the state information acquisition module is used for carrying out preset processing on the image of the sub-target area corresponding to the identity identification to acquire the state information of the sub-target area; a spraying area obtaining module for taking the sub-target area as a spraying area; and the identity transmitting module is used for transmitting at least one identity of the spraying area to the unmanned aerial vehicle.
This application still provides an unmanned aerial vehicle, unmanned aerial vehicle includes: a first processor, a first memory storing machine-readable instructions executable by the first processor, and a first communication interface for communicating with an external device, the machine-readable instructions, when executed by the first processor, performing a method as described above.
The present application further provides a server, comprising: a second processor, a second memory storing machine-readable instructions executable by the second processor, and a second communication interface for communicating with an external device, the machine-readable instructions, when executed by the second processor, performing the method as described above.
The application provides an image processing method and device, an unmanned aerial vehicle and a server. The unmanned aerial vehicle divides a target area into at least one sub-target area, collects an image of each sub-target area and adds an identity to it, and then sends the collected images and identities to the server. After receiving them, the server judges, by performing preset processing on each image, whether the corresponding sub-target area is a spraying area, obtains at least one spraying area, and sends the identities of the spraying areas back to the unmanned aerial vehicle. In this way the sub-target areas that need spraying are effectively identified, which solves the prior-art problem that an unmanned aerial vehicle cannot judge, from an image of a sub-target area, whether that area needs to be sprayed.
In order to make the aforementioned and other objects and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
For a clearer explanation of the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a schematic flowchart illustrating an image processing method at an unmanned aerial vehicle end according to an embodiment of the present application;
fig. 2 is a schematic flowchart illustrating step S110 of an image processing method at an unmanned aerial vehicle end according to an embodiment of the present application;
FIG. 3 is a flowchart illustrating a complete image processing method at an unmanned aerial vehicle according to an embodiment of the present application;
fig. 4 is a schematic flowchart illustrating an image processing method at a server side according to an embodiment of the present application;
fig. 5 is a flowchart illustrating step S220 of an image processing method at a server side according to an embodiment of the present application;
FIG. 6 is a schematic diagram illustrating an image processing apparatus according to an embodiment of the present application;
fig. 7 shows a schematic structural diagram of a drone and a server provided by an embodiment of the present application.
Icon: 101-an image processing device; 100-unmanned aerial vehicle; 110-target area obtaining module; 120-acquisition coordinate acquisition module; 130-an image identity obtaining module; 140-an image identification sending module; 150-a first processor; 160-a first memory; 170 — a first communication interface; 200-a server; 210-an image identification receiving module; 220-a state information obtaining module; 230-a spray area acquisition module; 240-identity transmitting module; 250-a second processor; 260-a second memory; 270-a second communication interface.
Detailed Description
In the description of the present application, it should be noted that the terms "upper", "lower", "left", "right", "inner", "outer", and the like indicate orientations or positional relationships based on orientations or positional relationships shown in the drawings or orientations or positional relationships conventionally laid out when the products are used, and are only for convenience in describing the present application and simplifying the description, but do not indicate or imply that the referred devices or elements must have a specific orientation, be constructed in a specific orientation, and be operated, and thus, should not be construed as limiting the present application.
In the description of the present application, it is further noted that, unless expressly stated or limited otherwise, the terms "disposed", "mounted", "connected" and "coupled" are to be construed broadly: for example, as a fixed connection, a removable connection or an integral connection; as a mechanical or an electrical connection; as a direct connection or an indirect connection through an intermediate medium; or as internal communication between two elements.
Some embodiments of the present application will be described in detail below with reference to the accompanying drawings. The embodiments described below and the features of the embodiments can be combined with each other without conflict.
The unmanned aerial vehicle provided by the application includes, but is not limited to, a multifunctional plant protection machine; the multifunctional plant protection machine is used as the example in the description below.
The multifunctional plant protection machine mainly comprises a flight controller, an image processing system, a spraying device, motors, the mechanical structure and a ground system.
First embodiment
Referring to fig. 1, fig. 1 is a schematic flow chart illustrating an image processing method at an unmanned aerial vehicle end according to an embodiment of the present application. The application provides an image processing method, which is applied to an unmanned aerial vehicle and comprises the following steps:
step S110: and dividing the received target area to obtain at least one sub-target area.
Referring to fig. 2, fig. 2 is a schematic flowchart illustrating step S110 of the image processing method at the unmanned aerial vehicle end according to the embodiment of the present application. Optionally, in this embodiment of the present application, dividing the received target area to obtain at least one sub-target area includes:
step S111: and calculating by utilizing the Pythagorean theorem according to the visual angle of the unmanned aerial vehicle and the height of the unmanned aerial vehicle to obtain the area of the target area.
It should be noted that step S111 can be divided into two steps. In the first step, the length of the target area is calculated, according to the view angle of the unmanned aerial vehicle and the height of the unmanned aerial vehicle, by using the Pythagorean theorem. For example: with a view angle range b = 120 degrees, a half view angle a = 60 degrees and a flight height h = 10 meters, the trigonometric relation tan a = x/h, where x is the length of the target area, gives x = h · tan a = 10 × 2.144 = 21.44 meters. In the second step, if the aspect ratio of the onboard camera of the unmanned aerial vehicle is, for example, 4:3, the width of the target area follows from x:y = 4:3, i.e. 21.44:y = 4:3, so y = 16.08 meters, and the area of the target area is finally calculated as s = x · y = 21.44 × 16.08 ≈ 344.75 square meters.
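The footprint computation in step S111 can be sketched as follows. This is a minimal illustration of the relations stated above; the function name `ground_footprint`, its parameters, and the 4:3 default aspect ratio are assumptions for illustration, not names taken from the patent:

```python
import math

def ground_footprint(height_m, half_view_angle_deg, aspect_w=4, aspect_h=3):
    # Length of the covered ground region from the relation tan a = x / h,
    # i.e. x = h * tan(a); the width then follows from the camera's
    # aspect ratio (x : y = 4 : 3 for the camera assumed in the text).
    x = height_m * math.tan(math.radians(half_view_angle_deg))
    y = x * aspect_h / aspect_w
    return x, y, x * y  # length (m), width (m), area (m^2)
```

With h = 10 m and a half view angle of 45 degrees, for instance, this yields a footprint of 10 m by 7.5 m, i.e. 75 square meters.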
Step S112: if the area of the target area is larger than the preset area, the target area is divided into a first area with the area not larger than the preset area, and the first area is used as a sub-target area.
For example, with a target-region area of s = x · y = 21.44 × 16.08 ≈ 344.75 square meters and a preset region area of, for example, s1 = 10 × 8 = 80 square meters, the target region may be divided into roundup(21.44/10) × roundup(16.08/8) = 3 × 3 = 9 sub-target regions, where roundup means rounding up; in a specific implementation process, the division needs to be adjusted according to actual conditions.
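The roundup-based grid division above can be sketched as follows; `count_sub_regions` is an illustrative helper name, not one used in the patent:

```python
import math

def count_sub_regions(target_x, target_y, preset_x, preset_y):
    # Number of sub-target areas when the target region is divided into a
    # grid of cells no larger than the preset area (roundup = ceiling).
    return math.ceil(target_x / preset_x) * math.ceil(target_y / preset_y)
```

For the example in the text, `count_sub_regions(21.44, 16.08, 10, 8)` gives roundup(2.144) × roundup(2.01) = 3 × 3 = 9 sub-target regions.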
Step S113: and if the area of the target area is smaller than the preset area, taking the target area as a sub-target area.
For example, if the areas of the first and second sub-target regions are s1 = 10 × 8 = 80 square meters, the area of the third (corner) sub-target region is s3 = (21.44 − 2 × 10) × (16.08 − 2 × 8) = 1.44 × 0.08 = 0.1152 square meters.
Step S120: image acquisition coordinates are selected from each of the at least one sub-target area, obtaining a plurality of image acquisition coordinates.
In a specific implementation process, the image acquisition coordinate is usually the coordinate of the center point of the selected sub-target area, and it may be adjusted according to specific conditions such as terrain and weather.
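Combining the grid division with the center-point rule gives one acquisition coordinate per sub-target area. The helper below is a sketch under the assumptions already stated (roundup grid, center points); the function name is illustrative:

```python
import math

def capture_coordinates(target_x, target_y, preset_x, preset_y):
    # Center point of every grid cell obtained by dividing the target
    # region into cells of at most preset_x by preset_y meters.
    nx = math.ceil(target_x / preset_x)
    ny = math.ceil(target_y / preset_y)
    return [((i + 0.5) * target_x / nx, (j + 0.5) * target_y / ny)
            for i in range(nx) for j in range(ny)]
```

A target region that fits inside one preset cell yields a single acquisition coordinate at its center.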
Step S130: and sequentially flying to each image acquisition coordinate in the plurality of image acquisition coordinates to shoot an image, and adding an identity mark to the sub-target area corresponding to the image acquisition coordinate.
For example, the unmanned aerial vehicle sequentially flies to each of the plurality of image acquisition coordinates to capture an image; a timestamp is added to the captured image as its identity, and the coordinate of the capture location may also be used as the identity, specifically the longitude and latitude of the coordinate, or the position of the coordinate relative to a certain point.
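An identity built from a timestamp and the capture coordinate, as described above, can be sketched as follows; the exact string format and the function name are assumptions for illustration, the patent does not mandate them:

```python
from datetime import datetime, timezone

def make_identity(lat, lon, when=None):
    # Combine the capture timestamp (UTC) with the capture coordinate's
    # latitude and longitude into one identity string.
    when = when or datetime.now(timezone.utc)
    return f"{when.strftime('%Y%m%dT%H%M%SZ')}_{lat:.6f}_{lon:.6f}"
```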
Step S140: and sending the shot image and the identity of the corresponding sub-target area to a server.
The captured image and the identity of the corresponding sub-target area are sent to the server. The image can first be stored in a cache, which can be a cloud storage platform provided by a third-party tool; the image is then forwarded to the server through that platform and processed by the server. A third-party cloud storage platform provides a real-time storage service, and the server can receive and process the images asynchronously, which speeds up image processing and saves server resources.
Referring to fig. 3, fig. 3 is a schematic flowchart illustrating a complete image processing method at an unmanned aerial vehicle end according to an embodiment of the present application. Optionally, in this embodiment of the application, before dividing the received target area to obtain at least one sub-target area, the method further includes:
step S100: and receiving the target area transmitted by the ground station.
The ground station can be a server of a ground control center that sends the target area, or an unmanned aerial vehicle controller in the hands of ground station staff. For example, the flight controller may be an STM32F427, with a working frequency of up to 168 MHz, fast computation and low system power consumption. The flight controller acquires attitude data through an integrated 3-axis gyroscope, 3-axis accelerometer and geomagnetic sensor, applies optimized process control, and outputs modulation signals with different duty ratios to drive the brushless direct-current motors, thereby controlling the flight attitude. Therefore, the specific transmission method and transmission apparatus should not be construed as limiting the present application.
Referring to fig. 3, optionally, in this embodiment of the application, after sending the captured image and the identifiers of the corresponding sub-target areas to the server, the method further includes:
step S150: and receiving the identity of at least one sub-target area needing pesticide spraying sent by the server.
Step S160: flying to a sub-target area corresponding to the identification mark, and spraying liquid to the sub-target area through a spraying device.
A route is planned according to the identity of the sub-target area so that the unmanned aerial vehicle can return to the area that needs spraying; the spraying device is then opened and adjusted so that the strong downward airflow produced by the aircraft blades sprays the pesticide evenly and effectively onto the crop surface.
Second embodiment
Referring to fig. 4, fig. 4 is a schematic flowchart illustrating an image processing method at a server according to an embodiment of the present application. The application provides an image processing method, which is applied to a server and comprises the following steps:
step S210: and receiving the image and the corresponding identity sent by the unmanned aerial vehicle.
As described in the first embodiment, the image may first be sent to a third-party cloud storage platform, which then forwards it to the server for processing. A third-party cloud storage platform, such as the local synchronization tool of Qiniu Cloud, makes it convenient to view and update the photos, and the updated photos can be saved to a specific location on the server. The identity can be transmitted to the server directly or through the third-party cloud storage platform; therefore, the specific manner and object of receiving the image and the corresponding identity should not be construed as limiting the application.
Step S220: and carrying out preset processing on the image of the sub-target area corresponding to the identity identification to obtain the state information of the sub-target area.
Referring to fig. 5, fig. 5 is a schematic flowchart illustrating step S220 of the image processing method at the server side according to the embodiment of the present application. Optionally, in this embodiment of the application, the image of the sub-target area is an image of a crop growing in the ground, and the preset processing is performed on the image of the sub-target area corresponding to the identity identifier to obtain the state information of the sub-target area, where the preset processing includes:
step S221: the first color space of the image is converted into a second color space, thereby obtaining a first image.
After the program loads the image, the input image is converted from the RGB color space (an industry color standard) into the HSV color space using OpenCV (an open-source cross-platform computer vision library). HSV (Hue, Saturation, Value), also called HSB, is a color space created by A. R. Smith in 1978 according to the intuitive characteristics of color; it is a representation of the points of the RGB color space that attempts to describe perceptual color relationships more accurately than RGB does.
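In the patent this conversion is done with OpenCV; the stdlib sketch below illustrates the same RGB-to-HSV conversion per pixel without that dependency. Note that OpenCV scales H to 0-179 and S, V to 0-255, while `colorsys` returns floats in 0-1; the function name here is illustrative:

```python
import colorsys

def rgb_image_to_hsv(pixels):
    # Convert a list of (R, G, B) pixels with 0-255 channels into
    # (H, S, V) tuples with 0-1 float channels.
    return [colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
            for (r, g, b) in pixels]
```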
For example, H denotes hue and S denotes saturation; in HSV, V denotes value, while in the related HSL and HSB models L denotes lightness and B denotes brightness. The program computes the histogram of the 8-bit image, i.e. over the 0-to-255 range, and for such an image the histogram shows a data curve with two peaks. The program then locates the trough between the two peaks and takes the average of the two peak positions to calculate an approximate ideal threshold; the values above the threshold are retained and those below become 255. In this way the influence of land and light can be effectively avoided, so the first image converted into HSV space can reflect color relationships more accurately.
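The peak-averaging threshold described above can be sketched as follows. The minimum separation of 32 bins between the two peaks and the function name are assumptions added to make the sketch well defined; real histograms may also need smoothing first:

```python
def bimodal_threshold(histogram):
    # Approximate ideal threshold for a bimodal 256-bin histogram:
    # find the two highest peaks and average their positions.
    p1 = max(range(256), key=lambda i: histogram[i])
    # Second peak: highest bin at least 32 bins away from the first.
    p2 = max((i for i in range(256) if abs(i - p1) >= 32),
             key=lambda i: histogram[i])
    return (p1 + p2) // 2
```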
Step S222: and removing the first interference factor in the first image to obtain a second image.
In a specific implementation process, the first interference factor in the first image is removed; for example, whether the color of the crop lies in a healthy range is calculated from the overall color distribution of the crop and the HSV values of its color. The blockSize parameter of the adaptive threshold function used in the program specifies the size of the pixel neighborhood over which the threshold is calculated, e.g. 3, 5 or 7.
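OpenCV's `cv2.adaptiveThreshold` implements this step; the pure-Python sketch below mimics its mean method to show the role of blockSize. The function name and the keep-or-zero output convention are illustrative assumptions:

```python
def adaptive_threshold(gray, block_size=5, c=0):
    # Mean adaptive threshold: each pixel is compared against the mean of
    # its square block_size x block_size neighborhood (clipped at borders).
    # gray is a list of rows of 0-255 values; returns a 0/255 mask.
    h, w = len(gray), len(gray[0])
    r = block_size // 2
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [gray[yy][xx]
                    for yy in range(max(0, y - r), min(h, y + r + 1))
                    for xx in range(max(0, x - r), min(w, x + r + 1))]
            mean = sum(vals) / len(vals)
            out[y][x] = 255 if gray[y][x] > mean - c else 0
    return out
```

A single bright pixel on a dark background is kept (255) because it exceeds its local mean, while uniform background pixels are zeroed.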
Step S223: and calculating the color distribution of the crops in the second image, and acquiring state information reflecting the growth conditions of the crops according to the colors of the crops.
In the specific implementation process, for rice growth detection, the variance of the hue values of the effective pixels after adaptive threshold segmentation is calculated against the main hue value of the standard map, such as the green hue value, where a variance of 1000-2000 corresponds to normal growth and 2000- corresponds to mild yellow seedling.
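A sketch of this classification follows. The 1000-2000 normal band comes from the text; the use of the mean squared deviation around the reference hue as the "variance", the below-1000 label, and the function name are assumptions for illustration:

```python
def classify_growth(hue_values, reference_hue):
    # Mean squared deviation of effective-pixel hues from the
    # standard-map reference hue, mapped onto the bands in the text.
    n = len(hue_values)
    variance = sum((h - reference_hue) ** 2 for h in hue_values) / n
    if variance < 1000:
        return "below normal band", variance
    if variance <= 2000:
        return "normal growth", variance
    return "yellow seedling", variance
```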
Step S230: and if the state information meets the preset condition, taking the sub-target area as a spraying area.
Step S240: and sending the identity of at least one spraying area to the unmanned aerial vehicle.
For the above example, when an area is determined to be, for instance, severe yellow seedling, the sub-target area is marked as a spraying area, and the identity of the spraying area is then sent to the unmanned aerial vehicle.
Third embodiment
Referring to fig. 6, fig. 6 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application. The image processing apparatus 101 provided by the application is applied to the unmanned aerial vehicle 100 and includes:
a target area obtaining module 110, configured to divide the received target area to obtain at least one sub-target area.
An acquisition coordinate obtaining module 120, configured to select an image acquisition coordinate from each of the at least one sub-target area, and obtain a plurality of image acquisition coordinates.
The image identifier obtaining module 130 is configured to sequentially fly to each image acquisition coordinate of the multiple image acquisition coordinates to capture an image, and add an identity identifier to a sub-target area corresponding to the image acquisition coordinate.
And an image identifier sending module 140, configured to send the captured image and the identifiers of the corresponding sub-target areas to the server 200.
Fourth embodiment
Referring to fig. 6, the image processing apparatus 101 provided by the present application is applied to a server 200, and includes:
the image identifier receiving module 210 is configured to receive an image and a corresponding identity identifier sent by the drone 100.
The status information obtaining module 220 is configured to perform preset processing on the image of the sub-target area corresponding to the identity identifier, and obtain status information of the sub-target area.
A spray area obtaining module 230 for taking the sub-target area as a spray area.
An identity sending module 240, configured to send an identity of the at least one spraying area to the drone 100.
Fifth embodiment
Referring to fig. 7, fig. 7 shows a schematic structural diagram of a drone and a server according to an embodiment of the present application. The unmanned aerial vehicle 100 provided by the application includes: a first processor 150, a first memory 160 and a first communication interface 170, the first memory 160 storing machine-readable instructions executable by the first processor 150, the first communication interface 170 being used for communicating with external devices, the machine-readable instructions, when executed by the first processor 150, performing the method of the first embodiment.
Sixth embodiment
Referring to fig. 7, the present application provides a server 200, where the server 200 includes: a second processor 250, a second memory 260 and a second communication interface 270, the second memory 260 storing machine readable instructions executable by the second processor 250, the second communication interface 270 being for communicating with external devices, the machine readable instructions, when executed by the second processor 250, performing a method as in the second embodiment.
The application provides an image processing method and device, an unmanned aerial vehicle and a server. The unmanned aerial vehicle divides a target area into at least one sub-target area, collects an image of each sub-target area and adds an identity to it, and then sends the collected images and identities to the server. After receiving them, the server judges, by performing preset processing on each image, whether the corresponding sub-target area is a spraying area, obtains at least one spraying area, and sends the identities of the spraying areas back to the unmanned aerial vehicle. In this way the sub-target areas that need spraying are effectively identified, which solves the prior-art problem that an unmanned aerial vehicle cannot judge, from an image of a sub-target area, whether that area needs to be sprayed.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (8)

1. An image processing method, applied to an unmanned aerial vehicle, the method comprising:
dividing the received target area to obtain at least one sub-target area;
selecting image acquisition coordinates from each of the at least one sub-target area to obtain a plurality of image acquisition coordinates;
sequentially flying to each of the plurality of image acquisition coordinates to capture an image, and adding an identity mark to the sub-target area corresponding to that image acquisition coordinate, wherein adding the identity mark to the sub-target area comprises: determining the identity mark from the timestamp at which the image is captured and the position coordinates at which the image is captured;
sending the shot image and the identity marks of the corresponding sub-target areas to a server;
receiving, from the server, the identity mark of at least one sub-target area that needs pesticide spraying;
flying to the sub-target area corresponding to the identity mark, and spraying liquid onto the sub-target area through a spraying device;
the status information of the sub-target area is obtained by performing preset processing on the image of the sub-target area corresponding to the identity mark; the preset processing of the image of the sub-target area corresponding to the identity mark comprises: converting the RGB color space of the image into an HSV color space through OpenCV to obtain a first image; removing a first interference factor in the first image to obtain a second image; and calculating the crop color distribution in the second image and acquiring, according to the crop colors, state information reflecting the crop growth condition;
wherein the calculating the crop color distribution in the second image and acquiring, according to the crop colors, state information reflecting the crop growth condition comprises: counting the number of pixels at each value of the HSV color space of the second image over the 0-255 baseline to obtain a histogram curve; calculating an approximate ideal threshold from the average of the two peaks of the histogram curve, and performing adaptive threshold segmentation with the approximate ideal threshold to obtain the variance of the hue values of the effective pixels in the second image; and comparing the variance of the effective pixel hue values with a preset range to obtain the state information of the crop growth condition.
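The histogram, threshold, and variance steps of claim 1 can be sketched as follows. This is an illustrative reconstruction, not the patented implementation: the peak-finding window, the direction of the segmentation, and the "healthy" variance range are all assumptions, and in practice the hue channel would come from OpenCV (`cv2.cvtColor(img, cv2.COLOR_BGR2HSV)`, where 8-bit hue actually spans 0-179 rather than the claim's 0-255 baseline).

```python
import numpy as np


def growth_status(hue, healthy_range=(0.0, 900.0)):
    """Sketch of the claimed histogram / threshold / variance steps.

    `hue` is the H channel of an HSV image, here assumed scaled to
    0-255 as in the claim. `healthy_range` is an illustrative preset
    range, not a value from the patent.
    """
    hue = np.asarray(hue, dtype=np.float64).ravel()

    # Count pixel values over the 0-255 baseline -> histogram curve.
    hist, _ = np.histogram(hue, bins=256, range=(0, 256))

    # Approximate ideal threshold: average of the two histogram peaks.
    p1 = int(np.argmax(hist))
    masked = hist.copy()
    # Suppress the neighbourhood of the first peak so the second
    # argmax finds a genuinely separate mode (window size assumed).
    masked[max(0, p1 - 10):p1 + 10] = 0
    p2 = int(np.argmax(masked))
    threshold = (p1 + p2) / 2.0

    # Adaptive threshold segmentation: keep the "effective" pixels
    # (which side of the threshold counts as effective is assumed).
    effective = hue[hue > threshold]

    # Variance of the effective pixel hue values, compared against
    # the preset range to classify the crop growth condition.
    variance = float(effective.var()) if effective.size else 0.0
    lo, hi = healthy_range
    status = "healthy" if lo <= variance <= hi else "needs spraying"
    return status, variance
```

A tight cluster of effective hues yields a low variance (uniform, healthy growth), while a spread-out hue distribution yields a high variance and marks the sub-target area for spraying.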
2. The image processing method of claim 1, further comprising, before the dividing the received target area to obtain at least one sub-target area:
receiving the target area transmitted by a ground station.
3. The image processing method of claim 1, wherein the dividing the received target area to obtain at least one sub-target area comprises:
calculating the area of the target area using the Pythagorean theorem, based on the angle of view of the unmanned aerial vehicle's camera and the flight height of the unmanned aerial vehicle;
if the area of the target area is larger than a preset area, dividing the target area into first areas whose areas are not larger than the preset area, and taking each first area as a sub-target area;
and if the area of the target area is smaller than the preset area, taking the target area as a sub-target area.
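The footprint calculation in claim 3 can be sketched as follows, assuming a square angle of view. The relation comes from the right triangle formed by the flight height and half the ground footprint width (the Pythagorean theorem the claim refers to): `slant² = height² + (w/2)²` with `w/2 = height · tan(angle/2)`. Function name and the square-footprint assumption are illustrative, not from the patent.

```python
import math


def footprint_area(height_m, view_angle_deg):
    """Ground area covered by one image, from flight height and the
    camera's angle of view (square footprint assumed)."""
    # Half the footprint width, from the right triangle formed by the
    # flight height and the half-angle of view.
    half = height_m * math.tan(math.radians(view_angle_deg) / 2.0)
    side = 2.0 * half        # footprint side length on the ground, in m
    return side * side       # footprint area in square metres
```

For example, at 10 m altitude with a 90° angle of view the footprint is a 20 m × 20 m square, i.e. 400 m², which is then compared against the preset area to decide whether to subdivide.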
4. An image processing method applied to a server, the image processing method comprising:
receiving an image and a corresponding identity mark sent by an unmanned aerial vehicle, wherein the identity mark is determined by the unmanned aerial vehicle from the timestamp at which the image was captured and the position coordinates at which the image was captured;
performing preset processing on the image of the sub-target area corresponding to the identity mark to obtain the state information of the sub-target area;
if the state information meets the preset condition, taking the sub-target area as a spraying area;
sending the identity mark of at least one spraying area to the unmanned aerial vehicle, so that the unmanned aerial vehicle receives the identity mark of at least one sub-target area that needs pesticide spraying sent by the server, flies to the sub-target area corresponding to the identity mark, and sprays liquid onto the sub-target area through a spraying device;
the status information of the sub-target area is obtained by performing preset processing on the image of the sub-target area corresponding to the identity mark; the preset processing of the image of the sub-target area corresponding to the identity mark comprises: converting the RGB color space of the image into an HSV color space through OpenCV to obtain a first image; removing a first interference factor in the first image to obtain a second image; and calculating the crop color distribution in the second image and acquiring, according to the crop colors, state information reflecting the crop growth condition;
wherein the calculating the crop color distribution in the second image and acquiring, according to the crop colors, state information reflecting the crop growth condition comprises: counting the number of pixels at each value of the HSV color space of the second image over the 0-255 baseline to obtain a histogram curve; calculating an approximate ideal threshold from the average of the two peaks of the histogram curve, and performing adaptive threshold segmentation with the approximate ideal threshold to obtain the variance of the hue values of the effective pixels in the second image; and comparing the variance of the effective pixel hue values with a preset range to obtain the state information of the crop growth condition.
5. An image processing apparatus, applied to an unmanned aerial vehicle, comprising:
a target area obtaining module, configured to divide the received target area to obtain at least one sub-target area;
an acquisition coordinate obtaining module, configured to select an image acquisition coordinate from each of the at least one sub-target region, and obtain a plurality of image acquisition coordinates;
an image identifier obtaining module, configured to fly to each of the plurality of image acquisition coordinates in sequence to capture an image, and to add an identity mark to the sub-target area corresponding to that image acquisition coordinate, wherein adding the identity mark to the sub-target area comprises: determining the identity mark from the timestamp at which the image is captured and the position coordinates at which the image is captured;
an image identifier sending module, configured to send the captured images and the identity marks of the corresponding sub-target areas to a server; the apparatus is further configured to receive, from the server, the identity mark of at least one sub-target area that needs pesticide spraying, to fly to the sub-target area corresponding to the identity mark, and to spray liquid onto the sub-target area through a spraying device;
the status information of the sub-target area is obtained by performing preset processing on the image of the sub-target area corresponding to the identity mark; the preset processing of the image of the sub-target area corresponding to the identity mark comprises: converting the RGB color space of the image into an HSV color space through OpenCV to obtain a first image; removing a first interference factor in the first image to obtain a second image; and calculating the crop color distribution in the second image and acquiring, according to the crop colors, state information reflecting the crop growth condition;
wherein the calculating the crop color distribution in the second image and acquiring, according to the crop colors, state information reflecting the crop growth condition comprises: counting the number of pixels at each value of the HSV color space of the second image over the 0-255 baseline to obtain a histogram curve; calculating an approximate ideal threshold from the average of the two peaks of the histogram curve, and performing adaptive threshold segmentation with the approximate ideal threshold to obtain the variance of the hue values of the effective pixels in the second image; and comparing the variance of the effective pixel hue values with a preset range to obtain the state information of the crop growth condition.
6. An image processing apparatus, applied to a server, comprising:
the unmanned aerial vehicle comprises an image identifier receiving module, a time stamp module and a corresponding identity identifier, wherein the image identifier receiving module is used for receiving an image sent by the unmanned aerial vehicle and the corresponding identity identifier, and the identity identifier is determined by the unmanned aerial vehicle by using the time stamp when the image is shot and the position coordinate position when the image is shot;
the state information acquisition module is used for carrying out preset processing on the image of the sub-target area corresponding to the identity identification to acquire the state information of the sub-target area;
a spraying area obtaining module, configured to take the sub-target area as a spraying area if the state information meets a preset condition;
an identity mark sending module, configured to send the identity mark of at least one spraying area to the unmanned aerial vehicle, so that the unmanned aerial vehicle receives the identity mark of at least one sub-target area that needs pesticide spraying sent by the server, flies to the sub-target area corresponding to the identity mark, and sprays liquid onto the sub-target area through a spraying device;
the status information of the sub-target area is obtained by performing preset processing on the image of the sub-target area corresponding to the identity mark; the preset processing of the image of the sub-target area corresponding to the identity mark comprises: converting the RGB color space of the image into an HSV color space through OpenCV to obtain a first image; removing a first interference factor in the first image to obtain a second image; and calculating the crop color distribution in the second image and acquiring, according to the crop colors, state information reflecting the crop growth condition;
wherein the calculating the crop color distribution in the second image and acquiring, according to the crop colors, state information reflecting the crop growth condition comprises: counting the number of pixels at each value of the HSV color space of the second image over the 0-255 baseline to obtain a histogram curve; calculating an approximate ideal threshold from the average of the two peaks of the histogram curve, and performing adaptive threshold segmentation with the approximate ideal threshold to obtain the variance of the hue values of the effective pixels in the second image; and comparing the variance of the effective pixel hue values with a preset range to obtain the state information of the crop growth condition.
7. A drone, characterized in that it comprises: a first processor, a first memory storing machine readable instructions executable by the first processor, and a first communication interface for communicating with an external device, the machine readable instructions when executed by the first processor performing the method of any of claims 1-3.
8. A server, characterized in that the server comprises: a second processor, a second memory storing machine-readable instructions executable by the second processor, and a second communication interface for communicating with an external device, the machine-readable instructions when executed by the second processor performing the method of claim 4.
CN201811208671.1A 2018-10-17 2018-10-17 Image processing method and device, unmanned aerial vehicle and server Active CN109446955B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811208671.1A CN109446955B (en) 2018-10-17 2018-10-17 Image processing method and device, unmanned aerial vehicle and server


Publications (2)

Publication Number Publication Date
CN109446955A CN109446955A (en) 2019-03-08
CN109446955B true CN109446955B (en) 2020-08-25

Family

ID=65547138

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811208671.1A Active CN109446955B (en) 2018-10-17 2018-10-17 Image processing method and device, unmanned aerial vehicle and server

Country Status (1)

Country Link
CN (1) CN109446955B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110178605B (en) * 2019-06-28 2021-03-30 重庆文理学院 Automatic sprinkling system of accurate pesticide to kiwi fruit leaf disease
CN111789093B (en) * 2020-07-21 2022-02-18 金子辰 Road flower 5G intelligent mobile pesticide spraying system and pesticide spraying method

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1646716A1 (en) * 2003-07-11 2006-04-19 ZGene A/S Yellow fever mosquito deoxyribonucleoside kinases and its use
CN102759528A (en) * 2012-07-09 2012-10-31 陕西科技大学 Method for detecting diseases of crop leaves
CN103903006A (en) * 2014-03-05 2014-07-02 中国科学院合肥物质科学研究院 Crop pest identification method and system based on Android platform
CN103955938A (en) * 2014-05-15 2014-07-30 安徽农业大学 Wheat growing status diagnosing method based on mobile internet mode and leaf color analysis
CN105223202A (en) * 2014-06-27 2016-01-06 丁莉丽 A kind of method detecting crops leaf diseases
CN108563979A (en) * 2017-12-29 2018-09-21 南京农业大学 A method of based on the farmland image discriminating rice blast state of an illness of taking photo by plane

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105744218B (en) * 2014-12-11 2019-04-26 小米科技有限责任公司 Rubbish method for cleaning and device
CN104849274A (en) * 2015-04-18 2015-08-19 中国计量学院 Real-time detection method for drought status in detected area based on miniature unmanned plane
KR20170004280A (en) * 2015-07-02 2017-01-11 강민구 Apparatus for inducing an aircraft
CN105116911B (en) * 2015-07-20 2017-07-21 广州极飞科技有限公司 Unmanned plane spray method
CN105173085B (en) * 2015-09-18 2017-06-16 山东农业大学 Unmanned plane variable farm chemical applying automatic control system and method
CN105159319B (en) * 2015-09-29 2017-10-31 广州极飞科技有限公司 The spray method and unmanned plane of a kind of unmanned plane
CN106200683B (en) * 2016-07-04 2019-01-15 佛山昊航科技有限公司 Unmanned plane plant protection system and plant protection method
CN106585992A (en) * 2016-12-15 2017-04-26 上海土是宝农业科技有限公司 Method and system for intelligent identification and accurate pesticide spraying using unmanned aerial vehicles
CN106873631B (en) * 2017-04-21 2020-07-28 广州极飞科技有限公司 Unmanned aerial vehicle control method, plant protection operation method, unmanned aerial vehicle and ground station
CN107861519A (en) * 2017-11-06 2018-03-30 四川联众防务科技有限责任公司 A kind of unmanned plane landing control system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on Rapid Detection and Identification of Farmland Pests Based on Machine Vision; Han Ruizhen; China Doctoral Dissertations Full-text Database; 2014-07-15; Vol. 2014, No. 7; D046-5 *

Similar Documents

Publication Publication Date Title
CN106873627B (en) Multi-rotor unmanned aerial vehicle and method for automatically inspecting power transmission line
US9557738B2 (en) Return path configuration for remote controlled aerial vehicle
JP7263630B2 (en) Performing 3D reconstruction with unmanned aerial vehicles
CN106647804B (en) A kind of automatic detecting method and system
CN205121341U (en) Unmanned aerial vehicle ground command system
CN105318888A (en) Unmanned perception based unmanned aerial vehicle route planning method
CN110254722B (en) Aircraft system, aircraft system method and computer-readable storage medium
CN110770791A (en) Image boundary acquisition method and device based on point cloud map and aircraft
EP3859480A1 (en) Unmanned aerial vehicle control method, device and spraying system, and unmanned aerial vehicle and storage medium
CN109446955B (en) Image processing method and device, unmanned aerial vehicle and server
CN110618691B (en) Machine vision-based method for accurately landing concentric circle targets of unmanned aerial vehicle
JPWO2017175804A1 (en) Method, program, and apparatus for spraying medicine by unmanned air vehicle
CN110799983A (en) Map generation method, map generation equipment, aircraft and storage medium
US11320269B2 (en) Information processing apparatus, information processing method, and information processing program
CN111709994B (en) Autonomous unmanned aerial vehicle visual detection and guidance system and method
CN111003192A (en) Unmanned aerial vehicle autonomous landing system and landing method based on GPS and vision
US20210018938A1 (en) Computation load distribution
US11760482B1 (en) Updating virtual aerial map using sensors associated with aerial vehicles
US20240177619A1 (en) Map including data for routing aerial vehicles during gnss failure
CN106945835A (en) A kind of unmanned vehicle
CN108196538B (en) Three-dimensional point cloud model-based field agricultural robot autonomous navigation system and method
CN105243653A (en) Fast mosaic technology of remote sensing image of unmanned aerial vehicle on the basis of dynamic matching
CN114815871A (en) Vision-based autonomous landing method for vertical take-off and landing unmanned mobile platform
Rojas-Perez et al. Real-time landing zone detection for UAVs using single aerial images
CN105208346B (en) Transmission facility identification method based on unmanned plane

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant