AU2021105629A4 - System and Method for Monitoring, Detecting and Counting Fruits in a Field - Google Patents


Publication number
AU2021105629A4
Authority
AU
Australia
Prior art keywords
uav
field
unmanned aerial
aerial vehicle
fruits
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
AU2021105629A
Inventor
Srinivasa A. H.
Veena A.
Basavesh D.
Vidya H. A.
Vidyarani H. J.
S. Pushpalatha
Asha Ramesh
Gowrishankar S.
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
H A Vidya Ms
H J Vidyarani Ms
Pushpalatha S Ms
Ramesh Asha Dr
Original Assignee
H A Vidya Ms
H J Vidyarani Ms
Pushpalatha S Ms
Ramesh Asha Dr
S Gowrishankar Dr
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by H A Vidya Ms, H J Vidyarani Ms, Pushpalatha S Ms, Ramesh Asha Dr, S Gowrishankar Dr filed Critical H A Vidya Ms
Priority to AU2021105629A priority Critical patent/AU2021105629A4/en
Application granted granted Critical
Publication of AU2021105629A4 publication Critical patent/AU2021105629A4/en
Ceased legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/0094 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/0011 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D 1/0022 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the communication link
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/10 Terrestrial scenes
    • G06V 20/188 Vegetation
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 2101/00 UAVs specially adapted for particular uses or applications
    • B64U 2101/30 UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S 19/01 Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S 19/13 Receivers
    • G01S 19/14 Receivers specially adapted for specific applications
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 2205/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S 2205/001 Transmission of position information to remote stations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G PHYSICS
    • G08 SIGNALLING
    • G08C TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C 17/00 Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • G08C 17/02 Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The present disclosure discloses a fruit detection and counting system (100) comprising: an unmanned aerial vehicle (UAV) including a camera; and a location detector. The system further comprises a housing unit having a controller configured to: receive the captured real-time images of one of the segments of the field from the camera, together with the geo-coordinates of the unmanned aerial vehicle (UAV) located at a predefined elevation; count the quantity and determine the quality of the fruits present in each of the segments of the field using deep learning techniques; generate a direction signal to guide the unmanned aerial vehicle (UAV) to move to one of the other segments of the field based on the current geo-coordinate of the unmanned aerial vehicle (UAV); and extract features of the fruits present in the field using filters. Figure 2 (drawing sheet 2/2) shows a flowchart of the corresponding method 200, steps 202 to 216.

Description

Figure 2 (drawing sheet 2/2): flowchart of the method 200 for detecting and counting fruits in a field.
Start
202: Fly an unmanned aerial vehicle (UAV) at a predefined location above a ground level.
204: Capture real-time images of fruits present in predefined segments of a field using a camera.
206: Monitor the real-time location of the unmanned aerial vehicle (UAV) by employing a location detector.
208: Communicate with the unmanned aerial vehicle (UAV) through a controller associated with a housing unit.
210: Receive the captured real-time images of one of the segments of the field from the camera, and the geo-coordinates of the unmanned aerial vehicle (UAV) located at a predefined elevation.
212: Count the quantity of the fruits and determine the quality of the fruits present in each of the segments of the field.
214: Generate a direction signal to guide the unmanned aerial vehicle (UAV) to move to one of the other segments of the field based on the current geo-coordinate of the unmanned aerial vehicle (UAV).
216: Extract features of the fruits present in the field using filters; the quality of a fruit is detected upon the extraction of its boundary, and edge extraction determines the presence of fruits in the field.
End
AUSTRALIA
Patents Act 1990
COMPLETE SPECIFICATION SYSTEM AND METHOD FOR MONITORING, DETECTING AND COUNTING FRUITS IN A FIELD
The following statement is a full description of this invention, including the best method of performing it known to me
SYSTEM AND METHOD FOR MONITORING, DETECTING AND COUNTING FRUITS IN A FIELD
BACKGROUND
[001] Field of Invention
[002] The present disclosure is related to the field of agronomic and agricultural monitoring and counting systems. More particularly, the present disclosure relates to a fruit detection and counting system that focuses on detecting quality of fruit and counting of fruits in a field.
[003] Description of Related Art
[004] Autonomous vehicles and monitoring systems have been developed for automated navigation and detection of agricultural products in fields. Unmanned aerial vehicles (UAVs), sometimes referred to as drones, are remotely piloted or self-piloted aircraft that may carry sensors, communications equipment, and other payloads, and are used to sense objects in their environment.
[005] Modern commercial agriculture continues to become a more data-driven industry, taking advantage of technological advances in computing and communication resources. This discussion is believed to be helpful in providing background information to smooth the path to a better understanding of the various aspects of the present system. Accordingly, it should be acknowledged that these statements are to be read in this light, and not as admissions of prior art.
[006] A disadvantage of the above-mentioned prior systems, however, is that in some cases the background may include moving objects, such as birds, or plants and trees with leaves and stems swaying in the wind, to name a few. In this case, the captured image will nonetheless contain images of these moving background objects together with the image of the desired object, which are visually distracting when the images are investigated by the unmanned aerial vehicles (UAVs). Moreover, the existence of moving background objects often causes faulty detections in the technical field of agricultural systems. While unmanned aerial vehicles (UAVs) have also been used for agricultural monitoring, such conventional systems are not entirely satisfactory.
[007] In view of the above-mentioned drawbacks, there exists a need for deep learning techniques, applied through various methodologies, that can suppress all background objects, including the static background scene and moving background objects, while detecting, counting and displaying only the target agricultural products.
SUMMARY
[008] Embodiments in accordance with the present invention provide a fruit detection and counting system. The system includes an unmanned aerial vehicle (UAV) flying at a predefined location above a ground level. The unmanned aerial vehicle (UAV) includes a camera to capture real-time images of fruits present in predefined segments of a field. The unmanned aerial vehicle (UAV) also includes a location detector to monitor the real-time location of the unmanned aerial vehicle (UAV). The system also includes a controller communicating with the unmanned aerial vehicle (UAV). The controller is configured to receive the captured real-time images of one of the segments of the field from the camera and the geo-coordinates of the unmanned aerial vehicle (UAV) located at a predefined elevation. The controller is also configured to count the quantity of the fruits and determine the quality of the fruits present in each of the segments of the field through the captured real-time images of the fruits and surveyed images of the fruits using deep learning techniques. The controller is also configured to generate a direction signal to guide the unmanned aerial vehicle (UAV) to move to one of the other segments of the field based on the current geo-coordinate of the unmanned aerial vehicle (UAV). The controller is also configured to extract features of the fruits present in the field using filters, wherein the quality of a fruit is detected upon the extraction of its boundary, and the extraction of edges determines the presence of the fruits in the field.
[009] Embodiments in accordance with the present invention provide a method for detecting and counting fruits in a field. The method comprises the step of flying an unmanned aerial vehicle (UAV) at a predefined location above a ground level. The method further comprises the step of capturing real-time images of fruits present in predefined segments of a field using a camera. The method comprises the step of monitoring the real-time location of the unmanned aerial vehicle (UAV) by employing a location detector. The method comprises the step of communicating with the unmanned aerial vehicle (UAV) through a controller. The method further includes the step of receiving the captured real-time images of one of the segments of the field from the camera and the geo-coordinates of the unmanned aerial vehicle (UAV) located at a predefined elevation. The method also includes the step of counting the quantity of the fruits and determining the quality of the fruits present in each of the segments of the field through the captured real-time images of the fruits and surveyed images of the fruits using deep learning techniques. The method comprises the step of generating a direction signal to guide the unmanned aerial vehicle (UAV) to move to one of the other segments of the field based on the current geo-coordinate of the unmanned aerial vehicle (UAV). The method comprises the step of extracting features of the fruits present in the field using filters, wherein the quality of a fruit is detected upon the extraction of its boundary, and the extraction of edges determines the presence of the fruits in the field.
[0010] These and other advantages will be apparent from the present application of the embodiments described herein.
[0011] The preceding is a simplified summary to provide an understanding of some embodiments of the present invention. This summary is neither an extensive nor exhaustive overview of the present invention and its various embodiments. The summary presents selected concepts of the embodiments of the present invention in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other embodiments of the present invention are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] The above and still further features and advantages of embodiments of the present invention will become apparent upon consideration of the following detailed description of embodiments thereof, especially when taken in conjunction with the accompanying drawings, and wherein:
[0013] FIG. 1 illustrates a block diagram of a fruit detection and counting system, according to embodiments of the present invention disclosed herein; and
[0014] FIG. 2 illustrates a flowchart of a method for detecting and counting fruits in a field, according to embodiments of the present invention disclosed herein.
[0015] The headings used herein are for organizational purposes only and are not meant to be used to limit the scope of the description or the claims. As used throughout this application, the word "may" is used in a permissive sense (i.e., meaning having the potential to), rather than the mandatory sense (i.e., meaning must). Similarly, the words "include", "including", and "includes" mean including but not limited to. To facilitate understanding, like reference numerals have been used, where possible, to designate like elements common to the figures. Optional portions of the figures may be illustrated using dashed or dotted lines, unless the context of usage indicates otherwise.
DETAILED DESCRIPTION
[0016] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments, but the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood, that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention as defined in the claims.
[0017] In any embodiment described herein, the open-ended terms "comprising," "comprises," and the like (which are synonymous with "including," "having" and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of," "consists essentially of," and the like, or the respective closed phrases "consisting of," "consists of," and the like.
[0018] As used herein, the singular forms "a", "an", and "the" designate both the singular and the plural, unless expressly stated to designate the singular only.
[0019] FIG. 1 illustrates a block diagram of a fruit detection and counting system 100, according to embodiments of the present invention. The fruit detection and counting system 100 comprises an unmanned aerial vehicle (UAV) 102 and a housing unit 104, according to embodiments of the present invention. The unmanned aerial vehicle (UAV) 102 may comprise a camera 108, a location detector 110, and so forth. According to embodiments of the present invention, the housing unit 104 may comprise a controller 112. Further, the unmanned aerial vehicle (UAV) 102 and the housing unit 104 may be connected through a communication network 106, according to embodiments of the present invention.
[0020] The communication network 106 may include a data network such as, but not limited to, an Internet, a Local Area Network (LAN), a Wide Area Network (WAN), a Metropolitan Area Network (MAN), etc. In some embodiments of the present invention, the communication network 106 may include a wireless network, such as, but not limited to, a cellular network and may employ various technologies including an Enhanced Data Rates for Global Evolution (EDGE), a General Packet Radio Service (GPRS), and so forth. In some embodiments of the present invention, the communication network 106 may include or otherwise cover networks or sub-networks, each of which may include, for example, a wired or a wireless data pathway. According to an embodiment of the present invention, the unmanned aerial vehicle (UAV) 102 and the housing unit 104 may be configured to communicate with each other by one or more communication mediums connected to the communication network 106. The communication mediums include, but are not limited to, a coaxial cable, a copper wire, a fiber optic, a wire that comprise a system bus coupled to a processor of a computing device, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the communication mediums, including known, related art, and/or later developed technologies.
[0021] According to an embodiment of the present invention, the unmanned aerial vehicle (UAV) 102 may fly at a predefined location above a ground level. In an embodiment of the present invention, the predefined location of the UAV may be a pre-planned path at a fixed location above the ground level for capturing images of the field. Further, when the unmanned aerial vehicle (UAV) 102 reaches the predefined elevation, the same altitude is maintained to capture the field image; the field is divided into a number of squares, and a zoomed-in view of each of the squares may be created. According to embodiments of the present invention, the predefined elevation may be an elevation defined while capturing images during a survey of the field.
[0022] In an embodiment of the present invention, the zoomed-in view of each square may be marked with circles, each carrying a unique ID, and a table may be generated to track the circles. Furthermore, a super-imposed circle may be drawn over the original circles to cover any segment of the field not covered by an original circle, and each super-imposed circle may be given its own unique ID, on top of the original circle, for tracking.
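The square-grid, circle, and unique-ID bookkeeping described above can be sketched as follows; the grid step, the ID scheme, and the placement of super-imposed circles at grid intersections are illustrative assumptions, not details fixed by the specification.

```python
import math
from dataclasses import dataclass

@dataclass
class Circle:
    uid: str      # unique ID used to track the circle in the table
    cx: float     # center x, in field coordinates (metres, assumed)
    cy: float     # center y
    r: float      # radius

def plan_circles(field_w, field_h, r):
    """Divide the field into a square grid of side 2r, inscribe one circle
    per square, then add super-imposed circles at the interior grid
    intersections to cover the corner gaps the original circles miss."""
    circles = []
    step = 2 * r
    rows = math.ceil(field_h / step)
    cols = math.ceil(field_w / step)
    # original circles, one per square, IDs prefixed "C"
    for i in range(rows):
        for j in range(cols):
            circles.append(Circle(f"C-{i}-{j}", j * step + r, i * step + r, r))
    # super-imposed circles at interior intersections, IDs prefixed "S"
    for i in range(rows - 1):
        for j in range(cols - 1):
            circles.append(Circle(f"S-{i}-{j}", (j + 1) * step, (i + 1) * step, r))
    return circles

# a 20 m x 10 m field covered by circles of radius 2.5 m (illustrative)
plan = plan_circles(20.0, 10.0, 2.5)
```

Inscribed circles necessarily leave uncovered corner regions inside each square; a second layer of circles centred on the grid intersections is one simple way to cover them, which is the role the super-imposed circles play above.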
[0023] The unmanned aerial vehicle (UAV) 102 may comprise the camera 108, the location detector 110, and so forth. In an embodiment of the present invention, the camera 108 may capture real time images of fruits present in predefined segments of a field. According to embodiments of the present invention, the camera 108 may be configured for capturing a stream of images of each segment of the field in real-time.
[0024] In another embodiment of the present invention, the camera 108 may be configured for capturing a video of the field. In an embodiment of the present invention, the camera 108 may comprise an in-built GPS allowing the camera 108 to perform automatic geotagging.
[0025] According to embodiments of the present invention, the location detector 110 may monitor the real-time location of the unmanned aerial vehicle (UAV) 102. According to embodiments of the present invention, the location detector 110 may be such as, but not limited to, a location sensor, a GPS module, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of location detector, including known, related art, and/or later developed technologies.
[0026] According to embodiments of the present invention, the housing unit 104 may be a structure, a part of structure or a space that may be used for storing other components of the fruit detection and counting system 100. In one embodiment of the present invention, the housing unit 104 may act as enclosure for components of the fruit detection and counting system 100. In preferred embodiments of the present invention, the housing unit 104 may comprise the controller 112.
[0027] In an embodiment of the present invention, the housing unit 104 may be attached with the unmanned aerial vehicle (UAV) 102 having the controller 112 for communicating with the unmanned aerial vehicle (UAV) 102. The controller 112 may be configured to communicate with the components of fruit detection and counting system 100 using the communication network 106, according to embodiments of the present invention. Furthermore, the controller 112 may be configured to receive and transmit data associated with the fruit detection and counting system 100, in an embodiment of the present invention.
[0028] According to embodiments of the present invention, the controller 112 may be, but not limited to, a Programmable Logic Control unit (PLC), a microcontroller, a microprocessor, a computing device, a development board, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the controller 112 including known, related art, and/or later developed technologies that may be capable of processing the received data.
[0029] According to embodiments of the present invention, the controller 112 may be configured to receive the captured real time images from one of the segments of the field from the camera 108 and geo-coordinates of the unmanned aerial vehicle (UAV) 102 located at a predefined elevation. The controller 112 may be configured to count the quantity of the fruits and determine the quality of the fruits present in the each of the segments of the field through the captured real time images of the fruits and surveyed images of the fruit using deep learning techniques.
[0030] Further, the controller 112, configured to detect and count the fruits, may be configured to divide the field into imaginary squares using the deep learning techniques. According to embodiments of the present invention, the deep learning techniques are selected from a group comprising a Convolutional Neural Network (CNN) technique, a mask regional convolutional neural network (Mask R-CNN) technique, a faster regional convolutional neural network (Faster R-CNN) technique, and so forth. Embodiments are intended to include or otherwise cover any type of deep learning technique, including known, related art, and/or later developed technologies.
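The specification does not fix a detector interface. As a minimal sketch, counting fruit instances from the output of a Mask R-CNN or Faster R-CNN style detector, assumed here to be a list of (label, confidence, bounding-box) tuples, could look like this; the label names, threshold, and sample detections are hypothetical.

```python
def count_fruits(detections, conf_threshold=0.5):
    """Count fruit instances from detector output.

    `detections` is assumed to be a list of (label, confidence, bbox)
    tuples, the typical shape of a Mask R-CNN / Faster R-CNN head's
    post-processed output (an illustrative interface, not the patent's).
    Only confident fruit detections contribute to the count."""
    return sum(1 for label, conf, _bbox in detections
               if label == "fruit" and conf >= conf_threshold)

# synthetic detector output for one captured segment image
sample = [("fruit", 0.92, (10, 10, 40, 40)),
          ("fruit", 0.35, (60, 12, 90, 44)),   # low confidence, ignored
          ("leaf",  0.88, (5, 50, 25, 70)),    # not a fruit
          ("fruit", 0.77, (100, 30, 130, 60))]
```

Lowering the threshold trades missed fruits for false positives, so in practice it would be tuned against the surveyed images the patent mentions.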
[0031] The controller 112 may guide the unmanned aerial vehicle (UAV) (102) to the centroid of each of the imaginary squares and generate an activation signal to capture the image of the fruits beneath the field of vision of the camera 108.
[0032] Further, the controller 112 may be configured to tabulate the circles by geotagging each circle. The controller 112 may guide the unmanned aerial vehicle (UAV) 102 to another location that superimposes the previously captured circles and generate the activation signal to capture the image of the fruits beneath the field of vision of the camera 108. In an embodiment of the present invention, the captured real-time image may be compared with the surveyed image to determine the portion of the land area of the field covered by the unmanned aerial vehicle (UAV) 102.
[0033] According to embodiments of the present invention, the controller 112 may be configured to generate a direction signal to guide the unmanned aerial vehicle (UAV) 102 to move to the one of the other segments of the field based on the current geo-coordinate of the unmanned aerial vehicle (UAV) 102.
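One plausible form for such a direction signal is the initial compass bearing from the UAV's current geo-coordinate to the centroid of the next segment. The great-circle initial-bearing formula below is standard; the waypoint coordinates are purely illustrative.

```python
import math

def direction_signal(current, target):
    """Compass bearing in degrees clockwise from north, from the UAV's
    current (lat, lon) to the next segment's centroid (lat, lon),
    using the standard great-circle initial-bearing formula."""
    lat1, lon1 = map(math.radians, current)
    lat2, lon2 = map(math.radians, target)
    dlon = lon2 - lon1
    x = math.sin(dlon) * math.cos(lat2)
    y = (math.cos(lat1) * math.sin(lat2)
         - math.sin(lat1) * math.cos(lat2) * math.cos(dlon))
    return (math.degrees(math.atan2(x, y)) + 360.0) % 360.0

# a segment centroid due north of the UAV (illustrative coordinates)
bearing = direction_signal((12.9716, 77.5946), (12.9720, 77.5946))
```

A controller would convert this bearing, together with the distance to the waypoint, into the heading command actually sent to the UAV's flight stack.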
[0034] The controller 112 may be configured to extract features of the fruits present in the field using filters, wherein the quality of a fruit is detected upon the extraction of its boundary, and the extraction of edges determines the presence of the fruits in the field. Embodiments are intended to include or otherwise cover any type of filter, including known, related art, and/or later developed technologies.
[0035] According to embodiments of the present invention, the filters may be selected from a group such as, but not limited to, a Kalman filter, a Gabor filter, a Sharpening filter, a Binomial blur filter, a Gradient X filter, a Gradient Y filter, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the filters including known, related art, and/or later developed technologies.
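As a minimal illustration of how a Gradient X style filter extracts edges, the sketch below applies a Sobel-X kernel with a tiny pure-Python "valid" convolution to a synthetic patch in which dark leaves (0) meet bright fruit (9); the strong responses mark the fruit boundary. The patch values and kernel choice are illustrative.

```python
def convolve2d(img, kernel):
    """Minimal 'valid'-mode 2D correlation over nested lists, standing in
    for the convolution layers / filters named in the specification."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(img) - kh + 1):
        row = []
        for j in range(len(img[0]) - kw + 1):
            row.append(sum(img[i + u][j + v] * kernel[u][v]
                           for u in range(kh) for v in range(kw)))
        out.append(row)
    return out

# Gradient X (Sobel) kernel: responds to vertical edges such as a fruit boundary
SOBEL_X = [[-1, 0, 1],
           [-2, 0, 2],
           [-1, 0, 1]]

# synthetic 5x5 patch: dark leaves (0) on the left, bright fruit (9) on the right
patch = [[0, 0, 9, 9, 9]] * 5
edges = convolve2d(patch, SOBEL_X)
```

The response is large (36) at the leaf/fruit transition and zero inside the uniform fruit region, which is exactly the boundary-versus-interior distinction the edge-extraction step relies on.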
[0036] In an exemplary scenario, the division of the field into squares may be based on how well the UAV 102 is able to capture an area of the segment. Once an appropriate elevation is found, the UAV 102 may be held at that altitude. The UAV 102 may be moved, using GPS coordinates, to a location from which the camera 108 may capture the required area of the field. The captured real-time image is then compared with the surveyed image to determine which portion of the land area has been covered.
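The comparison of captured images against the surveyed image can be reduced, in a highly simplified sketch, to set arithmetic over grid cells; representing the survey and each camera footprint as sets of (row, col) cell indices is an assumption made purely for illustration.

```python
def coverage_fraction(surveyed_cells, captured_footprints):
    """Fraction of the surveyed field grid already covered by captured
    images. `surveyed_cells` is the set of (row, col) cells from the
    survey; `captured_footprints` is a list of cell sets, one per shot."""
    covered = surveyed_cells & set().union(*captured_footprints)
    return len(covered) / len(surveyed_cells)

# a 4x4-cell surveyed field and two captured 2x2 footprints (illustrative)
field = {(r, c) for r in range(4) for c in range(4)}
shots = [{(0, 0), (0, 1), (1, 0), (1, 1)},
         {(2, 2), (2, 3), (3, 2), (3, 3)}]
frac = coverage_fraction(field, shots)
```

When the fraction reaches 1.0 every surveyed cell has been photographed, which is the condition under which the ground station could stop directing the UAV to new segments.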
[0037] Further, the camera 108 may be of high resolution so that the camera 108 may capture the images at an acceptable resolution. Moreover, the elevation of the UAV 102 may be chosen based on the best image captured at that elevation. The location of the UAV 102 may be based on the coverage area of the camera 108; a higher-quality camera 108 yields higher resolution and wider coverage. Geotagging of the location may be done using the camera 108. An initial survey of the field may pinpoint the GPS locations where the UAV 102 has to take the images.
[0038] Each of the original circles may be given a unique ID and a table may be created to track these circles. At the given elevation, the unmanned aerial vehicle (UAV) 102 may cover a distance of, for example, 1 m, and for this 1 m the circles may be generated. In an embodiment of the present invention, the fruits may be identified and counted by identifying and counting all the fruits which are within the circles. Further, unique IDs are assigned to the circles which are superimposed on top of the existing circles. Also, the areas of the field which are not intersected either by the original circles and the super-imposed circles, or by super-imposed circles and other super-imposed circles, may be checked; in the chord areas of the field which satisfy this condition the fruits may be counted. Furthermore, a list of pre-determined coordinates, based on a survey conducted at different elevations of the UAV 102, may be maintained to determine the best locations at which to capture images. Once the UAV 102 reaches a pre-determined coordinate, an image may be captured and sent to the ground station. Once the ground station signals that it has received the image, the unmanned aerial vehicle (UAV) 102 may be signaled to move to the next pre-determined location.
[0039] In another exemplary scenario, partially visible fruits that may be covered beneath the leaves may be identified. The fruit detection may proceed when the fruit region is light green/yellow and the leaves are dark green. Further, the boundary of the fruit may be extracted. Feature extraction occurs in the convolution layers of the deep learning model using different filters, and edge extraction may determine the presence of fruits. Detection of the type of fruit and identification of the quality of the fruit, including whether or not it is ripened, may be achieved using the fruit detection and counting system 100. The detection of a fruit, or of its quality, may be based on the color and size of the fruit by using Convolutional Neural Network techniques such as Mask R-CNN and Faster R-CNN.
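The light green/yellow fruit versus dark green leaf rule can be sketched as a per-pixel hue-and-brightness threshold; the hue band and brightness cutoff below are illustrative, uncalibrated values, not figures from the specification.

```python
import colorsys

def is_fruit_pixel(r, g, b):
    """Classify an RGB pixel (0-255 channels) as fruit rather than leaf.

    The specification notes the fruit region is light green/yellow while
    the leaves are dark green, so we accept a yellow-to-light-green hue
    band and require the pixel to be bright enough to exclude dark
    foliage. Thresholds are illustrative assumptions."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    hue_deg = h * 360.0
    return 40.0 <= hue_deg <= 100.0 and v >= 0.5

# synthetic samples: bright yellow fruit, light-green fruit, dark green leaf
yellow_fruit = is_fruit_pixel(230, 220, 40)
green_fruit = is_fruit_pixel(150, 220, 90)
dark_leaf = is_fruit_pixel(20, 80, 25)
```

A real pipeline would apply this mask before (or alongside) the CNN stage to suppress leaf pixels and make the partially hidden fruit regions easier to segment.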
[0040] In an embodiment of the present invention, the UAV 102 may fly a pre-planned path at a fixed location above the ground level to capture images. The UAV 102 may hover at the centroid of each square, the square corresponding to the field of view of the camera 108, which may be attached to a scanner. A super-imposed circle may cover the left-out parts of the original circles. Further, the ground station coordinates the movement of the UAV 102 and creates the table by filling in its entries.
[0041] FIG. 2 illustrates a flowchart of a method 200 for detecting and counting fruits in a field, according to embodiments of the present invention.
[0042] At step 202, the fruit detection and counting system 100 may fly the unmanned aerial vehicle (UAV) 102 at a predefined location above a ground level.
[0043] At step 204, the fruit detection and counting system 100 may capture real-time images of fruits present in predefined segments of a field using the camera 108. Further, the camera 108 may comprise an in-built GPS allowing the camera 108 to perform automatic geotagging.
[0044] At step 206, the fruit detection and counting system 100 may monitor the real-time location of the unmanned aerial vehicle (UAV) 102 by employing the location detector 110.
[0045] At step 208, the fruit detection and counting system 100 may communicate with the unmanned aerial vehicle (UAV) 102 through the controller 112 associated with the housing unit 104.
[0046] At step 210, the fruit detection and counting system 100 may receive the captured real time images from one of the segments of the field from the camera 108 and geo-coordinates of the unmanned aerial vehicle (UAV) 102 located at a predefined elevation.
[0047] At step 212, the fruit detection and counting system 100 may count the quantity of the fruits and determine the quality of the fruits present in each of the segments of the field through the captured real time images of the fruits and surveyed images of the fruits using deep learning techniques. According to an embodiment of the present invention, the deep learning techniques are selected from a group comprising a Convolutional Neural Network technique, a mask regional convolutional neural network (Mask R-CNN) technique, a faster regional convolutional neural network (Faster R-CNN) technique, and so forth. Embodiments are intended to include or otherwise cover any type of the deep learning techniques, including known, related art, and/or later developed technologies.
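Detectors in the Mask R-CNN / Faster R-CNN family return, per image, a list of instances with a class label and a confidence score; the counting step then reduces to filtering and tallying those instances. The sketch below covers only that post-processing; the label names, score values, and threshold are illustrative assumptions:

```python
from collections import Counter

def count_fruits(detections, threshold=0.5):
    """Count detected fruit instances per class above a confidence threshold.

    `detections` is a list of {"label": str, "score": float} dicts, the
    shape of output typically produced by an R-CNN-style detector.
    """
    counts = Counter()
    for det in detections:
        if det["score"] >= threshold:
            counts[det["label"]] += 1
    return dict(counts)

detections = [
    {"label": "mango", "score": 0.91},
    {"label": "mango", "score": 0.42},   # below threshold, ignored
    {"label": "banana", "score": 0.77},
]
print(count_fruits(detections))  # {'mango': 1, 'banana': 1}
```

Per-segment totals would be accumulated across all geotagged circles, with the superimposed-circle bookkeeping avoiding double counting at overlaps.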
[0048] At step 214, the fruit detection and counting system 100 may generate a direction signal to guide the unmanned aerial vehicle (UAV) 102 to move to the one of the other segments of the field based on the current geo-coordinate of the unmanned aerial vehicle (UAV) 102.
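A direction signal of the kind described at step 214 can be sketched as a bearing and distance from the UAV's current coordinate to the next segment's centroid. The flat-earth approximation and the function name are assumptions, reasonable at field scale but not taken from this disclosure:

```python
import math

def direction_signal(current, target):
    """Return (bearing_degrees, distance) from the UAV's current position
    to the next waypoint, using local planar coordinates (x east, y north).

    A flat-earth approximation is adequate for field-scale distances.
    """
    dx = target[0] - current[0]
    dy = target[1] - current[1]
    # atan2(dx, dy) measures the angle clockwise from north (+y).
    bearing = math.degrees(math.atan2(dx, dy)) % 360
    return bearing, math.hypot(dx, dy)

bearing, dist = direction_signal((0.0, 0.0), (3.0, 4.0))
print(round(bearing, 1), dist)  # 36.9 5.0
```

The controller would emit this pair (or the equivalent velocity command) each time a segment is completed.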
[0049] At step 216, the fruit detection and counting system 100 may extract features of the fruits present in the field using filters. In an embodiment of the present invention, the quality of the fruit is detected upon extraction of a boundary of the fruit, and extraction of edges determines the presence of the fruits in the field. According to embodiments of the present invention, the filters are selected from a group such as, but not limited to, a Kalman filter, a Gabor filter, a sharpening filter, a binomial blur filter, a gradient-X filter, a gradient-Y filter, and so forth. Embodiments are intended to include or otherwise cover any type of the filters, including known, related art, and/or later developed technologies.
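The gradient-X filtering mentioned above amounts to sliding a small kernel over the image, which is exactly what the "convolution" layers of a deep network compute (technically cross-correlation, since the kernel is not flipped). A minimal sketch with an illustrative Sobel gradient-X kernel and a synthetic step edge:

```python
import numpy as np

def filter2d(img, kernel):
    """Naive 'valid' 2-D cross-correlation, as computed by deep-learning
    'convolution' layers (the kernel is not flipped)."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

# Sobel gradient-X kernel: responds to vertical boundaries, such as the
# edge between a bright fruit region and darker foliage.
sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)

# A step edge: dark "leaves" on the left, bright "fruit" on the right.
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 0, 1, 1]], dtype=float)
print(filter2d(img, sobel_x))  # [[4. 4.]] - nonzero response across the edge
```

A gradient-Y filter is the transpose of `sobel_x`; Gabor and blur kernels plug into the same `filter2d` call.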
[0050] While the invention has been described in connection with what is presently considered to be the most practical and various embodiments, it is to be understood that the invention is not to be limited to the disclosed embodiments, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.
[0051] This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined in the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.

Claims (10)

1. A fruit detection and counting system, the system (100) comprising:
an unmanned aerial vehicle (UAV) (102) flying at a predefined location above a ground level, wherein the unmanned aerial vehicle (UAV) (102) including:
a camera (108) to capture real time images of fruits present in predefined segments of a field; and
a location detector (110) to monitor the real-time location of the unmanned aerial vehicle (UAV) (102);
a housing unit (104) attached with the unmanned aerial vehicle (UAV) (102) having a controller (112) for communicating with the unmanned aerial vehicle (UAV) (102), wherein the controller (112) is configured to:
receive the captured real time images from one of the segments of the field from the camera (108) and geo-coordinates of the unmanned aerial vehicle (UAV) (102) located at a predefined elevation;
count the quantity of the fruits and determine the quality of the fruits present in each of the segments of the field through the captured real time images of the fruits and surveyed images of the fruits using deep learning techniques;
generate a direction signal to guide the unmanned aerial vehicle (UAV) (102) to move to the one of the other segments of the field based on the current geo-coordinate of the unmanned aerial vehicle (UAV) (102); and
extract features of the fruits present in the field using filters, wherein the quality of the fruit is detected upon extraction of a boundary of the fruit, and extraction of edges determines the presence of the fruits in the field.
2. The system as claimed in claim 1, wherein the controller (112) is configured to:
divide the field into imaginary squares using the deep learning techniques; guide the unmanned aerial vehicle (UAV) (102) to the centroid of each of the imaginary squares and generate an activation signal to capture the image of the fruits beneath the field of vision of the camera (108); tabulate the circles by geotagging each circle; and guide the unmanned aerial vehicle (UAV) (102) to another location that superimposes the previously captured circles and generate the activation signal to capture the image of the fruits beneath the field of vision of the camera (108).
3. The system as claimed in claim 1, wherein when the unmanned aerial vehicle (UAV) (102) reaches the predefined elevation, the altitude is held at the same level to capture the field image, the field is divided into a number of squares, and a zoomed-in view of each of the squares is created.
4. The system as claimed in claim 1, wherein the zoom in view of the squares is marked in circles with a unique ID and a table is generated to track the circles.
5. The system as claimed in claim 1, wherein a super-imposed circle is drawn upon the original circle to cover the segments of the field that are not covered by the original circle, and the super-imposed circle is given a unique ID on top of the original circle for tracking.
6. The system as claimed in claim 1, wherein the camera (108) comprises an in-built GPS for allowing the camera (108) automatic geotagging.
7. The system as claimed in claim 1, wherein the deep learning techniques are selected from a group comprising a Convolution Neural Network technique, a mask regional convolutional neural network (R-CNN) technique, a faster regional convolutional neural network (R-CNN) technique, and/or a combination thereof.
8. The system as claimed in claim 1, wherein the filters are selected from a group such as, but not limited to, a Kalman filter, a Gabor filter, a Sharpening filter, a Binomial blur filter, a Gradient X filter, a Gradient Y filter, and/or a combination thereof.
9. The system as claimed in claim 1, wherein the captured real time image is compared with the surveyed image to determine a portion of the land area of the field covered by the unmanned aerial vehicle (UAV) (102).
10. A method for detecting and counting fruits in a field, the method comprising the steps of:
flying at a predefined location above a ground level an unmanned aerial vehicle (UAV) (102);
capturing real time images of fruits present in predefined segments of a field using a camera (108);
monitoring the real-time location of the unmanned aerial vehicle (UAV) (102) by employing a location detector (110);
communicating with the unmanned aerial vehicle (UAV) (102) through a controller (112) associated with a housing unit (104);
receiving the captured real time images from one of the segments of the field from the camera (108) and geo-coordinates of the unmanned aerial vehicle (UAV) (102) located at a predefined elevation;
counting the quantity of the fruits and determining the quality of the fruits present in each of the segments of the field through the captured real time images of the fruits and surveyed images of the fruits using deep learning techniques;
generating a direction signal to guide the unmanned aerial vehicle (UAV) (102) to move to the one of the other segments of the field based on the current geo-coordinate of the unmanned aerial vehicle (UAV) (102); and
extracting features of the fruits present in the field using filters, wherein the quality of the fruit is detected upon extraction of a boundary of the fruit, and extraction of edges determines the presence of the fruits in the field.
AU2021105629A 2021-08-17 2021-08-17 System and Method for Monitoring, Detecting and Counting Fruits in a Field Ceased AU2021105629A4 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2021105629A AU2021105629A4 (en) 2021-08-17 2021-08-17 System and Method for Monitoring, Detecting and Counting Fruits in a Field


Publications (1)

Publication Number Publication Date
AU2021105629A4 true AU2021105629A4 (en) 2021-11-25

Family

ID=78610525

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2021105629A Ceased AU2021105629A4 (en) 2021-08-17 2021-08-17 System and Method for Monitoring, Detecting and Counting Fruits in a Field

Country Status (1)

Country Link
AU (1) AU2021105629A4 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115019216A (en) * 2022-08-09 2022-09-06 江西师范大学 Real-time ground object detection and positioning counting method, system and computer



Legal Events

Date Code Title Description
FGI Letters patent sealed or granted (innovation patent)
MK22 Patent ceased section 143a(d), or expired - non payment of renewal fee or expiry