IL291800B2 - An aerial-based spectral system for the detection of red palm weevil infestation in palm trees, and a method thereof - Google Patents
- Publication number
- IL291800B2 · IL291800A · IL29180022A
- Authority
- IL
- Israel
- Prior art keywords
- spectral
- palm
- canopy
- tree
- trees
- Prior art date
Links
- 241001133760 Acoelorraphe Species 0.000 title claims description 64
- 238000000034 method Methods 0.000 title claims description 48
- 241001078693 Rhynchophorus ferrugineus Species 0.000 title claims description 44
- 230000003595 spectral effect Effects 0.000 title claims description 44
- 238000001514 detection method Methods 0.000 title claims description 23
- 206010061217 Infestation Diseases 0.000 title claims description 21
- 238000001228 spectrum Methods 0.000 claims description 18
- 238000010801 machine learning Methods 0.000 claims description 17
- 238000000701 chemical imaging Methods 0.000 claims description 12
- 238000003384 imaging method Methods 0.000 claims description 11
- 238000005259 measurement Methods 0.000 claims description 9
- 239000002420 orchard Substances 0.000 claims description 9
- 230000036541 health Effects 0.000 claims description 6
- 230000007613 environmental effect Effects 0.000 claims description 4
- 230000003862 health status Effects 0.000 claims description 4
- 208000015181 infectious disease Diseases 0.000 claims description 4
- 241000233788 Arecaceae Species 0.000 claims description 3
- 238000013459 approach Methods 0.000 claims description 3
- 238000012545 processing Methods 0.000 claims description 3
- 230000008859 change Effects 0.000 claims description 2
- 238000003909 pattern recognition Methods 0.000 claims description 2
- 238000010183 spectrum analysis Methods 0.000 claims description 2
- 230000002123 temporal effect Effects 0.000 claims description 2
- 238000012549 training Methods 0.000 claims description 2
- 241000607479 Yersinia pestis Species 0.000 description 7
- 241000233805 Phoenix Species 0.000 description 5
- 235000010659 Phoenix dactylifera Nutrition 0.000 description 5
- 230000006378 damage Effects 0.000 description 5
- 239000000575 pesticide Substances 0.000 description 5
- 241000254171 Curculionidae Species 0.000 description 3
- 238000004458 analytical method Methods 0.000 description 3
- 238000003491 array Methods 0.000 description 3
- 238000005516 engineering process Methods 0.000 description 3
- 239000003016 pheromone Substances 0.000 description 2
- 241000282472 Canis lupus familiaris Species 0.000 description 1
- 244000060011 Cocos nucifera Species 0.000 description 1
- 235000013162 Cocos nucifera Nutrition 0.000 description 1
- 241000254173 Coleoptera Species 0.000 description 1
- 241000196324 Embryophyta Species 0.000 description 1
- 241000287219 Serinus canaria Species 0.000 description 1
- XUIMIQQOPSSXEZ-UHFFFAOYSA-N Silicon Chemical compound [Si] XUIMIQQOPSSXEZ-UHFFFAOYSA-N 0.000 description 1
- XAGFODPZIPBFFR-UHFFFAOYSA-N aluminium Chemical compound [Al] XAGFODPZIPBFFR-UHFFFAOYSA-N 0.000 description 1
- 229910052782 aluminium Inorganic materials 0.000 description 1
- 239000004411 aluminium Substances 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 238000004891 communication Methods 0.000 description 1
- 230000000295 complement effect Effects 0.000 description 1
- 238000011156 evaluation Methods 0.000 description 1
- 238000002474 experimental method Methods 0.000 description 1
- 239000000835 fiber Substances 0.000 description 1
- 239000002223 garnet Substances 0.000 description 1
- YWTYJOPNNQFBPC-UHFFFAOYSA-N imidacloprid Chemical compound [O-][N+](=O)\N=C1/NCCN1CC1=CC=C(Cl)N=C1 YWTYJOPNNQFBPC-UHFFFAOYSA-N 0.000 description 1
- 238000007689 inspection Methods 0.000 description 1
- 230000010354 integration Effects 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 238000013507 mapping Methods 0.000 description 1
- 239000000203 mixture Substances 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 230000003449 preventive effect Effects 0.000 description 1
- 239000004065 semiconductor Substances 0.000 description 1
- 230000035945 sensitivity Effects 0.000 description 1
- 230000009528 severe injury Effects 0.000 description 1
- 229910052710 silicon Inorganic materials 0.000 description 1
- 239000010703 silicon Substances 0.000 description 1
- 239000007787 solid Substances 0.000 description 1
- 238000005507 spraying Methods 0.000 description 1
- 230000007480 spreading Effects 0.000 description 1
- 230000009885 systemic effect Effects 0.000 description 1
- 238000011282 treatment Methods 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
- 238000011179 visual inspection Methods 0.000 description 1
- 229910052727 yttrium Inorganic materials 0.000 description 1
- VWQVUPCCIRVNHF-UHFFFAOYSA-N yttrium atom Chemical compound [Y] VWQVUPCCIRVNHF-UHFFFAOYSA-N 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/188—Vegetation
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01B—SOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
- A01B79/00—Methods for working soil
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01B—SOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
- A01B79/00—Methods for working soil
- A01B79/005—Precision agriculture
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01N—PRESERVATION OF BODIES OF HUMANS OR ANIMALS OR PLANTS OR PARTS THEREOF; BIOCIDES, e.g. AS DISINFECTANTS, AS PESTICIDES OR AS HERBICIDES; PEST REPELLANTS OR ATTRACTANTS; PLANT GROWTH REGULATORS
- A01N63/00—Biocides, pest repellants or attractants, or plant growth regulators containing microorganisms, viruses, microbial fungi, animals or substances produced by, or obtained from, microorganisms, viruses, microbial fungi or animals, e.g. enzymes or fermentates
- A01N63/10—Animals; Substances produced thereby or obtained therefrom
- A01N63/14—Insects
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0088—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
- G05D1/0251—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/29—Geographical information databases
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/217—Validation; Performance evaluation; Active pattern learning techniques
- G06F18/2178—Validation; Performance evaluation; Active pattern learning techniques based on feedback of a supervisor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/17—Terrestrial scenes taken from planes or by drones
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/64—Three-dimensional objects
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/30—UAVs specially adapted for particular uses or applications for imaging, photography or videography
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
- G06T2207/10036—Multispectral image; Hyperspectral image
Description
TITLE AN AERIAL-BASED SPECTRAL SYSTEM FOR THE DETECTION OF RED PALM WEEVIL INFESTATION IN PALM TREES, AND A METHOD THEREOF.
FIELD OF THE INVENTION The invention is in the field of integrating remote sensing, image processing, and machine learning technologies for pest control in palm orchards.
BACKGROUND OF THE INVENTION The red palm weevil (Rhynchophorus ferrugineus, RPW) is a serious pest of some 20 palm species and is considered one of the most aggressive pests of date cultivation in the world. The weevil originated in tropical Asia and invaded Israel at the end of the 20th century. Since then, it has progressively spread across the country. The RPW is a beetle of the palm weevil family that completes its life cycle deep within the trunk. The larvae hatch directly into the palm tissue and feed on the inner parts of the stem, encased in the cocoon they produce from the palm fibers, and emerge into the cavity created in the depth of the stem, or into the external environment, in order to find a new host palm. The hidden lifestyle of the weevil, i.e., completing a life cycle within the tree trunk, poses challenges to growers and researchers due to the difficulty of recognizing its presence. Infested trees are difficult to locate, and pesticide treatments are complex, requiring transport of the pesticides into the depth of the trunk. Examining the effectiveness of treatments is also difficult and requires physical dissection of the trunk, leading to its destruction. Fig. 1 shows the damage that infestation by the RPW (1b) can cause to a tree (1a).
Existing estimates place the worldwide damage from the palm weevil at a total of hundreds of millions of dollars. For example, by 2013 the damage attributed to the weevil in France, Spain and Italy alone amounted to more than million euros, and the economic damage is expected to double by 2023 (MacLeod, 2013).
Pest control is currently carried out with pesticides such as Confidor and Karta-Max, which are mainly applied by volumetric spraying or by spreading a vertex several times a year. Analysis of infestation data from the orchards of Kibbutz Ein Hanatziv between 2013 and 2017 indicates that the pest population is increasing, despite the use of various plant protection methods, such as pheromone traps and preventive pesticides. In light of this, the need to develop advanced technological means for early detection at high sensitivity thresholds, and to integrate them into the systemic pesticide regime, is more pressing than ever.
Khawaja Ghulam Rasool et al. (Evaluation of some non-invasive approaches for the detection of red palm weevil infestation) disclose that the RPW causes severe damage to date palm trees, leading to the death of trees if not detected and treated in time. A major obstacle in RPW control is the difficulty of identifying an early-stage infestation. In their study, the efficacy of several non-invasive devices, including cameras (digital and thermal), the TreeRadarUnit™ (TRU) (Radar 2000, Radar 900), a resistograph, and a magnetic DNA biosensor, was evaluated for detecting RPW infestation in date palm trees under field conditions at Riyadh, Saudi Arabia. The date palm trees used in these experiments were selected based on visual observations. After inspection of the date palm trees with the different devices, each tree was taken down and dissected in detail to validate the accuracy of each device.
Maged E.A. Mohammed, Hamadttu A.F. El-Shafie and Mohammed R. Alhajhoj (Recent Trends in the Early Detection of the Invasive Red Palm Weevil, Rhynchophorus ferrugineus (Olivier)) disclose that the RPW is one of the most invasive pest species and poses a serious threat to date palm and coconut palm cultivation, as well as to the ornamental Canary Island palm. An infested palm shows visible signs only when the infestation is more advanced, at which point rescuing the palm is far more complicated. Early detection is a useful tool for successfully eradicating and controlling the RPW. Until now, early detection techniques for the RPW have relied mainly on visual inspection and pheromone trapping. Several methods to detect RPW infestation have recently emerged, including remote sensing, highly sensitive microphones, thermal sensors, drones, acoustic sensors, and sniffer dogs. The main objective of that chapter is to provide an overview of modern methods for early detection of the RPW and to discuss the most important field-applicable RPW detection technologies.
There exists a long-felt need for a method of detecting red palm weevil infestations in the field.
SUMMARY It is the object of the present invention to present a method for identifying palm trees infected with red palm weevil (RPW), characterized by steps of: a. Obtaining a 3-dimensional image of the canopy of a palm tree from above; b. Identifying specific sections of the tree canopy; c. Obtaining a spectral image of the tree from above; d. Identifying the spectral image of the specific section of the tree canopy; and e. Comparing the spectral images to identify the infection status of the tree; the comparison being characterized as being spatial and temporal; wherein the specific section of the tree canopy is characterized as the ‘lulav’.
It is another object of the present invention to present the method, as presented in any of the above, wherein the 3-dimensional image is generated using LiDAR.
It is another object of the present invention to present the method, as presented in any of the above, wherein the step of identifying the ‘lulav’ section of the palm tree canopy is conducted by steps selected from a group consisting of machine learning, pattern recognition, calculating the centroid, calculating the center of a circumscribed circle, and calculating the center of the convex hull.
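Two of the geometric techniques named above can be sketched briefly. The following is an illustrative sketch only, not taken from the patent: it locates a candidate canopy center from a set of 2-D canopy points via the centroid and via the center of the convex hull (computed here with Andrew's monotone-chain algorithm); the point set and function names are hypothetical.

```python
import numpy as np

def canopy_centroid(points):
    """Mean of all canopy points (the centroid)."""
    return np.asarray(points, dtype=float).mean(axis=0)

def _cross(o, a, b):
    """z-component of (a - o) x (b - o); positive means a left turn."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    """Andrew's monotone chain: hull vertices of a 2-D point set."""
    pts = sorted(map(tuple, points))
    def half(seq):
        out = []
        for p in seq:
            while len(out) >= 2 and _cross(out[-2], out[-1], p) <= 0:
                out.pop()
            out.append(p)
        return out
    lower, upper = half(pts), half(reversed(pts))
    return lower[:-1] + upper[:-1]  # endpoints shared between the two chains

def convex_hull_center(points):
    """Mean of the convex-hull vertices of the canopy outline."""
    return np.asarray(convex_hull(points), dtype=float).mean(axis=0)
```

For a roughly symmetric crown the two estimates coincide; they diverge when the canopy outline is lopsided, which is why the claim lists several alternatives.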
It is another object of the present invention to present the method, as presented in any of the above, wherein the spectral image is obtained in predetermined visible, near infrared (NIR), and thermal infrared spectral bands.
It is another object of the present invention to present the method, as presented in any of the above, wherein the spectral image is obtained at a wavelength range of 400 to 12000 nm.
It is another object of the present invention to present the method, as presented in any of the above, wherein the spectrum is in the range of 8000 to 12000 nm.
It is another object of the present invention to present the method, as presented in any of the above, wherein the spectrum is in the range of 400 to 1200 nm.
It is another object of the present invention to present the method, as presented in any of the above, additionally comprising a step of obtaining spectral images of additional trees.
It is another object of the present invention to present the method, as presented in any of the above, additionally comprising a step of comparing the spectral images of separate trees.
It is another object of the present invention to present the method, as presented in any of the above, additionally comprising a step of obtaining spectral images of the tree at a second time point.
It is another object of the present invention to present the method, as presented in any of the above, additionally comprising a step of comparing the spectral images of the tree at different time points.
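The temporal comparison above can be reduced to a minimal sketch: the mean spectral index of the same canopy section at two acquisitions, flagging a tree whose index dropped sharply between visits. This is illustrative only and not from the patent; the 15% threshold and function names are hypothetical.

```python
import numpy as np

def temporal_drop(index_t0, index_t1):
    """Relative decrease of the mean spectral index between two acquisitions."""
    m0, m1 = float(np.mean(index_t0)), float(np.mean(index_t1))
    return (m0 - m1) / m0

def flag_tree(index_t0, index_t1, threshold=0.15):
    """True if the index fell by more than the threshold fraction (hypothetical cutoff)."""
    return temporal_drop(index_t0, index_t1) > threshold
```

A tree whose canopy-section index falls from 0.70 to 0.50 between flights (a ~29% drop) would be flagged; a drift from 0.70 to 0.68 would not.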
It is another object of the present invention to present the method, as presented in any of the above, additionally comprising a step of analyzing tree health.
It is another object of the present invention to present the method, as presented in any of the above, additionally comprising a step of generating a photographic mosaic from the individual images captured by the spectral imaging devices.
It is the object of the present invention to present a system for non-invasive early detection of red palm weevil (RPW) infestation, comprising: a. An aerial photography system/vehicle, configured to position spectral imaging devices and allow aerial image capture of palm tree orchards, the system comprising: i. at least one spectral imaging device; ii. at least one system for obtaining a 3-D image, configured to capture the shape of the ground and the palm trees; b. a computer-readable medium (CRM), running image processing software configured to produce a photographic mosaic from the individual images captured by the spectral imaging devices; c. a computer-readable medium (CRM), running image processing software configured to identify the relevant section of the palm tree canopy; and d. a processor configured to compare the spectral analysis of the relevant section of the palm tree canopy and provide a score indicative of the palm tree's health; wherein the relevant section of the palm tree canopy is characterized as the ‘lulav’.
It is another object of the present invention to present the system, as presented in any of the above, wherein the imaging device is configured to capture the images in predetermined visible, near infrared (NIR), and thermal infrared spectral bands.
It is another object of the present invention to present the system, as presented in any of the above, wherein the imaging device is configured to capture the images at a range of 400 to 12000 nm.
It is another object of the present invention to present the system, as presented in any of the above, wherein the spectrum is in the range of 8000 to 12000 nm.
It is another object of the present invention to present the system, as presented in any of the above, wherein the spectrum is in the range of 400 to 1200 nm.
It is another object of the present invention to present the system, as presented in any of the above, wherein the system for obtaining a 3-D image is a LiDAR.
It is another object of the present invention to present the system, as presented in any of the above, wherein the aerial system is characterized as unmanned or manned.
It is another object of the present invention to present the system, as presented in any of the above, wherein the aerial system is a drone, an airplane, a helicopter, or a quadcopter.
It is another object of the present invention to present the system, as presented in any of the above, additionally comprising a computer-readable medium (CRM), running image processing software configured to detect the palm trees' health status by means of spectral vegetation indices.
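As an illustration of a spectral vegetation index (the patent does not name a specific one), the following sketch computes the widely used NDVI per pixel from the NIR and red bands; persistently low or declining values over a canopy section may indicate stress. The reflectance values are made-up examples.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - red) / (NIR + red)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)  # eps guards against division by zero

# Healthy vegetation reflects strongly in NIR relative to red:
healthy = ndvi(nir=0.50, red=0.08)   # ~0.72
stressed = ndvi(nir=0.30, red=0.15)  # ~0.33
```

The same function applies unchanged to whole band arrays, yielding a per-pixel index map that can be averaged over the identified canopy section.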
It is another object of the present invention to present the system, as presented in any of the above, wherein the image processing software identifies the relevant section of the palm tree canopy by using a technique selected from a group consisting of machine learning, calculating the centroid, calculating the center of a circumscribed circle, and calculating the center of the convex hull.
It is the object of the present invention to present an unmanned aircraft system (UAS) for early detection of red palm weevil (RPW) infestation, comprising: a. An unmanned aerial vehicle (UAV), comprising: i. at least one spectral imaging device; ii. at least one system for obtaining a 3-D image, configured to capture the shape of the ground and the palm tree canopy; b. A control station/unit, configured for controlling the UAV; and c. A data link/database, comprising image processing software, the software configured to: i. produce a photographic mosaic from the individual images captured by the spectral imaging devices; ii. identify the relevant section of the palm tree canopy; and iii. detect the palm trees' health status by means of spectral vegetation indices; wherein the relevant section of the palm tree canopy is characterized as the ‘lulav’.
It is another object of the present invention to present the system, as presented in any of the above, wherein the image processing software identifies the relevant section of the palm tree canopy by using a technique selected from a group consisting of machine learning, calculating the centroid, calculating the center of a circumscribed circle, and calculating the center of the convex hull.
It is another object of the present invention to present the system, as presented in any of the above, wherein the imaging device is configured to capture the images in predetermined visible, near infrared (NIR), and thermal infrared spectral bands.
It is another object of the present invention to present the system, as presented in any of the above, wherein the imaging device is configured to capture the images at a range of 400 to 12000 nm.
It is another object of the present invention to present the system, as presented in any of the above, wherein the spectrum is in the range of 8000 to 12000 nm.
It is another object of the present invention to present the system, as presented in any of the above, wherein the spectrum is in the range of 400 to 1200 nm.
It is another object of the present invention to present the system, as presented in any of the above, wherein the system for obtaining a 3-D image is configured to identify the shape and position of palm trees.
It is another object of the present invention to present the system, as presented in any of the above, wherein the system for obtaining a 3-D image is a LiDAR.
It is another object of the present invention to present the system, as presented in any of the above, wherein the aerial system is a drone, an airplane, a helicopter, or a quadcopter.
It is the object of the present invention to present a method for non-invasive early detection of RPW infestation, comprising steps of: a. obtaining a photographic mosaic of a palm tree from an image processing skimmer; b. identifying relevant sections of a canopy of the palm trees; c. obtaining aerial spectral images in a predetermined wavelength/spectrum; d. identifying the spectral data of the relevant section; e. comparing the spectral data to a machine learning database of palm tree data; and f. providing a score indicative of the palm tree's health; wherein the section of the palm tree canopy is identified by using a technique selected from a group consisting of machine learning, calculating the centroid, calculating the center of a circumscribed circle, and calculating the center of the convex hull.
It is the object of the present invention to present a machine learning tool used for non-invasive early detection of RPW infestation, wherein the machine learning is performed in a supervised learning approach, based on: a. multiple aerial spectral images in a predetermined spectrum of palm tree orchards, captured at time intervals over different environmental conditions; b. an algorithm for separating the palm trees from the surrounding vegetation; c. a method of identifying relevant sections of the tree canopy; d. an algorithm for identifying a change in spectral indices over time in the palm trees; and e. an algorithm for identifying the canopy of palm trees affected by RPW infestation, acoustic measurements, and physiological measurements.
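The supervised-learning step above can be illustrated with a deliberately minimal classifier: per-tree spectral feature vectors labelled healthy or infested, classified by the nearest class centroid. This is a hypothetical sketch, not the patent's model; the feature values (e.g. mean index per band) and labels are invented for illustration.

```python
import numpy as np

def fit_centroids(X, y):
    """Supervised fit: per-class mean of the spectral feature vectors."""
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    return {label: X[y == label].mean(axis=0) for label in np.unique(y)}

def predict(centroids, x):
    """Label of the nearest class centroid (Euclidean distance)."""
    x = np.asarray(x, dtype=float)
    return min(centroids, key=lambda lbl: np.linalg.norm(x - centroids[lbl]))

# Hypothetical training data: [mean vegetation index, mean thermal index] per tree.
X = [[0.75, 0.20], [0.72, 0.25], [0.40, 0.60], [0.35, 0.65]]
y = ["healthy", "healthy", "infested", "infested"]
model = fit_centroids(X, y)
```

In practice the patent's approach would train on many trees imaged under different environmental conditions, with labels supported by acoustic and physiological ground truth, but the fit/predict structure is the same.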
It is the object of the present invention to present a method of training a machine learning tool for detection of RPW infestation, comprising steps of obtaining: a. multiple aerial spectral images in a predetermined spectrum of palm tree orchards, captured at time intervals over different environmental conditions; b. an algorithm for separating the palm trees from the surrounding vegetation; c. a method of identifying sections of the tree canopy; d. acoustic measurements; and e. physiological measurements.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention wherein: Figure 1(a, b) – shows a palm tree damaged by infection and the RPW.
Figure 2 – presents an aerial (drone) RGB image of the palm tree canopy, with ground clutter removed.
Figure 3 – presents the identification of the ‘Lulav’ area of the tree.
Figure 4 – presents the identification of the center of the tree.
Figure 5 – presents the spectral data from healthy and infected trees. Figure 5(a, b) – Demonstrates the application of the spectral system of the present invention. Figure 6 – demonstrates analysis of a Medjool orchard.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT The following description is provided, alongside all chapters of the present invention, so as to enable any person skilled in the art to make use of the invention, and sets forth the best modes contemplated by the inventor of carrying out this invention. Various modifications, however, will remain apparent to those skilled in the art, since the generic principles of the present invention have been defined specifically to provide compositions and methods.
In this application, the term light detection and ranging, or laser imaging, detection, and ranging (LiDAR), sometimes called 3-D laser scanning, refers to the combination of 3-D scanning and laser scanning.
LiDAR systems have a number of major components:
- Laser, with 600–1000 nm lasers being the most common for non-scientific applications. Alternatively, 1550 nm lasers are often used in the presence of people, as they are eye-safe at relatively high power. Airborne topographic mapping LiDARs generally use a 1064 nm diode-pumped yttrium aluminium garnet (YAG) laser.
- Phased arrays, configured to control a microscopic array of individual antennas. By controlling the timing (phase) of each antenna, the system steers a cohesive signal in a specific direction.
- Microelectromechanical mirrors (MEMS), which steer the beam by tilting the same mirror to different angles.
- Scanner and optics.
- Photodetector and receiver electronics, with two main photodetector technologies used: solid-state photodetectors (such as silicon avalanche photodiodes) and photomultipliers.
- Position and navigation systems, to enable determination of the absolute position and orientation of the sensor. The system often includes a Global Positioning System (GPS) receiver and an inertial measurement unit (IMU).
- Sensor, that detects the energy reflected from an object. By recording the time between transmission and detection, and using the speed of light, it is possible to calculate the distance to the object(s).
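The range calculation mentioned in the last component reduces to one line: the pulse travels out and back, so the one-way distance is half the round-trip time multiplied by the speed of light. A minimal sketch (the function name is illustrative):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_s):
    """Range to the target from the pulse's round-trip time; /2 for out-and-back."""
    return C * round_trip_s / 2.0

# A return detected 200 ns after emission corresponds to a target ~30 m away,
# a plausible drone-to-canopy distance.
```
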
Flash LiDAR is conducted by illuminating the entire field of view with a wide diverging laser beam in a single pulse. Flash LiDAR enables 3-D imaging due to the camera's ability to emit a larger flash and sense the spatial relationships and dimensions of the area of interest from the returned/reflected energy. Imaging LiDAR can also be performed using arrays of high-speed detectors and modulation-sensitive detector arrays that are typically built on single chips using complementary metal–oxide–semiconductor (CMOS) and hybrid CMOS/charge-coupled device (CCD) fabrication techniques. High-resolution 3-D LiDAR cameras often use homodyne detection with an electronic CCD or CMOS shutter.
In the present application, the term unmanned aerial vehicle or system (UAV or UAS) refers to an aircraft that can be operated without a crew (such as a pilot). A UAV can also be referred to as a drone. The UAV can be fixed-winged or wingless, tailed or tailless. In many embodiments, the drone is configured as a quadcopter. In many embodiments, the UAV comprises numerous sensing systems, such as position, height, altitude, speed, Global Positioning System (GPS), stability, etc. An unmanned aerial system (UAS) refers to a system comprising a UAV, in addition to a (ground-based) controller and systems for communication with the UAV. The UAV can be remotely controlled by a human operator, in which case it is often referred to as a remotely piloted aircraft (RPA), or can operate with various degrees of autonomy, such as autopilot assistance.
Unless otherwise stated, with reference to numerical quantities, the term "about" refers to a tolerance of ±25% of the stated nominal value. Unless otherwise stated, all numerical ranges are inclusive of the stated limits of the range.
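The "about" convention defined above can be made concrete with a short sketch: a ±25% tolerance band around a stated nominal value, with inclusive limits. The function names and example values are illustrative, not part of the patent text.

```python
def about_range(nominal: float, tolerance: float = 0.25):
    """Inclusive (low, high) bounds for 'about <nominal>' at +/-25%."""
    return nominal * (1 - tolerance), nominal * (1 + tolerance)

def is_about(value: float, nominal: float, tolerance: float = 0.25) -> bool:
    """True when value falls inside the inclusive tolerance band."""
    low, high = about_range(nominal, tolerance)
    return low <= value <= high

# "about 400 nm" spans 300-500 nm, limits included:
print(about_range(400))
print(is_about(500, 400))
```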
Claims (35)
1. A method for identifying palm trees infected with red palm weevil (RPW), characterized by steps of: a. obtaining a 3-dimensional image of the canopy of a palm tree from above; b. identifying specific sections of said tree canopy; c. obtaining a spectral image of said tree from above; d. identifying the spectral image of said specific section of said tree canopy; and e. comparing said spectral images to identify the infection status of said tree; said comparison being characterized as being spatial and temporal; wherein said specific sections of said tree canopy are characterized as the ‘LULAV’.
2. The method of claim 1, wherein said 3-dimensional image is generated using LiDAR.
3. The method of claim 1, wherein said step of identifying said ‘LULAV’ section of said palm tree's canopy is conducted by steps selected from a group consisting of machine learning, pattern recognition, calculating the centroid, calculating the center of the circumscribed circle, and calculating the center of the convex hull.
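Two of the claimed canopy-center techniques can be sketched in a few lines: the centroid of the detected canopy points, and the center of their convex hull (computed here with Andrew's monotone-chain algorithm). The sample points and function names are hypothetical; the patent does not specify an implementation.

```python
def centroid(points):
    """Arithmetic mean of (x, y) canopy points."""
    n = len(points)
    return (sum(x for x, _ in points) / n, sum(y for _, y in points) / n)

def convex_hull(points):
    """Convex-hull vertices, counter-clockwise (monotone chain)."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def hull_center(points):
    """Mean of the convex-hull vertices -- ignores interior outliers."""
    return centroid(convex_hull(points))

# Square canopy outline plus one interior point: the raw centroid is
# pulled toward the interior point, while the hull center is not.
pts = [(0, 0), (2, 0), (2, 2), (0, 2), (0.5, 0.5)]
print(centroid(pts))
print(hull_center(pts))
```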
4. The method of claim 1, wherein said spectral image is obtained in predetermined visible, near infrared (NIR), and thermal infrared spectral bands.
5. The method of claim 1, wherein said spectral image is obtained at a wavelength range of 400 to 12000 nm.
6. The method of claim 5, wherein said spectrum is in a range of 8000 to 12000 nm.
7. The method of claim 5, wherein said spectrum is in a range of 400 to 1200 nm.
8. The method of claim 1, additionally comprising a step of obtaining spectral images of additional trees.
9. The method of claim 8, additionally comprising a step of comparing said spectral images of separate trees.
10. The method of claim 1, additionally comprising a step of obtaining spectral images of said tree at a second time point.
11. The method of claim 10, additionally comprising a step of comparing said spectral images of said tree at different time points.
12. The method of claim 1, additionally comprising a step of analyzing tree health.
13. The method of claim 1, additionally comprising a step of generating a photographic mosaic from said individual images captured by said spectral imaging devices.
14. A system for a non-invasive early detection of red palm weevil (RPW) infestation comprising: a. An aerial photography system/vehicle, configured to position spectral imaging devices and allow aerial image capture of palm tree orchards, said system comprising: i. at least one spectral imaging device; ii. at least one system for obtaining a 3-D image, configured to capture the shape of the ground and palm trees; b. a computer-readable media (CRM), running an image processing software, said software configured to produce a photographic mosaic from said individual images captured by said spectral imaging devices; c. a computer-readable media (CRM), running an image processing software configured to identify the relevant section of said palm trees canopy; and d. a processor configured to compare spectral analysis of said relevant section of the palm trees canopy and provide a score indicative of said palm tree's health; wherein said relevant section of the palm trees canopy is characterized as the ‘LULAV’.
15. The system of claim 14, wherein said imaging device is configured to capture said images in predetermined visible, near infrared (NIR), and thermal infrared spectral bands.
16. The system of claim 14, wherein said imaging device is configured to capture said images at a range of 400 to 12000 nm.
17. The system of claim 16, wherein said spectrum is in a range of 8000 to 12000 nm.
18. The system of claim 16, wherein said spectrum is in a range of 400 to 1200 nm.
19. The system of claim 14, wherein said system for obtaining a 3-D image is a LiDAR.
20. The system of claim 14, wherein said aerial system is characterized as unmanned or manned.
21. The system of claim 14, wherein said aerial system is a drone, an airplane, a helicopter or a quadcopter.
22. The system of claim 14, additionally comprising a computer-readable media (CRM), running an image processing software configured to detect the palm trees' health status by means of spectral vegetation indices.
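A spectral vegetation index of the kind this claim refers to can be sketched with NDVI, the normalized difference of near-infrared and red reflectance. The sample reflectance values are illustrative assumptions; the patent does not name a specific index or threshold.

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index, in [-1, 1].

    Healthy vegetation reflects strongly in NIR and absorbs red,
    so higher NDVI generally indicates a more vigorous canopy.
    """
    return (nir - red) / (nir + red)

# Illustrative canopy reflectances: a vigorous canopy versus a weaker,
# possibly stressed one.
print(ndvi(nir=0.50, red=0.08))
print(ndvi(nir=0.30, red=0.15))
```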
23. The system of claim 14, wherein said image processing software identifies said relevant section of the palm trees canopy using a technique selected from a group consisting of machine learning, calculating the centroid, calculating the center of the circumscribed circle, and calculating the center of the convex hull.
24. An unmanned aircraft system (UAS) for early detection of red palm weevil (RPW) infestation, comprising: a. An unmanned aerial vehicle (UAV), comprising: i. at least one spectral imaging device; ii. at least one system for obtaining a 3-D image, configured to capture the shape of the ground and palm tree canopy; b. A control station/unit, configured for controlling said UAV; and c. A data link/database, comprising image processing software, said software configured to: i. produce a photographic mosaic from said individual images captured by said spectral imaging devices; ii. identify the relevant section of said palm trees canopy; and iii. detect the palm trees' health status by means of spectral vegetation indices; wherein said relevant section of the palm trees canopy is characterized as the ‘LULAV’.
25. The system of claim 24, wherein said image processing software identifies said relevant section of the palm trees canopy using a technique selected from a group consisting of machine learning, calculating the centroid, calculating the center of the circumscribed circle, and calculating the center of the convex hull.
26. The system of claim 24, wherein said imaging device is configured to capture said images in predetermined visible, near infrared (NIR), and thermal infrared spectral bands.
27. The system of claim 24, wherein said imaging device is configured to capture said images at a range of 400 to 12000 nm.
28. The system of claim 27, wherein said spectrum is in a range of 8000 to 12000 nm.
29. The system of claim 27, wherein said spectrum is in a range of 400 to 1200 nm.
30. The system of claim 24, wherein said system for obtaining a 3-D image is configured to identify the shape and position of palm trees.
31. The system of claim 24, wherein said system for obtaining a 3-D image is a LiDAR.
32. The system of claim 24, wherein said aerial system is a drone, an airplane, a helicopter or a quadcopter.
33. A method for a non-invasive early detection of RPW infestation comprising steps of: a. obtaining a photographic mosaic of a palm tree from an image processing skimmer; b. identifying relevant sections of a canopy of said palm trees; c. obtaining aerial spectral images in a predetermined wavelength/spectrum; d. identifying the spectral data of said relevant section; e. comparing said spectral data to a machine learning database of palm tree data; and f. providing a score indicative of said palm tree's health; wherein said section of the palm trees canopy is identified using a technique selected from a group consisting of machine learning, calculating the centroid, calculating the center of the circumscribed circle, and calculating the center of the convex hull, and wherein said specific sections of said tree canopy are characterized as the ‘LULAV’.
34. A machine learning tool used for a non-invasive early detection of RPW infestation, wherein said machine learning is performed in a supervised learning approach, based on: a. multiple aerial spectral images in a predetermined spectrum of palm tree orchards, captured at time intervals over different environmental conditions; b. an algorithm for separating palm trees and surrounding vegetation; c. a method of identifying relevant sections of the tree canopy; d. an algorithm for identifying a change in spectral indices over time in palm trees; e. an algorithm for identifying canopies of palm trees affected by RPW infestation; f. acoustic measurements; and g. physiological measurements; wherein said specific sections of said tree canopy are characterized as the ‘LULAV’.
35. A method of training a machine learning tool for detection of RPW infestation, comprising steps of obtaining: a. multiple aerial spectral images in a predetermined spectrum of palm tree orchards, captured at time intervals over different environmental conditions; b. an algorithm for separating palm trees and surrounding vegetation; c. a method of identifying the ‘LULAV’ sections of the tree canopy; d. acoustic measurements; and e. physiological measurements.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
IL291800A IL291800B2 (en) | 2022-03-29 | 2022-03-29 | An aerial-based spectral system for the detection of red palm weevil infestation in palm trees, and a method thereof |
PCT/IL2023/050296 WO2023187776A1 (en) | 2022-03-29 | 2023-03-21 | An aerial-based spectral system for the detection of red palm weevil infestation in palm trees, and a method thereof |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
IL291800A IL291800B2 (en) | 2022-03-29 | 2022-03-29 | An aerial-based spectral system for the detection of red palm weevil infestation in palm trees, and a method thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
IL291800A IL291800A (en) | 2022-12-01 |
IL291800B2 true IL291800B2 (en) | 2023-04-01 |
Family
ID=84272334
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
IL291800A IL291800B2 (en) | 2022-03-29 | 2022-03-29 | An aerial-based spectral system for the detection of red palm weevil infestation in palm trees, and a method thereof |
Country Status (2)
Country | Link |
---|---|
IL (1) | IL291800B2 (en) |
WO (1) | WO2023187776A1 (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018154260A1 (en) * | 2017-02-22 | 2018-08-30 | Wareosh Mustafa | Maintenance of a plurality of plants |
US20200029488A1 (en) * | 2018-07-26 | 2020-01-30 | Bear Flag Robotics, Inc. | Vehicle Controllers For Agricultural And Industrial Applications |
US20200200683A1 (en) * | 2018-12-19 | 2020-06-25 | InnerPlant, Inc. | Sensor plant and method for identifying stressors in crops based on characteristics of sensor plants |
US20200364843A1 (en) * | 2019-05-17 | 2020-11-19 | Ceres Imaging, Inc. | Methods and systems for crop pest management utilizing geospatial images and microclimate data |
US20210209747A1 (en) * | 2020-01-06 | 2021-07-08 | The Texas A&M University System | Unmanned aerial system genotype analysis using machine learning routines |
US11074447B1 (en) * | 2018-07-13 | 2021-07-27 | Hana Resources, Inc. | Land analysis system using drone-captured data |
WO2021198731A1 (en) * | 2020-04-01 | 2021-10-07 | Sarabi Soroush | An artificial-intelligence-based method of agricultural and horticultural plants' physical characteristics and health diagnosing and development assessment. |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070244608A1 (en) * | 2006-04-13 | 2007-10-18 | Honeywell International Inc. | Ground control station for UAV |
US20100208981A1 (en) * | 2009-02-13 | 2010-08-19 | Harris Corporation | Method for visualization of point cloud data based on scene content |
US20210097691A1 (en) * | 2019-09-30 | 2021-04-01 | Nvidia Corporation | Image generation using one or more neural networks |
US20210209803A1 (en) * | 2020-01-06 | 2021-07-08 | Quantela Inc | Computer-based method and system for geo-spatial analysis |
2022
- 2022-03-29 IL IL291800A patent/IL291800B2/en unknown
2023
- 2023-03-21 WO PCT/IL2023/050296 patent/WO2023187776A1/en unknown
Also Published As
Publication number | Publication date |
---|---|
WO2023187776A1 (en) | 2023-10-05 |
IL291800A (en) | 2022-12-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Eismann et al. | Automated hyperspectral cueing for civilian search and rescue | |
US7417727B2 (en) | Method and apparatus for standoff detection of liveness | |
US20100013615A1 (en) | Obstacle detection having enhanced classification | |
JP6390054B2 (en) | Monitoring system | |
Malveaux et al. | Using drones in agriculture: unmanned aerial systems for agricultural remote sensing applications | |
Sarkar et al. | Towards autonomous phytopathology: Outcomes and challenges of citrus greening disease detection through close-range remote sensing | |
Cuaran et al. | Crop monitoring using unmanned aerial vehicles: A review | |
Murray et al. | Survey and insights into unmanned aerial-vehicle-based detection and documentation of clandestine graves and human remains | |
De Biasio et al. | UAV-based environmental monitoring using multi-spectral imaging | |
WO2014147043A1 (en) | Method for analysing a cultivated plot of farmland | |
WO2021062459A1 (en) | Weed mapping | |
Gonzalez et al. | Advances in unmanned aerial systems and payload technologies for precision agriculture | |
Jung et al. | Analysis of vegetation infection information using unmanned aerial vehicle with optical sensor | |
Singh et al. | Multi-temporal high resolution unmanned aerial vehicle (UAV) Multispectral imaging for menthol mint crop monitoring | |
Maslekar et al. | Application of unmanned aerial vehicles (UAVs) for pest surveillance, monitoring and management | |
IL291800B2 (en) | An aerial-based spectral system for the detection of red palm weevil infestation in palm trees, and a method thereof | |
EP3844715B1 (en) | Apparatus and method for identifying organisms | |
Keita et al. | Semantic segmentation based field detection using drones | |
Al-Mulla et al. | Use of drones and satellite images to assess the health of date palm trees | |
Da Silva et al. | Unimodal and Multimodal Perception for Forest Management: Review and Dataset. Computation 2021, 9, 127 | |
Sadenova et al. | Study of unmanned aerial vehicle sensors for practical remote application of earth sensing in agriculture | |
Rilling et al. | A multisensor platform for comprehensive detection of crop status: Results from two case studies | |
Kaasalainen et al. | Work in progress: combining indoor positioning and 3D point clouds from multispectral Lidar | |
Yang et al. | Evaluating airborne hyperspectral imagery for mapping waterhyacinth infestations | |
Lee et al. | Designing a Perception System for Safe Autonomous Operations in Agriculture |