GB2553631B - Data Processing of images of a crop
- Publication number: GB2553631B (application GB1709756.9A)
- Authority: GB (United Kingdom)
- Prior art keywords: image, crop, images, canopy, captured
- Legal status: Active
Classifications
- G06T7/90 — Determination of colour characteristics
- G06T7/001 — Industrial image inspection using an image reference approach
- G06T7/0004 — Industrial image inspection
- G06T7/45 — Analysis of texture based on statistical description of texture using co-occurrence matrix computation
- G06T7/60 — Analysis of geometric attributes
- G06T7/97 — Determining parameters from multiple pictures
- G06T2207/10016 — Video; Image sequence
- G06T2207/10024 — Color image
- G06T2207/20076 — Probabilistic image processing
- G06T2207/30188 — Vegetation; Agriculture
Description
The following terms are registered trade marks and should be read as such wherever they occur in this document:
LemnaTec (Page 2)
Raspberry Pi (Page 9)
Wi-Fi (Pages 10 & 18)
Data processing of images of a crop
Field of the invention
The present invention relates to data processing of images of a crop, in particular a cereal crop such as wheat, maize or rice, for use in image-based field phenotyping.
Background
Due to a narrowing range of available genetic diversity of modern crop germplasm and increasing fluctuations in weather caused by global climate change, research is being directed to searching for new sources of variation, such as landraces and wild relatives, to seek traits with greater yield potential, as well as environmental adaptation. This search requires robust measures of adaptive traits from many experimental plots throughout the growing season.
To increase yield and to improve crop adaptation to diverse environments sustainably, modern genetic and genomics technologies have been employed to enable an efficient selection of valuable lines with high yield, biotic and abiotic stress tolerance, and disease resistance. For example, quantitative trait locus (QTL) analysis and genome-wide association studies (GWAS) can be used to examine genetic architecture, genome sequencing can be employed to reveal gene content and diversity, and marker-assisted selection (MAS) or genomic selection (GS) can be used to accumulate favourable alleles. These approaches, however, are limited by low-throughput, laborious and inaccurate in-field phenotyping approaches. Phenotyping is widely recognised as forming a bottleneck that prevents researchers from linking the richness of genomic and genotypic information to important traits, and hence from using this information effectively in agriculture.
Agricultural practitioners, such as breeders, growers, farmers and crop scientists, have been seeking new approaches to relieving the bottleneck. For instance, non-invasive remote sensors and aerial imaging devices, such as unmanned aerial vehicles (UAVs) and blimps, are being used to study crop performance and field variability. Satellite imaging and tailored portable devices can be used to predict crop growth and yield potential based on canopy photosynthesis and normalised difference vegetation indices (NDVI). Large-scale imaging systems equipped with three-dimensional laser scanners and multispectral sensors have been used to try to automate plant monitoring for a fixed number of pots or plots, either in the greenhouse (such as Scanalyzer HTS/3D HT marketed by LemnaTec, Aachen, Germany) or in the field (for example, LeasyScan marketed by Phenospex, Heerlen, The Netherlands and Field Scanalyzer marketed by LemnaTec, Aachen, Germany). However, such systems suffer from being expensive and small-scale, and from providing a low frequency of measurements. The systems tend to have inadequate software analytical tools for use by agricultural practitioners to make sense of complicated phenotypic datasets. It is desirable, therefore, to be able to measure crop growth dynamically and to identify key adaptive traits in large numbers of experimental plots in different regions. Thus, there is a need to develop an affordable, reliable phenotyping platform that can be easily used and widely adopted in breeding pipelines and by crop research communities worldwide.
Summary
According to a first aspect of the present invention there is provided a method of processing images of a crop (which may be in a field, a part of a field, a plot, a plant pot, or a plant tray). The crop may be a cereal crop. The method comprises preparing a series of images of the crop, comprising receiving a series of captured images of the crop captured over a period of time and, for each image in the series of captured images, determining whether the image satisfies at least one image-quality requirement and, upon a positive determination, adding the image to the series of images to be processed. The method comprises retrieving the series of images of the crop to be processed and identifying, in an image (or “initial image”) selected from the series of images to be used as a reference image, a reference system against which other images can be compared, the reference system including an extent of a crop plot and/or one or more reference points. The method also comprises, for each of at least one other image in the series of images, calibrating or adjusting the image using the reference system, and determining a height of a canopy of the crop in the image, a main orientation of the crop and/or a value indicative of vegetative greenness (for example, a normalised green value in an RGB colour space and/or excessive greenness).
This can afford greater flexibility when monitoring a crop, particularly large numbers of crops, over periods of months. For example, the method can be used to process images of a crop which have been captured in the field and, thus, are subject to the vagaries of weather. Moreover, the method can be applied to each crop and, thus, allows large volumes of data to be processed for large numbers of crops.
The one or more reference points may include a plot region (which includes the crop plot and a region around the crop plot, e.g. a gap between adjacent crop plots), a canopy space, and/or at least one height marker (which may be a graduated ranging pole and/or a reference mark).
The method may further comprise identifying at least one reference marker in the reference image. The method may further comprise classifying pixels in the reference image into one or more groups corresponding to one or more respective object types. The method may further comprise, for each of at least one other image in the series of images, identifying corner-featured points in the crop plot in the image.
The method may further comprise preparing the series of images of the crop comprising receiving a series of captured images of the crop and, for each image in the series of captured images, determining whether the image satisfies at least one image-quality requirement and, upon a positive determination, adding the image to the series of images to be processed. The at least one image-quality requirement may include brightness of the image, size of the image file, sharpness of the image and/or the proportion of dark area in the image area.
Determining the height of the canopy may comprise detecting a visible part of a ranging pole. Determining the height of a canopy of the crop in the crop plot in the image may comprise calculating an entropy based on compactness of a texture of a crop canopy space, isotropy of the texture of the crop canopy space and distribution of intensity of the crop canopy space. Determining the height of the canopy may comprise measuring a weighted centroid of the crop canopy space. Determining the canopy height may comprise determining respective positions of corner-featured objects in the crop canopy space and calculating an average of those positions.
The method may further comprise, for the image series, generating dynamic growth curves defining a developmental profile for the crop, calculating stem rigidity and lodging risk based on the main orientation of the crop and/or calculating vegetation and senescence periods based on a series of the values indicative of vegetative greenness.
Vegetative greenness Gv(x,y) can be computed based on excessive greenness ExG(x,y) and excessive red ExR(x,y) indices. The vegetative greenness can be defined by:
Gv(x,y) = (2*fG(x,y) - fR(x,y) - fB(x,y)) - (1.4*fR(x,y) - fG(x,y))

excessive greenness can be defined by:

ExG(x,y) = 2*fG(x,y) - fR(x,y) - fB(x,y)

and excessive red can be defined by:

ExR(x,y) = 1.4*fR(x,y) - fG(x,y)

where fR(x,y) is the red channel of a colour image, fB(x,y) represents the blue channel, and fG(x,y) the green channel.
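As a minimal sketch of the indices defined above, assuming an H x W x 3 RGB array held in NumPy (the function and variable names are illustrative, not from the patent):

```python
import numpy as np

def greenness_indices(rgb):
    """Compute ExG, ExR and vegetative greenness Gv = ExG - ExR per pixel."""
    f = rgb.astype(np.float64)
    f_r, f_g, f_b = f[..., 0], f[..., 1], f[..., 2]
    exg = 2.0 * f_g - f_r - f_b   # excessive greenness
    exr = 1.4 * f_r - f_g         # excessive red
    gv = exg - exr                # vegetative greenness
    return exg, exr, gv
```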
According to a second aspect of the present invention there is provided a method of batch processing comprising processing images of at least two crops according to the first aspect of the present invention.
According to a third aspect of the present invention there is provided a computer program comprising instructions which, when executed by a data processing system, cause the data processing system to perform a method according to the first or second aspects of the present invention.
According to a fourth aspect of the present invention there is provided a computer program product comprising a computer-readable storage medium storing a computer program according to the third aspect of the present invention.
According to a fifth aspect of the present invention there is provided a system comprising a data processing system configured to perform the method according to the first or second aspects of the present invention.
The system may further comprise at least one terminal. The, or each, terminal may comprise a light-level sensor to measure a light level for controlling image capture settings, a camera for capturing images of a region of a growing crop based on the image capture settings, data storage for storing images captured by the camera, a wireless network interface for transferring data, including images captured by the camera, to a remotely-located image processing computer system, and an on-board computer system for controlling storage and transfer of captured images. The on-board computer system may be configured to determine whether an image-quality characteristic of a captured image satisfies a predetermined condition and, upon a negative determination, to discard the captured image such that only images satisfying the predetermined condition are transferred.
According to a sixth aspect of the present invention there is provided a terminal. The terminal comprises a light-level sensor to measure a light level for controlling image capture settings, a camera for capturing images of a region of a growing crop based on the image capture settings, data storage for storing images captured by the camera, a wireless network interface for transferring data, including images captured by the camera, to a remotely-located image processing computer system, and an on-board computer system for controlling storage and transfer of captured images. The on-board computer system is configured to determine whether an image-quality characteristic of a captured image satisfies a predetermined condition and, upon a negative determination, to discard the captured image such that only images satisfying the predetermined condition are transferred.
This can help reduce the amount of data transmitted and the number of images that need to be processed.
Brief Description of the Drawings
Certain embodiments of the present invention will now be described, by way of example, with reference to the accompanying drawings, in which:
Figure 1 is a schematic block diagram of an image capture and data processing system; Figure 2 illustrates a terminal used in a plant-breeding version of an image capturing system;
Figure 3 is a plan view of the terminal and crop shown in Figure 2;
Figure 4 illustrates the terminal shown in Figures 2 and 3 in more detail;
Figure 5 illustrates a terminal in the form of a cart used in a farming version of an image capturing system;
Figures 6a, 6b and 6c illustrate the cart of the farming version of the image capturing system in first, second and third configurations respectively;
Figure 7a is a front elevation of the cart shown in Figure 5 with soil augers deployed; Figure 7b is a plan view of the underside of the cart shown in Figure 5;
Figure 8 is a process flow diagram of a method of capturing images, and processing and quantifying crop growth patterns and adaptive traits;
Figure 9a illustrates plots of height for five near-isogenic lines (NILs) of wheat, thermal time, solar radiation, and rainfall against time;
Figure 9b is a plot of relative growth rate (RGR) against time for five NILs at different stages of growth;
Figures 9c and 9d are heat maps for first and second growth traits (namely RGR and canopy height) respectively, showing the relationship between environmental factors and the growth traits in relation to four key growth stages (GS32-69);
Figure 9e shows graphs of actual and estimated height against time for a combination of five NILs;
Figure 9f shows actual and predicted growth stages;
Figures 10a to 10e show graphs of actual and estimated height against time for five NILs and for a combination of the five NILs;
Figure 11 is a schematic block diagram of servers and software modules used in the image-based phenotyping system;
Figure 12 is a process flow diagram of a method of capturing images of a crop;
Figure 13 is a process flow diagram of a method of selecting images from a series of captured crop images;
Figure 14 is a process flow diagram of a method of defining coordinates of a reference system;
Figure 15 is an image of a crop, ranging pole and reference points;
Figure 16 is an image showing identification of reference points;
Figure 17 is an image showing computer-generated points and rectangles around the reference points;
Figure 17a illustrates pixel-metric conversion;
Figure 18 is an image of a crop, ranging pole and reference points;
Figure 19 is an image showing six different classes of objects;
Figure 20a is an image resulting from a first type of edge detection;
Figure 20b is an image resulting from a second type of edge detection;
Figure 20c is an image combining the first image shown in Figure 20a and the second image shown in Figure 20b;
Figure 21 is an image showing computer-identified markers;
Figure 22 is an image showing a constructed reference system;
Figure 23 is an image showing a crop canopy;
Figures 24a and 24b illustrate measuring visibility of a ranging pole to determine the height of the plot over a season;
Figure 25 illustrates measuring a canopy region using corner-featured detection over a season;
Figure 26 illustrates measuring lodging risk using pattern measurements over a season; Figures 27a and 27b illustrate calculation of entropy and skewness;
Figure 28 is an image where a ranging pole is partially covered;
Figure 29 is a process flow diagram of a method of tracking plots of interest based on initial reference positions;
Figure 30 illustrates a series of images showing tracking plots of interest;
Figure 31 illustrates a series of images showing marking of the tips of erect leaves; Figure 32 is a process flow diagram of a method of modelling interaction between the crop and environment; and
Figure 33 is a process flow diagram of a method of predictively modelling crop growth.
Detailed description of Certain Embodiments
Image-based phenotyping system 1
Referring to Figure 1, an image capturing and data processing system 1 is shown for use in field phenotyping. The system 1 can allow automatic monitoring of crop growth and development using low-cost, in-field terminal workstations 2 (or “terminals”). For a given area of a crop 3, a set of n terminals 2 may be used, where n may be one or at least two, for example, between two and 20. The crop 3 may be a cereal crop, such as wheat, maize or rice.
Each terminal 2 includes a computer system 4, preferably in the form of a single-board computer (such as a Raspberry Pi 2 or a Raspberry Pi 3), on-board storage 5, a camera module 6, a temperature and humidity sensor module 7, a light sensor module 8, a soil sensor module 9, a wireless network interface module 10 and wired network interface(s) 11. Terminals 2 may be connected to an in-field wireless local area network 12 and a terminal 2 may be connected to other terminals 2 by a wired peer-to-peer connection 13. The terminals 2 may provide nodes in a wireless mesh network. The computer system 4 includes at least one processor (not shown) and memory (not shown). A terminal 2 can be used to perform continuous monitoring using high-resolution (e.g. 5 megapixel), time-lapse photography, in-field evaluation and data transfer via real-time file sharing and data exchange servers. The camera module 6 can be used to capture, for example, between 1 and 6 images each hour. Preferably, three images per hour are captured, resulting in 48 images per day (assuming images are only captured during 16 hours of a 24-hour day).
The system 1 includes a central web server 14, a data processing system 15, preferably in the form of a high-performance computing (HPC) system, having associated storage 16 and a database storage 17 which receive image and environment data from the central web server 14. The system 1 may provide a web-based interface, via a network, to a remotely-located computer system 19 for monitoring workstation status and data.
Fixed and/or mobile computing devices 20, 21 can be used to access the system.
Referring to Figures 2 and 3, in a first, plant-breeding version of the system 1, the terminals 2 take the form of compact units fixedly mounted on posts 21 (or “poles”) and can be used by plant breeders to monitor a crop 3, such as wheat. Each terminal 2 is elevated to a height hp above ground level 22, typically between 50 and 100 cm and preferably 75 cm.
A ranging pole or rod 23, preferably having a series of alternating coloured bands or sections of equal height, is placed roughly at the centre of a sample plot 24. The ranging pole 23 extends to a height above ground level of preferably 1.2 m and is located a distance d1 from the terminal 2, preferably between 1.4 and 1.5 m. The sample plot takes the form of a square or rectangle having an area A, preferably about 1 m2. A white, rectangular reference point (or “fiducial mark”) 25 is inserted in the ground 26 between the terminal 2 and the ranging pole 23. The reference point 25 is located a distance d2 from the terminal 2, preferably between 1.0 and 1.1 m. A soil sensor 9 is inserted in the soil in or close to the sample plot 24. A cable (not shown) connects the terminal 2 and the soil sensor 9.
Referring to Figure 4, the terminal 2 comprises a weather-proof case 29 (herein referred to as a “housing” or “enclosure”) formed from acrylonitrile butadiene styrene (ABS) or other suitable material, having a generally rectangular prism shape and having a top 30, an underside 31, front 32, sides 33 and rear 34.
At the front 32 of the case 29, the terminal 2 has a camera hood 35 which shields a UV lens 36 disposed in front of the camera module 6. The camera module 6 preferably takes the form of an RGB camera module. The camera module 6 is connected to the single-board computer 4 via a cable 37 and on-board connector 38. An LED 39 is disposed in the top 30 of the case 29 and is connected to the single-board computer 4 via general-purpose input/output 40.
The single-board computer system 4 includes an integrated sensor board 41. The single-board computer system 4 includes an Ethernet connector 42 and USB connector 43 which are connected to respective RJ45 Ethernet and mini-USB sockets 44, 45 at the rear 34 of the case 29 and which provide wired network interfaces 11. The single-board computer system 4 is provided with a WiFi dongle which provides a wireless network interface module 10 and with USB storage which provides on-board storage 5. The USB storage 5 may have a capacity of 16 GB or more.
The single-board computer system 4 is powered via a (e.g. 12V/5V) voltage converter 46 which receives power from an external source, such as a (e.g. 12V) battery (not shown) which is trickle-charged by a solar panel (not shown), or via a (e.g. 5V/2A) power supply (not shown) via an external power and data connector 47. The soil sensor 9 (Figure 2) is connected to the single-board computer system 4 via the external power and data connector 47. An environmental sensor module 48 mounted on the top 30 of the case 29 is connected to the single-board computer system 4 via a data cable 49. The single-board computer 4 may be provided with a heat sink 50.
Referring to Figure 5 and also to Figures 6a to 6c, in a second, farming version of the system 1, the terminal 2 takes the form of a cart. The camera module 6 and, optionally, other modules are contained in a sensor housing 51 having a dome-shaped cap and which is mounted to a distal end of a telescopic pole 52 which extends upwardly from a cart enclosure 53 (or “cart body”) having a top 54 and underside 55, front 56, sides 57 and rear 58. The sensor housing 51 is elevated to a height htp above ground level 22, typically between 50 cm and 3 m, preferably 2.5 m. The reference point 25 is located a distance d2 from the terminal 2, preferably between 2 and 4 m, preferably 3 m.
Referring in particular to Figure 6a, in a first configuration, wheels 59 are deployed from the underside 55 of the cart enclosure 53 and the cart can be manoeuvred into position using handles 60 which are rotatably mounted to the sides 57 of the cart enclosure 53 close to the rear 58.
Referring in particular to Figure 6b, in a second configuration, the wheels 59 are withdrawn into the cart enclosure 53 and the handles 60 are folded back against the sides 57 of the cart enclosure 53 so that the cart rests on legs 61. A solar panel 62 pivotably mounted on the top 54 of the cart enclosure 53 is folded out. The sensor housing 51 can be raised from the cart enclosure 53.
Referring in particular to Figure 6c, in a third configuration, the telescopic pole 52 is fully extended.
Referring to Figure 7a, a set of four soil augers 63 can drill, from the underside 55 of the cart, into the soil so as to secure the cart in position.
Referring to Figure 7b, the telescopic pole 52 is mounted to a turntable 64, driven by a motor 65, which allows the pole 52 to be rotated.
Referring again to Figure 1, the system 1 can facilitate automatic crop phenotyping. Scientists and agricultural practitioners can access terminal workstations 2 remotely for real-time monitoring using, for example, a computer 19 (for example, located in an office or laboratory) or a tablet or smart phone 20 (for example, located in the field).
Users can inspect not only a whole field in different regions using a plurality of terminals 2, but can also control terminals 2 to review performance of crops, initiate new monitoring sessions or transfer on-board phenotypic and sensor datasets to external computing storage. If users are granted administrative access to the system 1, they can oversee its operations, where the status of every terminal 2 is constantly updated by the control system. Authorised users can inspect information such as online and offline status, operational mode, representative daily images, micro-environment (e.g. temperature/humidity in a plot region), and computational resources such as CPU and memory. The architecture of the control system supports the collation of phenotypic and sensor data for storage, visualisation, GUI-based systems interactions, and processing on the high-performance computing (HPC) system 14, 15, 16, for example, in the form of an SGI UV 2000 system having Intel (RTM) Xeon (RTM) cores.
An in-field weather station can be used to collect meteorological data including photosynthetically active solar radiation, rainfall, temperature, relative humidity and wind speed. Phenotypic and climate datasets are managed and saved in the data processing system 15 for durable data storage and centralised trait analysis.
Overview of data analysis pipeline
Referring to Figure 1, the terminals 2 are configured to take images at a rate of three per hour and at a resolution of 2592 x 1944 pixels so as to capture phenotypic plasticity, early expression of traits and crop-environment interactions. For example, over 200 GB of data may be generated by ten terminals 2 in a field season during a 95-day period. To extract meaningful results from the growth and developmental data effectively, analytic libraries, such as OpenCV, Scikit-learn and Scikit-image, can be used, together with automated bioimage informatics algorithms embedded in a high-throughput trait analysis pipeline.
Referring to Figure 8, an overview of the image-based phenotyping pipeline is shown. A terminal 2 captures images of the crop and transmits the images to the data processing system (step S0).
The data processing system, preferably a high-performance computing system 15, selects representative crop images according to their size, clarity, imaging dates and genotypes (step S1). Preferably only high-quality images are used for trait analysis, although all images (including low-quality images) can be stored.
The data processing system 15 defines reference positions of plots monitored during the experiment (step S2).
In the real-life agricultural and plant breeding situations, winds, rainfall, irrigation and crop spraying can result in camera movement which can cause issues when cross-referencing trait measures in a time-lapse image series.
To help ameliorate camera movement, the data processing system 15 identifies an initial reference position of a given plot and transforms every image in the series to the same position for comparison. For instance, the data processing system 15 detects coordinates of the white reference points 25 (Figure 2) and dark markers on the ranging pole 23 (Figure 2) to carry out colour feature selection. The data processing system 15 classifies pixels into different groups, such as crop canopy, wheel tracks, and plot regions, based on machine-learning methods, such as k-means and spectral clustering. The system 15 then establishes a pseudo reference system that records the plot area, the canopy space, height markers and the pixel-metric conversion.
The data processing system 15 uses the initial reference positions to perform in-depth trait analysis (step S3).
For a given genotype, the data processing system 15 employs an adaptive intensity and gamma equalisation to adjust colour and contrast to minimise colour distortion caused by variable in-field lighting. The system 15 then tracks geometric differences between the plot on a given image and the initial plot position. If different, a geometric transformation is applied to recalibrate the image, which removes areas outside the plot area. This can generate different sizes of black bars at the top of the given image. Within a given plot, the data processing system calculates the crop height by detecting the visible part of the ranging pole as well as the canopy region.
The data processing system 15 locates corner-featured points within the canopy region to generate pseudo points so as to locate the tips of erect leaves at stem elongation or jointing (i.e. growth stages 32 to 39 of the Zadoks growth scale), reflective surfaces of curving leaves and heads between booting and anthesis (i.e. growth stages 41 to 69) and corner-featured points on wheat ears and grains during senescence (i.e. growth stages 71 to 95).
In addition to crop growth patterns in relation to thermal time (degree day, °Cd), the system can include other dynamic measures of a number of traits in the pipeline (step S4). For example, vegetative greenness is calculated by separating the green channel in RGB images within plots of interest. The output, which lies in a range between 0 and 255, can be used to assess green biomass and stay-green (i.e. prolonged green leaf area duration through delayed leaf senescence). Morphological traits, such as the main orientation of a given plot, which lies in a range between 0° and 180°, are quantified based on an optimised edge detection method, which computes the alignment of crop stems for assessing stem rigidity and lodging risk.
The steps in the image-based phenotyping pipeline will be described in more detail hereinafter. However, before describing the image-based phenotyping pipeline in more detail, experimental results obtained using the system will be described.
Monitoring near-isogenic lines of wheat
The radically-different nature of environments where wheat is grown provides a unique opportunity to study the genetic diversity of wheat in connection with yield and stress tolerance through phenotypic differences. Near-isogenic lines (NILs) of wheat from a genetic stock were used to test the system.
Between May and August 2015, five NILs were monitored, namely Late days-to-ear-emergence (DTEM) with Ppd-1 loss of function (lof), Early-DTEM with Ppd-D1a photoperiod insensitivity, Short with Rht-D1b semi-dwarfing, Stay-Green (a stay-green induced mutant), and Paragon wild type (WT). All the NILs are in the genetic background of Paragon (a UK spring wheat variety) and were constantly monitored over a 95-day period.
Referring to Figure 9a, dynamic developmental profiles of the five lines generated by the system, together with environmental factors recorded during the period, are shown.
Referring to Figure 9b, based on the developmental data, daily relative growth rates (RGR) at different growth stages are shown.
Referring to Figures 9c to 9f, machine-learning based models are applied to explore the dynamics between genotype, phenotype and environmental factors.
Referring again to Figure 9a, to measure the rate and sensitivity of wheat growth dynamically in relation to the environment, Paragon WT is used as a reference and six key growth stages (i.e. GS32-95), from stem elongation or jointing (i.e. GS32-39) to ripening (i.e. GS91-95), are highlighted. The five growth curves generally followed a sigmoid curve. At the beginning of the crop monitoring, the Ppd-D1a NIL was already at the end of the jointing stage (i.e. GS37-39) and hence was the first line to reach a maximum height. Ppd-1 lof was the last to stop increasing in height. The heights of the five NILs were very similar in the middle of June (highlighted by a dashed circle), which verifies what was observed in the field, as all the NILs were at different growth stages.
By cross-referencing the five development profiles (based on growth stages, instead of calendar days), it was noticed that, although the Ppd-D1a NIL and Rht-D1b were recorded at similar maximum heights, namely 83.4 cm and 80.6 cm respectively, the latter had a gentler growth pattern. This suggests that this line could be suitable for crop management, as farmers and growers would have more time to decide whether to apply fertiliser and irrigation to assist the growth or to use chemical control to prevent rapid height increase. Ppd-1 lof's growth stages had been shifted back and thus it had more time to develop. As a result, this line became the tallest in the trial. Although all five lines experienced some degree of height reduction due to a significant storm on 24 July 2015, Paragon WT presented a much lower lodging risk, as it maintained its height during ripening (i.e. GS91-95). To verify the above observation, the heading dates and canopy height on the same plots were manually scored and a strong correlation was obtained (correlation coefficient 0.985).
Moving beyond descriptive phenotypic research, genotype (G), phenotype (P) and environmental (E) datasets can be incorporated into machine-learning based modelling to explore the dynamics between GxPxE. First, to understand which environmental factors were strongly correlated with the growth of the five NILs at every key growth stage, daily RGRs of the lines were computed and associated with their growth stages. The scatter chart shown in Figure 9b shows the growth vigour of the five genotypes, active from jointing to flowering (i.e. GS32-69) and inactive after GS71 (grain-filling). After that, Pearson correlation coefficients and p-values were calculated based on growth traits such as normalised RGR (nesting three-day rates to reduce noise) and canopy height at four key stages (i.e. jointing, booting, heading and flowering). Through this, six out of 14 environmental factors were identified as significantly correlated with growth traits (p < 0.01), namely normalised degree day, solar radiation, rainfall, normalised temperature, light duration, and growth stage duration. The two heat maps shown in Figures 9c and 9d show the relationship between the identified environmental factors and two growth traits (namely RGR and canopy height) in relation to four key growth stages (i.e. GS32-69).
As growth traits did not change excessively after anthesis (i.e. GS71-95), the later stages were not included in the correlation analysis.
Using the six identified environmental factors and growth traits measured during the growing season, a set of linear regression models were used to establish a global predictive model to forecast the growth and development of wheat in the genetic background of Paragon, when interacting with the environment. Figure 9e shows how the model forecasts the overall Paragon growth data (GT, mean squared error: 20.1512, correlation: 0.9991). The model uses the six environmental factors at six stages (i.e. GS32-95) as the input to obtain estimates of the relative growth rates y_t for every given genotype. The formula below was used for the prediction:

y_t = Σ_S β_S · X_t^S + c (1)

where X_t^S is the environmental data at a time point t for growth stage S, β_S are the model parameters for growth stage S, and c is a constant offset. Ordinary least squares is used to determine the coefficients of the model. The model is also applied to predict the growth of the five NILs and the estimated growth is compared with the recorded data generated by the system.
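A minimal sketch of fitting such a model with scikit-learn (which the pipeline already uses) is shown below; the feature layout and the synthetic stand-in data are assumptions, not taken from the patent:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# X: one row per time point, columns holding the six environmental factors
# expanded per growth stage; y: relative growth rate (RGR) per time point.
# (Synthetic stand-in data; the real pipeline loads field measurements.)
rng = np.random.default_rng(0)
X = rng.random((95, 6 * 6))            # 95 days, 6 factors x 6 stages
y = rng.random(95) * 0.05              # daily RGR values

model = LinearRegression()             # ordinary least squares, as in the text
model.fit(X, y)
rgr_estimates = model.predict(X)       # estimated y_t for each time point
```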
On the basis of the first predictive model, a second model is used to forecast the timing and duration of key growth stages (i.e. GS32-95) to link the crop growth prediction with real-world agricultural practices. Thus, farmers, growers and breeders can make sound decisions based on the difference between the predicted growth curve and the actual growth pattern measured by the system. This approach could also assist agricultural practitioners in terms of line selection, fertiliser application, irrigation and harvesting to secure yield production. Figure 9f illustrates the performance of the second model, which employs a set of support vector machines (SVMs) with radial basis function kernels to classify the timing and duration of key growth stages. The model was tested by comparing the predicted growth stages with the true data measured by crop physiologists.
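A comparable stage classifier can be sketched with scikit-learn's SVC; again the data here is synthetic and the label scheme is an assumption:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.random((200, 6))                     # environmental features per day
stages = rng.integers(0, 6, size=200)        # six key stages, GS32-95

X_tr, X_te, y_tr, y_te = train_test_split(X, stages, test_size=0.25)
clf = SVC(kernel="rbf", gamma="scale")       # RBF kernel, as stated above
clf.fit(X_tr, y_tr)
print("stage prediction accuracy:", clf.score(X_te, y_te))
```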
Figures 10a to 10e show graphs of actual and estimated height against time for five NILs and for a combination of the five NILs.
Data transfer and data processing system
Referring to Figure 11, the data transfer and processing system 81 includes several servers and software modules.
The system 81 includes a data transfer server 82 and a remote-control server 83 running on the central web server 14 which allow users to connect to terminals 2 (Figure 1).
The system also includes support modules 84 for performing functions such as uploading sensor data and hardware information. Representative daily images are routinely selected and transferred to the central server during the night, which provides a daily snapshot of the monitored crops.
The system includes a control module 85 running on the central web server 14 which logs updates received from clients, in other words the terminals 2.
The terminal 2 includes an imaging program 86 to control the camera module 6 (Figure 1) for time-lapse crop monitoring. The program 86 can automatically adjust white balance, exposure mode and shutter speed in relation to variable in-field lighting conditions using an interface (for example, the picamera package). Both image resolution and imaging frequency can be changed if users want to modify experimental settings. The program 86 can also conduct the initial quality control and data backup after every image is captured.
An application 87 running on each terminal 2 is run at regular, scheduled intervals. The application 87 queries the terminal 2 to determine workstation status information such as uptime, network addresses and storage usage. Sensor data and more variable system data, such as CPU temperature and processor/memory usage, are sampled at a higher frequency and a mean average of the readings is recorded during the query.
Once the application 87 has collected the necessary data, it is encoded into a JavaScript Object Notation (JSON) data object and transmitted over HTTP to the central server 14, which stores the data in a database 88 (for example, a SQL database) running on the data processing system 15. The status of the system can be displayed and automatically updated using a web-based interface in a web browser 89 running on a remote computer 19, which determines whether each node 2 is online from the time of its last update. The web interface provides information, including the location of terminals 2 in the field (a field map can be uploaded to the central server) and graphs of collected terminal/sensor data, and facilitates SSH and VNC linking to active nodes 2. The system provides a centralised real-time monitoring system to administer the network of in-field workstations 2 and collate collected data for visualisation, batch processing and annotation.
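The status report might look like the following sketch, using only the Python standard library; the field names and URL are illustrative, not from the patent:

```python
import json
import urllib.request

# Example status payload assembled by the terminal application.
status = {
    "workstation_id": "terminal-01",
    "uptime_s": 86400,
    "cpu_temp_c": 48.2,
    "storage_used_pct": 61.5,
}

req = urllib.request.Request(
    "http://central-server.example/api/status",     # placeholder endpoint
    data=json.dumps(status).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
urllib.request.urlopen(req)                          # POST over HTTP
```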
Referring also to Figure 12, when a terminal 2 is initialised by a user via a user interface, the application 87 imports vision and imaging libraries (step S0.1), receives experimental settings including genotypes, workstation ID, imaging frequency and duration (step S0.2) and checks that the hardware, such as the WiFi interface 10, USB flash drive 5 and the like, is operating correctly (steps S0.3 & S0.4). If the hardware is not operating correctly, then the user is prompted to check the hardware via the user interface (step S0.5). The user then sets up imaging dates and creates folders for capturing and saving time-lapse image series (step S0.6).
The application 87 then starts the process of periodically acquiring images.
The application 87 dynamically adjusts imaging settings such as white balance, shutter speed and exposure mode based on in-field light conditions (step S0.8). The application then checks whether on-board storage is full or imaging should stop for other reasons (step S0.9). If there is sufficient storage, then the application triggers image capture and saves the image in on-board storage 5 (step S0.10). The application 87 checks image quality (step S0.11). If the image is of insufficiently high quality, the image is archived and removed from the series of images (step S0.12). If the image is of a sufficiently high quality, then the application places the terminal into a sleep mode and sets a timer (not shown) for the next image (step S0.13).
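A stripped-down capture step using the picamera package (named above) could look like this; the storage path and settle delay are assumptions:

```python
import time
from picamera import PiCamera

camera = PiCamera()
camera.resolution = (2592, 1944)     # 5 MP stills, as used in the field trials

def capture_one(path):
    # Let the camera adapt to in-field light before the shot (step S0.8).
    camera.awb_mode = "auto"         # white balance
    camera.exposure_mode = "auto"    # exposure mode
    time.sleep(2)                    # allow gains to settle (illustrative)
    camera.capture(path)             # step S0.10: capture and save

capture_one("/media/usb/plot01/img_0001.jpg")   # hypothetical path
```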
Referring again to Figure 11, the data processing system 15 executes several algorithms and software modules 90, 91, 92, 93, 94, 95.
An image selection algorithm 90 performs an assessment of large image datasets captured in field trials by comparing images with a number of fixed criteria.
Referring also to Figure 13, the algorithm 90 performs initial set up including importing vision and imaging libraries (step S1.1), opening a GUI to receive a user selection of an image series (step S1.2) and setting up file systems on the data processing system 15 for saving results (step S1.3).
The algorithm 90 can also perform checks that the data processing system 15 is operating correctly (for example, that a valid directory has been selected) and, if not, to prompt the user to perform a check.
The image selection algorithm 90 goes through each image in the series (steps S1.5 & S1.6) to determine whether the image meets analysis standards (step S1.7). Those that meet the standards are collated. Each image is quantified by brightness, shadow percentage and sharpness, allowing images that meet or exceed a set of thresholds to be retained for further trait analysis.
To determine the brightness of an image, the median value of pixel intensity is taken by transforming the image into hue, saturation and value (“HSV”) colour space. If the median intensity value is lower than a set threshold, the image is culled and not used from this point forward. For example, the median brightness may be assigned a value between 0 (dark) and 1 (bright) and a threshold having a value of between 0.3 and 0.5 may be used. A threshold value of over 0.5 corresponds to an image taken in bright sunshine.
Image sharpness (or “image clarity”) is determined by applying Sobel edge detection. The detectable edges are calculated and then correlated with the sharpness and exposure range of the image. The result of clarity detection is also compared to a set threshold, which will disqualify images if they are out of focus or unclear with ill-defined edges. For example, an obtained value may take a value between 0 (blurred) and 1 (sharp) and a threshold may take a value of between 0.3 and 0.5. A threshold value of over 0.5 corresponds to a sharp image.
Measuring shadow areas involves determining the proportion of the image containing dark pixels and comparing it to a threshold value. For example, the proportion may take a value between 0 (all shadow) and 1 (no shadow) and a threshold may take a value of 0.2 (i.e. 20%).
Measuring size involves determining the size of the image and comparing it to a threshold value. For example, the threshold may be 3.0 MB.
If the image selection algorithm 90 judges the image to be of low quality, then the image is removed from the series (step S1.9). Information about the discarded image may be recorded.
Otherwise, if the image passes all the comparisons, the selected image is included in a result folder, with a CSV file recording image metadata for further high-throughput image analysis (step S1.10).
Image selection may be based on one, two, three or all four of these measures. Preferably, image selection is based on all four measures. Other measures may be used.
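A compact sketch of the four checks, with OpenCV and NumPy and purely illustrative thresholds drawn from the ranges quoted above (the exact scaling of the sharpness score is not given in the text):

```python
import os
import cv2
import numpy as np

def passes_quality_checks(path, min_brightness=0.4, min_sharpness=0.4,
                          max_shadow=0.2, min_size_mb=3.0):
    """Return True if an image meets the four analysis standards."""
    img = cv2.imread(path)
    if img is None:
        return False
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

    # 1. Brightness: median of the HSV value channel, scaled to 0-1.
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    brightness = np.median(hsv[..., 2]) / 255.0

    # 2. Sharpness: fraction of strong Sobel-edge pixels as a 0-1 proxy.
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1)
    sharpness = np.mean(np.hypot(gx, gy) > 50)

    # 3. Shadow: proportion of dark pixels must stay below the threshold.
    shadow = np.mean(gray < 40)

    # 4. File size in megabytes.
    size_mb = os.path.getsize(path) / 1e6

    return (brightness >= min_brightness and sharpness >= min_sharpness
            and shadow <= max_shadow and size_mb >= min_size_mb)
```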
Referring again to Figure 11, a plot detection algorithm 91 detects initial reference positions of monitored plots.
Referring also to Figure 14, the plot detection algorithm 91 performs initial set-up including importing vision and imaging libraries (step S2.1), opening a GUI to receive a user selection of an image to serve as reference image (step S2.2) and setting up file systems on the HPC system for saving results (step S2.3).
The plot detection algorithm 91 can also perform checks that the data processing system 15 is operating correctly (for example, that a valid directory has been selected) and, if not, to prompt the user to perform a check.
The plot detection algorithm 91 may perform gamma correction (step S2.4), for example, so as to balance intensity distribution towards 50%. This can help with image screening.
Referring also to Figure 15, the plot detection algorithm 91 identifies the coordinates of the white reference canes 26, which define the plot region, and of the dark height markers on the ranging pole 24, using colour-based feature selection on the basis of HSV and Lab nonlinear colour spaces (step S2.5).
Referring also to Figure 16, in particular, the plot detection algorithm 91 uses a normalised grey image scale to detect white parts of the image which have a high saturation value. For example, this may involve keeping only the 30% of pixels having the highest saturation value. The plot detection algorithm 91 then removes small objects, for example, those which have a size no more than a given number of pixels. Reference canes 26 have a size of over 12,500 pixels and, thus, a threshold of less than 12,500 is used. Holes in the image, i.e. the detected small objects, are filled in as black. Figure 16 shows the result of identifying the most saturated regions and removing small objects from the image.
Referring also to Figure 17, the plot detection algorithm 91 then identifies the reference canes 26 based on size and the ratio of width-to-length (“WL ratio”). The plot detection algorithm 91 defines a rectangle 101 for each reference cane 26 and a corresponding centre 102.
Referring to Figures 14, 18 and 19, the plot detection algorithm 91 classifies pixels into different groups, such as sky 103, soil 27 between plots, crop canopy 105, shadow 106, references 26, markers 107 and other objects (not shown), using unsupervised machine-learning techniques such as k-means and spectral clustering (step S2.6).
Figure 18 shows the RGB image and Figure 19 shows the corresponding classified image where different groups are differently coloured. The top half of the image is generally navy blue and the bottom half mainly comprises regions of cyan and navy blue. In the bottom half, the large region of navy blue contains flecked regions of red and orange. There is a large region of orange in the bottom left-hand corner and a large region of cyan in the bottom right-hand corner.
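The clustering step can be sketched with scikit-learn's KMeans over per-pixel colour values; the image path and the choice of Lab space here are illustrative:

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans

img = cv2.imread("reference_image.jpg")               # hypothetical path
lab = cv2.cvtColor(img, cv2.COLOR_BGR2LAB)            # Lab space, per step S2.5
pixels = lab.reshape(-1, 3).astype(np.float64)

# Six clusters for the six object classes (sky, soil, canopy, shadow,
# references, markers); labels give a per-pixel class map.
kmeans = KMeans(n_clusters=6, n_init=10).fit(pixels)
class_map = kmeans.labels_.reshape(img.shape[:2])
```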
If no objects are found (step S2.7), the plot detection algorithm 91 prompts the user to select another image to use as a reference (step S2.8).
Referring again to Figure 14, after detecting initial reference objects in the image, the algorithm establishes a pseudo three-dimensional reference system that records the two-dimensional coordinates of the plot area, the canopy region, and height markers through a range of feature selection approaches (steps S2.9 to S2.12). The pixel-metric conversion is also computed based on height markers on the ranging pole 24.
Referring also to Figure 17a, pixel-metric conversion includes counting the number of pixels between the centres C of adjacent dark markers 107 and/or measuring the angle of inclination, θ, of the line between the centres of adjacent dark markers.
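Since the physical height of each band on the pole is known, the conversion reduces to a few lines; the band height and example coordinates below are illustrative:

```python
import numpy as np

def pixel_metric_scale(c1, c2, band_height_m):
    """From the pixel centres of two adjacent dark markers and the known
    band height, return metres-per-pixel and the line's inclination angle."""
    (x1, y1), (x2, y2) = c1, c2
    dist_px = np.hypot(x2 - x1, y2 - y1)               # pixels between centres
    theta = np.degrees(np.arctan2(y2 - y1, x2 - x1))   # inclination angle
    return band_height_m / dist_px, theta

scale_m_per_px, theta = pixel_metric_scale((1012, 640), (1015, 890), 0.2)
```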
The plot detection algorithm 91 finds markers on the ranging pole 24 using two techniques.
Referring also to Figure 20a, the plot detection algorithm 91 uses an edge detector, such as a Canny edge detector, to detect the markers 107 (which have well-defined edges with respect to the pole and to adjacent light markers).
Referring also to Figure 20b, the plot detection algorithm 91 uses global thresholding using the median intensity value to find the markers 107 (which are very dark).
Referring also to Figure 20c, the plot detection algorithm 91 can combine the results of the two approaches.
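The two detectors and their combination can be sketched as follows with OpenCV; the Canny thresholds and dilation kernel are assumptions:

```python
import cv2
import numpy as np

gray = cv2.cvtColor(cv2.imread("reference_image.jpg"), cv2.COLOR_BGR2GRAY)

# First technique: Canny edges pick out the well-defined marker boundaries.
edges = cv2.Canny(gray, 50, 150)

# Second technique: global thresholding at the median intensity keeps
# the very dark marker pixels.
dark = (gray < np.median(gray)).astype(np.uint8) * 255

# Combination: dark regions that also carry strong nearby edges are
# treated as marker candidates.
edges_thick = cv2.dilate(edges, np.ones((5, 5), np.uint8))
marker_candidates = cv2.bitwise_and(dark, edges_thick)
```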
Referring to Figures 14 and 21, the plot detection algorithm 91 locates the markers 107 on the ranging pole 24.
Referring also to Figure 22, the plot detection algorithm 91 identifies the (three) reference points 26, or more specifically the centres 102 of the reference points 26, computes a location for a fourth reference point 108 and defines a polygon 109 having the reference points 102, 108 as the vertices.
Referring also to Figure 23, the plot detection algorithm 91 calculates corners 110 of a polygon 111 which defines a resized canopy region 112. As the crop grows, the polygon 111 moves and the corresponding entropy changes.
Referring again to Figure 11, a crop performance measurements algorithm 92 is used to measure canopy height, identify corner features and perform growth, colour and orientation trait analysis.
For a given crop genotype, adaptive intensity and gamma equalisation is applied to the image to minimise colour distortion caused by variable field illumination. The algorithm 92 can determine the canopy height using a number of different approaches, in case one of them cannot extract the height reading as planned.
Referring to Figures 24a and 24b, a first approach is simply to inspect the ranging pole 24, identify the visible part 113 of the pole 24 and, thus, “read” the height of the plot off the pole 24. However, the first approach may not always be possible, for example as the crop gets taller, or when the ranging pole is covered by random objects such as pointing leaves, agricultural vehicles or people, which could produce a false positive height reading, or when all of the ranging pole 24 is obscured.
Referring to Figure 25, a second approach involves determining coordinates of the tops of leaves, which are labelled with pseudo points 114, and then calculating the median value of the heights of the pseudo points 114. The two-dimensional pseudo points' height coordinates (y-axis) are calculated within the crop canopy space and then the median value 115 of the height readings is computed to represent the canopy height at the time point.
Referring to Figures 26, 27a and 27b, a third approach uses an entropy-based process of detection, in particular calculating entropy using grey-level co-occurrence matrices (GLCMs). To determine the height of the canopy, entropy-based texture analysis is used to detect whether the canopy region 112 enclosed by the hyper plane 111 changes between two adjacent images. The texture analysis is able to determine a weighted centroid position and, based on the weighted centroid position, the position of the hyper plane can be determined.
Further details regarding digital-image texture analysis and the equations for entropy H and skewness μ3 are found in A. Materka and M. Strzelecki: “Texture Analysis Methods - A Review” Technical Report, University of Lodz, Institute of Electronics, COST B11 report, Brussels 1998.
Figure 26 shows, on the right-hand side, images of the crop at different times in which a region 116 of the image is identified as being the crop plot (and false-coloured bright green) and for which entropy H and skewness μ3 can be calculated.
If positional changes are identified, for example, the weighted centroid moves up or down (depending on growth stages), the position of the hyper plane 111 is changed and the canopy height is recorded. After that, corner-featured points are detected within the new canopy space. This step generates pseudo points that are cast in the canopy region for verifying canopy height measures from the previous approaches. In other words, the coordinates of the pseudo points can be used to compute a canopy height. Thus, even if the canopy and/or ranging pole is partially covered (for example, as shown in Figure 28), it is still possible to obtain a height reading.
Referring to Figure 28, the terminal 2 was moved, resulting in a black bar across the top of the image. Pseudo points, however, can still be used to detect the canopy height, even when the other approaches could not generate precise height measurements.
The dynamic canopy height changes are computed by combining the measurements of height markers with the weighted centroid position and the red pseudo points.
The algorithm 92 may try determining the canopy height using the different approaches in order. Thus, if the algorithm 92 is unable to determine the height using the ranging pole 24 or obtains an unexpected value (e.g. one which exceeds the previously-obtained value by a given percentage), then it will attempt to determine the canopy height using the identified corner-featured points.
The algorithm 92 may try determining the canopy height using all available approaches and combine some of them.
Referring also to Figure 29, the crop performance measurements algorithm 92 performs initial set-up including importing vision, imaging and machine-learning libraries (step S3.1), opening a GUI to receive a user selection of an image series (step S3.2) and setting up file systems on the data processing system 15 for saving results (step S3.3).
The crop performance measurements algorithm 92 can also perform checks that the data processing system 15 is operating correctly (for example, that a valid directory has been selected) and, if not, prompt the user to perform a check.
The crop performance measurements algorithm 92 employs an adaptive intensity and dynamic gamma equalisation to adjust colour and contrast to minimise colour distortion caused by diverse in-field lighting (step S3.4).
The crop performance measurements algorithm 92 tracks geometric differences between the plot on a given image and the initial position. If different, a geometric transformation method is applied to recalibrate the image, which removes areas outside the plot area (step S3.5).
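The colour adjustment and recalibration steps might be sketched as follows with OpenCV; the gamma value and the use of a partial affine transform estimated from matched reference points are assumptions, as the patent does not name the exact transform:

```python
import cv2
import numpy as np

def gamma_adjust(img, gamma=1.2):
    """Simple gamma equalisation via a lookup table (step S3.4 sketch)."""
    table = (np.linspace(0, 1, 256) ** (1.0 / gamma) * 255).astype(np.uint8)
    return cv2.LUT(img, table)

def recalibrate(img, pts_now, pts_ref):
    """Warp the image so reference points in this frame (pts_now) line up
    with their initial positions (pts_ref) - step S3.5 sketch."""
    M, _ = cv2.estimateAffinePartial2D(np.float32(pts_now),
                                       np.float32(pts_ref))
    h, w = img.shape[:2]
    return cv2.warpAffine(img, M, (w, h))
```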
Within a plot, the crop measuring algorithm 92 tracks the crop height by detecting the visible part of the ranging pole 24 (steps S3.6 & S3.7). If tracking is unsuccessful, then the algorithm 92 moves on to the next image in the series (step S3.4).
To determine the change of the canopy space during the season, an entropy-based texture analysis is used to detect whether the canopy region 112 enclosed by the polygon 111 (herein referred to as the “crop canopy space”, “canopy region” or “hyper plane”) changes between adjacent images. GLCMs are used to calculate the entropy value of canopy texture. If the entropy of the texture shifts (moving up or down, depending on growth stages), the two-dimensional coordinate of the centroid of the canopy texture is recorded, which allows the polygon 111 to be repositioned (i.e. moved vertically) to represent the change of the canopy space.
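The entropy value itself can be computed from a normalised GLCM; a sketch using scikit-image (named among the analytic libraries above), where the region must be an 8-bit grey-level array:

```python
import numpy as np
from skimage.feature import graycomatrix  # 'greycomatrix' in older releases

def glcm_entropy(gray_region):
    """Entropy H = -sum p*log2(p) over the grey-level co-occurrence matrix,
    used to detect texture change in the canopy region between images."""
    glcm = graycomatrix(gray_region, distances=[1],
                        angles=[0, np.pi / 2], levels=256,
                        symmetric=True, normed=True)
    p = glcm[glcm > 0]
    return -np.sum(p * np.log2(p))
```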
Referring also to Figure 25, within the repositioned canopy space, corner-featured points 114 are detected. This step generates coloured (e.g. red) pseudo points cast in the canopy region 112, representing the tips of erect leaves at stem elongation or jointing (i.e. GS 32-39), reflective surfaces of curving leaves or crop heads between booting and anthesis (i.e. GS 41-69) and corner points on ears and grains during senescence (i.e. GS 71-95).
Figure 30 shows a series of images illustrating crop growth from GS 37 to GS 92 over a period of 95 days.
Referring still to Figure 25, the algorithm 92 applies Harris and Shi-Tomasi corner detection methods to locate corner-featured points within the canopy region 112.
Referring also to Figure 31, red pseudo points 114 are generated to represent the tips of erect leaves, reflective surfaces of curving leaves, heads and the corner points on ears.
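A minimal sketch of this corner-detection step using OpenCV's Shi-Tomasi detector (the Harris variant is available via a flag) might look as follows; the parameter values are assumptions rather than values from the specification:

```python
import cv2

def detect_pseudo_points(gray, canopy_mask, max_points=500):
    """Shi-Tomasi corners restricted to the canopy region; pass
    useHarrisDetector=True for the Harris variant. Returns an (N, 2)
    array of (x, y) positions for the red pseudo points."""
    corners = cv2.goodFeaturesToTrack(gray, maxCorners=max_points,
                                      qualityLevel=0.01, minDistance=5,
                                      mask=canopy_mask)
    return [] if corners is None else corners.reshape(-1, 2)

# Overlaying the points as red pseudo points (red is (0, 0, 255) in BGR):
# for x, y in detect_pseudo_points(gray, mask):
#     cv2.circle(colour_image, (int(x), int(y)), 2, (0, 0, 255), -1)
```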
The main orientation of a given plot is quantified based on an optimised Canny edge detection method, which computes the alignment of crop stems.
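A sketch of one way to quantify the main orientation from Canny edges, assuming fixed thresholds rather than the optimised ones referred to above, is:

```python
import cv2
import numpy as np

def main_orientation(gray):
    """Estimate the dominant alignment of crop stems: Canny edges
    followed by a probabilistic Hough transform, returning the median
    line angle in degrees (None if no lines are found)."""
    edges = cv2.Canny(gray, 50, 150)  # fixed thresholds for illustration
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=40,
                            minLineLength=30, maxLineGap=5)
    if lines is None:
        return None
    angles = [np.arctan2(y2 - y1, x2 - x1) for x1, y1, x2, y2 in lines[:, 0]]
    return float(np.degrees(np.median(angles)))
```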
The user, via the GUI, may change the colour of a reference marker 26 in an image, for example, turning it pink 116. This can be used to mark events, such as if the terminal 2 or the ranging pole 24 is moved or a new monitoring task is initiated.
It is noted that the system does not require the position of the terminal 2, the ranging pole 24 or the reference markers 26 to be known. Thus, there is no need for GPS or for recording positions of, for example, the terminal 2. A machine-learning algorithm, e.g. clustering method, can be used to determine the region of interest, e.g. crop canopy space.
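As an illustrative sketch only, such a clustering method for finding the canopy region without any positional information could be written as below: k-means on pixel colours, with the greenest cluster taken as the crop. The choice of k and the greenness measure are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

def canopy_mask_by_clustering(rgb, k=3):
    """Cluster pixel colours and return a boolean mask of the cluster
    whose centre is greenest (G channel high relative to R and B)."""
    pixels = rgb.reshape(-1, 3).astype(np.float32)
    km = KMeans(n_clusters=k, n_init=10).fit(pixels)
    centres = km.cluster_centers_
    greenest = int(np.argmax(centres[:, 1] - centres[:, [0, 2]].mean(axis=1)))
    return (km.labels_ == greenest).reshape(rgb.shape[:2])
```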
Referring to Figure 11, a data interpolation and analysis module 93 can be used to handle minor data loss during the field experiments.
Referring still to Figure 11 and also to Figure 32, a crop-environment interaction modelling module 94 is used to identify interactions between the recorded crop growth of five wheat genotypes and a number of environmental factors (steps S4.1 to S4.9).
An example of crop-environment modelling is hereinbefore described in relation to Figure 10. The following comments are made in relation to the model used in the example.
Correlations are performed between each environmental factor, grouped over three days, and the recorded growth data. The reason for grouping environmental factors into nested three-day periods is to remove outliers and smooth the input data. The correlations are determined for each growth stage for the five genotypes. The analysis is performed on the grouped data because particular stages (e.g. booting and heading) contain few recorded growth data points, owing to the short duration of both stages during growth. To capture the dynamic between relative growth rate (RGR) and environmental factors, the transfer (e^RGR)^-1 is used for negative correlation values, as the RGR series is a decreasing sequence in relation to the increasing nature of growth stages.
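A minimal sketch of the grouping, correlation and transfer steps, assuming daily pandas Series for an environmental factor and for RGR (a rolling three-day mean stands in for the nested grouping), might be:

```python
import numpy as np
import pandas as pd

def grouped_correlation(env: pd.Series, rgr: pd.Series) -> float:
    """Smooth an environmental factor over three-day windows to remove
    outliers, then correlate it with the recorded growth data."""
    env3 = env.rolling(window=3, min_periods=1).mean()
    return env3.corr(rgr)

def transfer_rgr(rgr: pd.Series) -> pd.Series:
    """Apply (e ** RGR) ** -1 so the decreasing RGR series maps onto an
    increasing scale across growth stages."""
    return np.exp(rgr) ** -1
```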
Based on the significant environmental factors, a set of linear regression models is explored and a single linear regression model is selected to estimate the RGR of the five genotypes in relation to given in-field environmental conditions. Environmental factors with insignificant correlations (where p > 0.01 with respect to the height over the entire time series) are removed from the analysis, as they provide little predictive power. Ordinary least squares is used to derive the model coefficients and all the stages are included as features. The RGR data is normalised to present percentage changes in height between two consecutive days. To predict the canopy height for a given genotype, environment data at each growth stage is input to the global model. To derive the height of the plant over time, successive applications of equation (2) are applied, where ht is the height of the plant at the current time point, ht-1 is the height of the plant at the previous time point, and h0 is the initial height.
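Assuming equation (2) takes the compounding form h_t = h_(t-1) × (1 + RGR_t), which is consistent with RGR being normalised to day-on-day percentage height changes (the exact form of equation (2) is not reproduced here), the successive application can be sketched as:

```python
def height_series(h0, rgr_predictions):
    """Derive the height of the plant over time by successively applying
    the assumed form of equation (2) to the model's predicted RGR values
    (expressed as fractions, e.g. 0.03 for a 3% daily change)."""
    heights = [h0]
    for rgr in rgr_predictions:
        heights.append(heights[-1] * (1.0 + rgr))  # h_t = h_(t-1) * (1 + RGR_t)
    return heights
```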
Referring to Figure 10, performance of the model is verified by estimating the growth of all five NILs, together with the overall Paragon growth data (GT). The estimates are displayed against the true canopy height datasets. The mean squared error recorded for G2 (genotype two, Late-DTEM), G3 (genotype three, Early-DTEM) and G4 (genotype four, Stay-Green) shows that the estimated height is close to the true growth curves. However, the error is much larger for G1 (genotype one, Paragon WT) and G5 (genotype five, Short). This is because the majority of crop growth happens during the early stages (GS32-GS59), so estimation deviation during these initial stages affects the overall height results.
The growth stage predictive model is based on the GxPxE model hereinbefore described. The model is produced to explore how to predict growth stages of different wheat genotypes on the basis of real growth traits and environment data. It employs support vector machines (SVMs) with radial basis function kernels to classify growth stages, as SVMs are popular machine learning techniques for classification. The performance of the model is tested on the overall Paragon wheat growth data (GT) and Paragon WT (G1), as GT performs well in the GxPxE interaction model whereas G1 performs poorly. Comparison of the prediction with the manually recorded growth stages suggests a successful prediction of the timing and duration of stem elongation and jointing (GS32-39) through heading (GS51-59), and of flowering (GS61-69) through ripening (GS91-95).
However, the transition from heading to flowering introduced an error: the transition was predicted three days early. The main reason for this error is the short duration of booting (GS41-49) and heading (GS51-59); the training data across all genotypes cannot sufficiently differentiate the two stages.
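For illustration, an RBF-kernel SVM growth-stage classifier of the kind described could be assembled as follows; scikit-learn is assumed, and the feature layout, labels and hyper-parameters are hypothetical:

```python
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# X: rows of growth traits plus environment data; y: growth-stage labels
# (e.g. "GS32-39"). Both are placeholders for the real training data.
model = make_pipeline(StandardScaler(),
                      SVC(kernel="rbf", C=1.0, gamma="scale"))
# model.fit(X_train, y_train)
# predicted_stages = model.predict(X_test)
```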
Modifications
It will be appreciated that various modifications may be made to the embodiments hereinbefore described. Such modifications may involve equivalent and other features which are already known in the design, manufacture and use of imaging systems, image processing and phenotyping systems and component parts thereof and which may be used instead of or in addition to features already described herein. Features of one embodiment may be replaced or supplemented by features of another embodiment.
Other types of single-board computers can be used, such as Intel (RTM) Edison. Terminals need not use a single-board computer. A high-performance computing system need not be used. For example, a desktop computer or workstation can be used.
Claims (15)
1. A method of processing images of a crop comprising: retrieving a series of images of a crop captured over a period of time; identifying, in an image selected from the series of images to be used as a reference image, a reference system against which other images can be compared, the reference system including an extent of a crop plot and/or a set of one or more reference points; and for each of at least one other image in the series of images: calibrating the image using the reference system; and determining a height of a canopy of the crop in the crop plot in the image, a main orientation of the crop and/or a value indicative of vegetative greenness.
2. A method according to claim 1, wherein the one or more reference points include: a plot region, a canopy space, and/or at least one height marker.
3. A method according to claim 1 or 2, further comprising: identifying at least one reference marker in the reference image.
4. A method according to any preceding claim, further comprising: classifying pixels in the reference image into one or more groups corresponding to one or more respective object types.
5. A method according to any preceding claim, further comprising: for each of at least one other image in the series of images: identifying corner-featured points in the crop plot in the image.
6. A method according to any preceding claim, further comprising: preparing the series of images of the crop comprising: receiving a series of captured images of the crop; for each image in the series of captured images: determining whether the image satisfies at least one image-quality requirement; upon a positive determination, adding the image to the series of images to be processed.
7. A method according to claim 6, wherein the at least one image-quality requirement includes brightness of the image, size of the image file, sharpness of the image or the proportion of dark area in the image area.
8. A method according to any preceding claim, wherein determining the height of a canopy of the crop in the crop plot in the image comprises: detecting a visible part of a ranging pole.
9. A method according to any preceding claim, wherein determining the height of a canopy of the crop in the crop plot in the image comprises: calculating an entropy based on compactness of a texture of a crop canopy space, isotropy of the texture of the crop canopy space, and/or distribution of intensity of the crop canopy space.
10. A method according to any preceding claim, wherein determining the height of a canopy of the crop in the crop plot in the image comprises: measuring a weighted centroid of a crop canopy space.
11. A method according to any preceding claim, wherein determining the height of a canopy of the crop in the crop plot in the image comprises: determining respective positions of corner-featured objects in the crop canopy space; and calculating an average from the tip positions.
12. A method according to any preceding claim, further comprising: for the image series: generating dynamic growth curves defining a developmental profile for the crop; calculating stem rigidity and lodging risk based on the main orientation of the crop; and/or calculating vegetation and senescence periods based on a series of the values indicative of vegetative greenness.
13. A system comprising: a data processing system configured to perform the method according to any one of claims 1 to 12.
14. A system according to claim 13, further comprising: at least one terminal comprising: a light-level sensor to measure a light level for controlling image capture settings; a camera for capturing images of a region of a growing crop based on the image capture settings; data storage for storing images captured by the camera; a wireless network interface for transferring data, including images captured by the camera, to a remotely-located image processing computer system; and an on-board computer system for controlling storage and transfer of captured images, wherein the on-board computer system is configured to determine whether a characteristic of the captured image satisfies a predetermined condition and, upon a negative determination, discarding the captured image such that only images satisfying the predetermined condition are transferred.
15. A terminal comprising: a light-level sensor to measure a light level for controlling image capture settings; a camera for capturing images of a region of a growing crop based on the image capture settings; data storage for storing images captured by the camera; a wireless network interface for transferring data, including images captured by the camera, to a remotely-located image processing computer system; and an on-board computer system for controlling storage and transfer of captured images, wherein the on-board computer system is configured to determine whether a characteristic of the captured image satisfies a predetermined condition and, upon a negative determination, discarding the captured image such that only images satisfying the predetermined condition are transferred.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1709756.9A GB2553631B (en) | 2017-06-19 | 2017-06-19 | Data Processing of images of a crop |
EP18721093.5A EP3642792A1 (en) | 2017-06-19 | 2018-04-13 | Data processing of images of a crop |
PCT/GB2018/050985 WO2018234733A1 (en) | 2017-06-19 | 2018-04-13 | Data processing of images of a crop |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1709756.9A GB2553631B (en) | 2017-06-19 | 2017-06-19 | Data Processing of images of a crop |
Publications (3)
Publication Number | Publication Date |
---|---|
GB201709756D0 (en) | 2017-08-02 |
GB2553631A (en) | 2018-03-14 |
GB2553631B (en) | 2019-10-30 |
Family
ID=59462327
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB1709756.9A (granted as GB2553631B, active) | Data Processing of images of a crop | 2017-06-19 | 2017-06-19 |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP3642792A1 (en) |
GB (1) | GB2553631B (en) |
WO (1) | WO2018234733A1 (en) |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020000043A1 (en) * | 2018-06-28 | 2020-01-02 | University Of Southern Queensland | Plant growth feature monitoring |
WO2020087346A1 (en) * | 2018-10-31 | 2020-05-07 | 深圳市大疆创新科技有限公司 | Photographing control method, movable platform, control device, and storage medium |
CN109859101B (en) * | 2019-01-18 | 2022-10-28 | 黑龙江八一农垦大学 | Crop canopy thermal infrared image identification method and system |
CN109948596B (en) * | 2019-04-26 | 2022-04-22 | 电子科技大学 | Method for identifying rice and extracting planting area based on vegetation index model |
US10916028B1 (en) | 2019-08-22 | 2021-02-09 | Cnh Industrial America Llc | Sensor assembly for an agricultural implement and related systems and methods for monitoring field surface conditions |
CN111369494B (en) * | 2020-02-07 | 2023-05-02 | 中国农业科学院农业环境与可持续发展研究所 | Winter wheat spike density detection method and device |
CN114170500A (en) * | 2020-08-20 | 2022-03-11 | 中国农业大学 | Wheat lodging area extraction system and method |
CN112070741B (en) * | 2020-09-07 | 2024-02-23 | 浙江师范大学 | Rice chalkiness degree detecting system based on image salient region extracting method |
CN112712038B (en) * | 2020-12-31 | 2024-05-28 | 武汉珈和科技有限公司 | Method and system for monitoring wheat lodging condition based on multispectral satellite image |
US11810285B2 (en) | 2021-03-16 | 2023-11-07 | Cnh Industrial Canada, Ltd. | System and method for determining soil clod parameters of a field using three-dimensional image data |
CN113325761A (en) * | 2021-05-25 | 2021-08-31 | 哈尔滨工业大学 | Plant growth period identification control system based on deep learning and identification control method thereof |
DE102021114996A1 (en) * | 2021-06-10 | 2022-12-15 | Eto Magnetic Gmbh | Device for detecting sprouting of seeds, agricultural sensor device and agricultural monitoring and/or agricultural control method and system |
CN113469068B (en) * | 2021-07-06 | 2022-11-01 | 信阳农林学院 | Growth monitoring method for large-area planting of camellia oleifera |
CN114688997B (en) * | 2022-03-29 | 2023-03-14 | 华南农业大学 | Automatic blade area detection device and method based on RLS adaptive filtering algorithm |
CN114862705B (en) * | 2022-04-25 | 2022-11-25 | 陕西西影数码传媒科技有限责任公司 | Image quality evaluation method for image color restoration |
CN115049926B (en) * | 2022-06-10 | 2023-10-24 | 安徽农业大学 | Wheat lodging loss evaluation method and device based on deep learning |
CN116503741B (en) * | 2023-06-25 | 2023-08-25 | 山东仟邦建筑工程有限公司 | Intelligent prediction system for crop maturity |
CN117370823B (en) * | 2023-12-05 | 2024-02-20 | 恒健达(辽宁)医学科技有限公司 | Spraying control method and system for agricultural planting |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2001033505A2 (en) * | 1999-11-04 | 2001-05-10 | Monsanto Company | Multi-variable model for identifying crop response zones in a field |
JP2014018140A (en) * | 2012-07-18 | 2014-02-03 | Fujitsu Ltd | Method, device and program for identifying crop condition change date |
CN104320607A (en) * | 2014-08-06 | 2015-01-28 | 江苏恒创软件有限公司 | Method for monitoring growth of farmland crops based on drone |
US20150250113A1 (en) * | 2014-03-04 | 2015-09-10 | Greenonyx Ltd | Systems and methods for cultivating and distributing aquatic organisms |
CN105574897A (en) * | 2015-12-07 | 2016-05-11 | 中国科学院合肥物质科学研究院 | Crop growth situation monitoring Internet of Things system based on visual inspection |
CN105869152A (en) * | 2016-03-24 | 2016-08-17 | 北京农业信息技术研究中心 | Method and device for measuring spatial distribution of crop plant heights through unmanned plane remote sensing |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6529615B2 (en) * | 1997-10-10 | 2003-03-04 | Case Corporation | Method of determining and treating the health of a crop |
US20160224703A1 (en) * | 2015-01-30 | 2016-08-04 | AgriSight, Inc. | Growth stage determination system and method |
US9734400B2 (en) * | 2015-01-30 | 2017-08-15 | AgriSight, Inc. | System and method for field variance determination |
Application events:
- 2017-06-19: GB application GB1709756.9A, published as GB2553631B (en), status: active
- 2018-04-13: international application PCT/GB2018/050985, published as WO2018234733A1 (en), status: unknown
- 2018-04-13: EP application EP18721093.5A, published as EP3642792A1 (en), status: not active (withdrawn)
Also Published As
Publication number | Publication date |
---|---|
WO2018234733A1 (en) | 2018-12-27 |
GB2553631A (en) | 2018-03-14 |
EP3642792A1 (en) | 2020-04-29 |
GB201709756D0 (en) | 2017-08-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
GB2553631B (en) | Data Processing of images of a crop | |
Sun et al. | Three-dimensional photogrammetric mapping of cotton bolls in situ based on point cloud segmentation and clustering | |
US10028452B2 (en) | Horticultural monitoring system | |
Virlet et al. | Field Scanalyzer: An automated robotic field phenotyping platform for detailed crop monitoring | |
Zhou et al. | CropQuant: an automated and scalable field phenotyping platform for crop monitoring and trait measurements to facilitate breeding and digital agriculture | |
Bernotas et al. | A photometric stereo-based 3D imaging system using computer vision and deep learning for tracking plant growth | |
Bac et al. | Robust pixel-based classification of obstacles for robotic harvesting of sweet-pepper | |
González-Esquiva et al. | Development of a visual monitoring system for water balance estimation of horticultural crops using low cost cameras | |
CN109843034B (en) | Yield prediction for grain fields | |
EP3032946A2 (en) | Method for automatic phenotype measurement and selection | |
CN113223040B (en) | Banana estimated yield method and device based on remote sensing, electronic equipment and storage medium | |
Wu et al. | Predicting Zea mays flowering time, yield, and kernel dimensions by analyzing aerial images | |
WO2020000043A1 (en) | Plant growth feature monitoring | |
Olenskyj et al. | End-to-end deep learning for directly estimating grape yield from ground-based imagery | |
Lootens et al. | High-throughput phenotyping of lateral expansion and regrowth of spaced Lolium perenne plants using on-field image analysis | |
Zhao et al. | Detecting sorghum plant and head features from multispectral UAV imagery | |
Gonzalez et al. | PhytoOracle: Scalable, modular phenomics data processing pipelines | |
He et al. | Extraction of soybean plant trait parameters based on SfM-MVS algorithm combined with GRNN | |
Subeesh et al. | UAV imagery coupled deep learning approach for the development of an adaptive in-house web-based application for yield estimation in citrus orchard | |
CN116052141B (en) | Crop growth period identification method, device, equipment and medium | |
Goel et al. | Machine learning-based remote monitoring and predictive analytics system for crop and livestock | |
Wong et al. | Automated Corn Ear Height Prediction Using Video-Based Deep Learning | |
Agarwal | Detection of plant emergence based on spatio temporal image sequence analysis | |
Jing et al. | Sunflower-YOLO: Detection of sunflower capitula in UAV remote sensing images | |
Schmidtke | Developing a phone-based imaging tool to inform on fruit volume and potential optimal harvest time |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
732E | Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977) |
Free format text: REGISTERED BETWEEN 20201210 AND 20201216 |