WO2018234733A1 - Data processing of images of a crop - Google Patents

Data processing of images of a crop

Info

Publication number
WO2018234733A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
crop
images
canopy
series
Prior art date
Application number
PCT/GB2018/050985
Other languages
French (fr)
Inventor
Ji Zhou
Daniel Reynolds
Simon Griffiths
Original Assignee
Earlham Institute
John Innes Centre
Priority date
Filing date
Publication date
Application filed by Earlham Institute, John Innes Centre filed Critical Earlham Institute
Priority to EP18721093.5A priority Critical patent/EP3642792A1/en
Publication of WO2018234733A1 publication Critical patent/WO2018234733A1/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • G06T7/001Industrial image inspection using an image reference approach
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/40Analysis of texture
    • G06T7/41Analysis of texture based on statistical description of texture
    • G06T7/45Analysis of texture based on statistical description of texture using co-occurrence matrix computation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/97Determining parameters from multiple pictures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20076Probabilistic image processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30181Earth observation
    • G06T2207/30188Vegetation; Agriculture

Definitions

  • the present invention relates to data processing of images of a crop, in particular a cereal crop such as wheat, maize or rice, for use in image-based field phenotyping.
  • QTL quantitative trait locus
  • GWAS genome-wide association studies
  • MAS marker-assisted selection
  • GS genomic selection
  • Agricultural practitioners such as breeders, growers, farmers and crop scientists, have been seeking new approaches to relieving the bottleneck.
  • non-invasive remote sensors and aerial imaging devices such as unmanned aerial vehicles (UAVs) and blimps, are being used to study crop performance and field variability.
  • Satellite imaging and tailored portable devices can be used to predict crop growth and yield potential based on canopy photosynthesis and normalised difference vegetation indices (NDVI).
  • a method of processing images of a crop (which may be in a field, a part of a field, a plot, a plant pot, or plant tray).
  • the crop may be a cereal crop.
  • the method comprises retrieving a series of images of a crop captured over a period of time and identifying, in an image (or "initial image") selected from the series of images to be used as a reference image, a reference system against which other images can be compared, the reference system including an extent of a crop plot and/or one or more reference points.
  • the method also comprises, for each of at least one other image in the series of images, calibrating or adjusting the image using the reference system, and determining a height of a canopy of the crop in the image, a main orientation of the crop and/or a value indicative of vegetative greenness (for example, a normalised green value in an RGB colour space and/or excessive greenness).
  • the method can be used to process images of a crop which have been captured in the field and, thus, subject to the vagaries of weather. Moreover, the method can be used for each crop and, thus, allows large volumes of data to be processed for large numbers of crops.
  • the one or more reference points may include a plot region (which includes the crop plot and a region around the crop plot, e.g. a gap between adjacent crop plots), a canopy space, and/or at least one height marker (which may be a graduated ranging pole and/or a reference mark).
  • the method may further comprise identifying at least one reference marker in the reference image.
  • the method may further comprise classifying pixels in the reference image into one or more groups corresponding to one or more respective object types.
  • the method may further comprise, for each of at least one other image in the series of images, identifying corner-featured points in the crop plot in the image.
  • the method may further comprise preparing the series of images of the crop, comprising receiving a series of captured images of the crop and, for each image, determining whether the image satisfies at least one image-quality requirement.
  • the at least one image-quality requirement may include brightness of the image, size of the image file, sharpness of the image and/or the proportion of dark area in the image area.
  • the method may further comprise, for the image series, generating dynamic growth curves defining a developmental profile for the crop, calculating stem rigidity and lodging risk based on the main orientation of the crop and/or calculating vegetation and senescence periods based on a series of the values indicative of vegetative greenness.
  • Vegetative greenness Gv(x,y) can be computed based on excessive greenness ExG(x,y) and excessive red ExR(x,y) indices.
  • the vegetative greenness can be defined by Gv(x,y) = (2·fG(x,y) − fR(x,y) − fB(x,y)) − (1.4·fR(x,y) − fB(x,y)).
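As an illustration only, these indices map directly onto array operations. The following NumPy sketch assumes an RGB channel order and clips the output to the 0-255 range quoted elsewhere for vegetative greenness; these are assumptions about the implementation, not the patent's verbatim code:

```python
# A NumPy sketch of the greenness indices defined above; RGB channel order
# and clipping to 0-255 are assumptions about the implementation.
import numpy as np

def vegetative_greenness(rgb):
    f = rgb.astype(np.float32)
    f_r, f_g, f_b = f[..., 0], f[..., 1], f[..., 2]
    ex_g = 2 * f_g - f_r - f_b            # excessive greenness ExG(x,y)
    ex_r = 1.4 * f_r - f_b                # excessive red ExR(x,y)
    g_v = ex_g - ex_r                     # vegetative greenness Gv(x,y)
    return np.clip(g_v, 0, 255).astype(np.uint8)
```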
  • a computer program comprising instructions which, when executed by a data processing system, cause the data processing system to perform a method according to the first or second aspects of the present invention.
  • a computer program product comprising a computer-readable storage media storing a computer program according to the third aspect of the present invention.
  • a system comprising a data processing system configured to perform the method according to the first or second aspects of the present invention.
  • the system may further comprise at least one terminal.
  • The, or each, terminal may comprise a light-level sensor to measure a light level for controlling image capture settings, a camera for capturing images of a region of a growing crop based on the image capture settings, data storage for storing images captured by the camera, a wireless network interface for transferring data, including images captured by the camera, to a remotely-located image processing computer system and an on-board computer system for controlling storage and transfer of captured images.
  • the on-board computer system may be configured to determine whether an image-quality characteristic of the captured image satisfies a predetermined condition and, upon a negative determination, to discard the captured image such that only images satisfying the predetermined condition are transferred. According to a sixth aspect of the present invention there is provided a terminal.
  • the terminal comprises a light-level sensor to measure a light level for controlling image capture settings, a camera for capturing images of a region of a growing crop based on the image capture settings, data storage for storing images captured by the camera, a wireless network interface for transferring data, including images captured by the camera, to a remotely-located image processing computer system and an on-board computer system for controlling storage and transfer of captured images.
  • the on-board computer system is configured to determine whether an image-quality characteristic of the captured image satisfies a predetermined condition and, upon a negative determination, to discard the captured image such that only images satisfying the predetermined condition are transferred.
  • Figure 1 is a schematic block diagram of an image capture and data processing system
  • Figure 2 illustrates a terminal used in a plant-breeding version of an image capturing system
  • Figure 3 is a plan view of the terminal and crop shown in Figure 2;
  • Figure 4 illustrates the terminal shown in Figures 2 and 3 in more detail
  • Figure 5 illustrates a terminal in the form of a cart used in a farming version of an image capturing system
  • Figures 6a, 6b and 6c illustrate a cart using a farming version of the image capturing system in first, second and third configurations respectively;
  • Figure 7a is a front elevation of the cart shown in Figure 5 with soil augers deployed;
  • Figure 7b is a plan view of the underside of the cart shown in Figure 5;
  • Figure 8 is a process flow diagram of a method of capturing images, and processing and quantifying crop growth patterns and adaptive traits
  • Figure 9a illustrates plots of height for five near isogenic lines (NILs) of wheat, thermal time, solar radiation, and rainfall against time;
  • NILs near isogenic lines
  • Figure 9b is a plot of relative growth rate (RGR) against time for five NILs at different stages of growth
  • Figures 9c and 9d are heat maps for first and second growth traits (namely RGR and canopy height) respectively showing the relationship between environmental factors in relation to four key growth stages (GS32-69);
  • Figure 9e shows graphs of actual and estimated height against time for a combination of five NILs
  • Figure 9f shows actual and predicted growth stages
  • Figures 10a to 10e show graphs of actual and estimated height against time for five NILs and for a combination of the five NILs;
  • Figure 11 is a schematic block diagram of servers and software modules used in the image-based phenotyping system
  • Figure 12 is a process flow diagram of a method of capturing images of a crop
  • Figure 13 is a process flow diagram of a method of selecting images from a series of captured crop images
  • Figure 14 is a process flow diagram of a method of defining coordinates of a reference system
  • Figure 15 is an image of a crop, ranging pole and reference points
  • Figure 16 is an image showing identification of reference points
  • Figure 17 is an image showing computer-generated points and rectangles around the reference points
  • Figure 17a illustrates pixel-metric conversion
  • Figure 18 is an image of a crop, ranging pole and reference points
  • Figure 19 is an image showing six different classes of objects
  • Figure 20a is an image resulting from a first type of edge detection
  • Figure 20b is an image resulting from a second type of edge detection
  • Figure 20c is an image combining the first image shown in Figure 20a and the second image shown in Figure 20b;
  • Figure 21 is an image showing computer-identified markers
  • Figure 22 is an image showing a constructed reference system
  • Figure 23 is an image showing a crop canopy
  • Figures 24a and 24b illustrate measuring visibility of a ranging pole to determine the height of the plot over a season
  • Figure 25 illustrates measuring a canopy region using corner-featured detection over a season
  • Figure 26 illustrates measuring lodging risk using pattern measurements over a season
  • Figures 27a and 27b illustrate calculation of entropy and skewness
  • Figure 28 is an image where a ranging pole is partially covered
  • Figure 29 is a process flow diagram of a method of tracking plots of interest based on initial reference positions
  • Figure 30 illustrates a series of images showing tracking plots of interest
  • Figure 31 illustrates a series of images showing marking of the tips of erect leaves
  • Figure 32 is a process flow diagram of a method of modelling interaction between the crop and environment
  • Figure 33 is a process flow diagram of a method of predictively modelling crop growth.
  • an image capturing and data processing system 1 is shown for use in field phenotyping.
  • the system 1 can allow automatic monitoring of crop growth and development using low-cost, in-field terminal workstations 2 (or "terminals").
  • a set of n terminals 2 may be used, where n may be one or at least two, for example, between two and 20.
  • the crop 3 may be a cereal crop, such as wheat, maize or rice.
  • Each terminal 2 includes a computer system 4, preferably in the form of a single-board computer (such as a Raspberry Pi 2 or a Raspberry Pi 3), on-board storage 5, a camera module 6, a temperature and humidity sensor module 7, a light sensor module 8, a soil sensor module 9, a wireless network interface module 10 and wired network interface(s) 11.
  • Terminals 2 may be connected to an in-field wireless local area network 12 and a terminal 2 may be connected to other terminals 2 by a wired peer-to-peer connection 13.
  • the terminals 2 may provide nodes in a wireless mesh network.
  • the computer system 4 includes at least one processor (not shown) and memory (not shown).
  • a terminal 2 can be used to perform continuous monitoring using high-resolution (e.g. 5 megapixel), time-lapse photography, in-field evaluation and data transfer via real-time file sharing and data exchange servers.
  • the camera module 6 can be used to capture, for example, between 1 and 6 images each hour. Preferably, three images per hour are captured, i.e. resulting in 48 images per day (assuming images are only captured during 16 hours of a 24-hour day).
  • the system 1 includes a central web server 14, a data processing system 15, preferably in the form of a high-performance computing (HPC) system, having associated storage 16 and a database storage 17 which receives image and environment data from the central web server 14.
  • the system 1 may provide a web-based interface, via a network, to a remotely-located computer system 19 for monitoring workstation status and data.
  • Fixed and/ or mobile computing devices 20, 21 can be used to access the system.
  • the terminals 2 take the form of compact units fixedly mounted on posts 21 (or "poles") and can be used by plant breeders to monitor a crop 3, such as wheat.
  • Each terminal 2 is elevated to a height h p above ground level 22, typically between 50 and 100 cm and preferably 75 cm.
  • a ranging pole or rod 23, preferably having a series of alternating coloured bands or sections of equal height, is placed roughly at the centre of a sample plot 24.
  • the ranging pole 23 extends to a height ht above ground level, preferably 1.2 m, and is located a distance d1 from the terminal 2, preferably between 1.4 and 1.5 m.
  • the sample plot takes the form of a square or rectangle having an area A, preferably about 1 m².
  • a white, rectangular reference point (or "fiducial mark") 25 is inserted in the ground 26 between the terminal 2 and the ranging pole 23.
  • the reference point 25 is located a distance d2 from the terminal 2, preferably between 1.0 and 1.1 m.
  • a soil sensor 9 is inserted in the soil 26 in or close to the sample plot 24.
  • a cable (not shown) connects the terminal 2 and the soil sensor 9.
  • the terminal 2 comprises a weather-proof case 29 (herein referred to as a "housing" or "enclosure") formed from acrylonitrile butadiene styrene (ABS) or other suitable material, having a generally rectangular prism shape and having a top 30, an underside 31, front 32, sides 33 and rear 34.
  • ABS acrylonitrile butadiene styrene
  • the terminal 2 has a camera hood 35 which shields a UV lens 36 disposed in front of camera module 6.
  • the camera module 6 preferably takes the form of an RGB camera module.
  • the camera module 6 is connected to the single-board computer 4 via a cable 37 and on-board connector 38.
  • An LED 39 is disposed in the top 30 of the case 29 and is connected to the single-board computer 4 via a general-purpose input/output 40.
  • the single-board computer system 4 includes an integrated sensor board 41.
  • the single-board computer system 4 includes an Ethernet connector 42 and USB connector 43 which are connected to respective RJ45 Ethernet and mini-USB sockets 44, 45 at the rear 34 of the case 29 and which provide wired network interfaces 11.
  • the single-board computer system 4 is provided with a WiFi dongle which provides a wireless network interface module 10 and with USB storage which provides on-board storage 5.
  • the USB storage 5 may have a capacity of 16 GB or more.
  • the single-board computer system 4 is powered via a (e.g. 12V/5V) voltage converter 46 which receives power from an external source such as a (e.g. 12V) battery (not shown) which is trickle-charged by a solar panel (not shown) or via a (e.g. 5V/2A) power supply (not shown) via an external power and data connector 47.
  • the soil sensor 9 (Figure 2) is connected to the single-board computer system 4 via the external power and data connector 47.
  • An environmental sensor module 48 is mounted on the top 30 of the case 29 and can be connected to the single-board computer system 4 via a data cable 49.
  • the single-board computer 4 may be provided with a heat sink 50.
  • the terminal 2 takes the form of a cart.
  • the camera module 6 and, optionally, other modules are contained in a sensor housing 51 having a dome-shaped cap and which is mounted to a distal end of a telescopic pole 52 which extends upwardly from a cart enclosure 53 (or "cart body") having a top 54 and underside 55, front 56, sides 57 and rear 58.
  • the sensor housing 51 is elevated to a height h tp above ground level 22, typically between 50 cm and 3 m, preferably 2.5 m.
  • the reference point 25 is located a distance d2 from the terminal 2, typically between 2 and 4 m, preferably 3 m.
  • wheels 59 are deployed from the underside 55 of the cart enclosure 53 and the cart can be manoeuvred into position using handles 60 which are rotatably mounted to the sides 57 of the cart enclosure 53 close to the rear 58.
  • the wheels 59 are withdrawn into the enclosure 53 and the handles 60 are folded back against the sides 57 of the cart enclosure 53 so that the cart rests on legs 61.
  • a solar panel 62 pivotably mounted on the top 54 of the cart enclosure 53 is folded out.
  • the sensor housing 51 can be raised from the cart enclosure 53.
  • a set of four soil augers 63 can drill, from the underside 55 of the cart, into the soil so as to secure the cart in position.
  • the telescopic pole 52 is mounted to a turntable 64, driven by a motor 65, which allows the pole 52 to be rotated.
  • the system 1 can facilitate automatic crop phenotyping.
  • scientists and agricultural practitioners can access terminal workstations 2 remotely for real-time monitoring using, for example, a computer 19 (for example, located in an office or laboratory) or a tablet or smart phone 20 (for example, located in the field). Users can inspect not only a whole field in different regions using a plurality of terminals 2, but also can control terminals 2 to review performance of crops, initiate new monitoring sessions or transfer on-board phenotypic and sensor datasets to external computing storage.
  • The control system supports the collation of phenotypic and sensor data for storage, visualisation, GUI-based systems interactions, and processing on a high-performance computing (HPC) system 14, 15, 16, for example, in the form of an SGI UV 2000 system having Intel (RTM) Xeon (RTM) cores.
  • HPC high-performance computing
  • An in-field weather station can be used to collect meteorological data.
  • Phenotypic and climate datasets are managed and saved in the data processing system 15 for durable data storage and centralised trait analysis.
  • the terminals 2 are configured to take images at a rate of three per hour and at a resolution of 2592 x 1944 pixels so as to capture phenotypic plasticity, early expression of traits and crop-environment interactions. For example, over 200 GB of data may be generated by ten terminals 2 in a field season during a 95-day period.
  • analytic libraries such as OpenCV, Scikit-learn and Scikit-image can be used, together with automated bioimage informatics algorithms embedded in a high-throughput trait analysis pipeline.
  • a terminal 2 captures images of the crop and transmits the images to the data processing system (step S0).
  • the data processing system selects representative crop images according to their size, clarity, imaging dates and genotypes (step S1). Preferably only high-quality images are used for trait analysis, although all images (including low-quality images) can be stored.
  • the data processing system 15 defines reference positions of plots monitored during the experiment (step S2).
  • the data processing system 15 identifies an initial reference position of a given plot and transforms every image in the series to the same position for comparison. For instance, the data processing system 15 detects coordinates of the white reference points 25 (Figure 2) and dark markers on the ranging pole 23 (Figure 2) to carry out colour feature selection. The data processing system 15 classifies pixels into different groups, such as crop canopy, wheel tracks and plot regions, based on machine-learning methods such as k-means and spectral clustering, as sketched below. The system 15 then establishes a pseudo reference system that records the plot area, the canopy space, height markers and the pixel-metric conversion.
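A minimal sketch of that classification step, assuming six object classes and HSV colour features; the function name and parameter values are illustrative rather than the patent's own:

```python
# Sketch of the pixel-classification step, assuming six object classes
# (sky, soil, crop canopy, shadow, references, markers) as described above.
import cv2
import numpy as np
from sklearn.cluster import KMeans

def classify_pixels(bgr_image, n_classes=6):
    # HSV separates colour from brightness, which suits grouping canopy,
    # soil and sky pixels under variable in-field lighting
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    h, w, _ = hsv.shape
    features = hsv.reshape(-1, 3).astype(np.float32)
    # In practice the image may be downsampled first to keep this tractable
    model = KMeans(n_clusters=n_classes, n_init=10, random_state=0)
    return model.fit_predict(features).reshape(h, w)  # per-pixel class map
```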
  • the data processing system 15 uses the initial reference positions to perform in-depth trait analysis (step S3).
  • the data processing system 15 employs an adaptive intensity and gamma equalisation to adjust colour and contrast to minimise colour distortion caused by variable in-field lighting.
  • the system 15 tracks geometric differences between the plot on a given image and the initial plot position. If different, a geometric transformation is applied to recalibrate the image, which removes areas outside the plot area. This can generate black bars of different sizes at the top of the given image.
  • the data processing system 15 calculates the crop height by detecting the visible part of the ranging pole as well as the canopy region.
  • the data processing system 15 locates corner-featured points within the canopy region to generate pseudo points so as to locate the tips of erect leaves at stem elongation or jointing (i.e. GS 32-39).
  • the system can include other dynamic measures of a number of traits in the pipeline (step S4). For example, vegetative greenness is calculated through separating the green channel in RGB images within plots of interest. An output, which lies in a range between 0 and 255, can be used to assess green biomass and stay-green (i.e. prolonged green leaf area duration through delayed leaf senescence). Morphological traits, such as the main orientation of a given plot, which lies in a range between 0° and 180°, are quantified based on an optimised edge detection method, which computes the alignment of crop stems for assessing stem rigidity and lodging risk.
  • NILs Near-isogenic lines
  • RGR daily relative growth rates
  • Ppd-1 lof was the last to stop increasing in height.
  • the heights of the five NILs were very similar in the middle of June (highlighted by a dashed circle), which verifies what was observed in the field, as the NILs were at different growth stages.
  • the scatter chart shown in Figure 9b shows the growth vigour of the five genotypes, active from jointing to flowering (i.e. GS32-69) and inactive after GS71 (grain-filling).
  • the Pearson correlation coefficient and p-value were calculated based on growth traits such as normalised RGR (nesting three-day rates to reduce noise) and canopy height at four key stages (i.e. jointing, booting, heading and flowering).
  • growth traits p ⁇ 0.01
  • Figure 9e shows how the model forecasts the overall Paragon growth data (GT, mean squared error: 20.1512, correlation: 0.9991).
  • the model uses the six environmental factors at six stages (i.e. GS32-95) as the input to obtain estimates of the relative growth rates y_t for every given genotype.
  • the model is also applied to predict the growth of the five NILs, and the estimated growth is compared with the recorded data generated by the system.
  • a second model is used to forecast the timing and duration of key growth stages (i.e. GS32-95) to link the crop growth prediction with real-world agricultural practices.
  • GS32-95 key growth stages
  • farmers, growers and breeders can make sound decisions based on the difference between the predicted growth curve and the actual growth pattern measured by the system.
  • This approach could also assist agricultural practitioners in terms of line selection, fertiliser application, irrigation and harvesting to secure yield production.
  • Figure 9f illustrates the performance of the second model. It employs a set of support vector machines (SVMs) with radial basis function kernels to classify the timing and duration of key growth stages. The model was tested by comparing the predicted growth stages with the true data measured by the crop monitoring system.
  • SVM support vector machines
  • Figures 10a to 10e show graphs of actual and estimated height against time for five NILs and for a combination of the five NILs;
  • the data transfer and processing system 81 includes several servers and software modules
  • the system 81 comprises a data transfer server 82 and a remote-control server 83 running on the central web server 14 which allows users to connect to terminals 2 (Figure 1).
  • the system also includes support modules 84 for performing functions such as uploading sensor data and hardware information.
  • Representative daily images are routinely selected and transferred to the central server during the night, which provides a daily snapshot of the monitored crops.
  • the system includes a control module 85 running on the central web server 14 which logs updates received from clients, in other words the terminals 2.
  • the terminal 2 includes an imaging program 86 to control the camera module 6 ( Figure 1) for time-lapse crop monitoring.
  • the program 86 can automatically adjust white balance, exposure mode and shutter speed in relation to variable in-field lighting conditions using an interface (for example the picamera package). Both image resolution and imaging frequency can be changed if users want to modify experimental settings.
  • the program 86 can also conduct initial quality control and data backup after every image is captured.
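A minimal sketch of such a capture routine, using the picamera package named above; read_light_level() is a hypothetical stand-in for the terminal's light sensor and the lux threshold is an assumption:

```python
# Illustrative time-lapse capture loop; picamera calls are the package's
# real API, but the light-sensor helper and thresholds are assumed.
import time
from picamera import PiCamera

camera = PiCamera(resolution=(2592, 1944))  # 5-megapixel stills

def capture_one(path):
    lux = read_light_level()              # hypothetical light-sensor helper
    camera.awb_mode = 'auto'              # track white balance automatically
    camera.exposure_mode = 'night' if lux < 50 else 'auto'
    camera.shutter_speed = 0              # 0 selects automatic shutter speed
    time.sleep(2)                         # let gain and white balance settle
    camera.capture(path)

while True:
    capture_one(time.strftime('/home/pi/images/%Y%m%d-%H%M%S.jpg'))
    time.sleep(20 * 60)                   # three images per hour
```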
  • An application 87 running on each terminal 2 is run at regular, scheduled intervals.
  • the application 87 queries the terminal 2 to determine workstation status information such as uptime, network addresses and storage usage.
  • Sensor data and more variable system data, such as CPU temperature and processor/memory usage, are sampled at a higher frequency and a mean of the readings is recorded during the query.
  • JSON JavaScript Object Notation
  • the data is sent to the central server 14, which stores the data in a database 88 (for example, a SQL database) running on the data processing system 15.
  • the status of the system can be displayed and automatically updated using a web-based interface via a web browser 89 running on a remote computer 19; whether each node 2 is online is determined from the time of its last update.
  • the web interface provides information, including the location of terminals 2 in the field (a field map can be uploaded to the central server), graphs of collected terminal/sensor data, and facilitates SSH and VNC linking to active nodes 2.
  • the system provides a centralised real-time monitoring system to administer the network of in-field workstations 2 and collate collected data for visualisation, batch processing and annotation.
  • the application 87 imports vision and imaging libraries (step S0.1), receives experimental settings including genotypes, workstation ID, imaging frequency and duration (step S0.2) and checks that the hardware, such as the WiFi interface 10, USB flash drive 5 and the like, is operating correctly (steps S0.3 & S0.4). If the hardware is not operating correctly, then the user is prompted to check the hardware via the user interface (step S0.5). The user then sets up imaging dates and creates folders for capturing and saving time-lapse image series (step S0.6). The application 87 then starts the process of periodically acquiring images.
  • the application 87 dynamically adjusts imaging settings, such as white balance, shutter speed and exposure mode, based on in-field light conditions (step S0.8).
  • the application checks whether on-board storage is full or imaging should stop for other reasons (step S0.9). If there is sufficient storage, then the application triggers image capture and saves the image in on-board storage 5 (step S0.10).
  • the application 87 checks image quality (step S0.11). If the image is of insufficient quality, the image is archived and removed from the series of images (step S0.12). If the image is of sufficiently high quality, then the application places the terminal into a sleep mode and sets a timer (not shown) for the next image (step S0.13).
  • the data processing system 15 executes several algorithms and software modules 90, 91, 92, 93, 94, 95.
  • An image selection algorithm 90 performs an assessment of large image datasets captured in field trials by comparing images with a number of fixed criteria.
  • the algorithm 90 performs initial set up including importing vision and imaging libraries (step S1.1), opening a GUI to receive a user selection of an image series (step S1.2) and setting up file systems on the data processing system 15 for saving results (step S1.3).
  • the algorithm 90 can also perform checks that the data processing system 15 is operating correctly (for example, that a valid directory has been selected) and, if not, to prompt the user to perform a check.
  • the image selection algorithm 90 goes through each image in the series (steps S1.5 & S1.6) to determine whether the image meets analysis standards (step S1.7). Those that meet the standard are collated. Each image is quantified by brightness, shadow percentage and sharpness, allowing images that meet or exceed a set of thresholds to be retained for further traits analysis.
  • the median value of pixel intensity is taken by transforming the image into hue, saturation and value ("HSV") colour space. If the median intensity value is lower than a set threshold, the image is culled and not used from this point forward.
  • the median brightness may be assigned a value between 0 (dark) and 1 (bright) and a threshold having a value of between 0.3 and 0.5 may be used.
  • a threshold value of over 0.5 corresponds to an image taken in bright sunshine.
  • Image sharpness (or "image clarity") is determined by applying Sobel edge detection. The detectable edges are calculated and then correlated with sharpness and exposure range of the image.
  • the result of clarity detection is also compared to a set threshold, which will disqualify images if they are out of focus or unclear with ill-defined edges.
  • For example, the obtained value may lie between 0 (blurred) and 1 (sharp) and a threshold may take a value of between 0.3 and 0.5.
  • a threshold value of over 0.5 corresponds to a sharp image.
  • Measuring shadow areas involves determining the proportion of the image containing dark pixels and comparing it to a threshold value. For example, the proportion may take a value between 0 (all shadow) and 1 (no shadow) and a threshold may take a value of 0.2 (i.e. 20%).
  • Measuring size involves determining the size of the image and comparing it to a threshold value. For example, the threshold may be 3.0 MB. If the image selection algorithm 90 judges the image to be of low quality, then the image is removed from the series (step S1.9). Information about the discarded image may be recorded.
  • the selected image is included in a result folder, with a CSV file recording image metadata for further high-throughput image analysis (step S1.10).
  • Image selection may be based on one, two, three or all four of these measures.
  • image selection is based on all four measures, as sketched below. Other measures may be used.
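A sketch of the four measures, using the example thresholds quoted above; the sharpness normalisation is an assumed proxy for the edge-based clarity measure, and the shadow test is read as "at most 20% of pixels are dark":

```python
# Sketch of the four image-quality checks: brightness, sharpness, shadow
# proportion and file size; thresholds follow the example values above.
import os
import cv2
import numpy as np

def passes_quality_checks(path, min_brightness=0.4, min_sharpness=0.4,
                          max_shadow=0.2, min_size_mb=3.0):
    img = cv2.imread(path)
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    v = hsv[:, :, 2].astype(np.float32) / 255.0

    brightness = float(np.median(v))             # 0 (dark) to 1 (bright)

    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1)
    # Normalised mean gradient magnitude: roughly 0 (blurred) to 1 (sharp);
    # an assumed proxy for the patent's edge-correlation measure
    sharpness = float(np.clip(np.hypot(gx, gy).mean() / 100.0, 0.0, 1.0))

    dark_fraction = float((v < 0.2).mean())      # proportion of shadow pixels
    size_mb = os.path.getsize(path) / 1e6

    return (brightness >= min_brightness and sharpness >= min_sharpness
            and dark_fraction <= max_shadow and size_mb >= min_size_mb)
```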
  • a plot detection algorithm 91 detects initial reference positions of monitored plots.
  • the plot detection algorithm 91 performs initial set up including importing vision and imaging libraries (step S2.1), opening a GUI to receive a user selection of an image to serve as reference image (step S2.2) and setting up file systems on the HPC system for saving results (step 2.3).
  • the plot detection algorithm 91 can also perform checks that the data processing system 15 is operating correctly (for example, that a valid directory has been selected) and, if not, to prompt the user to perform a check.
  • the plot detection algorithm 91 may perform gamma correction (step S2.4), for example, so as to balance intensity distribution towards 50%. This can help with image screening.
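The patent does not specify the exact correction; a common look-up-table approach that pulls the median intensity towards 50% would be:

```python
# Sketch of gamma correction balancing the median intensity towards 50%;
# the LUT implementation is an assumption, not the patent's own code.
import cv2
import numpy as np

def balance_gamma(bgr_image):
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    median = np.median(gray) / 255.0
    gamma = np.log(0.5) / np.log(max(median, 1e-6))  # median**gamma == 0.5
    lut = ((np.arange(256) / 255.0) ** gamma * 255).astype(np.uint8)
    return cv2.LUT(bgr_image, lut)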
  • the plot detection algorithm 91 identifies the coordinates of the white reference canes 26, so as to define the plot region, and the dark height markers on the ranging pole 24, using colour-based feature selection on the basis of HSV and Lab non-linear colour spaces (step S2.5).
  • the plot detection algorithm 91 uses a normalised grey image scale to detect white parts of the image which have a high saturation value. For example, this may involve keeping only 30% of the pixels having the highest saturation value.
  • the plot detection algorithm 91 then removes small objects, for example, those which have a height (or other dimension) of no more than a given number of pixels.
  • Reference canes 26 have a height of over 12,500 pixels and, thus, a threshold less than 12,500 is used. Holes in the image, i.e. the detected small objects, are filled in as black.
  • Figure 16 shows the result of identifying the most saturated regions and removing small objects from the image.
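A sketch of this selection step, following the text's description of keeping the 30% most saturated pixels and removing objects below the 12,500-pixel figure quoted above; note that, in practice, white objects may be better isolated on the value (brightness) channel than on saturation:

```python
# Sketch of the saturation-based cane detection using scikit-image, one of
# the libraries named earlier; parameter values follow the text above.
import numpy as np
from skimage import color, morphology

def candidate_reference_mask(rgb_image, keep_fraction=0.30, min_size=12500):
    hsv = color.rgb2hsv(rgb_image)
    sat = hsv[:, :, 1]
    threshold = np.percentile(sat, 100 * (1 - keep_fraction))
    mask = sat > threshold                      # most saturated 30% of pixels
    # Drop ("blacken") objects smaller than the reference-cane size
    return morphology.remove_small_objects(mask, min_size=min_size)
```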
  • the plot detection algorithm 91 identifies the reference canes 26 based on size and the ratio of width-to-length ("WL ratio").
  • the plot detection algorithm 91 defines a rectangle 101 for each reference cane 26 and a corresponding centre 102.
  • the plot detection algorithm 91 classifies pixels into different groups, such as sky 103, soil 27 between plots, crop canopy 105, shadow 106, references 26, markers 107 and other objects (not shown), using unsupervised machine-learning techniques such as k-means and spectral clustering (step S2.6).
  • Figure 17 shows the RGB image and Figure 18 shows the corresponding classified image where different groups are differently coloured.
  • the top half of the image is generally navy blue and the bottom half mainly comprises region of cyan and navy blue.
  • the large region of navy blue contains flecked regions of red and orange.
  • If no objects are found (step S2.7), the plot detection algorithm 91 prompts the user to select another image to use as a reference (step S2.8).
  • the algorithm establishes a pseudo three-dimensional reference system that records the two-dimensional coordinates of the plot area, the canopy region, and height markers through a range of feature selection approaches (steps S2.9 to S2.12).
  • the pixel-metric conversion is also computed based on height markers on the ranging pole 24.
  • pixel-metric conversion includes counting the number of pixels between the centres C of adjacent dark markers 107 and/or an angle of inclination, θ, of a line between the centres of adjacent dark markers.
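A small sketch of this conversion follows; the physical band height is an assumption, since the patent states only that the pole's bands are of equal height:

```python
# Sketch of the pixel-metric conversion from two adjacent marker centres;
# band_height_cm is an assumed value, not stated in the patent.
import math

def pixel_metric_conversion(c1, c2, band_height_cm=20.0):
    dx, dy = c2[0] - c1[0], c2[1] - c1[1]
    pixels = math.hypot(dx, dy)                 # pixel distance between centres
    theta = math.degrees(math.atan2(dy, dx))    # inclination of the marker line
    return pixels / band_height_cm, theta       # pixels per cm, angle
```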
  • the plot detection algorithm 91 finds markers on the ranging pole 24 using two techniques.
  • the plot detection algorithm 91 uses an edge detector, such as a Canny edge detector, to detect the markers 107 (which have well-defined edges with respect to the pole and to adjacent light markers).
  • the plot detection algorithm 91 uses global thresholding using the median intensity value to find the markers 107 (which are very dark).
  • the plot detection algorithm 91 can combine the results of the two approaches.
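One plausible way of combining the two routes is a logical AND of a dilated edge map and the dark-pixel mask, sketched below; the exact combination rule and thresholds are assumptions:

```python
# Sketch combining Canny edges with a global dark-intensity threshold;
# the 0.5x-median darkness factor and the AND rule are assumed choices.
import cv2
import numpy as np

def find_dark_markers(gray):
    edges = cv2.Canny(gray, 50, 150)                       # well-defined edges
    edges = cv2.dilate(edges, np.ones((5, 5), np.uint8))   # close marker outlines
    dark = (gray < np.median(gray) * 0.5).astype(np.uint8) * 255  # very dark pixels
    return cv2.bitwise_and(edges, dark)                    # dark AND edged regions
```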
  • the plot detection algorithm 91 locates the markers 107 on the ranging pole 24.
  • the plot detection algorithm 91 identifies the (three) reference points 26, or more specifically the centres 102 of the reference points 26, and computes a location for a fourth reference point 108 and defines a polygon 109 having the reference points 102, 108 as the vertices.
  • the plot detection algorithm 91 calculates corners 110 of a polygon 111 which defines a resized canopy region 112. As the crop grows, the polygon 111 moves and the corresponding entropy changes.
  • a crop performance measurements algorithm 92 is used to measure canopy height, identify corner features and measure growth, colour and orientation trait analysis. For a given crop genotype, adaptive intensity and gamma equalisation is applied to the image to minimise colour distortion caused by variable field illumination.
  • the algorithm 92 can determine the canopy height using a number of different approaches, in case one of them cannot extract the height reading as planned.
  • a first approach is simply to inspect the ranging pole 24 and identify the visible part 113 of the pole 24 and, thus, "read" the height of the plot off the pole 24.
  • the first approach may not always be possible, especially as the crop gets taller, or when the ranging pole is covered by random objects such as pointing leaves, agricultural vehicles or people, which could provide false-positive height readings, or when all of the ranging pole 24 is obscured.
  • a second approach involves determining coordinates of the top of leaves which are labelled with pseudo points 114 and then calculating the median value of heights of the pseudo points 114.
  • the two-dimensional pseudo points' height coordinates (y-axis) are calculated within the crop canopy space and then the median value 115 of the height readings is computed to represent the canopy height at the time point.
  • a third approach is an entropy-based process of detection, in particular calculating entropy using grey-level co-occurrence matrices (GLCM).
  • GLCM grey-level co-occurrence matrices
  • the entropy-based texture analysis is used to detect whether the canopy region 112 enclosed by the hyper plane 111 changes between two adjacent images using GLCM.
  • the texture analysis is able to determine a weighted centroid position and, based on the weighted centroid position, the position of the hyper plane can be determined.
  • Figure 26 shows, on the right-hand side, images of the crop at different times in which a region 116 of the image is identified as being the crop plot (and false-coloured bright green) and for which entropy H and skewness can be calculated. If positional changes are identified, for example, the weighted centroid moves up or down (depending on growth stages), the position of the hyper plane 111 is changed and the canopy height is recorded. After that, corner-featured points are detected within the new canopy space. This step generates pseudo points that are cast in the canopy region for verifying canopy height measures from the previous approaches. In other words, the coordinates of the pseudo points can be used to compute a canopy height. Thus, even if the canopy and/or ranging pole is partially covered (for example, as shown in Figure 28), it is still possible to obtain a height reading.
  • the terminal 2 was moved, resulting in a black bar across the top of the image. Pseudo points, however, can still be used to detect the canopy height, even when the other approaches could not generate precise height measurements.
  • the dynamic canopy height changes are computed by combining the measurements of height markers with the weighted centroid position and the red pseudo points.
  • the algorithm 92 may try determining the canopy height using different approaches in order. Thus, if the algorithm 92 is unable to determine the height using the ranging pole 24 or obtains an unexpected value (e.g. which exceeds the previously-obtained value by a given percentage), then it will attempt to determine the canopy height by the identified corner-featured points.
  • the algorithm 92 may try determining the canopy height using all available approaches and combine some of them.
  • the crop performance measurements algorithm 92 performs initial set up including importing vision, imaging and machine learning libraries (step S3.1), opening a GUI to receive a user selection of an image series (step S3.2) and setting up file systems on the data processing system 15 for saving results (step S3.3).
  • the crop performance measurements algorithm 92 can also perform checks that the data processing system 15 is operating correctly (for example, that a valid directory has been selected) and, if not, to prompt the user to perform a check.
  • the crop performance measurements algorithm 92 employs an adaptive intensity and dynamic gamma equalisation to adjust colour and contrast to minimise colour distortion caused by diverse in-field lighting (step S3.4).
  • the crop performance measurements algorithm 92 tracks geometric differences between the plot on a given image and the initial position. If different, a geometric transformation method is applied to recalibrate the image, which removes areas outside the plot area (step S3.5).
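A sketch of such a recalibration, assuming the reference-point centres serve as correspondences and that a similarity (partial affine) transform suffices; the specific OpenCV routine is an assumed choice:

```python
# Sketch of the recalibration step: estimate a transform mapping the plot's
# reference points in the current image back to their initial positions,
# then warp the image; uncovered areas become black (hence the black bars).
import cv2
import numpy as np

def recalibrate(image, current_pts, initial_pts):
    src = np.asarray(current_pts, dtype=np.float32)
    dst = np.asarray(initial_pts, dtype=np.float32)
    m, _ = cv2.estimateAffinePartial2D(src, dst)   # rotation, scale, translation
    h, w = image.shape[:2]
    return cv2.warpAffine(image, m, (w, h))        # moved-out areas filled black
```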
  • the crop measuring algorithm 92 tracks the crop height by detecting the visible part of the ranging pole 24 (steps S3.6 & S3.7). If tracking is unsuccessful, then the algorithm 92 moves on to the next image in the series (step S3.4).
  • an entropy-based texture analysis is used to detect whether the canopy region 112 enclosed by the polygon 111 (herein referred to as the "crop canopy space", “canopy region” or “hyper plane") changes between adjacent images.
  • GLCMs are used to calculate the entropy value of canopy texture. If the entropy of the texture shifts (moving up or down, depending on growth stages), the two-dimensional coordinate of the centroid of the canopy texture is recorded, which allows the polygon 111 to be repositioned (i.e. moved vertically) to represent the change of the canopy space.
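A minimal sketch of the GLCM entropy computation using scikit-image (one of the libraries named earlier); the distance and angle parameters are illustrative, and older releases spell the function greycomatrix:

```python
# Sketch of the entropy of the canopy texture via a grey-level
# co-occurrence matrix; gray_patch must be an 8-bit greyscale array.
import numpy as np
from skimage.feature import graycomatrix

def texture_entropy(gray_patch):
    glcm = graycomatrix(gray_patch, distances=[1], angles=[0],
                        levels=256, symmetric=True, normed=True)
    p = glcm[:, :, 0, 0]
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))   # Shannon entropy of co-occurrences
```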
  • corner-featured points 114 are detected within the repositioned canopy space.
  • This step generates coloured (e.g. red) pseudo points cast in the canopy region 112, representing the tips of erect leaves at stem elongation or jointing (i.e. GS 32-39), reflective surfaces of curving leaves or crop heads between booting and anthesis (i.e. GS 41-69) and corner points on ears and grains during senescence (i.e. GS 71-95).
  • coloured e.g. red
  • Figure 30 shows a series of images illustrating crop growth from GS 37 to GS 92 over a period of 95 days.
  • the algorithm 92 applies Harris and Shi-Tomasi corner detection methods to locate corner-featured points within the canopy region 112.
  • red pseudo points 114 are generated to represent the tips of erect leaves, reflective surfaces of curving leaves, heads and the corner points on ears.
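A sketch of the corner-based height reading using OpenCV's Shi-Tomasi detector; the maxCorners and quality parameters are assumptions, canopy_mask is an 8-bit mask of the canopy space, and ground_y is the pixel row of the plot's ground line:

```python
# Sketch of the pseudo-point height reading: Shi-Tomasi corners within the
# canopy region, with the median of their rows taken as the canopy line.
import cv2
import numpy as np

def canopy_height_from_corners(gray, canopy_mask, ground_y, pixels_per_cm):
    corners = cv2.goodFeaturesToTrack(gray, maxCorners=500, qualityLevel=0.01,
                                      minDistance=5, mask=canopy_mask)
    ys = corners.reshape(-1, 2)[:, 1]             # image y grows downwards
    canopy_y = float(np.median(ys))               # median pseudo-point row
    return (ground_y - canopy_y) / pixels_per_cm  # canopy height in cm
```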
  • the main orientation of a given plot is quantified based on an optimised Canny edge detection method, which computes the alignment of crop stems.
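The patent does not detail the orientation computation; one plausible reading, sketched below, fits line segments to the Canny edge map with a probabilistic Hough transform and takes the median segment angle in the 0°-180° range:

```python
# Sketch of a main-orientation measure from stem alignment; the Hough
# parameters and the median-angle rule are assumed, not the patent's own.
import cv2
import numpy as np

def main_orientation(gray):
    edges = cv2.Canny(gray, 50, 150)
    segments = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=50,
                               minLineLength=30, maxLineGap=5)
    if segments is None:
        return None                       # no stem-like segments detected
    angles = [np.degrees(np.arctan2(y2 - y1, x2 - x1)) % 180.0
              for x1, y1, x2, y2 in segments[:, 0]]
    return float(np.median(angles))       # upright stems give angles near 90°
```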
  • the user via the GUI, may change the colour of a reference marker 26 in an image, for example, turning it pink 116. This can be used to mark events, such as if the terminal 2 or the ranging pole 24 is moved or a new monitoring task is initiated.
  • the system does not require the position of the terminal 2, the ranging pole 24 or the reference markers 26 to be known. Thus, there is no need for GPS or for recording positions of, for example, the terminal 2.
  • a machine-learning algorithm e.g. clustering method, can be used to determine the region of interest, e.g. crop canopy space.
  • a data interpolation and analysis module 93 can be used to handle minor data loss during the field experiments.
  • a crop-environment interaction modelling module 94 is used to identify interactions between the recorded crop growth of five wheat genotypes and a number of environmental factors (steps S4.1 to S4.9).
  • Correlations are performed for each environmental factor grouped over three days with the recorded growth data.
  • the reason for grouping environmental factors into nested three-day periods is to remove outliers and smooth the input data.
  • the correlations are determined for each growth stage for five genotypes. The analysis is performed on the grouped data as particular stages (e.g. booting and heading) contain few recorded growth data points due to the short duration of both stages.
  • a formula (e^RGR)^−1 is used to transfer negative correlation values, as the RGR series is a decreasing sequence in relation to the increasing nature of growth stages.
  • h_t is the height of the plant at the current time point, h_(t−1) is the height of the plant at the previous time point and h_0 is equal to the initial height.
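The extracted text does not reproduce the RGR equation itself; a standard relative growth rate consistent with the variables defined above would be (an assumption, not the patent's verbatim formula):

```latex
% Assumed standard form of the relative growth rate per time step
\mathrm{RGR}_t = \ln h_t - \ln h_{t-1}
```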
  • the growth stage predictive model is based on a GxPxE model hereinbefore described.
  • the model is produced to explore how to predict growth stages of different wheat genotypes on the basis of real growth traits and environment data. It employs support vector machines (SVM) with radial basis function kernels to classify growth stages, as SVMs are popular machine learning techniques for classification.
  • SVM support vector machines
  • the performance of the model is tested on overall Paragon wheat growth data (GT) and Paragon WT (G1), as GT performs well in the GxPxE interaction model whereas G1 performs poorly.
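A minimal sketch of such a classifier with scikit-learn; the feature layout and the scaling step are illustrative, not the patent's exact configuration:

```python
# Sketch of the growth-stage classifier: an RBF-kernel SVM trained on
# environmental factors plus measured growth traits.
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def train_stage_classifier(X_train, y_train):
    # X_train rows: e.g. [thermal time, radiation, rainfall, ..., RGR, height]
    # y_train: growth-stage labels such as 'GS32', ..., 'GS95'
    clf = make_pipeline(StandardScaler(), SVC(kernel='rbf', gamma='scale'))
    return clf.fit(X_train, y_train)
```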
  • Terminals need not use a single-board computer.
  • a high-performance computing system need not be used.
  • a desktop computer or workstation can be used.
  • Although claims have been formulated in this application to particular combinations of features, it should be understood that the scope of the disclosure of the present invention also includes any novel features or any novel combination of features disclosed herein either explicitly or implicitly or any generalization thereof, whether or not it relates to the same invention as presently claimed in any claim and whether or not it mitigates any or all of the same technical problems as does the present invention.
  • the applicants hereby give notice that new claims may be formulated to such features and/or combinations of such features during the prosecution of the present application or of any further application derived therefrom.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Quality & Reliability (AREA)
  • Geometry (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Probability & Statistics with Applications (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

A method of processing images of a crop, particularly a cereal crop, is described. The method comprises retrieving a series of images of a crop (3; Fig. 2) captured over a period of time and identifying, in an image selected from the series of images to be used as reference image, a reference system against which other images can be compared, the reference system including an extent of a crop plot (111; Fig. 23) and/or one or more reference points, such as height markers (107; Fig. 21). The method also comprises, for each of at least one other image in the series of images, calibrating the image using the reference system, and determining a height of a canopy of the crop in the image, a main orientation of the crop and/or a value indicative of vegetative greenness.

Description

Data processing of images of a crop
Field of the invention
The present invention relates to data processing of images of a crop, in particular a cereal crop such as wheat, maize or rice, for use in image-based field phenotyping.
Background
Due to a narrowing range of available genetic diversity of modern crop germplasm and increasing fluctuations in weather caused by global climate change, research is being directed to searching for new sources of variation, such as landraces and wild relatives, to seek traits with greater yield potential, as well as environmental adaptation. This search requires robust measures of adaptive traits from many experimental plots throughout the growing season. To increase yield and to improve crop adaptation to diverse environments sustainably, modern genetic and genomics technologies have been employed to enable an efficient selection of valuable lines with high yield, biotic and abiotic stress tolerance, and disease resistance. For example, quantitative trait locus (QTL) analysis and genome-wide association studies (GWAS) can be used to examine genetic architecture, genome sequencing can be employed to reveal gene content and diversity and marker-assisted selection (MAS) or genomic selection (GS) can be used to accumulate favourable alleles. These approaches, however, are limited by low-throughput, laborious and inaccurate in-field phenotyping approaches. Phenotyping is widely recognised as forming a bottleneck preventing researchers from linking the richness of genomic and genotypic information to important traits and thereby using this information effectively for agriculture.
Agricultural practitioners, such as breeders, growers, farmers and crop scientists, have been seeking new approaches to relieving the bottleneck. For instance, non-invasive remote sensors and aerial imaging devices, such as unmanned aerial vehicles (UAVs) and blimps, are being used to study crop performance and field variability. Satellite imaging and tailored portable devices can be used to predict crop growth and yield potential based on canopy photosynthesis and normalised difference vegetation indices (NDVI). Large-scale imaging systems equipped with three-dimensional laser scanners and multispectral sensors have been used to try to automate plant monitoring for a fixed number of pots or plots either in a greenhouse (such as the Scanalyzer HTS/3D HT marketed by LemnaTec, Aachen, Germany) or in the field (for example, LeasyScan marketed by Phenospex, Heerlen, The Netherlands and Field Scanalyzer marketed by LemnaTec, Aachen, Germany). However, such systems suffer from being expensive and small scale, and from providing a low frequency of measurements. The systems tend to have inadequate software analytical tools for use by agricultural practitioners to make sense of complicated phenotypic datasets. It is desirable, therefore, to be able to measure crop growth dynamically and to identify key adaptive traits in large numbers of experimental plots in different regions. Thus, there is a need to develop an affordable, reliable phenotyping platform that can be easily used and widely adopted in breeding pipelines and by crop research communities worldwide.
Summary
According to a first aspect of the present invention there is provided a method of processing images of a crop (which may be in a field, a part of a field, a plot, a plant pot, or plant tray). The crop may be a cereal crop. The method comprises retrieving a series of images of a crop captured over a period of time and identifying, in an image (or "initial image") selected from the series of images to be used as a reference image, a reference system against which other images can be compared, the reference system including an extent of a crop plot and/or one or more reference points. The method also comprises, for each of at least one other image in the series of images, calibrating or adjusting the image using the reference system, and determining a height of a canopy of the crop in the image, a main orientation of the crop and/or a value indicative of vegetative greenness (for example, a normalised green value in an RGB colour space and/or excessive greenness).
This can afford greater flexibility when monitoring a crop, particularly large numbers of crops, over periods of months. For example, the method can be used to process images of a crop which have been captured in the field and, thus, subject to the vagaries of weather. Moreover, the method can be used for each crop and, thus, allows large volumes of data to be processed for large numbers of crops.
The one or more reference points may include a plot region (which includes the crop plot and a region around the crop plot, e.g. a gap between adjacent crop plots), a canopy space, and/or at least one height marker (which may be a graduated ranging pole and/or a reference mark).
The method may further comprise identifying at least one reference marker in the reference image. The method may further comprise classifying pixels in the reference image into one or more groups corresponding to one or more respective object types. The method may further comprise, for each of at least one other image in the series of images, identifying corner-featured points in the crop plot in the image.
The method may further comprise preparing the series of images of the crop, comprising receiving a series of captured images of the crop and, for each image in the series of captured images, determining whether the image satisfies at least one image-quality requirement and, upon a positive determination, adding the image to the series of images to be processed. The at least one image-quality requirement may include brightness of the image, size of the image file, sharpness of the image and/or the proportion of dark area in the image area.
Determining the height of the canopy may comprise detecting a visible part of a ranging pole. Determining the height of a canopy of the crop in the crop plot in the image may comprise calculating an entropy based on compactness of a texture of a crop canopy space, isotropy of the texture of the crop canopy space and distribution of intensity of the crop canopy space. Determining the height of the canopy may comprise measuring a weighted centroid of the crop canopy space. Determining the canopy height may comprise determining respective positions of corner-featured objects in the crop canopy space and calculating an average from the tip positions.
The method may further comprise, for the image series, generating dynamic growth curves defining a developmental profile for the crop, calculating stem rigidity and lodging risk based on the main orientation of the crop and/or calculating vegetation and senescence periods based on a series of the values indicative of vegetative greenness.
Vegetative greenness Gv(x,y) can be computed based on excessive greenness ExG(x,y) and excessive red ExR(x,y) indices. The vegetative greenness can be defined by:

Gv(x,y) = (2·fG(x,y) − fR(x,y) − fB(x,y)) − (1.4·fR(x,y) − fB(x,y))

excessive greenness can be defined by:

ExG(x,y) = 2·fG(x,y) − fR(x,y) − fB(x,y)

and excessive red can be defined by:

ExR(x,y) = 1.4·fR(x,y) − fB(x,y)

where fR(x,y) is the red channel of a colour image, fB(x,y) represents the blue channel, and fG(x,y) the green channel. According to a second aspect of the present invention there is provided a method of batch processing comprising processing images of at least two crops according to the first aspect of the present invention. According to a third aspect of the present invention there is provided a computer program comprising instructions which, when executed by a data processing system, cause the data processing system to perform a method according to the first or second aspects of the present invention. According to a fourth aspect of the present invention there is provided a computer program product comprising a computer-readable storage media storing a computer program according to the third aspect of the present invention.
According to a fifth aspect of the present invention there is provided a system comprising a data processing system configured to perform the method according to the first or second aspects of the present invention.
The system may further comprise at least one terminal. The, or each, terminal may comprise a light-level sensor to measure a light level for controlling image capture settings, a camera for capturing images of a region of a growing crop based on the image capture settings, data storage for storing images captured by the camera, a wireless network interface for transferring data, including images captured by the camera, to a remotely-located image processing computer system and an on-board computer system for controlling storage and transfer of captured images. The on-board computer system may be configured to determine whether an image-quality characteristic of the captured image satisfies a predetermined condition and, upon a negative determination, to discard the captured image such that only images satisfying the predetermined condition are transferred.

According to a sixth aspect of the present invention there is provided a terminal. The terminal comprises a light-level sensor to measure a light level for controlling image capture settings, a camera for capturing images of a region of a growing crop based on the image capture settings, data storage for storing images captured by the camera, a wireless network interface for transferring data, including images captured by the camera, to a remotely-located image processing computer system and an on-board computer system for controlling storage and transfer of captured images. The on-board computer system is configured to determine whether an image-quality characteristic of the captured image satisfies a predetermined condition and, upon a negative determination, to discard the captured image such that only images satisfying the predetermined condition are transferred.
This can help reduce the amount of data transmitted and the number of images that need to be processed.
Brief Description of the Drawings
Certain embodiments of the present invention will now be described, by way of example, with reference to the accompanying drawings, in which:
Figure 1 is a schematic block diagram of an image capture and data processing system;
Figure 2 illustrates a terminal used in a plant-breeding version of an image capturing system;
Figure 3 is a plan view of the terminal and crop shown in Figure 2;
Figure 4 illustrates the terminal shown in Figures 2 and 3 in more detail;
Figure 5 illustrates a terminal in the form of a cart used in a farming version of an image capturing system;
Figures 6a, 6b and 6c illustrate a cart used in a farming version of the image capturing system in first, second and third configurations respectively;
Figure 7a is a front elevation of the cart shown in Figure 5 with soil augers deployed;
Figure 7b is a plan view of the underside of the cart shown in Figure 5;
Figure 8 is a process flow diagram of a method of capturing images, and processing and quantifying crop growth patterns and adaptive traits;
Figure 9a illustrates plots of height for five near isogenic lines (NILs) of wheat, thermal time, solar radiation, and rainfall against time;
Figure 9b is a plot of relative growth rate (RGR) against time for five NILs at different stages of growth;
Figures 9c and 9d are heat maps for first and second growth traits (namely RGR and canopy height) respectively, showing the relationship between environmental factors and the growth traits in relation to four key growth stages (GS32-69);
Figure 9e shows graphs of actual and estimated height against time for a combination of five NILs;
Figure 9f shows actual and predicted growth stages;
Figures 10a to 10e show graphs of actual and estimated height against time for five NILs and for a combination of the five NILs;
Figure 11 is a schematic block diagram of servers and software modules used in the image-based phenotyping system;
Figure 12 is a process flow diagram of a method of capturing images of a crop;
Figure 13 is a process flow diagram of a method of selecting images from a series of captured crop images;
Figure 14 is a process flow diagram of a method of defining coordinates of a reference system;
Figure 15 is an image of a crop, ranging pole and reference points;
Figure 16 is an image showing identification of reference points;
Figure 17 is an image showing computer-generated points and rectangles around the reference points;
Figure 17a illustrates pixel-metric conversion;
Figure 18 is an image of a crop, ranging pole and reference points;
Figure 19 is an image showing six different classes of objects;
Figure 20a is an image resulting from a first type of edge detection;
Figure 20b is an image resulting from a second type of edge detection;
Figure 20c is an image combining the first image shown in Figure 20a and the second image shown in Figure 20b;
Figure 21 is an image showing computer-identified markers;
Figure 22 is an image showing a constructed reference system;
Figure 23 is an image showing a crop canopy;
Figures 24a and 24b illustrate measuring visibility of a ranging pole to determine the height of the plot over a season;
Figure 25 illustrates measuring a canopy region using corner-featured detection over a season;
Figure 26 illustrates measuring lodging risk using pattern measurements over a season;
Figures 27a and 27b illustrate calculation of entropy and skewness;
Figure 28 is an image where a ranging pole is partially covered;
Figure 29 is a process flow diagram of a method of tracking plots of interest based on initial reference positions;
Figure 30 illustrates a series of images showing tracking plots of interest;
Figure 31 illustrates a series of images showing marking of the tips of erect leaves;
Figure 32 is a process flow diagram of a method of modelling interaction between the crop and environment; and
Figure 33 is a process flow diagram of a method of predictively modelling crop growth.
Detailed Description of Certain Embodiments
Image-based phenotyping system 1
Referring to Figure 1, an image capturing and data processing system 1 is shown for use in field phenotyping. The system 1 can allow automatic monitoring of crop growth and development using low-cost, in-field terminal workstations 2 (or "terminals"). For a given area of a crop 3, a set of n terminals 2 may be used, where n may be one or at least two, for example, between two and 20. The crop 3 may be a cereal crop, such as wheat, maize or rice. Each terminal 2 includes a computer system 4, preferably in the form of a single-board computer (such as a Raspberry Pi 2 or a Raspberry Pi 3), on-board storage 5, a camera module 6, a temperature and humidity sensor module 7, a light sensor module 8, a soil sensor module 9, a wireless network interface module 10 and wired network interface(s) 11. Terminals 2 may be connected to an in-field wireless local area network 12 and a terminal 2 may be connected to other terminals 2 by a wired peer-to-peer connection 13. The terminals 2 may provide nodes in a wireless mesh network. The computer system 4 includes at least one processor (not shown) and memory (not shown).
A terminal 2 can be used to perform continuous monitoring using high-resolution (e.g. 5 megapixel), time-lapse photography, in-field evaluation and data transfer via real-time file sharing and data exchange servers. The camera module 6 can be used to capture, for example, between 1 and 6 images each hour. Preferably, three images per hour are captured, i.e. resulting in 48 images per day (assuming images are only captured during 16 hours of a 24-hour day).
The system 1 includes a central web server 14, a data processing system 15, preferably in the form of a high-performance computing (HPC) system, having associated storage 16 and a database storage 17 which receive image and environment data from the central web server 14. The system 1 may provide a web-based interface, via a network 18, to a remotely-located computer system 19 for monitoring workstation status and data. Fixed and/or mobile computing devices 20, 21 can be used to access the system.
Referring to Figures 2 and 3, in a first, plant-breeding version of the system 1, the terminals 2 take the form of compact units fixedly mounted on posts 21 (or "poles") and can be used by plant breeders to monitor a crop 3, such as wheat. Each terminal 2 is elevated to a height hp above ground level 22, typically between 50 and 100 cm and preferably 75 cm.
A ranging pole or rod 23, preferably having a series of alternating coloured bands or sections of equal height, is placed roughly at the centre of a sample plot 24. The ranging pole 23 extends to a height ht above ground level, preferably 1.2 m, and is located a distance d1 from the terminal 2, preferably between 1.4 and 1.5 m. The sample plot takes the form of a square or rectangle having an area A, preferably about 1 m2. A white, rectangular reference point (or "fiducial mark") 25 is inserted in the ground 26 between the terminal 2 and the ranging pole 23. The reference point 25 is located a distance d2 from the terminal 2, preferably between 1.0 and 1.1 m. A soil sensor 9 is inserted in the soil 26 in or close to the sample plot 24. A cable (not shown) connects the terminal 2 and the soil sensor 9.
Referring to Figure 4, the terminal 2 comprises a weather-proof case 29 (herein referred to as a "housing" or "enclosure") formed from acrylonitrile butadiene styrene (ABS) or other suitable material, having a generally rectangular prism shape and having a top 30, an underside 31, front 32, sides 33 and rear 34.
At the front 32 of the case 29, the terminal 2 has a camera hood 35 which shields a UV lens 36 disposed in front of the camera module 6. The camera module 6 preferably takes the form of an RGB camera module. The camera module 6 is connected to the single-board computer 4 via a cable 37 and on-board connector 38. An LED 39 is disposed in the top 30 of the case 29 and is connected to the single-board computer 4 via a general-purpose input/output 40. The single-board computer system 4 includes an integrated sensor board 41. The single-board computer system 4 includes an Ethernet connector 42 and USB connector 43 which are connected to respective RJ45 Ethernet and mini-USB sockets 44, 45 at the rear 34 of the case 29 and which provide wired network interfaces 11. The single-board computer system 4 is provided with a WiFi dongle which provides a wireless network interface module 10 and with USB storage which provides on-board storage 5. The USB storage 5 may have a capacity of 16 GB or more.
The single-board computer system 4 is powered via a (e.g. 12V/5V) voltage converter 46 which receives power from an external source, such as a (e.g. 12V) battery (not shown) which is trickle-charged by a solar panel (not shown), or via a (e.g. 5V/2A) power supply (not shown) via an external power and data connector 47. The soil sensor 9 (Figure 2) is connected to the single-board computer system 4 via the external power and data connector 47. An environmental sensor module 48 is mounted on the top 30 of the case 29 and can be connected to the single-board computer system 4 via a data cable 49. The single-board computer 4 may be provided with a heat sink 50.

Referring to Figure 5 and also to Figures 6a to 6c, in a second, farming version of the system 1, the terminal 2 takes the form of a cart. The camera module 6 and, optionally, other modules are contained in a sensor housing 51 having a dome-shaped cap and which is mounted to a distal end of a telescopic pole 52 which extends upwardly from a cart enclosure 53 (or "cart body") having a top 54 and underside 55, front 56, sides 57 and rear 58. The sensor housing 51 is elevated to a height htp above ground level 22, typically between 50 cm and 3 m, preferably 2.5 m. The reference point 25 is located a distance d2 from the terminal 2, preferably between 2 and 4 m, more preferably 3 m.

Referring in particular to Figure 6a, in a first configuration, wheels 59 are deployed from the underside 55 of the cart enclosure 53 and the cart can be manoeuvred into position using handles 60 which are rotatably mounted to the sides 57 of the cart enclosure 53 close to the rear 58.

Referring in particular to Figure 6b, in a second configuration, the wheels 59 are withdrawn into the enclosure 53 and the handles 60 are folded back against the sides 57 of the cart enclosure 53 so that the cart rests on legs 61. A solar panel 62 pivotably mounted on the top 54 of the cart enclosure 53 is folded out. The sensor housing 51 can be raised from the cart enclosure 53.
Referring in particular to Figure 6c, in a third configuration, the telescopic pole 52 is fully extended.
Referring to Figure 7a, a set of four soil augers 63 can drill, from the underside 55 of the cart, into the soil so as to secure the cart in position.
Referring to Figure 7b, the telescopic pole 52 is mounted to a turntable 64, driven by a motor 65, which allows the pole 52 to be rotated.

Referring again to Figure 1, the system 1 can facilitate automatic crop phenotyping. Scientists and agricultural practitioners can access terminal workstations 2 remotely for real-time monitoring using, for example, a computer 19 (for example, located in an office or laboratory) or a tablet or smart phone 20 (for example, located in the field). Users can inspect not only a whole field in different regions using a plurality of terminals 2, but can also control terminals 2 to review performance of crops, initiate new monitoring sessions or transfer on-board phenotypic and sensor datasets to external computing storage. If users are granted administrative access to the system 1, they can oversee the operations, where the status of every terminal 2 is constantly updated by the control system. Authorised users can inspect information such as online and offline status, operational mode, representative daily images, micro-environment (e.g. temperature/humidity in a plot region), and computational resources such as CPU and memory. The architecture of the control system supports the collation of phenotypic and sensor data for storage, visualisation, GUI-based systems interactions, and processing on the high-performance computing (HPC) system 14, 15, 16, for example, in the form of an SGI UV 2000 system having Intel (RTM) Xeon (RTM) cores.
An in-field weather station can be used to collect meteorological data including photosynthetically active solar radiation, rainfall, temperature, relative humidity and wind speed. Phenotypic and climate datasets are managed and saved in the data processing system 15 for durable data storage and centralised trait analysis.
Overview of data analysis pipeline
Referring to Figure 1, the terminals 2 are configured to take images at a rate of three per hour and at a resolution of 2592 x 1944 pixels so as to capture phenotypic plasticity, early expression of traits and crop-environment interactions. For example, over 200 GB of data may be generated by ten terminals 2 in a field season during a 95-day period. To extract meaningful results from the growth and developmental data effectively, analytic libraries, such as OpenCV, Scikit-learn and Scikit-image, can be used, together with automated bioimage informatics algorithms embedded in a high-throughput trait analysis pipeline.
Referring to Figure 8, an overview of the image-based phenotyping pipeline is shown.
A terminal 2 captures images of the crop and transmits the images to the data processing system (step S0).
The data processing system, preferably a high-performance computing system 15, selects representative crop images according to their size, clarity, imaging dates and genotypes (step S1). Preferably, only high-quality images are used for trait analysis, although all images (including low-quality images) can be stored. The data processing system 15 defines reference positions of plots monitored during the experiment (step S2).
In real-life agricultural and plant-breeding situations, winds, rainfall, irrigation and crop spraying can result in camera movement, which can cause issues when cross-referencing trait measures in a time-lapse image series.
To help ameliorate camera movement, the data processing system 15 identifies an initial reference position of a given plot and transforms every image in the series to the same position for comparison. For instance, the data processing system 15 detects coordinates of the white reference points 25 (Figure 2) and dark markers on the ranging pole 23 (Figure 2) to carry out colour feature selection. The data processing system 15 classifies pixels into different groups such as crop canopy, wheel tracks, and plot regions based on machine-learning methods, such as k-means and spectral clustering. The system 15 then establishes a pseudo reference system that records the plot area, the canopy space, height markers and the pixel-metric conversion.
The data processing system 15 uses the initial reference positions to perform in-depth trait analysis (step S3).
For a given genotype, the data processing system 15 employs an adaptive intensity and gamma equalisation to adjust colour and contrast to minimise colour distortion caused by variable in-field lighting. The system 15 then tracks geometric differences between the plot on a given image and the initial plot position. If different, a geometric transformation is applied to recalibrate the image, which removes areas outside the plot area. This can generate different sizes of black bars at the top of the given image. Within a given plot, the data processing system calculates the crop height by detecting the visible part of the ranging pole as well as the canopy region. The data processing system 15 locates corner-featured points within the canopy region to generate pseudo points so as to locate the tips of erect leaves at stem elongation or jointing (i.e. growth stages 32 to 39 of the Zadoks growth scale), reflective surfaces of curving leaves and heads between booting and anthesis (i.e. growth stages 41 to 69) and corner-featured points on wheat ears and grains during senescence (i.e. growth stages 71 to 95).

In addition to crop growth patterns in relation to thermal time (degree day, °Cd), the system can include other dynamic measures of a number of traits in the pipeline (step S4). For example, vegetative greenness is calculated through separating the green channel in RGB images within plots of interest. The output, which lies in a range between 0 and 255, can be used to assess green biomass and stay-green (i.e. prolonged green leaf area duration through delayed leaf senescence). Morphological traits, such as the main orientation of a given plot, which lies in a range between 0° and 180°, are quantified based on an optimised edge detection method, which computes the alignment of crop stems for assessing stem rigidity and lodging risk.
The steps in the image-based phenotyping pipeline will be described in more detail hereinafter. However, before describing the image-based phenotyping pipeline in more detail, experimental results obtained using the system will be described.

Monitoring near-isogenic lines of wheat
The radically-different nature of environments where wheat is grown provides a unique opportunity to study the genetic diversity of wheat in connection with yield and stress tolerance through phenotypic differences. Near-isogenic lines (NILs) of wheat from a genetic stock are used to test the system.
Between May and August 2015, five NILs were monitored, namely Late days-to-ear-emergence (DTEM) with Ppd-1 loss of function (lof), Early-DTEM with Ppd-D1a photoperiod insensitivity, Short with Rht-D1b semi-dwarfing, Stay-Green (a stay-green induced mutant), and Paragon wild type (WT). All the near-isogenic lines are in the genetic background of Paragon (a UK spring wheat variety) and were constantly monitored over a 95-day period.
Referring to Figure 9a, dynamic developmental profiles of the five lines generated by the system, together with environmental factors recorded during the period, are shown.
Referring to Figure 9b, based on the developmental data, daily relative growth rates (RGR) at different growth stages are shown.
Referring to Figures 9c to 9f, applied machine-learning based models are shown to explore the dynamics between genotype, phenotype and environmental factors.

Referring again to Figure 9a, to measure the rate and sensitivity of wheat growth dynamically in relation to the environment, Paragon WT is used as a reference and six key growth stages (i.e. GS32-95), from stem elongation or jointing (i.e. GS32-39) to ripening (i.e. GS91-95), are highlighted. The five growth curves generally followed a sigmoid curve. At the beginning of the crop monitoring, the Ppd-D1a NIL was already at the end of the jointing stage (i.e. GS37-39) and hence was the first line reaching a maximum height. Ppd-1 lof was the last to stop increasing in height. The heights of the five NILs were very similar in the middle of June (highlighted by a dashed circle), which verifies what was observed in the field, as all the NILs were at different growth stages.
By cross-referencing the five development profiles (based on growth stages, instead of calendar days), it was noticed that, although the Ppd-D1a NIL and Rht-D1b were recorded at similar maximum heights, namely 83.4 cm and 80.6 cm respectively, the latter had a gentle-mannered growth pattern. This suggests that this line could be suitable for crop management, as farmers and growers would have more time to decide whether to apply fertiliser and irrigation to assist the growth or to use chemical control to prevent rapid height increase. Ppd-1 lof's growth stages had been shifted back and thus it had more time to develop. As a result, this line became the tallest in the trial. Although all five lines experienced some degree of height reduction due to a significant storm on 24 July 2015, Paragon WT presented a much lower lodging risk, as it maintained its height during ripening (i.e. GS91-95). To verify the above observation, the heading dates and canopy height on the same plots were manually scored and a strong correlation was obtained (with a correlation coefficient of 0.985).

Moving beyond descriptive phenotypic research, genotype (G), phenotype (P) and environmental (E) datasets can be incorporated into machine-learning based modelling to explore the dynamics between GxPxE. First, to understand which environmental factors were strongly correlated with the growth of the five NILs at every key growth stage, daily RGRs of the lines were computed and associated with their growth stages. The scatter chart shown in Figure 9b shows the growth vigour of the five genotypes, active from jointing to flowering (i.e. GS32-69) and inactive after GS71 (grain-filling). After that, Pearson correlation coefficients and p-values were calculated based on growth traits such as normalised RGR (nesting three-day rates to reduce noise) and canopy height at four key stages (i.e. jointing, booting, heading and flowering). Through this, six environmental factors out of 14 were identified that were significantly correlated with growth traits (p < 0.01), namely normalised degree day, solar radiation, rainfall, normalised temperature, light duration, and growth stage duration. The two heat maps shown in Figures 9c and 9d show the relationship between the identified environmental factors and two growth traits (namely RGR and canopy height) in relation to four key growth stages (i.e. GS32-69). As growth traits did not change excessively after anthesis (i.e. GS71-95), the later stages were not included in the correlation analysis.
Using the six identified environmental factors and growth traits measured during the growing season, a set of linear regression models were used to establish a global predictive model to forecast the growth and development of wheat in the genetic background of Paragon, when interacting with the environment. Figure 9e shows how the model forecasts the overall Paragon growth data (GT, mean squared error: 20.1512, correlation: 0.9991). The model uses the six environmental factors at six stages (i.e. GS32-95) as the input to obtain estimates of the relative growth rates yt for every given genotype. The formula below was used for the prediction:

yt = Et * βS + c    (1)

where Et is the environmental data at a time point t, βS is the model parameters for growth stage S, and c is a constant offset. Ordinary least squares is used to determine the coefficients of the model. The model is also applied to predict the growth of the five NILs and the estimated growth is compared with the recorded data generated by the system.
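A minimal sketch of such a regression fit using scikit-learn is shown below; the arrays are placeholders standing in for the environmental factors, growth-stage labels and RGR values, not the study's actual datasets:

import numpy as np
from sklearn.linear_model import LinearRegression

env = np.random.rand(95, 6)          # placeholder: six environmental factors per day
stage = np.random.randint(0, 6, 95)  # placeholder: growth-stage label per day
rgr = np.random.rand(95)             # placeholder: relative growth rates

# Include the stages as features, as described above, via one-hot indicators.
features = np.hstack([env, np.eye(6)[stage]])

model = LinearRegression()           # ordinary least squares: fits beta and offset c
model.fit(features, rgr)
rgr_hat = model.predict(features)    # estimated relative growth rates y_t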
On the basis of the first predictive model, a second model is used to forecast the timing and duration of key growth stages (i.e. GS32-95) to link the crop growth prediction with real-world agricultural practices. Thus, farmers, growers and breeders can make sound decisions based on the difference between the predicted growth curve and the actual growth pattern measured by the system. This approach could also assist agricultural practitioners in terms of line selection, fertiliser application, irrigation and harvesting to secure yield production. Figure 9f illustrates the performance of the second model. It has employed a set of support vector machines (SVM) with radial basis function kernels to classify the timing and duration of key growth stages. The model was tested by comparing the predicted growth stages with the true data measured by crop
physiologists. Figures 10a to 10e show graphs of actual and estimated height against time for the five NILs and for a combination of the five NILs.
Data transfer and data processing system
Referring to Figure 11, the data transfer and processing system 81 includes several servers and software modules.
The system 81 comprises a data transfer server 82 and a remote-control server 83 running on the central web server 14 which allow users to connect to terminals 2 (Figure 1).
The system also includes support modules 84 for performing functions such as uploading sensor data and hardware information. Representative daily images are routinely selected and transferred to the central server during the night, which provides a daily snapshot of the monitored crops.
The system includes a control module 85 running on the central web server 14 which logs updates received from clients, in other words the terminals 2. The terminal 2 includes an imaging program 86 to control the camera module 6 (Figure 1) for time-lapse crop monitoring. The program 86 can automatically adjust white balance, exposure mode and shutter speed in relation to variable in-field lighting conditions using an interface (for example, the picamera package). Both image resolution and imaging frequency can be changed if users want to modify experimental settings. The program 86 can also conduct the initial quality control and data backup after every image is captured.
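A minimal sketch of such time-lapse control using the picamera package follows; the white-balance choices, file path and capture interval shown are illustrative assumptions:

import time
from picamera import PiCamera

camera = PiCamera()
camera.resolution = (2592, 1944)     # resolution used by the terminals

def capture_one(path, bright):
    # Adjust settings to in-field light conditions before capturing.
    camera.awb_mode = 'sunlight' if bright else 'cloudy'
    camera.exposure_mode = 'auto'
    camera.capture(path)

# Three images per hour, as described above:
# while True:
#     capture_one('/mnt/usb/img_%d.jpg' % int(time.time()), bright=True)
#     time.sleep(20 * 60)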
An application 87 running on each terminal 2 is run at regular, scheduled intervals. The application 87 queries the terminal 2 to determine workstation status information such as uptime, network addresses and storage usage. Sensor data and more variable system data, such as CPU temperature and processor/memory usage, are sampled at a higher frequency and a mean of the readings is recorded during the query.
Once the application 87 has collected the necessary data, it is encoded into a JavaScript Object Notation (JSON) data object and transmitted over HTTP to the central server 14, which stores the data in a database 88 (for example, a SQL database) running on the data processing system 15. The status of the system can be displayed and automatically updated using a web-based interface in a web browser 89 running on a remote computer 19, determining whether each node 2 is online from the time of its last update. The web interface provides information, including the location of terminals 2 in the field (a field map can be uploaded to the central server), graphs of collected terminal/sensor data, and facilitates SSH and VNC linking to active nodes 2. The system provides a centralised real-time monitoring system to administer the network of in-field workstations 2 and collate collected data for visualisation, batch processing and annotation.
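For illustration, the JSON status upload might be sketched as follows; the URL and field names are hypothetical:

import json
import urllib.request

status = {
    "workstation_id": "terminal-01",  # hypothetical field names
    "uptime_s": 86400,
    "cpu_temp_c": 51.2,
    "storage_used_pct": 34,
}
req = urllib.request.Request(
    "http://central-server.example/api/status",  # placeholder URL
    data=json.dumps(status).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(req)  # uncomment on a live network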
Referring also to Figure 12, when a terminal 2 is initialised by a user via a user interface, the application 87 imports vision and imaging libraries (step S0.1), receives experimental settings including genotypes, workstation ID, imaging frequency and duration (step S0.2) and checks that the hardware, such as the WiFi interface 10, USB flash drive 5 and the like, is operating correctly (steps S0.3 & S0.4). If the hardware is not operating correctly, then the user is prompted to check the hardware via the user interface (step S0.5). The user then sets up imaging dates and creates folders for capturing and saving time-lapse image series (step S0.6). The application 87 then starts the process of periodically acquiring images.
The application 87 dynamically adjusts imaging settings such as white balance, shutter speed and exposure mode based on in-field light conditions (step S0.8). The application then checks whether on-board storage is full or imaging should stop for other reasons (step S0.9). If there is sufficient storage, then the application triggers image capture and saves the image in on-board storage 5 (step S0.10). The application 87 checks image quality (step S0.11). If the image is of insufficiently high quality, the image is archived and removed from the series of images (step S0.12). If the image is of a sufficiently high quality, then the application places the terminal into a sleep mode and sets a timer (not shown) for the next image (step S0.13).
Referring again to Figure 11, the data processing system 15 executes several algorithms and software modules 90, 91, 92, 93, 94, 95. An image selection algorithm 90 performs an assessment of large image datasets captured in field trials by comparing images with a number of fixed criteria. Referring also to Figure 13, the algorithm 90 performs initial set up including importing vision and imaging libraries (step S1.1), opening a GUI to receive a user selection of an image series (step S1.2) and setting up file systems on the data processing system 15 for saving results (step S1.3).
The algorithm 90 can also perform checks that the data processing system 15 is operating correctly (for example, that a valid directory has been selected) and, if not, prompt the user to perform a check.
The image selection algorithm 90 goes through each image in the series (steps S1.5 & S1.6) to determine whether the image meets analysis standards (step S1.7). Those that meet the standard are collated. Each image is quantified by brightness, shadow percentage and sharpness, allowing images that meet or exceed a set of thresholds to be retained for further trait analysis.
To determine the brightness of an image, the median value of pixel intensity is taken by transforming the image into hue, saturation and value ("HSV") colour space. If the median intensity value is lower than a set threshold, the image is culled and not used from this point forward. For example, the median brightness may be assigned a value between 0 (dark) and 1 (bright) and a threshold having a value of between 0.3 and 0.5 may be used. A threshold value of over 0.5 corresponds to an image taken in bright sunshine. Image sharpness (or "image clarity") is determined by applying Sobel edge detection. The detectable edges are calculated and then correlated with the sharpness and exposure range of the image. The result of clarity detection is also compared to a set threshold, which will disqualify images if they are out of focus or unclear with ill-defined edges. For example, an obtained value may take a value between 0 (blurred) and 1 (sharp) and a threshold may take a value of between 0.3 and 0.5. A threshold value of over 0.5 corresponds to a sharp image.
Measuring shadow areas involves determining the proportion of the image containing dark pixels and comparing it to a threshold value. For example, the proportion may take a value between 0 (all shadow) and 1 (no shadow) and a threshold may take a value of 0.2 (i.e. 20%). Measuring size involves determining the size of the image file and comparing it to a threshold value. For example, the threshold may be 3.0 MB. If the image selection algorithm 90 judges the image to be of low quality, then the image is removed from the series (step S1.9). Information about the discarded image may be recorded.
Otherwise, if the image passes all the comparisons, the selected image is included in a result folder, with a CSV file recording image metadata for further high-throughput image analysis (step S1.10).
Image selection may be based on one, two, three or all four of these measures.
Preferably, image selection is based on all four measures. Other measures may be used.
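A hedged sketch of the four measures using OpenCV is given below, with the example thresholds quoted above; the scaling used to map the Sobel response onto a 0-1 sharpness score is an assumption:

import os
import cv2
import numpy as np

def passes_quality_checks(path, brightness_t=0.4, sharpness_t=0.4,
                          shadow_t=0.2, min_size_mb=3.0):
    img = cv2.imread(path)
    if img is None:
        return False
    # Brightness: median of the HSV value channel, scaled to 0-1.
    v = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)[:, :, 2].astype(np.float32) / 255.0
    brightness = float(np.median(v))
    # Sharpness: mean Sobel gradient magnitude as a 0-1 proxy.
    grey = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    gx = cv2.Sobel(grey, cv2.CV_32F, 1, 0)
    gy = cv2.Sobel(grey, cv2.CV_32F, 0, 1)
    sharpness = min(1.0, float(np.sqrt(gx ** 2 + gy ** 2).mean()) / 255.0)
    # Shadow: proportion of the image that is *not* dark (1 = no shadow).
    not_shadow = 1.0 - float((v < 0.2).mean())
    # Size: file size in megabytes.
    size_mb = os.path.getsize(path) / 1e6
    return (brightness >= brightness_t and sharpness >= sharpness_t
            and not_shadow >= shadow_t and size_mb >= min_size_mb)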
Referring again to Figure 11, a plot detection algorithm 91 detects initial reference positions of monitored plots.
Referring also to Figure 14, the plot detection algorithm 91 performs initial set up including importing vision and imaging libraries (step S2.1), opening a GUI to receive a user selection of an image to serve as reference image (step S2.2) and setting up file systems on the HPC system for saving results (step S2.3).
The plot detection algorithm 91 can also perform checks that the data processing system 15 is operating correctly (for example, that a valid directory has been selected) and, if not, prompt the user to perform a check.
The plot detection algorithm 91 may perform gamma correction (step S2.4), for example, so as to balance intensity distribution towards 50%. This can help with image screening.
Referring also to Figure 15, the plot detection algorithm 91 identifies the coordinates of the white reference canes 26, so as to define the plot region, and the dark height markers on the ranging pole 24, using colour-based feature selection on the basis of HSV and Lab non-linear colour space (step S2.5). Referring also to Figure 16, in particular, the plot detection algorithm 91 uses a normalised grey image scale to detect white parts of the image which have a high saturation value. For example, this may involve keeping only 30% of the pixels having the highest saturation value. The plot detection algorithm 91 then removes small objects, for example, those which have a height (or other dimension) of no more than a given number of pixels. Reference canes 26 have a size of over 12,500 pixels and, thus, a threshold of less than 12,500 is used. Holes in the image, i.e. the detected small objects, are filled in as black. Figure 16 shows the result of identifying the most saturated regions and removing small objects from the image.
Referring also to Figure 17, the plot detection algorithm 91 then identifies the reference canes 26 based on size and the ratio of width-to-length ("WL ratio"). The plot detection algorithm 91 defines a rectangle 101 for each reference cane 26 and a corresponding centre 102.
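A minimal sketch of this screening with OpenCV connected-component analysis follows; the brightest-pixel quantile stands in for the saturation-based selection, and the width-to-length bound is an assumed placeholder for the actual WL-ratio test:

import cv2
import numpy as np

def find_reference_candidates(img_bgr, keep_fraction=0.30, min_size_px=12500):
    grey = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY)
    # Keep only the top 30% of pixels on the normalised grey scale.
    mask = (grey >= np.quantile(grey, 1.0 - keep_fraction)).astype(np.uint8)
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    candidates = []
    for i in range(1, n):                            # label 0 is the background
        x, y, w, h, area = stats[i]
        wl_ratio = w / float(h)
        if area >= min_size_px and wl_ratio < 1.0:   # bounds are assumptions
            candidates.append(((x, y, w, h), tuple(centroids[i])))
    return candidates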
Referring to Figures 14, 18 and 19, the plot detection algorithm 91 classifies pixels into different groups such as sky 103, soil 27 between plots, crop canopy 105, shadow 106, references 26, markers 107 and other objects (not shown) using unsupervised machine-learning techniques such as k-means and spectral clustering (step S2.6).
Figure 18 shows the RGB image and Figure 19 shows the corresponding classified image, where different groups are differently coloured. The top half of the image is generally navy blue and the bottom half mainly comprises regions of cyan and navy blue. In the bottom half, the large region of navy blue contains flecked regions of red and orange. There is a large region of orange in the bottom left-hand corner and a large region of cyan in the bottom right-hand corner.
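A minimal sketch of such unsupervised classification with scikit-learn k-means follows; subsampling the pixels before fitting is an added assumption for speed:

import numpy as np
from sklearn.cluster import KMeans

def classify_pixels(img_bgr, n_groups=6, sample_step=10):
    h, w = img_bgr.shape[:2]
    pixels = img_bgr.reshape(-1, 3).astype(np.float32)
    km = KMeans(n_clusters=n_groups, n_init=10, random_state=0)
    km.fit(pixels[::sample_step])        # fit on a subsample for speed
    labels = km.predict(pixels)          # then label every pixel
    return labels.reshape(h, w)          # one group index per pixel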
If no objects are found (step S2.7), the plot detection algorithm 91 prompts the user to select another image to use as a reference (step S2.8).
Referring again to Figure 14, after detecting initial reference objects in the image, the algorithm establishes a pseudo three-dimensional reference system that records the two-dimensional coordinates of the plot area, the canopy region, and height markers through a range of feature selection approaches (steps S2.9 to S2.12). The pixel-metric conversion is also computed based on the height markers on the ranging pole 24. Referring also to Figure 17a, pixel-metric conversion includes counting the number of pixels between the centres C of adjacent dark markers 107 and/or an angle of inclination, θ, of a line between the centres of adjacent dark markers.
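For illustration, the pixel-metric conversion might be sketched as follows; the 10 cm band spacing is a hypothetical value, not one taken from the description:

import math

def pixel_metric(centre_a, centre_b, marker_spacing_cm=10.0):
    # centre_a, centre_b: (x, y) centres of adjacent dark markers 107.
    dx = centre_b[0] - centre_a[0]
    dy = centre_b[1] - centre_a[1]
    cm_per_px = marker_spacing_cm / math.hypot(dx, dy)
    theta_deg = math.degrees(math.atan2(dy, dx))  # inclination of the line
    return cm_per_px, theta_deg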
The plot detection algorithm 91 finds markers on the ranging pole 24 using two techniques.
Referring also to Figure 20a, the plot detection algorithm 91 uses an edge detector, such as a Canny edge detector, to detect the markers 107 (which have well-defined edges with respect to the pole and to adjacent light markers).
Referring also to Figure 20b, the plot detection algorithm 91 uses global thresholding using the median intensity value to find the markers 107 (which are very dark).
Referring also to Figure 20c, the plot detection algorithm 91 can combine the results of the two approaches.
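A minimal sketch of the two detectors and one possible combination is shown below; the Canny thresholds and the union rule are assumptions, as the description does not specify how the two results are merged:

import cv2
import numpy as np

def marker_mask(img_bgr):
    grey = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(grey, 50, 150)        # markers have well-defined edges
    dark = ((grey < np.median(grey)) * 255).astype(np.uint8)  # very dark areas
    return cv2.bitwise_or(edges, dark)      # combined result (assumed union)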
Referring to Figures 14 and 21, the plot detection algorithm 91 locates the markers 107 on the ranging pole 24.
Referring also to Figure 22, the plot detection algorithm 91 identifies the (three) reference points 26, or more specifically the centres 102 of the reference points 26, and computes a location for a fourth reference point 108 and defines a polygon 109 having the reference points 102, 108 as the vertices.
Referring also to Figure 23, the plot detection algorithm 91 calculates corners 110 of a polygon 111 which defines a resized canopy region 112. As the crop grows, the polygon 111 moves and the corresponding entropy changes.
Referring again to Figure 11, a crop performance measurements algorithm 92 is used to measure canopy height, identify corner features and perform growth, colour and orientation trait analysis. For a given crop genotype, adaptive intensity and gamma equalisation is applied to the image to minimise colour distortion caused by variable field illumination. The algorithm 92 can determine the canopy height using a number of different approaches, in case one of them cannot extract the height reading as planned.
Referring to Figures 24a and 24b, a first approach is simply to inspect the ranging pole 24, identify the visible part 113 of the pole 24 and, thus, "read" the height of the plot off the pole 24. However, the first approach may not always be possible, especially if the crop gets taller and all of the ranging pole 24 is obscured, or if the ranging pole is covered by random objects such as pointing leaves, agricultural vehicles or people, which could provide a false-positive height reading.
Referring to Figure 25, a second approach involves determining coordinates of the top of leaves which are labelled with pseudo points 114 and then calculating the median value of heights of the pseudo points 114. The two-dimensional pseudo points' height coordinates (y-axis) are calculated within the crop canopy space and then the median value 115 of the height readings is computed to represent the canopy height at the time point.
Referring to Figures 26, 27a and 27b, a third approach is an entropy-based process of detection, in particular calculating entropy using grey-level co-occurrence matrices (GLCMs). In particular, to determine the height of the canopy, entropy-based texture analysis is used to detect whether the canopy region 112 enclosed by the hyper plane 111 changes between two adjacent images using GLCMs. The texture analysis is able to determine a weighted centroid position and, based on the weighted centroid position, the position of the hyper plane can be determined.
Further details regarding digital-image texture analysis and the equations for entropy H and skewness μ3 are found in A. Materka and M. Strzelecki: "Texture Analysis Methods - A Review" Technical Report, University of Lodz, Institute of Electronics, COST B11 report, Brussels 1998.
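A hedged sketch of computing GLCM entropy for a canopy region with scikit-image follows; the distance and angle parameters are example choices (the function is spelled greycomatrix in older scikit-image releases):

import numpy as np
from skimage.feature import graycomatrix

def canopy_entropy(grey_region):
    # grey_region: 2-D uint8 array cropped to the crop canopy space 112.
    glcm = graycomatrix(grey_region, distances=[1], angles=[0],
                        levels=256, symmetric=True, normed=True)
    p = glcm[:, :, 0, 0]
    nz = p[p > 0]
    return float(-np.sum(nz * np.log2(nz)))   # entropy H of the texture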
Figure 26 shows, on the right-hand side, images of the crop at different times in which a region 116 of the image is identified as being the crop plot (and false-coloured bright green) and for which entropy H and skewness μ3 can be calculated. If positional changes are identified, for example, the weighted centroid moves up or down (depending on growth stage), the position of the hyper plane 111 is changed and the canopy height is recorded. After that, corner-featured points are detected within the new canopy space. This step generates pseudo points that are cast in the canopy region for verifying canopy height measures from the previous approaches. In other words, the coordinates of the pseudo points can be used to compute a canopy height. Thus, even if the canopy and/or ranging pole is partially covered (for example, as shown in Figure 28), it is still possible to obtain a height reading.
Referring to Figure 28, the terminal 2 was moved, resulting in a black bar across the top of the image. Pseudo points, however, can still be used to detect the canopy height, even when the other approaches could not generate precise height measurements.
The dynamic canopy height changes are computed by combining the measurements of height markers with the weighted centroid position and the red pseudo points. The algorithm 92 may try determining the canopy height using different approaches in order. Thus, if the algorithm 92 is unable to determine the height using the ranging pole 24, or obtains an unexpected value (e.g. one which exceeds the previously-obtained value by a given percentage), then it will attempt to determine the canopy height using the identified corner-featured points.
The algorithm 92 may try determining the canopy height using all available approaches and combine some of them.
Referring also to Figure 29, the crop performance measurements algorithm 92 performs initial set up including importing vision, imaging and machine-learning libraries (step S3.1), opening a GUI to receive a user selection of an image series (step S3.2) and setting up file systems on the data processing system 15 for saving results (step S3.3). The crop performance measurements algorithm 92 can also perform checks that the data processing system 15 is operating correctly (for example, that a valid directory has been selected) and, if not, prompt the user to perform a check.
The crop performance measurements algorithm 92 employs an adaptive intensity and dynamic gamma equalisation to adjust colour and contrast to minimise colour distortion caused by diverse in-field lighting (step S3.4). The crop performance measurements algorithm 92 tracks geometric differences between the plot on a given image and the initial position. If different, a geometric transformation method is applied to recalibrate the image, which removes areas outside the plot area (step S3.5).
Within a plot, the crop measuring algorithm 92 tracks the crop height by detecting the visible part of the ranging pole 24 (steps S3.6 & S3.7). If tracking is unsuccessful, then the algorithm 92 moves on to the next image in the series (step S3.4).
To determine the change of the canopy space during the season, an entropy-based texture analysis is used to detect whether the canopy region 112 enclosed by the polygon 111 (herein referred to as the "crop canopy space", "canopy region" or "hyper plane") changes between adjacent images.
GLCMs are used to calculate the entropy value of canopy texture. If the entropy of the texture shifts (moving up or down, depending on growth stages), the two-dimensional coordinate of the centroid of the canopy texture is recorded, which allows the polygon 111 to be repositioned (i.e. moved vertically) to represent the change of the canopy space.
Referring also to Figure 26, within the repositioned canopy space, corner-featured points 114 are detected. This step generates coloured (e.g. red) pseudo points cast in the canopy region 112, representing the tips of erect leaves at stem elongation or jointing (i.e. GS 32-39), reflective surfaces of curving leaves or crop heads between booting and anthesis (i.e. GS 41-69) and corner points on ears and grains during senescence (i.e. GS 71-95).
Figure 30 shows a series of images illustrating crop growth from GS 37 to GS 92 over a period of 95 days.
Referring still to Figure 24, the algorithm 92 applies Harris and Shi-Tomasi corner detection methods to locate corner-featured points within the canopy region 112. Referring also to Figure 31, red pseudo points 114 are generated to represent the tips of erect leaves, reflective surfaces of curving leaves, heads and the corner points on ears. The main orientation of a given plot is quantified based on an optimised Canny edge detection method, which computes the alignment of crop stems.
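A minimal sketch of locating such corner-featured points with OpenCV's Shi-Tomasi detector and taking the median of their vertical positions follows; the detector parameters are assumptions, and converting pixel rows into a metric canopy height would use the pixel-metric conversion described above:

import cv2
import numpy as np

def median_pseudo_point_row(grey_canopy, max_points=500):
    corners = cv2.goodFeaturesToTrack(grey_canopy, maxCorners=max_points,
                                      qualityLevel=0.01, minDistance=5)
    if corners is None:
        return None
    ys = corners.reshape(-1, 2)[:, 1]   # y-coordinates of the pseudo points
    return float(np.median(ys))         # median row (image rows grow downwards)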
The user, via the GUI, may change the colour of a reference marker 26 in an image, for example, turning it pink 116. This can be used to mark events, such as if the terminal 2 or the ranging pole 24 is moved or a new monitoring task is initiated.
It is noted that the system does not require the position of the terminal 2, the ranging pole 24 or the reference markers 26 to be known. Thus, there is no need for GPS or for recording positions of, for example, the terminal 2.
A machine-learning algorithm, e.g. a clustering method, can be used to determine the region of interest, e.g. the crop canopy space.

Referring to Figure 11, a data interpolation and analysis module 93 can be used to handle minor data loss during the field experiments.
Referring still to Figure 11 and also to Figure 32, a crop-environment interaction modelling module 94 is used to identify interactions between the recorded crop growth of five wheat genotypes and a number of environmental factors (steps S4.1 to S4.9).
An example of crop-environment modelling is hereinbefore described in relation to Figure 10. The following comments are made in relation to the model used in the example.
Correlations are performed for each environmental factor, grouped over three days, with the recorded growth data. The reason for grouping environmental factors into nested three-day periods is to remove outliers and smooth the input data. The correlations are determined for each growth stage for the five genotypes. The analysis is performed on the grouped data as particular stages (e.g. booting and heading) contain few recorded growth data due to the short duration of both stages during growth. To obtain the dynamic between relative growth rates (RGR) and environmental factors, a formula (e^RGR)^-1 is used to transform negative correlation values, as the RGR series is a decreasing sequence in relation to the increasing nature of growth stages.

Based on significant environmental factors, a set of linear regression models are explored and a single linear regression model is selected to estimate the RGR of the five genotypes in relation to given in-field environment conditions. Environmental factors with insignificant correlations (where p > 0.01, with respect to the height over the entire time-series) are removed from the analysis as they provide little predictive power. Ordinary least squares is used to derive the model coefficients and all the stages are included as features. The RGR data is normalised to present percentage changes in height between two consecutive days. To predict the canopy height for a given genotype, environment data at each growth stage is input to the global model. To derive the height of the plant over time, successive applications of

ht = ht-1 * (1 + yt)

are applied, where ht is the height of the plant at a current time point, ht-1 is the height of the plant at the previous time point, yt is the estimated relative growth rate, and h0 is equal to the initial height.
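A minimal sketch of this successive application, assuming the multiplicative update shown above:

def heights_from_rgr(h0, rgr_series):
    # rgr_series: predicted daily relative growth rates y_t (fractional).
    heights = [h0]
    for y_t in rgr_series:
        heights.append(heights[-1] * (1.0 + y_t))
    return heights

# heights_from_rgr(10.0, [0.05, 0.04, 0.02])
# -> [10.0, 10.5, 10.92, 11.1384]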
Referring to Figure 10, performance of the model is verified by estimating the growth of all five NILs, including the overall Paragon growth data (GT). The estimation is displayed with respect to the true canopy height datasets. The mean squared error recorded for G2 (genotype two, Late-DTEM), G3 (genotype three, Early-DTEM) and G4 (genotype four, Stay-Green) shows that the estimated height is close to the true growth curves. However, the error is much larger for G1 (genotype one, Paragon WT) and G5 (genotype five, Short). This is because the majority of crop growth happens during the early stages (GS32-GS59), and estimation deviation during these initial stages could affect the overall height results.
The growth stage predictive model is based on the GxPxE model hereinbefore described. The model is produced to explore how to predict growth stages of different wheat genotypes on the basis of real growth traits and environment data. It employs support vector machines (SVMs) with radial basis function kernels to classify growth stages, as SVMs are popular machine-learning techniques for classification. The performance of the model is tested on the overall Paragon wheat growth data (GT) and Paragon WT (G1), as GT performs well in the GxPxE interaction model whereas G1 performs poorly. The prediction, in comparison with the manually recorded growth stages, suggests a successful prediction of the timing and duration of stem elongation and jointing (GS32-39) through heading (GS51-59) and flowering (GS61-69) through ripening (GS91-95). However, the transition from heading to flowering introduced an error: the transition was predicted three days early. The main reason for this error is the short duration of booting (GS41-49) and heading (GS51-59); the genotypes used for training cannot sufficiently differentiate the two stages.
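A minimal sketch of such an RBF-kernel SVM classifier with scikit-learn follows, on placeholder arrays rather than the actual trait and environment datasets:

import numpy as np
from sklearn.svm import SVC

X = np.random.rand(95, 8)            # placeholder: trait + environment features
y = np.random.randint(0, 6, 95)      # placeholder: growth-stage class labels
clf = SVC(kernel='rbf', gamma='scale')
clf.fit(X, y)
predicted_stages = clf.predict(X)    # predicted timing/duration classes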
Modifications
It will be appreciated that various modifications may be made to the embodiments hereinbefore described. Such modifications may involve equivalent and other features which are already known in the design, manufacture and use of imaging systems, image processing and phenotyping systems and component parts thereof and which may be used instead of or in addition to features already described herein. Features of one embodiment may be replaced or supplemented by features of another embodiment.
Other types of single-board computers can be used, such as Intel (RTM) Edison.
Terminals need not use a single-board computer.
A high-performance computing system need not be used. For example, a desktop computer or workstation can be used.

Although claims have been formulated in this application to particular combinations of features, it should be understood that the scope of the disclosure of the present invention also includes any novel features or any novel combination of features disclosed herein either explicitly or implicitly or any generalization thereof, whether or not it relates to the same invention as presently claimed in any claim and whether or not it mitigates any or all of the same technical problems as does the present invention. The applicants hereby give notice that new claims may be formulated to such features and/or combinations of such features during the prosecution of the present application or of any further application derived therefrom.

Claims
1. A method of processing images of a crop comprising:
retrieving a series of images of a crop captured over a period of time;
identifying, in an image selected from the series of images to be used as a reference image, a reference system against which other images can be compared, the reference system including an extent of a crop plot and/or a set of one or more reference points; and
for each of at least one other image in the series of images:
calibrating the image using the reference system; and
determining a height of a canopy of the crop in the crop plot in the image, a main orientation of the crop and/or a value indicative of vegetative greenness.
2. A method according to claim 1, wherein the one or more reference points include: a plot region,
a canopy space, and/or
at least one height marker.
3. A method according to claim 1 or 2, further comprising:
identifying at least one reference marker in the reference image.
4. A method according to any preceding claim, further comprising:
classifying pixels in the reference image into one or more groups corresponding to one or more respective object types.
5. A method according to any preceding claim, further comprising:
for each of at least one other image in the series of images:
identifying corner-featured points in the crop plot in the image.
6. A method according to any preceding claim, further comprising:
preparing the series of images of the crop comprising:
receiving a series of captured images of the crop;
for each image in the series of captured images:
determining whether the image satisfies at least one image-quality requirement; and upon a positive determination, adding the image to the series of images to be processed.
7. A method according to claim 6, wherein the at least one image-quality requirement includes brightness of the image, size of the image file, sharpness of the image or the proportion of dark area in the image area.
8. A method according to any preceding claim, wherein determining the height of a canopy of the crop in the crop plot in the image comprises:
detecting a visible part of a ranging pole.
9. A method according to any preceding claim, wherein determining the height of a canopy of the crop in the crop plot in the image comprises:
calculating an entropy based on compactness of a texture of a crop canopy space, isotropy of the texture of the crop canopy space, and/or distribution of intensity of the crop canopy space.
10. A method according to any preceding claim, wherein determining the height of a canopy of the crop in the crop plot in the image comprises:
measuring a weighted centroid of a crop canopy space.
11. A method according to any preceding claim, wherein determining the height of a canopy of the crop in the crop plot in the image comprises:
determining respective positions of corner-featured objects in the crop canopy space; and
calculating an average from the tip positions.
12. A method according to any preceding claim, further comprising:
for the image series:
generating dynamic growth curves defining a developmental profile for the crop; calculating stem rigidity and lodging risk based on the main orientation of the crop; and/or
calculating vegetation and senescence periods based on a series of the values indicative of vegetative greenness.
13. A system comprising: a data processing system configured to perform the method according to any one of claims 1 to 12.
14. A system according to claim 13, further comprising:
at least one terminal comprising:
a light-level sensor to measure a light level for controlling image capture settings;
a camera for capturing images of a region of a growing crop based on the image capture settings;
data storage for storing images captured by the camera;
a wireless network interface for transferring data, including images captured by the camera, to a remotely-located image processing computer system; and
an on-board computer system for controlling storage and transfer of captured images, wherein the on-board computer system is configured to determine whether a characteristic of the captured image satisfies a predetermined condition and, upon a negative determination, discarding the captured image such that only images satisfying the predetermined condition are transferred.
15. A terminal comprising:
a light-level sensor to measure a light level for controlling image capture settings; a camera for capturing images of a region of a growing crop based on the image capture settings;
data storage for storing images captured by the camera;
a wireless network interface for transferring data, including images captured by the camera, to a remotely-located image processing computer system; and
an on-board computer system for controlling storage and transfer of captured images, wherein the on-board computer system is configured to determine whether a characteristic of the captured image satisfies a predetermined condition and, upon a negative determination, discarding the captured image such that only images satisfying the predetermined condition are transferred.
PCT/GB2018/050985 2017-06-19 2018-04-13 Data processing of images of a crop WO2018234733A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP18721093.5A EP3642792A1 (en) 2017-06-19 2018-04-13 Data processing of images of a crop

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1709756.9A GB2553631B (en) 2017-06-19 2017-06-19 Data Processing of images of a crop
GB1709756.9 2017-06-19

Publications (1)

Publication Number Publication Date
WO2018234733A1 true WO2018234733A1 (en) 2018-12-27

Family

ID=59462327

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2018/050985 WO2018234733A1 (en) 2017-06-19 2018-04-13 Data processing of images of a crop

Country Status (3)

Country Link
EP (1) EP3642792A1 (en)
GB (1) GB2553631B (en)
WO (1) WO2018234733A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020000043A1 (en) * 2018-06-28 2020-01-02 University Of Southern Queensland Plant growth feature monitoring
CN114827474A (en) * 2018-10-31 2022-07-29 深圳市大疆创新科技有限公司 Shooting control method, movable platform, control device and storage medium
CN109859101B (en) * 2019-01-18 2022-10-28 黑龙江八一农垦大学 Crop canopy thermal infrared image identification method and system
CN112712038B (en) * 2020-12-31 2024-05-28 武汉珈和科技有限公司 Method and system for monitoring wheat lodging condition based on multispectral satellite image
CN116503741B (en) * 2023-06-25 2023-08-25 山东仟邦建筑工程有限公司 Intelligent prediction system for crop maturity

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7058197B1 (en) * 1999-11-04 2006-06-06 Board Of Trustees Of The University Of Illinois Multi-variable model for identifying crop response zones in a field
JP5904044B2 (en) * 2012-07-18 2016-04-13 富士通株式会社 Crop state change date identification method, apparatus and program
US10039244B2 (en) * 2014-03-04 2018-08-07 Greenonyx Ltd Systems and methods for cultivating and distributing aquatic organisms
CN104320607A (en) * 2014-08-06 2015-01-28 江苏恒创软件有限公司 Method for monitoring growth of farmland crops based on drone
CN105574897A (en) * 2015-12-07 2016-05-11 中国科学院合肥物质科学研究院 Crop growth situation monitoring Internet of Things system based on visual inspection

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010036295A1 (en) * 1997-10-10 2001-11-01 Hendrickson Larry L. Method of determining and treating the health of a crop
US20160224703A1 (en) * 2015-01-30 2016-08-04 AgriSight, Inc. Growth stage determination system and method
US20160239709A1 (en) * 2015-01-30 2016-08-18 AgriSight, Inc. System and method for field variance determination
CN105869152A (en) * 2016-03-24 2016-08-17 北京农业信息技术研究中心 Method and device for measuring spatial distribution of crop plant heights through unmanned plane remote sensing

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
BISKUP ET AL: "A stereo imaging system for measuring structural parameters of plant canopies", PLANT CELL AND ENVIRON, WILEY-BLACKWELL PUBLISHING LTD, GB, vol. 30, no. 10, 1 January 2007 (2007-01-01), pages 1299 - 1308, XP007912123, ISSN: 0140-7791, DOI: 10.1111/J.1365-3040.2007.01702.X *
JI ZHOU ET AL: "CropQuant: An automated and scalable field phenotyping platform for crop monitoring and trait measurements to facilitate breeding and digital agriculture", BIORXIV, 10 July 2017 (2017-07-10), XP055490315, Retrieved from the Internet <URL:https://pdfs.semanticscholar.org/7ac1/76f331008e7380f7f879e07b31877a6ed449.pdf> [retrieved on 20180705], DOI: 10.1101/161547 *
N. TILLY ET AL: "Terrestrial laser scanning for plant height measurement and biomass estimation of maize", ISPRS - INTERNATIONAL ARCHIVES OF THE PHOTOGRAMMETRY, REMOTE SENSING AND SPATIAL INFORMATION SCIENCES, vol. XL-7, 19 September 2014 (2014-09-19), pages 181 - 187, XP055490343, DOI: 10.5194/isprsarchives-XL-7-181-2014 *

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109948596A (en) * 2019-04-26 2019-06-28 电子科技大学 A method for rice identification and crop coverage measurement based on a vegetation index model
CN109948596B (en) * 2019-04-26 2022-04-22 电子科技大学 Method for identifying rice and extracting planting area based on vegetation index model
US10916028B1 (en) 2019-08-22 2021-02-09 Cnh Industrial America Llc Sensor assembly for an agricultural implement and related systems and methods for monitoring field surface conditions
CN111369494A (en) * 2020-02-07 2020-07-03 中国农业科学院农业环境与可持续发展研究所 Winter wheat ear density detection method and device
CN111369494B (en) * 2020-02-07 2023-05-02 中国农业科学院农业环境与可持续发展研究所 Winter wheat spike density detection method and device
CN112070741A (en) * 2020-09-07 2020-12-11 浙江师范大学 Rice whiteness degree detection system based on image saliency region extraction method
CN112070741B (en) * 2020-09-07 2024-02-23 浙江师范大学 Rice chalkiness degree detecting system based on image salient region extracting method
US11810285B2 (en) 2021-03-16 2023-11-07 Cnh Industrial Canada, Ltd. System and method for determining soil clod parameters of a field using three-dimensional image data
CN113325761A (en) * 2021-05-25 2021-08-31 哈尔滨工业大学 Plant growth period identification control system based on deep learning and identification control method thereof
WO2022258653A1 (en) * 2021-06-10 2022-12-15 Eto Magnetic Gmbh Device for detecting a sprouting of sown seeds, agricultural sensor device, and agricultural monitoring and/or control method and system
CN113469068B (en) * 2021-07-06 2022-11-01 信阳农林学院 Growth monitoring method for large-area planting of camellia oleifera
CN113469068A (en) * 2021-07-06 2021-10-01 信阳农林学院 Growth monitoring method for large-area planting of camellia oleifera
CN114688997B (en) * 2022-03-29 2023-03-14 华南农业大学 Automatic blade area detection device and method based on RLS adaptive filtering algorithm
CN114688997A (en) * 2022-03-29 2022-07-01 华南农业大学 Automatic blade area detection device and method based on RLS adaptive filtering algorithm
CN114862705B (en) * 2022-04-25 2022-11-25 陕西西影数码传媒科技有限责任公司 Image quality evaluation method for image color restoration
CN114862705A (en) * 2022-04-25 2022-08-05 陕西西影数码传媒科技有限责任公司 Image quality evaluation method for image color restoration
CN115049926A (en) * 2022-06-10 2022-09-13 安徽农业大学 Wheat lodging loss assessment method and device based on deep learning
CN115049926B (en) * 2022-06-10 2023-10-24 安徽农业大学 Wheat lodging loss evaluation method and device based on deep learning
CN117370823A (en) * 2023-12-05 2024-01-09 恒健达(辽宁)医学科技有限公司 Spraying control method and system for agricultural planting
CN117370823B (en) * 2023-12-05 2024-02-20 恒健达(辽宁)医学科技有限公司 Spraying control method and system for agricultural planting

Also Published As

Publication number Publication date
GB2553631B (en) 2019-10-30
EP3642792A1 (en) 2020-04-29
GB201709756D0 (en) 2017-08-02
GB2553631A (en) 2018-03-14

Similar Documents

Publication Publication Date Title
GB2553631B (en) Data Processing of images of a crop
Zhang et al. Growth monitoring of greenhouse lettuce based on a convolutional neural network
US10028452B2 (en) Horticultural monitoring system
Sun et al. Three-dimensional photogrammetric mapping of cotton bolls in situ based on point cloud segmentation and clustering
Virlet et al. Field Scanalyzer: An automated robotic field phenotyping platform for detailed crop monitoring
Wang et al. High-throughput phenotyping with deep learning gives insight into the genetic architecture of flowering time in wheat
Bernotas et al. A photometric stereo-based 3D imaging system using computer vision and deep learning for tracking plant growth
Zhou et al. CropQuant: an automated and scalable field phenotyping platform for crop monitoring and trait measurements to facilitate breeding and digital agriculture
Bac et al. Robust pixel-based classification of obstacles for robotic harvesting of sweet-pepper
Pádua et al. Vineyard variability analysis through UAV-based vigour maps to assess climate change impacts
EP3032946B1 (en) Method for automatic phenotype measurement and selection
Zhuang et al. Early detection of water stress in maize based on digital images
González-Esquiva et al. Development of a visual monitoring system for water balance estimation of horticultural crops using low cost cameras
CN109843034B (en) Yield prediction for grain fields
Diago et al. On‐the‐go assessment of vineyard canopy porosity, bunch and leaf exposure by image analysis
Wu et al. Predicting Zea mays flowering time, yield, and kernel dimensions by analyzing aerial images
CN113223040B (en) Banana estimated yield method and device based on remote sensing, electronic equipment and storage medium
Kenchanmane Raju et al. Leaf Angle eXtractor: A high‐throughput image processing framework for leaf angle measurements in maize and sorghum
Lootens et al. High-throughput phenotyping of lateral expansion and regrowth of spaced Lolium perenne plants using on-field image analysis
Li et al. Development of image-based wheat spike counter through a Faster R-CNN algorithm and application for genetic studies
Olenskyj et al. End-to-end deep learning for directly estimating grape yield from ground-based imagery
CN106971409A (en) Maize canopy leaf color modeling and method
Guo et al. Panicle Ratio Network: streamlining rice panicle measurement by deep learning with ultra-high-definition aerial images in the field
Bai et al. Dynamic UAV phenotyping for rice disease resistance analysis based on multisource data
Rößle et al. Efficient Noninvasive FHB Estimation using RGB Images from a Novel Multiyear, Multirater Dataset

Legal Events

Date Code Title Description
121 Ep: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 18721093

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2018721093

Country of ref document: EP

Effective date: 20200120