GB2247948A - Micropropagation - Google Patents


Info

Publication number
GB2247948A
GB2247948A (application GB9116159A)
Authority
GB
United Kingdom
Prior art keywords
plant
stem
image
plant material
image signal
Prior art date
Legal status
Withdrawn
Application number
GB9116159A
Other versions
GB9116159D0 (en)
Inventor
Nigel James Bruce Mcfarlane
Current Assignee
National Research Development Corp UK
Original Assignee
National Research Development Corp UK
Priority date
Filing date
Publication date
Application filed by National Research Development Corp UK filed Critical National Research Development Corp UK
Publication of GB9116159D0
Publication of GB2247948A

Classifications

    • A: HUMAN NECESSITIES
    • A01: AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01H: NEW PLANTS OR NON-TRANSGENIC PROCESSES FOR OBTAINING THEM; PLANT REPRODUCTION BY TISSUE CULTURE TECHNIQUES
    • A01H4/00: Plant reproduction by tissue culture techniques; Tissue culture techniques therefor
    • A01H4/003: Cutting apparatus specially adapted for tissue culture
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00: General purpose image data processing
    • G06T1/0014: Image feed-back for automatic industrial control, e.g. robot with camera

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Biotechnology (AREA)
  • Developmental Biology & Embryology (AREA)
  • Cell Biology (AREA)
  • Botany (AREA)
  • Environmental Sciences (AREA)
  • Robotics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

A method of locating a stem of a plant during micropropagation comprises the steps of generating an image signal by a video camera representing an image of the plants, and processing the image signal to locate portions of the plants having widths and lengths likely to constitute stem portions. Stems are then selected by choosing plant portions fulfilling the criteria that the angle to the vertical of the most nearly vertical side of the plant portion is less than a predetermined angle, and that there can be traced from the bottom of the plant portion a continuous downward path through plant material to the surface of the material in which the plant is growing. Another criterion which can be used is that the plant portion terminates at its top and at its bottom in an area of plant material greater than various predetermined size criteria. The location signals are then processed to provide a control signal for controlling a robotic end effecter to move to the required location, to grasp the selected stem, and to remove the plant or a required portion thereof from a nutrient medium in which the plant is growing. Stereo information can be derived by observing the plant from slightly different directions.

Description

METHODS AND APPARATUS RELATING TO MICROPROPAGATION

The present invention relates to methods and apparatus for use in micropropagation. The invention is concerned in particular with a method of locating a stem of a plant during micropropagation, a method of harvesting a plant, and apparatus for putting the methods into effect.
Micropropagation is an increasingly important technique for the rapid production of genetically identical plants. It is a labour-intensive industry in which the potential gains in speed, sterility, and cost saving make automation an attractive prospect. In the micropropagation process, plants are grown from small pieces of plant tissue in an agar-based medium. After several weeks of growth, the microplants are removed from their containers for dissection, and the pieces which have the potential to grow into new plants are placed in fresh containers of agar, to develop into the next generation. One of the tasks which must be carried out by an automatic micropropagation system is that of harvesting the plants for dissection, i.e.
removing the plants, or the wanted parts thereof, from the soft nutrient medium in which they are growing. At present this is done manually by the operator using forceps. In an automated system, the removal will be carried out by a robotic end effecter.
In an automated system, it will be necessary to guide the robotic end effecter to the location of the required plant automatically in response to a visual image of the plant. However such guidance is difficult due to the natural variability of biological objects. Vision processing is a method of sensory control which has been applied to similar problems in other areas of agriculture, such as tomato sorting, fruit harvesting, and plant identification. However these techniques are difficult to apply in micropropagation because of the confused mass of foliage which often occurs.
According to the present invention there is provided a method of locating a stem of a plant during micropropagation comprising the steps of generating an image signal representing an image of the plant, and processing the image signal to locate a portion of plant material in the image likely to constitute a stem, by locating a portion of plant material having the following criteria, namely: (i) having a horizontal width lying in a predetermined range of horizontal distance, and (ii) having a length greater than a predetermined value; and having at least one of the following criteria, namely: (iii) having an angle of inclination to the vertical less than a predetermined value, and (iv) having a region of plant material above and/or below the plant portion fulfilling one or more further predetermined criteria.
Where references are made to horizontal and vertical, and top and bottom, these are to be taken to be references to horizontal and vertical directions on a normal presentation of the image of the plant on a monitor screen with the stem substantially upright leading upwardly from the nutrient medium in which it is planted. It is possible however that the plant may be imaged by a camera when in a position other than the normal vertical position. Thus the terms vertical and horizontal when used with regard to the image signal do not necessarily relate to the orientation of the plant itself at the actual work station.
In a preferred form of the invention, the method includes processing the image signal to locate a portion of plant material having at least the following criterion, namely: having a region of plant material above and/or below the plant portion which provides a continuous or substantially continuous path of predetermined characteristics leading through plant material from the top and/or bottom of the plant portion. Preferably the method includes processing the image signal to locate a portion of plant material having at least the following criterion, namely: having a region of plant material below the plant portion which provides a continuous or substantially continuous downward path through plant material from the bottom of the plant portion to a base level related to the level of a material in which the plant is growing.
In a preferred arrangement, the image signal is processed first to locate a plurality of portions of plant material in the image which have the first two stated criteria of horizontal width and length, and the processing procedure then selects portions which fulfil the criteria of the angle of inclination and the path through plant material to the growing medium in which the plant is growing. As an alternative to the latter criterion, portions can be selected by reference to the criterion that the portion of plant material has a region of plant material at the top and/or bottom of the plant portion fulfilling one or more predetermined size criteria.
It is found that true stem portions can be distinguished from false stem portions such as pieces of leaves, in that stems are normally more nearly vertical than other thin plant portions. Similarly, true stems can be distinguished from leaves and other parts of the plant by reference to the property possessed by most genuine stem segments of not continuing into empty space at either end of the stem portion. For these reasons, preferably, the possible stem portions are tested for inclination of the plant portion to the vertical, and for a path through plant material to the surface of the medium in which the plant is growing, or, as an alternative, for the presence of a mass of plant material of predetermined size above and below the ends of the plant portion.
In one preferred form, the method includes processing the image signal to test for the criterion of a downward path, by the steps of: examining pixels below the bottom of a possible stem portion to determine if there exists a pixel representing plant material which is immediately below, or to one side or the other by a predetermined number of pixels, of a selected pixel in the bottom of the possible stem portion; and repeating the examination step sequentially below each located pixel of plant material to locate subsequent lower pixels which are immediately below, or to one side or the other by a predetermined number of pixels, of previously located pixels in the said downward path.
Conveniently, the method may be carried out so that if, in one of the said examination steps, a pixel of plant material is located for which in a subsequent examination step there is found no succeeding lower pixel of plant material, then the said pixel for which there is no succeeding lower pixel of plant material, is replaced in the image signal by a pixel which does not represent plant material.
In the alternative criterion which may be applied (of determining if a plant portion has a region of plant material at the top and/or bottom fulfilling further predetermined size criteria), the, or one of the, predetermined size criteria, may be that, at the top and/or bottom of the plant portion, the horizontal width of the plant portion must be greater than a predetermined value, which is itself set to be greater than the lower end of the said range of horizontal width distance specified for the plant portion. In accordance with another feature, the or one of the said predetermined size criteria may be that, above the top and/or below the bottom of the plant portion, there should be in the image an area of plant material of more than a predetermined value.
Preferably, the said criterion of angle of inclination is that the angle to the vertical of the most nearly vertical side of the plant portion should be less than a predetermined value. Conveniently the said predetermined value is chosen from a range comprising 15° to 30° to the vertical, preferably 20° to 25°. Preferably, the criterion for the horizontal width of the plant portion is that the horizontal width must lie within the said predetermined range, throughout the length of the plant portion.
The invention finds particular application where it is required to insert into a group of plants growing in a container, a robotic end effector, for example a pair of gripping fingers which will close together and grip a stem of a plant. In such an arrangement it is useful to have depth information, indicating the distance of a selected stem from a fixed point of observation.
Thus in accordance with a preferred feature of the invention, the method may include generating two or more image signals representing images of the plant when observed from different directions, comparing the image signals and deriving depth information relating to the distance of different parts of the plant from a fixed point of observation of the plant, and storing depth information in association with located plant portions of a single image of the plant, which are likely to represent stems.
The depth information may be used when setting criteria for selecting likely portions of plant which constitute suitable stem portions for grasping. The method may include deriving from the depth information for a likely stem portion a measure of the accuracy of the depth information associated with that stem portion, and selecting a preferred stem portion by criterion including the accuracy of the depth information associated with the stem portion.
Preferably the two or more image signals are generated by maintaining the direction of observation of the plant constant, and rotating the plant relative to the direction of observation of the plant.
In one particularly preferred form, the method includes the further step of processing the image signal to extrapolate the image of the selected portion of plant material to represent an extension of the selected plant portion in the downward direction. Preferably the method includes the step of processing the image signal to locate the intersection of the said extension of plant portion with a base level related to the level of a material in which the plant is growing.
In one preferred way of carrying the method into effect, the image signal is adapted to present an image of the plant by a horizontally scanned image raster, the method including the steps of scanning across the image to locate a plant portion identified by a transition from background to plant material followed by a transition from plant material to background, the transitions being separated by a horizontal distance which lies in the said predetermined range of distance; scanning along the next line of the image raster to test for the presence of a subsequent adjacent scan containing a pair of transitions separated from each other by a horizontal distance in the same range, and positioned laterally within a predetermined relationship with the previous pair of transitions; and continuing with subsequent scans to test for the presence of at least a predetermined number of successive pairs of transitions fulfilling the above requirements, and thereby locating a portion of the plant image likely to constitute a stem.
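The scan-line test described above can be sketched in Python. This is an illustrative reconstruction, not code from the patent; the function name, the two-level pixel encoding, and the default width range (2 to 8 pixels, taken from the described embodiment) are assumptions.

```python
# Hypothetical sketch of locating candidate stem cross-sections on one
# raster line: a background-to-plant transition followed by a
# plant-to-background transition, separated by a width in the permitted
# range (2 to 8 pixels in the described embodiment).

PLANT, BACKGROUND = 1, 0

def transition_pairs(row, min_width=2, max_width=8):
    """Yield (left, right) pixel indices of candidate stem cross-sections."""
    pairs = []
    left = None
    for x in range(1, len(row)):
        if row[x - 1] == BACKGROUND and row[x] == PLANT:
            left = x                      # white-to-grey transition
        elif row[x - 1] == PLANT and row[x] == BACKGROUND and left is not None:
            width = x - left              # grey pixels between the edges
            if min_width <= width <= max_width:
                pairs.append((left, x - 1))
            left = None
    return pairs

# Example: a row with a 3-pixel-wide plant region starting at index 2;
# the 1-pixel region at index 7 is too thin and is rejected.
print(transition_pairs([0, 0, 1, 1, 1, 0, 0, 1, 0]))  # [(2, 4)]
```

A full implementation would run this over every raster line and link vertically adjacent pairs, as the claim describes.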
The invention also provides a method of harvesting a plant during micropropagation comprising the steps of locating a stem of the plant in accordance with the method of any of the preceding paragraphs, generating a control signal related to the location of a stem, and removing the plant from a growing medium at the said location by robotic means under the control of the said control signal.
Preferably the method includes the step of gripping the plant at the said location by robotic gripping means under the control of the said control signal. In one particular preferred form the said robotic gripping means is directed to grip the plant at a location related to the said extension of the plant portion.
It is particularly to be noted that where a feature of the invention has been set out with regard to a method, there is also provided in accordance with the invention an apparatus incorporating that feature, and vice versa. In particular, there is provided in accordance with the invention apparatus for use in micropropagation comprising means for generating an image signal representing an image of the plant; and signal processing means for processing the image signal to locate a required portion of plant material and to generate an output signal containing information as to the location of the plant portion, the processing means operating to locate the portion of plant material in the image as having the following criteria, namely: (i) having a horizontal width lying in a predetermined range of horizontal distance, and (ii) having a length greater than a predetermined value; and having at least one of the following criteria, namely: (iii) having an angle of inclination to the vertical less than a predetermined value, and (iv) having a region of plant material above and/or below the plant portion fulfilling one or more further predetermined criteria.
The apparatus may further comprise means for presenting a plurality of plants at a work station, and robotic means for removing a plant or a required portion thereof from a growing medium at a selected location under the control of the said output signal.
Embodiments of the invention will now be described by way of example by reference to the accompanying drawings in which:-
Figure 1 is a diagrammatic representation of apparatus embodying the invention for locating and selecting stems of plants and for harvesting plants, during micropropagation;
Figures 2a to 2d show a flow chart of a routine for processing a visual image signal in an embodiment of the invention;
Figure 3 shows a visual image, thresholded into two grey levels, of a microplant in a container;
Figures 3a, 3b, and 3c show the image of Figure 3 after processing to various stages in the flow chart of Figures 2a to 2d, and Figure 3d shows an additional step of signal processing which may be carried out;
Figure 4 is a representation of a close-up of a stem segment in an image of a plant, found by use of a method embodying the invention;
Figures 5a to 5e are diagrammatic representations of pixels illustrating tests carried out at step 241 in the flow chart of Figure 2c; and
Figures 6a and 6b show a flow chart of an alternative routine for processing a visual image signal in an alternative embodiment of the invention.
One of the tasks which must be carried out by an automatic micropropagation system is that of harvesting the plants for dissection. A typical configuration of microplants at this stage in the process is shown in Figure 1. A container 11 has been cut down from the normal shape of a margarine tub, to provide a shallow tray of agar 12 in which grow three plants 13, nominally arranged in a row as shown. A typical example of such a plant is the chrysanthemum. The tray may measure 80mm by 50mm with a depth of 10mm of agar, and typically the plants are 75mm high.
A harvesting tool 19 constituting a robotic end effecter, has a pair of gripper rods 20 and 21 extending towards the plants. The end effecter 19 may be driven by a control device 17 in three orthogonal directions indicated at X, Y and Z in the Figure 1. When the end effecter 19 has been brought to the position shown in Figure 1, the plant can be grasped by the gripper rods 20 and 21 moving together to grip the stem 14, and the plant can be removed from the container 11. The stems are preferentially grasped close to the bottom because it is assumed that the stem is least likely to break at that point when gripped.
The plants may be removed for example by cutting at the base, for example by a laser or by a blade mounted on a robotic tool, or by pulling the plant vertically from the agar.
It is an object of the invention to provide an algorithm for locating and selecting stems by computer vision, so that the robotic harvesting tool 19 can be automatically guided. In the following description, only the positions of the stems in the XY plane are considered.
Movement of the end effecter 19 in the Z direction can either be avoided, by growing the plants in a single row, or where for example six plants are grown in two rows, the necessary depth information can be obtained from a second image of the container, slightly rotated from its original position.
Figure 1 illustrates in diagrammatic form an arrangement for generating an image signal representing the plants by means of a solid state video camera 15 directed to the plants 13 to capture monochrome images of the microplants in the container. The effect of back lighting can be achieved by using an opaque tunnel (not shown) to shield the plants from overhead and transverse light, with an inclined sheet of white card placed so as to reflect lighting from behind the plants, imaging them in silhouette. The video signal from the camera 15 is transmitted along a line 23 to a main microcomputer 24.
The microcomputer 24 constitutes signal processing means for processing the image signal to locate a portion (also referred to as a segment) of plant material in the image which is identified as a stem. The microcomputer 24 then produces a control signal containing information relating to the location of the stem, and the control signal is passed along a further line 25 to operate the control device 17 to effect movement of the end effecter 19.
There will now be described the algorithm used in accordance with a preferred embodiment of the invention, to locate, using computer vision, stems or parts of stems suitable for the harvesting robotic end effecter to grasp.
In the microcomputer 24, the video image is digitised and placed in RAM by a Frame Grabber Card and is continuously displayed from RAM on a monitor screen 26.
Software is used to contract the image into one quarter of the monitor screen, after which the dimensions of the image are 256x256 pixels with a resolution of 128 grey levels.
The image produced by the camera 15 is thresholded into two grey levels: grey for the plants and white for the background. Figure 3 shows a typical thresholded image, in which the locations of the stems are clearly visible to the human eye. In this image, six plants in two rows are presented. It is unusual for all six stems to be visible at the same time in the crowded container. However this is not important to the practical performance of the algorithm, because hidden stems are gradually revealed by the removal of occlusions as the more readily-visible plants are harvested.
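The thresholding step can be sketched as follows. The patent does not state how the threshold level is chosen, so the value here is an assumption; with back lighting, plant pixels are simply darker than the white background.

```python
# A minimal sketch of the thresholding step: the back-lit monochrome image
# (128 grey levels after contraction to 256x256) is reduced to two levels,
# "grey" (plant material) and "white" (background). The threshold value 64
# is illustrative only.

def threshold(image, level=64, plant=1, background=0):
    """Map each pixel to plant (darker than the level) or background."""
    return [[plant if pixel < level else background for pixel in row]
            for row in image]

silhouette = threshold([[10, 120], [70, 30]])
print(silhouette)  # [[1, 0], [0, 1]]
```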
The algorithm works by searching the image for features corresponding to the definition of a stem summarised in the following table.
Summary of stem-finding algorithm

  Property of stem        Description of property                        Range possessed by stem segments
  Thickness               Number of grey pixels between the left         2 to 8 pixels inclusive.
                          edge and the right edge.
  Length                  Vertical height from top to bottom.            Greater than 3 pixels.
  Angle                   Average angle to the vertical of the two       Less than 22°.
                          sides, assuming the sides to be straight.
  Continuous path to      Can there be traced from the bottom of the     Plant pixel vertically below,
  growing medium          plant portion a continuous downward path       or one pixel to either side.
                          through plant material to the surface of
                          the material in which the plant is growing?

The algorithm can be summarised as the steps of selecting a plant portion which satisfies the criteria that: (i) the horizontal width of the plant portion, throughout its length, lies in the range two to eight pixels; and (ii) the length of the plant portion is greater than three pixels; and (iii) the angle to the vertical of the average of the two vertical sides of the plant portion is less than 22°; and (iv) there exists in the line of the scanning raster below the bottom of the stem portion a pixel representing plant material which is immediately below the centre of the bottom of the stem portion, or is positioned to one side or the other by one pixel; and that in succeeding lines of the scanning raster, down to the level of the material in which the plant portion is growing, there is in each succeeding scanning line a pixel of plant material which is either immediately below a pixel of plant material located in the previous line, or is to one side or the other of that pixel by the width of one pixel.
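The four summarised criteria can be combined into a single predicate over a candidate segment. This is an illustrative sketch; the helper names are invented, and the segment geometry and the downward-path result are assumed to be supplied by earlier processing stages.

```python
# Hypothetical predicate applying the four stem criteria of the summary
# table: thickness throughout the length, minimum length, angle to the
# vertical, and a continuous downward path to the growing medium.

def is_stem(widths, length, angle_deg, has_path_to_medium,
            min_w=2, max_w=8, min_len=3, max_angle=22.0):
    """Apply the thickness, length, angle and downward-path criteria."""
    return (all(min_w <= w <= max_w for w in widths)   # (i) 2-8 px throughout
            and length > min_len                       # (ii) longer than 3 px
            and angle_deg < max_angle                  # (iii) within 22 deg
            and has_path_to_medium)                    # (iv) path to the agar

print(is_stem(widths=[3, 3, 4], length=24, angle_deg=8.0,
              has_path_to_medium=True))   # True
```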
There will now be set out the factors taken into account during the use of the algorithm, and the steps used in the signal processing means 24. These steps are set out in the flow chart of Figures 2a to 2d.
Referring to Figure 2a, the processing starts at step 220 by the video camera 15 observing the plant from a first direction. At step 221 the processing means 24 captures a first image from the camera 15 and stores this first image.
At step 222, a motor (not shown) rotates the container 11 by a predetermined amount, in this case about 4°, and a second visual image of the plant is taken by the camera 15.
At step 223, the second video image is captured by the microcomputer 24 and stored.
At step 224, both the first image and the second image are thresholded to give an image of two grey levels, such as is shown in Figure 3. At step 225, the two images are compared in the processing means 24 to derive depth information from the stereo images, and the processing means calculates the distances of edge pixels from the camera in the second image.
At step 226, the first image is discarded, and from this step onwards the processing is carried out with regard to the second image, shown for example in Figure 3. In Figure 2a the flow chart terminates at A, and then continues in Figure 2b, starting at A.
Referring to Figure 2b, the processing continues at step 230 and the algorithm scans the image row by row until a pair of left and right hand edge points are found, at steps 231 and 232, which indicate a white-to-grey transition followed by a grey-to-white transition. The width between the transitions is then tested at step 233 to locate a pair of left and right hand edge points which are 2 to 8 pixels apart. The signal processing means then tracks the stem downwards in step 234 until a stopping condition is reached.
In general terms, the tracking of the stem downwards can be said to comprise the steps of scanning across the image to locate a plant portion identified by a transition from background to plant material followed by a transition from plant material to background, the transitions being separated by a horizontal distance which lies in the said predetermined range of distance; scanning along the next line of the image raster to test for the presence of a subsequent adjacent scan containing a pair of transitions separated from each other by a horizontal distance in the same range, and positioned laterally within a predetermined relationship with the previous pair of transitions; and continuing with subsequent scans to test for the presence of at least a predetermined number of successive pairs of transitions fulfilling the above requirements, and thereby locating a portion of the plant image likely to constitute a stem.
The tracking is stopped if (a) the stem thickness becomes thinner than 2 pixels or thicker than 8, or (b) if either edge deviates suddenly by more than 2 pixels to the right or left. The image signal at the output of step 234 is indicated as image I which is shown in Figure 3a. As will be seen in Figure 3a, at stage I, all possible stem portions are shown, even if these are very short, for example as indicated at 29.
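The downward tracking of step 234, with its two stopping conditions, can be sketched as below. This is an illustrative reconstruction: `edges_by_row` stands in for the per-row (left, right) edge pairs that the raster scan would supply, and the function name is invented.

```python
# Sketch of downward stem tracking under the two stopping conditions
# described: (a) the width leaving the 2-8 pixel range, or (b) either edge
# suddenly deviating by more than 2 pixels to the right or left.

def track_stem(edges_by_row, start, min_w=2, max_w=8, max_jump=2):
    """Follow edge pairs downward from `start`; return the rows accepted."""
    segment = [edges_by_row[start]]
    for row in range(start + 1, len(edges_by_row)):
        pair = edges_by_row[row]
        if pair is None:                            # no edge pair on this row
            break
        left, right = pair
        prev_left, prev_right = segment[-1]
        if not (min_w <= right - left + 1 <= max_w):      # condition (a)
            break
        if (abs(left - prev_left) > max_jump or
                abs(right - prev_right) > max_jump):      # condition (b)
            break
        segment.append(pair)
    return segment

# Tracking stops at the row where the segment widens to 10 pixels.
rows = [(10, 13), (10, 13), (11, 14), (11, 20), (11, 14)]
print(len(track_stem(rows, 0)))  # 3
```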
Figure 4 shows an enlargement of part of an image of a plant. The portion shown in Figure 4 appears in the top left hand corner of the plant image of Figure 3a, and is shown to illustrate how the tracking proceeds. Figure 4 shows a stem segment after tracking, with the pixels visited marked in heavy black. In the case shown, the segment was followed downwards for 24 pixels until tracking was stopped by the thickness expanding to more than 8 pixels, and by the right hand edge moving more than 2 pixels to the right.
Returning to Figure 2b, at step 235 the algorithm rejects stem segments having a length less than 3 pixels.
At step 236, the position of the stem is recorded and added to a list of possible stem segments. After tracking a segment, the coordinates of the sides of the segment are recorded, and the search for more stems is resumed from the point of entry into the segment. This continues until the end of the screen is reached, at step 237.
The flow chart is continued in Figure 2c, being linked to the part shown in Figure 2b by the flow line B. At step 238, the candidate stem segments are extracted from the list for examination for further criteria.
The set of candidate stem segments found after scanning the entire image includes leaf-stems, noise and some thin, vertical pieces of leaves amongst the genuine stems. Usually the stems are readily distinguishable from the leaf-stems by their angle to the vertical. The angle of a stem segment to the vertical is calculated from the co-ordinates of its corners; the angles of the two straight lines drawn from the bottom left to top left and from bottom right to top right are both calculated, and the average angle to the vertical of the two sides is taken to be that of the stem. The selection by the criterion of angle is carried out at step 239.
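The angle calculation described above can be sketched directly from the corner coordinates. The function name is illustrative; coordinates follow the raster convention of y increasing downwards.

```python
# Sketch of the step-239 angle test: the angles to the vertical of the two
# straight lines bottom-left to top-left and bottom-right to top-right are
# computed and averaged to give the angle of the stem segment.

import math

def angle_to_vertical(top_l, top_r, bot_l, bot_r):
    """Average of the two side angles to the vertical, in degrees.
    Points are (x, y) with y increasing downwards, as in a raster image."""
    def side_angle(top, bottom):
        dx = abs(top[0] - bottom[0])
        dy = abs(top[1] - bottom[1])
        return math.degrees(math.atan2(dx, dy))   # 0 deg = exactly vertical
    return (side_angle(top_l, bot_l) + side_angle(top_r, bot_r)) / 2.0

# A segment 20 rows tall whose sides both lean 5 pixels to one side.
print(round(angle_to_vertical((15, 0), (18, 0), (10, 20), (13, 20)), 1))  # 14.0
```

A segment would then be rejected at step 239 if this value exceeds the 22° limit.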
Stems are also distinguished from noise and many of the leaf pieces by their length. Noise and leaf pieces rarely yield a coherent pair of edges over more than a few pixels. Length is measured as the vertical difference between the top and bottom coordinates of the segment.
This step has already been dealt with in step 235. Thus stems are initially defined as a pair of edges, 2 to 8 pixels thick, with a length greater than 3 pixels and an angle to the vertical of not more than 22°.
After the elimination of candidate stems by angle and length, an image typically contains some 50 remaining segments, including perhaps about 3 errors where a feature is falsely classified as a stem. Many of the errors are removed by use of the property possessed by genuine stem segments of continuing downwardly through plant material to the surface of the growing medium such as agar.
The algorithm effects the test for a continuous downward path through plant material to the agar, as follows. The method is illustrated in Figure 3b, where the completed paths are indicated at 30, 31, 32, 33, 34, and 35, and in Figures 5a to 5e, where the testing of individual pixels is illustrated.
The method carried out in step 241 of the flow chart begins by examining the middle pixel of the bottom row of a stem candidate. The central pixel is indicated by way of example in Figure 5a at 510. The algorithm looks one row down in the scanning raster, and examines three pixels indicated at 511, 512 and 513. In the example shown in Figure 5b, the pixels 511 and 512 are both found to be pixels representing plant material (hereinafter referred to as plant pixels). The algorithm is arranged first to examine pixel 512, the pixel immediately below the previous pixel 510. Finding this to be a plant pixel, the algorithm next examines the three pixels in the next row down, comprising the pixel immediately below pixel 512, indicated at 516, and the pixels on either side, indicated at 515 and 517. In this case, none of the three pixels represents plant material, as shown in Figure 5c. The algorithm then returns to consideration of pixel 510, and takes two further steps. Firstly the algorithm replaces the plant pixel 512 with a blank, or non-plant, pixel, to avoid any further attempt to track down a path through pixel 512. Next the algorithm examines the two pixels on either side of 512, trying first the pixel to the left in Figure 5d, namely pixel 511. In this particular case, pixel 511 is found to be a plant pixel, so that the path, for example path 31, now continues downwardly, having moved through an angle of 45° to one side, as shown in Figure 5e.
The next step is to test the three pixels immediately below pixel 511, indicated at pixels 514, 515 and 516. The routine established will again first test the central pixel 515, and will preferably follow the vertically downward path through pixel 515. Only if this path is stopped by meeting a non-plant pixel in a lower raster scan, will the algorithm withdraw back up to pixel 511 and then move sideways to examine pixels 514 and 516.
As shown in Figure 3b, the outcome of this step 241 in Figure 2c, is that a series of paths 30 to 35 are traced downwardly through the plant material to the level of the agar at 12. If any stem segment fails to meet the test of the routine in step 241, that stem segment is deleted from the list of candidate stems, at step 242 in the flow chart.
It is not essential during step 241 in the routine of Figure 2c to look at the raster scan immediately below the stem segment. It would be possible, for example, to look at every second line of the raster scan when examining pixels to determine if there is a continuous path to the growing medium. This would of course merely show whether there was a substantially continuous path, but this would be sufficient. Also, the path traced would no longer be either vertically downwards or at 45°; instead the path would be either vertically downwards or at 22½° to the vertical.
In some arrangements this may be preferable.
After applying the criteria of steps 239 and 241, rejected stem segments are deleted from the list at step 242, and after the last stem segment in the list has been dealt with, at step 243, the output image signal, as indicated at II, represents the image shown in Figure 3c.
As can be seen, the short stem segments of Figure 3a have been removed from the further processed signal shown in Figure 3c.
Next, as shown in Figure 2c, at steps 244 to 247, the algorithm makes use of various depth information stored at previous step 225 in Figure 2a, to remove less likely stem segments. At step 244, the algorithm examines stem segments taken from the remaining list of candidate stems.
In step 245, the processing means looks at each pixel of a stem segment, and averages the stereo, or depth, information from each pixel over the whole length of the stem segment. This averaged depth information for the whole stem segment is stored for future use. In step 246, the processing means calculates a measure of the accuracy of the stereo information for each entire stem segment and again stores the measure. Thus in these steps, the algorithm averages the stereo information of the individual pixels, to give overall information as to the transverse movement of that stem portion during the rotation of the container 11 between the two stereo visual images. The estimate of accuracy (or inverse of error) at step 246 takes the form of an inverse of the standard deviation, and gives an estimate of the accuracy of that signal. For example, for each stem segment located, the error could be 5mm or 6mm, or even up to 10mm. For the best values of depth information, the error will be within about 2mm. This is used later in the flow chart by a decision to select for grasping a stem segment which has the best stereo information available for the positioning of the gripping elements.
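Steps 245 and 246 can be sketched as follows; the helper name and the use of Python's statistics module are illustrative assumptions, with the accuracy measure taken, as in the text, to be the inverse of the standard deviation.

```python
from statistics import mean, stdev

def segment_depth_stats(depths):
    """Average the per-pixel stereo depths over a stem segment, and
    form an accuracy measure as the inverse of their standard
    deviation (a large spread means low accuracy).  'depths' holds
    one depth value per stem pixel, e.g. in millimetres."""
    avg = mean(depths)
    s = stdev(depths)
    accuracy = 1.0 / s if s > 0 else float('inf')
    return avg, accuracy
```

Both values are stored per segment for use in the scoring stage of Figure 2d.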
At step 247, determination is made as to when the last stem segment in the list has been dealt with, and the flow chart moves through connection C, to the remainder of the chart shown in Figure 2d.
Referring to Figure 2d, at step 250, the processing means examines the first candidate segment from the remaining list and begins to construct a score for each candidate segment, comprising three components. The components may be summarised as follows: (i) A decision is made as to whether the candidate segment has any stereo information available at all. If stereo information is available, a score of one hundred is given to that candidate segment.
(ii) A decision is made as to whether the candidate segment has sufficient clearance from other plant material in front of the candidate segment, to allow entry of the gripping rods 20.
(This test will be explained more fully hereinafter). If clearance is available, ten units are added to the score.
(iii) The accuracy, or inverse of standard deviation, of the stereo information is examined and a number of units in the range one to nine, representing the accuracy of the stereo, is added to the score. This number of units is formed by taking the inverse of the standard deviation, or of the square of the standard deviation, and modifying it to give an appropriate score in the range one to nine. The mapping is determined empirically for the particular case. The highest certainty is attributed a score of nine units, and the lowest level of certainty one unit.
Thus, following the flow chart, at step 252, one hundred units are added to the score for the segment if at least some stereo information is available for that stem segment. At step 253, the stereo information available is considered again to determine whether there is sufficient clearance for the gripping rods 20 and 21 to be inserted into the mass of plants to remove a stem portion. In the visual image, pixels are examined on either side of a candidate stem segment to check whether there are any edge pixels in front of the stem-portion space which would get in the way of the gripping rods 20 and 21. A higher score is given to stem portions which do not have any obstructions in the space on either side of the stem segment in the two-dimensional visual image of Figure 3, but positioned in front of the relevant stem segment area, so far as the stereo information is concerned. In step 254, there is added to the score for a stem segment a number from one to nine representing the accuracy of the stereo information available. At step 255, a check is made as to when the last stem segment has been dealt with, and the routine then passes to step 256.
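The three-component score assembled at steps 252 to 254 might be sketched as below; the function name is hypothetical, and the mapping of inverse standard deviation onto the one-to-nine range is, as the text notes, empirical, so the simple clamp used here is only an assumed example.

```python
def score_segment(has_stereo, has_clearance, inv_std):
    """Score a candidate stem segment: 100 units if any stereo
    information exists, 10 units if there is clearance for the
    gripping rods, plus 1 to 9 units for stereo accuracy (the
    inverse of the standard deviation, clamped to that range)."""
    score = 0
    if has_stereo:
        score += 100
        if has_clearance:
            score += 10
        # empirical mapping of accuracy into the 1..9 range
        score += max(1, min(9, int(round(inv_std))))
    return score
```

A segment with stereo, clearance, and an accuracy figure of 5 thus scores 115; the candidates can then be sorted on this value to pick the best stem.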
In step 256, the stem segments are arranged in a list in order of the score for each stem segment. At step 257 the robotic end effector 19 is directed by the control device 17 to the centre of the best stem selected, i.e. the stem portion at the head of the list sorted at step 256.
As a refinement, it can be arranged that, as the robotic end effector 19 moves repeatedly into the container 11 to remove plantlets, the distance by which the rods 20 and 21 move into the container is increased. The reason for this is that as the algorithm proceeds down the list of stems, the accuracy of the stereo information decreases, so that the doubt as to the depth position of a plantlet increases. For this reason, it is then wise to advance the gripper rods 20 and 21 to a greater extent into the container 11, relative to the expected depth position, to ensure that the stem is gripped. If, for example, an early plantlet on the list has a deviation or error of 2mm to 3mm in its depth position, it is sufficient to advance the grippers beyond the expected position by, say, 5mm. As the container is emptied, and candidates lower down the list are selected, the error may be in the range 5mm to 6mm, so that it is wise to advance the rods 20 and 21 by, say, 10mm beyond the expected position.
At step 258 in the algorithm, the plantlet selected is either cut and removed, or is pulled upwardly and removed.
The routine ends at step 259.
The provision of stereo information in the algorithm will now be commented on. Stereo information is needed to avoid the end effector 19 proceeding into the container 11 and removing two or more plants at once, which are positioned one in front of the other as far as the camera is concerned. There are a number of advantages in obtaining the stereo information by rotating the container 11, rather than observing a stationary container with two cameras or with one camera moved between two positions.
Firstly, the rotation of the container avoids the need to calibrate two cameras. A camera is calibrated by setting up a grid at the back of the container 11, and arranging for the robotic end effector 19 to draw a series of dots on the grid. The position of the dots on a monitor screen 26 is then compared with the actual position of the dots in the grid, so that a correlation can be produced between camera position on the screen, and position of the tool 19 in real space.
Another advantage of rotation of the container 11, is that if it occurs that no suitable stem portions are located by the processing means 24, from one or two views presented by the camera 15, then the entire container 11 can be rotated through a much greater angle, and the sequence can be tried again. The container 11 can be moved through a number of large angles, until a suitable pair of visual images is obtained, showing suitable stem segments.
The algorithm used to obtain stereo information is a simple, known algorithm. Initially edges are located in the first image, and the second image is then examined.
The algorithm looks at six pixels on each side of the computed position for the same edge in the second image, and having located the edge, calculates the difference in distance between the two edges.
There will now be described with reference to the Figures 6a and 6b, an alternative flow chart showing an alternative algorithm which may be used in accordance with an embodiment of the invention. It is particularly to be appreciated that different features may be selected from the two main embodiments described in the two flow charts, and may be combined in other embodiments.
The algorithm used in the embodiments of Figures 6a and 6b works by searching the image for features corresponding to the definition of a stem summarised in the following table.
Summary of stem-finding algorithm

Thickness: number of grey pixels between left edge and right edge; 2 to 8 pixels inclusive.
Length: vertical height from top to bottom; greater than 5 pixels.
Angle: angle of most vertical side to the vertical, assuming sides to be straight; less than 22°.
End thickness: thickness of top (or bottom) end of stem; greater than 2 pixels.
Number of stem pixels in adjacent row: the 'adjacent row' is defined as the pixels immediately above the top of the stem (or below the bottom), in a horizontal span equal to the thickness of the stem end; greater than 1.
The algorithm can be summarised as the steps of selecting a plant portion which satisfies the criteria that: (i) the horizontal width of the plant portion, throughout its length, lies in the range two to eight pixels; and (ii) the length of the plant portion is greater than five pixels; and (iii) the angle to the vertical of the most nearly vertical side of the plant portion is less than 22°; and (iv) the horizontal widths of the top and bottom ends of the plant portion are each greater than two pixels; and (v) the adjacent lines of the scanning raster, above the top of the plant portion and below the bottom of the plant portion, each contain, in the region respectively immediately above and below the plant portion, more than one pixel representing plant material.
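The five criteria can be collected into a single predicate; the function and its argument names are hypothetical, and the measurements are assumed to have been extracted from the image beforehand.

```python
def is_stem_candidate(widths, top_width, bottom_width,
                      angle_deg, adjacent_above, adjacent_below):
    """Check a traced segment against the five stem criteria
    summarised above.  'widths' is the horizontal width on each
    raster row of the segment, 'angle_deg' the angle of the most
    nearly vertical side, and 'adjacent_above'/'adjacent_below'
    the number of plant pixels in the rows just beyond each end."""
    return (all(2 <= w <= 8 for w in widths)        # (i) thickness
            and len(widths) > 5                     # (ii) length
            and angle_deg < 22                      # (iii) angle
            and top_width > 2 and bottom_width > 2  # (iv) end thickness
            and adjacent_above > 1                  # (v) continuation
            and adjacent_below > 1)
```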
There will now be set out the factors taken into account during the use of the algorithm, and the steps used in the signal processing means 24 when operating in accordance with the flow chart of Figures 6a and 6b.
Referring to Figure 6a, the processing starts at step 630. Only a single visual image is used, so that no stereo information is available. The algorithm scans the image row by row until a pair of left and right hand edge points is found, at steps 631 and 632, indicated by a white-to-grey transition followed by a grey-to-white transition. The width between the transitions is then tested at step 633 to locate a pair of left and right hand edge points which are 2 to 8 pixels apart. The signal processing means then tracks the stem downwards in step 634 until a stopping condition is reached. The tracking is stopped (a) if the stem thickness becomes thinner than 2 pixels or thicker than 8, or (b) if either edge deviates suddenly by more than 2 pixels to the right or left. The image signal at the output of step 634 is indicated as image I, which is generally the same as that shown in Figure 3a for the previous embodiment. As will be seen in Figure 3a, at stage I, all possible stem portions are shown, even if these are very short, for example as indicated at 29.
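The edge-pair scan of steps 631 to 633 might look like the following sketch; the function name, the `is_plant` grey-level classifier and the list-of-grey-values row representation are assumptions for illustration.

```python
def find_edge_pairs(row, is_plant, min_w=2, max_w=8):
    """Scan one raster row for a background-to-plant transition
    followed by a plant-to-background transition, returning
    (left, right) column pairs whose width lies in the 2-8 pixel
    range.  'is_plant' classifies a grey level as plant material;
    a simple threshold is assumed."""
    pairs = []
    left = None
    for col, g in enumerate(row):
        if is_plant(g) and left is None:
            left = col                  # white-to-grey edge
        elif not is_plant(g) and left is not None:
            width = col - left          # grey-to-white edge
            if min_w <= width <= max_w:
                pairs.append((left, col - 1))
            left = None
    return pairs
```

Each accepted pair is then the starting point for tracking the stem downwards at step 634.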
Returning to Figure 6a, at step 635 the algorithm rejects stem segments having a length less than 5 pixels.
At step 636, the position of the stem is recorded and added to a list of possible stem segments. After tracking a segment, the coordinates of the corners, that is to say the tops and bottoms of both edges, are recorded, and the search for more stems is resumed from the point of entry into the segment. This continues until the end of the screen is reached, at step 637.
The flow chart is continued in Figure 6b, being linked to the part shown in Figure 6a by the flow line A. At step 638, the candidate stem segments are extracted from the list for examination against further criteria. The set of candidate stem segments found after scanning the entire image includes leaf-stems, noise and some thin, vertical pieces of leaves amongst the genuine stems. Usually the stems are readily distinguishable from the leaf-stems by their angle to the vertical. The angle of a stem segment to the vertical is calculated from the coordinates of its corners; the angles of the two straight lines drawn from bottom left to top left and from bottom right to top right are both calculated, and the angle closest to the vertical is taken to be that of the stem. Taking the most vertical side of the stem segment as a measure of its angle allows short segments, which are more prone to errors in this quantity, to be more easily recognised as stems. The selection by the criterion of angle is carried out at step 639.
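The corner-based angle calculation at step 639 can be sketched as below; corner coordinates are assumed to be (column, row) pixel pairs with rows increasing downwards, as in a raster image, and the function name is hypothetical.

```python
import math

def segment_angle(top_left, bottom_left, top_right, bottom_right):
    """Angle of a stem segment to the vertical: compute the angle
    of each side from its bottom corner to its top corner, and keep
    whichever side is closest to vertical."""
    def side_angle(bottom, top):
        dx = top[0] - bottom[0]
        dy = bottom[1] - top[1]  # vertical extent (rows grow downwards)
        return math.degrees(math.atan2(abs(dx), dy))
    return min(side_angle(bottom_left, top_left),
               side_angle(bottom_right, top_right))
```

A perfectly vertical side gives 0°, so a segment with one straight vertical edge is accepted however ragged its other edge is.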
Stems are also distinguished from noise and many of the leaf pieces by their length. Noise and leaf pieces rarely yield a coherent pair of edges over more than a few pixels. Length is measured as the vertical difference between the top and bottom coordinates of the segment.
This step has already been dealt with in step 635. Thus stems are initially defined as a pair of edges, 2 to 8 pixels thick, with a length greater than 5 pixels and an angle to the vertical of not more than 22°. After the elimination of candidate stems by angle and length, an image typically contains some 20 remaining segments, including perhaps about 3 errors where a feature is falsely classified as a stem. Many of the errors are removed by use of the property, possessed by genuine stem segments, of not continuing into empty space at either end; stem segments always end in a leaf or another stem segment at the top, or in leaf, stem or agar at the bottom.
Segments are rejected which are thinner than 3 pixels at either end (step 640), or which do not continue into at least two plant pixels immediately above and below the ends (step 641). Approximately half the errors are removed by this criterion.
After applying the criteria of steps 639, 640 and 641, rejected stem segments are deleted from the list at step 642, and after the last stem segment in the list has been dealt with, at step 643, the output image signal is indicated at II, representing an image generally similar to that shown in Figure 3c for the previous embodiment. Next, at step 644, the algorithm locates the longest stem of the stems shown in Figure 3c, and at step 645 the algorithm extrapolates the longest stem downwardly towards the container 11. Also at step 645, the algorithm locates a region at the intersection of the extrapolated longest stem with the general level of the agar 12 in the container 11.
The output signal after step 645 is indicated at III, and represents a visual image generally the same as that shown in Figure 3d for the previous embodiment. In Figure 3d, the longest stem section is indicated at 28, and the intersection of the extrapolated stem and the surface of the agar 12, is indicated at 29. The final steps of the flow chart are that, at step 646 the robotic end effector is directed to grasp the selected stem in the region of the location 29 in Figure 3d, and at step 647 the selected stem is cut and the plant removed from the container 11. The algorithm is then stopped at step 648.

Claims (25)

1. A method of locating a stem of a plant during micropropagation comprising the steps of generating an image signal representing an image of the plant, and processing the image signal to locate a portion of plant material in the image likely to constitute a stem, by locating a portion of plant material having the following criteria, namely: (i) having a horizontal width lying in a predetermined range of horizontal distance, and (ii) having a length greater than a predetermined value; and having at least one of the following criteria, namely: (iii) having an angle of inclination to the vertical less than a predetermined value, and (iv) having a region of plant material above and/or below the plant portion fulfilling one or more further predetermined criteria.
2. A method according to claim 1 including processing the image signal to locate a portion of plant material having at least the following criterion, namely: having a region of plant material above and/or below the plant portion which provides a continuous or substantially continuous path of predetermined characteristics leading through plant material from the top and/or bottom of the plant portion.
3. A method according to claim 1 or 2 including processing the image signal to locate a portion of plant material having at least the following criterion, namely: having a region of plant material below the plant portion which provides a continuous or substantially continuous downward path through plant material from the bottom of the plant portion to a base level related to the level of a material in which the plant is growing.
4. A method according to claim 3 including processing the image signal to test for the criterion of a downward path, by the steps of: examining pixels below the bottom of a possible stem portion to determine if there exists a pixel representing plant material which is immediately below, or to one side or the other by a predetermined number of pixels, of a selected pixel in the bottom of the possible stem portion; and repeating the examination step sequentially below each located pixel of plant material to locate subsequent lower pixels which are immediately below, or to one side or the other by a predetermined number of pixels, of previously located pixels in the said downward path.
5. A method according to claim 4 in which if, in one of the said examination steps, a pixel of plant material is located for which in a subsequent examination step there is found no succeeding lower pixel of plant material, then the said pixel for which there is no succeeding lower pixel of plant material, is replaced in the image signal by a pixel which does not represent plant material.
6. A method according to any preceding claim in which the said criterion of angle of inclination is that the angle to the vertical of the average of the two sides of the plant portion should be less than a predetermined value.
7. A method according to any preceding claim in which the said predetermined value of an angle to the vertical is selected from a range comprising 20° to 25°.
8. A method according to any preceding claim in which the criterion for the horizontal width of the plant portion is that the horizontal width must lie within the said predetermined range, throughout the length of the plant portion.
9. A method according to any preceding claim including processing the image signal to locate a portion of plant material having at least the following criterion, namely: having a region of plant material at the top and/or bottom of the plant portion fulfilling one or more predetermined size criteria.
10. A method according to claim 9 in which the or one of the said predetermined size criteria is that, at the top and/or bottom of the plant portion, the horizontal width of the plant portion must be greater than a predetermined value, which is itself set to be greater than the lower end of the said range of horizontal width distance specified for the plant portion; and/or that, above the top and/or below the bottom of the plant portion, there should be in the image an area of plant material of more than a predetermined value.
11. A method according to any preceding claim including generating two or more image signals representing images of the plant when observed from different directions, comparing the image signals and deriving depth information relating to the distance of different parts of the plant from a fixed point of observation of the plant, and storing depth information in association with located plant portions of a single image of the plant, which are likely to represent stems.
12. A method according to claim 11 including deriving from the depth information for a likely stem portion a measure of the accuracy of the depth information associated with that stem portion, and selecting a preferred stem portion by criteria including the accuracy of the depth information associated with the stem portion.
13. A method according to claim 11 or 12 in which the two or more image signals are generated by maintaining the direction of observation of the plant constant, and rotating the plant relative to the direction of observation of the plant.
14. A method of locating a stem according to any preceding claim including the further step of processing the image signal to extrapolate the image of the selected portion of plant material to represent an extension of the selected plant portion in the downward direction.
15. A method according to claim 14 including the step of processing the image signal to locate the intersection of the said extension of plant portion with a base level related to the level of a material in which the plant is growing.
16. A method according to any preceding claim in which the image signal is adapted to present an image of the plant by a horizontally scanned image raster, the method including the steps of scanning across the image to locate a plant portion identified by a transition from background to plant material followed by a transition from plant material to background, the transitions being separated by a horizontal distance which lies in the said predetermined range of distance; scanning along the next line of the image raster to test for the presence of a subsequent adjacent scan containing a pair of transitions separated from each other by a horizontal distance in the same range, and positioned laterally within a predetermined relationship with the previous pair of transitions; and continuing with subsequent scans to test for the presence of at least a predetermined number of successive pairs of transitions fulfilling the above requirements, and thereby locating a portion of the plant image likely to constitute a stem.
17. A method of harvesting a plant during micropropagation comprising the steps of locating a stem of the plant in accordance with the method of any preceding claim, generating a control signal related to the location of a stem, and removing the plant from a growing medium at the said location by robotic means under the control of the said control signal.
18. A method according to claim 17 including the step of gripping the plant at the said location by robotic gripping means under the control of the said control signal.
19. A method according to claim 18 when including the limitations of claim 14 or 15, in which the said robotic gripping means is directed to grip the plant at a location related to the said extension of the plant portion.
20. A method of locating a stem of a plant during micropropagation comprising the steps of generating an image signal representing an image of the plant, and processing the image signal to locate a portion of plant material in the image likely to constitute a stem, by locating a portion of plant material having the following criteria, namely: (i) having a horizontal width lying in a predetermined range of horizontal distance, (ii) having a length greater than a predetermined value, (iii) having an angle of inclination to the vertical less than a predetermined value, and (iv) having a region of plant material below the plant portion which provides a continuous or substantially continuous downward path through plant material from the bottom of the plant portion to a base level related to the level of a material in which the plant is growing.
21. Apparatus for use in micropropagation comprising means for generating an image signal representing an image of the plant; and signal processing means for processing the image signal to locate a required portion of plant material and to generate an output signal containing information as to the location of the plant portion, the processing means operating to locate the portion of plant material in the image as having the following criteria, namely: (i) having a horizontal width lying in a predetermined range of horizontal distance, and (ii) having a length greater than a predetermined value; and having at least one of the following criteria, namely: (iii) having an angle of inclination to the vertical less than a predetermined value, and (iv) having a region of plant material above and/or below the plant portion fulfilling one or more further predetermined criteria.
22. Apparatus according to claim 21 further comprising means for presenting a plurality of plants at a work station, and robotic means for removing a plant or a required portion thereof from a growing medium at a selected location under the control of the said output signal.
23. A method of locating a stem of a plant during micropropagation, substantially as hereinbefore described with reference to the accompanying drawings.
24. A method of harvesting a plant during micropropagation substantially as hereinbefore described with reference to the accompanying drawings.
25. Apparatus for use in micropropagation of a plant substantially as hereinbefore described with reference to the accompanying drawings.
GB9116159A 1990-07-26 1991-07-26 Micropropagration Withdrawn GB2247948A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB909016443A GB9016443D0 (en) 1990-07-26 1990-07-26 Methods and apparatus relating to micropropagation

Publications (2)

Publication Number Publication Date
GB9116159D0 GB9116159D0 (en) 1991-09-11
GB2247948A true GB2247948A (en) 1992-03-18

Family

ID=10679710

Family Applications (2)

Application Number Title Priority Date Filing Date
GB909016443A Pending GB9016443D0 (en) 1990-07-26 1990-07-26 Methods and apparatus relating to micropropagation
GB9116159A Withdrawn GB2247948A (en) 1990-07-26 1991-07-26 Micropropagration

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GB909016443A Pending GB9016443D0 (en) 1990-07-26 1990-07-26 Methods and apparatus relating to micropropagation

Country Status (4)

Country Link
EP (1) EP0540627A1 (en)
AU (1) AU8304891A (en)
GB (2) GB9016443D0 (en)
WO (1) WO1992001994A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110116688A1 (en) * 2009-11-13 2011-05-19 Li Yi-Fang Automatic measurement system and method for plant features, and recording medium thereof

Families Citing this family (3)

Publication number Priority date Publication date Assignee Title
US5370713A (en) * 1990-09-07 1994-12-06 The Commonwealth Industrial Gases Limited Automatic plant dividing system
FR2725812B1 (en) * 1994-10-17 1997-01-17 Cirad Coop Int Rech Agro Dev METHOD FOR IDENTIFYING VARIOUS OBJECTS, SPECIES OR INDIVIDUALS AND APPLICATIONS THEREOF
DE19845883B4 (en) * 1997-10-15 2007-06-06 LemnaTec GmbH Labor für elektronische und maschinelle Naturanalytik Method for determining the phytotoxicity of a test substance

Citations (1)

Publication number Priority date Publication date Assignee Title
WO1986006576A1 (en) * 1985-05-15 1986-11-20 The Commonwealth Industrial Gases Limited Method and apparatus for dividing plant materials

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US4613269A (en) * 1984-02-28 1986-09-23 Object Recognition Systems, Inc. Robotic acquisition of objects by means including histogram techniques

Patent Citations (1)

Publication number Priority date Publication date Assignee Title
WO1986006576A1 (en) * 1985-05-15 1986-11-20 The Commonwealth Industrial Gases Limited Method and apparatus for dividing plant materials

Cited By (2)

Publication number Priority date Publication date Assignee Title
US20110116688A1 (en) * 2009-11-13 2011-05-19 Li Yi-Fang Automatic measurement system and method for plant features, and recording medium thereof
US8600117B2 (en) * 2009-11-13 2013-12-03 Institute For Information Industry Automatic measurement system and method for plant features, and recording medium thereof

Also Published As

Publication number Publication date
EP0540627A1 (en) 1993-05-12
GB9016443D0 (en) 1990-09-12
AU8304891A (en) 1992-02-18
WO1992001994A1 (en) 1992-02-06
GB9116159D0 (en) 1991-09-11

Similar Documents

Publication Publication Date Title
Yamamoto et al. Development of a stationary robotic strawberry harvester with a picking mechanism that approaches the target fruit from below
Sarig Robotics of fruit harvesting: A state-of-the-art review
JP5119392B2 (en) Fruit harvesting robot and strawberry cultivation facility
EP0858640B1 (en) Teat location for milking
US5471541A (en) System for determining the pose of an object which utilizes range profiles and synethic profiles derived from a model
US6756589B1 (en) Method for observing specimen and device therefor
CN101493313B (en) Image processing process for ripe fruit identification and positioning
Huang et al. An automatic machine vision-guided grasping system for Phalaenopsis tissue culture plantlets
CN114175927B (en) Cherry tomato picking method and cherry tomato picking manipulator
Rath et al. Robotic harvesting of Gerbera Jamesonii based on detection and three-dimensional modeling of cut flower pedicels
Velumani Wheat ear detection in plots by segmenting mobile laser scanner data
Hayashi et al. Gentle handling of strawberries using a suction device
EP4057811A1 (en) System and method for automated and semi-automated mosquito separation identification counting and pooling
EP0222836A1 (en) Method and apparatus for dividing plant materials
GB2247948A (en) Micropropagration
JP3277529B2 (en) Fruit harvesting robot
Tarrío et al. A harvesting robot for small fruit in bunches based on 3-D stereoscopic vision
Feng et al. Fruit Location And Stem Detection Method For Strawbery Harvesting Robot
Kounalakis et al. Development of a tomato harvesting robot: Peduncle recognition and approaching
He et al. Detecting and localizing strawberry centers for robotic harvesting in field environment
Lefebvre et al. Computer vision and agricultural robotics for disease control: the Potato operation
EP0570473A1 (en) A method for use in a multiplication process of plants and a device for carrying out said method.
McFarlane Image-guidance for robotic harvesting of micropropagated plants
McFarlane A computer-vision algorithm for automatic guidance of microplant harvesting
CA3111952A1 (en) Mushroom harvesting vision system and method of harvesting mushrooms using said system

Legal Events

Date Code Title Description
732 Registration of transactions, instruments or events in the register (sect. 32/1977)
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)