WO2013166022A2 - Machine vision system for frozen aliquotter for biological samples - Google Patents


Info

Publication number
WO2013166022A2
WO2013166022A2 (PCT application PCT/US2013/038880)
Authority
WO
WIPO (PCT)
Prior art keywords
container
frozen sample
bore
camera
frozen
Prior art date
Application number
PCT/US2013/038880
Other languages
French (fr)
Other versions
WO2013166022A3 (en)
Inventor
Mohammadreza RAMEZANIFARD
Saeed Sokhanvar
Todd BASQUE
Peter L. FULLER
Matthew Sweetland
Original Assignee
Cryoxtract Instruments, Llc
Priority date
Filing date
Publication date
Priority claimed from US13/489,234 external-priority patent/US20130286192A1/en
Priority claimed from US13/844,156 external-priority patent/US20140267713A1/en
Application filed by Cryoxtract Instruments, Llc filed Critical Cryoxtract Instruments, Llc
Priority to JP2015510387A priority Critical patent/JP6108572B2/en
Priority to BR112014026936A priority patent/BR112014026936A2/en
Priority to AU2013256489A priority patent/AU2013256489A1/en
Priority to EP13724056.0A priority patent/EP2845013A2/en
Priority to CN201380022571.9A priority patent/CN104428678A/en
Priority to CA2870505A priority patent/CA2870505A1/en
Publication of WO2013166022A2 publication Critical patent/WO2013166022A2/en
Publication of WO2013166022A3 publication Critical patent/WO2013166022A3/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 1/00 Sampling; Preparing specimens for investigation
    • G01N 1/28 Preparing specimens for investigation including physical details of (bio-)chemical methods covered elsewhere, e.g. G01N 33/50, C12Q
    • G01N 1/286 Preparing specimens for investigation involving mechanical work, e.g. chopping, disintegrating, compacting, homogenising
    • G01N 35/00 Automatic analysis not limited to methods or materials provided for in any single one of groups G01N 1/00 - G01N 33/00; Handling materials therefor
    • G01N 35/00584 Control arrangements for automatic analysers
    • G01N 35/00722 Communications; Identification
    • G01N 35/00732 Identification of carriers, materials or components in automatic analysers
    • G01N 35/0099 Automatic analysis comprising robots or similar manipulators
    • G01N 2035/00891 Displaying information to the operator
    • G01N 2035/0091 GUI [graphical user interfaces]

Definitions

  • the present invention relates generally to machine vision systems and methods, and more particularly to machine vision systems for facilitating control of robotic systems for taking multiple frozen sample cores from frozen samples in containers without thawing the frozen samples.
  • Biological samples are commonly preserved to support a broad variety of biomedical and biological research, including but not limited to translational research, molecular medicine, and biomarker discovery.
  • Biological samples include any samples which are of animal (including human), plant, protozoal, fungal, bacterial, viral, or other biological origin.
  • biological samples include, but are not limited to, organisms and/or biological fluids isolated from or excreted by an organism such as plasma, serum, urine, whole blood, cord blood, other blood-based derivatives, cerebral spinal fluid, mucus (from respiratory tract, cervical), ascites, saliva, amniotic fluid, seminal fluid, tears, sweat, any fluids from plants (including sap); cells (e.g., animal, plant, protozoal, fungal, or bacterial cells, including buffy coat cells); cell lysates, homogenates, or suspensions; microsomes; cellular organelles (e.g., mitochondria); nucleic acids (e.g., RNA, DNA), including chromosomal DNA, mitochondrial DNA, and plasmids
  • Biological samples may also include plants, portions of plants (e.g., seeds), and tissues (e.g., muscle, fat, skin, etc.).
  • Biobanks typically store these valuable samples in containers (e.g., well plates or arrays, tubes, vials, or the like) and cryopreserve them. Tubes, vials, and similar containers can be organized in arrays and can be stored in well plates, racks, divided containers, etc. Although some samples are stored at relatively higher temperatures (e.g., about -20 degrees centigrade), other samples are stored at much lower temperatures. For example, some samples are stored in freezers at -80 degrees centigrade, or lower (using liquid nitrogen or the vapor phase above liquid nitrogen), to preserve the biochemical composition and integrity of the frozen sample as close as possible to the in vivo state to facilitate accurate,
  • Biobanks have adopted different ways to address this need to provide sample aliquots.
  • One option is to freeze a sample in large volume, thaw it when aliquots are requested and then refreeze any remainder of the parent sample for storage in the cryopreserved state until future aliquots are needed.
  • This option makes efficient use of frozen storage space; yet this efficiency comes at the cost of sample quality.
  • Exposing a sample repeatedly to freeze/thaw cycles can degrade the sample's critical biological molecules (e.g., RNA) and damage biomarkers, either of which could compromise the results of any study using data obtained from the damaged samples.
  • Another option is to freeze a sample in large volume, thaw it when an aliquot is requested, subdivide the remainder of the parent sample in small volumes to make
  • the system uses a drill including a hollow coring bit to take a frozen core sample from the original parent sample without thawing the parent sample.
  • the frozen sample core obtained by the drill is used as the aliquot for the test. After the frozen core is removed, the remainder of the sample is returned to frozen storage in its original container until another aliquot from the parent sample is needed for a future test.
  • One aspect of the invention is a machine vision system for use with a robotic system for taking a plurality of frozen sample cores from frozen samples that are each contained in a respective container.
  • the machine vision system includes a platform for supporting one or more of the containers.
  • the platform has a station for receiving at least one of the containers and a pair of calibration marks on the platform in fixed positions relative to the station.
  • the system has a camera for capturing an image of the container while the container is received at the station.
  • a processor is configured to receive image data from the camera indicative of the image of the container.
  • the processor is configured to determine one or more locations where a frozen sample core has already been taken from a frozen sample contained in the container by: (a) evaluating contrast in the image to identify one or more bore candidates; and (b) using information about the position of the calibration marks relative to the bore candidates to determine whether or not the one or more candidates are likely to be artifacts instead of real bores in the sample.
  • Another aspect of the invention is a method of taking a frozen sample core from a frozen sample that is contained in a container.
  • the method includes positioning the container at a station for receiving a container on a platform.
  • the platform has a pair of calibration marks on the platform in fixed positions relative to the station.
  • An image of the container is captured while the container is received at the station.
  • One or more locations where a frozen sample core has already been taken from the frozen sample contained in the container are determined by: (a) evaluating contrast in the image to identify one or more bore candidates; and (b) using information about the position of the calibration marks relative to the bore candidates to determine whether or not the one or more candidates are likely to be artifacts instead of real bores in the frozen sample.
  • a frozen sample core is taken from the sample at a location from which no frozen sample core has already been taken, as determined in the determining step.
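The calibration marks in the claims above serve to tie pixel positions in the image to known physical positions on the platform. With exactly two marks, a common approach is a two-point similarity transform (rotation, uniform scale, and translation). The sketch below is an illustration of that general idea, not the patent's algorithm; the function names and coordinate conventions are assumptions:

```python
def similarity_from_marks(img_marks, world_marks):
    """Fit a rotation+scale+translation mapping image pixels -> platform
    coordinates from two calibration marks, using complex arithmetic:
    the transform is z -> a*z + b."""
    p1, p2 = (complex(*m) for m in img_marks)
    w1, w2 = (complex(*m) for m in world_marks)
    a = (w2 - w1) / (p2 - p1)   # encodes rotation and scale
    b = w1 - a * p1             # translation
    return a, b

def to_platform(a, b, px_point):
    """Map one pixel coordinate into platform coordinates."""
    z = a * complex(*px_point) + b
    return (z.real, z.imag)
```

For example, if the marks are imaged at (100, 100) and (300, 100) px and are known to sit at (0, 0) and (20, 0) mm on the platform, a bore candidate at pixel (200, 150) maps to approximately (10, 5) mm, where it can be compared against the expected bore pattern.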
  • Yet another aspect of the invention is a machine vision system for use with a robotic system for taking a plurality of frozen sample cores from frozen samples that are each contained in a respective container.
  • the machine vision system includes a platform and a camera for capturing an image of one of the containers while it is on the platform.
  • a processor is configured to receive image data from the camera indicative of the image captured by the camera.
  • the processor is configured to determine one or more locations where a frozen sample core has already been taken from the frozen sample contained in the container by: (a) evaluating contrast in the image to identify one or more bore candidates; and (b)
  • Another aspect of the invention is a method of calibrating a robotic system for taking a plurality of frozen sample cores from frozen samples that are each contained in a respective container.
  • the method includes using a camera for capturing an image of one or more containers while the
  • Another aspect of the invention is a machine vision system for use with a robotic system adapted for taking a plurality of frozen sample cores from frozen samples that are each contained in a container.
  • the machine vision system includes a camera for capturing an image of a container while the container is supported by a platform.
  • the camera has an optical axis.
  • the system has a ring light for illuminating the container on the platform.
  • the ring light includes a plurality of light sources arranged in an annular pattern.
  • the optical axis of the camera extends through a central portion of the annular pattern.
  • a processor is adapted to receive image data from the camera indicative of the image captured by the camera and to determine one or more locations where a frozen sample core has already been taken from the sample contained in the container by evaluating contrast in the image.
  • Still another aspect of the invention is a method of determining one or more locations where frozen sample cores have already been taken from frozen samples, each of the frozen samples being contained in a respective container.
  • the method includes operating a robotic system to move a camera relative to a first one of the containers so the camera is directed at the frozen sample in the first container.
  • the frozen sample is illuminated using a ring light.
  • the ring light has a plurality of light sources arranged in an annular pattern.
  • the camera has an optical axis that extends through a central portion of the annular pattern.
  • the camera is used to capture an image of the illuminated frozen sample. Contrast in the captured image is evaluated and the image is processed to identify one or more bore candidates in the captured image and determine whether or not the bore candidates are likely to be artifacts or real bores in the frozen sample.
  • the robotic system is operated to move the camera relative to a second of the containers so the camera is directed at the frozen sample in the second container. The imaging is repeated for the frozen sample in the second container.
  • the system includes a camera configured for capturing monochrome images of the containers while the containers are supported by a platform.
  • a light is positioned to illuminate the containers and the samples
  • a processor is adapted to receive grayscale image data from the camera indicative of images formed by the camera and determine locations where frozen sample cores have already been taken from the samples by evaluating contrast in the images.
  • the light emits light having a color other than white.
  • Another aspect of the invention is a method of determining one or more locations where frozen sample cores have already been taken from frozen samples.
  • Each of the frozen samples is contained in a respective container. The method includes operating a robotic system to move a camera relative to a first one of the containers so the camera is directed at the frozen sample in the first container.
  • the frozen sample is illuminated with a colored light.
  • the camera is used to capture a grayscale image of the illuminated frozen sample.
  • Contrast in the captured image is evaluated and the image is processed to identify one or more bore candidates in the captured image and determine whether or not the bore candidates are likely to be artifacts or real bores in the frozen sample.
  • the robotic system is operated to move the camera relative to a second of the containers so the camera is directed at the frozen sample in said second container. The imaging is repeated for the frozen sample in the second container.
  • Another aspect of the invention is a machine vision system for use with a robotic system for taking a plurality of frozen sample cores from frozen samples that are each contained in a container.
  • the system includes a camera for taking images of the containers while the containers are supported by a platform.
  • a light is positioned to illuminate the containers and the samples contained therein while the containers are on the platform.
  • the light has red light emitting elements, blue light emitting elements, and green light emitting elements.
  • the intensity of light emitted from the red, blue, and green light emitting elements is selectively adjustable to allow any of multiple different colors of light to be selected as the color of light to be emitted by the light.
  • a processor is adapted to receive image data from the camera indicative of images formed by the camera and determine locations where frozen sample cores have already been taken from the samples by evaluating contrast in the images.
  • the processor is adapted to receive input about the color of the samples in the containers and adjust the color of the light emitted by the light to reduce a difference between the color of the samples and the color of the light emitted by the light.
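As an illustration of reducing the difference between the light color and the sample color, one simple policy is to drive the red, green, and blue emitters in proportion to the sample's reported RGB color. This is a hypothetical sketch of such a policy (function name and normalization are assumptions, not the patent's control scheme):

```python
def led_duty_cycles(sample_rgb, max_duty=1.0):
    """Scale R, G, B emitter intensities so the light's hue tracks the
    sample's color, keeping the brightest channel at max_duty."""
    peak = max(sample_rgb)
    if peak == 0:                # black/unknown sample: fall back to white light
        return (max_duty,) * 3
    return tuple(max_duty * c / peak for c in sample_rgb)
```

A straw-colored serum sample reported as RGB (200, 100, 50) would drive the emitters at duty cycles (1.0, 0.5, 0.25), biasing the illumination toward the sample's own hue.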
  • Another aspect of the invention is a method of determining one or more locations where frozen sample cores have already been taken from frozen samples.
  • Each of the frozen samples is contained in a respective container.
  • the method includes operating a robotic system to move a camera relative to a first one of the containers so the camera is directed at the frozen sample in the first container.
  • the frozen sample is illuminated with a colored light. The color of the light is selected to match the color of the frozen sample.
  • the camera is used to capture an image of the illuminated frozen sample.
  • Contrast in the captured image is evaluated and the image is processed to identify one or more bore candidates in the captured image and determine whether or not the bore candidates are likely to be artifacts or real bores in the frozen sample.
  • the robotic system is operated to move the camera relative to a second of the containers so the camera is directed at the frozen sample in said second container.
  • the imaging process is repeated for the frozen sample in the second container.
  • Yet another embodiment of the invention is a method of determining one or more locations where frozen sample cores have already been taken from frozen samples. Each of the frozen samples is contained in a respective container. The method includes operating a robotic system to position one of the containers on a platform at a station for receiving the container while a frozen sample core is extracted from the frozen sample contained in the container.
  • a light is used to provide at least one of back lighting and side lighting for the container.
  • a camera is used to capture an image of the frozen sample while illuminated by the light. Contrast in the captured image is evaluated and the image is processed to identify one or more bore candidates in the captured image.
  • Another inventive aspect is a machine vision system for use with a robotic system adapted for taking a plurality of frozen sample cores from frozen samples that are each contained in a container.
  • the machine vision system includes a camera for capturing an image of a container while the container is supported by a platform at a station for receiving the container while a frozen sample core is extracted from the frozen sample contained therein.
  • the system has a red light for illuminating the container with substantially monochromatic red light from above while it is on the platform at the station.
  • a processor is adapted to receive image data from the camera indicative of the image captured by the camera and to determine one or more locations where a frozen sample core has already been taken from the sample contained in the container by evaluating contrast in the image.
  • Yet another aspect of the invention is a method of determining one or more locations where a frozen sample core has already been taken from frozen samples.
  • Each of the frozen samples is contained in a respective container.
  • the method includes operating a robotic system to position one of the containers on a platform at a station for receiving the container while a frozen sample core is extracted from the frozen sample contained in the container.
  • the container is illuminated from above while it is on the platform at the station with substantially monochromatic red light.
  • a camera is used to capture an image of the frozen sample while illuminated by the red light. Contrast in the captured image is evaluated and the image is processed to identify one or more bore candidates.
  • Another aspect of the invention is a machine vision system for use with a robotic system for taking a plurality of frozen sample cores from frozen samples that are each contained in a respective container.
  • the machine vision system includes a platform for supporting one or more of the containers. The platform has a station for receiving at least one of the containers.
  • the system has a camera for capturing an image of the container while the container is received at the station.
  • a processor is configured to receive image data from the camera indicative of the image of the container.
  • the processor is configured to determine one or more locations where a frozen sample core has already been taken from a frozen sample contained in the container by evaluating contrast in the image to identify one or more bore candidates and identify an edge of the container, and using information about the position of the edge relative to the bore candidates to determine whether or not the one or more candidates are likely to be artifacts instead of real bores in the sample.
  • Another aspect of the invention is a method of determining one or more locations where frozen sample cores have already been taken from frozen samples. Each of the frozen samples is contained in a respective container. The method includes operating a robotic system to position one of the containers on a platform at a station for receiving the container while a frozen sample core is extracted from the frozen sample contained in the container.
  • a camera is used to capture an image of the frozen sample. Contrast in the captured image is evaluated to identify one or more bore candidates and identify an edge of the container. Information about the position of the edge relative to the bore candidates is used to determine whether or not the one or more candidates are likely to be artifacts instead of real bores in the sample.
  • One aspect of the invention is a machine vision system for use with a robotic system for taking a plurality of frozen sample cores from frozen samples that are each contained in a respective container.
  • the machine vision system includes a platform for supporting one or more of the containers.
  • the platform has a station for receiving at least one of the containers.
  • the system includes a camera for capturing an image of the container while the container is received at the station.
  • the system includes a fill level detection system adapted to detect the positions of the surfaces of the frozen samples.
  • a processor is configured to receive signals from the fill level detection system and use the signals to determine where to position the camera to obtain an image of the frozen samples.
  • Yet another aspect of the invention is a machine vision system for use with a robotic system for taking a plurality of frozen sample cores from frozen samples that are each contained in a respective container.
  • the machine vision system includes a platform for supporting one or more of the containers.
  • the platform has a station for receiving at least one of the containers.
  • the system includes a coring probe for taking frozen sample cores from the frozen samples.
  • the system includes a camera for capturing an image of the container while the container is received at the station.
  • a processor is configured to receive image data from the camera indicative of the image of the container and to determine one or more locations where a frozen sample core has already been taken from the frozen sample contained in the container.
  • the processor is configured to move the coring probe into the open end of at least one bore to clear the open end of the bore of debris.
  • Another aspect of the invention is a method of taking a frozen sample core from a frozen sample that is contained in a container.
  • the method includes positioning the container at a station for receiving a container on a platform. An image of the container is captured while the container is received at the station. One or more locations where a frozen sample core has already been taken from the frozen sample contained in the container is determined. The frozen sample core is taken from the frozen sample at a location from which no frozen sample core has already been taken, as determined in the determining step. After taking the frozen sample core from the frozen sample, a coring probe is inserted into the one or more locations where a frozen sample core has been taken to clear the one or more locations where a frozen sample core has been taken of debris.
  • Still another aspect of the invention is a machine vision system for use with a robotic system adapted for taking a plurality of frozen sample cores from frozen samples that are each contained in a container.
  • the machine vision system includes a camera for capturing an image of a container while the container is supported by a platform.
  • the system includes a light for illuminating the container on the platform.
  • a majority of the light energy emitted by the light is selected from the group consisting of red light with a wavelength in the range of 620 nm to 750 nm and green light with a wavelength in the range of 495 nm to 570 nm.
  • a processor is adapted to receive image data from the camera indicative of the image captured by the camera and to determine one or more locations where a frozen sample core has already been taken from the sample contained in the container by evaluating the image.
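The claimed wavelength bands can be expressed as a trivial helper that classifies an emitter's peak wavelength against the ranges recited above. The function name and the fallback value are illustrative, not from the patent:

```python
def dominant_band(wavelength_nm):
    """Classify a peak wavelength into the claimed bands:
    620-750 nm is red, 495-570 nm is green."""
    if 620 <= wavelength_nm <= 750:
        return "red"
    if 495 <= wavelength_nm <= 570:
        return "green"
    return "other"
```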
  • Another aspect of the invention is a method of determining one or more locations where frozen sample cores have already been taken from frozen samples, each of the frozen samples being contained in a respective container.
  • the method includes operating a robotic system to move a camera relative to a first one of the containers so the camera is directed at the frozen sample in the first container.
  • the frozen sample is illuminated using a light, wherein a majority of the light energy emitted by the light is selected from the group consisting of red light with a wavelength in the range of 620 nm to 750 nm and green light with a wavelength in the range of 495 nm to 570 nm.
  • the image is used to identify one or more bore candidates in the captured image and determine whether or not the bore candidates are likely to be artifacts or real bores in the frozen sample.
  • the robotic system is operated to move the camera relative to a second of the containers so the camera is directed at the frozen sample in said second container.
  • the imaging is repeated for the frozen sample in said second container.
  • FIG. 1 is a perspective of one example of a frozen aliquotter including one embodiment of a machine vision system of the present invention;
  • FIG. 2 is a top plan of the frozen aliquotter;
  • FIG. 3 is a top plan of the frozen aliquotter with parts removed to avoid obstructing view of one embodiment of a platform thereof;
  • FIG. 4 is an enlarged perspective of the platform taken in a plane including line 4--4 on Fig. 4;
  • FIG. 5 is a perspective of a fragment of the frozen aliquotter shown in cross section taken in a plane including line 5--5 on Fig. 2;
  • FIG. 6 is a perspective of one embodiment of a robotic end effector for use with a frozen aliquotter;
  • FIG. 7 is a bottom plan view of the robotic end effector illustrated in Fig. 6;
  • FIG. 8 is a schematic diagram showing some of the components of the frozen aliquotter;
  • FIG. 9 is a schematic diagram illustrating bore candidates that differ in size;
  • FIG. 10 is a schematic diagram illustrating one embodiment of a geometric pattern according to which frozen sample cores are extracted from a frozen sample;
  • FIG. 12 is a schematic diagram illustrating bore candidates that are positioned at various different angles relative to one another from a center of the container;
  • FIG. 13 is a schematic diagram illustrating bore candidates that do not follow an expected sequence planned for extraction of frozen sample cores from a frozen sample;
  • FIG. 14 is a photograph of a container illustrating use of an edge finding algorithm to identify the location of an edge of the container from the image data;
  • FIG. 15 is a schematic diagram illustrating one embodiment of using fixed calibration marks to identify the location of features in the image data;
  • FIG. 16 is a photograph showing a pair of calibration marks;
  • FIG. 17 is a photograph of one embodiment of a density step target; and
  • FIG. 19 is a schematic illustration of the coring probe of FIG. 18 inserted into the bore.
  • the source container station 107 includes a receptacle 106 for receiving containers 105 and a pair of clamping jaws 108, 110 on opposite sides at the top of the receptacle. At least one of the jaws 108 is selectively moveable, such as by a pneumatic actuator (not shown), toward and away from the other jaw 110 for selectively clamping containers 105 in position at the station 107 to hold them in place during extraction of a frozen sample and releasing the containers so they can be removed from the station and replaced in the tray 117 afterward. Similar jaws can be used to hold the container 105 at the sample receiving station 109 if desired.
  • the system could be adapted for use with well plates and arrays in which multiple different frozen samples are stored in a single container.
  • appropriate components can be provided (e.g., on the end effector) for moving well plates or arrays instead of individual containers and the stations 107, 109 for receiving the containers can be adapted to receive well plates or arrays without departing from the scope of the invention.
  • the clamping system can be adapted to hold well plates and arrays within the scope of the invention.
  • a cooling system 131 for keeping the frozen samples and the frozen sample cores extracted therefrom frozen is positioned under the platform 103 in the illustrated embodiment, although the cooling system can be positioned elsewhere and/or other cooling systems used without departing from the scope of the invention.
  • the end effector 111 of the robotic system 101 includes a sample coring probe 121 and a sample core extraction system 123 operable to move the sample coring probe into one of the frozen samples contained in one of the containers 105 and then withdraw the coring probe from the frozen sample to obtain a frozen sample core from the frozen sample.
  • the sample core extraction system 123 includes a motor 125 adapted to rotate the sample coring probe 121 as the robotic drive system 113 lowers the sample coring probe into the container and then raises it out of the container. Additional details about the operation of a coring probe to extract frozen sample cores from frozen samples are set forth in U.S. pre-grant publication No. 20090019877, PCT application No. PCT/US2011/61214, filed
  • any sample coring probe and sample extraction system can be used within the scope of the invention, as long as they can be operated to extract a frozen sample core from a frozen sample with limited or no thawing of the frozen sample material and the frozen sample core extracted therefrom.
  • the end effector 111 also includes a gripping system 127 operable to selectively hold and release containers 105 for use by the robotic system 101 in moving containers back and forth between the trays 117 and the stations 107, 109 on the platform for the containers from which frozen sample cores are being taken and into which frozen sample cores are being deposited.
  • the gripping system includes a plurality of moveable fingers 129 moveable by one or more pneumatic actuators (not shown) under the control of the processor 114. It is understood other gripping systems may be used within the scope of the invention.
  • the gripping system can be modified if desired to facilitate use of the gripping system to move well plates or arrays containing multiple frozen samples.
  • the processor 114 suitably processes the image captured while the container 105 is illuminated with the light 145 in various ways to facilitate this determination.
  • the processor 114 is configured to apply a thresholding filter to the raw image data, apply one or more morphological filters (e.g., erosion, dilation, opening, and/or closing) to the thresholded image, and then apply particle analysis to identify one or more bore candidates.
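The threshold / morphological-filter / particle-analysis pipeline described above can be sketched in plain Python on a toy grayscale image. Everything here (the function name, the 3x3 erosion element, 4-connectivity, and the minimum-area cutoff) is an illustrative choice rather than a detail taken from the patent:

```python
from collections import deque

def find_bore_candidates(img, thresh, min_area=2):
    """Toy bore-candidate pipeline: binarize the grayscale image, erode once
    to drop speckle, then run particle (connected component) analysis and
    return each surviving blob's area and centroid."""
    h, w = len(img), len(img[0])
    # 1. thresholding: dark pixels (low-reflectance bores) become foreground
    binary = [[1 if img[y][x] < thresh else 0 for x in range(w)] for y in range(h)]
    # 2. morphological erosion with a 3x3 structuring element
    def eroded(y, x):
        return all(0 <= y + dy < h and 0 <= x + dx < w and binary[y + dy][x + dx]
                   for dy in (-1, 0, 1) for dx in (-1, 0, 1))
    mask = [[1 if eroded(y, x) else 0 for x in range(w)] for y in range(h)]
    # 3. particle analysis: 4-connected labelling with an area filter
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                q, pix = deque([(y, x)]), []
                seen[y][x] = True
                while q:
                    cy, cx = q.popleft()
                    pix.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
                if len(pix) >= min_area:
                    cy = sum(p[0] for p in pix) / len(pix)
                    cx = sum(p[1] for p in pix) / len(pix)
                    blobs.append({"area": len(pix), "centroid": (cx, cy)})
    return blobs
```

On an 8x8 test image containing one dark 4x4 square, the erosion shrinks the square to its 2x2 interior, and particle analysis reports a single candidate centered on the square.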
  • the angle between bores is measured between a first line extending between the bore and the center axis of the container and a second line extending between the center axis of the container and another bore
  • the number of bores in the geometric pattern can vary within the scope of the invention.
  • although the pattern in Fig. 10 is a regular pattern, meaning the bores are all the same size, are all spaced the same distance from the center, and are all spaced at equal angles, it is understood that the pattern could be irregular within the scope of the invention.
  • some bore candidates can be spaced too close to the center (e.g., see distance D4 in Fig. 11) or too far from the center of the container (e.g., see distance D5 in Fig. 11), or conversely, spaced too far from or too close to the edge of the container if the edge of the container can be detected, to fall within the geometric pattern.
  • the angular spacing between one or more of the bore candidates can be different (either too high, θ2, or too low, θ3) from the expected angular spacing.
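The radial-distance and angular-spacing checks described in the bullets above can be sketched as follows. This is an illustrative Python sketch, not the claimed implementation; the function name, the 15% angular tolerance, and the radius bounds are assumptions for illustration.

```python
import math

def plausible_bores(candidates, center, r_min, r_max, n_expected):
    """Keep bore candidates consistent with a regular ring pattern:
    radial distance within [r_min, r_max] (cf. D4 too close / D5 too far),
    and an angular gap to a neighbour close to a whole multiple of the
    expected 360/n spacing (cf. the angular-spacing check above)."""
    cx, cy = center
    ring = []
    for (x, y) in candidates:
        if r_min <= math.hypot(x - cx, y - cy) <= r_max:
            ring.append((math.degrees(math.atan2(y - cy, x - cx)) % 360.0, (x, y)))
    ring.sort()
    expected = 360.0 / n_expected
    tol = 0.15 * expected          # illustrative tolerance, not from the spec
    keep, n = [], len(ring)
    for i, (angle, point) in enumerate(ring):
        prev_gap = (angle - ring[(i - 1) % n][0]) % 360.0
        next_gap = (ring[(i + 1) % n][0] - angle) % 360.0
        # a gap near a multiple of the expected spacing is pattern-consistent
        if any(min(g % expected, expected - g % expected) <= tol
               for g in (prev_gap, next_gap)):
            keep.append(point)
    return keep
```

A candidate is kept if either neighbouring gap fits the pattern, so a single off-angle artifact is rejected without also rejecting the real bores on each side of it.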
  • frozen sample cores will be extracted from the frozen samples according to a specific orderly sequence.
  • the processor 114 can determine a bore candidate is an artifact on the basis that it is out of order with a sequence according to which frozen sample cores are expected to be extracted from the frozen sample, particularly when multiple bore candidates 301, 303 follow the expected sequence and only one bore candidate 305 is out of sequence.
  • the processor 114 can apply more rigorous standards to help exclude likely artifacts when the number of bore candidates is too high.
  • the circles 163, 165 define an area to be scanned in an attempt to identify the edge of the container 105.
  • the processor 114 is suitably configured to evaluate the image data to determine points 169 along each line where there is sharp contrast. Each point 169 potentially represents an intersection between the edge of the container 105 and the respective scan line 167. In the case of a successful attempt to identify the edge of a container, a significant number of the points 169 will lie on the same circle (or other shape if the containers do not have circular shapes) in which case the processor 114 concludes the points 169 lying thereon define the edge of the container 105.
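The consensus test described above (a significant number of contrast points 169 lying on one circle) can be sketched with a simple median-radius vote, assuming an approximate center is available. The tolerance and inlier threshold below are illustrative assumptions, not values from the specification.

```python
import math
import statistics

def detect_edge_circle(points, center, tol=1.5, min_inliers=8):
    """If a significant number of sharp-contrast points share (nearly) the
    same radius about an approximate center, those points are taken to
    define the container edge. Returns the consensus radius, or None when
    too few points agree (i.e., edge detection failed)."""
    cx, cy = center
    radii = [math.hypot(x - cx, y - cy) for (x, y) in points]
    consensus = statistics.median(radii)
    inliers = [r for r in radii if abs(r - consensus) <= tol]
    return consensus if len(inliers) >= min_inliers else None
```

The median makes the vote robust to scan lines that hit frost or debris instead of the true edge; those outlier radii simply fail the inlier test.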
  • the edge of the container can refer to the edge of a well or other discrete area within which one frozen sample is stored on a well plate or other container adapted for holding multiple different samples .
  • Ultraviolet or infrared lighting can help enhance the contrast between the edge of the container and the surroundings in the image. This enhanced contrast improves edge detection.
  • a separate UV or IR light source can be positioned to illuminate the container.
  • the UV or IR light source can be moveable (e.g., mounted on the end effector 111) or fixed (e.g., secured to or within the platform 103) within the scope of the invention.
  • the separate UV or IR light source can be positioned in the platform to provide indirect lighting (e.g., backlighting or side lighting) to the container to aid in edge detection.
  • any one of or combination of the lights 145, 181, 183, 185 can include a UV or IR light source .
  • the processor 114 is suitably configured to use the calibration marks 161 (e.g., in combination with edge detection or without edge detection) to determine whether or not the one or more bore candidates are likely to be artifacts instead of real bores in the frozen sample.
  • the calibration marks 161 are designed to ensure there is strong color contrast between the calibration marks and the surrounding objects in the image even if there is frost formation or other conditions that minimize contrast between the edge of the container 105 and its surroundings.
  • the processor 114 can determine the position of the bore candidates by comparing their positions to the positions of the calibration marks.
  • the calibration marks 161 are suitably positioned to form a triangle with the center of the container.
  • the processor 114 can be configured to identify the center axis of the container by triangulating the center from the calibration marks.
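If the three calibration marks 161 are placed equidistant from the container's center axis (an assumption for this sketch; the specification says only that the marks form a triangle with the center), the center can be triangulated as the circumcenter of the marks, which is insensitive to translation of the camera and to rotation in the image plane:

```python
def circumcenter(p1, p2, p3):
    """Circumcenter of three non-collinear points. With calibration marks
    placed equidistant from the container's center axis, this recovers
    that center regardless of camera translation or in-plane rotation."""
    ax, ay = p1
    bx, by = p2
    cx, cy = p3
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return ux, uy
```

Because all three mark positions are measured in the same image, any common shift or rotation cancels out of the result, which is why this approach tolerates camera placement errors better than edge detection alone.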
  • a machine vision system 141 including a processor 114 that is configured to use calibration marks 161 to identify the center of a container is not sensitive to errors in the rotation of the camera or to errors in translation of the camera.
  • the processor 114 can be configured to identify the edge of the container 105 and/or the center axis of the container.
  • the processor 114 can be configured to both use the calibration marks and peripheral edge detection to identify the center of the container.
  • the processor 114 can be configured to compare the positions of the bore candidates directly to the positions of the calibration marks 161 to determine which bore candidates are likely to be artifacts, without computing the center of the container, the edge of the container, or the relative distances between the bore candidates and either of them.
  • the processor 114 is configured to automatically select a suitable location in the frozen sample from which the robotic system 101 can take another frozen sample core (or in the case of a frozen sample from which no frozen sample cores have been taken yet, it is configured to automatically select the location from which the initial frozen sample core will be extracted) once the processor has determined from the image data whether or not there are any bores in a particular frozen sample and the locations of any such bores.
  • This facilitates taking frozen sample cores from samples that may have already been subjected to previous extractions of frozen sample cores without requiring the processor 114 to have access to any information about the number of previous frozen sample cores that may have been extracted from the sample or the locations within the frozen sample from which any such sample cores have been taken. This eliminates the need for manual intervention to orient the containers 105 in a particular way and greatly reduces the amount of data that needs to be tracked for each sample.
  • the system 101 can still recognize bores in the frozen sample even if the bores are not where they would be expected to be if the previously extracted frozen sample cores had been extracted according to the protocols of the system 101 instead of whatever other protocols were previously in use.
  • the processor 114 can be configured to select a location for the next frozen sample core that continues the geometric pattern that has already been started. Another option, if it is desired that the next sample core be taken from a particular radial location in the frozen sample, is that the processor 114 can be configured to select a location that is the desired radial distance from the center of the container and also sufficiently spaced from existing bores in the frozen sample. The processor 114 can be configured so a user can select which of these options is used for any particular container or set of containers.
  • the processor 114 is also configured to select an appropriate initial geometric pattern for the locations from which a plurality of frozen sample cores will be extracted if the processor determines there are no existing bores in the frozen sample.
  • the processor 114 can be configured to select a geometric pattern that maximizes the number of frozen sample cores that can be taken from the frozen sample and/or the processor can be configured to select a geometric pattern that results in one or more frozen sample cores (e.g., all frozen sample cores) being taken from a particularly desired radial distance from the center of the container.
  • the processor 114 can be configured to allow a user to select which of several different strategies will be used for planning the geometric pattern of the locations from which frozen sample cores are to be extracted for different containers or sets of containers. If desired, the processor 114 can be configured to display the geometric pattern selected by the processor.
  • the color of light emitted by the light 145 can be important. In general, better results are obtained when the light used to illuminate the frozen sample matches the color of the frozen sample.
  • the color of the light used to illuminate the frozen sample is suitably the same as the color of the sample or no more different from the color of the sample than one of the two adjacent colors on an RGB color wheel having six colors arranged in the following order extending around the wheel: red, yellow, green, cyan, blue, magenta, and then back to red.
  • a red light works well with red samples, orange samples, and yellow samples.
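The color-wheel rule above can be sketched as a small lookup. This is illustrative only: colors are represented as strings, and an intermediate hue such as orange would first be mapped to the nearest wheel color (e.g., orange to red), consistent with the red-light example above.

```python
# Six-color RGB wheel in the order given above.
WHEEL = ["red", "yellow", "green", "cyan", "blue", "magenta"]

def acceptable_light_colors(sample_color):
    """The illumination color should equal the sample color or be one of
    its two adjacent colors on the six-color RGB wheel."""
    i = WHEEL.index(sample_color)
    return {WHEEL[(i - 1) % 6], sample_color, WHEEL[(i + 1) % 6]}
```

So for a red sample the acceptable illumination colors are magenta, red, and yellow, which is why red light also suits orange and yellow samples once those hues are mapped to their nearest wheel colors.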
  • the light 145 can emit red light for illuminating the frozen sample. It is also anticipated that it will be desirable in some cases for the light to emit green light or blue light. However, the light can emit any color light within the broad scope of the invention .
  • the light 145 includes red light emitting elements, blue light emitting elements, and green light emitting elements and the intensity of light emitted from the red, blue, and green light emitting elements is selectively adjustable to allow any of multiple different colors of light to be selected as the color of light that is used to illuminate the samples.
  • the light 145 includes red light emitting elements that emit red light having a wavelength in the range of about 620nm to about 750nm (about 400THz to about 484THz) .
  • the majority of the light energy emitted by the light is suitably within the range of about 620nm to about 750nm.
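The paired nanometre/terahertz figures above follow directly from f = c/λ; a quick check:

```python
C = 299_792_458.0  # speed of light in m/s

def nm_to_thz(wavelength_nm):
    """Convert a vacuum wavelength in nanometres to a frequency in THz."""
    return C / (wavelength_nm * 1e-9) / 1e12

# 750 nm is about 400 THz and 620 nm is about 484 THz,
# matching the red-light range quoted above.
```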
  • the processor suitably adjusts the color of the light used to illuminate the sample to white when capturing the image that will be used to determine the color of the sample, to facilitate accurate color detection, and then adjusts the color of the light to match the color of the sample.
  • the processor can be configured to capture a color image of one of the samples to assess the color of all of the samples in that set and adjust the color once to match the color of all the samples in the set.
  • Digital cameras are available that can capture both grayscale images and color images, so it is possible that the camera captures one or more color images (e.g., to identify the color of the sample so the light used to illuminate the sample can be adjusted to match the color of the sample) and also captures grayscale images for use by the processor to identify locations where bores exist within the frozen samples.
  • the machine vision system 141 is suitably a
  • the platform 103 has one or more fixed targets 171 positioned thereon.
  • the camera 143 is mounted on the robotic system 101 so it can be moved to capture an image of each of the fixed targets 171.
  • the processor is suitably configured to receive image data from the camera indicative of images of the target (s) formed by the camera and calibrate the robotic system 101 using an image of the one or more fixed targets 171 on the platform 103. As illustrated in Fig.
  • At least one of the targets 173 has an image (e.g., a cross hair) having a point or intersection of lines for calibration in the x and y directions and a shape (e.g., circle) having a known size for calibration in the z direction.
  • the calibration system suitably has a user interface (not shown) configured to allow a user to guide the camera from a position that is not in registration with one of the targets (e.g., so a reticule overlaying the captured image is not aligned with the cross hair) toward a position that is in registration with said target (e.g., so the reticule is aligned with the cross hair) .
  • the processor 114 is suitably configured to use multiple additional features on the platform 103 as targets to help calibrate the robotic system 101.
  • processor 114 is suitably configured to calibrate the robotic system 101 using images of multiple features on the platform 103 selected from the group consisting of:
  • Fig. 3 illustrates 13 calibration points that can be used according to one particular embodiment of the calibration system, with each of the calibration points being labeled consecutively from 201-213.
  • Point 201 corresponds to the target 175 on the platform 103 in the recessed area 115.
  • Points 201, 202, and 203 correspond to the stations 107, 109 for receiving the containers 105 and the station 119 for washing the coring probe 121, respectively.
  • Points 204-208 correspond to various points (e.g., points at the corners) of one of the trays 117a and points 209-212 correspond to various points (e.g., points at the corners) of another of the trays 117b.
  • the image data from the camera 143 is used to instruct the robotic drive system (either by the processor or a user) to raise or lower the camera until the size of the shape in the image captured by the camera indicates the camera is the proper distance from the target in the z-direction.
  • Calibration in the Z-direction could instead be achieved using a lens setting for the camera 143 having a known focal length and then adjusting the height of the camera until the image is in focus.
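The known-size z-calibration described above follows the pinhole-camera relation (apparent size is inversely proportional to distance). An illustrative sketch, with hypothetical names and units that are not taken from the specification:

```python
def distance_from_target(real_diameter_mm, apparent_diameter_px, focal_length_px):
    """Pinhole-camera estimate of the camera-to-target distance from the
    apparent size of a calibration circle of known diameter."""
    return focal_length_px * real_diameter_mm / apparent_diameter_px

def z_correction_mm(real_diameter_mm, apparent_diameter_px, focal_length_px,
                    desired_distance_mm):
    """Signed amount to raise (+) or lower (-) the camera to reach the
    desired working distance in the z-direction."""
    current = distance_from_target(real_diameter_mm, apparent_diameter_px,
                                   focal_length_px)
    return current - desired_distance_mm
```

In practice this would run in a loop: capture an image, measure the circle's apparent diameter in pixels, compute the correction, move, and repeat until the correction is within tolerance.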
  • when the camera 143 is in registration with the target 171 and at the correct distance from the target, the processor records positional information from the robotic system 101 (e.g., data from encoders and other devices that provide positional feedback about the position of various components of the robotic system) and designates that position as a calibration point.
  • the station 109 for receiving the container 105 in which a frozen sample core is to be deposited;
  • the calibration targets 171 and calibration points include each of points 201- 213 on Fig. 3.
  • the density step target 177 is also used during the calibration process to adjust camera settings and light intensity.
  • the light 145 is turned on and the diaphragm of the camera 143 and/or intensity of electric current supplied to the light are adjusted while the camera captures images of the density step target 177 until a particular shaded block on the density step target is read as a certain gray level by the camera 143. For example, good results have been obtained when the light 145 and camera 143 are adjusted so the third lightest color block on a standard density step target is read by the camera as a gray level of 200.
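The adjust-until-the-block-reads-200 procedure can be sketched as a feedback loop. Here `read_gray` stands in for capturing an image of the density step target and sampling the reference block; it is assumed to increase monotonically with the intensity setting (a real loop would adjust the diaphragm or lamp current in hardware and tolerate sensor noise):

```python
def tune_intensity(read_gray, target=200, tol=2, lo=0.0, hi=1.0):
    """Bisect a normalized intensity setting until the reference block on
    the density step target reads close to the target gray level."""
    setting = (lo + hi) / 2.0
    for _ in range(40):
        setting = (lo + hi) / 2.0
        gray = read_gray(setting)
        if abs(gray - target) <= tol:
            break           # the block reads the desired gray level
        if gray < target:
            lo = setting    # too dark: raise the intensity
        else:
            hi = setting    # too bright: lower the intensity
    return setting
```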
  • the user adjusts the position of the end effector until the gripper system 127 is in registration with the target 171 or other reference point and provides an indication to the processor that the gripper system is in registration therewith.
  • the order of the steps of this method is not important.
  • the processor 114 determines the positional offset between the camera 143, coring probe 121, and gripper assembly 127 using the information provided in this process.
  • the entire calibration process is suitably completed without requiring any contact between the end effector 111 or any components moveable with the end effector and the platform 103 or any components on the platform.
  • a set of containers 105 containing frozen samples is placed on the platform 103.
  • one or more trays 117a can be loaded with sample containers 105 and then placed on the platform 103 (e.g., in the recessed area 115) .
  • a set of empty containers 105 for receiving frozen sample cores after they are extracted is loaded into one or more additional trays 117b and placed on the platform 103.
  • the robotic system 101 uses the gripper system 127 to move one of the containers 105 containing a frozen sample to the station 107 for receiving containers from which frozen sample cores are being extracted and moves one of the empty containers to the station 109 for receiving containers in which frozen sample cores are to be deposited.
  • the robotic system moves the camera 143 into position over the station 107 for holding the containers 105 containing frozen sample while frozen sample cores are extracted from them.
  • the robotic system suitably includes a fill level detection system for detecting the level of an upper surface of the frozen sample. Details concerning the construction and operation of a suitable fill level detection system are provided in U.S. Application No. 13/359,301, entitled Robotic End
  • the fill level detection system provides information about the position of the upper surface of the frozen sample.
  • the fill level detection system can be used to position the camera 143 at a desired level above the frozen sample to improve focusing of the camera.
  • the processor 114 uses the information from the fill level detection system about the position of the upper surface of the frozen sample to determine the elevation at which to position the camera for obtaining an image of the frozen sample taken while the camera is within an optimal range of distances from the upper surface of the sample.
  • the light 145 is used to illuminate the container 105 at the station 107 and the frozen sample contained therein.
  • if the machine vision system 141 includes the option of adjusting the color of the light 145, the color of the frozen sample is determined (e.g., using image data from the camera and/or user input) and the color of the light is adjusted to match the color of the frozen sample, as described above. For example, if the frozen sample is red, orange, or yellow, the light 145 can be adjusted to emit red light. Likewise, if additional lighting options are used, additional images of the container 105 are captured with one or more of the lights 181, 183, 185 providing illumination.
  • a thresholding filter is suitably applied to the raw image obtained with illumination from light 145.
  • a morphological filter is also applied to the image. After the image has been filtered, a particle analysis imaging routine is applied to identify one or more bore candidates.
  • the processor uses the image data to evaluate whether or not any bore candidates are actual bores or just artifacts in order to determine whether or not any frozen sample cores have already been taken from the frozen sample and, if so, to identify the location(s) from which they were taken.
  • the method suitably includes heating the calibration marks using the low-power resistance heaters to ensure the calibration marks are not obscured by frost.
  • Once the processor 114 has identified the bore candidates and determined which bore candidates are likely to be artifacts and which are likely to be real bores, the processor automatically selects a position from which a frozen sample core will be extracted, accounting for the position of pre-existing bores in the frozen sample if there are any. Then the processor 114 instructs the robotic system 101 to move the coring probe 121 into position over the selected location and instructs the sample extraction system to extract a frozen sample core from the location.
  • the frozen parent sample is further processed to clear away any frost crystals or other debris in each of the bores in the frozen sample to ensure better accuracy and reliability in the machine vision system 141 when the sample is retrieved again later from frozen storage to obtain additional frozen sample cores.
  • the bores may contain or be obscured by frost crystals that have grown on the sample (e.g., while the sample was in frozen storage), by debris (e.g., from previously drilling frozen sample cores), and/or for other reasons.
  • the processor 114 suitably uses the image data that was obtained from the sample before the most recent frozen sample core was extracted (i.e., the image data used to evaluate bore candidates).
  • the processor 114 suitably also uses data obtained during the extraction of the most recent frozen sample core (e.g., the geometric sampling pattern used to obtain the most recent frozen sample core(s), information about the location from which the most recent frozen sample cores were taken). Using this image data about the location(s) of any bores before the most recent extraction and the data about the location(s) of any bores made during the most recent extraction, the processor 114 determines the location(s) of every bore and suspected bore in the frozen sample.
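Combining the pre-extraction bore locations with the bores just made into one complete set can be sketched as a tolerance-based union. The function name, coordinate convention, and tolerance are illustrative assumptions:

```python
def merge_bore_locations(previous, newest, tol=1.0):
    """Union of bore locations identified before the latest extraction with
    those made during it, de-duplicated within a small positional tolerance
    so the same bore seen twice is counted once."""
    merged = list(previous)
    for (x, y) in newest:
        if all((x - mx) ** 2 + (y - my) ** 2 > tol * tol for (mx, my) in merged):
            merged.append((x, y))
    return merged
```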
  • the processor 114 instructs the robotic system to reprocess or clean any bore(s) made during the most recent frozen sample core extraction, and then instructs the robotic system to reprocess or clean any bore(s) identified using the image data that was obtained before the most recent frozen sample core was extracted.
  • the processor 114 instructs the robotic system to reprocess or clean only any bore(s) identified using the image data that was obtained before the most recent frozen sample core was extracted.
  • the processor 114 instructs the robotic system to reprocess or clean only any bore(s) made during the most recent frozen sample core extraction.
  • the processor 114 instructs the robotic system to reprocess or clean any bore(s) identified using the image data that was obtained before the most recent frozen sample core was extracted, prior to reprocessing or cleaning any bore(s) made during the most recent extraction.
  • the processor 114 may direct an ejector pin 190 of the end effector 111 to move to an extended position in which the ejector pin extends from a distal end of the coring probe 121.
  • the coring probe 121 is positioned over an identified bore 192 (Fig. 18) and then lowered into the bore to clear the bore of any debris (Fig. 19) .
  • the coring probe 121 is lowered into the identified bore 192 whether or not there is debris in the bore.
  • the machine vision system 141 can include sensors as described in U.S. Application No. 13/359,301, filed January 26, 2012, to determine whether or not the ejector pin 190 is being lowered into a bore instead of being lowered into contact with frozen sample. As the ejector pin 190 enters the open end of the bore, any frost, debris, or other similar objects that may be obstructing the view of the bore are knocked away from the open end of the bore, either by being knocked into the bottom of the bore or by being pushed aside. Clearing the bore(s) of debris and frost makes it easier for the machine vision system 141 to correctly identify bores in a frozen sample the next time it is retrieved from frozen storage when additional frozen sample cores are required. Because the coring probe and ejector pin 190 are already required to contact the sample to complete other parts of the process, there is substantially no added risk of contaminating the sample by using the coring probe and ejector pin to clear the debris away from the open ends of the bores.
  • aspects of the invention may also be practiced in distributed computing environments where tasks are performed by local and remote processing devices that are linked (either by hardwired links, wireless links, or by a combination of hardwired and wireless links) through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
  • the computer may also include a magnetic hard disk drive for reading from and writing to a magnetic hard disk, a magnetic disk drive for reading from or writing to a removable magnetic disk, and an optical disk drive for reading from or writing to removable optical disk such as a CD-ROM or other optical media.
  • the magnetic hard disk drive, magnetic disk drive, and optical disk drive are connected to the system bus by a hard disk drive interface, a magnetic disk drive-interface, and an optical drive interface, respectively.
  • the drives and their associated computer-readable media provide nonvolatile storage of computer-executable instructions, data structures, program modules, and other data for the computer.
  • exemplary environment described herein employs a magnetic hard disk, a removable magnetic disk, and a removable optical disk
  • other types of computer readable media for storing data can be used, including magnetic cassettes, flash memory cards, digital video disks, Bernoulli cartridges, RAMs, ROMs, and the like.
  • Program code means comprising one or more program modules may be stored on the hard disk, magnetic disk, optical disk, ROM, and/or RAM, including an operating system, one or more application programs, other program modules, and program data.
  • a user may enter commands and information into the computer through keyboard, pointing device, or other input devices, such as a microphone, joy stick, game pad, satellite dish, scanner, or the like.
  • These and other input devices are often connected to the processing unit through a serial port interface coupled to system bus.
  • the input devices may be connected by other interfaces, such as a parallel port, a game port, or a universal serial bus (USB) .
  • a monitor or another display device is also connected to system bus via an interface, such as video adapter.
  • personal computers typically include other peripheral output devices (not shown), such as speakers and printers.
  • the computer may operate in a networked environment using logical connections to one or more remote computers, such as remote computers .
  • Remote computers may each be another personal computer, a server, a router, a network PC, a peer device or other common network node, and typically include many or all of the elements described above relative to the computer.
  • the logical connections include a local area network (LAN) and a wide area network (WAN) that are presented here by way of example and not limitation.
  • Such networking environments are commonplace in office-wide or enterprise-wide computer networks, intranets and the Internet.
  • the computer When used in a LAN networking environment, the computer is connected to the local network through a network interface or adapter.
  • when used in a WAN networking environment, the computer may include a modem, a wireless link, or other means for establishing communications over the wide area network, such as the Internet.
  • the modem which may be internal or external, is connected to the system bus via the serial port interface.
  • program modules depicted relative to the computer, or portions thereof, may be stored in the remote memory storage device. It will be appreciated that the network connections shown are exemplary and other means of establishing communications over wide area network may be used.
  • Such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage, or other magnetic storage devices, or any other medium that can be used to carry or store desired program code means in the form of computer-executable instructions or data structures and that can be accessed by a general purpose or special purpose computer.
  • when information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired and wireless) to a computer, the computer properly views the connection as a computer-readable medium.
  • any such connection is properly termed a computer-readable medium.
  • Computer-executable instructions comprise, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions.
  • a frozen sample that is contained in a container 105 is positioned (e.g., by the robotic system) at the station 107 on the platform 103 having
  • processor 114 uses the captured image to determine one or more locations where a frozen sample core has already been taken from the frozen sample contained in the container by: (a) evaluating contrast in the image to identify one or more bore candidates; and (b) determining whether or not the one or more bore candidates are likely to be artifacts or real bores in the frozen sample.
  • the processor 114 uses information including at least one of the following:
  • the distance between the bore candidate and a center axis of the container 105; the angle formed between a first line and a second line, the first line extending between the bore and the center axis of the container and the second line extending between the center axis of the container and another bore.
  • the system takes a frozen sample core from the frozen sample at a location from which no frozen sample core has already been taken, as determined by the processor.
  • the robotic system 101 is calibrated by using the camera 143 to capture an image of one or more fixed targets 171 on the platform 103.
  • the processor 114 uses an image of the one or more targets 171 to calibrate the robotic system.
  • the same camera 143 is used to capture an image of one or more containers 105 while the containers are supported by the platform to determine whether or not one or more frozen sample cores has already been taken from the frozen sample .
  • the robotic system 101 is operated to move the camera 143 relative to a first one of the containers 105 so the camera is directed at the frozen sample in the first container.
  • the frozen sample in the container 105 is illuminated using the ring light 145.
  • the camera 143 is used to capture an image of the illuminated frozen sample.
  • the processor 114 evaluates contrast in the captured image and processes the image to identify one or more bore candidates in the captured image.
  • the processor 114 determines whether or not the bore candidates are likely to be artifacts or real bores in the frozen sample.
  • the robotic system 101 moves the camera relative to a second of the containers 105 so the camera is directed at the frozen sample in the second container and the process is repeated.
  • the robotic system 101 moves the camera 143 relative to a second of the containers 105 so the camera is directed at the frozen sample in the second container. The process is repeated.
  • the robotic system 101 moves the camera 143 relative to a first one of the containers 105 so the camera is directed at the frozen sample in the first container.
  • the frozen sample in the container 105 is illuminated with a light 145 that has a color selected to match the color of the frozen sample.
  • the camera 143 captures an image of the illuminated frozen sample.
  • the processor 114 evaluates contrast in the captured image and processes the image to identify one or more bore candidates in the captured image.
  • the processor 114 determines whether or not the bore candidates are likely to be artifacts or real bores in the frozen sample.
  • the robotic system moves the camera 143 relative to a second of the containers 105 so the camera is directed at the frozen sample in the second container and the process is repeated.

Abstract

A machine vision system for use with a system that takes frozen sample cores from samples that are in containers includes a camera. A processor is configured to receive image data from the camera and to determine locations where frozen sample cores have already been taken. A method of determining one or more locations where a frozen sample core has already been taken from frozen samples includes operating a robotic system to position one of the containers on a platform at a station for receiving the container while a frozen sample core is extracted from the frozen sample contained in the container. The camera is used to capture an image of the frozen sample. Contrast in the captured image is evaluated to identify one or more bore candidates. The processor uses the image to determine whether or not the bore candidates are real bores or artifacts.

Description

MACHINE VISION SYSTEM FOR FROZEN ALIQUOTTER
FOR BIOLOGICAL SAMPLES
FIELD OF INVENTION
[ 0001 ] The present invention relates generally to machine vision systems and methods, and more particularly to machine vision systems for facilitating control of robotic systems for taking multiple frozen sample cores from frozen samples in containers without thawing the frozen samples.
BACKGROUND
[ 0002 ] Biological samples are commonly preserved to support a broad variety of biomedical and biological research that includes but is not limited to translational research, molecular medicine, and biomarker discovery. Biological samples include any samples which are of animal (including human), plant, protozoal, fungal, bacterial, viral, or other biological origin. For example, biological samples include, but are not limited to, organisms and/or biological fluids isolated from or excreted by an organism such as plasma, serum, urine, whole blood, cord blood, other blood-based derivatives, cerebral spinal fluid, mucus (from respiratory tract, cervical), ascites, saliva, amniotic fluid, seminal fluid, tears, sweat, any fluids from plants (including sap); cells (e.g., animal, plant, protozoal, fungal, or bacterial cells, including buffy coat cells); cell lysates, homogenates, or suspensions; microsomes; cellular organelles (e.g., mitochondria); nucleic acids (e.g., RNA, DNA), including chromosomal DNA, mitochondrial DNA, and plasmids (e.g., seed plasmids); small molecule compounds in suspension or solution (e.g., small molecule compounds in DMSO); and other fluid-based biological samples. Biological samples may also include plants, portions of plants (e.g., seeds) and tissues (e.g., muscle, fat, skin, etc.).
[ 0003 ] Biobanks typically store these valuable samples in containers (e.g., well plates or arrays, tubes, vials, or the like) and cryopreserve them. Tubes, vials, and similar containers can be organized in arrays and can be stored in well plates, racks, divided containers, etc. Although some samples are stored at relatively higher temperatures (e.g., about -20 degrees centigrade), other samples are stored at much lower temperatures. For example, some samples are stored in freezers at -80 degrees centigrade or lower (using liquid nitrogen or the vapor phase above liquid nitrogen) to preserve the biochemical composition and integrity of the frozen sample as close as possible to the in vivo state to facilitate accurate, reproducible analyses of the samples.
[0004] From time to time, it may be desirable to run one or more tests on a sample that has been frozen. For example, a researcher may want to perform tests on a set of samples having certain characteristics. A particular sample may contain enough material to support a number of different tests. In order to conserve resources, smaller samples known as aliquots are commonly taken from larger cryopreserved samples (which are sometimes referred to as parent samples) for use in one or more tests so the remainder of the parent sample will be available for one or more different future tests.
[0005] Biobanks have adopted different ways to address this need to provide sample aliquots. One option is to freeze a sample in large volume, thaw it when aliquots are requested and then refreeze any remainder of the parent sample for storage in the cryopreserved state until future aliquots are needed. This option makes efficient use of frozen storage space; yet this efficiency comes at the cost of sample quality. Exposing a sample repeatedly to freeze/thaw cycles can degrade the sample's critical biological molecules (e.g., RNA) and damage biomarkers, either of which could compromise the results of any study using data obtained from the damaged samples.
[0006] Another option is to freeze a sample in large volume, thaw it when an aliquot is requested, subdivide the remainder of the parent sample in small volumes to make additional aliquots for future tests, and then refreeze these smaller volume aliquots to cryopreserve each aliquot separately until needed for a future test. This approach limits the number of freeze/thaw cycles to which a sample is exposed, but there is added expense associated with the larger volume of frozen storage space, labor, and larger inventory of sample containers (e.g., tubes, vials, or the like) required to maintain the cryopreserved aliquots. Moreover, the aliquots can be degraded or damaged by even a limited number of freeze/thaw cycles.
[ 0007 ] Yet another approach is to divide a large volume sample into smaller volume aliquots before freezing them for the first time. This approach can limit the number of freeze/thaw cycles to which a sample may be subjected to only one; yet, there are disadvantages associated with the costs of labor, frozen storage space, and sample container inventory requirements.
[ 0008 ] U.S. pre-grant publication No. 20090019877, the contents of which are hereby incorporated by reference,
discloses a system for extracting frozen sample cores from a frozen biological sample without thawing the original (parent) sample. The system uses a drill including a hollow coring bit to take a frozen core sample from the original parent sample without thawing the parent sample. The frozen sample core obtained by the drill is used as the aliquot for the test. After the frozen core is removed, the remainder of the sample is returned to frozen storage in its original container until another aliquot from the parent sample is needed for a future test.
[ 0009 ] The present inventors have developed systems and methods, which will be described below, that facilitate
automatic recognition of whether or not a frozen sample contains any bores from previous extraction of one or more frozen sample cores, as well as the positions of any such bores, to implement automatic extraction of further frozen sample cores from the sample.
SUMMARY
[ 0010 ] One aspect of the invention is a machine vision system for use with a robotic system for taking a plurality of frozen sample cores from frozen samples that are each contained in a respective container. The machine vision system includes a platform for supporting one or more of the containers. The platform has a station for receiving at least one of the containers and a pair of calibration marks on the platform in fixed positions relative to the station. The system has a camera for capturing an image of the container while the container is received at the station. A processor is configured to receive image data from the camera indicative of the image of the container. The processor is configured to determine one or more locations where a frozen sample core has already been taken from a frozen sample contained in the container by: (a) evaluating contrast in the image to identify one or more bore candidates; and (b) using information about the position of the calibration marks relative to the bore candidates to determine whether or not the one or more candidates are likely to be artifacts instead of real bores in the sample.
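By way of illustration only, the pair of calibration marks in known platform positions can anchor image coordinates to the platform frame through a two-point similarity transform. The sketch below is not part of the disclosed system; the function name and the (x, y) tuple format are assumptions:

```python
def similarity_from_marks(mark1_px, mark2_px, mark1_xy, mark2_xy):
    """Build a pixel-to-platform mapping from the two calibration marks.

    The marks' platform positions (mark1_xy, mark2_xy) are fixed relative
    to the station; their pixel positions (mark1_px, mark2_px) are located
    in the captured image. Complex arithmetic encodes scale, rotation, and
    translation in one step. Names and point format are illustrative.
    """
    a, b = complex(*mark1_px), complex(*mark2_px)
    A, B = complex(*mark1_xy), complex(*mark2_xy)
    s = (B - A) / (b - a)   # combined scale and rotation
    t = A - s * a           # translation

    def to_platform(point_px):
        w = s * complex(*point_px) + t
        return (w.real, w.imag)

    return to_platform
```

A bore candidate's pixel centroid can then be mapped into platform coordinates before its position is compared against the expected coring locations.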
[ 0011 ] Another aspect of the invention is a method of taking a frozen sample core from a frozen sample that is contained in a container. The method includes positioning the container at a station for receiving a container on a platform. The platform has a pair of calibration marks in fixed positions relative to the station. An image of the container is captured while the container is received at the station. One or more locations where a frozen sample core has already been taken from the frozen sample contained in the container are determined by: (a) evaluating contrast in the image to identify one or more bore candidates; and (b) using information about the position of the calibration marks relative to the bore candidates to determine whether or not the one or more candidates are likely to be artifacts instead of real bores in the frozen sample. A frozen sample core is taken from the sample at a location from which no frozen sample core has already been taken, as determined in the determining step.
[ 0012 ] Yet another aspect of the invention is a machine vision system for use with a robotic system for taking a plurality of frozen sample cores from frozen samples that are each contained in a respective container. The machine vision system includes a platform and a camera for capturing an image of one of the containers while it is on the platform. A processor is configured to receive image data from the camera indicative of the image captured by the camera. The processor is configured to determine one or more locations where a frozen sample core has already been taken from the frozen sample contained in the container by: (a) evaluating contrast in the image to identify one or more bore candidates; and (b) determining whether or not the one or more bore candidates are likely to be artifacts instead of real bores in the sample. The processor is configured to use at least one of the following to determine whether or not the one or more bore candidates are likely to be artifacts: (i) the size of the bore candidate; (ii) the distance between the bore candidate and a center axis of the container; (iii) the angle formed between a first line and a second line, the first line extending between the bore candidate and the center axis of the container and the second line extending between the center axis of the container and another bore candidate; (iv) the relation between the position of the one or more bore candidates and an expected pattern of bores in the sample; (v) the location of the one or more bore candidates relative to a peripheral edge of the container; (vi) the number of bore candidates identified; (vii) the amount of contrast between the bore candidates and the area surrounding the bore candidates; and (viii) combinations thereof.
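By way of illustration only, a few of the enumerated filters can be written as simple geometric tests. The sketch below applies criteria (i), (ii), and (v); the dict representation of a candidate and all numeric thresholds are illustrative assumptions, not values from the disclosure:

```python
import math

def is_likely_artifact(candidate, center, container_radius,
                       min_diam=1.0, max_diam=5.0, edge_margin=0.5):
    """Apply filters (i), (ii), and (v) above to one bore candidate.

    `candidate` is a dict with "x", "y", and "diameter" keys; the numeric
    thresholds are illustrative defaults. Returns True when the candidate
    is probably an artifact rather than a real bore.
    """
    d = candidate["diameter"]
    if not (min_diam <= d <= max_diam):
        return True                      # (i) implausibly small or large
    r = math.hypot(candidate["x"] - center[0], candidate["y"] - center[1])
    if r + d / 2.0 > container_radius - edge_margin:
        return True                      # (ii)/(v) too far out, overlapping the edge
    return False
```

Criteria (iii), (iv), and (vi) compare candidates against each other or against the planned coring pattern and would take the full candidate list as input rather than one candidate at a time.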
[ 0013 ] Another aspect of the invention is a method of taking a frozen sample core from a frozen sample that is contained in a container. The method includes capturing an image of the container. The captured image is used to determine one or more locations where a frozen sample core has already been taken from the frozen sample contained in the container by: (a) evaluating contrast in the image to identify one or more bore candidates; and (b) determining whether or not the one or more bore candidates are likely to be artifacts instead of real bores in the frozen sample. At least one of the following pieces of information is used to determine whether or not the one or more bore candidates are likely to be artifacts instead of real bores: (i) the size of the bore candidate; (ii) the distance between the bore candidate and a center axis of the container; (iii) the angle formed between a first line and a second line, the first line extending between the bore candidate and the center axis of the container and the second line extending between the center axis of the container and another bore candidate; (iv) the relation between the position of the one or more bore candidates and an expected pattern of bores in the frozen sample; (v) the location of the one or more bore candidates relative to a peripheral edge of the container; (vi) the number of bore candidates identified; (vii) the amount of contrast between the bore candidates and the surrounding areas; and (viii) combinations thereof. A frozen sample core is taken from the sample at a location from which no frozen sample core has already been taken, as determined in the determining step.
[ 0014 ] Another aspect of the invention is a calibration system configured to calibrate a robotic system for taking a plurality of frozen sample cores from frozen samples that are each contained in a respective container. The calibration system includes a platform for supporting the containers. The platform has one or more fixed targets positioned thereon. A camera is mounted on the robotic system for capturing an image of one or more containers while the containers are supported by the platform and for capturing images of the one or more fixed targets positioned on the platform. A processor is configured to receive image data from the camera indicative of images formed by the camera. The processor is configured to calibrate the robotic system using an image of the one or more fixed targets on the platform.
[ 0015 ] Another aspect of the invention is a method of calibrating a robotic system for taking a plurality of frozen sample cores from frozen samples that are each contained in a respective container. The method includes using a camera, which captures images of one or more containers while the containers are supported by a platform of the robotic system to determine whether or not one or more frozen sample cores has already been taken from the frozen samples, to capture an image of one or more fixed targets on the platform. The image of the one or more targets is used to calibrate the robotic system.
[ 0016 ] Another aspect of the invention is a machine vision system for use with a robotic system adapted for taking a plurality of frozen sample cores from frozen samples that are each contained in a container. The machine vision system includes a camera for capturing an image of a container while the container is supported by a platform. The camera has an optical axis. The system has a ring light for illuminating the container on the platform. The ring light includes a plurality of light sources arranged in an annular pattern. The optical axis of the camera extends through a central portion of the annular pattern. A processor is adapted to receive image data from the camera indicative of the image captured by the camera and to determine one or more locations where a frozen sample core has already been taken from the sample contained in the container by evaluating contrast in the image.

[ 0017 ] Still another aspect of the invention is a method of determining one or more locations where frozen sample cores have already been taken from frozen samples, each of the frozen samples being contained in a respective container. The method includes operating a robotic system to move a camera relative to a first one of the containers so the camera is directed at the frozen sample in the first container. The frozen sample is illuminated using a ring light. The ring light has a plurality of light sources arranged in an annular pattern. The camera has an optical axis that extends through a central portion of the annular pattern. The camera is used to capture an image of the illuminated frozen sample. Contrast in the captured image is evaluated and the image is processed to identify one or more bore candidates in the captured image and determine whether or not the bore candidates are likely to be artifacts or real bores in the frozen sample.
The robotic system is operated to move the camera relative to a second of the containers so the camera is directed at the frozen sample in the second container. The imaging is repeated for the frozen sample in the second
container.
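By way of illustration only, the contrast evaluation that yields bore candidates may be approximated by grouping dark pixels into connected regions, since bores read darker than the surrounding frozen surface under the ring light. The threshold value, the list-of-rows image format, and the function name below are assumptions, not part of the disclosure:

```python
def find_bore_candidates(image, threshold=60):
    """Group dark pixels into connected regions (4-connectivity).

    `image` is a list of rows of 0-255 grayscale values; each dark region
    is reported by its centroid and pixel count. The threshold and the
    image format are illustrative assumptions.
    """
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    candidates = []
    for y0 in range(h):
        for x0 in range(w):
            if image[y0][x0] < threshold and not seen[y0][x0]:
                stack, pixels = [(y0, x0)], []
                seen[y0][x0] = True
                while stack:                      # depth-first flood fill
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for ny, nx in ((y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)):
                        if 0 <= ny < h and 0 <= nx < w and \
                                image[ny][nx] < threshold and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                candidates.append({
                    "x": sum(p[1] for p in pixels) / len(pixels),
                    "y": sum(p[0] for p in pixels) / len(pixels),
                    "area": len(pixels),
                })
    return candidates
```

Each region returned here would still be subject to the artifact filters discussed elsewhere in this document before being accepted as a real bore.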
[ 0018 ] Yet another aspect of the invention is a machine vision system for use with a robotic system for taking a plurality of frozen sample cores from frozen samples that are each contained in a container. The system includes a camera configured for capturing monochrome images of the containers while the containers are supported by a platform. A light is positioned to illuminate the containers and the samples contained therein while the containers are on the platform. A processor is adapted to receive grayscale image data from the camera indicative of images formed by the camera and determine locations where frozen sample cores have already been taken from the samples by evaluating contrast in the images. The light emits light having a color other than white.

[ 0019 ] Another aspect of the invention is a method of determining one or more locations where frozen sample cores have already been taken from frozen samples. Each of the frozen samples is contained in a respective container. The method includes operating a robotic system to move a camera relative to a first one of the containers so the camera is directed at the frozen sample in the first container. The frozen sample is illuminated with a colored light. The camera is used to capture a grayscale image of the illuminated frozen sample. Contrast in the captured image is evaluated and the image is processed to identify one or more bore candidates in the captured image and determine whether or not the bore candidates are likely to be artifacts or real bores in the frozen sample. The robotic system is operated to move the camera relative to a second of the containers so the camera is directed at the frozen sample in said second container. The imaging is repeated for the frozen sample in the second container.
[ 0020 ] Another aspect of the invention is a machine vision system for use with a robotic system for taking a plurality of frozen sample cores from frozen samples that are each contained in a container. The system includes a camera for taking images of the containers while the containers are supported by a platform. A light is positioned to illuminate the containers and the samples contained therein while the containers are on the platform. The light has red light emitting elements, blue light emitting elements, and green light emitting elements. The intensity of light emitted from the red, blue, and green light emitting elements is selectively adjustable to allow any of multiple different colors of light to be selected as the color of light to be emitted by the light. A processor is adapted to receive image data from the camera indicative of images formed by the camera and determine locations where frozen sample cores have already been taken from the samples by evaluating contrast in the images. The processor is adapted to receive input about the color of the samples in the containers and adjust the color of the light emitted by the light to reduce a difference between the color of the samples and the color of the light emitted by the light.
[ 0021 ] Another aspect of the invention is a method of determining one or more locations where frozen sample cores have already been taken from frozen samples. Each of the frozen samples is contained in a respective container. The method includes operating a robotic system to move a camera relative to a first one of the containers so the camera is directed at the frozen sample in the first container. The frozen sample is illuminated with a colored light. The color of the light is selected to match the color of the frozen sample. The camera is used to capture an image of the illuminated frozen sample. Contrast in the captured image is evaluated and the image is processed to identify one or more bore candidates in the captured image and determine whether or not the bore candidates are likely to be artifacts or real bores in the frozen sample. The robotic system is operated to move the camera relative to a second of the containers so the camera is directed at the frozen sample in said second container. The imaging process is repeated for the frozen sample in the second container.
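By way of illustration only, selecting red, green, and blue emitter intensities to reduce the difference between the light color and the measured sample color can be as simple as per-channel quantization. The function name, the 0-255 color convention, and the intensity step are assumptions about hypothetical hardware, not details from the disclosure:

```python
def match_light_to_sample(sample_rgb, step=5):
    """Choose red/green/blue emitter intensities near the sample's color.

    `sample_rgb` is the measured 0-255 color of the frozen sample; `step`
    models the hypothetical resolution of the light's intensity control.
    Clamping keeps each channel in the valid 0-255 range.
    """
    return tuple(min(255, max(0, step * round(c / step))) for c in sample_rgb)
```

The returned triple would be written to the red, green, and blue emitting elements before the image is captured.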
[ 0022 ] Another aspect of the invention is a machine vision system for use with a robotic system for taking a plurality of frozen sample cores from frozen samples that are each contained in a container. The machine vision system includes a platform for supporting the containers. The platform has a station for receiving one of the containers while a frozen sample core is extracted from a frozen sample contained in the container. The system has a camera for capturing images of containers while they are received at the station on the platform. A light is positioned to illuminate the containers from a position providing at least one of back lighting and side lighting.

[ 0023 ] Yet another embodiment of the invention is a method of determining one or more locations where frozen sample cores have already been taken from frozen samples. Each of the frozen samples is contained in a respective container. The method includes operating a robotic system to position one of the containers on a platform at a station for receiving the container while a frozen sample core is extracted from the frozen sample contained in the container. A light is used to provide at least one of back lighting and side lighting for the container. A camera is used to capture an image of the frozen sample while illuminated by the light. Contrast in the captured image is evaluated and the image is processed to identify one or more bore candidates in the captured image.
[ 0024 ] Another inventive aspect is a machine vision system for use with a robotic system adapted for taking a plurality of frozen sample cores from frozen samples that are each contained in a container. The machine vision system includes a camera for capturing an image of a container while the container is supported by a platform at a station for receiving the container while a frozen sample core is extracted from the frozen sample contained therein. The system has a red light for illuminating the container from above while it is on the platform at the station with substantially monochromatic red light. A processor is adapted to receive image data from the camera indicative of the image captured by the camera and to determine one or more locations where a frozen sample core has already been taken from the sample contained in the container by evaluating contrast in the image.

[ 0025 ] Yet another aspect of the invention is a method of determining one or more locations where frozen sample cores have already been taken from frozen samples. Each of the frozen samples is contained in a respective container. The method includes operating a robotic system to position one of the containers on a platform at a station for receiving the container while a frozen sample core is extracted from the frozen sample contained in the container. The container is illuminated from above while it is on the platform at the station with substantially monochromatic red light. A camera is used to capture an image of the frozen sample while illuminated by the red light. Contrast in the captured image is evaluated and the image is processed to identify one or more bore candidates in the captured image.
[ 0026 ] Another aspect of the invention is a machine vision system for use with a robotic system for taking a plurality of frozen sample cores from frozen samples that are each contained in a respective container. The machine vision system includes a platform for supporting one or more of the containers. The platform has a station for receiving at least one of the containers. The system has a camera for capturing an image of the container while the container is received at the station. A processor is configured to receive image data from the camera indicative of the image of the container. The processor is configured to determine one or more locations where a frozen sample core has already been taken from a frozen sample contained in the container by: evaluating contrast in the image to identify one or more bore candidates; identifying an edge of the container; and using information about the position of the edge relative to the bore candidates to determine whether or not the one or more candidates are likely to be artifacts instead of real bores in the sample.
[ 0027 ] Another aspect of the invention is a method of determining one or more locations where frozen sample cores have already been taken from frozen samples. Each of the frozen samples is contained in a respective container. The method includes operating a robotic system to position one of the containers on a platform at a station for receiving the container while a frozen sample core is extracted from the frozen sample contained in the container. A camera is used to capture an image of the frozen sample. Contrast in the captured image is evaluated to identify one or more bore candidates and identify an edge of the container. Information about the position of the edge relative to the bore candidates is used to determine whether or not the one or more candidates are likely to be artifacts instead of real bores in the sample.
[ 0028 ] One aspect of the invention is a machine vision system for use with a robotic system for taking a plurality of frozen sample cores from frozen samples that are each contained in a respective container. The machine vision system includes a platform for supporting one or more of the containers. The platform has a station for receiving at least one of the containers. The system includes a camera for capturing an image of the container while the container is received at the station. The system includes a fill level detection system adapted to detect the positions of the surfaces of the frozen samples. A processor is configured to receive signals from the fill level detection system and use the signals to determine where to position the camera to obtain an image of the frozen samples.
[ 0029 ] Another aspect of the invention is a method of determining one or more locations where frozen sample cores have already been taken from frozen samples, each of the frozen samples being contained in a respective container. The method includes using a fill level detection system to determine the position of a surface of the frozen sample that is spaced from a bottom of the container. Information from the fill level detection system is used to determine where to position a camera so the camera has a predetermined position relative to the surface of the sample, and the camera is moved to that position. An image of the frozen sample in the container is captured from that position. The image is used to identify the location of one or more bores in the sample.

[ 0030 ] Yet another aspect of the invention is a machine vision system for use with a robotic system for taking a plurality of frozen sample cores from frozen samples that are each contained in a respective container. The machine vision system includes a platform for supporting one or more of the containers. The platform has a station for receiving at least one of the containers. The system includes a coring probe for taking frozen sample cores from the frozen samples. The system includes a camera for capturing an image of the container while the container is received at the station. A processor is configured to receive image data from the camera indicative of the image of the container and to determine one or more locations where a frozen sample core has been taken from a frozen sample contained in the container. The processor is configured to move the coring probe into the open end of at least one bore to clear the open end of the bore of debris.
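By way of illustration only, using a fill level measurement to place the camera at a predetermined position relative to the sample surface reduces to simple arithmetic. The function name, the standoff distance, and the travel limit below are hypothetical values for a sketch, not from the disclosure:

```python
def camera_position_for(container_xy, surface_z, standoff=30.0, z_max=120.0):
    """Compute where to move the camera from a fill level reading.

    `surface_z` is the measured height of the frozen sample's surface;
    the camera is placed a fixed working distance (`standoff`) above it,
    clamped to the robot's travel limit `z_max`. All values illustrative.
    """
    x, y = container_xy
    return (x, y, min(surface_z + standoff, z_max))
```

Holding the camera-to-surface distance constant keeps the image scale consistent from container to container, which simplifies the later bore-size and bore-position checks.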
[ 0031 ] Another aspect of the invention is a method of taking a frozen sample core from a frozen sample that is contained in a container. The method includes positioning the container at a station for receiving a container on a platform. An image of the container is captured while the container is received at the station. One or more locations where a frozen sample core has already been taken from the frozen sample contained in the container are determined. The frozen sample core is taken from the frozen sample at a location from which no frozen sample core has already been taken, as determined in the determining step. After taking the frozen sample core from the frozen sample, a coring probe is inserted into the one or more locations where a frozen sample core has been taken to clear those locations of debris.
[ 0032 ] Still another aspect of the invention is a machine vision system for use with a robotic system adapted for taking a plurality of frozen sample cores from frozen samples that are each contained in a container. The machine vision system includes a camera for capturing an image of a container while the container is supported by a platform. The system includes a light for illuminating the container on the platform. A majority of the light energy emitted by the light is selected from the group consisting of red light with a wavelength in the range of 620nm to 750nm and green light with a wavelength in the range of 495nm to 570nm. A processor is adapted to receive image data from the camera indicative of the image captured by the camera and to determine one or more locations where a frozen sample core has already been taken from the sample contained in the container by evaluating the image.
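By way of illustration only, the recited requirement that a majority of the light energy fall within the red or green band can be checked against a sampled emission spectrum. The coarse dict representation of a spectrum and the function name are assumptions for this sketch; only the wavelength ranges come from the text above:

```python
RED_BAND = (620.0, 750.0)    # nm, as recited above
GREEN_BAND = (495.0, 570.0)  # nm, as recited above

def majority_in_band(spectrum, band):
    """Check whether most emitted energy falls inside one wavelength band.

    `spectrum` maps wavelength in nm to relative energy (a coarse sampled
    representation, assumed for illustration). Returns True when more
    than half the total energy lies within `band`.
    """
    lo, hi = band
    total = sum(spectrum.values())
    inside = sum(e for wl, e in spectrum.items() if lo <= wl <= hi)
    return total > 0 and inside > 0.5 * total
```

A light passes the recited condition when `majority_in_band` is true for either `RED_BAND` or `GREEN_BAND`.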
[ 0033 ] Another aspect of the invention is a method of determining one or more locations where frozen sample cores have already been taken from frozen samples, each of the frozen samples being contained in a respective container. The method includes operating a robotic system to move a camera relative to a first one of the containers so the camera is directed at the frozen sample in the first container. The frozen sample is illuminated using a light, wherein a majority of the light energy emitted by the light is selected from the group consisting of red light with a wavelength in the range of 620nm to 750nm and green light with a wavelength in the range of 495nm to 570nm. The camera is used to capture an image of the illuminated frozen sample. The image is used to identify one or more bore candidates in the captured image and determine whether or not the bore candidates are likely to be artifacts or real bores in the frozen sample. The robotic system is operated to move the camera relative to a second of the containers so the camera is directed at the frozen sample in said second container. The imaging is repeated for the frozen sample in said second container.
[ 0034 ] Other objects and features will in part be apparent and will in part be pointed out hereinafter.

BRIEF DESCRIPTION OF THE DRAWINGS
[ 0035 ] FIG. 1 is a perspective of one example of a frozen aliquotter including one embodiment of a machine vision system of the present invention;
[ 0036 ] FIG. 2 is a top plan of the frozen aliquotter;
[ 0037 ] FIG. 3 is a top plan of the frozen aliquotter with parts removed to avoid obstructing view of one embodiment of a platform thereof;
[ 0038 ] FIG. 4 is an enlarged perspective of the platform taken in a plane including line 4--4 on Fig. 3;
[ 0039 ] FIG. 5 is a perspective of a fragment of the frozen aliquotter shown in cross section taken in a plane including line 5--5 on Fig. 2;
[ 0040 ] FIG. 6 is a perspective of one embodiment of robotic end effector for use with a frozen aliquotter;
[ 0041 ] FIG. 7 is a bottom plan view of the robotic end effector illustrated in Fig. 6;
[ 0042 ] FIG. 8 is a schematic diagram showing some of the components of the frozen aliquotter;
[ 0043 ] FIG. 9 is a schematic diagram illustrating bore candidates that differ in size;
[ 0044 ] FIG. 10 is a schematic diagram illustrating one embodiment of a geometric pattern according to which frozen sample cores are extracted from a frozen sample;
[ 0045 ] FIG. 11 is a schematic diagram illustrating bore candidates that are spaced at different distances from a center of a container;
[ 0046 ] FIG. 12 is a schematic diagram illustrating bore candidates that are positioned at various different angles relative to one another from a center of the container;
[ 0047 ] FIG. 13 is a schematic diagram illustrating bore candidates that do not follow an expected sequence planned for extraction of frozen sample cores from a frozen sample;

[ 0048 ] FIG. 14 is a photograph of a container illustrating use of an edge finding algorithm to identify the location of an edge of the container from the image data;
[ 0049 ] FIG. 15 is a schematic diagram illustrating one embodiment of using fixed calibration marks to identify the location of features in the image data;
[ 0050 ] FIG. 16 is a photograph showing a pair of
calibration marks in fixed position relative to a container;
[ 0051 ] FIG. 17 is a photograph of one embodiment of a density step target;
[ 0052 ] FIG. 18 is a schematic illustration of a coring probe positioned over a bore in a frozen sample; and
[ 0053 ] Figure 19 is a schematic illustration of the coring probe of FIG. 18 inserted into the bore.
[ 0054 ] Corresponding reference characters indicate
corresponding parts throughout the drawings.
DETAILED DESCRIPTION
[ 0055 ] Referring now to the drawings, first to Figs. 1-3 in particular, one embodiment of a robotic system for taking frozen sample cores from frozen samples contained in containers is generally designated 101. The system 101 includes a platform 103 for supporting a plurality of containers 105 and a robotic end effector 111 movable relative to the platform by a motorized drive system 113 controlled by a processor 114 (Fig. 8) . In the illustrated embodiment, the robotic system 101 is a cartesian gantry style robot, but this is not required and other types of robotic systems can be used within the scope of the invention. Additional details about robotic systems for taking frozen sample cores from frozen samples are set forth in U.S. pre-grant publication No. 20090019877, PCT application No.
PCT/US2011/61214, filed November 17, 2011, and U.S. Application No. 13/359,301, filed January 26, 2012, the contents of which are each hereby incorporated by reference.
[ 0056 ] In the illustrated embodiment, the platform 103 includes a recessed area 115 sized and shaped to receive one or more removable trays 117 for holding the containers 105. For example, one or more of the trays 117a are suitably source trays that hold a plurality of source containers 105, each of which contains a frozen sample from which a frozen sample core is to be taken, and one or more other trays 117b are suitably destination trays that hold a plurality of destination containers, each of which is for use in holding one or more frozen sample cores taken from the containers on the source tray.
[ 0057 ] As illustrated in Figs. 3 and 4, the platform 103 also includes a source container station 107 adapted to receive one of the source containers while a frozen sample core is extracted from the frozen sample therein and a destination container station 109 adapted to receive an empty container in which one or more frozen sample cores are
deposited. As illustrated in Fig. 4, the source container station 107 includes a receptacle 106 for receiving containers 105 and a pair of clamping jaws 108, 110 on opposite sides at the top of the receptacle. At least one of the jaws 108 is selectively moveable, such as by a pneumatic actuator (not shown), toward and away from the other jaw 110 for selectively clamping containers 105 in position at the station 107 to hold them in place during extraction of a frozen sample and releasing the containers so they can be removed from the station and replaced in the tray 117 afterward. Similar jaws can be used to hold the container 105 at the sample receiving station 109 if desired.
[ 0058 ] The system 101 illustrated in the drawings is adapted for use with frozen samples that are stored in
individual containers 105. However, it is understood the system could be adapted for use with well plates and arrays in which multiple different frozen samples are stored in a single container. For example, appropriate components can be provided (e.g., on the end effector) for moving well plates or arrays instead of individual containers and the stations 107, 109 for receiving the containers can be adapted to receive well plates or arrays without departing from the scope of the invention. Likewise, the clamping system can be adapted to hold well plates and arrays within the scope of the invention.
[0059] A washing station 119 for cleaning a sample coring probe 121 used to extract frozen sample cores from the frozen samples is also on the platform. Details concerning the
construction and operation of a suitable washing station are provided in PCT application No. PCT/US2011/61214, filed November 17, 2011, and do not need to be repeated herein.
[0060] A cooling system 131 for keeping the frozen samples and the frozen sample cores extracted therefrom frozen is positioned under the platform 103 in the illustrated embodiment, although the cooling system can be positioned elsewhere and/or other cooling systems used without departing from the scope of the invention. As illustrated in Figs. 5-7, the end effector 111 of the robotic system 101 includes a sample coring probe 121 and a sample core extraction system 123 operable to move the sample coring probe into one of the frozen samples contained in one of the containers 105 and then withdraw the coring probe from the frozen sample to obtain a frozen sample core from the frozen sample. In the illustrated embodiment, for example, the sample core extraction system 123 includes a motor 125 adapted to rotate the sample coring probe 121 as the robotic drive system 113 lowers the sample coring probe into the container and then raises it out of the container. Additional details about the operation of a coring probe to extract frozen sample cores from frozen samples are set forth in U.S. pre-grant publication No. 20090019877, PCT application No. PCT/US2011/61214, filed
November 17, 2011, and U.S. Application No. 13/359,301, filed January 26, 2012 and do not need to be repeated herein. It is understood any sample coring probe and sample extraction system can be used within the scope of the invention, as long as they can be operated to extract a frozen sample core from a frozen sample with only limited or no thawing of the frozen sample material and the frozen sample core extracted therefrom.
[ 0061 ] The end effector 111 also includes a gripping system 127 operable to selectively hold and release containers 105 for use by the robotic system 101 in moving containers back and forth between the trays 117 and the stations 107, 109 on the platform for the containers from which frozen sample cores are being taken and into which frozen sample cores are being deposited. Those skilled in the art will be familiar with various commercially available gripping systems that can be used. In the illustrated embodiment, for example, the gripping system includes a plurality of moveable fingers 129 moveable by one or more pneumatic actuators (not shown) under the control of the processor 114. It is understood other gripping systems may be used within the scope of the invention. For example, the gripping system can be modified if desired to facilitate use of the gripping system to move well plates or arrays containing multiple frozen samples.
[ 0062 ] As illustrated schematically in Fig. 8, the robotic system 101 cooperates with a machine vision system 141
configured to automatically recognize locations from which frozen sample cores have already been taken from the frozen samples in the containers 105 (if there are any) to facilitate taking additional frozen sample cores from already-cored frozen samples. At these locations, there will be a bore or hole in the frozen sample. In some cases the bore may be empty, but in other cases the bore may contain or be obscured by frost crystals that have grown on the sample (e.g., while the sample was in frozen storage), by debris, and/or for other reasons. The machine vision system 141 is also configured to recognize the absence of any bores in frozen samples that have not yet had any frozen sample cores extracted from them. The machine vision system 141 facilitates use of the robotic system 101 to take frozen sample cores from frozen samples that were previously sampled to obtain an aliquot and then were returned to frozen storage for a period of time before additional frozen sample cores from that sample are requested to provide additional aliquots in later tests.
[ 0063 ] The machine vision system 141 includes a camera 143 mounted for capturing an image of a container 105 and a frozen sample contained therein while the container is supported by the platform 103. In the illustrated embodiment, the machine vision system 141 includes a display 146 coupled to the processor 114 for displaying the captured and/or processed image data. The camera 143 is suitably mounted on the robotic system 101 for movement relative to the containers 105 by the robotic system. For example, in the illustrated embodiment, the camera 143 is mounted on the end effector 111 for movement with the end effector. It is recognized the camera could be mounted in fixed position relative to the platform within the scope of the invention.
[ 0064 ] The camera 143 and processor 114 are configured to communicate with one another so the processor can instruct the camera to capture images at appropriate times and receive image data from the camera indicative of the images captured by the camera. The processor 114 for the vision system 141 can suitably be the same processor that controls operation of the robotic system 101, although separate processors could be used within the scope of the invention. Various cameras can be used within the broad scope of the invention. For example, the camera 143 can be a digital camera containing a CCD array (not shown) that converts the captured image into electrical signals. The camera 143 in the illustrated embodiment is configured to capture monochromatic (e.g., grayscale) images instead of color images for reasons that will be discussed in more detail later, but the camera can be configured to capture color images within the broad scope of the invention.
[ 0065 ] The machine vision system 141 also includes a light 145 for illuminating the container 105 being imaged by the camera 143. One or more lights having various different
configurations, arrangements, and colors can be used within the broad scope of the invention. The lights can be moveable (e.g., mounted on the end effector 111) or fixed (e.g., secured to or within the platform 103) within the scope of the invention. The one or more lights can be positioned to provide bright field illumination, dark field illumination, indirect lighting (e.g., side lighting), backlighting, direct lighting (i.e., lighting directed perpendicular to the illuminated surface), and any combinations thereof. Fig. 4 illustrates three optional lights 181, 183, 185 that can be positioned at fixed locations relative to the station for receiving the container 105 holding the frozen sample. For example, fiber optic cables can be routed through the platform or provided on the platform to provide light at locations designated 181, 183, and/or 185. Other types of lights could also be secured on or within the platform at these locations.
[ 0066 ] The containers 105 are typically transparent or at least translucent so light can pass through the side or bottom of the container and interact with the frozen sample therein. The light 181 at the bottom of the receptacle 106 for receiving the container 105 provides a backlighting option. The light 183 at the top of the container is suitably secured within one of the jaws 108, 110 to provide a side lighting and/or dark field illumination option for the surface of the sample. The light 185 in the side of the receptacle 106 suitably provides a side lighting option below or at the surface of the frozen sample. When the side lighting and/or back lighting options are used, the bores in the frozen sample will typically have a brighter appearance than the frozen sample in the corresponding image. Side lighting and/or back lighting can be useful in detecting real bores that are either filled with or completely obscured by frost or other debris. The machine vision system 141 can include multiple different lights and the processor 114 can be
configured to operate the lights sequentially if desired to make use of image data of the frozen sample under different lighting conditions.
[ 0067 ] As illustrated in Figs. 6 and 7, the light 145 in the illustrated embodiment is a ring light. The ring light 145 has a plurality of light sources 147 (e.g., LEDs) arranged in an annular (e.g., circular) pattern. For example, the ring light 145 suitably has a hollow cylindrical housing 151 having a groove 153 in one end. The light sources 147 are positioned in the groove 153 in a recessed position so the housing blocks the direct path of light from the light sources at wide angles therefrom. A cover 149 such as a clear window, transparent or translucent diffuser, or lens can be installed in the groove to enclose the light sources 147 if desired.
[ 0068 ] In the embodiment illustrated in Figs. 6-7, the camera 143 is positioned so an optical axis 155 of the camera 143 extends through a central portion of the annular pattern of the ring light 145. For example, the annular ring light 145 suitably has a central axis that is co-linear with the optical axis 155 of the camera. The ring light 145 and camera 143 are suitably arranged so there is no direct path from the light sources 147 in the ring light to the camera. In Fig. 5, for example, the camera 143 has a forward end 157 for receiving light from an object being imaged and the ring light 145 is suitably positioned to extend farther forward than the camera so the light emitted by the ring light is emitted from a position in front of the camera. Also as illustrated in Fig. 5, the edge of the housing 151 of the ring light 145 suitably extends between the light sources 147 and the camera 143 to block the path of light directly from the light sources into the camera.
[ 0069 ] The processor 114 is configured to receive image data from the camera 143 indicative of the image of the
container 105 and the frozen sample therein and to use the image data from the camera to determine one or more locations where a frozen sample core has already been taken from the frozen sample contained in the container. The processor 114 can be configured to use various methods to make this determination. For example, the processor 114 can suitably be configured to evaluate
contrast in the image to identify one or more bore candidates and then determine whether or not the one or more bore
candidates are likely to be artifacts instead of real bores in the sample using information from the image.
[ 0070 ] The processor 114 suitably processes the image captured while the container 105 is illuminated with the light 145 in various ways to facilitate this determination. For example, in one embodiment the processor 114 is configured to apply a thresholding filter to the raw image data, apply one or more morphological filters (e.g., erosion, dilation, opening, and/or closing) to the thresholded image, and then apply particle analysis to identify one or more bore candidates.
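By way of illustration only, the threshold-morphology-particle pipeline described above can be sketched in pure Python. This is a simplified teaching example, not the implementation of the disclosed embodiment; the small grid standing in for the camera image, the 3x3 structuring element, and all function names are assumptions.

```python
# Simplified sketch of the image-processing pipeline described above:
# threshold -> morphological erosion -> particle (connected-component) analysis.
# A small grayscale grid stands in for the camera image.

def threshold(image, level):
    """Binarize: pixels at or above `level` become 1 (bright bore candidates)."""
    return [[1 if px >= level else 0 for px in row] for row in image]

def erode(binary):
    """3x3 erosion: a pixel survives only if it and all 8 neighbours are set."""
    h, w = len(binary), len(binary[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if all(binary[y + dy][x + dx]
                   for dy in (-1, 0, 1) for dx in (-1, 0, 1)):
                out[y][x] = 1
    return out

def particles(binary):
    """Connected-component labelling; returns (area, centroid) per blob."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for y in range(h):
        for x in range(w):
            if binary[y][x] and not seen[y][x]:
                stack, pixels = [(y, x)], []
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for dy in (-1, 0, 1):
                        for dx in (-1, 0, 1):
                            ny, nx = cy + dy, cx + dx
                            if 0 <= ny < h and 0 <= nx < w \
                               and binary[ny][nx] and not seen[ny][nx]:
                                seen[ny][nx] = True
                                stack.append((ny, nx))
                area = len(pixels)
                ay = sum(p[0] for p in pixels) / area
                ax = sum(p[1] for p in pixels) / area
                blobs.append((area, (ay, ax)))
    return blobs
```

Each surviving blob (its area and centroid) then becomes a bore candidate for the plausibility checks discussed in the following paragraphs.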
[ 0071 ] It is understood that the bore candidates identified by the processor might include some features that are artifacts instead of real bores. For example, the sample surfaces can be blotchy or become blotchy over time (e.g., due to undesired formation of frost crystals on the frozen sample, irregularities in the surface contour of the frozen sample resulting from the speed with which the sample was frozen, pieces of ice and other debris that may accumulate on the upper surface of the frozen sample such as by falling from the cap or sides of the
container, etc.). Further, although the real bores resulting from extraction of frozen sample cores are typically very uniform (e.g., circular) in appearance initially, frost crystals that might grow on the frozen sample after it is returned to cold storage can extend into the bore or over the opening at the top of the bore and alter the appearance of the bore. Thus, it has been found that a machine vision system that looks only for perfectly formed bores and excludes all else from the list of bore candidates results in a significant risk of failure to recognize actual bores that exist in the sample, particularly when the frozen sample is replaced in cold storage for a long time before additional frozen sample cores from that frozen sample are requested.
[ 0072 ] Accordingly, the processor 114 is suitably
configured to use multiple types of information to determine whether or not a bore candidate is likely to be a real bore or an artifact. For example, the processor 114 is suitably configured to use information selected from the group consisting of:
the size of the bore candidate;
the distance between the bore candidate and a center axis of the container;
the angle formed between a first line and a second line, the first line extending between the bore and the center axis of the container and the second line extending between the center axis of the container and another bore
candidate;
the relation between the position of the one or more bore candidates and an expected pattern of bores in the sample;
the location of the one or more bore candidates relative to a peripheral edge of the container;
the total number of bore candidates associated with a particular container;
the amount of contrast between the bore candidates and the area surrounding the bore candidates; and
combinations thereof to help determine whether or not a bore candidate is likely to be an artifact or a real bore.
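The criteria listed above can be combined in many ways; the following is a hypothetical sketch, for illustration only, of a filter that applies three of them (candidate size, radial distance from the container center, and total candidate count). All threshold values, the candidate data layout, and the tie-breaking rule are assumptions, not part of the disclosure.

```python
import math

# Hypothetical plausibility filter combining several of the criteria listed
# above: candidate size, radial distance from the container centre, and the
# total candidate count. All thresholds are illustrative assumptions.

def plausible_bores(candidates, center, *,
                    d_min=1.5, d_max=2.5,    # expected bore diameter range
                    r_min=2.0, r_max=4.0,    # expected distance from centre
                    max_expected=6):
    """candidates: list of (x, y, diameter). Returns candidates that pass."""
    kept = []
    for x, y, dia in candidates:
        if not (d_min <= dia <= d_max):      # too large or too small (Fig. 9)
            continue
        r = math.hypot(x - center[0], y - center[1])
        if not (r_min <= r <= r_max):        # too close/far from centre (Fig. 11)
            continue
        kept.append((x, y, dia))
    if len(kept) > max_expected:
        # More candidates than bores could physically exist: apply a stricter
        # standard, here simply preferring the smaller (best-sized) candidates.
        kept.sort(key=lambda c: c[2])
        kept = kept[:max_expected]
    return kept
```

A real system would weight contrast, angular spacing, and sequence information as well, as the following paragraphs describe.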
[ 0073 ] In many cases there will be an expected range of size (e.g., diameter) for the bores formed by extracting a frozen sample core from a frozen sample. However, some bore candidates can be substantially larger or substantially smaller than expected, as illustrated in Fig. 9. Thus, it is possible the processor 114 may be able to determine certain bore
candidates are likely to be artifacts on the basis of the size being either too large (e.g., having diameter D1 in Fig. 9) or too small (e.g., having diameter D2 in Fig. 9).
[ 0074 ] In many cases frozen sample cores will be extracted from the frozen samples according to a pre-determined geometric pattern or a geometric pattern that can be recognized from the captured image data by the processor 114. The geometric pattern can vary depending on what objectives are to be achieved, such as maximizing the number of frozen sample cores that can be extracted from a frozen sample or taking as many frozen sample cores as possible at a particular radial position from the center. Fig. 10 illustrates an example of one geometric pattern in which five sample cores are extracted from a frozen sample. The bores resulting from this pattern are all spaced about the same distance D3 from the center of the container and the angles θ1 between the lines extending between corresponding points (e.g., the center) in adjacent bores are all about equal. The number of bores in the geometric pattern can vary within the scope of the invention. Although the pattern in Fig. 10 is a regular pattern, meaning the bores are all the same size, are all spaced the same distance from the center, and are all spaced at equal angles, it is understood that the pattern could be irregular within the scope of the invention.
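A regular pattern of the kind shown in Fig. 10 is fully determined by the number of bores, the common radial distance, and a starting orientation. As an illustrative sketch only (the start-at-top, clockwise convention and parameter names are assumptions), the expected bore positions can be generated as:

```python
import math

# Sketch of generating the expected regular coring pattern of Fig. 10:
# n bores, all at the same distance from the container centre, at equal
# angular spacing, starting at the top and proceeding clockwise.

def coring_pattern(n, radius, center=(0.0, 0.0)):
    """Return n (x, y) bore positions, clockwise from the 12 o'clock position."""
    cx, cy = center
    positions = []
    for i in range(n):
        angle = math.pi / 2 - i * (2 * math.pi / n)  # top first, then clockwise
        positions.append((cx + radius * math.cos(angle),
                          cy + radius * math.sin(angle)))
    return positions
```

Comparing detected bore candidates against such expected positions underlies the distance, angle, and sequence checks discussed next.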
[ 0075 ] As illustrated in Fig. 11, some bore candidates can be spaced too close to the center (e.g., see distance D4 in Fig. 11) or too far from the center of the container (e.g., see distance D5 in Fig. 11), or conversely, spaced too far from or too close to the edge of the container if the edge of the container can be detected, to fall within the geometric pattern. Likewise, as illustrated in Fig. 12 the angular spacing between one or more of the bore candidates can be different (either too high θ2 or too low θ3) from the expected angular spacing. Thus, the
processor 114 can determine certain bore candidates are likely to be artifacts on the basis that the distance between the bore candidate and a center axis of the container (or from an edge of the container) is not within expected limits and/or that the angle formed between lines extending between corresponding points in the two bore candidates (e.g., the center, as
illustrated, or an edge) and the center axis of the container is not within expected limits.
[0076] In many cases, frozen sample cores will be extracted from the frozen samples according to a specific orderly
sequence. As illustrated in Fig. 10, for example, the frozen sample cores are extracted starting at the top position and then moving clockwise around the geometric pattern until the last sample core has been taken. As illustrated in Fig. 13, in some cases one or more of the bore candidates may be out of order even though it could be positioned at a correct place within the geometric pattern. For example, there may be an empty gap 307 in the pattern between one of the bore candidates 305 and other bore candidates 301, 303 indicating that if all of the bore candidates are real bores, the result would be that the expected sequence was not followed. In this case, the processor 114 can determine a bore candidate is an artifact on the basis that it is out of order with a sequence according to which frozen sample cores are expected to be extracted from the frozen sample, particularly when multiple bore candidates 301, 303 follow the expected sequence and only one bore candidate 305 is out of sequence.
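The gap check of Fig. 13 reduces to a simple set comparison once candidates have been matched to pattern slots. The sketch below is an illustrative assumption about the data layout (slot indices rather than coordinates), not the disclosed implementation:

```python
# Sketch of the sequence check illustrated in Fig. 13: if coring is expected
# to fill pattern slots in order (0, 1, 2, ...), a candidate occupying a later
# slot while an earlier slot is empty is suspect.

def out_of_sequence(occupied_slots):
    """occupied_slots: set of pattern-slot indices where a candidate was found.
    Returns the slot indices that break the expected consecutive sequence."""
    if not occupied_slots:
        return set()
    expected = set(range(len(occupied_slots)))  # first k slots should be filled
    return occupied_slots - expected
```

For the situation of Fig. 13 (slots 0 and 1 occupied by candidates 301, 303 and slot 4 by candidate 305), the function would flag slot 4, mirroring the empty gap 307.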
[0077] In some cases, the number of bore candidates can be larger than is expected. The processor 114 is suitably
configured to recognize this as an indication of a higher likelihood that one or more of the bore candidates is an
artifact. The processor 114 can apply more rigorous standards to help exclude likely artifacts when the number of bore candidates is too high.
[ 0078 ] Sometimes the amount of contrast between a bore candidate and its surrounding area can help distinguish between real bores and artifacts. For example, a large contrast can be indicative of a very good candidate for a real bore whereas a smaller contrast might indicate one of a set of bore candidates that is questionable on other accounts (e.g., there are too many bore candidates, there are only two bore candidates and they do not follow the expected geometric pattern, etc.) is more likely than the other(s) to be an artifact.
[ 0079 ] One way the processor can evaluate the positions of the bore candidates is with reference to the position of the center axis of the container 105 holding the frozen sample or alternatively relative to a peripheral edge of the container. The processor 114 can be configured to identify the edge and/or the center of the container 105 in various ways within the scope of the invention. For example, one option is to use an edge finding algorithm to identify the inner or outer peripheral edge of the container 105 and then compute the geometric center of that edge to identify the center of the container. For example, Fig. 14 shows an image of a container 105 with an overlay including a pair of concentric circles 163, 165 and a plurality of radially extending scan lines 167 extending between the circles. The circles 163, 165 define an area to be scanned in an attempt to identify the edge of the container 105. The processor 114 is suitably configured to evaluate the image data to determine points 169 along each line where there is sharp contrast. Each point 169 potentially represents an intersection between the edge of the container 105 and the respective scan line 167. In the case of a successful attempt to identify the edge of a container, a significant number of the points 169 will lie on the same circle (or other shape if the containers do not have circular shapes) in which case the processor 114 concludes the points 169 lying thereon define the edge of the container 105. As used herein in the context of edge finding techniques and using information about the edge of the container to identify and/or evaluate bore candidates, it is understood the edge of the container can refer to the edge of a well or other discrete area within which one frozen sample is stored on a well plate or other container adapted for holding multiple different samples.
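The confirmation step — checking that the contrast points 169 lie on a common circle — can be sketched as follows. This is an illustrative simplification (centroid-based center estimate, median radius, and fixed tolerances are all assumptions); a production system might use a least-squares or robust circle fit instead.

```python
import math
import statistics

# Sketch of the edge-confirmation step described above: contrast points found
# along the radial scan lines should lie on a common circle. The centre is
# estimated as the centroid of the points, and the container edge is accepted
# only if most points sit near the median radius. Tolerances are assumptions.

def fit_container_edge(points, tol=0.5, min_fraction=0.8):
    """points: list of (x, y) contrast points. Returns ((cx, cy), radius) if a
    consistent circular edge is found, else None."""
    if len(points) < 3:
        return None
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    radii = [math.hypot(x - cx, y - cy) for x, y in points]
    r = statistics.median(radii)
    inliers = sum(1 for ri in radii if abs(ri - r) <= tol)
    if inliers / len(points) >= min_fraction:
        return (cx, cy), r
    return None
```

Points produced by frost blotches or debris tend to scatter off the common circle and are rejected by the inlier test, while a genuine container edge yields a tight radius distribution.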
[ 0080 ] Although edge detection techniques can work very well in certain circumstances, edge detection can be impaired when there is low contrast between the edge of the container and the background. This can present a problem when relying solely on edge detection in the context of a machine vision system for a frozen aliquotter because one of the most common colors for containers is white or semi-transparent and white frost can form on surfaces adjacent the containers, which leads to the
potential problem that there might be low contrast between the edge of the container and the surroundings in the image.
Ultraviolet or infrared lighting can help enhance the contrast between the edge of the container and the surroundings in the image. This enhanced contrast improves detection and
identification of the container edges. In one embodiment, a separate UV or IR light source can be positioned to illuminate the container. The UV or IR light source can be moveable (e.g., mounted on the end effector 111) or fixed (e.g., secured to or within the platform 103) within the scope of the invention. In one embodiment, the separate UV or IR light source can be positioned in the platform to provide indirect lighting (e.g., backlighting or side lighting) to the container to aid in edge detection. In another embodiment, any one of or combination of the lights 145, 181, 183, 185 can include a UV or IR light source.
[ 0081 ] As illustrated in Figs. 15 and 16, a pair of calibration marks 161 is suitably provided on the platform 103 at fixed positions relative to the source container station 107. The calibration marks 161 can be any structure or marking that has sufficient contrast with the background to allow the calibration marks to be reliably identified by the processor 114 from the image data. For example, the calibration marks 161 can suitably be dark openings, dark colored markings (e.g., dots), or other structures. The calibration marks 161 can suitably be or include heaters (e.g., small low-power resistance heaters) to limit accumulation of frost, which might obscure the calibration marks.
[ 0082 ] The processor 114 is suitably configured to use the calibration marks 161 (e.g., in combination with edge detection or without edge detection) to determine whether or not the one or more bore candidates are likely to be artifacts instead of real bores in the frozen sample. The calibration marks 161 are designed to ensure there is strong color contrast between the calibration marks and the surrounding objects in the image even if there is frost formation or other conditions that minimize contrast between the edge of the container 105 and its
surroundings in the image data. Because the calibration marks 161 are at fixed positions relative to the station 107 for receiving source containers 105, the processor 114 can determine the position of the bore candidates by comparing their positions to the positions of the calibration marks.
[ 0083 ] For example, the calibration marks 161 are suitably positioned to form a triangle with the center of the container. The angles α and β formed between a line connecting the
calibration marks and the respective lines connecting the calibration marks to the center can be known before the vision system 141 inspects a frozen sample. Accordingly, the processor 114 can be configured to identify the center axis of the container by triangulating the center from the calibration marks. A machine vision system 141 including a processor 114 that is configured to use calibration marks 161 to identify the center of a container is not sensitive to errors in the rotation of the camera or to errors in translation of the camera. In cases where it is not practical to use edge detection to determine the center of the container (e.g., because of low contrast) , the processor 114 can be configured to identify the edge of the container 105 and/or the center axis of the
container as a function of the position of the calibration marks without detecting any edges of the container. Alternatively, the processor 114 can be configured to both use the calibration marks and peripheral edge detection to identify the center of the container. The processor 114 can also be configured to compare the positions of the bore candidates directly to the positions of the calibration marks 161 to determine which bore candidates are likely to be artifacts, without computing the relative distances between the bore candidates and the center or edge of the container, without computing the center of the container, and/or without computing the edge of the container.
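The triangulation described above follows directly from the law of sines: knowing the two mark positions and the angles α and β between the mark-to-mark baseline and the mark-to-center lines fixes the third vertex of the triangle. The sketch below is illustrative only; which side of the baseline the center lies on is an assumption about the fixture layout.

```python
import math

# Sketch of triangulating the container centre from the two calibration marks:
# the angles alpha (at mark A) and beta (at mark B) between the A-B baseline
# and the mark-to-centre lines are known in advance from the fixture geometry.

def triangulate_center(mark_a, mark_b, alpha, beta):
    """mark_a, mark_b: (x, y) calibration-mark positions; alpha, beta in
    radians. Returns the (x, y) container centre."""
    ax, ay = mark_a
    bx, by = mark_b
    base = math.hypot(bx - ax, by - ay)
    # Law of sines in triangle (A, B, centre): AC / sin(beta) = AB / sin(gamma)
    gamma = math.pi - alpha - beta
    ac = base * math.sin(beta) / math.sin(gamma)
    base_angle = math.atan2(by - ay, bx - ax)
    theta = base_angle + alpha          # rotate the baseline by alpha at mark A
    return (ax + ac * math.cos(theta), ay + ac * math.sin(theta))
```

Because only the relative positions of the two marks enter the computation, the result is insensitive to rotation or translation of the camera, consistent with the robustness noted above.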
[ 0084 ] The processor 114 is configured to automatically select a suitable location in the frozen sample from which the robotic system 101 can take another frozen sample core (or in the case of a frozen sample from which no frozen sample cores have been taken yet, it is configured to automatically select the location from which the initial frozen sample core will be extracted) once the processor has determined from the image data whether or not there are any bores in a particular frozen sample and the locations of any such bores. This facilitates taking frozen sample cores from samples that may have already been subjected to previous extractions of frozen sample cores without requiring the processor 114 to have access to any information about the number of previous frozen sample cores that may have been extracted from the sample or the locations within the frozen sample from which any such sample cores have been taken. This eliminates the need for manual intervention to orient the containers 105 in a particular way and greatly reduces the amount of data on a sample that needs to be tracked to
successfully manage and process samples when extracting frozen sample cores from frozen samples to fill orders for sample aliquots. It also facilitates the ability to install the robotic system 101 in a bio-bank that has previously used one or more different approaches to sample core extraction (e.g., using a geometric pattern of sample cores that maximizes the number of samples that can be obtained vs. using a geometric pattern of sample cores that results in some or all of the samples being taken from a part of the frozen sample that is a particular radial distance from the center even if this reduces the maximum number of sample cores that can be extracted). Thus, if a bio-bank has previously employed one strategy or particular set of operating procedures for extracting frozen sample cores, the system 101 can still recognize bores in the frozen sample even if the bores are not where they would be expected to be if the previously extracted frozen sample cores had been extracted according to the protocols of the system 101 instead of whatever other protocols were previously in use.
[0085] For example, if the processor 114 detects one or more pre-existing bores in the frozen sample, the processor can be configured to select a location for the next frozen sample core that continues the geometric pattern that has already been started. Another option if it is desired that the next sample core be taken from a particular radial location in the frozen sample is that the processor 114 can be configured to select a location that is the desired radial distance from the center of the container and also sufficiently spaced from existing bores in the frozen sample. The processor 114 can be configured so a user can select which of these options is used for any
particular container or set of containers.
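The "continue the pattern" option described above can be sketched as matching detected bores to the nearest slots of the expected pattern and then choosing the first slot that remains empty. This is an illustrative simplification; the matching tolerance and data layout are assumptions, not the disclosed implementation.

```python
import math

# Sketch of the "continue the pattern" selection: match detected bores to the
# nearest slot of the expected coring pattern, then pick the first slot that
# is still empty as the next coring location.

def next_core_location(pattern, detected_bores, tol=0.5):
    """pattern: ordered list of (x, y) slot positions; detected_bores: (x, y)
    positions of recognised bores. Returns the first unoccupied slot, or None
    if the pattern is exhausted."""
    for sx, sy in pattern:
        occupied = any(math.hypot(sx - bx, sy - by) <= tol
                       for bx, by in detected_bores)
        if not occupied:
            return (sx, sy)
    return None
```

The alternative strategy mentioned above (a desired radial distance with sufficient clearance from existing bores) would replace the slot scan with a search over candidate positions on the target radius.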
[0086] The processor 114 is also configured to select an appropriate initial geometric pattern for the locations from which a plurality of frozen sample cores will be extracted if the processor determines there are no existing bores in the frozen sample. The processor 114 can be configured to select a geometric pattern that maximizes the number of frozen sample cores that can be taken from the frozen sample and/or the processor can be configured to select a geometric pattern that results in one or more frozen sample cores (e.g., all frozen sample cores) being taken from a particularly desired radial distance from the center of the container. The processor 114 can be configured to allow a user to select which of several different strategies will be used for planning the geometric pattern of the locations from which frozen sample cores are to be extracted for different containers or sets of containers. If desired, the processor 114 can be configured to display the geometric pattern selected by the processor and/or the
location(s) selected by the processor as the site(s) for frozen sample extraction(s) to facilitate confirmation and/or
intervention by a human operator.
[ 0087 ] It has been determined that the color of light emitted by the light 145 can be important. In general, better results are obtained when the light used to illuminate the frozen sample matches the color of the frozen sample. For example, the color of the light used to illuminate the frozen sample is suitably the same as the color of the sample or no more different from the color of the sample than one of the two adjacent colors on an RGB color wheel having six colors arranged in the following order extending around the wheel: red, yellow, green, cyan, blue, magenta, and then back to red. For example, a red light works well with red samples, orange samples, and yellow samples. Because there are large numbers of blood (red) and urine (yellow or orange) samples that have been frozen for research, it is anticipated that it can be desirable for the light 145 to emit red light for illuminating the frozen sample. It is also anticipated that it will be desirable in some cases for the light to emit green light or blue light. However, the light can emit any color light within the broad scope of the invention.
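The adjacency rule above (same color, or at most one step away on the six-color wheel) can be expressed compactly. The wheel order comes directly from the text; the helper function itself is an illustrative assumption.

```python
# Sketch of the colour-compatibility rule described above: on the six-colour
# RGB wheel (red, yellow, green, cyan, blue, magenta), the illumination colour
# should equal the sample colour or be at most one step away.

WHEEL = ["red", "yellow", "green", "cyan", "blue", "magenta"]

def light_suits_sample(light_color, sample_color):
    """True if the two colours are at most one step apart on the wheel."""
    i = WHEEL.index(light_color)
    j = WHEEL.index(sample_color)
    steps = min((i - j) % len(WHEEL), (j - i) % len(WHEEL))
    return steps <= 1
```

Under this rule a red light suits red and yellow (hence orange) samples, matching the blood and urine examples given above.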
[0088] In one embodiment, the light 145 includes red light emitting elements, blue light emitting elements, and green light emitting elements and the intensity of light emitted from the red, blue, and green light emitting elements is selectively adjustable to allow any of multiple different colors of light to be selected as the color of light that is used to illuminate the samples. In one embodiment, the light 145 includes red light emitting elements that emit red light having a wavelength in the range of about 620nm to about 750nm (about 400THz to about 484THz). For example, the majority of the light energy emitted by the light (e.g., substantially all of the light energy) is suitably within the range of about 620nm to about 750nm. The light sources can include LEDs that emit light concentrated in the range of about 620nm to about 750nm in wavelength. In another embodiment, the light 145 includes green light emitting elements that emit green light having a wavelength in the range of about 495nm to about 570nm (about 526THz to about 606THz). For example, the majority of the light energy emitted by the light (e.g., substantially all of the light energy) is suitably within the range of about 495nm to about 570nm. The light sources can include LEDs that emit light concentrated in the range of about 495nm to about 570nm in wavelength. The light sources 147 can include some light sources that emit only red light, other light sources that emit only green light, and other light sources that emit only blue light. Another possibility is that the light sources include multicolor LEDs, each of which is operable to emit red light, green light, blue light and
combinations thereof.
[0089] In the case that the color of light can be adjusted, the processor 114 can suitably be configured to receive input about the color of the samples in the containers and adjust the color of the light emitted by the light to reduce a difference between the color of the samples and the color of the light emitted by the light. For example, the vision system 141 can include a user interface configured to allow a user to input information about the color of the samples and the processor 114 can be configured to adjust the color of the light to match the color input by the user. Another option is that the camera 143 can be adapted to capture a color image of the sample (or a representative sample of a group of samples) and the processor 114 can be configured to adjust the color of the light to match the color of the sample in the captured color image. The processor suitably adjusts the color of the light used to illuminate the sample to white when capturing the image that will be used to determine the color of the sample to facilitate accurate color detection and then adjusts the color of the light to match the color of the sample. In some cases it may be known that an entire set of samples will be similar in color, in which case the processor can be configured to capture a color image of one of the samples to assess the color of all of the samples in that set and adjust the color once to match the color of all the samples in the set.
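One way the color-matching step could work is to average the pixels of an image captured under white illumination and snap the result to the nearest of the six wheel colors, which can then be commanded to the RGB light. The helper below is a hypothetical sketch; the patent does not specify the color-estimation algorithm, and the reference RGB values are assumptions.

```python
def dominant_wheel_color(pixels):
    """Estimate the sample color from an image captured under white
    illumination. `pixels` is a list of (r, g, b) tuples from the camera.
    Returns the nearest of the six wheel colors."""
    n = len(pixels)
    avg = tuple(sum(p[c] for p in pixels) / n for c in range(3))
    # Reference RGB values for the six wheel colors (illustrative).
    wheel = {"red": (255, 0, 0), "yellow": (255, 255, 0),
             "green": (0, 255, 0), "cyan": (0, 255, 255),
             "blue": (0, 0, 255), "magenta": (255, 0, 255)}
    def dist(ref):
        # Squared Euclidean distance in RGB space.
        return sum((a - b) ** 2 for a, b in zip(avg, ref))
    return min(wheel, key=lambda name: dist(wheel[name]))
```

For a blood (red) sample this would select red illumination; for urine (yellowish) it would select yellow.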
[0090] Although the vision system can be configured so the camera captures color images of the frozen samples and the processor uses information from the color images to identify where frozen sample cores have already been taken within the scope of the invention, surprisingly good results are obtained when the vision system is configured so the camera captures a monochromatic (e.g., grayscale) image of the frozen sample (even when the light illuminating the sample is other than white, e.g., selected to match the color of the sample) and the processor uses the grayscale image to determine whether or not frozen sample cores have already been taken from the frozen sample and, if so, to identify the locations from which the frozen sample cores have already been taken. Digital cameras are available that can capture both grayscale images and color images, so it is possible that the camera captures one or more color images (e.g., to identify the color of the sample so the light used to illuminate the sample can be adjusted to match the color of the sample) and also captures grayscale images for use by the processor to identify locations where bores exist within the frozen samples.
[0091] The machine vision system 141 is suitably a component of a calibration system configured to calibrate the robotic system 101. As illustrated in Figs. 1-3, the platform 103 has one or more fixed targets 171 positioned thereon. The camera 143 is mounted on the robotic system 101 so it can be moved to capture an image of each of the fixed targets 171. The processor is suitably configured to receive image data from the camera indicative of images of the target(s) formed by the camera and calibrate the robotic system 101 using an image of the one or more fixed targets 171 on the platform 103. As illustrated in Fig. 1, at least one of the targets 173 has an image (e.g., a cross hair) having a point or intersection of lines for calibration in the x and y directions and a shape (e.g., circle) having a known size for calibration in the z direction. The calibration system suitably has a user interface (not shown) configured to allow a user to guide the camera from a position that is not in registration with one of the targets (e.g., so a reticule overlaying the captured image is not aligned with the cross hair) toward a position that is in registration with said target (e.g., so the reticule is aligned with the cross hair).
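For the z direction, a shape of known size supports a simple pinhole-model distance estimate: the apparent diameter in pixels scales inversely with distance from the camera. A sketch, assuming the camera's focal length is known in pixel units (function names are illustrative):

```python
def z_distance_mm(focal_length_px, target_diameter_mm, measured_diameter_px):
    """Estimate camera-to-target distance from the apparent size of the
    calibration circle (pinhole model: d_px = f_px * D_mm / Z_mm)."""
    return focal_length_px * target_diameter_mm / measured_diameter_px

def z_correction_mm(focal_length_px, target_diameter_mm,
                    measured_diameter_px, desired_distance_mm):
    """Signed amount to raise (+) or lower (-) the camera so the circle
    appears at the size corresponding to the desired working distance."""
    return z_distance_mm(focal_length_px, target_diameter_mm,
                         measured_diameter_px) - desired_distance_mm
```

A circle measured at 100 px with a 1000 px focal length and a 10 mm target diameter implies the camera is 100 mm from the target.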
[0092] Also, as illustrated in Figs. 1-3, one of the targets 171 (e.g., the target 173 having the cross hair and circle) is secured to an upper surface of the work deck outside the recessed area 115 for receiving the trays. Further, one of the targets 175 is suitably secured to a bottom of the recessed area (e.g., between the trays 117 and adjacent the stations 107, 109, as illustrated in Fig. 3) for receiving the containers 105. In the illustrated embodiment, the target 175 in the recessed area 115 also includes a cross hair.
[0093] Another of the targets in the illustrated embodiment is suitably a density step target 177 (see Fig. 17) having an image including one or more series of blocks arranged from lighter shades to darker shades for use in calibrating light/dark settings for the camera 143.
[ 0094 ] The processor 114 is suitably configured to use multiple additional features on the platform 103 as targets to help calibrate the robotic system 101. For example, the
processor 114 is suitably configured to calibrate the robotic system 101 using images of multiple features on the platform 103 selected from the group consisting of:
the station 107 for receiving a container 105 from which a frozen sample core is to be taken;
the station 109 for receiving a container 105 in which a frozen sample core is to be deposited;
the station 119 for cleaning a coring probe 121 of the robotic system 101;
one or more sample trays 117 on the platform 103 for holding the containers 105; and
combinations thereof.
[0095] For example, Fig. 3 illustrates 13 calibration points that can be used according to one particular embodiment of the calibration system, with each of the calibration points being labeled consecutively from 201-213. Point 201 corresponds to the target 175 on the platform 103 in the recessed area 115. Points 201, 202, and 203 correspond to the stations 107, 109 for receiving the containers 105 and the station 119 for washing the coring probe 121, respectively. Points 204-208 correspond to various points (e.g., points at the corners) of one of the trays 117a and points 209-212 correspond to various points (e.g., points at the corners) of another of the trays 117b. Point 213 corresponds to the target 173 on the deck of the platform 103. The points included in the calibration are suitably selected so they collectively extend over at least a substantial portion of the operating envelope, but the specific points used in the calibration process can vary within the scope of the invention.
[0096] The calibration system is suitably configured to complete calibration of the robotic system 101 without any physical contact between (i) the end effector 111 or any components moveable with the end effector and (ii) the platform 103 or components on the platform.
[0097] The calibration system is also suitably configured to determine the positions of the camera 143, coring probe 121, and gripper 127 relative to one another to compensate for variations in the positional offsets associated with the camera, probe, and gripper. For example, the calibration system is suitably configured so a user can direct movement of the end effector 111 to bring each of the camera 143, coring probe 121, and gripping system 127 into registration with a target 171 or other reference point and provide an indication to the processor 114 each time one of them is in registration therewith. This allows the processor 114 to compute offsets between these components that account for positional variations that may result during assembly of the robotic system 101 or for any other reason. This facilitates more accurate positioning of the components of the robotic system 101.
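The offset computation described above can be sketched as simple vector differences between the end-effector poses recorded when the camera, probe, and gripper were each registered to the same target. The names and the (x, y, z) pose representation below are illustrative assumptions.

```python
def tool_offsets(camera_pose, probe_pose, gripper_pose):
    """Compute (dx, dy, dz) offsets of the coring probe and gripper
    relative to the camera, from the end-effector poses recorded when
    each component was registered to the same target."""
    def delta(a, b):
        return tuple(bi - ai for ai, bi in zip(a, b))
    return {"probe": delta(camera_pose, probe_pose),
            "gripper": delta(camera_pose, gripper_pose)}

def camera_to_probe(pose_when_camera_over_point, offsets):
    """Convert the end-effector pose at which the camera sees a point
    into the pose that places the coring probe over the same point."""
    return tuple(p + o for p, o in zip(pose_when_camera_over_point,
                                       offsets["probe"]))
```

With the offsets known, a bore located in a camera image can be targeted by the probe without further user involvement.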
[0098] During initial installation of the robotic system 101, and from time to time thereafter as may be needed, the machine vision system 141 is suitably used to calibrate the robotic system. The robotic drive system 113 moves the camera 143 into a position that is estimated by the processor to be in registration with one of the targets 171. Then, image data from the camera 143 is used (either by the processor or a user) to instruct the robotic drive system 113 to adjust the position of the camera until it is in registration with the target. If the target 171 includes a circle or other shape having a known size for calibration in the z-axis, the image data from the camera 143 is used to instruct the robotic drive system (either by the processor or a user) to raise or lower the camera until the size of the shape in the image captured by the camera indicates the camera is the proper distance from the target in the z-direction. Calibration in the z-direction could instead be achieved using a lens setting for the camera 143 having a known focal length and then adjusting the height of the camera until the image is in focus. When the camera 143 is in registration with the target 171 and the correct distance from the target, the processor records positional information from the robotic system 101 (e.g., data from encoders and other devices that provide positional feedback about the position of various components of the robotic system) and designates that information as the set point for the respective target. The process is repeated for each of the targets 171. For example, in the embodiment illustrated in Fig. 3 the process is repeated for each of the calibration points 201-213.
[ 0099 ] Although the targets 171 and/or calibration points can be positioned at various locations on the platform 103, in the illustrated embodiment, the targets suitably include one target 173 on the upper surface of the deck of the platform and another target 175 in the recessed area of the platform. The targets 171/calibration points suitably include multiple targets including at least one of:
the station 107 for receiving the container 105 from which a frozen sample core is to be taken;
the station 109 for receiving the container 105 in which a frozen sample core is to be deposited;
the station 119 for cleaning the coring probe 121;
one or more trays 117a, 117b on the platform 103 for holding the containers 105; and combinations thereof.
[00100] For example, in one embodiment the calibration targets 171 and calibration points include each of points 201-213 on Fig. 3.
[00101] The density step target 177 is also used during the calibration process to adjust camera settings and light intensity to provide standard image capturing conditions. The light 145 is turned on and the diaphragm of the camera 143 and/or intensity of electric current supplied to the light are adjusted while the camera captures images of the density step target 177 until a particular shaded block on the density step target is read as a certain gray level by the camera 143. For example, good results have been obtained when the light 145 and camera 143 are adjusted so the third lightest color block on a standard density step target is read by the camera as a gray level of 200.
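The adjust-until-target-gray-level step might be implemented as a feedback search over light intensity, assuming the gray-level reading increases monotonically with intensity. `read_gray_level` and `set_intensity` below are stand-ins for camera and light-driver interfaces that the text does not specify, and the numeric defaults are illustrative.

```python
def calibrate_exposure(read_gray_level, set_intensity,
                       target_level=200, lo=0.0, hi=1.0, steps=20):
    """Binary-search the light intensity (0..1) until the reference block
    on the density step target reads close to the target gray level."""
    for _ in range(steps):
        mid = (lo + hi) / 2
        set_intensity(mid)
        level = read_gray_level()
        if level < target_level:
            lo = mid          # too dark: raise intensity
        else:
            hi = mid          # too bright: lower intensity
    return (lo + hi) / 2
```

The same loop could drive the camera diaphragm instead of the lamp current; only the `set_intensity` callback changes.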
[00102] The robotic system 101 is also calibrated to adjust for any variations in the offset between the positions of the camera 143, the coring probe 121, and the gripping system 127. For example, the camera 143 is first positioned in registration with one of the targets 171 or other reference point on the platform 103, at which point a user provides an indication to the processor 114 that the camera is in registration therewith. Then a user adjusts the position of the end effector 111 until the coring probe 121 is in registration with the target 171 or reference point and provides an indication to the processor 114 that the coring probe is in registration therewith. Finally, the user adjusts the position of the end effector until the gripper system 127 is in registration with the target 171 or other reference point and provides an indication to the processor that the gripper system is in registration therewith. The order of the steps in this method is not important. The processor 114 determines the positional offset between the camera 143, coring probe 121, and gripper assembly 127 using the information provided in this process. The entire calibration process is suitably completed without requiring any contact between the end effector 111 or any components moveable with the end effector and the platform 103 or any components on the platform.
[ 00103 ] To extract frozen sample cores from the frozen samples, a set of containers 105 containing frozen samples is placed on the platform 103. For example, one or more trays 117a can be loaded with sample containers 105 and then placed on the platform 103 (e.g., in the recessed area 115) . A set of empty containers 105 for receiving frozen sample cores after they are extracted is loaded into one or more additional trays 117b and placed on the platform 103. The robotic system 101 uses the gripper system 127 to move one of the containers 105 containing a frozen sample to the station 107 for receiving containers from which frozen sample cores are being extracted and moves one of the empty containers to the station 109 for receiving
destination containers into which the frozen sample cores are to be deposited.
[00104] Then the robotic system moves the camera 143 into position over the station 107 for holding the containers 105 containing frozen samples while frozen sample cores are extracted from them. The robotic system suitably includes a fill level detection system for detecting the level of an upper surface of the frozen sample. Details concerning the construction and operation of a suitable fill level detection system are provided in U.S. Application No. 13/359,301, entitled Robotic End
Effector for Frozen Aliquotter and Methods of Taking a Frozen Aliquot from Biological Samples, filed January 26, 2012, the contents of which are hereby incorporated by reference. The fill level detection system provides information about the position of the upper surface of the frozen sample. The fill level detection system can be used to position the camera 143 at a desired level above the frozen sample to improve focusing of the camera. For example, the processor 114 uses the information from the fill level detection system about the position of the upper surface of the frozen sample to determine the elevation at which to position the camera for obtaining an image of the frozen sample taken while the camera is within an optimal range of distances from the upper surface of the sample. The light 145 is used to illuminate the container 105 at the station 107 and the frozen sample contained therein. If the machine vision system 141 includes the option of adjusting the color of the light 145, the color of the frozen sample is determined (e.g., using image data from the camera and/or user input) and the color of the light is adjusted to match the color of the frozen sample, as described above. For example, if the frozen sample is red, orange, or yellow, the light 145 can be adjusted to emit red light. Likewise, if additional lighting options are used, additional images of the container 105 are captured with one or more of the lights 181, 183, 185 providing illumination.
[ 00105 ] The camera 143 captures one or more raw images of the illuminated container 105 and frozen sample. The raw image is suitably processed to facilitate recognition of bore
candidates. For example, a thresholding filter is suitably applied to the raw image obtained with illumination from light 145. A morphological filter is also applied to the image. After the image has been filtered a particle analysis imaging
algorithm is performed to recognize any bore candidates. The processor then uses the image data to evaluate whether or not any bore candidates are actual bores or just artifacts in order to determine whether or not any frozen sample cores have already been taken from the frozen sample and, if so, to identify the location(s) from which they were taken.
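The threshold / morphological-filter / particle-analysis pipeline described above can be sketched in pure Python as follows. The threshold and minimum-area values are illustrative, and the cleanup step is only a rough stand-in for a real morphological filter such as an opening.

```python
def find_bore_candidates(img, threshold=60, min_area=4):
    """Threshold a grayscale image (bores image darker than the
    surrounding frozen sample), remove isolated dark pixels, then run a
    particle (connected-component) analysis. `img` is a list of rows of
    gray levels; returns a list of (area, (cx, cy)) per candidate."""
    h, w = len(img), len(img[0])
    mask = [[img[y][x] < threshold for x in range(w)] for y in range(h)]
    def neighbors(y, x):
        return [(y + dy, x + dx) for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1))
                if 0 <= y + dy < h and 0 <= x + dx < w]
    # Erosion-like cleanup: drop dark pixels with no dark 4-neighbor.
    cleaned = [[mask[y][x] and any(mask[ny][nx] for ny, nx in neighbors(y, x))
                for x in range(w)] for y in range(h)]
    # Particle analysis: label connected dark regions with a flood fill.
    seen = [[False] * w for _ in range(h)]
    particles = []
    for y in range(h):
        for x in range(w):
            if cleaned[y][x] and not seen[y][x]:
                stack, blob = [(y, x)], []
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    blob.append((cy, cx))
                    for ny, nx in neighbors(cy, cx):
                        if cleaned[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if len(blob) >= min_area:       # reject tiny specks
                    cx_mean = sum(p[1] for p in blob) / len(blob)
                    cy_mean = sum(p[0] for p in blob) / len(blob)
                    particles.append((len(blob), (cx_mean, cy_mean)))
    return particles
```

The (area, centroid) pairs returned here correspond to the bore candidates that the processor would then accept or reject as artifacts.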
[ 00106] If the option of using calibration marks 161 to evaluate the positions of the bore candidates is used, the method suitably includes heating the calibration marks using the low-power resistance heaters to ensure the calibration marks are not obscured by frost. [ 00107 ] Once the processor 114 has identified the bore candidates and determined which bore candidates are likely to be artifacts and which are likely to be real bores, the processor automatically selects a position from which a frozen sample core will be extracted, accounting for the position of pre-existing bores in the frozen sample if there are any. Then the processor 114 instructs the robotic system 101 to move the coring probe 121 into position over the selected location and instructs the sample extraction system to extract a frozen sample core from the location. Consequently the coring probe 121 is extended into the frozen sample (e.g., while rotating if the sample extraction system 123 uses a drilling action) and then withdrawn from the frozen sample with the frozen sample core contained therein. The robotic system 101 moves the coring probe 121 into position over the top of the container 105 at the station 109 for holding the destination containers and ejects the frozen sample core from the coring probe into the destination container. If more than one frozen sample core is needed to provide enough material for the aliquot that has been ordered, the frozen sample core extraction process is repeated at another suitable location within the frozen sample, as automatically determined by the processor, until a sufficient amount of sample material has been deposited in the destination container 105.
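Site selection that accounts for pre-existing bores can be as simple as rejecting any planned location that falls too close to a detected bore center. A hedged sketch of such a selection step (the spacing rule, clearance value, and names are assumptions, not details from the patent):

```python
import math

def select_core_location(candidates, existing_bores, core_diameter_mm,
                         clearance_mm=0.5):
    """Pick the first planned (x, y) site far enough from every
    pre-existing bore center found by the vision system.

    candidates     -- planned sites, e.g. from a ring pattern
    existing_bores -- (x, y) centers of bores already in the sample
    """
    min_spacing = core_diameter_mm + clearance_mm
    for x, y in candidates:
        if all(math.hypot(x - bx, y - by) >= min_spacing
               for bx, by in existing_bores):
            return (x, y)
    return None   # container exhausted: no clear site remains
```

Repeating this call after each extraction (appending the new bore to `existing_bores`) mirrors the repeat-until-sufficient-material loop described above.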
[ 00108 ] After a sufficient amount of sample material has been deposited in the destination container 105, the frozen parent sample is further processed to clear away any frost crystals or other debris in each of the bores in the frozen sample to ensure better accuracy and reliability in the machine vision system 141 when the sample is retrieved again later from frozen storage to obtain additional frozen sample cores. As described above, the bores may contain or be obscured by frost crystals that have grown on the sample (e.g., while the sample was in frozen storage), by debris (e.g., from previously drilling frozen sample cores), and/or for other reasons. To improve the ability to recognize bores in the frozen parent sample next time it is retrieved from frozen storage, the processor 114 suitably uses the image data that was obtained from the sample before the most recent frozen sample core was extracted (i.e., the image data used to evaluate bore
candidates, the determination whether or not any frozen sample cores had already been taken from the frozen sample, and if frozen sample cores had already been taken, the location (s) from which they were taken) . The processor 114 suitably also uses data obtained during the extraction of the most recent frozen sample core (e.g., the geometric sampling pattern used to obtain the most recent frozen sample core(s), information about the location from which the most recent frozen sample cores were taken) . Using this image data about the location (s) of any bores before the most recent extraction and the data about the location (s) of any bores made during the most recent extraction, the processor 114 determines the location (s) of every bore and suspected bore in the frozen sample. Once the processor 114 has identified all the bore(s) in the frozen sample, the processor instructs the robotic system to move the coring probe 121 into position over each bore in turn to reprocess or clean the bore of any debris. The processor 114 may or may not have information about whether a bore has any debris obstructing it or contained therein, and therefore instructs the robotic system to move the coring probe 121 into position over each bore in turn for lowering into the bore regardless of whether any debris is identified in the bore. In one embodiment, the processor 114 instructs the robotic system to reprocess or clean any bore(s) identified using the image data that was obtained before the most recent frozen sample core was extracted, and then
subsequently instructs the robotic system to reprocess or clean any bore(s) made during the most recent frozen sample core extraction. In another embodiment, the processor 114 instructs the robotic system to reprocess or clean any bore(s) made during the most recent frozen sample core extraction, and then
subsequently instructs the robotic system to reprocess or clean any bore(s) identified using the image data that was obtained before the most recent frozen sample core was extracted. In another embodiment, the processor 114 instructs the robotic system to reprocess or clean only any bore(s) identified using the image data that was obtained before the most recent frozen sample core was extracted. In another embodiment, the processor 114 instructs the robotic system to reprocess or clean only any bore(s) made during the most recent frozen sample core
extraction. In yet another embodiment, the processor 114 instructs the robotic system to reprocess or clean any bore(s) identified using the image data that was obtained before the most recent frozen sample core was extracted prior to
instructing the robotic system to extract the most recent frozen sample core.
[ 00109 ] Any order or combination of cleaning processes is within the broad scope of the present invention. As illustrated in Figs. 18 and 19, for example, the processor 114 may direct an ejector pin 190 of the end effector 111 to move to an extended position in which the ejector pin extends from a distal end of the coring probe 121. The coring probe 121 is positioned over an identified bore 192 (Fig. 18) and then lowered into the bore to clear the bore of any debris (Fig. 19) . The coring probe 121 is lowered into the identified bore 192 whether or not there is debris in the bore. If the ejector pin 190 encounters resistance (e.g., the identified bore is actually an artifact and not a real bore), this resistance is detected (e.g., using the components of the fill level detection system) and the downward motion of the coring probe 121 and ejector pin is stopped to prevent damage to the frozen sample and to the robotic system. The machine vision system 141 can include sensors as described in U.S. Application No. 13/359,301, filed January 26, 2012, to determine whether or not the ejector pin 190 is being lowered into a bore instead of being lowered into contact with frozen sample. As the ejector pin 190 enters the open end of the bore, any frost, debris or other similar objects that may be
obstructing the view of the bore are knocked away from the open end of the bore, either by being knocked into the bottom of the bore or by being pushed aside. Clearing the bore(s) of debris and frost makes it easier for the machine vision system 141 to correctly identify bores in a frozen sample next time it is retrieved from frozen storage when additional frozen sample cores are required. Because the coring probe and ejector pin 190 are already required to contact the sample to complete other parts of the process, there is substantially no added risk of contaminating the sample by using the coring probe and ejector pin to clear the debris away from the open ends of the bores.
[ 00110 ] The following discussion is intended to provide a brief, general description of a suitable processing environment in which aspects of the invention may be implemented. Although not required, aspects of the invention will be described in the general context of computer-executable instructions, such as program modules, being executed by computers or processors in network environments. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Computer-executable instructions, associated data structures, and program modules represent examples of the program code means for executing steps of the methods disclosed herein. The particular sequence of such executable instructions or associated data structures represent examples of
corresponding acts for implementing the functions described in such steps.
[ 00111 ] Those skilled in the art will appreciate that aspects of the invention may be practiced in network computing environments with many types of computer system configurations, including personal computers, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer
electronics, network PCs, minicomputers, mainframe computers, and the like. Aspects of the invention may also be practiced in distributed computing environments where tasks are performed by local and remote processing devices that are linked (either by hardwired links, wireless links, or by a combination of
hardwired or wireless links) through a communications network. In a distributed computing or processing environment, program modules may be located in both local and remote memory storage devices.
[ 00112 ] An exemplary system for implementing aspects of the invention includes a general purpose computing device in the form of a conventional computer, including a processing unit, a system memory, and a system bus that couples various system components including the system memory to the processing unit. The system bus may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. The system memory includes read only memory (ROM) and random access memory (RAM) . A basic input/output system (BIOS), containing the basic routines that help transfer information between elements within the computer, such as during start-up, may be stored in ROM.
[ 00113 ] The computer may also include a magnetic hard disk drive for reading from and writing to a magnetic hard disk, a magnetic disk drive for reading from or writing to a removable magnetic disk, and an optical disk drive for reading from or writing to removable optical disk such as a CD-ROM or other optical media. The magnetic hard disk drive, magnetic disk drive, and optical disk drive are connected to the system bus by a hard disk drive interface, a magnetic disk drive-interface, and an optical drive interface, respectively. The drives and their associated computer-readable media provide nonvolatile storage of computer-executable instructions, data structures, program modules, and other data for the computer. Although the exemplary environment described herein employs a magnetic hard disk, a removable magnetic disk, and a removable optical disk, other types of computer readable media for storing data can be used, including magnetic cassettes, flash memory cards, digital video disks, Bernoulli cartridges, RAMs, ROMs, and the like.
[ 00114 ] Program code means comprising one or more program modules may be stored on the hard disk, magnetic disk, optical disk, ROM, and/or RAM, including an operating system, one or more application programs, other program modules, and program data. A user may enter commands and information into the computer through keyboard, pointing device, or other input devices, such as a microphone, joy stick, game pad, satellite dish, scanner, or the like. These and other input devices are often connected to the processing unit through a serial port interface coupled to system bus. Alternatively, the input devices may be connected by other interfaces, such as a parallel port, a game port, or a universal serial bus (USB) . A monitor or another display device is also connected to system bus via an interface, such as video adapter. In addition to the monitor, personal computers typically include other peripheral output devices (not shown), such as speakers and printers.
[00115] The computer may operate in a networked environment using logical connections to one or more remote computers. The remote computers may each be another personal computer, a server, a router, a network PC, a peer device or other common network node, and typically include many or all of the elements described above relative to the computer. The logical connections include a local area network (LAN) and a wide area network (WAN) that are presented here by way of example and not limitation. Such networking environments are commonplace in office-wide or enterprise-wide computer networks, intranets and the Internet. [00116] When used in a LAN networking environment, the computer is connected to the local network through a network interface or adapter. When used in a WAN networking
environment, the computer may include a modem, a wireless link, or other means for establishing communications over the wide area network, such as the Internet. The modem, which may be internal or external, is connected to the system bus via the serial port interface. In a networked environment, program modules depicted relative to the computer, or portions thereof, may be stored in the remote memory storage device. It will be appreciated that the network connections shown are exemplary and other means of establishing communications over the wide area network may be used.
[ 00117 ] Embodiments within the scope of the present
invention also include computer-readable media for carrying or having computer-executable instructions or data structures stored thereon. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer. By way of example, and not
limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage, or other magnetic storage devices, or any other medium that can be used to carry or store desired program code means in the form of computer-executable instructions or data structures and that can be accessed by a general purpose or special purpose computer. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a computer, the computer properly views the connection as a computer-readable medium. Thus, any such connection is properly termed a computer-readable medium. Combinations of the above should also be included within the scope of computer-readable media. Computer-executable instructions comprise, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose
processing device to perform a certain function or group of functions .
[ 00118 ] In one mode of operation, a frozen sample that is contained in a container 105 is positioned (e.g., by the robotic system) at the station 107 on the platform 103 having
calibration marks 161 in fixed positions relative to the
station. The camera 143 captures an image of the container 105 and the sample therein while the container 105 is received at the station 107. The processor 114 determines one or more locations where a frozen sample core has already been taken from the frozen sample contained in the container 105 by: (a) evaluating contrast in the image to identify one or more bore candidates in the frozen sample; and (b) using information about the position of the calibration marks relative to the bore candidates to determine whether or not the one or more
candidates are likely to be artifacts instead of real bores in the frozen sample. The frozen sample core is taken from the sample at a location from which no frozen sample core has already been taken, as determined by the processor.
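The contrast-evaluation step (a) of this mode can be pictured with a short sketch. The Python below is illustrative only and not part of the disclosure; the names (`find_bore_candidates`, `DARK_THRESHOLD`) and the flood-fill labeling are assumptions showing one plausible way dark bore candidates could be segmented from a grayscale image.

```python
DARK_THRESHOLD = 60  # assumed cutoff: pixel intensities below this (0-255 scale) count as "dark"

def find_bore_candidates(image):
    """Label connected dark regions in a 2-D grayscale image (list of rows)
    and return each region's centroid and pixel area as a bore candidate."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    candidates = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] < DARK_THRESHOLD and not seen[r][c]:
                # flood-fill one connected dark region
                stack, pixels = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and not seen[ny][nx]
                                and image[ny][nx] < DARK_THRESHOLD):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                candidates.append({"centroid": (cy, cx), "area": len(pixels)})
    return candidates

# Toy 6x8 image: bright background (200) with one dark 2x2 "bore" at rows 1-2, cols 2-3
img = [[200] * 8 for _ in range(6)]
for r in (1, 2):
    for c in (2, 3):
        img[r][c] = 30
print(find_bore_candidates(img))
```

Step (b) would then test each returned centroid against coordinates derived from the calibration marks.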
[ 00119 ] In another mode of operation, the camera captures an image of a container 105 containing a frozen sample. The
processor 114 uses the captured image to determine one or more locations where a frozen sample core has already been taken from the frozen sample contained in the container by: (a) evaluating contrast in the image to identify one or more bore candidates; and (b) determining whether or not the one or more bore
candidates are likely to be artifacts instead of real bores in the frozen sample. To make this determination, the processor 114 uses information including at least one of the following:
the size of the bore candidate;
the distance between the bore candidate and a center axis of the container 105; the angle formed between a first line and a second line, the first line extending between the bore and the center axis of the container and the second line extending between the center axis of the container and another bore
candidate;
the relation between the position of the one or more bore candidates and an expected pattern of bores in the frozen sample;
the location of the one or more bore candidates relative to a peripheral edge of the container;
the number of bore candidates identified;
the amount of contrast between the bore candidates and the surrounding areas; and
combinations thereof.
[ 00120 ] The system takes a frozen sample core from the frozen sample at a location from which no frozen sample core has already been taken, as determined by the processor.
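By way of illustration only, criteria such as (i) size, (ii) distance to the center axis, and (v) proximity to the peripheral edge can be combined into a simple rejection test. This Python sketch is hypothetical; the thresholds (`min_area`, `max_area`, `edge_margin`) and the candidate representation are assumptions, not values from the disclosure.

```python
import math

def is_likely_artifact(candidate, center, container_radius,
                       min_area=5, max_area=400, edge_margin=10.0):
    """Reject a bore candidate as a probable artifact using three of the
    listed criteria: its size, its distance from the container's center
    axis, and its proximity to the peripheral edge of the container."""
    if not (min_area <= candidate["area"] <= max_area):
        return True  # size criterion: too small (noise) or too large (glare)
    cy, cx = candidate["centroid"]
    dist = math.hypot(cy - center[0], cx - center[1])
    if dist > container_radius - edge_margin:
        return True  # edge criterion: a real core is not cut at the rim
    return False

# A plausible bore near mid-radius passes; a tiny speck or a rim blob would not.
center, radius = (100.0, 100.0), 80.0
print(is_likely_artifact({"centroid": (100.0, 130.0), "area": 50}, center, radius))
```

A production system could weigh several such criteria together (e.g., the expected bore pattern or inter-bore angles) rather than applying each as a hard veto.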
[ 00121 ] In yet another mode of operation, the robotic system 101 is calibrated by using the camera 143 to capture an image of one or more fixed targets 171 on the platform 103. The processor 114 uses an image of the one or more targets 171 to calibrate the robotic system. Then the same camera 143 is used to capture an image of one or more containers 105 while the containers are supported by the platform to determine whether or not one or more frozen sample cores have already been taken from the frozen sample.
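One way to picture this target-based calibration is as fitting a mapping from camera pixels to platform coordinates using the known positions of the fixed targets. The sketch below is an assumption, not the patented method: it recovers a similarity transform (scale, rotation, and translation) from two target correspondences by treating 2-D points as complex numbers.

```python
def fit_similarity(pix1, world1, pix2, world2):
    """Fit z_world = a * z_pix + b from two point correspondences.
    The complex coefficient a encodes scale and rotation; b is translation."""
    zp1, zp2 = complex(*pix1), complex(*pix2)
    zw1, zw2 = complex(*world1), complex(*world2)
    a = (zw2 - zw1) / (zp2 - zp1)
    b = zw1 - a * zp1
    return a, b

def pix_to_world(pix, a, b):
    """Map a pixel coordinate into platform (world) coordinates."""
    z = a * complex(*pix) + b
    return (z.real, z.imag)

# Hypothetical targets: 200 px apart in the image, known to be 20 mm apart on the platform
a, b = fit_similarity((100, 100), (0.0, 0.0), (300, 100), (20.0, 0.0))
```

With the transform in hand, any feature the camera later images (a container rim, a bore centroid) can be expressed in platform coordinates for the robot to act on.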
[ 00122 ] In still another mode of operation, the robotic system 101 is operated to move the camera 143 relative to a first one of the containers 105 so the camera is directed at the frozen sample in the first container. The frozen sample in the container 105 is illuminated using the ring light 145. The camera 143 is used to capture an image of the illuminated frozen sample. The processor 114 evaluates contrast in the captured image and processes the image to identify one or more bore candidates in the captured image. The processor 114 determines whether or not the bore candidates are likely to be artifacts or real bores in the frozen sample. The robotic system 101 moves the camera relative to a second of the containers 105 so the camera is directed at the frozen sample in the second container and the process is repeated.
[ 00123 ] In yet another mode of operation, the robotic system 101 moves the camera 143 relative to a first one of the
containers 105 so the camera is directed at the frozen sample in the first container. The frozen sample in the container 105 is illuminated with a colored light. The camera 143 captures a grayscale image of the illuminated frozen sample. The processor 114 evaluates contrast in the captured image and processes the image to identify one or more bore candidates in the captured image. The processor 114 determines whether or not the bore candidates are likely to be artifacts or real bores in the frozen sample. The robotic system 101 moves the camera 143 relative to a second of the containers 105 so the camera is directed at the frozen sample in the second container. The process is repeated.
[ 00124 ] In another mode of operation, the robotic system 101 moves the camera 143 relative to a first one of the containers 105 so the camera is directed at the frozen sample in the first container. The frozen sample in the container 105 is illuminated with a light 145 that has a color selected to match the color of the frozen sample. The camera 143 captures an image of the illuminated frozen sample. The processor 114 evaluates contrast in the captured image and processes the image to identify one or more bore candidates in the captured image. The processor 114 determines whether or not the bore candidates are likely to be artifacts or real bores in the frozen sample. The robotic system moves the camera 143 relative to a second of the containers 105 so the camera is directed at the frozen sample in the second container and the process is repeated. [ 00125 ] In another mode of operation, the robotic system 101 positions one of the containers 105 on the platform 103 at a station 107 for receiving the container while a frozen sample core is extracted from the frozen sample contained in the container. One or more of the lights 181, 183, 185 provides at least one of back lighting and side lighting for the container 105. In another embodiment, one or more lights can provide direct lighting to the container 105. The camera 143 captures an image of the frozen sample while it is directly or indirectly (e.g., sidelit and/or backlit) lit. The processor 114 evaluates contrast in the captured image and processes the image to identify one or more bore candidates in the captured image.
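The benefit of matching the light color to the sample color can be seen with a toy reflectance model; this is an illustration with assumed reflectance values, not the disclosed optics. Under light of the sample's own hue, the intact surface reflects strongly while a bore shadow does not, so the grayscale contrast between them grows.

```python
def grayscale_response(reflectance_rgb, light_rgb):
    """Toy monochrome sensor: mean of per-channel reflectance x illumination
    products, with all values on a 0..1 scale."""
    return sum(r * l for r, l in zip(reflectance_rgb, light_rgb)) / 3.0

# Assumed reflectances for a reddish frozen sample and for a shadowed bore
surface = (0.9, 0.1, 0.1)
bore = (0.2, 0.05, 0.05)

matched = (1.0, 0.0, 0.0)     # red ring light, matching the sample color
mismatched = (0.0, 0.0, 1.0)  # blue light, for comparison

contrast_matched = (grayscale_response(surface, matched)
                    - grayscale_response(bore, matched))
contrast_mismatched = (grayscale_response(surface, mismatched)
                       - grayscale_response(bore, mismatched))
```

In this model the matched (red) illumination yields a markedly larger surface-to-bore intensity difference than the mismatched (blue) illumination, which is what makes the bores easier to segment by contrast.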
[ 00126] The modes of operation described above can be used in combination or they can be used separately within the scope of the invention.
[ 00127 ] When introducing elements of the present invention or the preferred embodiments thereof, the articles "a", "an", "the", and "said" are intended to mean that there are one or more of the elements. The terms "comprising", "including", and "having" are intended to be inclusive and mean that there may be additional elements other than the listed elements.
[ 00128 ] In view of the foregoing, it will be seen that the several objects of the invention are achieved and other
advantageous results attained.
[ 00129 ] As various changes could be made in the above constructions without departing from the scope of the invention, it is intended that all matter contained in the above
description and shown in the accompanying drawings shall be interpreted as illustrative and not in a limiting sense.

Claims

What is claimed is:
1. A machine vision system for use with a robotic system for taking a plurality of frozen sample cores from frozen samples that are each contained in a respective container, the machine vision system comprising:
a platform for supporting one or more of the containers, the platform having a station for receiving at least one of the containers and a pair of calibration marks on the platform in fixed positions relative to the station;
a camera for capturing an image of the container while the container is received at the station;
a processor configured to receive image data from the camera indicative of the image of the container and to determine one or more locations where a frozen sample core has already been taken from a frozen sample contained in the container by: (a) evaluating contrast in the image to identify one or more bore candidates; and (b) using information about the position of the calibration marks relative to the bore candidates to determine whether or not the one or more candidates are likely to be artifacts instead of real bores in the sample.
2. A machine vision system as set forth in claim 1 wherein using information about the position of the calibration marks relative to the bore candidates comprises using the calibration marks to identify a center axis of the container and at least one of (i) using information about the position of the one or more bore candidates relative to the center axis of the container and (ii) using information about the angular position of one of said bore candidates relative to the center axis of the container compared to the angular position of another of said bore candidates relative to the center axis of the
container.
3. A machine vision system as set forth in claim 1 or 2 wherein the determining further comprises using information about the size of the one or more bore candidates to determine whether or not the one or more candidates are likely to be artifacts instead of real bores in the sample.
4. A machine vision system as set forth in any one of claims 2-3 wherein the processor is configured to identify the center axis of the container as a function of the position of the calibration marks without detecting any edges of the container.
5. A machine vision system as set forth in any one of claims 1-3 wherein the processor is configured to identify an edge of the container and wherein the determining comprises using information about the position of the one or more candidate bores relative to the edge of the container to determine whether or not the one or more candidates are likely to be artifacts instead of real bores in the sample.
6. A machine vision system as set forth in any one of claims 1-5 wherein the calibration marks comprise low power resistance heaters to limit accumulation of frost on the calibration marks.
7. A machine vision system as set forth in any one of claims 1-6 wherein the processor is further configured to control a position of the camera relative to a position of the station.
8. A machine vision system as set forth in any one of claims 1-7 wherein the processor is configured to determine whether or not the one or more bore candidates are likely to be artifacts instead of real bores in the sample using information about the position of the one or more bore candidates relative to a center axis of the container.
9. A machine vision system as set forth in any one of claims 1-8 wherein the processor is configured to determine whether or not the one or more bore candidates are likely to be artifacts instead of real bores in the sample using information about the angular position of one of said bore candidates relative to the center axis of the container compared to the angular position of another of said bore candidates relative to the center axis of the container.
10. A machine vision system as set forth in any one of claims 1-9 wherein the processor is configured to identify the center axis of the container by triangulating the center from the calibration marks.
11. A method of taking a frozen sample core from a frozen sample that is contained in a container, the method comprising: positioning the container at a station for receiving a container on a platform, the platform having a pair of
calibration marks on the platform in fixed positions relative to the station;
capturing an image of the container while the container is received at the station;
determining one or more locations where a frozen sample core has already been taken from the frozen sample contained in the container by: (a) evaluating contrast in the image to identify one or more bore candidates; and (b) using information about the position of the calibration marks relative to the bore candidates to determine whether or not the one or more
candidates are likely to be artifacts instead of real bores in the frozen sample; and taking the frozen sample core from the sample at a location from which no frozen sample core has already been taken, as determined in the determining step.
12. A method as set forth in claim 11 wherein the
determining further comprises using information about the size of the one or more bore candidates to help determine whether or not the one or more candidates are likely to be artifacts instead of real bores in the frozen sample.
13. A method as set forth in any one of claims 11-12 further comprising detecting a peripheral edge of the container in the captured image, wherein the determining further comprises using information about the position of the one or more bore candidates relative to the edge of the container to help determine whether or not the one or more candidates are likely to be artifacts instead of real bores in the frozen sample.
14. A method as set forth in any one of claims 11-13 further comprising heating the calibration marks to limit accumulation of frost on the calibration marks.
15. A method as set forth in claim 11 or 14 wherein using information about the position of the calibration marks relative to the one or more bore candidates comprises identifying a center axis of the container and evaluating the position of the bore candidates relative to the center axis of the container to determine whether or not the one or more candidates are likely to be artifacts instead of real bores in the frozen sample.
16. A method as set forth in claim 15 wherein evaluating the position of the bore candidates relative to the center axis of the container comprises using information about the angular position of one of said bore candidates relative to the center axis of the container compared to the angular position of another of said bore candidates relative to the center axis of the container to determine whether or not the one or more candidates are likely to be artifacts instead of real bores in the frozen sample.
17. A method as set forth in any one of claims 15-16 wherein identifying the center axis of the container comprises using triangulation.
18. A method as set forth in any one of claims 11-17 wherein evaluating contrast in the image to identify one or more bore candidates comprises applying a thresholding filter to the image.
19. A method as set forth in any one of claims 11-18 wherein evaluating contrast in the image to identify one or more bore candidates comprises applying a morphological filter to the image.
20. A method as set forth in claim 18 wherein evaluating contrast in the image to identify one or more bore candidates further comprises applying a morphological filter to the image after applying the thresholding filter to the image.
21. A method as set forth in any one of claims 18-20 wherein evaluating contrast in the image to identify one or more bore candidates comprises applying a particle analysis imaging algorithm after the filtering.
22. A machine vision system for use with a robotic system for taking a plurality of frozen sample cores from frozen samples that are each contained in a respective container, the machine vision system comprising: a platform;
a camera for capturing an image of one of the containers while it is on the platform; and
a processor configured to receive image data from the camera indicative of the image captured by the camera and to determine one or more locations where a frozen sample core has already been taken from the frozen sample contained in the container by: (a) evaluating contrast in the image to identify one or more bore candidates; and (b) determining whether or not the one or more bore candidates are likely to be artifacts instead of real bores in the sample using at least one of the following:
(i) the size of the bore candidate;
(ii) the distance between the bore candidate and a center axis of the container;
(iii) the angle formed between a first line and a second line, the first line extending between the bore and the center axis of the container and the second line extending between the center axis of the container and another bore candidate;
(iv) the relation between the position of the one or more bore candidates and an expected pattern of bores in the sample;
(v) the location of the one or more bore candidates relative to a peripheral edge of the container;
(vi) the number of bore candidates identified;
(vii) the amount of contrast between the bore candidates and the area surrounding the bore candidates; and
(viii) combinations thereof.
23. A machine vision system as set forth in claim 22 wherein the processor is configured to determine whether or not the one or more bore candidates are likely to be artifacts instead of real bores in the frozen sample using information about the location of the one or more bore candidates relative to a peripheral edge of the container.
24. A machine vision system as set forth in any one of claims 22-23 wherein the processor is configured to determine whether or not the one or more bore candidates are likely to be artifacts instead of real bores in the frozen sample using information about the size of the one or more bore candidates.
25. A machine vision system as set forth in any one of claims 22-24 wherein the processor is configured to determine whether or not the one or more bore candidates are likely to be artifacts instead of real bores in the frozen sample using information about the distance between the bore candidate and a center axis of the container.
26. A machine vision system as set forth in any one of claims 22-25 wherein the processor is configured to determine whether or not the one or more bore candidates are likely to be artifacts instead of real bores in the sample by using
information about the angle formed between a first line and a second line, the first line extending between the bore and the center axis of the container and the second line extending between the center axis of the container and another bore candidate .
27. A machine vision system as set forth in any one of claims 22-26 wherein the processor is configured to determine whether or not the one or more bore candidates are likely to be artifacts instead of real bores in the frozen sample by using information about the relation between the position of the one or more bore candidates and an expected pattern of bores in the frozen sample.
28. A machine vision system as set forth in any one of claims 22-27 wherein the processor is configured to determine whether or not the one or more bore candidates are likely to be
artifacts instead of real bores in the frozen sample by using information about the number of bore candidates identified.
29. A machine vision system as set forth in claim 28 wherein the processor is configured to determine whether or not the one or more bore candidates are likely to be artifacts instead of real bores in the frozen sample by using information about the amount of contrast between the bore candidates and the areas surrounding the bore candidates.
30. A method of taking a frozen sample core from a frozen sample that is contained in a container, the method comprising: capturing an image of the container;
using the captured image to determine one or more locations where a frozen sample core has already been taken from the frozen sample contained in the container by: (a) evaluating contrast in the image to identify one or more bore candidates; and (b) determining whether or not the one or more bore
candidates are likely to be artifacts instead of real bores in the frozen sample using information including at least one of the following:
(i) the size of the bore candidate;
(ii) the distance between the bore candidate and a center axis of the container;
(iii) the angle formed between a first line and a second line, the first line extending between the bore and the center axis of the container and the second line extending between the center axis of the container and another bore candidate;
(iv) the relation between the position of the one or more bore candidates and an expected pattern of bores in the frozen sample; (v) the location of the one or more bore candidates relative to a peripheral edge of the container;
(vi) the number of bore candidates identified;
(vii) the amount of contrast between the bore candidates and the surrounding areas; and
(viii) combinations thereof; and
taking the frozen sample core from the sample at a location from which no frozen sample core has already been taken, as determined in the determining step.
31. A method as set forth in claim 30 wherein the
determining comprises using information about the location of the one or more bore candidates relative to a peripheral edge of the container.
32. A method as set forth in any one of claims 30-31 wherein the determining comprises using information about the size of the one or more bore candidates.
33. A method as set forth in any one of claims 30-32 wherein the determining comprises using information about the distance between the bore candidate and a center axis of the container .
34. A method as set forth in any one of claims 30-33 wherein the determining comprises using information about the angle formed between a first line and a second line, the first line extending between the bore and the center axis of the container and the second line extending between the center axis of the container and another bore candidate.
35. A method as set forth in any one of claims 30-34 wherein the determining comprises using information about the relation between the position of the one or more bore candidates and an expected pattern of bores in the frozen sample.
36. A method as set forth in any one of claims 30-35 wherein the determining comprises using information about the number of bore candidates identified.
37. A method as set forth in any one of claims 30-36 wherein the determining comprises using information about the amount of contrast between the bore candidates and the surrounding areas.
38. A calibration system configured to calibrate a robotic system for taking a plurality of frozen sample cores from frozen samples that are each contained in a respective container, the calibration system comprising:
a platform for supporting the containers, the platform having one or more fixed targets positioned thereon;
a camera mounted on the robotic system for capturing an image of one or more containers while the containers are supported by the platform and for capturing images of the one or more fixed targets positioned on the platform; and
a processor configured to:
receive image data from the camera indicative of images formed by the camera; and
calibrate the robotic system using an image of the one or more fixed targets on the platform.
39. A calibration system as set forth in claim 38 wherein the processor is further configured to determine one or
more locations where a frozen sample core has already been taken from a frozen sample in one of the containers by
evaluating contrast in an image of said container.
40. A calibration system as set forth in any one of claims 38-39 wherein the one or more fixed targets comprises a target having an image for calibration in the x and y directions and a shape having a known size for calibration in the z direction.
41. A calibration system as set forth in claim 40 wherein the platform comprises a work deck having a recessed area for receiving the containers and said target having the image for calibration in the x and y directions and the shape having a known size for calibration in the z direction is secured to an upper surface of the work deck outside the recessed area.
42. A calibration system as set forth in claim 41 wherein the one or more fixed targets includes a target secured to a bottom of the recessed area.
43. A calibration system as set forth in claim 42 wherein the target secured to the bottom of the recessed area has an image for calibration in the x and y directions.
44. A calibration system as set forth in any one of claims 38-43 further comprising a density step display on the platform for light/dark calibration of the camera.
45. A calibration system as set forth in any one of claims 38-44 wherein the processor is further configured to calibrate the robotic system using images of multiple features on the platform selected from the group consisting of
(i) a station for receiving a container from which a frozen sample core is to be taken;
(ii) a station for receiving a container in which a frozen sample core is to be deposited;
(iii) a station for cleaning a coring probe of the robotic system; (iv) one or more trays on the platform for holding the containers; and
(v) combinations thereof.
46. A calibration system as set forth in any one of claims 38-45 further comprising a user interface configured to allow a user to guide the camera from a position that is not in
registration with one of the targets toward a position that is in registration with said target.
47. A calibration system as set forth in any one of claims 38-46 wherein the robotic system comprises an end effector, the camera being mounted on the end effector for movement with the end effector relative to the platform, and the calibration system is configured to complete calibration of the robotic system without any physical contact between the end effector, or any components moveable with the end effector, and the platform or components on the platform.
48. A calibration system as set forth in any one of claims 38-46 wherein the robotic system comprises an end effector, the camera being mounted on the end effector for movement with the end effector relative to the platform, the end effector further comprising a coring probe for taking a frozen sample core from the frozen samples and a gripper adapted to selectively hold and release containers for moving the containers relative to the platform, the calibration system being further configured to determine the positions of the camera, probe, and gripper relative to one another to compensate for variations in the positional offsets associated with the camera, probe, and gripper.
49. A method of calibrating a robotic system for taking a plurality of frozen sample cores from frozen samples that are each contained in a respective container, the method comprising: using a camera, which is used for capturing an image of one or more containers while the containers are supported by a platform of the robotic system to determine whether or not one or more frozen sample cores have already been taken from the frozen samples, to capture an image of one or more fixed targets on the platform; and
using an image of the one or more targets to calibrate the robotic system.
50. A method as set forth in claim 49 wherein the platform comprises a work deck having a recessed area for receiving the containers and at least one of said one or more targets is secured to an upper surface of the work deck outside the recessed area.
51. A method as set forth in claim 50 wherein the one or more fixed targets also includes at least one target secured to a bottom of the recessed area.
52. A method as set forth in any one of claims 49-51 further comprising using a density step display on the platform to calibrate light/dark settings of the camera.
53. A method as set forth in any one of claims 49-52 further comprising using images of multiple additional features on the platform to help calibrate the robotic system, said multiple additional features including at least one of:
(i) a station for receiving a container from which a frozen sample core is to be taken;
(ii) a station for receiving a container in which a frozen sample core is to be deposited; (iii) a station for cleaning a coring probe of the robotic system;
(iv) one or more trays on the platform for holding the containers; and
(v) combinations thereof.
54. A method as set forth in any one of claims 49-53 further comprising using the camera to capture an image of a container and using the image to determine the location of one or more bores in the sample.
55. A machine vision system for use with a robotic system adapted for taking a plurality of frozen sample cores from frozen samples that are each contained in a container, the machine vision system comprising:
a camera for capturing an image of a container while the container is supported by a platform, the camera having an optical axis;
a ring light for illuminating the container on the
platform, the ring light comprising a plurality of light sources arranged in an annular pattern, the optical axis of the camera extending through a central portion of the annular pattern; and a processor adapted to receive image data from the camera indicative of the image captured by the camera and to determine one or more locations where a frozen sample core has already been taken from the sample contained in the container by evaluating contrast in the image.
56. A machine vision system as set forth in claim 55 wherein the ring light emits red light.
57. A machine vision system as set forth in claim 55 wherein the ring light emits green light.
58. A machine vision system as set forth in claim 55 wherein the ring light comprises red light emitting elements, blue light emitting elements, and green light emitting elements, the intensity of light emitted from the red, blue, and green light emitting elements being selectively adjustable to allow any of multiple different colors of light to be selected as the color of light to be emitted by the ring light.
59. A machine vision system as set forth in claim 55 wherein the ring light comprises multicolor LEDs, each of said multicolor LEDs being operable to emit red light, green light, blue light and combinations thereof.
60. A machine vision system as set forth in any one of claims 55-59 in combination with a container and a frozen sample contained in the container, the camera being positioned to take an image of the frozen sample, the ring light being adapted to emit a light that matches a color of the frozen sample.
61. A machine vision system as set forth in claim 60 wherein the light emitted by the ring light has a first color and the color of the frozen sample has a second color and the first color is selected from the group consisting of: (i) the same as the second color and (ii) no more different from the second color than one of two adjacent colors on a 6-color RGB color wheel.
62. A machine vision system as set forth in any one of claims 60 and 61 wherein the frozen sample has a color selected from the group consisting of yellow, orange, and red and the light emitted by the ring light is red.
63. A machine vision system as set forth in any one of claims 55-62 wherein the ring light and camera are arranged so there is no direct path from the light sources in the ring light to the camera.
64. A machine vision system as set forth in any one of claims 55-63 wherein the camera has a forward end for receiving light from an object and the ring light extends farther forward than the camera so the light emitted by the ring light is emitted from a position in front of the camera.
65. A machine vision system as set forth in any one of claims 55-64 wherein the ring light comprises a housing having an annular groove and the light sources are recessed within the groove.
66. A method of determining one or more locations where frozen sample core have already been taken from frozen samples, each of the frozen samples being contained in a respective container, the method comprising:
(a) operating a robotic system to move a camera relative to a first one of the containers so the camera is directed at the frozen sample in the first container;
(b) illuminating said frozen sample using a ring light, the ring light comprising a plurality of light sources arranged in an annular pattern, the camera having an optical axis that extends through a central portion of the annular pattern;
(c) using the camera to capture an image of the illuminated frozen sample;
(d) evaluating contrast in the captured image and
processing the image to identify one or more bore candidates in the captured image and determine whether or not the bore candidates are likely to be artifacts or real bores in the frozen sample;
(e) operating a robotic system to move the camera relative to a second of the containers so the camera is directed at the frozen sample in said second container; and
(f) repeating steps (b) - (d) for the frozen sample in said second container.
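Step (d) above amounts to a contrast test: under bright-field ring lighting, prior bores image as regions much darker than the surrounding frozen sample. A minimal, stdlib-only sketch of that idea (the function name, threshold, and synthetic image are illustrative assumptions, not the patent's implementation):

```python
def find_bore_candidates(gray, contrast=0.5):
    """Return (row, col) coordinates of pixels much darker than the image
    mean; clusters of such pixels are candidate bore locations."""
    flat = [v for row in gray for v in row]
    mean = sum(flat) / len(flat)
    return [(r, c) for r, row in enumerate(gray)
            for c, v in enumerate(row) if v < mean * contrast]

# Synthetic 8x8 "sample" image: bright sample surface (200) with one dark
# 2x2 bore (20) at rows 2-3, columns 5-6.
img = [[200.0] * 8 for _ in range(8)]
for r in (2, 3):
    for c in (5, 6):
        img[r][c] = 20.0

cands = find_bore_candidates(img)
assert cands == [(2, 5), (2, 6), (3, 5), (3, 6)]
```

A real system would additionally group the dark pixels into connected regions and test each region's size and shape against the coring probe's known diameter before accepting it as a bore.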
67. A method as set forth in claim 66 wherein step (b) comprises illuminating the frozen sample with red light.
68. A method as set forth in claim 66 wherein step (b) comprises illuminating the frozen sample with green light.
69. A method as set forth in claim 66 wherein step (b) comprises illuminating the frozen sample with light having a color that matches the color of the frozen sample.
70. A method as set forth in claim 66 wherein the frozen sample has a color selected from the group consisting of yellow, orange, and red and step (b) comprises illuminating the frozen sample with red light.
71. A method as set forth in any one of claims 66-70 wherein step (c) comprises capturing a grayscale image of the illuminated frozen sample.
72. A machine vision system for use with a robotic system for taking a plurality of frozen sample cores from frozen samples that are each contained in a container, the machine vision system comprising:
a camera configured for capturing monochrome images of the containers while the containers are supported by a platform;
a light positioned to illuminate the containers and the samples contained therein while the containers are on the platform; and
a processor adapted to receive grayscale image data from the camera indicative of images formed by the camera and determine locations where frozen sample cores have already been taken from the samples by evaluating contrast in the images,
wherein the light emits light having a color other than white.
73. A machine vision system as set forth in claim 72 wherein the camera is configured to capture a grayscale image.
74. A machine vision system as set forth in claim 72 or 73 wherein the light emits red light.
75. A machine vision system as set forth in claim 72 or 73 wherein the light emits green light.
76. A machine vision system as set forth in claim 72 or 73 wherein the light comprises red light emitting elements, blue light emitting elements, and green light emitting elements, the intensity of light emitted from the red, blue, and green light emitting elements being selectively adjustable to allow any of multiple different colors of light to be selected as the color of light to be emitted by the light.
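Claim 76's light selects its output color by scaling the red, green, and blue emitting elements independently. A hypothetical sketch of that mixing step, assuming the elements are driven by normalized duty cycles (the mapping and names are illustrative, not from the patent):

```python
def set_light_color(rgb):
    """Map a target (r, g, b) color in 0-255 to per-element drive levels in
    0.0-1.0 for the red, green, and blue emitting elements."""
    return tuple(round(channel / 255.0, 3) for channel in rgb)

# Pure red drives only the red elements; an orange target mixes red with
# roughly half-intensity green.
assert set_light_color((255, 0, 0)) == (1.0, 0.0, 0.0)
assert set_light_color((0, 255, 0)) == (0.0, 1.0, 0.0)
assert set_light_color((255, 128, 0)) == (1.0, 0.502, 0.0)
```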
77. A method of determining one or more locations where frozen sample cores have already been taken from frozen samples, each of the frozen samples being contained in a respective container, the method comprising:
(a) operating a robotic system to move a camera relative to a first one of the containers so the camera is directed at the frozen sample in the first container;
(b) illuminating said frozen sample with a colored light;
(c) using the camera to capture a grayscale image of the illuminated frozen sample;
(d) evaluating contrast in the captured image and
processing the image to identify one or more bore candidates in the captured image and determine whether or not the bore candidates are likely to be artifacts or real bores in the frozen sample;
(e) operating the robotic system to move the camera relative to a second of the containers so the camera is directed at the frozen sample in said second container; and
(f) repeating steps (b) - (d) for the frozen sample in said second container.
78. A method as set forth in claim 77 wherein step (b) comprises illuminating the frozen sample with red light.
79. A method as set forth in claim 77 wherein step (b) comprises illuminating the frozen sample with green light.
80. A method as set forth in claim 77 wherein step (b) comprises illuminating the frozen sample with light having a color that matches the color of the frozen sample.
81. A method as set forth in claim 77 wherein the frozen sample has a color selected from the group consisting of yellow, orange, and red and step (b) comprises illuminating the frozen sample with red light.
82. A machine vision system for use with a robotic system for taking a plurality of frozen sample cores from frozen samples that are each contained in a container, the machine vision system comprising:
a camera for taking images of the containers while the containers are supported by a platform;
a light positioned to illuminate the containers and the samples contained therein while the containers are on the platform, wherein the light comprises red light emitting elements, blue light emitting elements, and green light emitting elements, the intensity of light emitted from the red, blue, and green light emitting elements being selectively adjustable to allow any of multiple different colors of light to be selected as the color of light to be emitted by the light; and
a processor adapted to receive image data from the camera indicative of images formed by the camera and determine locations where frozen sample cores have already been taken from the samples by evaluating contrast in the images,
wherein the processor is adapted to receive input about the color of the samples in the containers and adjust the color of the light emitted by the light to reduce a difference between the color of the samples and the color of the light emitted by the light.
83. A machine vision system as set forth in claim 82 wherein the system comprises a user interface adapted to receive information about the color of the samples.
84. A machine vision system as set forth in claim 82 wherein the camera is a color camera and the processor is adapted to use information in the image data received from the camera to determine the color of the samples and automatically adjust the color of the light.
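Claim 84's automatic adjustment can be pictured as: estimate the sample's mean color from the color camera's image, then pick the available light color closest to it. A stdlib-only, hypothetical sketch (the nearest-primary reduction and all names are illustrative assumptions; a real illuminator could be driven to the mean color directly):

```python
# Available light colors for the adjustable RGB illuminator (illustrative).
LIGHT_COLORS = {"red": (255, 0, 0), "green": (0, 255, 0), "blue": (0, 0, 255)}

def mean_color(image):
    """Average (r, g, b) over an image given as rows of pixel tuples."""
    pixels = [px for row in image for px in row]
    n = len(pixels)
    return tuple(sum(px[k] for px in pixels) / n for k in range(3))

def auto_light_color(image):
    """Pick the light color with the smallest squared distance to the
    image's mean sample color."""
    mean = mean_color(image)
    def dist(name):
        return sum((m - t) ** 2 for m, t in zip(mean, LIGHT_COLORS[name]))
    return min(LIGHT_COLORS, key=dist)

# A reddish-orange sample image should select the red light.
sample = [[(230, 90, 0)] * 4 for _ in range(4)]
assert auto_light_color(sample) == "red"
```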
85. A method of determining one or more locations where frozen sample cores have already been taken from frozen samples, each of the frozen samples being contained in a respective container, the method comprising:
(a) operating a robotic system to move a camera relative to a first one of the containers so the camera is directed at the frozen sample in the first container;
(b) illuminating said frozen sample with a colored light, the color of the light being selected to match a color of the frozen sample;
(c) using the camera to capture an image of the illuminated frozen sample;
(d) evaluating contrast in the captured image and
processing the image to identify one or more bore candidates in the captured image and determine whether or not the bore candidates are likely to be artifacts or real bores in the frozen sample;
(e) operating a robotic system to move the camera relative to a second of the containers so the camera is directed at the frozen sample in said second container; and
(f) repeating steps (b) - (d) for the frozen sample in said second container.
86. A method as set forth in claim 85 wherein step (b) comprises illuminating the frozen sample with red light.
87. A method as set forth in claim 85 wherein step (b) comprises illuminating the frozen sample with green light.
88. A method as set forth in claim 85 wherein the frozen sample has a color selected from the group consisting of yellow, orange, and red and step (b) comprises illuminating the frozen sample with red light.
89. A machine vision system for use with a robotic system for taking a plurality of frozen sample cores from frozen samples that are each contained in a container, the machine vision system comprising:
a platform for supporting the containers, the platform having a station for receiving one of the containers while a frozen sample core is extracted from a frozen sample contained in the container;
a camera for capturing images of containers while they are received at the station on the platform;
a light positioned to illuminate the containers from a position providing at least one of back lighting and side lighting.
90. A machine vision system as set forth in claim 89 further comprising a processor adapted to receive image data from the camera indicative of images formed by the camera and determine locations where frozen sample cores have already been taken from the samples by evaluating how much light passes through the containers at various locations as indicated in the images.
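With the back lighting of claim 90, the test inverts relative to bright-field illumination: a location where a core was removed leaves a thinner (or empty) column of frozen sample, so more light passes through and the spot images bright. A minimal, hypothetical sketch of that transmission test (names and threshold are illustrative):

```python
def transmissive_locations(backlit, threshold=180):
    """Return (row, col) coordinates where the transmitted-light intensity
    exceeds the threshold, marking likely prior bores."""
    return [(r, c) for r, row in enumerate(backlit)
            for c, v in enumerate(row) if v > threshold]

# Mostly opaque frozen sample (60) with light shining through one prior bore.
img = [[60.0] * 6 for _ in range(6)]
img[1][4] = 240.0
assert transmissive_locations(img) == [(1, 4)]
```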
91. A machine vision system as set forth in any one of claims 89-90 wherein the light is positioned to illuminate the containers from a position providing back lighting.
92. A machine vision system as set forth in any one of claims 89-90 wherein the light is positioned to illuminate the containers from a position providing side lighting.
93. A machine vision system as set forth in any one of claims 89-92 wherein the light comprises a fiber optic cable.
94. A machine vision system as set forth in any one of claims 89-93 further comprising a second light, the second light being positioned to provide bright field illumination of the containers.
95. A machine vision system as set forth in claim 94 wherein the second light comprises a ring light on axis with the camera.
96. A method of determining one or more locations where frozen sample cores have already been taken from frozen samples, each of the frozen samples being contained in a respective container, the method comprising:
(a) operating a robotic system to position one of the containers on a platform at a station for receiving the
container while a frozen sample core is extracted from the frozen sample contained in the container;
(b) using a light to provide at least one of back lighting and side lighting for the container;
(c) using a camera to capture an image of the frozen sample while illuminated by the light;
(d) evaluating contrast in the captured image and
processing the image to identify one or more bore candidates in the captured image.
97. A machine vision system for use with a robotic system adapted for taking a plurality of frozen sample cores from frozen samples that are each contained in a container, the machine vision system comprising:
a camera for capturing an image of a container while the container is supported by a platform at a station for receiving the container while a frozen sample core is extracted from the frozen sample contained therein;
a red light for illuminating the container from above while it is on the platform at the station with substantially
monochromatic red light; and
a processor adapted to receive image data from the camera indicative of the image captured by the camera and to determine one or more locations where a frozen sample core has already been taken from the sample contained in the container by evaluating contrast in the image.
98. A method of determining one or more locations where a frozen sample core has already been taken from frozen samples, each of the frozen samples being contained in a respective container, the method comprising:
(a) operating a robotic system to position one of the containers on a platform at a station for receiving the
container while a frozen sample core is extracted from the frozen sample contained in the container;
(b) illuminating the container from above while it is on the platform at the station with substantially monochromatic red light;
(c) using a camera to capture an image of the frozen sample while illuminated by the red light;
(d) evaluating contrast in the captured image and
processing the image to identify one or more bore candidates in the captured image.
99. A machine vision system for use with a robotic system for taking a plurality of frozen sample cores from frozen samples that are each contained in a respective container, the machine vision system comprising:
a platform for supporting one or more of the containers, the platform having a station for receiving at least one of the containers;
a camera for capturing an image of the container while the container is received at the station;
a processor configured to receive image data from the camera indicative of the image of the container and to determine one or more locations where a frozen sample core has already been taken from a frozen sample contained in the container by: (a) evaluating contrast in the image to identify one or more bore candidates and identify an edge of the container; (b) using information about the position of the edge relative to the bore candidates to determine whether or not the one or more
candidates are likely to be artifacts instead of real bores in the sample.
100. A method of determining one or more locations where a frozen sample core has already been taken from frozen samples, each of the frozen samples being contained in a respective container, the method comprising:
(a) operating a robotic system to position one of the containers on a platform at a station for receiving the
container while a frozen sample core is extracted from the frozen sample contained in the container;
(b) using a camera to capture an image of the frozen sample;
(c) evaluating contrast in the captured image to identify one or more bore candidates and identify an edge of the
container; and
(d) using information about the position of the edge relative to the bore candidates to determine whether or not the one or more candidates are likely to be artifacts instead of real bores in the sample.
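Step (d) of claim 100 uses the detected container edge to reject candidates: a dark or bright spot lying on or outside the container wall is far more likely a reflection or shadow artifact than a real bore. A hypothetical sketch of that geometric filter, assuming a circular edge fit (the margin value and names are illustrative):

```python
import math

def filter_candidates(candidates, center, radius, margin=2.0):
    """Keep only candidates that lie safely inside the container's detected
    circular edge; points on or outside the rim are treated as artifacts."""
    cx, cy = center
    return [(x, y) for (x, y) in candidates
            if math.hypot(x - cx, y - cy) < radius - margin]

# Edge fit: circle of radius 50 about (100, 100). The middle candidate sits
# on the rim and is rejected as a likely artifact.
cands = [(100, 110), (100, 149), (60, 100)]
real = filter_candidates(cands, center=(100, 100), radius=50)
assert real == [(100, 110), (60, 100)]
```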
101. A machine vision system for use with a robotic system for taking a plurality of frozen sample cores from frozen samples that are each contained in a respective container, the machine vision system comprising:
a platform for supporting one or more of the containers, the platform having a station for receiving at least one of the containers;
a camera for capturing an image of the container while the container is received at the station;
a fill level detection system adapted to detect the positions of the surfaces of the frozen samples; and
a processor configured to receive signals from the fill level detection system and use the signals to determine where to position the camera to obtain an image of the frozen samples.
102. A method of determining one or more locations where frozen sample cores have already been taken from frozen samples, each of the frozen samples being contained in a respective container, the method comprising:
using a fill level detection system to determine the position of a surface of the frozen sample that is spaced from a bottom of the container;
using information from the fill level detection system to determine where to position a camera so the camera has a predetermined position relative to the surface of the sample and moving the camera to that position;
capturing an image of the frozen sample in the container from that position; and
using the image to identify the location of one or more bores in the sample.
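The positioning step of claim 102 reduces to simple arithmetic: since containers hold different sample volumes, the camera is placed at a fixed working distance above the measured surface rather than above the container rim, keeping focus and scale constant. A hypothetical sketch (the working distance and units are illustrative assumptions):

```python
def camera_z(surface_height_mm, working_distance_mm=40.0):
    """Camera height that keeps a constant distance to the sample surface,
    as reported by the fill level detection system."""
    return surface_height_mm + working_distance_mm

# A fuller vial (higher surface) moves the camera up by exactly the same
# amount, so the image scale stays predetermined.
assert camera_z(12.0) == 52.0
assert camera_z(25.0) - camera_z(12.0) == 13.0
```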
103. A machine vision system for use with a robotic system for taking a plurality of frozen sample cores from frozen samples that are each contained in a respective container, the machine vision system comprising:
a platform for supporting one or more of the containers, the platform having a station for receiving at least one of the containers;
a coring probe for taking frozen sample cores from the frozen samples;
a camera for capturing an image of the container while the container is received at the station; and
a processor configured to receive image data from the camera indicative of the image of the container and to determine one or more locations where a frozen sample core has been taken from a frozen sample contained in the container, the processor being configured to move the coring probe into the open end of at least one bore to clear the open end of the bore of debris.
104. A method of taking a frozen sample core from a frozen sample that is contained in a container, the method comprising:
positioning the container at a station for receiving a container on a platform;
capturing an image of the container while the container is received at the station;
determining one or more locations where a frozen sample core has already been taken from the frozen sample contained in the container;
taking the frozen sample core from the frozen sample at a location from which no frozen sample core has already been taken, as determined in the determining step; and
after taking the frozen sample core from the frozen sample, inserting a coring probe into the one or more locations where a frozen sample core has been taken to clear the one or more locations where a frozen sample core has been taken of debris.
105. A machine vision system for use with a robotic system adapted for taking a plurality of frozen sample cores from frozen samples that are each contained in a container, the machine vision system comprising:
a camera for capturing an image of a container while the container is supported by a platform;
a light for illuminating the container on the platform, wherein a majority of the light energy emitted by the light is selected from the group consisting of red light with a wavelength in the range of 620nm to 750nm and green light with a wavelength in the range of 495nm to 570nm; and
a processor adapted to receive image data from the camera indicative of the image captured by the camera and to determine one or more locations where a frozen sample core has already been taken from the sample contained in the container by evaluating the image.
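Claim 105 pins the illumination to two specific bands: red at 620nm-750nm or green at 495nm-570nm. A trivial helper makes the recited band edges concrete (the function is an illustrative aid, not part of the claimed system):

```python
def classify_band(wavelength_nm):
    """Classify a wavelength against the bands recited in claim 105:
    red is 620-750 nm, green is 495-570 nm."""
    if 620 <= wavelength_nm <= 750:
        return "red"
    if 495 <= wavelength_nm <= 570:
        return "green"
    return "other"

assert classify_band(650) == "red"
assert classify_band(530) == "green"
assert classify_band(460) == "other"  # blue falls outside both recited bands
```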
106. A machine vision system as set forth in claim 105 wherein the light emits red light.
107. A machine vision system as set forth in claim 105 wherein the light emits green light.
108. A machine vision system as set forth in any one of claims 105-107 wherein the light is a ring light comprising a plurality of light sources arranged in an annular pattern and the camera has an optical axis extending through a central portion of the annular pattern.
109. A machine vision system as set forth in any one of claims 105-108 wherein the processor is adapted to determine locations where frozen sample cores have already been taken from the frozen sample by evaluating contrast in the image.
110. A machine vision system as set forth in any one of claims 105-109 wherein the processor is adapted to determine locations where frozen sample cores have already been taken from the frozen sample by evaluating how much light passes through the container at various locations as indicated by the image.
111. A machine vision system as set forth in any one of claims 105-110 wherein the light is positioned to illuminate the container from a position providing at least one of direct lighting and indirect lighting.
112. A machine vision system as set forth in any one of claims 105-111 wherein the camera is configured for capturing monochrome images of the container while the container is supported by the platform and the processor is adapted to receive grayscale image data from the camera indicative of images formed by the camera and determine one or more locations where a frozen sample core has already been taken from the sample by evaluating contrast in the image.
113. A machine vision system as set forth in claim 112 wherein the camera is configured to capture a grayscale image.
114. A machine vision system as set forth in any one of claims 105-113 in combination with a container and a frozen sample contained in the container, the camera being positioned to take an image of the frozen sample, the light being adapted to emit a light that matches a color of the frozen sample.
115. A method of determining one or more locations where frozen sample cores have already been taken from frozen samples, each of the frozen samples being contained in a respective container, the method comprising:
(a) operating a robotic system to move a camera relative to a first one of the containers so the camera is directed at the frozen sample in the first container;
(b) illuminating said frozen sample using a light, wherein a majority of the light energy emitted by the light is selected from the group consisting of red light with a wavelength in the range of 620nm to 750nm and green light with a wavelength in the range of 495nm to 570nm;
(c) using the camera to capture an image of the illuminated frozen sample;
(d) using the image to identify one or more bore candidates in the captured image and determine whether or not the bore candidates are likely to be artifacts or real bores in the frozen sample;
(e) operating a robotic system to move the camera relative to a second of the containers so the camera is directed at the frozen sample in said second container; and
(f) repeating steps (b) - (d) for the frozen sample in said second container.
116. A method as set forth in claim 115 wherein step (b) comprises illuminating the frozen sample with red light.
117. A method as set forth in claim 115 wherein step (b) comprises illuminating the frozen sample with green light.
118. A method as set forth in any one of claims 115-117 wherein step (b) comprises illuminating the frozen sample with light having a color that matches the color of the frozen sample .
119. A method as set forth in any one of claims 115-118 wherein step (c) comprises capturing a grayscale image of the illuminated frozen sample.
120. A method as set forth in any one of claims 115-119 further comprising illuminating the container with at least one of ultraviolet and infrared light.
