CN110832502B - Image-based tube top circle detection with multiple candidates - Google Patents

Image-based tube top circle detection with multiple candidates

Info

Publication number
CN110832502B
CN110832502B CN201880046201.1A
Authority
CN
China
Prior art keywords
tube
images
candidates
processor
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201880046201.1A
Other languages
Chinese (zh)
Other versions
CN110832502A (en)
Inventor
Yao-Jen Chang (张耀仁)
S. Kluckner
B.S. Pollack
Terrence Chen (陈德仁)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Siemens Healthcare Diagnostics Inc
Original Assignee
Siemens Healthcare Diagnostics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens Healthcare Diagnostics Inc filed Critical Siemens Healthcare Diagnostics Inc
Publication of CN110832502A publication Critical patent/CN110832502A/en
Application granted granted Critical
Publication of CN110832502B publication Critical patent/CN110832502B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/17 Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N 21/25 Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N 21/251 Colorimeters; Construction thereof
    • G01N 21/253 Colorimeters; Construction thereof for batch operation, i.e. multisample apparatus
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 35/00 Automatic analysis not limited to methods or materials provided for in any single one of groups G01N1/00 - G01N33/00; Handling materials therefor
    • G01N 35/00584 Control arrangements for automatic analysers
    • G01N 35/00722 Communications; Identification
    • G01N 35/00732 Identification of carriers, materials or components in automatic analysers
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 35/00 Automatic analysis not limited to methods or materials provided for in any single one of groups G01N1/00 - G01N33/00; Handling materials therefor
    • G01N 35/02 Automatic analysis not limited to methods or materials provided for in any single one of groups G01N1/00 - G01N33/00; Handling materials therefor using a plurality of sample containers moved by a conveyor system past one or more treatment or analysis stations
    • G01N 35/04 Details of the conveyor system
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 35/00 Automatic analysis not limited to methods or materials provided for in any single one of groups G01N1/00 - G01N33/00; Handling materials therefor
    • G01N 35/02 Automatic analysis not limited to methods or materials provided for in any single one of groups G01N1/00 - G01N33/00; Handling materials therefor using a plurality of sample containers moved by a conveyor system past one or more treatment or analysis stations
    • G01N 35/04 Details of the conveyor system
    • G01N 2035/0401 Sample carriers, cuvettes or reaction vessels
    • G01N 2035/0418 Plate elements with several rows of samples
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 35/00 Automatic analysis not limited to methods or materials provided for in any single one of groups G01N1/00 - G01N33/00; Handling materials therefor
    • G01N 35/02 Automatic analysis not limited to methods or materials provided for in any single one of groups G01N1/00 - G01N33/00; Handling materials therefor using a plurality of sample containers moved by a conveyor system past one or more treatment or analysis stations
    • G01N 35/04 Details of the conveyor system
    • G01N 2035/0496 Other details
    • G01N 2035/0498 Drawers used as storage or dispensing means for vessels or cuvettes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V 2201/06 Recognition of objects for industrial automation

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Immunology (AREA)
  • Health & Medical Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Multimedia (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Processing (AREA)
  • Automatic Analysis And Handling Materials Therefor (AREA)
  • Image Analysis (AREA)

Abstract

Embodiments provide a method of locating a tube top circle region in an input image using image-based tube top circle detection based on multiple candidate selection. According to embodiments provided herein, multiple candidate selection enhances the robustness of tube top circle detection by utilizing multiple views of the same tube. With multiple candidates extracted from images of the same tube at different viewpoints, the multiple candidate selection algorithm selects the optimal combination among the candidates and provides a more accurate measurement of tube characteristics. This information is invaluable in an IVD environment where a sample handler is handling tubes and moving them to an analyzer for testing and analysis.

Description

Image-based tube top circle detection with multiple candidates
Cross Reference to Related Applications
The present application claims priority to U.S. Provisional Application Serial No. 62/531,108, filed July 11, 2017, the contents of which are incorporated herein by reference in their entirety.
Technical Field
Embodiments disclosed herein relate generally to capturing images of a tube tray to determine characteristics of tubes supported within the tray, and more particularly to using image-based tube top circle detection with multiple candidates for accurate determination of characteristics of tubes supported within the tray.
Background
In vitro diagnostics (IVD) allows a laboratory to aid in the diagnosis of disease based on assays performed on patient fluid samples. IVD includes various types of analytical tests and assays related to patient diagnosis and treatment that can be performed by analyzing liquid samples taken from a patient's body fluids or abscesses. These assays are typically performed using automated clinical chemistry analyzers (analyzers) into which tubes or vials containing patient samples have been loaded. Because of the variety of assays required in a modern IVD lab, and the volume of testing necessary to operate a laboratory, multiple analyzers are often employed in a single laboratory. Automation systems may also be used between and among the analyzers. Samples may be transported from a doctor's office to a laboratory, stored in the laboratory, placed into an automation system or analyzer, and stored for subsequent testing.
Trays are commonly used to accomplish transport and storage between analyzers. A tray is typically an array of several patient samples stored in test tubes. These trays are generally stackable and facilitate easy carrying of multiple samples from one part of the laboratory to another. For example, a laboratory may receive a tray of patient samples for testing from a hospital or clinic. The tray of patient samples may be stored in a refrigerator in the laboratory. Trays of patient samples may also be stored in drawers. In some automation systems, an analyzer may accept a tray of patient samples and process the samples accordingly, while some analyzers may require that the samples be removed from the trays and placed into a carrier, such as a puck, by an operator prior to further handling. Trays are typically passive devices that allow samples to be carried and, in some cases, arranged in an orderly relationship.
Generally, information about the sample tubes stored in the tray is not known until an operator or sample handling mechanism interacts with each tube. For example, the sample handling robotic arm may pick up the tube, remove it from the tray, and place it into the rack. The rack may then travel to a decapper station to remove any possible caps and pass a bar code reader so that the bar codes on the sides of the tube may be read to reveal the contents of the tube. In many prior art sample handling mechanisms, the identity of the tube is unknown until after the tube is removed from the tray. In this way, all tubes in the tray will typically be handled in the same manner until after placement of the tubes onto the racks in the automated system.
Disclosure of Invention
Embodiments provide a method of using image-based tube top circle detection. Embodiments relate to an image-based method to further improve the robustness of tube circle detection and to handle more challenging situations by using multiple candidate selection.
According to an embodiment, a method of using image-based tube top circle detection comprises: receiving a series of images of a tray from at least one camera, the tray comprising a plurality of tube slots, each tube slot configured to receive a sample tube, the series of images of the tray being acquired via the at least one camera; extracting, using a processor, a plurality of candidates in each image of the series of images for a given sample tube; calculating, using the processor, a plurality of consistency scores, each consistency score being between candidates across images in the series of images, drawn from the possible combinations of the plurality of candidates; accumulating, using the processor, the plurality of consistency scores; and selecting, using the processor, a true tube top circle for the given sample tube in each image of the series of images based on a highest consistency score between candidates across images in the series of images.
In an embodiment, calculating the plurality of consistency scores includes: determining, using the processor, possible combinations of the plurality of candidates; calculating, using the processor, an attribute for each of the possible combinations, the attribute being based on a characteristic of a respective one of the plurality of candidates; and calculating, using the processor, the plurality of consistency scores based on the calculated attributes. In an embodiment, the method further comprises storing the calculated attributes in a look-up table. In an embodiment, the attributes include one or more of circle diameter, circle center position, circle height, and circle center offset. In another embodiment, the attributes include one or more of shape, color, and texture across image blocks.
In an embodiment, the tray is configured to fit within a portion of a drawer movable between an open position and a closed position, and images of the tray are acquired via at least one camera as the drawer is moved between the open and closed positions.
According to an embodiment, a given sample tube in the series of images comprises a given sample tube in a plurality of different viewpoints.
In an embodiment, the method further comprises analyzing the true tube top circle for a given sample tube to determine tube class and tube characteristics.
In an additional embodiment, a vision system for use in an in vitro diagnostic environment comprises: a tray comprising a plurality of tube slots arranged in a matrix of rows and columns, each tube slot configured to receive a sample tube; a surface configured to receive the tray; at least one camera configured to capture a series of images of the tray; and a processor configured to: extract a plurality of candidates in each image of the series of images for a given sample tube; calculate a plurality of consistency scores, each consistency score being between candidates across images in the series of images, drawn from the possible combinations of the plurality of candidates; accumulate the plurality of consistency scores; and select a true tube top circle for the given sample tube in each image of the series of images based on a highest consistency score between candidates across images in the series of images.
Additional features and advantages of the present disclosure will be made apparent from the following detailed description of illustrative embodiments that proceeds with reference to the accompanying drawings.
Drawings
The foregoing and other aspects of the embodiments disclosed herein are best understood from the following detailed description, when read in conjunction with the accompanying drawings. For the purpose of illustrating the embodiments disclosed herein, there is shown in the drawings embodiments which are presently preferred, it being understood, however, that the embodiments disclosed herein are not limited to the specific instrumentalities disclosed. The following figures are included in the accompanying drawings:
FIG. 1A is a representation of a system for characterizing tube trays and tubes supported in drawers by image analysis, according to an embodiment;
FIG. 1B illustrates an exemplary drawer vision system test harness including an image capture system positioned over a tube tray disposed on a drawer, according to an embodiment;
FIG. 2 shows a block diagram representation of a system for characterizing a tube tray supported in a drawer and tubes contained thereon by image analysis, according to an embodiment;
FIGS. 3A-3H are exemplary images of a sample tube depicting sample results of tube circle detection with multiple candidates, according to embodiments herein;
FIGS. 4A and 4B illustrate tube geometry estimation error distributions based on tube circle detection with a single candidate and with multiple candidates, respectively;
FIG. 5 is a flow chart illustrating a method of using image-based tube top circle detection in accordance with embodiments herein; and
FIG. 6 illustrates an example of a computing environment in which embodiments of the application may be implemented.
Detailed Description
The present application relates to several of the concepts described in PCT Application Nos. PCT/US14/27217, PCT/US15/35092, and PCT/US16/18109, each of which is incorporated herein by reference in its entirety.
Embodiments relate to an image-based method to further improve the robustness of tube circle detection and to handle more challenging situations by using multiple candidate selection. This is critical for sample tube classification and characterization in clinical laboratory automation systems.
Embodiments include systems and methods for determining categories and characteristics of tubes supported within tube trays in an automated vision system configured to acquire images of the tube trays and the tubes supported within them. Some embodiments include acquiring images of a tray that is manually placed and aligned in an automated system. For example, an automated system may provide a flat surface with rails and allow an operator to manually align a keying feature on the tray to the rail and push the tray into the work area.
Some embodiments may include an automated Drawer Vision System (DVS) that includes drawers for loading and unloading tube trays on which sample tubes are contained. Images of the tray may be acquired via one or more cameras mounted over the access area of the drawer as the drawer moves between an open position and a closed position (e.g., a work area position).
According to an embodiment, the acquired images are analyzed to determine tube class and tube characteristics. Embodiments use image-based tube top circle detection with multiple candidate selection to locate the tube top circle region in an input image. The tube top circle region may be used to determine the class and characteristics of a tube that has variations in its appearance.
As described in PCT Application No. PCT/US14/27217, the tube tray of the DVS is configured to support a plurality of tubes within a drawer, in slots arranged in an array of rows and columns. The images are used to characterize the tray and the tubes supported on the tray. In particular, according to an embodiment, by analyzing the images, various characteristics of the tubes may be determined, such as, for example: the tray slot containing a tube; the center point, diameter, and height of the tube in a coordinate system; the orientation of the tray within the drawer; whether the tube is a plain tube, covered with a cap, or covered with a tube-top sample cup; the color(s) of the cap; the bar code on the tray surface; and the speed at which the drawer supporting the tray is being inserted into or removed from the work environment. Without expensive equipment, and without handling or touching the tubes, embodiments quickly determine this and other information. Such knowledge allows for efficient and streamlined handling of the tubes, as well as reduced setup and maintenance costs.
According to embodiments provided herein, multiple candidate selection enhances the robustness of tube top circle detection by utilizing multiple views of the same tube. With multiple candidates extracted from images of the same tube at different viewpoints, the multiple candidate selection algorithm selects the optimal combination among the candidates and provides a more accurate measurement of tube characteristics. This information is valuable in an IVD environment in which a sample handler handles the tubes and moves them to an analyzer for testing and analysis.
FIG. 1A is a representation of an exemplary drawer vision system 100 in which a tube tray 120 and tubes 130 contained thereon are characterized by obtaining and analyzing images thereof, according to an embodiment. One or more drawers 110 are movable between open and closed positions and are provided in a work envelope 105 for a sample handler. One or more tube trays 120 may be loaded into a drawer 110 or may be a permanent fixture of the drawer 110. Each tube tray 120 has an array of rows and columns of slots (as depicted in the exemplary tray 121) in which tubes 130 can be supported.
According to an embodiment, images of the tube tray 120 are taken. The images are analyzed to determine characteristics of the tube tray 120 and the tubes 130. According to embodiments provided herein, a moving-tray/fixed-camera approach is used to capture the images for analysis. The image capture system 140 is used to take images of the tube tray 120 and the tubes 130 contained thereon as the tube tray 120 is moved into the work envelope 105 by, for example, manually or automatically pushing the drawer 110 in.
The image capture system 140 may include one or more cameras (e.g., left camera 242 and right camera 244 shown in fig. 2) located at or near the entrance to the work envelope 105. In some embodiments, one or more cameras 242, 244 may be located above the surface of the tube tray 120. For example, cameras 242, 244 may be placed three to six inches above the surface to capture high resolution images of tube tray 120. Other distances and/or positioning may also be used depending on the characteristics of the cameras 242, 244 and the desired viewing angle and image quality. Optionally, the image capture system 140 may include one or more illumination sources, such as LED flash lamps.
FIG. 1B illustrates an exemplary test harness of an exemplary drawer vision system that may be used with embodiments disclosed herein. As shown in FIG. 1B, the image capture system 140 is located above the surface of the tube tray 120, which supports the tubes 130 and is disposed in the drawer 110. The drawer 110 shown in the embodiment of FIG. 1B is configured to support two 55-slot trays or six 15-slot trays. However, embodiments may include drawers configured to support trays having different numbers of slots and different sizes.
FIG. 2 illustrates a block diagram representation of an exemplary system 200 for characterizing a tube tray 120 supported in a drawer 110 and a tube 130 contained thereon by image analysis, according to an embodiment. According to an embodiment, the image capture system 140 includes two cameras, a left camera 242 and a right camera 244. Additional or fewer cameras may be included depending on the size of the drawers 110 and tube trays 120, as well as the desired image quality and image viewing angle. Light source 246 and image capture controller 248 are also part of image capture system 140.
The encoder 210 (such as a quadrature encoder) may be used to determine when a row of the tube tray 120 has moved to a centered or substantially centered position under the one or more cameras 242, 244. The encoder 210 transmits a signal (i.e., a pulse) to the image capture controller 248 upon detecting movement of the tube tray 120 corresponding to a new row of the tube tray 120 moving into a centered or substantially centered position under the one or more cameras 242, 244. The signal serves as an instruction for the image capture controller 248 to direct the cameras 242, 244 to take an image upon its receipt.
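For illustration only, the encoder-driven capture just described can be sketched as a simple pulse-counting loop; this is a minimal sketch assuming a pulse count per tray row and encoder/camera interfaces that are hypothetical placeholders, not elements of this disclosure:
```python
ROW_PITCH_PULSES = 400  # assumed quadrature counts between adjacent tray rows

def capture_rows(encoder, cameras):
    pulses = 0
    while encoder.drawer_moving():
        pulses += encoder.read_new_pulses()
        if pulses >= ROW_PITCH_PULSES:
            # A new row is centered (or substantially centered) under the
            # cameras, so the controller triggers an exposure on each one.
            for camera in cameras:
                camera.trigger()
            pulses -= ROW_PITCH_PULSES
```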
The controller 220 is provided for managing image analysis of the images captured by the cameras 242, 244. Upon detecting the closing of the drawer 110, the image capture controller 248 provides the images to the controller 220 for downloading and processing. According to an embodiment, the controller 220 is part of a sample handler that is used in the IVD environment to handle and move the tube trays 120 and tubes 130 between storage locations (such as the work envelope 105) and an analyzer. The image analysis performed by the controller 220 serves to inform the sample handler of the various determined characteristics of the tube trays 120 and tubes 130, allowing the sample handler to handle and process the tube trays 120 and tubes 130 accordingly.
One or more memory devices 240 are associated with the controller 220. The one or more memory devices 240 may be internal or external to the controller 220.
One or more drawer sensors 230 may be connected to the controller 220 to indicate when the drawer 110 is fully closed and/or when the drawer 110 is fully open. According to an embodiment, a fully closed drawer 110 is used as an indication to start image processing of captured and stored images. When drawer 110 is fully closed, drawer sensor 230 sends a signal to controller 220.
A camera's view of a sample tube 130 may be partially obscured, due to the viewing angle, by other sample tubes 130 in the tray 120 or by the tube itself. Because the top circle of a tube 130 is less likely to be occluded than other portions of the tube 130, embodiments determine the class and/or characteristics of the tube 130 based on its top circle.
Edge detection, such as Canny edge detection, provides the basic cue for the image-based tube circle detection and localization algorithm described in PCT Application No. PCT/US16/18109. However, when an image contains many circular edges from various objects, the edge of the true top circle may not have the strongest response. Thus, constraining the circle detection algorithm to output a single detection result may eliminate the opportunity to select the correct circle. According to embodiments herein, the idea of multi-candidate selection is to retain multiple candidates in each image for each sample tube 130 and to defer the selection process until the candidates have been extracted from the multiple images. If the same tube 130 is visible from different viewpoints in, for example, two or three images, then the optimal circle may be selected from the candidates in those images by finding the most consistent pair or triplet among all possible combinations. In this way, a tube top circle with a weaker edge response is more likely to be selected, as long as it outperforms the other candidates on the specified consistency metric.
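A minimal Python sketch of per-image candidate retention in this spirit follows, using OpenCV's Hough transform as a stand-in circle detector; the blur size, Hough parameters, and top-n cutoff are illustrative assumptions, not values from this disclosure:
```python
import cv2

def extract_candidates(gray_image, n=3):
    """Return up to n (x, y, radius) tube top circle candidates for one image."""
    blurred = cv2.medianBlur(gray_image, 5)  # suppress noise before the Hough transform
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT,
                               dp=1.5, minDist=20,
                               param1=100,    # upper Canny edge threshold used internally
                               param2=30,     # accumulator threshold; lower keeps weaker circles
                               minRadius=10, maxRadius=60)
    if circles is None:
        return []
    # Keep the strongest n detections instead of only the single best one,
    # deferring the final choice to the cross-view consistency step.
    return [tuple(c) for c in circles[0][:n]]
```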
According to an embodiment, the consistency score is calculated based on one or more of the following attributes: circle diameter (in pixels), circle center position (in pixels), circle height (in metric units), and circle center offset (in metric units). In an embodiment, the circle diameter and the circle center position may be obtained directly from the image block of each individual candidate. In an embodiment, the circle height and the circle center offset may be calculated from a pair of candidates from two images. According to an embodiment, height, diameter, and tube (or circle) center offset estimates may be calculated from a pair of circle detection outputs from two images, as described in PCT Application No. PCT/US16/18109.
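A hedged sketch of one possible pairwise consistency score built from these attributes is given below; the weights, the plausible height range, and the stereo.triangulate helper (standing in for the two-view height/offset computation referenced from PCT/US16/18109) are illustrative assumptions:
```python
from dataclasses import dataclass

MIN_TUBE_HEIGHT_MM, MAX_TUBE_HEIGHT_MM = 20.0, 150.0  # assumed plausible range

@dataclass
class Candidate:
    center_px: tuple    # circle center position (x, y), in pixels, from the image block
    diameter_px: float  # circle diameter, in pixels, from the image block

def pair_consistency(a: Candidate, b: Candidate, stereo) -> float:
    """Consistency between one candidate from each of two calibrated views."""
    diameter_term = abs(a.diameter_px - b.diameter_px)
    # Metric circle height and center offset are derived from the pair of
    # detections; `stereo.triangulate` is a hypothetical stand-in helper.
    height_mm, center_offset_mm = stereo.triangulate(a.center_px, b.center_px)
    if not (MIN_TUBE_HEIGHT_MM <= height_mm <= MAX_TUBE_HEIGHT_MM):
        return 0.0  # geometrically implausible pair
    # Smaller disagreement yields a higher score; the weights are assumptions.
    return 1.0 / (1.0 + 0.1 * diameter_term + 0.05 * abs(center_offset_mm))
```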
According to an embodiment in which each sample tube 130 is observed through three viewpoints/images, if n candidates are retained for the tube 130 in each image, there are n³ possible triplets for each sample tube 130. These n³ triplets involve only 3n individual blocks and 3n² non-duplicate pairs. In an embodiment, the attributes for each block and each pair are calculated once and stored in a look-up table. For each triplet, the consistency score is accumulated from the three blocks and three pairs involved in the triplet, and the triplet with the highest consistency score is selected.
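The triplet search itself can then be sketched as follows, caching each of the 3n² non-duplicate pair scores in a look-up table so that the n³ triplets are scored without recomputation; per-block attribute terms could be accumulated the same way:
```python
from itertools import product

def select_triplet(cands_per_view, pair_score):
    """cands_per_view: three lists of candidates, one per viewpoint of the tube."""
    pair_lut = {}  # look-up table caching each non-duplicate pair score once

    def lut(vi, i, vj, j):
        key = (vi, i, vj, j)
        if key not in pair_lut:
            pair_lut[key] = pair_score(cands_per_view[vi][i], cands_per_view[vj][j])
        return pair_lut[key]

    best_triplet, best_score = None, float("-inf")
    idx0, idx1, idx2 = (range(len(v)) for v in cands_per_view)
    for i, j, k in product(idx0, idx1, idx2):  # all n^3 candidate triplets
        # Accumulate the consistency score over the three pairs in the triplet.
        score = lut(0, i, 1, j) + lut(0, i, 2, k) + lut(1, j, 2, k)
        if score > best_score:
            best_triplet, best_score = (i, j, k), score
    return best_triplet, best_score
```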
FIGS. 3A-3H are exemplary images of sample tubes 130 depicting eight sample results 310-380 of tube circle detection with multiple candidates according to embodiments provided herein. Each sample set 310-380 includes 12 image blocks arranged in a three-by-four grid. Each row of the grid includes the top three candidates from one image of the sample tube 130, and the candidates surrounded by rectangles (e.g., 312, 314, and 316 in set 310 shown in FIG. 3A; 322, 324, 326, etc. in set 320 shown in FIG. 3B) indicate the candidates selected from the three by the algorithm. The selected candidates (e.g., 312, 314, and 316 in set 310; 322, 324, and 326 in set 320) are shown in the last column of the grid.
It is observed that while the first candidate, listed in the first column of the grids of sets 310-380, appears to have the strongest edge response, that candidate is not always the best choice, as a strong edge response may come from other objects. As the last column of each sample set 310-380 illustrates, the selected candidates form consistent triplets, which are more likely to be true tube top circles.
Quantitatively, as shown in FIGS. 4A and 4B (FIG. 4A: detection with a single candidate (410); FIG. 4B: detection with multiple candidates (420)), tube geometry estimates based on tube circle detection with a single candidate and with multiple candidates are compared. The joint distributions of tube height and diameter errors, in millimeters, clearly show the improvement of detection with multiple candidates over detection with a single candidate.
According to an embodiment, the multi-candidate selection algorithm is applicable to the tube top circle detection and localization algorithm described herein, and is also applicable to any generic circle detection algorithm that can provide multiple candidates for the same tube in each observed image.
The consistency between candidates across multiple views may also be measured by other image features such as, but not limited to, shape, color, and texture across image blocks.
Since the proposed drawer vision system provides at least three views of each sample tube 130, it is beneficial to use the proposed algorithm to improve tube top circle detection and localization.
FIG. 5 is a flow chart 500 illustrating a method of using image-based tube top circle detection with multi-candidate selection in accordance with embodiments described herein.
At 510, a series of images of a tray is received from at least one camera. In an embodiment, the tray includes a plurality of tube slots, each configured to receive a sample tube 130, the series of images of the tray being acquired via at least one camera.
At 520, a plurality of candidates in each image of the series of images is extracted for a given sample tube using a processor.
At 530, the processor calculates a plurality of consistency scores, each of the plurality of consistency scores being between candidates across images in the series of images from the possible combinations of the plurality of candidates.
At 540, the processor accumulates a plurality of consistency scores.
At 550, the processor is used to select a true tube top circle for a given sample tube for each image in the series of images based on a highest consistency score between candidates across images in the series of images.
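Tying steps 510-550 to the sketches above, an assumed driver might read as follows; series_of_images, stereo, and to_candidate (which would wrap a raw (x, y, r) detection with its attributes) are hypothetical names for this sketch, not elements of this disclosure:
```python
views = [extract_candidates(img, n=3) for img in series_of_images]  # steps 510-520
views = [[to_candidate(c) for c in view] for view in views]         # attach attributes
score = lambda a, b: pair_consistency(a, b, stereo)                 # step 530
best_idx, best_score = select_triplet(views, score)                 # steps 530-550
true_circles = [views[v][i] for v, i in enumerate(best_idx)]        # selected tube top circles
```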
FIG. 6 illustrates an example of a computing environment 600 in which embodiments of the application may be implemented. The computing environment 600 may be implemented as part of any of the components described herein. The computing environment 600 may include a computer system 610, which is one example of a computing system upon which embodiments of the application may be implemented. As shown in FIG. 6, computer system 610 may include a communication mechanism, such as a bus 621, or other communication mechanism for communicating information within computer system 610. The system 610 further includes one or more processors 620 coupled with the bus 621 for processing information. The processor 620 may include one or more CPUs, GPUs, or any other processor known in the art.
Computer system 610 also includes a system memory 630 coupled to bus 621 for storing information and instructions to be executed by processor 620. The system memory 630 may include computer-readable storage media in the form of volatile and/or nonvolatile memory such as Read Only Memory (ROM) 631 and/or Random Access Memory (RAM) 632. The system memory RAM 632 may include other dynamic storage device(s) (e.g., dynamic RAM, static RAM, and synchronous DRAM). The system memory ROM 631 can include other static storage device(s) (e.g., programmable ROM, erasable PROM, and electrically erasable PROM). In addition, system memory 630 may be used for storing temporary variables or other intermediate information during execution of instructions by processor 620. A basic input/output system 633 (BIOS), containing the basic routines that help to transfer information between elements within computer system 610, such as during start-up, may be stored in ROM 631. RAM 632 may contain data and/or program modules that are immediately accessible to and/or presently being operated on by processor 620. The system memory 630 may additionally include, for example, an operating system 634, application programs 635, other program modules 636, and program data 637.
Computer system 610 also includes a disk controller 640 coupled to bus 621 to control one or more storage devices for storing information and instructions, such as a magnetic hard disk 641 and a removable media drive 642 (e.g., a floppy disk drive, a compact disc drive, a tape drive, and/or a solid state drive). The storage devices may be added to the computer system 610 using an appropriate device interface, such as Small Computer System Interface (SCSI), Integrated Device Electronics (IDE), Universal Serial Bus (USB), or FireWire.
Computer system 610 may also include a display controller 665 coupled to bus 621 to control a display or monitor 666, such as a Cathode Ray Tube (CRT) or Liquid Crystal Display (LCD), for displaying information to a computer user. Computer system 610 includes a user input interface 660 and one or more input devices, such as a keyboard 662 and pointing device 661, for interacting with a computer user and providing information to processor 620. Pointing device 661 may be, for example, a mouse, a trackball, or a pointing stick for communicating direction information and command selections to processor 620 and for controlling cursor movement on display 666. The display 666 may provide a touch screen interface that allows input to supplement or replace the transfer of directional information and command selections by the pointing device 661.
Computer system 610 may perform some or all of the processing steps of embodiments of the present application in response to processor 620 executing one or more sequences of one or more instructions contained in a memory, such as system memory 630. Such instructions may be read into system memory 630 from another computer-readable medium, such as hard disk 641 or removable medium drive 642. The hard disk 641 may contain one or more data stores and data files used by embodiments of the application. The data store contents and data files may be encrypted to improve security. The processor 620 may also be employed in a multi-processing arrangement to execute one or more sequences of instructions contained in the system memory 630. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions. Thus, embodiments are not limited to any specific combination of hardware circuitry and software.
As stated above, computer system 610 may include at least one computer-readable medium or memory for holding instructions programmed according to embodiments of the application and for containing data structures, tables, records, or other data described herein. The term "computer-readable medium" as used herein refers to any non-transitory, tangible medium that participates in providing instructions to processor 620 for execution. A computer-readable medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media. Non-limiting examples of non-volatile media include optical disks, solid state drives, magnetic disks, and magneto-optical disks, such as hard disk 641 or removable media drive 642. Non-limiting examples of volatile media include dynamic memory, such as system memory 630. Non-limiting examples of transmission media include coaxial cables, copper wire, and fiber optics, including the wires that comprise bus 621. Transmission media can also take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications.
The computing environment 600 may further include the computer system 610 operating in a networked environment using logical connections to one or more remote computers, such as a remote computer 680. The remote computer 680 may be a personal computer (laptop or desktop), a mobile device, a server, a router, a network PC, a peer device, or other common network node, and typically includes many or all of the elements described above relative to the computer system 610. When used in a networking environment, the computer system 610 may include a modem 672 for establishing communications over a network 671, such as the Internet. The modem 672 may be connected to the system bus 621 via the network interface 670, or via another appropriate mechanism.
The network 671 may be any network or system generally known in the art, including the Internet, an intranet, a Local Area Network (LAN), a Wide Area Network (WAN), a Metropolitan Area Network (MAN), a direct connection or series of connections, a cellular telephone network, or any other network or medium capable of facilitating communications between the computer system 610 and other computers (e.g., the remote computing system 680). The network 671 may be wired, wireless, or a combination thereof. Wired connections may be implemented using Ethernet, Universal Serial Bus (USB), RJ-11, or any other wired connection generally known in the art. Wireless connections may be implemented using Wi-Fi, WiMAX, Bluetooth, infrared, cellular networks, satellite, or any other wireless connection method generally known in the art. Further, several networks may operate alone or in communication with each other to facilitate communications within the network 671.
A processor, as used herein, is a device for performing machine-readable instructions stored on a computer-readable medium, for performing tasks, and may comprise any one or combination of hardware and firmware. A processor may also include a memory storing machine-readable instructions executable to perform tasks. A processor acts upon information by manipulating, analyzing, modifying, converting or transmitting information for use by an executable procedure or an information device and/or by routing the information to an output device. The processor may use or include the capability of, for example, a computer, controller, or microprocessor, and is regulated using executable instructions to perform specialized functions not performed by a general purpose computer. The processor may be coupled (electrically and/or as including executable components) with any other processor such that interaction and/or communication therebetween is enabled. The computer program instructions may be loaded onto a computer (including, but not limited to, a general purpose computer or special purpose computer) or other programmable processing apparatus to produce a machine, such that the computer program instructions which execute on the computer or other programmable processing apparatus create means for implementing the functions specified in the flowchart block(s). The user interface processor or generator is a known element comprising electronic circuitry or software or a combination of both for generating the display element or part thereof. The User Interface (UI) includes one or more display elements that enable user interaction with the processor or other device.
As used herein, an executable application includes code or machine readable instructions for adjusting a processor to implement predetermined functions such as those of an operating system, a contextual data acquisition system, or other information processing systems, for example, in response to user commands or input. An executable procedure is a segment of code or machine readable instructions, a subroutine, or other different section of code or portion of an executable application for performing one or more specific procedures. These processes may include receiving input data and/or parameters, performing operations on the received input data and/or performing functions in response to the received input parameters, and providing the resulting output data and/or parameters. As used herein, a Graphical User Interface (GUI) includes one or more display elements generated by a display processor and enabling user interaction with the processor or other device and associated data acquisition and processing functions.
The UI also includes executable procedures or executable applications. The executable process or executable application adjusts the display processor to generate a signal representing the UI display image. These signals are supplied to a display device that displays the elements for viewing by a user. The executable process or executable application further receives a signal from a user input device such as a keyboard, mouse, light pen, touch screen, or any other means that allows a user to provide data to the processor. The processor manipulates the UI display element in response to signals received from the input device under control of the executable process or executable application. In this way, a user interacts with the display element using the input device, thereby enabling user interaction with the processor or other device. The functions and process steps herein may be performed automatically or in response, in whole or in part, to user commands. The automatically performed activities (including steps) are performed in response to executable instructions or device operations without requiring the user to directly initiate the activities.
As used herein, a workflow processor processes data to determine tasks to add to, or remove from, a task list, or to modify tasks incorporated on, or for incorporation on, a task list, as specified, for example, in a program(s). A task list is a list of tasks for execution by a worker, a user of a device, or a combination of both. A workflow processor may or may not employ a workflow engine. As used herein, a workflow engine is a processor that executes in response to a predetermined process definition, implementing a process in response to an event and data associated with the event. The workflow engine implements processes, sequentially and/or concurrently, in response to the event-associated data to determine tasks for execution by devices and/or workers, and to update the task lists of the devices and workers to include the determined tasks. The process definition may be defined by a user and includes a sequence of process steps, including one or more of start, wait, decision, and task allocation steps, for execution by, for example, a device and/or worker. An event is an occurrence affecting the operation of a process implemented using a process definition. The workflow engine includes process definition functions that allow a user to define a process to be followed, and may include an event monitor. A processor in the workflow engine tracks which processes are running according to the process definition, which steps need to be performed next and for which patients and physicians, and may include procedures for informing physicians of the tasks to be performed.
The systems and processes of the figures presented herein are not exclusive. Other systems, processes, and menus may be derived in accordance with the principles of the application to accomplish the same objectives. Although the application has been described with reference to specific embodiments, it is to be understood that the embodiments and variations shown and described herein are for illustration purposes only. Modifications to the current design may be implemented by those skilled in the art without departing from the scope of the application. Additionally, in alternative embodiments, the processes and applications may be located on one or more (e.g., distributed) processing devices on a network linking the elements of FIG. 6. Any of the functions and steps provided in the figures may be implemented in hardware, software, or a combination of both. No claim element herein is to be construed under the provisions of 35 U.S.C. 112(f) unless the element is expressly recited using the phrase "means for".
Although the present application has been described with reference to the exemplary embodiments, the present application is not limited thereto. Those skilled in the art will appreciate that many changes and modifications can be made to the preferred embodiments of the application and that such changes and modifications can be made without departing from the true spirit of the application. It is therefore intended that the following claims be interpreted to cover all such equivalent modifications as fall within the true spirit and scope of the application.

Claims (10)

1. A method of using image-based tube top circle detection, the method comprising:
receiving a series of images of a tray from at least one camera, the tray comprising a plurality of tube slots, each tube slot configured to receive a sample tube, the series of images of the tray being acquired via the at least one camera;
extracting, using a processor, a plurality of candidates for each image in the series of images for a given sample tube;
calculating, using a processor, a plurality of consistency scores, each of the plurality of consistency scores being a consistency score between candidates across images in a series of images from a possible combination of the plurality of candidates;
accumulating, using a processor, a plurality of consistency scores; and
selecting, using a processor, a true tube top circle for a given sample tube for each image in the series of images based on a highest consistency score between candidates across images in the series of images, wherein calculating the plurality of consistency scores comprises:
determining, using a processor, possible combinations of the plurality of candidates;
calculating, using a processor, an attribute for each of the possible combinations, the attribute being based on characteristics of a respective candidate of the plurality of candidates; and
a plurality of consistency scores is calculated based on the calculated attributes using a processor,
wherein the attributes include one or more of circle diameter, circle center position, circle height, and circle center offset, or include one or more of shape and texture across image blocks.
2. The method of claim 1, further comprising storing the calculated attributes in a lookup table.
3. The method of claim 1, wherein the tray is configured to fit within a portion of a drawer movable between an open position and a closed position, and the image of the tray is acquired via the at least one camera as the drawer moves between the open and closed positions.
4. The method of claim 1, wherein a given sample tube in the series of images comprises a given sample tube in a plurality of different viewpoints.
5. The method of claim 1, further comprising analyzing a true tube top circle for a given sample tube to determine tube class and tube characteristics.
6. A vision system for use in an in vitro diagnostic environment, comprising:
a tray comprising a plurality of tube slots arranged in a matrix of rows and columns, each tube slot configured to receive a sample tube;
a surface configured to receive a tray;
at least one camera configured to capture a series of images of the tray; and
a processor configured to:
extracting a plurality of candidates in each image of the series of images for a given sample tube;
calculating a plurality of consistency scores, each of the plurality of consistency scores being a consistency score between candidates across images in a series of images from possible combinations of the plurality of candidates;
accumulating a plurality of consistency scores; and
selecting a true tube top circle for a given sample tube for each image in the series of images based on a highest consistency score between candidates across images in the series of images, wherein calculating, by the processor, the plurality of consistency scores comprises:
determining possible combinations of the plurality of candidates;
calculating an attribute for each of the possible combinations, the attribute based on a characteristic of a respective one of the plurality of candidates; and
a plurality of consistency scores is calculated based on the calculated attributes,
wherein the attributes include one or more of circle diameter, circle center position, circle height, and circle center offset, or include one or more of shape and texture across image blocks.
7. The system of claim 6, wherein the processor is further configured to store the calculated attributes in a lookup table.
8. The system of claim 6, wherein the surface comprises a portion of a drawer movable between an open position and a closed position, and wherein the image of the tray is acquired via the at least one camera as the drawer is moved between the open position and the closed position.
9. The system of claim 6, wherein a given sample tube in the series of images comprises a given sample tube in a plurality of different viewpoints.
10. The system of claim 6, wherein the processor is further configured to analyze the true tube top circle for a given sample tube to determine tube class and tube characteristics.
CN201880046201.1A 2017-07-11 2018-06-25 Image-based tube top circle detection with multiple candidates Active CN110832502B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201762531108P 2017-07-11 2017-07-11
US62/531108 2017-07-11
PCT/US2018/039283 WO2019013961A1 (en) 2017-07-11 2018-06-25 Image-based tube top circle detection with multiple candidates

Publications (2)

Publication Number Publication Date
CN110832502A (en) 2020-02-21
CN110832502B (en) 2023-09-05

Family

ID=65001796

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880046201.1A Active CN110832502B (en) 2018-06-25 Image-based tube top circle detection with multiple candidates

Country Status (5)

Country Link
US (1) US20200232908A1 (en)
EP (1) EP3652674B1 (en)
JP (1) JP7087058B2 (en)
CN (1) CN110832502B (en)
WO (1) WO2019013961A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA3235375A1 (en) * 2021-10-18 2023-04-27 Mark Andrews Inspection unit for cans and method for determining the quality of cans

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104364799A (en) * 2012-06-08 2015-02-18 高通股份有限公司 Fast feature detection by reducing an area of a camera image through user selection
WO2016133924A1 (en) * 2015-02-18 2016-08-25 Siemens Healthcare Diagnostics Inc. Image-based tube slot circle detection for a vision system

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3797686B2 (en) * 1995-09-07 2006-07-19 株式会社東芝 Shape recognition apparatus and method
US7263220B2 (en) * 2003-02-28 2007-08-28 Eastman Kodak Company Method for detecting color objects in digital images
US7988933B2 (en) * 2006-09-01 2011-08-02 Siemens Healthcare Diagnostics Inc. Identification system for a clinical sample container
JP2011100223A (en) 2009-11-04 2011-05-19 Panasonic Electric Works Co Ltd Image processing apparatus and image processing method
CN103370626A (en) * 2011-03-03 2013-10-23 株式会社日立高新技术 Specimen data processing device for analysis device, auto-sampler device, liquid chromatograph device, specimen data processing method, and analysis method
WO2012125291A1 (en) * 2011-03-15 2012-09-20 Siemens Healthcare Diagnostics Inc. Multi-view stereo systems and methods for tube inventory in healthcare diagnostics
EP2658958B1 (en) * 2011-03-18 2017-05-03 Siemens Healthcare Diagnostics Inc. Methods and systems for calibration of a positional orientation between a sample container and nozzle tip
CN103842794B (en) * 2011-07-22 2017-09-22 罗氏血液诊断股份有限公司 fluid sample preparation system and method
EP2753914B1 (en) * 2011-09-09 2021-03-24 Gen-Probe Incorporated Automated sample handling instrumentation, systems, processes, and methods
WO2013070754A1 (en) * 2011-11-07 2013-05-16 Beckman Coulter, Inc. Robotic arm
KR20140091033A (en) * 2011-11-07 2014-07-18 베크만 컬터, 인코포레이티드 Specimen container detection
US10145857B2 (en) * 2013-03-14 2018-12-04 Siemens Healthcare Diagnostics Inc. Tube tray vision system
CN107003124B (en) * 2014-06-10 2020-07-07 西门子医疗保健诊断公司 Drawer vision system
CN107110879B (en) * 2015-01-28 2019-04-05 株式会社日立高新技术 Liquid level check device, automatic analysing apparatus and processing unit
US9940731B2 (en) 2015-09-04 2018-04-10 International Business Machines Corporation Unsupervised asymmetry detection
JP6517652B2 (en) 2015-10-01 2019-05-22 日本電信電話株式会社 Object saliency map calculation device, method and program
CA3018085C (en) * 2016-03-15 2022-10-25 Abbott Molecular Inc. Systems and methods for automated analysis

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104364799A (en) * 2012-06-08 2015-02-18 高通股份有限公司 Fast feature detection by reducing an area of a camera image through user selection
WO2016133924A1 (en) * 2015-02-18 2016-08-25 Siemens Healthcare Diagnostics Inc. Image-based tube slot circle detection for a vision system

Also Published As

Publication number Publication date
JP7087058B2 (en) 2022-06-20
WO2019013961A1 (en) 2019-01-17
CN110832502A (en) 2020-02-21
US20200232908A1 (en) 2020-07-23
JP2020526759A (en) 2020-08-31
EP3652674B1 (en) 2024-02-28
EP3652674A4 (en) 2020-07-08
EP3652674A1 (en) 2020-05-20

Similar Documents

Publication Publication Date Title
US10290090B2 (en) Image-based tube slot circle detection for a vision system
JP6960980B2 (en) Image-based tray alignment and tube slot positioning in visual systems
CN107615336B (en) Location-based detection of tray slot type and cuvette type in a vision system
CN103885168B Self-alignment method for microscope unit
US20170061614A1 (en) Image measuring apparatus and non-temporary recording medium on which control program of same apparatus is recorded
CN110832502B (en) 2020-02-21 2023-09-05 Image-based tube top circle detection with multiple candidates
CN106525620A (en) Hardness test apparatus and hardness testing method
US20180025211A1 (en) Cell tracking correction method, cell tracking correction device, and storage medium which stores non-transitory computer-readable cell tracking correction program
RU2019105286A (en) PREPARATION SCAN WITH OPTIMIZED WORKING PROCESS
CN114513601A (en) Method for compensating lens switching error and image analysis device
CN113758856A (en) Method for relocating cells on smear and cell image analysis device
CN114125261A (en) Method for calibrating movement of movable device, image analysis device and analysis system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40015912

Country of ref document: HK

GR01 Patent grant