WO2013170204A1 - Augmented reality for workflow assistance

Augmented reality for workflow assistance

Info

Publication number: WO2013170204A1
Authority: WO
Grant status: Application
Application number: PCT/US2013/040637
Other languages: French (fr)
Inventors: Baris YAGCI, Elizabeth Bononno
Original Assignee: Siemens Healthcare Diagnostics Inc.
Prior art keywords: sample, tray, operator, samples, system

Classifications

    • G PHYSICS: G06F ELECTRIC DIGITAL DATA PROCESSING
        • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
        • G06F 3/14 Digital output to display device; cooperation and interconnection of the display device with other functional units
        • G06F 3/147 Digital output to display device using display panels
    • B PERFORMING OPERATIONS; TRANSPORTING: B01L CHEMICAL OR PHYSICAL LABORATORY APPARATUS FOR GENERAL USE
        • B01L 2300/021 Identification, e.g. bar codes
        • B01L 2300/023 Sending and receiving of information, e.g. using bluetooth
        • B01L 2300/024 Storing results with means integrated into the container
        • B01L 2300/025 Displaying results or values with integrated means
        • B01L 2300/027 Digital display, e.g. LCD, LED
        • B01L 9/00 Supporting devices; holding devices
        • B01L 9/06 Test-tube stands; test-tube holders
    • G PHYSICS: G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
        • G09G 2380/08 Biomedical applications

Abstract

Systems and methods assist an operator handling samples or other items that have an ordered relationship with an environment. A camera identifies samples and trays, including the orientation of a tray, to assist the operator in locating a sample at a known location in the tray and to track the chain of custody of the sample. Information, which can include augmented reality cues, can be displayed to the operator to assist him/her in locating a sample.

Description

AUGMENTED REALITY FOR WORKFLOW ASSISTANCE

[0001] This application claims priority to U.S. provisional application Serial No. 61/645,775, filed May 11, 2012, which is incorporated herein by reference in its entirety.

TECHNOLOGY FIELD

[0002] The present invention relates in general to an augmented reality (AR) system for facilitating location of objects that have a known ordered relationship and, more particularly, to systems and methods for maintaining chains of custody of patient samples or other objects where an operator can be assisted by augmented reality. Embodiments of the present invention are particularly well suited for, but in no way limited to, maintaining chain of custody of patient samples in a laboratory environment, where an operator participates in a defined workflow.

BACKGROUND

[0003] In vitro diagnostics (IVD) allows labs to assist in the diagnosis of disease based on assays performed on patient fluid samples. IVD includes various types of analytical tests and assays related to patient diagnosis and therapy that can be performed by analysis of a liquid sample taken from a patient's bodily fluids, or abscesses. These assays are typically conducted with automated clinical chemistry analyzers (analyzers) into which tubes or vials containing patient samples have been loaded. Because of the variety of assays needed in a modern IVD lab, and the volume of testing necessary to operate a lab, multiple analyzers are often employed in a single lab. Between and amongst analyzers, automation systems may also be used. Samples may be transported from a doctor's office to a lab, stored in the lab, placed into an automation system or analyzer, and stored for subsequent testing.

[0004] In the prior art, storage and transport between analyzers is typically done manually using trays. A tray is typically an array of several patient samples stored in test tubes. These trays are often stackable and facilitate easy carrying of multiple samples from one part of the laboratory to another. For example, a laboratory may receive a tray of patient samples for testing from a hospital or clinic. That tray of patient samples can be stored in refrigerators in the laboratory. In some automation systems, an analyzer can accept a tray of patient samples and handle the samples accordingly, while some analyzers may require that samples be removed from trays by the operator and placed into carriers (such as pucks) before further handling. Trays are generally passive devices that allow samples to be carried and, in some cases, arranged in an ordered relationship.

[0005] Samples are typically identified by a barcode on the test tube carrying the sample. These barcodes are often difficult to read without scanning the barcode information. As a sample moves through the IVD environment, the barcode is read at multiple locations. By reading the barcode, each processor or machine within the IVD environment can identify the sample and determine how to handle the sample. For example, a patient sample may require three specific tests. When a sample comes into the lab, an operator can scan the barcode using a barcode scanner and a computer display may tell the operator which machine the sample should be placed in. At the machine (such as an analyzer), the barcode will be read once again. The analyzer will then determine which tests should be performed. If an automation system is used, the barcode is often read at each decision point within the automation system. For example, if a track system is used to route samples between multiple analyzer testing stations, decision points will read the barcode and determine whether to redirect each sample to each of the various testing stations.

[0006] Once in a tray, these barcodes are often obscured, making it difficult for an operator to readily identify individual samples. This can become problematic where the operator wishes to retrieve specific samples. For example, if tests are performed on a specific sample and a physician subsequently analyzes the results and determines that additional testing is needed, that sample may not be readily available. The sample may be stored in a refrigerator in one of many stackable trays. An operator would need to identify the individual tray that contains the sample, remove the tray from the refrigerator, and manually locate the specific sample by withdrawing samples one at a time. This can be a time-consuming and frustrating process.

[0007] Similarly, prior art IVD systems typically rely on barcode information to determine the identity of a sample at a given barcode reader, but the systems do not easily facilitate chain of custody consideration. For example, once a sample is scanned at a barcode reader, the sample's instantaneous location can be determined (e.g., it is the sample at the current reader), but subsequent action on the sample is not linked to that sample until the sample's barcode appears in the environment once again. This can result in dozens of scans of the barcode as a sample moves throughout the environment. Because each scan can be a slow optical process, these scans can add to the overall processing time of samples, particularly when a lab is handling a large volume of samples. While many prior art systems are slow enough that this does not create a bottleneck, this could present a problem once systems become faster.

SUMMARY

[0008] Embodiments of the present invention address and overcome one or more of the above shortcomings and drawbacks by providing devices and systems for visually displaying information about one or more samples to an operator in the operator's field of vision. By displaying human recognizable information in the operator's field of vision, such as via an augmented reality / heads-up display, a human operator can more easily manage a large number of samples and quickly determine information, such as status information about the sample. This technology is particularly well-suited for, but by no means limited to, displaying information about samples as they move within an in vitro diagnostics (IVD) environment. Some embodiments are also suitable for tracking and identifying any type of objects in an ordered environment.

[0009] Embodiments of the present invention are directed to a system for assisting an operator, including at least one camera, at least one display, and at least one tray capable of receiving a plurality of samples and detecting their location within the tray. The system also includes one or more processors together configured to monitor the location of the plurality of samples and to display information on the display to assist an operator in locating at least one sample.

[0010] According to one aspect of the invention, at least one camera is configured to read a barcode associated with at least one sample. According to another aspect of the invention, at least one tray is configured to transmit information about the presence or absence of samples in the corresponding locations within the tray to the one or more processors. In another aspect, the processors are together further configured to receive barcode information associated with a sample, and correlate the presence of the sample at a corresponding position in the tray. In yet another aspect, the processors are further configured to maintain a chain of custody for the plurality of samples.

[0011] According to one aspect of the invention, a tray includes a plurality of slots and at least one orientation mark. The at least one camera can be configured to capture an image of the orientation mark and the processors are together further configured to determine an orientation of the tray. The processors can also be configured to determine a location of at least one of the plurality of samples in the image. The display can be configured to display information in response to the location of the plurality of samples in the image.

[0012] According to one aspect of the invention, at least one display includes a heads-up display. According to another aspect of the invention, at least one wearable camera is used.

[0013] Embodiments of the present invention are directed to a method for assisting an operator including steps of determining an identity of a first sample, detecting the presence of an unidentified sample at a first position in a first tray and correlating the identity of the first sample with the unidentified sample. The identity of the first sample and the first position is communicated to a processor. The processor records an association between the first sample and first position to maintain a chain of custody of the first sample.

[0014] According to one aspect of the invention, the removal of the first sample from the first position in the first tray is detected and the processor determines a new location of the first sample. According to another aspect of the invention, the first sample is located in response to a query and the location of the first sample is displayed via an augmented reality display. In another aspect, the removal of the first sample from the first position in the first tray is detected and, subsequently, the presence of the first sample at a second position in a second tray is detected. The processor records an association between the first sample and the second position to update the chain of custody of the sample.

[0015] According to one aspect of the invention, the removal of the first sample from the first position in the first tray is detected and the processor updates the chain of custody to reflect that an operator has custody of the first sample. According to another aspect of the invention, the removal of the first sample from the first position in the first container is detected and the processor determines a new location of the first item. According to yet another aspect of the invention, in response to a query, the first item is located and the location of the first item is displayed via an augmented reality display. According to still another aspect of the invention, the removal of the first item from the first position in the first container is detected, and the presence of the first item at a second position in a second container is subsequently detected. The processor records an association between the first item and second position to update the chain of custody of the sample. According to a further aspect of the invention, the removal of the first item from the first position in the container is detected and the processor updates the chain of custody to reflect that an operator has custody of the first item.

[0016] Embodiments of the present invention are directed to a system having hardware and software configured to determine an identity of a first item, detect the presence of an unidentified item at a first position in a first container, and correlate the identity of the first item with the unidentified item. The identity of the first item and the first position is communicated to a processor, which records an association between the first item and first position to maintain a chain of custody of the first item.

BRIEF DESCRIPTION OF THE DRAWINGS

[0017] The foregoing and other aspects of the present invention are best understood from the following detailed description when read in connection with the accompanying drawings. For the purpose of illustrating the invention, there is shown in the drawings embodiments that are presently preferred, it being understood, however, that the invention is not limited to the specific instrumentalities disclosed. Included in the drawings are the following Figures:

[0018] FIG. 1 is a top view of a sample tray for use with certain embodiments of the present invention;

[0019] FIG. 2 is a cross-sectional diagram of the internal components for use in an exemplary sample tray for use with certain embodiments of the present invention;

[0020] FIG. 3 is a representation of exemplary marks that may be used to orient the field of vision with certain embodiments of the present invention;

[0021] FIG. 4 is a sample augmented reality image that may be suitable for use with certain embodiments of the present invention;

[0022] FIG. 5 is a sample augmented reality image that may be suitable for use with certain embodiments of the present invention;

[0023] FIG. 6 is a perspective view of exemplary augmented reality glasses that may be used with certain embodiments of the present invention;

[0024] FIG. 7 is a hardware system diagram for an augmented reality system that may be suitable for use with certain embodiments of the present invention;

[0025] FIG. 8 is a system diagram for an augmented reality system that may be suitable for use with certain embodiments of the present invention;

[0026] FIG. 9 is a flowchart depicting a method for use with certain embodiments of the present invention;

[0027] FIG. 10 is a flowchart depicting an information flow for use with certain embodiments of the present invention; and

[0028] FIG. 11 is a flowchart depicting the chain of custody identified by certain embodiments of the present invention.

DETAILED DESCRIPTION OF ILLUSTRATIVE EMBODIMENTS

TERMS AND CONCEPTS ASSOCIATED WITH SOME EMBODIMENTS

[0029] Analyzer: Automated clinical analyzers ("analyzers") include clinical chemistry analyzers, automated immunoassay analyzers, or any other type of in vitro diagnostics (IVD) testing analyzers. Generally, an analyzer performs a series of automated IVD tests on a plurality of patient samples. Patient samples may be loaded into an analyzer (manually or via an automation system), which can then perform one or more immunoassays, chemistry tests, or other observable tests on each sample. The term analyzer may refer to, but is not limited to, an analyzer that is configured as a modular analytical system. A modular analytical system includes an integrated and extendable system comprising any combinations of a plurality of modules (which can include the same type of module or different types of modules) interconnected in a linear or other geometric configuration by an automation surface, such as an automation track. In some embodiments, the automation track may be configured as an integral conveyance system on which independent carriers are used to move patient samples and other types of material between the modules. Generally, at least one module in a modular analytical system is an analyzer module. Modules may be specialized or made redundant to allow higher throughput of analytical tasks on patient samples.

[0030] Analyzer module: An analyzer module is a module within a modular analyzer that is configured to perform IVD tests, such as immunoassays, chemistry tests, or other observable tests on patient samples. Typically, an analyzer module extracts a liquid sample from a sample vessel and combines the sample with reagents in reaction cuvettes or tubes (referred to generally as reaction vessels). Tests available in an analyzer module may include, but are not limited to, a subset of electrolyte, renal or liver function, metabolic, cardiac, mineral, blood disorder, drug, immunoassay, or other tests. In some systems, analyzer modules may be specialized or made redundant to allow higher throughput. The functions of an analyzer module may also be performed by standalone analyzers that do not utilize a modular approach.

[0031] Carrier: A carrier is a transportation unit that can be used to move sample vessels (and, by extension, fluid samples) or other items in an automation system. In some embodiments, carriers may be simple, like traditional automation pucks (e.g., passive devices comprising a holder for engaging a tube or item, a friction surface to allow an external conveyor belt in the automation track to provide motive force, and a plurality of sides that allow the puck to be guided by walls or rails in the automation track to allow the track to route a puck to its destination). In some embodiments, carriers may include active components, such as processors, motion systems, guidance systems, sensors, and the like. In some embodiments, carriers can include onboard intelligence that allows carriers to be self-guided between points in an automation system. In some embodiments, carriers can include onboard components that provide motive forces while, in others, motive forces may be provided by an automation surface, such as a track. In some embodiments, carriers move along automation tracks that restrict motion to a single direction (e.g., fore and aft) between decision points. Carriers may be specialized to a given payload in an IVD environment, such as having a tube holder to engage and carry a sample tube, or may include mounting surfaces suitable to carry different items around an automation system. Carriers can be configured to include one or more slots (e.g., a carrier may hold one or a plurality of sample vessels).

[0032] Carriers/Trays/Racks: A carrier may be distinguishable from a tray, which may commonly refer to a device that does not travel along an automation track (e.g., carried by an operator) and is configured to hold a plurality of payloads (e.g., sample tubes). A rack is a general term to describe a device that is configured to hold a plurality of payloads (e.g., sample tubes). A rack may refer to a tray (when used outside an automation track) or carrier (when configured to traverse an automation track) that is configured to carry a plurality of payloads. Racks may refer to one-dimensional or two-dimensional arrays of slots, in some embodiments.

[0033] Central controller or processor: A central controller/processor (which may sometimes be referred to as a central scheduler) is a processor that is part of the automation system, separate from any processors onboard carriers. A central controller can facilitate traffic direction, scheduling, and task management for carriers. In some embodiments, a central controller can communicate with subsystems in the automation system and wirelessly communicate with carriers. This may also include sending trajectory or navigational information or instructions to carriers and determining which carriers should go where and when. In some embodiments, local processors may be responsible for managing carriers on local track sections, such as managing local queues. These local processors may act as local equivalents to central controllers.

[0034] In vitro diagnostics (IVD): In vitro diagnostics (IVD) are tests that can detect diseases, conditions, infections, metabolic markers, or quantify various constituents of bodily materials/fluids. These tests are performed in laboratory, hospital, physician office, or other health professional settings, outside the body of a patient. IVD testing generally utilizes medical devices intended to perform diagnoses from assays in a test tube or other sample vessel or, more generally, in a controlled environment outside a living organism. IVD includes testing and diagnosis of disease or quantifying various constituents of bodily materials/fluids based on assays performed on patient fluid samples. IVD includes various types of analytical tests and assays related to patient diagnosis and therapy that can be performed by analysis of a liquid sample taken from a patient's bodily fluids, or abscesses. These assays are typically conducted with analyzers into which tubes or vials containing patient samples have been loaded. IVD can refer to any subset of the IVD functionality described herein.

[0035] Lab automation system: Lab automation systems include any systems that can automatically (e.g., at the request of an operator or software) shuttle sample vessels or other items within a laboratory environment. With respect to analyzers, an automation system may automatically move vessels or other items to, from, amongst, or between stations in an analyzer. These stations may include, but are not limited to, modular testing stations (e.g., a unit that can specialize in certain types of assays or can otherwise provide testing services to the larger analyzer), sample handling stations, storage stations, or work cells.

[0036] Module: A module performs specific task(s) or function(s) within a modular analytical system. Examples of modules may include: a pre-analytic module, which prepares a sample for analytic testing, (e.g., a decapper module, which removes a cap on top of a sample test tube); an analyzer module, which extracts a portion of a sample and performs tests or assays; a post-analytic module, which prepares a sample for storage after analytic testing (e.g., a recapper module, which reseals a sample test tube); or a sample handling module. The function of a sample handling module may include managing sample containers/vessels for the purposes of inventory management, sorting, moving them onto or off of an automation track (which may include an integral conveyance system), moving sample containers/vessels onto or off of a separate laboratory automation track, and moving sample containers/vessels into or out of trays, racks, carriers, pucks, and/or storage locations.

[0037] Payload: While exemplary carriers are described with respect to carrying patient samples, in some embodiments, carriers can be used to transport any other reasonable payload across an automation system. This may include fluids, fluid containers, reagents, waste, disposable items, parts, or any other suitable payloads.

[0038] Processor: A processor may refer to one or more processors and/or related software and processing circuits. This may include single or multicore processors, single or multiple processors, embedded systems, or distributed processing architectures, as appropriate, for implementing the recited processing function in each embodiment.

[0039] Samples: Samples refers to fluid or other samples taken from a patient (human or animal) and may include blood, urine, hematocrit, amniotic fluid, or any other fluid suitable for performing assays or tests upon. Samples may sometimes refer to calibration fluids or other fluids used to assist an analyzer in processing other patient samples.

[0040] STAT (short turnaround time) sample: Samples may be assigned different priorities by a laboratory information system (LIS) or operator; STAT priority can be assigned to samples that should take precedence over non-STAT samples in the analyzer. When used judiciously, this may allow certain samples to move through the testing process faster than other samples, allowing physicians or other practitioners to receive testing results quickly.

[0041] Station: A station includes a portion of a module that performs a specific task within a module. For example, the pipetting station associated with an analyzer module may be used to pipette sample fluid out of sample containers/vessels being carried by carriers on an integrated conveyance system or a laboratory automation system. Each module can include one or more stations that add functionality to a module.

[0042] Station/module: A station includes a portion of an analyzer that performs a specific task within an analyzer. For example, a capper/decapper station may remove and replace caps from sample vessels; a testing station can extract a portion of a sample and perform tests or assays; a sample handling station can manage sample vessels, moving them onto or off of an automation track, and moving sample vessels into or out of storage locations or trays. Stations may be modular, allowing stations to be added to a larger analyzer. Each module can include one or more stations that add functionality to an analyzer, which may be comprised of one or more modules. In some embodiments, modules may include portions of, or be separate from, an automation system that may link a plurality of modules and/or stations. Stations may include one or more instruments for performing a specific task (e.g., a pipette is an instrument that may be used at an immunoassay station to interact with samples on an automation track). Except where noted otherwise, the concepts of module and station may be referred to interchangeably.

[0043] Tubes/sample vessels/fluid containers: Samples may be carried in vessels, such as test tubes or other suitable vessels, to allow carriers to transport samples without contaminating the carrier surfaces.

EXEMPLARY EMBODIMENTS

[0044] The above problems in the prior art have motivated the discovery of improved apparatus and methods for reliably and/or automatically displaying information about samples as the samples are being stored and transported within an IVD environment, or other environment where objects can benefit from chain of custody and an ordered relationship. Specifically, embodiments of the present invention provide an operator with an augmented reality (AR) vision system that allows the operator to easily track samples within the IVD environment based on the recorded chain of custody of samples and information about the relationship between the sample and the environment. For example, when the system knows that a sample is in a given tray and in a given position in that tray, the operator may wear AR glasses that automatically identify the location of the tray, which tray the sample is in within a plurality of trays, and which slot the sample is in within the tray. The AR glasses can provide visual cues via a heads-up display (HUD) to the operator to assist the operator in rapidly identifying the location of the sample. Similarly, once a sample is located, the AR system can provide basic information about the sample to the operator via the HUD. This information can include the identity of the sample, the tests to be performed on the sample, where the sample should be taken next, the priority of the sample, and any other relevant information.

[0045] The augmented reality system can provide visual cues to the operator in any number of suitable ways. For example, the operator may wear AR glasses where a semi-transparent display is placed on at least one lens of the glasses to form a HUD. A camera on the glasses can identify the image that the operator sees. The camera can be any conventional camera system, such as a CCD or CMOS-based sensor and corresponding optics. Multiple cameras can be used to provide stereoscopic imaging. Similarly, an IR range-finding camera can be paired with an imaging camera to give both depth and image information. Such a pairing is commercially available as part of the Kinect SDK from Microsoft Corporation and others.

[0046] By analyzing the image information, and comparing the visual information with known information about samples, the display on the glasses can display visual cues in the visual field of the operator to convey which object in the image plane relates to the sample, as well as displaying any information about the sample to the operator in real time. The operator can be considered a user operating the systems described herein.

[0047] In some embodiments, rather than using a display that is in the lenses of glasses, the glasses may include a projector that projects light into the environment, visible to the operator, to assist the operator in locating samples, thus creating a HUD. For example, a redirectable laser can be employed on the glasses to project a beam of light into the environment to indicate visually to the operator where his/her attention should be drawn. Similarly, a multi-pixel projector, such as a DLP, can project an image into the environment that can convey information to an operator. For example, text about a sample can be projected onto a flat surface within the visual field, while a flashing image can be projected onto the physical object that contains the sample that is visible in the visual field.

[0048] Augmented reality systems that consider the operator's visual field and augment the images that an operator sees should be accessible to persons of ordinary skill in the art. For example, there are several vendors that supply AR glasses, such as Vuzix. Google has recently announced plans to sell an android-based AR system that uses one or more cameras and network-based image processing to identify salient features within the image and augment the image for operators to provide AR capability cheaply and efficiently. For example, cell phones can be supplied with applications that use the camera of the device and augment the image displayed on the screen with pertinent information. Such an AR system could be easily employed with some embodiments of the present invention. For example, a laboratory technician that is searching for a given sample could utilize a handheld device (e.g., a cell phone or tablet device) to search the environment as the operator moves about a laboratory. As the operator exposes the image field of this handheld device to certain trays and samples, the display of the device can identify relevant samples for the operator in the display of the handheld device. The operator can then click on the image of the sample to pull up additional information. Accordingly, in some embodiments, existing hardware can be easily adapted for use with the present invention.

[0049] One particular use case that may be useful for an operator in an IVD environment is locating a sample within a tray. This can be accomplished by providing visual cues to an image processor that a given tray is in the images being observed. Samples in an IVD environment are typically contained in tubes. The tubes holding samples can be referred to generally as containers. These containers are placed into and held in trays.

[0050] Assisting the image processor in orienting the tray within the image can further provide a reference frame to the system for identifying individual locations within the tray. If the system has knowledge of the location of a sample within the tray, the system can easily identify which sample in the image is a given sample. If an operator runs a query to help find a given sample or group of samples, the system can identify the position or positions associated with the query and, upon locating that position within a tray in the image field, the AR system can display information to the operator to assist the operator in identifying the appropriate sample or samples.

[0051] FIG. 1 shows an exemplary tray having visual cues that may be suitable for use with certain embodiments. Tray 10 includes a plurality of slots 12 arranged in an array. Trays allow samples (or any other items) to be arranged and stored in an ordered relationship. In this example, there are 25 slots in tray 10, which allow samples to be arranged in an ordered array. Each slot can hold a sample tube. An orientation mark 14 can be provided on the top surface of tray 10. This orientation mark can include the identity of the tray. By knowing the position of the orientation mark within an image of the top of the tray, the system can orient each of the 25 slots. For example, slot 15 may be the known location of a given sample in tray 10. By knowing that orientation mark 14 is in the bottom left corner of the tray, the system can determine that slot 15 is in the upper right corner. This means that the location of slot 15 in the image is coincident with the location of that sample. The system can then provide a visual indicator that identifies slot 15, such as an arrow or box that can be drawn around slot 15, to identify the location of a sample.
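
To make the geometry concrete, the following Python sketch (illustrative only; the function and its corner names are not part of the original disclosure) maps a slot index to an approximate image position once the mark's corner and the tray's bounding box in the image are known. The 5 x 5 grid matches tray 10:

    GRID = 5  # tray 10 holds a 5 x 5 array of slots

    def slot_image_position(slot_index, mark_corner, box):
        """Approximate (x, y) pixel center of a slot in the camera image.

        slot_index  -- 0..24, counted row-major from the tray corner
                       carrying orientation mark 14
        mark_corner -- image corner where the mark was detected:
                       "bottom-left", "bottom-right", "top-left", "top-right"
        box         -- (x0, y0, width, height) of the tray top in the image
        """
        row, col = divmod(slot_index, GRID)
        # Re-express tray (row, col) in image axes according to where the
        # tray's reference corner appears, i.e. how the tray is rotated.
        if mark_corner == "bottom-left":
            r, c = GRID - 1 - row, col
        elif mark_corner == "bottom-right":
            r, c = GRID - 1 - col, GRID - 1 - row
        elif mark_corner == "top-right":
            r, c = row, GRID - 1 - col
        else:  # "top-left"
            r, c = col, row
        x0, y0, w, h = box
        return (x0 + (c + 0.5) * w / GRID, y0 + (r + 0.5) * h / GRID)

    # As in the example above: a mark in the bottom-left corner places the
    # opposite-corner slot (index 24) in the upper right of the image.
    print(slot_image_position(24, "bottom-left", (100, 100, 400, 400)))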

[0052] Orientation marks need not be placed on the top surface. For example, orientation mark 16 can be provided on one or more side surfaces of the tray 10. This can allow the tray to be identified when stacked amongst other trays. Furthermore, it may allow the tray to be oriented if part of the top surface is obscured. In some embodiments, an oblique surface can also be used to provide an orientation mark that is visible to the operator from many angles. In some embodiments, orientation marks operate as both identification marks that identify the tray and as positioning marks that identify a known position of the tray in an image. Multiple marks can also be used to add robustness to viewing orientation information from multiple angles. Furthermore, a tray can also be designed with handles on certain sides, so that the range of orientations of the tray can be limited to further add robustness to detecting the orientation of the tray and viewing marks. Similarly, a tray can be made of an asymmetric shape or layout (e.g., a slot missing or a top surface that is not square) to provide orientation clues to the image processor.

[0053] FIG. 2 shows a cross-section of a portion of tray 10, including the components that may be used to detect the presence or absence of a sample tube within each slot. For example, slot 15 includes switch 20, which detects the presence or absence of a sample tube. Switch 20 may be any suitable form of a switch including, for example, a pressure switch or mechanical switch that is depressed when a tube is inserted into slot 15. Switch 20 communicates via signal line 22 to microcontroller 24. Suitable sensors may also include optical sensors that detect the presence or absence of an object by interruption of an optical beam, or via a camera. Signal line 22 can be part of a bus or an individual conductor. Microcontroller 24 can use the information received from switch 20 to determine that a sample has been placed into slot 15 or removed therefrom. Microcontroller 24 can be a microcontroller or other processor suitable for detecting the presence or absence of samples in slots of the tray, and for communicating this information to information systems used for tracking the chain of custody of samples within the laboratory.

[0054] Tray 10 includes power system 25 and memory 26 for working with microcontroller 24. Power system 25 can include a battery or other suitable power systems. Memory 26 can include programming instructions and data memory suitable for determining the presence or absence of samples and communicating this information. Communication system 28 communicates information about states of slots to other components in the information systems of the laboratory. Communication system 28 can be a wireless device, such as a Wi-Fi device, XBee communication system, or near field communication device. Information about the states of the various slots can be conveyed in any suitable manner, including conveying information as states change, upon request from an external system, or at regular intervals, providing a list of occupied slots and timestamp information relating to when the slots became occupied or unoccupied.
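
The reporting behavior described above might be modeled as a simple polling loop, as in the following Python sketch; read_switch and send are hypothetical stand-ins for the hardware- and radio-specific routines that the text leaves unspecified:

    import time

    NUM_SLOTS = 25

    class TrayMonitor:
        """Illustrative model of the polling loop microcontroller 24 might run."""

        def __init__(self, tray_id, read_switch, send):
            self.tray_id = tray_id
            self.read_switch = read_switch   # hypothetical: polls one switch (like switch 20)
            self.send = send                 # hypothetical: communication system 28
            self.state = [False] * NUM_SLOTS

        def poll_once(self):
            for slot in range(NUM_SLOTS):
                occupied = self.read_switch(slot)
                if occupied != self.state[slot]:
                    self.state[slot] = occupied
                    # Timestamped change reports let downstream systems
                    # correlate slot events with barcode-scan events.
                    self.send({"tray": self.tray_id, "slot": slot,
                               "occupied": occupied, "time": time.time()})

    # Simulated hardware for demonstration: only slot 15 is occupied.
    monitor = TrayMonitor("TRAY-10",
                          read_switch=lambda slot: slot == 15,
                          send=print)
    monitor.poll_once()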

[0055] Chain of custody can generally be considered synonymous with keeping track of a sample. The level of detail maintained in a chain of custody can vary by embodiment and application. In some embodiments, a strict chain of custody may be desired whereby, at substantially each moment a sample is in a lab environment, it is attributed to a tray, instrument, user, or other defined custodian. In some embodiments, chain of custody can be more loosely defined to determine if a sample is in a tray, and where it is generally or specifically located. In some embodiments, there can be a lapse in chain of custody as the sample moves throughout the environment. The systems and methods described herein can substantially assist in providing automatic assistance in updating the chain of custody of a sample as it is placed into and removed from custodians, such as trays or instruments.

[0056] Information about the state of each slot in the tray, along with the time at which the state changed, can be useful for determining chain of custody of samples. For example, when an operator scans the barcode of a sample and, subsequently, a slot of a tray becomes occupied, the system can correlate the two events and determine that the scanned sample has now been placed in that given slot in the given tray. This information can then be stored for later use. When the operator subsequently needs to locate that sample, the knowledge that the sample has been placed in a given slot in a given tray can be used to easily identify the current location of the sample. This can be useful to other systems within the lab. For example, when a tray is placed into an automated clinical chemistry analyzer (analyzer), the analyzer may request identification of all the STAT samples. Because the identity of each sample in each slot is known, the slots containing STAT samples can be conveyed easily to the analyzer. The analyzer can then select the STAT samples for priority handling within the system.
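
A minimal sketch of this correlation logic, assuming timestamped events and an arbitrary five-second matching window (neither detail is specified above):

    MATCH_WINDOW_S = 5.0   # assumed matching window, not from the disclosure

    class CustodyTracker:
        def __init__(self):
            self.last_scan = None          # (sample_id, timestamp)
            self.location = {}             # sample_id -> (tray, slot)

        def on_barcode_scan(self, sample_id, t):
            self.last_scan = (sample_id, t)

        def on_slot_event(self, tray, slot, occupied, t):
            # A slot becoming occupied shortly after a scan is attributed
            # to the most recently scanned sample.
            if occupied and self.last_scan:
                sample_id, t_scan = self.last_scan
                if t - t_scan <= MATCH_WINDOW_S:
                    self.location[sample_id] = (tray, slot)
                    self.last_scan = None

    tracker = CustodyTracker()
    tracker.on_barcode_scan("SAMPLE-123", t=100.0)               # invented IDs
    tracker.on_slot_event("TRAY-7", slot=15, occupied=True, t=101.2)
    print(tracker.location)   # {'SAMPLE-123': ('TRAY-7', 15)}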

[0057] Similarly, when an analyzer completes processing of a given sample, a sample handling mechanism may place that completed sample back in the tray. The location of that completed sample can be determined by correlating when the sample was completed and when a slot became occupied. When an analyzer begins working with a sample from a tray, the opposite situation occurs: as each sample is removed from the tray, the switches in the tray can indicate that the analyzer now has custody and control of the sample. Similarly, in some embodiments, the analyzer can actively identify to the IT systems in the lab which sample it is removing or placing in the tray and actively identify the slot it is interacting with. As a result, custody can be handed off between the tray and the analyzer and vice versa. This allows a real-time chain of custody for each sample to be established without the need for additional operator intervention. Accordingly, in some embodiments, existing laboratory procedures can continue to be followed, while the system automatically maintains a chain of custody that was not previously recorded in prior art systems.
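
This handoff can be pictured as an append-only custody ledger; the sketch below uses invented identifiers and is not taken from the disclosure:

    import time

    class CustodyLedger:
        def __init__(self):
            self.history = {}   # sample_id -> list of (timestamp, custodian)

        def transfer(self, sample_id, custodian):
            # Each custody change is appended with a timestamp, so the
            # full chain of custody can be replayed later.
            self.history.setdefault(sample_id, []).append((time.time(), custodian))

        def current_custodian(self, sample_id):
            events = self.history.get(sample_id)
            return events[-1][1] if events else None

    ledger = CustodyLedger()
    ledger.transfer("SAMPLE-123", "tray TRAY-7, slot 15")   # placed in tray
    ledger.transfer("SAMPLE-123", "analyzer A2")            # removed by analyzer
    ledger.transfer("SAMPLE-123", "tray TRAY-7, slot 15")   # returned when complete
    print(ledger.current_custodian("SAMPLE-123"))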

[0058] FIG. 3 shows examples of orientation marks that may be suitable for certain embodiments of the present invention. These marks may be used on one or more surface(s) of trays that hold sample vessels or tubes. In some embodiments, these marks may be used with other objects within the laboratory environment to provide orientation in the images observed by the system to increase the capability of augmenting the reality of an operator of the system. These marks may include both a reference point that is distinct from other patterns likely to be observed in the image and, in some embodiments, can also provide identity information about the reference point. For example, orientation mark 30 is a QR code. QR codes can include several bits of information, depending on the version of QR code being used. For example, one version of QR code provides a 21 x 21 pixel image, while another version provides a 33 x 33 pixel image that can convey up to 50 characters of information along with error correction. Any suitable QR version can be chosen, and any known QR encoding scheme can be used as suitable for the application; for example, a lower-resolution QR code may be used to reduce the cost of the imaging sensors used in the AR system. Multiple QR codes can be used to increase the amount of information conveyed, if necessary. As higher resolution imaging devices and higher quality optics become readily available or cheaper, the amount of information conveyed in QR codes used in orientation marks can increase. QR codes can be advantageous as orientation marks because the built-in error coding provides robustness, while the codes themselves are readily distinguishable from surrounding imaging features in most images. Furthermore, QR codes are asymmetric, including three reference points. Orientation information can be determined from the code itself, without requiring specific knowledge of the surface that the QR code is placed upon. In some embodiments, other two-dimensional marks are used that convey information and position, such as competing marks to the QR standard.
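
As one concrete possibility, a mark such as mark 30 could be located and decoded with OpenCV's QR detector (cv2.QRCodeDetector is a real API; the orientation arithmetic and file name below are illustrative assumptions):

    import math
    import cv2

    def read_orientation_mark(image):
        data, points, _ = cv2.QRCodeDetector().detectAndDecode(image)
        if points is None or not data:
            return None
        corners = points.reshape(-1, 2)   # 4 corners, nominally starting top-left
        # The angle of the code's top edge relative to the image x-axis
        # gives the in-plane rotation of the surface carrying the mark.
        dx = corners[1][0] - corners[0][0]
        dy = corners[1][1] - corners[0][1]
        angle = math.degrees(math.atan2(dy, dx))
        # The decoded payload can carry the tray identity.
        return {"tray_id": data, "corners": corners, "rotation_deg": angle}

    frame = cv2.imread("tray_top.png")    # hypothetical captured frame
    print(read_orientation_mark(frame))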

[0059] Orientation mark 32 is a bulls-eye mark. A bulls-eye mark is suitable for identifying a precise position on a surface, although it may not be suitable for conveying identity information about the position. Once a precise position on the surface is determined in the image plane, other visual cues about the surface, such as edges, can be used to determine the orientation of the surface. For example, by placing an orientation mark at a corner of the surface of a tray, identifying the edges of the tray can reveal the precise orientation of the tray in the image. Once the orientation is determined, the individual slots in the tray can be identified. Once the slots are identified, if an operator requests that the system locate a sample in a known slot, the image captured by an imaging device can be altered to highlight the location of the slot containing that sample in the image displayed to an operator. In some embodiments, the AR system does not display the image captured by an imaging device but, rather, displays highlighting information to the operator in his/her visual field such that it appears to the operator that part of the physical world is highlighted.

[0060] Mark 33 is a cross. Like bulls-eye 32, cross 33 is symmetric, and conveys little orientation information or data without the edges of the surface. Marks 34 and 35 are examples of simple orientation marks that convey both position and orientation information due to asymmetry. Limited identity information can also be conveyed with simple marks. For example, if a variety of marks are used, the type of mark that appears in an image can identify the type of object that is being viewed.

[0061] Barcode 38 provides another type of mark that can be used to convey information in an image. Barcode 38 can be used to identify a position. Individual marks within the barcode can be placed at known locations on the surface, such that an edge of a barcode indicates a predetermined position on the surface being viewed. Furthermore, barcode 38 can also convey data, such as the identity of the mark. Barcodes can also be used on sample tubes to identify the sample. By providing an imaging system that can read barcode information to determine information about objects in the environment, the system can identify not only the location of sample tubes relative to marks on a tray, but may also read the information on the sample tubes themselves.
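
As an illustration, tube barcodes such as barcode 38 could be decoded from a camera frame with the pyzbar library (a real package; its use here, and the file name, are assumptions rather than part of the disclosure):

    import cv2
    from pyzbar import pyzbar

    def find_barcodes(image):
        results = []
        for code in pyzbar.decode(image):
            results.append({
                # The decoded data identifies the sample or mark...
                "data": code.data.decode("utf-8"),
                # ...and the bounding rectangle gives its image position.
                "position": (code.rect.left, code.rect.top,
                             code.rect.width, code.rect.height),
            })
        return results

    frame = cv2.imread("tray_side.png")   # hypothetical captured frame
    for hit in find_barcodes(frame):
        print(hit["data"], "at", hit["position"])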

[0062] FIG. 4 shows an exemplary image that can be observed by an operator of an augmented reality system in accordance with some embodiments of the present invention. An operator can observe a stack of trays 41, 42, and 43, which can be stored in a refrigerated location. For example, an operator might open a refrigerator door to retrieve a sample amongst a plurality of refrigerated samples. To the human eye, there is little difference between the samples in the refrigerator. However, by utilizing an imaging device that can observe substantially the same image that an operator observes, one or more processors can process the image information to detect certain information that may not be human readable. For example, an imaging device mounted on glasses worn by the operator can observe barcode 45 to determine that tray 43 is the tray that the operator is looking for. In some embodiments, marks on the side of a tray may not be necessary if the trays are stored in a known order. A database that tracks the location of samples as they move through the laboratory environment can recall that the sample being sought is stored in the middle tray.

[0063] Once a processor determines from the image that tray 43 is the important tray, the processor can provide a visual indicator to the operator that tray 43 is important. By using an imaging device to record images that are substantially co-planar with the operator's field of view, the processor can determine where in the image plane to draw the operator's attention. The processor can use a heads-up display, or other display that can overlay information in either the operator's field of view or an image of the surrounding environment to visually convey the importance of certain parts of the environment to the operator. For example, the processor can send instructions to a heads-up display in the glasses worn by the operator to display an illuminated box 47 around the visual area containing tray 43. Box 47 can be illuminated in any suitable manner, depending on the display used. For example, in a display utilizing a projector, the box can be projected via a laser or DLP projector into the environment. For a heads-up display utilizing a transparent, or semi-transparent, screen in front of the operator's eyes, the box can be drawn on the screen or screens. For a display utilizing a separate screen that is not a heads-up display (e.g., an LCD, LED, AMOLED screen on a handheld device), the display can play back an image of the environment and draw box 47 in an image.

[0064] In addition to box 47, the display can also include textual information about the samples being sought. For example, text 48 can be displayed as part of the display in the HUD or handheld display. This text can appear to be floating in space next to the tray or sample being identified. In this example, the text includes the identity of the patient, the identification of the sample, and the tray and slot in which the sample is currently located.

[0065] FIG. 5 shows a similar example to that shown in FIG. 4, in which an individual sample in a tray is identified using the same mechanisms. In this example, the AR system processes the image to determine the orientation of tray 43. Once the orientation of tray 43 is determined, the AR system can determine where slot 6 is in the image. Once slot 6 is identified, the AR system can draw an illuminated box 49 around sample 50 contained in slot 6. Similarly, the AR system can display text 48 about the sample. It should be appreciated that other visual cues besides boxes 47 (FIG. 4) and 49 can also be used to draw the operator's attention to a given object in his/her environment. For example, a blinking indicator, an arrow, a circle, or any other shape can also be used to draw the operator's attention to a portion of the visual plane.
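
For the handheld-display variant, a highlight like box 49 and text 48 could be drawn onto a captured frame with OpenCV calls, as in the sketch below; the file names, coordinates, and label text are invented for illustration:

    import cv2

    def highlight(frame, box, label_lines):
        x, y, w, h = box                   # region containing the sample
        # Illuminated box around the sample (cf. boxes 47 and 49).
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 3)
        # Floating text beside the highlighted region (cf. text 48).
        for i, line in enumerate(label_lines):
            cv2.putText(frame, line, (x + w + 10, y + 20 + 22 * i),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
        return frame

    frame = cv2.imread("refrigerator_view.png")       # hypothetical frame
    annotated = highlight(frame, (220, 180, 60, 120),
                          ["Patient: DOE, JANE", "Sample: SAMPLE-123",
                           "Tray 43, slot 6"])
    cv2.imwrite("annotated.png", annotated)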

[0066] FIG. 6 shows an exemplary HUD unit that can be used by an operator as part of the AR system. AR glasses 52 can be worn by an operator similarly to normal lab safety goggles. The operator can observe his/her environment via lenses 54, which can be glass or polycarbonate lenses, similar to those used in safety goggles. AR glasses 52 also include some imaging components to provide AR functionality. For example, at least one HUD unit can be placed in the glasses such that the operator peers through the HUD display when observing his/her environment. In this example, two HUDs are used: AR glasses 52 include a left HUD 56 and a right HUD 57. These HUDs can be formed by any conventional HUD display technology. For example, a transparent or semi-transparent LCD panel can be used.

[0067] To facilitate augmenting the operator's reality using HUDs 56 and 57, AR glasses 52 are equipped with at least one camera to observe the environment in substantially the same visual plane as the operator does. AR glasses 52 include two cameras 58, one for each eye. Because cameras 58 are substantially near the observer's iris, the images recorded by cameras 58 are substantially coplanar with the images observed by the operator's eyes. This enables an image processor to see substantially what the observer sees. By determining where an object is in the image plane observed by cameras 58, the AR system can approximate where the object is in the observer's visual field. By correlating pixels in the image recorded by cameras 58 with pixels in HUDs 56 and 57, the AR system can determine which pixels in the displays correspond to the object in the image. This enables the AR system to draw boxes using HUDs 56 and 57 that approximate the location of a sample being viewed by the operator. By selectively enabling pixels in HUDs 56 and 57, the observer can see images through AR glasses 52 that are substantially the same as those shown in FIGs. 4 and 5.
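
The pixel correspondence between cameras 58 and HUDs 56 and 57 can be modeled as a fixed homography obtained from a one-time calibration, as in the following sketch; the calibration correspondences shown are invented values:

    import numpy as np
    import cv2

    # Calibration: the same four reference points located in camera-image
    # coordinates and in HUD-display coordinates (hypothetical values).
    cam_pts = np.float32([[0, 0], [1280, 0], [1280, 720], [0, 720]])
    hud_pts = np.float32([[40, 30], [820, 30], [820, 510], [40, 510]])
    H, _ = cv2.findHomography(cam_pts, hud_pts)

    def camera_to_hud(x, y):
        """Map a camera pixel to the HUD pixel that overlays it."""
        p = cv2.perspectiveTransform(np.float32([[[x, y]]]), H)
        return tuple(p[0, 0])

    # A sample found at camera pixel (640, 360) would be highlighted by
    # enabling the HUD pixels around this mapped point:
    print(camera_to_hud(640, 360))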

[0068] FIG. 7 is a hardware system diagram showing the internal components that are used in exemplary AR glasses. System 60 includes the components used to make a generic AR display. System 60 can be implemented via AR glasses 52 (FIG. 6), a handheld device carried by the operator, or any other AR system utilized by the operator. Processor 61 interprets image information and optionally sensor data to assist in determining orientation and interpreting the image that will be seen by an observer. Processor 61 may be a microcontroller, DSP, one or more CPUs, or any combination thereof. In some embodiments, processor 61 is a processor suitable for low-power operation and basic image processing. Power system 63 provides power to the system. Power system 63 can include batteries or the like. Memory 64 provides data and instruction memory to processor 61.

[0069] Processor 61 can determine orientation information from sensors 65. Sensors 65 can include gyroscopes, solid-state compasses, GPS, RFID sensors, or any other sensors that may assist in orienting the AR glasses. Imaging devices 73 provide image information, which can include orientation information, if orientation marks appear in the image. Imaging devices 73 can be cameras that are configured to see substantially similar images to those seen by an operator. AR system 60 can also include a database 74 that includes information about the environment, including the location of certain samples or other objects. This can be used by processor 61, along with the images retrieved by imaging devices 73 to determine if and where certain objects appear in the images. In some embodiments, the image processing can occur via an external processor, to allow more powerful computing to be employed without requiring a large power system 63. For example, processor 61 can utilize communication system 75 to communicate with other processors and IT resources to assist in image processing, or other processing. For example, a cloud computing resource may be utilized to increase the image processing power of the AR system, allowing processor 61 to be a low-power device.
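
One simple realization of this offloading is to post compressed frames to a remote service over the laboratory network; in the sketch below, the endpoint URL, payload, and response format are purely hypothetical:

    import cv2
    import requests

    REMOTE_ENDPOINT = "http://lab-middleware.example/api/locate"   # hypothetical

    def locate_remotely(frame, query_sample_id):
        # A low-power processor like processor 61 only compresses the frame;
        # the recognition work happens on the remote (e.g., cloud) resource.
        ok, jpeg = cv2.imencode(".jpg", frame)
        if not ok:
            return None
        resp = requests.post(REMOTE_ENDPOINT,
                             files={"frame": jpeg.tobytes()},
                             data={"sample": query_sample_id},
                             timeout=2.0)
        resp.raise_for_status()
        return resp.json()    # e.g. {"found": true, "box": [x, y, w, h]}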

[0070] Communication system 75 can be used to send image information to other processors or to receive information about the location of certain objects to update database 74. Communication system 75 may be a wireless device, such as a Wi-Fi device or an XBee radio transceiver. Communication system 75 may include all hardware necessary to wirelessly communicate with the AR system and other IT components within the laboratory environment. This can include, for example, any necessary antennas.

[0071] Processor 61 communicates with peripheral devices via bus 70. This allows processor 61 to gather sensor and image data via sensors 65 and imaging devices 73, compare information to database 74, communicate information via communication system 75, and display information using displays 72. Displays 72 can include any suitable display, including heads-up displays 57 and 56 (FIG. 6), a projector that projects information into the environment, or displays on one or more handheld devices or terminals.

[0072] FIG. 8 shows a laboratory system 80 that includes software and network resources that can utilize AR system 60 (FIG. 7). Laboratory information system (LIS) 83 manages patient data 82, including information about patient samples and testing status of the samples, and identifies the tests that should be performed on patient samples. LIS 83 can include IT resources suitable to allow doctors to access patient data 82 and update this information. Using LIS 83, doctors can request tests on patient samples, create new patient samples upon drawing samples from patients, monitor testing status, view the results of tests, and request additional tests to be performed. LIS 83 may include commercially available LIS software that is accessible and implemented by hospitals.

[0073] Middleware 84 can include software specific to the laboratory environment. Middleware 84 can be, for example, syngo® laboratory data manager by Siemens AG. Middleware, such as a laboratory data manager, enables an IVD lab to track samples, create custom logic for assays, and handle a wide variety of analytical tasks. This middleware can communicate with the LIS software to provide more detailed sample handling analysis than might otherwise be needed by the LIS software. LIS presents a front-end for doctors and hospitals, while the middleware allows more refined analysis and sample handling logic. The middleware can interact directly with analyzers and other laboratory equipment. By segregating the IT environment into middleware and LIS software, a hospital can implement a custom back-end in the diagnostic lab without interfering with other hospital software. Middleware 84 can also act as an interface between laboratory instruments, networks, LIS software, and hospital IT resources. Middleware can be used to set up and design custom workflows within the clinical environment, allowing operators to verify certain analytical tasks, provide instructions to operators for specific tasks, and create custom rules that may vary by sample or assay. It should be appreciated that, in some embodiments, the middleware 84 may be part of the LIS software 83, and vice versa. Middleware 84 and LIS 83 can be implemented as software and/or hardware using conventional IT means. For example, the software may be run on a client/server environment, stand-alone computers, a cloud computing environment, or any other combination thereof. Communication between LIS 83 and middleware 84 can also be via conventional means, such as a network, an API, a messaging system, etc. LIS 83 and middleware 84 can run on the same or different computer environments.

[0074] Middleware 84 can also act as an intermediary for the various components of the AR system and other parts of the laboratory environment. For example, cameras 85, which may be part of AR glasses worn by an operator, can provide image data to middleware 84. Middleware 84 can assist the processor in the AR glasses to process image information and assist the operator in navigating his/her environment. Similarly, cameras 85 may include devices capable of image processing, whereby salient information about the images, and not the images themselves, is sent to middleware 84. Cameras 85 can also include any suitable imaging devices capable of recording images and storing or transmitting the images to a processor or middleware 84.

[0075] Trays 86 can communicate with middleware 84 to indicate the presence or absence of a sample in each slot. Middleware 84 can use this information to associate a known sample with slots in the trays. Middleware 84 can actively participate in monitoring the chain of custody of the sample in a clinical environment.
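A slot-state report of this kind could be as simple as the following sketch; the message fields and the JSON encoding are illustrative assumptions, not a protocol defined by the patent.

```python
# Minimal sketch of a tray-to-middleware occupancy report; the field
# names and JSON encoding are illustrative assumptions.
import json
import time

def tray_report(tray_id: str, slot: int, occupied: bool) -> str:
    """Serialize one slot-state change, as a tray's microcontroller might
    transmit it over a Wi-Fi or XBee link."""
    return json.dumps({
        "tray_id": tray_id,
        "slot": slot,
        "occupied": occupied,
        "timestamp": time.time(),  # lets middleware correlate with barcode scans
    })

print(tray_report("tray-43", 6, True))
```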

[0076] Similarly, barcode readers 87 can send patient sample identity information to middleware 84. Upon receiving identification of a patient sample at a barcode reader, middleware 84 can determine that the sample is currently located at the station having the barcode reader. Barcode readers 87 may include stand-alone barcode readers within a clinical environment, barcode readers associated with individual stations within the environment, or virtual barcode readers, which are part of cameras 85. For example, an operator who is handling samples received from a hospital can visually inspect each sample when it arrives. One or more cameras that are worn by the operator as part of the AR glasses can observe the barcode information of each sample as the operator views the sample. The AR glasses may provide a visual and audible indicator to the operator that the sample has been properly scanned. The one or more cameras worn by the operator can then communicate the sample identity information to middleware 84. Accordingly, readers 87 may be part of cameras 85.
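One way such a virtual barcode reader could be prototyped is sketched below, using OpenCV and the pyzbar library as stand-ins for whatever imaging stack the headset actually provides; the de-duplication logic and the camera index are assumptions.

```python
# Sketch of a 'virtual barcode reader' over a head-worn camera feed.
# OpenCV and pyzbar are stand-ins for the headset's actual imaging stack.
import cv2
from pyzbar.pyzbar import decode

seen: set[str] = set()  # barcodes already acknowledged this session

def scan_frame(frame) -> list[str]:
    """Return sample IDs newly decoded from one camera frame."""
    new_ids = []
    for symbol in decode(frame):          # finds all barcodes in the image
        sample_id = symbol.data.decode("ascii")
        if sample_id not in seen:
            seen.add(sample_id)
            new_ids.append(sample_id)     # would trigger the visual/audible cue
    return new_ids

cap = cv2.VideoCapture(0)                 # assumed camera index
ok, frame = cap.read()
if ok:
    print(scan_frame(frame))
cap.release()
```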

[0077] Middleware 84 can include instructions to associate samples with slots in trays 86. When a sample is scanned by cameras 85 or readers 87 and shortly thereafter a tray reports that a sample has been placed in a slot, middleware 84 can determine that this event indicates that the recently scanned sample tube has been placed in that given slot in the tray reporting the change in status.
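The correlation just described might look like the following sketch; the ten-second window and the event-handler names are assumptions chosen for illustration.

```python
# Sketch of correlating a barcode scan with the next slot-occupied event;
# the time window and all names are illustrative assumptions.
import time

CORRELATION_WINDOW_S = 10.0   # assumed: max delay between scan and placement

last_scan: dict = {}          # the most recent, not-yet-consumed scan
slot_assignments: dict = {}   # (tray_id, slot) -> sample_id

def on_barcode_scanned(sample_id: str) -> None:
    last_scan.update(sample_id=sample_id, at=time.time())

def on_slot_occupied(tray_id: str, slot: int) -> None:
    """If an unknown sample appears shortly after a scan, attribute the
    scanned identity to that slot and consume the scan."""
    if last_scan and time.time() - last_scan["at"] <= CORRELATION_WINDOW_S:
        slot_assignments[(tray_id, slot)] = last_scan["sample_id"]
        last_scan.clear()

on_barcode_scanned("S-001")
on_slot_occupied("tray-43", 6)
print(slot_assignments)       # {('tray-43', 6): 'S-001'}
```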

[0078] Middleware 84 can also interact with laboratory equipment 88. Laboratory equipment 88 can include one or more analyzers or other laboratory stations, such as sample handling devices, de-cappers, or incubation or storage devices. Middleware 84 may also interact with one or more displays 89. Displays 89 can include HUD modules that are used for augmented reality by an operator. For example, displays 89 can include one or more HUDs that are part of an AR headset or glasses worn by an operator. Displays 89 can also be other displays of the laboratory environment, such as displays related to terminals 90. Terminals 90 can include workstations, laptops, wireless devices, etc. that allow an operator to interact with middleware 84. Further examples include terminals 90 used by an operator to create custom workflows, check the results of analysis, or perform any other task suitable for the clinical environment.

[0079] The components in the system 80 can communicate with one another via any suitable communication protocols. For example, middleware 84 may communicate with other components wirelessly or via one or more signal wires, such as via an IP network.

[0080] FIG. 9 depicts typical use cases and workflows that occur in some embodiments of the present invention. Workflow 100 illustrates the most common steps that occur based on operator action. Operator actions are shown on the left-hand side, hardware actions in the middle, and software actions on the right. It should be appreciated that hardware actions and software actions can each be performed by a combination of hardware and software; the distinction is merely intended to be illustrative. At step 102, a sample arrives in the clinical environment. The sample may be part of a larger group of samples arriving from a hospital or other location.

[0081] After samples arrive, an operator will visually inspect each sample to verify that it remains intact and perform any other visual quality assurance tasks, at step 104. After the visual inspection, the operator will typically scan the barcode of the tube to check it in, at step 105. Whereas prior art systems required the operator to scan the barcode with a handheld or tabletop barcode scanner, some embodiments can alternatively perform step 105 automatically as part of the visual inspection. When the operator wears one or more cameras, the cameras can detect and read the barcode on each sample as the operator observes it during the visual inspection of step 104, and can provide visual or audio feedback to the operator that the sample tube has been successfully scanned.

[0082] At step 106, the barcode reader or camera(s) worn or carried by the operator communicates the barcode information to the middleware or LIS software. This informs the software of the identity of the sample that has arrived. This allows the software to keep inventory and track the custody of samples. At step 108, the software can update the sample status. This status information can be maintained in any conventional IT form, including by maintaining a database that tracks the chain of custody and current status of each patient sample in the clinical environment. For example, the status can be updated at step 108 to "scanned by operator 23." Custody can be said to have transferred to the operator. Each sample can have an entry in the database that includes the chain of custody information, as well as patient information, information about the tests to be performed on the sample, and any results thereof.
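A per-sample database entry of the kind described could be structured as in the following sketch; the schema is an assumption drawn only from the fields named above (chain of custody, patient information, ordered tests, and results).

```python
# Illustrative schema for the per-sample status record; field names are
# assumptions based on the information the paragraph above enumerates.
from dataclasses import dataclass, field

@dataclass
class CustodyEvent:
    status: str       # e.g. 'scanned by operator 23'
    custodian: str    # e.g. 'operator 23', 'tray 43', 'instrument 3'
    timestamp: float

@dataclass
class SampleRecord:
    sample_id: str
    patient_id: str
    tests_ordered: list[str] = field(default_factory=list)
    results: dict[str, str] = field(default_factory=dict)
    chain_of_custody: list[CustodyEvent] = field(default_factory=list)

    def update_status(self, status: str, custodian: str, at: float) -> None:
        """Append to the custody history; earlier entries are never
        overwritten, preserving the full chain of custody."""
        self.chain_of_custody.append(CustodyEvent(status, custodian, at))
```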

[0083] At step 110, the operator places the recently scanned sample tube into a tray. Using switches in the tray, at step 112, the tray can detect that a sample has been placed in a certain slot. Alternatively, step 112 may be performed via the operator's head-mounted cameras. By detecting the orientation of the tray and detecting where the object corresponding to the sample tube has been placed within the tray, the cameras can determine which slot the sample tube has now been placed into. When the presence of a sample is detected, such as at step 112, the detected sample can be considered an unknown sample until an identity is associated with it. For example, when a sample is first detected by a tray, the tray determines that some sample is present, but the identity of that sample is not known until the detection is correlated with identity information, such as that determined at step 105.

[0084] At step 114, the tray (or camera) communicates the change in status to middleware or LIS software. This informs the software that a sample tube has been placed in a given location in a sample-holding tray. At step 116, the software then updates the sample status by correlating the status from step 108 with the information received at step 114: the software can determine that the recently scanned sample is the sample placed into the tray and update that sample's status accordingly. For example, the status can be updated at step 116 to "in tray 43, slot 6." Custody can be said to have transferred from the operator to the tray at this point.

[0085] At step 118, the operator moves the tray. For example, the operator can move the tray into and out of a refrigerator. Until the sample is removed from the tray, however, the custody of the sample does not change - it remains in the same slot, in the same tray, and custody remains with the tray. The operator can also move the tray to an instrument so that the samples in the tray may be processed. The custody of the samples can transition to the instrument as the instrument begins processing the samples. For example, the operator can feed the tray into the sample handling input of an analyzer or other instrument. At step 120, the instrument handles the samples received from the operator. This can include removing the samples from the tray and placing the samples into carriers or internal holding areas for the instrument. Custody has now switched from the tray to the instrument. At step 122, the instrument can acknowledge this change in custody by communicating with middleware or LIS software that it is now handling a given sample. This communication can also involve the tray itself; when a sample has been removed from a tray, the tray can report that a given slot is no longer occupied. When this is reported to the middleware or LIS, the software can determine that that slot was previously occupied by a given sample. The software can then determine the tray no longer has custody of that sample. When an instrument scans the barcode of a sample or reports having custody of the tray, the software can determine the instrument now has custody of the sample and that the sample is now located in the instrument. At step 124, the status information of that sample is updated to reflect that the sample is now located in, and in the custody of, the instrument. For example, the status can be updated at step 124 to "processing in instrument 3."

[0086] Once the instrument is finished, the instrument may place the sample back in the tray. This can prompt the tray to detect the sample, consistent with step 112. Accordingly, the software will ultimately update the sample status in accordance with step 116 and give custody back to the tray.

[0087] At step 126, an operator can run a query using a terminal or handheld device. The query can be prompted by any number of causes in the laboratory workflow. For example, an operator may wish to determine where all samples pertaining to a given patient are currently located. For instance, a critically ill patient may have had multiple samples taken, such as a blood sample and a urine sample, and may have had samples drawn during previous visits. That patient may have multiple samples that need to be retrieved for retesting, such as for testing a new hypothesis of a treating physician. The terminal may inform the operator, generally, where the samples can be found. For example, a sample currently in an instrument can be reported as located in that instrument. Meanwhile, some samples may be in storage. The terminal may report to the operator the storage location, such as a refrigerator in the lab and the identity of a tray in that refrigerator. In prior art systems, the operator may have to search for that tray and, once he/she has found it, may then have to search the tray manually, removing each sample for visual inspection until the proper sample is retrieved.

[0088] In some embodiments, the operator may benefit from augmented reality, whereby the AR system provides a more intuitive approach to locating the sample. At step 128, the operator's query is communicated to the middleware or LIS software. This can be in the form of a network communication from the operator's terminal to the middleware or LIS software. At step 130, the software can retrieve the current status of each sample associated with the query by searching the sample status database, which has been updated at each previous step to reflect the current custody of the sample.
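Step 130 could reduce to a lookup like the following sketch; the record fields mirror the status strings used in the examples above and are assumptions, not a schema defined by the patent.

```python
# Sketch of step 130: resolving an operator's query against the status
# database; the record layout is an illustrative assumption.
def locate_samples(records: list[dict], patient_id: str) -> dict[str, str]:
    """Return each of the patient's samples with its last-known status."""
    return {r["sample_id"]: r["history"][-1]
            for r in records
            if r["patient_id"] == patient_id and r["history"]}

records = [
    {"sample_id": "S-001", "patient_id": "P-7",
     "history": ["scanned by operator 23", "in tray 43, slot 6"]},
    {"sample_id": "S-002", "patient_id": "P-7",
     "history": ["processing in instrument 3"]},
]
print(locate_samples(records, "P-7"))
# {'S-001': 'in tray 43, slot 6', 'S-002': 'processing in instrument 3'}
```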

[0089] At step 132, the software can communicate the results to one or more displays in the clinical environment. For example, the software may communicate to the terminal display that all samples are currently stored in the refrigerator. This prompts the operator to go to the refrigerator. Furthermore, the software may also communicate instructions to the operator's AR headset to identify the trays, and the slots within those trays, where the samples can be found. This enables the AR headset to begin observing images detected by cameras on the operator's head to search for the specified trays and slots.

[0090] When the operator opens the refrigerator, he/she may see several trays inside. At step 134, the AR headset or other camera device begins searching for the identified trays in the images it detects. After the operator has opened the refrigerator and peered inside, the processors processing the images recorded by the cameras can locate the specified tray(s). Once the trays are found in the images, one or more processors associated with the AR headset can determine where in the operator's visual plane the trays or samples appear. At step 136, the processors determine where in a display (such as a HUD) to highlight, drawing the operator's attention to the portion of the visual field containing the located tray or sample. Exemplary displays are shown in FIGs. 4 and 5.

[0091] This process can repeat as the operator moves the highlighted tray and begins looking for the sample in the tray. Once the orientation of the tray has been identified, a processor processing images observed by the cameras can detect the known location of a sample in the image, based on the information received at step 132. The display can then highlight the location of the sample within the tray to assist the operator in locating the sample.
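Once the orientation mark has been found, the mapping from a known slot to a highlight position in the display might reduce to simple grid geometry, as in the sketch below; the grid model, the unit-vector inputs, and all numbers are assumptions.

```python
# Geometry sketch for highlighting a slot: given the image position of the
# tray's orientation mark, the tray's row/column directions in the image,
# and the apparent slot spacing, compute where slot (row, col) falls.
# The grid model and all example values are illustrative assumptions.

def slot_to_image_xy(origin_xy, row_axis, col_axis, row, col, pitch_px):
    """Map a tray slot to pixel coordinates in the camera image.

    origin_xy -- image position of the tray's orientation mark
    row_axis, col_axis -- unit vectors of the tray's rows/columns in the image
    pitch_px -- apparent slot spacing in pixels at the current viewing distance
    """
    x = origin_xy[0] + pitch_px * (row * row_axis[0] + col * col_axis[0])
    y = origin_xy[1] + pitch_px * (row * row_axis[1] + col * col_axis[1])
    return (x, y)

# Example: a tray seen rotated 90 degrees; highlight slot (row 1, col 2).
print(slot_to_image_xy((320, 240), (0, 1), (-1, 0), 1, 2, 24))  # (272, 264)
```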

[0092] One of the benefits of some embodiments of the present invention is that, from the operator's perspective, workflow 100 is no different from what he/she is used to in prior art systems. Because the chain of custody is automatically monitored on the back end via software, the operator is not required to perform additional tasks for which he/she is not already trained. For example, operators typically visually inspect samples and place them in trays. At the same time, the operator can benefit from the AR system because it makes certain tasks easier: it assists the operator in finding samples in response to his/her queries and keeps track of the chain of custody of samples automatically. Similarly, in embodiments where the barcode is read automatically during the visual inspection at step 104, the operator may skip the step of scanning a sample manually with a barcode reader.

[0093] FIG. 10 shows information flow in an exemplary embodiment. When a sample 50 arrives in the lab at step A, its barcode is scanned by readers 87 or cameras 85. Readers 87 or cameras 85 report the identity of the sample to middleware 84, at step B. At step C, the sample is placed into one of trays 86. At step D, tray 86 reports the presence of the new sample to middleware 84. Middleware 84 can correlate the information received at steps B and D to change the custody of the sample to the tray that received the sample. This change in custody is reported to LIS 83, at step E. LIS 83 then updates patient data 82 to reflect the change of custody. It should be noted that middleware 84 and LIS 83 can communicate other events. For example, when a sample is first scanned, middleware 84 can report the sample's presence in the lab to LIS 83.

[0094] Later, the tray containing the sample is placed into an instrument 88. At step F, the instrument reports to middleware 84 when it has begun testing the sample. Middleware 84 can note the change in custody to reflect that the sample is now in the instrument 88. Middleware 84 can report this information to LIS 83. Simultaneously with the instrument's report, trays 86 may report that the sample has been removed. In some embodiments, middleware 84 can receive multiple communications about the hand-off and correlate these events as a single hand-off. Subsequently, custody can be handed off between instrument 88 and trays 86, and the trays stored. This information can be conveyed to middleware 84 in a manner similar to steps D and F.
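Collapsing the near-simultaneous tray and instrument reports into one hand-off could work as in this sketch; the five-second window and the event shapes are assumptions.

```python
# Sketch of merging a tray 'removed' report and an instrument 'testing'
# report into a single custody hand-off; window and fields are assumptions.
HANDOFF_WINDOW_S = 5.0

def correlate_handoff(events: list[dict]) -> dict | None:
    """Pair a tray-removal event with an instrument-testing event for the
    same sample arriving within the window; return the combined hand-off."""
    removals = [e for e in events if e["type"] == "tray_removed"]
    starts = [e for e in events if e["type"] == "instrument_testing"]
    for r in removals:
        for s in starts:
            if (r["sample_id"] == s["sample_id"]
                    and abs(r["at"] - s["at"]) <= HANDOFF_WINDOW_S):
                return {"sample_id": r["sample_id"],
                        "from": r["tray_id"], "to": s["instrument_id"]}
    return None

events = [
    {"type": "tray_removed", "sample_id": "S-001",
     "tray_id": "tray-43", "at": 100.0},
    {"type": "instrument_testing", "sample_id": "S-001",
     "instrument_id": "instr-3", "at": 101.2},
]
print(correlate_handoff(events))
# {'sample_id': 'S-001', 'from': 'tray-43', 'to': 'instr-3'}
```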

[0095] Later, an operator may wish to locate the sample. Using terminal 90 or another input device, the operator may send a query to middleware 84 or LIS 83, at step G. LIS 83 can query the patient database 82 to determine where the sample was last located. Middleware 84 can use this information to display the location to the operator via displays 89, at step H. This can include displaying the location of the sample in a tray in an augmented reality fashion as described herein.

[0096] FIG. 11 is a flowchart showing the actions and their resulting impact on the chain of custody. The chain of custody flow begins with state 150, where custody of the sample is generally with the operator or lab. At step 162, an operator uses a camera or barcode reader to identify the sample by scanning its barcode, or by performing other suitable steps, such as, in some embodiments, manually inputting the sample's identity. At step 164, the operator then places that sample in a tray. Switches in the tray can detect the presence of the sample and report the change to the middleware. The middleware can correlate these events, causing custody to pass to the tray, resulting in state 152. This change in custody can be recorded in a database.

[0097] At step 166, the operator may interact with the sample and tray. For instance, the operator may place the tray or sample into an instrument for testing. At step 167, the instrument or operator can remove the sample from the tray. The instrument or tray can report the event to the middleware. The middleware can note the change in sample status and update the custody to be with the instrument, resulting in state 154.

[0098] Subsequently, when testing by the instrument is complete, the instrument or the operator may place the sample back in a slot in a tray, at step 168. At step 169, the instrument or operator can place the sample back in the same or a different tray, at the same or a different location from where it was removed in step 167. The instrument or the tray can report the event to the middleware. If done by the operator, this step can be similar to step 162, where the operator scans the samples as they come off the instrument. If done by the instrument, this step can be the opposite of step 166, where the tray notes the presence of a sample known to be finished. The middleware can note the change in status and determine that custody now lies with the tray, resulting in state 156.

[0099] Later, at step 170, the operator may seek the location of the sample. This query can be sent to the middleware or LIS via a terminal or other input device. As described throughout, the AR system can assist the operator in locating the sample by displaying images in a display, such as a HUD, at step 172. At step 174, with the assistance of the AR system, the operator locates the sample and removes it from the tray. The switches on the tray that previously contained the sample will indicate that the sample has been removed from its known slot. This information can be reported to the middleware. The middleware can note the change in custody and determine that custody now lies with the operator or the lab in general, resulting in state 158. Custody can also pass to the operator any time a sample is removed without identifying another custodian, such as an instrument. In the case of multiple operators, custody can pass to the tube or lab generally.
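Read as a whole, FIG. 11 describes a small state machine over custodians; the sketch below encodes those transitions, with event names invented to stand in for the reports described above.

```python
# Sketch of the custody transitions of FIG. 11 as a tiny state machine;
# the event names are illustrative stand-ins for the reported events.
TRANSITIONS = {
    ("operator",   "placed_in_tray"):        "tray",        # steps 162/164
    ("tray",       "removed_by_instrument"): "instrument",  # steps 166/167
    ("instrument", "returned_to_tray"):      "tray",        # steps 168/169
    ("tray",       "removed_by_operator"):   "operator",    # step 174
}

def next_custodian(current: str, event: str) -> str:
    """Advance custody on a reported event; unknown events leave it unchanged."""
    return TRANSITIONS.get((current, event), current)

state = "operator"  # state 150: custody with the operator or lab
for event in ("placed_in_tray", "removed_by_instrument",
              "returned_to_tray", "removed_by_operator"):
    state = next_custodian(state, event)
    print(f"{event} -> custody: {state}")
```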

[00100] Once custody passes from a tray to another custodian, such as the operator or an instrument, custody can later pass to the same or another tray if a sample is detected at another slot and the event can be correlated to the specific sample. For example, if after step 174 there are no events indicating that another sample is being handled, and a sample is placed in another slot/location in the same or another tray, the chain of custody of the sample removed at step 174 can be associated with the new location. If step 164 occurs after step 174, without a scan occurring at step 162, it can be inferred that the sample removed from the first slot at step 174 has been placed in the new slot at step 164. This can facilitate automatic updates to the chain of custody in the event that samples in a tray are re-sorted. Furthermore, it should be appreciated that samples need not progress through the workflow of the environment in the order shown in FIG. 11. For example, a sample identification at step 162 can occur after step 174, and the chain of custody of the sample can be updated to reflect its new location in the same or a different tray.

[00101] Some embodiments of the systems and methods described herein can also be used outside of a laboratory environment. For example, the embodiments have been described in the context of fluid samples in tubes or vials in an IVD lab. However, this is merely intended as illustrative. The same principles could be extended to other environments where objects have ordered relationships. For example, the systems and methods described herein can be used in a shipping or manufacturing environment where, instead of samples, the items being organized and moving through custody are items being shipped (e.g., goods or packages intended for a destination) or component parts of a custom product (e.g., instead of sample tubes, the objects being tracked could be individual components that are custom made or otherwise intended for a given customer's product).

[00102] For example, the sample tubes/containers described herein could be replaced with any objects, such as boxes or widgets. Accordingly, the sample tubes/containers in the embodiments described herein can be considered illustrative. It will be appreciated that some embodiments handle and track other objects, and any suitable items may be substituted for the sample tubes in the illustrative embodiments. For example, the AR system described herein could be used in a manufacturing environment where, rather than sample containers, the items being tracked and ordered are unique parts needed to build a product to order. The parts for a customer's products could be stored in ordered trays (which could be large bins, containers, or pallets). Trays can be considered a type of container, and each embodiment disclosed throughout can be considered to also contemplate the use of containers generally. Some embodiments of trays may also include one or more shelves, particularly if the AR system is used in an industrial environment. These trays can detect the presence and location of items similarly to the trays described throughout. An operator searching for parts can be directed to the correct tray, and to the correct location within that tray, via an AR headset or similar AR device. Similarly, the systems and methods can be used to maintain chain of custody of customer boxes and purchased products in a shipping/order-fulfillment system.

[00103] As used herein, a processor should be understood to include any number of processors. While the processors described herein have been broken down as individual processors performing certain tasks, this is done for illustrative purposes. Embodiments of the present invention can include single or multiple processors performing the roles described. Furthermore, the roles described as belonging to separate processors herein can, in some embodiments, be performed by separate or common processors, or any subset thereof. For example, in some embodiments, some of the tasks performed by microcontroller 24 can be performed by a separate processor that is not part of tray 10. Furthermore, it should be appreciated that when multiple tasks are attributed to one or more processors, it is not intended to limit embodiments to those where one or each processor performs all tasks. Accordingly, the term processor is intended to include multiple processors, and the term processors is intended to encompass one or more processors where tasks can be shared or each task can be delegated to a separate processor. It should further be appreciated that the distinction between hardware and software is merely intended for illustrative purposes; software can include hardware, and vice versa. Tasks described as occurring on hardware can be performed partially or fully in software or hardware, and vice versa. It should similarly be appreciated that when a task is described as performed by a processor, this includes hardware and software instructions. Accordingly, steps disclosed herein as performed by processors are contemplated to include programming instructions for performing the steps, and the processors are contemplated to be configured to perform these steps by software instructions or otherwise.

[00104] Embodiments of the present invention may be integrated with existing analyzers and automation systems. It should be appreciated that carriers may be configured in many shapes and sizes, including layouts and physical configurations suitable for use with any contemplated analyzer or instrument. For example, in some embodiments, a carrier may include multiple slots for carrying multiple samples around an automation track. One embodiment, for example, may include a physical layout of a tube-holding portion of a carrier with multiple slots in one or more transport racks. Each rack may include multiple slots (e.g., five or more slots), each slot configured to hold a tube (e.g., a sample tube).

[00105] Although the invention has been described with reference to exemplary embodiments, it is not limited thereto. Those skilled in the art will appreciate that numerous changes and modifications may be made to the preferred embodiments of the invention and that such changes and modifications may be made without departing from the true spirit of the invention. It is therefore intended that the appended claims be construed to cover all such equivalent variations as fall within the true spirit and scope of the invention.

Claims

What is claimed is:
1. A system for assisting an operator comprising: at least one camera; at least one display; at least one tray capable of receiving a plurality of samples and detecting their location within the tray; and one or more processors together configured to monitor the location of the plurality of samples and to display information on the display to assist an operator in locating at least one sample.
2. The system of claim 1, wherein the at least one camera is configured to read a barcode associated with at least one sample.
3. The system of claim 1, wherein the at least one tray is configured to transmit information about the presence or absence of samples in the corresponding locations within the tray to the one or more processors.
4. The system of claim 1, wherein the one or more processors are together further configured to receive barcode information associated with a sample, and correlate the presence of the sample at a corresponding position in the at least one tray.
5. The system of claim 1, wherein the one or more processors are further configured to maintain a chain of custody for the plurality of samples.
6. The system of claim 1, wherein the at least one tray comprises a plurality of slots and at least one orientation mark.
7. The system of claim 6, wherein: the at least one camera is configured to capture an image of the at least one orientation mark; and the one or more processors are together further configured to determine an orientation of the at least one tray.
8. The system of claim 7, wherein the one or more processors are together further configured to determine a location of at least one of the plurality of samples in the image.
9. The system of claim 8, wherein the at least one display is configured to display information in response to the location of the at least one of the plurality of samples in the image.
10. The system of claim 1, wherein the at least one display comprises a heads-up display.
11. The system of claim 1, wherein the at least one camera comprises at least one wearable camera.
12. A method for assisting an operator comprising steps of: determining an identity of a first sample; detecting the presence of an unidentified sample at a first position in a first tray; correlating the identity of the first sample with the unidentified sample; communicating the identity of the first sample and the first position to a processor; and recording, via the processor, an association between the first sample and first position to maintain a chain of custody of the first sample.
13. The method of claim 12 further comprising steps of: detecting the removal of the first sample from the first position in the first tray; and determining, via the processor, a new location of the first sample.
14. The method of claim 12 further comprising steps of: in response to a query, locating the first sample; and displaying the location of the first sample via an augmented reality display.
15. The method of claim 12 further comprising steps of: detecting the removal of the first sample from the first position in the first tray; subsequently detecting the presence of the first sample at a second position in a second tray; and recording, via the processor, an association between the first sample and the second position to update the chain of custody of the sample.
16. The method of claim 12 further comprising steps of: detecting the removal of the first sample from the first position in the first tray; and updating, via the processor, the chain of custody to reflect that an operator has custody of the first sample.
17. The method of claim 12 further comprising steps of: detecting the removal of the first sample from the first position in the first container; and determining, via the processor, a new location of the first sample.
18. The method of claim 12 further comprising steps of: in response to a query, locating the first sample; and displaying the location of the first sample via an augmented reality display.
19. The method of claim 12 further comprising steps of: detecting the removal of the first sample from the first position in the first container; subsequently detecting the presence of the first sample at a second position in a second container; and recording, via the processor, an association between the first sample and the second position to update the chain of custody of the sample.
20. The method of claim 12 further comprising steps of: detecting the removal of the first sample from the first position in the container; and updating, via the processor, the chain of custody to reflect that an operator has custody of the first sample.
21. A system having hardware and software configured to: determine an identity of a first item; detect the presence of an unidentified item at a first position in a first container; correlate the identity of the first item with the unidentified item; communicate the identity of the first item and the first position to a processor; and record, via the processor, an association between the first item and first position to maintain a chain of custody of the first item.
PCT/US2013/040637 2012-05-11 2013-05-10 Augmented reality for workflow assistance WO2013170204A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US201261645775 2012-05-11 2012-05-11
US61/645,775 2012-05-11

Publications (1)

Publication Number Publication Date
WO2013170204A1 (en) 2013-11-14

Family

ID=49551317

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2013/040637 WO2013170204A1 (en) 2012-05-11 2013-05-10 Augmented reality for workflow assistance

Country Status (1)

Country Link
WO (1) WO2013170204A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7865312B2 (en) * 2000-02-02 2011-01-04 Phenomenome Discoveries Inc. Method of non-targeted complex sample analysis
US8060008B2 (en) * 2004-04-07 2011-11-15 Nokia Corporation Mobile station and interface adapted for feature extraction from an input media sample
US20100129789A1 (en) * 2007-04-06 2010-05-27 Brian Austin Self Automated assay and system

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170142324A1 (en) * 2015-11-18 2017-05-18 Roche Diagnostics Operations, Inc. Method for generating an entry for an electronic laboratory journal
EP3171302A1 (en) * 2015-11-18 2017-05-24 F. Hoffmann-La Roche AG A method for generating an entry for an electronic laboratory journal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13787213

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13787213

Country of ref document: EP

Kind code of ref document: A1