WO2020163218A1 - Systems and methods for implemented mixed reality in laboratory automation - Google Patents

Systems and methods for implemented mixed reality in laboratory automation

Info

Publication number
WO2020163218A1
Authority
WO
WIPO (PCT)
Prior art keywords
environment
laboratory
component
processor
user
Prior art date
Application number
PCT/US2020/016360
Other languages
French (fr)
Inventor
Robert K. GANTZER
Original Assignee
Beam Therapeutics Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beam Therapeutics Inc. filed Critical Beam Therapeutics Inc.
Priority to US17/426,602 priority Critical patent/US12125145B2/en
Priority to EP20751960.4A priority patent/EP3921803A4/en
Publication of WO2020163218A1 publication Critical patent/WO2020163218A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00 Animation
    • G06T 13/20 3D [Three Dimensional] animation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/20 Scenes; Scene-specific elements in augmented reality scenes
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 19/00 Teaching not covered by other main groups of this subclass
    • G09B 19/24 Use of tools
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 5/00 Electrically-operated educational appliances
    • G09B 5/02 Electrically-operated educational appliances with visual presentation of the material to be studied, e.g. using film strip
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 7/00 Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K 7/10 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K 7/14 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K 7/1404 Methods for optical code recognition
    • G06K 7/1408 Methods for optical code recognition the method being specifically adapted for the type of code
    • G06K 7/1413 1D bar codes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 7/00 Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K 7/10 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K 7/14 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K 7/1404 Methods for optical code recognition
    • G06K 7/1408 Methods for optical code recognition the method being specifically adapted for the type of code
    • G06K 7/1417 2D bar codes

Definitions

  • laboratory automation is utilized to research, develop, optimize and capitalize on technologies in the laboratory.
  • Laboratory automation professionals are typically academic, commercial and government researchers, scientists and engineers who conduct research and develop new technologies to increase productivity, elevate experimental data quality, reduce lab process cycle times, or enable experimentation that otherwise would be impossible.
  • a popular application of laboratory automation technology is laboratory robotics utilizing different automated laboratory instruments, devices, software algorithms, and methodologies used to enable, expedite and increase the efficiency and effectiveness of scientific research in laboratories.
  • One example of such a robotic system in an automated laboratory setting is the use of autosamplers using micro syringes.
  • Laboratories devoted to activities such as high-throughput screening, combinatorial chemistry, automated clinical and analytical testing, diagnostics, large scale biorepositories, and many others, would not exist without advancements in laboratory automation.
  • Automated laboratory systems can be expensive and require very precise setup and configuration to ensure that the automated equipment can perform the specific tasks without issue.
  • downtime within an automated laboratory (caused by malfunctions, crashes, errors, etc.) can be costly, in both time and money, to resolve the issue(s) causing it.
  • a significant number of lab automation crashes or errors, such as sample processing errors, can be attributed to human error.
  • a large percentage of human errors are caused by misplaced or improperly setup labware.
  • FIG. 1 is an exemplary computing system for implementing the systems and methods of the present invention
  • FIG. 2 is an exemplary process for designing experiment protocols for use in accordance with the present invention
  • FIG. 3 is an exemplary computing system for implementing the systems and methods of the present invention
  • FIG. 4 is an exemplary computing system for implementing the systems and methods of the present invention.
  • FIG. 5A is an exemplary graphical user interface for designing experiment protocols
  • FIG. 5B is an exemplary graphical user interface for transforming an experiment protocol in an augmented reality program
  • FIG. 6 is an exemplary process for configuring an automated laboratory using the system and methods of the present invention.
  • FIGS. 7A, 7B, 7C, 7D, 7E, 7F, 7G, 7H, 7I, and 7J are exemplary implementations of configuring an automated laboratory using augmented reality.
  • FIG. 8 shows an exemplary computer architecture for use, in accordance with the present invention.
  • An illustrative embodiment of the present invention relates to systems and methods for providing mixed reality content within an automated laboratory.
  • the incorporation of mixed reality into a real-time view of the laboratory environment is provided to help bridge the gap between digital protocols and actual execution.
  • the present invention can assist in the setup of experiments and track what is happening during execution of those experiments in real-time via rendering of mixed reality content.
  • FIGS. 1 through 8 illustrate an example embodiment or embodiments of improved operation for automated laboratories by implementing augmented reality, according to the present invention.
  • FIG. 1 depicts an illustrative system 100 for implementing the steps in accordance with the aspects of the present invention.
  • a computing architecture 110 may interface with user devices 124, either directly or via a telecommunication network(s) 126 to provide mixed reality (MR) assistance with configuring, administering, and monitoring automated lab processes in a laboratory environment 102.
  • the term “mixed reality” refers to virtual reality, augmented reality, or other interface for providing virtual auditory and visual objects to a user, and any combination thereof.
  • the computing architecture 110 uses MR objects to provide cues related to the laboratory environment 102 such that a user may interact with a sophisticated and complex laboratory environment 102 with minimal training while also minimizing risks due to user error.
  • the automated components 104 can include any combination of automated lab equipment including robotics, conveyers, computing systems, or any automated laboratory components known in the art.
  • the automated components 104 can include robotic arms, actuators, automated samplers, etc.
  • the automated components 104, and configuration thereof can vary based on laboratory size, type of research, etc.
  • the automated components 104 may be recognizable by the computing architecture 110 using, e.g., computer identifiable objects or markers placed thereon.
  • the automated components 104 can include barcodes, QR code labels, etc. that can be machine readable via IR scanners, video capture, etc.
  • the automated components 104 themselves can be machine recognizable utilizing other or additional object recognition techniques and combinations thereof, including any combination of systems known in the art.
  • the instrument components 106 can include hardware and software configured to take measurements and monitor other elements within the automated laboratory.
  • the instrument components 106 can include temperature sensors, motion sensors, light sensors (e.g., IR sensors), vibration sensors, video sensors, audio sensors, spectrometers, or any other sensors known in the art.
  • the sensors can include a combination of sensors included within a particular automated laboratory setup and/or supplemented with sensors specifically for use with the present invention. For example, additional sensors for identifying objects, placement of objects, volume of fluid within an object, etc. can be included with standard laboratory automation components to make up the instrument components 106.
  • the instrument components 106 can be configured to communicate data to a computing architecture 110 for additional processing, including, e.g., monitoring a laboratory process or experiment, reconfiguring or troubleshooting a laboratory process, collecting real-time data and laboratory environment status, as well as any other processing and combinations thereof.
  • the instrument components 106 can include any combination of wired or wireless (e.g., WiFi, Bluetooth, etc.) enabled devices with protocols for sharing data with other computing devices.
  • the labware components 108 can include any combination of non-automated equipment found in an automated laboratory.
  • the labware components 108 can include glassware, trays, plates, test tubes, pipettes, containers, or any other combination of labware known in the art.
  • the labware components 108 may be recognizable by the computing architecture 110 using, e.g., computer identifiable markers placed thereon.
  • the labware components 108 can include barcodes, QR code labels, etc. that can be machine readable via IR scanners, video capture, etc.
  • the labware components 108 themselves can be machine recognizable utilizing other or additional object recognition techniques and combinations thereof, including any combination of systems known in the art.
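The marker-based recognition described in the preceding passages can be pictured with a short sketch. The following Python snippet is not the patent's implementation; it assumes OpenCV's built-in QR detector and a hypothetical registry mapping marker payloads to laboratory components.

```python
# Minimal sketch: decode a QR-code marker in a captured frame and map it to a
# known component. The registry and file name below are illustrative only.
import cv2

COMPONENT_REGISTRY = {
    "DECK-POS-A1": "96-well plate location",
    "ROBOT-ARM-01": "automated pipetting arm",
    "RES-01": "reagent reservoir",
}

def identify_component(frame):
    """Decode one QR marker in the frame and look up the associated component."""
    detector = cv2.QRCodeDetector()
    data, points, _ = detector.detectAndDecode(frame)
    if not data:
        return None  # no readable marker in this frame
    return data, COMPONENT_REGISTRY.get(data, "unknown component"), points

if __name__ == "__main__":
    image = cv2.imread("lab_frame.jpg")  # hypothetical captured frame
    if image is not None:
        print(identify_component(image))
```

Comparable logic applies to barcodes or other machine identifiable objects; only the decoder changes.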
  • the system 100 can utilize the mixed reality lab automation computing architecture 110 to provide mixed reality for facilitating interactions within the laboratory environment 102.
  • the computing architecture 110 can utilize information received from within the system 100 to determine what mixed reality content should be displayed on user devices 124 in a real-time view of the automated laboratory 102.
  • the computing architecture 110 can include a computing system with specialized software, hardware, and databases designed for providing informative and instructive mixed reality to a user to facilitate interacting with the complex and sophisticated laboratory environment 102.
  • the computing architecture 110 can be software installed on a local computing device 112, a remote centralized computing device 112, a cloud based architecture accessible by other computing devices (e.g., the user devices 124), a web accessible architecture, or the like.
  • the computing architecture 110 may instead be incorporated into the user devices 124 for local processing, or a combination of local and cloud systems.
  • the computing device 112 can include a processor, a memory, an input output interface, input and output devices and a storage system 114.
  • the computing device 112 can include an operating system configured to carry out operations for the applications installed thereon.
  • the computing device 112 can include a single computing device, a collection of computing devices in a network computing system, a cloud computing infrastructure, or a combination thereof.
  • the storage system 114 can include any combination of computing devices configured to store and organize a collection of data.
  • storage system 114 can be a local storage device on the computing device 112, a remote database facility, or a cloud computing storage environment.
  • the storage system 114 can also include a database management system utilizing a given database model configured to interact with a user for analyzing the database data.
  • the computing architecture 110 can include a combination of core components to carry out the various functions of the present invention using the computing device 112 and storage system 114.
  • some embodiments include the computing device 112 including hardware components, software components or a combination thereof for recognizing automated components 104 and labware components 108, and based on automation protocols and feedback from instrument components 106, assist users with configuration and monitoring of an automated laboratory process using MR via the user devices 124.
  • the computing device 112 of the computing architecture 110 can include an automation protocol module 118 a configuration module 120, and an operation module 122 in communication with a measurement module 116.
  • the modules 116, 118, 120 and 122 can include any combination of hardware and software configured to carry out the various aspects of the present invention.
  • the computing device 112 may utilize measurements from the measurement module to provide feedback to, e.g., a user at a user device 124 for an MR overlay of the laboratory environment 102 based on a current state in the laboratory environment 102.
  • the measurement module 116 can receive and/or obtain data from the instrument components in real-time or periodically.
  • the measurement module 116 can be configured to communicate with, aggregate from, and store measurements and other data obtained by the instrument components 106.
  • the measurements and other data can be stored in the storage system 114 for use by the computer device 112, particularly, the configuration module 120 and the operation module 122.
  • the measurement module 116 can be located proximate to the laboratory environment 102 and be communicatively attached to the various instrument components 106 therein. Upon receiving data from the instrument components 106, the measurement module 116 can transmit or otherwise provide said information to the storage system 114 and/or computing device 112.
  • the measurement module 116 can be remotely located with the computing device 112 and receive the data from the instrument components 106 through a wireless communication means.
  • the instrument components can be internet of things (IoT) enabled devices and can wirelessly transmit data to the measurement module 116 within the computing architecture 110.
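As a rough illustration of the measurement module's role, the sketch below shows one way instrument readings pushed by IoT-enabled devices might be aggregated in memory. The class and field names are assumptions made for illustration, not the patent's data model.

```python
# Minimal sketch of a measurement module that stores the latest reading per
# instrument/quantity plus a running history; all names are illustrative.
import time
from dataclasses import dataclass, field

@dataclass
class Reading:
    instrument_id: str
    quantity: str          # e.g., "temperature_c", "spin_rate_rpm"
    value: float
    timestamp: float = field(default_factory=time.time)

class MeasurementModule:
    def __init__(self):
        self.latest = {}   # (instrument_id, quantity) -> most recent Reading
        self.history = []  # every Reading, for later analysis/storage

    def ingest(self, reading: Reading):
        self.latest[(reading.instrument_id, reading.quantity)] = reading
        self.history.append(reading)

    def current(self, instrument_id: str, quantity: str):
        return self.latest.get((instrument_id, quantity))

# Example: an incubator reports its temperature over the network.
module = MeasurementModule()
module.ingest(Reading("incubator-01", "temperature_c", 37.1))
print(module.current("incubator-01", "temperature_c"))
```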
  • the automation protocol module 118 may be configured for constructing laboratory protocols that include MR instructions for configuring the automated components 104, instrument components 106 and labware components 108 of the laboratory environment 102 to accurately perform an automated laboratory process.
  • the equipment, components and settings for even relatively simple automated laboratory processes may be complex and difficult to set-up.
  • a user or administrator or other laboratory professional may design the laboratory process in a simulated environment using, e.g., mixed reality objects.
  • the configuration module 120 can provide a series of mixed reality overlays in a display of the user device 124 based on components within view of a camera of the user device 124.
  • the user device 124 may capture imagery of the laboratory environment 102 and recognize the markers placed at locations of the laboratory environment 102.
  • the configuration module 120 may identify a component or substance to be placed at the marker based on the present laboratory configuration for running a selected laboratory process with the automated components 104.
  • the configuration module 120 may then serve an overlay of a corresponding three-dimensional image to the user device 124 overlaying the associated location in the real-world laboratory environment 102 such that the three-dimensional image depicts a step in configuring the laboratory environment 102 for the particular laboratory process.
  • the configuration module 120 can provide a mixed reality instructional video to be displayed to the user within the real-time display on the user device 124 overlaying a real-world laboratory environment 102 location where said instrument component 106 should be setup.
  • machine learning and image recognition algorithms can be incorporated to identify other issues, for example, incorrect labware, empty tip boxes, or hazards, preventing crashes before they occur.
  • the operation module 122 can be configured to provide information regarding a real-time state of materials, equipment, components and/or data to a user (e.g., user of a user device 124) during operation of the automated laboratory environment 102.
  • operation module 122 can provide information to a user in the form of mixed reality elements within a real-time display of the laboratory environment 102 on the user device 124.
  • the operation module 122 can provide mixed reality overlay for display on the user device 124 in response to data received from the user device 124 and the system 100.
  • the operation module 122 can utilize a combination of data received from the user devices 124, measurement module 116, and/or the operation module 122 to analyze a current configuration and operational state of the components 104, 106, 108 within the laboratory environment 102 and determine the mixed reality information to display to the user via the user devices 124.
  • the operation module 122 can receive data related to outputs from the instrument components 106 (e.g., temperatures, spin rates, incubation times, etc.), as well as other computer identifiable items (e.g., QR barcodes), within the laboratory environment 102 and relay that information in a human readable format to the user via mixed reality text and/or images to be displayed to the user within the real-time display on the user device 124.
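One simple way the operation module might turn such raw outputs into human-readable overlay text is sketched below; the message format is an assumption, not a defined interface of the system.

```python
# Hypothetical formatter: raw instrument readings -> short status string that
# could be rendered as mixed reality text next to the recognized component.
def format_overlay_text(instrument_id, readings):
    parts = [f"{name.replace('_', ' ')}: {value}" for name, value in readings.items()]
    return f"{instrument_id} | " + " | ".join(parts)

print(format_overlay_text("centrifuge-02",
                          {"spin_rate_rpm": 3200, "temperature_c": 4.0}))
# -> centrifuge-02 | spin rate rpm: 3200 | temperature c: 4.0
```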
  • the system 100 can include one or more of user devices 124 configured to communicate with the computing architecture 110 over a telecommunication network(s) 126.
  • the computing architecture 110 can act as a host for each of the user devices 124, providing the functionality of the modules 116, 118, 120 to the user devices 124 while sharing a secured network connection.
  • the one or more of user devices 124 can include any combination of computing devices, as described with respect to the computing device 112.
  • the computing device 112 and the plurality of user devices 124 can include any combination of servers, personal computers, laptops, tablets, smartphones, virtual reality goggles, etc.
  • the computing devices 112, 124 are configured to establish a connection and communicate over telecommunication network(s) 126 to carry out aspects of the present invention.
  • the telecommunication network(s) 126 can include any combination of known networks.
  • the telecommunication network(s) 126 may be a combination of a mobile network, WAN, LAN, or other type of network.
  • the telecommunication network(s) 126 can be used to exchange data between the computing devices 112, 124, exchange data with the storage system 114, and/or to collect data from additional sources.
  • the one or more of user devices 124 can include, or otherwise be communicatively attached to a device configured to, capture video in real-time, transmit video to the computing device 112 for analysis, and receive mixed reality enhancements for display within the real-time video.
  • the user devices 124 can include or be communicatively attached to a camera configured to capture video in real time.
  • the user devices 124 can include or otherwise be in communication with software (e.g., via a web portal or cloud infrastructure) that communicates with the computing architecture 110 and data can be shared between the user devices 124 and the computing architecture 110.
  • the user devices 124 can be configured to provide real-time information to the computing architecture 110, the computing architecture 110 can perform analysis on the received data, and the computing architecture 110 can provide mixed reality to the user devices 124 for display within the real-time display.
  • an exemplary process 200 for configuring the system 100 to implement augmented reality for use in accordance with the present invention is provided.
  • a user interface can be provided to enable an expert user to design and program an experiment protocol for the laboratory environment 102.
  • the expert user can define the required automated components 104, instrument components 106, and proper utilization and placement of labware 108 for a particular experiment.
  • the experimental definitions can include positioning (e.g., location, orientation, etc.) for each of the components 104, 106, 108, parameters (e.g., materials, temperatures, quantities, etc.) for those components 104, 106, 108, unique machine identifiable objects associated with those components 104, 106, 108, and augmented reality objects to be associated with any of the machine identifiable objects.
  • an automated laboratory 102 can be configured for any combination of experiments and each of the experiments would have its own uniquely designed and stored protocol. Once the protocol is designed it can be saved in a database for future use (e.g., editing, sharing, etc.).
  • the experiment protocol design GUI 500 can include a combination of windows to assist the expert user in designing an experiment protocol.
  • the experiment protocol design GUI 500 can include a first, automation script window for programming rules and methods into the protocol and a second window with the deck layout for the experiment, as shown in FIG. 5A.
  • the expert user can program experiment protocols by designing the deck layout in the second window including but not limited to assigning labware (e.g., tubes, plates, racks, etc.) and using the first window to assign values to the labware, including but not limited to defining liquid types, volume requirements, etc. for the labware.
  • metadata can be assigned to the labware (e.g., machine identifiable objects).
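A designed experiment protocol of the kind described above might be captured in a record such as the following sketch. The field names (deck position, labware type, marker ID, liquid type, volume) are illustrative assumptions that mirror the assignments discussed, not the patent's actual schema.

```python
# Minimal sketch of a protocol record tying labware, liquids, volumes, and
# machine-identifiable markers to deck positions; names are illustrative.
from dataclasses import dataclass
from typing import List

@dataclass
class LabwareAssignment:
    deck_position: str      # e.g., "A1"
    labware_type: str       # e.g., "96-well plate", "reservoir"
    marker_id: str          # QR/barcode expected at this position
    liquid_type: str = ""   # e.g., "TE Buffer"
    volume_ul: float = 0.0  # required volume, if any

@dataclass
class ExperimentProtocol:
    name: str
    assignments: List[LabwareAssignment]

protocol = ExperimentProtocol(
    name="example assay set-up",
    assignments=[
        LabwareAssignment("A1", "96-well plate", "QR-A1"),
        LabwareAssignment("B2", "reservoir", "QR-B2", "TE Buffer", 50.0),
    ],
)
print(len(protocol.assignments), "deck positions defined")
```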
  • the computing architecture 110 can transform the experimental protocols designed by expert users (step 210) into an augmented reality program within the computing architecture 110 that can be utilized/rendered to a user during laboratory configuration and/or operation.
  • the transformation step can be performed through automated translation utilizing integrated software or through manual intervention.
  • the experiment protocol can be exported to an augmented reality program that generates all of the components from the deck layout on an automation platform for assignment of augmented reality objects.
  • the deck layout in the augmented reality program can include all the components provided in step 210 positioned at the locations on the deck specified by the expert user.
  • each of the labware components on the deck layout can be associated with machine identifiable objects (step 210) which can then be associated with one or more augmented reality objects.
  • the protocol can assign visual markers in the form of QR codes to each labware component, and each of the QR codes can be associated with augmented reality objects stored in a database. The visual markers can be automatically generated, assigned, and/or selected by the user from a database.
  • FIG. 5B depicts an example graphical user interface 502 including augmented reality objects 240 and machine identifiable objects 230 within the augmented reality program.
  • a component identified as a 96 well plate in the experiment protocol from step 210 can be transformed into a three-dimensional image of a 96 well plate at the specified location on the generated deck platform in the augmented reality program.
  • a second or more augmented reality objects can be associated with the same component.
  • the 96 well plate can include a tag indicating that it should include an instructional video, and the appropriate video will be appended to the three-dimensional image of the 96 well plate as specified in step 210.
  • the augmented reality can include any combination of images, text, video, notes, prompts, etc. known in the art. Once compiled, the augmented reality program can be linked to one or more user devices 124 for use in accordance with the present invention.
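The transformation of a designed protocol into an augmented reality program can be pictured as building a lookup from each visual marker to the AR objects (3D model, optional video, optional text prompt) to render there. The sketch below is a simplified illustration with hypothetical asset paths, not the export pipeline the patent describes.

```python
# Hypothetical transform: protocol entries -> marker-keyed AR object lists.
MODEL_LIBRARY = {                       # illustrative 3D asset library
    "96-well plate": "models/plate_96_well.glb",
    "reservoir": "models/reservoir.glb",
}

def build_ar_program(assignments):
    """Return a marker_id -> list-of-AR-objects mapping for the renderer."""
    program = {}
    for a in assignments:
        objects = [{"type": "model", "asset": MODEL_LIBRARY[a["labware_type"]]}]
        if a.get("instruction_video"):
            objects.append({"type": "video", "asset": a["instruction_video"]})
        if a.get("prompt"):
            objects.append({"type": "text", "content": a["prompt"]})
        program[a["marker_id"]] = objects
    return program

ar_program = build_ar_program([
    {"labware_type": "reservoir", "marker_id": "QR-B2",
     "instruction_video": "videos/fill_reservoir.mp4",
     "prompt": "Add 50ul TE Buffer"},
])
print(ar_program["QR-B2"])
```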
  • the automation protocol module 118 may be configured to perform step 210 from FIG. 2 above to create a laboratory configuration 210 using data received from the storage system 114.
  • the automation protocol module 118 may retrieve from the storage system 114 automation lab equipment and labware information such as, e.g., a labware library 216 having details describing various labware items, and a deck equipment library 217 having details describing automated deck equipment.
  • a laboratory process may utilize a variety of automated components 104 that need to be set up properly to accurately and reliably run the process.
  • digital representations of the automated components 104 may be laid out to create a digital representation of an automated deck of equipment for implementing the laboratory process.
  • the automation protocol module 118 may be used to design an automated deck layout 211.
  • a user may control the automation protocol module 118 to configure deck equipment from the deck equipment library 217 to create the automated deck layout design 211.
  • the user may design the automated deck layout design 211 to match a real-world configuration of automated laboratory components 104 in the laboratory environment 102.
  • lab components 108 may also need to be placed in specific locations in order to accurately and reliably execute a laboratory process in the laboratory environment 102.
  • labware arrangements relative to the automated deck layout design 211 may also be configured.
  • the user may use the automation protocol module 118 to configure labware assignments 212 relative to the automated deck layout design 211.
  • the user may select labware from the labware library 216 to be located in areas on the automated deck layout of the automated deck layout design 211.
  • the user may drag a representation of a labware component and place it in a graphical location within the configuration GUI corresponding to a location on the automated deck layout design 211.
  • the laboratory configuration 210 can include a graphical design for where to place labware components on a deck layout to represent locations in the real-world configuration of the automated components 104 in which a user may be instructed to place lab components 108 based on the labware assignments 212 in the laboratory configuration 210.
  • the laboratory configuration 210 may also include information for setting sample volumes for the labware of the labware assignments 212.
  • the automated components 104 may be set to fill laboratory components 108 in a given location with a particular amount of a sample, or lab components 108 to be placed in a given location may need to be placed there with a particular amount of sample.
  • These volume requirements 213 may be established and added to the laboratory configuration 210 via the automation protocol module 118 using, e.g., the configuration GUI.
  • a particular laboratory process may also depend on liquid types provided to the automated components 104 and lab components 108, such as, e.g., solutions, samples, reagents, among other liquids.
  • a user may provide liquid type assignments 214 to various components in the configuration GUI.
  • a user may assign a particular solution to a particular labware assignment 212, such as, e.g., a well plate including a 96 well plate assigned to a particular automated component of the automated deck layout design 211.
  • an automated laboratory protocol can be designed end-to-end using the automation protocol module 118, including the layout of automated components, selection and locations of labware, volume requirements and fluid types from start to finish of the laboratory process.
  • the automation protocol module 118 may load a metadata library 215 for each requirement and assignment, e.g., from the storage system 114, e.g., in the labware library 216.
  • the metadata can include data identifying computer readable location markers such that labware, volumes and liquids can be correlated to the computer readable markers provided in the laboratory environment 102.
  • the laboratory configuration 210 can be saved and stored in the storage system 114 for use by, e.g., the computing device 112 including the configuration module 120 and the operation module 122.
  • a user may select the laboratory configuration 210 when setting up the laboratory environment 102 for an associated automated laboratory process.
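Persisting a completed laboratory configuration so it can later be selected at set-up time could be as simple as serializing it to the storage system. The sketch below uses plain JSON files as a stand-in for the storage system 114; the paths and keys are illustrative.

```python
# Minimal sketch: save a laboratory configuration and load it back later.
import json
from pathlib import Path

def save_configuration(config: dict, path: str = "configs/example_protocol.json"):
    Path(path).parent.mkdir(parents=True, exist_ok=True)
    Path(path).write_text(json.dumps(config, indent=2))

def load_configuration(path: str = "configs/example_protocol.json") -> dict:
    return json.loads(Path(path).read_text())

save_configuration({
    "deck_layout": "liquid_handler_deck_v1",
    "labware_assignments": [{"position": "A1", "labware": "96-well plate",
                             "marker_id": "QR-A1", "volume_ul": 20.0}],
})
print(load_configuration()["labware_assignments"][0]["labware"])
```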
  • FIG. 4 depicts an exemplary process for utilizing the computing device 112 and storage system 114 to implement the augmented reality program in the configuration module 120.
  • a laboratory configuration 226, such as the laboratory configuration 210 designed in the automation protocol module 118 above, can be used by the configuration module 120 to render mixed reality or augmented reality elements.
  • the elements can include graphical overlays to display at the user device 124 to depict physical steps to take in the laboratory environment 102 to establish an automated laboratory protocol.
  • the configuration module 120 may receive the laboratory configuration 226, either from the storage system 114, or as exported directly from the automation protocol module 118. Using the automated deck layout design 211, the configuration module 120 may render each component of the automated platform in the laboratory environment 102 as a 3D virtual object for use in mixed reality devices such as the user devices 124. In some embodiments, the configuration module 120 translates configuration scripts, data, parameters, and other information into the virtual objects using a suitable rendering engine for rendering mixed reality objects, such as, e.g., the Unity™ engine, or other suitable engine. Included in the rendering of the automated deck layout 221 may be assignments for visual markers 222 based on, e.g., the labware assignments 212 as described above.
  • the configuration module 120 may use the labware assignments 212 and labware imagery to map labware, e.g., lab components 108, assigned in the laboratory configuration 226 to 3D imagery depicting the labware.
  • the labware imagery may be provided by a pre-rendered set of 3D labware images for mixed reality environments maintained in an imagery library 227 in the storage system 114. Based on, e.g., metadata, tags, labels, names or other details and parameters of each labware assignment code in the laboratory configuration 226, a corresponding labware model can be selected from the imagery library 227 and mapped to the labware assignment 212.
  • because each labware assignment 212 is also correlated with a marker via the metadata library 215 and the marker assignments 222, the labware image mappings 223 can be used to determine where in the laboratory environment 102 to overlay a rendering of each labware component 108 for instructing a user.
  • the configuration module 120 may receive user device imagery 229A captured by an image sensor in the user device 124.
  • when the user device imagery 229A includes an image of a visual marker (e.g., a QR code) in the laboratory environment 102, the configuration module 120 can reference the automated deck layout rendering and marker assignments 222, select the corresponding labware image mapped to that marker, and provide the labware image to the user device 124 display as a user device overlay 229B.
  • the user device 124 may display the labware image in a location on the display that intersects the user’s field of view and the visual marker in the laboratory environment 102 to overlay an animation of the appropriate labware component 108 in the location to which the labware component 108 is needed for the selected automation protocol.
  • the configuration module 120 may provide an animation to the user device 124 as a user device overlay 229B including a 3D model of the 96 well plate being positioned onto the automated component 106.
  • the configuration module 120 may also utilize additional instruction 224 of the configuration rendering 220 to further instruct the user on placement and configuration of the labware components 108 and automated components 106.
  • the configuration rendering 220 can include, e.g., additional instruction 224 based on, e.g., volume requirements 213, liquid type assignments 214, and other automation protocol parameters to effectuate a particular automation protocol for a laboratory process.
  • the configuration module 120 may also provide overlay instruction in a user device overlay 229B, e.g., via textual description or an animation, regarding the volume of a particular fluid with which to fill the labware component 108.
  • the additional instruction 224 may include overlay instructions in a user device overlay 229B for inputting settings into an interface of automated components 106, such as, e.g., aliquoting settings, system control system, iterations of an operation, among other settings, e.g., via textual description or an animation.
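Putting the pieces of this configuration flow together, a server-side handler might take a marker code detected in user device imagery, look up the mapped labware image and any additional instruction (volume, liquid type), and return an overlay payload. The payload schema and lookup tables below are assumptions sketched for illustration, not the patent's interfaces.

```python
# Hypothetical overlay selection for a detected visual marker.
LABWARE_IMAGE_MAPPINGS = {      # marker -> pre-rendered 3D labware image
    "QR-A1": "models/plate_96_well.glb",
    "QR-B2": "models/reservoir.glb",
}
ADDITIONAL_INSTRUCTION = {      # marker -> extra set-up instructions
    "QR-B2": {"liquid_type": "TE Buffer", "volume_ul": 50.0},
}

def overlay_for_marker(marker_code: str) -> dict:
    image = LABWARE_IMAGE_MAPPINGS.get(marker_code)
    if image is None:
        return {"warning": f"unrecognized marker {marker_code}"}
    payload = {"marker": marker_code, "model": image}
    extra = ADDITIONAL_INSTRUCTION.get(marker_code)
    if extra:
        payload["text"] = f"Fill with {extra['volume_ul']} ul {extra['liquid_type']}"
    return payload

print(overlay_for_marker("QR-B2"))
```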
  • an exemplary process 600 for utilizing the system 100 to implement augmented reality in the laboratory environment 102 provides augmented reality objects to a user device 124 to assist in the proper setup of an automated laboratory for a particular experiment protocol.
  • the user device 124 can communicate with the computing architecture 110 to receive the necessary information and content for utilization within the laboratory environment 102.
  • the communication can include synchronizing and updating with the latest laboratory layouts, experiment protocols, augmented reality objects, etc. to be generated to the device.
  • any combination of data can be preloaded into the application running on the user device or downloaded and/or rendered in real-time from data stored remotely on the computing architecture 110.
  • the laboratory environment 102 can include a computing device responsible for the automated function of the robotics within the laboratory environment 102, which will need to download the aspects of the experiment protocol responsible for providing instructions to the robotics during operation of the experiment.
  • a user can have a portable computing device for viewing the laboratory environment 102 and receiving augmented reality feedback for setting up the laboratory environment 102 and during operation of the experiment, which will need to download augmented reality content and render it in a real-time view of the laboratory environment 102.
  • at step 602, using a user device(s) 124 with access to the programming of the present invention, the user selects the automated protocol they would like to execute in the laboratory environment 102. Selecting the automated protocol can include presenting the user with a graphical user interface with options for different automated laboratory equipment and/or specific experiments to be run. Each of the options can be customizable by the user based on their equipment and experimental preferences. In some embodiments, the system 100 can adjust the content being provided to the user based on the user selections.
  • augmented reality content is rendered on the user device 124 at the locations of the machine identifiable objects 230.
  • the augmented reality displayed to the user via the user device 124 can provide guidance and real-time feedback on how to properly set up the experiment protocol within the given laboratory environment 102.
  • the user device 124 can utilize the unique machine identifiable objects 230 as anchors, projecting images, texts, video, or other visual cues at those locations to facilitate the proper setup of the target object or location including the machine identifiable objects 230, as discussed in greater detail with respect to FIGS. 7A-7J.
  • the user device 124 can also provide feedback on improperly placed objects and warn the user of potential issues which could result in instrument crashes, safety hazards, and/or incorrect data generation.
  • the user device 124 can provide real-time updates to the user through a continuous connection between the laboratory environment 102, the measurement module 116 communicatively attached to the computing architecture 110, and the user device 124 itself.
  • These real-time updates can include augmented reality overlays that alert the user of system failures or errors, scheduled user interventions, instrument data (e.g., temperature, volume, sample ID), or other visual representations which would facilitate the execution of the protocol.
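On the user device side, the behavior described above amounts to a repeating loop: capture a frame, detect markers, fetch the matching overlays, and render them anchored at the marker locations, warning on anything unexpected. The following is a deliberately simplified sketch; detect_markers, fetch_overlays, and render are stand-ins for the device's camera, network, and rendering layers rather than real APIs.

```python
# Simplified client-side loop for rendering mixed reality at marker locations.
def detect_markers(frame):
    return [("QR-B2", (120, 340))]          # (marker code, pixel location) stub

def fetch_overlays(marker_codes):
    return {"QR-B2": {"model": "models/reservoir.glb",
                      "text": "Add 50ul TE Buffer"}}  # stubbed server response

def render(overlay, location):
    print(f"render {overlay} anchored at {location}")

def run_once(frame):
    detections = detect_markers(frame)
    overlays = fetch_overlays([code for code, _ in detections])
    for code, location in detections:
        if code in overlays:
            render(overlays[code], location)
        else:
            render({"warning": f"unexpected object or marker {code}"}, location)

run_once(frame=None)  # a real device would loop over live camera frames
```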
  • the computing architecture 110 can operate in conjunction with the user device 124 to display mixed reality content 240 to assist the user setting up and configuring an automated laboratory environment (e.g., via the configuration module 120) using process 200 and/or provide additional insight during operation of the automated laboratory environment (e.g., via the operation module 122) using process 600.
  • the user can utilize the user device 124 to capture a real-time image/video of a laboratory environment 102 and receive mixed reality content 240 to be displayed within said real-time image/video to convey particular information to the user.
  • the configuration module 120 can include preprogrammed instructions and configurations for one or more automated laboratory setups and configurations. Initially, based on the real-time view of the automated laboratory 102, the configuration module 120 can provide users with step by step installation and configuration instructions through the display of mixed reality content within the real-world view on the user device 124.
  • the configuration module 120 can identify augmented reality installation content to be transmitted to the user device 124 for display within the real-world view on the device 124 at locations of the machine identifiable objects 230.
  • the configuration module 120 can provide a mixed reality view of locations and installation instructions for each of the components 104, 106, 108 within an empty lab space.
  • a user can provide a real-time view of an empty lab space (e.g., room, tray, container, etc.) and the configuration module 120 can determine where the appropriate components 104, 106, 108 should be installed and provide that information to the user via mixed reality content rendered within the real-time view of the lab space.
  • in FIG. 7A, an exemplary image of an automated laboratory 102 environment is depicted.
  • the example automated laboratory 102 in FIG. 7A includes a combination of automated components 104, labware components 108 (e.g., plates with test tube trays), and locations with machine identifiable objects 230 located thereon.
  • the machine identifiable objects 230 can be viewed by user devices 124 and can be used to identify specific locations, components, etc. such that they can be recognized and utilized as inputs for the system 100.
  • the machine identifiable objects 230 can be associated with locations in which particular labware components should be placed.
  • the machine identifiable objects 230 can also identify components that the system 100 will monitor and track during operation, or can include spaces in which no objects or obstructions should be present, such that if the system 100 cannot locate a particular machine identifiable object 230, it will notify the user that an error in the setup is present.
  • augmented reality objects will be rendered over any recognized machine identifiable objects 230 to assist a user in the setup and operation of the automated laboratory 102.
  • the user device 124 can first transmit at least a portion of the captured real-time image/video to the computing architecture 110 for analysis.
  • the captured image/video can include unique machine identifiable objects 230 (e.g., barcodes) that can be associated (by the computing architecture 110) with mixed reality objects (e.g., augmented reality, virtual reality, etc.).
  • the user device 124 can be configured to identify the unique machine identifiable objects 230 and transmit them to the computing architecture 110 for analyses and/or retrieval of augmented reality content 240.
  • an exemplary laboratory environment 210 can include two-dimensional Quick Response Codes (QR Codes) associated with one or more items within the laboratory environment 210.
  • the present invention can also use any combination of image recognition in addition to or in place of the machine identifiable objects 230 without departing from the scope of the present invention.
  • the computing architecture 110, in response to receiving data from the user device 124, can analyze any machine identifiable objects 230 (or image recognition) and determine whether those objects are associated with mixed reality content 240 stored in the storage system 114. The association can be determined by searching a lookup table or other database to see if a mixed reality content 240 is associated with the received data. In instances in which there is mixed reality content 240 associated with data received from the user device 124, the computing architecture 110 can determine what mixed reality content 240 is to be displayed and how it is to be displayed. Thereafter, the computing architecture 110 can be configured to package the mixed reality content 240 and instructions for the user device 124 for displaying the mixed reality content 240 in the real-time image/video and transmit the package to the user device 124 for rendering.
  • in FIG. 7C, an illustrative example of mixed reality content 240 that can be rendered in association with a machine identifiable object 230 from FIG. 7B is depicted.
  • a three-dimensional rendering of a well plate is rendered, indicating to the user the specific labware that should be placed at the particular location.
  • in FIG. 7D, upon recognition of the second machine identifiable object 230 from FIG. 7B, a second three-dimensional rendering of a well plate is rendered at the second location.
  • the computing architecture 110 can determine that a combination of image, video, text, and three-dimensional objects should be displayed at the respective locations of the machine identifiable objects 230 within the real-time image/video for the laboratory environment 210.
  • the mixed reality content 240 can be rendered at the locations of the machine identifiable objects 230, effectively covering the machine identifiable objects 230 from the view of the user.
  • the objects can be rendered with translucency so as to not completely cover the machine identifiable objects 230 from the view of the user (e.g., to act as a reference point), or a combination thereof.
  • in FIG. 7E, the user can place the labware component 108 matching the mixed reality content 240.
  • the three-dimensional well plate rendered in FIGS. 7C and 7D is replaced by the user with the real-world well plate matching the three-dimensional well plate at the same location as the three-dimensional well plate.
  • in FIG. 7F, a view of the laboratory environment 102 is depicted in which a labware component 108 has been placed, an augmented reality object 240 has been rendered indicating labware that needs to be placed, and a machine identifiable object 230 that has not yet been recognized/associated with an augmented reality object 240 is provided.
  • FIG. 7E provides representations of how the present invention progresses from a machine identifiable object 230, to rendering an augmented reality object 240, and concluding with a user having placed the appropriate labware component 108.
  • another example augmented reality object 240 can be rendered in place of a machine identifiable object 230.
  • the augmented reality object 240 can include placement of a labware component that requires additional steps from the user before setup is complete. The additional steps can be presented to the user in multiple formats, including but not limited to, text, images, video, audio, or a combination thereof.
  • a three-dimensional rendering of a reservoir is provided along with text indicating to the user to add 50ul of TE Buffer.
  • multiple augmented reality objects 240 can be rendered in place of a single machine identifiable object 230.
  • in FIG. 7H, a three-dimensional rendering of a reservoir is provided along with text indicating to the user to add 50ul of TE Buffer, along with a video that can play to show the user how to fill the reservoir.
  • a user can place the labware component 108 provided in FIGS. 7G and 7H and fill the labware component 108 with the appropriate material.
  • a reservoir is placed by the user, as shown in FIG. 7I, and 50ul of TE Buffer is added to the bin, as shown in FIG. 7J.
  • the configuration module 120 can continue to provide the augmented reality content (e.g., video, location, etc.) as machine identifiable objects 230 are recognized at the different locations, and the user can repeat the above steps until the automated laboratory is completely set up and/or configured for the experiment protocol.
  • the configuration module 120 can analyze the received machine identifiable objects 230 along with other elements within the automated laboratory environment and determine whether components 104, 106, 108 are installed at the appropriate locations and whether the appropriate steps were taken in setting up those objects (i.e., the proper amount of fluid was added). In the event that one or more components 104, 106, 108 are not installed at the appropriate locations, the configuration module 120 can provide mixed reality content 240 to the user in the form of warnings or alerts to display how said components 104, 106, 108 should be installed.
  • the configuration module 120 can also identify other objects that do not belong within the automated laboratory space and issue a warning to the user that those objects may interfere with operation of the laboratory.
  • the configuration module 120 can use image recognition technology to identify any components that are not recognized and/or are located improperly within the automated environment.
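The set-up check described in the last few passages can be reduced to comparing what the protocol expects at each location against what is actually detected, and flagging anything missing, misplaced, or unrecognized. The sketch below is illustrative only; the data shapes are assumptions.

```python
# Hypothetical set-up validation: expected vs. detected deck contents.
def validate_setup(expected: dict, detected: dict):
    """Both dicts map deck position -> marker/component id; return warnings."""
    warnings = []
    for position, marker in expected.items():
        if position not in detected:
            warnings.append(f"{marker} missing at {position}")
        elif detected[position] != marker:
            warnings.append(f"{detected[position]} misplaced at {position}; "
                            f"expected {marker}")
    for position, marker in detected.items():
        if position not in expected:
            warnings.append(f"unexpected object {marker} at {position} may interfere")
    return warnings

print(validate_setup(
    expected={"A1": "QR-A1", "B2": "QR-B2"},
    detected={"A1": "QR-A1", "C3": "empty-tip-box"},
))
```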
  • the operation module 122 can provide users with additional information during operation of the automated laboratory.
  • the operation module 122 can be configured to monitor objects within the automated laboratory environment to ensure that the automated laboratory 102 is operating as intended and provide real-time feedback to the user as to what is occurring at each stage within the operation.
  • the operation module 122 can aggregate data related to the real-world display on the user device 124 and data from the measurement module 116 and determine what mixed reality content to display on the user device 124.
  • the user can customize what data to be displayed.
  • the aggregated data can include a combination of received machine identifiable objects 230 from the user device 124, image recognition data, and received measurement data from the measurement module 116.
  • Real-time updates can include timers, sample volumes, sample IDs, transfer volumes, and intervention instructions.
  • the operation module 122 can associate different mixed reality content with the same identifiable objects used by the configuration module 120.
  • the configuration module 120 can display augmented videos showing where to set up the labware, while the operation module 122 can display augmented images or text displaying information related to the labware items themselves.
  • the operation module 122 can associate mixed reality identification information for what is in test tubes and display that information to the user in real-time.
  • the operation module 122 can display measured information for those test tubes, for example, temperature provided by the instrument components 106. This can assist a user to both know what is happening within the automated laboratory at a particular point in time as well as provide additional insight that a user would not otherwise be able to glean by watching operation of the laboratory without the mixed reality information.
  • the operation module 122 can also track and store historic data about the operations being performed within the automated laboratory 102 via data received from the measurement module 116. For example, the operation module 122 can track temperature data throughout an experiment and display the results to a user in an augmented environment. In some embodiments, the operation module 122 can provide feedback to the user based on an analysis of the historic data. For example, the operation module 122 can identify when an experiment is not running as expected and can provide a mixed reality warning to the user that something may not be operating as intended, potentially avoiding running the entire experiment when its setup or execution is not correct.
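As a concrete illustration of this kind of historic-data feedback, the sketch below flags temperature readings that drift from a set point. The threshold and readings are arbitrary examples, not values from the patent.

```python
# Minimal sketch: scan a temperature history and produce warnings suitable
# for display as mixed reality alerts; set point and tolerance are arbitrary.
def check_temperature(history, set_point_c=37.0, tolerance_c=1.0):
    warnings = []
    for timestamp, value in history:
        if abs(value - set_point_c) > tolerance_c:
            warnings.append(f"t={timestamp}s: {value:.1f} C outside "
                            f"{set_point_c}±{tolerance_c} C band")
    return warnings

history = [(0, 37.1), (60, 36.9), (120, 41.3)]   # illustrative readings
for warning in check_temperature(history):
    print("MR alert:", warning)
```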
  • the configuration module 120 and the operation module 122 can be configured to project different mixed reality content for different purposes for different circumstances.
  • the configuration module 120 can project intended labware components 108 onto a deck to facilitate proper placement, integrate video segments to provide granular instructions for setting up the laboratory environment 102 as necessary, provide mixed reality demonstrating how to place labware components 108, highlight sample wells that need controls or other attention, provide alerts if labware components 108 are placed in incorrect locations, project data above samples to identify information about the samples (e.g., concentrations, sample IDs, volume, etc.), and provide holograms highlighting the source of errors within the operation of the automated components 104 or instrument components 106 (e.g., warning sign of a crash, missing tip box, etc.).
  • any combination of mixed reality content can be provided to a user that may be useful in a particular laboratory environment.
  • mixed reality in laboratory automation can better connect the scientist to the task that the automation is performing by bringing the digital to the ‘real world’.
  • incorporation of mixed reality into the real-time view of the laboratory environment 102 can help bridge the gap between digital protocols and actual execution.
  • scientists will not only be able to better set up their experiments (e.g., via content provided by the configuration module 120), but they will be able to track in real-time what is happening during execution of those experiments (e.g., via content provided by the operation module). For example, if a user can ‘see’ where samples should be placed within the real-time environment, they can ensure proper placement. This can lead to more consistency, safer operation, and ultimately better data. Similarly, if a user can ‘see’ what the automated system is doing, then the user can better understand how the experiment is going and how the resulting data is generated. This can lead to increased traceability, more consistency, safer operation, and ultimately better data.
  • An automated liquid handler is able to perform such a liquid transfer (e.g., from a trough into the wells of a plate) in a single transfer, whereas a scientist doing this by hand would need multiple transfers.
  • the liquid handler digitally tracks each dispense, providing traceability and error-proofing that is not present when the task is performed by hand.
  • the robotic liquid handler still relies on the user to set up the automation deck correctly. If the user puts the trough or plate in the wrong location, the instrument does not know and will not perform the transfer correctly. In addition, if not enough liquid is added to the trough, then the liquid handler will not aspirate the correct amount and the required volumes in each well will not be met. As provided here, using augmented reality to set-up the equipment correctly will lead to better results.
  • Setting up a polymerase chain reaction (PCR) involves a series of liquid handling steps that often need to be performed in a particular sequence and utilizing a set amount of reagent and sample in order to get accurate and high-quality results.
  • PCR reagents can be expensive, so performing large volume reactions can be cost-prohibitive.
  • laboratory automation platforms are utilized to miniaturize reaction volumes.
  • scientists perform PCR reactions in single tubes or 96-well plates because they are easy to use and allow for reliable sample/reagent transfers.
  • On a robotic system a scientist can utilize a plate with a much higher well density (384 wells/plate or 1536 wells/plate). This allows for miniaturization of the reaction, resulting in more data generation at a lower cost per reaction.
  • Because PCR set-up requires multiple reagents and samples, it can be an error-prone process. It is possible that a reagent or sample will be missed, which will result in a failed reaction. These failed reactions delay results and increase research costs. Automated liquid handling can help limit these mistakes, but it is still reliant on the user properly setting up the automation for the reaction. Since this is a multi-reagent reaction, it is vital that the user places the correct reagent or sample in the desired location. If this is not performed correctly, the reaction will either fail or generate incorrect data. Augmented reality can assist the user through targeted prompts, embedded videos, and visual signals to ensure the correct reagents and samples are placed in the desired location. The end result will be a lower failure rate and higher quality data generation.
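The cost argument for miniaturization can be made concrete with a back-of-the-envelope calculation; the reagent price and reaction volumes below are made-up numbers used only to show the arithmetic.

```python
# Illustrative only: smaller reaction volumes reduce reagent cost per reaction.
def cost_per_reaction(reagent_cost_per_ml, reaction_volume_ul):
    return reagent_cost_per_ml * reaction_volume_ul / 1000.0

for plate, volume_ul in [("96-well", 25.0), ("384-well", 10.0), ("1536-well", 2.0)]:
    print(f"{plate}: {cost_per_reaction(20.0, volume_ul):.2f} cost units per "
          f"reaction at {volume_ul} ul")
```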
  • Any suitable computing device can be used to implement the computing devices 110, 122 and methods/functionality described herein and be converted to a specific system for performing the operations and features described herein through modification of hardware, software, and firmware, in a manner significantly more than mere execution of software on a generic computing device, as would be appreciated by those of skill in the art.
  • One illustrative example of such a computing device 800 is depicted in FIG. 8.
  • the computing device 800 is merely an illustrative example of a suitable computing environment and in no way limits the scope of the present invention.
  • embodiments of the present invention may utilize any number of computing devices 800 in any number of different ways to implement a single embodiment of the present invention. Accordingly, embodiments of the present invention are not limited to a single computing device 800, as would be appreciated by one with skill in the art, nor are they limited to a single type of implementation or configuration of the example computing device 800.
  • the computing device 800 can include a bus 810 that can be coupled to one or more of the following illustrative components, directly or indirectly: a memory 812, one or more processors 814, one or more presentation components 816, input/output ports 818, input/output components 820, and a power supply 824.
  • the bus 810 can include one or more busses, such as an address bus, a data bus, or any combination thereof.
  • FIG. 8 is merely illustrative of an exemplary computing device that can be used to implement one or more embodiments of the present invention, and in no way limits the invention.
  • the computing device 800 can include or interact with a variety of computer-readable media.
  • computer-readable media can include Random Access Memory (RAM); Read Only Memory (ROM); Electronically Erasable Programmable Read Only Memory (EEPROM); flash memory or other memory technologies; CD-ROM, digital versatile disks (DVD), or other optical or holographic media; and magnetic cassettes, magnetic tape, magnetic disk storage, or other magnetic storage devices that can be used to encode information and can be accessed by the computing device 800.
  • the memory 812 can include computer-storage media in the form of volatile and/or nonvolatile memory.
  • the memory 812 may be removable, non-removable, or any combination thereof.
  • Exemplary hardware devices include hard drives, solid-state memory, optical-disc drives, and the like.
  • the computing device 800 can include one or more processors that read data from components such as the memory 812, the various I/O components 820, etc.
  • Presentation component(s) 816 present data indications to a user or other device.
  • Exemplary presentation components include a display device, speaker, printing component, vibrating component, etc.
  • the I/O ports 818 can enable the computing device 800 to be logically coupled to other devices, such as I/O components 820. Some of the I/O components 820 can be built into the computing device 800. Examples of such I/O components 820 include a microphone, joystick, recording device, game pad, satellite dish, scanner, printer, wireless device, networking device, and the like.
  • the terms “comprises” and “comprising” are intended to be construed as being inclusive, not exclusive.
  • the terms “exemplary”, “example”, and “illustrative” are intended to mean “serving as an example, instance, or illustration” and should not be construed as indicating, or not indicating, a preferred or advantageous configuration relative to other configurations.
  • the terms “about”, “generally”, and “approximately” are intended to cover variations that may exist in the upper and lower limits of the ranges of subjective or objective values, such as variations in properties, parameters, sizes, and dimensions.
  • the terms “about”, “generally”, and “approximately” mean the stated value, or plus or minus 10 percent or less of that value. In one non-limiting example, the terms “about”, “generally”, and “approximately” mean sufficiently close to be deemed by one of skill in the art in the relevant field to be included.
  • the term “substantially” refers to the complete or nearly complete extent or degree of an action, characteristic, property, state, structure, item, or result, as would be appreciated by one of skill in the art. For example, an object that is “substantially” circular would mean that the object is either completely a circle to mathematically determinable limits, or nearly a circle as would be recognized or understood by one of skill in the art.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

Systems and methods for providing instructive mixed reality overlays for configuring automation protocols in laboratory processes using computing systems include receiving an image from an image feed of an environment. Component markers in the image of the environment are detected and matched with an associated environment component, an environment component action, or both. The environment component or environment component action, or both, are selected and an animation of an instruction is rendered. An overlay of the animation is caused to be displayed in an augmented reality device associated with a user so as to appear in a location in the environment of the environment component or environment component action, or both.

Description

SYSTEMS AND METHODS FOR IMPLEMENTED MIXED REALITY IN
LABORATORY AUTOMATION
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] The present application claims priority to the earlier filed provisional application having serial no. 62/801,045, and hereby incorporates the subject matter of the provisional application in its entirety.
BACKGROUND
[0002] Generally, laboratory automation is utilized to research, develop, optimize and capitalize on technologies in the laboratory. Laboratory automation professionals are typically academic, commercial and government researchers, scientists and engineers who conduct research and develop new technologies to increase productivity, elevate experimental data quality, reduce lab process cycle times, or enable experimentation that otherwise would be impossible. A popular application of laboratory automation technology is laboratory robotics utilizing different automated laboratory instruments, devices, software algorithms, and methodologies used to enable, expedite and increase the efficiency and effectiveness of scientific research in laboratories. One example of such a robotic system in an automated laboratory setting is the use of autosamplers using micro syringes. Laboratories devoted to activities such as high-throughput screening, combinatorial chemistry, automated clinical and analytical testing, diagnostics, large scale biorepositories, and many others, would not exist without advancements in laboratory automation.
[0003] Automated laboratory systems can be expensive and require very precise setup and configuration to ensure that the automated equipment can perform the specific tasks without issue. As such, downtime within an automated laboratory (caused by malfunctions, crashes, errors, etc.) can be costly in both time and money to resolve the issue(s) causing the downtime. A significant number of lab automation crashes or errors, such as sample processing errors, can be attributed to human error. A large percentage of human errors are caused by misplaced or improperly setup labware.
SUMMARY
[0004] There is a need for improvements for assisting users in the configuration and management of an automated laboratory; in particular, providing users with more straightforward guidance on where labware should be placed and how the automated laboratory should be set up can help to eliminate mistakes, increase safety, and reduce instrument downtime due to crashes. Additionally, providing users with real-time feedback for what is occurring during operation of the automated lab can assist the user to understand what is happening and spot potential issues. The present invention is directed toward further solutions to address this need, in addition to having other desirable characteristics.
BRIEF DESCRIPTION OF THE FIGURES
[0005] These and other characteristics of the present invention will be more fully understood by reference to the following detailed description in conjunction with the attached drawings, in which:
[0006] FIG. 1 is an exemplary computing system for implementing the systems and methods of the present invention;
[0007] FIG. 2 is an exemplary process for designing experiment protocols for use in accordance with the present invention;
[0008] FIG. 3 is an exemplary computing system for implementing the systems and methods of the present invention;
[0009] FIG. 4 is an exemplary computing system for implementing the systems and methods of the present invention;
[0010] FIG. 5A is an exemplary graphical user interface for designing experiment protocols;
[0011] FIG. 5B is an exemplary graphical user interface for transforming an experiment protocol into an augmented reality program;
[0012] FIG. 6 is an exemplary process for configuring an automated laboratory using the system and methods of the present invention;
[0013] FIGS. 7A, 7B, 7C, 7D, 7E, 7F, 7G, 7H, 7I, and 7J are exemplary implementations of configuring an automated laboratory using augmented reality; and
[0014] FIG. 8 shows an exemplary computer architecture for use, in accordance with the present invention.
DETAILED DESCRIPTION
[0015] An illustrative embodiment of the present invention relates to systems and methods for providing mixed reality content within an automated laboratory. The incorporation of mixed reality into a real-time view of the laboratory environment is provided to help bridge the gap between digital protocols and actual execution. The present invention can assist in the setup of experiments and track what is happening during execution of those experiments in real time via rendering of mixed reality content.
[0016] FIGS. 1 through 8, wherein like parts are designated by like reference numerals throughout, illustrate an example embodiment or embodiments of improved operation for automated laboratories by implementing augmented reality, according to the present invention. Although the present invention will be described with reference to the example embodiment or embodiments illustrated in the figures, it should be understood that many alternative forms can embody the present invention. One of skill in the art will additionally appreciate different ways to alter the parameters of the embodiment(s) disclosed in a manner still in keeping with the spirit and scope of the present invention.
[0017] FIG. 1 depicts an illustrative system 100 for implementing the steps in accordance with the aspects of the present invention. In particular, a computing architecture 110 may interface with user devices 124, either directly or via a telecommunication network(s) 126 to provide mixed reality (MR) assistance with configuring, administering, and monitoring automated lab processes in a laboratory environment 102. As used herein, the term “mixed reality” refers to virtual reality, augmented reality, or other interface for providing virtual auditory and visual objects to a user, and any combination thereof. In some embodiments, the computing architecture 110 uses MR objects to provide cues related to the laboratory environment 102 such that a user may interact with a sophisticated and complex laboratory environment 102 with minimal training while also minimizing risks due to user error.
[0018] In particular, the system 100 may include a combination of hardware and software implemented within an automated laboratory 102 for carrying out automated laboratory processes. The automated laboratory 102 can include a combination of components that would commonly be located within an automated lab and may be set up, configured, initialized or otherwise manipulated to automatically perform laboratory processes with respect to a sample or other process subject. In some embodiments, the components can include automated components 104, instrument components 106, and labware components 108. In some embodiments, each of the automated components 104, instrument components 106, and labware components 108 may be recognizable by the computing architecture 110 such that the computing architecture 110 may assist with the set-up, configuration, initialization or other manipulation/interaction based on the detection and recognition of each component in a predefined laboratory protocol.
[0019] The automated components 104 can include any combination of automated lab equipment including robotics, conveyors, computing systems, or any automated laboratory components known in the art. For example, the automated components 104 can include robotic arms, actuators, automated samplers, etc. As would be appreciated by one skilled in the art, the automated components 104, and configuration thereof, can vary based on laboratory size, type of research, etc. In some embodiments, the automated components 104 may be recognizable by the computing architecture 110 using, e.g., computer identifiable objects or markers placed thereon. For example, the automated components 104 can include barcodes, QR code labels, etc. that can be machine readable via IR scanners, video capture, etc. As would be appreciated by one skilled in the art, the automated components 104 themselves can be machine recognizable utilizing other or additional object recognition techniques and combinations thereof, including any combination of systems known in the art.
[0020] The instrument components 106 can include hardware and software configured to take measurements and monitor other elements within the automated laboratory. For example, the instrument components 106 can include temperature sensors, motion sensors, light sensors (e.g., IR sensors), vibration sensors, video sensors, audio sensors, spectrometers, or any other sensors known in the art. The sensors can include a combination of sensors included within a particular automated laboratory setup and/or supplemented with sensors specifically for use with the present invention. For example, additional sensors for identifying objects, placement of objects, volume of fluid within an object, etc. can be included with standard laboratory automation components to make up the instrument components 106. In some embodiments, the instrument components 106 can be configured to communicate data to a computing architecture 110 for additional processing, including, e.g., monitoring a laboratory process or experiment, reconfiguring or troubleshooting a laboratory process, collecting real-time data and laboratory environment status, as well as any other processing and combinations thereof. For example, the instrument components 106 can include any combination of wired or wireless (e.g., WiFi, Bluetooth, etc.) enabled devices with protocols for sharing data with other computing devices.
[0021] The labware components 108 can include any combination of non-automated equipment found in an automated laboratory. For example, the labware components 108 can include glassware, trays, plates, test tubes, pipettes, containers, or any other combination of labware known in the art. In some embodiments, the labware components 108 may be recognizable by the computing architecture 110 using, e.g., computer identifiable markers placed thereon. For example, the labware components 108 can include barcodes, QR code labels, etc. that can be machine readable via IR scanners, video capture, etc. As would be appreciated by one skilled in the art, the labware components 108 themselves can be machine recognizable utilizing other or additional object recognition techniques and combinations thereof, including any combination of systems known in the art.
[0022] In some embodiments, the system 100 can utilize the mixed reality lab automation computing architecture 110 to provide mixed reality for facilitating interactions within the laboratory environment 102. In particular, the computing architecture 110 can utilize information received from within the system 100 to determine what mixed reality content should be displayed on user devices 124 in a real-time view of the automated laboratory 102. In some embodiments, the computing architecture 110 can include a computing system with specialized software, hardware, and databases designed for providing informative and instructive mixed reality to a user to facilitate interacting with the complex and sophisticated laboratory environment 102. For example, the computing architecture 110 can be software installed on a local computing device 112, a remote centralized computing device 112, a cloud based architecture accessible by other computing devices (e.g., the user devices 124), a web accessible architecture, or the like. However, the computing architecture 110 may instead be incorporated into the user devices 124 for local processing, or a combination of local and cloud systems.
[0023] The combination of hardware and software that makes up the computing device 112 within the computing architecture 110 is specifically configured to provide a technical solution to a particular problem utilizing an unconventional combination of steps/operations to carry out aspects of the present invention. In particular, the computing architecture 110 is designed to utilize the computing device 112 to execute a unique combination of steps to provide a novel approach to providing informative and instructive mixed reality to a user within a real-time view of an automated laboratory. This technical solution arises out of problems created by the development and increased use of automated laboratories.
[0024] In accordance with an example embodiment of the present invention, the computing device 112 can include a processor, a memory, an input/output interface, input and output devices, and a storage system 114. The computing device 112 can include an operating system configured to carry out operations for the applications installed thereon. As would be appreciated by one skilled in the art, the computing device 112 can include a single computing device, a collection of computing devices in a network computing system, a cloud computing infrastructure, or a combination thereof. Similarly, as would be appreciated by one of skill in the art, the storage system 114 can include any combination of computing devices configured to store and organize a collection of data. For example, the storage system 114 can be a local storage device on the computing device 112, a remote database facility, or a cloud computing storage environment. The storage system 114 can also include a database management system utilizing a given database model configured to interact with a user for analyzing the database data.
[0025] Moreover, in some embodiments, the computing architecture 110 can include a combination of core components to carry out the various functions of the present invention using the computing device 112 and storage system 114. For example, some embodiments include the computing device 112 including hardware components, software components, or a combination thereof for recognizing automated components 104 and labware components 108 and, based on automation protocols and feedback from instrument components 106, assisting users with configuration and monitoring of an automated laboratory process using MR via the user devices 124. In accordance with an example embodiment of the present invention, the computing device 112 of the computing architecture 110 can include an automation protocol module 118, a configuration module 120, and an operation module 122 in communication with a measurement module 116. As would be appreciated by one skilled in the art, the modules 116, 118, 120 and 122 can include any combination of hardware and software configured to carry out the various aspects of the present invention.
[0026] In accordance with an example embodiment of the present invention, the computing device 112 may utilize measurements from the measurement module 116 to provide feedback to, e.g., a user at a user device 124 for an MR overlay of the laboratory environment 102 based on a current state in the laboratory environment 102. In some embodiments, the measurement module 116 can receive and/or obtain data from the instrument components 106 in real time or periodically. For example, the measurement module 116 can be configured to communicate with, aggregate data from, and store measurements and other data obtained by the instrument components 106. In some embodiments, the measurements and other data can be stored in the storage system 114 for use by the computing device 112, particularly the configuration module 120 and the operation module 122.
[0027] In one example, the measurement module 116 can be located proximate to the laboratory environment 102 and be communicatively attached to the various instrument components 106 therein. Upon receiving data from the instrument components 106, the measurement module 116 can transmit or otherwise provide said information to the storage system 114 and/or computing device 112.
[0028] In another example, the measurement module 116 can be located remotely with the computing device 112 and receive the data from the instrument components 106 through a wireless communication means. For example, the instrument components 106 can be internet of things (IoT) enabled devices and can wirelessly transmit data to the measurement module 116 within the computing architecture 110.
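By way of illustration only, the following minimal Python sketch shows one possible way such a measurement module could aggregate readings pushed by wireless instrument components; the class names, fields, and example values are hypothetical assumptions and not part of the described system.

    import time
    from dataclasses import dataclass, field

    @dataclass
    class InstrumentReading:
        instrument_id: str       # identifier of the instrument component reporting the value
        measurement: str         # e.g., "temperature", "volume"
        value: float
        timestamp: float = field(default_factory=time.time)

    class MeasurementModule:
        """Aggregates readings from instrument components and exposes the latest state."""

        def __init__(self):
            self._latest = {}    # (instrument_id, measurement) -> InstrumentReading

        def ingest(self, reading):
            # Keep only the most recent reading for each instrument/measurement pair.
            self._latest[(reading.instrument_id, reading.measurement)] = reading

        def current_state(self):
            # Snapshot consumed by the configuration and operation modules.
            return dict(self._latest)

    # Example: a WiFi-enabled temperature sensor reports a reading.
    module = MeasurementModule()
    module.ingest(InstrumentReading("incubator-01", "temperature", 37.2))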
[0029] In some embodiments, the automation protocol module 118 may be configured for constructing laboratory protocols that include MR instructions for configuring the automated components 104, instrument components 106 and labware components 108 of the laboratory environment 102 to accurately perform an automated laboratory process. In many cases, the equipment, components and settings for even relatively simple automated laboratory processes may be complex and difficult to set up. To facilitate reliably instructing a user for configuring the laboratory environment 102, a user or administrator or other laboratory professional may design the laboratory process in a simulated environment using, e.g., mixed reality objects.
[0030] Accordingly, in some embodiments, the automation protocol module 118 may include, e.g., a library of three-dimensional representations of components, such as the automated components 104, instrument components 106 and labware components 108, as well as template protocols and settings for each component used in each process. A professional may access the computing device 112 to modify the libraries and assemble a protocol. For example, the user may select a deck layout of equipment, assign labware to particular locations in the deck, establish volume requirements of samples, assign liquid types, and import metadata related to each piece of equipment, deck configuration, and labware component, among other configuration options.
[0031] In some embodiments, the metadata may include computer identifiable items such as objects or markers (e.g., QR barcodes) that can be used to associate equipment and components with real-world markers. Thus, in some embodiments, in addition to establishing a protocol in the automation protocol module 118, the professional may also establish corresponding physical markers in the laboratory environment 102 to mark component locations, e.g., locations for well plates among other components. Accordingly, in some embodiments, the professional may design the automation protocol to correlate specific equipment to the component locations based on the needs of a given laboratory process. Each automation protocol may then be stored as a preset laboratory configuration in the storage system 114 for access by, e.g., the configuration module 120 and the operation module 122.
[0032] In accordance with an example embodiment of the present invention, the configuration module 120 can be configured to provide assistance to a user (e.g., a user of a user device 124) with the configuration and/or assembly of the automated laboratory environment 102 by generating MR overlays at the user device 124 based on established protocols configured with the automation protocol module 118. The configuration module 120 can be configured to create and/or provide preset laboratory configurations at the user device 124 based on the automated laboratory equipment being used and the specific experiments being run. The preset laboratory configurations can be stored for providing assistance during laboratory setup. In some embodiments, the configuration module 120 can provide assistance by providing instructions to a user in the form of mixed reality elements within a real-time display of the laboratory environment 102 on the user device 124.
[0033] Thus, in some embodiments, the configuration module 120 can provide mixed reality content for display on the user device 124 in response to data received from the user device 124. In particular, the configuration module 120 can utilize a combination of data received from the user devices 124 and/or the measurement module 116 to analyze a current configuration and layout of the components 104, 106, 108 within the laboratory environment 102 and determine the mixed reality information to display to the user via the user devices 124 for completing a setup according to the stored preset laboratory configurations.
[0034] For example, the configuration module 120 can provide a series of mixed reality overlays in a display of the user device 124 based on components within view of a camera of the user device 124. In particular, the user device 124 may capture imagery of the laboratory environment 102 and recognize the markers placed at locations of the laboratory environment 102. Using the markers, the configuration module 120 may identify a component or substance to be placed at the marker based on the preset laboratory configuration for running a selected laboratory process with the automated components 104. The configuration module 120 may then serve an overlay of a corresponding three-dimensional image to the user device 124 overlaying the associated location in the real-world laboratory environment 102 such that the three-dimensional image depicts a step in configuring the laboratory environment 102 for the particular laboratory process.
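As a purely hypothetical sketch of this marker-to-overlay lookup (the marker identifiers, model file names, and data layout below are illustrative assumptions, not details of the described system), the configuration module's behavior could be organized along these lines in Python:

    # Hypothetical preset laboratory configuration: detected marker ID -> labware overlay.
    PRESET_CONFIGURATION = {
        "QR-DECK-POS-1": {"labware": "96 well plate", "model": "models/96_well_plate.glb"},
        "QR-DECK-POS-2": {"labware": "reagent trough", "model": "models/reagent_trough.glb"},
    }

    def overlay_for_marker(marker_id):
        """Return the 3D model (and label) to overlay at a detected marker, if any."""
        assignment = PRESET_CONFIGURATION.get(marker_id)
        if assignment is None:
            return None  # the marker is not used by the selected automation protocol
        return {"anchor": marker_id, "model": assignment["model"], "label": assignment["labware"]}

    print(overlay_for_marker("QR-DECK-POS-1"))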
[0035] In another example, if the configuration module 120 reviews real-time video capture of the laboratory environment 102 and determines that an identified instrument component 106 (or any component) is set up at a wrong location, the configuration module 120 can provide a mixed reality instructional video to be displayed to the user within the real-time display on the user device 124 overlaying a real-world laboratory environment 102 location where said instrument component 106 should be set up. Furthermore, machine learning and image recognition algorithms can be incorporated to identify other issues, for example, incorrect labware, empty tip boxes, or hazards, preventing crashes before they occur.
[0036] In accordance with an example embodiment of the present invention, the operation module 122 can be configured to provide information regarding a real-time state of materials, equipment, components and/or data to a user (e.g., a user of a user device 124) during operation of the automated laboratory environment 102. In some embodiments, the operation module 122 can provide information to a user in the form of mixed reality elements within a real-time display of the laboratory environment 102 on the user device 124.
[0037] For example, the operation module 122 can provide a mixed reality overlay for display on the user device 124 in response to data received from the user device 124 and the system 100. In particular, the operation module 122 can utilize a combination of data received from the user devices 124, measurement module 116, and/or the operation module 122 to analyze a current configuration and operational state of the components 104, 106, 108 within the laboratory environment 102 and determine the mixed reality information to display to the user via the user devices 124. For example, the operation module 122 can receive data related to outputs from the instrument components 106 (e.g., temperatures, spin rates, incubation times, etc.), as well as other computer identifiable items (e.g., QR barcodes), within the laboratory environment 102 and relay that information in a human readable format to the user via mixed reality text and/or images to be displayed to the user within the real-time display on the user device 124.
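Purely as an illustrative sketch (the reading format, units, and function name are hypothetical assumptions), converting raw instrument outputs into human-readable overlay text could look as follows:

    def format_status_overlay(readings):
        """Convert raw instrument readings into human-readable overlay text."""
        lines = []
        for (instrument_id, measurement), value in sorted(readings.items()):
            lines.append(f"{instrument_id}: {measurement} = {value}")
        return "\n".join(lines)

    # Example readings keyed by (instrument, measurement); values and units are illustrative.
    readings = {("incubator-01", "temperature C"): 37.2, ("shaker-02", "spin rate rpm"): 300}
    print(format_status_overlay(readings))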
[0038] In accordance with an example embodiment of the present invention, the system 100 can include one or more user devices 124 configured to communicate with the computing architecture 110 over a telecommunication network(s) 126. In some embodiments, the computing architecture 110 can act as a host for each of the user devices 124, providing the functionality of the modules 116, 118, 120 to the user devices 124 while sharing a secured network connection. As would be appreciated by one skilled in the art, the one or more user devices 124 can include any combination of computing devices, as described with respect to the computing device 112. For example, the computing device 112 and the plurality of user devices 124 can include any combination of servers, personal computers, laptops, tablets, smartphones, virtual reality goggles, etc. In accordance with an example embodiment of the present invention, the computing devices 112, 124 are configured to establish a connection and communicate over telecommunication network(s) 126 to carry out aspects of the present invention. As would be appreciated by one skilled in the art, the telecommunication network(s) 126 can include any combination of known networks. For example, the telecommunication network(s) 126 may be a combination of a mobile network, WAN, LAN, or other type of network. The telecommunication network(s) 126 can be used to exchange data between the computing devices 112, 124, exchange data with the storage system 114, and/or to collect data from additional sources.
[0039] In some embodiments, the one or more user devices 124 can include, or otherwise be communicatively attached to, a device configured to capture video in real time, transmit video to the computing device 112 for analysis, and receive mixed reality enhancements for display within the real-time video. For example, the user devices 124 can include or be communicatively attached to a camera configured to capture video in real time. Additionally, the user devices 124 can include or otherwise be in communication with software (e.g., via a web portal or cloud infrastructure) that communicates with the computing architecture 110, and data can be shared between the user devices 124 and the computing architecture 110. The user devices 124 can be configured to provide real-time information to the computing architecture 110, the computing architecture 110 can perform analysis on the received data, and the computing architecture 110 can provide mixed reality content to the user devices 124 for display within the real-time display.
[0040] Referring to FIG. 2, an exemplary process 200 for configuring the system 100 to implement augmented reality for use in accordance with the present invention is provided. At step 210, through the computing architecture 110, a user interface can be provided to enable an expert user to design and program an experiment protocol for the laboratory environment 102. When designing the protocol, the expert user can define the required automated components 104, instrument components 106, and proper utilization and placement of labware 108 for a particular experiment. The experimental definitions can include positioning (e.g., location, orientation, etc.) for each of the components 104, 106, 108, parameters (e.g., materials, temperatures, quantities, etc.) for those components 104, 106, 108, unique machine identifiable objects associated with those components 104, 106, 108, and augmented reality objects to be associated with any of the machine identifiable objects. As would be appreciated by one skilled in the art, an automated laboratory 102 can be configured for any combination of experiments and each of the experiments would have its own uniquely designed and stored protocol. Once the protocol is designed, it can be saved in a database for future use (e.g., editing, sharing, etc.).
[0041] Referring to FIG. 5A, an example experiment protocol design within a graphical user interface (GUI) 500 is depicted. The experiment protocol design GUI 500 can include a combination of windows to assist the expert user in designing an experiment protocol. For example, the experiment protocol design GUI 500 can include a first window for an automation script for programming rules and methods into the protocol and a second window with the deck layout for the experiment, as shown in FIG. 5A. Using the experiment protocol design GUI 500, the expert user can program experiment protocols by designing the deck layout in the second window, including but not limited to assigning labware (e.g., tubes, plates, racks, etc.), and using the first window to assign values to the labware, including but not limited to defining liquid types, volume requirements, etc. for the labware. Additionally, metadata can be assigned to the labware (e.g., machine identifiable objects).
[0042] Continuing with FIG. 2, at step 220 the computing architecture 110 can transform the experiment protocols designed by expert users (step 210) into an augmented reality program within the computing architecture 110 that can be utilized/rendered to a user during laboratory configuration and/or operation. The transformation step can be performed through automated translation utilizing integrated software or through manual intervention. For example, the experiment protocol can be exported to an augmented reality program that generates all of the components from the deck layout on an automation platform for assignment of augmented reality objects. In some embodiments, the deck layout in the augmented reality program can include all the components provided in step 210 positioned at the locations on the deck specified by the expert user.
[0043] Once in the augmented reality program, each of the labware components on the deck layout can be associated with machine identifiable objects (step 210), which can then be associated with one or more augmented reality objects. For example, the protocol can assign visual markers in the form of QR Codes to each labware component, and each of the QR Codes can be associated with augmented reality objects stored in a database. FIG. 5B depicts an example graphical user interface 502 including augmented reality objects 240 and machine identifiable objects 230 within the augmented reality program. The visual markers can be automatically generated, assigned, and/or selected by the user from a database. For example, a component identified as a 96 well plate in the experiment protocol from step 210 can be transformed into a three-dimensional image of a 96 well plate at the specified location on the generated deck platform in the augmented reality program. In some embodiments, two or more augmented reality objects can be associated with the same component. For example, the 96 well plate can include a tag indicating that it should include an instructional video, and the appropriate video will be appended to the three-dimensional image of the 96 well plate as specified in step 210. As would be appreciated by one skilled in the art, the augmented reality content can include any combination of images, text, video, notes, prompts, etc. known in the art. Once compiled, the augmented reality program can be linked to one or more user devices 124 for use in accordance with the present invention.
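As a hypothetical sketch of this transformation step (the dictionary keys, marker format, and example component are illustrative assumptions only), assigning visual markers and augmented reality objects to exported protocol components could resemble the following:

    import uuid

    def assign_ar_objects(protocol_components):
        """Attach a visual marker and AR content references to each exported labware component."""
        assignments = []
        for component in protocol_components:
            assignments.append({
                "component": component["name"],              # e.g., "96 well plate"
                "deck_position": component["position"],      # deck location from step 210
                "marker_id": "QR-" + uuid.uuid4().hex[:8],   # auto-generated visual marker
                "ar_objects": component.get("ar_objects", ["3d_model"]),
            })
        return assignments

    exported_protocol = [
        {"name": "96 well plate", "position": "P3",
         "ar_objects": ["3d_model", "instructional_video"]},
    ]
    print(assign_ar_objects(exported_protocol))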
[0044] Referring to FIG. 3, an exemplary process for utilizing the computing device 112 and storage system 114 to implement experiment design in the automation protocol module 118 is depicted.
[0045] In some embodiments, the automation protocol module 118 may be configured to perform step 210 from FIG. 2 above to create a laboratory configuration 210 using data received from the storage system 114. For example, the automation protocol module 118 may retrieve from the storage system 114 automation lab equipment and labware information such as, e.g., a labware library 216 having details describing various labware items, and a deck equipment library 217 having details describing automated deck equipment.
[0046] In some embodiments, a laboratory process may utilize a variety of automated components 104 that need to be set up properly to accurately and reliably run the process. In order to provide a user, via a user device 124, with accurate mixed reality instruction regarding the configuration of the automated components 104 in the laboratory process, digital representations of the automated components 104 may be laid out to create a digital representation of an automated deck of equipment for implementing the laboratory process. Thus, in some embodiments, using the labware library 216 and the deck equipment library 217, as well as other possible inputs such as measurement data from the measurement module 116, the automation protocol module 118 may be used to design an automated deck layout 211. Using a configuration graphical user interface (GUI), a user may control the automation protocol module 118 to configure deck equipment from the deck equipment library 217 to create the automated deck layout design 211. For example, the user may design the automated deck layout design 211 to match a real-world configuration of automated laboratory components 104 in the laboratory environment 102.
[0047] In some embodiments, in addition to an arrangement or configuration of automated components 104, lab components 108 may also need to be placed in specific locations in order to accurately and reliably execute a laboratory process in the laboratory environment 102. Thus, labware arrangements relative to the automated deck layout design 211 may also be configured. For example, the user may use the automation protocol module 118 to configure labware assignments 212 relative to the automated deck layout design 211. In particular, the user may select labware from the labware library 216 to be located in areas on the automated deck layout of the automated deck layout design 211. For example, the user may drag a representation of a labware component and place it in a graphical location within the configuration GUI corresponding to a location on the automated deck layout design 211. As a result, the laboratory configuration 210 can include a graphical design for where to place labware components on a deck layout to represent locations in the real-world configuration of the automated components 104 in which a user may be instructed to place lab components 108 based on the labware assignments 212 in the laboratory configuration 210.
[0048] In some embodiments, the laboratory configuration 210 may also include information for setting sample volumes for the labware of the labware assignments 212. During the real-world laboratory process, the automated components 104 may be set to fill laboratory components 108 in a given location with a particular amount of a sample, or lab components 108 to be placed in a given location may need to be placed there with a particular amount of sample. These volume requirements 213 may be established and added to the laboratory configuration 210 via the automation protocol module 118 using, e.g., the configuration GUI.
[0049] In some embodiments, a particular laboratory process may also depend on liquid types provided to the automated components 104 and lab components 108, such as, e.g., solutions, samples, and reagents, among other liquids. Thus, in designing the laboratory configuration 210 with the automation protocol module 118, a user may provide liquid type assignments 214 to various components in the configuration GUI. For example, a user may assign a particular solution to a particular labware assignment 212, such as, e.g., a 96 well plate assigned to a particular automated component of the automated deck layout design 211. Thus, an automated laboratory protocol can be designed end-to-end using the automation protocol module 118, including the layout of automated components, selection and locations of labware, volume requirements and fluid types from start to finish of the laboratory process.
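The following minimal Python data-structure sketch illustrates, under assumed names and fields that are not part of the described system, how the pieces of such an end-to-end laboratory configuration 210 (deck layout 211, labware assignments 212, volume requirements 213, liquid types 214, and marker metadata 215) could be grouped together:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class LabwareAssignment:
        labware_type: str        # item from the labware library 216, e.g., "96 well plate"
        deck_position: str       # location on the automated deck layout design 211
        volume_ul: float = 0.0   # volume requirement 213 for this position
        liquid_type: str = ""    # liquid type assignment 214, e.g., "master mix"
        marker_id: str = ""      # metadata 215 linking the assignment to a physical marker

    @dataclass
    class LaboratoryConfiguration:
        deck_layout: List[str]                                   # deck equipment from library 217
        labware: List[LabwareAssignment] = field(default_factory=list)

    config = LaboratoryConfiguration(
        deck_layout=["liquid handler deck"],
        labware=[LabwareAssignment("96 well plate", "P3", 50.0, "master mix", "QR-DECK-POS-1")],
    )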
[0050] In some embodiments, to facilitate matching requirements and assignments of the laboratory configuration 210 to the real-world configuration of automated laboratory components 104 in the laboratory environment 102 for, e.g., mixed reality rendering of set-up guides via the user device 124, the automation protocol module 118 may load a metadata library 215 for each requirement and assignment, e.g., from the storage system 114, e.g., in the labware library 216. In some embodiments, the metadata can include data identifying computer readable location markers such that labware, volumes and liquids can be correlated to the computer readable markers provided in the laboratory environment 102.
[0051] In some embodiments, the laboratory configuration 210 can be saved and stored in the storage system 114 for use by, e.g., the computing device 112 including the configuration module 120 and the operation module 122. Thus, a user may select the laboratory configuration 210 when setting up the laboratory environment 102 for an associated automated laboratory process.
[0052] Referring to FIG. 4, an exemplary process for utilizing the computing device 112 and storage system 114 to implement an augmented reality program in the configuration module 120 is depicted.
[0053] In some embodiments, a laboratory configuration 226, such as the laboratory configuration 210 designed in the automation protocol module 118 above, can be used by the configuration module 120 to render mixed reality or augmented reality elements. In some embodiments, the elements can include graphical overlays to display at the user device 124 to depict physical steps to take in the laboratory environment 102 to establish an automated laboratory protocol.
[0054] In some embodiments, the configuration module 120 may receive the laboratory configuration 226, either from the storage system 114 or as exported directly from the automation protocol module 118. Using the automated deck layout design 211, the configuration module 120 may render each component of the automated platform in the laboratory environment 102 as a 3D virtual object for use in mixed reality devices such as the user devices 124. In some embodiments, the configuration module 120 translates configuration scripts, data, parameters, and other information into the virtual objects using a suitable rendering engine for rendering mixed reality objects, such as, e.g., the Unity™ engine, or other suitable engine. Included in the rendering of the automated deck layout 221 may be assignments for visual markers 222 based on, e.g., the labware assignments 212 as described above. In some embodiments, the labware of the labware assignments 212 can be linked to visual markers 222 based on where the labware is assigned relative to the automation deck layout. The visual markers may mark the position and/or component at which the labware is assigned, and thus may serve as an object in the configuration rendering 220 to provide a computer readable marking for identifying the labware assignments 212.
[0055] In some embodiments, the configuration module 120 may use the labware assignments 212 and labware imagery to map labware, e.g., lab components 108, assigned in the laboratory configuration 226 to 3D imagery depicting the labware. In some embodiments, the labware imagery may be provided by a pre-rendered set of 3D labware images for mixed reality environments maintained in an imagery library 227 in the storage system 114. Based on, e.g., metadata, tags, labels, names or other details and parameters of each labware assignment code in the laboratory configuration 226, a corresponding labware model can be selected from the imagery library 227 and mapped to the labware assignment 212. Because each labware assignment 212 is also correlated with a marker via the metadata library 215 and the marker assignments 222, the labware image mappings 223 can be used to determine where in the laboratory environment 102 to overlay a rendering of each labware component 108 for instructing a user.
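As an assumption-laden sketch only (the library contents, file names, and dictionary layout are hypothetical), the mapping from labware assignments to pre-rendered 3D models keyed by visual marker could be expressed as follows:

    # Hypothetical imagery library 227: labware type -> pre-rendered 3D model file.
    IMAGERY_LIBRARY = {
        "96 well plate": "imagery/96_well_plate.glb",
        "reagent trough": "imagery/reagent_trough.glb",
    }

    def map_labware_images(labware_assignments):
        """Pair each visual marker with the 3D model depicting its assigned labware."""
        mappings = {}
        for assignment in labware_assignments:
            model = IMAGERY_LIBRARY.get(assignment["labware_type"])
            if model is not None:
                mappings[assignment["marker_id"]] = model
        return mappings

    print(map_labware_images([{"labware_type": "96 well plate", "marker_id": "QR-DECK-POS-1"}]))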
[0056] For example, in some embodiments, the configuration module 120 may receive user device imagery 229A captured by an image sensor in the user device 124. Where the user device imagery 229A includes an image of a visual marker (e.g., a QR code) in the laboratory environment 102, the configuration module 120 can reference the automated deck layout rendering 221 and marker assignments of the configuration rendering 220. Using the marker assignment 222 associated with the imaged visual marker, the configuration module 120 can select the corresponding labware image mapped to that marker and provide the labware image to the user device 124 display as a user device overlay 229B. Once sent to the display, the user device 124 may display the labware image in a location on the display that intersects the user’s field of view and the visual marker in the laboratory environment 102 to overlay an animation of the appropriate labware component 108 in the location at which the labware component 108 is needed for the selected automation protocol. For example, where an automation protocol has been designed that requires a 96 well plate to be placed on a particular automated component 106, upon capturing an image of the corresponding visual marker on the automated component 106, the configuration module 120 may provide an animation to the user device 124 as a user device overlay 229B including a 3D model of the 96 well plate being positioned onto the automated component 106.
[0057] Moreover, the configuration module 120 may also utilize additional instruction 224 of the configuration rendering 220 to further instruct the user on placement and configuration of the labware components 108 and automated components 106. In particular, the configuration rendering 220 can include, e.g., additional instruction 224 based on, e.g., volume requirements 213, liquid type assignments 214, and other automation protocol parameters to effectuate a particular automation protocol for a laboratory process. For example, upon rendering an instructive animation for the placement of a particular labware component 108, the configuration module 120 may also provide overlay instruction in a user device overlay 229B, e.g., via textual description or an animation, regarding the volume of a particular fluid with which to fill the labware component 108. Similarly, the additional instruction 224 may include overlay instructions in a user device overlay 229B for inputting settings into an interface of automated components 106, such as, e.g., aliquoting settings, system control settings, iterations of an operation, among other settings, e.g., via textual description or an animation.
[0058] Referring to FIG. 6, an exemplary process 600 for utilizing the system 100 to implement augmented reality in the laboratory environment 102 is depicted. In particular, the steps of process 600 provide augmented reality objects to a user device 124 to assist in the proper setup of an automated laboratory for a particular experiment protocol. Initially, the user device 124 can communicate with the computing architecture 110 to receive the necessary information and content for utilization within the laboratory environment 102. The communication can include synchronizing and updating with the latest laboratory layouts, experiment protocols, augmented reality objects, etc. to be generated to the device. As would be appreciated by one skilled in the art, any combination of data can be preloaded into the application running on the user device or downloaded and/or rendered in real time from data stored remotely on the computing architecture 110.
[0059] In some embodiments, there can be multiple user devices 124 within the laboratory environment 102 communicating and sharing data with the computing architecture 110. For example, the laboratory environment 102 can include a computing device responsible for the automated function of the robotics within the laboratory environment 102, which will need to download the aspects of the experiment protocol responsible for providing instructions to the robotics during operation of the experiment. Similarly, a user can have a portable computing device for viewing the laboratory environment 102 and receiving augmented reality feedback while setting up the laboratory environment 102 and during operation of the experiment, which will need to download augmented reality content and render it in a real-time view of the laboratory environment 102.
[0060] At step 602, using a user device(s) 124 with access to the programming of the present invention, the user selects the automated protocol they would like to execute in the laboratory environment 102. Selecting the automated protocol can include presenting the user with a graphical user interface with options for different automated laboratory equipment and/or specific experiments to be run. Each of the options can be customizable by the user based on their equipment and experimental preferences. In some embodiments, the system 100 can adjust the content being provided to the user based on the user selections.
[0061] At step 604, in response to detecting machine identifiable objects 230 (e.g., QR Codes), augmented reality content is rendered on the user device 124 at the locations of the machine identifiable objects 230. In some embodiments, the augmented reality content displayed to the user via the user device 124 can provide guidance and real-time feedback on how to properly set up the experiment protocol within the given laboratory environment 102. The user device 124 can utilize the unique machine identifiable objects 230 as anchors, projecting images, text, video, or other visual cues at those locations to facilitate the proper setup of the target object or location including the machine identifiable objects 230, as discussed in greater detail with respect to FIGS. 7A-7J.
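One possible way to recognize such machine identifiable objects in a captured frame is shown below as a hedged sketch using OpenCV's QR code detector; the described system is not limited to this library or approach, and the function name is an assumption for illustration only.

    import cv2  # OpenCV is used here only as one example of a QR code detector

    def detect_markers(frame):
        """Return decoded QR payloads and their pixel corners from a camera frame."""
        detector = cv2.QRCodeDetector()
        ok, decoded_texts, points, _ = detector.detectAndDecodeMulti(frame)
        if not ok:
            return []
        return [
            {"marker_id": text, "corners": corners.tolist()}
            for text, corners in zip(decoded_texts, points)
            if text  # skip codes that were located but could not be decoded
        ]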
[0062] In some embodiments, the user device 124 can also be provided with feedback on improperly placed objects and can warn the user of potential issues which could result in instrument crashes, safety hazards, and/or incorrect data generation.
[0063] At step 606, after setup of the laboratory environment 102 is complete and the automated experiment has been started, the user device 124 can provide real-time updates to the user through a continuous connection between the laboratory environment 102, the measurement module 116 communicatively attached to the computing architecture 110, and the user device 124 itself. These real-time updates can include augmented reality overlays that alert the user of system failures or errors, scheduled user interventions, instrument data (e.g., temperature, volume, sample ID), or other visual representations which would facilitate the execution of the protocol.
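Purely as a hypothetical sketch (the tolerance, key format, and example values are illustrative assumptions rather than details of the described system), generating such alerts from live readings and protocol target values could be organized as follows:

    def build_alerts(readings, protocol_targets, tolerance=0.10):
        """Flag readings that drift more than a tolerance from the protocol's target values."""
        alerts = []
        for key, target in protocol_targets.items():
            value = readings.get(key)
            if value is not None and abs(value - target) > abs(target) * tolerance:
                instrument_id, measurement = key
                alerts.append(f"Check {instrument_id}: {measurement} at {value}, expected about {target}")
        return alerts

    readings = {("incubator-01", "temperature C"): 41.0}
    targets = {("incubator-01", "temperature C"): 37.0}
    print(build_alerts(readings, targets))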
[0064] In some embodiments, the computing architecture 110 can operate in conjunction with the user device 124 to display mixed reality content 240 to assist the user in setting up and configuring an automated laboratory environment (e.g., via the configuration module 120) using process 200 and/or to provide additional insight during operation of the automated laboratory environment (e.g., via the operation module 122) using process 600.
[0065] In operation, as discussed herein, the user can utilize the user device 124 to capture a real-time image/video of a laboratory environment 102 and receive mixed reality content 240 to be displayed within said real-time image/video to convey particular information to the user. In some embodiments, the configuration module 120 can include preprogrammed instructions and configurations for one or more automated laboratory setups and configurations. Initially, based on the real-time view of the automated laboratory 102, the configuration module 120 can provide users with step-by-step installation and configuration instructions through the display of mixed reality content within the real-world view on the user device 124. For example, based on the received machine identifiable objects 230, the configuration module 120 can identify augmented reality installation content to be transmitted to the user device 124 for display within the real-world view on the device 124 at the locations of the machine identifiable objects 230. In some embodiments, the configuration module 120 can provide a mixed reality view of locations and installation instructions for each of the components 104, 106, 108 within an empty lab space. For example, a user can provide a real-time view of an empty lab space (e.g., room, tray, container, etc.) and the configuration module 120 can determine where the appropriate components 104, 106, 108 should be installed and provide that information to the user via mixed reality content rendered within the real-time view of the lab space.
[0066] Referring to FIG. 7A, an exemplary image of an automated laboratory 102 environment is depicted. The example automated laboratory 102 in FIG. 7A includes a combination of automated components 104, labware components 108 (e.g., plates with test tube trays), and locations with machine identifiable objects 230 located thereon. In some embodiments, the machine identifiable objects 230 can be viewed by user devices 124 and can be used to identify specific locations, components, etc. such that they can be recognized and utilized as inputs for the system 100. For example, the machine identifiable objects 230 can be associated with locations in which particular labware components should be placed. The machine identifiable objects 230 can also identify components that the system 100 will monitor and track during operation, or can include spaces in which no objects or obstructions should be present, such that if the system 100 cannot locate a particular machine identifiable object 230, it will notify the user that an error in the setup is present.
[0067] In some embodiments, augmented reality objects will be rendered over any recognized machine identifiable objects 230 to assist a user in the setup and operation of the automated laboratory 102. To determine the mixed reality content 240 to be conveyed, the user device 124 can first transmit at least a portion of the captured real-time image/video to the computing architecture 110 for analysis. Referring to FIGS. 7A-7C, in some embodiments, the captured image/video can include unique machine identifiable objects 230 (e.g., barcodes) that can be associated (by the computing architecture 110) with mixed reality objects (e.g., augmented reality, virtual reality, etc.). The user device 124, or software thereon, can be configured to identify the unique machine identifiable objects 230 and transmit them to the computing architecture 110 for analysis and/or retrieval of augmented reality content 240. For example, as depicted in FIG. 7B, an exemplary laboratory environment 102, or subset thereof, can include two-dimensional Quick Response Codes (QR Codes) associated with one or more items within the laboratory environment 102. As would be appreciated by one skilled in the art, the present invention can also use any combination of image recognition in addition to or in place of the machine identifiable objects 230 without departing from the scope of the present invention.
[0068] In some embodiments, in response to receiving data from the user device 124, the computing architecture 110 can analyze any machine identifiable objects 230 (or image recognition results) and determine whether those objects are associated with mixed reality content 240 stored in the storage system 114. The association can be determined by searching a lookup table or other database to see if mixed reality content 240 is associated with the received data. In instances in which there is mixed reality content 240 associated with data received from the user device 124, the computing architecture 110 can determine what mixed reality content 240 is to be displayed and how it is to be displayed. Thereafter, the computing architecture 110 can be configured to package the mixed reality content 240 and instructions for the user device 124 for displaying the mixed reality content 240 in the real-time image/video and transmit the package to the user device 124 for rendering.
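As a minimal sketch under assumed names and values (the lookup table contents, opacity setting, and packaging format are hypothetical, not details of the described system), the lookup and packaging of mixed reality content with its display instructions could take the following shape:

    # Hypothetical lookup table associating marker data with stored mixed reality content 240.
    CONTENT_TABLE = {
        "QR-DECK-POS-1": {"model": "imagery/96_well_plate.glb",
                          "text": "Place a 96 well plate at this location"},
    }

    def package_content(marker_id, anchor_corners):
        """Bundle mixed reality content with display instructions for the user device."""
        content = CONTENT_TABLE.get(marker_id)
        if content is None:
            return None  # no mixed reality content is associated with this marker
        return {
            "content": content,
            "display": {
                "anchor_corners": anchor_corners,  # where in the frame to pin the overlay
                "opacity": 0.7,                    # translucent so the marker remains visible
            },
        }

    print(package_content("QR-DECK-POS-1", [[0, 0], [10, 0], [10, 10], [0, 10]]))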
[0069] Referring to FIG. 7C, an illustrative example of mixed reality content 240 that can be rendered in association with a machine identifiable object 230 from FIG. 7B is depicted. In this example, a three-dimensional rendering of a well plate is rendered, indicating to the user the specific labware that should be placed at the particular location. Similarly, referring to FIG. 7D, upon recognition of the second machine identifiable object 230 from FIG. 7B, a second three-dimensional rendering of a well plate is rendered at the second location. As would be appreciated by one skilled in the art, based on the particular machine identifiable object 230 recognized, the computing architecture 110 can determine that a combination of image, video, text, and three-dimensional objects should be displayed at the respective locations of the machine identifiable objects 230 within the real-time image/video for the laboratory environment 210. In some embodiments, as depicted in FIGS. 7C and 7D, the mixed reality content 240 can be rendered at the locations of the machine identifiable objects 230, effectively covering the machine identifiable objects 230 from the view of the user. Alternatively, the objects can be rendered with translucency so as to not completely cover the machine identifiable objects 230 from the view of the user (e.g., to act as a reference point), or a combination thereof.
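As a non-limiting editorial illustration of the translucency option (not part of the original disclosure), a two-dimensional overlay could be alpha-blended onto the captured frame at the marker's location so the marker remains partially visible; the region coordinates are assumed to come from the marker detection step.

```python
# Minimal sketch: blend an overlay image into the frame at (x, y) with partial
# transparency using OpenCV's addWeighted; alpha=1.0 would fully cover the marker.
import cv2

def blend_overlay(frame, overlay_img, x, y, alpha=0.6):
    """Composite overlay_img onto frame at (x, y) with the given opacity."""
    h, w = overlay_img.shape[:2]
    roi = frame[y:y + h, x:x + w]                        # region under the overlay
    blended = cv2.addWeighted(overlay_img, alpha, roi, 1.0 - alpha, 0)
    frame[y:y + h, x:x + w] = blended
    return frame
```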
[0070] Referring to FIG. 7E, the user can place the labware component 108 matching the mixed reality content 240. In this example, the three-dimensional well plate rendered in FIGS. 7C and 7D is replaced by the user with the real-world well plate matching the three-dimensional well plate at the same location as the three-dimensional well plate. Referring to FIG. 7F, a view of the laboratory environment 102 is depicted in which a labware component 108 has been placed, an augmented reality object 240 has been rendered indicating labware that needs to be placed, and a machine identifiable object 230 that has not yet been recognized/associated with an augmented reality object 240 is provided. FIG. 7F thus represents how the present invention progresses from a machine identifiable object 230, to rendering an augmented reality object 240, and concluding with a user having placed the appropriate labware component 108.
[0071] Referring to FIG. 7G, another example augmented reality object 240 can be rendered in place of a machine identifiable object 230. In some embodiments, the augmented reality object 240 can include placement of a labware component that requires additional steps from the user before setup is complete. The additional steps can be presented to the user in multiple formats, including but not limited to, text, images, video, audio, or a combination thereof. As shown in FIG. 7G, a three-dimensional rendering of a reservoir is provided along with text instructing the user to add 50ml of TE Buffer. In some embodiments, multiple augmented reality objects 240 can be rendered in place of a single machine identifiable object 230. Referring to FIG. 7H, a three-dimensional rendering of a reservoir is provided along with text instructing the user to add 50ul of TE Buffer, along with a video that can play to show the user how to fill the reservoir.
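As a non-limiting editorial illustration of rendering several mixed reality objects at a single marker (not part of the original disclosure), the content bundle for a marker could simply be a list of heterogeneous items; the marker identifier and asset paths are hypothetical.

```python
# Minimal sketch: one machine identifiable object maps to several mixed reality
# items (a 3D model, a text prompt, and an instructional video), as in FIG. 7H.
RESERVOIR_MARKER_CONTENT = [
    {"type": "model", "asset_uri": "models/reservoir.glb", "opacity": 0.7},
    {"type": "text",  "body": "Add TE Buffer to the reservoir"},
    {"type": "video", "asset_uri": "videos/fill_reservoir.mp4",
     "autoplay": False},   # the user taps the overlay to watch the demonstration
]

def content_for_marker(marker_id):
    """Return every content item to render at one marker (possibly several)."""
    bundles = {"DECK-POS-7": RESERVOIR_MARKER_CONTENT}
    return bundles.get(marker_id, [])
```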
[0072] Referring to FIGS. 7I and 7J, a user can place the labware component 108 provided in FIGS. 7G and 7H and fill the labware component 108 with the appropriate material. In this example, a reservoir is placed by the user, as shown in FIG. 7I, and 50ml of TE Buffer is added to the bin, as shown in FIG. 7J. As each instruction is followed by the user and components 104, 106, 108 are properly set up, the configuration module 120 can continue to provide the augmented reality content (e.g., video, location, etc.) as machine identifiable objects 230 are recognized at the different locations, and the user can repeat the above steps until the automated laboratory is completely set up and/or configured for the experiment protocol.
[0073] In some embodiments, based on the data received from the user device 124, the configuration module 120 can analyze the received machine identifiable objects 230 along with other elements within the automated laboratory environment and determine whether components 104, 106, 108 are installed at the appropriate locations and whether the appropriate steps were taken in setting up those objects (i.e., the proper amount of fluid was added). In the event that one or more components 104, 106, 108 are not installed at the appropriate locations, the configuration module 120 can provide mixed reality content 240 to the user in the form of warnings or alerts to display how said components 104, 106, 108 should be installed. In some embodiments, the configuration module 120 can also identify other objects that do not belong within the automated laboratory space and issue a warning to the user that those objects may interfere with operation of the laboratory. For example, the configuration module 120 can use image recognition technology to identify any components that are not recognized and/or are located improperly within the automated environment.
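By way of non-limiting editorial illustration, the placement check described above could compare recognized objects against an expected deck layout and emit warnings to be rendered as mixed reality alerts; the expected-layout table and recognized-object records below are hypothetical stand-ins for marker and image recognition results.

```python
# Minimal sketch: verify that each expected deck position holds the expected
# component and flag any unrecognized objects that may interfere with operation.
EXPECTED_LAYOUT = {
    "DECK-POS-1": "96-well plate",
    "DECK-POS-2": "tip box",
    "DECK-POS-3": "reagent reservoir",
}

def check_setup(recognized):
    """recognized maps deck position -> detected component name."""
    warnings = []
    for position, expected in EXPECTED_LAYOUT.items():
        placed = recognized.get(position)
        if placed is None:
            warnings.append(f"{expected} is missing from {position}")
        elif placed != expected:
            warnings.append(f"{position} holds a {placed}; expected a {expected}")
    for position, placed in recognized.items():
        if position not in EXPECTED_LAYOUT:
            warnings.append(f"Unrecognized object '{placed}' at {position} "
                            "may interfere with operation")
    return warnings
```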
[0074] With respect to the operation module 122, it can provide users with additional information during operation of the automated laboratory. The operation module 122 can be configured to monitor objects within the automated laboratory environment to ensure that the automated laboratory 102 is operating as intended and provide real-time feedback to the user as to what is occurring at each stage of the operation. In some embodiments, the operation module 122 can aggregate data related to the real-world display on the user device 124 and data from the measurement module 116 and determine what mixed reality content to display on the user device 124. In some embodiments, the user can customize what data is displayed. The aggregated data can include a combination of received machine identifiable objects 230 from the user device 124, image recognition data, and received measurement data (from the measurement module 116). Real-time updates can include timers, sample volumes, sample IDs, transfer volumes, and intervention instructions.
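As a non-limiting editorial illustration (not part of the original disclosure), the aggregation could combine static sample information with live instrument readings into a single overlay payload anchored at one marker; the field names, registry, and measurement source are assumptions.

```python
# Minimal sketch: merge sample registry data and live measurements into one
# mixed reality overlay payload anchored at a given marker.
def build_operation_overlay(marker_id, sample_registry, measurements):
    """Combine static sample information with live readings for one marker."""
    sample = sample_registry.get(marker_id, {})
    temperature = measurements.get(marker_id)
    return {
        "anchor": marker_id,
        "lines": [
            f"Sample: {sample.get('sample_id', 'unknown')}",
            f"Contents: {sample.get('contents', 'unknown')}",
            f"Volume: {sample.get('volume_ul', 0)} ul",
            f"Temperature: {temperature:.1f} C" if temperature is not None
            else "Temperature: n/a",
        ],
    }

# Example usage with made-up registry and instrument readings.
registry = {"TUBE-A1": {"sample_id": "S-0042", "contents": "gDNA", "volume_ul": 50}}
readings = {"TUBE-A1": 4.2}
overlay = build_operation_overlay("TUBE-A1", registry, readings)
```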
[0075] In some embodiments, the operation module 122 can associate different mixed reality content with the same identifiable objects used by the configuration module 120. For example, with respect to the labware components 108, the configuration module 120 can display augmented videos showing where to set up the labware, while the operation module 122 can display augmented images or text presenting information related to the labware items themselves. For example, the operation module 122 can associate mixed reality identification information with what is in the test tubes and display that information to the user in real-time. Additionally, the operation module 122 can display measured information for those test tubes, for example, temperature provided by the instrument components 106. This can assist a user to both know what is happening within the automated laboratory at a particular point in time and gain additional insight that a user would not otherwise be able to glean by watching operation of the laboratory without the mixed reality information.
[0076] In some embodiments, the operation module 122 can also track and store historic data about the operations being performed within the automated laboratory 102 via data received from the measurement module 116. For example, the operation module 122 can track temperature data throughout an experiment and display the results to a user in an augmented environment. In some embodiments, the operation module 122 can provide feedback to the user based on an analysis of the historic data. For example, the operation module 122 can identify when an experiment is not running normally and can provide a mixed reality warning that something may not be operating as intended, potentially avoiding running the experiment in its entirety when the setup or operation is incorrect.
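As a non-limiting editorial sketch of one possible analysis (the bounds and rolling-window check are assumptions chosen for demonstration, not a prescribed method), the historic tracking and warning behavior could look like the following:

```python
# Minimal sketch: retain recent temperature readings and return a warning
# string when a reading drifts outside an expected band.
from collections import deque

class TemperatureMonitor:
    def __init__(self, low, high, window=20):
        self.low, self.high = low, high
        self.history = deque(maxlen=window)   # retained for later display/review

    def record(self, reading):
        """Store the reading; return a warning message if it is out of range."""
        self.history.append(reading)
        if not (self.low <= reading <= self.high):
            return (f"Reading {reading:.1f} C is outside the expected "
                    f"{self.low:.1f}-{self.high:.1f} C range; the experiment "
                    "may not be operating as intended")
        return None

monitor = TemperatureMonitor(low=2.0, high=8.0)
warning = monitor.record(12.3)   # would be rendered as a mixed reality alert
```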
[0077] In some example embodiments, the configuration module 120 and the operation module 122 can be configured to project different mixed reality content for different purposes and circumstances. For example, the configuration module 120 can project intended labware components 108 onto a deck to facilitate proper placement, integrate video segments to provide granular instructions for setting up the laboratory environment 102 as necessary, provide mixed reality content demonstrating how to place labware components 108, highlight sample wells that need controls or other attention, provide alerts if labware components 108 are placed in incorrect locations, project data above samples to identify information about the samples (e.g., concentrations, sample IDs, volume, etc.), and provide holograms highlighting the source of errors within the operation of the automated components 104 or instrument components 106 (e.g., a warning sign for a crash, a missing tip box, etc.). As would be appreciated by one skilled in the art, any combination of mixed reality content can be provided to a user that may be useful in a particular laboratory environment.
[0078] The use of mixed reality in laboratory automation can better connect the scientist to the task that the automation is performing by bringing the digital to the ‘real world’. In particular, incorporation of mixed reality into the real-time view of the laboratory environment 102 can help bridge the gap between digital protocols and actual execution. Scientists will not only be able to better set up their experiments (e.g., via content provided by the configuration module 120), but they will be able to track in real-time what is happening during execution of those experiments (e.g., via content provided by the operation module). For example, if a user can ‘see’ where samples should be placed within the real-time environment, they can ensure proper placement. This can lead to more consistency, safer operation, and ultimately better data. Similarly, if a user can ‘see’ what the automated system is doing, then the user can better understand how the experiment is going and how the resulting data is generated. This can lead to increased traceability, more consistency, safer operation, and ultimately better data.
EXAMPLES
[0079] The examples below are provided for illustrative purposes only and are not intended to limit the present invention to any particular implementation.
Example 1: Sample/Reagent Aliquot
[0080] The most basic, but widely used, application for automated liquid handling is sample or reagent dispensing or aliquoting. Scientists often need to dispense a valuable reagent or sample from a single tube or trough into numerous tubes or plates. This can be a labor-intensive task that is mundane and error-prone. To expedite this task, a scientist can utilize an automated liquid handler. In one example, a user can be instructed by an augmented reality graphic to fill a large reagent trough with a set volume of water. The 96-well liquid dispensing head will then aspirate the water and dispense it into the provided 96-well plate. An automated liquid handler is able to perform this task in a single transfer, whereas a scientist doing this by hand would need multiple transfers. In addition, the liquid handler digitally tracks each dispense, providing traceability and error-proofing that is not present when the task is performed by hand. However, the robotic liquid handler still relies on the user to set up the automation deck correctly. If the user puts the trough or plate in the wrong location, the instrument does not know and will not perform the transfer correctly. In addition, if not enough liquid is added to the trough, then the liquid handler will not aspirate the correct amount and the required volumes in each well will not be met. As provided here, using augmented reality to set up the equipment correctly will lead to better results.
Example 2: PCR Set-up
[0081] Polymerase chain reaction (PCR) is a common molecular technique used to amplify segments of DNA to allow for many downstream applications, including DNA analysis or sequencing. Setting up a PCR reaction involves a series of liquid handling steps that often need to be performed in a particular sequence, using a set amount of reagent and sample, in order to get accurate and high-quality results. PCR reagents can be expensive, so performing large volume reactions can be cost-prohibitive. To overcome this challenge, laboratory automation platforms are utilized to miniaturize reaction volumes. Typically, scientists perform PCR reactions in single tubes or 96-well plates because they are easy to use and allow for reliable sample/reagent transfers. On a robotic system, a scientist can utilize a plate with a much higher well density (384 wells/plate or 1536 wells/plate). This allows for miniaturization of the reaction, resulting in more data generation at a lower cost per reaction.
[0082] Since PCR set-up requires multiple reagents and samples, it can be an error-prone process. It is possible that a reagent or sample will be missed, which will result in a failed reaction. These failed reactions delay results and increase research costs. Automated liquid handling can help limit these mistakes, but it is still reliant on the user properly setting up the automation for the reaction. Since this is a multi-reagent reaction, it is vital that the user places the correct reagent or sample in the desired location. If this is not performed correctly, the reaction will either fail or generate incorrect data. Augmented reality can assist the user through targeted prompts, embedded videos, and visual signals to ensure the correct reagents and samples are placed in the desired location. The end result will be a lower failure rate and higher quality data generation.

[0083] Any suitable computing device can be used to implement the computing devices 110, 122 and methods/functionality described herein and be converted to a specific system for performing the operations and features described herein through modification of hardware, software, and firmware, in a manner significantly more than mere execution of software on a generic computing device, as would be appreciated by those of skill in the art. One illustrative example of such a computing device 800 is depicted in FIG. 8. The computing device 800 is merely an illustrative example of a suitable computing environment and in no way limits the scope of the present invention. A “computing device,” as represented by FIG. 8, can include a “workstation,” a “server,” a “laptop,” a “desktop,” a “hand-held device,” a “mobile device,” a “tablet computer,” or other computing devices, as would be understood by those of skill in the art. Given that the computing device 800 is depicted for illustrative purposes, embodiments of the present invention may utilize any number of computing devices 800 in any number of different ways to implement a single embodiment of the present invention. Accordingly, embodiments of the present invention are not limited to a single computing device 800, as would be appreciated by one with skill in the art, nor are they limited to a single type of implementation or configuration of the example computing device 800.
[0084] The computing device 800 can include a bus 810 that can be coupled to one or more of the following illustrative components, directly or indirectly: a memory 812, one or more processors 814, one or more presentation components 816, input/output ports 818, input/output components 820, and a power supply 824. One of skill in the art will appreciate that the bus 810 can include one or more busses, such as an address bus, a data bus, or any combination thereof. One of skill in the art additionally will appreciate that, depending on the intended applications and uses of a particular embodiment, multiple of these components can be implemented by a single device. Similarly, in some instances, a single component can be implemented by multiple devices. As such, FIG. 8 is merely illustrative of an exemplary computing device that can be used to implement one or more embodiments of the present invention, and in no way limits the invention.
[0085] The computing device 800 can include or interact with a variety of computer-readable media. For example, computer-readable media can include Random Access Memory (RAM); Read Only Memory (ROM); Electronically Erasable Programmable Read Only Memory (EEPROM); flash memory or other memory technologies; CDROM, digital versatile disks (DVD) or other optical or holographic media; magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices that can be used to encode information and can be accessed by the computing device 800.
[0086] The memory 812 can include computer-storage media in the form of volatile and/or nonvolatile memory. The memory 812 may be removable, non-removable, or any combination thereof. Exemplary hardware devices are devices such as hard drives, solid-state memory, optical-disc drives, and the like. The computing device 800 can include one or more processors that read data from components such as the memory 812, the various I/O components 816, etc. Presentation component(s) 816 present data indications to a user or other device. Exemplary presentation components include a display device, speaker, printing component, vibrating component, etc.
[0087] The I/O ports 818 can enable the computing device 800 to be logically coupled to other devices, such as I/O components 820. Some of the I/O components 820 can be built into the computing device 800. Examples of such I/O components 820 include a microphone, joystick, recording device, game pad, satellite dish, scanner, printer, wireless device, networking device, and the like.
[0088] As utilized herein, the terms “comprises” and “comprising” are intended to be construed as being inclusive, not exclusive. As utilized herein, the terms “exemplary”, “example”, and “illustrative” are intended to mean “serving as an example, instance, or illustration” and should not be construed as indicating, or not indicating, a preferred or advantageous configuration relative to other configurations. As utilized herein, the terms “about”, “generally”, and “approximately” are intended to cover variations that may exist in the upper and lower limits of the ranges of subjective or objective values, such as variations in properties, parameters, sizes, and dimensions. In one non-limiting example, the terms “about”, “generally”, and “approximately” mean at, or plus 10 percent or less, or minus 10 percent or less. In one non-limiting example, the terms “about”, “generally”, and “approximately” mean sufficiently close to be deemed by one of skill in the art in the relevant field to be included. As utilized herein, the term “substantially” refers to the complete or nearly complete extent or degree of an action, characteristic, property, state, structure, item, or result, as would be appreciated by one of skill in the art. For example, an object that is “substantially” circular would mean that the object is either completely a circle to mathematically determinable limits, or nearly a circle as would be recognized or understood by one of skill in the art. The exact allowable degree of deviation from absolute completeness may in some instances depend on the specific context. However, in general, the nearness of completion will be such as to have the same overall result as if absolute and total completion were achieved or obtained. The use of “substantially” is equally applicable when utilized in a negative connotation to refer to the complete or near complete lack of an action, characteristic, property, state, structure, item, or result, as would be appreciated by one of skill in the art.
[0089] Numerous modifications and alternative embodiments of the present invention will be apparent to those skilled in the art in view of the foregoing description. Accordingly, this description is to be construed as illustrative only and is for the purpose of teaching those skilled in the art the best mode for carrying out the present invention. Details of the structure may vary substantially without departing from the spirit of the present invention, and exclusive use of all modifications that come within the scope of the appended claims is reserved. Within this specification, embodiments have been described in a way which enables a clear and concise specification to be written, but it is intended and will be appreciated that embodiments may be variously combined or separated without departing from the invention. It is intended that the present invention be limited only to the extent required by the appended claims and the applicable rules of law.
[0090] It is also to be understood that the following claims are to cover all generic and specific features of the invention described herein, and all statements of the scope of the invention which, as a matter of language, might be said to fall therebetween.

Claims

What is claimed is:
1. A method comprising:
receiving, by at least one processor, an image from an image feed of an environment;
detecting, by the at least one processor, at least one automated component marker in the image of the environment;
matching, by the at least one processor, the at least one automated component marker with an associated at least one environment component, at least one environment component action, or both;
selecting, by the at least one processor, at least one instruction associated with the at least one environment component, the at least one environment component action, or both;
rendering, by the at least one processor, an animation of the at least one instruction; and
causing to display, by the at least one processor, an overlay of the animation to appear, via an augmented reality device associated with at least one user, in a location in the environment of the at least one environment component, the at least one environment component action, or both in the environment.
2. The method as recited in claim 1, wherein the environment comprises a laboratory environment.
3. The method as recited in claim 1, wherein the at least one instruction is associated with establishing a laboratory component configuration.
4. The method as recited in claim 1, further comprising: determining, by the at least one processor, a display location on a display of the augmented reality device that intersects a line-of-sight to the location in the environment; and generating, by the at least one processor, a depiction of the animation in the display location to simulate the environment of the at least one environment component, the at least one environment component action, or both appearing in the location in the environment.
5. The method as recited in claim 1, wherein the image feed is received from an image capture device of the augmented reality device in real-time.
6. The method as recited in claim 1, further comprising determining, by the at least one processor, a sequence of automated component markers of the at least one automated component marker associated with a sequence of steps in the at least one instruction.
7. The method as recited in claim 1, further comprising determining, by the at least one processor, an environment state associated with objects within the image of the environment.
8. The method as recited in claim 7, further comprising determining, by the at least one processor, a first step of the at least one instruction based at least in part on the environment state.
9. The method as recited in claim 1, further comprising: identifying, by the at least one processor, changes to the environment represented in the image feed based at least in part on differences between images in the image feed; determining, by the at least one processor, user actions in the environment based at least in part on the changes to the environment; and logging, by the at least one processor, the user actions in a user action log.
10. A method comprising:
receiving, by at least one processor, an image from an image feed of a laboratory environment;
detecting, by the at least one processor, at least one automated component marker in the image of the laboratory environment associated with a location for laboratory component configuration;
matching, by the at least one processor, the at least one automated component marker with an associated at least one laboratory component, at least one laboratory component action, or both for the laboratory component configuration;
selecting, by the at least one processor, at least one instruction associated with the at least one laboratory component, the at least one laboratory component action, or both;
rendering, by the at least one processor, an animation of the at least one instruction associated with establishing the laboratory component configuration; and
causing to display, by the at least one processor, an overlay of the animation to appear, via an augmented reality device associated with at least one user, in the location for the laboratory component configuration to visually instruct the at least one user according to the at least one instruction for the laboratory component configuration.
11. The method as recited in claim 10, further comprising: determining, by the at least one processor, a display location on a display of the augmented reality device that intersects a line-of-sight to the location in the laboratory environment; and generating, by the at least one processor, a depiction of the animation in the display location to simulate the laboratory environment of the at least one laboratory component, the at least one laboratory component action, or both appearing in the location in the laboratory environment.
12. The method as recited in claim 10, wherein the image feed is received from an image capture device of the augmented reality device in real-time.
13. The method as recited in claim 10, further comprising determining, by the at least one processor, a sequence of automated component markers of the at least one automated component marker associated with a sequence of steps in the at least one instruction.
14. The method as recited in claim 10, further comprising determining, by the at least one processor, a laboratory state associated with objects within the image of the laboratory environment.
15. The method as recited in claim 14, further comprising determining, by the at least one processor, a first step of the at least one instruction based at least in part on the laboratory state.
16. The method as recited in claim 10, further comprising: identifying, by the at least one processor, changes to the laboratory environment represented in the image feed based at least in part on differences between images in the image feed; determining, by the at least one processor, user actions in the laboratory environment based at least in part on the changes to the laboratory environment; and logging, by the at least one processor, the user actions in a user action log.
17. A system comprising: at least one processor in communication with a non-transitory computer readable medium having instructions stored thereon that cause the at least one processor to perform steps to:
receive an image from an image feed of an environment;
detect at least one automated component marker in the image of the environment;
match the at least one automated component marker with an associated at least one environment component, at least one environment component action, or both;
select at least one instruction associated with the at least one environment component, the at least one environment component action, or both;
render an animation of the at least one instruction; and
cause to display an overlay of the animation to appear, via an augmented reality device associated with at least one user, in a location in the environment of the at least one environment component, the at least one environment component action, or both in the environment.
18. The system as recited in claim 17, further comprising an augmented reality device in communication with the at least one processor and configured to: receive the animation; determine a display location on a display of the augmented reality device that intersects a line-of-sight to the location in the environment; and generate a depiction of the animation in the display location to simulate the environment of the at least one environment component, the at least one environment component action, or both appearing in the location in the environment.
19. The system as recited in claim 17, further comprising an image capture device in communication with the at least one processor and configured to capture a real-time image feed of the environment.
20. The system as recited in claim 17, wherein the at least one processor is further configured to execute instructions to perform steps to: identify changes to the environment represented in the image feed based at least in part on differences between images in the image feed; determine user actions in the environment based at least in part on the changes to the environment; and log the user actions in a user action log.
PCT/US2020/016360 2019-02-04 2020-02-03 Systems and methods for implemented mixed reality in laboratory automation WO2020163218A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US17/426,602 US12125145B2 (en) 2019-02-04 2020-02-03 Systems and methods for implemented mixed reality in laboratory automation
EP20751960.4A EP3921803A4 (en) 2019-02-04 2020-02-03 Systems and methods for implemented mixed reality in laboratory automation

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962801045P 2019-02-04 2019-02-04
US62/801,045 2019-02-04

Publications (1)

Publication Number Publication Date
WO2020163218A1 true WO2020163218A1 (en) 2020-08-13

Family

ID=71948301

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2020/016360 WO2020163218A1 (en) 2019-02-04 2020-02-03 Systems and methods for implemented mixed reality in laboratory automation

Country Status (3)

Country Link
US (1) US12125145B2 (en)
EP (1) EP3921803A4 (en)
WO (1) WO2020163218A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230146648A1 (en) * 2021-11-10 2023-05-11 IntelliMedia Networks, Inc. Immersive learning application framework for video with web content overlay control

Family Cites Families (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8542906B1 (en) * 2008-05-21 2013-09-24 Sprint Communications Company L.P. Augmented reality image offset and overlay
FR2949587A1 (en) 2009-09-03 2011-03-04 Nicolas Berron Making an assembly of parts using a computer, comprises establishing digital model in three dimensions of each part, forming a virtual template of each part, and setting digital model with respect to reference of the workstation
US8830267B2 (en) * 2009-11-16 2014-09-09 Alliance For Sustainable Energy, Llc Augmented reality building operations tool
SG188076A1 (en) * 2010-03-30 2013-03-28 Ns Solutions Corp Information processing apparatus, information processing method, and program
US8832233B1 (en) * 2011-07-20 2014-09-09 Google Inc. Experience sharing for conveying communication status
US9235819B2 (en) * 2011-11-04 2016-01-12 Canon Kabushiki Kaisha Printing system, image forming apparatus, and method
KR101354133B1 (en) * 2013-12-12 2014-02-05 한라아이엠에스 주식회사 Remote place management type ballast water treatment system by augmented reality
JP6244954B2 (en) * 2014-02-06 2017-12-13 富士通株式会社 Terminal apparatus, information processing apparatus, display control method, and display control program
EP3132390A1 (en) * 2014-04-16 2017-02-22 Exxonmobil Upstream Research Company Methods and systems for providing procedures in real-time
US20150325047A1 (en) * 2014-05-06 2015-11-12 Honeywell International Inc. Apparatus and method for providing augmented reality for maintenance applications
JP6326996B2 (en) * 2014-06-13 2018-05-23 富士通株式会社 Terminal device, information processing system, and display control program
JP2016133294A (en) * 2015-01-22 2016-07-25 ジョンソンコントロールズ ヒタチ エア コンディショニング テクノロジー(ホンコン)リミテッド Air conditioner repairing and maintenance system and its method
US10142596B2 (en) * 2015-02-27 2018-11-27 The United States Of America, As Represented By The Secretary Of The Navy Method and apparatus of secured interactive remote maintenance assist
US10360729B2 (en) * 2015-04-06 2019-07-23 Scope Technologies Us Inc. Methods and apparatus for augmented reality applications
JP6572600B2 (en) * 2015-04-09 2019-09-11 セイコーエプソン株式会社 Information processing apparatus, information processing apparatus control method, and computer program
EP3127058A1 (en) * 2015-04-20 2017-02-08 NSF International Computer-implemented methods for remotely interacting with performance of food quality and workplace safety tasks using a head mounted display
US9972133B2 (en) * 2015-04-24 2018-05-15 Jpw Industries Inc. Wearable display for use with tool
JP2019537786A (en) * 2016-10-21 2019-12-26 トルンプフ ヴェルクツォイクマシーネン ゲゼルシャフト ミット ベシュレンクテル ハフツング ウント コンパニー コマンディートゲゼルシャフトTrumpf Werkzeugmaschinen GmbH + Co. KG Control Based on Internal Location of Manufacturing Process in Metal Processing Industry
US10735691B2 (en) * 2016-11-08 2020-08-04 Rockwell Automation Technologies, Inc. Virtual reality and augmented reality for industrial automation
US20180211447A1 (en) * 2017-01-24 2018-07-26 Lonza Limited Methods and Systems for Using a Virtual or Augmented Reality Display to Perform Industrial Maintenance
JP6604522B2 (en) * 2017-01-30 2019-11-13 京セラドキュメントソリューションズ株式会社 Image forming system, image forming apparatus, and guide program
US10573081B2 (en) * 2017-08-03 2020-02-25 Taqtile, Inc. Authoring virtual and augmented reality environments via an XR collaboration application
US10771350B2 (en) * 2017-09-26 2020-09-08 Siemens Aktiengesellschaft Method and apparatus for changeable configuration of objects using a mixed reality approach with augmented reality
WO2019226688A1 (en) * 2018-05-22 2019-11-28 Agilent Technologies, Inc. Method and system for implementing augmented reality (ar)-based assistance within work environment
US10768605B2 (en) * 2018-07-23 2020-09-08 Accenture Global Solutions Limited Augmented reality (AR) based fault detection and maintenance
US10943401B2 (en) * 2019-02-01 2021-03-09 International Business Machines Corporation Active visual recognition in mobile augmented reality
JP2020197835A (en) * 2019-05-31 2020-12-10 ファナック株式会社 Data collection and setting device for industrial machine

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140118339A1 (en) * 2012-10-31 2014-05-01 The Boeing Company Automated frame of reference calibration for augmented reality
US20140176603A1 (en) * 2012-12-20 2014-06-26 Sri International Method and apparatus for mentoring via an augmented reality assistant
US20170213316A1 (en) * 2016-01-21 2017-07-27 International Business Machines Corporation Augmented reality overlays based on an optically zoomed input
US20180082480A1 (en) * 2016-09-16 2018-03-22 John R. White Augmented reality surgical technique guidance
US20180197043A1 (en) * 2017-01-11 2018-07-12 Alibaba Group Holding Limited Image recognition based on augmented reality

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3921803A4 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210304886A1 (en) * 2020-03-26 2021-09-30 Roche Diagnostics Operations, Inc. Method and devices for tracking laboratory resources
US12119109B2 (en) * 2020-03-26 2024-10-15 Roche Diagnostics Operations, Inc. Method and devices for tracking laboratory resources
CN117160547A (en) * 2023-10-25 2023-12-05 南京浦蓝大气环境研究院有限公司 Atmospheric environment simulation device adaptable to various environments
CN117160547B (en) * 2023-10-25 2024-01-02 南京浦蓝大气环境研究院有限公司 Atmospheric environment simulation device adaptable to various environments

Also Published As

Publication number Publication date
US12125145B2 (en) 2024-10-22
EP3921803A4 (en) 2022-11-02
EP3921803A1 (en) 2021-12-15
US20220139046A1 (en) 2022-05-05

Similar Documents

Publication Publication Date Title
US11847751B2 (en) Method and system for implementing augmented reality (AR)-based assistance within work environment
US12125145B2 (en) Systems and methods for implemented mixed reality in laboratory automation
EP3841442B1 (en) Monitoring system and method for biopharmaceutical products
CN109545366B (en) Method and system for summarizing diagnostic analyzer related information
EP3140761B1 (en) Intelligent service assistant - instrument side software client
US10661267B2 (en) Electric pipette system, electric pipette, and operating procedure display device
US20180032549A1 (en) Experimental data recording device, experimental data recording method and experimental data display device
JP2020508441A (en) Dynamic control of automation systems
JPWO2015029676A1 (en) Sample analysis system
WO2018119321A1 (en) Automatic diagnostic laboratory and laboratory information management system for high throughput
CN103592447A (en) Method and apparatus for determining or testing an arrangement of laboratory articles on a work area of a laboratory work station
Shumate et al. IoT for real-time measurement of high-throughput liquid dispensing in laboratory environments
US20050102056A1 (en) Computer-guided sample handling
AU2020337871B2 (en) Systems and methods for laboratory deck setup verification
JP7415000B2 (en) Computer-implemented liquid handler protocol
JP6781992B2 (en) Experimental data recording device, computer program, experimental data, experimental data recording method, experimental data display device and experimental data display method
JP7393454B2 (en) Troubleshooting with proximity interaction and voice commands
US11999066B2 (en) Robotics calibration in a lab environment
Althobaiti et al. AR Gauge Scanner Mobile Application
WO2014196342A1 (en) Specimen pre-processing device system building assistance tool and system information building tool
Kranjc Entrepreneur's Perspective on Laboratories in 10 Years

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20751960

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2020751960

Country of ref document: EP

Effective date: 20210906