GB2612967A - Computer vision system for a biological assay - Google Patents

Info

Publication number
GB2612967A
Authority
GB
United Kingdom
Prior art keywords
equipment
user
list
microplate
biological assay
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2116418.1A
Other versions
GB202116418D0 (en)
Inventor
Thomas Meany
Helene Steiner
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Biotech Res Laboratories Ltd
Original Assignee
Biotech Res Laboratories Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Biotech Res Laboratories Ltd filed Critical Biotech Res Laboratories Ltd
Priority claimed from application GB2116418.1A
Published as GB202116418D0 and GB2612967A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/44 Event detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/60 Type of objects

Abstract

A method for monitoring the completion of a biological assay, comprising: receiving images of a user performing one or more steps of the biological assay 602; identifying, in the images, equipment and/or samples (101-106, Fig.1) used by the user during the biological assay 603; and storing the identified equipment and/or samples (image data) as a first list of equipment 604. Identification of errors in performance of the biological assay may be obtained by comparing a second list of equipment, which details the correct performance of the assay, to the first list of equipment and identifying deviations in the first list. Barcodes (QR) (202, 204, Fig. 3) may be used to identify samples. Errors that may be found include: errors in the order of the performance of assay steps; using the wrong piece of equipment; using equipment incorrectly, e.g. having equipment at the wrong temperature as determined by a display of the equipment; the use of an incorrect well of a microplate; the user not wearing correct protective equipment; and/or cross contamination. An indication of the error, as characterised by the deviation, may be provided to the user as an audio or visual alert (501, Fig.2) or stored for later use.

Description

Computer vision system for a biological assay
Field of the Invention
The invention relates to a method and computer vision system for detection of errors in the performance of a high-throughput biological assay.
Background of the Invention
Biological assays in the most general form require the analysis of biological materials or tissues in order to obtain an analytical finding. A specific subset includes diagnostic assays which are undertaken on living tissues to make a diagnosis. In the case of human diagnostics it is possible to use routine experimental procedures to determine whether a person has a specific illness, disease or infection.
The procedures required to make the determination will involve the use of specific tools and apparatus. This includes personal protective equipment to ensure the user performing the assay is protected from a potential contaminant; examples include a microbiological biosafety cabinet, a laboratory coat and impermeable gloves. The assays often require temperature control techniques that can hold a specific pre-set temperature or temperature profile for a fixed time period, such as ice, chilled thermal blocks, water-based heat baths or metal heating blocks. Often the assays require agitation to maintain a consistent solution, and this can involve the use of mixers or other mechanical agitators to ensure mixing occurs at an adjustable predetermined frequency for a fixed period of time. Assays will often require the movement of precise volumes of liquids between temporary sterile storage locations such as glass beakers, plastic tubes or multiple wells. Maintenance of sterility and the prevention of contamination or degradation of DNA or RNA from cells or viruses can require the use of sterile, disposable, single-use plastics free of contaminants, DNase, RNase or other degradants.
The detection and identification of the source of errors in biological assays is known to be a challenge. Scientific controls are often used in an experiment where the influencing factor being tested is not applied, serving as a standard for comparison against a group where the factor is applied. Due to the complexity of biological systems, it is common for a control or set of controls to be used to permit users to identify failure points in their procedure. Often the failure of an assay will be the result of human errors in the procedure, and these are by nature difficult to immediately identify.
Summary of the Invention
Examples of preferred aspects and embodiments of the invention are as set out in the accompanying independent and dependent claims.
The following presents a simplified summary of the disclosure in order to provide a basic understanding to the reader. This summary is not intended to identify key features or essential features of the claimed subject matter nor is it intended to be used to limit the scope of the claimed subject matter. Its sole purpose is to present a selection of concepts disclosed herein in a simplified form as a prelude to the more detailed description that is presented later.
The present invention is concerned with the automated detection of the tools, apparatus and components required for the undertaking of biological assays. By using an automated computer detection system to identify equipment, consumables and liquid used in a biological assay the system can tag and track the movement of the components over the course of the assay. This can then allow the system to automatically identify a discrepancy in a pre-programmed procedure. This can provide a prompt such as an audio or visual signal to alert the user of the discrepancy. It can also provide the user with a record of the error to permit the user to identify biological assays that may have an inaccurate result due to faults in the assay procedure. Biological assays are time consuming and require costly inputs and identification of failures early in a process is valuable for users.
A computer implemented method for providing a record of user performance of one or more steps of a biological assay is described. The method comprises receiving, from an image capture device, a first plurality of images of the user performing the biological assay, wherein the first plurality of images span a first time period in which the user is performing the one or more steps of the biological assay. The method further comprises extracting image data from the first plurality of images, wherein extracting the image data comprises identifying equipment and/or samples used by the user during the first time period. The method also comprises storing the extracted image data as a first list of equipment.
The methods described herein may be performed by software in machine readable form on a tangible storage medium e.g., in the form of a computer program comprising computer program code means adapted to perform all the steps of any of the methods described herein when the program is run on a computer and where the computer program may be embodied on a computer readable medium. Examples of tangible (or non-transitory) storage media include disks, thumb drives, memory cards etc and do not include propagated signals. The software can be suitable for execution on a parallel processor or a serial processor such that the method steps may be carried out in any suitable order, or simultaneously.
This acknowledges that firmware and software can be valuable, separately tradable commodities. It is intended to encompass software, which runs on or controls "dumb" or standard hardware, to carry out the desired functions. It is also intended to encompass software which "describes" or defines the configuration of hardware, such as HDL (hardware description language) software, as is used for designing silicon chips, or for configuring universal programmable chips, to carry out desired functions.
The preferred features may be combined as appropriate, as would be apparent to a skilled person, and may be combined with any of the aspects of the invention.
Many of the attendant features will be more readily appreciated as the same becomes better understood by reference to the following detailed description considered in connection with the accompanying drawings.
Brief Description of the Figures
The present description will be better understood from the following detailed description read in light of the accompanying drawings, wherein: Figure 1 displays the workspace 100 to be imaged with biological materials (101 - a 24-unit biological sample container, 102 - a reservoir to contain output samples, 103 - replacement disposable plastics for moving liquid, 104 - concentrated reagents, 105 - waste container, and 106 - an accurate liquid handling tool with replaceable tip for repeatable sterile liquid handling) and a worker 107 using those materials to perform an assay. The computer vision system is capable of recognising, identifying, and tracking the specific materials.
Figure 2 illustrates an intervention 501 occurring in the workspace 500 by means of a visual cue or an audio cue.
Figure 3 shows the imaged workspace 200 and barcodes being used and imaged as an additional identifier for samples. Multiple barcodes 202 and 204 can be used to identify multiple biological tools 201 and 203 or containers.
Figure 4 shows a worker performing an assay in the workspace. The worker uses biological tools to move materials in order to perform an assay. The computer vision system is capable of identification and tracking of the instruments and materials.
Figure 5 illustrates a computer vision system which identifies the instrumentation and can create a list of procedures which can be compared against a prepared list of procedures. The comparison thus records if a procedure has been performed correctly or incorrectly.
Figure 6 illustrates a method in accordance with examples of this invention.
Figure 7 illustrates an exemplary computing-based device in which embodiments of Figures 1 to 6 can be implemented.
Like reference numerals are used to designate like parts in the accompanying drawings.
Detailed Description of the Invention
The following description is made for the purpose of illustrating the general principles of the present technology and is not meant to limit the inventive concepts claimed herein. As will be apparent to anyone of ordinary skill in the art, one or more or all of the particular features described herein in the context of one embodiment are also present in some other embodiment(s) and/or can be used in combination with other described features in various possible combinations and permutations in some other embodiment(s).
A computer implemented method and a computer vision system for identifying errors in user performance of a biological assay is described.
A biological assay can be any type of assay where multiple samples are analysed. For example, the biological assay can be a diagnostic test for any type of pathogen including human, animal or plant pathogens. The pathogen can be bacterial, viral or fungal. For example, the biological assay can be a diagnostic test for a viral pathogen, such as, influenza A, influenza B, SARS, including SARS-CoV-2 (COVID-19), hepatitis A, hepatitis B, parvovirus B19, measles, rubella, mumps or arboviruses such as St. Louis encephalitis virus.
Biological assays can be conducted in a sterile environment such as a biosafety cabinet with a laminar flow designed to maintain a sterile work environment and the protection of the user from potentially infectious, pathogenic, or contaminated items within the environment.
Figure 1 shows a workspace 100 where a user or worker 107 is to perform a biological assay.
The workspace 100 can be imaged by an image capture device such as a video camera. The workspace contains the tools, equipment, or items necessary to perform the biological assay. For example, as shown in Figure 1, the workspace contains a 24-unit biological sample container 101, a reservoir to contain output samples 102, replacement disposable plastics for moving liquid 103, concentrated reagents 104, waste container 105 and an accurate liquid handling tool with replaceable tip for repeatable sterile liquid handling 106. The skilled person would understand that the contents of workspace 100 are purely exemplary and different equipment or tools may be present dependent upon the biological assay to be performed.
As the user 107 performs the biological assay they are monitored by one or more image capture devices (not shown). In one example the one or more image capture devices (not shown) comprise video cameras. However, in other examples the image capture device may comprise a camera that captures multiple separate images separated in time. In some examples, the one or more image capture devices comprises more than one image capture device and each image capture device is used to provide a different view of the user 107 performing the biological assay. Thus, multiple cameras can be used to provide additional angles for analysis.
The image capture device, or where appropriate image capture devices, capture a first plurality of images of the user/worker 107 performing the biological assay. The images from this first plurality of images span a time period during which the user 107 is performing one or more steps of the biological assay. In some examples, the images from this first plurality of images cover the entire time period for which the user 107 is performing the biological assay. In other examples, only a limited number of steps of the biological assay are captured. For example, if only a limited number of steps of the biological assay are crucial for accuracy, only these steps may be represented in the images captured by the image capture device. In some examples, the image capture device or image capture devices can be triggered in response to a signal from an ultrasonic or infrared motion detector. In such examples, multiple inputs can be added to the computer vision system and integrated into the vision system such that ultrasonic, or infrared motion detection may be used as a trigger for a camera to begin collecting images of the biological assay.
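The motion-triggered capture described above can be sketched as a simple session controller. This is a minimal illustrative sketch only; the function and event names are hypothetical and not specified by the patent.

```python
def capture_session(events, quiet_gap=3):
    """Given a time-ordered list of (timestamp, kind) events, where kind is
    'motion' (an ultrasonic/infrared trigger firing) or 'frame' (a camera
    image becoming available), return the timestamps of frames retained.
    A capture session starts at a motion event and stays active for
    quiet_gap seconds after the most recent motion. Hypothetical sketch."""
    captured = []
    active_until = None  # end time of the current capture session, if any
    for t, kind in events:
        if kind == "motion":
            active_until = t + quiet_gap  # (re)start/extend the session
        elif kind == "frame" and active_until is not None and t <= active_until:
            captured.append(t)  # frame falls inside an active session
    return captured
```

For example, with a motion event at t=1 and quiet_gap=3, a frame at t=2 is kept while a frame at t=6 is discarded.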
The images captured by the image capture device are processed to extract image data. The images captured by the image capture device are processed to identify the equipment, tools or other items used or otherwise interacted with by the user 107 during performance of the biological assay. In addition, in some examples, any samples, such as biological samples, that the user interacts with or uses either directly or via pieces of equipment can also be identified from the image data or separately. The equipment, tools or other items used by the user 107 and, where applicable, the samples used by the user 107, are stored as a list of equipment. The list of equipment can correspond to a list which represents procedures performed by the user 107 as part of the biological assay. In some examples, the list of equipment may be an ordered list of equipment used by the user 107 wherein the order of the list reflects the order in which the user 107 used the equipment. The order in which the equipment is used may be identified from the image data by processing the image data sequentially and identifying the order in which items of equipment used by the user are identified in the image data.
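Collapsing per-frame detections into the ordered list of equipment described above might look like the following. This is a sketch under the assumption that upstream detection yields a label list per frame; the names are illustrative.

```python
def ordered_equipment_list(frame_detections):
    """Collapse per-frame equipment detections into an ordered list,
    recording each item of equipment the first time it appears in use.
    frame_detections is an iterable of lists of equipment labels, one
    list per frame, in temporal order. Illustrative sketch only."""
    seen = set()
    ordered = []
    for labels in frame_detections:
        for label in labels:
            if label not in seen:  # first use fixes the item's position
                seen.add(label)
                ordered.append(label)
    return ordered
```

Processing frames sequentially in this way makes the order of the list reflect the order in which equipment was first used.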
The user 107 using equipment may comprise the user 107 interacting with the equipment in any way. For example, the user 107 using the equipment may include the user 107 picking up or otherwise moving the equipment. In other examples, the user 107 using the equipment may comprise the user interacting with a piece of equipment through another piece of equipment. For example, a user 107 may be considered to have used the 24-unit biological sample container 101 if the user 107 uses liquid handling tool 106 to move liquid (or any other suitable substance) into the 24-unit biological sample container 101, even if the user 107 never directly touches or otherwise handles the biological sample container 101.
Once the first plurality of images have been processed in order to extract the list of equipment used by the user 107 and, where applicable, an order of the equipment used by the user 107, the image data can, in some examples, be compared to predetermined assay data. The predetermined assay data can be obtained in multiple ways. In one example, the predetermined assay data may be obtained from the computer vision system monitoring a sample user/trainer performing the biological assay. However, in other examples the predetermined biological assay data may be programmed or obtained from another source. In any case, the predetermined biological assay data comprises a representation of a routine to be followed by the user 107 when the user 107 performs the one or more steps of the biological assay. This routine comprises a list of equipment that will be used/should be used by the user 107 when the user 107 is performing the biological assay. The predetermined biological assay data may also comprise an order in which the user 107 will/should use the equipment. Therefore, as with the image data, the predetermined assay data may comprise an ordered list of equipment wherein the order of the list reflects the order in which the user 107 should use the equipment. The skilled person would understand that the predetermined assay data represents a routine to guide the user 107 and that the user 107 may erroneously fail to follow this routine. Hence, the skilled person would understand that while a user 107 should use a piece of equipment as part of the routine this may not happen in practice.
As mentioned above, the predetermined assay data may be obtained by having a trainer/user perform an exemplary version of the biological assay. In this example, before the user performs the biological assay, a trainer/user performs the biological assay using an exemplary process. The computer vision system extracts image data from this exemplary process, extracting a list of equipment used by the trainer/user and, for example, storing it as a list of equipment that comprises the biological assay data. In some examples, the list of equipment forming the biological assay data can comprise an ordered list of equipment wherein the order of the equipment in the list reflects the order in which the trainer/user used the equipment.
Comparing the image data to the predetermined assay data may comprise comparing the list of equipment that was used by the user and which forms part of the image data to the list of equipment that should be used by the user and which forms part of the predetermined biological assay data. Any deviation between the image data and the predetermined assay data can be identified. For example, any differences between the list of equipment extracted from the image data and the list of equipment that comprises the predetermined assay data can be identified. These differences could be in the equipment used or, when the two lists of equipment are ordered lists of equipment, in the order in which equipment was used. These differences represent an error in the performance of the biological assay and thus a potential indication that the result of the biological assay may not be accurate and thus the results of any diagnostic tests may be incorrect.
For example, the biological assay data may indicate that the user 107 should interact with accurate liquid handling tool 106, then concentrated reagents 104, then the 24-unit biological sample container 101. The image data may indicate the user 107 actually interacted with accurate liquid handling tool 106, then concentrated reagents 104, then the reservoir to contain output samples 102. Therefore, a deviation between the biological assay data and the image data can be determined. In particular, it can be determined that the last piece of equipment on the ordered list differs between the biological assay data and the image data.
This deviation represents an error in performance of the biological assay.
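The comparison of the observed ordered list against the predetermined routine can be sketched as an element-wise diff. A minimal sketch, assuming both lists use the same equipment labels; the labels below are illustrative only.

```python
def find_deviations(expected, observed):
    """Compare the predetermined (expected) ordered list of equipment with
    the observed list extracted from the image data. Returns a list of
    (position, expected_item, observed_item) tuples, one per mismatch;
    positions beyond the shorter list are reported with None. Sketch only."""
    deviations = []
    for i in range(max(len(expected), len(observed))):
        e = expected[i] if i < len(expected) else None
        o = observed[i] if i < len(observed) else None
        if e != o:
            deviations.append((i, e, o))
    return deviations
```

In the worked example above, the expected last item (the sample container 101) differs from the observed last item (the reservoir 102), so a single deviation at the final position would be reported.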
When an error/deviation is determined then an intervention can occur in that an indication that an error has been identified can be provided. In the example shown in Figure 2, this error may be notified to the user 107 performing the biological assay immediately by means of a visual or audio cue 501 that is provided to the user 107 so that it is a visible/audible intervention in the workspace 100/500. In a non-limiting example, Figure 2 shows an example error notification that comprises an exclamation mark 501 that is displayed to the user 107 on a display in the workspace 100/500. In other examples, a different notification of error may be used, for example a display of a cross, a crossed-out circle or other error symbol may be used or a red light may be lit. As an alternative or in addition an audio signal such as an alarm or buzzer may be provided to the user 107. This enables the user 107 to either stop performing the biological assay or to correct the error. Presenting an error in this form may give the user 107 the opportunity to remedy the error quickly and before any subsequent steps of the biological assay are performed which would prevent such a remedy. Therefore, providing a user with an error in this form can increase the accuracy of the resultant biological assays.
In another example in addition or as an alternative, the indication of the error may be provided to the user 107 or another party when they view the results of the biological assay. This informs the user 107 or other party that an error occurred during performance of the biological assay and that the results of the biological assay may not be accurate. For example, if the biological assay is a diagnostic test for a disease, then the notification that an error occurred during performance of the assay allows the user 107 or other party to establish the diagnostic test may represent a false positive or a false negative and should be performed again.
Performing a biological assay can be repetitive. In such a case the predetermined assay data may comprise data related to a single repetition of the biological assay. However, the user 107 may make several repetitions of the routine in the biological assay data, for example, by repeating a set of steps multiple times on different samples. In such a case, each repetition of the set of steps by the user 107 may be separately compared to the biological assay data. The start of a new repetition may be identified by a specific user 107 action or by having the computer vision system identify the first step in the routine represented by the biological assay data and consider a new repetition to have been started when this step is identified in the image data.
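Segmenting a repeated routine by re-detecting its first step, as described above, can be sketched as follows. A hypothetical sketch; the routine and labels are illustrative.

```python
def split_repetitions(observed, first_step):
    """Split a flat stream of observed equipment uses into repetitions,
    starting a new repetition whenever the routine's first step reappears.
    observed is an ordered list of equipment labels; first_step is the
    label that begins the routine in the predetermined assay data."""
    repetitions = []
    current = []
    for item in observed:
        if item == first_step and current:  # first step seen again: new repetition
            repetitions.append(current)
            current = []
        current.append(item)
    if current:
        repetitions.append(current)
    return repetitions
```

Each resulting sub-list can then be compared separately against the single-repetition routine using the deviation check described earlier.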
Figure 3 shows a workspace 200 that corresponds to workspace 100 and workspace 500. As discussed above, the first plurality of images are analyzed to determine image data wherein the image data comprises a list of equipment used by the user 107 and, in some examples, an order in which the user 107 used the equipment. In order to generate this list, the computer vision system and method can identify and track the instruments, materials and other equipment used by the user 107. In one example the equipment can be identified using a machine learning model that has been trained to identify laboratory equipment. The machine learning model may be a classifier that has been trained to identify laboratory equipment, for example using supervised learning, wherein the classifier is provided with images of laboratory equipment and information identifying the laboratory equipment during training. In other examples the equipment may be identified using an identifier such as a barcode or a QR code, or using any other suitable means.
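The supervised-classifier idea above can be illustrated with a toy nearest-neighbour classifier over feature vectors (e.g. colour histograms of labelled equipment images). This is a deliberately simplified stand-in for the trained machine learning model the patent describes, not the actual model; all names and features are illustrative.

```python
def train(examples):
    """examples: list of (feature_vector, label) pairs from labelled images
    of laboratory equipment. A 1-nearest-neighbour 'model' simply retains
    the training examples. Toy stand-in for a trained classifier."""
    return list(examples)

def classify(model, feature_vector):
    """Return the label of the training example whose feature vector is
    nearest (squared Euclidean distance) to feature_vector."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda ex: dist(ex[0], feature_vector))[1]
```

A production system would more plausibly use a trained convolutional object detector, but the train/classify split is the same supervised-learning shape described above.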
The skilled person would understand that various forms of equipment can be identified. For example, the method may comprise identifying equipment to manoeuvre precise volumes of liquids, such as a pipette. In other examples, the method may additionally or alternatively comprise identifying equipment to maintain sterility using disposable or consumable components to temporarily house fluids, which can be glass or plastic, such as pipette tips, plastic plates, or beakers. Additionally, or alternatively, the method may comprise identifying equipment used to maintain a stable temperature, such as heating apparatus, incubators or ice blocks. The skilled person would understand that, as mentioned above, a biological assay may be conducted in a sterile environment such as a biosafety cabinet with a laminar flow designed to maintain a sterile work environment and the protection of the user from potentially infectious, pathogenic, or contaminated items within the environment. Thus, the equipment identified may include a biosafety cabinet to ensure that the biological assay is taking place in the biosafety cabinet.
In one example, as well as identifying items of equipment, the system/method may identify whether a user is wearing personal protective equipment while performing the biological assay. This can involve determining whether a user is wearing gloves. When gloves are used, the gloves can be a colour that is distinct from human skin colours. For example, the gloves could be blue, green, purple etc. The image data can be processed to determine whether the hands of the user are a colour consistent with human skin colours or a colour consistent with the gloves. If the user's hands are a colour consistent with human skin colour it can be determined the user is not wearing gloves. Similarly, if the user's hands are a colour consistent with the colour of the gloves it can be determined the user is wearing gloves. In some examples, in response to determining the user is not wearing gloves, an intervention can occur in the form of an indication as described with respect to Figure 2. While the above example assumes the user is wearing gloves on both hands, the skilled person would understand that in some examples, dependent upon the process being performed, the user only has to wear a glove on a single hand. In such a case, only a single hand will be considered when determining if the user is wearing gloves.
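The glove check described above (glove colour distinct from skin tones) can be sketched as a majority vote over sampled hand-region pixels. The colour values and tolerance below are illustrative assumptions, not values from the patent.

```python
def wearing_gloves(hand_pixels, glove_rgb=(0, 0, 255), tol=80):
    """Decide whether sampled hand-region pixels match the known glove
    colour rather than a skin tone. hand_pixels is a list of (r, g, b)
    tuples; glove_rgb is the expected glove colour (blue here, purely
    illustrative) and tol a per-channel tolerance. Returns True when a
    majority of pixels fall within tol of the glove colour."""
    def close(pixel, ref):
        return all(abs(a - b) <= tol for a, b in zip(pixel, ref))
    matches = sum(1 for p in hand_pixels if close(p, glove_rgb))
    return matches > len(hand_pixels) / 2
```

If the result is False (hand pixels consistent with skin colour), the system can trigger the same intervention as described with respect to Figure 2.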
In the example shown in Figure 3, identifiers 202, 204 can be used to identify the samples being used as part of the biological assay. The samples may be biological samples such as a human specimen/human biological specimen or any other suitable form of sample. The identifiers can be printed or stuck on the pieces of equipment, such as test tubes or plastic tubes, containing the samples or presented on the equipment containing the samples in any other suitable form. The identifiers 202, 204 may comprise a barcode, a QR code, a serial number or any other form of identifier that can be read by a computer vision system, or alternatively by an identifier scanner that works in conjunction with the computer vision system and provides the information to the computer vision system. The identifiers can be used to form log files of the samples used.
In the example shown in Figure 3, the identifiers comprise barcodes. When barcodes are used to identify samples, these barcodes can include data using the Health Level 7 (HL7) communication format. The list of equipment can then also include any samples/specimens the user interacted with, based on the identifiers on the equipment the user interacts with. In addition, or as an alternative, the sample/specimen information can be stored separately from the image data. In some examples, the sample/specimen information can be compared to information about which samples/specimens should be used, to confirm the biological assay was performed on the correct sample/specimen. As such, either the list of equipment or the separate sample information can comprise log files. Therefore, in the event that the assay is conducted on samples such as human specimens, identifying equipment may integrate barcode analysis to identify specific samples and associated errors and integrate this data using the HL7 communication format. Thus, in some examples, by integrating barcode technology it is possible to associate log files either recorded manually by the user or produced by laboratory instrumentation such as a robotic liquid handler or an optical analyser.
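Building the per-sample log files described above, keyed by decoded barcode identifiers, could be sketched as follows. This is a simplified stand-in for the HL7-formatted log integration; the field names and identifier format are illustrative assumptions.

```python
def log_sample_use(log, barcode, equipment, timestamp):
    """Append one entry to a per-sample log structure keyed by the decoded
    barcode identifier. Each entry records which piece of equipment
    interacted with the sample and when. Hypothetical sketch; a real
    system would serialise such entries into an HL7-formatted message."""
    log.setdefault(barcode, []).append({"equipment": equipment, "time": timestamp})
    return log
```

Comparing the keys of such a log against the set of sample identifiers that should have been used allows the system to confirm the assay was performed on the correct specimens.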
Figure 4 shows a user 304 performing a biological assay in a workspace 300 wherein, for example, the biological assay forms part of a diagnostic test for Covid-19. The user 304 can correspond to user 107 and the workspace 300 can correspond to workspace 100. In Figure 4, the user 304 is moving a biological liquid, such as a sample, from a location in the 24-unit biological sample container 301 to a position in a reservoir to contain output samples 302 for mixing purposes. The movement of this combined biological liquid may comprise moving the test tube/plastic tube containing the biological liquid from location 301 to apparatus 302. Alternatively, or in addition, the movement uses a disposable plastic tip taken from 303 and accurate liquid handling tool 306. The disposable plastic tip may contain a reagent that is added to the biological liquid before it is moved to the reservoir to contain output samples 302. This procedure can be repeated many times using many apparatuses. The skilled person would understand that the process performed by user 304 is purely exemplary and that performing the biological assay may involve different processes.
In Figure 4, the computer vision system is capable of identification and tracking of the instruments and materials. The system can create a list of procedures performed by the user 304 based on the movements conducted. As discussed above, images are captured of the user 304 performing the biological assay during a first time period. These images are analysed by the computer vision system to identify the equipment being used by the user 304 during this time period. This analysis can involve detecting the equipment/tools using machine learning and the samples using identifiers 202, 204. For example, any piece of equipment or tool the user 304 picks up or otherwise interacts with can be identified. In addition, or as an alternative, any piece of equipment or tool that is moving can be identified irrespective of whether the computer vision system has identified that the user 304 is responsible for this movement.
However, it is noted that not all pieces of equipment will move. For example, 24-unit biological sample container 301 and reservoir to contain output samples 302 may remain stationary throughout the biological assay even though the user 304 is, for example, transferring a biological liquid from the 24-unit biological sample container 301 to the reservoir to contain output samples 302. Therefore, the list of equipment can include equipment the user 304 does not interact with directly. To this end, the computer vision system can detect equipment the user 304 interacts with via other equipment. Therefore, the computer vision system can include in the list of equipment any piece of equipment that interacts with another piece of equipment. This can be done by detecting when a first piece of equipment is moved to the vicinity of a second piece of equipment and/or is moved within a threshold distance of a second piece of equipment. In such a scenario, the second piece of equipment can be included in the list of equipment used by the user 304.
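By way of illustration only, the threshold-distance check described above could be sketched as follows in Python. The function name, the coordinate representation and the 50-pixel threshold are assumptions made for the sketch and do not form part of any specific embodiment:

```python
def update_equipment_list(equipment_list, moved_item, positions, threshold=50.0):
    """Add a moved item to the list, plus any stationary item it comes near.

    positions maps an item name to its (x, y) centre in image coordinates.
    A stationary item counts as "used" once the moved item comes within the
    threshold distance of it.
    """
    if moved_item not in equipment_list:
        equipment_list.append(moved_item)
    mx, my = positions[moved_item]
    for item, (x, y) in positions.items():
        if item == moved_item:
            continue
        # Euclidean distance between the moved item and each other item.
        if ((mx - x) ** 2 + (my - y) ** 2) ** 0.5 <= threshold:
            if item not in equipment_list:
                equipment_list.append(item)
    return equipment_list
```

In this sketch, a stationary container such as 302 would be appended to the list as soon as a pipette tip is brought close to it, even though the container itself never moves.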
In one example, all pieces of equipment the user interacts with directly or indirectly are included in the ordered list in the order the user interacts with the equipment. For example, if a first piece of equipment is used to transfer a part of the biological assay from a second piece of equipment to a third piece of equipment, the list of equipment can comprise the first piece of equipment, the second piece of equipment and then the third piece of equipment in that order. In another example, if a first piece of equipment is used to transfer a part of the biological assay from a second piece of equipment to a third piece of equipment, the list of equipment can comprise the first piece of equipment, the second piece of equipment, the first piece of equipment and the third piece of equipment.
In alternative examples, the list of equipment can comprise a list formed of multiple columns and/or sections. For example, the list of equipment can comprise a first column or sub-section indicating a first piece of equipment being interacted with by user 304. The list of equipment can then comprise a second column or sub-section that corresponds to the first column or sub-section and indicates any secondary pieces of equipment being interacted with by the first piece of equipment. In this case, taking the example above of a first piece of equipment being used to transfer a part of the biological assay from a second piece of equipment to a third piece of equipment, the list could comprise:

First piece of equipment     Second piece of equipment
First piece of equipment     Third piece of equipment

Alternatively, the multiple columns/sections may represent the equipment being interacted with by the user 304, either through direct interaction or via another piece of equipment, along with details of the form of the interaction. In this case, the above example of a first piece of equipment being used to transfer a part of the biological assay from a second piece of equipment to a third piece of equipment could result in a list taking the form:

First piece of equipment     Direct user interaction
Second piece of equipment    Via first piece of equipment
Third piece of equipment     Via first piece of equipment

The skilled person would understand that in alternative examples, the list of equipment may comprise additional entries representing the user moving the first piece of equipment between the second and third pieces of equipment and/or picking up or putting down the first piece of equipment. Therefore, the skilled person would understand that the above are purely exemplary ways of forming the list of equipment and other ways are also possible.

In the above cases, the pieces of equipment interacted with by the user 304 either directly or indirectly can be detected using a machine learning model. A first item of equipment may be added to the list in response to either the first item of equipment moving or in response to a second item of equipment being moved within a threshold distance of the first item of equipment. This enables the computer vision system to identify the items of equipment that form part of the biological assay. However, the skilled person would understand that other ways of identifying the items of equipment that form part of the performance of the biological assay are possible.
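A list recording both the equipment and the form of the interaction, as described above, could be sketched as follows. The function and field names are illustrative assumptions only:

```python
def record_transfer(log, tool, source, destination):
    """Append entries for a tool transferring material from a source piece
    of equipment to a destination piece of equipment.

    Each entry pairs the equipment with the form of interaction, and entry
    order follows the order in which the interactions occur.
    """
    log.append((tool, "direct user interaction"))
    log.append((source, f"via {tool}"))
    log.append((destination, f"via {tool}"))
    return log
```

For example, a pipette transferring liquid from a sample container to an output reservoir would produce three entries, the first marked as a direct user interaction and the other two as indirect interactions via the pipette.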
The skilled person would understand that a user performing the biological assay may interact with a piece of equipment in a non-meaningful way. For example, a user 304 may pick up a piece of equipment and then instantly put down the piece of equipment. In such a case, it is desirable to avoid the piece of equipment appearing on the ordered list of equipment to prevent a false error being identified. To avoid this problem, trivial interactions with a piece of equipment can be removed from the list of equipment.
In one example, an interaction with a piece of equipment is identified as trivial if that piece of equipment does not interact with any other pieces of equipment during the user 304 interaction. Therefore, actions where a user 304 picks up a piece of equipment and then puts it down are not included in the list of equipment. This can include cases where the user 304 picks up a piece of equipment by mistake or picks up a piece of equipment in order to rearrange the workspace 300. In some examples, if a user picks up a piece of equipment and this is identified as an error, an indication of the error can be provided to the user in accordance with the examples described with respect to Figure 2. If this indication results in the user immediately putting down the equipment without interacting with any other piece of equipment, the piece of equipment can then be deleted from the list of equipment, so the error does not show in subsequent analysis. This means that a user 304 can be alerted to a potential error while any record of the biological assay is adjusted to represent the fact that the potential error did not occur.
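As an illustrative sketch of the pruning step above (the field names are assumptions for illustration), an interaction could be dropped when the held equipment touched nothing else:

```python
def prune_trivial(interactions):
    """Remove entries where the equipment interacted with nothing else.

    Each interaction is a dict with an 'item' name and a 'secondary' list of
    other equipment the item interacted with while it was held; entries with
    an empty 'secondary' list are treated as trivial and removed.
    """
    return [i for i in interactions if i["secondary"]]
```

A pick-up-and-put-down by mistake yields an empty `secondary` list and is filtered out, whereas a pipette that touched a tube is retained.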
In another example, the interaction with the piece of equipment may only be removed from the list of equipment if the interaction lasts below a threshold length of time. This is to ensure that interactions where a user 304 interacts with a single piece of equipment by, for example, swirling a beaker, heating the equipment in their hand, or observing the piece of equipment are not removed from the list of equipment when they represent important steps of the biological assay. The threshold length of time may be 10 seconds, 20 seconds, 30 seconds or another suitable value. In order to ensure these short interactions can be removed, the list of equipment may comprise details on the length of time the user 304 interacted with the equipment so that entries below the threshold can be identified. In an alternative example, an item of equipment may only be added to the list of equipment after the threshold length of time.
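Combining the two criteria above, a sketch of the time-threshold filter might look as follows (the tuple layout and the 20-second default are illustrative assumptions):

```python
def filter_short_interactions(interactions, threshold_s=20.0):
    """Keep interactions that either involved other equipment or lasted at
    least the threshold (e.g. swirling a beaker for half a minute); drop
    short solo interactions such as briefly picking something up.

    Each interaction is a tuple (item, duration_seconds, touched_other).
    """
    kept = []
    for item, duration, touched_other in interactions:
        if touched_other or duration >= threshold_s:
            kept.append(item)
    return kept
```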
In a further example, a piece of equipment may be identified as being trivial and the computer vision system may know not to add this piece of equipment to the list of equipment. For example, a user may use a guide in the form of a 2D or 3D object that indicates how to perform the biological assay. In such cases, the computer vision system can exclude the guide from the list of equipment since the guide is being used for informational purposes rather than as part of the process of performing the biological assay. To this end, the computer vision system can be trained or otherwise programmed to identify and ignore the guide.
As described above, in one example an error in a biological assay can comprise a user 107, 304 interacting with an incorrect piece of equipment. However, another form of error involves a user interacting with the correct piece of equipment in the wrong way. Figure 5 comprises Figure 5a where user 404 performs the biological assay correctly and Figure 5b where user 410 performs the biological assay incorrectly.
In Figure 5a, user 404 in workspace 400 is performing a step of a biological assay by using accurate liquid handling tool 405 with a replacement disposable plastic 403 to move a biological liquid from 24-unit biological sample container 401 to reservoir to contain output samples 402. In Figure 5a, the user 404 correctly transfers the biological liquid to the upper left unit of the reservoir to contain output samples 402.
In Figure 5b, user 410 in workspace 406 is performing the same step of the biological assay as user 404. In particular, user 410 is using accurate liquid handling tool 411 with a replacement disposable plastic 409 to move a biological liquid from 24-unit biological sample container 407 to reservoir to contain output samples 408. In Figure 5b, the user 410 transfers the biological liquid to the second from the top leftmost unit of the reservoir to contain output samples 408. This represents an error since the user 410 was meant to transfer the biological liquid to the upper left unit of the reservoir to contain output samples 408. However, user 404 and user 410 are both interacting with the correct pieces of equipment in the correct order. Therefore, in some examples it is beneficial to have the list of equipment identify which area or portion of the equipment the user 404, 410 interacted with.
To this end, the computer vision system can identify different sections of the equipment and include this information in the list of equipment. For example, a piece of equipment, such as a microplate, can comprise multiple cells or units. While a microplate can be used to directly contain samples, the skilled person would understand that the term microplate also covers racks that are configured to accept test tubes/plastic tubes. If the equipment comprises a microplate, such as 24-unit biological sample container 401, 407, and reservoir to contain output samples 402, 408, or any other piece of equipment that can be divided into units, then the computer vision system can identify and label each individual cell or unit. For example, the computer vision system can use a grid style labelling system, labelling each row of units using a letter e.g., A, B, C, and each column of units using a number, e.g., 1, 2, 3 or vice versa. This labelling system can be used to identify each individual unit. For example, the upper left unit of a 24-unit piece of equipment can be labelled 1A while the bottom right can be labelled 6D. This allows each individual unit to be identified. Other ways of identifying sections of pieces of equipment can be used where appropriate. The list of equipment can then comprise both the item of equipment and the unit or section of the piece of equipment the user 404, 410 interacted with. An error in the biological assay can be identified if either one or both of the item of equipment and section of the piece of equipment differ between the predetermined biological assay data and the list of equipment from the image data. This enables an error such as the one made by user 410 to be identified.
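The grid labelling above, where the upper left unit of a 24-unit plate is 1A and the bottom right is 6D, could be sketched as follows (the function names and the 4 x 6 grid size are illustrative assumptions):

```python
def well_label(row, col):
    """Label a well with its column number followed by its row letter, so
    the upper-left well (row 0, column 0) is 1A."""
    return f"{col + 1}{chr(ord('A') + row)}"

def plate_labels(rows=4, cols=6):
    """All labels for a 24-unit plate (rows A-D, columns 1-6), row by row
    starting from the upper left."""
    return [well_label(r, c) for r in range(rows) for c in range(cols)]
```

The same two functions could cover other plate formats, such as a 96-well plate, by passing different row and column counts.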
In terms of identifying errors, one form of error can occur when the microplate should contain controls. To identify this form of error, the predetermined assay data may comprise information identifying which wells or units of a microplate are to be used to contain controls.
If the user transfers any piece of equipment, sample, reagent, or other substance other than a control to a well of the microplate indicated as containing a control in the predetermined assay data, then this can be detected by, for example, comparing the well of the microplate with which the user is interacting with the wells of the microplate indicated as being control wells in the predetermined assay data, and an error can be recorded. This error can result in an intervention and notification as described with respect to Figure 2. In addition, in some examples, the error can be recorded so that the user can view the errors at a later date.
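The control-well comparison described above amounts to a simple membership check. A sketch, with illustrative function and action names, might be:

```python
def check_control_wells(well, action, control_wells):
    """Return an error message if anything other than a control is added to
    a well that the predetermined assay data reserves for controls, else
    return None."""
    if well in control_wells and action != "add_control":
        return f"error: well {well} is reserved for a control"
    return None
```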
Another form of error that may occur is that the user may try to move two or more samples or other substances to the same well of a microplate, thus resulting in contamination. To identify this sort of error, the list of equipment can comprise an identification of the well of the microplate used at one or more of the steps of the biological assay that involve interacting with the microplate. Whenever a user interacts with a well of the microplate, the list of equipment can be consulted to confirm if the well has been used previously. If the well has been used previously an error can be identified and an intervention/indication of error can be provided as described with respect to Figure 2. In some instances, as well as consulting the list of equipment, the predetermined biological assay data can be consulted. The predetermined assay data can comprise information indicating when a well should be reused (e.g., when a user should perform more than one action on a single well). If the well reuse is in accordance with the predetermined assay data, then no error needs to be provided.
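A sketch of this contamination check, tracking previously used wells and allowing reuse only where the predetermined assay data permits it (names are illustrative assumptions), could be:

```python
def check_well_use(used_wells, well, allowed_reuse=()):
    """Record a well interaction; return False (an error) if the well was
    already used and is not listed as reusable in the predetermined assay
    data, else record the use and return True."""
    if well in used_wells and well not in allowed_reuse:
        return False  # contamination risk: well already used
    used_wells.add(well)
    return True
```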
The skilled person would appreciate several different ways of tracking wells/units of pieces of equipment using the computer vision system. In one example, the computer vision system has an expected orientation of workspace 400, 406. In this example, the sections of the pieces of equipment are labelled according to this orientation. For example, the computer vision system may consider the upper part of workspaces 400 and 406 to be the "top" of the workspace 400, 406. In this case, the computer vision system may automatically label the units, wells or sections of the equipment based on this expected orientation. For example, when units are being labelled according to a grid system, then the upper left unit according to the expected orientation may be unit A1.
In another example, each piece of equipment can comprise an orientation marker. The orientation marker can identify an extremity of the piece of equipment for example a top-left, top-right, bottom-left, or bottom-right corner. This orientation marker can then be used to identify the sections, units, or wells of the equipment. In the example where units are being labelled according to a grid system, the orientation marker can be used to identify a particular unit, for example the top left unit and the grid system of units can then be identified based on this orientation marker. The use of such an orientation marker means that if the piece of equipment is moved or rotated the same sections of the piece of equipment are identified with the same labelling even though the piece of equipment is now in a different orientation with respect to workspace 400, 406.
A combination of orientation markers and expected orientation of a workspace 400, 406 can also be used. In this example, the sections of pieces of equipment are initially labelled based on their orientation in the workspace. The sections are then stored based on their position relative to an orientation marker and this is used to subsequently identify sections of the piece of equipment. For example, if the orientation marker is on the bottom right-hand corner of a 24-unit piece of equipment 401, 402, 407, 408, then the computer vision system may identify unit A1 as being diagonally opposite the orientation marker. This allows the user 404, 410 to rotate equipment and also ensures the user 404, 410 can start with the equipment in any orientation without error.
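One way the marker-relative labelling above could be sketched is to map each well's image-grid position to a stable label based on which corner of the plate the marker appears in. The corner codes and function name are illustrative assumptions:

```python
def relabel_wells(rows, cols, marker_corner):
    """Map image-grid positions (r, c) to stable labels so that well 1A is
    always diagonally opposite the orientation marker, regardless of how
    the plate has been rotated.

    marker_corner is the marker's apparent corner in the image: one of
    'TL', 'TR', 'BL', 'BR'. With the marker at the bottom right (as in the
    example), 1A is the top-left well.
    """
    labels = {}
    for r in range(rows):
        for c in range(cols):
            # Flip row/column indices as needed so labelling is invariant
            # under rotation of the plate in the workspace.
            rr = r if marker_corner in ("BL", "BR") else rows - 1 - r
            cc = c if marker_corner in ("TR", "BR") else cols - 1 - c
            labels[(r, c)] = f"{cc + 1}{chr(ord('A') + rr)}"
    return labels
```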
In another example, the sections of the equipment can be identified in a dynamic fashion. In one example, this involves labelling the first unit or section of the piece of equipment the user interacts with as A1 and/or as the unit/section identified in the predetermined assay data. This unit/section can then be used to identify all other units/sections. In another version of this example, it is appreciated that in some instances it is not important which unit of a piece of equipment the user 404, 410 interacts with provided this unit has not been used previously and is consistently used for all interactions with the desired unit. In this example, the list of equipment is dynamically compared to the predetermined assay data. If the user 404, 410 interacts with a unit or section of a piece of equipment for the first time and this interaction is consistent with the predetermined assay data, then a reference for the unit/section is extracted from the predetermined assay data. The predetermined assay data is then analysed to confirm that this is the first use of that reference and if it is not the first use of that reference an indication of an error is provided. Otherwise, if this is the first use of that reference, that reference is assigned to that unit/section of the piece of equipment. Any subsequent interaction with that unit/section of the piece of equipment will use the assigned reference and an indication of an error will be provided if the references in the predetermined assay data and the list of equipment disagree. In this example, an orientation marker may be used to account for rotation of any piece of equipment.
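The dynamic assignment above could be sketched as follows. On a unit's first use the reference from the predetermined assay data is bound to it; later interactions are checked against that binding. The data structures and names are illustrative assumptions:

```python
def dynamic_reference(assigned, used_refs, unit, expected_ref):
    """Bind the reference from the predetermined assay data to the first
    physical unit the user touches, then check consistency afterwards.

    assigned maps a physical unit to its bound reference; used_refs holds
    references already bound to some unit. Returns (ok, reference), where
    ok is False when an error indication should be provided.
    """
    if unit in assigned:
        # Subsequent interaction: the expected reference must match.
        return assigned[unit] == expected_ref, assigned[unit]
    if expected_ref in used_refs:
        # Reference already bound to a different physical unit.
        return False, expected_ref
    assigned[unit] = expected_ref
    used_refs.add(expected_ref)
    return True, expected_ref
```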
The tracking of wells/units/sections etc., has uses beyond error identification. For example, as an alternative or in addition to error identification, the tracking of wells/units/sections can be used to create log data or other data relating to the position of samples and/or other useful products. For example, the user 404 may be transferring a plurality of samples from 401 to 402. Each sample may be identified by an identifier such as a barcode, or by any other suitable means. In this case, the tracking of wells/units/sections may be used to create a list indicating the well of the reservoir to contain output samples 402 to which each sample has been transferred. The user 404 may later access this data to match the results from each well/unit/section to the correct sample used to generate the results.
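The sample-position log described above could be a simple lookup built from observed transfers, sketched as follows (the tuple layout is an illustrative assumption):

```python
def log_sample_positions(transfers):
    """Build a well-to-barcode lookup from observed (barcode, well)
    transfers, so results from each well can later be matched back to
    the sample that produced them."""
    return {well: barcode for barcode, well in transfers}
```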
The skilled person would understand that correct performance of the biological assay may be subject to numerical constraints. For example, during performance of the biological assay, a sample or other liquid or substance may need to be heated to a certain temperature. Alternatively, or in addition, a specific quantity of a sample, liquid or other substance may need to be transferred from one piece of equipment to another piece of equipment. In other examples, a sample, liquid, or other substance may need to be weighed or otherwise measured and a particular quantity of the sample, liquid or other substance identified. To enable this the computer vision system may identify from the image data any numerical measurements that occur and may identify the value that is determined from these measurements. The biological assay data may comprise pre-defined values for these measurements, for example, representing the value these measurements should take if the biological assay is conducted correctly. Comparing the image data to the predetermined assay data can further comprise comparing the predetermined or desired values from the predetermined assay data to the measured values. A deviation can be determined if these values do not agree and an indication that an error has been identified in the performance of the biological assay can therefore be provided as discussed with respect to Figure 2.
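A sketch of the numerical comparison above, using a relative tolerance so that small measurement noise does not trigger an error (the function name and 5% default tolerance are illustrative assumptions), could be:

```python
def check_measurement(measured, expected, tolerance=0.05):
    """Compare an observed measurement to the predetermined value; values
    within the relative tolerance are treated as agreeing, and a value
    outside it indicates a deviation."""
    return abs(measured - expected) <= tolerance * abs(expected)
```

For instance, a temperature read as 37.2 against a predetermined 37.0 would agree, while 50.0 would be flagged as a deviation.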
As discussed above, the predetermined biological assay data may be obtained from the computer vision system monitoring a sample user/trainer performing the biological assay. In particular, to create the biological assay data a sample user/trainer may perform the steps comprising the biological assay during a time period. An image capture device or image capture devices such as those described above may monitor the sample user and capture a plurality of images of the trainer performing the biological assay over the time period. These images may be processed by the computer vision system to extract the predetermined assay data. In particular, the plurality of images of the sample user may be processed to obtain image data comprising a list of equipment used by the sample user and, in some examples, the order in which the equipment was used. This list of equipment and order can be used to form the predetermined assay data. The equipment used by the sample user can be identified in any of the ways discussed above including by use of machine learning and/or identifiers. The sections/units of pieces of equipment used can also be identified in the ways discussed above.
Creating the predetermined assay data in this fashion means the predetermined assay data can be created without programming the routine into the computer vision system. This enables a sample user who is not skilled in programming to create predetermined assay data. This also enables new predetermined assay data to be created quickly to incorporate any changes necessary.
In one example a user performing a biological assay is monitored using a camera connected to a computer which connects to laboratory information management software. The camera records the movements of the biological equipment designed for temperature management, measuring, and manoeuvring precise volumes of liquid and disposable plastic and glass components used to ensure sterility used in the performance of the assay. The images recorded are automatically annotated to determine the instruments and materials used in the assay and tracking of those components. This is compared to the predetermined assay procedure programmed by the user or provided from another source. By using automated image processing, it is possible to diagnose a potential error in advance of the full performance of the assay allowing the user to intervene early enough to save time and materials. The intervention can be audio or visual and presented on a screen adjacent to the user's workspace. Alternatively, intervention can be ignored and a log file of the source of errors is recorded for posterity and further review. The system can integrate with additional inputs such as barcode technology and additional sensors to link error logs throughout the assay process.
As mentioned above, the biological assay may comprise a biological assay for diagnosing Covid-19. Therefore, the system can be used for diagnostic protocols such as the preparation of a diagnostic assay for COVID-19. The routine used by the user for preparing such an assay may involve obtaining a sample contained in a plastic tube. The plastic tube may comprise an identifier of the sample in the form of a barcode. A swab may be present in the plastic tube. Since the plastic tube may have undergone transportation, it may have a lid on. The user performing the biological assay may remove the lid from the plastic tube and retain the lid. The user may then remove the swab from the tube using tweezers and dispose of the swab. The user may then transfer a reagent from a disposable plastic into the plastic tube. The user may then dispose of the disposable plastic and put the lid back on the plastic tube. The user may then place the plastic tube in a microplate. The user may perform this process multiple times leaving space in the microplate for controls. A guide or other form of indicator may be used to inform the user how the controls should be positioned in the microplate. The microplate can then be processed further to perform the necessary diagnostic tests.
Figure 6 is a flow chart showing a method, 600, in accordance with an example of the application.
In optional step 601, predetermined assay data may be stored. The predetermined assay data comprises a second list of equipment wherein the second list of equipment comprises a list of equipment that should be used during performance of one or more steps of the biological assay. In some examples, the second list of equipment may be ordered to reflect the order in which the user should interact with the pieces of equipment in the list. The predetermined assay data may be generated by a trainer or sample user as described above. Alternatively, the biological assay data may be programmed or generated in any other suitable way.
In step 602, the method comprises receiving a plurality of images of a user performing a biological assay. The plurality of images can be obtained from an image capture device and can span a first time period in which the user is performing the one or more steps of the biological assay.
In step 603, the method comprises extracting image data from the plurality of images. The extracting image data comprises identifying equipment and/or samples used by the user during the first time period. The equipment can be identified using a machine learning model as described above. The equipment can comprise laboratory equipment/tools and items used by the user during performance of the biological assay. Where applicable, the samples can be identified using identifiers. The identifiers may be identified from the image data. Alternatively, the user may have a separate scanner such as a barcode reader that can be used to identify the samples based on their identifiers.
At step 604, the extracted image data is stored as a first list of equipment. This list of equipment can comprise a list of all pieces of equipment the user interacted with either directly or indirectly as identified from the image data. The list of equipment can be ordered based on the order in which the user used the equipment. The order can be determined by processing the images sequentially when extracting image data. In some examples the first list of equipment also comprises the samples the user interacted with as identified from the image data or the separate scanner. In other examples, the list of samples is provided separately. In further examples, a list of samples is not included.
In some examples the method stops at step 604. The list of equipment can later be reviewed by the user as a log of the biological assay process. This enables the user to review the process and identify errors and/or positions of samples. However, in other examples, the method continues.
In optional step 605, the method comprises comparing the two lists of equipment so that the image data and the predetermined assay data are compared. This involves comparing the second list of equipment to the first list of equipment or vice versa. In step 606 any deviations or differences between the two lists of equipment can be determined. These deviations reflect errors in the performance of the biological assay. The deviation can reflect an item of equipment appearing on one list of equipment but not the other list of equipment. The deviation can also reflect the items of the two lists of equipment occurring in a different order. In either case, the deviation or difference can reflect the fact that the user has not correctly performed the procedure/steps involved in performing the biological assay.
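The comparison in steps 605 and 606 could be sketched as follows, returning the first point at which the observed list diverges from the predetermined list, whether by a differing item, a differing order, or a missing/extra item (the function name and result layout are illustrative assumptions):

```python
def compare_equipment_lists(expected, observed):
    """Return the first deviation between the predetermined (expected) and
    observed ordered lists of equipment, or None if they match."""
    for i, (e, o) in enumerate(zip(expected, observed)):
        if e != o:
            return {"step": i, "expected": e, "observed": o}
    if len(expected) != len(observed):
        # One list is a prefix of the other: an item is missing or extra.
        i = min(len(expected), len(observed))
        return {"step": i,
                "expected": expected[i] if i < len(expected) else None,
                "observed": observed[i] if i < len(observed) else None}
    return None
```

The returned step index and item pair could then drive the indication of error in step 607.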
In optional step 607, the method comprises providing an indication to the user that an error has been identified in performance of the biological assay. This indication can be provided when a deviation is determined between the second and first list of equipment. In some examples, the indication of an error is provided as soon as the deviation is noticed in the form of a notification in the workspace of the user. This gives the user an opportunity to correct the error. If the user successfully corrects the error, the first list of equipment can be amended to reflect this by removing from the list of equipment any pieces of equipment that were used trivially. In addition, if an item/sample/piece of equipment is moved from an incorrect to a correct position or section/unit/well of another piece of equipment in response to an error notification, the position of the item or piece of equipment can be amended in the first list of equipment. Alternatively, or in addition to providing an immediate indication of error, the error can be stored in a log and the indication can be provided to the user when the user is reviewing the biological assay. This notifies the user that the biological assay contains an error and hence any diagnostic test etc. represented by the biological assay may be inaccurate.
In some examples, a method of monitoring performance of a biological assay is provided. The method comprises 1) collecting image data from one or more steps of the assay, 2) comparing the image data to predetermined assay data and 3) identifying any deviation between the image data and the predetermined assay data, wherein the deviation identifies an error in the performance of the biological assay. In another example, a vision system is provided that automatically identifies and tracks the movement of biological materials, components, and equipment used in the performance of assays and compares this against a pre-programmed procedure to identify an erroneous or unfamiliar pattern of movement. This can involve a method of remotely tracking a component of a biological assay.
The methods described above may be carried out by a computer vision system or another suitable computing-based device. The computing-based device or computer vision system shown in Figure 7 comprises one or more processors 902 which are microprocessors, controllers, or any other suitable type of processors for processing computer executable instructions to control the operation of the device in order to perform the methods described above. In some examples, for example where a system on a chip architecture is used, the processors 902 include one or more fixed function blocks (also referred to as accelerators) which implement a part of the method described above in hardware (rather than software or firmware). Platform software comprising an operating system 904 or any other suitable platform software is provided at the computing-based device to enable application software 906 to be executed on the device.
The computer executable instructions are provided using any computer-readable media that is accessible by computing-based device 900. Computer-readable media includes, for example, computer storage media such as memory 908 and communications media. Computer storage media, such as memory 908, includes volatile and non-volatile, removable, and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or the like. Computer storage media includes, but is not limited to, random access memory (RAM), read only memory (ROM), erasable programmable read only memory (EPROM), electronic erasable programmable read only memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that is used to store information for access by a computing device. In contrast, communication media embody computer readable instructions, data structures, program modules, or the like in a modulated data signal, such as a carrier wave, or other transport mechanism. As defined herein, computer storage media does not include communication media. Therefore, a computer storage medium should not be interpreted to be a propagating signal per se. Although the computer storage media (memory 908) is shown within the computing-based device 900 it will be appreciated that the storage is, in some examples, distributed or located remotely and accessed via a network or other communication link (e.g., using communication interface 910).
The computing-based device 900 also comprises an input/output controller 912 arranged to output display information to a display device 914 which may be separate from or integral to the computing-based device 900. The display information may provide a graphical user interface. The input/output controller 912 is also arranged to receive and process input from one or more devices, such as a user input device 916 (e.g., a mouse, keyboard, camera, microphone, or other sensor). In some examples the user input device 916 detects voice input, user gestures or other user actions and provides a natural user interface (NUI). In an embodiment the display device 914 also acts as the user input device 916 if it is a touch sensitive display device. The input/output controller 912 outputs data to devices other than the display device in some examples, e.g., a locally connected printing device (not shown in FIG. 7).
Any of the input/output controller 912, display device 914 and the user input device 916 may comprise NUI technology which enables a user to interact with the computing-based device in a natural manner, free from artificial constraints imposed by input devices such as mice, keyboards, remote controls, and the like. Examples of NUI technology that are provided in some examples include but are not limited to those relying on voice and/or speech recognition, touch and/or stylus recognition (touch sensitive displays), gesture recognition both on screen and adjacent to the screen, air gestures, head and eye tracking, voice and speech, vision, touch, gestures, and machine intelligence. Other examples of NUI technology that are used in some examples include intention and goal understanding systems, motion gesture detection systems using depth cameras (such as stereoscopic camera systems, infrared camera systems, red green blue (rgb) camera systems and combinations of these), motion gesture detection using accelerometers/gyroscopes, facial recognition, three dimensional (3D) displays, head, eye and gaze tracking, immersive augmented reality and virtual reality systems and technologies for sensing brain activity using electric field sensing electrodes (electro encephalogram (EEG) and related methods).
The term 'computer' or 'computing-based device' is used herein to refer to any device with processing capability such that it executes instructions. Those skilled in the art will realize that such processing capabilities are incorporated into many different devices and therefore the terms 'computer' and 'computing-based device' each include personal computers (PCs), servers, mobile telephones (including smart phones), tablet computers, set-top boxes, media players, games consoles, personal digital assistants, wearable computers, and many other devices.
The methods described herein are performed, in some examples, by software in machine readable form on a tangible storage medium e.g., in the form of a computer program comprising computer program code means adapted to perform all the operations of one or more of the methods described herein when the program is run on a computer and where the computer program may be embodied on a computer readable medium. The software is suitable for execution on a parallel processor or a serial processor such that the method operations may be carried out in any suitable order, or simultaneously.
Those skilled in the art will realize that storage devices utilized to store program instructions are optionally distributed across a network. For example, a remote computer is able to store an example of the process described as software. A local or terminal computer is able to access the remote computer and download a part or all of the software to run the program. Alternatively, the local computer may download pieces of the software as needed or execute some software instructions at the local terminal and some at the remote computer (or computer network). Those skilled in the art will also realize that, by utilizing conventional techniques known to those skilled in the art, all or a portion of the software instructions may be carried out by a dedicated circuit, such as a digital signal processor (DSP), programmable logic array, or the like.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
It will be understood that the benefits and advantages described above may relate to one embodiment or may relate to several embodiments. The embodiments are not limited to those that solve any or all of the stated problems or those that have any or all of the stated benefits and advantages. It will further be understood that reference to 'an' item refers to one or more of those items.
The operations of the methods described herein may be carried out in any suitable order, or simultaneously where appropriate. Additionally, individual blocks may be deleted from any of the methods without departing from the scope of the subject matter described herein. Aspects of any of the examples described above may be combined with aspects of any of the other examples described to form further examples without losing the effect sought.
It will be understood that the above description is given by way of example only and that various modifications may be made by those skilled in the art. The above specification, examples and data provide a complete description of the structure and use of exemplary embodiments. Although various embodiments have been described above with a certain degree of particularity, or with reference to one or more individual embodiments, those skilled in the art could make numerous alterations to the disclosed embodiments without departing from the scope of this specification.
Where the description has explicitly disclosed in isolation some individual features, any apparent combination of two or more such features is considered also to be disclosed, to the extent that such features or combinations are apparent and capable of being carried out based on the present specification as a whole in the light of the common general knowledge of a person skilled in the art, irrespective of whether such features or combinations of features solve any problems disclosed herein. In view of the foregoing description, it will be evident to a person skilled in the art that various modifications may be made within the scope of the invention.

Claims (25)

  1. A computer implemented method for providing a record of user performance of one or more steps of a biological assay, the method comprising: receiving, from an image capture device, a first plurality of images of the user performing the biological assay, wherein the first plurality of images span a first time period in which the user is performing the one or more steps of the biological assay; extracting image data from the first plurality of images, wherein extracting the image data comprises identifying equipment and/or samples used by the user during the first time period; and storing the extracted image data as a first list of equipment.
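The pipeline of claim 1 (frames in, an ordered record of equipment out) can be sketched as follows. This is an illustrative sketch only: the per-frame detection itself is assumed to come from a trained object detector, and here each frame is reduced to a list of detected labels.

```python
def build_equipment_list(frame_detections):
    """Collapse per-frame detections into an ordered, de-duplicated
    "first list of equipment", recording each item of equipment the
    first time it appears across the captured frames."""
    seen = set()
    first_list = []
    for labels in frame_detections:
        for label in labels:
            if label not in seen:
                seen.add(label)
                first_list.append(label)
    return first_list

frames = [["pipette"], ["pipette", "microplate"], ["reagent tube"]]
print(build_equipment_list(frames))  # ['pipette', 'microplate', 'reagent tube']
```

Preserving first-appearance order is what later allows the recorded list to be compared against an expected order of use.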
  2. The method of claim 1 further comprising identifying errors in user performance of the one or more steps of the biological assay by: storing predetermined assay data wherein the predetermined assay data comprises a second list of equipment, the second list of equipment comprising a list of equipment that should be used during performance of the one or more steps; comparing the image data to the predetermined assay data by comparing the second list of equipment with the first list of equipment; identifying, based on the comparison, a deviation between the second list of equipment and the first list of equipment, wherein the deviation identifies an error in the performance of the biological assay; and providing an indication to the user that an error has been identified in performance of the biological assay.
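The comparison step of claim 2 amounts to a set difference between the observed and expected equipment lists. A minimal sketch, with invented names (not the claimed implementation):

```python
def find_deviations(first_list, second_list):
    """Compare the observed "first list" with the predetermined
    "second list". Returns (missing, unexpected): equipment that
    should have been used but was not, and equipment used that
    was not expected."""
    missing = [e for e in second_list if e not in first_list]
    unexpected = [e for e in first_list if e not in second_list]
    return missing, unexpected

used = ["pipette", "beaker"]
expected = ["pipette", "microplate"]
missing, unexpected = find_deviations(used, expected)
if missing or unexpected:
    print(f"Error: missing {missing}, unexpected {unexpected}")
```

Any non-empty result would trigger the indication to the user described in the claim.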
  3. The method of claim 2 wherein: the extracting the image data further comprises identifying an order in which the user used the identified equipment and/or samples; the second list of equipment comprises an order in which the equipment should be used during performance of the one or more steps; and identifying a deviation between the second list of equipment and the first list of equipment comprises identifying either a deviation in an item of equipment used or a deviation in an order of equipment used between the second list of equipment and the first list of equipment.
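The order check added by claim 3 can be sketched as a positional comparison of the two sequences, reporting the first point of divergence. Purely illustrative; the function name is not from the specification.

```python
def first_order_deviation(observed, expected):
    """Return (index, observed_item, expected_item) at the first
    position where the observed usage order diverges from the
    expected order, or None if the sequences agree. A missing item
    at a position is reported as None."""
    for i in range(max(len(observed), len(expected))):
        o = observed[i] if i < len(observed) else None
        e = expected[i] if i < len(expected) else None
        if o != e:
            return (i, o, e)
    return None

print(first_order_deviation(["pipette", "beaker"], ["pipette", "microplate"]))
# (1, 'beaker', 'microplate')
```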
  4. The method of claim 2 or claim 3, wherein: the user performing the biological assay comprises the user repeating a set of steps multiple times wherein each repetition of the set of steps comprises the user performing the set of steps on a different sample; the predetermined assay data comprises a list of equipment that should be used during each performance of the set of steps; and comparing the image data to the predetermined assay data comprises comparing each user repetition of the set of steps with the predetermined assay data.
  5. The method of any of claims 2 to 4 further comprising: generating the predetermined assay data wherein generating the predetermined assay data comprises: receiving, from the image capture device, a second plurality of images of a trainer performing the biological assay, wherein the second plurality of images span a second time period in which the trainer is performing the one or more steps of the biological assay; and extracting the predetermined assay data from the second plurality of images.
  6. The method of any of claims 2 to 5 wherein: the second list of equipment comprises desired values for any measurements or readouts of the equipment to be used by the user; extracting image data from the first plurality of images further comprises extracting from the first plurality of images actual values for any measurements or readouts provided by the equipment used by the user; and comparing the image data to the predetermined assay data further comprises comparing the desired values for any measurements or readouts to the actual values for any measurements or readouts.
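The readout comparison of claim 6 could be realised as a tolerance check between extracted and desired values. The relative tolerance and field names below are assumptions for illustration, not values from the specification.

```python
def readouts_out_of_tolerance(actual, desired, rel_tol=0.05):
    """Return the names of readouts whose actual value is missing or
    deviates from the desired value by more than rel_tol (relative)."""
    bad = []
    for name, want in desired.items():
        got = actual.get(name)
        if got is None or abs(got - want) > rel_tol * abs(want):
            bad.append(name)
    return bad

desired = {"volume_ul": 100.0, "temp_c": 37.0}
actual = {"volume_ul": 103.0, "temp_c": 41.0}  # e.g. read off a display by OCR
print(readouts_out_of_tolerance(actual, desired))  # ['temp_c']
```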
  7. The method of any of claims 2 to 6 wherein: the predetermined assay data comprises information identifying a microplate comprising a plurality of wells and information identifying a third well of the microplate wherein the third well of the microplate comprises a well for containing a control; the equipment comprises a sample and a microplate; user performance of the one or more steps of the biological assay comprises the user transferring the sample to a first well of the microplate; extracting the image data comprises identifying the microplate and the first well of the microplate; comparing the second list of equipment with the first list of equipment comprises comparing the identification of the first well of the microplate with the information identifying the third well of the microplate; and identifying a deviation between the second list of equipment and the first list of equipment comprises identifying that the first well of the microplate and the third well of the microplate are the same.
  8. The method of any previous claim wherein: the equipment comprises a sample and a microplate wherein the microplate comprises a plurality of wells arranged in a grid; user performance of the one or more steps of the biological assay comprises the user transferring the sample to a first well of the microplate; and extracting the image data comprises identifying the sample, the microplate and the first well of the microplate.
  9. The method of any of claims 1 to 8 wherein: the equipment further comprises a sample, a second sample and a microplate wherein the microplate comprises a plurality of wells arranged in a grid; user performance of the one or more steps of the biological assay comprises the user transferring the sample to a first well of the microplate and the second sample to a second well of the microplate; extracting the image data further comprises identifying the microplate, the first well of the microplate and the second well of the microplate; and the method further comprises: comparing the identification of the first well of the microplate with the identification of the second well of the microplate; and if the first well of the microplate and the second well of the microplate are the same providing an indication to the user that an error has been identified in the performance of the biological assay.
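The double-dispense check of claim 9 (and the control-well check of claim 7) reduce to tracking which well each observed transfer targets and flagging collisions. A hypothetical sketch, not the claimed implementation:

```python
def find_shared_wells(transfers):
    """transfers: list of (sample_id, well_id) pairs observed from the
    images. Returns a dict of wells that received more than one
    sample, which would trigger an error indication to the user."""
    wells = {}
    for sample, well in transfers:
        wells.setdefault(well, []).append(sample)
    return {w: s for w, s in wells.items() if len(s) > 1}

transfers = [("sample-1", "A1"), ("sample-2", "A1"), ("sample-3", "B1")]
print(find_shared_wells(transfers))  # {'A1': ['sample-1', 'sample-2']}
```

The same structure covers claim 7 by treating the control well as a reserved "well" that no sample transfer may target.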
  10. The method of any of claims 2 to 9 wherein providing an indication to the user that an error has been identified in the performance of the biological assay comprises: providing an audio or visual alert to the user that an error has been detected while the user is performing the biological assay; and/or storing the indication in the first list of equipment.
  11. The method of any previous claim wherein: the equipment used by the user comprises a guide that aids the user in performing the biological assay; and the method further comprises excluding the guide from the first list of equipment.
  12. The method of any previous claim wherein extracting image data from the first plurality of images comprises: identifying the equipment used by the user using a machine learning model that has been trained to identify laboratory equipment.
  13. The method of claim 12 wherein the machine learning model is a classifier that has been trained to identify laboratory equipment.
  14. The method of claim 13 wherein the classifier was trained using supervised learning by providing the classifier with images of laboratory equipment and information identifying the laboratory equipment.
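Claims 12 to 14 describe a supervised classifier: fit on labelled examples, then predict on new observations. A production system would use a trained image model; the toy nearest-centroid classifier below, over made-up feature vectors, only illustrates that train/predict pattern and is in no way the claimed model.

```python
def train_centroids(features, labels):
    """Supervised "training": average the feature vectors per label."""
    sums, counts = {}, {}
    for vec, lab in zip(features, labels):
        acc = sums.setdefault(lab, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[lab] = counts.get(lab, 0) + 1
    return {lab: [v / counts[lab] for v in acc] for lab, acc in sums.items()}

def classify(vec, centroids):
    """Predict the label whose centroid is nearest (squared distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda lab: dist(vec, centroids[lab]))

# Two invented feature dimensions (e.g. aspect ratio, mean brightness).
train_x = [[0.1, 0.9], [0.2, 0.8], [0.9, 0.2], [0.8, 0.1]]
train_y = ["pipette", "pipette", "microplate", "microplate"]
centroids = train_centroids(train_x, train_y)
print(classify([0.15, 0.85], centroids))  # pipette
```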
  15. The method of any previous claim further comprising: identifying whether the user is wearing personal protective equipment; and providing an indication to the user that an error has been identified in performance of the biological assay if the user is not wearing personal protective equipment.
  16. The method of claim 15 wherein: the personal protective equipment comprises gloves wherein the gloves have a colour that differs from human skin colours; and identifying whether the user is wearing personal protective equipment comprises detecting a presence or lack of presence of the gloves based on whether hands of the user are the same colour as the gloves.
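The colour-based glove check of claim 16 could be sketched as comparing the mean colour of a detected hand region with the known glove colour. The RGB values and distance threshold below are invented for illustration.

```python
def mean_rgb(pixels):
    """Average an iterable of (r, g, b) tuples."""
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

def wearing_gloves(hand_pixels, glove_rgb, max_dist=60.0):
    """True if the average colour of the hand region is within
    max_dist (Euclidean, in RGB space) of the glove colour."""
    mr, mg, mb = mean_rgb(hand_pixels)
    gr, gg, gb = glove_rgb
    dist = ((mr - gr) ** 2 + (mg - gg) ** 2 + (mb - gb) ** 2) ** 0.5
    return dist <= max_dist

blue_glove = (60, 80, 200)
gloved_hand = [(58, 82, 195), (62, 79, 205)]
bare_hand = [(220, 180, 160), (210, 170, 150)]
print(wearing_gloves(gloved_hand, blue_glove))  # True
print(wearing_gloves(bare_hand, blue_glove))    # False
```

In practice a perceptual colour space such as HSV would likely be more robust to lighting changes than raw RGB distance.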
  17. The method of any previous claim wherein extracting image data from the first plurality of images comprises: identifying the samples used by the user wherein the samples are identified using an identifier present on each sample and optionally wherein the identifiers use a health level 7 communication format.
  18. The method of claim 17 wherein the identifiers present on the samples comprise either barcodes or QR codes.
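Once a barcode or QR code on a sample has been decoded (the decoding itself would come from an image-based decoder library), the payload must be parsed into sample metadata. The pipe-delimited layout below is loosely inspired by HL7-style field separators but is an assumption for illustration, not a format taken from the specification.

```python
def parse_sample_id(payload):
    """Split a decoded identifier of the assumed form
    'sample_id|assay|collected_date' into a dict."""
    fields = payload.split("|")
    keys = ("sample_id", "assay", "collected")
    return dict(zip(keys, fields))

decoded = "S-0042|COVID-19|2021-11-15"  # hypothetical decoded QR payload
print(parse_sample_id(decoded))
# {'sample_id': 'S-0042', 'assay': 'COVID-19', 'collected': '2021-11-15'}
```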
  19. The method of any previous claim wherein the first list of equipment and/or the second list of equipment used by the user/trainer includes any equipment the user/trainer interacts with via another piece of equipment.
  20. The method of any previous claim wherein the biological assay comprises a diagnostic assay for COVID-19.
  21. The method of claim 20 wherein performing the biological assay comprises, for each sample of one or more samples: obtaining the sample wherein the sample is contained in a plastic tube, the plastic tube contains a swab, and a lid of the plastic tube is on the plastic tube; removing the lid from the plastic tube and retaining the lid; removing the swab from the plastic tube using tweezers and disposing of the swab in a waste container; using an accurate liquid handling tool, transferring a reagent from a disposable plastic into the plastic tube; disposing of the disposable plastic in the waste container; putting the retained lid on the plastic tube; and placing the plastic tube in a well of a microplate.
  22. The method of any previous claim further comprising: starting to receive the first plurality of images from the image capture device in response to a motion detector detecting motion in the vicinity of the image capture device.
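The motion trigger of claim 22 could be realised in software by inter-frame differencing: capture begins once the difference between consecutive frames exceeds a threshold. The tiny grayscale grids and the threshold below are purely illustrative.

```python
def motion_detected(prev, curr, threshold=10.0):
    """True if the mean absolute pixel difference between two
    equal-sized grayscale frames exceeds the threshold."""
    diffs = [abs(a - b)
             for row_p, row_c in zip(prev, curr)
             for a, b in zip(row_p, row_c)]
    return sum(diffs) / len(diffs) > threshold

still = [[10, 10], [10, 10]]
moved = [[10, 80], [90, 10]]
print(motion_detected(still, still))  # False
print(motion_detected(still, moved))  # True
```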
  23. The method of any previous claim wherein the image capture device comprises one or more video cameras wherein each video camera of the one or more video cameras provides a different view of the user performing the biological assay.
  24. A computer vision system for identifying errors in user performance of a biological assay, the computer vision system comprising: an image capture device configured to capture a plurality of images; a processor; and a memory, the memory comprising instructions that when executed cause the processor to perform the method of any of claims 1 to 23.
  25. A non-transitory computer readable storage medium containing instructions that when executed cause a processor to perform a method according to any of claims 1 to 23.
GB2116418.1A 2021-11-15 2021-11-15 Computer vision system for a biological assay Pending GB2612967A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2116418.1A GB2612967A (en) 2021-11-15 2021-11-15 Computer vision system for a biological assay


Publications (2)

Publication Number Publication Date
GB202116418D0 GB202116418D0 (en) 2021-12-29
GB2612967A true GB2612967A (en) 2023-05-24

Family

ID=79163548


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150209114A1 (en) * 2014-01-29 2015-07-30 Becton, Dickinson And Company System and Method for Collection Confirmation and Sample Tracking at the Clinical Point of Use
US20180301014A1 (en) * 2017-04-12 2018-10-18 Disney Enterprises, Inc. System and method for monitoring procedure compliance
WO2020099424A1 (en) * 2018-11-15 2020-05-22 Global Life Sciences Solutions Usa Llc Method and system for monitoring a set-up for manufacture of a biopharmaceutical product


