US11783563B2 - Software and algorithms for use in remote assessment of disease diagnostics - Google Patents
- Publication number
- US11783563B2 (application Ser. No. US 17/333,978)
- Authority
- US
- United States
- Prior art keywords
- image
- computer
- implemented method
- interest
- images
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/17—Image acquisition using hand-held instruments
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
- G06T7/0014—Biomedical image inspection using an image reference approach
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/12—Edge-based segmentation
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/168—Segmentation; Edge detection involving transform domain methods
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/25—Determination of region of interest [ROI] or a volume of interest [VOI]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30168—Image quality inspection
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/24—Aligning, centring, orientation detection or correction of the image
- G06V10/247—Aligning, centring, orientation detection or correction of the image by affine transforms, e.g. correction due to perspective effects; Quadrilaterals, e.g. trapezoids
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A90/00—Technologies having an indirect contribution to adaptation to climate change
- Y02A90/10—Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Definitions
- the present disclosure generally relates to devices and methods for home testing, telemedicine applications and other in-situ immunoassay measurements. More specifically, the present disclosure relates to methods for processing data collected by client devices in conjunction with consumables available to users in a simple and accurate procedure to assess a disease diagnostic, locally and/or remotely.
- test samples, also referred to as “sample cartridges,” are shipped back and forth (before use of the test sample) between the medical provider (e.g., clinic, physician, pharmacy), the laboratory, and the user.
- these test samples tend to cause delays in clinical laboratories, many times unnecessarily (as many samples may be negative).
- time lag between test and result may be a potential hazard, e.g., for epidemic or pandemic emergencies, when the outcome of treatment of a serious condition is dramatically impacted by the time of start of a therapy, or when an infected user leaves the office without an immediate result, neglecting follow-up and proceeding to infect others.
- FIG. 1 illustrates an architecture including a remote server, a database, and an image-capturing device to collect an image from a test cartridge in an enclosure, according to some embodiments.
- FIG. 2 illustrates details in devices used in the architecture of FIG. 1 .
- FIG. 3 illustrates an architecture of a convolutional neural network to provide a disease diagnostic from one or more images from a sample assay, according to some embodiments.
- FIGS. 4 A-B illustrate a convolution operation between two layers in a convolutional neural network, according to some embodiments.
- FIGS. 5 A-B illustrate the content of multiple nodes in multiple layers in a convolutional neural network, according to some embodiments.
- FIG. 6 illustrates a table of results for an assay diagnosis including two targets, according to some embodiments.
- FIGS. 7 A-B illustrate sensitivity and selectivity results for an assay diagnosis including two targets, according to some embodiments.
- FIG. 8 is a flow chart illustrating steps in a method for determining the presence or absence of an analyte of interest, according to some embodiments.
- FIG. 9 is a flow chart illustrating steps in a method for determining the presence or absence of an analyte of interest, according to some embodiments.
- FIG. 10 is a flow chart illustrating steps in a method for determining the presence or absence of an analyte of interest, according to some embodiments.
- FIG. 11 is a block diagram illustrating an example computer system with which the client and server of FIGS. 1 and 2 and the methods of FIGS. 7 - 11 can be implemented.
- immunoassays designed for the detection of chemical and biological agents or pathogens may include security tests and screening (e.g., at airports, police and military checkpoints), or environmental analysis and monitoring (e.g., air pollution, contamination of waterways and reservoirs for disease control or agricultural production, and the like).
- Embodiments consistent with the present disclosure take advantage of the high image-capturing and processing capabilities of current consumer appliances to provide simple yet accurate diagnostic procedures for selected diseases or infections (e.g., legionella, influenza, Ebola, Lyme disease, myocardial infarction, Strep A, respiratory syncytial virus, human metapneumovirus, SARS-CoV2, and the like).
- the types of tests consistent with embodiments in the present disclosure may include any type of spectroscopic analysis of test assays using electromagnetic radiation, such as, without limitation, absorption spectroscopy (ultra-violet, visible, or infrared) including reflectance or transmittance spectroscopy, or emission spectroscopy, including fluorescence and luminescence spectroscopy, Raman spectroscopy, and any type of radiation scattering.
- embodiments as disclosed herein may further exploit the networking capabilities of such appliances to enhance the processing capabilities of each test by using cloud-computing solutions.
- a high quality (e.g., high spatial and spectral resolution) image, sequence of images, or video is uploaded to a remote server that can perform massively parallel computations to provide, in a reduced time, a diagnostic result.
- analyzed material may be processed immediately, at a later date/time, and/or may be compared to previously collected materials to determine differences over time, e.g., a time evolution of the analyte across a test strip.
- the ability to collect and compile data libraries may enable the generation of self-teaching algorithms (Artificial Intelligence or Machine Learning algorithms) from the analysis of such image libraries to generate initial versions and improved versions as the size and diversity of such libraries increases.
- the subject system provides several advantages, including the ability for a user to quickly learn whether a disease is present or latent, without the need to access specialized personnel, or a complex machine or instrument.
- Some embodiments provide the advantage of widely broadening the market for medical test kits, as consumers who have wide access to image-capturing devices in the form of mobile computing devices and other appliances, may desire to perform tests even before perceiving any symptoms or going to a doctor or clinic. This also may provide the advantage of a screening step before people attend clinics or saturate the resources of a given medical facility. Further, the cost of a test for a remote user of methods as disclosed herein may be substantially lower than the cost associated with a visit to a clinic or laboratory, including waiting times, scheduling, taking an appointment away from a truly infected patient, or exposing a healthy patient to a waiting room full of sick people.
- the proposed solution further provides improvements to the functioning of computers (e.g., the server or a user mobile device) because it saves data storage space and interaction time by enabling a remote transmission of image analysis data and results (e.g., pictures, sequences of pictures, and/or videos).
- each user may grant explicit permission for such user information to be shared or stored.
- the explicit permission may be granted using privacy controls integrated into the disclosed system.
- Each user may be provided notice that such user information will be shared with explicit consent, and each user may at any time end the information sharing, and may delete any stored user information.
- the stored user information may be encrypted to protect user security and identity.
- FIG. 1 illustrates an architecture 10 including a remote server 130 , a database 152 , a client device 110 and an image-capturing device 100 A to collect an image or video from a test cartridge 101 , according to some embodiments.
- Client device 110 may include a smartphone or other mobile computing device (e.g., tablet, pad, or even laptop). In some embodiments, more than one image-capturing device can be controlled using one mobile computing device.
- Architecture 10 provides, in real time, an accurate assessment as to the presence or absence of one or more target analytes in a test sample from an assay result.
- the assay may be run in test cartridge 101 , and may include an immunoassay for detecting the one or more analytes of interest in a biological sample.
- Test cartridge 101 may provide a substrate for flowing the biological sample on multiple test channels for detection of 1-20 analytes of interest (or more). Images of the assay as it progresses may be provided by image-capturing device 100 A communicatively coupled with client device 110 .
- Test cartridge 101 in one embodiment, is an immunoassay test strip enclosed in a housing or cartridge to ease its handling. In other embodiments, test cartridge 101 is simply an immunoassay test strip, such as a dip stick. That is, an external housing is optional, and if present, need not be a cartridge or cassette housing but can be a flexible laminate, such as that disclosed in U.S. Patent Application Publication No. 2009/02263854 and shown in Design Pat. No. D606664.
- An immunoassay test strip in one embodiment, comprises in sequence, a sample pad, a label pad, one or more lines or bands selected from a test line, a control line and a reference line, and an absorbent pad.
- a support member is present, and each or some of the sample pad, label pad, lines, and absorbent pad are disposed on the support member.
- Exemplary immunoassay test strips are described, for example, in U.S. Pat. Nos. 9,207,181; 9,989,466; and 10,168,329 and in U.S. Publication Nos. 2017/0059566 and 2018/0229232, each of which is incorporated by reference herein. Additional details on immunoassay test strips are provided infra.
- the assay is an immunoassay including reagents for detection of an infectious agent (e.g., a virus or a bacterium) in the biological sample.
- the immunoassay may include reagents for detection of a protein, including antibodies against specific analytes, a small-molecule biomarker, or an autoantibody.
- the analytes of interest are detectable by emission of a unique signal associated with each analyte selected from the analytes of interest.
- the biological sample includes a body fluid (e.g., blood, serum, plasma, sputum, mucus, saliva, tear, feces, or urine).
- architecture 10 includes a user of client device 110 who has ordered a kit including test cartridge 101 and image-capturing device 100 A and is ready to perform a personal test for a disease or condition remotely from a hospital or clinic or any other location (e.g., at home, in a pharmacy, retail store, doctor's office, and the like).
- image-capturing device 100 A includes an enclosure 120 to prevent ambient light from perturbing or interfering with the measurement.
- image-capturing device 100 A wirelessly transmits an image of test cartridge 101 to client device 110 .
- Client device 110 then may transmit the image or video to a remote server 130 , to database 152 , or both, via network 150 , for processing.
- image-capturing device 100 A and/or client device 110 may perform one or more operations on the image or on one or more image frames from a video using processors 112 - 1 and/or 112 - 2 , respectively (hereinafter, collectively referred to as “processors 112 ”), before transmitting the image to server 130 or to database 152 .
- client device 110 may perform at least one or more quality control steps over the one or more images provided by image-capturing device 100 A before transmitting to server 130 .
- client device 110 may obtain a preliminary or a definitive diagnostic based on the analysis of the image of test cartridge 101 . Accordingly, in some embodiments, client device 110 may transmit the preliminary or definitive diagnostic to server 130 with or without an image of test cartridge 101 .
- processors 112 may execute instructions and collect or save data, the instructions and data stored in a memory 132 - 1 (in image-capturing device 100 A) or in a memory 132 - 2 (in client device 110 ).
- Client device 110 communicates with image-capturing device 100 A via a signal 160 - 1 and/or with server 130 via a signal 160 - 2 , using a communication module 118 - 2 .
- signal 160 - 1 includes a transmittable file generated by processor 112 - 1 , including data from an array sensor collecting an image from test cartridge 101 .
- signal 160 - 2 may include a diagnostic of the assay based on image analysis of the transmittable file.
- Image-capturing device 100 A may communicate with client device 110 through a communication module 118 - 1 .
- Signals 160 - 1 and 160 - 2 may be digital or analog signals, wireless signals, radio-frequency (RF) signals, electrical signals, Ethernet signals, and the like.
- Communication modules 118 - 1 and 118 - 2 will be collectively referred to, hereinafter, as “communication modules 118 .”
- Communication modules 118 may include hardware and software associated with RF antennas for communication via WiFi, Bluetooth (e.g., Bluetooth Low Energy, BLE), or near-field communication (NFC) protocols.
- any one of signals 160 may be encrypted and/or encoded for security purposes.
- image-capturing device 100 A may include a sensor array 140 and an optics coupling mechanism 115 (e.g., a lens system with autofocus capabilities).
- Sensor array 140 may collect one or more images of test cartridge 101 at a desired frame rate, to form a video.
- sensor array 140 may collect a single image of test cartridge 101 (e.g., after an assay has run its course), or more than one image (e.g., before and after an assay runs its course).
- sensor array 140 may collect multiple images of test cartridge 101 at a pre-selected frequency rate (e.g., as the test cassette is running). The frequency rate may be adjusted, modified, accelerated, or slowed, based on preliminary or quality control tests performed by client device 110 .
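The adaptive frame-rate behavior described above can be sketched as a simple scheduler that speeds up capture when quality-control scores are poor and backs off when frames are consistently good. This is a minimal illustration; the class name, thresholds, and interval values are assumptions, not taken from the disclosure.

```python
# Hypothetical capture scheduler: the interval between frames shrinks when a
# quality-control (QC) score is poor and grows when quality is high.
class CaptureScheduler:
    def __init__(self, interval_s: float = 10.0,
                 min_interval_s: float = 2.0, max_interval_s: float = 60.0):
        self.interval_s = interval_s
        self.min_interval_s = min_interval_s
        self.max_interval_s = max_interval_s

    def update(self, qc_score: float) -> float:
        """Accelerate on poor frames (more chances to recover a usable one);
        slow down when quality is consistently good."""
        if qc_score < 0.5:      # poor frame: halve the interval
            self.interval_s = max(self.min_interval_s, self.interval_s / 2)
        elif qc_score > 0.9:    # excellent frame: back off
            self.interval_s = min(self.max_interval_s, self.interval_s * 1.5)
        return self.interval_s

scheduler = CaptureScheduler()
print(scheduler.update(0.3))   # poor quality -> 5.0
print(scheduler.update(0.95))  # good quality -> 7.5
```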
- Remote server 130 may provide support for an image-capturing application 122 installed in memory 132 - 2 of client device 110 .
- the support may include update installation, retrieval of raw data (e.g., pictures, sequences of pictures and videos) for storage in database 152 , image processing, and the like.
- Image-capturing application 122 may include commands and instructions to control image-capturing device 100 A.
- Image-capturing application 122 may also include commands and instructions to perform at least a partial analysis of the one or more images provided by image-capturing device 100 A.
- the instructions in image-capturing application 122 may include a neural network (NN), artificial intelligence (AI), or a machine learning (ML) algorithm to assess a diagnostic based on the one or more images of test cartridge 101 .
- image-capturing application 122 may include instructions to assess a quality control of the one or more images provided by image-capturing device 100 A, based on sensor data indicative of the positioning of test cartridge 101 within enclosure 120 .
- the sensor data may be provided by sensors disposed within enclosure 120 .
- client device 110 may further include an image-capturing device 100 B to collect an image of a fiduciary label 105 on test cartridge 101 .
- image-capturing application 122 may incorporate the image of a label 105 on test cartridge 101 into a measurement protocol.
- the measurement protocol may be transmitted by client device 110 to server 130 and/or to database 152 , where metadata associated with sampling cartridge 101 may be correlated with information stored therein.
- the metadata in fiduciary label 105 may be correlated with a user ID and with an assay identification code (e.g., flu test, Lyme disease test, pregnancy test, hepatitis, or any other disease or assay).
- image-capturing devices 100 A and 100 B will be collectively referred to as “image-capturing devices 100 .”
- image-capturing application 122 may also include instructions for the user as to the mode of use and a measurement protocol for test cartridge 101 .
- the instructions may illustrate to the user, step by step, how to collect a sample (e.g., using a swab or other extraction mechanism), mix the sample with appropriate reagents, and provide at least a portion of the sample into test cartridge 101 .
- image-capturing application 122 may display the instructions and other illustrative icons to the user on a display 116 of client device 110 .
- FIG. 2 illustrates an example server 130 and client device 110 in architecture 10 (cf. FIG. 1 ), according to certain aspects of the disclosure.
- Client device 110 and server 130 are communicatively coupled over network 150 via respective communications modules 218 - 1 and 218 - 2 (hereinafter, collectively referred to as “communications modules 218 ”).
- Communications modules 218 interface with network 150 to send and receive information, such as data, requests, responses, and commands to other devices on the network.
- Communications modules 218 can be, for example, modems or Ethernet cards and other RF hardware and software (e.g., antennas, modulators, demodulators, phase-locked loops, digital-to-analog converters, digital signal processors, and the like).
- a user may interact with client device 110 via an input device 214 and an output device 216 .
- Input device 214 may include a mouse, a keyboard, a pointer, a touchscreen, a microphone, and the like.
- Output device 216 may be a screen display, a touchscreen, a speaker, and the like.
- Client device 110 may include a memory 232 - 1 and a processor 212 - 1 .
- Memory 232 - 1 may include an application 222 , configured to run in client device 110 .
- Application 222 may be downloaded by the user from server 130 , and may be hosted by server 130 .
- Server 130 includes a memory 232 - 2 , a processor 212 - 2 , and communications module 218 - 2 .
- processors 212 - 1 and 212 - 2 , and memories 232 - 1 and 232 - 2 will be collectively referred to, respectively, as “processors 212 ” and “memories 232 .”
- Processors 212 are configured to execute instructions stored in memories 232 .
- memory 232 - 2 includes a diagnostic engine 240 . Diagnostic engine 240 may share or provide features and resources to application 222 , including tools associated with image-processing and predictive analysis. The user may access diagnostic engine 240 through application 222 or a web browser installed in a memory 232 - 1 of client device 110 . Accordingly, application 222 may be installed by server 130 and perform scripts and other routines provided by server 130 through any one of multiple tools. Execution of application 222 may be controlled by processor 212 - 1 .
- diagnostic engine 240 may include an assay reading tool 242 and an image processing tool 244 . Diagnostic engine 240 may have access to a history of assay images and/or videos collected from other users, including diagnostic results, stored in a database 252 (e.g., through network 150 ). In some embodiments, diagnostic engine 240 , the tools contained therein, and at least part of database 252 may be hosted in a different server communicatively coupled with server 130 .
- application 222 may include a diagnostic model 225 , which may be a portion or a simplified version of diagnostic engine 240 , stored in memory 232 - 1 and executed by processor 212 - 1 .
- Diagnostic model 225 may provide a classification of the input image of the assay as “Positive” (target analyte detected above a sensitivity value), “Negative” (target analyte not detected above a sensitivity value), or “Invalid” (measurement error, low SNR, un-calibrated measurement, and the like).
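The three-way call described above can be sketched as follows. The threshold values and the signal/SNR field names are illustrative assumptions, not parameters from the patent.

```python
# Hypothetical Positive/Negative/Invalid classifier for a single target line.
def classify(signal: float, snr: float,
             sensitivity: float = 0.2, min_snr: float = 3.0) -> str:
    if snr < min_snr:
        return "Invalid"  # measurement error, low SNR, un-calibrated reading
    return "Positive" if signal >= sensitivity else "Negative"

print(classify(signal=0.45, snr=12.0))  # Positive
print(classify(signal=0.05, snr=12.0))  # Negative
print(classify(signal=0.45, snr=1.0))   # Invalid
```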
- diagnostic model 225 may be a faster and simplified version of diagnostic engine 240 that enables the user of client device 110 to more quickly reach a decision regarding the progress of an assay in the test cartridge and/or monitor general function of the device including, but not limited to, temperature range, light leakage, and the like. This may be helpful in remote locations where network 150 may be unreliable or where connectivity is slow, noisy, sporadic, or lost.
- more than one diagnostic model may be used to provide a “Positive,” “Negative,” or “Invalid” classification.
- the models may be developed based on the same images.
- each of the multiple models possess unique characteristics for classification determination.
- a “Positive,” “Negative,” or “Invalid” classification may be more robust and less prone to false positive or false negative results due to the classification being based on multiple models.
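One way to combine multiple models, as described above, is a majority vote over their individual calls. The voting rule here is an illustrative assumption; the disclosure only states that basing the classification on multiple models with unique characteristics makes it more robust.

```python
from collections import Counter

# Hypothetical ensemble: each model emits "Positive"/"Negative"/"Invalid";
# a strict majority decides, otherwise the run is treated as Invalid.
def ensemble_call(calls: list[str]) -> str:
    winner, n = Counter(calls).most_common(1)[0]
    return winner if n > len(calls) / 2 else "Invalid"

print(ensemble_call(["Positive", "Positive", "Negative"]))  # Positive
print(ensemble_call(["Positive", "Negative", "Invalid"]))   # Invalid
```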
- FIG. 3 illustrates an architecture of a convolutional neural network (CNN) 300 to provide a disease diagnostic from one or more images from a sample assay, according to some embodiments.
- CNN 300 may include a deep neural network (DNN) applied to analyzing visual imagery.
- CNN 300 includes a regularized version of a fully connected network in which each neuron in one layer is connected to all neurons in the next layer.
- CNN 300 may include an input layer 301 and an output layer 302 .
- Input layer 301 may include a pixelated image 303 or even a collection of pixelated images forming a volume (e.g., an image of a test assay or multiple images of an evolving test assay, or a video).
- image 303 may include three-dimensional pixels, or ‘voxels.’
- CNN 300 also includes multiple hidden layers 310 - 1 , 310 - 2 , 310 - 3 , 310 - 4 , and 310 - 5 (hereinafter, collectively referred to as “hidden layers 310 ”) linking input layer 301 with output layer 302 .
- Output layer 302 may be a binary answer to the question of whether one or more of the target analytes are present in the sample, according to the assay in image 303 . Accordingly, in some embodiments, output layer 302 includes a vector having a number of components equal to the number of target analytes in the assay. Each of the components in the vector in output layer 302 may include a binary value, e.g., ‘0’ for absent and ‘1’ for present.
- each of the components in output layer 302 may include a real number between ‘0’ and ‘1,’ indicative of a probability that a given target analyte is present in the sample, or even its concentration in the sample (e.g., by normalized weight or volume, or any other normalization measure).
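The decoding of such an output vector into per-analyte presence calls can be sketched as below. The analyte names and the 0.5 cutoff are assumptions for illustration only.

```python
# Map per-analyte probabilities from the output layer to binary presence
# calls: '1' for present, '0' for absent, per the convention above.
def decode_output(probs: list[float], analytes: list[str],
                  cutoff: float = 0.5) -> dict[str, int]:
    return {a: int(p >= cutoff) for a, p in zip(analytes, probs)}

result = decode_output([0.93, 0.08], ["Flu A", "Flu B"])
print(result)  # {'Flu A': 1, 'Flu B': 0}
```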
- Hidden layers 310 may include convolutional layers that convolve one or more pixels or voxels from input layer 301 with a suitably chosen multiplication factor or dot product. Coupling each of hidden layers 310 with one another, an activation function may include a rectifier function (e.g., a rectified linear unit, or RELU) subsequently followed by additional convolutions such as pooling layers, fully connected layers, and normalization layers, referred to as hidden layers because their inputs and outputs are masked by the activation function and a final convolution.
- the final convolution may include a backpropagation to more accurately weight the end result in output layer 302 .
- in a backpropagation step, a value from output layer 302 is entered at each of hidden layers 310 (from last to first, in reverse order).
- a pseudo-inverse transformation is applied and the most likely input is obtained for the given output.
- a most likely assay input is obtained after back propagating an output value.
- a measure of the difference between the most likely input and the real input in input layer 301 indicates whether or not some of the convolution parameters and sampling parameters (e.g., for the RELU transformation) are desirably changed.
- FIGS. 4 A-B illustrate a convolution operation 400 between two layers in a convolutional neural network, according to some embodiments.
- the first layer may include a pixelated image 403 as part of input layer 401
- the convolved layer may include a feature map 411 in a hidden layer 410 .
- FIG. 4 A shows convolution 400 in more detail.
- Convolution 400 adds each element along the diagonals of a selected portion 405 of image 403 (a ×1 factor), and neglects every other element of selected portion 405 (a ×0 factor).
- element 415 in hidden layer 410 has a value ‘4.’
- Selected portion 405 may include a digital portion of pixelated image 403 .
- the remaining elements in hidden layer 410 are determined in the same way as element 415 , simply moving selected portion 405 in either direction across pixelated image 403 .
- convolution 400 reduces the dimensionality of pixelated image 403 from a 5×5 matrix to a 3×3 matrix in hidden layer 410 .
- the reduction in dimensionality is arbitrary and may be selected according to a better predictability at the output layer, or according to any other criterion such as computational cost, time to completion, and the like.
- the size of selected portion 405 in convolution 400 may be selected according to multiple criteria, including a desire to reduce computational cost. For example, a 1/9 reduction in computational power may be obtained when selected portion 405 is reduced from a 3×3 portion to a 1×1 portion.
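The diagonal-summing convolution described above can be sketched with a sliding window: a 3×3 kernel with ones on the diagonals multiplies each 3×3 portion of a 5×5 binary image, and the products are summed into one feature-map element. The image contents below are illustrative assumptions.

```python
import numpy as np

# Kernel with x1 factors on the diagonals and x0 factors elsewhere.
kernel = np.array([[1, 0, 1],
                   [0, 1, 0],
                   [1, 0, 1]])

# A hypothetical 5x5 binary pixelated image.
image = np.array([[1, 1, 1, 0, 0],
                  [0, 1, 1, 1, 0],
                  [0, 0, 1, 1, 1],
                  [0, 0, 1, 1, 0],
                  [0, 1, 1, 0, 0]])

# Slide the 3x3 window over every valid position of the image.
h, w = kernel.shape
feature_map = np.array([
    [(image[i:i + h, j:j + w] * kernel).sum()
     for j in range(image.shape[1] - w + 1)]
    for i in range(image.shape[0] - h + 1)])

print(feature_map.shape)  # (3, 3): dimensionality reduced from 5x5 to 3x3
print(feature_map[0, 0])  # 4: diagonal sum of the top-left 3x3 portion
```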
- FIG. 4 B illustrates the formation of feature map 411 in layer 410 as a result of applying convolution 400 to all pixels in image 403 .
- Element 415 is obtained by applying convolution function 400 to selected portion 405 .
- Note that the dimension (5×5) of feature map 411 is reduced relative to the dimension (7×7) of image 403 as a result of the 3×3 convolution 400 .
- multiple convolution functions 400 , each associated with a different filter, may result in multiple feature maps 411 , each associated with a different feature from image 403 .
- FIGS. 5 A-B illustrate the content of multiple nodes in multiple layers 510 - 1 , 510 - 2 , 510 - 3 , and 510 - 4 (hereinafter, collectively referred to as “layers 510 ”) in a convolutional neural network, according to some embodiments.
- layers 510 includes a set of filters 520 - 1 , 520 - 2 , 520 - 3 , and 520 - 4 (hereinafter, collectively referred to as “filters 520 ”) and a corresponding set of feature maps 521 - 1 , 521 - 2 , 521 - 3 , and 521 - 4 (hereinafter, collectively referred to as “feature maps 521 ”).
- Filters 520 include different convolutional functions (e.g., filters) applied to the same input, to obtain a specific feature map 521 for each filter 520 . Accordingly, some embodiments include a squeezing convolutional step where the number of filter maps is reduced between a given layer 510 and the next. In a squeezing convolutional step, filters 520 may have a reduced dimension (e.g., 1×1 convolution rather than 3×3 convolution) to reduce the computational cost. In some embodiments, late down sampling enables the system to maintain multiple feature maps 521 throughout more hidden layers 510 , thus providing a more detailed and accurate network that can target more refined features in an assay (e.g., multiple target analytes and potential interactions or correlations among them). In some embodiments, an expanding convolutional step increases the number of filters 520 and feature maps 521 and may include larger convolution sets (e.g., 1×1 and 3×3 convolutions).
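The cost saving behind the squeezing step can be made concrete with a rough per-layer weight count: a 1×1 filter carries one ninth the weights of a 3×3 filter. The channel and filter counts below are illustrative assumptions.

```python
# Approximate weight count of a convolutional layer (biases ignored).
def conv_params(in_channels: int, out_filters: int, k: int) -> int:
    return in_channels * out_filters * k * k

squeeze = conv_params(64, 16, k=1)  # 1x1 squeezing step
expand = conv_params(16, 64, k=3)   # 3x3 expanding step
plain = conv_params(64, 64, k=3)    # plain 3x3 layer for comparison

print(squeeze + expand, "vs", plain)  # 10240 vs 36864
```

Even with the expanding 3×3 step included, the squeeze-then-expand pair uses well under a third of the weights of the plain 3×3 layer in this hypothetical configuration.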
- FIG. 6 illustrates a table of results 600 for an assay diagnosis including two targets, according to some embodiments.
- the assay used for table 600 is a binary assay for Flu A (FA) and Flu B (FB) detection.
- the analysis was performed by a diagnostics engine using assay reading tools and image processing tools as disclosed herein (e.g., diagnostic engine 240 , assay reading tool 242 and image processing tool 244 , application 222 and diagnostic model 225 ).
- Table 600 represents a 3×3 matrix cross-correlating the different combinations of positive and negative FA/FB results.
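A cross-correlation matrix of this kind can be built as a simple tabulation of (reference result, engine result) pairs. The counts and class labels below are invented for illustration; table 600's actual values are not reproduced here.

```python
from collections import Counter

CLASSES = ["FA+", "FB+", "Neg"]  # hypothetical combined Flu A / Flu B calls

# Hypothetical (reference, predicted) pairs, e.g. from a comparator study.
results = [("FA+", "FA+")] * 48 + [("FA+", "Neg")] * 2 + \
          [("FB+", "FB+")] * 45 + [("FB+", "Neg")] * 5 + \
          [("Neg", "Neg")] * 97 + [("Neg", "FA+")] * 3

table = Counter(results)
for ref in CLASSES:
    # One row per reference status; columns follow CLASSES order.
    print(ref, [table[(ref, pred)] for pred in CLASSES])
```

The diagonal entries count agreements between the engine and the reference method; off-diagonal entries count the false calls that feed the sensitivity and specificity figures discussed next.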
- FIGS. 7 A-B illustrate a sensitivity chart 700 A and a specificity chart 700 B resulting from an assay diagnosis including two targets, according to some embodiments (hereinafter, collectively referred to as “charts 700 ”).
- the assay used for charts 700 is the binary assay for FA and FB detection from table 600 .
- a single assay may determine presence or absence of one or more analytes of interest. In an embodiment, and by way of example, a single assay may determine presence and/or absence of each of influenza A and influenza B, independently or simultaneously, or overlapping in time, and the system may report an aggregated result when desired (e.g., influenza A positive, influenza B positive, influenza A negative, influenza B negative, and combinations thereof). In an embodiment, and by way of example, a single assay may determine presence and/or absence of each of respiratory syncytial virus, influenza A, influenza B, and human metapneumovirus, independently or simultaneously, or overlapping in time, and the system may report an aggregated result when desired.
- a single assay may determine presence and/or absence of each of respiratory syncytial virus, influenza A, influenza B, and SARS CoV2, independently or simultaneously, or overlapping in time, and the system may report an aggregated result when desired (e.g., influenza A positive, influenza B positive, respiratory syncytial virus negative, SARS CoV2 negative, and combinations thereof).
- the system may report the results for a given test differently to different parties (e.g., report Flu +/− to the patient/customer and the details such as Flu A +/− and Flu B +/− to the doctor or government agency).
- FIG. 7 A illustrates a sensitivity chart 700 A.
- the values in chart 700 A indicate a likelihood that a diagnostic engine as disclosed herein will detect either FA or FB when the sample comes from an individual that has the respective disease(s).
- FIG. 7 B illustrates a specificity chart 700 B.
- the values in chart 700 B indicate a likelihood that a diagnostic engine as disclosed herein will NOT detect either FA or FB when the sample comes from an individual that does NOT have the respective disease(s).
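The chart-700 quantities follow the standard definitions: sensitivity is the fraction of diseased samples the engine flags, and specificity is the fraction of disease-free samples it correctly leaves unflagged. A minimal sketch, with hypothetical counts (not the study's actual data):

```python
def sensitivity(true_positives, false_negatives):
    """Likelihood of detecting the target when the disease is present."""
    return true_positives / (true_positives + false_negatives)

def specificity(true_negatives, false_positives):
    """Likelihood of NOT detecting the target when the disease is absent."""
    return true_negatives / (true_negatives + false_positives)

# Hypothetical FA counts: 48 true positives, 2 false negatives,
# 97 true negatives, 3 false positives.
print(round(sensitivity(48, 2), 2))   # 0.96
print(round(specificity(97, 3), 2))   # 0.97
```

In a two-target assay such as FA/FB, these figures are typically computed per target, which is why charts 700 A and 700 B each carry separate values for FA and FB.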
- FIG. 8 is a flow chart illustrating steps in a method 800 for determining the presence or absence of an analyte of interest, according to some embodiments.
- Methods consistent with the present disclosure may include at least one or more of the steps in method 800 performed at least partially by one or more devices in an architecture including a remote server, a database, a client device, and an image-capturing device as disclosed herein (e.g., architecture 10 , remote server 130 , database 152 , client device 110 , and image-capturing devices 100 ).
- Either one of the server, the database, the client device, and the image-capturing device may include a memory circuit storing instructions and a processor circuit configured to execute the instructions to perform, at least partially, one or more of the steps in method 800 (e.g., memory circuit 132 , and processor circuit 112 ).
- at least one or all of the server, the database, the client device, or the image-capturing device may include a communications module configured to transmit and receive data to one or more of the devices in the architecture, through a network or via a one-to-one (wired or wireless) communication channel (e.g., communications modules 118 and network 150 ).
- At least one of the steps in method 800 may be partially performed by a diagnostic engine in a server, using an assay reading tool and an image processing tool as disclosed herein (e.g., diagnostic engine 240 , assay reading tool 242 , and image processing tool 244 ).
- at least one of the steps in method 800 may be partially performed by an application installed in a client device, the application including a diagnostic model hosted by the server (e.g., application 222 and diagnostic model 225 ).
- the image-capturing device may include an enclosure enshrouding a coupling mechanism and a cartridge mount configured to receive a test cartridge (e.g., enclosure 120 , coupling mechanism 115 , and test cartridge 101 ).
- the image of the illuminated test cartridge may include at least a portion of a reading zone in the test cartridge, delimited by a border line.
- the reading portion may include an immunoassay (e.g., a lateral flow immunoassay).
- the image of the lateral flow immunoassay may include a series of images collected over time.
- methods consistent with the present disclosure may include at least one step from method 800 , or more than one step from method 800 performed in a different order, or overlapping in time.
- some embodiments consistent with the present disclosure may include one or more steps in method 800 performed simultaneously, or quasi-simultaneously.
- Step 802 includes receiving an image from an image-capturing device, the image including an area of interest in a test cartridge.
- Step 804 includes finding a border of the area of interest of the test cartridge.
- Step 806 includes applying a geometrical transformation on an area delimited by the border of the test cartridge to bring the image of the area of interest in the test cartridge to a selected size and a selected shape.
- Step 808 includes identifying a target region within the area of interest of the test cartridge.
- the target region comprises a process control area including at least one of a positive control area or a negative control area.
- step 808 includes evaluating a signal intensity in the process control area.
- step 808 includes identifying at least a test line and a control line in the area of interest of the test cartridge within a field of view of the image.
- Step 810 includes evaluating a quality of the image based on a characteristic feature of the target region. In some embodiments, step 810 includes comparing a selected feature of the image with a value associated with selected features of multiple images having known quality values.
- Step 812 includes providing commands to adjust an optical coupling in the image-capturing device when the quality of the image is lower than a selected threshold.
- Step 814 includes providing the image to a processor that contains software designed to assess a subject diagnostics based on a digital analysis of the image when a quality of the image meets the selected threshold.
- step 814 includes displaying the image in a computer display, and including a viewing guide in the computer display, the viewing guide overlapping at least a portion of the digital analysis of the image.
- step 814 includes displaying a test result in a computer display, and not displaying the image.
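The quality-gating flow of steps 802 through 814 can be sketched as follows. The border handling, the contrast-based quality metric, and the threshold are stand-ins invented for illustration; the patent does not specify these functions.

```python
QUALITY_THRESHOLD = 0.2  # hypothetical value for the selected threshold

def crop_area_of_interest(image, border):
    """Steps 804/806 simplified: crop to the bordered area (a real system
    would also apply a geometric transform to a selected size and shape)."""
    (y0, x0), (y1, x1) = border
    return [row[x0:x1] for row in image[y0:y1]]

def quality(region):
    """Step 810 stand-in: a toy contrast metric over 8-bit pixel values."""
    flat = [p for row in region for p in row]
    return (max(flat) - min(flat)) / 255.0

def process(image, border):
    region = crop_area_of_interest(image, border)
    if quality(region) < QUALITY_THRESHOLD:
        return "adjust optical coupling and retake"      # step 812
    return "forward image to diagnostics processor"      # step 814

# An 8x8 "image" with one bright test line inside the area of interest.
image = [[10] * 8 for _ in range(8)]
image[3][3] = 200
print(process(image, border=((2, 2), (6, 6))))
```

The point of the gate is that only images meeting the threshold reach the diagnostic model; a low-contrast capture instead triggers the step-812 command to adjust the optical coupling.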
- FIG. 9 is a flow chart illustrating steps in a method 900 for determining the presence or absence of an analyte of interest, according to some embodiments.
- Methods consistent with the present disclosure may include at least one or more of the steps in method 900 performed at least partially by one or more devices in an architecture including a remote server, a database, a client device, and an image-capturing device as disclosed herein (e.g., architecture 10 , remote server 130 , database 152 , client device 110 , and image-capturing devices 100 ).
- Either one of the server, the database, the client device, and the image-capturing device may include a memory circuit storing instructions and a processor circuit configured to execute the instructions to perform, at least partially, one or more of the steps in method 900 (e.g., memory circuit 132 , and processor circuit 112 ).
- at least one or all of the server, the database, the client device, or the image-capturing device may include a communications module configured to transmit and receive data to one or more of the devices in the architecture, through a network or via a one-to-one (wired or wireless) communication channel (e.g., communications modules 118 and network 150 ).
- At least one of the steps in method 900 may be partially performed by a diagnostic engine in a server, using an assay reading tool and an image processing tool as disclosed herein (e.g., diagnostic engine 240 , assay reading tool 242 , and image processing tool 244 ).
- at least one of the steps in method 900 may be partially performed by an application installed in a client device, the application including a diagnostic model hosted by the server (e.g., application 222 and diagnostic model 225 ).
- the image-capturing device may include an enclosure enshrouding a coupling mechanism and a cartridge mount configured to receive a test cartridge (e.g., enclosure 120 , coupling mechanism 115 , and test cartridge 101 ).
- the image of the illuminated test cartridge may include at least a portion of a reading zone in the test cartridge, delimited by a border line.
- the reading portion may include an immunoassay (e.g., a lateral flow immunoassay).
- the image of the lateral flow immunoassay may include a series of images collected over time.
- methods consistent with the present disclosure may include at least one step from method 900 , or more than one step from method 900 performed in a different order, or overlapping in time.
- some embodiments consistent with the present disclosure may include one or more steps in method 900 performed simultaneously, or quasi-simultaneously.
- Step 902 includes receiving an image from an image-capturing device, the image comprising an area of interest in a test cartridge.
- Step 904 includes providing a first identifier code (e.g., a mobile device ID, and the like) identifying the image-capturing device to a processor that contains software designed to assess a subject diagnostics based on a digital analysis of the image.
- Step 906 includes identifying a target region within the area of interest of the test cartridge.
- Step 908 includes evaluating a quality of the image based on a characteristic feature of the target region and on the first identifier code. In some embodiments, step 908 includes selecting a threshold for a signal intensity in a process control area within the target region based on the first identifier code. In some embodiments, step 908 includes selecting specific features or attributes in the image that indicate that the test cartridge ran correctly. In some embodiments, step 908 may include verifying that a signal intensity at the end of a test channel in the test cartridge is higher than a selected threshold indicative that the sample flowed to the end. In some embodiments, step 908 includes verifying that a reference line appears at a selected location.
- step 908 may include verifying that a signal intensity of a negative control is less than a selected threshold indicative of an assay interference. In some embodiments, step 908 may include verifying exposure, focus, and other optical characteristics of the image are satisfactory. Accordingly, in some embodiments, step 908 includes assessing whether the image is appropriate to send to the AI model for inferencing. For example, in some embodiments, step 908 includes verifying that a valid crop in the image contains features expected from a valid test cassette.
- step 910 includes providing the image to the processor.
- step 910 includes retrieving a calibration table from a remote server using the first identifier code, the calibration table associated with the image-capturing device, and indicative of a signal value that is a threshold for evaluating the quality of the image.
- the calibration table may be stored in the client device, or may be provided to the client device from the server using a barcode in the test cartridge (which may be captured by the client device and provided to the server to retrieve the calibration table).
- step 910 includes providing a second identifier code identifying the test cartridge to the processor, wherein evaluating the quality of the image further comprises selecting a threshold for a signal intensity in a process control area within the target region based on the second identifier code.
- a threshold may be modulated in order to classify a sample “Positive” or “Negative.” In some embodiments, such modulation is performed by changing the normalization factor between training with multiple images and inferencing the resulting classification. In some embodiments, a “Positive” or “Negative” classification may be more robust and less prone to false positive or false negative results due to appropriate modulation of a selected threshold.
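Steps 908/910 amount to a per-device threshold lookup plus an optional modulation factor. The calibration table contents, device IDs, and numeric thresholds below are hypothetical, illustrating the mechanism only.

```python
# Hypothetical calibration table, keyed by the first identifier code
# (e.g., a mobile device ID); a real table would come from the server.
CALIBRATION = {
    "device-A": {"positive_threshold": 0.30},
    "device-B": {"positive_threshold": 0.42},  # dimmer optics, higher bar
}
DEFAULT_THRESHOLD = 0.35  # fallback when no table entry exists

def classify(signal, device_id, modulation=1.0):
    """Return 'Positive'/'Negative' for a normalized control-area signal.
    `modulation` mimics changing the normalization factor between training
    and inferencing, as described above."""
    entry = CALIBRATION.get(device_id, {})
    threshold = entry.get("positive_threshold", DEFAULT_THRESHOLD)
    return "Positive" if signal >= threshold * modulation else "Negative"

print(classify(0.33, "device-A"))                  # Positive
print(classify(0.33, "device-B"))                  # Negative
print(classify(0.33, "device-A", modulation=1.2))  # Negative (bar at 0.36)
```

The same 0.33 signal classifies differently on the two devices, which is the rationale for keying the threshold to the identifier code; raising the modulation factor trades sensitivity for fewer false positives.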
- FIG. 10 is a flow chart illustrating steps in a method 1000 for determining the presence or absence of an analyte of interest, according to some embodiments.
- Methods consistent with the present disclosure may include at least one or more of the steps in method 1000 performed at least partially by one or more devices in an architecture including a remote server, a database, a client device, and an image-capturing device as disclosed herein (e.g., architecture 10 , remote server 130 , database 152 , client device 110 , and image-capturing devices 100 ).
- Either one of the server, the database, the client device, and the image-capturing device may include a memory circuit storing instructions and a processor circuit configured to execute the instructions to perform, at least partially, one or more of the steps in method 1000 (e.g., memory circuit 132 , and processor circuit 112 ).
- at least one or all of the server, the database, the client device, or the image-capturing device may include a communications module configured to transmit and receive data to one or more of the devices in the architecture, through a network or via a one-to-one (wired or wireless) communication channel (e.g., communications modules 118 and network 150 ).
- At least one of the steps in method 1000 may be partially performed by a diagnostic engine in a server, using an assay reading tool and an image processing tool as disclosed herein (e.g., diagnostic engine 240 , assay reading tool 242 , and image processing tool 244 ).
- at least one of the steps in method 1000 may be partially performed by an application installed in a client device, the application including a diagnostic model hosted by the server (e.g., application 222 and diagnostic model 225 ).
- the image-capturing device may include an enclosure enshrouding a coupling mechanism and a cartridge mount configured to receive a test cartridge (e.g., enclosure 120 , coupling mechanism 115 , and test cartridge 101 ).
- the image of the illuminated test cartridge may include at least a portion of a reading zone in the test cartridge, delimited by a border line.
- the reading portion may include an immunoassay (e.g., a lateral flow immunoassay).
- the image of the lateral flow immunoassay may include a series of images collected over time.
- methods consistent with the present disclosure may include at least one step from method 1000 , or more than one step from method 1000 performed in a different order, or overlapping in time.
- some embodiments consistent with the present disclosure may include one or more steps in method 1000 performed simultaneously, or quasi-simultaneously.
- Step 1002 includes retrieving a first image associated with an assay in a test cartridge carrying a biological sample from a user for a disease diagnostic.
- step 1002 includes receiving an image from a client device via a remote network communication channel.
- step 1002 includes receiving the image from an image-capturing device via a wireless communication channel.
- step 1002 includes accessing a database including multiple images of multiple assays including different biological samples from multiple users.
- step 1002 includes retrieving a second image collected from a same assay at a different time, wherein the digital portions are combined with a time point in a non-linear fashion.
- step 1002 includes retrieving a second image collected from a same assay at a different time, and forming a volume between the first image and the second image, wherein selecting the digital portion of the first image comprises selecting a digital portion as a portion of the volume, the digital portion having a time dimension, and wherein the weighted value is associated with a dynamic value for the assay.
- step 1002 includes receiving a first identification code identifying an image-capturing device that generated the first image, and adjusting the model based on a parameter associated with the image-capturing device.
- step 1002 includes receiving a second identification code identifying the test cartridge, and adjusting the model based on a parameter associated with the test cartridge.
- Step 1004 includes selecting a digital portion of the first image.
- step 1004 includes cropping the digital portion to overlap a test channel in a prototype strip of the assay.
- step 1004 includes matching the digital portion with a fiduciary mark at an edge of the first image associated with the assay in the test cartridge.
- Step 1006 includes modifying, with a model, the digital portion of the first image to obtain a weighted value of the digital portion.
- step 1006 includes aggregating multiple values of adjacent pixels within the digital portion into a convolution value, and modifying the convolution value by a factor selected from the model.
- step 1006 includes convoluting a value of the digital portion of the first image with multiple values of adjacent digital portions of the first image according to a weighting coefficient in the model.
- the model includes weighting factors obtained from training with multiple images in a database, each of the images associated with a known diagnostic outcome, and step 1006 includes comparing the known diagnostic outcome with the diagnostic value.
- step 1006 includes retrieving a second image from a client device and evaluating a dynamic value for the assay based on a difference between the digital portion of the first image and a digital portion of the second image.
- step 1006 includes updating the model with a modified weighted factor according to a comparison between the diagnostic value and a known diagnostic for a biological sample.
- step 1006 includes selecting, with the model, a weighting factor to obtain the weighted value of the digital portion, based on a batch identifier of the test cartridge.
- Step 1008 includes determining, based on the weighted value of the digital portion and the model, a diagnostic value.
- Step 1010 includes determining a certainty level for the diagnostic value based on a second weighted value from a second digital portion of the first image.
- step 1010 may include updating the model when the certainty level for the diagnostic value is less than a predetermined value.
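Steps 1004 through 1010 can be sketched as a weighted aggregation over a cropped digital portion, followed by a certainty estimate from a second portion. The weighting scheme, crop offsets, cutoffs, and the `control_expect` parameter are all invented here as a crude stand-in for the trained convolutional model.

```python
def weighted_value(portion, weights):
    """Step 1006 stand-in: aggregate adjacent pixel values under model
    weighting coefficients (normalized by the weight sum)."""
    return sum(w * p for w, p in zip(weights, portion)) / sum(weights)

def diagnose(image_row, model):
    test_portion = image_row[2:5]     # step 1004: crop over the test line
    control_portion = image_row[6:9]  # second digital portion (step 1010)
    v_test = weighted_value(test_portion, model["weights"])
    v_ctrl = weighted_value(control_portion, model["weights"])
    # Step 1008: diagnostic value from the weighted portion and the model.
    diagnostic = "Positive" if v_test > model["cutoff"] else "Negative"
    # Step 1010: certainty from the second weighted value.
    certainty = min(1.0, v_ctrl / model["control_expect"])
    return diagnostic, certainty

# Hypothetical model and a 1-D strip profile with test and control peaks.
model = {"weights": [1, 2, 1], "cutoff": 0.5, "control_expect": 0.8}
row = [0.1, 0.1, 0.7, 0.9, 0.7, 0.1, 0.8, 0.9, 0.8, 0.1]
print(diagnose(row, model))
```

In the full method, a certainty below a predetermined value would feed back into retraining the model (per step 1010), rather than simply being reported.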
- FIG. 11 is a block diagram illustrating an exemplary computer system 1100 with which the client device 110 and server 130 of FIGS. 1 and 2 , and the methods of FIGS. 8 - 10 can be implemented.
- the computer system 1100 may be implemented using hardware or a combination of software and hardware, either in a dedicated server, or integrated into another entity, or distributed across multiple entities.
- Computer system 1100 (e.g., client 110 and server 130 ) includes a bus 1108 or other communication mechanism for communicating information, and a processor 1102 (e.g., processors 112 and 212 ) coupled with bus 1108 for processing information.
- the computer system 1100 may be implemented with one or more processors 1102 .
- Processor 1102 may be a general-purpose microprocessor, a microcontroller, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA), a Programmable Logic Device (PLD), a controller, a state machine, gated logic, discrete hardware components, or any other suitable entity that can perform calculations or other manipulations of information.
- Computer system 1100 can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them stored in an included memory 1104 (e.g., memories 132 and 232 ), such as a Random Access Memory (RAM), a flash memory, a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable PROM (EPROM), registers, a hard disk, a removable disk, a CD-ROM, a DVD, or any other suitable storage device, coupled to bus 1108 for storing information and instructions to be executed by processor 1102 .
- the processor 1102 and the memory 1104 can be supplemented by, or incorporated in, special purpose logic circuitry.
- the instructions may be stored in the memory 1104 and implemented in one or more computer program products, e.g., one or more modules of computer program instructions encoded on a computer-readable medium for execution by, or to control the operation of, the computer system 1100 , and according to any method well known to those of skill in the art, including, but not limited to, computer languages such as data-oriented languages (e.g., SQL, dBase), system languages (e.g., C, Objective-C, C++, Assembly), architectural languages (e.g., Java, .NET), and application languages (e.g., PHP, Ruby, Perl, Python).
- Instructions may also be implemented in computer languages such as array languages, aspect-oriented languages, assembly languages, authoring languages, command line interface languages, compiled languages, concurrent languages, curly-bracket languages, dataflow languages, data-structured languages, declarative languages, esoteric languages, extension languages, fourth-generation languages, functional languages, interactive mode languages, interpreted languages, iterative languages, list-based languages, little languages, logic-based languages, machine languages, macro languages, metaprogramming languages, multiparadigm languages, numerical analysis languages, non-English-based languages, object-oriented class-based languages, object-oriented prototype-based languages, off-side rule languages, procedural languages, reflective languages, rule-based languages, scripting languages, stack-based languages, synchronous languages, syntax handling languages, visual languages, Wirth languages, and XML-based languages.
- Memory 1104 may also be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 1102 .
- a computer program as discussed herein does not necessarily correspond to a file in a file system.
- a program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, subprograms, or portions of code).
- a computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
- the processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output.
- Computer system 1100 further includes a data storage device 1106 such as a magnetic disk or optical disk, coupled to bus 1108 for storing information and instructions.
- Computer system 1100 may be coupled via input/output module 1110 to various devices.
- Input/output module 1110 can be any input/output module.
- Exemplary input/output modules 1110 include data ports such as USB ports.
- the input/output module 1110 is configured to connect to a communications module 1112 .
- Exemplary communications modules 1112 (e.g., communications modules 118 and 218 ) include networking interface cards, such as Ethernet cards, and modems.
- input/output module 1110 is configured to connect to a plurality of devices, such as an input device 1114 (e.g., input device 214 ) and/or an output device 1116 (e.g., output device 216 ).
- exemplary input devices 1114 include a keyboard and a pointing device, e.g., a mouse or a trackball, by which a user can provide input to the computer system 1100 .
- Other kinds of input devices 1114 can be used to provide for interaction with a user as well, such as a tactile input device, visual input device, audio input device, or brain-computer interface device.
- feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, tactile, or brain wave input.
- exemplary output devices 1116 include display devices, such as an LCD (liquid crystal display) monitor, for displaying information to the user.
- the client device 110 and server 130 can be implemented using a computer system 1100 in response to processor 1102 executing one or more sequences of one or more instructions contained in memory 1104 .
- Such instructions may be read into memory 1104 from another machine-readable medium, such as data storage device 1106 .
- Execution of the sequences of instructions contained in main memory 1104 causes processor 1102 to perform the process steps described herein.
- processors in a multi-processing arrangement may also be employed to execute the sequences of instructions contained in memory 1104 .
- hard-wired circuitry may be used in place of or in combination with software instructions to implement various aspects of the present disclosure.
- aspects of the present disclosure are not limited to any specific combination of hardware circuitry and software.
- a computing system that includes a back end component, e.g., a data server, or that includes a middleware component, e.g., an application server, or that includes a front end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the subject matter described in this specification, or any combination of one or more such back end, middleware, or front end components.
- the components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network.
- the communication network can include, for example, any one or more of a LAN, a WAN, the Internet, and the like. Further, the communication network can include, but is not limited to, for example, any one or more of the following network topologies, including a bus network, a star network, a ring network, a mesh network, a star-bus network, a tree or hierarchical network, or the like.
- the communications modules can be, for example, modems or Ethernet cards.
- Computer system 1100 can include clients and servers.
- a client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
- Computer system 1100 can be, for example, and without limitation, a desktop computer, laptop computer, or tablet computer.
- Computer system 1100 can also be embedded in another device, for example, and without limitation, a mobile telephone, a PDA, a mobile audio player, a Global Positioning System (GPS) receiver, a video game console, and/or a television set top box.
- The term “machine-readable storage medium” or “computer-readable medium” as used herein refers to any medium or media that participates in providing instructions to processor 1102 for execution. Such a medium may take many forms, including, but not limited to, non-volatile media, volatile media, and transmission media.
- Non-volatile media include, for example, optical or magnetic disks, such as data storage device 1106 .
- Volatile media include dynamic memory, such as memory 1104 .
- Transmission media include coaxial cables, copper wire, and fiber optics, including the wires forming bus 1108 .
- Machine-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH EPROM, any other memory chip or cartridge, or any other medium from which a computer can read.
- the machine-readable storage medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter affecting a machine-readable propagated signal, or a combination of one or more of them.
- the immunoassay test strip mentioned above may be configured uniquely for detection of a particular pathogen, analyte, or species of interest.
- species of interest include, but are not limited to, proteins, haptens, immunoglobulins, enzymes, hormones, polynucleotides, steroids, lipoproteins, drugs (including drugs of abuse), bacterial antigens, and viral antigens.
- analytes of interest include Streptococcus , Influenza A, Influenza B, respiratory syncytial virus (RSV), hepatitis A, B, and/or C, pneumococcal, human metapneumovirus, and other infectious agents well-known to those in the art.
- a test device is intended for detection of one or more of antigens associated with Lyme disease.
- an immunoassay test strip is intended for use in the field of women's health.
- test devices for detection of one or more of fetal-fibronectin, chlamydia, human chorionic gonadotropin (hCG), hyperglycosylated chorionic gonadotropin, human papillomavirus (HPV), and the like are contemplated.
- an immunoassay test strip for detection of vitamin D is designed for interaction with the apparatus and method of normalization described herein.
- An exemplary immunoassay test strip may include a sample receiving zone in fluid communication with a label zone.
- a fluid sample placed on or in the sample zone flows by capillary action from the sample zone in a downstream direction.
- a label zone is in fluid communication with at least a test line or band and, optionally, a control line or band and/or a reference line or band.
- the label zone is downstream from the sample zone, and the series of control and test lines are downstream from the label zone, and an optional absorbent pad is downstream from the portion of the test strip on which the lines are positioned.
- the sample zone receives the sample suspected of containing an analyte of interest.
- the label zone, in some embodiments, contains two dried conjugates that are comprised of particles containing a label element.
- the label element includes a label that emits a signal in any of a number of selected emission processes: e.g., electromagnetic radiation, alpha particle radiation, positron radiation, beta radiation, and the like.
- the electromagnetic radiation emission may include a fluorescence emission, Raman emission, and the like.
- the label may absorb a selected type of radiation, e.g., electromagnetic radiation as in microwave absorption, infrared (IR) absorption, visible absorption, or ultraviolet (UV) absorption.
- the label element may include multiple label elements selected from one or more of the radiation emission and/or absorption modes described above.
- the label element may include a fluorescent element.
- An exemplary fluorescent element is a lanthanide material, such as one of the fifteen lanthanide elements lanthanum, cerium, praseodymium, neodymium, promethium, samarium, europium, gadolinium, terbium, dysprosium, holmium, erbium, thulium, ytterbium, and lutetium, or the chemically similar element yttrium.
- the lanthanide material is embedded in or on a particle, such as a polystyrene particle.
- the particles can be microparticles (particles less than about 1,000 micrometers in diameter, in some instances less than about 500 micrometers in diameter, in some instances less than 200, 150, or 100 micrometers in diameter) containing a luminescent or fluorescent lanthanide, wherein in some embodiments, the lanthanide is europium. In some embodiments, the lanthanide is a chelated europium.
- the microparticles, in some embodiments, have a core of a lanthanide material with a polymeric coating, such as a europium core with a polystyrene coating. A binding partner for the analyte(s) of interest in the sample is attached to or associated with the outer surface of the microparticles.
- the binding partner for the analyte(s) of interest is an antibody, a monoclonal antibody, or a polyclonal antibody.
- other binding partners can be selected, including complexes such as a biotin-streptavidin complex.
- a test strip intended for detection and/or discrimination of influenza A and influenza B can include a first test line to detect influenza A and a second test line to detect influenza B.
- Microparticle-antibody conjugates comprised of microparticles coated with antibodies specific for influenza A and microparticles coated with antibodies specific for influenza B may be included in the label zone, and in some embodiments, downstream of the negative control line.
- a first test line for influenza A and a second test line for influenza B can be disposed downstream of the label zone.
- the first test line for influenza A comprises a monoclonal or polyclonal antibody to a determinant on the nucleoprotein of influenza A.
- the second test line for influenza B comprises a monoclonal or polyclonal antibody to a determinant on the nucleoprotein of influenza B.
- a typical immunoassay sandwich will form on the respective test line that matches the antigen in the sample.
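The sandwich-formation logic above lends itself to a simple decision rule: a test line is called positive when its signal exceeds a threshold, and no result is reported if the control fails. A minimal sketch, assuming hypothetical background-corrected signal values and an arbitrary threshold (a real reader would derive these from calibrated fluorescence measurements):

```python
# Illustrative interpretation logic for a two-test-line influenza A/B strip.
# The signal values and threshold here are hypothetical.

def interpret_flu_strip(signal_a: float, signal_b: float,
                        control_ok: bool, threshold: float = 10.0) -> str:
    """Classify a strip from background-corrected test-line signals."""
    if not control_ok:
        return "invalid"            # control failed: do not report a result
    flu_a = signal_a >= threshold   # sandwich formed on the influenza A line
    flu_b = signal_b >= threshold   # sandwich formed on the influenza B line
    if flu_a and flu_b:
        return "influenza A and B detected"
    if flu_a:
        return "influenza A detected"
    if flu_b:
        return "influenza B detected"
    return "negative"

print(interpret_flu_strip(42.0, 3.1, control_ok=True))  # influenza A detected
```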
- Other assays may include SARS-CoV-2 assays with substantially the same architecture as the influenza A and/or B assay described above.
- Other assays may include serology assays, in which the presence of antibodies against a pathogen can be determined.
- an assay may include multiple analytes in multiple paths for one flow direction, like the RVP4 assay.
- an assay may include multiple analytes in multiple flow directions, like the Lyme assay (SOFIA Lyme, Quidel Corporation) or a modified Lyme assay.
- microparticle-antibody conjugates that do not bind to the negative control line or to a test line continue to flow by capillary action downstream, and the remaining sample encounters the reference line, in some embodiments proceeding into the absorbent pad.
- the immunoassay test device is intended to receive a wide variety of samples, including biological samples of human bodily fluids such as, but not limited to, nasal secretions, nasopharyngeal secretions, saliva, mucus, urine, vaginal secretions, fecal samples, blood, etc.
- Immunoassay test kits are provided with a positive control swab or sample.
- a negative control swab or sample is provided.
- the user may be prompted to insert or apply a positive or negative control sample or swab.
- An immunoassay band emits fluorescence primarily from fluorophores bound to the target analyte, as these are fixed on the substrate by adherence to the immuno-proteins in the immunoassay strip (e.g., by adsorption, chemisorption, immune-ligand binding, and the like). Accordingly, the presence of a red emission within the boundaries of the band is mostly attributable to the presence of the target analyte (e.g., presence of pathogenic antigens, and the like). However, the amount of red signal within the boundaries of the immunoassay band may include some background. To better assess this background signal (i.e., signal not originating from target analytes bound to the antibodies on the band), some sample cartridges may include a blank control area.
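The background-correction idea in the paragraph above can be sketched as follows: estimate the background from the blank control area and subtract it from the mean red emission inside the band boundaries. The array shapes, masks, and pixel values in this sketch are hypothetical:

```python
import numpy as np

# Sketch of background correction using a blank control area: the red signal
# inside the band includes background, which a blank region (with no bound
# fluorophores) lets us estimate and subtract.

def band_signal(red_channel: np.ndarray,
                band_mask: np.ndarray,
                blank_mask: np.ndarray) -> float:
    """Mean red emission in the band, minus mean background from the blank area."""
    band_mean = red_channel[band_mask].mean()
    background = red_channel[blank_mask].mean()
    return float(band_mean - background)

# Toy 4x6 red-channel image: uniform background of 5, bright band region of 25.
img = np.full((4, 6), 5.0)
img[1:3, 1:3] = 25.0
band = np.zeros_like(img, dtype=bool)
band[1:3, 1:3] = True               # pixels inside the band boundaries
blank = np.zeros_like(img, dtype=bool)
blank[:, 4:] = True                 # pixels in the blank control area

print(band_signal(img, band, blank))  # 20.0
```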
- the phrase “at least one of” preceding a series of items, with the terms “and” or “or” to separate any of the items, modifies the list as a whole, rather than each member of the list (e.g., each item).
- the phrase “at least one of” does not require selection of at least one item; rather, the phrase allows a meaning that includes at least one of any one of the items, and/or at least one of any combination of the items, and/or at least one of each of the items.
- phrases “at least one of A, B, and C” or “at least one of A, B, or C” each refer to only A, only B, or only C; any combination of A, B, and C; and/or at least one of each of A, B, and C.
- a method may be an operation, an instruction, or a function and vice versa.
- a clause may be amended to include some or all of the words (e.g., instructions, operations, functions, or components) recited in other one or more clauses, one or more words, one or more sentences, one or more phrases, one or more paragraphs, and/or one or more clauses.
- phrases such as an aspect, the aspect, another aspect, some aspects, one or more aspects, an implementation, the implementation, another implementation, some implementations, one or more implementations, an embodiment, the embodiment, another embodiment, some embodiments, one or more embodiments, a configuration, the configuration, another configuration, some configurations, one or more configurations, the subject technology, the disclosure, the present disclosure, other variations thereof and alike are for convenience and do not imply that a disclosure relating to such phrase(s) is essential to the subject technology or that such disclosure applies to all configurations of the subject technology.
- a disclosure relating to such phrase(s) may apply to all configurations, or one or more configurations.
- a disclosure relating to such phrase(s) may provide one or more examples.
- a phrase such as an aspect or some aspects may refer to one or more aspects and vice versa, and this applies similarly to other foregoing phrases.
- Pronouns in the masculine include the feminine and neuter gender (e.g., her and its) and vice versa.
- the term “some” refers to one or more.
- Underlined and/or italicized headings and subheadings are used for convenience only, do not limit the subject technology, and are not referred to in connection with the interpretation of the description of the subject technology. Relational terms such as first and second and the like may be used to distinguish one entity or action from another without necessarily requiring or implying any actual such relationship or order between such entities or actions.
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US17/333,978 US11783563B2 (en) | 2020-05-29 | 2021-05-28 | Software and algorithms for use in remote assessment of disease diagnostics |
| US18/456,451 US12165373B2 (en) | 2020-05-29 | 2023-08-25 | Software and algorithms for use in remote assessment of disease diagnostics |
| US18/937,892 US20250069353A1 (en) | 2020-05-29 | 2024-11-05 | Software and algorithms for use in remote assessment of disease diagnostics |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202063032012P | 2020-05-29 | 2020-05-29 | |
| US17/333,978 US11783563B2 (en) | 2020-05-29 | 2021-05-28 | Software and algorithms for use in remote assessment of disease diagnostics |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US18/456,451 Continuation US12165373B2 (en) | 2020-05-29 | 2023-08-25 | Software and algorithms for use in remote assessment of disease diagnostics |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20210374959A1 US20210374959A1 (en) | 2021-12-02 |
| US11783563B2 true US11783563B2 (en) | 2023-10-10 |
Family
ID=76891119
Family Applications (3)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/333,978 Active 2042-01-13 US11783563B2 (en) | 2020-05-29 | 2021-05-28 | Software and algorithms for use in remote assessment of disease diagnostics |
| US18/456,451 Active US12165373B2 (en) | 2020-05-29 | 2023-08-25 | Software and algorithms for use in remote assessment of disease diagnostics |
| US18/937,892 Pending US20250069353A1 (en) | 2020-05-29 | 2024-11-05 | Software and algorithms for use in remote assessment of disease diagnostics |
Family Applications After (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US18/456,451 Active US12165373B2 (en) | 2020-05-29 | 2023-08-25 | Software and algorithms for use in remote assessment of disease diagnostics |
| US18/937,892 Pending US20250069353A1 (en) | 2020-05-29 | 2024-11-05 | Software and algorithms for use in remote assessment of disease diagnostics |
Country Status (8)
| Country | Link |
|---|---|
| US (3) | US11783563B2 (en) |
| EP (1) | EP4158526A2 (en) |
| JP (1) | JP2023529088A (en) |
| CN (1) | CN115836327A (en) |
| AU (1) | AU2021281435A1 (en) |
| CA (1) | CA3180150A1 (en) |
| MX (1) | MX2022015019A (en) |
| WO (1) | WO2021243254A2 (en) |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN115917297A (en) | 2020-05-29 | 2023-04-04 | 奎多公司 | Systems and methods for remote evaluation of sample assays for disease diagnosis |
| WO2025125304A1 (en) * | 2023-12-14 | 2025-06-19 | Roche Diabetes Care Gmbh | Method for determining an item of information about a quality of an image |
Family Cites Families (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US3644177A (en) * | 1969-11-12 | 1972-02-22 | Yissum Res Dev Co | Monitoring penicillin in biological substances |
| US10226213B2 (en) * | 2002-10-01 | 2019-03-12 | Zhou Tian Xing | Wearable digital device for personal health use for saliva, urine and blood testing and mobile wrist watch powered by user body |
| US8774526B2 (en) | 2010-02-08 | 2014-07-08 | Microsoft Corporation | Intelligent image search results summarization and browsing |
| WO2012109712A1 (en) | 2011-02-18 | 2012-08-23 | National Ict Australia Limited | Image quality assessment |
| US9311520B2 (en) * | 2012-08-08 | 2016-04-12 | Scanadu Incorporated | Method and apparatus for performing and quantifying color changes induced by specific concentrations of biological analytes in an automatically calibrated environment |
| US9674465B2 (en) | 2015-06-03 | 2017-06-06 | Omnivision Technologies, Inc. | Non-visible illumination scheme |
| CN105550651B (en) * | 2015-12-14 | 2019-12-24 | 中国科学院深圳先进技术研究院 | A method and system for automatic analysis of panoramic images of digital pathological slides |
| AU2017204494B2 (en) * | 2016-09-01 | 2019-06-13 | Casio Computer Co., Ltd. | Diagnosis assisting device, image processing method in diagnosis assisting device, and non-transitory storage medium having stored therein program |
| CN106625681A (en) * | 2017-02-23 | 2017-05-10 | 福建强闽信息科技有限公司 | Aquatic animal disease diagnosis robot device and implementation method |
| WO2018154078A1 (en) * | 2017-02-24 | 2018-08-30 | Fundació Institut Català De Nanociència I Nanotecnologia | An analytical test substrate as fluorescent probe for performing a detection of an analyte, a portable device for performing such detection and a system thereof |
| US10835122B2 (en) * | 2018-05-14 | 2020-11-17 | Reliant Immune Diagnostics, Inc. | System and method for image processing of medical test results using generalized curve field transform |
| CN115100418B (en) * | 2022-07-18 | 2025-04-25 | 复旦大学 | Antigen detection kit identification method, device, equipment and storage medium |
2021
- 2021-05-28 CA CA3180150A patent/CA3180150A1/en active Pending
- 2021-05-28 MX MX2022015019A patent/MX2022015019A/en unknown
- 2021-05-28 CN CN202180049223.5A patent/CN115836327A/en active Pending
- 2021-05-28 US US17/333,978 patent/US11783563B2/en active Active
- 2021-05-28 WO PCT/US2021/034928 patent/WO2021243254A2/en not_active Ceased
- 2021-05-28 JP JP2022573434A patent/JP2023529088A/en active Pending
- 2021-05-28 EP EP21740633.9A patent/EP4158526A2/en active Pending
- 2021-05-28 AU AU2021281435A patent/AU2021281435A1/en active Pending

2023
- 2023-08-25 US US18/456,451 patent/US12165373B2/en active Active

2024
- 2024-11-05 US US18/937,892 patent/US20250069353A1/en active Pending
Patent Citations (27)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US3092465A (en) * | 1960-03-25 | 1963-06-04 | Miles Lab | Diagnostic test device for blood sugar |
| US6052692A (en) | 1998-01-30 | 2000-04-18 | Flashpoint Technology, Inc. | Method and system for managing image related events without compromising image processing |
| US6833863B1 (en) | 1998-02-06 | 2004-12-21 | Intel Corporation | Method and apparatus for still image capture during video streaming operations of a tethered digital camera |
| US20170337912A1 (en) | 2004-09-27 | 2017-11-23 | Soundstreak, Llc | Method and apparatus for remote digital content monitoring and management |
| US20070031283A1 (en) | 2005-06-23 | 2007-02-08 | Davis Charles Q | Assay cartridges and methods for point of care instruments |
| US20090263854A1 (en) | 2008-04-21 | 2009-10-22 | Quidel Corporation | Integrated assay device and housing |
| US20100028870A1 (en) * | 2008-06-06 | 2010-02-04 | Mark Welch | Design of synthetic nucleic acids for expression of encoded proteins |
| USD606664S1 (en) | 2008-08-11 | 2009-12-22 | Quidel Corporation | Assay device and housing combined |
| WO2010081219A1 (en) | 2009-01-13 | 2010-07-22 | Fio Corporation | A handheld diagnostic test device and method for use with an electronic device and a test cartridge in a rapid diagnostic test |
| US20120224053A1 (en) | 2009-06-17 | 2012-09-06 | Board Of Regents, The University Of Texas System | Method and apparatus for quantitative microimaging |
| US10168329B2 (en) | 2011-08-03 | 2019-01-01 | Quidel Corporation | N-acetyl-D-glucosamine for enhanced specificity of Strep A immunoassay |
| US9207181B2 (en) * | 2012-03-01 | 2015-12-08 | Quidel Corporation | System and apparatus for point-of-care diagnostics |
| US20150346097A1 (en) | 2012-12-21 | 2015-12-03 | Micronics, Inc. | Portable fluorescence detection system and microassay cartridge |
| US20170254804A1 (en) * | 2013-11-19 | 2017-09-07 | National Tsing Hua University | Portable fluorescence detection system |
| US9989466B2 (en) | 2013-12-06 | 2018-06-05 | Quidel Corporation | Method for reducing analyzer variability using a normalization target |
| US20160349185A1 (en) | 2015-05-29 | 2016-12-01 | Samsung Electronics Co., Ltd. | Strip for analysis and apparatus and system using strip for analysis |
| US20170059566A1 (en) | 2015-08-27 | 2017-03-02 | Quidel Corporation | Immunoassay test device with two fluid flow paths for detection and differentiation of two or more analytes |
| US20180229232A1 (en) | 2017-02-10 | 2018-08-16 | Quidel Corporation | Substrate with channels for controlled fluid flow |
| US20180341818A1 (en) | 2017-05-26 | 2018-11-29 | MP High Tech Solutions Pty Ltd | Apparatus and Method of Location Determination in a Thermal Imaging System |
| US20180367469A1 (en) | 2017-06-19 | 2018-12-20 | Justin Re | Enhanced real-time linking methods and systems |
| US20200256856A1 (en) * | 2017-10-26 | 2020-08-13 | Essenlix Corporation | System and methods of image-based assay using crof and machine learning |
| US20190299209A1 (en) | 2018-03-27 | 2019-10-03 | Lawrence C. Dugan | Multi-channel optical detection system and method for multi-chamber assays |
| US10820847B1 (en) | 2019-08-15 | 2020-11-03 | Talis Biomedical Corporation | Diagnostic system |
| WO2021243254A2 (en) | 2020-05-29 | 2021-12-02 | Quidel Corporation | System and methods for remote assessment of a sample assay for disease diagnostics |
| US20210373008A1 (en) | 2020-05-29 | 2021-12-02 | Quidel Corporation | System and methods for remote assessment of a sample assay for disease diagnostics |
| US20210374959A1 (en) * | 2020-05-29 | 2021-12-02 | Quidel Corporation | Software and algorithms for use in remote assessment of disease diagnostics |
| WO2021243179A1 (en) | 2020-05-29 | 2021-12-02 | Quidel Corporation | System and methods for remote assessment of a sample assay for disease diagnostics |
Non-Patent Citations (7)
| Title |
|---|
| Carrio et al., "Automated Low-Cost Smartphone-Based Lateral Flow Saliva Test Reader for Drugs-of-Abuse Detection", Sensors (Basel), vol. 15, No. 11, pp. 29569-29593; doi:10.3390/s151129569 (2015). * |
| Foysal et al., "Analyte Quantity Detection from Lateral Flow Assay Using a Smartphone", Sensors, vol. 19, No. 21, Art. 4812, 19 pages (2019). |
| International Search Report from International Application No. PCT/US2021/034801, 4 pages, dated Sep. 13, 2021, application now published as International Publication No. WO2021/243179 on Dec. 2, 2021. |
| International Search Report from International Application No. PCT/US2021/034928, 5 pages, dated Dec. 3, 2021, application now published as International Publication No. WO2021/243254 on Dec. 2, 2021. |
| Liu et al., "Point-of-care testing based on smartphone: The current state-of-the-art (2017-2018)", Biosens. Bioelectron., vol. 132, pp. 17-37 (2019). |
| Saisin et al., "Significant Sensitivity Improvement for Camera-Based Lateral Flow Immunoassay Readers", Sensors (Basel), vol. 18, No. 11, Art. 4026, 8 pages (2018). |
Cited By (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20230410454A1 (en) * | 2020-05-29 | 2023-12-21 | Quidel Corporation | Software and algorithms for use in remote assessment of disease diagnostics |
| US12165373B2 (en) * | 2020-05-29 | 2024-12-10 | Ortho-Clinical Diagnostics, Inc. | Software and algorithms for use in remote assessment of disease diagnostics |
| US20230274538A1 (en) * | 2020-10-09 | 2023-08-31 | The Trustees Of Columbia University In The City Of New York | Adaptable Automated Interpretation of Rapid Diagnostic Tests Using Self-Supervised Learning and Few-Shot Learning |
| US12541964B2 (en) * | 2020-10-09 | 2026-02-03 | The Trustees Of Columbia University In The City Of New York | Adaptable automated interpretation of rapid diagnostic tests using self-supervised learning and few-shot learning |
| US20240095930A1 (en) * | 2022-09-21 | 2024-03-21 | Hon Hai Precision Industry Co., Ltd. | Machine learning method |
Also Published As
| Publication number | Publication date |
|---|---|
| CA3180150A1 (en) | 2021-12-02 |
| US20250069353A1 (en) | 2025-02-27 |
| EP4158526A2 (en) | 2023-04-05 |
| JP2023529088A (en) | 2023-07-07 |
| WO2021243254A2 (en) | 2021-12-02 |
| US20230410454A1 (en) | 2023-12-21 |
| AU2021281435A1 (en) | 2023-02-02 |
| CN115836327A (en) | 2023-03-21 |
| MX2022015019A (en) | 2023-03-17 |
| US12165373B2 (en) | 2024-12-10 |
| WO2021243254A3 (en) | 2022-01-06 |
| US20210374959A1 (en) | 2021-12-02 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12165373B2 (en) | Software and algorithms for use in remote assessment of disease diagnostics | |
| Li et al. | A survey on parameter identification, state estimation and data analytics for lateral flow immunoassay: from systems science perspective | |
| US10613082B2 (en) | Device for performing a diagnostic test and methods for use thereof | |
| McRae et al. | Programmable bio-nanochip platform: a point-of-care biosensor system with the capacity to learn | |
| Lu et al. | Rapid diagnostic testing platform for iron and vitamin A deficiency | |
| US10972641B2 (en) | Optics, device, and system for assaying | |
| US20200292539A1 (en) | Result determination in an immunoassay by measuring kinetic slopes | |
| JP2020509403A (en) | Assay optics, devices, and systems | |
| CN101558302A (en) | Portable apparatus for improved sample analysis | |
| US12298251B2 (en) | System and methods for remote assessment of a sample assay for disease diagnostics | |
| Goswami et al. | AI algorithm for mode classification of PCF-SPR sensor design | |
| Ghosh et al. | Rapid single-tier serodiagnosis of Lyme disease | |
| US20220407988A1 (en) | Image-Based Assay Using Mark-Assisted Machine Learning | |
| WO2017058813A1 (en) | Methods and systems for point-of-care sample analysis | |
| Jing et al. | A novel method for quantitative analysis of C-reactive protein lateral flow immunoassays images via CMOS sensor and recurrent neural networks | |
| Matthews et al. | Rapid dengue and outbreak detection with mobile systems and social networks | |
| Augustine et al. | Point-of-care testing: the convergence of innovation and accessibility in diagnostics | |
| Sun et al. | Neural network enables high accuracy for hepatitis B surface antigen detection with a plasmonic platform | |
| US20130224768A1 (en) | Immunochromatographic assay method and apparatus | |
| CN120801703A (en) | Nanometer photon structure chip and preparation method, detection method and related equipment thereof | |
| Zhu et al. | Rapid diagnosis of membranous nephropathy based on kidney tissue Raman spectroscopy and deep learning | |
| Hoque Tania et al. | Pathological test type and chemical detection using deep neural networks: a case study using ELISA and LFA assays | |
| Huang et al. | Label-free immunoassay of carcinoembryonic antigen by microfluidic channel biosensor based on imaging ellipsometry and its clinic application | |
| Zhang et al. | Methodological study of quantitative immunochromatographic detection based on optimised flow measurement | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| AS | Assignment |
Owner name: QUIDEL CORPORATION, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:YAGHOUBI, HOUMAN;MARSH, CURTIS;RONGEY, SCOTT;AND OTHERS;SIGNING DATES FROM 20200922 TO 20201211;REEL/FRAME:056911/0949 |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
| AS | Assignment |
Owner name: BANK OF AMERICA, N.A., NORTH CAROLINA Free format text: SECURITY AGREEMENT;ASSIGNORS:QUIDEL CORPORATION;BIOHELIX CORPORATION;DIAGNOSTIC HYBRIDS, INC.;AND OTHERS;REEL/FRAME:060220/0711 Effective date: 20220527 |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
| AS | Assignment |
Owner name: ORTHO-CLINICAL DIAGNOSTICS, INC., NEW JERSEY Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:QUIDEL CORPORATION;REEL/FRAME:068657/0827 Effective date: 20240812 Owner name: ORTHO-CLINICAL DIAGNOSTICS, INC., NEW JERSEY Free format text: ASSIGNMENT OF ASSIGNOR'S INTEREST;ASSIGNOR:QUIDEL CORPORATION;REEL/FRAME:068657/0827 Effective date: 20240812 |
|
| AS | Assignment |
Owner name: BANK OF AMERICA, N.A., AS ADMINISTRATIVE AGENT, NORTH CAROLINA Free format text: SECURITY AGREEMENT;ASSIGNORS:CRIMSON INTERNATIONAL ASSETS LLC;MICRO TYPING SYSTEMS, INC.;ORTHO-CLINICAL DIAGNOSTICS, INC.;AND OTHERS;REEL/FRAME:072526/0643 Effective date: 20250821 Owner name: QUIDEL CORPORATION, CALIFORNIA Free format text: FREE FORM MESSAGE RELEASE (REEL 060220 / FRAME 0711);ASSIGNOR:BANK OF AMERICA, N.A.;REEL/FRAME:072577/0536 Effective date: 20250821 Owner name: BIOHELIX CORPORATION, MASSACHUSETTS Free format text: FREE FORM MESSAGE RELEASE (REEL 060220 / FRAME 0711);ASSIGNOR:BANK OF AMERICA, N.A.;REEL/FRAME:072577/0536 Effective date: 20250821 Owner name: DIAGNOSTIC HYBRIDS, INC., OHIO Free format text: FREE FORM MESSAGE RELEASE (REEL 060220 / FRAME 0711);ASSIGNOR:BANK OF AMERICA, N.A.;REEL/FRAME:072577/0536 Effective date: 20250821 Owner name: QUIDEL CARDIOVASCULAR INC., CALIFORNIA Free format text: FREE FORM MESSAGE RELEASE (REEL 060220 / FRAME 0711);ASSIGNOR:BANK OF AMERICA, N.A.;REEL/FRAME:072577/0536 Effective date: 20250821 Owner name: ORTHO-CLINICAL DIAGNOSTICS, INC., NEW JERSEY Free format text: FREE FORM MESSAGE RELEASE (REEL 060220 / FRAME 0711);ASSIGNOR:BANK OF AMERICA, N.A.;REEL/FRAME:072577/0536 Effective date: 20250821 Owner name: CRIMSON U.S. 
ASSETS LLC, NEW JERSEY Free format text: FREE FORM MESSAGE RELEASE (REEL 060220 / FRAME 0711);ASSIGNOR:BANK OF AMERICA, N.A.;REEL/FRAME:072577/0536 Effective date: 20250821 Owner name: CRIMSON INTERNATIONAL ASSETS LLC, NEW JERSEY Free format text: FREE FORM MESSAGE RELEASE (REEL 060220 / FRAME 0711);ASSIGNOR:BANK OF AMERICA, N.A.;REEL/FRAME:072577/0536 Effective date: 20250821 Owner name: MICRO TYPING SYSTEMS, INC., FLORIDA Free format text: FREE FORM MESSAGE RELEASE (REEL 060220 / FRAME 0711);ASSIGNOR:BANK OF AMERICA, N.A.;REEL/FRAME:072577/0536 Effective date: 20250821 Owner name: QUIDEL CORPORATION, CALIFORNIA Free format text: RELEASE (REEL 060220 / FRAME 0711);ASSIGNOR:BANK OF AMERICA, N.A.;REEL/FRAME:072577/0536 Effective date: 20250821 Owner name: BIOHELIX CORPORATION, MASSACHUSETTS Free format text: RELEASE (REEL 060220 / FRAME 0711);ASSIGNOR:BANK OF AMERICA, N.A.;REEL/FRAME:072577/0536 Effective date: 20250821 Owner name: DIAGNOSTIC HYBRIDS, INC., OHIO Free format text: RELEASE (REEL 060220 / FRAME 0711);ASSIGNOR:BANK OF AMERICA, N.A.;REEL/FRAME:072577/0536 Effective date: 20250821 Owner name: QUIDEL CARDIOVASCULAR INC., CALIFORNIA Free format text: RELEASE (REEL 060220 / FRAME 0711);ASSIGNOR:BANK OF AMERICA, N.A.;REEL/FRAME:072577/0536 Effective date: 20250821 Owner name: ORTHO-CLINICAL DIAGNOSTICS, INC., NEW JERSEY Free format text: RELEASE (REEL 060220 / FRAME 0711);ASSIGNOR:BANK OF AMERICA, N.A.;REEL/FRAME:072577/0536 Effective date: 20250821 Owner name: CRIMSON U.S. 
ASSETS LLC, NEW JERSEY Free format text: RELEASE (REEL 060220 / FRAME 0711);ASSIGNOR:BANK OF AMERICA, N.A.;REEL/FRAME:072577/0536 Effective date: 20250821 Owner name: CRIMSON INTERNATIONAL ASSETS LLC, NEW JERSEY Free format text: RELEASE (REEL 060220 / FRAME 0711);ASSIGNOR:BANK OF AMERICA, N.A.;REEL/FRAME:072577/0536 Effective date: 20250821 Owner name: MICRO TYPING SYSTEMS, INC., FLORIDA Free format text: RELEASE (REEL 060220 / FRAME 0711);ASSIGNOR:BANK OF AMERICA, N.A.;REEL/FRAME:072577/0536 Effective date: 20250821 |