WO2020076928A1 - Consumer-based disease diagnostics - Google Patents

Consumer-based disease diagnostics

Info

Publication number
WO2020076928A1
WO2020076928A1 (PCT Application No. PCT/US2019/055365)
Authority
WO
WIPO (PCT)
Prior art keywords
optical property
diagnostic test
property modifying
result
modifying device
Prior art date
Application number
PCT/US2019/055365
Other languages
French (fr)
Other versions
WO2020076928A8 (en)
Inventor
Frank B. MYERS
Debkishore MITRA
John Robert Waldeisen
Ivan Krastev Dimov
Original Assignee
Lucira Health, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 16/155,829 (US11080848B2)
Application filed by Lucira Health, Inc.
Priority to CN201980066514.8A (CN112823276A)
Priority to JP2021519556A (JP2022504506A)
Priority to EP19872211.8A (EP3864393A4)
Priority to MX2021004018A
Priority to CA3114215A (CA3114215A1)
Publication of WO2020076928A1
Publication of WO2020076928A8
Priority to DO2021000058A (DOP2021000058A)

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/75 Systems in which material is subjected to a chemical reaction, the progress or the result of the reaction being investigated
    • G01N21/77 Systems in which material is subjected to a chemical reaction, the progress or the result of the reaction being investigated by observing the effect on a chemical indicator
    • G01N21/78 Systems in which material is subjected to a chemical reaction, the progress or the result of the reaction being investigated by observing the effect on a chemical indicator producing a change of colour
    • G01N2021/752 Devices comprising reaction zones
    • G01N2021/7756 Sensor type
    • G01N2021/7769 Measurement method of reaction-produced change in sensor
    • G01N2021/7786 Fluorescence
    • G01N2201/00 Features of devices classified in G01N21/00
    • G01N2201/02 Mechanical
    • G01N2201/022 Casings
    • G01N2201/0221 Portable; cableless; compact; hand-held

Definitions

  • This description generally relates to disease diagnostics, and particularly to a portable system to perform image-based disease diagnostic tests.

DESCRIPTION OF THE RELATED ART
  • the hardware required to complete an immunoassay for a diagnostic test can be fit into a handheld device.
  • trained personnel such as a nurse, physician, or lab technician must analyze the handheld device to determine the test result.
  • a patient would have to visit a facility such as a healthcare clinic, or the trained personnel would have to visit the patient at the patient’s home, for example.
  • the patient can collect a biological sample of the patient (such as a mouth or nostril swab) at the patient’s home and mail the biological sample to a laboratory to perform the immunoassay using the biological sample.
  • a diagnostic system performs disease diagnostic tests (also referred to herein as "diagnostic tests" or "nucleic acid disease diagnostic tests") using at least an optical property modifying device and a mobile device.
  • the optical property modifying device includes a plurality of reaction chambers. The reaction chambers are configured to perform a reaction between a biological sample and at least an optical property modifying reagent.
  • the optical property modifying device can include an interface configured to receive the biological sample.
  • the diagnostic system can communicate instructions for the diagnostic test to the user via the mobile device.
  • the plurality of reaction chambers can be located on a surface of the optical property modifying device such that at least one of a geometric pattern formed by a layout of the plurality of reaction chambers and a color of the reaction between the biological sample and at least the optical property modifying reagent within the plurality of reaction chambers is visible from a point exterior to the optical property modifying device.
  • the optical property modifying device includes an electronic display configured to display an indication of a result of the diagnostic test.
  • the electronic display can be a liquid crystal display, an organic light emitting diode display, an electronic paper display, and/or one or more light emitting diodes (LEDs).
  • the indication of the diagnostic test result presented by the electronic display of the optical property modifying device can be the actual result of the diagnostic test.
  • the indication can be presented by the electronic display as human-readable symbols such as text, machine-readable symbols such as a barcode, and any other type of symbols.
  • a user provides a biological sample to the optical property modifying device.
  • in embodiments in which the optical property modifying device includes an interface, the user provides the biological sample to the interface of the optical property modifying device.
  • the biological sample reacts with an optical property modifying reagent in the reaction chambers of the device.
  • the optical property modifying device performs a nucleic acid amplification assay that changes the color of a solution including the biological sample and the optical property modifying reagent in the reaction chambers.
  • the user captures one or more images of the optical property modifying device using an optical sensor (e.g., camera) of the mobile device.
  • the images of the optical property modifying device that are captured by the user can include different portions of the optical property modifying device.
  • the optical property modifying device or a user interface displayed on the mobile device can provide visual markers.
  • the diagnostic system analyzes one or more captured images at the mobile device or at a computer server in communication with the mobile device.
  • the diagnostic system can determine whether the optical property modifying device is included in the images, based on, for example, geometric characteristics and/or color characteristics that are known to be associated with the optical property modifying device.
  • a geometric characteristic known to be associated with the optical property modifying device can include, for example, a geometric pattern formed by a layout of the reaction chambers of the optical property modifying device.
  • a color characteristic known to be associated with the optical property modifying device can include, for example, colors of the reaction between the biological sample and at least the optical property modifying reagent within the reaction chambers.
  • the diagnostic system can also determine whether the optical property modifying device is included in the images based on any other geometric characteristics and/or color characteristics that are known to be associated with the optical property modifying device.
  • the diagnostic system can determine whether a particular portion of the optical property modifying device is included in the captured images. For example, the diagnostic system can determine whether the electronic display of the optical property modifying device is included in the images. The diagnostic system can also determine a quality level of the images based on factors such as skew, scale, focusing, shadowing, or white-balancing. The diagnostic system can select one or more of the captured images for further analysis based on any of the above determinations. For instance, the diagnostic system can select captured images that are determined to include a particular portion of the optical property modifying device, such as the electronic display, for further analysis.
  • the diagnostic system determines a result of the diagnostic test based at least in part on the images of the optical property modifying device captured by the mobile device.
  • a result of a diagnostic test can include, for instance, a positive result, a negative result, a quantitative result, and/or an undetermined result.
  • the diagnostic system determines the result of the diagnostic test based at least in part on the indication displayed by the electronic display in the images.
  • in embodiments in which the images captured by the mobile device include an electronic display of the optical property modifying device that displays an indication of a result of the diagnostic test performed, the result of the diagnostic test can be determined by performing at least one of optical pattern matching, optical template matching, and optical character recognition on the indication displayed by the electronic display in the captured image.
  • the diagnostic system determines the result of the diagnostic test based at least in part on at least one of a known geometric characteristic and a known color characteristic associated with the optical property modifying device and depicted in the images. Specifically, based on a known geometric characteristic of the optical property modifying device, such as a geometric pattern formed by a layout of the reaction chambers, a location of each reaction chamber in an image can be identified. Then, a color of the image at the identified location of each reaction chamber in the image can be determined. The identified color for each reaction chamber can be compared with known color characteristics of the optical property modifying device to determine the result of the diagnostic test.
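The chamber-location-and-color comparison described above can be illustrated with a minimal sketch. All chamber coordinates, the patch size, and the hue ranges below are illustrative assumptions rather than values from the patent, and the sketch assumes the image has already been rectified so the chamber layout sits at known pixel positions.

```python
import numpy as np

# Hypothetical chamber centers (pixels) in a rectified image of the cartridge,
# derived from the known geometric layout of the reaction chambers.
CHAMBER_CENTERS = [(120, 80), (120, 140), (120, 200), (120, 260), (120, 320)]
PATCH = 10  # half-width of the pixel patch sampled per chamber

# Illustrative reference hue ranges (degrees) per outcome; not values from the patent.
COLOR_RANGES = {"negative": (40, 80), "positive": (200, 280)}

def chamber_result(image_hsv: np.ndarray, center) -> str:
    """Average the hue around a chamber center and map it to a test outcome."""
    x, y = center
    patch = image_hsv[y - PATCH:y + PATCH, x - PATCH:x + PATCH, 0]
    mean_hue = float(patch.mean())
    for outcome, (lo, hi) in COLOR_RANGES.items():
        if lo <= mean_hue <= hi:
            return outcome
    return "undetermined"

def diagnose(image_hsv: np.ndarray):
    """Return one outcome per reaction chamber."""
    return [chamber_result(image_hsv, c) for c in CHAMBER_CENTERS]
```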
  • the diagnostic system can communicate the determined result of the diagnostic test to the user via the mobile device. Furthermore, in some embodiments, a message that includes sponsored content based on the diagnostic test result is generated and provided to the user via the mobile device.
  • the diagnostic system can also provide the determined result to a third-party system including a health care provider such as a physician or a pharmacy, a government agency such as the Center for Disease Control (CDC), an insurance provider, a telemedicine partner, a treatment manufacturer, and any other third-party system to aid in disease prevention and/or treatment.
  • provision of the result of the diagnostic test to a pharmacy can allow the pharmacy to fulfill a prescription for the user.
  • the result of the diagnostic test can be provided to a physician located within a predetermined distance of the user.
  • a geographical location of the mobile device and/or a timestamp of the geographical location are determined at one or more time points before, after, and/or during performance of the diagnostic test.
  • a geographical location of the mobile device and a timestamp of the geographical location can be determined when the result of the diagnostic test is provided to the user via the mobile device.
  • the geographical location of the mobile device can be determined by a location sensor, such as a Global Positioning System (GPS) sensor, of the mobile device.
  • the diagnostic system can store the geographical location and/or the timestamp of the geographical location with the determined result of the diagnostic test.
  • the result of the diagnostic test is determined in part based on at least one of the determined geographical location and the timestamp.
  • the sensitivity and specificity of the diagnostic test result can be adjusted based on geospatial and seasonal variations in disease prevalence.
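One way to make this adjustment concrete is Bayes' rule: for a fixed assay sensitivity and specificity, the probability that a positive result reflects true infection depends on local, seasonal disease prevalence. A small illustrative calculation (the numeric values are assumptions, not figures from the patent):

```python
def positive_predictive_value(sensitivity: float, specificity: float, prevalence: float) -> float:
    """P(disease | positive result) via Bayes' rule."""
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

# The same raw result carries different weight in high- vs low-prevalence seasons.
print(positive_predictive_value(0.95, 0.95, 0.20))  # flu season: ~0.83
print(positive_predictive_value(0.95, 0.95, 0.01))  # off season: ~0.16
```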
  • metadata in addition to or instead of geographical location and timestamp data can be provided alongside the diagnostic test result to one or more of the third-party systems mentioned above. For instance, data from the user’s electronic health record, the user’s vital signs, and any other pertinent information may be provided to the one or more third-party systems.
  • the geographical location and/or the timestamp can be provided alongside the result of the diagnostic test to a third-party system, as briefly mentioned above.
  • the diagnostic test result, geographical location, and timestamp can be provided to a third-party system that stores epidemiological data that is based on aggregated diagnostic test results from a plurality of mobile devices associated with a population of patients.
  • epidemiological data can be used, for example, in disease prevention.
  • the optical property modifying device, mobile device, and any other test equipment, such as a swab and collection tube for the biological sample, are all portable.
  • a central lab is not required to perform disease diagnostic tests.
  • since analysis can be performed on the mobile device or a computer server in communication with the mobile device, the diagnostic system can provide a quick diagnosis without requiring input from health care personnel.
  • determination of the result of the diagnostic test by the diagnostic system is independent of mobile device hardware because the diagnostic system is configured to normalize images captured from different types of mobile devices and optical sensors.
  • the diagnostic system is a point-of-care system that enables patients to conveniently complete diagnostic tests at their homes (or other patient locations) without having to leave their homes, mail test material to a lab, or have health care personnel visit them at their homes.
  • the optical property modifying device transmits one or more data streams to the diagnostic system, for use by the diagnostic system in determining the result of the diagnostic test.
  • the optical property modifying device further comprises a data stream transmission module that is configured to transmit data streams from the optical property modifying device.
  • the information transmitted from the optical property modifying device to the diagnostic system within the data streams can comprise any of the information that the mobile device captures in images in alternative embodiments, and in some embodiments, even further information.
  • the data streams can be transmitted from the optical property modifying device via electromagnetic radiofrequency transmission (e.g. Bluetooth), via ultrasonic transmission, and/or via audio transmission. Benefits to transmission of data streams via ultrasonic transmission include the ability to quickly transmit large data structures without specialized RF hardware or software pairing.
  • a non-transitory computer-readable storage medium stores instructions that, when executed by a processor, cause the processor to execute any embodiment of the above-described methods.
  • FIG. 1 is a diagram of a system environment for performing disease diagnostic tests, according to one embodiment.
  • FIG. 2A is a diagram of an optical property modifying device for use with a mobile device, according to one embodiment.
  • FIG. 2B is a diagram of an optical property modifying device for use with a mobile device, according to another embodiment.
  • FIG. 3 is a block diagram of a diagnostic system, according to one embodiment.
  • FIG. 4A shows a user interface of a mobile application for a disease diagnostic test, according to one embodiment.
  • FIG. 4B shows another user interface of the mobile application shown in FIG. 4A including information about the disease diagnostic test, according to one embodiment.
  • FIG. 4C shows another user interface of the mobile application shown in FIG. 4A including instructions to use a swab, according to one embodiment.
  • FIG. 4D shows another user interface of the mobile application shown in FIG. 4A including instructions to use a collection tube, according to one embodiment.
  • FIG. 4E shows another user interface of the mobile application shown in FIG. 4A including instructions to use a cartridge, according to one embodiment.
  • FIG. 4F shows another user interface of the mobile application shown in FIG. 4A including instructions to wait for a chemical reaction of the disease diagnostic test to complete, according to one embodiment.
  • FIG. 4G shows another user interface of the mobile application shown in FIG. 4A including instructions to scan the cartridge, according to one embodiment.
  • FIG. 4H shows another user interface of the mobile application shown in FIG. 4A including results for the disease diagnostic test, according to one embodiment.
  • FIG. 5A is a data flow diagram for performing a disease diagnostic test in a conventional system environment, according to one embodiment.
  • FIG. 5B is a data flow diagram for performing a disease diagnostic test in a system environment including the diagnostic system, according to one embodiment.
  • FIG. 6 is a flowchart illustrating a process for determining test results for a disease diagnostic test, according to one embodiment.
  • FIG. 1 is a diagram of a system environment for performing disease diagnostic tests, according to one embodiment.
  • the system environment includes a diagnostic server 150, a mobile device 110, an optical property modifying device 120, and one or more health care providers 130.
  • the diagnostic server 150, mobile device 110, and health care providers 130 can be connected to each other via a network 140.
  • different and/or additional entities can be included in the system environment.
  • the functions performed by the various entities of FIG. 1 can vary in different embodiments.
  • the mobile device 110 is an electronic device that includes a diagnostic system 100 to determine test results for disease diagnostic tests performed by users using at least the mobile device 110 (e.g., a smartphone, tablet, laptop computer, etc.) and the optical property modifying device 120. Since the mobile device 110 and optical property modifying device 120 are portable, users can perform disease diagnostic tests at the patient's home or any other suitable location outside of health care facilities such as hospitals or central labs.
  • the diagnostic server 150 is a computer server that can perform some or all functionality of the diagnostic system 100 in some embodiments.
  • a user of the mobile device 110 interacts with the diagnostic system 100 via a mobile application.
  • the mobile application communicates information from the diagnostic system 100.
  • the mobile application can present instructions or results for a disease diagnostic test on a graphical user interface displayed on an electronic display on the mobile device 110.
  • the mobile application provides information via audio signals or tactile feedback (e.g., vibrating the mobile device 110).
  • the mobile application running on the mobile device 110 can provide data from sensors of the mobile device 110 to the diagnostic system 100.
  • the mobile device 110 includes an optical sensor such as a camera to capture images of the optical property modifying device 120.
  • the mobile device 110 can provide the captured images to the diagnostic system 100 for further processing, which is further described with reference to FIG. 3.
  • the mobile device 110 can also include a location sensor such as a Global Positioning System (GPS) sensor, a motion sensor such as an accelerometer, gyroscope, or inertial measurement unit, a proximity sensor, or a temperature sensor.
  • the mobile device 110 can communicate with the diagnostic server 150 and health care provider 130 via the network 140, which can comprise any combination of local area and wide area networks employing wired or wireless communication links.
  • the network 140 uses standard communications technologies and Internet protocols.
  • all or some of the communication links of the network 140 can be encrypted, for example, to provide a technical safeguard for Health Insurance Portability and Accountability Act (HIPAA) compliance.
  • transmission of information from the optical property modifying device 120 to the mobile device 110, to the health care provider 130, and/or to the diagnostic server 150 is discussed as image-based transmission. Specifically, throughout this disclosure, information is discussed as being transmitted from the optical property modifying device 120 to the mobile device 110, to the health care provider 130, and/or to the diagnostic server 150 via images captured of the optical property modifying device 120 by the mobile device 110. However, in additional or alternative embodiments, rather than transmitting information from the optical property modifying device 120 to other modules of the system environment via images captured by the mobile device 110, the optical property modifying device 120 can directly transmit data to other modules of the system environment via one or more alternative means including electromagnetic radiofrequency transmission (e.g., Bluetooth), ultrasonic transmission, and/or audio transmission.
  • the optical property modifying device 120 can include a data transmission module that is configured to transmit data directly from the optical property modifying device 120.
  • the diagnostic system can determine diagnostic test results based on data received directly from the optical property modifying device 120, as opposed to images received from the mobile device 110. While these alternative embodiments of data transmission are not discussed throughout the remainder of this disclosure, it is to be noted that in some embodiments, this alternative mode of data transmission can be implemented instead of or in addition to image-based data transmission.
  • the health care provider 130 is a computer server associated with a health care provider such as a pharmacy, a central laboratory (e.g., for completing chemical reactions for disease diagnostic tests), a hospital, other types of healthcare facilities, or any other suitable provider of health care services.
  • the diagnostic system 100 provides a disease diagnostic test result of a patient to a pharmacy. Based on the results, the pharmacy determines an appropriate prescription for the patient.
  • FIG. 2A is a diagram of the optical property modifying device 120 for use with the mobile device 110, according to one embodiment.
  • the optical property modifying device 120 (also referred to herein as a “cartridge”) includes a collection tube interface 230, one or more reaction chambers 240, an indicator 250, and a QR code 260.
  • the optical property modifying device 120 can include fewer or additional components than those depicted in FIG. 2A.
  • a user uses the optical property modifying device 120 to react a biological sample (e.g., including a nucleic acid) of a patient with an optical property modifying reagent in the reaction chambers 240.
  • the optical property modifying device 120 can receive the biological sample from a collection tube 200 via the collection tube interface 230.
  • the mobile device 110 can provide instructions for the test to the user, for example, via a graphical user interface of the mobile device 110 as further described below with reference to FIGS. 4A-H.
  • the indicator 250 can be a light-emitting diode (LED) that provides an indication of a status of a disease diagnostic test being performed by the optical property modifying device 120.
  • the indicator 250 can display a red colored light indicating that the test has not yet started, a yellow colored light indicating that the test is in progress, and a green colored light indicating that the test has completed.
  • the indicator 250 can be another type of indicator different than an LED (e.g., an audio indicator) and the optical property modifying device 120 can include any number of indicators 250 (including zero).
  • the embodiment of the optical property modifying device 120 depicted in FIG. 2A includes the QR code 260.
  • the QR code 260 can be associated with information for a disease diagnostic test such as a type of the test or an expiration date of the test.
  • the diagnostic system 100 can determine the information for the disease diagnostic test by processing images of the QR code 260 scanned by an optical sensor of the mobile device 110.
  • the QR code 260 can also be another type of code, such as a barcode, other identifier, or machine-readable signature. While the embodiment of the optical property modifying device 120 includes the QR code 260, in some alternative embodiments, the optical property modifying device 120 does not include a QR code.
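As a hedged sketch of how a mobile application might read such a code, the following uses OpenCV's QR detector; the payload format (key=value pairs) is purely an assumption, since the patent does not specify how the test type or expiration date is encoded.

```python
import cv2

def read_cartridge_code(image_path: str) -> dict:
    """Decode the cartridge QR code and parse an assumed 'key=value;...' payload."""
    image = cv2.imread(image_path)
    detector = cv2.QRCodeDetector()
    data, points, _ = detector.detectAndDecode(image)
    if not data:
        return {}
    # Assumed payload, e.g. "test=FluAB;lot=1234;expires=2020-12-31"
    return dict(field.split("=", 1) for field in data.split(";") if "=" in field)
```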
  • the optical property modifying device 120 performs a type of disease diagnostic test involving nucleic acid(s), e.g. nucleic acid amplification, using the reaction chambers 240 to determine the presence or amount of a particular genetic target in the biological sample.
  • the optical property modifying device 120 is configured to receive an optical property modifying reagent, which is a solution comprising nucleic acid enzymes and primers which specifically amplify a target nucleic acid sequence specific to an organism of interest. The presence or amount of this target nucleic acid sequence within the biological sample can indicate that a patient is infected with a particular pathogen or carries a particular genetic disease.
  • the optical property modifying reagent includes a dye which changes color or fluoresces upon target amplification, resulting in a visible change in the solution’s color characteristics.
  • the diagnostic system 100 can analyze the color characteristics of the solution of the biological sample and the optical property modifying reagent to determine results of the disease diagnostic test.
  • the optical property modifying device 120 can perform other tests such as immunoassays or enzyme-linked immunosorbent assays (ELISAs).
  • the reaction chambers 240 are located on a surface of the optical property modifying device 120, such that the reaction chambers 240 and their contents are at least partially visible from a surface of the optical property modifying device 120.
  • the mobile device 110 can capture images of the reaction chambers 240 and their contents.
  • because the reaction chambers 240 are located on a surface of the optical property modifying device 120, to determine that an image depicts the optical property modifying device 120, it can be determined whether the image depicts one or more of the color characteristics and geometric characteristics known to be associated with the reaction chambers 240 of the optical property modifying device 120.
  • a geometric characteristic known to be associated with the reaction chambers 240 of the optical property modifying device 120 can comprise a geometric pattern formed by a layout of the reaction chambers.
  • a color characteristic known to be associated with the reaction chambers 240 of the optical property modifying device 120 can comprise a color of the reaction between the biological sample and the optical property modifying reagent within a reaction chamber of the device.
  • the diagnostic system can also use one or more of color characteristics and geometric characteristics identified in an image of the reaction chambers 240 of the optical property modifying device 120, to determine a result of a diagnostic test performed by the optical property modifying device 120. More specifically, using known geometric characteristics of the optical property modifying device 120, a location of each reaction chamber can be identified in the image. Then, a color of the image at the identified location of each reaction chamber in the image can be determined. These identified colors can be compared to known colorimetric characteristics of the optical property modifying device 120 to determine the result of the diagnostic test performed by the optical property modifying device 120. In this way, capturing images of the optical property modifying device 120 using the mobile device 110 enables determination of the result of the diagnostic test performed by the optical property modifying device 120.
  • in other embodiments, the reaction chambers 240 are not visible from a surface of the optical property modifying device 120, and thus the contents of the reaction chambers 240 are also not visible.
  • alternative means of determining a result of a diagnostic test performed by the optical property modifying device 120 can be used.
  • the optical property modifying device 120 can perform multiple disease diagnostic tests in parallel or in series.
  • the optical property modifying device 120 shown in the example in FIG. 2A includes five reaction chambers 240, though in other embodiments, the number of reaction chambers can vary. Each reaction chamber may or may not be coupled to another reaction chamber. Further, a given disease diagnostic test can be associated with one or more reaction chambers.
  • the optical property modifying device 120 shown in FIG. 2A is used to perform an influenza type A test associated with two of the reaction chambers, an influenza type B test associated with one of the reaction chambers, a positive internal control test associated with one of the reaction chambers, and a negative internal control test associated with one of the reaction chambers.
  • the optical property modifying device 120 can include a user control such as a tab, a button, or a switch.
  • the optical property modifying device 120 starts the chemical reaction of the disease diagnostic test when the user interacts with the user control. For example, when the user pulls the tab, presses the button, or activates the switch, the optical property modifying device 120 can start to mix the optical property modifying reagent with the biological sample and regulate temperature of the reaction chambers to an appropriate set temperature point. In other embodiments, the optical property modifying device can start mixing and regulating temperature of the reaction chambers automatically once a biological sample is presented.
  • the collection tube 200 is configured to store a biological sample and includes a cartridge interface 210 and a cap 220.
  • a user can provide a biological sample to the collection tube 200 using a swab, for example.
  • the cap 220 encloses the biological sample in the collection tube 200.
  • a user can attach the collection tube 200 to the optical property modifying device 120 by physically contacting the cartridge interface 210 of the collection tube 200 to the collection tube interface 230 of the optical property modifying device 120. When attached, the collection tube 200 can provide a stored biological sample to the optical property modifying device 120 via the interfaces 210 and 230. Though the collection tube 200 shown in FIG. 2A has a particular form factor, in other embodiments the collection tube 200 can have a different overall shape or form factor such as a rectangular prism.
  • a collection tube is not used to provide the biological sample to the optical property modifying device 120.
  • FIG. 2B is a diagram of the optical property modifying device 120 for use with the mobile device 110, according to another embodiment. Unlike the embodiment of the optical property modifying device 120 shown in FIG. 2A, in the embodiment of the optical property modifying device 120 shown in FIG. 2B, the reaction chambers of the device are not located on a surface of the device, and thus are not visible from a point exterior to the optical property modifying device 120.
  • the optical property modifying device 120 includes an electronic display 270 on a surface.
  • the electronic display 270 can be, for example, a liquid crystal display (LCD), organic light emitting diode (OLED) display, electronic paper display, or one or more individual light emitting diodes (LEDs), among other types of displays.
  • the electronic display 270 is configured to present an indication of one or more results of a diagnostic test performed by the optical property modifying device 120, based on reactions between a biological sample and at least an optical property modifying reagent within the reaction chambers of the optical property modifying device 120.
  • an indication of a diagnostic test result presented by the electronic display 270 can include the actual, determined result of the diagnostic test, presented as one or more of human-readable symbols (e.g., alphanumeric characters or graphics) and machine-readable symbols (e.g., a barcode or QR code).
  • a software algorithm can determine the indication to be displayed by the electronic display 270, based on one or more of the color characteristics and geometric characteristics associated with the reaction chambers of the optical property modifying device 120.
  • a color characteristic can include, for example, a color of a reaction between the biological sample and at least the optical property modifying reagent within a reaction chamber.
  • a geometric characteristic can include, for example, a geometric pattern formed by a layout of the reaction chambers of the optical property modifying device 120
  • the mobile device 110 can capture images of the electronic display 270.
  • the mobile device 110 can be positioned over the optical property modifying device 120 to scan or take a photo or video of the electronic display 270 of the optical property modifying device 120.
  • the diagnostic system can use image recognition processes to determine the result of the diagnostic test based on the indication presented by the electronic display 270, as captured in the images.
  • the image recognition processes can include, for example, optical pattern matching, optical template matching, or optical character recognition (OCR) algorithms known to one skilled in the art.
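A minimal sketch of the template-matching variant using OpenCV follows; the template file names and the match threshold are assumptions, not part of the patent.

```python
import cv2

# Hypothetical templates of the symbols the electronic display can show.
TEMPLATES = {"positive": "display_positive.png", "negative": "display_negative.png"}
MATCH_THRESHOLD = 0.8  # assumed minimum normalized correlation

def read_display(display_gray):
    """Match the cropped display image against known result templates."""
    best_label, best_score = "undetermined", 0.0
    for label, path in TEMPLATES.items():
        template = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        if template is None:
            continue
        scores = cv2.matchTemplate(display_gray, template, cv2.TM_CCOEFF_NORMED)
        score = float(scores.max())
        if score > best_score:
            best_label, best_score = label, score
    return best_label if best_score >= MATCH_THRESHOLD else "undetermined"
```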
  • the diagnostic system can detect that an image of the electronic display 270 of the optical property modifying device 120 captured by the mobile device 110 presents a message indicating a positive diagnosis for influenza A. Responsive to the detection, the mobile device 110 can provide further interpretation or options for next steps to a user.
  • the embodiment of the optical property modifying device 120 depicted in FIG. 2A includes reaction chambers 240 located on a surface of the optical property modifying device 120 and no electronic display
  • the embodiment of the optical property modifying device 120 depicted in FIG. 2B includes an electronic display 270 and reaction chambers that are not located on a surface of the optical property modifying device 120
  • features of the embodiments of the optical property modifying device 120 depicted in FIGS. 2A-B can occur in any combination.
  • another embodiment of the optical property modifying device 120 may include both the electronic display 270 and the reaction chambers 240 located on a surface of the optical property modifying device 120.
  • FIG. 3 is a block diagram of the diagnostic system 100, according to one embodiment.
  • the diagnostic system 100 includes a user data store 310, an image processing engine 320, an image analysis engine 330, a diagnostic test engine 340, and a test data store 350.
  • the diagnostic system 100 can include additional, fewer, or different components for various applications.
  • the functionality of the diagnostic system 100 can be performed by or implemented on the diagnostic server 150 instead of the mobile device 110.
  • the mobile device 110 can acquire images of the optical property modifying device 120 using the image processing engine 320 and provide the images to the diagnostic server 150 for further processing by the image analysis engine 330.
  • the mobile device 110 can then receive a test result determined by the diagnostic server 150 and communicate the test result to a user. This can be advantageous because the image analysis engine 330 can require more computational resources relative to the image processing engine 320 or other components of the diagnostic system 100 on the mobile device 110.
  • the user data store 310 stores information about users of the diagnostic system 100, such as patients who have completed, or will complete, a diagnostic test.
  • the information can include user information such as name, demographics, family data, geographical location, timestamp data, contact information, or physiological data.
  • the information can describe a user's medical history such as prescriptions, health conditions, genetic data, visits to health care providers, past or current medical services or interventions received, or previously completed diagnostic tests along with relevant details such as the date and/or time that the tests were taken and the test results.
  • the image processing engine 320 acquires and processes images from the mobile device 110.
  • the image processing engine 320 can implement image processing techniques known to those skilled in the art such as noise reduction using different types of filters, skew correction, resizing, rescaling, rotation, and equalization.
  • the image processing engine 320 can also determine whether the optical property modifying device 120 or a particular portion of the optical property modifying device 120 is included in the image based on known geometric characteristics and/or color characteristics associated with the optical property modifying device 120.
  • a known geometric characteristic of the optical property modifying device 120 can include a geometric pattern formed by a layout of the reaction chambers on the surface of the optical property modifying device 120.
  • in embodiments in which the optical property modifying device 120 includes an electronic display, the known geometric characteristic of the optical property modifying device 120 can include dimensions of the electronic display.
  • the QR code 260 shown in FIGS. 2A and 2B can be a geometric characteristic that is unique to the optical property modifying device 120, a batch of optical property modifying devices 120 (e.g., having a certain expiration date based on a manufacturing date), or a particular type of diagnostic test.
  • Other types of geometric characteristics include, for example, a border of the optical property modifying device 120, a two-dimensional (2D) or three-dimensional (3D) barcode, an alignment marker (e.g., the reaction chambers 240, the indicator 250, the example text "Flu A+B Test" shown in FIG. 2A), or any other type of fiducial marker that can be a point of reference in an image.
  • the image processing engine 320 can transform an acquired image (e.g. translate, skew, scale, or rotate) based on geometric characteristics, or in other words, spatial data of the image determined by the image processing engine 320. Alternatively, the image processing engine 320 can leave the image unchanged but instead transform the coordinates it uses to identify regions of interest for analysis within the image.
  • a fiducial marker of the optical property modifying device 120 has a reference orientation and/or size.
  • the QR code 260 shown in FIG. 2A is known to have a length and width of one centimeter (e.g., dimensions of a square) and be oriented with the text "Flu A+B Test" labeled on the optical property modifying device 120.
  • the image processing engine 320 determines that the QR code 260 appears smaller or larger than expected based on the reference size, the image processing engine 320 can scale the image up or down, respectively. If the image processing engine 320 determines that the QR code 260 shown in an image is skewed in shape (e.g., a trapezoid or rhombus), the image processing engine 320 can adjust the skew of the image so that the QR code 260 is rendered more closely as the expected shape of a square. If the image processing engine 320 determines that the QR code 260 shown in the image is not aligned to the reference orientation (e.g., the image has been rotated by 45, 60, or 90 degrees), the image processing engine 320 can rotate the image to the reference orientation. Further, the image processing engine 320 can determine a level of focus of the image by resolving two or more features of the optical property modifying device 120 such as the QR code 260 and another fiducial marker.
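A sketch of this rectification step, using the QR code corners as fiducial points; the corner ordering, pixel scale, and output size are assumptions.

```python
import cv2
import numpy as np

def rectify_cartridge(image, qr_corners, out_size=(400, 800)):
    """Warp the image so the QR code lands at a known position, scale, and rotation.

    qr_corners: four (x, y) points of the QR code in the captured image,
    ordered top-left, top-right, bottom-right, bottom-left.
    """
    # Assumed reference placement of the 1 cm x 1 cm QR code in the rectified image.
    side = 100  # assumed pixels per centimeter in the rectified image
    reference = np.float32([[20, 20], [20 + side, 20],
                            [20 + side, 20 + side], [20, 20 + side]])
    transform = cv2.getPerspectiveTransform(np.float32(qr_corners), reference)
    return cv2.warpPerspective(image, transform, out_size)
```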
  • known color characteristics can also be associated with the optical property modifying device 120.
  • a known color characteristic of the optical property modifying device 120 can include a color of one or more contents of the reaction chambers of the optical property modifying device 120, such as the optical property modifying reagent or a reaction solution. Additional color characteristics of the optical property modifying device 120 can include colors of components of the optical property modifying device 120 itself, such as, for example, a color of the indicator 250 shown in FIG. 2A.
  • a surface of the optical property modifying device 120 can be known to have a uniform color (e.g., an approximately white color).
  • an image of the optical property modifying device 120 can be tinted with warmer colors (e.g., red) or cooler colors (e.g., blue) depending on lighting conditions when the image was captured.
  • the image processing engine 320 can use white balancing (e.g., color balancing) to determine the lighting conditions of the image and adjust the colors of the image.
  • the image processing engine 320 can also use shadowing techniques to remove or otherwise account for shadows in images that can skew the color of certain portions of the images. For instance, the image processing engine 320 measures the uniformity of a known color of the surface of the optical property modifying device 120 to identify shadows in an image.
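A minimal illustration of one common white-balancing approach (gray-world) together with a simple uniformity check over a region of the cartridge surface known to be white; the region coordinates and tolerance are assumptions, and the patent does not mandate these particular algorithms.

```python
import numpy as np

def gray_world_balance(image: np.ndarray) -> np.ndarray:
    """Scale each channel so the image's average color becomes neutral gray."""
    image = image.astype(np.float64)
    channel_means = image.reshape(-1, 3).mean(axis=0)
    gains = channel_means.mean() / channel_means
    return np.clip(image * gains, 0, 255).astype(np.uint8)

def has_shadow(image: np.ndarray, white_region, tolerance: float = 20.0) -> bool:
    """Flag non-uniform brightness over a cartridge region known to be white."""
    y0, y1, x0, x1 = white_region  # assumed region of the white cartridge surface
    brightness = image[y0:y1, x0:x1].mean(axis=2)
    return float(brightness.std()) > tolerance
```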
  • the image processing engine 320 continuously monitors images captured by an optical sensor of the mobile device 110.
  • the mobile device 110 displays a live stream of images captured by the optical sensor
  • the image processing engine 320 can determine a quality level of each image. If the quality level of an image meets a threshold level, the image processing engine 320 selects the image for further processing. The image processing engine 320 can determine the quality level and threshold level based on, for instance, the focus, shadowing, white-balancing level, or illumination level of an image.
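One plausible quality gate combines a focus metric (variance of the Laplacian) with an illumination check; the thresholds below are assumptions, not values from the patent.

```python
import cv2

FOCUS_THRESHOLD = 100.0        # assumed minimum Laplacian variance for a sharp frame
BRIGHTNESS_RANGE = (60, 200)   # assumed acceptable mean gray level

def frame_is_usable(frame) -> bool:
    """Accept a camera frame only if it is sharp and reasonably exposed."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    focus = cv2.Laplacian(gray, cv2.CV_64F).var()
    brightness = gray.mean()
    return focus >= FOCUS_THRESHOLD and BRIGHTNESS_RANGE[0] <= brightness <= BRIGHTNESS_RANGE[1]
```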
  • the image processing engine 320 can normalize images captured by mobile devices 110 having different technical specifications. This can be advantageous because users will likely have different types of mobile devices 110. For instance, one mobile device may have a higher resolution camera with a stronger white flash than another mobile device. By normalizing images captured from these two mobile devices, the image processing engine 320 can effectively process images regardless of the particular specifications of the mobile devices.
  • the image analysis engine 330 analyzes images acquired and processed by the image processing engine 320. Specifically, in embodiments in which the optical property modifying device 120 includes reaction chambers located on a surface of the optical property modifying device 120 such that contents of the reactions chambers are at least partially visible in images of the optical property modifying device 120, the image analysis engine 330 analyzes the reaction chambers in the images. On the other hand, in embodiments in which the optical property modifying device 120 includes an electronic display that displays an indication of a result of a diagnostic test performed by the optical property modifying device 120, the image analysis engine 330 analyzes the indication displayed by the electronic display in the images captured of the optical property modifying device 120. Both embodiments of image analysis by the image analysis engine 330 are discussed in turn below.
  • the image analysis engine 330 determines one or more color characteristics of the reaction chambers in the images such as the color, transparency, translucency, opacity, or uniformity of the reaction chambers and/or contents within the reaction chambers.
  • the image analysis engine 330 determines information about the color characteristics of the reaction chambers, e.g., the average and variance of color of a group of pixels in an image corresponding to each reaction chamber.
  • the image analysis engine 330 can determine the color average and variance (or any other statistical analysis) based on an RGB (red-green-blue) color model, hue-saturation-value (HSV) color model, or any other suitable color model.
  • the image analysis engine 330 identifies these color characteristics based on known geometric characteristics of the optical property modifying device 120 such as size, shape, orientation, or identifying fiducial marks.
  • the image analysis engine 330 retrieves a color image (e.g., to use as a reference image) from the test data store 350, identifies the size and orientation of a QR code in the color image, and uses a known size (e.g., 1 x 1 cm) of the QR code and known relative positions of the five reaction chambers to the QR code (e.g., 2 cm to the right and above, spaced at 1 cm increments) to extract the color average and variance of each reaction chamber.
  • the image analysis engine 330 can determine test results for a disease diagnostic test by comparing the determined color information of a reaction chamber to reference information that indicates one or more color ranges (e.g., based on the RGB or HSV color model).
  • a color range can correspond to a diagnosis for a disease diagnostic test associated with the reaction chamber.
  • the image analysis engine 330 can determine a test result based on the color range within which the determined color of the reaction chamber falls. In other words, the image analysis engine 330 can match color changes of the reaction chambers to a particular test result.
  • the image analysis engine 330 determines a negative diagnosis of the disease responsive to matching the determined color to a first color range (e.g., yellow to green colors), a positive diagnosis of the disease responsive to matching the determined color to a second color range (e.g., blue to violet colors), and an undetermined (e.g., "not available" or "unknown") diagnosis of the disease responsive to matching the determined color to a third color range (or any colors that do not fall within any specified range).
  • the image analysis engine 330 can determine a test result based on the distribution and clustering of colors of the individual pixels comprising the image of a reaction chamber, rather than a point estimate of color average or variance. For example, the image analysis engine 330 can classify each pixel within the image of reaction chamber as falling within one of the three color ranges described above. If a plurality of pixels falls within the first color range, the image analysis engine 330 determines a negative diagnosis of the disease. If a plurality of pixels falls within the second color range, the image analysis engine 330 determines a positive diagnosis of the disease. If a plurality of pixels falls within the third color range, the image analysis engine 330 determines an undetermined diagnosis of the disease. In this way, the image analysis engine 330 is robust to, for example, black and white pixels representing reflections from chamber edges that are substantially different colors from the solution within the reaction chambers and might confound a point estimate-based approach to determining the color of the solution.
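A sketch of this pixel-voting approach; the hue ranges are the same illustrative assumptions used earlier and are not values from the patent.

```python
import numpy as np

# Illustrative hue ranges (degrees); not values from the patent.
RANGES = {"negative": (40, 80), "positive": (200, 280)}

def vote_result(chamber_hsv: np.ndarray) -> str:
    """Classify each pixel by hue and return the most common class."""
    hue = chamber_hsv[..., 0].ravel()
    counts = {label: int(((hue >= lo) & (hue <= hi)).sum())
              for label, (lo, hi) in RANGES.items()}
    counts["undetermined"] = hue.size - sum(counts.values())
    return max(counts, key=counts.get)
```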
  • the image analysis engine 330 determines a severity level of the disease or a confidence level of the test result based on comparing the color information with the reference information. For example, relative to a lighter color, a darker color within a certain color range can indicate that the disease diagnosis is more severe or that there is a greater confidence level.
  • An undetermined test result can occur due to various factors such as a defect in the collection tube 200 or optical property modifying device 120, variation in the captured image, or human error by the user or patient, e.g., contaminating the biological sample provided for the diagnostic test or not waiting long enough (or too long) for a chemical reaction of the diagnostic test to complete.
  • the image analysis engine 330 implements defect detection algorithms to analyze an image of an optical property modifying device 120.
  • the image analysis engine 330 can identify bubbles in the reaction chambers of the optical property modifying device 120 or other types of defects such as debris or scratches inside or on the exterior of the reaction chambers. Based on the identification of a defect, the image analysis engine 330 can use image processing techniques to remove the defects from the image, e.g., removing a cluster of pixels corresponding to a defect. Thus, the image analysis engine 330 can exclude anomalous color values from the determination of color information (e.g., average and variance), which can improve the accuracy of the corresponding test result.
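One simple way to exclude such anomalous pixels before computing color statistics is a median/MAD outlier filter; the cutoff below is an assumption, and the patent does not specify the defect-detection algorithm.

```python
import numpy as np

def robust_mean_hue(hue_pixels: np.ndarray, cutoff: float = 3.5) -> float:
    """Mean hue after dropping pixels far from the median (bubbles, glare, debris)."""
    median = np.median(hue_pixels)
    mad = np.median(np.abs(hue_pixels - median)) + 1e-9  # avoid division by zero
    keep = np.abs(hue_pixels - median) / mad <= cutoff
    return float(hue_pixels[keep].mean())
```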
  • the image analysis engine 330 determines the diagnostic test result by interpreting the indication displayed by the electronic display.
  • an indication of a diagnostic test result displayed by an electronic display can include the actual result of the diagnostic test.
  • the indication can be presented by the electronic display as human-readable symbols such as text, machine-readable symbols such as a barcode, and any other type of symbols.
  • the image analysis engine 330 can perform pattern matching, template matching, or another technique known to those skilled in the art.
  • the image analysis engine 330 can also determine test results further based on information other than analyzed images in some embodiments. For example, the image analysis engine 330 can determine test results based in part on a geographical location and/or timestamp of the diagnostic test. In particular, a certain disease can be more prevalent in a certain geographical region (e.g., an area with a tropical climate or a dry climate) or during a certain time range (e.g., during the summer or winter season). In some embodiments, the performance (e.g., sensitivity vs. specificity) of the test can be adjusted based upon known epidemiological factors such as disease prevalence so that during times of low prevalence, the sensitivity of the test can be decreased to prevent false positive test results.
  • the diagnostic test engine 340 provides information to and/or receives information from the mobile device 110 and the health care provider 130 for disease diagnostic tests.
  • the diagnostic test engine 340 can provide instructions for a diagnostic test to the mobile device 110.
  • the diagnostic test engine 340 can retrieve the instructions for the diagnostic test (e.g., including text, graphics, audio, or other media content) from the test data store 350.
  • the diagnostic test engine 340 also receives data from the mobile device 110, such as images of the optical property modifying device 120 captured by an optical sensor of the mobile device 110 and/or metadata such as the geographical location of the mobile device 110, a timestamp of a captured image, information describing the optical sensor (e.g., pixel resolution, aperture, or flash), patient electronic health record data, patient vital signs, and any other pertinent information.
  • the image processing engine 320 and/or the image analysis engine 330 determines information for diagnostic tests (e.g., a type of diagnostic test or a result of the diagnostic test) based on images processed and analyzed by the image processing engine 320 and/or the image analysis engine 330.
  • the diagnostic test engine 340 can provide the test result to the mobile device 110 and/or to a third-party system, including the health care provider 130. Further, the diagnostic test engine 340 can store test results in the test data store 350 or the user data store 310 along with one or more of the metadata mentioned above.
  • the diagnostic test engine 340 can provide test results and one or more of the metadata mentioned above to a computer server of a third-party system including the health care provider 130 such as a physician or a pharmacy, a government agency such as the Center for Disease Control (CDC), an insurance provider, a telemedicine partner, a treatment manufacturer, and any other third-party system to aid in disease prevention and/or treatment.
  • the third-party system can store aggregated health information of a population of people and can organize the health information based on geographical location and/or based on timestamp. The third-party system can thus render epidemiological data based on aggregated test results from multiple mobile devices 110 and a population of patients.
  • the epidemiological data can be organized based on various parameters (e.g., demographics, geographical location, temporal information, or types of diseases) and used to determine risk factors or preventative measures for certain diseases.
  • the CDC can use the test results and health information to evaluate the well-being of a population within the government’s jurisdiction, e.g., monitoring the spread and prevalence of a disease such as influenza.
  • the epidemiological data can be used to develop a real-time quantitative, qualitative, or semi-quantitative severity index for a particular disease, based in part on geographical location. For instance, a visual heat map can be generated to indicate geographical disease prevalence.
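• One way the heat-map input could be aggregated, sketched here with hypothetical (latitude, longitude, result) records and an arbitrary grid size:

```python
from collections import defaultdict

def prevalence_grid(results, cell_size=0.5):
    """Aggregate (latitude, longitude, is_positive) records into a coarse grid.

    Returns {(lat_cell, lon_cell): positive_fraction}, which a mapping layer
    could render as a heat map of geographical disease prevalence.
    """
    counts = defaultdict(lambda: [0, 0])      # cell -> [positives, total]
    for lat, lon, is_positive in results:
        cell = (round(lat / cell_size) * cell_size,
                round(lon / cell_size) * cell_size)
        counts[cell][0] += int(is_positive)
        counts[cell][1] += 1
    return {cell: pos / total for cell, (pos, total) in counts.items()}
```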
  • disease prevalence can be forecast based on historical disease averages, real-time diagnostic results, user geographical history, weather, traffic patterns, or any other relevant statistics using predictive algorithms.
  • Determined geographical disease trends can be provided to third-party systems such as pharmaceutical manufacturers, retail stores, government entities, and insurance providers to guide treatment-related supply chain and resource allocation decisions. Additionally, geographical disease trends can be provided to individual users and other third-party systems such as social media platforms, travel providers, and advertising agencies to inform individual user behavior.
  • the diagnostic test engine 340 can communicate sponsored content based on test results of diagnostic tests to users via the mobile device 110.
  • the sponsored content is a content item displayed on a user interface of the mobile device 110.
  • Examples of sponsored content include information from vendors of goods or services that can help treat diseases tested by the diagnostic system 100. For instance, the vendors provide prescription medicines, over-the-counter medicines, physical therapy, rehabilitative services, etc.
  • the diagnostic test engine 340 can communicate alerts to users via the mobile device 110. These alerts can describe local disease prevalence and provide recommendations for disease prevention, diagnostics, and treatment. The alerts can be based on a user’s location history and potential recent disease exposure.
  • FIG. 4A shows a user interface 400 of a mobile application for a disease diagnostic test, according to one embodiment.
  • the user interface 400 shows an image 405 of the optical property modifying device 120.
  • the image 405 is the field of view of the camera of the mobile device 110 displaying the user interface 400.
  • the image 405 is oriented to align with the mobile device 110 because the edges of the optical property modifying device 120 are approximately parallel to the edges of the user interface 400, and by extension, to the mobile device 110.
  • the user interface 400 includes one or more alignment markers overlaying the field of view to assist a user of the mobile device 110 in aligning the mobile device 110 to the optical property modifying device 120 for capturing the image 405.
  • an alignment marker is a semi-transparent graphic of the optical property modifying device 120 that the user can use to overlap with the physical optical property modifying device 120.
  • the alignment marker corresponds to a geometric characteristic of the optical property modifying device 120 (e.g., the QR code 260), or a line that should be aligned with an edge of the optical property modifying device 120.
  • FIG. 4B shows another user interface 410 of the mobile application shown in FIG. 4A, including information about the disease diagnostic test, according to one embodiment.
  • the image processing engine 320 and image analysis engine 330 process the image 405 of the optical property modifying device 120 shown in FIG. 4A.
  • the diagnostic test engine 340 determines a disease diagnostic test of the optical property modifying device 120 and provides information about the disease diagnostic test for display.
  • the user interface 410 indicates the type of test (e.g., influenza type A and B), expiration date (e.g., Jan 30, 2020), test description, duration in time (e.g., 20 minutes for a chemical reaction of the test), and cartridge ID (e.g., 0x34E) of the optical property modifying device 120.
  • FIG. 4C shows another user interface 415 of the mobile application shown in FIG. 4A, including instructions to use a swab 420, according to one embodiment.
  • the user interface 415 indicates instructions for an example step of the disease diagnostic test shown in FIG. 4B. For this step, the user is instructed to remove the swab 420 from a sterile package 425 and swab the nostril of a patient undergoing the disease diagnostic test to collect a biological sample of the patient.
  • FIG. 4D shows another user interface 430 of the mobile application shown in FIG. 4A, including instructions to use a collection tube 200, according to one embodiment.
  • the user interface 430 indicates instructions for an example step of the disease diagnostic test shown in FIG. 4B.
  • the user is instructed to insert the swab 420 into the collection tube 200 and swirl the swab 420 around for 10 seconds. Afterwards, the swab 420 should be discarded. The cap 220 of the collection tube 200 is removed so that the swab 420 can be inserted inside.
  • FIG. 4E shows another user interface 435 of the mobile application shown in FIG. 4A, including instructions to use a cartridge, according to one embodiment.
  • the user interface 435 indicates instructions for an example step of the disease diagnostic test shown in FIG. 4B.
  • the user is instructed to place the collection tube 200 onto the cartridge (the optical property modifying device 120) and close the cap 220.
  • the cartridge interface 210 should be coupled to the collection tube interface 230, which provides the biological sample of the patient to the optical property modifying device 120.
  • the user is also instructed to pull the tab of the optical property modifying device 120 to start the chemical reaction of the disease diagnostic test.
  • FIG. 4F shows another user interface 440 of the mobile application shown in FIG. 4A, including instructions to wait for a chemical reaction of the disease diagnostic test to complete, according to one embodiment.
  • the user interface 440 displays a timer that indicates the time remaining until the chemical reaction of the disease diagnostic test shown in FIG. 4B should be complete. Though the total time for this example disease diagnostic test is 20 minutes, in other embodiments, the time can vary, e.g., between 1 and 60 minutes.
  • FIG. 4G shows another user interface 445 of the mobile application shown in FIG. 4A, including instructions to scan the cartridge, according to one embodiment.
  • the user interface 445 indicates instructions for an example step of the disease diagnostic test shown in FIG. 4B. For this step, the user is instructed to scan (take one or more images of) the cartridge (the optical property modifying device 120). Since the chemical reaction of the disease diagnostic test has completed after waiting the duration of time for the test, the color characteristics of the reaction chambers of the imaged optical property modifying device 120 have changed. Because the optical property modifying device 120 depicted in FIGS. 4A-H is the embodiment of the optical property modifying device 120 depicted in FIG. 2A, the reaction chambers are located on a surface of the optical property modifying device 120, and thus the contents of the reaction chambers are at least partially visible in the images of the optical property modifying device 120 depicted in FIGS. 4A-H. Therefore, the image 450 of the optical property modifying device 120 shows that two of the reaction chambers, 455 and 460, have changed at least in color as a result of the biological sample of the patient reacting with an optical property modifying reagent.
  • in alternative embodiments in which the optical property modifying device 120 depicted in FIGS. 4A-H is instead the embodiment of the optical property modifying device 120 depicted in FIG. 2B, the reaction chambers would not be located on a surface of the optical property modifying device 120, and thus the contents of the reaction chambers would not be visible in the images of the optical property modifying device 120 depicted in FIGS. 4A-H. Rather, the optical property modifying device 120 would include an electronic display displaying an indication of a result of the diagnostic test.
  • FIG. 4H shows another user interface 465 of the mobile application shown in FIG. 4A, including results for the disease diagnostic test, according to one embodiment.
  • the image processing engine 320 and image analysis engine 330 process and analyze the image 450 of the optical property modifying device 120 shown in FIG. 4G.
  • the diagnostic test engine 340 determines test results of the disease diagnostic test and provides information associated with the test results for display on the user interface 465.
  • the user interface 465 indicates the patient has tested positive for influenza A and negative for influenza B.
  • the user interface 465 also shows an option for the user or patient to call a treatment hotline.
  • the treatment hotline is a phone number of a health care provider that can provide pharmaceutical drugs or other health care services for treating influenza.
  • the user interface 465 provides an option for the user to modify, verify/confirm, or cancel the test result, which can be performed before the diagnostic system 100 provides (via the diagnostic test engine 340) the test result to the health care provider 130, for example. If the test result is updated after it has been provided to the health care provider 130, the diagnostic test engine 340 can automatically provide the updated test result as well. In one embodiment, if the test result is positive, the diagnostic test engine 340 provides a notification for display on the mobile device 110 indicating that the health care provider 130 will be contacted on behalf of the patient regarding the test result. For example, the diagnostic test engine 340 provides the test result to a physician. The diagnostic test engine 340 can receive a recommendation (e.g., taking vitamins or medication) from the health care provider 130 in response to providing the test result and communicate the recommendation to the user.
  • the diagnostic test engine 340 provides the test result to a pharmacy to allow the pharmacy to fulfill a prescription for the patient to treat the disease associated with the test result.
  • the diagnostic test engine 340 can determine a pharmacy that is located near the patient (e.g., within a predetermined distance such as a threshold radius of 10 kilometers) based on geographical information, for instance, the location where the diagnostic test was taken or the address of the patient.
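• A sketch of such a radius filter using the standard haversine distance; the pharmacy record format and the 10 km default are illustrative only:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two (lat, lon) points."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def pharmacies_within(patient_loc, pharmacies, radius_km=10.0):
    """Return pharmacy names within radius_km of the patient's location.

    pharmacies: iterable of (name, lat, lon) tuples; the fields are illustrative.
    """
    lat0, lon0 = patient_loc
    return [name for name, lat, lon in pharmacies
            if haversine_km(lat0, lon0, lat, lon) <= radius_km]
```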
  • FIG. 5A is a data flow diagram for performing a disease diagnostic test in a conventional system environment according to one embodiment.
  • a patient 500 who wants to complete a disease diagnostic test visits a physician 510 at a hospital or clinic.
  • the physician 510 collects a biological sample of the patient 500.
  • the physician 510 provides the biological sample to a central lab 530 with staff that performs the chemical reaction portion of the disease diagnostic test using lab equipment and the biological sample.
  • the central lab 530 provides test results of the disease diagnostic test to the physician 510.
  • the physician 510 provides the test results to the patient 500. If the patient tested positive for a disease, the test results can indicate a prescription for medication or another type of medical treatment.
  • in step 5, the physician 510 provides the prescription to a pharmacy 520.
  • the pharmacy 520 consults with the physician 510 to verify or clarify the prescription.
  • in step 6, the patient 500 visits the pharmacy 520 to request the medication for the prescription.
  • in step 7, the pharmacy 520 fulfills the prescription by providing the medication to the patient 500.
  • FIG. 5B is a data flow diagram for performing a disease diagnostic test in a system environment including the diagnostic system 100 according to one embodiment.
  • a patient 500 who wants to complete a disease diagnostic test uses the diagnostic system 100, e.g., by completing steps of the disease diagnostic test as shown in FIGS. 4A-H.
  • the diagnostic system 100 can simultaneously provide the test results to the patient 500 and the pharmacy 520 in steps 2A and 2B.
  • the diagnostic system 100 provides the test results for display on a user interface of a mobile application.
  • the diagnostic system 100 can automatically provide the test results to the pharmacy 520 without requiring further input from the patient 500 (and/or a physician 510), or in response to receiving a confirmation of the test result from the patient 500, in some embodiments.
  • the pharmacy 520 fulfills a prescription for the patient 500 based on the test results, e.g., the pharmacy 520 delivers medications to the home of the patient 500 or prepares the medications for pickup at a location near the patient 500.
  • the diagnostic server 150 can consult a physician 510 in an automated manner without requiring that the patient specify the physician or manually facilitate this consultation.
  • one or more physicians 510 are affiliated with the diagnostic server 150.
  • the diagnostic system 100 can also provide the test results to the physician 510, e.g., in response to a request by the patient. Thus, if the patient 500 visits the physician 510 to seek treatment for the disease, the physician 510 can determine an appropriate treatment based on the test results.
  • the system environment including the diagnostic system 100 shown in FIG. 5B facilitates a process requiring fewer steps for the patient 500 to complete a disease diagnostic test and receive appropriate medications to treat the disease if the patient 500 tested positive.
  • the diagnostic system 100 enables patients to receive treatment for diseases more promptly.
  • the system environment shown in FIG. 5B does not require the central lab 530 because the diagnostic system 100 allows the patient 500 to complete the chemical reaction portion of the disease diagnostic test at home using the optical property modifying device 120. This can be particularly advantageous for patients residing far away from the nearest central lab 530.
  • FIG. 6 is a flowchart illustrating a process 600 for determining test results for a disease diagnostic test according to one embodiment.
  • the process 600 is used by the diagnostic system 100 (e.g., modules of the diagnostic system 100 described with reference to FIG. 3) within the system environment in FIG. 1.
  • the process 600 can include different or additional steps than those described in conjunction with FIG. 6 in some embodiments or perform steps in different orders than the order described in conjunction with FIG. 6.
  • the image processing engine 320 receives 610 a set of images of an optical property modifying device 120 for a nucleic acid disease diagnostic test captured by an optical sensor of a mobile device 110.
  • the image processing engine 320 determines 620, for each image in the set, whether the electronic display of the optical property modifying device 120 is shown in the image.
  • the image processing engine 320 selects 630 one or more images of the set that are determined to show the electronic display of the optical property modifying device 120.
  • the image analysis engine 330 determines 640 a test result for the nucleic acid disease diagnostic test based on the one or more images.
  • the diagnostic test engine 340 provides 650 the test result for display on the mobile device 110.
  • step 620 can also or alternatively include determining, for each image in the set, whether the reaction chambers of the optical property modifying device 120 are shown in the image.
  • step 630 can also or alternatively include selecting one or more images of the set that are determined to show the reaction chambers of the optical property modifying device 120.
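• A compact sketch of how steps 610 through 650 could fit together in code, with hypothetical callables standing in for the image processing engine 320, image analysis engine 330, and diagnostic test engine 340:

```python
def run_diagnostic_pipeline(images, shows_target, read_result, send_to_device):
    """Sketch of process 600: select usable images, read a result, report it.

    images:         captured frames of the optical property modifying device
    shows_target:   predicate returning True if a frame shows the electronic
                    display or reaction chambers (steps 620-630)
    read_result:    function mapping the selected frames to a test result (step 640)
    send_to_device: callback that displays the result on the mobile device (step 650)
    """
    selected = [img for img in images if shows_target(img)]
    if not selected:
        result = "undetermined"          # nothing usable was captured
    else:
        result = read_result(selected)
    send_to_device(result)
    return result
```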
  • any reference to “one embodiment” or “an embodiment” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment.
  • the appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
  • the terms “coupled” and “connected,” along with their derivatives, can be used to describe some embodiments.
  • some embodiments can be described using the term “coupled” to indicate that two or more elements are in direct physical or electrical contact.
  • the term “coupled,” however, can also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other.
  • the embodiments are not limited in this context unless otherwise explicitly stated.
  • the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having” or any other variation thereof are intended to cover a non-exclusive inclusion.
  • a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but can include other elements not expressly listed or inherent to such process, method, article, or apparatus.
  • “or” refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).
  • a software module is implemented with a computer program product including a computer-readable non-transitory medium containing computer program code, which can be executed by a computer processor for performing any or all of the steps, operations, or processes described.
  • Embodiments of the invention can also relate to a product that is produced by a computing process described herein. Such a product can include information resulting from a computing process, where the information is stored on a non-transitory, tangible computer readable storage medium and can include any embodiment of a computer program product or other data combination described herein.

Abstract

A diagnostic system performs a disease diagnostic test using at least an optical property modifying device and a mobile device. A user provides a biological sample to the optical property modifying device. The biological sample reacts with a reagent in reaction chambers of the device. The user captures images of the optical property modifying device using the mobile device. Based on an analysis of the captured images, the diagnostic system can determine the result of the disease diagnostic test. The diagnostic system presents the test result to the user via the mobile device.

Description

CONSUMER-BASED DISEASE DIAGNOSTICS
CROSS-REFERENCE TO RELATED APPLICATION
[0001] This application claims priority to and the benefit of U.S. Patent Application No. 16/155,829, filed October 9, 2018, which is hereby incorporated by reference herein in its entirety.
BACKGROUND
FIELD OF ART
[0002] This description generally relates to disease diagnostics, and particularly to a portable system to perform image-based disease diagnostic tests.
DESCRIPTION OF THE RELATED ART
[0003] Though many disease diagnostic tests are performed in laboratory settings using benchtop equipment, advances in medical technologies such as microfluidics have enabled users to perform diagnostic tests using smaller-sized and more portable equipment. For example, the hardware required to complete an immunoassay for a diagnostic test can be fit into a handheld device. However, in conventional test processes, trained personnel such as a nurse, physician, or lab technician must analyze the handheld device to determine the test result. Thus, a patient would have to visit a facility such as a healthcare clinic, or the trained personnel would have to visit the patient at the patient’s home, for example. Alternatively, the patient can collect a biological sample of the patient (such as a mouth or nostril swab) at the patient’s home and mail the biological sample to a laboratory to perform the immunoassay using the biological sample.
[0004] The above examples do not provide a quick test result because these processes require travel of person(s) or transportation of some test material.
Additionally, in remote areas that have scarce medical resources and personnel, it is especially challenging to perform diagnostic tests using conventional test processes. Developments in telecommunications have helped spur the growth of telemedicine technologies that provide remote diagnosis of patients. However, it is still challenging to perform a point-of-care diagnostic test using a reliable and portable end-to-end system that patients can complete in the comfort of their homes.
SUMMARY
[0005] A diagnostic system performs disease diagnostic tests (also referred to herein as “diagnostic tests” or “nucleic acid disease diagnostic tests”) using at least an optical property modifying device and a mobile device. The optical property modifying device includes a plurality of reaction chambers. The reaction chambers are configured to perform a reaction between a biological sample and at least an optical property modifying reagent. In some embodiments, the optical property modifying device can include an interface configured to receive the biological sample. The diagnostic system can communicate instructions for the diagnostic test to the user via the mobile device.
[0006] In certain embodiments, the plurality of reaction chambers can be located on a surface of the optical property modifying device such that at least one of a geometric pattern formed by a layout of the plurality of reaction chambers and a color of the reaction between the biological sample and at least the optical property modifying reagent within the plurality of reaction chambers is visible from a point exterior to the optical property modifying device.
[0007] In some embodiments, the optical property modifying device includes an electronic display configured to display an indication of a result of the diagnostic test. The electronic display can be a liquid crystal display, an organic light emitting diode display, an electronic paper display, and/or one or more light emitting diodes (LEDs). In some embodiments, the indication of the diagnostic test result presented by the electronic display of the optical property modifying device can be the actual result of the diagnostic test. The indication can be presented by the electronic display as human-readable symbols such as text, machine-readable symbols such as a barcode, and any other type of symbols.
[0008] A user provides a biological sample to the optical property modifying device. In embodiments in which the optical property modifying device includes an interface, the user provides the biological sample to the interface of the optical property modifying device. The biological sample reacts with an optical property modifying reagent in the reaction chambers of the device. For example, the optical property modifying device performs a nucleic acid amplification assay that changes the color of a solution including the biological sample and the optical property modifying reagent in the reaction chambers.
[0009] In some embodiments, the user captures one or more images of the optical property modifying device using an optical sensor (e.g., camera) of the mobile device. The images of the optical property modifying device that are captured by the user can include different portions of the optical property modifying device. In some instances, to help the user align the mobile device to the portion of the optical property modifying device being captured in the images, the optical property modifying device or a user interface displayed on the mobile device can provide visual markers.
[0010] The diagnostic system analyzes one or more captured images at the mobile device or at a computer server in communication with the mobile device. First, in some embodiments, the diagnostic system can determine whether the optical property modifying device is included in the images, based on, for example, geometric characteristics and/or color characteristics that are known to be associated with the optical property modifying device. In embodiments in which the reaction chambers are located on a surface of the optical property modifying device, a geometric characteristic known to be associated with the optical property modifying device can include, for example, a geometric pattern formed by a layout of the reaction chambers of the optical property modifying device. As another example, in embodiments in which the reaction chambers are located on a surface of the optical property modifying device, a color characteristic known to be associated with the optical property modifying device can include, for example, colors of the reaction between the biological sample and at least the optical property modifying reagent within the reaction chambers. The diagnostic system can also determine whether the optical property modifying device is included in the images based on any other geometric characteristics and/or color characteristics that are known to be associated with the optical property modifying device.
[0011] In further embodiments, the diagnostic system can determine whether a particular portion of the optical property modifying device is included in the captured images. For example, the diagnostic system can determine whether the electronic display of the optical property modifying device is included in the images. The diagnostic system can also determine a quality level of the images based on factors such as skew, scale, focusing, shadowing, or white-balancing. The diagnostic system can select one or more of the captured images for further analysis based on any of the above determinations. For instance, the diagnostic system can select captured images that are determined to include a particular portion of the optical property modifying device, such as the electronic display, for further analysis.
[0012] The diagnostic system then determines a result of the diagnostic test based at least in part on the images of the optical property modifying device captured by the mobile device. A result of a diagnostic test can include, for instance, a positive result, a negative result, a quantitative result, and/or an undetermined result.
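Of the quality factors listed in paragraph [0011], focus is the simplest to illustrate. The following sketch scores sharpness with OpenCV's variance-of-Laplacian measure and keeps only frames above a placeholder threshold; a production system would also score skew, shadowing, and white balance, which are not shown here:

```python
import cv2
import numpy as np

def focus_score(image_bgr: np.ndarray) -> float:
    """Higher variance of the Laplacian indicates a sharper (better focused) frame."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    return float(cv2.Laplacian(gray, cv2.CV_64F).var())

def select_sharpest(images, min_score=100.0):
    """Keep images above an illustrative focus threshold; assumes a non-empty input list."""
    scored = [(focus_score(img), img) for img in images]
    good = [img for score, img in scored if score >= min_score]
    return good or [max(scored, key=lambda s: s[0])[1]]   # fall back to the single sharpest frame
```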
[0013] For instance, in embodiments in which the images captured by the mobile device depict an electronic display of the optical property modifying device that displays an indication of the result of the diagnostic test, the diagnostic system determines the result of the diagnostic test based at least in part on the indication displayed by the electronic display in the images. Specifically, in embodiments in which the images captured by the mobile device include an electronic display of the optical property modifying device that displays an indication of a result of the diagnostic test performed, to determine the result of the diagnostic test based on the indication displayed by the electronic display in the captured image, at least one of optical pattern matching, optical template matching, and optical character recognition for the indication displayed by the electronic display in the image can be performed.
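As a concrete illustration of the optical character recognition route only (a sketch, not the claimed method), and assuming the open-source pytesseract wrapper is available, the displayed indication could be read and mapped to a result as follows; the phrase-to-result mapping is an assumption rather than the device's defined display vocabulary:

```python
import pytesseract              # OCR wrapper; assumed available for this sketch
from PIL import Image

def read_display_indication(display_crop: Image.Image) -> str:
    """OCR the cropped electronic-display region and map the text to a result."""
    text = pytesseract.image_to_string(display_crop).lower()
    if "positive" in text:
        return "positive"
    if "negative" in text:
        return "negative"
    return "undetermined"       # unreadable or unexpected indication
```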
[0014] Similarly, in embodiments in which the images captured by the mobile device depict a plurality of reaction chambers located on a surface of the optical property modifying device, the diagnostic system determines the result of the diagnostic test based at least in part on at least one of a known geometric characteristic and a known color characteristic associated with the optical property modifying device and depicted in the images. Specifically, based on a known geometric characteristic of the optical property modifying device, such as a geometric pattern formed by a layout of the reaction chambers, a location of each reaction chamber in an image can be identified. Then, a color of the image at the identified location of each reaction chamber in the image can be determined. The identified color for each reaction chamber can be compared with known color characteristics of the optical property modifying device to determine the result of the diagnostic test.
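A minimal sketch of this layout-then-color procedure is given below; the chamber layout coordinates and reference colors are invented placeholders for illustration, not values from the disclosure:

```python
import numpy as np

# Illustrative chamber layout: fractional (x, y) centers within the device image.
CHAMBER_LAYOUT = [(0.2, 0.5), (0.35, 0.5), (0.5, 0.5), (0.65, 0.5), (0.8, 0.5)]
REFERENCE_COLORS = {"positive": np.array([230.0, 200.0, 60.0]),   # placeholder reacted color
                    "negative": np.array([200.0, 60.0, 150.0])}   # placeholder unreacted color

def classify_chambers(device_img: np.ndarray, half_window: int = 5):
    """Sample the color at each known chamber location and pick the nearest reference color."""
    h, w = device_img.shape[:2]
    calls = []
    for fx, fy in CHAMBER_LAYOUT:
        x, y = int(fx * w), int(fy * h)
        patch = device_img[y - half_window:y + half_window, x - half_window:x + half_window]
        mean_color = patch.reshape(-1, 3).mean(axis=0)
        call = min(REFERENCE_COLORS,
                   key=lambda k: np.linalg.norm(mean_color - REFERENCE_COLORS[k]))
        calls.append(call)
    return calls
```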
[0015] The diagnostic system can communicate the determined result of the diagnostic test to the user via the mobile device. Furthermore, in some embodiments, a message that includes sponsored content based on the diagnostic test result is generated and provided to the user via the mobile device. The diagnostic system can also provide the determined result to a third-party system including a health care provider such as a physician or a pharmacy, a government agency such as the Center for Disease Control (CDC), an insurance provider, a telemedicine partner, a treatment manufacturer, and any other third-party system to aid in disease prevention and/or treatment. For instance, provision of the result of the diagnostic test to a pharmacy can allow the pharmacy to fulfill a prescription for the user. As another example, the result of the diagnostic test can be provided to a physician located within a predetermined distance from the user, such that the physician is able to provide a recommendation to treat a condition associated with the result.
[0016] In some embodiments, a geographical location of the mobile device and/or a timestamp of the geographical location are determined at one or more time points before, after, and/or during performance of the diagnostic test. For example, a geographical location of the mobile device and a timestamp of the geographical location can be determined when the result of the diagnostic test is provided to the user via the mobile device. The geographical location of the mobile device can be determined by a location sensor, such as a Global Positioning System (GPS) sensor, of the mobile device. Then, in certain embodiments, the diagnostic system can store the geographical location and/or the timestamp of the geographical location with the determined result of the diagnostic test. In some further embodiments, the result of the diagnostic test is determined in part based on at least one of the determined geographical location and the timestamp. For example, in some embodiments, the sensitivity and specificity of the diagnostic test result can be adjusted based on geospatial and seasonal variations in disease prevalence. In further embodiments, metadata in addition to or instead of geographical location and timestamp data can be provided alongside the diagnostic test result to one or more of the third-party systems mentioned above. For instance, data from the user’s electronic health record, the user’s vital signs, and any other pertinent information may be provided to the one or more third-party systems.
[0017] In embodiments in which a geographical location of the mobile device and/or a timestamp of the geographical location are determined, the geographical location and/or the timestamp can be provided alongside the result of the diagnostic test to a third-party system, as briefly mentioned above. For instance, the diagnostic test result, geographical location, and timestamp can be provided to a third-party system that stores epidemiological data that is based on aggregated diagnostic test results from a plurality of mobile devices associated with a population of patients. This
epidemiological data can be used, for example, in disease prevention.
[0018] The optical property modifying device, mobile device, and any other test equipment, such as a swab and collection tube for the biological sample, are all portable. Thus, since a patient can complete the chemical reaction portion of the disease diagnostic test at home, a central lab is not required to perform disease diagnostic tests. Additionally, because the mobile device (or a computer server in communication with the mobile device) performs image analysis to determine the result of the diagnostic test, the diagnostic system can provide a quick diagnosis without requiring input from health care personnel. Furthermore, determination of the result of the diagnostic test by the diagnostic system is independent of mobile device hardware because the diagnostic system is configured to normalize images captured from different types of mobile devices and optical sensors. Thus, the diagnostic system is a point-of-care system that enables patients to conveniently complete diagnostic tests at their homes (or other patient locations) without having to leave their homes, mail test material to a lab, or have health care personnel visit them at their homes.
[0019] In another embodiment of the disclosure, rather than determining a result of a diagnostic test using one or more images of the optical property modifying device captured by a mobile device, the optical property modifying device transmits one or more data streams to the diagnostic system, for use by the diagnostic system in determining the result of the diagnostic test. In such embodiments, the optical property modifying device further comprises a data stream transmission module that is configured to transmit data streams from the optical property modifying device. The information transmitted from the optical property modifying device to the diagnostic system within the data streams can comprise any of the information that the mobile device captures in images in alternative embodiments, and in some embodiments, even further information. The data streams can be transmitted from the optical property modifying device via electromagnetic radiofrequency transmission (e.g. Bluetooth), via ultrasonic transmission, and/or via audio transmission. Benefits to transmission of data streams via ultrasonic transmission include the ability to quickly transmit large data structures without specialized RF hardware or software pairing.
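Whatever the physical transport (electromagnetic radiofrequency, ultrasonic, or audio), the transmitted payload could be framed with a checksum so the receiver can detect corruption. A minimal sketch follows; the field layout (cartridge identifier plus a one-byte result code) is an assumption for illustration:

```python
import struct
import zlib

def encode_result_frame(cartridge_id: int, result_code: int) -> bytes:
    """Pack a small result payload with a CRC32 so corruption can be detected on receipt."""
    payload = struct.pack(">IB", cartridge_id, result_code)
    checksum = zlib.crc32(payload)
    return payload + struct.pack(">I", checksum)

def decode_result_frame(frame: bytes):
    """Validate the checksum and unpack the payload fields."""
    payload, checksum = frame[:-4], struct.unpack(">I", frame[-4:])[0]
    if zlib.crc32(payload) != checksum:
        raise ValueError("corrupted frame")
    cartridge_id, result_code = struct.unpack(">IB", payload)
    return cartridge_id, result_code
```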
[0020] In yet another embodiment of the disclosure, a non-transitory computer-readable storage medium stores instructions that when executed by a processor cause the processor to execute any embodiment of the above-described methods.
BRIEF DESCRIPTION OF DRAWINGS
[0021] FIG. 1 is a diagram of a system environment for performing disease diagnostic tests, according to one embodiment.
[0022] FIG. 2A is a diagram of an optical property modifying device for use with a mobile device, according to one embodiment.
[0023] FIG. 2B is a diagram of an optical property modifying device for use with a mobile device, according to another embodiment.
[0024] FIG. 3 is a block diagram of a diagnostic system, according to one embodiment.
[0025] FIG. 4A shows a user interface of a mobile application for a disease diagnostic test, according to one embodiment.
[0026] FIG. 4B shows another user interface of the mobile application shown in FIG. 4A including information about the disease diagnostic test, according to one embodiment.
[0027] FIG. 4C shows another user interface of the mobile application shown in FIG. 4A including instructions to use a swab, according to one embodiment.
[0028] FIG. 4D shows another user interface of the mobile application shown in FIG. 4A including instructions to use a collection tube, according to one embodiment.
[0029] FIG. 4E shows another user interface of the mobile application shown in FIG. 4A including instructions to use a cartridge, according to one embodiment.
[0030] FIG. 4F shows another user interface of the mobile application shown in FIG. 4A including instructions to wait for a chemical reaction of the disease diagnostic test to complete, according to one embodiment.
[0031] FIG. 4G shows another user interface of the mobile application shown in FIG. 4A including instructions to scan the cartridge, according to one embodiment.
[0032] FIG. 4H shows another user interface of the mobile application shown in FIG. 4A including results for the disease diagnostic test, according to one embodiment.
[0033] FIG. 5A is a data flow diagram for performing a disease diagnostic test in a conventional system environment, according to one embodiment.
[0034] FIG. 5B is a data flow diagram for performing a disease diagnostic test in a system environment including the diagnostic system, according to one embodiment.
[0035] FIG. 6 is a flowchart illustrating a process for determining test results for a disease diagnostic test, according to one embodiment.
[0036] The figures depict embodiments of the present invention for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein can be employed without departing from the principles of the invention described herein.
DETAILED DESCRIPTION
I. EXAMPLE SYSTEM OVERVIEW
[0037] FIG. 1 is a diagram of a system environment for performing disease diagnostic tests, according to one embodiment. The system environment includes a diagnostic server 150, a mobile device 110, an optical property modifying device 120, and one or more health care providers 130. The diagnostic server 150, mobile device 110, and health care providers 130 can be connected to each other via a network 140. In other embodiments, different and/or additional entities can be included in the system environment. The functions performed by the various entities of FIG. 1 can vary in different embodiments.
[0038] The mobile device 110 is an electronic device that includes a diagnostic system 100 to determine test results for disease diagnostic tests performed by users using at least the mobile device 110 (e.g., a smartphone, tablet, laptop computer, etc.) and the optical property modifying device 120. Since the mobile device 110 and optical property modifying device 120 are portable, users can perform disease diagnostic tests at the patient’s home or any other suitable location outside of health care facilities such as hospitals or central labs. The diagnostic server 150 is a computer server that can perform some or all functionality of the diagnostic system 100 in some embodiments.
[0039] A user of the mobile device 110 interacts with the diagnostic system 100 via a mobile application. The mobile application communicates information from the diagnostic system 100. For example, the mobile application can present instructions or results for a disease diagnostic test on a graphical user interface displayed on an electronic display on the mobile device 110. As another example, the mobile application provides information via audio signals or tactile feedback (e.g., vibrating the mobile device 110). The mobile application running on the mobile device 110 can provide data from sensors of the mobile device 110 to the diagnostic system 100. For example, the mobile device 110 includes an optical sensor such as a camera to capture images of the optical property modifying device 120. The mobile device 110 can provide the captured images to the diagnostic system 100 for further processing, which is further described with reference to FIG. 3. As other examples, the mobile device 110 can also include a location sensor such as a Global Positioning System (GPS) sensor, a motion sensor such as an accelerometer, gyroscope, or inertial measurement unit, a proximity sensor, or a temperature sensor.
[0040] The mobile device 110 can communicate with the diagnostic server 150 and health care provider 130 via the network 140, which can comprise any combination of local area and wide area networks employing wired or wireless communication links. In one embodiment, the network 140 uses standard communications technologies and Internet protocols. In some embodiments, all or some of the communication links of the network 140 can be encrypted, for example, to provide a technical safeguard for Health Insurance Portability and Accountability Act (HIPAA) compliance.
[0041] Throughout this disclosure, transmission of information from the optical property modifying device 120 to the mobile device 110, to the health care provider 130, and/or to the diagnostic server 150 is discussed as image-based transmission. Specifically, throughout this disclosure, information is discussed as being transmitted from the optical property modifying device 120 to the mobile device 110, to the health care provider 130, and/or to the diagnostic server 150 via images captured of the optical property modifying device 120 by the mobile device 110. However, in additional or alternative embodiments, rather than transmitting information from the optical property modifying device 120 to other modules of the system environment via images captured by the mobile device 110, the optical property modifying device 120 can directly transmit data to other modules of the system environment via one or more alternative means including electromagnetic radiofrequency transmission (e.g. Bluetooth, WiFi, etc.), ultrasonic transmission, audio transmission, or any other type of non-image-based transmission. For instance, in some embodiments, the optical property modifying device 120 can include a data transmission module that is configured to transmit data directly from the optical property modifying device 120. Thus, in such embodiments, the diagnostic system can determine diagnostic test results based on data received directly from the optical property modifying device 120, as opposed to images received from the mobile device 110. While these alternative embodiments of data transmission are not discussed throughout the remainder of this disclosure, it is to be noted that in some embodiments, this alternative mode of data transmission can be implemented instead of or in addition to image-based data transmission.
[0042] Turning back to FIG. 1, the health care provider 130 is a computer server associated with a health care provider such as a pharmacy, a central laboratory (e.g., for completing chemical reactions for disease diagnostic tests), a hospital, other types of healthcare facilities, or any other suitable provider of health care services. As an example use case, the diagnostic system 100 provides a disease diagnostic test result of a patient to a pharmacy. Based on the results, the pharmacy determines an appropriate prescription for the patient.
[0043] FIG. 2A is a diagram of the optical property modifying device 120 for use with the mobile device 110, according to one embodiment. In the embodiment shown in FIG. 2A, the optical property modifying device 120 (also referred to herein as a “cartridge”) includes a collection tube interface 230, one or more reaction chambers 240, an indicator 250, and a QR code 260. In alternative embodiments, the optical property modifying device 120 can include fewer or additional components than those depicted in FIG. 2A.
[0044] To perform a disease diagnostic test, a user uses the optical property modifying device 120 to react a biological sample (e.g., including a nucleic acid) of a patient with an optical property modifying reagent in the reaction chambers 240. In some embodiments, the optical property modifying device 120 can receive the biological sample from a collection tube 200 via the collection tube interface 230.
While performing the disease diagnostic test, the mobile device 110 can provide instructions for the test to the user, for example, via a graphical user interface of the mobile device 110 as further described below with reference to FIGS. 4A-H.
[0045] The indicator 250 can be a light-emitting diode (LED) that provides an indication of a status of a disease diagnostic test being performed by the optical property modifying device 120. For example, the indicator 250 can display a red colored light indicating that the test has not yet started, a yellow colored light indicating that the test is in progress, and a green colored light indicating that the test has completed. In other embodiments, the indicator 250 can be another type of indicator different than an LED (e.g., an audio indicator) and the optical property modifying device 120 can include any number of indicators 250 (including zero).
[0046] As mentioned above, the embodiment of the optical property modifying device 120 depicted in FIG. 2A includes the QR code 260. The QR code 260 can be associated with information for a disease diagnostic test such as a type of the test or an expiration date of the test. The diagnostic system 100 can determine the information for disease diagnostic test by processing images of the QR code 260 scanned by an optical sensor of the mobile device 110. The QR code 260 can also be another type of code, such as a barcode, other identifier, or machine-readable signature. While the embodiment of the optical property modifying device 120 includes the QR code 260, in some alternative embodiments, the optical property modifying device 120 does not include a QR code.
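For illustration only, and assuming the QR payload encodes the test type, expiration date, and cartridge ID as delimiter-separated fields (an assumed encoding, not the disclosed format), the code could be read with OpenCV's built-in detector:

```python
import cv2

def read_cartridge_qr(image_bgr):
    """Decode the cartridge QR code and split it into illustrative fields.

    The 'type|expiration|cartridge_id' layout is an assumption for this sketch.
    """
    data, points, _ = cv2.QRCodeDetector().detectAndDecode(image_bgr)
    if not data:
        return None                         # QR code not found or unreadable
    test_type, expiration, cartridge_id = data.split("|")
    return {"test_type": test_type, "expiration": expiration, "cartridge_id": cartridge_id}
```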
[0047] In one use case, the optical property modifying device 120 performs a type of disease diagnostic test involving nucleic acid(s), e.g. nucleic acid amplification, using the reaction chambers 240 to determine the presence or amount of a particular genetic target in the biological sample. For example, the optical property modifying device 120 is configured to receive an optical property modifying reagent, which is a solution comprising nucleic acid enzymes and primers which specifically amplify a target nucleic acid sequence specific to an organism of interest. The presence or amount of this target nucleic acid sequence within the biological sample can indicate that a patient is infected with a particular pathogen or carries a particular genetic disease. Further, the optical property modifying reagent includes a dye which changes color or fluoresces upon target amplification, resulting in a visible change in the solution’s color characteristics. Thus, the diagnostic system 100 can analyze the color characteristics of the solution of the biological sample and the optical property modifying reagent to determine results of the disease diagnostic test. In some embodiments, the optical property modifying device 120 can perform other tests such as immunoassays or enzyme-linked immunosorbent assays (ELISAs).
[0048] In the embodiment of the optical property modifying device 120 depicted in FIG. 2A, the reaction chambers 240 are located on a surface of the optical property modifying device 120, such that the reaction chambers 240 and their contents are at least partially visible from a surface of the optical property modifying device 120. In such embodiments, the mobile device 110 can capture images of the reaction chambers 240 and their contents. Furthermore, in such embodiments in which the reaction chambers 240 are located on a surface of the optical property modifying device 120, to determine that an image depicts the optical property modifying device 120, it can be determined that the image depicts one or more of color characteristics and geometric characteristics known to be associated with the reaction chambers 240 of the optical property modifying device 120. A geometric characteristic known to be associated with the reaction chambers 240 of the optical property modifying device 120 can comprise a geometric pattern formed by a layout of the reaction chambers. A color characteristic known to be associated with the reaction chambers 240 of the optical property modifying device 120 can comprise a color of the reaction between the biological sample and the optical property modifying reagent within a reaction chamber of the device.
[0049] The diagnostic system can also use one or more of color characteristics and geometric characteristics identified in an image of the reaction chambers 240 of the optical property modifying device 120, to determine a result of a diagnostic test performed by the optical property modifying device 120. More specifically, using known geometric characteristics of the optical property modifying device 120, a location of each reaction chamber can be identified in the image. Then, a color of the image at the identified location of each reaction chamber in the image can be determined. These identified colors can be compared to known colorimetric characteristics of the optical property modifying device 120 to determine the result of the diagnostic test performed by the optical property modifying device 120. In this way, capturing images of the optical property modifying device 120 using the mobile device 110 enables determination of the result of the diagnostic test performed by the optical property modifying device 120.
[0050] In alternative embodiments, such as in the embodiment discussed below with regard to FIG. 2B, the reaction chambers 240 are not visible from a surface of the optical property modifying device 120, and thus the contents of the reaction chambers 240 are also not visible. As discussed below with regard to FIG. 2B, in such embodiments, alternative means of determining a result of a diagnostic test performed by the optical property modifying device 120 can be used.
[0051] The optical property modifying device 120 can perform multiple disease diagnostic tests in parallel or in series. The optical property modifying device 120 shown in the example in FIG. 2A includes five reaction chambers 240, though in other embodiments, the number of reaction chambers can vary. Each reaction chamber may or may not be coupled to another reaction chamber. Further, a given disease diagnostic test can be associated with one or more reaction chambers. For example, the optical property modifying device 120 shown in FIG. 2A is used to perform an influenza type A test associated with two of the reaction chambers, an influenza type B test associated with one of the reaction chambers, a positive internal control test associated with one of the reaction chambers, and a negative internal control test associated with one of the reaction chambers. In some embodiments, multiple tests are associated with at least one common reaction chamber, which can be advantageous to consolidate the number of reaction chambers required for the optical property modifying device 120.
[0052] Though not shown in FIG. 2A, in some embodiments, the optical property modifying device 120 can include a user control such as a tab, a button, or a switch. In such embodiments, the optical property modifying device 120 starts the chemical reaction of the disease diagnostic test when the user interacts with the user control. For example, when the user pulls the tab, presses the button, or activates the switch, the optical property modifying device 120 can start to mix the optical property modifying reagent with the biological sample and regulate temperature of the reaction chambers to an appropriate set temperature point. In other embodiments, the optical property modifying device can start mixing and regulating temperature of the reaction chambers automatically once a biological sample is presented.
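The chamber-to-test mapping described in paragraph [0051] could be collapsed into per-disease calls along the following lines; the chamber ordering and the control-gating rules are illustrative assumptions, not the disclosed logic:

```python
def interpret_panel(chamber_calls):
    """Combine per-chamber calls into test results, gated by the control chambers.

    chamber_calls: five strings ("positive"/"negative") ordered as two influenza A
    chambers, one influenza B chamber, a positive control, and a negative control,
    mirroring the example layout above.
    """
    flu_a_1, flu_a_2, flu_b, pos_ctrl, neg_ctrl = chamber_calls
    if pos_ctrl != "positive" or neg_ctrl != "negative":
        return {"valid": False}        # a failed control invalidates the run
    return {
        "valid": True,
        "influenza_a": "positive" if "positive" in (flu_a_1, flu_a_2) else "negative",
        "influenza_b": flu_b,
    }
```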
[0053] The collection tube 200 is configured to store a biological sample and includes a cartridge interface 210 and a cap 220. A user can provide a biological sample to the collection tube 200 using a swab, for example. The cap 220 encloses the biological sample in the collection tube 200. A user can attach the collection tube 200 to the optical property modifying device 120 by physically contacting the cartridge interface 210 of the collection tube 200 to the collection tube interface 230 of the optical property modifying device 120. When attached, the collection tube 200 can provide a stored biological sample to the optical property modifying device 120 via the interfaces 210 and 230. Though the collection tube 200 shown in FIG. 2A is a cylindrical tube, in other embodiments, the collection tube 200 can have a different overall shape or form factor such as a rectangular prism. Alternatively, in some embodiments, a collection tube is not used to provide the biological sample to the optical property modifying device 120.
[0054] FIG. 2B is a diagram of the optical property modifying device 120 for use with the mobile device 110, according to another embodiment. Unlike the embodiment of the optical property modifying device 120 shown in FIG. 2A, in the embodiment of the optical property modifying device 120 shown in FIG. 2B, the reaction chambers of the device are not located on a surface of the device, and thus are not visible from a point exterior to the optical property modifying device 120.
[0055] Instead, in the embodiment of the optical property modifying device 120 shown in FIG. 2B, the optical property modifying device 120 includes an electronic display 270 on a surface. The electronic display 270 can be, for example, a liquid crystal display (LCD), organic light emitting diode (OLED) display, electronic paper display, or one or more individual light emitting diodes (LEDs), among other types of displays. The electronic display 270 is configured to present an indication of one or more results of a diagnostic test performed by the optical property modifying device
120. More specifically, the electronic display 270 is configured to present an indication of one or more results of a diagnostic test based on reactions between a biological sample and at least an optical property modifying reagent within the reaction chambers of the optical property modifying device 120. In some embodiments, an indication of a diagnostic test result presented by the electronic display 270 can include the actual, determined result of the diagnostic test, presented as one or more of human-readable symbols (e.g., alphanumeric characters or graphics) and machine-readable symbols (e.g., a barcode or QR code). For example, in the embodiment depicted in FIG. 2B, the electronic display 270 presents the actual, determined result of the diagnostic test,“Flu A Positive”, in human-readable symbols.
[0056] An indication of a diagnostic test result presented by the electronic display
270 may be based on one or more of color characteristics and geometric characteristics of the reaction chambers. A software algorithm can determine the indication to be displayed by the electronic display 270, based on one or more of the color
characteristics and the geometric characteristics of the reaction chambers. As mentioned above, a color characteristic can include, for example, a color of a reaction between the biological sample and at least the optical property modifying reagent within a reaction chamber. A geometric characteristic can include, for example, a geometric pattern formed by a layout of the reaction chambers of the optical property modifying device 120.
[0057] In such embodiments in which an indication of a result of a diagnostic test performed by the optical property modifying device 120 is presented on the electronic display 270 of the optical property modifying device 120, the mobile device 110 can capture images of the electronic display 270. For example, as shown in FIG. 2B, the mobile device 110 can be positioned over the optical property modifying device 120 to scan or take a photo or video of the electronic display 270 of the optical property modifying device 120. Then, the diagnostic system can use image recognition processes to determine the result of the diagnostic test based on the indication presented by the electronic display 270, as captured in the images. The image recognition processes can include, for example, optical pattern matching, optical template matching, or optical character recognition (OCR) algorithms known to one skilled in the art. As an example, as shown in FIG. 2B, the diagnostic system can detect that an image of the electronic display 270 of the optical property modifying device 120 captured by the mobile device 110 presents a message indicating a positive diagnosis for influenza A. Responsive to the detection, the mobile device 110 can provide further interpretation or options for next steps to a user.
[0058] While the embodiment of the optical property modifying device 120 depicted in FIG. 2A includes reaction chambers 240 located on a surface of the optical property modifying device 120 and no electronic display, and the embodiment of the optical property modifying device 120 depicted in FIG. 2B includes an electronic display 270 and reaction chambers that are not located on a surface of the optical property modifying device 120, in alternative embodiments, features of the embodiments of the optical property modifying device 120 depicted in FIGS. 2A-B can occur in any combination. For example, another embodiment of the optical property modifying device 120 may include both the electronic display 270 and the reaction chambers 240 located on a surface of the optical property modifying device 120.
II. EXAMPLE SYSTEM ARCHITECTURE
[0059] FIG. 3 is a block diagram of the diagnostic system 100, according to one embodiment. The diagnostic system 100 includes a user data store 310, an image processing engine 320, an image analysis engine 330, a diagnostic test engine 340, and a test data store 350. In other embodiments, the diagnostic system 100 can include additional, fewer, or different components for various applications.
[0060] In some embodiments, some or all of the functionality of the diagnostic system 100 can be performed by or implemented on the diagnostic server 150 instead of the mobile device 110. For example, the mobile device 110 can acquire images of the optical property modifying device 120 using the image processing engine 320 and provide the images to the diagnostic server 150 for further processing by the image analysis engine 330. The mobile device 110 can then receive a test result determined by the diagnostic server 150 and communicate the test result to a user. This can be advantageous because the image analysis engine 330 can require more computational resources relative to the image processing engine 320 or other components of the diagnostic system 100 on the mobile device 110.
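As a rough sketch of this client-server split, the mobile side could upload the captured image and its metadata to the diagnostic server and wait for the computed result; the endpoint URL, field names, and response schema below are hypothetical, not part of the disclosure.

import requests

DIAGNOSTIC_SERVER_URL = "https://example.com/api/v1/diagnose"  # hypothetical endpoint

def submit_image_for_analysis(image_path, metadata):
    # Upload the captured image plus metadata; the heavier image analysis
    # runs server-side and only the determined result comes back to the phone.
    with open(image_path, "rb") as f:
        response = requests.post(
            DIAGNOSTIC_SERVER_URL,
            files={"image": f},
            data=metadata,  # e.g. {"timestamp": ..., "lat": ..., "lon": ...}
            timeout=30,
        )
    response.raise_for_status()
    return response.json()["result"]  # hypothetical response schema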
[0061] The user data store 310 stores information about users of the diagnostic system 100, such as patients who have completed— or will complete— a diagnostic test.
The information can include user information such as name, demographics, family data, geographical location, timestamp data, contact information, or physiological data. The information can describe a user's medical history such as prescriptions, health conditions, genetic data, visits to health care providers, past or current medical services or interventions received, or previously completed diagnostic tests along with relevant details such as the date and/or time that the tests were taken and the test results.
[0062] The image processing engine 320 acquires and processes images from the mobile device 110. The image processing engine 320 can implement image processing techniques known to those skilled in the art such as noise reduction using different types of filters, skew correction, resizing, rescaling, rotation, equalization,
focusing/sharpening, or edge detection. The image processing engine 320 can also determine whether the optical property modifying device 120 or a particular portion of the optical property modifying device 120 is included in the image based on known geometric characteristics and/or color characteristics associated with the optical property modifying device 120.
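A minimal preprocessing sketch along these lines, assuming OpenCV is available, might chain denoising, rescaling, and an unsharp-mask sharpening step; the parameter values are placeholders rather than calibrated settings.

import cv2

def preprocess(image, target_width=1024):
    # Denoise while preserving the edges of fiducial marks.
    denoised = cv2.fastNlMeansDenoisingColored(image, None, 10, 10, 7, 21)
    # Rescale to a common working width so later steps see similar pixel sizes.
    scale = target_width / denoised.shape[1]
    resized = cv2.resize(denoised, None, fx=scale, fy=scale)
    # Unsharp mask: subtract a blurred copy to sharpen slightly soft captures.
    blurred = cv2.GaussianBlur(resized, (0, 0), 3)
    sharpened = cv2.addWeighted(resized, 1.5, blurred, -0.5, 0)
    return sharpened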
[0063] As mentioned above, in embodiments in which the reaction chambers of the optical property modifying device 120 are located on a surface of the device, a known geometric characteristic of the optical property modifying device 120 can include a geometric pattern formed by a layout of the reaction chambers on the surface of the optical property modifying device 120. In embodiments in which the optical property modifying device 120 includes an electronic display, the known geometric
characteristic of the optical property modifying device 120 can include dimensions of the electronic display. As another example, the QR code 260 shown in FIGS. 2A and 2B can be a geometric characteristic that is unique to the optical property modifying device 120, a batch of optical property modifying devices 120 (e.g., having a certain expiration date based on a manufacturing date), or a particular type of diagnostic test. Other types of geometric characteristics include, for example, a border of the optical property modifying device 120, a two-dimensional (2D) or three-dimensional (3D) barcode, an alignment marker (e.g., the reaction chambers 240, the indicator 250, the example text "Flu A+B Test" shown in FIG. 2A), or any other type of fiducial marker that can be a point of reference in an image.
[0064] The image processing engine 320 can transform an acquired image (e.g., translate, skew, scale, or rotate) based on geometric characteristics, or in other words, spatial data of the image determined by the image processing engine 320. Alternatively, the image processing engine 320 can leave the image unchanged but instead transform the coordinates it uses to identify regions of interest for analysis within the image. As an example, a fiducial marker of the optical property modifying device 120 has a reference orientation and/or size. For instance, the QR code 260 shown in FIG. 2A is known to have a length and width of one centimeter (e.g., dimensions of a square) and be oriented with the text "Flu A+B Test" labeled on the optical property modifying device 120. If the image processing engine 320 determines that the QR code 260 appears smaller or larger than expected based on the reference size, the image processing engine 320 can scale the image up or down, respectively. If the image processing engine 320 determines that the QR code 260 shown in an image is skewed in shape (e.g., a trapezoid or rhombus), the image processing engine 320 can adjust the skew of the image so that the QR code 260 more closely matches its expected square shape. If the image processing engine 320 determines that the QR code 260 shown in the image is not aligned to the reference orientation (e.g., the image has been rotated by 45, 60, or 90 degrees), the image processing engine 320 can rotate the image to the reference orientation. Further, the image processing engine 320 can determine a level of focus of the image by resolving two or more features of the optical property modifying device 120 such as the QR code 260 and another fiducial marker.
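One way such a transformation could be implemented, assuming OpenCV's built-in QR detector and its usual corner ordering, is to warp the image so the detected QR code lands on an upright reference square of known size; the reference size in pixels is a placeholder.

import cv2
import numpy as np

def normalize_to_qr(image, qr_side_px=200):
    detector = cv2.QRCodeDetector()
    data, points, _ = detector.detectAndDecode(image)
    if points is None:
        return None  # fiducial not found; try another frame
    # Assumed corner order: top-left, top-right, bottom-right, bottom-left.
    corners = points.reshape(4, 2).astype(np.float32)
    # Map the detected QR corners onto an upright square of known size,
    # correcting rotation, scale, and perspective skew in a single warp.
    reference = np.float32([[0, 0], [qr_side_px, 0],
                            [qr_side_px, qr_side_px], [0, qr_side_px]])
    transform = cv2.getPerspectiveTransform(corners, reference)
    h, w = image.shape[:2]
    return cv2.warpPerspective(image, transform, (w, h))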
[0065] As mentioned above, known color characteristics can also be associated with the optical property modifying device 120. Specifically, the image processing engine
320 has information indicating that a surface (or a fiducial marker) of the optical property modifying device 120 is known to have a certain color or colors. As mentioned above, in embodiments in which the reaction chambers of the optical property modifying device 120 are located on a surface of the device, a known color characteristic of the optical property modifying device 120 can include a color of one or more contents of the reaction chambers of the optical property modifying device 120, such as the optical property modifying reagent or a reaction solution. Additional color characteristics of the optical property modifying device 120 can include colors of components of the optical property modifying device 120 itself, such as, for example, a color of the indicator 250 shown in FIG. 2A.
[0066] As another example, a surface of the optical property modifying device 120 can be known to have a uniform color (e.g., an approximately white color). However, an image of the optical property modifying device 120 can be tinted with warmer colors (e.g., red) or cooler colors (e.g., blue) depending on lighting conditions when the image was captured. The image processing engine 320 can use white balancing (e.g., color balancing) to determine the lighting conditions of the image and adjust the colors of the image. Thus, the image processing engine 320 can properly match the known white color of the surface of the optical property modifying device 120 with the adjusted image. The image processing engine 320 can also use shadowing techniques to remove or otherwise account for shadows in images that can skew the color of certain portions of the images. For instance, the image processing engine 320 measures the uniformity of a known color of the surface of the optical property modifying device 120 to identify shadows in an image.
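A reference-patch white balance of this kind can be sketched in a few lines of NumPy, assuming the coordinates of a region of the cartridge surface known to be white are available; the coordinates and clipping range below are placeholders.

import numpy as np

def white_balance(image, white_patch_box):
    # white_patch_box = (y0, y1, x0, x1): a region of the cartridge surface
    # known to be white under neutral lighting (placeholder coordinates).
    y0, y1, x0, x1 = white_patch_box
    patch = image[y0:y1, x0:x1].astype(np.float64)
    channel_means = patch.reshape(-1, 3).mean(axis=0)
    # Scale each channel so the known-white patch becomes neutral gray.
    gains = channel_means.mean() / channel_means
    balanced = image.astype(np.float64) * gains
    return np.clip(balanced, 0, 255).astype(np.uint8)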
[0067] In some embodiments, the image processing engine 320 continuously monitors images captured by an optical sensor of the mobile device 110. For example, the mobile device 110 displays a live stream of images captured by the optical sensor
(e.g., representing the field of view of the optical sensor) to provide a real-time visual guide for a user to align the mobile device 110 with the optical property modifying device 120 being imaged. While monitoring, the image processing engine 320 can determine a quality level of each image. If the quality level of an image meets a threshold level, the image processing engine 320 selects the image for further processing. The image processing engine 320 can determine the quality level and threshold level based on, for instance, the focus, shadowing, white-balancing level, or illumination level of an image.
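A simple per-frame quality gate of this kind might, for example, combine a Laplacian-variance focus measure with a mean-brightness check; the thresholds below are placeholders, not calibrated values.

import cv2

def frame_is_usable(image, focus_threshold=100.0, min_brightness=60, max_brightness=200):
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    focus = cv2.Laplacian(gray, cv2.CV_64F).var()  # low variance suggests a blurry frame
    brightness = gray.mean()
    return focus >= focus_threshold and min_brightness <= brightness <= max_brightness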
[0068] Further, by adjusting the image using the techniques and parameters described above, the image processing engine 320 can normalize images captured by mobile devices 110 having different technical specifications. This can be advantageous because users are likely to have different types of mobile devices 110. For instance, one mobile device has a higher-resolution camera with a stronger white flash than another mobile device. By normalizing images captured from these two mobile devices, the image processing engine 320 can effectively process images regardless of the particular specifications of the mobile devices.
[0069] The image analysis engine 330 analyzes images acquired and processed by the image processing engine 320. Specifically, in embodiments in which the optical property modifying device 120 includes reaction chambers located on a surface of the optical property modifying device 120 such that contents of the reactions chambers are at least partially visible in images of the optical property modifying device 120, the image analysis engine 330 analyzes the reaction chambers in the images. On the other hand, in embodiments in which the optical property modifying device 120 includes an electronic display that displays an indication of a result of a diagnostic test performed by the optical property modifying device 120, the image analysis engine 330 analyzes the indication displayed by the electronic display in the images captured of the optical property modifying device 120. Both embodiments of image analysis by the image analysis engine 330 are discussed in turn below.
[0070] First, turning to the embodiment in which the image analysis engine 330 analyzes images including reaction chambers located on a surface of the optical property modifying device 120, the image analysis engine 330 determines one or more color characteristics of the reaction chambers in the images such as the color, transparency, translucency, opacity, or uniformity of the reaction chambers and/or contents within the reaction chambers. As an example, the image analysis engine 330 determines information about the color characteristics of the reaction chambers, e.g., the average and variance of color of a group of pixels in an image corresponding to each reaction chamber. The image analysis engine 330 can determine the color average and variance (or any other statistical analysis) based on a RGB (red-green-blue) color model, hue-saturation-value (HSV) color model, or any other suitable color model. The image analysis engine 330 identifies these color characteristics based on known geometric characteristics of the optical property modifying device 120 such as size, shape, orientation, or identifying fiducial marks. For instance, the image analysis engine 330 retrieves a color image (e.g., to use as a reference image) from the test data store 350, identifies the size and orientation of a QR code in the color image, and uses a known size (e.g., 1 x 1 cm) of the QR code and known relative positions of the five reaction chambers to the QR code (e.g., 2 cm to the right and above, spaced at 1 cm increments) to extract the color average and variance of each reaction chamber.
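A sketch of this sampling step, assuming the image has already been geometrically normalized and the QR code origin and pixel-per-centimeter scale are known, could read the mean and variance of a small patch at each expected chamber location; the offsets mirror the example layout above and are placeholders only.

import cv2

def chamber_color_stats(normalized_image, qr_origin, px_per_cm, num_chambers=5):
    # Placeholder layout echoing the example above: chambers 2 cm to the right
    # of and above the QR code, spaced at 1 cm increments.
    hsv = cv2.cvtColor(normalized_image, cv2.COLOR_BGR2HSV)
    x0, y0 = qr_origin
    stats = []
    for i in range(num_chambers):
        cx = int(x0 + 2 * px_per_cm)
        cy = int(y0 - 2 * px_per_cm - i * px_per_cm)
        patch = hsv[cy - 5:cy + 5, cx - 5:cx + 5].reshape(-1, 3)
        stats.append((patch.mean(axis=0), patch.var(axis=0)))
    return stats  # per-chamber (mean HSV, variance HSV)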
[0071] The image analysis engine 330 can determine test results for a disease diagnostic test by comparing the determined color information of a reaction chamber to reference information that indicates one or more color ranges (e.g., based on the RGB or HSV color model). A color range can correspond to a diagnosis for a disease diagnostic test associated with the reaction chamber. Thus, the image analysis engine 330 can determine a test result based on the color range within which the determined color of the reaction chamber falls. In other words, the image analysis engine 330 can match color changes of the reaction chambers to a particular test result. For instance, the image analysis engine 330 determines a negative diagnosis of the disease responsive to matching the determined color to a first color range (e.g., yellow to green colors), a positive diagnosis of the disease responsive to matching the determined color to a second color range (e.g., blue to violet colors), and an undetermined (e.g., "not available" or "unknown") diagnosis of the disease responsive to matching the determined color to a third color range (or any colors that do not fall within any specified range).
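As a toy example of such range matching, a mean hue (on OpenCV's 0-179 scale) could be mapped to the three outcomes as follows; the hue boundaries are illustrative and do not reflect the actual calibration of any assay.

def classify_mean_hue(mean_hue):
    # Hue on OpenCV's 0-179 scale; the bands below are illustrative only.
    if 25 <= mean_hue <= 75:    # yellow-to-green band -> negative
        return "negative"
    if 100 <= mean_hue <= 155:  # blue-to-violet band -> positive
        return "positive"
    return "undetermined"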
[0072] Alternatively, the image analysis engine 330 can determine a test result based on the distribution and clustering of colors of the individual pixels comprising the image of a reaction chamber, rather than a point estimate of color average or variance. For example, the image analysis engine 330 can classify each pixel within the image of a reaction chamber as falling within one of the three color ranges described above. If a plurality of pixels falls within the first color range, the image analysis engine 330 determines a negative diagnosis of the disease. If a plurality of pixels falls within the second color range, the image analysis engine 330 determines a positive diagnosis of the disease. If a plurality of pixels falls within the third color range, the image analysis engine 330 determines an undetermined diagnosis of the disease. In this way, the image analysis engine 330 is robust to, for example, black and white pixels representing reflections from chamber edges that are substantially different colors from the solution within the reaction chambers and might confound a point estimate-based approach to determining the color of the solution.
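The per-pixel variant can be sketched as a plurality vote over individually classified pixels, using the same illustrative hue bands as in the previous sketch.

from collections import Counter

def classify_pixels_by_plurality(hue_values):
    # hue_values: per-pixel hues (OpenCV 0-179 scale) for one chamber.
    def classify(h):  # illustrative bands only
        if 25 <= h <= 75:
            return "negative"
        if 100 <= h <= 155:
            return "positive"
        return "undetermined"
    votes = Counter(classify(h) for h in hue_values)
    # The plurality class wins, so stray specular or shadowed pixels do not
    # drag the result the way they would drag a mean color.
    return votes.most_common(1)[0][0]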
[0073] In some embodiments, the image analysis engine 330 determines a severity level of the disease or a confidence level of the test result based on comparing the color information with the reference information. For example, relative to a lighter color, a darker color within a certain color range can indicate that the disease diagnosis is more severe or that there is a greater confidence level. An undetermined test result can occur due to various factors such as a defect in the collection tube 200 or optical property modifying device 120, variation in the captured image, or human error by the user or patient, e.g., contaminating the biological sample provided for the diagnostic test or not waiting long enough (or too long) for a chemical reaction of the diagnostic test to complete.
[0074] In some embodiments, the image analysis engine 330 implements defect detection algorithms to analyze an image of an optical property modifying device 120.
For example, the image analysis engine 330 can identify bubbles in the reaction chambers of the optical property modifying device 120 or other types of defects such as debris or scratches inside or on the exterior of the reaction chambers. Based on the identification of a defect, the image analysis engine 330 can use image processing techniques to remove the defects from the image, e.g., removing a cluster of pixels corresponding to a defect. Thus, the image analysis engine 330 can exclude anomalous color values from the determination of color information (e.g., average and variance), which can improve the accuracy of the corresponding test result.
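A crude form of such exclusion could mask out pixels whose saturation or brightness falls outside the range expected for the colored solution before averaging; the thresholds below are placeholders.

import numpy as np

def mean_color_excluding_defects(hsv_patch, min_sat=40, min_val=40, max_val=230):
    # hsv_patch: (N, 3) array of HSV pixels for one chamber.
    sat, val = hsv_patch[:, 1], hsv_patch[:, 2]
    # Keep pixels that look like colored solution; drop glare, shadows,
    # and bubble or scratch edges (threshold values are placeholders).
    keep = (sat >= min_sat) & (val >= min_val) & (val <= max_val)
    if not np.any(keep):
        return None  # too many anomalous pixels; flag the image for retake
    return hsv_patch[keep].mean(axis=0)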
[0075] Turning next to the embodiment in which the image analysis engine 330 analyzes images including an indication of a diagnostic test result displayed by an electronic display of the optical property modifying device 120, the image analysis engine 330 determines the diagnostic test result by interpreting the indication displayed by the electronic display. As discussed above, an indication of a diagnostic test result displayed by an electronic display can include the actual result of the diagnostic test. In such embodiments, the indication can be presented by the electronic display as human-readable symbols such as text, machine-readable symbols such as a barcode, and any other type of symbols. To interpret the indication displayed by the electronic display, the image analysis engine 330 can perform pattern matching, template matching, or another technique known to those skilled in the art.
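For the template-matching route, a sketch using OpenCV's normalized cross-correlation against a set of reference display images might look as follows; the template set and acceptance threshold are hypothetical.

import cv2

def match_display_indication(display_gray, templates):
    # templates: dict mapping a result label to a grayscale reference image of
    # the display showing that result (hypothetical asset set).
    best_label, best_score = None, -1.0
    for label, template in templates.items():
        scores = cv2.matchTemplate(display_gray, template, cv2.TM_CCOEFF_NORMED)
        _, score, _, _ = cv2.minMaxLoc(scores)
        if score > best_score:
            best_label, best_score = label, score
    return best_label if best_score > 0.8 else None  # placeholder threshold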
[0076] The image analysis engine 330 can also determine test results further based on information other than analyzed images in some embodiments. For example, the image analysis engine 330 can determine test results based in part on a geographical location and/or timestamp of the diagnostic test. In particular, a certain disease can be more prevalent in a certain geographical region (e.g., an area with a tropical climate or a dry climate) or during a certain time range (e.g., during the summer or winter season). In some embodiments, the performance (e.g., sensitivity vs. specificity) of the test can be adjusted based upon known epidemiological factors such as disease prevalence so that during times of low prevalence, the sensitivity of the test can be decreased to prevent false positive test results.
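The interaction between prevalence and test performance can be made concrete with Bayes' rule: the positive predictive value of the same assay changes sharply with prevalence, which is the motivation for such adjustments. A short worked sketch, with illustrative numbers only:

def positive_predictive_value(sensitivity, specificity, prevalence):
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Illustrative values: the same assay looks very different in and out of season.
# positive_predictive_value(0.95, 0.95, 0.10)   -> about 0.68
# positive_predictive_value(0.95, 0.95, 0.001)  -> about 0.02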
[0077] The diagnostic test engine 340 provides information to and/or receives information from the mobile device 110 and the health care provider 130 for disease diagnostic tests. For example, the diagnostic test engine 340 can provide instructions for a diagnostic test to the mobile device 110. The diagnostic test engine 340 can retrieve the instructions for the diagnostic test (e.g., including text, graphics, audio, or other media content) from the test data store 350.
[0078] The diagnostic test engine 340 also receives data from the mobile device
110, such as images of the optical property modifying device 120 captured by an optical sensor of the mobile device 110 and/or metadata such as the geographical location of the mobile device 110, a timestamp of a captured image, information describing the optical sensor (e.g., pixel resolution, aperture, or flash), patient electronic health record data, patient vital signs, and any other pertinent information. The diagnostic test engine
340 determines information for diagnostic tests (e.g., a type of diagnostic test or a result of the diagnostic test) based on images processed and analyzed by the image processing engine 320 and/or the image analysis engine 330.
[0079] Following identification of a diagnostic test result, the diagnostic test engine 340 can provide the test result to the mobile device 110 and/or to a third-party system, including the health care provider 130. Further, the diagnostic test engine 340 can store test results in the test data store 350 or the user data store 310 along with one or more of the metadata mentioned above.
[0080] As mentioned above, in some embodiments, the diagnostic test engine 340 can provide test results and one or more of the metadata mentioned above to a computer server of a third-party system including the health care provider 130 such as a physician or a pharmacy, a government agency such as the Centers for Disease Control and Prevention (CDC), an insurance provider, a telemedicine partner, a treatment manufacturer, and any other third-party system to aid in disease prevention and/or treatment. The third-party system can store aggregated health information of a population of people and can organize the health information based on geographical location and/or based on timestamp. The third-party system can thus render epidemiological data based on aggregated test results from multiple mobile devices 110 and a population of patients. The epidemiological data can be organized based on various parameters (e.g., demographics, geographical location, temporal information, or types of diseases) and used to determine risk factors or preventative measures for certain diseases. For example, the CDC can use the test results and health information to evaluate the well-being of a population within the government's jurisdiction, e.g., monitoring the spread and prevalence of a disease such as influenza. As a further example, the epidemiological data can be used to develop a real-time quantitative, qualitative, or semi-quantitative severity index for a particular disease, based in part on geographical location. For instance, a visual heat map can be generated to indicate geographical disease prevalence. As yet another example, disease prevalence can be forecast based on historical disease averages, real-time diagnostic results, user geographical history, weather, traffic patterns, or any other relevant statistics using predictive algorithms. Determined geographical disease trends can be provided to third-party systems such as pharmaceutical manufacturers, retail stores, government entities, and insurance providers to guide treatment-related supply chain and resource allocation decisions. Additionally, geographical disease trends can be provided to individual users and other third-party systems such as social media platforms, travel providers, and advertising agencies to inform individual user behavior.
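A minimal aggregation sketch, assuming test results are reported as records containing a region identifier and an outcome, could compute the per-region positivity rate from which a heat map or severity index might be derived; the record shape is hypothetical.

from collections import defaultdict

def regional_positivity(results):
    # results: iterable of dicts like {"region": "94105", "result": "positive"}
    # (hypothetical record shape; regions could equally be lat/lon grid cells).
    counts = defaultdict(lambda: {"positive": 0, "total": 0})
    for r in results:
        bucket = counts[r["region"]]
        bucket["total"] += 1
        if r["result"] == "positive":
            bucket["positive"] += 1
    return {region: c["positive"] / c["total"] for region, c in counts.items()}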
[0081] In some embodiments, the diagnostic test engine 340 can communicate sponsored content based on test results of diagnostic tests to users via the mobile device 110. For instance, the sponsored content is a content item displayed on a user interface of the mobile device 110. Examples of sponsored content include information from vendors of goods or services that can help treat diseases tested by the diagnostic system 100. For instance, the vendors provide prescription medicines, over-the-counter medicines, physical therapy, rehabilitative services, etc.
[0082] In further embodiments, the diagnostic test engine 340 can communicate alerts to users via the mobile device 110. These alerts can describe local disease prevalence and provide recommendations for disease prevention, diagnostics, and treatment. The alerts can be based on a user’s location history and potential recent disease exposure.
III. EXAMPLE USER INTERFACES OF MOBILE APPLICATION
[0083] FIG. 4A shows a user interface 400 of a mobile application for a disease diagnostic test, according to one embodiment. The user interface 400 shows an image
405 of the optical property modifying device 120. The image 405 shows the field of view of the camera of the mobile device 110 displaying the user interface 400. The image 405 is oriented to align with the mobile device 110 because the edges of the optical property modifying device 120 are approximately parallel to the edges of the user interface 400, and by extension, to the mobile device 110. In some embodiments, the user interface 400 includes one or more alignment markers overlaying the field of view to assist a user of the mobile device 110 in aligning the mobile device 110 to the optical property modifying device 120 for capturing the image 405. For example, an alignment marker is a semi-transparent graphic of the optical property modifying device 120 that the user can use to overlap with the physical optical property modifying device 120. As another example, the alignment marker corresponds to a geometric characteristic of the optical property modifying device 120 (e.g., the QR code 260), or a line that should be aligned with an edge of the optical property modifying device 120.
[0084] FIG. 4B shows another user interface 410 of the mobile application shown in FIG. 4A, including information about the disease diagnostic test, according to one embodiment. The image processing engine 320 and image analysis engine 330 process the image 405 of the optical property modifying device 120 shown in FIG. 4A. Based on the processing, the diagnostic test engine 340 determines a disease diagnostic test of the optical property modifying device 120 and provides information about the disease diagnostic test for display. In particular, the user interface 410 indicates the type of test (e.g., influenza type A and B), expiration date (e.g., Jan 30, 2020), test description, duration in time (e.g., 20 minutes for a chemical reaction of the test), and cartridge ID (e.g., 0x34E) of the optical property modifying device 120.
[0085] FIG. 4C shows another user interface 415 of the mobile application shown in FIG. 4A, including instructions to use a swab 420, according to one embodiment. The user interface 415 indicates instructions for an example step of the disease diagnostic test shown in FIG. 4B. For this step, the user is instructed to remove the swab 420 from a sterile package 425 and swab the nostril of a patient undergoing the disease diagnostic test to collect a biological sample of the patient.
[0086] FIG. 4D shows another user interface 430 of the mobile application shown in FIG. 4A, including instructions to use a collection tube 200, according to one embodiment. The user interface 430 indicates instructions for an example step of the disease diagnostic test shown in FIG. 4B. For this step, the user is instructed to insert the swab 420 into the collection tube 200 and swirl the swab 420 around for 10 seconds. Afterwards, the swab 420 should be discarded. The cap 220 of the collection tube 200 is removed so that the swab 420 can be inserted inside.
[0087] FIG. 4E shows another user interface 435 of the mobile application shown in FIG. 4A, including instructions to use a cartridge, according to one embodiment. The user interface 435 indicates instructions for an example step of the disease diagnostic test shown in FIG. 4B. For this step, the user is instructed to place the collection tube 200 onto the cartridge (the optical property modifying device 120) and close the cap 220. In particular, the cartridge interface 210 should be coupled to the collection tube interface 230, which provides the biological sample of the patient to the optical property modifying device 120. The user is also instructed to pull the tab of the optical property modifying device 120 to start the chemical reaction of the disease diagnostic test.
[0088] FIG. 4F shows another user interface 440 of the mobile application shown in FIG. 4A, including instructions to wait for a chemical reaction of the disease diagnostic test to complete, according to one embodiment. The user interface 440 displays a timer that indicates the time remaining until the chemical reaction of the disease diagnostic test shown in FIG. 4B should be complete. Though the total time for this example disease diagnostic test is 20 minutes, in other embodiments, the time can vary, e.g., between 1 and 60 minutes.
[0089] FIG. 4G shows another user interface 445 of the mobile application shown in FIG. 4A, including instructions to scan the cartridge, according to one embodiment. The user interface 445 indicates instructions for an example step of the disease diagnostic test shown in FIG. 4B. For this step, the user is instructed to scan (take one or more images of) the cartridge (the optical property modifying device 120). Since the chemical reaction of the disease diagnostic test has completed after waiting the duration of time for the test, the color characteristics of the reaction chambers of the imaged optical property modifying device 120 have changed. Because the optical property modifying device 120 depicted in FIGS. 4A-H is the embodiment of the optical property modifying device 120 depicted in FIG. 2A, the reaction chambers are located on a surface of the optical property modifying device 120, and thus the contents of the reaction chambers are at least partially visible in the images of the optical property modifying device 120 depicted in FIGS. 4A-H. Therefore, the image 450 of the optical property modifying device 120 shows that two of the reaction chambers, 455 and 460, have changed at least in color as a result of the biological sample of the patient reacting with an optical property modifying reagent.
[0090] In alternative embodiments in which the optical property modifying device 120 depicted in FIGS. 4A-H is the embodiment of the optical property modifying device 120 depicted in FIG. 2B, the reaction chambers would not be located on a surface of the optical property modifying device 120, and thus the contents of the reaction chambers would not be visible in the images of the optical property modifying device 120 depicted in FIGS. 4A-H. Rather, in alternative embodiments in which the optical property modifying device 120 depicted in FIGS. 4A-H is the embodiment of the optical property modifying device 120 depicted in FIG. 2B, the optical property modifying device 120 would include an electronic display displaying an indication of a result of the diagnostic test.
[0091] FIG. 4H shows another user interface 465 of the mobile application shown in FIG. 4A, including results for the disease diagnostic test, according to one
embodiment. The image processing engine 320 and image analysis engine 330 process and analyze the image 450 of the optical property modifying device 120 shown in FIG.
4G. Based on the analysis, the diagnostic test engine 340 determines test results of the disease diagnostic test and provides information associated with the test results for display on the user interface 465. In particular, the user interface 465 indicates the patient has tested positive for influenza A and negative for influenza B. The user interface 465 also shows an option for the user or patient to call a treatment hotline. For instance, the treatment hotline is a phone number of a health care provider that can provide pharmaceutical drugs or other health care services for treating influenza.
[0092] In some embodiments, the user interface 465 provides an option for the user to modify, verify/confirm, or cancel the test result, which can be performed before the diagnostic system 100 provides (via the diagnostic test engine 340) the test result to the health care provider 130, for example. If the test result is updated after it has been provided to the health care provider 130, the diagnostic test engine 340 can automatically provide the updated test result as well. In one embodiment, if the test result is positive, the diagnostic test engine 340 provides a notification for display on the mobile device 110 indicating that the health care provider 130 will be contacted on behalf of the patient regarding the test result. For example, the diagnostic test engine 340 provides the test result to a physician. The diagnostic test engine 340 can receive a recommendation (e.g., taking vitamins or medication) from the health care provider 130 in response to providing the test result and communicate the recommendation to the user.
[0093] As another example, the diagnostic test engine 340 provides the test result to a pharmacy to allow the pharmacy to fulfill a prescription for the patient to treat the disease associated with the test result. The diagnostic test engine 340 can determine a pharmacy that is located near the patient (e.g., within a predetermined distance such as a threshold radius of 10 kilometers) based on geographical information, for instance, the location where the diagnostic test was taken or the address of the patient.
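A sketch of this proximity check, using the great-circle (haversine) distance and the 10-kilometer radius mentioned above, might look as follows; the pharmacy record format is hypothetical.

import math

def haversine_km(lat1, lon1, lat2, lon2):
    r = 6371.0  # mean Earth radius in kilometers
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def pharmacies_within_radius(patient_lat, patient_lon, pharmacies, radius_km=10.0):
    # pharmacies: iterable of dicts like {"name": ..., "lat": ..., "lon": ...}
    return [p for p in pharmacies
            if haversine_km(patient_lat, patient_lon, p["lat"], p["lon"]) <= radius_km]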
IV. EXAMPLE DATA FLOW DIAGRAM
[0094] FIG. 5A is a data flow diagram for performing a disease diagnostic test in a conventional system environment according to one embodiment. For an example process, in step 1, a patient 500 who wants to complete a disease diagnostic test visits a physician 510 at a hospital or clinic. The physician 510 collects a biological sample of the patient 500. In step 2, the physician 510 provides the biological sample to a central lab 530 with staff that performs the chemical reaction portion of the disease diagnostic test using lab equipment and the biological sample. In step 3, the central lab 530 provides test results of the disease diagnostic test to the physician 510. In step 4, the physician 510 provides the test results to the patient 500. If the patient tested positive for a disease, the test results can indicate a prescription for medication or another type of medical treatment. In step 5, the physician 510 provides the prescription to a pharmacy 520. In some cases, the pharmacy 520 consults with the physician 510 to verify or clarify the prescription. In step 6, the patient 500 visits the pharmacy 520 to request the medication for the prescription. In step 7, the pharmacy 520 fulfills the prescription by providing the medication to the patient 500.
[0095] FIG. 5B is a data flow diagram for performing a disease diagnostic test in a system environment including the diagnostic system 100 according to one embodiment.
For an example process, in step 1, a patient 500 who wants to complete a disease diagnostic test uses the diagnostic system 100, e.g., by completing steps of the disease diagnostic test as shown in FIGS. 4A-H. Once the diagnostic system 100 determines test results of the disease diagnostic test, the diagnostic system 100 can simultaneously provide the test results to the patient 500 and the pharmacy 520 in steps 2A and 2B. For example, as shown in FIG. 4H, the diagnostic system 100 provides the test results for display on a user interface of a mobile application. Additionally, the diagnostic system 100 can automatically provide the test results to the pharmacy 520 without requiring further input from the patient 500 (and/or a physician 510), or in response to receiving a confirmation of the test result from the patient 500, in some embodiments. In step 3, the pharmacy 520 fulfills a prescription for the patient 500 based on the test results, e.g., the pharmacy 520 delivers medications to the home of the patient 500 or prepares the medications for pickup at a location near the patient 500.
[0096] In cases where physician consultation is required, the diagnostic server 150 can consult a physician 510 in an automated manner without requiring that the patient specify the physician or manually facilitate this consultation. In some embodiments, one or more physicians 510 are affiliated with the diagnostic server 150. In some embodiments, the diagnostic system 100 can also provide the test results to the physician 510, e.g., in response to a request by the patient. Thus, if the patient 500 visits the physician 510 to seek treatment for the disease, the physician 510 can determine an appropriate treatment based on the test results.
[0097] In contrast to the conventional system environment shown in FIG. 5A, the system environment including the diagnostic system 100 shown in FIG. 5B facilitates a process requiring fewer steps for the patient 500 to complete a disease diagnostic test and receive appropriate medications to treat the disease if the patient 500 tested positive. By reducing the number of steps, the diagnostic system 100 enables patients to receive treatment for diseases more promptly. Further, the system environment shown in FIG. 5B does not require the central lab 530 because the diagnostic system 100 allows the patient 500 to complete the chemical reaction portion of the disease diagnostic test at home using the optical property modifying device 120. This can be particularly advantageous for patients residing far away from the nearest central lab
530, for example, in rural areas or developing countries.
V. EXAMPLE PROCESS FLOW
[0098] FIG. 6 is a flowchart illustrating a process 600 for determining test results for a disease diagnostic test according to one embodiment. In some embodiments, the process 600 is used by the diagnostic system 100— e.g., modules of the diagnostic system 100 described with reference to FIG. 3— within the system environment in FIG.
1. The process 600 can include different or additional steps than those described in conjunction with FIG. 6 in some embodiments or perform steps in different orders than the order described in conjunction with FIG. 6.
[0099] In one embodiment, the image processing engine 320 receives 610 a set of images of an optical property modifying device 120 for a nucleic acid disease diagnostic test captured by an optical sensor of a mobile device 110. In embodiments in which the optical property modifying device 120 includes an electronic display, the image processing engine 320 determines 620, for each image in the set, whether the electronic display of the optical property modifying device 120 is shown in the image. The image processing engine 320 selects 630 one or more images of the set that are determined to show the electronic display of the optical property modifying device 120. The image analysis engine 330 determines 640 a test result for the nucleic acid disease diagnostic test based on the one or more images. The diagnostic test engine 340 provides 650 the test result for display on the mobile device 110.
[0100] In embodiments in which the reaction chambers of the optical property modifying device 120 are located on a surface of the optical property modifying device 120, step 620 can also or alternatively include determining, for each image in the set, whether the reaction chambers of the optical property modifying device 120 are shown in the image. Furthermore, step 630 can also or alternatively include selecting one or more images of the set that are determined to show the reaction chambers of the optical property modifying device 120.
VI. ALTERNATIVE EMBODIMENTS
[0101] Upon reading this disclosure, those of skill in the art will appreciate still additional alternative structural and functional designs through the disclosed principles herein. Thus, while particular embodiments and applications have been illustrated and described, it is to be understood that the disclosed embodiments are not limited to the precise construction and components disclosed herein. Various modifications, changes and variations, which will be apparent to those skilled in the art, can be made in the arrangement, operation and details of the method and apparatus disclosed herein without departing from the spirit and scope defined in the appended claims.
[0102] As used herein any reference to "one embodiment" or "an embodiment" means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment.
[0103] Some embodiments can be described using the expression "coupled" and "connected" along with their derivatives. For example, some embodiments can be described using the term "coupled" to indicate that two or more elements are in direct physical or electrical contact. The term "coupled," however, can also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other. The embodiments are not limited in this context unless otherwise explicitly stated.
[0104] As used herein, the terms "comprises," "comprising," "includes," "including," "has," "having" or any other variation thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but can include other elements not expressly listed or inherent to such process, method, article, or apparatus. Further, unless expressly stated to the contrary, "or" refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).
[0105] In addition, use of the "a" or "an" is employed to describe elements and components of the embodiments herein. This is done merely for convenience and to give a general sense of the invention. This description should be read to include one or at least one and the singular also includes the plural unless it is obvious that it is meant otherwise.
[0106] Some portions of this description describe the embodiments of the invention in terms of algorithms and symbolic representations of operations on information.
These algorithmic descriptions and representations are commonly used by those skilled in the data processing arts to convey the substance of their work effectively to others skilled in the art. These operations, while described functionally, computationally, or logically, are understood to be implemented by computer programs or equivalent electrical circuits, microcode, or the like. Furthermore, it has also proven convenient at times to refer to these arrangements of operations as modules, without loss of generality. The described operations and their associated modules can be embodied in software, firmware, hardware, or any combinations thereof.
[0107] Any of the steps, operations, or processes described herein can be performed or implemented with one or more hardware or software modules, alone or in combination with other devices. In one embodiment, a software module is implemented with a computer program product including a computer-readable non-transitory medium containing computer program code, which can be executed by a computer processor for performing any or all of the steps, operations, or processes described.
[0108] Embodiments of the invention can also relate to a product that is produced by a computing process described herein. Such a product can include information resulting from a computing process, where the information is stored on a non-transitory, tangible computer readable storage medium and can include any embodiment of a computer program product or other data combination described herein.

Claims

What is claimed is:
1. An optical property modifying device configured to perform a nucleic acid based diagnostic test, the optical property modifying device comprising: an interface configured to receive a biological sample; a plurality of reaction chambers configured to perform a reaction between the
biological sample and at least an optical property modifying reagent; and an electronic display configured to present a result of the nucleic acid based
diagnostic test based on the reaction between the biological sample and at least the optical property modifying reagent.
2. The optical property modifying device of claim 1, wherein the electronic display is a liquid crystal display.
3. The optical property modifying device of claim 1, wherein the electronic display is an organic light emitting diode display.
4. The optical property modifying device of claim 1, wherein the electronic display is an electronic paper display.
5. The optical property modifying device of claim 1, wherein the electronic display includes one or more light emitting diodes.
6. A method for determining a result of a diagnostic test, the method comprising: receiving one or more images of an optical property modifying device configured to perform a diagnostic test, the one or more images captured by an optical sensor of a mobile device, the optical property modifying device comprising: a plurality of reaction chambers configured to perform a reaction
between a biological sample and at least an optical property modifying reagent; and an electronic display configured to present an indication of a result of the diagnostic test based on the reaction between the biological sample and at least the optical property modifying reagent; determining that an image of the one or more images depicts the electronic display of the optical property modifying device, the electronic display displaying the indication of the result of the diagnostic test; determining the result of the diagnostic test based at least in part on the indication displayed by the electronic display in the image; and providing the determined result of the diagnostic test to the mobile device for
presentation by the mobile device.
7. The method of claim 6, wherein the indication of the result of the diagnostic test comprises the result of the diagnostic test, presented as one or more of human-readable symbols and machine-readable symbols.
8. The method of any one of claims 6-7, wherein determining the result of the diagnostic test based at least in part on the indication displayed by the electronic display in the image comprises: performing at least one of optical pattern matching, optical template matching, and optical character recognition for the indication displayed by the electronic display in the image, to determine the result of the diagnostic test.
9. The method of any one of claims 6-8, further comprising: receiving a geographical location of the mobile device from a location sensor of the mobile device; and storing the geographical location and a timestamp of the geographical location with the determined result of the diagnostic test.
10. The method of claim 9, wherein determining the result of the diagnostic test is further based on at least one of the geographical location and the timestamp.
11. The method of any one of claims 9-10, further comprising: providing the result of the diagnostic test, the geographical location, and the
timestamp to a third-party system including epidemiological data based on aggregated diagnostic test results from a plurality of mobile devices associated with a population of patients.
12. An optical property modifying device configured to perform a nucleic acid based diagnostic test, the optical property modifying device comprising: an interface configured to receive a biological sample; and a plurality of reaction chambers configured to perform a reaction between the
biological sample and at least an optical property modifying reagent, the plurality of reaction chambers located on a surface of the optical property modifying device such that at least one of a geometric pattern formed by a layout of the plurality of reaction chambers and a color of the reaction between the biological sample and at least the optical property modifying reagent within the plurality of reaction chambers is visible from a point exterior to the optical property modifying device.
13. A method for determining a result of a diagnostic test, the method comprising: receiving one or more images of an optical property modifying device configured to perform a diagnostic test, the one or more images captured by an optical sensor of a mobile device, the optical property modifying device comprising: a plurality of reaction chambers located on a surface of the optical property modifying device, the plurality of reaction chambers configured to perform a reaction between a biological sample and at least an optical property modifying reagent; determining that an image of the one or more images depicts at least one of a known geometric characteristic and a known color characteristic associated with the optical property modifying device; determining the result of the diagnostic test based at least in part on the at least one of the known geometric characteristic and the known color characteristic in the image; and providing the determined result of the diagnostic test to the mobile device for
presentation by the mobile device.
14. The method of claim 13, wherein the known geometric characteristic comprises a geometric pattern formed by a layout of the plurality of reaction chambers of the optical property modifying device, and wherein the known color characteristic comprises a color of the reaction between the biological sample and at least the optical property modifying reagent within the plurality of reaction chambers.
15. The method of claim 14, further comprising: identifying a location of each reaction chamber of the plurality of reaction chambers in the image, based on the geometric pattern formed by a layout of the plurality of reaction chambers depicted in the image; determining the color of the reaction between the biological sample and at least the optical property modifying reagent at the identified location of each reaction chamber of the plurality of reaction chambers in the image; and determining the result of the diagnostic test based at least in part on the determined color of the image at the identified location of each reaction chamber in the image.
16. The method of any one of claims 13-15, further comprising: receiving a geographical location of the mobile device from a location sensor of the mobile device; and storing the geographical location and a timestamp of the geographical location with the determined result of the diagnostic test.
17. The method of claim 16, wherein determining the result of the diagnostic test is further based on at least one of the geographical location and the timestamp.
18. The method of any one of claims 16-17, further comprising: providing the result of the diagnostic test, the geographical location, and the
timestamp to a third-party system including epidemiological data based on aggregated diagnostic test results from a plurality of mobile devices associated with a population of patients.
19. A computer program product comprising a non-transitory computer readable storage medium having instructions encoded thereon that, when executed by a processor, cause the processor to: receive one or more images of an optical property modifying device configured to perform a diagnostic test, the one or more images captured by an optical sensor of a mobile device, the optical property modifying device comprising: a plurality of reaction chambers configured to perform a reaction
between a biological sample and at least an optical property modifying reagent; and an electronic display configured to present an indication of a result of the diagnostic test based on the reaction between the biological sample and at least the optical property modifying reagent; determine that an image of the one or more images depicts the electronic display of the optical property modifying device, the electronic display displaying the indication of the result of the diagnostic test; determine the result of the diagnostic test based at least in part on the indication
displayed by the electronic display in the image; and provide the determined result of the diagnostic test to the mobile device for
presentation by the mobile device.
20. The computer program product of claim 19, wherein the indication of the result of the diagnostic test comprises the result of the diagnostic test, presented as one or more of human-readable symbols and machine-readable symbols.
21. The computer program product of any one of claims 19-20, wherein instructions to determine the result of the diagnostic test based at least in part on the indication displayed by the electronic display in the image further comprise instructions to: perform at least one of optical pattern matching, optical template matching, and
optical character recognition for the indication displayed by the electronic display in the image, to determine the result of the diagnostic test.
22. The computer program product of any one of claims 19-21, the non-transitory computer readable storage medium having further instructions that when executed by the processor cause the processor to: receive a geographical location of the mobile device from a location sensor of the mobile device; and store the geographical location and a timestamp of the geographical location with the determined result of the diagnostic test.
23. The computer program product of claim 22, wherein the instructions to determine the result of the diagnostic test are further based on at least one of the geographical location and the timestamp.
24. The computer program product of any one of claims 22-23, the non-transitory computer readable storage medium having further instructions that when executed by the processor cause the processor to: provide the result of the diagnostic test, the geographical location, and the timestamp to a third-party system including epidemiological data based on aggregated diagnostic test results from a plurality of mobile devices associated with a population of patients.
25. A computer program product comprising a non-transitory computer readable storage medium having instructions encoded thereon that, when executed by a processor, cause the processor to: receive one or more images of an optical property modifying device configured to perform a diagnostic test, the one or more images captured by an optical sensor of a mobile device, the optical property modifying device comprising: a plurality of reaction chambers located on a surface of the optical property modifying device, the plurality of reaction chambers configured to perform a reaction between a biological sample and at least an optical property modifying reagent; determine that an image of the one or more images depicts at least one of a known geometric characteristic and a known color characteristic associated with the optical property modifying device; determine the result of the diagnostic test based at least in part on the at least one of the known geometric characteristic and the known color characteristic in the image; and provide the determined result of the diagnostic test to the mobile device for
presentation by the mobile device.
26. The computer program product of claim 25, wherein the known geometric
characteristic comprises a geometric pattern formed by a layout of the plurality of reaction chambers of the optical property modifying device, and wherein the known color characteristic comprises a color of the reaction between the biological sample and at least the optical property modifying reagent within the plurality of reaction chambers.
27. The computer program product of claim 26, the non-transitory computer readable storage medium having further instructions that when executed by the processor cause the processor to: identify a location of each reaction chamber of the plurality of reaction chambers in the image, based on the geometric pattern formed by a layout of the plurality of reaction chambers depicted in the image; determine the color of the reaction between the biological sample and at least the optical property modifying reagent at the identified location of each reaction chamber of the plurality of reaction chambers in the image; and determine the result of the diagnostic test based at least in part on the determined color of the image at the identified location of each reaction chamber in the image.
28. The computer program product of any one of claims 25-27, the non-transitory computer readable storage medium having further instructions that when executed by the processor cause the processor to: receive a geographical location of the mobile device from a location sensor of the mobile device; and store the geographical location and a timestamp of the geographical location with the determined result of the diagnostic test.
29. The computer program product of claim 28, wherein the instructions to determine the result of the diagnostic test are further based on at least one of the geographical location and the timestamp.
30. The computer program product of any one of claims 28-29, the non-transitory computer readable storage medium having further instructions that when executed by the processor cause the processor to: provide the result of the diagnostic test, the geographical location, and the timestamp to a third-party system including epidemiological data based on aggregated diagnostic test results from a plurality of mobile devices associated with a population of patients.
PCT/US2019/055365 2018-10-09 2019-10-09 Consumer-based disease diagnostics WO2020076928A1 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
CN201980066514.8A CN112823276A (en) 2018-10-09 2019-10-09 Consumer-based disease diagnosis
JP2021519556A JP2022504506A (en) 2018-10-09 2019-10-09 How to Diagnose Consumer-Based Diseases
EP19872211.8A EP3864393A4 (en) 2018-10-09 2019-10-09 Consumer-based disease diagnostics
MX2021004018A MX2021004018A (en) 2018-10-09 2019-10-09 Consumer-based disease diagnostics.
CA3114215A CA3114215A1 (en) 2018-10-09 2019-10-09 Consumer-based disease diagnostics
DO2021000058A DOP2021000058A (en) 2018-10-09 2021-04-07 CONSUMER-BASED DIAGNOSIS OF DISEASES

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US16/155,829 US11080848B2 (en) 2017-04-06 2018-10-09 Image-based disease diagnostics using a mobile device
US16/155,829 2018-10-09

Publications (2)

Publication Number Publication Date
WO2020076928A1 true WO2020076928A1 (en) 2020-04-16
WO2020076928A8 WO2020076928A8 (en) 2020-11-12

Family

ID=70165175

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2019/055365 WO2020076928A1 (en) 2018-10-09 2019-10-09 Consumer-based disease diagnostics

Country Status (7)

Country Link
EP (1) EP3864393A4 (en)
JP (1) JP2022504506A (en)
CN (1) CN112823276A (en)
CA (1) CA3114215A1 (en)
DO (1) DOP2021000058A (en)
MX (1) MX2021004018A (en)
WO (1) WO2020076928A1 (en)

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6699713B2 (en) * 2000-01-04 2004-03-02 The Regents Of The University Of California Polymerase chain reaction system
RU2578023C2 (en) * 2009-01-13 2016-03-20 Эф-Ай-Оу Корпорейшн Portable diagnostic unit and method for using it with electronic device and diagnostic cartridge in instant diagnostic tests
KR101796906B1 (en) * 2009-03-24 2017-11-10 유니버시티 오브 시카고 Method for carrying out a reaction
WO2011088214A2 (en) * 2010-01-13 2011-07-21 Seventh Sense Biosystems, Inc. Rapid delivery and/or withdrawal of fluids
US20150247190A1 (en) * 2012-10-05 2015-09-03 California Institute Of Technology Methods and systems for microfluidics imaging and analysis
AU2014228937A1 (en) * 2013-03-15 2015-10-15 Nanobiosym, Inc. Systems and methods for mobile device analysis of nucleic acids and proteins
WO2015038717A1 (en) * 2013-09-11 2015-03-19 Wellumina Health, Inc. System for diagnostic testing of liquid samples

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6198107B1 (en) * 1997-03-07 2001-03-06 Clare Chemical Research, Inc. Fluorometric detection using visible light
US20060078929A1 (en) * 2003-04-02 2006-04-13 Clondiag Chip Technologies Gmbh Device for the amplification and detection of nucleic acids
US20080149840A1 (en) * 2006-03-24 2008-06-26 Kalyan Handique Fluorescence Detector for Microfluidic Diagnostic System
US20080204380A1 (en) * 2007-02-23 2008-08-28 Shin Hye-Jin Organic light emitting diode display and driving method thereof
US20130003162A1 (en) * 2011-06-29 2013-01-03 Napoleon Leoni Electronic paper with porous standoff layer
US20160275149A1 (en) * 2013-06-28 2016-09-22 Life Technologies Corporation Methods and Systems for Visualizing Data Quality
US20180293350A1 (en) * 2017-04-06 2018-10-11 Diassess Inc. Image-based disease diagnostics using a mobile device
US20190050988A1 (en) * 2017-04-06 2019-02-14 Diassess Inc. Image-based disease diagnostics using a mobile device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3864393A4 *

Also Published As

Publication number Publication date
WO2020076928A8 (en) 2020-11-12
CN112823276A (en) 2021-05-18
MX2021004018A (en) 2021-06-23
DOP2021000058A (en) 2021-04-30
JP2022504506A (en) 2022-01-13
EP3864393A1 (en) 2021-08-18
EP3864393A4 (en) 2022-06-22
CA3114215A1 (en) 2020-04-16

Similar Documents

Publication Publication Date Title
US11954851B2 (en) Image-based disease diagnostics using a mobile device
US10146909B2 (en) Image-based disease diagnostics using a mobile device
US11250944B2 (en) Uniquely coded color boards for analyzing images
US11412931B2 (en) System and method for image processing of medical test results using generalized curve field transform
US20220165427A1 (en) System and method for handing diagnostic test results to telemedicine provider
US20210396750A1 (en) System and method for digital remote primary, secondary, and tertiary color calibration via smart device in analysis of medical test results
CN105572110A (en) A method of and an apparatus for measuring biometric information
US11915810B2 (en) System and method for transmitting prescription to pharmacy using self-diagnostic test and telemedicine
CA2856094C (en) A quality control sensor method, system and device for use with biological/environmental rapid diagnostic test devices
WO2018106415A1 (en) Automated camera-based optical assessment system and method
Park et al. The design and evaluation of a mobile system for rapid diagnostic test interpretation
Dell et al. Mobile tools for point-of-care diagnostics in the developing world
US20190096516A1 (en) System and method for medical escalation and intervention that is a direct result of a remote diagnostic test
EP3864393A1 (en) Consumer-based disease diagnostics
US20220270720A1 (en) Adjustment method for an analytical determination of an analyte in a body fluid
US20190027259A1 (en) System and method for remote mapping of gold conjugates
US20190122771A1 (en) System and method for real-time insurance quote in response to a self-diagnostic test
US20210192265A1 (en) Method and devices for assessing the suitability of a sample tube for use in a laboratory automation system
WO2022163032A1 (en) Antigen test terminal and online infection testing system using same
CN117079779A (en) Method for operating medical data management platform, device and storage medium
CN114121273A (en) Information processing apparatus, information processing method, and system
JP2022115777A (en) Antigen test terminal and online infection test system using the same
Ozkan et al. A Novel Automatic Rapid Diagnostic Test Reader Platform

Legal Events

Date Code Title Description

121   EP: The EPO has been informed by WIPO that EP was designated in this application
      Ref document number: 19872211
      Country of ref document: EP
      Kind code of ref document: A1

ENP   Entry into the national phase
      Ref document number: 3114215
      Country of ref document: CA

ENP   Entry into the national phase
      Ref document number: 2021519556
      Country of ref document: JP
      Kind code of ref document: A

NENP  Non-entry into the national phase
      Ref country code: DE

ENP   Entry into the national phase
      Ref document number: 2019872211
      Country of ref document: EP
      Effective date: 20210510