WO2015173801A1 - Method and system for automated visual analysis of a dipstick using standard user equipment - Google Patents
- Publication number
- WO2015173801A1 (application PCT/IL2015/050487)
- Authority
- WO
- WIPO (PCT)
Classifications
- G06T5/90 — Dynamic range modification of images or parts thereof
- G06T7/0012 — Biomedical image inspection
- G06T7/90 — Determination of colour characteristics
- G06V10/56 — Extraction of image or video features relating to colour
- G01N21/78 — Systems in which material is subjected to a chemical reaction, the progress or the result of the reaction being investigated by observing the effect on a chemical indicator producing a change of colour
- G01N21/8483 — Investigating reagent band
- G06T2207/10024 — Color image
- G06T2207/30004 — Biomedical image processing
- G06T2207/30072 — Microarray; Biochip, DNA array; Well plate
Abstract
A method and system for automated visual analysis of a dipstick using standard user equipment (UE) are disclosed herein. The method may include the following steps: capturing, using an arbitrary UE having specified image capturing and processing capabilities, an image of a dipstick having colored test reagents, and a calibration array having a plurality of colored reference elements which are tailored specifically to the test reagents; deriving, based on the captured image, illumination parameters associated with the dipstick and the calibration array; determining whether the illumination parameters are within predefined illumination boundary conditions sufficient for interpreting the test reagents, given the specified image capturing and processing capabilities of the UE; applying image enhancement operations to the captured image, based on predefined mapping between the derived illumination parameters and the required adjustments; and interpreting the colored test reagents, based on the colored reference elements, in the enhanced captured image.
Description
METHOD AND SYSTEM FOR AUTOMATED VISUAL ANALYSIS OF A DIPSTICK USING STANDARD USER EQUIPMENT
FIELD OF THE INVENTION

The present invention relates generally to systems and methods of automatically analyzing dipsticks, and in particular to such methods implementing image processing techniques tailored for standard user equipment.
BACKGROUND OF THE INVENTION
Prior to setting forth a short discussion of the related art, it may be helpful to set forth definitions of certain terms that will be used hereinafter.
The term "user equipment" (UE) refers herein to any device used directly by an end-user to communicate. It can be a hand-held telephone, a laptop computer equipped with a mobile broadband adapter, or any other device. In the context used herein UE refers specifically to an arbitrary platform which is equipped with image capturing, image processing, and wireless communication capabilities.
The term "testing dipstick" or simply "dipstick" refers herein to a testing measurement device usually made of paper or cardboard and is impregnated with reagents that indicate some feature of a liquid or a gas by changing color. In medicine, dipsticks can be used to test for a variety of liquids for the presence of a given substance, known as an analyte. For example, urine dipsticks are used to determine properties of a given sample and detect and measure the presence of a variety of substances that indicate a person' s state of health.
The term "specularity" refers herein to the visual appearance of specular reflection. In computer vision, it means the mirror like properties of the surface: A directional reflection of incoming light (illumination) as described by the law of reflection. A simplified modelling of that reflection is the specular component in the Phong reflection model.
Dipsticks are used by a variety of healthcare providers to assist in diagnostics, specifically, but not exclusively, for urinary analysis of patients. The core concept is a set of reagents which are designed to chemically react to substances in a liquid under test (e.g., urine) by changing their color within a predefined color range. The set of colored reagents can then be compared to a predefined color key which can be used, either manually (e.g., by an expert user) or
automatically (e.g., using a dedicated image processing computerized system) to yield qualitative and quantitative data relating to the substances in the liquid under test.
Currently, computer vision can be used to interpret the color reagent responses into quantitative and qualitative clinical data. This is carried out by dedicated hardware, which may include a pre-calibrated scanner operated under well-known and monitored illumination conditions, and a classifier that operates based on the calibrated images derived by the scanner.
The need to use dedicated hardware requires that patients carry out the dipstick test in clinics rather than in the convenience of their home or other place of choice. Such a visit to the lab also entails unnecessary contact with infections and diseases. A non-expert interpretation of the dipstick is also not recommended, for fear of wrong interpretation and misdiagnosis. It would therefore be advantageous to be able to produce such accurate clinical data at home, using image processing techniques, without the need for dedicated hardware or software.

SUMMARY OF THE INVENTION
Embodiments of the present invention overcome the aforementioned disadvantages of the prior art by enabling a non-expert user to carry out computerized, automatic interpretation of a dipstick, using a standard arbitrary platform at his or her location of choice.
According to one embodiment of the present invention, there is provided a method of visual analysis of a dipstick using user equipment having optical capturing and image processing capabilities. The method may include the following steps: capturing, through a user equipment (UE) having specified image capturing and processing capabilities, an image of a dipstick having one or more colored test reagents, and a calibration array having a plurality of colored reference elements, tailored specifically for the dipstick color reagents; deriving, based on the captured image, local illumination parameters associated with the dipstick and the calibration array; determining whether the illumination parameters are within predefined illumination boundary conditions which are sufficient for interpreting the one or more colored test reagents, given the specified image capturing and processing capabilities of the UE; applying one or more image enhancement operations on the captured image, based on predefined mappings between the derived illumination parameters and one or more required adjustments; and interpreting the one or more colored test reagents, based on the colored reference elements, in the enhanced captured image. Advantageously, by embodiments of the
present invention, the dipstick-specific calibrator can, in real time, determine whether the environment of choice crosses boundary conditions.
These additional and/or other aspects and/or advantages of the present invention are set forth in the detailed description which follows.

BRIEF DESCRIPTION OF THE DRAWINGS
For a better understanding of the invention and in order to show how it may be implemented, references are made, purely by way of example, to the accompanying drawings in which like numerals designate corresponding elements or sections. In the accompanying drawings:
Examples illustrative of embodiments of the invention are described below with reference to the figures attached hereto. In the figures, identical structures, elements or parts that appear in more than one figure are generally labeled with the same number in all the figures in which they appear. Dimensions of components and features shown in the figures are generally chosen for convenience and clarity of presentation and are not necessarily shown to scale.
Figure 1 is a high level schematic block diagram illustrating a system according to the present invention;
Figure 2 is a high level flowchart diagram illustrating an aspect of a method according to some embodiments of the present invention; and
Figures 3A and 3B are exemplary non-limiting calibration arrays illustrating an aspect of some embodiments of the present invention. The drawings together with the following detailed description make the embodiments of the invention apparent to those skilled in the art.
DETAILED DESCRIPTION OF THE INVENTION
With specific reference now to the drawings in detail, it is stressed that the particulars shown are for the purpose of example and solely for discussing the preferred embodiments of the present invention, and are presented in the cause of providing what is believed to be the most useful and readily understood description of the principles and conceptual aspects of the invention. In this regard, no attempt is made to show structural details of the invention in more detail than is necessary for a fundamental understanding of the invention. The description taken with the drawings makes apparent to those skilled in the art how the several forms of the invention may be embodied in practice. Before explaining the embodiments of the invention in detail, it is to be understood that the invention is not limited in its application to the details of construction and the arrangement of the components set forth in the following descriptions or illustrated in the drawings. The invention is applicable to other embodiments and is capable of being practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein is for the purpose of description and should not be regarded as limiting.
Figure 1 is a high level schematic block diagram illustrating a system 100 according to embodiments of the present invention. System 100 may include an arbitrary platform 10, such as user equipment (UE), having an image capturing device 110 with specified image capturing capabilities and a computer processor 120 with specified processing capabilities. Capturing device 110 may be configured to capture one or more images of a dipstick 20 having one or more colored test reagents 20-1 to 20-M, and a pre-generated calibrator 30 (referred to herein also as a calibration array) having a plurality of colored reference elements 30-1 to 30-N. Reference elements 30-1 to 30-N are tailored specifically for the dipstick color reagents, and are generated specifically for each type of dipstick based on its properties. It is understood that a single calibrator may be tailored for a plurality of dipsticks, as long as it is tailored to a group of dipsticks and not all possible dipsticks.
Computer processor 120 may be configured to derive, based on the captured image 130, local illumination parameters 140 associated with the dipstick and the calibration array. Computer processor 120 may further be configured to determine whether the illumination and other parameters, such as the spatial angle of the UE relative to the calibration array, are within predefined illumination boundary conditions 150 which are sufficient for interpreting the one
or more colored test reagents, given the specified image capturing and processing capabilities of arbitrary platform (or UE) 10.
In some embodiments of the present invention, computer processor 120 may further be configured to apply one or more image enhancement operations 160 on the captured image, wherein the image enhancement operation 160 is configured for rendering the color reagents in the captured image more distinguishable and less prone to artifacts, and to interpret the one or more colored test reagents, based on the colored calibration setup, in the enhanced captured image 170.
In an alternative embodiment, the aforementioned enhancement operation 160 may be carried out at a location remote from arbitrary platform 10, such as one or more servers on a cloud network 50, to which arbitrary platform 10 may be connected, e.g., by a Wi-Fi connection using communication circuitry 190. In case a wireless connection is not available, the data may be stored on UE 10 and transmitted later to cloud 50 once wireless connectivity is resumed.
According to some embodiments of the invention, in a case that the illumination parameters are not within the predefined illumination boundary conditions, a user of the arbitrary platform is provided with instructions 180 on how to improve the illumination parameters.
According to some embodiments of the invention, the one or more image enhancement operations may include detecting portions of specular reflections coming from the colored calibration shapes or the colored test reagents, and applying image processing algorithms that reduce the specular reflections.
According to some embodiments of the invention, the one or more image enhancement operations comprise determining, for each pixel in the captured image associated with one of the colored test reagents, a uniform color, based on a normalization factor calculated based on the derived illumination parameters and the specified image capturing and processing capabilities.
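For illustration only, one way such a normalization factor could be realized is to rescale each color channel so that a neutral reference element of known reflectance maps back to its nominal value, and then apply the same factor to the reagent pixels. The sketch below is a minimal Python/NumPy example under that assumption; the masks, the target_gray value, and the single multiplicative factor are hypothetical choices, not details taken from the patent.

```python
import numpy as np

def normalize_reagent_color(image, gray_patch_mask, reagent_mask, target_gray=0.8):
    """Rescale reagent pixels so a neutral reference patch maps to its known value.

    image           -- float RGB image with values in [0, 1]
    gray_patch_mask -- boolean mask of a neutral (gray/white) reference element
    reagent_mask    -- boolean mask of one reagent pad
    target_gray     -- assumed known reflectance of the neutral patch
    """
    # Per-channel average of the neutral patch under the current illumination.
    observed_gray = image[gray_patch_mask].mean(axis=0)          # shape (3,)
    # Normalization factor that brings the patch back to its nominal value.
    factor = target_gray / np.clip(observed_gray, 1e-6, None)    # shape (3,)
    # Apply the same factor to the reagent pixels and reduce them to one color.
    corrected = np.clip(image[reagent_mask] * factor, 0.0, 1.0)
    return corrected.mean(axis=0)                                 # uniform RGB
```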
According to some embodiments of the invention, in a case that the illumination parameters are not within the predefined illumination boundary conditions, the user is notified that interpretation of the dipstick by the arbitrary platform is not possible.
According to some embodiments of the invention, the one or more colored test reagents and the colored reference elements are located, based on a specified layout, in prearranged locations.
According to some embodiments of the invention, the instructions to the user indicate a specified movement pattern of the arbitrary platform vis-à-vis the dipstick and the calibration array.
According to some embodiments of the invention, the detecting of portions of specular reflections is carried out by comparing pixels associated with a same color reagent to a predefined threshold.
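A minimal sketch of such a threshold test is given below, assuming a boolean mask per reagent pad and a simple luma-based brightness measure; the 1.4 factor and the Rec. 601 luma weights are illustrative values, not values specified in the patent.

```python
import numpy as np

def reject_specular_pixels(image, reagent_mask, threshold=1.4):
    """Estimate a reagent color while excluding pixels flagged as specular.

    Pixels whose luminance exceeds `threshold` times the median luminance of the
    same reagent pad are treated as specular highlights and ignored.
    """
    pixels = image[reagent_mask].astype(np.float64)              # (K, 3)
    luminance = pixels @ np.array([0.299, 0.587, 0.114])         # simple luma
    keep = luminance <= threshold * np.median(luminance)
    return pixels[keep].mean(axis=0) if keep.any() else pixels.mean(axis=0)
```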
Figure 2 is a high level flowchart diagram illustrating an aspect of a method 200 according to some embodiments of the present invention. The method may include the following steps: capturing, using an arbitrary platform having specified image capturing and processing capabilities, an image of a dipstick having one or more colored test reagents, and a calibration array having a plurality of colored reference elements which are specifically tailored to the test reagents of the dipstick 210; deriving, based on the captured image, illumination parameters associated with the dipstick and the calibration array 220; determining whether the illumination parameters are within predefined illumination boundary conditions which are sufficient for interpreting the one or more colored test reagents, given the specified image capturing and processing capabilities of the arbitrary platform 230; applying one or more image enhancement operations on the captured image, based on a predefined mapping between the derived illumination parameters and one or more required adjustments 240; and interpreting the one or more colored test reagents, based on the colored reference elements, in the enhanced captured image 250. It is understood that, while method 200 may be implemented using the aforementioned architecture of system 100, other architectures may be used by those skilled in the art.
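To make the ordering of steps 210-250 concrete, the following sketch strings them together in Python, assuming the image has already been captured and the reagent and reference regions located. The data structures (reference_colors, the region masks), the single global gain used as a stand-in for the enhancement step, and the nearest-color interpretation are all illustrative assumptions rather than the patented method itself.

```python
import numpy as np

def analyze_dipstick(image, reference_colors, reagent_regions, gray_region,
                     min_gray=0.15, max_gray=0.95):
    """Illustrative end-to-end flow mirroring steps 220-250 of method 200.

    reference_colors -- dict: reagent name -> {analyte level: expected RGB}
    reagent_regions  -- dict: reagent name -> boolean pixel mask
    gray_region      -- boolean mask of a neutral reference element
    """
    # Step 220: derive a crude illumination parameter from the neutral patch.
    gray_level = image[gray_region].mean()

    # Step 230: reject the frame if illumination is outside boundary conditions.
    if not (min_gray < gray_level < max_gray):
        return None, "Illumination out of range - please improve the lighting"

    # Step 240: a single global gain as a stand-in for the enhancement operations.
    enhanced = np.clip(image * (0.5 / gray_level), 0.0, 1.0)

    # Step 250: interpret each reagent by its nearest reference color.
    results = {}
    for name, mask in reagent_regions.items():
        observed = enhanced[mask].mean(axis=0)
        results[name] = min(
            reference_colors[name],
            key=lambda level: np.linalg.norm(observed - reference_colors[name][level]))
    return results, "ok"
```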
Figures 3A and 3B are exemplary non-limiting embodiments of calibration arrays 30A and 30B exhibiting a plurality of reference elements used as reference values for feature vector normalization, and designated locations for dipsticks 20A and 20B. A calibration array may be pre-generated after a long process of learning the myriad illumination conditions that may be presented when capturing the image using the arbitrary platform (UE). The reference elements are carefully tailored per each type of dipstick for optimal performance. Additionally, the different capturing capabilities of many platforms (e.g., smart telephones, tablet PCs, and the like) are studied. All of the above is carefully used in embodiments of the present invention in order to produce dipstick-specific calibrators that have a very large dynamic range, in the sense that many illumination conditions are within the operation boundary of the capturing process that is sufficient for proper interpretation of the medical data on the dipstick. The reference elements (used as reference values for feature vector normalization) of calibrator 30A shown in Figure 3A have been carefully selected to have different shades of basic colors, several textures and positions relative to dipstick 20A, while those of calibrator 30B shown in Figure 3B have been carefully selected to have different shades of basic colors, several textures and positions relative to dipstick 20B.
According to some embodiments, calibrators 30A and 30B may be provided with a texture for matching or rectification and base colors for on-the-fly normalization. Additionally, the calibrator may apply a reverse effect of the response function of the capturing device of the arbitrary platform 10. In some embodiments, a representative response function is assumed. In others, different calibrators are used for groups of known arbitrary platforms.
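One simple reading of "applying a reverse effect of the response function" is to fit a single gamma from the calibrator's gray reference patches and invert it before color interpretation. The sketch below assumes the observed gray intensities and their known printed reflectances are available; a real camera response is more complex than one power law, so this is only an illustrative approximation.

```python
import numpy as np

def inverse_gamma_from_grays(observed_grays, true_reflectances):
    """Fit observed = true ** gamma from gray patches and return an inverse mapping."""
    observed = np.clip(np.asarray(observed_grays, dtype=float), 1e-4, 1.0)
    truth = np.clip(np.asarray(true_reflectances, dtype=float), 1e-4, 1.0)
    # Least-squares fit in log space: log(observed) = gamma * log(truth).
    gamma = float(np.sum(np.log(observed) * np.log(truth)) / np.sum(np.log(truth) ** 2))
    # Apply the reverse effect: estimated linear value = observed ** (1 / gamma).
    return lambda img: np.clip(img, 0.0, 1.0) ** (1.0 / gamma)
```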
According to other embodiments, the calibrator is provided with arbitrary geometrical elements for simplifying dipstick extraction (when dipstick 20 has rectangular reagents) and for exhibiting more gray levels for improved gamma correction.
According to other embodiments, the calibrator is provided with two reference elements per color, for better normalization and specularity identification.
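For illustration, having two copies of each reference color allows a simple consistency check: if the two patches, printed with identical ink, disagree by more than a small margin, the frame likely contains a specular highlight or strongly uneven illumination. The helper below is a hypothetical sketch; the threshold is not taken from the patent.

```python
import numpy as np

def reference_pair_consistent(image, mask_a, mask_b, max_delta=0.08):
    """Return True if two copies of the same reference color roughly agree."""
    mean_a = image[mask_a].mean(axis=0)   # average RGB of the first copy
    mean_b = image[mask_b].mean(axis=0)   # average RGB of the second copy
    return bool(np.all(np.abs(mean_a - mean_b) <= max_delta))
```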
According to other embodiments, calibrator 30 is provided with reference elements having black borders around them, for minimizing over-smoothing of certain colors by some camera models. Additionally, uniform gray arbitrary geometrical elements may be added for better normalization (again, due to over-smoothing). High-contrast elements may be added to enable fast blob-based calibrator rectification on the phone.
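The blob-based rectification mentioned here could, for example, amount to locating a few high-contrast markers in the photo and estimating a homography to the calibrator's canonical layout. The sketch below assumes the blob correspondences have already been established (e.g., with cv2.SimpleBlobDetector plus a known ordering); that correspondence step, and OpenCV itself, are implementation assumptions rather than details given in the patent.

```python
import cv2
import numpy as np

def rectify_calibrator(image, detected_blob_centers, layout_blob_centers, out_size):
    """Warp the captured image so the calibrator lies in a canonical frame.

    detected_blob_centers -- Nx2 pixel coordinates of high-contrast blobs in the photo
    layout_blob_centers   -- the same blobs' Nx2 coordinates in the printed layout
    out_size              -- (width, height) of the rectified output image
    """
    src = np.asarray(detected_blob_centers, dtype=np.float32)
    dst = np.asarray(layout_blob_centers, dtype=np.float32)
    # Robustly estimate the perspective transform from the blob correspondences.
    homography, _ = cv2.findHomography(src, dst, cv2.RANSAC)
    return cv2.warpPerspective(image, homography, out_size)
```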
As will be appreciated by one skilled in the art, the aforementioned process of generating the calibration may be the product of a trial and error process that can be implemented in various manners. It should be noted that the aforementioned guidelines may be used in order to generate further improvements for the calibration. Aspects of the present invention may be embodied as a system, method or an apparatus. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, microcode, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit," "module," "system," or "cloud."
The aforementioned flowchart and block diagrams illustrate the architecture, functionality, and operation of possible implementations of systems and methods according to various
embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In the above description, an embodiment is an example or implementation of the inventions. The various appearances of "one embodiment," "an embodiment" or "some embodiments" do not necessarily all refer to the same embodiments. Although various features of the invention may be described in the context of a single embodiment, the features may also be provided separately or in any suitable combination. Conversely, although the invention may be described herein in the context of separate embodiments for clarity, the invention may also be implemented in a single embodiment.
Reference in the specification to "some embodiments", "an embodiment", "one embodiment" or "other embodiments" means that a particular feature, structure, or characteristic described in connection with the embodiments is included in at least some embodiments, but not necessarily all embodiments, of the inventions.
It is to be understood that the phraseology and terminology employed herein is not to be construed as limiting and are for descriptive purpose only. The principles and uses of the teachings of the present invention may be better understood with reference to the accompanying description, figures and examples.
It is to be understood that the details set forth herein are not to be construed as a limitation on the application of the invention.
Furthermore, it is to be understood that the invention can be carried out or practiced in various ways and that the invention can be implemented in embodiments other than the ones outlined in the description above.
It is to be understood that the terms "including", "comprising", "consisting" and grammatical variants thereof do not preclude the addition of one or more components, features, steps, or integers or groups thereof and that the terms are to be construed as specifying components, features, steps or integers. If the specification or claims refer to "an additional" element, that does not preclude there being more than one of the additional element.
It is to be understood that where the claims or specification refer to "a" or "an" element, such reference is not to be construed as meaning that there is only one of that element.
It is to be understood that where the specification states that a component, feature, structure, or characteristic "may", "might", "can" or "could" be included, that particular component, feature, structure, or characteristic is not required to be included.
Where applicable, although state diagrams, flow diagrams or both may be used to describe embodiments, the invention is not limited to those diagrams or to the corresponding descriptions. For example, flow need not move through each illustrated box or state, or in exactly the same order as illustrated and described.
Methods of the present invention may be implemented by performing or completing manually, automatically, or a combination thereof, selected steps or tasks.
The term "method" may refer to manners, means, techniques and procedures for accomplishing a given task including, but not limited to, those manners, means, techniques and procedures either known to, or readily developed from known manners, means, techniques and procedures by practitioners of the art to which the invention belongs.
The descriptions, examples, methods and materials presented in the claims and the specification are not to be construed as limiting but rather as illustrative only.
Meanings of technical and scientific terms used herein are to be commonly understood as by one of ordinary skill in the art to which the invention belongs, unless otherwise defined.
The present invention may be implemented in the testing or practice with methods and materials equivalent or similar to those described herein.
While the invention has been described with respect to a limited number of embodiments, these should not be construed as limitations on the scope of the invention, but rather as
exemplifications of some of the preferred embodiments. Other possible variations, modifications, and applications are also within the scope of the invention. Accordingly, the scope of the invention should not be limited by what has thus far been described, but by the appended claims and their legal equivalents.
Claims
1. A method comprising:
capturing, using a standard user equipment (UE) having specified image capturing and processing capabilities, an image of a dipstick having one or more colored test reagents, and a calibration array having a plurality of colored reference elements which are specifically tailored to said colored test reagents;
deriving, based on the captured image, local illumination parameters associated with the dipstick and the calibration array;
determining whether the illumination parameters are within a set of predefined illumination boundary conditions which are sufficient for interpreting the one or more colored test reagents, given the specified image capturing and processing capabilities of the UE; applying one or more image enhancement operation to the captured image, to yield an enhanced image, based on predefined mapping between the derived illumination parameters and one or more required adjustments; and
interpreting the one or more colored test reagents, based on the colored reference elements, in the enhanced image.
2. The method according to claim 1, wherein in a case that the illumination parameters are not within the set of predefined illumination boundary conditions, instructing a user of the UE how to improve the illumination parameters.
3. The method according to claim 1, wherein the one or more image enhancement operations comprises detecting portions of specular reflections coming from the colored reference elements or the colored test reagents, and applying image processing algorithms that reduce the specular reflections.
4. The method according to claim 1, wherein the one or more image enhancement operation comprises normalizing the pixel colors of each reagent to yield a uniform color, based on a normalization factor calculated based on the derived illumination parameters and the specified image capturing and processing capabilities.
5. The method according to claim 1, wherein the one or more image enhancement operation comprises normalizing the pixel colors of the reagent and reference elements to achieve uniform illumination.
6. The method according to claim 1, wherein in a case that the illumination parameters are not within the predefined illumination boundary conditions, indicating to a user that a proper interpretation of the dipstick by the arbitrary platform is not possible.
7. The method according to claim 1, wherein the one or more colored test reagents and the colored reference elements are located, based on a specified layout, in prearranged locations.
8. The method according to claim 1, wherein the calibration array is generated based on data derived in a series of trial and error, in which a plurality of illumination conditions were tested against a plurality of UEs.
9. The method according to claim 1, wherein at least one of: shape, color, location, and texture of the reference elements on the calibration array are selected in an optimization process, configured to increase likelihood of a successful interpretation of the colored test reagents.
10. The method according to claim 2, wherein the instructions to the user indicate a specified movement pattern of the UE vis-à-vis the dipstick and the calibration array.
11. The method according to claim 3, wherein the detecting of portions of specular reflections is carried out by comparing pixels associated with a same color element or reagent to a predefined threshold.
12. A system comprising:
a user equipment (UE) having specified image capturing and processing capabilities, configured to capture an image of a dipstick having one or more colored test reagents, and a calibration array having a plurality of colored reference elements; and
a computer processor configured to:
derive, based on the captured image, illumination parameters associated with the dipstick and the calibration array;
determine whether the illumination parameters are within predefined illumination boundary conditions which are sufficient for interpreting the one or more colored test reagents, given the specified image capturing and processing capabilities of the arbitrary platform;
apply one or more image enhancement operation to the captured image, to yield an enhanced image, based on predefined mapping between the derived illumination parameters and one or more required adjustments; and
interpret the one or more colored test reagents, based on the colored reference elements, in the enhanced captured image.
13. The system according to claim 12, wherein in a case that the illumination parameters are not within the predefined illumination boundary conditions, instructing a user of the arbitrary platform how to improve the illumination parameters.
14. The system according to claim 12, wherein the one or more image enhancement operation comprises detecting portions of specular reflections coming from the colored reference elements or the colored test reagents, and applying image processing algorithms that reduce the specular reflections.
15. The system according to claim 12, wherein the one or more image enhancement operation comprises determining, for each pixel at the captured image associated with one of the colored test reagents, a uniform color, based on a normalization factor calculated based on the derived illumination parameters and the specified image capturing and processing capabilities.
16. The system according to claim 12, wherein the one or more image enhancement operation comprises normalizing the pixel colors of the reagent and reference elements to achieve uniform illumination.
17. The system according to claim 12, wherein in a case that the illumination parameters are not within the predefined illumination boundary conditions, indicating to a user that interpretation of the dipstick by the arbitrary platform is not possible.
18. The system according to claim 12, wherein the one or more colored test reagents and the colored reference elements are located, based on a specified layout, in prearranged locations.
19. The system according to claim 12, wherein the calibration array is generated based on data derived in a series of trial and error, in which a plurality of illumination conditions were tested.
20. The system according to claim 12, wherein at least one of: shape, color, location, and texture of the reference elements on the calibration array are selected in an optimization process, configured to increase likelihood of a successful interpretation of the colored test reagents.
21. The system according to claim 13, wherein the instructions to the user indicate a specified movement pattern of the arbitrary platform vis-à-vis the dipstick and the calibration array.
22. The system according to claim 14, wherein the detecting of portions of specular reflections is carried out by comparing pixels associated with a same color reagent to a predefined threshold.
23. A computer program product comprising:
a non-transitory computer readable storage medium having computer readable program embodied therewith, the computer readable program comprising:
computer readable program configured to instruct a user equipment (UE) having specified image capturing and processing capabilities, to capture an image of a dipstick having one or more colored test reagents, and a calibration array having a plurality of colored reference elements which are tailored specifically to said test reagents;
computer readable program configured to derive, based on the captured image, illumination parameters associated with the dipstick and the calibration array;
computer readable program configured to determine whether the illumination parameters are within predefined illumination boundary conditions which are sufficient for interpreting the one or more colored test reagents, given the specified image capturing and processing capabilities of the UE;
computer readable program configured to apply one or more image enhancement operation to the captured image, to yield an enhanced image, based on predefined mapping between the derived illumination parameters and one or more required adjustments; and
computer readable program configured to interpret the one or more colored test reagents, based on the colored reference elements, in the enhanced captured image.
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP15792976.1A EP3143378A4 (en) | 2014-05-12 | 2015-05-11 | Method and system for automated visual analysis of a dipstick using standard user equipment |
US15/050,710 US10068329B2 (en) | 2014-05-12 | 2016-02-23 | Method and system for automated visual analysis of a dipstick using standard user equipment |
US16/120,335 US10559081B2 (en) | 2014-05-12 | 2018-09-03 | Method and system for automated visual analysis of a dipstick using standard user equipment |
US16/724,986 US10991096B2 (en) | 2014-05-12 | 2019-12-23 | Utilizing personal communications devices for medical testing |
US16/725,011 US11087467B2 (en) | 2014-05-12 | 2019-12-23 | Systems and methods for urinalysis using a personal communications device |
US17/238,434 US20210241456A1 (en) | 2014-05-12 | 2021-04-23 | Utilizing personal communications devices for medical testing |
US17/369,375 US11727547B2 (en) | 2014-05-12 | 2021-07-07 | Using patient generated image data to update electronic medical records |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/274,817 | 2014-05-12 | ||
US14/274,817 US9972077B2 (en) | 2014-05-12 | 2014-05-12 | Method and system for automated visual analysis of a dipstick using standard user equipment |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/274,817 Continuation-In-Part US9972077B2 (en) | 2014-05-12 | 2014-05-12 | Method and system for automated visual analysis of a dipstick using standard user equipment |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/050,710 Continuation-In-Part US10068329B2 (en) | 2014-05-12 | 2016-02-23 | Method and system for automated visual analysis of a dipstick using standard user equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015173801A1 true WO2015173801A1 (en) | 2015-11-19 |
Family
ID=54368298
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IL2015/050487 WO2015173801A1 (en) | 2014-05-12 | 2015-05-11 | Method and system for automated visual analysis of a dipstick using standard user equipment |
Country Status (3)
Country | Link |
---|---|
US (1) | US9972077B2 (en) |
EP (1) | EP3143378A4 (en) |
WO (1) | WO2015173801A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2022518364A (en) * | 2019-01-02 | 2022-03-15 | ヘルシー.アイオー リミテッド | Use of image analysis to assess medical medical conditions |
Families Citing this family (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9179844B2 (en) | 2011-11-28 | 2015-11-10 | Aranz Healthcare Limited | Handheld skin measuring or monitoring device |
US11087467B2 (en) | 2014-05-12 | 2021-08-10 | Healthy.Io Ltd. | Systems and methods for urinalysis using a personal communications device |
US10991096B2 (en) | 2014-05-12 | 2021-04-27 | Healthy.Io Ltd. | Utilizing personal communications devices for medical testing |
WO2016025935A2 (en) * | 2014-08-15 | 2016-02-18 | Scanadu Incorporated | Precision luxmeter methods for digital cameras to quantify colors in uncontrolled lighting environments |
US10013527B2 (en) | 2016-05-02 | 2018-07-03 | Aranz Healthcare Limited | Automatically assessing an anatomical surface feature and securely managing information related to the same |
US11116407B2 (en) | 2016-11-17 | 2021-09-14 | Aranz Healthcare Limited | Anatomical surface assessment methods, devices and systems |
EP4183328A1 (en) | 2017-04-04 | 2023-05-24 | Aranz Healthcare Limited | Anatomical surface assessment methods, devices and systems |
US10146909B2 (en) * | 2017-04-06 | 2018-12-04 | Diassess Inc. | Image-based disease diagnostics using a mobile device |
AU2018345841A1 (en) | 2017-10-06 | 2020-05-21 | The Research Foundation For The State University For The State Of New York | Selective optical aqueous and non-aqueous detection of free sulfites |
PT3477286T (en) * | 2017-10-25 | 2023-04-24 | Hoffmann La Roche | Methods and devices for performing an analytical measurement |
US11112406B2 (en) * | 2018-06-15 | 2021-09-07 | Reliant Immune Diagnostics, Inc. | System and method for digital remote primary, secondary, and tertiary color calibration via smart device in analysis of medical test results |
WO2020028729A1 (en) | 2018-08-01 | 2020-02-06 | Mammoth Biosciences, Inc. | Programmable nuclease compositions and methods of use thereof |
US11681886B2 (en) | 2018-09-06 | 2023-06-20 | John P. Peeters | Genomic and environmental blockchain sensors |
EP3931313A2 (en) | 2019-01-04 | 2022-01-05 | Mammoth Biosciences, Inc. | Programmable nuclease improvements and compositions and methods for nucleic acid amplification and detection |
US12039726B2 (en) | 2019-05-20 | 2024-07-16 | Aranz Healthcare Limited | Automated or partially automated anatomical surface assessment methods, devices and systems |
US11192100B2 (en) | 2020-03-25 | 2021-12-07 | Vessel Health, Inc. | Multi-factor urine test system that adjusts for lighting and timing |
PT3954990T (en) * | 2020-08-11 | 2023-04-13 | Hoffmann La Roche | Test strip fixation device for optical measurements of an analyte |
EP4214669A1 (en) | 2020-09-17 | 2023-07-26 | Scanwell Health, Inc. | Diagnostic test kits and methods of analyzing the same |
USD970033S1 (en) | 2020-10-23 | 2022-11-15 | Becton, Dickinson And Company | Cartridge imaging background device |
CA3198824A1 (en) | 2020-10-23 | 2022-04-28 | Becton, Dickinson And Company | Systems and methods for imaging and image-based analysis of test devices |
CN113037725B (en) * | 2021-02-26 | 2022-04-22 | 上海钧正网络科技有限公司 | Riding test method, server, test pile and readable storage medium |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4523852A (en) * | 1983-06-09 | 1985-06-18 | Miles Laboratories, Inc. | Color comparison reference standard and method for using same |
US20070024657A1 (en) * | 2003-10-13 | 2007-02-01 | Zhang Nengsheng A | Method and apparatus for calibrating colour print engines |
US8506901B2 (en) | 2010-11-03 | 2013-08-13 | Teco Diagnostics | All-in-one specimen cup with optically readable results |
JP2015509582A (en) | 2012-02-03 | 2015-03-30 | ユニバーシティ・オブ・シンシナティ | Methods, systems, and apparatus for analyzing colorimetric assays |
US9063091B2 (en) * | 2012-04-06 | 2015-06-23 | Ixensor Inc. | Test strips and method for reading test strips |
US9241663B2 (en) * | 2012-09-05 | 2016-01-26 | Jana Care Inc. | Portable medical diagnostic systems and methods using a mobile device |
EP2731051A1 (en) * | 2012-11-07 | 2014-05-14 | bioMérieux | Bio-imaging method |
2014
- 2014-05-12 US US14/274,817 patent/US9972077B2/en active Active
2015
- 2015-05-11 WO PCT/IL2015/050487 patent/WO2015173801A1/en active Application Filing
- 2015-05-11 EP EP15792976.1A patent/EP3143378A4/en not_active Ceased
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7652268B2 (en) * | 2006-01-31 | 2010-01-26 | Jp Laboratories, Inc | General purpose, high accuracy dosimeter reader |
US20120063652A1 (en) * | 2010-09-15 | 2012-03-15 | Teco Diagnostics | Method and apparatus for performing color-based reaction testing of biological materials |
WO2012131386A1 (en) * | 2011-03-31 | 2012-10-04 | Albagaia Limited | Testing apparatus |
WO2013077802A1 (en) * | 2011-11-23 | 2013-05-30 | Calmark Sweden Ab | Testing system arrangement and method for testing |
WO2014025415A2 (en) * | 2012-08-08 | 2014-02-13 | Scanadu Incorporated | Method and apparatus for performing and quantifying color changes induced by specific concentrations of biological analytes in an automatically calibrated environment |
Non-Patent Citations (1)
Title |
---|
See also references of EP3143378A4 * |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2022518364A (en) * | 2019-01-02 | 2022-03-15 | Healthy.Io Ltd. | Use of image analysis to assess medical conditions |
Also Published As
Publication number | Publication date |
---|---|
US20150325006A1 (en) | 2015-11-12 |
EP3143378A1 (en) | 2017-03-22 |
EP3143378A4 (en) | 2017-11-08 |
US9972077B2 (en) | 2018-05-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9972077B2 (en) | Method and system for automated visual analysis of a dipstick using standard user equipment | |
US10559081B2 (en) | Method and system for automated visual analysis of a dipstick using standard user equipment | |
CN106546581B (en) | Test paper detection card intelligent detection system and test paper detection card intelligent analysis method | |
JP7278276B2 (en) | Methods and apparatus for making analytical measurements based on chromogenic reactions | |
KR101624583B1 (en) | Urine Examination Method and Related Devices | |
García et al. | Mobile phone platform as portable chemical analyzer | |
CN104969068B (en) | For the method and apparatus changed by the color that the biological analyte of certain concentration induces to be executed and quantified in automatic calibration environment | |
JP2021508135A (en) | Analysis of captured images to determine inspection conclusions | |
Šafranko et al. | Designing ColorX, image processing software for colorimetric determination of concentration, to facilitate students’ investigation of analytical chemistry concepts using digital imaging technology | |
Nixon et al. | Accurate device-independent colorimetric measurements using smartphones | |
WO2013138356A2 (en) | System and method for robust estimation of color dependent measurements | |
JP6356141B2 (en) | Medical device or system for measuring hemoglobin levels during an accident using a camera-projector system | |
US10088411B2 (en) | Method and device for performing colorimetric analysis of a test fluid to evaluate physiological parameters | |
US20230146924A1 (en) | Neural network analysis of lfa test strips | |
JP2022506336A (en) | Methods and devices for performing analytical measurements | |
US8983181B2 (en) | Method and system for determining the color of an object in a photo | |
CN111387932B (en) | Vision detection method, device and equipment | |
CN108204979A (en) | For the method and apparatus of light source calibration in test paper detection device | |
CN115170629A (en) | Wound information acquisition method, device, equipment and storage medium | |
CN114667452A (en) | Method for determining the concentration of an analyte in a body fluid | |
KR20130086690A (en) | System for performing colorimetric analysis when running in a portable device and method of colorimetric analysis using the same | |
KR20140135921A | Urine strip for urine examination | |
Kibria et al. | Smartphone-based point-of-care urinalysis assessment | |
JP2016139331A (en) | Environment inspection determination support system | |
US11974732B2 (en) | System and method for urine analysis and personal health monitoring |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15792976; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| REEP | Request for entry into the european phase | Ref document number: 2015792976; Country of ref document: EP |
| WWE | Wipo information: entry into national phase | Ref document number: 2015792976; Country of ref document: EP |