US20190021649A1 - Device for non-invasive detection of skin problems associated with diabetes mellitus - Google Patents

Device for non-invasive detection of skin problems associated with diabetes mellitus

Info

Publication number
US20190021649A1
US20190021649A1 (application US16/044,248)
Authority
US
United States
Prior art keywords
foot
diagnostic apparatus
medical diagnostic
horizontal surface
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/044,248
Inventor
Mike Van Snellenberg
Anne Weiler
Luke Feaster
Ben Spencer
Jahyen Chung
Sara Hansen-Lund
Soma Mandel
Josh Bishop
Gavin Ray
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Caravan Health Inc
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US16/044,248
Publication of US20190021649A1
Assigned to Wellpepper, Inc. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: Mandal, Soma; Bishop, Josh; Spencer, Ben; Chung, Jahyen; Feaster, Luke; Ray, Gavin; Hansen-Lund, Sara; Van Snellenberg, Mike; Weiler, Anne
Assigned to CARAVAN HEALTH, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: 2020 AWMVS, INC.
Assigned to 2020 AWMVS, INC. CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: Wellpepper, Inc.
Priority to US17/752,755 (published as US20220280100A1)
Status: Abandoned (current)

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0033 Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room
    • A61B5/0035 Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room adapted for acquisition of images from more than one imaging mode, e.g. combining MRI and optical tomography
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0048 Detecting, measuring or recording by applying mechanical forces or stimuli
    • A61B5/0053 Detecting, measuring or recording by applying mechanical forces or stimuli by applying pressure, e.g. compression, indentation, palpation, grasping, gauging
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/01 Measuring temperature of body parts; Diagnostic temperature sensing, e.g. for malignant or inflamed tissue
    • A61B5/015 By temperature mapping of body part
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/40 Detecting, measuring or recording for evaluating the nervous system
    • A61B5/4029 Detecting, measuring or recording for evaluating the nervous system for evaluating the peripheral nervous systems
    • A61B5/4041 Evaluating nerves condition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/44 Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
    • A61B5/441 Skin evaluation, e.g. for skin disorder diagnosis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/44 Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
    • A61B5/441 Skin evaluation, e.g. for skin disorder diagnosis
    • A61B5/445 Evaluating skin irritation or skin trauma, e.g. rash, eczema, wound, bed sore
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/48 Other medical applications
    • A61B5/4824 Touch or pain perception evaluation
    • A61B5/4827 Touch or pain perception evaluation assessing touch sensitivity, e.g. for evaluation of pain threshold
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6813 Specially adapted to be attached to a specific body part
    • A61B5/6829 Foot or ankle
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/70 Means for positioning the patient in relation to the detecting, measuring or recording means
    • A61B5/706 Indicia not located on the patient, e.g. floor marking
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235 Details of waveform analysis
    • A61B5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/74 Details of notification to user or communication with user or patient; user input means
    • A61B5/7405 Details of notification to user or communication with user or patient; user input means using sound
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/74 Details of notification to user or communication with user or patient; user input means
    • A61B5/7405 Details of notification to user or communication with user or patient; user input means using sound
    • A61B5/741 Details of notification to user or communication with user or patient; user input means using sound using synthesised speech
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/74 Details of notification to user or communication with user or patient; user input means
    • A61B5/742 Details of notification to user or communication with user or patient; user input means using visual displays
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01G WEIGHING
    • G01G19/00 Weighing apparatus or methods adapted for special purposes not provided for in the preceding groups
    • G01G19/44 Weighing apparatus or methods adapted for special purposes not provided for in the preceding groups for weighing persons
    • G01G19/50 Weighing apparatus or methods adapted for special purposes not provided for in the preceding groups for weighing persons having additional measuring devices, e.g. for height
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 ICT specially adapted for the handling or processing of medical images
    • G16H30/20 ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 ICT specially adapted for the handling or processing of medical images
    • G16H30/40 ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/63 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/30 Transforming light or analogous information into electric information
    • H04N5/33 Transforming infrared radiation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00 Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02 Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0252 Load cells
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20081 Training; Learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20084 Artificial neural networks [ANN]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30088 Skin; Dermal
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30168 Image quality inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/03 Recognition of patterns in medical or anatomical images

Definitions

  • a medical diagnostic apparatus includes a controller; a user interface configured for user interaction with the medical diagnostic apparatus; a platform having a horizontal surface; a weight measurement system coupled to the platform; and at least one visible light image sensor positioned below the horizontal surface that is capable of producing a diagnostic visible light image of a bottom portion of a foot or feet positioned on the horizontal surface. At least a portion of the horizontal surface is transparent to visible light.
  • At least a portion of the horizontal surface is transparent to infrared light suitable for thermal imaging, and the apparatus further includes at least one infrared image sensor positioned below the horizontal surface that is capable of producing a thermal image of the first target area.
  • the apparatus further includes at least one visible light image sensor positioned above the platform capable of producing a diagnostic visible light image of a top portion of the user's foot or feet.
  • the apparatus further includes at least one infrared image sensor positioned above the platform capable of producing a thermal image of a top portion of the user's foot or feet.
  • In another aspect, a medical diagnostic apparatus includes a platform having a horizontal surface; at least one upper image sensor positioned above the horizontal surface and configured to capture one or more images of a top portion of a foot or feet; at least one lower image sensor positioned below the horizontal surface and configured to capture one or more images of a bottom portion of the foot or feet; a controller; and a user interface configured for user interaction with the medical diagnostic apparatus. At least a portion of the horizontal surface is transparent to visible light. In an embodiment, two cameras are positioned below the horizontal surface.
  • At least a portion of the horizontal surface is transparent to infrared light suitable for thermal imaging
  • the medical diagnostic apparatus further includes at least one infrared image sensor positioned below the horizontal surface that is capable of producing a thermal image of the bottom portion of the foot or feet.
  • a medical diagnostic apparatus may include a device for testing for peripheral neuropathy.
  • the device may include a controller; a foot platform having at least one opening; at least one vertically oriented monofilament positioned to pass through the at least one opening of the foot platform; and at least one actuator positioned below the at least one opening of the foot platform.
  • the at least one actuator is mechanically coupled to the at least one vertically oriented monofilament and is configured to move the at least one vertically oriented monofilament to pass through the at least one opening of the foot platform.
  • a medical diagnostic apparatus may include a visual or tactile guide for foot positioning, one or more illumination sources, or a combination of these or other additional features.
  • a computer-implemented method for automated diagnosis of a diabetic foot condition.
  • the method includes capturing, by one or more image capture devices of a medical diagnostic apparatus, optical image data of a target area of a foot; collecting, by a touch sensitivity testing device (e.g., a servo-actuated monofilament probe) of the medical diagnostic apparatus, physical touch sensitivity data for the target area of the foot; transmitting, by the medical diagnostic apparatus, the optical image data, the physical touch sensitivity data, or a combination of such data to an analysis engine; and outputting, by the analysis engine, one or more indications of a diabetic foot condition.
  • the method may further include checking the optical image data for one or more of image quality, lighting conditions, or body positioning.
  • the method may further include, prior to collecting the physical touch sensitivity data, confirming the position of the foot based at least in part on the optical image data.
  • the analysis engine may include an image classifier.
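  • For concreteness, a minimal sketch of this computer-implemented method is shown below. All names (cameras, probe, analysis_engine, and their methods) are hypothetical placeholders, not an API defined by this disclosure.

```python
# Hypothetical sketch of the claimed method: capture images, collect touch
# sensitivity data, send both to an analysis engine, and report indications.
def run_diagnostic_session(cameras, probe, analysis_engine):
    # Capture optical image data of the target area of the foot.
    images = [cam.capture() for cam in cameras]

    # Collect physical touch sensitivity data (e.g., from a servo-actuated
    # monofilament probe); foot position would be confirmed beforehand.
    touch_data = probe.run_exam()

    # Transmit the data to the analysis engine (local or remote).
    result = analysis_engine.analyze(images=images, touch=touch_data)

    # Output one or more indications of a diabetic foot condition.
    return result.indications
```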
  • a user interface may include an interactive voice interface, a display, a visual indicator, a combination of such user interface features, or other user interface features.
  • FIG. 1 is a block diagram that depicts electronic systems that may be used in disclosed embodiments;
  • FIG. 2 is a flow chart that describes an illustrative weight-activated data collection and analysis workflow that may be used in disclosed embodiments;
  • FIG. 3 is a flowchart of an illustrative monofilament exam workflow that may be used in disclosed embodiments;
  • FIG. 4 is a perspective view of a disclosed embodiment integrated in a standing scale;
  • FIG. 5 is a schematic diagram of a monofilament assembly that may be used in described embodiments for peripheral neuropathy testing; and
  • FIG. 6 is a block diagram that illustrates aspects of an illustrative computing device appropriate for use in accordance with embodiments of the present disclosure.
  • the present disclosure describes embodiments that facilitate non-invasive detection of skin problems in the feet or other areas, including neuropathy, vascular disease, and ulcers, such as those associated with diabetes mellitus.
  • Diabetic foot infection is the most common complication of diabetes mellitus leading to hospitalization and the most frequent cause of non-traumatic lower extremity amputation. Diabetic foot ulcers and other foot infections develop due to reduced feeling in the foot from peripheral neuropathy, the most common form of diabetic neuropathy. People who suffer from such conditions may not be aware of abrasions or hotspots that may develop into ulcers, of wounds that are not healing, or of foreign objects that have become lodged in the foot.
  • disclosed embodiments use a combination of sensors (e.g., optical and physical sensors) combined with related processing techniques (e.g., machine-learned classifiers) to help detect skin conditions (e.g., possible complications from diabetes).
  • the user or patient may be guided through operation of the device using an interactive voice system.
  • One possible sensing mechanism that may be used in disclosed embodiments is a group of low-cost image sensors (e.g., in commercially available cameras), capturing visible spectrum or near-IR images.
  • cameras are arranged in pairs (e.g., with one above and one below the subject body part being measured).
  • Many-camera setups can also be used to enhance the camera coverage and increase the accuracy of the prediction algorithms.
  • Single-camera applications are also possible, though in practice the user may need to reposition their body to get full camera coverage of the body part in question.
  • the disclosed image sensors and cameras may be used alone, or in combination with other sensing systems.
  • Another sensing mechanism that may be used in disclosed embodiments is a thermographic infrared image sensor (e.g., in commercially available infrared cameras) for detecting areas of varying skin temperature, which can help identify regions that are either cooler than surrounding areas (which may indicate conditions such as compromised blood flow) or hotter than surrounding areas (which may indicate conditions such as active infections).
  • The infrared image sensors (e.g., in one or more thermal cameras) may be arranged in a similar fashion to the visible-spectrum image sensors (e.g., with one infrared camera above and one below the subject body part being measured).
  • a third sensing mechanism that may be used in disclosed embodiments is a physical sensor.
  • the physical sensor uses one or more servo-actuated monofilaments to test for skin sensation loss.
  • a monofilament exam is a test used to identify cases of peripheral neuropathy in diabetic patients. This technique is similar in some respects to the Semmes-Weinstein monofilament exam used by physicians, although the administration of the test, features of the testing device, and collection and analysis of the data are different than traditional tests, as described in detail below.
  • the device attaches a series of monofilament fibers with a standard gauge (e.g., a 5.07 gauge fiber that produces 10 g of pressure) to micro-servo actuators placed in locations distributed around the device in order to contact test sites of the body part to be measured.
  • test site locations include the heel, foot arch, ball of the foot, behind the toes, and the big toe.
  • the test may be administered with an interactive user interface, such as an interactive voice system.
  • (In other embodiments, the user interface may be implemented as a graphical user interface (e.g., with a touch screen) or in some other way.)
  • the user interface may, for example, provide instructions to the patient on how to start the test or prompt the patient (e.g., using a recorded or synthesized voice output) to say or otherwise indicate (e.g., with a button press, a gesture or other user input) when they feel a sensation from the monofilament. Responses from the patient can then be processed, recorded, and acted upon by the system as described herein.
  • the system may include circuitry embedded in the foot platform to test the user's ability to feel heat or pain by means of integrated heating strips or low-current electrical discharges.
  • a heat-testing device may use an optically transparent but electrically conductive material such as indium tin oxide to conduct current to the test site where it is run through a higher-resistance portion of the coating in order to generate resistive heating, similar to the way aircraft window heaters work.
  • the user may then be prompted (e.g., using a recorded or synthesized voice output) to say or otherwise indicate (e.g., with a button press, a gesture or other user input) when they feel a sensation from the heating element.
  • the heat sensitivity testing method may include several different levels of heating to test for sensation to obtain a more accurate assessment of the level of neuropathy present.
  • An electrical discharge device may use transparent conductive material to route a circuit to several test sites located around the foot (or other body part being tested). A small gap is left for the user's body to complete the circuit and introduce electrical stimulation. This may be done with high-voltage, low-current electricity as is commonly used in other medical diagnostic equipment to test for pain responses. Again, the user is prompted to indicate when they feel the stimulus. The intensity of the discharge can be varied to obtain a more accurate assessment of the level of neuropathy present.
  • In some disclosed embodiments, the sensors and related structures and control circuitry are combined and integrated into a standing scale (e.g., a bathroom scale) form factor.
  • This embodiment has the added benefit of collecting the user's weight, an important metric for diabetic patients, since weight management is often a key part of a diabetes management regimen.
  • FIG. 1 is a block diagram that depicts electronic systems that may be used in disclosed embodiments.
  • A microcontroller 51 controls program execution and sensor integration; it is powered by a low-voltage power supply 61 and uses a durable storage device 62, such as a magnetic hard drive or solid-state flash storage, for program and data storage.
  • the device may be controlled by some other controller or computing device.
  • FIG. 1 also depicts an illustrative user interface in the form of an interactive voice system.
  • the interactive voice system provides a user experience which makes use of a speaker 11 to provide audio output and microphones with related speech recognition software or hardware to listen for user commands and responses to questions.
  • the interactive voice system may use a single microphone or an array of microphones 12 , which may be organized for beamforming for better background noise rejection.
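  • As an illustration of how a microphone array can improve background noise rejection, the following is a minimal delay-and-sum beamforming sketch in Python/NumPy. It shows the generic technique only; the disclosure does not specify an implementation, and the array geometry and steering direction are assumed inputs.

```python
import numpy as np

def delay_and_sum(signals, mic_positions, direction, fs, c=343.0):
    """Combine microphone-array signals steered toward one direction.

    signals: (n_mics, n_samples) array; mic_positions: (n_mics, 3) in
    meters; direction: unit vector toward the talker; fs: sample rate."""
    # Relative arrival-time advance of each mic for a plane wave from
    # `direction`: mics closer to the talker hear the wave earlier.
    delays = mic_positions @ direction / c
    # Drop leading samples so all channels line up in time.
    shifts = np.round((delays.max() - delays) * fs).astype(int)
    n = signals.shape[1] - shifts.max()
    aligned = np.stack([s[k:k + n] for s, k in zip(signals, shifts)])
    # Coherent speech adds constructively; diffuse noise averages out.
    return aligned.mean(axis=0)
```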
  • the illustrative user interface also uses a visual indicator, such as a multi-color cue light 13 , to indicate various interface states.
  • the cue light 13 may be used to indicate states such as microphone active, microphone muted, system busy (e.g., when capturing an image or processing a captured image), and when the system is speaking or providing other output.
  • the system may obtain data from one or more sensors, which may be physical or optical.
  • A collection of load cells 21, which may be arranged in groups of four, is used to measure the user's weight.
  • the system may self-tare to subtract the weight of other device equipment such as the footbed, electronics, and cameras.
  • the system may perform self-taring during device initialization, before the first user weigh-in.
  • Load cells may be connected to the microcontroller by way of an analog-to-digital amplifier/converter such as the AVIA Semiconductor HX711.
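  • A hedged sketch of this weight-measurement path follows. The read_hx711_counts callable stands in for whatever driver returns raw counts from the HX711 converter; it is not a real library call, and the calibration constants are assumptions.

```python
# Hypothetical sketch of load-cell weight measurement with self-taring.
def measure_weight_kg(read_hx711_counts, tare_counts, counts_per_kg,
                      samples=16):
    # Average several readings to reduce noise; each reading sums raw
    # counts across the (typically four) load cells.
    totals = [sum(read_hx711_counts()) for _ in range(samples)]
    avg_counts = sum(totals) / samples
    # Subtract the tare offset (footbed, electronics, cameras) captured
    # at device initialization, then scale by the calibration factor.
    return (avg_counts - tare_counts) / counts_per_kg
```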
  • The device also includes an array of monofilament probe assemblies 22 that are used to test the user's foot for peripheral neuropathy. These are described in more detail in connection with FIG. 5.
  • the device uses optical image sensors (e.g., in the form of cameras) to create images of the user's feet (or other body areas in some embodiments).
  • visible-light cameras 31 are used, along with other cameras that capture images at other wavelengths.
  • visible light cameras 31 may be commercially available cameras with visible-light image sensors such as those found in cellular phones.
  • Other cameras may use image sensors calibrated to capture visible light, near-infrared light, infrared light, or some combination thereof.
  • wide-angle lenses may be used in conjunction with image sensors to enable more compact chassis design, such as for cameras mounted below a foot platform.
  • Near-infrared cameras 32 may be used to capture extended spectrum information. Described embodiments may include one or more illumination sources 34 (e.g., LED lights) that provide more consistent image lighting for visible-light and/or near-infrared image sensors. Consistent image lighting can improve image classifier accuracy.
  • Described embodiments may also use thermal image sensors to capture heat data about the user's body. In some embodiments, thermal image sensors are included in thermal (far-infrared) cameras 33.
  • Thermal imaging applied to skin can indicate areas of concern, such as skin regions that are cooler than the surrounding skin, possibly indicating lower-than-normal blood supply, which could mark a likely area for the development of peripheral neuropathy.
  • a skin region that is higher temperature than the surrounding skin may indicate the development of an infection such as a diabetic foot ulcer.
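  • The hot/cold spot logic described above might be sketched as follows. The 2.2 degree C threshold is an assumption drawn from commonly cited diabetic-foot temperature-screening practice, not a value from this disclosure.

```python
import numpy as np
from scipy import ndimage

def find_thermal_anomalies(thermal_img, foot_mask, delta_c=2.2):
    """Flag skin regions much warmer or cooler than surrounding skin.

    thermal_img: 2-D array of temperatures (deg C); foot_mask: boolean
    array marking foot pixels; delta_c: illustrative threshold."""
    baseline = np.median(thermal_img[foot_mask])   # "surrounding skin"
    hot = foot_mask & (thermal_img > baseline + delta_c)   # possible infection
    cold = foot_mask & (thermal_img < baseline - delta_c)  # possible low perfusion
    # Label connected regions so each spot can be reported separately.
    hot_labels, n_hot = ndimage.label(hot)
    cold_labels, n_cold = ndimage.label(cold)
    return hot_labels, n_hot, cold_labels, n_cold
```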
  • the images are provided as input to image processing software, such as a neural network classifier pipeline 41 , which then performs a series of classification steps related to diagnosis or treatment.
  • Exemplary classification steps include detecting the presence, position, and orientation of feet (or other body parts); detecting the presence and location of skin abnormalities (e.g., ulcers, foreign objects, or abnormal temperatures); and classifying these abnormalities (e.g., as possible areas of peripheral neuropathy or infection).
  • good results can be obtained by using a deep convolutional neural network.
  • Many existing commercial and open-source frameworks can be used for this task. Basic principles of training classifiers are well known to those skilled in the art of machine learning, and these basic principles need not be described here.
  • Training classifiers involves collecting extensive datasets of feet (or other body parts), both with and without the skin abnormalities being searched for, and manually labeling this data with the correct classification labels for each step in the classification pipeline (e.g., presence, position, and orientation of feet; presence and location of skin abnormalities; types of abnormalities).
  • This data is then used to train the machine-learned classifiers and iteratively improve the classifier accuracy by obtaining new data, adjusting the steps in the classification pipeline, extracting new features to assist in classification, etc.
  • The overall classifier pipeline performance can be measured with the precision of the predictions (the percentage of positive predictions that are correct) and the recall of the predictions (the percentage of truly positive cases that receive positive predictions). These metrics can be balanced in order to obtain an acceptable tradeoff between the two.
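  • For concreteness, precision and recall can be computed from binary predictions as follows; this is standard arithmetic, not specific to the disclosed pipeline.

```python
def precision_recall(predictions, labels):
    """Compute precision and recall for binary (0/1) predictions."""
    tp = sum(p == 1 and y == 1 for p, y in zip(predictions, labels))
    fp = sum(p == 1 and y == 0 for p, y in zip(predictions, labels))
    fn = sum(p == 0 and y == 1 for p, y in zip(predictions, labels))
    precision = tp / (tp + fp) if tp + fp else 0.0  # correct positive predictions
    recall = tp / (tp + fn) if tp + fn else 0.0     # truly positive cases found
    return precision, recall

# Example: 3 predicted positives, 2 of them correct; 4 actual positives,
# 2 of them found. So precision = 2/3 and recall = 2/4.
print(precision_recall([1, 1, 1, 0, 0, 0], [1, 1, 0, 1, 1, 0]))
```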
  • the classification pipeline output may be multi-class; for example, it may identify diabetic foot ulcers as well as other foot conditions such as cuts, bruises, corns, warts, etc.
  • the classification pipeline output may alternately output a binary classification indicating whether a given skin issue requires further medical follow-up.
  • the binary classification approach may be useful in situations where an abnormality is detected to be present but, due to factors such as poor image quality or missing images, the abnormality cannot be accurately classified.
  • Some embodiments send data (e.g., patient weight data, raw image data, image classification data) to other devices for storage or further processing.
  • data may be transmitted through a wireless networking adapter 63 and then through a network 64 to arrive at a remote computer system, such as a patient data management service 65 .
  • This service may store weight data, raw images, classifier output, or other data; perform further image processing; test the data against predefined rules, such as positive classifier predictions or weight gains above some threshold; and send communications such as patient follow-up messages.
  • a positive classification reading for foot ulcers may trigger a computer system (e.g., the patient data management service 65 ) to send an alert to the patient or the patient's care team for follow-up, and send a report (e.g., including images) directly into the patient's electronic medical records.
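  • A hedged sketch of such rule-based follow-up is shown below. The field names, the threshold, and the notify and write_ehr callbacks are illustrative assumptions, not interfaces defined by the disclosure.

```python
# Assumed per-measurement weight-gain alert threshold (illustrative only).
WEIGHT_GAIN_ALERT_KG = 2.0

def apply_rules(session, history, notify, write_ehr):
    """Test one session's data against predefined rules and act on hits."""
    alerts = []
    if session["classifier"].get("foot_ulcer") == "positive":
        alerts.append("Possible foot ulcer detected; follow-up recommended.")
    if history and session["weight_kg"] - history[-1]["weight_kg"] > WEIGHT_GAIN_ALERT_KG:
        alerts.append("Weight gain above threshold since last measurement.")
    for message in alerts:
        notify(message)    # alert the patient or the patient's care team
        # File a report, including images, into the patient's records.
        write_ehr(session["patient_id"], message, session.get("images", []))
    return alerts
```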
  • FIG. 4 is a perspective view of a disclosed embodiment integrated in a standing scale.
  • the device includes a suitably sized foot platform 401 for the user to stand on during use.
  • the platform 401 may be constructed of a strong transparent material such as polycarbonate or tempered glass in order to provide the ability for upward-facing cameras 410 to image the bottoms of the user's feet. The transparent material may be selected based on the imaging to be performed.
  • For embodiments that use thermographic cameras to capture images through the foot platform, a suitable material that is transparent to long-infrared wavelengths may be used.
  • Built-in illumination 411, such as LED lights, may be used to provide consistent and sufficient lighting for the images.
  • the cameras 410 may include wide angle lenses, and may be arranged in an array. This design allows the platform 401 to be constructed with a low profile, which reduces the likelihood of injury due to tripping or falling when using the device.
  • In one illustrative configuration, the footbed measures 12 inches by 12 inches, and a pair of cameras with a 150° field of view obtains full coverage of the footbed at a range of approximately 3.2 inches. In this example, the optical distortion of these cameras is minimal enough not to require any special processing, and they give clear corner-to-corner resolution.
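  • A back-of-envelope check of these coverage figures uses the usual pinhole relation: the width covered on the platform equals twice the camera distance times the tangent of half the field of view, ignoring lens distortion.

```python
import math

fov_deg = 150.0   # field of view quoted above
h_in = 3.2        # camera-to-platform distance in inches

coverage_in = 2 * h_in * math.tan(math.radians(fov_deg / 2))
print(round(coverage_in, 1))   # ~23.9 inches across

# A single such camera spans well over 12 inches at 3.2 inches of range,
# so a pair of offset cameras can cover the 12 x 12 inch footbed with
# margin while keeping the platform profile low.
```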
  • the platform 401 is supported by four load cells, which are in turn supported by support legs 402 .
  • the load cells are used to measure the weight of the user.
  • other components on the device may be attached to the foot platform 401 .
  • the weight of the foot platform and other components attached to it may be tared by the device during an automatic taring process, which may be performed during device power-up, device restart, or at some other time.
  • imaging of the tops of the user's feet may be desirable.
  • image sensors may be included in an upper head assembly 404 attached to a support arm 403 .
  • The upper head assembly also may include elements of a user interface; locating these elements closer to the user's head may allow the user to more easily interact with the user interface (e.g., may make it easier to detect the user's voice in a user interface with voice control functionality).
  • the upper head assembly 404 includes a speaker 405 and a microphone 406 to support voice interactions with the user, as well as a visual indicator such as a multi-color cue light 407 to indicate interface states such as microphone active, microphone muted, system busy, system speaking (e.g., providing synthesized or recorded voice output), or error conditions.
  • Image sensors in the upper head assembly 404 may include cameras 408 such as visible-light, near-infrared, and thermographic cameras, along with built-in illumination 409 in order to provide sufficient and consistent lighting for the images.
  • some embodiments are equipped with one or more monofilament assemblies 412 to perform monofilament exams.
  • the device may include several assemblies placed at various locations around the device in order to test different sites on the user's feet.
  • a foot outline (not shown) or other visual or tactile guide may be provided on the foot platform 401 to assist users in positioning their feet correctly for imaging or monofilament testing.
  • FIG. 5 is a schematic diagram of a monofilament assembly that may be used in described embodiments for peripheral neuropathy testing.
  • the test is derived from the Semmes-Weinstein monofilament exam used by physicians.
  • An actuator (e.g., micro servo actuator 503) is activated to move the monofilament 504 through an opening in the foot platform 502 so that the monofilament 504 is in contact with the user's foot 501.
  • Various forms of actuation are contemplated, including a rotary servo with an arm that is connected to the monofilament, or a linear actuator. As shown in FIG. 5, the actuation occurs with sufficient force to cause the monofilament to bend or buckle below the foot platform 502.
  • the device can be designed to ensure that a consistent amount of pressure is applied during testing.
  • the monofilament is pre-calibrated to a standard amount of buckling force, e.g., 10 grams.
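  • An illustrative actuation routine for one probe, assuming a hobby micro servo driven from a Raspberry Pi GPIO pin, is sketched below. The pin number, duty cycles, and timing are assumptions; as noted above, the 10 g contact force comes from the filament's calibrated buckling force, not from servo position.

```python
import time
import RPi.GPIO as GPIO   # runs only on a Raspberry Pi

SERVO_PIN = 18     # assumed BCM pin
REST_DUTY = 5.0    # duty cycle holding the filament below the platform
PRESS_DUTY = 9.0   # duty cycle pushing the filament through the opening

GPIO.setmode(GPIO.BCM)
GPIO.setup(SERVO_PIN, GPIO.OUT)
pwm = GPIO.PWM(SERVO_PIN, 50)   # standard 50 Hz hobby-servo signal
pwm.start(REST_DUTY)

def actuate_probe(hold_s=1.5):
    """Raise the monofilament until it buckles against the foot, then retract."""
    pwm.ChangeDutyCycle(PRESS_DUTY)
    time.sleep(hold_s)              # hold contact long enough to be felt
    pwm.ChangeDutyCycle(REST_DUTY)  # retract below the platform
```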
  • FIG. 2 is a flow chart that describes an illustrative weight-activated data collection and analysis workflow in a disclosed embodiment.
  • the device when a user steps onto the foot platform at step 201 , the device is activated.
  • The device collects weight readings over the course of a few seconds and compares them to its trigger threshold weight (e.g., 10 pounds) and trigger period (e.g., one second) at step 202. If the measured weight is below the threshold weight, or the weight is present for less than the trigger period, the activation is assumed to be accidental at step 203, and the workflow ends.
  • Other embodiments may use different thresholds for weight and trigger periods, or such thresholds may be omitted if not deemed necessary for a particular application.
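  • A minimal sketch of this trigger logic follows, assuming a hypothetical read_weight_lb() polling function; the threshold and period values mirror the examples given above.

```python
import time

TRIGGER_WEIGHT_LB = 10.0   # example threshold weight
TRIGGER_PERIOD_S = 1.0     # example trigger period

def wait_for_user(read_weight_lb, poll_s=0.05):
    """Block until weight stays above the threshold for a full period."""
    above_since = None
    while True:
        if read_weight_lb() >= TRIGGER_WEIGHT_LB:
            above_since = above_since or time.monotonic()
            if time.monotonic() - above_since >= TRIGGER_PERIOD_S:
                return True            # sustained weight: start the workflow
        else:
            above_since = None         # weight removed: treat as accidental
        time.sleep(poll_s)
```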
  • the user will be prompted (e.g., with synthesized or recorded voice output from the speaker 11 ) to stand still at step 204 , since imaging can take a few seconds and remaining motionless during the image capture process may help the system to obtain higher quality images.
  • The device can then, concurrently or in series, use load cells at step 205 to measure the user's weight, use visible-light and/or near-infrared image sensors to capture images of the user's feet at step 206, and use infrared imaging sensors to obtain thermographic images of the user's feet at step 207.
  • Data obtained in steps 205 , 206 , and 207 can then be tested in step 209 using a classifier or other image analysis or pattern recognition techniques.
  • the system can use techniques such as edge detection to ensure that quality, lighting, and body positioning are satisfactory for input into the image classification pipeline. For example, if image analysis indicates blurry edges in locations where clear edges are expected, or if one foot is detected when two feet are expected, the system may infer that the user was not standing still or not positioned correctly during the image capture. The determination as to whether the input data are satisfactory may vary depending on implementation. If the inputs are found to be deficient, the user interface may prompt the user to take corrective action at step 210 . For example, if one of the user's feet was not positioned for a clear view from the camera, the user interface may prompt the user to move that foot back onto the foot platform.
  • After suggesting corrective action, the device attempts to obtain new sensor inputs and returns to step 204. If satisfactory sensor data cannot be obtained (e.g., after a threshold number of collection attempts), the user interface may prompt the user to try again later at step 211, and the workflow ends.
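  • The quality checks at step 209 might be approximated as in the following sketch, using variance of the Laplacian as a blur measure and a contour count as a rough two-feet check. The thresholds are illustrative assumptions, not values from the disclosure.

```python
import cv2

def inputs_satisfactory(image_bgr, blur_threshold=100.0, min_area=5000):
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    # Low Laplacian variance means few sharp edges: likely motion blur.
    sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
    if sharpness < blur_threshold:
        return False, "Please stand still; the image was blurry."
    # Count large foreground regions; expect two (one per foot).
    _, mask = cv2.threshold(gray, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    feet = [c for c in contours if cv2.contourArea(c) > min_area]
    if len(feet) != 2:
        return False, "Please place both feet on the platform."
    return True, ""
```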
  • images may be passed to a neural network classifier pipeline that classifies the images at step 212 .
  • After classification, which may indicate the presence or absence of conditions like diabetic foot ulcers, the system may upload data such as classification results, raw images, and the user's weight to a patient data management service at step 213.
  • the system may omit local classification or image analysis and transmit only raw images, or images and weight data, to an external system or server that performs more intensive processing, such as image analysis and classification.
  • the user interface provides the user with a summary of the process so far at step 214 , which may include measurements and classification outputs.
  • the system may provide, for example, the user's weight, how measurements such as weight may be trending compared with previous measurements, and an assessment of whether any problematic issues were detected.
  • In some embodiments, the summary provided to the user at step 214 marks the end of the workflow. In other embodiments, the workflow may proceed with further examination of the user.
  • the system will check if it is time to perform a monofilament examination at step 215 . This check could be based on a scheduled examination period, or it could be a rule-based action based on results observed from the previous steps in the exam. For example, if a thermographic camera detects hot or cold spots (skin regions that are cooler or warmer than surrounding regions), these conditions could be cross-verified with a physical monofilament exam.
  • If it is time for a monofilament examination, the system initiates the monofilament exam process at step 216. If the device is not equipped with a monofilament examination device, or if it is not time for a monofilament exam, the user interface concludes the measurement session by reminding the user of any important information that may be pertinent at step 217. For example, this might include the date of their next monofilament exam, or, if the user is following a larger care plan, it may include other aspects of disease management like tips for healthy eating, reminders to exercise, and so on. The workflow then ends.
  • the system may omit providing a summary of the procedure, or the system may provide the summary at some other time or in some other way.
  • For privacy reasons, the device may provide the user with the option to receive an email or other communication indicating the results of the test, rather than providing the results as audible voice output.
  • FIG. 3 is a flowchart of an illustrative monofilament exam workflow.
  • the monofilament exam is initiated while a user is standing on the device, as part of a weight-activated workflow, such as the workflow illustrated in FIG. 2 .
  • the user interface prompts the user to stand still at step 301 , since the monofilament exam may take a couple of minutes or more to complete.
  • the user interface may provide an indicator, such as a visual countdown timer or feedback from the cue light, to indicate that the exam is in progress or estimated time remaining.
  • the device selects test sites for monofilament tests at step 302 .
  • the device selects a random order for the test sites where monofilament tests will be actuated, and may also include one or more placebo measurements.
  • the device may perform the exam according to a predetermined order of test sites or placebo measurements, or select from among a set of possible orders of test sites or placebo measurements.
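  • A minimal sketch of this test-plan selection, using the example test sites listed earlier; the number of placebo actuations is an illustrative parameter.

```python
import random

# Example test sites from the description: heel, foot arch, ball of the
# foot, behind the toes, and the big toe.
TEST_SITES = ["heel", "arch", "ball", "behind_toes", "big_toe"]

def plan_exam(n_placebos=1, rng=random):
    """Return a shuffled list of (site, is_placebo) actuation steps."""
    plan = [(site, False) for site in TEST_SITES]
    plan += [("placebo", True)] * n_placebos   # non-contact actuations
    rng.shuffle(plan)   # randomize so the user cannot anticipate the pattern
    return plan
```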
  • the device uses the imaging system in combination with further processing (e.g., a classifier or other image analysis algorithm such as edge detection) to check if the user's feet are in a proper position at step 303 . If the feet are out of position at step 304 , the user interface prompts the user to take corrective action at step 305 . Once the user's feet are properly positioned, the system determines whether the first test site is a placebo measurement at step 306 . If the action performed at the site is a non-placebo measurement, the relevant monofilament test assembly for that test site is actuated at step 307 .
  • If the measurement is a placebo, a monofilament probe assembly that does not contact the user's foot will be actuated at step 308.
  • Placebo tests may be used to test for false-positive responses by the user. Since the monofilament exam actuators may generate a certain amount of noise and vibration during an actual exam, in a placebo test it may be important to actually perform a physical actuation to simulate the noise and vibration of a real exam and accurately test for false-positive responses.
  • the user interface prompts the user to indicate if they felt the last touch at step 309 .
  • the user can then respond with an affirmative or negative response at step 310 , which the system will match against the test that was actually performed.
  • the system determines whether there are more sites to be tested or placebo measurements to be performed. If so, steps 303 - 310 may be repeated for subsequent test sites or placebo measurements in the set selected at step 302 . Once all sites have been tested and any placebo measurements have been performed, the results, including which test sites were actuated and how the user responded, may be uploaded to a patient data management service at step 312 .
  • Some embodiments may use the device as a standalone device without the use of a patient data management service, in which case this step may be skipped.
  • the user interface summarizes the results for the user at step 313 . This summary may include listing how many sites were tested, how many the user was able to correctly identify, and a list of any test sites where the user did not feel a real actuation.
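  • The scoring behind such a summary might look like the following sketch, where plan comes from a randomized test-plan selection like the one sketched above for step 302 and responses are the user's yes/no answers in the same order.

```python
def summarize_exam(plan, responses):
    """plan: list of (site, is_placebo); responses: list of bools."""
    # Real sites the user failed to feel: possible sensation loss.
    missed = [site for (site, placebo), felt in zip(plan, responses)
              if not placebo and not felt]
    # Placebo actuations the user claimed to feel: false positives.
    false_pos = sum(placebo and felt
                    for (_, placebo), felt in zip(plan, responses))
    real = [site for site, placebo in plan if not placebo]
    return {
        "sites_tested": len(real),
        "correctly_identified": len(real) - len(missed),
        "sites_not_felt": missed,
        "placebo_false_positives": false_pos,
    }
```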
  • the system may omit providing a summary of the procedure, or the system may provide the summary at some other time or in some other way.
  • For privacy reasons, the device may provide the user with the option to receive an email or other communication indicating the results of the test, rather than providing the results as audible voice output.
  • computing techniques and related tools described herein may be implemented by any suitable computing device or set of devices.
  • an engine may be used to perform actions.
  • An engine includes logic (e.g., in the form of computer program code) configured to cause one or more computing devices to perform actions described herein as being associated with the engine.
  • a computing device can be specifically programmed to perform the actions by having installed therein a tangible computer-readable medium having computer-executable instructions stored thereon that, when executed by one or more processors of the computing device, cause the computing device to perform the actions.
  • the particular engines described herein are included for ease of discussion, but many alternatives are possible. For example, actions described herein as associated with two or more engines on multiple devices may be performed by a single engine. As another example, actions described herein as associated with a single engine may be performed by two or more engines on the same device or on multiple devices.
  • server devices may include suitable computing devices configured to provide information and/or services described herein.
  • Server devices may include any suitable computing devices, such as dedicated server devices.
  • Server functionality provided by server devices may, in some cases, be provided by software (e.g., virtualized computing instances or application objects) executing on a computing device that is not a dedicated server device.
  • client can be used to refer to a computing device that obtains information and/or accesses services provided by a server over a communication link. However, the designation of a particular device as a client device does not necessarily require the presence of a server.
  • a single device may act as a server, a client, or both a server and a client, depending on context and configuration.
  • Actual physical locations of clients and servers are not necessarily important, but the locations can be described as “local” for a client and “remote” for a server to illustrate a common usage scenario in which a client is receiving information provided by a server at a remote location.
  • a peer-to-peer arrangement or other models, can be used.
  • FIG. 6 is a block diagram that illustrates aspects of an illustrative computing device 600 appropriate for use in accordance with embodiments of the present disclosure.
  • the description below is applicable to servers, personal computers, mobile phones, smart phones, tablet computers, embedded computing devices, and other currently available or yet-to-be-developed devices that may be used in accordance with embodiments of the present disclosure.
  • the computing device 600 includes at least one processor 602 and a system memory 604 connected by a communication bus 606 .
  • the system memory 604 may be volatile or nonvolatile memory, such as read only memory (“ROM”), random access memory (“RAM”), EEPROM, flash memory, or other memory technology.
  • system memory 604 typically stores data and/or program modules that are immediately accessible to and/or currently being operated on by the processor 602 .
  • the processor 602 may serve as a computational center of the computing device 600 by supporting the execution of instructions.
  • the computing device 600 may include a network interface 610 comprising one or more components for communicating with other devices over a network.
  • Embodiments of the present disclosure may access basic services that utilize the network interface 610 to perform communications using common network protocols.
  • the network interface 610 may also include a wireless network interface configured to communicate via one or more wireless communication protocols, such as WiFi, 2G, 3G, 4G, LTE, WiMAX, Bluetooth, and/or the like.
  • the computing device 600 also includes a storage medium 608 .
  • services may be accessed using a computing device that does not include means for persisting data to a local storage medium. Therefore, the storage medium 608 depicted in FIG. 6 is optional.
  • the storage medium 608 may be volatile or nonvolatile, removable or nonremovable, implemented using any technology capable of storing information such as, but not limited to, a hard drive, solid state drive, CD-ROM, DVD, or other disk storage, magnetic tape, magnetic disk storage, and/or the like.
  • computer-readable medium includes volatile and nonvolatile and removable and nonremovable media implemented in any method or technology capable of storing information, such as computer-readable instructions, data structures, program modules, or other data.
  • system memory 604 and storage medium 608 depicted in FIG. 6 are examples of computer-readable media.
  • FIG. 6 does not show some of the typical components of many computing devices.
  • the computing device 600 may include input devices, such as a keyboard, keypad, mouse, trackball, microphone, video camera, touchpad, touchscreen, electronic pen, stylus, and/or the like.
  • Such input devices may be coupled to the computing device 600 by wired or wireless connections, including RF, infrared, serial, parallel, Bluetooth, USB, or other suitable connection protocols.
  • input data can be captured by input devices and processed, transmitted, or stored (e.g., for future processing).
  • the processing may include encoding data streams, which can be subsequently decoded for presentation by output devices.
  • Media data can be captured by multimedia input devices and stored by saving media data streams as files on a computer-readable storage medium (e.g., in memory or persistent storage on a client device, server, administrator device, or some other device).
  • Input devices can be separate from and communicatively coupled to computing device 600 (e.g., a client device), or can be integral components of the computing device 600 .
  • multiple input devices may be combined into a single, multifunction input device (e.g., a video camera with an integrated microphone).
  • the computing device 600 may also include output devices such as a display, speakers, printer, etc.
  • the output devices may include video output devices such as a display or touchscreen.
  • the output devices also may include audio output devices such as external speakers or earphones.
  • the output devices can be separate from and communicatively coupled to the computing device 600 , or can be integral components of the computing device 600 .
  • Input functionality and output functionality may be integrated into the same input/output device (e.g., a touchscreen). Any suitable input device, output device, or combined input/output device either currently known or developed in the future may be used with described systems.
  • functionality of computing devices described herein may be implemented in computing logic embodied in hardware or software instructions, which can be written in a programming language, such as C, C++, COBOL, Java™, PHP, Perl, Python, Ruby, HTML, CSS, JavaScript, VBScript, ASPX, Microsoft .NET™ languages such as C#, and/or the like.
  • Computing logic may be compiled into executable programs or written in interpreted programming languages.
  • functionality described herein can be implemented as logic modules that can be duplicated to provide greater processing capability, merged with other modules, or divided into sub-modules.
  • the computing logic can be stored in any type of computer-readable medium (e.g., a non-transitory medium such as a memory or storage medium) or computer storage device and be stored on and executed by one or more general-purpose or special-purpose processors, thus creating a special-purpose computing device configured to provide functionality described herein.
  • modules or subsystems can be separated into additional modules or subsystems or combined into fewer modules or subsystems.
  • modules or subsystems can be omitted or supplemented with other modules or subsystems.
  • functions that are indicated as being performed by a particular device, module, or subsystem may instead be performed by one or more other devices, modules, or subsystems.
  • processing stages in the various techniques can be separated into additional stages or combined into fewer stages.
  • processing stages in the various techniques can be omitted or supplemented with other techniques or processing stages.
  • processing stages that are described as occurring in a particular order can instead occur in a different order.
  • processing stages that are described as being performed in a series of steps may instead be handled in a parallel fashion, with multiple modules or software processes concurrently handling one or more of the illustrated processing stages.
  • processing stages that are indicated as being performed by a particular device or module may instead be performed by one or more other devices or modules.

Abstract

A medical diagnostic apparatus includes a controller; a user interface; a platform having a horizontal surface; and at least one visible light image sensor positioned below the horizontal surface that is capable of producing a diagnostic visible light image of a bottom portion of a foot or feet positioned on the horizontal surface. At least a portion of the horizontal surface is transparent to visible light. The apparatus may further include at least one infrared image sensor positioned below the horizontal surface that is capable of producing a thermal image of the bottom portion of the foot or feet. The apparatus may further include sensors positioned above the platform capable of producing diagnostic visible light images or thermal images of a top portion of the user's foot or feet. The apparatus may further include a weight measurement system coupled to the platform.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application claims the benefit of U.S. Provisional Application No. 62/536,388, filed Jul. 24, 2017, which is incorporated by reference herein.
  • SUMMARY
  • This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
  • In one aspect, a medical diagnostic apparatus includes a controller; a user interface configured for user interaction with the medical diagnostic apparatus; a platform having a horizontal surface; a weight measurement system coupled to the platform; and at least one visible light image sensor positioned below the horizontal surface that is capable of producing a diagnostic visible light image of a bottom portion of a foot or feet positioned on the horizontal surface. At least a portion of the horizontal surface is transparent to visible light.
  • In an embodiment, at least a portion of the horizontal surface is transparent to infrared light suitable for thermal imaging, and the apparatus further includes at least one infrared image sensor positioned below the horizontal surface that is capable of producing a thermal image of the bottom portion of the foot or feet.
  • In an embodiment, the apparatus further includes at least one visible light image sensor positioned above the platform capable of producing a diagnostic visible light image of a top portion of the user's foot or feet.
  • In an embodiment, the apparatus further includes at least one infrared image sensor positioned above the platform capable of producing a thermal image of a top portion of the user's foot or feet.
  • In another aspect, a medical diagnostic apparatus includes a platform having a horizontal surface; at least one upper image sensor positioned above the horizontal surface and configured to capture one or more images of a top portion of a foot or feet; at least one lower image sensor positioned below the horizontal surface and configured to capture one or more images of a bottom portion of the foot or feet; a controller; and a user interface configured for user interaction with the medical diagnostic apparatus. At least a portion of the horizontal surface is transparent to visible light. In an embodiment, two cameras are positioned below the horizontal surface. In an embodiment, at least a portion of the horizontal surface is transparent to infrared light suitable for thermal imaging, and the medical diagnostic apparatus further includes at least one infrared image sensor positioned below the horizontal surface that is capable of producing a thermal image of the bottom portion of the foot or feet.
  • In any of the described embodiments, a medical diagnostic apparatus may include a device for testing for peripheral neuropathy. The device may include a controller; a foot platform having at least one opening; at least one vertically oriented monofilament positioned to pass through the at least one opening of the foot platform; and at least one actuator positioned below the at least one opening of the foot platform. The at least one actuator is mechanically coupled to the at least one vertically oriented monofilament and is configured to move the at least one vertically oriented monofilament to pass through the at least one opening of the foot platform.
  • In any of the described embodiments, a medical diagnostic apparatus may include a visual or tactile guide for foot positioning, one or more illumination sources, or a combination of these or other additional features.
  • In another aspect, a computer-implemented method is described for automated diagnosis of a diabetic foot condition. The method includes capturing, by one or more image capture devices of a medical diagnostic apparatus, optical image data of a target area of a foot; collecting, by a touch sensitivity testing device (e.g., a servo-actuated monofilament probe) of the medical diagnostic apparatus, physical touch sensitivity data for the target area of the foot; transmitting, by the medical diagnostic apparatus, the optical image data, the physical touch sensitivity data, or a combination of such data to an analysis engine; and outputting, by the analysis engine, one or more indications of a diabetic foot condition. The method may further include checking the optical image data for one or more of image quality, lighting conditions, or body positioning. The method may further include, prior to collecting the physical touch sensitivity data, confirming the position of the foot based at least in part on the optical image data. The analysis engine may include an image classifier.
  • In any of the described embodiments, a user interface may include an interactive voice interface, a display, a visual indicator, a combination of such user interface features, or other user interface features.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The foregoing aspects and many of the attendant advantages of this invention will become more readily appreciated as the same become better understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings, wherein:
  • FIG. 1 is a block diagram that depicts electronic systems that may be used in disclosed embodiments;
  • FIG. 2 is a flow chart that describes an illustrative weight-activated data collection and analysis workflow that may be used in disclosed embodiments;
  • FIG. 3 is a flow chart of an illustrative monofilament exam workflow that may be used in disclosed embodiments;
  • FIG. 4 is a perspective view of a disclosed embodiment integrated in a standing scale;
  • FIG. 5 is a schematic diagram of a monofilament assembly that may be used in described embodiments for peripheral neuropathy testing; and
  • FIG. 6 is a block diagram that illustrates aspects of an illustrative computing device appropriate for use in accordance with embodiments of the present disclosure.
  • DETAILED DESCRIPTION
  • The present disclosure describes embodiments that facilitate non-invasive detection of skin problems in the feet or other areas, including neuropathy, vascular disease, and ulcers, such as those associated with diabetes mellitus.
  • In the following description, numerous specific details are set forth in order to provide a thorough understanding of illustrative embodiments of the present disclosure. It will be apparent to one skilled in the art, however, that many embodiments of the present disclosure may be practiced without some or all of the specific details. In some instances, well-known process steps have not been described in detail in order not to unnecessarily obscure various aspects of the present disclosure. Further, it will be appreciated that embodiments of the present disclosure may employ any combination of features described herein. The illustrative examples provided herein are not intended to be exhaustive or to limit the claimed subject matter to the precise forms disclosed.
  • Diabetic foot infection is the most common complication of diabetes mellitus leading to hospitalization and the most frequent cause of non-traumatic lower extremity amputation. Diabetic foot ulcers and other foot infections develop due to reduced feeling in the foot from peripheral neuropathy, the most common form of diabetic neuropathy. People who suffer from such conditions may not be aware of abrasions or hotspots that may develop into ulcers, that wounds are not healing, or that foreign objects have become lodged in the foot.
  • To address problems such as these, disclosed embodiments use a combination of sensors (e.g., optical and physical sensors) together with related processing techniques (e.g., machine-learned classifiers) to help detect skin conditions (e.g., possible complications from diabetes). As described in detail below, the user or patient may be guided through operation of the device using an interactive voice system.
  • One possible sensing mechanism that may be used in disclosed embodiments is a group of low-cost image sensors (e.g., in commercially available cameras), capturing visible spectrum or near-IR images. In a disclosed embodiment, cameras are arranged in pairs (e.g., with one above and one below the subject body part being measured). Multi-camera setups can also be used to enhance camera coverage and increase the accuracy of the prediction algorithms. Single-camera applications are also possible, though in practice the user may need to reposition their body to get full camera coverage of the body part in question. The disclosed image sensors and cameras may be used alone, or in combination with other sensing systems.
  • A second sensing mechanism that may be used in disclosed embodiments uses a thermographic infrared image sensor (e.g., in commercially available infrared cameras) to detect areas of varying skin temperature, which can help identify regions that are either cooler than surrounding areas (which may indicate conditions such as compromised blood flow) or hotter than surrounding areas (which may indicate conditions such as active infections). In a disclosed embodiment, the infrared image sensors (e.g., in one or more thermal cameras) are co-located with the visible-spectrum image sensors (e.g., with one infrared camera above and one below the subject body part being measured).
  • A third sensing mechanism that may be used in disclosed embodiments is a physical sensor. In a disclosed embodiment, the physical sensor uses one or more servo-actuated monofilaments to test for loss of skin sensation. A monofilament exam is a test used to identify cases of peripheral neuropathy in diabetic patients. This technique is similar in some respects to the Semmes-Weinstein monofilament exam used by physicians, although the administration of the test, the features of the testing device, and the collection and analysis of the data differ from those of traditional tests, as described in detail below.
  • In a disclosed embodiment, the device attaches a series of monofilament fibers with a standard gauge (e.g., a 5.07 gauge fiber that produces 10 g of pressure) to micro-servo actuators placed in locations distributed around the device in order to contact test sites of the body part to be measured. On a foot, illustrative test site locations include the heel, foot arch, ball of the foot, behind the toes, and the big toe. The test may be administered with an interactive user interface, such as an interactive voice system. (As an alternative or in addition to the interactive voice system, the user interface may be implemented as a graphical user interface (e.g., with a touch screen) or in some other way.) The user interface may, for example, provide instructions to the patient on how to start the test or prompt the patient (e.g., using a recorded or synthesized voice output) to say or otherwise indicate (e.g., with a button press, a gesture or other user input) when they feel a sensation from the monofilament. Responses from the patient can then be processed, recorded, and acted upon by the system as described herein.
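  • By way of illustration only, the following Python sketch shows how a single servo-actuated monofilament touch followed by a voice prompt might be sequenced. The set_servo_angle(), speak(), and listen_for_yes_no() helpers are hypothetical stubs standing in for the servo driver and the interactive voice system; they are not part of the disclosed apparatus.

    import time

    def set_servo_angle(channel: int, degrees: float) -> None:
        # Hypothetical stub standing in for a real micro-servo driver.
        print(f"servo[{channel}] -> {degrees} deg")

    def speak(text: str) -> None:
        # Hypothetical stub standing in for recorded/synthesized voice output.
        print(f"VOICE: {text}")

    def listen_for_yes_no() -> bool:
        # Hypothetical stub standing in for the speech-recognition front end.
        return input("yes/no> ").strip().lower().startswith("y")

    EXTENDED_DEG, RETRACTED_DEG = 90, 0

    def run_monofilament_touch(servo_channel: int) -> bool:
        # Press one pre-calibrated (e.g., 10 g buckling force) monofilament
        # against the test site, retract it, then ask the user for a response.
        set_servo_angle(servo_channel, EXTENDED_DEG)
        time.sleep(1.5)  # hold contact briefly so the touch can be felt
        set_servo_angle(servo_channel, RETRACTED_DEG)
        speak("Did you feel a touch on your foot? Please say yes or no.")
        return listen_for_yes_no()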
  • Other physical sensing or testing mechanisms may be used in combination with or in place of monofilament tests. For example, the system may include circuitry embedded in the foot platform to test the user's ability to feel heat or pain by means of integrated heating strips or low-current electrical discharges. In such embodiments, a heat-testing device may use an optically transparent but electrically conductive material such as indium tin oxide to conduct current to the test site where it is run through a higher-resistance portion of the coating in order to generate resistive heating, similar to the way aircraft window heaters work. As in other embodiments described herein, the user may then be prompted (e.g., using a recorded or synthesized voice output) to say or otherwise indicate (e.g., with a button press, a gesture or other user input) when they feel a sensation from the heating element. The heat sensitivity testing method may include several different levels of heating to test for sensation to obtain a more accurate assessment of the level of neuropathy present. An electrical discharge device may use transparent conductive material to route a circuit to several test sites located around the foot (or other body part being tested). A small gap is left for the user's body to complete the circuit and introduce electrical stimulation. This may be done with high-voltage, low-current electricity as is commonly used in other medical diagnostic equipment to test for pain responses. Again, the user is prompted to indicate when they feel the stimulus. The intensity of the discharge can be varied to obtain a more accurate assessment of the level of neuropathy present.
  • In a disclosed embodiment, the sensors and related structures and control circuitry are combined and integrated into a standing scale form factor. A standing scale (e.g., a bathroom scale) provides a surface on which a person may stand, which provides a suitable platform for foot-based data collection. This disclosed embodiment also has the benefit of collecting the user's weight, which is also an important metric for diabetic patients, since weight management is often an important part of a diabetes management regimen.
  • FIG. 1 is a block diagram that depicts electronic systems that may be used in disclosed embodiments. In the example shown in FIG. 1, microcontroller 51 controls program execution and sensor integration. It is powered by a low-voltage power supply 61 and uses a durable storage device 62, such as a magnetic hard drive or solid-state flash storage, for program and data storage. Alternatively, the device may be controlled by some other controller or computing device.
  • FIG. 1 also depicts an illustrative user interface in the form of an interactive voice system. The interactive voice system provides a user experience which makes use of a speaker 11 to provide audio output and microphones with related speech recognition software or hardware to listen for user commands and responses to questions. The interactive voice system may use a single microphone or an array of microphones 12, which may be organized for beamforming for better background noise rejection. The illustrative user interface also uses a visual indicator, such as a multi-color cue light 13, to indicate various interface states. In this example, the cue light 13 may be used to indicate states such as microphone active, microphone muted, system busy (e.g., when capturing an image or processing a captured image), and when the system is speaking or providing other output.
  • During operation, the system may obtain data from one or more sensors, which may be physical or optical. In the example shown in FIG. 1, a collection of load cells 21, which may be arranged in groups of four, is used to measure the user's weight. The system may self-tare to subtract the weight of other device equipment such as the footbed, electronics, and cameras. For example, the system may perform self-taring during device initialization, before the first user weigh-in. Load cells may be connected to the microcontroller by way of an analog-to-digital amplifier/converter such as the AVIA Semiconductor HX711.
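  • As a rough sketch of the self-taring arithmetic, the following Python example averages a number of raw converter readings at initialization and subtracts that offset from later weigh-ins. The read_raw() stub and the counts-per-pound scale factor are assumptions for illustration; a real HX711 driver would clock samples off the converter's data pins.

    import statistics

    def read_raw() -> int:
        # Stub for one 24-bit reading from the analog-to-digital converter
        # (hypothetical; a real driver reads the HX711's DOUT/SCK pins).
        return 842_113

    class SelfTaringScale:
        def __init__(self, counts_per_lb: float, samples: int = 32):
            self.counts_per_lb = counts_per_lb
            # Self-tare at initialization, before the first user weigh-in,
            # so the footbed, electronics, and cameras are subtracted out.
            self.offset = statistics.mean(read_raw() for _ in range(samples))

        def weight_lb(self, samples: int = 16) -> float:
            raw = statistics.mean(read_raw() for _ in range(samples))
            return (raw - self.offset) / self.counts_per_lb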
  • In some embodiments, the device also includes an array of monofilament probe assemblies 22 that are used to test the user's foot for peripheral neuropathy. These are described in more detail below with reference to FIGS. 3 and 5.
  • The device uses optical image sensors (e.g., in the form of cameras) to create images of the user's feet (or other body areas in some embodiments). In the example shown in FIG. 1, visible-light cameras 31 are used, along with other cameras that capture images at other wavelengths. In some embodiments, to reduce cost and complexity, the visible-light cameras 31 may be commercially available cameras with visible-light image sensors such as those found in cellular phones. Other cameras may use image sensors calibrated to capture visible light, near-infrared light, infrared light, or some combination thereof. In some embodiments, wide-angle lenses may be used in conjunction with image sensors to enable a more compact chassis design, such as for cameras mounted below a foot platform. Near-infrared cameras 32 may be used to capture extended spectrum information. Described embodiments may include one or more illumination sources 34 (e.g., LED lights) that provide more consistent image lighting for visible-light and/or near-infrared image sensors. Consistent image lighting can improve image classifier accuracy.
  • Some embodiments use thermal image sensors to capture heat data about the user's body. In the example shown in FIG. 1, thermal image sensors are included in thermal (far-infrared) cameras 33. Thermal imaging of the skin can indicate areas of concern, such as regions that are cooler than the surrounding skin (possibly indicating lower-than-normal blood supply, which could mark a likely area for the development of peripheral neuropathy) or warmer than the surrounding skin (possibly indicating a developing infection such as a diabetic foot ulcer). A simple flagging sketch follows.
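  • A minimal sketch of how such hot and cold spots might be flagged from a single thermal frame appears below, assuming the frame has already been converted to per-pixel skin temperatures with the background masked out; the 2.2 °C threshold is purely illustrative, not a clinically validated value.

    import numpy as np

    def flag_thermal_asymmetries(thermal_c: np.ndarray, delta_c: float = 2.2):
        # Compare each pixel to the median skin temperature of the frame.
        baseline = float(np.median(thermal_c))
        hot_spots = thermal_c > baseline + delta_c   # possible infection/ulcer
        cold_spots = thermal_c < baseline - delta_c  # possible reduced blood flow
        return hot_spots, cold_spots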
  • In some embodiments, subsequent to collection of optical imaging data, the images are provided as input to image processing software, such as a neural network classifier pipeline 41, which then performs a series of classification steps related to diagnosis or treatment. Some exemplary classification steps include, for example, detecting the presence, position and orientation of feet (or other body parts); detecting the presence and location of skin abnormalities (e.g., ulcers, foreign objects, or abnormal temperatures); and classifying these abnormalities (e.g., as possible areas of peripheral neuropathy or infection). In each of these steps, good results can be obtained by using a deep convolutional neural network. Many existing commercial and open-source frameworks can be used for this task. Basic principles of training classifiers are well known to those skilled in the art of machine learning, and these basic principles need not be described here.
  • In described embodiments, training classifiers involves collecting extensive datasets of feet (or other body parts), both with and without the skin abnormalities being searched for, and manually labeling this data with the correct classification labels for each step in the classification pipeline (e.g., presence, position, and orientation of feet; presence and location of skin abnormalities; types of abnormalities). This data is then used to train the machine-learned classifiers and iteratively improve classifier accuracy by obtaining new data, adjusting the steps in the classification pipeline, extracting new features to assist in classification, etc. Overall classifier pipeline performance can be measured with the precision of the predictions (the percentage of positive predictions that are correct) and the recall of the predictions (the percentage of truly positive cases that receive positive predictions). These metrics can be balanced in order to obtain an acceptable tradeoff between the two, as illustrated in the sketch below.
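  • The precision and recall arithmetic referenced above reduces to simple counts over labeled test data, as in this sketch (the example labels are illustrative):

    def precision_recall(y_true, y_pred):
        # Precision: fraction of positive predictions that are correct.
        # Recall: fraction of truly positive cases that are predicted positive.
        tp = sum(1 for t, p in zip(y_true, y_pred) if t and p)
        fp = sum(1 for t, p in zip(y_true, y_pred) if not t and p)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t and not p)
        precision = tp / (tp + fp) if (tp + fp) else 0.0
        recall = tp / (tp + fn) if (tp + fn) else 0.0
        return precision, recall

    # Illustrative labels for "ulcer present" on five test images:
    # precision_recall([1, 1, 0, 0, 1], [1, 0, 0, 1, 1]) -> approx (0.667, 0.667)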
  • The classification pipeline output may be multi-class; for example, it may identify diabetic foot ulcers as well as other foot conditions such as cuts, bruises, corns, warts, etc. The classification pipeline may alternately output a binary classification indicating whether a given skin issue requires further medical follow-up. The binary classification approach may be useful in situations where an abnormality is detected to be present but, due to factors such as poor image quality or missing images, the abnormality cannot be accurately classified.
  • Some embodiments send data (e.g., patient weight data, raw image data, image classification data) to other devices for storage or further processing. For example, data may be transmitted through a wireless networking adapter 63 and then through a network 64 to arrive at a remote computer system, such as a patient data management service 65. This service may store weight data, raw images, classifier output, or other data; perform further image processing; test the data against predefined rules (such as positive classifier predictions or weight gains above some threshold); and send communications such as patient follow-up messages. For example, in some embodiments a positive classification reading for foot ulcers may trigger a computer system (e.g., the patient data management service 65) to send an alert to the patient or the patient's care team for follow-up, and send a report (e.g., including images) directly into the patient's electronic medical records.
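  • The rule testing described above can be as simple as a list of predicate checks applied to each uploaded record, as in the following sketch; the field names and the 5-pound threshold are assumptions for illustration rather than features of the disclosed service.

    def evaluate_rules(record: dict) -> list:
        # Field names and thresholds here are illustrative assumptions only.
        alerts = []
        if record.get("ulcer_classifier_positive"):
            alerts.append("positive ulcer classification: alert care team and "
                          "file report to the patient's electronic medical record")
        if record.get("weight_lb", 0) - record.get("prior_weight_lb", 0) > 5:
            alerts.append("weight gain above threshold: send patient follow-up")
        return alerts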
  • FIG. 4 is a perspective view of a disclosed embodiment integrated in a standing scale. In the example shown in FIG. 4, the device includes a suitably sized foot platform 401 for the user to stand on during use. (Although embodiments described herein refer to users standing on the platform, it should be readily understood that in other embodiments the device can be modified, such as with a bench or chair, such that the user is not required to stand.) The platform 401 may be constructed of a strong transparent material such as polycarbonate or tempered glass in order to allow upward-facing cameras 410 to image the bottoms of the user's feet. The transparent material may be selected based on the imaging to be performed. For example, to allow thermographic cameras to capture images through the foot platform, a suitable material that is transparent to long-infrared wavelengths may be used. Built-in illumination 411, such as LED lights, may be used to provide consistent and sufficient lighting for the images. To capture images of the user's feet at close range, the cameras 410 may include wide-angle lenses, and may be arranged in an array. This design allows the platform 401 to be constructed with a low profile, which reduces the likelihood of injury due to tripping or falling when using the device. In one embodiment, the footbed measures 12 inches × 12 inches and uses a pair of cameras with a 150° field of view to obtain full coverage of the footbed at a range of approximately 3.2 inches. In this example, the optical distortion of these cameras is minimal enough not to require any special processing, and they give clear corner-to-corner resolution.
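  • The footbed coverage figures can be checked with pinhole-camera geometry: the width imaged at range d by a lens with field of view θ is 2·d·tan(θ/2). A quick Python check, assuming ideal optics and ignoring lens distortion:

    import math

    def coverage_width(fov_deg: float, distance_in: float) -> float:
        # Width imaged at a given range by an ideal pinhole camera.
        return 2 * distance_in * math.tan(math.radians(fov_deg) / 2)

    # A 150-degree lens at about 3.2 inches covers roughly 23.9 inches across,
    # comfortably spanning a 12 x 12 inch footbed from a low-profile base.
    print(round(coverage_width(150, 3.2), 1))  # -> 23.9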
  • The platform 401 is supported by four load cells, which are in turn supported by support legs 402. The load cells are used to measure the weight of the user. For mechanical simplicity and to prevent binding or friction, which may produce inaccurate weight measurements, other components on the device may be attached to the foot platform 401. The weight of the foot platform and other components attached to it may be tared by the device during an automatic taring process, which may be performed during device power-up, device restart, or at some other time.
  • In some situations, imaging of the tops of the user's feet may be desirable. For this type of imaging, image sensors may be included in an upper head assembly 404 attached to a support arm 403. The upper head assembly also may include elements of a user interface, which may be beneficial for locating the user interface closer to the user's head to allow the user to more easily interact with the user interface (e.g., to more easily detect the user's voice in a user interface with voice control functionality). In the example shown in FIG. 4, the upper head assembly 404 includes a speaker 405 and a microphone 406 to support voice interactions with the user, as well as a visual indicator such as a multi-color cue light 407 to indicate interface states such as microphone active, microphone muted, system busy, system speaking (e.g., providing synthesized or recorded voice output), or error conditions. Image sensors in the upper head assembly 404 may include cameras 408 such as visible-light, near-infrared, and thermographic cameras, along with built-in illumination 409 in order to provide sufficient and consistent lighting for the images.
  • As mentioned above, some embodiments are equipped with one or more monofilament assemblies 412 to perform monofilament exams. Although only one monofilament assembly is shown in FIG. 4 for ease of illustration, the device may include several assemblies placed at various locations around the device in order to test different sites on the user's feet. A foot outline (not shown) or other visual or tactile guide may be provided on the foot platform 401 to assist users in positioning their feet correctly for imaging or monofilament testing.
  • FIG. 5 is a schematic diagram of a monofilament assembly that may be used in described embodiments for peripheral neuropathy testing. In described embodiments, the test is derived from the Semmes-Weinstein monofilament exam used by physicians. During operation, an actuator (e.g., micro servo actuator 503) is activated to move the monofilament 504 through an opening in the foot platform 502 so that the monofilament 504 is in contact with the user's foot 501. Various forms of actuation are contemplated, including a rotary servo with an arm that is connected to the monofilament, or a linear actuator. As shown in FIG. 5, the actuation occurs with sufficient force to cause the monofilament to bend or buckle below the foot platform 502. The device can be designed to ensure that a consistent amount of pressure is applied during testing. In at least one embodiment, the monofilament is pre-calibrated to a standard amount of buckling force, e.g., 10 grams.
  • FIG. 2 is a flow chart that describes an illustrative weight-activated data collection and analysis workflow in a disclosed embodiment. In the example shown in FIG. 2, when a user steps onto the foot platform at step 201, the device is activated. The device collects weight samples over a short collection window (e.g., a few seconds) and checks, at step 202, whether its trigger threshold weight (e.g., 10 pounds) is sustained for its trigger period (e.g., one second). If the measured weight is below the threshold weight, or the weight is present for less than the trigger period, the activation is assumed to be accidental at step 203, and the workflow ends. Other embodiments may use different thresholds for weight and trigger periods, or such thresholds may be omitted if not deemed necessary for a particular application. A sketch of this activation check follows.
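  • The check amounts to requiring that the threshold weight be sustained for the full trigger period, as in this minimal sketch; read_weight_lb is any callable returning the current load in pounds, and the constants are the example values given above.

    import time

    TRIGGER_LB = 10.0        # trigger threshold weight
    TRIGGER_PERIOD_S = 1.0   # how long the weight must be sustained

    def check_activation(read_weight_lb) -> bool:
        # Called when a load is first sensed; True means a real weigh-in,
        # False means the activation is treated as accidental (step 203).
        start = time.monotonic()
        while time.monotonic() - start < TRIGGER_PERIOD_S:
            if read_weight_lb() < TRIGGER_LB:
                return False
            time.sleep(0.05)
        return True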
  • In the example shown in FIG. 2, if the threshold weight is measured for the trigger period, the user will be prompted (e.g., with synthesized or recorded voice output from the speaker 11) to stand still at step 204, since imaging can take a few seconds and remaining motionless during the image capture process may help the system obtain higher quality images. The device can then, concurrently or in series, use load cells at step 205 to measure the user's weight, use visible-light and/or near-infrared image sensors to capture images of the user's feet at step 206, and use infrared imaging sensors to obtain thermographic images of the user's feet at step 207. Data obtained in steps 205, 206, and 207 can then be tested in step 209 using a classifier or other image analysis or pattern recognition techniques.
  • With regard to images, the system can use techniques such as edge detection to ensure that quality, lighting, and body positioning are satisfactory for input into the image classification pipeline. For example, if image analysis indicates blurry edges in locations where clear edges are expected, or if one foot is detected when two feet are expected, the system may infer that the user was not standing still or not positioned correctly during the image capture. The determination as to whether the input data are satisfactory may vary depending on implementation. If the inputs are found to be deficient, the user interface may prompt the user to take corrective action at step 210. For example, if one of the user's feet was not positioned for a clear view from the camera, the user interface may prompt the user to move that foot back onto the foot platform. After suggesting corrective action, the device then attempts to obtain new sensor inputs and returns to step 204. If satisfactory sensor data cannot be obtained (e.g., after a threshold number of collection attempts), the user interface may prompt the user to try again later at step 211 and end the workflow.
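  • One common heuristic for the image-quality portion of such checks is the variance of the Laplacian, which drops when edges blur. The sketch below uses OpenCV; the threshold is an illustrative value that would need tuning for a particular camera and lighting setup.

    import cv2

    def image_is_sharp(path: str, threshold: float = 100.0) -> bool:
        # Low variance of the Laplacian suggests blurred edges, e.g. because
        # the user was not standing still during capture.
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        if gray is None:
            raise FileNotFoundError(path)
        return cv2.Laplacian(gray, cv2.CV_64F).var() >= threshold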
  • Once satisfactory sensor inputs are obtained, the inputs are processed further. For example, images may be passed to a neural network classifier pipeline that classifies the images at step 212. After classification—which may indicate the presence or absence of conditions like diabetic foot ulcers—the system may upload data such as classification results, raw images, and the user's weight to a patient data management service at step 213. While some embodiments may upload data, such as the sensor and classification data, to other computing devices, this is not required. For example, standalone devices that do not use a network connection or a patient data management service are contemplated. In other scenarios, such as cloud computing arrangements, the system may omit local classification or image analysis and transmit only raw images, or images and weight data, to an external system or server that performs more intensive processing, such as image analysis and classification.
  • Referring again to the example shown in FIG. 2, after classification has occurred, the user interface provides the user with a summary of the process so far at step 214, which may include measurements and classification outputs. The system may provide, for example, the user's weight, how measurements such as weight may be trending compared with previous measurements, and an assessment of whether any problematic issues were detected.
  • In some embodiments or usage scenarios, the summary provided to the user at step 214 will mark the end of the workflow. In other embodiments or scenarios, the workflow may proceed with further examination of the user. In the example shown in FIG. 2, in embodiments equipped with monofilament probes, the system will check if it is time to perform a monofilament examination at step 215. This check could be based on a scheduled examination period, or it could be a rule-based action based on results observed from the previous steps in the exam. For example, if a thermographic camera detects hot or cold spots (skin regions that are cooler or warmer than surrounding regions), these conditions could be cross-verified with a physical monofilament exam. If the monofilament examination is to be performed, the system initiates the monofilament exam process at step 216. If the device is not equipped with a monofilament examination device, or if it is not time for a monofilament exam, the user interface concludes the measurement session by reminding the user of any important information that may be pertinent at step 217. For example, this might include the date of their next monofilament exam, or if the user is following a larger care plan, it may include other aspects of disease management like tips for healthy eating, reminders to exercise, and so on. The workflow then ends.
  • Many alternatives to the workflow of FIG. 2 are possible. For example, the system may omit providing a summary of the procedure, or the system may provide the summary at some other time or in some other way. As another example, if the device is located in a public area such as a pharmacy, the device may provide the user with the option to receive an email or other communication indicating the results of the test, rather than providing them as audible voice output, for privacy reasons.
  • FIG. 3 is a flowchart of an illustrative monofilament exam workflow. In some embodiments, the monofilament exam is initiated while a user is standing on the device, as part of a weight-activated workflow, such as the workflow illustrated in FIG. 2. The user interface prompts the user to stand still at step 301, since the monofilament exam may take a couple of minutes or more to complete. The user interface may provide an indicator, such as a visual countdown timer or feedback from the cue light, to indicate that the exam is in progress or estimated time remaining. The device selects test sites for monofilament tests at step 302. In this example, the device selects a random order for the test sites where monofilament tests will be actuated, and may also include one or more placebo measurements. Alternatively, the device may perform the exam according to a predetermined order of test sites or placebo measurements, or select from among a set of possible orders of test sites or placebo measurements.
  • The device then uses the imaging system in combination with further processing (e.g., a classifier or other image analysis algorithm such as edge detection) to check whether the user's feet are in a proper position at step 303. If the feet are out of position at step 304, the user interface prompts the user to take corrective action at step 305. Once the user's feet are properly positioned, the system determines whether the first test site is a placebo measurement at step 306. If the action performed at the site is a non-placebo measurement, the relevant monofilament test assembly for that test site is actuated at step 307. If the action performed at the test site is a placebo measurement, a monofilament probe assembly that does not contact the user's foot is actuated at step 308. Placebo tests may be used to test for false-positive responses by the user. Since the monofilament exam actuators may generate a certain amount of noise and vibration during an actual exam, in a placebo test it may be important to actually perform a physical actuation to simulate the noise and vibration of a real exam and accurately test for false-positive responses.
  • Following either the placebo measurement or the actual exam at the test site, the user interface prompts the user to indicate if they felt the last touch at step 309. The user can then respond with an affirmative or negative response at step 310, which the system will match against the test that was actually performed. At step 311, the system determines whether there are more sites to be tested or placebo measurements to be performed. If so, steps 303-310 may be repeated for subsequent test sites or placebo measurements in the set selected at step 302. Once all sites have been tested and any placebo measurements have been performed, the results, including which test sites were actuated and how the user responded, may be uploaded to a patient data management service at step 312. Some embodiments may use the device as a standalone device without the use of a patient data management service, in which case this step may be skipped. The user interface summarizes the results for the user at step 313. This summary may include listing how many sites were tested, how many the user was able to correctly identify, and a list of any test sites where the user did not feel a real actuation.
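  • The site-selection and scoring logic of FIG. 3 might be sketched as follows; the actuate_site, actuate_placebo, and ask_felt callables wrap the actuation hardware and voice layers described above and are assumptions for illustration.

    import random

    SITES = ["heel", "foot arch", "ball of foot", "behind toes", "big toe"]

    def run_exam(actuate_site, actuate_placebo, ask_felt, n_placebos: int = 2):
        # Randomize the order of real and placebo actuations (step 302), then
        # score each user response against what was actually performed.
        plan = SITES + ["placebo"] * n_placebos
        random.shuffle(plan)
        results = []
        for trial in plan:
            if trial == "placebo":
                actuate_placebo()                        # moves, never touches the foot
                results.append((trial, not ask_felt()))  # correct if NOT felt
            else:
                actuate_site(trial)
                results.append((trial, ask_felt()))      # correct if felt
        return results  # e.g., uploaded with which sites were actuated (step 312)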
  • Many alternatives to the workflow of FIG. 3 are possible. For example, the system may omit providing a summary of the procedure, or the system may provide the summary at some other time or in some other way. As another example, if the device is located in a public area such as a pharmacy, the device may provide the user with the option to receive an email or other communication indicating the results of the test, rather than providing them as audible voice output, for privacy reasons.
  • Illustrative Computing Devices and Operating Environments
  • Unless otherwise specified in the context of specific examples, computing techniques and related tools described herein may be implemented by any suitable computing device or set of devices.
  • In any of the described examples, an engine may be used to perform actions. An engine includes logic (e.g., in the form of computer program code) configured to cause one or more computing devices to perform actions described herein as being associated with the engine. For example, a computing device can be specifically programmed to perform the actions by having installed therein a tangible computer-readable medium having computer-executable instructions stored thereon that, when executed by one or more processors of the computing device, cause the computing device to perform the actions. The particular engines described herein are included for ease of discussion, but many alternatives are possible. For example, actions described herein as associated with two or more engines on multiple devices may be performed by a single engine. As another example, actions described herein as associated with a single engine may be performed by two or more engines on the same device or on multiple devices.
  • Some of the functionality described herein may be implemented in the context of a client-server relationship. In this context, server devices may include suitable computing devices configured to provide information and/or services described herein. Server devices may include any suitable computing devices, such as dedicated server devices. Server functionality provided by server devices may, in some cases, be provided by software (e.g., virtualized computing instances or application objects) executing on a computing device that is not a dedicated server device. The term “client” can be used to refer to a computing device that obtains information and/or accesses services provided by a server over a communication link. However, the designation of a particular device as a client device does not necessarily require the presence of a server. At various times, a single device may act as a server, a client, or both a server and a client, depending on context and configuration. Actual physical locations of clients and servers are not necessarily important, but the locations can be described as “local” for a client and “remote” for a server to illustrate a common usage scenario in which a client is receiving information provided by a server at a remote location. Alternatively, a peer-to-peer arrangement, or other models, can be used.
  • FIG. 6 is a block diagram that illustrates aspects of an illustrative computing device 600 appropriate for use in accordance with embodiments of the present disclosure. The description below is applicable to servers, personal computers, mobile phones, smart phones, tablet computers, embedded computing devices, and other currently available or yet-to-be-developed devices that may be used in accordance with embodiments of the present disclosure.
  • In its most basic configuration, the computing device 600 includes at least one processor 602 and a system memory 604 connected by a communication bus 606. Depending on the exact configuration and type of device, the system memory 604 may be volatile or nonvolatile memory, such as read only memory (“ROM”), random access memory (“RAM”), EEPROM, flash memory, or other memory technology. Those of ordinary skill in the art and others will recognize that system memory 604 typically stores data and/or program modules that are immediately accessible to and/or currently being operated on by the processor 602. In this regard, the processor 602 may serve as a computational center of the computing device 600 by supporting the execution of instructions.
  • As further illustrated in FIG. 6, the computing device 600 may include a network interface 610 comprising one or more components for communicating with other devices over a network. Embodiments of the present disclosure may access basic services that utilize the network interface 610 to perform communications using common network protocols. The network interface 610 may also include a wireless network interface configured to communicate via one or more wireless communication protocols, such as WiFi, 2G, 3G, 4G, LTE, WiMAX, Bluetooth, and/or the like.
  • In FIG. 6, the computing device 600 also includes a storage medium 608. However, services may be accessed using a computing device that does not include means for persisting data to a local storage medium. Therefore, the storage medium 608 depicted in FIG. 6 is optional. In any event, the storage medium 608 may be volatile or nonvolatile, removable or nonremovable, implemented using any technology capable of storing information such as, but not limited to, a hard drive, solid state drive, CD-ROM, DVD, or other disk storage, magnetic tape, magnetic disk storage, and/or the like.
  • As used herein, the term “computer-readable medium” includes volatile and nonvolatile and removable and nonremovable media implemented in any method or technology capable of storing information, such as computer-readable instructions, data structures, program modules, or other data. In this regard, the system memory 604 and storage medium 608 depicted in FIG. 6 are examples of computer-readable media.
  • For ease of illustration and because it is not important for an understanding of the claimed subject matter, FIG. 6 does not show some of the typical components of many computing devices. In this regard, the computing device 600 may include input devices, such as a keyboard, keypad, mouse, trackball, microphone, video camera, touchpad, touchscreen, electronic pen, stylus, and/or the like. Such input devices may be coupled to the computing device 600 by wired or wireless connections including RF, infrared, serial, parallel, Bluetooth, USB, or other suitable connection protocols using wireless or physical connections.
  • In any of the described examples, input data can be captured by input devices and processed, transmitted, or stored (e.g., for future processing). The processing may include encoding data streams, which can be subsequently decoded for presentation by output devices. Media data can be captured by multimedia input devices and stored by saving media data streams as files on a computer-readable storage medium (e.g., in memory or persistent storage on a client device, server, administrator device, or some other device). Input devices can be separate from and communicatively coupled to computing device 600 (e.g., a client device), or can be integral components of the computing device 600. In some embodiments, multiple input devices may be combined into a single, multifunction input device (e.g., a video camera with an integrated microphone). The computing device 600 may also include output devices such as a display, speakers, printer, etc. The output devices may include video output devices such as a display or touchscreen. The output devices also may include audio output devices such as external speakers or earphones. The output devices can be separate from and communicatively coupled to the computing device 600, or can be integral components of the computing device 600. Input functionality and output functionality may be integrated into the same input/output device (e.g., a touchscreen). Any suitable input device, output device, or combined input/output device either currently known or developed in the future may be used with described systems.
  • In general, functionality of computing devices described herein may be implemented in computing logic embodied in hardware or software instructions, which can be written in a programming language, such as C, C++, COBOL, JAVA™, PHP, Perl, Python, Ruby, HTML, CSS, JavaScript, VBScript, ASPX, Microsoft .NET™ languages such as C#, and/or the like. Computing logic may be compiled into executable programs or written in interpreted programming languages. Generally, functionality described herein can be implemented as logic modules that can be duplicated to provide greater processing capability, merged with other modules, or divided into sub-modules. The computing logic can be stored in any type of computer-readable medium (e.g., a non-transitory medium such as a memory or storage medium) or computer storage device and be stored on and executed by one or more general-purpose or special-purpose processors, thus creating a special-purpose computing device configured to provide functionality described herein.
  • EXTENSIONS AND ALTERNATIVES
  • Many alternatives to the systems and devices described herein are possible. For example, individual modules or subsystems can be separated into additional modules or subsystems or combined into fewer modules or subsystems. As another example, modules or subsystems can be omitted or supplemented with other modules or subsystems. As another example, functions that are indicated as being performed by a particular device, module, or subsystem may instead be performed by one or more other devices, modules, or subsystems. Although some examples in the present disclosure include descriptions of devices comprising specific hardware components in specific arrangements, techniques and tools described herein can be modified to accommodate different hardware components, combinations, or arrangements. Further, although some examples in the present disclosure include descriptions of specific usage scenarios, techniques and tools described herein can be modified to accommodate different usage scenarios. Functionality that is described as being implemented in software can instead be implemented in hardware, or vice versa.
  • Many alternatives to the techniques described herein are possible. For example, processing stages in the various techniques can be separated into additional stages or combined into fewer stages. As another example, processing stages in the various techniques can be omitted or supplemented with other techniques or processing stages. As another example, processing stages that are described as occurring in a particular order can instead occur in a different order. As another example, processing stages that are described as being performed in a series of steps may instead be handled in a parallel fashion, with multiple modules or software processes concurrently handling one or more of the illustrated processing stages. As another example, processing stages that are indicated as being performed by a particular device or module may instead be performed by one or more other devices or modules.
  • While illustrative embodiments have been illustrated and described, it will be appreciated that various changes can be made therein without departing from the spirit and scope of the invention.

Claims (20)

1. A medical diagnostic apparatus, comprising:
a controller;
a user interface configured for user interaction with the medical diagnostic apparatus;
a platform comprising a horizontal surface, wherein at least a portion of the horizontal surface is transparent to visible light;
a weight measurement system coupled to the platform; and
at least one visible light image sensor positioned below the horizontal surface that is capable of producing a diagnostic visible light image of a bottom portion of a foot or feet positioned on the horizontal surface.
2. The medical diagnostic apparatus of claim 1, wherein at least a portion of the horizontal surface is transparent to infrared light suitable for thermal imaging, the medical diagnostic apparatus further comprising at least one infrared image sensor positioned below the horizontal surface that is capable of producing a thermal image of the bottom portion of the foot or feet.
3. The medical diagnostic apparatus of claim 1, wherein the user interface comprises an interactive voice interface.
4. The medical diagnostic apparatus of claim 3, wherein the user interface further comprises a display or a visual indicator.
5. The medical diagnostic apparatus of claim 1 further comprising a visual or tactile guide for foot positioning.
6. The medical diagnostic apparatus of claim 1 further comprising one or more illumination sources.
7. A device for testing for peripheral neuropathy comprising, in combination with the medical diagnostic apparatus of claim 1:
a controller;
a foot platform having at least one opening;
at least one vertically oriented monofilament positioned to pass through the at least one opening of the foot platform; and
at least one actuator positioned below the at least one opening of the foot platform, the at least one actuator being mechanically coupled to the at least one vertically oriented monofilament and configured to move the at least one vertically oriented monofilament to pass through the at least one opening of the foot platform.
8. The medical diagnostic apparatus of claim 1 further comprising at least one visible light image sensor positioned above the platform capable of producing a diagnostic visible light image of a top portion of the user's foot or feet.
9. The medical diagnostic apparatus of claim 1 further comprising at least one infrared image sensor positioned above the platform capable of producing a thermal image of a top portion of the user's foot or feet.
10. A medical diagnostic apparatus, comprising:
a platform comprising a horizontal surface, wherein at least a portion of the horizontal surface is transparent to visible light;
at least one upper image sensor positioned above the horizontal surface, the at least one upper image sensor being configured to capture one or more images of a top portion of a foot or feet;
at least one lower image sensor positioned below the horizontal surface, the at least one lower image sensor being configured to capture one or more images of a bottom portion of the foot or feet;
a controller; and
a user interface configured for user interaction with the medical diagnostic apparatus.
11. The medical diagnostic apparatus of claim 10, wherein at least a portion of the horizontal surface is transparent to infrared light suitable for thermal imaging, the medical diagnostic apparatus further comprising at least one infrared image sensor positioned below the horizontal surface that is capable of producing a thermal image of the bottom portion of the foot or feet.
12. The medical diagnostic apparatus of claim 10, wherein the user interface comprises one or more of an interactive voice interface, a display, or a visual indicator.
13. The medical diagnostic apparatus of claim 10, wherein the at least one lower image sensor comprises two cameras positioned below the horizontal surface.
14. The medical diagnostic apparatus of claim 10 further comprising a visual or tactile guide for foot positioning.
15. The medical diagnostic apparatus of claim 10 further comprising one or more illumination sources.
16. A method for automated diagnosis of a diabetic foot condition, the method comprising:
capturing, by one or more image capture devices of a medical diagnostic apparatus, optical image data of a target area of a foot;
collecting, by a touch sensitivity testing device of the medical diagnostic apparatus, physical touch sensitivity data for the target area of the foot;
transmitting, by the medical diagnostic apparatus, the optical image data, the physical touch sensitivity data, or a combination of such data to an analysis engine; and
outputting, by the analysis engine, one or more indications of a diabetic foot condition.
17. The method of claim 16 further comprising checking the optical image data for one or more of image quality, lighting conditions, or body positioning.
18. The method of claim 16 further comprising, prior to collecting the physical touch sensitivity data, confirming the position of the foot based at least in part on the optical image data.
19. The method of claim 16, wherein the touch sensitivity testing device comprises a servo-actuated monofilament probe.
20. The method of claim 16, wherein the analysis engine comprises an image classifier.
US16/044,248 2017-07-24 2018-07-24 Device for non-invasive detection of skin problems associated with diabetes mellitus Abandoned US20190021649A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US16/044,248 US20190021649A1 (en) 2017-07-24 2018-07-24 Device for non-invasive detection of skin problems associated with diabetes mellitus
US17/752,755 US20220280100A1 (en) 2017-07-24 2022-05-24 Device for non-invasive detection of skin problems associated with diabetes mellitus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762536388P 2017-07-24 2017-07-24
US16/044,248 US20190021649A1 (en) 2017-07-24 2018-07-24 Device for non-invasive detection of skin problems associated with diabetes mellitus

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/752,755 Division US20220280100A1 (en) 2017-07-24 2022-05-24 Device for non-invasive detection of skin problems associated with diabetes mellitus

Publications (1)

Publication Number Publication Date
US20190021649A1 true US20190021649A1 (en) 2019-01-24

Family

ID=65014565

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/044,248 Abandoned US20190021649A1 (en) 2017-07-24 2018-07-24 Device for non-invasive detection of skin problems associated with diabetes mellitus
US17/752,755 Pending US20220280100A1 (en) 2017-07-24 2022-05-24 Device for non-invasive detection of skin problems associated with diabetes mellitus

Family Applications After (1)

Application Number Title Priority Date Filing Date
US17/752,755 Pending US20220280100A1 (en) 2017-07-24 2022-05-24 Device for non-invasive detection of skin problems associated with diabetes mellitus

Country Status (1)

Country Link
US (2) US20190021649A1 (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110517230A (en) * 2019-08-06 2019-11-29 马桂文 Foot morphology analysis and diagnosis system
US10733481B2 (en) * 2018-12-29 2020-08-04 Hon Hai Precision Industry Co., Ltd. Cloud device, terminal device, and method for classifying images
WO2021176255A1 (en) 2020-03-06 2021-09-10 Uab Diabetis System, method, and apparatus for temperature asymmetry measurement of body parts
US20210287797A1 (en) * 2020-03-11 2021-09-16 Memorial Sloan Kettering Cancer Center Parameter selection model using image analysis
WO2022040576A1 (en) * 2020-08-21 2022-02-24 Empo Health, Inc. System to detect foot abnormalities
US20220208344A1 (en) * 2020-12-29 2022-06-30 Kpn Innovations, Llc. Systems and methods for generating an alimentary plan for managing skin disorders
US11426121B1 (en) 2019-09-20 2022-08-30 Auburn University Semi-automated plantar surface sensation detection device
US11484252B2 (en) * 2019-06-17 2022-11-01 Medic, Inc. Device for providing health and wellness data through foot imaging
US11538157B1 (en) * 2019-06-27 2022-12-27 Jeffrey Norman Schoess Imaging system and method for assessing wounds
US11583206B2 (en) * 2017-11-29 2023-02-21 Hewlett-Packard Development Company, L.P. Sensing plantar adipose tissue
US20230169630A1 (en) * 2021-12-01 2023-06-01 Ford Global Technologies, Llc Image compensation service

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9017266B2 (en) * 2007-02-13 2015-04-28 The Hong Kong Polytechnic University Automated testing for palpating diabetic foot patient
GB2550582B (en) * 2016-05-23 2020-07-15 Bluedrop Medical Ltd A skin inspection device identifying abnormalities

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11583206B2 (en) * 2017-11-29 2023-02-21 Hewlett-Packard Development Company, L.P. Sensing plantar adipose tissue
US10733481B2 (en) * 2018-12-29 2020-08-04 Hon Hai Precision Industry Co., Ltd. Cloud device, terminal device, and method for classifying images
US11484252B2 (en) * 2019-06-17 2022-11-01 Medic, Inc. Device for providing health and wellness data through foot imaging
US11538157B1 (en) * 2019-06-27 2022-12-27 Jeffrey Norman Schoess Imaging system and method for assessing wounds
CN110517230A (en) * 2019-08-06 2019-11-29 马桂文 Foot morphology analysis and diagnosis system
US11426121B1 (en) 2019-09-20 2022-08-30 Auburn University Semi-automated plantar surface sensation detection device
US20220296158A1 (en) * 2020-03-06 2022-09-22 Uab Diabetis System, method, and apparatus for temperature asymmetry measurement of body parts
WO2021176255A1 (en) 2020-03-06 2021-09-10 Uab Diabetis System, method, and apparatus for temperature asymmetry measurement of body parts
US20210287797A1 (en) * 2020-03-11 2021-09-16 Memorial Sloan Kettering Cancer Center Parameter selection model using image analysis
US11887732B2 (en) * 2020-03-11 2024-01-30 Memorial Sloan Kettering Cancer Center Parameter selection model using image analysis
WO2022040576A1 (en) * 2020-08-21 2022-02-24 Empo Health, Inc. System to detect foot abnormalities
US20220208344A1 (en) * 2020-12-29 2022-06-30 Kpn Innovations, Llc. Systems and methods for generating an alimentary plan for managing skin disorders
US11581084B2 (en) * 2020-12-29 2023-02-14 Kpn Innovations, Llc. Systems and methods for generating an alimentary plan for managing skin disorders
US20230169630A1 (en) * 2021-12-01 2023-06-01 Ford Global Technologies, Llc Image compensation service

Also Published As

Publication number Publication date
US20220280100A1 (en) 2022-09-08

Similar Documents

Publication Publication Date Title
US20220280100A1 (en) Device for non-invasive detection of skin problems associated with diabetes mellitus
US10863927B2 (en) Identifying fall risk using machine learning algorithms
Maddah et al. Use of a smartphone thermometer to monitor thermal conductivity changes in diabetic foot ulcers: a pilot study
Muneer et al. Smart health monitoring system using IoT based smart fitness mirror
US10117617B2 (en) Automated systems and methods for skin assessment and early detection of a latent pathogenic bio-signal anomaly
US10667682B2 (en) Assessment of low contrast visual sensitivity
KR101535432B1 (en) Contents valuation system and contents valuating method using the system
US11076798B2 (en) System and method for non-invasive and non-contact measurement in early therapeutic intervention
Yan et al. Resting and postexercise heart rate detection from fingertip and facial photoplethysmography using a smartphone camera: a validation study
JP2005228315A (en) Child growth development management system and method
CN106659392A (en) Unobtrusive skin tissue hydration determining device and related method
US20180242874A1 (en) Devices, systems and methods for coronary and/or pulmonary abnormality detection utilizing electrocardiography
US11647938B2 (en) Wearable heartbeat and breathing waveform continuous monitoring system
JP7015795B2 (en) Classification of physical condition
JP2019091498A (en) System and method for correcting answers
JP2022516586A (en) Body analysis
JP2018108327A (en) Health monitoring system, health monitoring method and health monitoring program
US20190355448A1 (en) Automated health assessment system and method thereof
JP2023522952A (en) Systems and methods for remote dermatological diagnosis
Halamka et al. The digital reconstruction of health care
JP2021028808A (en) Information processing system, information processing device, information processing method, program, and learned model
Chadwick et al. Mobile medical applications for melanoma risk assessment: False assurance or valuable tool?
WO2020203651A1 (en) Skin disease analyzing program, skin disease analyzing method, skin disease analyzing device, and skin disease analyzing system
US11950883B2 (en) System, method, and apparatus for temperature asymmetry measurement of body parts
Jayakody et al. HemoSmart: a non-invasive, machine learning based device and mobile app for anemia detection

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: WELLPEPPER, INC., WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:VAN SNELLENBERG, MIKE;WEILER, ANNE;FEASTER, LUKE;AND OTHERS;SIGNING DATES FROM 20190222 TO 20190320;REEL/FRAME:048851/0785

AS Assignment

Owner name: 2020 AWMVS, INC., WASHINGTON

Free format text: CHANGE OF NAME;ASSIGNOR:WELLPEPPER, INC.;REEL/FRAME:051665/0967

Effective date: 20200108

Owner name: CARAVAN HEALTH, INC., MISSOURI

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:2020 AWMVS, INC.;REEL/FRAME:051584/0086

Effective date: 20200113

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION