WO2012111012A1 - System and method for performing an automatic and self-guided medical examination - Google Patents

System and method for performing an automatic and self-guided medical examination

Info

Publication number
WO2012111012A1
Authority
WO
WIPO (PCT)
Prior art keywords
patient
data
diagnostics device
spatial disposition
medical
Prior art date
Application number
PCT/IL2012/050050
Other languages
English (en)
Inventor
David GILAD-GILOR
Original Assignee
Eon Medical Ltd.
Priority date
Filing date
Publication date
Application filed by Eon Medical Ltd. filed Critical Eon Medical Ltd.
Priority to CA2827523A priority Critical patent/CA2827523C/fr
Priority to CN201710067747.2A priority patent/CN107115123B/zh
Priority to AU2012219076A priority patent/AU2012219076B2/en
Priority to CN201280018592.9A priority patent/CN103781403B/zh
Priority to US14/000,374 priority patent/US8953837B2/en
Priority to EP12746572.2A priority patent/EP2675345B1/fr
Priority to JP2013554060A priority patent/JP6254846B2/ja
Publication of WO2012111012A1 publication Critical patent/WO2012111012A1/fr

Classifications

    • A HUMAN NECESSITIES
        • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
            • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
                • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
                    • A61B 5/0002 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
                        • A61B 5/0015 ... characterised by features of the telemetry system
                            • A61B 5/0022 Monitoring a patient using a global network, e.g. telephone networks, internet
                    • A61B 5/0059 Measuring for diagnostic purposes using light, e.g. diagnosis by transillumination, diascopy, fluorescence
                        • A61B 5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
                    • A61B 5/06 Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
                        • A61B 5/065 Determining position of the probe employing exclusively positioning means located on or in the probe, e.g. using position sensors arranged on the probe
                            • A61B 5/067 ... using accelerometers or gyroscopes
                    • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
                        • A61B 5/6801 ... specially adapted to be attached to or worn on the body surface
                            • A61B 5/6843 Monitoring or controlling sensor contact pressure
                • A61B 2562/00 Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
                    • A61B 2562/02 Details of sensors specially adapted for in-vivo measurements
                        • A61B 2562/0257 Proximity sensors
    • G PHYSICS
        • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
            • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
                • G16H 30/00 ICT specially adapted for the handling or processing of medical images
                    • G16H 30/20 ... for handling medical images, e.g. DICOM, HL7 or PACS
                • G16H 40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
                    • G16H 40/60 ... for the operation of medical equipment or devices
                        • G16H 40/63 ... for local operation
                        • G16H 40/67 ... for remote operation

Definitions

  • This invention relates to the field of medical examinations, and more specifically to the field of automatic and self-guided medical examinations.
  • US Patent No. 6,544,198 (Chong et al.) issued April 8, 2003 discloses a stethoscope system for self-examination whereby the condition of health of a particular individual can be diagnosed by comparing characteristic sound waves classified by diseases with sound waves generated from various parts of the individual's body.
  • This system also provides for remote medical examination whereby sound waves generated from various parts of the individual's body are transmitted to a medical specialist using the Internet and receiving a virtual medical examination via the Internet.
  • US Patent No. 6,014,432 (Mondey) issued January 11, 2000 discloses a home health care system comprising: patient station including a first videophone, an electronic imaging assembly and a stethoscope assembly, coupled to said first videophone, for respectively producing digital image and physiological sound signals of a patient, wherein said first videophone simultaneously transmits said digital signals over a public telecommunications network; and a health care provider's station including a second videophone, a video display and a sound reproducer, wherein the second videophone receives digital signals from the first videophone over the public telecommunications network, displays the images of the patient on the display, and reproduces the physiological sounds of the patient by the sound reproducer.
  • the instrument has a casing that includes a hand-holdable body portion, a neck portion that extends from the body portion to a head portion that is formed of a back cover, a front cover, and a sealing gasket to form a fully soakable instrument.
  • a circuit board assembly in the body portion contains video processing circuitry, and a flexible neck board which extends forward from the body portion through the neck portion of the casing to a head board located in the head portion of the casing.
  • a solid state imager and a miniature lamp are disposed on the head board.
  • the front cover contains an adjustable focus lens cell for focusing on the imager an image of a target in the lens cell's field of view.
  • the instrument can be configured for various applications by installing front and back covers that are suited for a specific purpose.
  • the instrument can thus be used, for example, as an otoscope, a dental camera, or an episcope.
  • the instrument provides a monitor-ready standard format full color video signal to a remotely located monitor.
  • Such checks may be required as a routine check-up, according to a patient's request, or in light of a need that arises (such as, for example, when a person does not feel well).
  • Such checks are performed during a face-to-face visit by medically trained personnel (e.g. a physician, a nurse, etc.), since certain knowledge, as well as equipment, is required in order to perform them. It is estimated that billions of medical examinations are performed each year. It is to be noted that the number of general checks is expected to grow in the future, as average life expectancy keeps rising and elderly people tend to use more medical services.
  • a handheld diagnostics device for performing at least one medical examination, the diagnostics device comprising at least one diagnostics sensor and a processor, wherein for each of the at least one medical examinations the processor is configured to provide medical data acquisition guidance to a user based on pre-defined reference data; operate the at least one diagnostics sensor to acquire medical data of a patient.
  • a handheld diagnostics device further comprising at least one navigation sensor operable to acquire position data
  • the processor is further configured to receive navigation data from the at least one navigation sensor and reference data; utilize the navigation data and the reference data in order to calculate the position of the diagnostics device with respect to the patient; calculate a route from the position of the diagnostics device to a desired position of the diagnostics device, in accordance with the reference data; provide navigational guidance to the user; and acquire data relating to the at least one medical examination when the diagnostics device is in the desired position.
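The navigation flow recited above (calculate the device's position from navigation and reference data, derive a route to the desired position, and guide the user until data can be acquired) can be sketched as follows. This is a minimal illustration only: the 2-D coordinates, step size, arrival tolerance and instruction vocabulary are assumptions, not part of the disclosure.

```python
import math

ARRIVAL_TOLERANCE_CM = 1.0  # assumed threshold for "desired position reached"

def guidance_step(current, desired, tol=ARRIVAL_TOLERANCE_CM):
    """Return one maneuvering instruction toward the desired position.

    `current` and `desired` are (x, y) positions of the device on the
    patient's body, in centimetres (an assumed simplification of the
    patent's "spatial disposition").
    """
    dx = desired[0] - current[0]
    dy = desired[1] - current[1]
    if math.hypot(dx, dy) <= tol:
        return "hold position"  # enabling condition met: acquire data here
    if abs(dx) >= abs(dy):
        return "move right" if dx > 0 else "move left"
    return "move up" if dy > 0 else "move down"

def navigate(current, desired, step_cm=1.0, max_steps=100):
    """Iteratively guide the device until the desired position is reached."""
    moves = {"move right": (step_cm, 0), "move left": (-step_cm, 0),
             "move up": (0, step_cm), "move down": (0, -step_cm)}
    route = []
    x, y = current
    for _ in range(max_steps):
        instruction = guidance_step((x, y), desired)
        route.append(instruction)
        if instruction == "hold position":
            break
        dx, dy = moves[instruction]
        x, y = x + dx, y + dy
    return route
```

In a real device the simulated position update would instead come from the navigation sensors, and the instructions would be rendered as voice, visual or vibration guidance, as the claims enumerate.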
  • a handheld diagnostics device wherein the medical data acquisition guidance includes at least one of the following:
  • a handheld diagnostics device wherein the processor is further configured to verify that the acquired data meets pre-defined standards.
  • pre-defined standards are at least one of:
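The text leaves the pre-defined standards open-ended. As a purely illustrative sketch, verification of a sound-based reading might check a minimum recording duration and a usable amplitude range; all thresholds and messages below are hypothetical, not taken from the disclosure.

```python
def meets_standards(samples, sample_rate_hz,
                    min_duration_s=5.0, min_peak=0.05, max_peak=0.95):
    """Check an acquired sound reading against pre-defined standards.

    `samples` is a sequence of normalized amplitude values in [-1, 1].
    Returns (accepted, message); the message could drive the device's
    guidance to the user when a reading must be repeated.
    """
    duration = len(samples) / sample_rate_hz
    if duration < min_duration_s:
        return False, "recording too short - repeat acquisition"
    peak = max(abs(s) for s in samples)
    if peak < min_peak:
        return False, "signal too weak - press the sensor to the body"
    if peak > max_peak:
        return False, "signal clipped - reduce contact pressure"
    return True, "reading accepted"
```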
  • a method for operating a handheld diagnostics device for performing at least one medical examination, the method comprising, for each of the at least one medical examinations: providing medical data acquisition guidance to a user based on pre-defined reference data, and operating the at least one diagnostics sensor to acquire medical data of a patient.
  • a method further comprising receiving navigation data and reference data, utilizing the navigation data and the reference data, in order to calculate position of the diagnostic device with respect to the patient, calculating a route from the position of the diagnostics device to a desired position of the diagnostics device, in accordance with the reference data, providing navigational guidance to the user, and acquiring data relating to the at least one medical examination when the diagnostics device is in the desired position.
  • the medical data acquisition guidance includes at least one of the following:
  • the pre-defined standards are at least one of:
  • a handheld diagnostics device for performing at least one medical examination, the diagnostics device comprising at least one diagnostics sensor operable to acquire medical data, and a processor, wherein for each of the at least one medical examinations the processor is configured to, in a state of the diagnostics device with respect to a patient in which a respective medical examination enabling condition is not met, provide to a user guidance for changing a state of the diagnostics device with respect to the patient; determine, during the changing of the state of the diagnostics device, fulfillment of the respective medical examination enabling condition based on comparison of acquired data with respective reference data; and operate the at least one diagnostics sensor to acquire respective medical data of the patient when the respective medical examination enabling condition is met.
  • a handheld diagnostics device wherein the processor is configured to operate the at least one diagnostics sensor to acquire respective medical data of the patient when a first medical examination enabling condition is met and a second medical examination enabling condition is not met, and afterwards to operate the at least one diagnostics sensor to acquire medical data of the patient when the second medical examination enabling condition is met.
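The sequencing recited above (acquire data for one examination once its enabling condition is met, then proceed to the next examination once its condition is met) can be sketched as a simple loop. The callback structure and names are illustrative assumptions; in the device, the condition predicate would compare sensor data with reference data.

```python
def run_examinations(examinations, guide, acquire):
    """Run each examination only once its enabling condition is met.

    `examinations` maps an examination name to a zero-argument predicate
    that reports whether the enabling condition currently holds; `guide`
    asks the user to change the state of the device with respect to the
    patient; `acquire` operates the diagnostics sensor.
    """
    results = {}
    for name, condition_met in examinations.items():
        while not condition_met():  # e.g. first condition met, second not yet
            guide(name)             # guidance to change the device state
        results[name] = acquire(name)
    return results
```

For example, an ear reading would be acquired as soon as its condition holds, after which the loop keeps guiding the user until the heart-sound condition holds as well.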
  • handheld diagnostics device configured to perform one or more medical examinations of a patient, the diagnostics device comprising:
  • At least one diagnostics sensor
  • the processor is configured to:
  • a handheld diagnostics device wherein the reference data and the navigation enabling data are body or body organ images.
  • a handheld diagnostics device wherein the reference data and the navigation enabling data is Inertial Navigation System (INS) data received from the at least one navigation sensor.
  • the processor is configured to perform the following steps in order to determine the diagnostics device spatial disposition with respect to the desired spatial disposition:
  • a handheld diagnostics device wherein the processor is configured to perform the following steps in order to determine the diagnostics device spatial disposition with respect to the desired spatial disposition:
  • the diagnostics device spatial disposition with respect to the desired spatial disposition.
  • a handheld diagnostics device wherein the determination is based on triangulation techniques, utilizing the received INS data acquired at at least three pre-defined reference points on the patient's body.
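The triangulation step is not spelled out in this summary. One plausible reading, sketched here under the assumptions of a 2-D body-surface coordinate frame, three known reference points (e.g. recorded during calibration) and distances derived from the INS data, is classic trilateration:

```python
def trilaterate(p1, r1, p2, r2, p3, r3):
    """Locate the device from its distances to three reference points (2-D).

    p1..p3 are (x, y) reference points on the patient's body; r1..r3 are
    the device's distances to them. Subtracting the first circle equation
    from the other two yields a 2x2 linear system A @ [x, y] = b.
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21  # zero when the reference points are collinear
    if abs(det) < 1e-12:
        raise ValueError("reference points are collinear")
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y
```

This is why at least three non-collinear reference points are needed: with only two, the circle intersection is ambiguous, and collinear points leave the system singular.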
  • a handheld diagnostics device wherein the processor is further configured to transmit the acquired medical data to a remote workstation.
  • a handheld diagnostics device wherein the processor is configured to perform the transmission via a central system.
  • a handheld diagnostics device wherein the central system routes the acquired medical data to available trained personnel.
  • a handheld diagnostics device wherein the processor is further configured to:
  • a handheld diagnostics device wherein the reference data is acquired during a calibration process performed by trained personnel.
  • a handheld diagnostics device wherein the processor is further configured to perform the following steps during the calibration process:
  • a handheld diagnostics device wherein upon arrival to the desired diagnostics device spatial disposition, the processor is further configured to perform the following additional steps:
  • a handheld diagnostics device wherein the one or more medical examinations of the patient are defined by a pre-defined check plan associated with the patient.
  • a handheld diagnostics device wherein the diagnostics sensor is an image based diagnostics sensor.
  • a handheld diagnostics device wherein the diagnostics sensor is a sound based diagnostics sensor.
  • a handheld diagnostics device wherein the navigation sensor is a camera.
  • In accordance with one example of the presently disclosed subject matter, there is still further provided a handheld diagnostics device wherein the navigation sensor is an INS.
  • In accordance with one example of the presently disclosed subject matter, there is still further provided a handheld diagnostics device wherein the maneuvering instructions are voice instructions provided via a speaker.
  • a handheld diagnostics device wherein the maneuvering instructions are visual instructions provided via a display.
  • a handheld diagnostics device wherein the maneuvering instructions are vibration instructions provided via vibration elements.
  • a handheld diagnostics device wherein the processor is further configured to verify that the acquired data meets pre-defined standards.
  • a handheld diagnostics device wherein the pre-defined standards are at least one of:
  • a handheld diagnostics device wherein the reference data is generic reference data.
  • the processor is further configured to automatically identify the patient.
  • a method for performing one or more medical examinations of a patient using a diagnostics device comprising:
  • the reference data and the navigation enabling data are body or body organ images.
  • the reference data and the navigation enabling data is Inertial Navigation System (INS) data received from the at least one navigation sensor.
  • determining comprises:
  • the diagnostics device spatial disposition with respect to the desired spatial disposition.
  • the determining is based on triangulation techniques, utilizing the received INS data acquired at at least three pre-defined reference points on the patient's body.
  • a method further comprising transmitting the acquired medical data to a trained personnel workstation.
  • a method wherein the transmitting is performed via a central system.
  • the central system routes the acquired medical data to available trained personnel.
  • maneuvering instructions are voice instructions provided via a speaker.
  • maneuvering instructions are visual instructions provided via a display.
  • maneuvering instructions are vibration instructions provided via vibration elements.
  • a method further comprising verifying that the acquired data meets pre-defined standards.
  • the pre-defined standards are at least one of:
  • a method further comprising automatically identifying the patient. There is still further provided a handheld diagnostics device configured to perform one or more medical examinations of a patient, the diagnostics device comprising:
  • At least one diagnostics sensor
  • the processor is configured to:
  • a method for performing one or more medical examinations of a patient using a diagnostics device wherein for at least one medical examination of the medical examinations, the method comprising:
  • Fig. 1 is a block diagram schematically illustrating one example of a system for performing an automatic and self-guided medical examination, in accordance with the presently disclosed subject matter;
  • Fig. 2 is a block diagram schematically illustrating one example of a diagnostic device configured to perform an automatic and self-guided medical examination, in accordance with the presently disclosed subject matter;
  • Fig. 3 is a block diagram schematically illustrating an example of diagnostic sensors configured to acquire medical data, in accordance with the presently disclosed subject matter;
  • Fig. 4 is a block diagram schematically illustrating an example of a navigation module configured to calculate the spatial disposition of the diagnostic device with respect to the patient's body (or a specific part thereof), in accordance with the presently disclosed subject matter;
  • Fig. 5 is a block diagram schematically illustrating an example of a guiding module configured to guide the diagnostic device user, in accordance with the presently disclosed subject matter;
  • Fig. 6 is a flowchart illustrating one example of a sequence of operations carried out for performing an automatic and self-guided medical examination, in accordance with the presently disclosed subject matter;
  • Fig. 7 is a flowchart illustrating one example of a sequence of operations carried out for performing personalized calibration of a diagnostic device, in accordance with the presently disclosed subject matter;
  • Fig. 8a is a flowchart illustrating an example of a sequence of operations carried out for recording reference data during personalized calibration of a diagnostic device, using imaging and orientation sensors, in accordance with the presently disclosed subject matter;
  • Fig. 8b is a flowchart illustrating an example of a sequence of operations carried out for recording reference data during personalized calibration of a diagnostic device, using INS sensors and body points, in accordance with the presently disclosed subject matter;
  • Fig. 8c is a flowchart illustrating one example of a sequence of operations carried out for recording reference data during personalized calibration of a diagnostic device, using reference points and a pointing object, in accordance with the presently disclosed subject matter;
  • Fig. 9 is a schematic illustration of exemplary image based reference patterns, in accordance with the presently disclosed subject matter;
  • Fig. 10 is a schematic illustration of exemplary image based and INS based reference points, in accordance with the presently disclosed subject matter;
  • Fig. 11 is a flowchart illustrating one example of a sequence of operations carried out for calculating the spatial disposition of a diagnostic device with respect to the body of patient 103 (or a specific part thereof), in accordance with the presently disclosed subject matter;
  • Fig. 12 is a flowchart illustrating one example of a sequence of operations carried out for navigating a diagnostic device and guiding a diagnostic device user accordingly, in accordance with the presently disclosed subject matter;
  • Fig. 12a is a flowchart illustrating another example of a sequence of operations carried out for navigating a diagnostic device and guiding a diagnostic device user accordingly, in accordance with the presently disclosed subject matter;
  • Fig. 12b is a schematic illustration of an exemplary pointing object used for navigating a diagnostic device and guiding a diagnostic device user accordingly, in accordance with the presently disclosed subject matter;
  • Fig. 13 is a schematic illustration of an exemplary presentation of navigational instructions to a diagnostic device user, in accordance with the presently disclosed subject matter;
  • Fig. 14 is a flowchart illustrating one example of a sequence of operations carried out for acquisition and verification of a reading by a diagnostic device, in accordance with the presently disclosed subject matter;
  • Fig. 15 is a block diagram schematically illustrating one example of a system for performing an automatic and remote trained personnel guided medical examination, in accordance with the presently disclosed subject matter;
  • Fig. 16 is a schematic illustration of some exemplary guiding devices that can be used for providing navigation instructions to a user of a diagnostic device, in accordance with the presently disclosed subject matter;
  • Fig. 17 is a flowchart illustrating one example of a sequence of operations carried out for performing an automatic and remote trained personnel guided medical examination, in accordance with the presently disclosed subject matter;
  • Fig. 18 is a flowchart illustrating one example of a sequence of operations carried out for navigating a diagnostic device and guiding a diagnostic device user accordingly in a remote trained personnel guided medical examination, in accordance with the presently disclosed subject matter;
  • Fig. 19 is a schematic illustration of an exemplary navigation and guiding presentation to trained personnel, in accordance with the presently disclosed subject matter;
  • Fig. 20 is a flowchart illustrating one example of a sequence of operations carried out for acquisition and verification of a reading by a diagnostic device in a remote trained personnel guided medical examination, in accordance with the presently disclosed subject matter.
  • the term "computer" should be expansively construed to cover any kind of electronic device with data processing capabilities, including, by way of non-limiting example, a personal computer, a server, a computing system, a communication device, a processor (e.g. a digital signal processor (DSP), a microcontroller, a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), etc.), any other electronic computing device, and/or any combination thereof.
  • the phrases "for example", "such as", "for instance" and variants thereof describe non-limiting embodiments of the presently disclosed subject matter.
  • Reference in the specification to “one case”, “some cases”, “other cases” or variants thereof means that a particular feature, structure or characteristic described in connection with the example(s) is included in at least one example of the presently disclosed subject matter.
  • the appearance of the phrase “one case”, “some cases”, “other cases” or variants thereof does not necessarily refer to the same example(s).
  • each module in the figures can be made up of any combination of software, hardware and/or firmware that performs the functions as defined and explained herein.
  • the modules in the figures may be centralized in one location or dispersed over more than one location.
  • Fig. 1 is a block diagram schematically illustrating one example of a system for performing an automatic and self-guided medical examination, in accordance with the presently disclosed subject matter.
  • user 102 and patient 103 are located at patient location 100.
  • User 102 can in some cases be patient 103 whose medical examination is required (in such cases, even though user 102 and patient 103 are shown as separate entities in the drawings, they are in fact the same entity). In other cases, user 102 can be a person that will be performing the medical examination of patient 103.
  • For the purpose of performing a medical examination, user 102 operates a diagnostic device 104, as further detailed below. In some cases, user 102 also operates a patient workstation 114, as further detailed below.
  • Patient workstation 114 can be any computer, including a personal computer, a portable computer, a cellular handset or an apparatus with appropriate processing capabilities, including a computer and/or an apparatus which can be, for example, specifically configured for that purpose. It is to be noted that in some cases, patient workstation 114 can be incorporated within diagnostics device 104. Diagnostics device 104 comprises (or is otherwise associated with) at least one processor 106 (e.g. a digital signal processor (DSP), a microcontroller, a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), etc.).
  • Processor 106 is configured to receive instructions and control the components and operations of diagnostics device 104.
  • diagnostics device 104 can be configured to communicate with patient workstation 114.
  • the communication between diagnostics device 104 and patient workstation 114 can be realized by any communication means, e.g. via wired or wireless communication. It can be noted that user 102, patient 103, diagnostics device 104 and patient workstation 114 are located at patient location 100.
  • Diagnostics device 104 can be configured to acquire various data as further detailed below.
  • the acquired data can be transmitted (directly from diagnostics device 104 or through patient workstation 114) to trained personnel workstation 122 located at trained personnel location 120 and/or to central system 130.
  • Central system 130 and trained personnel workstation 122 can be any computer, including a personal computer, a portable computer, a cellular handset or an apparatus with appropriate processing capabilities, including a computer and/or an apparatus which can be, for example, specifically configured for that purpose.
  • the acquired data can be transmitted for example via Internet 116. It is to be noted that the data can be transmitted while utilizing other known communication alternatives, such as a cellular network, VPN, LAN, etc.
  • Central system 130 comprises patient & check plan repository 136 in which various data relating to the patient is maintained. Such data can include, for example, patient identification number, patient name, patient age, patient contact details, patient medical data (such as diseases, sensitivities to medicines, etc.), check plans data (as further detailed below), etc.
  • Central system 130 can further comprise a medical examination repository 134 in which data acquired by diagnostics device 104 and patient workstation 114 is maintained. Such data can include, for example, results of medical examinations performed using diagnostics device (such as ear readings, lungs or heart recorded sound, blood pressure, body temperature, etc. as further detailed below).
  • Central system 130 further comprises management system 132 configured to forward received data to a selected trained personnel workstation 122 (for example an available trained personnel workstation 122 or trained personnel workstation 122 with the shortest queue).
  • there may be more than one trained personnel location 120 and trained personnel 124 as central system 130 allows for a distributed approach in which data can be received by the central system 130 from multiple patient locations and transferred by it to multiple trained personnel locations.
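The routing behaviour of management system 132 described above (forwarding each received examination to an available trained personnel workstation, for example the one with the shortest queue) can be sketched as follows. The workstation names and the heap-based selection are illustrative choices, not part of the disclosure.

```python
import heapq

class ManagementSystem:
    """Route acquired medical data to the trained-personnel workstation
    with the shortest queue (one of the selection policies the text
    mentions). Queue contents here are plain strings for illustration.
    """

    def __init__(self, workstations):
        # Heap of (queue_length, tie_breaker, name): popping always yields
        # the workstation with the fewest pending examinations.
        self._heap = [(0, i, name) for i, name in enumerate(workstations)]
        heapq.heapify(self._heap)
        self.queues = {name: [] for name in workstations}

    def forward(self, examination_data):
        """Enqueue one examination and return the chosen workstation."""
        length, tie, name = heapq.heappop(self._heap)
        self.queues[name].append(examination_data)
        heapq.heappush(self._heap, (length + 1, tie, name))
        return name
```

With two workstations, successive examinations alternate between them until their queue lengths diverge, which matches the shortest-queue policy.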
  • When the transmitted data is received at central system 130, the data is saved in medical examination repository 134, and management system 132 can transmit the received data to trained personnel location 120 (e.g. via Internet 116).
  • management system 132 can also manage other processes such as, subscribing patients, planning scheduling of patients to available trained personnel, etc.
  • It is to be noted that central system 130 is optional to the solution, and that central system 130 can be part of the trained personnel system 120. In addition, the communication between patient location 100 and trained personnel location 120 can be implemented directly, without the use of or need for a central system 130.
  • When the transmitted data is received at trained personnel workstation 122, the data can be saved in trained personnel data repository 123, which can be connected to trained personnel workstation 122.
  • Trained personnel 124 can be, for example, a doctor, a nurse, a medic, etc., including any other person with the know-how and skill to acquire and/or analyze medical data.
  • patient workstation 114, trained personnel workstation 122 and central system 130 can include a display (e.g. LCD screen), and a keyboard or any other suitable input/output devices.
  • trained personnel 124 can provide feedback to user 102, for example by transmitting data back to patient workstation 114.
  • Such feedback can include, for example, analysis of the received data, request to receive more data, medical treatment instructions, invitation to further examination, etc.
  • trained personnel 124 can transmit feedback data to central system 130, which, in turn, can transmit the feedback data to patient workstation 114 (e.g. via the Internet, cellular network, etc.).
  • Fig. 2 is a block diagram schematically illustrating one example of a diagnostic device configured to perform an automatic and self-guided medical examination, in accordance with the presently disclosed subject matter.
  • Diagnostics device 104 can comprise, inter alia, diagnostic sensors module 202, guiding module 206, examination logic module 208, check plan repository 210 and data repository 216. Diagnostics device 104 can further comprise navigation module 204, reading and verification logic module 212 and calibration logic module 214.
  • Examination logic module 208 can be responsible for operating diagnostics device 104 for performing a medical examination of patient 103.
  • Diagnostics device 104 can be activated, for example, by user 102.
  • user 102 can optionally indicate the patient to be checked.
  • Such indication can be in the form of inputting patient 103 identification details (e.g. patient id, patient name, etc.), for example in patient workstation 114.
  • Such indication can be in the form of selecting a specific patient 103, for example from a list of known patients.
  • Such list of known patients can be displayed on patient workstation 114.
  • Alternatively, such a list of known patients can be displayed on a display connected to diagnostics device 104.
  • Diagnostics device 104 can automatically identify patient 103 by using methods of body identification such as face recognition, fingerprint reading or any other means of biometric identification.
  • Such automatic identification can utilize, for example, navigation camera 420 or any other peripheral, reader or sensor connected to diagnostic device 104 or to patient workstation 114 that enable acquiring data relevant to the automatic identification. It is to be noted that other methods of indicating or identifying a patient to be checked can be utilized as well.
  • examination logic module 208 can be configured to retrieve data relating to a check plan.
  • check plan data can be stored on one or more of: check plan repository 210, patient & check plan repository 136, trained personnel data repository 123 or any other location operatively connected to diagnostics device 104 on which patient specific check plan data can be stored.
  • a check plan can define a series of medical examinations and data to be acquired by diagnostics device 104. Such medical data acquisition can be performed by user 102 on patient 103.
  • the medical data can include, for example, body temperature, blood pressure, pulse, respiratory rate, throat image, mole image, ear image, etc.
  • the check plan can in some cases be a generic check plan (e.g. a check plan that is not specific to a certain patient or medical condition).
  • the check plan can be defined according to a certain medical condition of patient 103 (e.g. a check plan for patients with cancer can comprise a series of cancer specific required medical examinations, a check plan for patients with high blood pressure can comprise a series of high blood pressure specific required medical examinations, etc.).
  • the check plan can be specifically defined for patient 103, for example according to a trained personnel 124 decision (e.g. a physician interested in monitoring specific medical data of a specific patient can decide upon a patient specific check plan).
  • the check plan can include information, inter alia, about the examination process, steps and logic, and predefined reading parameters such as type of sensor to be used (still image vs. video), required length of reading (sound or video recording) in terms of time (e.g. seconds), and reading data thresholds (for example, definition of acceptable minimal and/or maximal reading limits to be used as a quality parameter of a reading).
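As a minimal sketch of what such a check plan might look like in software, the structure below encodes an ordered series of examinations, each with predefined reading parameters (sensor type, required reading length, quality thresholds). The field names and threshold units are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical patient-specific check plan: an ordered series of medical
# examinations with predefined reading parameters, as described above.
check_plan = {
    "patient_id": "patient-103",
    "examinations": [
        {
            "name": "ear image",
            "sensor": "still_image",     # still image vs. video
            "reading_length_sec": 0,     # single frame, no minimum duration
        },
        {
            "name": "lung sound",
            "sensor": "sound",
            "reading_length_sec": 10,    # required recording length
            "thresholds": {"min_db": 20, "max_db": 90},  # quality limits
        },
    ],
}

def reading_is_acceptable(exam, measured_db, duration_sec):
    """Check a reading against the plan's predefined quality parameters."""
    if duration_sec < exam["reading_length_sec"]:
        return False
    t = exam.get("thresholds")
    if t and not (t["min_db"] <= measured_db <= t["max_db"]):
        return False
    return True
```

A reading whose duration or level falls outside the plan's limits would then trigger the re-acquisition flow described later in the text.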
  • examination logic module 208 can be configured to utilize navigation module 204 in order to enable determination of current diagnostics device spatial disposition with respect to patient's 103 body (or a specific part thereof).
  • spatial disposition or the like can relate to spatial distances, spatial angles (including orientations), or any other spatial reference that is used for characterizing a spatial relationship between two objects, e.g. between diagnostics device 104 and patient's 103 body (or a specific part thereof).
  • Navigation module 204 can be responsible for the operation of various sensors utilized for that purpose, as further detailed below with reference to Fig. 4. Navigation module 204 can utilize pre-stored reference data for establishing data about diagnostics device 104 current and desired spatial dispositions with respect to patient's 103 body (or a specific part thereof).
  • the pre-stored reference data can consist of image based reference data and/or diagnostics device 104 spatial disposition based reference data, or any other relevant reference data, including data that can be read by diagnostics device 104 navigation module 204 or diagnostic sensors 202, as further detailed below, inter alia with respect to Figs. 6, 9 and 10-13.
  • the reference data can be for example images of patient 103 (external patient images and/or internal patient images of internal body parts), general organ images, device coordinates, data of relativity between spatial dispositions with respect to patient's 103 body (or a specific part thereof), etc.
  • Such pre-stored reference data can be stored on patient & check plan repository 136, trained personnel data repository 123 or any other location operatively connected to diagnostics device 104 on which image based reference data is stored.
  • navigation module 204 can calculate a route to a desired diagnostics device 104 spatial disposition with respect to patient's 103 body (or a specific part thereof), that can be defined, for example, by the patient specific check plan.
  • the route calculation can be performed continuously or periodically (e.g. every pre-determined time interval), for example until arrival to the desired diagnostics device 104 spatial disposition with respect to patient's 103 body (or a specific part thereof), as further detailed below, inter alia with reference to Figs. 6 and 11-13.
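One simple way to picture the route calculation above: the "route" is the positional and angular offset between the device's current spatial disposition and the desired one, recomputed every interval until arrival. The six-component pose representation and the tolerance values below are assumptions for illustration only:

```python
# Sketch: route = offset between current and desired spatial disposition.
def compute_route(current, desired):
    """current/desired: (x, y, z, roll, pitch, yaw) tuples."""
    return tuple(d - c for c, d in zip(current, desired))

def arrived(route, pos_tol=0.005, ang_tol=2.0):
    """True when all offsets are within tolerance (metres / degrees)."""
    pos_ok = all(abs(v) <= pos_tol for v in route[:3])
    ang_ok = all(abs(v) <= ang_tol for v in route[3:])
    return pos_ok and ang_ok

current = (0.10, 0.20, 0.30, 0.0, 5.0, 0.0)
desired = (0.10, 0.25, 0.30, 0.0, 0.0, 0.0)
route = compute_route(current, desired)  # move along y, correct pitch
```

In the device, this computation would be repeated continuously or every pre-determined time interval, feeding the guidance outputs described below.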
  • examination logic module 208 can be configured to utilize guiding module 206 in order to provide various guidance data instructing user 102 how to maneuver diagnostics device 104 to the desired diagnostics device 104 spatial disposition with respect to patient's 103 body (or a specific part thereof).
  • guidance data can include, inter alia, voice commands, image display, diagnostics device 104 vibrations, etc., as further detailed below, inter alia with reference to Figs. 5, 6 and 11-
  • Such guidance data can be presented to user 102 continuously or periodically (e.g. every pre-determined time interval), until diagnostics device 104 arrives at the desired spatial disposition with respect to patient's 103 body (or a specific part thereof) from which the medical examination can be performed.
  • Such guidance data can be calculated according to the respective calculation of a route to the desired diagnostics device 104 spatial disposition with respect to patient's 103 body (or a specific part thereof), as calculated by navigation module 204.
  • examination logic module 208 can be configured to utilize reading and verification logic module 212 in order to acquire medical data of patient 103.
  • reading and verification module 212 can be configured to verify that diagnostics device 104 is located at the desired spatial disposition with respect to patient's 103 body (or a specific part thereof) when acquiring medical data of patient 103, as further detailed below, inter alia with reference to Fig.
  • Reading and verification module 212 can be further configured to instruct diagnostics sensor module 202 to prepare to acquire medical data of patient 103, and to perform acquisition of such medical data, as further detailed below, inter alia with reference to Fig. 14. After acquisition of medical data of patient 103, reading and verification module 212 can be configured to verify that the acquired data meets pre-defined standards (e.g. a required length of reading, reading data thresholds, etc.), as further detailed below, inter alia with reference to Fig. 14. In case the acquired data does not meet the pre-defined standards, diagnostics device 104 can in some cases be configured to instruct user 102 to perform the required repositioning and reorienting thereof in order to bring diagnostics device 104 to the desired spatial disposition with respect to patient's 103 body (or a specific part thereof). Following repositioning and reorienting of diagnostics device 104, reading and verification logic module 212 can be configured to retry acquiring the medical data of patient 103, as further detailed below, inter alia with reference to Fig. 14.
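The acquire-verify-retry behaviour described above can be sketched as a small loop. The sensor read is simulated here with a callable; in the actual device it would come from diagnostics sensor module 202. The function names and the retry limit are illustrative assumptions:

```python
# Sketch of the acquire -> verify -> reposition -> retry flow.
def acquire_with_verification(read_fn, min_len_sec, max_attempts=3):
    """Retry acquisition until the reading meets the predefined standards."""
    for attempt in range(1, max_attempts + 1):
        reading = read_fn()
        if reading["length_sec"] >= min_len_sec:  # meets the standard
            return reading, attempt
        # Here the user would be guided to reposition/reorient the device.
    return None, max_attempts  # give up after the allowed attempts

# Simulated sensor: first reading too short, second one acceptable.
readings = iter([{"length_sec": 4}, {"length_sec": 10}])
result, attempts = acquire_with_verification(lambda: next(readings),
                                             min_len_sec=8)
```

Real verification would of course check all predefined standards (thresholds, signal quality, etc.), not only the reading length.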
  • Diagnostics device 104 can be further configured to utilize diagnostics sensor module 202 that can be configured to acquire medical data of patient 103. Diagnostics sensor module 202 can be responsible for the operation of various sensors used for acquiring various medical data of patient 103. Such medical data of patient 103 can be used for example for diagnostics by trained personnel 124. Diagnostics sensor module 202 is further discussed below, inter alia with reference to Fig. 3.
  • diagnostics device 104 can further comprise a calibration logic module 214.
  • Calibration logic module 214 can be configured, inter alia, to acquire reference data relating to medical examinations of patient 103, as further detailed below, for example with reference to Fig. 7.
  • the reference data is acquired by diagnostics device 104 during an initial calibration performed by trained personnel 124.
  • a physician can perform a medical examination of patient 103 and diagnostics device 104 can, for example, record the medical examination performed by trained personnel 124, including the acquired medical data, as further detailed below, for example with reference to Fig. 7.
  • the recorded data can be stored, for example, on one or more of: check plan repository 210, patient & check plan repository 136, trained personnel data repository 123 or any other location operatively connected to diagnostics device 104 on which data relating to patient 103 can be stored.
  • diagnostics device 104 can further comprise data repository 216.
  • Data repository 216 can be configured to store various data, including, inter alia, data relating to one or more patients and various medical data thereof (e.g. data acquired during a medical examination of the patients), as further detailed below.
  • Diagnostics device 104 can further comprise check plan repository 210.
  • Check plan repository 210 can be configured to store various data, including, inter alia, data relating to patient specific check plans, as further detailed below.
  • Fig. 3 is a block diagram schematically illustrating an example of diagnostic sensors configured to acquire medical data, in accordance with the presently disclosed subject matter.
  • Diagnostics sensors module 202 can include, inter alia, image based sensors 310 and sound based sensors 320, as well as other sensors not shown in the drawing. Diagnostic sensors 202 can be designed for taking a specific organ reading (such as an ear image reading (e.g. otoscope)) and general organ readings (such as external skin reading, eye reading, etc.). Diagnostic sensors 202 can be modular, e.g. some sensors can be attached/detached to diagnostics device 104, in accordance with the required medical examination.
  • Image based sensors 310 can include one or more light sources 318.
  • Light sources 318 can be Light Emitting Diodes, or any other light source known in the art. Light sources 318 can be utilized for example to light the areas of which an image is to be acquired in order to provide for sufficient image quality (e.g. a quality that will enable image analysis by trained personnel 124).
  • Image based sensors 310 can further include image examination peripherals 312.
  • Image examination peripherals 312 can include, inter alia, various components that enable safe access to various body parts, such as a human ear, throat, etc. Such components can be, for example, made of plastic and can be attached to diagnostics device 104. Such components can, for example, have a generic physical structure that fits various body parts regardless of the fact that different people, at different ages, have different body parts structure (e.g. a child has a smaller ear than a grown person and the image examination peripherals 312 can be designed to fit substantially any ear structure, etc.).
  • Image examination peripherals 312 can aid user 102 in positioning the diagnostics device 104 in the desired spatial disposition with respect to patient's 103 body (or a specific part thereof) so that acquisition of image based medical data can be performed.
  • Image based sensors 310 can further include image acquisition sensor 316.
  • Image acquisition sensor 316 can be, inter alia, a camera (e.g. a still camera, a video camera, etc.), or any other device capable of acquiring an image.
  • Image acquisition sensor 316 can be based on standard sensors such as CMOS or CCD or any other applicable sensor known in the art.
  • Image acquisition sensor 316 can be designed to fit image acquisition of multiple body parts or organs, regardless of size or distance (e.g. it can have the required resolution and/or size and/or light sensitivity to fit multiple body parts or organ readings). It is to be noted that image acquisition sensor 316 can be the same sensor as the navigation image acquisition sensor and vice versa.
  • Image based sensors 310 can further include examination optics 314.
  • Examination optics 314 can be, for example, camera lenses. Examination optics 314 can be designed to fit various wavelengths, field depth, wide or narrow lens angle, etc., and therefore can fit various types of image readings as well as various types of organ sizes and structures. Examination optics 314 enable image acquisition sensor 316 to acquire image based medical data having the required properties (e.g. examination optics 314 should enable acquisition of an image that covers the entire area that is required for analysis by trained personnel 124, etc.). In some cases, data acquired from examination optics 314 and image acquisition sensor 316 can be later analyzed and/or transformed and/or aligned to fit the specific required organ area reading (e.g. in order to fit a quality analysis by trained personnel 124, the specific required image area can be cut out of the entire image, or can be aligned using image analysis and/or image transformation or manipulation techniques and/or algorithms known in the art).
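The "cut the required image area out of the entire image" step mentioned above can be sketched very simply. The image is modelled as a plain row-major list of rows; a real implementation would operate on camera frames with an image-processing library, and the region coordinates would come from the check plan or the alignment step:

```python
# Illustrative crop of the required organ area out of a full acquired image.
def crop(image, top, left, height, width):
    """Return the sub-image covering only the area required for analysis."""
    return [row[left:left + width] for row in image[top:top + height]]

# A 4x6 "image" whose pixels record their own (row, col) coordinates.
full_image = [[(r, c) for c in range(6)] for r in range(4)]
organ_area = crop(full_image, top=1, left=2, height=2, width=3)
```

Alignment (rotation, scaling) before cropping would use standard image transformation techniques, as the text notes.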
  • Sound based sensors 320 can include one or more sound acquisition sensors 324.
  • Sound acquisition sensors 324 can be, for example, microphones, or any other devices capable of acquiring sound data. Sound acquisition sensors 324 can fit multiple sound frequencies that can be adjusted to fit recording of specific organ sounds (as, for example, heart sound frequencies are different than lung sound frequencies). Sound acquisition sensors 324 can also include various abilities to assist in acquiring quality sound, such as noise cancelation filters, etc.
  • Sound based sensors 320 can further include sound examination peripherals 322.
  • Sound examination peripherals 322 can include, inter alia, various components that enable easy fit, comfortable adjustment and safe access to various body parts, such as a human chest, stomach, lung, etc.
  • Such components can be, for example, made of plastic, rubber, etc. and can be attached to diagnostics device 104.
  • Such components can, for example, have a generic physical structure that fits various body parts regardless of the fact that different people, at different ages, have different body parts structure (e.g. a child has a smaller chest than a grown person and the sound examination peripherals 322 can be designed to fit substantially any chest structure, etc.).
  • Sound examination peripherals 322 can aid user 102 in positioning diagnostics device 104 in the desired spatial disposition with respect to patient 103 body (or a specific part thereof) in a way that will enable acquisition of sound based medical data (e.g. allow minimizing any external noise that can interfere with the sound acquisition).
  • Fig. 4 is a block diagram schematically illustrating an example of a navigation module configured to calculate the spatial disposition of the diagnostic device with respect to patient's body (or a specific part thereof), in accordance with the presently disclosed subject matter.
  • Navigation module 204 can comprise navigation logic module 400.
  • Navigation logic module 400 can be configured to determine current diagnostics device 104 spatial disposition with respect to patient's 103 body, and to calculate a route to a desired diagnostics device 104 spatial disposition with respect to patient's 103 body (or a specific part thereof), as further detailed below, inter alia with respect to Figs. 6, 9 and 10-13.
  • navigation logic module 400 can be configured to utilize navigation sensors such as Inertial Navigation System (INS) sensors 410 (for example IMUs - Inertial Measurement Units) and/or navigation camera 420, etc.
  • INS sensors 410 can include movement sensors 412 (such as accelerometers sensors, etc.) capable of acquiring data relating to the movement of diagnostics device 104 and orientation sensors 414 (such as gyroscope sensors, etc.) capable of acquiring data relating to the orientation of diagnostics device 104.
  • Navigation logic module 400 can use the raw INS sensors data to calculate the exact movement and orientation of the device with respect to patient's 103 body, and can also include the required logic to eliminate sensor calibration errors using techniques and algorithms known in the art.
  • navigation can be based on INS data alone. It is to be noted that navigation based on INS data alone requires substantially no movement of patient 103 during the medical examination, as such movement may result in deviations that will prevent accurate acquisition of medical data.
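A rough dead-reckoning sketch of how raw INS samples (accelerometer and gyroscope readings) could be integrated into device movement and orientation follows. This is a deliberately naive Euler integration; as the text notes, a real device must also correct sensor calibration errors and drift, which this sketch omits:

```python
# Naive INS dead reckoning: integrate acceleration twice into position,
# and angular rate once into orientation.
def integrate_ins(samples, dt):
    """samples: list of (ax, ay, az, wx, wy, wz); returns (position, angles)."""
    vel = [0.0, 0.0, 0.0]
    pos = [0.0, 0.0, 0.0]
    ang = [0.0, 0.0, 0.0]
    for ax, ay, az, wx, wy, wz in samples:
        for i, a in enumerate((ax, ay, az)):
            vel[i] += a * dt          # acceleration -> velocity
            pos[i] += vel[i] * dt     # velocity -> position
        for i, w in enumerate((wx, wy, wz)):
            ang[i] += w * dt          # angular rate -> orientation
    return pos, ang

# Constant 1 m/s^2 along x plus a slow yaw, sampled 100 times at 10 ms.
pos, ang = integrate_ins([(1, 0, 0, 0, 0, 0.1)] * 100, dt=0.01)
```

Because errors accumulate in such integration, the device combines INS data with the camera-based matching described next, and INS-only navigation requires the patient to remain substantially still.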
  • Navigation module 204 can further comprise navigation camera 420.
  • Navigation camera 420 can comprise a navigation image acquisition sensor 422 configured to acquire an image of patient 103 body and can further comprise optics 424.
  • Optics 424 can be, for example, camera lenses.
  • Optics 424 can have various wavelengths, field depth, wide or narrow lens angle, etc.
  • Optics 424 enable navigation camera 420 to acquire image data, having the required properties for enabling navigation of diagnostics device 104.
  • Navigation camera 420 can be used to acquire relevant body and/or organ images that navigation logic module 400 can utilize in order to identify current spatial disposition of diagnostics device 104 with respect to patient's 103 body (or a specific part thereof). This calculation can be done, for example, by comparing an image acquired (e.g. by navigation camera 420) to pre-stored reference images.
  • navigation logic module 400 can be configured to perform image matching (for example by utilizing known techniques) to analyze diagnostics device 104 relative position therefrom, and use that match to define the current diagnostics device 104 spatial disposition as a temporary "origin point" to be used as a synchronization point for calculating the required route to the desired diagnostic device 104 spatial disposition, as further detailed below, for example with reference to Fig. 9 and 10.
  • diagnostics device 104 can be configured to continuously or periodically (e.g. every pre-determined time interval) compare the image acquired by navigation camera 420 to reference images (e.g. reference images saved on check plan repository 210), and once a match is found diagnostics device 104 can be configured to calculate the current device spatial disposition with respect to the patient's 103 body (or a specific part thereof).
  • diagnostics device 104 can be configured to calculate the required route to the desired diagnostic device 104 spatial disposition with respect to the patient's 103 body (or a specific part thereof), as further detailed below, for example with reference to Figs. 9 and 10.
  • Once diagnostics device 104 identifies that it has reached the desired spatial disposition with respect to the patient's 103 body (or a specific part thereof), it can alert the user not to move until the required image is acquired.
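The image-matching step that establishes the temporary "origin point" can be sketched as sliding a stored reference patch over the acquired frame and taking the best-scoring position. The sum-of-absolute-differences score below is the simplest possible choice; a real implementation would use more robust matching (e.g. normalized cross-correlation or feature matching), and the toy images here are assumptions:

```python
# Simplified template matching: find where the reference patch best fits.
def best_match(frame, ref):
    """frame/ref: 2D lists of grayscale values; returns (row, col) of match."""
    fh, fw = len(frame), len(frame[0])
    rh, rw = len(ref), len(ref[0])
    best, best_pos = None, None
    for r in range(fh - rh + 1):
        for c in range(fw - rw + 1):
            # Sum of absolute differences between patch and reference.
            sad = sum(abs(frame[r + i][c + j] - ref[i][j])
                      for i in range(rh) for j in range(rw))
            if best is None or sad < best:
                best, best_pos = sad, (r, c)
    return best_pos

frame = [[0] * 5 for _ in range(5)]
frame[2][3] = frame[2][4] = frame[3][3] = frame[3][4] = 9  # bright patch
ref = [[9, 9], [9, 9]]
origin = best_match(frame, ref)
```

The matched position, combined with the known spatial disposition at which the reference image was recorded, lets the device synchronize its pose and compute the route to the desired disposition.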
  • Navigation module 204 can further comprise one or more navigation light sources 426.
  • Navigation light sources 426 can be Light Emitting Diodes, or any other light source known in the art.
  • Navigation module 204 can further comprise distance sensors 430.
  • Distance sensors 430 can be for example a laser distance sensor, as known in the art, or any other sensor that can determine distance of diagnostics device 104 from an object (e.g. patient 103 body, or a specific part thereof).
  • Navigation logic module 400 can utilize data received from distance sensors 430 in order to calculate the spatial disposition of diagnostics device with respect to patient's 103 body (or a specific part thereof).
  • Navigation module 204 can further comprise pressure sensors 440.
  • Pressure sensors 440 can be pressure sensors known in the art that can determine the amount of pressure exerted on diagnostics device 104 as it is pressed against an object (e.g. patient 103 body, or a specific part thereof).
  • Navigation logic module 400 can utilize data received from pressure sensors 440 in order to calculate the spatial disposition of diagnostics device with respect to patient's 103 body (or a specific part thereof).
  • Fig. 5 is a block diagram schematically illustrating an example of a guiding module configured to guide the diagnostic device user, in accordance with the presently disclosed subject matter.
  • Guiding module 206 can comprise guiding logic module 500.
  • guiding logic module 500 can be configured to provide various guidance data instructing user 102 how to maneuver diagnostics device 104 to the desired diagnostics device 104 spatial disposition with respect to patient's 103 body (or a specific part thereof).
  • Such guidance data can include, inter alia, voice commands, image display, diagnostics device 104 vibrations, etc.
  • Such guidance data can be presented to user 102 continuously or periodically, until diagnostics device 104 arrives at desired diagnostics device 104 spatial disposition with respect to patient's 103 body (or a specific part thereof).
  • Such guidance data can be calculated according to the respective calculation of a route to the desired diagnostics device 104 spatial disposition with respect to patient's 103 body (or a specific part thereof), as calculated by navigation module 204.
  • guiding module 206 can comprise one or more output sources, such as, for example, display 502, speaker 510, vibration elements 508, guiding light sources 506, keypad 504, etc.
  • Display 502 can be configured to present visual data providing user 102 with information on how to maneuver diagnostics device 104 to the desired diagnostics device 104 spatial disposition with respect to patient's 103 body (or a specific part thereof). Such information can, in some cases, include a visual representation of diagnostics device 104 current spatial disposition with respect to patient's 103 body (or a specific part thereof) and of the desired diagnostics device 104 spatial disposition with respect to patient's 103 body (or a specific part thereof).
  • Turning to Fig. 13, there is shown a schematic illustration of an exemplary presentation of navigational instructions to a diagnostics device user, in accordance with the presently disclosed subject matter.
  • An object 950A, 950B, 950C representing diagnostics device 104 current spatial disposition with respect to patient's 103 body (or a specific part thereof) can be presented on display 502, along with a target mark 952 representing the desired diagnostics device 104 spatial disposition with respect to patient's 103 body (or a specific part thereof).
  • A three-dimensional smiley object 950A, 950B, 950C representation on display 502 is continuously or periodically updated, reflecting changes to diagnostics device 104 spatial disposition with respect to patient's 103 body (or a specific part thereof).
  • Initially, at object 950A, diagnostics device 104 is positioned relatively far from target mark 952 (it can be appreciated that it is located above and to the right of target mark 952).
  • diagnostics device is not oriented as required (it can be appreciated that it is not facing directly forward).
  • User 102 repositions and reorients diagnostics device 104 according to the feedback presented on display 502. Repositioning can be made by moving diagnostics device 104 forward/backward, up/down, left/right. Reorienting can be made by roll, pitch, yaw movements of diagnostics device 104. Such repositioning and reorientation of diagnostics device 104 is reflected on display 502, for example continuously or periodically.
  • diagnostics device 104 is coming closer to the desired spatial disposition with respect to patient's 103 body (or a specific part thereof) at object 950B, which, as can be appreciated, is closer to target mark 952.
  • diagnostics device 104 is coming still closer to the desired spatial disposition with respect to patient's 103 body (or a specific part thereof) at object 950C, which, as can be appreciated, is even closer to target mark 952 than object 950B.
  • diagnostics device 104 is at target mark 952 - the desired spatial disposition with respect to patient's 103 body (or a specific part thereof).
  • Object 950 can comprise visual representations and hints about the navigation process; such representations can include, for example, color changes (such as red for wrong and green for good) and/or emoticons illustrating the proximity of diagnostics device 104 to the desired spatial disposition with respect to patient's 103 body (or a specific part thereof) (for example, object 950 initially has a sad smiley and as it nears target mark 952 the sad smiley becomes a happy smiley).
  • speaker 510 can provide voice instructions to user 102 indicating the required movements user 102 should perform in order to bring diagnostics device to desired spatial disposition with respect to patient's 103 body (or a specific part thereof).
  • speaker 510 can provide sound feedbacks about proximity of diagnostics device 104 to desired spatial disposition with respect to patient's 103 body (or a specific part thereof) (for example a sound feedback might be a series of short beeps and changes to their rate according to the proximity of diagnostics device 104 to desired spatial disposition).
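The proximity-dependent beep rate described above amounts to mapping the remaining distance onto a beep interval: the closer the device, the shorter the interval. The linear mapping and the specific bounds below are assumptions chosen for illustration:

```python
# Map proximity to beep interval: near -> fast beeps, far -> slow beeps.
def beep_interval_sec(distance_m, near=0.01, far=0.30,
                      fastest=0.1, slowest=1.0):
    """Linearly map distance (clamped to [near, far]) to a beep interval."""
    d = min(max(distance_m, near), far)
    frac = (d - near) / (far - near)
    return fastest + frac * (slowest - fastest)
```

The speaker would then emit a beep every `beep_interval_sec(...)` seconds, with the interval shrinking as the device approaches the desired spatial disposition.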
  • Vibration elements 508 can provide vibrating feedback, for example in order to indicate to user 102 that a movement he is making is not correct (e.g. if diagnostics device 104 should be moved to the right and user 102 moves it to the left, a vibration can be initiated). Vibration can also be provided indicating that diagnostics device 104 reached the desired spatial disposition with respect to patient's 103 body (or a specific part thereof). In some cases such vibration will be different from a vibration indicating a wrong movement.
  • Guiding light source 506 can provide light feedback to user 102 about required diagnostics device 104 movement and/or proximity of diagnostics device 104 to desired spatial disposition with respect to patient's 103 body (or a specific part thereof).
  • a combination of LED elements such as a matrix of LED elements located on diagnostic device 104 can provide user 102 with a light feedback about the required movement direction (e.g. right, left, up, down, etc.).
  • guiding light source 506 can be configured to utilize movement sensors 412 and orientation sensors 414 in order to calculate and use the correct light source (e.g. a specific LED, etc.) which is relevant to the current movement, based on current diagnostics device 104 spatial disposition with respect to patient's 103 body (or a specific part thereof) (e.g. the same LED can sometimes point up and sometimes down according to the device orientation).
  • the LED elements can also provide a proximity feedback using specific rate of lights going on and off.
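Selecting which LED to light so that it points in the required movement direction, while compensating for the device's current roll, can be sketched as rotating the direction vector before mapping it onto the matrix. A 3x3 LED matrix and roll in multiples of 90 degrees are simplifying assumptions:

```python
# (row, col) offsets in a 3x3 LED matrix whose centre element is (1, 1).
DIRECTIONS = {"up": (-1, 0), "down": (1, 0), "left": (0, -1), "right": (0, 1)}

def led_for_direction(direction, roll_deg):
    """Pick the LED that points in `direction`, compensating for device roll.

    roll_deg is assumed to be a multiple of 90 degrees for this sketch.
    """
    dr, dc = DIRECTIONS[direction]
    for _ in range((roll_deg // 90) % 4):
        dr, dc = dc, -dr  # rotate the direction vector by 90 degrees
    return (1 + dr, 1 + dc)
```

For example, with the device rolled 90 degrees, the LED that appears "up" to the user is a different physical LED than when the device is upright, which is exactly the behaviour described above.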
  • Keypad 504 - in some cases the guiding process, using guiding logic module 500, can also require feedback from user 102, for example a confirmation about ending a specific medical examination, etc.
  • guiding module 206 can comprise one or more input sources, such as, for example, keypad 504.
  • Fig. 6 is a flowchart illustrating one example of a sequence of operations carried out for performing an automatic and self-guided medical examination, in accordance with the presently disclosed subject matter.
  • A check is performed to determine whether the initiated check is the first check of patient 103 with diagnostics device 104.
  • If so, personalized organ/body calibration is performed (step 600), as further detailed with respect to Fig. 7.
  • a medical examination is initiated (step 602).
  • diagnostics device 104 (for example by utilizing examination logic module 208) can receive an indication of patient 103 to be checked.
  • Diagnostics device 104 can be further configured to retrieve various data relating to patient 103. Such data can be retrieved from one or more of: data repository 216, check plan repository 210, trained personnel data repository 123, patient & check plan repository 136 or any other location operatively connected to diagnostics device 104 on which patient data is stored. Such data can include, inter alia, data relating to a patient specific check plan, reading references, communication parameters, etc.
  • Diagnostics device 104 can be further configured to display a questionnaire (step 604) to be answered by user 102 and/or patient 103.
  • Questionnaire can be displayed, for example, on patient workstation 114 or can be played as a voice based questionnaire.
  • Questionnaire can comprise generic and/or patient 103 specific questions designed to provide trained personnel 124 with various data (e.g. data relating to patient 103 medical condition), including data required to enable analysis of the medical data acquired during the medical examinations (e.g. "does the patient have a fever, and for how long?", "how high is it?", "does the patient feel any pain?", "where is the pain located?", etc.).
  • User 102 or patient 103 can answer the questionnaire, for example by voice recording using diagnostics device 104 or patient workstation 114, or by replying to a computerized questionnaire which can be displayed on patient workstation 114. It is to be noted that other methods can be utilized in order to provide answers to the questionnaire.
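The questionnaire step above can be sketched as a small loop that asks each question and collects the answer. The asking function is injected so the same flow can be driven by voice input, the keypad, or patient workstation 114; the question list and all names are illustrative assumptions:

```python
# Illustrative generic questionnaire, as described above.
GENERIC_QUESTIONS = [
    "Does the patient have a fever, and for how long?",
    "Does the patient feel any pain?",
]

def run_questionnaire(questions, ask_fn):
    """ask_fn(question) -> answer string; returns a question->answer mapping."""
    return {q: ask_fn(q) for q in questions}

# Simulated answer source standing in for voice or keyboard input.
answers = iter(["Yes, two days", "No"])
result = run_questionnaire(GENERIC_QUESTIONS, lambda q: next(answers))
```

The collected answers would then accompany the acquired medical data sent to trained personnel 124 for analysis.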
  • Diagnostics device 104 can be further configured to perform a medical examination selection and initiation (step 606).
  • diagnostics device 104 can enable user 102 to select a medical examination to be performed, either manually or from a list of checks to be performed as defined in patient 103 check plan.
  • diagnostics device 104 can select and initiate a check according to a pre-defined order set by patient 103 specific check plan, without input from user 102.
  • the medical examination initiation can consist of, for example, retrieving reference medical examination data from the check plan or a relevant repository (similar to medical examination initiation step 602).
  • diagnostics device 104 can be configured to perform device orientation (step 608).
  • diagnostics device 104 can instruct user 102 to move it to a position and orientation in proximity of a known reference point (e.g. patient 103 nose, ear, eye, etc.).
  • diagnostics device 104 can instruct navigation camera 420 to continuously or periodically acquire images of the patient's body. Diagnostics device 104 can continuously compare the acquired images to known reference images of patient 103.
  • diagnostics device 104 can be configured to notify user 102 of the match and to determine its spatial disposition with respect to patient's 103 body (or a specific part thereof).
  • diagnostics device 104 can be configured to perform navigation and guiding (step 610) of diagnostics device 104 to the desired spatial disposition with respect to the patient's 103 body (or a specific part thereof) that will enable acquisition of medical data by diagnostics device 104.
  • Diagnostics device 104 can be configured to calculate a route to a desired spatial disposition with respect to the patient's 103 body (or a specific part thereof).
  • Such desired spatial disposition with respect to the patient's 103 body (or a specific part thereof) can be defined, for example, by the patient specific check plan (e.g. in accordance with the personalized organ/body calibration performed for patient 103).
  • the route calculation is performed continuously or periodically, for example until arrival to the desired diagnostics device 104 spatial disposition with respect to the patient's 103 body (or a specific part thereof). It is to be noted that the navigation and route calculation processes are further explained below, inter alia with respect to Figs. 11 and 12.
  • diagnostics device 104, for example by utilizing guiding module 206, can provide various guidance data instructing user 102 how to maneuver diagnostics device 104 to the desired diagnostics device 104 spatial disposition with respect to the patient's 103 body (or a specific part thereof), in accordance with the navigation calculations indicated above.
  • diagnostics device 104 can be configured to check if the navigation quality is sufficient and whether diagnostics device 104 needs to be re-oriented. Such checks can be performed for example by searching for additional reference images at pre-defined locations along the way, whereas in case the images acquired by navigation camera 420 do not match the expected reference images (for example as defined by patient 103 check plan), diagnostics device 104 needs to be re-oriented.
  • navigation module logic 400 can also calculate the navigation quality by calculating the distance between diagnostics device 104 spatial disposition with respect to the patient's 103 body (or a specific part thereof) and the target desired spatial disposition with respect to the patient's 103 body (or a specific part thereof), and check whether there is route convergence (i.e. the distance is getting smaller) or route divergence (the distance is getting bigger).
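The convergence/divergence check described above can be sketched as follows (illustrative Python; the position tuples and the simple distance-comparison trend test are assumptions for illustration, as the patent does not specify an implementation):

```python
import math

def distance(current, target):
    """Euclidean distance between two (x, y, z) spatial positions."""
    return math.sqrt(sum((c - t) ** 2 for c, t in zip(current, target)))

def route_trend(previous_distance, current, target):
    """Compare the current distance to the target disposition against the
    previously measured distance: a shrinking distance indicates route
    convergence, a growing distance indicates route divergence."""
    d = distance(current, target)
    return d, ("convergence" if d < previous_distance else "divergence")
```

Calling `route_trend` on each navigation update yields the new distance plus a trend label that can drive the re-orientation decision.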
  • diagnostics device 104 navigation can be performed without use of any patient specific reference data, but only using generic reference data.
  • diagnostics device 104 can be configured to continuously or periodically acquire patient medical data, and monitor to see if the acquired medical data meets certain criteria that indicate that the acquired data is the requested data.
  • diagnostic device 104 can use predefined generic images of a typical organ such as an ear drum (not specific to a patient) as a reference.
  • diagnostic device 104 can be configured to continually analyze the acquired patient's internal ear image, and try to match the reading to the generic image reference.
  • Matching criteria can be, for example, a unique image characteristic of the organ such as the circular structure of the eardrum, and its image contrast compared to the surrounding image.
  • a generic organ reading reference can be a generic sound wave of a human heart, and in this case, for example, the matching criteria can be the sound wave unique structure and special characteristics such as pace, amplitude, volume, etc.
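As an illustration of such sound-based matching criteria, the sketch below (Python/NumPy; the threshold-crossing pace estimator and the plausible-pace range are assumptions, not the patent's specified method) checks whether a heart-sound envelope exhibits a pace in a plausible human range:

```python
import numpy as np

def beats_per_minute(signal, sample_rate, threshold=0.5):
    """Estimate pace from a heart-sound envelope by locating rising
    threshold crossings and averaging the intervals between them."""
    above = signal > threshold
    rising = np.flatnonzero(~above[:-1] & above[1:])
    if len(rising) < 2:
        return None
    mean_interval = np.diff(rising).mean() / sample_rate  # seconds per beat
    return 60.0 / mean_interval

def matches_heart_reference(signal, sample_rate, lo=40.0, hi=200.0):
    """Matching criterion: the estimated pace lies in a plausible human range."""
    bpm = beats_per_minute(signal, sample_rate)
    return bpm is not None and lo <= bpm <= hi
```

A reading that yields no pace at all, or a pace outside the reference range, would be rejected as not being the requested organ reading.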
  • diagnostics device 104 navigation can be performed with utilization of INS readings alone, using, for example, movement sensors 412 and orientation sensors 414.
  • diagnostics device 104 can be initiated for example by touching three identifiable body points, such as two patient 103 nipples and patient 103 belly button.
  • diagnostics device 104 can then utilize movement sensors 412 and orientation sensors 414 alone to navigate to various body points.
  • Upon arrival at the desired diagnostics device 104 spatial disposition with respect to the patient's 103 body (or a specific part thereof), diagnostics device 104, for example by utilizing reading and verification logic module 212, can be configured to perform a reading and verification of the reading (step 612). Diagnostics device 104 can be configured to verify that it is located at the desired spatial disposition with respect to the patient's 103 body (or a specific part thereof) when acquiring medical data of patient 103. Diagnostics device 104 can be further configured to prepare for acquiring medical data of patient 103, and to perform acquisition of such medical data. After acquisition of medical data of patient 103, diagnostics device 104 can be configured to verify that the acquired data meets pre-defined standards.
  • diagnostics device 104 can in some cases be configured to instruct user 102 to perform the required repositioning and reorienting thereof in order to bring diagnostics device 104 to the desired spatial disposition with respect to the patient's 103 body (or a specific part thereof).
  • reading and verification logic module 212 can be configured to retry acquiring the medical data of patient 103, as further detailed below, inter alia with reference to Fig. 14.
  • diagnostics device 104 can be configured to check if the medical examination is done (e.g. all medical examinations defined by patient 103 check plan have been performed). If not, diagnostics device 104 can be configured to move to the next medical examination indicated by patient 103 check plan. If all required medical examinations are performed, diagnostics device 104 can be configured to finalize the check (step 614). During the check finalization 614, as well as in any other step of the described process, diagnostic device 104 can be configured to perform any required action to the acquired patient 103 medical data. Such actions can include, for example, updating repository status, embedding patient data or check data in the reading data, encrypting data, compressing data, transmitting the acquired data to different locations (e.g. trained personnel workstation 122 and/or central system 130), etc.
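A minimal sketch of such finalization actions (Python standard library only; the envelope field names and hex embedding are illustrative assumptions, and encryption, which a real device would perform with a proper cryptography library, is deliberately omitted):

```python
import json
import zlib

def finalize_reading(reading_bytes, patient_id, check_id):
    """Package an acquired reading: embed patient and check metadata
    alongside the reading, then compress the envelope for storage or
    transmission. Encryption is omitted in this sketch."""
    envelope = {
        "patient_id": patient_id,
        "check_id": check_id,
        "reading_hex": reading_bytes.hex(),
    }
    return zlib.compress(json.dumps(envelope).encode("utf-8"))

def restore_reading(packed):
    """Inverse of finalize_reading: decompress the envelope and
    recover the raw reading and its patient identifier."""
    envelope = json.loads(zlib.decompress(packed).decode("utf-8"))
    return bytes.fromhex(envelope["reading_hex"]), envelope["patient_id"]
```

The round trip preserves both the reading and the embedded metadata, which is the property the transmission step relies on.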
  • Fig. 7 is a flowchart illustrating one example of a sequence of operations carried out for performing personalized calibration of a diagnostic device, in accordance with the presently disclosed subject matter.
  • Diagnostics device 104, for example by utilizing calibration logic module 214, can be configured to initiate a calibration check (step 702). The initial calibration can be performed by trained personnel 124. During calibration, trained personnel 124 activates diagnostics device 104 instead of user 102. This can require patient 103 arrival at trained personnel location 120 or trained personnel 124 arrival at patient location 100 for diagnostics device calibration. It is to be noted that diagnostics device 104 can be configured to allow performing the calibration process by user 102 with remote guiding and assistance of trained personnel 124.
  • trained personnel 124 can select a specific check (for example a check that is required for the specific patient 103) and activate diagnostics device 104 calibration mode (step 704).
  • the specific check is selected from a list of checks (that can be displayed, for example, on diagnostic device 104 or on trained personnel workstation 122).
  • diagnostics device 104 can be configured to guide trained personnel 124 during calibration (step 706). Such guidance of trained personnel 124 is performed in accordance with the selected check and the calibration method.
  • Diagnostics device 104 can be further configured to record reference data (in accordance with the calibration method, as detailed below) during performance of the medical examination by trained personnel 124 and optionally present the recorded data, for example on trained personnel workstation 122 (step 708).
  • the recorded reference data can be stored, for example, in one or more of: check plan repository 210, data repository 216, patient & check plan repository 136, trained personnel data repository 123 or any other location operatively connected to diagnostics device 104 on which patient data is stored.
  • Fig. 8a is a flowchart illustrating an example of a sequence of operations carried out for recording reference data during personalized calibration of a diagnostic device, using imaging and orientation sensors, in accordance with the presently disclosed subject matter.
  • trained personnel 124 performs the medical examination while diagnostics device 104 records various data (step 740), including patient 103 body images using navigation camera 420 and diagnostics device 104 INS data using INS sensors 410 (6-axis movement, using accelerometers and gyroscopes).
  • diagnostics device 104 can be further configured to record data relating to the distance of diagnostics sensor from patient 103 body using distance sensors 430.
  • diagnostics device 104 can be further configured to record data relating to the pressure exerted by diagnostics device 104 against patient 103 body using pressure sensors 440. Following positioning and orienting diagnostics device 104 in the desired spatial disposition with respect to the patient's 103 body (or a specific part thereof) (according to trained personnel 124 decision), trained personnel 124 can perform medical data acquisition, whereas diagnostics device 104 can be configured to record the medical data acquired by image based sensors 310 and/or sound based sensors 320. All the recorded data can be saved on one or more of: check plan repository 210, data repository 216, patient & check plan repository 136, medical examination repository 134, or any other location operatively connected to diagnostics device 104 on which patient data can be stored.
  • diagnostics device 104 can be configured to present trained personnel 124 (for example on trained personnel workstation 122) data indicating a reference point to be reached and the next reference point to be reached from the first reference point (step 750).
  • Trained personnel 124 can be instructed to move diagnostics device 104 to the first reference point he should reach, touch the reference point with diagnostics device 104 and from there move to the next reference point he should reach and touch it as well.
  • diagnostics device 104 records data (step 752), including patient 103 body images using navigation camera 420 and diagnostics device 104 INS data using INS sensors 410 (6-axis movement, using accelerometers and gyroscopes).
  • diagnostics device 104 can be further configured to record data relating to the distance of diagnostics sensor from patient 103 body using distance sensors 430.
  • diagnostics device 104 can be further configured to record data relating to the pressure exerted by diagnostics device 104 against patient 103 body using pressure sensors 440.
  • diagnostics device 104 touches the point, thus indicating that its current location is the reference point location (step 754).
  • the trained personnel 124 can also acknowledge reaching the desired reference point, by using the device keypad 504 or any other confirmation method. The process repeats until enough reference points are selected. It is to be noted that in some cases three reference points are enough as they form a basis for utilization of known triangulation techniques that can be used for navigating diagnostics device 104.
  • diagnostics device 104 can be configured to alert trained personnel 124 that medical data acquisition can commence (step 756). Trained personnel 124 can then move diagnostics device 104 to the desired spatial disposition with respect to the patient's 103 body (or a specific part thereof) from which a medical data acquisition can be performed (step 758), while diagnostics device continues to record the reference data (including, inter alia, patient 103 body images and diagnostics device 104 INS data). Following acquisition of all reference data, including the reference points, diagnostics device 104 can be configured to calculate the relative spatial disposition of diagnostics device 104 in respect of the acquired reference points (step 759).
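One way the relative spatial disposition could be computed from three touched reference points is sketched below (Python/NumPy; building a local body frame from three non-collinear points is a standard geometric technique and an assumption here, not the patent's specified algorithm):

```python
import numpy as np

def body_frame(p1, p2, p3):
    """Build a local body coordinate frame from three touched reference
    points (e.g. two nipples and the navel): origin at p1, x-axis toward
    p2, z-axis normal to the plane spanned by the three points."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    x = p2 - p1
    x = x / np.linalg.norm(x)
    n = np.cross(p2 - p1, p3 - p1)
    z = n / np.linalg.norm(n)
    y = np.cross(z, x)
    return p1, np.stack([x, y, z])  # origin, 3x3 rotation (rows = axes)

def to_body_coords(device_pos, origin, axes):
    """Express a device position in the body frame, making recorded
    dispositions comparable across sessions even if the patient moves."""
    return axes @ (np.asarray(device_pos, dtype=float) - origin)
```

Recording each desired acquisition disposition in this body frame is what makes the calibration reusable in later self-guided checks.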
  • Fig. 8c is a flowchart illustrating one example of a sequence of operations carried out for recording reference data during personalized calibration of a diagnostic device, using reference points and a pointing object, in accordance with the presently disclosed subject matter.
  • trained personnel 124 performs the medical examination while diagnostics device 104 records various data, including patient's 103 body images using navigation camera 420.
  • Diagnostics device 104 can be configured to instruct trained personnel 124 to point diagnostics device 104 in the direction of a relevant body part (e.g. chest, back, head, etc.) and acquire an image by utilizing, for example, navigation camera 420 (step 770).
  • diagnostics device 104 can be configured to try to extract reference points from the acquired image.
  • diagnostics device can be configured to utilize pre-stored data relating to expected points within the area of the acquired image (e.g. if the acquired image is of patient 103 head, expected reference points can be the eyes, the nose, the mouth, the eyebrows, etc., if the acquired image is of patient chest, expected reference points can be the nipples, the navel, etc.) in order to try and find a match thereto within the acquired image (step 772).
  • diagnostics device 104 can be configured to look for the nipples in the acquired image (for example diagnostics device 104 can utilize pre-stored data that indicates that a nipple appearance is round, its size falls within a certain range and it is typically darker than its surrounding area). Diagnostics device 104 can be configured to ask trained personnel 124 to acknowledge the calculated reference points.
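The darker-than-surroundings heuristic mentioned above can be illustrated as follows (Python/NumPy; the inner-disc versus outer-ring contrast test and its thresholds are assumptions for illustration, not the patent's detection algorithm):

```python
import numpy as np

def is_candidate(image, cy, cx, radius, darker_by=30):
    """Heuristic check for a nipple-like feature at (cy, cx) in a
    grayscale image: a roughly circular region that is darker than
    the ring surrounding it."""
    h, w = image.shape
    yy, xx = np.ogrid[:h, :w]
    dist2 = (yy - cy) ** 2 + (xx - cx) ** 2
    inner = image[dist2 <= radius ** 2]                       # candidate disc
    ring = image[(dist2 > radius ** 2) & (dist2 <= (2 * radius) ** 2)]
    if inner.size == 0 or ring.size == 0:
        return False
    return inner.mean() + darker_by <= ring.mean()
```

Scanning candidate centers and radii within the expected size range would yield the reference points that trained personnel 124 is then asked to acknowledge.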
  • diagnostics device 104 can optionally be configured to notify trained personnel 124 of the failure. Diagnostics device 104 can optionally be further configured to enable trained personnel 124 to mark the reference points manually on the acquired image that, for that purpose, can be displayed on trained personnel workstation 122 (step 773). Such marking of the reference points can be performed for example by using an indicator presented on trained personnel workstation 122, where said indicator can be moved, for example, by a computer mouse or any other suitable input device (e.g. keypad, track pad, etc.). Alternatively or additionally, diagnostics device 104 can be configured to enable such marking by touch of trained personnel 124 on the reference points, for example using his finger. In such cases, diagnostics device 104 can be configured to identify trained personnel 124 finger within the image acquired by navigation camera 420.
  • diagnostics device 104 can be configured to instruct trained personnel 124 to mark the desired location of diagnostics device 104 for medical data acquisition on the acquired image (step 774). In some cases, diagnostics device 104 can be further configured to mark the next desired location of diagnostics device 104 for medical data acquisition on the acquired image (step 775), and the process repeats until all desired locations of diagnostics device 104 for medical data acquisition are marked on the acquired image.
  • each of the calibration methods detailed above can be performed virtually, as instead of an actual physical meeting between trained personnel 124 and patient 103, a virtual meeting can take place, in which trained personnel 124 can, for example, guide user 102 on how to perform the calibration. In such cases, user 102 can activate diagnostics device throughout the calibration according to trained personnel 124 instructions.
  • Such virtual meeting can utilize known methods and techniques such as video conferencing, etc.
  • diagnostics device 104 can be further configured to enable trained personnel 124 to perform medical data acquisition (step 710). Diagnostics device 104 can be further configured to store the acquired medical data as reference data (step 712) (as indicated above, the recorded reference data can be stored, for example, in one or more of: check plan repository 210, data repository 216, patient & check plan repository 136, trained personnel data repository 123 or any other location operatively connected to diagnostics device 104 on which patient data is stored).
  • Diagnostics device 104 can be further configured to repeat the process until calibration is done (for example as indicated by trained personnel 124). Diagnostics device 104 can be further configured to store the entire examination process (e.g. the series of medical examinations performed by trained personnel 124), for example, in one or more of: check plan repository 210, data repository 216, patient & check plan repository 136, trained personnel data repository 123 or any other location operatively connected to diagnostics device 104 on which patient data is stored (step 730).
  • diagnostics device 104 can be configured to perform a generic check plan or a modified personal check plan using only generic reference data, without utilizing any personal reference data that requires a calibration process of diagnostics device 104. It is to be further noted that in such cases, when performing a certain check (e.g. throat check, ear check, etc.) diagnostics device 104 can instruct user 102 to move to a spatial disposition with respect to the patient's 103 body (or a specific part thereof) in proximity of a known reference point (e.g. patient 103 nose, ear, eye, etc.).
  • diagnostics device 104 can instruct the relevant image based sensor 310 (e.g. relevant organ camera sensor such as ear reading sensor, etc.) to continuously or periodically acquire organ images. Diagnostics device 104 can continuously or periodically compare the acquired images to known generic reference images of the required organ to be read (e.g. reference images of "ear drums", throat tonsils, etc.). The reference images can be saved for example in check plan repository 210, data repository 216, patient & check plan repository 136, trained personnel data repository 123 or any other location operatively connected to diagnostics device 104 on which patient data is stored.
  • user 102 can then move diagnostics device 104 towards a spatial disposition with respect to the patient's 103 body (or a specific part thereof) approximate to the desired spatial disposition with respect to the patient's 103 body (or a specific part thereof), until diagnostics device 104 identifies at least one matching reference point (as further detailed below, inter alia with respect to Figs. 9 and 10, reference points can also be reference patterns).
  • Once diagnostics device 104 reaches the desired spatial disposition with respect to the patient's 103 body (or a specific part thereof), it can generate an alert to user 102 and perform data acquisition and verification as defined in the check plan and explained above.
  • each patient organ can be associated with one or more reference points.
  • a reference point can in some cases also be a certain pattern such as the linear shape formed by the patient organ structure.
  • patient nose 900 can have multiple reference points associated therewith, including 910A-910C.
  • Such reference points can be, for example, located at the edges of the nose (such as 910A and 910C), at the middle of the nose (such as 910B) or at any other location associated with patient nose (not shown).
  • Patient ear 905 can have multiple reference points associated therewith, including 920A-920D.
  • reference points can be, for example, located at the edges of the ear (such as 920A-920B), at curves formed by the ear structure (such as 920B and 920D) or at any other location associated with patient ear (not shown).
  • a reference point can also be a certain pattern such as the linear shapes formed by the patient's organ structure.
  • Such types of reference points are illustrated in the figure by reference numerals 915, 925A, 925B and 925C.
  • Such reference point types reflect the relevant organ structure of a specific patient, and utilization thereof inter alia enables determination of the relative diagnostics device 104 spatial disposition with respect to the patient's 103 body (or a specific part thereof) and navigation of diagnostics device 104 to the desired spatial disposition with respect to the patient's 103 body (or a specific part thereof).
  • Fig. 10 is a schematic illustration of exemplary image based and INS based reference points, in accordance with the presently disclosed subject matter. It can be noted that reference points 764A-764C can be used in order to enable for example triangulation based navigation using INS sensors 410. As indicated above (inter alia with reference to Fig. 8b.), diagnostics device 104 can be configured to acquire reference data of three reference points (e.g. left nipple 764C, right nipple 764B and navel 764A).
  • Diagnostics device 104 can be further configured to utilize the reference data and INS sensors 410 data in order to determine the location of desired spatial dispositions of diagnostics device 104 (for example positions and orientations 762A, 762B, etc.) with respect to patient's 103 body (or a specific part thereof). It is to be noted that diagnostic device 104 can also use the reference points 764A-764C to enable image based calculations and user guidance to a desired spatial disposition (for example 762A, 762B) with respect to patient's 103 body (or a specific part thereof) (for example in order to acquire medical data).
  • Fig. 11 is a flowchart illustrating one example of a sequence of operations carried out for calculating the spatial disposition of a diagnostic device with respect to patient's 103 body (or a specific part thereof), in accordance with the presently disclosed subject matter.
  • Diagnostics device 104, for example by utilizing navigation module 204, can be configured to instruct user 102 to move diagnostics device 104 to be in proximity to a known reference point relating to the specific selected medical examination (step 802). For example, if the selected check is an ear check, diagnostics device 104 can be configured to instruct user 102 to move diagnostics device 104 to the proximity of patient 103 ear.
  • diagnostics device 104 can be configured to activate one or more navigation sensors such as INS sensors 410, navigation camera 420, navigation light sources 426, pressure sensors 440, distance sensors 430, etc. (step 804).
  • Diagnostics device 104 can be configured to utilize the data received from the one or more navigation sensors and start searching for known reference points according to which the current spatial disposition of diagnostics device 104 with respect to the desired position and orientation with respect to patient's 103 body (or a specific part thereof) can be calculated (step 806).
  • the current spatial disposition of diagnostics device 104 with respect to patient's 103 body (or a specific part thereof) can be calculated by utilizing identification of one or more known reference points (stored on one or more of: check plan repository 210, data repository 216, patient & check plan repository 136, trained personnel data repository 123 or any other location operatively connected to diagnostics device 104 on which patient data is stored) within the data received from the one or more navigation sensors.
  • diagnostics device activates one or more navigation sensors such as navigation camera 420, etc., and compares the received data relating to patient's 103 throat with relevant reference data (such as patient's throat image, nose image, etc.) stored on one or more of: check plan repository 210, data repository 216, patient & check plan repository 136, trained personnel data repository 123 or any other location operatively connected to diagnostics device 104 on which patient data is stored.
  • diagnostics device 104 can calculate its relative spatial disposition with respect to the desired spatial disposition.
  • Such calculated spatial disposition can be used as an origin point for performing the navigation process to enable medical data acquisition (in the example, medical data relating to patient 103 throat) using known methods and techniques.
  • One exemplary, non-limiting method is comparing images acquired by navigation sensors (e.g. navigation camera 420) with known reference images. When a match is found, the approximate spatial disposition can be calculated. It can be appreciated that images can appear at different positions, orientations and scaling factors; however, there are algorithms that can be utilized for compensating such differences, such as, for example, the Scale-Invariant Feature Transform (SIFT) algorithm, published by Lowe, David G. (1999) in "Object recognition from local scale-invariant features", doi:10.1109/ICCV.1999.790410, and described in U.S. Patent 6,711,293, "Method and apparatus for identifying scale invariant features in an image and use of same for locating an object in an image".
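While full SIFT feature extraction is beyond a short sketch, the matching stage with Lowe's ratio test can be illustrated as follows (Python/NumPy, assuming feature descriptors have already been extracted from the acquired and reference images; the descriptor arrays here are illustrative):

```python
import numpy as np

def ratio_test_matches(query_desc, ref_desc, ratio=0.75):
    """Match feature descriptors by nearest neighbour with Lowe's ratio
    test: a query descriptor is matched only when its closest reference
    descriptor is clearly closer than the second closest.
    ref_desc must contain at least two descriptors."""
    matches = []
    for i, q in enumerate(query_desc):
        d = np.linalg.norm(ref_desc - q, axis=1)  # distances to all refs
        order = np.argsort(d)
        best, second = order[0], order[1]
        if d[best] < ratio * d[second]:
            matches.append((i, int(best)))
    return matches
```

Ambiguous features (two reference descriptors at similar distances) are discarded, which is what makes the surviving matches robust enough for pose estimation.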
  • diagnostics device 104 can be configured to start the navigation and guiding process (step 818).
  • diagnostics device 104 can be configured to notify user 102 (step 812) and lock the current spatial disposition as a starting point for the navigation process (step 814). If no match is found, for example after a pre-defined time period (e.g. 15 seconds), diagnostics device 104 can be configured to check for errors (e.g. validate that navigation sensors are operative, validate that reference data is available, etc.) and notify user 102 of the failure to find a match (step 808). If diagnostics device 104 fails to find any error related to it, diagnostics device 104 can be configured to return to step 806 and search again for known reference points.
  • diagnostics device 104 can be configured to notify user 102 of the error and, if the error has been handled, diagnostics device 104 can be configured to enable user 102 to return to step 806 and search again for known reference points (step 810).
  • Fig. 12 is a flowchart illustrating one example of a sequence of operations carried out for navigating a diagnostic device and guiding a diagnostic device user accordingly, in accordance with the presently disclosed subject matter.
  • Diagnostics device 104, for example by utilizing navigation module 204, can be configured to calculate a route from a known reference point that was found (see for example Fig. 11) to the desired diagnostics device 104 spatial disposition with respect to patient's 103 body (or a specific part thereof) (step 901).
  • the route calculation can be performed, for example, by utilizing known methods and techniques.
  • a route calculation can be performed, for example, by calculating the distance and required movement correction between the current diagnostic device 104 position (X1, Y1, Z1), as identified by utilizing the reference data, and the desired position.
  • diagnostics device 104 can be configured to provide user 102 with guidance data instructing user 102 how to maneuver diagnostics device 104 to the desired diagnostics device 104 spatial disposition with respect to patient's 103 body (or a specific part thereof) (step 916).
  • diagnostics device 104 can be configured to present user 102 with the current spatial disposition with respect to patient's 103 body (or a specific part thereof) of diagnostics device 104, for example as detailed with respect to Fig. 13 above. Diagnostics device 104 can also be configured to provide user 102 with voice instructions instructing user 102 on how to maneuver diagnostics device 104. It is to be noted that, as indicated above, other instruction methods can be utilized as well (e.g. diagnostics device 104 vibrations, etc.).
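Such user guidance could be generated from the computed movement delta roughly as follows (Python; the axis convention, centimetre units and instruction wording are assumptions for illustration, not device specifications):

```python
def guidance_instructions(delta, tolerance=0.5):
    """Translate a movement delta (dx, dy, dz), in centimetres, into
    simple maneuvering instructions for the user. Axes smaller than
    the tolerance are treated as already in position."""
    axis_words = [("right", "left"), ("up", "down"), ("forward", "backward")]
    steps = []
    for d, (pos, neg) in zip(delta, axis_words):
        if abs(d) > tolerance:
            steps.append(f"move {pos if d > 0 else neg} {abs(d):.1f} cm")
    return steps or ["hold position"]
```

The resulting strings could equally drive on-screen display, speech synthesis, or be mapped to vibration patterns.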
  • Diagnostics device 104 can be further configured to continuously calculate its current spatial disposition with respect to the desired spatial disposition (step 902). During continuous or periodic position and orientation calculation, diagnostics device 104 can be configured to continuously receive data from one or more navigation sensors such as INS sensors 410, navigation camera 420, pressure sensors 440, distance sensors 430, etc. (step 906) and continuously calculate its current spatial disposition with respect to patient's 103 body (or a specific part thereof) by means of comparison of the data received from the one or more navigation sensors with the reference data (e.g. reference image, reference INS data, etc.), for example by using known methods and techniques (step 908).
  • One exemplary, non-limiting method is utilizing INS sensors 410 data for computing diagnostics device 104 trajectory according to gyro and accelerometer information.
  • The mathematics is based on a solution of 6 Degrees of Freedom equations as described in various papers and books (for example "Strapdown Inertial Navigation Technology", D. Titterton and J. Weston, ISBN 1563476932).
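The strapdown computation referenced above solves the full 6 Degrees of Freedom equations; as a heavily simplified, non-limiting sketch (assuming accelerometer samples already rotated into the navigation frame and gravity-compensated, a fixed sample interval, and a zero initial state), the double integration to velocity and position can be written as:

```python
def dead_reckon(accel_samples, dt):
    """Integrate accelerometer samples (ax, ay, az in m/s^2), taken at a
    fixed interval dt, into velocity and position. A full strapdown
    solution also rotates each sample into the navigation frame using
    the gyro data and removes gravity; this sketch assumes both were
    already done and that the initial state is zero."""
    vx = vy = vz = 0.0
    x = y = z = 0.0
    for ax, ay, az in accel_samples:
        vx += ax * dt; vy += ay * dt; vz += az * dt  # velocity update
        x += vx * dt;  y += vy * dt;  z += vz * dt   # position update
    return (x, y, z), (vx, vy, vz)
```

In practice the INS drift accumulated by this double integration is what the image-comparison step is used to correct.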
  • the diagnostics device 104 spatial disposition with respect to patient's 103 body (or a specific part thereof) can be further calculated according to image comparison as detailed above.
  • the diagnostics device 104 spatial disposition with respect to patient's 103 body can be constantly or periodically computed by utilizing the INS sensors 410 data (by determining diagnostics device velocity and position) while utilizing image comparison in order to eliminate errors (e.g. by matching reference points).
  • the INS sensors 410 data and the image comparison data can be merged for example by using Kalman Filtering which is an exemplary algorithm for information fusion.
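The information fusion mentioned above can be illustrated with a single scalar Kalman filter cycle; this is a textbook sketch under simplifying assumptions (one position axis, additive noise), not the device's actual filter, and all variable names are illustrative.

```python
def kalman_fuse(x, p, u, q, z, r):
    """One predict/update cycle of a scalar Kalman filter.
    x, p : previous position estimate and its variance
    u, q : INS-derived displacement and its process noise variance
    z, r : image-comparison position measurement and its noise variance
    Scalar form for clarity; a real device would run a
    multi-dimensional variant."""
    x_pred = x + u                      # predict with the INS displacement
    p_pred = p + q
    k = p_pred / (p_pred + r)           # Kalman gain
    x_new = x_pred + k * (z - x_pred)   # blend in the image measurement
    p_new = (1.0 - k) * p_pred
    return x_new, p_new
```

The gain k weights the image measurement against the INS prediction according to their variances, which is how the image comparison eliminates accumulated INS errors.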
  • In case diagnostics device 104 current spatial disposition with respect to patient's 103 body (or a specific part thereof) is the desired diagnostics device 104 spatial disposition with respect to patient's 103 body (or a specific part thereof), diagnostics device can be configured to acquire patient 103 medical data, as further detailed, inter alia, with respect to Fig. 14.
  • In case diagnostics device 104 current spatial disposition with respect to patient's 103 body (or a specific part thereof) is not the desired diagnostics device 104 spatial disposition with respect to patient's 103 body (or a specific part thereof), diagnostics device 104 can be configured to perform a movement correction calculation (step 910).
  • During movement correction calculation, diagnostics device 104 can be configured to calculate the delta between its current spatial disposition with respect to patient's 103 body (or a specific part thereof) and the desired spatial disposition with respect to patient's 103 body (or a specific part thereof), and/or the calculated route. Movement correction calculation can be based, for example, on data received from the one or more navigation sensors and the calculated route. In such cases, after a route is calculated, diagnostics device 104 can be configured to check whether the actual movements made by it fit the expected movements calculated during route calculation. Alternatively, movement correction calculation can be based on re-comparing the data received from the one or more navigation sensors with the respective reference data.
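Checking whether the actual movements fit the expected movements from the route calculation can be sketched as a per-axis tolerance comparison; the tolerance value and the tuple interface are illustrative assumptions.

```python
def movement_matches_route(actual_delta, expected_delta, tol=0.05):
    """Check whether the movement actually made (as derived from the
    navigation sensors) fits the movement expected from the calculated
    route, within a per-axis tolerance. The tolerance value and tuple
    interface are illustrative assumptions."""
    return all(abs(a - e) <= tol
               for a, e in zip(actual_delta, expected_delta))
```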
  • Diagnostics device 104 can be further configured to perform a navigation quality calculation (step 912). Diagnostics device 104 can be configured to check various parameters indicative of the navigation quality, such as convergence (check that the distance from the desired spatial disposition is getting smaller), etc. In case the navigation quality meets the requirements (e.g. the distance to the desired spatial disposition is getting smaller, etc.), diagnostics device returns to step 916 in order to continue the navigation and guiding process. If, however, the navigation quality does not meet the requirements, diagnostics device 104 can be configured to return to step 608 and perform device re-orientation.
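The convergence test described above (the distance to the desired spatial disposition should be getting smaller) can be sketched as follows; the window size and list-based interface are illustrative assumptions.

```python
def navigation_converging(distances, window=3):
    """Navigation-quality check: the last `window` distance readings to
    the desired spatial disposition must be strictly decreasing. The
    window size is an illustrative assumption."""
    recent = distances[-window:]
    return all(a > b for a, b in zip(recent, recent[1:]))
```

A failing check would correspond to returning to step 608 for device re-orientation.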
  • Fig. 12a is a flowchart illustrating another example of a sequence of operations carried out for navigating a diagnostic device and guiding a diagnostic device user accordingly, in accordance with the presently disclosed subject matter.
  • Diagnostics device 104, for example by utilizing navigation module 204, can be configured to calculate a route from a known reference point that was found (see for example Fig. 11), to the desired diagnostics device 104 spatial disposition with respect to patient's 103 body (or a specific part thereof).
  • The route is calculated while utilizing the initial location of pointing object 935 (for example, user 102 finger) as the starting point. Looking at Fig. 12b, there is shown a schematic illustration of an exemplary pointing object used for navigating a diagnostic device and guiding a diagnostic device user accordingly, in accordance with the presently disclosed subject matter.
  • pointing object 935 points to a certain location on patient 103 body while diagnostics device 104 utilizes one or more navigation sensors as further detailed below for calculating pointing object location and a route from the pointing object location to the desired diagnostics device 104 spatial disposition with respect to patient's 103 body (or a specific part thereof).
  • diagnostics device 104 can be configured to provide user 102 with guidance data instructing user 102 how to maneuver pointing object 935 to the desired diagnostics device 104 spatial disposition with respect to patient's 103 body (or a specific part thereof) (step 936).
  • diagnostics device can be configured to present user 102 with the current location of pointing object 935.
  • Diagnostics device can also be configured to provide user 102 with voice instructions instructing user 102 on how to maneuver pointing object 935. It is to be noted that, as indicated above, other instruction methods can be utilized as well (e.g. diagnostics device 104 vibrations, etc.).
  • Diagnostics device 104 can be further configured to continuously calculate pointing object 935 current location with respect to its desired location (step 920). During continuous pointing object 935 location calculation, diagnostics device 104 can be configured to continuously receive data from one or more navigation sensors such as navigation camera 420, distance sensors 430, etc. (step 922) and continuously calculate pointing object 935 current location by means of comparison of the data received from the one or more navigation sensors with the reference data (e.g. reference image, etc.), for example by using known methods and techniques as detailed above (step 924).
  • In case pointing object 935 current location is its desired location, diagnostics device can be configured to instruct user 102 to move diagnostics device 104 to the location indicated by pointing object 935 (step 928) and acquire patient 103 medical data, as further detailed, inter alia, with respect to Fig. 14.
  • diagnostics device 104 can be configured to perform a movement correction calculation (step 930). During movement correction calculation, diagnostics device 104 can be configured to calculate the delta between pointing object 935 current location and its desired location, and/or the calculated route.
  • Movement correction calculation can be based, for example, on data received from the one or more navigation sensors and the calculated route. In such cases, after a route is calculated, diagnostics device 104 can be configured to check whether the actual movements made by pointing object 935 fit the expected movements calculated during route calculation. Alternatively, movement correction calculation can be based on re-comparing the data received from the one or more navigation sensors with the respective reference data.
  • Diagnostics device 104 can be further configured to perform a navigation quality calculation (step 932). Diagnostics device 104 can be configured to check various parameters indicative of the navigation quality, such as convergence (check that the distance of pointing object 935 from its desired location is getting smaller), etc. In case the navigation quality meets the requirements (e.g. the distance of pointing object 935 from its desired location is getting smaller, etc.), diagnostics device returns to step 936 in order to continue the navigation and guiding process. If, however, the navigation quality does not meet the requirements, diagnostics device 104 can be configured to return to step 608 and perform device re-orientation.
  • Fig. 14 is a flowchart illustrating one example of a sequence of operations carried out for acquisition and verification of a reading by a diagnostic device, in accordance with the presently disclosed subject matter.
  • Diagnostics device 104, for example by utilizing reading and verification logic module 212, can be configured, for example upon arrival to the desired diagnostics device 104 spatial disposition with respect to patient's 103 body (or a specific part thereof) (e.g. a spatial disposition that enables medical data acquisition, as detailed above), to notify user 102 that it is located in the desired spatial disposition with respect to patient's 103 body (or a specific part thereof) and is about to start taking the reading (step 1002).
  • Diagnostics device 104 can be further configured to instruct diagnostics sensor module 202 to prepare to acquire medical data of patient 103 (step 1004). Such preparations can include preparing diagnostics sensors 202 to acquire medical data according to the patient specific check plan. Exemplary preparations are setting image acquisition sensor 316 zoom, activating light sources 318 at correct power, activating sound acquisition sensor 324, etc. In addition diagnostic device 104 can be configured to retrieve the relevant reading parameters and thresholds for example from the patient specific check plan (e.g. the required length of reading, reference thresholds such as minimal sound volume, etc.).
  • Diagnostics device 104 can also be configured to recalculate its current spatial disposition with respect to patient's 103 body (or a specific part thereof) and verify that no movements have been made and that it is still located in the desired spatial disposition (step 902). In case there has been a change in diagnostics device 104 spatial disposition with respect to patient's 103 body (or a specific part thereof), diagnostics device 104 can be configured to return to the navigation and guiding process (610). Otherwise, diagnostics device 104 can be configured to perform medical data acquisition (step 1006).
  • The medical data can be acquired according to the check plan, that, as indicated above, can include information, inter alia, about the examination process, steps and logic, and predefined reading parameters such as type of sensor to be used (still image vs.
  • the check plan can define that the sound based sensors 320 are to be used and that the reading length should be 3 seconds, or between 2.5 and 5 seconds, etc.).
  • diagnostics device 104 can be configured to verify that the acquired medical data meets pre-defined standards (e.g. a required length of reading, reading data thresholds, etc.) (step 1008). For example, if the heart is to be checked, and the check plan defines that the reading length should be between 2.5 and 5 seconds, diagnostics device 104 can be configured to check that the reading length meets the requirement. In case the acquired medical data did not meet the pre-defined standards, diagnostics device 104 can be configured to check if the reading acquisition process was ok (step 1010) (for example that the diagnostics sensors 202 are operative, that the check plan data and the reference data were successfully retrieved, that the navigation and guidance processes succeeded, etc.).
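The verification of a reading against the check plan's pre-defined standards can be sketched as below, using the reading-length example from the text (between 2.5 and 5 seconds for the heart); the function and parameter names are illustrative assumptions.

```python
def reading_meets_standards(reading_seconds, min_len=2.5, max_len=5.0):
    """Verify an acquired reading against pre-defined standards; only
    the reading-length requirement from the heart example (between 2.5
    and 5 seconds) is checked here. Other thresholds, e.g. minimal
    sound volume, would be validated in the same way."""
    return min_len <= reading_seconds <= max_len
```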
  • In case the process was ok, diagnostics device 104 can be configured to return to step 902 (in order to retry acquiring the medical data). If the process was not ok, diagnostics device 104 can be configured to issue a notification to user 102 (for example by presenting a message on diagnostic device 104 or patient workstation 114, etc.) and enable user 102 to review the acquired medical data, if any (step 1012). Diagnostics device 104 can be further configured to enable user 102 to decide if the acquired medical data is to be saved or not.
  • diagnostics device 104 can be configured to save the acquired medical data (for example, in one or more of: data repository 216, patient & check plan repository 136, trained personnel data repository 123 or any other location operatively connected to diagnostics device 104 on which patient data is stored) (step 1014).
  • In case the reading acquisition process was ok, diagnostics device 104 can be configured to update the reference data with the acquired medical data (step 1016). This can be performed in order to keep the reference data up to date, as changes can occur to the human body (for example in light of growing up, aging, medical treatments, etc.).
  • In the following embodiment, guiding of diagnostics device 104 is performed by trained personnel 124. It is to be noted that relevant changes to diagnostics device 104 in comparison to the embodiment described above are mentioned below. As indicated above, identical reference numerals indicate those components that are common to different embodiments or configurations.
  • Fig. 15 is a block diagram schematically illustrating one example of a system for performing an automatic and remote trained personnel guided medical examination, in accordance with the presently disclosed subject matter.
  • user 102 and patient 103 are located at patient location 100.
  • User 102 can in some cases be patient 103 whose medical examination is required (in such cases, even though user 102 and patient 103 are shown as separate entities in the drawings, they are in fact the same entity). In other cases, user 102 can be a person that will be performing the medical examination of patient 103.
  • Patient workstation 114 can be any computer, including a personal computer, a portable computer, a cellular handset or an apparatus with appropriate processing capabilities, including a computer and/or an apparatus which can be, for example, specifically configured for that purpose.
  • Patient workstation 114 can further comprise patient location camera 1114a and patient location microphone 1114b, that can be used, inter alia, for acquiring image (including video) and sound data of patient 103.
  • diagnostics device 104 comprises (or is otherwise associated with) at least one processor 106 (e.g. digital signal processor (DSP), a microcontroller, a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), etc.) and a memory unit 110 (e.g. ROM, hard disk, etc).
  • Processor 106 is configured to receive instructions and control the components and operations of diagnostics device 104.
  • diagnostics device 104 can be configured to communicate with patient workstation 114.
  • the communication between diagnostics device 104 and patient workstation 114 can be realized by any communication means, e.g. via wired or wireless communication. It can be noted that user 102, patient 103, diagnostics device 104 and patient workstation 114 are located at patient location 100.
  • Diagnostics device 104 can be configured to enable acquisition of various data as further detailed below.
  • the acquired data can be transmitted (directly from diagnostics device 104 or through patient workstation 114) to trained personnel workstation 122 located at trained personnel location 120 and/or to central system 130.
  • Central system 130 and trained personnel workstation 122 can be any computer, including a personal computer, a portable computer, a cellular handset or an apparatus with appropriate processing capabilities, including a computer and/or an apparatus which can be, for example, specifically configured for that purpose.
  • the acquired data can be transmitted for example via Internet 116. It is to be noted that the data can be transmitted while utilizing other known communication alternatives, such as a cellular network, VPN, LAN, etc.
  • central system 130 comprises patient & check plan repository 136 in which various data relating to the patient is maintained. Such data can include, for example, patient identification number, patient name, patient age, patient contact details, patient medical data (such as diseases, sensitivities to medicines, etc.), check plans data (as further detailed below), etc.
  • Central system 130 can further comprise a medical examination repository 134 in which data acquired by diagnostics device 104, patient workstation 114 and trained personnel workstation 122 is maintained.
  • diagnostics device 104 can include, for example, results of medical examinations performed using diagnostics device 104 (such as ear recorded images and video readings, lungs or heart recorded sound, blood pressure, body temperature, etc. as further detailed below).
  • Central system 130 can further comprise management system 132, that can be configured to establish a connection between a selected trained personnel workstation 122 (for example an available trained personnel workstation 122 or the trained personnel workstation 122 with the shortest queue) and diagnostics device 104 and/or patient workstation 114.
  • the connection can be a direct connection or a connection via central system 130, and it can be established e.g. via Internet 116.
  • Management system 132 can also manage other processes such as subscribing patients, planning scheduling of patients to available trained personnel, managing patient & check plan repository 136, viewing and analyzing medical examination repository 134, etc.
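The selection of the trained personnel workstation 122 with the shortest queue, mentioned above, can be sketched as below; the dict-based model of queue lengths is an illustrative assumption.

```python
def select_workstation(queue_lengths):
    """Pick the trained personnel workstation with the shortest queue.
    `queue_lengths` maps a workstation id to its number of waiting
    patients; the dict-based model is an illustrative assumption."""
    return min(queue_lengths, key=queue_lengths.get)

# e.g. select_workstation({"ws-1": 3, "ws-2": 0, "ws-3": 5}) returns "ws-2"
```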
  • It is to be noted that central system 130 is optional to the solution and that central system 130 can be part of any trained personnel system 120; communication between trained personnel workstation 122 and diagnostics device 104 and/or patient workstation 114 can be implemented directly without the use of, or need for, a central system 130.
  • The tp-patient connection can be implemented using a distributed approach, i.e. multiple patients can be served by one trained person and/or one patient can be served by multiple trained persons.
  • Patient workstation 114 can include, for example, a local repository containing connection information for one or more relevant trained personnel workstations 122, and vice versa.
  • When the transmitted data (including image and voice data of patient 103) is received at trained personnel workstation 122, the data can be displayed on trained personnel workstation 122.
  • trained personnel workstation 122 can include, inter alia, a display (e.g. LCD screen). It is to be noted that the image and voice data of patient 103 can be streamed to trained personnel workstation 122.
  • Trained personnel 124 can view the received data on display and provide user 102 with navigational directions for navigating diagnostics device 104 to a desired spatial disposition with respect to patient's 103 body (or a specific part thereof) from which medical data is to be acquired.
  • trained personnel workstation 122 can comprise trained personnel camera 1122a and trained personnel microphone 1122b that can be used for acquiring image (including video) and sound data of trained personnel 124. It is to be noted that during the tp-patient connection a video conference can take place while utilizing, for example, patient location camera 1114a, patient location microphone 1114b, trained personnel camera 1122a, and trained personnel microphone 1122b. In such cases the data received from trained personnel camera 1122a and trained personnel microphone 1122b, can be presented to user 102 utilizing for example a patient workstation 114 display and speaker using for example video-conference software.
  • trained personnel workstation 122 can be further connected to guiding device 1124 (e.g. via a wired or wireless connection).
  • Guiding device 1124 can be any input means that will enable trained personnel 124 to provide user 102 with six-axis movement instructions (up-down, left-right, back-forward, pitch, roll, yaw), as further detailed below, inter alia with respect to Fig. 16.
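A six-axis movement instruction of the kind described above can be modeled as a simple record; the field names and units (metres for translations, degrees for rotations) are illustrative assumptions, not part of the disclosed subject matter.

```python
from dataclasses import dataclass

@dataclass
class SixAxisInstruction:
    """One movement instruction from guiding device 1124: three
    translations and three rotations. Field names and units are
    illustrative assumptions."""
    up_down: float = 0.0
    left_right: float = 0.0
    back_forward: float = 0.0
    pitch: float = 0.0
    roll: float = 0.0
    yaw: float = 0.0
```

Such a record could be what is serialized and transmitted to patient workstation 114 or diagnostics device 104 for presentation to user 102.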
  • the instructions are transmitted to patient workstation 114 or to diagnostics device 104.
  • Patient workstation 114 or diagnostics device 104 can be configured to present the instructions to user 102, for example visually on a display (e.g. LCD screen included in patient workstation 114 or diagnostics device 104).
  • Another exemplary alternative is to present the instructions to user 102 vocally while translating the received data to voice commands (using known methods and techniques).
  • trained personnel 124 can instruct user 102 to acquire medical data using diagnostics device 104.
  • trained personnel workstation 122 and/or guiding device 1124 can enable trained personnel 124 to acquire the required medical data by themselves. In such a case, trained personnel workstation 122 and/or guiding device 1124 will transfer trained personnel 124 instructions to diagnostic device 104, which will automatically acquire the required readings based on the received instructions.
  • trained personnel workstation 122 and/or guiding device 1124 and/or diagnostic device 104 can also be configured to use the predefined reading acquisition parameters, as defined in check plan repository 210 and/or patient and check plan repository 136 or any other location operatively connected to trained personnel workstation 122 and/or guiding device 1124 and/or diagnostic device 104 on which patient data is stored.
  • diagnostics device can be configured to transmit the acquired data to trained personnel workstation 122 and/or to central system 130.
  • the transmitted data is received at trained personnel workstation 122, the data can be saved in trained personnel data repository 123 that can be connected to trained personnel workstation 122.
  • trained personnel workstation 122 can include a display (e.g. LCD screen), and a keyboard or any other suitable input/output devices.
  • trained personnel 124 can provide feedback to user 102, for example by transmitting data back to patient workstation 114. Such feedback can include, for example, analysis of the received data, request to receive more data, medical treatment instructions, invitation to a further examination, etc.
  • trained personnel 124 can transmit feedback data to central system 130, which, in turn, can transmit the feedback data to patient workstation 114 (e.g. via the Internet, cellular network, etc.).
  • Fig. 16 is a schematic illustration of some exemplary guiding devices that can be used for providing navigation instructions to a user of a diagnostic device, in accordance with the presently disclosed subject matter.
  • Guiding device 1124 can be, for example, keyboard 1522, mouse 1524, navigation device 1526, etc.
  • keyboard 1522 can enable trained personnel 124 to provide 6-axis movement data 1520 to user 102 as indicated above.
  • keyboard 1522 can have a trackball that enables providing, for example, pitch, yaw and roll movement, arrow keys that enable for example up-down and left-right movement and other key or keys that enable back-forward motion. It is to be noted that this is a mere example as other keys can be used and other functionality can be defined for the trackball and for the keys.
  • the trackball is optional and keyboard keys can perform its functionality.
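The keyboard mapping described above can be sketched as a lookup from keys to axis movements; the key names and sign conventions are illustrative assumptions, and, as the text notes, other keys and other functionality can be defined.

```python
# Illustrative keyboard mapping; key names and signs are assumptions.
KEY_MAP = {
    "up":    ("up_down",      +1),
    "down":  ("up_down",      -1),
    "left":  ("left_right",   -1),
    "right": ("left_right",   +1),
    "pgup":  ("back_forward", +1),
    "pgdn":  ("back_forward", -1),
}

def key_to_axis(key):
    """Translate a pressed key into an (axis, direction) pair; trackball
    events would cover pitch, yaw and roll in the same way."""
    return KEY_MAP.get(key)
```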
  • mouse 1524 can be utilized. In such cases, mouse movement can enable for example up-down and left-right movement while mouse 1524 can have an additional trackball for enabling, for example, pitch, yaw and roll movement.
  • Back-forward motion can be represented for example by pressing a mouse key and moving the mouse backwards and forwards. It is to be noted that this is a mere example as other keys can be used and other functionality can be defined for the trackball, the mouse and the mouse keys.
  • navigation device 1526 can be used. Navigation device 1526 can comprise, for example, INS sensors or any other means that enables identifying navigation device 1526 motion, e.g. in 6 degrees of freedom.
  • Fig. 17 is a flowchart illustrating one example of a sequence of operations carried out for performing an automatic and remote trained personnel guided medical examination, in accordance with the presently disclosed subject matter.
  • the process begins with performance of a physical check initiation (step 1602).
  • Physical check initiation can include establishing and verification of a tp-patient connection and can include one or more of the following initiations: trained personnel check initiation 1602a, patient check initiation 1602b and device check initiation 1602c.
  • Trained personnel check initiation 1602a can include activating trained personnel workstation 122, including the display, the trained personnel camera 1122a, the trained personnel microphone 1122b and optionally guiding device 1124.
  • Trained personnel check initiation 1602a can further include retrieving relevant details relating to patient 103 (e.g. from one or more of: data repository 216, check plan repository 210, trained personnel data repository 123, patient & check plan repository 136 or any other location operatively connected to trained personnel workstation 122 on which patient data is stored) and displaying all or part of the retrieved details on trained personnel workstation 122 (e.g. on a display).
  • the retrieved data can include data relating to a patient specific check plan, reading references, communication parameters, etc.
  • Trained personnel check initiation can further include displaying data received from patient workstation 114 or diagnostics device 104, including image and voice data received (e.g.
  • Trained personnel check initiation 1602a can further include retrieving relevant details relating to patient 103 from external systems such as visit scheduling system, Electronic Medical Record (EMR) system or any other system or repository, which are relevant to the patient's examination.
  • Patient check initiation 1602b can include activating patient workstation 114, including the display, patient location camera 1114a, patient location microphone 1114b and establishment and verification of a tp-patient connection.
  • Patient check initiation 1602b can further include beginning to transmit (e.g. stream) data acquired by patient location camera 1114a and patient location microphone 1114b to trained personnel workstation 122, for example for displaying the acquired data to trained personnel 124.
  • Patient check initiation 1602b can further include retrieving relevant details relating to patient 103 (e.g.
  • patient check initiation 1602b can further include retrieving relevant details relating to patient 103 from external systems such as visit scheduling system, Electronic Medical Record (EMR) system or any other system or repository, which are relevant to the patient examination.
  • Device check initiation 1602c can include activating and checking the status of diagnostics device 104, including communication with patient workstation 114 and activation of one or more of diagnostic device 104 modules or sensors (e.g. diagnostics sensors 202 and/or navigation module 204 and/or guiding module and/or examination module). Device check initiation 1602c can further include the beginning of transmission (e.g. stream) of data acquired by diagnostics sensors 202 and/or navigation camera 420 to trained personnel workstation 122, for example for displaying the acquired data to trained personnel 124. It is to be noted that device check initiation 1602c can be performed, for example, by examination logic module 208.
  • patient workstation 114 and diagnostics device 104 can be configured to periodically or continuously transmit (e.g. stream, for example using Internet 116, cellular network, etc.) data such as images, video and voice to trained personnel workstation 122 for the purpose of displaying the data to trained personnel 124 (step 1603); an exemplary presentation on trained personnel workstation 122 display is provided with respect to Fig. 19.
  • trained personnel workstation 122 and patient workstation 114 can be configured to continuously or periodically transmit bi-directional video and audio both from patient workstation 114 to the trained personnel workstation 122 and vice versa (step 1603). This data transmission can be used for example for general patient view, device orientation & video conferencing, etc.
  • Trained personnel workstation 122 can be further configured to instruct trained personnel 124 to perform a questionnaire with respect to patient 103 (step 1604).
  • the questionnaire can be a pre-defined questionnaire or a questionnaire defined by trained personnel on-the-go.
  • the questionnaire can be presented to user 102 by trained personnel (for example utilizing trained personnel camera 1122a, trained personnel microphone 1122b), by patient workstation 114 (e.g. displaying the questions on patient workstation 114 display) or by any other means.
  • User 102 can provide answers to the questionnaire utilizing patient location camera 1114a, patient location microphone 1114b, in which case trained personnel 124 will type the answers to the questionnaire in trained personnel workstation (e.g. using a keyboard).
  • user 102 can provide answers to the questionnaire by typing the answers in patient workstation 114 (e.g. using a keyboard). It is to be noted that other methods, such as voice recording, etc. can be utilized in order to provide answers to the questionnaire.
  • the answers to the questionnaire can be stored for example in one or more of data repository 216, check plan repository 210, trained personnel data repository 123, patient & check plan repository 136 or any other location on which patient data is stored and that is operatively connected to trained personnel workstation 122.
  • a questionnaire can comprise generic and/or patient 103 specific questions designed to provide trained personnel 124 with a patient's medical data (e.g. data relating to patient 103 medical condition), including data required to enable analysis of the medical data acquired during the medical examinations (e.g. "does the patient have a fever and for how long?", "how high is it?", "does the patient feel any pain?", "where is the pain located?", etc.).
  • Trained personnel workstation 122 can be further configured to perform a medical examination selection and initiation (step 1606).
  • trained personnel workstation 122 can enable trained personnel 124 to select a medical examination to be performed, either manually or from a list of checks to be performed as defined in patient 103 check plan.
  • trained personnel workstation 122 can select and initiate a check according to a pre-defined order set by patient 103 specific check plan, without input from trained personnel 124.
  • the medical examination initiation can consist of, for example, retrieving reference medical examination data from the check plan or a relevant repository. The retrieved data can be displayed to trained personnel 124 on a display.
  • An exemplary presentation on trained personnel workstation 122 display is provided with respect to Fig. 19.
  • Medical examination initiation can also consist of sending relevant data to the patient workstation 114 and/or diagnostic device 104.
  • data can include for example user instructions and general guiding information, patient instructions and general guiding information, diagnostic device parameters (e.g. which check is currently being performed, required reading parameters, etc.), etc.
  • trained personnel workstation 122 can be configured to enable trained personnel 124 to provide user 102 with navigational instructions on how to navigate diagnostics device 104 to the desired spatial disposition with respect to patient's 103 body (or a specific part thereof) required for acquiring medical data (step 1610).
  • desired spatial disposition with respect to patient's 103 body (or a specific part thereof) can be defined, for example, manually or by the patient specific check plan.
  • Trained personnel 124 can view the data presented on trained personnel workstation 122 (including real-time or near real-time streaming data received from one or more of patient location camera 1114a, patient location microphone 1114b, diagnostics sensors 202, navigation module 204) and instruct trained personnel workstation 122 to provide user 102 with instructions for navigating diagnostics device 104 to the desired spatial disposition with respect to patient's 103 body (or a specific part thereof) for acquiring medical data of patient 103.
  • diagnostics device 104 can be configured to utilize navigation module 204, including, inter alia, activating INS sensors 410, navigation light source 426, navigation camera 420, distance sensors 430, pressure sensors 440, etc., and transmit (e.g. stream) all or part of the data acquired by any of them.
  • the navigation instructions can be provided by voice commands (e.g. by transmitting data acquired by trained personnel microphone 1122b to patient workstation 114 or to diagnostic device 104).
  • the navigation instructions can also be provided by utilizing guiding device 1124 that enables trained personnel 124 to perform the navigation and device spatial disposition correction virtually on trained personnel location 120.
  • the navigation performed by trained personnel 124 utilizing guiding device 1124 is analyzed and translated to voice commands that can be played to user 102.
  • the navigation performed by trained personnel 124 utilizing guiding device 1124 is presented to user 102 visually on patient workstation 114 (e.g. on patient workstation 114 display).
  • the movements made by trained personnel 124 using guiding device 1124 can be presented to user 102 using a representation of diagnostics device 104, such as, for example, shown in Fig. 13.
  • the voice and/or visual navigation instructions can be managed by guiding module 206 (e.g. speaker 510, display 502, etc.) of diagnostics device 104 or by patient workstation 114.
  • diagnostics device 104 can be configured to utilize INS sensors 410 for verifying that diagnostics device 104 movements performed by user 102 are in-line with the navigational instructions provided by trained personnel 124. In such cases, if there is a mismatch between diagnostics device 104 movements made by user 102 and the navigational instructions provided by trained personnel 124, diagnostics device 104 can be configured to notify user 102 of the mismatch, and present him with the required movement correction.
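The mismatch check described above can be sketched as a comparison between the instructed movement direction and the movement actually measured by INS sensors 410. This is a minimal illustration under stated assumptions, not the disclosed implementation; the vector representation and the angular tolerance are assumptions.

```python
import math

def movement_matches(instructed, measured, max_angle_deg=30.0):
    """Return True when the measured movement vector points roughly in the
    instructed direction (within max_angle_deg). Both arguments are
    hypothetical 3-component direction vectors."""
    dot = sum(a * b for a, b in zip(instructed, measured))
    norm_i = math.sqrt(sum(a * a for a in instructed))
    norm_m = math.sqrt(sum(b * b for b in measured))
    if norm_i == 0 or norm_m == 0:
        # No measurable movement, or no instruction: treat as a mismatch.
        return False
    cos_angle = max(-1.0, min(1.0, dot / (norm_i * norm_m)))
    return math.degrees(math.acos(cos_angle)) <= max_angle_deg

# On a mismatch, the device would notify the user by voice, vibration
# or an on-screen correction, as described above.
```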
  • Such notification can be a voice notification (for example using speaker 510), a vibration notification (for example using vibration elements 508) or an image notification (for example using the navigation guiding presentation (as shown in Fig. 13) displayed on patient workstation 114 display or on display 502).
  • trained personnel workstation 122 can be configured to enable trained personnel 124 to notify user 102 that diagnostics device 104 is in the required spatial disposition.
  • Such notification can be a voice notification (e.g. utilizing trained personnel microphone 1122b).
  • a vibrating notification can be provided by diagnostics device 104 (for example using vibration elements 508) and/or a visual notification can be presented on patient workstation 114 or on display 502 (for example following receipt of an indication from trained personnel 124 that diagnostics device 104 is in the required spatial disposition that can be provided by trained personnel 124 to trained personnel workstation 122, e.g. utilizing keyboard, etc.). It is to be noted that other notification methods can be utilized as well.
  • Upon arrival of diagnostics device 104 at the desired spatial disposition with respect to patient's 103 body (or a specific part thereof), trained personnel workstation 122 can be configured to enable trained personnel 124 to perform a remote reading and verification of the reading (step 1612). For that purpose, trained personnel workstation 122 can be configured to enable trained personnel 124 to instruct diagnostics device 104 to acquire medical data of patient 103 (e.g. using manual instruction and/or utilizing reading and verification logic module 212 and diagnostics sensors 202, inter alia as indicated above with respect to Figs. 2 and 3).
  • diagnostics device 104 can be configured to prepare for acquiring medical data of patient 103, to perform acquisition of such medical data and to transmit the acquired data to trained personnel workstation 122 for example for displaying the acquired data to trained personnel 124.
  • Trained personnel workstation 122 can be configured to enable trained personnel 124 to verify that the acquired data is of sufficient quality (e.g. in terms of signal quality, thresholds, length, etc.).
  • trained personnel workstation 122 can be configured to enable trained personnel 124 to re-acquire the required data or if needed to instruct user 102 and provide navigational instructions to user 102 for repositioning and reorienting diagnostics device 104 in order to bring diagnostics device 104 to desired spatial disposition with respect to patient's 103 body (or a specific part thereof). Following repositioning and reorienting of diagnostics device 104, reading and verification can be re-performed.
  • trained personnel workstation 122 can be configured to check if the medical examination is done (e.g. all medical examinations defined by patient 103 check plan have been performed). The check can be done either automatically by the trained personnel workstation 122 using the predefined check plan or manually by the trained personnel 124. In case the medical examination is not done, trained personnel workstation 122 can be configured to move to the next medical examination indicated by patient 103 check plan, or to allow the trained personnel 124 to do so manually. If all required medical examinations are performed, trained personnel workstation 122 can be configured to finalize the check, or to allow the trained personnel 124 to do so manually (step 1614).
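The sequencing logic of step 1614 (move to the next examination indicated by the check plan, or finalize when all are done) can be sketched roughly as follows. Representing the check plan as an ordered list of examination descriptors is an assumption made for illustration only.

```python
def next_examination(check_plan, completed):
    """Return the next pending examination from the ordered check plan,
    or None when all examinations have been performed (i.e. the check
    can be finalized)."""
    for exam in check_plan:
        if exam not in completed:
            return exam
    return None
```

The same check could equally be performed manually by the trained personnel; this sketch only captures the automatic variant.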
  • Fig. 18 is a flowchart illustrating one example of a sequence of operations carried out for navigating a diagnostic device and guiding a diagnostic device user accordingly in a remote trained personnel guided medical examination, in accordance with the presently disclosed subject matter.
  • Diagnostics device 104 can be configured to activate various diagnostics and navigation sensors (such as patient location camera 1114a, patient location microphone 1114b, diagnostics sensors 202, navigation module 204, etc.) while being moved by user 102 (step 1902). Diagnostics device 104 can be configured to continuously transmit (e.g. stream, for example in real time or near real-time) the data acquired by the various diagnostics and navigation sensors, inter alia to patient workstation 114 and/or to trained personnel workstation 122.
  • Patient workstation 114 can utilize the data acquired by the various navigation sensors for presenting (e.g. on a display) data on diagnostics device 104 movements to user 102 (step 1903). This allows user 102 to receive immediate feedback relating to diagnostics device 104 movement (prior to receiving delayed movement-correction feedback from trained personnel 124), thus making the navigation process easier.
  • the data on diagnostics device 104 movements can be presented to user 102 for example using a representation of diagnostics device 104, such as, for example, shown in Fig. 13.
  • Trained personnel workstation 122 can utilize the data acquired by the various navigation sensors for presenting (e.g. on a display) data on diagnostics device 104 movements to trained personnel 124 (step 1904).
  • the data on diagnostics device 104 movements can be presented to trained personnel 124 using a representation of diagnostics device 104, such as, for example, shown in Fig. 19 (see index 1940 in Fig. 19).
  • Trained personnel workstation 122 can further utilize data acquired by the various diagnostics sensors for presenting (e.g. on a display) the data to trained personnel 124 (step 1906).
  • the data acquired by the various diagnostics sensors can be presented to trained personnel 124 as, for example, shown in Fig. 19 (see index 1942 in Fig. 19).
  • Trained personnel 124 can then utilize the data presented to him (e.g. on a display) and determine if diagnostics device 104 is located in the desired spatial disposition with respect to patient's 103 body (or a specific part thereof) to acquire the required reading and if the current readings received from the diagnostics sensors are of sufficient quality (step 1908). If diagnostics device 104 is located in the desired spatial disposition with respect to patient's 103 body (or a specific part thereof) and the readings received from the diagnostics sensors are of sufficient quality, trained personnel workstation 122 can be configured to enable trained personnel 124 to instruct it to continue to the step of acquiring the medical data (step 1612).
  • trained personnel workstation 122 can be configured to enable trained personnel 124 to provide user 102 with instructions for navigating diagnostics device 104 to the desired spatial disposition with respect to patient's 103 body (or a specific part thereof) (step 1912).
  • the instructions can be voice instructions and/or visual instructions.
  • trained personnel workstation 122 can be configured to enable trained personnel 124 to remotely change or adjust various parameters in diagnostic device 104 (e.g. manually control diagnostic device 104 sensor parameters such as light intensity, camera focus, camera zoom, microphone sensitivity, etc.).
  • trained personnel workstation 122 can be configured to display the movements made by guiding device 1124 (e.g. on a display).
  • the display can present the movements for example using a representation of diagnostics device 104, as shown for example in Fig. 19 (see index 1940 in Fig. 19) (step 1914).
  • This presentation will allow trained personnel 124 to receive immediate feedback relating to his guiding movement, before receiving the delayed feedback of the corresponding movement made by user 102 using diagnostic device 104.
  • Trained personnel workstation 122 can be configured to transmit (e.g. stream in real time or near-real time) the instructions for correcting the navigation of diagnostics device 104 to the required spatial disposition with respect to patient's 103 body (or a specific part thereof) to patient workstation 114 or to diagnostic device 104 (step 1916).
  • Patient workstation 114 can be configured to provide user 102 with the voice and/or visual instructions provided by trained personnel 124 (step 1918).
  • the instructions can be provided for example by utilizing a display (e.g. display 502) and/or a speaker (e.g. speaker 510).
  • Visual instructions can be presented, for example, as shown and described with reference to Fig. 13 above.
  • Diagnostics device 104 can be further maneuvered by user 102 (step 1920) while repeating the process detailed above until diagnostics device 104 is positioned and oriented in the desired spatial disposition with respect to patient's 103 body (or a specific part thereof) for medical data acquisition.
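The overall feedback loop of Fig. 18 (stream navigation data, present it, evaluate the disposition, send corrections, repeat until the device is in place) can be summarized in a small sketch. The callable-based interface below is purely illustrative; none of these function names come from the disclosed system.

```python
def guide_to_disposition(stream, present, in_position, correct, max_iters=100):
    """Repeat the stream -> present -> evaluate -> correct cycle
    (steps 1902-1920) until the desired spatial disposition is reached."""
    for _ in range(max_iters):
        reading = stream()        # navigation data from the device
        present(reading)          # shown on patient and personnel workstations
        if in_position(reading):
            return True           # proceed to data acquisition (step 1612)
        correct(reading)          # navigation instructions back to the user
    return False                  # give up after max_iters correction rounds
```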
  • Fig. 19 is a schematic illustration of an exemplary navigation and guiding presentation to trained personnel, in accordance with the presently disclosed subject matter.
  • Trained personnel workstation 122 can be configured to display online visit screen 1930.
  • Online visit screen 1930 can be divided into several areas that can contain various data relevant for performing a remote trained personnel guided medical examination.
  • data can comprise for example patient & general information 1932, patient view 1936, organ readings - actual readings 1944, navigation and guiding presentation 1940, organ view - active sensor 1942 and application menu 1934.
  • Patient & general information 1932 can comprise, for example, various data and information about the patient and the online visit status, such as patient name, patient age, patient address, patient language, data relating to diseases and/or sensitivities to medicine, online visit date, time, duration, etc.
  • Patient view 1936 can present, for example, data received (e.g. streamed in real time) from patient location camera 1114a or patient location microphone 1114b for enabling trained personnel 124 to see and hear patient 103 and/or user 102.
  • This information can allow for example general patient 103 and diagnostics device 104 orientation as well as video-conferencing between trained personnel 124 and user 102 and/or patient 103.
  • Organ readings - actual readings 1944 can present, for example, data about reference readings and/or past readings of the organ to be checked. Upon acquiring a patient 103 organ reading (e.g. organ image, video or sound), the resulting reading, transferred from diagnostic device 104, can be presented in that area. In addition, organ readings - actual readings 1944 can allow video presentation, zooming, scaling, etc. It is to be noted that the reading data presented in this area does not require real-time update.
  • Navigation and guiding presentation 1940 can present the current diagnostics device 104 spatial disposition with respect to patient's 103 body (or a specific part thereof), the desired diagnostics device 104 spatial disposition with respect to patient's 103 body (or a specific part thereof) and the required correction movement to be performed to diagnostics device 104 in order to move it to the desired spatial disposition with respect to patient's 103 body (or a specific part thereof).
  • the area can also present trained personnel guiding device 1124 position and orientation based on trained personnel 124 movement.
  • Navigation and guiding presentation 1940 can also be configured to allow real-time presentation of the guiding/correction movement made by trained personnel 124 vs. the corresponding movement made by user 102 using diagnostic device 104, thus allowing visual presentation of the tracing of user 102 movements based on trained personnel 124 guiding & navigation correction.
  • Organ view - active sensor 1942 can present data received (e.g. streamed in real time or near real-time) from diagnostics sensors (e.g. image based sensors 310). Trained personnel 124 can use this data, inter alia, in order to determine if medical data acquisition can be performed (e.g. diagnostics device 104 is positioned and oriented as desired, the image quality is good, etc.). It is to be noted that trained personnel workstation 122 can be configured to use lower quality real-time (or near real-time) data streaming in organ view - active sensor area 1942 (e.g. to increase performance and allow general device positioning), while using a higher quality reading in the organ readings - actual readings area 1944 (e.g. a higher quality sensor reading, like a high definition image and sound, to be transferred not in real time).
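The two-tier quality scheme described above (a lower-quality real-time stream for positioning in area 1942, a higher-quality non-real-time transfer for the stored reading in area 1944) could be captured by a configuration along these lines. The profile names and values are illustrative assumptions, not values from the disclosure.

```python
# Hypothetical stream profiles for the two screen areas described above.
STREAM_PROFILES = {
    "organ_view_active_sensor": {    # area 1942: live positioning feedback
        "resolution": (640, 480),    # lower quality, low latency
        "real_time": True,
    },
    "organ_reading_actual": {        # area 1944: stored high-quality reading
        "resolution": (1920, 1080),  # high definition, transferred offline
        "real_time": False,
    },
}

def profile_for(purpose):
    """Look up the stream profile for a given presentation area."""
    return STREAM_PROFILES[purpose]
```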
  • Application menu 1934 can present for example various operational options for operating the system, such as beginning a medical examination, saving a medical examination, acquiring medical data, inserting various written data into the system (e.g. diagnostics data, comments, etc.), etc.
  • application menu 1934 can be configured to allow a remote control of diagnostic device 104 sensors (e.g. light intensity, zoom, focus, sound filters, etc.).
  • application menu 1934 can also be configured as a context-sensitive menu, e.g. the menu can add/remove functionality in relation to the specific window area currently in focus or being manipulated (e.g. adding/removing specific functions related to a specific window area).
  • Fig. 20 is a flowchart illustrating one example of a sequence of operations carried out for acquisition and verification of a reading by a diagnostic device in a remote trained personnel guided medical examination, in accordance with the presently disclosed subject matter.
  • Trained personnel workstation 122 can be configured to enable trained personnel 124 to provide user 102 with a notification that diagnostics device 104 is in the desired spatial disposition with respect to patient's 103 body (or a specific part thereof) (step 2002).
  • the notification can be a voice notification, e.g. a voice recording acquired by trained personnel microphone 1122b, transmitted (e.g. streamed) to patient workstation 114 that can be configured to play it to user 102 (e.g. utilizing speaker 510).
  • the notification can be a visual notification, as trained personnel 124 can instruct trained personnel workstation 122 to instruct patient workstation to display for example a notification on a display of patient workstation 114 or diagnostic device 104.
  • the notification can, for example, instruct user 102 not to move diagnostics device 104.
  • Trained personnel workstation 122 can be configured to enable trained personnel 124 to instruct diagnostics device 104 and the diagnostics sensors to prepare to acquire medical data of patient 103 (step 2004).
  • the preparation can be defined by the patient specific check plan or according to instructions provided by trained personnel 124.
  • Such preparations can include preparing diagnostics sensors 202 to acquire medical data according to the patient specific check plan. Exemplary preparations include setting image acquisition sensor 316 zoom and/or focus, activating light sources 318 at correct power, activating sound acquisition sensor 324, etc.
  • diagnostic device 104 can be configured to retrieve the relevant reading parameters and thresholds for example from the patient specific check plan (e.g. the required length of reading, reference thresholds such as minimal sound volume, etc.). It is to be noted that trained personnel 124 can also manually adjust or change the relevant reading parameters and thresholds (e.g. override the patient specific check plan).
  • Trained personnel workstation 122 can be configured to enable trained personnel 124 to re-evaluate diagnostics device 104 current spatial disposition with respect to patient's 103 body (or a specific part thereof) and verify that no movements have been made and that it is still located in the desired spatial disposition with respect to patient's 103 body (or a specific part thereof) (step 2005).
  • trained personnel workstation 122 can be configured to enable trained personnel 124 to return to the navigation and guiding process (1610).
  • trained personnel workstation 122 can be configured to enable trained personnel 124 to perform medical data acquisition utilizing diagnostics device 104 (step 2006).
  • the medical data can be acquired according to the check plan that, as indicated above, can include information, inter alia, about the examination process, steps and logic, and predefined reading parameters such as the type of sensor to be used (still image vs. video), the required length of reading (sound or video recording) in terms of time (e.g. seconds), and reading data thresholds (for example, a definition of acceptable minimal and/or maximal reading limits to be used as a quality parameter of a reading).
  • For example, the check plan can define that the sound based sensors 320 are to be used and that the reading length should be 3 seconds, or between 2.5 and 5 seconds, etc.
  • the data can be transmitted (e.g. streamed) to trained personnel workstation 122 which can then display the acquired data to trained personnel 124, as shown for example in Fig. 19 (see index 1944 in Fig. 19) (step 2007).
  • Trained personnel workstation 122 can be configured to enable trained personnel 124 to verify that the acquired medical data meets pre-defined standards (e.g. a required length of reading, reading data thresholds, etc.) (step 2008). For example, if the heart is to be checked, and the check plan defines that the reading length should be between 2.5 and 5 seconds, trained personnel workstation 122 can be configured to enable trained personnel 124 to check that the reading length meets the requirement. In case the acquired medical data does not meet the pre-defined standards, trained personnel workstation 122 can be configured to enable trained personnel 124 to check whether the acquired medical data is nevertheless acceptable (for example, that the acquired medical data is of sufficient quality, etc.).
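The verification of step 2008 against check-plan limits (e.g. the heart example, with a reading length between 2.5 and 5 seconds) amounts to a simple range check. The sketch below uses assumed parameter names and only covers the reading-length threshold, not other quality criteria.

```python
def reading_meets_standards(reading_length_s, min_length_s, max_length_s):
    """Return True when the acquired reading length (in seconds) falls
    within the minimal/maximal limits taken from the check plan."""
    return min_length_s <= reading_length_s <= max_length_s

# Heart example from the check plan: a 3-second sound reading is acceptable
# against limits of 2.5 to 5 seconds; a 2-second reading is not.
```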
  • trained personnel workstation 122 can be configured to enable trained personnel 124 to perform a manual reading (step 2009).
  • trained personnel workstation 122 can be configured to enable trained personnel 124 to manually adjust or control different diagnostic device 104 parameters such as light intensity, camera focus, camera zoom, reading duration, sound filtering, etc.
  • trained personnel workstation 122 can be configured to enable trained personnel 124 to return to step 2004 (in order to retry acquiring the medical data). If the process did not complete successfully, trained personnel workstation 122 can be configured to issue a notification to trained personnel 124 of a potential error (for example by presenting a message on trained personnel workstation 122, etc.) and enable him to decide if the acquired medical data is to be saved or not.
  • trained personnel workstation 122 can be configured to enable trained personnel 124 to save the acquired medical data (for example, in one or more of a data repository 216, patient & check plan repository 136, trained personnel data repository 123 or any other location operatively connected to diagnostics device 104 on which patient data is stored) (step 2014).
  • trained personnel workstation 122 can be configured to update the reference data with the acquired medical data (step 2016). This can be performed in order to keep the reference data up to date, as changes can occur to the human body (for example due to growth, aging, medical treatments, etc.).
  • It is to be noted that the operations described with reference to diagnostics device 104 can be performed, alternatively or additionally, by patient workstation 114 or by any other suitable device, including, but not limited to, trained personnel workstation 122, central system 130, etc.
  • It is to be noted that the system may be a suitably programmed computer.
  • the presently disclosed subject matter contemplates a computer program being readable by a computer for executing the method of the presently disclosed subject matter.
  • the presently disclosed subject matter further contemplates a machine-readable memory tangibly embodying a program of instructions executable by the machine for executing the method of the presently disclosed subject matter.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Pathology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • Biophysics (AREA)
  • Business, Economics & Management (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Human Computer Interaction (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

A method for performing one or more medical examinations of a patient using a diagnostics device, the method comprising, for at least one of the medical examinations: providing reference data indicative of a desired spatial disposition of the device with respect to the patient's body for performing the medical examination; operating the device to acquire navigation data; determining a spatial disposition of the device relative to the desired spatial disposition, using the acquired navigation data and the reference data; calculating a required movement correction from the determined spatial disposition to the desired spatial disposition, for acquiring the patient's medical data according to said medical examination; providing a user with maneuvering instructions for directing the device to the desired spatial disposition in accordance with the calculated route; and acquiring the medical data upon arrival at the desired spatial disposition.
PCT/IL2012/050050 2011-02-17 2012-02-16 Système et procédé permettant d'exécuter un examen médical automatique et autoguidé WO2012111012A1 (fr)

Priority Applications (7)

Application Number Priority Date Filing Date Title
CA2827523A CA2827523C (fr) 2011-02-17 2012-02-16 Systeme et procede permettant d'executer un examen medical automatique et autoguide
CN201710067747.2A CN107115123B (zh) 2011-02-17 2012-02-16 用于远程地调整声音采集传感器参数的系统和方法
AU2012219076A AU2012219076B2 (en) 2011-02-17 2012-02-16 System and method for performing an automatic and self-guided medical examination
CN201280018592.9A CN103781403B (zh) 2011-02-17 2012-02-16 用于执行自动的且自我指导的医学检查的系统和方法
US14/000,374 US8953837B2 (en) 2011-02-17 2012-02-16 System and method for performing an automatic and self-guided medical examination
EP12746572.2A EP2675345B1 (fr) 2011-02-17 2012-02-16 Système et procédé permettant d'exécuter un examen médical automatique et autoguidé
JP2013554060A JP6254846B2 (ja) 2011-02-17 2012-02-16 自動の及び自己ガイドされる医学検査を行うためのシステム及び方法

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201161443767P 2011-02-17 2011-02-17
US201161443788P 2011-02-17 2011-02-17
US61/443,767 2011-02-17

Publications (1)

Publication Number Publication Date
WO2012111012A1 true WO2012111012A1 (fr) 2012-08-23

Family

ID=47010818

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/IL2012/050050 WO2012111012A1 (fr) 2011-02-17 2012-02-16 Système et procédé permettant d'exécuter un examen médical automatique et autoguidé
PCT/IL2012/050051 WO2012111013A1 (fr) 2011-02-17 2012-02-16 Système et procédé permettant d'exécuter un examen médical automatique et distant guidé par un personnel qualifié

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/IL2012/050051 WO2012111013A1 (fr) 2011-02-17 2012-02-16 Système et procédé permettant d'exécuter un examen médical automatique et distant guidé par un personnel qualifié

Country Status (1)

Country Link
WO (2) WO2012111012A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014045558A1 (fr) * 2012-09-20 2014-03-27 Sony Corporation Appareil de traitement d'informations, procédé de traitement d'informations, programme et système de mesure
WO2016124616A1 (fr) * 2015-02-03 2016-08-11 Koninklijke Philips N.V. Procédés, systèmes, et appareil portable pour obtenir de multiples paramètres de santé
US11877831B2 (en) 2022-03-14 2024-01-23 O/D Vision Inc. Systems and methods for artificial intelligence based blood pressure computation based on images of the outer eye

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103905710A (zh) * 2012-12-25 2014-07-02 夏普株式会社 图像捕获方法及其移动终端和设备
US11475997B2 (en) * 2020-02-21 2022-10-18 Shanghai United Imaging Intelligence Co., Ltd. Systems and methods for automated healthcare services

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5527261A (en) 1994-08-18 1996-06-18 Welch Allyn, Inc. Remote hand-held diagnostic instrument with video imaging
US20020188227A1 (en) * 2001-06-11 2002-12-12 Hoon Chong Stethoscope system for self-examination using internet
US6778846B1 (en) * 2000-03-30 2004-08-17 Medtronic, Inc. Method of guiding a medical device and system regarding same
WO2007072356A2 (fr) 2005-12-21 2007-06-28 Koninklijke Philips Electronics N.V. System for positioning sensors for monitoring a patient
US20080166033A1 (en) * 2003-06-05 2008-07-10 General Electric Company Method, system and apparatus for processing radiographic images of scanned objects
US20090163774A1 (en) * 2007-12-20 2009-06-25 Sudeesh Thatha Management and Diagnostic System for Patient Monitoring and Symptom Analysis
WO2009141769A1 (fr) 2008-05-19 2009-11-26 Koninklijke Philips Electronics N.V. Reproducible positioning of sensing and/or treatment devices
US20090299155A1 (en) * 2008-01-30 2009-12-03 Dexcom, Inc. Continuous cardiac marker sensor system
US20100312484A1 (en) * 2009-06-05 2010-12-09 Duhamel James B System for monitoring of and managing compliance with treatment for obstructive sleep apnea using oral appliance therapy and method therefor
US20110015504A1 (en) * 2008-03-04 2011-01-20 Samsung Electronics Co., Ltd. Remote medical diagnostic device including bio-mouse and bio-keyboard, and method using the same

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7998062B2 (en) 2004-03-29 2011-08-16 Superdimension, Ltd. Endoscope structures and techniques for navigating to a target in branched structure
US20040073455A1 (en) 2002-05-08 2004-04-15 University Of Rochester Medical Center Child care telehealth access network
JP4088104B2 (ja) 2002-06-12 2008-05-21 Toshiba Corporation Ultrasonic diagnostic apparatus
EP1614040A4 (fr) * 2003-04-08 2009-03-11 Medic4All Ag Wireless portable gateway for remote medical examination
US20050273359A1 (en) * 2004-06-03 2005-12-08 Young David E System and method of evaluating preoperative medical care and determining recommended tests based on patient health history and medical condition and nature of surgical procedure
US20060155589A1 (en) * 2005-01-10 2006-07-13 Welch Allyn, Inc. Portable vital signs measurement instrument and method of use thereof
US20090216113A1 (en) 2005-11-17 2009-08-27 Eric Meier Apparatus and Methods for Using an Electromagnetic Transponder in Orthopedic Procedures
EP2448512B1 (fr) 2009-06-29 2021-10-27 Koninklijke Philips N.V. Tracking apparatus for a medical intervention

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
PARNIAN: "Integration of Local Positioning System & Strapdown Inertial Navigation System for Hand-Held Tool Tracking", Thesis, 2008, University of Waterloo, Retrieved from the Internet <URL:http://uwspace.uwaterloo.ca/bitstream/10012/4043/1/Neda Pamian Thesis.pdf> [retrieved on 2012-07-05] *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014045558A1 (fr) * 2012-09-20 2014-03-27 Sony Corporation Information processing apparatus, information processing method, program, and measuring system
CN104684461A (zh) * 2012-09-20 2015-06-03 Sony Corporation Information processing apparatus, information processing method, program, and measurement system
WO2016124616A1 (fr) * 2015-02-03 2016-08-11 Koninklijke Philips N.V. Methods, systems, and wearable apparatus for obtaining multiple health parameters
RU2720666C2 (ru) * 2015-02-03 2020-05-12 Конинклейке Филипс Н.В. Способы, системы и носимое устройство для получения множества показателей состояния здоровья
US10842390B2 (en) 2015-02-03 2020-11-24 Koninklijke Philips N.V. Methods, systems, and wearable apparatus for obtaining multiple health parameters
US11877831B2 (en) 2022-03-14 2024-01-23 O/D Vision Inc. Systems and methods for artificial intelligence based blood pressure computation based on images of the outer eye

Also Published As

Publication number Publication date
WO2012111013A1 (fr) 2012-08-23

Similar Documents

Publication Publication Date Title
US20210030275A1 (en) System and method for remotely adjusting sound acquisition sensor parameters
CA2827523C (fr) System and method for performing an automatic and self-guided medical examination
KR102662173B1 (ko) Medical assistive device
US11297285B2 (en) Dental and medical loupe system for lighting control, streaming, and augmented reality assisted procedures
JP4296278B2 (ja) Medical cockpit system
WO2012111012A1 (fr) System and method for performing an automatic and self-guided medical examination
CA3205931A1 (fr) Systems and methods for assisting medical procedures
JP2022000763A (ja) System and method for performing an automatic and remote medical examination guided by a trained person
JP2017102962A (ja) System and method for performing an automatic and remote medical examination guided by a trained person
CN103717127B (zh) System and method for performing an automatic and remote medical examination guided by trained personnel
US20240153618A1 (en) Systems and methods for automated communications
WO2024144467A1 (fr) Method and system for digital stethoscopes and remote patient monitoring and management
TW202044268A (zh) Medical robot and medical record information integration system

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 12746572

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2827523

Country of ref document: CA

ENP Entry into the national phase

Ref document number: 2013554060

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 14000374

Country of ref document: US

WWE Wipo information: entry into national phase

Ref document number: 2012746572

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2012219076

Country of ref document: AU

Date of ref document: 20120216

Kind code of ref document: A

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (PCT application filed from 20040101)