US20180197624A1 - Medical assistant - Google Patents

Medical assistant

Info

Publication number
US20180197624A1
Authority
US
United States
Prior art keywords
user
wearable device
patient
information
medical
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/865,023
Other languages
English (en)
Inventor
Nastasja U. Robaina
Nicole Elizabeth Samec
Mark Baerenrodt
Christopher M. Harrises
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Magic Leap Inc
Original Assignee
Magic Leap Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Magic Leap Inc filed Critical Magic Leap Inc
Priority to US15/865,023
Assigned to MAGIC LEAP, INC. reassignment MAGIC LEAP, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ROBAINA, Nastasja U., BAERENRODT, MARK, SAMEC, Nicole Elizabeth
Assigned to MAGIC LEAP, INC. reassignment MAGIC LEAP, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: Harrises, Christopher M., TECHNICAL SOLUTIONS, INC.
Publication of US20180197624A1
Assigned to JP MORGAN CHASE BANK, N.A. reassignment JP MORGAN CHASE BANK, N.A. PATENT SECURITY AGREEMENT Assignors: MAGIC LEAP, INC., MENTOR ACQUISITION ONE, LLC, MOLECULAR IMPRINTS, INC.
Assigned to CITIBANK, N.A. reassignment CITIBANK, N.A. ASSIGNMENT OF SECURITY INTEREST IN PATENTS Assignors: JPMORGAN CHASE BANK, N.A.
Priority to US18/056,164 (published as US12080393B2)

Classifications

    • A HUMAN NECESSITIES
      • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
        • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
          • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
            • A61B3/0016 Operational features thereof
              • A61B3/0041 characterised by display arrangements
            • A61B3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
              • A61B3/113 for determining or recording eye movement
          • A61B5/00 Measuring for diagnostic purposes; Identification of persons
            • A61B5/0059 using light, e.g. diagnosis by transillumination, diascopy, fluorescence
              • A61B5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
            • A61B5/044
            • A61B5/06 Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
            • A61B5/117 Identification of persons
              • A61B5/1171 based on the shapes or appearances of their bodies or parts thereof
                • A61B5/1176 Recognition of faces
            • A61B5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
              • A61B5/316 Modalities, i.e. specific diagnostic methods
                • A61B5/318 Heart-related electrical modalities, e.g. electrocardiography [ECG]
                  • A61B5/339 Displays specially adapted therefor
            • A61B5/48 Other medical applications
              • A61B5/4803 Speech analysis specially adapted for diagnostic purposes
          • A61B17/00 Surgical instruments, devices or methods, e.g. tourniquets
            • A61B2017/00017 Electrical control of surgical instruments
              • A61B2017/00207 with hand gesture control or hand gesture recognition
              • A61B2017/00216 with eye tracking or head position tracking control
          • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
            • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
              • A61B2034/2046 Tracking techniques
                • A61B2034/2048 using an accelerometer or inertia sensor
          • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
            • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
              • A61B90/37 Surgical systems with images on a monitor during operation
                • A61B2090/372 Details of monitor hardware
              • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
                • A61B2090/365 augmented reality, i.e. correlating a live optical image with another image
            • A61B90/50 Supports for surgical instruments, e.g. articulated arms
              • A61B2090/502 Headgear, e.g. helmet, spectacles
    • G PHYSICS
      • G02 OPTICS
        • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
          • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
            • G02B27/0093 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
            • G02B27/01 Head-up displays
              • G02B27/017 Head mounted
                • G02B27/0172 characterised by optical features
              • G02B27/0101 characterised by optical features
                • G02B2027/0127 comprising devices increasing the depth of field
                • G02B2027/0138 comprising image capture systems, e.g. camera
                • G02B2027/014 comprising information/image processing systems
                • G02B2027/0141 characterised by the informative content of the display
      • G06 COMPUTING; CALCULATING OR COUNTING
        • G06F ELECTRIC DIGITAL DATA PROCESSING
          • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
            • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
              • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
                • G06F3/012 Head tracking input arrangements
                • G06F3/013 Eye tracking input arrangements
              • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
          • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
            • G06F16/20 of structured data, e.g. relational data
              • G06F16/22 Indexing; Data structures therefor; Storage structures
              • G06F16/23 Updating
                • G06F16/2379 Updates performed during online database operations; commit processing
          • G06F17/2705
          • G06F17/2775
          • G06F17/30312
          • G06F17/30377
          • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
            • G06F21/30 Authentication, i.e. establishing the identity or authorisation of security principals
              • G06F21/31 User authentication
                • G06F21/32 using biometric data, e.g. fingerprints, iris scans or voiceprints
            • G06F21/60 Protecting data
              • G06F21/62 Protecting access to data via a platform, e.g. using keys or access control rules
                • G06F21/6218 to a system of files or objects, e.g. local or distributed file system or database
                  • G06F21/6245 Protecting personal data, e.g. for financial or medical purposes
          • G06F40/00 Handling natural language data
            • G06F40/20 Natural language analysis
              • G06F40/205 Parsing
              • G06F40/279 Recognition of textual entities
                • G06F40/289 Phrasal analysis, e.g. finite state techniques or chunking
          • G06K9/00671
        • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
          • G06N20/00 Machine learning
        • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
          • G06V20/00 Scenes; Scene-specific elements
            • G06V20/20 in augmented reality scenes
          • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
            • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
              • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
      • G08 SIGNALLING
        • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
          • G08B5/00 Visible signalling systems, e.g. personal calling systems, remote indication of seats occupied
            • G08B5/22 using electric transmission; using electromagnetic transmission
          • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
            • G08B21/02 Alarms for ensuring the safety of persons
      • G10 MUSICAL INSTRUMENTS; ACOUSTICS
        • G10L SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
          • G10L15/00 Speech recognition
            • G10L15/26 Speech to text systems
            • G10L15/265
      • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
        • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
          • G16H10/00 ICT specially adapted for the handling or processing of patient-related medical or healthcare data
            • G16H10/60 for patient-specific data, e.g. for electronic patient records
          • G16H20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
            • G16H20/40 relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
          • G16H30/00 ICT specially adapted for the handling or processing of medical images
            • G16H30/40 for processing medical images, e.g. editing
          • G16H40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
            • G16H40/60 for the operation of medical equipment or devices
              • G16H40/63 for local operation
              • G16H40/67 for remote operation
          • G16H80/00 ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring

Definitions

  • the present disclosure relates to virtual reality and augmented reality imaging and visualization systems in a healthcare setting.
  • a virtual reality, or “VR”, scenario typically involves presentation of digital or virtual image information without transparency to other actual real-world visual input;
  • an augmented reality, or “AR”, scenario typically involves presentation of digital or virtual image information as an augmentation to visualization of the actual world around the user;
  • the human visual perception system is very complex, and producing a VR, AR, or MR technology that facilitates a comfortable, natural-feeling, rich presentation of virtual image elements amongst other virtual or real-world imagery elements is challenging.
  • Systems and methods disclosed herein address various challenges related to VR, AR and MR technology.
  • a wearable device can present virtual content to a wearer for many applications in a healthcare setting.
  • the wearer may be a patient or a healthcare provider (HCP).
  • Such applications can include, but are not limited to, access, display, and modification of patient medical records and sharing patient medical records among authorized HCPs.
  • the patient medical records can be stored in a centralized location and owned by the patient, rather than by various HCP organizations (e.g., hospitals, clinics, doctors' offices) whose services the patient may use.
  • the wearable device can access and display portions of the patient's medical record to authorized HCPs. Because the patient's medical record is centrally stored and modified whenever the patient has a procedure or treatment, the medical record can remain substantially complete.
  • the wearable device can display to an attending HCP virtual content associated with the patient or the patient's medical record.
  • the HCP can use the wearable device to update the patient's medical record to account for the results of a procedure or treatment.
  • the HCP can use the wearable device to share some or all of the patient's medical record with other authorized HCPs.
  • An outward-facing camera of the wearable device can image and track medical instruments used during a medical procedure.
  • the wearable device can image portions of the patient during a procedure.
  • the wearable device can display an alert to the HCP so that the instrument can be removed from the patient's body or so that the authorized procedure or protocol is followed.
  • FIG. 1A depicts an illustration of a mixed reality scenario with certain virtual reality objects, and certain physical objects viewed by a person.
  • FIG. 1B illustrates a field of view and a field of regard for a wearer of a wearable system.
  • FIG. 2A schematically illustrates an example of a wearable system.
  • FIG. 2B shows a schematic view of an example of various components of a wearable system comprising environmental sensors.
  • FIG. 3 schematically illustrates aspects of an approach for simulating three-dimensional imagery using multiple depth planes.
  • FIG. 4 schematically illustrates an example of a waveguide stack for outputting image information to a user.
  • FIG. 5 shows example exit beams that may be outputted by a waveguide.
  • FIG. 6 is a schematic diagram showing an optical system including a waveguide apparatus, an optical coupler subsystem to optically couple light to or from the waveguide apparatus, and a control subsystem, used in the generation of a multi-focal volumetric display, image, or light field.
  • FIG. 7 is a block diagram of an example of a wearable system.
  • FIG. 8 is a process flow diagram of an example of a method of rendering virtual content in relation to recognized objects.
  • FIG. 9 is a block diagram of another example of a wearable system.
  • FIG. 10 is a process flow diagram of an example of a method for determining a user input to a wearable system.
  • FIG. 11 is a process flow diagram of an example of a method for interacting with a virtual user interface.
  • FIG. 12 illustrates an example computing environment in which multiple wearable devices and healthcare provider systems can interact with each other in a healthcare setting to provide for medical record management.
  • FIGS. 13A, 13B, 13C, and 13D illustrate example processes for interacting with a healthcare database system.
  • FIG. 14A illustrates an example of accessing a virtual medical record based on an access privilege associated with the virtual medical record.
  • FIG. 14B illustrates a flowchart that shows an example process for accessing a virtual medical record based on an access privilege.
  • FIG. 15 illustrates an example of recording and processing audio data associated with an interaction between a patient and a healthcare provider.
  • FIG. 16 is a flowchart that shows an example process for documenting a medical event by a healthcare provider (HCP).
  • FIG. 17 schematically illustrates an overall system view depicting multiple devices interacting with each other.
  • FIG. 18 illustrates an example of sharing medical information among multiple healthcare providers.
  • FIG. 19 illustrates an example of adding virtual content to images taken during a medical procedure.
  • FIG. 20 is a flowchart that illustrates an example process of sharing virtual content between multiple healthcare providers.
  • FIGS. 21, 22A, 22B, and 23 illustrate examples of presenting virtual content based on contextual information.
  • FIG. 24 is a flowchart that illustrates an example process of accessing and presenting virtual content based on contextual information.
  • FIG. 25 schematically illustrates an example of a medical procedure occurring in an operating room having a sterile region.
  • FIG. 26 is a flowchart that illustrates an example process of tracking medical objects in a sterile region.
  • Wearable devices that can present virtual content to the wearer can have a number of applications in a healthcare setting. Such applications can include, but are not limited to, accessing, displaying, and modifying of patient medical records and sharing patient medical records among authorized healthcare providers (HCPs).
  • the patient medical records can be stored in a centralized location and owned by the patient, rather than by various HCP organizations (e.g., hospitals, clinics, doctors' offices) whose services the patient may use.
  • the wearable device can access and display to authorized personnel portions of the patient's medical record.
  • the medical record can remain substantially complete (as compared to the currently common piecemeal scenario in which each HCP organization that treats the patient stores and modifies its own medical record associated with the patient). Additionally, because the patient's medical record is stored and updated substantially in real time whenever the patient has a procedure or treatment, the medical record can remain substantially unbiased, accurate, and objective (as compared to the currently common scenario in which each HCP stores and updates the patient's record some time after treating the patient and may introduce subjective information into the record due to inaccurate memories).
  • the wearable device can display to an attending HCP virtual content associated with the patient or the patient's medical record.
  • the HCP can use the wearable device to update the patient's medical record to account for the results of the procedure or treatment.
  • the HCP can use the wearable device to share some or all of the patient's medical record with other authorized HCPs (e.g., a surgeon can share the medical record with the pathologist during an operation on the patient).
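  • As an illustrative aid (not part of the patent text), the following minimal Python sketch models the patient-owned, centrally stored record described above; the class and method names are hypothetical. The patient grants or revokes access per HCP, and every update is appended as a timestamped entry rather than overwriting history:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Set


@dataclass
class RecordEntry:
    """One append-only entry in the patient's medical record."""
    author_id: str            # HCP who added the entry
    timestamp: datetime
    content: str              # e.g., diagnosis text, procedure notes


@dataclass
class PatientMedicalRecord:
    """Centrally stored record owned by the patient, not by any HCP organization."""
    patient_id: str
    authorized_hcps: Set[str] = field(default_factory=set)
    entries: List[RecordEntry] = field(default_factory=list)

    def grant_access(self, hcp_id: str) -> None:
        self.authorized_hcps.add(hcp_id)

    def revoke_access(self, hcp_id: str) -> None:
        self.authorized_hcps.discard(hcp_id)

    def read(self, hcp_id: str) -> List[RecordEntry]:
        if hcp_id not in self.authorized_hcps:
            raise PermissionError(f"{hcp_id} is not authorized by the patient")
        return list(self.entries)

    def append(self, hcp_id: str, content: str) -> None:
        """Updates are appended in (near) real time; history is never overwritten."""
        if hcp_id not in self.authorized_hcps:
            raise PermissionError(f"{hcp_id} is not authorized by the patient")
        self.entries.append(RecordEntry(hcp_id, datetime.now(timezone.utc), content))


# Example: a surgeon documents a procedure and the record is shared with a pathologist.
record = PatientMedicalRecord(patient_id="patient-001")
record.grant_access("surgeon-42")
record.append("surgeon-42", "Appendectomy performed; specimen sent to pathology.")
record.grant_access("pathologist-7")
print(len(record.read("pathologist-7")))  # 1
```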
  • a danger to a patient during a medical operation is the possibility of a surgeon leaving a foreign object (e.g., a medical instrument such as, e.g., a scalpel) inside the patient's body.
  • An outward-facing camera of the wearable device can image medical instruments used during the operation, and the wearable device can track the location of the medical instruments. If a foreign object were to be left inside the patient's body, the wearable system can display an alert to the surgeon so that the foreign object can be removed from the patient's body before the operation is completed.
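  • A minimal sketch of the foreign-object safeguard described above, assuming the outward-facing camera's instrument detections have already been converted into "entered"/"removed" events (the class and event names are hypothetical):

```python
from dataclasses import dataclass, field
from typing import Set


@dataclass
class InstrumentTracker:
    """Tracks which imaged instruments are currently inside the surgical site.

    A real system would drive this from outward-facing camera detections and
    computer vision; here the events are supplied directly for illustration.
    """
    inside_patient: Set[str] = field(default_factory=set)

    def instrument_entered(self, instrument_id: str) -> None:
        self.inside_patient.add(instrument_id)

    def instrument_removed(self, instrument_id: str) -> None:
        self.inside_patient.discard(instrument_id)

    def closing_check(self) -> Set[str]:
        """Instruments still unaccounted for; a non-empty set should raise an alert."""
        return set(self.inside_patient)


tracker = InstrumentTracker()
tracker.instrument_entered("scalpel-3")
tracker.instrument_entered("sponge-12")
tracker.instrument_removed("scalpel-3")

leftovers = tracker.closing_check()
if leftovers:
    # The wearable display would render this as a visual alert to the surgeon.
    print(f"ALERT: remove before closing: {sorted(leftovers)}")
```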
  • a wearable system (also referred to herein as an augmented reality (AR) system) can be configured to present 2D or 3D virtual images to a user.
  • the images may be still images, frames of a video, or a video, in combination or the like.
  • At least a portion of the wearable system can be implemented on a wearable device that can present a VR, AR, or MR environment, alone or in combination, for user interaction.
  • the term “wearable device” can be used interchangeably with “AR device” (ARD).
  • the term “AR” is used interchangeably with the term “MR”.
  • FIG. 1A depicts an illustration of a mixed reality scenario with certain virtual reality objects, and certain physical objects viewed by a person.
  • an MR scene 100 is depicted wherein a user of an MR technology sees a real-world park-like setting 110 featuring people, trees, buildings in the background, and a concrete platform 120 .
  • the user of the MR technology also perceives that he “sees” a robot statue 130 standing upon the real-world platform 120 , and a cartoon-like avatar character 140 flying by which seems to be a personification of a bumble bee, even though these elements do not exist in the real world.
  • it is desirable for each point in the display's visual field to generate the accommodative response corresponding to its virtual depth. If the accommodative response to a display point does not correspond to the virtual depth of that point, as determined by the binocular depth cues of convergence and stereopsis, the human eye may experience an accommodation conflict, resulting in unstable imaging, harmful eye strain, headaches, and, in the absence of accommodation information, almost a complete lack of surface depth.
  • FIG. 1B illustrates a person's field of view (FOV) and field of regard (FOR).
  • the FOV comprises a portion of an environment of the user that is perceived at a given time by the user. This field of view can change as the person moves about, moves their head, or moves their eyes or gaze.
  • the FOR comprises a portion of the environment around the user that is capable of being perceived by the user via the wearable system.
  • the field of regard may include substantially all of the 4π steradian solid angle surrounding the wearer, because the wearer can move his or her body, head, or eyes to perceive substantially any direction in space.
  • the user's movements may be more constricted, and accordingly the user's field of regard may subtend a smaller solid angle.
  • FIG. 1B shows such a field of view 155 including central and peripheral regions. The central field of view will provide a person a corresponding view of objects in a central region of the environmental view.
  • the peripheral field of view will provide a person a corresponding view of objects in a peripheral region of the environmental view.
  • the field of view 155 may include objects 121 , 122 .
  • the central field of view 145 includes the object 121 , while the other object 122 is in the peripheral field of view.
  • the field of view (FOV) 155 can contain multiple objects (e.g. objects 121 , 122 ).
  • the field of view 155 can depend on the size or optical characteristics of the AR system, for example, clear aperture size of the transparent window or lens of the head mounted display through which light passes from the real world in front of the user to the user's eyes.
  • the field of view 155 can correspondingly change, and the objects within the field of view 155 may also change.
  • the wearable system may include sensors such as cameras that monitor or image objects in the field of regard 165 as well as objects in the field of view 155 .
  • the wearable system may alert the user of unnoticed objects or events occurring in the user's field of view 155 and/or occurring outside the user's field of view but within the field of regard 165 .
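  • The FOV/FOR distinction above can be illustrated with a small geometric sketch (the half-angle values are illustrative, not taken from the patent): an object's direction is classified against the user's gaze, and an alert is raised for events that fall outside the current field of view but within the field of regard.

```python
import math


def angle_between(gaze, target) -> float:
    """Angle in degrees between the gaze direction and the direction to an object."""
    dot = sum(g * t for g, t in zip(gaze, target))
    norm = math.sqrt(sum(g * g for g in gaze)) * math.sqrt(sum(t * t for t in target))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))


def classify(gaze, target, central_half_angle=15.0, fov_half_angle=55.0):
    """Classify an object as central FOV, peripheral FOV, or FOR-only.

    The half-angles are illustrative values, not figures from the patent.
    """
    a = angle_between(gaze, target)
    if a <= central_half_angle:
        return "central field of view"
    if a <= fov_half_angle:
        return "peripheral field of view"
    return "field of regard only"  # perceivable if the user turns, or via sensors


gaze = (0.0, 0.0, -1.0)           # user looking straight ahead
behind_user = (0.0, 0.0, 1.0)     # e.g., a monitor alarm behind the surgeon
if classify(gaze, behind_user) == "field of regard only":
    print("Alert: event outside the current field of view")
```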
  • the AR system can also distinguish between what a user 210 is or is not directing attention to.
  • the objects in the FOV or the FOR may be virtual or physical objects.
  • the virtual objects may include, for example, operating system objects such as e.g., a terminal for inputting commands, a file manager for accessing files or directories, an icon, a menu, an application for audio or video streaming, a notification from an operating system, and so on.
  • the virtual objects may also include objects in an application such as e.g., avatars, virtual objects in games, or graphics or images, etc.
  • the virtual objects may also include a patient's data (such as physiological data or medical history), as well as the environmental data such as the temperature of an operating room, etc.
  • Some virtual objects can be both an operating system object and an object in an application.
  • the wearable system can add virtual elements to the existing physical objects viewed through the transparent optics of the head mounted display, thereby permitting user interaction with the physical objects.
  • the wearable system may add a virtual menu associated with a medical monitor in the room, where the virtual menu may give the user the option to turn on or adjust medical imaging equipment or dosing controls using the wearable device.
  • the wearable system may present additional virtual image content to the wearer in addition to the object in the environment of the user.
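  • As a hypothetical sketch of attaching virtual content to recognized physical objects (the registry, labels, and actions below are invented for illustration), a virtual menu keyed by the recognizer's label for a medical monitor exposes actions the wearer can invoke:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict


@dataclass
class VirtualMenu:
    """A virtual menu anchored to a recognized physical object."""
    title: str
    actions: Dict[str, Callable[[], str]] = field(default_factory=dict)

    def invoke(self, label: str) -> str:
        return self.actions[label]()


# Hypothetical registry keyed by the object recognizer's label for the physical object.
menus: Dict[str, VirtualMenu] = {
    "medical_monitor": VirtualMenu(
        title="Monitor controls",
        actions={
            "power on imaging": lambda: "imaging equipment powered on",
            "adjust dosing": lambda: "dosing control panel opened",
        },
    )
}

recognized_object = "medical_monitor"    # output of object recognition
menu = menus[recognized_object]          # menu rendered next to the monitor
print(menu.invoke("power on imaging"))
```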
  • FIG. 1B also shows the field of regard (FOR) 165 , which comprises a portion of the environment around a person 210 that is capable of being perceived by the person 210 , for example, by turning their head or redirecting their gaze.
  • the center portion of the field of view 155 of a person's 210 eyes may be referred to as the central field of view 145 .
  • the region within the field of view 155 but outside the central field of view 145 may be referred to as the peripheral field of view.
  • the field of regard 165 can contain a group of objects (e.g., objects 121 , 122 , 127 ) which can be perceived by the user wearing the wearable system.
  • objects 129 may be outside the user's visual FOR but may nonetheless potentially be perceived by a sensor (e.g., a camera) on a wearable device (depending on their location and field of view) and information associated with the object 129 displayed for the user 210 or otherwise used by the wearable device.
  • the objects 129 may be behind a wall in a user's environment so that the objects 129 are not visually perceivable by the user.
  • the wearable device may include sensors (such as radio frequency, Bluetooth, wireless, or other types of sensors) that can communicate with the objects 129 .
  • VR, AR, and MR experiences can be provided by display systems having displays in which images corresponding to a plurality of depth planes are provided to a viewer.
  • the images may be different for each depth plane (e.g., provide slightly different presentations of a scene or object) and may be separately focused by the viewer's eyes, thereby helping to provide the user with depth cues based on the accommodation of the eye required to bring into focus different image features for the scene located on different depth planes and/or based on observing different image features on different depth planes being out of focus.
  • depth cues provide credible perceptions of depth.
  • FIG. 2A illustrates an example of wearable system 200 which can be configured to provide an AR/VR/MR scene.
  • the wearable system 200 may be part of a wearable device that can present a VR, AR, or MR environment, alone or in combination, for user interaction.
  • the wearable system 200 can include a display 220 , and various mechanical and electronic modules and systems to support the functioning of display 220 .
  • the display 220 may be coupled to a frame 230 , which is wearable by a user, wearer, or viewer 210 .
  • the display 220 can be positioned in front of the eyes of the user 210 .
  • the display 220 can present AR/VR/MR content to a user.
  • the display 220 can comprise a head mounted display (HMD) that is worn on the head of the user.
  • a speaker 240 is coupled to the frame 230 and positioned adjacent the ear canal of the user (in some embodiments, another speaker, not shown, is positioned adjacent the other ear canal of the user to provide for stereo/shapeable sound control).
  • the wearable system 200 can be configured to allow a user to interact with virtual and physical objects.
  • a doctor can wear an ARD which can present virtual content such as a virtual representation of a patient's medical record or physiological data (e.g., an electrocardiogram) to the doctor while the doctor is examining or performing a procedure or operation on the patient.
  • the virtual content may be presented based on the user's interaction with physical objects in the doctor's environment. For example, while a doctor is performing a surgery on a patient, the wearable system can display virtual information related to surgical equipment used by the doctor, for example, to track the location or status of surgical instruments used by the doctor (or surgical team).
  • the wearable system 200 can also include an outward-facing imaging system 464 (shown in FIG. 4 ) which observes the world in the environment around the user.
  • the wearable system 200 can also include an inward-facing imaging system 462 (shown in FIG. 4 ) which can track the eye movements of the user.
  • the inward-facing imaging system may track either one eye's movements or both eyes' movements.
  • the inward-facing imaging system may be attached to the frame 230 and may be in electrical communication with the processing modules 260 and/or 270 , which may process image information acquired by the inward-facing imaging system to determine, e.g., the pupil diameters and/or orientations of the eyes or eye pose of the user 210 .
  • the wearable system 200 can use the outward-facing imaging system 464 and/or the inward-facing imaging system 462 to acquire images of a pose of the user.
  • the images may be still images, animation, frames of a video, or a video, in combination or the like.
  • the pose of the user may include head pose, eye pose, hand gestures, foot pose, or other body poses.
  • One or more poses may be used to activate or to turn off voice recordings of a patient's visit.
  • the doctor may use a certain hand gesture to indicate whether to start dictating the diagnosis of the patient.
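  • A minimal sketch of gesture-controlled dictation, assuming a gesture recognizer already emits discrete gesture events (the gesture names and controller are hypothetical):

```python
from enum import Enum, auto


class Gesture(Enum):
    START_DICTATION = auto()   # e.g., a recognized "thumbs up"
    STOP_DICTATION = auto()    # e.g., a recognized "closed fist"
    OTHER = auto()


class DictationController:
    """Turns voice recording of a patient visit on or off from recognized gestures."""

    def __init__(self) -> None:
        self.recording = False

    def on_gesture(self, gesture: Gesture) -> None:
        if gesture is Gesture.START_DICTATION and not self.recording:
            self.recording = True
            print("dictation started")
        elif gesture is Gesture.STOP_DICTATION and self.recording:
            self.recording = False
            print("dictation stopped")


controller = DictationController()
for g in [Gesture.OTHER, Gesture.START_DICTATION, Gesture.STOP_DICTATION]:
    controller.on_gesture(g)
```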
  • the wearable system 200 can also include an audio sensor (e.g., a microphone 232 ).
  • the microphone 232 may be an environmental sensor as further described with reference to FIG. 2B .
  • the microphone may be (fixedly or removably) attached to the frame 230, the display 220 (or other components of the wearable system 200), removably attached to the user 210, or fixedly or removably attached to a physical object (such as medical equipment) or to another person (such as, e.g., a patient of the user).
  • the microphone 232 may be used to receive audio data of a user of the wearable system 200 or sounds in the user's environment (such as when a patient of the user is talking).
  • the audio data received by the microphone 232 may be used to activate or turn off the dictation features described herein.
  • the wearable system 200 can detect a keyword which can trigger the wearable system 200 to record the audio received by the microphone 232 .
  • one or more other audio sensors, not shown, are positioned to provide stereo sound reception. Stereo sound reception can be used to determine the location of a sound source.
  • the wearable system 200 can perform voice or speech recognition on the audio stream.
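  • The keyword-triggered recording described above can be sketched as a simple gate over a transcribed word stream; the wake and stop keywords below are illustrative, and the speech-to-text step is assumed to happen elsewhere:

```python
from typing import Iterable, List

WAKE_WORD = "record"     # illustrative keywords, not specified by the patent
STOP_WORD = "stop"


def keyword_gated_transcript(words: Iterable[str]) -> List[str]:
    """Collect dictated words between a wake keyword and a stop keyword.

    The word stream is assumed to come from a speech-to-text engine running on
    the microphone's audio; only the keyword-gating logic is shown here.
    """
    captured: List[str] = []
    recording = False
    for word in words:
        token = word.lower()
        if not recording and token == WAKE_WORD:
            recording = True
        elif recording and token == STOP_WORD:
            recording = False
        elif recording:
            captured.append(word)
    return captured


stream = "please record patient reports mild chest pain stop thank you".split()
print(keyword_gated_transcript(stream))
# ['patient', 'reports', 'mild', 'chest', 'pain']
```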
  • the display 220 can be operatively coupled 250 , such as by a wired lead or wireless connectivity, to a local data processing module 260 which may be mounted in a variety of configurations, such as fixedly attached to the frame 230 , fixedly attached to a helmet or hat worn by the user, embedded in headphones, or otherwise removably attached to the user 210 (e.g., in a backpack-style configuration, in a belt-coupling style configuration).
  • the local processing and data module 260 may comprise a hardware processor, as well as digital memory, such as non-volatile memory (e.g., flash memory), both of which may be utilized to assist in the processing, caching, and storage of data.
  • the data may include data a) captured from environmental sensors (which may be, e.g., operatively coupled to the frame 230 or otherwise attached to the user 210 ); and/or b) acquired and/or processed using remote processing module 270 and/or remote data repository 280 , possibly for passage to the display 220 after such processing or retrieval.
  • the local processing and data module 260 may be operatively coupled by communication links 262 and/or 264 , such as via wired or wireless communication links, to the remote processing module 270 and/or remote data repository 280 such that these remote modules are available as resources to the local processing and data module 260 .
  • the remote processing module 270 and the remote data repository 280 may be operatively coupled to each other.
  • the remote processing module 270 may comprise one or more processors configured to analyze and process data and/or image information.
  • the remote data repository 280 may comprise a digital data storage facility, which may be available through the Internet or other networking configuration in a “cloud” resource configuration. In some embodiments, all data is stored and all computations are performed in the local processing and data module 260 , allowing fully autonomous use from a remote module.
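  • A hypothetical sketch of the local/remote split described above: small workloads are handled in the local processing and data module, heavier ones are offloaded over the communication links when available, and everything runs locally when the device is disconnected (the size threshold and function names are invented for illustration):

```python
from dataclasses import dataclass


@dataclass
class Frame:
    """A unit of captured sensor data (e.g., one camera image)."""
    size_bytes: int


def process_locally(frame: Frame) -> str:
    return f"local result ({frame.size_bytes} bytes)"


def process_remotely(frame: Frame) -> str:
    # Stand-in for sending the frame over links 262/264 to the remote processing module.
    return f"remote result ({frame.size_bytes} bytes)"


def handle_frame(frame: Frame, connected: bool, local_budget_bytes: int = 500_000) -> str:
    """Process small frames locally; offload heavy work when a link is available.

    When disconnected, everything runs in the local processing and data module,
    mirroring the 'fully autonomous use' case described above.
    """
    if not connected or frame.size_bytes <= local_budget_bytes:
        return process_locally(frame)
    return process_remotely(frame)


print(handle_frame(Frame(size_bytes=200_000), connected=True))     # local
print(handle_frame(Frame(size_bytes=2_000_000), connected=True))   # remote
print(handle_frame(Frame(size_bytes=2_000_000), connected=False))  # local (offline)
```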
  • the remote data repository 280 can be configured to store various data.
  • the remote data repository 280 can store a map of an environment (such as, e.g., a map of a clinic or an operation room). As further described with reference to FIGS. 9 and 12 , the map may be generated based on data collected by multiple wearable systems over time. The map of an environment may be passed from one wearable system to another. For example, the map of the operating room may be shared between the surgeon and nurses in a surgery.
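  • A minimal sketch of a shared environment map, assuming each wearable contributes coarse object observations that are merged into one map and passed between devices (the grid representation and merge policy are illustrative choices, not the patent's):

```python
from typing import Dict, Tuple

Cell = Tuple[int, int, int]          # coarse 3D grid cell of the operating room


def to_cell(x: float, y: float, z: float, cell_size: float = 0.5) -> Cell:
    return (int(x // cell_size), int(y // cell_size), int(z // cell_size))


class SharedWorldMap:
    """Map of an environment built up from observations of many wearable devices."""

    def __init__(self) -> None:
        self.objects: Dict[Cell, str] = {}

    def merge_observation(self, device_id: str,
                          position: Tuple[float, float, float], label: str) -> None:
        cell = to_cell(*position)
        self.objects[cell] = label          # last-writer-wins for simplicity

    def snapshot(self) -> Dict[Cell, str]:
        """What gets passed from one wearable system to another."""
        return dict(self.objects)


room_map = SharedWorldMap()
room_map.merge_observation("surgeon-headset", (1.2, 0.0, 3.4), "instrument tray")
room_map.merge_observation("nurse-headset", (2.5, 0.0, 1.1), "anesthesia cart")
print(room_map.snapshot())
```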
  • the remote data repository 280 can also store medical records. These medical records may be owned by a patient, rather than being owned by the particular HCP that performs an examination or operation on the patient.
  • the patient advantageously can control the access to the patient's sensitive patient information contained in the patient's medical record.
  • the patient's doctor may wear a wearable device while examining the patient.
  • the wearable device can present the medical records on a 3D user interface.
  • the wearable device can also be configured to allow the patient's doctor to add to the existing medical records.
  • the wearable device can allow the doctor to take a picture of the patient, to put virtual flags around a tumor, to input diagnosis using voice control, and so on.
  • the added information may also be stored to the remote data repository 280 .
  • a portion of the medical records or the map may be stored in the local processing and data module 260 .
  • the wearable system 200 can include the environmental sensors to detect objects, stimuli, people, animals, locations, or other aspects of the world around the user.
  • the environmental sensors may include image capture devices (e.g., cameras, inward-facing imaging system, outward-facing imaging system, etc.), microphones, inertial measurement units (IMUs), accelerometers, compasses, global positioning system (GPS) units, radio devices, gyroscopes, altimeters, barometers, chemical sensors, humidity sensors, temperature sensors, external microphones, light sensors (e.g., light meters), timing devices (e.g., clocks or calendars), or any combination or subcombination thereof.
  • the environmental sensors may also include a variety of physiological sensors.
  • Environmental sensors can measure or estimate the user's physiological parameters such as heart rate, respiratory rate, galvanic skin response, blood pressure, encephalographic state, and so on.
  • Environmental sensors may further include emissions devices configured to receive signals such as laser, visible light, invisible wavelengths of light, or sound (e.g., audible sound, ultrasound, or other frequencies).
  • Physical contact sensors such as strain gauges, curb feelers, or the like, may also be included as environmental sensors.
  • FIG. 2B shows a schematic view of an example of various components of a wearable system comprising environmental sensors.
  • the display system 202 may be part of the wearable system 200 illustrated in FIG. 2A.
  • the display system 202 may be a mixed reality display system in some implementations.
  • the system 202 can include various environmental sensors, e.g., sensors 24 , 28 , 30 , 32 , and 34 .
  • An environmental sensor may be configured to detect data regarding the user of the wearable system (also referred to as a user sensor) or be configured to collect data regarding the user's environment (also referred to as an external sensor).
  • a physiological sensor may be an embodiment of a user sensor while a barometer may be an external sensor.
  • a sensor may be both a user sensor and an external sensor.
  • an outward-facing imaging system may acquire an image of the user's environment as well as an image of the user when the user is in front of a reflective surface (such as, e.g., a mirror).
  • a microphone may serve as both the user sensor and the external sensor because the microphone can acquire sound from the user and from the environment.
  • the sensors 24 , 28 , 30 , and 32 may be user sensors while the sensor 34 may be an external sensor.
  • the display system 202 may include various user sensors.
  • the display system 202 may include a viewer imaging system 22 .
  • the viewer imaging system 22 may be an embodiment of the inward-facing imaging system 462 and/or the outward facing imaging system 464 described in FIG. 4 .
  • the viewer imaging system 22 may include cameras 24 (e.g., infrared, UV, and/or visible light cameras) paired with light sources 26 (e.g., infrared light sources) directed at and configured to monitor the user (e.g., the eyes 201 a , 201 b and/or surrounding tissues of the user).
  • the cameras 24 and light sources 26 may be operatively coupled to the local processing module 260 .
  • Such cameras 24 may be configured to monitor one or more of the orientation, shape, and symmetry of pupils (including pupil sizes) or irises of the respective eyes, and/or tissues surrounding the eye, such as eyelids or eyebrows to conduct the various analyses disclosed herein.
  • imaging of the iris and/or retina of an eye may be used for secure identification of a user.
  • cameras 24 may further be configured to image the retinas of the respective eyes, such as for diagnostic purposes and/or for orientation tracking based on the location of retinal features, such as the fovea or features of the fundus.
  • Iris or retina imaging or scanning may be performed for secure identification of users for, e.g., correctly associating user data with a particular user and/or to present private information to the appropriate user.
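  • A hypothetical sketch of biometric gating of private information: an exact hash is used here as a stand-in for real iris or retina template matching (which is a fuzzy comparison), so only the access-control logic is illustrated:

```python
import hashlib
from typing import Dict, Optional


def template_digest(iris_template: bytes) -> str:
    """Stand-in for iris/retina matching; real matching compares features fuzzily,
    not by exact hash, so this only illustrates the gating logic."""
    return hashlib.sha256(iris_template).hexdigest()


class BiometricSession:
    def __init__(self) -> None:
        self.enrolled: Dict[str, str] = {}     # digest -> user id
        self.current_user: Optional[str] = None

    def enroll(self, user_id: str, iris_template: bytes) -> None:
        self.enrolled[template_digest(iris_template)] = user_id

    def identify(self, iris_template: bytes) -> Optional[str]:
        self.current_user = self.enrolled.get(template_digest(iris_template))
        return self.current_user

    def show_private_record(self, owner_id: str, record: str) -> str:
        if self.current_user != owner_id:
            return "access denied: private information hidden"
        return record


session = BiometricSession()
session.enroll("dr-lee", b"iris-scan-of-dr-lee")
session.identify(b"iris-scan-of-dr-lee")
print(session.show_private_record("dr-lee", "Dr. Lee's patient worklist"))
```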
  • one or more cameras 28 may be configured to detect and/or monitor various other aspects of the status of a user.
  • one or more cameras 28 may be inward-facing and configured to monitor the shape, position, movement, color, and/or other properties of features other than the eyes of the user, e.g., one or more facial features (e.g., facial expression, voluntary movement, involuntary tics).
  • one or more cameras 28 may be downward-facing or outward-facing and configured to monitor the position, movement, and/or other features or properties of the arms, hands, legs, feet, and/or torso of a user, of another person in the user's FOV, objects in the FOV, etc.
  • the cameras 28 may be used to image the environment, and such images can be analyzed by the wearable device to determine whether a triggering event is occurring such that the wearable device may present (or mute) the visual or audible content being presented to the user.
  • the display system 202 may include a spatial light modulator that variably projects, through a fiber scanner (e.g., the image injection devices 420, 422, 424, 426, 428 in FIG. 4), light beams across the retina of the user to form an image.
  • the fiber scanner may be used in conjunction with, or in place of, the cameras 24 or 28 to, e.g., track or image the user's eyes.
  • the health system may have a separate light-receiving device to receive light reflected from the user's eyes, and to collect data associated with that reflected light.
  • the cameras 24 , 28 and light sources 26 may be mounted on the frame 230 (shown in FIG. 2A ), which may also hold the waveguide stacks 205 , 206 .
  • in some embodiments, sensors and/or other electronic devices (e.g., the cameras 24, 28 and light sources 26) of the display system 202 may be configured to communicate with the local processing and data module 260 through communication links 262, 264.
  • one or both of the cameras 24 and 28 may be utilized to track the eyes to provide user input.
  • the viewer imaging system 22 may be utilized to select items on virtual menus, and/or provide other input to the display system 202 , such as for providing user responses in the various tests and analyses disclosed herein.
  • the display system 202 may include motion sensors 32 , such as one or more accelerometers, gyros, gesture sensors, gait sensors, balance sensors, and/or IMU sensors.
  • the sensors 30 may include one or more inwardly directed (user directed) microphones configured to detect sounds, and various properties of those sounds, including the intensity and type of sounds detected, the presence of multiple signals, and/or signal location.
  • the sensors 30 are schematically illustrated as being connected to the frame 230 . It will be appreciated that this connection may take the form of a physical attachment to the frame 230 and may be anywhere on the frame 230 , including the ends of the temples of the frame 230 which extend over the user's ears. For example, the sensors 30 may be mounted at the ends of the temples of the frame 230 , at a point of contact between the frame 230 and the user. In some other embodiments, the sensors 30 may extend away from the frame 230 to contact the user 210 (shown in FIG. 2A ). In yet other embodiments, the sensors 30 may not be physically attached to the frame 230 ; rather, the sensors 30 may be spaced apart from the frame 230 .
  • the display system 202 may further include one or more environmental sensors 34 configured to detect objects, stimuli, people, animals, locations, or other aspects of the world around the user.
  • environmental sensors 34 may include one or more cameras, altimeters, barometers, chemical sensors, humidity sensors, temperature sensors, external microphones, light sensors (e.g., light meters), timing devices (e.g., clocks or calendars), or any combination or subcombination thereof.
  • multiple (e.g., two) microphones may be spaced-apart, to facilitate sound source location determinations.
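  • Spaced microphones allow sound-source localization from the time difference of arrival (TDOA); for a far-field source and two microphones a distance d apart, sin(θ) = c·Δt/d. A small illustrative calculation (values chosen for the example, not taken from the patent):

```python
import math

SPEED_OF_SOUND = 343.0       # m/s in air at roughly 20 °C


def bearing_from_tdoa(delta_t: float, mic_spacing: float) -> float:
    """Estimate the bearing (degrees from broadside) of a far-field sound source
    from the time difference of arrival between two spaced microphones.

    Uses sin(theta) = c * delta_t / d; the ratio is clamped to the valid range.
    """
    ratio = max(-1.0, min(1.0, SPEED_OF_SOUND * delta_t / mic_spacing))
    return math.degrees(math.asin(ratio))


# Example: microphones 0.14 m apart, sound arrives 0.2 ms earlier at one mic.
print(round(bearing_from_tdoa(delta_t=2.0e-4, mic_spacing=0.14), 1))  # ~29.3 degrees
```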
  • cameras may be located, for example, facing outward so as to capture images similar to at least a portion of an ordinary field of view of a user.
  • Environmental sensors may further include emissions devices configured to receive signals such as laser, visible light, invisible wavelengths of light, sound (e.g., audible sound, ultrasound, or other frequencies).
  • Physical contact sensors such as strain gauges, curb feelers, or the like, may also be included as environmental sensors.
  • the display system 202 may further be configured to receive other environmental inputs, such as GPS location data, weather data, date and time, or other available environmental data which may be received from the internet, satellite communication, or other suitable wired or wireless data communication method.
  • the processing module 260 may be configured to access further information characterizing a location of the user, such as pollen count, demographics, air pollution, environmental toxins, information from smart thermostats, lifestyle statistics, or proximity to other users, buildings, or a healthcare provider. In some embodiments, information characterizing the location may be accessed using cloud-based or other remote databases.
  • the processing module 260 may be configured to obtain such data and/or to further analyze data from any one or combinations of the environmental sensors.
  • the display system 202 may be configured to collect and store data obtained through any of the sensors and/or inputs described above for extended periods of time.
  • Data received at the device may be processed and/or stored at the local processing module 260 and/or remotely (e.g., as shown in FIG. 2A , at the remote processing module 270 or remote data repository 280 ).
  • additional data such as date and time, GPS location, or other global data may be received directly at the local processing module 260 .
  • Data regarding content being delivered to the user by the system such as images, other visual content, or auditory content, may be received at the local processing module 260 as well.
  • the human visual system is complicated and providing a realistic perception of depth is challenging. Without being limited by theory, it is believed that viewers of an object may perceive the object as being three-dimensional due to a combination of vergence and accommodation. Vergence movements (i.e., rolling movements of the pupils toward or away from each other to converge the lines of sight of the eyes to fixate upon an object) of the two eyes relative to each other are closely associated with focusing (or “accommodation”) of the lenses of the eyes.
  • FIG. 3 illustrates aspects of an approach for simulating three-dimensional imagery using multiple depth planes.
  • objects at various distances from eyes 302 and 304 on the z-axis are accommodated by the eyes 302 and 304 so that those objects are in focus.
  • the eyes 302 and 304 assume particular accommodated states to bring into focus objects at different distances along the z-axis. Consequently, a particular accommodated state may be said to be associated with a particular one of depth planes 306 , which has an associated focal distance, such that objects or parts of objects in a particular depth plane are in focus when the eye is in the accommodated state for that depth plane.
  • three-dimensional imagery may be simulated by providing different presentations of an image for each of the eyes 302 and 304 , and also by providing different presentations of the image corresponding to each of the depth planes. While shown as being separate for clarity of illustration, it will be appreciated that the fields of view of the eyes 302 and 304 may overlap, for example, as distance along the z-axis increases. In addition, while shown as flat for ease of illustration, it will be appreciated that the contours of a depth plane may be curved in physical space, such that all features in a depth plane are in focus with the eye in a particular accommodated state. Without being limited by theory, it is believed that the human eye typically can interpret a finite number of depth planes to provide depth perception. Consequently, a highly believable simulation of perceived depth may be achieved by providing, to the eye, different presentations of an image corresponding to each of these limited number of depth planes.
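  • The following short sketch illustrates, with assumed values (an interpupillary distance of 63 mm and a handful of example depth-plane distances, none taken from any embodiment), the standard relationship between a depth plane's distance, the wavefront divergence in diopters that a display would present for that plane, and the corresponding vergence angle.

```python
import math

IPD_M = 0.063  # assumed interpupillary distance (illustrative)

def wavefront_divergence_diopters(distance_m: float) -> float:
    """Light from a point at distance d meters reaches the eye with ~1/d diopters of divergence."""
    return 1.0 / distance_m

def vergence_angle_deg(distance_m: float, ipd_m: float = IPD_M) -> float:
    """Angle between the two lines of sight when both eyes fixate a point at distance_m."""
    return math.degrees(2.0 * math.atan((ipd_m / 2.0) / distance_m))

# Illustrative depth planes: optical infinity, 3 m, 1 m, 0.5 m
for d in (float('inf'), 3.0, 1.0, 0.5):
    print(f"depth plane at {d} m: {wavefront_divergence_diopters(d):.2f} diopters, "
          f"vergence ~{vergence_angle_deg(d):.2f} degrees")
```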
  • FIG. 4 illustrates an example of a waveguide stack for outputting image information to a user.
  • a wearable system 400 includes a stack of waveguides, or stacked waveguide assembly 480 , that may be utilized to provide three-dimensional perception to the eye/brain using a plurality of waveguides 432 b , 434 b , 436 b , 438 b , 440 b .
  • the wearable system 400 may correspond to wearable system 200 of FIG. 2 , with FIG. 4 schematically showing some parts of that wearable system 200 in greater detail.
  • the waveguide assembly 480 may be integrated into the display 220 of FIG. 2 .
  • the waveguide assembly 480 may also include a plurality of features 458 , 456 , 454 , 452 between the waveguides.
  • the features 458 , 456 , 454 , 452 may be lenses.
  • the features 458 , 456 , 454 , 452 may not be lenses. Rather, they may simply be spacers (e.g., cladding layers and/or structures for forming air gaps).
  • the waveguides 432 b , 434 b , 436 b , 438 b , 440 b and/or the plurality of lenses 458 , 456 , 454 , 452 may be configured to send image information to the eye with various levels of wavefront curvature or light ray divergence.
  • Each waveguide level may be associated with a particular depth plane and may be configured to output image information corresponding to that depth plane.
  • Image injection devices 420 , 422 , 424 , 426 , 428 may be utilized to inject image information into the waveguides 440 b , 438 b , 436 b , 434 b , 432 b , each of which may be configured to distribute incoming light across each respective waveguide, for output toward the eye 410 .
  • a single beam of light (e.g., a collimated beam) may be injected into each waveguide to output an entire field of cloned collimated beams that are directed toward the eye 410 at particular angles (and amounts of divergence) corresponding to the depth plane associated with a particular waveguide.
  • the image injection devices 420 , 422 , 424 , 426 , 428 are discrete displays that each produce image information for injection into a corresponding waveguide 440 b , 438 b , 436 b , 434 b , 432 b , respectively.
  • the image injection devices 420 , 422 , 424 , 426 , 428 are the output ends of a single multiplexed display which may, e.g., pipe image information via one or more optical conduits (such as fiber optic cables) to each of the image injection devices 420 , 422 , 424 , 426 , 428 .
  • a controller 460 controls the operation of the stacked waveguide assembly 480 and the image injection devices 420 , 422 , 424 , 426 , 428 .
  • the controller 460 includes programming (e.g., instructions in a non-transitory computer-readable medium) that regulates the timing and provision of image information to the waveguides 440 b , 438 b , 436 b , 434 b , 432 b .
  • the controller 460 may be a single integral device, or a distributed system connected by wired or wireless communication channels.
  • the controller 460 may be part of the processing modules 260 and/or 270 (illustrated in FIG. 2 ) in some embodiments.
  • the waveguides 440 b , 438 b , 436 b , 434 b , 432 b may be configured to propagate light within each respective waveguide by total internal reflection (TIR).
  • the waveguides 440 b , 438 b , 436 b , 434 b , 432 b may each be planar or have another shape (e.g., curved), with major top and bottom surfaces and edges extending between those major top and bottom surfaces.
  • the waveguides 440 b , 438 b , 436 b , 434 b , 432 b may each include light extracting optical elements 440 a , 438 a , 436 a , 434 a , 432 a that are configured to extract light out of a waveguide by redirecting the light, propagating within each respective waveguide, out of the waveguide to output image information to the eye 410 .
  • Extracted light may also be referred to as outcoupled light, and light extracting optical elements may also be referred to as outcoupling optical elements.
  • An extracted beam of light is outputted by the waveguide at locations at which the light propagating in the waveguide strikes a light redirecting element.
  • the light extracting optical elements may, for example, be reflective and/or diffractive optical features. While illustrated disposed at the bottom major surfaces of the waveguides 440 b , 438 b , 436 b , 434 b , 432 b for ease of description and drawing clarity, in some embodiments, the light extracting optical elements 440 a , 438 a , 436 a , 434 a , 432 a may be disposed at the top and/or bottom major surfaces, and/or may be disposed directly in the volume of the waveguides 440 b , 438 b , 436 b , 434 b , 432 b .
  • the light extracting optical elements 440 a , 438 a , 436 a , 434 a , 432 a may be formed in a layer of material that is attached to a transparent substrate to form the waveguides 440 b , 438 b , 436 b , 434 b , 432 b .
  • the waveguides 440 b , 438 b , 436 b , 434 b , 432 b may be a monolithic piece of material and the light extracting optical elements 440 a , 438 a , 436 a , 434 a , 432 a may be formed on a surface and/or in the interior of that piece of material.
  • each waveguide 440 b , 438 b , 436 b , 434 b , 432 b is configured to output light to form an image corresponding to a particular depth plane.
  • the waveguide 432 b nearest the eye may be configured to deliver collimated light, as injected into such waveguide 432 b , to the eye 410 .
  • the collimated light may be representative of the optical infinity focal plane.
  • the next waveguide up 434 b may be configured to send out collimated light which passes through the first lens 452 (e.g., a negative lens) before it can reach the eye 410 .
  • First lens 452 may be configured to create a slight convex wavefront curvature so that the eye/brain interprets light coming from that next waveguide up 434 b as coming from a first focal plane closer inward toward the eye 410 from optical infinity.
  • the third up waveguide 436 b passes its output light through both the first lens 452 and second lens 454 before reaching the eye 410 .
  • the combined optical power of the first and second lenses 452 and 454 may be configured to create another incremental amount of wavefront curvature so that the eye/brain interprets light coming from the third waveguide 436 b as coming from a second focal plane that is even closer inward toward the person from optical infinity than was light from the next waveguide up 434 b.
  • the other waveguide layers (e.g., waveguides 438 b , 440 b ) and lenses (e.g., lenses 456 , 458 ) are similarly configured, with the highest waveguide 440 b in the stack sending its output through all of the lenses between it and the eye for an aggregate focal power representative of the closest focal plane to the person.
  • a compensating lens layer 430 may be disposed at the top of the stack to compensate for the aggregate power of the lens stack 458 , 456 , 454 , 452 below.
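  • The following minimal sketch illustrates how the optical powers of the lenses between a waveguide and the eye could accumulate, and how a compensating lens of equal and opposite aggregate power can restore the view of the world; the individual lens powers used here are hypothetical placeholders, not values from the embodiments described.

```python
# Illustrative (hypothetical) lens powers in diopters, ordered from the eye outward;
# lens 452 is closest to the eye 410.
lens_powers = {452: -0.5, 454: -0.5, 456: -1.0, 458: -1.0}

# Waveguides listed from nearest the eye (432b) to highest in the stack (440b),
# paired with the lenses their output passes through before reaching the eye.
waveguide_lens_path = {
    432: [],                    # collimated light, optical infinity
    434: [452],
    436: [452, 454],
    438: [452, 454, 456],
    440: [452, 454, 456, 458],  # closest apparent focal plane
}

for wg, path in waveguide_lens_path.items():
    aggregate = sum(lens_powers[l] for l in path)
    # A larger magnitude of (negative) aggregate power makes light appear to come
    # from a nearer focal plane.
    apparent_distance = float('inf') if aggregate == 0 else round(1.0 / abs(aggregate), 2)
    print(f"waveguide {wg}b: aggregate power {aggregate:+.1f} D, "
          f"apparent focal plane ~{apparent_distance} m")

# The compensating lens layer 430 cancels the aggregate power of the whole stack
# so that real-world light reaches the eye essentially unmodified.
compensating_power = -sum(lens_powers.values())
print(f"compensating lens 430: {compensating_power:+.1f} D")
```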
  • Both the light extracting optical elements of the waveguides and the focusing aspects of the lenses may be static (e.g., not dynamic or electro-active). In some alternative embodiments, either or both may be dynamic using electro-active features.
  • the light extracting optical elements 440 a , 438 a , 436 a , 434 a , 432 a may be configured to both redirect light out of their respective waveguides and to output this light with the appropriate amount of divergence or collimation for a particular depth plane associated with the waveguide.
  • waveguides having different associated depth planes may have different configurations of light extracting optical elements, which output light with a different amount of divergence depending on the associated depth plane.
  • the light extracting optical elements 440 a , 438 a , 436 a , 434 a , 432 a may be volumetric or surface features, which may be configured to output light at specific angles.
  • the light extracting optical elements 440 a , 438 a , 436 a , 434 a , 432 a may be volume holograms, surface holograms, and/or diffraction gratings.
  • Light extracting optical elements, such as diffraction gratings are described in U.S. Patent Publication No. 2015/0178939, published Jun. 25, 2015, which is incorporated by reference herein in its entirety.
  • the light extracting optical elements 440 a , 438 a , 436 a , 434 a , 432 a are diffractive features that form a diffraction pattern, or “diffractive optical element” (also referred to herein as a “DOE”).
  • the DOE's have a relatively low diffraction efficiency so that only a portion of the light of the beam is deflected away toward the eye 410 with each intersection of the DOE, while the rest continues to move through a waveguide via total internal reflection.
  • the light carrying the image information is thus divided into a number of related exit beams that exit the waveguide at a multiplicity of locations and the result is a fairly uniform pattern of exit emission toward the eye 304 for this particular collimated beam bouncing around within a waveguide.
  • one or more DOEs may be switchable between “on” states in which they actively diffract, and “off” states in which they do not significantly diffract.
  • a switchable DOE may comprise a layer of polymer dispersed liquid crystal, in which microdroplets comprise a diffraction pattern in a host medium, and the refractive index of the microdroplets can be switched to substantially match the refractive index of the host material (in which case the pattern does not appreciably diffract incident light) or the microdroplet can be switched to an index that does not match that of the host medium (in which case the pattern actively diffracts incident light).
  • the number and distribution of depth planes and/or depth of field may be varied dynamically based on the pupil sizes and/or orientations of the eyes of the viewer.
  • Depth of field may change inversely with a viewer's pupil size.
  • the depth of field increases such that a plane that is not discernible, because the location of that plane is beyond the depth of focus of the eye, may become discernible and appear more in focus with the reduction of pupil size and commensurate increase in depth of field.
  • the number of spaced apart depth planes used to present different images to the viewer may be decreased with decreased pupil size.
  • a viewer may not be able to clearly perceive the details of both a first depth plane and a second depth plane at one pupil size without adjusting the accommodation of the eye away from one depth plane and to the other depth plane.
  • These two depth planes may, however, be sufficiently in focus at the same time to the user at another pupil size without changing accommodation.
  • the display system may vary the number of waveguides receiving image information based upon determinations of pupil size and/or orientation, or upon receiving electrical signals indicative of particular pupil sizes and/or orientations. For example, if the user's eyes are unable to distinguish between two depth planes associated with two waveguides, then the controller 460 may be configured or programmed to cease providing image information to one of these waveguides. Advantageously, this may reduce the processing burden on the system, thereby increasing the responsiveness of the system. In embodiments in which the DOEs for a waveguide are switchable between on and off states, the DOEs may be switched to the off state when the waveguide does not receive image information.
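  • A minimal sketch of the kind of gating logic described above, assuming purely illustrative pupil-size thresholds and depth-plane spacings (neither of which is specified in this disclosure):

```python
def active_depth_planes(pupil_diameter_mm: float,
                        all_planes_diopters=(0.0, 0.5, 1.0, 2.0, 3.0)):
    """Return the subset of depth planes to drive, given the measured pupil size.

    Purely illustrative thresholds: with a small pupil the eye's depth of field
    is larger, so a sparser set of planes can be indistinguishable from the full
    set, and the remaining waveguides can be left without image information.
    """
    if pupil_diameter_mm < 2.5:          # small pupil, large depth of field
        return all_planes_diopters[::2]  # e.g., keep every other plane
    return all_planes_diopters           # large pupil, drive all planes

planes = active_depth_planes(2.0)
print(planes)  # (0.0, 1.0, 3.0) -- the waveguides for 0.5 D and 2.0 D receive no image data
```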
  • it may be desirable to have an exit beam meet the condition of having a diameter that is less than the diameter of the eye of a viewer.
  • meeting this condition may be challenging in view of the variability in size of the viewer's pupils.
  • this condition is met over a wide range of pupil sizes by varying the size of the exit beam in response to determinations of the size of the viewer's pupil. For example, as the pupil size decreases, the size of the exit beam may also decrease.
  • the exit beam size may be varied using a variable aperture.
  • the wearable system 400 can include an outward-facing imaging system 464 (e.g., a digital camera) that images a portion of the world 470 .
  • This portion of the world 470 may be referred to as the field of view (FOV) and the imaging system 464 is sometimes referred to as an FOV camera.
  • the entire region available for viewing or imaging by a viewer may be referred to as the field of regard (FOR).
  • the FOR may include 4π steradians of solid angle surrounding the wearable system 400 .
  • the FOR may include substantially all of the solid angle around a user of the display system 400 , because the user can move their head and eyes to look at objects surrounding the user (in front, in back, above, below, or on the sides of the user).
  • Images obtained from the outward-facing imaging system 464 can be used to track gestures made by the user (e.g., hand or finger gestures), detect objects in the world 470 in front of the user, and so forth.
  • the wearable system 400 can also include an inward-facing imaging system 466 (e.g., a digital camera), which observes the movements of the user, such as the eye movements and the facial movements.
  • the inward-facing imaging system 466 may be used to capture images of the eye 410 to determine the size and/or orientation of the pupil of the eye 304 .
  • the inward-facing imaging system 466 can be used to obtain images for use in determining the direction the user is looking (e.g., eye pose) or for biometric identification of the user (e.g., iris recognition or retinal scanning, etc.).
  • At least one camera may be utilized for each eye, to separately determine the pupil size and/or eye pose of each eye independently, thereby allowing the presentation of image information to each eye to be dynamically tailored to that eye.
  • the pupil diameter and/or orientation of only a single eye 410 (e.g., using only a single camera per pair of eyes) may be determined and assumed to be similar for both eyes of the viewer.
  • the images obtained by the inward-facing imaging system 466 may be analyzed to determine the user's eye pose and/or mood, which can be used by the wearable system 400 to decide which audio or visual content should be presented to the user.
  • the wearable system 400 may also determine head pose (e.g., head position or head orientation) using sensors such as IMUs, accelerometers, gyroscopes, etc.
  • the wearable system 400 can include a user input device 466 by which the user can input commands to the controller 460 to interact with the wearable system 400 .
  • the user input device 466 can include a trackpad, a touchscreen, a joystick, a multiple degree-of-freedom (DOF) controller, a capacitive sensing device, a game controller, a keyboard, a mouse, a directional pad (D-pad), a wand, a haptic device, a totem (e.g., functioning as a virtual user input device), and so forth.
  • the user may use a finger (e.g., a thumb) to press or swipe on a touch-sensitive input device to provide input to the wearable system 400 (e.g., to provide user input to a user interface provided by the wearable system 400 ).
  • the user input device 466 may be held by the user's hand during the use of the wearable system 400 .
  • the user input device 466 can be in wired or wireless communication with the wearable system 400 .
  • FIG. 5 shows an example of exit beams outputted by a waveguide.
  • One waveguide is illustrated, but it will be appreciated that other waveguides in the waveguide assembly 480 may function similarly, where the waveguide assembly 480 includes multiple waveguides.
  • Light 520 is injected into the waveguide 432 b at the input edge 432 c of the waveguide 432 b and propagates within the waveguide 432 b by TIR. At points where the light 520 impinges on the DOE 432 a , a portion of the light exits the waveguide as exit beams 510 .
  • the exit beams 510 are illustrated as substantially parallel but they may also be redirected to propagate to the eye 410 at an angle (e.g., forming divergent exit beams), depending on the depth plane associated with the waveguide 432 b . It will be appreciated that substantially parallel exit beams may be indicative of a waveguide with light extracting optical elements that outcouple light to form images that appear to be set on a depth plane at a large distance (e.g., optical infinity) from the eye 410 .
  • waveguides or other sets of light extracting optical elements may output an exit beam pattern that is more divergent, which would require the eye 410 to accommodate to a closer distance to bring it into focus on the retina and would be interpreted by the brain as light from a distance closer to the eye 410 than optical infinity.
  • FIG. 6 is a schematic diagram showing an optical system including a waveguide apparatus, an optical coupler subsystem to optically couple light to or from the waveguide apparatus, and a control subsystem, used in the generation of a multi-focal volumetric display, image, or light field.
  • the optical system can include a waveguide apparatus, an optical coupler subsystem to optically couple light to or from the waveguide apparatus, and a control subsystem.
  • the optical system can be used to generate a multi-focal volumetric, image, or light field.
  • the optical system can include one or more primary planar waveguides 632 b (only one is shown in FIG. 6 ) and one or more DOEs 632 a associated with each of at least some of the primary waveguides 632 b .
  • the planar waveguides 632 b can be similar to the waveguides 432 b , 434 b , 436 b , 438 b , 440 b discussed with reference to FIG. 4 .
  • the optical system may employ a distribution waveguide apparatus to relay light along a first axis (vertical or Y-axis in view of FIG. 6 ), and expand the light's effective exit pupil along the first axis (e.g., Y-axis).
  • the distribution waveguide apparatus may, for example include a distribution planar waveguide 622 b and at least one DOE 622 a (illustrated by double dash-dot line) associated with the distribution planar waveguide 622 b .
  • the distribution planar waveguide 622 b may be similar or identical in at least some respects to the primary planar waveguide 632 b , having a different orientation therefrom.
  • at least one DOE 622 a may be similar or identical in at least some respects to the DOE 632 a .
  • the distribution planar waveguide 622 b and/or DOE 622 a may be comprised of the same materials as the primary planar waveguide 632 b and/or DOE 632 a , respectively.
  • Embodiments of the optical display system 600 shown in FIG. 6 can be integrated into the wearable system 200 shown in FIG. 2 .
  • the relayed and exit-pupil expanded light is optically coupled from the distribution waveguide apparatus into the one or more primary planar waveguides 632 b .
  • the primary planar waveguide 632 b relays light along a second axis, preferably orthogonal to first axis, (e.g., horizontal or X-axis in view of FIG. 6 ).
  • the second axis can be a non-orthogonal axis to the first axis.
  • the primary planar waveguide 632 b expands the light's effective exit pupil along that second axis (e.g., X-axis).
  • the distribution planar waveguide 622 b can relay and expand light along the vertical or Y-axis, and pass that light to the primary planar waveguide 632 b which relays and expands light along the horizontal or X-axis.
  • the optical system may include one or more sources of colored light (e.g., red, green, and blue laser light) 610 which may be optically coupled into a proximal end of a single mode optical fiber 640 .
  • a distal end of the optical fiber 640 may be threaded or received through a hollow tube 642 of piezoelectric material. The distal end protrudes from the tube 642 as fixed-free flexible cantilever 644 .
  • the piezoelectric tube 642 can be associated with four quadrant electrodes (not illustrated). The electrodes may, for example, be plated on the outside, outer surface or outer periphery or diameter of the tube 642 .
  • a core electrode (not illustrated) is also located in a core, center, inner periphery or inner diameter of the tube 642 .
  • Drive electronics 650 , for example electrically coupled via wires 660 , drive opposing pairs of electrodes to bend the piezoelectric tube 642 in two axes independently.
  • the protruding distal tip of the optical fiber 644 has mechanical modes of resonance. The frequencies of resonance can depend upon a diameter, length, and material properties of the optical fiber 644 .
  • the tip of the fiber cantilever 644 is scanned biaxially in an area-filling two dimensional (2-D) scan.
  • by modulating an intensity of light source(s) 610 in synchrony with the scan of the fiber cantilever 644 , light emerging from the fiber cantilever 644 forms an image. Descriptions of such a set up are provided in U.S. Patent Publication No. 2014/0003762, which is incorporated by reference herein in its entirety.
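  • As an illustration of how the resonance frequency of the protruding fiber tip can depend on its diameter, length, and material properties, the following sketch applies the standard fixed-free (cantilever) Euler-Bernoulli beam formula; the fused-silica constants and dimensions are assumptions for illustration only.

```python
import math

def cantilever_first_resonance_hz(diameter_m, length_m, youngs_modulus_pa, density_kg_m3):
    """First-mode resonance of a fixed-free cylindrical cantilever (Euler-Bernoulli beam).

    f1 = (beta1^2 / (2*pi*L^2)) * sqrt(E*I / (rho*A)),  with beta1*L = 1.8751.
    For a solid cylinder, sqrt(I/A) = d/4.
    """
    beta1_l = 1.8751
    return (beta1_l ** 2) / (2 * math.pi * length_m ** 2) * (diameter_m / 4.0) * math.sqrt(
        youngs_modulus_pa / density_kg_m3)

# Illustrative fused-silica fiber: 125 um diameter, 5 mm protruding length
f1 = cantilever_first_resonance_hz(125e-6, 5e-3, 72e9, 2200)
print(f"first resonance ~{f1 / 1000:.1f} kHz")  # ~4 kHz for these assumed dimensions
```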
  • a component of an optical coupler subsystem collimates the light emerging from the scanning fiber cantilever 644 .
  • the collimated light is reflected by mirrored surface 648 into the narrow distribution planar waveguide 622 b which contains the at least one diffractive optical element (DOE) 622 a .
  • the collimated light propagates vertically (relative to the view of FIG. 6 ) along the distribution planar waveguide 622 b by total internal reflection (TIR), and in doing so repeatedly intersects with the DOE 622 a .
  • the DOE 622 a preferably has a low diffraction efficiency, so that only a fraction (e.g., 10%) of the light is diffracted toward the primary planar waveguide 632 b at each point of intersection, while the rest continues down the length of the distribution planar waveguide 622 b via TIR.
  • the DOE 632 a may advantageously be designed or configured to have a phase profile that is a summation of a linear diffraction pattern and a radially symmetric diffractive pattern, to produce both deflection and focusing of the light.
  • the DOE 632 a may advantageously have a low diffraction efficiency (e.g., 10%), so that only a portion of the light of the beam is deflected toward the eye of the viewer with each intersection of the DOE 632 a while the rest of the light continues to propagate through the primary waveguide 632 b via TIR.
  • the radially symmetric diffraction pattern of the DOE 632 a additionally imparts a focus level to the diffracted light, both shaping the light wavefront (e.g., imparting a curvature) of the individual beam as well as steering the beam at an angle that matches the designed focus level.
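  • A compact numerical sketch of one conventional way to express such a phase profile, as the sum of a linear grating term (deflection) and a radially symmetric quadratic term (paraxial lens, focusing); the wavelength, grating period, and focal length below are illustrative assumptions.

```python
import numpy as np

wavelength = 532e-9     # m, illustrative green source
grating_period = 1e-6   # m, sets the deflection angle (sin(theta) = wavelength / period)
focal_length = 0.5      # m, sets the imparted wavefront curvature (here 2 diopters)

def doe_phase(x, y):
    """Phase (radians) = linear grating term + radially symmetric lens term."""
    linear = 2 * np.pi * x / grating_period                          # deflection
    lens = -np.pi * (x ** 2 + y ** 2) / (wavelength * focal_length)  # focusing (paraxial)
    return linear + lens

# Sample the phase over a small 1 mm x 1 mm aperture
xs = np.linspace(-0.5e-3, 0.5e-3, 5)
X, Y = np.meshgrid(xs, xs)
print(np.round(doe_phase(X, Y) % (2 * np.pi), 2))
```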
  • these different pathways can cause the light to be coupled out of the primary planar waveguide 632 b by a multiplicity of DOEs 632 a at different angles, focus levels, and/or yielding different fill patterns at the exit pupil.
  • Different fill patterns at the exit pupil can be beneficially used to create a light field display with multiple depth planes.
  • Each layer in the waveguide assembly or a set of layers (e.g., 3 layers) in the stack may be employed to generate a respective color (e.g., red, blue, green).
  • a first set of three adjacent layers may be employed to respectively produce red, blue and green light at a first focal depth.
  • a second set of three adjacent layers may be employed to respectively produce red, blue and green light at a second focal depth.
  • Multiple sets may be employed to generate a full 3D or 4D color image light field with various focal depths.
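  • A small sketch of the layer indexing implied by the grouping described above (three adjacent color layers per focal depth); the helper function and ordering are hypothetical, for illustration only.

```python
COLORS = ("red", "green", "blue")

def layer_index(depth_index: int, color: str) -> int:
    """Map (focal depth, color) to a layer in the stack.

    Layers are assumed to be grouped as three adjacent color layers per depth:
    depth 0 -> layers 0-2, depth 1 -> layers 3-5, and so on.
    """
    return depth_index * len(COLORS) + COLORS.index(color)

print(layer_index(0, "green"))  # 1: second layer of the first (farthest) group
print(layer_index(1, "blue"))   # 5: last layer of the second group
```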
  • the wearable system may include other components in addition or in alternative to the components of the wearable system described above.
  • the wearable system may, for example, include one or more haptic devices or components.
  • the haptic device(s) or component(s) may be operable to provide a tactile sensation to a user.
  • the haptic device(s) or component(s) may provide a tactile sensation of pressure and/or texture when touching virtual content (e.g., virtual objects, virtual tools, other virtual constructs).
  • the tactile sensation may replicate a feel of a physical object which a virtual object represents, or may replicate a feel of an imagined object or character (e.g., a dragon) which the virtual content represents.
  • haptic devices or components may be worn by the user (e.g., a user wearable glove).
  • haptic devices or components may be held by the user.
  • the wearable system may, for example, include one or more physical objects which are manipulable by the user to allow input or interaction with the AR system. These physical objects may be referred to herein as totems. Some totems may take the form of inanimate objects, such as for example, a piece of metal or plastic, a wall, a surface of a table. In certain implementations, the totems may not actually have any physical input structures (e.g., keys, triggers, joystick, trackball, rocker switch). Instead, the totem may simply provide a physical surface, and the AR system may render a user interface so as to appear to a user to be on one or more surfaces of the totem.
  • the AR system may render an image of a computer keyboard and trackpad to appear to reside on one or more surfaces of a totem.
  • the AR system may render a virtual computer keyboard and virtual trackpad to appear on a surface of a thin rectangular plate of aluminum which serves as a totem.
  • the rectangular plate does not itself have any physical keys or trackpad or sensors.
  • the AR system may detect user manipulation or interaction or touches with the rectangular plate as selections or inputs made via the virtual keyboard and/or virtual trackpad.
  • the user input device 466 shown in FIG. 4 may be an embodiment of a totem, which may include a trackpad, a touchpad, a trigger, a joystick, a trackball, a rocker switch, a mouse, a keyboard, a multi-degree-of-freedom controller, or another physical input device.
  • a user may use the totem, alone or in combination with poses, to interact with the wearable system and/or other users.
  • haptic devices and totems usable with the wearable devices, HMD, ARD, and display systems of the present disclosure are described in U.S. Patent Publication No. 2015/0016777, which is incorporated by reference herein in its entirety.
  • a wearable system may employ various mapping related techniques in order to achieve high depth of field in the rendered light fields.
  • mapping out the virtual world it is advantageous to know all the features and points in the real world to accurately portray virtual objects in relation to the real world.
  • FOV images captured from users of the wearable system can be added to a world model by including new pictures that convey information about various points and features of the real world.
  • the wearable system can collect a set of map points (such as 2D points or 3D points) and find new map points to render a more accurate version of the world model.
  • the world model of a first user can be communicated (e.g., over a network such as a cloud network) to a second user so that the second user can experience the world surrounding the first user.
  • FIG. 7 is a block diagram of an example of an MR environment 700 .
  • the MR environment 700 may be configured to receive inputs (e.g., visual input 702 from the user's wearable system, stationary input 704 such as room cameras, sensory input 706 from various sensors, gestures, totems, eye tracking, user input from the user input device 466 etc.) from one or more user wearable systems (e.g., wearable system 200 or display system 220 ) or stationary room systems (e.g., room cameras, etc.).
  • the wearable systems can use various sensors (e.g., accelerometers, gyroscopes, temperature sensors, movement sensors, depth sensors, GPS sensors, inward-facing imaging system, outward-facing imaging system, etc.) to determine the location and various other attributes of the environment of the user. This information may further be supplemented with information from stationary cameras in the room that may provide images or various cues from a different point of view. The image data acquired by the cameras (such as the room cameras or the cameras of the outward-facing imaging system) may be reduced to a set of mapping points.
  • One or more object recognizers 708 can crawl through the received data (e.g., the collection of points) and recognize and/or map points, tag images, attach semantic information to objects with the help of a map database 710 .
  • the map database 710 may comprise various points collected over time and their corresponding objects.
  • the various devices and the map database can be connected to each other through a network (e.g., LAN, WAN, etc.) to access the cloud.
  • the object recognizers 708 a to 708 n may recognize objects in an environment.
  • the object recognizers can recognize the patient, body parts of the patient (such as e.g., limbs, torso, head, organs, etc.), medical equipment (such as, e.g., surgical tools or medical devices), as well as other objects in a room (such as, e.g., windows, walls, etc.) or other persons in the room (such as, e.g., attending physicians, nurses, etc.).
  • One or more object recognizers may be specialized for objects with certain characteristics.
  • the object recognizer 708 a may be used to recognize faces, while another object recognizer may be used to recognize scalpels.
  • if an object cannot be recognized, the object may be marked as unknown.
  • the object recognitions may be performed using a variety of computer vision techniques.
  • the wearable system can analyze the images acquired by the outward-facing imaging system 464 (shown in FIG. 4 ) to perform scene reconstruction, event detection, video tracking, object recognition, object pose estimation, learning, indexing, motion estimation, or image restoration, etc.
  • One or more computer vision algorithms may be used to perform these tasks.
  • Non-limiting examples of computer vision algorithms include: Scale-invariant feature transform (SIFT), speeded up robust features (SURF), oriented FAST and rotated BRIEF (ORB), binary robust invariant scalable keypoints (BRISK), fast retina keypoint (FREAK), Viola-Jones algorithm, Eigenfaces approach, Lucas-Kanade algorithm, Horn-Schunk algorithm, Mean-shift algorithm, visual simultaneous location and mapping (vSLAM) techniques, a sequential Bayesian estimator (e.g., Kalman filter, extended Kalman filter, etc.), bundle adjustment, Adaptive thresholding (and other thresholding techniques), Iterative Closest Point (ICP), Semi Global Matching (SGM), Semi Global Block Matching (SGBM), Feature Point Histograms, various machine learning algorithms
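  • By way of illustration, a minimal sketch using one of the listed techniques (ORB features with brute-force Hamming matching, via OpenCV) to decide whether a reference object, such as a known instrument image, appears in a camera frame; the file names and the match-count threshold are placeholders.

```python
import cv2  # OpenCV

def recognize(reference_path: str, frame_path: str, min_matches: int = 25) -> bool:
    """Return True if enough ORB features of the reference image appear in the frame."""
    ref = cv2.imread(reference_path, cv2.IMREAD_GRAYSCALE)
    frame = cv2.imread(frame_path, cv2.IMREAD_GRAYSCALE)
    if ref is None or frame is None:
        raise FileNotFoundError("could not read one of the input images")

    orb = cv2.ORB_create()
    _, ref_desc = orb.detectAndCompute(ref, None)
    _, frame_desc = orb.detectAndCompute(frame, None)
    if ref_desc is None or frame_desc is None:
        return False  # no usable features in one of the images

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(ref_desc, frame_desc)
    good = [m for m in matches if m.distance < 60]  # keep only close descriptor matches
    return len(good) >= min_matches

# Placeholder file names, for illustration only
print(recognize("scalpel_reference.png", "operating_room_frame.png"))
```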
  • the object recognitions can additionally or alternatively be performed by a variety of machine learning algorithms (such as e.g., support vector machine, k-nearest neighbors algorithm, Naive Bayes, neural network (including convolutional or deep neural networks), or other supervised/unsupervised models, etc.), and so forth.
  • the machine learning algorithm can be stored by the wearable device.
  • machine learning algorithms can include supervised or non-supervised machine learning algorithms, including regression algorithms (such as, for example, Ordinary Least Squares Regression), instance-based algorithms (such as, for example, Learning Vector Quantization), decision tree algorithms (such as, for example, classification and regression trees), Bayesian algorithms (such as, for example, Naive Bayes), clustering algorithms (such as, for example, k-means clustering), association rule learning algorithms (such as, for example, a-priori algorithms), artificial neural network algorithms (such as, for example, Perceptron), deep learning algorithms (such as, for example, Deep Boltzmann Machine, or deep neural network), dimensionality reduction algorithms (such as, for example, Principal Component Analysis), ensemble algorithms (such as, for example, Stacked Generalization), and/or other machine learning algorithms.
  • individual models can be customized for individual data sets.
  • the wearable device can generate or store a base model.
  • the base model may be used as a starting point to generate additional models specific to a data type (e.g., a particular user in the telepresence session), a data set (e.g., a set of additional images obtained of the user in the telepresence session), conditional situations, or other variations.
  • the wearable device can be configured to utilize a plurality of techniques to generate models for analysis of the aggregated data. Other techniques may include using pre-defined thresholds or data values.
  • One or more object recognizers 708 can also implement various text recognition algorithms to identify and extract the text from the images.
  • Some example text recognition algorithms include: optical character recognition (OCR) algorithms, deep learning algorithms (such as deep neural networks), pattern matching algorithms, algorithms for pre-processing, etc.
  • the wearable system can also supplement recognized objects with semantic information to give life to the objects. For example, if the object recognizer recognizes a set of points to be a door, the system may attach some semantic information (e.g., the door has a hinge and has a 90 degree movement about the hinge). If the object recognizer recognizes a set of points to be a mirror, the system may attach semantic information that the mirror has a reflective surface that can reflect images of objects in the room. As another example, the object recognizer may recognize a scalpel as belonging to a set of surgical tools for performing a certain type of surgery, for example, by comparing the recognized scalpel with a database of medical instruments used in that type of surgery. The medical instruments database may be stored locally in a data repository 260 in the surgeon's wearable device or in a remote data repository 264 (e.g., in the cloud, such as data store 1238 described with reference to FIG. 12 ).
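  • A minimal sketch of attaching semantic information to a recognized object by looking its label up in an instruments database; the database contents and field names are hypothetical.

```python
# Hypothetical local instruments/objects database; in practice this could live in the
# wearable's data repository or in a remote (e.g., cloud) data store.
INSTRUMENT_DB = {
    "scalpel": {"tool_set": "general surgery", "sterile_required": True},
    "door":    {"has_hinge": True, "swing_degrees": 90},
    "mirror":  {"reflective": True},
}

def attach_semantics(recognized_label: str) -> dict:
    """Return the recognized label plus any semantic attributes known for it."""
    semantics = INSTRUMENT_DB.get(recognized_label, {"status": "unknown"})
    return {"label": recognized_label, **semantics}

print(attach_semantics("scalpel"))
# {'label': 'scalpel', 'tool_set': 'general surgery', 'sterile_required': True}
```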
  • the map database grows as the system (which may reside locally or may be accessible through a wireless network) accumulates more data from the world.
  • the information may be transmitted to one or more wearable systems.
  • the MR environment 700 may include information about a scene happening in California.
  • the environment 700 may be transmitted to one or more users in New York.
  • the object recognizers and other software components can map the points collected from the various images, recognize objects etc., such that the scene may be accurately “passed over” to a second user, who may be in a different part of the world.
  • the environment 700 may also use a topological map for localization purposes.
  • the MR environment 700 may be an operating room where the surgeon is performing a surgery on a patient.
  • the MR environment 700 may be shared with persons in the same operating room or outside of the operating room.
  • the surgeon may share the images in his FOV of his ARD with the medical students in a classroom.
  • the MR environment 700 may be shared with a pathology lab so that the physicians in the pathology lab can place virtual flags around a tumor found by the surgeon in the patient's body.
  • FIG. 8 is a process flow diagram of an example of a method 800 of rendering virtual content in relation to recognized objects.
  • the method 800 describes how a virtual scene may be represented to a user of the MR system (e.g., a wearable system).
  • the user may be geographically remote from the scene. For example, the user may be in New York, but may want to view a scene that is presently going on in California, or may want to go on a walk with a friend who resides in California.
  • the wearable system may receive input from the user and other users regarding the environment of the user. This may be achieved through various input devices, and knowledge already possessed in the map database.
  • the user's FOV camera, sensors, GPS, eye tracking, etc. convey information to the system at block 810 .
  • the system may determine sparse points based on this information at block 820 .
  • the sparse points may be used in determining pose data (e.g., head pose, eye pose, body pose, and/or hand gestures) that can be used in displaying and understanding the orientation and position of various objects in the user's surroundings.
  • the object recognizers 708 a , 708 n may crawl through these collected points and recognize one or more objects using a map database at block 830 .
  • This information may then be conveyed to the user's individual wearable system at block 840 , and the desired virtual scene may be accordingly displayed to the user at block 850 .
  • the desired virtual scene (e.g., the user in CA) may be displayed at the appropriate orientation, position, etc., in relation to the various objects and other surroundings of the user in New York.
  • FIG. 9 is a block diagram of another example of a wearable system.
  • the wearable system 900 comprises a map 920 , which may include map data for the world (which may be part of the map database 710 ).
  • the map may partly reside locally on the wearable system, and may partly reside at networked storage locations accessible by wired or wireless network (e.g., in a cloud system).
  • a pose process 910 may be executed on the wearable computing architecture (e.g., processing module 260 or controller 460 ) and utilize data from the map to determine position and orientation of the wearable computing hardware or user.
  • Pose data may be computed from data collected on the fly as the user is experiencing the system and operating in the world.
  • the data may comprise images, data from sensors (such as inertial measurement devices, which generally comprise accelerometer and gyroscope components) and surface information pertinent to objects in the real or virtual environment.
  • a sparse point representation may be the output of a simultaneous localization and mapping (SLAM or V-SLAM, referring to a configuration wherein the input is images/visual only) process.
  • the system can be configured to not only find out where in the world the various components are, but what the world is made of.
  • Pose may be a building block that achieves many goals, including populating the map and using the data from the map.
  • a sparse point position may not be completely adequate on its own, and further information may be needed to produce a multifocal AR, VR, or MR experience.
  • Dense representations, generally referring to depth map information, may be utilized to fill this gap at least in part.
  • Such information may be computed from a process referred to as Stereo 940 , wherein depth information is determined using a technique such as triangulation or time-of-flight sensing.
  • Image information and active patterns (such as infrared patterns created using active projectors) may serve as input to the Stereo process 940 .
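  • For a rectified stereo pair, the triangulation mentioned above reduces to depth = focal length × baseline / disparity; a tiny sketch with illustrative camera parameters follows.

```python
def depth_from_disparity(disparity_px: float, focal_length_px: float, baseline_m: float) -> float:
    """Depth (m) of a point from its disparity in a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        return float('inf')  # zero disparity corresponds to a point at infinity
    return focal_length_px * baseline_m / disparity_px

# Illustrative parameters: 700 px focal length, 6 cm baseline, 20 px disparity
print(depth_from_disparity(20, 700, 0.06))  # 2.1 m
```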
  • a significant amount of depth map information may be fused together, and some of this may be summarized with a surface representation.
  • mathematically definable surfaces are efficient (e.g., relative to a large point cloud) and digestible inputs to other processing devices like game engines or medical devices (such as, e.g., medical imaging devices).
  • the output of the Stereo process (e.g., a depth map) may be combined in the Fusion process 930 .
  • Pose may be an input to this Fusion process 930 as well, and the output of Fusion 930 becomes an input to populating the map process 920 .
  • Sub-surfaces may connect with each other, such as in topographical mapping, to form larger surfaces, and the map becomes a large hybrid of points and surfaces.
  • the location and type of medical devices may be tracked and used as inputs to determine whether the nurse has handed the physician the correct medical devices.
  • the MR reality process 960 may allow a wearable system to present a medical record (such as the medical history, allergies, treatment recommendations, images (e.g., X-rays, ECGs, MRIs, etc.), audio (e.g., from medical examinations, etc.), etc.) of a patient while the doctor is examining or operating on the patient.
  • the medical record may be stored locally or remotely and accessed for display to the wearer.
  • the world map may include information regarding where the physical and virtual objects are relative to each other. This relative location information may be another valuable input to mixed reality. Pose relative to the world becomes an input as well and plays a key role to almost any interactive system.
  • Controls or inputs from the user are another input to the wearable system 900 .
  • user inputs can include visual input, gestures, totems, audio input, sensory input, head or eye pose, etc.
  • the user may need to instruct the wearable system 900 regarding what he or she wants to do.
  • a totem, user input device, or object such as a toy gun may be held by the user and tracked by the system.
  • the system preferably will be configured to know that the user is holding the item and understand what kind of interaction the user is having with the item (e.g., if the totem or object is a gun, the system may be configured to understand location and orientation, as well as whether the user is clicking a trigger or other sensed button or element which may be equipped with a sensor, such as an IMU, which may assist in determining what is going on, even when such activity is not within the field of view of any of the cameras).
  • Hand gesture tracking or recognition may also provide input information.
  • the wearable system 900 may be configured to track and interpret hand gestures for button presses, for gesturing left or right, stop, grab, hold, etc. For example, in one configuration, the user may want to flip through emails or a calendar in a non-gaming environment, or do a “fist bump” with another person or player.
  • the wearable system 900 may be configured to leverage a minimum amount of hand gesture, which may or may not be dynamic.
  • the gestures may be simple static gestures like open hand for stop, thumbs up for ok, thumbs down for not ok; or a hand flip right, or left, or up/down for directional commands.
  • Eye tracking is another input (e.g., tracking where the user is looking to control the display technology to render at a specific depth or range).
  • vergence of the eyes may be determined using triangulation, and then using a vergence/accommodation model developed for that particular person, accommodation may be determined.
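  • A small sketch of the triangulation step, assuming symmetric fixation and an illustrative interpupillary distance: the fixation distance follows from how far each eye is rotated inward, and an accommodation estimate could then be read from a per-user vergence/accommodation mapping. All numbers are illustrative.

```python
import math

def fixation_distance_m(left_inward_deg: float, right_inward_deg: float,
                        ipd_m: float = 0.063) -> float:
    """Estimate fixation distance from how far each eye is rotated inward from parallel.

    Symmetric-fixation approximation: each eye's line of sight meets the midline at
    distance (ipd / 2) / tan(inward angle); the two per-eye estimates are averaged.
    """
    estimates = []
    for angle_deg in (left_inward_deg, right_inward_deg):
        if angle_deg <= 0:
            estimates.append(float('inf'))  # parallel gaze ~ optical infinity
        else:
            estimates.append((ipd_m / 2) / math.tan(math.radians(angle_deg)))
    return sum(estimates) / 2

# Each eye rotated ~1.8 degrees inward -> fixation at roughly 1 m
print(round(fixation_distance_m(1.8, 1.8), 2))
```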
  • Head tracking can be another input (e.g., tracking the direction of the user's head to determine which virtual or physical object the user is looking toward).
  • the example wearable system 900 shown in FIG. 9 can include three pairs of cameras: a relatively wide FOV or passive SLAM pair of cameras arranged to the sides of the user's face, a different pair of cameras oriented in front of the user to handle the Stereo imaging process 940 and also to capture hand gestures and totem/object tracking 950 in front of the user's face.
  • the cameras in the three pairs of cameras may be a part of the outward-facing imaging system 464 (shown in FIG. 4 ).
  • the wearable system 900 can include eye tracking cameras (which may be a part of an inward-facing imaging system 462 shown in FIG. 4 ) oriented toward the eyes of the user in order to triangulate eye vectors and other information.
  • the wearable system 900 may also comprise one or more textured light projectors (such as infrared (IR) projectors) to inject texture into a scene.
  • FIG. 10 is a process flow diagram of an example of a method 1000 for determining user input to a wearable system.
  • the user may interact with a totem.
  • the user may have multiple totems.
  • the user may have designated one totem for a social media application, another totem for playing games, etc.
  • the wearable system may detect a motion of a totem.
  • the movement of the totem may be recognized through the user's FOV camera or may be detected through sensors (e.g., haptic glove, image sensors, hand tracking devices, eye-tracking cameras, head pose sensors, etc.).
  • based at least partly on the detected gesture, eye pose, head pose, or input through the totem, the wearable system detects a position, orientation, and/or movement of the totem (or the user's eyes or head or gestures) with respect to a reference frame, at block 1020 .
  • the reference frame may be a set of map points based on which the wearable system translates the movement of the totem (or the user) to an action or command.
  • the user's interaction with the totem is mapped. Based on the mapping of the user interaction with respect to the reference frame 1020 , the system determines the user input at block 1040 .
  • the user may move a totem or physical object back and forth to signify turning a virtual page and moving on to a next page or moving from one user interface (UI) display screen to another UI screen.
  • the user may move their head or eyes to look at different real or virtual objects in the user's FOR. If the user's gaze at a particular real or virtual object is longer than a threshold time, the real or virtual object may be selected as the user input.
  • the vergence of the user's eyes can be tracked and an accommodation/vergence model can be used to determine the accommodation state of the user's eyes, which provides information on a depth plane on which the user is focusing.
  • the wearable system can use raycasting techniques to determine which real or virtual objects are along the direction of the user's head pose or eye pose.
  • the ray casting techniques can include casting thin, pencil rays with substantially little transverse width or casting rays with substantial transverse width (e.g., cones or frustums).
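  • A minimal sketch of the cone-cast and dwell-time ideas described above: an object counts as hit when the direction to it lies within a cone around the gaze direction, and it is selected once the gaze has dwelled on it longer than a threshold; the cone half-angle and dwell threshold are illustrative.

```python
import math
import time

def within_cone(gaze_dir, to_object, half_angle_deg: float = 5.0) -> bool:
    """True if the direction to the object lies within a cone around the gaze direction."""
    dot = sum(g * o for g, o in zip(gaze_dir, to_object))
    norm = math.sqrt(sum(g * g for g in gaze_dir)) * math.sqrt(sum(o * o for o in to_object))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle <= half_angle_deg

class DwellSelector:
    """Select an object once the user's gaze has stayed on it longer than a threshold."""
    def __init__(self, dwell_seconds: float = 1.0):
        self.dwell_seconds = dwell_seconds
        self.current = None
        self.since = None

    def update(self, hit_object, now=None):
        now = time.monotonic() if now is None else now
        if hit_object != self.current:
            self.current, self.since = hit_object, now  # gaze moved to a new target
            return None
        if hit_object is not None and now - self.since >= self.dwell_seconds:
            return hit_object  # dwell threshold exceeded: treat as a selection
        return None

selector = DwellSelector(dwell_seconds=1.0)
print(selector.update("virtual_button", now=0.0))  # None (gaze just arrived)
print(selector.update("virtual_button", now=1.2))  # 'virtual_button' (selected)
```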
  • the user interface may be projected by the display system as described herein (such as the display 220 in FIG. 2 ). It may also be displayed using a variety of other techniques such as one or more projectors.
  • the projectors may project images onto a physical object such as a canvas or a globe. Interactions with the user interface may be tracked using one or more cameras external to the system or part of the system (such as, e.g., using the inward-facing imaging system 462 or the outward-facing imaging system 464 ).
  • FIG. 11 is a process flow diagram of an example of a method 1100 for interacting with a virtual user interface.
  • the method 1100 may be performed by the wearable system described herein.
  • the wearable system may identify a particular UI.
  • the type of UI may be predetermined by the user.
  • the wearable system may identify that a particular UI needs to be populated based on a user input (e.g., gesture, visual data, audio data, sensory data, direct command, etc.).
  • the wearable system may generate data for the virtual UI. For example, data associated with the confines, general structure, shape of the UI etc., may be generated.
  • the wearable system may determine map coordinates of the user's physical location so that the wearable system can display the UI in relation to the user's physical location.
  • the wearable system may determine the coordinates of the user's physical stance, head pose, or eye pose such that a ring UI can be displayed around the user or a planar UI can be displayed on a wall or in front of the user.
  • the map coordinates of the user's hands may be determined. These map points may be derived through data received through the FOV cameras, sensory input, or any other type of collected data.
  • the wearable system may send the data to the display from the cloud or the data may be sent from a local database to the display components.
  • the UI is displayed to the user based on the sent data.
  • a light field display can project the virtual UI into one or both of the user's eyes.
  • the wearable system may simply wait for a command from the user to generate more virtual content on the virtual UI at block 1150 .
  • the UI may be a body centric ring around the user's body. The wearable system may then wait for the command (a gesture, a head or eye movement, input from a user input device, etc.), and if it is recognized (block 1160 ), virtual content associated with the command may be displayed to the user (block 1170 ).
  • the wearable device described herein can be configured to perform various medical applications.
  • the wearable device may include an HMD that is configured to present AR/MR/VR content to the wearer of the HMD.
  • the wearable device can provide a customized medical-related application based on the user of the wearable device.
  • the user of the wearable device may be a patient and the wearable device can provide a medical record management system to be used by the patient or authorized HCPs.
  • FIG. 12 illustrates an example computing environment in which multiple wearable devices and medical record management systems can interact with each other in a healthcare setting to provide medical record management.
  • the wearable system can include a medical record management system that includes a data security management system 1212 and a record update system 1214 (shown in FIG. 12 ).
  • the medical record management system can allow a user (e.g., an authorized HCP) to manage a patient's medical records such as, e.g., adding/editing the medical records, inputting the patient's medical history and family medical history, setting access privileges (also referred to as permissions) associated with the medical records, and so on.
  • the medical records are also referred to as virtual medical records.
  • the medical record management system can also allow the patient to view his medical records.
  • the medical record management system may allow the patient to manage some parts of his medical records such as, e.g., adding/editing his medical history and his family medical history, setting access privileges associated with his medical records, and so on.
  • the medical record management system can also, in order to preserve the integrity and accuracy of the medical records, prohibit the patient from managing some parts of his medical records such as, e.g., adding/editing doctors' notes, doctors' diagnoses, tests results, and so on.
  • the system associates different access privileges to different portions of the medical record.
  • the access privileges may include read-only access to some portions of the medical record and edit access to other portions.
  • Different users may have different access privileges to the same portion of the medical record.
  • the patient may have edit privileges for his medical or family history but read-only privileges for diagnoses, test results, etc.
  • the HCP may have read-only privileges for the patient's family history but edit privileges for the HCP's own notes or diagnosis (but not edit privileges for another HCP's notes or diagnoses).
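  • A minimal sketch of per-portion access privileges of the kind described above, with read-only versus edit access differing by role; the roles, record sections, and policy table are illustrative only and omit finer distinctions such as per-author notes.

```python
# Illustrative access-control policy: record section -> role -> allowed actions.
POLICY = {
    "family_history": {"patient": {"read", "edit"}, "hcp": {"read"}},
    "diagnoses":      {"patient": {"read"},         "hcp": {"read", "edit"}},
    "test_results":   {"patient": {"read"},         "hcp": {"read", "edit"}},
    "hcp_notes":      {"patient": {"read"},         "hcp": {"read", "edit"}},
}

def is_allowed(role: str, section: str, action: str) -> bool:
    """Return True if the given role may perform the action on that record section."""
    return action in POLICY.get(section, {}).get(role, set())

print(is_allowed("patient", "family_history", "edit"))  # True
print(is_allowed("patient", "diagnoses", "edit"))       # False (read-only for the patient)
print(is_allowed("hcp", "family_history", "edit"))      # False (read-only for the HCP)
```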
  • the user of the wearable device may be a healthcare provider (HCP) who, e.g., accesses the patient's medical record or provides treatment or diagnosis to the patient.
  • the HCP can include various entities.
  • the HCP may be a physician or other authorized party affiliated with the physician.
  • the term physician can include a medical doctor (MD), a doctor of osteopathic medicine (DO), a physician assistant (PA), a nurse, an optometrist (OD), a podiatrist (DPM), a dentist (DDS or DDM), a veterinarian (DVM), an advanced practice medical nurse (APRN), a clinical pharmacist, a medical or nurse practitioner, a medical psychologist, or any other person authorized or licensed to prescribe medications, perform medical procedures, diagnose medical conditions, analyze medical records, etc.
  • An HCP can include a medical receptionist or assistant who accesses, reviews, or enters information in a patient's healthcare record. Additionally or alternatively, depending on the context, the HCP can refer to an entity or organization associated with human HCPs, such as, e.g., a medical clinic, a hospital, an insurance provider, a pharmacy, or other entities that provide medical-related services.
  • the wearable device can include a healthcare provider system 1230 (shown in FIG. 12 ) to allow the user to access the patient's medical records, use the medical records to perform medical exams or operations, update the patient's medical records based on an interaction with the patient, determine whether the HCP is providing the correct care (such as, e.g., operating on the correct limb during a surgery), etc.
  • the wearable device can use other medical information, such as, e.g., which surgical tools should be used in a certain type of surgery, alone or in combination with the patient's medical records, to enhance the quality of care and to reduce the likelihood of medical accidents.
  • the wearable device can track the surgical tools entering into the sterile region to make sure that the correct surgical tools are used or that no foreign objects (e.g., medical instruments such as, e.g., surgical tools) are accidentally left inside the patient's body.
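  • The instrument-tracking behavior described above can be sketched as a simple count of instruments that the object recognizers report entering and leaving the sterile region; the class and method names below are hypothetical, and a real system would be driven by the wearable's computer vision pipeline rather than manual calls.

```python
# Illustrative sketch (hypothetical names): counting surgical instruments seen
# entering and leaving the sterile region, so the device can warn if an instrument
# has not come back out before the surgeon closes.
from collections import Counter

class SterileFieldTracker:
    def __init__(self):
        self._inside = Counter()

    def instrument_entered(self, label: str) -> None:
        self._inside[label] += 1

    def instrument_removed(self, label: str) -> None:
        if self._inside[label] > 0:
            self._inside[label] -= 1

    def unaccounted_for(self) -> list:
        # Any positive count means an instrument was seen entering but not leaving.
        return [label for label, count in self._inside.items() if count > 0]

tracker = SterileFieldTracker()
tracker.instrument_entered("sponge")
tracker.instrument_entered("clamp")
tracker.instrument_removed("clamp")
print(tracker.unaccounted_for())  # ['sponge'] -> raise an alert to the surgeon
```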
  • the users of their respective wearable devices can share medical records and collaborate using the wearable devices. For example, based on the access privileges to different parts of his medical record, a patient may edit some parts of his medical records in a healthcare database using his wearable device while being able to read (but not edit) other parts of his medical record.
  • the wearable device of a doctor at the clinic can retrieve the patient's disease history and present relevant portions of the disease history in a 2D or 3D user interface to facilitate the doctor's diagnosis, analysis and interactions with the patient.
  • the doctor's wearable device can also record (visually or audibly) the doctor's interaction with the patient, extract relevant information of the patient, and update the patient's medical records based on the relevant information.
  • the wearable device of a surgeon can capture still or video images, audio, or input from medical devices in the operating room and a surgical site on the patient while the surgeon is performing a surgery on the patient.
  • the wearable device of the surgeon can pass the information of the operating room and the patient to the wearable devices of a group of medical students, which allows the medical students to observe the surgery as it occurs or after the surgery is over.
  • FIG. 12 illustrates an example computing environment 1200 in which multiple wearable devices can interact with each other in a healthcare setting.
  • the example computing environment 1200 in FIG. 12 includes healthcare provider systems (e.g., healthcare provider system A 1230 a through healthcare provider system N 1230 n ), patient systems (e.g., patient system A 1210 a through patient system N 1210 n ), and a healthcare database system 1220 .
  • the HCP systems, the patient systems, and the healthcare database system 1220 can communicate with each other using the network 1290 .
  • the network 1290 may be a LAN, a WAN, a peer-to-peer network, radio frequency, Bluetooth, Wi-Fi, a cloud based network, or any other type of communication network.
  • the computing environment 1200 can provide a centralized healthcare database for the users of the wearable devices.
  • the computing environment 1200 can allow users of the wearable devices to input, edit, organize, and access data in the centralized healthcare database.
  • FIG. 12 only illustrates in detail one HCP system and one patient system.
  • Other HCP systems may include similar functionalities as the HCP system A 1230 a .
  • Other patient systems may also include similar functionalities as the patient system A 1210 a.
  • the HCP system A 1230 a may be part of a wearable device.
  • the HCP system A 1230 a includes one or more object recognizer(s) 708 , environmental sensors 1232 , a data management system 1234 , a data processing system 1236 , and a data store 1238 .
  • the HCP system A 1230 a may include fewer or more systems and components than described.
  • the HCP system 1230 a may not have the data processing system 1236 . Rather, the data processing system 1236 may be part of the healthcare database system 1220 .
  • the HCP system 1230 a may include more systems or functionalities that facilitate the medical care of patients.
  • One or more systems of the HCP system A 1230 a may be combined or be part of another system.
  • the object recognizer(s) 708 may be part of the data processing system 1236 .
  • the object recognizer(s) 708 can be used to recognize objects in the user's environment. As described with reference to FIG. 7 , the object recognizer(s) 708 can apply computer vision algorithms (in addition to or in alternative to machine learning algorithms) to identify medical equipment, documents, faces, etc., in the user's environment.
  • the wearable device can also attach semantic information to the objects. As further described with reference to FIG. 25 , the wearable device can use an object recognizer to detect or track a surgical instrument or a medical device in a FOV of the wearable device or the user of the wearable device. Additionally, the wearable device can identify a medical device (e.g., an ultrasound probe) and connect to the device via a wired or a wireless network.
  • the wearable device can scan for messages broadcasted by network-enabled medical devices in its vicinity and wirelessly connect to such devices.
  • the wearable device can receive data from the medical device and present information related to the received data to the wearer of the device (e.g., images from an imaging device, sensor data from a probe (e.g., thermometer), and so forth).
  • the wearable device may provide a user interface (UI) that permits the wearer (e.g., a surgeon) to access or control a medical device.
  • the wearable device may include a near field communication (NFC) interface that is configured to communicate over a short range (e.g., about 10 cm) with an NFC enabled medical device to exchange information, identify each other, bootstrap to a wireless connection with higher bandwidth, etc.
  • the NFC interface and the NFC enabled medical device may operate in passive or active modes.
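  • As a rough sketch of the scan-and-connect behavior described in the bullets above, the code below simulates filtering broadcast advertisements from network-enabled medical devices and choosing the closest compatible one; the advertisement format, class names, and connect() behavior are assumptions and do not correspond to any real NFC or Bluetooth API.

```python
# Illustrative only: a simulated scan-and-connect flow for network-enabled medical
# devices in the wearable's vicinity. All names and message shapes are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class DeviceAdvertisement:
    device_id: str
    device_type: str      # e.g., "ultrasound_probe", "thermometer"
    distance_m: float     # estimated distance from the wearable

class WearableRadio:
    def __init__(self, advertisements):
        self._advertisements = advertisements

    def scan(self, wanted_type: str) -> Optional[DeviceAdvertisement]:
        # Pick the closest advertising device of the requested type, if any.
        candidates = [a for a in self._advertisements if a.device_type == wanted_type]
        return min(candidates, key=lambda a: a.distance_m, default=None)

    def connect(self, ad: DeviceAdvertisement) -> str:
        # A real system would bootstrap a higher-bandwidth wireless link here.
        return f"connected to {ad.device_id} ({ad.device_type})"

radio = WearableRadio([
    DeviceAdvertisement("probe-17", "ultrasound_probe", 0.8),
    DeviceAdvertisement("thermo-02", "thermometer", 2.5),
])
ad = radio.scan("ultrasound_probe")
if ad is not None:
    print(radio.connect(ad))  # then stream images from the probe to the display
```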
  • the surgical instrument may be associated with semantic information.
  • the semantic information may include indications that the surgical instrument is part of an instrument set used for amputation.
  • the semantic information can also include the functions of the surgical instrument, such as, e.g., stopping blood from spraying, stitching an open wound, etc.
  • the environmental sensors 1232 can include various sensors described with reference to FIGS. 2A, 2B, and 4 .
  • the environmental sensors 1232 may include the user sensors 24 , 28 , 30 , 32 , the external sensor 34 described in FIG. 2B , the microphone 232 in FIG. 2A , the sensors in the outward-facing imaging system 464 and the sensors in the inward-facing imaging system 462 in FIG. 4 , etc.
  • the environmental sensors 1232 can be configured to acquire data of the user's environment and data of the user.
  • the microphone 232 can acquire the audio data associated with the phrases spoken by the HCP or a patient.
  • the outward-facing imaging system 464 can image the patient or an environment of the user.
  • the data acquired by the one or more environmental sensors 1232 can be communicated to another system (or sensor) such as the object recognizer(s) 708 for identifying physical objects in the user's environment or the data processing system 1236 to extract relevant medical information.
  • the HCP system A 1230 a can also include a data processing system 1236 .
  • the data processing system 1236 can be configured to extract relevant information from data acquired by the environmental sensors 1232 , data received from a patient system 1210 , or data accessed from the healthcare database system 1220 or the data store 1238 .
  • the data processing system 1236 can process audio data (for example acquired from the microphone 232 ).
  • the data processing system 1236 can parse the audio data to identify the content of the speech by applying various speech recognition algorithms, such as, e.g., hidden Markov models, dynamic time warping (DTW)-based speech recognition, neural networks, deep learning algorithms such as deep feedforward and recurrent neural networks, end-to-end automatic speech recognition, machine learning algorithms (described with reference to FIG. 7 ), or other algorithms that use acoustic modeling or language modeling, etc.
  • the data processing system 1236 can also apply voice recognition algorithms which can identify the identity of the speaker, such as whether the speaker is a certain patient or the patient's doctor.
  • the data processing system 1236 can use various machine learning algorithms described with reference to FIG. 7 to perform the voice recognition.
  • the data processing system 1236 can also be configured to process images.
  • the data processing system 1236 can apply one or more computer vision algorithms described with reference to the object recognizer(s) 708 to identify objects or persons in an image.
  • the object recognizer(s) 708 may be part of the data processing system 1236 .
  • the data processing system 1236 can also perform text recognition.
  • an image may include a scanned copy of the patient's medical history.
  • the data processing system 1236 can extract the text in the image using one or more text recognition algorithms, such as, e.g., character recognition algorithms, deep learning algorithms (such as deep neural networks), pattern matching algorithms, algorithms for pre-processing, etc.
  • the data processing system 1236 can extract relevant information from the data acquired by the environmental sensors.
  • an image acquired by the data processing system 1236 may include the face of a nurse and the face of the patient.
  • the data processing system 1236 can use facial recognition techniques to detect the faces in the image.
  • the data processing system 1236 can further identify the face of the patient, for example, based on a previous image of the patient.
  • the data processing system 1236 can use a machine learning algorithm to detect a keyword spoken by the user and identify the sentence including the keyword.
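  • A minimal sketch of keyword-based relevance detection is shown below; it uses simple keyword matching rather than the machine learning models referenced above, and the keyword list and function names are illustrative assumptions.

```python
# Minimal keyword-spotting sketch (not the patent's machine learning model): split a
# transcript into sentences and return the sentences containing medically relevant
# keywords, which could then be added to the patient's record or shown to the HCP.
import re

MEDICAL_KEYWORDS = {"pain", "swelling", "dizziness", "allergy", "medication"}

def relevant_sentences(transcript: str, keywords=MEDICAL_KEYWORDS) -> list:
    sentences = re.split(r"(?<=[.!?])\s+", transcript.strip())
    hits = []
    for sentence in sentences:
        words = set(re.findall(r"[a-z']+", sentence.lower()))
        if words & keywords:
            hits.append(sentence)
    return hits

transcript = ("Nice weather today. I have had sharp pain in my left knee for a week. "
              "It gets worse when I climb stairs.")
print(relevant_sentences(transcript))
# -> ['I have had sharp pain in my left knee for a week.']
```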
  • Some or all of the audio, visual, extracted text, or other information obtained or processed by the wearable device can be stored in the patient's medical record.
  • an audiovisual recording of a doctor's examination of the patient can be stored in the medical record.
  • the patient may describe medical problems which he is experiencing; the doctor's wearable device may record and process the patient's audio data, store the information related to the medical problems in the patient's medical record, or display the information to the doctor (see, e.g., FIG. 15 ).
  • the HCP's wearable device may detect and record signs (e.g., eye movements, symptoms on the patient's skin, etc.) during the examination.
  • although the example computing environment 1200 shows the data processing system 1236 as part of the HCP system A 1230 a , in some embodiments, at least a portion of the data processing system 1236 is part of the healthcare database system 1220 .
  • the data management system 1234 can be configured to update and manage a patient's medical records.
  • the data management system 1234 can include a 2D or 3D user interface.
  • a user of a wearable device can view the patient's medical records using the 2D or 3D user interface.
  • the user can also edit the medical records using poses or the user input device 466 , such as, e.g., head poses, gestures, voice input, totem, etc.
  • the data management system 1234 can receive data from the data processing system 1236 , object recognizer(s) 708 , or environmental sensor(s) 1232 .
  • the data management system 1234 can communicate with the healthcare database system 1220 to update the patient's virtual medical records based on the received data. For example, the data management system 1234 may receive a patient's diagnosis from a doctor.
  • the data management system 1234 can communicate an instruction to the healthcare database system 1220 to add the diagnosis to the patient's medical record.
  • the data management system 1234 can also manage data sharing with another system, such as a patient system or another healthcare provider system.
  • the HCP system A 1230 a may be associated with a surgeon who is performing a surgery in an operating room.
  • the data processing system 1236 can identify people (such as e.g., nurses or other physicians) in the operating room using facial recognition techniques.
  • the data management system 1234 can receive the identities of the people in the operating room from the data processing system 1236 .
  • the data management system 1234 can automatically share the one or more virtual items (such as, e.g., the patient's physiological data) on the 2D or 3D user interface of the wearable device of the surgeon with the identified people.
  • the data management system 1234 can also share the images acquired by the outward-facing imaging system 464 of the surgeon's wearable device or other information (e.g., audio) collected by the surgeon's wearable device with the wearable devices of the identified people in the operating room.
  • although the data management system 1234 in FIG. 12 is illustrated as part of the HCP system A 1230 a , in some embodiments, at least a portion of the data management system 1234 is part of the healthcare database system 1220 .
  • the data store 1238 can be configured to store various algorithms (e.g., as implemented by computer executable codes), such as computer vision algorithms, speech recognition algorithms, voice recognition algorithms, text recognition algorithms, machine learning algorithms, etc. As described with reference to the data processing system 1236 , these algorithms may be applied by the data processing system 1236 to process data acquired by the environmental sensor(s) 1232 .
  • the data store 1238 may include a copy of a patient's virtual medical record. For example, due to large file sizes, the HCP system A 1230 a can preload a portion of the patient's prior medical images into the data store 1238 before the patient's scheduled visit to the HCP. When the patient visits the HCP, the HCP can view the virtual medical records preloaded to the data store 1238 to avoid delays in retrieving images from a remote location, such as the medical data store 1222 .
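  • The preloading behavior described above could resemble the following sketch, in which a local data store caches records fetched from a remote store ahead of a scheduled visit; all names (LocalDataStore, fake_remote_fetch, the record IDs) are hypothetical.

```python
# Sketch (hypothetical names) of preloading large records into a local data store
# ahead of a scheduled visit, so the HCP's device does not wait on the remote
# medical data store when the patient arrives.
class LocalDataStore:
    def __init__(self, remote_fetch):
        self._remote_fetch = remote_fetch   # callable: record_id -> record dict
        self._cache = {}

    def preload(self, record_ids):
        # Called e.g. the night before the appointment.
        for record_id in record_ids:
            self._cache[record_id] = self._remote_fetch(record_id)

    def get(self, record_id):
        # Serve from the local cache when possible; fall back to the remote store.
        if record_id not in self._cache:
            self._cache[record_id] = self._remote_fetch(record_id)
        return self._cache[record_id]

def fake_remote_fetch(record_id):
    return {"id": record_id, "type": "MRI image"}

store = LocalDataStore(fake_remote_fetch)
store.preload(["mri-2016-04", "mri-2017-01"])
print(store.get("mri-2016-04"))  # returned without a remote round trip
```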
  • the data store 1238 can also store data acquired or generated by other systems of the HCP system A 1230 a .
  • the data store 1238 can store data acquired by the environmental sensor(s) 1232 while a patient is interacting with an HCP.
  • the data store 1238 can also store data associated with the data processing system 1236 .
  • the data processing system 1236 can generate a transcript of a conversation between the patient and an HCP and communicate the transcript for storage by the data store 1238 .
  • the data store 1238 can also store patient data acquired by the patient's wearable device.
  • the patient data can be analyzed using various machine learning algorithms for determining the patient's habits, gait, physiological information such as heart rate, blood pressure, etc.
  • the patient data can also be used to perform eye tracking or head pose tracking as described with reference to FIGS. 2A, 2B, and 4 .
  • a portion of the data stored by the data store 1238 may be communicated to the healthcare database system 1220 .
  • the transcript of the conversation between the patient and the HCP can be added to the patient's virtual medical record stored in the medical data store 1222 of the healthcare database system 1220 .
  • the access privilege to patient data (e.g., as acquired by the patient's wearable device) can be customized such that the patient owns access to all of his or her own personal data and can select who to share it with and how much is shared.
  • the patient system A 1210 a includes a data security management system 1212 and a record update system 1214 .
  • the patient system A 1210 a may include fewer or more systems and components as described.
  • the patient system A 1210 a can include a local data store, or a remote data store (e.g., a cloud-based data store) for storing data acquired or generated by the patient system A 1210 a .
  • the patient system A 1210 a may include more systems and/or functionalities that facilitate the medical care of patients.
  • One or more systems of the patient system A 1210 a may be combined or be part of another system.
  • a portion of the record update system 1214 may be part of the healthcare database system 1220 .
  • the patient system A 1210 a may be part of a wearable device associated with a patient. Although not shown in FIG. 12 , the patient system A 1210 a can also include environmental sensors for acquiring information of the patient and the patient's environment. The patient system A 1210 a can also include a 3D user interface configured to allow user interactions with physical objects or virtual objects via the user input device 466 or poses.
  • Patient medical record information is highly personal and confidential to a particular patient.
  • a patient may wish to share only a portion of such information with certain HCPs.
  • a patient may wish to permit a receptionist at a doctor's office to access only patient name, address, and appointment information to schedule an office visit whereas the patient may wish to permit a treating physician access to only the information pertaining to the physician's specialty (e.g., a cardiologist would be permitted access to records related to the patient's heart and body functions but not information related to the patient's visit with a psychologist).
  • Embodiments of the data security management systems described herein can be used to preserve confidentiality of patient medical records (e.g., as required by state or federal laws such as HIPAA) and to permit access to only portions of the medical record on a need-to-know basis.
  • the data security management system 1212 can be configured to allow a patient to manage access privileges associated with his medical records.
  • the patient can specify the access privileges via the 3D user interface.
  • Access privileges may be used to determine whether the virtual medical record (or a portion of the virtual medical record) may be viewed, edited, added, or deleted by an account of a user.
  • the user may be an entity such as, e.g., an HCP, a medical insurance provider, a patient's family member, a patient, etc.
  • the patient can add, delete, or edit healthcare providers to whom the patient wants to give permission for accessing the patient's virtual medical records.
  • the access privileges can also involve whether the account has authorizations to manage the access privileges of another device or account (also referred to herein as delegate access). For example, a patient may give his primary care physician permissions to the patient's full medical record. The physician in turn can allow his nurse to view a portion of the patient's medical record (such as the patient's address and medical insurance information).
  • the access privileges can be associated with a time limit which indicates how long a user has permission to access the patient's medical records. The patient may edit the time limit. The patient may receive a notification when a user's access privilege is about to expire. The patient can decide whether to extend the time limit or update the access privilege of that user.
  • a patient's primary care physician may have permissions to access the patient's medical record for a period of five years.
  • the patient may receive a notification asking whether the patient wants to extend the physician's access privilege at the end of the five-year period. If the patient has stopped seeing the physician, the patient can choose not to extend the physician's five-year access privilege and the physician may no longer have access to the patient's medical record.
  • a patient may give an orthopedist an access privilege for treating his broken ankle. The access privilege may last from a first appointment to a final appointment. At the final appointment, if the orthopedist informs the patient that he needs more time to treat the patient's broken ankle, the patient can extend the time limit associated with the orthopedist's access privilege for another time period. Otherwise, the time limit will end, and the orthopedist's access privilege to the patient's medical file will expire.
  • access privileges can include a time limit or an action limit (e.g., successful completion of the medical procedure).
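  • One possible way to represent the time- or action-limited access privileges described above is sketched below; the AccessGrant class, its fields, and the example durations are assumptions for illustration.

```python
# Sketch of a time- or action-limited access grant, as described in the bullets
# above. Names are illustrative; expiry could equally be tied to completion of a
# medical procedure instead of a date.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class AccessGrant:
    grantee_id: str
    expires_at: Optional[datetime] = None   # None -> no time limit
    completed_action: bool = False          # e.g., procedure finished

    def is_active(self, now: Optional[datetime] = None) -> bool:
        now = now or datetime.utcnow()
        if self.completed_action:
            return False
        return self.expires_at is None or now < self.expires_at

    def extend(self, extra: timedelta) -> None:
        # The patient chooses to extend the grant, e.g., after a notification.
        if self.expires_at is not None:
            self.expires_at += extra

grant = AccessGrant("orthopedist-42", expires_at=datetime.utcnow() + timedelta(days=90))
print(grant.is_active())          # True while the treatment is ongoing
grant.extend(timedelta(days=30))  # patient extends at the final appointment
```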
  • a patient may receive a notification when an entity delegates access to another entity.
  • the patient's primary HCP can refer the patient to a secondary HCP, who might be a specialist.
  • the primary HCP may delegate its access privilege of the patient's medical record to the secondary HCP.
  • the data security management system 1212 may receive a notification from the primary HCP or from the healthcare database system 1220 indicating that the secondary HCP now has access to the patient's record.
  • the patient can overwrite the primary HCP's delegation.
  • the primary HCP may initially allow the secondary HCP to access the patient's full medical record. But the patient may edit this delegation by specifying that the secondary HCP only has access to a portion of his virtual medical records.
  • the patient can revoke the access privileges of the secondary HCP unless the secondary HCP meets certain criteria.
  • the patient can specify that an HCP may need to seek the patient's approval if the HCP wants to delegate access to a certain entity. For example, a patient may require a hospital to inform the patient or obtain the patient's approval if the hospital delegates access to a pharmaceutical representative.
  • the levels of access privileges may include a first level of access privilege and a second level of access privilege.
  • the first level of access privilege may include permissions to view, edit, add, and delete the patient's entire medical record but the first level of access privilege may not include the permission to delegate access.
  • the second level of access privilege may include permission to view the patient's entire medical record.
  • the wearable device (of a patient or an HCP) or the healthcare database system 1220 may store a hierarchy of access privileges.
  • the level of access privilege of a doctor may be higher than that of a nurse, which is in turn higher than the access privilege of a receptionist who schedules office visits. Accordingly, the doctor may be able to view more information of a patient's virtual medical record than the nurse.
  • the doctor may be able to view the images obtained from a patient's MRI exam as well as the date of the MRI exam. But the nurse may only be able to view the date of the MRI exam because the date of the MRI exam may have sufficient information for the nurse to schedule the patient's next visit. And the receptionist may only be able to view the patient's name, contact information, and office visit schedule.
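  • The hierarchy of access privileges described above (doctor above nurse above receptionist) might be modeled as a level per role and a minimum level per record field, as in the sketch below; the role names, field names, and levels are illustrative assumptions.

```python
# Sketch of a hierarchy of access privileges that filters which fields of a record
# are visible to each role. Roles, fields, and levels are examples only.
ROLE_LEVEL = {"receptionist": 1, "nurse": 2, "doctor": 3}

# Minimum level required to see each field of the (illustrative) record.
FIELD_LEVEL = {
    "name": 1, "contact": 1, "visit_schedule": 1,
    "mri_exam_date": 2,
    "mri_images": 3, "diagnosis": 3,
}

def visible_fields(record: dict, role: str) -> dict:
    level = ROLE_LEVEL.get(role, 0)
    return {k: v for k, v in record.items() if FIELD_LEVEL.get(k, 99) <= level}

record = {"name": "Patient A", "contact": "...", "visit_schedule": "...",
          "mri_exam_date": "2017-06-01", "mri_images": "<binary>", "diagnosis": "..."}
print(visible_fields(record, "nurse").keys())
# dict_keys(['name', 'contact', 'visit_schedule', 'mri_exam_date'])
```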
  • the levels of access privileges may be associated with one or more access criteria.
  • the access criteria may be based on characteristics of virtual medical records.
  • the characteristics of the virtual medical records may include a type, content, date, entities which created the medical record, etc.
  • the patient may only allow an entity to edit his medical records created in the past 12 months.
  • the patient may only allow his insurance provider to view the records associated with the patient's surgery at a certain hospital.
  • the patient may set a portion of the medical record as private (to which, for example, only the patient himself has access) if the portion includes certain words.
  • the access privilege may automatically be attached if the characteristics of the virtual medical record meet a certain access criterion. For example, where the patient allows an HCP to access his medical records created within the past 12 months, a new medical image associated with the patient's exam today may automatically become accessible to the HCP.
  • the access criteria may also be based on the characteristics of an entity, such as, e.g., a type of an entity (e.g., a hospital v. a pharmaceutical sales representative), a location of the entity (e.g., whether the entity is outside or inside a certain geographical region), etc.
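  • The automatic attachment of access privileges based on record characteristics could look like the following sketch, where a patient-defined criterion (records created in the past 12 months) grants a hypothetical HCP access to matching records; the predicate and field names are assumptions.

```python
# Sketch of automatically attaching an access privilege when a record's
# characteristics satisfy a patient-defined access criterion.
from datetime import datetime, timedelta

def created_within(months: int):
    def criterion(record: dict) -> bool:
        return datetime.utcnow() - record["created_at"] < timedelta(days=30 * months)
    return criterion

def grants_for(record: dict, rules: list) -> set:
    # rules: list of (criterion, grantee_id); a matching rule attaches the privilege.
    return {grantee for criterion, grantee in rules if criterion(record)}

rules = [(created_within(12), "hcp-clinic-a")]
new_image = {"type": "exam_image", "created_at": datetime.utcnow()}
old_note = {"type": "note", "created_at": datetime.utcnow() - timedelta(days=800)}
print(grants_for(new_image, rules))  # {'hcp-clinic-a'} -> automatically accessible
print(grants_for(old_note, rules))   # set() -> not accessible under this rule
```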
  • a wearable device of an authorized user can monitor the user's activities to determine whether to continue to display the virtual medical records.
  • the wearable device can automatically save the medical records and stop displaying the records after the wearable device detects that the user no longer wears the wearable device.
  • This advantageously can prevent an unauthorized person from wearing the wearable device and viewing the patient's medical records.
  • the wearable device can use the eye-tracking camera to image the wearer's eye (or eyes) and compare an iris or retinal scan with a database of authorized users of the wearable device to determine that the patient (and not a third party) is wearing the wearable device. If the third party is wearing the wearable device, the wearable device can stop displaying the medical record, prevent access to the medical record, communicate an alert that there may be unauthorized attempted access to the medical record, etc.
  • the access privileges of a virtual medical record may be associated with a user.
  • the access privileges may also be associated with a computing device.
  • a computer in an exam room may have a different level of access privilege than a computer in a doctor's office.
  • the computer in the exam room may only be able to access information of the patient's previous exams while the computer in the doctor's office may have access to the patient's family medical history as well as the patient's previous exams.
  • a patient can interact with his medical records using the record update system 1214 .
  • the record update system 1214 may be configured to include functionalities similar to those of the data processing system 1236 and the data management system 1234 .
  • the record update system 1214 can transcribe the user's conversation with his doctor and communicate the transcribed conversation for storage by the medical data store 1222 .
  • the patient can input, via the 3D user interface of his wearable device, his medical histories.
  • the record update system 1214 can add the inputted medical histories to the patient's virtual medical records stored in the healthcare database system 1220 .
  • before the patient's medical records are submitted to an insurance company for reimbursement, the medical records can be reviewed or approved by an HCP.
  • the HCP can review, update, or approve the medical records using the HCP system 1230 .
  • the healthcare database system 1220 can be configured to store and manage medical related data such as, e.g., virtual medical records of patients as well as information related to medical exams/procedures (e.g., processes of performing a certain medical exam, medical equipment/instruments required in a medical exam or surgery, etc.).
  • the healthcare database system 1220 may be implemented as part of one or more servers.
  • the healthcare database system 1220 includes a centralized healthcare database for users of the wearable devices.
  • the healthcare database system 1220 can include a medical data store 1222 for storing the patients' virtual medical records.
  • the virtual medical records are owned by a patient (rather than an HCP).
  • the patient can control who accesses or modifies his medical record and can ensure that his medical record is complete, because any updates made by an HCP will be to the patient's medical record rather than to a separate, HCP-owned record.
  • the patient can manage levels of access privileges associated with his virtual medical records.
  • the healthcare database system 1220 can include control instructions for adding, editing, accessing, or organizing virtual medical records.
  • the database system 1220 can receive an input from a user of a wearable device.
  • the user may be an HCP or a patient.
  • the input may include an update (such as, e.g., by adding, deleting, editing) to a virtual record.
  • the healthcare database system 1220 can identify the virtual record that needs to be updated and implement the update accordingly.
  • the healthcare database system 1220 can receive an access privilege setting from a patient system 1210 .
  • the healthcare database system 1220 can automatically attach the access privilege setting to the patient's medical records.
  • the healthcare database system 1220 can receive a request from an HCP system 1230 to retrieve a virtual medical record.
  • the healthcare database system 1220 can identify the virtual medical record based on the request and return the virtual medical record to the HCP system 1230 .
  • the healthcare database system 1220 checks whether the HCP system 1230 meets a required level of access privilege associated with the virtual medical record.
  • the healthcare database system 1220 can return the virtual medical record if the HCP system 1230 meets the required level of access privilege.
  • the healthcare database system 1220 may return only the portion of the virtual medical record that is accessible by the HCP system 1230 .
  • the HCP system 1230 may request all information associated with a patient's exam. However, the HCP system 1230 may only be allowed to view the exam date and location but not allowed to view the images of the exam. Accordingly, the healthcare database system 1220 may return only the date and location of the exam to the HCP system 1230 while not providing the images of the exam to the HCP system 1230 .
  • the healthcare database system 1220 can also organize data stored in the medical data store 1222 , or data stores associated with the HCP systems 1230 or patient systems 1210 .
  • Data can be organized in a variety of ways. For example, data may be organized based on patients, HCPs, medical settings, types of procedure, user access permissions/privacy settings, locations, action items associated with the data, etc., alone or in combination.
  • virtual medical records of the same patient or the same HCP may be grouped together.
  • data associated with radiology may be stored together while data associated with surgeries may be compiled together.
  • medical records of patients having cardiovascular surgeries may be organized together.
  • the healthcare database system 1220 can also manage data based on user access permissions/privacy settings. For example, the data marked as private by the patient may be segregated from the rest of the data.
  • the healthcare database system 1220 can also implement extra security features (such as, e.g., extra layers of password authentication, or requiring authentication by biometric information (e.g., iris or retinal security features)) in order to access the data that are marked as private.
  • the healthcare database system 1220 can manage the data based on time or location (e.g., based on location information obtained by GPS sensors in the wearable device).
  • the location may be the place where the data is acquired, the position of the patient, etc.
  • the healthcare database system 1220 may include distributed data storage and the healthcare database system 1220 can store the data close to the geographical location of where the data is acquired.
  • a patient may receive an X-ray in a hospital in southern California.
  • the healthcare database 1220 can store the information associated with the X-ray in a data store in California even though the patient may live outside of California.
  • the healthcare database system 1220 can also organize data based on location of the patient. In the X-ray example, assuming the patient lives in New York, the healthcare database system 1220 may store the data associated with the X-ray at a data store close to New York rather than at a data store in California.
  • the action items associated with data may include whether any follow-ups are needed, where and who will perform the follow-up, what types of follow-ups are needed, etc.
  • the action items associated with data may additionally or alternatively include notifications (e.g., alerts, reminders, etc.) to the user relating to follow-ups, etc.
  • the primary care physician may refer the patient to see a specialist.
  • the action item of the patient's virtual medical record may include scheduling an exam with a specialist located in a certain hospital.
  • the healthcare database system 1220 can group data based on the action items. For example, information of all patients requiring an imaging exam at a certain hospital may be compiled together by the healthcare database system 1220 .
  • the healthcare database system 1220 may include at least a portion of the data processing system 1236 , the data management system 1234 , the data security management system 1212 , or the record update system 1214 , etc.
  • the wearable device of an HCP may record a conversation between the HCP and a patient.
  • the wearable device can communicate the recorded conversation to the healthcare database system 1220 .
  • the healthcare database system 1220 may use its data processing system to analyze the recorded conversation.
  • FIGS. 13A, 13B, 13C, and 13D illustrate example processes for interacting with a healthcare database system.
  • the example processes in FIGS. 13A-13D may be performed by the healthcare database system 1220 shown in FIG. 12 or by a wearable device.
  • the healthcare database system or the wearable device can manage medical related data, such as, e.g., controlling the storage of the medical related data in one or more data stores.
  • a user of a wearable device may want to access the data and send a request to the healthcare database system for accessing the data.
  • the healthcare database system can receive a request to access the data from the wearable device.
  • the healthcare database system can parse the wearable device's request and determine whether the request meets the required access privilege associated with the data.
  • the access privileges may determine whether the user of the wearable device can edit, add, or view the data.
  • the wearable device can display virtual content.
  • the wearable device can receive the requested data from the healthcare database system.
  • the wearable device can display the requested data as virtual content in the 3D user interface of the wearable device.
  • the wearable device may have permissions to view only a portion of the data. As a result, the wearable device may present, on the 3D user interface, only the portion of the data for which it has permission.
  • the user of the wearable device can edit data in the healthcare database system or add data to the healthcare database system.
  • the wearable device may communicate with the healthcare database system to verify whether the user has the required access privileges before permitting the user to perform the operations in block 1340 .
  • blocks 1340 and 1350 are optional and may be skipped by the method.
  • the healthcare database system can automatically update its data.
  • wearable device can send an input received from a user.
  • the input may include a new virtual medical record or an update to an existing virtual medical record.
  • the healthcare database system can receive an input from the wearable device.
  • the healthcare database can automatically initiate storage of the new virtual medical record or update the existing virtual medical record.
  • the healthcare database system may return to the block 1320 to determine whether the user is still authorized to access the data. If authorized, the method continues with the subsequent blocks described above, but if not authorized, the method can terminate (e.g., the method can cease displaying the virtual content to the user).
  • FIG. 13B illustrates an example subprocess for block 1320 in FIG. 13A .
  • the user requests access to a medical file.
  • the medical file may be stored in the medical data store of the healthcare database system.
  • the user's wearable device may send a request for accessing the medical file to the healthcare database system.
  • the healthcare database system or the user's wearable device can verify the user's identity by techniques described with reference to FIG. 14A .
  • the user's identity can be verified based on the user input (such as e.g., username and password) or the user's biometric information.
  • the healthcare database system or the wearable device can determine an owner or patient identifier (ID) associated with the requested medical file.
  • the healthcare database system or wearable device can use the owner or patient ID to retrieve the access privileges associated with the medical file.
  • a patient may set his medical records to be viewable by all hospitals.
  • the patient may set his medical records to be viewable by any HCP in an emergency situation. Accordingly, if the user of the wearable device is a doctor in a hospital, the wearable device may grant access to the doctor.
  • the healthcare database system can deny the user's access if the owner of the medical file has not given the user access.
  • the owner of the medical file may be the patient associated with the medical file.
  • the healthcare database system can grant access to the relevant portion of the medical file if the owner has given permission to the user. For example, although a nurse at a clinic may have access to all medical records of the patient, the wearable device of the nurse may only display the patient's name, address, and social security number for the nurse to book the next appointment for the patient, because the nurse does not need other information, such as images from previous medical exams, to book the appointment.
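  • The access-verification subprocess described in the bullets above (verify the requester's identity, determine the owner, look up the owner's access settings, then grant only the relevant portion or deny) is sketched below; every helper name and data shape is a hypothetical placeholder rather than an API from the patent.

```python
# Hedged sketch of the FIG. 13B-style flow: verify the requester, look up the
# owner's access settings for the file, and either deny the request or return only
# the relevant portion of the medical file.
def handle_file_request(requester_id, credentials, medical_file, verify_identity,
                        lookup_privileges, select_relevant_portion):
    if not verify_identity(requester_id, credentials):        # e.g., iris/retina scan
        return {"status": "denied", "reason": "identity not verified"}
    owner_id = medical_file["owner_id"]
    privileges = lookup_privileges(owner_id, requester_id)     # set by the patient
    if not privileges:
        return {"status": "denied", "reason": "owner has not granted access"}
    portion = select_relevant_portion(medical_file, privileges)
    return {"status": "granted", "data": portion}

result = handle_file_request(
    "nurse-7", "iris-code-ok",
    {"owner_id": "patient-001", "name": "Patient A", "mri_images": "<binary>"},
    verify_identity=lambda uid, cred: cred == "iris-code-ok",
    lookup_privileges=lambda owner, uid: {"name"},
    select_relevant_portion=lambda f, privs: {k: f[k] for k in privs if k in f},
)
print(result)  # {'status': 'granted', 'data': {'name': 'Patient A'}}
```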
  • FIG. 13C illustrates an example subprocess for block 1330 in FIG. 13A .
  • the wearable device can determine the user ID and the owner/patient ID from access verification (described with reference to the subprocess 1320 in FIG. 13B ).
  • the system can determine the time or location.
  • the wearable device can use data from its environmental sensors to determine the current location of the user.
  • Block 1334 can be performed in addition to or in alternative to block 1332 .
  • a default display screen appears.
  • the default display screen can present information associated with a user (such as, e.g., the user's scheduled appointments) or a patient of the user (such as, e.g., the patient's medical records).
  • the wearable device can present the display screen based on contextual information. For example, the wearable device can present the display screen based on medical file requested or based on the user's location.
  • the wearable device can display medical images in different depth planes of the display system described in FIG. 2A . The most recent images may be presented at the depth plane that appears to be closest to the user while the earlier images may be presented at a depth plane farther away from the user.
  • the user can modify the display as needed. For example, the user can move the earlier images to a closer depth plane using poses or the user input device 466 .
  • FIG. 13D illustrates example subprocesses 1340 a and 1340 b for block 1340 in FIG. 13A .
  • the wearable device can monitor its environment using one or more environmental sensors.
  • the wearable device's microphone may be continuously acquiring audio data in the environment while the user is wearing the wearable device.
  • the wearable device can recognize a data capture trigger.
  • the data capture trigger may be a keyword, a gesture, an object or an input from a user input device.
  • the trigger can cause the wearable device to start capturing the data with one or more environmental sensors.
  • the wearable device can continuously capture the data until capture requirements are met.
  • Example capture requirements are described with reference to termination conditions in FIG. 16 .
  • the healthcare provider can activate a data capture trigger by saying, e.g., “take picture 1 ”.
  • the healthcare provider is the user of the wearable device and the trigger comprises the keyword “take picture 1 .”
  • the wearable system (e.g., the healthcare provider's wearable device) can take a first picture based on the trigger. Since the trigger involves picture number 1 , the wearable device can label the picture taken by the wearable device as number 1 . In addition, because the picture includes a tumor of the patient, the wearable device can analyze the picture (using the object recognizers 708 for example) and identify the tumor in the picture. Accordingly, the wearable device can automatically send an instruction to the healthcare database system to place the picture in the tumor resection part of the patient's file. The wearable device may also send other information (e.g., the time when the picture was taken, the location where the picture was taken, the user who took the picture, etc.) to the healthcare database system. The data capture may be complete when the single image is taken.
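  • Recognizing a spoken data-capture trigger such as "take picture 1" could be approximated as in the sketch below; the regular expression and the returned structure are illustrative assumptions rather than the device's actual trigger grammar.

```python
# Sketch of turning a spoken trigger phrase like "take picture 1" into a labeled
# capture request that the wearable could act on and forward to the database.
import re

TRIGGER = re.compile(r"\btake picture (\d+)\b", re.IGNORECASE)

def parse_capture_trigger(utterance: str):
    match = TRIGGER.search(utterance)
    if match is None:
        return None
    return {"action": "capture_image", "label": int(match.group(1))}

print(parse_capture_trigger("Okay, take picture 1"))  # {'action': 'capture_image', 'label': 1}
print(parse_capture_trigger("How are you feeling?"))  # None -> keep monitoring
```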
  • a virtual medical record can be associated with an access privilege.
  • the patient can set an access privilege for his medical related data using the data security management system 1212 .
  • the patient, and a designated person with permission from the patient, can access and grant permissions to others for accessing all or a portion of the patient's virtual medical records.
  • FIG. 14A illustrates an example of accessing a virtual medical record based on an access privilege associated with the virtual medical record.
  • a surgeon 1402 is operating on a patient's heart 1406 in an operating room. While operating on the patient's heart, the surgeon 1402 can view the patient's physiological data acquired by environmental sensors. The surgeon 1402 can wear a wearable device and can view the patient's physiological data as summarized by the virtual element 1404 shown on the surgeon's wearable device's display.
  • the surgeon may have two medical students: the student 1412 and the student 1414 . However, the students 1412 and 1414 do not have permissions to view the patient's physiological data.
  • the students' wearable devices may receive an indication that the physiological data cannot be viewed. Accordingly, the wearable devices may provide the notification 1420 indicating that the students do not have permission to access the physiological data.
  • the notification 1420 can be stored in the patient's medical record.
  • the surgeon's wearable device may also provide another notification to the surgeon that the students requested to access the patient's physiological data.
  • the wearable device of the surgeon or a student can provide a focus indicator near the notification 1420 .
  • the focus indicator can comprise a halo, a color, a perceived size or depth change (e.g., causing a virtual object to appear closer and/or larger when selected), a change in a user interface element (e.g., changing the shape of a cursor from a circle to an escalation mark), a message (with text or graphics), or other audible, tactile, or visual effects which draw the user's attention.
  • the wearable device of a user requesting the patient's medical related data can verify the user's identity and determine the user's access privilege to a virtual medical record using a variety of techniques.
  • the wearable device can also require its user to provide certain inputs for verification of the user's identity.
  • the inputs may include a password, answers to security questions, etc.
  • the wearable device can authenticate the user based on whether the inputs are correct.
  • the wearable device can also verify a user's identity based on the user's biometric information. For example, wearable device may use face recognition, fingerprints, iris codes, eye colors, retina patterns, etc. to identify the user.
  • the biometric information may also be used as the password associated with a user's account.
  • the biometric information provides a stronger form of authentication and a higher level of security for data access (than a password in the form of an alphanumeric string) because certain biometric features (such as, e.g., fingerprints, iris patterns, retina patterns) are unique to each person.
  • the wearable device can determine whether a user has access to a virtual medical record based on information associated with the user.
  • the user may be associated with an account identifier (ID).
  • the wearable device can communicate with the healthcare database system 1220 to determine whether the account ID is within a list of approved account IDs associated with the patient's data.
  • the wearable device can use the account ID to access characteristics of the user and determine whether the characteristics of the user satisfy the access criteria set by the patient.
  • the patient may be associated with a patient ID (also referred to as owner ID).
  • the wearable device of the user can retrieve the access criteria or the list of approved account IDs by querying the healthcare database system 1220 using the patient ID.
  • the user's wearable device can shine light into an eye of the user to perform retina recognition (also referred to herein as retinal scanning).
  • the wearable device can identify a unique pattern associated with the user's retina. The unique pattern may be used to determine the user's identity. For example, the retina pattern can be used as a password.
  • the wearable device can determine whether the identified retina pattern matches the retina pattern associated with the user account ID seeking access to the virtual medical record.
  • the wearable device can verify a user's ID together with the healthcare database system 1220 (shown in FIG. 12 ).
  • the wearable device can take an image of the user's eyes.
  • the healthcare database system 1220 can perform retina scanning and authenticate the user's identity based on a retina pattern identified from the retina scanning.
  • once the user's identity is verified, the user may share a virtual medical record with another user or receive a virtual medical record shared by the other user.
  • the wearable device can determine whether the user has access to the virtual medical record, which portion of the virtual medical record the user has access to, and whether the user can edit the virtual medical record.
  • the wearable device can determine the user's access privileges for other healthcare-related virtual content, such as a hospital's appointment scheduling system, a status of a current surgery, information associated with medical equipment (e.g., whether the equipment is in use), etc.
  • the wearable device can present to the user only the virtual content that the user has access to.
  • increased data privacy can be achieved by presenting the virtual content with a wearable device based on the user's level of access privilege.
  • the wearable device can reduce the need to display information on a computer monitor or on paper by presenting the private information on the head-mounted display.
  • the visitors may not be able to get access to private patient information that would previously have appeared on paper or a computer monitor unless the visitors have the required level of access privilege.
  • Some embodiments of the wearable device may analyze images of an iris of the wearer to determine an iris code (such as, e.g., the Daugman iris code described in U.S. Pat. No. 5,291,560) and perform account authorization and access to patient medical records, in addition to a retinal scan or as an alternative, using the iris code.
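  • As a hedged illustration of iris-code matching, the sketch below compares binary iris codes with a normalized Hamming distance (the standard comparison for Daugman-style iris codes) and accepts a wearer whose code falls under a threshold; the code length, threshold, and enrolled values are arbitrary examples.

```python
# Illustrative iris-code comparison: two binary codes are compared with a normalized
# Hamming distance and accepted as a match below a threshold. Values are examples.
def hamming_distance(code_a: str, code_b: str) -> float:
    assert len(code_a) == len(code_b)
    mismatches = sum(a != b for a, b in zip(code_a, code_b))
    return mismatches / len(code_a)

def is_authorized(probe_code: str, enrolled_codes: dict, threshold: float = 0.32):
    # Return the account ID of the closest enrolled user, if close enough.
    best = min(enrolled_codes.items(),
               key=lambda kv: hamming_distance(probe_code, kv[1]))
    account_id, enrolled = best
    return account_id if hamming_distance(probe_code, enrolled) <= threshold else None

enrolled = {"patient-001": "1011001110", "hcp-dr-smith": "0100110001"}
print(is_authorized("1011001010", enrolled))  # 'patient-001' (1 bit differs)
print(is_authorized("1111111111", enrolled))  # None -> unauthorized wearer
```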
  • the user may request another entity to grant permissions.
  • the other entity may be the owner of the virtual medical record.
  • the user's wearable device can automatically send a request to the wearable device of the patient owning the virtual medical record asking the patient to grant access.
  • the other entity may also include a user who can delegate access.
  • the level of access privileges associated with an entity may change based on the patient's interactions with the entity. For example, a front desk receptionist in a doctor's office may be in charge of scheduling patients. When a new patient visits the doctor's office, the front desk receptionist may initially have access to the new patient's basic information such as name and date of birth. After the new patient's first consultation with the doctor, the doctor can increase the receptionist's access privilege to allow the receptionist to view more information about the patient, such as, e.g., the patient's diagnosis, treatment, procedure information, etc. The doctor can update the receptionist's access privilege using the doctor's wearable device.
  • the doctor can use the user input device 466 or poses to add the account ID associated with the receptionist wearable device to the list of account IDs associated with a higher level of access privilege.
  • the receptionist can obtain more information of the patient to appropriately schedule the patient's next visit.
  • the receptionist can determine the duration and the type of the next appointment based on the doctor's diagnosis.
  • the receptionist can accordingly reserve the appropriate exam room and book the correct time slot for the next appointment.
  • the healthcare database system 1220 , alone or in combination with another user device, can also perform the verification and determine the access levels, in addition to or as an alternative to the wearable device.
  • FIG. 14B illustrates a flowchart that shows an example process 1480 for accessing a virtual medical record based on an access privilege.
  • the process 1480 may be performed by the healthcare database system 1220 (shown in FIG. 12 ).
  • the healthcare database management system receives a request from an HCP system to access a virtual medical record.
  • the virtual medical record may be owned by the patient associated with the virtual medical record.
  • the virtual medical record may be stored in the medical data store 1222 .
  • the healthcare database management system determines an access privilege associated with the virtual medical record. For example, the healthcare database management system can access the medical data store 1222 to identify the access privilege. In some embodiments, the healthcare database system 1220 can identify the characteristics of the virtual medical record and determine which level of access privilege should be attached to the virtual medical record based on the access criteria (described in FIG. 12 ) provided by the user.
  • the healthcare database system can determine whether the HCP system is authorized to access the virtual medical record. For example, the healthcare database system can verify the identity of the HCP and determine whether the HCP has the access privilege required for retrieving the virtual medical record.
  • the healthcare database system communicates the virtual medical record to a wearable device associated with the HCP for presentation.
  • where the HCP is authorized to access only a portion of the virtual medical record, the healthcare database system may return only the portion of the virtual medical record that the HCP system has permission to access.
  • the healthcare database system can provide a notification (such as, e.g., the notification 1420 shown in FIG. 14A ) indicating that the HCP system is not authorized to access the virtual medical record.
  • the healthcare database system can also provide a notification to the HCP system indicating that the HCP system is not authorized to modify the virtual medical record.
  • the healthcare database system can make a note in the medical record that the HCP system has viewed or updated the virtual medical record.
  • the note may also include information regarding where, when, or for how long the HCP system has viewed the virtual medical record.
  • the note may also include the portions that the HCP system has viewed or updated in the virtual medical record.
  • Such notes may be stored in an access log in the medical record or in another database.
  • the access log may have limited edit access privileges so that an unauthorized user cannot edit a patient medical record and then edit the access log to remove any trace of the unauthorized access to cover his tracks.
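  • The access log described above could be kept append-only so that access traces cannot be erased, as in the sketch below; the class name, entry fields, and storage choice are assumptions for illustration.

```python
# Sketch of an append-only access log: every view or update of a record is appended
# with who, what, where, and when, and the log exposes no delete or update methods,
# so an unauthorized editor cannot erase traces of access.
from datetime import datetime

class AccessLog:
    def __init__(self):
        self._entries = []   # intentionally no delete/update methods

    def record(self, account_id: str, action: str, portion: str, location: str) -> None:
        self._entries.append({
            "account_id": account_id,
            "action": action,          # e.g., "viewed", "updated"
            "portion": portion,        # which part of the medical record
            "location": location,
            "timestamp": datetime.utcnow().isoformat(),
        })

    def entries(self):
        # Return a copy so callers cannot mutate the underlying log.
        return list(self._entries)

log = AccessLog()
log.record("hcp-dr-smith", "viewed", "mri_images", "exam room 3")
print(len(log.entries()))  # 1
```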
  • Multiple entities are often involved in a patient's visit to an HCP. For example, during a patient's visit to a clinic, a patient may first fill out an intake form. A nurse may input the patient information into a computer system based on the intake form and retrieve additional information associated with the patient, such as, e.g., the patient's allergies, information of the patient's previous clinic visits, other medical information, etc. The doctor may see the patient and take notes based on the conversations and exams performed on the patient. However, the information obtained by the HCPs may not be complete. For example, the doctor's notes may miss a portion of the conversation with the patient. In addition, there are often delays between when a patient visits a clinic and when the patient's medical record is updated to reflect the information obtained during the patient's visit. The delay may be because the medical record needs to be updated by multiple entities and some entities may only communicate the updates periodically.
  • a wearable device can be configured to document an interaction between a patient and an HCP, process information acquired during the interaction, and automatically update the patient's medical record in substantially real-time as the patient is interacting with the HCP.
  • the interaction between the patient and the HCP may include a conversation between the patient and the HCP, an exam or operation performed on the patient, information provided by the patient to the HCP (such as, e.g., medical histories), information provided by the HCP to the patient (such as, e.g., a diagnosis), etc., in combination or the like.
  • the wearable device can update the patient's virtual medical records based on information obtained during the interaction between the patient and the HCP. For example, the wearable device can add medical images in an exam, a patient's symptoms (as extracted based on the conversation between the patient and the doctor), or a doctor's notes and diagnosis, etc., to the patient's medical record.
  • the wearable device can use one or more environmental sensors (described with reference to FIGS. 2A and 2B ) to record data of an interaction between a patient and an HCP.
  • Data of the interaction may include audio and visual data.
  • the wearable device can process the recorded data (such as, e.g., using the local processing module 260 , the remote processing module 270 , the data processing system 1236 ) to extract relevant information.
  • the wearable device can use a dictation program (which may implement the various speech recognition techniques described with reference to FIG. 12 ) to identify words spoken by the patient or the HCP in the recorded conversation.
  • the wearable device can also use one or more object recognizers 708 to identify an object or a person involved in the interaction. For example, the wearable device may automatically detect the patient's face using a facial recognition algorithm. As another example, the wearable device can detect a portion of the user's body (such as a limb with which the patient is experiencing problems) using a computer vision algorithm. As yet another example, the wearable device can use text recognition techniques to extract text from the images acquired by the outward-facing imaging system 464 .
  • the wearable device can identify a portion of the recorded data as the relevant portion.
  • the recorded data may include the entire conversation between the patient and the doctor during a patient's visit. However, the beginning of the conversation may involve exchanges of pleasantries, which is not relevant to diagnosing the patient's disease.
  • the wearable device may be configured to identify which portion of the conversation is not relevant to the descriptions of symptoms or medical information.
  • the wearable device can use various machine learning algorithms to determine which portion is relevant (or irrelevant). For example, the wearable device can apply a machine learning model which is trained to identify certain keywords in the audio data or word descriptions. As another example, the wearable device can apply a computer vision algorithm to identify a portion of an image which includes a relevant object (such as the portion of the patient's body that has a lump).
  • the wearable device can also use the various machine learning algorithms to identify a symptom of the patient (e.g., via the patient's speech or from images of the patient/environment as captured by the outward-facing imaging system 464 of the wearable device).
  • the wearable device can provide a notification regarding the identified symptom to the user. For example, if the user does not notice that the patient winces from pain as the user performs a physical exam on the patient's left leg during the exam (e.g., there is no conversation between the user and the patient mentioning the pain on the patient's left leg), the wearable device can make a note in the medical file or provide a notification (e.g., a visual alert) that there might be tenderness on the patient's left leg.
  • the wearable device may be configured not to process the irrelevant portion. For example, the wearable device can identify a portion of the audio data that involves a discussion of the weather. The wearable device may choose not to apply the speech recognition to this portion of audio data. As another example, after the wearable device has transcribed the audio data, the wearable device may mark the portion of the descriptions involving the patient's symptoms as the relevant portion. Based on the transcription, the wearable device can apply a machine learning model to the relevant portion to identify key points in the descriptions of the patient's symptoms. However, the wearable device may be configured not to apply the machine learning model to the discussion of weather.
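  • As an illustration of the relevance filtering described above, the following minimal sketch (an assumption for illustration only, not the patented implementation; the keyword set and helper functions are hypothetical) flags transcript segments that mention medical terms so that later, more expensive processing can skip small talk:

```python
# Minimal sketch: mark transcript segments as relevant when they contain
# illustrative medical keywords, so later processing (e.g., a trained model)
# can skip pleasantries such as a discussion of the weather.

MEDICAL_KEYWORDS = {"pain", "hurts", "sick", "symptom", "diabetes", "cancer",
                    "medication", "allergy", "swelling", "lump", "arthritis"}

def split_segments(transcript: str) -> list[str]:
    """Split a dictated transcript into sentence-like segments."""
    return [s.strip() for s in transcript.replace("?", ".").split(".") if s.strip()]

def relevant_segments(transcript: str) -> list[str]:
    """Return only the segments that mention a medical keyword."""
    return [s for s in split_segments(transcript)
            if any(word in s.lower() for word in MEDICAL_KEYWORDS)]

if __name__ == "__main__":
    text = ("Nice weather today. My left hip pain started a week ago. "
            "My mom has arthritis.")
    print(relevant_segments(text))
    # ['My left hip pain started a week ago', 'My mom has arthritis']
```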
  • FIG. 15 illustrates an example of recording and processing audio data associated with an interaction between a patient 1502 and an HCP.
  • the HCP may be a doctor who may wear a wearable device.
  • the wearable device can execute a dictation program in the background.
  • the dictation program may be an embodiment of the data processing system 1236 described in FIG. 12 .
  • the dictation program can receive audio data from the microphone 232 of the wearable device.
  • the dictation program can monitor the content of the audio data but only record a portion of the conversation between the patient and the HCP. In other embodiments, the dictation program records the entire conversation.
  • the wearable device can record (e.g., via outward-facing imaging system 464 alone or in combination with the microphone 232 ) a portion of the interaction between the patient and the HCP.
  • the wearable device can use various techniques to determine which portion of the interaction should be recorded. For example, the wearable device can start recording in response to a determination that an initiation condition is present and stop recording when a termination condition is present. Additionally or alternatively, the wearable device may process images of the patient taken during the procedure (e.g., by the outward-facing imaging system). The images (or the analysis of the images) can be stored in the patient medical record, which can provide for a more inclusive and accurate record of the procedure. For example, the wearable device may analyze the images showing a patient's broken limb (e.g., to determine an angle of the break) or the color of the patient's skin (e.g., to determine presence of a rash or overheating).
  • the initiation condition and the termination condition may include a keyword, an input from a user input device 466 , or a pose (such as, e.g. a head pose, a hand gesture, or another body pose).
  • the keyword, the input from the user input device, or the pose may be referred to as an activation command.
  • the keyword may include: “sick,” “pain,” “hurts,” “diabetes,” “cancer,” “stop,” “start,” etc.
  • the keyword may include default keywords that are pre-programmed into the wearable device, words designated by the user of the wearable device, words identified using a machine learning algorithm, alone or in combination.
  • the wearable device can detect keywords and apply voice or speech recognition techniques to identify the presence of an initiation condition.
  • An HCP may designate an initiation condition to be certain keywords said by a specific user. For example, a doctor may designate the phrase “start documenting” said by the doctor himself as the initiation condition. Accordingly, the doctor's wearable device can start recording the audio data received on the microphone (optionally accompanied by a visual recording obtained by the outward-facing camera) if the wearable device detects that the doctor (and not the nurse) says “start documenting.” Likewise, the doctor's wearable device can stop recording the audio (or visual) when the wearable device detects a phrase such as “stop documenting.”
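  • A minimal sketch of such activation-command handling follows (the phrases, the speaker restriction, and the class names are illustrative assumptions, not the patent's specification):

```python
# Minimal sketch, assuming utterances arrive already transcribed and tagged with
# a speaker identity; the phrases and the speaker restriction are illustrative
# defaults, not the patent's specification.

from dataclasses import dataclass
from typing import Optional

@dataclass
class ActivationConfig:
    start_phrase: str = "start documenting"
    stop_phrase: str = "stop documenting"
    authorized_speaker: Optional[str] = "doctor"   # None allows any speaker

class DocumentationController:
    def __init__(self, config: ActivationConfig):
        self.config = config
        self.recording = False

    def on_utterance(self, speaker: str, text: str) -> None:
        """Called for every recognized utterance (speaker id + transcribed text)."""
        if self.config.authorized_speaker and speaker != self.config.authorized_speaker:
            return                      # ignore activation phrases from other speakers
        lowered = text.lower()
        if self.config.start_phrase in lowered:
            self.recording = True       # initiation condition detected
        elif self.config.stop_phrase in lowered:
            self.recording = False      # termination condition detected

controller = DocumentationController(ActivationConfig())
controller.on_utterance("nurse", "start documenting")    # ignored: wrong speaker
controller.on_utterance("doctor", "Start documenting")   # recording begins
assert controller.recording
```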
  • the wearable device can provide context associated with the detected keywords.
  • the local processing and data module 260 of the wearable device may include a buffer. As the wearable device monitors the user's environment, the audio/visual data of the interaction may be temporarily stored in the buffer. Once the wearable device detects the presence of the initiation/termination condition, the wearable device can retrieve (e.g., from the buffer) and analyze the audio/visual data before/after the activation command to provide context for the keyword. As another example, when the wearable device detects the presence of the activation command, the wearable device may initiate storage of a portion of the audio/visual data before and after the activation command.
  • the wearable device can record (or initiate storage of) a conversation 2 minutes before the activation command or 5 minutes after the activation command.
  • the recorded conversation may be processed and stored for an HCP's later view.
  • the wearable device can provide the context of the keywords.
  • the HCP can later edit the text if needed, but the context may help the HCP remember what was said by the patient.
  • the wearable device can later erase or overwrite the audio/visual data that is acquired more than 2 minutes before the activation command or more than 5 minutes after the activation command, for example.
  • the doctor's wearable device can record an entire conversation between the doctor and the patient in the wearable device's memory. Once the wearable device detects an initiation condition or termination condition, the wearable device can retrieve and present a context, including some of the conversation before and after the initiation condition or termination condition, to the doctor. The doctor can review or edit the context and determine whether the context should be documented by the wearable device.
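  • One possible realization of this pre-/post-activation context is a rolling buffer; the sketch below is illustrative only (frame size and window lengths are assumptions drawn from the 2-minute/5-minute example above):

```python
# Illustrative only: a rolling pre-roll buffer of recent audio/visual frames so
# that, once an activation command is detected, roughly 2 minutes of context
# before the command and a window after it are retained, as described above.
# The one-second frame size is an assumption for the example.

from collections import deque

FRAME_SECONDS = 1.0
PRE_ROLL_SECONDS = 120     # ~2 minutes before the activation command
POST_ROLL_SECONDS = 300    # ~5 minutes after the activation command

class ContextBuffer:
    def __init__(self):
        self.pre_roll = deque(maxlen=int(PRE_ROLL_SECONDS / FRAME_SECONDS))
        self.captured = []             # frames kept once activation is detected
        self.post_frames_left = 0

    def on_frame(self, frame: bytes) -> None:
        if self.post_frames_left > 0:
            self.captured.append(frame)       # inside the post-activation window
            self.post_frames_left -= 1
        else:
            self.pre_roll.append(frame)       # older frames silently fall off

    def on_activation(self) -> None:
        """Freeze the pre-roll as context and open the post-activation window."""
        self.captured = list(self.pre_roll)
        self.post_frames_left = int(POST_ROLL_SECONDS / FRAME_SECONDS)
```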
  • an HCP can actuate a user input device (see, e.g., user input device 466 ) in a variety of ways, such as, e.g., clicking on a mouse, tapping on a touch pad, swiping on a touch screen, hovering over or touching a capacitive button, pressing a key on a keyboard or a game controller (e.g., a 5-way d-pad), pointing a joystick, wand, or totem toward the object, pressing a button on a remote control, or other interactions with the user input device, etc.
  • the HCP can use a totem to turn the dictation program on or off.
  • for example, when the wearable device detects that a button (e.g., on the totem) is pressed, the wearable device may start executing the dictation program. If the wearable device later detects that the same button is pressed again, the wearable device may stop the dictation program.
  • the wearable device can also detect a pose of the HCP to determine whether an initiation condition or a termination condition is present.
  • the wearable device can use data acquired by its environmental sensors to detect the pose of the HCP. For example, the wearable device can use IMUs to determine whether the user's head pose has changed.
  • the wearable device can also apply a computer vision algorithm to the images acquired by the outward-facing imaging system to detect a user's hand gestures.
  • the wearable device can match a detected hand gesture with the gesture associated with the initiation/termination condition.
  • the gesture associated with the initiation/termination condition may be set by the HCP.
  • the healthcare provider may designate “tapping a finger on the left hand” as an initiation condition of recording audio data and “tapping a finger on the right hand” as a termination condition of recording the audio data.
  • the wearable device of the HCP may run the dictation program in the background when the patient is speaking to the HCP.
  • the wearable device may detect, based on the image acquired by the outward-facing imaging system, that the HCP has tapped his right index finger on his left hand. Because this hand gesture is associated with the initiation condition, the wearable device may start recording the audio data using the dictation program as the patient and the HCP converse.
  • when the wearable device detects the termination gesture (e.g., the HCP tapping a finger on his right hand), the wearable device may stop recording the audio data although the dictation program may still be running in the background.
  • the initiation condition and the termination condition may be discreet so that the patient will not notice that he is being recorded or imaged.
  • the keywords for the initiation condition may be the phrase “have pain” instead of the phrase “start recording”.
  • the gesture associated with the initiation condition may be tapping a finger twice on the table rather than waving the HCP's arm in front of the HCP's wearable device.
  • the patient ABC 1502 describes his medical situation to an HCP (not shown in FIG. 15 ).
  • the medical situation may include the patient's symptoms, how long the patient had the symptoms, personal medical history, family medical history, etc.
  • the patient's symptoms involve left hip pain which has lasted for a week.
  • the patient tells the HCP that his mom has arthritis.
  • the dictation program can transcribe the patient's description as the patient speaks.
  • the wearable device of the HCP can generate the virtual item 1520 which summarizes the relevant portions of the patient's description.
  • the wearable device may use one or more environmental sensors or may communicate with another computing system to acquire additional information that was not provided by the patient's description.
  • the wearable device can apply a facial recognition algorithm to detect the patient's face in the image acquired by the outward-facing imaging system to determine the identity of the patient.
  • the wearable device can use the detected face to query the healthcare database system 1220 . If the patient's information is in the healthcare database system 1220 , the healthcare database system 1220 can return the patient's name or other information to the wearable device. In this example, the patient's name is ABC.
  • the wearable device can incorporate the information retrieved from the healthcare database system 1220 into the virtual item 1520 . In this example, the patient's name is added to the virtual item 1520 .
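  • A hedged sketch of this lookup-and-merge step follows; the client class, the face-signature key, and the field names are hypothetical placeholders rather than the healthcare database system's actual API:

```python
# Hedged sketch: look up a patient by a face signature and merge the result into
# a displayed summary item. HealthcareDatabaseClient and the face signature key
# are hypothetical placeholders, not the healthcare database system's API.

from dataclasses import dataclass, field

@dataclass
class VirtualItem:
    fields: dict = field(default_factory=dict)

class HealthcareDatabaseClient:
    """Hypothetical client keyed by a face signature for the example."""
    def __init__(self, records: dict):
        self._records = records

    def lookup_by_face(self, signature):
        # a real system would run a nearest-neighbor search over enrolled faces
        return self._records.get(signature)

def enrich_virtual_item(item: VirtualItem, signature, db: HealthcareDatabaseClient):
    record = db.lookup_by_face(signature)
    if record is not None:
        item.fields["patient_name"] = record["name"]
    return item

db = HealthcareDatabaseClient({"face-123": {"name": "ABC"}})
item = enrich_virtual_item(VirtualItem({"symptom": "left hip pain"}), "face-123", db)
print(item.fields)   # {'symptom': 'left hip pain', 'patient_name': 'ABC'}
```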
  • the patient may wear a wearable device during the examination, and patient information acquired by user sensors on the patient's wearable device may be communicated to the HCP's wearable device or stored in the patient medical record.
  • eye-tracking cameras can determine information about the user's eyes
  • physiological sensors on the wearable device can determine information about, e.g., the patient's heart rate, respiration rate, blood pressure, etc.
  • the virtual item 1520 can be displayed by the wearable device of the HCP. In some situations, the patient 1502 also wears a wearable device. The virtual item 1520 may also be displayed by the patient's wearable device.
  • the recorded or processed data may be added to the patient's medical records.
  • the wearable device can verify the patient's identity using various identity verification techniques described with reference to FIGS. 12-14B before adding the data to the patient's medical records.
  • while the wearable device of the HCP documents the interaction between the patient and the HCP, the wearable device of the patient can also document the interaction, alone or in combination with the wearable device of the HCP.
  • for example, the patient can dictate his medical situation using a dictation program on his wearable device.
  • the HCP may document the interactions by typing the patient's descriptions using a virtual or physical keyboard of the wearable device.
  • the HCP can also document the interactions by imaging the interactions using the outward-facing imaging system of the HCP's wearable device.
  • the wearable device of an HCP can obtain an image of the patient using one or more environmental sensors.
  • the wearable device can take a picture or videotape the patient using the outward-facing imaging system 464 .
  • the wearable device can provide a hands-free way of recording and thereby minimize sterility issues.
  • the wearable device can use data acquired by one or more environmental sensors to determine whether the user of the wearable device has entered into an operating room.
  • the wearable device may make the determinations using object recognizers 708 to identify items that are typically present in the operating room or identify a sign which states “operating room”.
  • wearable device can record audio/visual data of a procedure occurring in the operating room.
  • the wearable device can take pictures or videotape the user's surroundings every few seconds until a termination condition is detected.
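  • A sketch of this hands-free capture loop is shown below under stated assumptions (the recognizer, camera, and termination check are stand-in callables; the operating-room cue list is illustrative):

```python
# Hedged sketch: if the object recognizers report items typical of an operating
# room (or an "operating room" sign), capture an image every few seconds until a
# termination condition is detected. The recognizer, camera, and termination
# check are stand-in callables; the cue list is illustrative.

import time

OPERATING_ROOM_CUES = {"surgical table", "anesthesia machine", "operating room sign"}

def document_procedure(recognize_objects, capture_image, termination_present,
                       interval_seconds=5.0, max_frames=1000):
    frames = []
    if not (set(recognize_objects()) & OPERATING_ROOM_CUES):
        return frames                       # not in an operating room; do nothing
    for _ in range(max_frames):             # bounded, hands-free periodic capture
        frames.append(capture_image())
        if termination_present():
            break
        time.sleep(interval_seconds)
    return frames
```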
  • the wearable device can help an HCP avoid contacting unsterile devices (such as, e.g., personal devices, camera, or smartphones) in a medical procedure/exam.
  • the wearable device can automatically update the patient's virtual medical records (such as, e.g., patient charts) based on the recorded audio/visual data, and thereby reduce the need, present in traditional techniques, to upload data from personal devices, digital microscopes, etc.
  • the wearable device described herein can provide a faster and more accurate way for updating the patient's medical record (such as, e.g., the patient chart).
  • the wearable device can identify a patient's file that may need to be updated.
  • the wearable device can receive a selection of the patient's file by an HCP (such as, e.g., a doctor).
  • the wearable device can also identify the patient using facial recognition, voice recognition, or other biometric information of the patient.
  • the wearable device can reduce the likelihood that a patient's data is accidentally attached to another patient's file.
  • the wearable device can open the patient's file for updates. Accordingly the patient's file becomes accessible to the HCP while the patient is interacting with the HCP. As the HCP is capturing audio/visual data associated with the patient, these data may be attached to the patient's file in near real-time.
  • the wearable device of the HCP can communicate with a healthcare database management system to save the updates to the patient's file.
  • the HCP does not have to email the images back and forth with the PACS (or another HCP); thereby reducing the likelihood that the images are lost.
  • the update to the patient's file can occur during the patient's visit/interaction which can reduce the delays in updating the patients' records.
  • a first user of the wearable device can update the medical records while a second user of the wearable device can view the updated virtual medical records as the first user is updating the medical records.
  • the wearable device can provide and store an indication of each user's interaction with a virtual medical record and who is accessing the virtual medical record (e.g., the access log described above). For example, the wearable device can show a first user's virtual avatar and the word “editing” near a set of images in the virtual medical records. The wearable device can also show a second user's name and the word “viewing” near the set of images.
  • FIG. 16 is a flowchart that shows an example process 1600 for documenting a medical event by a wearable device.
  • the medical event may include an interaction between an HCP and a patient, an exam or procedure performed on the patient, or a portion of the interaction/exam/procedure, etc.
  • the process 1600 can be performed by a wearable device.
  • the user of the wearable device may be the HCP or the patient.
  • the wearable device monitors the user's environment.
  • the wearable device can monitor the user's environment using one or more environmental sensors described herein.
  • the wearable device can acquire the sound of the user's environment using the microphone and can acquire images of the user's environment using the outward-facing imaging system.
  • the data acquired by the environmental sensors may include data associated with the user or the user's physical environment.
  • the wearable device analyzes data associated with the user's environment to detect an initiation condition.
  • the data associated with the user's environment may include data acquired by the one or more environmental sensors, data stored on the wearable device, data retrieved from the remote data repository 280 , etc.
  • the wearable device may use a speech recognition algorithm to detect words acquired in the audio data.
  • the wearable device can also access keywords associated with the initiation condition from the healthcare database system 1220 .
  • the wearable device can determine whether the audio data includes the keywords associated with the initiation condition.
  • the wearable device can determine whether the initiation condition is present based on the analysis of the data at block 1620 . If an initiation condition is not present, the process 1600 goes back to the block 1610 where the wearable device continuously monitors the environment.
  • the wearable device may document a medical event using one or more environmental sensors.
  • the wearable device can use the microphone 232 to record a conversation between the patient and the HCP.
  • the wearable device can also use the outward-facing imaging system 464 to take a picture of the patient.
  • the wearable device can process the audio and visual data to identify relevant information.
  • the wearable device may determine whether a termination condition is present.
  • the wearable device can detect the termination condition based on the data associated with the user's environment. If the termination condition is not present, the wearable device can continuously document the medical event as shown in block 1640 . If the termination condition is present, at block 1660 , the wearable device may terminate the documentation of the medical event. For example, the wearable device may stop processing data acquired by the one or more environmental sensors, turn off an environmental sensor, instruct an environmental sensor to enter sleep mode, etc.
  • the wearable device can generate an instruction for updating a data store with the information associated with the medical event.
  • the information associated with the medical event may include audio or visual data documenting a portion of the medical event or the entire medical event.
  • the instruction can cause a healthcare database system to add the information associated with the medical event to the patient's virtual medical records.
  • the wearable device can update the data store while the wearable device is documenting the medical event. Additionally or alternatively, the wearable device can update the data store after the wearable device finishes documenting the medical event.
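  • A compact sketch of this monitor/initiate/document/terminate/update flow (process 1600) follows; it assumes sensor access and condition detection are supplied as simple callables and is illustrative only, not the patented implementation:

```python
# A compact sketch of the documentation loop of process 1600, under the
# assumption that sensor access and condition detection are supplied as simple
# callables; it is illustrative, not the patented implementation.

def run_documentation_loop(read_sensors, initiation_present, termination_present,
                           record_frame, update_data_store):
    captured = []
    # Monitor the environment until an initiation condition is detected.
    while True:
        data = read_sensors()
        if initiation_present(data):
            break
    # Document the medical event until a termination condition is detected.
    while True:
        data = read_sensors()
        captured.append(record_frame(data))
        if termination_present(data):
            break
    # Terminate documentation and update the data store (e.g., the patient's
    # virtual medical record) with the captured information.
    update_data_store(captured)
    return captured
```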
  • FIG. 17 schematically illustrates an overall system view depicting multiple devices interacting with each other.
  • the computing environment 1700 includes a user device 1730 , and wearable devices 210 a and 210 b .
  • the user device may be a wearable device, a computer, a mobile device, or any other device, alone or in combination.
  • the user device 1730 and the wearable devices 210 a , 210 b can communicate with each other through the network 1290 (shown in FIG. 12 ).
  • the user device 1730 and the wearable devices 210 a , 210 b can also communicate with another device (e.g., a medical device such as an ultrasound probe) that is connected with the wearable device via a wired or a wireless network.
  • an object recognizer 708 can analyze imagery captured by the wearable device and determine that a medical device (e.g., an ultrasound probe) is present.
  • the wearable device may attempt to connect to the medical device (e.g., by listening for advertising events broadcast by the medical device) to initiate a communication link between the device and the wearable device.
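  • One possible shape of this discovery-and-connect step is sketched below; the scanner and link-opening callables are hypothetical placeholders, not a real Bluetooth API, and the label set is illustrative:

```python
# Hedged sketch only: pair the wearable with a recognized medical device by
# listening for its advertising events. The scanner and link-opening callables
# are hypothetical placeholders, not a real Bluetooth API.

RECOGNIZED_DEVICE_LABELS = {"ultrasound probe", "ecg monitor"}   # illustrative

def try_connect(recognized_label: str, scan_advertisements, open_link):
    """recognized_label comes from an object recognizer (e.g., 'ultrasound probe')."""
    if recognized_label not in RECOGNIZED_DEVICE_LABELS:
        return None
    for advert in scan_advertisements():            # e.g., advertising events
        if recognized_label in advert.get("name", "").lower():
            return open_link(advert["address"])     # establish the communication link
    return None
```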
  • the computing environment 1700 can also include one or more remote computing systems 1720 .
  • the remote computing system 1720 may include server computer systems that are clustered and located at different geographic locations.
  • the remote computing system 1720 may include an embodiment of the healthcare database system 1220 shown in FIG. 12 .
  • the computing device 1730 and the wearable devices 210 a , 210 b can communicate with the remote computing system 1720 via the network 1290 .
  • the remote computing system 1720 can include the remote data repository 280 (also shown in FIG. 2 ) which can maintain information about physical and/or virtual worlds.
  • the remote computing system 1720 can also include a remote processing module 270 (also shown in FIG. 2 ).
  • the remote processing module 270 can include one or more processors which can communicate with the user device 1730 , the wearable devices 210 a and 210 b , and the remote data repository 1280 .
  • at least a portion of the processing or storage can be provided by the local processing and data module 260 (as shown in FIG. 2 ).
  • the users of the wearable devices 210 a , 210 b and the user device 1730 can share information of an environment and interact with virtual and physical objects in the environment via the network.
  • a doctor may wear the wearable device 210 a and perform a surgery on a patient in an operating room.
  • the wearable device 210 a can generate and display a virtual content screen to the doctor.
  • the virtual content screen may include a first portion of the virtual content that only the doctor can view.
  • the virtual content screen may also include a second portion of virtual content that other users in the room can view.
  • the doctor may share some parts of the first portion or the entire first portion of the virtual content with other users.
  • the wearable device 210 a can image the patient and the doctor's surroundings using an outward-facing imaging system.
  • the image acquired by the physician may be communicated to a medical student wearing the wearable device 210 b outside of the operating room.
  • the medical student's wearable device 210 b may receive the world map 920 associated with the operating room.
  • the world map 920 may include virtual and physical objects in the operating room.
  • the medical student can perceive, via the wearable device 210 b , the patient's virtual medical records and medical equipment in the operating room.
  • the user of the wearable device 210 b and the user of the wearable device 210 a can interact with virtual or physical objects in the world map 920 .
  • both the doctor and the medical student can examine a patient at the same time during the patient's visit to a clinic.
  • the examination conducted by the doctor can be documented in the patient's medical record. The record of the examination can be reviewed later by the user or another authorized user (e.g., as a case study for medical education).
  • the information of the world map 920 may be developed over time and may be based on the information collected by different user devices.
  • the world map may also be referred to herein as the world model. Models of virtual worlds may also be developed over time and be based on the inputs of different users.
  • information acquired by the user devices may be used to construct a world map 920 .
  • Various object recognizers (e.g., 708 a , 708 b , 708 c . . . 708 n ) may be used to recognize objects and tag images, as well as to attach semantic information to the objects.
  • the wearable devices 210 a , 210 b can identify and communicate with a medical device (e.g., an ultrasound probe) that is connected with the wearable devices 210 a , 210 b via a wired or a wireless network.
  • These object recognizers may include the object recognizers 708 a and 708 n shown in FIG. 7 .
  • the wearable device and the remote computing system 1720 can construct, update, and build a collection of images, points and other information using the information obtained from one or more wearable devices.
  • the wearable device may process raw information acquired and send the processed information to the remote computing system 1720 for further processing.
  • the wearable device may also send the raw information to the remote computing system 1720 for processing.
  • the wearable device may receive the processed information from the remote computing system 1720 and provide final processing before projecting to the user.
  • the wearable device may also process the information obtained and pass the processed information to other wearable devices.
  • the user device may communicate with the remote data repository 1280 while processing acquired information. Multiple wearable devices and/or multiple server computer systems may participate in the construction and/or processing of acquired images.
  • the information on the physical worlds may be developed over time and may be based on the information collected by different user devices. Models of virtual worlds may also be developed over time and be based on the inputs of different users. Such information and models will sometimes be referred to herein as a world map or a world model. As described with reference to FIGS. 7 and 9 , information acquired by the user devices may be used to construct a world map 920 .
  • Various object recognizers may be used to recognize objects and tag images, as well as to attach semantic information to the objects. These object recognizers may include the object recognizers 708 a and 708 n (also described in FIG. 7 ).
  • the remote data repository 280 can be used to store data and to facilitate the construction of the world map 920 .
  • a wearable device can constantly update information about the user's environment and receive information about the world map 920 .
  • the world map 920 may be created by the wearable device or in combination with another computing device.
  • the remote wearable devices 210 a , 210 b , the user device 1730 , and computing system 1720 may construct and/or update the world map 920 .
  • a wearable device may be in communication with the remote processing module 270 and the remote data repository 280 .
  • the wearable device may acquire and/or process information about the user and the user's environment.
  • the remote processing module 270 may be in communication with the remote data repository 280 to process information about the user and the user's environment.
  • the remote computing system 1720 can modify the information acquired by the wearable device 210 a , such as, e.g., attaching access privileges to a virtual object, enlarging a portion of an image acquired by the wearable device, extracting relevant medical information from the image of the wearable device 210 a , etc.
  • the remote computing system 1720 can send the processed information to the wearable device 210 a or another computing device.
  • the users can have an access privilege to the patient's medical record, or at least an access privilege to the medical information that a first user is sharing with a second user.
  • the first wearable device can send a request to the second wearable device using the data sharing system 1238 to view the medical information.
  • the second wearable device may try to access the patient's medical record to access the medical information in the data store 1220 .
  • the patient system can also receive a request from the first or second wearable device indicating that the second user is trying to access the medical information.
  • the patient system can then determine whether the second user has the access privilege to the medical information as described with reference to FIGS. 14A-14B .
  • the access privileges can be specific to different portions of the patient medical record or for different users.
  • the patient may have read access to the full medical record but edit access to the patient and family history section.
  • An HCP may have read access to the full medical record and edit access to the portion that is relevant to the HCP's specialty but not have edit access to portions specific to other HCP specialties (e.g., a cardiologist may have edit access to cardiac-related portions of the medical record but not to the patient's dental records).
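  • A minimal sketch of such per-section, per-role privileges is shown below (the roles, section names, and fallback rule are illustrative assumptions, not the patent's access-control scheme):

```python
# Illustrative sketch of per-section access privileges on a virtual medical
# record: a cardiologist may edit cardiac sections but only read dental
# sections, while the patient can edit the family history section.

from enum import Enum

class Access(Enum):
    NONE = 0
    READ = 1
    EDIT = 2

# role -> {record section -> access level}; unlisted sections fall back to "*"
ACCESS_POLICY = {
    "patient":      {"*": Access.READ, "family_history": Access.EDIT},
    "cardiologist": {"*": Access.READ, "cardiac": Access.EDIT},
    "dentist":      {"*": Access.READ, "dental": Access.EDIT},
}

def allowed(role: str, section: str, requested: Access) -> bool:
    policy = ACCESS_POLICY.get(role, {})
    granted = policy.get(section, policy.get("*", Access.NONE))
    return granted.value >= requested.value

assert allowed("cardiologist", "cardiac", Access.EDIT)
assert not allowed("cardiologist", "dental", Access.EDIT)
assert allowed("patient", "family_history", Access.EDIT)
```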
  • the medical information shared among users can be any interactions with a patient as described, for example, with reference to FIGS. 15-16 .
  • the medical information can include test results the user collects during a test on the patient or previously stored patient data or analysis (e.g., trends of the patient's health condition).
  • the first user can define what medical information he wants to share with the second user. For example, where the first user is seeing the patient, the first user can configure the access privileges of the second user such that only the image regarding the patient's left arm (or other designated parts of the patient) is shared with the second user.
  • the wearable device of the first user may share the observations by the first user (e.g., via images captured by the outward-facing imaging system) in real time to the wearable device of the second user.
  • the first wearable device may present a notification to the first user asking whether the first user wants to edit the medical information being sent.
  • the first wearable device can present a screen to the first user showing the medical information to be shared.
  • the first wearable device may give the first user options to modify the medical information, e.g., via poses or the user input device 466 .
  • the first user can also provide an indication to one or more other users with whom the first user wants to share the modified medical information. Based on the indication received from the first user's wearable device, the other users can accept the first user's “invitation” to share the medical information in the first user's environment.
  • the first user can attach access privilege with the medical information.
  • the first user can set some medical information as private so that the medical information will not be shared with other users.
  • the first wearable device can observe (e.g., via the outward-facing imaging system) the patient's abdomen and the patient's face.
  • the first wearable device can share the portion of the images related to the patient's abdomen to another physician to facilitate diagnosis of the illness.
  • the first wearable device may be configured not to share the portion of the images having the patient's face to protect the patient's privacy.
  • the access privilege associated with sharing medical information may be based on the viewer. For example, during a surgical procedure, content (virtual or physical) in the surgeon's field of view may be shared with everyone in the same operating room.
  • only a subset of content may be shared with another user that is outside of the operating room.
  • for example, physical content (e.g., as observed by the outward-facing imaging system) but not virtual content may be shared with a student or a physician in the pathology lab; that is, the virtual content as seen by the surgeon is not shared with the student or the physician in the pathology lab.
  • Wearable devices described herein can facilitate sharing medical information among multiple users.
  • Using wearable devices in a healthcare environment can allow multiple users to access and view information for a medical procedure at the same time.
  • wearable devices do not require users to walk around in the healthcare environment and view information on many pieces of equipment, such as monitors, displays, etc.
  • the wearable devices can gather all the information into a centralized location and allow each user to view the same information or a modified version of the same information.
  • the users can view information about the patient's heart (such as heart rate or other statistics) via the display 220 instead of finding such data from a medical device.
  • the wearable device can further allow each user to manipulate or edit the information as needed.
  • the updated information may automatically be sent to other users during a shared session.
  • multiple users can view and update a patient's medical record if they have permissions.
  • the user may see the other users who are viewing the same patient record.
  • the user can also see other information (e.g., file history) regarding the patient's medical record. For example, the user can see who has made updates in the patient's medical record. Accordingly, the users may get information about who else is “in” the patient's record.
  • a user can configure sharing of a virtual content screen which can be created based on information in the patient's record.
  • the wearable device can display the virtual content screen within the user's field of view.
  • the user can then edit or choose the information he wishes to view within the virtual content screen.
  • the user can also configure which sections of the virtual content to share with other users. Additionally or alternatively, the user can also choose to replicate his field of view to other users.
  • the other users can therefore perceive the virtual content screen alone or in combination with other virtual content in the field of view of the user.
  • the first wearable device can receive user instructions to share medical information with the second wearable device.
  • the instructions can be provided upon detection of keywords, or may be provided via totem or gestures.
  • the first user may define the phrase “share medical information” as a trigger to cause his wearable device to share a virtual content screen (presented by the display 220 ) to the other user's wearable device.
  • the trigger may be an actuation of a button or a hand gesture.
  • the wearable device can share a virtual content screen presented by the first wearable device to other users' wearable devices.
  • the second user's wearable device can generate a notification asking whether the second user wants to accept the first user's request to share the medical information or whether the second user would like to participate in a sharing session where the second user can share a virtual experience with the first user via their respective wearable devices.
  • the second user's wearable device may present information shared by the first user in response to receiving the indication from the second user to participate in the sharing session.
  • the wearable device can analyze data that the environment sensors have collected and can determine a trigger for a sharing session based on the analysis of data gathered from the environment sensors. For example, the wearable device can determine whether there are other devices in the proximity (e.g., by using computer vision techniques to analyze acquired images or by communicating with the nearby devices). If nearby devices are discovered, the wearable device of a user may provide a notification to the user asking whether the user wants to share some medical information with the other devices.
  • the trigger for sharing information can also be based on interactions associated with a user.
  • the wearable device can analyze an interaction between the user of the wearable device and a patient.
  • the wearable device can detect the presence of a trigger for a sharing session if the wearable device determines the interaction needs the participation of another user. For example, if the wearable device determines that the user is performing a surgery on the patient's liver and the user makes notes or comments that the disease on the liver might be cancerous, the wearable device may generate an alert displayed to the user asking whether he wants to share the image of the liver to another user (e.g., a physician in a pathology lab) who will analyze the tumor (described below with reference to FIG. 19 ).
  • a wearable device can analyze data collected in previous interactions to determine whether a trigger for sharing information is present.
  • the wearable device of a physician can analyze data related to the patient's past visit and find that a specialist is recommended for diagnosing the patient's illness.
  • the wearable device can accordingly share some of the patient's medical information with the specialist after the patient's visit.
  • the wearable device of the physician can also share the information related to the patient (e.g., as perceived by the physician) during the patient's next visit so that both the physician and the specialist can diagnose the patient's illness.
  • virtual content to be shared among multiple users can include patient parameters such as the patient's blood pressure, brain activity (e.g. the patient is awake), temperature, etc.
  • the parameters may be collected by medical devices or the user's wearable device.
  • the medical devices can communicate with the wearable device via a wired or wireless network, so that the wearable device can receive the patient parameters from the medical devices.
  • the wearable device of a user can collect and analyze data from multiple external devices in proximity (e.g., a heart monitor, an anesthesia monitor, etc.) via the internet, Bluetooth, or other wireless sharing capabilities. The wearable device can then present and display parameters from all the devices to the user on a centralized platform.
  • the wearable device can act as a centralized platform to display parameters from all the external devices connected (physically or wirelessly) with the patient.
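  • One simple way to model such a centralized panel is sketched below (device names, parameter fields, and the rendering format are illustrative assumptions only):

```python
# Sketch under stated assumptions: aggregate the latest readings pushed by
# several external monitors into one structure that the display can render as a
# centralized panel. Device names and fields are illustrative only.

from dataclasses import dataclass
from time import time

@dataclass
class Reading:
    device: str
    parameter: str
    value: float
    timestamp: float

class VitalsPanel:
    def __init__(self):
        self.latest = {}   # (device, parameter) -> most recent Reading

    def ingest(self, reading: Reading) -> None:
        self.latest[(reading.device, reading.parameter)] = reading

    def snapshot(self) -> dict:
        """Flattened view suitable for rendering on a single virtual screen."""
        return {f"{device}:{param}": r.value
                for (device, param), r in self.latest.items()}

panel = VitalsPanel()
panel.ingest(Reading("ecg_monitor", "heart_rate", 72, time()))
panel.ingest(Reading("anesthesia_monitor", "respiration_rate", 14, time()))
print(panel.snapshot())
# {'ecg_monitor:heart_rate': 72, 'anesthesia_monitor:respiration_rate': 14}
```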
  • the wearable device may determine the location of the user (e.g., by analyzing location data from GPS sensors). After analyzing the data the environmental sensors collect, the wearable device may provide a notification to the user asking whether he wants to share the medical information with one or more other users. For example, if the wearable device determines the user is in an operating room and there are other users in the operating room, the wearable device can provide an option to the user to share the medical information he perceives with other users in the same operating room. If the user allows all other users in the operating room to view his field of view, the wearable device of the other users may present virtual content screens comprising the user's field of view for a shared experience.
  • the sharing of medical information among users can occur shortly after the request to share or can be delayed until a later time.
  • the wearable device of a user may receive a request to share what he sees in his field of view in real time with the other users.
  • the wearable device may receive a request from the user to share what he saw in his field of view for a period of time in the past (e.g., five minutes ago) with other users.
  • the user may define specific time duration or a start/end time using his wearable device.
  • the wearable devices for one or more of the users can generate notifications indicating that there is another user actively viewing the patient's medical record.
  • the wearable system may provide an access lock on the medical record to prevent two (or more) users from editing the record at the same time, to provide for medical record integrity.
  • the wearable system can allow multiple users to edit a medical record at the same time, and the wearable system may synchronize the edits done by different users to maintain consistency.
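  • A minimal sketch of one possible access-lock design is shown below (the class, the advisory-lock behavior, and the viewer tracking are assumptions about one way to realize this, not the wearable system's actual mechanism):

```python
# Minimal sketch: an advisory lock that prevents two users from editing the same
# medical record simultaneously, while still allowing concurrent viewers, who
# are tracked so that they can appear in the access log described earlier.

import threading

class RecordSession:
    def __init__(self):
        self._lock = threading.Lock()
        self._editor = None
        self.viewers = set()

    def open_for_view(self, user: str) -> None:
        self.viewers.add(user)

    def try_open_for_edit(self, user: str) -> bool:
        with self._lock:
            if self._editor is None or self._editor == user:
                self._editor = user
                return True
            return False   # someone else is editing; caller may notify the user

    def close_edit(self, user: str) -> None:
        with self._lock:
            if self._editor == user:
                self._editor = None

session = RecordSession()
assert session.try_open_for_edit("dr_a")
assert not session.try_open_for_edit("nurse_b")  # record is locked for editing
```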
  • the wearable device of a user (who may be an HCP or a patient) can also configure how medical information will be shared with other users.
  • for example, the user can schedule a time for sharing: at 12 AM on Friday, Jan. 6, 2017, the wearable device can send the other users an email containing the specific part of the patient's medical record that the user has chosen to share.
  • the email may include any information the user wants to share with the other users, including any documentations of interactions, notes or comments, test results, etc.
  • the email may include metadata to record which party initiated the exchange, timestamps, and so forth.
  • the wearable device may generate and display shared virtual content (e.g., via the display 220 ) to the user; and the user may edit or modify the information in the virtual content.
  • the second user's wearable device can present virtual content to the second user, which contains the same information as the virtual content perceived by the first user. If the first user's wearable device receives a modification to the virtual content from the first user, the same modification can be communicated to the second user's wearable device and can be reflected by the second user's wearable device.
  • the first user and the second user's wearable devices may provide an indication that there is another user actively “in” the same patient file.
  • a wearable device can be configured to share only a portion of the content perceived by the user. For example, the wearable device of a first user can identify a second user to share the information perceived by the first user. The wearable device of the first user can receive a request from the second user's wearable device to share the content. The wearable device can determine access privilege of the second user and only share the portion of the information in the first user's FOV to which the second user has access. The second user's wearable device can accordingly present to the second user a subset of information in the first user's FOV.
  • the wearable device can also share and record the data received from medical devices.
  • the wearable device can present (e.g., via the display 220 ) to the user medical information associated with the data retrieved from the external medical devices.
  • the wearable device can present a patient's electrocardiograph (ECG) as the wearable device receives the data from the corresponding monitoring device attached to the patient.
  • the data (or a subset of the data) can be shared with other users using the techniques described herein.
  • the user can also edit the medical information before sharing it with the other users.
  • FIG. 18 illustrates an example of sharing medical information among multiple users, which may be performed under the computing environment 1700 described with reference to FIG. 17 .
  • a surgeon 1802 is operating on a patient's heart 1804 .
  • the surgeon 1802 can be assisted by two nurses 1820 and 1822 .
  • the operating room may also include one or more medical devices (e.g., an ECG/EKG machine) monitoring the real time conditions of the heart 1804 or other physiological parameters of the patient.
  • the medical device can output the patient's heart condition to a physical display 1824 .
  • the surgeon 1802 , and the two nurses 1820 , 1822 can wear their respective wearable devices.
  • the surgeon 1802 can see the patient's heart 1804 through the wearable device, while the two nurses 1820 and 1822 can perceive the physical display 1824 via the wearable device.
  • the surgeon 1802 may not be able to perceive the physical display 1824 as it is not in his field of view, and the two nurses 1820 , 1822 may not perceive the heart 1804 which is outside of the field of view of the two nurses 1820 , 1822 .
  • the surgeon 1802 and the two nurses 1820 , 1822 can share their respective field of views with each other via the wearable devices.
  • the surgeon 1802 and the two nurses 1820 , 1822 can participate in a sharing session.
  • the surgeon 1802 can perceive the virtual screen 1810 which includes information displayed by the physical screen 1824 , which is inside of the two nurses' field of view but is outside of the surgeon's 1802 field of view.
  • Information on the virtual screen 1810 can be rendered based on images captured by the outward-facing imaging systems of a wearable device worn by the nurse 1820 or 1822 .
  • the medical device can also be in communication with the wearable device via wired or wireless communication channels, and the wearable device can present the patient's conditions in real-time via the virtual screen 1810 .
  • the surgeon 1802 can also share at least a portion of his field of view with the nurses 1820 , 1822 .
  • the wearable device of the surgeon 1802 can communicate images of the heart 1804 as observed by the outward-facing imaging system to the wearable devices of the nurses 1820 , 1822 .
  • the nurses or the surgeon may provide an indication to the wearable devices to initiate a sharing session with other people in the room.
  • the nurse's wearable device may send a request to the surgeon's wearable device to share what the nurses are seeing on the surgical monitor to the surgeon's wearable device.
  • the surgeon's wearable device may present the virtual content screen 1810 (e.g., to show information perceived by the nurses on the physical display 1824 ) to the surgeon while the surgeon 1802 is operating on the patient.
  • the virtual content screen 1810 can contain information associated with the data that the medical device collects in real time. Accordingly, the surgeon's wearable device can give the surgeon 1802 real time medical information about the situation of the patient's heart as shown in the ECG monitor, even though such ECG monitor is not physically in the surgeon's 1802 field of view.
  • the surgeon's wearable device can generate alerts to the surgeon during the operation if the surgeon's wearable device determines, based on environment information (such as, e.g., information received from the medical devices in the operating room), there is an abnormality in the patient's heart.
  • the surgeon's wearable device may not generate and display a virtual screen 1810 showing the real time medical information about the situation of the patient's heart if the ECG monitor shows that the condition of the patient's heart is normal.
  • the surgeon's wearable device may only generate and display a virtual content screen to the surgeon 1802 when there is an abnormality in the patient's heart based on the information on the surgical monitor.
  • the surgeon 1802 or the nurses 1820 , 1822 can configure settings for viewing information outside of the field of view on their respective wearable devices. For example, the surgeon 1802 or the nurses 1820 , 1822 can configure a setting such that the nurses' wearable devices only share the information on the physical display screen 1824 with the surgeon 1802 when there is an abnormality in the patient's heart data. In some cases, the nurse's wearable device alone or in combination with the remote computing system 1720 can analyze the data collected from one or more medical devices and determine whether or not there is an abnormality.
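  • A small sketch of such abnormality-triggered sharing follows (the numeric bounds and the share callable are examples only, not clinical thresholds or the system's actual interface):

```python
# Sketch under stated assumptions: forward monitored heart data to the surgeon's
# device only when it falls outside an illustrative normal range; the bounds and
# the share callable are examples, not clinical thresholds.

NORMAL_HEART_RATE = (50, 110)    # beats per minute, illustrative only

def maybe_share_heart_data(heart_rate: float, share) -> bool:
    low, high = NORMAL_HEART_RATE
    if not (low <= heart_rate <= high):
        share({"heart_rate": heart_rate, "alert": "abnormal heart rate"})
        return True      # shared because an abnormality was detected
    return False         # normal reading; nothing is pushed to the surgeon

maybe_share_heart_data(72, print)    # normal: not shared
maybe_share_heart_data(130, print)   # abnormal: shared with the surgeon's device
```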
  • the surgeon can share the image of the patient's heart 1804 with the nurses 1820 and 1822 based on the presence of a triggering event.
  • the surgeon's wearable device may send a request to the nurses' wearable device to share the image of the heart.
  • the nurse's wearable device, upon receiving the request from the surgeon's wearable device, can present the virtual content screen 1826 showing images of the patient's heart 1804 as seen by the surgeon.
  • the wearable device can also present other types of medical information on virtual screen(s).
  • the virtual screen 1810 may also include medical information of the patient's other parameters collected by the other devices, such as the patient's blood pressure, brain activity, temperature, etc.
  • the nurses 1820 and 1822 in the example of FIG. 18 can also be any other type of users, such as other surgeons, medical interns, residents, students, patients, etc.
  • the surgeon's wearable device may also record the operation for later review by an authorized user (e.g., the surgeon himself, the patient, the patient's primary healthcare provider, etc.).
  • the surgeon 1802 is performing a heart surgery.
  • a first healthcare provider can perform any type of medical procedure (or any type of interaction with the patient as illustrated in FIG. 15 ).
  • the nurses 1820 and 1822 need not be located in the operating room in which the surgeon is operating.
  • the nurses could be located in an adjacent room.
  • the surgeon may share information on the heart with a consulting physician or a pathologist, each of whom may be located remotely from the operating room (e.g., elsewhere in the hospital or in an entirely separate geographic location).
  • the first user can make notes or comments when his device is documenting an interaction with the patient.
  • the first user can share the notes or comments with the second user as well through their devices.
  • the first user can also instruct his wearable device to keep the notes or comments private.
  • the first user may share the notes or comments with the second user while he is making the notes or comments through their devices.
  • the wearable device can take an image of the patient.
  • the user may have the option to send a request to the wearable device to enlarge the image or edit the image. In editing the image, the user may virtually separate parts of the image, flag different positions of the image, remove several parts of the image, etc.
  • FIG. 19 illustrates an example of adding virtual content to images taken during a medical procedure.
  • the wearable device can improve efficiency in a medical procedure by allowing the user to send medical information (e.g., image information, camera orientation, location of the image or the procedure, time, etc.) to other users while the medical procedure is in progress or after the medical procedure.
  • the wearable device can send the tumor orientation in a tumor resection procedure to a pathology lab.
  • the wearable device can also send information to a different location, group, or lab for additional assessment or analysis.
  • the medical information may include images of specific organs, symptoms, etc.
  • the wearable device can allow a recipient or the sender to mark the medical information in a virtual environment so that no physical marks are needed. For example, rather than marking on a physical sample of a tumor, a doctor in the pathology lab can virtually mark an image (e.g., a 2D or 3D image) of the tumor.
  • the wearable device can also facilitate an efficient and non-invasive method to analyze medical information. For example, during a tumor resection, a physician typically places physical flags in the tumor to indicate which direction is up.
  • the wearable system can be configured such that a user can add virtual flags (alone or in combination with texts or other visuals) to the tumor.
  • the wearable device can then send the virtual content (e.g., an image of the tumor with virtual markers) to the patient's chart or to another user, such as a pathology lab, using the internet, Bluetooth, or other wireless sharing capabilities, which provides a faster way to transfer the information (e.g., as opposed to mailing the sample to the radiologist) and a non-invasive way to mark the physical sample (e.g., poking of the tumor is not needed).
  • the wearable device can share a virtual screen perceived by the first user to the wearable device of the second user.
  • the virtual screen may include information (e.g., texts, drawings, comments, annotation of physical or virtual objects) inputted by the first user.
  • a surgeon in an operating room can share his virtual screen with a radiologist while he is looking at a tumor on a patient. The surgeon can mark regions of the tumor on the virtual screen and communicate such marks to the radiologist.
  • the wearable device can also save the virtual content with virtual markers in the patient's medical record and send a notification to the users in another location that their attention is needed.
  • the wearable device can put virtual flags on a virtual tumor image to indicate a fiducial direction (e.g., upward), the orientation of the camera that captured the image, contextual information (e.g., notes or commentary from the surgeon relating to the image, a body part such as the tumor, and so forth), etc.
  • the wearable device can send the virtual tumor image with the virtual flags directly to users in a pathology lab by sharing virtual content.
  • the wearable device can also update the virtual tumor image with the virtual flags in the patient's medical record and send a notification to the users in the pathology lab.
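  • A simple data-structure sketch for such annotations is given below (the class and field names are assumptions introduced for illustration, not the patent's data model):

```python
# Illustrative data-structure sketch: virtual flags attached to a captured tumor
# image, recording position, orientation, camera pose, and surgeon commentary,
# so the annotated image can be stored in the chart or shared with a lab.

from dataclasses import dataclass, field

@dataclass
class VirtualFlag:
    label: str                  # e.g., "flag 1"
    position: tuple             # (x, y, z) in the image or model frame
    direction: str              # e.g., "anterior", "upward"
    note: str = ""

@dataclass
class AnnotatedImage:
    image_id: str
    camera_orientation: tuple   # camera pose at capture time (illustrative)
    flags: list = field(default_factory=list)

    def add_flag(self, flag: VirtualFlag) -> None:
        self.flags.append(flag)

annotated = AnnotatedImage("tumor-scan-001", camera_orientation=(0.0, 0.0, 0.0))
annotated.add_flag(VirtualFlag("flag 1", (0.12, 0.30, 0.05), "anterior",
                               "resection margin to confirm"))
annotated.add_flag(VirtualFlag("flag 2", (0.15, 0.28, 0.05), "posterior"))
```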
  • the users in the pathology lab can also make additional annotations as virtual content using their respective wearable devices. Such additional annotations can also be communicated to the surgeon's wearable device.
  • a surgeon 1902 is performing a medical procedure on a patient's organ 1904 .
  • the surgeon 1902 may determine that the organ 1904 contains some abnormality.
  • the wearable device of the surgeon 1902 may take an image of the organ 1904 . After taking the image, the wearable device can generate and display a first virtual content screen 1910 containing the image of the organ.
  • the surgeon may determine that the organ 1904 contains two parts: a normal part and an abnormal part that needs to be removed.
  • the surgeon may mark in the first virtual content screen 1910 a part A (the normal part) and a part B (the abnormal part).
  • the surgeon 1902 may instruct (either by key words, totem or gesture activation) his wearable device to virtually separate part A and part B of the image 1910 .
  • the surgeon's wearable device may process and analyze the data in the image.
  • the surgeon's wearable device may then generate and display a second virtual content screen 1920 to the surgeon 1902 where the surgeon's wearable device shows that part A and part B are separated, which may assist in the examination or resection of the abnormal part.
  • the wearable device can also allow a user to manipulate the image of the tumor.
  • the surgeon 1902 can instruct his wearable device (e.g., using hand gestures or voice command) to enlarge the image of the abnormal part B.
  • the surgeon's wearable device may generate and display an image of the abnormal part B as shown in the virtual screen 1930 . Based on this enlarged image, the surgeon can better determine the positions at which he should remove the part B from the organ 1904 .
  • the surgeon can also input comments or markers associated with the tumor.
  • the surgeon may place two flags (flag 1 and flag 2) on an image of the tumor as shown in the virtual screen 1930 .
  • the flag 1 and the flag 2 can indicate, for example, direction or orientation of the tumor, positions of resection, locations of abnormalities, direction of the part B with respect to the body (e.g., flag 1 is at an anterior position relative to flag 2), etc.
  • the surgeon can also share the image of the tumor with another user of a wearable device.
  • the surgeon, to confirm the positions of the flag 1 and the flag 2, can share the information in the first, the second, or the third virtual screens with a second user.
  • the second user can be a pathologist who will analyze the abnormal part B.
  • the surgeon can save the three virtual content screens in the patient's medical record and the second user can then view the virtual content screens in the patient's medical report.
  • the surgeon can also share the three virtual content screens with the second user as illustrated above with reference to FIG. 18 .
  • the surgeon may also have the option to send the three virtual content screens to the second user in real-time as the surgeon is performing the surgery.
  • the surgeon's wearable device may send the surgeon 1902 a notification asking whether the surgeon wants to send the virtual screens or annotations to another user.
  • the wearable device of the surgeon 1902 may generate and show virtual content of the area the surgeon is working on.
  • the virtual content screens 1910 , 1920 , and 1930 may show the area being operated on during a surgery at different points in time.
  • the first virtual screen 1910 may show the organ 1904 before the abnormality is removed.
  • the second virtual screen 1920 may show the organ 1904 after the surgeon removes the abnormal part: it includes a Part A (the normal part) and a part B (the removed abnormal part).
  • the surgeon can then enlarge the abnormal part B on the third virtual screen and virtually flag the parts on which the surgeon wants to conduct more tests.
  • the flag 1 and flag 2 may indicate the parts that the surgeon wants pathology tests to be performed on.
  • flags 1 and 2 can indicate the orientation of the tumor (such as, e.g., by indicating the upward direction associated with the tumor).
  • the surgeon can share the first, the second, or the third virtual content screens with a second user (e.g., a pathologist). By sharing one or more of these screens, the second user can have more context on where in the organ the abnormal part B came from, how the organ or abnormality were positioned in the patient's body, whether or where additional resection of an organ may be needed, etc. when he conducts the pathology test.
  • the use of virtual flags can help preserve the integrity of the abnormal part before analysis, because an actual, physical flag is not placed into the abnormal part B.
  • while in this example the surgeon is the one editing or flagging the virtual image, multiple users sharing the first, the second, or the third virtual content screens can also edit or flag the image.
  • Various edits and flagging by different users can be recorded in the patient's medical record and other users can automatically get the updated images.
  • the flag 1 or 2 may be placed by another physician who receives the image of the tumor provided by the surgeon.
  • FIG. 20 is a flowchart that illustrates an example process 2000 of sharing virtual content between multiple users.
  • the example process 2000 can be performed by the wearable system 200 described herein.
  • the example process 2000 can be performed by the local processing and data module 260 of one or more wearable devices alone or in combination with the remote processing module 270 .
  • the process 2000 presents virtual content by a first wearable device of a first user.
  • the first user's wearable device may present virtual content to the first user.
  • the virtual content may be presented via one or more virtual screens.
  • the process 2000 detects a triggering event for sharing the virtual content with a second user via a second wearable device.
  • the trigger event may be a request by the first user to share the virtual content with another user or may be a request by the second user to view the information perceivable by the first user.
  • the request may be key words, input on a totem, or a gesture.
  • the triggering event may also be based on a condition in the first or the second user's environment. For example, content sharing may be triggered in response to a detection of an emergency of a patient (such as, e.g., sudden bleeding or an anomaly in the heart rate).
  • the process 2000 verifies an access privilege associated with the second user.
  • the verification of the access privilege may be conducted by the data security management system 1212 shown in FIG. 12 .
  • the access privilege may be verified by the healthcare database system 1220 which can verify the second user's access privilege based on the second user's profile associated with a medical record.
  • the patient may control who has access to his medical records.
  • the patient can set up the access rules at the healthcare database system 1220 for verifications.
  • the patient system may also itself verify the access privilege of the second wearable device.
  • the process 2000 can determine whether the second wearable device has sufficient access privilege for accessing the virtual content. For example, the first or the second wearable device can receive an indication from the healthcare database system 1220 on whether the second user has an access privilege to the virtual content. If the second user has access privilege to the virtual content, the process 2000 moves to block 2050 where the virtual content, as perceived by the first wearable device, is shared with the second user's wearable device. In some situations, the first user can indicate what part of the virtual content (e.g., a specific part of the virtual content, or the entire virtual content) he wants to share with the second user's wearable device. The second user's wearable device can generate and display a second virtual content screen showing the virtual content received from the first wearable device.
  • the second wearable device can receive a modification to the virtual content.
  • the second user can modify the virtual content in the second virtual content screen and the second user's wearable device can record such modification. Such modification can be communicated to the first wearable device.
  • the first wearable device can present the modified virtual content to the first user.
  • the second user's wearable device may update the modified virtual content in the patient's medical record and send a notification to the first user about the modification.
  • the first user's wearable device can then update the first virtual content screen to show the modified virtual content to the first user.
  • the process 2000 can provide an indication to the first user's wearable device that the second user does not have an access privilege to the virtual content.
  • the process 2000 may also send, to the patient of the medical record, an indication that the second user was trying to access the virtual content.
  • the indication to the patient system may include the second user's name and the information associated with the virtual content.
  • the process 2000 may store this incident of denied access in the patient's medical report.
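  • As an illustration only, the access-privilege check and denied-access logging of process 2000 could be sketched in software as follows; the class and function names (e.g., HealthcareDatabase, share_virtual_content) are hypothetical assumptions and do not describe the actual implementation of the healthcare database system 1220.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class HealthcareDatabase:
    # Hypothetical patient-defined access rules: user ID -> content tags the user may view.
    access_rules: dict
    denied_access_log: list = field(default_factory=list)

    def has_privilege(self, user_id: str, content_tag: str) -> bool:
        return content_tag in self.access_rules.get(user_id, set())

def share_virtual_content(db, sender_id, recipient_id, content_tag, payload):
    """Share a virtual screen only if the recipient's access privilege allows it."""
    if db.has_privilege(recipient_id, content_tag):
        # Sufficient privilege: deliver the content to the second wearable device.
        return {"status": "shared", "to": recipient_id, "content": payload}
    # Insufficient privilege: record the attempt so it can be stored in the medical record
    # and an indication can be sent to the sender and the patient.
    db.denied_access_log.append({
        "user": recipient_id, "content": content_tag, "time": datetime.utcnow().isoformat()
    })
    return {"status": "denied", "to": recipient_id, "reason": "insufficient access privilege"}

if __name__ == "__main__":
    db = HealthcareDatabase(access_rules={"pathologist_17": {"imaging"}})
    print(share_virtual_content(db, "surgeon_02", "pathologist_17", "imaging", "tumor image + flags"))
    print(share_virtual_content(db, "surgeon_02", "billing_clerk_05", "imaging", "tumor image + flags"))
```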
  • the wearable devices can display and share virtual content in the 3D space using any types of techniques.
  • the virtual content is not required to be rendered on a virtual content screen.
  • the virtual content can appear as a different type of virtual object or have other graphical appearances which do not appear to be part of the virtual content screen.
  • the wearable device can present virtual content based on contextual information.
  • Contextual information can include information associated with a user of a wearable device, such as, e.g., a location of the user, a pose of the user, an emotional state of the user, a level of access privileges, etc.
  • Contextual information can also include information associated with the user's environment, such as, e.g., a physical object, a virtual object, a person in the user's environment, etc.
  • the wearable device can determine contextual information based on data acquired by one or more environmental sensors, data accessed from the remote data repository 280 or the healthcare database system 1220 , in combination or the like. For example, the wearable device can analyze data acquired by the outward-facing imaging system 464 using one or more object recognizers to identify physical objects or persons in the user's environment. The one or more object recognizers can apply various computer vision algorithms (including, for example, face recognition algorithms or object recognition algorithms) to identify the physical objects. As another example, the wearable device can determine the pose of the user using the IMUs and determine the location of the user using data acquired from a GPS.
  • the wearable device can access virtual content and present the virtual content on the 3D display based on contextual information.
  • the virtual content may include information from a patient's virtual medical records, or virtual information specific to a location.
  • the virtual content may also include an alert.
  • the alert may include a visual focus indicator, a message, a sound, etc., alone or in combination.
  • the focus indicator may be near the physical or virtual content that triggered the alert.
  • the alert message can include a brief description explaining the event that triggered the alert. Examples of presenting virtual content based on contextual information are further described with reference to FIGS. 21-24 .
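  • A minimal sketch, assuming hypothetical names such as Context and select_virtual_content, of how contextual information derived from location, user role, and recognized objects could drive the selection of virtual content; it is illustrative only and not the device's actual logic.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Context:
    location: str                 # e.g., "operating_room_A" (assumed label from GPS or world map)
    user_role: str                # e.g., "surgeon", "receptionist"
    recognized_people: List[str]  # labels returned by a face-recognition object recognizer
    recognized_objects: List[str] # labels returned by computer-vision object recognizers

def select_virtual_content(ctx: Context) -> List[str]:
    """Pick which virtual objects to render based on contextual information."""
    content = []
    if ctx.user_role == "receptionist":
        content.append("appointment_scheduling_tool")
    elif ctx.user_role == "doctor":
        content.append("weekly_schedule")
    if "patient" in ctx.recognized_people:
        content.append("patient_summary_panel")   # name, procedure, allergies, etc.
    if ctx.location.startswith("operating_room"):
        content.append("physiological_monitors")  # e.g., respiratory rate and ECG panels
    return content

# Example: a doctor in an operating room with the patient recognized in the surroundings.
ctx = Context("operating_room_A", "doctor", ["patient"], ["scalpel", "tray"])
print(select_virtual_content(ctx))
```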
  • a wearable device of an HCP may recognize a person in the surroundings based on data acquired by one or more environmental sensors.
  • the wearable device can recognize a person (e.g., a patient) using techniques (such as e.g., face recognition, voice recognition, etc.) described with reference to FIGS. 12 and 14A .
  • the wearable device can present the patient's information on the 3D display.
  • the wearable device can present patient specific information such as, e.g., the patient's name, procedure, diagnosis, anatomical site being operated, medications, and so forth.
  • the wearable device can improve the efficiency and increase the quality of patient care by recognizing the patient's identity using facial recognition techniques and presenting patient specific information accordingly.
  • the facial recognition is used in conjunction with data acquired from the outward-facing imaging system.
  • the wearable device can identify a patient chart based on an image acquired by the outward facing imaging system and determine that the patient identified in the patient chart is the same patient recognized with the face recognition technique.
  • the wearable device can check the patient's identity determined with face recognition against the scheduling information of an operating room to make sure that an HCP of the wearable device will be (or is) operating on the correct patient.
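  • The cross-check of a face-recognition result against the patient chart and the operating-room schedule could be expressed as a simple three-way comparison, as in the following illustrative sketch (all identifiers, e.g., verify_patient_identity, are assumptions).

```python
def verify_patient_identity(face_recognition_id: str,
                            chart_patient_id: str,
                            or_schedule: dict,
                            room: str) -> bool:
    """
    Return True only when the identity from face recognition, the identity in the
    patient chart, and the patient scheduled for the given operating room all agree.
    """
    scheduled_patient = or_schedule.get(room)
    return face_recognition_id == chart_patient_id == scheduled_patient

or_schedule = {"OR-3": "patient_123"}
print(verify_patient_identity("patient_123", "patient_123", or_schedule, "OR-3"))  # True
print(verify_patient_identity("patient_123", "patient_456", or_schedule, "OR-3"))  # False -> raise alert
```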
  • FIG. 21 illustrates an example of presenting virtual content based on contextual information associated with a user's environment.
  • a surgeon 2102 can wear a wearable device.
  • the wearable device can identify the patient based on facial recognition techniques.
  • the wearable device can retrieve the virtual medical records associated with the patient.
  • the wearable device can present relevant portions of the virtual medical records on the 3D user interface as shown by the virtual objects 2106 , 2108 , and 2110 .
  • the wearable device can determine the type of the surgery based on the patient's virtual medical records or information associated with the operating room.
  • the wearable device can determine that the surgeon 2102 is performing a minimally invasive heart surgery based on the scheduling information of the operating room.
  • the wearable device can present a model 2104 of the patient's heart in the 3D space to help the surgeon 2102 to identify the relevant regions which the surgeon should operate on.
  • the wearable device can also present the patient's physiological data, such as the respiratory rate on the virtual object 2108 and the ECG data on the virtual object 2110 , as well as present patient's personal information such as his age, medical history, allergies, etc., on the virtual object 2106 .
  • the wearable device of an HCP can recognize a portion of the patient's body, such as a limb or an organ, based on data acquired from the outward-facing imaging system.
  • the wearable device can access virtual medical records to determine whether the HCP is operating on the correct part of the patient's body.
  • the wearable device can present an alert if the HCP is about to operate or is operating on the wrong part of the patient's body.
  • FIGS. 22A and 22B illustrate an example of presenting an alert based on contextual information associated with a user's environment.
  • a surgeon 2202 is in an operating room.
  • the surgeon 2202 can perceive, via the wearable device, the virtual object 2210 which includes an image of the patient's legs 2214 r , 2214 l .
  • the wearable device of the surgeon can determine information relating to the type of operation the surgeon is going to perform on the patient. For example, the wearable device can access the scheduling information of the operating room and determine that the surgery scheduled for the patient involves a procedure on the patient's right leg 2204 r .
  • the wearable device can access the patient's medical record to confirm that the operation should be performed on the right leg 2204 r .
  • the wearable device can present to the surgeon a focus indicator (in the virtual object 2210 ) indicating the correct leg for operation is the right leg 2204 r .
  • the wearable device displays an arrow 2216 a pointing at the image 2214 r of the patient's right leg, which provides a visual indication to the surgeon 2202 as to which is the correct leg for the operation.
  • the wearable device can monitor the surgeon's interaction with the patient. Based on the data acquired by the outward-facing imaging system 464 , the wearable device can detect that the surgeon 2202 is holding a scalpel 2204 . In the example in FIG. 22A , the scalpel is operating on the right leg 2204 r , which is the correct leg.
  • the wearable device determines that the scalpel is operating on the right leg 2204 r or that the scalpel is moving toward the patient's right leg 2204 r .
  • the outward-facing imaging system of the wearable device can image the patient's lower limbs and use computer vision techniques to identify the patient's limbs, the position of the surgeon's hands, the presence and location of the scalpel in the surgeon's hands, etc.
  • the wearable device can determine that the surgeon is about to operate (or is operating) on the patient's right leg 2204 r , which is not the correct leg for the surgery in FIG. 22B .
  • the wearable device can present a visual alert warning 2218 that the surgeon 2202 should not operate on the right leg 2204 r .
  • the alert may include a message which informs the surgeon 2202 that he is about to operate on the wrong leg.
  • the alert warning 2218 can be presented to appear near the patient's left leg 2204 l to emphasize that the left leg 2204 l is the correct leg.
  • the alert warning 2218 may be displayed over the patient's right leg 2204 r , thereby at least partially occluding the surgeon's view of the incorrect leg, which provides an additional visual indication to the surgeon that there may be a problem occurring.
  • the alert also includes a recommended measure to correct the surgeon's mistake (such as, e.g., a reminder that the surgeon should operate on the left leg 2204 l ).
  • the alert may include a focus indicator.
  • the arrow 2216 b may change color (e.g., from green to red) or start to flash, which can emphasize that the left leg 2204 l is the correct leg.
  • the alert warning 2218 may be accompanied by an audible alert, for example, the speaker system of the wearable device may play the message shown in FIG. 22 to help ensure that the surgeon does not continue attempting to operate on the incorrect leg of the patient.
  • although this example describes a procedure on the leg of the patient, it is for illustration and is not intended to limit the scope of the alert warning described herein.
  • the use of the alert is not limited to surgery and can be provided in wearable devices used by other HCPs (e.g., to alert an examining physician that he is attempting to examine a skin rash on the patient when she actually complained about left hip pain (see, e.g., FIG. 15 )).
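  • One possible way to express the wrong-site check described above is to compare the tracked scalpel position against the planned surgical site; the following Python sketch uses assumed names and a purely illustrative distance threshold.

```python
import math

def check_surgical_site(planned_site: str,
                        scalpel_xy: tuple,
                        site_positions: dict,
                        threshold_cm: float = 15.0):
    """
    Emit an alert if the tracked scalpel comes within a threshold distance of a body part
    other than the planned surgical site. Positions are assumed to come from the
    outward-facing imaging system; all names and the threshold are illustrative.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    for site, xy in site_positions.items():
        if site != planned_site and dist(scalpel_xy, xy) < threshold_cm:
            return {"alert": True,
                    "message": f"Scalpel near {site}; the planned site is {planned_site}.",
                    "recommendation": f"Confirm and move to {planned_site}."}
    return {"alert": False}

sites = {"right_leg": (40.0, 10.0), "left_leg": (-40.0, 10.0)}
print(check_surgical_site("left_leg", (35.0, 12.0), sites))   # alert: scalpel near right leg
print(check_surgical_site("left_leg", (-38.0, 11.0), sites))  # no alert
```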
  • Contextual information can include information specific to a user of a wearable device. Because multiple users may use the same wearable device, the wearable device may need to verify the user's identity in order to determine contextual information specific to the user.
  • the wearable device can use various techniques described with reference to FIGS. 14A and 14B to determine and verify the identity of the user.
  • the wearable device can create and store user profiles in the medical data store 1222 . When the wearable device is actuated by a user, the wearable device can obtain the user's biometric information or require the user to provide certain inputs for verifying of the user's identity. Based on the biometric information or the user inputs, the wearable device can access an appropriate profile to determine the access privileges of the user.
  • the profile can include information about the user, such as, e.g., his name, date of birth, graduate school name, graduate year, specialty, hospital information, access privileges, etc.
  • the profile includes the user's voice characteristics or eye characteristics (e.g., the iris or retina pattern) of the user.
  • the wearable device can collect and analyze the person's eye characteristics based on data acquired by the inward-facing imaging system 462 (e.g., an eye-tracking camera). For example, an image of the person's eye can be segmented to identify the iris and then an iris code can be generated, which uniquely identifies the person.
  • the wearable device can match the eye characteristics collected by the wearable device with the eye characteristics in the profile. If the wearable device determines there is a match, the wearable device can determine that the user's identity has been verified.
  • the wearable device can generate an alert indicating that the user is not authorized to use the wearable device.
  • the user can then try a different identification method.
  • the wearable device may send notifications to one or more users who are authorized to use the wearable device.
  • the notification can include the date, time, or location of the unauthorized access.
  • the wearable device may also store the notifications regarding unauthorized access in the patient's medical record.
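  • Iris-based verification of the wearer can be sketched as a Hamming-distance comparison between a captured iris code and the code stored in the user profile; the helper names and the 0.32 threshold below are illustrative assumptions, not the device's specified values.

```python
def hamming_distance(code_a: str, code_b: str) -> float:
    """Fraction of differing bits between two equal-length binary iris codes."""
    if len(code_a) != len(code_b):
        raise ValueError("iris codes must have equal length")
    return sum(a != b for a, b in zip(code_a, code_b)) / len(code_a)

def verify_user(captured_code: str, profile_code: str, threshold: float = 0.32) -> bool:
    """
    Compare the iris code generated from the inward-facing camera image against the code
    stored in the user profile; a small normalized distance indicates the same user.
    """
    return hamming_distance(captured_code, profile_code) <= threshold

stored = "1011001110001011" * 4
probe = ("1011001110001011" * 4)[:-1] + "0"    # one bit flipped
print(verify_user(probe, stored))               # True: small Hamming distance
print(verify_user("0" * 64, stored))            # False: likely a different user -> alert
```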
  • an HCP's wearable device can analyze the HCP's pose (such as, e.g., the direction of gaze) to determine the object that the HCP is currently interacting with. For example, if the HCP is staring at a window of the operating room for an extended period of time (e.g., 5 minutes, 10 minutes, etc.) rather than the heart, the wearable device may determine that the user is distracted. As a result, the wearable device can generate an alert to bring the HCP's focus back to the surgery. In some embodiments, the wearable device compiles all alerts during the surgery and generates a report.
  • the wearable device can determine that the user is a receptionist whose job duty involves scheduling appointments. Accordingly, the wearable device can automatically present an appointment scheduling tool on the 3D user interface. However, when a doctor uses the same wearable device, the wearable device may present the doctor's weekly schedule rather than the tool for scheduling the appointment.
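  • The gaze-based distraction check mentioned above can be thought of as a dwell-time test on the user's gaze target; a minimal, illustrative sketch (with assumed names and a 5-minute default) follows.

```python
def detect_distraction(gaze_samples, task_targets, dwell_threshold_s=300.0):
    """
    gaze_samples: list of (timestamp_seconds, target_label) pairs derived from the user's pose.
    Return an alert when the user has continuously looked away from task-relevant targets
    (e.g., the surgical field) for longer than the threshold.
    """
    away_start = None
    for t, target in gaze_samples:
        if target in task_targets:
            away_start = None                     # attention is back on the task
        elif away_start is None:
            away_start = t                        # start of a look-away interval
        elif t - away_start >= dwell_threshold_s:
            return {"alert": True, "message": "Attention appears to be off the surgical field."}
    return {"alert": False}

samples = [(0, "heart"), (10, "window")] + [(10 + 60 * i, "window") for i in range(1, 7)]
print(detect_distraction(samples, task_targets={"heart", "patient"}))
```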
  • FIG. 23 illustrates an example of presenting virtual content based on a user's location.
  • the user's location may be determined based on one or more environmental sensors of the wearable device.
  • the wearable device can determine the user's location based on data acquired by a GPS.
  • the wearable device can also determine the user's location based on images acquired by the outward-facing imaging system.
  • the wearable device can detect a building using a computer vision algorithm.
  • the wearable device can further determine that the building is a hospital because the image includes a sign with the word “hospital” or other characteristic landmark.
  • FIG. 23 shows two scenes 2310 and 2330 .
  • the location of the scene 2310 is a hospital 2312 while the location of the scene 2330 is a clinic 2314 .
  • the user 2302 wears a wearable device in both scenes.
  • the user 2302 may be a patient.
  • the wearable device's location sensors (such as, e.g., GPS) can acquire the patient's location data.
  • the wearable device can process the location data and determine that the user is at the hospital 2312 .
  • the wearable device can access the patient's virtual medical records, for example, by communicating with the healthcare database system 1220 .
  • the wearable device can analyze the patient's virtual medical records, and determine that the patient has an operation scheduled at the hospital 2312 . Accordingly, the wearable device can present a building map of the hospital 2312 which includes a route to the operating room.
  • the wearable device can analyze the data acquired by the outward-facing imaging system 464 (or the GPS data) of the wearable device to determine the doctor's position inside the hospital and update the route information accordingly.
  • the wearable device can also present the operating room's information 2320 on the 3D display.
  • the wearable device can present information on the procedure, such as, e.g., the risks of the procedure on the 3D display.
  • the wearable device can determine that the patient is at the clinic 2314 based on the user's location information.
  • the wearable device can accordingly display the patient's scheduling information at the clinic 2314 , such as, e.g., the time of the appointment and the physician that the patient will see at the clinic 2314 .
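  • The location-dependent selection of content described above (hospital route and operating-room information versus clinic appointment details) could be sketched as a simple lookup keyed by user and location; all names below are hypothetical.

```python
def content_for_location(user_id: str, location: str, records: dict) -> dict:
    """Choose location-specific virtual content: hospital vs. clinic, in the spirit of FIG. 23."""
    info = records.get((user_id, location), {})
    if location == "hospital":
        return {"map_route": info.get("route_to_operating_room"),
                "operating_room_info": info.get("operating_room"),
                "procedure_risks": info.get("risks")}
    if location == "clinic":
        return {"appointment_time": info.get("appointment_time"),
                "physician": info.get("physician")}
    return {}

records = {
    ("patient_2302", "hospital"): {"route_to_operating_room": "Elevator B -> Floor 3 -> OR A",
                                   "operating_room": "OR A", "risks": ["bleeding", "infection"]},
    ("patient_2302", "clinic"):   {"appointment_time": "2:30 PM", "physician": "Dr. Lee"},
}
print(content_for_location("patient_2302", "hospital", records))
print(content_for_location("patient_2302", "clinic", records))
```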
  • the user 2302 may be a doctor who works at the hospital 2312 and the clinic 2314 .
  • the wearable device can determine that the doctor will perform a surgery at operation room A. Accordingly, the wearable device can present information of operating room A (such as e.g., medical equipment that is in the operating room). The wearable device can also present a map or route information from the doctor's current location to the operating room A. The wearable device can also inform the doctor of the patient that the doctor will operate on.
  • the patient information may be stored on the hospital's network (e.g., the hospital's server), in a file cabinet, or in the healthcare database system 1220 .
  • the wearable device can automatically identify the storage location of the patient's files and access the files from the storage location.
  • the wearable device may provide an indication of the location of the patient's file to help the doctor to find the patient's file.
  • the wearable device can detect that the doctor's current location is the clinic 2314 (rather than the hospital 2312 ).
  • the wearable device can communicate with a clinic's database to retrieve relevant information of the doctor's cases, such as, e.g., the patients whom the doctor will see, the procedures that the doctor will perform in the clinic, exam rooms, voicemail messages received to the clinic after business hours, etc.
  • the wearable device can obtain the relevant information by communicating with the clinic's cloud network using Bluetooth or wireless technologies.
  • the information of the doctor's cases may be uploaded to the cloud network by an assistant from the day before.
  • the wearable device can monitor a user's environment and determine that the user has entered a specific operating room in a hospital.
  • the wearable device can access the world map 920 (such as, e.g., an internal surgical unit map) associated with this particular operating room to determine the objects associated with the operating room.
  • the world map of this operating room may be created and updated by multiple users. Details on updating and building a world map are described with reference to FIGS. 9 and 17 .
  • the wearable device can also display default parameters for this particular operating room, such as, e.g., time and patient for the next procedure, and so on.
  • the wearable device can access the nurse's schedule uploaded to the hospital network to determine the default parameters of this particular operating room.
  • FIG. 24 is a flowchart that illustrates an example process 2400 of accessing and presenting virtual content based on contextual information.
  • the process 2400 may be performed by a wearable device.
  • the wearable device accesses data acquired by one or more environmental sensors of the wearable device.
  • the acquired data may include data associated with the user or the user's environment.
  • the wearable device determines contextual information based at least on an analysis of the data acquired by the one or more environmental sensors.
  • the contextual information may include information associated with an HCP, a patient, virtual medical record, an exam or operating room, objects in the exam or operating room, a location of the user of the wearable device, interactions between the user and the patient, etc.
  • the wearable device can also access data in the remote data repository 280 or the healthcare database system 1220 and determine contextual information based on the accessed data.
  • the wearable device identifies virtual objects associated with a FOR for a user of the wearable device based at least partly on the contextual information.
  • the virtual objects may include the virtual content shared among users, past test results in a patient's medical record, documentations of interactions between users and the patient, the user's scheduling information, etc.
  • the wearable device can determine a current location of the user and identify the virtual objects associated with the current location.
  • the wearable device can present one or more of the virtual objects in the user's FOV.
  • the virtual objects associated with the doctor's FOR may include the map of the hospital (or a route to a location) and the doctor's schedule of the day.
  • the wearable device may present only the doctor's schedule of the day in the FOV. If the doctor wants to view the map of the hospital, the doctor can actuate the user input device 466 or use a pose.
  • the wearable device can determine, based on the contextual information, whether a threshold condition for generating an alert is met at block 2450 .
  • the threshold condition may include a mistake in a medical procedure.
  • the wearable device can determine the standard steps of a surgery that is performed on a patient.
  • the standard steps may include processes A, B, and C.
  • the wearable device can monitor the surgeon's actions and detect that the surgeon has skipped the process B. In response to the detection that the process B is missing, the wearable device can determine that the threshold condition is met.
  • the wearable device can determine that the threshold condition is met when the surgeon's scalpel is less than a threshold distance to the patient's left leg 2204 l when the operation should be on the patient's right leg 2204 r.
  • the wearable device can present an alert to the user of the wearable device.
  • the alert may include a focus indicator or an alert message indicating the mistake, the threshold condition, or a corrective measure of the mistake, etc.
  • if the threshold condition is not met, the process 2400 goes back to the block 2410 .
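  • One way to implement the skipped-step threshold condition of process 2400 is to compare the observed sequence of actions against the standard steps of the surgery; the following sketch (assumed names such as detect_skipped_steps) is illustrative only.

```python
def detect_skipped_steps(standard_steps, observed_steps):
    """
    Return the standard steps that appear to have been skipped, i.e., a later step was
    observed before them. The threshold condition is met whenever the result is non-empty.
    """
    position = {step: i for i, step in enumerate(standard_steps)}
    skipped, highest_seen = [], -1
    for observed in observed_steps:
        i = position.get(observed)
        if i is None:
            continue  # an action unrelated to the standard steps; ignored in this sketch
        skipped.extend(standard_steps[highest_seen + 1:i])
        highest_seen = max(highest_seen, i)
    return skipped

print(detect_skipped_steps(["A", "B", "C"], ["A", "C"]))  # ['B'] -> threshold condition met
print(detect_skipped_steps(["A", "B", "C"], ["A", "B"]))  # []   -> no alert yet
```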
  • An HCP may need to track medical instruments involved in a medical procedure or exam.
  • Instrument tracking can include counting medical instruments involved in a surgery/exam or knowing whether a foreign object (e.g., a medical instrument) enters or exits a sterile region of the patient. Knowing whether a correct medical instrument has been used or whether a medical instrument is at the correct position is very important to avoid medical malpractice and to improve the quality of the medical care. For example, a surgeon may request a certain instrument, but a different one is handed to him by a nurse. As a result, the surgeon may use the wrong instrument on the patient which can cause a medical accident. As another example, a patient may be closed up with foreign objects (such as, e.g., a gauze, a medical rag, a cellphone, etc.) from his operation if instruments are miscounted.
  • current techniques include counting (such as, e.g., by a nurse) all instruments that have entered into the operating room and ensuring that the instruments (and their locations/functions) have been properly accounted for at the end of surgery.
  • instrument tracking in medical procedures is a tedious task for nurses.
  • current techniques are prone to errors because multiple HCPs (e.g., multiple surgeons, nurses, etc.) may be involved in an operation. It may be difficult to take into account everyone's actions and usages of the medical instruments.
  • the environment of the operating room may be high stress, time sensitive, and exhausting to the HCPs. Therefore, the HCPs may accidentally forget some of the medical instruments in the operating room.
  • the wearable device can use one or more environmental sensors to identify medical instruments in a user's environment and determine whether the medical instrument is at the correct position (rather than for example, inside the patient's abdomen after a surgery). The wearable device can also determine whether the correct medical instrument has entered into a sterile field based on the data acquired by one or more sensors.
  • a wearable device can use data acquired by one or more environmental sensors alone or in combination with data stored in the remote data repository 280 to identify a foreign object (e.g., a medical instrument) and determine semantic information associated with the foreign object.
  • a wearable device can use the outward-facing imaging system 464 to obtain an image of the user's environment.
  • the object recognizers 708 can detect a medical instrument in the image.
  • the wearable device can also communicate with the remote data repository 280 (or use the local processing data module 260 ) to determine semantic information associated with the detected medical instrument, such as, for example, a name, a type, a use, a function of the instrument, whether the instrument is in a set of instruments, etc.
  • the wearable device can also use one or more optical sensors for identifying the medical instrument.
  • a medical instrument may include an optical label such as, e.g., a quick response (QR) code or a barcode.
  • the optical label may be placed on an exterior surface of the medical instrument.
  • the optical label may encode the identity of the medical instrument (such as, e.g., an identifier associated with the medical instrument).
  • the optical label may also encode semantic information associated with the medical instrument.
  • the one or more optical sensors of the wearable device can scan the optical label.
  • the wearable device can parse the information encoded in the optical label.
  • the wearable device can communicate with a remote data repository 280 to obtain additional information of the optical label.
  • the wearable device can extract an identifier of the medical instrument and communicate with the remote data repository to get the semantic information of the medical instrument.
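  • Parsing an optical label payload and resolving the extracted identifier against a repository of semantic information might look like the sketch below; the payload format (key=value pairs) and all names are assumptions for illustration, and the actual QR/barcode decoding step is omitted.

```python
def parse_optical_label(payload: str) -> dict:
    """
    Parse a label payload assumed to be encoded as key=value pairs separated by ';',
    e.g., 'id=INSTR-0042;type=scalpel'.
    """
    return dict(part.split("=", 1) for part in payload.split(";") if "=" in part)

def semantic_info(instrument_id: str, repository: dict) -> dict:
    """Look up additional semantic information (name, use, set membership) by identifier."""
    return repository.get(instrument_id, {"name": "unknown instrument"})

repository = {"INSTR-0042": {"name": "scalpel", "use": "incision", "set": "general surgery"}}
label = parse_optical_label("id=INSTR-0042;type=scalpel")
print(semantic_info(label["id"], repository))
```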
  • the wearable device may also incorporate or utilize electromagnetic tracking sensors to track the location of objects in the environment.
  • the medical instrument may also have an electromagnetic label, such as, e.g., an RFID tag.
  • the electromagnetic label can emit signals that can be detected by the wearable device.
  • the wearable device may be configured to be able to detect signals with certain frequencies.
  • the wearable device can send a signal to the medical instrument. Based on the feedback received from the medical instrument, the wearable device can identify the medical instrument.
  • the wearable device may communicate with the medical instrument via a wired or a wireless network.
  • the wearable device may provide a focus indicator near the medical instrument. For example, the wearable device can identify a gauze next to the physician and display a green halo around the gauze. The wearable device can assign the focus indicator based on the semantic information of the medical instrument. For example, the wearable device can assign a green halo around the gauze while assigning a blue halo around a scalpel.
  • the wearable device can track the position of a medical instrument based on the data acquired by one or more environmental sensors. For example, the wearable device can determine the position of the medical instrument over time based on data acquired by the outward-facing imaging system 464 . As an example, the wearable device can identify a scalpel in a tray at time 1 and identify the same scalpel in a surgeon's hand at a later time.
  • instrument tracking may be achieved based on data collected from multiple wearable devices because each wearable device may have a limited FOV and cannot observe the whole operating room. For example, a scalpel may appear in a tray at time 1 based on an image acquired by the nurse's wearable device. The position of the scalpel may later be updated to be the surgeon's hand based on an image acquired by the surgeon's wearable device even though the nurse's wearable device might not perceive that the surgeon has picked up the scalpel.
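  • Fusing instrument sightings from multiple wearable devices can be reduced to keeping the most recent time-stamped observation per instrument, as in this illustrative sketch (device names and labels are hypothetical).

```python
from typing import Dict, List, Tuple

Observation = Tuple[float, str, str]  # (timestamp_seconds, instrument_id, location_label)

def fuse_observations(per_device: Dict[str, List[Observation]]) -> Dict[str, Tuple[float, str]]:
    """Merge sightings from several devices, keeping the latest location for each instrument."""
    latest: Dict[str, Tuple[float, str]] = {}
    for device, observations in per_device.items():
        for t, instrument, location in observations:
            if instrument not in latest or t > latest[instrument][0]:
                latest[instrument] = (t, location)
    return latest

observations = {
    "nurse_device":   [(100.0, "scalpel_1", "tray")],
    "surgeon_device": [(140.0, "scalpel_1", "surgeon_hand")],
}
print(fuse_observations(observations))  # scalpel_1 -> (140.0, 'surgeon_hand')
```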
  • FIG. 25 schematically illustrates an example of a medical procedure occurring in an operating room having a sterile region 2510 .
  • two surgeons 2504 and 2506 are involved in the procedure.
  • the two surgeons may wear their respective wearable devices while performing the surgery.
  • the sterile region (also referred to as sterile field) 2510 may be a user-defined area in a physical environment or in the user's FOR.
  • the sterile region 2510 may be associated with a region of the patient (or the operating room) that was disinfected prior to the surgery.
  • the sterile region 2510 may be the region that will be or is cut open in a surgery.
  • the wearable device can present a world map 920 of the operating room to an HCP (such as, e.g., the surgeon 2504 , 2506 , a nurse, or another HCP).
  • the HCP can mark, on the world map 920 which region in the operating room is the sterile region 2510 .
  • the wearable device can automatically access the patient's virtual medical records or the information associated with the surgery to determine the sterile region.
  • a portion of the sterile region 2510 may overlap with a portion of the surgeon's 2504 FOV or a portion of the surgeon's 2506 FOV.
  • the surgeon 2504 can perceive virtual objects 2520 , 2512 , 2514 , 2518 as well as medical instruments 2530 in his FOR.
  • the medical instruments 2530 are also in the sterile region 2510 which may be part of the surgeon's 2504 FOR.
  • the virtual objects 2512 , 2514 , 2518 may be associated with the patient's virtual medical record, the patient's physiological data (such as heart rate, respiratory rate, etc.), information associated with the surgery (such as, e.g., the steps of the surgery, an enlarged view of the organ that is being operated on, etc.), and so on.
  • the virtual object 2520 includes a list of medical instruments 2530 in the sterile region 2510 .
  • Medical instruments 2530 can include 2 instruments A, 1 instrument B, 1 instrument C, 1 instrument D, and 2 sponges.
  • the surgeon 2504 can perceive in his FOV at least a subset of the virtual and physical objects in his FOR. For example, while the surgeon 2504 is operating on the patient, the surgeon 2504 can perceive the virtual object 2514 and the two sponges via the 3D user interface of his wearable device.
  • the wearable device can monitor the physical objects in the FOV. For example, the wearable device of the surgeon 2504 can constantly scan for recognized objects in the surgeon's 2504 FOV. The wearable device can also scan for new objects entering into the FOV. For example, during the procedure, the surgeon can bring a scalpel inside the FOV. The wearable device can detect that the scalpel has entered into his FOV. If this scalpel was not previously identified by the wearable device, the wearable device can use computer vision techniques or scan the optical/electromagnetic label associated with the scalpel to recognize that a new object has entered the FOV.
  • the wearable device can confirm whether a medical instrument entering into the sterile region (or an FOV of a HCP) is the correct medical instrument.
  • the wearable device can perform such confirmation based on data acquired by one or more environmental sensors. For example, the wearable device can collect audio data using the microphone 232 .
  • the wearable device of an HCP can analyze the audio data to identify a medical instrument mentioned in the audio data.
  • the wearable device can also monitor the objects entering into the HCP's FOV and determine whether the medical instrument entered into the FOV matches the medical instrument identified in the audio data. If the medical instrument entered into the FOV does not match the medical instrument identified in the audio data, the wearable device can provide an alert to the HCP.
  • the alert may be a focus indicator.
  • the focus indicator may be displayed around the non-matching medical instrument entered into the FOV.
  • the alert may also be an alert message which can include a brief description explaining that a non-matching medical instrument has entered into the FOV.
  • the alert may include the name of the non-matching medical instrument and the correct medical instrument requested, and a statement that the wrong medical instrument has entered into the FOV.
  • the alert may be presented on the 3D virtual user interface of the wearable device or be presented via the speaker 240 of the wearable device.
  • the surgeon 2504 may ask a nurse to hand him a scalpel in a surgery.
  • the nurse hands over a pair of scissors.
  • the wearable device can identify the word “scalpel” using speech recognition and determine that the doctor has requested a scalpel.
  • the wearable device can also detect the pair of scissors handed over by the nurse using computer vision techniques. Because the pair of scissors is not the scalpel requested by the surgeon, the wearable device can provide a red halo around the pair of scissors indicating that the pair of scissors is not the correct medical instrument.
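  • The comparison between the instrument named in the HCP's speech and the instrument recognized entering the FOV could be sketched as a simple string match that emits a focus-indicator alert on mismatch; the names below are assumptions.

```python
def check_handed_instrument(spoken_request: str, recognized_instrument: str):
    """
    Compare the instrument requested in speech (from speech recognition) with the instrument
    recognized entering the FOV (from computer vision); a mismatch yields an alert, e.g.,
    a red halo around the non-matching instrument.
    """
    requested = spoken_request.lower()
    recognized = recognized_instrument.lower()
    if recognized in requested or requested in recognized:
        return {"alert": False}
    return {"alert": True,
            "focus_indicator": {"color": "red", "target": recognized_instrument},
            "message": f"Requested '{spoken_request}', but '{recognized_instrument}' entered the FOV."}

print(check_handed_instrument("hand me the scalpel", "scissors"))  # alert
print(check_handed_instrument("hand me the scalpel", "scalpel"))   # no alert
```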
  • the wearable device can also confirm whether the correct medical instrument has entered into the FOV or the sterile region based on data acquired from the environmental sensors and data accessed from the remote data repository 280 (or data from local processing & data module 260 ).
  • the remote data repository 280 may include information on a set of surgical instruments that are used in a certain type of surgery.
  • the information on the set of surgical instruments may include the type, name, quantity, position, function, etc., of the surgical instruments used in the type of surgery.
  • Information about the instrument can also be uploaded to the patient chart (e.g., which stent was used, which implant was inserted with part number and manufacturer information, etc.). Such information can be added to the procedure file in the patient's medical records.
  • the wearable device can determine a type of surgery that will be or is being performed on the patient based on the patient's virtual medical records.
  • the wearable device can identify the set of surgical instruments associated with the type of surgery based on the data stored in the remote data repository 280 .
  • the wearable device can also identify the surgical instruments in the sterile region based on the acquired images and determine whether the sterile region includes a surgical instrument that does not belong to the identified set of surgical instruments.
  • the wearable device can continuously monitor the objects entering into (or exiting) the sterile region or FOVs of the respective surgeons 2504 , 2506 , update the list of surgical instruments on the virtual object 2520 , and alert the surgeons 2504 , 2506 if a wrong surgical instrument has entered into the sterile region (or the FOVs).
  • a set of surgical instruments for an appendectomy may include various scissors, forceps, retractors, gallipots, kidney dishes, towel clamps, etc.
  • the medical instruments 2530 in the sterile region 2510 shown in FIG. 25 may include 2 scissors, one forceps, one retractor, and 1 bone curette.
  • the wearable device can present the list of surgical instruments in the sterile region 2510 as shown by the virtual object 2520 .
  • the bone curette is not in the set of surgical tools for appendectomy.
  • the wearable device can provide an alert to the surgeon 2504 indicating that the bone curette has entered the sterile region 2510 .
  • the wearable device can show the phrase “bone curette” in a different color on the virtual object 2520 .
  • the surgeon can remove the bone curette from the sterile region 2510 .
  • the wearable device can remove the phrase “bone curette” from the virtual object 2520 .
  • the wearable device can track the objects entering and exiting the sterile region 2510 using one or more environmental sensors and local processing and data module 260 .
  • the wearable device can also communicate with the remote processing module 270 and the remote data repository 280 to track and count medical instruments.
  • the wearable device can keep a list of medical instruments in the sterile region 2510 . If a wearable device detects that a medical instrument enters into the sterile region, the wearable device can add the medical instrument to the list. If a medical instrument is removed from the sterile region, the wearable device can deduct the removed medical instrument from the list. The wearable device can present a current list of instruments in the sterile region using the virtual object 2520 shown in FIG. 25 . In some implementations, the wearable device can display one or more focus indicators showing the medical instrument that is currently being used by the user is in (or out of) the sterile region.
  • Data associated with tracking the medical instruments may be analyzed to determine whether all the instruments entering into the sterile region have been properly accounted for. For example, the wearable device can determine whether all medical instruments in the sterile region have exited the sterile region. For the medical instruments that did not exit the sterile region, the wearable device can determine whether they should be left within the sterile region. As an example, based on the tracking data, the wearable device may identify that a piece of gauze and a surgical thread are still in the sterile region at the end of the surgery. The wearable device can further determine that the position of the surgical thread is proper because the surgical suture is used to hold body tissues together after the surgery.
  • the wearable device may provide an alert to the surgeon 2504 indicating that the piece of gauze is still in the sterile region.
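  • Maintaining the list of objects in the sterile region and accounting for them at the end of the procedure can be sketched as a small tracker class; the allowlist of items permitted to remain (e.g., surgical thread) and all names are illustrative assumptions.

```python
class SterileRegionTracker:
    """Track objects entering and exiting the sterile region and flag items left behind."""

    def __init__(self, allowed_to_remain=("surgical thread",)):
        self.in_region = set()
        self.allowed_to_remain = set(allowed_to_remain)

    def object_entered(self, name: str):
        self.in_region.add(name)        # e.g., detected by the outward-facing imaging system

    def object_exited(self, name: str):
        self.in_region.discard(name)

    def end_of_procedure_alerts(self):
        leftover = self.in_region - self.allowed_to_remain
        return [f"{item} is still in the sterile region" for item in sorted(leftover)]

tracker = SterileRegionTracker()
for item in ["scalpel", "gauze", "surgical thread"]:
    tracker.object_entered(item)
tracker.object_exited("scalpel")
print(tracker.end_of_procedure_alerts())  # ['gauze is still in the sterile region']
```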
  • Multiple wearable devices may be used to collectively maintain the accuracy of which medical instruments have entered or exited the sterile region 2510 .
  • the remote computing system 1720 (shown in FIG. 17 ) can maintain a list of instruments in the sterile field (as shown by the virtual object 2520 ).
  • the remote computing system 1720 can update the list of instruments based on data acquired by the wearable device of the surgeon 2504 and the wearable device of the surgeon 2506 .
  • Instrument tracking based on data from multiple wearable devices may be beneficial because multiple users may bring the medical instrument into or out of the sterile field.
  • the FOV of a user's wearable device may cover only a portion of the sterile region 2510 and may not be able to track every object in the sterile region 2510 .
  • a user may look away from the sterile region 2510 or leaves the operating room, other users that are interacting with the sterile region 2510 can continue to track the medical instruments entering or exiting the sterile region 2510 .
  • the wearable device can identify a cellphone and track the position of the cellphone to make sure that the cellphone is not accidentally left inside of the patient's body.
  • similar techniques can also be used for monitoring and tracking objects in other regions, such as, e.g., the operating room or the FOV of an HCP.
  • FIG. 26 is a flowchart that illustrates an example process 2600 of tracking medical objects in a sterile region.
  • the example process 2600 may be performed by the wearable device 210 alone or in combination with the remote computing system 1720 .
  • the wearable device 210 may be a wearable device.
  • the wearable device identifies a sterile region in a FOR associated with the wearable device.
  • the sterile region can be determined based on data obtained from an external database, such as, e.g., the database system 1220 or the remote data repository 280 .
  • the sterile region may be determined based on the hospital protocols associated with an operating room, the type of surgery performed on the patient, the patient's body, etc.
  • the sterile region can also be marked by an HCP.
  • the wearable device can identify an object entered into the sterile region.
  • the wearable device can use data acquired from one or more environmental sensors to identify the object.
  • the wearable device can detect the object using computer vision algorithms or scan an optical label associated with the object.
  • the wearable device can access information associated with the object.
  • the wearable device can communicate with the remote data repository 280 or the healthcare database management system 1220 to determine semantic information associated with the object.
  • the information associated with the object may include the name of the object, the type of the object, the position of the object, etc.
  • the wearable device can track the positions of the object in the sterile region. For example, the wearable device can use computer vision algorithm to detect that a scalpel has been moved from a tray to a surgeon's hand.
  • the wearable device can keep a record of all the objects entering into the sterile region and exiting the sterile region. For example, the wearable device can maintain a list of objects that are currently in the sterile region.
  • the wearable device can detect an anomaly associated with the object in the sterile region. The wearable device can make the detection based on the information associated with the object or information associated with the medical procedure (such as the type of the surgery, surgical instruments required for the surgery, etc.). For example, the wearable device can determine whether the sterile region includes a medical instrument that is unrelated to the surgery being performed by the user of the wearable device. The wearable device can also determine whether the user of the wearable device has received a surgical instrument which was not requested.
  • the wearable device can detect that the object has left the sterile region. For example, the wearable device can detect instruments A, B, and C in a first image of the sterile region. However, in the second image of the sterile region, the wearable device only detects instruments B and C. Accordingly, the wearable device can determine that the instrument A has left the sterile region and remove the instrument A from the list of objects that are currently in the sterile region.
  • the wearable device can continue to track the position of the object in the FOR of the wearable device even though the object has left the sterile region. For example, the wearable device can determine whether a needle has been properly disposed by tracking the position of the needle.
  • a wearable device configured to present virtual healthcare content to a wearer
  • the wearable device comprising: a display configured to present virtual content to the wearer, at least a portion of the display being transparent and disposed at a location in front of a wearer's eye when the wearer wears the wearable device such that the transparent portion transmits light from a portion of the environment in front of the wearer to the wearer's eye to provide a view of the portion of the environment in front of the wearer; and a hardware processor in communication with the display, the hardware processor programmed to: receive an access privilege associated with at least a portion of a patient medical record for a patient; access, based at least in part on the access privilege, the at least a portion of the patient medical record; and instruct the display to present to the wearer virtual content relating to the at least a portion of the patient medical record.
  • the wearable device of aspect 1 wherein the wearable device comprises an inward-facing imaging system configured to capture an eye image of an eye of the wearer, and the processor is programmed to: analyze the eye image to determine a biometric identification element; and receive a confirmation that the biometric identification element matches an authorized biometric identification element for the wearer.
  • the wearable device of aspect 2 wherein the biometric identification element comprises an iris scan or a retinal scan.
  • the wearable device of any one of aspects 1-3 wherein the wearable device further comprises a sensor to determine a location of the wearer, and the access privilege is based at least partly on the location of the wearer.
  • the wearable device of any one of aspects 1-4 wherein the wearable device further comprises an environmental sensor, and the hardware processor is programmed to: obtain environmental data from the environmental sensor; recognize a data capture trigger from the obtained environmental data; and initiate data capture from at least one data capture element disposed on the wearable device.
  • the wearable device of aspect 5 wherein the environmental sensor comprises a microphone, the environmental data comprises an audio recording, and the data capture trigger comprises a spoken initiation command.
  • the wearable device of aspect 5 or aspect 6 wherein the environmental sensor comprises an outward-facing imaging system, the environmental data comprises images of the environment, and the data capture trigger comprises a gesture made by the wearer.
  • the wearable device of any one of aspects 5-7 wherein the hardware processor is programmed to recognize a data termination trigger from the obtained environmental data and terminate data capture from the at least one data capture element.
  • the wearable device of any one of aspects 5-8 wherein the data capture element comprises a microphone or an outward-facing imaging system.
  • the wearable device of any one of aspects 1-9 wherein the hardware processor is programmed to: display a virtual user interface to the wearer, the virtual user interface comprising functionality to permit the wearer to: input data to the at least a portion of the patient medical record; update the at least a portion of the patient medical record; organize the at least a portion of the patient medical record; or save changes to the at least a portion of the patient medical record.
  • the wearable device of any one of aspects 1-10 wherein the hardware processor is programmed to communicate the at least a portion of the patient medical record to a data store.
  • the wearable device of aspect 11 wherein the data store is external to the wearable device, and the communication is over a wireless network.
  • a healthcare computing environment comprising: the wearable device of any one of aspects 1-12; a healthcare database system comprising: a network interface configured to communicate with the wearable device; non-transitory data storage configured to store the patient medical record.
  • a healthcare provider (HCP) wearable device that comprises: a second display configured to present virtual content to the HCP, at least a portion of the display being transparent and disposed at a location in front of a HCP's eye when the HCP wears the HCP wearable device such that the transparent portion transmits light from a portion of the environment in front of the HCP to the HCP's eye to provide a view of the portion of the environment in front of the HCP; an environmental sensor; and a second hardware processor in communication with the second display, the second hardware processor programmed to include: an object recognizer configured to analyze data from the environmental sensor; and a data management system configured to permit the HCP to access the at least a portion of the patient medical record.
  • the wearable device of any one of aspects 1-15 wherein the hardware processor is programmed to store attempts to access the at least a portion of the patient medical record in an access log.
  • the wearable device of any one of aspects 1-16 wherein the hardware processor is programmed to verify an identity of the wearer.
  • the wearable device of aspect 17 wherein, to verify the identity of the wearer, the hardware processor is programmed to utilize an iris code or a retinal scan.
  • the wearable device of any one of aspects 17-19 wherein, in response to the identity of the wearer not being verified, the hardware processor is programmed to cease display of the portion of the patient medical record.
  • a method of monitoring a user's environment in a healthcare setting comprising: under control of a wearable device comprising a display configured to present virtual content to the user of the wearable device, an environmental sensor, and a hardware processor in communication with the display and the environmental sensor: analyzing data acquired by the environmental sensor to detect an initiation condition; determining that the initiation condition is present; documenting a healthcare event using the environmental sensor to provide healthcare documentation; analyzing the data acquired by the environmental sensor to detect a termination condition; determining that the termination condition is present; and ceasing the documentation of the healthcare event using the environmental sensor.
  • the method of aspect 21, wherein the environmental sensor comprises a microphone or an outward-facing imaging system.
  • the method of aspect 21 or aspect 22 wherein the environmental sensor comprises a first environmental sensor and a second environmental sensor that is different from the first environmental sensor, and analyzing the data acquired by the environmental sensor to detect an initiation condition comprises analyzing data acquired by the first environmental sensor; and documenting the healthcare event using the environmental sensor comprises documenting the healthcare event using the second environmental sensor.
  • documenting the healthcare event comprises: analyzing the audio recording to determine information spoken by a patient about a patient condition.
  • the method of any one of aspects 21-26 further comprising communicating at least a portion of the healthcare documentation to a data store.
  • the method of any one of aspects 21-27 further comprising updating a patient medical record to include at least a portion of the healthcare documentation.
  • a wearable device configured to present virtual healthcare content to a wearer, the wearable device comprising: a display configured to present virtual content to the wearer, at least a portion of the display being transparent and disposed at a location in front of a wearer's eye when the wearer wears the wearable device such that the transparent portion transmits light from a portion of the environment in front of the wearer to the wearer's eye to provide a view of the portion of the environment in front of the wearer; and a hardware processor in communication with the display, the hardware processor programmed to: detect a triggering event for sharing virtual healthcare content with a second wearable device; verify the second wearable device has an access privilege sufficient to present the virtual healthcare content; if the second wearable device has sufficient access privilege, share the virtual healthcare content with the second wearable device; and if the second wearable device has insufficient access privilege, present an indication to the wearer that the second wearable device has insufficient access privilege. (An illustrative access-privilege sketch appears after this list.)
  • the wearable device of aspect 29 wherein the hardware processor is programmed to: receive a modification of the virtual healthcare content made by the second wearable device; and present the modified virtual healthcare content to the wearer of the wearable device.
  • the wearable device of aspect 29 or 30, wherein the virtual healthcare content comprises a patient medical record.
  • the wearable device of any one of aspects 29-31 wherein the virtual healthcare content comprises information obtained from a medical device.
  • the wearable device of any one of aspects 29-32 wherein the virtual healthcare content comprises an image of a portion of a patient.
  • the wearable device of any one of aspects 29-33 wherein the hardware processor is programmed to accept wearer input to modify the virtual healthcare content.
  • a wearable device configured to present virtual healthcare content to a wearer, the wearable device comprising: a display configured to present virtual content to the wearer, at least a portion of the display being transparent and disposed at a location in front of a wearer's eye when the wearer wears the wearable device such that the transparent portion transmits light from a portion of the environment in front of the wearer to the wearer's eye to provide a view of the portion of the environment in front of the wearer; an environmental sensor configured to obtain environmental data about an environment of the wearer; and a hardware processor in communication with the display and the environmental sensor, the hardware processor programmed to: access the environmental data obtained by the environmental sensor; determine contextual information based at least partly on the environmental data; identify a virtual object associated with a field of view (FOV) of the wearer of the wearable device based at least partly on the contextual information; and present virtual content relating to the virtual object in the FOV of the wearer of the wearable device. (An illustrative contextual-relevance sketch appears after this list.)
  • the wearable device of aspect 35 wherein the environmental sensor comprises a microphone, an outward-facing imaging system, an inward-facing eye-tracking system, a bar code reader, or a GPS sensor.
  • the wearable device of aspect 35 or aspect 36 wherein the contextual information comprises a location of the wearer, a pose of the wearer, an emotional state of the wearer, a level of access privileges, a symptom or a condition of a patient in the FOV of the wearer, or an identity of a patient in the FOV of the wearer.
  • the wearable device of any one of aspects 35-37 wherein the contextual information comprises information associated with a physical object in the wearer's environment or a virtual object presented to the wearer.
  • the wearable device of any one of aspects 35-38 wherein the environmental sensor comprises an imaging system, and to determine the contextual information the hardware processor is programmed to analyze images captured by the imaging system using a computer vision algorithm, a facial recognition algorithm, or a neural network.
  • the wearable device of any one of aspects 35-39 wherein the virtual content comprises a portion of a patient medical record, an alert, a focus indicator, or a message.
  • the wearable device of any one of aspects 35-40 wherein the hardware processor is programmed to: determine whether the contextual information passes a threshold condition; if the threshold condition is passed, display updated virtual content to the wearer of the wearable device.
  • the wearable device of any one of aspects 35-41 wherein: the contextual information comprises information about medical instruments used during a medical procedure on a patient, and the virtual content comprises information relating to location of the medical instruments.
  • the wearable device of aspect 42 wherein the virtual content further comprises an alert indicating that a medical instrument remains in the body of the patient.
  • the wearable device of aspect 42 or aspect 43 wherein the virtual content comprises an alert that a medical instrument in the FOV of the wearer is inappropriate or unrequested for the medical procedure.
  • the wearable device of any one of aspects 35-44 wherein the contextual information comprises information about a patient body part.
  • the wearable device of aspect 45 wherein the virtual content comprises a virtual flag associated with the patient body part.
  • the wearable device of any one of aspects 35-46 wherein the hardware processor is further programmed to communicate the virtual content to a data repository.
  • the wearable device of any one of aspects 35-47 wherein the hardware processor is further programmed to update a patient medical record with the virtual content.
  • a wearable device configured to present virtual healthcare content to a wearer, the wearable device comprising: a display configured to present virtual content to the wearer, at least a portion of the display being transparent and disposed at a location in front of a wearer's eye when the wearer wears the wearable device such that the transparent portion transmits light from a portion of the environment in front of the wearer to the wearer's eye to provide a view of the portion of the environment in front of the wearer; an environmental sensor configured to obtain environmental data about an environment of the wearer; and a hardware processor in communication with the display and the environmental sensor, the hardware processor programmed to: identify a sterile region in a field of regard (FOR) of the wearable device; identify an object entered into the sterile region; access information associated with the object; and track a position of the object in the sterile region.
  • the wearable device of aspect 49 wherein the hardware processor is further programmed to detect an anomaly associated with the object in the sterile region based at least partly on the information associated with the object.
  • the wearable device of aspect 50 wherein the hardware processor is programmed to present an alert to the wearer based at least partly upon the detection of the anomaly.
  • the wearable device of any one of aspects 49-51 wherein the hardware processor is programmed to detect that the object has left the sterile region.
  • the wearable device of any one of aspects 49-52 wherein the hardware processor is programmed to present to the wearer virtual content associated with the object.
  • the wearable device of aspect 53 wherein the virtual content comprises a list of objects present in the sterile region.
  • the wearable device of aspect 54 wherein the list further includes positions of the objects.
  • the wearable device of any one of aspects 49-55 wherein the environmental sensor comprises a microphone, an outward-facing imaging system, a bar code reader, or an electromagnetic tracking system.
  • a wearable system for managing medical information comprising: a head-mounted display (HMD) comprising a display configured to present virtual content to a user; one or more environmental sensors configured to obtain data associated with the user's environment; a hardware processor in communication with the display and the one or more environmental sensors, and programmed to: monitor the user's environment via the one or more environmental sensors; detect an initiation condition based at least partly on first data acquired by the one or more environmental sensors; document at least a portion of an interaction with a patient via an environmental sensor in response to the detection of the initiation condition, wherein the portion of the interaction comprises second data acquired by the environmental sensor; analyze the second data to extract relevant medical information related to the interaction with the patient based on contextual information; and initiate storage of the relevant medical information to a healthcare database system.
  • the wearable system of aspect 57 wherein the one or more environmental sensors comprise at least an outward-facing camera or a microphone.
  • the wearable system of aspect 57 or 58 wherein, to analyze the second data to extract relevant medical information, the hardware processor is programmed to: determine an audio stream spoken by the patient or the user of the wearable system; convert the audio stream to text; and parse the text to identify phrases describing the patient's medical condition or history. (An illustrative extraction sketch appears after this list.)
  • the wearable system of any one of aspects 57-59 wherein, to initiate storage of the relevant medical information, the hardware processor is programmed to: verify the patient's identity based at least partly on the data acquired by the one or more environmental sensors; and update a medical record stored in the healthcare database with the relevant medical information captured from the interaction between the patient and the user.
  • the wearable system of any one of aspects 57-60 wherein the hardware processor is further programmed to: detect a triggering event for sharing healthcare information with a second wearable system; determine an access privilege associated with the second wearable system; and cause at least a portion of the healthcare information to be communicated to the second wearable system in response to a determination that the second wearable system has the access privilege.
  • the wearable system of aspect 61 wherein the hardware processor is programmed to provide an indication to the second wearable system in response to a determination that the second wearable system has insufficient access privilege.
  • the wearable system of aspect 61 or 62 wherein the access privilege associated with the second wearable system is configured by the patient.
  • the wearable system of any one of aspects 61-63 wherein the healthcare information comprises at least a portion of a field of view (FOV) of the user as captured by an outward-facing camera.
  • the wearable system of any one of aspects 61-64 wherein the hardware processor is programmed to share the healthcare information and an annotation associated with the healthcare information with the second wearable system.
  • the wearable system of any one of aspects 57-65 wherein the contextual information comprises at least one of a location of the user, a pose of the user, a level of access privilege of the user, a symptom or a condition of the patient in the FOV of the user, or an identity of the patient in the FOV of the user.
  • the wearable system of any one of aspects 57-66 wherein the hardware processor is further programmed to cause the head-mounted display to present virtual content to the user related to the interaction with the patient.
  • the wearable system of aspect 67 wherein the virtual content comprises at least one of a portion of a patient medical record or information related to the patient's physiological parameters received from a medical device.
  • the wearable system of aspect 67 or 68 wherein the contextual information comprises information about medical instruments used during a medical procedure on a patient, and the virtual content comprises information relating to location of the medical instruments.
  • the wearable system of aspect 69 wherein the virtual content comprises an alert that a medical instrument in the FOV of the user is inappropriate or unrequested for a medical procedure.
  • a method for managing medical information comprising: under control of a hardware processor: monitoring a user's environment based on data acquired by a wearable device comprising an environmental sensor; detecting an initiation condition based at least partly on first data acquired by the wearable device; documenting an interaction between a patient and a healthcare provider (HCP) via the environmental sensor of the wearable device in response to the detection of the initiation condition, wherein the interaction comprises second data acquired by the environmental sensor; analyzing the second data to extract relevant medical information related to the interaction with the patient based on contextual information; and initiating storage of the relevant medical information to a healthcare database system.
  • the method of aspect 71 wherein the environmental sensor comprises a microphone, and analyzing the second data to extract relevant medical information comprises: acquiring, via the microphone, an audio stream spoken by the patient or the HCP; converting the audio stream to text; and parsing the text to identify phrases describing the patient's medical condition or history.
  • the method of aspect 71 or 72, wherein initiating storage of the relevant medical information comprises: verifying the patient's identity based at least partly on the data acquired by the wearable device; and updating a medical record stored in the healthcare database with the relevant medical information captured from the interaction between the patient and the HCP.
  • the method of any one of aspects 71-73 further comprising: detecting a triggering event for sharing healthcare information with a second wearable device; determining an access privilege associated with the second wearable device; and causing at least a portion of the healthcare information to be communicated to the second wearable device in response to a determination that the second wearable device has the access privilege.
  • the method of aspect 74 or 75 wherein the healthcare information comprises at least a portion of a field of view (FOV) of the user as captured by an outward-facing imaging system.
  • the method of aspect 76 further comprising sharing the healthcare information and an annotation associated with the healthcare information with the second wearable device.
  • the method of any one of aspects 71-77 wherein the contextual information comprises at least one of a location of the user, a pose of the user, a level of access privilege of the user, a symptom or a condition of the patient in the FOV of the user, or an identity of the patient in the FOV of the user.
  • the method of any one of aspects 71-78 further comprising causing the wearable device to display, via a head-mounted display, virtual content comprising at least one of a portion of a patient medical record or information related to the patient's physiological parameters received from a medical device.
  • the method of aspect 79 wherein the virtual content comprises an alert that a medical instrument in the FOV of the user is inappropriate or unrequested for a medical procedure.
  • a first wearable device for managing medical information, the first wearable device comprising: an outward-facing imaging system configured to image an environment of a user; a head-mounted display configured to present virtual content to the user; and a hardware processor programmed to: monitor objects in a user's environment via data received from the outward-facing imaging system; determine objects in the user's field of view as perceived through the head-mounted display; detect a triggering event for a sharing session with a second wearable device, wherein the sharing session comprises sharing at least first information associated with a first physical object in the user's field of view with a second wearable device, wherein the first information is outside of a field of view of the second wearable device; communicate the first information to the second wearable device; receive virtual content from the second wearable device wherein the virtual content comprises second information associated with a second physical object which is outside of the user's field of view; and present the virtual content received from the second wearable device to the user via the head-mounted display.
  • the first wearable device of aspect 81 wherein the hardware processor is further programmed to: receive an annotation associated with the first physical object in the user's field of view via the first wearable device, and wherein to communicate the first information to the second wearable device, the hardware processor is programmed to communicate the annotation and an image of the first object to the second wearable device.
  • the first wearable device of aspect 82 wherein the annotation comprises one or more virtual flags placed on a portion of a patient's body part, wherein the one or more virtual flags indicate an orientation of the portion of the patient's body part.
  • the first wearable device of any one of aspects 81-83 wherein the sharing session is part of an interaction between the user and a patient, and the hardware processor is further programmed to: document at least a portion of the interaction between the user and the patient using at least one of the outward-facing imaging system or a microphone; identify relevant medical information from the interaction; and update a medical record of the patient with the relevant medical information.
  • a method for managing medical information comprising: under control of a first wearable device comprising an outward-facing imaging system, a hardware processor, and a head-mounted display: monitoring objects in a user's environment via the outward-facing imaging system; determining objects in the user's field of view as perceived through the head-mounted display; detecting a triggering event for a sharing session with a second wearable device, wherein the sharing session comprises sharing at least first information associated with a first physical object in the user's field of view with a second wearable device, wherein the first information is outside of a field of view of the second wearable device; communicating the first information to the second wearable device; receiving virtual content from the second wearable device wherein the virtual content comprises second information associated with a second physical object which is outside of the user's field of view; and presenting the virtual content received from the second wearable device to the user via the head-mounted display.
  • the method of aspect 85 further comprising: receiving an annotation associated with the first physical object in the user's field of view via the first wearable device, and wherein communicating the first information to the second wearable device comprises communicating the annotation and an image of the first object to the second wearable device.
  • the method of aspect 86 wherein the annotation comprises one or more virtual flags placed on a portion of a patient's body part, wherein the one or more virtual flags indicate an orientation of the portion of the patient's body part, an orientation of a camera that captured the image, or contextual information associated with the image or the portion of the patient's body part.
  • the method of any one of aspects 85-87 wherein the sharing session is part of an interaction between the user and a patient and the method further comprises: documenting the interaction between the user and the patient using at least one of the outward-facing imaging system or a microphone; identifying relevant medical information from the interaction; and updating a medical record of the patient with the relevant medical information.
  • the method of any one of aspects 85-88 further comprising: verifying an access privilege of the second wearable device; and sharing at least a portion of the first information to which the access privilege of the second wearable device is sufficient.
  • the method of aspect 89 wherein the access privilege is managed by a patient whose medical information is being shared between the first wearable device and the second wearable device.
  • a wearable device comprising: an outward-facing camera configured to image a field of regard (FOR) of a user, the FOR comprising a portion of the environment around the user that is capable of being perceived by the user via the wearable device; a head-mounted display configured to present virtual content to the user, wherein at least a portion of the display is transparent such that the transparent portion transmits light from a portion of the environment in front of the wearer to the wearer's eye to provide a view of the portion of the environment in front of the wearer; a hardware processor in communication with the head-mounted display and the outward-facing camera, the hardware processor programmed to: determine a sterile region in the FOR, wherein the sterile region comprises an area that is disinfected prior to a medical procedure; analyze data acquired by the outward-facing camera to identify a physical object entered into the sterile region; access information associated with the physical object; track a position of the physical object in the sterile region via the data acquired by the outward-facing camera; and cause the head-mounted display to present virtual content associated with the medical procedure. (An illustrative sterile-region bookkeeping sketch appears after this list.)
  • the wearable device of aspect 91 wherein the sterile region comprises at least a portion of the patient's body.
  • the wearable device of aspect 91 or 92 wherein a boundary of the sterile region is delineated by a user of the wearable device via hand gestures.
  • the wearable device of any one of aspects 91-93 wherein the information associated with the physical object comprises at least one of a function of the physical object or a type of the medical procedure.
  • the wearable device of any one of aspects 91-94 wherein the hardware processor is further programmed to detect an anomaly associated with the physical object in the sterile region based at least partly on the information associated with the physical object.
  • the wearable device of aspect 95 wherein the detected anomaly comprises a determination that the physical object is unrequested or inappropriate for the medical procedure, and wherein the hardware processor is programmed to present an alert to the wearer based at least partly upon the detected anomaly.
  • the wearable device of any one of aspects 91-96 wherein the hardware processor is further programmed to determine whether the physical object has left the sterile region.
  • the wearable device of any one of aspects 91-97 wherein the virtual content associated with the medical procedure comprises a list of medical instruments for the medical procedure.
  • the wearable device of any one of aspects 91-98 wherein the virtual content further comprises a focus indicator indicating one or more medical instruments that are in the sterile region based at least partly on the tracking of the physical object.
  • a method comprising: under control of a hardware processor: determining a sterile region in a field of regard (FOR) of a user of a wearable device, wherein the FOR comprises a portion of the environment around the user that is capable of being perceived by the user via the wearable device; analyzing data acquired by an outward-facing camera of the wearable device to identify a physical object entered into the sterile region; accessing information associated with the physical object; tracking a position of the physical object in the sterile region based on the data acquired by the outward-facing camera; and causing a visual indication to be provided by a head-mounted display of the wearable device, wherein the visual indication is associated with the position of the physical object.
  • the method of aspect 100 wherein the sterile region comprises at least a portion of the patient's body.
  • the method of aspect 100 or 101 wherein the information associated with the physical object comprises at least one of a function of the physical object or a type of the medical procedure.
  • the method of any one of aspects 100-102 further comprising detecting an anomaly associated with the physical object in the sterile region based at least partly on the information associated with the physical object.
  • the method of any one of aspects 100-103 wherein the detected anomaly comprises a determination that the physical object is unrequested or inappropriate for the medical procedure, and the method further comprises presenting an alert to the wearer based at least partly upon the detected anomaly.
  • the method of any one of aspects 100-104 further comprising: determining whether the physical object has left the sterile region; and in response to a determination that the physical object has left the sterile region, decreasing a count associated with the physical object, wherein the count represents the number of such physical objects in the sterile region.
  • the method of any one of aspects 100-105 wherein the visual indication comprises a list of physical objects in the sterile region.
  • Each of the processes, methods, and algorithms described herein and/or depicted in the attached figures may be embodied in, and fully or partially automated by, code modules executed by one or more physical computing systems, hardware computer processors, application-specific circuitry, and/or electronic hardware configured to execute specific and particular computer instructions.
  • computing systems can include general purpose computers (e.g., servers) programmed with specific computer instructions or special purpose computers, special purpose circuitry, and so forth.
  • a code module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language.
  • particular operations and methods may be performed by circuitry that is specific to a given function.
  • a video may include many frames, with each frame having millions of pixels, and specifically programmed computer hardware is necessary to process the video data to provide a desired image processing task or application in a commercially reasonable amount of time.
  • Code modules or any type of data may be stored on any type of non-transitory computer-readable medium, such as physical computer storage including hard drives, solid state memory, random access memory (RAM), read only memory (ROM), optical disc, volatile or non-volatile storage, combinations of the same and/or the like.
  • the methods and modules (or data) may also be transmitted as generated data signals (e.g., as part of a carrier wave or other analog or digital propagated signal) on a variety of computer-readable transmission mediums, including wireless-based and wired/cable-based mediums, and may take a variety of forms (e.g., as part of a single or multiplexed analog signal, or as multiple discrete digital packets or frames).
  • the results of the disclosed processes or process steps may be stored, persistently or otherwise, in any type of non-transitory, tangible computer storage or may be communicated via a computer-readable transmission medium.
  • any processes, blocks, states, steps, or functionalities in flow diagrams described herein and/or depicted in the attached figures should be understood as potentially representing code modules, segments, or portions of code which include one or more executable instructions for implementing specific functions (e.g., logical or arithmetical) or steps in the process.
  • the various processes, blocks, states, steps, or functionalities can be combined, rearranged, added to, deleted from, modified, or otherwise changed from the illustrative examples provided herein.
  • additional or different computing systems or code modules may perform some or all of the functionalities described herein.
  • the processes, methods, and systems may be implemented in a network (or distributed) computing environment.
  • Network environments include enterprise-wide computer networks, intranets, local area networks (LAN), wide area networks (WAN), personal area networks (PAN), cloud computing networks, crowd-sourced computing networks, the Internet, and the World Wide Web.
  • the network may be a wired or a wireless network or any other type of communication network.
  • a phrase referring to “at least one of” a list of items refers to any combination of those items, including single members.
  • “at least one of: A, B, or C” is intended to cover: A, B, C, A and B, A and C, B and C, and A, B, and C.
  • Conjunctive language such as the phrase “at least one of X, Y and Z,” unless specifically stated otherwise, is otherwise understood with the context as used in general to convey that an item, term, etc. may be at least one of X, Y or Z. Thus, such conjunctive language is not generally intended to imply that certain embodiments require at least one of X, at least one of Y and at least one of Z to each be present.
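
The aspects above describe several workflows in claim-style prose. The short Python sketches that follow are illustrative only: they are not the patented implementation, and every identifier, threshold, phrase list, and score in them is a hypothetical assumption added for this edit. The first sketch shows one way the identity-verification gate of aspects 17-19 could be approximated, with an iris code compared against an enrolled code and display of the patient medical record ceasing when the codes differ too much.

```python
# Minimal sketch (assumed names and threshold) of gating record display on
# wearer identity: compare a scanned iris code against the enrolled code and
# hide the record if verification fails.
from typing import Optional

IRIS_MATCH_THRESHOLD = 0.32   # fraction of differing bits tolerated (assumed)

def hamming_fraction(code_a: str, code_b: str) -> float:
    """Fraction of positions at which two equal-length bit strings differ."""
    return sum(a != b for a, b in zip(code_a, code_b)) / len(code_a)

def verify_wearer(scanned_code: str, enrolled_code: str) -> bool:
    return hamming_fraction(scanned_code, enrolled_code) <= IRIS_MATCH_THRESHOLD

def display_record(record: str, scanned_code: str, enrolled_code: str) -> Optional[str]:
    """Return the record for display only when the wearer is verified."""
    if verify_wearer(scanned_code, enrolled_code):
        return record
    return None   # cease display of the patient medical record

if __name__ == "__main__":
    enrolled = "1011001110001011"
    print(display_record("Jane Doe - chart", "1011001110001111", enrolled))  # close match -> shown
    print(display_record("Jane Doe - chart", "0100110001110100", enrolled))  # mismatch -> None
```

A production system would of course use a full iris-encoding pipeline and a calibrated matching threshold; the bit strings here only stand in for that.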
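The next sketch outlines the initiation/termination documentation loop recited in the monitoring-method aspects: detect an initiation condition, document the healthcare event, then stop when a termination condition is detected. The SensorFrame and DocumentationSession types and the trigger phrases are assumptions made for illustration.

```python
# Minimal sketch of the initiation/termination workflow: monitor sensor data,
# start documenting when an initiation condition is detected, stop when a
# termination condition appears. All names and phrases are hypothetical.
from dataclasses import dataclass, field
from typing import List

@dataclass
class SensorFrame:
    transcript: str          # text recognized from the microphone for this frame
    image_id: str            # identifier of the outward-facing camera frame

@dataclass
class DocumentationSession:
    frames: List[SensorFrame] = field(default_factory=list)

START_PHRASES = ("begin exam", "let's get started")
STOP_PHRASES = ("end of visit", "thank you, see you next time")

def detect_initiation(frame: SensorFrame) -> bool:
    return any(p in frame.transcript.lower() for p in START_PHRASES)

def detect_termination(frame: SensorFrame) -> bool:
    return any(p in frame.transcript.lower() for p in STOP_PHRASES)

def monitor(frames: List[SensorFrame]) -> DocumentationSession:
    session, documenting = DocumentationSession(), False
    for frame in frames:
        if not documenting and detect_initiation(frame):
            documenting = True            # initiation condition present
        elif documenting and detect_termination(frame):
            break                         # termination condition present
        elif documenting:
            session.frames.append(frame)  # document the healthcare event
    return session

if __name__ == "__main__":
    demo = [SensorFrame("good morning", "f0"),
            SensorFrame("okay, begin exam", "f1"),
            SensorFrame("patient reports chest pain", "f2"),
            SensorFrame("end of visit", "f3")]
    print(len(monitor(demo).frames))  # -> 1
```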
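This sketch corresponds to the audio-to-text extraction steps (convert an audio stream to text, then parse the text for phrases describing the patient's condition or history). It assumes speech recognition has already produced a transcript; the keyword list and regular expression are placeholders for whatever parsing a real system would use.

```python
# Minimal sketch of pulling relevant medical information out of a transcript.
# The symptom terms and the history pattern are illustrative assumptions.
import re
from typing import Dict, List

SYMPTOM_TERMS = ["chest pain", "shortness of breath", "nausea", "headache"]
HISTORY_PATTERN = re.compile(r"(diagnosed with|history of)\s+([a-z ]+)", re.IGNORECASE)

def extract_medical_info(transcript: str) -> Dict[str, List[str]]:
    """Pull symptom mentions and history phrases out of a visit transcript."""
    text = transcript.lower()
    symptoms = [term for term in SYMPTOM_TERMS if term in text]
    history = [m.group(2).strip() for m in HISTORY_PATTERN.finditer(transcript)]
    return {"symptoms": symptoms, "history": history}

if __name__ == "__main__":
    t = ("Patient says she has had chest pain for two days. "
         "She was diagnosed with asthma as a child.")
    print(extract_medical_info(t))
    # {'symptoms': ['chest pain'], 'history': ['asthma as a child']}
```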
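The access-privilege check that gates sharing of virtual healthcare content with a second wearable device could look roughly like this; the three privilege levels and the returned indication strings are hypothetical.

```python
# Minimal sketch of privilege-gated sharing: share content only when the
# target device's privilege meets the content's requirement, otherwise
# return an insufficient-access indication.
from enum import IntEnum
from dataclasses import dataclass

class Privilege(IntEnum):
    NONE = 0
    READ = 1
    READ_WRITE = 2

@dataclass
class Content:
    name: str
    required: Privilege     # minimum privilege needed to view this content

@dataclass
class Device:
    device_id: str
    privilege: Privilege    # e.g., configured by the patient

def share(content: Content, target: Device) -> str:
    """Share content if the target device's privilege is sufficient,
    otherwise return an indication of insufficient access."""
    if target.privilege >= content.required:
        return f"shared '{content.name}' with {target.device_id}"
    return f"{target.device_id} has insufficient access privilege for '{content.name}'"

if __name__ == "__main__":
    record = Content("patient medical record", Privilege.READ)
    print(share(record, Device("hcp-glasses-2", Privilege.READ_WRITE)))
    print(share(record, Device("visitor-glasses", Privilege.NONE)))
```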
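Several aspects condition the display of updated virtual content on contextual information passing a threshold. The sketch below reduces a few contextual signals to a relevance score and compares it against a threshold; the weights, fields, and threshold value are illustrative assumptions.

```python
# Minimal sketch of threshold-gated virtual content selection based on
# contextual information. Scoring weights and Context fields are assumed.
from dataclasses import dataclass

@dataclass
class Context:
    patient_in_fov: bool        # a patient is recognized in the wearer's FOV
    during_procedure: bool      # a medical procedure is in progress
    location: str               # e.g., "operating_room", "hallway"

THRESHOLD = 0.5

def relevance(ctx: Context) -> float:
    score = 0.0
    if ctx.patient_in_fov:
        score += 0.4
    if ctx.during_procedure:
        score += 0.4
    if ctx.location == "operating_room":
        score += 0.2
    return score

def virtual_content(ctx: Context) -> str:
    if relevance(ctx) >= THRESHOLD:
        return "display: updated patient vitals and procedure checklist"
    return "display: nothing new"

if __name__ == "__main__":
    print(virtual_content(Context(True, True, "operating_room")))  # above threshold
    print(virtual_content(Context(False, False, "hallway")))       # below threshold
```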
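Finally, the sterile-region aspects track objects entering and leaving a sterile region, keep a count per object, and raise alerts for unrequested or possibly retained objects. A minimal bookkeeping sketch, with hypothetical object names and class structure, follows.

```python
# Minimal sketch of sterile-region bookkeeping: per-object counts, a check
# against the instruments requested for the procedure, and an alert if
# anything is still inside the region when it is closed out.
from collections import Counter
from typing import List

class SterileRegion:
    def __init__(self, requested_instruments: List[str]):
        self.requested = set(requested_instruments)
        self.counts: Counter = Counter()
        self.alerts: List[str] = []

    def object_entered(self, name: str) -> None:
        self.counts[name] += 1
        if name not in self.requested:          # anomaly: unrequested object
            self.alerts.append(f"alert: '{name}' is unrequested for this procedure")

    def object_left(self, name: str) -> None:
        if self.counts[name] > 0:
            self.counts[name] -= 1

    def close_out(self) -> List[str]:
        for name, count in self.counts.items():
            if count > 0:                        # possible retained object
                self.alerts.append(f"alert: {count} x '{name}' still in sterile region")
        return self.alerts

if __name__ == "__main__":
    region = SterileRegion(["scalpel", "gauze"])
    region.object_entered("scalpel")
    region.object_entered("gauze")
    region.object_entered("phone")      # triggers an unrequested-object alert
    region.object_left("scalpel")
    print(region.close_out())
```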

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • Theoretical Computer Science (AREA)
  • Surgery (AREA)
  • General Physics & Mathematics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biophysics (AREA)
  • General Engineering & Computer Science (AREA)
  • Pathology (AREA)
  • Human Computer Interaction (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • Business, Economics & Management (AREA)
  • Ophthalmology & Optometry (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Optics & Photonics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Software Systems (AREA)
  • General Business, Economics & Management (AREA)
  • Multimedia (AREA)
  • Computational Linguistics (AREA)
  • Computer Security & Cryptography (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Artificial Intelligence (AREA)
  • Gynecology & Obstetrics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Cardiology (AREA)
  • Acoustics & Sound (AREA)
  • Computer Hardware Design (AREA)
US15/865,023 2017-01-11 2018-01-08 Medical assistant Abandoned US20180197624A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US15/865,023 US20180197624A1 (en) 2017-01-11 2018-01-08 Medical assistant
US18/056,164 US12080393B2 (en) 2017-01-11 2022-11-16 Medical assistant

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201762445182P 2017-01-11 2017-01-11
US201762448656P 2017-01-20 2017-01-20
US15/865,023 US20180197624A1 (en) 2017-01-11 2018-01-08 Medical assistant

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/056,164 Continuation US12080393B2 (en) 2017-01-11 2022-11-16 Medical assistant

Publications (1)

Publication Number Publication Date
US20180197624A1 true US20180197624A1 (en) 2018-07-12

Family

ID=62783590

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/865,023 Abandoned US20180197624A1 (en) 2017-01-11 2018-01-08 Medical assistant
US18/056,164 Active US12080393B2 (en) 2017-01-11 2022-11-16 Medical assistant

Family Applications After (1)

Application Number Title Priority Date Filing Date
US18/056,164 Active US12080393B2 (en) 2017-01-11 2022-11-16 Medical assistant

Country Status (9)

Country Link
US (2) US20180197624A1 (ko)
EP (1) EP3568783A4 (ko)
JP (3) JP7224288B2 (ko)
KR (2) KR102662173B1 (ko)
CN (2) CN116230153A (ko)
AU (1) AU2018207068A1 (ko)
CA (1) CA3049431A1 (ko)
IL (1) IL267995A (ko)
WO (1) WO2018132336A1 (ko)

Cited By (101)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180365875A1 (en) * 2017-06-14 2018-12-20 Dell Products, L.P. Headset display control based upon a user's pupil state
US20190180883A1 (en) * 2017-12-11 2019-06-13 Teletracking Technologies, Inc. Milestone detection sensing
US10332376B2 (en) * 2017-11-28 2019-06-25 Cheng Chieh Investment Co., Ltd. Workplace management system and wearable device therefor
US20190207814A1 (en) * 2017-11-03 2019-07-04 Vignet Incorporated Systems and methods for managing operation of devices in complex systems and changing environments
US20190357010A1 (en) * 2018-05-16 2019-11-21 International Business Machines Corporation Smart location alert system
WO2020047338A1 (en) * 2018-08-29 2020-03-05 Movidius Ltd. Computer vision system
US20200069181A1 (en) * 2018-09-01 2020-03-05 Philip Jaques Sampson I-Examination
US20200081523A1 (en) * 2017-05-15 2020-03-12 Shanghai United Imaging Healthcare Co., Ltd. Systems and methods for display
US10599877B2 (en) * 2017-04-13 2020-03-24 At&T Intellectual Property I, L.P. Protecting content on a display device from a field-of-view of a person or device
WO2020061054A1 (en) * 2018-09-17 2020-03-26 Vet24seven Inc. Veterinary professional animal tracking and support system
US10645092B1 (en) * 2019-01-03 2020-05-05 Truly Social Games Llc Control and/or verification of interactions of an augmented reality deployment
US20200218820A1 (en) * 2017-07-16 2020-07-09 Chengdu Qianniucao Information Technology Co., Ltd. Method for authorizing form data operation authority
US10729502B1 (en) 2019-02-21 2020-08-04 Theator inc. Intraoperative surgical event summary
WO2020157693A1 (en) * 2019-01-31 2020-08-06 Herzog Samuel Patient-centric health care system
WO2020188119A1 (en) 2019-03-21 2020-09-24 Kepler Vision Technologies B.V. A medical device for transcription of appearances in an image to text with machine learning
CN112397192A (zh) * 2020-11-09 2021-02-23 He Ben Medical intelligent remote ward-round identification system for cardiovascular departments
US10970858B2 (en) * 2019-05-15 2021-04-06 International Business Machines Corporation Augmented reality for monitoring objects to decrease cross contamination between different regions
US10977495B2 (en) * 2018-10-03 2021-04-13 Cmr Surgical Limited Automatic endoscope video augmentation
US20210121059A1 (en) * 2017-09-11 2021-04-29 Nikon Corporation Ophthalmic instrument, management device, and method for managing an ophthalmic instrument
US11065079B2 (en) 2019-02-21 2021-07-20 Theator inc. Image-based system for estimating surgical contact force
US20210225497A1 (en) * 2017-02-24 2021-07-22 General Electric Company Providing auxiliary information regarding healthcare procedure and system performance using augmented reality
US11087557B1 (en) 2020-06-03 2021-08-10 Tovy Kamine Methods and systems for remote augmented reality communication for guided surgery
US11113427B2 (en) * 2017-12-22 2021-09-07 Lenovo (Beijing) Co., Ltd. Method of displaying contents, a first electronic device and a second electronic device
US11116587B2 (en) 2018-08-13 2021-09-14 Theator inc. Timeline overlay on surgical video
EP3889970A1 (en) 2020-04-03 2021-10-06 Koninklijke Philips N.V. Diagnosis support system
US11151421B2 (en) * 2017-06-29 2021-10-19 The Procter & Gamble Company Method for treating a surface
US11158423B2 (en) 2018-10-26 2021-10-26 Vignet Incorporated Adapted digital therapeutic plans based on biomarkers
EP3909500A1 (en) * 2020-05-11 2021-11-17 BraveHeart Wireless Inc. Systems and methods for using algorithms and acoustic input to control, monitor, annotate, and configure a wearable health monitor that monitors physiological signals
CN113723100A (zh) * 2021-09-09 2021-11-30 State Grid E-Commerce Co., Ltd. Open-source component identification method and apparatus based on fingerprint features
US11200744B2 (en) * 2013-03-04 2021-12-14 Alex C. Chen Method and apparatus for recognizing behavior and providing information
US20210407632A1 (en) * 2017-09-13 2021-12-30 Koninklijke Philips N.V. System and method for displaying electronic health records
US11227686B2 (en) 2020-04-05 2022-01-18 Theator inc. Systems and methods for processing integrated surgical video collections to identify relationships using artificial intelligence
US11238979B1 (en) 2019-02-01 2022-02-01 Vignet Incorporated Digital biomarkers for health research, digital therapeautics, and precision medicine
US11244508B2 (en) * 2018-02-03 2022-02-08 The Johns Hopkins University Augmented reality display for surgical procedures
US11281553B1 (en) 2021-04-16 2022-03-22 Vignet Incorporated Digital systems for enrolling participants in health research and decentralized clinical trials
US20220092910A1 (en) * 2010-09-30 2022-03-24 Jesus Perea-Ochoa Method and System of Operating Multi-Task Interactive Electronic Devices and Ultraviolet Light System
US11289196B1 (en) 2021-01-12 2022-03-29 Emed Labs, Llc Health testing and diagnostics platform
WO2022070072A1 (en) * 2020-10-02 2022-04-07 Cilag Gmbh International Control of a display outside the sterile field from a device within the sterile field
US20220104910A1 (en) * 2020-10-02 2022-04-07 Ethicon Llc Monitoring of user visual gaze to control which display system displays the primary information
US11302448B1 (en) 2020-08-05 2022-04-12 Vignet Incorporated Machine learning to select digital therapeutics
US11322260B1 (en) 2020-08-05 2022-05-03 Vignet Incorporated Using predictive models to predict disease onset and select pharmaceuticals
US11321082B2 (en) * 2016-10-28 2022-05-03 Vignet Incorporated Patient engagement in digital health programs
US20220139514A1 (en) * 2020-11-03 2022-05-05 Nuance Communications, Inc. Communication System and Method
US11373756B1 (en) 2021-05-24 2022-06-28 Emed Labs, Llc Systems, devices, and methods for diagnostic aid kit apparatus
US11409417B1 (en) 2018-08-10 2022-08-09 Vignet Incorporated Dynamic engagement of patients in clinical and digital health research
US20220262078A1 (en) * 2021-02-14 2022-08-18 Broadstone Technologies, Llc Remote device provisioning and remote support using augmented reality
US11442126B2 (en) * 2019-12-13 2022-09-13 Siemens Healthcare Gmbh System and method for estimating a relative substance composition of a portion of a body of a patient
US11449189B1 (en) * 2019-10-02 2022-09-20 Facebook Technologies, Llc Virtual reality-based augmented reality development system
US20220301314A1 (en) * 2021-03-16 2022-09-22 Siemens Energy, Inc. System and method for automated foreign material exclusion attendant
US11456080B1 (en) 2020-08-05 2022-09-27 Vignet Incorporated Adjusting disease data collection to provide high-quality health data to meet needs of different communities
US20220335952A1 (en) * 2020-08-05 2022-10-20 Interactive Solutions Corp. System to Change Image Based on Voice
US11488304B2 (en) * 2018-05-01 2022-11-01 Eizo Corporation Gauze detection system and gauze detection method
US11495223B2 (en) * 2017-12-08 2022-11-08 Samsung Electronics Co., Ltd. Electronic device for executing application by using phoneme information included in audio data and operation method therefor
US20220354440A1 (en) * 2021-05-04 2022-11-10 Willis Dennis Grajales Worldwide vision screening and visual field screening booth, kiosk, or exam room using artificial intelligence, screen sharing technology, and telemedicine video conferencing system to interconnect patient with eye doctor anywhere around the world via the internet using ethernet, 4G, 5G, 6G or Wifi for teleconsultation and to review results
US11500606B2 (en) * 2019-10-28 2022-11-15 Beijing Boe Optoelectronics Technology Co., Ltd. AR display device and method
US11501060B1 (en) 2016-09-29 2022-11-15 Vignet Incorporated Increasing effectiveness of surveys for digital health monitoring
US11504011B1 (en) 2020-08-05 2022-11-22 Vignet Incorporated Early detection and prevention of infectious disease transmission using location data and geofencing
US11507679B2 (en) * 2017-07-01 2022-11-22 Chengdu Qianniucao Information Technology Co., Ltd. Authorization method for form related information
US11515037B2 (en) 2021-03-23 2022-11-29 Emed Labs, Llc Remote diagnostic testing and treatment
US20220392629A1 (en) * 2021-06-02 2022-12-08 Sandeep REDDY Portable healthcare system installed at a remote location having limited internet connectivity
US11545271B2 (en) * 2019-08-20 2023-01-03 GE Precision Healthcare LLC Systems and methods for public and private communication threads
US11556720B2 (en) * 2020-05-05 2023-01-17 International Business Machines Corporation Context information reformation and transfer mechanism at inflection point
US11574435B1 (en) * 2020-04-07 2023-02-07 Robert Edwin Douglas Multi-user extended reality viewing technique
US11586524B1 (en) 2021-04-16 2023-02-21 Vignet Incorporated Assisting researchers to identify opportunities for new sub-studies in digital health research and decentralized clinical trials
US11583360B2 (en) * 2018-08-06 2023-02-21 The Board Of Trustees Of The Leland Stanford Junior University Method for monitoring object flow within a surgical space during a surgery
US11610682B2 (en) 2021-06-22 2023-03-21 Emed Labs, Llc Systems, methods, and devices for non-human readable diagnostic tests
US11641460B1 (en) * 2020-04-27 2023-05-02 Apple Inc. Generating a volumetric representation of a capture region
US20230146384A1 (en) * 2021-07-28 2023-05-11 Multinarity Ltd Initiating sensory prompts indicative of changes outside a field of view
US11651555B2 (en) * 2018-05-31 2023-05-16 Microsoft Technology Licensing, Llc Re-creation of virtual environment through a video call
US11672534B2 (en) 2020-10-02 2023-06-13 Cilag Gmbh International Communication capability of a smart stapler
US11682474B2 (en) * 2018-12-12 2023-06-20 International Business Machines Corporation Enhanced user screening for sensitive services
US11687778B2 (en) 2020-01-06 2023-06-27 The Research Foundation For The State University Of New York Fakecatcher: detection of synthetic portrait videos using biological signals
US11705230B1 (en) 2021-11-30 2023-07-18 Vignet Incorporated Assessing health risks using genetic, epigenetic, and phenotypic data sources
WO2023159236A1 (en) * 2022-02-18 2023-08-24 Curelator, Inc. Personal medical avatar
US20230267933A1 (en) * 2021-09-27 2023-08-24 International Business Machines Corporation Selective inclusion of speech content in documents
US20230266872A1 (en) * 2022-02-18 2023-08-24 SoftAcuity, Inc. Intelligent surgical display system and method
US11748924B2 (en) 2020-10-02 2023-09-05 Cilag Gmbh International Tiered system display control based on capacity and user operation
US11763919B1 (en) 2020-10-13 2023-09-19 Vignet Incorporated Platform to increase patient engagement in clinical trials through surveys presented on mobile devices
US11789837B1 (en) 2021-02-03 2023-10-17 Vignet Incorporated Adaptive data collection in clinical trials to increase the likelihood of on-time completion of a trial
US11811876B2 (en) 2021-02-08 2023-11-07 Sightful Computers Ltd Virtual display changes based on positions of viewers
US11816757B1 (en) * 2019-12-11 2023-11-14 Meta Platforms Technologies, Llc Device-side capture of data representative of an artificial reality environment
US11830602B2 (en) 2020-10-02 2023-11-28 Cilag Gmbh International Surgical hub having variable interconnectivity capabilities
WO2023244267A1 (en) * 2022-06-13 2023-12-21 Magic Leap, Inc. Systems and methods for human gait analysis, real-time feedback and rehabilitation using an extended-reality device
US20240015151A1 (en) * 2017-06-30 2024-01-11 Wells Fargo Bank, N.A. Authentication as a service
US11877897B2 (en) 2020-10-02 2024-01-23 Cilag Gmbh International Situational awareness of instruments location and individualization of users to control displays
US11883022B2 (en) 2020-10-02 2024-01-30 Cilag Gmbh International Shared situational awareness of the device actuator activity to prioritize certain aspects of displayed information
US11901083B1 (en) 2021-11-30 2024-02-13 Vignet Incorporated Using genetic and phenotypic data sets for drug discovery clinical trials
WO2024049435A1 (en) * 2022-09-01 2024-03-07 Exo Imaging, Inc. Apparatus, system and method to control an ultrasonic image on a display based on sensor input at an ultrasonic imaging device
US11929168B2 (en) 2021-05-24 2024-03-12 Emed Labs, Llc Systems, devices, and methods for diagnostic aid kit apparatus
US11948263B1 (en) 2023-03-14 2024-04-02 Sightful Computers Ltd Recording the complete physical and extended reality environments of a user
US11963683B2 (en) 2020-10-02 2024-04-23 Cilag Gmbh International Method for operating tiered operation modes in a surgical system
WO2024097431A1 (en) * 2022-11-06 2024-05-10 Hajeebu Sreehita System and method of wound assessment
US11992372B2 (en) 2020-10-02 2024-05-28 Cilag Gmbh International Cooperative surgical displays
US12014829B2 (en) 2021-09-01 2024-06-18 Emed Labs, Llc Image processing and presentation techniques for enhanced proctoring sessions
US12016566B2 (en) 2020-10-02 2024-06-25 Cilag Gmbh International Surgical instrument with adaptive function controls
US12064293B2 (en) 2020-10-02 2024-08-20 Cilag Gmbh International Field programmable surgical visualization system
US12073054B2 (en) 2022-09-30 2024-08-27 Sightful Computers Ltd Managing virtual collisions between moving virtual objects
US12080393B2 (en) 2017-01-11 2024-09-03 Magic Leap, Inc. Medical assistant
US12094070B2 (en) 2021-02-08 2024-09-17 Sightful Computers Ltd Coordinating cursor movement between a physical surface and a virtual surface
WO2024196288A1 (en) * 2023-03-22 2024-09-26 Telefonaktiebolaget Lm Ericsson (Publ) Methods and apparatuses for remote control of controllable electrical devices in a surrounding physical environment of a user
US12122420B2 (en) 2019-08-29 2024-10-22 Intel Corporation Computer vision system

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190051376A1 (en) 2017-08-10 2019-02-14 Nuance Communications, Inc. Automated clinical documentation system and method
US11316865B2 (en) 2017-08-10 2022-04-26 Nuance Communications, Inc. Ambient cooperative intelligence system and method
US11250382B2 (en) 2018-03-05 2022-02-15 Nuance Communications, Inc. Automated clinical documentation system and method
WO2019173333A1 (en) 2018-03-05 2019-09-12 Nuance Communications, Inc. Automated clinical documentation system and method
US10963757B2 (en) 2018-12-14 2021-03-30 Industrial Technology Research Institute Neural network model fusion method and electronic device using the same
US20200234827A1 (en) * 2019-01-22 2020-07-23 Mira Therapeutics, Inc. Methods and systems for diagnosing and treating disorders
US11531807B2 (en) 2019-06-28 2022-12-20 Nuance Communications, Inc. System and method for customized text macros
US11670408B2 (en) 2019-09-30 2023-06-06 Nuance Communications, Inc. System and method for review of automated clinical documentation
KR102223621B1 (ko) 2020-07-01 2021-03-05 Pukyong National University Industry-University Cooperation Foundation Automatically registering augmented reality glasses for visualizing invisible light
US11571225B2 (en) 2020-08-17 2023-02-07 Russell Todd Nevins System and method for location determination using movement between optical labels and a 3D spatial mapping camera
KR102494464B1 (ko) * 2020-12-02 2023-02-06 Kang Gil-nam Smart glasses for monitoring cranial nervous system diseases using eye-recognition data
US20220208367A1 (en) * 2020-12-31 2022-06-30 Hill-Rom Services, Inc. Virtual signage using augmented reality or mixed reality
US20220331008A1 (en) 2021-04-02 2022-10-20 Russell Todd Nevins System and method for location determination using movement of an optical label fixed to a bone using a spatial mapping camera
TWI763546B (zh) * 2021-06-21 2022-05-01 E-Da Hospital (E-Da Healthcare Medical Foundation) Intelligent emergency cart and medication management method using the same
US11600053B1 (en) 2021-10-04 2023-03-07 Russell Todd Nevins System and method for location determination using a mixed reality device and multiple imaging cameras
CN113903480A (zh) * 2021-11-19 2022-01-07 Ding Yu AR metaverse decentralized medical community diagnosis system
CN114566275A (zh) * 2022-02-21 2022-05-31 Qilu Hospital of Shandong University Mixed-reality-based prehospital emergency assistance system
KR102458495B1 (ko) * 2022-03-17 2022-10-25 MediThinQ Co., Ltd. Three-dimensional pointing system for remote collaborative consultation support and control method thereof

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7395215B2 (en) * 2001-11-08 2008-07-01 Amos Grushka Portable personal health information package
US20100286490A1 (en) * 2006-04-20 2010-11-11 Iq Life, Inc. Interactive patient monitoring system using speech recognition
US20130174213A1 (en) * 2011-08-23 2013-07-04 James Liu Implicit sharing and privacy control through physical behaviors using sensor-rich devices
US20150088546A1 (en) * 2013-09-22 2015-03-26 Ricoh Company, Ltd. Mobile Information Gateway for Use by Medical Personnel
US20150099458A1 (en) * 2011-01-14 2015-04-09 Covidien Lp Network-Capable Medical Device for Remote Monitoring Systems
US20150156028A1 (en) * 2013-12-01 2015-06-04 Apx Labs, Llc Systems and methods for sharing information between users of augmented reality devices
US9305551B1 (en) * 2013-08-06 2016-04-05 Timothy A. Johns Scribe system for transmitting an audio recording from a recording device to a server
US20160148052A1 (en) * 2013-07-16 2016-05-26 Seiko Epson Corporation Information processing apparatus, information processing method, and information processing system
US20160366084A1 (en) * 2015-06-10 2016-12-15 Google Inc. Contextually driven messaging system
US20180082476A1 (en) * 2016-09-22 2018-03-22 International Business Machines Corporation Collaborative search for out of field of view augmented reality objects
WO2018067515A1 (en) * 2016-10-04 2018-04-12 WortheeMed, Inc. Enhanced reality medical guidance systems and methods of use

Family Cites Families (80)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5291560A (en) 1991-07-15 1994-03-01 Iri Scan Incorporated Biometric personal identification system based on iris analysis
US6222525B1 (en) 1992-03-05 2001-04-24 Brad A. Armstrong Image controllers with sheet connected sensors
JP3717552B2 (ja) * 1995-09-01 2005-11-16 Olympus Corporation Medical manipulator system
JPH11197159A (ja) * 1998-01-13 1999-07-27 Hitachi Ltd Surgery support system
JP2002140685A (ja) * 2000-11-01 2002-05-17 Fuji Photo Film Co Ltd Image management system and image management method
US8072470B2 (en) 2003-05-29 2011-12-06 Sony Computer Entertainment Inc. System and method for providing a real-time three-dimensional interactive environment
US7501995B2 (en) * 2004-11-24 2009-03-10 General Electric Company System and method for presentation of enterprise, clinical, and decision support information utilizing eye tracking navigation
JP2007094943A (ja) * 2005-09-30 2007-04-12 Vitas:Kk Medical information management system for specified diseases
US11428937B2 (en) 2005-10-07 2022-08-30 Percept Technologies Enhanced optical and perceptual digital eyewear
US20070081123A1 (en) 2005-10-07 2007-04-12 Lewis Scott W Digital eyewear
US8696113B2 (en) 2005-10-07 2014-04-15 Percept Technologies Inc. Enhanced optical and perceptual digital eyewear
US7966269B2 (en) 2005-10-20 2011-06-21 Bauer James D Intelligent human-machine interface
US7518619B2 (en) 2005-11-07 2009-04-14 General Electric Company Method and apparatus for integrating three-dimensional and two-dimensional monitors with medical diagnostic imaging workstations
CA2636030A1 (en) * 2006-01-17 2007-07-26 Qualcomm Incorporated Method and apparatus for setting the boundaries of virtual operations
US20070238981A1 (en) * 2006-03-13 2007-10-11 Bracco Imaging Spa Methods and apparatuses for recording and reviewing surgical navigation processes
US8630867B2 (en) * 2007-04-23 2014-01-14 Samsung Electronics Co., Ltd. Remote-medical-diagnosis system method
US20090054084A1 (en) 2007-08-24 2009-02-26 Motorola, Inc. Mobile virtual and augmented reality system
US20090103785A1 (en) 2007-10-18 2009-04-23 Advanced Medical Optics, Inc. Ocular identification system for use with a medical device
JP2009279193A (ja) 2008-05-22 2009-12-03 Fujifilm Corp Medical device management system
KR100961380B1 (ko) * 2008-06-25 2010-06-07 Advanced Industry Development Institute Telemedicine system using a digital set-top box capable of voice recognition
US20100049664A1 (en) * 2008-08-21 2010-02-25 Yi-Hsuan Kuo Method and system for user-defined alerting of securities information
US20150120321A1 (en) * 2009-02-26 2015-04-30 I.M.D. Soft Ltd. Wearable Data Reader for Medical Documentation and Clinical Decision Support
US20110029320A1 (en) * 2009-08-03 2011-02-03 Mehrnaz Nicole Jamali System and method for managing a medical procedure site with a tracking device
US9304319B2 (en) 2010-11-18 2016-04-05 Microsoft Technology Licensing, Llc Automatic focus improvement for augmented reality displays
US10156722B2 (en) 2010-12-24 2018-12-18 Magic Leap, Inc. Methods and systems for displaying stereoscopy with a freeform optical system with addressable focus for virtual and augmented reality
CA2822978C (en) 2010-12-24 2019-02-19 Hong Hua An ergonomic head mounted display device and optical system
KR101447931B1 (ko) * 2011-04-12 2014-10-13 Meere Company Inc. Surgical robot system using augmented reality and control method thereof
JP6316186B2 (ja) 2011-05-06 2018-04-25 Magic Leap, Inc. Massive simultaneous remote digital presence world
US8900131B2 (en) * 2011-05-13 2014-12-02 Intuitive Surgical Operations, Inc. Medical system providing dynamic registration of a model of an anatomical structure for image-guided surgery
US9275079B2 (en) * 2011-06-02 2016-03-01 Google Inc. Method and apparatus for semantic association of images with augmentation data
US10795448B2 (en) 2011-09-29 2020-10-06 Magic Leap, Inc. Tactile glove for human-computer interaction
RU2017115669A (ru) 2011-10-28 2019-01-28 Magic Leap, Inc. System and method for augmented and virtual reality
KR102116697B1 (ko) 2011-11-23 2020-05-29 Magic Leap, Inc. Three-dimensional virtual and augmented reality display system
US9147111B2 (en) * 2012-02-10 2015-09-29 Microsoft Technology Licensing, Llc Display with blocking image generation
KR102028732B1 (ko) 2012-04-05 2019-10-04 Magic Leap, Inc. Wide-FOV (field of view) image devices with active foveation capability
US9671566B2 (en) 2012-06-11 2017-06-06 Magic Leap, Inc. Planar waveguide apparatus with diffraction element(s) and system employing same
US9310559B2 (en) * 2012-06-11 2016-04-12 Magic Leap, Inc. Multiple depth plane three-dimensional display using a wave guide reflector array projector
CA2883498C (en) 2012-08-30 2022-05-31 Truevision Systems, Inc. Imaging system and methods displaying a fused multidimensional reconstructed image
IL221863A (en) * 2012-09-10 2014-01-30 Elbit Systems Ltd Digital video photography system when analyzing and displaying
AU2013315607A1 (en) 2012-09-11 2015-04-02 Magic Leap, Inc Ergonomic head mounted display device and optical system
US20140081659A1 (en) * 2012-09-17 2014-03-20 Depuy Orthopaedics, Inc. Systems and methods for surgical and interventional planning, support, post-operative follow-up, and functional recovery tracking
JP6007712B2 (ja) * 2012-09-28 2016-10-12 Brother Industries, Ltd. Head-mounted display, method of operating the same, and program
IL293789B2 (en) 2013-01-15 2023-08-01 Magic Leap Inc A system for scanning electromagnetic imaging radiation
US20140222462A1 (en) * 2013-02-07 2014-08-07 Ian Shakil System and Method for Augmenting Healthcare Provider Performance
CN105188516B (zh) 2013-03-11 2017-12-22 Magic Leap, Inc. Systems and methods for augmented and virtual reality
KR20140112207A (ko) * 2013-03-13 2014-09-23 Samsung Electronics Co., Ltd. Augmented reality image display system and surgical robot system including the same
KR102458124B1 (ko) 2013-03-15 2022-10-21 Magic Leap, Inc. Display system and method
US10262462B2 (en) 2014-04-18 2019-04-16 Magic Leap, Inc. Systems and methods for augmented and virtual reality
US9874749B2 (en) * 2013-11-27 2018-01-23 Magic Leap, Inc. Virtual and augmented reality systems and methods
US20150088547A1 (en) * 2013-09-22 2015-03-26 Ricoh Company, Ltd. Mobile Information Gateway for Home Healthcare
IL302408B2 (en) 2013-10-16 2024-09-01 Magic Leap Inc An augmented or virtual reality head device with intrapupillary distance adjustment
US9857591B2 (en) 2014-05-30 2018-01-02 Magic Leap, Inc. Methods and system for creating focal planes in virtual and augmented reality
CN110542938B (zh) 2013-11-27 2023-04-18 Magic Leap, Inc. Virtual and augmented reality systems and methods
JP6407526B2 (ja) 2013-12-17 2018-10-17 Canon Medical Systems Corporation Medical information processing system, medical information processing method, and information processing system
US11103122B2 (en) * 2014-07-15 2021-08-31 Mentor Acquisition One, Llc Content presentation in head worn computing
EP3100186B1 (en) * 2014-01-29 2021-05-12 Becton, Dickinson and Company System and method for collection confirmation and sample tracking at the clinical point of use
NZ722903A (en) 2014-01-31 2020-05-29 Magic Leap Inc Multi-focal display system and method
CN106461955B (zh) 2014-01-31 2019-08-13 Magic Leap, Inc. Method for displaying augmented reality
US10203762B2 (en) 2014-03-11 2019-02-12 Magic Leap, Inc. Methods and systems for creating virtual and augmented reality
US10803538B2 (en) * 2014-04-14 2020-10-13 Optum, Inc. System and method for automated data entry and workflow management
WO2015161307A1 (en) * 2014-04-18 2015-10-22 Magic Leap, Inc. Systems and methods for augmented and virtual reality
WO2016018488A2 (en) 2014-05-09 2016-02-04 Eyefluence, Inc. Systems and methods for discerning eye signals and continuous biometric identification
CN113253476B (zh) 2014-05-30 2022-12-27 Magic Leap, Inc. Methods and systems for generating virtual content display using a virtual or augmented reality apparatus
WO2016023097A1 (en) 2014-08-15 2016-02-18 Synaptive Medical (Barbados) Inc. System and method for managing equipment in a medical procedure
KR102295496B1 (ko) 2014-09-29 2021-09-01 Magic Leap, Inc. Architecture and method for outputting light of different wavelengths out of a waveguide
CN107004044A (zh) * 2014-11-18 2017-08-01 Koninklijke Philips N.V. User guidance system and method for an augmented reality device, and use thereof
KR20150090991A (ko) * 2014-12-01 2015-08-07 Eulji University Industry-Academic Cooperation Foundation Medical terminal hub device
US10154239B2 (en) * 2014-12-30 2018-12-11 Onpoint Medical, Inc. Image-guided surgery with surface reconstruction and augmented reality visualization
US9538962B1 (en) * 2014-12-31 2017-01-10 Verily Life Sciences Llc Heads-up displays for augmented reality network in a medical environment
US10013808B2 (en) * 2015-02-03 2018-07-03 Globus Medical, Inc. Surgeon head-mounted display apparatuses
CN113017836A (zh) * 2015-02-20 2021-06-25 Covidien LP Operating room and surgical site awareness
CN107847289A (zh) * 2015-03-01 2018-03-27 Aris Medical Diagnostics Inc. Reality-augmented morphological surgery
NZ773826A (en) 2015-03-16 2022-07-29 Magic Leap Inc Methods and systems for diagnosing and treating health ailments
US20160287337A1 (en) 2015-03-31 2016-10-06 Luke J. Aram Orthopaedic surgical system and method for patient-specific surgical procedure
USD758367S1 (en) 2015-05-14 2016-06-07 Magic Leap, Inc. Virtual reality headset
US10799792B2 (en) * 2015-07-23 2020-10-13 At&T Intellectual Property I, L.P. Coordinating multiple virtual environments
US10105187B2 (en) * 2015-08-27 2018-10-23 Medtronic, Inc. Systems, apparatus, methods and computer-readable storage media facilitating surgical procedures utilizing augmented reality
US10045825B2 (en) * 2015-09-25 2018-08-14 Karl Storz Imaging, Inc. Partial facial recognition and gaze detection for a medical system
US10365300B2 (en) 2016-02-05 2019-07-30 Tektronix, Inc. Trigger on final occurrence
EP3568783A4 (en) 2017-01-11 2020-11-11 Magic Leap, Inc. MEDICAL ASSISTANT

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7395215B2 (en) * 2001-11-08 2008-07-01 Amos Grushka Portable personal health information package
US20100286490A1 (en) * 2006-04-20 2010-11-11 Iq Life, Inc. Interactive patient monitoring system using speech recognition
US20150099458A1 (en) * 2011-01-14 2015-04-09 Covidien Lp Network-Capable Medical Device for Remote Monitoring Systems
US20130174213A1 (en) * 2011-08-23 2013-07-04 James Liu Implicit sharing and privacy control through physical behaviors using sensor-rich devices
US20160148052A1 (en) * 2013-07-16 2016-05-26 Seiko Epson Corporation Information processing apparatus, information processing method, and information processing system
US9305551B1 (en) * 2013-08-06 2016-04-05 Timothy A. Johns Scribe system for transmitting an audio recording from a recording device to a server
US20150088546A1 (en) * 2013-09-22 2015-03-26 Ricoh Company, Ltd. Mobile Information Gateway for Use by Medical Personnel
US20150156028A1 (en) * 2013-12-01 2015-06-04 Apx Labs, Llc Systems and methods for sharing information between users of augmented reality devices
US20160366084A1 (en) * 2015-06-10 2016-12-15 Google Inc. Contextually driven messaging system
US20180082476A1 (en) * 2016-09-22 2018-03-22 International Business Machines Corporation Collaborative search for out of field of view augmented reality objects
WO2018067515A1 (en) * 2016-10-04 2018-04-12 WortheeMed, Inc. Enhanced reality medical guidance systems and methods of use

Cited By (167)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220092910A1 (en) * 2010-09-30 2022-03-24 Jesus Perea-Ochoa Method and System of Operating Multi-Task Interactive Electronic Devices and Ultraviolet Light System
US11200744B2 (en) * 2013-03-04 2021-12-14 Alex C. Chen Method and apparatus for recognizing behavior and providing information
US11675971B1 (en) 2016-09-29 2023-06-13 Vignet Incorporated Context-aware surveys and sensor data collection for health research
US11501060B1 (en) 2016-09-29 2022-11-15 Vignet Incorporated Increasing effectiveness of surveys for digital health monitoring
US11507737B1 (en) 2016-09-29 2022-11-22 Vignet Incorporated Increasing survey completion rates and data quality for health monitoring programs
US11487531B2 (en) * 2016-10-28 2022-11-01 Vignet Incorporated Customizing applications for health monitoring using rules and program data
US11321082B2 (en) * 2016-10-28 2022-05-03 Vignet Incorporated Patient engagement in digital health programs
US12080393B2 (en) 2017-01-11 2024-09-03 Magic Leap, Inc. Medical assistant
US20210225497A1 (en) * 2017-02-24 2021-07-22 General Electric Company Providing auxiliary information regarding healthcare procedure and system performance using augmented reality
US11080434B2 (en) 2017-04-13 2021-08-03 At&T Intellectual Property I, L.P. Protecting content on a display device from a field-of-view of a person or device
US10599877B2 (en) * 2017-04-13 2020-03-24 At&T Intellectual Property I, L.P. Protecting content on a display device from a field-of-view of a person or device
US20200081523A1 (en) * 2017-05-15 2020-03-12 Shanghai United Imaging Healthcare Co., Ltd. Systems and methods for display
US20180365875A1 (en) * 2017-06-14 2018-12-20 Dell Products, L.P. Headset display control based upon a user's pupil state
US10810773B2 (en) * 2017-06-14 2020-10-20 Dell Products, L.P. Headset display control based upon a user's pupil state
US11151421B2 (en) * 2017-06-29 2021-10-19 The Procter & Gamble Company Method for treating a surface
US20240015151A1 (en) * 2017-06-30 2024-01-11 Wells Fargo Bank, N.A. Authentication as a service
US11507679B2 (en) * 2017-07-01 2022-11-22 Chengdu Qianniucao Information Technology Co., Ltd. Authorization method for form related information
US20200218820A1 (en) * 2017-07-16 2020-07-09 Chengdu Qianniucao Information Technology Co., Ltd. Method for authorizing form data operation authority
US11599656B2 (en) * 2017-07-16 2023-03-07 Chengdu Qianniucao Information Technology Co., Ltd. Method for authorizing form data operation authority
US20210121059A1 (en) * 2017-09-11 2021-04-29 Nikon Corporation Ophthalmic instrument, management device, and method for managing an ophthalmic instrument
US20210407632A1 (en) * 2017-09-13 2021-12-30 Koninklijke Philips N.V. System and method for displaying electronic health records
US11381450B1 (en) * 2017-11-03 2022-07-05 Vignet Incorporated Altering digital therapeutics over time to achieve desired outcomes
US20190207814A1 (en) * 2017-11-03 2019-07-04 Vignet Incorporated Systems and methods for managing operation of devices in complex systems and changing environments
US11700175B2 (en) 2017-11-03 2023-07-11 Vignet Incorporated Personalized digital therapeutics to reduce medication side effects
US11616688B1 (en) 2017-11-03 2023-03-28 Vignet Incorporated Adapting delivery of digital therapeutics for precision medicine
US11153156B2 (en) * 2017-11-03 2021-10-19 Vignet Incorporated Achieving personalized outcomes with digital therapeutic applications
US10938651B2 (en) 2017-11-03 2021-03-02 Vignet Incorporated Reducing medication side effects using digital therapeutics
US11153159B2 (en) 2017-11-03 2021-10-19 Vignet Incorporated Digital therapeutics for precision medicine
US11374810B2 (en) 2017-11-03 2022-06-28 Vignet Incorporated Monitoring adherence and dynamically adjusting digital therapeutics
US10332376B2 (en) * 2017-11-28 2019-06-25 Cheng Chieh Investment Co., Ltd. Workplace management system and wearable device therefor
US11495223B2 (en) * 2017-12-08 2022-11-08 Samsung Electronics Co., Ltd. Electronic device for executing application by using phoneme information included in audio data and operation method therefor
US20190180883A1 (en) * 2017-12-11 2019-06-13 Teletracking Technologies, Inc. Milestone detection sensing
US11113427B2 (en) * 2017-12-22 2021-09-07 Lenovo (Beijing) Co., Ltd. Method of displaying contents, a first electronic device and a second electronic device
US11244508B2 (en) * 2018-02-03 2022-02-08 The Johns Hopkins University Augmented reality display for surgical procedures
US11488304B2 (en) * 2018-05-01 2022-11-01 Eizo Corporation Gauze detection system and gauze detection method
US10743140B2 (en) * 2018-05-16 2020-08-11 International Business Machines Corporation Smart location alert system
US20190357010A1 (en) * 2018-05-16 2019-11-21 International Business Machines Corporation Smart location alert system
US11651555B2 (en) * 2018-05-31 2023-05-16 Microsoft Technology Licensing, Llc Re-creation of virtual environment through a video call
US11583360B2 (en) * 2018-08-06 2023-02-21 The Board Of Trustees Of The Leland Stanford Junior University Method for monitoring object flow within a surgical space during a surgery
US20230200913A1 (en) * 2018-08-06 2023-06-29 Univ Leland Stanford Junior Method for monitoring object flow within a surgical space during a surgery
US11409417B1 (en) 2018-08-10 2022-08-09 Vignet Incorporated Dynamic engagement of patients in clinical and digital health research
US11520466B1 (en) 2018-08-10 2022-12-06 Vignet Incorporated Efficient distribution of digital health programs for research studies
US11116587B2 (en) 2018-08-13 2021-09-14 Theator inc. Timeline overlay on surgical video
WO2020047338A1 (en) * 2018-08-29 2020-03-05 Movidius Ltd. Computer vision system
US20200069181A1 (en) * 2018-09-01 2020-03-05 Philip Jaques Sampson I-Examination
WO2020061054A1 (en) * 2018-09-17 2020-03-26 Vet24seven Inc. Veterinary professional animal tracking and support system
US10977495B2 (en) * 2018-10-03 2021-04-13 Cmr Surgical Limited Automatic endoscope video augmentation
US11158423B2 (en) 2018-10-26 2021-10-26 Vignet Incorporated Adapted digital therapeutic plans based on biomarkers
US11682474B2 (en) * 2018-12-12 2023-06-20 International Business Machines Corporation Enhanced user screening for sensitive services
US10645092B1 (en) * 2019-01-03 2020-05-05 Truly Social Games Llc Control and/or verification of interactions of an augmented reality deployment
WO2020157693A1 (en) * 2019-01-31 2020-08-06 Herzog Samuel Patient-centric health care system
US11923079B1 (en) 2019-02-01 2024-03-05 Vignet Incorporated Creating and testing digital bio-markers based on genetic and phenotypic data for therapeutic interventions and clinical trials
US11238979B1 (en) 2019-02-01 2022-02-01 Vignet Incorporated Digital biomarkers for health research, digital therapeutics, and precision medicine
US11452576B2 (en) 2019-02-21 2022-09-27 Theator inc. Post discharge risk prediction
US11426255B2 (en) 2019-02-21 2022-08-30 Theator inc. Complexity analysis and cataloging of surgical footage
US20200273548A1 (en) * 2019-02-21 2020-08-27 Theator inc. Video Used to Automatically Populate a Postoperative Report
US11798092B2 (en) 2019-02-21 2023-10-24 Theator inc. Estimating a source and extent of fluid leakage during surgery
US11065079B2 (en) 2019-02-21 2021-07-20 Theator inc. Image-based system for estimating surgical contact force
US11484384B2 (en) 2019-02-21 2022-11-01 Theator inc. Compilation video of differing events in surgeries on different patients
US11380431B2 (en) 2019-02-21 2022-07-05 Theator inc. Generating support data when recording or reproducing surgical videos
US11763923B2 (en) 2019-02-21 2023-09-19 Theator inc. System for detecting an omitted event during a surgical procedure
US10943682B2 (en) * 2019-02-21 2021-03-09 Theator inc. Video used to automatically populate a postoperative report
US11769207B2 (en) 2019-02-21 2023-09-26 Theator inc. Video used to automatically populate a postoperative report
US10886015B2 (en) 2019-02-21 2021-01-05 Theator inc. System for providing decision support to a surgeon
US10729502B1 (en) 2019-02-21 2020-08-04 Theator inc. Intraoperative surgical event summary
WO2020188119A1 (en) 2019-03-21 2020-09-24 Kepler Vision Technologies B.V. A medical device for transcription of appearances in an image to text with machine learning
US12073562B2 (en) * 2019-03-21 2024-08-27 Kepler Vision Technologies B.V. Medical device for transcription of appearances in an image to text with machine learning
US20210233234A1 (en) * 2019-03-21 2021-07-29 Kepler Vision Technologies B.V. A medical device for transcription of appearances in an image to text with machine learning
US20230281813A1 (en) * 2019-03-21 2023-09-07 Kepler Vision Technologies B.V. Medical device for transcription of appearances in an image to text with machine learning
US11688062B2 (en) * 2019-03-21 2023-06-27 Kepler Vision Technologies B.V. Medical device for transcription of appearances in an image to text with machine learning
US10970858B2 (en) * 2019-05-15 2021-04-06 International Business Machines Corporation Augmented reality for monitoring objects to decrease cross contamination between different regions
US11545271B2 (en) * 2019-08-20 2023-01-03 GE Precision Healthcare LLC Systems and methods for public and private communication threads
US12122420B2 (en) 2019-08-29 2024-10-22 Intel Corporation Computer vision system
US11449189B1 (en) * 2019-10-02 2022-09-20 Facebook Technologies, Llc Virtual reality-based augmented reality development system
US11500606B2 (en) * 2019-10-28 2022-11-15 Beijing Boe Optoelectronics Technology Co., Ltd. AR display device and method
US11816757B1 (en) * 2019-12-11 2023-11-14 Meta Platforms Technologies, Llc Device-side capture of data representative of an artificial reality environment
US11442126B2 (en) * 2019-12-13 2022-09-13 Siemens Healthcare Gmbh System and method for estimating a relative substance composition of a portion of a body of a patient
US11687778B2 (en) 2020-01-06 2023-06-27 The Research Foundation For The State University Of New York Fakecatcher: detection of synthetic portrait videos using biological signals
US12106216B2 (en) 2020-01-06 2024-10-01 The Research Foundation For The State University Of New York Fakecatcher: detection of synthetic portrait videos using biological signals
EP3889970A1 (en) 2020-04-03 2021-10-06 Koninklijke Philips N.V. Diagnosis support system
US11227686B2 (en) 2020-04-05 2022-01-18 Theator inc. Systems and methods for processing integrated surgical video collections to identify relationships using artificial intelligence
US11224485B2 (en) 2020-04-05 2022-01-18 Theator inc. Image analysis for detecting deviations from a surgical plane
US11348682B2 (en) 2020-04-05 2022-05-31 Theator, Inc. Automated assessment of surgical competency from video analyses
US12033104B2 (en) 2020-04-05 2024-07-09 Theator inc. Time and location-based linking of captured medical information with medical records
US11574435B1 (en) * 2020-04-07 2023-02-07 Robert Edwin Douglas Multi-user extended reality viewing technique
US11641460B1 (en) * 2020-04-27 2023-05-02 Apple Inc. Generating a volumetric representation of a capture region
US11556720B2 (en) * 2020-05-05 2023-01-17 International Business Machines Corporation Context information reformation and transfer mechanism at inflection point
US11998305B2 (en) 2020-05-11 2024-06-04 BraveHeart Wireless Inc. Systems and methods for using a wearable health monitor
EP3909500A1 (en) * 2020-05-11 2021-11-17 BraveHeart Wireless Inc. Systems and methods for using algorithms and acoustic input to control, monitor, annotate, and configure a wearable health monitor that monitors physiological signals
US11087557B1 (en) 2020-06-03 2021-08-10 Tovy Kamine Methods and systems for remote augmented reality communication for guided surgery
US11568877B2 (en) * 2020-08-05 2023-01-31 Interactive Solutions Corp. System to change image based on voice
US11302448B1 (en) 2020-08-05 2022-04-12 Vignet Incorporated Machine learning to select digital therapeutics
US20220335952A1 (en) * 2020-08-05 2022-10-20 Interactive Solutions Corp. System to Change Image Based on Voice
US11456080B1 (en) 2020-08-05 2022-09-27 Vignet Incorporated Adjusting disease data collection to provide high-quality health data to meet needs of different communities
US20230154469A1 (en) * 2020-08-05 2023-05-18 Interactive Solutions Corp. System to Change Image Based on Voice
US11322260B1 (en) 2020-08-05 2022-05-03 Vignet Incorporated Using predictive models to predict disease onset and select pharmaceuticals
US11504011B1 (en) 2020-08-05 2022-11-22 Vignet Incorporated Early detection and prevention of infectious disease transmission using location data and geofencing
US11963683B2 (en) 2020-10-02 2024-04-23 Cilag Gmbh International Method for operating tiered operation modes in a surgical system
WO2022070072A1 (en) * 2020-10-02 2022-04-07 Cilag Gmbh International Control of a display outside the sterile field from a device within the sterile field
US11883022B2 (en) 2020-10-02 2024-01-30 Cilag Gmbh International Shared situational awareness of the device actuator activity to prioritize certain aspects of displayed information
US11877897B2 (en) 2020-10-02 2024-01-23 Cilag Gmbh International Situational awareness of instruments location and individualization of users to control displays
US11992372B2 (en) 2020-10-02 2024-05-28 Cilag Gmbh International Cooperative surgical displays
US11830602B2 (en) 2020-10-02 2023-11-28 Cilag Gmbh International Surgical hub having variable interconnectivity capabilities
US11672534B2 (en) 2020-10-02 2023-06-13 Cilag Gmbh International Communication capability of a smart stapler
US11748924B2 (en) 2020-10-02 2023-09-05 Cilag Gmbh International Tiered system display control based on capacity and user operation
US20220104910A1 (en) * 2020-10-02 2022-04-07 Ethicon Llc Monitoring of user visual gaze to control which display system displays the primary information
US12064293B2 (en) 2020-10-02 2024-08-20 Cilag Gmbh International Field programmable surgical visualization system
US12016566B2 (en) 2020-10-02 2024-06-25 Cilag Gmbh International Surgical instrument with adaptive function controls
US11763919B1 (en) 2020-10-13 2023-09-19 Vignet Incorporated Platform to increase patient engagement in clinical trials through surveys presented on mobile devices
US20220139514A1 (en) * 2020-11-03 2022-05-05 Nuance Communications, Inc. Communication System and Method
CN112397192A (zh) * 2020-11-09 2021-02-23 He Ben Intelligent medical remote ward-round identification system for cardiovascular departments
US11942218B2 (en) 2021-01-12 2024-03-26 Emed Labs, Llc Health testing and diagnostics platform
US11289196B1 (en) 2021-01-12 2022-03-29 Emed Labs, Llc Health testing and diagnostics platform
US11568988B2 (en) 2021-01-12 2023-01-31 Emed Labs, Llc Health testing and diagnostics platform
US11605459B2 (en) 2021-01-12 2023-03-14 Emed Labs, Llc Health testing and diagnostics platform
US11894137B2 (en) 2021-01-12 2024-02-06 Emed Labs, Llc Health testing and diagnostics platform
US11410773B2 (en) 2021-01-12 2022-08-09 Emed Labs, Llc Health testing and diagnostics platform
US11367530B1 (en) 2021-01-12 2022-06-21 Emed Labs, Llc Health testing and diagnostics platform
US11875896B2 (en) 2021-01-12 2024-01-16 Emed Labs, Llc Health testing and diagnostics platform
US11393586B1 (en) 2021-01-12 2022-07-19 Emed Labs, Llc Health testing and diagnostics platform
US11804299B2 (en) 2021-01-12 2023-10-31 Emed Labs, Llc Health testing and diagnostics platform
US11789837B1 (en) 2021-02-03 2023-10-17 Vignet Incorporated Adaptive data collection in clinical trials to increase the likelihood of on-time completion of a trial
US11924283B2 (en) 2021-02-08 2024-03-05 Multinarity Ltd Moving content between virtual and physical displays
US12095866B2 (en) 2021-02-08 2024-09-17 Multinarity Ltd Sharing obscured content to provide situational awareness
US11811876B2 (en) 2021-02-08 2023-11-07 Sightful Computers Ltd Virtual display changes based on positions of viewers
US12094070B2 (en) 2021-02-08 2024-09-17 Sightful Computers Ltd Coordinating cursor movement between a physical surface and a virtual surface
US11882189B2 (en) 2021-02-08 2024-01-23 Sightful Computers Ltd Color-sensitive virtual markings of objects
US12095867B2 (en) 2021-02-08 2024-09-17 Sightful Computers Ltd Shared extended reality coordinate system generated on-the-fly
US20220262078A1 (en) * 2021-02-14 2022-08-18 Broadstone Technologies, Llc Remote device provisioning and remote support using augmented reality
US20220301314A1 (en) * 2021-03-16 2022-09-22 Siemens Energy, Inc. System and method for automated foreign material exclusion attendant
US11515037B2 (en) 2021-03-23 2022-11-29 Emed Labs, Llc Remote diagnostic testing and treatment
US11615888B2 (en) 2021-03-23 2023-03-28 Emed Labs, Llc Remote diagnostic testing and treatment
US12094606B2 (en) 2021-03-23 2024-09-17 Emed Labs, Llc Remote diagnostic testing and treatment
US11869659B2 (en) 2021-03-23 2024-01-09 Emed Labs, Llc Remote diagnostic testing and treatment
US11894138B2 (en) 2021-03-23 2024-02-06 Emed Labs, Llc Remote diagnostic testing and treatment
US11281553B1 (en) 2021-04-16 2022-03-22 Vignet Incorporated Digital systems for enrolling participants in health research and decentralized clinical trials
US11586524B1 (en) 2021-04-16 2023-02-21 Vignet Incorporated Assisting researchers to identify opportunities for new sub-studies in digital health research and decentralized clinical trials
US11645180B1 (en) 2021-04-16 2023-05-09 Vignet Incorporated Predicting and increasing engagement for participants in decentralized clinical trials
US20220354440A1 (en) * 2021-05-04 2022-11-10 Willis Dennis Grajales Worldwide vision screening and visual field screening booth, kiosk, or exam room using artificial intelligence, screen sharing technology, and telemedicine video conferencing system to interconnect patient with eye doctor anywhere around the world via the internet using ethernet, 4G, 5G, 6G or Wifi for teleconsultation and to review results
US11373756B1 (en) 2021-05-24 2022-06-28 Emed Labs, Llc Systems, devices, and methods for diagnostic aid kit apparatus
US11929168B2 (en) 2021-05-24 2024-03-12 Emed Labs, Llc Systems, devices, and methods for diagnostic aid kit apparatus
US11369454B1 (en) 2021-05-24 2022-06-28 Emed Labs, Llc Systems, devices, and methods for diagnostic aid kit apparatus
US20220392629A1 (en) * 2021-06-02 2022-12-08 Sandeep REDDY Portable healthcare system installed at a remote location having limited internet connectivity
US11610682B2 (en) 2021-06-22 2023-03-21 Emed Labs, Llc Systems, methods, and devices for non-human readable diagnostic tests
US11809213B2 (en) 2021-07-28 2023-11-07 Multinarity Ltd Controlling duty cycle in wearable extended reality appliances
US11816256B2 (en) 2021-07-28 2023-11-14 Multinarity Ltd. Interpreting commands in extended reality environments based on distances from physical input devices
US11829524B2 (en) 2021-07-28 2023-11-28 Multinarity Ltd. Moving content between a virtual display and an extended reality environment
US11861061B2 (en) 2021-07-28 2024-01-02 Sightful Computers Ltd Virtual sharing of physical notebook
US11748056B2 (en) 2021-07-28 2023-09-05 Sightful Computers Ltd Tying a virtual speaker to a physical space
US20230146384A1 (en) * 2021-07-28 2023-05-11 Multinarity Ltd Initiating sensory prompts indicative of changes outside a field of view
US12014829B2 (en) 2021-09-01 2024-06-18 Emed Labs, Llc Image processing and presentation techniques for enhanced proctoring sessions
CN113723100A (zh) * 2021-09-09 2021-11-30 State Grid E-Commerce Co., Ltd. Open-source component identification method and apparatus based on fingerprint features
US20230267933A1 (en) * 2021-09-27 2023-08-24 International Business Machines Corporation Selective inclusion of speech content in documents
US11705230B1 (en) 2021-11-30 2023-07-18 Vignet Incorporated Assessing health risks using genetic, epigenetic, and phenotypic data sources
US11901083B1 (en) 2021-11-30 2024-02-13 Vignet Incorporated Using genetic and phenotypic data sets for drug discovery clinical trials
WO2023159236A1 (en) * 2022-02-18 2023-08-24 Curelator, Inc. Personal medical avatar
US20230266872A1 (en) * 2022-02-18 2023-08-24 SoftAcuity, Inc. Intelligent surgical display system and method
WO2023244267A1 (en) * 2022-06-13 2023-12-21 Magic Leap, Inc. Systems and methods for human gait analysis, real-time feedback and rehabilitation using an extended-reality device
WO2024049435A1 (en) * 2022-09-01 2024-03-07 Exo Imaging, Inc. Apparatus, system and method to control an ultrasonic image on a display based on sensor input at an ultrasonic imaging device
US12079442B2 (en) 2022-09-30 2024-09-03 Sightful Computers Ltd Presenting extended reality content in different physical environments
US12073054B2 (en) 2022-09-30 2024-08-27 Sightful Computers Ltd Managing virtual collisions between moving virtual objects
US12099696B2 (en) 2022-09-30 2024-09-24 Sightful Computers Ltd Displaying virtual content on moving vehicles
US12112012B2 (en) 2022-09-30 2024-10-08 Sightful Computers Ltd User-customized location based content presentation
WO2024097431A1 (en) * 2022-11-06 2024-05-10 Hajeebu Sreehita System and method of wound assessment
US11948263B1 (en) 2023-03-14 2024-04-02 Sightful Computers Ltd Recording the complete physical and extended reality environments of a user
WO2024196288A1 (en) * 2023-03-22 2024-09-26 Telefonaktiebolaget Lm Ericsson (Publ) Methods and apparatuses for remote control of controllable electrical devices in a surrounding physical environment of a user
US12124675B2 (en) 2023-12-05 2024-10-22 Sightful Computers Ltd Location-based virtual resource locator

Also Published As

Publication number Publication date
CN110431636A (zh) 2019-11-08
CN116230153A (zh) 2023-06-06
IL267995A (en) 2019-09-26
AU2018207068A1 (en) 2019-07-25
JP7549173B2 (ja) 2024-09-10
CA3049431A1 (en) 2018-07-19
US20230075466A1 (en) 2023-03-09
EP3568783A1 (en) 2019-11-20
JP2020505675A (ja) 2020-02-20
JP7478795B2 (ja) 2024-05-07
KR102662173B1 (ko) 2024-04-30
WO2018132336A1 (en) 2018-07-19
JP2024096186A (ja) 2024-07-12
JP2023001290A (ja) 2023-01-04
KR20240059645A (ko) 2024-05-07
JP7224288B2 (ja) 2023-02-17
US12080393B2 (en) 2024-09-03
EP3568783A4 (en) 2020-11-11
KR20190102060A (ko) 2019-09-02

Similar Documents

Publication Publication Date Title
US12080393B2 (en) Medical assistant
JP7091531B2 (ja) Method for on-body gesture interface and projection display
AU2023214273B2 (en) Imaging modification, display and visualization using augmented and virtual reality eyewear
Silva et al. Emerging applications of virtual reality in cardiovascular medicine
Preum et al. A review of cognitive assistants for healthcare: Trends, prospects, and future directions
JP2021099866A (ja) System and method
CN112566580A (zh) Virtual guidance for ankle surgery
AU2012219077A1 (en) System and method for performing an automatic and remote trained personnel guided medical examination
Jalaliniya et al. Designing wearable personal assistants for surgeons: An egocentric approach
US20230140072A1 (en) Systems and methods for medical procedure preparation
US20240363209A1 (en) Medical assistant

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: MAGIC LEAP, INC., FLORIDA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HARRISES, CHRISTOPHER M.;TECHNICAL SOLUTIONS, INC.;REEL/FRAME:045661/0556

Effective date: 20170509

Owner name: MAGIC LEAP, INC., FLORIDA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ROBAINA, NASTASJA U.;SAMEC, NICOLE ELIZABETH;BAERENRODT, MARK;SIGNING DATES FROM 20170207 TO 20170208;REEL/FRAME:045660/0467

AS Assignment

Owner name: JP MORGAN CHASE BANK, N.A., NEW YORK

Free format text: PATENT SECURITY AGREEMENT;ASSIGNORS:MAGIC LEAP, INC.;MOLECULAR IMPRINTS, INC.;MENTOR ACQUISITION ONE, LLC;REEL/FRAME:050138/0287

Effective date: 20190820

AS Assignment

Owner name: CITIBANK, N.A., NEW YORK

Free format text: ASSIGNMENT OF SECURITY INTEREST IN PATENTS;ASSIGNOR:JPMORGAN CHASE BANK, N.A.;REEL/FRAME:050967/0138

Effective date: 20191106

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: MAGIC LEAP, INC., FLORIDA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ROBAINA, NASTASJA U.;SAMEC, NICOLE ELIZABETH;BAERENRODT, MARK;SIGNING DATES FROM 20200708 TO 20210106;REEL/FRAME:055039/0336

Owner name: MAGIC LEAP, INC., FLORIDA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HARRISES, CHRISTOPHER M.;TECHNICAL SOLUTIONS, INC.;REEL/FRAME:055039/0293

Effective date: 20200713

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION