EP4262944A1 - Robotic ventilation system for correct mask positioning - Google Patents

Robotic ventilation system for correct mask positioning

Info

Publication number
EP4262944A1
Authority
EP
European Patent Office
Prior art keywords
user
mask
face
arm
sleep state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP21890248.4A
Other languages
English (en)
French (fr)
Inventor
Scott Nortman
David JASSIR
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Publication of EP4262944A1
Legal status: Pending

Classifications

    • A61M16/06 Respiratory or anaesthetic masks (under A61M16/00: devices for influencing the respiratory system of patients by gas treatment, e.g. mouth-to-mouth respiration; tracheal tubes)
    • A61M16/0683 Holding devices therefor
    • A61M16/024 Control means for electrically operated gas-treatment devices, including calculation means, e.g. using a processor (under A61M16/021, A61M16/022)
    • A61M16/0875 Connecting tubes (under A61M16/08: bellows; connecting tubes; water traps; patient circuits)
    • A61B34/20 Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis (under A61B34/00: computer-aided surgery; manipulators or robots specially adapted for use in surgery)
    • A61B34/30 Surgical robots
    • A61B2034/2055 Optical tracking systems (under A61B2034/2046: tracking techniques)
    • A61B2034/2065 Tracking using image or pattern recognition
    • A61B5/01 Measuring temperature of body parts; diagnostic temperature sensing (under A61B5/00: measuring for diagnostic purposes; identification of persons)
    • A61B5/024 Detecting, measuring or recording pulse rate or heart rate
    • A61B5/0816 Measuring devices for examining respiratory frequency
    • A61B5/4806 Sleep evaluation
    • G01S5/163 Determination of attitude (under G01S5/16: position-fixing using electromagnetic waves other than radio waves)
    • G16H20/40 ICT specially adapted for therapies or health-improving plans, relating to mechanical, radiation or invasive therapies, e.g. surgery
    • G16H30/20 ICT specially adapted for handling medical images, e.g. DICOM, HL7 or PACS
    • G16H30/40 ICT specially adapted for processing medical images, e.g. editing
    • G16H40/63 ICT specially adapted for the local operation of medical equipment or devices
    • A61M2205/10 Apparatus with powered movement mechanisms
    • A61M2205/13 Means for the detection of operative contact with patient, e.g. lip sensor
    • A61M2205/3306 Optical measuring means
    • A61M2205/332 Force measuring means
    • A61M2205/3368 Temperature measuring means
    • A61M2205/3375 Acoustical, e.g. ultrasonic, measuring means
    • A61M2205/502 User interfaces, e.g. screens or keyboards
    • A61M2205/52 Memories providing a history of measured variating parameters of apparatus or patient
    • A61M2209/082 Mounting brackets, arm supports for equipment
    • A61M2209/088 Supports for equipment on the body
    • A61M2210/0606 Face (under A61M2210/06: head)
    • A61M2230/06 Heartbeat rate only (under A61M2230/04: heartbeat characteristics, e.g. ECG; A61M2230/00: measuring parameters of the user)
    • A61M2230/205 Blood composition characteristics: partial oxygen pressure (P-O2)
    • A61M2230/30 Blood pressure
    • A61M2230/42 Respiratory rate (under A61M2230/40: respiratory characteristics)
    • A61M2230/50 Temperature
    • A61M2230/62 Posture
    • A61M2230/63 Motion, e.g. physical activity

Definitions

  • The present disclosure relates to medical systems, devices, and methods, and more specifically to a robotic ventilation system, and still more specifically to a robotic ventilation system comprising a kinematic mount and a fiducial marker.
  • FIG. 1A is a drawing of a robotic ventilation system.
  • FIG. 1B is a drawing of a coordinate frame system usable with the robotic ventilation system.
  • FIG. 2 is a drawing of a mask system of the robotic ventilation system.
  • FIG. 3 is a drawing of a detail of the robotic ventilation system, showing the mask system attaching to a flange via a kinematic mount.
  • FIG. 4 is a drawing of the robotic ventilation system in use on behalf of a user.
  • FIG. 5 is a flow chart of a method for correct mask placement using a robotic ventilation system including a kinematic mount.
  • FIG. 6 is a flow chart of a method for correct mask placement using a robotic ventilation system including a fiducial marker.
  • FIG. 7 is a flow chart of a method for correct mask placement using a robotic ventilation system including a kinematic mount and further including a fiducial marker.
  • A robotic ventilation system for correct mask placement comprising a kinematic mount includes: a robot comprising an arm, the arm comprising a flange, the flange coupled to an end of the arm, the arm configured to move the flange along a degree of freedom; a mask coupled to the flange, the mask configured to deliver gas to a user, wherein the arm further comprises a kinematic mount, the kinematic mount usable to do one or more of orient and locate the mask with respect to the flange; a ventilator coupled to the mask, the ventilator configured to deliver the gas to the mask; a gas tube coupled to both the mask and the ventilator, the gas tube configured to carry gas between the ventilator and the mask; a controller configured to change a pose of the mask, the controller further configured to control the delivery of the gas from the ventilator to the user; and a tracking system, the tracking system configured to capture image data of one or more of the mask and a face of the user, wherein the controller is configured to change the mask pose.
  • A robotic ventilation system for correct mask placement comprising a kinematic mount and a fiducial marker includes: a robot comprising an arm, the arm comprising a flange, the flange coupled to an end of the arm, the arm configured to move the flange along a degree of freedom, the arm further comprising a kinematic mount; and a mask coupled to the flange by a mounting feature configured for intermittent attachment, the mask configured to deliver gas to a user, wherein the kinematic mount is usable to do one or more of orient and locate the mask with respect to the flange, the mounting feature comprising the kinematic mount, the mounting feature further comprising an opening, wherein the mounting feature further comprises a sensor, the sensor configured to verify that proper seating of the mask on the user has occurred, wherein a spatial alignment of the mask with respect to the kinematic mount is known a priori.
  • A method for correct mask placement using a robotic ventilation system comprising a kinematic mount includes: capturing image data, the image data comprising a facial feature of a user, by a system including: a robot comprising an arm, the arm comprising a flange, the flange coupled to an end of the arm, the arm configured to move the flange along a degree of freedom; a mask coupled to the flange, the mask configured to deliver gas to a user, wherein the arm further comprises a kinematic mount, the kinematic mount usable to do one or more of orient and locate the mask with respect to the flange; a ventilator coupled to the mask, the ventilator configured to deliver the gas to the mask; a gas tube coupled to both the mask and the ventilator, the gas tube configured to carry gas between the ventilator and the mask; a controller configured to change a pose of the mask, the controller further configured to control the delivery of the gas from the ventilator to the user; and a tracking system, the tracking system configured to capture the image data.
  • A method for correct mask placement using a robotic ventilation system comprising a fiducial marker includes: capturing image data, the image data comprising a facial feature of a user, the image data further comprising image data regarding a fiducial marker, by a system including: a robot comprising an arm, the arm comprising a flange, the flange coupled to an end of the arm, the arm configured to move the flange along a degree of freedom; a mask coupled to the flange, the mask configured to deliver gas to a user, the mask comprising the fiducial marker; a ventilator coupled to the mask, the ventilator configured to deliver the gas to the mask; a gas tube coupled to both the mask and the ventilator, the gas tube configured to carry the gas between the ventilator and the mask; a controller configured to change a pose of the mask, the controller further configured to control the delivery of the gas from the ventilator to the user; and a tracking system, the tracking system configured to capture image data of one or more of the mask and a face of the user.
  • A method for correct mask placement using a robotic ventilation system comprising a kinematic mount includes: capturing image data, the image data comprising a facial feature of a user, by a system including: a robot comprising an arm, the arm comprising a flange, the flange coupled to an end of the arm, the arm configured to move the flange along a degree of freedom, the arm further comprising a kinematic mount; and a mask coupled to the flange by a mounting feature configured for intermittent attachment, the mask configured to deliver gas to the user, wherein the kinematic mount is usable to do one or more of orient and locate the mask with respect to the flange, the mounting feature comprising the kinematic mount, the mounting feature further comprising an opening, wherein the mounting feature further comprises a sensor, the sensor configured to verify that proper seating of the mask on the user has occurred, wherein a spatial alignment of the mask with respect to the kinematic mount is known a priori.
  • The disclosure describes a system and method for correct placement and for proper force application of a ventilator mask onto a face of a user, and for maintaining the proper placement and force application of the mask while the user does one or more of sleep and move.
  • A robotic arm has the mask operatively coupled to its distal end, and a robot base is placed beside the user.
  • The reach and dexterity of the robotic arm can allow for correct placement and for proper force application of the mask onto the face of the user while the user does one or more of lie in bed and move during sleep.
  • The distal end of the robotic arm can also be operatively coupled to a computer vision system that views the face of the user and optically tracks needed facial features, in real time, to cause the robotic arm to maintain correct alignment of the mask with the face of the user.
  • A biometric sensor system can measure and analyze signals of the user to determine a sleep state of the user in real time, and can cause the robotic arm to position and orient the mask after the user has entered a specified sleep state.
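The gating behavior described in the bullets above (position the mask only after the biometric sensor system reports a specified sleep state, then hold alignment against the face) can be sketched as a small state machine. The class, state, and action names below are illustrative, not part of the disclosure.

```python
from enum import Enum, auto

class SleepState(Enum):
    AWAKE = auto()
    NREM1 = auto()
    NREM2 = auto()
    NREM3 = auto()
    REM = auto()

class RoboticVentilationController:
    """Hypothetical top-level loop: wait for the requisite sleep state,
    place the mask once, then keep it aligned while the user moves."""

    def __init__(self, requisite_state: SleepState = SleepState.NREM2):
        self.requisite_state = requisite_state
        self.mask_placed = False

    def step(self, observed_state: SleepState) -> str:
        # Placement is gated on the sleep state reported by the
        # biometric sensor system (item 111 in the disclosure).
        if not self.mask_placed:
            if observed_state == self.requisite_state:
                self.mask_placed = True
                return "place_mask"
            return "wait"
        # Once placed, the tracking system drives continuous servoing
        # so the mask stays aligned with the face.
        return "maintain_alignment"
```

After placement, the controller keeps returning the alignment action regardless of sleep state, matching the disclosure's goal of maintaining placement while the user sleeps and moves.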
  • FIG. 1A is a drawing of a robotic ventilation system 100.
  • The system 100 comprises a robot 101 and a base 102 operably connected to the robot 101.
  • The robot 101 comprises a mask system 103.
  • The mask system 103 comprises a mask 104, the mask 104 configured to deliver the breathable gas 107 to the user 109.
  • The mask 104 comprises one or more of a face mask, a partial face mask, a nasal mask, a mouth mask, a nasal pillow mask, and a nasal plug mask.
  • The robot 101 further comprises a tube 105, the tube 105 operably connected to the mask system 103.
  • The base 102 comprises a ventilator 106, the ventilator 106 configured to deliver breathable gas 107 to the mask 104 through the tube 105.
  • The mask 104 then delivers the gas 107 to a face 108 of a user 109.
  • Alternative embodiments comprise the ventilator 106 as a standalone ventilator 106.
  • The system 100 is configured to deliver the breathable gas 107 to the user 109 via the mask 104.
  • The system 100 delivers the breathable gas 107 to the user 109 via the mask 104 after detecting a state of the user 109.
  • The system 100 delivers the breathable gas 107 to the user 109 via the mask 104 after detecting a requisite sleep state of the user 109.
  • The requisite sleep state may be predetermined by the user 109. Alternatively, or additionally, the requisite sleep state may be determined by the system 100. For example, the requisite sleep state may be calculated by the system 100.
  • The requisite sleep state comprises one or more of a wakeful state, a stage 1 non-rapid eye movement (non-REM) sleep state, a stage 2 non-REM sleep state, a stage 3 non-REM sleep state, and an REM sleep state.
  • The system 100 delivers the breathable gas 107 to the user 109 after the user 109 falls asleep.
  • The system 100 delivers the breathable gas 107 to the user 109 after fulfillment of a time-based condition.
  • The system 100 delivers the breathable gas 107 to the user 109 after one or more of the passage of a predetermined amount of time and the arrival of a predetermined time of day.
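The delivery triggers just described (a requisite sleep state, a predetermined elapsed time, or a predetermined time of day) amount to a simple disjunction. The function and parameter names in this sketch are hypothetical, not taken from the disclosure.

```python
import datetime
from typing import Optional

def should_start_delivery(current_state: str,
                          requisite_state: Optional[str],
                          elapsed_s: Optional[float] = None,
                          min_elapsed_s: Optional[float] = None,
                          now: Optional[datetime.time] = None,
                          start_time: Optional[datetime.time] = None) -> bool:
    """Return True if any configured trigger for gas delivery is met."""
    # Trigger 1: the user has reached the requisite sleep state.
    if requisite_state is not None and current_state == requisite_state:
        return True
    # Trigger 2: a predetermined amount of time has passed.
    if (min_elapsed_s is not None and elapsed_s is not None
            and elapsed_s >= min_elapsed_s):
        return True
    # Trigger 3: a predetermined time of day has been reached.
    if start_time is not None and now is not None and now >= start_time:
        return True
    return False
```

Each trigger is optional, mirroring the "one or more of" phrasing used throughout the claims.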
  • The mask 104 further comprises a fiducial marker 110.
  • The fiducial marker 110 comprises a tracked object 110 placed at a known pose in a sensing area of a tracking system, such that it can be viewed by the tracking system and can serve as a reference for the tracking system.
  • The fiducial marker comprises a circular fiducial marker such as the MBA08MMA manufactured by NaturalPoint, Inc., of Corvallis, Oregon (www.naturalpoint.com).
  • The fiducial marker comprises a spherical fiducial marker such as the MKR064M3-10 manufactured by NaturalPoint, Inc., of Corvallis, Oregon (www.naturalpoint.com).
  • The fiducial marker comprises a square fiducial marker such as the ArUco marker developed at the University of Cordoba (Campus Universitario de Rabanales, 14071 Cordoba, Spain) (www.uco.es).
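A fiducial marker of known physical size lets a camera recover range from its apparent size via the pinhole model, z = f * S / s. The sketch below illustrates only that relation; a full tracking system such as those referenced above estimates the complete 6-DOF marker pose, and the numbers in the example are assumed for illustration, not manufacturer specifications.

```python
def marker_distance_m(focal_px: float, marker_size_m: float,
                      apparent_px: float) -> float:
    """Pinhole-camera range estimate for a fiducial marker: z = f * S / s,
    where f is the focal length in pixels, S the physical marker size in
    meters, and s its apparent size in pixels."""
    if apparent_px <= 0:
        raise ValueError("apparent size must be positive")
    return focal_px * marker_size_m / apparent_px

# Assumed example: 800 px focal length, 8 mm marker seen at 16 px.
distance = marker_distance_m(800.0, 0.008, 16.0)  # 0.4 m
```

The same relation underlies why larger markers, or longer focal lengths, give more precise range estimates at a given camera resolution.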
  • The robot 101 further comprises a biometric sensor system 111, the biometric sensor system 111 configured to sense biometric information regarding the user 109.
  • The biometric information comprises one or more of a heart rate, a respiratory rate, a temperature, a blood oxygen saturation level, an eye state, motion or lack thereof, an estimated sleep state, a sleep profile, a blood pressure, a pulse rate, a height, a weight, an age, and other biometric information.
  • Tracked facial features of the user can be observed using a non-computer-vision-based approach, such as radar, LIDAR, ultrasound, or any other suitable spatial measuring method.
  • The current sleep state is determined using a remote camera comprised in the biometric sensor system 111.
  • The biometric sensor system 111 comprises one or more of a camera and a watch.
  • The biometric sensor system 111 is physically attached to the user 109.
  • The biometric sensor system 111 comprises a remote biometric sensor system 111 that is not physically attached to the user 109.
  • The biometric sensor system 111 is comprised in a virtual assistant device such as the Alexa device sold by Amazon.com of Seattle, Washington (www.amazon.com).
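As a rough illustration of how such biometric signals could map to an estimated sleep state, the heuristic below thresholds heart rate, respiratory rate, and motion. It is a toy sketch with invented thresholds, not the disclosure's method and not a clinical sleep-staging algorithm (which would use polysomnographic criteria).

```python
def estimate_sleep_state(heart_rate_bpm: float,
                         resp_rate_bpm: float,
                         motion_level: float) -> str:
    """Toy heuristic: deeper sleep tends to show lowered heart and
    respiratory rates and little body motion. motion_level is a
    normalized 0..1 activity measure (assumed scale)."""
    if motion_level > 0.5 or heart_rate_bpm > 75:
        return "awake"
    if heart_rate_bpm > 60 or resp_rate_bpm > 14:
        return "light_sleep"
    return "deep_sleep"
```

A production system would instead fuse these signals over time (e.g. with a trained classifier) rather than apply instantaneous thresholds.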
  • The system 100 further comprises a tracking system 112 configured to track one or more of the mask system 103, the fiducial marker 110, and a face feature point of the user 109.
  • The tracking system 112 is further configured to capture image data.
  • The image data comprises a facial feature (not shown in FIG. 1A; item 407 in FIG. 4) of the user 109.
  • The tracking system 112 comprises a camera system configured to capture the image data.
  • While a computer vision system 112 is a preferred implementation of the invention, more broadly any tracking system 112 will work, including one or more of a computer vision system, a radio detection and ranging (RADAR) tracking system, a light detection and ranging (LIDAR) tracking system, an ultrasound tracking system, and any other suitable spatial measuring method.
  • The tracking system 112 is configured to identify a face feature point.
  • The face feature point comprises one or more of an eye center point, a nose center point, a mouth lateral point, and a mouth center point.
  • The robot 101 further comprises an arm 114, the arm 114 configured to do one or more of position the mask 104 with respect to the user 109 and orient the mask 104 with respect to the user 109.
  • The arm 114 comprises a flange 115, the flange 115 coupled to an end of the arm 114.
  • The arm 114 can have one or more degrees of freedom.
  • The flange 115 can have one or more degrees of freedom.
  • The arm 114 is configured to move the flange 115 along a degree of freedom.
  • The arm 114 can have seven degrees of freedom.
  • The arm 114 can have fewer degrees of freedom, for example six degrees of freedom, or more degrees of freedom, for example eight degrees of freedom.
  • The six degrees of freedom comprise one or more of three directions describing a location of the arm 114 and three directions describing an orientation of the arm 114.
  • The robot 101 further comprises a robot base 116 configured to support the arm 114.
  • The flange 115 is coupled to an end of the arm 114 that is distal from the robot base 116.
  • The mask 104 is also coupled to the flange 115.
  • the arm 114 further comprises a kinematic mount 117, the kinematic mount 117 usable to do one or more of orient and locate the mask 104 with respect to the flange 115.
  • the kinematic mount 117 is usable to both orient the mask with respect to the flange 115 and locate the mask 104 with respect to the flange 115.
  • the arm 114 comprises a mount other than a kinematic mount, attaching to the flange 115.
  • the kinematic mount 117 comprises a flange kinematic mount 119, the flange kinematic mount 119 mounted to the flange 115.
  • the kinematic mount 117 further comprises a mask system kinematic mount 120, the mask system kinematic mount 120 mounted to the mask system 103.
  • the mask system kinematic mount 120 attaches to the flange kinematic mount 119 by a mounting feature 121.
  • the mounting feature 121 is configured for intermittent attachment of the flange kinematic mount 119 and the mask system kinematic mount 120. For example, detachment of the flange kinematic mount 119 from the mask system kinematic mount 120 may be necessary for one or more of cleaning, sterilizing, sanitizing, replacing, and the like.
  • the mask system kinematic mount 120 comprises an SB1 kinematic mount base manufactured by Thorlabs of Newton, New Jersey (www.thorlabs.com).
  • the mounting feature 121 comprises one or more of a kinematic mounting feature 121, a magnetic mounting feature 121, a mechanical mounting feature 121, and a clamp-on mounting feature 121.
  • the system 100 is configured to move the arm 114 toward the face 108 of the user 109 after the face feature point is identified.
  • the system 100 is further configured to move the arm 114 so as to bring the mask 104 into contact with the face 108 of the user 109 after the face feature point is identified.
  • the system 100 is further configured, following bringing the mask 104 into contact with the face 108 of the user 109, to command the arm 114 to maintain one or more of an appropriate contact force with respect to the user’s face 108 and an appropriate position with respect to the user’s face 108.
  • the appropriate contact force and the appropriate position create a safe and effective contact engagement between the mask 104 and the face 108 of the user 109.
  • the system 100 is further configured, following bringing the mask 104 into contact with the face 108 of the user 109, to command the arm 114 to maintain both an appropriate contact force with respect to the user’s face 108 and an appropriate position with respect to the user’s face 108.
  • the system 100 is further configured, following movement of the face 108 of the user 109, to move the position of the arm 114 so as to maintain an appropriate position with respect to the user’s face 108.
  • the system 100 is further configured, wherein following a failure of the system 100 to maintain the arm 114 at both an appropriate contact force and an appropriate position, to retract the arm 114 so as to move the mask 104 away from the face 108 of the user 109.
  • the system 100 is further configured, following retraction of the arm 114, to go into a mode in which the system 100 awaits entry by the user 109 into a requisite sleep state.
  • the system 100 is further configured, upon entry by the user 109 into the requisite sleep state, to move the arm 114 toward the face 108 of the user 109.
  • the tracking system 112 can include one or more of a transmitter configured to transmit a signal describing a biometric signal of the user 109, a transmitter configured to transmit spatial tracking information, a camera system configured to capture image data, a tracking sensor configured to detect the signal, thereby gathering sensor data, and a processor configured to control one or more aspects of the system 100.
  • the tracking system 112 comprises a tracking system coordinate frame 113.
  • the tracking system coordinate frame 113 can be defined from a feature of the tracking system 112. For example, the coordinate system of the Intel RealSense Depth Camera D435, sold by Intel Corporation of Santa Clara, California (www.intel.com), locates and orients its camera coordinate system with respect to its left infrared camera.
  • the tracking system 112 can determine one or more of a coordinate frame and a centroid of the tracked object.
  • the tracking system 112 can determine one or more of a coordinate frame and a centroid of the tracked feature.
  • the coordinate frame can be defined with respect to the tracking coordinate frame 113. Centroid tracking data can be defined with respect to the tracking coordinate frame 113.
  • the base 102 further comprises a controller 122 configured to control the system 100.
  • the controller 122 comprises one or more of a tracking controller configured to control the tracking system 112, a ventilator controller configured to control the ventilator 106, and another controller.
  • the controller 122 commands the arm 114 to maintain one or more of an appropriate contact force with respect to the user’s face 108 and an appropriate position with respect to the user’s face 108.
  • the controller 122 commands the arm 114 to maintain both the appropriate contact force with respect to the user’s face 108 and the appropriate position with respect to the user's face 108.
  • the controller 122 is further configured to change a pose of the mask 104 based on one or more of the image data, lapse of a predetermined period of time, and an estimated sleep state of the user 109.
  • the estimated sleep state of the user 109 comprises one or more of the requisite sleep state and an estimation of stability of the sleep state of the user 109.
  • the system 100 transitions from the sleep state to an approach state if the system determines that the estimated sleep state of the user 109 is the requisite sleep state and if the system further determines that the estimated sleep state of the user 109 is stable.
  • the system 100 transitions from the sleep state to a wait for sleep state in which the system 100 waits if the system 100 determines one or more of that the estimated sleep state of the user 109 is not the requisite sleep state and that the estimated sleep state of the user 109 is not stable.
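  • The transitions described above (remain in the wait for sleep state until the estimated sleep state of the user 109 is both the requisite sleep state and stable, then move to the approach state) can be sketched as a small state machine. This is an illustrative sketch only; the state names and the `SleepEstimate` structure are assumptions for illustration, not part of the specification.

```python
from dataclasses import dataclass
from enum import Enum, auto


class State(Enum):
    WAIT_FOR_SLEEP = auto()
    APPROACH = auto()


@dataclass
class SleepEstimate:
    # Hypothetical fields: whether the estimated sleep state matches the
    # requisite sleep state, and whether that estimate is stable over time.
    is_requisite: bool
    is_stable: bool


def next_state(current: State, estimate: SleepEstimate) -> State:
    """Leave WAIT_FOR_SLEEP only when the user's estimated sleep state is
    the requisite state AND the estimate is stable; otherwise keep waiting."""
    if current is State.WAIT_FOR_SLEEP:
        if estimate.is_requisite and estimate.is_stable:
            return State.APPROACH
        return State.WAIT_FOR_SLEEP
    return current
```

  • The same rule covers both bullets above: either failing condition keeps the system waiting for sleep.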
  • the controller 122 is further configured to change a pose of the mask 104 based at least in part on biometric data from the biometric sensor system 111.
  • the base 102 further comprises a display 123 configured to display information to the user 109 regarding the system 100.
  • the display 123 includes a user interface 124.
  • the user interface 124 can provide information to the user 109, for example, power status of the system 100.
  • the user interface 124 can include push buttons usable by the user 109 to control the system 100.
  • the user interface 124 comprises one or more of an indicator, a speaker, a microphone, a button, and a switch.
  • the user interface 124 comprises one or more of a standalone interface and an interface comprised in a portion of the system 100 other than the display 123.
  • the biometric sensor system 111 then transmits the estimated state of the user 109 to the controller 122.
  • the controller 122 can store the estimated state of the user 109 in a database 128.
  • the biometric sensor system 111 receives biometric data, determines a user parameter based at least in part on the received biometric data, and estimates a state of the user 109 based at least in part on the determined parameter.
  • the biometric sensor system 111 stores in the database 128 one or more of the biometric data, the estimated state of the user, and the determined parameter.
  • the biometric sensor system 111 can send to the system 100 the one or more of the biometric data, the estimated state of the user, and the determined parameter.
  • the robot base 116 comprises a set of tube-base connections 130 through which the robot base 116 couples to the gas tube 105.
  • the set of tube-base connections 130 through which the robot base 116 couples to the gas tube 105 can include a system sensor 132 configured to gather sensor data and store gathered data in the database 128.
  • the tube-base connection 130 comprises a tube-base opening 130.
  • the robot 102 comprises a system sensor 132 configured to sense one or more of forces, torques, currents, voltages, temperatures, and the like.
  • the arm 114 comprises the system sensor 132.
  • the flange 115 includes a flange-mask connection 136 through which the flange 115 couples to the mask system 103.
  • the flange-mask connection 136 comprises a flange-mask opening 136.
  • the flange-mask connection 136 can include a robot mounting feature 138.
  • the flange kinematic mount 119 comprises a flange kinematic mount coordinate frame 142 which can be defined from a feature of the flange kinematic mount 119.
  • the robot mounting feature 138 can comprise an opening (not shown).
  • the gas 107 can pass through the opening.
  • a signal can pass through the opening.
  • the opening can comprise a sensor, the sensor configured to detect the gas 107.
  • the gas tube 105 is coupled to the robot base 116.
  • the gas tube 105 is also coupled to the mask system 103.
  • the gas tube 105 further includes a tube-mask connection 144 that couples the gas tube 105 to the mask system 103.
  • the mask 104 comprises the tube-mask connection 144.
  • the tube-mask connection 144 comprises a tube-mask opening 144.
  • the gas tube 105 can carry the gas 107 between the robot base 116 and the mask system 103 through the tube-base connection 130.
  • the mask system 103 can comprise the mask 104 configured to distribute the gas 107 to the user 109.
  • the mask system 103 can be removed from the flange 115, or the flange kinematic mount 119, for example, for one or more of cleaning, sterilizing, sanitizing, and replacing, and the like.
  • FIG. 1 B shows key coordinate frames of the robotic ventilation system.
  • the coordinate frames include a world coordinate frame 150, a robot base coordinate frame 152, a flange kinematic mount coordinate frame 142, a tracking system coordinate frame 113, a mask system coordinate frame 154, a face coordinate frame 156, a system base coordinate frame 158, and a ready coordinate frame 160.
  • These frames can represent spatial relationships and can be stored in the database 128.
  • the world coordinate frame 150 represents an environment reference.
  • the world coordinate frame 150 defines a common reference for all other coordinate frames.
  • the robot base coordinate frame 152 represents a spatial reference for the robot base 116.
  • the robot base coordinate frame 152 can be defined with respect to the robot base 116.
  • the flange kinematic mount coordinate frame 142 represents a flange 115 reference.
  • the flange kinematic mount coordinate frame 142 can be defined with respect to the flange 115.
  • the tracking system coordinate frame 113 represents a tracking system reference.
  • the tracking system coordinate frame 113 can be defined with respect to tracking system 112, and points in the tracking system coordinate frame 113 are relative to the tracking system 112.
  • the mask system coordinate frame 154 represents a mask reference.
  • the mask system coordinate frame 154 can be defined with respect to the mask 104.
  • the tracking system 112 can determine the mask system coordinate frame 154 based on one or more of the mask features and a fiducial marker 110.
  • the tracking system 112 can determine a spatial relationship of the mask system coordinate frame 154 with respect to the tracking system coordinate frame 113.
  • the tracking system 112 can store the mask system coordinate frame 154 in the database 128.
  • the face 108 of the user 109 is represented with a face coordinate frame 156.
  • the face coordinate frame 156 can be defined with respect to a face fiducial coordinate frame 163.
  • the face coordinate frame 156 can be defined with respect to a face feature coordinate frame 164.
  • the face coordinate frame 156 can be a function of a pose of the face 108 of the user 109.
  • the face coordinate frame 156 can be defined with respect to a pose of the face 108 of the user 109 at a particular point in time.
  • the face coordinate frame 156 can change as the pose of the face 108 of the user 109 changes.
  • the face coordinate frame 156 can be identical to the face fiducial coordinate frame 163.
  • the face coordinate frame 156 can comprise the face feature coordinate frame 164.
  • the tracking system 112 can determine a transform between the face coordinate frame 156 and the mask system coordinate frame 154.
  • the tracking system 112 can determine the face coordinate frame 156 based on one or more of a facial feature (not shown in Fig. 1 B; item 407 in Fig. 4) and the fiducial marker 110.
  • the tracking system 112 can store the face coordinate frame 156 in the database 128.
  • the face feature coordinate frame 164 can be defined by a facial feature (not shown in Fig. 1 B; item 407 in Fig. 4) of the user 109.
  • the face feature coordinate frame 164 can be a function of the pose of a face feature (not shown in Fig. 1 B; item 407 in Fig. 4) of the user 109.
  • the face feature coordinate frame 164 can be defined with respect to a pose of a face feature (not shown in Fig. 1 B; item 407 in Fig. 4) of the user 109 at a particular point in time.
  • the face feature coordinate frame 164 can change with respect to the tracking system 112 as the pose of the face 108 of the user 109 changes.
  • the tracking system 112 can determine the face coordinate frame 156 based on one or more of the face feature coordinate frame 164 and the face fiducial coordinate frame 163. For example, the tracking system 112 can determine the face feature coordinate frame 164. Alternatively, or additionally, the tracking system 112 can determine the face fiducial coordinate frame 163. Alternatively, or additionally, the tracking system 112 can determine the face coordinate frame 156. For example, the tracking system 112 can determine the face coordinate frame 156 based on one or more of the determined/defined face feature coordinate frame 164 and the determined/defined face fiducial coordinate frame 163. For example, the tracking system 112 can access the face feature coordinate frame 164 in the database 128.
  • the tracking system 112 can access the face fiducial coordinate frame 163 in the database 128. Alternatively, or additionally, the tracking system 112 can determine the face coordinate frame 156. For example, the tracking system 112 can determine the face coordinate frame 156 based on one or more of the face feature coordinate frame 164 and the face fiducial coordinate frame 163.
  • One or more of the face coordinate frame 156, the face fiducial coordinate frame 163 and the face feature coordinate frame 164 can be tracked by the tracking system 112.
  • the controller 122 can use one or more of the face coordinate frame 156, the face fiducial coordinate frame 163, and the face feature coordinate frame 164 as a spatial reference to do one or more of correctly locate the mask 104 and place the mask 104 on the face 108 of the user 109.
  • the system base coordinate frame 158 represents a system base reference.
  • the system base coordinate frame 158 can be defined with respect to the robot base 116.
  • points in the system base coordinate frame 158 are defined relative to the robot base 116.
  • the ready coordinate frame 160 represents a taught reference that can be learned and maintained.
  • the ready coordinate frame 160 can be defined with respect to the mask 104.
  • the ready coordinate frame 160 can be defined when the face 108 of the user 109 is in the field of view of the tracking system 112.
  • the system controller 122 can wait for the user 109 to enter a requisite sleep state.
  • the tracking system 112 can determine spatial relationships between two or more bodies in space. For example, the tracking system 112 can determine a spatial relationship between a first body in space and a second body in space. For example, the tracking system 112 can define a first coordinate frame for the first body in space. For example, the tracking system 112 can define a second coordinate frame for the second body in space. The tracking system 112 can then determine a transform between the first coordinate frame and the second coordinate frame, the transform describing a spatial relationship between the first body in space and the second body in space. The tracking system 112 can determine a transform between the face coordinate frame 156 and the mask system coordinate frame 154. The tracking system 112 can use chaining to determine the spatial relationships. The tracking system 112 can store the transform between the first coordinate frame and the second coordinate frame in the database 128.
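  • The chaining described above can be illustrated with 4×4 homogeneous transforms: given the pose of the face and the pose of the mask, each measured in the tracking system coordinate frame 113, the face-to-mask transform follows by composing one measured transform with the inverse of the other. A minimal numpy sketch, with hypothetical function and frame names chosen for illustration:

```python
import numpy as np


def make_transform(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T


def invert_transform(T: np.ndarray) -> np.ndarray:
    """Invert a rigid transform: rotation transposed, translation -R^T t."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti


def face_to_mask(T_track_face: np.ndarray, T_track_mask: np.ndarray) -> np.ndarray:
    """Chain: T_face_mask = inv(T_track_face) @ T_track_mask, where both
    inputs are poses measured by the tracking system 112 in frame 113."""
    return invert_transform(T_track_face) @ T_track_mask
```

  • The resulting transform is exactly the spatial relationship between the face coordinate frame 156 and the mask system coordinate frame 154 that the system stores in the database 128.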
  • FIG. 2 is a drawing of the mask system 103 of the robotic ventilation system 100.
  • the mask system 103 again comprises the mask 104 and the fiducial marker 110.
  • the mask 104 further comprises the tube-mask connection 144.
  • the mask system 103 further comprises a housing 224.
  • the mask system 103 further comprises a facial interface stem 226, the facial interface stem 226 configured to offset the mask 104 to provide clearance for the tube-mask connection 144.
  • the mask 104 further comprises a mask stem 228 configured to protrude through a mask stem opening 229 to enable clamping.
  • the facial interface stem 226 comprises a stem-flange connection 230 through which the facial interface stem 226 couples to the flange 115.
  • the stem-flange connection 230 comprises a stem-flange opening 230.
  • a single connection (not shown) comprises both the tube-mask connection 144 and the stem-flange connection 230.
  • the mask system 103 further comprises a mask mounting feature 232 usable to mount the mask 104 to the facial interface stem 226.
  • the mask mounting feature 232 is rigidly fixed to the mask stem 228.
  • the mask mounting feature 232 is usable to locate the mask 104 with respect to the flange 115.
  • the mask mounting feature 232 is usable to orient the mask 104 with respect to the flange 115.
  • the mask mounting feature 232 comprises one or more of the mask system mounting feature 232, a magnetic mask system mounting feature 232, a mechanical mask mounting feature, and a clamp-on mask mounting feature.
  • the mask 104 can be coupled to the flange 115 via the mask mounting feature 232.
  • the mask mounting feature 232 can engage with the robot mounting feature 138.
  • the mask mounting feature 232 allows for a repeatable mechanical connection between the mask 104 and the flange 115.
  • the mask 104 can further comprise a contact edge 234.
  • the contact edge 234 comprises a mask seal 236.
  • the mask seal 236 covers the contact edge 234.
  • the contact edge 234 can be substantially rigid.
  • the mask seal 236 can comprise one or more of an inflatable bladder and a soft deformable material.
  • the mask seal 236 is configured to conform to the face of the user.
  • the mask seal 236 is further configured to facilitate formation of an effective seal against the face of the user during use.
  • the mask seal 236 ensures secure delivery to the user of correctly pressurized gas 107.
  • the mask 104 and the facial interface stem 226 are separable components.
  • the mask 104 and the mask stem 228 can be detached from one another.
  • the mask 104 and the mask stem 228 can be coupled using a coupling mechanism.
  • the mask 104 further comprises a groove 238 configured to engage with the mask mounting feature 232.
  • the mask 104 can further include a wing 240 configured to engage with the mask mounting feature 232.
  • the wing 240 extends outward from the mask 104.
  • the mask system 103 further comprises a protrusion 242 configured to engage with the groove 238.
  • the facial interface stem 226 further comprises a clamping portion 244 configured to clamp the mask stem 228.
  • the facial interface stem 226 further comprises a screw portion 246 usable to do one or more of clamp the mask stem 228 and further clamp the mask stem 228. The screw portion 246 is configured to clamp the mask stem 228 via tightening of the screw portion 246.
  • the mask stem 228 and the mask 104 can be effectively coupled together. If the screw portion 246 is disengaged to loosen the contact between the clamping portion 244 and the wing 240, the mask stem 228 and the mask 104 can be separated from one another by disengaging the contact between the clamping portion 244 and the wing 240, and by disengaging the protrusion 242 from the groove 238.
  • the fiducial marker 110 comprises a fiducial marker coordinate frame 248.
  • the fiducial marker 110 can be tracked by the tracking system 112.
  • the fiducial marker 110 can serve as a tracking reference for the tracking system 112.
  • the fiducial marker coordinate frame 248 can be defined by features of the fiducial marker 110.
  • the open source ArUco library (Universitario de Rabanales, University of Cordoba, 14071, Cordoba, Spain) (www.uco.es) includes numerous fiducial marker designs with coordinate frames defined from the design features.
  • a mask system fiducial marker coordinate frame 250 can be defined with respect to the mask system coordinate frame 154.
  • the mask system fiducial marker coordinate frame 250 can be defined with respect to the fiducial marker coordinate frame 248.
  • a measuring tool can be used such as the Gauge Max Faro Arm, manufactured by Faro Technologies, Inc. of Lake Mary, Florida (www.faro.com).
  • the spatial relationship 250 may be incorporated into the mask 104 during construction of the mask 104.
  • the mask system coordinate frame can be stored in the database.
  • a spatial relationship 252 of the fiducial marker coordinate frame 248 can be defined with respect to the mask system coordinate frame 154. Alternatively, or additionally, the spatial relationship 252 may be incorporated into the mask system 103 during construction.
  • a spatial relationship 254 of the mask seal 236 can be defined with respect to the mask system coordinate frame 154. Alternatively, or additionally, the spatial relationship 254 may be incorporated into the mask system 103 during construction.
  • a spatial relationship 256 of the fiducial marker coordinate frame 248 can be defined with respect to the mask seal 236. Alternatively, or additionally, the spatial relationship 256 may be incorporated into the mask 104 during construction of the mask 104.
  • FIG. 3 is a drawing of a detail of the robotic ventilation system 100, showing the mask system 103 attaching to the flange 115 via the kinematic mount 117.
  • FIG. 3 shows the mask system 103 comprising the mask 104 and the fiducial marker 110.
  • the kinematic mount 117 again comprises the flange kinematic mount 119 and the mask system kinematic mount 120.
  • FIG. 3 also shows the arm 114 that is distal from the robot base (not shown in this figure; item 116 in FIG. 1A).
  • the arm 114 comprises the flange 115.
  • the flange 115 in turn comprises the flange kinematic mount 119.
  • the flange kinematic mount 119 is rigidly fixed relative to the flange 115.
  • the flange kinematic mount 119 can engage with the mask system kinematic mount 120 to do one or more of locate and orient the mask 104 with respect to the flange 115.
  • the flange kinematic mount 119 comprises a flange kinematic mount coordinate frame 142. When coupled, there can be a fixed spatial relationship between the flange kinematic mount coordinate frame 142 and the mask system coordinate frame 154.
  • the flange kinematic mount 119 allows for creation of a repeatable mechanical connection between the mask system 103 and the flange 115 such that when the mask system 103 is removed from a first mask position and then reattached, the mask system 103 goes back into the same first mask position with respect to the flange 115 in a repeatable, predictable manner.
  • the mask system 103 that is removed can be a distinct mask system 103 from the mask system that is later reattached.
  • FIG. 4 is a drawing of the robotic ventilation system 100 in use on behalf of a user 109. Shown again are the mask system 103, the mask 104, the face 108 of the user 109, the fiducial marker 110, and the tracking system 112.
  • the tracking system 112 comprises an imaging device 402 configured to image objects.
  • the tracking system is configured to image objects within a field of view 404.
  • the system 100 further comprises a facial fiducial marker 405.
  • the facial fiducial marker 405 comprises a tracked object 405 placed on the face 108 of the user 109 at a known pose in a field of view 404 of the tracking system 112 that can be viewed by the tracking system 112 and can serve as a reference for the tracking system 112.
  • the face 108 of the user 109 can have a facial fiducial marker 405 affixed in a suitable manner to allow for tracking by the tracking system 112 while the mask 104 is adjacent to the face 108 of the user 109.
  • the tracking system can determine a face fiducial coordinate frame 163.
  • the tracking system 112 further comprises a light source 406 configured to generate light.
  • the tracking system further comprises a processor (not shown).
  • the light source 406 emits light in an area that at least overlaps with the field of view 404 of the imaging device 402.
  • the tracking system 112 can determine a spatial relationship between the tracking system coordinate frame 113 and a centroid of the tracked body, and can store the spatial relationship in the database. Additionally, or alternatively, the tracking system 112 can determine a coordinate frame of the tracked body using at least three spatial centroid points. The tracking system can then store the coordinate frame spatial relationship in the database.
  • a camera can be used such as the OAK-D stereo camera system manufactured by Luxonis Holding Corporation of Riverside, Colorado (www.luxonis.com).
  • the tracking system 112 can instruct the imaging device 402 to capture one or more images.
  • the tracking system 112 can store the image data in the database.
  • the tracking system 112 processes the captured images with an image processing technique to track the facial fiducial marker 405.
  • the tracking system 112 stores in the database a relative spatial relationship of the fiducial marker coordinate frame 248 with respect to the tracking system coordinate frame 113.
  • the tracking system 112 processes the captured images with an image processing technique.
  • the tracking system 112 can track a facial feature 407 of the face 108 of the user 109.
  • the tracking system 112 can store in the database a relative spatial relationship of the centroid point with respect to the tracking system coordinate frame 113.
  • the tracking system 112 can track an object feature point of a tracked object.
  • the tracking system 112 can store in the database the relative spatial relationship of the feature point with respect to the tracking system coordinate frame 113.
  • the field of view 404 can envelop the face 108 of the user 109 such that the tracking system 112 can track a facial feature 407 of the face 108 of the user 109.
  • the facial feature 407 comprises one or more of a left eye 408A, a right eye 408B, a nose 410, a mouth left lateral 412A, a mouth right lateral 412B, a mouth center 414, a left ear (not shown), and a right ear 416.
  • the left eye 408A comprises a left eye point 408A.
  • the right eye 408B comprises a right eye point 408B.
  • the nose 410 comprises a nose point 410.
  • the mouth left lateral 412A comprises a mouth left lateral point 412A.
  • the mouth right lateral 412B comprises a mouth right lateral point 412B.
  • the mouth center 414 comprises a mouth center point.
  • the left ear (not shown) comprises a left ear point (not shown).
  • the right ear 416 comprises a right ear point 416.
  • the left eye 408A comprises a left eye centroid point 408A.
  • the right eye 408B comprises a right eye centroid point 408B.
  • the nose 410 comprises a nose centroid point 410.
  • the mouth left lateral 412A comprises a mouth left lateral centroid point 412A.
  • the mouth right lateral 412B comprises a mouth right lateral centroid point 412B.
  • the mouth center 414 comprises a mouth center centroid point.
  • the left ear (not shown) comprises a left ear centroid point (not shown).
  • the right ear 416 comprises a right ear centroid point 416.
  • One or more of the eye centroid points 408A-408B and the nose point 410 can be used to form a face feature coordinate frame 164.
  • a face feature coordinate frame origin 418 can be located at the midpoint 420 of a line defined from the eye point 408A to the eye point 408B.
  • a face feature coordinate frame X axis 422 can be the normalized line from the midpoint 420 to the eye point 408B.
  • the face feature coordinate frame Y axis 424 can be the normalized line from the midpoint 420 to the nose point 410.
  • a face feature coordinate frame Z axis 426 can be defined from the normalized cross product of the face feature coordinate frame X axis 422 with the face feature coordinate frame Y axis 424.
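  • The construction above (origin 418 at the inter-eye midpoint 420, X axis 422 toward the eye point 408B, Y axis 424 toward the nose point 410, Z axis 426 from their normalized cross product) can be sketched directly in numpy. One caveat: the X and Y axes so defined are not guaranteed to be exactly perpendicular, so this sketch re-orthogonalizes Y as Z × X; that extra step is an implementation assumption, not part of the description.

```python
import numpy as np


def face_feature_frame(left_eye: np.ndarray,
                       right_eye: np.ndarray,
                       nose: np.ndarray) -> np.ndarray:
    """Build the face feature coordinate frame as a 4x4 transform:
    origin at the inter-eye midpoint, X toward the right-eye point,
    Y toward the nose point, Z from the normalized cross product X x Y."""
    origin = (left_eye + right_eye) / 2.0   # midpoint 420 (origin 418)
    x = right_eye - origin
    x /= np.linalg.norm(x)                  # X axis 422
    y = nose - origin
    y /= np.linalg.norm(y)                  # Y axis 424
    z = np.cross(x, y)
    z /= np.linalg.norm(z)                  # Z axis 426
    # Assumed re-orthogonalization: recompute Y so that (X, Y, Z) is a
    # proper right-handed orthonormal basis.
    y = np.cross(z, x)
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2], T[:3, 3] = x, y, z, origin
    return T
```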
  • the mask system 103 with an attached fiducial marker 110 can be in the field of view 404.
  • the tracking system can determine a mask fiducial coordinate frame 440.
  • the spatial relationship of the mask fiducial coordinate frame 440 with the tracking system coordinate frame 113 can be stored in the database.
  • the controller 122 can instruct the arm 114 to orient the mask 104 so as to maintain a specified spatial relationship of the mask seal 236 with respect to the face feature coordinate frame 164.
  • the specified spatial relationship can be computed by the controller 122. Alternatively, or additionally, the specified spatial relationship is received from the user 109.
  • the controller 122 can instruct the arm 114 to maintain one or more of a specified contact force and a specified contact torque of the mask seal 236 with respect to the user’s face 108. This mode of operation is commonly referred to as hybrid position-force control.
  • the controller 122 can instruct the arm 114 to orient the mask 104 so as to maintain a specified spatial relationship of the mask seal 236 with respect to the face fiducial coordinate frame 163.
  • the controller 122 can instruct the arm 114 to maintain specified contact forces and torques of the mask seal 236 with respect to the user’s face 108. This mode of operation is commonly referred to as hybrid position-force control.
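  • Hybrid position-force control, as referenced above, partitions the task axes: some axes track a desired pose while the remaining axes regulate contact force. A deliberately simplified single-step proportional sketch follows; the gains, the axis selection vector, and the control law are all illustrative assumptions, not the controller claimed here, and a practical controller would be considerably more involved.

```python
import numpy as np


def hybrid_step(pos: np.ndarray, desired_pos: np.ndarray,
                force: np.ndarray, desired_force: np.ndarray,
                force_axes: np.ndarray,
                kp: float = 0.5, kf: float = 0.001) -> np.ndarray:
    """One proportional control step. `force_axes` is a 0/1 selection
    vector: 1 marks an axis regulated for force (e.g. normal to the
    face), 0 marks an axis regulated for position (the sliding plane)."""
    S = np.diag(force_axes)               # force-controlled subspace
    P = np.eye(len(force_axes)) - S       # position-controlled subspace
    dpos = P @ (kp * (desired_pos - pos))        # track the desired pose
    dforce = S @ (kf * (desired_force - force))  # regulate contact force
    return pos + dpos + dforce
```

  • In this sketch, a force deficit on the face-normal axis nudges the mask toward the face while the in-plane axes continue to track the specified spatial relationship.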
  • FIG. 5 is a flow chart of a method 500 for correct mask placement using a robotic ventilation system including a kinematic mount.
  • At step 510, image data is captured, the image data comprising a facial feature of a user, by a system comprising: a robot comprising an arm, the arm comprising a flange, the flange coupled to an end of the arm, the arm configured to move the flange along a degree of freedom; a mask coupled to the flange, the mask configured to deliver gas to a user, wherein the arm further comprises a kinematic mount, the kinematic mount usable to do one or more of orient and locate the mask with respect to the flange; a ventilator coupled to the mask, the ventilator configured to deliver the gas to the mask; a gas tube coupled to both the mask and the ventilator, the gas tube configured to carry gas between the ventilator and the mask; a controller configured to change a pose of the mask, the controller further configured to control the delivery of the gas from the ventilator to the user; and a tracking system, the tracking system configured to capture image data of one or more of the mask and a face of the user.
  • the tracking system captures the image data, the image data comprising the facial feature of the user.
  • the image data further comprises image data regarding the fiducial marker.
  • the image data further comprises image data regarding a facial fiducial marker comprised in the user’s face.
  • Block 510 then transfers control to block 520.
  • At step 520, by the system, the facial feature of the user is tracked.
  • the system further tracks the facial fiducial marker.
  • Block 520 then transfers control to block 530.
  • At step 530, by the system, a face coordinate frame is determined with respect to the user’s face.
  • the system can store the face coordinate frame.
  • the tracking system can store the face coordinate frame in the database.
  • Block 530 then transfers control to block 540.
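One way a face coordinate frame could be built from tracked facial features is sketched below. The landmark choice (eye corners and nose tip) and all names are hypothetical; the method does not specify how the frame is constructed:

```python
import math

def sub(a, b):
    return [a[i] - b[i] for i in range(3)]

def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def unit(a):
    n = math.sqrt(sum(c * c for c in a))
    return [c / n for c in a]

def face_frame(left_eye, right_eye, nose_tip):
    """Return (origin, x, y, z) of an orthonormal face coordinate frame."""
    origin = [(left_eye[i] + right_eye[i]) / 2 for i in range(3)]
    x = unit(sub(right_eye, left_eye))             # axis spanning the eyes
    v = sub(nose_tip, origin)
    d = sum(v[i] * x[i] for i in range(3))
    y = unit([v[i] - d * x[i] for i in range(3)])  # toward the nose, orthogonal to x
    z = cross(x, y)                                # completes a right-handed frame
    return origin, x, y, z

# Landmark positions (meters) as they might be reported by the tracking system.
origin, x, y, z = face_frame(left_eye=[-0.03, 0.0, 0.0],
                             right_eye=[0.03, 0.0, 0.0],
                             nose_tip=[0.0, -0.04, 0.02])
```

Orthonormalizing against the eye axis keeps the frame well-defined even when the three landmarks are not exactly coplanar with the camera axes.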
  • In step 540, a mask system coordinate frame is determined by the system with respect to the mask.
  • The system can store the mask system coordinate frame.
  • For example, the tracking system can store the mask system coordinate frame in the database.
  • Block 540 then transfers control to block 550.
  • In step 550, a transform is determined by the system between the face coordinate frame and the mask system coordinate frame. For example, the tracking system determines the transform between the face coordinate frame and the mask system coordinate frame. Block 550 then transfers control to block 560.
  • In step 560, the transform is stored as a spatial relationship between the facial coordinate system and the mask coordinate system, the spatial relationship configured for correct placement of the mask on the user’s face.
  • For example, the tracking system stores the transform as the spatial relationship between the facial coordinate system and the mask coordinate system. Block 560 then terminates the process.
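The transform of steps 550–560 can be sketched with 4×4 homogeneous matrices. Assuming (purely for illustration) that the tracking system expresses both frames in a common camera frame, the face-to-mask transform is the inverse of the camera-to-face pose composed with the camera-to-mask pose:

```python
# T_face_mask = inverse(T_cam_face) @ T_cam_mask, using plain 4x4 lists.

def matmul4(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def invert_rigid(t):
    """Invert a rigid-body transform: R -> R^T, p -> -R^T p."""
    r = [[t[j][i] for j in range(3)] for i in range(3)]  # transpose of R
    p = [-sum(r[i][j] * t[j][3] for j in range(3)) for i in range(3)]
    return [r[0] + [p[0]], r[1] + [p[1]], r[2] + [p[2]], [0, 0, 0, 1]]

# Hypothetical tracking output: face frame 0.5 m from the camera, mask frame
# 0.45 m from the camera, both with identity rotation.
T_cam_face = [[1, 0, 0, 0.1], [0, 1, 0, 0.0], [0, 0, 1, 0.5], [0, 0, 0, 1]]
T_cam_mask = [[1, 0, 0, 0.1], [0, 1, 0, 0.0], [0, 0, 1, 0.45], [0, 0, 0, 1]]

# Spatial relationship stored for correct mask placement (step 560).
T_face_mask = matmul4(invert_rigid(T_cam_face), T_cam_mask)
```

Because both poses share the camera frame, the camera drops out of the composition and only the face-to-mask relationship is stored.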
  • FIG. 6 is a flow chart of a method 600 for correct mask placement using a robotic ventilation system including a fiducial marker.
  • In step 610, image data is captured, the image data comprising a facial feature of a user, the image data further comprising image data regarding the fiducial marker, by a system comprising: a robot comprising an arm, the arm comprising a flange, the flange coupled to an end of the arm, the arm configured to move the flange along a degree of freedom; a mask coupled to the flange, the mask configured to deliver gas to a user, the mask comprising a fiducial marker; a ventilator coupled to the mask, the ventilator configured to deliver the gas to the mask; a gas tube coupled to both the mask and the ventilator, the gas tube configured to carry the gas between the ventilator and the mask; a controller configured to change a pose of the mask, the controller further configured to control the delivery of the gas from the ventilator to the user; and a tracking system, the tracking system configured to capture image data of one or more of the mask and a face of the user, the tracking system further configured to track the fiducial marker.
  • In step 620, the facial feature of the user is tracked by the system and the fiducial marker is tracked. Optionally, the system further tracks the facial fiducial marker. Block 620 then transfers control to block 630.
  • In step 630, a face coordinate frame is determined by the system with respect to the user’s face.
  • The system can store the face coordinate frame.
  • For example, the tracking system can store the face coordinate frame in the database.
  • Block 630 then transfers control to block 640.
  • In step 640, a mask system coordinate frame is determined by the system with respect to the mask.
  • The system can store the mask system coordinate frame.
  • For example, the tracking system can store the mask system coordinate frame in the database.
  • Block 640 then transfers control to block 650.
  • In step 650, a transform is determined by the system between the face coordinate frame and the mask system coordinate frame.
  • For example, the tracking system determines the transform between the face coordinate frame and the mask system coordinate frame.
  • Block 650 then transfers control to block 660.
  • In step 660, the transform is stored as a spatial relationship between the facial coordinate system and the mask coordinate system, the spatial relationship configured for correct placement of the mask on the user’s face.
  • For example, the tracking system stores the transform as the spatial relationship between the facial coordinate system and the mask coordinate system. Block 660 then terminates the process.
  • FIG. 7 is a flow chart of a method 700 for correct mask placement using a robotic ventilation system including a kinematic mount and further including a fiducial marker.
  • In step 710, image data is captured, the image data comprising a facial feature of a user, by a system comprising: a robot comprising an arm, the arm comprising a flange, the flange coupled to an end of the arm, the arm configured to move the flange along a degree of freedom, the arm further comprising a kinematic mount; a mask coupled to the flange by a mounting feature configured for intermittent attachment, the mask configured to deliver gas to a user, wherein the kinematic mount is usable to do one or more of orient and locate the mask with respect to the flange; a mask coupled to the robot, wherein the mask is coupled to the flange by a mounting feature configured for intermittent attachment, the mounting feature comprising the kinematic mount, the mounting feature further comprising an opening, wherein the mounting feature further comprises a sensor, the sensor configured to verify that proper seating has occurred of the mask on the user, wherein a spatial alignment of the mask with respect to the kinematic mount is known a priori.
  • For example, the tracking system captures the image data, the image data comprising the facial feature of the user.
  • Optionally, the image data further comprises image data regarding the fiducial marker.
  • Optionally, the image data further comprises image data regarding a facial fiducial marker comprised in the user’s face.
  • Block 710 then transfers control to block 720.
  • In step 720, the facial feature of the user is tracked by the system and the fiducial marker is tracked.
  • Optionally, the system further tracks the facial fiducial marker. Block 720 then transfers control to block 730.
  • In step 730, a face coordinate frame is determined by the system with respect to the user’s face.
  • The system can store the face coordinate frame.
  • For example, the tracking system can store the face coordinate frame in the database. Block 730 then transfers control to block 740.
  • In step 740, a mask system coordinate frame is determined by the system with respect to the mask.
  • The system can store the mask system coordinate frame.
  • For example, the tracking system can store the mask system coordinate frame in the database.
  • Block 740 then transfers control to block 750.
  • In step 750, a transform is determined by the system between the face coordinate frame and the mask system coordinate frame. For example, the tracking system determines the transform between the face coordinate frame and the mask system coordinate frame. Block 750 then transfers control to block 760.
  • In step 760, the transform is stored as a spatial relationship between the facial coordinate system and the mask coordinate system, the spatial relationship configured for correct placement of the mask on the user’s face.
  • For example, the tracking system stores the transform as the spatial relationship between the facial coordinate system and the mask coordinate system. Block 760 then terminates the process.
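A hedged sketch of how the stored spatial relationship might later be used for placement (the frame names and poses are assumptions, not disclosed by the methods): once the tracking system observes the user's current face pose in the robot base frame, composing it with the stored face-to-mask transform yields the mask pose the controller can command the arm to reach.

```python
# T_base_mask = T_base_face @ T_face_mask, using plain 4x4 lists.

def matmul4(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

# Current face pose in the robot base frame (hypothetical tracking output).
T_base_face = [[1, 0, 0, 0.4], [0, 1, 0, 0.2], [0, 0, 1, 0.3], [0, 0, 0, 1]]

# Stored face-to-mask relationship: mask offset 5 cm along the face z-axis.
T_face_mask = [[1, 0, 0, 0.0], [0, 1, 0, 0.0], [0, 0, 1, -0.05], [0, 0, 0, 1]]

# Target mask pose for the controller to command the arm toward.
T_base_mask = matmul4(T_base_face, T_face_mask)
```

Because the relationship is stored relative to the face frame, the same composition remains valid as the user's head moves; only the observed face pose needs updating.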
  • Any steps, operations, or processes described herein may be performed or implemented with one or more hardware or software modules, alone or in combination with other devices.
  • A software module is implemented with a computer program product including a computer-readable medium containing computer program code, which can be executed by a computer processor for performing any or all of the steps, operations, or processes described.
  • Embodiments of the disclosure can also relate to an apparatus for performing the operations herein.
  • This apparatus can be specially constructed for the required purposes, and/or it can include a general-purpose computing device selectively activated or configured by a computer program stored in the computer.
  • A computer program can be stored in a non-transitory, tangible computer readable storage medium, or any type of media suitable for storing electronic instructions, which can be coupled to a computer system bus.
  • Any computing systems referred to in the specification can include a single processor or can be architectures employing multiple processor designs for increased computing capability.

EP21890248.4A 2020-11-06 2021-11-08 Robotic ventilation system for correct mask placement Pending EP4262944A1 (de)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US202063198705P 2020-11-06 2020-11-06
US202063198961P 2020-11-25 2020-11-25
PCT/US2021/058480 WO2022099148A1 (en) 2020-11-06 2021-11-08 Robotic ventilation system for correct mask placement

Publications (1)

Publication Number Publication Date
EP4262944A1 true EP4262944A1 (de) 2023-10-25

Family

ID=81456741

Family Applications (1)

Application Number Title Priority Date Filing Date
EP21890248.4A Pending EP4262944A1 (de) 2020-11-06 2021-11-08 Robotisches belüftungssystem zur korrekten maskenpositionierung

Country Status (2)

Country Link
EP (1) EP4262944A1 (de)
WO (1) WO2022099148A1 (de)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7204250B1 (en) * 1999-12-16 2007-04-17 Compumedics Limited Bio-mask
US10307562B2 (en) * 2012-04-13 2019-06-04 Fresca Medical, Inc. Auto-feedback valve for a sleep apnea device
CN108969858B (zh) * 2018-08-08 2021-04-06 贵州中医药大学 一种全自动送氧机器人上氧方法及系统
DE102019204676B3 (de) * 2019-04-02 2020-07-16 Volkswagen Aktiengesellschaft Autonomes Luftfahrzeug zur Beatmung von Personen
WO2021096839A1 (en) * 2019-11-11 2021-05-20 Restful Robotics, Inc. Automatic placement of a mask

Also Published As

Publication number Publication date
WO2022099148A1 (en) 2022-05-12


Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20230823

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)