WO2020082181A1 - Precise teleguidance of humans - Google Patents

Precise teleguidance of humans

Info

Publication number
WO2020082181A1
Authority
WO
WIPO (PCT)
Prior art keywords
command
trainee
trainer
orientation
interface
Prior art date
Application number
PCT/CA2019/051507
Other languages
French (fr)
Inventor
Peter Goldsmith
Original Assignee
Uti Limited Partnership
Priority date
Filing date
Publication date
Application filed by Uti Limited Partnership filed Critical Uti Limited Partnership
Publication of WO2020082181A1 publication Critical patent/WO2020082181A1/en


Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 5/00 Electrically-operated educational appliances
    • G09B 5/06 Electrically-operated educational appliances with both visual and audible presentation of the material to be studied
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/44 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B 8/4444 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to the probe
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/0011 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D 1/0016 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the operator's input device
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/0011 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D 1/005 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with signals other than visual, e.g. acoustic, haptic
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 19/00 Teaching not covered by other main groups of this subclass
    • G09B 19/003 Repetitive work cycles; Sequence of movements
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H 40/67 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 Surgical robots
    • A61B 34/35 Surgical robots for telesurgery
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/42 Details of probe positioning or probe attachment to the patient
    • A61B 8/4245 Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
    • A61B 8/4254 Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient using sensors mounted on the probe
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/42 Details of probe positioning or probe attachment to the patient
    • A61B 8/4245 Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
    • A61B 8/4263 Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient using sensors not mounted on the probe, e.g. mounted on an external reference frame
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 19/00 Teaching not covered by other main groups of this subclass
    • G09B 19/24 Use of tools
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 23/00 Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B 23/28 Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
    • G09B 23/286 Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine for scanning or photography techniques, e.g. X-rays, ultrasonics
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H 20/40 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture

Definitions

  • the present application relates generally to teleguidance systems, and in particular to teleguidance systems that provide timely feedback suitable for guiding remote assistants or telementoring remote trainees.
  • Telepresence systems allow local people to interact with others at remote locations through a bidirectional telecommunication channel that allows the exchange of video and audio data.
  • Tele-operated robots can add physical tele-presence, allowing local human controllers to interact physically, at a distance, with a remote environment through a robotic device.
  • Teleguidance is sometimes used to refer to the action or practice of remotely controlling or directing a machine or device.
  • telementoring and teleguidance more specifically denote the guidance of a remote medical assistant by an expert physician, particularly in surgery.
  • Guidance is provided through voice commands and by annotating the remote trainee’s or remote assistant’s display monitor with symbols or sketches.
  • conventional teleguidance is discrete in nature and does not provide continuous real-time control of the precise motions of the assistant or trainee.
  • Embodiments of the present invention allow the remote assistant or trainee to be guided continuously in real time, with an accuracy comparable to that of a tele-operated robot.
  • Some embodiments include a display (e.g. video and/or haptic) that depicts the trainee’s behavior (or the results of the trainee’s behavior) in the remote environment to the trainer so that the trainer can provide corrective guidance accordingly.
  • This corrective guidance is communicated to the trainee via continuous command signals that the trainee senses as auditory, tactile, or visual cues.
  • a teleguidance apparatus including: a sensor for generating sensor data representative of a current position and orientation of a portion of a trainee’s body or a tool held by the portion; a command interface for outputting a command signal to the trainee; a communication interface for sending the sensor data to a trainer’s device and for receiving feedback data from the trainer’s device; and a processor coupled to the communication interface and the command interface.
  • the processor processes the feedback data to determine a command, transmitted by the trainer’s device, to move the portion of the body or the tool to a desired position and orientation.
  • the processor outputs the command signal corresponding to the command, through the command interface.
  • a teleguidance system that includes a trainee’s teleguidance apparatus and a trainer’s device.
  • the teleguidance apparatus includes: a sensor for generating sensor data representative of a current position and orientation of a portion of a trainee’s body or a tool held by the portion; a command interface for outputting a command signal to the trainee; a communication interface for sending the sensor data to a trainer’s device and for receiving feedback data from the trainer’s device; and a processor coupled to the communication interface and the command interface.
  • the processor processes the feedback data to determine a command, transmitted by the trainer’s device, to move the portion of the body or the tool to a desired position and orientation.
  • the processor outputs the command signal corresponding to the command, through the command interface.
  • the trainer’s device includes a feedback controller, wherein the feedback controller is adapted to: receive the sensor data; generate the desired position and orientation based on the sensor data; compute the difference between the current position and orientation, and the desired position and orientation using the sensor data; generate the command to move the portion of the body or the tool to the desired position and orientation; and transmit the command to the teleguidance apparatus.
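The feedback-controller steps above (receive the sensor data, compare against the desired pose, command the correction) can be sketched minimally; the function name, the pose-tuple layout, and the tolerance are illustrative assumptions, not taken from the patent:

```python
def feedback_command(current_pose, desired_pose, tolerance=0.01):
    """Compare the trainee's measured pose against the trainer's desired
    pose and return a per-axis correction for every axis whose error
    exceeds the tolerance."""
    axes = ("x", "y", "z", "roll", "pitch", "yaw")
    error = (d - c for c, d in zip(current_pose, desired_pose))
    return {axis: e for axis, e in zip(axes, error) if abs(e) > tolerance}
```

A trainer's device might call this once per sensor frame and transmit only the non-empty corrections to the teleguidance apparatus.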
  • a method of teleguidance including: using a sensor to generate sensor data representative of a current position and orientation of a portion of a trainee’s body or a tool held by the portion; using a communication interface to send the sensor data to a trainer’s device; using the communication interface to receive feedback data from the trainer’s device; and using a processor coupled to the communication interface, to: process the feedback data to determine a command, transmitted by the trainer’s device, to move the portion of the body or the tool to a desired position and orientation; and output a command signal corresponding to the command on the command interface.
  • a method of teleguidance including: receiving at a feedback device, from a sensor, sensor data representative of a current position and orientation of a portion of a trainee’s body or a tool held by the portion; generating, using a first processor, a desired position and orientation of the trainee’s body or the tool held by the portion, based on the sensor data; computing the difference between the current position and orientation and the desired position and orientation, using the sensor data; and generating a command, based on the difference, in feedback data to move the portion of the body or the tool to the desired position and orientation.
  • FIG. 1 is a simplified schematic diagram of a teleguidance system exemplary of an embodiment of the present invention that includes a teleguidance apparatus;
  • FIG. 2 is a flowchart depicting exemplary steps taken by the teleguidance apparatus of FIG. 1;
  • FIG. 3 is a simplified schematic diagram of a teleguidance system exemplary of another embodiment of the present invention that includes a teleguidance apparatus.
  • Embodiments of the present invention have applications in telemedicine and other fields in which a human assistant provides benefits over a robot assistant.
  • An important benefit of a human assistant over a robot assistant includes adaptability to a variety of different tasks.
  • Embodiments of the present invention are also particularly suitable when teleguidance is intended to train trainees.
  • Other environments where embodiments of the present invention have applications include rehabilitation of patients by an instructor or therapist at a remote location.
  • One embodiment of the present invention uses sensors and software to capture the commands of the trainer, which are transmitted in real time to the trainee.
  • the sensor could be a joystick or other hand controller, which captures the hand positions of the trainer and converts these to position or velocity commands for the trainee to follow.
  • These commands are transmitted to the trainee as visual, tactile, or auditory sensory cues.
  • auditory cues may include sound pulses whose pulse frequency indicates the magnitude of the desired correction and whose pitch and timbre indicate the direction and axis of motion to which the correction is applied.
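As a sketch of that encoding (the per-axis tones, the maximum error, and the pulse-rate range are all assumed values, not specified by the patent):

```python
def auditory_cue(axis, error_magnitude, max_error=0.2):
    """Encode one correction as sound-pulse parameters: pitch (and, in a
    fuller version, timbre) identifies the axis, while pulse rate grows
    with the magnitude of the desired correction."""
    axis_pitch_hz = {"x": 220.0, "y": 330.0, "z": 440.0}  # one tone per axis
    fraction = min(abs(error_magnitude) / max_error, 1.0)
    return {
        "pitch_hz": axis_pitch_hz[axis],
        "pulse_rate_hz": 1.0 + 9.0 * fraction,  # 1 Hz (small error) .. 10 Hz (large)
    }
```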
  • the trainer can guide the trainee’s hand or tool through a task, using feedback from cameras and other remote sensors to monitor the trainee’s behavior and actions.
  • the instructor or trainer may be an ultrasound specialist who guides an ultrasound probe in the hand of a remote assistant to obtain a desired ultrasound image.
  • the instructor can also guide applied forces in this manner.
  • FIG. 1 illustrates a simplified schematic diagram of a teleguidance system 100 exemplary of an embodiment of the present invention.
  • the system 100 includes an apparatus 104 where a trainee 102 uses a tool 110 such as an ultrasound probe while being guided remotely by a trainer 126 using a device 124.
  • Apparatus 104 includes a sensor 108 in the form of a camera, a command interface 112 that may include a display 107 and a haptic interface 114 in physical communication with a portion of the body of trainee 102.
  • the command interface 112 may be in data communication with a processor 106.
  • the communication interface 113 provides bidirectional data communication with the trainer’s device 124 through a network 122.
  • the communication interface 113 receives feedback data from a trainer’s device 124 and sends the sensor data from sensor 108 to the trainer’s device 124.
  • processor 106 is coupled to the communication interface 113 for processing the feedback data to determine a command transmitted by trainer’s device 124 and for outputting a command signal corresponding to the received command on the command interface 112 to move said portion of the body or the tool 110 to the desired position and orientation.
  • In operation, sensor 108 generates sensor data representative of one or more of a position, orientation or applied force of a portion of the body of trainee 102 (e.g. his hand or fingers) or of the tool 110 or both.
  • the command interface 112 in this embodiment is haptic interface 114, including a skin buzzer providing feedback sent from trainer 126 via device 124 to the trainee’s hand.
  • a display outputting visual feedback, a loudspeaker outputting audio feedback or other command interfaces may be used to augment or replace the haptic interface 114.
  • Device 124 also includes a processor or a feedback controller (not specifically illustrated) to compute the difference between the sensor data sent by apparatus 104 and a desired position and orientation as determined by device 124 with input from trainer 126 and to generate commands for guiding the trainee’s hands to the desired position.
  • the processor or feedback controller of device 124 may execute processor executable instructions stored in a processor readable medium or memory in communication with the processor or controller.
  • Device 124 also includes a communication interface 128 for sending feedback data and for receiving the sensor data from apparatus 104.
  • Device 124 is in communication with a sensor 130 in the form of a camera.
  • Device 124 may also include a display 132 providing visual data representative of the position of trainee 102.
  • the exemplary system 100 adds high-fidelity motion and force following via continuous feedback control.
  • the hand of trainee 102 may be holding or grasping tool 110 such as an ultrasound probe.
  • the instructor or trainer 126 wishes to control the pose, that is, position and orientation, of the trainee’s hand.
  • the same method can be used to guide other limbs, the head, or the body of the trainee 102 via teleguidance.
  • the pose of the trainer’s hand and the pose of the trainee’s hand are tracked by a motion capture system, such as cameras or optical sensors 130 and 108, respectively.
  • the error between the trainer pose and the trainee pose is calculated by device 124 and communicated to the trainee 102 through a visual, auditory, and/or tactile feedback.
  • skin buzzer 114 may be used to provide tactile feedback.
  • a visual error display is used. This may be advantageous in certain situations as visual perception has the greatest bandwidth of all the senses.
  • a display device that may be a stationary monitor or a head-mounted display (HMD) can be used.
  • the error display may depict an image of a three-dimensional (3D) object that appears displaced (that is, translated and/or rotated in space) from a fixed target pose, by an amount proportional to the measured tracking error.
  • the trainee 102 thus continually moves his or her hand to the correct pose to minimize the displayed error.
  • the target object in the trainee’s error display remains fixed.
  • the error can be displayed using relatively large objects to give high error resolution.
  • the error display can be superimposed over other useful information, such as the output of a guided ultrasound scan or the trainee’s view of his own hand.
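A minimal sketch of the proportional error display described above: the on-screen object's pose is the fixed target pose offset by a scaled copy of the tracking error. The gain value and the flat pose layout are illustrative assumptions:

```python
def displayed_object_pose(target_pose, tracking_error, gain=5.0):
    """Offset the fixed target pose by the tracking error, amplified by a
    display gain so that small errors stay visible (high error resolution
    when rendered with relatively large objects)."""
    return tuple(t + gain * e for t, e in zip(target_pose, tracking_error))
```

When the trainee's pose matches the trainer's, the error is zero and the displayed object coincides with the fixed target.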
  • an augmented reality HMD may be used.
  • FIG. 2 is a flowchart of a method 200 depicting exemplary steps taken by the teleguidance apparatus 104.
  • the apparatus 104 uses a sensor for generating sensor data representative of a position and orientation of a portion of a trainee’s body (e.g., hand) or a tool 110 (e.g., a probe) held by said portion of the body.
  • apparatus 104 uses a communication interface (e.g., communication interface 113) for sending the sensor data to a trainer’s device 124.
  • apparatus 104 uses the communication interface (e.g., communication interface 113) to receive feedback data from the trainer’s device 124.
  • device 124 receives the sensor data; generates the desired position and orientation (e.g., via sensor 130 in the form of a camera recording video of the trainer’s hand); and computes the difference between the sensor data from apparatus 104 and the desired position and orientation. Device 124 then generates a command to move said portion of the body or the tool to the desired position and orientation, and transmits the command to the remote apparatus 104.
  • apparatus 104 uses a processor (e.g. processor 106) to process the feedback data to determine a command transmitted by the trainer’s device.
  • apparatus 104 outputs a command signal on a command interface to move a portion of the body of trainee 102 or the tool 110 to the desired position and orientation.
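The trainee-side cycle of method 200 can be sketched as a single loop iteration; the sensor, communication, and command-interface objects here are hypothetical stand-ins for components 108, 113, and 112, with method names invented for illustration:

```python
import json

def teleguidance_step(sensor, comm, command_interface):
    """One cycle of the trainee-side loop: read the pose, send it to the
    trainer's device, and render the returned command as a cue."""
    pose = sensor.read()                           # current position/orientation
    comm.send(json.dumps({"pose": pose}))          # sensor data out
    feedback = json.loads(comm.receive())          # feedback data in
    command_interface.output(feedback["command"])  # visual/auditory/tactile cue
    return feedback["command"]
```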
  • FIG. 3 illustrates another simplified schematic diagram of a teleguidance system 300 exemplary of an embodiment of the present invention.
  • the system 300 includes, at the remote trainee’s location, an apparatus 304, which further comprises an ultrasound probe 310 that incorporates haptic or tactile feedback means in the form of a plurality of skin buzzers 314a, 314b, 314c, 314d (collectively skin buzzers 314). In use, skin buzzers 314 are in physical contact with a portion of the hand of trainee 302 as shown.
  • the apparatus 304 also includes a sensor in the form of camera 308.
  • the system 300 includes a display 332 in communication with a probe 310 through a network 322. Another display 334 is in communication with camera 308 through the network 322.
  • the display 332 displays the ultrasound signal from probe 310 while display 334 displays an image or video as captured by camera 308 and/or other sensors indicative of one or more of the position, orientation or applied force of a portion of the body of trainee 302 or of probe 310.
  • camera 308 sends image, video, and/or audiovisual data directly to the trainer’s display 334.
  • a trainer’s feedback device 324 which in this embodiment operates like another probe or similar joystick, contains a processor or feedback controller (not specifically illustrated) that executes processor executable instructions stored in a processor readable medium or memory in communication with the processor or controller.
  • the skin buzzers 314 in this particular embodiment form part of the probe 310.
  • skin buzzers 314 along with an associated processor (not shown in FIG. 3) form part of the command interface within the probe 310.
  • a general-purpose glove, wrist band, sleeve, or garment may be used, so that the teleguidance system is independent of the specific tool used.
  • Apparatus 304 or specifically the command interface/processor in probe 310 processes the feedback data from trainer’s feedback device 324 and outputs a corresponding command signal via skin buzzers 314 alerting the trainee to move his or her hand or the probe 310 to the desired position and orientation.
  • the trainee 302 takes an ultrasound image of a patient 330.
  • Camera 308 generates sensor data representative of a position and orientation of a portion of the trainee’s hand and/or probe 310.
  • Skin buzzers 314 provide tactile feedback to the trainee’s hand. The feedback is generated by trainer 326 via feedback device 324.
  • the instructor or trainer 326 is able to use display 332 to monitor the ultrasound signal from probe 310 and view display 334 to observe the trainee’s probe position. Corrective feedback from the trainer 326 is provided by simply moving the feedback device 324 like a joystick as needed. The movements of feedback device 324 by the trainer 326 are encoded and transmitted to probe 310, where they are decoded and converted to appropriate tactile positional instructions or command signals transmitted to the trainee 302 via skin buzzers 314.
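One way the decoded trainer movements could drive the four skin buzzers 314a–314d is sketched below; the assignment of buzzers to axes, the threshold, and the intensity scaling are assumptions for illustration, not taken from the patent:

```python
def buzzer_pattern(dx, dy, threshold=0.01):
    """Map a planar correction (dx, dy) to buzzer intensities in [0, 1].
    Buzzers 'a'/'b' signal +x/-x and 'c'/'d' signal +y/-y (hypothetical
    assignment of skin buzzers 314a-314d)."""
    pattern = {}
    if dx > threshold:
        pattern["a"] = min(dx, 1.0)
    elif dx < -threshold:
        pattern["b"] = min(-dx, 1.0)
    if dy > threshold:
        pattern["c"] = min(dy, 1.0)
    elif dy < -threshold:
        pattern["d"] = min(-dy, 1.0)
    return pattern
```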
  • a physician or specialist can guide a remote assistant to conduct a physical exam or perform a treatment on a remote patient.
  • a remote specialist can likewise guide an assistant to manipulate an ultrasound probe as described with reference to FIG. 3.
  • An instructor such as trainer 326 can guide a trainee such as trainee 302 to complete a task and thereby have the trainee learn by doing.
  • the instructor may choose to give continuous feedback on the desired position and orientation; or the instructor may choose to give the trainee intermittent feedback only if the trainee deviates from an acceptable position envelope.
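The intermittent mode can be sketched as a simple deadband on the pose error; the envelope value and the sign convention (error = current minus desired) are illustrative assumptions:

```python
def intermittent_command(error, envelope=0.05):
    """Return a corrective move only when some axis of the pose error
    leaves the acceptable envelope; None means the trainee receives
    no cue and continues unguided."""
    if all(abs(e) <= envelope for e in error):
        return None
    return tuple(-e for e in error)  # move opposite to the deviation
```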
  • the trainer can be a physical therapist and the trainee may be a patient.
  • the therapist can guide the patient through rehabilitation exercises using visual, audio and/or haptic feedback to ascertain that the patient is going through the exercise regimen with proper form, frequency and the like.
  • the trainer may be a virtual tourist and the trainee may be a traveler at a remote tour location.
  • the virtual tourist may thus guide the surrogate traveler at the remote location to sites of interest to the virtual tourist, and experience the location and social interactions vicariously.
  • Embodiment VI, Remote Representative: the trainer may be a business person or diplomat, and the trainee may be a surrogate representative.
  • the business person or diplomat can interact with his or her remote counterparts through the surrogate representative. This may involve a presentation, an interview, a negotiation or the like.
  • the trainer and the trainee may be in close proximity to each other.
  • the trainee may be inside a sterile surgical suite while the trainer may be just outside the suite, or adjacent thereto. This is advantageous whenever the theatre of activity, such as the surgical room in this particular example, does not have sufficient space to physically accommodate both the trainer and trainee.
  • error measurement and/or communication of the measurement data may optionally include a filter to remove unintended motion artifacts such as tremors.
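One simple filter of that kind is an exponential moving average over the pose samples; the smoothing factor is an illustrative choice, not specified by the patent:

```python
def smooth_pose(samples, alpha=0.2):
    """Exponential moving average over a sequence of pose tuples,
    attenuating tremor-like high-frequency motion; smaller alpha
    smooths more strongly."""
    filtered = list(samples[0])
    out = []
    for sample in samples:
        filtered = [alpha * v + (1 - alpha) * f for v, f in zip(sample, filtered)]
        out.append(tuple(filtered))
    return out
```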
  • the measured pose of the trainee could be scaled, rotated, or transposed before error is calculated, to allow for body size differences between the trainer and trainee or to allow the trainer to operate on a magnified scale of movement. For example, movement of centimeters by the trainer may be used to control millimeter-scale movements by the trainee in surgery.
  • In addition, image captures of movements could be transposed to account for left-handedness or right-handedness.
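The pre-error transformation could be sketched as below; the scale factor and the mirror flag for handedness are illustrative assumptions:

```python
def map_trainer_pose(position, scale=0.1, mirror_x=False):
    """Scale (and optionally mirror) the trainer's measured position
    before the error is computed: scale=0.1 turns centimeter-scale
    trainer motion into millimeter-scale trainee commands, and
    mirror_x flips the x axis to swap handedness."""
    x, y, z = position
    if mirror_x:
        x = -x
    return (scale * x, scale * y, scale * z)
```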
  • embodiments of the present invention allow a trainer to guide remote trainees in telemedicine and other fields in which a human assistant provides benefits over a robot assistant or when using teleguidance for training or rehabilitation of humans. Guiding a trainee at a remote location increases the physical telepresence of the trainer.
  • the invention uses sensors and software to capture motion commands from the trainer (e.g. from a hand controller) and transmit these to the trainee through auditory, tactile, or visual displays.
  • the trainer uses information or data from cameras and other remote sensors to guide the trainee through a task.
  • an ultrasound specialist can guide an ultrasound probe in the hand of a remote assistant to obtain a desired image.
  • in addition to voice and graphical annotations, which communicate discrete commands to the trainee, this invention adds high-fidelity motion and force following via continuous feedback control.
  • sensors may include not just cameras but also audio-recording microphones, and the displays may include speakers to output audio signals.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Business, Economics & Management (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Theoretical Computer Science (AREA)
  • Public Health (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Educational Technology (AREA)
  • Automation & Control Theory (AREA)
  • Educational Administration (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • Molecular Biology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Radiology & Medical Imaging (AREA)
  • General Business, Economics & Management (AREA)
  • Human Computer Interaction (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Mathematical Physics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Computing Systems (AREA)
  • Acoustics & Sound (AREA)
  • Manipulator (AREA)

Abstract

This invention allows a trainer to guide remote trainees in telemedicine and other fields in which a human assistant provides benefits over a robot assistant, or when using teleguidance for training or rehabilitation of humans. Guiding a trainee at a remote location increases the physical telepresence of the trainer. The invention uses sensors and software to capture motion commands from the trainer (e.g. from a hand controller) and transmit these to the trainee through auditory, tactile, or visual displays. The trainer uses feedback from cameras and other remote sensors to guide the trainee through a task. For example, an ultrasound specialist can guide an ultrasound probe in the hand of a remote assistant to obtain a desired image. In addition to voice and graphical annotations, which communicate discrete commands to the trainee, this invention adds high-fidelity motion and force following via continuous feedback control.

Description

Precise Teleguidance of Humans
TECHNICAL FIELD
[0001] The present application relates generally to teleguidance systems, and in particular to teleguidance systems that provide timely feedback suitable for guiding remote assistants or telementoring remote trainees.
BACKGROUND ART
[0002] Telepresence systems allow local people to interact with others at remote locations through a bidirectional telecommunication channel that allows the exchange of video and audio data. Tele-operated robots can add physical tele-presence, allowing local human controllers to interact physically, at a distance, with a remote environment through a robotic device.
[0003] Teleguidance is sometimes used to refer to the action or practice of remotely controlling or directing a machine or device. In certain applications such as telemedicine however, telementoring and teleguidance more specifically denote the guidance of a remote medical assistant by an expert physician, particularly in surgery.
[0004] In conventional systems, guidance is provided through voice commands and by annotating the remote trainee’s or remote assistant’s display monitor with symbols or sketches. However, such conventional teleguidance is discrete in nature and does not provide continuous real-time control of the precise motions of the assistant or trainee.
[0005] Accordingly, improvements that mitigate at least some of the aforementioned disadvantages are desired. There is a need for improved guidance that provides improved accuracy and timely and more frequent feedback with less latency.
SUMMARY OF INVENTION
[0006] Embodiments of the present invention allow the remote assistant or trainee to be guided continuously in real time, with an accuracy comparable to that of a tele-operated robot. Some embodiments include a display (e.g. video and/or haptic) that depicts the trainee’s behavior (or the results of the trainee’s behavior) in the remote environment to the trainer so that the trainer can provide corrective guidance accordingly. This corrective guidance is communicated to the trainee via continuous command signals that the trainee senses as auditory, tactile, or visual cues.
[0007] In accordance with one aspect of the present invention, there is provided a teleguidance apparatus including: a sensor for generating sensor data representative of a current position and orientation of a portion of a trainee’s body or a tool held by the portion; a command interface for outputting a command signal to the trainee; a communication interface for sending the sensor data to a trainer’s device and for receiving feedback data from the trainer’s device; and a processor coupled to the communication interface and the command interface. The processor processes the feedback data to determine a command, transmitted by the trainer’s device, to move the portion of the body or the tool to a desired position and orientation. The processor outputs the command signal corresponding to the command, through the command interface.
[0008] In accordance with another aspect of the present invention, there is provided a teleguidance system that includes a trainee’s teleguidance apparatus and a trainer’s device. The teleguidance apparatus includes: a sensor for generating sensor data representative of a current position and orientation of a portion of a trainee’s body or a tool held by the portion; a command interface for outputting a command signal to the trainee; a communication interface for sending the sensor data to a trainer’s device and for receiving feedback data from the trainer’s device; and a processor coupled to the communication interface and the command interface. The processor processes the feedback data to determine a command, transmitted by the trainer’s device, to move the portion of the body or the tool to a desired position and orientation. The processor outputs the command signal corresponding to the command, through the command interface. The trainer’s device includes a feedback controller, wherein the feedback controller is adapted to: receive the sensor data; generate the desired position and orientation based on the sensor data; compute the difference between the current position and orientation, and the desired position and orientation using the sensor data; generate the command to move the portion of the body or the tool to the desired position and orientation; and transmit the command to the teleguidance apparatus.
[0009] In accordance with another aspect of the present invention there is provided a method of teleguidance including: using a sensor to generate sensor data representative of a current position and orientation of a portion of a trainee’s body or a tool held by the portion; using a communication interface to send the sensor data to a trainer’s device; using a communication interface to receive feedback data from the trainer’s device; and using a processor coupled to the communication interface, to: process the feedback data to determine a command to move the portion of the body or the tool to a desired position and orientation, transmitted by trainer’s device; and output a command signal corresponding to the command on the command interface.
[0010] In accordance with yet another aspect of the present invention there is provided a method of teleguidance including: receiving at a feedback device, sensor data representative of a current position and orientation of a portion of a trainee’s body or a tool held by the portion, from a sensor; generating, using a first processor, a desired position and orientation of the trainee’s body or the tool held by the portion, based on the sensor data; computing the difference between the current position and orientation and the desired position and orientation, using the sensor data; and generating a command, based on the difference, in a feedback data to move the portion of the body or the tool to the desired position and orientation.
BRIEF DESCRIPTION OF DRAWINGS
[0011] In the figures, which illustrate by way of example only, embodiments of the present invention,
[0012] FIG. 1 is a simplified schematic diagram of a teleguidance system exemplary of an embodiment of the present invention that includes a teleguidance apparatus;
[0013] FIG. 2 is a flowchart depicting exemplary steps taken by the teleguidance apparatus of FIG. 1; and
[0014] FIG. 3 is a simplified schematic diagram of a teleguidance system exemplary of another embodiment of the present invention that includes a teleguidance apparatus.
DESCRIPTION OF EMBODIMENTS
[0015] Many conventional tele-robotic systems allow a human to control a remote tele-robotic device that follows his or her motions and forces. The present disclosure describes several embodiments of teleguidance systems, apparatuses and methods that allow an expert to guide a remote human assistant.
[0016] A description of various embodiments of the present invention is provided below. In this disclosure, the use of the word “a” or “an” when used herein in conjunction with the term “comprising” may mean “one,” but it is also consistent with the meaning of “one or more,” “at least one” and “one or more than one”. Any element expressed in the singular form also encompasses its plural form. Any element expressed in the plural form also encompasses its singular form. The term “plurality” as used herein means more than one, for example, two or more, three or more, four or more, and the like. Directional terms such as “top”, “bottom”, “upwards”, “downwards”, “vertically” and “laterally” are used for the purpose of providing relative reference only, and are not intended to suggest any limitations on how any article is to be positioned during use, or to be mounted in an assembly or relative to an environment.
[0017] The terms “comprising”, “having”, “including”, and “containing”, and grammatical variations thereof, are inclusive or open-ended and do not exclude additional, un-recited elements and/or method steps. The term “consisting essentially of” when used herein in connection with a composition, use or method, denotes that additional elements, method steps or both additional elements and method steps may be present, but that these additions do not materially affect the manner in which the recited composition, method, or use functions. The term “consisting of” when used herein in connection with a composition, use, or method, excludes the presence of additional elements and/or method steps.
[0018] In addition, the terms “first”, “second”, “third” and the like are used for descriptive purposes only and are not to be interpreted as indicating or implying relative importance.
[0019] In the description of the invention, it should also be noted that the terms “mounted”, “linked” and “connected” should be interpreted in a broad sense unless explicitly defined and limited otherwise. For example, a connection may be fixed, assembled, or integral; hard-wired or soft-wired; and direct or indirect through an intermediary. For technical professionals, the specific meanings of the above terms in the invention may be understood in context.
[0020] In the drawings illustrating embodiments of the present invention, the same or similar reference labels correspond to the same or similar parts. In the description of the invention, it should be noted that “a plurality of” means two or more unless otherwise specified. The directions or positions indicated by the terms “up”, “down”, “left”, “right”, “inside”, “outside”, “front end”, “back end”, “head” and “tail” correspond to the orientation or positional relationship shown in the drawings. These terms are used merely for convenience in describing the invention and simplifying the description, rather than indicating or implying that the indicated device or element must have a particular orientation or be constructed and operated in a particular orientation, and therefore cannot be used as a limitation of the invention.
[0021] Providing guidance to an assistant at a remote location, via high fidelity motion and force, increases the sense of physical telepresence of the instructor or expert providing the guidance at the remote environment.
[0022] Embodiments of the present invention have applications in telemedicine and other fields in which a human assistant provides benefits over a robot assistant. An important benefit of a human assistant over a robot assistant is adaptability to a variety of different tasks. Embodiments of the present invention are also particularly suitable when teleguidance is intended to train trainees. Other environments in which embodiments of the present invention have applications include rehabilitation of patients by an instructor or therapist at a remote location.
[0023] One embodiment of the present invention uses sensors and software to capture the commands of the trainer, which are transmitted in real time to the trainee. The sensor could be a joystick or other hand controller, which captures the hand positions of the trainer and converts these to position or velocity commands for the trainee to follow. These commands are transmitted to the trainee as visual, tactile, or auditory sensory cues. For example, auditory cues may include sound pulses whose pulse frequency indicates the magnitude of the desired correction and whose pitch and timbre indicate the direction and axis of motion to which the correction is applied. In this manner, the trainer can guide the trainee’s hand or tool through a task, using feedback from cameras and other remote sensors to monitor the trainee’s behavior and actions.
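By way of a non-limiting illustration, the auditory cue scheme described above might be realized as follows. The specific pitches, saturation values and function name are assumptions for illustration only and are not part of the disclosed embodiments:

```python
def auditory_cue(error, axis_pitch_hz=(220.0, 330.0, 440.0),
                 max_error=0.10, max_pulse_hz=10.0):
    """Map a correction vector into an auditory cue (illustrative sketch).

    error          -- (x, y, z) desired correction in metres
    axis_pitch_hz  -- tone pitch assigned to each axis, so pitch/timbre
                      conveys the axis of motion (assumed values)
    max_error      -- error magnitude at which the pulse rate saturates
    max_pulse_hz   -- pulse rate at saturation; pulse frequency conveys
                      the magnitude of the desired correction

    Returns (pitch_hz, pulse_hz, direction_sign) for the axis with the
    largest correction, or None if no correction is needed.
    """
    axis = max(range(3), key=lambda i: abs(error[i]))
    magnitude = abs(error[axis])
    if magnitude == 0.0:
        return None
    pulse_hz = max_pulse_hz * min(magnitude / max_error, 1.0)
    return (axis_pitch_hz[axis], pulse_hz, 1 if error[axis] > 0 else -1)
```

In use, a 5 cm correction along the first axis would produce the first axis’s tone pulsing at half the maximum rate, and the pulse rate would rise as the error grows.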
Embodiment I Ultrasound training
[0024] For example, the instructor or trainer may be an ultrasound specialist who guides an ultrasound probe in the hand of a remote assistant to obtain a desired ultrasound image. The instructor can also guide applied forces in this manner.
[0025] FIG. 1 illustrates a simplified schematic diagram of a teleguidance system 100 exemplary of an embodiment of the present invention. The system 100 includes an apparatus 104 where a trainee 102 uses a tool 110 such as an ultrasound probe while being guided remotely by a trainer 126 using a device 124.
[0026] Apparatus 104 includes a sensor 108 in the form of a camera, a command interface 112 that may include a display 107 and a haptic interface 114 in physical communication with a portion of the body of trainee 102.
[0027] The command interface 112 may be in data communication with a processor 106. The communication interface 113 provides bidirectional data communication with the trainer’s device 124 through a network 122. The communication interface 113 receives feedback data from the trainer’s device 124 and sends the sensor data from sensor 108 to the trainer’s device 124.
[0028] In apparatus 104, processor 106 is coupled to the communication interface 113 for processing the feedback data to determine a command transmitted by trainer’s device 124 and for outputting a command signal corresponding to the received command on the command interface 112 to move said portion of the body or the tool 110 to the desired position and orientation.
[0029] In operation, sensor 108 generates sensor data representative of one or more of a position, orientation or applied force of a portion of the body of trainee 102 (e.g., a hand or fingers), of the tool 110, or both. The command interface 112 in this embodiment is haptic interface 114, which includes a skin buzzer that delivers feedback sent from trainer 126 via device 124 to the trainee’s hand. In other embodiments, a display outputting visual feedback, a loudspeaker outputting audio feedback or other command interfaces may be used to augment or replace the haptic interface 114.
[0030] Device 124 also includes a processor or a feedback controller (not specifically illustrated) to compute the difference between the sensor data sent by apparatus 104 and a desired position and orientation as determined by device 124 with input from trainer 126 and to generate commands for guiding the trainee’s hands to the desired position. The processor or feedback controller of device 124 may execute processor executable instructions stored in a processor readable medium or memory in communication with the processor or controller.
[0031] Device 124 also includes a communication interface 128 for sending feedback data and for receiving the sensor data from apparatus 104. Device 124 is in communication with a sensor 130 in the form of a camera. Device 124 may also include a display 132 providing visual data representative of the position of trainee 102.
[0032] Advantageously, while a conventional teleguidance system uses the instructor’s voice and graphical annotations to communicate discrete commands to the trainee 102, the exemplary system 100 adds high-fidelity motion and force following via continuous feedback control.
[0033] As noted above, the hand of trainee 102 may be holding or grasping tool 110 such as an ultrasound probe. The instructor or trainer 126 wishes to control the pose, that is, position and orientation, of the trainee’s hand. The same method can be used to guide other limbs, the head, or the body of the trainee 102 via teleguidance.
[0034] The pose of the trainer’s hand and the pose of the trainee’s hand are tracked by a motion capture system, such as camera or optical sensors 130 and 108 respectively. The error between the trainer pose and the trainee pose is calculated by device 124 and communicated to the trainee 102 through visual, auditory, and/or tactile feedback. For example, skin buzzer 114 may be used to provide tactile feedback.
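The pose-error computation performed by device 124 can be sketched as follows. For brevity this sketch represents orientation by a single yaw angle; a full implementation would use quaternions or rotation matrices for three-dimensional orientation, and the function name and pose representation are assumptions, not part of the disclosure:

```python
import math

def pose_error(trainer_pose, trainee_pose):
    """Tracking error between trainer and trainee poses (illustrative).

    Each pose is ((x, y, z), yaw): position in metres plus one
    orientation angle in radians.  Returns (position_error_vector,
    yaw_error), with the yaw error wrapped to [-pi, pi] so that the
    shortest corrective rotation is commanded.
    """
    (p_trainer, yaw_trainer), (p_trainee, yaw_trainee) = trainer_pose, trainee_pose
    dp = tuple(a - b for a, b in zip(p_trainer, p_trainee))
    # atan2 of sin/cos wraps the raw angle difference into [-pi, pi]
    dyaw = math.atan2(math.sin(yaw_trainer - yaw_trainee),
                      math.cos(yaw_trainer - yaw_trainee))
    return dp, dyaw
```

The wrapped error is what would then drive the visual, auditory or tactile cues to the trainee.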
[0035] In some embodiments, a visual error display is used. This may be advantageous in certain situations as visual perception has the greatest bandwidth of all the senses. A display device that may be a stationary monitor or a head-mounted display (HMD) can be used.
[0036] The error display may depict an image of a three-dimensional (3D) object that appears displaced (that is, translated and/or rotated in space) from a fixed target pose, by an amount proportional to the measured tracking error. The trainee 102 thus continually moves his or her hand to the correct pose to minimize the displayed error.
[0037] In one embodiment, although the trainer’s hand is moving, the target object in the trainee’s error display remains fixed. Thus, the error can be displayed using relatively large objects to give high error resolution. The error display can be superimposed over other useful information, such as the output of a guided ultrasound scan or the trainee’s view of his own hand. For example, an augmented reality HMD may be used.
[0038] FIG. 2 is a flowchart of a method 200 depicting exemplary steps taken by the teleguidance apparatus 104.
[0039] As shown at step 202, the apparatus 104 uses a sensor for generating sensor data representative of a position and orientation of a portion of a trainee’s body (e.g., hand) or a tool 110 (e.g., a probe) held by said portion of the body.
[0040] At step 204, apparatus 104 uses a communication interface (e.g., communication interface 113) for sending the sensor data to a trainer’s device 124.
[0041] At step 206, apparatus 104 uses a communication interface (e.g., communication interface 113) to receive feedback data from the trainer’s device 124.
[0042] At this point, device 124 receives the sensor data; generates the desired position and orientation (e.g., via sensor 130 in the form of a camera recording video of the trainer’s hand); and computes the difference between the sensor data from apparatus 104 and the desired position and orientation. Device 124 then generates a command to move said portion of the body or the tool to the desired position and orientation, and transmits the command to the remote apparatus 104.
[0043] At step 208, apparatus 104 uses a processor (e.g. processor 106) to process the feedback data to determine a command transmitted by trainer’s device.
[0044] At step 210, apparatus 104 outputs a command signal on a command interface to move a portion of the body of trainee 102 or the tool 110 to the desired position and orientation.
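Under the assumption of a simple message-passing transport, the trainee-side loop of method 200 might be sketched as below. The object and message names are illustrative placeholders, not identifiers from the disclosure:

```python
def trainee_loop(sensor, comms, command_interface, steps=1):
    """Run one or more iterations of method 200 on the trainee's apparatus.

    sensor.read()              -> current pose data (step 202)
    comms.send(data)           -> send sensor data to trainer's device (step 204)
    comms.receive()            -> feedback data from trainer's device (step 206)
    command_interface.output() -> present the command signal (step 210)
    """
    for _ in range(steps):
        data = sensor.read()               # step 202: capture current pose
        comms.send(data)                   # step 204: send to trainer's device
        feedback = comms.receive()         # step 206: receive feedback data
        command = feedback["command"]      # step 208: determine the command
        command_interface.output(command)  # step 210: cue the trainee
```

The loop simply repeats the five flowchart steps, so guidance remains continuous for as long as the session runs.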
Embodiment II Ultrasound Probe incorporating tactile feedback
[0045] FIG. 3 illustrates another simplified schematic diagram of a teleguidance system 300 exemplary of an embodiment of the present invention.
[0046] The system 300 includes, at the remote trainee’s location, an apparatus 304, which further comprises an ultrasound probe 310 that incorporates haptic or tactile feedback means in the form of a plurality of skin buzzers 314a, 314b, 314c, 314d (collectively skin buzzers 314). In use, skin buzzers 314 are in physical contact with a portion of the hand of trainee 302 as shown. The apparatus 304 also includes a sensor in the form of camera 308.
[0047] At the trainer’s location, the system 300 includes a display 332 in communication with probe 310 through a network 322. Another display 334 is in communication with camera 308 through the network 322. The display 332 displays the ultrasound signal from probe 310, while display 334 displays an image or video as captured by camera 308 and/or other sensors, indicative of one or more of the position, orientation or applied force of a portion of the body of trainee 302 or of probe 310. In this particular embodiment, camera 308 sends image, video and/or audiovisual data directly to the trainer’s display 334. A trainer’s feedback device 324, which in this embodiment operates like another probe or similar joystick, contains a processor or feedback controller (not specifically illustrated) that executes processor executable instructions stored in a processor readable medium or memory in communication with the processor or controller.
[0048] As will be appreciated, the skin buzzers 314 in this particular embodiment, form part of the probe 310. Thus, skin buzzers 314 along with an associated processor (not shown in FIG. 3) form part of the command interface within the probe 310. In alternate embodiments however, rather than using skin buzzers built in to a probe, a general-purpose glove, wrist band, sleeve, or garment may be used, so that the teleguidance system is independent of the specific tool used.
[0049] Apparatus 304 or specifically the command interface/processor in probe 310 processes the feedback data from trainer’s feedback device 324 and outputs a corresponding command signal via skin buzzers 314 alerting the trainee to move his or her hand or the probe 310 to the desired position and orientation.
[0050] In operation, while being guided remotely by trainer 326 using his or her own tool in the form of feedback device 324, which operates like a joystick, the trainee 302 takes an ultrasound image of a patient 330. Camera 308 generates sensor data representative of a position and orientation of a portion of the trainee’s hand and/or probe 310. Skin buzzers 314 provide tactile feedback to the trainee’s hand. The feedback is generated by trainer 326 via feedback device 324.
[0051] The instructor or trainer 326 is able to use display 332 to monitor the ultrasound signal from probe 310 and view display 334 to observe the trainee’s probe position. Corrective feedback from the trainer 326 is provided by simply moving the feedback device 324 like a joystick as needed. The movements of feedback device 324 by the trainer 326 are encoded and transmitted to probe 310, where they are decoded and converted to appropriate tactile positional instructions or command signals transmitted to the trainee 302 via skin buzzers 314.
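One plausible decoding step, mapping a decoded trainer movement onto the four skin buzzers 314a to 314d, is sketched below. The assignment of buzzers to directions, the deadband, and the saturation value are all assumptions for illustration; the disclosure does not specify this mapping:

```python
def buzzer_command(dx, dy, threshold=0.005, max_step=0.05):
    """Convert a decoded trainer movement into buzzer activations.

    dx, dy    -- desired correction in the probe's plane, in metres
    threshold -- deadband below which no buzzer fires (assumed value)
    max_step  -- correction at which intensity saturates at 1.0

    Returns a dict mapping buzzer label -> intensity in [0, 1].  The
    four buzzers are assumed, purely for illustration, to correspond
    to the +x, -x, +y and -y directions.
    """
    def intensity(v):
        return min(abs(v) / max_step, 1.0) if abs(v) >= threshold else 0.0

    return {
        "314a": intensity(dx) if dx > 0 else 0.0,  # cue movement in +x
        "314b": intensity(dx) if dx < 0 else 0.0,  # cue movement in -x
        "314c": intensity(dy) if dy > 0 else 0.0,  # cue movement in +y
        "314d": intensity(dy) if dy < 0 else 0.0,  # cue movement in -y
    }
```

A small jitter of the feedback device thus produces no buzz at all, while a deliberate movement drives the buzzer on the corresponding side of the trainee’s hand with an intensity proportional to the requested correction.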
Embodiment III Physical Exam
[0052] In a telemedicine setting, a physician or specialist can guide a remote assistant to conduct a physical exam or perform a treatment on a remote patient. One example is guiding an assistant to manipulate an ultrasound probe as described with reference to FIG. 3.
[0053] An instructor such as trainer 326 can guide a trainee such as trainee 302 to complete a task and thereby have the trainee learn by doing. The instructor may choose to give continuous feedback on the desired position and orientation; or the instructor may choose to give the trainee intermittent feedback only if the trainee deviates from an acceptable position envelope.
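The intermittent mode described above can be sketched as a simple envelope check. Adding hysteresis, an assumption of this sketch rather than a feature recited in the disclosure, keeps the cue from chattering on and off when the trainee hovers at the boundary of the acceptable region:

```python
class IntermittentCue:
    """Intermittent guidance with hysteresis (illustrative sketch).

    A cue starts when the tracking error leaves the acceptable
    envelope and stops only once the error falls well back inside it,
    so the trainee is not flooded with on/off chatter at the boundary.
    """

    def __init__(self, envelope=0.02, release_fraction=0.5):
        self.envelope = envelope                    # acceptable error radius (m)
        self.release = envelope * release_fraction  # error at which cueing stops
        self.active = False

    def update(self, error_norm):
        """Return True if the trainee should be cued for this error."""
        if not self.active and error_norm > self.envelope:
            self.active = True    # trainee drifted out: start cueing
        elif self.active and error_norm < self.release:
            self.active = False   # well back inside: stop cueing
        return self.active
```

Continuous feedback, by contrast, would simply report every nonzero error without any envelope test.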
Embodiment IV Rehabilitation
[0054] In another exemplary embodiment, the trainer can be a physical therapist and the trainee may be a patient. The therapist can guide the patient through rehabilitation exercises using visual, audio and/or haptic feedback to ascertain that the patient is going through the exercise regimen with proper form, frequency and the like.
Embodiment V Virtual tourism
[0055] In another exemplary embodiment, the trainer may be a virtual tourist and the trainee may be a traveler at a remote tour location. The virtual tourist may thus guide the surrogate traveler at the remote location to sites of interest to the virtual tourist, and experience the location and social interactions vicariously.
Embodiment VI Remote Representative
[0056] In another exemplary embodiment, the trainer may be a business person or diplomat, and the trainee may be a surrogate representative. The business person or diplomat can interact with his or her remote counterparts through the surrogate representative. This may involve a presentation, an interview, a negotiation or the like.
Embodiment VII Close proximity
[0057] In some embodiments, the trainer and the trainee may be in close proximity to each other. For example, the trainee may be inside a sterile surgical suite while the trainer may be just outside the suite, or adjacent thereto. This is advantageous whenever the theatre of activity, such as the surgical room in this particular example, does not have sufficient space to physically accommodate both the trainer and trainee.
Signal processing
[0058] In exemplary embodiments of the present invention, error measurement and/or communication of the measurement data may optionally include a filter to remove unintended motion artifacts such as tremors.
[0059] As is well known, certain types of variation in signals may represent genuine information while others may represent noisy artifacts that should be removed. Filtering of the resulting video capturing the movement of the trainee’s hands, for example, can be used to remove artifacts such as those caused by vibration or tremors. Other methods of automatically removing these unwanted visual effects, such as real-time video stabilization approaches, may also be used.
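A minimal tremor filter of the kind contemplated above is a first-order low-pass filter over the measured signal. The cutoff and sample rate below are assumed values to be tuned per application; physiological tremor typically lies above a few hertz, while deliberate guidance motion is slower:

```python
import math

def lowpass(samples, cutoff_hz=2.0, sample_hz=60.0):
    """First-order low-pass filter to suppress tremor artifacts (sketch).

    A cutoff of a few hertz (an assumed value) passes deliberate
    motion while attenuating higher-frequency tremor in the measured
    position samples.
    """
    rc = 1.0 / (2.0 * math.pi * cutoff_hz)   # RC time constant for the cutoff
    dt = 1.0 / sample_hz                     # sampling interval
    alpha = dt / (rc + dt)                   # smoothing factor in (0, 1)
    out, y = [], samples[0]
    for x in samples:
        y = y + alpha * (x - y)              # exponential smoothing update
        out.append(y)
    return out
```

A steady (deliberate) input passes through unchanged, while a rapidly alternating (tremor-like) input is strongly attenuated.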
[0060] In operation, the measured pose of the trainee could be scaled, rotated, or transposed before the error is calculated, to allow for body size differences between the trainer and trainee or to allow the trainer to operate on a magnified scale of movement. For example, movements of centimeters by the trainer may be used to control millimeter-scale movements by the trainee during surgery.
[0061] In addition, image captures of movements could be transposed to account for left-handedness or right-handedness.
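The scaling, mirroring and transposition described above amount to a simple affine transform applied to the measured position before the error computation. The function name, the 10:1 scale and the axis chosen for mirroring are illustrative assumptions:

```python
def transform_pose(position, scale=0.1, mirror_x=False, offset=(0.0, 0.0, 0.0)):
    """Scale, mirror and transpose a measured position (illustrative).

    scale    -- e.g. 0.1 maps centimetre trainer motion to millimetre
                trainee motion (an assumed ratio)
    mirror_x -- flip the x axis to account for left/right handedness
    offset   -- transposition for differing body sizes or workspaces
    """
    x, y, z = position
    if mirror_x:
        x = -x
    return (x * scale + offset[0],
            y * scale + offset[1],
            z * scale + offset[2])
```

The transformed trainer position, rather than the raw one, would then be compared against the trainee pose when computing the tracking error.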
One to Many, Many to One
[0062] The movements of multiple trainers could be averaged and communicated to a single trainee. This helps reduce the effect of outlier controls sent to the trainee.
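Since the stated goal is robustness to outlier controls, one reasonable realization (an assumption of this sketch; the disclosure says only that movements are averaged) is a per-axis median rather than a mean:

```python
from statistics import median

def fuse_trainer_commands(commands):
    """Combine position commands from several trainers into one.

    commands -- list of (x, y, z) command tuples, one per trainer.
    Using the per-axis median rather than the mean makes the fused
    command robust to a single outlier trainer, which is the concern
    the averaging is meant to address.
    """
    return tuple(median(c[axis] for c in commands) for axis in range(3))
```

With three trainers, one wildly erroneous command is simply ignored, whereas a plain mean would drag the fused command toward it.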
[0063] Conversely, the movements of a single trainer may be communicated to several trainees. This may be especially helpful in scenarios where instructors or trainers are in very short supply and the ratio of trainees to trainers is high.
[0064] As will be appreciated, embodiments of the present invention allow a trainer to guide remote trainees in telemedicine and other fields in which a human assistant provides benefits over a robot assistant, or when using teleguidance for training or rehabilitation of humans. Guiding a trainee at a remote location increases the physical telepresence of the trainer. The invention uses sensors and software to capture motion commands from the trainer (e.g. from a hand controller) and transmit these to the trainee through auditory, tactile, or visual displays. The trainer uses information or data from cameras and other remote sensors to guide the trainee through a task. For example, an ultrasound specialist can guide an ultrasound probe in the hand of a remote assistant to obtain a desired image. In addition to voice and graphical annotations, which communicate discrete commands to the trainee, this invention adds high-fidelity motion and force following via continuous feedback control.
[0065] In alternate embodiments, sensors may include not just cameras but audio recording microphones and the displays may include speakers to output audio signals.
[0066] Having thus described, by way of example only, embodiments of the present invention, it is to be understood that the invention as defined by the appended claims is not to be limited by particular details set forth in the above description of exemplary embodiments as many variations and permutations are possible without departing from the scope of the claims.

Claims

What is claimed is:
1. A teleguidance apparatus comprising:
a) a sensor for generating sensor data representative of a current position and orientation of a portion of a trainee’s body or a tool held by said portion;
b) a command interface for outputting a command signal to the trainee;
c) a communication interface for sending the sensor data to a trainer’s device and for receiving feedback data from the trainer’s device; and
d) a processor coupled to the communication interface and the command interface, the processor processing the feedback data to determine a command, transmitted by the trainer’s device, to move said portion of the body or the tool to a desired position and orientation, the processor outputting the command signal corresponding to the command, through the command interface.
2. The apparatus of claim 1, further comprising a haptic interface to said body for sending said command signal, the haptic interface in communication with one of the command interface and the processor.
3. The apparatus of claim 2, wherein the haptic interface forms part of the tool.
4. The apparatus of claim 3, wherein the tool is an ultrasound probe and the haptic interface comprises skin buzzers formed on said probe.
5. The apparatus of claim 1, wherein the sensor comprises a camera.
6. The apparatus of claim 2, wherein the haptic interface comprises a tactile interface.
7. The apparatus of claim 6, wherein the haptic interface comprises a plurality of skin buzzers.
8. The apparatus of claim 2, wherein the haptic interface comprises one or more of a tactile interface, a general-purpose glove, a wrist band, a sleeve, a skin buzzer and a garment.
9. The apparatus of claim 1, wherein the command signal comprises one or more of auditory signal, tactile signal, and a visual signal for display.
10. The apparatus of claim 2, wherein the tool is an ultrasound probe and the haptic interface comprises skin buzzers for placement on said trainee’s hand.
11. The apparatus of claim 2, further comprising a display in communication with one of the command interface and the processor; and wherein the command signal comprises one or more of an image signal and a video signal.
12. The apparatus of claim 11, further comprising a loudspeaker in communication with one of the command interface and the processor; and wherein the command signal further comprises audio signal.
13. The apparatus of claim 1, wherein said portion of the trainee’s body is the trainee’s hand.
14. The apparatus of claim 1, wherein the trainer’s device comprises a feedback controller, wherein the feedback controller is adapted to:
a) receive the sensor data;
b) generate said desired position and orientation;
c) compute the difference between said current position and orientation, and said desired position and orientation using said sensor data;
d) generate the command to move said portion of the body or the tool to the desired position and orientation; and
e) transmit the command to the apparatus via the communication interface.
15. The apparatus of claim 14, wherein the trainer and the trainee are, respectively: a businessperson and a remote surrogate; a diplomat and a representative; or a physician and a remote physician’s assistant.
16. The apparatus of claim 14, wherein the trainer and the trainee are in physical proximity to one another.
17. A teleguidance system comprising:
a) a teleguidance apparatus comprising:
i) a sensor for generating sensor data representative of a current position and orientation of a portion of a trainee’s body or a tool held by said portion;
ii) a command interface for outputting a command signal to the trainee;
iii) a communication interface for sending the sensor data to a trainer’s device and for receiving feedback data from the trainer’s device;
iv) a processor coupled to the communication interface and the command interface, the processor processing the feedback data to determine a command, transmitted by the trainer’s device, to move said portion of the body or the tool to a desired position and orientation, the processor outputting the command signal corresponding to the command, through the command interface.
b) a trainer’s device comprising a feedback controller, wherein the feedback controller is adapted to:
i) receive the sensor data;
ii) generate said desired position and orientation based on the sensor data;
iii) compute the difference between said current position and orientation, and said desired position and orientation using said sensor data;
iv) generate the command to move said portion of the body or the tool to the desired position and orientation; and
v) transmit the command to the teleguidance apparatus.
18. A method of teleguidance comprising:
a) using a sensor to generate sensor data representative of a current position and orientation of a portion of a trainee’s body or a tool held by said portion;
b) using a communication interface to send the sensor data to a trainer’s device;
c) using a communication interface to receive feedback data from the trainer’s device; and
d) using a processor coupled to the communication interface, to:
i) process the feedback data to determine a command, transmitted by the trainer’s device, to move said portion of the body or the tool to a desired position and orientation; and
ii) output a command signal corresponding to the command on the command interface.
19. A method of teleguidance comprising:
a) receiving at a feedback device, sensor data representative of a current position and orientation of a portion of a trainee’s body or a tool held by said portion, from a sensor;
b) generating, using a first processor, a desired position and orientation of the trainee’s body or the tool held by said portion, based on the sensor data;
c) computing the difference between said current position and orientation and said desired position and orientation, using said sensor data; and
d) generating a command, based on the difference, in a feedback data to move said portion of the body or the tool to the desired position and orientation.
20. The method of claim 19, further comprising sending the feedback data to a second processor coupled to a command interface for outputting a command signal to the trainee, the second processor adapted to process the feedback data to determine the command and output a command signal corresponding to the command, on the command interface.
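Claims 17 iii) and 19 c) both turn on computing the difference between the current and the desired position and orientation. Position error reduces to vector subtraction; one common way to express orientation error is the angle of the relative rotation between the two orientations. The sketch below assumes a quaternion representation in [w, x, y, z] convention, which is an assumption for illustration only, since the claims do not fix a representation.

```python
import math

def quat_conj(q):
    # conjugate of a unit quaternion [w, x, y, z] inverts the rotation
    w, x, y, z = q
    return (w, -x, -y, -z)

def quat_mul(a, b):
    # Hamilton product of two quaternions in [w, x, y, z] convention
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def pose_error(p_cur, q_cur, p_des, q_des):
    """Position error as a vector difference; orientation error as the
    angle of the relative rotation q_des * conj(q_cur)."""
    dp = tuple(d - c for d, c in zip(p_des, p_cur))
    q_rel = quat_mul(q_des, quat_conj(q_cur))
    # clamp for numerical safety before acos; abs() picks the
    # shortest-path angle regardless of quaternion sign
    w = max(-1.0, min(1.0, abs(q_rel[0])))
    angle = 2.0 * math.acos(w)  # rotation error in radians
    return dp, angle

# example: a 90-degree yaw error with no position error
q_identity = (1.0, 0.0, 0.0, 0.0)
q_yaw90 = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
dp, ang = pose_error((0, 0, 0), q_identity, (0, 0, 0), q_yaw90)
```

The resulting position vector and rotation angle are exactly the quantities a feedback controller such as the one in claim 17 would drive toward zero.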
PCT/CA2019/051507 2018-10-25 2019-10-24 Precise teleguidance of humans WO2020082181A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862750645P 2018-10-25 2018-10-25
US62/750,645 2018-10-25

Publications (1)

Publication Number Publication Date
WO2020082181A1 true WO2020082181A1 (en) 2020-04-30

Family

ID=70330794

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2019/051507 WO2020082181A1 (en) 2018-10-25 2019-10-24 Precise teleguidance of humans

Country Status (1)

Country Link
WO (1) WO2020082181A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090253109A1 (en) * 2006-04-21 2009-10-08 Mehran Anvari Haptic Enabled Robotic Training System and Method
US20130157239A1 (en) * 2011-12-16 2013-06-20 Board Of Regents Of The Nevada System Of Higher Education, On Behalf Of The University Of Nevada Augmented reality tele-mentoring (art) platform for laparoscopic training

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4011296A1 (en) * 2020-12-09 2022-06-15 Industrial Technology Research Institute Guiding system and guiding method for ultrasound scanning operation
JP2022091690A (en) * 2020-12-09 2022-06-21 財團法人工業技術研究院 Guiding system and guiding method for ultrasound scanning operation
JP7271640B2 (en) 2020-12-09 2023-05-11 財團法人工業技術研究院 GUIDING SYSTEM AND METHOD FOR ULTRASOUND SCANNING OPERATIONS
US11806192B2 (en) 2020-12-09 2023-11-07 Industrial Technology Research Institute Guiding system and guiding method for ultrasound scanning operation
CN112562445A (en) * 2020-12-28 2021-03-26 西南石油大学 Catheter manipulation active guiding mechanism and device for interventional operation training system
CN112562445B (en) * 2020-12-28 2024-05-28 西南石油大学 Catheter control active guiding mechanism and device for interventional operation training system
WO2023228149A1 (en) * 2022-05-27 2023-11-30 Instituto Pedro Nunes, Associação Para A Inovação E Desenvolvimento Em Ciência E Tecnologia Bidirectional feedback system and respective method

Similar Documents

Publication Publication Date Title
CN109791801B (en) Virtual reality training, simulation and collaboration in robotic surgical systems
CN110800033B (en) Virtual reality laparoscope type tool
US11944401B2 (en) Emulation of robotic arms and control thereof in a virtual reality environment
US11270601B2 (en) Virtual reality system for simulating a robotic surgical environment
EP1965699B1 (en) Medical robotic system providing three-dimensional telestration
CN109475387A (en) For controlling system, method and the computer-readable storage medium of the aspect of robotic surgical device and viewer's adaptive three-dimensional display
TW201936114A (en) Methods and apparatus for tele-medicine
Galambos Vibrotactile feedback for haptics and telemanipulation: Survey, concept and experiment
WO2020082181A1 (en) Precise teleguidance of humans
Si et al. Design and Quantitative Assessment of Teleoperation-Based Human–Robot Collaboration Method for Robot-Assisted Sonography
Khwanngern et al. Jaw surgery simulation in virtual reality for medical training
US20240173018A1 (en) System and apparatus for remote interaction with an object
Webel Multimodal Training of Maintenance andAssembly Skills Based on Augmented Reality
US20230414307A1 (en) Systems and methods for remote mentoring
Mak et al. Development of Haptic Approaches for a Head-Controlled Soft Robotic Endoscope
WO2023228149A1 (en) Bidirectional feedback system and respective method
EP4181789B1 (en) One-dimensional position indicator
US20230218270A1 (en) System and apparatus for remote interaction with an object
JP7281924B2 (en) Information transmission system
Bianchi Exploration of augmented reality technology for surgical training simulators
TW202111724A (en) A multimedia assisted system for medical training
JPH09305788A (en) Information processor
Dawson Design and Evaluation of A Contact-free Interface for Minimally Invasive Robotics Assisted Surgery
Tendick Visual-manual tracking strategies in humans and robots
Bontreger The design of a haptic device for training and evaluating surgeon and novice laparoscopic movement skills

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 19877476
Country of ref document: EP
Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: DE
122 Ep: pct application non-entry in european phase
Ref document number: 19877476
Country of ref document: EP
Kind code of ref document: A1