WO2019068095A1 - Full body haptic exoskeleton device - Google Patents

Full body haptic exoskeleton device

Info

Publication number
WO2019068095A1
Authority
WO
WIPO (PCT)
Prior art keywords
manipulators
workspace
user
full body
movement
Prior art date
Application number
PCT/US2018/053781
Other languages
English (en)
Inventor
Jacob Rosen
Erik KRAMER
Original Assignee
The Regents Of The University Of California
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by The Regents Of The University Of California filed Critical The Regents Of The University Of California
Publication of WO2019068095A1

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user

Definitions

  • This invention relates to robotic manipulators, and more particularly to a device that uses robotic manipulators to provide haptic feedback to a user moving through a virtual environment.
  • the application is directed to a full body exoskeleton capable of providing haptic user feedback.
  • Many embodiments are directed to a full body haptic exoskeleton device that includes a base tower which forms the main support for the upper and lower robotic manipulators.
  • a first set of upper robotic manipulators may be mounted on opposing sides of the tower such that the first and second upper manipulators align with left and right arms of a user.
  • the exoskeleton may also include a first set of lower robotic manipulators mounted on opposing sides of the tower such that the first and second lower manipulators align with left and right legs of a user.
  • Each of the upper robotic manipulators may be outfitted with an arm coupling on an end effector of each of the first and second upper robotic manipulators for connecting to an arm of the user.
  • each of the lower robotic manipulators may be configured with a leg coupling on an end effector of the robotic manipulators for connecting to a leg of the user.
  • the full body haptic exoskeleton includes a seating element, wherein the seating element supports the majority of the weight of the user.
  • the seating element is adjustable in position relative to the user such that the user position within the exoskeleton device is adequate for comfortable movement.
  • the upper and lower manipulators are configured to accommodate 90% of male human users.
  • each of the upper manipulators has an overall length of 760 mm and each of the lower manipulators has an overall length of 950 mm.
  • the full body haptic exoskeleton includes a control system, wherein the control system is in electronic communication with each of the upper and lower manipulators and wherein the control system is configured to receive input data and translate the input data into relative motion of the manipulators such that a non-colliding workspace overlap is maintained.
  • the input data is received from a group consisting of motion sensors, force sensors, and a virtual reality control application.
  • the full body haptic exoskeleton includes an enclosure panel wherein the enclosure panel encloses the user within the exoskeleton device and may be used to better simulate a virtual reality environment.
  • the enclosure panel is in a fixed position and configured to have an opening in a portion thereof to allow a user to easily interact with the device.
  • the full body haptic exoskeleton further includes a processing device having a processor and a memory component wherein the memory component is configured to receive and store instructions wherein the instructions may then be used by the processor.
  • the processor component may be a processor, microprocessor, controller, and/or a combination of processors, microprocessors, and controllers.
  • the memory component may be selected from either volatile memory or non-volatile memory.
  • each of the upper and lower manipulators further include end effectors wherein the couplings are disposed on the end effectors.
  • the arm couplings may be either stirrups or handles or a combination thereof.
  • the end effectors on the lower manipulators further comprise slots into which a user can insert his or her feet.
  • Other embodiments include a method for configuring the various manipulators of the full body haptic exoskeleton device in which the position of each of the manipulators may be determined and each manipulator positioned accordingly.
  • Such embodiments may include having a base tower configured to or adaptable to receive a set of upper and lower robotic manipulators each having at least six degrees of freedom where each of the manipulators may be positioned on opposing sides of the base tower.
  • An optimal user workspace may be determined for the exoskeleton device according to a majority of potential users.
  • the workspace for the robotic manipulators may be determined based on the dimensions of the manipulators. Having determined workspaces for both the user and the robotic manipulators, a workspace overlap can then be calculated.
  • if the overlap of the two workspaces is such that a user could be adversely affected by the robotic manipulator movement, the overlap can be adjusted and further optimized such that the exoskeleton workspace encapsulates the optimal user workspace.
  • once the workspace overlap is configured such that the exoskeleton workspace encapsulates the user workspace, the upper and lower manipulators can be mounted to the base tower such that the overlap is maintained while in use.
  • determining the optimal user workspace and exoskeleton workspace further includes using three-dimensional point cloud data wherein a plurality of corresponding points are generated and then compared with respect to their relative positioning.
  • the optimizing the workspace overlap is performed using an iterative closest point algorithm.
  • optimizing the workspace overlap further includes determining a collision rejection variable in which any points within the workspace overlap that do not have a corresponding robotic pose, or that may interfere with the movement of the user, may be removed from the optimization calculation.
  • Still other embodiments include a method for controlling a full body haptic exoskeleton which includes having a user and a full body haptic exoskeleton device according to the many embodiments.
  • the full body haptic exoskeleton device may be configured to receive virtual reality information into a processing unit with respect to a desired virtual environment wherein the virtual environment may include multiple virtual objects within a virtual workspace.
  • the virtual reality information may contain data with respect to the location and dimension of the virtual objects and the user.
  • the device can translate the virtual objects and their respective positions into a virtual workspace. Additionally, as the virtual environment, including the objects, is determined, limitations on the movement of the upper and lower manipulators can be set to ensure the movement is within the workspace. The movement of each of the manipulators can then be calculated based on the workspace and the virtual environment. Finally, a controller device can control the movement of each of the manipulators of the exoskeleton device.
  • calculating the movement of each of the upper and lower manipulators further includes determining movement criteria.
  • the movement criteria include determining if the movement of the manipulator will collide with the user, will exceed the operator joint limitations, and/or will collide with a virtual object.
  • the movement of each manipulator may be revised to match the earlier determined criteria.
  • Each of the manipulators may be controlled based on the calculated movement.
  • Figure 1 illustrates a front perspective view of an open full body haptic exoskeleton device in accordance with an embodiment of the invention.
  • Figure 2 illustrates a front perspective view of a controlled environment full body haptic exoskeleton device in accordance with an embodiment of the invention.
  • Figure 3 illustrates graphs showing the results of an iterative closest point calculation, showing the movement of the workspaces of a human limb and a robotic manipulator to their closest overlapped state, in a haptic device in accordance with an embodiment of the invention.
  • Figure 4 illustrates a diagram showing the overlapping workspace of the left human arm and a robotic arm workspace in a haptic device in accordance with an embodiment of the invention.
  • Figure 5 illustrates a processing system that controls the manipulators in accordance with an embodiment of the invention.
  • Figure 6 illustrates a flow diagram of a process performed by the processing system to control the movement of the manipulators in accordance with an embodiment of the invention.
  • full body haptic exoskeleton systems in accordance with various embodiments of the invention are disclosed.
  • four industrial robotic manipulators are used to create a full body haptic device that allows for haptic interactions with both arms and both legs of a user.
  • impedance control of an industrial robotic manipulator can be used to make the manipulator act as a follower of and resistor to human applied input.
  • One such use is the provision of haptic feedback to a user.
  • a system can advantageously use the rigidity of these industrial robotic manipulators by attaching an end effector of an industrial robotic manipulator to a limb or limbs of a human operator. The manipulator is then used to resist movement of the limb by the human operator in the real world to reflect objects or boundaries in a virtual world. While it is unorthodox to pair a human with a rigid industrial robotic manipulator, this type of system allows for a very strong and stiff reaction when the user interacts with a solid and/or immovable virtual object such as a wall.
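As a rough illustration of the stiff, impedance-style resistance described above, the sketch below renders a rigid virtual wall at the coupling point. The damping value, wall stiffness, wall location, and the function name impedance_force are assumptions made for the example; they are not parameters taken from the disclosure.

```python
import numpy as np

# Assumed parameters for the sketch (not given in the disclosure).
B_DAMP = 40.0        # viscous damping [N*s/m], light enough to follow the operator
K_WALL = 20000.0     # virtual wall stiffness [N/m], stiff enough to feel rigid

def impedance_force(x, v, wall_x=0.5):
    """Cartesian force to render at the end effector.

    In free space only a small damping term acts, so the manipulator follows
    the operator's limb. Once the limb crosses the virtual wall plane at
    x[0] = wall_x, a stiff spring force resists further penetration.
    """
    f = -B_DAMP * np.asarray(v, dtype=float)   # follower behaviour in free space
    penetration = x[0] - wall_x
    if penetration > 0.0:                      # limb has entered the virtual wall
        f[0] -= K_WALL * penetration           # strong, stiff push-back
    return f
```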
  • Other types of haptic systems use compliant robotic systems that are designed to interact with humans by mechanically flexing. However, these compliant systems result in less immersive experiences for the users.
  • the robotic manipulators that interact with the limbs of a user are positioned in the device with respect to the human operator via rigorous calculations to achieve high overlapping workspace coverage and matching manipulability of the robotic manipulators to the maneuverability of human limbs.
  • a full body haptic exoskeleton device in accordance with an embodiment of the invention is shown in Figure 1.
  • a full body haptic exoskeleton device 100 in accordance with an embodiment of the invention includes four industrial robotic manipulators 101-104 connected in parallel with a human operator. The lower two robotic manipulators 102, 104 attach to the lower extremities of the human operator in the shown embodiment.
  • the connection is at the ankles of the user. In accordance with several embodiments, the connection may be at the foot, and in accordance with many embodiments, the connection may be to the lower leg of the user.
  • the upper two robotic manipulators 101, 103 are attached to the wrists of an operator in the shown embodiments.
  • the connection may be to the hand of the operator and in still further embodiments, the connection may be to the forearm of the operator.
  • a stirrup/handle tooltip can be located on the end effector of each of the upper robotic manipulators that can be grasped by an operator.
  • the end effector of the lower robotic manipulators can have slots into which a user can insert his or her feet. This configuration allows for quick mounting and dismounting of the operator in the device.
  • a second human-device interface configuration has wrist and ankle mounted clutches. The wrist and ankle clutches allow for free human hand and ankle orientation and break away from the robotic arms should a force pass a pre-designated threshold value.
  • the four robotic manipulators 101-104 are mounted on a command tower 110 that houses the controllers as well as a seat 105 for the operator.
  • the seat 105 is optional and is designed to move in the vertical direction to accommodate persons of varying heights and may be configured to improve the comfort of the user when operating the exoskeleton device.
  • the seat 105 may also be driven during operation to enhance the haptic experience in accordance with many embodiments.
  • the seat 105 reduces the load that the robotic manipulators need to carry by supporting most of the weight of the operator.
  • a harness may be provided to support the weight of the operator.
  • the entire device can be built as an open environment as shown in Figure 1 or as an enclosed, controlled environment as shown in Figure 2.
  • environmental control systems may control airflow and moisture in enclosure 205 to enhance the haptic experience.
  • the enclosure 205 may be integrated with the command tower 110 and may be permanently fixed with an opening 210 to allow for the user to enter and exit the exoskeleton device 100.
  • the enclosure may include a moveable panel that may be moved out of the way of the exoskeleton device 100 to better accommodate the user while positioning themselves within the device 100.
  • the optimal placement of the mounting for each robotic manipulator with respect to a coupled human limb considers several criteria that should ideally be met for each robotic manipulator.
  • a mounting placement determination method that optimizes a position of a robotic manipulator with a weighted function based on human-robot workspace overlap, manipulability similarity, collision rejection, and operation location probability is used to determine the mount position of each robotic manipulator.
  • Human-robot workspace overlap can be the most important factor in the weighted optimization to reduce the likelihood that the device restricts the range of movements that an operator might want to perform in the device.
  • achieving maximum overlap in the human-robot workspace can increase the comfort level of the human user of the device.
  • methods in accordance with many embodiments of the invention generate a workspace of the robotic manipulator as point cloud data and compare the workspace to a simulated human arm workspace based on the joint limits of a healthy human subject.
  • the configuration and placement of the manipulators are chosen to accommodate 90% of human male users and thus accommodate the majority of users. The comparison may be performed using an iterative closest point algorithm in accordance with many embodiments to find the volumetric coverage of the human limb's workspace (V_overlap / V_human_total).
  • the Iterative Closest Point (ICP) algorithm can compare point data from two different data sets and utilizes an iterative process to create an overlap of the two data sets within a defined error limit.
  • the two data sets may be the point data sets from the human and robotic arm workspaces.
  • the ICP algorithm may determine the distance between corresponding points. Subsequently, the algorithm moves or adjusts one data set, recalculates distances, and readjusts iteratively until a specified error threshold is met.
  • the ICP algorithm may be utilized in a haptic exoskeleton device within a virtual reality environment to improve the response of the exoskeleton within the human workspace. Furthermore, the ICP method can be useful in preventing the movement of the exoskeleton manipulators from exceeding the range of motion of the user thereby reducing the risk to the user.
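The sketch below shows how an iterative closest point comparison of two workspace point clouds, and the resulting volumetric coverage ratio V_overlap / V_human_total, might be computed. It uses NumPy and SciPy's KD-tree for nearest-neighbour matching; the iteration count, convergence tolerance, and voxel size are assumptions made for the example and are not specified in the disclosure.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(human_pts, robot_pts, max_iter=50, tol=1e-6):
    """Rigidly align the human-limb workspace cloud to the robot workspace cloud.

    Returns the transformed human cloud and the per-iteration mean distance,
    analogous to the convergence curve shown in Figure 3.
    """
    src = human_pts.copy()
    tree = cKDTree(robot_pts)
    errors = []
    for _ in range(max_iter):
        dist, idx = tree.query(src)                  # closest robot point per human point
        errors.append(dist.mean())
        # Best-fit rigid transform (Kabsch / SVD) between the matched pairs.
        src_c, dst_c = src.mean(0), robot_pts[idx].mean(0)
        H = (src - src_c).T @ (robot_pts[idx] - dst_c)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:                     # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        src = (R @ (src - src_c).T).T + dst_c
        if len(errors) > 1 and abs(errors[-2] - errors[-1]) < tol:
            break
    return src, errors

def coverage_ratio(human_pts, robot_pts, voxel=0.05):
    """Estimate V_overlap / V_human_total on a voxel grid (5 cm voxels assumed)."""
    human_vox = {tuple(v) for v in np.floor(human_pts / voxel).astype(int)}
    robot_vox = {tuple(v) for v in np.floor(robot_pts / voxel).astype(int)}
    return len(human_vox & robot_vox) / len(human_vox)
```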
  • A graph showing a sample iterative closest point calculation is shown in Figure 3.
  • the first plot 300 shows the workspace of an upper robotic manipulator 320 and the workspace of a human arm 325.
  • the second plot 305 shows the result of an iterative closest point comparison wherein the workspace of the human arm 325 is enclosed within the workspace of an upper robotic manipulator 320.
  • Graph 310 shows the convergence of an iterative closest point comparison.
  • A visualization of a mounting position of a left upper robotic manipulator with respect to a left arm of a user in accordance with an embodiment of the invention is shown in Figure 4.
  • a diagram shows the overlap workspace of the left human arm with a robotic arm workspace in accordance with an embodiment of the invention.
  • the overlap workspace between the human arm and the robotic manipulator may best be determined using the iterative closest point method described previously.
  • the reachable workspace of the left human arm is shown in green, the unreachable workspace of the arm in red, and total robotic manipulator workspace in blue.
  • the selected manipulators are typically chosen to have at least six degrees of freedom.
  • the robotic manipulators can also be chosen to have a reach such that there is a minimum of wasted workspace beyond the workspace of the human limbs. For a 90th-percentile male this is a 760 mm reach for the upper manipulators and a 950 mm reach for the lower manipulators. Payload capacity is also important for providing adequate haptic feedback forces/torques; at a minimum, the system should be able to support the human arm and leg without any assistance from the operator. This means a minimum of 4 kg for the upper limbs and 12 kg for the lower limbs. Larger payloads may also be selected to increase the feedback magnitude capabilities. In many embodiments, the robot joint centers do not coincide with the human joint locations.
  • manipulability is also considered in selection of the placement of the mount of each manipulator to ensure the robotic manipulator can follow the human operator with relative ease.
  • placement determination methods in accordance with some embodiments determine mounting locations that satisfy the workspace overlap criteria. For each of these locations, manipulability of the robotic manipulator is calculated at numerous workspace points and compared to the manipulability of the corresponding human limb at the same points. By using the human limb manipulability as a reference, a score is assigned to how closely the manipulability of the robotic manipulator matches for all possible motions of the corresponding limb. The score is used in the final optimization to select the best mounting location.
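A minimal sketch of such a manipulability comparison is given below, using the Yoshikawa manipulability measure w = sqrt(det(J J^T)). The disclosure does not specify the scoring formula; the symmetric ratio used here, and the Jacobian callables for the robot and the human limb, are placeholders an implementation would have to supply.

```python
import numpy as np

def yoshikawa(J):
    """Yoshikawa manipulability measure w = sqrt(det(J @ J.T)) for a Jacobian J."""
    return np.sqrt(max(np.linalg.det(J @ J.T), 0.0))

def manipulability_score(points, robot_jacobian_at, human_jacobian_at):
    """Score how closely the robot's manipulability tracks the human limb's.

    robot_jacobian_at / human_jacobian_at are placeholder callables returning
    the Jacobian at a workspace point; a score of 1.0 means a perfect match.
    """
    ratios = []
    for p in points:
        w_r = yoshikawa(robot_jacobian_at(p))
        w_h = yoshikawa(human_jacobian_at(p))
        if w_h > 0.0:
            ratios.append(min(w_r, w_h) / max(w_r, w_h))  # symmetric similarity in [0, 1]
    return float(np.mean(ratios)) if ratios else 0.0
```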
  • collision rejection and workspace location probability may also be used as filters in placement determination methods in accordance with many embodiments of the invention to remove any configurations that may be unsuitable physically.
  • collision rejection is evaluated by examining the overlapping workspace of the human-robot system (described above) and removing any points that do not have suitable robotic arm poses or that interfere with the movement of the corresponding limb of the operator.
  • Workspace data may also be evaluated by comparing the workspace to the probability of human activity in specific regimes of the workspace of the corresponding limb of an operator.
  • Areas where the operator is less likely to move or place the limb are less critical than areas of high activity for the limb. For example, it is more important to have unrestricted human arm movement (both position and velocity based) in the area directly in front of the torso than it is behind the back of the operator where the operator seldom reaches.
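One way the weighted placement optimization might combine the criteria described above is sketched below, with collision rejection applied as a hard filter and workspace location probability applied as a weight on the surviving coverage. The numeric weights and the array layout are assumptions made for illustration; the disclosure gives no specific values.

```python
import numpy as np

# Assumed weights for the criteria; the disclosure does not give numeric values.
WEIGHTS = {"overlap": 0.5, "manipulability": 0.3, "location_prob": 0.2}

def placement_score(overlap_pts, manip_score, reachable_mask, collision_mask, prob):
    """Weighted score for one candidate mount location.

    overlap_pts    : human workspace points covered by the robot workspace
    manip_score    : manipulability-similarity score (see the earlier sketch)
    reachable_mask : True where a suitable robot pose exists
    collision_mask : True where the robot pose would interfere with the operator
    prob           : probability of human activity at each overlap point
    """
    # Collision rejection: keep only points with a usable, non-interfering pose.
    keep = reachable_mask & ~collision_mask
    if not keep.any():
        return 0.0
    coverage = keep.sum() / len(overlap_pts)
    # Weight surviving coverage by how likely the operator is to use each region.
    weighted_cov = prob[keep].sum() / prob.sum()
    return (WEIGHTS["overlap"] * coverage
            + WEIGHTS["manipulability"] * manip_score
            + WEIGHTS["location_prob"] * weighted_cov)
```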
  • movement of the robotic manipulators is controlled to provide haptic feedback for a virtual reality environment.
  • a control system receives information about the virtual reality environment and controls the manipulators based upon the pose of the operator within the virtual reality environment.
  • the virtual reality environment may include a wall and the full body haptic device prevents the user from adopting a pose that would correspond to movement through the wall in the virtual reality environment.
  • the control system may include a processing system that executes software instructions for a control application to control the robotic manipulators of the device.
  • a processing system that is part of the control system in accordance with some embodiments of the invention is shown in Figure 5. In accordance with several embodiments, the processing system may be housed within the device.
  • the processing system may be outside of the device and communicates with controllers inside the device via a hard-wired or wireless connection.
  • a particular processing system may include other components that are omitted for brevity without departing from this invention.
  • the processing device 500 shown in Figure 5 includes a processor 505, a non-volatile memory 510, and a volatile memory 515.
  • the processor 505 is a processor, microprocessor, controller, or a combination of processors, microprocessors, and/or controllers that performs instructions stored in the volatile memory 515 or non-volatile memory 510 to manipulate data stored in the memory.
  • the non-volatile memory 510 can store the processor instructions utilized to configure the processing system 500 to perform processes including processes in accordance with embodiments of the invention and/or data for the processes being utilized.
  • the processing system software and/or firmware can be stored in any of a variety of non-transient computer readable media appropriate to a specific application.
  • a network interface is a device that allows the processing system 500 to transmit and receive data over a network based upon the instructions performed by the processor 505.
  • Although a particular processing system 500 is illustrated in Figure 5, any of a variety of processing systems in the various devices can be configured to provide the methods and systems in accordance with embodiments of the invention.
  • a control application executed by the processing system in accordance with some of these embodiments maps a virtual reality environment to a workspace environment of the device. Based on the mapping of the virtual reality environment to the workspace, the parameters for movements of the manipulators within the workspace are set such that the manipulators provide haptic feedback for movement by the user within the virtual reality environment.
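A minimal sketch of this mapping step is shown below. It assumes the calibration between the virtual reality frame and the manipulator workspace frame is a known 4x4 homogeneous transform with matching orientation, and that virtual objects are represented as axis-aligned boxes; these assumptions, and the function names, are illustrative rather than taken from the disclosure.

```python
import numpy as np

def vr_to_workspace(p_vr, T_vr_to_ws):
    """Map a point from virtual-reality coordinates into the manipulator workspace
    frame using a 4x4 homogeneous transform (assumed known from calibration)."""
    p = np.append(np.asarray(p_vr, dtype=float), 1.0)
    return (T_vr_to_ws @ p)[:3]

def object_limits(vr_objects, T_vr_to_ws):
    """Turn virtual objects (centre + axis-aligned dimensions) into workspace-frame
    bounding boxes that the controller can treat as movement limits."""
    boxes = []
    for obj in vr_objects:                      # e.g. {"center": [...], "dims": [...]}
        c = vr_to_workspace(obj["center"], T_vr_to_ws)
        half = 0.5 * np.asarray(obj["dims"], dtype=float)
        boxes.append((c - half, c + half))      # (min corner, max corner)
    return boxes
```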
  • a process for controlling the robotic manipulators of a full body haptic device in accordance with an embodiment of the invention is shown in Figure 6.
  • virtual reality environment information is received (605).
  • the virtual reality information includes information about the location and/or dimensions of objects in the environment.
  • the virtual reality information may also include position and orientation information for the user within the virtual environment.
  • the position and dimension of objects and other structures in the virtual reality information are then translated from the virtual reality space to coordinates within the workspace of the manipulators of the full body haptic device (610).
  • the workspace coordinates of the positions and dimension of objects and other structures in the virtual reality environment are then used to set limits on the movements of each of the robotic manipulators (615). This process may then be periodically repeated as new virtual reality information is received by the processing system.
  • the process 600 then begins to receive input of the operator's movements from sensors mounted on the manipulators, indicating desired movements by the operator in the virtual reality world (620).
  • a movement for each manipulator to emulate the user's movement in the virtual reality world is then determined from the received input from the sensors (625).
  • the movement for each manipulator is then analyzed to determine whether the movement will cause a collision with the operator (630). If a possible collision is detected, the movement of each manipulator is revised (645) and the revised movements are retested.
  • the movements are also tested to determine whether the movement will violate an operator joint limit (635). If a possible violation of a joint limit is detected, the movement of each manipulator is revised (645) and the revised movements are retested.
  • the movements are further tested to see if a collision with a VR object occurs (640). If a possible collision is detected, the movement of each manipulator is revised (645) and the revised movements are retested.
  • the control scheme is dominated by admittance control allowing the operator to move freely with no resistance.
  • admittance control and impedance control can be used to simulate a virtual stiffness of the object. This can range from completely rigid (thus not allowing any penetration by the user's limbs for a virtual wall or similar object) to semi-compliant (such as for a virtual fluid).
  • the movement for each manipulator is sent to the manipulators (650) and the process repeats until an end signal is detected (655).
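The skeleton below pulls the steps of Figure 6 together for a single manipulator, with admittance control in free space and an impedance-style reaction inside virtual objects. The robot interface (fk, jacobian, collides_with_user), the damping and stiffness values, the control rate, and the use of clipping as a stand-in for the "revise" step are all assumptions made for illustration and are not part of the disclosure.

```python
import numpy as np

def control_step(q, f_operator, boxes, joint_limits, robot, dt=0.002,
                 b_adm=50.0, k_obj=5000.0):
    """One cycle of the Figure 6 loop for a single manipulator (illustrative sketch).

    q            : current joint angles
    f_operator   : Cartesian force applied by the operator at the coupling (620)
    boxes        : workspace-frame (min, max) boxes for virtual objects (615)
    joint_limits : (lower, upper) joint-angle limits of the operator/robot pair
    robot        : placeholder providing fk(q), jacobian(q), collides_with_user(x)
    b_adm, k_obj : assumed admittance damping and virtual-object stiffness
    """
    x = robot.fk(q)
    f_total = np.array(f_operator, dtype=float)

    # (640) Collision with a virtual object: add an impedance-style reaction force
    # that resists penetration, from rigid walls down to semi-compliant objects.
    for lo, hi in boxes:
        if np.all(x > lo) and np.all(x < hi):
            depth_lo, depth_hi = x - lo, hi - x
            axis = int(np.argmin(np.minimum(depth_lo, depth_hi)))
            sign = -1.0 if depth_lo[axis] < depth_hi[axis] else 1.0
            f_total[axis] += sign * k_obj * min(depth_lo[axis], depth_hi[axis])
            break

    # Admittance control (625): the operator's force commands a Cartesian velocity,
    # so in free space the manipulator simply follows the limb with no resistance.
    v_cmd = f_total / b_adm
    dq = np.linalg.pinv(robot.jacobian(q)) @ v_cmd * dt

    # (635) Joint-limit check: clip the commanded step (a simple stand-in for "revise").
    q_next = np.clip(q + dq, joint_limits[0], joint_limits[1])

    # (630) Collision with the operator: reject the step entirely (another "revise").
    if robot.collides_with_user(robot.fk(q_next)):
        return q
    return q_next                                # (650) command sent to the manipulator
```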

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Manipulator (AREA)

Abstract

A full body haptic exoskeleton device is disclosed. The device has a tower with two upper robotic manipulators that connect to a user's arms and two lower robotic manipulators that connect to the user's legs. A control system controls the upper and lower robotic manipulators to provide haptic feedback for a virtual environment as the user moves through the environment.
PCT/US2018/053781 2017-09-29 2018-10-01 Full body haptic exoskeleton device WO2019068095A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762565447P 2017-09-29 2017-09-29
US62/565,447 2017-09-29

Publications (1)

Publication Number Publication Date
WO2019068095A1 (fr) 2019-04-04

Family

ID=65902778

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2018/053781 WO2019068095A1 (fr) 2017-09-29 2018-10-01 Full body haptic exoskeleton device

Country Status (1)

Country Link
WO (1) WO2019068095A1 (fr)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6270414B2 (en) * 1997-12-31 2001-08-07 U.S. Philips Corporation Exoskeletal platform for controlling multi-directional avatar kinetics in a virtual environment
US20120179075A1 (en) * 2006-03-29 2012-07-12 University Of Washington Exoskeleton
US20090248202A1 (en) * 2006-08-31 2009-10-01 Koichi Osuka Multi-joint structure, mounting tool using it, system and human machine interface
US20080304935A1 (en) * 2007-05-01 2008-12-11 Scott Stephen H Robotic exoskeleton for limb movement
US8401225B2 (en) * 2011-01-31 2013-03-19 Microsoft Corporation Moving object segmentation using depth images
US9652037B2 (en) * 2013-07-05 2017-05-16 Axonvr Corporation Whole-body human-computer interface
US20160320862A1 (en) * 2014-05-01 2016-11-03 Aaron Schradin Motion control seat input device
US20160058647A1 (en) * 2014-08-29 2016-03-03 Conor J. MADDRY Pneumatic electromyographic exoskeleton

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"Computer Memory", WIKIPEDIA, 10 September 2017 (2017-09-10), XP055586264, Retrieved from the Internet <URL:https://en.wikipedia.org/wiki/Computer_memory> *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022018480A1 (fr) * 2020-07-19 2022-01-27 Julio Alberto Mendoza Mendoza Methods for haptic and immersive augmented interaction
CN113435011A (zh) * 2021-06-01 2021-09-24 华中科技大学鄂州工业技术研究院 Parameter optimization method for a supernumerary robotic limb based on workspace similarity
CN113435011B (zh) * 2021-06-01 2022-05-20 华中科技大学鄂州工业技术研究院 Parameter optimization method for a supernumerary robotic limb based on workspace similarity

Similar Documents

Publication Publication Date Title
Wu et al. A teleoperation interface for loco-manipulation control of mobile collaborative robotic assistant
JP5180989B2 (ja) Method and device for automatic control of a humanoid robot
US20210402590A1 (en) Robotic navigation system and method
EP2252231B1 (fr) Method and system for robotic surgery simulation
CN109700535B (zh) Software-centered and highly configurable robotic system for surgery and other applications
Zhou et al. RML glove—An exoskeleton glove mechanism with haptics feedback
Koenemann et al. Real-time imitation of human whole-body motions by humanoids
US8406989B1 (en) Method for adaptive obstacle avoidance for articulated redundant robot arm
Dean-Leon et al. TOMM: Tactile omnidirectional mobile manipulator
WO2019068095A1 (fr) Full body haptic exoskeleton device
Dean-Leon et al. Whole-body active compliance control for humanoid robots with robot skin
JP7035309B2 (ja) Master-slave system
Zacharias et al. Using a model of the reachable workspace to position mobile manipulators for 3-d trajectories
Chen et al. Development of a user experience enhanced teleoperation approach
Buzzi et al. An uncontrolled manifold analysis of arm joint variability in virtual planar position and orientation telemanipulation
Sarac et al. Rendering strategies for underactuated hand exoskeletons
Song et al. Integrated voluntary-reactive control of a human-superlimb hybrid system for hemiplegic patient support
Zhao et al. An intuitive human robot interface for tele-operation
Ju et al. Human-centered evaluation of shared teleoperation system for maintenance and repair tasks in nuclear power plants
Chang et al. On wearable, lightweight, low-cost human machine interfaces for the intuitive collection of robot grasping and manipulation data
JP3884249B2 (ja) Teaching system for a humanoid hand robot
Marrone et al. Compliant interaction of a domestic service robot with a human and the environment
Kerpa et al. Arm-hand-control by tactile sensing for human robot co-operation
Frejek et al. A methodology for tele-operating mobile manipulators with an emphasis on operator ease of use
Amirshirzad et al. Synergistic human-robot shared control via human goal estimation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18862553

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18862553

Country of ref document: EP

Kind code of ref document: A1