WO2009055707A1 - Real-time self collision and obstacle avoidance - Google Patents
Real-time self collision and obstacle avoidance
- Publication number
- WO2009055707A1 (PCT/US2008/081171)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- motion
- body segment
- redirected
- determining
- joint
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
- B25J9/1666—Avoiding collision or forbidden zones
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39091—Avoid collision with moving obstacles
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39096—Self-collision, internal collision, collision between links of one robot
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40475—In presence of moving obstacles, dynamic environment
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40492—Model manipulator by spheres for collision avoidance
Definitions
- the invention generally relates to the field of controlling motion of a system, and more specifically, to controlling motion of a system to avoid collision.
- Embodiments of the present invention provide a method (and corresponding system and computer program product) for avoiding collision of a body segment with unconnected structures in an articulated system.
- a virtual surface is constructed surrounding an actual surface of the body segment. Distances between the body segment and unconnected structures are monitored. In response to an unconnected structure penetrating the virtual surface, a redirected joint motion that prevents the unconnected structure from penetrating deeper into the virtual surface is determined. The body segment is redirected based on the redirected joint motion to avoid colliding with the unconnected structure.
- a collision point on the actual surface of the body segment is determined and used to determine a redirected motion of the collision point that prevents the unconnected structure from penetrating deeper into the virtual surface.
- the redirected joint motion is determined based on the redirected motion of the collision point and used to avoid colliding with the unconnected structure.
- FIG. 1 is a block diagram illustrating a motion retargeting system for controlling a target system in accordance with one embodiment of the invention.
- Figure 2 is a block diagram illustrating a configuration of the motion retargeting system shown in Figure 1 in accordance with one embodiment of the invention.
- Figure 3 is a flow diagram illustrating a tracking retargeting process in accordance with one embodiment of the invention.
- Figure 4 is a diagram illustrating two unconnected rigid bodies redirected to avoid colliding into each other in accordance with one embodiment of the invention.
- Figure 5 is a diagram illustrating the plot of an example blending function used to avoid collision in accordance with one embodiment of the invention.
- Figure 6 is a flowchart illustrating an example process for preventing collisions between unconnected bodies in accordance with one embodiment of the invention.
- the present invention provides a motion retargeting system (and corresponding method and computer program product) for real-time motion control of robots and other articulated rigid body systems while avoiding self collisions and obstacles. Collisions are avoided by constructing a virtual surface around body segments, and redirecting motion of collision points away from possible collisions when the virtual surface is penetrated.
- The Figures (FIGS.) and the following description relate to embodiments of the present invention by way of illustration only. Reference will now be made in detail to several embodiments, examples of which are illustrated in the accompanying figures. It is noted that wherever practicable similar or like reference numbers may be used in the figures and may indicate similar or like functionality. The figures depict embodiments of the disclosed system (or method) for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles described herein.
- FIG. 1 is a block diagram illustrating a motion retargeting system 100 for controlling a target system 104, such as a robotic/bio-robotic system, to simulate motions tracked in a source system 102 in real time while avoiding self collisions and obstacles, according to one embodiment of the present invention.
- the motion retargeting system 100 (also known as motion planning system or motion filtering system) detects motion descriptors 108 of the source system 102.
- the source system 102 may be any motion generator, for example, a human or an animal.
- the motion retargeting system 100 generates joint variables 110 for controlling the motion of the target system 104.
- the target system 104 may be, for example, a generic articulated system, such as a robot, an articulated mechanism (e.g., a humanoid robot), an avatar, or an exoskeleton apparatus for wearing by a human or animal.
- the motion retargeting system 100 captures motions generated in the source system 102 and transfers the captured motions to the target system 104, a process commonly referred to as motion retargeting.
- Motions in the source system 102 are tracked (e.g., by measuring marker positions, feature points) and expressed as motion descriptors 108 (also known as motion trajectories, desired task descriptors, task variables) using one or more motion primitives in Cartesian (or task) space.
- the motion descriptors 108 are converted to joint variables 110 (also known as joint space trajectories, joint motions, joint commands, joint motion descriptors) by the motion retargeting system 100.
- the motion retargeting system 100 uses the joint variables 110 to control the target system 104 to simulate the motion in the source system 102.
- the motion retargeting system 100 can impose constraints on motion in the target system 104 to avoid joint limits, muscular torque limits, self collisions, obstacles, and the like.
- the source system 102 represents a human model, and the source motion represents human motion primitives which are typically observed or inferred from measurements.
- the target system 104 represents a humanoid robot that is controlled to imitate the human model's motion.
- the motion retargeting system 100 may be used for other purposes such as human pose estimation, tracking and estimation, and joint torque estimation in biomechanics.
- FIG. 2 is a block diagram illustrating a configuration of the motion retargeting system 100 for generating joint commands from observed motion descriptors 108 according to one embodiment.
- the motion retargeting system 100 generates a joint command q (the joint variables 110) for application to a robot system 214 (the target system 104) in response to the motion descriptors 108 extracted from observations of human motion in the source system 102 and sensed motion information 236 from the robot system 214.
- the motion retargeting system 100 comprises a tracking retargeting system 202, a constraints system 204, and a balance control system 206.
- the tracking retargeting system 202 generates the joint command q from the observed motion descriptors 108, constraint motion descriptors 230 and appropriate weighting matrices 232 from the constraints system 204, and balance motion descriptors 234 from the balance control system 206.
- the constraints system 204 generates the constraint motion descriptors 230 in response to the sensed motion information from the robot system 214.
- the balance control system 206 generates the balance motion descriptors 234 in response to the sensed motion information 236 from the robot system 214.
- the motion retargeting system 100, the tracking retargeting system 202, the constraints system 204, and/or the balance control system 206 takes the generated joint command q as input, instead of or in addition to the sensed motion information 236 from the robot system 214.
- the motion retargeting system 100 uses a task space control framework to generate motion for all degrees of freedom in the target system 104 (in this case, the robot system 214) from a set of desired motion descriptors 108 which are observed from measurements (e.g., at feature points), synthesized, or computed from the current configuration of the target system 104.
- the tracking retargeting system 202 generates motion that results in a set of computed joint commands which track the desired task descriptors, e.g., minimize the Cartesian tracking error.
- the balance control system 206 controls the resulting motion for balance and keeps the target system 104 stable.
- the constraints system 204 provides commands to prevent the target system 104 from violating physical limits, such as joint limits, velocity limits, and torque limits, and also works with the tracking retargeting system 202 to ensure the target system 104 avoids obstacles, self collisions, and computational problems arising from singularities.
- the three systems 202, 204 and 206 may present a large number of conflicting tasks which may be resolved through a hierarchical task management strategy. Further information of resolving conflicting tasks is found in U.S. Application No. 11/734,758, filed April 12, 2007, titled "Control Of Robots From Human Motion Descriptors", the content of which is incorporated by reference herein in its entirety.
- the motion retargeting system 100 may be configured as software (e.g., modules that comprise instructions executable by a processor), hardware (e.g., an application specific integrated circuit), or a combination thereof.
- the software and/or hardware may operate in a computer system that is structured to include a processor, memory, computer-readable storage medium (e.g., hard drive), network interfaces, and applicable operating system and other functional software (e.g., network drivers, communication protocols).
- the tracking retargeting system 202 converts the desired trajectories (the motion descriptors 108) from Cartesian space to joint space through a trajectory conversion process.
- the joint (or configuration) space refers to the set of all possible configurations of the target system 104.
- the Cartesian (or task) space refers to the space of the source system 102.
- One goal for the tracking retargeting system 202 is to generate collision-free joint motion based on reference motion described in the task space.
- ẋ_i = J_i(q) q̇,  (1)
- J_i is the Jacobian of the i-th task.
- ω_i and ṗ_i are vectors corresponding to the angular velocity of the task frame and the linear velocity of the task position referenced to the base frame, respectively.
- an augmented spatial velocity vector ẋ and an augmented Jacobian matrix J are formed as follows,
- the Jacobian matrix may be decomposed into its rotational and translational components, denoted by J_o and J_p, respectively, as follows,
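As a concrete sketch of how the augmented quantities can be assembled (array shapes and variable names here are illustrative, not taken from the patent), each task contributes a 6-D spatial velocity (angular part ω_i stacked over linear part ṗ_i) and a 6 × n Jacobian, and the augmented vector and matrix are their vertical stacks:

```python
import numpy as np

def augment(task_velocities, task_jacobians):
    """Stack per-task spatial velocities and Jacobians into the
    augmented vector x_dot and matrix J used by the tracking controller."""
    x_dot = np.concatenate(task_velocities)   # (6n,)
    J = np.vstack(task_jacobians)             # (6n, dof)
    return x_dot, J

# two tasks on a hypothetical 7-DoF chain
rng = np.random.default_rng(0)
x1, x2 = rng.standard_normal(6), rng.standard_normal(6)
J1, J2 = rng.standard_normal((6, 7)), rng.standard_normal((6, 7))
x_dot, J = augment([x1, x2], [J1, J2])
print(x_dot.shape, J.shape)  # (12,) (12, 7)
```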
- Closed loop inverse kinematics is an effective method to perform the trajectory conversion from task space to joint space.
- the tracking retargeting system 202 utilizes a CLIK algorithm to perform the trajectory conversion.
- CLIK algorithms can be found in B. Dariush, M. Gienger, B. Jian, C. Goerick, and K. Fujimura, "Whole body humanoid control from human motion descriptors", Int. Conf. Robotics and Automation, (ICRA), Pasadena, CA (2008), the content of which is incorporated by reference herein in its entirety.
- a CLIK algorithm uses a set of task descriptors as input and estimates the robot joint commands that minimize the tracking error between the reference Cartesian motion (the desired task descriptors) and predicted Cartesian motion.
- a control policy of the CLIK algorithm is configured to produce robot joint commands such that the Cartesian error between the predicted robot task descriptors and the reference task descriptors is minimized.
- the tracking performance is subject to the kinematic constraints of the robot system 214, as well as the execution of multiple and often conflicting task descriptor requirements.
- the formulation of such a constrained optimization is based on extensions of the CLIK formulation.
- the joint velocities may be computed by inverting Equation 1 and adding a feedback error term to correct for numerical drift.
- J* denotes the regularized right pseudo-inverse of J, weighted by a positive definite matrix W_1 and regularized by a positive definite damping matrix W_2,
- the rate of convergence of the error for the i-th descriptor is controlled by K_i, a diagonal 6 × 6 positive definite gain matrix.
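A single CLIK update of this form can be sketched as follows. The weighted, damped pseudo-inverse expression below is the standard form from the CLIK literature and is an assumption here, since the patent's Equations 7 and 8 are not reproduced in this text:

```python
import numpy as np

def clik_step(J, x_dot_ref, err, W1, W2, K):
    """One closed-loop inverse-kinematics step (a sketch; exact matrix
    forms are assumptions):
      J_star = W1^-1 J^T (J W1^-1 J^T + W2)^-1   (weighted, damped)
      q_dot  = J_star (x_dot_ref + K err)        (feedback-corrected)
    """
    W1_inv = np.linalg.inv(W1)
    J_star = W1_inv @ J.T @ np.linalg.inv(J @ W1_inv @ J.T + W2)
    return J_star @ (x_dot_ref + K @ err)

rng = np.random.default_rng(1)
J = rng.standard_normal((6, 7))              # one 6-D task, 7 joints
q_dot = clik_step(J,
                  x_dot_ref=rng.standard_normal(6),
                  err=rng.standard_normal(6),
                  W1=np.eye(7),               # joint-space weighting
                  W2=0.01 * np.eye(6),        # damping near singularities
                  K=5.0 * np.eye(6))          # error feedback gain
print(q_dot.shape)  # (7,)
```

The feedback term K err is what corrects the numerical drift mentioned above; with W2 = 0 and W1 = I this reduces to the plain right pseudo-inverse solution.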
- FIG. 3 is a flow diagram illustrating a tracking retargeting process based on the CLIK algorithm.
- the process is implemented in the tracking retargeting system 202, such that it can use the CLIK algorithm to track the position and orientation of the observed motion descriptors 108.
- the tracking retargeting system 202 generates a joint command (q) for following a time varying desired position (p_r) and orientation (Θ_r) of the task descriptors 108, in response to position errors generated by a position error system 307 and orientation errors generated by an orientation error system 304.
- the tracking retargeting system 202 uses Equation 7 as the control law to generate the joint command for the robot system 214.
- Collision avoidance of a target system 104 with itself or with other obstacles allows the target system 104 to safely execute a motion.
- Collision avoidance may be categorized as self-collision avoidance or obstacle avoidance.
- Self collision avoidance refers to a situation where two segments of the robot system 214 come into contact; whereas obstacle avoidance refers to the situation where the robot system 214 comes into contact with an object in the environment.
- Self collision avoidance may be further categorized as avoidance of self collision between connected body segment pairs and avoidance of self collision between unconnected body segment pairs. By connected segment pairs, it is implied that the two segments are connected at a common joint and assumed that the joint is rotational. In the case of obstacle avoidance, the two colliding bodies are always unconnected.
- the approach can be implemented in the CLIK formulation of the tracking retargeting system 202. Avoidance of self collision between connected body segment pairs is described first, followed by avoidance of collision between unconnected bodies. It is noted that the approach for avoidance of collision between unconnected bodies can be used both for self collision between unconnected body segment pairs and for collision with obstacles, since both involve a segment of the robot system 214 coming into contact with an unconnected body.
- avoidance of self collision between connected body segments and joint limit avoidance are achieved by the proper selection of the weighting matrix W_1 in Equation 8.
- weighting matrix W_1 is defined by the Weighted Least-Norm (WLN) solution. The WLN solution was originally proposed by T. F. Chan and R. V. Dubey.
- a WLN solution is formulated in the context of Damped Least Squares Jacobian inverse.
- the WLN solution is utilized to generate an appropriate weighting matrix based on the gradient of a joint limit function to dampen joints nearing their limits. This solution is described below.
- H(q) is a candidate joint limit function that has higher values when joints near their limits and tends to infinity at the joint limits.
- the gradient ∂H/∂q_i is equal to zero if the joint is at the middle of its range and goes to infinity at either joint limit.
- the diagonal elements w_JL,i are defined in Equation 13 in terms of Δ|∂H/∂q_i|, which represents the change in the magnitude of the joint limit gradient function.
- a positive value indicates the joint is moving toward its limit, while a negative value indicates the joint is moving away from its limit.
- when a joint moves toward its limit, the associated weighting factor described by the first condition in Equation 13 becomes very large, causing the motion to slow down.
- at the joint limit, the weighting factor is near infinity and the corresponding joint virtually stops. If the joint is moving away from the limit, there is no need to restrict or penalize the motion.
- in that case, the second condition in Equation 13 allows the joint to move freely. Therefore, W_JL can be used for joint limit avoidance.
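The WLN weighting scheme can be sketched as follows. Equation 13 is not reproduced in this text, so the specific H(q) below (the Chan/Dubey-style form) and the exact weight rule are assumptions; only the qualitative behavior — penalize a joint while it moves toward its limit, release it when it moves away — comes from the description above:

```python
import numpy as np

def joint_limit_grad(q, q_min, q_max):
    """Gradient of the Chan/Dubey-style joint limit function
    H(q) = sum (q_max-q_min)^2 / (4 (q_max-q)(q-q_min)); it is zero
    at mid-range and grows without bound at either limit."""
    num = (q_max - q_min) ** 2 * (2.0 * q - q_max - q_min)
    den = 4.0 * (q_max - q) ** 2 * (q - q_min) ** 2
    return num / den

def wln_weights(q, q_prev, q_min, q_max):
    """Diagonal WLN weights (the Equation 13 pattern, as an assumption):
    penalize a joint only while |dH/dq_i| is growing, i.e. while the
    joint moves toward its limit; otherwise leave it unweighted."""
    g_now = np.abs(joint_limit_grad(q, q_min, q_max))
    g_prev = np.abs(joint_limit_grad(q_prev, q_min, q_max))
    moving_toward = g_now - g_prev >= 0.0
    return np.where(moving_toward, 1.0 + g_now, 1.0)

q_min, q_max = np.array([-1.0]), np.array([1.0])
# joint moving toward its upper limit -> weight grows above 1
print(wln_weights(np.array([0.9]), np.array([0.8]), q_min, q_max))
# joint moving back toward mid-range -> weight stays at 1
print(wln_weights(np.array([0.8]), np.array([0.9]), q_min, q_max))
```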
- Figure 4 is a diagram illustrating two unconnected rigid bodies 410, 420 (i.e., bodies which do not share a joint) redirected to avoid colliding into each other according to one embodiment.
- body 410 also referred to as body A
- body 420 also referred to as body B
- body A may be moving toward a stationary body B, as indicated by its linear velocity ṗ_a 442.
- the coordinates of the closest points realizing the shortest distance d (d > 0) between the two bodies are denoted by p_a 412 and p_b 422, referenced to the base frame of the joint space.
- the two points, p_a and p_b, are also referred to as collision points.
- the coordinates of the point on the virtual surface corresponding to p_a, denoted by p_vs 424, are defined by offsetting p_a by the critical distance d_c along the outward unit normal.
- the region between the actual surface of body A and its virtual surface 430 is referred to as the critical zone 440. If body B is stationary, the motion at p_a can be redirected to prevent penetration into the critical zone 440. This redirection is invoked when d < d_c.
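The critical-zone test reduces to a distance comparison once the closest points are known. A minimal sketch, with the geometry reduced to two points and an illustrative d_c value (the patent's exact definition of p_vs is not reproduced here, so the offset formula is an assumption):

```python
import numpy as np

D_CRIT = 0.05  # critical distance d_c (metres; illustrative value)

def critical_zone_status(p_a, p_b, d_c=D_CRIT):
    """Given the closest points p_a (on body A) and p_b (on body B),
    report the separation d, the unit normal n_a pointing from A toward B,
    the virtual-surface point p_vs, and whether B is inside A's
    critical zone (d < d_c)."""
    diff = p_b - p_a
    d = np.linalg.norm(diff)
    n_a = diff / d
    p_vs = p_a + d_c * n_a      # point on the virtual surface along n_a
    return d, n_a, p_vs, d < d_c

p_a = np.array([0.0, 0.0, 0.0])
p_b = np.array([0.03, 0.0, 0.0])
d, n_a, p_vs, inside = critical_zone_status(p_a, p_b)
print(d, inside)  # 0.03 True -> body B has penetrated the virtual surface
```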
- the tracking retargeting system 202 controls (or redirects) the motion of p_a by modifying the trajectory of the desired task descriptor p_r for body A in the task space.
- the redirected motion of p_a is denoted by p'_a and its associated linear velocity by ṗ'_a 444.
- the tracking retargeting system 202 can redirect the collision point to prevent the two bodies from penetrating deeper into the critical zone 440 using different magnitudes and directions of ṗ'_a 444.
- in one embodiment, the tracking retargeting system 202 redirects the collision point p_a in a direction opposite to the unit normal vector n̂_a.
- in another embodiment, the tracking retargeting system 202 redirects the collision point p_a so that it slides along a direction which is tangent to the surface of body A at the collision point p_a, as shown in Figure 4.
- this guides the collision point motion of p_a along the virtual surface boundary, producing a more natural motion toward its target.
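The tangential redirection choice can be sketched as removing the velocity component along the normal that drives the bodies together (the sign convention and the gating on the normal component are assumptions for illustration):

```python
import numpy as np

def redirect_tangent(p_dot_a, n_a):
    """Redirect the collision-point velocity so it slides along the
    tangent plane of body A at p_a: remove the component along the
    unit normal n_a that would drive the bodies closer together."""
    normal_component = np.dot(p_dot_a, n_a)
    if normal_component <= 0.0:        # already moving apart; keep as-is
        return p_dot_a
    return p_dot_a - normal_component * n_a

n_a = np.array([1.0, 0.0, 0.0])        # normal pointing toward body B
v = np.array([0.2, 0.1, 0.0])          # velocity driving A toward B
v_slide = redirect_tangent(v, n_a)     # approach component removed,
print(v_slide)                         # tangential component preserved
```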
- the term J*(ṗ_r + K e) is the joint velocity solution obtained from Equation 7, with J* the weighted Damped Least Squares inverse of Equation 8.
- the physical interpretation of Equation 16 is as follows. The first term determines the joint velocities needed to redirect the collision point velocity along ṗ'_a.
- the second term of Equation 16 is the orthogonal complement of the first, which computes the entries for those joint velocities that do not affect the motion of the collision point p_a.
- the discontinuity of ṗ'_a at the moment of redirection results in a discontinuity of the redesigned task descriptor, as given by the solution in Equation 17.
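The two-term structure described for Equation 16 — realize the redirected collision-point velocity, then project the remaining task motion into the null space of the collision-point Jacobian — can be sketched as follows (the damped pseudo-inverse and variable names are assumptions, not the patent's exact formulation):

```python
import numpy as np

def redirected_joint_velocity(J_a, p_dot_a_prime, q_dot_task, damping=1e-8):
    """Sketch of the Equation 16 structure: the first term realizes the
    redirected collision-point velocity p_dot_a_prime, the second projects
    the task-tracking solution q_dot_task into the null space of J_a so
    it cannot move the collision point p_a."""
    JJt = J_a @ J_a.T + damping * np.eye(J_a.shape[0])
    J_a_pinv = J_a.T @ np.linalg.inv(JJt)       # damped pseudo-inverse
    N = np.eye(J_a.shape[1]) - J_a_pinv @ J_a   # null-space projector
    return J_a_pinv @ p_dot_a_prime + N @ q_dot_task

rng = np.random.default_rng(2)
J_a = rng.standard_normal((3, 7))               # linear-velocity Jacobian at p_a
p_dot_prime = rng.standard_normal(3)            # redirected point velocity
q_dot = redirected_joint_velocity(J_a, p_dot_prime, rng.standard_normal(7))
# the collision point moves (to within damping error) with the redirected velocity
print(np.allclose(J_a @ q_dot, p_dot_prime, atol=1e-4))  # True
```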
- the tracking retargeting system 202 may blend the solutions of the redirected task descriptor before and after redirection occurs.
- the parameter δ may be used to shift the distance d where blending is initiated and terminated (e.g., δ = 0.5).
- the tracking retargeting system 202 can specify the redirection vectors at the critical points p_a and p_b and use augmentation to control both critical points.
- the augmented velocity vector and Jacobian at the critical points are described by,
- FIG. 6 is a flowchart illustrating an example process 600 for preventing collisions between unconnected bodies utilizing the above algorithm according to one embodiment of the present invention.
- the process 600 can be executed or otherwise performed by, for example, the tracking retargeting system 202 illustrated in Figure 2.
- the process 600 focuses on collision prevention between one moving body segment of a robot system 214 and a static unconnected structure.
- One of ordinary skill in the related arts would understand that the process 600 can be applied to some or all body segments of the robot system 214, and/or to collision prevention between two moving bodies.
- One or more portions of the process 600 may be implemented in embodiments of hardware and/or software or combinations thereof.
- the process 600 may be embodied in instructions for performing the actions described herein; such instructions can be stored within a tangible computer readable medium (e.g., flash memory, hard drive) and are executable by a computer processor.
- one of ordinary skill in the related arts would understand that other embodiments can perform the steps of the process 600 in different order.
- other embodiments can include different and/or additional steps than the ones described here.
- the tracking retargeting system 202 can perform multiple instances of the steps of the process 600 concurrently and/or in parallel.
- the tracking retargeting system 202 controls motion of the robot system 214 based on motion descriptors received from a source system 102.
- the tracking retargeting system 202 constructs 610 a virtual surface surrounding a body segment of the robot system 214. The distance between the virtual surface and the actual surface of the body segment is determined by a critical distance.
- the tracking retargeting system 202 monitors 620 distances between the body segment and unconnected structures such as indirectly connected or unconnected body segments and obstacles in the environment.
- the tracking retargeting system 202 maintains a distance table listing the distances between the body segment and all unconnected structures that may potentially collide with the body segment, measures their distances through sensors in real time, and updates the distance table accordingly.
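A minimal sketch of such a distance table, with hypothetical structure names and simulated sensor readings standing in for the real-time distance measurements:

```python
import numpy as np

D_CRIT = 0.05  # critical distance d_c (illustrative)

class DistanceTable:
    """One entry per unconnected structure that could collide with a
    given body segment, refreshed from sensor measurements each cycle."""
    def __init__(self, structure_ids):
        self.d = {sid: np.inf for sid in structure_ids}

    def update(self, measurements):
        self.d.update(measurements)            # sensor distances, metres

    def threats(self, d_c=D_CRIT):
        """Structures that have penetrated the virtual surface (d < d_c)."""
        return [sid for sid, dist in self.d.items() if dist < d_c]

table = DistanceTable(["left_arm", "torso", "table_edge"])
table.update({"left_arm": 0.30, "torso": 0.04, "table_edge": 0.12})
print(table.threats())  # ['torso']
```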
- the tracking retargeting system 202 detects 630 that an unconnected structure has penetrated the virtual surface based on the monitored distances. If the distance between the body segment and the unconnected structure passes (or reaches) the critical distance d_c, the tracking retargeting system 202 determines that the unconnected structure has penetrated the virtual surface and thus poses a threat of colliding with the body segment. It is noted that in alternate embodiments the tracking retargeting system 202 determines the collision threat when the two unconnected bodies approach the critical distance d_c.
- the tracking retargeting system 202 calculates (or determines) 640 a redirected joint motion that affects the motion of the body segment.
- the tracking retargeting system 202 identifies a collision point on the actual surface of the body segment, and calculates (or computes, determines) a redirected motion of the collision point that prevents the unconnected structure from penetrating deeper into the virtual surface (e.g. using Equation 15).
- the tracking retargeting system 202 calculates a redirected joint motion that causes the redirected motion at the collision point (e.g., using Equation 16).
- the tracking retargeting system 202 calculates a redesigned task descriptor of a feature point (e.g., using Equation 17) based on the redirected joint velocity, and calculates 640 a redirected joint motion based on the redirected task descriptor (e.g., using Equation 18).
- the tracking retargeting system 202 then redirects 650 the body segment based on the redirected joint motion to avoid collision with the unconnected structure. Thereafter, the tracking retargeting system 202 continues monitoring 620 the redirected body segment to prevent future collisions with unconnected structures.
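The monitor/detect/redirect cycle of process 600 can be sketched end to end, with geometry reduced to points and all names illustrative (this is not the patent's formulation, which works through Equations 15-18 and the full joint-space solution):

```python
import numpy as np

D_CRIT = 0.05  # critical distance d_c (illustrative)

def process_600_step(p_seg, v_seg, obstacles, d_c=D_CRIT):
    """One control cycle: monitor distances (620), detect penetration of
    the virtual surface (630), and redirect the motion tangentially
    (640/650) for any obstacle inside the critical zone."""
    for p_obs in obstacles:                    # 620: monitor distances
        diff = p_obs - p_seg
        d = np.linalg.norm(diff)
        if d < d_c:                            # 630: virtual surface breached
            n = diff / d
            # 640/650: remove the approach component, slide tangentially
            v_seg = v_seg - max(np.dot(v_seg, n), 0.0) * n
    return v_seg

v = process_600_step(np.array([0.0, 0.0, 0.0]),
                     np.array([0.2, 0.0, 0.0]),
                     obstacles=[np.array([0.03, 0.0, 0.0])])
print(v)  # approach component removed
```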
- the above embodiments describe a motion retargeting system for real-time motion control of robot systems to avoid self collisions and obstacles in motion retargeting.
- the motion retargeting system can be used to control other articulated rigid body systems, in either real or virtual environments (e.g., human figure animation), and for purposes other than motion retargeting (e.g., robotic motion generation and control, human pose estimation, tracking and estimation, and joint torque estimation in biomechanics).
- Some portions of above description describe the embodiments in terms of algorithmic processes or operations, for example, the processes and operations as described with Figure 6. These algorithmic descriptions and representations are commonly used by those skilled in the data processing arts to convey the substance of their work effectively to others skilled in the art.
- any reference to "one embodiment” or “an embodiment” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment.
- the appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
- Some embodiments may be described using the terms "connected" and "coupled" along with their derivatives. It should be understood that these terms are not intended as synonyms for each other. For example, some embodiments may be described using the term "connected" to indicate that two or more elements are in direct physical or electrical contact with each other. In another example, some embodiments may be described using the term "coupled" to indicate that two or more elements are in direct physical or electrical contact. The term "coupled," however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other. The embodiments are not limited in this context.
Abstract
A system, method, and computer program product are provided for avoiding collision of a body segment with unconnected structures in an articulated system. A virtual surface is constructed surrounding an actual surface of the body segment. Distances between the body segment and the unconnected structures are monitored. In response to an unconnected structure penetrating the virtual surface, a redirected joint motion that prevents the unconnected structure from penetrating deeper into the virtual surface is determined. The body segment is redirected based on the redirected joint motion to avoid colliding with the unconnected structure.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010531282A JP5156836B2 (ja) | 2007-10-26 | 2008-10-24 | リアルタイム自己衝突および障害物回避 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US98306107P | 2007-10-26 | 2007-10-26 | |
US60/983,061 | 2007-10-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2009055707A1 true WO2009055707A1 (fr) | 2009-04-30 |
Family
ID=40580059
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2008/081171 WO2009055707A1 (fr) | 2007-10-26 | 2008-10-24 | Auto-évitement de collision et d'obstacle en temps réel |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP5156836B2 (fr) |
WO (1) | WO2009055707A1 (fr) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2011035069A3 (fr) * | 2009-09-15 | 2011-05-05 | Harris Corporation | Appareil robotique mettant en oeuvre un mécanisme anticollision et procédés associés |
CN104097205A (zh) * | 2013-04-07 | 2014-10-15 | 同济大学 | 基于任务空间的机器人实时运动自碰撞避免控制方法 |
EP2952301A1 (fr) * | 2014-06-05 | 2015-12-09 | Aldebaran Robotics | Robot humanoïde avec des capacités d'évitement de collisions et de récupération d'une trajectoire |
WO2016044574A1 (fr) * | 2014-09-17 | 2016-03-24 | Intuitive Surgical Operations, Inc. | Systèmes et procédés pour l'utilisation de jacobien augmenté afin de commander un mouvement d'articulation de manipulateur |
WO2016170144A1 (fr) * | 2015-04-22 | 2016-10-27 | Deutsches Zentrum für Luft- und Raumfahrt e.V. | Commande et/ou régulation de moteurs d'un robot |
CN116152404A (zh) * | 2023-04-19 | 2023-05-23 | 苏州浪潮智能科技有限公司 | 动画重定向方法、装置、计算机设备及存储介质 |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102015204641B4 (de) * | 2014-06-03 | 2021-03-25 | ArtiMinds Robotics GmbH | Verfahren und System zur Programmierung eines Roboters |
EP2952300A1 (fr) * | 2014-06-05 | 2015-12-09 | Aldebaran Robotics | Détection de collision |
JP6378783B2 (ja) * | 2014-12-25 | 2018-08-22 | 川崎重工業株式会社 | アーム型のロボットの障害物自動回避方法及び制御装置 |
ITUA20163608A1 (it) * | 2016-05-19 | 2017-11-19 | Milano Politecnico | Procedimento e dispositivo per il controllo della movimentazione di uno o più robot collaborativi |
TWI741943B (zh) | 2021-02-03 | 2021-10-01 | 國立陽明交通大學 | 機器人控制方法、動作計算裝置及機器人系統 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5675720A (en) * | 1993-09-14 | 1997-10-07 | Fujitsu Limited | Method of searching for points of closest approach, and preprocessing method therefor |
US6708142B1 (en) * | 1999-01-14 | 2004-03-16 | University Of Central Florida | Automatic motion modeling of rigid bodies using collision detection |
US6853964B1 (en) * | 2000-06-30 | 2005-02-08 | Alyn Rockwood | System for encoding and manipulating models of objects |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0682287B2 (ja) * | 1986-10-23 | 1994-10-19 | 新明和工業株式会社 | ロボツトの走査制御方法 |
JPS63300903A (ja) * | 1987-06-01 | 1988-12-08 | Hitachi Ltd | 干渉判別における高速化の方式 |
JPH08108383A (ja) * | 1994-10-05 | 1996-04-30 | Fujitsu Ltd | マニピュレータ制御装置 |
JPH1133958A (ja) * | 1997-07-18 | 1999-02-09 | Ube Ind Ltd | 金型スプレイロボットの簡易干渉チェック方法 |
JP2003089082A (ja) * | 2001-09-17 | 2003-03-25 | National Institute Of Advanced Industrial & Technology | 物体間に働く拘束力検出方法及びシステム |
US7859540B2 (en) * | 2005-12-22 | 2010-12-28 | Honda Motor Co., Ltd. | Reconstruction, retargetting, tracking, and estimation of motion for articulated systems |
US8924021B2 (en) * | 2006-04-27 | 2014-12-30 | Honda Motor Co., Ltd. | Control of robots from human motion descriptors |
2008
- 2008-10-24 WO PCT/US2008/081171 patent/WO2009055707A1/fr active Application Filing
- 2008-10-24 JP JP2010531282A patent/JP5156836B2/ja active Active
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2011035069A3 (fr) * | 2009-09-15 | 2011-05-05 | Harris Corporation | Robotic apparatus implementing collision avoidance scheme and associated methods |
US8386080B2 (en) | 2009-09-15 | 2013-02-26 | Harris Corporation | Robotic apparatus implementing collision avoidance scheme and associated methods |
US8527091B2 (en) | 2009-09-15 | 2013-09-03 | Harris Corporation | Robotic apparatus implementing collision avoidance scheme and associated methods |
CN104097205A (zh) * | 2013-04-07 | 2014-10-15 | Tongji University | Task-space-based control method for real-time self-collision avoidance of robot motion |
AU2015270458B2 (en) * | 2014-06-05 | 2018-05-10 | Softbank Robotics Europe | Humanoid robot with collision avoidance and trajectory recovery capabilities |
EP2952301A1 (fr) * | 2014-06-05 | 2015-12-09 | Aldebaran Robotics | Humanoid robot with collision avoidance and trajectory recovery capabilities |
WO2015185738A3 (fr) * | 2014-06-05 | 2016-01-21 | Aldebaran Robotics | Humanoid robot with collision avoidance and trajectory recovery capabilities |
US10179406B2 (en) | 2014-06-05 | 2019-01-15 | Softbank Robotics Europe | Humanoid robot with collision avoidance and trajectory recovery capabilities |
CN110772323A (zh) * | 2014-09-17 | 2020-02-11 | Intuitive Surgical Operations, Inc. | Systems and methods for utilizing augmented Jacobian to control manipulator joint movement |
US10327855B2 (en) | 2014-09-17 | 2019-06-25 | Intuitive Surgical Operations, Inc. | Systems and methods for utilizing augmented Jacobian to control manipulator joint movement |
WO2016044574A1 (fr) * | 2014-09-17 | 2016-03-24 | Intuitive Surgical Operations, Inc. | Systems and methods for utilizing augmented Jacobian to control manipulator joint movement |
US11213360B2 (en) | 2014-09-17 | 2022-01-04 | Intuitive Surgical Operations, Inc. | Systems and methods for utilizing augmented jacobian to control manipulator joint movement |
CN110772323B (zh) * | 2014-09-17 | 2022-05-17 | Intuitive Surgical Operations, Inc. | Systems and methods for utilizing augmented Jacobian to control manipulator joint movement |
WO2016170144A1 (fr) * | 2015-04-22 | 2016-10-27 | Deutsches Zentrum für Luft- und Raumfahrt e.V. | Controlling and/or regulating motors of a robot |
US10678210B2 (en) | 2015-04-22 | 2020-06-09 | Kastanienbaum GmbH | Controlling and/or regulating motors of a robot |
EP3285975B1 (fr) * | 2015-04-22 | 2023-02-15 | Kastanienbaum GmbH | Controlling and/or regulating motors of a robot |
CN116152404A (zh) * | 2023-04-19 | 2023-05-23 | Suzhou Inspur Intelligent Technology Co., Ltd. | Animation retargeting method and apparatus, computer device, and storage medium |
CN116152404B (zh) * | 2023-04-19 | 2023-07-14 | Suzhou Inspur Intelligent Technology Co., Ltd. | Animation retargeting method and apparatus, computer device, and storage medium |
Also Published As
Publication number | Publication date |
---|---|
JP5156836B2 (ja) | 2013-03-06 |
JP2011500349A (ja) | 2011-01-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8170287B2 (en) | Real-time self collision and obstacle avoidance | |
US8396595B2 (en) | Real-time self collision and obstacle avoidance using weighting matrix | |
WO2009055707A1 (fr) | Real-time self collision and obstacle avoidance | |
Petrič et al. | Smooth continuous transition between tasks on a kinematic control level: Obstacle avoidance as a control problem | |
US8311731B2 (en) | Robots with collision avoidance functionality | |
US8160745B2 (en) | Robots with occlusion avoidance functionality | |
Liu et al. | Control of semi-autonomous teleoperation system with time delays | |
Xu et al. | Motion planning of manipulators for simultaneous obstacle avoidance and target tracking: An RNN approach with guaranteed performance | |
US9205887B2 (en) | Constrained resolved acceleration control | |
US20070255454A1 (en) | Control Of Robots From Human Motion Descriptors | |
WO2007076118A2 (fr) | Reconstruction, retargetting, tracking, and estimation of motion for articulated systems | |
WO2007076119A2 (fr) | Reconstruction, retargetting, tracking, and estimation of motion for articulated systems | |
Dariush et al. | Constrained closed loop inverse kinematics | |
Fallah et al. | Depth-based visual predictive control of tendon-driven continuum robots | |
Heins et al. | Mobile manipulation in unknown environments with differential inverse kinematics control | |
US20210387334A1 (en) | Direct force feedback control method, and controller and robot using the same | |
Bin Hammam et al. | Kinodynamically consistent motion retargeting for humanoids | |
Hsiao et al. | Robust belief-based execution of manipulation programs | |
Ye et al. | Novel two-stage hybrid IBVS controller combining Cartesian and polar based methods | |
Chan et al. | Collision-free visual servoing of an eye-in-hand manipulator via constraint-aware planning and control | |
Vahrenkamp et al. | Planning and execution of grasping motions on a humanoid robot | |
Zube et al. | Model predictive contact control for human-robot interaction | |
Cervantes et al. | Vision-based PID control of planar robots | |
KR20170019916A (ko) | Method and apparatus for generating a grasping trajectory of a robot arm | |
Zheng et al. | Real-time whole-body obstacle avoidance for 7-DOF redundant manipulators |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 08842447 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010531282 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 08842447 Country of ref document: EP Kind code of ref document: A1 |