US8447428B2 - Method for editing movements of a robot - Google Patents

Method for editing movements of a robot

Info

Publication number
US8447428B2
US8447428B2 (application US12/667,580)
Authority
US
United States
Prior art keywords
robot
positions
movement
temporal
computing device
Prior art date: 2007-07-04
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
US12/667,580
Other versions
US20100198403A1 (en)
Inventor
Bruno Maisonnier
Jérôme Monceaux
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aldebaran SAS
Original Assignee
Aldebaran Robotics SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.): 2007-07-04
Filing date: 2008-07-01
Publication date: 2013-05-21
Application filed by Aldebaran Robotics SA
Assigned to ALDEBARAN ROBOTICS S.A. Assignment of assignors interest (see document for details). Assignors: MAISONNIER, BRUNO; MONCEAUX, JEROME
Publication of US20100198403A1
Application granted
Publication of US8447428B2
Assigned to SOFTBANK ROBOTICS EUROPE. Change of name (see document for details). Assignor: ALDEBARAN ROBOTICS

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J 9/1664 characterised by motion, path, trajectory planning
    • B25J 9/1671 characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 2219/00 Program-control systems
    • G05B 2219/30 Nc systems
    • G05B 2219/40 Robotics, robotics mapping to robotics vision
    • G05B 2219/40392 Programming, visual robot programming language
    • G05B 2219/40395 Compose movement with primitive movement segments from database


Abstract

The invention relates to a method for editing movements of a robot from a computing device able to communicate with the robot, the method comprising steps in which: a plurality of reference positions of the robot are generated; at least one temporal sequence of positions is generated, the temporal sequence comprising the plurality of reference positions and transition movements between two successive reference positions; and the robot is made interdependent with movement means included in the computing device, which instigate the movement of the robot. The robot is a real robot and/or a virtual robot displayed on a screen of the computing device.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
This application is a National Stage of International patent application PCT/EP2008/058443, filed on Jul. 1, 2008, which claims priority to foreign French patent application No. FR 0704826, filed on Jul. 4, 2007, the disclosures of which are hereby incorporated by reference in their entirety.
FIELD OF THE INVENTION
The invention relates to a method for editing movements of a robot from a computing device suitable for communicating with the robot.
The invention also relates to a system for editing movements of a robot, including a computing device suitable for communicating with a robot.
The invention also relates to a computer program which, when loaded onto a computing device suitable for communicating with a robot and executed, enables the editing of movements of the robot.
BACKGROUND OF THE INVENTION
Methods for editing movements of a robot are known. Such methods enable a developer or user to edit movements which the robot will then be able to reproduce. This may involve, for example, a walking or dancing movement or, more simply, the lifting or waving of an arm.
In such a known method, a computer program is loaded onto the computing device suitable for communicating with the robot. This computer program generates, for example, an interface which enables the position of one or more joints of the robot to be modified via motors associated with these joints. The interface may appear on a screen of the computing device in the form of a cursor to be moved between two boundaries, or a numerical value to be modified. The modification of the position of the cursor or the numerical value enables the position of a motor of the robot to be modified. This robot may be either a real robot separate from the computing device, or a virtual robot displayed, for example, on a screen of the computing device.
In the known methods, a plurality of positions are therefore defined for the robot thanks to the computer program loaded on the computing device.
These different positions are then integrated into a temporal sequence of positions, referred to as a “timeline”. Such a temporal sequence of positions includes a series of reference positions, referred to as “keyframes”, which are interlinked via a temporal transition in such a way as to define the temporal sequence of positions. In the temporal sequence of positions, the different positions to be assumed by the robot, and therefore the different positions of the motors of the robot, are generated as mentioned above from the interface displayed on the computing device, for example by means of a cursor.
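Such a timeline of keyframes can be pictured as a simple data structure. The following Python sketch is purely illustrative; the names (Keyframe, Timeline, arm_wave) are invented for the example and are not part of the patent:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Keyframe:
    """A reference position: a target angle (degrees) per motor, at a time (s)."""
    time: float
    positions: Dict[str, float]  # e.g. {"shoulder": 45.0, "elbow": 10.0}

@dataclass
class Timeline:
    """A temporal sequence of positions: keyframes linked by transitions."""
    keyframes: List[Keyframe] = field(default_factory=list)

    def add(self, time: float, positions: Dict[str, float]) -> None:
        """Insert a keyframe and keep the sequence ordered in time."""
        self.keyframes.append(Keyframe(time, positions))
        self.keyframes.sort(key=lambda kf: kf.time)

# A short "raise the arm" sequence built from two reference positions.
arm_wave = Timeline()
arm_wave.add(0.0, {"shoulder": 0.0, "elbow": 0.0})
arm_wave.add(1.5, {"shoulder": 90.0, "elbow": 30.0})
```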
An editing method of this type based on a cursor displayed on the computing device offers the advantage of enabling a fine adjustment of the reference positions in the temporal sequence of positions.
However, a method of this type is inconvenient for defining substantial movements of the robot, and therefore for rough adjustment of the positions. In particular, if the movements of the robot are defined on the basis of a cursor actuating a motor, it is complicated to manipulate a cursor for each of the motors that must be actuated to obtain a substantial movement. The known methods for editing movements of a robot therefore do not enable a user to edit a wide range of movements in a simple manner.
SUMMARY OF THE INVENTION
The problem solved by the invention is therefore to facilitate the editing of a wide range of movements of a robot.
This problem is solved, according to a first aspect of the invention, by a method for editing movements of a robot from a computing device suitable for communicating with the robot, the method comprising steps in which:
    • a plurality of reference positions of the robot are generated;
    • a temporal sequence of positions is generated, the temporal sequence of positions including the plurality of reference positions, and transition movements between two successive reference positions;
      in which the step consisting in generating the plurality of reference positions includes steps in which:
    • a reference position of the temporal sequence of positions is generated by movement means of the computing device, the movement means instigating the movement of the robot, in such a way as to define the reference position;
      the step consisting in generating the plurality of reference positions, furthermore including a step in which:
    • the reference position is furthermore generated by moving the robot into a required position;
    • characteristics of the required position are recorded on the computing device;
    • the reference position is defined on the basis of characteristics of the required position.
Thanks to the invention, a user may therefore define the reference positions firstly by a rough adjustment of the position of the robot, by moving the robot physically or virtually, then by a fine adjustment thanks to the movement means of the computing device.
This enables the simple generation of a large number of positions of the robot.
According to one embodiment, the step consisting in generating the plurality of reference positions furthermore includes a step in which:
  • a graphical representation is displayed of the temporal changes in the characteristics of at least one joint of the robot according to the temporal sequence of positions;
  • the graphical representation of the temporal changes is modified in such a way as to modify the temporal sequence of positions;
  • a new temporal sequence of positions is generated according to the modifications of the graphical representation.
According to one embodiment, the robot includes a plurality of joints, and the method includes a step in which:
  • a plurality of temporal sequences of positions are generated, each of the temporal sequences of positions including reference positions corresponding to positions of some of the joints from the plurality of joints.
In this embodiment, the temporal sequences of positions of the plurality of temporal sequences of positions are read simultaneously in such a way as to instigate a movement of the robot for the plurality of joints.
According to one embodiment, the robot is a real robot.
According to a different embodiment, the robot is a virtual robot displayed on a screen of the computing device.
In this case, the virtual robot is generated on the basis of a real robot and a simulator, the simulator being designed in such a way that each movement of the real robot is capable of instigating the same movement for the virtual robot, and each movement of the virtual robot is capable of instigating the same movement for the real robot.
The aforementioned problem is also solved, according to a second aspect of the invention, by a computer program which, when loaded onto a computing device suitable for communicating with a robot and executed, enables the editing of movements of the robot according to the previously described method.
The invention also relates to a system for editing movements of a robot, including a computing device suitable for communicating with the robot, the system including first means to generate a plurality of reference positions of the robot;
  • second means to generate a temporal sequence of positions, the temporal sequence of positions including the plurality of reference positions, and movements between two successive reference positions;
  • movement means suitable for instigating the movement of at least one motor of the robot;
  • third means to generate a reference position of the temporal sequence of positions on the basis of movement means of a computing device in such a way as to define the reference position;
  • fourth means to furthermore generate the reference position by moving the robot into a required position, to record, on the computing device, the characteristics of the required position, and to define the reference position on the basis of the characteristics of the required position.
According to one embodiment of the aforementioned system, the system includes means to display a graphical representation of the temporal changes in the characteristics of at least one motor of the robot according to the temporal sequence of positions; means to modify the graphical representation of the temporal changes in such a way as to modify the temporal sequence of positions; and means to generate a new temporal sequence of positions according to the modifications of the graphical representation.
According to one embodiment, the robot includes a plurality of joints, and the system includes means to generate a plurality of temporal sequences of positions, each of the temporal sequences of positions including reference positions corresponding to positions of some of the joints from the plurality of joints.
In this case, the system comprises means for simultaneously reading the temporal sequences of positions of the plurality of temporal sequences of positions in such a way as to instigate a movement of a robot according to the plurality of joints.
BRIEF DESCRIPTION OF THE DRAWINGS
An embodiment of the invention will now be described with reference to the attached figures, in which:
FIG. 1 shows a system according to the invention;
FIG. 2 shows a first interface for the implementation of the method according to the invention;
FIG. 3 shows a second interface including a temporal sequence of positions for the implementation of the method according to the invention;
FIG. 4 shows a third interface for the implementation of the method according to the invention.
DETAILED DESCRIPTION
FIG. 1 shows, in a general manner, a real robot 1. The invention enables the editing of movements of the real robot 1. The real robot 1 includes motors, in its joints, and sensors enabling the position of the motors to be determined. The real robot 1 is suitable for communicating, i.e. exchanging data, with a computing device 2 via communication means 3. These communication means 3 are, for example, of the WiFi wireless network type.
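For illustration only, a joint command might travel over such a link as a small JSON message. The wire format, address, port, and function below are assumptions of this sketch; the patent only states that the device and the robot exchange data, for example over WiFi:

```python
import json
import socket

def send_joint_command(host: str, port: int, motor: str, angle: float) -> None:
    """Send one joint target to the robot as a newline-terminated JSON object.

    Hypothetical protocol: the patent does not specify a wire format.
    """
    message = json.dumps({"motor": motor, "angle": angle}) + "\n"
    with socket.create_connection((host, port), timeout=2.0) as link:
        link.sendall(message.encode("utf-8"))

# Example (assumed address and port): move the head motor to 15 degrees.
# send_joint_command("192.168.1.42", 9559, "head", 15.0)
```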
FIG. 2 shows a first interface 4 for the implementation of the method for editing movements of a robot according to the invention. This interface 4 is generated on a screen of the computing device 2 if a computer program is loaded onto the computing device 2.
The first interface 4 includes a window 5 providing a virtual representation in the form of a virtual robot 6 of the real robot 1. The virtual robot 6 is implemented through simulation of the real robot 1 using a simulator. The simulator may be activated or de-activated, and enables the visualization in the window 5 of the movements and actions of the real robot 1, and also the possible lights, sounds or words produced by the real robot 1. The simulator incorporates the laws of physics, and causes, for example, the robot to fall if it is unbalanced, and instigates a contact if the real robot touches an object. In order to facilitate the definition of the movements, the simulator also enables the visualization of the image recorded by a camera integrated in the real robot 1. This enables the definition of the movements of the real robot 1 on the basis of the virtual robot 6, while taking account of the environment of the real robot 1. The simulator is linked to the real robot 1 in such a way that all of the behaviors of the real robot 1 may be carried out on the virtual robot 6. If the simulator is activated, a movement on the virtual robot 6 instigates the same movement on the real robot 1. Similarly, a movement on the real robot 1 instigates the same movement on the virtual robot 6. A plurality of real robots 1 may also be represented in the window 5 in the form of a plurality of virtual robots 6, even if the real robots 1 are not physically close, in such a way that the real robots 1 can be controlled in a grouped and synchronized manner from the simulator.
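The two-way link between the real robot 1 and the virtual robot 6 can be pictured as mirrored state, as in the following sketch; the class and method names are invented for the example, which models joint angles only:

```python
class MirroredRobot:
    """Keeps a real robot and its simulated counterpart in the same state."""

    def __init__(self) -> None:
        self.real = {}       # motor name -> angle on the real robot
        self.virtual = {}    # motor name -> angle on the virtual robot
        self.mirroring = True

    def move_real(self, motor: str, angle: float) -> None:
        self.real[motor] = angle
        if self.mirroring:
            self.virtual[motor] = angle   # real movement replayed virtually

    def move_virtual(self, motor: str, angle: float) -> None:
        self.virtual[motor] = angle
        if self.mirroring:
            self.real[motor] = angle      # virtual movement replayed on the robot

robot = MirroredRobot()
robot.mirroring = False            # edit risky movements on the virtual robot only
robot.move_virtual("knee", 60.0)   # the real robot does not move
```

De-activating the mirroring in this sketch corresponds to the partial de-activation of the simulator discussed in the next paragraph.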
Since the simulator equally enables the control of both the real robot 1 and the virtual robot 6, the description of the invention which follows and the actions on the robot apply identically to the virtual robot 6 and to the real robot 1. If the user defines complex movements in which the robot risks falling, the virtual robot will be preferred, with the simulator at least partially de-activated so that the movements are not replayed on the real robot 1, thereby avoiding damage to the real robot 1.
In the continuing description, actions on the real robot 1 will be described, but it is understood that the same actions can be applied to the virtual robot 6.
The zone 7 of the interface 4 includes a plurality of cursors 7a, 7b, 7c capable of being moved, for example, with a mouse associated with the computing device 2. Each of the cursors defines a position of a motor of the real robot 1 and therefore of a joint of the real robot 1. When one of the cursors is moved to a given value, the associated motor is moved according to the movement of the cursor. The cursors 7a, 7b, 7c therefore correspond to movements of a motor in a joint of the robot, for example the motors of the head, elbow and shoulder respectively.
The cursors 7a, 7b, 7c enable a fine adjustment of the position of the associated motors.
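A cursor bounded by two limits maps naturally to a motor angle. The sketch below assumes a cursor normalized to [0, 1], which is a convention of this example rather than of the patent:

```python
def cursor_to_angle(cursor: float, low: float, high: float) -> float:
    """Map a cursor position in [0, 1] to a motor angle between two boundaries."""
    cursor = max(0.0, min(1.0, cursor))    # keep the cursor inside its track
    return low + cursor * (high - low)     # interpolate between the limits

# A shoulder limited to [-120, 120] degrees: a cursor at 0.75 gives 60 degrees.
print(cursor_to_angle(0.75, -120.0, 120.0))   # 60.0
```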
According to the invention, it is also possible to define a position of the real robot 1 by moving the real robot 1 itself, for example by hand. This manual movement defines a required position for the robot. This required position is then recorded by the computing device 2. This type of movement notably enables a rough definition of the positions of the robot, for example by making the arm of the robot move from a low position to a high position by actuating the motor of the shoulder. The changes in positions or movements of the cursors 7a, 7b and 7c and of the robot are thus interdependent, or linked.
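Recording the required position amounts to reading back each joint sensor once the robot has been placed by hand. In the following sketch, `read_sensor` and the motor list are hypothetical stand-ins for the robot's sensor interface:

```python
from typing import Callable, Dict

def record_required_position(read_sensor: Callable[[str], float]) -> Dict[str, float]:
    """Capture the pose set by hand, by reading the sensor of each motor."""
    motors = ["head", "shoulder", "elbow"]   # assumed subset of the robot's joints
    return {m: read_sensor(m) for m in motors}

# With a stubbed sensor, a raised-arm pose is recorded as a dict of angles.
stub = {"head": 0.0, "shoulder": 90.0, "elbow": 20.0}
pose = record_required_position(lambda motor: stub[motor])
```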
Following this rough movement, a fine movement with the aid of the previously described cursors 7a, 7b and 7c enables fine adjustment of the position of the robot. The combination of the fine and rough movements previously described enables a user to edit, in a simple manner, a large number of movements of the robot.
In order to further simplify the definition of the positions of the robot, the interface 4 may also include a window 14 containing a list of predefined positions P1, P2, P3. The parameters of the joints associated with the motors are associated with each of these positions. By clicking on an element from this list, for example using a mouse, the robot assumes the position defined by these parameters.
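Such a list of predefined positions behaves like a small library of named poses. The entries below (P1, P2, P3) carry invented joint parameters, purely for illustration:

```python
# Hypothetical parameters for the predefined positions of window 14.
PREDEFINED = {
    "P1": {"shoulder": 0.0,  "elbow": 0.0,  "head": 0.0},    # rest
    "P2": {"shoulder": 90.0, "elbow": 0.0,  "head": 0.0},    # arm raised
    "P3": {"shoulder": 45.0, "elbow": 60.0, "head": 10.0},   # start of a wave
}

def apply_position(robot_state: dict, name: str) -> None:
    """Set every listed motor to the stored parameters (a click in the list)."""
    robot_state.update(PREDEFINED[name])

state = {}
apply_position(state, "P2")   # the robot assumes the arm-raised position
```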
Reference positions of the robot are defined on the basis of these different means of positioning the robot. As shown in FIG. 3, these reference positions are inserted into a temporal sequence of positions 9a, 9b, including reference frames 10. Each of the reference frames 10 corresponds to a reference position 10 of the robot, i.e. to the definition of the characteristics of each motor of the robot, or of some of the motors of the robot.
In order to generate the temporal sequence of positions 9a, 9b on the basis of the reference positions 10, a movement is defined corresponding to a transition 11 between two reference positions 10. The transition may be a linear transition, in which the movement of each joint associated with a motor is regular and at a constant speed between the two reference positions 10. The transition may also be a smooth transition, in which the movement presents an acceleration phase then a deceleration phase, in such a way that the overall movement does not present any spurt at the reference positions. Other types of transition may also be added and defined by a user.
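The two built-in transitions correspond to two interpolation laws between reference positions. In the sketch below, the linear law runs at constant speed, while the smooth law uses a cosine easing so that the speed is zero at each keyframe; the cosine is one common choice, and the patent does not prescribe a particular smoothing function:

```python
import math

def linear(a: float, b: float, t: float) -> float:
    """Linear transition: constant speed for t in [0, 1]."""
    return a + (b - a) * t

def smooth(a: float, b: float, t: float) -> float:
    """Smooth transition: acceleration then deceleration, no spurt at the ends."""
    eased = (1.0 - math.cos(math.pi * t)) / 2.0   # 0 -> 1 with zero end slopes
    return a + (b - a) * eased

# Halfway through, both transitions reach the same position (45 degrees)...
print(linear(0.0, 90.0, 0.5), smooth(0.0, 90.0, 0.5))   # 45.0 45.0
# ...but near a reference position the smooth transition moves far more slowly.
print(linear(0.0, 90.0, 0.1), smooth(0.0, 90.0, 0.1))   # 9.0 vs roughly 2.2
```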
According to the invention, different temporal sequences of positions 9a, 9b can be used for different subsets of the set of motors and joints of the robot. For example, the sequence 9a may correspond to movements of the motors of the upper part of the robot, including, notably, the shoulders, elbows and head, and the sequence 9b may correspond to movements of the motors of the lower part of the robot, including, notably, the hips, knees and ankles. This substantially extends the possibilities for defining the movements of the robot, since a reference frame does not have to include the definition of the positions of all of the joints, but only some of them. By instigating the different temporal sequences 9a, 9b in a synchronized manner, a movement of the entire robot is then obtained. Each of the sequences 9a, 9b can be activated or de-activated and instigated independently of the other sequences. In the event of a conflict between two sequences, i.e. if, during the reading of two distinct sequences, the same joint must assume two different positions at the same time, it is possible either to prioritize one sequence over the other, or to proceed with an average of the two positions.
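The two conflict policies, priority and averaging, can be written as a small merging rule applied to the joint targets read from two sequences at the same instant. The function below is an illustrative sketch, not the patent's implementation:

```python
from typing import Dict

def resolve(seq_a: Dict[str, float], seq_b: Dict[str, float],
            prioritize_a: bool = True) -> Dict[str, float]:
    """Merge the joint targets of two sequences read at the same instant."""
    merged = dict(seq_b)
    for joint, angle in seq_a.items():
        if joint in merged and not prioritize_a:
            merged[joint] = (angle + merged[joint]) / 2.0   # averaging policy
        else:
            merged[joint] = angle                           # priority policy
    return merged

upper = {"shoulder": 90.0, "head": 0.0}
lower = {"hip": 10.0, "shoulder": 30.0}             # conflict on "shoulder"
print(resolve(upper, lower, prioritize_a=False))    # shoulder -> 60.0
```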
It is also possible to specify the time interval between two reference frames and copy the characteristics from one frame into another frame. In order to verify that the movement of the robot defined in the temporal sequence of positions corresponds exactly to the required movement, the defined sequence can be played with the aid of a sequence-reading function.
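A sequence-reading function reduces to sampling each transition at a fixed rate and forwarding the interpolated targets to the motors. In this sketch, `set_motor` is a hypothetical output callback and linear transitions are assumed (see the easing sketch above for smooth ones):

```python
import time
from typing import Callable, Dict, List, Tuple

Frame = Tuple[float, Dict[str, float]]   # (time in seconds, motor -> angle)

def play(frames: List[Frame], set_motor: Callable[[str, float], None],
         fps: float = 25.0) -> None:
    """Play a temporal sequence of positions by sampling each transition."""
    for (t0, pose0), (t1, pose1) in zip(frames, frames[1:]):
        steps = max(1, int((t1 - t0) * fps))
        for i in range(steps + 1):
            u = i / steps
            for motor, a in pose0.items():
                b = pose1.get(motor, a)            # hold if no new target
                set_motor(motor, a + (b - a) * u)
            time.sleep((t1 - t0) / steps)

# Raise the elbow to 45 degrees over one second, printing each command.
# play([(0.0, {"elbow": 0.0}), (1.0, {"elbow": 45.0})], print)
```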
When the temporal sequence of positions is defined, the position of a particular joint can be displayed as a function of time. A window 12 implementing such a display is shown in FIG. 4. In FIG. 4, the window 12 includes a diagram 13 illustrating the changes in the position of a joint associated with a motor of the robot. The diagram 13 is editable and can be modified with the aid of a mouse associated with the computing device 2. The editing of the diagram 13 also enables a fine adjustment of the transitions between two reference positions, and a fine adjustment of the positions of the joint within a reference frame, in order to define a new reference position.
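The curve of diagram 13 is simply one joint's position sampled as a function of time. The sketch below assumes linear transitions; in this simplified model, editing a point of the curve amounts to changing the matching keyframe value:

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class KF:                     # minimal stand-in for a reference frame
    time: float
    positions: Dict[str, float]

def sample_joint(frames: List[KF], joint: str, t: float) -> float:
    """Position of one joint at time t, with linear transitions assumed."""
    for prev, nxt in zip(frames, frames[1:]):
        if prev.time <= t <= nxt.time:
            a = prev.positions.get(joint, 0.0)
            b = nxt.positions.get(joint, a)
            return a + (b - a) * (t - prev.time) / (nxt.time - prev.time)
    return frames[-1].positions.get(joint, 0.0)

# The 50 points behind a shoulder curve rising from 0 to 90 degrees in 1.5 s.
frames = [KF(0.0, {"shoulder": 0.0}), KF(1.5, {"shoulder": 90.0})]
curve = [(i * 1.5 / 49, sample_joint(frames, "shoulder", i * 1.5 / 49))
         for i in range(50)]
```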
All of the functions defined above thus enable a particularly simple editing of the movements of the robot, even by a novice user of robotics programming. Furthermore, each of the positions or temporal sequences of positions defined thanks to the invention may, having been stored in a memory, be exchanged by users, in such a way that a community of numerous users can improve the editing of the movements of the robot, each of the users of the community being able to use the movements edited by the other users of the community.

Claims (14)

The invention claimed is:
1. A method for editing movements of a robot from a computing device suitable for communicating with the robot, the method comprising steps in which:
a plurality of reference positions of the robot are generated;
at least one temporal sequence of positions is generated, the temporal sequence of positions including the plurality of reference positions, and transition movements between two successive reference positions;
an interdependence being created between the robot and movement means included in the computing device and instigating the movement of the robot, and
in that the robot is a real robot and/or a virtual robot displayed on a screen of the computing device.
2. The method as claimed in claim 1, wherein the step consisting in generating the plurality of reference positions includes a step in which a reference position of the temporal sequence of positions is generated by movement means included in the computing device, the movement means instigating the movement of the robot, in such a way as to define the reference position.
3. The method as claimed in claim 1, wherein the step consisting in generating the plurality of reference positions furthermore includes a step in which:
the reference position is furthermore generated by moving the robot into a required position;
characteristics of the required position are recorded on the computing device,
the reference position is defined on the basis of characteristics of the required position.
4. The method as claimed in claim 1, wherein the step consisting in generating the plurality of reference positions furthermore includes a step in which:
a graphical representation is displayed of the temporal changes in the characteristics of at least one joint of the robot according to the temporal sequence of positions;
the graphical representation of the temporal changes is modified in such a way as to modify the temporal sequence of positions;
a new temporal sequence of positions is generated according to modifications of the graphical representation.
5. The method as claimed in claim 1, wherein the robot includes a plurality of joints, and in which
a plurality of temporal sequences of positions are generated, each of the temporal sequences of positions including reference positions corresponding to positions of some of the joints from the plurality of joints.
6. The method as claimed in claim 1, wherein the temporal sequences of positions of the plurality of temporal sequences of positions are read simultaneously, in such a way as to instigate a movement of the robot for the plurality of joints.
7. The method as claimed in claim 1, wherein the virtual robot is generated on the basis of a real robot and a simulator, the simulator being designed in such a way that each movement of the real robot is capable of instigating the same movement for the virtual robot, and each movement of the virtual robot is capable of instigating the same movement for the real robot.
8. A system for editing movements of a robot, including a computing device suitable for communicating with the robot, the system including:
first means to generate a plurality of reference positions of the robot;
second means to generate a temporal sequence of positions, the sequence including the plurality of reference positions, the movement of the robot between two reference positions being generated on the basis of an interpolation;
movement means suitable for instigating a movement of the robot, interdependent with the robot,
the robot being a real robot and/or a virtual robot displayed on a screen of the computing device.
9. The system as claimed in claim 8, further including third means to generate a reference position of the temporal sequence of positions on the basis of the movement means included in the computing device in such a way as to define the reference position.
10. The system as claimed in claim 8, further including fourth means for furthermore generating the reference position by moving the robot into a required position, to record, on the computing device, the characteristics of the required position, and to define the reference position on the basis of the characteristics of the required position.
11. The system as claimed in claim 8, including means to display a graphical representation of the temporal changes in the characteristics of at least one motor of the robot according to the temporal sequence of positions; means to modify the graphical representation of the temporal changes in such a way as to modify the temporal sequence of positions, and means to generate a new temporal sequence of positions according to the modifications of the graphical representation.
12. The system as claimed in claim 8, wherein the robot includes a plurality of joints, and in which the system includes means to generate a plurality of temporal sequences of positions, each of the temporal sequences of positions including reference positions corresponding to positions of some of the joints from the plurality of joints.
13. The system as claimed in claim 12, including means to read simultaneously the temporal sequences of positions of the plurality of temporal sequences of positions in such a way as to instigate a movement of the robot according to the plurality of joints.
14. A nontransitory computer readable medium for editing movements of a robot, the medium including a program thereon which, when executed by a computing device suitable for communicating with the robot, causes the computing device to perform:
generate a plurality of reference positions of the robot;
generate at least one temporal sequence of positions, the temporal sequence of positions including the plurality of reference positions, and transition movements between two successive reference positions; and
create an interdependence between the robot and movement means included in the computing device and instigating the movement of the robot,
wherein the robot is a real robot and/or a virtual robot displayed on a screen of the computing device.
US12/667,580 2007-07-04 2008-07-01 Method for editing movements of a robot Expired - Fee Related US8447428B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
FR0704826 2007-07-04
FR0704826A FR2918477A1 (en) 2007-07-04 2007-07-04 METHOD FOR EDITING MOVEMENTS OF A ROBOT
PCT/EP2008/058443 WO2009004004A1 (en) 2007-07-04 2008-07-01 Method of editing movements of a robot

Publications (2)

Publication Number Publication Date
US20100198403A1 US20100198403A1 (en) 2010-08-05
US8447428B2 (en) 2013-05-21

Family

ID=39271406

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/667,580 Expired - Fee Related US8447428B2 (en) 2007-07-04 2008-07-01 Method for editing movements of a robot

Country Status (6)

Country Link
US (1) US8447428B2 (en)
EP (1) EP2168018B1 (en)
JP (1) JP2010531743A (en)
AT (1) ATE556365T1 (en)
FR (1) FR2918477A1 (en)
WO (1) WO2009004004A1 (en)


Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2946160B1 (en) * 2009-05-26 2014-05-09 Aldebaran Robotics SYSTEM AND METHOD FOR EDIT AND ORDER BEHAVIOR OF MOBILE ROBOT.
KR100968944B1 (en) * 2009-12-14 2010-07-14 (주) 아이알로봇 Apparatus and method for synchronizing robot
KR101572042B1 (en) 2014-03-19 2015-11-25 (주)로보티즈 Device, method and computer-readable recording medium for editing and playing robot motion
JP6598454B2 (en) * 2014-11-14 2019-10-30 株式会社クリエイティブマシン Teaching data creation method, creation device, and creation program
US20170043478A1 (en) * 2015-08-14 2017-02-16 Sphero, Inc. Data exchange system
JP6708581B2 (en) * 2017-04-07 2020-06-10 ライフロボティクス株式会社 Teaching device, display device, teaching program and display program
EP3629972A4 (en) 2017-05-25 2021-03-03 Covidien LP Event initiated release of function selection control for robotic surgical systems
JP7146402B2 (en) * 2018-01-18 2022-10-04 キヤノン株式会社 Information processing device and information processing method
CN114029949A (en) * 2021-11-08 2022-02-11 北京市商汤科技开发有限公司 Robot action editing method and device, electronic equipment and storage medium


Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5231693A (en) * 1991-05-09 1993-07-27 The United States Of America As Represented By The Administrator, National Aeronautics And Space Administration Telerobot control system
JPH10329068A (en) * 1997-05-30 1998-12-15 Tokico Ltd Teaching device for robot
JP4280338B2 (en) * 1998-10-12 2009-06-17 株式会社アマダ Teaching method and apparatus for YAG laser processing machine
JP4765155B2 (en) * 2000-09-28 2011-09-07 ソニー株式会社 Authoring system, authoring method, and storage medium
JP4271193B2 (en) * 2003-08-12 2009-06-03 株式会社国際電気通信基礎技術研究所 Communication robot control system
JP4592276B2 (en) * 2003-10-24 2010-12-01 ソニー株式会社 Motion editing apparatus, motion editing method, and computer program for robot apparatus

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4541060A (en) * 1982-07-31 1985-09-10 Hitachi, Ltd. Path control method and apparatus
US4761745A (en) * 1985-11-07 1988-08-02 Mitsubishi Denki Kabushiki Kaisha Off-line programming for a robot with division of the work sequence into plural jobs
US4803640A (en) * 1986-05-12 1989-02-07 Matsushita Electric Industrial Co., Ltd. Control arrangement for an industrial robot
US4831549A (en) * 1987-07-28 1989-05-16 Brigham Young University Device and method for correction of robot inaccuracy
US4987527A (en) * 1987-10-26 1991-01-22 Hitachi, Ltd. Perspective display device for displaying and manipulating 2-D or 3-D cursor, 3-D object and associated mark position
US5046022A (en) * 1988-03-10 1991-09-03 The Regents Of The University Of Michigan Tele-autonomous system and method employing time/position synchrony/desynchrony
US5250886A (en) * 1988-12-27 1993-10-05 Canon Kabushiki Kaisha Method of controlling robot and robot control apparatus
US5467430A (en) * 1990-10-29 1995-11-14 Mitsubishi Denki Kabushiki Kaisha Robot controlling method and apparatus
US5355064A (en) * 1992-03-04 1994-10-11 Honda Giken Kogyo Kabushiki Kaisha Control system for legged mobile robot
US5675229A (en) * 1994-09-21 1997-10-07 Abb Robotics Inc. Apparatus and method for adjusting robot positioning
US6463358B1 (en) * 1996-11-26 2002-10-08 Fanuc Ltd. Robot control device having operation route simulation function
US6535793B2 (en) * 2000-05-01 2003-03-18 Irobot Corporation Method and system for remote control of mobile robot
US6845297B2 (en) * 2000-05-01 2005-01-18 Irobot Corporation Method and system for remote control of mobile robot
US7057643B2 (en) * 2001-05-30 2006-06-06 Minolta Co., Ltd. Image capturing system, image capturing apparatus, and manual operating apparatus
US20040012593A1 (en) 2002-07-17 2004-01-22 Robert Lanciault Generating animation data with constrained parameters
US7236854B2 (en) * 2004-01-05 2007-06-26 Abb Research Ltd. Method and a system for programming an industrial robot

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140285482A1 (en) * 2013-03-22 2014-09-25 Kt Corporation Apparatus and method for developing robot contents
KR20140115902A (en) * 2013-03-22 2014-10-01 주식회사 케이티 Apparatus and method for developing robot contents
US9495788B2 (en) * 2013-03-22 2016-11-15 Kt Corporation Apparatus and method for developing robot contents
US10510189B2 (en) 2014-04-16 2019-12-17 Sony Interactive Entertainment Inc. Information processing apparatus, information processing system, and information processing method
US9592603B2 (en) 2014-12-01 2017-03-14 Spin Master Ltd. Reconfigurable robotic system
US9737986B2 (en) 2014-12-01 2017-08-22 Spin Master Ltd. Reconfigurable robotic system
US9981376B2 (en) 2014-12-01 2018-05-29 Spin Master Ltd. Reconfigurable robotic system

Also Published As

Publication number Publication date
EP2168018A1 (en) 2010-03-31
US20100198403A1 (en) 2010-08-05
ATE556365T1 (en) 2012-05-15
WO2009004004A1 (en) 2009-01-08
EP2168018B1 (en) 2012-05-02
FR2918477A1 (en) 2009-01-09
JP2010531743A (en) 2010-09-30


Legal Events

Date Code Title Description
AS Assignment

Owner name: ALDEBARAN ROBOTICS S.A., FRANCE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MAISONNIER, BRUNO;MONCEAUX, JEROME;REEL/FRAME:023799/0160

Effective date: 20100114

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: PAT HOLDER NO LONGER CLAIMS SMALL ENTITY STATUS, ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: STOL); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

AS Assignment

Owner name: SOFTBANK ROBOTICS EUROPE, FRANCE

Free format text: CHANGE OF NAME;ASSIGNOR:ALDEBARAN ROBOTICS;REEL/FRAME:043207/0318

Effective date: 20160328

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20210521