EP2855105A1 - System and method for generating contextual behaviours of a mobile robot executed in real time - Google Patents
Info
- Publication number
- EP2855105A1 (application EP13728694.4A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- module
- robot
- behavior
- text
- editing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1661—Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/30—Creation or generation of source code
- G06F8/34—Graphical or visual programming
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/004—Artificial life, i.e. computing arrangements simulating life
- G06N3/008—Artificial life, i.e. computing arrangements simulating life based on physical entities controlled by simulated intelligence so as to replicate intelligent life forms, e.g. based on robots replicating pets or humans in their appearance or behaviour
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40099—Graphical user interface for robotics, visual robot user interface
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40392—Programming, visual robot programming language
Definitions
- the present invention belongs to the field of robot programming systems. More precisely, it applies to the control of behaviors coherent with the context in which the robot, in particular one of human or animal form, evolves, expresses itself and moves, whether or not on articulated limbs.
- a robot can be called humanoid from the moment it has certain attributes of human appearance and functionality: a head, a trunk, two arms, possibly two hands, two legs, two feet, etc.
- One of the features likely to give the robot a quasi-human appearance and behavior is the possibility of ensuring a strong coupling between gestural and oral expression. In particular, achieving this result intuitively allows new groups of users to access humanoid robot behavior programming.
- the patent application WO2011/003628 discloses a system and a method responding to this general problem.
- the invention disclosed by this application makes it possible to overcome some of the disadvantages of the prior art, in which specialized programming languages accessible only to professional programmers were used.
- languages specialized in the programming of behaviors at the functional or intentional level, independently of physical actions, such as FML (Function Markup Language), or at the level of the behaviors themselves (which involve several parts of the virtual character to perform a function), such as BML (Behavior Markup Language), remain accessible only to professional programmers and do not integrate with scripts written in everyday language.
- the invention covered by the cited patent application does not control the robot in real time, because it uses an editor that is not able to send commands directly to the robot in "streaming" mode, that is to say one that can interact in real time with the behavior of a robot that can evolve according to changes in its environment.
- a scenario must therefore be replayed from the beginning when an event not provided for in the command scenario occurs.
- the robot of the invention is provided with an editor and a command interpreter that can integrate, graphically within thumbnails, the texts and behaviors of a scenario that can be executed as soon as they are issued.
- the present invention discloses a system for editing and controlling at least one scenario, said at least one scenario comprising at least one behavior to be executed and a text to be pronounced by at least one robot equipped with motor and speech capabilities, said system comprising a module for editing said behaviors and texts, said editing module being autonomous with respect to said robot and comprising an input sub-module for said text to be pronounced by the robot and a sub-module for managing the behaviors, said system being characterized in that said editing module further comprises a sub-module for graphically representing and associating said at least one behavior and said at least one text in at least one combined display area of said at least one behavior and said at least one text, said combined display area constituting a thumbnail, said thumbnail constituting a computing object compilable to be executed on said robot.
- said at least one thumbnail comprises at least one graphic object belonging to the group comprising a waiting icon, a robot behavior icon and a text bubble comprising at least one word, said text to be pronounced by the robot.
- said behavior icon of a thumbnail includes a graphical mark representing a personality and/or emotion of the robot associated with at least one text bubble in the thumbnail.
- said graphical representation of said scenario further comprises at least one synchronization banner of the sequence of actions represented by said at least one thumbnail.
- the editing and control system of the invention further comprises a module for interpreting said scenarios, said interpretation module being embedded on said at least one robot and communicating with the editing module in streaming mode.
- the interpretation module of said scenarios comprises a conditioning sub-module for at least one scenario, said sub-module being configured to provide said at least one input scenario with an identifier and a type.
- the interpretation module of said scenarios comprises a compilation sub-module of said at least one behavior, said sub-module being configured to associate with said behavior the attributes of an object structure.
- said compilation sub-module is configured to split said scenarios into subsets delimited by a punctuation mark or an end of line.
- the interpretation module of said scenarios comprises a control sub-module for pre-loading said at least one behavior in the robot memory for execution by said behavior execution module.
- the interpretation module of said scenarios comprises a synchronization sub-module of said at least one text with said at least one behavior.
- the invention also discloses a method for editing and controlling at least one scenario, said at least one scenario comprising at least one behavior to be executed and a text to be pronounced by at least one robot equipped with motor and speech capabilities, said method comprising a step of editing said behaviors and texts, said editing step being autonomous with respect to said robot and comprising a substep of inputting said text to be pronounced by the robot and a substep of behavior management, said method being characterized in that said editing step further comprises a substep of graphically representing and associating said at least one behavior and said at least one text in at least one thumbnail.
- the invention also discloses a computer program comprising program code instructions for executing the method of the invention when the program is executed on a computer, said program being adapted to allow the editing of at least one scenario, said at least one scenario comprising at least one behavior to be executed and a text to be pronounced by at least one robot equipped with motor and speech capabilities, said computer program comprising a module for editing said behaviors and texts, said editing module being autonomous with respect to said robot and comprising an input sub-module for said text to be pronounced by the robot and a behavior management sub-module, said computer program being characterized in that said editing module further comprises a sub-module for graphically representing and associating said at least one behavior and said at least one text in at least one thumbnail.
- the invention also discloses a computer program comprising program code instructions for executing the method according to the invention when the program is executed on a computer, said program being adapted to allow the interpretation of at least one scenario, said at least one scenario comprising at least one behavior to be executed and a text to be pronounced by at least one robot equipped with motor and speech capabilities, said computer program comprising a module for interpreting said scenarios, said interpretation module being embedded on said at least one robot and communicating with an external platform in streaming mode.
- the interpretation module of said scenarios comprises a compilation sub-module of said at least one behavior, said sub-module being configured to associate with said behavior the attributes of an object structure.
- the interpretation module of said scenarios comprises a control sub-module for pre-loading said at least one behavior in the robot memory for execution by said behavior execution module (460).
- the interpretation module of said scenarios comprises a synchronization sub-module of said at least one text with said at least one behavior.
- the invention makes it possible to create behavioral libraries and to easily insert them into a script of scenes played by the robot.
- the behaviors are modeled by graphic thumbnails, each thumbnail representing the gestural and emotional behaviors of the robot, as well as its words and the elements of the environment (music, images, words of other characters, etc.).
- This scenario creation interface is intuitive and allows the user to easily create complex scenarios that can be adapted in real time.
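The organization described above, scenarios built from thumbnails that pair behavior icons with text bubbles, can be pictured as a small data structure. This is only an illustrative sketch: the field names below are assumptions, not the patent's actual object model.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Bubble:
    text: str  # text the robot will pronounce

@dataclass
class Thumbnail:
    """One compilable unit of a scenario: behaviors plus the text they accompany."""
    behavior_icons: List[str] = field(default_factory=list)  # e.g. gestural marks
    bubbles: List[Bubble] = field(default_factory=list)
    wait_for: Optional[str] = None  # waiting icon: event to await before playing

@dataclass
class Scenario:
    thumbnails: List[Thumbnail] = field(default_factory=list)

# A one-thumbnail scenario: the robot bows while saying hello.
scenario = Scenario(thumbnails=[
    Thumbnail(behavior_icons=["bow"], bubbles=[Bubble("Hello!")]),
])
```

Modeling the thumbnail as a plain value object mirrors the claim that it is "a computing object compilable to be executed on said robot".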
- the invention also satisfactorily completes French Patent Application No. 09/53434 relating to a system and a method for editing and controlling the behavior of a mobile robot belonging to the Applicant.
- This provides means to execute behaviors by a robot, said behaviors being controllable either using a specialized script language accessible to programmers, or graphically by using preprogrammed libraries that can be selected and inserted in a series of behavior boxes linked by events.
- the invention makes it possible to further simplify the programming interface of the behavior of the robot.
- FIG. 1 represents the physical architecture of a system for implementing the invention according to several embodiments.
- FIG. 2 represents a general flowchart of the processing operations according to several embodiments of the invention.
- FIG. 3 represents a flowchart of the processing carried out in a command editing module according to several embodiments of the invention.
- FIG. 4 represents a flowchart of the processing carried out in a command interpreting module according to several embodiments of the invention.
- FIGS. 5a and 5b show thumbnails constituting a scenario executed by a robot in one embodiment of the invention.
- FIG. 1 represents the physical architecture of a system for implementing the invention according to several embodiments.
- a humanoid robot 110 is shown in the figure in one embodiment of the invention. Such a robot has been disclosed in particular in the patent application WO2009/124951 published on January 15, 2009. This platform served as a basis for the improvements that led to the present invention. In the remainder of the description, this humanoid robot may be referred to indifferently under this generic name or under its trademark NAO™, without the generality of the reference being modified.
- the robot has about two dozen electronic cards controlling the sensors and actuators that drive the joints.
- the electronic control card includes a commercial microcontroller. It can be, for example, a DSPIC™ from the company Microchip: a 16-bit MCU coupled to a DSP, with a servo loop cycle of one ms.
- the robot can also include other types of actuators, notably LEDs (light-emitting diodes) whose color and intensity can reflect the emotions of the robot. It may also include other types of position sensors, notably an inertial unit, FSRs (ground pressure sensors), etc.
- the head contains the intelligence of the robot, including the card that performs the high-level functions allowing the robot to accomplish the tasks assigned to it, including, in the context of the present invention, the execution of scenarios written by a user who is not a professional programmer.
- the head may also include specialized cards, especially for speech or vision processing, or for the processing of service inputs/outputs, such as the encoding necessary to open a port to establish a remote communication over a Wide Area Network (WAN).
- the card processor can be a commercial x86 processor. A low-power processor is preferably chosen, for example an ATOM™ from Intel (32 bits, 1600 MHz).
- the card also includes a set of RAM and flash memories.
- This card also manages the communication of the robot with the outside world (behavior server, other robots, etc.), normally over a WiFi or WiMax transmission layer, possibly over a public mobile data communication network with standard protocols, possibly encapsulated in a VPN.
- the processor is normally controlled by a standard OS, which makes it possible to use the usual high-level languages (C, C++, Python, etc.) or specific artificial intelligence languages like URBI (a programming language specialized in robotics) for programming high-level functions.
- the robot 110 will be able to perform behaviors for which it may have been programmed in advance, including code generated according to the invention disclosed in the French patent application No. 09/53434 already cited, said code having been created by a programmer in a graphical interface.
- These behaviors may also have been arranged in a scenario created by a user who is not a professional programmer, using the invention disclosed in the patent application WO2011/003628 also already mentioned.
- it may be behaviors articulated among themselves according to a relatively complex logic in which the sequences of behaviors are conditioned by the events that occur in the environment of the robot.
- a user with a minimum of programming skills can use the Choregraphe™ workshop, whose main operating modes are described in the cited application.
- the flow logic of the scenario is not in principle adaptive.
- a user who is not a professional programmer, 120, is able to produce a complex scenario comprising sets of behaviors including various gestures and movements, sound or visual signal emissions, and speech forming questions and responses, these different elements being represented together graphically by icons on a sequence of thumbnails (see Figure 5).
- the thumbnails are, as we will see later, the programming interface of the story that will be played by the robot.
- FIG. 2 represents a general flowchart of the processing operations according to several embodiments of the invention.
- the PC 120 includes a software module 210 for graphically editing the commands that will be passed to the robot or robots.
- its architecture and operation are detailed in the commentary on Figure 3.
- the PC communicates with the robot and transmits to it the thumbnails, which will be interpreted and executed by the thumbnail interpretation software module 220.
- the architecture and operation of this module 220 are detailed in the commentary on FIG. 4.
- the user's PC communicates with the robot via a wired or radio interface, or both, in the case where the robot and the user are located in remote locations and communicate over a wide area network. This last case is not shown in the figure but is one of the possible embodiments of the invention.
- FIG. 3 represents a flowchart of the processes performed in a command editing module according to several embodiments of the invention.
- the editing module 210 comprises a scenario collector 310 which is in communication with scenario files 31.
- the scenarios can be viewed and modified in a scenario editor 320 which can simultaneously hold several scenarios 3210 in memory. A scenario generally corresponds to a text and is constituted by a succession of thumbnails.
- the editing module includes a thumbnail editor 330.
- In a thumbnail, basic behavior commands, each represented by an icon, are inserted. These behaviors can be reproduced by the robot. A text can also be inserted (in a bubble, as explained in the commentary on Figure 5). This text will also be reproduced vocally by the robot.
- the editing module normally receives as input a text that defines a scenario. This entry can be made directly using a simple computer keyboard, or by loading into the system a text-type file (*.doc, *.txt or other) or an html file (optionally designated by its URL). These files can also be received from a remote site, for example through a messaging system. To perform this reading, the system or the robot is provided with a synthesis device capable of interpreting the text of the script editor to produce sounds, which can be either words, in the case of a humanoid robot, or sounds representative of the behavior of an animal. The sound synthesis device can also reproduce background sounds, for example background music which may possibly be played on a remote computer.
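As a rough sketch of this text input path, HTML sources can be reduced to plain text before being handed to the editor. This fragment is only illustrative: it covers plain-text and HTML files, not *.doc files or remote messaging, and the function names are assumptions.

```python
from html.parser import HTMLParser
from pathlib import Path

class _TextExtractor(HTMLParser):
    """Collects the visible text fragments of an HTML page."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        if data.strip():
            self.chunks.append(data.strip())

def html_to_text(html):
    """Strip markup so an HTML scenario can be fed to the editor as text."""
    extractor = _TextExtractor()
    extractor.feed(html)
    return " ".join(extractor.chunks)

def load_scenario_text(path):
    """Load a scenario from a plain-text or HTML file on disk."""
    raw = Path(path).read_text(encoding="utf-8")
    if str(path).lower().endswith((".html", ".htm")):
        return html_to_text(raw)
    return raw
```

For example, `html_to_text("<p>Hello <b>robot</b></p>")` yields the plain sentence "Hello robot", ready for the synthesis device.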
- the triggering of the reading of a story can occur upon reception of an event external to the robot, such as:
- an action of a user, which can be the touch of a touch zone on the robot (for example, its head), or a gesture or speech pre-programmed for that purpose.
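Such event-driven triggering can be pictured as a simple subscription mechanism. This dispatcher is a hypothetical sketch: the real robot middleware exposes its own callback API, which this fragment does not claim to reproduce.

```python
class EventDispatcher:
    """Minimal publish/subscribe dispatcher for external robot events."""
    def __init__(self):
        self._handlers = {}

    def subscribe(self, event, handler):
        """Register a callback to run when the named event is raised."""
        self._handlers.setdefault(event, []).append(handler)

    def raise_event(self, event, *args):
        """Fire all callbacks registered for the event."""
        for handler in self._handlers.get(event, []):
            handler(*args)

started = []
dispatcher = EventDispatcher()
# Start reading the story when the head touch sensor is actuated.
dispatcher.subscribe("HeadTouched", lambda: started.append("story"))
dispatcher.raise_event("HeadTouched")  # simulated user touch
```

The same mechanism would cover the other triggers mentioned, a pre-programmed gesture or utterance, by subscribing the story start to those event names instead.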
- the behavior commands are represented in a thumbnail by an icon illustrative of said behavior.
- the behavior commands can generate:
- the insertion of the behavior commands can take place in a behavior management module 340 by dragging a behavior control icon chosen from a library 3410 to a thumbnail located in the thumbnail editing module 330.
- the thumbnail editor 330 also allows text to be copied and pasted.
- the interpretation module embedded in the robot can interpret an annotated text from an external application.
- the external application may be a Choregraphe™ box, this application being the programming software of the NAO robot, described in particular in French Patent Application No. 09/53434 already cited.
- These annotated texts can also be web pages, e-mails or instant short messages (SMS), or come from other applications, provided that the module 330 includes the interface necessary to integrate them.
- the editing module 210 communicates with the robot via a communication management module 370 which conditions XML streams sent to the physical layer by which the robot is connected to the PC.
- An interpretation manager 350 and a communications manager 360 complete the editing module.
- the interpretation manager 350 is used to initiate the interpretation of the text, to stop it, and to obtain information about the interpretation (for example, the place in the text that interpretation has reached).
- the communication manager 360 is used to connect to a robot, to disconnect, and to receive information about the connection (connection status or inadvertent disconnection, for example).
- FIG. 4 represents a flowchart of the processes carried out in a command interpreting module according to several embodiments of the invention.
- the XML streams from the editing module 210 and other streams, such as annotated text from a mailbox or mobile phone, are provided with an identifier (ID) and a type by a sub-module 410 of the thumbnail interpretation module 220.
- the identified and typed streams of the queue 41 are then converted into objects interpretable as behaviors by a compilation thread 420.
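The conditioning step just described can be pictured as stamping each incoming stream with a fresh identifier and its source type before queuing it for the compilation thread. The function and field names here are illustrative assumptions, not the patent's actual interfaces.

```python
import itertools
from queue import Queue

_ids = itertools.count(1)  # monotonically increasing stream identifiers

def condition_stream(text, source, out_queue):
    """Stamp an incoming stream (editor XML, mail, SMS, ...) with an ID
    and a type, then queue it for the compilation thread (cf. sub-module 410)."""
    item = {"id": next(_ids), "type": source, "text": text}
    out_queue.put(item)
    return item

queue = Queue()
first = condition_stream("Hello ^bow", "editor", queue)
second = condition_stream("How are you?", "sms", queue)
```

Using a thread-safe `Queue` reflects the hand-off between the conditioning sub-module and the compilation thread, which run concurrently.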
- a reference to a behavior, which is not necessarily explicit out of context, is replaced by a synchronization tag coupled with a direct reference to the behavior via the path to where it is stored.
- This thread exchanges with the behavior management module 340 of the thumbnail editor 210. These exchanges allow the detection of references to behaviors in the text.
- Since the compilation thread does not know the tags that might correspond to a behavior, it must first request all these tags from the behavior management module in order to be able to detect them in the text. Then, when it detects a tag in the text, it asks the behavior management module which behavior corresponds to this tag (e.g. "law"). The behavior management module responds by giving it the path to the corresponding behavior ("Animations/Positive/bau" for example). These exchanges take place synchronously with the compilation thread.
- When the compilation thread detects an end of sentence (which may be defined by punctuation marks, an end of line, etc.), it sends the sentence to the queue 4210.
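The two operations just described, splitting the text into sentences at punctuation or line ends and replacing recognized tags by synchronization tags carrying the stored behavior's path, can be sketched as follows. The caret tag syntax and the tag-to-path mapping are illustrative assumptions standing in for the exchanges with the behavior management module.

```python
import re

# Illustrative mapping, as the behavior management module might return it.
BEHAVIOR_PATHS = {"bow": "Animations/Positive/Bow"}

def split_sentences(text):
    """Split a scenario text at punctuation marks or ends of line."""
    parts = re.split(r"(?<=[.!?])\s+|\n", text)
    return [p for p in parts if p.strip()]

def compile_sentence(sentence, paths=BEHAVIOR_PATHS):
    """Replace each recognized ^tag by a synchronization tag carrying
    the path to the stored behavior."""
    def replace(match):
        tag = match.group(1)
        if tag in paths:
            return '<sync behavior="%s"/>' % paths[tag]
        return match.group(0)  # unknown tags are left untouched
    return re.sub(r"\^(\w+)", replace, sentence)
```

For instance, `compile_sentence("^bow Hello")` becomes `<sync behavior="Animations/Positive/Bow"/> Hello`, a form the execution module can act on without further lookups.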
- the call, programmed by its ID identifier, is immediate as soon as, according to the scenario, a behavior must be executed.
- the runtime module pre-loads the behavior and returns the unique ID of the instance of the behavior that is ready to run.
- the execution module can immediately execute said behavior as soon as it is needed, the synchronization of text and behaviors being thereby greatly improved.
- a synchronization thread 440 makes it possible to link temporally the text said by the speech synthesis module 450 and the behaviors executed by the execution module of the behaviors 460.
- the text with synchronization tags is sent to the voice synthesis module 450, while the behavior identifiers (IDs) corresponding to the synchronization marks are sent to the behavior execution module 460, which calls the pre-loaded behaviors corresponding to the IDs of the behaviors to be executed.
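The pre-load/run split and the walk over a tagged sentence can be sketched as below. The class and function names are hypothetical stand-ins for the execution module 460 and the synchronization thread 440, not the patent's actual API.

```python
import re

class BehaviorRunner:
    """Stand-in for the execution module: pre-loads behaviors so they can
    be fired immediately by instance ID when the text reaches their mark."""
    def __init__(self):
        self._preloaded, self._next = {}, 0

    def preload(self, path):
        self._next += 1
        self._preloaded[self._next] = path
        return self._next  # unique ID of the ready-to-run instance

    def run(self, instance_id):
        return self._preloaded.pop(instance_id)

def play(tagged_sentence, say, runner):
    """Walk a compiled sentence: speak the plain text, fire behaviors at tags."""
    fired = []
    # The capturing group keeps the <sync .../> delimiters in the split result.
    for part in re.split(r'(<sync behavior="[^"]+"/>)', tagged_sentence):
        match = re.match(r'<sync behavior="([^"]+)"/>', part)
        if match:
            fired.append(runner.run(runner.preload(match.group(1))))
        elif part.strip():
            say(part.strip())
    return fired

spoken = []
fired = play('<sync behavior="Animations/Positive/Bow"/> Hello there.',
             spoken.append, BehaviorRunner())
```

In the real system the pre-load happens ahead of time on the robot, which is what makes the behavior call immediate when the speech reaches the synchronization mark; here both steps are collapsed for brevity.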
- Figures 5a and 5b show thumbnails constituting a scenario executed by a robot in one embodiment of the invention.
- the scenario of the figure includes 16 thumbnails.
- a scenario can include any number of thumbnails.
- the robot waits until the touch sensor 5110, 5120 located on its head is actuated.
- the robot waits until a specified period has elapsed after the action 5520 of touching the touch sensor.
- the robot plays a first character, the narrator 5310, and executes a first behavior, symbolized by the graphical representation of the character, which is to rotate while reading the text written in the bubble 5320 with a voice characterizing said first character.
- the robot plays a second character 5410 (in the scenario of the example, a cicada, symbolized by a graphic mark 5430) and executes a second behavior, symbolized by the graphical representation of the character, which is to swing its right arm up and down while reading the text written in the bubble 5420 with a voice different from that of the narrator and characterizing said second character.
- the narrator robot is in a static position, represented by the character 5510, and reads the text written in the bubble 5520.
- the cicada robot 5610 is also in the static position, represented in the same way as in 5510, and reads the text written in the bubble 5620.
- the robot plays a third character (in the scenario of the example, an ant, symbolized by a graphic mark 5730) and pronounces a text 5720.
- the number of behaviors and emotions is not limited either.
- the behaviors can be taken from a base of behaviors 3410, created in Choregraphe™, the professional behavior editor, or in other tools. They may possibly be modified in the module 340 for managing behaviors of the editing module 210, which manages the behavior database 3410.
- a behavior object is defined by a name, a category, possibly a subcategory, a representation, possibly one or more parameters, possibly the association of one or more files (audio or other).
- a thumbnail may include several bubbles, a bubble comprising at least one word, as illustrated in the thumbnail 5A0.
- a scenario can also be characterized by a banner 5H0 which may or may not correspond to a musical score, said score being synchronized with the thumbnail/bubble tree.
- This synchronization facilitates the nesting of several levels of thumbnails whose execution is conditional.
- Several banners can run in parallel, as shown in the figure by the banner 5I0.
- the texts can be read in different languages, with different prosodies (speed, volume, style, voice, etc.).
- the variety of behaviors and emotions that can be used in the system of the invention is not limited.
- the voice may be a male, female or child voice; the tone may be deeper or higher; the speed may be more or less rapid; the intonation can be chosen according to the emotion that the robot is likely to feel according to the text of the script (affection, astonishment, anger, joy, remonstrance, etc.).
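Such emotion-dependent intonation can be represented as a table of prosody presets consulted when a sentence is sent to the voice synthesis module. The parameter names and values below are assumptions for illustration, not the robot's actual TTS API.

```python
# Illustrative prosody presets keyed by the emotion read from the script.
PROSODY = {
    "affection":    {"pitch": 1.1, "speed": 0.9, "volume": 0.8},
    "astonishment": {"pitch": 1.3, "speed": 1.2, "volume": 1.0},
    "anger":        {"pitch": 0.8, "speed": 1.3, "volume": 1.0},
}
NEUTRAL = {"pitch": 1.0, "speed": 1.0, "volume": 0.9}

def prosody_for(emotion):
    """Select the voice parameters matching the emotion in the script,
    falling back to a neutral voice for unknown emotions."""
    return PROSODY.get(emotion, NEUTRAL)
```

A character's voice (male, female, child) would be a separate selection layered on top of these per-sentence prosody settings.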
- Gestures accompanying the script may for example be a movement of the arms upwards or forwards, striking a foot on the ground, or movements of the head upwards, downwards, to the right or to the left, according to the impression that one wants to communicate coherently with the script.
- the robot can interact with its environment and its interlocutors in a very varied way: speech, gestures, touch, light signals, etc.
- these can be activated to translate strong emotions "felt" by the robot while reading the text, or to generate an eye blink adapted to the form and speed of the speech.
- certain commands may be interruptions or waits for an external event, such as a movement made in response to a question posed by the robot.
- Some commands may be dependent on the reactions of the robot to its environment, captured for example by a camera or ultrasonic sensors.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1255105A FR2991222B1 (en) | 2012-06-01 | 2012-06-01 | SYSTEM AND METHOD FOR GENERATING CONTEXTUAL MOBILE ROBOT BEHAVIOR EXECUTED IN REAL-TIME |
PCT/EP2013/061180 WO2013178741A1 (en) | 2012-06-01 | 2013-05-30 | System and method for generating contextual behaviours of a mobile robot executed in real time |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2855105A1 true EP2855105A1 (en) | 2015-04-08 |
Family
ID=47080621
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP13728694.4A Ceased EP2855105A1 (en) | 2012-06-01 | 2013-05-30 | System and method for generating contextual behaviours of a mobile robot executed in real time |
Country Status (7)
Country | Link |
---|---|
US (1) | US20150290807A1 (en) |
EP (1) | EP2855105A1 (en) |
JP (1) | JP6319772B2 (en) |
CN (1) | CN104470686B (en) |
BR (1) | BR112014030043A2 (en) |
FR (1) | FR2991222B1 (en) |
WO (1) | WO2013178741A1 (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6594646B2 (en) * | 2015-04-10 | 2019-10-23 | ヴイストン株式会社 | Robot, robot control method, and robot system |
JP6781545B2 (en) * | 2015-12-28 | 2020-11-04 | ヴイストン株式会社 | robot |
JP6604912B2 (en) * | 2016-06-23 | 2019-11-13 | 日本電信電話株式会社 | Utterance motion presentation device, method and program |
US20180133900A1 (en) * | 2016-11-15 | 2018-05-17 | JIBO, Inc. | Embodied dialog and embodied speech authoring tools for use with an expressive social robot |
CN108932167B (en) * | 2017-05-22 | 2023-08-08 | 中兴通讯股份有限公司 | Intelligent question-answer synchronous display method, device and system and storage medium |
JP6956562B2 (en) * | 2017-08-10 | 2021-11-02 | 学校法人慶應義塾 | Intelligent robot systems and programs |
US11325263B2 (en) * | 2018-06-29 | 2022-05-10 | Teradyne, Inc. | System and method for real-time robotic control |
CN110543144B (en) * | 2019-08-30 | 2021-06-01 | 天津施格自动化科技有限公司 | Method and system for graphically programming control robot |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2496160A1 (en) * | 1980-12-11 | 1982-06-18 | Lamothe Andre | Sealed connector for deep drilling tools - where drilling liq. can be fed to tool, or another liq. can be fed into drilled hole without reaching the tool |
JPH07261820A (en) * | 1994-03-25 | 1995-10-13 | Nippon Telegr & Teleph Corp <Ntt> | Software constituting method and controller for industrial robot operation |
JP4366617B2 (en) * | 1999-01-25 | 2009-11-18 | ソニー株式会社 | Robot device |
JP4670136B2 (en) * | 2000-10-11 | 2011-04-13 | ソニー株式会社 | Authoring system, authoring method, and storage medium |
GB2385954A (en) * | 2002-02-04 | 2003-09-03 | Magenta Corp Ltd | Managing a Virtual Environment |
US7995090B2 (en) * | 2003-07-28 | 2011-08-09 | Fuji Xerox Co., Ltd. | Video enabled tele-presence control host |
JP4744847B2 (en) * | 2004-11-02 | 2011-08-10 | 株式会社安川電機 | Robot control device and robot system |
JP2009025224A (en) * | 2007-07-23 | 2009-02-05 | Clarion Co Ltd | Navigation device and control method for navigation device |
FR2929873B1 (en) * | 2008-04-09 | 2010-09-03 | Aldebaran Robotics | CONTROL-COMMAND ARCHITECTURE OF A MOBILE ROBOT USING ARTICULATED LIMBS |
FR2946160B1 (en) * | 2009-05-26 | 2014-05-09 | Aldebaran Robotics | SYSTEM AND METHOD FOR EDITING AND CONTROLLING THE BEHAVIOUR OF A MOBILE ROBOT |
FR2947923B1 (en) * | 2009-07-10 | 2016-02-05 | Aldebaran Robotics | SYSTEM AND METHOD FOR GENERATING CONTEXTUAL BEHAVIOR OF A MOBILE ROBOT |
US9472112B2 (en) * | 2009-07-24 | 2016-10-18 | Modular Robotics Incorporated | Educational construction modular unit |
US8260460B2 (en) * | 2009-09-22 | 2012-09-04 | GM Global Technology Operations LLC | Interactive robot control system and method of use |
DE102010004476A1 (en) * | 2010-01-13 | 2011-07-14 | KUKA Laboratories GmbH, 86165 | Method for controlling e.g. palatalized robot application, involves generating and/or modifying control interfaces based on configuration of robot application or during change of configuration of robot application |
- 2012
  - 2012-06-01 FR FR1255105A patent/FR2991222B1/en not_active Expired - Fee Related
- 2013
  - 2013-05-30 CN CN201380037538.3A patent/CN104470686B/en not_active Expired - Fee Related
  - 2013-05-30 US US14/404,924 patent/US20150290807A1/en not_active Abandoned
  - 2013-05-30 JP JP2015514502A patent/JP6319772B2/en active Active
  - 2013-05-30 BR BR112014030043A patent/BR112014030043A2/en not_active Application Discontinuation
  - 2013-05-30 WO PCT/EP2013/061180 patent/WO2013178741A1/en active Application Filing
  - 2013-05-30 EP EP13728694.4A patent/EP2855105A1/en not_active Ceased
Non-Patent Citations (2)
Title |
---|
None * |
See also references of WO2013178741A1 * |
Also Published As
Publication number | Publication date |
---|---|
FR2991222A1 (en) | 2013-12-06 |
WO2013178741A1 (en) | 2013-12-05 |
BR112014030043A2 (en) | 2017-06-27 |
JP6319772B2 (en) | 2018-05-09 |
FR2991222B1 (en) | 2015-02-27 |
CN104470686B (en) | 2017-08-29 |
JP2015525137A (en) | 2015-09-03 |
US20150290807A1 (en) | 2015-10-15 |
CN104470686A (en) | 2015-03-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2855105A1 (en) | System and method for generating contextual behaviours of a mobile robot executed in real time | |
FR2963132A1 (en) | HUMANOID ROBOT HAVING A NATURAL DIALOGUE INTERFACE, METHOD OF USING AND PROGRAMMING THE SAME | |
KR102306624B1 (en) | Persistent companion device configuration and deployment platform | |
FR2947923A1 (en) | SYSTEM AND METHOD FOR GENERATING CONTEXTUAL BEHAVIOR OF A MOBILE ROBOT | |
FR2989209A1 (en) | ROBOT FOR INTEGRATING NATURAL DIALOGUES WITH A USER IN HIS BEHAVIOR, METHODS OF PROGRAMMING AND USING THE SAME | |
US10929759B2 (en) | Intelligent robot software platform | |
EP2435216B1 (en) | System and method for editing and controlling the behaviour of a movable robot | |
US9292957B2 (en) | Portable virtual characters | |
EP3053162B1 (en) | Method for dialogue between a machine, such as a humanoid robot, and a human interlocutor; computer program product; and humanoid robot for implementing such a method | |
KR102001293B1 (en) | Executing software applications on a robot | |
US20100333037A1 (en) | Dioramic user interface having a user customized experience | |
TW201916005A (en) | Interaction method and device | |
Newnham | Microsoft HoloLens By Example | |
FR3080926A1 (en) | METHOD FOR CONTROLLING A PLURALITY OF EFFECTORS OF A ROBOT | |
Alonso et al. | A flexible and scalable social robot architecture employing voice assistant technologies | |
WO2018183812A1 (en) | Persistent companion device configuration and deployment platform | |
Singh | Universal gesture tracking framework in OpenISS and ROS and its applications | |
Geraci | Design and implementation of embodied conversational agents | |
US20210042639A1 (en) | Converting nonnative skills for conversational computing interfaces | |
Hahkio | Service robots’ feasibility in the hotel industry: A case study of Hotel Presidentti | |
Lleonsí Carrillo | Development of a teaching assistance application for SoftBank Pepper | |
WO2006061308A1 (en) | Method for the temporal animation of an avatar from a source signal containing branching information, and corresponding device, computer program, storage means and source signal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the European phase | Free format text: ORIGINAL CODE: 0009012 |
| 17P | Request for examination filed | Effective date: 20141230 |
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| AX | Request for extension of the european patent | Extension state: BA ME |
| RAP1 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: ALDEBARAN ROBOTICS S.A. |
| DAX | Request for extension of the european patent (deleted) | |
| 17Q | First examination report despatched | Effective date: 20170105 |
| REG | Reference to a national code | Ref country code: DE; Ref legal event code: R003 |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
| 18R | Application refused | Effective date: 20190116 |