EP2855105A1 - System and method for generating context-dependent behaviors of a mobile robot executed in real time - Google Patents

System and method for generating context-dependent behaviors of a mobile robot executed in real time

Info

Publication number
EP2855105A1
Authority
EP
European Patent Office
Prior art keywords
module
robot
behavior
text
editing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
EP13728694.4A
Other languages
English (en)
French (fr)
Inventor
Victor PALEOLOGUE
Maxime MORISSET
Flora BRIAND
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aldebaran SAS
Original Assignee
Aldebaran Robotics SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Aldebaran Robotics SA filed Critical Aldebaran Robotics SA
Publication of EP2855105A1 publication Critical patent/EP2855105A1/de
Ceased legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1661Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00Arrangements for software engineering
    • G06F8/30Creation or generation of source code
    • G06F8/34Graphical or visual programming
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/004Artificial life, i.e. computing arrangements simulating life
    • G06N3/008Artificial life, i.e. computing arrangements simulating life based on physical entities controlled by simulated intelligence so as to replicate intelligent life forms, e.g. based on robots replicating pets or humans in their appearance or behaviour
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40099Graphical user interface for robotics, visual robot user interface
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40392Programming, visual robot programming language

Definitions

  • the present invention belongs to the field of robot programming systems. More precisely, it applies to the control of behaviors that are coherent with the context in which the robot, in particular a robot of human or animal form, evolves, expresses itself and moves about on limbs, whether articulated or not.
  • a robot can be called humanoid from the moment it has certain attributes of human appearance and functionality: a head, a trunk, two arms, possibly two hands, two legs, two feet, etc.
  • One of the features likely to give the robot a quasi-human appearance and behavior is the possibility of ensuring a strong coupling between gestural expression and oral expression. In particular, achieving this result intuitively allows new groups of users to access the programming of humanoid robot behaviors.
  • the patent application WO2011/003628 discloses a system and a method addressing this general problem.
  • the invention disclosed by this application makes it possible to overcome some of the disadvantages of the prior art in which specialized programming languages accessible only to a professional programmer were used.
  • languages specialized in the programming of behaviors at the functional or intentional level, independently of physical actions, such as FML (Function Markup Language), or at the level of the behaviors themselves (which involve several parts of the virtual character to perform a function), such as BML (Behavior Markup Language), remain accessible only to the professional programmer and do not integrate with scripts written in everyday language.
  • the invention covered by the patent application cited does not, however, control the robot in real time, because it uses an editor that is not able to send commands directly to the robot in "streaming" mode, that is to say, to interact in real time with the behavior of the robot, which can evolve according to changes in its environment.
  • a scenario must be replayed from the beginning when an event not provided for in the command scenario occurs.
  • the robot of the invention is provided with an editor and a command interpreter capable of graphically integrating, within thumbnails, the texts and behaviors of a scenario, which can be executed as soon as they are issued.
  • the present invention discloses a system for editing and controlling at least one scenario, said at least one scenario comprising at least one behavior to be executed and a text to be pronounced by at least one robot equipped with motor and speech capabilities, said system comprising a module for editing said behaviors and texts, said editing module being autonomous with respect to said robot and comprising an input sub-module for said text to be pronounced by the robot and a sub-module for managing the behaviors, said system being characterized in that said editing module further comprises a sub-module for graphically representing and associating said at least one behavior and said at least one text in at least one combined display area of said at least one behavior and said at least one text, said combined display area constituting a thumbnail, said thumbnail constituting a computing object that can be compiled in order to be executed on said robot.
  • said at least one thumbnail comprises at least one graphic object belonging to the group comprising a waiting icon, a robot behavior icon and a text bubble comprising at least one word of said text to be pronounced by the robot.
  • said behavior icon of a thumbnail includes a graphical mark representing a personality and/or an emotion of the robot, associated with at least one text bubble in the thumbnail.
  • said graphical representation of said scenario further comprises at least one synchronization banner of the sequence of actions represented by said at least one thumbnail.
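  • By way of purely illustrative example (this is not code from the patent), such a thumbnail combining behavior icons, text bubbles, a waiting icon and its compilation into an object executable on the robot could be sketched as follows; all class, attribute and method names here are assumptions introduced for the illustration:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class BehaviorIcon:
    """A behavior command shown as an icon in a thumbnail (hypothetical name)."""
    name: str                       # e.g. "turn_around"
    emotion: Optional[str] = None   # optional personality/emotion mark

@dataclass
class TextBubble:
    """Text to be pronounced by the robot, tied to a character voice."""
    text: str
    character: str = "narrator"

@dataclass
class Thumbnail:
    """Combined display area grouping behaviors and texts of one scenario step."""
    behaviors: List[BehaviorIcon] = field(default_factory=list)
    bubbles: List[TextBubble] = field(default_factory=list)
    wait_for: Optional[str] = None  # waiting icon, e.g. "head_touch"

    def compile(self) -> dict:
        """Turn the thumbnail into a plain object that can be streamed to the robot."""
        return {
            "wait_for": self.wait_for,
            "behaviors": [b.name for b in self.behaviors],
            "text": " ".join(b.text for b in self.bubbles),
        }

@dataclass
class Scenario:
    """A scenario is a succession of thumbnails."""
    thumbnails: List[Thumbnail] = field(default_factory=list)
```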
  • the editing and control system of the invention further comprises a module for interpreting said scenarios, said interpretation module being embedded on said at least one robot and communicating with the editing module in streaming mode.
  • the interpretation module of said scenarios comprises a conditioning sub-module for at least one scenario, said sub-module being configured to provide said at least one input scenario with an identifier and a type.
  • the interpretation module of said scenarios comprises a compilation sub-module of said at least one behavior, said sub-module being configured to associate with said behavior the attributes of an object structure.
  • said compilation sub-module is configured to split said scenarios into subsets delimited by a punctuation mark or an end of line.
  • the interpretation module of said scenarios comprises a control sub-module for pre-loading said at least one behavior in the robot memory for execution by said behavior execution module.
  • the interpretation module of said scenarios comprises a synchronization sub-module of said at least one text with said at least one behavior.
  • the invention also discloses a method for editing and controlling at least one scenario, said at least one scenario comprising at least one behavior to be executed and a text to be pronounced by at least one robot equipped with motor and speech capabilities, said method comprising a step of editing said behaviors and texts, said editing step being autonomous with respect to said robot and comprising a sub-step of inputting said text to be pronounced by the robot and a sub-step of managing the behaviors, said method being characterized in that said editing step further comprises a sub-step of graphically representing and associating said at least one behavior and said at least one text in at least one thumbnail.
  • the invention also discloses a computer program comprising program code instructions for executing the method of the invention when the program is executed on a computer, said program being adapted to allow the editing of at least one scenario, said at least one scenario comprising at least one behavior to be executed and a text to be pronounced by at least one robot equipped with motor and speech capabilities, said computer program comprising a module for editing said behaviors and texts, said editing module being autonomous with respect to said robot and comprising an input sub-module for said text to be pronounced by the robot and a behavior management sub-module, said computer program being characterized in that said editing module further comprises a sub-module for graphically representing and associating said at least one behavior and said at least one text in at least one thumbnail.
  • the invention also discloses a computer program comprising program code instructions for executing the method according to the invention when the program is executed on a computer, said program being adapted to allow the interpretation of at least one scenario, said at least one scenario comprising at least one behavior to be executed and a text to be pronounced by at least one robot equipped with motor and speech capabilities, said computer program comprising an interpretation module for said scenarios, said interpretation module being embedded on said at least one robot and communicating with an external platform in streaming mode.
  • the interpretation module of said scenarios comprises a compilation sub-module of said at least one behavior, said sub-module being configured to associate with said behavior the attributes of an object structure.
  • the interpretation module of said scenarios comprises a control sub-module for pre-loading said at least one behavior in the robot memory for execution by said behavior execution module (460).
  • the interpretation module of said scenarios comprises a synchronization sub-module of said at least one text with said at least one behavior.
  • the invention makes it possible to create behavioral libraries and to easily insert them into a script of scenes played by the robot.
  • the behaviors are modeled by graphic thumbnails, each thumbnail representing the gestural and emotional behaviors of the robot, as well as its speech and elements of the environment (music, images, speech of other characters, etc.).
  • This scenario creation interface is intuitive and allows the user to easily create complex scenarios that can be adapted in real time.
  • the invention also usefully complements French Patent Application No. 09/53434, belonging to the Applicant, which relates to a system and a method for editing and controlling the behaviors of a mobile robot.
  • This application provides means for executing behaviors on a robot, said behaviors being controllable either by using a specialized scripting language accessible to programmers, or graphically by using pre-programmed libraries that can be selected and inserted into a series of behavior boxes linked by events.
  • the invention makes it possible to further simplify the programming interface of the behavior of the robot.
  • FIG. 1 represents the physical architecture of a system for implementing the invention according to several embodiments
  • FIG. 2 represents a general flowchart of the processing operations according to several embodiments of the invention.
  • FIG. 3 represents a flowchart of the processes carried out in a command editing module according to several embodiments of the invention
  • FIG. 4 represents a flowchart of the processing carried out in a command interpreting module according to several embodiments of the invention
  • FIGS. 5a and 5b show thumbnails constituting a scenario executed by a robot in one embodiment of the invention.
  • FIG. 1 represents the physical architecture of a system for implementing the invention according to several embodiments.
  • a humanoid robot 110 is shown in the figure in one embodiment of the invention. Such a robot has been disclosed in particular in the patent application WO2009/124951 published on January 15, 2009. This platform served as the basis for the improvements that led to the present invention. In the remainder of the description, this humanoid robot may be referred to interchangeably under this generic name or under its trademark NAO™, without the generality of the reference being affected.
  • the robot has about two dozen electronic cards controlling the sensors and actuators that drive the joints.
  • the electronic control card includes a commercial microcontroller. It can be, for example, a DSPIC™ from the company Microchip: a 16-bit MCU coupled to a DSP. This MCU has a servo loop cycle of 1 ms.
  • the robot can also include other types of actuators, in particular LEDs (light-emitting diodes) whose color and intensity can reflect the emotions of the robot. It may also include other types of position sensors, in particular an inertial unit, FSRs (ground pressure sensors), etc.
  • the head houses the intelligence of the robot, in particular the card that performs the high-level functions allowing the robot to accomplish the tasks assigned to it, including, in the context of the present invention, the execution of scenarios written by a user who is not a professional programmer.
  • the head may also include specialized cards, in particular for speech or vision processing, or for handling service inputs/outputs, such as the encoding necessary to open a port in order to establish remote communication over a Wide Area Network (WAN).
  • the card's processor can be a commercial x86 processor. A low-power processor is preferably chosen, for example an Intel ATOM™ (32 bits, 1600 MHz).
  • the card also includes a set of RAM and flash memories.
  • This card also manages the robot's communication with the outside world (behavior server, other robots, etc.), normally over a WiFi or WiMax transmission layer, possibly over a public mobile data network using standard protocols, possibly encapsulated in a VPN.
  • the processor normally runs a standard OS, which makes it possible to use the usual high-level languages (C, C++, Python, ...) or languages specific to artificial intelligence such as URBI (a programming language specialized in robotics) for programming the high-level functions.
  • the robot 110 will be able to perform behaviors for which it may have been programmed in advance, in particular by means of code generated according to the invention disclosed in the French patent application No. 09/53434 already cited, said code having been created by a programmer in a graphical interface.
  • These behaviors may also have been arranged in a scenario created by a user who is not a professional programmer, using the invention disclosed in the patent application WO2011/003628 also already cited.
  • these may be behaviors linked to one another according to a relatively complex logic, in which the sequences of behaviors are conditioned by the events that occur in the environment of the robot.
  • a user, who must have a minimum of programming skills, can use the Choregraphe™ workshop, whose main operating modes are described in the cited application.
  • the flow logic of the scenario is not in principle adaptive.
  • a user 120 who is not a professional programmer is able to produce a complex scenario comprising sets of behaviors including various gestures and movements, emissions of sound or visual signals, and speech forming questions and answers, these different elements being represented together graphically by icons on a sequence of thumbnails (see Figure 5).
  • the thumbnails are, as we will see later, the programming interface of the story that will be played by the robot.
  • FIG. 2 represents a general flowchart of the processing operations according to several embodiments of the invention.
  • the PC 120 includes a software module 210 for graphically editing the commands that will be passed to the robot or robots.
  • its architecture and operation are detailed in the commentary to Figure 3.
  • the PC communicates with the robot and transmits to it the thumbnails that will be interpreted to be executed by the thumbnail interpretation software module 220.
  • the architecture and operation of this module 220 are detailed in the commentary to FIG. 4.
  • the user's PC communicates with the robot via a wired or radio interface, or both, in the case where the robot and the user are located in remote locations and communicate over a wide area network. This last case is not shown in the figure but is one of the possible embodiments of the invention.
  • FIG. 3 represents a flowchart of the processes performed in a command editing module according to several embodiments of the invention.
  • the editing module 210 comprises a scenario collector 310 which is in communication with scenario files 31.
  • the scenarios can be viewed and modified in a scenario editor 320, which can simultaneously hold several scenarios 3210 in memory. A scenario generally corresponds to a text and consists of a succession of thumbnails.
  • the editing module includes a thumbnail editor 330.
  • in a thumbnail, basic behavior commands, each represented by an icon, are inserted; these behaviors can be reproduced by the robot. A text can also be inserted (placed in a bubble, as explained in the commentary to Figure 5); this text will also be reproduced vocally by the robot.
  • the editing module normally receives as input a text that defines a scenario. This input can be made directly using a simple computer keyboard, or by loading into the system a text-type file (*.doc, *.txt or other) or an HTML file (optionally designated by its URL). These files can also be received from a remote site, for example through a messaging system. To perform this reading, the system or the robot is provided with a synthesis device capable of interpreting the text of the script editor to produce sounds, which can be either words in the case of a humanoid robot, or sounds representative of the behavior of an animal. The sound synthesis device can also reproduce background sounds, for example background music which may possibly be played on a remote computer.
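  • As a rough illustration of this input step (a sketch under assumptions, not the patented implementation), scenario text could be loaded from a keyboard string, a local text file or a URL as follows; the helper name load_scenario_text is hypothetical:

```python
from pathlib import Path
from urllib.request import urlopen

def load_scenario_text(source: str) -> str:
    """Return raw scenario text from a URL, a local text file, or a literal string.

    Binary formats such as *.doc would need a dedicated reader and are not
    handled in this simplified sketch.
    """
    if source.startswith(("http://", "https://")):
        with urlopen(source) as response:                # HTML page designated by its URL
            return response.read().decode("utf-8", errors="replace")
    path = Path(source)
    if path.is_file():                                   # *.txt or similar text file
        return path.read_text(encoding="utf-8", errors="replace")
    return source                                        # direct keyboard input
```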
  • the reading of a story can be triggered upon reception of an event external to the robot, such as:
  • An action by a user, which can be a touch on a tactile zone of the robot (for example, its head), or a gesture or speech pre-programmed for this purpose.
  • the behavior commands are represented in a thumbnail by an icon illustrative of said behavior.
  • the behavior commands can generate:
  • the insertion of the behavior commands can take place in a behavior management module 340 by dragging a behavior control icon chosen from a library 3410 to a thumbnail located in the thumbnail editing module 330.
  • the thumbnail editor 330 also allows text to be copied and pasted.
  • the interpretation module embedded in the robot can interpret an annotated text from an external application.
  • the external application may be a Choregraphe™ box, this application being the programming software of the NAO robot, which is described in particular in French Patent Application No. 09/53434 already cited.
  • These annotated texts can also be web pages, e-mails or instant short messages (SMS), or can come from other applications, provided that the module 330 includes the interface necessary to integrate them.
  • the editing module 210 communicates with the robot via a communication management module 370 which conditions XML streams sent to the physical layer by which the robot is connected to the PC.
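  • The exact XML schema exchanged between the editing module 210 and the robot is not reproduced here; the following sketch merely illustrates, with a made-up element layout, how a compiled thumbnail could be conditioned into an XML stream:

```python
import xml.etree.ElementTree as ET

def thumbnail_to_xml(compiled: dict) -> str:
    """Serialize a compiled thumbnail (see the dict produced above) into an
    XML fragment; element and attribute names are illustrative only."""
    root = ET.Element("thumbnail")
    if compiled.get("wait_for"):
        ET.SubElement(root, "wait", event=compiled["wait_for"])
    for name in compiled.get("behaviors", []):
        ET.SubElement(root, "behavior", name=name)
    ET.SubElement(root, "text").text = compiled.get("text", "")
    return ET.tostring(root, encoding="unicode")
```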
  • An interpretation manager 350 and a communications manager 360 complete the editing module.
  • the interpretation manager 350 is used to start the interpretation of the text, to stop it, and to obtain information about the interpretation (for example, the place in the text where interpretation is currently taking place).
  • the communication manager 360 is used to connect to a robot, to disconnect and receive information about the connection (connection status or inadvertent disconnection for example).
  • FIG. 4 represents a flowchart of the processes carried out in a command interpreting module according to several embodiments of the invention.
  • the XML streams from the editing module 210 and other streams, such as annotated text from a mailbox or mobile phone, are provided with an identifier (ID) and a type by a sub-module 410 of the thumbnail interpretation module 220.
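  • A minimal sketch of this conditioning step (names such as ConditionedStream and condition are assumptions, not the patent's API) could look as follows:

```python
import itertools
import queue
from dataclasses import dataclass

@dataclass
class ConditionedStream:
    """Incoming annotated text wrapped with an identifier and a type (sub-module 410)."""
    stream_id: int
    stream_type: str   # e.g. "thumbnail", "email", "sms" (illustrative values)
    payload: str

_next_id = itertools.count(1)
pending = queue.Queue()

def condition(payload: str, stream_type: str) -> ConditionedStream:
    """Assign an ID and a type to an incoming stream and queue it for compilation."""
    item = ConditionedStream(next(_next_id), stream_type, payload)
    pending.put(item)
    return item
```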
  • the identified and typed streams of the queue 41 are then converted into objects that can be interpreted as behaviors by a compilation thread 420.
  • a reference to a behavior, which is not necessarily explicit out of context, is replaced by a synchronization tag coupled with a direct reference to the behavior via the path to where it is stored.
  • This thread exchanges data with the behavior management module 340 of the editing module 210. These exchanges allow the detection of references to behaviors in the text.
  • Since the compilation thread does not know the tags that might correspond to a behavior, it must first request all these tags from the behavior management module in order to be able to detect them in the text. Then, when it detects a tag in the text, it asks the behavior management module which behavior corresponds to this tag (e.g. "law"). The behavior management module responds by giving it the path to the corresponding behavior ("Animations/Positive/bau", for example). These exchanges are carried out synchronously with the compilation thread.
  • When the compilation thread detects an end of sentence (which may be defined by punctuation marks, an end of line, etc.), it sends the sentence to the queue 4210.
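  • The following sketch gives a deliberately simplified, hypothetical version of what such a compilation step could do: the dictionary tag_to_path stands in for the behavior management module 340, detected tags are replaced by a synchronization mark, and the text is cut into sentences at punctuation marks or ends of line (the tag "laugh" and its path are invented for the example):

```python
import re
from typing import Dict, List, Tuple

SENTENCE_END = re.compile(r"[.!?\n]")

def compile_text(text: str, tag_to_path: Dict[str, str]) -> List[Tuple[str, List[str]]]:
    """Return (sentence, behavior paths) pairs ready to be queued for execution."""
    sentences: List[Tuple[str, List[str]]] = []
    for raw in (s for s in SENTENCE_END.split(text) if s.strip()):
        behaviors: List[str] = []
        words: List[str] = []
        for word in raw.split():
            tag = word.strip(",;").lower()
            if tag in tag_to_path:
                behaviors.append(tag_to_path[tag])        # direct reference via the storage path
                words.append(f"[sync{len(behaviors)}]")   # synchronization mark (made-up syntax)
            else:
                words.append(word)
        sentences.append((" ".join(words), behaviors))
    return sentences

# Example (hypothetical tag and path):
# compile_text("The cicada laugh sang all summer.",
#              {"laugh": "Animations/Positive/laugh"})
```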
  • the call, programmed via its identifier (ID), will be immediate as soon as, according to the scenario, a behavior must be executed.
  • the runtime module pre-loads the behavior and returns the unique ID of the instance of the behavior that is ready to run.
  • the execution module can immediately execute said behavior as soon as it is needed, the synchronization of text and behaviors being thereby greatly improved.
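  • A toy version of this pre-loading mechanism (the class BehaviorPreloader and its methods are assumptions made for the illustration) could be sketched as follows:

```python
import itertools
from typing import Callable, Dict

class BehaviorPreloader:
    """Load behaviors into memory ahead of time so that running them is immediate."""

    def __init__(self, loader: Callable[[str], Callable[[], None]]):
        self._loader = loader                             # maps a behavior path to a runnable
        self._instances: Dict[int, Callable[[], None]] = {}
        self._ids = itertools.count(1)

    def preload(self, path: str) -> int:
        """Pre-load the behavior and return the unique ID of the ready-to-run instance."""
        instance_id = next(self._ids)
        self._instances[instance_id] = self._loader(path)
        return instance_id

    def run(self, instance_id: int) -> None:
        """Execute a previously pre-loaded behavior without any further loading delay."""
        self._instances.pop(instance_id)()
```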
  • a synchronization thread 440 makes it possible to temporally link the text spoken by the speech synthesis module 450 and the behaviors executed by the behavior execution module 460.
  • the text with its synchronization tags is sent to the speech synthesis module 450, while the behavior identifiers (IDs) corresponding to the synchronization timing are sent to the behavior execution module 460, which performs the pre-loaded behavior calls corresponding to the IDs of the behaviors to be executed.
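  • The interleaving of speech and behaviors can be pictured with the following toy loop (say and run_behavior are placeholders for the speech synthesis module 450 and the behavior execution module 460; triggering behaviors exactly on the synchronization tags, rather than after each sentence, is left out of this simplification):

```python
from typing import Callable, List, Tuple

def play_sentences(sentences: List[Tuple[str, List[int]]],
                   say: Callable[[str], None],
                   run_behavior: Callable[[int], None]) -> None:
    """Speak each sentence, then trigger the pre-loaded behavior instances
    whose IDs were attached to it by the compilation step."""
    for text, behavior_ids in sentences:
        say(text)                       # speech synthesis module (450)
        for behavior_id in behavior_ids:
            run_behavior(behavior_id)   # behavior execution module (460)
```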
  • Figures 5a and 5b show thumbnails constituting a scenario executed by a robot in one embodiment of the invention.
  • the scenario of the figure includes 16 vignettes.
  • a scenario can include any number of thumbnails.
  • the robot waits until the touch sensor (5110, 5120) located on its head is actuated.
  • the robot waits for a specified period to elapse after the action 5520 of touching the touch sensor.
  • the robot plays a first character, the narrator 5310, and executes a first behavior, symbolized by the graphical representation of the character, which consists in turning around while reading the text written in the bubble 5320 with a voice characterizing said first character.
  • the robot then plays a second character 5410 (in the example scenario, a cicada, symbolized by a graphic mark 5430) and executes a second behavior, symbolized by the graphical representation of the character, which consists in swinging its right arm up and down while reading the text written in the bubble 5420 with a voice different from that of the narrator and characterizing said second character.
  • the narrator robot is in a static position represented by the character 5510 and reads the text written in the balloon 5520.
  • the cicada robot 5610 is also in a static position, represented in the same way as in 5510, and reads the text written in the balloon 5620.
  • the robot represents a third character (in the scenario of the example, an ant symbolized by a graphic mark 5730) and pronounces a text 5720.
  • the number of behaviors and emotions is not limited either.
  • the behaviors can be taken from a behavior database 3410, created in Choregraphe™, the professional behavior editor, or in other tools. They may possibly be modified in the behavior management module 340 of the editing module 210, which manages the behavior database 3410.
  • a behavior object is defined by a name, a category, possibly a subcategory, a representation, possibly one or more parameters, possibly the association of one or more files (audio or other).
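  • This characterization of a behavior object translates naturally into a small data structure; the following sketch uses the fields listed above with hypothetical Python names:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Behavior:
    """Behavior object: name, category, optional subcategory, a representation,
    optional parameters and optionally associated (audio or other) files."""
    name: str
    category: str
    representation: str                              # e.g. path of the icon shown in the thumbnail
    subcategory: Optional[str] = None
    parameters: dict = field(default_factory=dict)
    files: List[str] = field(default_factory=list)   # associated audio or other files
```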
  • a thumbnail may include several bubbles, each bubble comprising at least one word, as illustrated in the thumbnail 5A0.
  • a scenario can also be characterized by a banner 5H0, which may or may not correspond to a musical score, said score being synchronized with the thumbnail/bubble tree.
  • This synchronization facilitates the nesting of several levels of thumbnails whose execution is conditional.
  • Several banners can run in parallel, as shown in the figure by the banner 5I0.
  • the texts can be read in different languages, with different prosodies (speed, volume, style, voice, etc.).
  • the variety of behaviors and emotions that can be used in the system of the invention is not limited.
  • the voice may be a male, female or child voice; the pitch may be lower or higher; the speed may be faster or slower; the intonation can be chosen according to the emotion that the robot is likely to feel in line with the text of the script (affection, astonishment, anger, joy, remonstrance, etc.).
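  • Such prosody choices can be thought of as a small set of per-character voice settings; the names below are assumptions for illustration and do not correspond to any specific text-to-speech API:

```python
from dataclasses import asdict, dataclass

@dataclass
class VoiceSettings:
    """Illustrative prosody parameters attached to a character."""
    voice: str = "female"     # male, female or child voice
    pitch: float = 1.0        # lower (< 1.0) or higher (> 1.0) pitch
    speed: float = 1.0        # speaking rate
    emotion: str = "neutral"  # e.g. affection, astonishment, anger, joy

def annotate(text: str, settings: VoiceSettings) -> dict:
    """Bundle a text bubble with the voice settings chosen for its character."""
    return {"text": text, "tts": asdict(settings)}
```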
  • Gestures accompanying the script may for example be a movement of the arms upwards or forwards, striking a foot on the ground, or movements of the head upwards, downwards, to the right or to the left, according to the impression to be conveyed coherently with the script.
  • the robot can interact with its environment and its interlocutors in a very varied way: speech, gestures, touch, light signals, etc.
  • these can be activated to convey strong emotions "felt" by the robot while reading the text, or to generate eye blinking adapted to the form and speed of the speech.
  • certain commands may be interrupt commands that wait for an external event, such as a movement made in response to a question posed by the robot.
  • Some commands may be dependent on the reactions of the robot to its environment, captured for example by a camera or ultrasonic sensors.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Computational Linguistics (AREA)
  • Biomedical Technology (AREA)
  • Mathematical Physics (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Manipulator (AREA)
  • Toys (AREA)
  • Stored Programmes (AREA)
EP13728694.4A 2012-06-01 2013-05-30 System und verfahren zur erzeugung von in echtzeit ausgeführten kontextabhängigen verhaltensweisen eines mobilen roboters Ceased EP2855105A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR1255105A FR2991222B1 (fr) 2012-06-01 2012-06-01 Systeme et procede pour generer des comportements contextuels d'un robot mobile executes en temps reel
PCT/EP2013/061180 WO2013178741A1 (fr) 2012-06-01 2013-05-30 Systeme et procede pour generer des comportements contextuels d'un robot mobile executes en temps reel

Publications (1)

Publication Number Publication Date
EP2855105A1 true EP2855105A1 (de) 2015-04-08

Family

ID=47080621

Family Applications (1)

Application Number Title Priority Date Filing Date
EP13728694.4A Ceased EP2855105A1 (de) 2012-06-01 2013-05-30 System und verfahren zur erzeugung von in echtzeit ausgeführten kontextabhängigen verhaltensweisen eines mobilen roboters

Country Status (7)

Country Link
US (1) US20150290807A1 (de)
EP (1) EP2855105A1 (de)
JP (1) JP6319772B2 (de)
CN (1) CN104470686B (de)
BR (1) BR112014030043A2 (de)
FR (1) FR2991222B1 (de)
WO (1) WO2013178741A1 (de)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6594646B2 (ja) * 2015-04-10 2019-10-23 ヴイストン株式会社 ロボット及びロボット制御方法並びにロボットシステム
JP6781545B2 (ja) * 2015-12-28 2020-11-04 ヴイストン株式会社 ロボット
JP6604912B2 (ja) * 2016-06-23 2019-11-13 日本電信電話株式会社 発話動作提示装置、方法およびプログラム
US20180133900A1 (en) * 2016-11-15 2018-05-17 JIBO, Inc. Embodied dialog and embodied speech authoring tools for use with an expressive social robot
CN108932167B (zh) * 2017-05-22 2023-08-08 中兴通讯股份有限公司 一种智能问答同步显示方法、装置、系统及存储介质
JP6956562B2 (ja) * 2017-08-10 2021-11-02 学校法人慶應義塾 知能ロボットシステム及びプログラム
US11325263B2 (en) * 2018-06-29 2022-05-10 Teradyne, Inc. System and method for real-time robotic control
US11153238B2 (en) * 2019-01-08 2021-10-19 Snap Inc. Dynamic application configuration
CN110543144B (zh) * 2019-08-30 2021-06-01 天津施格自动化科技有限公司 图形化编程控制机器人的方法及系统

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2496160A1 (fr) * 1980-12-11 1982-06-18 Lamothe Andre Raccord etanche pour l'utilisation des outils conventionnels de forage en circulation inverse
JPH07261820A (ja) * 1994-03-25 1995-10-13 Nippon Telegr & Teleph Corp <Ntt> 産業用ロボット作業のソフトウェア構成方法及び制御装置
JP4366617B2 (ja) * 1999-01-25 2009-11-18 ソニー株式会社 ロボット装置
JP4670136B2 (ja) * 2000-10-11 2011-04-13 ソニー株式会社 オーサリング・システム及びオーサリング方法、並びに記憶媒体
GB2385954A (en) * 2002-02-04 2003-09-03 Magenta Corp Ltd Managing a Virtual Environment
US7995090B2 (en) * 2003-07-28 2011-08-09 Fuji Xerox Co., Ltd. Video enabled tele-presence control host
JP4744847B2 (ja) * 2004-11-02 2011-08-10 株式会社安川電機 ロボット制御装置およびロボットシステム
JP2009025224A (ja) * 2007-07-23 2009-02-05 Clarion Co Ltd ナビゲーション装置、および、ナビゲーション装置の制御方法
FR2929873B1 (fr) 2008-04-09 2010-09-03 Aldebaran Robotics Architecture de controle-commande d'un robot mobile utilisant des membres articules
FR2946160B1 (fr) * 2009-05-26 2014-05-09 Aldebaran Robotics Systeme et procede pour editer et commander des comportements d'un robot mobile.
FR2947923B1 (fr) * 2009-07-10 2016-02-05 Aldebaran Robotics Systeme et procede pour generer des comportements contextuels d'un robot mobile
WO2011011084A1 (en) * 2009-07-24 2011-01-27 Modular Robotics Llc Modular robotics
US8260460B2 (en) * 2009-09-22 2012-09-04 GM Global Technology Operations LLC Interactive robot control system and method of use
DE102010004476A1 (de) * 2010-01-13 2011-07-14 KUKA Laboratories GmbH, 86165 Verfahren und Vorrichtung zum Kontrollieren einer Roboterapplikation

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
None *
See also references of WO2013178741A1 *

Also Published As

Publication number Publication date
CN104470686A (zh) 2015-03-25
JP2015525137A (ja) 2015-09-03
WO2013178741A1 (fr) 2013-12-05
FR2991222B1 (fr) 2015-02-27
US20150290807A1 (en) 2015-10-15
FR2991222A1 (fr) 2013-12-06
CN104470686B (zh) 2017-08-29
BR112014030043A2 (pt) 2017-06-27
JP6319772B2 (ja) 2018-05-09

Similar Documents

Publication Publication Date Title
EP2855105A1 (de) System und verfahren zur erzeugung von in echtzeit ausgeführten kontextabhängigen verhaltensweisen eines mobilen roboters
FR2963132A1 (fr) Robot humanoide dote d'une interface de dialogue naturel, methode d'utilisation et de programmation de ladite interface
KR102306624B1 (ko) 지속적 컴패니언 디바이스 구성 및 전개 플랫폼
FR2947923A1 (fr) Systeme et procede pour generer des comportements contextuels d'un robot mobile
US10929759B2 (en) Intelligent robot software platform
FR2989209A1 (fr) Robot apte a integrer des dialogues naturels avec un utilisateur dans ses comportements, procedes de programmation et d'utilisation dudit robot
EP2435216B1 (de) System und verfahren zur bearbeitung und steuerung des verhaltens eines beweglichen roboters
US9292957B2 (en) Portable virtual characters
KR102001293B1 (ko) 로봇 상의 소프트웨어 애플리케이션 실행하기
CA2925930C (fr) Procede de dialogue entre une machine, telle qu'un robot humanoide, et un interlocuteur humain, produit programme d'ordinateur et robot humanoide pour la mise en oeuvre d'un tel procede
US20100333037A1 (en) Dioramic user interface having a user customized experience
TW201916005A (zh) 互動方法和設備
EP4378638A2 (de) Verfahren zur steuerung einer vielzahl von robotereffektoren
Feng et al. A platform for building mobile virtual humans
Alonso et al. A flexible and scalable social robot architecture employing voice assistant technologies
WO2018183812A1 (en) Persistent companion device configuration and deployment platform
Geraci Design and implementation of embodied conversational agents
US20210042639A1 (en) Converting nonnative skills for conversational computing interfaces
Hahkio Service robots’ feasibility in the hotel industry: A case study of Hotel Presidentti
Lleonsí Carrillo Development of a teaching assistance application for SoftBank Pepper
WO2006061308A1 (fr) Procédé d'animation temporelle d'un avatar à partir d'un signal source comprenant des informations d'aiguillage, dispositif, programme d'ordinateur, moyen de stockage et signal source correspondants.
GIL GONÇALVES CORREIA

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20141230

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: ALDEBARAN ROBOTICS S.A.

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20170105

REG Reference to a national code

Ref country code: DE

Ref legal event code: R003

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

18R Application refused

Effective date: 20190116