WO2013178741A1 - System and method for generating contextual behaviours of a mobile robot executed in real time - Google Patents
System and method for generating contextual behaviours of a mobile robot executed in real time (original title: Systeme et procede pour generer des comportements contextuels d'un robot mobile executes en temps reel)
- Publication number
- WO2013178741A1 WO2013178741A1 PCT/EP2013/061180 EP2013061180W WO2013178741A1 WO 2013178741 A1 WO2013178741 A1 WO 2013178741A1 EP 2013061180 W EP2013061180 W EP 2013061180W WO 2013178741 A1 WO2013178741 A1 WO 2013178741A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- module
- robot
- behavior
- text
- editing
- Prior art date
Links
- 230000006399 behavior Effects 0.000 title claims abstract description 123
- 238000000034 method Methods 0.000 title claims abstract description 15
- 230000008451 emotion Effects 0.000 claims abstract description 8
- 238000004590 computer program Methods 0.000 claims description 13
- 230000015654 memory Effects 0.000 claims description 6
- 230000009471 action Effects 0.000 claims description 4
- 230000003750 conditioning effect Effects 0.000 claims description 2
- 230000036316 preload Effects 0.000 abstract description 2
- 230000001360 synchronised effect Effects 0.000 abstract description 2
- 238000004891 communication Methods 0.000 description 7
- 238000013515 script Methods 0.000 description 7
- 230000033001 locomotion Effects 0.000 description 6
- 230000015572 biosynthetic process Effects 0.000 description 4
- 230000006870 function Effects 0.000 description 4
- 230000003137 locomotive effect Effects 0.000 description 4
- 238000003786 synthesis reaction Methods 0.000 description 4
- 230000008569 process Effects 0.000 description 3
- 238000011282 treatment Methods 0.000 description 3
- 241000931705 Cicada Species 0.000 description 2
- 241001465754 Metazoa Species 0.000 description 2
- 230000004044 response Effects 0.000 description 2
- 230000003068 static effect Effects 0.000 description 2
- 230000001154 acute effect Effects 0.000 description 1
- 230000003044 adaptive effect Effects 0.000 description 1
- 238000013473 artificial intelligence Methods 0.000 description 1
- 230000003542 behavioural effect Effects 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 238000006243 chemical reaction Methods 0.000 description 1
- 230000001427 coherent effect Effects 0.000 description 1
- 230000001143 conditioned effect Effects 0.000 description 1
- 230000008878 coupling Effects 0.000 description 1
- 238000010168 coupling process Methods 0.000 description 1
- 238000005859 coupling reaction Methods 0.000 description 1
- 230000001419 dependent effect Effects 0.000 description 1
- 238000001514 detection method Methods 0.000 description 1
- 230000008918 emotional behaviour Effects 0.000 description 1
- 230000003203 everyday effect Effects 0.000 description 1
- 230000000193 eyeblink Effects 0.000 description 1
- 239000012530 fluid Substances 0.000 description 1
- 238000003780 insertion Methods 0.000 description 1
- 230000037431 insertion Effects 0.000 description 1
- 230000003993 interaction Effects 0.000 description 1
- 230000001795 light effect Effects 0.000 description 1
- 230000008520 organization Effects 0.000 description 1
- 238000005192 partition Methods 0.000 description 1
- 230000000704 physical effect Effects 0.000 description 1
- 230000008054 signal transmission Effects 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
- 230000001755 vocal effect Effects 0.000 description 1
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1661—Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/30—Creation or generation of source code
- G06F8/34—Graphical or visual programming
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/004—Artificial life, i.e. computing arrangements simulating life
- G06N3/008—Artificial life, i.e. computing arrangements simulating life based on physical entities controlled by simulated intelligence so as to replicate intelligent life forms, e.g. based on robots replicating pets or humans in their appearance or behaviour
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40099—Graphical user interface for robotics, visual robot user interface
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40392—Programming, visual robot programming language
Definitions
- the invention covered by the cited patent application does not control the robot in real time, because it uses an editor that cannot send commands directly to the robot in "streaming" mode, that is to say an editor that cannot interact in real time with the behavior of the robot, which may evolve according to changes in its environment.
- a scenario must be replayed from the beginning when an event not provided for in the command scenario occurs.
- FIG. 1 represents the physical architecture of a system for implementing the invention according to several embodiments.
- a humanoid robot 110 is shown in the figure in one embodiment of the invention. Such a robot has been disclosed in particular in the patent application WO2009/124951 published on October 15, 2009. This platform served as a basis for the improvements that led to the present invention. In the remainder of the description, this humanoid robot may be referred to interchangeably by this generic name or by its trademark NAO™, without the generality of the reference being affected.
- the head houses the robot's intelligence, notably the card that performs the high-level functions enabling the robot to accomplish the tasks assigned to it, including, in the context of the present invention, the execution of scenarios written by a user who is not a professional programmer.
- the head may also include specialized cards, in particular for speech or vision processing, or for handling service inputs/outputs, such as the encoding necessary to open a port and establish a remote communication over a Wide Area Network (WAN).
- the card's processor may be a commercial x86 processor. A low-power processor is preferably chosen, for example a 32-bit Intel ATOM™ at 1600 MHz.
- the card also includes a set of RAM and flash memories.
- the robot 110 will be able to perform behaviors for which it may have been programmed in advance, in particular by code generated according to the invention disclosed in the French patent application No. 09/53434 already cited, said code having been created by a programmer in a graphical interface.
- These behaviors may also have been arranged in a scenario created by a user who is not a professional programmer, using the invention disclosed in the patent application WO2011/003628 also already mentioned.
- these may be behaviors linked to one another by a relatively complex logic, in which the sequences of behaviors are conditioned by the events occurring in the robot's environment.
- a user who has a minimum of programming skills can use the Choregraphe™ workshop, whose main operating modes are described in the cited application.
- the flow logic of the scenario is not in principle adaptive.
- the PC 120 includes a software module 210 for graphically editing the commands that will be sent to the robot or robots.
- the architecture and operation of this module are detailed in the commentary on Figure 3.
- each behavior command is represented in a thumbnail by an icon illustrative of said behavior.
- the behavior commands can generate:
- Figures 5a and 5b show thumbnails constituting a scenario executed by a robot in one embodiment of the invention.
- the robot can interact with its environment and its interlocutors in a very varied way: speech, gestures, touch, light signals, etc.
- these can be activated to convey strong emotions "felt" by the robot while reading the text, or to generate an eye blink adapted to the form and speed of the speech.
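The event-conditioned chaining of behaviors described in these excerpts can be pictured as a small scenario runner that reacts to environment events between steps and resumes where it left off, instead of replaying the scenario from the beginning. The sketch below is illustrative only: the step and event callables are hypothetical stand-ins, not the patented system's API.

```python
from collections import deque

def run_scenario(steps, handle_event, poll_events):
    """Execute behavior steps in order; unplanned events are handled
    in place ("streaming") and execution resumes, rather than
    restarting the whole scenario from step 0."""
    queue = deque(steps)
    log = []
    while queue:
        for event in poll_events():       # events from the robot's environment
            log.append(handle_event(event))
        log.append(queue.popleft()())     # execute the next behavior command
    return log

# Hypothetical usage: two behavior commands, one sensor event
# arriving before the first step is executed.
events = [["obstacle"], []]
trace = run_scenario(
    steps=[lambda: "wave", lambda: "speak"],
    handle_event=lambda e: f"handled:{e}",
    poll_events=lambda: events.pop(0) if events else [],
)
print(trace)  # → ['handled:obstacle', 'wave', 'speak']
```

The point of the design is the one made above against the prior art: the event is absorbed between two steps of the scenario, and the remaining steps still run.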
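The idea of an eye blink adapted to the form and speed of the speech can be illustrated with a toy calculation. The function name, speech rate, and blink counts below are assumptions made for the sketch; the patent does not specify these values.

```python
def blink_interval(text, words_per_minute=100, blinks_per_sentence=2):
    """Derive an eye-blink interval (in seconds) from the length and
    speed of the text being read aloud, so blinking tracks the speech."""
    words = len(text.split())
    duration_s = words / words_per_minute * 60             # estimated speaking time
    sentences = max(1, sum(text.count(p) for p in ".!?"))  # rough sentence count
    blinks = sentences * blinks_per_sentence
    return duration_s / blinks

# An 8-word, two-sentence utterance at 100 words per minute
# lasts 4.8 s and gets 4 blinks, i.e. one roughly every 1.2 s:
print(round(blink_interval("Hello there. I am happy to see you!"), 2))  # → 1.2
```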
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Software Systems (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Robotics (AREA)
- General Physics & Mathematics (AREA)
- Mechanical Engineering (AREA)
- Life Sciences & Earth Sciences (AREA)
- Biophysics (AREA)
- Data Mining & Analysis (AREA)
- Evolutionary Computation (AREA)
- General Health & Medical Sciences (AREA)
- Molecular Biology (AREA)
- Computing Systems (AREA)
- Computational Linguistics (AREA)
- Biomedical Technology (AREA)
- Mathematical Physics (AREA)
- Artificial Intelligence (AREA)
- Health & Medical Sciences (AREA)
- Manipulator (AREA)
- Stored Programmes (AREA)
- Toys (AREA)
Abstract
Description
Claims
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015514502A JP6319772B2 (ja) | 2012-06-01 | 2013-05-30 | リアルタイムで行われる移動ロボットの脈絡ある挙動を生成する方法およびシステム |
EP13728694.4A EP2855105A1 (fr) | 2012-06-01 | 2013-05-30 | Systeme et procede pour generer des comportements contextuels d'un robot mobile executes en temps reel |
CN201380037538.3A CN104470686B (zh) | 2012-06-01 | 2013-05-30 | 用于生成被实时地执行的移动机器人的上下文行为的系统和方法 |
US14/404,924 US20150290807A1 (en) | 2012-06-01 | 2013-05-30 | System and method for generating contextual behaviours of a mobile robot executed in real time |
BR112014030043A BR112014030043A2 (pt) | 2012-06-01 | 2013-05-30 | sistema e processo para gerar comportamentos contextuais de um robô móvel executados em tem po real |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1255105A FR2991222B1 (fr) | 2012-06-01 | 2012-06-01 | Systeme et procede pour generer des comportements contextuels d'un robot mobile executes en temps reel |
FR1255105 | 2012-06-01 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013178741A1 true WO2013178741A1 (fr) | 2013-12-05 |
Family
ID=47080621
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2013/061180 WO2013178741A1 (fr) | 2012-06-01 | 2013-05-30 | Systeme et procede pour generer des comportements contextuels d'un robot mobile executes en temps reel |
Country Status (7)
Country | Link |
---|---|
US (1) | US20150290807A1 (fr) |
EP (1) | EP2855105A1 (fr) |
JP (1) | JP6319772B2 (fr) |
CN (1) | CN104470686B (fr) |
BR (1) | BR112014030043A2 (fr) |
FR (1) | FR2991222B1 (fr) |
WO (1) | WO2013178741A1 (fr) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6594646B2 (ja) * | 2015-04-10 | 2019-10-23 | ヴイストン株式会社 | ロボット及びロボット制御方法並びにロボットシステム |
JP6781545B2 (ja) * | 2015-12-28 | 2020-11-04 | ヴイストン株式会社 | ロボット |
JP6604912B2 (ja) * | 2016-06-23 | 2019-11-13 | 日本電信電話株式会社 | 発話動作提示装置、方法およびプログラム |
US20180133900A1 (en) * | 2016-11-15 | 2018-05-17 | JIBO, Inc. | Embodied dialog and embodied speech authoring tools for use with an expressive social robot |
CN108932167B (zh) * | 2017-05-22 | 2023-08-08 | 中兴通讯股份有限公司 | 一种智能问答同步显示方法、装置、系统及存储介质 |
JP6956562B2 (ja) * | 2017-08-10 | 2021-11-02 | 学校法人慶應義塾 | 知能ロボットシステム及びプログラム |
US11325263B2 (en) * | 2018-06-29 | 2022-05-10 | Teradyne, Inc. | System and method for real-time robotic control |
US11153238B2 (en) * | 2019-01-08 | 2021-10-19 | Snap Inc. | Dynamic application configuration |
CN110543144B (zh) * | 2019-08-30 | 2021-06-01 | 天津施格自动化科技有限公司 | 图形化编程控制机器人的方法及系统 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2009124951A1 (fr) | 2008-04-09 | 2009-10-15 | Aldebaran Robotics | Architecture de controle - commande d'un robot mobile utilisant des membres articules |
FR2946160A1 (fr) * | 2009-05-26 | 2010-12-03 | Aldebaran Robotics | Systeme et procede pour editer et commander des comportements d'un robot mobile. |
WO2011003628A2 (fr) | 2009-07-10 | 2011-01-13 | Aldebaran Robotics S.A | Systeme et procede pour generer des comportements contextuels d'un robot mobile |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2496160A1 (fr) * | 1980-12-11 | 1982-06-18 | Lamothe Andre | Raccord etanche pour l'utilisation des outils conventionnels de forage en circulation inverse |
JPH07261820A (ja) * | 1994-03-25 | 1995-10-13 | Nippon Telegr & Teleph Corp <Ntt> | 産業用ロボット作業のソフトウェア構成方法及び制御装置 |
JP4366617B2 (ja) * | 1999-01-25 | 2009-11-18 | ソニー株式会社 | ロボット装置 |
JP4670136B2 (ja) * | 2000-10-11 | 2011-04-13 | ソニー株式会社 | オーサリング・システム及びオーサリング方法、並びに記憶媒体 |
GB2385954A (en) * | 2002-02-04 | 2003-09-03 | Magenta Corp Ltd | Managing a Virtual Environment |
US7995090B2 (en) * | 2003-07-28 | 2011-08-09 | Fuji Xerox Co., Ltd. | Video enabled tele-presence control host |
JP4744847B2 (ja) * | 2004-11-02 | 2011-08-10 | 株式会社安川電機 | ロボット制御装置およびロボットシステム |
JP2009025224A (ja) * | 2007-07-23 | 2009-02-05 | Clarion Co Ltd | ナビゲーション装置、および、ナビゲーション装置の制御方法 |
WO2011011084A1 (fr) * | 2009-07-24 | 2011-01-27 | Modular Robotics Llc | Robotique modulaire |
US8260460B2 (en) * | 2009-09-22 | 2012-09-04 | GM Global Technology Operations LLC | Interactive robot control system and method of use |
DE102010004476A1 (de) * | 2010-01-13 | 2011-07-14 | KUKA Laboratories GmbH, 86165 | Verfahren und Vorrichtung zum Kontrollieren einer Roboterapplikation |
-
2012
- 2012-06-01 FR FR1255105A patent/FR2991222B1/fr not_active Expired - Fee Related
-
2013
- 2013-05-30 JP JP2015514502A patent/JP6319772B2/ja active Active
- 2013-05-30 BR BR112014030043A patent/BR112014030043A2/pt not_active Application Discontinuation
- 2013-05-30 CN CN201380037538.3A patent/CN104470686B/zh not_active Expired - Fee Related
- 2013-05-30 US US14/404,924 patent/US20150290807A1/en not_active Abandoned
- 2013-05-30 EP EP13728694.4A patent/EP2855105A1/fr not_active Ceased
- 2013-05-30 WO PCT/EP2013/061180 patent/WO2013178741A1/fr active Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2009124951A1 (fr) | 2008-04-09 | 2009-10-15 | Aldebaran Robotics | Architecture de controle - commande d'un robot mobile utilisant des membres articules |
FR2946160A1 (fr) * | 2009-05-26 | 2010-12-03 | Aldebaran Robotics | Systeme et procede pour editer et commander des comportements d'un robot mobile. |
WO2011003628A2 (fr) | 2009-07-10 | 2011-01-13 | Aldebaran Robotics S.A | Systeme et procede pour generer des comportements contextuels d'un robot mobile |
Non-Patent Citations (1)
Title |
---|
ALAN ATHERTON J ET AL: "Visual robot choreography for clinicians", COLLABORATION TECHNOLOGIES AND SYSTEMS (CTS), 2011 INTERNATIONAL CONFERENCE ON, IEEE, 23 May 2011 (2011-05-23), pages 186 - 189, XP032000487, ISBN: 978-1-61284-638-5, DOI: 10.1109/CTS.2011.5928685 * |
Also Published As
Publication number | Publication date |
---|---|
FR2991222A1 (fr) | 2013-12-06 |
EP2855105A1 (fr) | 2015-04-08 |
JP2015525137A (ja) | 2015-09-03 |
JP6319772B2 (ja) | 2018-05-09 |
CN104470686B (zh) | 2017-08-29 |
US20150290807A1 (en) | 2015-10-15 |
FR2991222B1 (fr) | 2015-02-27 |
BR112014030043A2 (pt) | 2017-06-27 |
CN104470686A (zh) | 2015-03-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2013178741A1 (fr) | Systeme et procede pour generer des comportements contextuels d'un robot mobile executes en temps reel | |
FR2963132A1 (fr) | Robot humanoide dote d'une interface de dialogue naturel, methode d'utilisation et de programmation de ladite interface | |
US10929759B2 (en) | Intelligent robot software platform | |
FR2947923A1 (fr) | Systeme et procede pour generer des comportements contextuels d'un robot mobile | |
KR102306624B1 (ko) | 지속적 컴패니언 디바이스 구성 및 전개 플랫폼 | |
FR2989209A1 (fr) | Robot apte a integrer des dialogues naturels avec un utilisateur dans ses comportements, procedes de programmation et d'utilisation dudit robot | |
CN105917404B (zh) | 用于实现数字个人助理的方法、设备和系统 | |
US9292957B2 (en) | Portable virtual characters | |
EP3053162B1 (fr) | Procede de dialogue entre une machine, telle qu'un robot humanoïde, et un interlocuteur humain, produit programme d'ordinateur et robot humanoïde pour la mise en oeuvre d'un tel procede | |
KR102001293B1 (ko) | 로봇 상의 소프트웨어 애플리케이션 실행하기 | |
US20100333037A1 (en) | Dioramic user interface having a user customized experience | |
WO2010136427A1 (fr) | Systeme et procede pour editer et commander des comportements d'un robot mobile | |
EP1605420A2 (fr) | Système de formation à l'exploitation, l'utilisation ou la maintenance d'un cadre de travail dans un environnement de realité virtuelle | |
Feng et al. | A platform for building mobile virtual humans | |
Alonso et al. | A flexible and scalable social robot architecture employing voice assistant technologies | |
EP4378638A2 (fr) | Procédé de commande d'une pluralité d'effecteurs d'un robot | |
Geraci | Design and implementation of embodied conversational agents | |
Hahkio | Service robots’ feasibility in the hotel industry: A case study of Hotel Presidentti | |
US20210042639A1 (en) | Converting nonnative skills for conversational computing interfaces | |
Caballero | Emotion detection, processing and response through human-machine interaction system modelling | |
Lleonsí Carrillo | Development of a teaching assistance application for SoftBank Pepper | |
Correia | Modular Prototype of a Smart Mirror with Voice Interaction | |
Zhao et al. | SotA report Smart Space DIY application creation and interaction design | |
WO2006061308A1 (fr) | Procédé d'animation temporelle d'un avatar à partir d'un signal source comprenant des informations d'aiguillage, dispositif, programme d'ordinateur, moyen de stockage et signal source correspondants. | |
GIL | GONÇALVES CORREIA |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 13728694; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2015514502; Country of ref document: JP; Kind code of ref document: A |
| NENP | Non-entry into the national phase | Ref country code: DE |
| WWE | Wipo information: entry into national phase | Ref document number: 2013728694; Country of ref document: EP |
| WWE | Wipo information: entry into national phase | Ref document number: 14404924; Country of ref document: US |
| REG | Reference to national code | Ref country code: BR; Ref legal event code: B01A; Ref document number: 112014030043; Country of ref document: BR |
| ENP | Entry into the national phase | Ref document number: 112014030043; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20141201 |