EP3787849A1 - Method for controlling a plurality of robot effectors - Google Patents
Method for controlling a plurality of robot effectors
Info
- Publication number
- EP3787849A1 (application number EP19734847.7A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- robot
- action
- effectors
- primitives
- rules
- Prior art date
- 2018-05-04
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Links
- 239000012636 effector Substances 0.000 title claims abstract description 26
- 238000000034 method Methods 0.000 title claims abstract description 26
- 230000009471 action Effects 0.000 claims abstract description 80
- 230000006870 function Effects 0.000 claims abstract description 31
- 230000008447 perception Effects 0.000 claims abstract description 19
- 230000004913 activation Effects 0.000 claims description 11
- 238000004891 communication Methods 0.000 claims description 10
- 230000001360 synchronised effect Effects 0.000 claims description 3
- 241000282414 Homo sapiens Species 0.000 description 36
- 230000008451 emotion Effects 0.000 description 10
- 230000003993 interaction Effects 0.000 description 10
- 230000006399 behavior Effects 0.000 description 7
- 238000004364 calculation method Methods 0.000 description 7
- 238000009833 condensation Methods 0.000 description 6
- 230000005494 condensation Effects 0.000 description 6
- 238000011282 treatment Methods 0.000 description 5
- 238000001514 detection method Methods 0.000 description 4
- 238000006073 displacement reaction Methods 0.000 description 4
- 230000008569 process Effects 0.000 description 4
- 230000004044 response Effects 0.000 description 4
- 241000282412 Homo Species 0.000 description 3
- 239000002131 composite material Substances 0.000 description 3
- 230000002996 emotional effect Effects 0.000 description 3
- 238000012545 processing Methods 0.000 description 3
- 230000001755 vocal effect Effects 0.000 description 3
- 230000008859 change Effects 0.000 description 2
- 230000000694 effects Effects 0.000 description 2
- 230000000763 evoking effect Effects 0.000 description 2
- 238000010191 image analysis Methods 0.000 description 2
- 238000007781 pre-processing Methods 0.000 description 2
- 230000005236 sound signal Effects 0.000 description 2
- 230000003213 activating effect Effects 0.000 description 1
- 238000004458 analytical method Methods 0.000 description 1
- 230000003542 behavioural effect Effects 0.000 description 1
- 230000002457 bidirectional effect Effects 0.000 description 1
- 210000004556 brain Anatomy 0.000 description 1
- 238000012512 characterization method Methods 0.000 description 1
- 230000001149 cognitive effect Effects 0.000 description 1
- 230000008094 contradictory effect Effects 0.000 description 1
- 208000029436 dilated pupil Diseases 0.000 description 1
- 238000000537 electroencephalography Methods 0.000 description 1
- 238000011156 evaluation Methods 0.000 description 1
- 230000008921 facial expression Effects 0.000 description 1
- 238000001914 filtration Methods 0.000 description 1
- 230000004807 localization Effects 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000007935 neutral effect Effects 0.000 description 1
- 238000004321 preservation Methods 0.000 description 1
- 238000007665 sagging Methods 0.000 description 1
- 238000004088 simulation Methods 0.000 description 1
- 230000000638 stimulation Effects 0.000 description 1
- 230000002123 temporal effect Effects 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/0084—Programme-controlled manipulators comprising a plurality of manipulators
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/0005—Manipulators having means for high-level communication with users, e.g. speech generator, face recognition means
- B25J11/001—Manipulators having means for high-level communication with users, e.g. speech generator, face recognition means with emotions simulating means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J15/00—Gripping heads and other end effectors
- B25J15/0052—Gripping heads and other end effectors multiple gripper units or multiple end effectors
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
Definitions
- the present invention relates to the field of communicating robots. More specifically, it relates to the management of the resources of said robot to control an operation of the communicating robot simulating an empathic intelligence, allowing it to interact with one or more humans through actions that are not merely servile (the term "robot" having been coined in 1920 by the Czech writer Karel Čapek, inspired by the word "robota" meaning "slave" in Slavic languages).
- Such a robot can take the form of a humanoid robot, an autonomous car or, more simply, a piece of equipment having a communicating interface allowing bidirectional interaction with one or more humans through multimodal messages (tactile, visual or sound) emitted and received by the robot.
- the European Patent EP1486300 B1 describes a behavior control system for a robot that operates autonomously, comprising: a plurality of behavior description sections for describing the movements of a machine body of said robot; an external environment recognition section for recognizing an external environment of said machine body;
- an internal state management section for managing an internal state of said robot in response to the recognized external environment and/or a result of executing a behavior;
- a behavior evaluation section for evaluating the performance of behaviors described in said behavior description sections in response to the external environment and/or the internal environment.
- the internal state management section manages emotions, each of which is an index of the internal state, in a hierarchical structure comprising a plurality of layers, and uses the emotions in a layer of primary emotions necessary for individual preservation and in another layer of secondary emotions that vary according to the excess or lack of primary emotions; it further divides the primary emotions into layers comprising an innate or physiological reflex layer and an associated layer dependent on the dimensions.
- European Patent EP1494210 describes a voice communication system with a function for having a conversation with a conversation partner, comprising: voice recognition means for recognizing the voice of the conversation partner; conversation control means for controlling the conversation with the conversation partner on the basis of a recognition result of the voice recognition means; image recognition means for recognizing a face of the conversation partner; and
- search control means for searching for the existence of the conversation partner on the basis of one or both of a recognition result of the image recognition means and a recognition result of the voice recognition means;
- the conversation control means continues the conversation when the speech content of the conversation partner, obtained as the recognition result of the voice recognition means, is identical to the expected response content, even if the search by the search control means fails.
- the solutions of the prior art dissociate the technical resources intended for language processing, on the one hand, from the technical resources intended for the recognition of the robot's environment by cameras and possibly other sensors and for the control of the robot's movements, on the other hand.
- the patent EP1486300 does not provide for taking into account the verbal dimension of the interaction between the human and the robot, and thus does not make it possible, for example, to carry out natural learning by the robot during its interaction with the human.
- Patent EP1494210, meanwhile, provides only for verbal interaction between the human and the robot, to drive artificial conversations.
- the object of the present invention is to respond to these drawbacks by proposing, in its most general sense, a method of controlling a plurality of effectors of a robot by a plurality of primitives consisting of parameterized coded functions, said primitives being conditionally activated by actions selected by an action selection system, said action selection system:
- the method is based on the association, at each step, of the coded objects with a sequence of characters corresponding to their semantic description, in particular: i. a semantic description of said coded objects stored in the memory, composed of a character string representing a perception function of the robot and another character string representing a perceived object; ii. a semantic description of said primitives, composed of a character string representing a possible action of the robot and another optional character string representing the optional parameters of this action; iii. a semantic description of said rules (102 to 106), composed of the combination of a character string representing the associated context and another character string representing the associated action.
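- As a purely illustrative sketch of the data model described above (the structure and field names are assumptions, not the patent's source code), the coded objects, primitives and rules, each carrying a character-string semantic description, could be represented in C++ as follows:

```cpp
// Illustrative sketch (assumed names) of the coded objects, primitives and
// rules described above, each carrying a character-string semantic description.
#include <string>
#include <utility>
#include <vector>

// i. Coded object stored in the memory (80): perception function + perceived object.
struct PerceivedObject {
    std::string perceptionFunction;   // e.g. "I SEE"
    std::string perceivedObject;      // e.g. "ONE MAN 1 METER DISTANCE AND 20 DEGREES ON THE RIGHT"
    std::vector<std::pair<std::string, std::string>> attributes; // e.g. {"AGE", "40"}
};

// ii. Action primitive: a possible action of the robot plus optional parameters.
struct Primitive {
    std::string action;               // e.g. "SEND A MESSAGE"
    std::string parameters;           // optional, e.g. "TO YOUR OWNER"
};

// iii. Rule (102 to 106): a context associated with an action, the action being
// realized by a tree of primitives (simplified here to a flat sequence).
struct Rule {
    std::string context;              // e.g. "YOU HEAR A NOISE EXCEEDING A LEVEL X"
    std::string action;               // e.g. "ACTIVATE A SOUND SIGNAL"
    std::vector<Primitive> primitives;
    double is = 0.0;                  // IS (Satisfaction Indicator), updated after each execution
};
```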
- said action selection system classifies the rules according to the proximity of the context of each of these rules to the context calculated from the content of the objects contained in the memory, and selects the actions associated with the rules relevant to the context.
- the method furthermore comprises steps of recording new rules associated with a new action, via a learning module, relying for example on voice recognition, gesture recognition or mimicry;
- the method comprises a step of calculating, for each rule, an IS parameter which is recalculated after each execution of the associated action, the priority order of the actions selected by the action selection system being determined according to the value of the IS parameter associated with each of the actions; the method further comprises steps of communication with other robots via a communication module and a shared server, the declarative memory of the robot's action selection system being periodically synchronized with the content of said shared server.
- the activation of the primitives by the actions is filtered by a resource manager guaranteeing the compatibility of the commands sent to the effectors;
- the method further comprises: i. steps of constantly updating, in the background, internal state variables VEIx according to the evolution of the objects stored in said memory (80); ii. steps of parameterizing said action primitives (131 to 134), said perception functions (72 to 75) and said action selection module (100) according to the values of said internal state variables VEIx.
- the invention also relates to a robot comprising a plurality of effectors controlled by a controller executing a method according to the invention.
- FIG. 1 represents a hardware architecture of the invention
- the robot comprises communication interface circuits (1) with sensors (2 to 5) for example:
- Each sensor (2 to 5) is associated with a physical "driver" circuit (12 to 15), integrated in the sensor or provided on the communication interface circuit (1).
- the communication interface circuit (1) combines the preprocessing circuits (22 to 25) of the signals supplied by the sensors (2 to 5) in order to transmit the information to the main computer (30) or to dedicated computers (31 to 33).
- Some preprocessing circuits (22 to 25) may be specialized circuits, for example image analysis circuits or speech recognition circuits.
- the dedicated computers (31 to 33) receive the signals from certain sensors (2 to 5) as well as commands from the main computer (30) in order to compute instructions, which are transmitted to preprocessing circuits (51 to 53) for the calculation of the action parameters of an effector (41 to 43). These parameters are exploited by interface circuits (61 to 63) to supply the effectors (41 to 43) with control signals, for example in the form of pulse-width-modulated (PWM) electrical signals.
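- As a minimal, assumed sketch of the last step above (the pulse range, period and function names are illustrative values, not taken from the patent), a normalized effector command can be converted into a pulse-width-modulated control value as follows:

```cpp
// Illustrative sketch: turning a normalized effector command (e.g. a joint
// position in [0, 1]) into a PWM pulse width and duty cycle. The 1-2 ms pulse
// range and the 20 ms period are assumed, typical servo values, not taken from
// the patent.
#include <algorithm>
#include <cstdio>

double commandToPulseWidthMs(double command) {
    command = std::clamp(command, 0.0, 1.0);
    return 1.0 + command;                 // 1.0 ms .. 2.0 ms pulse width
}

int main() {
    const double periodMs = 20.0;         // assumed PWM period
    double pulse = commandToPulseWidthMs(0.75);
    std::printf("pulse = %.2f ms, duty cycle = %.1f %%\n",
                pulse, 100.0 * pulse / periodMs);
    return 0;
}
```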
- the main computer (30) is also associated with a communication module (36) for exchanging data with external resources, for example the Internet.
- the effectors (41 to 43) are for example constituted by:
- FIG. 2 represents the functional architecture of an exemplary embodiment of the robot.
- the information from the sensors (2 to 5) is used to calculate, via perception functions (72 to 75), digital metadata constituting an image of the semantic representation that the robot makes of the world.
- Each of these perception functions (72 to 75) receives preprocessed data from one or more sensors (2 to 5) in order to calculate the metadata corresponding to a type of perception.
- a first perception function (72) calculates a distance between the robot and a detected object
- a second perception function (73) performs an image recognition to characterize the detected objects
- a third perception function (74) performs voice recognition to produce a character string corresponding to the sentence spoken by a user
- This digital metadata consists of objects coded in an object language, for example in C# or C++.
- This metadata includes some or all of the following: a semantic description in the form of a character string, e.g. "I SEE ONE MAN 1 METER DISTANCE AND 20 DEGREES ON THE RIGHT"
- object-specific attributes, for example, for a human: "AGE, SEX, SIZE, etc."
- the set of coded objects produced by the perception functions is stored in a memory (80) whose content expresses the representation of the world as perceived by the robot.
- processes (91 to 98) are applied in the background by condensation functions, which extract data characteristic of each of said objects and group together coded objects sharing the same characteristic data.
- a first condensation function performs face recognition processing on the detected objects of human type, from the following objects:
- object 1 also including a corresponding image file
- a second condensation function performs the processing of associating a person with a recognized sound, from the following objects:
- a third condensation function performs the association processing of the two objects, from the following recalculated objects:
- the condensation processes (91 to 98) are applied recursively to the contents of the memory (80) containing the robot's representation of the world.
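- A minimal sketch of such a condensation pass, under the assumption (made only for illustration) that each coded object exposes a characteristic datum such as a person identifier, could be:

```cpp
// Illustrative condensation sketch: group coded objects sharing the same
// characteristic datum (here a person identifier) and merge them into a single
// object. Structure and field names are assumptions made for illustration.
#include <map>
#include <string>
#include <vector>

struct CodedObject {
    std::string personId;        // characteristic datum extracted by the condensation function
    std::string description;     // semantic description, e.g. "I SEE ONE MAN ..."
};

// One condensation pass over the memory (80); it can be applied recursively.
std::vector<CodedObject> condense(const std::vector<CodedObject>& memory) {
    std::map<std::string, CodedObject> byPerson;
    for (const auto& obj : memory) {
        auto it = byPerson.find(obj.personId);
        if (it == byPerson.end())
            byPerson[obj.personId] = obj;                        // first object for this person
        else
            it->second.description += " AND " + obj.description; // merge semantic descriptions
    }
    std::vector<CodedObject> condensed;
    for (auto& kv : byPerson) condensed.push_back(kv.second);
    return condensed;
}
```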
- the system also includes an action selection system ("rule manager") (100), which includes a declarative memory (101) in which a rule library (102 to 106) is stored, associating a context (112 to 116) with an action (122 to 126).
- a first rule R1 is constituted by a numerical sequence of the type:
- An action is materialized by a numerical sequence designating a tree of primitives (131 to 134) executed by the effectors of the robot, such as:
- Action 1 "SEND A MESSAGE TO YOUR OWNER" corresponds to a unitary sequence consisting of a single primitive (131 to 134):
- the action selection system (100) classifies the rules (102 to 105) according to the proximity of the context (112 to 115) of each of these rules to the context calculated from the content of the objects (81 to 88) contained in the memory (80), by a distance calculation in the N-dimensional space of the context.
- This calculation periodically provides a subset (110) of rules (102 to 104) associated with actions (122 to 124).
- the list is optionally filtered according to a threshold value to form a subset of rules whose distance from the current context is less than this threshold value.
- This subset (110) is then ordered according to a Satisfaction Indicator (IS).
- IS Satisfaction Indicator
- each rule is assigned, when executed, a variable numerical parameter IS.
- the methods for determining this numerical parameter IS will be explained below.
- the subset (110) thus determined defines a set of ordered actions (122 to 124) associated with said rules (102 to 104) and used to control the operation of the effectors of the robot.
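- The selection mechanism described above (distance calculation in the context space, threshold filtering, ordering by the IS parameter) can be sketched as follows; the fixed-size numeric context vector and all names are assumptions made for illustration, not the patent's implementation:

```cpp
// Illustrative sketch of the action selection system (100): rules are filtered
// by the distance between their context and the current context computed from
// the memory (80), then ordered by decreasing IS parameter. The fixed-size
// numeric context vector is an assumption made for illustration.
#include <algorithm>
#include <array>
#include <cmath>
#include <cstddef>
#include <string>
#include <vector>

constexpr std::size_t N = 8;                 // assumed dimension of the context space

struct Rule {
    std::string action;
    std::array<double, N> context{};         // numeric encoding of the rule's context
    double is = 0.0;                         // IS (Satisfaction Indicator)
};

double distance(const std::array<double, N>& a, const std::array<double, N>& b) {
    double d2 = 0.0;
    for (std::size_t i = 0; i < N; ++i) d2 += (a[i] - b[i]) * (a[i] - b[i]);
    return std::sqrt(d2);
}

// Returns the subset (110) of rules whose context is closer to the current
// context than the threshold, ordered by decreasing IS parameter.
std::vector<Rule> selectActions(const std::vector<Rule>& rules,
                                const std::array<double, N>& currentContext,
                                double threshold) {
    std::vector<Rule> subset;
    for (const auto& r : rules)
        if (distance(r.context, currentContext) < threshold) subset.push_back(r);
    std::sort(subset.begin(), subset.end(),
              [](const Rule& a, const Rule& b) { return a.is > b.is; });
    return subset;
}
```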
- the execution of the actions (122 to 124) is performed via the activation of primitives (131 to 134), whose parameterization is determined by the content of said actions (amplitude of the movement, address of the sound sequence, intensity of the sound sequence, etc.).
- the primitives (131 to 134) designate meta-functions, resulting in a computer code whose execution is carried out by a set of commands (201 to 205) transmitted to software applications (211 to 215) or directly to effectors (41 to 43), optionally with parameters, for example:
- Each primitive (131 to 134) is parameterized, if necessary with:
- the activation of the primitives (131 to 134) is filtered by a resource manager (200) whose purpose is to prevent the sending of contradictory or impossible commands to the same effector (41). This filtering favors the activation of the primitives associated with the actions (122 to 124) having the highest IS parameter.
- the resource manager (200) inhibits the action having the lowest IS parameter, and the parameterized primitive actually executed is that resulting from the action having the highest IS parameter.
- the resource manager (200) calculates a new primitive from two primitives deemed incompatible, whose parameterization is calculated by weighting the settings of the two incompatible primitives according to the IS parameter associated with each of them, and which thus corresponds to a compromise.
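- A minimal sketch of this arbitration by the resource manager, assuming a single scalar setting per primitive (a simplification not made in the patent), could be:

```cpp
// Illustrative sketch of the resource manager (200): when two parameterized
// primitives target the same effector, either the one whose action has the
// lower IS parameter is inhibited, or a compromise primitive is computed by
// weighting the two settings by their IS parameters. The single scalar
// "setting" per primitive is a simplifying assumption.
#include <string>

struct ParameterizedPrimitive {
    std::string effector;   // effector targeted, e.g. "RIGHT_ARM"
    double setting;         // e.g. amplitude of the movement
    double is;              // IS parameter of the action that activated it
};

// Variant 1: keep only the primitive whose action has the highest IS parameter.
ParameterizedPrimitive inhibitLowest(const ParameterizedPrimitive& a,
                                     const ParameterizedPrimitive& b) {
    return (a.is >= b.is) ? a : b;
}

// Variant 2: compromise primitive whose setting is the IS-weighted average of
// the two incompatible settings.
ParameterizedPrimitive weightedCompromise(const ParameterizedPrimitive& a,
                                          const ParameterizedPrimitive& b) {
    ParameterizedPrimitive c = a;
    const double total = a.is + b.is;
    c.setting = (total > 0.0) ? (a.setting * a.is + b.setting * b.is) / total
                              : 0.5 * (a.setting + b.setting);
    c.is = (a.is > b.is) ? a.is : b.is;
    return c;
}
```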
- the recording of the rules (102 to 106) is performed by voice learning.
- a learning module (400) comprising a speech recognition and semantic analysis module analyzes the sentences pronounced by an operator, to extract actions and contexts defining a new rule (106).
- the module builds and records a rule associating the action "ACTIVATE A SOUND SIGNAL" with the context "YOU HEAR A NOISE EXCEEDING A LEVEL X".
- the learning module (400) also comprises kinetic recognition means for recognizing a gesture, for example the designation of an object.
- the learning module (400) further comprises an image analysis module, to enable learning by gestures combined with spoken words.
- when the human designates an object in the field of view of the robot's camera and pronounces the sentence "IF YOU SEE THIS #OBJECT, SEIZE IT", this leads to the recording of a rule constituted by the action "SEIZE"
- the learning module (400) comprises learning means by mimicry or recurrence: when the robot repeatedly records the same action associated with the same context, it triggers the recording of a new rule associating this action with this context.
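- The learning by recurrence described above can be sketched as follows; the repetition threshold and the data structures are assumed values chosen for illustration:

```cpp
// Illustrative sketch of learning by recurrence: when the same (context, action)
// pair has been observed a given number of times, a new rule is recorded in the
// declarative memory. The threshold of 3 repetitions is an assumed value.
#include <map>
#include <string>
#include <utility>
#include <vector>

struct Rule { std::string context, action; };

class RecurrenceLearner {
public:
    // Returns true when this observation has just triggered the recording of a new rule.
    bool observe(const std::string& context, const std::string& action,
                 std::vector<Rule>& declarativeMemory) {
        int& count = counts_[{context, action}];
        if (++count == kThreshold) {
            declarativeMemory.push_back({context, action});
            return true;
        }
        return false;
    }
private:
    static constexpr int kThreshold = 3;   // assumed number of repetitions
    std::map<std::pair<std::string, std::string>, int> counts_;
};
```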
- the combinations of actions and contexts are constructed according to the value of the complicity parameter VEI4 calculated for each pair of an action and a context, during interactions between the human and the robot.
- Each rule (102 to 106) present in the declarative memory (101) is associated with an IS (Satisfaction Indicator) parameter, which is recalculated after each execution of the associated action (122 to 126) as a function of the current value of the parameter VEI3.
- IS Satisfaction Indicator
- If the value of VEI3 is low, that is to say less than a reference value, then the IS parameter of the rule whose action has been executed is reduced.
- the value of the parameter IS is used to order the actions (122 to 126) and to promote, via the resource manager (200), resource access for the actions associated with the IS parameter having the highest value.
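- A minimal sketch of this update of the IS parameter, with an assumed reference value and update step (the patent gives no numerical values), could be:

```cpp
// Illustrative sketch: after the action of a rule has been executed, its IS
// parameter is recalculated from the current value of VEI3. The reference value
// of 0.5, the update step and the normalized range are assumptions.
#include <algorithm>

double updateSatisfactionIndicator(double is, double vei3) {
    const double reference = 0.5;    // assumed reference value for VEI3
    const double step = 0.1;         // assumed update step
    if (vei3 < reference)
        is -= step;                  // apparent satisfaction low: reduce IS
    else
        is += step;                  // apparent satisfaction high: increase IS
    return std::clamp(is, 0.0, 1.0);
}
```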
- the rules (102 to 106) are stored in the local memory of a robot.
- the declarative memory (101) of the robot's action selection system (100) is periodically synchronized with the contents of a server shared between a plurality of robots, via the communication module (36).
- VEIX Internal State Variables
- VEI1 is representative of the state of wakefulness or activation of the robot.
- the value 0 corresponds to inactivity of the robot, where the robot is almost immobile and silent, with a drowsy appearance
- the value 1 corresponds to an overexcitation state where the movements of the robot have a maximum amplitude, with a dilated pupil appearance for robots with an animated face, frequent and jerky movements ...
- the value of the VEI1 parameter is, for example, calculated according to the evolution of the robot's environment: in case of stability of the information perceived by the sensors (2 to 5) and absence of any human detected by the robot, the value will be low.
- VEI2 is representative of the state of surprise of the robot.
- the value 0 corresponds to a nominal activity of the robot
- the value 1 corresponds to punctual and abrupt variations of the effectors (sudden movements of the joints, sound level varying abruptly, etc.)
- the parameter VEI2 is, for example, calculated according to the temporal variations of the parameter VEI1, for example by calculating a derivative of VEI1.
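- As an illustrative sketch of these two computations (all constants are assumed values), VEI1 can be driven by the amount of change perceived by the sensors and VEI2 by a discrete derivative of VEI1:

```cpp
// Illustrative sketch: VEI1 (wakefulness/activation) driven by the amount of
// change in the perceived environment, VEI2 (surprise) as the magnitude of the
// temporal variation (discrete derivative) of VEI1. All constants are assumed.
#include <algorithm>
#include <cmath>

struct InternalState {
    double vei1 = 0.0;
    double vei2 = 0.0;
};

// environmentChange: normalized measure in [0, 1] of how much the sensor data
// changed since the previous cycle; humanPresent: whether a human is detected;
// dt: duration of the cycle in seconds.
void updateVei1Vei2(InternalState& s, double environmentChange, bool humanPresent, double dt) {
    const double target = std::clamp(environmentChange + (humanPresent ? 0.3 : 0.0), 0.0, 1.0);
    const double previous = s.vei1;
    const double tau = 2.0;                                   // assumed time constant (s)
    s.vei1 += (target - s.vei1) * std::min(dt / tau, 1.0);    // low-pass towards the target
    s.vei2 = std::clamp(std::fabs(s.vei1 - previous) / std::max(dt, 1e-3), 0.0, 1.0);
}
```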
- VEI3 is representative of the state of apparent satisfaction of the robot.
- the value 0 corresponds to a disappointed physical appearance of the robot (gaze directed mainly downwards, arms pointing downwards, etc.)
- the value 1 corresponds to a proud physical appearance of the robot (gaze directed mainly upwards, position of the joints giving a proud appearance, etc.)
- the parameter VEI3 is, for example, calculated according to certain interactions between the human and the robot, such as:
- VEI4 is representative of the state of complicity between the robot and the humans with whom it interacts.
- the value 0 corresponds to a low level of interaction and activation of the effectors
- the value 1 corresponds to a strong activation of the effectors, a tendency of the robot to mimic the human with whom it interacts, a tendency to move towards this human and an orientation of the robot's gaze towards this human.
- This parameter can be calculated by means of coded control steps consisting of stimulating the user and acquiring the effect induced on the user, in order to deduce the level of empathy from the conformity between the stimulus and the acquired image.
- the parameter VEI4 is, for example, calculated as a function of the level of mimicry between the human and the robot, that is to say the rate at which the human reproduces the movements of the robot, or responds to them, with a small time offset, or of certain interactions between the human and the robot, such as:
- VEI5 is representative of the state of joy of the robot.
- the value 0 corresponds to an appearance of sadness (downward gaze, drooping mouth area, neutral prosody, sagging joints, etc.)
- the value 1 corresponds to a strong activation of the effectors and a tendency to smile and the emission of appropriate sounds.
- the parameter VEI5 is, for example, calculated according to the detection of a smile on the face of the human and the result of a combination of the other parameters VEIX.
- the parameters VEIX are calculated by modules (181 to 185) from the data supplied by the module (80) determining the robot's representation of the world according to the data acquired by the set of sensors (2 to 5) and the information representative of the state of the robot, for the determination of the synchronisms between the robot and the human.
- the parameters VEI1 to VEI5 are also calculated according to internal criteria, independent of the robot's external environment. These criteria take into account, for example:
- the frequency of the variations of the VEI value weights the amplitude of the variations
- the result of these processing operations periodically provides updated values for the VEIX parameters, which modulate the perception functions (71 to 75) and the primitives (131 to 134), as well as the behavior selection module (100).
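- As a purely illustrative example of this modulation (the scaling law is an assumption), the VEI1 value can scale the amplitude and speed of a movement primitive:

```cpp
// Illustrative sketch: the periodically updated VEIx values modulate the
// parameterization of a movement primitive; here the wakefulness value VEI1
// scales the amplitude and speed. The scaling law is an assumption.
struct MovementParameters {
    double amplitude;   // requested amplitude of the displacement
    double speed;       // requested speed of the displacement
};

MovementParameters modulateByWakefulness(MovementParameters p, double vei1) {
    // VEI1 = 0: nearly immobile robot; VEI1 = 1: movements of maximum amplitude.
    const double scale = 0.1 + 0.9 * vei1;
    p.amplitude *= scale;
    p.speed *= scale;
    return p;
}
```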
- the sound file S is selected according to the values of the parameters VEI1 to VEI5.
- the calculation functions of the parameters VEIX may be determined by an initial characterization step consisting of recording experimental data obtained by a process of subjecting people to a plurality of stimuli corresponding to said perception functions and associating with each of said stimuli a level of perception [pleasure, satisfaction, arousal, surprise, ...].
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Automation & Control Theory (AREA)
- Health & Medical Sciences (AREA)
- Audiology, Speech & Language Pathology (AREA)
- General Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Manipulator (AREA)
- Numerical Control (AREA)
Abstract
Description
Claims
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP24171738.8A EP4378638A2 (en) | 2018-05-04 | 2019-04-26 | Method for controlling a plurality of effectors of a robot |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1853868A FR3080926B1 (en) | 2018-05-04 | 2018-05-04 | METHOD FOR CONTROLLING A PLURALITY OF EFFECTORS OF A ROBOT |
PCT/FR2019/050983 WO2019211552A1 (en) | 2018-05-04 | 2019-04-26 | Method for controlling a plurality of robot effectors |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP24171738.8A Division EP4378638A2 (en) | 2018-05-04 | 2019-04-26 | Method for controlling a plurality of effectors of a robot |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3787849A1 true EP3787849A1 (en) | 2021-03-10 |
Family
ID=62816784
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP19734847.7A Ceased EP3787849A1 (en) | 2018-05-04 | 2019-04-26 | Method for controlling a plurality of robot effectors |
EP24171738.8A Pending EP4378638A2 (en) | 2018-05-04 | 2019-04-26 | Method for controlling a plurality of effectors of a robot |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP24171738.8A Pending EP4378638A2 (en) | 2018-05-04 | 2019-04-26 | Method for controlling a plurality of effectors of a robot |
Country Status (6)
Country | Link |
---|---|
US (1) | US12011828B2 (en) |
EP (2) | EP3787849A1 (en) |
JP (1) | JP7414735B2 (en) |
CA (1) | CA3102196A1 (en) |
FR (1) | FR3080926B1 (en) |
WO (1) | WO2019211552A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11513795B2 (en) * | 2020-06-24 | 2022-11-29 | Dell Products L.P. | Systems and methods for firmware-based user awareness arbitration |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002283259A (en) * | 2001-03-27 | 2002-10-03 | Sony Corp | Operation teaching device and operation teaching method for robot device and storage medium |
JP2002361582A (en) | 2001-06-05 | 2002-12-18 | Sony Corp | Robot power supplying system and robot |
CN100509308C (en) * | 2002-03-15 | 2009-07-08 | 索尼公司 | Robot behavior control system, behavior control method, and robot device |
JP2004220458A (en) | 2003-01-17 | 2004-08-05 | Taku Obata | Automatic programming for machinery according to daily language and deductive inference engine |
JP4048492B2 (en) * | 2003-07-03 | 2008-02-20 | ソニー株式会社 | Spoken dialogue apparatus and method, and robot apparatus |
JP4560078B2 (en) | 2007-12-06 | 2010-10-13 | 本田技研工業株式会社 | Communication robot |
US20110178619A1 (en) * | 2007-12-21 | 2011-07-21 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Security-activated robotic tasks |
FR2963132A1 (en) | 2010-07-23 | 2012-01-27 | Aldebaran Robotics | HUMANOID ROBOT HAVING A NATURAL DIALOGUE INTERFACE, METHOD OF USING AND PROGRAMMING THE SAME |
US8971572B1 (en) * | 2011-08-12 | 2015-03-03 | The Research Foundation For The State University Of New York | Hand pointing estimation for human computer interaction |
FR2989209B1 (en) * | 2012-04-04 | 2015-01-23 | Aldebaran Robotics | ROBOT FOR INTEGRATING NATURAL DIALOGUES WITH A USER IN HIS BEHAVIOR, METHODS OF PROGRAMMING AND USING THE SAME |
WO2014138925A1 (en) * | 2013-03-15 | 2014-09-18 | Interaxon Inc. | Wearable computing apparatus and method |
EP2933067B1 (en) * | 2014-04-17 | 2019-09-18 | Softbank Robotics Europe | Method of performing multi-modal dialogue between a humanoid robot and user, computer program product and humanoid robot for implementing said method |
US10936050B2 (en) * | 2014-06-16 | 2021-03-02 | Honda Motor Co., Ltd. | Systems and methods for user indication recognition |
US9626001B2 (en) * | 2014-11-13 | 2017-04-18 | International Business Machines Corporation | Speech recognition candidate selection based on non-acoustic input |
JP2017220191A (en) | 2016-06-03 | 2017-12-14 | 洋彰 宮崎 | Artificial intelligence carrying out autonomous processing of problem by combination of procedures |
CN107030691B (en) * | 2017-03-24 | 2020-04-14 | 华为技术有限公司 | Data processing method and device for nursing robot |
US10456912B2 (en) * | 2017-05-11 | 2019-10-29 | King Fahd University Of Petroleum And Minerals | Dynamic multi-objective task allocation |
US10235128B2 (en) * | 2017-05-19 | 2019-03-19 | Intel Corporation | Contextual sound filter |
US11345040B2 (en) * | 2017-07-25 | 2022-05-31 | Mbl Limited | Systems and methods for operating a robotic system and executing robotic interactions |
US10511808B2 (en) * | 2018-04-10 | 2019-12-17 | Facebook, Inc. | Automated cinematic decisions based on descriptive models |
- 2018
- 2018-05-04 FR FR1853868A patent/FR3080926B1/en active Active
- 2019
- 2019-04-26 WO PCT/FR2019/050983 patent/WO2019211552A1/en active Application Filing
- 2019-04-26 EP EP19734847.7A patent/EP3787849A1/en not_active Ceased
- 2019-04-26 CA CA3102196A patent/CA3102196A1/en active Pending
- 2019-04-26 EP EP24171738.8A patent/EP4378638A2/en active Pending
- 2019-04-26 US US17/052,703 patent/US12011828B2/en active Active
- 2019-04-26 JP JP2020563488A patent/JP7414735B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
US20220009082A1 (en) | 2022-01-13 |
WO2019211552A1 (en) | 2019-11-07 |
FR3080926B1 (en) | 2020-04-24 |
FR3080926A1 (en) | 2019-11-08 |
JP2021523472A (en) | 2021-09-02 |
CA3102196A1 (en) | 2019-11-07 |
JP7414735B2 (en) | 2024-01-16 |
EP4378638A2 (en) | 2024-06-05 |
US12011828B2 (en) | 2024-06-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110291478B (en) | Driver Monitoring and Response System | |
JP6815486B2 (en) | Mobile and wearable video capture and feedback platform for the treatment of mental illness | |
CN106457563B (en) | Humanoid robot and method for executing dialogue between humanoid robot and user | |
Vinola et al. | A survey on human emotion recognition approaches, databases and applications | |
Zhang et al. | Intelligent facial emotion recognition and semantic-based topic detection for a humanoid robot | |
US9875445B2 (en) | Dynamic hybrid models for multimodal analysis | |
WO2017215297A1 (en) | Cloud interactive system, multicognitive intelligent robot of same, and cognitive interaction method therefor | |
US11113890B2 (en) | Artificial intelligence enabled mixed reality system and method | |
EP2834811A1 (en) | Robot capable of incorporating natural dialogues with a user into the behaviour of same, and methods of programming and using said robot | |
Raudonis et al. | Evaluation of human emotion from eye motions | |
CN110737335B (en) | Interaction method and device of robot, electronic equipment and storage medium | |
KR101984283B1 (en) | Automated Target Analysis System Using Machine Learning Model, Method, and Computer-Readable Medium Thereof | |
CN116188642A (en) | Interaction method, device, robot and storage medium | |
Maroto-Gómez et al. | Active learning based on computer vision and human–robot interaction for the user profiling and behavior personalization of an autonomous social robot | |
KR102396794B1 (en) | Electronic device and Method for controlling the electronic device thereof | |
EP3787849A1 (en) | Method for controlling a plurality of robot effectors | |
Yu | Robot behavior generation and human behavior understanding in natural human-robot interaction | |
CN115358365A (en) | Method, device, electronic equipment and storage medium for realizing general artificial intelligence | |
CN111971670B (en) | Generating a response in a dialog | |
US20240335952A1 (en) | Communication robot, communication robot control method, and program | |
Thompson et al. | Emotibot: An Interactive Tool for Multi-Sensory Affect Communication | |
WO2023017753A1 (en) | Learning device, learning method, and program | |
Lee et al. | Human robot social interaction framework based on emotional episodic memory | |
Chiba et al. | Cluster-based approach to discriminate the user’s state whether a user is embarrassed or thinking to an answer to a prompt | |
JP4355823B2 (en) | Information processing device for facial expressions |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: UNKNOWN |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
| 17P | Request for examination filed | Effective date: 20201104 |
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| AX | Request for extension of the european patent | Extension state: BA ME |
| RIN1 | Information on inventor provided before grant (corrected) | Inventor name: MONCEAUX, JEROME; Inventor name: HERVIER, THIBAULT; Inventor name: MASURELLE, AYMERIC |
| DAV | Request for validation of the european patent (deleted) | |
| DAX | Request for extension of the european patent (deleted) | |
| APBK | Appeal reference recorded | Free format text: ORIGINAL CODE: EPIDOSNREFNE |
| APBN | Date of receipt of notice of appeal recorded | Free format text: ORIGINAL CODE: EPIDOSNNOA2E |
| APBR | Date of receipt of statement of grounds of appeal recorded | Free format text: ORIGINAL CODE: EPIDOSNNOA3E |
| APAF | Appeal reference modified | Free format text: ORIGINAL CODE: EPIDOSCREFNE |
| APBT | Appeal procedure closed | Free format text: ORIGINAL CODE: EPIDOSNNOA9E |
| REG | Reference to a national code | Ref country code: DE; Ref legal event code: R003 |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
| 18R | Application refused | Effective date: 20240423 |