EP4265377A1 - Robot and driving method thereof - Google Patents

Robot and driving method thereof Download PDF

Info

Publication number
EP4265377A1
EP4265377A1
Authority
EP
European Patent Office
Prior art keywords
motion
identifier
robot
input
state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP21948555.4A
Other languages
German (de)
English (en)
Other versions
EP4265377A4 (fr)
Inventor
Seonah Nam
Myeongsang Yu
Yusun Lee
Jiyeon Lee
Dain CHUNG
Jangwon Lee
Jinsung Kim
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020210083874A external-priority patent/KR20230001225A/ko
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Publication of EP4265377A1 publication Critical patent/EP4265377A1/fr
Publication of EP4265377A4 publication Critical patent/EP4265377A4/fr
Pending legal-status Critical Current

Links

Images

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/0003 Home robots, i.e. small robots for domestic use
    • B25J9/10 Programme-controlled manipulators characterised by positioning means for manipulator elements
    • B25J9/12 Programme-controlled manipulators characterised by positioning means for manipulator elements electric
    • B25J9/126 Rotary actuators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • B25J11/00 Manipulators not otherwise provided for
    • B25J11/0005 Manipulators having means for high-level communication with users, e.g. speech generator, face recognition means
    • B25J11/008 Manipulators for service tasks
    • B25J13/00 Controls for manipulators
    • B25J13/006 Controls for manipulators by means of a wireless system for controlling one or several manipulators
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/18 Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/4155 Numerical control [NC] characterised by programme execution, i.e. part programme or machine function execution, e.g. selection of a programme
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/50 Machine tool, machine tool null till machine tool work handling
    • G05B2219/50391 Robot

Definitions

  • Various embodiments of the disclosure relate to a robot for controlling motion input processing in consideration of a motion state of the robot and a driving method thereof.
  • a robot called a 'telepresence robot' is a robot that can generally be remote-controlled via an electronic device such as a smartphone, by which a user of the electronic device can perform both video and voice communications with a robot user.
  • when the robot provides motions corresponding to video/voice in addition to video/voice communications, the interactive experience between users, such as e.g., vividness, friendly feeling, and fun, can be further enhanced. In this instance, a motion input based on a user input such as e.g., video, voice, or the like has to be processed in line with the motion driven by the robot.
  • Various embodiments of the disclosure provide a robot capable of adjusting motion input processing in consideration of whether a motion state of the robot is an active state or an idle state, when the robot drives a motion corresponding to a user input, and a driving method thereof.
  • a robot may include at least one motor driving the robot to perform a predetermined motion; a memory storing a motion map database and a program comprising one or more instructions; and at least one processor electrically connected to the at least one motor and the memory, the at least one processor being configured to execute the one or more instructions of the program stored in the memory to: obtain an input motion identifier based on a user input, identify a motion state indicating whether the robot is performing a motion, based on the motion state being in an active state, store the input motion identifier in the memory, and based on the motion state being in an idle state: determine an active motion identifier from at least one motion identifier stored in the memory based on a predetermined criterion; and control the at least one motor to drive a motion corresponding to the active motion identifier based on the motion map database.
  • an electronic device may include an input/output interface configured to obtain a user input; a communication interface configured to receive a motion state corresponding to either one of an active state or an idle state from a robot; a memory storing a program comprising one or more instructions; and at least one processor electrically connected to the input/output interface, the communication interface, and the memory, wherein the at least one processor may be configured to execute the one or more instructions of the program stored in the memory to: obtain an input motion identifier based on the user input; based on the motion state being in the active state, store the input motion identifier in the memory; and based on the motion state being in the idle state: determine an active motion identifier from at least one motion identifier stored in the memory based on a predetermined criterion; and transmit the active motion identifier to the robot using the communication interface.
  • a method of driving a robot may include obtaining an input motion identifier based on a user input; identifying a motion state indicating whether the robot is in an active state or an idle state; based on the motion state being in the active state, storing the input motion identifier; and based on the motion state being in the idle state: determining an active motion identifier from at least one stored motion identifier based on a predetermined criterion; and driving a motion corresponding to the active motion identifier based on a motion map database.
  • a method of operating an electronic device may include receiving from a robot a motion state indicating one of an active state or an idle state; obtaining an input motion identifier based on a user input; based on the motion state being in the active state, storing the input motion identifier; and based on the motion state being in the idle state: determining an active motion identifier from at least one stored motion identifier based on a predetermined criterion; and transmitting the active motion identifier to the robot.
  • with a robot according to various embodiments, it is possible to further enhance the user experience, such as e.g., vividness, friendly feeling, concentration, fun, and learning effect, by enabling the robot to provide a motion corresponding to a user input such as e.g., voice, text, images, emoticons, and gestures. Further, providing a motion corresponding to the user input in the robot makes it possible to further facilitate communications between its users through the robot.
  • a robot can drive a motion corresponding to a motion identifier obtained based on a user input: when the robot is in an active motion state, the obtained motion identifier is stored, and when the robot is in an idle motion state, a motion identifier is determined from at least one stored motion identifier based on a predetermined criterion and the corresponding motion is driven, thereby allowing the motion input processing to be controlled in consideration of the robot's motion state. Accordingly, even when the motion input speed is faster than the motion driving speed of the robot, errors caused by colliding motion commands can be avoided in advance, and users can obtain a more natural experience, as if a human were performing the motion.
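  • By way of illustration only, the following minimal Python sketch shows one way the motion-state-aware handling summarized above could be organized on the robot side; the class and method names (MotionInputController, on_motion_input, on_motion_finished) and the callback structure are assumptions for this sketch, not the patent's implementation. Identifiers arriving while a motion is active accumulate in a queue, and a single identifier is selected and driven once the robot becomes idle.

```python
from collections import Counter, deque


class MotionInputController:
    """Sketch: gate motion inputs on the robot's motion state (active vs. idle)."""

    def __init__(self, motion_map_db, criterion="most_recent"):
        self.motion_map_db = motion_map_db  # motion identifier -> motion record
        self.pending = deque()              # identifiers stored while a motion is active
        self.criterion = criterion          # predetermined criterion
        self.active = False                 # True while at least one motor is being driven

    def on_motion_input(self, input_motion_id):
        """Called whenever a motion identifier is obtained from a user input."""
        if self.active:
            # Active motion state: store the identifier instead of driving it immediately.
            self.pending.append(input_motion_id)
        else:
            self._drive(input_motion_id)

    def on_motion_finished(self):
        """Called when all motors stop, i.e. the motion state becomes idle."""
        self.active = False
        if self.pending:
            active_id = self._select_active_identifier()
            self.pending.clear()
            self._drive(active_id)

    def _select_active_identifier(self):
        if self.criterion == "most_frequent":
            return Counter(self.pending).most_common(1)[0][0]
        return self.pending[-1]  # most recently stored

    def _drive(self, motion_id):
        record = self.motion_map_db[motion_id]  # at least one set of motor values per timeframe
        self.active = True
        # ...hand the per-timeframe motor values in `record` to the motor controllers...
```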
  • FIG. 1 shows a robot driving environment including a robot, an electronic device, and a network connecting the robot and the electronic device to each other according to various embodiments.
  • the robot driving environment may include a networked robot 110 and an electronic device 120.
  • the electronic device 120 or the robot 110 may obtain a user input.
  • the user input may include, for example, voice, text, image, emoticon, and gesture.
  • the electronic device 120 or the robot 110 may obtain a text based on the user input and perform a natural language process for the obtained text to obtain a motion identifier for performing a motion.
  • the robot 110 and the electronic device 120 may perform voice and/or video communications with each other through their applications, respectively.
  • the robot 110 may output a voice and/or a video provided from the electronic device 120.
  • the electronic device 120 may obtain the motion identifier based on the voice and/or the video.
  • the robot 110 may receive the motion identifier from the electronic device 120.
  • the robot 110 may drive a motion corresponding to the motion identifier.
  • the robot 110 or the electronic device 120 may adjust motion input processing based on the user input in consideration of a motion state of the robot.
  • the electronic device 120 may include a terminal capable of performing the computing and communication functions, and the like.
  • the electronic device 120 may be a desktop computer, a smartphone, a notebook computer, a tablet PC, a smart TV, a mobile phone, a personal digital assistant (PDA), a laptop, a media player, a micro server, a global positioning system (GPS) device, an e-book terminal, a digital broadcasting terminal, a navigation system, a kiosk, an MP3 player, a digital camera, a home appliance, or other mobile or non-mobile computing devices, but is not limited thereto.
  • PDA personal digital assistant
  • GPS global positioning system
  • the electronic device 120 may be a wearable terminal of e.g., a watch, glasses, a hair band, a ring and so on, capable of performing the computing and communication functions.
  • the electronic device 120 may be one of various types of terminals without limiting to the above description.
  • the network connecting the robot 110 and the electronic device 120 may be a short-range communication network such as e.g., Bluetooth, Wireless Fidelity (Wi-Fi), Zigbee, or Infrared Data Association (IrDA), or a long-range communication network such as e.g., a cellular network, a next-generation communication network, the Internet, or a computer network (e.g., LAN or WAN).
  • the cellular network may include, for example, Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), 5G, Long Term Evolution (LTE), and LTE-Advanced (LTE-A).
  • the network may include a connection of network elements such as e.g., hubs, bridges, routers, switches and gateways.
  • the network may include one or more connected networks, for instance, a multiple network environment inclusive of a public network such as Internet and a private network such as a secured enterprise private network. Access to the network may be provided via one or more wired or wireless access networks.
  • the network may support an Internet of Things (IoT) network for exchanging and processing information between distributed components of various things.
  • FIG. 2 is a diagram showing an appearance of a robot (for example, the robot 110 of FIG. 1 ) according to an embodiment.
  • the robot 200 may include a camera 210, a display 220, a neck 230, an arm 240, and a torso 250.
  • the camera 210 may capture an image around the robot 200.
  • the camera 210 may take a picture of a person (e.g., a child) around the robot 200.
  • the electronic device 120 may monitor people around the robot 200 through the robot 200.
  • the camera 210 may include one or more cameras and may be located on the display 220, but the number and location of the cameras 210 are not limited thereto.
  • the display 220 may display a predetermined facial expression (such as e.g., a doll-like face), when the robot 200 does not communicate with other devices.
  • the display 220 may output a predetermined screen.
  • the predetermined screen may include a user video or a user image corresponding to the user of the electronic device 120, but is not limited thereto.
  • the display 220 may output an image received from the electronic device 120.
  • the robot 200 may include at least one connection part (i.e., a joint) in the neck 230, the arm 240, and the torso 250, and each connection part may have at least one degree of freedom (DOF).
  • the degree of freedom may refer to the degree of freedom in kinematics or inverse kinematics.
  • the degree of freedom may imply the minimum number of variables required to determine the position and posture of each joint.
  • each joint in a three-dimensional space consisting of the x-axis, y-axis, and z-axis may have at least one of three degrees of freedom (position on each axis) that determine a spatial position and three degrees of freedom (angle of rotation about each axis) that determine a spatial posture.
  • each connection part may include at least one motor.
  • each degree of freedom may be implemented by each motor, or a certain number of degrees of freedom may be implemented by a single motor.
  • when the connection part of the neck 230 in the robot 200 is rotatable about two axes (e.g., in a back-and-forth direction and a side-to-side direction), the connection part of the neck 230 may have two degrees of freedom.
  • when each of the shoulder connection part and the elbow connection part is rotatable about two axes, each of the shoulder connection part and the elbow connection part may have two degrees of freedom.
  • when the torso 250 connection part is rotatable about one axis, the torso 250 connection part may have one degree of freedom.
  • the number and location of the connection parts and their respective degrees of freedom are not limited to the above examples.
  • FIG. 3 conceptually illustrates a robot driving method including the user input, the motion identifier determination, the motion matching, and the robot operation according to an embodiment.
  • the robot driving method may obtain a user input from the electronic device 120 or the robot 110 and determine a motion identifier based on the obtained user input.
  • the user input may include, for instance, voice, text, image, emoticon, gesture and the like.
  • the electronic device 120 or the robot 110 may obtain at least one text through at least one of voice recognition, image recognition, emoticon recognition, and gesture recognition, based on the user input. For example, when the user input is of a voice, the electronic device 120 or the robot 110 may convert a signal, which is obtained from the outside by a user's utterance, into an electrical audio signal, and then obtain at least one text sequentially recognized from the converted audio signal.
  • the electronic device 120 or the robot 110 may obtain at least one character string (or keyword) corresponding to each motion from the obtained at least one text through natural language processing.
  • natural language processing is a technology that allows a machine such as the electronic device 120 or the robot 110 to understand human language, and may include splitting the natural language according to its grammar.
  • the electronic device 120 or the robot 110 may determine each motion identifier based on the obtained respective character string.
  • the motion identifier may identify each motion that the robot 110 can drive.
  • the motion identifier may be the obtained character string or a numeric code corresponding to the obtained character string, but it is not limited thereto and may have various formats.
  • the robot driving method may obtain from a motion map database at least one set of motor values capable of driving a motion corresponding to the determined motion identifier, and drive the motion using the obtained at least one set of motor values.
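  • As a hedged illustration of the flow in FIG. 3, the Python sketch below maps recognized text to motion identifiers by simple keyword matching; the keyword table (cf. Table 1 in the FIG. 4 description below) and the function name are hypothetical stand-ins for the speech recognition and natural language processing steps described above.

```python
import re

# Hypothetical keyword-to-identifier table (cf. Table 1 below).
KEYWORD_TO_MOTION_ID = {
    "hello": 0,
    "i see": 1,
    "haha": 2,
    "i love you": 3,
}


def motion_identifiers_from_text(text):
    """Return the motion identifiers of all known keywords found in the text, in order."""
    lowered = text.lower()
    hits = []
    for keyword, motion_id in KEYWORD_TO_MOTION_ID.items():
        for match in re.finditer(re.escape(keyword), lowered):
            hits.append((match.start(), motion_id))
    return [motion_id for _, motion_id in sorted(hits)]


print(motion_identifiers_from_text("Hello! I love you"))  # [0, 3]
```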
  • FIG. 4 illustrates a conceptual diagram of obtaining a motion record corresponding to a motion identifier from a motion map database according to an embodiment.
  • the robot 110 may use the motion identifier to obtain a motion record from the motion map database, in order to drive the motion corresponding to the obtained motion identifier, based on a user input from the electronic device 120 or the robot 110.
  • the electronic device 120 or the robot 110 may obtain a voice input from a user, obtain at least one text based on the obtained voice input, and perform the natural language process onto the obtained at least one text, so as to obtain at least one character string (or keyword) corresponding to each motion.
  • the electronic device 120 or the robot 110 may determine each motion identifier based on the obtained each character string.
  • the motion identifier may identify each motion that the robot 110 can drive.
  • the motion identifier may be the obtained character string or a numeric code corresponding to the obtained character string, but it is not limited thereto and may have various formats.
  • Table 1 shows motion identifiers in the form of a character string or a numeric code corresponding to the character string in the example shown in FIG. 3 .
  • Table 1:
    Character String | Numeric Code
    Hello (410) | 0
    I see (420) | 1
    Haha (430) | 2
    I love you (440) | 3
  • the robot 110 may receive the motion identifier from the electronic device 120.
  • the robot 110 may store the motion map database, and the motion map database may be set by the robot 110 or an external device to be provided to the robot 110. The operation of setting the motion map database will be described later with reference to FIG. 5 .
  • the robot 110 may obtain a motion record corresponding to the motion identifier from the motion map database by using the motion identifier.
  • FIG. 4 shows the motion records 450 to 480 in the motion map database, corresponding to the motion identifiers 410 to 440 disclosed by way of examples in the above Table 1.
  • Each motion record, corresponding to a motion capable of being driven by at least one motor, may include a motion identifier and at least one set of motor values for each motion timeframe.
  • Each motion record may further include information indicating a motion speed for each motion timeframe. For example, when a motion is driven at a motion speed corresponding to either one of acceleration, constant velocity, or deceleration in each of the motion timeframes, the motion record may further include information indicating the motion speed corresponding to one of acceleration, constant velocity, or deceleration for each of the motion timeframes.
  • Table 2 below exemplarily discloses a motion record corresponding to the motion identifier having the character string 'Hello' or the numeric code '0' in the examples of Table 1.
  • in Table 2, a connection part (joint) of each of the left arm, the right arm, the neck, and the torso has one degree of freedom, and the motion record corresponds to a scenario in which the one degree of freedom of each connection part is implemented by one motor.
  • the motors each driving the left arm, the right arm, the neck and the torso may drive the left arm, the right arm, the neck and the torso by 0 degree, 0 degree, 0 degree and 0 degree, respectively, for a time period from 0 second to 1 second, with respect to one axis; drive the left arm, the right arm, the neck, and the torso by 0 degree, 60 degrees, 0 degree, and 0 degree, respectively, for a time period from 1 second to 2 seconds; and drive the left arm, the right arm, the neck, and the torso by 0 degree, 0 degree, 0 degree and 0 degree, respectively, for a time period from 2 seconds to 2.5 seconds.
  • Table 2:
    Motion Identifier | At least one set of motor values for each motion timeframe {time, left arm, right arm, neck, torso}
    0 (or 'Hello') | {1 sec., 0 deg., 0 deg., 0 deg., 0 deg.}, {2 sec., 0 deg., 60 deg., 0 deg., 0 deg.}, {2.5 sec., 0 deg., 0 deg., 0 deg., 0 deg.}
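  • The sketch below illustrates, under stated assumptions, how a motion record like the one in Table 2 could be represented and replayed per timeframe; the Keyframe structure, the per-timeframe speed labels, and the set_joint_targets/wait_until callbacks are illustrative and not taken from the patent.

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Keyframe:
    end_time: float          # end of the motion timeframe, in seconds
    angles_deg: List[float]  # target angles for {left arm, right arm, neck, torso}
    speed: str               # "acceleration", "constant" or "deceleration"


# Motion record for identifier 0 ('Hello'), following Table 2.
MOTION_MAP_DB = {
    0: [
        Keyframe(1.0, [0, 0, 0, 0], "acceleration"),
        Keyframe(2.0, [0, 60, 0, 0], "constant"),
        Keyframe(2.5, [0, 0, 0, 0], "deceleration"),
    ]
}


def drive_motion(motion_id: int,
                 set_joint_targets: Callable[[List[float], str], None],
                 wait_until: Callable[[float], None]) -> None:
    """Step through the record, commanding one set of motor values per timeframe."""
    for frame in MOTION_MAP_DB[motion_id]:
        set_joint_targets(frame.angles_deg, frame.speed)  # hand targets to the motor controllers
        wait_until(frame.end_time)                        # hold until the timeframe ends
```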
  • FIG. 5 is a diagram of setting a motion map database according to an embodiment.
  • the electronic device 120 or the robot 110 may set a motion map database through an application.
  • the electronic device 120 or the robot 110 may obtain a user input to set a motion of the robot and a character string corresponding to the set motion through the interface.
  • the electronic device 120 or the robot 110 may output a robot image, set a motion of the robot through conversion of the robot image and outputting of the converted image, based on a user input such as e.g., dragging or clicking the robot image, and receive a character string (e.g., 'Hello') corresponding to the set motion (e.g., 'Hello') of the robot from the user.
  • the electronic device 120 or the robot 110 may obtain a motion identifier based on the character string.
  • the motion identifier may identify each motion that the robot 110 can drive.
  • the motion identifier may be the obtained character string or a numeric code corresponding to the obtained character string, but it is not limited thereto and may have various formats.
  • the electronic device 120 or the robot 110 may obtain at least one set of motor values for each motion timeframe corresponding to the set motion.
  • the electronic device 120 or the robot 110 may include an inverse kinematics module using a Jacobian inverse matrix, and may map Cartesian coordinate information corresponding to the set motion to joint coordinate information of the robot 110, using the inverse kinematics module.
  • the electronic device 120 or the robot 110 may obtain at least one set of motor values for each motion timeframe corresponding to the set motion, from the Cartesian coordinate information corresponding to the set motion.
  • the electronic device 120 or the robot 110 may store a record in the motion map database, wherein the record includes the motion identifier and at least one set of motor values for each motion timeframe.
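  • The following sketch shows one hypothetical way to build such a record: a Jacobian-pseudoinverse step (one common realization of an inverse kinematics module using a Jacobian inverse matrix) maps Cartesian targets to joint angles, which are then stored per motion timeframe. The forward_kinematics and jacobian functions are assumed to be supplied by the robot model, joint angles are assumed to be in radians, and the record layout is illustrative.

```python
import numpy as np


def ik_step(q, target_pos, forward_kinematics, jacobian, gain=0.5):
    """One Jacobian-pseudoinverse update toward a Cartesian target position.

    q: joint angles in radians; forward_kinematics(q) -> end-effector position (3,);
    jacobian(q) -> 3xN Jacobian. Both are assumed to come from the robot model.
    """
    error = np.asarray(target_pos, dtype=float) - forward_kinematics(q)
    dq = gain * np.linalg.pinv(jacobian(q)) @ error
    return q + dq


def build_motion_record(motion_id, cartesian_keyframes, forward_kinematics, jacobian, q0,
                        iterations=100):
    """Convert Cartesian keyframes {end_time: position} into per-timeframe motor values."""
    q = np.asarray(q0, dtype=float)
    keyframes = []
    for end_time, target_pos in sorted(cartesian_keyframes.items()):
        for _ in range(iterations):  # iterate the IK step toward this keyframe's target
            q = ik_step(q, target_pos, forward_kinematics, jacobian)
        keyframes.append({"time": end_time, "motor_values_deg": np.degrees(q).tolist()})
    return {"motion_identifier": motion_id, "keyframes": keyframes}
```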
  • when the motion map database is set by an external device, the robot 110 may receive the motion map database from that device.
  • FIG. 6 is a schematic block diagram of an electronic device according to an embodiment.
  • an electronic device 600 may include a processor 610, a memory 620, a communication interface 640, and/or an input/output interface 650. At least one of a microphone (MIC) 681, a speaker (SPK) 682, a camera (CAM) 683, or a display (DPY) 684 may be connected to the input/output interface 650.
  • the memory 620 may include a program 630 including one or more instructions.
  • the program 630 may include an operating system (OS) program and an application program.
  • the electronic device 600 may include any additional components in addition to the illustrated components, or omit at least one of the illustrated components, as circumstances demand.
  • the communication interface 640 may provide an interface for communication with other systems or devices.
  • the communication interface 640 may include a network interface card or a wireless transmission/reception unit for enabling communications through an external network.
  • the communication interface 640 may perform signal processing for accessing a wireless network.
  • the wireless network may include, for example, at least one of a wireless LAN or a cellular network (e.g., LTE (Long Term Evolution) network).
  • the input/output interface 650 may detect an input from the outside (e.g., a user) and provide data corresponding to the detected input to the processor 610.
  • the input/output interface 650 may include at least one hardware module to detect an input from the outside.
  • the at least one hardware module may include, for example, at least one of a sensor, a keyboard, a key pad, a touch pad, or a touch panel.
  • the input/output interface 650 may be coupled to the display 684 to provide a touch screen.
  • the input/output interface 650 may provide the processor 610 with data about a user's touch input such as e.g., tap, press, pinch, stretch, slide, swipe, rotate, or the like.
  • the display 684 may perform functions to output information in the form of numbers, characters, images, and/or graphics.
  • the display 684 may include at least one hardware module for outputting.
  • the at least one hardware module may include, for instance, at least one of a Liquid Crystal Display (LCD), a Light Emitting Diode (LED), a Light Emitting Polymer Display (LPD), an Organic Light Emitting Diode (OLED), an Active Matrix Organic Light Emitting Diode (AMOLED), or a Flexible LED (FLED).
  • the display 684 may display a screen corresponding to data received from the processor 610.
  • the display 684 may be referred to as an 'output unit', a 'display unit', or other terms having technical meaning equivalent thereto.
  • the microphone 681 which may be electrically coupled to the processor 610 through the input/output interface 650, may convert an audible signal input from the outside due to a user's utterance into an electrical audio signal.
  • the audio signal converted by the microphone 681 may be provided to the processor 610 through the input/output interface 650.
  • a component that may be electrically coupled to the processor 610 through the input/output interface 650, in addition to the microphone 681, may be at least one of the speaker 682 or the camera 683.
  • the speaker 682 may convert the electrical audio signal provided from the processor 610 through the input/output interface 650 into an audible signal that humans can hear, and then output the audible signal.
  • the camera 683 may capture a subject in response to a control from the processor 610, and convert the captured image into an electrical signal to be provided to the processor 610 through the input/output interface 650.
  • the memory 620 may store data such as e.g., a program 630 including one or more instructions or setting information.
  • the program 630 may include an operating system program corresponding to a basic program for the overall operation of the electronic device 600 and one or more application programs for supporting various functions.
  • the memory 620 may be composed of a volatile memory, a non-volatile memory, or a combination of a volatile memory and a non-volatile memory.
  • the memory 620 may provide the stored data according to a request of the processor 610.
  • the processor 610 may use the program 630 stored in the memory 620 to execute the operation or data processing required for the control and/or communication of at least one other components in the electronic device 600.
  • the processor 610 may include, for example, at least one of a central processing unit (CPU), a graphics processing unit (GPU), a micro controller unit (MCU), a sensor hub, a supplementary processor, a communication processor, an application processor, an application specific integrated circuit (ASIC) or field programmable gate arrays (FPGA), and may have multiple cores.
  • CPU central processing unit
  • GPU graphics processing unit
  • MCU micro controller unit
  • FPGA field programmable gate arrays
  • the processor 610 may process data obtained through the input/output interface 650 or control operation states of various input and/or output means through the input/output interface 650.
  • the various input and/or output means may be, for example, at least one of a microphone (MIC) 681, a speaker (SPK) 682, a camera (CAM) 683, or a display (DPY) 684.
  • the processor 610 may transmit and/or receive signals through the communication interface 640.
  • the input/output interface 650 may obtain a user input.
  • the user input may include, for example, voice, text, image, emoticon, and/or gesture.
  • the communication interface 640 may receive a motion state corresponding to one of an active state or an idle state from the robot 110.
  • the motion state may be received from the robot 110 when the motion state is changed in the robot 110, received from the robot 110 in response to a motion identifier transmission, or received from the robot 110 as a response to a motion state request of the electronic device 600.
  • the active motion state may mean a state in which at least one motor of the robot 110 is being driven
  • the idle motion state may mean a state in which none of the motors is being driven in the robot 110.
  • the processor 610 may obtain a motion identifier based on the user input.
  • the processor 610 may obtain a text based on the user input and perform a natural language process for the obtained text to obtain at least one character string (or keyword) corresponding to each motion.
  • the processor 610 may determine each motion identifier based on the obtained each character string.
  • the motion identifier may identify each motion that the robot 110 can drive.
  • the motion identifier may be either the obtained character string or a numeric code corresponding to the obtained character string, but it is not limited thereto and may have various formats.
  • the processor 610 may store the obtained motion identifier in the memory 620, when the motion state received from the robot 110 is the active state.
  • the processor 610 may not store the motion identifier in the memory 620, based on a preset storage option.
  • the preset storage option may be set to a value indicating either 'Enable' or 'Disable'.
  • the processor 610 may skip the obtained motion identifier without storing it in the memory 620, so as to adjust motion input processing in consideration of the motion state of the robot.
  • when the motion state received from the robot 110 is the idle state, the processor 610 may determine a motion identifier from the at least one stored motion identifier based on a predetermined criterion.
  • the processor 610 may control to transmit the determined motion identifier to the robot, using the communication interface 640.
  • the communication interface 640 may transmit the determined motion identifier to the robot.
  • the predetermined criterion may correspond to one of a motion identifier set based on an external input, the motion identifier obtained most frequently based on the user input, or the most recently stored motion identifier.
  • a predetermined motion identifier set based on the external input may be a motion identifier determined to be used the most frequently, based on various formats of external information.
  • the predetermined motion identifier set based on the external input may be of a null value. In this instance, when the motion state received from the robot 110 is the active state, the obtained motion identifier may be skipped in the electronic device 600, so that the electronic device 600 can adjust the motion input processing in consideration of the motion state of the robot.
  • the processor 610 may store the obtained motion identifier in the memory 620 when the robot is in an active motion state, and when the robot is in an idle motion state, determine a motion identifier from the at least one stored motion identifier based on a predetermined criterion and control the communication interface 640 to transmit the determined motion identifier to the robot 110, thereby adjusting the motion input processing in consideration of the motion state of the robot.
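  • A minimal sketch of the selection step on the electronic device side is given below, assuming the three criteria named above (a preset identifier from an external input, the most frequently obtained identifier, or the most recently stored identifier); the function signature is illustrative, not the patent's API.

```python
from collections import Counter


def select_active_identifier(stored_ids, criterion="most_recent", preset_id=None):
    """Pick the identifier to transmit once the robot reports an idle motion state.

    stored_ids: identifiers accumulated while the robot's motion state was active.
    criterion:  "preset", "most_frequent" or "most_recent" (the predetermined criterion).
    preset_id:  identifier set from an external input; None means that inputs received
                during the active state are simply skipped (nothing is transmitted).
    """
    if criterion == "preset":
        return preset_id
    if not stored_ids:
        return None
    if criterion == "most_frequent":
        return Counter(stored_ids).most_common(1)[0][0]
    return stored_ids[-1]  # most recently stored


# Example: three inputs arrived while the robot was driving its previous motion.
print(select_active_identifier([2, 3, 3], criterion="most_frequent"))  # 3
```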
  • FIG. 7 is a schematic block diagram of a robot according to an embodiment.
  • the robot 700 may include a processor 710, a memory 720, a motor 750, a communication interface 760, and/or an input/output interface 770. At least one of a microphone (MIC) 781, a speaker (SPK) 782, a camera (CAM) 783, and a display (DPY) 784 may be connected to the input/output interface 770.
  • the memory 720 may store a program 730 including one or more instructions and a motion map database 740.
  • the program 730 may include an operating system (OS) program and at least one application program.
  • the robot 700 may include additional components in addition to the illustrated components, or may omit at least one of the illustrated components as required.
  • the communication interface 760 may provide an interface for communication with other systems or devices.
  • the communication interface 760 may include a network interface card or a wireless transmission/reception unit enabling communication through an external network.
  • the communication interface 760 may perform signal processing for accessing a wireless network.
  • the wireless network may include, for example, at least one of a wireless LAN or a cellular network (e.g., Long Term Evolution (LTE)).
  • the input/output interface 770 may detect an input from the outside (e.g., a user) and provide data corresponding to the detected input to the processor 710.
  • the input/output interface 770 may include at least one hardware module for detecting an input from the outside.
  • the at least one hardware module may include, for example, at least one of a sensor, a keyboard, a key pad, a touch pad, or a touch panel.
  • the input/output interface 770 may be coupled to the display 784 to provide a touch screen.
  • the input/output interface 770 may provide the processor 710 with data about a user's touch input such as, for example, tap, press, pinch, stretch, slide, swipe, rotate or the like.
  • the display 784 may perform functions to output information in the form of numbers, characters, images, and/or graphics.
  • the display 784 may include at least one hardware module for outputting.
  • the at least one hardware module may include, for example, at least one of a Liquid Crystal Display (LCD), a Light Emitting Diode (LED), a Light Emitting Polymer Display (LPD), an Organic Light Emitting Diode (OLED), an Active Matrix Organic Light Emitting Diode (AMOLED), or a Flexible LED (FLED).
  • the display 784 may display a screen corresponding to data received from the processor 710.
  • the display 784 may be referred to as an 'output unit', a 'display unit', or other terms having technical meaning equivalent thereto.
  • the microphone 781 that may be electrically coupled to the processor 710 through the input/output interface 770 may convert an audible signal input from the outside due to a user's utterance into an electrical audio signal.
  • the audio signal converted by the microphone 781 may be provided to the processor 710 through the input/output interface 770.
  • Any other component that may be electrically coupled to the processor 710 through the input/output interface 770, in addition to the microphone 781, may be at least one of a speaker 782 or a camera 783.
  • the speaker 782 may convert the electrical audio signal provided from the processor 710 through the input/output interface 770 into an audible signal that a person can hear and then output the audible signal.
  • the camera 783 may capture a subject in response to a control from the processor 710, convert the captured image into an electrical signal, and provide the captured image to the processor 710 through the input/output interface 770.
  • the memory 720 may store a program 730 including one or more instructions and a motion map database 740.
  • the memory 720 may further store data such as, for example, the setting information.
  • the program 730 may include an operating system program corresponding to a basic program for the operation of the robot 700 and at least one application program supporting various functions.
  • the memory 720 may include a volatile memory, a non-volatile memory, or a combination of a volatile memory and a non-volatile memory.
  • the memory 720 may provide the stored data according to a request of the processor 710.
  • the processor 710 may use the program 730 stored in the memory 720 to execute operations or data processing relating to the control and/or communication of at least one other component in the robot 700.
  • the processor 710 may include, for instance, at least one of a central processing unit (CPU), a graphics processing unit (GPU), a micro-controller unit (MCU), a sensor hub, a supplementary processor, a communication processor, an application processor, an application specific integrated circuit (ASIC), or field programmable gate arrays (FPGA), and may have a plurality of cores.
  • the processor 710 may process data obtained through the input/output interface 770 or control the operating states of various input and/or output means through the input/output interface 770.
  • the various input and/or output means may be, for example, at least one of a microphone (MIC) 781, a speaker (SPK) 782, a camera (CAM) 783 or a display (DPY) 784.
  • the processor 710 may transmit and/or receive a signal through the communication interface 760.
  • the motor 750 may include at least one motor that drives the robot to perform a predetermined motion.
  • the motor 750 may be electrically connected to the processor 710 and may be controlled by the processor 710 to drive the motion of the robot.
  • the robot 700 may include at least one motor corresponding to each connection part.
  • each degree of freedom of the connection part may be implemented by each motor, or a predetermined number of degrees of freedom thereof may be implemented by one motor.
  • the processor 710 may obtain a motion identifier based on a user input.
  • the processor 710 may obtain the motion identifier determined based on a user input received through the input/output interface 770, or the communication interface 760 may receive from the electronic device 120 a motion identifier determined based on a user input at the electronic device 120, and the processor 710 may then obtain the received motion identifier from the communication interface 760.
  • the user input may include, for instance, voice, text, image, emoticon, and gesture.
  • the processor 710 may identify a motion state indicating whether the robot 700 is performing a motion. When the motion state is the active state, the processor 710 may store the obtained motion identifier in the memory 720.
  • the processor 710 may not store the motion identifier in the memory 720, based on a preset storage option.
  • the preset storage option may be set to a value indicating either 'Enable' or 'Disable'.
  • the processor 710 may skip the obtained motion identifier without storing it in the memory 720, thereby adjusting the motion input processing in consideration of the motion state of the robot.
  • when the motion state is the idle state, the processor 710 may determine a motion identifier from the at least one stored motion identifier based on a predetermined criterion, and control the motor 750 to drive the motion corresponding to the determined motion identifier, based on the motion map database 740.
  • the active motion state of the motion states may mean a state in which at least one motor of the robot 700 is being driven, and the idle motion state may mean a state in which none of the motors is being driven in the robot 700.
  • the motion map database may include, for each motion drivable by the at least one motor, a motion identifier and at least one set of motor values for each motion timeframe.
  • the predetermined criterion may correspond to one of a motion identifier set based on an external input, the motion identifier obtained most frequently based on the user input, or the most recently stored motion identifier.
  • the predetermined motion identifier set based on the external input may be a motion identifier determined to be used the most frequently based on various formats of external information.
  • the predetermined motion identifier set based on the external input may be of a null value. In this instance, when the robot 700 is in the active motion state, the obtained motion identifier may be skipped without being driven for a motion, thereby adjusting the motion input processing in consideration of the motion state of the robot.
  • the motor 750 may drive the motion at a motion speed corresponding to one of acceleration, constant velocity, or deceleration in each motion timeframe, based on at least one set of motor values for each motion timeframe, being obtained using the motion identifier from the motion map database 740. For example, the motor 750 may sequentially drive the motion at a motion speed of acceleration, constant velocity, or deceleration, thereby naturally driving the motion and therefore, enhancing the user experience.
  • the processor 710 may store the obtained motion identifier in the memory 720.
  • the processor 710 may store the obtained motion identifier in the memory 720, when the motion is being driven at a motion speed corresponding to one of constant velocity or deceleration. As another example, when the motion is being driven at a motion speed corresponding to deceleration, the processor 710 may store the obtained motion identifier in the memory 720. Through the above examples, the processor 710 may skip the obtained motion identifier without storing it in the memory 720 when the motion is being driven in an initial operation section (e.g., acceleration period) of the entire operation period of the motion (i.e., a plurality of timeframes), thereby adjusting the motion input processing.
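  • A small sketch of this speed-phase gating, with a hypothetical function name and an 'Enable'/'Disable' storage option, is given below; the set of allowed phases is an assumption consistent with the examples above.

```python
def should_store_input(current_speed_phase,
                       storage_option="Enable",
                       allowed_phases=("constant velocity", "deceleration")):
    """Decide whether an identifier arriving during an active motion should be stored.

    current_speed_phase: "acceleration", "constant velocity" or "deceleration" for the
    timeframe currently being driven. Identifiers arriving during the initial
    acceleration period are skipped, as are all identifiers when storage is disabled.
    """
    if storage_option != "Enable":
        return False
    return current_speed_phase in allowed_phases


print(should_store_input("acceleration"))   # False -> the identifier is skipped
print(should_store_input("deceleration"))   # True  -> the identifier is stored
```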
  • the robot 700 may generate a motion map database 740.
  • the input/output interface 770 may obtain a user input to set a motion of the robot, and a character string corresponding to the set motion.
  • the processor 710 may obtain a motion identifier based on the character string.
  • the processor 710 may obtain at least one set of motor values for each motion timeframe, corresponding to the set motion.
  • the processor 710 may store a record in the motion map database 740, wherein the record includes the motion identifier and at least one set of motor values for each of the motion timeframes.
  • FIG. 8 shows a schematic flowchart of a driving method in a robot according to an embodiment.
  • the robot 110 may obtain an input motion identifier based on a user input.
  • the robot 110 may receive the motion identifier corresponding to the user input from the electronic device 120 to obtain the input motion identifier.
  • the robot 110 may obtain the user input using the interface and determine the input motion identifier based on the obtained user input.
  • the user input may include, for example, voice, text, image, emoticon, and gesture.
  • the robot 110 may identify a motion state indicating whether the robot 110 is performing a motion.
  • the robot 110 may determine whether the motion state is the active state. If the motion state is the active state, an operation 840 may be performed, and if the motion state is not the active state (i.e., idle state), an operation 850 may be performed.
  • the robot 110 may store the input motion identifier.
  • the robot 110 may determine an active motion identifier from at least one stored motion identifier, based on a predetermined criterion.
  • the predetermined criterion may correspond to one of a motion identifier set based on an external input, the motion identifier obtained most frequently based on a user input, or the most recently stored motion identifier.
  • the robot 110 may drive the motion corresponding to the active motion identifier, based on the motion map database.
  • the motion map database may include at least one record corresponding to each motion, wherein the at least one record includes the motion identifier and at least one set of motor values for each motion timeframe.
  • the robot 110 may drive the motion at a motion speed corresponding to one of acceleration, constant velocity, or deceleration in each of the motion timeframes, based on at least one set of motor values for each motion timeframe.
  • the operation of storing the input motion identifier (operation 840) may be an operation of storing the input motion identifier, when the motion is being driven at a predetermined motion speed.
  • when the robot 110 is driving the motion at a motion speed corresponding to one of constant velocity or deceleration, the input motion identifier may be stored.
  • the robot 110 may skip the input motion identifier without storing it into the memory, when the motion is being driven in the initial motion section (e.g., acceleration period) of the entire operation period of the motion, thereby adjusting the motion input processing.
  • the robot 110 may set the motion map database.
  • the robot 110 may set a motion of the robot, obtain a character string corresponding to the set motion, and obtain a motion identifier based on the character string.
  • the robot 110 may obtain at least one set of motor values for each motion timeframe, corresponding to the set motion.
  • the robot 110 may store a record in the motion map database, wherein the record includes the motion identifier and at least one set of motor values for each of the motion timeframes.
  • FIG. 9 shows a schematic flowchart of a method of operating an electronic device according to an embodiment.
  • the electronic device 120 may receive a motion state corresponding to one of an active state or an idle state from the robot 110.
  • the operation 910 may be performed before or after the operation 920.
  • the electronic device 120 may obtain an input motion identifier based on a user input.
  • the electronic device 120 may determine whether the motion state is the active state. If the motion state is the active state, an operation 940 may be performed, and if the motion state is not the active state (i.e., the idle state), an operation 950 may be performed.
  • the electronic device 120 may store the input motion identifier.
  • the electronic device 120 may determine a motion identifier from at least one stored motion identifier, based on a predetermined criterion.
  • the predetermined criterion may correspond to one of a motion identifier set based on an external input, the most frequently obtained motion identifier, or the most recently stored motion identifier.
  • the electronic device 120 may transmit the determined motion identifier to the robot 110.
  • the electronic device 600, the robot 700, and the programs executed by the electronic device 600 and the robot 700, as described throughout the disclosure, may be implemented with hardware components, software components, and/or a combination of hardware and software components.
  • the program may be executed by any system capable of executing computer readable instructions.
  • the software may include computer programs, codes, instructions, or a combination of one or more of these, and may configure a processing unit to operate as desired or command the processing unit either independently or collectively.
  • the software may be implemented as a computer program including instructions stored in a computer-readable storage medium.
  • the computer-readable recording medium may include, for example, a magnetic storage medium (e.g., floppy disk, hard disk, etc.), a solid-state storage medium (e.g., read-only memory (ROM), random-access memory (RAM), etc.), an optical-readable storage medium (e.g., CD-ROM, Digital Versatile Disc (DVD), etc.) or the like.
  • the computer-readable recording medium may be distributed over network-connected computer systems, so that the computer-readable codes can be stored and executed in a distributed manner.
  • the medium may be readable by a computer, stored in a memory, and executed on a processor.
  • the computer-readable storage medium may be provided in the form of a non-transitory storage medium.
  • 'non-transitory' may merely imply that the storage medium does not include a signal and is tangible, and it does not distinguish whether data is stored semi-permanently or temporarily in the storage medium.
  • the program according to embodiments may be contained in a computer program product.
  • the computer program products may be traded between sellers and buyers as commodities.
  • the computer program product may include a software program and a computer-readable storage medium in which the software program is stored.
  • the computer program product may include a product (e.g., a downloadable application) in the form of a software program distributed electronically via a manufacturer of such a device or an electronic market (e.g., Google™ Play Store, App Store).
  • the storage medium may be a server of the manufacturer, a server of the electronic market, or a storage medium of a relay server temporarily storing a software program.
  • the computer program product may include a storage medium of the server or a storage medium of the device.
  • the computer program product may include a storage medium of the third device.
  • the computer program product may include the software program itself transmitted from the server to the device or the third device or transmitted from the third device to the device.
  • one of the server, the device and the third device may execute the computer program product to perform the methods according to the embodiments.
  • two or more of the server, the device, and the third device may execute the computer program product to implement the methods according to the disclosed embodiments in a distributed manner.
  • the server may execute the computer program product stored in the server to control a device communicatively connected with the server to perform the methods according to the embodiments.
  • the third device may execute the computer program product to control the device communicatively connected with the third device to perform the methods according to the embodiments.
  • the third device may download the computer program product from the server and execute the downloaded computer program product.
  • the third device may execute the computer program product provided in a pre-loaded state to perform the methods according to the embodiments.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Human Computer Interaction (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Manufacturing & Machinery (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Manipulator (AREA)
  • Toys (AREA)
EP21948555.4A 2021-06-28 2021-10-13 Robot et son procédé de commande Pending EP4265377A4 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020210083874A KR20230001225A (ko) 2021-06-28 2021-06-28 로봇 및 그 구동 방법
PCT/KR2021/014100 WO2023277255A1 (fr) 2021-06-28 2021-10-13 Robot et son procédé de commande

Publications (2)

Publication Number Publication Date
EP4265377A1 true EP4265377A1 (fr) 2023-10-25
EP4265377A4 EP4265377A4 (fr) 2024-07-10

Family

ID=84543663

Family Applications (1)

Application Number Title Priority Date Filing Date
EP21948555.4A Pending EP4265377A4 (fr) 2021-06-28 2021-10-13 Robot et son procédé de commande

Country Status (3)

Country Link
US (1) US11731262B2 (fr)
EP (1) EP4265377A4 (fr)
CN (1) CN116847957A (fr)

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20020008848A (ko) 2000-03-31 2002-01-31 이데이 노부유끼 로봇 장치, 로봇 장치의 행동 제어 방법, 외력 검출 장치및 외력 검출 방법
KR100396754B1 (ko) 2000-08-18 2003-09-02 엘지전자 주식회사 전자 우편을 이용한 완구형 로봇 구동장치 및 방법
JP2002283259A (ja) 2001-03-27 2002-10-03 Sony Corp ロボット装置のための動作教示装置及び動作教示方法、並びに記憶媒体
KR20030073408A (ko) 2002-03-11 2003-09-19 주식회사 보스텍 로봇 메일 서비스 방법
JP2003308142A (ja) 2002-04-17 2003-10-31 Seiko Epson Corp メッセージ処理システム、音声信号処理システム、メッセージ処理設備、メッセージ送信端末、音声信号処理設備、メッセージ処理プログラム、音声信号処理プログラム、設備用プログラム、端末用プログラム及びメッセージのデータ構造、並びにメッセージ処理方法、音声信号処理方法及びメッセージ生成方法
KR100498061B1 (ko) 2004-12-20 2005-07-01 주식회사 아이오. 테크 수신된 이메일에 포함된 동작 명령에 따라 동작을 하는홈로봇 시스템
KR20110041648A (ko) 2009-10-16 2011-04-22 모젼스랩(주) 로봇 구동을 위한 멀티미디어 콘텐츠가 저장되는 저장매체 및 로봇 구동을 위한 멀티미디어 콘텐츠의 생성 방법
KR101191978B1 (ko) 2010-04-26 2012-10-17 박상규 사용자 단말에 의해 제어 가능한 제어 유닛을 구비한 이동로봇 및 이를 활용한 홈 케어링 시스템
KR101568347B1 (ko) 2011-04-12 2015-11-12 한국전자통신연구원 지능형 로봇 특성을 갖는 휴대형 컴퓨터 장치 및 그 동작 방법
FR2989209B1 (fr) 2012-04-04 2015-01-23 Aldebaran Robotics Robot apte a integrer des dialogues naturels avec un utilisateur dans ses comportements, procedes de programmation et d'utilisation dudit robot
US9846843B2 (en) * 2013-10-30 2017-12-19 Georgia Tech Research Corporation Methods and systems for facilitating interactions between a robot and user
CN107921639B (zh) 2015-08-25 2021-09-21 川崎重工业株式会社 多个机器人系统间的信息共享系统及信息共享方法
KR20180119515A (ko) 2017-04-25 2018-11-02 김현민 스마트 휴대 기기를 이용한 스마트 기기와 로봇의 개인 맞춤형 서비스 운용 시스템 및 방법
TWI694904B (zh) 2017-10-05 2020-06-01 國立交通大學 機器人語音操控系統及方法
JP7147167B2 (ja) 2017-12-28 2022-10-05 富士通株式会社 制御プログラム、制御方法及び情報処理装置
WO2019180916A1 (fr) 2018-03-23 2019-09-26 三菱電機株式会社 Dispositif de commande de robot
JP7119896B2 (ja) * 2018-10-24 2022-08-17 トヨタ自動車株式会社 コミュニケーションロボットおよびコミュニケーションロボットの制御プログラム
CN112809709B (zh) * 2019-01-25 2022-12-02 北京妙趣伙伴科技有限公司 机器人及其操作系统、控制装置、控制方法及存储介质
KR20210115068A (ko) 2019-02-11 2021-09-27 엘지전자 주식회사 액션 로봇용 단말기 및 그의 동작 방법
KR102301763B1 (ko) 2020-01-15 2021-09-16 한국과학기술연구원 이동로봇을 제어하기 위한 시스템 및 방법
KR102370873B1 (ko) 2020-08-07 2022-03-07 네이버랩스 주식회사 로봇 원격 제어 방법 및 시스템
JP2022187852A (ja) 2021-06-08 2022-12-20 株式会社Jvcケンウッド ロボットシステム及び遠隔操作方法

Also Published As

Publication number Publication date
US11731262B2 (en) 2023-08-22
CN116847957A (zh) 2023-10-03
US20220410368A1 (en) 2022-12-29
EP4265377A4 (fr) 2024-07-10

Similar Documents

Publication Publication Date Title
US20210295483A1 (en) Image fusion method, model training method, and related apparatuses
US9950431B2 (en) Interactive robot initialization
JP6129073B2 (ja) 自然な対話インターフェースを備えたヒューマノイドロボット、同ロボットを制御する方法、および対応プログラム
CN111124123A (zh) 基于虚拟机器人形象的语音交互方法及装置、车载设备智能控制系统
CN110291576A (zh) 基于触摸的操作系统的免提导航
US20150138333A1 (en) Agent Interfaces for Interactive Electronics that Support Social Cues
JP2023525173A (ja) レンダリングされたグラフィカル出力を利用する会話型aiプラットフォーム
CN106325228B (zh) 机器人的控制数据的生成方法及装置
US20190389075A1 (en) Robot system and robot dialogue method
WO2019057019A1 (fr) Procédé et dispositif d'interaction de robot
WO2020026850A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
US20240127564A1 (en) Interaction method and apparatus of virtual space, device, and medium
EP3529707A1 (fr) Dispositif compagnon émotionnellement intelligent
US20230324711A1 (en) Intelligent actuated and adjustable glasses nose pad arms
JP2023120130A (ja) 抽出質問応答を利用する会話型aiプラットフォーム
WO2016206646A1 (fr) Procédé et système pour pousser un dispositif de machine à générer une action
EP4265377A1 (fr) Robot et son procédé de commande
US20180126561A1 (en) Generation device, control method, robot device, call system, and computer-readable recording medium
WO2020153038A1 (fr) Dispositif de traitement d'informations et procédé de traitement d'informations
US20190329417A1 (en) Method for performing emotional gestures by a device to interact with a user
KR20230001225A (ko) 로봇 및 그 구동 방법
WO2022212144A1 (fr) Espaces contextuels définis par l'utilisateur
US20220393993A1 (en) Information processing apparatus, information processing system, information processing method, and program
US20230324713A1 (en) Intelligent actuated temple tips
US20230324714A1 (en) Intelligent actuated temple attachments

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20230718

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

A4 Supplementary search report drawn up and despatched

Effective date: 20240611

RIC1 Information provided on ipc code assigned before grant

Ipc: B25J 13/00 20060101ALI20240605BHEP

Ipc: B25J 11/00 20060101ALI20240605BHEP

Ipc: B25J 9/12 20060101ALI20240605BHEP

Ipc: B25J 9/16 20060101AFI20240605BHEP

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)