WO2016206642A1 - Method and apparatus for generating robot control data

Method and apparatus for generating robot control data

Info

Publication number
WO2016206642A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
interaction behavior
sensing unit
trigger condition
data
Application number
PCT/CN2016/087257
Other languages
English (en)
Chinese (zh)
Inventor
聂华闻
Original Assignee
北京贝虎机器人技术有限公司
Priority claimed from CN201510364661.7A external-priority patent/CN106325228B/zh
Priority claimed from CN201510363348.1A external-priority patent/CN106325065A/zh
Priority claimed from CN201510363346.2A external-priority patent/CN106325113B/zh
Application filed by 北京贝虎机器人技术有限公司
Publication of WO2016206642A1

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 - Programme-control systems
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F13/00 - Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G06F13/10 - Program control for peripheral devices

Definitions

  • the present invention relates to the field of robot technology, and in particular, to a method and an apparatus for generating control data of a robot.
  • Today's robots are mostly industrial robots, and most industrial robots have no perception capability.
  • The operating procedures of these robots are predefined, and they complete fixed tasks strictly according to the predetermined procedures; they lack adaptability and require the objects they handle to be uniform.
  • Embodiments of the invention provide a method and an apparatus for generating control data of a robot, so as to effectively improve at least the adaptability of the robot's interaction behavior and its degree of intelligence.
  • A method for generating control data of a robot includes: setting a trigger condition for controlling a robot interaction behavior according to one or more preset sensing units, where a sensing unit is a minimum unit configured for controlling the robot's interaction behavior; setting the interaction behavior triggered by the trigger condition according to one or more preset interaction behaviors set for the robot to execute; and generating, according to the set trigger condition and interaction behavior, a control entry for controlling the robot's interaction behavior in response to information perceived by the robot.
  • Setting a trigger condition for controlling a robot interaction behavior according to one or more preset sensing units includes: selecting at least one sensing unit from the preset sensing units; setting an attribute of the selected sensing unit, where the attribute of the sensing unit includes the value of the sensing unit; and setting the trigger condition for controlling the robot's interaction behavior according to the selected sensing unit and its attribute.
  • Setting the interaction behavior triggered by the trigger condition according to one or more preset interaction behaviors set for the robot to perform includes: selecting at least one interaction behavior from the preset interaction behaviors set for the robot to perform; setting an attribute of the selected interaction behavior, where the attribute of the interaction behavior includes one or more action instructions that the robot can parse and execute, together with the parameters of those action instructions; and setting the interaction behavior triggered by the trigger condition according to the selected interaction behavior and its attributes.
  • An apparatus for generating control data of a robot includes: a trigger condition setting module, configured to set a trigger condition for controlling a robot interaction behavior according to one or more preset sensing units, where a sensing unit is the smallest unit configured for controlling the robot's interaction behavior; an interaction behavior setting module, configured to set the interaction behavior triggered by the trigger condition according to one or more preset interaction behaviors set for the robot to perform; and a generating module, configured to generate, according to the set trigger condition and interaction behavior, a control entry for controlling the robot's interaction behavior in response to information perceived by the robot.
  • By defining the sensing unit as the minimum unit for controlling the robot's interaction behavior, defining the interaction behaviors, and generating control entries from sensing units and interaction behaviors to control the robot, the input and output standards of robot control are unified, so that non-technical personnel can also edit the robot's behavior, effectively improving the adaptability of the robot's interaction behavior and its degree of intelligence.
  • FIG. 1 illustrates a schematic structural view of a robot in accordance with some embodiments of the present invention
  • FIG. 2a illustrates a flow chart of a method of generating control data for a robot in accordance with some embodiments of the present invention
  • FIG. 2b illustrates a schematic diagram of a generation interface for robot control data in accordance with some embodiments of the present invention
  • FIG. 2c illustrates a schematic diagram of a generation interface for the interaction behavior in control data, in accordance with some embodiments of the present invention
  • FIG. 3 illustrates a first flowchart of a method of controlling robot interaction behavior in accordance with some embodiments of the present invention
  • FIG. 4 illustrates a second flowchart of a method of controlling robot interaction behavior in accordance with some embodiments of the present invention
  • FIG. 5 illustrates a third flowchart of a method of controlling robot interaction behavior in accordance with some embodiments of the present invention
  • FIG. 6 illustrates a structural block diagram of a device for generating control data of a robot according to some embodiments of the present invention.
  • FIG. 7 illustrates a block diagram of a trigger condition setting module in accordance with some embodiments of the present invention.
  • FIG. 8 illustrates a block diagram of another trigger condition setting module in accordance with some embodiments of the present invention.
  • FIG. 9 illustrates a structural block diagram of an interactive behavior setting module in accordance with some embodiments of the present invention.
  • Figure 10 illustrates a block diagram of another interactive behavior setting module in accordance with some embodiments of the present invention.
  • the robot 100 includes a memory 102, a memory controller 104, one or more processing units (CPUs) 106, a peripheral interface 108, a radio frequency (RF) circuit 114, an audio circuit 116, a speaker 118, a microphone 120, a sensing subsystem 122, an attitude sensor 132, a camera 134, a tactile sensor 136, one or more other sensing devices 138, and an external interface 140. These components communicate over one or more communication buses or signal lines 110.
  • The robot 100 shown is just one example; the robot may have more or fewer components than illustrated, or a different configuration of components.
  • the robot 100 can include one or more CPUs 106, memory 102, one or more sensing devices (e.g., the sensing devices described above), and one or more modules, programs, or instruction sets stored in the memory 102 for performing a robot interaction behavior control method.
  • the various components shown in FIG. 1 can be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
  • the robot 100 may be an electromechanical device having a biological shape (eg, a humanoid, etc.), or may be a smart device that does not have a biological appearance but has human characteristics (eg, language communication, etc.), the smart device may Mechanical devices are also included, as well as virtual devices implemented by software (eg, virtual chat bots, etc.).
  • the virtual chat bot can perceive information through the device in which it is located, and the device in which it is located includes electronic devices such as handheld electronic devices, personal computers, and the like.
  • Memory 102 can include high speed random access memory and can also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid state storage devices.
  • memory 102 may also include memory remote from the one or more CPUs 106, such as network-attached memory accessed via the RF circuitry 114 or the external interface 140 and a communication network (not shown), where the communication network can be the Internet, one or more intranets, a local area network (LAN), a wide area network (WAN), a storage area network (SAN), etc., or a suitable combination thereof.
  • Memory controller 104 can control access to memory 102 by other components of robot 100, such as CPU 106 and peripheral interface 108.
  • Peripheral interface 108 couples the input and output peripherals of the device to CPU 106 and memory 102.
  • the one or more processors 106 described above execute various software programs and/or sets of instructions stored in the memory 102 to perform various functions of the robot 100 and process the data.
  • peripheral interface 108, CPU 106, and memory controller 104 can be implemented on a single chip, such as chip 112. In some other embodiments, they may be implemented on multiple discrete chips.
  • the RF circuit 114 receives and transmits electromagnetic waves.
  • the RF circuit 114 converts an electrical signal into an electromagnetic wave, or converts the electromagnetic wave into an electrical signal, and communicates with the communication network and other communication devices via the electromagnetic wave.
  • the RF Circuitry 114 may include well-known circuitry for performing these functions, including but not limited to antenna systems, RF transceivers, one or more amplifiers, tuners, one or more oscillators, digital signal processors, CODEC chipsets, Subscriber Identity Module (SIM) card, memory, etc.
  • the RF circuit 114 can communicate with networks and other devices via wireless communication, such as the Internet (also referred to as the World Wide Web, WWW), an intranet, and/or a wireless network such as a cellular telephone network, a wireless local area network (WLAN), and/or a metropolitan area network (MAN).
  • the above wireless communication may use any of a variety of communication standards, protocols, and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), Wideband Code Division Multiple Access (W-CDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Wi-MAX, protocols for e-mail, instant messaging, and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
  • Audio circuitry 116, speaker 118, and microphone 120 provide an audio interface between the user and the robot 100.
  • Audio circuitry 116 receives audio data from peripheral interface 108, converts the audio data into electrical signals, and transmits the electrical signals to speaker 118.
  • the speaker transforms the electrical signal into a human audible sound wave.
  • Audio circuit 116 also receives electrical signals converted from sound waves by microphone 120.
  • the audio circuit 116 converts the electrical signals into audio data and transmits the audio data to the peripheral interface 108 for processing. Audio data may be retrieved from memory 102 and/or RF circuitry 114 by peripheral interface 108 and/or transmitted to memory 102 and/or RF circuitry 114.
  • a plurality of microphones 120 can be included, distributed at different locations; the direction from which a sound is emitted can be determined according to a predetermined strategy based on the microphones 120 at the different locations. It should be understood that the direction of the sound can also be identified by certain sensors.
  • audio circuit 116 also includes a headset jack (not shown).
  • the headset jack provides an interface between the audio circuit 116 and a removable audio input/output peripheral; the peripheral can be either an output-only headset (single-ear or binaural) or a headset with both output and a microphone input.
  • a speech recognition device (not shown) is also included for implementing speech-to-text recognition and synthesizing speech based on text.
  • the speech recognition device can be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
  • the audio circuit 116 receives the audio data from the peripheral interface 108, converts the audio data into electrical signals, and the voice recognition device can identify the audio data and convert the audio data into text data.
  • the speech recognition apparatus can also synthesize the audio data based on the text data, convert the audio data into an electrical signal through the audio circuit 116, and transmit the electrical signal to the speaker 118.
  • Perception subsystem 122 provides an interface between the perceptual peripherals of robot 100, such as attitude sensor 132, camera 134, tactile sensor 136, and other sensing devices 138, and peripheral interface 108.
  • Perception subsystem 122 includes an attitude controller 124, a visual controller 126, a haptic controller 128, and one or more other perceptual device controllers 130.
  • the one or more other sensing device controllers 130 receive/transmit electrical signals from/to other sensing devices 138.
  • the other sensing devices 138 may include temperature sensors, distance sensors, proximity sensors, air pressure sensors, air quality detecting devices, and the like.
  • the robot 100 can have a plurality of attitude controllers 124 to control different limbs of the robot 100, which can include, but are not limited to, arms, feet, and a head. Accordingly, the robot 100 can include a plurality of attitude sensors 132. In some embodiments, the robot 100 may not have the attitude controller 124 and the attitude sensor 132; the robot 100 may have a fixed configuration without mechanical moving parts such as arms or feet. In some embodiments, the pose of the robot 100 may be realized not by mechanical arms, feet, and a head but by a deformable configuration.
  • the robot 100 also includes a power system 142 for powering various components.
  • the power system 142 can include a power management system, one or more power sources (eg, batteries, alternating current (AC)), charging systems, power failure detection circuits, power converters or inverters, power status indicators (eg, light emitting diodes (eg LED)), as well as any other components associated with power generation, management, and distribution in portable devices.
  • the charging system can be a wired charging system or a wireless charging system.
  • the software components include an operating system 144, a communication module (or set of instructions) 146, an interactive behavior control device (or set of instructions) 148, and one or more other devices (or sets of instructions) 150.
  • Operating system 144 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and for facilitating communication between the various hardware and software components.
  • Communication module 146 facilitates communication with other devices via one or more external interfaces 140 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.), and also includes various software components for processing data received by the RF circuitry 114 and/or the external interface 140.
  • the robot 100 may also include a display device (not shown), which may include, but is not limited to, a touch sensitive display, a touch pad, and the like.
  • One or more of the other devices 150 described above can include a graphics module (not shown) that includes various known software components for presenting and displaying graphics on the display device. Note that the term "graphics" includes any object that can be displayed to a user, including but not limited to text, web pages, icons (e.g., user interface objects including soft keys), digital images, video, animation, and the like. A touch-sensitive display or touch pad can also be used for user input.
  • the robot 100 senses its external environment and its own condition through, for example, the attitude sensor 132, the camera 134, the tactile sensor 136, other sensing devices 138, and the microphone 120; the information perceived by the robot 100 is handled by the sensing peripheral controllers and processed by one or more CPUs 106.
  • the information perceived by the robot 100 includes, but is not limited to, information detected by its own sensors (e.g., the attitude sensor 132, the camera 134, the tactile sensor 136, and other sensing devices 138), and may also include information detected by an external device (not shown) connected to the robot 100; a communication connection is established between the robot 100 and the external device, through which the two transmit data.
  • External devices include various types of sensors, smart home devices, and the like.
  • the information perceived by the robot 100 includes, but is not limited to, sound, images, environmental parameters, haptic information, time, space, and the like.
  • Environmental parameters include, but are not limited to, temperature, humidity, gas concentration, etc.
  • tactile information includes, but is not limited to, contact with the robot 100, including but not limited to contact with a touch-sensitive display and contact with or proximity to a tactile sensor; tactile sensors can be placed at the head, arms, and other parts of the robot (not shown). It should be understood that other forms of information may also be included.
  • the sound may include voice and other sounds, the sound may be the sound collected by the microphone 120, or may be the sound stored in the memory 102; the voice may include, but is not limited to, human speaking or singing.
  • the image may be a single picture or video, including but not limited to captured by camera 134, or may be read from memory 102 or transmitted to the robot 100 over a network.
  • the information perceived by the robot 100 includes not only information external to the robot 100 but also information of the robot 100 itself, including but not limited to information such as the amount of power, temperature, and the like of the robot 100.
  • for example, the robot 100 can move to a charging position for automatic charging when it perceives that its remaining power is less than 20%.
  • the robot 100 is not limited to perceiving information in the manner described above, but may also perceive information in other forms, including perceptual techniques that have not been developed at the filing date of this document.
  • the sensing device of the robot 100 is not limited to the sensing device provided on the robot 100, and may also include a sensing device associated with the robot 100 and not provided on the robot 100, such as various sensors for sensing information.
  • the robot 100 may be associated with a temperature sensor, a humidity sensor (not shown), or the like disposed within a certain area through which the corresponding information is perceived.
  • the robot 100 can communicate with these sensors through various types of communication protocols to obtain information from these sensors.
  • the information perceived by the robot 100 may be set according to preset conditions, which may include, but are not limited to, setting which information the robot 100 perceives and when to perceive it. For example, during a voice conversation with the user, the robot may be set to sense the user's voice, track the user's face, and recognize the user's gestures without perceiving other information, or to reduce the effect of other information when generating sensing units, or not to process other perceived information; or, during a certain period of time (for example, when the user goes out and the robot 100 is indoors alone), the robot may be set to sense environmental parameters and capture image and video data, the environmental parameters being used to determine whether it is necessary to interact with the air conditioner or other devices.
  • the conditions for setting the perceived information are not limited thereto; the above conditions are merely examples, and the information that the robot 100 needs to perceive may be set depending on the situation.
  • At least one sensing unit is defined, which is the smallest unit (or referred to as a minimum input unit) that controls the robot 100, and the robot 100 makes interactive behavior based on at least the sensing unit.
  • the interaction behavior of the robot 100 may be controlled by one or more sensing units. For example, when the values of one or more sensing units change, the robot 100 may react in response to the changes; or, when the value of one or more sensing units falls within a certain range or equals a certain value, the robot 100 may perform an interaction behavior in response to those sensing units. It should be understood that the control of the robot's interaction behavior by sensing units is not limited to these cases, which are merely illustrative.
  • the sensing unit can include multiple levels, and the higher level sensing unit can include one or more sensing units of the lower level.
  • the higher level perceptual unit may include one or more perceptual units of the lower level adjacent thereto, and the sensing unit of the same higher level may include different lower level perceptual units.
  • the low-level sensing units that synthesize the high-level sensing units include, but are not limited to, low-level sensing units of the same time or time period, and historical low-level sensing units of the time or time period.
  • the higher level perceptual units are determined by lower level sensing units at different times.
  • the value of the sensing unit may be one or a set of values, or may be a range of one or more values.
  • the value of the sensing unit may be determined according to the information perceived by the robot 100.
  • One sensing unit may be determined by one or more pieces of information that is perceived, and the same sensing unit may be determined by different data that is perceived.
  • the perceived information may include real-time perceived information, or historically perceived information (such as information perceived at a certain time or some time in the past). In some cases, the value of the sensing unit is determined by the information perceived in real time and the information perceived by the history.
  • speech recognition is performed on received speech to identify the text in the speech, and the value of the hearing sensing unit may be the text of the speech heard; in some embodiments, hearing may also include the direction of the sound, the direction being referenced to the robot's face and including left, right, front, back, and other directions.
  • the robot 100 can analyze images or video to determine whether someone is present or whether there is movement, and the value of the vision sensing unit can include whether someone is present and whether there is movement.
  • the time describes the time information, and the value may be a time point or a time range, for example, 14:00 every February 1st.
  • the environment describes the environmental conditions, including temperature, humidity, noise, PM2.5, ppm of gas in the air, carbon monoxide content in the air, oxygen content in the air, etc., and the value may be the value or range of each parameter.
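  • As an illustration only, an "environment" sensing unit whose value is a set of parameter ranges might be written as the following JSON; all field names and numbers here are assumptions and are not taken from the specification:
        {
          "id": "environment",
          "value": {
            "temperature": {"min": 18, "max": 26},
            "humidity": {"min": 30, "max": 60},
            "pm2.5": {"max": 75}
          }
        }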
  • the value of the sensing unit can be predefined.
  • the value of the predefined sensing unit may be one or more specific values, or one or more ranges of values.
  • the value of the sensing unit may be an explicit value, or may be formed by a wildcard (or similar) together with an explicit value, but is not limited thereto. For example, when the sensing unit is "speech", the value may be "*rain*", matching any speech containing "rain"; or the value may combine a wildcard with an optional element, for example "*rain[ing]*", matching any speech containing either "rain" or "raining".
  • the robot 100 may generate sensing data according to the sensing units and the perceived information; the sensing data may include one or more sensing units, and includes the identification and value of each sensing unit.
  • the robot 100 generates sensing data from the perceived information according to the sensing units, and can obtain the value of a sensing unit from the perceived information using various analysis methods, for example, obtaining the text of speech through speech recognition, analyzing a perceived image through image recognition to determine whether it contains a person, and determining the attributes of a person through portrait (face) recognition. It should be understood that the robot 100 is not limited to obtaining the values of sensing units in the above manners and may use other means, including processing techniques that have not been developed at the filing date of this document.
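  • A minimal sketch of sensing data generated this way, assuming a JSON representation with "id" and "value" fields; only the requirement that each sensing unit's identification and value be included, and the identifier "ear" used in the JSON example later in this document, come from the specification, while the second unit and all values are assumptions:
        [
          {"id": "ear", "value": "sing me a song"},
          {"id": "vision", "value": "someone_present"}
        ]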
  • the trigger condition and the interaction behavior triggered by the trigger condition can be set, and a control entry for controlling the robot's interaction behavior in response to the information perceived by the robot is generated accordingly.
  • Control entries can have unique identifiers to distinguish control entries.
  • the trigger condition may be composed of one or more sensing units, and logical relationships may be configured between the sensing units; the logical relationships include, but are not limited to, "and", "or", and "not".
  • the trigger condition may include the identification and value of each sensing unit constituting the trigger condition; the value of a sensing unit may be one value or a set of values, or one value range or a set of value ranges.
  • the value of the sensing unit may be an explicit value, or may be composed of a wildcard (or similar) together with an explicit value, but is not limited thereto. For example, when the sensing unit in the trigger condition is "speech", the value may be "*rain*", matching any speech containing "rain"; or the value may combine a wildcard with an optional element, for example "*rain[ing]*", matching any speech containing either "rain" or "raining".
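  • For illustration, a trigger condition combining two sensing units with a logical "and" and using a wildcard value might look as follows; apart from "ifs" and "ear", which appear in the JSON example later in this section, the structure and field names are assumptions:
        {
          "ifs": {
            "logic": "and",
            "conditions": [
              {"id": "ear", "value": "*rain*"},
              {"id": "time", "value": "06:00-09:00"}
            ]
          }
        }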
  • A trigger condition can trigger one or more interaction behaviors.
  • the order between interactions can be set to perform multiple interactions in a set order.
  • the interaction behavior can be configured as one or more action instructions that can be parsed by the robot for execution, and the action instructions can also include one or more parameters.
  • the order of execution of the one or more action instructions can also be configured.
  • the execution order may include, but is not limited to, randomly executing one or a set of action instructions to effect random execution of one or more actions; or executing a plurality of action instructions in a predetermined sequence of steps.
  • the operating system 144 of the robot 100 and other related devices can parse the action instructions of the interactive behavior, so that the robot performs the interactive behavior. For example, to move the robot forward by 5 meters, the action command can be "move ⁇ "m":5 ⁇ ".
  • the robot interaction behavior control device 148 parses the action instruction, obtains the task to be executed (move) and the task parameters (5 meters forward), and passes the task and parameters to the operating system 144; the operating system 144 then drives the mobile device (not shown) to perform the movement. The mobile device may include feet, wheels, crawler tracks, and the like. It should be understood that more specific instructions, such as parameters for individual motors (or similar components) of the mobile device, may also be provided.
  • the action instructions of an interaction behavior may include: links to other control entries, set so that those control entries are executed; and/or multiple parameters and/or multiple pieces of content (or links to them), set so that content and/or parameters can be selected from among them.
  • Each control entry may have a unique identification, by which an action instruction may refer to that control entry.
  • the content linked by an action instruction may be a set of actions, and the robot 100 may select which action in the set to perform according to other factors. For example, attributes such as the personality or gender of the robot 100 may be preconfigured and stored in the memory 102; robots 100 configured with different genders or personalities may exhibit different interaction behaviors in the same situation (or scene). The robot 100 may select the action to execute from a set of actions according to the configured personality, gender, or similar attributes, and the actions may include, but are not limited to, physical movements of the robot 100.
  • an action instruction may be linked to one or a group of pieces of content, which may include, but is not limited to, voice chat content, various Internet information, and so on. For example, if the action performed by the robot 100 according to a control entry is to query the weather in Beijing, the action instruction may contain an address for querying the weather, from which the robot 100 obtains the weather in Beijing; the address may be a uniform resource locator (URL), a memory address, a database field, and the like.
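  • As an illustration only, an action instruction linking to external content for the weather-query example might look like the following; the "act", "gr", and "link" field names and the placeholder address are assumptions, and only the idea of linking to a weather-query address comes from the text above:
        {"act": "query_weather", "gr": 0, "link": "<weather-query URL, memory address, or database field>"}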
  • the interactive behavior of the robot 100 includes, but is not limited to, by outputting a voice, adjusting a gesture, outputting an image or video, interacting with other devices, and the like.
  • Output speech includes, but is not limited to, chatting with a user, playing music; adjusting gestures including, but not limited to, moving (eg, mimicking human walking, etc.), limb swings (eg, arm, head swing), posture adjustment, etc.; outputting images or videos Including but not limited to displaying an image or video on a display device, the image may be a dynamic electronic expression or the like, or may be a captured image, or an image obtained from a network; interaction with other devices includes, but is not limited to, controlling other devices ( For example, adjusting the operating parameters of the air conditioner, etc., transferring data to other devices, establishing connections with other devices, and the like.
  • the interactive behavior is not limited to the above enumerated contents, and the reaction of the robot 100 to the perceived information can be regarded as the interactive behavior of the robot
  • Control entries can be configured in a data exchange format, although other formats can be used.
  • Data exchange formats include, but are not limited to, XML, JSON, or YAML.
  • Taking JSON as an example, suppose the following is to be implemented: when the user says "Sing me a song", the robot first moves back 10 cm at medium speed (angle 0) and then starts singing a song; after the song, it takes a photo and sends it to the user 10 seconds later; it then moves forward 5 cm at angle 0.
  • the control entry for the JSON data format can be as follows:
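  • A minimal reconstruction of such a control entry is sketched below; "ifs", "ear", "trigger", "move", "song", "take_pic", "gr", and the URL come from the description that follows, while all other field names and values (the wildcard value, "speed", "m", "angle", "delay_s", "send_to") are assumptions:
        {
          "ifs": {"ear": "*sing*"},
          "trigger": {
            "move": [
              {"gr": 0, "m": -0.1, "speed": "medium"},
              {"gr": 3, "m": 0.05, "angle": 0}
            ],
            "song": {"gr": 1, "url": "http://bpeer.com/i.mp3"},
            "take_pic": {"gr": 2, "delay_s": 10, "send_to": "user"}
          }
        }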
  • In this example, the "ifs" part is the trigger condition set according to the sensing unit, where "ear" is the identification of the sensing unit and "singing" is its value.
  • the "trigger" part contains the interaction behaviors triggered by the trigger condition, including the three interaction behaviors "move", "song" and "take_pic", each of which includes corresponding action instructions. "song" is linked to "http://bpeer.com/i.mp3", from which the content to sing is obtained, and "gr" indicates the execution order of the actions.
  • control entries may be stored as documents in a data exchange format, or may also be stored in a database.
  • the robot 100 can also include a database system for storing control entries.
  • the database system provides an interface for one or more CPUs 106 to read data from the database and to write data to the database system.
  • the control device 148 for robot interaction behavior may control the robot's interaction behavior according to the control entries: the control device 148 acquires the information perceived by the robot and generates sensing data from the perceived information, at least according to predefined sensing units, where the sensing data includes the identification and value of each sensing unit; it then searches for a control entry that matches the generated sensing data, and if such a control entry is found, causes the robot to perform the interaction behavior in the found control entry.
  • the control device 148 can also transmit the information perceived by the robot 100 to a remote server (not shown); the remote server generates the sensing data from the perceived information and the sensing units, searches for a control entry matching the generated sensing data, and then sends the matching control entry to the control device 148, which causes the robot to perform the interaction behavior in the control entry.
  • an identification of the perceptual information may be generated to determine if the received control entry is a control entry for the transmitted perceptual information.
  • what is sent to the control device 148 may be the control entry itself, an identifier of the control entry, the interaction behavior data configured in the control entry, or other information that allows the control device 148 to determine the interaction behavior configured in the control entry.
  • the control device 148 can generate the sensing data according to the information perceived by the robot 100 and the sensing units, and send the generated sensing data to the remote server; the remote server receives the sensing data, searches for a control entry matching it, and sends the found control entry to the robot 100, which then performs the interaction behavior in the control entry.
  • the control device 148 is not limited to controlling the robot's interaction behavior in the manners described above; it may also use a combination of these manners or other means.
  • the trigger condition may be set according to the sensing units, together with the interaction behavior triggered by the trigger condition, to obtain a control entry; the control entry is then used as the data for controlling the interaction behavior of the robot 100.
  • FIG. 2a illustrates a flow chart of a method of generating control data for a robot, as shown in FIG. 2a, in accordance with some embodiments of the present invention, the method comprising:
  • Step S202 setting a trigger condition for controlling the interaction behavior of the robot according to one or more preset sensing units
  • Step S204 setting an interaction behavior triggered by the trigger condition according to one or more preset interaction behaviors set for the robot to perform;
  • Step S206 generating, according to the set trigger condition and interaction behavior, a control entry for controlling the robot's interaction behavior in response to the information perceived by the robot.
  • At least one sensing unit may be selected from the preset sensing units; an attribute of the selected sensing unit is set, where the attribute includes the value of the sensing unit; and the trigger condition for controlling the robot's interaction behavior is set according to the selected sensing unit and its attribute.
  • a relationship between the plurality of sensing units may further be set; the relationships between sensing units include, but are not limited to, logical relationships such as "and", "or", and "not". In step S202 above, the trigger condition may then be set according to the selected sensing units, their attributes, and the relationships between them.
  • the weight of a sensing unit may also be set to distinguish the importance of different sensing units, so that the interaction behavior responds accordingly to the more important sensing units.
  • At least one interaction behavior may be selected from the preset interaction behaviors set for the robot to perform; an attribute of the selected interaction behavior is set, where the attribute includes one or more action instructions that the robot can parse and execute, together with the parameters of those action instructions; and the interaction behavior triggered by the trigger condition is set according to the selected interaction behavior and its attributes.
  • the execution order of the multiple interaction behaviors may also be set.
  • the interaction behavior triggered by the trigger condition may be set according to the selected interaction behavior and the attributes of the interaction behavior and the execution order.
  • the execution order of the interaction behavior includes, but is not limited to, randomly performing one or more interaction behaviors, or performing a plurality of interaction behaviors according to predetermined steps.
  • the triggering condition and the triggering condition triggered interaction behavior are described in terms of a predetermined data representation.
  • the control entry may be generated using a data exchange format based on the interaction behavior triggered by the trigger condition and the trigger condition.
  • Data exchange formats include, but are not limited to, one or any combination of the following: XML, JSON, or YAML. It should be understood that other formats may also be used to describe the trigger condition and the interaction behavior it triggers, including data representations that have not yet been developed at the filing date of this document.
  • multiple control entries can be set and multiple control entries stored as documents in a data exchange format.
  • multiple control entries can also be stored in the database.
  • adjacent control entries can be separated by a predetermined symbol to distinguish between different control entries.
  • the document storing the control entry may be stored in the memory 102 of the robot 100, and the document of the control entry may also be stored in the remote server.
  • the interaction behavior is configured as one or more action instructions.
  • the above action instructions include: links to other control entries, set so that those control entries are executed; and/or multiple parameters and/or multiple pieces of content (or links to them), set so that content and/or parameters can be selected from among them.
  • for example, the action instruction "query weather" may link to a webpage providing weather information, from which the weather of the city to be queried is obtained. After the weather information is obtained, it may be displayed on the display device of the robot 100 or announced by voice.
  • when an action instruction links to multiple parameters, the parameters actually used may be selected according to other configuration; when it links to multiple pieces of content (such as multiple chat corpora), the content to present may likewise be chosen according to other configuration.
  • the execution order of the action instructions may also be set, wherein the execution sequence includes: randomly executing one or more action instructions, or executing a plurality of action instructions in predetermined steps.
  • the execution order can be marked with symbols; if there is no mark, the actions can be executed in the order in which they are described. Actions of the same type can be treated as a group, and the order of the actions can be marked. For example, for "move forward 5 meters, nod 5 times, then move back 10 meters", the action instruction can be expressed as [move:{gr:0,m:+5;gr:2,m:-10};head{gr:1,head:5}], where "gr" indicates the execution order of each action and actions with smaller values are executed first.
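  • Rewritten as well-formed JSON purely for readability ("gr", "m", "move", and "head" and the grouping of same-type actions follow the bracketed expression above; the surrounding object structure is an assumption):
        {
          "move": [
            {"gr": 0, "m": 5},
            {"gr": 2, "m": -10}
          ],
          "head": [
            {"gr": 1, "head": 5}
          ]
        }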
  • a graphical user interface can be provided for setting trigger conditions and interaction behaviors; the graphical user interface presents the preset sensing units (e.g., the name and identification of each sensing unit), the values that can be set for each sensing unit, and the relationships that can be set between sensing units.
  • the user who sets the triggering condition can select the sensing unit, the value of the sensing unit, and the logical relationship of the sensing unit. After selecting the sensing unit that sets the triggering condition, the triggering condition is generated according to the corresponding format.
  • the graphical user interface can also provide set interaction behaviors, which can be pre-defined interaction behaviors, and after the interaction behavior is selected, the interaction behavior is generated according to the corresponding format.
  • the trigger condition and the interaction behavior may also be edited directly; for example, using the data exchange format described above and the action instruction specification of the predefined sensing units and interaction behaviors, the trigger condition and the interaction behavior it triggers are edited to obtain a control entry.
  • Figure 2b illustrates a generation interface for control data for a robot of certain embodiments.
  • a graphical user interface is provided for adding control entries (referred to as adding a rule in FIG. 2b).
  • the graphical user interface includes two parts: the trigger condition and the interaction behavior triggered by the trigger condition:
  • the first part is "execution condition for selecting a robot rule" (trigger condition in the embodiment of the present invention)
  • the sensing units include "voice", "video monitoring situation", "time", "whether someone is at home", and "environment", where the value of "voice" can be the text spoken to the robot, and the value of "video monitoring situation" includes whether the robot detects a person or detects movement.
  • the relationship between the sensing units is also included, and the relationship of "simultaneous satisfaction” (logical AND) and “satisfying one” (logical OR) are shown in Fig. 2b.
  • the second part is "adding the robot's execution actions" (the interaction behaviors in the embodiments of the present invention); as shown in FIG. 2b, it includes interaction behaviors such as "speaking", "entering standby", "recording audio and video", "playing music", "moving", "vacuuming", and "charging".
  • the attributes of the "go to standby” interaction include “stop all work into standby” and "exit”.
  • Figure 2c illustrates a generation interface for interactive behavior in control data of certain embodiments.
  • After selecting the interaction behavior to be added and setting the properties of the selected interaction behavior, click the "Add" button; the data of the added interaction behavior is then displayed in the "Action List" section.
  • the added interactive behaviors include: “record audio and video”, “play music”, “move”, “vacuum” and “charge”. Where "gr" indicates the execution order of the interaction behavior.
  • Figures 2b and 2c are only one example.
  • a graphical user interface for generating control entries may also be provided in other ways, for example by providing icons and adding a sensing unit or interaction behavior by dragging its icon into the editing area.
  • content may be fetched from the Internet (e.g., a web page), the fetched content may be analyzed to obtain content for setting control entries, and the trigger condition and the interaction behavior it triggers may be set according to that content. For example, if it is learned from the Internet that the emergency number should be called when someone is ill, a trigger condition of "ill" can be set according to the sensing units, and the interaction behavior triggered by that condition can be set to "call the emergency number", for example by setting the number parameter of the preset "call" interaction behavior to the emergency number. If a "health status" sensing unit is predefined, its value can be set directly to "ill", and the trigger condition can be {if("health": "sick")}.
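  • A hedged sketch of the resulting control entry: the {"health": "sick"} pair and the idea of a call action with an emergency number come from the text above, while "ifs", "trigger", "gr", the "call" action name, and the "number" field are assumptions carried over from the earlier JSON example, and the number itself is left as a placeholder:
        {
          "ifs": {"health": "sick"},
          "trigger": {
            "call": {"gr": 0, "number": "<emergency number>"}
          }
        }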
  • the robot 100 can determine the health status of the user based on the perceived data, determine whether the health condition is "ill", for example, perform a voice chat with the user to understand the state of the user, and detect the heart rate, body temperature, and the like of the user.
  • When the health condition is "ill", the robot 100 generates sensing data including {"health": "sick"}.
  • a plurality of robots 100 may also constitute a robot system, in which control entries are generated based on the robots' interactions with users.
  • the robot 100 can transmit a situation it cannot handle to one or more other robots 100; those robots 100 can communicate with their own users, obtain interaction behaviors from those interactions, and control entries are generated based on this process. For example, when the robot 100 perceives the user's voice information "how to make Kung Pao Chicken" and no control entry matching the voice information is found, the robot 100 can transmit the voice information to other robots 100, and those robots 100 actively obtain a response from their users.
  • the robot 100 can select which robot 100 to send to according to information about the users of the other robots 100. For example, if the robot 100 determines that the subject of the question is law, it can find a robot 100 whose user is a legal professional and send the question to that robot 100.
  • After control entries are generated as the data for controlling the robot's interaction behavior, the robot's interaction behavior can be controlled according to the control entries.
  • FIG. 3 illustrates a first flowchart of a method of controlling robot interaction behavior in accordance with some embodiments of the present invention; as shown in FIG. 3, the method includes:
  • Step S302 acquiring data that is perceived by the robot
  • Step S304 Generate sensing data according to the perceptual information, at least according to a predefined sensing unit, where the sensing data includes an identifier and a value of the sensing unit.
  • Step S306 searching for a control entry that matches the generated sensing data among the stored plurality of control entries;
  • Step S308 if a control entry matching the generated sensing data is found, causing the robot to perform the interaction behavior in the found control entry.
  • the robot 100 communicates with a remote server (not shown) over a network; the robot 100 perceives at least one piece of information, and the remote server acquires the information perceived by the robot from the robot 100. The acquisition may involve the remote server requesting the robot 100 to transmit the information it perceives, or the robot 100 transmitting the perceived information to the remote server on its own after sensing it.
  • the robot 100 may periodically transmit the perceived information to the remote server or transmit the perceived information to the remote server when the perceived information changes to reduce the amount of data transmission between the remote server and the robot 100.
  • the control entry document can be stored in a remote server that includes one or more processors and one or more modules, programs, or sets of instructions stored in memory to perform the method illustrated in FIG. 3.
  • the remote server can be a single server or a server cluster consisting of multiple servers. It should be understood that the above described program or set of instructions is not limited to running on a single server, but can also be run on distributed computing resources.
  • the found control entry can be sent to the robot 100, which reads the interaction behavior from the control entry and performs it. Alternatively, the interaction behavior data in the found control entry can be sent to the robot 100.
  • the data of the interactive behavior in the control entry may be parsed to obtain an instruction that the robot 100 can execute, and the obtained command is transmitted to the robot 100, and the robot 100 executes the instruction. It should be understood that the above manner is merely illustrative.
  • FIG. 4 illustrates a second flowchart of a method of controlling robot interaction behavior in accordance with some embodiments of the present invention; as shown in FIG. 4, the method includes:
  • Step S402 receiving the sensing data of the robot, wherein the sensing data is generated according to the information perceived by the robot, at least according to a predefined sensing unit, and the sensing data includes the identifier and the value of the sensing unit;
  • Step S404 searching, among the stored plurality of control entries, for a control entry matching the sensing data of the robot;
  • Step S406 if a control entry matching the sensing data of the robot is found, causing the robot to perform the interaction behavior in the found control entry.
  • the robot 100 senses at least one piece of information, and generates sensing data based on the sensed information and the sensing unit, and transmits the sensing data.
  • the robot 100 sends the sensing data to a remote server (not shown).
  • the robot 100 may transmit the sensing data after generating the sensing data, or may send the sensing data after receiving the request from the remote server.
  • the remote server stores documents of control entries, such as documents in a data exchange format, or a database, and the like.
  • control entry documents can be distributed across multiple storage spaces.
  • the remote server may include one or more processors and one or more modules, programs or sets of instructions stored in memory to perform the method illustrated in FIG. 4.
  • FIG. 5 illustrates a third flowchart of a method for controlling the interaction behavior of a robot according to some embodiments of the present invention. As shown in FIG. 5, the method includes:
  • Step S502 sensing at least one piece of information
  • Step S504 generating sensing data according to the perceived information, at least according to the predefined sensing units, where the sensing data includes the identification and value of the sensing unit;
  • Step S506 sending the generated sensing data
  • Step S508 receiving information of a control entry that matches the sensing data;
  • Step S510 performing the interaction behavior configured in the control entry according to the information of the control entry.
  • the interactive behavior control device 148 of the robot 100 performs the method as shown in FIG. 5.
  • the robot 100 perceives at least one piece of information (for example, according to a preset policy) and generates sensing data according to the sensing units. After generating the sensing data, the robot 100 transmits it to the remote server.
  • After a control entry matching the robot's sensing data is found, the control entry is sent to the robot 100.
  • Alternatively, the action instructions of the interaction behavior in the control entry can be sent to the robot 100.
  • the identifier of the generated sensing data may also be determined prior to transmitting the generated sensing data. After the identifier of the generated sensing data is determined, the generated sensing data and its identifier are sent out. After the remote server finds the control entry matching the generated sensing data, the information of the control entry and the identifier of the corresponding sensing data are sent to the control device 148; the information of the control entry may be the control entry itself, the identifier of the control entry, the interaction behavior configured in the control entry, or any combination thereof, but is not limited to these. The control device receives the information of the control entry, and determines, according to the identifier of the sensing data carried in the information of the control entry, whether the received information corresponds to a control entry matching the generated sensing data.
  • Control device 148 can determine a corresponding control entry based on the identification of the control entry and perform an interaction behavior in the control entry. Alternatively, the control device 148 can read the interaction behavior of the control entry configuration directly from the control entry sent by the remote server to perform the interaction. Moreover, if the remote server sends the interaction behavior configured in the control entry, the control device 148 can directly parse and execute the interaction behavior.
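  • For illustration only, such an exchange between the control device 148 and the remote server might carry payloads like the following; every field name here is an assumption, and the specification only requires that the sensing data identifier accompany both the sent sensing data and the returned control entry information:
        {
          "request": {
            "sensing_data_id": "sd-001",
            "sensing_data": [{"id": "ear", "value": "how is the weather"}]
          },
          "response": {
            "sensing_data_id": "sd-001",
            "control_entry_id": "ce-042",
            "interaction": {"say": {"gr": 0, "text": "querying the weather"}}
          }
        }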
  • the robot's sensing data may be matched against the trigger conditions in the control entries, including but not limited to determining whether a certain sensing unit is present and comparing the values of sensing units.
  • when multiple trigger conditions match, the degree of matching between the robot's sensing data and each matching trigger condition may be determined, and the control entry to use may be selected at least according to the degree of matching.
• for speech text in the sensing data, the control entry matching the sensing data may be determined using the edit distance: the smaller the edit distance, the more similar the two texts are. Speech text can also be matched using regular expressions.
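• As one possible sketch of matching speech text by edit distance or by regular expression (the distance threshold is an assumed parameter, not specified by the application):

```python
import re
from typing import Optional

def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance; smaller values mean more similar texts."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def speech_matches(spoken: str, expected: str,
                   pattern: Optional[str] = None, max_distance: int = 2) -> bool:
    """Match spoken text against a trigger's expected text or regular expression."""
    if pattern is not None:
        return re.search(pattern, spoken) is not None
    return edit_distance(spoken, expected) <= max_distance

print(speech_matches("I am sick", "I'm sick"))              # True: edit distance is 2
print(speech_matches("please call a doctor", "",
                     pattern=r"call .*doctor"))             # True: regex match
```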
• the priority of a control entry can also be set, and the priority can be taken into account when selecting a control entry.
• control entries can be classified into core control entries, user control entries, and temporary control entries, with core control entries having the highest priority, followed by user control entries, and finally temporary control entries.
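• A minimal sketch of selecting among several matching control entries by this priority ordering, assuming each entry carries a priority field as in the earlier sketch:

```python
# Lower rank value = higher priority; the classification follows the text above.
PRIORITY_RANK = {"core": 0, "user": 1, "temporary": 2}

def select_entry(matching_entries, match_scores=None):
    """Pick one control entry from those whose trigger conditions matched.

    match_scores, if given, maps an entry id to a matching degree
    (higher = better) and is used to break ties within a priority class.
    """
    match_scores = match_scores or {}
    return min(
        matching_entries,
        key=lambda e: (PRIORITY_RANK.get(e.get("priority", "temporary"), 2),
                       -match_scores.get(e.get("id"), 0)),
    )

entries = [
    {"id": "t1", "priority": "temporary"},
    {"id": "u1", "priority": "user"},
]
print(select_entry(entries)["id"])   # -> "u1": user entries outrank temporary ones
```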
• the robot 100 can perceive at least one piece of information, generate sensing data based on the perceived information and the sensing units, read control entries (including but not limited to reading them from the memory 102 of the robot 100), and search for a control entry that matches the generated sensing data; if a matching control entry is found, the robot 100 performs the interaction behavior in the found control entry.
• the document of control entries may be stored both in the memory 102 of the robot 100 and on the remote server.
• the robot 100 perceives at least one piece of information, generates sensing data based on the perceived information and the sensing units, reads control entries from the memory 102, and searches the read control entries for a control entry that matches the generated sensing data. If such a control entry is found, the robot 100 performs the interaction behavior in the found control entry; if no matching control entry is found among the read control entries, the robot 100 may send the generated sensing data to the remote server. The remote server then searches its stored control entries for a control entry that matches the received sensing data and, if one is found, causes the robot 100 to execute the interaction behavior in that control entry.
• the remote server can also send the found control entry to the robot 100, which can receive the control entry via an interface (not shown) and store the received control entry.
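• A minimal sketch of this local-lookup-then-remote-fallback flow, with caching of the entry returned by the server; the trigger-parsing and transport details are simplifying assumptions for illustration:

```python
import json
import urllib.request

SERVER_URL = "http://example.com/match"   # hypothetical endpoint, as before

def find_local(entries, sensing_data):
    """Search the control entries read from memory 102 for a matching trigger."""
    values = {d["unit"]: d["value"] for d in sensing_data}
    for entry in entries:
        # Extremely simplified trigger check for the form "if(unit:value)".
        unit, _, value = entry["trigger"][3:-1].partition(":")
        if values.get(unit) == value:
            return entry
    return None

def find_remote(sensing_data):
    """Fall back to the remote server when no local entry matches."""
    body = json.dumps({"sensing_data": sensing_data}).encode("utf-8")
    req = urllib.request.Request(SERVER_URL, data=body,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8")).get("entry")

def lookup(entries, sensing_data):
    entry = find_local(entries, sensing_data)
    if entry is None:
        entry = find_remote(sensing_data)
        if entry is not None:
            entries.append(entry)    # store the received entry for next time
    return entry
```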
• when a control entry that matches the sensing data is found, the robot 100 is caused to perform the interaction behavior in that control entry. When no control entry matching the sensing data is found, the interaction behavior may be omitted, and the robot 100 may continue to perceive at least one piece of information; which information to perceive can be determined according to preset conditions. In some embodiments, when no control entry matching the sensing data is found, a voice reply can be made or the query can be directed to the Internet (e.g., displaying web page information, etc.).
• when no control entry matching the sensing data is found, it may be determined whether the sensing data is related to voice (for example, whether a voice instruction from the user has been received); if the sensing data is determined to be related to voice, a voice response may be given, or relevant content may be searched for on the Internet according to the voice content and presented to the user on the display device of the robot 100.
  • the control entries can be set based on the interaction behavior of the robot with the user.
  • the robot 100 can perform a voice chat with the user.
• the robot 100 analyzes the user's needs and intentions, and derives the scenario and the interaction behavior that the robot should perform in that scenario.
• a control entry is then generated according to the sensing units and the interaction behavior of the robot. For example, when the user is sick, the user says "I am sick", but the control entries of the robot 100 do not yet contain an interaction behavior for the case in which the user is sick.
• in this case, the robot 100 can engage in a voice interaction with the user, for example telling the user "I don't know what needs to be done", and learn from the user's reply that, in this situation, the robot 100 can make a call.
• when the robot 100 determines from this analysis that the doctor needs to be contacted when the user is "ill", the robot 100 can, according to the result of the analysis, generate a control entry in which, for example, the trigger condition is [if(health:sick)] and the interaction behavior triggered by the trigger condition is [call{number:"//doctor_number.php"}].
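• Continuing the example, a generated control entry for this scenario might be assembled as follows (a sketch under the hypothetical document format shown earlier; the analysis step itself is elided):

```python
def generate_control_entry(sensing_unit, value, action, params):
    """Build a control entry from an analyzed scenario and the chosen behavior."""
    return {
        "trigger": f"if({sensing_unit}:{value})",
        "behaviors": [{"action": action, "params": params}],
        "priority": "user",     # an entry learned from the user, per the classes above
    }

entry = generate_control_entry(
    "health", "sick",
    "call", {"number": "//doctor_number.php"},
)
# -> trigger "if(health:sick)" with a call behavior, as in the example above
print(entry)
```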
• the structure of the apparatus for generating control data of a robot according to some embodiments is described below. Since the principle by which the apparatus for generating control data of a robot solves the problem is similar to that of the method for generating control data of a robot, the implementation of the apparatus can refer to the implementation of the method, and repeated descriptions are omitted.
• the term "unit" or "module" may refer to software, hardware, or a combination of software and hardware that implements a predetermined function.
• although the apparatus described in the following embodiments is preferably implemented in software, an implementation in hardware, or in a combination of software and hardware, is also possible and contemplated.
  • FIG. 6 is a block diagram showing the structure of a control device for generating control data of a robot according to some embodiments of the present invention. As shown in FIG. 6, the device includes:
  • the trigger condition setting module 602 is configured to set a trigger condition for controlling the interaction behavior of the robot according to the one or more preset sensing units, wherein the sensing unit is set as a minimum unit that controls the interaction behavior of the robot;
  • the interaction behavior setting module 604 is connected to the trigger condition setting module 602, and configured to set an interaction behavior triggered by the trigger condition according to one or more preset interaction behaviors set for the robot to perform;
  • the generating module 606 is connected to the interaction behavior setting module 604, and is configured to generate a control item for controlling the interaction behavior of the robot in response to the information perceived by the robot according to the set trigger condition and the interaction behavior.
  • FIG. 7 illustrates a structural block diagram of a trigger condition setting module 602.
• the trigger condition setting module 602 may include: a sensing unit selecting unit 702, configured to select at least one sensing unit from the preset sensing units; a sensing unit attribute setting unit 704, connected to the sensing unit selecting unit 702 and configured to set the attribute of the selected sensing unit, wherein the attribute of the sensing unit includes the value of the sensing unit; and a trigger condition setting unit 706, connected to the sensing unit attribute setting unit 704 and configured to set the trigger condition for controlling the interaction behavior of the robot according to the selected sensing unit and the attribute of the sensing unit.
  • FIG. 8 illustrates a block diagram of another trigger condition setting module 602 according to some embodiments of the present invention.
• in addition to the units included in FIG. 7, the trigger condition setting module 602 may further include a relationship setting unit 708, connected to the trigger condition setting unit 706, for setting the relationship between the plurality of sensing units.
• in this case, the trigger condition setting unit 706 is further configured to set the trigger condition according to the selected sensing units, the attributes of the sensing units, and the relationship between the sensing units.
  • FIG. 9 illustrates a structural block diagram of an interaction behavior setting module 604.
• the interaction behavior setting module 604 may include: an interaction behavior selection unit 902, configured to select at least one interaction behavior from the preset interaction behaviors set for the robot to perform; and an interaction behavior attribute setting unit 904, connected to the interaction behavior selection unit 902 and configured to set the attribute of the selected interaction behavior, wherein the attribute of the interaction behavior includes one or more motion instructions of the interaction behavior that can be parsed and executed by the robot and the parameters of the motion instructions;
• the interaction behavior setting unit 906 is connected to the interaction behavior attribute setting unit 904 and is configured to set the interaction behavior triggered by the trigger condition according to the selected interaction behavior and the attribute of the interaction behavior.
  • FIG. 10 illustrates a block diagram of another interaction behavior setting module 604.
• the interaction behavior setting module 604 may further include a sequence setting unit 908, connected to the interaction behavior setting unit 906, configured to set the execution order of a plurality of interaction behaviors.
• in this case, the interaction behavior setting unit 906 is configured to set the interaction behavior triggered by the trigger condition according to the selected interaction behavior, the attribute of the interaction behavior, and the foregoing execution order.
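• As a loose structural sketch only (class and method names are assumptions, not the application's API), the cooperation of modules 602, 604 and 606 and their units might be organized like this:

```python
class TriggerConditionSettingModule:
    """Module 602: builds a trigger condition from selected sensing units."""
    def set_condition(self, units, relation="and"):
        # units: list of (sensing_unit_identifier, value) pairs;
        # relation: how multiple sensing units are combined (cf. unit 708).
        joined = f" {relation} ".join(f"{u}:{v}" for u, v in units)
        return f"if({joined})"

class InteractionBehaviorSettingModule:
    """Module 604: selects interaction behaviors and orders their execution."""
    def set_behaviors(self, behaviors):
        # behaviors: list of (action, params) in the desired execution order (cf. unit 908).
        return [{"action": a, "params": p} for a, p in behaviors]

class GeneratingModule:
    """Module 606: combines trigger condition and behaviors into a control entry."""
    def generate(self, trigger, behaviors):
        return {"trigger": trigger, "behaviors": behaviors}

trigger = TriggerConditionSettingModule().set_condition([("health", "sick")])
behaviors = InteractionBehaviorSettingModule().set_behaviors(
    [("say", {"text": "I will call the doctor"}),
     ("call", {"number": "//doctor_number.php"})])
print(GeneratingModule().generate(trigger, behaviors))
```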
• in the embodiments above, the sensing unit is defined as the minimum unit for controlling the interaction behavior of the robot, interaction behaviors are defined, and control entries are set according to the sensing units and interaction behaviors to control the interaction behavior of the robot. This unifies the input and output standards of robot control, enables non-technical personnel to edit the behavior of the robot, and effectively improves the robot's adaptive interaction capability and degree of intelligence.
• the modules or steps of the embodiments of the present invention can be implemented by a general-purpose computing device; they can be concentrated on a single computing device or distributed across multiple computing devices. Alternatively, they may be implemented by program code executable by the computing devices, so that they may be stored in a storage device and executed by the computing devices; in some cases, the steps shown or described may be performed in an order different from the one given here, or they may be separately fabricated into individual integrated circuit modules, or a plurality of the modules or steps may be fabricated into a single integrated circuit module. Thus, embodiments of the invention are not limited to any specific combination of hardware and software.

Abstract

The invention relates to a method and an apparatus for generating control data of a robot. The method for generating control data of a robot comprises: setting, according to one or more preset sensing units, a trigger condition for controlling an interaction behavior of a robot (S202), the sensing unit being configured as the minimum unit for controlling the interaction behavior of the robot; setting, according to one or more preset interaction behaviors configured to be executed by the robot, an interaction behavior triggered by the trigger condition (S204); and generating, according to the set trigger condition and interaction behavior, a control entry for controlling, in response to information perceived by the robot, the interaction behavior of the robot (S206). In the method, a control entry is set according to a sensing unit and an interaction behavior so as to control the interaction behavior of the robot, thereby effectively improving the robot's adaptive interaction capability and degree of intelligence.
PCT/CN2016/087257 2015-06-26 2016-06-27 Procédé et appareil de génération de données de commande de robot WO2016206642A1 (fr)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
CN201510364661.7 2015-06-26
CN201510364661.7A CN106325228B (zh) 2015-06-26 2015-06-26 机器人的控制数据的生成方法及装置
CN201510363348.1A CN106325065A (zh) 2015-06-26 2015-06-26 机器人交互行为的控制方法、装置及机器人
CN201510363348.1 2015-06-26
CN201510363346.2A CN106325113B (zh) 2015-06-26 2015-06-26 机器人控制引擎及系统
CN201510363346.2 2015-06-26

Publications (1)

Publication Number Publication Date
WO2016206642A1 true WO2016206642A1 (fr) 2016-12-29

Family

ID=57584497

Family Applications (6)

Application Number Title Priority Date Filing Date
PCT/CN2016/087261 WO2016206646A1 (fr) 2015-06-26 2016-06-27 Procédé et système pour pousser un dispositif de machine à générer une action
PCT/CN2016/087262 WO2016206647A1 (fr) 2015-06-26 2016-06-27 Système de commande d'appareil mécanique permettant de générer une action
PCT/CN2016/087258 WO2016206643A1 (fr) 2015-06-26 2016-06-27 Procédé et dispositif de commande de comportement interactif de robot et robot associé
PCT/CN2016/087259 WO2016206644A1 (fr) 2015-06-26 2016-06-27 Moteur et système de commande de robot
PCT/CN2016/087260 WO2016206645A1 (fr) 2015-06-26 2016-06-27 Procédé et appareil de chargement de données de commande dans un dispositif de machine
PCT/CN2016/087257 WO2016206642A1 (fr) 2015-06-26 2016-06-27 Procédé et appareil de génération de données de commande de robot

Family Applications Before (5)

Application Number Title Priority Date Filing Date
PCT/CN2016/087261 WO2016206646A1 (fr) 2015-06-26 2016-06-27 Procédé et système pour pousser un dispositif de machine à générer une action
PCT/CN2016/087262 WO2016206647A1 (fr) 2015-06-26 2016-06-27 Système de commande d'appareil mécanique permettant de générer une action
PCT/CN2016/087258 WO2016206643A1 (fr) 2015-06-26 2016-06-27 Procédé et dispositif de commande de comportement interactif de robot et robot associé
PCT/CN2016/087259 WO2016206644A1 (fr) 2015-06-26 2016-06-27 Moteur et système de commande de robot
PCT/CN2016/087260 WO2016206645A1 (fr) 2015-06-26 2016-06-27 Procédé et appareil de chargement de données de commande dans un dispositif de machine

Country Status (1)

Country Link
WO (6) WO2016206646A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI735168B (zh) * 2020-02-27 2021-08-01 東元電機股份有限公司 語音控制機器人

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11220008B2 (en) * 2017-07-18 2022-01-11 Panasonic Intellectual Property Management Co., Ltd. Apparatus, method, non-transitory computer-readable recording medium storing program, and robot
CN108388399B (zh) * 2018-01-12 2021-04-06 北京光年无限科技有限公司 虚拟偶像的状态管理方法及系统
JP7188950B2 (ja) * 2018-09-20 2022-12-13 株式会社Screenホールディングス データ処理方法およびデータ処理プログラム

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101618280A (zh) * 2009-06-30 2010-01-06 哈尔滨工业大学 具有人机交互功能的仿人头像机器人装置及行为控制方法
WO2011058530A1 (fr) * 2009-11-16 2011-05-19 Koninklijke Philips Electronics, N.V. Commande partagée humain-machine pour assistant robot endoscopique
CN102446428A (zh) * 2010-09-27 2012-05-09 北京紫光优蓝机器人技术有限公司 基于机器人的交互式学习系统及其交互方法
CN103399637A (zh) * 2013-07-31 2013-11-20 西北师范大学 基于kinect人体骨骼跟踪控制的智能机器人人机交互方法
CN104640677A (zh) * 2012-06-21 2015-05-20 睿信科机器人有限公司 训练和操作工业机器人

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001353678A (ja) * 2000-06-12 2001-12-25 Sony Corp オーサリング・システム及びオーサリング方法、並びに記憶媒体
JP4108342B2 (ja) * 2001-01-30 2008-06-25 日本電気株式会社 ロボット、ロボット制御システム、およびそのプログラム
US7089184B2 (en) * 2001-03-22 2006-08-08 Nurv Center Technologies, Inc. Speech recognition for recognizing speaker-independent, continuous speech
US6957215B2 (en) * 2001-12-10 2005-10-18 Hywire Ltd. Multi-dimensional associative search engine
CN100442732C (zh) * 2003-11-20 2008-12-10 松下电器产业株式会社 关联控制设备、关联控制方法及服务关联系统
JP2005193331A (ja) * 2004-01-06 2005-07-21 Sony Corp ロボット装置及びその情動表出方法
WO2006093394A1 (fr) * 2005-03-04 2006-09-08 Chutnoon Inc. Serveur, procede et systeme pour service de recherche d'informations au moyen d'une page web segmentee en plusieurs blocs d'information
JP2007044825A (ja) * 2005-08-10 2007-02-22 Toshiba Corp 行動管理装置、行動管理方法および行動管理プログラム
US7945441B2 (en) * 2007-08-07 2011-05-17 Microsoft Corporation Quantized feature index trajectory
WO2009157733A1 (fr) * 2008-06-27 2009-12-30 Yujin Robot Co., Ltd. Système d’apprentissage interactif utilisant un robot et son procédé de fonctionnement pour l’éducation des enfants
FR2946160B1 (fr) * 2009-05-26 2014-05-09 Aldebaran Robotics Systeme et procede pour editer et commander des comportements d'un robot mobile.
US20110213659A1 (en) * 2010-02-26 2011-09-01 Marcus Fontoura System and Method for Automatic Matching of Contracts in an Inverted Index to Impression Opportunities Using Complex Predicates and Confidence Threshold Values
FR2963132A1 (fr) * 2010-07-23 2012-01-27 Aldebaran Robotics Robot humanoide dote d'une interface de dialogue naturel, methode d'utilisation et de programmation de ladite interface
KR20120047577A (ko) * 2010-11-04 2012-05-14 주식회사 케이티 대화형 행동모델을 이용한 로봇 인터랙션 서비스 제공 장치 및 방법
WO2013052894A1 (fr) * 2011-10-05 2013-04-11 Opteon Corporation Procédés, appareil et systèmes pour la surveillance et/ou le contrôle d'environnements dynamiques
US20150242505A1 (en) * 2012-09-27 2015-08-27 Omron Corporation Device managing apparatus and device searching method
CN103324100B (zh) * 2013-05-02 2016-08-31 郭海锋 一种信息驱动的情感车载机器人
CN103729476A (zh) * 2014-01-26 2014-04-16 王玉娇 一种根据环境状态来关联内容的方法和系统
CN103793536B (zh) * 2014-03-03 2017-04-26 陈念生 一种智能平台实现方法及装置
CN105511608B (zh) * 2015-11-30 2018-12-25 北京光年无限科技有限公司 基于智能机器人的交互方法及装置、智能机器人

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101618280A (zh) * 2009-06-30 2010-01-06 哈尔滨工业大学 具有人机交互功能的仿人头像机器人装置及行为控制方法
WO2011058530A1 (fr) * 2009-11-16 2011-05-19 Koninklijke Philips Electronics, N.V. Commande partagée humain-machine pour assistant robot endoscopique
CN102446428A (zh) * 2010-09-27 2012-05-09 北京紫光优蓝机器人技术有限公司 基于机器人的交互式学习系统及其交互方法
CN104640677A (zh) * 2012-06-21 2015-05-20 睿信科机器人有限公司 训练和操作工业机器人
CN103399637A (zh) * 2013-07-31 2013-11-20 西北师范大学 基于kinect人体骨骼跟踪控制的智能机器人人机交互方法

Also Published As

Publication number Publication date
WO2016206645A1 (fr) 2016-12-29
WO2016206646A1 (fr) 2016-12-29
WO2016206643A1 (fr) 2016-12-29
WO2016206647A1 (fr) 2016-12-29
WO2016206644A1 (fr) 2016-12-29

Similar Documents

Publication Publication Date Title
CN106325228B (zh) 机器人的控制数据的生成方法及装置
US11810562B2 (en) Reducing the need for manual start/end-pointing and trigger phrases
US9543918B1 (en) Configuring notification intensity level using device sensors
WO2021008538A1 (fr) Procédé d'interaction vocale et dispositif associé
WO2016206642A1 (fr) Procédé et appareil de génération de données de commande de robot
CN106325065A (zh) 机器人交互行为的控制方法、装置及机器人
US11367443B2 (en) Electronic device and method for controlling electronic device
WO2015155977A1 (fr) Système de liaison, dispositif, procédé, et support d'enregistrement
US20130159400A1 (en) User device, server, and operating conditions setting system
CN106325113B (zh) 机器人控制引擎及系统
KR20190009201A (ko) 이동 단말기 및 그 제어 방법
CN106921802B (zh) 音频数据的播放方法及装置
CN110399474B (zh) 一种智能对话方法、装置、设备及存储介质
CN111816168A (zh) 一种模型训练的方法、语音播放的方法、装置及存储介质
WO2023006033A1 (fr) Procédé d'interaction vocale, dispositif électronique et support
US20200234187A1 (en) Information processing apparatus, information processing method, and program
KR20200101221A (ko) 사용자 입력 처리 방법 및 이를 지원하는 전자 장치
WO2017081894A1 (fr) Système de communication et procédé de commande de communication
US11731262B2 (en) Robot and method for operating the same
WO2020153146A1 (fr) Dispositif de traitement d'informations et procédé de traitement d'informations
CN109902606B (zh) 一种操作方法及终端设备
US20220055223A1 (en) Electronic device for providing reaction on basis of user state and operating method therefor
CN113241077A (zh) 用于可穿戴设备的语音录入方法和装置
EP2930889A1 (fr) Systèmes et procédés pour des réseaux de notification adaptatifs

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16813758

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16813758

Country of ref document: EP

Kind code of ref document: A1