WO2016206642A1 - Method and apparatus for generating control data of robot - Google Patents

Method and apparatus for generating control data of robot

Info

Publication number
WO2016206642A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
interaction behavior
sensing unit
trigger condition
data
Prior art date
Application number
PCT/CN2016/087257
Other languages
French (fr)
Chinese (zh)
Inventor
聂华闻
Original Assignee
北京贝虎机器人技术有限公司
Priority date
Filing date
Publication date
Priority claimed from CN201510363348.1A external-priority patent/CN106325065A/en
Priority claimed from CN201510363346.2A external-priority patent/CN106325113B/en
Priority claimed from CN201510364661.7A external-priority patent/CN106325228B/en
Application filed by 北京贝虎机器人技术有限公司
Publication of WO2016206642A1 publication Critical patent/WO2016206642A1/en

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 - Programme-control systems
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F13/00 - Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G06F13/10 - Program control for peripheral devices

Definitions

  • The present invention relates to the field of robot technology, and in particular to a method and an apparatus for generating control data of a robot.
  • Today's robots are mostly industrial robots, and most industrial robots have no perception capability.
  • The operating procedures of these robots are predefined, and they carry out fixed tasks strictly according to those procedures. They lack adaptability and produce consistent results only when the objects they handle are identical.
  • The embodiments of the invention provide a method and a device for generating control data of a robot, so as to effectively improve the robot's adaptive interaction behavior and degree of intelligence.
  • A method for generating control data of a robot includes: setting a trigger condition for controlling robot interaction behavior according to one or more preset sensing units, where a sensing unit is the minimum unit used to control the robot's interaction behavior; setting the interaction behavior triggered by the trigger condition according to one or more preset interaction behaviors that the robot is set to execute; and generating, according to the set trigger condition and interaction behavior, a control entry for controlling the robot's interaction behavior in response to information perceived by the robot.
  • Setting a trigger condition for controlling robot interaction behavior according to one or more preset sensing units includes: selecting at least one sensing unit from the preset sensing units; setting an attribute of the selected sensing unit, where the attribute includes the value of the sensing unit; and setting the trigger condition for controlling the robot's interaction behavior according to the selected sensing unit and its attribute.
  • The interaction behavior triggered by the trigger condition is set according to one or more preset interaction behaviors that the robot is set to execute, including: selecting at least one interaction behavior from the preset interaction behaviors; setting an attribute of the selected interaction behavior, where the attribute includes one or more action instructions that the robot can parse and execute, together with the parameters of those action instructions; and setting the interaction behavior triggered by the trigger condition according to the selected interaction behavior and its attributes.
  • A control device for robot interaction behavior includes: a trigger condition setting module, configured to set a trigger condition for controlling robot interaction behavior according to one or more preset sensing units, where a sensing unit is the minimum unit used to control the robot's interaction behavior; an interaction behavior setting module, configured to set the interaction behavior triggered by the trigger condition according to one or more preset interaction behaviors that the robot is set to execute; and a generating module, configured to generate, according to the set trigger condition and interaction behavior, a control entry for controlling the robot's interaction behavior in response to information perceived by the robot.
  • By defining the sensing unit as the minimum unit for controlling the robot's interaction behavior, defining the interaction behaviors, and controlling the robot through control entries built from sensing units and interaction behaviors, the input and output standards of robot control are unified, so that non-technical personnel can also edit the robot's behavior, effectively improving the robot's adaptive interaction behavior and degree of intelligence.
  • FIG. 1 illustrates a schematic structural view of a robot in accordance with some embodiments of the present invention.
  • FIG. 2a illustrates a flowchart of a method of generating control data for a robot in accordance with some embodiments of the present invention.
  • FIG. 2b illustrates a schematic diagram of a generation interface for robot control data in accordance with some embodiments of the present invention.
  • FIG. 2c illustrates a schematic diagram of a generation interface for the interaction behavior in control data, in accordance with some embodiments of the present invention.
  • FIG. 3 illustrates a first flowchart of a method of controlling robot interaction behavior in accordance with some embodiments of the present invention.
  • FIG. 4 illustrates a second flowchart of a method of controlling robot interaction behavior in accordance with some embodiments of the present invention.
  • FIG. 5 illustrates a third flowchart of a method of controlling robot interaction behavior in accordance with some embodiments of the present invention.
  • FIG. 6 illustrates a structural block diagram of a device for generating control data of a robot according to some embodiments of the present invention.
  • FIG. 7 illustrates a structural block diagram of a trigger condition setting module in accordance with some embodiments of the present invention.
  • FIG. 8 illustrates a structural block diagram of another trigger condition setting module in accordance with some embodiments of the present invention.
  • FIG. 9 illustrates a structural block diagram of an interaction behavior setting module in accordance with some embodiments of the present invention.
  • FIG. 10 illustrates a structural block diagram of another interaction behavior setting module in accordance with some embodiments of the present invention.
  • The robot 100 includes a memory 102, a memory controller 104, one or more processing units (CPUs) 106, a peripheral interface 108, radio frequency (RF) circuitry 114, audio circuitry 116, a speaker 118, a microphone 120, a perception subsystem 122, an attitude sensor 132, a camera 134, a tactile sensor 136, one or more other sensing devices 138, and an external interface 140. These components communicate over one or more communication buses or signal lines 110.
  • FIG. 1 shows just one example of the robot 100, which may have more or fewer components than illustrated, or a different configuration of components.
  • The robot 100 can include one or more CPUs 106, memory 102, one or more sensing devices (e.g., the sensing devices described above), and one or more modules, programs, or instruction sets stored in the memory 102 to perform a robot interaction behavior control method.
  • the various components shown in FIG. 1 can be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
  • The robot 100 may be an electromechanical device having a biological shape (e.g., a humanoid), or a smart device that does not have a biological appearance but has human characteristics (e.g., language communication). Such a smart device may be a mechanical device, or a virtual device implemented in software (e.g., a virtual chat bot).
  • A virtual chat bot can perceive information through the device on which it runs; such devices include handheld electronic devices, personal computers, and the like.
  • Memory 102 can include high speed random access memory and can also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid state storage devices.
  • Memory 102 may also include memory remote from the one or more CPUs 106, such as network-attached storage accessed via the RF circuitry 114 or the external interface 140 and a communication network (not shown), where the communication network can be the Internet, one or more intranets, a local area network (LAN), a wide area network (WAN), a storage area network (SAN), etc., or a suitable combination thereof.
  • Memory controller 104 can control access to memory 102 by other components of robot 100, such as CPU 106 and peripheral interface 108.
  • Peripheral interface 108 couples the input and output peripherals of the device to CPU 106 and memory 102.
  • the one or more processors 106 described above execute various software programs and/or sets of instructions stored in the memory 102 to perform various functions of the robot 100 and process the data.
  • peripheral interface 108, CPU 106, and memory controller 104 can be implemented on a single chip, such as chip 112. In some other embodiments, they may be implemented on multiple discrete chips.
  • the RF circuit 114 receives and transmits electromagnetic waves.
  • the RF circuit 114 converts an electrical signal into an electromagnetic wave, or converts the electromagnetic wave into an electrical signal, and communicates with the communication network and other communication devices via the electromagnetic wave.
  • The RF circuitry 114 may include well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so on.
  • The RF circuitry 114 can communicate with networks and other devices via wireless communication, such as the Internet (also referred to as the World Wide Web, WWW), an intranet, and/or a wireless network such as a cellular telephone network, a wireless local area network (LAN), and/or a metropolitan area network (MAN).
  • The above wireless communication may use any of a variety of communication standards, protocols, and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), Wideband Code Division Multiple Access (W-CDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Wi-MAX, protocols for e-mail, instant messaging, and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
  • Audio circuitry 116, speaker 118, and microphone 120 provide an audio interface between the user and the robot 100.
  • Audio circuitry 116 receives audio data from peripheral interface 108, converts the audio data into electrical signals, and transmits the electrical signals to speaker 118.
  • the speaker transforms the electrical signal into a human audible sound wave.
  • Audio circuitry 116 also receives electrical signals converted from sound waves by the microphone 120.
  • the audio circuit 116 converts the electrical signals into audio data and transmits the audio data to the peripheral interface 108 for processing. Audio data may be retrieved from memory 102 and/or RF circuitry 114 by peripheral interface 108 and/or transmitted to memory 102 and/or RF circuitry 114.
  • A plurality of microphones 120 can be included, distributed at different locations; the direction from which a sound is emitted can be determined according to a predetermined strategy based on the microphones 120 at the different locations. It should be understood that the direction of the sound can also be identified by other sensors.
  • audio circuit 116 also includes a headset jack (not shown).
  • The headset jack provides an interface between the audio circuitry 116 and a removable audio input/output peripheral; for example, the peripheral can be an output-only headset, or a headset with both output (for one or both ears) and input (a microphone).
  • a speech recognition device (not shown) is also included for implementing speech-to-text recognition and synthesizing speech based on text.
  • the speech recognition device can be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
  • the audio circuit 116 receives the audio data from the peripheral interface 108, converts the audio data into electrical signals, and the voice recognition device can identify the audio data and convert the audio data into text data.
  • the speech recognition apparatus can also synthesize the audio data based on the text data, convert the audio data into an electrical signal through the audio circuit 116, and transmit the electrical signal to the speaker 118.
  • Perception subsystem 122 provides an interface between the perceptual peripherals of the robot 100, such as the attitude sensor 132, the camera 134, the tactile sensor 136, and the other sensing devices 138, and the peripheral interface 108.
  • Perception subsystem 122 includes an attitude controller 124, a visual controller 126, a haptic controller 128, and one or more other perceptual device controllers 130.
  • the one or more other sensing device controllers 130 receive/transmit electrical signals from/to other sensing devices 138.
  • the other sensing devices 138 may include temperature sensors, distance sensors, proximity sensors, air pressure sensors, air quality detecting devices, and the like.
  • The robot 100 can have a plurality of attitude controllers 124 to control different limbs of the robot 100, which can include but are not limited to arms, feet, and a head. Accordingly, the robot 100 can include a plurality of attitude sensors 132. In some embodiments, the robot 100 may have neither the attitude controller 124 nor the attitude sensor 132; the robot 100 may have a fixed configuration without mechanical moving parts such as arms or feet. In some embodiments, instead of mechanical arms, feet, and a head, the robot 100 may employ a deformable configuration.
  • the robot 100 also includes a power system 142 for powering various components.
  • the power system 142 can include a power management system, one or more power sources (eg, batteries, alternating current (AC)), charging systems, power failure detection circuits, power converters or inverters, power status indicators (eg, light emitting diodes (eg LED)), as well as any other components associated with power generation, management, and distribution in portable devices.
  • the charging system can be a wired charging system or a wireless charging system.
  • the software components include an operating system 144, a communication module (or set of instructions) 146, an interactive behavior control device (or set of instructions) 148, and one or more other devices (or sets of instructions) 150.
  • Operating system 144 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and for facilitating communication between various hardware and software components.
  • Communication module 146 facilitates communication with other devices via one or more external interfaces 140 (e.g., Universal Serial Bus (USB), FireWire, etc.), and also includes various software components for processing data received by the RF circuitry 114 and/or the external interface 140.
  • The robot 100 may also include a display device (not shown), which may include but is not limited to a touch-sensitive display, a touch pad, and the like.
  • One or more of the other devices 150 described above can include a graphics module (not shown) that includes various known software components for presenting and displaying graphics on the display device. Note that the term "graphics" includes any object that can be displayed to a user, including but not limited to text, web pages, icons (e.g., user interface objects including soft keys), digital images, video, animation, and the like. A touch-sensitive display or touch pad can also be used for user input.
  • The robot 100 senses the external environment and its own condition through, for example, the attitude sensor 132, the camera 134, the tactile sensor 136, other sensing devices 138, the microphone 120, and so on; the information perceived by the robot 100 is handled by the sensing peripheral controllers and processed by the one or more CPUs 106.
  • The environment information perceived by the robot 100 includes but is not limited to information detected by its own sensors (e.g., the attitude sensor 132, the camera 134, the tactile sensor 136, and other sensing devices 138), and may also include information detected by an external device (not shown) connected to the robot 100. A communication connection is established between the robot 100 and the external device, through which they transmit data.
  • External devices include various types of sensors, smart home devices, and the like.
  • the information perceived by the robot 100 includes, but is not limited to, sound, images, environmental parameters, haptic information, time, space, and the like.
  • Environmental parameters include, but are not limited to, temperature, humidity, gas concentration, etc.
  • Tactile information includes but is not limited to contact with the robot 100, including but not limited to contact with a touch-sensitive display and contact with or proximity to a tactile sensor; tactile sensors can be placed at the head, arms, etc. of the robot (not shown). It should be understood that other forms of information are also included.
  • Sound may include voice and other sounds; it may be collected by the microphone 120 or stored in the memory 102. Voice may include but is not limited to human speech or singing.
  • An image may be a single picture or a video, including but not limited to one captured by the camera 134, read from the memory 102, or transmitted to the robot 100 over a network.
  • the information perceived by the robot 100 includes not only information external to the robot 100 but also information of the robot 100 itself, including but not limited to information such as the amount of power, temperature, and the like of the robot 100.
  • For example, the robot 100 can move to the charging position for automatic charging when it perceives that its battery level is less than 20%.
  • the robot 100 is not limited to perceiving information in the manner described above, but may also perceive information in other forms, including perceptual techniques that have not been developed at the filing date of this document.
  • the sensing device of the robot 100 is not limited to the sensing device provided on the robot 100, and may also include a sensing device associated with the robot 100 and not provided on the robot 100, such as various sensors for sensing information.
  • the robot 100 may be associated with a temperature sensor, a humidity sensor (not shown), or the like disposed within a certain area through which the corresponding information is perceived.
  • the robot 100 can communicate with these sensors through various types of communication protocols to obtain information from these sensors.
  • The information perceived by the robot 100 may be set according to preset conditions, which may include but are not limited to which information the robot 100 perceives, when to perceive it, and so on. For example, during a voice conversation with the user, the robot may be set to sense the user's voice, track the user's face, and recognize the user's gestures, while not perceiving other information, reducing the weight of other information when generating sensing units, or leaving other perceived information unprocessed. Alternatively, during a certain period of time (for example, when the user goes out and the robot 100 is indoors alone), it may sense environmental parameters and capture image and video data, using the environmental parameters to determine whether it is necessary to interact with the air conditioner or other devices.
  • The conditions for setting the perceived information are not limited to the above examples; the information that the robot 100 needs to perceive may be set depending on the situation.
  • At least one sensing unit is defined as the smallest unit (also called the minimum input unit) for controlling the robot 100; the robot 100 performs interaction behavior based at least on sensing units.
  • The interaction behavior of the robot 100 may be controlled by one or more sensing units. For example, when the values of one or more sensing units change, the robot 100 may react in response to the changes; or, when the value of one or more sensing units falls within a certain range or equals a certain value, the robot 100 may perform an interaction behavior in response to those sensing units. It should be understood that control of the robot 100's interaction behavior by sensing units is not limited to these cases, which are merely illustrative.
  • A sensing unit can have multiple levels, and a higher-level sensing unit can include one or more lower-level sensing units.
  • A higher-level sensing unit may include one or more sensing units of the adjacent lower level, and sensing units of the same higher level may include different lower-level sensing units.
  • The lower-level sensing units that compose a higher-level sensing unit include, but are not limited to, lower-level sensing units of the same time or time period, as well as historical lower-level sensing units before that time or time period.
  • In some cases, a higher-level sensing unit is determined by lower-level sensing units at different times.
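  • As an illustrative sketch (not part of the original text), a higher-level sensing unit such as "someone is at home" might be derived from assumed lower-level units for vision and hearing:

```python
# Illustrative sketch only: a higher-level sensing unit derived from two
# assumed lower-level units ("vision movement" and "hearing text").
def someone_home(vision_movement: bool, hearing_text: str) -> bool:
    # A higher-level unit may combine lower-level units observed at the same
    # time or over a time window; here only the current values are combined.
    return vision_movement or bool(hearing_text.strip())

print(someone_home(vision_movement=True, hearing_text=""))  # True
```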
  • the value of the sensing unit may be one or a set of values, or may be a range of one or more values.
  • the value of the sensing unit may be determined according to the information perceived by the robot 100.
  • One sensing unit may be determined by one or more pieces of information that is perceived, and the same sensing unit may be determined by different data that is perceived.
  • the perceived information may include real-time perceived information, or historically perceived information (such as information perceived at a certain time or some time in the past). In some cases, the value of the sensing unit is determined by the information perceived in real time and the information perceived by the history.
  • For hearing, voice recognition is performed on received speech to identify its text, and the value of the hearing unit may be the text of the speech heard. In some embodiments, hearing may also include the direction of the sound, referenced to the face of the robot, such as left, right, front, and back.
  • For vision, the robot 100 can analyze an image or video to determine whether someone is present or something is moving, and the value of the vision unit can include whether someone is present and whether there is movement.
  • the time describes the time information, and the value may be a time point or a time range, for example, 14:00 every February 1st.
  • the environment describes the environmental conditions, including temperature, humidity, noise, PM2.5, ppm of gas in the air, carbon monoxide content in the air, oxygen content in the air, etc., and the value may be the value or range of each parameter.
  • the value of the sensing unit can be predefined.
  • the value of the predefined sensing unit may be one or more specific values, or one or more ranges of values.
  • The value of a sensing unit may be an explicit value, or may be formed by wildcards (or the like) together with explicit values, but is not limited thereto. For example, when the sensing unit is "speech", the value may be "*rain*", indicating that any speech containing "rain" matches; or the value may use a character class, as in the original Chinese example "*[下落]雨*", which matches speech containing either "下雨" or "落雨" (both meaning "raining").
  • The robot 100 may generate sensing data according to the sensing units and the perceived information; the sensing data may cover one or more sensing units and includes the identifier and value of each sensing unit.
  • When generating sensing data from perceived information according to the sensing units, the robot 100 can obtain the values of sensing units through various analysis methods, for example, obtaining the text of speech through voice recognition, determining whether a portrait appears in a perceived image through image recognition, and determining the attributes of a portrait through portrait (face) recognition. It should be understood that the robot 100 is not limited to obtaining the values of sensing units in the above ways, and may use other means, including processing techniques not yet developed as of the filing date of this document.
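  • A minimal sketch of this generation step, assuming a sensing unit identified as "ear" whose value is the recognized speech text (the "ear" identifier follows the JSON example later in this document; "vision" and the helper name are assumptions):

```python
# Hypothetical sketch: turn perceived information into sensing data,
# i.e. a mapping from sensing-unit identifier to value.
def generate_sensing_data(recognized_speech=None, person_detected=None):
    sensing_data = {}
    if recognized_speech is not None:
        sensing_data["ear"] = recognized_speech    # speech text from ASR
    if person_detected is not None:
        sensing_data["vision"] = person_detected   # someone visible or not
    return sensing_data

print(generate_sensing_data("sing me a song", True))
# {'ear': 'sing me a song', 'vision': True}
```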
  • the trigger condition and the interaction behavior triggered by the trigger condition can be set.
  • a control item for controlling the interaction behavior of the robot in response to the information perceived by the robot is generated.
  • Control entries can have unique identifiers to distinguish control entries.
  • The trigger condition may be composed of one or more sensing units, and logical relationships may be configured between the sensing units, including but not limited to "and", "or", and "not".
  • The trigger condition may include the identifiers and values of the sensing units that constitute it; the value of a sensing unit may be one value or a set of values, or one range or a set of ranges.
  • The value of a sensing unit in a trigger condition may likewise be an explicit value or be composed of wildcards (or the like) together with explicit values, but is not limited thereto. For example, when the sensing unit in the trigger condition is "speech", the value may be "*rain*", matching any speech that contains "rain", or a character-class pattern matching either of two phrasings of "raining", as described above.
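  • A sketch of wildcard matching for such values, using Python's fnmatch as one possible implementation (the patent does not specify the matching engine):

```python
from fnmatch import fnmatch

# "*rain*" matches any speech text that contains "rain".
print(fnmatch("it looks like rain today", "*rain*"))  # True
print(fnmatch("sunny all week", "*rain*"))            # False
# fnmatch also supports character classes, e.g. "[ab]" matches "a" or "b".
```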
  • A trigger condition can trigger one or more interaction behaviors.
  • the order between interactions can be set to perform multiple interactions in a set order.
  • the interaction behavior can be configured as one or more action instructions that can be parsed by the robot for execution, and the action instructions can also include one or more parameters.
  • the order of execution of the one or more action instructions can also be configured.
  • the execution order may include, but is not limited to, randomly executing one or a set of action instructions to effect random execution of one or more actions; or executing a plurality of action instructions in a predetermined sequence of steps.
  • The operating system 144 of the robot 100 and other related devices can parse the action instructions of an interaction behavior so that the robot performs it. For example, to move the robot forward by 5 meters, the action command can be: move {"m":5}.
  • The robot interaction behavior control device 148 parses the action instruction, obtains the task (move) and its parameters (5 meters forward), and passes them to the operating system 144; the operating system 144 then drives the mobile mechanism (not shown) to perform the movement. The mobile mechanism may include feet, wheels, crawler tracks, and the like. It should be understood that more specific instructions, such as parameters for individual motors (or similar components) of the mobile mechanism, may also be provided.
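  • A sketch of this parse step, assuming the move {"m":5} command format shown above (the helper name is hypothetical):

```python
import json

# Hypothetical sketch: split an instruction such as 'move {"m":5}' into a
# task name and its JSON-encoded parameters.
def parse_action_instruction(instruction: str):
    task, _, raw_params = instruction.partition(" ")
    params = json.loads(raw_params) if raw_params else {}
    return task, params

task, params = parse_action_instruction('move {"m":5}')
print(task, params)  # move {'m': 5}
```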
  • The action instructions of an interaction behavior may include: links to other control entries, set so that those control entries can be executed; and/or links to multiple pieces of content and/or multiple parameters, set so that content or parameters can be selected from them.
  • Each control entry may have a unique identifier, by which an action instruction may refer to that control entry.
  • The content linked by an action instruction may be a set of actions, from which the robot 100 may select an action to perform according to other factors. For example, attributes such as the personality or gender of the robot 100 may be preconfigured and stored in the memory 102; robots 100 with different genders may behave differently in the same situation (or scene), and the robot 100 may select the action to execute from the set according to attributes such as its configured personality or gender. The actions may include, but are not limited to, body movements of the robot 100.
  • An action instruction may be linked to one or a group of contents, which may include but are not limited to voice chat content, various Internet information, and so on. For example, if the action the robot 100 performs according to a control entry is to query the weather in Beijing, the action instruction may carry a weather-query address from which the robot 100 obtains the weather in Beijing; the address may be a uniform resource locator (URL), a memory address, a database field, and the like.
  • The interaction behaviors of the robot 100 include, but are not limited to, outputting voice, adjusting posture, outputting images or video, interacting with other devices, and the like.
  • Output speech includes, but is not limited to, chatting with a user, playing music; adjusting gestures including, but not limited to, moving (eg, mimicking human walking, etc.), limb swings (eg, arm, head swing), posture adjustment, etc.; outputting images or videos Including but not limited to displaying an image or video on a display device, the image may be a dynamic electronic expression or the like, or may be a captured image, or an image obtained from a network; interaction with other devices includes, but is not limited to, controlling other devices ( For example, adjusting the operating parameters of the air conditioner, etc., transferring data to other devices, establishing connections with other devices, and the like.
  • Interaction behavior is not limited to the contents enumerated above; any reaction of the robot 100 to perceived information can be regarded as interaction behavior of the robot.
  • Control entries can be configured in a data exchange format, although other formats can be used.
  • Data exchange formats include, but are not limited to, XML, JSON, or YAML.
  • Take JSON as an example. Suppose the following needs to be implemented: when the user says "Sing me a song", the robot first moves back 10 cm at medium speed at a 0 angle and then starts singing a song; after singing, it takes a photo 10 seconds later and sends it to the user; then it moves forward 5 cm at a 0 angle.
  • the control entry for the JSON data format can be as follows:
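  • A plausible reconstruction of such a control entry, based only on the scenario above and the fields described below ("ifs", "ear", "singing", "move", "song", "take_pic", "gr", and the song URL); parameter names and units beyond those listed are assumptions, and the exact layout in the original filing may differ:

```json
{
  "ifs": { "ear": "singing" },
  "trigger": {
    "move": [
      { "gr": 0, "angle": 0, "speed": "medium", "m": -0.1 },
      { "gr": 3, "angle": 0, "m": 0.05 }
    ],
    "song": { "gr": 1, "url": "http://bpeer.com/i.mp3" },
    "take_pic": { "gr": 2, "delay_s": 10, "send_to_user": true }
  }
}
```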
  • In this control entry, the "ifs" part is the trigger condition set according to the sensing unit: "ear" is the identifier of the sensing unit and "singing" is its value. The "trigger" part is the interaction behavior triggered by the trigger condition, comprising three interaction behaviors, "move", "song", and "take_pic", each of which includes corresponding action instructions. Among them, "song" is linked to "http://bpeer.com/i.mp3", from which the content to sing is obtained, and "gr" marks the execution order of the actions.
  • control entries may be stored as documents in a data exchange format, or may also be stored in a database.
  • The robot 100 can also include a database system for storing control entries.
  • the database system provides an interface for one or more CPUs 106 to read data from the database and to write data to the database system.
  • The control device 148 for robot interaction behavior may control the robot's interaction behavior according to control entries: the control device 148 acquires the information perceived by the robot and generates sensing data from it, at least according to predefined sensing units, where the sensing data includes the identifiers and values of the sensing units; it then searches for a control entry that matches the generated sensing data, and if one is found, causes the robot to perform the interaction behavior in the found control entry.
  • The control device 148 can also transmit the information perceived by the robot 100 to a remote server (not shown), which generates the sensing data according to the perceived information and the sensing units and searches for a control entry matching the sensing data; the matching control entry is then sent to the control device 148, which causes the robot to perform the interaction behavior in the control entry.
  • an identification of the perceptual information may be generated to determine if the received control entry is a control entry for the transmitted perceptual information.
  • What is sent to the control device 148 may be the control entry itself, an identifier of the control entry, the interaction behavior data configured in the control entry, or other information that enables the control device 148 to determine the interaction behavior configured in the control entry.
  • The control device 148 can also generate the sensing data from the information perceived by the robot 100 and the sensing units, and send the generated sensing data to a remote server; the remote server receives the sensing data, searches for a control entry that matches it, and sends the found control entry to the robot 100, causing the robot 100 to perform the interaction behavior in that control entry.
  • control device 148 is not limited to controlling the interactive behavior of the robot by the manner described above, but may be a combination of the above several manners or other means.
  • In summary, a trigger condition may be set according to sensing units, together with the interaction behavior triggered by the trigger condition, to obtain a control entry; the control entry is then used as the data for controlling the interaction behavior of the robot 100.
  • FIG. 2a illustrates a flowchart of a method of generating control data for a robot in accordance with some embodiments of the present invention. As shown in FIG. 2a, the method comprises:
  • Step S202: setting a trigger condition for controlling the interaction behavior of the robot according to one or more preset sensing units;
  • Step S204: setting the interaction behavior triggered by the trigger condition according to one or more preset interaction behaviors that the robot is set to execute;
  • Step S206: generating, according to the set trigger condition and interaction behavior, a control entry for controlling the interaction behavior of the robot in response to information perceived by the robot.
  • At least one sensing unit may be selected from the preset sensing units; an attribute of the selected sensing unit is set, where the attribute includes the value of the sensing unit; and the trigger condition for controlling the robot's interaction behavior is set according to the selected sensing unit and its attribute.
  • A relationship between the plurality of sensing units may further be set; the relationships between sensing units include but are not limited to logical relationships such as "AND", "OR", and "NOT". In the foregoing step S202, the trigger condition may be set according to the selected sensing units and their attributes together with the relationships between the sensing units.
  • Weights may also be set for sensing units to distinguish their importance, so that interaction responds preferentially to the more important sensing units.
  • At least one interaction behavior may be selected from the preset interaction behaviors that the robot is set to execute; an attribute of the selected interaction behavior is set, where the attribute includes one or more action instructions that the robot can parse and execute together with the parameters of those instructions; and the interaction behavior triggered by the trigger condition is set according to the selected interaction behavior and its attributes.
  • the execution order of the multiple interaction behaviors may also be set.
  • the interaction behavior triggered by the trigger condition may be set according to the selected interaction behavior and the attributes of the interaction behavior and the execution order.
  • the execution order of the interaction behavior includes, but is not limited to, randomly performing one or more interaction behaviors, or performing a plurality of interaction behaviors according to predetermined steps.
  • The trigger condition and the interaction behavior it triggers are described in a predetermined data representation.
  • The control entry may be generated in a data exchange format from the trigger condition and the interaction behavior it triggers.
  • Data exchange formats include but are not limited to one or any combination of the following: XML, JSON, or YAML. It should be understood that other formats may also be used to describe the trigger condition and the interaction behavior it triggers, including data representations not yet developed as of the filing date of this document.
  • multiple control entries can be set and multiple control entries stored as documents in a data exchange format.
  • multiple control entries can also be stored in the database.
  • adjacent control entries can be separated by a predetermined symbol to distinguish between different control entries.
  • the document storing the control entry may be stored in the memory 102 of the robot 100, and the document of the control entry may also be stored in the remote server.
  • The interaction behavior is configured as one or more action instructions.
  • The above action instructions may include: links to other control entries, set so that those control entries can be executed; and/or links to multiple pieces of content and/or multiple parameters, set so that content and/or parameters can be selected from them.
  • For example, the action instruction for "query weather" may link to a webpage providing weather information, from which the weather of the city to be queried is obtained. After the weather information is queried, it may be shown on the display device of the robot 100, or broadcast by voice.
  • When linking to multiple parameters, the action actually performed may be selected according to other configuration; when linking to multiple pieces of content (such as multiple chat corpora), the content to present may likewise be chosen based on other configuration.
  • the execution order of the action instructions may also be set, wherein the execution sequence includes: randomly executing one or more action instructions, or executing a plurality of action instructions in predetermined steps.
  • The execution order can be marked with symbols; if there is no mark, actions execute in the order in which they are described. Actions of the same type can be grouped as a whole, with the order of individual actions marked. For example, for "move forward 5 meters, nod 5 times, then move back 10 meters", the action instruction can be expressed as [move:{gr:0,m:+5; gr:2,m:-10}; head:{gr:1,head:5}], where "gr" indicates the execution order and the action with the smaller value executes first.
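  • A sketch of how an executor might honor the "gr" markers across action types, assuming each action carries a numeric "gr" field as in the example above:

```python
# Sketch: flatten {"move": [...], "head": [...]} and sort by the "gr"
# marker; actions with smaller "gr" values execute first.
def order_actions(actions_by_type):
    flat = [(kind, a) for kind, acts in actions_by_type.items() for a in acts]
    return sorted(flat, key=lambda pair: pair[1]["gr"])

actions = {
    "move": [{"gr": 0, "m": +5}, {"gr": 2, "m": -10}],
    "head": [{"gr": 1, "head": 5}],
}
for kind, action in order_actions(actions):
    print(kind, action)
# move {'gr': 0, 'm': 5} -> head {'gr': 1, 'head': 5} -> move {'gr': 2, 'm': -10}
```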
  • A graphical user interface can be provided for setting trigger conditions and interaction behaviors. The interface presents the preset sensing units (e.g., the name and identifier of each sensing unit), the values that can be set for each sensing unit, and the relationships that can be set between sensing units.
  • The user setting the trigger condition can select the sensing units, their values, and the logical relationships between them; after the selection is complete, the trigger condition is generated in the corresponding format.
  • the graphical user interface can also provide set interaction behaviors, which can be pre-defined interaction behaviors, and after the interaction behavior is selected, the interaction behavior is generated according to the corresponding format.
  • The trigger condition and interaction behavior may also be edited directly: for example, using the data exchange format described above and the action instruction specification with the predefined sensing units and interaction behaviors, the trigger condition and the interaction behavior it triggers are edited to obtain control entries.
  • Figure 2b illustrates a generation interface for control data of a robot in certain embodiments.
  • A graphical user interface is provided to add control entries (referred to as adding rules in Figure 2b).
  • the graphical user interface includes two parts: the trigger condition and the interaction behavior triggered by the trigger condition:
  • The first part is "select the execution conditions of the robot rule" (the trigger condition in the embodiments of the present invention). The sensing units include "voice", "video monitoring situation", "time", "whether someone is at home", and "environment", where the value of "voice" can be the text spoken to the robot, and the value of "video monitoring situation" includes whether the robot monitors someone or something moving.
  • Relationships between the sensing units are also included; Figure 2b shows the relationships "all satisfied" (logical AND) and "any one satisfied" (logical OR).
  • The second part is "add the robot's execution actions" (the interaction behavior in the embodiments of the present invention); as shown in FIG. 2b, it includes interaction behaviors such as "speak", "enter standby", "record audio and video", "play music", "move", "vacuum", and "charge".
  • The attributes of the "enter standby" interaction behavior include "stop all work and enter standby" and "exit".
  • Figure 2c illustrates a generation interface for interactive behavior in control data of certain embodiments.
  • After selecting the interaction behavior to be added and setting its properties, click the "Add" button; the data of the added interaction behavior is then displayed in the "Action List" section.
  • the added interactive behaviors include: “record audio and video”, “play music”, “move”, “vacuum” and “charge”. Where "gr" indicates the execution order of the interaction behavior.
  • Figures 2b and 2c are only one example.
  • A graphical user interface for generating control entries may also be provided in other forms, for example by providing icons and adding a sensing unit or interaction behavior by dragging its icon into the editing area.
  • Content may also be fetched from the Internet (e.g., web pages), analyzed, and used to set control entries, with the trigger condition and the interaction behavior it triggers set according to that content. For example, from the Internet-derived rule "call the emergency number when someone is ill", a trigger condition of "ill" can be set according to the sensing units, and the interaction behavior triggered by it can be set to "call the emergency number", for example by setting the parameter of the preset "call" interaction behavior to the emergency number. If a "health status" sensing unit is predefined, its value can be set directly to "ill", and the trigger condition can be {if("health": "sick")}.
  • The robot 100 can determine the user's health status based on perceived data, for example by holding a voice chat with the user to understand their state and by detecting the user's heart rate, body temperature, and so on.
  • When the health status is determined to be "ill", the robot 100 generates sensing data including {"health": "sick"}.
  • A plurality of robots 100 may also constitute a robot system. When one robot 100 cannot handle a situation based on its interaction with its user, it can transmit the situation to one or more other robots 100, which can interact with their own users; interaction behaviors are obtained from those interactions, and control entries are generated from this process. For example, when the robot 100 perceives the user's voice message "how to make Kung Pao Chicken" and no control entry matching the voice message is found, the robot 100 can transmit the voice message to other robots 100, which take the initiative to interact with their users.
  • The robot 100 can select which robot 100 to transmit to according to information about the users of the other robots 100. For example, if the robot 100 determines that the subject of the question is law, it can find a robot 100 whose user is a legal professional and send the question to that robot 100.
  • After control entries have been generated as the data for controlling the robot's interaction behavior, the robot's interaction behavior can be controlled according to the control entries.
  • FIG. 3 illustrates a first flowchart of a method of controlling robot interaction behavior in accordance with some embodiments of the present invention. As shown in FIG. 3, the method includes:
  • Step S302: acquiring information perceived by the robot;
  • Step S304: generating sensing data from the perceived information, at least according to predefined sensing units, where the sensing data includes the identifiers and values of the sensing units;
  • Step S306: searching the stored control entries for a control entry that matches the generated sensing data;
  • Step S308: if a control entry matching the generated sensing data is found, causing the robot to perform the interaction behavior in the found control entry.
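  • As an illustrative sketch of steps S302 to S308 (helper names are hypothetical; exact matching is used here, though wildcard and fuzzy matching are also described in this document):

```python
# Sketch of steps S302-S308 with hypothetical helpers.
def control_step(sensing_data, control_entries):
    # S306: find a control entry whose trigger condition ("ifs") matches.
    for entry in control_entries:
        if all(sensing_data.get(k) == v for k, v in entry["ifs"].items()):
            execute_interaction(entry["trigger"])  # S308: perform behavior
            return

def execute_interaction(trigger):
    print("executing:", trigger)

control_step({"ear": "singing"},
             [{"ifs": {"ear": "singing"}, "trigger": {"song": {"gr": 0}}}])
# executing: {'song': {'gr': 0}}
```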
  • The robot 100 communicates with a remote server (not shown) over a network; the robot 100 perceives at least one piece of information, and the remote server acquires the perceived information from the robot 100. The acquisition may be that the remote server requests the robot 100 to transmit the information it perceives, or that the robot 100 transmits the perceived information to the remote server on its own after sensing it.
  • the robot 100 may periodically transmit the perceived information to the remote server or transmit the perceived information to the remote server when the perceived information changes to reduce the amount of data transmission between the remote server and the robot 100.
  • The control entry documents can be stored in a remote server that includes one or more processors and one or more modules, programs, or instruction sets stored in memory to perform the method illustrated in FIG. 3.
  • the remote server can be a single server or a server cluster consisting of multiple servers. It should be understood that the above described program or set of instructions is not limited to running on a single server, but can also be run on distributed computing resources.
  • The found control entry can be sent to the robot 100, which reads the interaction behavior from the control entry and performs it; alternatively, the interaction behavior data from the found control entry can be sent to the robot 100.
  • The interaction behavior data in the control entry may also be parsed to obtain instructions that the robot 100 can execute; the obtained instructions are transmitted to the robot 100, which executes them. It should be understood that the above manners are merely illustrative.
  • FIG. 4 illustrates a second flowchart of a method of controlling robot interaction behavior in accordance with some embodiments of the present invention. As shown in FIG. 4, the method comprises:
  • Step S402: receiving sensing data of the robot, where the sensing data is generated from information perceived by the robot, at least according to predefined sensing units, and includes the identifiers and values of the sensing units;
  • Step S404: searching the stored control entries for a control entry that matches the sensing data of the robot;
  • Step S406: if a control entry matching the sensing data of the robot is found, causing the robot to perform the interaction behavior in the found control entry.
  • The robot 100 senses at least one piece of information, generates sensing data from the perceived information and the sensing units, and transmits the sensing data.
  • The robot 100 sends the sensing data to a remote server (not shown).
  • The robot 100 may transmit the sensing data immediately after generating it, or may send it after receiving a request from the remote server.
  • the remote server stores documents that control entries, such as documents in a data exchange format, or a database, and the like.
  • control entry documents can be distributed across multiple storage spaces.
  • The remote server may include one or more processors and one or more modules, programs, or instruction sets stored in memory to perform the method illustrated in FIG. 4.
  • FIG. 5 illustrates a third flowchart of a method for controlling the interaction behavior of a robot according to some embodiments of the present invention. As shown in FIG. 5, the method includes:
  • Step S502: sensing at least one piece of information;
  • Step S504: generating sensing data from the perceived information, at least according to the predefined sensing units, where the sensing data includes the identifiers and values of the sensing units;
  • Step S506: sending the generated sensing data;
  • Step S508: receiving information of a control entry that matches the sensing data;
  • Step S510: performing the interaction behavior configured in the control entry according to the information of the control entry.
  • The interaction behavior control device 148 of the robot 100 performs the method shown in FIG. 5.
  • The robot 100 perceives at least one piece of information and generates sensing data according to a generation policy and the sensing units. After generating the sensing data, the robot 100 transmits it to the remote server.
  • When the remote server finds a control entry that matches the sensing data of the robot, the control entry is sent to the robot 100.
  • Alternatively, the action instructions of the interaction behavior in the control entry can be sent to the robot 100.
  • The identifier of the generated sensing data may also be determined before the sensing data is transmitted. After the identifier is determined, the generated sensing data and its identifier are sent out. After the remote server finds the control entry matching the sensing data, it sends the information of the control entry together with the identifier of the corresponding sensing data to the control device 148; the information of the control entry may be the control entry itself, the identifier of the control entry, the interaction behavior configured in the control entry, or any combination thereof, but is not limited to these. The control device receives the information of the control entry and determines, according to the identifier of the sensing data carried in it, whether the received information corresponds to the sensing data it generated.
  • Control device 148 can determine a corresponding control entry based on the identification of the control entry and perform an interaction behavior in the control entry. Alternatively, the control device 148 can read the interaction behavior of the control entry configuration directly from the control entry sent by the remote server to perform the interaction. Moreover, if the remote server sends the interaction behavior configured in the control entry, the control device 148 can directly parse and execute the interaction behavior.
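  • One way to realize the identifier correlation described above, as a sketch (the document does not specify the identifier scheme; names here are assumptions):

```python
import uuid

# Sketch only: map each sent batch of sensing data to a request identifier
# so that server replies can be matched back to what was sent.
pending = {}  # sensing-data identifier -> sensing data awaiting a reply

def send_sensing_data(sensing_data):
    request_id = uuid.uuid4().hex       # identifier of this sensing data
    pending[request_id] = sensing_data
    # ... transmit (request_id, sensing_data) to the remote server ...
    return request_id

def on_control_entry_info(request_id, control_entry_info):
    # Act only on replies that match sensing data this device actually sent.
    if pending.pop(request_id, None) is not None:
        print("perform behavior from:", control_entry_info)
```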
  • The sensing data of the robot is matched against the trigger conditions in the control entries; matching includes but is not limited to determining whether a certain sensing unit is present and comparing the values of sensing units.
  • When multiple trigger conditions match, the degree of matching between the robot's sensing data and each of them may be determined, and the control entry to use is selected at least according to the degree of matching.
  • a control entry that senses data matching may be determined for the speech text in the perceptual data, the degree of matching may be determined by using the editing distance, and the smaller the value of the editing distance, the more similar the two texts are. Speech text can also be matched using regular expressions.
• In some embodiments, a priority can also be set for each control entry and consulted when selecting among matching control entries.
• In some embodiments, control entries are classified into core control entries, user control entries, and temporary control entries, where core control entries have the highest priority, followed by user control entries and finally temporary control entries.
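• One way to express this is to tag each control entry with its class, as in the following sketch; the ifs/trigger/gr names follow the JSON style used for control entries elsewhere in this description, while priority, entry_id, and say are assumptions:

    {
      "entry_id": "greet-001",
      "priority": "user",
      "ifs": { "ear": "*你好*" },
      "trigger": [
        { "say": { "text": "你好" }, "gr": 1 }
      ]
    }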
• In some embodiments, the robot 100 can perceive at least one piece of information, generate sensing data based on the perceived information and the sensing units, read control entries (including but not limited to reading them from the memory 102 of the robot 100), and search the read control entries for one that matches the generated sensing data; if such a control entry is found, the robot 100 performs the interaction behavior in it.
• In some embodiments, the control entry documents may be stored both in the memory 102 of the robot 100 and on the remote server.
• In some embodiments, the robot 100 perceives at least one piece of information, generates sensing data based on the perceived information and the sensing units, reads control entries from the memory 102, and searches the read control entries for one matching the generated sensing data.
• If a matching control entry is found, the robot 100 performs the interaction behavior in it; if none is found among the read control entries, the robot 100 may send the generated sensing data to the remote server, which searches its stored control entries for a match and, if one is found, causes the robot 100 to execute the interaction behavior in that control entry.
• In some embodiments, the remote server can also send the found control entry to the robot 100, which can receive it via an interface (not shown) and store it.
• In some embodiments, when a control entry matching the sensing data is found, the robot 100 is caused to perform the interaction behavior in it. When no matching control entry is found, the interaction may simply be omitted, and the robot 100 may continue to perceive at least one piece of information; which information to perceive can be determined according to preset conditions. In some embodiments, when no matching control entry is found, a voice reply can be made or the request can be redirected to the Internet (e.g., displaying web page information).
• In some embodiments, when no control entry matches the sensing data, it may be determined whether the sensing data is related to voice (for example, whether a user's voice instruction was received); if it is, a voice reply may be made, or relevant content may be searched on the Internet according to the voice content and presented to the user on the display device of the robot 100.
• In some embodiments, control entries can also be set based on the robot's interaction with the user.
• In some embodiments, the robot 100 can hold a voice chat with the user.
• In some embodiments, the robot 100 analyzes the user's needs and intentions from the chat, and derives the scene and the interaction behavior the robot should perform in that scene.
• From the scene and the interaction behavior of the robot in it, a control entry is generated according to the sensing units. For example, a sick user says "I am sick", but the control entries of the robot 100 do not yet contain an interaction behavior for the user being sick.
• In that case the robot 100 can fall back to a voice interaction with the user, for example replying "I don't know what needs to be done".
• In some embodiments, the robot 100 can make phone calls.
• When its analysis shows that a user who is "ill" needs to contact a doctor, the robot 100 can generate a control entry from the result of the analysis: for example, the trigger condition is [if(health:sick)], and the interaction behavior triggered by it is [call{number:"//doctor_number.php"}].
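• Rendered in the JSON style used for control entries elsewhere in this description, the generated entry might look like the following sketch; the trigger condition and the call instruction are exactly those given above, while the gr order key and the JSON wrapper are assumptions:

    {
      "ifs": { "health": "sick" },
      "trigger": [
        { "call": { "number": "//doctor_number.php" }, "gr": 1 }
      ]
    }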
• The structure of the apparatus for generating control data of a robot according to some embodiments is described below. Since the principle by which this apparatus solves the problem is similar to that of the method for generating control data of a robot, the implementation of the apparatus may refer to the implementation of the method, and repeated descriptions are omitted.
• As used below, the term "unit" or "module" refers to software, hardware, or a combination of the two that implements a predetermined function.
• Although the apparatus described in the following embodiments is preferably implemented in software, an implementation in hardware or in a combination of software and hardware is also possible and contemplated.
• FIG. 6 is a block diagram showing the structure of an apparatus for generating control data of a robot according to some embodiments of the present invention. As shown in FIG. 6, the apparatus includes:
• a trigger condition setting module 602, configured to set a trigger condition for controlling the interaction behavior of the robot according to one or more preset sensing units, where the sensing unit is set as the minimum unit for controlling the interaction behavior of the robot;
• an interaction behavior setting module 604, connected to the trigger condition setting module 602 and configured to set the interaction behavior triggered by the trigger condition according to one or more preset interaction behaviors set for the robot to perform;
• a generating module 606, connected to the interaction behavior setting module 604 and configured to generate, according to the set trigger condition and interaction behavior, a control entry for controlling the interaction behavior of the robot in response to information perceived by the robot.
• FIG. 7 illustrates a structural block diagram of a trigger condition setting module 602.
• In some embodiments, the trigger condition setting module 602 may include: a sensing unit selecting unit 702, configured to select at least one sensing unit from the preset sensing units; a sensing unit attribute setting unit 704, connected to the sensing unit selecting unit 702 and configured to set the attributes of the selected sensing units, where the attributes of a sensing unit include its value; and a trigger condition setting unit 706, connected to the sensing unit attribute setting unit 704 and configured to set the trigger condition for controlling the interaction behavior of the robot according to the selected sensing units and their attributes.
• FIG. 8 illustrates a block diagram of another trigger condition setting module 602 according to some embodiments of the present invention.
• In some embodiments, in addition to the units shown in FIG. 7, the trigger condition setting module 602 may further include a relationship setting unit 708, connected to the trigger condition setting unit 706, for setting the relationships between multiple sensing units.
• In these embodiments, the trigger condition setting unit 706 is further configured to set the trigger condition according to the selected sensing units, their attributes, and the relationships between the sensing units.
• FIG. 9 illustrates a structural block diagram of an interaction behavior setting module 604.
• In some embodiments, the interaction behavior setting module 604 may include: an interaction behavior selection unit 902, configured to select at least one interaction behavior from the preset interaction behaviors set for the robot to perform; and an interaction behavior attribute setting unit 904, connected to the interaction behavior selection unit 902 and configured to set the attributes of the selected interaction behavior, where the attributes of an interaction behavior include one or more action instructions that can be parsed by the robot for execution, together with the parameters of those instructions;
• an interaction behavior setting unit 906, connected to the interaction behavior attribute setting unit 904, which sets the interaction behavior triggered by the trigger condition according to the selected interaction behavior and its attributes.
• FIG. 10 illustrates a block diagram of another interaction behavior setting module 604.
• In some embodiments, the interaction behavior setting module 604 may further include a sequence setting unit 908, connected to the interaction behavior setting unit 906, for setting the execution order of multiple interaction behaviors.
• In these embodiments, the interaction behavior setting unit 906 is configured to set the interaction behavior triggered by the trigger condition according to the selected interaction behavior, its attributes, and the execution order.
• In the embodiments above, the sensing unit is defined as the minimum unit for controlling the interaction behavior of the robot, interaction behaviors are defined, and control entries set from sensing units and interaction behaviors control the robot's interaction behavior. This unifies the input and output standards of robot control, allows non-technical personnel to edit the robot's behavior, and effectively improves the robot's adaptive interaction capability and degree of intelligence.
• Obviously, the modules or steps of the embodiments of the present invention can be implemented by a general-purpose computing device; they can be concentrated on a single computing device or distributed across multiple computing devices. Alternatively, they may be implemented with program code executable by a computing device, so that they can be stored in a storage device and executed by the computing device; in some cases the steps shown or described may be performed in a different order, or they may be fabricated separately as individual integrated circuit modules, or several of the modules or steps may be fabricated as a single integrated circuit module. Thus, embodiments of the invention are not limited to any specific combination of hardware and software.

Abstract

A method and an apparatus for generating control data of a robot. The method for generating control data of a robot comprises: setting, according to one or more preset sensing units, a trigger condition for controlling an interaction behavior of a robot (S202), wherein the sensing unit is configured as the minimum unit for controlling the interaction behavior of the robot; setting, according to one or more preset interaction behaviors configured to be executed by the robot, an interaction behavior triggered by the trigger condition (S204); and generating, according to the set trigger condition and interaction behavior, a control entry for controlling, in response to information sensed by the robot, the interaction behavior of the robot (S206). In the method, a control entry is set according to sensing units and interaction behaviors to control the interaction behavior of the robot, thus effectively improving the adaptive interaction capability and degree of intelligence of the robot.

Description

Method and device for generating control data of a robot

Technical field

The present invention relates to the field of robot technology, and in particular to a method and an apparatus for generating control data of a robot.

Background

Today's robots are mostly industrial robots, and most industrial robots have no perception capability. The operating procedures of these robots are pre-defined, and the robots repeat the prescribed tasks without deviation in accordance with the predetermined procedures. They lack adaptability and produce consistent results only when the objects involved are the same.

Summary of the invention

Embodiments of the present invention provide a method and an apparatus for generating control data of a robot, so as to at least effectively improve the adaptive interaction capability and degree of intelligence of the robot.

In some embodiments, a method for generating control data of a robot includes: setting a trigger condition for controlling the interaction behavior of the robot according to one or more preset sensing units, where the sensing unit is set as the minimum unit for controlling the interaction behavior of the robot; setting the interaction behavior triggered by the trigger condition according to one or more preset interaction behaviors set for the robot to execute; and generating, according to the set trigger condition and interaction behavior, a control entry for controlling the interaction behavior of the robot in response to information perceived by the robot.

In some embodiments, setting a trigger condition for controlling the interaction behavior of the robot according to one or more preset sensing units includes: selecting at least one sensing unit from the preset sensing units; setting the attributes of the selected sensing units, where the attributes of a sensing unit include its value; and setting the trigger condition for controlling the interaction behavior of the robot according to the selected sensing units and their attributes.

In some embodiments, setting the interaction behavior triggered by the trigger condition according to one or more preset interaction behaviors set for the robot to perform includes: selecting at least one interaction behavior from the preset interaction behaviors; setting the attributes of the selected interaction behavior, where the attributes of an interaction behavior include one or more action instructions that can be parsed by the robot for execution and the parameters of those instructions; and setting the interaction behavior triggered by the trigger condition according to the selected interaction behavior and its attributes.

In some embodiments, a control apparatus for the interaction behavior of a robot includes: a trigger condition setting module, configured to set a trigger condition for controlling the interaction behavior of the robot according to one or more preset sensing units, where the sensing unit is set as the minimum unit for controlling the interaction behavior of the robot; an interaction behavior setting module, configured to set the interaction behavior triggered by the trigger condition according to one or more preset interaction behaviors set for the robot to perform; and a generating module, configured to generate, according to the set trigger condition and interaction behavior, a control entry for controlling the interaction behavior of the robot in response to information perceived by the robot.

In the embodiments of the present invention, a method and an apparatus for generating control data of a robot are proposed. The sensing unit is defined as the minimum unit for controlling the interaction behavior of the robot, interaction behaviors are defined, and control entries set from sensing units and interaction behaviors control the robot's interaction behavior. This unifies the input and output standards of robot control, so that non-technical personnel can also edit the robot's behavior, effectively improving the robot's adaptive interaction capability and degree of intelligence.
Brief description of the drawings

The drawings described here are provided for a further understanding of the invention and constitute a part of this application; they do not limit the invention. In the drawings:

FIG. 1 illustrates a schematic structural diagram of a robot according to some embodiments of the present invention;

FIG. 2a illustrates a flowchart of a method for generating control data of a robot according to some embodiments of the present invention;

FIG. 2b illustrates a schematic diagram of a generation interface for robot control data according to some embodiments of the present invention;

FIG. 2c illustrates a schematic diagram of a generation interface for an interaction behavior in the control data according to some embodiments of the present invention;

FIG. 3 illustrates a first flowchart of a method for controlling the interaction behavior of a robot according to some embodiments of the present invention;

FIG. 4 illustrates a second flowchart of a method for controlling the interaction behavior of a robot according to some embodiments of the present invention;

FIG. 5 illustrates a third flowchart of a method for controlling the interaction behavior of a robot according to some embodiments of the present invention;

FIG. 6 illustrates a structural block diagram of an apparatus for generating control data of a robot according to some embodiments of the present invention;

FIG. 7 illustrates a structural block diagram of a trigger condition setting module according to some embodiments of the present invention;

FIG. 8 illustrates a structural block diagram of another trigger condition setting module according to some embodiments of the present invention;

FIG. 9 illustrates a structural block diagram of an interaction behavior setting module according to some embodiments of the present invention; and

FIG. 10 illustrates a structural block diagram of another interaction behavior setting module according to some embodiments of the present invention.
Detailed description

Reference is now made in detail to the embodiments described in the drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the invention. However, it will be understood by those skilled in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits are not described in detail so as not to unnecessarily obscure the embodiments.
FIG. 1 illustrates a schematic structural diagram of a robot according to some embodiments of the present invention. The robot 100 includes a memory 102, a memory controller 104, one or more processing units (CPUs) 106, a peripheral interface 108, radio frequency (RF) circuitry 114, audio circuitry 116, a speaker 118, a microphone 120, a perception subsystem 122, an attitude sensor 132, a camera 134, a tactile sensor 136, one or more other sensing devices 138, and an external interface 140. These components communicate over one or more communication buses or signal lines 110.

It should be understood that the robot 100 shown is only one example; a robot 100 may have more or fewer components than illustrated, or a different configuration of components. For example, in some embodiments the robot 100 may include one or more CPUs 106, a memory 102, one or more sensing devices (such as those described above), and one or more modules, programs, or instruction sets stored in the memory 102 for performing the method of controlling the robot's interaction behavior. The various components shown in FIG. 1 can be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
In some embodiments, the robot 100 may be an electromechanical device having a biological shape (e.g., humanoid), or a smart device that does not have a biological appearance but has human characteristics (e.g., language communication); the smart device may include a mechanical device, or a virtual device implemented by software (e.g., a virtual chat bot). A virtual chat bot perceives information through the device on which it runs, such as a handheld electronic device or a personal computer.
The memory 102 can include high-speed random access memory, and can also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state storage devices. In some embodiments, the memory 102 may also include memory remote from the one or more CPUs 106, such as network-attached storage accessed via the RF circuitry 114 or the external interface 140 and a communication network (not shown), where the communication network can be the Internet, one or more intranets, a local area network (LAN), a wide area network (WAN), a storage area network (SAN), etc., or a suitable combination thereof. The memory controller 104 can control access to the memory 102 by other components of the robot 100, such as the CPUs 106 and the peripheral interface 108.

The peripheral interface 108 couples the input and output peripherals of the device to the CPUs 106 and the memory 102. The one or more CPUs 106 run various software programs and/or instruction sets stored in the memory 102 to perform the various functions of the robot 100 and to process data.

In some embodiments, the peripheral interface 108, the CPUs 106, and the memory controller 104 can be implemented on a single chip, such as the chip 112. In some other embodiments, they may be implemented on multiple discrete chips.
The RF circuitry 114 receives and transmits electromagnetic waves. It converts electrical signals into electromagnetic waves, or electromagnetic waves into electrical signals, and communicates with communication networks and other communication devices via the electromagnetic waves. The RF circuitry 114 may include well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so on. The RF circuitry 114 can communicate by wireless communication with networks and other devices, such as the Internet (also known as the World Wide Web, WWW), an intranet, and/or a wireless network such as a cellular telephone network, a wireless local area network (LAN), and/or a metropolitan area network (MAN).

The wireless communication may use any of a variety of communication standards, protocols, and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), Wideband Code Division Multiple Access (W-CDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Wi-MAX, protocols for e-mail, instant messaging, and/or short message service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
The audio circuitry 116, the speaker 118, and the microphone 120 provide an audio interface between the user and the robot 100. The audio circuitry 116 receives audio data from the peripheral interface 108, converts the audio data into an electrical signal, and transmits the electrical signal to the speaker 118. The speaker converts the electrical signal into human-audible sound waves. The audio circuitry 116 also receives electrical signals converted from sound waves by the microphone 120, converts them into audio data, and transmits the audio data to the peripheral interface 108 for processing. Audio data may be retrieved from and/or transmitted to the memory 102 and/or the RF circuitry 114 by the peripheral interface 108.

In some embodiments, multiple microphones 120 may be included and distributed at different locations, and the direction from which a sound is emitted is determined according to a predetermined strategy based on the microphones 120 at the different locations. It should be understood that the direction of a sound can also be identified by certain sensors.

In some embodiments, the audio circuitry 116 also includes a headset jack (not shown). The headset jack provides an interface between the audio circuitry 116 and a removable audio input/output peripheral, which may be, for example, output-only headphones or a headset with both output (headphones for one or both ears) and input (a microphone).

In some embodiments, a speech recognition device (not shown) is also included for speech-to-text recognition and for synthesizing speech from text. The speech recognition device can be implemented in hardware, software, or a combination of the two, including one or more signal processing and/or application specific integrated circuits. The audio circuitry 116 receives audio data from the peripheral interface 108 and converts it into electrical signals; the speech recognition device can recognize the audio data and convert it into text data. The speech recognition device can also synthesize audio data from text data, which the audio circuitry 116 converts into an electrical signal and transmits to the speaker 118.
The perception subsystem 122 provides an interface between the sensing peripherals of the robot 100, such as the attitude sensor 132, the camera 134, the tactile sensor 136, and the other sensing devices 138, and the peripheral interface 108. The perception subsystem 122 includes an attitude controller 124, a visual controller 126, a haptic controller 128, and one or more other sensing device controllers 130. The one or more other sensing device controllers 130 receive/transmit electrical signals from/to the other sensing devices 138, which may include temperature sensors, distance sensors, proximity sensors, air pressure sensors, air quality detection devices, and so on.

In some embodiments, the robot 100 may have multiple attitude controllers 124 to control different limbs of the robot 100; the limbs may include, but are not limited to, arms, feet, and a head. Correspondingly, the robot 100 may include multiple attitude sensors 132. In some implementations, the robot 100 may have neither the attitude controller 124 nor the attitude sensor 132: the robot 100 may have a fixed form without mechanically movable parts such as arms or feet. In some embodiments, the posture of the robot 100 need not come from mechanical arms, feet, and a head; a deformable construction may also be adopted.

The robot 100 also includes a power system 142 for powering the various components. The power system 142 can include a power management system, one or more power sources (e.g., a battery, alternating current (AC)), a charging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)), and any other components associated with the generation, management, and distribution of electric power in portable devices. The charging system can be a wired charging system or a wireless charging system.
In some embodiments, the software components include an operating system 144, a communication module (or instruction set) 146, an interaction behavior control device (or instruction set) 148, and one or more other devices (or instruction sets) 150.

The operating system 144 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitating communication between the various hardware and software components.

The communication module 146 facilitates communication with other devices via the one or more external interfaces 140, and also includes various software components for processing data received by the RF circuitry 114 and/or the external interface 140. The external interface 140 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted to couple to other devices either directly or indirectly via a network (e.g., the Internet, a wireless LAN, etc.).

In some embodiments, the robot 100 may also include a display device (not shown), which may include, but is not limited to, a touch-sensitive display, a touch pad, and the like. The one or more other devices 150 may include a graphics module (not shown) that includes various known software components for rendering and displaying graphics on the display device. Note that the term "graphics" covers any object that can be displayed to a user, including but not limited to text, web pages, icons (such as user interface objects including soft keys), digital images, video, animation, and so on. A touch-sensitive display or touch pad can also be used for user input.
The robot 100 perceives the external environment and its own condition through sensing peripherals such as the attitude sensor 132, the camera 134, the tactile sensor 136, the other sensing devices 138, and the microphone 120. The information perceived by the robot 100 is processed by the control devices corresponding to the sensing peripherals and handed over to the one or more CPUs 106. The robot's perception of the environment includes, but is not limited to, information detected by its own sensors (e.g., the attitude sensor 132, the camera 134, the tactile sensor 136, and the other sensing devices 138); it can also include information detected by external devices (not shown) connected to the robot 100: a communication connection is established between the robot 100 and the external device, over which the two transmit data. External devices include various types of sensors, smart home appliances, and so on.

In some embodiments, the information perceived by the robot 100 includes, but is not limited to, sound, images, environmental parameters, tactile information, time, space, and the like. Environmental parameters include, but are not limited to, temperature, humidity, gas concentration, etc. Tactile information includes, but is not limited to, contact with the robot 100, such as contact with a touch-sensitive display or contact with or proximity to a tactile sensor; tactile sensors can be placed on the head, arms, and other parts of the robot (not shown). Other forms of information are also included. Sound can include speech and other sounds; it can be sound collected by the microphone 120 or sound stored in the memory 102, and speech can include, but is not limited to, human speaking or singing. An image can be a single picture or a video; pictures and videos include, but are not limited to, those captured by the camera 134, and can also be read from the memory 102 or transmitted to the robot 100 over a network.

The information perceived by the robot 100 includes not only information external to the robot but also information about the robot 100 itself, including but not limited to its battery level, temperature, and the like. For example, when the robot 100 senses that its battery level is below 20%, it can move to the charging position and charge automatically.

It should be understood that the robot 100 is not limited to perceiving information in the ways described above; information can also be perceived in other forms, including through sensing technologies not yet developed as of the filing date of this document. Furthermore, the sensing devices of the robot 100 are not limited to those mounted on the robot 100; they can also include sensing devices associated with the robot 100 but not mounted on it, such as various sensors for perceiving information. As an example, the robot 100 may be associated with a temperature sensor, a humidity sensor (not shown), or the like deployed within a certain area, and perceive the corresponding information through these sensors. The robot 100 can communicate with these sensors over many types of communication protocols to obtain information from them.

In some embodiments, the information perceived by the robot 100 can be configured according to preset conditions, which may include, but are not limited to, which information the robot 100 perceives and at what time. For example, during a voice conversation with the user, the robot may be set to perceive the user's voice, track the user's face, and recognize the user's gestures, while not perceiving other information, or lowering the effect of other information when generating sensing units, or post-processing the other perceived information. Alternatively, during a certain time period (for example, while the user is out and the robot 100 is indoors alone), the robot may perceive environmental parameters, images, and video data, use the environmental parameters to decide whether to interact with devices such as the air conditioner, and use the image and video data to determine whether a stranger has entered the room. It should be understood that the conditions for configuring the perceived information are not limited to these; the above are only examples, and the information the robot 100 needs to perceive can be set according to the situation.
About the sensing unit
At least one sensing unit is defined; the sensing unit serves as the minimum unit (or minimum input unit) for controlling the robot 100, and the robot 100 makes interactive behaviors at least according to the sensing units. The interaction behavior of the robot 100 can be controlled by one or more sensing units: for example, when the values of one or more sensing units change, the robot 100 can respond to these changes with an interaction behavior; or, when the value of one or more sensing units is within a certain range or equal to a certain value, the robot 100 can respond to the sensing units with an interaction behavior. It should be understood that the control of the robot's interaction behavior by sensing units is not limited to the above cases, which are only examples.

In some embodiments, the sensing units may include multiple levels, where a higher-level sensing unit can contain one or more lower-level sensing units. In some embodiments, a higher-level sensing unit can contain one or more sensing units of the adjacent lower level, and sensing units of the same higher level can contain different lower-level sensing units. In time, the lower-level sensing units that compose a higher-level sensing unit include, but are not limited to, lower-level sensing units of the same time or time period, as well as historical lower-level sensing units from before that time or time period. In some embodiments, a higher-level sensing unit is determined by lower-level sensing units from different times.

In some embodiments, the value of a sensing unit can be one value or a set of values, or one or more ranges of values. The value of a sensing unit can be determined from the information perceived by the robot 100; one sensing unit can be determined by one or more pieces of perceived information, and the same sensing unit can be determined by different perceived data. Perceived information can include information perceived in real time or historically perceived information (for example, information perceived at a certain moment or during a certain period in the past). In some cases, the value of a sensing unit is determined jointly by information perceived in real time and historically perceived information.

As an example, several sensing units can be defined: hearing (ear), vision (eye), time (timer), whether someone is at home (so_at_home), and environment (environment). Hearing describes the speech heard: when the robot 100 receives sound, it performs speech recognition on the received sound to obtain the text of the speech, and the value of the hearing unit can be that text. In some embodiments, hearing can also include the direction of the sound, referenced to the robot's face: left, right, front, back, and so on. Vision describes the video monitoring situation: the robot 100 can analyze images or video to determine whether someone is present or whether there is movement, and the value of the vision unit can include whether someone is present, whether there is movement, and so on. The value of so_at_home can be "0" or "1", where "0" means no one is at home and "1" means someone is at home. Timer describes time information; its value can be a point in time or a time range, for example 14:00 on February 1 of each year. Environment describes environmental conditions, including temperature, humidity, noise, PM2.5, ppm of combustible gas in the air, carbon monoxide content in the air, oxygen content in the air, etc.; its value can be the value or range of each parameter.

In some embodiments, the values of a sensing unit can be predefined. A predefined value can be one or more specific values, or one or more ranges of values. The value can be an explicit value, or can be composed of wildcards (or the like) together with explicit values, but is not limited to this. For example, when the sensing unit is "speech", its value can be "*下雨*", matching any speech that contains "下雨" ("raining"); or its value can be "*[下有]雨*", matching any speech that contains "下雨" ("raining") or "有雨" ("rainy").

The robot 100 can generate sensing data from the sensing units and the perceived information. The sensing data can include one or more sensing units, together with the identifier and value of each sensing unit; for the values of the sensing units in the sensing data, see the description of sensing units above. To generate the sensing data from the perceived information according to the sensing units, the robot 100 can apply a variety of analysis methods to obtain the values of the sensing units, for example obtaining the text of speech through speech recognition, determining whether a person appears in a perceived image through image recognition, and determining the attributes of a person through portrait (face) recognition. It should be understood that the robot 100 is not limited to obtaining the values of sensing units in these ways; other ways are possible, including processing technologies not yet developed as of the filing date of this document.
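Putting the example units above together, the sensing data generated from one moment of perception might be sketched as follows; the identifiers ear, eye, so_at_home, and environment are the ones defined above, while the nested field names (text, direction, person, movement, and the parameter names under environment) are assumptions for illustration:

    {
      "ear": { "text": "下雨了", "direction": "front" },
      "eye": { "person": 1, "movement": 0 },
      "so_at_home": "1",
      "environment": { "temperature": 26, "humidity": 55, "pm2.5": 40 }
    }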
About control entries
Based on the predefined sensing units and the preset interaction behaviors to be performed by the robot, a trigger condition and the interaction behavior it triggers can be set. From the trigger condition and the interaction behavior it triggers, a control entry is generated for controlling the robot's interaction behavior in response to the information the robot perceives. A control entry can have a unique identifier to distinguish it from other control entries.

A trigger condition can be composed of one or more sensing units, and logical relations can be configured between the sensing units, including but not limited to AND, OR, and NOT. In some embodiments, the trigger condition includes the identifiers and values of the sensing units that compose it; the value of a sensing unit can be one value or a set of values, or one or more ranges of values, and can be an explicit value or a combination of wildcards (or the like) with explicit values, but is not limited to this. For example, when the sensing unit in a trigger condition is "speech", its value can be "*下雨*", matching any speech that contains "下雨" ("raining"), or "*[下有]雨*", matching any speech that contains "下雨" or "有雨".
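For instance, a trigger condition requiring that someone is at home AND that speech about rain is heard might be sketched as follows; the rel and units keys expressing the logical relation are assumptions, since the embodiments state only that such relations can be configured:

    {
      "ifs": {
        "rel": "and",
        "units": [
          { "so_at_home": "1" },
          { "ear": "*下雨*" }
        ]
      }
    }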
A trigger condition can trigger one or more interaction behaviors. In some embodiments, an order can be set among the interaction behaviors, so that multiple interaction behaviors are performed in the set order. An interaction behavior can be configured as one or more action instructions that can be parsed by the robot for execution, and an action instruction can also include one or more parameters. In some embodiments, the execution order of the one or more action instructions can also be configured; this order may include, but is not limited to, executing one action instruction or one group of action instructions at random, so as to perform one or more actions randomly, or executing multiple action instructions in a predetermined sequence of steps.

The operating system 144 of the robot 100 and other related devices can parse the action instructions of an interaction behavior so that the robot performs it. For example, to make the robot move forward by 5 meters, the action instruction can be "move{"m":5}". The interaction behavior control device 148 parses this instruction, obtains the task to be performed (move) and the task parameters (5 meters forward), and passes the task and parameters to the operating system 144, which further processes them so that the mobile mechanism (not shown) performs the movement; the mobile mechanism can be legged, wheeled, tracked, and so on. It should be understood that more specific instructions can also be set, such as the parameters of the individual motors (or similar components) of the mobile mechanism.

In some embodiments, the action instructions of an interaction behavior include: links to other control entries, set so that those control entries can be executed; and/or links to multiple parameters and/or multiple pieces of content, set so that content and/or parameters can be selected from them. Each control entry can have a unique identifier, and an action instruction can refer to that identifier to link to the control entry. The content linked by an action instruction can be a group of actions, from which the robot 100 can choose according to other factors: for example, attributes of the robot 100 such as personality or gender can be pre-configured and stored in the memory 102; robots 100 with different genders or personalities can behave differently in the same situation (or scene), and the robot 100 can select the action to perform from a group of actions, such as limb movements, according to these attributes. An action instruction can also link to one piece or one group of content, including but not limited to voice chat content and various Internet information. For example, if the action the robot 100 performs according to a control entry is to query the weather in Beijing, the action instruction can be an address for querying the weather, from which the robot 100 obtains the weather in Beijing; the address can include a uniform resource locator (URL), a memory address, a database field, and so on.
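A sketch of such links follows; the say, content, exec, and entry keys are hypothetical, and the URL is a placeholder in the style of the weather-query example above:

    {
      "trigger": [
        { "say": { "content": "http://example.com/weather?city=beijing" }, "gr": 1 },
        { "exec": { "entry": "entry-042" }, "gr": 2 }
      ]
    }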
The interaction behaviors of the robot 100 include, but are not limited to, outputting speech, adjusting posture, outputting images or video, and interacting with other devices. Outputting speech includes, but is not limited to, chatting with the user and playing music. Adjusting posture includes, but is not limited to, moving (e.g., imitating human walking), swinging limbs (e.g., swinging the arms or head), and adjusting expression. Outputting images or video includes, but is not limited to, displaying an image or video on the display device; the image can be a dynamic electronic expression, a captured image, or an image obtained from the network. Interacting with other devices includes, but is not limited to, controlling other devices (e.g., adjusting the working parameters of an air conditioner), transmitting data to other devices, and establishing connections with other devices. It should be understood that interaction behaviors are not limited to the items listed above; any reaction of the robot 100 to perceived information can be regarded as an interaction behavior of the robot 100.

Control entries can be configured in a data exchange format, or in other formats. Data exchange formats include, but are not limited to, XML, JSON, and YAML. Taking JSON as an example, suppose the following needs to be implemented: when the user says "sing me a song", the robot first moves 10 cm backward at medium speed at an angle of 0 and then starts singing a song; 10 seconds after the song ends it takes a photo and sends it to the user, and then moves 5 cm forward at an angle of 0. A control entry in JSON format for this can be constructed as follows.
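Based on the field names explained in the next paragraph (ifs, ear, trigger, move, song, take_pic, gr) and the behavior described above, the control entry might look like the following sketch; the parameter keys inside each action instruction are assumptions:

    {
      "ifs": { "ear": "*唱*歌*" },
      "trigger": [
        { "move": { "speed": "mid", "angle": 0, "cm": -10 }, "gr": 1 },
        { "song": { "url": "http://bpeer.com/i.mp3" }, "gr": 2 },
        { "take_pic": { "delay": 10, "send_to": "user" }, "gr": 3 },
        { "move": { "angle": 0, "cm": 5 }, "gr": 4 }
      ]
    }

On this reading, the two move instructions differ only in their parameters, and gr makes the four actions execute in sequence.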
在上述的控制条目中,“ifs”部分为根据感知单元设置的触发条件,“ear”为感知单元的标识,“唱歌”为感知单元的取值。"trigger"部分为触发条件触发的交互行为,包括“move(移动)”、“song(唱歌)”和“take_pic(拍照)”三个交互行为,每个交互行为包括相应的动作指令。其中,“song(唱歌)”链接到“http://bpeer.com/i.mp3”,唱歌的内容从“http://bpeer.com/i.mp3”中获取,“gr”为动作的执行顺序。In the above control entry, the "ifs" part is a trigger condition set according to the sensing unit, "ear" is the identification of the sensing unit, and "singing" is the value of the sensing unit. The "trigger" part is the interactive behavior triggered by the trigger condition, including three interaction behaviors of "move", "song" and "take_pic", each of which includes a corresponding action instruction. Among them, "song" is linked to "http://bpeer.com/i.mp3", the content of singing is obtained from "http://bpeer.com/i.mp3", and "gr" is action Execution order.
在某些实施例中,多个控制条目可以存储为数据交换格式的文档,或者也可以存储在数据库中。在某些实施例中,机器人100还可以包括数据库系统,该数据库系统用以存储控制条目。数据库系统提供接口供一个或多个CPU 106从数据库中读取数据,以及向数据库系统写入数据。 In some embodiments, multiple control entries may be stored as documents in a data exchange format, or may also be stored in a database. In some embodiments, the bot 100 can also include a database system for storing control entries. The database system provides an interface for one or more CPUs 106 to read data from the database and to write data to the database system.
机器人交互行为的控制装置148可以根据控制条目控制机器人的交互行为,控制装置148获取机器人感知到的信息,根据感知到的信息、至少按照预先定义的感知单元生成感知数据,其中,感知数据包括感知单元的标识和取值;查找与生成的感知数据匹配的控制条目;如果查找到与生成的感知数据匹配的控制条目,使机器人执行查找到的控制条目中的交互行为。The control device 148 of the robot interaction behavior may control the interaction behavior of the robot according to the control item, and the control device 148 acquires the information perceived by the robot, and generates the sensing data according to the perceived information, at least according to a predefined sensing unit, wherein the sensing data includes the sensing. The identification and value of the unit; finding a control entry that matches the generated perceptual data; if the control entry matching the generated perceptual data is found, causing the robot to perform the interaction behavior in the found control entry.
在某些实施例中,控制装置148也可以将机器人100感知到的信息发送出去,由远端服务器(未示出)根据感知到的信息和感知单元生成感知数据,并查找与生成的感知单元匹配的控制条目,然后将查找到的控制条目发送给控制装置148,控制装置148使机器人执行控制条目中的交互行为。可选地,可以生成感知到的信息的标识,以确定接收到的控制条目是否为针对发送的感知到的信息的控制条目。可选地,发送给控制装置148的可以是控制条目本身,也可以是控制条目的标识,或者控制条目配置的交互行为数据,或者其他使控制装置148确定控制条目配置的交互行为的信息。In some embodiments, the control device 148 can also transmit the information perceived by the robot 100, and the remote server (not shown) generates the sensing data according to the sensed information and the sensing unit, and searches for and generates the sensing unit. The matching control entries are then sent to the control device 148, which causes the robot to perform the interactive behavior in the control entry. Alternatively, an identification of the perceptual information may be generated to determine if the received control entry is a control entry for the transmitted perceptual information. Optionally, the control device 148 may be sent to the control entry itself, or may be an identifier of the control entry, or interactive behavior data that controls the configuration of the entry, or other information that causes the control device 148 to determine the interaction behavior of the control entry configuration.
In some embodiments, the control device 148 may generate perception data from the information perceived by the robot 100 and the sensing units and send the generated perception data to a remote server; the remote server receives the perception data, looks up a control entry matching the perception data, and sends the found control entry to the robot 100, whereupon the control device 148 causes the robot 100 to perform the interaction behavior in the control entry.
It should be understood that the control device 148 is not limited to controlling the interaction behavior of the robot in the manners described above; combinations of the above manners, or other manners, are also possible.
Generating control entries
In some embodiments, a trigger condition may be set according to sensing units, together with the interaction behavior triggered by that condition, to obtain a control entry, and the control entry is used as the data for controlling the interaction behavior of the robot 100.
Figure 2a illustrates a flowchart of a method for generating control data of a robot according to some embodiments of the present invention. As shown in Figure 2a, the method includes:
Step S202: setting a trigger condition for controlling robot interaction behavior according to one or more preset sensing units;
Step S204: setting the interaction behavior triggered by the trigger condition according to one or more preset interaction behaviors set for the robot to perform;
Step S206: generating, according to the set trigger condition and interaction behavior, a control entry for controlling the interaction behavior of the robot in response to information perceived by the robot.
In step S202 above, at least one sensing unit may be selected from the preset sensing units; attributes of the selected sensing units are set, where the attributes of a sensing unit include its value; and the trigger condition for controlling robot interaction behavior is set according to the selected sensing units and their attributes. In some embodiments, relationships between multiple sensing units are also set; such relationships include, but are not limited to, logical relationships such as AND, OR, and NOT. Step S202 may then set the trigger condition according to the selected sensing units, their attributes, and the relationships between them. In some embodiments, weights may also be assigned to sensing units to distinguish their relative importance, so that interaction behaviors correspond to the more important sensing units.
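As a hedged sketch, a trigger condition combining two sensing units might be expressed as follows in the JSON conventions of this document; "is_and" follows the submitted example later in this section and is assumed to encode the logical relationship ("1" for AND, "0" for OR), while the "weight" field is purely an illustrative assumption for the optional per-unit weights mentioned above:

{
  "ifs": {
    "ear": [{"txt": "你好"}],
    "eye": {"action": "human"},
    "is_and": "1",
    "weight": {"ear": "0.8", "eye": "0.2"}
  }
}

With "is_and" set to "1", the condition triggers only when both the speech and the vision sensing units match.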
In step S204 above, at least one interaction behavior may be selected from the preset interaction behaviors set for the robot to perform; attributes of the selected interaction behaviors are set, where the attributes of an interaction behavior include one or more action instructions that can be parsed by the robot for execution, together with the parameters of those action instructions; and the interaction behavior triggered by the trigger condition is set according to the selected interaction behaviors and their attributes. In some embodiments, an execution order for multiple interaction behaviors may also be set, and step S204 may set the triggered interaction behavior according to the selected interaction behaviors, their attributes, and this execution order. The execution order includes, but is not limited to, executing one or more interaction behaviors at random, or executing multiple interaction behaviors in predetermined steps.
In some embodiments, the trigger condition and the interaction behavior it triggers are described in a predetermined data representation. Optionally, the control entry may be generated from the trigger condition and the triggered interaction behavior using a data exchange format. Data exchange formats include, but are not limited to, one or any combination of XML, JSON, and YAML. It should be understood that other formats may also be used to express trigger conditions and the interaction behaviors they trigger, including data representations not yet developed as of the filing date of this document.
In some embodiments, multiple control entries may be set and stored as a document in a data exchange format. Alternatively, multiple control entries may be stored in a database. When multiple control entries are stored as a document in a data exchange format, adjacent control entries may be separated by a predetermined symbol to distinguish them. The document storing the control entries may be stored in the memory 102 of the robot 100, or in a remote server.
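For instance, assuming a JSON array is used as the enclosing structure, with the array commas acting as the predetermined separator, a document holding two control entries might look like this sketch (the "rule_name" values are illustrative; the action keys follow the submitted example later in this section):

[
  {"rule_name": "greet", "ifs": {"ear": [{"txt": "你好"}]}, "trigger": {"song": [{"gr": "1", "path": ["http:\/\/"]}]}},
  {"rule_name": "patrol", "ifs": {"eye": {"action": "human"}}, "trigger": {"move": [{"gr": "1", "dire": "left", "dire_value": "0", "speed": "3"}]}}
]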
An interaction behavior is configured as one or more action instructions. These action instructions include links to other control entries, set so that those control entries can be executed, and/or links to multiple parameters and/or multiple pieces of content, set so that content and/or parameters can be selected from them. For example, an action instruction for "query the weather" may link to a web page providing weather information, from which the weather for the city in question is obtained. The retrieved weather information may be shown on the display device of the robot 100, or broadcast by voice. In some embodiments, when linking to a set of action parameters, the parameters of the action to be executed may be selected according to other configuration; likewise, when linking to multiple pieces of content (for example, multiple chat corpora), the content to be presented may also be selected according to other configuration.
The execution order of action instructions may also be set, where the order includes executing one or more action instructions at random, or executing multiple action instructions in predetermined steps. The execution order may be marked with a symbol; if unmarked, the instructions may be executed in the order in which the actions are described. Actions of the same type may be treated as a whole, and the precedence between actions may be marked. For example, "first move forward 5 meters, nod 5 times, then move backward 10 meters" may be expressed as [move:{gr:0,m:+5;gr:2,m:-10};head{gr:1,head:5}], where "gr" indicates the execution order of the actions and actions with smaller values are executed first.
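Rendered in the JSON conventions used elsewhere in this document, the bracketed expression above corresponds roughly to the following sketch; the "head" action key is taken directly from that expression, and whether it is an actual predefined action identifier is an assumption:

{
  "trigger": {
    "move": [
      {"gr": "0", "m": "+5"},
      {"gr": "2", "m": "-10"}
    ],
    "head": [{"gr": "1", "head": "5"}]
  }
}

Because "gr" values are compared across action types, the nod (gr 1) executes between the forward move (gr 0) and the backward move (gr 2), even though the two moves are grouped under one key.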
In some embodiments, a graphical user interface (GUI) may be provided for setting trigger conditions and interaction behaviors. The GUI presents the available sensing units (for example, their names and identifiers), the values they can take, and the logical relationships between sensing units; a user setting a trigger condition can select sensing units, their values, and the logical relationships between them, after which the trigger condition is generated in the corresponding format. The GUI may likewise present the available interaction behaviors, which may be predefined; after the interaction behaviors are selected, they are generated in the corresponding format. In some embodiments, trigger conditions and interaction behaviors may also be edited directly, for example in the data exchange format described above, using the predefined sensing units and the action instruction specification for interaction behaviors, to obtain a control entry.
Figure 2b shows a generation interface for robot control data according to some embodiments. As shown in Figure 2b, a graphical user interface for adding a control entry (called "adding a rule" in Figure 2b) is provided. The interface comprises two parts, the trigger condition and the interaction behavior it triggers:
(1) The first part is "select the execution conditions of the robot rule" (the trigger condition in embodiments of the present invention). As shown in Figure 2b, the sensing units include "voice", "video surveillance", "time", "whether anyone is at home", and "environment", where the value of "voice" may be the text spoken to the robot, and the values of "video surveillance" include the robot detecting a person or detecting movement. Figure 2b also includes the relationships between sensing units, showing two relationships: "all conditions satisfied" (logical AND) and "any one condition satisfied" (logical OR).
(2) The second part is "add the robot's execution actions" (the interaction behavior in embodiments of the present invention). As shown in Figure 2b, these include interaction behaviors such as "speak", "enter standby", "record audio/video", "play music", "move", "vacuum", and "charge". In Figure 2b, the attributes of the "enter standby" interaction behavior include "stop all work and enter standby" and "exit".
Figure 2c shows a generation interface for the interaction behavior in the control data according to some embodiments. As shown in Figure 2c, after the interaction behavior to be added is selected and its attributes are set, clicking the "Add" button causes the data of the added interaction behavior to be displayed in the "action list" section. As shown in Figure 2c, the added interaction behaviors include "record audio/video", "play music", "move", "vacuum", and "charge". Here "gr" indicates the execution order of the interaction behaviors and, in Figure 2c, is generated automatically in the order in which the actions were added; in "play music (song)", "path" links to a set of content to be played (for example, music in MP3 format); in "vacuum (suck)", the attribute "power" is set to 3 and "m" (movement distance) is set to 3.
After setup is complete, clicking the "Submit" button generates the following control entry:
{
  "ifs": {"ear": [{"txt": "你好"}], "eye": {"action": "human"}, "is_and": "0"},
  "trigger": {
    "record": [{"gr": "1", "sint": "1"}],
    "song": [{"gr": "2", "path": ["http:\/\/"]}],
    "move": [{"gr": "3", "dire": "left", "dire_value": "0", "speed": "3"}],
    "suck": [{"gr": "4", "power": "3", "m": "3"}],
    "battery": [{"gr": "5", "b": "1"}]
  },
  "rule_name": "专利"
}
It should be understood that Figures 2b and 2c are only examples; in some embodiments, the graphical user interface for generating control entries may be provided in other ways, for example with icons that are dragged into an editing area to add sensing units or interaction behaviors.
In some embodiments, content (for example, web pages) may be crawled from the Internet and analyzed to obtain content for setting control entries, and trigger conditions and the interaction behaviors they trigger may be set from that content. For example, having learned from the Internet that an emergency number should be dialed when someone is ill, a trigger condition of "ill" may be set according to the sensing units, and the interaction behavior triggered by that condition may be set to "dial the emergency number", for example by setting the parameter of the preset interaction behavior "Call" to the emergency number. If a "health" sensing unit is predefined, its value can be set directly to "sick", and the resulting trigger condition may be {if("health":"sick")}. The robot 100 can judge the user's health from the perceived data and determine whether the health status is "sick", for example by voice-chatting with the user to learn the user's state and by detecting the user's heart rate, body temperature, and so on. When the health status is "sick", the perception data generated by the robot 100 includes {"health":"sick"}.
In some embodiments, control entries may also be generated from the interaction between robots 100 and users; multiple robots 100 may form a robot system. When no control entry can control the interaction behavior of a robot 100, that robot 100 may send the situation to one or more other robots 100, which interact with other users, obtain interaction behaviors from those interactions, and generate control entries from this process. For example, when a robot 100 perceives the user's voice message "How do you make Kung Pao Chicken" and finds no control entry matching that voice message, the robot 100 may send the voice message to other robots 100, which proactively interact by voice with their users, asking them "How do you make Kung Pao Chicken", and record and analyze the users' answers to generate a corresponding control entry. The robot 100 may select which robots 100 to send to according to information about their users; for example, if the robot 100 determines that the topic of a question is law, it may find a robot 100 whose user is a legal professional and send the question to that robot 100.
Controlling the robot's interaction behavior using control entries
Once control entries have been created as the data for controlling the robot's interaction behavior, the interaction behavior of the robot can be controlled according to those entries.
Figure 3 illustrates a first flowchart of a method for controlling robot interaction behavior according to some embodiments of the present invention. As shown in Figure 3, the method includes:
Step S302: acquiring information perceived by the robot;
Step S304: generating perception data from the perceived information, at least according to predefined sensing units, where the perception data includes the identifiers and values of sensing units (a sketch of such data follows these steps);
Step S306: looking up, among the stored control entries, a control entry matching the generated perception data;
Step S308: if a control entry matching the generated perception data is found, causing the robot to perform the interaction behavior in the found control entry.
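As a hedged illustration of step S304, the perception data generated when a user says "你好" to a robot that simultaneously detects a person might look as follows, using the identifier-and-value conventions of the control entries shown earlier; the exact enclosing structure is an assumption:

{
  "ear": [{"txt": "你好"}],
  "eye": {"action": "human"}
}

Step S306 then amounts to checking, for each stored control entry, whether the sensing units named in its "ifs" part appear in this perception data with matching values, subject to the logical relationship configured between the units.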
In some embodiments, the robot 100 communicates with a remote server (not shown) over a network. The robot 100 perceives at least one item of data, and the remote server acquires the perceived information from the robot 100, either by requesting that the robot 100 send the information it perceives, or by the robot sending the information to the remote server once it is perceived. The robot 100 may send the perceived information to the remote server periodically, or only when the perceived information changes, so as to reduce the volume of data transmitted between the remote server and the robot 100.
The control entry document may be stored in the remote server, which includes one or more processors and one or more modules, programs, or instruction sets stored in memory for performing the method shown in Figure 3. The remote server may be a single server or a cluster of multiple servers. It should be understood that the above programs or instruction sets are not limited to running on a single server and may also run on distributed computing resources.
In some embodiments, the found control entry may be sent to the robot 100, which reads the interaction behavior from the control entry and performs it. Alternatively, the data of the interaction behavior in the found control entry may be sent to the robot 100. Alternatively, the data of the interaction behavior in the control entry may be parsed into instructions executable by the robot 100, and the resulting instructions sent to the robot 100 for execution. It should be understood that these manners are merely examples.
Figure 4 illustrates a second flowchart of a method for controlling robot interaction behavior according to some embodiments of the present invention. As shown in Figure 4, the method includes:
Step S402: receiving perception data of the robot, where the perception data is generated from the information perceived by the robot, at least according to predefined sensing units, and includes the identifiers and values of sensing units;
Step S404: looking up, among the stored control entries, a control entry matching the robot's perception data;
Step S406: if a control entry matching the robot's perception data is found, causing the robot to perform the interaction behavior in the found control entry.
As shown in Figure 4, the robot 100 perceives at least one item of information, generates perception data from the perceived information and the sensing units, and sends the perception data out. In some embodiments, the robot 100 sends the perception data to a remote server (not shown). The robot 100 may send the perception data as soon as it is generated, or after receiving a request from the remote server.
In some embodiments, the remote server stores the control entry document, for example a document in a data exchange format, or a database. The control entry document may, of course, be stored in a distributed fashion across multiple storage spaces. The remote server may include one or more processors and one or more modules, programs, or instruction sets stored in memory for performing the method shown in Figure 4.
Figure 5 illustrates a third flowchart of a method for controlling robot interaction behavior according to some embodiments of the present invention. As shown in Figure 5, the method includes:
Step S502: perceiving at least one item of information;
Step S504: generating perception data from the perceived information, at least according to the predefined sensing units, where the perception data includes the identifiers and values of sensing units;
Step S506: sending the generated perception data out;
Step S508: receiving information of a control entry matching the perception data;
Step S510: performing, according to the information of the control entry, the interaction behavior configured in that control entry.
In some embodiments, the interaction behavior control device 148 of the robot 100 performs the method shown in Figure 5. The robot 100 perceives at least one item of information and, in accordance with a perception data generation policy, generates perception data according to the sensing units. After generating the perception data, the robot 100 sends it to the remote server. The remote server stores the control entry document and looks up, among the stored control entries, a control entry matching the robot's perception data; if one is found, it sends that control entry to the robot 100. In some embodiments, the action instructions of the interaction behavior in the control entry may be sent to the robot 100 instead.
In some embodiments, before the generated perception data is sent out, an identifier of the generated perception data may also be determined; the perception data is then sent together with its identifier. After finding a control entry matching the generated perception data, the remote server sends the information of the control entry and the identifier of the corresponding perception data to the control device 148. The information of the control entry may be the control entry itself, the identifier of the control entry, the behavior configured in the control entry, or any combination thereof, but is not limited to these. The control device receives the information of the control entry and, from the identifier of the perception data carried in it, determines whether the received information belongs to a control entry matching the generated perception data.
The control device 148 may determine the corresponding control entry from the identifier of the control entry and perform the interaction behavior in it. Alternatively, the control device 148 may read the configured interaction behavior directly from the control entry sent by the remote server and perform that interaction behavior. Furthermore, if the remote server sends the interaction behavior configured in the control entry, the control device 148 may parse and perform that interaction behavior directly.
In some embodiments, the robot's perception data may be matched against the trigger conditions in the control entries; the matching includes, but is not limited to, determining whether a given sensing unit is present and comparing the values of sensing units.
In some embodiments, when multiple trigger conditions matching the robot's perception data are found, the degree to which the robot's perception data matches each of the trigger conditions may be determined, and a control entry matching the generated perception data is selected at least according to the degree of matching. As an example, for speech text in the perception data, the degree of matching may be determined using, but not limited to, the edit distance: the smaller the edit distance, the more similar the two texts. Speech text may also be matched using regular expressions.
In some embodiments, priorities may also be set for control entries and consulted when selecting a control entry. For example, control entries may be classified into core control entries, user control entries, and temporary control entries, with core control entries having the highest priority, followed by user control entries, and then temporary control entries. When looking up a control entry, a control entry matching the perception data may first be sought among the core control entries. If none is found among the core control entries, a matching control entry may be sought among the user control entries. If none is found among the user control entries either, a matching control entry may be sought in the temporary training library.
In some embodiments, the robot 100 may perceive at least one item of information, generate perception data from the perceived information and the sensing units, read control entries (including, but not limited to, reading them from the memory 102 of the robot 100), and look up a control entry matching the generated perception data; if one is found, the robot 100 performs the interaction behavior in the found control entry.
In some embodiments, the control entry document may be stored both in the memory 102 of the robot 100 and in the remote server. The robot 100 perceives at least one item of information, generates perception data from the perceived information and the sensing units, reads control entries from the memory 102, and looks among the read control entries for one matching the generated perception data. If such a control entry is found, the robot 100 performs the interaction behavior in it; if none is found among the read control entries, the robot 100 may send the generated perception data to the remote server, which looks among its stored control entries for one matching the received perception data and, if one is found, causes the robot 100 to perform the interaction behavior in that control entry. The remote server may also send the found control entry to the robot 100, which may receive it through an interface (not shown) and store it.
As described above, when a control entry matching the perception data is found, the robot 100 is caused to perform the interaction behavior in the control entry. When no matching control entry is found, no interaction behavior need be performed; the robot 100 may continue to perceive at least one item of information, and what information to perceive may be determined according to preset conditions. In some embodiments, when no control entry matching the perception data is found, a voice reply may be given or content from the Internet may be imported (for example, displaying web page information). When no matching control entry is found, it may also be determined whether the perception data is related to speech (for example, whether a voice instruction from the user has been received); if the perception data is determined to be speech-related, a voice reply may be given, or relevant content may be searched on the Internet according to the speech content and presented to the user on the display device of the robot 100.
In some embodiments, control entries may be set according to the robot's interaction with the user. When no control entry matching the perception data of the robot 100 is found, the robot 100 may voice-chat with the user; during the chat, the robot 100 analyzes the user's needs and intentions to obtain the scenario and the robot's interaction behavior in that scenario, and generates a control entry from the scenario and the interaction behavior, according to the sensing units. For example, when the user is ill and says to the robot "I am sick", and the control entries of the robot 100 contain no interaction behavior for the user being ill, the robot 100 may interact with the user by voice, for example asking "I am not sure what needs to be done"; the user may say "Call my personal doctor for me, the number is ...", and the robot 100 can make the call. Moreover, in this case the robot 100 concludes from its analysis that a doctor needs to be contacted when the user is "sick", and from this result the robot 100 can generate a control entry, for example with trigger condition [if(health:sick)] and triggered interaction behavior [call{number:"//doctor_number.php"}].
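Assembled as a complete control entry in the JSON conventions of this document, the entry generated in this example might look like the following sketch; the "call" and "number" keys mirror the bracketed expression above, and "rule_name" is illustrative:

{
  "ifs": {"health": "sick"},
  "trigger": {"call": [{"gr": "1", "number": "\/\/doctor_number.php"}]},
  "rule_name": "sick_call_doctor"
}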
The structure of an apparatus for generating control data of a robot according to some embodiments is described below. Since the principle by which the apparatus for generating robot control data solves the problem is similar to that of the method for controlling robot interaction behavior, the implementation of the apparatus may refer to the implementation of the method for generating robot control data, and repeated details are omitted. As used below, the term "unit" or "module" may be a combination of software and/or hardware that implements a predetermined function. Although the apparatus described in the following embodiments is preferably implemented in software, an implementation in hardware, or in a combination of software and hardware, is also possible and contemplated.
Figure 6 illustrates a structural block diagram of an apparatus for generating control data of a robot according to some embodiments of the present invention. As shown in Figure 6, the apparatus includes:
a trigger condition setting module 602, configured to set a trigger condition for controlling robot interaction behavior according to one or more preset sensing units, where a sensing unit is set as the minimum unit for controlling robot interaction behavior;
an interaction behavior setting module 604, connected to the trigger condition setting module 602 and configured to set the interaction behavior triggered by the trigger condition according to one or more preset interaction behaviors set for the robot to perform;
a generation module 606, connected to the interaction behavior setting module 604 and configured to generate, according to the set trigger condition and interaction behavior, a control entry for controlling the interaction behavior of the robot in response to information perceived by the robot.
Figure 7 illustrates a structural block diagram of a trigger condition setting module 602 according to some embodiments of the present invention. As shown in Figure 7, the trigger condition setting module 602 may include: a sensing unit selection unit 702, configured to select at least one sensing unit from the preset sensing units; a sensing unit attribute setting unit 704, connected to the sensing unit selection unit 702 and configured to set the attributes of the selected sensing units, where the attributes of a sensing unit include its value; and a trigger condition setting unit 706, connected to the sensing unit attribute setting unit 704 and configured to set the trigger condition for controlling robot interaction behavior according to the selected sensing units and their attributes.
Figure 8 illustrates a structural block diagram of another trigger condition setting module 602 according to some embodiments of the present invention. As shown in Figure 8, in addition to the units included in Figure 7, the trigger condition setting module 602 may further include a relationship setting unit 708, connected to the trigger condition setting unit 706 and configured to set the relationships between multiple sensing units. The trigger condition setting unit 706 is further configured to set the trigger condition according to the selected sensing units, their attributes, and the relationships between the sensing units.
Figure 9 illustrates a structural block diagram of an interaction behavior setting module 604 according to some embodiments of the present invention. As shown in Figure 9, the interaction behavior setting module 604 may include: an interaction behavior selection unit 902, configured to select at least one interaction behavior from the preset interaction behaviors set for the robot to perform; an interaction behavior attribute setting unit 904, connected to the interaction behavior selection unit 902 and configured to set the attributes of the selected interaction behaviors, where the attributes of an interaction behavior include one or more action instructions that can be parsed by the robot for execution, together with the parameters of those action instructions; and an interaction behavior setting unit 906, connected to the interaction behavior attribute setting unit 904 and configured to set the interaction behavior triggered by the trigger condition according to the selected interaction behaviors and their attributes.
Figure 10 illustrates a structural block diagram of another interaction behavior setting module 604 according to some embodiments of the present invention. As shown in Figure 10, in addition to the units included in Figure 9, the interaction behavior setting module 604 may further include an order setting unit 908, connected to the interaction behavior setting unit 906 and configured to set the execution order of multiple interaction behaviors. The interaction behavior setting unit 906 is configured to set the interaction behavior triggered by the trigger condition according to the selected interaction behaviors, their attributes, and the execution order.
In embodiments of the present invention, a sensing unit is defined as the minimum unit for controlling robot interaction behavior, interaction behaviors are defined, and control entries set according to sensing units and interaction behaviors are used to control the robot's interaction behavior. This unifies the input and output standards of robot control, enables non-technical personnel to edit the robot's behavior, and effectively improves the robot's adaptive interaction capability and degree of intelligence.
Obviously, those skilled in the art should understand that the modules or steps of the embodiments of the present invention described above may be implemented by general-purpose computing devices; they may be concentrated on a single computing device or distributed over a network of multiple computing devices. Optionally, they may be implemented by program code executable by computing devices, so that they may be stored in a storage device and executed by a computing device; in some cases, the steps shown or described may be performed in an order different from that given here, or they may be fabricated as individual integrated circuit modules, or multiple modules or steps among them may be fabricated as a single integrated circuit module. Thus, embodiments of the present invention are not limited to any specific combination of hardware and software.
The above are merely preferred embodiments of the present invention and are not intended to limit the present invention; for those skilled in the art, the embodiments of the present invention may have various modifications and variations. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present invention shall be included within the scope of protection of the present invention.

Claims (21)

1. A method for generating control data of a robot, comprising:
setting a trigger condition for controlling robot interaction behavior according to one or more preset sensing units, wherein a sensing unit is set as the minimum unit for controlling robot interaction behavior;
setting the interaction behavior triggered by the trigger condition according to one or more preset interaction behaviors set for the robot to perform;
generating, according to the set trigger condition and interaction behavior, a control entry for controlling the interaction behavior of the robot in response to information perceived by the robot.
2. The method according to claim 1, wherein setting a trigger condition for controlling robot interaction behavior according to one or more preset sensing units comprises:
selecting at least one sensing unit from the preset sensing units;
setting attributes of the selected sensing unit, wherein the attributes of a sensing unit include the value of the sensing unit;
setting the trigger condition for controlling robot interaction behavior according to the selected sensing unit and its attributes.
3. The method according to claim 1 or 2, wherein setting the interaction behavior triggered by the trigger condition according to one or more preset interaction behaviors set for the robot to perform comprises:
selecting at least one interaction behavior from the preset interaction behaviors set for the robot to perform;
setting attributes of the selected interaction behavior, wherein the attributes of an interaction behavior include one or more action instructions of the interaction behavior that can be parsed by the robot for execution, together with parameters of the action instructions;
setting the interaction behavior triggered by the trigger condition according to the selected interaction behavior and its attributes.
4. The method according to claim 2 or 3, further comprising: setting relationships between multiple sensing units;
wherein setting the trigger condition according to the selected sensing unit and its attributes comprises: setting the trigger condition according to the selected sensing units, their attributes, and the relationships between the sensing units.
5. The method according to any one of claims 2 to 4, further comprising: setting an execution order of multiple interaction behaviors;
wherein setting the interaction behavior triggered by the trigger condition according to the selected interaction behavior and its attributes comprises: setting the interaction behavior triggered by the trigger condition according to the selected interaction behaviors, their attributes, and the execution order.
6. The method according to claim 2, wherein the value of a sensing unit is one or more predefined values,
and wherein setting the attributes of the selected sensing unit comprises: selecting the value of the sensing unit from the one or more predefined values of the sensing unit.
7. The method according to claim 3, wherein the parameters of an action instruction include: links to other control entries, set for executing those control entries, and/or links to multiple parameters and/or multiple pieces of content, set for selecting content and/or parameters from the multiple pieces of content and/or multiple parameters.
8. The method according to any one of claims 1 to 7, wherein generating, according to the set trigger condition and interaction behavior, a control entry for controlling the interaction behavior of the robot in response to information perceived by the robot comprises:
generating trigger condition data from the set trigger condition in a first predetermined format;
generating interaction behavior data from the set interaction behavior in a second predetermined format, wherein the first predetermined format is the same as or different from the second predetermined format;
associating the trigger condition data with the interaction behavior data to obtain a control entry for controlling the interaction behavior of the robot in response to information perceived by the robot.
9. The method according to claim 8, wherein the first predetermined format and the second predetermined format are data exchange formats.
10. The method according to claim 9, wherein the data exchange format comprises one or any combination of: XML, JSON, or YAML.
11. The method according to any one of claims 1 to 10, wherein the preset sensing units comprise multiple levels, and a sensing unit of a higher level comprises one or more sensing units of a lower level.
12. The method according to claim 11, wherein a sensing unit of a higher level comprises one or more sensing units of the adjacent lower level.
13. The method according to claim 1, further comprising:
setting and generating multiple control entries;
assigning a unique identifier to each control entry;
storing the multiple control entries in a document in a data exchange format and/or in a database document.
14. The method according to any one of claims 1 to 13, further comprising:
acquiring information perceived by the robot;
generating perception data from the perceived information according to at least one sensing unit, wherein the perception data includes the identifier and value of the sensing unit;
looking up a control entry matching the generated perception data;
if a control entry matching the generated perception data is found, causing the robot to perform the interaction behavior configured in the found control entry.
15. The method according to any one of claims 1 to 13, further comprising:
perceiving at least one item of information;
generating perception data from the perceived information according to at least one sensing unit, wherein the perception data includes the identifier and value of the sensing unit;
sending the generated perception data out;
receiving information of a control entry matching the generated perception data;
performing the interaction behavior configured in the control entry.
16. The method according to any one of claims 1 to 13, further comprising:
receiving perception data of the robot, wherein the perception data is generated from information perceived by the robot according to at least one sensing unit, and the perception data includes the identifier and value of the sensing unit;
looking up a control entry matching the robot's perception data;
if a control entry matching the robot's perception data is found, causing the robot to perform the interaction behavior configured in the found control entry.
17. An apparatus for generating control data of a robot, comprising:
a trigger condition setting module, configured to set a trigger condition for controlling robot interaction behavior according to one or more preset sensing units, wherein a sensing unit is set as the minimum unit for controlling robot interaction behavior;
an interaction behavior setting module, configured to set the interaction behavior triggered by the trigger condition according to one or more preset interaction behaviors set for the robot to perform;
a generation module, configured to generate, according to the set trigger condition and interaction behavior, a control entry for controlling the interaction behavior of the robot in response to information perceived by the robot.
18. The apparatus according to claim 17, wherein the trigger condition setting module comprises:
a sensing unit selection unit, configured to select at least one sensing unit from the preset sensing units;
a sensing unit attribute setting unit, configured to set attributes of the selected sensing unit, wherein the attributes of a sensing unit include the value of the sensing unit;
a trigger condition setting unit, configured to set the trigger condition for controlling robot interaction behavior according to the selected sensing unit and its attributes.
19. The apparatus according to claim 17 or 18, wherein the interaction behavior setting module comprises:
an interaction behavior selection unit, configured to select at least one interaction behavior from the preset interaction behaviors set for the robot to perform;
an interaction behavior attribute setting unit, configured to set attributes of the selected interaction behavior, wherein the attributes of an interaction behavior include one or more action instructions of the interaction behavior that can be parsed by the robot for execution, together with parameters of the action instructions;
an interaction behavior setting unit, configured to set the interaction behavior triggered by the trigger condition according to the selected interaction behavior and its attributes.
20. The apparatus according to claim 18 or 19, wherein the trigger condition setting module further comprises a relationship setting unit, configured to set relationships between multiple sensing units;
wherein the trigger condition setting unit is configured to set the trigger condition according to the selected sensing units, their attributes, and the relationships between the sensing units.
21. The apparatus according to claim 19 or 20, wherein the interaction behavior setting module further comprises an order setting unit, configured to set an execution order of multiple interaction behaviors;
wherein the interaction behavior setting unit is configured to set the interaction behavior triggered by the trigger condition according to the selected interaction behaviors, their attributes, and the execution order.
PCT/CN2016/087257 2015-06-26 2016-06-27 Method and apparatus for generating control data of robot WO2016206642A1 (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
CN201510363346.2 2015-06-26
CN201510363348.1 2015-06-26
CN201510363348.1A CN106325065A (en) 2015-06-26 2015-06-26 Robot interactive behavior control method, device and robot
CN201510364661.7 2015-06-26
CN201510363346.2A CN106325113B (en) 2015-06-26 2015-06-26 Robot controls engine and system
CN201510364661.7A CN106325228B (en) 2015-06-26 2015-06-26 Method and device for generating control data of robot

Publications (1)

Publication Number Publication Date
WO2016206642A1 true WO2016206642A1 (en) 2016-12-29

Family

ID=57584497

Family Applications (6)

Application Number Title Priority Date Filing Date
PCT/CN2016/087260 WO2016206645A1 (en) 2015-06-26 2016-06-27 Method and apparatus for loading control data into machine device
PCT/CN2016/087262 WO2016206647A1 (en) 2015-06-26 2016-06-27 System for controlling machine apparatus to generate action
PCT/CN2016/087257 WO2016206642A1 (en) 2015-06-26 2016-06-27 Method and apparatus for generating control data of robot
PCT/CN2016/087259 WO2016206644A1 (en) 2015-06-26 2016-06-27 Robot control engine and system
PCT/CN2016/087261 WO2016206646A1 (en) 2015-06-26 2016-06-27 Method and system for urging machine device to generate action
PCT/CN2016/087258 WO2016206643A1 (en) 2015-06-26 2016-06-27 Method and device for controlling interactive behavior of robot and robot thereof

Family Applications Before (2)

Application Number Title Priority Date Filing Date
PCT/CN2016/087260 WO2016206645A1 (en) 2015-06-26 2016-06-27 Method and apparatus for loading control data into machine device
PCT/CN2016/087262 WO2016206647A1 (en) 2015-06-26 2016-06-27 System for controlling machine apparatus to generate action

Family Applications After (3)

Application Number Title Priority Date Filing Date
PCT/CN2016/087259 WO2016206644A1 (en) 2015-06-26 2016-06-27 Robot control engine and system
PCT/CN2016/087261 WO2016206646A1 (en) 2015-06-26 2016-06-27 Method and system for urging machine device to generate action
PCT/CN2016/087258 WO2016206643A1 (en) 2015-06-26 2016-06-27 Method and device for controlling interactive behavior of robot and robot thereof

Country Status (1)

Country Link
WO (6) WO2016206645A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11220008B2 (en) * 2017-07-18 2022-01-11 Panasonic Intellectual Property Management Co., Ltd. Apparatus, method, non-transitory computer-readable recording medium storing program, and robot
CN108388399B (en) * 2018-01-12 2021-04-06 北京光年无限科技有限公司 Virtual idol state management method and system
JP7188950B2 (en) * 2018-09-20 2022-12-13 株式会社Screenホールディングス Data processing method and data processing program

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001353678A (en) * 2000-06-12 2001-12-25 Sony Corp Authoring system and method and storage medium
JP4108342B2 (en) * 2001-01-30 2008-06-25 日本電気株式会社 Robot, robot control system, and program thereof
US7089184B2 (en) * 2001-03-22 2006-08-08 Nurv Center Technologies, Inc. Speech recognition for recognizing speaker-independent, continuous speech
US6957215B2 (en) * 2001-12-10 2005-10-18 Hywire Ltd. Multi-dimensional associative search engine
KR101077404B1 (en) * 2003-11-20 2011-10-26 파나소닉 주식회사 Association control apparatus, association control method and service association system
JP2005193331A (en) * 2004-01-06 2005-07-21 Sony Corp Robot device and its emotional expression method
WO2006093394A1 (en) * 2005-03-04 2006-09-08 Chutnoon Inc. Server, method and system for providing information search service by using web page segmented into several information blocks
JP2007044825A (en) * 2005-08-10 2007-02-22 Toshiba Corp Action control device, action control method and program therefor
US7945441B2 (en) * 2007-08-07 2011-05-17 Microsoft Corporation Quantized feature index trajectory
CN102077260B (en) * 2008-06-27 2014-04-09 悠进机器人股份公司 Interactive learning system using robot and method of operating same in child education
FR2946160B1 (en) * 2009-05-26 2014-05-09 Aldebaran Robotics SYSTEM AND METHOD FOR EDIT AND ORDER BEHAVIOR OF MOBILE ROBOT.
US20110213659A1 (en) * 2010-02-26 2011-09-01 Marcus Fontoura System and Method for Automatic Matching of Contracts in an Inverted Index to Impression Opportunities Using Complex Predicates and Confidence Threshold Values
FR2963132A1 (en) * 2010-07-23 2012-01-27 Aldebaran Robotics HUMANOID ROBOT HAVING A NATURAL DIALOGUE INTERFACE, METHOD OF USING AND PROGRAMMING THE SAME
KR20120047577A (en) * 2010-11-04 2012-05-14 주식회사 케이티 Apparatus and method for providing robot interaction services using interactive behavior model
JP2015501025A (en) * 2011-10-05 2015-01-08 オプテオン コーポレーション Method, apparatus and system for monitoring and / or controlling a dynamic environment
US20150242505A1 (en) * 2012-09-27 2015-08-27 Omron Corporation Device managing apparatus and device searching method
CN103324100B (en) * 2013-05-02 2016-08-31 郭海锋 A kind of emotion on-vehicle machines people of information-driven
CN103729476A (en) * 2014-01-26 2014-04-16 王玉娇 Method and system for correlating contents according to environmental state
CN103793536B (en) * 2014-03-03 2017-04-26 陈念生 Intelligent platform obtaining method and device
CN105511608B (en) * 2015-11-30 2018-12-25 北京光年无限科技有限公司 Exchange method and device, intelligent robot based on intelligent robot

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101618280A (en) * 2009-06-30 2010-01-06 哈尔滨工业大学 Humanoid-head robot device with human-computer interaction function and behavior control method thereof
WO2011058530A1 (en) * 2009-11-16 2011-05-19 Koninklijke Philips Electronics, N.V. Human-robot shared control for endoscopic assistant robot
CN102446428A (en) * 2010-09-27 2012-05-09 北京紫光优蓝机器人技术有限公司 Robot-based interactive learning system and interaction method thereof
CN104640677A (en) * 2012-06-21 2015-05-20 睿信科机器人有限公司 Training and operating industrial robots
CN103399637A (en) * 2013-07-31 2013-11-20 西北师范大学 Man-computer interaction method for intelligent human skeleton tracking control robot on basis of kinect

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI735168B (en) * 2020-02-27 2021-08-01 東元電機股份有限公司 Voice robot

Also Published As

Publication number Publication date
WO2016206645A1 (en) 2016-12-29
WO2016206644A1 (en) 2016-12-29
WO2016206646A1 (en) 2016-12-29
WO2016206643A1 (en) 2016-12-29
WO2016206647A1 (en) 2016-12-29

Similar Documents

Publication Title
CN106325228B (en) Method and device for generating control data of robot
US11810562B2 (en) Reducing the need for manual start/end-pointing and trigger phrases
US9543918B1 (en) Configuring notification intensity level using device sensors
WO2021008538A1 (en) Voice interaction method and related device
WO2016206642A1 (en) Method and apparatus for generating control data of robot
CN106325065A (en) Robot interactive behavior control method, device and robot
US11367443B2 (en) Electronic device and method for controlling electronic device
WO2015155977A1 (en) Linking system, device, method, and recording medium
US20130159400A1 (en) User device, server, and operating conditions setting system
CN106325113B (en) Robot controls engine and system
KR20190009201A (en) Mobile terminal and method for controlling the same
CN110399474B (en) Intelligent dialogue method, device, equipment and storage medium
CN106921802B (en) Audio data playing method and device
CN111816168A (en) Model training method, voice playing method, device and storage medium
WO2023006033A1 (en) Speech interaction method, electronic device, and medium
US20200234187A1 (en) Information processing apparatus, information processing method, and program
KR20200101221A (en) Method for processing user input and electronic device supporting the same
WO2017081894A1 (en) Communication system and communication control method
US11731262B2 (en) Robot and method for operating the same
WO2020153146A1 (en) Information processing device and information processing method
CN109902606B (en) Operation method and terminal equipment
US20220055223A1 (en) Electronic device for providing reaction on basis of user state and operating method therefor
CN113241077A (en) Voice entry method and device for wearable device
EP2930889A1 (en) Systems and methods for adaptive notification networks

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16813758

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16813758

Country of ref document: EP

Kind code of ref document: A1