WO2016206644A1 - Robot control engine and system - Google Patents


Info

Publication number
WO2016206644A1
WO2016206644A1 (PCT/CN2016/087259)
Authority
WO
WIPO (PCT)
Prior art keywords
robot
control
sensing unit
sensing
data
Prior art date
Application number
PCT/CN2016/087259
Other languages
French (fr)
Chinese (zh)
Inventor
聂华闻
Original Assignee
北京贝虎机器人技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN201510364661.7A external-priority patent/CN106325228B/en
Priority claimed from CN201510363346.2A external-priority patent/CN106325113B/en
Priority claimed from CN201510363348.1A external-priority patent/CN106325065A/en
Application filed by 北京贝虎机器人技术有限公司 filed Critical 北京贝虎机器人技术有限公司
Publication of WO2016206644A1 publication Critical patent/WO2016206644A1/en

Classifications

    • G — PHYSICS
    • G05 — CONTROLLING; REGULATING
    • G05B — CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00 — Programme-control systems
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 13/00 — Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G06F 13/10 — Program control for peripheral devices

Definitions

  • the invention relates to the field of artificial intelligence technology, and in particular to a robot control engine and system.
  • Today's robots are mostly industrial robots, and most industrial robots operate without perception.
  • the operating procedures of these robots are pre-defined, and the determined tasks are completed without deviation in accordance with the predetermined procedures. They lack adaptability and produce consistent results only when the objects involved are identical.
  • the embodiment of the invention provides a robot control engine and system to at least effectively improve the adaptive interaction behavior and the degree of intelligence of the robot.
  • a robot control engine includes:
  • the sensing data acquiring device is configured to acquire sensing data generated from information perceived by the robot according to at least one preset sensing unit, wherein the sensing data includes the value of the sensing unit;
  • the control item generating device is configured to generate and maintain control entries that control the interaction behavior of the robot based on the perceptual data of the robot, wherein a control entry includes a trigger condition composed of the at least one sensing unit and an interaction behavior triggered by the trigger condition;
  • the inverted index generating device is configured to generate an inverted index with the sensing unit included in the trigger condition of each control entry as the primary key and the identification of the control entry as the target;
  • the control item retrieval device is configured to retrieve a control entry for controlling the interaction behavior of the robot based on the perceptual data of the robot and the inverted index.
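The index-and-retrieve arrangement above can be sketched in Python as follows. The control-entry structure, field names, and example values are illustrative assumptions, not the patent's actual implementation; the sketch only shows the shape of the mechanism: sensing-unit identifiers as primary keys mapping to control-entry identifiers.

```python
# Sketch of the inverted-index retrieval described above.
# Entry structure and field names are illustrative assumptions.
from collections import defaultdict

control_entries = {
    "e1": {"trigger": {"ear": "*rain*"}, "behavior": "speak_weather"},
    "e2": {"trigger": {"ear": "*sing*", "system_time": "13-00-00"},
           "behavior": "sing_song"},
}

# Build the inverted index: sensing-unit identifier -> control-entry ids.
inverted = defaultdict(set)
for entry_id, entry in control_entries.items():
    for unit_id in entry["trigger"]:
        inverted[unit_id].add(entry_id)

def retrieve(perceptual_data):
    """Return candidate entries whose trigger mentions any perceived unit."""
    candidates = set()
    for unit_id in perceptual_data:
        candidates |= inverted.get(unit_id, set())
    return candidates

print(retrieve({"ear": "sing me a song"}))  # candidates indexed under "ear"
```

The retrieval step returns candidates only; whether a candidate actually fires still depends on matching the sensing-unit values against the trigger condition, which later sections describe.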
  • a robot control engine includes:
  • the sensing data acquiring device is configured to acquire sensing data generated from information perceived by the robot according to at least one preset sensing unit, wherein the sensing data includes the value of the sensing unit;
  • the control item generating device is configured to generate control entries for controlling the interaction behavior of the robot based on the perceptual data of the robot, wherein a control entry includes a trigger condition composed of the at least one sensing unit and an interaction behavior triggered by the trigger condition;
  • the sensing unit classifying device is configured to classify the sensing units by type, to form sensing unit sets differentiated by sensing unit type;
  • the inverted index generating device is configured to generate, based on the sensing unit sets, a plurality of inverted indexes differentiated by sensing unit type, with the sensing unit included in the trigger condition of each control entry as the primary key and the identification of the control entry as the target;
  • the control item retrieval agent device is configured to analyze the sensing units included in the perceptual data of the robot, and to select a corresponding inverted index based on the types of the sensing units included in the perceptual data of the robot;
  • the control item retrieval device is configured to retrieve a control entry for controlling the interaction behavior of the robot based on the perceptual data of the robot and the inverted index selected by the control item retrieval agent device.
  • a robot control engine includes:
  • the sensing data acquiring device is configured to acquire sensing data generated from information perceived by the robot according to at least one preset sensing unit, wherein the sensing data includes the value of the sensing unit;
  • the control item generating device is configured to generate control entries for controlling the interaction behavior of the robot based on the perceptual data of the robot, wherein a control entry includes a trigger condition composed of the at least one sensing unit and an interaction behavior triggered by the trigger condition;
  • the inverted index generating device is configured to convert the value of the sensing unit included in the trigger condition of each control entry into an integer (for example, by a digital signature technique), and to generate an inverted index with the obtained integer as the primary key and the identification of the control entry as the target;
  • the control item retrieval device is configured to convert the value of the sensing unit in the perceptual data of the robot into an integer based on the same digital signature technique, and to retrieve a control entry for controlling the interaction behavior of the robot using the obtained integer and the inverted index.
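The integer-conversion step above can be sketched as follows. The patent names digital signature technology; as an assumption, a stable cryptographic hash stands in for it here, since the property the index needs is only that the same (sensing unit, value) pair maps to the same integer key at index-build time and at query time.

```python
# Sketch: map sensing-unit values to fixed-width integers so the
# inverted index can use integer primary keys. A stable hash stands
# in for the patent's digital signature technique (an assumption).
import hashlib

def value_signature(unit_id, value):
    digest = hashlib.md5(f"{unit_id}={value}".encode("utf-8")).digest()
    return int.from_bytes(digest[:8], "big")  # 64-bit integer key

# Index build time: key the entry ids by the integer signature.
index = {value_signature("ear", "raining"): {"e1"}}

# Query time: the same (unit, value) pair yields the same key.
query_key = value_signature("ear", "raining")
print(index[query_key])
```

Integer keys make the index lookup a fixed-width comparison rather than a string comparison, which is the apparent motivation for the conversion.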
  • a robot control engine includes:
  • the sensing data acquiring device is configured to acquire sensing data generated from information perceived by the robot according to at least one preset sensing unit, wherein the sensing data includes the value of the sensing unit;
  • the control item generating device is configured to generate control entries for controlling the interaction behavior of the robot based on the perceptual data of the robot, wherein a control entry includes a trigger condition composed of the at least one sensing unit and an interaction behavior triggered by the trigger condition;
  • the inverted index generating device is configured to generate an inverted index with the sensing unit included in the trigger condition of each control entry as the primary key and the identification of the control entry as the target;
  • the control item retrieval device is configured to retrieve control entries for controlling the interaction behavior of the robot based on the values of the sensing units in the perceptual data of the robot and the inverted index;
  • the retrieval result synthesizing device is configured to merge the control entries retrieved based on the values of the respective sensing units in the perceptual data of the robot, to form a control entry that matches the perceptual data of the robot.
  • a robot control engine includes:
  • the sensing data acquiring device is configured to acquire sensing data generated from information perceived by the robot according to at least one preset sensing unit, wherein the sensing data includes the value of the sensing unit;
  • the control item generating device is configured to generate control entries for controlling the interaction behavior of the robot based on the perceptual data of the robot, wherein a control entry includes a trigger condition composed of the at least one sensing unit and an interaction behavior triggered by the trigger condition;
  • the inverted index generating device is configured to generate an inverted index with the sensing unit included in the trigger condition of each control entry as the primary key and the identification of the control entry as the target;
  • the control item retrieval device is configured to retrieve control entries for controlling the interaction behavior of the robot based on the values of the sensing units in the perceptual data of the robot and the inverted index;
  • the retrieval result synthesizing device is configured to merge the control entries retrieved based on the values of the respective sensing units in the perceptual data of the robot, according to the logical relationships between the sensing units constituting the trigger conditions in the retrieved control entries, to form a control entry that matches the perceptual data of the robot.
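The merge step above can be sketched as follows: per-sensing-unit retrieval results are combined according to the logical relationship ("and" / "or") between the units in the trigger condition. The data shapes are illustrative assumptions.

```python
# Sketch of merging per-unit retrieval results by the logical
# relationship between the trigger condition's sensing units.
# Structures are illustrative assumptions.
per_unit_hits = {
    "ear": {"e1", "e2"},     # entries retrieved via the "ear" unit
    "system_time": {"e2"},   # entries retrieved via "system_time"
}

def merge(hits, relation):
    """Intersect for "and", union for "or"."""
    sets = list(hits.values())
    result = sets[0]
    for s in sets[1:]:
        result = result & s if relation == "and" else result | s
    return result

print(merge(per_unit_hits, "and"))  # entries matching every unit
print(merge(per_unit_hits, "or"))   # entries matching any unit
```

An "and" relation thus narrows to entries whose every trigger unit matched the perceptual data, while "or" keeps any entry that matched at least one unit.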
  • a robot control engine includes:
  • the sensing data acquiring device is configured to acquire sensing data generated from information perceived by the robot according to at least one preset sensing unit, wherein the sensing data includes the value of the sensing unit;
  • the control item generating device is configured to generate control entries for controlling the interaction behavior of the robot based on the perceptual data of the robot, wherein a control entry includes a trigger condition composed of the at least one sensing unit and an interaction behavior triggered by the trigger condition;
  • the inverted index generating device is configured to generate an inverted index with the sensing unit included in the trigger condition of each control entry as the primary key and the identification of the control entry as the target;
  • the control item retrieval device is configured to retrieve control entries for controlling the interaction behavior of the robot based on the perceptual data of the robot and the inverted index;
  • the control item sorting device is configured to sort the control entries retrieved by the control item retrieval device, so as to select the control entry for controlling the robot's interaction behavior based on the sorted result.
  • a robot control engine includes:
  • the sensing data acquiring device is configured to acquire sensing data generated from information perceived by the robot according to at least one preset sensing unit, wherein the sensing data includes the value of the sensing unit;
  • the control item generating device is configured to generate control entries for controlling the interaction behavior of the robot based on the perceptual data of the robot, wherein a control entry includes a trigger condition composed of the at least one sensing unit and an interaction behavior triggered by the trigger condition;
  • the inverted index generating device is configured to generate an inverted index with the sensing unit included in the trigger condition of each control entry as the primary key and the identification of the control entry as the target;
  • the control item retrieval device is configured to retrieve control entries for controlling the interaction behavior of the robot based on the perceptual data of the robot and the inverted index;
  • the user feedback obtaining device is configured to obtain user feedback on the interaction behavior between the user and the robot;
  • the control item execution status recording device is configured to record execution status information of the control entries, to form an execution log;
  • the user behavior recording device is configured to record user behavior, to form a user behavior log;
  • the control item ranking device is configured to sort the control entries retrieved by the control item retrieval device based on the user feedback, and/or the execution log, and/or the priority of the control entries, and/or the user behavior log, so as to select the control entry that controls the interaction behavior of the robot based on the sorted result.
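The ranking step above can be sketched as a weighted score over the signals it names (priority, user feedback, execution log). The scoring formula, weights, and per-entry statistics are invented for illustration; the patent does not specify them.

```python
# Sketch: rank retrieved control entries by a weighted score over
# priority, user feedback, and execution history. Formula and
# weights are illustrative assumptions.
entries = {
    "e1": {"priority": 1, "positive_feedback": 3, "executions": 10},
    "e2": {"priority": 5, "positive_feedback": 1, "executions": 2},
}

def score(stats):
    # Favor explicit priority, then feedback; lightly reward past use.
    return (stats["priority"] * 10
            + stats["positive_feedback"] * 2
            + stats["executions"] * 0.1)

ranked = sorted(entries, key=lambda eid: score(entries[eid]), reverse=True)
print(ranked)  # highest-scoring entry first
```

The top-ranked entry would then be selected to drive the robot's interaction behavior, per the sorted-result selection described above.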
  • a robot control engine includes:
  • the sensing data acquiring device is configured to acquire sensing data generated from information perceived by the robot according to at least one preset sensing unit, wherein the sensing data includes the value of the sensing unit;
  • the Internet content crawling device is configured to crawl content from the Internet, to form a set of Internet content;
  • the control item generating device is configured to generate, based on the set of Internet content, the preset sensing units, and the preset interaction behaviors, control entries for controlling the interaction behavior of the robot based on the perceptual data of the robot;
  • the inverted index generating device is configured to generate an inverted index with the sensing unit included in the trigger condition of each control entry as the primary key and the identification of the control entry as the target;
  • the control item retrieval device is configured to retrieve a control entry for controlling the interaction behavior of the robot based on the perceptual data of the robot and the inverted index.
  • a storage medium is further provided, in which software for implementing the above is stored; storage media include, but are not limited to: optical disks, floppy disks, hard disks, erasable memory, and the like.
  • the embodiments of the invention provide a robot control engine and system. Sensing units and interaction behaviors for controlling the robot are pre-defined and used as the smallest units for controlling the robot's interaction behavior. Control entries for controlling the robot, each consisting of a trigger condition set according to the sensing units and an interaction behavior triggered by that trigger condition, are obtained from the preset sensing units and interaction behaviors. This unifies the input and output standards of robot control, so that non-technical personnel can also edit the robot's behavior, thereby facilitating control of the robot's interaction behavior and effectively improving the robot's adaptive interaction behavior and degree of intelligence.
  • FIG. 1 is a schematic diagram of a robot control system according to an embodiment of the present invention.
  • FIG. 2 is a schematic diagram of a robot according to an embodiment of the present invention.
  • FIG. 3 is a structural block diagram of a robot control engine according to an embodiment of the present invention.
  • FIG. 4 is a structural block diagram of another robot control engine according to an embodiment of the present invention.
  • FIG. 5 is a structural block diagram of still another robot control engine according to an embodiment of the present invention.
  • FIG. 6 is a structural block diagram of still another robot control engine according to an embodiment of the present invention.
  • FIG. 7 is a structural block diagram of another robot control engine according to an embodiment of the present invention.
  • At least one sensing unit is defined in advance, and the value of the sensing unit depends on information perceived by the robot.
  • the sensing unit acts as the smallest unit (or called the minimum input unit) that controls the robot, and the robot can make interactive behavior based on at least the sensing unit.
  • the interaction behavior of the robot may be controlled by one or more sensing units. For example, when the values of one or more sensing units change, the robot may respond to the changes by making an interactive behavior; or, when the values of one or more sensing units fall within a certain range or equal a certain value, the robot may respond to the sensing units by making an interactive behavior. It should be understood that the control of the interaction behavior of the robot by the sensing units is not limited to the above cases, which are merely illustrative.
  • the sensing unit can include multiple levels, and the higher level sensing unit can include one or more sensing units of the lower level.
  • a higher-level sensing unit may include one or more sensing units of the adjacent lower level, and sensing units of the same higher level may include different lower-level sensing units.
  • the lower-level sensing units that compose a higher-level sensing unit include, but are not limited to, lower-level sensing units of the same time or time period, and historical lower-level sensing units of past times or time periods.
  • the higher-level sensing units are determined by lower-level sensing units at different times.
  • the value of the sensing unit may be one or a set of values, or may be a range of one or more values.
  • the value of the sensing unit may be determined according to the information perceived by the robot.
  • one sensing unit may be determined by one or more pieces of perceived information, and the same sensing unit may be determined by different perceived data.
  • the perceived information may include information perceived in real time, or historically perceived information (such as information perceived at a certain time or period in the past). In some cases, the value of the sensing unit is determined jointly by real-time information and historical information.
  • sensing units include, for example, hearing (ear), vision (eye), a timer, whether someone is at home (so_at_home), and environment (environment).
  • the hearing unit describes the speech that is heard: when the robot receives sound, it performs speech recognition on the received sound to identify the text of the speech, and the value of the hearing unit may be the text of the speech that is heard.
  • in some embodiments, the sound source can also be located.
  • the hearing unit can also include the direction of the sound, referenced to the face of the robot, including left, right, front, and rear directions.
  • emotion recognition technology can also be applied to the received voice information to identify the emotions contained in the voice.
  • the robot can analyze images or video to determine whether anyone is currently present or whether there is movement.
  • the value of the vision unit can include whether anyone is present, whether there is movement, etc.
  • the monitored object can also be identified based on video surveillance; for example, the emotions of at least one user in conversation with the robot can be determined based on facial recognition and physical activity. The value of whether someone is at home can be "0" or "1": "0" means no one is at home, and "1" means someone is at home. Whether someone is at home can be determined in a variety of ways, for example, through video surveillance to determine whether the monitored objects include people.
  • the timer describes time information, and its value may be a time point or a time range, for example, 14:00 on February 1 of each year.
  • the environment describes the environmental conditions, including temperature, humidity, noise, PM2.5, ppm of gas in the air, carbon monoxide content in the air, oxygen content in the air, etc., and the value may be the value or range of each parameter.
  • the value of the sensing unit can be predefined.
  • the value of the predefined sensing unit may be one or more specific values, or one or more ranges of values.
  • the value of the sensing unit may be an explicit value, or may be formed by wildcards (or the like) together with explicit values, but is not limited thereto. For example, when the sensing unit is "speech", the value may be "*rain*", indicating that any voice information containing "rain" matches; or the value may use optional segments, indicating that any voice message containing either "rain" or "raining" matches.
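Wildcard-valued sensing units like the "*rain*" example above can be sketched with shell-style pattern matching. Using `fnmatch` is an illustrative choice, not something the patent mandates.

```python
# Sketch of matching a wildcard-valued sensing unit against
# recognized speech, per the "*rain*" example. fnmatch-style
# matching is an illustrative assumption.
from fnmatch import fnmatch

def unit_matches(pattern, perceived_value):
    """True if the perceived value satisfies the unit's pattern."""
    return fnmatch(perceived_value, pattern)

print(unit_matches("*rain*", "it is raining outside"))  # contains "rain"
print(unit_matches("*rain*", "clear skies today"))      # no match
```

A trigger condition would apply such a check to each of its sensing units before the entry's interaction behavior is allowed to fire.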
  • the robot may generate the sensing data according to at least the sensing units and the perceived information; the sensing data may include one or more sensing units, and includes the value of each sensing unit and the identification of each sensing unit.
  • when generating sensing data from perceived information according to the sensing units, the robot can obtain the values of the sensing units through various analysis methods, for example, obtaining the text of speech through speech recognition technology, analyzing whether a perceived image contains a portrait through image recognition technology, and determining the attributes of a portrait through portrait (face) recognition technology, and so on. It should be understood that the robot is not limited to obtaining the values of the sensing units in the above-mentioned manners, and may also use other means, including processing technologies that had not been developed at the filing date of this document.
  • a plurality of sensing units may be preset. It should be understood that the exemplary sensing units below do not limit the division of sensing units, the number of sensing units, or the expression of sensing units; in fact, any division of sensing units can be considered.
  • An example of the sensing unit is shown in Table 1.
  • an exemplary piece of perceptual data is given below. It should be understood that the following perceptual data does not limit the number of elements of perceptual data, the definition of perceptual data elements, or the format or expression of perceptual data.
  • the perceptual data of an example case is expressed in JSON as follows, but is not limited thereto; other methods are also possible.
  • "vision_human_position" records that the human user is behind the robot ("back").
  • "back" can also be represented by other characters that distinguish different positions. It should be understood that the position can also be expressed as an angle value, for example, "vision_human_position": "45°".
  • "sensing_touch" records the touch of the human user on the robot; the position of the touch is the hand ("hand"). "hand" can also be represented by other characters that distinguish different positions. It should be understood that there may be multiple touch positions, and the value of "sensing_touch" can be an array recording multiple locations.
  • "audio_speak_txt" records what the human user said, "very happy to see you"; the content can also be audio data.
  • "audio_speak_language" records the language, "chinese", spoken by the human user.
  • "vision_human_posture" records the human user's posture, "posture1"; "posture1" can also be represented by other characters that distinguish different postures.
  • "system_date" records the date, "2016/3/16", of the generation of the perceptual data.
  • "system_time" records the time, "13-00-00", of the generation of the perceptual data.
  • "system_power" records the robot's power, "80%"; it should be understood that the power can also be identified in other ways.
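Gathering the fields described above, the example perceptual data might look like the following JSON document. The original listing is not reproduced in this text, so this is a reconstruction from the field descriptions, not the patent's actual listing.

```python
# Reconstruction of the example perceptual data from the field
# descriptions above; the original JSON listing is not reproduced
# in the text, so this is an assumption-based sketch.
import json

perceptual_data = {
    "vision_human_position": "back",   # human user is behind the robot
    "sensing_touch": "hand",           # touch position (may be an array)
    "audio_speak_txt": "very happy to see you",
    "audio_speak_language": "chinese",
    "vision_human_posture": "posture1",
    "system_date": "2016/3/16",
    "system_time": "13-00-00",
    "system_power": "80%",
}

print(json.dumps(perceptual_data, indent=2))
```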
  • the trigger condition and the interaction behavior triggered by the trigger condition can be set.
  • a control item for controlling the interaction behavior of the robot in response to the information perceived by the robot is generated.
  • Control entries can have unique identifiers.
  • the trigger condition may be composed of one or more sensing units, and logical relationships may be configured between the sensing units; the logical relationships include, but are not limited to, "and", "or", "not", and the like.
  • the triggering condition may include an identification and a value of the sensing unit constituting the triggering condition, and the value of the sensing unit may be one or a set of values, or one or a set of value ranges.
  • the value of the sensing unit can be an explicit value, or can be formed by wildcards (or the like) together with explicit values, but is not limited thereto.
  • for example, when the sensing unit in the trigger condition is "speech", the value may be "*rain*", indicating that any voice information containing "rain" matches; or the value may use optional segments, indicating that any voice message containing either "rain" or "raining" matches.
  • the trigger condition can trigger one or more interaction behaviors.
  • the order between interactions can be set to perform multiple interactions in a set order.
  • the order of execution of the one or more interactions can also be configured.
  • the execution order may include, but is not limited to, randomly performing one or a set of interaction behaviors to achieve random execution of one or more actions; or performing a plurality of interaction behaviors in a predetermined sequence of steps.
  • the interaction behavior can be configured as one or more action instructions that can be parsed by the robot for execution, and the action instructions can also include one or more parameters.
  • the order of execution of the one or more action instructions can also be configured.
  • the execution order may include, but is not limited to, randomly executing one or a set of action instructions to effect random execution of one or more actions; or executing a plurality of action instructions in a predetermined sequence of steps.
  • the action instructions of an interaction behavior may include: links to other control entries, set so that those other control entries are executed; and/or links to multiple contents and/or multiple parameters, set so that content and/or parameters can be selected from them.
  • each control entry may have a unique identification, by which an action instruction may refer to the control entry.
  • the content linked to an action instruction may be a set of actions, and the robot may perform actions from the set according to other factors. For example, attributes such as the personality or gender of the robot may be pre-configured and stored in a memory; robots of different genders or personalities may exhibit different interaction behaviors in the same situation (or scene).
  • the robot may select an action to be performed from a group of actions according to attributes such as set personality or gender, and these actions may include, but are not limited to, the body motion of the robot.
  • the one or a group of content linked to the action instruction may include, but is not limited to, the content of the voice chat, various Internet information, and the like.
  • for example, if the action performed by the robot according to the control entry is to query the weather in Beijing, the action instruction may be a weather-query address; the robot obtains the weather in Beijing from this address, which can include a Uniform Resource Locator (URL), a memory address, a database field, and so on.
  • Robot interactions include, but are not limited to, outputting speech, adjusting gestures, outputting images or video, interacting with other devices, and the like.
  • outputting speech includes, but is not limited to, chatting with a user and playing music; adjusting gestures includes, but is not limited to, moving (e.g., mimicking human walking), limb swings (e.g., arm or head swings), posture adjustment, etc.; outputting images or video includes, but is not limited to, displaying an image or video on a display device, where the image may be a dynamic electronic expression or the like, a captured image, or an image obtained from a network; interaction with other devices includes, but is not limited to, controlling other devices (for example, adjusting the operating parameters of air-conditioning equipment), transmitting data to other devices, establishing connections with other devices, etc.
  • the interaction behavior is not limited to the above enumerated contents, and the robot's reaction to the perceived information can be regarded as the interaction behavior of the robot.
  • Control entries can be configured in a data exchange format, although other formats can be used.
  • Data exchange formats include, but are not limited to, XML, JSON, or YAML.
  • Take JSON as an example. Suppose the following behavior is to be implemented: when the user says "Sing me a song", the robot first moves back 10 cm at medium speed at an angle of 0 and then starts singing a song; after singing the song, it takes a photo and sends it to the user 10 seconds later; then it moves forward 5 cm at an angle of 0.
  • the control entry for the JSON data format can be as follows:
  • the "ifs” part is a trigger condition set according to the sensing unit
  • "ear” is the identification of the sensing unit
  • “singing” is the value of the sensing unit.
  • the "trigger" part is the interaction behavior triggered by the trigger condition, including the three interaction behaviors "move", "song", and "take_pic", each of which includes a corresponding action instruction. Among them, "song" is linked to "http://bpeer.com/i.mp3", from which the content to sing is obtained, and "gr" indicates the execution order of the actions.
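The described control entry might look like the following. The original JSON listing is not reproduced in this text; the "ifs", "trigger", and "gr" names come from the description above, while the per-action parameter names and values are illustrative assumptions.

```python
# Reconstruction of the control entry described above ("ifs" trigger
# plus "move" / "song" / "take_pic" behaviors ordered by "gr").
# Parameter names and values not stated in the text are assumptions.
import json

control_entry = {
    "ifs": {"ear": "singing"},  # trigger: heard speech about singing
    "trigger": [
        {"move": {"direction": "back", "distance": "10cm",
                  "velocity": "medium", "angle": 0}, "gr": 1},
        {"song": {"path": "http://bpeer.com/i.mp3"}, "gr": 2},
        {"take_pic": {"delay": "10s", "send_to": "user"}, "gr": 3},
        {"move": {"direction": "forward", "distance": "5cm",
                  "angle": 0}, "gr": 4},
    ],
}

print(json.dumps(control_entry, indent=2))
```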
  • the behavior instructions and behavior control parameters can be written in JSON, but are not limited thereto; other methods are also possible.
  • a non-limiting example may include:
  • the behavior name is: audio_speak;
  • Behavior control parameters can include: text (content to say), volume (volume of speech), etc. (eg, vocal gender, or vocal age, etc.)
  • The JSON is expressed as follows:
  • "text" may include conversion characters, each of which corresponds to a parameter.
  • for example, the "owner" conversion character can be defined as "@master".
  • a JSON representation containing the conversion characters is as follows:
  • "volume" is set as a percentage, and the robot can calculate its specific volume parameter based on the percentage value of "volume".
  • "volume" can also be represented directly as a specific parameter of the robot.
  • the behavior name is: audio_sound_music
  • Behavior control parameters may include: path (path to play music, or file name, etc.), volume (volume of playing music), etc.
  • The JSON is expressed as follows:
  • the behavior name is: audio_sound_info
  • Behavior control parameters include: name (the name of the tone to be played), volume (play volume), etc.
  • The JSON is expressed as follows:
  • the behavior name is: motion_head;
  • Behavior control parameters may include: motor (motor that performs motion), velocity (motor speed), angle (motor motion angle), etc.
  • The JSON is expressed as follows:
  • "velocity" is represented as a gear position, and the robot can calculate the specific velocity based on the gear position.
  • "velocity" can also be expressed directly as a specific parameter of the robot's head movement.
  • "angle" is expressed as the angle of the motor.
  • "angle" can also be expressed as relative data such as a percentage, for example, "angle": "50%"; the robot can then determine the specific angle according to the motor's angle range. For example, if the maximum angle is 180 degrees, the calculated specific angle is 90 degrees, but it is not limited thereto.
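The gear-position velocity and percentage angle described above can be sketched as follows. The gear-to-speed table and motor limits are invented for illustration; only the 50% → 90° resolution against a 180° maximum comes from the text.

```python
# Sketch of the motion_head behavior: a gear-position velocity and a
# percentage angle resolved to concrete motor parameters. Gear table
# and motor limits are illustrative assumptions.
behavior = {
    "motion_head": {"motor": "m1", "velocity": "gear2", "angle": "50%"}
}

GEAR_SPEEDS = {"gear1": 30, "gear2": 60, "gear3": 90}  # deg/s, assumed
MOTOR_MAX_ANGLE = {"m1": 180}                          # degrees, assumed

def resolve_motion(behavior):
    """Turn gear position and percentage angle into motor parameters."""
    p = behavior["motion_head"]
    speed = GEAR_SPEEDS[p["velocity"]]
    # 50% of a 180-degree range resolves to 90 degrees, per the text.
    angle = int(p["angle"].rstrip("%")) * MOTOR_MAX_ANGLE[p["motor"]] // 100
    return {"motor": p["motor"], "speed": speed, "angle": angle}

print(resolve_motion(behavior))
```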
  • the behavior name is: motion_neck;
  • Behavior control parameters may include: motor (motor that performs motion), velocity (motor speed), angle (motor motion angle), etc.
  • The JSON is expressed as follows:
  • the behavior name is: motion_shoulder;
  • Behavior control parameters may include: motor (motor that performs motion), velocity (motor speed), angle (motor motion angle), etc.
  • The JSON representation is as follows:
  • the behavior name is: motion_elbow;
  • Behavior control parameters may include: motor (motor that performs motion), velocity (motor speed), angle (motor motion angle), etc.
  • The JSON representation is as follows:
  • the behavior name is: motion_wrist
  • Behavior control parameters may include: motor (motor that performs motion), velocity (motor speed), angle (motor motion angle), etc.
  • The JSON representation is as follows:
  • the behavior name is: motion_waist
  • Behavior control parameters may include: motor (motor that performs motion), velocity (motor speed), angle (motor motion angle), etc.
  • The JSON representation is as follows:
  • the behavior name is: motion_eye;
  • Behavior control parameters may include: motor (motor that performs motion), velocity (motor speed), angle (motor motion angle), etc.
  • The JSON representation is as follows:
  • the behavior name is: display_emotion
  • Behavior control parameters can include: content (displayed emoticons), velocity (display speed), etc.
  • The JSON representation is as follows:
  • the behavior name is: program_photo;
  • Behavior control parameters can include: flash (whether the flash is turned on), etc.
  • The JSON representation is as follows:
  • the behavior name is: control_tv;
  • Behavior control parameters can include: state (eg open, close), etc.
  • The JSON representation is as follows:
  • the behavior name is: control_led;
  • Behavior control parameters can include: state (eg open, close), color, etc.
  • The JSON representation is as follows:
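As an illustration of the behavior entries enumerated above, the behavior-control payloads might be represented in JSON as follows (a non-normative sketch: the field layout and concrete values, such as the music path and motor name, are assumptions):

```python
import json

# Illustrative examples of behavior-control payloads; field layout and
# concrete values are assumptions for illustration only.
play_music = {
    "behavior": "audio_sound_music",
    "params": {"path": "/music/hello.mp3", "volume": "80%"},
}
turn_head = {
    "behavior": "motion_head",
    "params": {"motor": "head_pitch", "velocity": "2", "angle": "50%"},
}
led_on = {
    "behavior": "control_led",
    "params": {"state": "open", "color": "blue"},
}
print(json.dumps([play_music, turn_head, led_on], indent=2))
```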
  • Figure 1 shows a robot control system 1000 in accordance with one embodiment of the present invention.
  • the robot control system 1000 may include a robot control engine 1100, a robot 1200, and a user 1300.
  • a robot 1200, such as a cleaning robot, can be placed within the indoor space 1400.
  • the robot 1200 can establish a communication link 1510 with the routing device 1500 within the indoor space 1400 via an embedded communication device (not shown in FIG. 1).
  • the routing device 1500 establishes a communication link 1520 with the robot control engine 1100, and the robot 1200 communicates with the robot control engine 1100 via communication link 1510 and communication link 1520.
  • the robot control engine 1100 may also be deployed on the robot 1200 itself, or a robot control engine 1100 may be provided on both an Internet cloud computing platform and the robot 1200.
  • the user 1300 may be a member of the indoor space 1400 or a person associated with the indoor space 1400, and the robot 1200 may interact with the user 1300.
  • the robot 1200 can also interact with devices within the indoor space 1400 (eg, household appliances such as air conditioners, televisions, air purifiers, etc.) or can also interact with devices outside the indoor space 1400.
  • user 1300 of robot 1200 may be a plurality of members of indoor space 1400.
  • multiple users 1300 may also be grouped, such as grouping members of the indoor space 1400 into one group, and grouping users 1300 outside of the indoor space 1400 into another group.
  • the robot 1200 can perceive various information through embedded sensing devices (not shown in FIG. 1), including but not limited to environmental parameters in the indoor space 1400, voice information of the user 1300 or other personnel (including natural language and voice commands, etc.), image or video information of the user 1300 or other personnel and of items, and the like.
  • the sensing devices embedded in the robot 1200 include, but are not limited to, a microphone, a camera, an infrared sensor, an ultrasonic sensor, and the like. It should be understood that the robot 1200 can also communicate with an external sensing device to acquire information perceived by that sensing device.
  • the robot 1200 can communicate with a temperature sensor and a humidity sensor (not shown in FIG. 1) disposed in the indoor space 1400 to obtain the temperature and humidity parameters of the indoor space 1400.
  • the perceived information may be processed based on the preset sensing unit to obtain the sensing data including the value of the sensing unit.
  • the robot 1200 can transmit the generated sensory data to the robot control engine 1100 to acquire information of the control item for controlling the robot interaction behavior based on the sensory data feedback by the robot control engine 1100.
  • the information of the control entry includes, but is not limited to, the data itself of the control entry, the identification of the control entry, the interaction behavior data in the control entry, and the like.
  • the robot 1200 can store control entries, and the robot 1200 can then obtain from the robot control engine 1100, based on the perceptual data, an identification of the control entry for controlling the interactive behavior of the robot 1200.
  • the robot 1200 may acquire the control item itself or the interaction behavior data in the control item from the robot control engine 1100.
  • the robot control engine 1100 generates control entries that control the robot interaction behavior based on the perceptual data of the robot 1200, a control entry including a trigger condition composed of at least one sensing unit and an interaction behavior triggered by the trigger condition, wherein the interaction behavior can be parsed and executed by the robot 1200.
  • the trigger condition may be formed by the values of sensing units and the relationships between sensing units; the relationships between sensing units include, but are not limited to, logical relationships such as "and", "or", and "not".
  • the control entry has a unique identifier that distinguishes it from other control entries; the identifier of the control entry can be, but is not limited to, an integer. It should be understood that the identification of the control entry may also be a URL or the like.
  • the robot control engine 1100 may establish an inverted index with the sensing unit included in the trigger condition of a control entry as the primary key and the identification of the control entry as the target. Where the value of a sensing unit is a set of values, the robot control engine 1100 may establish the inverted index with every value in the set as a primary key mapped to the control entries corresponding to that value. In the established inverted index, the value of one sensing unit may correspond to one or more control entries, that is, the control entries in which the value of the sensing unit appears.
  • the inverted index can be stored in the memory in the form of an inverted list to be able to append the inverted records; or it can be stored as a file on the disk.
  • the inverted index is an index structure that associates the values of sensing units with control entries, taking the value of the sensing unit as the primary key.
  • the inverted index can be divided into two parts. The first part is an index table consisting of the values of the different sensing units, called the "dictionary"; it saves the value of each sensing unit and may include statistics about that value, for example, the number of times the value appears in control entries. The second part is, for each sensing-unit value, the collection of control entries in which that value occurs, together with other information (e.g., the priority of the control entries); it is also known as the "record table" or "record list".
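A minimal sketch of this two-part structure, using hypothetical control entries and sensing-unit values:

```python
from collections import defaultdict

# Sketch of the two-part inverted index described above: a "dictionary"
# of sensing-unit values with occurrence counts, and a "record list"
# mapping each value to the control entries containing it.
# The control entries below are hypothetical examples.
control_entries = {
    1: {"trigger": {"sound": "hello", "light": "dark"}, "priority": 5},
    2: {"trigger": {"sound": "hello"}, "priority": 1},
    3: {"trigger": {"light": "dark", "temp": "high"}, "priority": 2},
}

dictionary = defaultdict(int)    # (unit, value) -> number of occurrences
record_list = defaultdict(set)   # (unit, value) -> ids of control entries

for entry_id, entry in control_entries.items():
    for unit, value in entry["trigger"].items():
        key = (unit, value)
        dictionary[key] += 1
        record_list[key].add(entry_id)

print(sorted(record_list[("sound", "hello")]))  # [1, 2]
```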
  • the value of the sensing unit may be transformed into an integer based on a digital signature technique or a string mapping technique (e.g., MD5), and the integer obtained by transforming the value of the sensing unit is used as the primary key to establish an inverted index with the identifier of the control entry. The values of different sensing units correspond to different integers, so that the values of different sensing units can be distinguished.
  • the integer obtained by the transformation can be compressed to reduce the amount of data storage and increase the processing speed.
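A sketch of this value-to-integer transformation using MD5; the helper name is an assumption, and the 64-bit truncation stands in for the compression step mentioned above:

```python
import hashlib

def unit_value_key(unit, value):
    """Map a (sensing unit, value) pair to an integer primary key via MD5.
    Including the unit name in the digest keeps equal values of different
    sensing units distinct, as required above."""
    digest = hashlib.md5(f"{unit}={value}".encode("utf-8")).hexdigest()
    # truncate to 64 bits as a stand-in for the compression step
    return int(digest, 16) & 0xFFFFFFFFFFFFFFFF

# the same (unit, value) pair always maps to the same key
print(unit_value_key("sound", "hello") == unit_value_key("sound", "hello"))
```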
  • the control item for controlling the robot 1200 can be retrieved based on the sensory data of the robot 1200 and the inverted index.
  • the robot control engine 1100 can parse the sensory data of the robot 1200, extract the values of the sensing units included in the sensory data, and, based on the extracted values and the inverted index, retrieve control entries that match the sensory data of the robot 1200, thereby obtaining control entries that control the interaction behavior of the robot 1200 based on its sensory data.
  • the value of the sensing unit may be transformed into an integer based on a digital signature technique.
  • the robot control engine 1100 compares the integer form of the sensing-unit value in the sensory data of the robot 1200 with the integers in the inverted index to retrieve the control entries corresponding to the value of the sensing unit in the sensory data of the robot 1200.
  • the value of the sensing unit included in the sensing data of the robot 1200 may be the value of the sensing unit itself, or may be an integer obtained based on a digital signature technique.
  • the robot 1200 may first determine the value of each sensing unit, then transform the value into an integer based on the digital signature technique, and generate the perceptual data from the integers obtained by transforming the values of the sensing units.
  • after the robot control engine 1100 retrieves the control entries corresponding to the values of the sensing units in the sensing data of the robot 1200, it merges the retrieved control entries based on the logical relationships between the sensing units in the trigger conditions of the control entries.
  • the sensory data of the robot 1200 includes values of five sensing units, and the robot control engine 1100 retrieves a plurality of control items based on the values of the five sensing units.
  • the robot control engine 1100 can find the intersection of the retrieved control entries, obtaining the control entries that simultaneously satisfy the values of the five sensing units.
  • the robot control engine 1100 may retrieve a plurality of control entries that match the perceptual data of the robot 1200; in this case the robot control engine 1100 can sort the plurality of control entries retrieved for the perceptual data of the robot 1200, so as to select from them the control entry that controls the interactive behavior of the robot 1200 based on the perceptual data.
  • a control entry with a higher priority may be preferentially selected based on the prioritization of the control entries; or, based on feedback on the control entries from the plurality of users 1300, the control entry rated best by user feedback may be preferentially selected; or, the control entries may be sorted based on a control-entry execution log, which records the number of executions, execution times, etc., preferentially selecting control entries that have been executed many times or executed most recently. It should be understood that the ordering of the control entries is not limited to the above manners, and any combination of the above and other manners may be used to select a control entry for controlling the interaction behavior of the robot 1200 based on the perceptual data of the robot 1200.
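Retrieval and selection as described above might then be sketched as follows; the index contents and priorities are hypothetical, and only priority-based sorting is shown:

```python
# Sketch: look up each sensing-unit value in the record list, intersect
# the resulting control-entry sets, then sort candidates by priority.
# Index contents and priorities are hypothetical.
record_list = {
    ("sound", "hello"): {1, 2},
    ("light", "dark"): {1, 3},
}
priorities = {1: 5, 2: 1, 3: 2}

def retrieve(perceived):
    """Return control-entry ids matching all perceived (unit, value)
    pairs, highest priority first."""
    sets = [record_list.get(kv, set()) for kv in perceived]
    matched = set.intersection(*sets) if sets else set()
    return sorted(matched, key=lambda e: priorities[e], reverse=True)

print(retrieve([("sound", "hello"), ("light", "dark")]))  # [1]
print(retrieve([("sound", "hello")]))                     # [1, 2]
```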
  • the information of the determined control item may be transmitted to the robot 1200, and the robot 1200 performs the interaction behavior in the control item based on the information of the received control item.
  • the information of the control entry includes the control entry itself; or the identification of the control entry; or the interaction behavior data in the control entry.
  • the information of the control entry may also be any combination of the above information.
  • if the robot control engine 1100 does not retrieve a control entry for the sensory data of the robot 1200, the robot 1200 can perform an interactive behavior according to the sensed data in a predetermined pattern; e.g., if the sensory data includes voice information, the robot conducts a voice chat with the user 1300, and if it does not include voice information, it performs no operation. It should be understood that the robot 1200 is not limited to the above modes.
  • FIG. 2 shows a robot 1200 in accordance with one embodiment of the present invention.
  • the robot 1200 includes a memory 102, a memory controller 104, one or more processing units (CPUs) 106, a peripheral interface 108, a radio frequency (RF) circuit 114, an audio circuit 116, a speaker 118, a microphone 120, Perception subsystem 122, attitude sensor 132, camera 134, tactile sensor 136, and one or more other sensing devices 138, as well as external interface 140.
  • the robot 1200 is just one example of a robot 1200 that may have more or fewer components than illustrated, or have different component configurations.
  • the robot 1200 can include one or more CPUs 106, memory 102, one or more sensing devices (such as the sensing devices described above), and one or more modules, programs, or instruction sets stored in the memory 102 that perform a robot interaction behavior control method.
  • the various components shown in FIG. 2 can be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
  • the robot 1200 can be an electromechanical device having a biological shape (e.g., a humanoid), or can be a smart device that does not have a biological appearance but has human characteristics (e.g., language communication); the smart device can include mechanical devices as well as virtual devices implemented by software (e.g., virtual chat bots).
  • the virtual chat bot can perceive information through the device in which it is located, and the device in which it is located includes electronic devices such as handheld electronic devices, personal computers, and the like.
  • Memory 102 can include high speed random access memory and can also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid state storage devices.
  • memory 102 may also include memory remote from the one or more CPUs 106, such as network-attached memory accessed via RF circuitry 114 or external interface 140 and a communication network (not shown), wherein the communication network can be the Internet, one or more intranets, a local area network (LAN), a wide area network (WAN), a storage area network (SAN), etc., or a suitable combination thereof.
  • Memory controller 104 can control access to memory 102 by other components of robot 1200, such as CPU 106 and peripheral interface 108.
  • Peripheral interface 108 couples the input and output peripherals of the device to CPU 106 and memory 102.
  • the one or more processors 106 described above execute various software programs and/or sets of instructions stored in the memory 102 to perform various functions of the robot 1200 and process the data.
  • peripheral interface 108, CPU 106, and memory controller 104 can be implemented on a single chip, such as chip 112. In some other embodiments, they may be implemented on multiple discrete chips.
  • the RF circuit 114 receives and transmits electromagnetic waves.
  • the RF circuit 114 converts an electrical signal into an electromagnetic wave, or converts the electromagnetic wave into an electrical signal, and communicates with the communication network and other communication devices via the electromagnetic wave.
  • the RF circuit 114 may include well-known circuitry for performing these functions including, but not limited to, an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and more.
  • the RF circuit 114 can communicate with a network and other devices via wireless communication, such as the Internet (also referred to as the World Wide Web, WWW), an intranet, and/or a wireless network such as a cellular telephone network, a wireless local area network (LAN), and/or a metropolitan area network (MAN).
  • the above wireless communication may use any of a variety of communication standards, protocols, and technologies including, but not limited to, Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), Wideband Code Division Multiple Access (W-CDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Wi-MAX, protocols for e-mail, instant messaging, and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
  • Audio circuitry 116, speaker 118, and microphone 120 provide an audio interface between the user and the robot 1200.
  • Audio circuitry 116 receives audio data from peripheral interface 108, converts the audio data into electrical signals, and transmits the electrical signals to speaker 118.
  • Speaker 118 transforms the electrical signal into a human-audible sound wave.
  • Audio circuit 116 also receives electrical signals that are converted from sound waves by microphone 120.
  • the audio circuit 116 converts the electrical signals into audio data and transmits the audio data to the peripheral interface 108 for processing. Audio data may be retrieved from memory 102 and/or RF circuitry 114 by peripheral interface 108 and/or transmitted to memory 102 and/or RF circuitry 114.
  • a plurality of microphones 120 can be included, the plurality of microphones 120 being distributed at different locations, and the direction in which the sound is emitted is determined according to a predetermined strategy based on the microphones 120 at different locations. It should be understood that the direction of the sound can also be identified by some sensors.
  • audio circuit 116 also includes a headset jack (not shown).
  • the headset jack provides an interface between the audio circuit 116 and a removable audio input/output peripheral; for example, the audio input/output peripheral can be either an output-only headset or a headset with both output (for single-ear or dual-ear headphones) and input (a microphone).
  • a speech recognition device (not shown) is also included for implementing speech-to-text recognition and synthesizing speech based on text.
  • the speech recognition device can be implemented by hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
  • the audio circuit 116 receives the audio data from the peripheral interface 108, converts the audio data into electrical signals, and the voice recognition device can identify the audio data and convert the audio data into text data.
  • the speech recognition apparatus can also synthesize the audio data based on the text data, convert the audio data into an electrical signal through the audio circuit 116, and transmit the electrical signal to the speaker 118.
  • Perception subsystem 122 provides an interface between the perceptual peripherals of robot 1200, such as attitude sensor 132, camera 134, tactile sensor 136, and other sensing devices 138, and peripheral interface 108.
  • Perception subsystem 122 includes an attitude controller 124, a visual controller 126, a haptic controller 128, and one or more other perceptual device controllers 130.
  • the one or more other sensing device controllers 130 receive/transmit electrical signals from/to other sensing devices 138.
  • the other sensing devices 138 may include temperature sensors, distance sensors, proximity sensors, air pressure sensors, air quality detecting devices, and the like.
  • the robot 1200 can have a plurality of attitude controllers 124 to control different limbs of the robot 1200, which can include, but are not limited to, arms, feet, and heads. Accordingly, the robot 1200 can include a plurality of attitude sensors 132. In some embodiments, the robot 1200 may not have the attitude controller 124 and the attitude sensor 132. The robot 1200 may be in a fixed configuration and does not have mechanical moving parts such as an arm or a foot. In some embodiments, the pose of the robot 1200 may not be a mechanical arm, foot, and head, but may also employ a deformable configuration.
  • the robot 1200 also includes a power system 142 for powering various components.
  • the power system 142 can include a power management system, one or more power sources (e.g., batteries, alternating current (AC)), charging systems, power failure detection circuits, power converters or inverters, power status indicators (e.g., light emitting diodes (LEDs)), as well as any other components associated with power generation, management, and distribution in portable devices.
  • the charging system can be a wired charging system or a wireless charging system.
  • the software components include an operating system 144, a communication module (or set of instructions) 146, an interactive behavior control device (or set of instructions) 148, and one or more other devices (or sets of instructions) 150.
  • Operating system 144 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
  • Communication module 146 facilitates communication with other devices via one or more external interfaces 140 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.), and also includes various software components for processing data received by RF circuitry 114 and/or external interface 140.
  • the robot 1200 can also include a display device (not shown), which can include, but is not limited to, a touch sensitive display, a touch pad, and the like.
  • One or more of the other devices 150 described above can include a graphics module (not shown) that includes various known software components for presenting and displaying graphics on the display device.
  • Graphics include any object that can be displayed to a user, including but not limited to text, web pages, icons (e.g., user interface objects including soft keys), digital images, video, animation, and the like. Touch sensitive displays or touch pads can also be used for user input.
  • the robot 1200 senses the external environment and its own condition through, for example, the attitude sensor 132, the camera 134, the tactile sensor 136, other sensing devices 138, the microphone 120, etc.; the information perceived by the robot 1200 is handled via the sensing peripheral controllers and processed by one or more CPUs 106.
  • the environment information perceived by the robot 1200 includes, but is not limited to, information detected by its own sensors (e.g., the attitude sensor 132, the camera 134, the tactile sensor 136, and other sensing devices 138), and may also include information detected by an external device (not shown) coupled to the robot 1200; the robot 1200 establishes a communication connection with the external device, and the robot 1200 and the external device transmit data through the communication connection.
  • External devices include various types of sensors, smart home devices, and the like.
  • the information perceived by the robot 1200 includes, but is not limited to, sound, images, environmental parameters, tactile information, time, space, and the like.
  • Environmental parameters include, but are not limited to, temperature, humidity, gas concentration, etc.
  • tactile information includes, but is not limited to, contact with the robot 1200, including but not limited to contact with a touch sensitive display, and contact with or proximity to a tactile sensor; the tactile sensor can be placed at the head, arm, etc. of the robot (not shown). It should be understood that tactile information may also include other forms of information.
  • the sound may include voice and other sounds, the sound may be the sound collected by the microphone 120, or may be the sound stored in the memory 102; the voice may include, but is not limited to, human speaking or singing.
  • the image may be a single picture or video, including but not limited to captured by camera 134, or may be read from memory 102 or transmitted to robot 1200 over a network.
  • the information sensed by the robot 1200 includes not only information external to the robot 1200 but also information of the robot 1200 itself, including but not limited to information such as the amount of power, temperature, and the like of the robot 1200.
  • the robot 1200 can be moved to the charging position for automatic charging when it perceives that its own power is less than 20%.
  • the robot 1200 is not limited to perceiving information in the manner described above, but may also perceive information in other forms, including perceptual techniques that have not been developed as of the filing date of this document.
  • the sensing devices of the robot 1200 are not limited to sensing devices disposed on the robot 1200, and may also include sensing devices associated with the robot 1200 but not disposed on it, such as various sensors for sensing information.
  • the robot 1200 may be associated with a temperature sensor, a humidity sensor (not shown), or the like disposed within a certain area through which the corresponding information is perceived.
  • the robot 1200 can communicate with these sensors through various types of communication protocols to obtain information from them.
  • the information perceived by the robot 1200 may be set according to preset conditions, which may include, but are not limited to, setting which information the robot 1200 perceives, when to perceive the information, and the like. For example, during a voice conversation with the user, the robot may be set to sense the user's voice, track the user's face, recognize the user's gestures, etc., without perceiving other information, or reducing the weight of other information when generating the sensing units, or deferring the processing of other information; or, during a certain period of time (for example, when the user is out and the robot 1200 is indoors alone), to sense environmental parameters and capture image and video data, the environmental parameters being used to determine whether it is necessary to interact with the air conditioner or the like.
  • the conditions for setting the perceived information are not limited thereto; the above conditions are merely examples, and the information that the robot 1200 needs to perceive may be set according to the situation.
  • FIG. 3 shows a robot control engine 1100 in accordance with one embodiment of the present invention.
  • the robot control engine 1100 may include: a sensing data acquiring device 1110 configured to acquire sensing data generated, according to at least one sensing unit, from information perceived by the robot 1200; a control item generating device 1120 configured to generate and maintain control entries for controlling the interaction behavior of the robot 1200 based on the sensing data of the robot 1200; an inverted index generating device 1130 configured to generate an inverted index with the sensing units included in the trigger condition of each control entry as the primary keys and the identifications of the control entries as the targets; and a control item retrieval device 1140 arranged to retrieve control entries for controlling the interaction behavior of the robot 1200 based on the perceptual data of the robot 1200 and the inverted index.
  • FIG. 4 shows a robot control engine 1100 in accordance with another embodiment of the present invention.
  • the robot control engine 1100 shown in FIG. 4 may further include a sensing unit classification device 1150, which is configured to classify the sensing units based on their types to form a collection of sensing units distinguished by type.
  • the sensing units can be divided into multiple types; for example, the sensing units may be divided into auditory, visual, tactile, environmental, and other types; or the sensing units can be divided, according to the topics involved, into news, shopping, games, indoor security, environmental monitoring, and other types. It should be understood that the types of sensing units are not limited to the above classifications.
  • the inverted index generating device 1130 is further configured to form, based on the classified sensing unit sets, a plurality of inverted indexes differentiated by sensing unit type. The multiple inverted indexes can be stored in different devices, which may be physical devices or virtual devices.
  • the robot control engine 1100 as shown in FIG. 4 may further include: a control item retrieval agent device 1160 configured to analyze the sensing unit included in the sensory data of the robot 1200, and select a corresponding inverted index based on the type of the sensing unit included.
  • the control item retrieval means 1140 is further arranged to retrieve a control entry for controlling the interactive behavior of the robot 1200 based on the inverted index selected by the control item retrieval agent means 1160.
•   the robot control engine 1100 can also include a plurality of control item retrieval devices 1140 as shown in FIG. 4, each corresponding to the inverted index of at least one sensing unit type.
•   the control item retrieval agent device 1160 may record which sensing unit types each control item retrieval device 1140 handles, so that it can select the corresponding inverted index based on the types of the sensing units contained in the sensing data and have the corresponding control item retrieval device 1140 retrieve the control entries matching the sensing units in that data.
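One way to read the agent's role is as a router from sensing-unit types to per-type indexes. The type names and entry ids below are illustrative assumptions, not identifiers from the patent:

```python
# Hypothetical mapping from sensing units to their types, and one inverted
# index per type (each index: sensing unit -> control-entry ids).
UNIT_TYPES = {"ear": "auditory", "eye": "visual", "temperature": "environmental"}
INDEXES_BY_TYPE = {
    "auditory": {"ear": {"e1"}},
    "visual": {"eye": {"e1", "e3"}},
    "environmental": {"temperature": {"e4"}},
}

def route_and_retrieve(perception_data):
    """For each sensing unit in the perception data, select the inverted
    index owned by that unit's type and collect the entry ids it yields
    -- a sketch of the division of labor between the agent device and
    the per-type retrieval devices."""
    hits = set()
    for unit in perception_data:
        index = INDEXES_BY_TYPE.get(UNIT_TYPES.get(unit), {})
        hits |= index.get(unit, set())
    return hits
```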
•   the inverted index generating device 1130 may be further configured to transform the values of the sensing units contained in a control entry's trigger condition into integers using a digital signature technique (e.g., MD5), and to generate the inverted index with the resulting integers as primary keys and the identifier of the control entry as the target.
•   the control item retrieval device 1140 is further configured to convert the values of the sensing units in the sensing data of the robot 1200 into integers using the same digital signature technique, and to search the inverted index with the resulting integers to retrieve matching control entries.
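As a concrete sketch of the value-to-integer transform (assuming MD5, which the text names as one option), the same deterministic hash is applied on both the indexing side and the query side so that equal values meet at the same integer key:

```python
import hashlib

def unit_value_to_key(unit, value):
    """Hash a (sensing unit, value) pair into a fixed-width integer
    primary key with MD5. Truncating the digest to 64 bits is an
    illustrative choice, not something specified in the text."""
    digest = hashlib.md5(f"{unit}={value}".encode("utf-8")).hexdigest()
    return int(digest, 16) & 0xFFFFFFFFFFFFFFFF
```

Because the transform is deterministic, hashing trigger-condition values at index time and perception-data values at query time with the same function is what lets the integer keys line up at retrieval.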
•   the sensing data of the robot 1200 can include a plurality of sensing units. After control entries are retrieved for each of these sensing units, the robot control engine 1100 synthesizes from the retrieved entries a control entry that matches the robot's sensing data.
  • FIG. 5 illustrates a robot control engine 1100 in accordance with yet another embodiment of the present invention.
•   the robot control engine 1100 may further include a retrieval result synthesizing device 1170, configured to merge the control entries retrieved for the values of the individual sensing units in the sensing data of the robot 1200, forming a control entry that matches that sensing data.
•   the retrieval result synthesizing device 1170 is further configured to merge the retrieved control entries based on the logical relationships between the sensing units constituting their trigger conditions, forming a control entry that matches the sensing data of the robot 1200.
•   the retrieval result synthesizing device 1170 may take the intersection of the sets of control entries retrieved for the value of each sensing unit, forming one or more control entries corresponding to the sensing data of the robot 1200.
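The intersection step can be sketched directly; the entry ids are illustrative:

```python
def merge_by_intersection(hits_per_unit):
    """Given the set of control-entry ids retrieved for each sensing
    unit's value, keep only the entries retrieved for every unit --
    i.e. entries whose trigger condition is satisfied as a conjunction
    of all perceived units."""
    sets = list(hits_per_unit.values())
    if not sets:
        return set()
    return set.intersection(*sets)
```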
•   the robot control engine 1100 as shown in FIG. 5 may further include a control item sorting device 1180, configured to sort the control entries retrieved by the control item retrieval device 1140, so that a control entry for controlling the robot's interaction behavior can be selected based on the sorted result.
•   the retrieval result synthesizing device 1170 may form one or more control entries corresponding to the sensing data of the robot 1200, and the control item sorting device 1180 may sort these entries according to a preset policy, so as to select the control entry that governs the interaction behavior of the robot 1200 based on its sensing data.
  • FIG. 6 shows a robot control engine 1100 in accordance with still another embodiment of the present invention.
•   the robot control engine 1100 may further include one or any combination of the following: a user feedback acquisition device 1190, configured to acquire feedback from users 1300 on the interaction behavior of the robot 1200; a control item execution status recording device 1192, configured to record execution status information of control entries, forming an execution log; a control item priority configuration device 1194, configured to configure the priority of control entries; and a user behavior recording device 1196, configured to record user behavior, forming a user behavior log.
•   user feedback includes, but is not limited to, the user 1300's evaluation of the interaction behavior of the robot 1200, such as voice feedback given after the robot 1200 performs an interaction, physical contact between the user 1300 and the robot 1200, and feedback instructions sent by the user 1300 through a terminal (e.g., a smartphone).
  • the execution status information of the control entry includes, but is not limited to, the number of executions of the control entry, the execution time of the control entry, the execution success rate of the control entry, and the like.
•   the priority of a control entry may be set based on its source, and control entries with higher priority may be selected first.
•   the control item sorting device 1180 is further configured to sort the control entries retrieved by the control item retrieval device 1140 based on the user feedback, and/or the execution log, and/or the priorities of the control entries, and/or the user behavior log, so as to select, based on the sorted result, the control entry that controls the robot's interaction behavior.
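A minimal scoring sketch of this multi-signal sort; the weights and field names are arbitrary illustrative choices, not values from the text:

```python
def rank_entries(entry_ids, priority, success_rate, feedback_score):
    """Sort candidate control entries by a weighted blend of configured
    priority, historical execution success rate (from the execution log),
    and average user-feedback score; highest score first."""
    def score(eid):
        return (2.0 * priority.get(eid, 0)
                + 1.0 * success_rate.get(eid, 0.0)
                + 1.0 * feedback_score.get(eid, 0.0))
    return sorted(entry_ids, key=score, reverse=True)
```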
  • FIG. 7 shows a robot control engine 1100 in accordance with still another embodiment of the present invention.
•   the robot control engine 1100 can also include an Internet content capture device 1198, configured to fetch content from the Internet to form an Internet content collection.
•   the control item generating device 1120 is further configured to generate, based on the Internet content collection, the preset sensing units, and the preset interaction behaviors, control entries for controlling the interaction behavior of the robot based on its sensing data.
  • Internet content includes at least one or any combination of the following: web pages, text, sound, video, images.
•   the content can be captured by a data crawling tool (for example, a crawler), and the captured Internet content can be analyzed with data mining algorithms to obtain the Internet content collection.
•   Internet content collections can be organized in an "if this then that" form, describing the feedback appropriate under various conditions: for example, the answer to a question, the expression for a certain emotion, or a body movement.
•   the control item generating device 1120 is further configured to generate, based on the Internet content collection, the preset sensing units, and the preset interaction behaviors, control entries that control the interaction behavior of the robot based on its sensing data.
•   content may be fetched from the Internet (e.g., web pages), the captured content analyzed to obtain material for setting control entries, and the trigger conditions and the interaction behaviors they trigger set according to that content.
•   for example, a trigger condition for "being ill" can be set in terms of the sensing unit, with the interaction behavior it triggers set to "call emergency services".
•   if a "health status" sensing unit is predefined, its value can be set directly to "sick", and the trigger condition can be {if("health": "sick")}.
•   sensing units for controlling the robot's interaction behavior are defined in advance as the minimum units of control; trigger conditions, and the interaction behaviors they trigger, are then set in terms of these sensing units to obtain the control entries that control the robot.
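Putting the pieces of this example together, a control entry might be represented as data roughly as follows. The JSON-like shape and field names are an assumption for illustration, not a format defined by the patent:

```python
# Hypothetical control entry combining the "health status" example above:
# the predefined sensing unit "health" with value "sick" as the trigger
# condition, and calling emergency services as the triggered behavior.
control_entry = {
    "id": "entry-001",
    "trigger": {"if": {"health": "sick"}},
    "behavior": {"action": "call", "target": "emergency"},
}

def matches(entry, perception_data):
    """True if every (unit, value) pair in the trigger condition is
    satisfied by the robot's perception data."""
    condition = entry["trigger"]["if"]
    return all(perception_data.get(unit) == value
               for unit, value in condition.items())
```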
•   the control entries unify the input and output standards of robot control, so that non-technical personnel can also edit the robot's behavior, making it easier to control the robot's interaction behavior and effectively improving the robot's adaptive interaction capability and level of intelligence.
•   the modules or steps of the embodiments of the present invention can be implemented by general-purpose computing devices; they can be concentrated on a single computing device or distributed across multiple computing devices. Alternatively, they may be implemented as program code executable by a computing device, so that they can be stored in a storage device and executed by the computing device; in some cases the steps may be performed in an order different from that shown or described, or they may be fabricated separately as individual integrated circuit modules, or multiple modules or steps among them may be fabricated as a single integrated circuit module. Thus, embodiments of the invention are not limited to any specific combination of hardware and software.

Abstract

A robot control engine and system. The robot control engine comprises: a sensing data acquiring means (1110), configured to acquire sensing data that is generated on the basis of information sensed by a robot and in accordance with at least one preset sensing unit, wherein the sensing data comprises a value of the sensing unit; a control item generating means (1120), configured to generate and maintain a control item for controlling an interaction behavior of the robot on the basis of the sensing data of the robot, wherein the control item comprises a triggering condition consisting of at least one sensing unit, and an interaction behavior triggered by the triggering condition; an inverted index generating means (1130), configured to generate an inverted index by using the sensing unit comprised in the triggering condition in each control item as a primary key and using an identifier of the control item as a target; and a control item retrieval means (1140), configured to retrieve the control item on the basis of the sensing data of the robot and the inverted index. This solution effectively improves an adaptive interaction behavior and an intelligent level of the robot.

Description

Robot control engine and system

Technical field
The invention relates to the field of artificial intelligence technology, and in particular to a robot control engine and system.
Background
Today's robots are mostly industrial robots, and most industrial robots have no perception capability. The operating procedures of these robots are defined in advance, and they repeat the specified tasks exactly according to the predetermined procedures. They lack adaptability and produce consistent results only when the objects involved are identical.
Summary of the invention
Embodiments of the present invention provide a robot control engine and system, so as to at least effectively improve the robot's adaptive interaction capability and level of intelligence.
In some embodiments, a robot control engine includes:

a sensing data acquiring device, configured to acquire sensing data generated from information perceived by the robot according to at least one preset sensing unit, wherein the sensing data contains the values of the sensing units;

a control item generating device, configured to generate and maintain control entries that control the robot's interaction behavior based on its sensing data, wherein a control entry contains a trigger condition composed of at least one sensing unit and the interaction behavior triggered by that condition;

an inverted index generating device, configured to generate an inverted index with the sensing units contained in the trigger condition of each control entry as primary keys and the identifiers of the control entries as targets; and

a control item retrieval device, configured to retrieve control entries for controlling the robot's interaction behavior based on the robot's sensing data and the inverted index.
In some embodiments, a robot control engine includes:

a sensing data acquiring device, configured to acquire sensing data generated from information perceived by the robot according to at least one preset sensing unit, wherein the sensing data contains the values of the sensing units;

a control item generating device, configured to generate control entries that control the robot's interaction behavior based on its sensing data, wherein a control entry contains a trigger condition composed of at least one sensing unit and the interaction behavior triggered by that condition;

a sensing unit classification device, configured to classify sensing units by type, forming a collection of sensing units distinguished by type;

an inverted index generating device, configured to generate, based on the sensing unit collection, multiple inverted indexes distinguished by sensing unit type, with the sensing units contained in the trigger condition of each control entry as primary keys and the identifiers of the control entries as targets;

a control item retrieval agent device, configured to analyze the sensing units contained in the robot's sensing data and select the corresponding inverted index based on their types; and

a control item retrieval device, configured to retrieve control entries for controlling the robot's interaction behavior based on the robot's sensing data and the inverted index selected by the control item retrieval agent device.
In some embodiments, a robot control engine includes:

a sensing data acquiring device, configured to acquire sensing data generated from information perceived by the robot according to at least one preset sensing unit, wherein the sensing data contains the values of the sensing units;

a control item generating device, configured to generate control entries that control the robot's interaction behavior based on its sensing data, wherein a control entry contains a trigger condition composed of at least one sensing unit and the interaction behavior triggered by that condition;

an inverted index generating device, configured to transform the values of the sensing units contained in the trigger conditions of control entries into integers (for example, using a digital signature technique), and to generate an inverted index with the resulting integers as primary keys and the identifiers of the control entries as targets; and

a control item retrieval device, configured to transform the values of the sensing units in the robot's sensing data into integers using the digital signature technique, and to retrieve control entries for controlling the robot's interaction behavior using the resulting integers and the inverted index.
In some embodiments, a robot control engine includes:

a sensing data acquiring device, configured to acquire sensing data generated from information perceived by the robot according to at least one preset sensing unit, wherein the sensing data contains the values of the sensing units;

a control item generating device, configured to generate control entries that control the robot's interaction behavior based on its sensing data, wherein a control entry contains a trigger condition composed of at least one sensing unit and the interaction behavior triggered by that condition;

an inverted index generating device, configured to generate an inverted index with the sensing units contained in the trigger condition of each control entry as primary keys and the identifiers of the control entries as targets;

a control item retrieval device, configured to retrieve control entries for controlling the robot's interaction behavior based on the values of the sensing units in the robot's sensing data and the inverted index; and

a retrieval result synthesizing device, configured to merge the control entries retrieved for the values of the individual sensing units in the robot's sensing data, forming a control entry that matches the robot's sensing data.
In some embodiments, a robot control engine includes:

a sensing data acquiring device, configured to acquire sensing data generated from information perceived by the robot according to at least one preset sensing unit, wherein the sensing data contains the values of the sensing units;

a control item generating device, configured to generate control entries that control the robot's interaction behavior based on its sensing data, wherein a control entry contains a trigger condition composed of at least one sensing unit and the interaction behavior triggered by that condition;

an inverted index generating device, configured to generate an inverted index with the sensing units contained in the trigger condition of each control entry as primary keys and the identifiers of the control entries as targets;

a control item retrieval device, configured to retrieve control entries for controlling the robot's interaction behavior based on the values of the sensing units in the robot's sensing data and the inverted index; and

a retrieval result synthesizing device, configured to merge, based on the logical relationships between the sensing units constituting the trigger conditions of the retrieved control entries, the control entries retrieved for the values of the individual sensing units in the robot's sensing data, forming a control entry that matches the robot's sensing data.
In some embodiments, a robot control engine includes:

a sensing data acquiring device, configured to acquire sensing data generated from information perceived by the robot according to at least one preset sensing unit, wherein the sensing data contains the values of the sensing units;

a control item generating device, configured to generate control entries that control the robot's interaction behavior based on its sensing data, wherein a control entry contains a trigger condition composed of at least one sensing unit and the interaction behavior triggered by that condition;

an inverted index generating device, configured to generate an inverted index with the sensing units contained in the trigger condition of each control entry as primary keys and the identifiers of the control entries as targets;

a control item retrieval device, configured to retrieve control entries for controlling the robot's interaction behavior based on the robot's sensing data and the inverted index; and

a control item sorting device, configured to sort the control entries retrieved by the control item retrieval device, so that a control entry for controlling the robot's interaction behavior can be selected based on the sorted result.
In some embodiments, a robot control engine includes:

a sensing data acquiring device, configured to acquire sensing data generated from information perceived by the robot according to at least one preset sensing unit, wherein the sensing data contains the values of the sensing units;

a control item generating device, configured to generate control entries that control the robot's interaction behavior based on its sensing data, wherein a control entry contains a trigger condition composed of at least one sensing unit and the interaction behavior triggered by that condition;

an inverted index generating device, configured to generate an inverted index with the sensing units contained in the trigger condition of each control entry as primary keys and the identifiers of the control entries as targets;

a control item retrieval device, configured to retrieve control entries for controlling the robot's interaction behavior based on the robot's sensing data and the inverted index;

a user feedback acquisition device, configured to acquire user feedback on the robot's interaction behavior;

a control item execution status recording device, configured to record execution status information of control entries, forming an execution log;

a control item priority configuration device, configured to configure the priorities of control entries;

a user behavior recording device, configured to record user behavior, forming a user behavior log; and

a control item sorting device, configured to sort the control entries retrieved by the control item retrieval device based on the user feedback, and/or the execution log, and/or the priorities of the control entries, and/or the user behavior log, so that a control entry for controlling the robot's interaction behavior can be selected based on the sorted result.
In some embodiments, a robot control engine includes:

a sensing data acquiring device, configured to acquire sensing data generated from information perceived by the robot according to at least one preset sensing unit, wherein the sensing data contains the values of the sensing units;

an Internet content crawling device, configured to crawl content from the Internet, forming an Internet content collection;

a control item generating device, configured to generate, based on the Internet content collection, the preset sensing units, and the preset interaction behaviors, control entries that control the robot's interaction behavior based on its sensing data;

an inverted index generating device, configured to generate an inverted index with the sensing units contained in the trigger condition of each control entry as primary keys and the identifiers of the control entries as targets; and

a control item retrieval device, configured to retrieve control entries for controlling the robot's interaction behavior based on the robot's sensing data and the inverted index.
In another embodiment, software is also provided for executing the technical solutions described in the above embodiments and preferred implementations.
In another embodiment, a storage medium is also provided, in which the above software is stored; the storage medium includes, but is not limited to, an optical disk, a floppy disk, a hard disk, a rewritable memory, and the like.
Embodiments of the present invention provide a robot control engine and system. Sensing units that control the robot's interaction behavior, together with the robot's interaction behaviors, are defined in advance as the minimum units of control; trigger conditions, and the interaction behaviors they trigger, are set in terms of the sensing units and the preset interaction behaviors, yielding the control entries that control the robot. This unifies the input and output standards of robot control, so that non-technical personnel can also edit the robot's behavior, making it easier to control the robot's interaction behavior and effectively improving the robot's adaptive interaction capability and level of intelligence.
Brief description of the drawings
The drawings described here are provided for further understanding of the invention and form a part of this application; they do not limit the invention. In the drawings:
FIG. 1 is a schematic diagram of a robot control system according to an embodiment of the present invention;

FIG. 2 is a schematic diagram of a robot according to an embodiment of the present invention;

FIG. 3 is a structural block diagram of a robot control engine according to an embodiment of the present invention;

FIG. 4 is a structural block diagram of another robot control engine according to an embodiment of the present invention;

FIG. 5 is a structural block diagram of yet another robot control engine according to an embodiment of the present invention;

FIG. 6 is a structural block diagram of still another robot control engine according to an embodiment of the present invention; and

FIG. 7 is a structural block diagram of a further robot control engine according to an embodiment of the present invention.
Detailed description
To make the objects, technical solutions, and advantages of the present invention clearer, the invention is described in further detail below with reference to the embodiments and drawings. The illustrative embodiments of the present invention and their descriptions are intended to explain the invention, not to limit it.
Features described and/or illustrated for one embodiment may be used in the same or a similar way in one or more other embodiments, and/or combined with or substituted for features of other embodiments.
It should be emphasized that the words "comprise" and "comprising", when used in this specification, refer to the presence of the stated features, elements, steps, or components, but do not exclude the presence or addition of one or more other features, elements, steps, components, or combinations thereof.
About the sensing unit
At least one sensing unit is defined in advance; the value of a sensing unit depends on the information perceived by the robot. The sensing unit serves as the minimum unit (or minimum input unit) for controlling the robot, and the robot can produce interaction behavior based on at least one sensing unit. The robot's interaction behavior may be controlled by one or more sensing units: for example, when the values of one or more sensing units change, the robot may respond to those changes with an interaction behavior; or, when the values of one or more sensing units fall within a certain range or equal a certain value, the robot may respond to those sensing units with an interaction behavior. It should be understood that the control of the robot's interaction behavior by sensing units is not limited to these cases, which are given only as examples.
In some embodiments, sensing units may be organized into multiple levels, with a higher-level sensing unit containing one or more lower-level sensing units. In some embodiments, a higher-level sensing unit may contain one or more sensing units of the adjacent lower level, and sensing units of the same higher level may contain different lower-level sensing units. In time, the lower-level sensing units that compose a higher-level sensing unit include, but are not limited to, lower-level sensing units from the same time or time period, as well as historical lower-level sensing units from before that time or period. In some embodiments, a higher-level sensing unit is determined by lower-level sensing units from different times.
In some embodiments, the value of a sensing unit may be a single value or a set of values, or one or more value ranges. The value of a sensing unit may be determined from the information perceived by the robot; one sensing unit may be determined by one or more pieces of perceived information, and the same sensing unit may be determined by different perceived data. Perceived information may include information perceived in real time or historically perceived information (for example, information perceived at some moment or during some period in the past). In some cases, the value of a sensing unit is determined jointly by real-time and historically perceived information.
As an example, sensing units such as hearing (ear), vision (eye), time (timer), whether someone is at home (so_at_home), and environment (environment) can be defined. Hearing describes the speech that is heard: when the robot receives sound, it performs speech recognition on it to obtain the text of the speech, and the value of the hearing unit may be that text. In some embodiments, sound source localization can also be performed, so hearing may also include the direction of the sound, referenced to the robot's face (left, right, front, behind, and so on); in addition, emotion recognition techniques can be applied to identify the emotion carried in the speech. Vision describes the video situation: the robot can analyze images or video to determine whether a person is present or whether there is movement, so the value of the vision unit may include whether someone is present, whether there is movement, and so on; the emotions of monitored subjects (for example, of at least one user conversing with the robot) can also be identified from video, based on facial recognition and body movement. The value of whether someone is at home may be "0" or "1", where "0" means no one is at home and "1" means someone is; this can be determined in various ways, for example by judging from video monitoring whether the monitored objects include a person. Time describes temporal information; its value may be a point in time or a time range, for example 14:00 on February 1 of every year. Environment describes environmental conditions, including temperature, humidity, noise, PM2.5, the ppm of combustible gas in the air, the carbon monoxide content of the air, the oxygen content of the air, and so on; its value may be the value or range of each parameter.
In some embodiments, the values of a sensing unit may be predefined. A predefined value may be one or more specific values, or one or more value ranges. A value may be an explicit value, or may be composed of wildcards (or the like) together with explicit values, but is not limited thereto. For example, when the sensing unit is "speech", its value may be "*下雨*", matching any speech containing "下雨" (raining); or its value may be "*[下有]雨*", matching any speech containing "下雨" (raining) or "有雨" (there is rain).
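As an illustrative sketch only (this document does not prescribe a matching implementation), such wildcard values can be matched with `fnmatch`-style patterns, where `*` matches any run of characters and `[下有]` matches either character:

```python
from fnmatch import fnmatch

def value_matches(pattern: str, heard_text: str) -> bool:
    """Return True if the heard speech text matches a predefined
    sensing-unit value pattern such as "*下雨*" or "*[下有]雨*"."""
    return fnmatch(heard_text, pattern)

# "*下雨*" matches any utterance containing "下雨" (raining).
print(value_matches("*下雨*", "明天会下雨吗"))
# "*[下有]雨*" matches utterances containing "下雨" or "有雨".
print(value_matches("*[下有]雨*", "今天有雨"))
print(value_matches("*[下有]雨*", "今天天气晴朗"))
```

The helper name `value_matches` is an assumption made for this sketch; any equivalent pattern matcher would serve.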
The robot may generate sensing data from at least the sensing units and the perceived information. Sensing data may include one or more sensing units, and contains the identifier and the value of each sensing unit. For the value of each sensing unit in the sensing data, see the description of sensing units above. When generating sensing data from perceived information according to the sensing units, the robot may use a variety of analysis methods to derive the values: for example, obtaining the text of speech through speech recognition, determining through image recognition whether a portrait is present in a perceived image, or determining the attributes of a portrait through portrait (face) recognition. It should be understood that the robot is not limited to obtaining sensing unit values in the above ways; other means may also be used, including processing technologies not yet developed at the filing date of this document.
As a non-limiting example, a plurality of sensing units may be preset. It should be understood that the exemplary sensing units below do not limit the division of sensing units, the number of sensing units, or the expression of sensing units; in fact, any division of sensing units may be considered. Examples of sensing units are shown in Table 1.
Table 1. Example sensing units
[Table 1 appears as images in the original publication: Figure PCTCN2016087259-appb-000001 and Figure PCTCN2016087259-appb-000002.]
Using the sensing units defined in Table 1, an exemplary piece of sensing data is given below. It should be understood that the following example does not limit the number of elements of sensing data, the definition of sensing data elements, the format of sensing data, or the way sensing data is expressed. The sensing data for one example situation is represented in JSON as follows, but is not limited thereto; other representations are also possible.
{
  "vision_human_position": "back",
  "sensing_touch": "hand",
  "audio_speak_txt": "很高兴见到你",
  "audio_speak_language": "chinese",
  "vision_human_posture": "posture1",
  "system_date": "2016-03-16",
  "system_time": "13-00-00",
  "system_power": "80%"
}
In this example sensing data, "vision_human_position" records that the human user is behind the robot ("back"); "back" may also be represented by other characters, as long as different positions can be distinguished, and it should be understood that position may also be expressed as an angle value, for example "vision_human_position": "45°". "sensing_touch" records the human user's touch on the robot, with the touch located at the hand ("hand"); "hand" may likewise be represented by other characters, as long as different positions can be distinguished, and it should be understood that there may be multiple touch positions, in which case the value of "sensing_touch" may be an array recording multiple positions. "audio_speak_txt" records what the human user said, "很高兴见到你" ("nice to meet you"); the content may also be audio data. "audio_speak_language" records the language spoken by the human user, "chinese". "vision_human_posture" records the human user's posture "posture1"; "posture1" may also be represented by other characters, as long as different postures can be distinguished. "system_date" records the date the sensing data was generated, "2016-03-16", and "system_time" records the time it was generated, "13-00-00". "system_power" records the robot's battery level, "80%"; it should be understood that battery level may also be identified in other ways.
About control entries
Based on the predefined sensing units and the preset interaction behaviors for the robot to execute, trigger conditions and the interaction behaviors they trigger can be configured. From a trigger condition and the interaction behavior it triggers, a control entry is generated for controlling the robot's interaction behavior in response to the information the robot perceives. A control entry may have a unique identifier.
A trigger condition may be composed of one or more sensing units, and logical relationships may be configured between sensing units, including but not limited to AND, OR, and NOT. In some embodiments, a trigger condition may include the identifiers and values of the sensing units that compose it; the value of a sensing unit may be one value or a set of values, or one or a set of value ranges. A value may be an explicit value, or may be composed of wildcards (or the like) together with explicit values, but is not limited thereto. For example, when the sensing unit in a trigger condition is "speech", its value may be "*下雨*", matching any speech containing "下雨" (raining); or its value may be "*[下有]雨*", matching any speech containing "下雨" or "有雨".
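As a sketch of how such a trigger condition could be evaluated against sensing data, the following illustration represents AND/OR/NOT nodes as a small tree; the condition encoding and helper names are assumptions made for this sketch, not something this document prescribes:

```python
from fnmatch import fnmatch

def unit_matches(sensing_data: dict, unit_id: str, patterns: list) -> bool:
    """True if the sensing data contains the unit and its value matches
    any of the predefined value patterns (wildcards allowed)."""
    value = sensing_data.get(unit_id)
    return value is not None and any(fnmatch(value, p) for p in patterns)

def condition_holds(cond: dict, sensing_data: dict) -> bool:
    """Evaluate a trigger-condition tree with 'and' / 'or' / 'not' nodes;
    a leaf node maps a sensing-unit id to its candidate value patterns."""
    if "and" in cond:
        return all(condition_holds(c, sensing_data) for c in cond["and"])
    if "or" in cond:
        return any(condition_holds(c, sensing_data) for c in cond["or"])
    if "not" in cond:
        return not condition_holds(cond["not"], sensing_data)
    (unit_id, patterns), = cond.items()
    return unit_matches(sensing_data, unit_id, patterns)

# Trigger: the user mentions singing AND it is not the case that no one is home.
cond = {"and": [{"ear": ["*唱歌*"]},
                {"not": {"so_at_home": ["0"]}}]}
data = {"ear": "我们来唱歌吧", "so_at_home": "1"}
print(condition_holds(cond, data))  # True
```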
A trigger condition may trigger one or more interaction behaviors. In some embodiments, an order may be set among the interaction behaviors, so that multiple behaviors are executed in the configured order. In some embodiments, the execution order of the one or more interaction behaviors may also be configured. The execution order may include, but is not limited to, randomly executing one behavior or one group of behaviors, so as to perform one or more actions at random; or executing multiple interaction behaviors in a predetermined sequence of steps.
An interaction behavior may be configured as one or more action instructions that the robot can parse and execute, and an action instruction may also include one or more parameters. In some embodiments, the execution order of the one or more action instructions may also be configured. The execution order may include, but is not limited to, randomly executing one action instruction or one group of action instructions, so as to perform one or more actions at random; or executing multiple action instructions in a predetermined sequence of steps.
In some embodiments, the action instructions of an interaction behavior include links to other control entries, set so that those entries can be executed, and/or links to multiple parameters and/or multiple pieces of content, set so that content and/or parameters can be selected from them. Each control entry may have a unique identifier, and an action instruction may link to a control entry by referencing its identifier. The content an action instruction links to may be a group of actions, from which the robot executes actions according to other factors. For example, attributes such as the robot's personality or gender may be preconfigured and stored in memory; robots of different genders or personalities may interact differently in the same situation (or scene), and the robot may select the action to execute from a group of actions according to its configured personality, gender, or other attributes. These actions may include, but are not limited to, the robot's limb movements. The content or group of content an action instruction links to may include, but is not limited to, voice chat content, various Internet information, and so on. For example, if the action the robot performs according to a control entry is to query the weather in Beijing, the action instruction may be an address for querying the weather; the robot fetches Beijing's weather from that address, which may be a uniform resource locator (URL), a memory address, a database field, and so on.
The robot's interaction behaviors include, but are not limited to, outputting speech, adjusting posture, outputting images or video, and interacting with other devices. Outputting speech includes, but is not limited to, chatting with the user and playing music. Adjusting posture includes, but is not limited to, moving (for example, imitating human walking), limb movement (for example, swinging the arms or head), and adjusting expression. Outputting images or video includes, but is not limited to, displaying images or video on a display device; an image may be a dynamic electronic emoticon, a captured image, or an image obtained from the network. Interacting with other devices includes, but is not limited to, controlling other devices (for example, adjusting the operating parameters of an air conditioner), transmitting data to other devices, and establishing connections with other devices. It should be understood that interaction behaviors are not limited to the above; any reaction of the robot to perceived information may be regarded as an interaction behavior of the robot.
Control entries may be configured in a data exchange format, or of course in other formats. Data exchange formats include, but are not limited to, XML, JSON, and YAML. Taking JSON as an example, suppose the following needs to be implemented: when the user says "sing me a song", the robot first moves backward 10 cm at medium speed at an angle of 0 and then starts singing a song; 10 seconds after finishing the song, it takes a photo and sends it to the user, and then moves forward 5 cm at an angle of 0. A control entry in JSON format may read as follows:
{
  "id": the ID of the control entry (not empty),
  "ifs": {"ear": ["*唱歌*"]},
  "trigger": {
    "move": [{"gr": 0, "a": 0, "m": -10}, {"gr": 1, "a": 0, "m": 5}],
    "song": [{"path": "http://bpeer.com/i.mp3", "gr": 2}],
    "take_pic": [{"gr": 3, "time": 10}]
  }
}
In the above control entry, the "ifs" part is the trigger condition set from sensing units: "ear" is the identifier of the sensing unit, and "唱歌" (sing) is its value. The "trigger" part is the interaction behavior triggered by the condition, comprising three behaviors, "move", "song", and "take_pic", each with its corresponding action instructions. Among them, "song" links to "http://bpeer.com/i.mp3", from which the content to sing is obtained, and "gr" is the execution order of the actions.
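To make the role of "gr" concrete, the following sketch flattens the "trigger" section of such a control entry into a single action sequence sorted by "gr"; how each behavior is then dispatched to hardware is left abstract, and the helper name is an assumption made for this illustration:

```python
import json

entry_json = """
{
  "id": 1,
  "ifs": {"ear": ["*唱歌*"]},
  "trigger": {
    "move": [{"gr": 0, "a": 0, "m": -10}, {"gr": 1, "a": 0, "m": 5}],
    "song": [{"path": "http://bpeer.com/i.mp3", "gr": 2}],
    "take_pic": [{"gr": 3, "time": 10}]
  }
}
"""

def ordered_actions(entry: dict):
    """Flatten the trigger section into (behavior, params) pairs,
    ordered by the 'gr' execution-order field."""
    actions = [(params["gr"], behavior, params)
               for behavior, steps in entry["trigger"].items()
               for params in steps]
    return [(behavior, params)
            for _, behavior, params in sorted(actions, key=lambda t: t[0])]

entry = json.loads(entry_json)
for behavior, params in ordered_actions(entry):
    print(behavior, params)
# move {'gr': 0, 'a': 0, 'm': -10}
# move {'gr': 1, 'a': 0, 'm': 5}
# song {'path': 'http://bpeer.com/i.mp3', 'gr': 2}
# take_pic {'gr': 3, 'time': 10}
```

This yields exactly the sequence described above: back up, move forward, sing, then take a photo.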
As a non-limiting example, behavior instructions and behavior control parameters may be written in JSON, but are not limited thereto; other representations are also possible. Non-limiting examples include:
1. Making the robot speak
Behavior name: audio_speak;
Behavior control parameters may include: text (the content to say), volume (the speaking volume), and so on (for example, voice gender or voice age).
Represented in JSON as follows:
"audio_speak": {"text": "How are you", "volume": "50%"}
Without limitation, "text" may include conversion characters, where a conversion character corresponds to a parameter. For example, the conversion character for the owner may be defined as "@master". As an example, JSON containing a conversion character is as follows:
"audio_speak": {"text": "Hello, @master", "volume": "50%"}
When "audio_speak" is executed, "@master" is replaced with the owner's name.
In addition, in the above example "volume" is set as a percentage, and the robot may compute its concrete parameter from the percentage value of "volume". As another example, "volume" may also be expressed directly as a concrete parameter of the robot.
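A minimal sketch of the conversion-character substitution just described; the substitution table and helper name are assumptions made for this illustration:

```python
def expand_text(text: str, profile: dict) -> str:
    """Replace conversion characters such as '@master' with values
    from the robot's stored profile before the text is spoken."""
    for token, value in profile.items():
        text = text.replace(token, value)
    return text

speak = {"text": "Hello, @master", "volume": "50%"}
print(expand_text(speak["text"], {"@master": "张三"}))  # Hello, 张三
```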
2. Making the robot play music
Behavior name: audio_sound_music;
Behavior control parameters may include: path (the path or file name of the music to play), volume (playback volume), and so on.
Represented in JSON as follows:
"audio_sound_music": {"path": "http://bpeer.com/happy.mp3", "volume": "50%"}
3. Making the robot play a prompt tone
Behavior name: audio_sound_info;
Behavior control parameters include: name (the name of the prompt tone to play), volume (playback volume), and so on.
Represented in JSON as follows:
"audio_sound_info": {"name": "warning", "volume": "normal"}
4. Making the robot's head move
Behavior name: motion_head;
Behavior control parameters may include: motor (the motor performing the motion), velocity (motor speed), angle (motor rotation angle), and so on.
Represented in JSON as follows:
"motion_head": {"motor": "1", "velocity": "1", "angle": "45"}
In the above example, "velocity" is expressed as a gear level, from which the robot computes the concrete velocity; in fact, "velocity" may also be expressed directly as a concrete parameter of head motion.
In addition, in the above example "angle" is expressed as the motor's angle; in fact, "angle" may also be expressed as relative data such as a percentage, for example "angle": "50%", and the robot determines the concrete parameter from the angle range. For example, if the maximum angle is 180 degrees, the computed concrete angle is 90 degrees, but it is not limited thereto.
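The percentage-to-angle conversion just described can be sketched as follows (the helper name and the treatment of plain numeric strings are assumptions made for this illustration):

```python
def resolve_angle(angle: str, max_angle: float = 180.0) -> float:
    """Turn an 'angle' field into a concrete motor angle: percentages
    are scaled by the motor's range, plain numbers are used as-is."""
    if angle.endswith("%"):
        return max_angle * float(angle[:-1]) / 100.0
    return float(angle)

print(resolve_angle("50%"))  # 90.0 when the maximum angle is 180 degrees
print(resolve_angle("45"))   # 45.0
```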
5. Making the robot's neck move
Behavior name: motion_neck;
Behavior control parameters may include: motor (the motor performing the motion), velocity (motor speed), angle (motor rotation angle), and so on.
Represented in JSON as follows:
"motion_neck": {"motor": "1", "velocity": "2", "angle": "60"}
6. Making the robot's shoulder move
Behavior name: motion_shoulder;
Behavior control parameters may include: motor (the motor performing the motion), velocity (motor speed), angle (motor rotation angle), and so on.
Represented in JSON as follows:
"motion_shoulder": {"motor": "1", "velocity": "3", "angle": "60"}
7. Making the robot's elbow move
Behavior name: motion_elbow;
Behavior control parameters may include: motor (the motor performing the motion), velocity (motor speed), angle (motor rotation angle), and so on.
Represented in JSON as follows:
"motion_elbow": {"motor": "1", "velocity": "2", "angle": "50"}
8. Making the robot's wrist move
Behavior name: motion_wrist;
Behavior control parameters may include: motor (the motor performing the motion), velocity (motor speed), angle (motor rotation angle), and so on.
Represented in JSON as follows:
"motion_wrist": {"motor": "1", "velocity": "2", "angle": "50"}
9. Making the robot's waist move
Behavior name: motion_waist;
Behavior control parameters may include: motor (the motor performing the motion), velocity (motor speed), angle (motor rotation angle), and so on.
Represented in JSON as follows:
"motion_waist": {"motor": "1", "velocity": "2", "angle": "50"}
10. Making the robot's eyes move
Behavior name: motion_eye;
Behavior control parameters may include: motor (the motor performing the motion), velocity (motor speed), angle (motor rotation angle), and so on.
Represented in JSON as follows:
"motion_eye": {"motor": "1", "velocity": "2", "angle": "50"}
11. Making the robot display an expression
Behavior name: display_emotion;
Behavior control parameters may include: content (the expression content to display), velocity (display speed), and so on.
Represented in JSON as follows:
"display_emotion": {"content": "happy", "velocity": "3"}
12. Making the robot take a photo
Behavior name: program_photo;
Behavior control parameters may include: flash (whether to turn on the flash), and so on.
Represented in JSON as follows:
"program_photo": {"flash": "1"}
13. Making the robot control a television
Behavior name: control_tv;
Behavior control parameters may include: state (for example open, close), and so on.
Represented in JSON as follows:
"control_tv": {"state": "open"}
14. Making the robot control an LED light
Behavior name: control_led;
Behavior control parameters may include: state (for example open, close), color, and so on.
Represented in JSON as follows:
"control_led": {"state": "open", "color": "yellow"}
About robot control
Figure 1 shows a robot control system 1000 according to one embodiment of the present invention. As shown in Figure 1, the robot control system 1000 may include a robot control engine 1100, a robot 1200, and a user 1300.
The robot 1200, for example a cleaning robot, may be placed in an indoor space 1400. The robot 1200 may establish a communication link 1510 with a routing device 1500 in the indoor space 1400 through an embedded communication apparatus (not shown in Figure 1); the routing device 1500 establishes a communication link 1520 with the robot control engine 1100, and the robot 1200 communicates with the robot control engine 1100 through the communication links 1510 and 1520. It should be understood that although Figure 1 shows the robot control engine 1100 deployed on an Internet cloud computing platform, in some embodiments the robot control engine 1100 may instead be deployed in the robot 1200, or robot control engines 1100 may be deployed both on the Internet cloud computing platform and in the robot 1200.
The user 1300 may be a member of the indoor space 1400, or a person associated with the indoor space 1400, and the robot 1200 may interact with the user 1300. The robot 1200 may also interact with devices in the indoor space 1400 (for example, household appliances such as air conditioners, televisions, and air purifiers), or with devices outside the indoor space 1400. It should be understood that although Figure 1 shows one user 1300, the system is not limited to one user 1300; for example, the users 1300 of the robot 1200 may be multiple members of the indoor space 1400. In some embodiments, multiple users 1300 may also be grouped, for example grouping the members of the indoor space 1400 into one group and the users 1300 outside the indoor space 1400 into another.
The robot 1200 may perceive various information through embedded sensing devices (not shown in Figure 1), including but not limited to environmental parameters of the indoor space 1400, voice information of the user 1300 or other persons (including natural language, voice commands, and so on), and image or video information of the user 1300, other persons, and objects. The sensing devices embedded in the robot 1200 include, but are not limited to, microphones, cameras, infrared sensors, ultrasonic sensors, and so on. It should be understood that the robot 1200 may also communicate with sensing devices external to it to obtain the information they perceive; for example, the robot 1200 may communicate with a temperature sensor and a humidity sensor (not shown in Figure 1) installed in the indoor space 1400 to obtain the temperature and humidity parameters of the indoor space 1400.
When the robot 1200 perceives information, it may process the perceived information based on the preset sensing units to obtain sensing data containing the values of the sensing units. The robot 1200 may send the generated sensing data to the robot control engine 1100 to obtain information about the control entry, fed back by the robot control engine 1100 based on the sensing data, for controlling the robot's interaction behavior. The information about a control entry includes, but is not limited to, the data of the control entry itself, the identifier of the control entry, the interaction behavior data in the control entry, and so on. In some embodiments, the robot 1200 may store control entries, so that the robot 1200 may obtain from the robot control engine 1100, based on the sensing data, the identifier of the control entry used to control the interaction behavior of the robot 1200. When the robot 1200 has not stored the control entry indicated by that identifier, the robot 1200 may obtain the control entry itself, or the interaction behavior data in it, from the robot control engine 1100.
The robot control engine 1100 produces control entries that control the robot's interaction behavior based on the sensing data of the robot 1200. A control entry contains a trigger condition composed of at least one sensing unit, and the interaction behavior triggered by the trigger condition, where the interaction behavior can be parsed and executed by the robot 1200. A trigger condition may be composed of the values of sensing units and the relationships between sensing units; the relationships between sensing units include, but are not limited to, logical relationships such as AND, OR, and NOT. A control entry has a unique identifier distinguishing it from other control entries; the identifier may be, but is not limited to, an integer. It should be understood that the identifier of a control entry may also be a URL or the like.
The robot control engine 1100 may build an inverted index with the sensing units contained in the trigger conditions of control entries as keys and the identifiers of the control entries as targets. When the value of a sensing unit is a set of values, the robot control engine 1100 may build the inverted index with every value in the set as a key, targeting the control entry corresponding to that sensing unit value. In the resulting inverted index, one sensing unit value may correspond to one or more control entries, namely the control entries in which that value appears. The inverted index may be stored in memory in the form of postings lists so that postings records can be appended, or stored on disk in the form of files.
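An illustrative in-memory sketch of this inverted index, keyed by (sensing-unit identifier, value) pairs and targeting control-entry identifiers; the concrete data structures are assumptions made for this sketch:

```python
from collections import defaultdict

def build_inverted_index(entries: list) -> dict:
    """Map each (unit id, value) key appearing in a trigger condition
    to the set of control-entry ids whose condition contains it."""
    index = defaultdict(set)
    for entry in entries:
        for unit_id, values in entry["ifs"].items():
            for value in values:       # a unit may list a set of values
                index[(unit_id, value)].add(entry["id"])
    return index

entries = [
    {"id": 1, "ifs": {"ear": ["*唱歌*"]}},
    {"id": 2, "ifs": {"ear": ["*唱歌*"], "so_at_home": ["1"]}},
]
index = build_inverted_index(entries)
print(sorted(index[("ear", "*唱歌*")]))  # [1, 2]
```

One value maps to every entry in which it appears, which is exactly the "postings list" side of the index described above.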
The inverted index is an index structure that associates sensing unit values with control entries, with the sensing unit values as keys. The inverted index can be divided into two parts. The first part is an index table composed of the distinct sensing unit values, called the "dictionary". It stores the various sensing unit values and may also include statistics about them, for example the number of times each value appears in control entries. The second part is, for each sensing unit value, the set of control entries in which that value has appeared, together with other information (for example, the priority of the control entries); this is also called the "postings table" or "postings list".
In some embodiments, the sensing unit values may also be transformed into integers (for example, based on digital signature techniques or string mapping techniques such as MD5), and the inverted index is built with the integer obtained from the value transformation as the key and the identifier of the control entry as the target, where different sensing unit values correspond to different integers so that different values can be distinguished. In addition, the transformed integers may be compressed to reduce storage and increase processing speed.
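One way to realize this value-to-integer transformation is to take part of an MD5 digest as an integer key. This is a sketch under that assumption, not a scheme this document prescribes; in practice hash collisions between distinct values would need to be detected or tolerated:

```python
import hashlib

def value_key(unit_id: str, value: str) -> int:
    """Map a sensing-unit value to an integer key for the inverted
    index; the unit id is mixed in so that equal values of different
    units yield different keys."""
    digest = hashlib.md5(f"{unit_id}={value}".encode("utf-8")).hexdigest()
    return int(digest[:16], 16)    # use 64 bits of the digest

k1 = value_key("ear", "*唱歌*")
k2 = value_key("eye", "*唱歌*")
print(k1 != k2)  # True: the same value under different units gets distinct keys
```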
When the robot control engine 1100 obtains the sensing data of the robot 1200, it may retrieve the control entries for controlling the robot 1200 based on the sensing data and the inverted index. In some embodiments, the robot control engine 1100 may parse the sensing data of the robot 1200, extract from it the sensing unit values it contains, and retrieve the control entries matching the sensing data based on the extracted values and the inverted index, thereby obtaining the control entries that control the interaction behavior of the robot 1200 based on its sensing data. In some embodiments, when extracting the sensing unit values from the sensing data of the robot 1200, the robot control engine 1100 may transform the values into integers based on a digital signature technique. The robot control engine 1100 then compares the integers of the sensing unit values in the sensing data of the robot 1200 with the integers in the inverted index, to retrieve the control entries corresponding to those sensing unit values.
It should be understood that the values of the sensing units contained in the perception data of the robot 1200 may be the values themselves, or may be integers obtained through a digital-signature transformation. When the robot 1200 generates perception data based on the perceived information and at least one sensing unit, it may first determine the values of the sensing units, then transform those values into integers based on a digital signature technique, and generate the perception data from the integers obtained for each sensing unit.
After the robot control engine 1100 has retrieved the control entries corresponding to the values of the sensing units in the perception data of the robot 1200, it merges the retrieved control entries based on the perception data and the logical relationships between the sensing units in the trigger conditions of those entries, obtaining control entries that match the perception data of the robot 1200. For example, if the perception data of the robot 1200 contains the values of five sensing units, the robot control engine 1100 retrieves multiple control entries based on these five values. The robot control engine 1100 can then take the intersection of the retrieved control entries, obtaining those that satisfy all five values simultaneously. In addition, control entries whose trigger conditions contain the negation ("NOT") of one of the sensing-unit values can be excluded.
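The merge step above can be sketched as follows (names and data shapes are hypothetical): each perceived sensing-unit value yields a set of candidate entry identifiers, the sets are intersected, and entries whose trigger condition negates a value that was actually perceived are dropped:

```python
def merge_entries(retrieved_sets, perceived_values, negated_conditions):
    # retrieved_sets: one set of entry ids per perceived sensing-unit value.
    # negated_conditions: {entry_id: sensing-unit values that the entry's
    # trigger condition requires to be ABSENT ("NOT" terms)}.
    if not retrieved_sets:
        return set()
    candidates = set.intersection(*retrieved_sets)  # satisfy all values at once
    perceived = set(perceived_values)
    # Exclude entries whose "NOT" terms appear among the perceived values.
    return {e for e in candidates
            if not (negated_conditions.get(e, set()) & perceived)}

sets = [{"e1", "e2", "e3"}, {"e1", "e2"}]            # postings for two values
matched = merge_entries(sets, ["v1", "v2"], {"e2": {"v2"}})
```

Here "e2" survives the intersection but is excluded because its trigger condition contains NOT "v2" while "v2" was perceived, leaving "e1" as the match.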
In some embodiments, the robot control engine 1100 retrieves multiple control entries matching the perception data of the robot 1200. In this case, the robot control engine 1100 can rank the retrieved control entries in order to select, from among them, the control entry that governs the interactive behavior of the robot 1200 for this perception data. In some embodiments, ranking can be based on the priority of the control entries, preferring entries with higher priority; or it can be based on the feedback of multiple users 1300 on the execution of the control entries, preferring the entries with the best user feedback; or it can be based on control-entry execution logs, which may record the number of executions, execution times, and so on, preferring the entries executed most often or most recently. It should be understood that the ranking of control entries is not limited to the above approaches; any combination of these and other approaches can be used to select the control entry that controls the interactive behavior of the robot 1200 based on its perception data.
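One hedged reading of this ranking step (the lexicographic ordering of priority, then user feedback, then execution count is an illustrative assumption; the text allows any combination of these signals):

```python
def rank_entries(entry_ids, priority, feedback, exec_count):
    # priority / feedback / exec_count: {entry_id: number}; unknown ids score 0.
    # Sort by priority first, breaking ties by user-feedback score, then by
    # how often the entry has been executed (from the execution log).
    def score(e):
        return (priority.get(e, 0), feedback.get(e, 0.0), exec_count.get(e, 0))
    return sorted(entry_ids, key=score, reverse=True)

ranked = rank_entries(
    ["e1", "e2", "e3"],
    priority={"e1": 1, "e2": 2, "e3": 1},
    feedback={"e1": 4.5, "e3": 3.0},
    exec_count={"e1": 10, "e3": 42},
)
```

The first element of `ranked` would then be sent to the robot as the control entry to execute.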
When the robot control engine 1100 determines a control entry matching the perception data of the robot 1200, it can send information about the determined control entry to the robot 1200, and the robot 1200 executes the interactive behavior in the control entry according to the received information. In some embodiments, the information about a control entry includes the control entry itself; or it is the identifier of the control entry; or it is the interactive behavior in the control entry. The information about a control entry can also be any combination of the above.
In some embodiments, if the robot control engine 1100 does not retrieve a control entry for the perception data of the robot 1200, the robot 1200 can execute interactive behavior according to a predetermined mode based on the perception data; for example, if the perception data contains voice information, it conducts a voice chat with the user 1300, and if it contains no voice information, it performs no operation. It should be understood that the robot 1200 is not limited to the above modes.
Figure 2 shows a robot 1200 according to one embodiment of the present invention. As shown in Figure 2, the robot 1200 includes a memory 102, a memory controller 104, one or more processing units (CPUs) 106, a peripheral interface 108, radio frequency (RF) circuitry 114, audio circuitry 116, a speaker 118, a microphone 120, a perception subsystem 122, an attitude sensor 132, a camera 134, a tactile sensor 136, one or more other sensing devices 138, and an external interface 140. These components communicate over one or more communication buses or signal lines 110.
It should be understood that the robot 1200 is only one example; the robot 1200 may have more or fewer components than shown, or a different configuration of components. For example, in some embodiments, the robot 1200 may include one or more CPUs 106, a memory 102, one or more sensing devices (for example, the sensing devices described above), and one or more modules, programs, or instruction sets stored in the memory 102 for executing the robot interactive-behavior control method. The various components shown in Figure 2 can be implemented in hardware, software, or a combination of hardware and software, including one or more signal-processing and/or application-specific integrated circuits.
In some embodiments, the robot 1200 can be an electromechanical device with a biological form (for example, humanoid), or a smart device that lacks a biological form but has human characteristics (for example, verbal communication). Such a smart device can include mechanical devices, and can also include virtual devices implemented in software (for example, virtual chatbots). A virtual chatbot can perceive information through the device on which it resides, which includes electronic devices such as handheld electronic devices, personal computers, and the like.
The memory 102 can include high-speed random access memory, and can also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state storage devices. In some embodiments, the memory 102 can also include memory remote from the one or more CPUs 106, such as network-attached storage accessed via the RF circuitry 114 or the external interface 140 and a communication network (not shown), where the communication network can be the Internet, one or more intranets, a local area network (LAN), a wide area network (WAN), a storage area network (SAN), or the like, or a suitable combination thereof. The memory controller 104 can control access to the memory 102 by other components of the robot 1200, such as the CPUs 106 and the peripheral interface 108.
The peripheral interface 108 couples the input and output peripherals of the device to the CPUs 106 and the memory 102. The one or more processors 106 run various software programs and/or instruction sets stored in the memory 102 in order to perform the various functions of the robot 1200 and to process data.
In some embodiments, the peripheral interface 108, the CPUs 106, and the memory controller 104 can be implemented on a single chip, such as chip 112. In some other embodiments, they may be implemented on multiple discrete chips.
The RF circuitry 114 receives and transmits electromagnetic waves. The RF circuitry 114 converts electrical signals into electromagnetic waves, or converts electromagnetic waves into electrical signals, and communicates with communication networks and other communication devices via the electromagnetic waves. The RF circuitry 114 can include well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so on. The RF circuitry 114 can communicate via wireless communication with networks and other devices, such as the Internet (also known as the World Wide Web, WWW), intranets, and/or wireless networks such as cellular telephone networks, wireless local area networks (LANs), and/or metropolitan area networks (MANs).
The above wireless communication can use any of a variety of communication standards, protocols, and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), Wideband Code Division Multiple Access (W-CDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (for example, IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Wi-MAX, protocols for e-mail, instant messaging, and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
The audio circuitry 116, the speaker 118, and the microphone 120 provide an audio interface between the user and the robot 1200. The audio circuitry 116 receives audio data from the peripheral interface 108, converts the audio data into an electrical signal, and transmits the electrical signal to the speaker 118. The speaker converts the electrical signal into sound waves audible to humans. The audio circuitry 116 also receives electrical signals converted from sound waves by the microphone 120. The audio circuitry 116 converts these electrical signals into audio data and transmits the audio data to the peripheral interface 108 for processing. Audio data can be retrieved by the peripheral interface 108 from the memory 102 and/or the RF circuitry 114, and/or transmitted to the memory 102 and/or the RF circuitry 114.
In some embodiments, multiple microphones 120 can be included and distributed at different locations, and the direction from which a sound is emitted can be determined according to a predetermined strategy based on the microphones 120 at those different locations. It should be understood that the direction of a sound can also be identified by certain sensors.
In some embodiments, the audio circuitry 116 also includes a headset jack (not shown). The headset jack provides an interface between the audio circuitry 116 and removable audio input/output peripherals; such a peripheral can be, for example, an output-only headphone, or a headset with both output (headphones for one or both ears) and input (a microphone).
In some embodiments, a speech recognition device (not shown) is also included for performing speech-to-text recognition and for synthesizing speech from text. The speech recognition device can be implemented in hardware, software, or a combination of hardware and software, including one or more signal-processing and/or application-specific integrated circuits. The audio circuitry 116 receives audio data from the peripheral interface 108 and converts the audio data into electrical signals; the speech recognition device can recognize the audio data and convert it into text data. The speech recognition device can also synthesize audio data from text data, convert the audio data into an electrical signal through the audio circuitry 116, and transmit the electrical signal to the speaker 118.
The perception subsystem 122 provides an interface between the perception peripherals of the robot 1200 and the peripheral interface 108; the perception peripherals include, for example, the attitude sensor 132, the camera 134, the tactile sensor 136, and the other sensing devices 138. The perception subsystem 122 includes an attitude controller 124, a vision controller 126, a tactile controller 128, and one or more other sensing-device controllers 130. The one or more other sensing-device controllers 130 receive/transmit electrical signals from/to the other sensing devices 138. The other sensing devices 138 can include temperature sensors, distance sensors, proximity sensors, air pressure sensors, air quality detection devices, and the like.
In some embodiments, the robot 1200 can have multiple attitude controllers 124 to control different limbs of the robot 1200; the limbs of the robot can include, but are not limited to, arms, feet, and a head. Correspondingly, the robot 1200 can include multiple attitude sensors 132. In some implementations, the robot 1200 may have no attitude controller 124 or attitude sensor 132; the robot 1200 may have a fixed form, without mechanically movable parts such as arms or feet. In some embodiments, the pose of the robot 1200 need not be realized by mechanical arms, feet, and a head; a deformable construction can also be adopted.
The robot 1200 also includes a power system 142 for supplying power to the various components. The power system 142 can include a power management system, one or more power sources (for example, batteries or alternating current (AC)), a charging system, a power failure detection circuit, a power converter or inverter, a power status indicator (for example, a light-emitting diode (LED)), and any other components associated with the generation, management, and distribution of electrical power in portable devices. The charging system can be a wired charging system or a wireless charging system.
In some embodiments, the software components include an operating system 144, a communication module (or instruction set) 146, an interactive-behavior control device (or instruction set) 148, and one or more other devices (or instruction sets) 150.
The operating system 144 (for example, Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (for example, memory management, storage device control, power management, and so on) and for facilitating communication between the various hardware and software components.
The communication module 146 facilitates communication with other devices via one or more external interfaces 140, and also includes various software components for processing data received by the RF circuitry 114 and/or the external interface 140. The external interface 140 (for example, Universal Serial Bus (USB), FIREWIRE, and so on) is suitable for coupling to other devices either directly or indirectly via a network (for example, the Internet, a wireless LAN, and so on).
In some embodiments, the robot 1200 can also include a display device (not shown), which can include, but is not limited to, a touch-sensitive display, a touch pad, and the like. The one or more other devices 150 described above can include a graphics module (not shown), which includes various known software components for rendering and displaying graphics on the display device. Note that the term "graphics" includes any object that can be displayed to a user, including but not limited to text, web pages, icons (for example, user-interface objects including soft keys), digital images, video, animation, and so on. A touch-sensitive display or touch pad can also be used for user input.
The robot 1200 perceives its external environment and its own condition through perception peripherals such as the attitude sensor 132, the camera 134, the tactile sensor 136, the other sensing devices 138, and the microphone 120. The information perceived by the robot 1200 is processed by the controller corresponding to each perception peripheral and then handed to the one or more CPUs 106 for processing. The robot 1200's perception of the environment includes, but is not limited to, information detected by its own sensors (for example, the attitude sensor 132, the camera 134, the tactile sensor 136, and the other sensing devices 138); it can also include information detected by external devices (not shown) connected to the robot 1200, where a communication connection is established between the robot 1200 and the external device and the two transmit data through that connection. External devices include various types of sensors, smart home devices, and the like.
In some embodiments, the information perceived by the robot 1200 includes, but is not limited to, sound, images, environmental parameters, tactile information, time, space, and the like. Environmental parameters include, but are not limited to, temperature, humidity, gas concentration, and so on. Tactile information includes, but is not limited to, contact with the robot 1200, such as contact with a touch-sensitive display or contact with or proximity to a tactile sensor; tactile sensors can be placed on the head, arms, and other parts of the robot (not shown). It should be noted that other forms of information are also included. Sound can include speech and other sounds; a sound can be one collected by the microphone 120 or one stored in the memory 102; speech can include, but is not limited to, human speaking or singing. An image can be a single picture or a video; pictures and videos include, but are not limited to, those captured by the camera 134, and can also be read from the memory 102 or transmitted to the robot 1200 over a network.
The information perceived by the robot 1200 includes not only information external to the robot 1200 but also information about the robot 1200 itself, including but not limited to its battery level, temperature, and the like. For example, when the robot 1200 perceives that its battery level is below 20%, it can be made to move to a charging location and charge automatically.
It should be understood that the robot 1200 is not limited to perceiving information in the ways described above; it can also perceive information in other forms, including through sensing technologies not yet developed as of the filing date of this document. Furthermore, the sensing devices of the robot 1200 are not limited to those installed on the robot 1200; they can also include sensing devices associated with the robot 1200 but not installed on it, for example, various sensors for perceiving information. As an example, the robot 1200 can be associated with a temperature sensor, a humidity sensor (not shown), and the like deployed within a certain area, and perceive the corresponding information through these sensors. The robot 1200 can communicate with these sensors through multiple types of communication protocols in order to obtain information from them.
In some embodiments, the information perceived by the robot 1200 can be configured according to preset conditions, which can include, but are not limited to, which information the robot 1200 perceives, at what times it perceives information, and so on. For example, the robot can be configured so that, during a voice conversation with a user, it perceives the user's voice, tracks the user's face, and recognizes the user's gestures, while not perceiving other information, reducing the weight of other information when generating sensing units, or post-processing the other perceived information. Alternatively, during a certain time period (for example, while the user is out and the robot 1200 is alone indoors), the robot can perceive environmental parameters and image and video data, use the environmental parameters to decide whether it needs to interact with devices such as an air conditioner, and use the image and video data to determine whether a stranger has entered the room. It should be understood that the conditions for configuring perceived information are not limited to these; the above conditions are merely examples, and the information the robot 1200 needs to perceive can be configured as the situation requires.
Figure 3 shows a robot control engine 1100 according to one embodiment of the present invention. As shown in Figure 3, the robot control engine 1100 can include: a perception data acquisition device 1110, configured to acquire perception data generated according to at least one sensing unit from information perceived by the robot 1200; a control entry generation device 1120, configured to generate and maintain control entries that control the interactive behavior of the robot 1200 based on its perception data; an inverted index generation device 1130, configured to generate an inverted index with the sensing units contained in the trigger conditions of the control entries as primary keys and the identifiers of the control entries as targets; and a control entry retrieval device 1140, configured to retrieve control entries for controlling the interactive behavior of the robot 1200 based on the perception data of the robot 1200 and the inverted index.
Figure 4 shows a robot control engine 1100 according to another embodiment of the present invention. Compared with the robot control engine 1100 shown in Figure 3, the robot control engine 1100 shown in Figure 4 can further include: a sensing unit classification device 1150, configured to classify sensing units by type, forming sets of sensing units distinguished by type.
In some embodiments, sensing units can be divided into multiple types; for example, sensing units can be classified as auditory, visual, tactile, environmental, and so on; or they can be divided according to the topic they concern, such as news, shopping, games, indoor security, environmental monitoring, and other types. It should be understood that the types of sensing units are not limited to the above classifications.
Based on this classification of sensing units, the inverted index generation device 1130 is further configured to form, from the sets of sensing units obtained by classification, multiple inverted indexes distinguished by sensing-unit type. The multiple inverted indexes can be stored in different devices, which can be physical devices or virtual devices. The robot control engine 1100 shown in Figure 4 can further include: a control entry retrieval agent device 1160, configured to analyze the sensing units contained in the perception data of the robot 1200 and to select the corresponding inverted index based on the types of the sensing units contained. The control entry retrieval device 1140 is further configured to retrieve control entries for controlling the interactive behavior of the robot 1200 based on the inverted index selected by the control entry retrieval agent device 1160.
In some embodiments, the robot control engine 1100 shown in Figure 4 can further include multiple control entry retrieval devices 1140, each corresponding to the inverted index of at least one sensing-unit type. The control entry retrieval agent device 1160 can store the sensing-unit types corresponding to each control entry retrieval device 1140, so that the corresponding inverted index is selected based on the types of the sensing units contained in the perception data, and the corresponding control entry retrieval device 1140 retrieves the control entries matching the sensing units in that perception data.
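The type-based routing performed by the retrieval agent can be sketched as follows (class and type names are hypothetical; the patent does not fix a data layout): the agent holds one index per sensing-unit type and dispatches each perceived (type, value) pair to the matching index.

```python
class RetrievalAgent:
    """Route lookups to per-type inverted indexes, one per sensing-unit type."""

    def __init__(self, index_by_type):
        # index_by_type: {unit_type: {sensing-unit value: set(entry_ids)}}
        self.index_by_type = index_by_type

    def retrieve(self, perceived_units):
        # perceived_units: iterable of (unit_type, value) pairs from the
        # robot's perception data; unknown types yield empty result sets.
        results = []
        for unit_type, value in perceived_units:
            index = self.index_by_type.get(unit_type, {})
            results.append(index.get(value, set()))
        return results

# Illustrative per-type indexes for an "auditory" and an "environmental" type.
agent = RetrievalAgent({
    "auditory": {"speech:hello": {"e1", "e2"}},
    "environmental": {"temp:high": {"e2"}},
})
hits = agent.retrieve([("auditory", "speech:hello"), ("environmental", "temp:high")])
```

Splitting the index by type this way keeps each posting table small and allows the per-type indexes to live on separate physical or virtual devices, as the text describes.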
In some embodiments, the inverted index generation device 1130 can be further configured to transform the values of the sensing units contained in the trigger conditions of the control entries into integers based on a digital signature technique (for example, MD5), and to generate the inverted index with the integers obtained from the transformation as primary keys and the identifiers of the control entries as targets. The control entry retrieval device 1140 is further configured to transform the values of the sensing units in the perception data of the robot 1200 into integers based on the digital signature technique, and to retrieve control entries for controlling the interactive behavior of the robot 1200 based on the resulting integers and the inverted index.
In some embodiments, the perception data of the robot 1200 can include multiple sensing units. After control entries have been retrieved based on the multiple sensing units, the robot control engine 1100 synthesizes, from the retrieved control entries, the control entries that match the perception data of the robot.
Figure 5 shows a robot control engine 1100 according to yet another embodiment of the present invention. As shown in Figure 5, compared with the robot control engine 1100 shown in Figure 4, the robot control engine 1100 can further include: a retrieval result synthesis device 1170, configured to merge the control entries retrieved based on the values of the individual sensing units in the perception data of the robot 1200, forming control entries that match the perception data of the robot 1200.
In some embodiments, the retrieval result synthesis device 1170 is further configured to merge the retrieved control entries based on the logical relationships between the sensing units that constitute the trigger conditions in those entries, forming control entries that match the perception data of the robot 1200. The retrieval result synthesis device 1170 can take the intersection of the sets of control entries retrieved for the value of each sensing unit, forming one or more control entries corresponding to the perception data of the robot 1200.
The robot control engine 1100 of FIG. 5 may further include a control entry sorting device 1180, configured to sort the control entries retrieved by the control entry retrieval device 1140 so that the control entry governing the robot's interactive behavior can be selected based on the sorting result. Since the retrieval result synthesis device 1170 may produce more than one control entry for the sensing data of the robot 1200, the control entry sorting device 1180 may sort the resulting entries according to a preset policy to select the entry used to control the interactive behavior of the robot 1200.
FIG. 6 shows a robot control engine 1100 according to still another embodiment of the present invention. As shown in FIG. 6, compared with the robot control engine 1100 shown in FIG. 5, this robot control engine 1100 may further include one or any combination of the following: a user feedback acquisition device 1190, configured to acquire feedback from users 1300 on the interactive behavior of the robot 1200; a control entry execution recording device 1192, configured to record execution information of control entries, forming an execution log; a control entry priority configuration device 1194, configured to configure the priorities of control entries; and a user behavior recording device 1196, configured to record user behavior, forming a user behavior log.
In some embodiments, user feedback includes, but is not limited to, a user 1300's evaluation of the interactive behavior of the robot 1200. The evaluation may take forms including, but not limited to, voice feedback given by the user 1300 after the robot 1200 performs an interactive behavior, physical contact between the user 1300 and the robot 1200, or feedback instructions sent by the user 1300 through a terminal (e.g., a smartphone). The execution information of a control entry includes, but is not limited to, the number of times the entry has been executed, its execution time, and its execution success rate. The priority of a control entry may be set according to the entry's source; entries with higher priority are selected first.
In the robot control engine 1100 shown in FIG. 6, the control entry sorting device 1180 is further configured to sort the control entries retrieved by the control entry retrieval device 1140 based on the user feedback, and/or the execution log, and/or the priorities of the control entries, and/or the user behavior log, so that the control entry governing the robot's interactive behavior is selected based on the sorting result.
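One possible shape for this ranking step is sketched below. The patent does not specify how feedback, execution logs, and priorities are combined, so the lexicographic ordering (priority first, then aggregated feedback, then execution count) and all field names are assumptions made for illustration.

```python
from dataclasses import dataclass

@dataclass
class EntryStats:
    """Per-entry signals the sorting device could draw on (assumed fields)."""
    priority: int = 0            # configured, e.g. by the entry's source
    feedback_score: float = 0.0  # aggregated user feedback on this entry
    executions: int = 0          # count taken from the execution log

def rank(candidates: dict[str, EntryStats]) -> list[str]:
    """Order candidate entry identifiers, best first.

    Lexicographic comparison: higher priority wins, ties broken by
    feedback score, then by how often the entry has been executed.
    """
    return sorted(
        candidates,
        key=lambda e: (candidates[e].priority,
                       candidates[e].feedback_score,
                       candidates[e].executions),
        reverse=True,
    )

best = rank({
    "e1": EntryStats(priority=1, feedback_score=0.9, executions=10),
    "e2": EntryStats(priority=2, feedback_score=0.5, executions=3),
})[0]
```

A weighted-sum score would be an equally plausible preset policy; the point is only that several logged signals feed one total order over the candidate entries.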
FIG. 7 shows a robot control engine 1100 according to still another embodiment of the present invention. As shown in FIG. 7, compared with the robot control engine 1100 shown in FIG. 3, this robot control engine 1100 may further include an internet content crawling device 1198, configured to crawl content from the internet and form an internet content collection. The control entry generation device 1120 is then further configured to generate, based on the internet content collection, the preset sensing units, and preset interactive behaviors, control entries that control the robot's interactive behavior based on its sensing data. Internet content includes at least one or any combination of the following: web pages, text, audio, video, and images.
For example, internet content can be fetched by a data crawling tool (e.g., a web crawler) and analyzed with data mining algorithms to obtain the internet content collection. The collection can be organized in "if this then that" form, describing the feedback appropriate under each condition: for instance, the answer to a given question, or the facial expression or body movement that expresses a given emotion. The control entry generation device 1120 is further configured to generate, from the internet content collection, the preset sensing units, and the preset interactive behaviors, control entries that control the robot's interactive behavior based on its sensing data.
In some embodiments, content (e.g., web pages) can be crawled from the internet and analyzed to obtain material for building control entries, from which trigger conditions and the interactive behaviors they trigger are set. For example, if the crawled content indicates that an emergency number should be dialed when someone is ill, a "sick" trigger condition can be defined over the sensing units, and the interactive behavior it triggers can be set to "dial the emergency number". If a "health status" sensing unit has been predefined, its value can be set directly to "sick", giving the trigger condition {if("health": "sick")}. This process yields a control entry that, when the user 1300 is detected to be ill, performs the interactive behavior of dialing the emergency number.
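The {if("health": "sick")} example can be expressed as a small data structure plus a matcher; the dictionary layout and function names below are illustrative assumptions, not the patent's actual entry format.

```python
# A control entry pairs a trigger condition over sensing units with an action.
entry = {
    "id": "entry-emergency",
    "if": {"health": "sick"},         # trigger condition over sensing units
    "then": "call_emergency_number",  # interactive behavior to perform
}

def triggered(entry: dict, sensing_data: dict[str, str]) -> bool:
    """True when every sensing-unit value in the trigger matches the data."""
    return all(sensing_data.get(u) == v for u, v in entry["if"].items())

# Matching the entry against the robot's current sensing data:
action = entry["then"] if triggered(entry, {"health": "sick"}) else None
```

Under this reading, "editing the robot's behavior" reduces to adding or changing such entries, which is what lets non-technical users author them.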
Embodiments of the present invention achieve the following technical effects: sensing units that control the robot's interactive behavior are defined in advance as the smallest units of control, and trigger conditions together with the interactive behaviors they trigger are set in terms of these sensing units to obtain the control entries that govern the robot. This unifies the input and output standards of robot control, so that non-technical personnel can also edit the robot's behavior, making the robot's interactive behavior easier to control and effectively improving the robot's adaptive interaction capability and degree of intelligence.
Obviously, those skilled in the art will understand that the modules and steps of the embodiments of the present invention described above can be implemented by a general-purpose computing device; they can be concentrated on a single computing device or distributed over a network of multiple computing devices. Optionally, they can be implemented in program code executable by a computing device, so that they can be stored in a storage device and executed by the computing device; in some cases, the steps shown or described can be performed in an order different from that given here; alternatively, they can be fabricated as individual integrated circuit modules, or multiple modules or steps among them can be fabricated as a single integrated circuit module. Thus, embodiments of the invention are not limited to any specific combination of hardware and software.
The above description covers only preferred embodiments of the present invention and is not intended to limit the invention; for those skilled in the art, the embodiments of the present invention may have various changes and variations. Any modification, equivalent substitution, or improvement made within the spirit and principles of the present invention shall fall within the protection scope of the present invention.

Claims (12)

  1. A robot control engine, comprising:
    a sensing data acquisition device, configured to acquire sensing data generated according to at least one preset sensing unit from information sensed by a robot, wherein the sensing data contains values of sensing units;
    a control entry generation device, configured to generate and maintain control entries that control the robot's interactive behavior based on the robot's sensing data, wherein each control entry contains a trigger condition composed of at least one sensing unit and the interactive behavior triggered by the trigger condition;
    an inverted index generation device, configured to generate an inverted index with the sensing units contained in the trigger conditions of the control entries as keys and the identifiers of the control entries as targets; and
    a control entry retrieval device, configured to retrieve control entries for controlling the robot's interactive behavior based on the robot's sensing data and the inverted index.
  2. The robot control engine according to claim 1, further comprising:
    a sensing unit classification device, configured to classify sensing units by type, forming sets of sensing units distinguished by sensing unit type;
    wherein the inverted index generation device is further configured to form, based on the sensing unit sets, a plurality of inverted indexes distinguished by sensing unit type;
    wherein the robot control engine further comprises a control entry retrieval proxy device, configured to analyze the sensing units contained in the robot's sensing data and to select the corresponding inverted index based on the types of the contained sensing units;
    wherein the control entry retrieval device is further configured to retrieve control entries for controlling the robot's interactive behavior based on the inverted index selected by the control entry retrieval proxy device.
  3. The robot control engine according to claim 1 or 2, wherein the inverted index generation device is further configured to transform the values of the sensing units contained in the trigger conditions of the control entries into integers and to generate the inverted index with the resulting integers as keys and the identifiers of the control entries as targets, wherein different sensing unit values correspond to different integers;
    wherein the control entry retrieval device is further configured to transform the sensing unit values in the robot's sensing data into integers and to retrieve control entries for controlling the robot's interactive behavior based on the resulting integers and the inverted index.
  4. The robot control engine according to any one of claims 1 to 3, wherein the control entry retrieval device is further configured to retrieve control entries for controlling the robot's interactive behavior based on the value of each sensing unit in the robot's sensing data and the inverted index;
    wherein the robot control engine further comprises a retrieval result synthesis device, configured to merge the control entries retrieved for the values of the individual sensing units in the robot's sensing data, forming control entries that match the robot's sensing data.
  5. The robot control engine according to claim 4, wherein the retrieval result synthesis device is further configured to merge the retrieved control entries according to the logical relationships between the sensing units that constitute the trigger conditions of the retrieved control entries, forming control entries that match the robot's sensing data.
  6. The robot control engine according to any one of claims 1 to 5, further comprising:
    a control entry sorting device, configured to sort the control entries retrieved by the control entry retrieval device, so that the control entry governing the robot's interactive behavior is selected based on the sorting result.
  7. The robot control engine according to claim 6, further comprising:
    a user feedback acquisition device, configured to acquire user feedback on the robot's interactive behavior; and/or
    a control entry execution recording device, configured to record execution information of control entries, forming an execution log; and/or
    a control entry priority configuration device, configured to configure the priorities of control entries; and/or
    a user behavior recording device, configured to record user behavior, forming a user behavior log;
    wherein the control entry sorting device is further configured to sort the control entries retrieved by the control entry retrieval device based on the user feedback, and/or the execution log, and/or the priorities of the control entries, and/or the user behavior log, so that the control entry governing the robot's interactive behavior is selected based on the sorting result.
  8. The robot control engine according to claim 7, wherein
    the execution information includes the number of times a control entry has been executed and/or the execution time of the control entry; and/or
    the user feedback includes the user's evaluation of the robot's interactive behavior.
  9. The robot control engine according to claim 1, further comprising:
    an internet content crawling device, configured to crawl content from the internet, forming an internet content collection;
    wherein the control entry generation device is further configured to generate, based on the internet content collection, the preset sensing units, and preset interactive behaviors, control entries that control the robot's interactive behavior based on the robot's sensing data.
  10. The robot control engine according to claim 9, wherein the internet content includes at least one or any combination of the following: web pages, text, audio, video, and images.
  11. The robot control engine according to any one of claims 1 to 10, wherein the robot control engine is deployed on an internet cloud computing platform.
  12. A robot control system, comprising:
    a robot, comprising: a sensing device, configured to sense at least one item of information; and a sensing data generation device, configured to generate sensing data based on the sensed information and at least one preset sensing unit, wherein the sensing data contains values of sensing units;
    the robot control engine according to any one of claims 1 to 11;
    wherein the robot further comprises an interactive behavior execution device, configured to execute the interactive behavior in the control entry retrieved by the robot control engine based on the sensing data.
PCT/CN2016/087259 2015-06-26 2016-06-27 Robot control engine and system WO2016206644A1 (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
CN201510364661.7A CN106325228B (en) 2015-06-26 2015-06-26 Method and device for generating control data of robot
CN201510363348.1 2015-06-26
CN201510363346.2 2015-06-26
CN201510363346.2A CN106325113B (en) 2015-06-26 2015-06-26 Robot controls engine and system
CN201510364661.7 2015-06-26
CN201510363348.1A CN106325065A (en) 2015-06-26 2015-06-26 Robot interactive behavior control method, device and robot

Publications (1)

Publication Number Publication Date
WO2016206644A1 true WO2016206644A1 (en) 2016-12-29

Family

ID=57584497

Family Applications (6)

Application Number Title Priority Date Filing Date
PCT/CN2016/087260 WO2016206645A1 (en) 2015-06-26 2016-06-27 Method and apparatus for loading control data into machine device
PCT/CN2016/087262 WO2016206647A1 (en) 2015-06-26 2016-06-27 System for controlling machine apparatus to generate action
PCT/CN2016/087261 WO2016206646A1 (en) 2015-06-26 2016-06-27 Method and system for urging machine device to generate action
PCT/CN2016/087259 WO2016206644A1 (en) 2015-06-26 2016-06-27 Robot control engine and system
PCT/CN2016/087258 WO2016206643A1 (en) 2015-06-26 2016-06-27 Method and device for controlling interactive behavior of robot and robot thereof
PCT/CN2016/087257 WO2016206642A1 (en) 2015-06-26 2016-06-27 Method and apparatus for generating control data of robot

Family Applications Before (3)

Application Number Title Priority Date Filing Date
PCT/CN2016/087260 WO2016206645A1 (en) 2015-06-26 2016-06-27 Method and apparatus for loading control data into machine device
PCT/CN2016/087262 WO2016206647A1 (en) 2015-06-26 2016-06-27 System for controlling machine apparatus to generate action
PCT/CN2016/087261 WO2016206646A1 (en) 2015-06-26 2016-06-27 Method and system for urging machine device to generate action

Family Applications After (2)

Application Number Title Priority Date Filing Date
PCT/CN2016/087258 WO2016206643A1 (en) 2015-06-26 2016-06-27 Method and device for controlling interactive behavior of robot and robot thereof
PCT/CN2016/087257 WO2016206642A1 (en) 2015-06-26 2016-06-27 Method and apparatus for generating control data of robot

Country Status (1)

Country Link
WO (6) WO2016206645A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11220008B2 (en) * 2017-07-18 2022-01-11 Panasonic Intellectual Property Management Co., Ltd. Apparatus, method, non-transitory computer-readable recording medium storing program, and robot
CN108388399B (en) * 2018-01-12 2021-04-06 北京光年无限科技有限公司 Virtual idol state management method and system
JP7188950B2 (en) * 2018-09-20 2022-12-13 株式会社Screenホールディングス Data processing method and data processing program
TWI735168B (en) * 2020-02-27 2021-08-01 東元電機股份有限公司 Voice robot

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020184024A1 (en) * 2001-03-22 2002-12-05 Rorex Phillip G. Speech recognition for recognizing speaker-independent, continuous speech
WO2006093394A1 (en) * 2005-03-04 2006-09-08 Chutnoon Inc. Server, method and system for providing information search service by using web page segmented into several information blocks
CN1911606A (en) * 2005-08-10 2007-02-14 株式会社东芝 Apparatus and method for controlling behavior of robot
US20090043575A1 (en) * 2007-08-07 2009-02-12 Microsoft Corporation Quantized Feature Index Trajectory
US20110213659A1 (en) * 2010-02-26 2011-09-01 Marcus Fontoura System and Method for Automatic Matching of Contracts in an Inverted Index to Impression Opportunities Using Complex Predicates and Confidence Threshold Values
CN102448678A (en) * 2009-05-26 2012-05-09 奥尔德巴伦机器人公司 System and method for editing and controlling the behavior of a movable robot
WO2014050192A1 (en) * 2012-09-27 2014-04-03 オムロン株式会社 Device management apparatus and device search method
CN103729476A (en) * 2014-01-26 2014-04-16 王玉娇 Method and system for correlating contents according to environmental state

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001353678A (en) * 2000-06-12 2001-12-25 Sony Corp Authoring system and method and storage medium
JP4108342B2 (en) * 2001-01-30 2008-06-25 日本電気株式会社 Robot, robot control system, and program thereof
US6957215B2 (en) * 2001-12-10 2005-10-18 Hywire Ltd. Multi-dimensional associative search engine
US7600218B2 (en) * 2003-11-20 2009-10-06 Panasonic Corporation Association control apparatus, association control method and service association system
JP2005193331A (en) * 2004-01-06 2005-07-21 Sony Corp Robot device and its emotional expression method
KR101088406B1 (en) * 2008-06-27 2011-12-01 주식회사 유진로봇 Interactive learning system using robot and method of operating the same in child education
CN101618280B (en) * 2009-06-30 2011-03-23 哈尔滨工业大学 Humanoid-head robot device with human-computer interaction function and behavior control method thereof
WO2011058530A1 (en) * 2009-11-16 2011-05-19 Koninklijke Philips Electronics, N.V. Human-robot shared control for endoscopic assistant robot
FR2963132A1 (en) * 2010-07-23 2012-01-27 Aldebaran Robotics HUMANOID ROBOT HAVING A NATURAL DIALOGUE INTERFACE, METHOD OF USING AND PROGRAMMING THE SAME
CN201940040U (en) * 2010-09-27 2011-08-24 深圳市杰思谷科技有限公司 Domestic robot
KR20120047577A (en) * 2010-11-04 2012-05-14 주식회사 케이티 Apparatus and method for providing robot interaction services using interactive behavior model
WO2013052894A1 (en) * 2011-10-05 2013-04-11 Opteon Corporation Methods, apparatus, and systems for monitoring and/or controlling dynamic environments
US8965580B2 (en) * 2012-06-21 2015-02-24 Rethink Robotics, Inc. Training and operating industrial robots
CN103324100B (en) * 2013-05-02 2016-08-31 郭海锋 A kind of emotion on-vehicle machines people of information-driven
CN103399637B (en) * 2013-07-31 2015-12-23 西北师范大学 Based on the intelligent robot man-machine interaction method of kinect skeleton tracing control
CN103793536B (en) * 2014-03-03 2017-04-26 陈念生 Intelligent platform obtaining method and device
CN105511608B (en) * 2015-11-30 2018-12-25 北京光年无限科技有限公司 Exchange method and device, intelligent robot based on intelligent robot


Also Published As

Publication number Publication date
WO2016206642A1 (en) 2016-12-29
WO2016206645A1 (en) 2016-12-29
WO2016206643A1 (en) 2016-12-29
WO2016206647A1 (en) 2016-12-29
WO2016206646A1 (en) 2016-12-29


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16813760

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16813760

Country of ref document: EP

Kind code of ref document: A1