WO2016206644A1 - Robot control engine and system - Google Patents

Robot control engine and system

Info

Publication number
WO2016206644A1
WO2016206644A1 (PCT/CN2016/087259; CN2016087259W)
Authority
WO
WIPO (PCT)
Prior art keywords
robot
control
sensing unit
sensing
data
Prior art date
Application number
PCT/CN2016/087259
Other languages
English (en)
Chinese (zh)
Inventor
聂华闻
Original Assignee
北京贝虎机器人技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN201510364661.7A (published as CN106325228B)
Priority claimed from CN201510363348.1A (published as CN106325065A)
Priority claimed from CN201510363346.2A (published as CN106325113B)
Application filed by 北京贝虎机器人技术有限公司
Publication of WO2016206644A1

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00 - Programme-control systems
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 13/00 - Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G06F 13/10 - Program control for peripheral devices

Definitions

  • the invention relates to the field of artificial intelligence technology, and in particular to a robot control engine and system.
  • Today's robots are mostly industrial robots, and most industrial robots have no perception capability.
  • the operating procedures of these robots are pre-defined, and they carry out their predetermined tasks strictly according to those procedures. They lack adaptability and produce consistent results only when the objects they handle are identical.
  • the embodiment of the invention provides a robot control engine and system to at least effectively improve the adaptive interaction behavior and the degree of intelligence of the robot.
  • a robot control engine includes:
  • the sensing data acquiring device is configured to acquire the sensing data generated by the robot-perceived information according to the at least one preset sensing unit, wherein the sensing data includes the value of the sensing unit;
  • control item generating device configured to generate and maintain a control item that controls an interaction behavior of the robot based on the perceptual data of the robot, wherein the control item includes a trigger condition composed of the at least one sensing unit and an interaction behavior triggered by the trigger condition;
  • An inverted index generating device configured to generate an inverted index with the sensing unit included in the trigger condition of each control item as the primary key and the identification of the control item as the target;
  • the control item retrieval means is arranged to retrieve a control item for controlling the interaction behavior of the robot based on the perceptual data of the robot and the inverted index.
  • a robot control engine includes:
  • the sensing data acquiring device is configured to acquire the sensing data generated by the robot-perceived information according to the at least one preset sensing unit, wherein the sensing data includes the value of the sensing unit;
  • control item generating device configured to generate a control item for controlling an interaction behavior of the robot based on the perceptual data of the robot, wherein the control item includes a trigger condition composed of the at least one sensing unit and an interaction behavior triggered by the trigger condition;
  • the sensing unit classifying device is configured to classify the sensing unit based on the type of the sensing unit to form a sensing unit set differentiated by the sensing unit type;
  • an inverted index generating device configured to generate, based on the sensing unit sets, a plurality of inverted indexes differentiated by sensing unit type, each with the sensing unit included in the trigger condition of each control item as the primary key and the identification of the control item as the target;
  • a control item retrieval agent device configured to analyze the sensing units included in the sensory data of the robot, and to select a corresponding inverted index based on the types of the sensing units included in the sensory data of the robot;
  • the control item retrieval means is arranged to retrieve a control item for controlling the interaction behavior of the robot based on the perceptual data of the robot and the inverted index selected by the control item retrieval agent means.
  • a robot control engine includes:
  • the sensing data acquiring device is configured to acquire the sensing data generated by the robot-perceived information according to the at least one preset sensing unit, wherein the sensing data includes the value of the sensing unit;
  • control item generating device configured to generate a control item for controlling an interaction behavior of the robot based on the perceptual data of the robot, wherein the control item includes a trigger condition composed of the at least one sensing unit and an interaction behavior triggered by the trigger condition;
  • the inverted index generating device is configured to convert the value of the sensing unit included in the trigger condition of each control entry into an integer (for example, using a digital signature technique), and to generate an inverted index with the integer obtained by the conversion as the primary key and the identification of the control entry as the target;
  • the control item retrieval means is configured to convert the value of the sensing unit in the sensory data of the robot into an integer based on the digital signature technique, and to retrieve a control entry that controls the interaction behavior of the robot using the integer obtained from the value of the sensing unit in the sensory data together with the inverted index.
  • a robot control engine includes:
  • the sensing data acquiring device is configured to acquire the sensing data generated by the robot-perceived information according to the at least one preset sensing unit, wherein the sensing data includes the value of the sensing unit;
  • control item generating device configured to generate a control item for controlling an interaction behavior of the robot based on the perceptual data of the robot, wherein the control item includes a trigger condition composed of the at least one sensing unit and an interaction behavior triggered by the trigger condition;
  • the inverted index generating device is configured to generate an inverted index with the sensing unit included in the trigger condition in each control item as a primary key and the identification of the control item as a target;
  • Controlling an item retrieval means configured to retrieve a control item for controlling the interaction behavior of the robot based on the value of the sensing unit in the sensory data of the robot and the inverted index;
  • the retrieval result synthesizing means is configured to merge the control items retrieved based on the values of the respective sensing units in the perceptual data of the robot to form a control item that matches the perceptual data of the robot.
  • a robot control engine includes:
  • the sensing data acquiring device is configured to acquire the sensing data generated by the robot-perceived information according to the at least one preset sensing unit, wherein the sensing data includes the value of the sensing unit;
  • control item generating device configured to generate a control item for controlling an interaction behavior of the robot based on the perceptual data of the robot, wherein the control item includes a trigger condition composed of the at least one sensing unit and an interaction behavior triggered by the trigger condition;
  • the inverted index generating device is configured to generate an inverted index with the sensing unit included in the trigger condition in each control item as a primary key and the identification of the control item as a target;
  • Controlling an item retrieval means configured to retrieve a control item for controlling the interaction behavior of the robot based on the value of the sensing unit in the sensory data of the robot and the inverted index;
  • the retrieval result synthesizing means is configured to merge the control items retrieved based on the values of the respective sensing units in the perceptual data of the robot, based on the logical relationships between the sensing units constituting the trigger conditions in the retrieved control items, to form a control entry that matches the perceptual data of the robot.
  • a robot control engine includes:
  • the sensing data acquiring device is configured to acquire the sensing data generated by the robot-perceived information according to the at least one preset sensing unit, wherein the sensing data includes the value of the sensing unit;
  • control item generating device configured to generate a control item for controlling an interaction behavior of the robot based on the perceptual data of the robot, wherein the control item includes a trigger condition composed of the at least one sensing unit and an interaction behavior triggered by the trigger condition;
  • the inverted index generating device is configured to generate an inverted index with the sensing unit included in the trigger condition in each control item as a primary key and the identification of the control item as a target;
  • Controlling an item retrieval means configured to retrieve a control item for controlling the interaction behavior of the robot based on the perceptual data of the robot and the inverted index;
  • the control item sorting means is arranged to sort the control items retrieved by the control item retrieval means to select control items for controlling the robot interaction behavior based on the sorted result.
  • a robot control engine includes:
  • the sensing data acquiring device is configured to acquire the sensing data generated by the robot-perceived information according to the at least one preset sensing unit, wherein the sensing data includes the value of the sensing unit;
  • control item generating device configured to generate a control item for controlling an interaction behavior of the robot based on the perceptual data of the robot, wherein the control item includes a trigger condition composed of the at least one sensing unit and an interaction behavior triggered by the trigger condition;
  • the inverted index generating device is configured to generate an inverted index with the sensing unit included in the trigger condition in each control item as a primary key and the identification of the control item as a target;
  • Controlling the item retrieval means configured to retrieve a control item for controlling the interaction behavior of the robot based on the perceptual data of the robot and the inverted index;
  • a user feedback obtaining device configured to obtain user feedback of the user's interaction behavior with the robot
  • control item execution status recording device configured to record execution status information of the control item to form an execution log
  • a user behavior recording device configured to record user behavior to form a user behavior log
  • a control item ranking device configured to sort the control entries retrieved by the control item retrieval device based on the user feedback, and/or the execution log, and/or the priority of the control entries, and/or the user behavior log, so as to select control entries that control the interactive behavior of the robot based on the sorted result.
  • a robot control engine includes:
  • the sensing data acquiring device is configured to acquire the sensing data generated by the robot-perceived information according to the at least one preset sensing unit, wherein the sensing data includes the value of the sensing unit;
  • An Internet content crawling device configured to crawl content from the Internet to form a collection of Internet content
  • a control item generating device configured to generate, based on the set of Internet content, the preset sensing units, and the preset interaction behaviors, a control item for controlling the interaction behavior of the robot based on the perceptual data of the robot;
  • An inverted index generating device configured to generate an inverted index with the sensing unit included in the trigger condition of each control item as the primary key and the identification of the control item as the target;
  • the control item retrieval means is arranged to retrieve a control item for controlling the interaction behavior of the robot based on the perceptual data of the robot and the inverted index.
  • a storage medium is further provided for storing the above-mentioned software; the storage medium includes, but is not limited to, an optical disk, a floppy disk, a hard disk, an erasable memory, and the like.
  • the embodiments of the invention provide a robot control engine and system, which pre-define the sensing units and the interaction behaviors used to control the robot, use the sensing unit as the smallest unit for controlling the robot's interaction behavior, and, from the preset sensing units and interaction behaviors, obtain control entries for controlling the robot, each consisting of a trigger condition and the interaction behavior triggered by that condition. This unifies the input and output standards of robot control, so that non-technical personnel can also edit the robot's behavior, thereby facilitating control of the robot's interaction behavior and effectively improving the robot's adaptive interaction behavior and degree of intelligence.
  • FIG. 1 is a schematic diagram of a robot control system according to an embodiment of the present invention.
  • FIG. 2 is a schematic diagram of a robot according to an embodiment of the present invention.
  • FIG. 3 is a structural block diagram of a robot control engine according to an embodiment of the present invention.
  • FIG. 4 is a structural block diagram of another robot control engine according to an embodiment of the present invention.
  • FIG. 5 is a structural block diagram of still another robot control engine according to an embodiment of the present invention.
  • FIG. 6 is a structural block diagram of still another robot control engine according to an embodiment of the present invention.
  • FIG. 7 is a structural block diagram of another robot control engine according to an embodiment of the present invention.
  • At least one sensing unit is defined in advance, and the value of the sensing unit depends on information perceived by the robot.
  • the sensing unit acts as the smallest unit (or called the minimum input unit) that controls the robot, and the robot can make interactive behavior based on at least the sensing unit.
  • the interaction behavior of the robot may be controlled by one or more sensing units. For example, when the values of one or more sensing units change, the robot may respond to the changes by performing an interaction behavior; or, when the value of one or more sensing units falls within a certain range or equals a certain value, the robot may respond to the sensing units by performing an interaction behavior. It should be understood that the control of the robot's interaction behavior by the sensing units is not limited to the above cases, which are merely illustrative.
  • the sensing unit can include multiple levels, and the higher level sensing unit can include one or more sensing units of the lower level.
  • the higher level perceptual unit may include one or more perceptual units of the lower level adjacent thereto, and the sensing unit of the same higher level may include different lower level perceptual units.
  • the low-level sensing units that are combined into a high-level sensing unit include, but are not limited to, low-level sensing units of the same time or time period, and historical low-level sensing units of earlier times or time periods.
  • the higher level perceptual units are determined by lower level sensing units at different times.
  • the value of the sensing unit may be one or a set of values, or may be a range of one or more values.
  • the value of the sensing unit may be determined according to the information perceived by the robot.
  • One sensing unit may be determined by one or more pieces of information that is perceived, and the same sensing unit may be determined by different data that is perceived.
  • the perceived information may include real-time perceived information, or historically perceived information (such as information perceived at a certain time or some time in the past). In some cases, the value of the sensing unit is determined by the information perceived in real time and the information perceived by the history.
  • sensing units may include, for example, hearing (ear), vision (eye), time (timer), whether someone is at home (so_at_home), and environment (environment).
  • hearing describes the speech that is heard: when the robot receives sound, it performs speech recognition on the received sound to identify the text of the speech it contains, and the value of the hearing unit may be the text of the speech that was heard; in some embodiments, the sound source can also be localized.
  • hearing can also include the direction of the sound. The direction of the sound is referenced to the face of the robot, and includes left, right, front, and rear directions.
  • emotion recognition technology can also be applied to the voice information to identify the emotions it contains.
  • for vision, the robot can analyze an image or video to determine whether anyone is present or whether there is movement.
  • the value of the vision unit can include whether anyone is present, whether there is movement, and so on.
  • the monitored object can also be identified based on video surveillance; for example, the emotions of at least one user who is in conversation with the robot can be determined based on facial recognition and body activity. The value of "whether someone is at home" can be "0" or "1", where "0" means no one is at home and "1" means someone is at home; whether someone is at home can be determined in a variety of ways, such as through video surveillance to determine whether the monitored objects include people.
  • the time describes the time information, and the value may be a time point or a time range, for example, 14:00 every February 1st.
  • the environment describes the environmental conditions, including temperature, humidity, noise, PM2.5, ppm of gas in the air, carbon monoxide content in the air, oxygen content in the air, etc., and the value may be the value or range of each parameter.
  • the value of the sensing unit can be predefined.
  • the value of the predefined sensing unit may be one or more specific values, or one or more ranges of values.
  • the value of the sensing unit may be an explicit value, or may be formed by a wildcard (or the like) together with an explicit value, but is not limited thereto. For example, when the sensing unit is “speech”, the value may be “*rain*”, which matches any voice information containing “rain”; or the value may combine a wildcard with alternative words for “rain”, matching any voice message that contains either variant.
  • the bot may generate the sensing data according to at least the sensing unit and the perceived information, and the sensing data may include one or more sensing units, where the sensing data includes the value of the sensing unit and the identification of the sensing unit.
  • the robot generates the sensing data from the perceived information according to the sensing units, and can obtain the values of the sensing units from the perceived information using various analysis methods, for example, obtaining the text of speech through speech recognition, analyzing whether a perceived image contains a person through image recognition, determining the attributes of a person through portrait (face) recognition, and the like. It should be understood that the robot is not limited to obtaining the values of the sensing units in the above manners, and may also use other means, including processing techniques that have not been developed at the filing date of this document.
  • a plurality of sensing units may be preset. It should be understood that the following exemplary sensing units do not limit how the sensing units are divided, their number, or how they are expressed; in fact, any division of sensing units can be considered.
  • An example of the sensing unit is shown in Table 1.
  • An exemplary piece of perceptual data is given below. It should be understood that the following example does not limit the number of elements in perceptual data, the definition of those elements, or the format or manner in which perceptual data is expressed.
  • in one example the perceptual data is expressed in JSON as follows, but it is not limited thereto, and other formats are also possible.
  • “vision_human_position” records that the human user is behind the robot (“back”).
  • “back” can also be represented by other characters, as long as different positions can be distinguished. It should be understood that the position can also be expressed as an angle value, for example, “vision_human_position”: “45°” or the like.
  • “sensing_touch” records the human user's touch on the robot; the position of the touch is the hand (“hand”). “hand” can also be represented by other characters, as long as different positions can be distinguished. It should be understood that there can be more than one touch position, in which case the value of “sensing_touch” can be an array that records multiple locations.
  • “audio_speak_txt” records what the human user said “very happy to see you”, and the content can also be audio data.
  • “audio_speak_language” records the language “chinese” spoken by human users.
  • “vision_human_posture” records the human user's gesture “posture1”, and “posture1” can also be represented by other characters, which can distinguish different postures.
  • “system_date” records the date “2016/3/16” on which the perceptual data was generated.
  • “system_time” records the time “13-00-00” at which the perceptual data was generated.
  • “system_power” records the robot's power level “80%”; it should be understood that the power level can also be represented in other ways.
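  • A minimal sketch of such perceptual data in JSON, assembled from the fields described above (the field names follow the text; the exact structure and the values are illustrative assumptions, not a format prescribed by this description):

      {
        "vision_human_position": "back",
        "sensing_touch": "hand",
        "audio_speak_txt": "very happy to see you",
        "audio_speak_language": "chinese",
        "vision_human_posture": "posture1",
        "system_date": "2016/3/16",
        "system_time": "13-00-00",
        "system_power": "80%"
      }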
  • the trigger condition and the interaction behavior triggered by the trigger condition can be set.
  • a control item for controlling the interaction behavior of the robot in response to the information perceived by the robot is generated.
  • Control entries can have unique identifiers.
  • the triggering condition may be composed of one or more sensing units, and logical relationships may be configured between the sensing units; the logical relationships include, but are not limited to, “and”, “or”, “not”, and the like.
  • the triggering condition may include an identification and a value of the sensing unit constituting the triggering condition, and the value of the sensing unit may be one or a set of values, or one or a set of value ranges.
  • the value of the sensing unit can be an explicit value, or can be formed by a wildcard (or the like) together with an explicit value, but is not limited thereto.
  • for example, when the sensing unit in the trigger condition is “speech”,
  • the value may be “*rain*”, which matches any voice information containing “rain”; or the value may combine a wildcard with alternative words for “rain”, matching any voice message that contains either variant.
  • a trigger condition can trigger one or more interaction behaviors.
  • the order between interactions can be set to perform multiple interactions in a set order.
  • the order of execution of the one or more interactions can also be configured.
  • the execution order may include, but is not limited to, randomly performing one or a set of interaction behaviors to achieve random execution of one or more actions; or performing a plurality of interaction behaviors in a predetermined sequence of steps.
  • the interaction behavior can be configured as one or more action instructions that can be parsed by the robot for execution, and the action instructions can also include one or more parameters.
  • the order of execution of the one or more action instructions can also be configured.
  • the execution order may include, but is not limited to, randomly executing one or a set of action instructions to effect random execution of one or more actions; or executing a plurality of action instructions in a predetermined sequence of steps.
  • the action instructions of an interaction behavior may include: links to other control entries, set so that those other control entries are executed; and/or links to multiple contents and/or multiple parameters, set so that content and/or parameters can be selected from the multiple contents and/or parameters.
  • Each control entry may have a unique identification, by which an action instruction may refer to that control entry.
  • the content linked to an action instruction may be a set of actions, and the robot may perform actions from the set according to other factors. For example, attributes such as the personality or gender of the robot may be pre-configured and stored in memory; robots with different genders or personalities may exhibit different interaction behaviors in the same situation (also called a scene).
  • the robot may select an action to be performed from a group of actions according to attributes such as set personality or gender, and these actions may include, but are not limited to, the body motion of the robot.
  • the one or a group of content linked to the action instruction may include, but is not limited to, the content of the voice chat, various Internet information, and the like.
  • for example, if the action performed by the robot according to the control entry is to query the weather in Beijing, the action instruction may be a weather-query address; the robot goes to this address to obtain the weather in Beijing. The address can be a Uniform Resource Locator (URL), a memory address, a database field, and so on.
  • Robot interactions include, but are not limited to, outputting speech, adjusting gestures, outputting images or video, interacting with other devices, and the like.
  • Output speech includes, but is not limited to, chatting with a user, playing music; adjusting gestures including, but not limited to, moving (eg, mimicking human walking, etc.), limb swings (eg, arm, head swing), posture adjustment, etc.; outputting images or videos Including but not limited to displaying an image or video on a display device, the image may be a dynamic electronic expression or the like, or may be a captured image, or an image obtained from a network; interaction with other devices includes, but is not limited to, controlling other devices ( For example, adjusting the operating parameters of air-conditioning equipment, etc.), transmitting data to other equipment, Establish connections with other devices, etc.
  • the interaction behavior is not limited to the above enumerated contents, and the robot's reaction to the perceived information can be regarded as the interaction behavior of the robot.
  • Control entries can be configured in a data exchange format, although other formats can be used.
  • Data exchange formats include, but are not limited to, XML, JSON, or YAML.
  • Take JSON as an example. Suppose the following needs to be implemented: when the user says "Sing me a song", the robot first moves back 10 cm at medium speed at an angle of 0 and then starts singing a song; after singing the song, it takes a photo 10 seconds later and sends it to the user, and then moves forward 5 cm at an angle of 0.
  • the control entry for the JSON data format can be as follows:
  • the "ifs” part is a trigger condition set according to the sensing unit
  • "ear” is the identification of the sensing unit
  • “singing” is the value of the sensing unit.
  • the “trigger” part is the interaction behavior triggered by the trigger condition and includes three interaction behaviors, “move”, “song” and “take_pic”, each of which includes corresponding action instructions. Among them, “song” is linked to "http://bpeer.com/i.mp3", so the content to be sung is obtained from "http://bpeer.com/i.mp3", and “gr” is the execution order of the actions.
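  • A sketch of what such a control entry might look like in JSON, reconstructed from the description above. The names “ifs”, “ear”, “trigger”, “move”, “song”, “take_pic”, “gr” and the link "http://bpeer.com/i.mp3" come from the text; every other key, the wildcard pattern, and the values are illustrative assumptions rather than the exact format defined by this document:

      {
        "id": 1001,
        "ifs": { "ear": "*sing*song*" },
        "trigger": {
          "move": {
            "gr": [1, 4],
            "steps": [
              { "direction": "back", "distance": "10cm", "velocity": "medium", "angle": 0 },
              { "direction": "forward", "distance": "5cm", "angle": 0 }
            ]
          },
          "song": { "gr": 2, "path": "http://bpeer.com/i.mp3" },
          "take_pic": { "gr": 3, "delay": "10s", "send_to": "@master" }
        }
      }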
  • the behavior instructions and behavior control parameters can be written in JSON, but are not limited thereto; other formats are also possible.
  • Non-limiting examples of behaviors may include:
  • the behavior name is: audio_speak;
  • Behavior control parameters can include: text (content to say), volume (volume of speech), etc. (eg, vocal gender, or vocal age, etc.)
  • In JSON, this is expressed as follows:
  • text may include a conversion character that corresponds to a parameter.
  • the "owner” conversion character can be defined as "@master”.
  • JSON representation containing the conversion characters is as follows:
  • volume is set as a percentage, and the robot can calculate the specific parameters of the robot based on the percentage value of "volume”.
  • volume can also be represented as a specific parameter of the robot.
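  • A sketch of an audio_speak action instruction in JSON, combining the text, the “@master” conversion character, and the percentage volume described above; the extra “gender” parameter and the exact spellings are assumptions rather than definitions from this document:

      {
        "audio_speak": {
          "text": "Good morning, @master, it looks like rain today.",
          "volume": "60%",
          "gender": "female"
        }
      }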
  • the behavior name is: audio_sound_music
  • Behavior control parameters may include: path (path to play music, or file name, etc.), volume (volume of playing music), etc.
  • In JSON, this is expressed as follows:
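  • A corresponding sketch for audio_sound_music; the path reuses the URL quoted earlier in this document, and the values are illustrative only:

      {
        "audio_sound_music": {
          "path": "http://bpeer.com/i.mp3",
          "volume": "70%"
        }
      }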
  • the behavior name is: audio_sound_info
  • Behavior control parameters include: name (the name of the tone to be played), volume (play volume), etc.
  • In JSON, this is expressed as follows:
  • the behavior name is: motion_head;
  • Behavior control parameters may include: motor (motor that performs motion), velocity (motor speed), angle (motor motion angle), etc.
  • In JSON, this is expressed as follows:
  • velocity is represented as a gear position, and the robot can calculate a specific "velocity” based on the gear position.
  • "velocity” can also be expressed as a specific parameter of the robot's head movement.
  • angle is expressed as the angle of the motor.
  • angle can also be expressed as relative data such as a percentage, for example, “angle”: “50%”;
  • the robot can then determine the specific angle according to the motor's angle range.
  • for example, if the maximum angle is 180 degrees, the calculated specific angle is 90 degrees, but it is not limited thereto.
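  • A sketch of a motion_head action instruction, with velocity given as a gear position and angle as a percentage, as described in the notes above; the motor name and the specific values are assumptions:

      {
        "motion_head": {
          "motor": "head_pitch",
          "velocity": "2",
          "angle": "50%"
        }
      }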
  • the behavior name is: motion_neck;
  • Behavior control parameters may include: motor (motor that performs motion), velocity (motor speed), angle (motor motion angle), etc.
  • In JSON, this is expressed as follows:
  • the behavior name is: motion_shoulder;
  • Behavior control parameters may include: motor (motor that performs motion), velocity (motor speed), angle (motor motion angle), etc.
  • In JSON, this is expressed as follows:
  • the behavior name is: motion_elbow;
  • Behavior control parameters may include: motor (motor that performs motion), velocity (motor speed), angle (motor motion angle), etc.
  • In JSON, this is expressed as follows:
  • the behavior name is: motion_wrist
  • Behavior control parameters may include: motor (motor that performs motion), velocity (motor speed), angle (motor motion angle), etc.
  • In JSON, this is expressed as follows:
  • the behavior name is: motion_waist
  • Behavior control parameters may include: motor (motor that performs motion), velocity (motor speed), angle (motor motion angle), etc.
  • In JSON, this is expressed as follows:
  • the behavior name is: motion_eye;
  • Behavior control parameters may include: motor (motor that performs motion), velocity (motor speed), angle (motor motion angle), etc.
  • In JSON, this is expressed as follows:
  • the behavior name is: display_emotion
  • Behavior control parameters can include: content (displayed emoticons), velocity (display speed), etc.
  • In JSON, this is expressed as follows:
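  • A sketch of a display_emotion action instruction; the content and velocity values are illustrative assumptions:

      {
        "display_emotion": {
          "content": "smile",
          "velocity": "normal"
        }
      }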
  • the behavior name is: program_photo;
  • Behavior control parameters can include: flash (whether the flash is turned on), etc.
  • In JSON, this is expressed as follows:
  • The behavior name is: control_tv;
  • Behavior control parameters can include: state (eg open, close), etc.
  • In JSON, this is expressed as follows:
  • The behavior name is: control_led;
  • Behavior control parameters can include: state (eg open, close), color, etc.
  • In JSON, this is expressed as follows:
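  • A sketch of a control_led action instruction using the state and color parameters listed above; the values are illustrative:

      {
        "control_led": {
          "state": "open",
          "color": "blue"
        }
      }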
  • Figure 1 shows a robot control system 1000 in accordance with one embodiment of the present invention.
  • the robot control system 1000 may include a robot control engine 1100, a robot 1200, and a user 1300.
  • a robot 1200 such as a cleaning robot, can be placed within the indoor space 1400.
  • the robot 1200 can establish a communication link 1510 with the routing device 1500 within the indoor space 1400 via an embedded communication device (not shown in FIG. 1).
  • the routing device 1500 establishes a communication link 1520 with the robot control engine 1100, and the robot 1200 communicates with the robot control engine 1100 via communication link 1510 and communication link 1520.
  • the robot control engine 1100 may also be configured in the robot 1200, or a robot control engine 1100 may be provided both on an Internet cloud computing platform and in the robot 1200.
  • the user 1300 may be a member of the indoor space 1400 or a person associated with the indoor space 1400, and the robot 1200 may interact with the user 1300.
  • the robot 1200 can also interact with devices within the indoor space 1400 (eg, household appliances such as air conditioners, televisions, air purifiers, etc.) or can also interact with devices outside the indoor space 1400.
  • user 1300 of robot 1200 may be a plurality of members of indoor space 1400.
  • multiple users 1300 may also be grouped, for example grouping members of the indoor space 1400 into one group and users 1300 outside of the indoor space 1400 into another group.
  • the robot 1200 can perceive various information through embedded sensing devices (not shown in FIG. 1), including but not limited to environmental parameters of the indoor space 1400, voice information of the user 1300 or other persons (including natural language, voice commands, etc.), image or video information of the user 1300 or other persons and of items, and the like.
  • the sensing devices embedded in the robot 1200 include, but are not limited to, a microphone, a camera, an infrared sensor, an ultrasonic sensor, and the like. It should be understood that the robot 1200 can also communicate with external sensing devices to acquire the information they perceive.
  • for example, the robot 1200 can communicate with a temperature sensor and a humidity sensor (not shown in FIG. 1) disposed in the indoor space 1400 to obtain temperature and humidity parameters of the indoor space 1400.
  • the perceived information may be processed based on the preset sensing unit to obtain the sensing data including the value of the sensing unit.
  • the robot 1200 can transmit the generated sensory data to the robot control engine 1100, and acquire the information of the control entry, fed back by the robot control engine 1100 based on the sensory data, for controlling the robot's interaction behavior.
  • the information of the control entry includes, but is not limited to, the data itself of the control entry, the identification of the control entry, the interaction behavior data in the control entry, and the like.
  • the bot 1200 can store control entries, and the bot 1200 can then obtain an identification of control entries for controlling the interactive behavior of the bot 1200 from the robot control engine 1100 based on the perceptual data.
  • the robot 1200 may acquire the control item itself or the interaction behavior data in the control item from the robot control engine 1100.
  • the robot control engine 1100 generates a control entry that controls the robot's interaction behavior based on the perceptual data of the robot 1200; the control entry includes a trigger condition composed of at least one sensing unit and an interaction behavior triggered by the trigger condition, wherein the interaction behavior can be parsed and executed by the robot 1200.
  • the triggering condition may be formed by the relationship between the value of the sensing unit and the sensing unit, and the relationship between the sensing units includes, but is not limited to, logical relationships such as “and”, “or”, and “not”.
  • the control entry has a unique identifier that distinguishes it from other control entries; the identifier of the control entry can be, but is not limited to, an integer. It should be understood that the identification of the control entry may also be a URL or the like.
  • the robot control engine 1100 may establish an inverted index with the sensing unit included in the trigger condition of a control entry as the primary key and the identification of the control entry as the target. When the value of a sensing unit is a set of values, the robot control engine 1100 may establish the inverted index using all the values in the set as primary keys mapped to the control entries corresponding to those values. In the established inverted index, one value of a sensing unit may correspond to one or more control entries, that is, the control entries in which that value of the sensing unit appears.
  • the inverted index can be stored in the memory in the form of an inverted list to be able to append the inverted records; or it can be stored as a file on the disk.
  • the inverted index combines the value of the sensing unit with the control item, and takes the value of the sensing unit as the index structure of the primary key.
  • the inverted index can be divided into two parts, as sketched below. The first part is an index table consisting of the values of the different sensing units, called the "dictionary"; it holds the value of each sensing unit and may include statistics about that value, for example, the number of times the value appears in control entries. The second part is, for each sensing-unit value, the collection of control entries in which that value has occurred, together with other information (e.g., the priority of the control entries), also known as the "record table" or "record list".
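  • A rough JSON sketch of the two-part structure just described, with a dictionary of sensing-unit values and, for each value, a record list of control-entry identifiers; all names and values here are illustrative assumptions, not a storage format defined by this document:

      {
        "dictionary": [
          { "sensing_unit": "ear", "value": "*sing*song*", "occurrences": 2 },
          { "sensing_unit": "vision_human_position", "value": "back", "occurrences": 1 }
        ],
        "record_lists": {
          "ear:*sing*song*": [ { "entry_id": 1001, "priority": 5 }, { "entry_id": 1002, "priority": 3 } ],
          "vision_human_position:back": [ { "entry_id": 1001, "priority": 5 } ]
        }
      }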
  • the value of the sensing unit may be transformed into an integer based on a digital signature technique or a string mapping technique (e.g., MD5, etc.);
  • the integer obtained by transforming the value of the sensing unit is used as the primary key to establish an inverted index with the identifier of the control entry as the target.
  • the values of different sensing units correspond to different integers, so that the values of different sensing units can be distinguished.
  • the integers obtained by the transformation can be compressed to reduce the amount of data stored and increase processing speed.
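  • A toy sketch of the value-to-integer mapping described above: each sensing-unit value is hashed to an integer that serves as the primary key of the inverted index. The hash values below are made up for illustration; no particular signature function is prescribed by this document:

      {
        "signature_map": {
          "ear:*sing*song*": 184467,
          "vision_human_position:back": 90213
        },
        "inverted_index": {
          "184467": [1001, 1002],
          "90213": [1001]
        }
      }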
  • the control item for controlling the robot 1200 can be retrieved based on the sensory data of the robot 1200 and the inverted index.
  • the robot control engine 1100 can parse the sensory data of the robot 1200, extract the values of the sensing units included in the sensory data, and retrieve, based on the extracted sensing-unit values and the inverted index, control entries that match the perceptual data of the robot 1200, so as to obtain control entries that control the interaction behavior of the robot 1200 based on its perceptual data.
  • the value of the sensing unit may be transformed into an integer based on a digital signature technique.
  • the robot control engine 1100 compares the integer obtained from the value of the sensing unit in the sensory data of the robot 1200 with the integers in the inverted index, so as to retrieve the control entries corresponding to the values of the sensing units in the sensory data of the robot 1200.
  • the value of a sensing unit included in the sensing data of the robot 1200 may be the value of the sensing unit itself, or may be an integer obtained based on a digital signature technique.
  • the robot 1200 may first determine the value of the sensing unit, then transform the value into an integer based on the digital signature technique, and generate the perceptual data from the integers obtained by transforming the values of the respective sensing units.
  • after the robot control engine 1100 retrieves the control entries corresponding to the values of the sensing units in the sensing data of the robot 1200, it merges the retrieved control entries based on the logical relationships between the sensing units included in the trigger conditions of those control entries.
  • the sensory data of the robot 1200 includes values of five sensing units, and the robot control engine 1100 retrieves a plurality of control items based on the values of the five sensing units.
  • the robot control engine 1100 can then take the intersection of the retrieved control entries to obtain the control entries whose trigger conditions are satisfied by the values of all five sensing units at the same time.
  • in some cases the robot control engine 1100 retrieves a plurality of control entries that match the perceptual data of the robot 1200; the robot control engine 1100 can then sort the retrieved control entries to select, from among them, the control entry that controls the interactive behavior of the robot 1200 based on the perceptual data.
  • the control entry with a higher priority may be preferentially selected based on the prioritization of the control entries; or, based on the feedback given by a plurality of users 1300 on the control entries, the control entry evaluated as best by user feedback may be preferentially selected; or the control entries may be sorted based on their execution logs, which record the number of executions, the execution times, and so on, so that control entries executed many times or executed most recently are preferentially selected. It should be understood that the ordering of the control entries is not limited to the above manners; any combination of the above and other manners may be used to select a control entry for controlling the interaction behavior of the robot 1200 based on its perceptual data.
  • the information of the determined control item may be transmitted to the robot 1200, and the robot 1200 performs the interaction behavior in the control item based on the information of the received control item.
  • the information of the control entry includes the control entry itself; or the information of the control entry is an identification of the control entry; or, the interaction behavior in the control entry.
  • the information of the control entry may also be any combination of the above information.
  • if the robot control engine 1100 does not retrieve a control entry for the sensory data of the robot 1200, the robot 1200 can perform an interaction behavior according to the sensed data in a predetermined pattern; for example, if the sensory data includes voice information, it can hold a voice chat with the user 1300, and if no voice information is included, it can perform no operation. It should be understood that the robot 1200 is not limited to the above patterns.
  • FIG. 2 shows a robot 1200 in accordance with one embodiment of the present invention.
  • the robot 1200 includes a memory 102, a memory controller 104, one or more processing units (CPUs) 106, a peripheral interface 108, a radio frequency (RF) circuit 114, an audio circuit 116, a speaker 118, a microphone 120, Perception subsystem 122, attitude sensor 132, camera 134, tactile sensor 136, and one or more other sensing devices 138, as well as external interface 140.
  • FIG. 2 shows just one example of the robot 1200, which may have more or fewer components than illustrated, or a different component configuration.
  • the bot 1200 can include one or more CPUs 106, memory 102, one or more sensing devices (such as the sensing devices described above), and one or more modules, programs, or instruction sets stored in the memory 102 that perform a robot interaction behavior control method.
  • the various components shown in FIG. 2 can be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
  • the robot 1200 can be an electromechanical device having a biological shape (eg, a humanoid, etc.), or can be a smart device that does not have a biological appearance but has human characteristics (eg, language communication, etc.), the smart device can Mechanical devices are also included, as well as virtual devices implemented by software (eg, virtual chat bots, etc.).
  • the virtual chat bot can perceive information through the device in which it is located, and the device in which it is located includes electronic devices such as handheld electronic devices, personal computers, and the like.
  • Memory 102 can include high speed random access memory and can also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid state storage devices.
  • memory 102 may also include memory remote from the one or more CPUs 106, such as network-attached storage accessed via RF circuitry 114 or external interface 140 and a communication network (not shown), wherein the communication network can be the Internet, one or more intranets, a local area network (LAN), a wide area network (WAN), a storage area network (SAN), etc., or a suitable combination thereof.
  • Memory controller 104 can control access to memory 102 by other components of robot 1200, such as CPU 106 and peripheral interface 108.
  • Peripheral interface 108 couples the input and output peripherals of the device to CPU 106 and memory 102.
  • the one or more processors 106 described above execute various software programs and/or sets of instructions stored in the memory 102 to perform various functions of the robot 1200 and process the data.
  • peripheral interface 108, CPU 106, and memory controller 104 can be implemented on a single chip, such as chip 112. In some other embodiments, they may be implemented on multiple discrete chips.
  • the RF circuit 114 receives and transmits electromagnetic waves.
  • the RF circuit 114 converts an electrical signal into an electromagnetic wave, or converts the electromagnetic wave into an electrical signal, and communicates with the communication network and other communication devices via the electromagnetic wave.
  • the RF circuit 114 may include well-known circuitry for performing these functions, including, but not limited to, an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a Subscriber Identity Module (SIM) card, memory, and so on.
  • the RF circuit 114 can communicate with a network and other devices via wireless communication, such as the Internet (also referred to as the World Wide Web, WWW), an intranet, and/or a wireless network such as a cellular telephone network, a wireless local area network (WLAN), and/or a metropolitan area network (MAN).
  • the above wireless communication may use any of a variety of communication standards, protocols, and technologies, including, but not limited to, Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), Wideband Code Division Multiple Access (W-CDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Wi-MAX, protocols for e-mail, instant messaging, and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed at the filing date of this document.
  • Audio circuitry 116, speaker 118, and microphone 120 provide an audio interface between the user and the bot 1200.
  • Audio circuitry 116 receives audio data from peripheral interface 108, converts the audio data into electrical signals, and transmits the electrical signals to speaker 118.
  • the speaker transforms the electrical signal into a human audible sound wave.
  • Audio circuit 116 also receives electrical signals that are converted from sound waves by microphone 120.
  • the audio circuit 116 converts the electrical signals into audio data and transmits the audio data to the peripheral interface 108 for processing. Audio data may be retrieved from memory 102 and/or RF circuitry 114 by peripheral interface 108 and/or transmitted to memory 102 and/or RF circuitry 114.
  • a plurality of microphones 120 can be included, the plurality of microphones 120 being distributed at different locations, and the direction in which the sound is emitted is determined according to a predetermined strategy based on the microphones 120 at different locations. It should be understood that the direction of the sound can also be identified by some sensors.
  • audio circuit 116 also includes a headset jack (not shown).
  • the headset jack provides an interface between the audio circuit 116 and removable audio input/output peripherals; for example, the audio input/output peripheral can be an output-only earphone, or a headset with both output (for single-ear or dual-ear headphones) and input (a microphone).
  • a speech recognition device (not shown) is also included for implementing speech-to-text recognition and synthesizing speech based on text.
  • the speech recognition device can be implemented by hardware, software, or a combination of hardware and software, including one or more signal processing and/or application-specific integrated circuits.
  • the audio circuit 116 receives the audio data from the peripheral interface 108, converts the audio data into electrical signals, and the voice recognition device can identify the audio data and convert the audio data into text data.
  • the speech recognition apparatus can also synthesize the audio data based on the text data, convert the audio data into an electrical signal through the audio circuit 116, and transmit the electrical signal to the speaker 118.
  • Perception subsystem 122 provides an interface between the perceptual peripherals of robot 1200 and peripheral interface 108, such as attitude sensor 132, camera 134, tactile sensor 136, and other sensing devices 138.
  • Perception subsystem 122 includes an attitude controller 124, a visual controller 126, a haptic controller 128, and one or more other perceptual device controllers 130.
  • the one or more other sensing device controllers 130 receive/transmit electrical signals from/to other sensing devices 138.
  • the other sensing devices 138 may include temperature sensors, distance sensors, proximity sensors, air pressure sensors, air quality detecting devices, and the like.
  • the robot 1200 can have a plurality of attitude controllers 124 to control different limbs of the robot 1200, which can include, but are not limited to, arms, feet, and heads. Accordingly, the robot 1200 can include a plurality of attitude sensors 132. In some embodiments, the robot 1200 may not have the attitude controller 124 and the attitude sensor 132. The robot 1200 may be in a fixed configuration and does not have mechanical moving parts such as an arm or a foot. In some embodiments, the pose of the robot 1200 may not be a mechanical arm, foot, and head, but may also employ a deformable configuration.
  • the robot 1200 also includes a power system 142 for powering various components.
  • the power system 142 can include a power management system, one or more power sources (eg, batteries, alternating current (AC)), charging systems, power failure detection circuits, power converters or inverters, power status indicators (eg, light emitting diodes (eg LED)), as well as any other components associated with power generation, management, and distribution in portable devices.
  • the charging system can be a wired charging system or a wireless charging system.
  • the software components include an operating system 144, a communication module (or set of instructions) 146, an interactive behavior control device (or set of instructions) 148, and one or more other devices (or sets of instructions) 150.
  • Operating system 144 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between the various hardware and software components.
  • Communication module 146 facilitates communication with other devices via one or more external interfaces 140 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.), and also includes various software components for processing data received by the RF circuitry 114 and/or the external interface 140.
  • the bot 1200 can also include a display device (not shown), which can include, but is not limited to, a touch sensitive display, a touch pad, and the like.
  • One or more of the other devices 150 described above can include a graphics module (not shown) that includes various known software components for presenting and displaying graphics on the display device.
  • graphics includes any object that can be displayed to a user, including but not limited to text, web pages, icons (eg, user interface objects including soft keys), digital images, video, animation, and the like. Touch sensitive displays or touch pads can also be used for user input.
  • the robot 1200 senses its external environment and its own condition by, for example, the attitude sensor 132, the camera 134, the tactile sensor 136, other sensing devices 128, the microphone 120, etc.; the information perceived by the robot 1200 is processed via the sensing peripheral control device and by one or more CPUs 106.
  • the perception of the environment by the robot 1200 includes, but is not limited to, information detected by its own sensors (e.g., the attitude sensor 132, the camera 134, the tactile sensor 136, and other sensing devices 128), and may also include information detected by an external device (not shown) coupled to the robot 1200; the robot 1200 establishes a communication connection with the external device, and the robot 1200 and the external device transmit data through the communication connection.
  • External devices include various types of sensors, smart home devices, and the like.
  • the information perceived by the robot 1200 includes, but is not limited to, sound, images, environmental parameters, tactile information, time, space, and the like.
  • Environmental parameters include, but are not limited to, temperature, humidity, gas concentration, etc.
  • tactile information includes, but is not limited to, contact with the robot 1200, including but not limited to contact with a touch-sensitive display and contact with or proximity to a tactile sensor; the tactile sensor can be placed at the head, arm, etc. of the robot (not shown). It should be understood that tactile information may include other forms of information.
  • the sound may include voice and other sounds; the sound may be sound collected by the microphone 120, or may be sound stored in the memory 102. The voice may include, but is not limited to, a person speaking or singing.
  • the image may be a single picture or a video, and may be, but is not limited to being, captured by the camera 134, read from the memory 102, or transmitted to the robot 1200 over a network.
  • the information sensed by the robot 1200 includes not only information external to the robot 1200 but also information of the robot 1200 itself, including but not limited to information such as the amount of power, temperature, and the like of the robot 1200.
  • for example, the robot 1200 can move to a charging position for automatic charging when it perceives that its own remaining power is less than 20%.
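  • As a loose illustration of how perceived information might be expressed as sensing-unit values (not taken from the original disclosure; the unit names and the handling of the 20% threshold below are assumptions), a minimal sketch in Python:

```python
# Hypothetical sketch: sensing data as sensing-unit name/value pairs.
# Unit names and values are illustrative only.
sensing_data = {
    "battery_level": 18,         # percent, reported by the robot itself
    "indoor_temperature": 27.5,  # degrees Celsius, from an associated sensor
    "user_speech": "hello",      # recognized from the microphone
}

def needs_charging(data, threshold=20):
    """Return True when the robot's own battery level drops below the threshold."""
    return data.get("battery_level", 100) < threshold

if needs_charging(sensing_data):
    print("move to charging position")  # stands in for the actual motion command
```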
  • the robot 1200 is not limited to perceiving information in the manner described above, and may also perceive information in other forms, including perceptual techniques not yet developed at the filing date of this document.
  • the sensing devices of the robot 1200 are not limited to sensing devices disposed on the robot 1200, and may also include sensing devices that are associated with the robot 1200 but not disposed on it, such as various sensors for sensing information.
  • the robot 1200 may be associated with a temperature sensor, a humidity sensor (not shown), or the like disposed within a certain area through which the corresponding information is perceived.
  • the robot 1200 can communicate with these sensors through various types of communication protocols to obtain information from them.
  • the information perceived by the robot 1200 may be set according to preset conditions, which may include, but are not limited to, which information the robot 1200 perceives, when it perceives that information, and the like. For example, during a voice conversation with the user, the robot 1200 may be set to sense the user's voice, track the user's face, and recognize the user's gestures, while not perceiving other information, reducing the weight of other information when generating sensing units, or otherwise handling the other sensed information; or, during a certain period of time (for example, when the user is out and the robot 1200 is indoors alone), the robot 1200 may sense environmental parameters and perceive image and video data, the environmental parameters being used to determine whether it is necessary to interact with an air conditioner or the like.
  • the conditions for setting the perceived information are not limited thereto; the above conditions are merely examples, and the information that the robot 1200 needs to perceive may be set according to the situation, as sketched below.
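  • One possible way to express such preset conditions, purely as an assumption for illustration (the mode names and unit lists are invented here, not part of the disclosure), is a small configuration table mapping situations to the sensing units that should be active:

```python
# Hypothetical perception configuration: which information to perceive in which situation.
PERCEPTION_MODES = {
    "voice_conversation": ["user_speech", "user_face", "user_gesture"],
    "home_alone":         ["indoor_temperature", "humidity", "camera_image"],
}

def active_units(mode):
    """Return the sensing units the robot should attend to in the given mode."""
    return PERCEPTION_MODES.get(mode, [])

print(active_units("voice_conversation"))  # -> ['user_speech', 'user_face', 'user_gesture']
```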
  • FIG. 3 shows a robot control engine 1100 in accordance with one embodiment of the present invention.
  • the robot control engine 1100 may include: a sensing data acquiring device 1110, configured to acquire sensing data generated according to at least one preset sensing unit based on information perceived by the robot 1200; a control item generating device 1120, configured to generate and maintain control items for controlling the interaction behavior of the robot 1200 based on the sensing data of the robot 1200; an inverted index generating device 1130, configured to generate an inverted index with the sensing unit included in the trigger condition of each control item as the primary key and the identification of the control item as the target; and a control item retrieval device 1140, configured to retrieve a control entry for controlling the interaction behavior of the robot 1200 based on the sensing data of the robot 1200 and the inverted index.
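  • To make the data flow concrete, the following is a minimal, self-contained sketch, not the patented implementation: the entry contents, field names, and identifiers are assumptions. It shows control entries whose trigger conditions are built from sensing units, an inverted index keyed by sensing-unit values, and a retrieval step driven by sensing data:

```python
# Hypothetical control entries: a trigger condition over sensing units plus an interaction behavior.
CONTROL_ENTRIES = {
    "e1": {"trigger": {"health": "sick"}, "behavior": "call emergency number"},
    "e2": {"trigger": {"battery_level": "low"}, "behavior": "move to charger"},
    "e3": {"trigger": {"health": "sick", "user_present": "no"}, "behavior": "notify family"},
}

# Build an inverted index: (sensing unit, value) -> identifiers of control entries.
inverted_index = {}
for entry_id, entry in CONTROL_ENTRIES.items():
    for unit, value in entry["trigger"].items():
        inverted_index.setdefault((unit, value), set()).add(entry_id)

def retrieve(sensing_data):
    """Return identifiers of control entries whose trigger units appear in the sensing data."""
    hits = set()
    for unit, value in sensing_data.items():
        hits |= inverted_index.get((unit, value), set())
    return hits

print(retrieve({"health": "sick"}))  # matches entries e1 and e3
```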
  • FIG. 4 shows a robot control engine 1100 in accordance with another embodiment of the present invention.
  • the robot control engine 1100 shown in FIG. 4 may further include: a sensing unit classification device 1150, configured to classify the sensing units based on the type of each sensing unit, to form sets of sensing units distinguished by type.
  • the sensing units can be divided into multiple types; for example, the sensing units may be divided into auditory, visual, tactile, environmental, and other types, or the sensing units may be divided, according to the theme involved, into news, shopping, games, indoor security, environmental monitoring, and other types. It should be understood that the types of sensing units are not limited to the above classifications.
  • the inverted index generating device 1130 is further configured to form, based on the sets of sensing units obtained by the classification, a plurality of inverted indexes differentiated by sensing unit type. The multiple inverted indexes can be stored in different devices, and these devices may be physical devices or virtual devices.
  • the robot control engine 1100 as shown in FIG. 4 may further include: a control item retrieval agent device 1160 configured to analyze the sensing unit included in the sensory data of the robot 1200, and select a corresponding inverted index based on the type of the sensing unit included.
  • the control item retrieval means 1140 is further arranged to retrieve a control entry for controlling the interactive behavior of the robot 1200 based on the inverted index selected by the control item retrieval agent means 1160.
  • the robot control engine 1100 can also include a plurality of control item retrieval devices 1140 as shown in FIG. 4, each control item retrieval device 1140 can correspond to an inverted index of at least one perceptual unit type.
  • the control item retrieval agent device 1160 may store the sensing unit type corresponding to each control item retrieval device 1140, select a corresponding inverted index based on the type of the sensing units included in the sensing data, and have the corresponding control item retrieval device 1140 retrieve the control entries corresponding to the sensing units in the sensing data.
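  • A rough sketch of how such an agent could route a query to a type-specific index (the type names, unit-to-type mapping, and entry identifiers below are assumptions, not the disclosed design):

```python
# Hypothetical type-specific inverted indexes, e.g. one per sensing-unit type.
INDEX_BY_TYPE = {
    "auditory":      {("user_speech", "help"): {"e10"}},
    "environmental": {("indoor_temperature", "high"): {"e20"}},
}

# Which type each sensing unit belongs to (assumed mapping).
UNIT_TYPE = {"user_speech": "auditory", "indoor_temperature": "environmental"}

def retrieve_with_agent(sensing_data):
    """Route each sensing unit to the inverted index for its type and collect matches."""
    hits = set()
    for unit, value in sensing_data.items():
        index = INDEX_BY_TYPE.get(UNIT_TYPE.get(unit), {})
        hits |= index.get((unit, value), set())
    return hits

print(retrieve_with_agent({"user_speech": "help"}))  # -> {'e10'}
```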
  • the inverted index generating device 1130 may be further configured to transform the value of each sensing unit included in the trigger condition of a control entry into an integer based on a digital signature technique (e.g., MD5), and to generate the inverted index with the resulting integer as the primary key and the identification of the control entry as the target.
  • the control item retrieval device 1140 is further configured to convert the values of the sensing units in the sensing data of the robot 1200 into integers based on the same digital signature technique, and to search the inverted index based on the integers obtained from those values.
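  • A small sketch of the signature step, assuming MD5 as named above (truncating the digest to 64 bits is an added assumption made here for compactness):

```python
import hashlib

def unit_value_key(unit, value):
    """Map a (sensing unit, value) pair to an integer key via an MD5 digest.

    Truncation to 64 bits is an illustrative choice, not part of the disclosure.
    """
    digest = hashlib.md5(f"{unit}={value}".encode("utf-8")).hexdigest()
    return int(digest, 16) & 0xFFFFFFFFFFFFFFFF

# The same function must be used when building the index and when querying it,
# so that equal sensing-unit values map to the same integer primary key.
print(unit_value_key("health", "sick"))
```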
  • the sensing data of the robot 1200 can include a plurality of sensing units. After control items are retrieved based on the plurality of sensing units, the robot control engine 1100 synthesizes, from the retrieved control items, a control entry that matches the sensing data of the robot.
  • FIG. 5 illustrates a robot control engine 1100 in accordance with yet another embodiment of the present invention.
  • the robot control engine 1100 may further include: a retrieval result synthesizing device 1170, configured to merge the control entries retrieved based on the values of the sensing units in the sensing data of the robot 1200, to form a control entry that matches the sensing data of the robot 1200.
  • the retrieval result synthesizing device 1170 is further configured to merge the retrieved control entries based on the logical relationships between the sensing units constituting the trigger conditions in the retrieved control entries, to form a control entry that matches the sensing data of the robot 1200.
  • the search result synthesizing means 1170 can find an intersection of the set of control items retrieved based on the value of each perceptual unit, and form one or more control items corresponding to the perceptual data of the robot 1200.
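  • As a sketch of that intersection step (the entry identifiers are invented for illustration):

```python
from functools import reduce

# Hypothetical retrieval results: one set of control-entry ids per sensing-unit value.
results_per_unit = [
    {"e1", "e3", "e7"},   # entries retrieved for the first sensing-unit value
    {"e3", "e7", "e9"},   # entries retrieved for the second sensing-unit value
]

# Entries whose trigger conditions are satisfied by all of the sensing-unit values.
matching_entries = reduce(set.intersection, results_per_unit) if results_per_unit else set()
print(matching_entries)  # -> {'e3', 'e7'}
```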
  • the robot control engine 1100 as described in FIG. 5 may further include: a control item sorting device 1180, configured to sort the control items retrieved by the control item retrieval device 1140, so as to select, based on the sorted result, the control entry that controls the interaction behavior of the robot.
  • the retrieval result synthesizing device 1170 may form one or more control items corresponding to the sensing data of the robot 1200, and the control item sorting device 1180 may sort the formed control items based on a preset policy, so as to select the control entry that controls the interaction behavior of the robot 1200 based on its sensing data.
  • FIG. 6 shows a robot control engine 1100 in accordance with still another embodiment of the present invention.
  • the robot control engine 1100 may further include one or any combination of the following: a user feedback acquisition device 1190, configured to acquire user feedback from users 1300 on the interaction behavior of the robot 1200; a control item execution status recording device 1192, configured to record execution status information of the control items, forming an execution log; a control item priority configuration device 1194, configured to configure the priority of the control items; and a user behavior recording device 1196, configured to record user behavior, forming a user behavior log.
  • user feedback includes, but is not limited to, the user 1300's evaluation of the interaction behavior of the robot 1200, including but not limited to voice feedback given by the user 1300 after the robot 1200 performs the interaction behavior, physical contact between the user 1300 and the robot 1200, and feedback instructions sent by the user 1300 through a terminal (e.g., a smartphone).
  • the execution status information of the control entry includes, but is not limited to, the number of executions of the control entry, the execution time of the control entry, the execution success rate of the control entry, and the like.
  • the priority of a control entry can be set based on the source of the control entry, and a control entry with a higher priority can be selected first.
  • the control item sorting device 1180 is further configured to sort the control items retrieved by the control item retrieval device 1140 based on the user feedback, and/or the execution log, and/or the priority of the control entries, and/or the user behavior log, so as to select, based on the sorted result, the control items that control the interaction behavior of the robot.
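  • A hedged sketch of such a sorting step, combining the signals named above with invented weights (the scoring formula and field names are assumptions, not the patented policy):

```python
# Hypothetical per-entry statistics gathered from the execution log, user feedback, and configuration.
candidates = [
    {"id": "e1", "priority": 2, "success_rate": 0.9, "positive_feedback": 12},
    {"id": "e3", "priority": 1, "success_rate": 0.7, "positive_feedback": 30},
]

def score(entry):
    """Combine priority, execution success rate, and user feedback into one ranking score."""
    return 10 * entry["priority"] + 5 * entry["success_rate"] + 0.1 * entry["positive_feedback"]

best = max(candidates, key=score)
print(best["id"])  # the control entry selected to drive the robot's interaction behavior
```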
  • FIG. 7 shows a robot control engine 1100 in accordance with still another embodiment of the present invention.
  • the robot control engine 1100 can also include an internet content capture device 1198 configured to fetch content from the Internet to form an internet content collection.
  • the control item generating device 1120 is further configured to generate, based on the Internet content set, the preset sensing units, and the preset interaction behaviors, control items for controlling the interaction behavior of the robot according to its sensing data.
  • Internet content includes at least one or any combination of the following: web pages, text, sound, video, images.
  • the Internet content can be captured by a data crawling tool (for example, a web crawler), and the captured Internet content is analyzed based on a data mining algorithm to obtain the Internet content collection.
  • Internet content collections can be constructed in the form of "if this then that", describing the feedback to be given under various conditions, for example an answer to a question, an expression conveying a certain emotion, or a limb movement.
  • the control item generating device 1120 is further configured to generate, based on the Internet content set, the preset sensing units, and the preset interaction behaviors, the control items that control the interaction behavior of the robot according to its sensing data.
  • content may be fetched from the Internet (e.g., web pages), the captured content may be analyzed to obtain content for setting control entries, and the trigger conditions and the interaction behaviors triggered by them may be set according to that content.
  • for example, a trigger condition of "ill" can be set according to the sensing unit, and the interaction behavior triggered by that trigger condition can be set to "call an emergency number".
  • if the sensing unit "health status" is predefined, the value of the sensing unit can be directly set to "ill", and the trigger condition can be {if("health": "sick")}.
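  • Written out in full, the example above might look like the following control entry; this is only a sketch, the surrounding field names are assumptions, and only the "health"/"sick" condition and the emergency-call behavior come from the text:

```python
# Hypothetical serialized control entry in the "if this then that" style described above.
control_entry = {
    "id": "entry-emergency-001",
    "trigger": {"if": {"health": "sick"}},          # trigger condition over the "health status" sensing unit
    "behavior": {"then": "call emergency number"},  # interaction behavior triggered by the condition
}

def triggered(entry, sensing_data):
    """Check whether every unit/value pair in the trigger condition is present in the sensing data."""
    condition = entry["trigger"]["if"]
    return all(sensing_data.get(unit) == value for unit, value in condition.items())

print(triggered(control_entry, {"health": "sick"}))  # -> True
```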
  • a sensing unit for controlling the interaction behavior of the robot is defined in advance and used as the minimum unit for controlling the interaction behavior of the robot, and the trigger conditions and the interaction behaviors triggered by them are set according to the sensing units to obtain the control entries that control the robot.
  • the control items unify the input and output standards of the robot control, so that non-technical personnel can also edit the behavior of the robot, thereby facilitating the control of the robot's interaction behavior, and effectively improving the robot's adaptive interaction behavior and intelligence.
  • the modules or steps of the embodiments of the present invention can be implemented by a general-purpose computing device, and they can be concentrated on a single computing device or distributed over multiple computing devices. Alternatively, they may be implemented by program code executable by the computing device, such that they may be stored in a storage device and executed by the computing device; in some cases, the steps may be performed in an order different from that shown or described, or they may be separately fabricated into individual integrated circuit modules, or a plurality of the modules or steps may be fabricated into a single integrated circuit module. Thus, embodiments of the invention are not limited to any specific combination of hardware and software.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to a robot control engine and system. The robot control engine comprises: a sensing data acquisition device (1110), configured to acquire sensing data generated on the basis of information perceived by a robot and according to at least one preset sensing unit, the sensing data comprising a value of the sensing unit; a control item generation device (1120), configured to generate and maintain a control item for controlling an interaction behavior of the robot on the basis of the sensing data of the robot, the control item comprising a trigger condition composed of the at least one sensing unit and an interaction behavior triggered by the trigger condition; an inverted index generation device (1130), configured to generate an inverted index using as a primary key the sensing unit included in the trigger condition of each control item and using as a target an identifier of the control item; and a control item retrieval device (1140), configured to retrieve the control item on the basis of the sensing data of the robot and the inverted index. This solution effectively improves the adaptive interaction behavior and the degree of intelligence of the robot.
PCT/CN2016/087259 2015-06-26 2016-06-27 Moteur et système de commande de robot WO2016206644A1 (fr)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
CN201510364661.7 2015-06-26
CN201510363346.2 2015-06-26
CN201510364661.7A CN106325228B (zh) 2015-06-26 2015-06-26 机器人的控制数据的生成方法及装置
CN201510363348.1A CN106325065A (zh) 2015-06-26 2015-06-26 机器人交互行为的控制方法、装置及机器人
CN201510363346.2A CN106325113B (zh) 2015-06-26 2015-06-26 机器人控制引擎及系统
CN201510363348.1 2015-06-26

Publications (1)

Publication Number Publication Date
WO2016206644A1 true WO2016206644A1 (fr) 2016-12-29

Family

ID=57584497

Family Applications (6)

Application Number Title Priority Date Filing Date
PCT/CN2016/087262 WO2016206647A1 (fr) 2015-06-26 2016-06-27 Système de commande d'appareil mécanique permettant de générer une action
PCT/CN2016/087259 WO2016206644A1 (fr) 2015-06-26 2016-06-27 Moteur et système de commande de robot
PCT/CN2016/087257 WO2016206642A1 (fr) 2015-06-26 2016-06-27 Procédé et appareil de génération de données de commande de robot
PCT/CN2016/087261 WO2016206646A1 (fr) 2015-06-26 2016-06-27 Procédé et système pour pousser un dispositif de machine à générer une action
PCT/CN2016/087258 WO2016206643A1 (fr) 2015-06-26 2016-06-27 Procédé et dispositif de commande de comportement interactif de robot et robot associé
PCT/CN2016/087260 WO2016206645A1 (fr) 2015-06-26 2016-06-27 Procédé et appareil de chargement de données de commande dans un dispositif de machine

Family Applications Before (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/087262 WO2016206647A1 (fr) 2015-06-26 2016-06-27 Système de commande d'appareil mécanique permettant de générer une action

Family Applications After (4)

Application Number Title Priority Date Filing Date
PCT/CN2016/087257 WO2016206642A1 (fr) 2015-06-26 2016-06-27 Procédé et appareil de génération de données de commande de robot
PCT/CN2016/087261 WO2016206646A1 (fr) 2015-06-26 2016-06-27 Procédé et système pour pousser un dispositif de machine à générer une action
PCT/CN2016/087258 WO2016206643A1 (fr) 2015-06-26 2016-06-27 Procédé et dispositif de commande de comportement interactif de robot et robot associé
PCT/CN2016/087260 WO2016206645A1 (fr) 2015-06-26 2016-06-27 Procédé et appareil de chargement de données de commande dans un dispositif de machine

Country Status (1)

Country Link
WO (6) WO2016206647A1 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11220008B2 (en) * 2017-07-18 2022-01-11 Panasonic Intellectual Property Management Co., Ltd. Apparatus, method, non-transitory computer-readable recording medium storing program, and robot
CN108388399B (zh) * 2018-01-12 2021-04-06 北京光年无限科技有限公司 虚拟偶像的状态管理方法及系统
JP7188950B2 (ja) 2018-09-20 2022-12-13 株式会社Screenホールディングス データ処理方法およびデータ処理プログラム
TWI735168B (zh) * 2020-02-27 2021-08-01 東元電機股份有限公司 語音控制機器人

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020184024A1 (en) * 2001-03-22 2002-12-05 Rorex Phillip G. Speech recognition for recognizing speaker-independent, continuous speech
WO2006093394A1 (fr) * 2005-03-04 2006-09-08 Chutnoon Inc. Serveur, procede et systeme pour service de recherche d'informations au moyen d'une page web segmentee en plusieurs blocs d'information
CN1911606A (zh) * 2005-08-10 2007-02-14 株式会社东芝 用于控制机器人的行为的装置和方法
US20090043575A1 (en) * 2007-08-07 2009-02-12 Microsoft Corporation Quantized Feature Index Trajectory
US20110213659A1 (en) * 2010-02-26 2011-09-01 Marcus Fontoura System and Method for Automatic Matching of Contracts in an Inverted Index to Impression Opportunities Using Complex Predicates and Confidence Threshold Values
CN102448678A (zh) * 2009-05-26 2012-05-09 奥尔德巴伦机器人公司 用于编辑和控制移动机器人的行为的系统和方法
WO2014050192A1 (fr) * 2012-09-27 2014-04-03 オムロン株式会社 Appareil de gestion de dispositif et procédé de recherche de dispositif
CN103729476A (zh) * 2014-01-26 2014-04-16 王玉娇 一种根据环境状态来关联内容的方法和系统

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001353678A (ja) * 2000-06-12 2001-12-25 Sony Corp オーサリング・システム及びオーサリング方法、並びに記憶媒体
JP4108342B2 (ja) * 2001-01-30 2008-06-25 日本電気株式会社 ロボット、ロボット制御システム、およびそのプログラム
US6957215B2 (en) * 2001-12-10 2005-10-18 Hywire Ltd. Multi-dimensional associative search engine
KR101077404B1 (ko) * 2003-11-20 2011-10-26 파나소닉 주식회사 연관 제어 장치, 연관 제어 방법과 서비스 연관 시스템
JP2005193331A (ja) * 2004-01-06 2005-07-21 Sony Corp ロボット装置及びその情動表出方法
KR101088406B1 (ko) * 2008-06-27 2011-12-01 주식회사 유진로봇 유아교육시 로봇을 이용한 양방향 학습 시스템 및 그 운영방법
CN101618280B (zh) * 2009-06-30 2011-03-23 哈尔滨工业大学 具有人机交互功能的仿人头像机器人装置及行为控制方法
CN102665590B (zh) * 2009-11-16 2015-09-23 皇家飞利浦电子股份有限公司 用于内窥镜辅助机器人的人-机器人共享控制
FR2963132A1 (fr) * 2010-07-23 2012-01-27 Aldebaran Robotics Robot humanoide dote d'une interface de dialogue naturel, methode d'utilisation et de programmation de ladite interface
CN201940040U (zh) * 2010-09-27 2011-08-24 深圳市杰思谷科技有限公司 家用机器人
KR20120047577A (ko) * 2010-11-04 2012-05-14 주식회사 케이티 대화형 행동모델을 이용한 로봇 인터랙션 서비스 제공 장치 및 방법
EP2764455B1 (fr) * 2011-10-05 2022-04-20 Opteon Corporation Système et procédé pour la surveillance et/ou le contrôle d'environnements dynamiques
US8965580B2 (en) * 2012-06-21 2015-02-24 Rethink Robotics, Inc. Training and operating industrial robots
CN103324100B (zh) * 2013-05-02 2016-08-31 郭海锋 一种信息驱动的情感车载机器人
CN103399637B (zh) * 2013-07-31 2015-12-23 西北师范大学 基于kinect人体骨骼跟踪控制的智能机器人人机交互方法
CN103793536B (zh) * 2014-03-03 2017-04-26 陈念生 一种智能平台实现方法及装置
CN105511608B (zh) * 2015-11-30 2018-12-25 北京光年无限科技有限公司 基于智能机器人的交互方法及装置、智能机器人

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020184024A1 (en) * 2001-03-22 2002-12-05 Rorex Phillip G. Speech recognition for recognizing speaker-independent, continuous speech
WO2006093394A1 (fr) * 2005-03-04 2006-09-08 Chutnoon Inc. Serveur, procede et systeme pour service de recherche d'informations au moyen d'une page web segmentee en plusieurs blocs d'information
CN1911606A (zh) * 2005-08-10 2007-02-14 株式会社东芝 用于控制机器人的行为的装置和方法
US20090043575A1 (en) * 2007-08-07 2009-02-12 Microsoft Corporation Quantized Feature Index Trajectory
CN102448678A (zh) * 2009-05-26 2012-05-09 奥尔德巴伦机器人公司 用于编辑和控制移动机器人的行为的系统和方法
US20110213659A1 (en) * 2010-02-26 2011-09-01 Marcus Fontoura System and Method for Automatic Matching of Contracts in an Inverted Index to Impression Opportunities Using Complex Predicates and Confidence Threshold Values
WO2014050192A1 (fr) * 2012-09-27 2014-04-03 オムロン株式会社 Appareil de gestion de dispositif et procédé de recherche de dispositif
CN103729476A (zh) * 2014-01-26 2014-04-16 王玉娇 一种根据环境状态来关联内容的方法和系统

Also Published As

Publication number Publication date
WO2016206647A1 (fr) 2016-12-29
WO2016206645A1 (fr) 2016-12-29
WO2016206643A1 (fr) 2016-12-29
WO2016206646A1 (fr) 2016-12-29
WO2016206642A1 (fr) 2016-12-29

Similar Documents

Publication Publication Date Title
US20220012470A1 (en) Multi-user intelligent assistance
JP6816925B2 (ja) 育児ロボットのデータ処理方法及び装置
KR101726945B1 (ko) 수동 시작/종료 포인팅 및 트리거 구문들에 대한 필요성의 저감
US11010601B2 (en) Intelligent assistant device communicating non-verbal cues
US10486312B2 (en) Robot, robot control method, and robot system
WO2017215297A1 (fr) Système interactif en nuage, robot intelligent multicognitif, et procédé d'interaction cognitive associés
CN106325228B (zh) 机器人的控制数据的生成方法及装置
WO2016206644A1 (fr) Moteur et système de commande de robot
IL229370A (en) Interface system and method for providing user interaction with network entities
CN110609620A (zh) 基于虚拟形象的人机交互方法、装置及电子设备
CN106325113B (zh) 机器人控制引擎及系统
JP2016103081A (ja) 会話分析装置、会話分析システム、会話分析方法及び会話分析プログラム
CN106325065A (zh) 机器人交互行为的控制方法、装置及机器人
WO2022199500A1 (fr) Procédé d'entraînement de modèle, procédé de reconnaissance de scène et dispositif associé
CN108806699B (zh) 语音反馈方法、装置、存储介质及电子设备
TW202223804A (zh) 電子資源推送方法及系統
JP6798258B2 (ja) 生成プログラム、生成装置、制御プログラム、制御方法、ロボット装置及び通話システム
JP2018186326A (ja) ロボット装置及びプログラム
WO2023006033A1 (fr) Procédé d'interaction vocale, dispositif électronique et support
US11687049B2 (en) Information processing apparatus and non-transitory computer readable medium storing program
WO2020087534A1 (fr) Génération de réponse dans une conversation
CN113539282A (zh) 声音处理装置、系统和方法
CN111919250A (zh) 传达非语言提示的智能助理设备
JP5701935B2 (ja) 音声認識システムおよび音声認識システムの制御方法
US11997445B2 (en) Systems and methods for live conversation using hearing devices

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16813760

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16813760

Country of ref document: EP

Kind code of ref document: A1