WO2016206647A1 - System for controlling a machine device to generate an action - Google Patents
System for controlling a machine device to generate an action
- Publication number
- WO2016206647A1 (PCT/CN2016/087262, CN2016087262W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- behavior
- frame
- control
- entity
- input
- Prior art date
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F13/00—Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
- G06F13/10—Program control for peripheral devices
Definitions
- the present invention relates to the field of human-computer interaction technology, and in particular, to a system for controlling a machine device to generate an action.
- Machine devices such as electronic devices, or robots are widely used in life and production.
- machine equipment in the related art often interacts with an entity (such as a human user or a robot) in a simple "input-output" manner: the entity produces an input and the machine device produces an output for that input, but the machine device cannot interact logically with the entity, and such interaction makes it difficult to help the entity complete tasks that involve multiple steps or multiple situations.
- various aspects of the subject matter described herein relate to a system for controlling a machine device to generate an action, so as to at least make the machine device's interaction more logical.
- the subject matter described herein includes an interactive action flow constrained by event logic; the machine device can enter the interactive action flow and perform the actions defined by it.
- the interactive action flow includes behavior frames and logical control frames, wherein the behavior frames include a start behavior frame, each logical control frame is connected to one or more previous behavior frames and one or more subsequent behavior frames, and the logical control frame includes one or more entity-related control conditions and the subsequent behavior frames corresponding to those control conditions, or one or more control conditions related to the previous behavior frames and the subsequent behavior frames corresponding to those control conditions.
- the logical control frame and the previous behavior frame connected to the logical control frame may be defined as an interactive action item, each interactive action item having a unique identifier; wherein the subsequent behavior frame corresponding to a control condition in the logical control frame is referred to by the unique identifier of the interactive action item that contains that subsequent behavior frame.
- a corresponding interactive action item can be entered based on the reference to perform a behavior frame in the interactive action item.
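- As an illustration of the structure described above, the interactive action flow can be pictured as a collection of interactive action items, each pairing a behavior frame with a logical control frame whose control conditions point to other items by their unique identifiers. The sketch below uses Python dictionary literals to mirror such a JSON-like structure; the field names (flow_id, start_item, items, behavior_frame, logical_control_frame, etc.) and the example content are illustrative assumptions, not the exact schema of this publication.

    # Hypothetical sketch of an interactive action flow; field names are assumptions.
    interactive_action_flow = {
        "flow_id": "order_drink",        # unique identification of the flow
        "start_item": "greet",           # interactive action item holding the start behavior frame
        "items": {
            "greet": {
                "behavior_frame": [
                    {"behavior": "audio_speak",
                     "params": {"text": "Would you like some coffee?", "volume": 60}},
                ],
                "logical_control_frame": {
                    # entity-related control conditions -> unique identifiers of the
                    # interactive action items containing the subsequent behavior frames
                    "super_yes": "serve_coffee",
                    "super_no": "say_goodbye",
                    "other": "greet",    # repeat the question when the answer is unclear
                },
            },
            "serve_coffee": {
                "behavior_frame": [
                    {"behavior": "audio_speak",
                     "params": {"text": "Coffee coming up.", "volume": 60}},
                ],
                "logical_control_frame": {},  # no further conditions: the flow ends here
            },
            "say_goodbye": {
                "behavior_frame": [
                    {"behavior": "audio_speak",
                     "params": {"text": "Goodbye.", "volume": 60}},
                ],
                "logical_control_frame": {},
            },
        },
    }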
- a machine apparatus comprising: an input unit for acquiring data related to an entity (such as a human user or a robot); a computing unit for generating, based on the entity-related data, a control input for determining an action to be performed by the machine device; and an output unit for generating an action under the control of the computing unit;
- the computing unit is further configured to determine an action to be performed by the machine device based on the control input, wherein the action to be performed by the machine device comprises an interactive action flow constrained according to event logic, the interactive action flow comprising behavior frames and logical control frames, the behavior frames including a start behavior frame, each logical control frame being connected to one or more previous behavior frames and one or more subsequent behavior frames, and the logical control frame including one or more entity-related control conditions and the subsequent behavior frames corresponding to the one or more control conditions; wherein the computing unit is further configured to, when entering the interactive action flow, control the output unit and/or the input unit to perform the start behavior frame;
- the logical control frame further includes one or more control conditions related to the previous behavior frames; wherein the computing unit is further configured to, upon entering the logical control frame, determine a control input based on an output result of the one or more previous behavior frames, and to determine, based on the logical control frame, the subsequent behavior frame that matches the control input.
- the computing unit is further configured to exit the interactive action flow when the interactive action flow ends and continue to determine control inputs based on the entity-related data and determine an action to be performed by the machine device based on the control input.
- the computing unit is operative to search for a control entry that matches the control input, wherein the control entry includes condition data and control data corresponding to the condition data, the control data defining an action to be performed by the machine device.
- the computing unit is configured to determine whether the action to be performed by the machine device is an interactive action flow based on the data type of the control data, wherein when determined to be an interactive action flow, the interactive action flow execution process is entered.
- control input includes what the entity says
- computing unit is used to search for control entries that match what the entity says.
- the behavioral frame may correspond to one or more basic behaviors, each of which may be directly performed by the machine device.
- the basic behavior can be defined by the behavior name and behavior control parameters.
- the logical control frame includes links to other interactive action flows corresponding to one or more entity-related control conditions, or links to other interactive action flows corresponding to control conditions related to one or more previous behavior frames.
- the links to other interactive action flows may include the unique identification of the interactive action flow, or the unique identification together with a behavior frame specified when entering the interactive action flow.
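- For illustration only, such a link could be as small as the unique identification of the target interactive action flow, optionally together with the behavior frame (interactive action item) at which execution should begin; the field names below are assumptions, in the same Python/JSON-style notation used above.

    # Hypothetical link to another interactive action flow; field names are assumptions.
    flow_link = {
        "flow_id": "order_drink",      # unique identification of the target interactive action flow
        "entry_item": "serve_coffee",  # optional behavior frame specified when entering the flow
    }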
- the input unit comprises one or any combination of a voice input unit, an image input unit, or a motion input unit; and/or the output unit comprises one or any combination of a voice output unit, a display unit, a mechanical actuator, a light output unit, or a signal modulation and output unit.
- control input includes information for multiple dimensions.
- the machine device is a mobile robot with interactive capabilities, or an electronic device.
- the entity-related data includes what the entity says
- the control input includes what the entity says
- each of the behavior frames includes an act of causing the machine device to present content to be spoken by the machine device.
- each of the behavior frames also includes an activity to be performed along with content to be spoken by the machine device.
- control conditions include a yes or no decision condition, or a plurality of option selection conditions, or structural data, or a high level conclusion, or a wait timeout, or any combination.
- the computing unit is configured to schedule components for performing basic behavior in accordance with temporal logic constraints to control the actions of the output unit and/or the input unit to perform behavioral frames.
- a machine apparatus comprising: an input unit for acquiring entity-related data; an output unit for performing an action; and a calculation unit for performing an interactive action flow constrained by the event logic, wherein
- the interactive action flow includes a behavior frame and a logical control frame, and the behavior frame includes a start behavior frame, each logical control frame is connected to one or more previous behavior frames and one or more subsequent behavior frames, and the logical control frame includes one or a plurality of entity-related control conditions and subsequent behavior frames corresponding to one or more entity-related control conditions;
- the computing unit is configured to: execute at least one first processing thread, wherein the first processing thread is used to control the output unit and/or the input unit to perform the action corresponding to a behavior frame, and to control the output unit and/or the input unit to execute the start behavior frame when entering the interactive action flow; and, while executing the at least one first processing thread, execute at least one second processing thread, wherein the second processing thread is configured to detect entity-related data and to determine a control input based on the entity-related data.
- the behavioral frame may correspond to a basic behavior or a plurality of basic behaviors that are logically constrained by time, each of which may be directly performed by the machine device.
- the logical control frame includes links to other interactive action flows corresponding to one or more entity-related control conditions, or links to other interactive action flows corresponding to control conditions related to one or more previous behavior frames.
- the input unit comprises one or any combination of a voice input unit, an image input unit, or a motion input unit; and/or the output unit comprises one or any combination of a voice output unit, a display unit, a mechanical actuator, a light output unit, or a signal modulation and output unit.
- control input includes information for multiple dimensions.
- machine device is a mobile robot with interactive capabilities, or an electronic device.
- the entity-related data includes what the entity says
- the control input includes what the entity says
- each of the behavior frames includes an act of causing the machine device to present the content to be spoken by the machine device.
- each of the behavioral frames also includes an act to be performed along with the content to be spoken by the machine device.
- a system for controlling a machine device to generate an action comprising an input unit for retrieving entity-related data and an output unit for performing an action
- the system comprising: an input generation module for generating, based on the entity-related data retrieved by the input unit, a control input for determining an action to be performed by the machine device; a control generation module for determining, based on the control input, the action to be performed by the machine device, wherein the action to be performed by the machine device includes an interactive action flow constrained by event logic, the interactive action flow including behavior frames and logical control frames, the behavior frames including a start behavior frame, each logical control frame being connected to one or more previous behavior frames and one or more subsequent behavior frames, and the logical control frame including one or more entity-related control conditions and the subsequent behavior frames corresponding to the one or more entity-related control conditions; and a control module for controlling, upon entering the interactive action flow, the output unit and/or the input unit to perform the start behavior frame;
- the logical control frame further includes one or more control conditions related to the previous behavior frames; wherein the control module is further configured to, upon entering the logical control frame, determine a control input based on an output result of the one or more previous behavior frames, and to determine, based on the logical control frame, the subsequent behavior frame corresponding to the control input.
- the interactive action flow exits when the interactive action flow ends, and the control input is determined by the input generation module based on the entity-related data, and the control module determines the action to be performed by the machine device based on the control input.
- control module is operative to search for a control entry that matches the control input, wherein the control entry includes condition data and control data corresponding to the condition data, the control data defining an action to be performed by the machine device.
- control input includes what the entity says
- control module is used to search for control entries that match what the entity says.
- control module is operative to determine whether the action to be performed by the machine device is an interactive action flow based on the data type of the control data, wherein when determined to be an interactive action flow, the interactive action flow execution process is entered.
- the behavioral frame may correspond to a basic behavior or a plurality of basic behaviors that are logically constrained by time, each of which may be directly performed by the machine device.
- the logical control frame includes links to other interactive action flows corresponding to one or more entity-related control conditions, or links to other interactive action flows corresponding to control conditions related to one or more previous behavior frames.
- the input unit comprises one or any combination of a voice input unit, an image input unit, or a motion input unit; and/or the output unit comprises one or any combination of a voice output unit, a display unit, a mechanical actuator, a light output unit, or a signal modulation and output unit.
- the entity-related data includes what the entity says
- the control input includes what the entity says
- each of the behavior frames includes an act of causing the machine device to present the content to be spoken by the machine device.
- each of the behavior frames also includes an activity to be performed along with content to be spoken by the machine device.
- a method for controlling a machine device to generate an action, the machine device comprising an input unit for retrieving entity-related data and an output unit for performing an action, the method comprising: generating, based on the entity-related data retrieved by the input unit, a control input for determining an action to be performed by the machine device; determining the action to be performed by the machine device based on the control input, wherein the action to be performed by the machine device comprises an interactive action flow constrained according to event logic, the interactive action flow including behavior frames and logical control frames, the behavior frames including a start behavior frame, each logical control frame being connected to one or more previous behavior frames and one or more subsequent behavior frames, and the logical control frame including one or more entity-related control conditions and the subsequent behavior frames corresponding to the one or more entity-related control conditions; and, when entering the interactive action flow, controlling the output unit and/or the input unit to perform the start behavior frame; and, when entering a logical control frame, determining a control input based on the current entity-related data, determining, based on the logical control frame, the subsequent behavior frame that matches the control input, and performing that subsequent behavior frame.
- the logical control frame further includes one or more control conditions related to the previous behavior frames; wherein the method further comprises: upon entering the logical control frame, determining a control input based on an output of the one or more previous behavior frames, and determining, based on the logical control frame, the subsequent behavior frame corresponding to the control input.
- the interactive action flow is exited when the interactive action flow ends, and the current control input is determined based on the entity-related data and the action to be performed by the machine device is determined based on the current control input.
- determining an action to be performed by the machine device based on the control input includes: searching for a control entry that matches the control input, wherein the control entry includes condition data and control data corresponding to the condition data, the control data defining the action to be performed by the machine device.
- control input includes what the entity says
- determining an action to be performed by the machine device based on the control input includes searching for a control entry that matches what the entity said.
- the method further includes determining whether the action to be performed by the machine device is an interactive action flow based on a data type of the control data, wherein when determining to be an interactive action flow, entering the interactive action flow execution process .
- the behavioral frame corresponds to a basic behavior or a plurality of basic behaviors that are logically constrained by time, each of which can be directly performed by the machine device.
- the logical control frame includes links to other interactive action flows corresponding to one or more entity-related control conditions, or links to other interactive action flows corresponding to control conditions related to one or more previous behavior frames.
- the input unit comprises one or any combination of a voice input unit, an image input unit, or a motion input unit; and/or the output unit comprises one or any combination of a voice output unit, a display unit, a mechanical actuator, a light output unit, or a signal modulation and output unit.
- the entity-related data includes what the entity says
- the control input includes what the entity says
- each of the behavior frames includes an act of causing the machine device to present the content to be spoken by the machine device.
- each of the behavior frames also includes an activity to be performed along with content to be spoken by the machine device.
- a method for controlling a machine device to generate an action, the machine device comprising an input unit for retrieving entity-related data and an output unit for performing an action, the method being for performing an interactive action flow constrained according to event logic, wherein the interactive action flow includes behavior frames and logical control frames, the behavior frames include a start behavior frame, each logical control frame is connected to one or more previous behavior frames and one or more subsequent behavior frames, and the logical control frame includes one or more entity-related control conditions and the subsequent behavior frames corresponding to the one or more entity-related control conditions; wherein the method includes: executing at least one first processing thread, wherein the first processing thread is configured to control the output unit and/or the input unit to perform the action corresponding to a behavior frame, and to control the output unit and/or the input unit to execute the start behavior frame when entering the interactive action flow; and, while executing the at least one first processing thread, executing at least one second processing thread, wherein the second processing thread is configured to detect entity-related data and to determine a control input based on the entity-related data.
- the behavioral frame corresponds to a basic behavior or a plurality of basic behaviors that are logically constrained by time, each of which can be directly performed by the machine device.
- the logical control frame includes links to other interactive action flows corresponding to one or more entity-related control conditions, or links to other interactive action flows corresponding to control conditions related to one or more previous behavior frames.
- the input unit comprises one or any combination of a voice input unit, an image input unit, or a motion input unit; and/or the output unit comprises one or any combination of a voice output unit, a display unit, a mechanical actuator, a light output unit, or a signal modulation and output unit.
- control input includes information for multiple dimensions.
- the machine device described above is a mobile robot with interactive capabilities, or an electronic device.
- the entity-related data includes what the entity says
- the control input includes what the entity says
- each of the behavior frames includes an act of causing the machine device to present content to be spoken by the machine device.
- each of the behavior frames also includes an activity to be performed along with content to be spoken by the machine device.
- the invention relates to a device for controlling a machine device, the machine device comprising an input unit for acquiring entity-related data and an output unit for performing an action, the device for controlling the machine device to perform an interactive action flow
- the interactive action flow includes behavior frames and logical control frames, the behavior frames include a start behavior frame, each logical control frame is connected to one or more previous behavior frames and one or more subsequent behavior frames, and the logical control frame includes one or more entity-related control conditions and the subsequent behavior frames corresponding to the one or more entity-related control conditions;
- the apparatus includes: a parsing and scheduling component, a control input generation component, and a logic determination component;
- the parsing and scheduling component is configured to parse the interactive action flow, to execute the start behavior frame when entering the interactive action flow, and, when entering a logical control frame, to cause the control input generation component to generate a control input based on the entity-related data obtained by the input unit and to cause the logic determination component to determine, based on the logical control frame, the subsequent behavior frame that matches the control input.
- the apparatus further includes: one or more basic behavior execution components, each basic behavior execution component being used to perform a corresponding basic behavior; wherein each behavior frame corresponds to a basic behavior or to a plurality of basic behaviors constrained by temporal logic; and wherein the parsing and scheduling component is configured to schedule the basic behavior execution components based on the behavior frame so as to perform the one or more basic behaviors included in the behavior frame according to the temporal logic constraints.
- the logical control frame and the previous behavior frame connected to the logical control frame are defined as an interactive action item, each interactive action item having a unique identification; wherein the subsequent behavior frame corresponding to a control condition in the logical control frame is referred to by the unique identifier of the interactive action item that contains that subsequent behavior frame.
- a method for controlling a machine device comprising an input unit for retrieving entity-related data and an output unit for performing an action
- the apparatus performs an interactive action flow; wherein the interactive action flow includes behavior frames and logical control frames, the behavior frames include a start behavior frame, each logical control frame is connected to one or more previous behavior frames and one or more subsequent behavior frames, and the logical control frame includes one or more entity-related control conditions and the subsequent behavior frames corresponding to the one or more entity-related control conditions; wherein the method includes: parsing the interactive action flow; executing the start behavior frame when entering the interactive action flow; upon entering a logical control frame, generating a control input based on the entity-related data obtained by the input unit; and determining, based on the logical control frame, the subsequent behavior frame that matches the control input, and performing that subsequent behavior frame.
- each of the behavior frames corresponds to a basic behavior or to a plurality of basic behaviors constrained by temporal logic; wherein performing the behavior frame comprises: scheduling, based on the behavior frame, the basic behavior execution components used to perform basic behaviors, so as to perform the one or more basic behaviors contained in the behavior frame according to the temporal logic constraints, wherein each basic behavior execution component is used to perform a corresponding basic behavior.
- the logical control frame and the previous behavior frame connected to the logical control frame are defined as an interactive action item, each interactive action item having a unique identification; wherein the subsequent behavior frame corresponding to a control condition in the logical control frame is referred to by the unique identifier of the interactive action item that contains that subsequent behavior frame.
- the execution of the interactive action flow can begin from a specified behavioral frame upon entering the interactive action flow.
- the specified behavior frame can be referred to by the unique identifier of the interactive action item that contains the behavior frame.
- the actions performed by the above machine device may include action-type actions and/or motion-type actions.
- the basic behavior includes user-oriented interactions.
- control input is generated in accordance with a predefined plurality of data elements, and the control input can include data for one or more data elements.
- the condition data may include conditions set based on one or more data elements; the data elements of the control input correspond to the data elements of the condition data.
- the basic behavior is defined by a behavior name and a behavior control parameter
- performing the basic behavior includes: determining, based on the behavior name, a component for performing the basic behavior, and passing the behavior control parameters to the component corresponding to the behavior name, so that the component corresponding to the behavior name generates an action based on the behavior control parameters.
- the entity-related data includes at least one or any combination of visual, or audible, or gesture, or haptic.
- the machine device can perform logical interaction with the entity, and at least adaptively jump to the corresponding process based on the entity-related response, thereby improving the interactive experience.
- a software is provided for performing the technical solutions described in the above embodiments and preferred embodiments.
- a storage medium storing the above-described software is provided, including but not limited to: an optical disk, a floppy disk, a hard disk, an erasable memory, and the like.
- FIG. 1 is a schematic diagram of an example of a machine device 100
- FIG. 3 is a schematic structural diagram of a system for controlling an action of the machine device 100;
- FIG. 4 is a flow chart of a method for controlling an execution of an interactive action flow of a machine device 100;
- FIG. 5 is a schematic diagram of a communication system 200 of a machine device 100
- FIG. 6 is another schematic diagram of a communication system 200 of the machine device 100;
- FIG. 7 is a block diagram showing the structure of an apparatus 700 for performing an interactive action flow
- FIG. 8 is a flow chart of a method of controlling a machine device 100 to perform an interactive action flow.
- FIG. 1 is a schematic diagram of an example of a machine device 100.
- the machine apparatus 100 can include an input unit 110 and an output unit 120, and a computing unit 130 coupled to the input unit 110 and the output unit 120.
- the input unit 110 can be configured to acquire data related to an entity such as a human user, a service, or a robot.
- the input unit 110 may include a voice input unit.
- the voice input unit may include a microphone and an audio circuit, and may also include a voice recognition module.
- the microphone is an energy conversion device that converts a sound signal into an electrical signal
- the audio circuit converts the electrical signal into audio data, and transmits the audio data to the computing unit 130 for processing.
- the speech recognition module is operative to convert the speech signal into corresponding text or commands through an identification and understanding process, which may include software and/or hardware modules.
- the speech recognition module can be a module that communicates with a speech recognition engine on the network and sends the speech signal to that speech recognition engine.
- the speech recognition engine converts the speech signal into a corresponding text or command and sends the converted text or command to the speech recognition module.
- the speech recognition module may also include a local speech recognition engine of the machine device 100, which locally converts the speech signal into corresponding text or commands. It should be understood that the implementation of the speech recognition module is not limited thereto, and other ways of converting the speech signal into corresponding words or commands are also possible.
- the microphone may comprise a single microphone, but may also comprise a microphone array consisting of a plurality of microphones, which may employ known techniques.
- the microphone array can combine the signals of the multiple microphones into one signal; collecting signals with multiple microphones makes it possible to enhance or suppress sound coming from a particular direction.
- the direction of the sound source of the voice signal, and changes in that direction, can also be analyzed by applying beamforming in the time domain and spatial filtering; these analyses can represent the intensity and angle of the speech signal as a beam pattern on a polar plot, so the machine device 100 can determine the orientation of the sound source.
- the speech input unit may also include a voiceprint recognition module for identifying the speaker, a software module and/or a hardware module that may include known recognition algorithms.
- Input unit 110 can include a touch-sensitive input unit, such as a touch-sensitive display, or a touchpad or the like.
- the touch-sensitive input unit is for user input, and one or a combination of a contact position of the user with the touch-sensitive input unit, or a contact gesture, or a press strength, etc., may correspond to a corresponding input.
- the contact position includes a physical position of the touch-sensitive input unit, or a position at which an image is displayed on a touch-sensitive input unit (for example, a touch-sensitive display), and the like, but is not limited thereto.
- the input unit 110 may include an image input unit, which includes a video/image processing system such as a camera or video camera.
- the video processing system can include an image recognition module that can identify objects of various different types, such as people, objects, or animals in an image, based on processing, analyzing, and understanding the image.
- the image recognition module can transmit the image data to an image recognition engine on the network, perform recognition processing by an image recognition engine on the network, and receive the recognition result returned by the image recognition engine.
- the image recognition module may also include a local image recognition engine that performs image recognition locally on the machine device 100.
- the video processing system may also include a video compression module that may include software modules and/or hardware modules of known algorithms for compressing images/video captured by the camera for transmission or display. It should be understood that the implementation of the video processing system is not limited in this respect, and other approaches are possible.
- the image recognition module can identify objects contained in the image, such as identifying a person, or an animal, or an item, etc. in the image.
- the image recognition module can also recognize the face to determine the identity of the user being served; the image recognition module can also identify the age of the person in the image, etc., and details are not described herein.
- the image recognition module can be identified using known algorithms, which can include software and/or hardware modules.
- the input unit 110 can include a button input unit that can generate a corresponding input by turning the control circuit on or off, the button can correspond to a configured input or command.
- input unit 110 may include an input unit such as a keyboard, joystick, etc. for generating an input or command corresponding thereto.
- the input unit 110 may include one or more sensor modules, such as an infrared sensor, or a light sensor, or a distance sensor, or a noise sensor, or a temperature sensor, or a motion detection sensor, or the like. Or one or more sensor components, including a voice recognition module, an image recognition module, an event detection module, and the like.
- the input unit 110 may include a unit that reads information from the local storage device of the machine device 100 such that the machine device 100 is capable of reading entity related data.
- the local storage device can store data from an external sensor module associated with the machine device 100, or data acquired by the machine device 100 from the network.
- Input unit 110 may include one or more agents of sensor devices communicatively coupled to machine device 100 and disposed in a physical environment, which may acquire data from or transmit commands to the sensor device.
- the sensor device in the physical environment may include at least one or any combination of a temperature sensor, or a humidity sensor, or a barometric pressure sensor, or a motion sensor, etc., but is not limited thereto.
- the input unit 110 may include any one of the above input units or a combination of several of them.
- the input unit 110 can transmit the acquired information to the computing unit 130 for processing by the computing unit 130.
- input unit 110 may communicate the retrieved raw data to computing unit 130 for processing by computing unit 130.
- input unit 110 may process the raw data to provide entity-related high-level or fine-grained data for further processing by computing unit 130.
- some components of the input unit 110 can also be used to perform actions; for example, the camera or video camera in the image input unit can be used to take a photo or record video so as to output video/images.
- the microphone in the voice input unit can be used to record audio to output audio/text.
- the input unit 110 may include a unit that writes information to a storage device (e.g., a memory space or a database, etc.) that can perform an action of recording information. It should be understood that other input units 110 may also be used to perform actions, which are not enumerated here.
- Output unit 120 can include one or more actuators.
- the one or more actuators can each include a drive system and an actuating part, the drive system driving the corresponding actuating part to generate an action.
- the drive system can receive the instructions transmitted by the computing unit 130 and can feed back information to the computing unit 130.
- the drive system may include a drive device (such as a motor) and an optional transmission mechanism; the drive device drives the transmission mechanism, the transmission mechanism is coupled to the actuating part, and the actuating part generates motion based on the motion of the transmission mechanism.
- the one or more actuators include one or more mechanical actuators.
- the drive device can drive the actuator to generate motion under the control of the drive command of the computing unit 130.
- Machine device 100 can include a robot.
- one or more motion actuators can include a head and up to four limbs.
- the relevant parts of the machine device 100 may be referred to as a base, a waist, an arm, a wrist, a hand (a clamp or end effector, etc.), a walking portion (for a mobile robot), and so on.
- one or more actuators include an actuator that performs a particular task.
- the one or more motion actuators may perform actions under the control of the computing unit 130 to accomplish a particular task, but are not limited thereto.
- Output unit 120 can also include a display unit that includes a display screen and a known computer program module or set of instructions for displaying graphics.
- the display unit is used to display graphics, and it should be understood that "graphics” includes visual information such as "text, image, link, graphical user interface (GUI)".
- the output unit 120 may further include a voice output unit for outputting a voice signal.
- the voice output unit may include an audio circuit, a speaker, a voice synthesis module, and the like.
- the audio circuit is for receiving audio data, converting the audio data into an electrical signal, and transmitting the electrical signal to a speaker.
- the speaker converts the electrical signal into an acoustic signal.
- the speech synthesis module can include software components and/or hardware components for synthesizing text into corresponding audio signals.
- the computing unit 130 can control the voice output unit to output an acoustic signal, and the remote service response can include text and/or audio data corresponding to the acoustic signal to be output.
- the output unit 120 may further include an audio output unit for acquiring audio data generated by a microphone or the like, and outputting the audio data to a storage device or the like, but is not limited thereto.
- the output unit 120 may further include an image output unit that acquires image or video data and outputs the image or video data by storing or transmitting it, but is not limited thereto.
- the output unit 120 may further include a light output unit for outputting an optical signal.
- the light output unit may include an LED lamp or the like, and the LED lamp may convert the electrical signal into an optical signal under the control of the computing unit 130.
- the LED light can be set in the eye of the robot for visual output of emotions and the like, and the light signal mode corresponds to emotion, but is not limited thereto, which is merely illustrative.
- Output unit 120 may also include a signal modulation and transmission unit.
- the signal modulation and transmission unit can be used to modulate and transmit signals for controlling other devices, which can generate control commands in accordance with a command format of the controlled device, or a communication protocol with the controlled device, etc., but are not limited thereto.
- the output unit 120 can include a unit that reads information from a storage device (eg, a memory space or a database, etc.) that can be used to perform data reading behavior.
- the machine device 100 may include any of the above output units or a combination thereof, and the output unit 120 may include software components and/or hardware components, which are not limited in this embodiment.
- the computing unit 130 can include at least one processor, one or more storage devices, and one or more computer programs or sets of instructions stored in the storage devices; the computing unit 130 is coupled to the input unit 110 and the output unit 120, but is not limited thereto, and other computing units are also possible.
- the one or more storage devices include local storage devices and/or storage devices located on the network, the at least one processor having access to the storage devices located on the network to read information from or write information to them.
- the basic behavior is defined by the behavior name and behavior control parameters.
- Computing unit 130 may include one or more components comprised of software and/or hardware for performing basic behavior. The computing unit 130 may determine a component for performing a basic behavior based on the behavior name, and deliver the behavior control parameter to a component corresponding to the behavior name such that the component corresponding to the behavior name generates an action based on the behavior control parameter.
- the storage device may store a control entry library 140 for controlling the machine device 100 to generate an action, the control entry library 140 containing a plurality of control entries 141.
- the control item 141 is composed of condition data 142 and control data 143 corresponding to the condition data 142, which defines an action to be performed by the machine device 100.
- the control data 143 includes an interactive action flow constrained by event logic, the interactive action flow comprising behavior frames and logical control frames, wherein the behavior frames include a start behavior frame, each logical control frame is connected to one or more previous behavior frames and one or more subsequent behavior frames, and the logical control frame includes one or more entity-related control conditions and the subsequent behavior frames corresponding to the one or more entity-related control conditions.
- the condition data may include information of a plurality of dimensions, such as what the entity said, the detected posture of the entity, a contact of the entity at the touch-sensitive input unit of the machine device 100 (which may include location information), a press of a button input unit of the machine device 100, the voiceprint of the entity, the gender of the entity, the identity of the entity, etc., or one or any combination thereof, but is not limited thereto.
- condition data 142 may contain information for multiple dimensions that may be consistent with the information dimension of the control input.
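- Putting these pieces together, a control entry pairs condition data (possibly spanning several dimensions, such as what the entity says) with control data that may itself be, or refer to, an interactive action flow. The following is a minimal sketch under the same assumed field names; the values are illustrative only.

    # Hypothetical control entry: condition data plus the control data it selects.
    control_entry = {
        "condition_data": {
            "audio_speak_txt": "I am thirsty",   # what the entity says
            "vision_human_posture": "posture1",  # a further optional dimension
        },
        "control_data": {
            "type": "interactive_action_flow",   # data type used to decide how to execute
            "flow_id": "order_drink",            # reference to the flow by its unique identification
        },
    }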
- the storage device of the machine device 100 can also store control data corresponding to the interactive action flow, and the computing unit 130 can acquire corresponding control data based on the unique identification of the interactive action flow to perform an interactive action flow.
- Each of the behavior frames 145 may correspond to a basic behavior or a plurality of basic behaviors that are logically constrained by time, each of which may be directly performed by the machine device 100.
- the one or more basic behaviors can be predefined behaviors.
- the basic behavior can be defined by a behavior name and behavior control parameters, which can include an execution object that performs the behavior and the parameters required by the execution object to perform the behavior.
- the execution object may include a motor that drives the actuator, and the corresponding execution parameters may include at least one or any combination of the speed of movement of the motor, or the angle of motion of the motor.
- the execution object may include a function module such as a photographing, and the corresponding execution parameter may include at least one or any combination of whether the camera flash is turned on, or the time when the photographing is started.
- the execution object may include playing music, and the corresponding execution parameters may include at least one or any combination of a link of music to be played, a volume of playing music, and the like.
- Basic behaviors can include, for example:
- behavior name: audio_speak; behavior control parameters include: text (the content to be spoken), volume (speech volume), etc.
- behavior name: audio_sound_music; behavior control parameters include: path (the path of the music to be played), volume (playback volume), etc.
- behavior name: audio_sound_info; behavior control parameters include: name (the name of the tone to be played), volume (playback volume), etc.
- motion behaviors such as motion_elbow, motion_waist, and motion_eye, as well as several further motion behaviors; for each of these, the behavior control parameters include: motor (the motor that performs the motion), velocity (motor speed), angle (motor motion angle), etc.
- behavior name: display_emotion; behavior control parameters include: content (the emoticon content to be displayed), velocity (display speed), etc.
- behavior name: program_photo; behavior control parameters include: flash (whether the flash is turned on), etc.
- Each of these basic behaviors can be expressed in JSON, as sketched after this list.
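- Purely as an illustration, and assuming a simple name-plus-parameters layout, a basic behavior defined in this way might be expressed as follows (Python dict literals mirroring the JSON form; the parameter values are made up for the example).

    # Hypothetical JSON-style expressions of two basic behaviors; the layout is an assumption.
    speak_behavior = {
        "behavior": "audio_speak",
        "params": {"text": "Very happy to see you", "volume": 60},
    }
    elbow_behavior = {
        "behavior": "motion_elbow",
        "params": {"motor": "elbow", "velocity": 30, "angle": 45},
    }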
- the plurality of basic behaviors in a behavior frame that are constrained by temporal logic may include: a sequence of basic behaviors, and sets of basic behaviors nested within that sequence, combined from basic behaviors according to the temporal logic constraints.
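- A behavior frame that combines several basic behaviors under temporal logic constraints can then nest a set of simultaneously performed behaviors inside a sequence, for example speaking while moving the elbow and then showing an emotion. The sketch below assumes that the temporal logic is expressed by "sequence" and "together" keys; that layout is an assumption of this example rather than one given in the publication.

    # Hypothetical behavior frame: a sequence whose first step is a nested set of
    # basic behaviors performed together; keys and values are illustrative.
    behavior_frame = {
        "sequence": [
            {"together": [
                {"behavior": "audio_speak",
                 "params": {"text": "Very happy to see you", "volume": 60}},
                {"behavior": "motion_elbow",
                 "params": {"motor": "elbow", "velocity": 30, "angle": 45}},
            ]},
            {"behavior": "display_emotion",
             "params": {"content": "smile", "velocity": 1}},
        ],
    }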
- the computing unit 130 may generate a control input for determining an action to be performed by the machine device 100 based on the entity-related data acquired by the input unit 110.
- the control input can include information of one or more dimensions associated with an entity (such as a human user or robot, etc.), such as what the entity says.
- the calculating unit 130 may perform analysis processing on the entity-related data, determine the value of the predefined data unit, and obtain a control input from the value of the at least one data unit.
- the control input may include information of a plurality of dimensions, such as what the entity said, the detected posture of the entity, a contact of the entity at the touch-sensitive input unit of the machine device 100 (which may include location information), a press of a button input unit of the machine device 100, the voiceprint of the entity, the gender of the entity, the identity of the entity, and the like, or one or any combination thereof.
- the information of the various dimensions in the control input may be formatted; for example, a gesture of the entity may be represented as a character string representing the gesture; a contact at the touch-sensitive input unit of the machine device 100 may be expressed as an identification of the contact location, or the magnitude of the contact intensity, etc.; the content spoken by the entity may be recognized as the corresponding text; and the emotion of the entity may be represented as a character string representing the emotion.
- the computing unit 130 may compare at least one dimension of the control input with the condition data 142 in a control entry 141, or with the control conditions in a logical control frame, to obtain a corresponding result. For example, the text of the content spoken by the entity may be compared using an edit distance or a semantic distance to determine the degree of matching; for a gesture, the control input may be compared with the character string representing the gesture in the condition data 142 or control condition to determine whether the gestures are consistent. Information of other dimensions may be judged in corresponding ways, and the details are not described again here.
- the computing unit 130 may analyze the entity-related data to obtain high-level conclusions, such as analyzing the content said by the entity and obtaining the meaning expressed by the entity (for example, "yes", "no", or "I don't understand"). For example, "good", "all right", or "may" can correspond to the meaning of "yes", while "not right", "don't", or "no need" can correspond to the meaning of "no".
- computing unit 130 may determine behavior to be performed by machine device 100 based on the generated control inputs. For example, a control entry 141 that matches the control input is searched in the control entry library 140. The calculation unit 130 may match the generated control input with the condition data 142 in the control entry 141 to obtain a control entry 141 corresponding to the generated control input, thereby obtaining control data 143 that controls the machine device 100 to generate an action.
- the control input can include what the entity (such as a human user or robot, etc.) speaks.
- the control data 143 can include a variety of data types, including the interactive action stream 144 in some examples, and the computing unit 130 can determine whether the control data 143 belongs to the interactive action stream 144 based on the data type of the control data 143. The computing unit 130 determines to proceed to the execution of the interactive action stream 144 when the control data 143 belongs to the interactive action stream 144.
- control input is generated in accordance with a predefined plurality of data elements, and the control input can include data for one or more data elements.
- the condition data 142 may include conditions based on data element settings corresponding to the data elements of the control input.
- the data elements of the control input may be consistent or have a corresponding relationship with the data elements of the condition data 142.
- a plurality of data elements may be pre-defined; it should be understood that the exemplary data elements described below do not limit the division of data elements, the number of data elements, or the definition of any data element, and in fact any division of data elements can be considered. Examples of data elements are shown in Table 1.
- it should also be understood that the examples do not limit the number of perceptual data elements, the definition of the perceptual data elements, or the format of the perceptual data.
- the perceptual data of one example case can be expressed in JSON as sketched below, but is not limited thereto, and other representations are also possible.
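- Assembled from the field-by-field description that follows, such perceptual data might look like the sketch below (a Python dict literal mirroring the JSON; the exact layout is an assumption).

    # Hypothetical reconstruction of the example perceptual data described below.
    perceptual_data = {
        "vision_human_position": "back",   # the human user is behind the machine device
        "sensing_touch": ["hand"],         # touch position(s); the value may be an array
        "audio_speak_txt": "very happy to see you",
        "audio_speak_language": "chinese",
        "vision_human_posture": "posture1",
        "system_date": "2016/3/16",
        "system_time": "13-00-00",
        "system_power": "80%",
    }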
- “vision_human_position” records that the human user is behind ("back") relative to the machine device 100; "back" can also be represented by other characters, as long as different positions can be distinguished.
- the position can also be represented by an “angle value” such as “vision_human_position”: “45°” or the like.
- “sensing_touch” records the touch of a human user.
- the position of the touch is the hand (“hand”).
- "hand" can also be represented by other characters, as long as different positions can be distinguished; it should be understood that there can be multiple touch positions, and the value of sensing_touch can be an array that records multiple locations.
- “audio_speak_txt” records what the human user said “very happy to see you”, and the content can also be audio data.
- “audio_speak_language” records the language “chinese” spoken by human users.
- “vision_human_posture” records the human user's gesture “posture1”, and “posture1” can also be represented by other characters, which can distinguish different postures.
- “system_date” records the date “2016/3/16” of the generation of the perceptual data.
- “system_time” records the time “13-00-00” of the perceptual data generation.
- “system_power” records the power "80%”, it should be understood that the power can also be identified in other ways.
- the computing unit 130 may control the output unit 120 and/or the input unit 110 to perform a starting behavior frame upon entering the interactive action stream 144.
- the computing unit 130 may, upon entering a logical control frame, generate a current control input based on the entity-related data acquired by the input unit 110, and determine, based on the logical control frame, the subsequent behavior frame that matches the current control input; if a subsequent behavior frame matching the current control input is determined, the output unit 120 and/or the input unit 110 are controlled to perform the behavior defined by the determined behavior frame.
- the computing unit 130 can match the generated control input against the control conditions in the logical control frame; when the control input matches the information of at least some dimensions of a control condition in the logical control frame, the control input and that control condition are determined to match, the behavior frame corresponding to the matched control condition is then taken as the subsequent behavior frame, and that subsequent behavior frame can be executed.
- the computing unit 130 schedules components for performing basic behavior based on one basic behavior corresponding to the behavior frame or a plurality of basic behaviors that are temporally constrained to control the actions of the output unit 120 and/or the input unit 110 to perform the behavior frame.
- the storage module of computing unit 130 may store a plurality of software components and/or hardware components for performing basic behavior.
- the computing unit 130 can include a component for moving the head; the basic behavior can include control of the head, and the corresponding control parameters can include the head motor being controlled, the speed of movement of the head motor, the angle of motion of the head motor, etc.
- computing unit 130 may include software components for photographing, and corresponding control parameters may include whether to turn on the camera flash, or when the photographing begins.
- the control script can be contained directly in the control data, and the computing unit 130 can be used to directly parse the control script in the control data.
- the control data is only a reference to the corresponding control script, and the calculation unit 130 can be used to obtain a control script corresponding thereto based on the reference.
- the logical control frame further includes one or more links to other interactive action flows corresponding to the entity-related input data, or links to other interactive action flows corresponding to control conditions related to one or more previous behavior frames.
- the computing unit 130 can obtain an interactive action flow based on the link of the interactive action flow and enter an execution process of the interactive action flow.
- the links of other interactive action flows may include a unique identification of the interactive action flow, or the unique identification and behavioral frame specified when entering the interactive action flow.
- Computing unit 130 may begin execution of the interactive action flow from the specified behavior frame.
- the control condition may include a yes-or-no decision condition: when the control input is "yes", the subsequent behavior frame corresponding to "yes" is used; when the control input is "no", the subsequent behavior frame corresponding to "no" is used; and when the control input is neither "yes" nor "no", the subsequent behavior frame corresponding to "other" is used.
- the control condition may include a selection event of a plurality of options corresponding to a subsequent behavior frame corresponding to the one or more "options” when the control input is one or more "options”.
- the control condition may include a waiting-timeout condition: when the control input is "waiting timed out", it corresponds to the subsequent behavior frame corresponding to "waiting timed out", such as exiting the execution of the interactive action flow, or repeating the last behavior frame.
- the control condition may include a specific single control input, such as the human user inputting personal information (for example, a name or an address).
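- a minimal sketch of how the kinds of control conditions just listed might be matched against a control input is shown below; the dictionary layout and condition keys ("yes", "no", "other", "wait_timeout") are illustrative assumptions rather than the disclosed format.

```python
# Minimal sketch: resolve the subsequent behavior frame for a control input.
# The condition keys and frame identifiers are illustrative assumptions.

def resolve_subsequent_frame(logical_control_frame, control_input):
    """Return the identifier of the subsequent behavior frame, or None."""
    conditions = logical_control_frame["ifs"]   # control condition -> goto target
    if control_input in conditions:             # "yes", "no", an option, "wait_timeout", ...
        return conditions[control_input]
    return conditions.get("other")              # fallback branch, if defined

flow_map = {"ifs": {"yes": "item1", "no": "item2",
                    "wait_timeout": "exit", "other": "item3"}}

print(resolve_subsequent_frame(flow_map, "yes"))           # item1
print(resolve_subsequent_frame(flow_map, "maybe later"))   # item3 (the "other" branch)
```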
- the logical control frame may further include one or more control conditions associated with the previous behavior frames and the subsequent behavior frames corresponding to those control conditions.
- the control input may also be determined based on the output result of the one or more previous behavior frames, the subsequent behavior frame matching the control input is determined based on the logical control frame, and the output unit 120 and/or the input unit 110 are controlled to execute the subsequent behavior frame.
- the control conditions in the logical control frame may include high-level conclusions: for example, a high-level meaning of "yes"/"affirmative" may be defined as "super_yes", a high-level meaning of "no"/"negative" may be defined as "super_no", and a high-level meaning of "unclear" may be defined as "unknown".
- the calculation unit 130 can be used to generate high level conclusions to generate control inputs including high level conclusions. For example, based on semantic analysis, the content expressed by the entity is determined to be “yes”/“affirmative”, or “no”/“negative”, or “unclear”, and the like.
- the control conditions in the logical control frame may also include structural data, that is, information in a particular form, such as a name, an identification number, a phone number, an address, and the like.
- the calculating unit 130 may determine whether the control input includes structural data corresponding to the control condition; for example, if the control input is "135120000" and the structural data of the control condition is "mobile phone number", the calculating unit 130 may determine, according to the structure of a "mobile phone number", that "135120000" in the control input is not a mobile phone number, so the control input does not match the control condition.
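- the check described above can be illustrated with a small sketch that tests whether a control input contains data with the structure of a mobile phone number; the 11-digit pattern starting with 1 (and the identification-number pattern) used here are assumptions about what those structures mean, not part of the disclosure.

```python
import re

# Minimal sketch: check whether a control input contains structural data of a
# given kind. The regular expressions are illustrative assumptions.
STRUCTURAL_PATTERNS = {
    "mobile_phone_number": re.compile(r"\b1\d{10}\b"),   # assumed 11-digit format
    "id_number": re.compile(r"\b\d{17}[\dXx]\b"),        # assumed 18-character format
}

def matches_structural_condition(control_input, structure_name):
    pattern = STRUCTURAL_PATTERNS[structure_name]
    return pattern.search(control_input) is not None

# "135120000" has only 9 digits, so it does not match the assumed
# mobile-phone-number structure and the control condition is not satisfied.
print(matches_structural_condition("135120000", "mobile_phone_number"))    # False
print(matches_structural_condition("13512000000", "mobile_phone_number"))  # True
```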
- the calculation unit 130 may continue to generate, based on the entity-related data acquired by the input unit 110, a control input for determining an action of the machine device 100, determine an action to be performed by the machine device 100 based on the control input, and control the output unit 120 and/or the input unit 110 to perform the action.
- control input can include what the entity says
- behavior frame can include an act of causing the machine device 100 to present (such as playing a sound or displaying a graphic, etc.) the content to be spoken by the machine device 100.
- the machine device 100 can therefore have an interactive dialogue with the entity.
- behaviors to be performed by the machine device 100 accompanying the content to be spoken, such as motion of a mechanical actuator or output of a display unit, may also be included in the behavior frame.
- the logical control frame and the previous behavioral frame connected thereto can be defined as interactive action items, each interactive action item having a unique identification.
- the interactive action item may only include behavioral frames when needed. Subsequent behavioral frames in the logical control frame that correspond to the control conditions may be referred to by a unique identification of the interactive action item that includes the subsequent behavior frame.
- Computing unit 130 may determine a behavior frame to be performed based on the unique identification of the interaction item in the logical control frame.
- the flow by which the calculation unit 130 controls the machine device 100 to generate an action is described below. FIG. 2 is a flow chart for controlling the behavior of the machine device 100.
- a control input (201) for determining an action to be performed by the machine device 100 is generated based on the entity-related data acquired by the input unit 110.
- by way of non-limiting example, the entity-related data may include what the entity said, as well as other data related to the entity.
- Control inputs can include information in one or more dimensions.
- An action to be performed by machine device 100 is determined based on the generated control input (202).
- a control entry that matches the control input can be searched for; as described above, the control entry 141 can be composed of condition data 142 and control data 143 corresponding to the condition data 142, and the control data may include the above-described interactive action flow constrained by event logic.
- for the interactive action flow, refer to the foregoing description; details are not described herein again.
- upon entering the interactive action flow, the output unit 120 and/or the input unit 110 are controlled to perform the start behavior frame (203). Upon entering the logical control frame, a control input is determined based on the current entity-related data, a subsequent behavior frame matching the control input is determined based on the logical control frame, and the output unit 120 and/or the input unit 110 are controlled to perform the subsequent behavior frame (204).
- the logical control frame may further include one or more links to other interactive action flows corresponding to the entity-related input data, or links to other interactive action flows corresponding to control conditions associated with one or more previous behavior frames.
- Links to other interactive action flows may include a unique identification of the interactive action flow, or the unique identification and behavioral frame specified when entering the interactive action flow. When entering an interactive action flow, the execution of the interactive action flow can begin from the specified behavior frame.
- the control input, when entering the logical control frame, may be determined based on an output result of the previous behavior frame included in the logical control frame; the subsequent behavior frame matching the control input is determined based on the logical control frame, and the output unit 120 and/or the input unit 110 are controlled to perform the subsequent behavior frame.
- the control condition included in the logical control frame may include whether the execution of the previous behavior frame succeeded: when the execution result of the previous behavior frame is "successful", it corresponds to one subsequent behavior frame, and when the execution result of the previous behavior frame is "failed", it corresponds to another subsequent behavior frame, but this is not a limitation.
- components for performing basic behaviors may be scheduled according to the one basic behavior corresponding to the behavior frame, or the plurality of basic behaviors constrained by temporal logic, to control the output unit 120 and/or the input unit 110 to perform the action of the behavior frame.
- Each component that performs the basic behavior is used to perform the corresponding basic behavior.
- the control script of the one basic behavior corresponding to the behavior frame, or of the plurality of basic behaviors constrained by temporal logic, may be directly parsed from the control data, and the components performing the basic behaviors are scheduled according to the temporal logic so that the output unit 120 and/or the input unit 110 perform the action of the behavior frame.
- alternatively, the control data may contain a reference to the control script of the one basic behavior corresponding to the behavior frame, or of the plurality of basic behaviors constrained by temporal logic, and the control script corresponding to the reference may be obtained based on the reference; the components performing the basic behaviors are then scheduled according to the temporal logic to control the output unit 120 and/or the input unit 110 to perform the action of the behavior frame.
- the temporal logic constraints may include: one or more basic behaviors that begin execution at the same time; one or more basic behaviors of the next time node that begin execution after a predetermined time; one or more basic behaviors of the next time node that begin execution after the execution of one or more basic behaviors ends; or a plurality of basic behaviors distributed, according to a "timeline", at corresponding time points on the timeline, with the corresponding basic behavior performed when its time point arrives.
- the time logic may be one or a combination of the above, but is not limited thereto.
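- a minimal sketch of one way the timeline-style constraint above could be interpreted is given below; the schedule format (a list of (start offset in seconds, behavior) pairs) is an assumption made for illustration.

```python
import time

# Minimal sketch of a timeline-style scheduler: each basic behavior is attached
# to a time point (an offset in seconds from entering the behavior frame), and
# behaviors that share an offset start together. The schedule format is assumed.

def run_behavior_frame(schedule):
    """schedule: list of (offset_seconds, behavior_callable) pairs."""
    start = time.monotonic()
    for offset, behavior in sorted(schedule, key=lambda item: item[0]):
        delay = offset - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)   # wait until this behavior's time point arrives
        behavior()

if __name__ == "__main__":
    # 0 s: speak; 1 s: play music; 2 s: head and neck move simultaneously.
    run_behavior_frame([
        (0, lambda: print("speak: How are you?")),
        (1, lambda: print("play: happy.mp3")),
        (2, lambda: print("move head motor 1 to 45 degrees")),
        (2, lambda: print("move neck motor 1 to 60 degrees")),
    ])
```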
- the interactive action flow execution process is exited.
- the interactive action flow can include an end tag, and the end of execution of the interactive action flow can be determined based on the end tag.
- FIG. 3 is a schematic structural diagram of a system for controlling the operation of the machine device 100.
- a system for controlling the machine device 100 to generate an action includes one or more computer program modules (or sets of instructions) that can be stored in a storage device and that, when executed by one or more processors of the computing unit 130, implement the methods described above.
- a system for controlling the machine device 100 to generate an action may include an input generation module 131, a control generation module 132, and a control module 133.
- the input generation module 131 is configured to generate a control input for determining an action to be performed by the machine device 100 based on the entity-related data acquired by the input unit 110.
- Data related to an entity may include what the entity said, and accordingly, the control input may include what the entity said.
- the entity-related data may include data of a plurality of dimensions, and the control input may include information of a plurality of dimensions; each dimension of information in the control input may be determined based on entity-related data of one or more dimensions.
- the control generation module 132 is configured to determine an action to be performed by the machine device 100 based on the control input generated by the input generation module 131, wherein the action to be performed by the machine device 100 includes an interactive action flow constrained by event logic; the interactive action flow includes behavior frames and logical control frames, the behavior frames may include a start behavior frame, each logical control frame is connected to one or more previous behavior frames and one or more subsequent behavior frames, and the logical control frame includes one or more entity-related control conditions and the subsequent behavior frames corresponding to the one or more entity-related control conditions.
- the control generation module 132 may search for a control entry that matches the control input; as described above, the control entry 141 may be composed of condition data 142 and control data 143 corresponding to the condition data 142, and the control data may include the above-described interactive action flow constrained by event logic.
- the control module 133 is configured to, when entering the interactive action flow, control the output unit 120 and/or the input unit 110 to execute the start behavior frame; when entering the logical control frame, determine the control input based on the current entity-related data, determine, based on the logical control frame, a subsequent behavior frame matching the control input, and control the output unit 120 and/or the input unit 110 to perform the subsequent behavior frame.
- the control module 133 can match the generated control input against the control conditions in the logical control frame, and determine that the control input matches a control condition when the control input matches at least a portion of the information of that control condition; the behavior frame corresponding to the matched control condition is then used as the subsequent behavior frame, and the subsequent behavior frame can be executed.
- the control module 133 can be configured to, when entering the logical control frame, determine a control input based on an output result of the previous behavior frame included in the logical control frame, determine, based on the logical control frame, a subsequent behavior frame matching the control input, and control the output unit 120 and/or the input unit 110 to perform the subsequent behavior frame.
- the control condition included in the logical control frame may include whether the execution of the previous behavior frame succeeded: when the execution result of the previous behavior frame is "successful", it corresponds to one subsequent behavior frame, and when the execution result of the previous behavior frame is "failed", it corresponds to another subsequent behavior frame, but this is not a limitation.
- the control conditions in the logical control frame may include high-level conclusions: for example, a high-level meaning of "yes"/"affirmative" may be defined as "super_yes", a high-level meaning of "no"/"negative" may be defined as "super_no", and a high-level meaning of "unclear" may be defined as "unknown".
- Control module 133 can be used to generate high level conclusions to generate control inputs that include high level conclusions. For example, based on semantic analysis, the content expressed by the entity is determined to be “yes”/“affirmative”, or “no”/“negative”, or “unclear”, and the like.
- the control module 133 may perform semantic analysis using a semantic analysis algorithm to produce the high-level meaning.
- the control module 133 can also serve as an interface that communicates with a remote semantic analysis server: it can send text to the semantic analysis server, the semantic analysis server performs semantic analysis on the text to obtain the high-level meaning, and the control module 133 obtains the analysis result from the semantic analysis server. It should be understood that this embodiment is by way of example only and is not a limitation of the control module 133.
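- the sketch below shows one way a control module might map an entity's utterance to the high-level conclusions "super_yes", "super_no", or "unknown", either locally with a keyword heuristic or by deferring to a remote semantic analysis server; the keyword lists and the server URL are purely hypothetical and not part of the disclosure.

```python
import json
import urllib.request

# Minimal sketch: derive a high-level conclusion from the entity's utterance.
# The keyword heuristic and the server endpoint are hypothetical assumptions.
AFFIRMATIVE = {"yes", "sure", "ok", "of course"}
NEGATIVE = {"no", "nope", "not really"}

def high_level_conclusion_local(text):
    lowered = text.strip().lower()
    if any(word in lowered for word in AFFIRMATIVE):
        return "super_yes"
    if any(word in lowered for word in NEGATIVE):
        return "super_no"
    return "unknown"

def high_level_conclusion_remote(text, server_url="http://semantic.example/api/analyze"):
    # Hypothetical endpoint: POST the text, receive {"conclusion": "..."}.
    request = urllib.request.Request(
        server_url,
        data=json.dumps({"text": text}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())["conclusion"]

print(high_level_conclusion_local("Sure, why not"))   # super_yes
print(high_level_conclusion_local("Hmm, maybe"))      # unknown
```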
- the control conditions in the logical control frame may also include structural data, that is, information in a particular form, such as a name, an identification number, a phone number, an address, and the like.
- the control module 133 can determine whether the control input includes structural data corresponding to the control condition; for example, if the control input is "135120000" and the structural data of the control condition is "mobile phone number", the control module 133 can determine, according to the structure of a "mobile phone number", that "135120000" in the control input is not a mobile phone number, so the control input does not match the control condition.
- the logical control frame and the previous behavioral frame connected thereto can be defined as interactive action items, each interactive action item having a unique identification.
- the interactive action item may only include behavioral frames when needed.
- a subsequent behavior frame in the logical control frame corresponding to the control input may be referred to by a unique identification of the interactive action item containing the subsequent behavior frame.
- Control module 133 can determine a behavior frame to be executed based on the unique identification of the interaction item in the logical control frame.
- the control module 133 can be configured to schedule components for performing basic behaviors according to the one basic behavior corresponding to the behavior frame, or the plurality of basic behaviors constrained by temporal logic, to control the output unit 120 and/or the input unit 110 to perform the action of the behavior frame. Each of the components for performing a basic behavior is used to perform the corresponding basic behavior.
- the basic behavior is defined by the behavior name and behavior control parameters.
- Machine device 100 may include one or more components comprised of software and/or hardware for performing basic acts.
- the control module 133 may determine a component for performing a basic behavior based on the behavior name, and deliver the behavior control parameter to a component corresponding to the behavior name such that the component corresponding to the behavior name generates an action based on the behavior control parameter.
- the interactive action flow execution process is exited.
- the input generation module 131 may continue to generate control inputs for determining behavior to be performed by the machine device 100 based on the entity-related data retrieved by the input unit 110, and the control generation module 132 may continue to be based on The control input generated by the input generation module 131 determines an action to be performed by the machine device 100.
- the logical control frame may further include one or more links to other interactive action flows corresponding to the entity-related input data, or links to other interactive action flows corresponding to control conditions associated with one or more previous behavior frames.
- Links to other interactive action flows may include a unique identification of the interactive action flow, or the unique identification and behavioral frame specified when entering the interactive action flow.
- the control module 133 can be configured to initiate execution of the interactive action flow from the specified behavior frame upon entering the interactive action flow.
- the computing unit 130 is configured to execute an interactive action flow according to event logic, wherein the interactive action flow includes behavior frames and logical control frames, the behavior frames may include a start behavior frame, each logical control frame is connected to one or more previous behavior frames and one or more subsequent behavior frames, and the logical control frame includes one or more entity-related control conditions and the subsequent behavior frames corresponding to the one or more entity-related control conditions.
- the computing unit 130 is configured to: execute at least one first processing thread, wherein the first processing thread is configured to control the output unit 120 and/or the input unit 110 to perform the action of a behavior frame, and, upon entering the interactive action flow, to control the output unit 120 and/or the input unit 110 to perform the start behavior frame; and execute at least one second processing thread while executing the at least one first processing thread, wherein the second processing thread is configured to detect entity-related data, determine a control input based on the entity-related data, and determine, based on at least a portion of the logical control frames in the interactive action flow, a subsequent behavior frame matching the control input.
- if a subsequent behavior frame matching the current control input is determined, the first processing thread is caused to terminate the currently executing behavior frame and perform the determined subsequent behavior frame; if the current control input does not match a subsequent behavior frame, the first processing thread is caused to continue executing the behavior frame that is being or will be executed. It is therefore possible to jump to any position among a plurality of behavior frames based on the control input.
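- a minimal sketch of the two-thread arrangement described above is given below: the first thread executes behavior frames, while the second thread watches for entity-related data and, when a matching subsequent frame is found, asks the first thread to jump to it; the frame contents, the queue-based signalling, and the simulated entity inputs are all assumptions made for illustration.

```python
import queue
import threading
import time

# Minimal sketch of the two processing threads. The jump is signalled through a
# queue; frame contents and the simulated entity inputs are assumptions.
jump_queue = queue.Queue()
FRAMES = {"start": "speak: How are you?",
          "item1": "speak: Don't be sad...",
          "item2": "play: happy.mp3"}
FLOW_MAP = {"super_yes": "item2", "super_no": "item1"}

def first_thread():
    current = "start"
    while current is not None:
        print(f"[exec] {FRAMES[current]}")
        try:
            # Keep executing (here: idling) until a jump arrives or we time out.
            current = jump_queue.get(timeout=2.0)
        except queue.Empty:
            current = None          # no matching input: stop after this frame

def second_thread(simulated_inputs):
    for control_input in simulated_inputs:
        time.sleep(0.5)             # pretend to detect entity-related data
        target = FLOW_MAP.get(control_input)
        if target is not None:      # matching subsequent frame found: request a jump
            jump_queue.put(target)

executor = threading.Thread(target=first_thread)
watcher = threading.Thread(target=second_thread, args=(["super_no"],))
executor.start()
watcher.start()
executor.join()
watcher.join()
```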
- the logical control frame and the previous behavioral frame connected thereto can be defined as interactive action items, each interactive action item having a unique identification.
- the interactive action item may only include behavioral frames when needed.
- Subsequent behavioral frames in the logical control frame that correspond to the control conditions may be referred to by a unique identification of the interactive action item that includes the subsequent behavior frame.
- a behavior frame to be executed may be determined based on a unique identification of an interaction item in the logical control frame.
- the at least one first processing thread may schedule components for performing basic behaviors according to the one basic behavior corresponding to the behavior frame, or the plurality of basic behaviors constrained by temporal logic, to control the output unit 120 and/or the input unit 110 to perform the action of the behavior frame.
- Each of the components for performing the basic behavior is used to perform the corresponding basic behavior.
- the basic behavior is defined by the behavior name and behavior control parameters.
- Machine device 100 may include one or more components comprised of software and/or hardware for performing basic acts.
- the first processing thread may determine a component for performing the basic behavior based on the behavior name, and pass the behavior control parameter to a component corresponding to the behavior name such that the component corresponding to the behavior name generates an action based on the behavior control parameter.
- the second processing thread is configured to match the control input with control conditions in at least a portion of the logical control frames included in the interactive action flow to obtain subsequent behavior frames.
- the second processing thread is configured to, if the control input matches the control conditions of a plurality of logical control frames, select a matching subsequent behavior frame based on a selection policy, terminate the behavior frame being executed by the first processing thread, and perform the selected subsequent behavior frame.
- for example, when the first processing thread is waiting on a behavior frame being executed, the logical control frame connected to that behavior frame is matched preferentially.
- the logical control frame also includes one or more previous behavioral frame related control conditions.
- a second processing thread is operative to determine a control input based on one or more previous behavior frame related output results and to determine a subsequent behavior frame that matches the control input based on the logical control frame.
- the second processing thread can match the generated control input against the control conditions in the logical control frame, and determine that the control input matches a control condition when the control input matches at least a portion of the information of that control condition; the behavior frame corresponding to the matched control condition is then used as the subsequent behavior frame, and the first processing thread can execute the subsequent behavior frame.
- the control conditions in the logical control frame may include high-level conclusions: for example, a high-level meaning of "yes"/"affirmative" may be defined as "super_yes", a high-level meaning of "no"/"negative" may be defined as "super_no", and a high-level meaning of "unclear" may be defined as "unknown".
- the second processing thread can be used to generate high-level conclusions so as to generate control inputs that include high-level conclusions. For example, based on semantic analysis, the content expressed by the entity is determined to be "yes"/"affirmative", "no"/"negative", or "unclear", and the like.
- the control conditions in the logical control frame may also include structural data, that is, information in a particular form, such as a name, an identification number, a phone number, an address, and the like.
- the second processing thread can determine whether the control input includes structural data corresponding to the control condition; for example, if the control input is "135120000" and the structural data of the control condition is "mobile phone number", the second processing thread can determine, according to the structure of a "mobile phone number", that "135120000" in the control input is not a mobile phone number, and thus the control input does not match the control condition.
- the logical control frame may further include one or more links to other interactive action flows corresponding to the entity-related input data, or links to other interactive action flows corresponding to control conditions associated with one or more previous behavior frames.
- Links to other interactive action flows may include a unique identification of the interactive action flow, or the unique identification and behavioral frame specified when entering the interactive action flow.
- the first processing thread can be used to initiate execution of the interactive action flow from the specified behavior frame upon entering the interactive action flow.
- FIG. 4 is a flow chart of a method for controlling the execution of an interactive action flow of machine device 100.
- the method executes at least one first processing thread, wherein the first processing thread is used to control the output unit 120 and/or the input unit 110 to perform the action corresponding to a behavior frame, and wherein, when entering the interactive action flow, the output unit 120 and/or the input unit 110 are controlled to perform the start behavior frame (401).
- the logical control frame may further include one or more links to other interactive action flows corresponding to the entity-related input data, or links to other interactive action flows corresponding to control conditions associated with one or more previous behavior frames.
- Links to other interactive action flows may include a unique identification of the interactive action flow, or the unique identification and behavioral frame specified when entering the interactive action flow.
- the first processing thread can be used to initiate execution of the interactive action flow from the specified behavior frame upon entering the interactive action flow.
- at least one second processing thread is executed while executing the at least one first processing thread (402), wherein the second processing thread is configured to detect entity-related data, determine a control input based on the entity-related data, and determine, based on at least a portion of the logical control frames in the interactive action flow, a subsequent behavior frame matching the control input.
- if a subsequent behavior frame matching the control input is determined, the first processing thread is caused to terminate the currently executing behavior frame and perform the determined subsequent behavior frame (403); otherwise, the first processing thread is caused to continue executing the behavior frame that is being or will be executed (404).
- the control data can include multiple data types, such as an interactive action flow, and the like.
- FIG. 5 is a schematic diagram of a communication system 200 of machine device 100.
- communication system 200 can include machine device 100, network 300, and machine device control server 400.
- the machine device 100 may also include a communication unit 150 that enables the machine device 100 to transmit and receive information over the network 300.
- Machine device control server 400 can be communicatively coupled to machine device 100 via network 300.
- the machine device control server 400 can include a service front end 401, a search engine 402, and a control entry library 403.
- the service front end 401 is configured to communicate with the machine device 100 over the network 300, enabling the machine device control server 400 to transmit information to the machine device 100 and to receive information transmitted by the machine device 100, including the control input, transmitted by the machine device 100, for determining the action to be performed by the machine device 100.
- the search engine 402 is configured to search the control entry library for control data corresponding to the control input based on the control input received by the service front end 401, wherein the control data may include an interactive action flow.
- the service front end 401 can also transmit the control data found by the search engine 402 in a communication signal over the network 300 for receipt at the machine device 100 via the communication unit 150.
- the machine device control server 400 can transmit the control data itself to the machine device 100, or can transmit a unique identification of the control data to the machine device 100, and the machine device 100 can read the corresponding control data from its storage device based on the unique identification.
- the computing unit 130 can be used to communicate with the machine device control server 400 over the network 300 using the communication unit 150, transmitting the generated control input in a communication signal to be received at the service front end 401 of the machine device control server 400.
- the computing unit 130 is also operative to receive, via the communication unit 150, the control data transmitted by the machine device control server 400; the control data may include an interactive action flow.
- the calculation unit 130 may determine whether the action to be performed belongs to the interactive action flow based on the data type in the control data, and enter the interactive action flow execution flow when the action to be performed belongs to the interactive action flow.
- the computing unit 130 can be used to locally search, at the machine device 100, for control data that matches the control input, and to send the control input in a communication signal over the network 300 to be received at the service front end 401 of the machine device control server 400.
- the calculating unit 130 can be configured to select either the locally searched control data or the control data transmitted by the machine device control server 400 to control the output unit 120 and/or the input unit 110 to perform the action.
- for example, the control data obtained first may be selected, or either the locally obtained control data or the control data transmitted by the machine device control server 400 may be preferred.
- the computing unit 130 can be used to locally search, at the machine device 100, for control data that matches the control input, and, when no control data matching the control input is found locally, to send the control input in a communication signal over the network 300 to be received at the service front end 401 of the machine device control server 400; the calculation unit 130 then receives and executes the control data transmitted by the machine device control server 400.
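- a minimal sketch of the local-first lookup with a server fallback described above is given below; the local control-entry store, the server function, and its assumed response are hypothetical stand-ins for the machine device control server 400.

```python
# Minimal sketch: search local control entries first; only when no local entry
# matches is the control input sent to the (hypothetical) control server.

LOCAL_CONTROL_ENTRIES = [
    # condition data -> control data (here: a reference to an interactive action flow)
    {"condition": "tell me a joke", "control_data": {"page_id": 2000}},
]

def search_local(control_input):
    for entry in LOCAL_CONTROL_ENTRIES:
        if entry["condition"] in control_input:
            return entry["control_data"]
    return None

def query_control_server(control_input):
    # Placeholder for sending the control input over the network to the machine
    # device control server and receiving control data in return.
    print(f"sending {control_input!r} to the control server ...")
    return {"page_id": 1000}        # assumed response

def obtain_control_data(control_input):
    control_data = search_local(control_input)
    if control_data is None:
        control_data = query_control_server(control_input)
    return control_data

print(obtain_control_data("tell me a joke"))   # found locally
print(obtain_control_data("I feel bored"))     # falls back to the server
```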
- the control generation module 132 can be used to locally search, at the machine device 100, for control data that matches the control input, and to send the control input in a communication signal over the network 300 to be received at the service front end 401 of the machine device control server 400.
- Control generation module 132 can include at least one interface for communicating with machine device control server 400.
- the control generation module 132 can be used to select locally searched control data or control data transmitted by the machine device control server 400, and the control module 133 can be configured to control the output unit 120 and/or the input unit 110 to perform actions based on the selected control data.
- for example, the control data obtained first may be selected, or either the locally obtained control data or the control data transmitted by the machine device control server 400 may be preferred.
- the control generation module 132 can be configured to locally search, at the machine device 100, for control data that matches the control input, and, when no control data matching the control input is found locally, to send the control input in a communication signal over the network 300 to be received at the service front end 401 of the machine device control server 400; the control generation module 132 then receives the control data transmitted by the machine device control server 400, and the control module 133 executes that control data to control the output unit 120 and/or the input unit 110 to generate the action.
- the control conditions in the logical control frame may include high-level conclusions: for example, a high-level meaning of "yes"/"affirmative" may be defined as "super_yes", a high-level meaning of "no"/"negative" may be defined as "super_no", and a high-level meaning of "unclear" may be defined as "unknown".
- Control module 133 can be used to generate high level conclusions to generate control inputs that include high level conclusions. For example, based on semantic analysis, the content expressed by the entity is determined to be “yes”/“affirmative”, or “no”/“negative”, or “unclear”, and the like.
- the control conditions in the logical control frame may also include structural data, that is, information in a particular form, such as a name, an identification number, a phone number, an address, and the like.
- the control module 133 can determine whether the control input includes structural data corresponding to the control condition; for example, if the control input is "135120000" and the structural data of the control condition is "mobile phone number", the control module 133 can determine, according to the structure of a "mobile phone number", that "135120000" in the control input is not a mobile phone number, so the control input does not match the control condition.
- FIG. 6 is another schematic diagram of communication system 200 of machine device 100.
- communication system 200 can include machine device 100 and computing device 500 , where computing device 500 can be communicatively coupled to machine device 100 via network 300 .
- the computing device 500 can be any type of fixed or mobile computing device, including but not limited to a desktop computer (for example, a personal computer), a mobile computer or computing device (for example, a personal digital assistant (PDA), a laptop computer, a notebook computer, a tablet computer such as an Apple iPad or Microsoft Surface, or a netbook), a mobile phone (for example, a cellular phone or a smart phone such as a Microsoft Windows Phone, an Apple iPhone, or a Google Android phone), an interactive robot, or another type of mobile, fixed, or portable computing device.
- Computing device 500 can transmit control data via communication signals for receipt at machine device 100, and control data transmitted by computing device 500 can include an interactive action stream.
- Machine device 100 may execute control data transmitted by computing device 500, such as an interactive action flow that performs control data definitions.
- Machine device 100 can include a computer program device 700 (or set of instructions) for performing an interactive flow of operations for performing an interactive flow of actions.
- FIG. 7 is a block diagram showing the structure of an apparatus 700 for performing an interactive action flow.
- an apparatus for performing an interactive action flow includes a parsing scheduling component 701, a control input generating component 702, and a logical judging component 703.
- the parsing scheduling component 701 is configured to parse the interactive action flow; when entering the interactive action flow, it causes the machine device 100 to execute the start behavior frame, and when entering the logical control frame, it causes the control input generating component 702 to generate a control input based on the entity-related data acquired by the input unit 110, and causes the logic determination component 703 to determine, based on the logical control frame, a subsequent behavior frame matching the control input; the parsing scheduling component 701 is further configured to execute the subsequent behavior frame.
- the logical control frame may further include one or more links to other interactive action flows corresponding to the entity-related input data, or links to other interactive action flows corresponding to control conditions associated with one or more previous behavior frames.
- Links to other interactive action flows may include a unique identification of the interactive action flow, or the unique identification and behavioral frame specified when entering the interactive action flow.
- the parsing scheduling component 701 can be configured to cause the machine device 100 to begin execution of the interactive action stream from the specified behavior frame upon entering the interactive action stream.
- the control conditions in the logical control frame may include high-level conclusions: for example, a high-level meaning of "yes"/"affirmative" may be defined as "super_yes", a high-level meaning of "no"/"negative" may be defined as "super_no", and a high-level meaning of "unclear" may be defined as "unknown".
- Control input generation component 702 can be used to generate high level conclusions to generate control inputs that include high level conclusions. For example, based on semantic analysis, the content expressed by the entity is determined to be “yes”/“affirmative”, or “no”/“negative”, or “unclear”, and the like.
- the control conditions in the logical control frame may also include structural data, that is, information in a particular form, such as a name, an identification number, a phone number, an address, a date, a license plate number, or a train or flight number.
- the logic determination component 703 can determine whether the control input includes structural data corresponding to the control condition; for example, if the control input is "135120000" and the structural data of the control condition is "mobile phone number", the logic determination component 703 can determine, according to the structure of a "mobile phone number", that "135120000" in the control input is not a mobile phone number, and thus the control input does not match the control condition.
- the apparatus further includes one or more basic behavior execution components 704, each of which is used to perform a corresponding basic behavior; wherein each behavior frame corresponds to one basic behavior or to a plurality of basic behaviors constrained by temporal logic.
- the parsing scheduling component 701 is configured to schedule the basic behavior execution component 704 based on the behavioral frame to perform one or more basic behaviors included in the behavior frame in accordance with temporal logic constraints.
- the basic behavior execution components 704 may include a motor control component corresponding to each actuator in the output unit 120, a display control component corresponding to the display unit in the output unit 120, or a sound output component corresponding to the sound output unit in the output unit 120 (the sound output component may include a music output component, a sound output component, or a voice output component).
- the basic behavior execution component 704 can also include components for the input unit 110, such as a camera component, or a video component, or a component that records audio, and the like.
- the basic behavioral execution component 704 can also include a visual focus component for locking the focused object in the camera, or a tracking component for tracking the moving object.
- a plurality of basic behaviors may be defined in advance, and the corresponding basic behavior execution component 704 is set, and is not limited to the basic behavior and basic behavior execution component 704 described in this example.
- the basic behavior is defined by the behavior name and behavior control parameters.
- the parsing scheduling component 701 may determine the component for performing the basic behavior based on the behavior name, and pass the behavior control parameters to the component corresponding to the behavior name, so that the component corresponding to the behavior name generates an action based on the behavior control parameters.
- the logical control frame may also include control conditions associated with one or more previous behavior frames.
- Control input component 702 can be configured to determine a control input based on one or more previous behavior frame related output results, and logic determination component 703 is configured to determine a subsequent behavior frame that matches the control input based on the logical control frame.
- the logical control frame and the previous behavioral frame connected thereto can be defined as interactive action items, each interactive action item having a unique identification.
- the interactive action item may only include behavioral frames when needed. Subsequent behavioral frames in the logical control frame that correspond to the control conditions may be referred to by a unique identification of the interactive action item that includes the subsequent behavior frame.
- the logic decision component 703 can determine a subsequent behavior frame to be executed based on the unique identification of the interaction item in the logical control frame.
- the parsing scheduling component 701 can also obtain the interactive action flow to be executed based on the identification of the interactive action flow, and determine, based on the unique identification of an interactive action item, the behavior frame to be executed when entering the interactive action flow; upon entering the interactive action flow, the determined behavior frame is executed, and when entering the corresponding logical control frame, the control input generating component 702 generates a control input based on the entity-related data obtained by the input unit 110, or determines the control input based on the output result of the previous behavior frame; the logic determination component 703 determines, based on the logical control frame, a subsequent behavior frame matching the control input, and the parsing scheduling component 701 executes the subsequent behavior frame.
- the logic determination component 703 can match the generated control input against the control conditions in the logical control frame, and determine that the control input matches a control condition when the control input matches at least a portion of the information of that control condition; the behavior frame corresponding to the matched control condition is taken as the subsequent behavior frame, and the parsing scheduling component 701 can execute the subsequent behavior frame.
- FIG. 8 is a flow chart of a method of controlling a machine device 100 to perform an interactive action flow.
- the method includes parsing the interactive action flow (801); executing the start behavior frame upon entering the interactive action flow (802); upon entering the logical control frame, generating a control input based on the entity-related data obtained by the input unit 110 (803); determining, based on the logical control frame, a subsequent behavior frame matching the control input (804); and executing the subsequent behavior frame (805).
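- a minimal sketch of the loop formed by steps 801–805 is given below: parse the flow, run the start behavior frame, then repeatedly obtain a control input and follow the matching branch until an end tag or an unmatched input is reached; the data layout loosely mirrors the page_id/item/trigger/flow_map structure described later in this document, but the concrete field names and the simulated control input are assumptions.

```python
# Minimal sketch of steps 801-805. The flow layout loosely mirrors the
# page_id / item / trigger / flow_map structure described below; field names
# and the simulated control input are assumptions made for illustration.

def execute_trigger(trigger):
    for basic_behavior in trigger:
        print(f"[behavior] {basic_behavior}")

def run_interactive_action_flow(flow, get_control_input):
    item_id = flow["start_item"]                       # 801/802: parse, start frame
    while item_id is not None:
        item = flow["items"][item_id]
        execute_trigger(item["trigger"])               # execute the behavior frame
        flow_map = item.get("flow_map")
        if not flow_map or item.get("end"):
            break                                      # end tag or no control frame
        control_input = get_control_input()            # 803: generate control input
        item_id = flow_map["ifs"].get(control_input,   # 804: match, 805: execute next
                                      flow_map["ifs"].get("other"))

demo_flow = {
    "start_item": "item0",
    "items": {
        "item0": {"trigger": ["speak: How are you?"],
                  "flow_map": {"ifs": {"super_no": "item1", "super_yes": "item2"}}},
        "item1": {"trigger": ["speak: Don't be sad..."], "end": True},
        "item2": {"trigger": ["play: happy.mp3"], "end": True},
    },
}

run_interactive_action_flow(demo_flow, get_control_input=lambda: "super_no")
```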
- the logical control frame may also include control conditions associated with one or more previous behavior frames.
- the method may also determine a control input based on one or more previous behavior frame related output results, and determine a subsequent behavior frame that matches the control input based on the logical control frame.
- each of the behavioral frames corresponds to a basic behavior or a plurality of basic behaviors that are logically constrained by time; wherein performing the behavioral frame comprises: scheduling a basic behavioral execution component for performing basic behavior based on the behavioral frame, Performing one or more basic behaviors contained in the behavioral frame according to temporal logic constraints, wherein each of the basic behavioral execution components is used to perform a corresponding basic behavior.
- the logical control frame and the previous behavior frame connected to the logical control frame are defined as interactive action items, each interactive action item having a unique identification; wherein the subsequent behavior of the control condition in the logical control frame A frame is referred to by a unique identifier of an interactive action item that contains the subsequent behavior frame.
- the subsequent behavior frame can therefore be determined based on the unique identification of the interactive action item containing the behavior frame.
- the interactive action flow to be executed may also be obtained based on the identification of the interactive action flow, and the behavior frame to be executed when entering the interactive action flow may be determined based on the unique identification of an interactive action item; upon entering the interactive action flow, the determined behavior frame is executed, and when entering the corresponding logical control frame, a control input is generated based on the entity-related data obtained by the input unit 110, or the control input is determined based on the output result of the previous behavior frame; a subsequent behavior frame matching the control input is then determined based on the logical control frame, and the subsequent behavior frame is executed.
- control data of the interactive action flow is described below, and it should be understood that the control data is merely an example for ease of understanding, and is not a definition of control data for the interactive action flow.
- the logical control frame and the previous behavioral frame connected thereto are defined as interactive action items, each interactive action item having a unique identification, and the interactive action flow having a unique identification.
- page_id 1000; // unique identifier of the interactive action flow;
- the logical control frame in "item0" is connected to the "trigger" of "item0" as its previous behavior frame, and, based on the control conditions, links to the behavior frames in "item1" and "item2" as its subsequent behavior frames;
- "page_id" is the unique identifier of the interactive action flow; "item" is an interactive action item and has a unique identifier within the interactive action flow; an "item" can include the behavior frame "trigger" and the logical control frame "flow_map"; "trigger" may include one or more basic behaviors constrained by temporal logic, and "flow_map" may include one or more control conditions "ifs" and the corresponding subsequent behavior frames (referred to in this example by "goto"), where the subsequent behavior frame is referred to by the unique identifier of the interactive action item that contains it.
- the end tag "end” may also be included in “item”.
- the initial behavior frame can be determined based on the unique identifier of the item, and the initial behavior frame is executed.
- the initial behavior frame is in "item0", and “trigger” in “item0” is executed.
- the “trigger” includes four behaviors that are logically constrained by time, and the time logic is represented by numbers “0", “1", "2”, and the like.
- the number "0" corresponds to the basic behavior of speaking: the content to be said is set to "How are you?", the volume of the speech is set to "50%", and the component that causes the machine device 100 to speak can be scheduled according to the set parameters to say "How are you".
- after the component that causes the machine device 100 to speak finishes execution and returns a completion result, execution of the number "1" begins.
- the number "1" corresponds to the basic behavior of playing music: the music to be played is "http//bpeer.com/happy.mp3", the playback volume is "50%", and the component that causes the machine device 100 to play music can be called to play "http//bpeer.com/happy.mp3".
- the number "2" is executed.
- the number "2" includes two basic behaviors performed simultaneously, namely head movement and neck movement: the head movement is performed by head motor "1", with the movement speed set to "1" and the movement angle set to "45" degrees; the neck movement is performed by neck motor "1", with the movement speed set to "2" and the movement angle set to "60" degrees.
- the components performing the head motion and the neck motion may determine the motor to be controlled based on the motor identification in the behavior; the machine device 100 may maintain a mapping table that maps the motor identifications used in the interactive action flow to the corresponding motors of the machine device 100.
- similarly, a mapping table can be maintained to map the motion speed in the interactive action flow to the motion speed executed by the motor.
- the motion speed in the interactive action flow can be a speed gear: for example, "1" means slow, "2" means normal, and "3" means fast.
- the angle of motion can also be a relative value that machine device 100 can convert to a final value of execution.
- the actuator is not limited to a motor; other controlled objects can be controlled in a similar manner.
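- a minimal sketch of the mapping tables just described, translating the abstract motor identifiers and speed gears of the interactive action flow into device-specific values, is shown below; the concrete port names, speeds, and gear values are assumptions.

```python
# Minimal sketch of the mapping tables: abstract identifiers and speed gears in
# the interactive action flow are mapped to device-specific values. The concrete
# port names and speeds are assumptions.

MOTOR_MAP = {("head", "1"): "servo_port_3", ("neck", "1"): "servo_port_5"}
SPEED_GEARS = {"1": 30, "2": 60, "3": 120}   # assumed degrees per second

def resolve_motor_command(body_part, motor_id, speed_gear, angle_deg):
    return {
        "port": MOTOR_MAP[(body_part, motor_id)],
        "speed_deg_per_s": SPEED_GEARS[speed_gear],
        "angle_deg": float(angle_deg),
    }

print(resolve_motor_command("head", "1", "1", "45"))
print(resolve_motor_command("neck", "1", "2", "60"))
```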
- the behavior frame "trigger" of "item1" includes two basic behaviors: first, number "0", an eye movement performed by eye motor "1", with a movement speed of "2" and a movement angle of "50" degrees; then, number "1", speaking, where the content of the speech is "Don't be sad, how about telling a joke to you?" and the volume of the speech is "50%".
- the text of the spoken content is given in this example; in some examples, the audio data of the spoken content may also be given directly.
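- putting the pieces of the example together, the control data might look like the following Python dictionary; the structure follows the page_id/item/trigger/flow_map/ifs/goto/end fields described above, but the exact field names, the conditions attached to each goto (here "super_no" and "super_yes"), and the contents of "item2" are assumptions, since the source text does not spell them out.

```python
# Assumed rendering of the example control data described above. The branch
# conditions ("super_no" -> item1, "super_yes" -> item2) and item2's trigger
# are illustrative assumptions; the remaining values come from the example text.
interactive_action_flow = {
    "page_id": 1000,                       # unique identifier of the flow
    "items": {
        "item0": {
            "trigger": {                   # behavior frame, keyed by time logic
                "0": [{"behavior": "speak",
                       "params": {"text": "How are you?", "volume": "50%"}}],
                "1": [{"behavior": "play_music",
                       "params": {"url": "http//bpeer.com/happy.mp3", "volume": "50%"}}],
                "2": [{"behavior": "move_head",
                       "params": {"motor": "1", "speed": "1", "angle": "45"}},
                      {"behavior": "move_neck",
                       "params": {"motor": "1", "speed": "2", "angle": "60"}}],
            },
            "flow_map": {                  # logical control frame
                "ifs": {"super_no": {"goto": "item1"},
                        "super_yes": {"goto": "item2"}},
            },
        },
        "item1": {
            "trigger": {
                "0": [{"behavior": "move_eyes",
                       "params": {"motor": "1", "speed": "2", "angle": "50"}}],
                "1": [{"behavior": "speak",
                       "params": {"text": "Don't be sad, how about telling a joke to you?",
                                  "volume": "50%"}}],
            },
            "end": True,                   # end tag
        },
        "item2": {"trigger": {}, "end": True},   # contents not given in the example
    },
}
```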
- a storage medium is further provided, the storage medium storing the above-mentioned software; the storage medium includes, but is not limited to, an optical disk, a floppy disk, a hard disk, an erasable memory, and the like.
- the embodiment of the invention achieves the following technical effects: through the interactive action flow, the machine device can perform logical interaction with the entity, thereby improving the interaction experience. Since the interactive action flow can enable the machine device to have a logical interaction, the machine device can assist the entity in performing many tasks in accordance with the logic flow. Interactive action flow can be widely used in online education, voice interaction, online assistance, intelligent customer service and other fields.
- the modules or steps of the embodiments of the present invention can be implemented by a general-purpose computing device, and can be concentrated on a single computing device or distributed across multiple computing devices. Alternatively, they may be implemented by program code executable by the computing device, so that they may be stored in a storage device and executed by the computing device; in some cases the steps shown or described may be performed in an order different from the order described herein, or they may be separately fabricated into individual integrated circuit modules, or a plurality of the modules or steps may be fabricated into a single integrated circuit module. Thus, embodiments of the invention are not limited to any specific combination of hardware and software.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Automation & Control Theory (AREA)
- Manipulator (AREA)
Abstract
The invention relates to a system for controlling a machine device (100) to generate an action. The system is configured to execute an interactive action flow (144), the interactive action flow comprising behavior frames (145) and logical control frames. The behavior frames (145) comprise a start behavior frame, and each logical control frame is connected to one or more previous behavior frames and to one or more subsequent behavior frames. The logical control frame comprises one or more entity-related control conditions and the subsequent behavior frame corresponding to the one or more entity-related control conditions, and may further comprise one or more control conditions related to a previous behavior frame and the subsequent behavior frame corresponding to the one or more control conditions related to the previous behavior frame. By means of the interactive action flow (144), the machine device (100) can interact logically with an entity, improving the interaction experience.
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510364661.7A CN106325228B (zh) | 2015-06-26 | 2015-06-26 | 机器人的控制数据的生成方法及装置 |
CN201510363348.1A CN106325065A (zh) | 2015-06-26 | 2015-06-26 | 机器人交互行为的控制方法、装置及机器人 |
CN201510363348.1 | 2015-06-26 | ||
CN201510363346.2A CN106325113B (zh) | 2015-06-26 | 2015-06-26 | 机器人控制引擎及系统 |
CN201510363346.2 | 2015-06-26 | ||
CN201510364661.7 | 2015-06-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016206647A1 true WO2016206647A1 (fr) | 2016-12-29 |
Family
ID=57584497
Family Applications (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2016/087258 WO2016206643A1 (fr) | 2015-06-26 | 2016-06-27 | Procédé et dispositif de commande de comportement interactif de robot et robot associé |
PCT/CN2016/087262 WO2016206647A1 (fr) | 2015-06-26 | 2016-06-27 | Système de commande d'appareil mécanique permettant de générer une action |
PCT/CN2016/087260 WO2016206645A1 (fr) | 2015-06-26 | 2016-06-27 | Procédé et appareil de chargement de données de commande dans un dispositif de machine |
PCT/CN2016/087259 WO2016206644A1 (fr) | 2015-06-26 | 2016-06-27 | Moteur et système de commande de robot |
PCT/CN2016/087257 WO2016206642A1 (fr) | 2015-06-26 | 2016-06-27 | Procédé et appareil de génération de données de commande de robot |
PCT/CN2016/087261 WO2016206646A1 (fr) | 2015-06-26 | 2016-06-27 | Procédé et système pour pousser un dispositif de machine à générer une action |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2016/087258 WO2016206643A1 (fr) | 2015-06-26 | 2016-06-27 | Procédé et dispositif de commande de comportement interactif de robot et robot associé |
Family Applications After (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2016/087260 WO2016206645A1 (fr) | 2015-06-26 | 2016-06-27 | Procédé et appareil de chargement de données de commande dans un dispositif de machine |
PCT/CN2016/087259 WO2016206644A1 (fr) | 2015-06-26 | 2016-06-27 | Moteur et système de commande de robot |
PCT/CN2016/087257 WO2016206642A1 (fr) | 2015-06-26 | 2016-06-27 | Procédé et appareil de génération de données de commande de robot |
PCT/CN2016/087261 WO2016206646A1 (fr) | 2015-06-26 | 2016-06-27 | Procédé et système pour pousser un dispositif de machine à générer une action |
Country Status (1)
Country | Link |
---|---|
WO (6) | WO2016206643A1 (fr) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109262606A (zh) * | 2017-07-18 | 2019-01-25 | 松下知识产权经营株式会社 | 装置、方法、程序以及机器人 |
TWI709833B (zh) * | 2018-09-20 | 2020-11-11 | 日商斯庫林集團股份有限公司 | 資料處理、資料處理裝置以及電腦可讀取記錄媒體 |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108388399B (zh) * | 2018-01-12 | 2021-04-06 | 北京光年无限科技有限公司 | 虚拟偶像的状态管理方法及系统 |
TWI735168B (zh) * | 2020-02-27 | 2021-08-01 | 東元電機股份有限公司 | 語音控制機器人 |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020038168A1 (en) * | 2000-06-12 | 2002-03-28 | Tomoaki Kasuga | Authoring system and method, and storage medium used therewith |
US20020123826A1 (en) * | 2001-01-30 | 2002-09-05 | Nec Corporation | Robot, robot control system, and program for the same |
JP2005193331A (ja) * | 2004-01-06 | 2005-07-21 | Sony Corp | ロボット装置及びその情動表出方法 |
CN102077260A (zh) * | 2008-06-27 | 2011-05-25 | 悠进机器人股份公司 | 利用机器人的交互式学习系统和在儿童教育中操作该系统的方法 |
CN102448678A (zh) * | 2009-05-26 | 2012-05-09 | 奥尔德巴伦机器人公司 | 用于编辑和控制移动机器人的行为的系统和方法 |
US20120116584A1 (en) * | 2010-11-04 | 2012-05-10 | Kt Corporation | Apparatus and method for providing robot interaction services using interactive behavior model |
CN103119644A (zh) * | 2010-07-23 | 2013-05-22 | 奥尔德巴伦机器人公司 | 装备自然对话接口的类人机器人、用于控制机器人的方法和对应程序 |
CN105511608A (zh) * | 2015-11-30 | 2016-04-20 | 北京光年无限科技有限公司 | 基于智能机器人的交互方法及装置、智能机器人 |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7089184B2 (en) * | 2001-03-22 | 2006-08-08 | Nurv Center Technologies, Inc. | Speech recognition for recognizing speaker-independent, continuous speech |
US6957215B2 (en) * | 2001-12-10 | 2005-10-18 | Hywire Ltd. | Multi-dimensional associative search engine |
KR101077404B1 (ko) * | 2003-11-20 | 2011-10-26 | 파나소닉 주식회사 | Association control apparatus, association control method, and service association system |
WO2006093394A1 (fr) * | 2005-03-04 | 2006-09-08 | Chutnoon Inc. | Server, method and system for an information search service using a web page segmented into several information blocks |
JP2007044825A (ja) * | 2005-08-10 | 2007-02-22 | Toshiba Corp | Behavior management device, behavior management method, and behavior management program |
US7945441B2 (en) * | 2007-08-07 | 2011-05-17 | Microsoft Corporation | Quantized feature index trajectory |
CN101618280B (zh) * | 2009-06-30 | 2011-03-23 | 哈尔滨工业大学 | Humanoid-head robot device with human-computer interaction function and behavior control method |
CN102665590B (zh) * | 2009-11-16 | 2015-09-23 | 皇家飞利浦电子股份有限公司 | Human-robot shared control for endoscope-assistant robots |
US20110213659A1 (en) * | 2010-02-26 | 2011-09-01 | Marcus Fontoura | System and Method for Automatic Matching of Contracts in an Inverted Index to Impression Opportunities Using Complex Predicates and Confidence Threshold Values |
CN201940040U (zh) * | 2010-09-27 | 2011-08-24 | 深圳市杰思谷科技有限公司 | Household robot |
US9459607B2 (en) * | 2011-10-05 | 2016-10-04 | Opteon Corporation | Methods, apparatus, and systems for monitoring and/or controlling dynamic environments |
US20130343640A1 (en) * | 2012-06-21 | 2013-12-26 | Rethink Robotics, Inc. | Vision-guided robots and methods of training them |
EP2902913A4 (fr) * | 2012-09-27 | 2016-06-15 | Omron Tateisi Electronics Co | Device management apparatus and device search method |
CN103324100B (zh) * | 2013-05-02 | 2016-08-31 | 郭海锋 | Information-driven emotional vehicle-mounted robot |
CN103399637B (zh) * | 2013-07-31 | 2015-12-23 | 西北师范大学 | Intelligent robot human-computer interaction method based on Kinect human skeleton tracking control |
CN103729476A (zh) * | 2014-01-26 | 2014-04-16 | 王玉娇 | Method and system for associating content according to environmental state |
CN103793536B (zh) * | 2014-03-03 | 2017-04-26 | 陈念生 | Intelligent platform implementation method and device |
2016
- 2016-06-27 WO PCT/CN2016/087258 patent/WO2016206643A1/fr active Application Filing
- 2016-06-27 WO PCT/CN2016/087262 patent/WO2016206647A1/fr active Application Filing
- 2016-06-27 WO PCT/CN2016/087260 patent/WO2016206645A1/fr active Application Filing
- 2016-06-27 WO PCT/CN2016/087259 patent/WO2016206644A1/fr active Application Filing
- 2016-06-27 WO PCT/CN2016/087257 patent/WO2016206642A1/fr active Application Filing
- 2016-06-27 WO PCT/CN2016/087261 patent/WO2016206646A1/fr active Application Filing
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020038168A1 (en) * | 2000-06-12 | 2002-03-28 | Tomoaki Kasuga | Authoring system and method, and storage medium used therewith |
US20020123826A1 (en) * | 2001-01-30 | 2002-09-05 | Nec Corporation | Robot, robot control system, and program for the same |
JP2005193331A (ja) * | 2004-01-06 | 2005-07-21 | Sony Corp | Robot apparatus and emotion expression method therefor |
CN102077260A (zh) * | 2008-06-27 | 2011-05-25 | 悠进机器人股份公司 | Interactive learning system using a robot and method of operating the same in children's education |
CN102448678A (zh) * | 2009-05-26 | 2012-05-09 | 奥尔德巴伦机器人公司 | System and method for editing and controlling the behaviors of a mobile robot |
CN103119644A (zh) * | 2010-07-23 | 2013-05-22 | 奥尔德巴伦机器人公司 | Humanoid robot equipped with a natural dialogue interface, method for controlling the robot, and corresponding program |
US20120116584A1 (en) * | 2010-11-04 | 2012-05-10 | Kt Corporation | Apparatus and method for providing robot interaction services using interactive behavior model |
CN105511608A (zh) * | 2015-11-30 | 2016-04-20 | 北京光年无限科技有限公司 | Intelligent-robot-based interaction method and apparatus, and intelligent robot |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109262606A (zh) * | 2017-07-18 | 2019-01-25 | 松下知识产权经营株式会社 | Device, method, program, and robot |
CN109262606B (zh) * | 2017-07-18 | 2023-10-27 | 松下知识产权经营株式会社 | Device, method, recording medium, and robot |
TWI709833B (zh) * | 2018-09-20 | 2020-11-11 | 日商斯庫林集團股份有限公司 | Data processing, data processing apparatus, and computer-readable recording medium |
US11474150B2 (en) | 2018-09-20 | 2022-10-18 | SCREEN Holdings Co., Ltd. | Data processing method, data processing device, and non-transitory computer-readable recording medium |
Also Published As
Publication number | Publication date |
---|---|
WO2016206642A1 (fr) | 2016-12-29 |
WO2016206643A1 (fr) | 2016-12-29 |
WO2016206645A1 (fr) | 2016-12-29 |
WO2016206644A1 (fr) | 2016-12-29 |
WO2016206646A1 (fr) | 2016-12-29 |
Similar Documents
Publication | Title |
---|---|
US11854527B2 (en) | Electronic device and method of controlling speech recognition by electronic device |
US20240038218A1 (en) | Speech model personalization via ambient context harvesting |
CN106030440B (zh) | Intelligent circular audio buffer |
US8606735B2 (en) | Apparatus and method for predicting user's intention based on multimodal information |
JP2022547704A (ja) | Intent recognition techniques with reduced training |
EP3655863A1 (fr) | Automatic integration of image capture and recognition in a voice query to understand an intent |
US11709475B2 (en) | Systems and methods to adapt and optimize human-machine interaction using multimodal user-feedback |
CN102903362A (zh) | Integrated local and cloud-based speech recognition |
US11120792B2 (en) | System for processing user utterance and controlling method thereof |
US11769492B2 (en) | Voice conversation analysis method and apparatus using artificial intelligence |
WO2016206647A1 (fr) | System for controlling a machine device to generate an action |
US11817097B2 (en) | Electronic apparatus and assistant service providing method thereof |
US20210110815A1 (en) | Method and apparatus for determining semantic meaning of pronoun |
CN110308886A (zh) | System and method for providing a voice command service associated with a personalized task |
KR20210042523A (ko) | Electronic device and control method thereof |
KR102369309B1 (ko) | Electronic device for performing an operation according to a user input after partial landing |
JP7215417B2 (ja) | Information processing device, information processing method, and program |
JP6798258B2 (ja) | Generation program, generation device, control program, control method, robot device, and call system |
US10770094B2 (en) | Routing audio streams based on semantically generated result sets |
JP7230803B2 (ja) | Information processing device and information processing method |
KR20210027991A (ko) | Electronic apparatus and control method thereof |
US11997445B2 (en) | Systems and methods for live conversation using hearing devices |
CN112740219A (zh) | Gesture recognition model generation method and apparatus, storage medium, and electronic device |
Le et al. | Multimodal smart interactive presentation system |
CN111971670A (zh) | Generating a response in a conversation |
Legal Events
Code | Title | Description |
---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16813763; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 16813763; Country of ref document: EP; Kind code of ref document: A1 |