Disclosure of Invention
The invention mainly aims to provide a feedback control method and system for a robot and a computer-readable storage medium, so as to solve the technical problems in the prior art that the overall interaction effect of the robot is poor and its execution of behaviors is stiff.
In order to achieve the above object, the invention provides a feedback control method for robot behaviors, which comprises the following steps:
detecting to obtain robot operation information;
generating a behavior control instruction corresponding to the robot operation information according to the robot operation information;
controlling the robot to execute the robot behavior corresponding to the robot operation information according to the behavior control instruction;
detecting the running state of the robot when the robot executes the robot behavior in real time, and generating state parameter information according to the running state;
generating a behavior feedback adjustment instruction according to the robot operation information and the state parameter information;
and adjusting the robot behavior in real time according to the behavior feedback adjustment instruction.
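The control loop formed by the steps above can be sketched as follows. This is a minimal illustrative sketch, not part of the invention: all function names, data fields, and the proportional gain are assumptions chosen for the example.

```python
# Illustrative sketch of the claimed feedback control loop.
# All names and fields are hypothetical assumptions for the example.

def generate_instruction(operation_info):
    """Map detected robot operation information to a behavior control instruction."""
    return {"behavior": operation_info["action"], "target": operation_info["target"]}

def feedback_adjust(instruction, state):
    """Compare the detected running state against the target and return a correction."""
    error = instruction["target"] - state["position"]
    return {"correction": error * 0.5}  # proportional correction; gain chosen arbitrarily

def control_step(operation_info, state):
    # One pass of the loop: instruction generation, then feedback adjustment.
    instruction = generate_instruction(operation_info)
    adjustment = feedback_adjust(instruction, state)
    return instruction, adjustment

instruction, adjustment = control_step(
    {"action": "walk", "target": 10.0},  # detected operation information
    {"position": 8.0},                   # state parameter information
)
```

In a real system the adjustment would be applied continuously while the behavior executes, closing the negative-feedback loop described in the claims.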
Preferably, the step of detecting the operation information of the robot includes:
acquiring external voice data, and matching the external voice data according to preset voice characteristics to obtain robot voice operation information;
detecting to obtain robot application operation information sent by a robot application terminal;
acquiring robot touch parameters transmitted by a touch sensor on the robot, and generating robot touch operation information according to the robot touch parameters;
the step of generating a behavior control instruction corresponding to the robot operation information according to the robot operation information includes:
generating a behavior control instruction corresponding to the robot voice operation information according to the robot voice operation information;
generating a behavior control instruction corresponding to the robot application operation information according to the robot application operation information;
and generating a behavior control instruction corresponding to the robot touch operation information according to the robot touch operation information.
Preferably, the feedback control method for robot behavior, before the step of detecting and obtaining the robot application operation information sent by the robot application terminal, further includes: setting a compiling instruction library storing preset compiling instructions;
the step of detecting and obtaining the robot application operation information sent by the robot application terminal comprises the following steps:
acquiring a robot operation program sent by the robot application terminal, wherein the robot operation program is an operation program, detected by the robot application terminal, that the user generated by compiling according to the preset compiling instructions;
the step of generating a behavior control instruction corresponding to the robot application operation information according to the robot application operation information comprises:
and generating a corresponding behavior control instruction according to the robot operation program so as to control the robot to execute the robot behavior corresponding to the robot operation program.
Preferably, the step of detecting the operation state of the robot when executing the robot behavior in real time and generating the state parameter information according to the operation state includes:
detecting the current posture of the robot in real time, and obtaining posture parameter information according to the current posture;
the step of generating a behavior feedback adjustment instruction according to the robot operation information and the state parameter information comprises:
and generating a corresponding attitude feedback adjustment instruction according to the attitude parameter information and the robot operation information.
Preferably, the step of controlling the robot to execute the robot behavior corresponding to the robot operation information according to the behavior control command includes:
controlling an audio playing device of the robot to play audio information corresponding to the audio operation instruction according to the audio operation instruction in the behavior control instruction;
controlling display equipment of the robot to display information corresponding to the display operation instruction according to the display operation instruction in the behavior control instruction;
the step of adjusting the robot behavior in real time according to the behavior feedback adjustment instruction comprises:
adjusting the robot action of the robot in real time according to the posture feedback adjustment instruction.
In addition, to achieve the above object, the present invention provides a feedback control system for robot behavior, including:
the system comprises an operation detection module, an information processing module, a state feedback module and a behavior control module; the operation detection module is connected with the information processing module, the state feedback module is connected with the information processing module, and the information processing module is also connected with the behavior control module;
the operation detection module is used for detecting and obtaining the robot operation information;
the information processing module is used for generating a behavior control instruction corresponding to the robot operation information according to the robot operation information;
the behavior control module is used for controlling the robot to execute the robot behavior corresponding to the robot operation information according to the behavior control instruction;
the state feedback module is used for detecting the running state of the robot when the robot executes the robot behavior in real time, generating state parameter information according to the running state and feeding the state parameter information back to the information processing module;
the information processing module is also used for generating a behavior feedback adjustment instruction according to the robot operation information and the state parameter information;
and the behavior control module is also used for adjusting the robot behavior in real time according to the behavior feedback adjustment instruction.
Preferably, the operation detection module comprises:
the voice detection sub-module, the application control sub-module and the touch detection sub-module are respectively connected with the information processing module; wherein,
the voice detection submodule is used for acquiring external voice data, matching the external voice data according to preset voice characteristics to obtain robot voice operation information, and sending the robot voice operation information to the information processing module so that the information processing module generates a corresponding behavior control instruction according to the robot voice operation information;
the application control submodule is used for detecting and obtaining robot application operation information sent by the robot application terminal and sending the robot application operation information to the information processing module so that the information processing module generates a corresponding behavior control instruction according to the robot application operation information;
and the touch detection submodule is used for acquiring the robot touch parameters sent by the touch sensor on the robot, generating robot touch operation information according to the robot touch parameters, and sending the robot touch operation information to the information processing module so that the information processing module generates a corresponding behavior control instruction according to the robot touch operation information.
Preferably, the application control sub-module further comprises a programming control unit, wherein the programming control unit comprises a compiling instruction library storing preset compiling instructions and is used for acquiring a robot operation program generated by the user through compiling according to the preset compiling instructions;
and the information processing module is also used for generating a corresponding behavior control instruction according to the robot operating program so as to enable the behavior control module to control the robot to execute the robot behavior corresponding to the robot operating program.
Preferably, the state feedback module comprises:
and the attitude detection submodule is used for detecting the current attitude of the robot in real time to obtain attitude parameter information, and feeding the attitude parameter information back to the information processing module so that the information processing module generates a corresponding attitude feedback adjustment instruction according to the attitude parameter information and the robot operation information.
Preferably, the behavior control module comprises:
the action control submodule is used for adjusting the robot action of the robot in real time according to the posture feedback adjustment instruction;
the audio control submodule is used for controlling the audio playing equipment of the robot to play audio information corresponding to the audio operation instruction according to the audio operation instruction in the behavior control instruction;
and the display control submodule is used for controlling the display equipment of the robot to display the display information corresponding to the display operation instruction according to the display operation instruction in the behavior control instruction.
In addition, to achieve the above object, the present invention further provides a computer-readable storage medium having a feedback control program of robot behavior stored thereon, the feedback control program of robot behavior implementing the steps of the feedback control method of robot behavior described above when executed by a processor.
The feedback control scheme for robot behavior provided by the invention detects robot operation information, generates a corresponding behavior control instruction according to the robot operation information, and controls the robot to execute the corresponding robot behavior according to the behavior control instruction; at the same time, it detects in real time the running state of the robot while the robot behavior is executed, generates state parameter information according to the running state, generates a feedback adjustment instruction according to the state parameter information, and adjusts the robot behavior in a negative feedback mode. Through these steps, the running state of the robot during behavior execution can be determined in real time and errors arising during execution can be corrected in real time, so that the robot completes the robot behavior corresponding to the robot operation information smoothly and accurately, thereby solving the prior-art problems of poor human-robot interaction and rigid behavior execution.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Referring to fig. 1, fig. 1 is a schematic flow chart of a first feedback control method for robot behavior according to an embodiment of the present invention, and as shown in fig. 1, the feedback control method for robot behavior includes the following steps:
s110: and detecting to obtain the robot operation information.
In this step, the robot operation information is obtained by matching an operation instruction sent by the user. In the embodiment of the application, an operation information feature library is preset, and the operation instruction sent by the user undergoes feature matching against the operation information in the operation information feature library; that is, the operation information is extracted from the operation instruction and then further processed (for example, time-sequence processing) to obtain the robot operation information required by the embodiment of the application.
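The feature-library matching described above can be sketched as a simple keyword lookup. This is an illustrative sketch only; the library contents and function names are assumptions, not the actual feature library of the embodiment.

```python
# Hypothetical sketch of matching a user instruction against a preset
# operation-information feature library.

FEATURE_LIBRARY = {
    "walk": "OP_WALK",
    "dance": "OP_DANCE",
    "transform": "OP_TRANSFORM",
}

def extract_operation_info(user_instruction: str):
    """Return the first operation whose feature keyword appears in the instruction."""
    text = user_instruction.lower()
    for keyword, op in FEATURE_LIBRARY.items():
        if keyword in text:
            return op
    return None  # no feature in the library matches this instruction
```

A production system would use richer features than substrings (for example time-sequence processing, as the text notes), but the library-lookup structure is the same.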
The robot operation information can be acquired and detected from the outside through a wireless communication mode, such as Bluetooth, or a detection module is arranged in the robot operation information to acquire and detect the robot operation information in a short distance.
S120: and generating a behavior control command corresponding to the robot operation information according to the robot operation information.
In this step, after the robot operation information is detected, it is further processed and converted into control information capable of operating the robot behavior, i.e., a behavior control instruction, so that the robot can be controlled. In the embodiment of the present application, an information-instruction feature library is preset, which contains the features of robot operation information and behavior control instructions and the correspondence between them. Generating a behavior control instruction then consists of matching the robot operation information to its corresponding behavior control instruction according to the features and correspondence in the preset information-instruction feature library, and further processing the matched instruction to accurately obtain the behavior control instruction corresponding to the robot operation information.
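The information-instruction feature library can be sketched as a preset correspondence table from operation information to behavior control instructions. All keys, field names, and values below are illustrative assumptions.

```python
# Sketch of the information-instruction feature library: a preset mapping
# from robot operation information to behavior control instructions.
# Entries are hypothetical examples.

INFO_TO_INSTRUCTION = {
    "OP_WALK": {"actuator": "legs", "command": "walk", "speed": 0.5},
    "OP_TRANSFORM": {"actuator": "deform_unit", "command": "transform"},
}

def generate_behavior_instruction(operation_info):
    """Look up the behavior control instruction matching the operation information."""
    try:
        return INFO_TO_INSTRUCTION[operation_info]
    except KeyError:
        # No correspondence stored for this operation information.
        raise ValueError(f"no behavior instruction matches {operation_info!r}")
```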
S130: and controlling the robot to execute the robot behavior corresponding to the robot operation information according to the behavior control instruction.
In this step, after a behavior control instruction is acquired, a dedicated control module sends the control instruction to the robot to specifically control the robot behavior. To control the robot to execute the robot behavior corresponding to the robot operation information, an optimal operation path from the robot's current operation state to the execution of the robot behavior is first generated, and the robot is then controlled to execute the robot behavior along this optimal operation path, so that the execution process is smooth and natural. In addition, the control module serves as an intermediary between the control apparatus as a whole and the robot, and also handles communication between the two.
S140: and detecting the running state of the robot when the robot executes the robot behavior in real time, and generating state parameter information according to the running state.
In this step, the robot continuously adjusts its own operation state while executing the behavior control instruction so as to reach the target operation state corresponding to the robot operation information. During this adjustment the robot may deviate from the optimal operation path toward the target state, which makes its execution appear stiff; the running state is therefore detected in real time, and state parameter information is generated from it.
S150: and generating a behavior feedback adjustment instruction according to the robot operation information and the state parameter information.
In this step, the state parameter information includes the current running state of the robot, and the robot operation information includes the final behavior the user requires the robot to reach. By comparing the state parameter information with the robot operation information, the degree to which the current running state deviates from the optimal path for executing the robot behavior can be obtained; a feedback adjustment instruction is then generated to correct the current behavior of the robot and change its running state, so that the robot runs more smoothly and naturally and executes the robot behavior more stably and accurately.
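The deviation-to-correction step can be sketched as a simple proportional controller. This is an assumption for illustration: the embodiment does not specify the control law, and the gain and pose fields below are invented for the example.

```python
# Sketch of generating a behavior feedback adjustment from the deviation
# between the current running state and the target state (simple proportional
# correction; the gain and field names are illustrative assumptions).

def behavior_feedback_adjustment(target_pose, current_pose, gain=0.8):
    """Return per-axis corrections that steer the robot back toward the target."""
    return {axis: gain * (target_pose[axis] - current_pose[axis])
            for axis in target_pose}

# Example: the robot has drifted in pitch and roll while executing a behavior.
adjust = behavior_feedback_adjustment(
    {"pitch": 0.0, "roll": 0.0},   # target state from the operation information
    {"pitch": 0.2, "roll": -0.1},  # state parameter information detected in real time
)
```

Applying such corrections continuously is the negative-feedback adjustment the text describes.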
S160: and adjusting the robot behavior in real time according to the behavior feedback adjustment instruction.
The robot provided by the application is a deformable robot provided with a motion component and a deformation component. In this step, the motion of the deformable robot and the corresponding action of the deformation component, such as "transformation", can be controlled according to the action data in the feedback adjustment instruction.
According to the technical scheme provided by the embodiment of the invention, the operation information of the robot is detected, a corresponding behavior control instruction is generated according to the operation information, the robot is controlled to execute the corresponding robot behavior according to the behavior control instruction, and meanwhile the running state of the robot while executing the behavior is detected in real time; state parameter information is generated according to the running state, and a feedback adjustment instruction is generated according to the state parameter information so as to adjust the robot behavior in a negative feedback manner. Through these steps, the running state of the robot during behavior execution can be determined in real time, errors during execution can be corrected in real time, and the robot behavior corresponding to the robot operation information can be completed smoothly and accurately, thereby solving the prior-art problems of poor human-robot interaction and rigid behavior execution.
In the feedback control method of robot behavior shown in fig. 1, the robot operation information mentioned above includes different types, such as robot voice operation information extracted from voice data uttered by the user, robot application operation information obtained from an application terminal, and robot touch operation information detected by a robot sensor. In order to detect these different types of robot operation information, as shown in the embodiment of fig. 2, step S110 in the embodiment of fig. 1, detecting to obtain the robot operation information, includes:
s111: and acquiring external voice data, and matching the external voice data according to preset voice characteristics to obtain the robot voice operation information.
In this step, a voice data-operation information feature library is preset in the embodiment of the application, storing voice operation information features and the correspondence between voice data and operation information. The voice operation information features in the external voice data are extracted through this feature library and then matched to the corresponding robot voice operation information. The external voice data may be noisy and does not necessarily contain only the voice of an authorized user, so the voice operation information features in the application comprise the sound attributes of the authorized user and operation keywords; for example, sound attributes such as the pitch, volume, and timbre of an authorized user. In addition, since the external voice data is itself used to operate the robot, the voice operation features in this embodiment also include operation keywords, specifically keywords for operating a specific behavior of the robot, such as a keyword for controlling the robot to walk at a specific speed.
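The two-stage matching above, verifying the speaker's sound attributes and then looking for an operation keyword, can be sketched as follows. The thresholds, attribute ranges, and keyword set are illustrative assumptions only.

```python
# Sketch of matching external voice data against preset voice features:
# first check the speaker's sound attributes, then scan for an operation
# keyword. All thresholds and fields are illustrative assumptions.

AUTHORIZED_VOICE = {"pitch_hz": (90, 180), "min_volume_db": 40}
KEYWORDS = {"walk": "VOICE_OP_WALK", "transform": "VOICE_OP_TRANSFORM"}

def match_voice_operation(sample):
    """Return voice operation information, or None if the sample does not match."""
    lo, hi = AUTHORIZED_VOICE["pitch_hz"]
    if not (lo <= sample["pitch_hz"] <= hi):
        return None  # sound attributes do not match an authorized user
    if sample["volume_db"] < AUTHORIZED_VOICE["min_volume_db"]:
        return None  # too quiet to be a deliberate command
    for keyword, op in KEYWORDS.items():
        if keyword in sample["text"].lower():
            return op
    return None  # authorized speaker, but no operation keyword present
```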
S112: and acquiring robot application operation information sent by the robot application terminal.
In this step, the robot application operation information is detected robot operation information transmitted by the user using the application terminal. For example, the device detects robot application operation information that the user inputs through a mobile phone APP and that is transmitted by means such as Bluetooth.
The hardware device implementing the method of this embodiment is provided with a dual Bluetooth module, which enables communication between the hardware device and the robot application terminal, including receiving the robot application operation information. The dual Bluetooth module can also be used for information transmission between the hardware device and the robot by Bluetooth.
S113: and acquiring robot touch operation information transmitted by a touch sensor on the robot.
In this step, touch detection points may be set at key positions on the surface of the robot (such as the tail, head, and shoulders), and a touch sensor is arranged at each touch detection point. Robot touch parameters are obtained through the touch sensors, and corresponding robot touch operation information is then obtained from the touch parameters. The robot touch parameters comprise parameters such as the part of the robot touched, the duration of the touch, and the force of the touch. Robot touch operation information is generated according to these parameters, and the robot is controlled to execute corresponding feedback operations according to the touch operation information, providing a good interactive experience for the user.
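Turning the three touch parameters (location, duration, force) into touch operation information can be sketched as follows; the locations, thresholds, and operation names are invented for the example.

```python
# Sketch of mapping raw touch-sensor parameters to robot touch operation
# information. Thresholds and operation names are illustrative assumptions.

def touch_operation_info(location: str, duration_s: float, force_n: float) -> str:
    if location == "head" and duration_s >= 1.0:
        return "TOUCH_OP_PET"        # sustained pat on the head
    if location == "shoulder" and force_n > 5.0:
        return "TOUCH_OP_ATTENTION"  # firm tap on the shoulder
    return "TOUCH_OP_IGNORE"         # incidental contact, no feedback needed
```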
Corresponding to the robot voice operation information, the robot application operation information, and the robot touch operation information, step S120 in the embodiment shown in fig. 1, generating a behavior control instruction corresponding to the robot operation information according to the robot operation information, comprises:
s121: and generating a behavior control instruction corresponding to the robot voice operation information according to the robot voice operation information.
In this step, a voice operation-behavior control instruction feature library is preset in the embodiment of the present application, storing voice operation information features and the correspondence between voice operation information and behavior control instructions. After the robot voice operation information is detected, it is feature-matched against the preset voice operation information in the feature library, and the corresponding behavior control instruction is found through the stored correspondence. Through this process the robot and the user can interact by voice, which makes operation convenient and improves the user's experience.
Specifically, after external environment voice data is acquired, robot voice operation information is generated from it, a corresponding behavior control instruction is generated according to the voice operation information, interactive feedback data is adapted according to the behavior control instruction, and the robot is controlled according to the interactive feedback data to complete the corresponding feedback; for example, the audio playing device of the robot is controlled to play audio information corresponding to the audio operation instruction, and the robot display device is controlled to display the corresponding display information.
For example, the voice operation-behavior control instruction feature library in the present application prestores corresponding information such as text information and volume. In this step, the voice detection module receives the deformation command uttered by the user; the feature value of the deformation text information in the user's speech is extracted through a speech-to-text extraction algorithm and matched against the deformation voice control instruction corresponding to the prestored deformation text information in the voice operation-behavior control instruction feature library. The deformation voice control instruction uttered by the user is thereby recognized, and the programmable deformation competitive robot performs the deformation action.
S122: and generating a behavior control command corresponding to the robot application operation information according to the robot application operation information.
For example: the hardware device obtains operation data that the user inputs at the application terminal by click operations, and the hardware device then enters an APP interaction mode. After entering the APP interaction mode, application operation information is detected from the operation data, the preset interaction mode corresponding to the application operation information is determined, and a behavior control instruction corresponding to the application operation information is generated after the corresponding preset interaction mode is entered. The preset interaction modes include an intelligent programming mode, a special effect mode, an online combat mode, an intelligence-developing early education mode, and the like, used respectively for intelligent user-defined programming, controlling the robot to execute corresponding special effects, controlling robot combat, and realizing the intelligence-developing early education function. In the special effect mode, the user can control the robot in real time through the mobile phone App to complete various actions; for example, instructions such as "trick", "addle", "show", "deformation", "multi-machine performance", "dance with voice", "rocker", and "walking in all directions" can be obtained from the user's App input and the corresponding actions completed. In the online combat mode, the user controls the robot to fight in real time through the mobile phone App, for example by selecting "online combat" or "free combat" in the App, and audio output accompanies wins and losses during combat.
In addition, in the intelligence-developing early education mode, the system can receive local and online content such as songs, stories, encyclopedia entries, or national-studies resources pushed by the user through the mobile phone App, receive the audio through the robot's Bluetooth system, and play the audio data content selected by the user.
In this step, after the robot application operation information is obtained, a corresponding behavior control instruction is generated according to it, so that the robot can be controlled to execute the corresponding behavior in response to the user's operation at the application terminal; this makes terminal operation convenient and improves the user's experience.
S123: and generating a behavior control command corresponding to the robot touch operation information according to the robot touch operation information.
To implement this step, a touch operation information-behavior control instruction feature library can be preset, storing the features of the touch operation information and of the behavior control instructions along with their correspondence. When robot touch operation information is detected, feature matching is performed according to its features to obtain the matching behavior control instruction, and the robot is then controlled to execute the corresponding robot behavior according to that instruction.
According to the technical scheme provided by the embodiment of the application, the voice data uttered by the user, the operation instructions sent by the application terminal, and the robot touch parameters can each be detected to obtain, respectively, robot voice operation information, robot application operation information, and robot touch operation information; corresponding behavior control instructions are then generated from each, so that the robot executes behaviors corresponding to the user's voice operations, application operations, and touch inputs, giving the user a good interactive experience.
In addition, to facilitate interaction between the user and the robot, let the user freely control the robot to execute specific functions, and improve the user experience, an "intelligent programming mode" may be provided in this embodiment, as shown in fig. 3. In the intelligent programming mode, the user can program the robot himself, freely controlling it to execute corresponding behaviors according to his intention. The feedback control method for robot behavior provided by the embodiment shown in fig. 3 builds on the embodiment shown in fig. 2 and, in addition to the steps shown in fig. 2, includes the following steps:
s210: and setting a compiling instruction library storing preset compiling instructions.
In this step, preset compiling instructions are stored in the compiling instruction library. If the user wants to program the robot's operation, he can program freely using the preset compiling instructions in the library directly, so that the robot executes behaviors according to his custom operations. The compiling instruction library comprises preset compiling instructions in text, graphic, or other forms; the user can define a custom robot operation program directly from the preset compiling instructions on the application terminal, and the corresponding behavior control instruction is then generated from the user-defined robot operation program, so that the robot is controlled to realize the corresponding robot behaviors. Specific robot behaviors include functions such as host control, logic operation, and variable operation.
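A compiling instruction library restricted to preset instructions can be sketched as a validation step over a user-assembled program. The instruction names below are illustrative assumptions, not the actual instruction set of the embodiment.

```python
# Sketch of a compiling-instruction library: user programs may only be
# assembled from preset instructions. Instruction names are hypothetical.

COMPILE_LIBRARY = {"forward", "backward", "turn_left", "turn_right", "transform"}

def validate_program(program):
    """Accept a user-defined program only if every step is a preset instruction."""
    unknown = [step for step in program if step not in COMPILE_LIBRARY]
    if unknown:
        raise ValueError(f"unknown instructions: {unknown}")
    return list(program)
```

Restricting programs to a preset library is what lets the terminal safely translate any user-defined program into behavior control instructions.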
Correspondingly, in this embodiment, step S112 in the embodiment shown in fig. 2, acquiring the robot application operation information sent by the robot application terminal, comprises:
s220: and acquiring a robot operating program sent by the robot application terminal, wherein the robot operating program is generated by compiling a user according to a preset compiling instruction in the compiling instruction library.
In this step, the preset compiling instructions in the compiling instruction library are in text, graphic, or other forms, so the user can generate a custom robot operation program simply by dragging the corresponding graphics; the robot is then controlled to execute the robot behavior corresponding to the robot operation program, and the operation process is simple and efficient.
Correspondingly, the step of generating a behavior control instruction corresponding to the robot application operation information according to the robot application operation information comprises:
S230: generating a corresponding behavior control instruction according to the robot operating program, so as to control the robot to execute the robot behavior corresponding to the robot operating program.
In this step, since the robot operating program is generated by user-defined compilation, the robot, by executing the behavior control instruction corresponding to the robot operating program, can be controlled to perform highly difficult actions and realize specific functions.
For example, in the intelligent programming mode, a corresponding robot operating program can be generated from text content entered by the user, and a corresponding behavior control instruction is then generated, so that the robot realizes functions such as host control, logic operations, and variable operations according to the behavior control instruction.
Specifically, in the intelligent programming mode, the user can implement a "march, then deform" function by programming it himself. As shown in fig. 4, the control device and the application terminal interact in an App interaction mode. After it is detected that the user has entered the App interaction mode, the specific sub-mode the user wants to enter is detected; if it is detected that the user has clicked into the operation interface of the intelligent programming mode, a corresponding program editing area is displayed, and the application terminal detects the robot operating program that the user edits in this area. Specifically, the application terminal enters a preset programming interaction mode that produces corresponding interaction responses according to the "program content" entered by the user. In this preset programming interaction mode, suppose the user writes a program in which the robot first moves forward and then deforms: when the program runs, the instruction blocks under the program start block are executed in order. The user drags a "forward" block out of the operation instruction area and connects it below the program start block, then drags a "deform" block out and connects it below the "forward" block. Finally, when it is detected that the user clicks the execute key, the intelligent programming unit in the application terminal sends the "forward, then deform" robot operating program to the corresponding control device through the Bluetooth module; the information processing module in the control device executes the program, and the behavior control module then controls the robot to complete the "forward, then deform" action, that is, each movable part of the robot first moves forward and then deforms.
The movable parts comprise the hardware structures on the robot that can move, such as the limbs or joints.
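The "forward, then deform" execution flow above can be sketched as follows. The transport (a plain callable standing in for the Bluetooth module) and the block names are assumptions for illustration:

```python
# Illustrative sketch of executing a user-assembled program: blocks under
# the program-start block run in order, and each block is translated into
# one behavior control instruction sent to the control device.
def run_program(blocks, send):
    """Execute the blocks below 'program_start' in order, emitting one
    control instruction per block via the supplied send() callable."""
    if not blocks or blocks[0] != "program_start":
        raise ValueError("program must begin with the program start block")
    for block in blocks[1:]:
        send({"instruction": block})

# Stand-in for the Bluetooth link: instructions are collected in a list.
sent = []
run_program(["program_start", "forward", "deform"], sent.append)
```

On the real device, `send` would hand each instruction to the information processing module, which drives the behavior control module in turn.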
It should be noted that the intelligent programming unit in the intelligent programming mode includes a compiling area, a special-move programming module, a control module, a logic module, a variable module, an action module, a sound module, and a light module. The robot in this application is a programmable competitive robot capable of fighting functions and special-effect functions; accordingly, the special-move programming module can compile fighting "special moves" written by the user, thereby improving the interaction effect of the robot.
The robot can be programmed in a user-defined manner through the intelligent programming unit and the Bluetooth module in the application terminal. Because the preset compiling instruction library comprises a plurality of modules and instruction icons in various forms, each module in the intelligent programming unit has corresponding instruction icons, and the user can drag an instruction icon into the compiling area to be compiled and executed. For example, the instruction icons in the control module include icons such as "program start" and "touch start"; the instruction icons in the logic module include icons such as "if" and "loop". The user can also save a compiled program into the compiling instruction library. For example, in the special-move programming module, the user can write a corresponding operating program; after writing is completed, the program can be stored in the compiling instruction library and renamed, so that it can be used directly as a special move the next time the robot is controlled in an online battle or free fight. In the special-move programming mode, the data can be output through the Bluetooth system and downloaded to a storage unit of the robot's processor, where it can be reused on other occasions such as special effects, remote control, or online battles. In addition, the user can compile actions, sounds, and lights according to his own imagination, and each instruction sequence can be modified and deleted during compilation so as to realize specific functions.
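Saving a compiled program back into the library under a user-chosen name, as described above, can be sketched like this. The class and names are illustrative assumptions:

```python
# Hedged sketch of storing a user-compiled program in the compiling
# instruction library so it can be reused later as a named "special move".
class InstructionLibrary:
    def __init__(self):
        self.saved = {}          # name -> saved block sequence

    def save(self, name, program):
        """Store (or rename/overwrite) a compiled program under a name."""
        self.saved[name] = list(program)

    def load(self, name):
        """Retrieve a saved program for reuse, e.g. in an online battle."""
        return self.saved[name]

lib = InstructionLibrary()
lib.save("spin_attack", ["forward", "deform", "forward"])
```

On the real robot, the saved sequence would also be pushed over Bluetooth into the processor's storage unit for later use.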
In addition, the robot posture is an important robot running state. To achieve feedback control of robot behavior, such as controlling robot "deformation", detection of the robot posture is essential. In order to detect the robot posture and adjust the robot behavior accordingly, the present application further provides a feedback control method of robot behavior, as shown in fig. 5. In the embodiment shown in fig. 5, the step in the embodiment shown in fig. 1 of detecting in real time the running state of the robot while it executes the robot behavior, and generating state parameter information according to the running state, comprises:
S310: detecting the current posture of the robot in real time, and obtaining posture parameter information according to the current posture.
S320: generating a corresponding posture feedback adjustment instruction according to the posture parameter information and the robot operation information.
S330: adjusting the robot action in real time according to the posture feedback adjustment instruction.
By detecting the current posture of the robot, the current state of the robot can be determined, such as "human shape" or "vehicle shape", "fallen" or "standing". Specifically, the current action mode of the robot is determined by detecting the form the robot has currently changed into, such as vehicle shape or human shape, and the position posture of the robot, such as fallen or standing. Posture parameter information is then generated from the detected current posture and sent to the associated information processing module; the information processing module generates a corresponding posture feedback adjustment instruction according to the posture parameter information and the original robot operation information, and the corresponding motion control module of the robot is then controlled according to this instruction so that each movable part of the robot completes the adjustment action appropriate to the robot's posture.
The current posture of the robot can be sensed by potentiometers arranged on the robot body. For example, two independent potentiometers can be arranged on the robot; each angle on a potentiometer has a corresponding AD value, so each action and its corresponding posture also have corresponding AD values. By sampling the AD values from the robot's potentiometers, it can be confirmed and corrected whether the robot is in the human-shaped posture at the current moment. In addition, a fall detection sensor can be arranged on the robot body and connected to the Bluetooth system; by collecting the signal sent by this sensor, it can be determined whether the robot is in the "fallen" posture. When the robot is determined to be standing or fallen at the current moment, the posture adjustment parameters corresponding to that posture are matched according to the corresponding AD values and sent to the information processing module, which generates a corresponding posture feedback adjustment instruction and sends it to the motion control module; the motion control module then controls the moving parts of the robot to perform the corresponding feedback. The feedback forms include sound, motion, expression, and so on.
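The mapping from potentiometer AD samples to a posture can be sketched as follows. The AD windows below are invented for illustration; a real robot would calibrate them per joint:

```python
# Sketch of classifying the robot's current posture from the AD values of
# two potentiometers. The numeric ranges are illustrative assumptions.
HUMANOID_RANGE = (100, 400)   # assumed AD window for the "human shape"
VEHICLE_RANGE = (600, 900)    # assumed AD window for the "vehicle shape"

def classify_posture(ad_values):
    """Return 'humanoid', 'vehicle', or 'unknown' from potentiometer samples."""
    def in_range(v, lo_hi):
        return lo_hi[0] <= v <= lo_hi[1]
    if all(in_range(v, HUMANOID_RANGE) for v in ad_values):
        return "humanoid"
    if all(in_range(v, VEHICLE_RANGE) for v in ad_values):
        return "vehicle"
    return "unknown"
```

An "unknown" result would correspond to a mid-deformation or faulted state, for which a posture feedback adjustment instruction would be generated.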
In the technical solution provided by this embodiment, the robot posture is one of the running states of the robot, namely the robot's action at a given moment. By acquiring the robot posture in real time, the robot's current action can be feedback-adjusted according to the posture and the action the robot is intended to reach, as contained in the robot operation information, so that the robot smoothly and accurately executes the predetermined action corresponding to the user's operation information.
In addition, an embodiment of the present application further provides a feedback control method for robot behavior, which comprises the following steps:
controlling the audio playing device of the robot, according to an audio operation instruction in the behavior control instruction, to play the audio information corresponding to the audio operation instruction;
controlling the display device of the robot, according to a display operation instruction in the behavior control instruction, to display the information corresponding to the display operation instruction;
and adjusting the robot action in real time according to the posture feedback adjustment instruction.
By controlling the robot to play audio information, display information, and the like, the interaction between the user and the environment can be enhanced and the entertainment value of the robot improved, for example realizing the robot's educational function for the user.
Specifically, after a sound emitted by the user is received, a sound feature value in the user's voice or in the environmental sound can be extracted by a specific extraction algorithm, and the extracted feature value is matched against the voice control instructions in a preset voice pattern library, so that the voice operation information in the user's sound is identified. After the voice module identifies the voice operation information, a behavior control instruction corresponding to it is generated and sent to each behavior control module, for example the motion control module, the display module, or the audio output module. After each module receives its behavior control instruction, it matches interaction feedback data appropriate to the instruction and the robot, and controls the robot to complete the corresponding feedback according to the interaction feedback data, such as making the robot emit sound, display corresponding information, or adjust its action.
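The match-and-dispatch step above can be sketched in a few lines. The feature names, module names, and pattern table are illustrative assumptions, not the patent's actual voice pattern library:

```python
# Minimal sketch of matching a recognized sound feature against a preset
# voice pattern library and routing the resulting behavior control
# instruction to the appropriate module.
VOICE_PATTERNS = {
    "deform": "motion",    # motion control module
    "dance":  "motion",
    "sing":   "audio",     # audio output module
    "smile":  "display",   # display module
}

def dispatch_voice(feature):
    """Match a recognized feature to a module; None when nothing matches."""
    module = VOICE_PATTERNS.get(feature)
    if module is None:
        return None          # unrecognized sound: no behavior generated
    return {"module": module, "instruction": feature}
```

Returning `None` for an unmatched feature mirrors the behavior of ignoring sounds that do not correspond to any preset voice control instruction.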
With the above technical solution, control operations based on external environmental voice data and on the user are realized: the environmental voice data is automatically detected and recognized, a behavior control instruction is generated, corresponding interaction feedback data is automatically matched according to the behavior control instruction and the robot's current state parameters, and the programmable deformable competitive robot is controlled according to the interaction feedback data to complete the corresponding interaction feedback actions. This improves the interaction between the robot, the user, and the environment, enhances the robot's entertainment value, and realizes its educational effect for the user.
Based on the same concept as the method embodiments, an embodiment of the invention further provides a feedback control system for robot behavior, which is used for implementing the method described above.
Referring to fig. 6, fig. 6 is a schematic diagram of a hardware architecture operating environment according to an embodiment of the present invention. The embodiment of the invention provides a feedback control system for robot behavior. As shown in fig. 6, the feedback control system for robot behavior includes:
A power supply system 001. The power supply system 001 can be a lithium battery or a power management chip.
A Bluetooth system 002. The Bluetooth system 002 can be a Bluetooth chip, and it enables communication between the whole feedback control system and external devices, such as between the feedback control system and the application terminal, as well as information interaction between the feedback control system and the robot.
A processor system 003. The processor system 003 can be a micro control unit (MCU).
A 2.4G system 004, such as a 2.4G-family chip, which can be communicatively coupled with the processor system 003. In particular, communication between the processor system 003 and an external remote control is enabled through the 2.4G system. Upon receiving a 2.4G signal or code value transmitted by the user via the external remote controller, the 2.4G system 004 generates a remote control command from the 2.4G signal or code value and transmits it to the processor system 003. For example, the robot has competitive fighting, performance, and other functions; during an online battle or a multi-robot performance, the 2.4G system needs to send 2.4G signals or code values to the external remote controller to realize the interaction.
The Bluetooth system 002 is responsible for communication and includes a touch detection signal input device 0021, a balance detection signal input device 0022, an audio signal output device 0023, and an RGB_Led display signal output device 0024, wherein the balance detection signal input device 0022 is used for inputting the posture parameters of the robot.
The processor system 003 includes a posture detecting device 0031, a voice recognizing device 0032, and a motion controlling device 0033, wherein the motion controlling device 0033 includes an integrated steering engine control device 00331 and a direct current motor control device 00332. The 2.4G system 004 includes a remote control communication device 0041.
The Bluetooth system 002 and the processor system 003 can communicate with each other, and the 2.4G system 004 and the processor system 003 can communicate with each other.
The processor system 003 further includes a memory. A feedback control program of robot behavior according to the foregoing embodiments of the present application is stored in the memory and can be executed on the processor system 003; when executed by the processor system 003, the feedback control program implements the steps of the feedback control method of robot behavior described in any one of the above technical solutions. The modules in the processor system 003 that implement the feedback control method for robot behavior are shown in figs. 7 to 10 and described below.
Referring to fig. 7, fig. 7 is a schematic structural diagram of a feedback control system for robot behavior according to an embodiment of the present disclosure. As shown in fig. 7, the feedback control system for robot behavior includes:
an operation detection module 110, configured to detect and obtain robot operation information;
an information processing module 120, configured to generate a corresponding behavior control instruction according to the robot operation information;
a behavior control module 140, configured to control the robot to execute the robot behavior corresponding to the robot operation information according to the behavior control instruction; and
a state feedback module 130, configured to detect in real time the running state of the robot while it executes the robot behavior, generate state parameter information according to the running state, and feed the state parameter information back to the information processing module.
The information processing module 120 is further configured to generate a behavior feedback adjustment instruction according to the robot operation information and the state parameter information, and the behavior control module 140 is further configured to adjust the robot behavior in real time according to the behavior feedback adjustment instruction.
In the technical solution provided by this embodiment, the operation detection module 110 detects the robot operation information, the information processing module 120 generates a corresponding behavior control instruction from it, and the behavior control module 140 controls the robot to execute the corresponding robot behavior according to that instruction. Meanwhile, the state feedback module 130 detects in real time the running state of the robot while it executes the behavior and generates state parameter information, the information processing module 120 generates a behavior feedback adjustment instruction according to the state parameter information, and the behavior control module 140 adjusts the robot behavior in a negative-feedback manner. Through this feedback control process, the running state of the robot can be determined in real time while it executes a behavior, and errors in execution can be corrected in real time, so that the robot smoothly and accurately completes the robot behavior corresponding to the robot operation information; this solves the prior-art problems of poor human-robot interaction and rigid behavior execution.
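The negative-feedback loop these modules form can be sketched abstractly. The numeric state, error threshold, and adjustment gain below are illustrative assumptions, not values from the patent:

```python
# Hedged sketch of the detect -> compare -> adjust loop: the behavior is
# adjusted whenever the detected running state drifts from the target
# implied by the robot operation information.
def feedback_loop(target, read_state, apply_adjust, steps=10, tol=0.5):
    """Repeatedly compare the detected state to the target and issue
    feedback adjustment instructions until within tolerance."""
    for _ in range(steps):
        state = read_state()          # state feedback module
        error = target - state        # information processing module
        if abs(error) <= tol:
            return True               # behavior completed smoothly
        apply_adjust(error)           # behavior control module adjusts
    return False

# Toy plant: a single angle that moves halfway toward the target each step.
state = {"angle": 0.0}
ok = feedback_loop(
    10.0,
    lambda: state["angle"],
    lambda err: state.__setitem__("angle", state["angle"] + 0.5 * err),
)
```

The loop terminates as soon as the detected state is within tolerance of the target, which is the "smooth and accurate completion" the text describes.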
As shown in fig. 8, as a preferred embodiment, the operation detection module 110 in the embodiment shown in fig. 7 includes a voice detection submodule 111, an application control submodule 112, and a touch detection submodule 113, each connected to the information processing module 120, wherein:
the voice detection submodule 111 is configured to obtain robot voice operation information by matching against preset voice features and send it to the information processing module 120, so that the information processing module 120 generates a corresponding behavior control instruction according to the robot voice operation information;
the application control submodule 112 is configured to detect the robot application operation information sent by the robot application terminal and send it to the information processing module 120, so that the information processing module 120 generates a corresponding behavior control instruction according to the robot application operation information; and
the touch detection submodule 113 is configured to detect and obtain robot touch operation information from the touch sensors on the robot and send it to the information processing module 120, so that the information processing module 120 generates a corresponding behavior control instruction according to the robot touch operation information.
In this technical solution, the voice detection submodule 111 acquires the voice data emitted by the user, the application control submodule 112 acquires the operation instruction sent by the application terminal, and the touch detection submodule 113 acquires the robot touch parameters. Robot voice operation information, robot application operation information, and robot touch operation information can thus each be obtained by the corresponding detection; corresponding behavior control instructions are generated from each, and the robot is controlled to execute the behaviors corresponding to the user's voice operation, application operation instruction, and touch parameters, giving the user a good interactive experience.
As shown in fig. 9, as a preferred embodiment, the application control submodule 112 shown in fig. 8 further includes a programming control unit 1121, which contains a compiling instruction library in which preset compiling instructions are stored. The programming control unit 1121 is configured to obtain the robot operating program that the user compiles on the application terminal 200 from the preset compiling instructions; the information processing module 120 is further configured to generate a corresponding behavior control instruction according to the robot operating program, so that the behavior control module 140 controls the robot to execute the robot behavior corresponding to the robot operating program.
In this technical solution, a compiling instruction library is set in the programming control unit 1121, and preset compiling instructions are stored in it. If a user needs to program the robot 300 to execute corresponding operations, the user can freely write a program on the robot application terminal 200 by directly using the compiling instructions in the library, so that the robot 300 executes corresponding behaviors according to the user-defined operations. The preset compiling instructions include text, graphic, and other forms, so the user can generate a robot operating program simply by dragging the corresponding graphics, after which the robot 300 is controlled to execute the robot behavior corresponding to the robot operation information; the operation process is simple and efficient. Since the robot operating program is generated by user-defined compilation, the robot 300, by executing the behavior control command corresponding to it, can be controlled to perform highly difficult motions and realize specific functions.
As a preferred embodiment, as shown in fig. 10, the state feedback module 130 in the embodiment shown in fig. 8 includes a posture detection submodule 131, configured to detect the current posture of the robot 300 in real time, obtain posture parameter information, and feed it back to the information processing module 120, so that the information processing module generates a corresponding posture feedback adjustment instruction according to the posture parameter information and the robot operation information.
Correspondingly, the behavior control module 140 in this embodiment includes:
an action control submodule 141, used for adjusting the robot action in real time according to the posture feedback adjustment instruction;
an audio control submodule 142, used for controlling the audio playing device of the robot to play the audio information corresponding to an audio operation instruction in the behavior control instruction; and
a display control submodule 143, configured to control the display device of the robot to display the information corresponding to a display operation instruction in the behavior control instruction.
In this technical solution, the posture detection submodule 131 obtains the robot's posture in real time, and the action control submodule 141 can feedback-adjust the current action of the robot according to the posture and the action the robot 300 is intended to reach as contained in the robot operation information, so that the robot 300 smoothly and accurately executes the predetermined action corresponding to the user's operation information. In addition, the audio control submodule 142 and the display control submodule 143 realize control operations based on external environment data and the user: the environment data is automatically detected and recognized, a behavior control command is generated, corresponding interaction feedback data is automatically matched according to the behavior control command and the current state parameters of the robot 300, and the robot 300 is controlled according to the interaction feedback data to complete corresponding actions such as emitting voice and displaying images. This improves the interaction between the robot, the user, and the environment, enhances the robot's entertainment and intelligence, and improves the user's experience.
The voice detection submodule 111 in the operation detection module 110 can also identify the volume of the sound emitted by the user, or of the environmental sound, using a sound-volume extraction algorithm, and generate volume-response interaction information accordingly; the information processing module 120 of the programmable deformable competitive robot can then generate a volume interaction control instruction, such as an instruction controlling the robot to deform, according to the volume-response interaction information, and control the action control submodule 141, the audio control submodule 142, or the display control submodule 143 to make the robot respond with interaction feedback.
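The volume-response path can be sketched as follows. The volume estimate (mean absolute amplitude) and the two loudness thresholds are illustrative assumptions; the patent does not specify the extraction algorithm:

```python
# Illustrative sketch of the volume-response path: estimate loudness from
# raw samples, then map loudness bands to an interaction instruction.
def volume_of(samples):
    """Crude volume estimate: mean absolute amplitude of the samples."""
    return sum(abs(s) for s in samples) / len(samples)

def volume_interaction(samples, loud=0.6, quiet=0.1):
    """Map a loudness estimate to a volume interaction control instruction."""
    v = volume_of(samples)
    if v >= loud:
        return "deform"       # a loud sound triggers a deformation response
    if v >= quiet:
        return "light_pulse"  # a moderate sound triggers a light effect
    return None               # too quiet: no feedback
```

The returned instruction name would then be routed to the action, audio, or display control submodule as described above.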
Specifically, the information processing module 120 may be a dual-core processor including a 32-bit MCU. For example, after the voice detection submodule 111 recognizes "deform" voice operation information in the voice uttered by the user, the recognized information is transmitted to the dual-core processor, which processes it to obtain a "deform" voice control command and transmits that command to the action control submodule 141 adapted to the "deform" action. After receiving the "deform" voice control instruction sent by the information processing module 120, the action control submodule 141 matches interaction feedback data according to the voice control instruction and the programmable deformable competitive robot, and controls the robot to complete the corresponding "deform" feedback according to the interaction feedback data.
Then, the posture detection submodule 131 can determine the current posture of the robot, such as "human shape" or "vehicle shape", "fallen" or "standing", by detecting the robot posture in real time; specifically, the current action mode of the robot is determined by detecting the form the robot has currently changed into, such as vehicle shape or human shape, and its fallen or standing posture. Posture parameter information is then generated from the detected current posture and sent to the information processing module 120, which generates a corresponding posture feedback adjustment instruction according to the posture parameter information and the original robot operation information; the behavior control module 140 of the robot then controls each movable part, according to the posture feedback adjustment instruction, to complete the adjustment action appropriate to the robot's posture, so that the robot smoothly completes the "deformation" behavior. Each moving and deforming part of the robot is a hardware structure that can move, such as a limb or joint.
In addition, in order to realize actions such as "deformation", in the feedback control system of robot behavior provided in the embodiment of the present invention, the behavior control module 140 may include an angle detection unit, a proportional-integral-derivative (PID) algorithm unit, and a motor execution unit. The angle detection unit is configured to detect the rotation angle of the structurally integrated steering engine built into the robot, the waist rotation angle, and the waist pull-rod angle; the PID algorithm unit is used for calculating the robot behavior through a PID algorithm according to the rotation angles; and the motor execution unit controls the robot, through the motors, to execute the corresponding actions according to that behavior.
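A minimal PID algorithm unit of the kind named above can be sketched as follows. The gains and time step are illustrative; real values would come from tuning the steering-engine hardware:

```python
# Sketch of the PID algorithm unit: computes a motor command from the
# difference between a target steering-engine angle and the detected angle.
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, target, measured, dt=0.02):
        """One control step: return the command for the motor execution unit."""
        error = target - measured
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

With proportional control only (ki = kd = 0), the command is simply proportional to the remaining angle error, which is often a reasonable starting point before adding the integral and derivative terms.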
Specifically, when the robot performs the "deformation" action, the information processing module 120 receives the external "deformation" operation information, generates a "deformation" control command, and sends it to the behavior control module 140. The angle detection unit in the behavior control module 140 determines whether the robot is currently in the "human shape" or the "vehicle shape", and the "deformation" behavior is performed by switching to the other state according to the current one, for example controlling the robot to change from the "human shape" state into the "vehicle shape" state.
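The "switch to the other state" decision can be sketched in two small functions. The angle threshold and form names are invented for illustration; a real robot would read calibrated steering-engine angles:

```python
# Hedged sketch of the deformation decision: the angle detection unit
# reports the current form, and the behavior control module targets the
# other form on a "deform" command.
def current_form(steering_angle):
    """Assume angles near 0 degrees mean 'humanoid' and near 90 'vehicle'."""
    return "humanoid" if steering_angle < 45 else "vehicle"

def deformation_target(steering_angle):
    """On a 'deform' command, the target is always the other form."""
    return "vehicle" if current_form(steering_angle) == "humanoid" else "humanoid"
```

The PID algorithm unit would then drive the steering engines from the current angle toward the angle set that realizes the target form.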
In addition, the present application further provides a computer-readable storage medium on which a feedback control program of robot behavior is stored; when executed by a processor, the program implements the steps of any one of the above feedback control methods of robot behavior.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It should be noted that in the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The usage of the words first, second, third, etc. does not indicate any ordering; these words may be interpreted as names.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the appended claims be interpreted as including the preferred embodiment and all such alterations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.