CN116214528A - Storage control method and control system for humanoid robot - Google Patents

Storage control method and control system for humanoid robot

Info

Publication number
CN116214528A
CN116214528A (application CN202310517300.6A)
Authority
CN
China
Prior art keywords
data
action
stored
scene
control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310517300.6A
Other languages
Chinese (zh)
Other versions
CN116214528B (en)
Inventor
李修录
朱小聪
尹善腾
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Axd Anxinda Memory Technology Co ltd
Original Assignee
Axd Anxinda Memory Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Axd Anxinda Memory Technology Co ltd filed Critical Axd Anxinda Memory Technology Co ltd
Priority claimed from CN202310517300.6A
Publication of CN116214528A
Application granted
Publication of CN116214528B
Legal status: Active

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/161 Programme controls characterised by the control system, structure, architecture: hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B25J9/1612 Programme controls characterised by the hand, wrist, grip control
    • B25J9/1628 Programme controls characterised by the control loop
    • B25J9/1661 Programme controls characterised by task planning, object-oriented languages
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The invention relates to the technical field of data processing, and in particular to a storage control method and a storage control system for a humanoid robot. The method comprises: acquiring data to be stored, the data comprising at least scene data and control data; dividing the data to be stored into a plurality of action data sets according to the part division of the humanoid robot; querying a preset standard action database and analyzing the action data sets to obtain action difference data; and converting the data to be stored into a control action set, converting the scene data into scene features, and storing the control action set, the action difference data and the scene features. By decomposing the data to be stored into a sequence of actions, determining the control difference between each action and its standard action, and storing only the control action set, the action difference data and the scene features as the basis for later recovering the data, the method greatly reduces the memory required for data storage and thereby lowers the cost of storage.

Description

Storage control method and control system for humanoid robot
Technical Field
The invention belongs to the technical field of data processing, and particularly relates to a storage control method and a storage control system for a humanoid robot.
Background
A humanoid robot is a robot designed to imitate the appearance and behavior of a human being, in particular one whose body resembles a human body; advances in robotics make it possible to build humanoid robots that are both functional and lifelike.
During use, however, the operation data of a humanoid robot are recorded for research purposes. If the data generated by every element of the humanoid robot were recorded, a very large volume of data would be produced, requiring a correspondingly large amount of memory.
Disclosure of Invention
An embodiment of the invention provides a storage control method for a humanoid robot, which aims to solve the problem that recording the data generated by all elements of a humanoid robot produces a large amount of data and occupies a large amount of memory.
The embodiment of the invention is realized in such a way that a storage control method of a humanoid robot comprises the following steps:
acquiring data to be stored, wherein the data to be stored at least comprises scene data and control data;
dividing data to be stored into a plurality of action data sets according to the part division of the humanoid robot, wherein the action data sets comprise independent action data packets and linkage action data packets;
inquiring a preset standard action database based on the action data set, and analyzing the action data set to obtain action difference data;
converting the data to be stored into a control action set, converting the scene data into scene characteristics, and storing the control action set, the action difference data and the scene characteristics.
Preferably, the step of dividing the data to be stored into a plurality of action data sets according to the division of the parts of the humanoid robot specifically includes:
dividing the humanoid robot into a plurality of independent control areas according to the functional areas, wherein the independent control areas are matched with each other to complete corresponding actions;
identifying action content contained in the data to be stored, and dividing the data to be stored according to the action content to obtain a plurality of action data blocks;
and determining the action contained in each action data block, and dividing the action into an independent action data packet and a linkage action data packet.
Preferably, the step of querying a preset standard action database based on the action data set to analyze the action data set to obtain action difference data specifically includes:
searching a standard action database according to the actions contained in the action data set, to find matching standard action data;
extracting standard response parameters corresponding to the standard action data, and extracting real-time response parameters of the action data set;
and comparing the standard response parameters with the real-time response parameters, and taking the difference value as action difference data.
Preferably, the step of converting the data to be stored into a control action set, converting the scene data into scene features, and storing the control action set, the action difference data and the scene features specifically includes:
identifying data to be stored, determining the sequence of all actions contained in the data to obtain a control action set, wherein the control action set at least comprises the execution time of each action;
extracting scene data corresponding to different actions, and recording judging conditions required by the corresponding actions to obtain scene characteristics;
scene characteristics and action difference data corresponding to each action are determined and stored together with a control action set.
Preferably, the scene data includes interactive images, interactive voices and interactive instructions.
Preferably, when the stored data is needed, the corresponding scene features and action difference data are looked up according to the control action set and imported into the corresponding robot model for operation, thereby recovering the data to be stored.
Another object of an embodiment of the present invention is to provide a storage control system for a humanoid robot, the system including:
the data acquisition module is used for acquiring data to be stored, wherein the data to be stored at least comprises scene data and control data;
the action dividing module is used for dividing the data to be stored into a plurality of action data sets according to the part division of the humanoid robot, wherein the action data sets comprise independent action data packets and linkage action data packets;
the action analysis module is used for inquiring a preset standard action database based on the action data set, analyzing the action data set and obtaining action difference data;
the data storage module is used for converting the data to be stored into a control action set, converting the scene data into scene characteristics and storing the control action set, the action difference data and the scene characteristics.
Preferably, the action dividing module includes:
the robot partition unit is used for dividing the humanoid robot into a plurality of independent control areas according to the functional areas, and the independent control areas are matched to complete corresponding actions;
the action recognition unit is used for recognizing action content contained in the data to be stored, and dividing the data to be stored according to the action content to obtain a plurality of action data blocks;
and the data dividing unit is used for determining the action contained in each action data block and dividing the action into an independent action data packet and a linkage action data packet.
Preferably, the action analysis module includes:
the data matching unit is used for searching the standard action database according to the actions contained in the action data set, to find matching standard action data;
the parameter extraction unit is used for extracting standard response parameters corresponding to the standard action data and extracting real-time response parameters of the action data set;
and the data comparison unit is used for comparing the standard response parameters with the real-time response parameters, and taking the difference value as action difference data.
Preferably, the data storage module includes:
the action set identification unit is used for identifying the data to be stored, determining the sequence of all actions contained in the data to obtain a control action set, wherein the control action set at least comprises the execution time of each action;
the scene feature extraction unit is used for extracting scene data corresponding to different actions and recording judgment conditions required by the corresponding actions to obtain scene features;
and the data decomposition storage unit is used for determining scene characteristics and action difference data corresponding to each action and storing the scene characteristics and the action difference data and the control action set.
According to the storage control method for the humanoid robot, the data to be stored is analyzed and converted into the plurality of actions, the control difference between each action and the standard action is determined, the control action set, the action difference data and the scene characteristics are used as the basis for recovering the data to be stored for storage, so that the memory required by data storage is greatly reduced, and the cost of data storage is reduced.
Drawings
FIG. 1 is a flow chart of a storage control method for a humanoid robot provided by an embodiment of the invention;
FIG. 2 is a flowchart illustrating steps for dividing data to be stored into a plurality of action data sets according to the position division of the humanoid robot according to the embodiment of the present invention;
FIG. 3 is a flowchart of a step of querying a preset standard action database based on an action data set to analyze the action data set to obtain action difference data according to an embodiment of the present invention;
FIG. 4 is a flowchart illustrating steps for converting data to be stored into a control action set, converting scene data into scene features, and storing the control action set, action difference data and scene features according to an embodiment of the present invention;
fig. 5 is a schematic diagram of a storage control system of a humanoid robot according to an embodiment of the present invention;
FIG. 6 is a block diagram of an action partitioning module according to an embodiment of the present invention;
FIG. 7 is a block diagram of an action analysis module according to an embodiment of the present invention;
fig. 8 is a schematic diagram of a data storage module according to an embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
As shown in fig. 1, a flowchart of a storage control method for a humanoid robot according to an embodiment of the present invention is provided, where the method includes:
s100, obtaining data to be stored, wherein the data to be stored at least comprises scene data and control data.
In this step, data to be stored is obtained. The data to be stored is the data generated while the humanoid robot operates; the humanoid robot may, for example, be a greeting robot. The generated data include values such as motor rotation angles and telescopic-cylinder extension lengths, together with control data and scene data. The control data are the instruction data that drive the operation of each part of the humanoid robot, such as "rotate motor A" or "extend telescopic cylinder B". The scene data are the trigger events for each action: for example, when the robot's image acquisition unit detects a person entering, the robot is controlled to wave its arm, forming a greeting action. The image captured by the image acquisition unit is the scene data, and the instructions that control the greeting action are the control data.
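As a sketch of the composition described above, the data to be stored can be modeled as a record combining scene data and control data. The type and field names below are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class SceneData:
    """Trigger context for an action, e.g. a camera frame showing a guest entering."""
    interactive_image: bytes = b""
    interactive_voice: bytes = b""
    interactive_instruction: str = ""


@dataclass
class ControlCommand:
    """One instruction to a single drive unit, e.g. 'rotate motor A by 20 degrees'."""
    part: str         # hypothetical part identifier, e.g. "upper_arm_motor"
    parameter: float  # e.g. a rotation angle or a cylinder extension length


@dataclass
class StorageRecord:
    """Data to be stored: scene data plus the control data it triggered."""
    scene: SceneData
    commands: List[ControlCommand] = field(default_factory=list)


record = StorageRecord(
    scene=SceneData(interactive_instruction="guest_detected"),
    commands=[ControlCommand("upper_arm_motor", 20.0)],
)
```

A record like this is what the later steps decompose into actions and scene features.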
S200, dividing data to be stored into a plurality of action data sets according to the part division of the humanoid robot, wherein the action data sets comprise independent action data packets and linkage action data packets.
In this step, the data to be stored is divided into a plurality of action data sets according to the part division of the humanoid robot. A humanoid robot, as its name suggests, resembles a human body, with limbs and facial organs. A greeting robot, for example, retains arms; an arm consists of several parts that can move relative to one another, each driven by an internal motor or other drive structure. Every part therefore has an independent drive unit and can complete independent actions, such as raising or lowering the upper-arm part. Any action of the humanoid robot can thus be treated as a set of part-level actions: a greeting action, for instance, includes raising the upper arm, retracting the forearm and turning the palm over, so every action is realized by a combination of part movements. The action data set comprises independent action data packets and linkage action data packets: an independent action data packet contains all the data generated by an action completed by a single robot part, such as turning the wrist over, while a linkage action data packet contains the data of an action that requires several parts to cooperate, such as a hand wave.
S300, inquiring a preset standard action database based on the action data set, and analyzing the action data set to obtain action difference data.
In this step, a preset standard action database is queried based on the action data set. The standard action database holds a number of preset actions, including independent actions completed by a single component of the humanoid robot and joint actions completed by several components, and records the instructions for executing each action; completing independent action A, for example, may require the corresponding component to execute ten instructions with parameters A1 to A10. In actual operation the required actions are not identical to the standard ones: if the standard action raises the upper arm by 20 degrees but the action actually to be executed raises it by 15 degrees, the parameters of the instructions needed to complete the action necessarily differ. The two are compared and the difference values recorded, yielding the action difference data.
S400, converting the data to be stored into a control action set, converting the scene data into scene characteristics, and storing the control action set, the action difference data and the scene characteristics.
In this step, the data to be stored is converted into a control action set. All actions of the humanoid robot are completed through its components, so the actions can be ordered chronologically to form an action time sequence, which is the control action set. The scene data is then converted into scene features, a process that discards useless data and extracts the key data. Finally, the control action set, the action difference data and the scene features are stored. To guarantee the accuracy of the stored data, the corresponding control action set, action difference data and scene features are retrieved; the actions completed by the robot are determined from the control action set; the standard action database is queried for the corresponding standard actions, which are adjusted according to the action difference data to form preliminary recovery data. The scene features are imported into a preset robot model and a simulation is run to obtain an output result, which is checked against the preliminary recovery data to judge its validity. If the preliminary recovery data is valid, the control action set, action difference data and scene features replace the data to be stored; otherwise, the data to be stored is stored directly.
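The store-then-verify procedure above (restore candidate data from the compact form, simulate, and fall back to storing the raw data when the check fails) can be sketched as follows; all names, data shapes and the `simulate` hook are hypothetical:

```python
def restore(action_set, diff_data, standard_db):
    """Rebuild candidate data: look up each standard action and apply its recorded difference."""
    restored = []
    for action, diff in zip(action_set, diff_data):
        standard = standard_db[action]  # standard response parameters for this action
        restored.append([p + d for p, d in zip(standard, diff)])
    return restored


def store(raw_data, action_set, diff_data, scene_features, standard_db, simulate):
    """Keep the compact form only if it reproduces the raw data; otherwise store raw data."""
    candidate = restore(action_set, diff_data, standard_db)
    if simulate(scene_features, candidate) == raw_data:  # validity check via robot model
        return {"actions": action_set, "diffs": diff_data, "scene": scene_features}
    return {"raw": raw_data}


# Toy run: the standard action raises the arm 20 degrees, the real action only 15.
standard_db = {"raise_arm": [20.0, 5.0]}
raw = [[15.0, 5.0]]
result = store(raw, ["raise_arm"], [[-5.0, 0.0]], ["guest_detected"],
               standard_db, simulate=lambda scene, data: data)
```

Here the identity `simulate` stands in for the preset robot model; because restoration reproduces the raw data, the compact form is kept.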
As shown in fig. 2, as a preferred embodiment of the present invention, the step of dividing the data to be stored into a plurality of action data sets according to the division of the parts of the humanoid robot specifically includes:
s201, dividing the humanoid robot into a plurality of independent control areas according to the functional areas, and completing corresponding actions among the independent control areas through cooperation.
In this step, the humanoid robot is divided into a plurality of independent control areas according to its functional areas. An independent control area is a robot component to which individual instructions can be addressed, such as the wrist, hand, forearm or upper arm, and each independent control area can complete independent instructions, such as raising the forearm.
S202, identifying action content contained in the data to be stored, and dividing the data to be stored according to the action content to obtain a plurality of action data blocks.
In this step, the action content contained in the data to be stored is identified. The behavior of the humanoid robot is composed of preset actions, such as a greeting action or a bowing action, and each preset action is implemented by movements of several independent control areas. The data to be stored is therefore divided according to these preset actions, yielding a plurality of action data blocks.
S203, determining the actions contained in each action data block, and dividing the actions into independent action data packets and linkage action data packets.
In this step, the actions contained in each action data block are determined. Different actions are completed by different independent control areas, so the blocks are divided into independent action data packets and linkage action data packets according to the number of independent control areas involved.
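The classification rule of S203 — an action data block touching a single independent control area becomes an independent action data packet, one touching several areas becomes a linkage action data packet — might look like this (data shapes assumed):

```python
def classify_action_blocks(action_blocks):
    """Split action data blocks by how many independent control areas each involves."""
    independent, linkage = [], []
    for block in action_blocks:
        areas = {cmd["area"] for cmd in block["commands"]}
        if len(areas) == 1:
            independent.append(block)  # e.g. turning the wrist over alone
        else:
            linkage.append(block)      # e.g. a hand wave using arm and forearm together
    return independent, linkage


blocks = [
    {"name": "turn_wrist", "commands": [{"area": "wrist", "angle": 30}]},
    {"name": "wave_hand", "commands": [{"area": "upper_arm", "angle": 20},
                                       {"area": "forearm", "angle": -10}]},
]
independent, linkage = classify_action_blocks(blocks)
```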
As shown in fig. 3, as a preferred embodiment of the present invention, the step of querying a preset standard action database based on an action data set, and analyzing the action data set to obtain action difference data specifically includes:
s301, searching a standard action database according to actions contained in the action data set, and searching standard action data matched with the standard action database.
In this step, the standard action database is searched according to the actions contained in the action data set. Before use, the standard action database is configured according to the preset actions that the humanoid robot can perform. If a preset action has several execution modes — a bow, for example, can be performed at different bending angles such as 90 degrees or 120 degrees — one mode is chosen at random as the standard action and the control instructions it requires are stored in the standard action database. During the search, the action being executed by the robot is determined and the corresponding standard action is identified, yielding the standard action data.
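The configuration step described above — choosing one execution mode at random as the standard action for each preset action — can be sketched as follows (the structure of the variants is assumed):

```python
import random


def build_standard_db(preset_actions):
    """Pick one execution variant per preset action as the canonical standard action."""
    db = {}
    for name, variants in preset_actions.items():
        db[name] = random.choice(variants)  # e.g. one bow angle among several
    return db


preset_actions = {
    "bow": [{"angle": 90, "commands": [45.0, 45.0]},
            {"angle": 120, "commands": [60.0, 60.0]}],
}
standard_db = build_standard_db(preset_actions)
standard = standard_db["bow"]  # later used as the comparison baseline
```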
S302, extracting standard response parameters corresponding to the standard action data, and extracting real-time response parameters of the action data set.
In this step, the standard response parameters corresponding to the standard action data are extracted: the standard action data records the instructions that complete the standard action, such as motor rotation angles and rotation speeds, and these parameters are extracted. The real-time response parameters are, likewise, the parameters of the instructions actually executed.
S303, comparing the standard response parameter with the real-time response parameter, and taking the difference value as action difference data.
In this step, the standard response parameters are compared with the real-time response parameters. Because the executed action differs somewhat from the standard one — a 100-degree bow instead of a 90-degree bow, for example — the motor control parameters of the two differ, and comparing them yields the action difference data. The difference can be represented directly as signed data: if a standard response parameter is 50 and the corresponding real-time response parameter is 30, the action difference is recorded as -20.
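The signed-difference encoding of S303 (a standard response parameter of 50 and a real-time parameter of 30 recorded as -20) can be written directly:

```python
def action_difference(standard_params, realtime_params):
    """Record each real-time response parameter as a signed offset from the standard."""
    return [rt - std for std, rt in zip(standard_params, realtime_params)]


diff = action_difference([50.0], [30.0])  # -> [-20.0]
```

Storing only these offsets, rather than the full instruction parameters, is what makes the compact representation possible.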
As shown in fig. 4, as a preferred embodiment of the present invention, the steps of converting data to be stored into a control action set, converting scene data into scene features, and storing the control action set, the action difference data and the scene features specifically include:
s401, identifying data to be stored, determining the sequence of all actions contained in the data to obtain a control action set, wherein the control action set at least comprises the execution time of each action.
In this step, the data to be stored is identified. The same action, once processed by the humanoid robot, may be produced by several execution orders with the same final result: if preset action a involves two independent control areas b and c, it can be realized by controlling b before c or c before b. Recording the order in which the actions were executed forms the control action set.
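Building the control action set then amounts to ordering the executed actions by their execution time, which the set must at least contain. A minimal sketch, with assumed field names:

```python
def build_control_action_set(executed_actions):
    """Order executed actions chronologically; each entry keeps its execution time."""
    return sorted(executed_actions, key=lambda a: a["time"])


# Preset action a realized by controlling area c after area b (or vice versa);
# the recorded order is whatever actually happened.
executed = [
    {"action": "control_c", "time": 2.0},
    {"action": "control_b", "time": 1.0},
]
control_action_set = build_control_action_set(executed)
```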
S402, extracting scene data corresponding to different actions, and recording judging conditions required by the corresponding actions to obtain scene characteristics.
In this step, the scene data corresponding to the different actions are extracted. When the humanoid robot performs an action, it does so based on real-time scene analysis: if the image acquisition equipment recognizes that a person has entered, the corresponding action is triggered. The judgment conditions are, for example, that a picture change is detected, the picture is fed into a face recognition model and a person is detected, and at the same time the infrared detection device detects a heat source; only then is it judged that a person has entered and the corresponding action executed. The data underlying these judgment conditions — the face image, the picture change rate, the number of infrared image pixels, the person's position and so on — are extracted as the scene features.
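A sketch of the judgment-condition check and feature extraction described above; the field names and thresholds are illustrative assumptions:

```python
def extract_scene_features(scene):
    """Keep only the data behind the judgment conditions, discarding the rest of the scene."""
    return {
        "frame_change_rate": scene["frame_change_rate"],
        "face_detected": scene["face_detected"],
        "infrared_pixels": scene["infrared_pixels"],
        "person_position": scene["person_position"],
    }


def person_entered(features):
    """Judgment condition: the picture changed, a face was found, and a heat source is present."""
    return (features["frame_change_rate"] > 0.1
            and features["face_detected"]
            and features["infrared_pixels"] > 0)


scene = {"frame_change_rate": 0.4, "face_detected": True,
         "infrared_pixels": 1200, "person_position": (3.0, 1.5),
         "raw_frame": b"..."}  # raw frame is the "useless data" that is dropped
features = extract_scene_features(scene)
```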
S403, determining scene characteristics and action difference data corresponding to each action, and storing the scene characteristics and the action difference data and a control action set.
In this step, the scene features and action difference data corresponding to each action are determined and stored. For data verification, the corresponding control action set, action difference data and scene features are retrieved; the actions completed by the humanoid robot are determined from the control action set; the corresponding standard action database is queried to determine the standard actions, which are adjusted according to the action difference data to form preliminary recovery data. The scene features are imported into a preset robot model and a simulation is run to obtain an output result, which is checked against the preliminary recovery data to judge its validity. If the preliminary recovery data is valid, the control action set, action difference data and scene features replace the data to be stored; otherwise, the data to be stored is stored directly.
As shown in fig. 5, a storage control system for a humanoid robot according to an embodiment of the present invention includes:
the data acquisition module 100 is configured to acquire data to be stored, where the data to be stored includes at least scene data and control data.
In the system, the data acquisition module 100 acquires the data to be stored, i.e. the data generated while the humanoid robot (for example a greeting robot) operates, such as motor rotation angles and telescopic-cylinder extension lengths, together with control data and scene data. The control data are the instruction data that drive each part of the robot, such as rotating a motor or extending a telescopic cylinder; the scene data are the trigger events for each action, such as the image in which the robot's image acquisition unit detects a person entering, whereupon the robot is controlled to wave its arm in a greeting action.
The action dividing module 200 is configured to divide data to be stored into a plurality of action data sets according to the division of the part of the humanoid robot, where the action data sets include independent action data packets and linkage action data packets.
In the system, the action dividing module 200 divides the data to be stored into a plurality of action data sets according to the part division of the humanoid robot. Each part of the robot has its own drive unit and can complete independent actions, such as raising or lowering the upper-arm part, so every action of the robot is realized by a combination of part movements: a greeting action, for instance, includes raising the upper arm, retracting the forearm and turning the palm over. The action data sets comprise independent action data packets, containing all the data of an action completed by a single part, such as turning the wrist over, and linkage action data packets, containing the data of actions that require several parts to cooperate, such as a hand wave.
The action analysis module 300 is configured to query a preset standard action database based on the action data set, and analyze the action data set to obtain action difference data.
In the system, the action analysis module 300 queries a preset standard action database based on the action data set. The database holds preset independent actions completed by a single component and joint actions completed by several components, together with the instructions for executing each (completing independent action A, for example, may require ten instructions with parameters A1 to A10). Because the actions actually executed differ from the standard ones — raising the upper arm by 15 degrees rather than the standard 20 degrees, for example — the instruction parameters necessarily differ; comparing the two and recording the difference values yields the action difference data.
The data storage module 400 is configured to convert data to be stored into a control action set, convert scene data into scene features, and store the control action set, the action difference data and the scene features.
In the system, the data storage module 400 converts the data to be stored into a control action set. All actions of the humanoid robot are completed through its component parts, so the part motions can be ordered by time to form an action time sequence, which is the control action set. The scene data are then converted into scene features; this step removes useless data and extracts the key data. Finally, the control action set, the action difference data and the scene features are stored. To ensure the accuracy of the stored data, the corresponding control action set, action difference data and scene features are retrieved: the actions completed by the humanoid robot are determined from the control action set, the matching standard actions are found by querying the standard action database with the control action set, and the standard actions are adjusted according to the action difference data to form preliminary recovery data. The scene features are imported into a preset robot model for a simulation run to obtain an output result, the output result is checked against the preliminary recovery data, and the validity of the preliminary recovery data is judged. If the preliminary recovery data is valid, the control action set, the action difference data and the scene features replace the data to be stored; otherwise, the data to be stored is stored directly.
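The store-and-verify flow can be sketched end to end; the database contents, the stand-in robot model and the return format are all hypothetical, chosen only to make the control flow concrete:

```python
# Hypothetical sketch of the store-and-verify flow: rebuild the action data
# from (control action set, action difference data), simulate the scene
# features in a stand-in robot model, and store the compact triple only when
# the reconstruction is judged valid.
STANDARD_ACTION_DB = {"greet": {"arm_angle": 20}}

def reconstruct(control_set, diff_data):
    # preliminary recovery data: standard parameters adjusted by differences
    recovered = []
    for action in control_set:
        params = dict(STANDARD_ACTION_DB[action])
        for key, delta in diff_data.get(action, {}).items():
            params[key] += delta
        recovered.append((action, params))
    return recovered

def simulate(scene_features):
    # stand-in for the preset robot model used for the simulation run
    if "person_entered" in scene_features:
        return [("greet", {"arm_angle": 15})]
    return []

def store(data_to_store, control_set, diff_data, scene_features):
    recovered = reconstruct(control_set, diff_data)
    if simulate(scene_features) == recovered:   # validity check
        return ("compact", control_set, diff_data, scene_features)
    return ("raw", data_to_store)               # fall back to raw storage

result = store(["...full log..."], ["greet"],
               {"greet": {"arm_angle": -5}}, {"person_entered"})
```

When the simulated output matches the recovery data, only the compact triple is stored; otherwise the original data is kept, mirroring the fallback described above.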
As shown in fig. 6, as a preferred embodiment of the present invention, the action dividing module 200 includes:
the robot partition unit 201 is configured to divide the humanoid robot into a plurality of independent control areas according to the functional areas, and the independent control areas complete corresponding actions through cooperation.
In this module, the robot partition unit 201 divides the humanoid robot into a plurality of independent control areas according to the functional areas, wherein an independent control area is a component part of the humanoid robot to which individual instructions can be issued, such as the wrist, hand or forearm, and each independent control area can complete an independent instruction, such as lifting the forearm.
The action recognition unit 202 is configured to recognize action content included in the data to be stored, and divide the data to be stored according to the action content to obtain a plurality of action data blocks.
In this module, the action recognition unit 202 recognizes the action content contained in the data to be stored. The behaviour of the humanoid robot is in fact completed through preset actions, such as a welcome action, a greeting action or a bowing action, and each preset action is realized through the motion of a plurality of independent control areas. The data to be stored is therefore divided according to these preset actions, thereby obtaining a plurality of action data blocks.
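The block division can be sketched as segmenting the recorded stream at preset-action boundaries; the presence of an action tag on each record is an assumption made for illustration:

```python
# Hypothetical sketch: divide the stored record into action data blocks,
# one block per run of records tagged with the same preset action.
def split_into_blocks(records):
    blocks, current, current_action = [], [], None
    for rec in records:
        if rec["action"] != current_action:
            if current:
                blocks.append((current_action, current))
            current, current_action = [], rec["action"]
        current.append(rec)
    if current:
        blocks.append((current_action, current))
    return blocks

stream = [
    {"action": "welcome", "part": "upper_arm"},
    {"action": "welcome", "part": "forearm"},
    {"action": "bow", "part": "torso"},
]
blocks = split_into_blocks(stream)
```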
The data dividing unit 203 is configured to determine an action included in each action data block, and divide the action into an independent action data packet and a linkage action data packet.
In this module, the data dividing unit 203 determines the actions contained in each action data block. Different actions are completed by different independent control areas, so according to the number of independent control areas involved, each action is classified into an independent action data packet (a single area) or a linkage action data packet (several areas).
As shown in fig. 7, as a preferred embodiment of the present invention, the action analysis module 300 includes:
the data matching unit 301 is configured to search the standard action database according to the actions contained in the action data set and find the matching standard action data.
In this module, the data matching unit 301 searches the standard action database according to the actions contained in the action data set. Before use, the standard action database is configured according to the preset actions that the humanoid robot can perform. If one preset action can be executed in several different ways, such as bowing at 90°, bowing at 120°, or other bending angles, one angle is randomly selected as the standard action, and the control instructions required for that angle are stored in the standard action database. During the search, the action being executed by the humanoid robot is determined and the corresponding standard action is identified, thereby obtaining the standard action data.
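The one-time database configuration can be sketched as picking a single variant per preset action; the preset table and the use of a seeded generator (for a reproducible illustration) are assumptions:

```python
# Hypothetical sketch: configure the standard action database by picking one
# execution variant per preset action and storing its control parameters.
import random

def build_standard_db(preset_actions, seed=0):
    rng = random.Random(seed)  # seeded here so the sketch is deterministic
    db = {}
    for name, variants in preset_actions.items():
        db[name] = rng.choice(variants)  # e.g. one bowing angle as standard
    return db

presets = {"bow": [{"angle_deg": 90}, {"angle_deg": 120}]}
db = build_standard_db(presets)
```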
The parameter extraction unit 302 is configured to extract a standard response parameter corresponding to the standard action data, and extract a real-time response parameter of the action data set.
In this module, the parameter extraction unit 302 extracts the standard response parameters corresponding to the standard action data; the standard action data records the instructions for completing the standard action, such as the motor rotation angle and the motor rotation speed, and these are extracted. Likewise, the instructions actually executed are extracted as the real-time response parameters.
The data comparing unit 303 is configured to compare the standard response parameter with the real-time response parameter, and take the difference value as motion difference data.
In this module, the data comparison unit 303 compares the standard response parameters with the real-time response parameters. Because the executed action differs from the standard one, such as a 90-degree bow versus a 100-degree bow, the motor control parameters of the two differ; the comparison determines the difference between the parameters, thereby obtaining the action difference data. Specifically, the action difference data can be represented directly as signed values: if the standard response parameter is 50 and the real-time response parameter is 30, the action difference data is recorded as -20.
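The sign convention implied by the example above can be pinned down in a one-line sketch (executed value minus standard value); the function name is a hypothetical stand-in:

```python
# Hypothetical sketch of the sign convention used for action difference data:
# difference = real-time response parameter minus standard response parameter.
def response_difference(standard, realtime):
    return realtime - standard

d1 = response_difference(50, 30)   # executed value smaller -> negative
d2 = response_difference(90, 100)  # executed value larger  -> positive
```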
As shown in fig. 8, as a preferred embodiment of the present invention, the data storage module 400 includes:
the action set identifying unit 401 is configured to identify data to be stored, determine the sequence of all actions included in the data, and obtain a control action set, where the control action set includes at least the execution time of each action.
In this module, the action set identification unit 401 identifies the data to be stored. The same action can be produced by the humanoid robot through several different execution sequences with the same final result: for example, if a preset action a involves two independent control areas b and c, action a can be realized by controlling b first and then c, or by controlling c first and then b. The sequence in which the part actions were actually executed is recorded to form the control action set.
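The order-independence of the final state, and the recording of the order actually used, can be sketched as follows; the state model and the per-step execution time field are illustrative assumptions:

```python
# Hypothetical sketch: two execution orders of the same preset action reach
# the same final state; the control action set records the order actually
# executed together with an (assumed) execution time per step.
def run(steps):
    state = {}
    for part, delta in steps:
        state[part] = state.get(part, 0) + delta
    return state

order_bc = [("b", 10), ("c", 5)]  # control area b first, then c
order_cb = [("c", 5), ("b", 10)]  # control area c first, then b

# the recorded control action set for the order actually used
control_action_set = [{"step": step, "time": i}
                      for i, step in enumerate(order_bc)]
```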
The scene feature extraction unit 402 is configured to extract scene data corresponding to different actions, record determination conditions required by the corresponding actions, and obtain scene features.
In this module, the scene feature extraction unit 402 extracts the scene data corresponding to different actions. When the humanoid robot performs an action, the action is derived from real-time scene analysis: for example, if the image acquisition device recognizes that a person has entered, it is determined that the corresponding action needs to be performed. The corresponding determination conditions are then that a frame change is detected, the frame is imported into a face recognition model and a person is detected, and at the same time the infrared detection device detects a heat source; when these hold, it is determined that a person has entered and the corresponding action is executed. The data corresponding to these determination conditions are extracted as the scene features, namely information such as the face image, the frame change rate, the number of pixels of the infrared image, and the position of the person.
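The determination condition described above can be sketched as a conjunction of scene checks; the feature names are illustrative stand-ins for the extracted scene features:

```python
# Hypothetical sketch: the determination condition "a person entered" as a
# conjunction of scene checks derived from the extracted scene features.
def person_entered(features):
    return (features.get("frame_changed", False)
            and features.get("face_detected", False)
            and features.get("heat_source", False))

scene_features = {"frame_changed": True, "face_detected": True,
                  "heat_source": True}
```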
The data decomposition storage unit 403 is configured to determine the scene features and action difference data corresponding to each action, and store them together with the control action set.
In this module, the data decomposition storage unit 403 determines the scene features and the action difference data corresponding to each action and stores them together with the control action set. For data verification, the corresponding control action set, action difference data and scene features are retrieved; the actions completed by the humanoid robot are determined from the control action set, the matching standard actions are found by querying the corresponding standard action database with the control action set, and the standard actions are adjusted according to the action difference data to form the preliminary recovery data. The scene features are imported into a preset robot model for a simulation run to obtain an output result, the output result is checked against the preliminary recovery data, and the validity of the preliminary recovery data is judged. If the preliminary recovery data is valid, the control action set, the action difference data and the scene features replace the data to be stored; otherwise, the data to be stored is stored directly.
It should be understood that, although the steps in the flowcharts of the embodiments of the present invention are shown in the order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the order of execution is not strictly limited, and the steps may be executed in other orders. Moreover, at least some of the steps in the various embodiments may include multiple sub-steps or stages that are not necessarily performed at the same moment but may be performed at different moments, and these sub-steps or stages are not necessarily performed sequentially but may be performed in turn or alternately with at least a portion of the sub-steps or stages of other steps.
Those skilled in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by a computer program instructing relevant hardware, where the program may be stored in a non-volatile computer-readable storage medium, and where the program, when executed, may include the processes of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the various embodiments provided herein may include non-volatile and/or volatile memory. The non-volatile memory can include Read-Only Memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus Direct RAM (RDRAM), Direct Rambus Dynamic RAM (DRDRAM), and Rambus Dynamic RAM (RDRAM), among others.
The technical features of the above-described embodiments may be combined arbitrarily. For brevity of description, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction in a combination of technical features, it should be considered to be within the scope of this specification.
The foregoing examples illustrate only a few embodiments of the invention, which are described specifically and in detail, but are not thereby to be construed as limiting the scope of the invention. It should be noted that several variations and modifications can be made by those of ordinary skill in the art without departing from the concept of the invention, all of which fall within the scope of protection of the invention. Accordingly, the scope of protection of the present invention shall be determined by the appended claims.
The foregoing description of the preferred embodiments of the invention is not intended to be limiting, but rather is intended to cover all modifications, equivalents, and alternatives falling within the spirit and principles of the invention.

Claims (10)

1. A storage control method for a humanoid robot, the method comprising:
acquiring data to be stored, wherein the data to be stored at least comprises scene data and control data;
dividing data to be stored into a plurality of action data sets according to the part division of the humanoid robot, wherein the action data sets comprise independent action data packets and linkage action data packets;
inquiring a preset standard action database based on the action data set, and analyzing the action data set to obtain action difference data;
converting the data to be stored into a control action set, converting the scene data into scene characteristics, and storing the control action set, the action difference data and the scene characteristics.
2. The storage control method of a humanoid robot according to claim 1, wherein the step of dividing the data to be stored into a plurality of action data sets according to the division of the parts of the humanoid robot specifically comprises:
dividing the humanoid robot into a plurality of independent control areas according to the functional areas, wherein the independent control areas are matched with each other to complete corresponding actions;
identifying action content contained in the data to be stored, and dividing the data to be stored according to the action content to obtain a plurality of action data blocks;
and determining the action contained in each action data block, and dividing the action into an independent action data packet and a linkage action data packet.
3. The storage control method of a humanoid robot according to claim 1, wherein the step of querying a preset standard motion database based on a motion data set, analyzing the motion data set, and obtaining motion difference data specifically comprises:
searching a standard action database according to actions contained in the action data set, and finding the standard action data matched therewith;
extracting standard response parameters corresponding to the standard action data, and extracting real-time response parameters of the action data set;
and comparing the standard response parameters with the real-time response parameters, and taking the difference value as action difference data.
4. The storage control method of the humanoid robot according to claim 1, wherein the step of converting the data to be stored into a control action set, converting the scene data into scene features, and storing the control action set, the action difference data, and the scene features specifically includes:
identifying data to be stored, determining the sequence of all actions contained in the data to obtain a control action set, wherein the control action set at least comprises the execution time of each action;
extracting scene data corresponding to different actions, and recording judging conditions required by the corresponding actions to obtain scene characteristics;
scene characteristics and action difference data corresponding to each action are determined and stored together with a control action set.
5. The humanoid robot storage control method of claim 1, wherein the scene data includes interactive images, interactive voices and interactive instructions.
6. The storage control method of a humanoid robot according to claim 1, wherein, when the stored data needs to be used, the corresponding scene features and action difference data are retrieved according to the control action set and are imported into the corresponding robot model for operation, so as to obtain the data to be stored.
7. A humanoid robot storage control system, the system comprising:
the data acquisition module is used for acquiring data to be stored, wherein the data to be stored at least comprises scene data and control data;
the action dividing module is used for dividing the data to be stored into a plurality of action data sets according to the part division of the humanoid robot, wherein the action data sets comprise independent action data packets and linkage action data packets;
the action analysis module is used for inquiring a preset standard action database based on the action data set, analyzing the action data set and obtaining action difference data;
the data storage module is used for converting the data to be stored into a control action set, converting the scene data into scene characteristics and storing the control action set, the action difference data and the scene characteristics.
8. The humanoid robot storage control system of claim 7, wherein the action dividing module includes:
the robot partition unit is used for dividing the humanoid robot into a plurality of independent control areas according to the functional areas, and the independent control areas are matched to complete corresponding actions;
the action recognition unit is used for recognizing action content contained in the data to be stored, and dividing the data to be stored according to the action content to obtain a plurality of action data blocks;
and the data dividing unit is used for determining the action contained in each action data block and dividing the action into an independent action data packet and a linkage action data packet.
9. The humanoid robot storage control system of claim 7, wherein the action analysis module includes:
the data matching unit is used for searching the standard action database according to the actions contained in the action data set and finding the standard action data matched therewith;
the parameter extraction unit is used for extracting standard response parameters corresponding to the standard action data and extracting real-time response parameters of the action data set;
and the data comparison unit is used for comparing the standard response parameters with the real-time response parameters, and taking the difference value as action difference data.
10. The humanoid robot storage control system of claim 7, wherein the data storage module includes:
the action set identification unit is used for identifying the data to be stored, determining the sequence of all actions contained in the data to obtain a control action set, wherein the control action set at least comprises the execution time of each action;
the scene feature extraction unit is used for extracting scene data corresponding to different actions and recording judgment conditions required by the corresponding actions to obtain scene features;
and the data decomposition storage unit is used for determining scene characteristics and action difference data corresponding to each action and storing the scene characteristics and the action difference data and the control action set.
CN202310517300.6A 2023-05-10 2023-05-10 Storage control method and control system for humanoid robot Active CN116214528B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310517300.6A CN116214528B (en) 2023-05-10 2023-05-10 Storage control method and control system for humanoid robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310517300.6A CN116214528B (en) 2023-05-10 2023-05-10 Storage control method and control system for humanoid robot

Publications (2)

Publication Number Publication Date
CN116214528A true CN116214528A (en) 2023-06-06
CN116214528B CN116214528B (en) 2023-10-03

Family

ID=86569993

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310517300.6A Active CN116214528B (en) 2023-05-10 2023-05-10 Storage control method and control system for humanoid robot

Country Status (1)

Country Link
CN (1) CN116214528B (en)

Citations (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS57207908A (en) * 1981-06-17 1982-12-20 Hitachi Ltd Robot controller
EP0104270A1 (en) * 1982-09-23 1984-04-04 Kabushiki Kaisha Sankyo Seiki Seisakusho Robot control apparatus
JPS625408A (en) * 1985-07-01 1987-01-12 Fanuc Ltd Method for controlling joint-type robot
JP2000153479A (en) * 1998-09-03 2000-06-06 Ricoh Elemex Corp Robot system
JP2003211380A (en) * 2002-01-18 2003-07-29 Honda Motor Co Ltd Abnormality detector for mobile robot
CN101020313A (en) * 2007-03-08 2007-08-22 华中科技大学 Motion controller for modular embedded polypod robot
KR20090109652A (en) * 2008-04-16 2009-10-21 삼성전자주식회사 Humanoid robot and method for controlling thereof
KR20110035128A (en) * 2009-09-29 2011-04-06 주식회사 케이티 System for managing robot behavior based on network
CA2944774A1 (en) * 2014-04-08 2015-10-15 Kawasaki Jukogyo Kabushiki Kaisha Data collection system and method
WO2016025941A1 (en) * 2014-08-15 2016-02-18 University Of Central Florida Research Foundation, Inc. Control interface for robotic humanoid avatar system and related methods
CN205905033U (en) * 2016-07-15 2017-01-25 杭州欢乐飞机器人科技股份有限公司 Guest greeting robot
JP2017213612A (en) * 2016-05-30 2017-12-07 トヨタ自動車株式会社 Robot and method for controlling robot
CN107515900A (en) * 2017-07-24 2017-12-26 宗晖(上海)机器人有限公司 Intelligent robot and its event memorandum system and method
CN108229855A (en) * 2018-02-06 2018-06-29 上海小蚁科技有限公司 Method for monitoring service quality and device, computer readable storage medium, terminal
EP3441037A1 (en) * 2017-08-11 2019-02-13 avateramedical GmbH Surgical robot system and access control method of a surgical instrument designed to be inserted in a surgical robot system
US10216695B1 (en) * 2017-09-21 2019-02-26 Palantir Technologies Inc. Database system for time series data storage, processing, and analysis
WO2019120148A1 (en) * 2017-12-19 2019-06-27 北京可以科技有限公司 Control system for modular robot, modular robot system, and control method for modular robot
CN110175150A (en) * 2019-05-15 2019-08-27 重庆大学 Guest-greeting machine personal data based on data compression stores monitoring system
CN110253567A (en) * 2019-05-22 2019-09-20 北京镁伽机器人科技有限公司 For controlling kinetic control system, method and the robot of robot motion
WO2020207017A1 (en) * 2019-04-11 2020-10-15 上海交通大学 Method and device for collaborative servo control of uncalibrated movement vision of robot in agricultural scene
CN112329341A (en) * 2020-11-02 2021-02-05 浙江智昌机器人科技有限公司 Fault diagnosis system and method based on AR and random forest model
US20210170580A1 (en) * 2019-12-04 2021-06-10 Ubtech Robertics Corp Ltd Action imitation method and robot and computer readable medium using the same
CN112988467A (en) * 2021-04-19 2021-06-18 深圳市安信达存储技术有限公司 Solid state disk, data recovery method thereof and terminal equipment
WO2021189695A1 (en) * 2020-03-25 2021-09-30 平安科技(深圳)有限公司 Distributed database dynamic expansion method and apparatus, and device and storage medium
CN115220375A (en) * 2021-09-30 2022-10-21 达闼科技(北京)有限公司 Robot control method, robot control device, storage medium, and electronic apparatus
EP4088886A1 (en) * 2021-05-14 2022-11-16 The Boeing Company Rapid change mechanism for complex end effectors


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LIU SHUAI; WANG MINZHEN; LIU CHAO: "Design of a classified storage system for robot trajectory data based on pattern recognition", Modern Electronics Technique, no. 04 *
CHENG YANZHEN: "Analysis of cooperative operation of picking robots based on VR scene design", Journal of Agricultural Mechanization Research, no. 12 *

Also Published As

Publication number Publication date
CN116214528B (en) 2023-10-03

Similar Documents

Publication Publication Date Title
CN109522818B (en) Expression recognition method and device, terminal equipment and storage medium
CN106991395B (en) Information processing method and device and electronic equipment
CN110751022A (en) Urban pet activity track monitoring method based on image recognition and related equipment
WO2016172872A1 (en) Method and device for verifying real human face, and computer program product
CN109743624B (en) Video cutting method and device, computer equipment and storage medium
CN105825268A (en) Method and system for data processing for robot action expression learning
CN111048113B (en) Sound direction positioning processing method, device, system, computer equipment and storage medium
CN106960177A (en) Living body faces verification method and system, living body faces checking device
CN111063083B (en) Access control method and device, computer readable storage medium and computer equipment
CN110046577B (en) Pedestrian attribute prediction method, device, computer equipment and storage medium
CN111240984A (en) Abnormal page identification method and device, computer equipment and storage medium
CN111191564A (en) Multi-pose face emotion recognition method and system based on multi-angle neural network
CN111126233B (en) Call channel construction method and device based on distance value and computer equipment
CN113239874A (en) Behavior posture detection method, device, equipment and medium based on video image
CN111582179B (en) Monitoring video analysis method and device, computer equipment and storage medium
CN111242840A (en) Handwritten character generation method, apparatus, computer device and storage medium
CN116214528B (en) Storage control method and control system for humanoid robot
CN110941992A (en) Smile expression detection method and device, computer equipment and storage medium
CN114493902A (en) Multi-mode information anomaly monitoring method and device, computer equipment and storage medium
CN111368061B (en) Short text filtering method, device, medium and computer equipment
Dai et al. Loop closure detection using KPCA and CNN for visual SLAM
CN111898035A (en) Data processing strategy configuration method and device based on Internet of things and computer equipment
KR100434907B1 (en) Monitoring system including function of figure acknowledgement and method using this system
CN114140822A (en) Pedestrian re-identification method and device
CN109145737B (en) Rapid face recognition method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant