CN114378823B - Robot action control method and device, readable storage medium and robot - Google Patents

Robot action control method and device, readable storage medium and robot

Info

Publication number
CN114378823B
CN114378823B · CN202210068153.4A
Authority
CN
China
Prior art keywords
robot
motion
layer
data frame
action
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210068153.4A
Other languages
Chinese (zh)
Other versions
CN114378823A (en)
Inventor
曾祥安 (Zeng Xiang'an)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ubtech Robotics Corp
Original Assignee
Ubtech Robotics Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ubtech Robotics Corp filed Critical Ubtech Robotics Corp
Priority to CN202210068153.4A
Publication of CN114378823A
Application granted
Publication of CN114378823B

Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B25 — HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J — MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 — Programme-controlled manipulators
    • B25J9/16 — Programme controls
    • B25J9/1602 — Programme controls characterised by the control system, structure, architecture

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The application belongs to the technical field of robots, and particularly relates to a robot action control method and device, a computer-readable storage medium and a robot. The method comprises the following steps: reading a preset robot action file, the file comprising a plurality of layers with each layer corresponding to the action of one motion part of the robot; parsing the robot action file layer by layer to obtain an action data frame stream for each motion part of the robot; and controlling the motion of each motion part of the robot respectively according to its action data frame stream. By dividing the robot action file into layers according to motion parts and parsing and controlling each layer separately, the application decouples the control of the actions of the different motion parts, so that each motion part can be controlled independently, making the robot's actions more flexible and natural, with better motion expressiveness.

Description

Robot action control method and device, readable storage medium and robot
Technical Field
The application belongs to the technical field of robots, and particularly relates to a robot action control method and device, a computer readable storage medium and a robot.
Background
With the continuous development of robotics, a wide variety of robots have become increasingly common in people's work and daily life. For a performance-oriented humanoid robot to have strong motion expressiveness, it must be able to perform a variety of actions as flexibly as a human. In the prior art, however, when a performance-oriented humanoid robot moves, the various body parts in each frame of motion generally start and finish moving at the same time, so the resulting actions appear stiff and the motion expressiveness is poor.
Disclosure of Invention
In view of the above, the embodiments of the present application provide a robot action control method and apparatus, a computer-readable storage medium, and a robot, so as to solve the problem that existing robot action control methods produce stiff actions with poor motion expressiveness.
A first aspect of an embodiment of the present application provides a robot motion control method, which may include:
reading a preset robot action file; the robot action file comprises a plurality of layers, each layer corresponding to the action of one motion part of the robot and comprising layer information and an action data frame list; the layer information comprises a layer identifier, a layer total byte count, a layer total play duration and a layer total frame count; the action data frame list comprises a plurality of action data frames;
analyzing the robot action file layer by layer to obtain action data frame streams of each movement part of the robot;
and respectively controlling the movements of each movement part of the robot according to the movement data frame stream of each movement part of the robot.
In a specific implementation manner of the first aspect, after analyzing the robot motion file layer by layer to obtain a motion data frame stream of each motion part of the robot, the method further includes:
respectively storing the motion data frame stream of each motion part of the robot in a corresponding preset motion data frame queue;
the method for controlling the motion of each motion part of the robot according to the motion data frame stream of each motion part of the robot comprises the following steps:
and respectively playing the action data frame streams in each action data frame queue so as to control the actions of each movement part of the robot.
In a specific implementation manner of the first aspect, each action data frame in the action data frame list includes a current frame number, a current frame length, a pre-play waiting duration, a current frame motion duration and an articulation parameter list;
the articulation parameter list includes a plurality of sets of articulation parameters, each set including a motion joint identifier and a motion angle corresponding to that identifier.
In a specific implementation manner of the first aspect, the robot action file includes three layers, which correspond to a head action, a hand action and a foot action of the robot, respectively.
A second aspect of an embodiment of the present application provides a robot motion control apparatus, which may include:
the parser is used for reading a preset robot action file; the robot action file comprises a plurality of layers, each layer corresponding to the action of one motion part of the robot and comprising layer information and an action data frame list; the layer information comprises a layer identifier, a layer total byte count, a layer total play duration and a layer total frame count; the action data frame list comprises a plurality of action data frames; the parser is also used for parsing the robot action file layer by layer to obtain an action data frame stream for each motion part of the robot;
and the play controller is used for respectively controlling the motion of each motion part of the robot according to the action data frame stream of that motion part.
In a specific implementation manner of the second aspect, the parser may be further configured to store the motion data frame stream of each motion part of the robot in a corresponding preset motion data frame queue;
the play controller may be specifically configured to play the motion data frame streams in each motion data frame queue respectively, so as to control the motion of each motion part of the robot.
In a specific implementation manner of the second aspect, each action data frame in the action data frame list includes a current frame number, a current frame length, a pre-play waiting duration, a current frame motion duration and an articulation parameter list;
the articulation parameter list includes a plurality of sets of articulation parameters, each set including a motion joint identifier and a motion angle corresponding to that identifier.
In a specific implementation manner of the second aspect, the robot action file includes three layers, which correspond to a head action, a hand action and a foot action of the robot, respectively.
A third aspect of the embodiments of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of any one of the robot motion control methods described above.
A fourth aspect of the embodiments of the present application provides a robot comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of any one of the above-mentioned robot motion control methods when executing the computer program.
A fifth aspect of the embodiments of the present application provides a computer program product for, when run on a robot, causing the robot to perform the steps of any of the robot motion control methods described above.
Compared with the prior art, the embodiments of the present application have the following beneficial effects: a preset robot action file is read, the file comprising a plurality of layers with each layer corresponding to the action of one motion part of the robot; the robot action file is parsed layer by layer to obtain an action data frame stream for each motion part of the robot; and the motion of each motion part of the robot is controlled respectively according to its action data frame stream. By dividing the robot action file into layers according to motion parts and parsing and controlling each layer separately, the embodiments of the present application decouple the control of the actions of the different motion parts, so that each motion part can be controlled independently, making the robot's actions more flexible and natural, with better motion expressiveness.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present application, and a person skilled in the art may obtain other drawings from these drawings without inventive effort.
FIG. 1 is a flow chart of one embodiment of a method for controlling robot motion in accordance with an embodiment of the present application;
FIG. 2 is a hierarchical schematic diagram of a robot action file;
FIG. 3 is an exemplary diagram of a robot action file;
FIG. 4 is a schematic diagram of the overall motion control process;
FIG. 5 is a block diagram of an embodiment of a robot motion control device according to an embodiment of the present application;
fig. 6 is a schematic block diagram of a robot in an embodiment of the application.
Detailed Description
In order to make the objects, features and advantages of the present application more comprehensible, the technical solutions in the embodiments of the present application are described in detail below with reference to the accompanying drawings, and it is apparent that the embodiments described below are only some embodiments of the present application, but not all embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in this specification and the appended claims, the term "if" may be interpreted as "when", "upon", "in response to determining" or "in response to detecting", depending on the context. Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted, depending on the context, as "upon determining", "in response to determining", "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
In addition, in the description of the present application, the terms "first," "second," "third," etc. are used merely to distinguish between descriptions and should not be construed as indicating or implying relative importance.
Referring to fig. 1, an embodiment of a robot motion control method according to an embodiment of the present application may include:
step S101, a preset robot action file is read.
The robot action file may include several layers, each of which may correspond to an action of one motion part of the robot. The specific number of layers and the motion of the motion part corresponding to each layer can be set according to actual conditions.
As shown in fig. 2, in a specific implementation manner of the embodiment of the present application, the robot action file may include three layers, denoted the head layer, the hand layer, and the foot layer, where the head layer corresponds to the head actions of the robot, the hand layer to its hand actions, and the foot layer to its foot actions. It should be noted that this layering is merely an example; in practical applications, more or fewer layers may be set as the situation requires, with a corresponding motion part set for each layer, which is not particularly limited in the embodiments of the present application.
The layers of the robot action file are independent of one another, and any layer can be used or removed on its own without affecting the others (for example, only the hand layer may be used). The layers have no inherent order within the robot action file and may be arranged freely according to the actual situation.
In a specific implementation manner of the embodiment of the application, each layer of the robot action file may include the layer information and the action data frame list.
The layer information may include a layer identifier, a layer total byte count, a layer total play duration, and a layer total frame count. The layer identifier is a preset fixed string used to distinguish the different layers; for example, the identifier of the head layer may be set to "head\x0", that of the hand layer to "hand\x0", and that of the foot layer to "foot\x0". It should be noted that this way of setting the layer identifier is merely an example; in practical applications, identifiers of other forms may be set according to the specific situation, which is not particularly limited in the embodiments of the present application. The layer total byte count is the total number of bytes contained in the layer; recording this parameter facilitates jumping, offsetting and indexing between layers, and for the head layer it may be denoted head_tl. The layer total play duration is the total time required to play all action data frames in the layer completely; the play controller of the robot can determine the layer's total play duration from this parameter, which for the head layer may be denoted head_tt. The layer total frame count is the total number of action data frames in the layer; the parser of the robot can determine from this parameter how many frames in the layer need to be parsed, and for the head layer it may be denoted head_tf.
The action data frame list may include a number of action data frames, the specific number being consistent with the layer total frame count in the layer information. Each action data frame in the list includes a current frame number, a current frame length, a pre-play waiting duration, a current frame motion duration, and an articulation parameter list. The current frame number is the identification number of the current action data frame; the parser of the robot distinguishes different action data frames by this parameter in order to control the playback flow, and for the head layer it may be denoted head_fn. The current frame length is the number of bytes of the current action data frame; using this parameter, the parser can quickly offset to the next frame for playback, and for the head layer it may be denoted head_fl. The pre-play waiting duration is the time to wait before starting to play the current action data frame; when the actions of other layers must complete first, the current action data frame can wait accordingly. For the head layer this parameter may be denoted head_fwt. The current frame motion duration is the time over which the current action data frame drives the motion; the motion duration of each motion joint is controlled by this parameter, and the play controller of the robot must also keep time synchronously during the motion. For the head layer it may be denoted head_fmt. The layer total play duration equals the sum, over all action data frames, of the pre-play waiting duration and the current frame motion duration; taking the head layer as an example: head_tt = Σ(head_fwt + head_fmt).
The articulation parameter list may include several sets of articulation parameters; the specific number of sets, which matches the number of motion joints that the current action data frame needs to control, can be set according to the actual situation. Each set of articulation parameters includes a motion joint identifier and a motion angle corresponding to that identifier. The motion joint identifier (ID) is the identification number of the current motion joint and designates which joint is to move. The motion angle corresponding to the motion joint identifier is the angle to which the current motion joint needs to move within the current frame motion duration.
Taking the head layer as an example, the structure of this layer is as follows:
Layer information:
    layer identifier: head\x0
    layer total byte count: head_tl
    layer total play duration: head_tt
    layer total frame count: head_tf
Frame list:
{
    current frame number: head_fn
    current frame length: head_fl
    pre-play waiting duration: head_fwt
    current frame motion duration: head_fmt
    articulation parameter list:
    [
        motion joint identifier (ID)
        motion angle corresponding to the motion joint ID
    ]
    [ ]
    [ ]
    ...
}
{ }
{ }
...
The structure of the other layers is similar and will not be described in detail here.
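For illustration only, the layered structure above can be modeled directly as data structures. The following is a minimal Python sketch, not part of the patent text; the class and field names (JointParam, ActionFrame, ActionLayer and so on) are hypothetical and simply mirror the fields listed above.

from dataclasses import dataclass, field
from typing import List

@dataclass
class JointParam:
    joint_id: int   # motion joint identifier (ID)
    angle: int      # angle the joint must reach within the frame's motion duration

@dataclass
class ActionFrame:
    frame_number: int   # e.g. head_fn
    frame_length: int   # byte count of this frame, e.g. head_fl
    wait_ms: int        # pre-play waiting duration, e.g. head_fwt
    motion_ms: int      # current frame motion duration, e.g. head_fmt (converted to ms)
    joints: List[JointParam] = field(default_factory=list)

@dataclass
class ActionLayer:
    identifier: str     # e.g. "head\x00", "hand\x00", "foot\x00"
    total_bytes: int    # e.g. head_tl
    total_play_ms: int  # e.g. head_tt = sum of wait_ms + motion_ms over all frames
    total_frames: int   # e.g. head_tf
    frames: List[ActionFrame] = field(default_factory=list)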
Step S102, analyzing the robot action file layer by layer to obtain action data frame streams of each movement part of the robot.
Fig. 3 shows an example of a robot action file. After the parser of the robot reads the file and parses it layer by layer, the following parsing result can be obtained:
Head layer:
    layer identifier: head\x0;
    layer total byte count: head_tl = 0x23 = 35 bytes;
    layer total play duration: head_tt = 0x1388 = 5000 ms = (1000 + 1000 + 1000 + 2000) ms;
    layer total frame count: head_tf = 0x0002 = 2 frames;
    current frame number head_fn: frame 1
        current frame length: head_fl = 0x000A = 10 bytes;
        pre-play waiting duration: head_fwt = 0x03E8 = 1000 ms;
        current frame motion duration: head_fmt = 0x32 = 50, in units of 20 ms, i.e. 50 × 20 ms = 1000 ms;
        motion joint identifier ID = 0x11 = 17, corresponding motion angle = 0x1E = 30°;
    current frame number head_fn: frame 2
        current frame length: head_fl = 0x000A = 10 bytes;
        pre-play waiting duration: head_fwt = 0x03E8 = 1000 ms;
        current frame motion duration: head_fmt = 0x64 = 100, i.e. 100 × 20 ms = 2000 ms;
        motion joint identifier ID = 0x11 = 17, corresponding motion angle = 0x78 = 120°;
The hand layer and the foot layer are parsed in the same way.
After the analysis of the robot action file is completed, the analyzer of the robot can obtain the action data frame stream of each movement part of the robot.
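To make the layer-by-layer parsing concrete, the sketch below decodes one layer of such a file into the structures from the previous sketch. It is an assumption-laden illustration rather than the patent's implementation: the text does not fix the byte order or exact field widths, so the sketch assumes big-endian fields sized to match the Fig. 3 example (two-byte totals, frame numbers and frame lengths; one-byte motion duration and joint fields) and the 20 ms motion-duration unit used there.

import struct

MOTION_TICK_MS = 20  # the Fig. 3 example encodes motion duration in 20 ms units

def parse_layer(buf: bytes, offset: int = 0) -> ActionLayer:
    """Parse one layer (e.g. the head layer) under the assumed layout:
    a 5-byte identifier such as b"head\x00", then total bytes (u16),
    total play ms (u16), total frames (u16), followed by that many frames."""
    ident = buf[offset:offset + 5].decode("ascii")
    offset += 5
    total_bytes, total_play_ms, total_frames = struct.unpack_from(">HHH", buf, offset)
    offset += 6

    layer = ActionLayer(ident, total_bytes, total_play_ms, total_frames)
    for _ in range(total_frames):
        # assumed frame header: frame number (u16), frame length (u16),
        # pre-play wait ms (u16), motion duration in 20 ms ticks (u8)
        fn, fl, wait_ms, motion_ticks = struct.unpack_from(">HHHB", buf, offset)
        frame = ActionFrame(fn, fl, wait_ms, motion_ticks * MOTION_TICK_MS)
        # the rest of the frame is assumed to be (joint ID, angle) byte pairs
        body = buf[offset + 7:offset + fl]
        for jid, angle in zip(body[0::2], body[1::2]):
            frame.joints.append(JointParam(jid, angle))
        layer.frames.append(frame)
        offset += fl  # the frame-length field lets the parser jump straight to the next frame
    return layer

Note how the current frame length earns its keep here: the parser never has to understand a frame's joint parameters to skip past it, which is exactly the offset-to-the-next-frame behavior the description attributes to head_fl.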
Step S103, the motion of each motion part of the robot is controlled according to the motion data frame stream of each motion part of the robot.
In a specific implementation manner of the embodiment of the present application, the play controller of the robot may preset a number of motion data frame queues, each corresponding to one motion part of the robot and used to store that part's motion data frame stream. The motion data frame queues are independent of one another.
After the parser of the robot parses the motion data frame stream of each motion part of the robot, the motion data frame stream of each motion part of the robot may be stored in each corresponding motion data frame queue.
The play controller of the robot can play the motion data frame streams in the motion data frame queues respectively, and generate control signals to control the motion of each motion part of the robot.
Fig. 4 is a schematic diagram of the overall motion control process. After the robot action file is read, the parser of the robot parses a head frame stream, a hand frame stream and a foot frame stream from the file and stores each action data frame stream in its corresponding action data frame queue. The play controller of the robot then plays the action data frame streams, generating a head control signal to control the head motion of the robot, a hand control signal to control its hand motion, and a foot control signal to control its foot motion.
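A minimal sketch of this queue-and-playback flow, again assuming the structures above: one independent queue and one worker thread per motion part, where each worker honors a frame's pre-play waiting duration and motion duration before moving on. The send_joint_command helper is hypothetical and stands in for whatever servo interface a concrete robot exposes.

import queue
import threading
import time

def send_joint_command(joint_id: int, angle: int, duration_ms: int) -> None:
    # Hypothetical servo interface: drive one joint to `angle` over `duration_ms`.
    print(f"joint {joint_id} -> {angle} deg over {duration_ms} ms")

def play_part(frames: "queue.Queue[ActionFrame]") -> None:
    # Worker for one motion part; runs independently of the other parts' workers.
    while True:
        frame = frames.get()
        if frame is None:                    # sentinel: this part's stream has ended
            break
        time.sleep(frame.wait_ms / 1000)     # pre-play waiting duration
        for jp in frame.joints:
            send_joint_command(jp.joint_id, jp.angle, frame.motion_ms)
        time.sleep(frame.motion_ms / 1000)   # keep time synchronously during the motion

# One queue and one worker per motion part, e.g. the three layers of Fig. 2.
queues = {name: queue.Queue() for name in ("head", "hand", "foot")}
workers = {name: threading.Thread(target=play_part, args=(q,)) for name, q in queues.items()}
for w in workers.values():
    w.start()

# After parsing, each layer's frame stream is routed to its part's queue,
# followed by a None sentinel, e.g.:
#   for f in head_layer.frames:
#       queues["head"].put(f)
#   queues["head"].put(None)

Because each worker sleeps through its own waits and motions independently, one part can begin or finish a movement while another is still mid-motion, which is the decoupling effect the embodiment describes.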
In summary, in the embodiments of the present application, a preset robot action file is read, the file comprising a plurality of layers with each layer corresponding to the action of one motion part of the robot; the file is parsed layer by layer to obtain an action data frame stream for each motion part of the robot; and the motion of each motion part is controlled respectively according to its action data frame stream. By dividing the robot action file into layers according to motion parts and parsing and controlling each layer separately, the embodiments of the present application decouple the control of the actions of the different motion parts, so that each motion part can be controlled independently, making the robot's actions more flexible and natural, with better motion expressiveness.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
Corresponding to the method for controlling the motion of the robot described in the above embodiments, fig. 5 shows a block diagram of an embodiment of a device for controlling the motion of a robot according to an embodiment of the present application.
In this embodiment, a robot motion control device may include:
a parser 501, configured to read a preset robot action file, the file comprising a plurality of layers with each layer corresponding to the action of one motion part of the robot, and to parse the robot action file layer by layer to obtain an action data frame stream for each motion part of the robot;
and the play controller 502 is configured to control the motion of each motion part of the robot according to the motion data frame stream of each motion part of the robot.
In a specific implementation manner of the embodiment of the present application, the parser may be further configured to store the motion data frame stream of each motion part of the robot in a corresponding preset motion data frame queue;
the play controller may be specifically configured to play the motion data frame streams in each motion data frame queue respectively, so as to control the motion of each motion part of the robot.
In a specific implementation manner of the embodiment of the present application, each layer of the robot action file includes the information of the layer and an action data frame list;
the layer information comprises a layer identifier, a layer total byte number, a layer total playing time length and a layer total frame number;
the action data frame list includes a number of action data frames.
In a specific implementation manner of the embodiment of the present application, each action data frame in the action data frame list includes a current frame number, a current frame length, a pre-play waiting duration, a current frame motion duration and an articulation parameter list;
the articulation parameter list includes a plurality of sets of articulation parameters, each set including a motion joint identifier and a motion angle corresponding to that identifier.
In a specific implementation manner of the embodiment of the present application, the robot motion file includes three layers, which correspond to the head motion, the hand motion and the foot motion of the robot, respectively.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described apparatus, modules and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts that are not detailed or illustrated in one embodiment, reference may be made to the related descriptions of other embodiments.
Fig. 6 shows a schematic block diagram of a robot provided in an embodiment of the present application, and only a portion related to the embodiment of the present application is shown for convenience of explanation.
As shown in fig. 6, the robot 6 of this embodiment includes: a processor 60, a memory 61 and a computer program 62 stored in said memory 61 and executable on said processor 60. The processor 60, when executing the computer program 62, implements the steps of the respective robot motion control method embodiments described above, for example, steps S101 to S103 shown in fig. 1. Alternatively, the processor 60, when executing the computer program 62, performs the functions of the modules/units of the apparatus embodiments described above, such as the functions of modules 501-502 shown in fig. 5.
Illustratively, the computer program 62 may be partitioned into one or more modules/units that are stored in the memory 61 and executed by the processor 60 to complete the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing a specific function for describing the execution of the computer program 62 in the robot 6.
It will be appreciated by those skilled in the art that fig. 6 is merely an example of a robot 6 and is not meant to be limiting of the robot 6, and may include more or fewer components than shown, or may combine certain components, or different components, e.g., the robot 6 may also include input and output devices, network access devices, buses, etc.
The processor 60 may be a central processing unit (Central Processing Unit, CPU), but may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 61 may be an internal storage unit of the robot 6, such as a hard disk or a memory of the robot 6. The memory 61 may also be an external storage device of the robot 6, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card or a Flash Card provided on the robot 6. Further, the memory 61 may include both an internal storage unit and an external storage device of the robot 6. The memory 61 is used to store the computer program as well as other programs and data required by the robot 6, and may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, the specific names of the functional units and modules are only for distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts that are not detailed or illustrated in one embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/robot and method may be implemented in other ways. For example, the apparatus/robot embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical function division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on this understanding, the present application may implement all or part of the flow of the methods of the above embodiments by instructing related hardware through a computer program; the computer program may be stored in a computer-readable storage medium, and when executed by a processor, implements the steps of each of the method embodiments described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer-readable storage medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disc, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. It should be noted that the content included in the computer-readable storage medium may be appropriately increased or decreased according to the requirements of legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, in accordance with legislation and patent practice, computer-readable storage media do not include electrical carrier signals and telecommunications signals.
The above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included within the scope of the present application.

Claims (8)

1. A robot motion control method, comprising:
reading a preset robot action file; the robot action file comprises a plurality of layers, each layer corresponding to the action of one motion part of the robot and comprising layer information and an action data frame list; the layer information comprises a layer identifier, a layer total byte count, a layer total play duration and a layer total frame count; the action data frame list comprises a plurality of action data frames;
analyzing the robot action file layer by layer to obtain action data frame streams of each movement part of the robot;
and respectively controlling the movements of each movement part of the robot according to the movement data frame stream of each movement part of the robot.
2. The robot motion control method according to claim 1, further comprising, after parsing the robot motion file layer by layer to obtain a motion data frame stream for each motion part of the robot:
respectively storing the motion data frame stream of each motion part of the robot in a corresponding preset motion data frame queue;
the method for controlling the motion of each motion part of the robot according to the motion data frame stream of each motion part of the robot comprises the following steps:
and respectively playing the action data frame streams in each action data frame queue so as to control the actions of each movement part of the robot.
3. The robot motion control method of claim 1, wherein each motion data frame in the motion data frame list comprises a current frame number, a current frame length, a pre-play waiting duration, a current frame motion duration and an articulation parameter list;
the articulation parameter list includes a plurality of sets of articulation parameters, each set including a motion joint identifier and a motion angle corresponding to that identifier.
4. A robot motion control method according to any one of claims 1 to 3, wherein the robot motion file includes three layers corresponding to a head motion, a hand motion, and a foot motion of the robot, respectively.
5. A robot motion control device, comprising:
the parser is used for reading a preset robot action file; the robot action file comprises a plurality of layers, each layer corresponding to the action of one motion part of the robot and comprising layer information and an action data frame list; the layer information comprises a layer identifier, a layer total byte count, a layer total play duration and a layer total frame count; the action data frame list comprises a plurality of action data frames; the parser is also used for parsing the robot action file layer by layer to obtain an action data frame stream for each motion part of the robot;
and the play controller is used for respectively controlling the motion of each motion part of the robot according to the action data frame stream of that motion part.
6. The robot motion control apparatus of claim 5, wherein the parser is further configured to store the motion data frame stream of each motion part of the robot in a corresponding preset motion data frame queue;
the play controller is specifically configured to play the motion data frame streams in each motion data frame queue, so as to control the motion of each motion part of the robot.
7. A computer-readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the steps of the robot action control method according to any one of claims 1 to 4.
8. A robot comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, realizes the steps of the robot motion control method according to any one of claims 1 to 4.
CN202210068153.4A 2022-01-20 2022-01-20 Robot action control method and device, readable storage medium and robot Active CN114378823B (en)

Priority Applications (1)

Application Number: CN202210068153.4A · Priority Date: 2022-01-20 · Filing Date: 2022-01-20 · Title: Robot action control method and device, readable storage medium and robot


Publications (2)

Publication Number Publication Date
CN114378823A (en) 2022-04-22
CN114378823B (en) 2023-12-15

Family

ID=81203348

Family Applications (1)

Application Number: CN202210068153.4A · Status: Active · Priority Date: 2022-01-20 · Filing Date: 2022-01-20 · Title: Robot action control method and device, readable storage medium and robot

Country Status (1)

Country Link
CN (1) CN114378823B (en)


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4592276B2 (en) * 2003-10-24 2010-12-01 ソニー株式会社 Motion editing apparatus, motion editing method, and computer program for robot apparatus
US20210197378A1 (en) * 2019-12-27 2021-07-01 X Development Llc Offline robot planning with online adaptation

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002307350A (en) * 2001-04-18 2002-10-23 Sony Corp Robot device, and operation control method, control system, program and recording medium for the same
CN102064961A (en) * 2010-12-02 2011-05-18 中兴通讯股份有限公司 Multilayer protection method and device
KR20160078678A (en) * 2014-12-24 2016-07-05 장준영 System and method for dancing robots by means of matching movement to music source based on one device
CN108687779A (en) * 2018-06-25 2018-10-23 上海思依暄机器人科技股份有限公司 A kind of the dancing development approach and system of domestic robot
WO2020221311A1 (en) * 2019-04-30 2020-11-05 齐鲁工业大学 Wearable device-based mobile robot control system and control method
CN111515959A (en) * 2020-05-19 2020-08-11 厦门大学 Programmable puppet performance robot control method and system and robot



Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant