CN112192585B - Interactive performance method and system for a palm puppet performance robot - Google Patents

Interactive performance method and system for a palm puppet performance robot

Info

Publication number
CN112192585B
CN112192585B (application CN202011089702.3A)
Authority
CN
China
Prior art keywords
robot
performance
user
action
puppet
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011089702.3A
Other languages
Chinese (zh)
Other versions
CN112192585A (en)
Inventor
佘莹莹
林琳
徐筱猛
刘华辉
林佳煜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xiamen University
Original Assignee
Xiamen University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xiamen University
Priority to CN202011089702.3A
Publication of CN112192585A
Application granted
Publication of CN112192585B
Legal status: Active
Anticipated expiration

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • B25J11/003Manipulators for entertainment
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • B25J11/0005Manipulators having means for high-level communication with users, e.g. speech generator, face recognition means

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Manipulator (AREA)
  • Toys (AREA)

Abstract

An interactive performance method and system for a palm puppet performance robot. The method comprises the following steps: acquiring somatosensory data; making a robot state decision; calculating the robot action; and performing interactively. The invention breaks through the limits of manpower and takes over the rehearsal of tedious actions; it supports real-time interactive performance and suits multiple interaction scenarios such as exhibition and teaching. By analyzing and deconstructing the motion laws of real performances, in both action and form, the choreography is designed more reasonably; the system can intelligently analyze user actions in real time, making the user experience more immersive and participatory; other multimedia technologies can be incorporated, and the control program can coordinate the other parts as a whole, giving the system extensibility.

Description

Interactive performance method and system for a palm puppet performance robot
Technical Field
The present application relates generally to the field of robots, and more particularly to an interactive performance method and system for a palm puppet performance robot.
Background
Traditional hand puppetry spreads outward mainly through touring troupe performances, which are heavily constrained by the availability of skilled performers. Moreover, contemporary audiences are no longer satisfied with passively watching a puppet show and prefer the experience of manipulating the puppet themselves, yet traditional puppet theatre offers no way to provide such interactivity. To address theatrical interactivity, some existing approaches combine virtual reality technology with a motion capture device to manipulate a virtual hand puppet. In actual demonstrations, however, these methods weaken the artistic value of the traditional puppet-making craft and lack the visual experience of a physical puppet performance.
Chinese patent CN201120133417.7 discloses a robot-controlled puppet performance device comprising a controller, motors connected to the controller, a crutch-head puppet and a string puppet, and characterized in that it further comprises mechanical control arms for the crutch-head puppet and the string puppet connected to the motors, as well as a water spraying device and a voice playing device connected to the controller. The controller drives the motors and steering engines so that the crutch-head puppet, the string puppet and an animal-head puppet give a distinctive performance under the control of the mechanical arms, with voice and music played through the audio playing device.
However, existing puppet performance robots remain deficient in real-time interactive performance, real-time intelligent analysis of user actions, user immersion and participation, and extensibility.
Disclosure of Invention
In view of the above-mentioned drawbacks and deficiencies of the prior art, an interactive performance method and system for a palm puppet performance robot are provided. The invention breaks through the limits of manpower and takes over the rehearsal of tedious actions; it supports real-time interactive performance and suits multiple interaction scenarios such as exhibition and teaching. By analyzing and deconstructing the motion laws of real performances, in both action and form, the choreography is designed more reasonably; the system can intelligently analyze user actions in real time, making the user experience more immersive and participatory; other multimedia technologies can be incorporated, and the control program can coordinate the other parts as a whole, giving the system extensibility.
In a first aspect, an embodiment of the present application provides an interactive performance method for a palm puppet performance robot, where the method comprises:
a somatosensory data acquisition step: acquiring somatosensory motion data of a user and sending the acquired data to a control program to support the robot state decision, wherein the acquired data specifically comprise: the position of the user, the user's facing direction, the displacement of the user's body, and the user's gesture actions;
the method comprises the steps of robot state decision making, wherein a finite state machine is used as an action decision module of the robot, when the state machine is defined, control actions of a glove puppet are decomposed by analyzing a puppet performance video given by an actor, then key points of the control actions of all acting bodies are popularized to the finite state, the state of the finite state machine is determined by data collected by a somatosensory data collection module, the data for controlling the finite state machine are different in different links of performance, specifically, the links of the actor performance are controlled by the position and the face of the actor, the interaction links are controlled by gesture actions of a user, the finite state machine finally outputs an action vector for action calculation in the next stage, and each dimension of the action vector controls the movement of a movable joint of the robot;
a robot action calculation step: the robot action calculation module receives the action vector sent by the decision module and parses the vector to determine the specific motion of the robot; the vectors are parsed with an inverse kinematics algorithm, two axial motions are extracted as the basic motions of the system, and the rotation angle of each joint is calculated with the law of cosines;
an interactive performance step: the action vectors obtained from the state decision module are computed one by one in sequence; when the motion represented by one action vector is finished, the robot executes the motion represented by the next vector in turn, until the whole group of motions is completed.
In a second aspect, embodiments of the present application provide an interactive performance system for a palm puppet performance robot, the system comprising:
a somatosensory data acquisition module, configured to acquire somatosensory action data of a user and transmit the acquired data to a control program to support the robot state decision, wherein the acquired data specifically comprise: the position of the user, the user's facing direction, the displacement of the user's body, and the user's gesture actions;
the robot arm state decision module is used for finishing action decision by using a finite state machine, when the state machine is defined, firstly, the control action of the glove puppet is decomposed by analyzing a puppet performance video given by an actor, then, key points of the control action taking each role as a main body are popularized to a finite state, the state of the finite state machine is determined by data collected by the somatosensory data collection module, in different links of the performance, the data for controlling the finite state machine are different, specifically, the links of the actor performance are controlled by the position and the face of the actor, in an interaction link, the finite state machine is controlled by the gesture action of a user, finally, an action vector is output by the finite state machine and is used for action calculation in the next stage, and each dimension of the action vectors respectively controls the movement of one movable joint of the robot;
a robot arm motion calculation module, configured to receive the action vectors sent by the robot arm state decision module and parse them to determine the specific motion of the robot; the vectors are parsed with an inverse kinematics algorithm, two axial motions are extracted as the basic motions of the system, and the rotation angle of each joint is calculated with the law of cosines;
a robot arm motion module, configured to compute the action vectors obtained from the robot arm state decision module one by one in sequence; after the motion represented by one action vector is finished, the robot executes the motion represented by the next vector in turn, until the whole group of motions is completed.
In a third aspect, an embodiment of the present application provides a computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements a method as described in the embodiments of the present application.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 shows an interactive performance flow diagram of the palm puppet performance robot of the present application;
FIG. 2 shows a program flow diagram of the present application;
FIG. 3 shows a schematic diagram of the robot architecture composition of the present application;
FIG. 4 is a schematic diagram showing the structural composition of a robot arm part of the present application;
FIG. 5 shows a schematic structural assembly of a small arm portion of the robot of the present application;
FIG. 6 illustrates an exemplary diagram of a robot action decision module of the present application;
fig. 7 shows an exemplary diagram of a robot motion calculation module of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the present invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Referring to fig. 1, fig. 1 shows a schematic view of the interactive performance flow of the palm puppet performance robot according to the present application. As shown in fig. 1, the method comprises:
a somatosensory data acquisition step: acquiring somatosensory motion data of a user and sending the acquired data to a control program to support the robot state decision, wherein the acquired data specifically comprise: the position of the user, the user's facing direction, the displacement of the user's body, and the user's gesture actions; specifically, a Microsoft Kinect is used to acquire the somatosensory motion data of the audience (a sketch of one such data frame is given after this step list);
the method comprises the steps of robot state decision making, wherein a finite state machine is used as an action decision module of the robot, when the state machine is defined, control actions of a glove puppet are decomposed by analyzing a puppet performance video given by an actor, then key points of the control actions of all acting bodies are popularized to the finite state, the state of the finite state machine is determined by data collected by a somatosensory data collection module, the data for controlling the finite state machine are different in different links of performance, specifically, the links of the actor performance are controlled by the position and the face of the actor, the interaction links are controlled by gesture actions of a user, the finite state machine finally outputs an action vector for action calculation in the next stage, and each dimension of the action vector controls the movement of a movable joint of the robot;
a robot action calculation step: the robot action calculation module receives the action vector sent by the decision module and parses it to determine the specific motion of the robot; the vectors are parsed with an inverse kinematics algorithm, and two axial motions are extracted as the basic motions of the system; since the robot imitates the human arm and has a simple skeleton structure, the rotation angle of each joint can be calculated with the law of cosines;
an interactive performance step: the action vectors obtained from the state decision module are computed one by one in sequence; when the motion represented by one action vector is finished, the robot executes the motion represented by the next vector in turn, until the whole group of motions is completed.
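By way of illustration, three of the steps above (acquisition, state decision and interactive performance) can be sketched in code; the action calculation is illustrated after the joint-angle formulas further below. First, the acquisition step: the record below holds one frame of the acquired user data; the class and field names are assumptions made for this example, since the patent does not fix a data schema or a Kinect API binding.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class SomatosensoryFrame:
    """One frame of acquired user data (illustrative schema, not from the patent)."""
    position: Tuple[float, float, float]       # user position in sensor space, metres
    facing: Tuple[float, float, float]         # unit vector of the user's facing direction
    displacement: Tuple[float, float, float]   # body displacement since the previous frame, metres
    fingertip_to_palm: List[float] = field(default_factory=lambda: [0.0] * 5)  # d1..d5, metres
    palm_translation_speed: float = 0.0        # vT, m/s
    palm_rotation_speed: float = 0.0           # vR, degrees/s
```

Next, the state decision step: a minimal finite state machine, assuming a transition table keyed on (state, event) pairs and one fixed action vector per state; the event names and vector contents are placeholders, since the patent derives them from deconstructed performance footage rather than fixing them in code.

```python
from enum import Enum, auto
from typing import Dict, List, Tuple

class RobotState(Enum):
    INITIAL = auto()
    BENDING = auto()    # interaction states named after fig. 6 below
    MOVING = auto()
    ROTATING = auto()

class ActionDecisionFSM:
    """Finite state machine standing in for the robot's action decision module."""

    def __init__(self,
                 transitions: Dict[Tuple[RobotState, str], RobotState],
                 action_vectors: Dict[RobotState, List[float]]):
        self.state = RobotState.INITIAL
        self.transitions = transitions        # (current state, sensed event) -> next state
        self.action_vectors = action_vectors  # state -> one command per movable joint (6 here)

    def step(self, event: str) -> List[float]:
        """Advance on an event derived from the somatosensory data; return the action vector."""
        self.state = self.transitions.get((self.state, event), self.state)
        return self.action_vectors[self.state]
```

In the actor performance link the events would be derived from the actor's position and facing direction; in the interaction link, from the user's gestures. Finally, the interactive performance step: a blocking loop that runs a whole group of action vectors strictly one after another; execute_action and motion_finished stand in for the robot's motor interface, which the patent does not name.

```python
import time
from typing import Callable, Iterable, List

def perform(action_vectors: Iterable[List[float]],
            execute_action: Callable[[List[float]], None],
            motion_finished: Callable[[], bool],
            poll_interval_s: float = 0.05) -> None:
    """Execute a whole group of action vectors strictly in sequence."""
    for vector in action_vectors:
        execute_action(vector)            # start the motion this vector encodes
        while not motion_finished():      # block until that motion has fully finished...
            time.sleep(poll_interval_s)
        # ...and only then move on to the next vector in the group
```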
Preferably, the interaction with the puppet performance robot further comprises: the data acquisition module collects three kinds of user data: the distances d1, d2, d3, d4 and d5 from the fingertips to the palm, the translation speed vT of the palm, and the rotation speed vR of the palm; the robot is initially in the initial state; when d1, d2, d3, d4 and d5 are all detected to be less than 5 cm, the system judges that the user is currently clenching a fist and the robot enters the bending state; when vT is detected to be greater than 0.4 m/s, the system judges that the user is currently translating the palm and the robot enters the moving state; when vR is detected to be greater than 30 degrees/s, the system judges that the user is currently rotating the palm and the robot enters the rotating state, as shown in fig. 6.
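Reusing the SomatosensoryFrame and RobotState sketches above, these three rules can be written directly; the function name and frame fields are assumptions of this example, while the 5 cm, 0.4 m/s and 30 degrees/s thresholds are the ones stated in the text.

```python
def classify_gesture(frame: SomatosensoryFrame) -> RobotState:
    """Apply the three threshold rules of fig. 6 to one frame of user data."""
    if all(d < 0.05 for d in frame.fingertip_to_palm):  # d1..d5 all under 5 cm
        return RobotState.BENDING                       # user is clenching a fist
    if frame.palm_translation_speed > 0.4:              # vT above 0.4 m/s
        return RobotState.MOVING                        # user is translating the palm
    if frame.palm_rotation_speed > 30.0:                # vR above 30 degrees/s
        return RobotState.ROTATING                      # user is rotating the palm
    return RobotState.INITIAL                           # otherwise remain in the initial state
```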
Preferably, calculating the required rotation angle of each joint specifically comprises: the upper limb has length L1 and the lower limb has length L2; for a given target coordinate (X, Y), the upper limb needs to rotate by θ1 degrees and the lower limb by θ2 degrees, and θT is the angle between the direction vector (X, Y) and the normal, where θ1 and θ2 can be calculated according to the following formulas:

θ1 = θT − arccos((L1² + X² + Y² − L2²) / (2·L1·√(X² + Y²)))

θ2 = arccos((X² + Y² − L1² − L2²) / (2·L1·L2))

as shown in fig. 7.
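A worked version of this calculation is sketched below. The original formulas are image-only, so the expressions above and the code are the standard planar two-link reconstruction implied by the law of cosines, with the normal taken as the y-axis (θT = atan2(X, Y)); the function name is illustrative.

```python
import math
from typing import Tuple

def two_link_rotations(x: float, y: float, l1: float, l2: float) -> Tuple[float, float]:
    """Rotation angles (theta1, theta2), in radians, for upper/lower limbs of lengths
    l1 and l2 to reach target (x, y), computed via the law of cosines."""
    r_sq = x * x + y * y
    r = math.sqrt(r_sq)
    if not abs(l1 - l2) <= r <= l1 + l2:
        raise ValueError("target (x, y) is out of the arm's reach")
    # lower-limb rotation away from the fully extended configuration
    cos_t2 = (r_sq - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    theta2 = math.acos(max(-1.0, min(1.0, cos_t2)))
    # upper-limb rotation: target direction measured from the normal (y-axis),
    # minus the offset introduced by the elbow bend
    theta_t = math.atan2(x, y)
    theta1 = theta_t - math.atan2(l2 * math.sin(theta2), l1 + l2 * math.cos(theta2))
    return theta1, theta2

# Example: upper limb 0.25 m, lower limb 0.20 m, target 0.20 m sideways, 0.30 m up
t1, t2 = two_link_rotations(0.20, 0.30, 0.25, 0.20)
```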
Preferably, the construction of the palm puppet performance robot comprises: designing a basic prototype for each sub-joint of the robot according to the overall structure of a real performer's limb skeleton and the hand puppet, and mapping the prototypes onto specific motor components for assembly; the robot specifically comprises a base, a large arm part and a small arm part, wherein the lower end of the large arm part is mounted on the base and the upper end is connected to the small arm part; the large arm part is used to control the rigid displacement motion of the whole puppet and carries the base, 3 arm rods and 6 movable joints, with a large stepper motor installed in each of the 6 movable joints; the movable joints simulate the upper limb of a performer, comprising the shoulder, elbow and wrist, and the 6 degrees of freedom of the 6 movable joints provide each part with rotation about two joint axes. The palm puppet performance robot is shown in figures 3-5.
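As a configuration-style summary of this structure, the sketch below enumerates the six movable joints of the large arm part; the axis names are assumptions, while the joint names and the count of six degrees of freedom follow the text.

```python
# Large arm part: 3 simulated upper-limb parts x 2 axial rotations = 6 movable
# joints, each driven by a large stepper motor (axis names are illustrative).
LARGE_ARM_JOINTS = {
    "shoulder": ("pitch", "roll"),
    "elbow":    ("pitch", "roll"),
    "wrist":    ("pitch", "roll"),
}

assert sum(len(axes) for axes in LARGE_ARM_JOINTS.values()) == 6  # 6 degrees of freedom
```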
As another aspect, the present application also provides an interactive performance system for a palm puppet performance robot, the system comprising:
a somatosensory data acquisition module, configured to acquire somatosensory action data of a user and transmit the acquired data to a control program to support the robot state decision, wherein the acquired data specifically comprise: the position of the user, the user's facing direction, the displacement of the user's body, and the user's gesture actions;
a robot arm state decision module, configured to complete the action decision with a finite state machine; when the state machine is defined, the control actions of the glove puppet are first decomposed by analyzing puppet performance videos provided by actors, and the key points of the control actions of each character are then generalized into finite states; the state of the finite state machine is determined by the data collected by the somatosensory data acquisition module, and the data driving the state machine differ across the links of the performance: specifically, the actor performance link is driven by the position and facing direction of the actor, while in the interaction link the state machine is driven by the gesture actions of the user; the finite state machine finally outputs an action vector for the action calculation of the next stage, and each dimension of the action vector controls the movement of one movable joint of the robot;
a robot arm motion calculation module, configured to receive the action vectors sent by the robot arm state decision module and parse them to determine the specific motion of the robot; the vectors are parsed with an inverse kinematics algorithm, two axial motions are extracted as the basic motions of the system, and the rotation angle of each joint is calculated with the law of cosines;
a robot arm motion module, configured to compute the action vectors obtained from the robot arm state decision module one by one in sequence; after the motion represented by one action vector is finished, the robot executes the motion represented by the next vector in turn, until the whole group of motions is completed.
Preferably, the robot arm state decision module is further configured such that the data acquisition module collects three kinds of user data: the distances d1, d2, d3, d4 and d5 from the fingertips to the palm, the translation speed vT of the palm, and the rotation speed vR of the palm; the robot is initially in the initial state; when d1, d2, d3, d4 and d5 are all detected to be less than 5 cm, the system judges that the user is currently clenching a fist and the robot enters the bending state; when vT is detected to be greater than 0.4 m/s, the system judges that the user is currently translating the palm and the robot enters the moving state; when vR is detected to be greater than 30 degrees/s, the system judges that the user is currently rotating the palm and the robot enters the rotating state.
Preferably, the robot arm motion calculation module is further configured to calculate the joint rotations as follows: the upper limb has length L1 and the lower limb has length L2; for a given target coordinate (X, Y), the upper limb needs to rotate by θ1 degrees and the lower limb by θ2 degrees, and θT is the angle between the direction vector (X, Y) and the normal, where θ1 and θ2 can be calculated according to the following formulas:

θ1 = θT − arccos((L1² + X² + Y² − L2²) / (2·L1·√(X² + Y²)))

θ2 = arccos((X² + Y² − L1² − L2²) / (2·L1·L2))
preferably, the palm puppet performance robot is integrally constructed according to the limb skeleton of the real person and the palm puppet, and is assembled by motor components formed by mapping basic prototypes of all the sub-joints of the robot; the robot specifically comprises a base, a large arm part and a small arm part, wherein the lower end of the large arm part is arranged on the base, the upper end of the large arm part is connected with the small arm part, the large arm part is used for controlling rigid displacement motion of the whole puppet, the large arm part is provided with the base, 3 arm rods and 6 movable joints, and large stepping motors are respectively arranged in the 6 movable joints; the movable joints are used for simulating the upper limbs of performers and comprise shoulders, elbows and wrists, and 6 degrees of freedom of the 6 movable joints provide the axial rotation capacity of two joints for each part.
As another aspect, the present application also provides a computer-readable storage medium, which may be the computer-readable storage medium included in the device of the foregoing embodiments, or a separate computer-readable storage medium not assembled into any device. The computer-readable storage medium stores one or more programs used by one or more processors to perform the interactive performance method of the palm puppet performance robot described in the present application. The program flow is shown in fig. 2.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, any one or a combination of the following techniques known in the art may be used: a discrete logic circuit having logic gates for implementing logic functions on data signals, an application-specific integrated circuit (ASIC) having suitable combinational logic gates, a programmable gate array (PGA), a field-programmable gate array (FPGA), and the like.
It will be understood by those skilled in the art that all or part of the steps of the above method embodiments may be implemented by hardware following the instructions of a program, which may be stored in a computer-readable storage medium; when executed, the program performs one of, or a combination of, the steps of the method embodiments.
In addition, the functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. If implemented in the form of a software functional module and sold or used as a stand-alone product, the integrated module may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disk, or the like. Although embodiments of the present application have been shown and described above, it should be understood that the above embodiments are exemplary and are not to be construed as limiting the present application; variations, modifications, substitutions and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (9)

1. An interactive performance method for a palm puppet performance robot, the method comprising:
a somatosensory data acquisition step: acquiring somatosensory motion data of a user and sending the acquired data to a control program to support the robot state decision, wherein the acquired data specifically comprise: the position of the user, the user's facing direction, the displacement of the user's body, and the user's gesture actions;
the method comprises the steps of robot state decision making, wherein a finite state machine is used as an action decision module of the robot, when the state machine is defined, control actions of a glove puppet are decomposed by analyzing a puppet performance video given by an actor, then key points of the control actions of all acting bodies are popularized to the finite state, the state of the finite state machine is determined by data collected by a somatosensory data collection module, the data for controlling the finite state machine are different in different links of performance, specifically, the links of the actor performance are controlled by the position and the face of the actor, the interaction links are controlled by gesture actions of a user, the finite state machine finally outputs an action vector for action calculation in the next stage, and each dimension of the action vector controls the movement of a movable joint of the robot;
a robot action calculation step: the robot action calculation module receives the action vector sent by the decision module and parses the vector to determine the specific motion of the robot; the vectors are parsed with an inverse kinematics algorithm, two axial motions are extracted as the basic motions of the system, and the rotation angle of each joint is calculated with the law of cosines;
an interactive performance step: the action vectors obtained from the state decision module are computed one by one in sequence; when the motion represented by one action vector is finished, the robot executes the motion represented by the next vector in turn, until the whole group of motions is completed.
2. The method of claim 1, wherein the interaction with the puppet performance robot further comprises: the data acquisition module collects three kinds of user data: the distances d1, d2, d3, d4 and d5 from the fingertips to the palm, the translation speed vT of the palm, and the rotation speed vR of the palm; the robot is initially in the initial state; when d1, d2, d3, d4 and d5 are all detected to be less than 5 cm, the system judges that the user is currently clenching a fist and the robot enters the bending state; when vT is detected to be greater than 0.4 m/s, the system judges that the user is currently translating the palm and the robot enters the moving state; when vR is detected to be greater than 30 degrees/s, the system judges that the user is currently rotating the palm and the robot enters the rotating state.
3. The method of claim 1, wherein calculating the angle by which each joint needs to rotate specifically comprises: the upper limb has length L1 and the lower limb has length L2; for a given target coordinate (X, Y), the upper limb needs to rotate by θ1 degrees and the lower limb by θ2 degrees, and θT is the angle between the direction vector (X, Y) and the normal, where θ1 and θ2 are calculated according to the following formulas:

θ1 = θT − arccos((L1² + X² + Y² − L2²) / (2·L1·√(X² + Y²)))

θ2 = arccos((X² + Y² − L1² − L2²) / (2·L1·L2))
4. The method of claim 1, wherein the construction of the palm puppet performance robot comprises: designing a basic prototype for each sub-joint of the robot according to the overall structure of a real performer's limb skeleton and the hand puppet, and mapping the prototypes onto specific motor components for assembly; the robot specifically comprises a base, a large arm part and a small arm part, wherein the lower end of the large arm part is mounted on the base and the upper end is connected to the small arm part; the large arm part is used to control the rigid displacement motion of the whole puppet and carries the base, 3 arm rods and 6 movable joints, with a large stepper motor installed in each of the 6 movable joints; the movable joints simulate the upper limb of a performer, comprising the shoulder, elbow and wrist, and the 6 degrees of freedom of the 6 movable joints provide each part with rotation about two joint axes.
5. An interactive performance system for a palm puppet performance robot, the system comprising:
a somatosensory data acquisition module, configured to acquire somatosensory action data of a user and transmit the acquired data to a control program to support the robot state decision, wherein the acquired data specifically comprise: the position of the user, the user's facing direction, the displacement of the user's body, and the user's gesture actions;
a robot arm state decision module, configured to complete the action decision with a finite state machine; when the state machine is defined, the control actions of the glove puppet are first decomposed by analyzing puppet performance videos provided by actors, and the key points of the control actions of each character are then generalized into finite states; the state of the finite state machine is determined by the data collected by the somatosensory data acquisition module, and the data driving the state machine differ across the links of the performance: specifically, the actor performance link is driven by the position and facing direction of the actor, while in the interaction link the state machine is driven by the gesture actions of the user; the finite state machine finally outputs an action vector for the action calculation of the next stage, and each dimension of the action vector controls the movement of one movable joint of the robot;
a robot arm motion calculation module, configured to receive the action vectors sent by the robot arm state decision module and parse them to determine the specific motion of the robot; the vectors are parsed with an inverse kinematics algorithm, two axial motions are extracted as the basic motions of the system, and the rotation angle of each joint is calculated with the law of cosines;
a robot arm motion module, configured to compute the action vectors obtained from the robot arm state decision module one by one in sequence; after the motion represented by one action vector is finished, the robot executes the motion represented by the next vector in turn, until the whole group of motions is completed.
6. The system of claim 5, wherein the robot arm state decision module is further configured such that the data acquisition module collects three kinds of user data: the distances d1, d2, d3, d4 and d5 from the fingertips to the palm, the translation speed vT of the palm, and the rotation speed vR of the palm; the robot is initially in the initial state; when d1, d2, d3, d4 and d5 are all detected to be less than 5 cm, the system judges that the user is currently clenching a fist and the robot enters the bending state; when vT is detected to be greater than 0.4 m/s, the system judges that the user is currently translating the palm and the robot enters the moving state; when vR is detected to be greater than 30 degrees/s, the system judges that the user is currently rotating the palm and the robot enters the rotating state.
7. The system of claim 5, wherein the robot arm motion calculation module is further configured to calculate the joint rotations as follows: the upper limb has length L1 and the lower limb has length L2; for a given target coordinate (X, Y), the upper limb needs to rotate by θ1 degrees and the lower limb by θ2 degrees, and θT is the angle between the direction vector (X, Y) and the normal, where θ1 and θ2 are calculated according to the following formulas:

θ1 = θT − arccos((L1² + X² + Y² − L2²) / (2·L1·√(X² + Y²)))

θ2 = arccos((X² + Y² − L1² − L2²) / (2·L1·L2))
8. The system of claim 5, wherein the palm puppet performance robot is constructed as a whole according to a real performer's limb skeleton and the hand puppet, and is assembled from motor components mapped from the basic prototypes of each sub-joint of the robot; the robot specifically comprises a base, a large arm part and a small arm part, wherein the lower end of the large arm part is mounted on the base and the upper end is connected to the small arm part; the large arm part is used to control the rigid displacement motion of the whole puppet and carries the base, 3 arm rods and 6 movable joints, with a large stepper motor installed in each of the 6 movable joints; the movable joints simulate the upper limb of a performer, comprising the shoulder, elbow and wrist, and the 6 degrees of freedom of the 6 movable joints provide each part with rotation about two joint axes.
9. A computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method of any one of claims 1-4.
CN202011089702.3A 2020-10-13 2020-10-13 Interactive performance method and system for a palm puppet performance robot Active CN112192585B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011089702.3A CN112192585B (en) 2020-10-13 2020-10-13 Interactive performance method and system for a palm puppet performance robot

Publications (2)

Publication Number Publication Date
CN112192585A CN112192585A (en) 2021-01-08
CN112192585B (en) 2022-02-15

Family

ID=74009118

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011089702.3A Active CN112192585B (en) Interactive performance method and system for a palm puppet performance robot

Country Status (1)

Country Link
CN (1) CN112192585B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107330913A (en) * 2017-05-27 2017-11-07 南京信息工程大学 A kind of intelligent robot marionette performance system based on autonomous learning drama
CN108453742A (en) * 2018-04-24 2018-08-28 南京理工大学 Robot man-machine interactive system based on Kinect and method
CN109333544A (en) * 2018-09-11 2019-02-15 厦门大学 A kind of image exchange method for the marionette performance that spectators participate in
CN110515455A (en) * 2019-07-25 2019-11-29 山东科技大学 It is a kind of based on the dummy assembly method cooperateed in Leap Motion and local area network
CN110694286A (en) * 2019-11-05 2020-01-17 厦门大学 Method for simulating palm puppet performance by using mechanical arm
US10768708B1 (en) * 2014-08-21 2020-09-08 Ultrahaptics IP Two Limited Systems and methods of interacting with a robotic tool using free-form gestures

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101479234B1 (en) * 2008-09-04 2015-01-06 삼성전자 주식회사 Robot and method of controlling the same

Also Published As

Publication number Publication date
CN112192585A (en) 2021-01-08

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant