CN116394277B - Humanoid piano-playing robot - Google Patents

Humanoid piano-playing robot

Info

Publication number
CN116394277B
CN116394277B CN202310676079.9A
Authority
CN
China
Prior art keywords
piano
humanoid
playing
robot
arm
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310676079.9A
Other languages
Chinese (zh)
Other versions
CN116394277A (en)
Inventor
宛敏红
顾建军
朱世强
严敏东
黄秋兰
钟灵
高广
张璞
方伟
姚运昌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Lab
Original Assignee
Zhejiang Lab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Lab filed Critical Zhejiang Lab
Priority to CN202310676079.9A priority Critical patent/CN116394277B/en
Publication of CN116394277A publication Critical patent/CN116394277A/en
Application granted granted Critical
Publication of CN116394277B publication Critical patent/CN116394277B/en
Priority to PCT/CN2023/120387 priority patent/WO2024008217A1/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00 Manipulators not otherwise provided for
    • B25J11/003 Manipulators for entertainment
    • B25J11/004 Playing a music instrument
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J17/00 Joints
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 Sensing devices
    • B25J19/04 Viewing devices
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/161 Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1612 Programme controls characterised by the hand, wrist, grip control
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1661 Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Fuzzy Systems (AREA)
  • Artificial Intelligence (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Multimedia (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Manipulator (AREA)
  • Toys (AREA)

Abstract

This specification discloses a humanoid piano-playing robot. The humanoid piano-playing robot may include paws, arms, a waist, a neck, legs, a control system and a base, wherein the neck corresponds to two degrees of freedom to drive the head to perform rotation and pitching motions, and the waist corresponds to two degrees of freedom to drive the upper body to perform rotation and pitching motions. Through a visual perception unit, the humanoid piano-playing robot can accurately locate the positions of the keys, intelligently recognize the content of a music score, and automatically play the piano according to the music score. By controlling the two degrees of freedom of the waist and head and the poses of the two arms, the piano-playing robot in this specification can play the piano flexibly and intelligently.

Description

Humanoid piano-playing robot
Technical Field
This specification relates to the field of robots, and in particular to a humanoid piano-playing robot.
Background
With the progress of technology, robots are gradually entering people's lives, for example sweeping robots and medical robots.
In practice, however, it is still difficult to apply robots to piano performance.
How to apply robots effectively to piano playing is therefore a problem to be solved.
Disclosure of Invention
The present specification provides a humanoid piano playing robot to partially solve the above-mentioned problems existing in the prior art.
The technical scheme adopted in the specification is as follows:
the present specification provides a humanoid piano playing robot, the humanoid piano playing robot including: a left paw, a right paw, a left arm, a right arm, a waist, a neck, a visual perception unit, a left leg, a right leg, a control system and a base, wherein the neck corresponds to two degrees of freedom so as to drive the head to perform a rotation motion and a pitching motion, and the waist corresponds to two degrees of freedom so as to drive the upper body of the robot to perform a rotation motion and a pitching motion;
the humanoid piano playing robot is used for acquiring a music score image and a piano key image of a piano key facing the humanoid piano playing robot through the visual perception unit, identifying the music score image to obtain music score information, and positioning the piano key according to the key image to obtain positioning data;
according to the positioning data and the music score information, a fingering adopted by the humanoid piano playing robot when playing the music corresponding to the music score information is planned, and a pose change sequence corresponding to the body part of the humanoid piano playing robot when playing the music according to the fingering is determined;
and controlling, through the control system, the left paw and the right paw according to the fingering, and controlling the body parts other than the left paw and the right paw according to the pose change sequence.
Optionally, the humanoid piano playing robot is further configured to encode the music score information to obtain encoded information in a preset format, and send the encoded information to the control system, where the music score information at least includes a beat, a note and a modifier.
Optionally, the humanoid piano playing robot further comprises: a speech unit;
the humanoid piano playing robot is used for receiving voice information sent by a user through the voice unit, judging whether the purpose of the user is to request the humanoid piano playing robot to conduct piano playing according to the voice information, and if yes, controlling the humanoid piano playing robot to conduct piano playing through the control system.
Optionally, the humanoid piano playing robot is configured to, if it is determined according to the voice information that the user aims to request the humanoid piano playing robot to perform a piano playing, identify the voice information, determine track information of the humanoid piano playing robot requested by the user, and control the humanoid piano playing robot to perform a piano playing through the control system according to the track information.
Optionally, the humanoid piano playing robot is used for filtering noise in the voice information, and judging whether the purpose of the user is to request the humanoid piano playing robot to perform piano playing according to the voice information after noise filtering.
Optionally, the humanoid piano playing robot is configured to determine, according to the positioning data, an arm end pose required by the left arm and/or the right arm to play the music piece corresponding to the music piece according to the fingering at each moment, and determine, according to the arm end pose, a pose change sequence corresponding to a body part of the humanoid piano playing robot when the humanoid piano playing robot plays the music piece according to the fingering.
Optionally, the control system is configured to perform an inverse kinematics solution for each arm end pose in the pose change sequence, and determine the joint angles required by the left arm or the right arm to reach the arm end pose; and to control the left arm or the right arm to move according to the joint angles corresponding to each arm end pose.
Optionally, the control system is configured to determine, for each arm end pose in the pose change sequence, an arm to which the arm end pose belongs, and convert the arm end pose to a shoulder coordinate system of an arm to which the arm end pose belongs according to a fixed rotation angle between a shoulder and a waist corresponding to the arm end pose, a yaw angle and a pitch angle of the waist at the moment, and coordinates of a shoulder corresponding to the arm to which the arm end pose belongs under a local coordinate system of a pitch joint of the waist, so as to obtain a converted pose; and the control system controls the left arm and/or the right arm to move according to the converted pose.
The present specification provides a control method of a humanoid piano-playing robot, the method being applied to the humanoid piano-playing robot, the humanoid piano-playing robot comprising: a left paw, a right paw, a left arm, a right arm, a waist, a neck, a visual perception unit, a left leg, a right leg, a control system and a base, wherein the neck corresponds to two degrees of freedom to drive the head to perform rotation and pitching motions, and the waist corresponds to two degrees of freedom to drive the upper body to perform rotation and pitching motions, the method comprising:
acquiring a music score image and a key image of a piano key facing the humanoid piano playing robot through the visual perception unit;
identifying the music score image to obtain music score information, and positioning the piano keys according to the key images to obtain positioning data;
according to the positioning data and the music score information, a fingering adopted by the humanoid piano playing robot when playing the music corresponding to the music score information is planned, and a pose change sequence corresponding to the body part of the humanoid piano playing robot when playing the music according to the fingering is determined;
and controlling, through the control system, the left paw and the right paw according to the fingering, and controlling the body parts other than the left paw and the right paw according to the pose change sequence.
Optionally, the method further comprises:
and encoding the music score information to obtain encoding information in a preset format, and transmitting the encoding information to the control system, wherein the music score information at least comprises beats, notes and modifiers.
Optionally, the humanoid piano playing robot further comprises: a speech unit;
the method further comprises the steps of:
receiving voice information sent by a user through the voice unit;
and judging whether the purpose of the user is to request the humanoid piano playing robot to conduct piano playing according to the voice information, and if so, controlling the humanoid piano playing robot to conduct piano playing through the control system.
Optionally, controlling the humanoid piano playing robot through the control system to perform piano playing specifically includes:
if the purpose of the user is to request the humanoid piano playing robot to perform piano playing according to the voice information, the voice information is identified, and the track information of the humanoid piano playing robot playing requested by the user is determined;
and controlling, through the control system, the humanoid piano playing robot to perform piano playing according to the track information.
Optionally, determining a pose change sequence corresponding to a body part of the humanoid piano playing robot when the humanoid piano playing robot plays the music according to the fingering specifically includes:
according to the music score information and the positioning data, determining the arm tail end pose required by the left arm and/or the right arm to play music corresponding to the music score information at each moment;
and determining a pose change sequence corresponding to the body part of the humanoid piano playing robot when the humanoid piano playing robot plays the music according to the fingering according to the pose of the tail end of the arm.
The present specification provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the control method of the above-described humanoid piano-playing robot.
The present specification provides an electronic apparatus including a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor implementing the control method of the above-described humanoid piano-playing robot when executing the program.
The at least one technical scheme adopted in this specification can achieve the following beneficial effects:
as can be seen from the above-mentioned humanoid piano-playing robot, the humanoid piano-playing robot may include a left hand claw, a right hand claw, a left arm, a right hand arm, a waist, a neck, a visual sensing unit, a left leg, a right leg, a control system, and a base, wherein the neck corresponds to two degrees of freedom to drive the head to perform a swivel motion and a pitch motion, and the waist corresponds to two degrees of freedom to drive the upper body to perform a swivel motion and a pitch motion; the humanoid piano playing robot is used for acquiring a music score image and a piano key image of a piano key facing the humanoid piano playing robot through a visual perception unit, identifying the music score image to obtain music score information, and positioning the piano key according to the key image to obtain positioning data; according to the positioning data and the music score information, a fingering adopted by the humanoid piano playing robot when playing music corresponding to the music score information is planned, and a pose change sequence corresponding to a body part of the humanoid piano playing robot when playing the music according to the fingering is determined; the left hand claw and the right hand claw are controlled according to fingering through a control system, and body parts except the left hand claw and the right hand claw are controlled according to a pose change sequence.
From the above, it can be seen that the humanoid piano-playing robot provided in this specification has two degrees of freedom in both the waist and the neck, so that anthropomorphic piano playing can be achieved flexibly; the positions of the keys are accurately located by the visual perception unit, the content of the music score can be intelligently recognized, and the piano can be played automatically according to the music score. By controlling the two degrees of freedom of the waist and head and the poses of the two arms, the piano-playing robot in this specification can play the piano flexibly and intelligently.
Drawings
The accompanying drawings, which are included to provide a further understanding of the specification, illustrate and explain exemplary embodiments of the present specification and are not intended to limit the specification unduly. In the drawings:
fig. 1 is a schematic diagram of a performance system composed of a humanoid piano-playing robot and a piano provided in the present specification;
fig. 2 is a schematic structural view of a humanoid piano-playing robot provided in the present specification;
fig. 3 is a schematic view showing joint motions of the waist and the neck of a humanoid piano-playing robot provided in the present specification;
FIG. 4 is a system block diagram of a visual perception unit provided herein;
FIG. 5 is a specific system block diagram of the speech unit provided in the present specification;
FIG. 6 is a schematic diagram of a world coordinate system and a local coordinate system corresponding to a left arm provided in the present specification;
fig. 7 is a flowchart of a control method of the humanoid piano-playing robot in the present specification;
fig. 8 is a schematic view of the electronic device corresponding to fig. 1 provided in the present specification.
In fig. 1, 1 denotes the humanoid piano-playing robot, 2 denotes the piano, 11 denotes the robot's left paw, 12 its right paw, 13 its left arm, 14 its right arm, 15 its waist, 16 its neck, 17 its visual perception unit, 18 its voice unit, 19 its left leg, 110 its right leg, 111 its control system, and 112 its base.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the present specification more apparent, the technical solutions of the present specification will be clearly and completely described below with reference to specific embodiments of the present specification and corresponding drawings. It will be apparent that the described embodiments are only some, but not all, of the embodiments of the present specification. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are intended to be within the scope of the present disclosure.
The following describes in detail the technical solutions provided by the embodiments of the present specification with reference to the accompanying drawings.
In the present specification, a humanoid piano-playing robot is provided; a performance system combining the humanoid piano-playing robot with a piano may be as shown in fig. 1.
Fig. 1 is a schematic diagram of a performance system composed of a humanoid piano-playing robot and a piano provided in the present specification.
In fig. 1, 1 is the humanoid piano-playing robot and 2 is the piano. By controlling the left and right arms and paws to move in space, positioning the fingers above the corresponding keys, and driving the fingers to press the keys, piano playing by the humanoid piano-playing robot can be realized. The specific structure of the humanoid piano-playing robot and the implementation of piano playing are described below with reference to fig. 2.
Fig. 2 is a schematic structural view of a humanoid piano-playing robot provided in the present specification.
Wherein, this humanoid piano performance robot can include: left hand claw 11, right hand claw 12, left hand arm 13, right hand arm 14, waist 15, neck 16, visual perception unit 17, speech unit 18, left leg 19, right leg 110, control system 111, base 112.
The neck 16 corresponds to two degrees of freedom, namely a rotation degree of freedom and a pitching degree of freedom, each driven by a servo motor, so as to drive the head to perform rotation and pitching motions and achieve an anthropomorphic head-turning and nodding effect.
The waist 15 corresponds to two degrees of freedom, namely a rotation degree of freedom and a pitching degree of freedom, both driven by servo motors, so as to drive the upper body to perform rotation and pitching motions.
The waist drives the whole upper body, realizing an anthropomorphic waist-swaying effect; through the two degrees of freedom configured for each of the neck 16 and the waist 15, the waist and the neck can move together to realize the anthropomorphic head-and-body swaying action during piano playing, as shown in fig. 3.
Fig. 3 is a schematic view showing joint motions of the waist and the neck of a humanoid piano-playing robot provided in the present specification.
As can be seen from fig. 3, while the humanoid piano-playing robot is playing the piano, both the waist and the head can swing downward; of course, this is not limiting, and leftward, rightward, forward and downward swings of the body and head can all be achieved through the configured two degrees of freedom. How the motions of the waist and head are controlled is described in detail later.
Because the waist can rotate and pitch through configuration of two degrees of freedom, in the process that the humanoid piano playing robot plays a piano, the claws of the humanoid piano playing robot can more flexibly reach the positions of keys required to be pressed.
The left arm 13 and the right arm 14 are multi-degree-of-freedom humanoid mechanical arms; specifically, each arm may be configured with 7 degrees of freedom, i.e., 7 joints. The arms can perform pose calculation and motion planning according to the action requirements of the paws when playing the piano, thereby accurately positioning the paws in space.
The left leg and the right leg are both designed to imitate human legs, wherein a pitching degree of freedom is configured at the lower part of the right leg so that it can step on a piano pedal.
The control system can be used for planning the movement track of the robot and sending control instructions to the driving system to control the movement of each joint.
The base is used for installing the robot main body and has the functions of height adjustment, multidirectional movement and locking.
When a piano playing is required, the humanoid piano playing robot can acquire a music score image and a piano key image of a piano key facing the humanoid piano playing robot through the visual perception unit 17, and identify the music score image to obtain music score information, and position the piano key according to the key image to obtain positioning data.
The visual perception unit may be connected with a depth camera (RGB-D camera), through which the music score image and the key image are acquired; a system block diagram of the visual perception unit is shown in fig. 4.
Fig. 4 is a system block diagram of a visual perception unit provided in the present specification.
The visual perception unit can comprise a piano key positioning module and a piano spectrum identification module.
When the visual perception unit receives a score-reading instruction, it acquires an image as input from the RGB-D camera; the music score recognition module detects the music score, extracts the semantic information in it, encodes that information into an information stream in MusicXML format, and sends the stream to the control system. The semantic information in the music score may include the track name, tempo, beat, key, notes, modifiers, and so on.
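As a rough illustration of the encoding step, the sketch below builds a minimal MusicXML-style stream from a few extracted notes using the standard library. The function name, the note representation, and the reduction to a single measure are assumptions for illustration; the patent does not specify the encoder.

```python
import xml.etree.ElementTree as ET

def encode_score(notes, divisions=1):
    """Encode extracted (step, octave, duration) tuples as a minimal
    score-partwise MusicXML document and return it as a string."""
    root = ET.Element("score-partwise", version="3.1")
    part_list = ET.SubElement(root, "part-list")
    score_part = ET.SubElement(part_list, "score-part", id="P1")
    ET.SubElement(score_part, "part-name").text = "Piano"
    part = ET.SubElement(root, "part", id="P1")
    measure = ET.SubElement(part, "measure", number="1")
    attrs = ET.SubElement(measure, "attributes")
    ET.SubElement(attrs, "divisions").text = str(divisions)  # duration units per quarter note
    for step, octave, duration in notes:
        note = ET.SubElement(measure, "note")
        pitch = ET.SubElement(note, "pitch")
        ET.SubElement(pitch, "step").text = step
        ET.SubElement(pitch, "octave").text = str(octave)
        ET.SubElement(note, "duration").text = str(duration)
    return ET.tostring(root, encoding="unicode")

# e.g. a C-major triad played as three notes
xml_stream = encode_score([("C", 4, 1), ("E", 4, 1), ("G", 4, 2)])
```

A real encoder would also carry the tempo, beat, key, and modifier information listed above; the sketch shows only the note payload.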
When the visual perception unit receives a playing instruction, a key image can be acquired through the RGB-D camera; the piano key positioning module then computes, from the key image, the pose of the piano relative to the humanoid piano-playing robot, thereby determining the positioning data, which is sent to the control system. The control system plans the fingering and the pose change sequence to control the humanoid piano-playing robot's performance.
When determining the positioning data, the key position of every key can be determined directly; alternatively, the key position of the first note to be played can be determined, and the positions of the remaining keys derived from the pre-configured relative positional relationships among the piano keys.
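The second option above, deriving the remaining key positions from one located reference key, can be sketched as follows. The white-key spacing and the assumption that the keyboard runs along the robot's x axis are illustrative, not from the patent:

```python
WHITE_KEY_PITCH_MM = 23.5  # approximate standard white-key spacing; an assumption

def white_key_positions(ref_xyz, ref_index, n_keys):
    """Given the centre of one located reference key (ref_xyz) and its index
    among the white keys, return the centre of each of n_keys white keys,
    assuming the keyboard is aligned with the x axis of the robot frame."""
    x0, y0, z0 = ref_xyz
    return [(x0 + (i - ref_index) * WHITE_KEY_PITCH_MM, y0, z0)
            for i in range(n_keys)]

# reference key (index 3) located at x=400 mm in front of the robot
keys = white_key_positions((400.0, 0.0, 50.0), ref_index=3, n_keys=7)
```

In practice the pre-configured relative positions would also cover the black keys and any keyboard tilt; the sketch shows only the core offset computation.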
Then, the fingering adopted by the humanoid piano playing robot when playing the music corresponding to the music information can be planned according to the positioning data (the key positions are represented) and the music information, and the pose change sequence corresponding to the body part of the humanoid piano playing robot when playing the music according to the fingering is determined.
The fingering adopted when playing the music corresponding to the music score information is a rule assigning each note in the score to a specific finger of the left or right paw. The fingering may be determined as follows: using a score-based planning method, an arm-and-paw action sequence (i.e., a fingering) that plays the piece best is planned for the given piece. That is, multiple fingerings can be planned for one piece and the best one selected: multiple arm-and-paw action sequences capable of playing the piece are planned, each sequence is scored, and the sequence with the highest score is taken as the fingering to use. The score of a sequence may be determined by the movement cost it requires: the higher the cost, the lower the score.
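The score-based selection described above can be sketched in miniature: enumerate candidate finger assignments for a short note sequence, cost each by total finger travel, and keep the lowest-cost candidate (lower cost meaning higher score). The cost function, the one-dimensional key numbering, and the candidate cap are all illustrative assumptions, not the patent's planner:

```python
import itertools

def movement_cost(notes, fingering, finger_home):
    """Total travel: sum over notes of |key index - home key of assigned finger|."""
    return sum(abs(key - finger_home[f]) for key, f in zip(notes, fingering))

def plan_fingering(notes, finger_home, max_candidates=500):
    """Enumerate finger assignments (capped for tractability) and return the
    assignment with the lowest movement cost, i.e. the highest score."""
    fingers = list(finger_home)
    best, best_cost = None, float("inf")
    candidates = itertools.product(fingers, repeat=len(notes))
    for cand in itertools.islice(candidates, max_candidates):
        cost = movement_cost(notes, cand, finger_home)
        if cost < best_cost:
            best, best_cost = cand, cost
    return best, best_cost

# fingers 1..5 resting over white keys 0..4; play keys 0, 2, 4
home = {1: 0, 2: 1, 3: 2, 4: 3, 5: 4}
fingering, cost = plan_fingering([0, 2, 4], home)  # -> (1, 3, 5), cost 0
```

A real planner would score whole arm-and-paw action sequences and prune the exponential candidate set, but the select-by-cost principle is the same.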
Then, a pose change sequence corresponding to the body part of the humanoid piano playing robot when the music is played according to the fingering can be determined, wherein the pose change sequence can comprise poses at a plurality of time points, and the pose corresponding to each time point can comprise poses corresponding to all the positions: such as the pose corresponding to the left arm 13 and the right arm 14, and the pose corresponding to the waist 15, the neck 16 and the right leg 110.
Finally, by the control system 111, the left and right paws 11, 12 can be controlled according to the above-planned fingering, and the body parts other than the left and right paws 11, 12 can be controlled according to the pose change sequence.
The paws (the left paw 11 and the right paw 12) may take a five-finger form, wherein the thumb and the index finger may each be configured with two degrees of freedom (pitch and roll), and the other three fingers with one pitch degree of freedom each. All fingers are driven by miniature planetary gear motors, whose rotation is converted into the pressing motion of the fingertips through a linkage transmission.
Each finger can be provided with a pressing-force detection module, so that the pressure of the fingertip on the key is detected in real time, improving sensing capability during playing. The pressing-force detection module serves several functions. For example, if the fingertip is detected to press a key with excessive force (e.g., greater than a specified threshold), the robot may be powered off to protect the key. For another example, the pressure the robot's fingertip needs to apply to a key can be determined from the sound-intensity information in the music score; while the finger is controlled to press the key, that target pressure is achieved through the cooperation of the pressing-force detection module and the motor (the miniature planetary gear motor) installed on the finger.
Through action planning, the fingering can be automatically generated according to the music score content, namely, an action sequence of pressing the key by the finger corresponding to each note, and sound is generated by controlling the finger to press the corresponding key.
The arms adopt a multi-degree-of-freedom humanoid configuration that can imitate the dexterous movement of human arms, realizing an anthropomorphic playing effect. All arm joints are driven by servo motors, enabling high-precision motion control.
The pose change sequence mentioned above includes arm-end poses, and an arm-end pose can be used to guide a finger of the paw onto the key to be played. Therefore, the arm-end pose required at each moment by the left arm 13 and/or the right arm 14 to play the music according to the fingering can be determined from the positioning data, and the pose change sequence corresponding to the body parts of the humanoid piano-playing robot when playing according to the fingering is determined from these arm-end poses.
Of course, the pose changing sequence can also include poses corresponding to other positions, such as poses corresponding to the waist and the head, and can be determined by rhythms in the music score information, so that two degrees of freedom corresponding to the waist and the head respectively are controlled according to the determined poses, and the waist and the head achieve the swinging effect when the anthropomorphic musical instrument is played. Of course, the pose corresponding to the waist and the head may be determined according to the preset swing rule of the waist and the head, so as to control the motion of the waist and the head, and the specific mode is not limited.
From the above, it can be seen that positioning the paw in space is achieved by arm motion control. During playing, the arm-end pose is calculated from the key positions and finger positions, and the motion angle of each arm joint is then obtained through an inverse kinematics solution, so that by controlling the arm motion the paw can reach the position for pressing the keys.
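The patent's arms have 7 joints, but the pose-to-joint-angle step can be illustrated with the textbook closed-form inverse kinematics of a planar 2-link arm; the link lengths and the elbow-down solution choice below are illustrative assumptions, not the robot's actual kinematics:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Return (shoulder, elbow) joint angles in radians that place the tip of a
    planar 2-link arm (link lengths l1, l2) at (x, y), elbow-down solution."""
    d2 = x * x + y * y
    c2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)  # law of cosines for the elbow
    c2 = max(-1.0, min(1.0, c2))                   # clamp for numerical safety
    q2 = math.acos(c2)
    q1 = math.atan2(y, x) - math.atan2(l2 * math.sin(q2), l1 + l2 * math.cos(q2))
    return q1, q2

# reach a key-pressing position 0.3 m forward, 0.4 m up, with 0.3 m / 0.25 m links
q1, q2 = two_link_ik(0.3, 0.4, 0.3, 0.25)
```

For the real 7-joint arm the inverse kinematics is redundant and typically solved numerically, but the input/output contract, arm-end pose in, joint angles out, is the same as in this sketch.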
The above-mentioned voice unit 18 is mainly used for realizing voice recognition and voice interaction. The voice unit 18 can receive voice information sent by a user, and the humanoid piano playing robot can determine from the voice information whether the user's purpose is to request the humanoid piano playing robot to perform piano playing; if so, the control system 111 controls the humanoid piano playing robot to perform piano playing.
The timing of receiving a performance command may be the moment at which it is determined that the user is requesting the humanoid piano playing robot to perform piano playing. Of course, the performance command may also be initiated by the user through a key arranged on the humanoid piano playing robot; the specific timing at which the performance command is received is not limited.
If it is determined from the above-described voice information that the user is requesting the humanoid piano playing robot to perform piano playing, the voice information is recognized, the track information of the performance requested by the user is determined, and the control system 111 controls the humanoid piano playing robot to perform piano playing according to the track information. The track information mentioned here may include a track name, a track type, etc.
Fig. 5 is a block diagram of a specific system of the speech unit provided in the present specification.
As shown in fig. 5, the microphone first inputs the original voice information into the preprocessing module. After the original voice information undergoes voice noise reduction and echo cancellation, noise and the audio data played by the loudspeaker are filtered out, yielding noise-filtered voice information, which is then sent to the natural language processing module.
The natural language processing module is deployed with a speech recognition algorithm, a speech classification algorithm, and a semantic understanding algorithm. The speech recognition algorithm transcribes speech into text; the speech classification algorithm classifies the speech into task, small-talk, and question-answering types; and the semantic understanding algorithm can extract the purpose of the speech and some important parameters (e.g., track information).
After the speech is classified, if it is of the small-talk or question-answering type, an intelligent reply can be generated by the intelligent reply generation module, synthesized into speech by the speech synthesis module, and finally output to the loudspeaker. If it is of the task type and the purpose of the speech is piano playing, the track information of the requested performance is extracted and sent to the control system. The control system can then acquire the corresponding music score information from the track information (the specific manner is not limited: for example, it may be acquired through a public network, or music score information corresponding to fixed tracks may be stored in the robot so that the control system can obtain it directly), after which the control system drives the robot to move and play.
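The classify-and-dispatch flow above can be sketched as follows; the keyword rules, class labels, and reply strings are stand-ins for the speech classification and semantic understanding algorithms, whose internals the specification does not give.

```python
def route_utterance(text):
    """Classify an utterance and extract its parameters.

    Assumption: simple keyword rules stand in for the patent's speech
    classification ('task' / 'chat' / 'qa') and semantic understanding.
    """
    if "play" in text.lower():
        # Task class: the requested track is the 'important parameter'.
        track = text.lower().split("play", 1)[1].strip() or None
        return {"class": "task", "intent": "play_piano", "track": track}
    if text.endswith("?"):
        return {"class": "qa", "reply": "answer generated by Q&A module"}
    return {"class": "chat", "reply": "small-talk reply via response generator"}

def handle(text, control_system):
    """Dispatch: task requests go to the control system (modelled here
    as a simple command queue); chat/Q&A get a synthesized reply."""
    result = route_utterance(text)
    if result["class"] == "task":
        control_system.append(("play", result["track"]))
        return "starting performance"
    return result["reply"]
```

A production classifier would of course be a trained model rather than keyword matching; the sketch only shows the routing structure.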
From the above, it can be seen that the control system is responsible for motion planning and execution control, which proceeds in two steps. The first step plans, from the track to be played, the mapping from musical beats to finger poses in space (this can be understood as the correspondence of which finger plays which note). The second step plans the trajectory of the arm (mechanical arm) end according to the finger poses determined in the first step, and performs an inverse kinematics solution to drive all joints of the arm to move.
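The first of these two steps can be sketched as follows. The uniform key spacing, the MIDI-style note numbering, and the function names are all illustrative assumptions; the patent does not specify how the beat-to-pose mapping is computed.

```python
# Simplification (an assumption, not the patented planner): keys are
# treated as uniformly spaced along x, so the lateral target is linear
# in the MIDI note number, counted from A0 (MIDI note 21).
KEY_PITCH_MM = 13.7  # approximate average lateral pitch per semitone

def plan_finger_targets(notes, key_origin_x_mm=0.0):
    """Map (time, midi_note, finger) triples to fingertip target positions.

    Each entry records when the note sounds, which finger plays it, and
    the fingertip press point on the key surface (y/z held at zero here).
    """
    targets = []
    for t, midi_note, finger in notes:
        x = key_origin_x_mm + (midi_note - 21) * KEY_PITCH_MM
        targets.append({"time": t, "finger": finger,
                        "pos_mm": (x, 0.0, 0.0)})
    return targets
```

A real planner would use the key positions measured by the visual perception unit instead of a fixed pitch, and would also choose the fingering rather than take it as input.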
The second step is described specifically: the sequence of arm end poses is determined, and the joint angles of the arm are determined according to that sequence so as to control the movement of the arm.
That is, for each arm end pose in the pose sequence, an inverse kinematics solution is performed to determine the joint angles required for the left arm 13 or the right arm 14 to reach that arm end pose, and the left arm 13 or the right arm 14 is controlled to move according to the joint angles corresponding to each arm end pose. For a given arm end pose, if it belongs to the left arm, the joint angles of the left arm are determined; if it belongs to the right arm, the joint angles of the right arm are determined.
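The per-pose loop just described can be sketched as follows; `ik_solve` and `move` are placeholders for the robot's inverse kinematics solver and joint-level drive interface, which the specification does not expose.

```python
def execute_pose_sequence(pose_seq, ik_solve, move):
    """For each arm-end pose, pick the owning arm, solve inverse
    kinematics for that arm, and command its joints.

    `ik_solve(arm, pose)` returns the joint angles reaching `pose`;
    `move(arm, joints)` drives that arm's joints. Both are placeholders.
    """
    for pose in pose_seq:
        arm = pose["arm"]                       # "left" or "right"
        joints = ik_solve(arm, pose["end_pose"])
        move(arm, joints)
```

With stub callables the loop simply threads each pose through the solver to the drives, which is all the patent text asserts about this step.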
It should be noted that, since the left arm and the right arm are distinct while the determined arm end poses are usually expressed in a single coordinate system (for example, a world coordinate system centered at the waist center of the robot), for each arm end pose in the pose change sequence the arm to which it belongs may first be determined. The arm end pose is then converted into the shoulder coordinate system of that arm according to the fixed rotation angle between the corresponding shoulder and the waist 15, the yaw angle and pitch angle of the waist at that moment, and the coordinates of the corresponding shoulder in the local coordinate system of the pitch joint of the waist 15, obtaining the converted pose, so that the control system 111 controls the left arm 13 and/or the right arm 14 to move according to the converted pose.
It should be noted that the pose each joint of the arm needs to reach can be determined from the arm end pose, and the angle of each joint can then be determined; these poses can likewise be converted into the shoulder coordinate system in a manner similar to that described above.
The world coordinate system and the shoulder coordinate system corresponding to the left arm may be specifically shown in fig. 6.
Fig. 6 is a schematic diagram of a world coordinate system and a shoulder coordinate system corresponding to a left arm provided in the present specification.
The two mechanical arms are controlled by pose control in Cartesian space, and the coordinate origin of the world coordinate system of the Cartesian space is located at the waist center. The above-mentioned local coordinate system of the pitch joint of the waist 15 may refer to this world coordinate system, with the X-axis pointing directly in front of the robot, the Y-axis pointing directly to the left of the robot, and the Z-axis pointing directly above the robot.
The coordinate origins of the two shoulder coordinate systems of the two arms are located at the centers of the left and right shoulders respectively (these are the shoulder coordinate systems referred to above), and their coordinate axes are parallel to those of the world coordinate system. Taking the right arm as an example, the control system receives the arm end pose of the right arm end, converted from the music score, in the world coordinate system; a path from the current position of the mechanical arm end to the target position is planned as an inverted parabola, and inverse kinematics calculation can be performed for the points along the whole path.
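The inverted-parabola path can be sketched as a straight-line interpolation with a parabolic lift on the vertical axis, so the fingertip arcs up and over between keys; the lift height and point count are tuning assumptions.

```python
def inverted_parabola_path(p_start, p_goal, lift, n_points=20):
    """Interpolate an arched (inverted-parabola) fingertip path.

    x/y/z move linearly from start to goal while z gets an extra
    parabolic bump peaking at `lift` mid-path, so the finger clears
    neighbouring keys. Each returned point would then go through the
    inverse kinematics solution.
    """
    path = []
    for i in range(n_points):
        s = i / (n_points - 1)                     # path parameter in [0, 1]
        x, y, z = (ps + s * (pg - ps) for ps, pg in zip(p_start, p_goal))
        z += lift * 4.0 * s * (1.0 - s)            # parabolic bump, max at s=0.5
        path.append((x, y, z))
    return path
```

The `4 s (1 - s)` factor is the standard unit parabola that is zero at both endpoints and one at the midpoint, which gives the "lift, traverse, descend" shape without any extra waypoints.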
For a given time point, the arm end position \( \boldsymbol{p}_w \) of the mechanical arm in the world coordinate system can be converted into the right shoulder coordinate system to obtain the position \( \boldsymbol{p}_s \):

\[
\boldsymbol{p}_s = \boldsymbol{p}_w - R_z(\alpha)\,R_y(\beta)\,R_x(\gamma)\,\boldsymbol{p}_{js}
\]

where \( \alpha \) and \( \beta \) are the yaw angle and pitch angle of the waist at this moment, \( \gamma \) is the fixed rotation angle between the shoulder and the waist, and \( \boldsymbol{p}_{js} \) are the coordinates of the arm shoulder in the local coordinate system of the pitch joint of the waist.

Wherein, each rotation matrix is as follows:

\[
R_z(\alpha)=\begin{pmatrix}\cos\alpha & -\sin\alpha & 0\\ \sin\alpha & \cos\alpha & 0\\ 0 & 0 & 1\end{pmatrix},\quad
R_y(\beta)=\begin{pmatrix}\cos\beta & 0 & \sin\beta\\ 0 & 1 & 0\\ -\sin\beta & 0 & \cos\beta\end{pmatrix},\quad
R_x(\gamma)=\begin{pmatrix}1 & 0 & 0\\ 0 & \cos\gamma & -\sin\gamma\\ 0 & \sin\gamma & \cos\gamma\end{pmatrix}
\]
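Since the shoulder axes stay parallel to the world axes, the world-to-shoulder conversion reduces to subtracting the shoulder's world position from the arm-end position, with the shoulder's world position obtained by rotating the local shoulder offset through the waist yaw, waist pitch, and the fixed shoulder-waist rotation. The rotation composition order below is an assumption recovered from the surrounding definitions; this is a sketch, not the patented implementation.

```python
import math

def rot_x(a):
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def mat_vec(m, v):
    """3x3 matrix times 3-vector."""
    return [sum(m[i][k] * v[k] for k in range(3)) for i in range(3)]

def world_to_shoulder(p_world, yaw, pitch, fixed_angle, p_shoulder_local):
    """Express an arm-end position in the shoulder frame.

    p_shoulder_local: shoulder coordinates in the local frame of the
    waist pitch joint; yaw/pitch: waist angles at this moment;
    fixed_angle: the fixed shoulder-waist rotation.
    """
    offset = mat_vec(rot_x(fixed_angle), p_shoulder_local)
    offset = mat_vec(rot_y(pitch), offset)
    p_shoulder_world = mat_vec(rot_z(yaw), offset)
    return [pw - ps for pw, ps in zip(p_world, p_shoulder_world)]
```

With all angles zero the conversion is a pure translation by the shoulder offset, which is a quick sanity check on the sign conventions.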
Finally, through inverse kinematics calculation, the seven joint angles of the right arm in joint space can be solved and sent to the drivers to control the mechanical arm to move to the target pose. The inverse kinematics solution can be implemented using existing arm-angle-based methods for redundant mechanical arms.
Fig. 7 is a flow chart of a control method of a humanoid piano-playing robot in the present specification, specifically comprising the following steps:
S701: acquiring a music score image and a key image of the piano keys facing the humanoid piano playing robot through the visual perception unit;
S702: identifying the music score image to obtain music score information, and positioning the piano keys according to the key image to obtain positioning data;
S703: planning, according to the positioning data and the music score information, the fingering adopted by the humanoid piano playing robot when playing the music corresponding to the music score information, and determining the pose change sequence corresponding to the body parts of the humanoid piano playing robot when playing the music according to the fingering;
S704: controlling the left paw and the right paw according to the fingering through the control system, and controlling the body parts other than the left paw and the right paw according to the pose change sequence.
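Steps S701 to S704 can be tied together as a single flow; `vision`, `planner`, and `control` are stand-ins for the visual perception unit, the fingering/pose planner, and the control system, none of whose interfaces the specification defines.

```python
def perform(vision, planner, control):
    """End-to-end flow of steps S701-S704, with each subsystem stubbed."""
    score_img, key_img = vision.capture()              # S701: grab both images
    score = vision.read_score(score_img)               # S702: recognize the score
    keys = vision.locate_keys(key_img)                 #       and locate the keys
    fingering, pose_seq = planner.plan(keys, score)    # S703: fingering + poses
    control.drive_hands(fingering)                     # S704: paws by fingering,
    control.drive_body(pose_seq)                       #       body by pose sequence
```

The point of the sketch is only the data flow: images in, score and key positions out of perception, then a plan, then two control channels (paws and the rest of the body).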
Since the control method of the humanoid piano playing robot provided in the present specification has been substantially explained in the above description of the humanoid piano playing robot itself, a detailed description is omitted here.
From the above, it can be seen that the humanoid piano playing robot provided in the present specification has two degrees of freedom in both the waist and the head, so that anthropomorphic piano playing can be achieved flexibly; moreover, through the visual perception unit, the positions of the keys can be accurately located and the content of the music score can be intelligently identified, so that the piano performance can be carried out automatically according to the music score.
In addition, the voice unit enables the humanoid piano playing robot to intelligently recognize the piano track the user requests, so that piano playing is performed automatically. Through the two degrees of freedom of the waist and the head and the pose control of the two mechanical arms, the piano playing robot in this specification can play the piano flexibly and intelligently.
The present specification also provides a computer-readable storage medium storing a computer program usable for executing the control method of the above-described humanoid piano-playing robot.
The present specification also provides a schematic structural diagram of the electronic device shown in fig. 8. At the hardware level, as illustrated in fig. 8, the electronic device includes a processor, an internal bus, a network interface, a memory, and a non-volatile storage, and may of course also include hardware required by other services. The processor reads the corresponding computer program from the non-volatile storage into the memory and then runs it to implement the control method of the humanoid piano playing robot described above.
Of course, this specification does not exclude other implementations, such as logic devices or combinations of hardware and software; that is, the execution subject of the above processing flows is not limited to logic units, and may also be hardware or logic devices.
In the 1990s, it was easy to distinguish whether an improvement to a technology was an improvement in hardware (for example, an improvement to a circuit structure such as a diode, transistor, or switch) or an improvement in software (an improvement to a method flow). With the development of technology, however, many of today's improvements to method flows can be regarded as direct improvements to hardware circuit structures. Designers almost always obtain a corresponding hardware circuit structure by programming the improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement of a method flow cannot be realized by a hardware entity module. For example, a programmable logic device (Programmable Logic Device, PLD) (e.g., a field programmable gate array (Field Programmable Gate Array, FPGA)) is an integrated circuit whose logic function is determined by the user's programming of the device. A designer programs to "integrate" a digital system onto a PLD, without needing a chip manufacturer to design and fabricate an application-specific integrated circuit chip. Moreover, nowadays, instead of manually manufacturing integrated circuit chips, such programming is mostly implemented with "logic compiler" software, which is similar to the software compiler used in program development; the original code before compiling must also be written in a specific programming language, called a hardware description language (Hardware Description Language, HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language), of which VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are currently the most commonly used.
It will also be apparent to those skilled in the art that a hardware circuit implementing the logic method flow can be readily obtained by merely slightly programming the method flow into an integrated circuit using several of the hardware description languages described above.
The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the (micro)processor, logic gates, switches, an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a programmable logic controller, or an embedded microcontroller; examples include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320. A memory controller may also be implemented as part of the control logic of the memory. Those skilled in the art will also appreciate that, in addition to implementing the controller purely as computer-readable program code, it is entirely possible to logically program the method steps so that the controller implements the same functionality in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller may therefore be regarded as a hardware component, and the means included within it for performing various functions may also be regarded as structures within the hardware component. Indeed, the means for performing various functions may even be regarded both as software modules implementing the method and as structures within the hardware component.
The system, apparatus, module or unit set forth in the above embodiments may be implemented in particular by a computer chip or entity, or by a product having a certain function. One typical implementation is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being functionally divided into various units, respectively. Of course, the functions of each element may be implemented in one or more software and/or hardware elements when implemented in the present specification.
It will be appreciated by those skilled in the art that embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, the present specification may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present description can take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
The present description is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the specification. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transitory media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article or apparatus that comprises the element.
It will be appreciated by those skilled in the art that embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, the present specification may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present description can take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
The description may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The specification may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
In this specification, each embodiment is described in a progressive manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments. In particular, for system embodiments, since they are substantially similar to method embodiments, the description is relatively simple, as relevant to see a section of the description of method embodiments.
The foregoing is merely exemplary of the present disclosure and is not intended to limit the disclosure. Various modifications and alterations to this specification will become apparent to those skilled in the art. Any modifications, equivalent substitutions, improvements, or the like, which are within the spirit and principles of the present description, are intended to be included within the scope of the claims of the present description.

Claims (12)

1. A humanoid piano playing robot, characterized in that the humanoid piano playing robot comprises: the device comprises a left paw, a right paw, a left arm, a right arm, a waist, a neck, a visual perception unit, left legs, right legs, a control system and a base, wherein the neck corresponds to two degrees of freedom so as to drive a head to perform a rotation motion and a pitching motion, and the waist corresponds to two degrees of freedom so as to drive an upper body to perform a rotation motion and a pitching motion;
the humanoid piano playing robot is used for acquiring, through the visual perception unit, a music score image and a key image of the piano keys facing the humanoid piano playing robot, identifying the music score image to obtain music score information, and positioning the piano keys according to the key image to obtain positioning data;
according to the positioning data and the music score information, a fingering adopted by the humanoid piano playing robot when playing the music corresponding to the music score information is planned, and a pose change sequence corresponding to the body part of the humanoid piano playing robot when playing the music according to the fingering is determined; according to the positioning data, determining the arm tail end pose required by the left arm and/or the right arm to play music corresponding to the music score information according to the fingering at each moment, and determining a pose change sequence corresponding to the body part of the humanoid piano playing robot when the humanoid piano playing robot plays the music according to the arm tail end pose;
The control system is used for controlling the left paw and the right paw according to the fingering, and controlling body parts except the left paw and the right paw according to the pose change sequence, wherein for each arm end pose in the pose change sequence, an arm to which the arm end pose belongs is determined, and the converted pose is obtained according to a fixed corner between a shoulder and a waist corresponding to the arm to which the arm end pose belongs, a deflection angle and a pitch angle of the waist at the moment, and coordinates of a shoulder corresponding to the arm to which the arm end pose belongs under a local coordinate system of a pitch joint of the waist; and the control system controls the left arm and/or the right arm to move according to the converted pose.
2. The robot of claim 1, wherein the humanoid piano playing robot is further configured to encode the music score information to obtain encoded information in a preset format, and transmit the encoded information to the control system, and the music score information includes at least a beat, a note, and a modifier.
3. The robot of claim 1, wherein the humanoid piano-playing robot further comprises: a speech unit;
the humanoid piano playing robot is used for receiving voice information sent by a user through the voice unit, judging whether the purpose of the user is to request the humanoid piano playing robot to conduct piano playing according to the voice information, and if yes, controlling the humanoid piano playing robot to conduct piano playing through the control system.
4. The robot of claim 3, wherein the humanoid piano-playing robot is adapted to, if it is judged based on the voice information that the user aims to request the humanoid piano-playing robot to perform a piano-playing, recognize the voice information, determine track information of the humanoid piano-playing robot requested by the user, and control the humanoid piano-playing robot to perform a piano-playing by the control system based on the track information.
5. The robot of claim 3, wherein the humanoid piano-playing robot is adapted to filter noise in the voice information, and to judge whether the user's purpose is to request the humanoid piano-playing robot to perform a piano-playing based on the voice information after filtering noise.
6. The robot of claim 1, wherein the control system is configured to perform inverse kinematics solution for each arm end pose in the pose change sequence, and determine a joint angle required for the left arm or the right arm to reach the arm end pose, respectively; and controlling the left arm or the right arm to move according to the joint angle corresponding to the pose of the tail end of each arm.
7. A control method based on the humanoid piano playing robot of any one of claims 1 to 6, characterized in that the humanoid piano playing robot comprises: a left paw, a right paw, a left arm, a right arm, a waist, a neck, a visual perception unit, a left leg, a right leg, a control system, and a base, wherein the neck corresponds to two degrees of freedom so as to drive the head to perform a rotation motion and a pitching motion, and the waist corresponds to two degrees of freedom so as to drive the upper body to perform a rotation motion and a pitching motion, the method comprising:
acquiring a music score image and a key image of a piano key facing the humanoid piano playing robot through the visual perception unit;
identifying the music score image to obtain music score information, and positioning the piano keys according to the key images to obtain positioning data;
According to the positioning data and the music score information, a fingering adopted by the humanoid piano playing robot when playing the music corresponding to the music score information is planned, and a pose change sequence corresponding to the body part of the humanoid piano playing robot when playing the music according to the fingering is determined;
and controlling the left paw and the right paw according to the fingering through the control system, and controlling the body parts except the left paw and the right paw according to the pose change sequence.
8. The method of claim 7, wherein the method further comprises:
and encoding the music score information to obtain encoding information in a preset format, and transmitting the encoding information to the control system, wherein the music score information at least comprises beats, notes and modifiers.
9. The method of claim 7, wherein the humanoid piano-playing robot further comprises: a speech unit;
the method further comprises the steps of:
receiving voice information sent by a user through the voice unit;
and judging whether the purpose of the user is to request the humanoid piano playing robot to conduct piano playing according to the voice information, and if so, controlling the humanoid piano playing robot to conduct piano playing through the control system.
10. The method of claim 9, wherein controlling, by the control system, the humanoid piano-playing robot to perform piano playing, specifically comprises:
if the purpose of the user is to request the humanoid piano playing robot to perform piano playing according to the voice information, the voice information is identified, and the track information of the humanoid piano playing robot playing requested by the user is determined;
and controlling the humanoid piano playing robot to perform piano playing through the control system according to the track information.
11. A computer readable storage medium, characterized in that the storage medium stores a computer program which, when executed by a processor, implements the method of any of the preceding claims 7-10.
12. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method of any of the preceding claims 7-10 when executing the program.
CN202310676079.9A 2023-06-08 2023-06-08 Robot is played to imitative people piano Active CN116394277B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202310676079.9A CN116394277B (en) 2023-06-08 2023-06-08 Robot is played to imitative people piano
PCT/CN2023/120387 WO2024008217A1 (en) 2023-06-08 2023-09-21 Humanoid piano playing robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310676079.9A CN116394277B (en) 2023-06-08 2023-06-08 Robot is played to imitative people piano

Publications (2)

Publication Number Publication Date
CN116394277A CN116394277A (en) 2023-07-07
CN116394277B true CN116394277B (en) 2023-08-25

Family

ID=87014665

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310676079.9A Active CN116394277B (en) 2023-06-08 2023-06-08 Robot is played to imitative people piano

Country Status (2)

Country Link
CN (1) CN116394277B (en)
WO (1) WO2024008217A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116394277B (en) * 2023-06-08 2023-08-25 之江实验室 Robot is played to imitative people piano
CN116728419B (en) * 2023-08-09 2023-12-22 之江实验室 Continuous playing action planning method, system, equipment and medium for playing robot
CN117207204B (en) * 2023-11-09 2024-01-30 之江实验室 Control method and control device of playing robot

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH081562A (en) * 1994-06-16 1996-01-09 Hiroshi Ota Robot finger-hit playing device for electronic piano having small number of keys
CN106956277A (en) * 2017-04-07 2017-07-18 温州职业技术学院 Mechanical structure of an intelligent entertainment robot
CN108010504A (en) * 2017-12-26 2018-05-08 昆山塔米机器人有限公司 Piano performance system and method controlled by a robot
CN110394784A (en) * 2019-07-18 2019-11-01 天津大学 Underactuated manipulator structure for piano teaching and design method thereof
CN210998752U (en) * 2020-04-07 2020-07-14 北京建筑大学 Piano playing robot
CN111421563A (en) * 2020-05-12 2020-07-17 北京木甲天枢文化科技有限公司 Bamboo flute playing robot
CN111531562A (en) * 2020-05-12 2020-08-14 北京木甲天枢文化科技有限公司 Konghou playing robot
WO2021258117A1 (en) * 2020-06-18 2021-12-23 Loc Vo Gia A robotic arm, a robotic hand, a robotic system for playing the piano
CN114227707A (en) * 2021-12-16 2022-03-25 东北林业大学 Intelligent embedded equipment for music score identification and automatic piano playing
CN114347070A (en) * 2022-03-18 2022-04-15 Zhejiang Lab Method, system and device for controlling piano playing action based on humanoid arm claw robot
CN114952868A (en) * 2022-07-26 2022-08-30 Zhejiang Lab Control method and device for a 7-degree-of-freedom SRS-type mechanical arm, and piano playing robot
CN115431251A (en) * 2022-09-16 2022-12-06 哈尔滨工业大学 Humanoid robot upper limb
CN115781733A (en) * 2022-12-01 2023-03-14 之江实验室 Manipulator and robot

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6982375B2 (en) * 2003-06-06 2006-01-03 Mcgregor Rob Musical teaching device and method
KR101307595B1 (en) * 2011-12-28 2013-09-12 한국기술교육대학교 산학협력단 Method for controlling piano robot and apparatus for thereof
US20210387346A1 (en) * 2016-10-22 2021-12-16 Carla R. Gillett Humanoid robot for performing maneuvers like humans
CN108053815A (en) * 2017-12-12 2018-05-18 广州德科投资咨询有限公司 Performance control method for a robot, and robot
WO2019116521A1 (en) * 2017-12-14 2019-06-20 株式会社ソニー・インタラクティブエンタテインメント Entertainment system, robot device, and server device
CN217943361U (en) * 2022-07-18 2022-12-02 华强方特(深圳)科技有限公司 Robot for simulating playing of plucked musical instrument
CN115870980A (en) * 2022-12-09 2023-03-31 北部湾大学 Vision-based piano playing robot control method and device
CN116394277B (en) * 2023-06-08 2023-08-25 Zhejiang Lab Humanoid piano playing robot

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Humanoid robot head design and target tracking motion control; Wan Minhong; Zhou Weijia; Liu Yuwang; Transactions of the Chinese Society for Agricultural Machinery (Issue 10); pp. 406-412 *

Also Published As

Publication number Publication date
CN116394277A (en) 2023-07-07
WO2024008217A1 (en) 2024-01-11

Similar Documents

Publication Publication Date Title
CN116394277B (en) Humanoid piano playing robot
US5461711A (en) Method and system for spatial accessing of time-based information
JP6505748B2 (en) Method for performing multi-mode conversation between humanoid robot and user, computer program implementing said method and humanoid robot
US7216082B2 (en) Action teaching apparatus and action teaching method for robot system, and storage medium
US9431027B2 (en) Synchronized gesture and speech production for humanoid robots using random numbers
Wheatland et al. State of the art in hand and finger modeling and animation
JP5616325B2 (en) How to change the display based on user instructions
JP7173031B2 (en) Information processing device, information processing method, and program
Tanaka et al. Multimodal interaction in music using the electromyogram and relative position sensing
CN106569613A (en) Multi-modal man-machine interaction system and control method thereof
CN106896796A (en) Industrial robot master-slave mode teaching programmed method based on data glove
US20200269421A1 (en) Information processing device, information processing method, and program
JP5252393B2 (en) Motion learning device
WO2019216016A1 (en) Information processing device, information processing method, and program
Chang et al. A kinect-based gesture command control method for human action imitations of humanoid robots
JP2002337079A (en) Device/method for processing information, recording medium and program
CN114347070B (en) Method, system and device for controlling piano playing action based on humanoid arm claw robot
Aiswarya et al. Hidden Markov model-based Sign Language to speech conversion system in TAMIL
JP4600736B2 (en) Robot control apparatus and method, recording medium, and program
JP4677543B2 (en) Facial expression voice generator
JP2005059185A (en) Robot device and method of controlling the same
JP2005231012A (en) Robot device and its control method
Malkin et al. Energy and loudness for speed control in the Vocal Joystick
JP7156300B2 (en) Information processing device, information processing method, and program
Wechsler Applications of Motion Tracking for Persons with Disabilities

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant