CN105252532A - Method of cooperative flexible attitude control for motion capture robot - Google Patents

Method of cooperative flexible attitude control for motion capture robot

Info

Publication number
CN105252532A
CN105252532A
Authority
CN
China
Prior art keywords
angle
joint
robot
plane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510824988.8A
Other languages
Chinese (zh)
Other versions
CN105252532B (en)
Inventor
邢建平
孟宪昊
王康
孟宪鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Muke Space Information Technology Co Ltd
Original Assignee
Shandong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong University filed Critical Shandong University
Priority to CN201510824988.8A priority Critical patent/CN105252532B/en
Publication of CN105252532A publication Critical patent/CN105252532A/en
Application granted granted Critical
Publication of CN105252532B publication Critical patent/CN105252532B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a method of cooperative flexible attitude control for a motion capture robot. The method comprises the following steps: capturing the joint displacement data of a target human body at different moments in real time with the motion-sensing device Kinect; sending the target joint displacement data to a PC; drawing a human skeleton frame on the PC in real time; dynamically displaying the human skeleton frame on the screen and providing error feedback; obtaining joint angle data; and controlling the steering engines of the humanoid robot NAO so that the robot tracks the motions of the target human body in real time. Because the joint displacement data of the target human body are captured with the Kinect, various gestures of the human body can be recognized, and real-time imitation of human motions by the robot is realized. The method recognizes human motion quickly and accurately, produces smooth tracking motions that reflect the attitude of the target human body in real time, and shows good robustness under various complex environmental conditions.

Description

Method of cooperative flexible attitude control for a motion capture robot
Technical field
The present invention relates to a method of cooperative flexible attitude control for a motion capture robot, and belongs to the fields of human-computer interaction and artificial intelligence.
Background technology
Human-computer interaction (HCI, Human-Computer Interaction) is the discipline concerned with the design, evaluation and implementation of interactive computer systems for human use, and with the study of the phenomena surrounding them. HCI technology mainly studies the information exchange between humans and computers, in two directions. From human to computer, a person can transmit information through devices such as a keyboard, mouse, joystick, position tracker or data glove, or through the actions of the hands and feet, the voice, posture and body, the gaze and even brain waves. From computer to human, the computer presents information through output devices such as printers, plotters, monitors, head-mounted displays and loudspeakers.
Kinect is the XBOX360 motion-sensing peripheral officially announced by Microsoft at the E3 expo on June 2, 2009. Kinect thoroughly changed the single mode of operating games and carried the concept of human-computer interaction further. It is a 3D motion-sensing camera that integrates functions such as real-time motion capture, image recognition, microphone input, speech recognition and community interaction. Somatosensation is the collective name for the senses of touch, pressure, temperature and pain and for proprioception (the sense of muscle and joint position and motion, body posture and movement, and facial expression). Using Light Coding and light-source calibration technology, Kinect emits laser light from an infrared transmitter, records each speckle in the space with an infrared CMOS camera, computes an image with 3D depth on a chip by comparison with the original speckle pattern, and passes the result to a skeleton tracking system, which allows Kinect to be applied in many fields.
NAO is a programmable humanoid robot, 58 centimetres tall, developed by Aldebaran Robotics. Its body has 25 degrees of freedom, whose main elements are motors and electric actuators. Its motion module is based on generalized inverse kinematics and handles Cartesian coordinates, joint control, balance, redundancy and task priority. The head of NAO embeds an Intel Atom 1.6 GHz processor running a Linux kernel and the NAOqi operating system, an open-source software platform programmable in C++ or Python. Owing to these features, the robot NAO can realize motion tracking very realistically, and it is an important component of the present invention.
Summary of the invention
In view of the deficiencies of the prior art, the present invention relates to a method of cooperative flexible attitude control for a motion capture robot. The object of the invention is to propose a new mode of human-computer interaction.
The technical scheme of the present invention is as follows:
A method of cooperative flexible attitude control for a motion capture robot comprises the following steps:
(1) The motion-sensing device Kinect is used to capture, in real time, the joint displacement data of a target human body at different moments. The target human body stands within the field of view of the Kinect; to ensure the best recognition effect, it should be 1.2-3.5 m in front of the Kinect lens and within ±57° of the horizontal viewing angle.
(2) The target joint displacement data captured by the Kinect are sent to a PC.
(3) The PC receives all the target joint displacement data and draws a human skeleton frame from them in real time.
(4) The PC dynamically displays the human skeleton frame on the screen and provides error feedback.
(5) The PC processes all the received target joint displacement data, including relative threshold distance comparison filtering and space vector calculation, to obtain the attitude control data of the humanoid robot NAO, i.e. the joint angle data. Owing to environmental factors and the jitter of the sensor itself, the raw data contain interference; it is therefore necessary to apply relative threshold distance comparison filtering to the raw data so that the robot tracks motion more accurately and reliably.
(6) The JointControl function in the NAOqi operating system of the humanoid robot NAO is called; according to the joint angle data sent, the steering engines of NAO are controlled so that the humanoid robot NAO tracks the actions of the target human body in real time.
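As an illustration of step (6), the following is a minimal Python sketch of sending computed joint angles to NAO. It assumes the NAOqi Python SDK is installed and the robot is reachable at the placeholder address ROBOT_IP; the public NAOqi API exposes joint control through the ALMotion module rather than through a function literally named JointControl, so ALMotion.setAngles is used here as the closest equivalent:

    # Minimal sketch: send computed joint angles to NAO via NAOqi.
    # Assumptions: NAOqi Python SDK installed; ROBOT_IP is a placeholder.
    from naoqi import ALProxy

    ROBOT_IP = "192.168.1.10"           # hypothetical robot address
    motion = ALProxy("ALMotion", ROBOT_IP, 9559)
    motion.setStiffnesses("Body", 1.0)  # enable the steering engines

    # Joint names as defined by NAO; angles in radians, e.g. from the
    # space vector calculations of step (5).
    names = ["LShoulderPitch", "LShoulderRoll", "LElbowYaw", "LElbowRoll"]
    angles = [0.5, 0.3, -1.0, -0.8]
    motion.setAngles(names, angles, 0.2)  # 0.2 = fraction of maximum speed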
Preferably according to the present invention, step (4) further comprises: the PC dynamically displays the human skeleton frame; if the skeleton frame displayed on the PC is consistent with the actions of the target human body, the joint angle data are sent to the humanoid robot NAO; if the skeleton frame displayed on the PC does not match the actions of the target human body, the program is reinitialized to ensure that the humanoid robot NAO receives reliable attitude control data.
Preferably according to the present invention, in step (5), the relative threshold distance comparison filtering comprises:
Observing the change of the space coordinates of each displaced joint point, the fluctuation vector formed by the positions of the same joint at the start and end of a fixed time period (for example 0.1 s-0.5 s) is calculated; the modulus of this fluctuation vector and its fluctuation along each coordinate axis are observed, and the joint fluctuation values are screened against a set fluctuation threshold. It can be seen that the jitter of a recognized joint position is mainly fast jitter along a coordinate axis, and that when jitter occurs the modulus of the fluctuation vector increases significantly. Therefore, joint points with large fluctuation are handled by keeping their last state unchanged, while joint points with small fluctuation are compared against different thresholds for different joints, ensuring that each filtered result is an optimal solution and that the pose of the robot changes continuously.
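A minimal Python sketch of this filter follows; the per-joint thresholds are tuning parameters, and the values below are illustrative, not taken from the patent:

    import math

    # Illustrative per-joint fluctuation thresholds in metres (assumed values).
    THRESHOLDS = {"LeftElbow": 0.05, "LeftShoulder": 0.04, "Head": 0.03}

    def filter_joint(name, prev_pos, new_pos, last_accepted):
        """Relative threshold distance comparison filtering for one joint.

        prev_pos / new_pos: (x, y, z) positions at the start and end of the
        fixed time window; last_accepted: last position passed to the robot.
        If the modulus of the fluctuation vector exceeds the joint's
        threshold, the sample is treated as jitter and the last state kept.
        """
        fluctuation = [n - p for n, p in zip(new_pos, prev_pos)]
        modulus = math.sqrt(sum(c * c for c in fluctuation))
        if modulus > THRESHOLDS[name]:
            return last_accepted      # jitter: hold the previous state
        return new_pos                # small fluctuation: accept the sample

    # Example: a 9 cm jump of the left elbow within one window is rejected.
    print(filter_joint("LeftElbow", (0.1, 0.2, 1.5), (0.19, 0.2, 1.5), (0.1, 0.2, 1.5)))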
Preferably according to the present invention, in step (5), the space vector calculation comprises:
The space coordinate system used by Kinect differs from a common space coordinate system: its x-axis and y-axis have the same zero point as a traditional coordinate system, but the zero point of its z-axis is the Kinect sensor itself, with the positive direction straight ahead of the Kinect. The Kinect space coordinate system is shown in Figure 6.
In the Kinect space coordinate system, take any two non-coincident coordinate points P1(x1, y1, z1) and P2(x2, y2, z2); the vector they form is
$$\overrightarrow{P_1P_2} = (x_2 - x_1,\; y_2 - y_1,\; z_2 - z_1) \quad (3\text{-}1)$$
If there is another point P3 not on the straight line through P1 and P2, the following relation holds:
$$\angle P_2P_1P_3 = \arccos\frac{\overrightarrow{P_1P_2}\cdot\overrightarrow{P_1P_3}}{|\overrightarrow{P_1P_2}|\,|\overrightarrow{P_1P_3}|} \quad (3\text{-}2)$$
According to the above properties, the calculation of human joint angles is reduced to the calculation of angles between space vectors. The calculation methods of the upper limb joint angles are described below.
1) Joint angle LeftElbowRoll
As shown in Figure 7, calculating the LeftElbowRoll angle only requires constructing the pair of vectors on the two sides of the space angle at this joint; from the joint angle formula above:
$$\mathrm{LeftElbowRoll} = \angle ABC = \arccos\frac{\overrightarrow{BA}\cdot\overrightarrow{BC}}{|\overrightarrow{BA}|\,|\overrightarrow{BC}|} \quad (3\text{-}3)$$
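Equation (3-3) translates directly into code. The following Python sketch computes the elbow angle from three Kinect joint positions; the sample coordinates are illustrative:

    import math

    def vec(p, q):
        """Vector from point p to point q."""
        return [qi - pi for pi, qi in zip(p, q)]

    def angle(u, v):
        """Angle between two space vectors, per the arccos dot-product formula."""
        dot = sum(a * b for a, b in zip(u, v))
        nu = math.sqrt(sum(a * a for a in u))
        nv = math.sqrt(sum(b * b for b in v))
        return math.acos(dot / (nu * nv))

    # A = shoulder, B = elbow, C = wrist (illustrative Kinect coordinates, metres).
    A, B, C = (0.0, 0.4, 2.0), (0.0, 0.1, 2.0), (0.2, 0.1, 1.8)
    left_elbow_roll = angle(vec(B, A), vec(B, C))   # equation (3-3)
    print(math.degrees(left_elbow_roll))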
2) Joint angle LeftElbowYaw
As shown in Figure 8:
The LeftElbowYaw angle is the angle produced at the elbow when the upper arm rotates. Under normal circumstances this is the angle between the two intersecting planes ABC and BCD, i.e. the angle between planes S1 and S2 in the figure; the calculation of LeftElbowYaw follows from the formula for the angle between space planes.
First, the normal vector of the plane containing two non-collinear vectors is
$$\vec m = \vec a \times \vec b = \begin{vmatrix}\vec i & \vec j & \vec k \\ a_x & a_y & a_z \\ b_x & b_y & b_z\end{vmatrix} \quad (3\text{-}4)$$
so the normal vectors of planes S1 and S2 are expressed as
$$\overrightarrow{M_1} = \overrightarrow{BA}\times\overrightarrow{BC} = (BA_yBC_z - BA_zBC_y)\,\vec i + (BA_zBC_x - BA_xBC_z)\,\vec j + (BA_xBC_y - BA_yBC_x)\,\vec k \quad (3\text{-}5)$$
$$\overrightarrow{M_2} = \overrightarrow{CB}\times\overrightarrow{CD} = (CB_yCD_z - CB_zCD_y)\,\vec i + (CB_zCD_x - CB_xCD_z)\,\vec j + (CB_xCD_y - CB_yCD_x)\,\vec k \quad (3\text{-}6)$$
The LeftElbowYaw joint angle then equals the angle between the two normal vectors M1 and M2:
$$\mathrm{LeftElbowYaw} = \arccos\frac{\overrightarrow{M_1}\cdot\overrightarrow{M_2}}{|\overrightarrow{M_1}|\,|\overrightarrow{M_2}|} \quad (3\text{-}7)$$
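The plane-angle method of equations (3-4) to (3-7) can be sketched with numpy's cross product; the points below are illustrative:

    import numpy as np

    def plane_angle(p1, p2, p3, q1, q2, q3):
        """Angle between plane (p1,p2,p3) and plane (q1,q2,q3) via their normals."""
        m1 = np.cross(np.subtract(p2, p1), np.subtract(p3, p1))  # normal of first plane
        m2 = np.cross(np.subtract(q2, q1), np.subtract(q3, q1))  # normal of second plane
        cos_t = np.dot(m1, m2) / (np.linalg.norm(m1) * np.linalg.norm(m2))
        return np.arccos(np.clip(cos_t, -1.0, 1.0))

    # A = shoulder, B = elbow, C = wrist, D = hand (illustrative coordinates).
    A, B, C, D = (0, 0.4, 2), (0, 0.1, 2), (0.2, 0.1, 1.8), (0.3, 0.2, 1.6)
    left_elbow_yaw = plane_angle(B, A, C, C, B, D)   # planes ABC and BCD, eq. (3-7)
    print(np.degrees(left_elbow_yaw))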
3) Joint angle LeftShoulderRoll
As shown in Figure 9:
$$\mathrm{LeftShoulderRoll} = \arccos\frac{\overrightarrow{BA}\cdot\overrightarrow{BC}}{|\overrightarrow{BA}|\,|\overrightarrow{BC}|} \quad (3\text{-}8)$$
4) Joint angle LeftShoulderPitch
As shown in Figure 10:
The LeftShoulderPitch joint angle is the angle between the plane formed by the upper arm and the shoulder axis and the plane formed by the shoulder axis and the spine point. Under normal circumstances this is the angle between the two intersecting planes ABC and BCD; applying the space plane angle formula by analogy with the calculation of the LeftElbowYaw joint angle gives:
The normal vector of plane ABC:
$$\overrightarrow{M_1} = \overrightarrow{BA}\times\overrightarrow{BC} \quad (3\text{-}9)$$
The normal vector of plane BCD:
$$\overrightarrow{M_2} = \overrightarrow{CB}\times\overrightarrow{CD} \quad (3\text{-}10)$$
The joint angle LeftShoulderPitch:
$$\mathrm{LeftShoulderPitch} = \arccos\frac{\overrightarrow{M_1}\cdot\overrightarrow{M_2}}{|\overrightarrow{M_1}|\,|\overrightarrow{M_2}|} \quad (3\text{-}11)$$
5) Hand opening and closing angle LeftHand
Because the Kinect reading of hand information is not accurate down to the state of the fingers, the hand opening and closing angle is estimated from the distance between the LeftHand and LeftWrist joints, calculated as follows:
As shown in Figure 11:
$$\mathrm{LeftHand} = |\overrightarrow{AB}| = \sqrt{(A_x - B_x)^2 + (A_y - B_y)^2 + (A_z - B_z)^2} \quad (3\text{-}12)$$
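A small Python sketch of equation (3-12); how the distance maps to NAO's LHand opening value is not specified in the source, so the normalization below is an assumption:

    import math

    def hand_openness(hand, wrist, closed_d=0.06, open_d=0.12):
        """Estimate hand opening from the Hand-Wrist distance, eq. (3-12).

        closed_d / open_d are assumed calibration distances in metres,
        mapping to NAO's LHand range 0.0 (closed) .. 1.0 (open).
        """
        d = math.sqrt(sum((a - b) ** 2 for a, b in zip(hand, wrist)))
        return min(1.0, max(0.0, (d - closed_d) / (open_d - closed_d)))

    print(hand_openness((0.30, 0.10, 1.70), (0.22, 0.10, 1.75)))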
At this point all the joint angles of the left arm have been calculated; the right arm joint angles are calculated in the same way.
6) Joint angle HeadPitch
Of the two head joints, the angle related to lowering and raising the head is named HeadPitch.
As shown in Figure 12:
The HeadPitch joint angle is taken as the angle between the vector pointing from the shoulder center to the head and the normal vector of the plane formed by the two shoulders and the spine. With $\overrightarrow{v}$ the ShoulderCenter-to-Head vector and $\overrightarrow{M}$ that normal vector, the specific calculation formula is
$$\mathrm{HeadPitch} = \arccos\frac{\overrightarrow{v}\cdot\overrightarrow{M}}{|\overrightarrow{v}|\,|\overrightarrow{M}|} \quad (3\text{-}13)$$
7) Joint angle HeadYaw
As shown in Figure 13:
The other important head joint angle, HeadYaw, is responsible for the left-right rotation of the head. Since the Kinect skeleton recognition result provides no information about the orientation of the face, the HeadYaw angle cannot be calculated directly; however, observation of the skeleton drawing animation shows that the spatial position of the head lies clearly in front of the shoulders (i.e., facing the Kinect, there is a certain distance along the z-axis between the Head point and the ShoulderCenter point). The calculation of this joint angle is therefore converted into the calculation of an angle between planes:
The normal vector of plane ABC:
$$\overrightarrow{M_1} = \overrightarrow{BA}\times\overrightarrow{BC} \quad (3\text{-}14)$$
The normal vector of plane BCD:
$$\overrightarrow{M_2} = \overrightarrow{CB}\times\overrightarrow{CD} \quad (3\text{-}15)$$
The joint angle:
$$\mathrm{HeadYaw} = \arccos\frac{\overrightarrow{M_1}\cdot\overrightarrow{M_2}}{|\overrightarrow{M_1}|\,|\overrightarrow{M_2}|} \quad (3\text{-}16)$$
During the motion of the robot, the activity of the lower limb joints directly affects the stability of the robot as a whole. To simplify control, a relative position method is adopted for the lower limbs: the position of the lower limb ends in a relative coordinate system is calculated in combination with the height of the human body's center of mass, and the motion of the robot is controlled accordingly.
As shown in Figure 14:
The Hipcenter point of the skeleton recognition result is projected vertically onto the ground and taken as the origin of a new coordinate system, and the coordinates of the left and right ankle points, Leftankle and Rightankle, in the new coordinate system are taken as the control data of the robot.
With A the projected Hipcenter point and B, C the left and right ankle points, the coordinates of B and C in the O coordinate system give
$$\mathrm{LeftFoot} = (A_z - B_z,\; A_x - B_x) \quad (3\text{-}17)$$
$$\mathrm{RightFoot} = (A_z - C_z,\; A_x - C_x) \quad (3\text{-}18)$$
To eliminate the absolute distance error caused by the height differences of different people, the coordinate values are divided by the hip width of the human body, which is calculated as
$$\mathrm{Hipwidth} = |\overrightarrow{BC}| = \sqrt{(B_x - C_x)^2 + (B_y - C_y)^2 + (B_z - C_z)^2} \quad (3\text{-}19)$$
The position of the lower limb ends in the new coordinate system is therefore
$$\mathrm{LeftFoot} = \left(\frac{A_z - B_z}{\mathrm{Hipwidth}},\; \frac{A_x - B_x}{\mathrm{Hipwidth}}\right) \quad (3\text{-}20)$$
$$\mathrm{RightFoot} = \left(\frac{A_z - C_z}{\mathrm{Hipwidth}},\; \frac{A_x - C_x}{\mathrm{Hipwidth}}\right) \quad (3\text{-}21)$$
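A short Python sketch of equations (3-17) to (3-21); A is taken as the projected Hipcenter and B, C as the ankle points, with illustrative coordinates:

    import math

    def foot_positions(hip_center, left_ankle, right_ankle):
        """Normalized lower-limb end positions, equations (3-17)-(3-21).

        Coordinates are (x, y, z) in the Kinect frame; the result is expressed
        in the ground coordinate system whose origin is the vertical projection
        of Hipcenter, divided by the hip width to remove body-height effects.
        """
        A, B, C = hip_center, left_ankle, right_ankle
        hip_width = math.sqrt(sum((b - c) ** 2 for b, c in zip(B, C)))  # eq. (3-19)
        left = ((A[2] - B[2]) / hip_width, (A[0] - B[0]) / hip_width)   # eq. (3-20)
        right = ((A[2] - C[2]) / hip_width, (A[0] - C[0]) / hip_width)  # eq. (3-21)
        return left, right

    print(foot_positions((0.0, 0.9, 2.0), (-0.12, 0.05, 2.05), (0.12, 0.05, 1.95)))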
The features of the present invention are:
1. The present invention adopts advanced motion-sensing equipment and state-of-the-art robot technology and designs a new interactive system.
2. The present invention controls a humanoid robot to imitate human actions in real time, with very high reliability and strong flexibility, and shows good robustness under various complex environmental conditions.
3. The present invention adopts a relative threshold distance filtering method, which reduces the influence of environmental factors and of the jitter of the sensor itself, and effectively improves the reliability of the data and the stability of system operation.
4. The present invention applies a skeleton tracking algorithm to human body attitude recognition and, exploiting the fact that NAO is a programmable humanoid robot with 25 degrees of freedom, controls the humanoid robot to imitate human actions in real time, with very high reliability and good robustness under various complex environmental conditions. The invention is based on Kinect skeleton frame tracking and the robot NAO hardware platform. On the Kinect side, program initialization is carried out first; this process comprises checking the signal connection and driver, instantiating the sensor object, acquiring depth authority and registering event listeners. Skeleton frame recognition then starts; this process comprises acquiring the depth image, recognizing the human joint points from the depth image with the skeleton recognition algorithm library and extracting their space coordinates, which, after filtering and space vector calculation, provide the attitude control data for the robot NAO, as sketched below.
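The overall pipeline can be summarized in the following schematic Python loop. The Kinect-side helpers (initialize_kinect, next_joint_positions) are hypothetical stand-ins for the sensor SDK, stubbed here so the sketch runs; compute_joint_angles and send_to_nao stand for the filtering, space vector and NAOqi steps sketched above:

    # Schematic main loop of the method (hypothetical helpers stand in for
    # the Kinect SDK and for the routines sketched in the sections above).

    def initialize_kinect():
        """Stub: check connection/driver, instantiate sensor, get depth authority."""
        return object()  # placeholder sensor handle

    def next_joint_positions(sensor):
        """Stub: depth image -> skeleton recognition -> joint name -> (x, y, z)."""
        return {"ShoulderCenter": (0.0, 0.4, 2.1), "Head": (0.0, 0.6, 2.0)}

    def compute_joint_angles(joints):
        """Stub for the space vector calculations (equations 3-1 .. 3-21)."""
        return {"HeadPitch": 0.1}

    def send_to_nao(angles):
        """Stub for step (6): forward angles to NAO's steering engines."""
        print("sending:", angles)

    sensor = initialize_kinect()
    for _ in range(3):                         # in practice: loop while tracking
        joints = next_joint_positions(sensor)  # steps (1)-(3)
        angles = compute_joint_angles(joints)  # step (5), after filtering
        send_to_nao(angles)                    # step (6)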
Brief description of the drawings
Figure 1 is the flow chart of the method of the invention;
Figure 2 is the flow chart for reinitializing the program when the skeleton frame displayed on the PC does not match the actions of the target human body;
Figure 3 is the flow chart of human skeleton frame recognition;
Figure 4 is a schematic diagram of the dynamic display of the human skeleton frame on the screen;
Figure 5 is the flow chart of the relative threshold distance comparison filtering algorithm of the invention;
Figure 6 shows the Kinect space coordinate system;
Figure 7 is the calculation diagram of the joint angle LeftElbowRoll;
Figure 8 is the calculation diagram of the joint angle LeftElbowYaw;
Figure 9 is the calculation diagram of the joint angle LeftShoulderRoll;
Figure 10 is the calculation diagram of the joint angle LeftShoulderPitch;
Figure 11 is the schematic diagram of the hand opening and closing angle LeftHand;
Figure 12 is the calculation diagram of the joint angle HeadPitch;
Figure 13 is the calculation diagram of the joint angle HeadYaw;
Figure 14 is the calculation diagram for projecting the Hipcenter point of the skeleton recognition result vertically onto the ground as the origin of a new coordinate system and taking the coordinates of the left and right ankle points Rightankle and Leftankle in the new coordinate system as the control data of the robot.
Detailed description of the embodiments
The present invention is described in detail below with reference to the embodiment and the accompanying drawings, but is not limited thereto.
See Figures 1-5.
Embodiment
A method of cooperative flexible attitude control for a motion capture robot comprises the following steps:
(1) The motion-sensing device Kinect is used to capture, in real time, the joint displacement data of a target human body at different moments. The target human body stands within the field of view of the Kinect; to ensure the best recognition effect, it should be 1.2-3.5 m in front of the Kinect lens and within ±57° of the horizontal viewing angle. The Kinect has a color lens and a depth-sensing lens, a horizontal field of view of 57 degrees, a vertical field of view of 43 degrees and a depth sensor range of 1.2 m-3.5 m. The service program on the PC is run; see Figure 2: the Kinect is first initialized, a process that comprises checking the signal connection and driver, instantiating the sensor object, acquiring depth authority, registering events, and so on.
(2) The target joint displacement data captured by the Kinect are sent to the PC. The target human body stands in front of the depth-sensing lens of the Kinect and can change the attitude of the head, arms, fingers and so on; all target joint displacement data are captured by the Kinect.
(3) The PC receives all the target joint displacement data and draws a human skeleton frame from them in real time; the flow of drawing the skeleton frame is shown in Figure 3.
(4) The PC dynamically displays the human skeleton frame on the screen and provides error feedback. Figure 4 gives four representative running states of the service program. The part labeled 1 is the skeleton frame constructed from the data first captured by the Kinect after program initialization, indicating that the program has started normally and begins to capture the attitude changes of the target human body in real time. The part labeled 2 is the skeleton frame constructed from the data captured by the Kinect at some moment after the program has started, reflecting the attitude of the target human body at that moment. The part labeled 3 represents a distorted skeleton frame caused by a program error when the target human body is too close to the Kinect lens. The part labeled 4 arises when the target human body leaves the field of view, so that the Kinect cannot capture it. The parts labeled 3 and 4 are the error feedback given to the user; at such times the program must be reinitialized to ensure that the robot receives reliable attitude control data.
(5) The PC processes all the received target joint displacement data, including relative threshold distance comparison filtering and space vector calculation, to obtain the attitude control data of the humanoid robot NAO, i.e. the joint angle data. Owing to environmental factors and the jitter of the sensor itself, the raw data contain interference; it is therefore necessary to apply relative threshold distance comparison filtering to the raw data so that the robot tracks motion more accurately and reliably.
(6) The JointControl function in the NAOqi operating system of the humanoid robot NAO is called; according to the joint angle data sent, the steering engines of NAO are controlled so that the humanoid robot NAO tracks the actions of the target human body in real time.
Preferably, step (4) further comprises: if the skeleton frame displayed on the PC is consistent with the actions of the target human body, the joint angle data are sent to the humanoid robot NAO; if the skeleton frame displayed on the PC does not match the actions of the target human body, the program is reinitialized to ensure that the humanoid robot NAO receives reliable attitude control data.
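A sketch of this error feedback decision follows; the 1.2-3.5 m depth range and ±57° horizontal viewing angle come from the embodiment above, while the joint set checked is an illustrative choice:

    import math

    def should_reinitialize(joints):
        """Decide whether the PC must reinitialize the program (step 4 feedback).

        joints: name -> (x, y, z) in Kinect coordinates.  The skeleton is deemed
        unreliable if any tracked point leaves the sensor's working range:
        1.2-3.5 m depth, +/-57 degrees horizontal viewing angle.
        """
        for x, _, z in joints.values():
            if not (1.2 <= z <= 3.5):
                return True                  # too close or too far (states 3, 4)
            if abs(math.degrees(math.atan2(x, z))) > 57.0:
                return True                  # outside the horizontal field of view
        return False

    # Example: a head point 0.8 m from the lens triggers reinitialization.
    print(should_reinitialize({"Head": (0.0, 0.6, 0.8)}))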
The relative threshold distance comparison filtering and the space vector calculations of step (5) are carried out exactly as described in the summary above, using equations (3-1) to (3-21).

Claims (4)

1. A method of cooperative flexible attitude control for a motion capture robot, characterized in that the method comprises the following steps:
(1) using the motion-sensing device Kinect to capture, in real time, the joint displacement data of a target human body at different moments;
(2) sending the target joint displacement data captured by the Kinect to a PC;
(3) the PC receiving all the target joint displacement data and drawing a human skeleton frame from them in real time;
(4) the PC dynamically displaying the human skeleton frame on the screen and providing error feedback;
(5) the PC processing all the received target joint displacement data, including relative threshold distance comparison filtering and space vector calculation, to obtain the attitude control data of the humanoid robot NAO, i.e. the joint angle data;
(6) calling the JointControl function in the NAOqi operating system of the humanoid robot NAO and, according to the joint angle data sent, controlling the steering engines of the humanoid robot NAO so that it tracks the actions of the target human body in real time.
2. The method of cooperative flexible attitude control for a motion capture robot according to claim 1, characterized in that step (4) further comprises: the PC dynamically displaying the human skeleton frame; if the skeleton frame displayed on the PC is consistent with the actions of the target human body, sending the joint angle data to the humanoid robot NAO; if the skeleton frame displayed on the PC does not match the actions of the target human body, reinitializing the program.
3. The method of cooperative flexible attitude control for a motion capture robot according to claim 1, characterized in that, in step (5), the relative threshold distance comparison filtering comprises: observing the change of the space coordinates of each displaced joint point, calculating the fluctuation vector formed by the positions of the same joint at the start and end of a fixed time period, observing the modulus of this fluctuation vector and its fluctuation along each coordinate axis, and screening the joint fluctuation values against a set fluctuation threshold.
4. The method of cooperative flexible attitude control for a motion capture robot according to claim 1, characterized in that, in step (5), the space vector calculation comprises:
in the Kinect space coordinate system, taking any two non-coincident coordinate points P1(x1, y1, z1) and P2(x2, y2, z2), the vector they form being
$$\overrightarrow{P_1P_2} = (x_2 - x_1,\; y_2 - y_1,\; z_2 - z_1) \quad (3\text{-}1)$$
and, if there is another point P3 not on the straight line through P1 and P2, the following relation holding:
$$\angle P_2P_1P_3 = \arccos\frac{\overrightarrow{P_1P_2}\cdot\overrightarrow{P_1P_3}}{|\overrightarrow{P_1P_2}|\,|\overrightarrow{P_1P_3}|} \quad (3\text{-}2)$$
according to the above properties, the calculation of human joint angles is reduced to the calculation of angles between space vectors, the calculation methods of the upper limb joint angles being as follows:
1) Joint angle LeftElbowRoll
calculating the LeftElbowRoll angle only requires constructing the pair of vectors on the two sides of the space angle at this joint; from the joint angle formula above:
$$\mathrm{LeftElbowRoll} = \angle ABC = \arccos\frac{\overrightarrow{BA}\cdot\overrightarrow{BC}}{|\overrightarrow{BA}|\,|\overrightarrow{BC}|} \quad (3\text{-}3)$$
2) Joint angle LeftElbowYaw
the LeftElbowYaw angle is the angle produced at the elbow when the upper arm rotates; under normal circumstances it is the angle between the two intersecting planes ABC and BCD, and the calculation of LeftElbowYaw follows from the formula for the angle between space planes:
first, the normal vector of the plane containing two non-collinear vectors is
$$\vec m = \vec a \times \vec b = \begin{vmatrix}\vec i & \vec j & \vec k \\ a_x & a_y & a_z \\ b_x & b_y & b_z\end{vmatrix} \quad (3\text{-}4)$$
so the normal vectors of planes S1 and S2 are expressed as
$$\overrightarrow{M_1} = \overrightarrow{BA}\times\overrightarrow{BC} = (BA_yBC_z - BA_zBC_y)\,\vec i + (BA_zBC_x - BA_xBC_z)\,\vec j + (BA_xBC_y - BA_yBC_x)\,\vec k \quad (3\text{-}5)$$
$$\overrightarrow{M_2} = \overrightarrow{CB}\times\overrightarrow{CD} = (CB_yCD_z - CB_zCD_y)\,\vec i + (CB_zCD_x - CB_xCD_z)\,\vec j + (CB_xCD_y - CB_yCD_x)\,\vec k \quad (3\text{-}6)$$
and the LeftElbowYaw joint angle equals the angle between the two normal vectors M1 and M2:
$$\mathrm{LeftElbowYaw} = \arccos\frac{\overrightarrow{M_1}\cdot\overrightarrow{M_2}}{|\overrightarrow{M_1}|\,|\overrightarrow{M_2}|} \quad (3\text{-}7)$$
3) Joint angle LeftShoulderRoll
$$\mathrm{LeftShoulderRoll} = \arccos\frac{\overrightarrow{BA}\cdot\overrightarrow{BC}}{|\overrightarrow{BA}|\,|\overrightarrow{BC}|} \quad (3\text{-}8)$$
4) Joint angle LeftShoulderPitch
the LeftShoulderPitch joint angle is the angle between the plane formed by the upper arm and the shoulder axis and the plane formed by the shoulder axis and the spine point; under normal circumstances this is the angle between the two intersecting planes ABC and BCD, and, applying the space plane angle formula by analogy with the LeftElbowYaw joint angle:
the normal vector of plane ABC:
$$\overrightarrow{M_1} = \overrightarrow{BA}\times\overrightarrow{BC} \quad (3\text{-}9)$$
the normal vector of plane BCD:
$$\overrightarrow{M_2} = \overrightarrow{CB}\times\overrightarrow{CD} \quad (3\text{-}10)$$
the joint angle LeftShoulderPitch:
$$\mathrm{LeftShoulderPitch} = \arccos\frac{\overrightarrow{M_1}\cdot\overrightarrow{M_2}}{|\overrightarrow{M_1}|\,|\overrightarrow{M_2}|} \quad (3\text{-}11)$$
5) Hand opening and closing angle LeftHand
because the Kinect reading of hand information is not accurate down to the state of the fingers, the hand opening and closing angle is estimated from the distance between the LeftHand and LeftWrist joints:
$$\mathrm{LeftHand} = |\overrightarrow{AB}| = \sqrt{(A_x - B_x)^2 + (A_y - B_y)^2 + (A_z - B_z)^2} \quad (3\text{-}12)$$
at this point all the joint angles of the left arm have been calculated, and the right arm joint angles are calculated in the same way;
6) Joint angle HeadPitch
of the two head joints, the angle related to lowering and raising the head is named HeadPitch; it is taken as the angle between the vector pointing from the shoulder center to the head and the normal vector of the plane formed by the two shoulders and the spine:
$$\mathrm{HeadPitch} = \arccos\frac{\overrightarrow{v}\cdot\overrightarrow{M}}{|\overrightarrow{v}|\,|\overrightarrow{M}|} \quad (3\text{-}13)$$
where $\overrightarrow{v}$ is the ShoulderCenter-to-Head vector and $\overrightarrow{M}$ is the normal vector of the shoulders-spine plane;
7) Joint angle HeadYaw
the other important head joint angle, HeadYaw, is responsible for the left-right rotation of the head, and its calculation is converted into the calculation of an angle between planes:
the normal vector of plane ABC:
$$\overrightarrow{M_1} = \overrightarrow{BA}\times\overrightarrow{BC} \quad (3\text{-}14)$$
the normal vector of plane BCD:
$$\overrightarrow{M_2} = \overrightarrow{CB}\times\overrightarrow{CD} \quad (3\text{-}15)$$
the joint angle:
$$\mathrm{HeadYaw} = \arccos\frac{\overrightarrow{M_1}\cdot\overrightarrow{M_2}}{|\overrightarrow{M_1}|\,|\overrightarrow{M_2}|} \quad (3\text{-}16)$$
during the motion of the robot, the activity of the lower limb joints directly affects the stability of the robot as a whole; to simplify control, a relative position method is adopted for the lower limbs, and the motion of the robot is controlled by calculating the position of the lower limb ends in a relative coordinate system in combination with the height of the human body's center of mass:
the Hipcenter point of the skeleton recognition result is projected vertically onto the ground as the origin of a new coordinate system, and the coordinates of the left and right ankle points Rightankle and Leftankle in the new coordinate system are taken as the control data of the robot;
with A the projected Hipcenter point and B, C the ankle points, the coordinates of B and C in the O coordinate system give
$$\mathrm{LeftFoot} = (A_z - B_z,\; A_x - B_x) \quad (3\text{-}17)$$
$$\mathrm{RightFoot} = (A_z - C_z,\; A_x - C_x) \quad (3\text{-}18)$$
to eliminate the absolute distance error caused by the height differences of different people, the coordinate values are divided by the hip width of the human body, calculated as
$$\mathrm{Hipwidth} = |\overrightarrow{BC}| = \sqrt{(B_x - C_x)^2 + (B_y - C_y)^2 + (B_z - C_z)^2} \quad (3\text{-}19)$$
so that the position of the lower limb ends in the new coordinate system is
$$\mathrm{LeftFoot} = \left(\frac{A_z - B_z}{\mathrm{Hipwidth}},\; \frac{A_x - B_x}{\mathrm{Hipwidth}}\right) \quad (3\text{-}20)$$
$$\mathrm{RightFoot} = \left(\frac{A_z - C_z}{\mathrm{Hipwidth}},\; \frac{A_x - C_x}{\mathrm{Hipwidth}}\right) \quad (3\text{-}21).$$
CN201510824988.8A 2015-11-24 2015-11-24 Method of cooperative flexible attitude control for motion capture robot Active CN105252532B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510824988.8A CN105252532B (en) 2015-11-24 2015-11-24 Method of cooperative flexible attitude control for motion capture robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510824988.8A CN105252532B (en) 2015-11-24 2015-11-24 Method of cooperative flexible attitude control for motion capture robot

Publications (2)

Publication Number Publication Date
CN105252532A true CN105252532A (en) 2016-01-20
CN105252532B CN105252532B (en) 2017-07-04

Family

ID=55092618

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510824988.8A Active CN105252532B (en) 2015-11-24 2015-11-24 Method of cooperative flexible attitude control for motion capture robot

Country Status (1)

Country Link
CN (1) CN105252532B (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103170973A (en) * 2013-03-28 2013-06-26 上海理工大学 Man-machine cooperation device and method based on Kinect video camera
US20150154467A1 (en) * 2013-12-04 2015-06-04 Mitsubishi Electric Research Laboratories, Inc. Method for Extracting Planes from 3D Point Cloud Sensor Data
CN203973551U (en) * 2014-06-13 2014-12-03 济南翼菲自动化科技有限公司 A kind of remote control robot of controlling by body gesture
CN104589356A (en) * 2014-11-27 2015-05-06 北京工业大学 Dexterous hand teleoperation control method based on Kinect human hand motion capturing
CN104440926A (en) * 2014-12-09 2015-03-25 重庆邮电大学 Mechanical arm somatic sense remote controlling method and mechanical arm somatic sense remote controlling system based on Kinect
CN105058396A (en) * 2015-07-31 2015-11-18 深圳先进技术研究院 Robot teaching system and control method thereof

Cited By (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106607910A (en) * 2015-10-22 2017-05-03 中国科学院深圳先进技术研究院 Robot real-time simulation method
CN106607910B (en) * 2015-10-22 2019-03-22 中国科学院深圳先进技术研究院 A kind of robot imitates method in real time
CN105945947A (en) * 2016-05-20 2016-09-21 西华大学 Robot writing system based on gesture control and control method of robot writing system
CN105999670A (en) * 2016-05-31 2016-10-12 山东科技大学 Shadow-boxing movement judging and guiding system based on kinect and guiding method adopted by same
CN106313072A (en) * 2016-10-12 2017-01-11 南昌大学 Humanoid robot based on leap motion of Kinect
CN106564055B (en) * 2016-10-31 2019-08-27 金阳娃 Human simulation robot stabilization motion planning method and control device
CN106564055A (en) * 2016-10-31 2017-04-19 金阳娃 Stable motion planning method of simulation humanoid robot and control device thereof
CN106667493A (en) * 2017-01-22 2017-05-17 河北大学 Human body balance assessment system and assessment method
CN106648116B (en) * 2017-01-22 2023-06-20 隋文涛 Virtual reality integrated system based on motion capture
CN106648116A (en) * 2017-01-22 2017-05-10 隋文涛 Virtual reality integrated system based on action capture
CN107272882A (en) * 2017-05-03 2017-10-20 江苏大学 The holographic long-range presentation implementation method of one species
CN107225573A (en) * 2017-07-05 2017-10-03 上海未来伙伴机器人有限公司 The method of controlling operation and device of robot
CN108518368A (en) * 2018-05-04 2018-09-11 贵阳海之力液压有限公司 A kind of valve control Hydraulic Power Transmission System applied to exoskeleton robot
CN108518368B (en) * 2018-05-04 2023-09-19 贵阳海之力液压有限公司 Valve control hydraulic transmission system applied to exoskeleton robot
CN108621164A (en) * 2018-05-10 2018-10-09 山东大学 Taiji push hands machine people based on depth camera
CN108762495A (en) * 2018-05-18 2018-11-06 深圳大学 The virtual reality driving method and virtual reality system captured based on arm action
WO2019218457A1 (en) * 2018-05-18 2019-11-21 深圳大学 Virtual reality driving method based on arm motion capture, and virtual reality system
CN109064487A (en) * 2018-07-02 2018-12-21 中北大学 A kind of human posture's comparative approach based on the tracking of Kinect bone node location
CN109064487B (en) * 2018-07-02 2021-08-06 中北大学 Human body posture comparison method based on Kinect skeleton node position tracking
CN109079794A (en) * 2018-09-18 2018-12-25 广东省智能制造研究所 It is a kind of followed based on human body attitude robot control and teaching method
CN109079794B (en) * 2018-09-18 2020-12-22 广东省智能制造研究所 Robot control and teaching method based on human body posture following
CN109591013A (en) * 2018-12-12 2019-04-09 山东大学 A kind of flexible assembly analogue system and its implementation
CN110135332A (en) * 2019-05-14 2019-08-16 吉林大学 A kind of bearing assembly producing line efficiency monitoring method
CN110135332B (en) * 2019-05-14 2022-05-31 吉林大学 Bearing assembly production line efficiency monitoring method
CN110450145A (en) * 2019-08-13 2019-11-15 广东工业大学 A kind of biomimetic manipulator based on skeleton identification
CN110598647B (en) * 2019-09-17 2022-04-22 四川爱目视光智能科技有限公司 Head posture recognition method based on image recognition
CN110598647A (en) * 2019-09-17 2019-12-20 四川爱目视光智能科技有限公司 Head posture recognition method based on image recognition
CN110815215A (en) * 2019-10-24 2020-02-21 上海航天控制技术研究所 Multi-mode fused rotating target approaching and stopping capture ground test system and method
CN110853099B (en) * 2019-11-19 2023-04-14 福州大学 Man-machine interaction method and system based on double Kinect cameras
CN110853099A (en) * 2019-11-19 2020-02-28 福州大学 Man-machine interaction method and system based on double Kinect cameras
CN110978064A (en) * 2019-12-11 2020-04-10 山东大学 Human body safety assessment method and system in human-computer cooperation
CN110978064B (en) * 2019-12-11 2022-06-24 山东大学 Human body safety assessment method and system in human-computer cooperation
CN111360819A (en) * 2020-02-13 2020-07-03 平安科技(深圳)有限公司 Robot control method and device, computer device and storage medium
CN111273783A (en) * 2020-03-25 2020-06-12 北京百度网讯科技有限公司 Digital human control method and device
CN111273783B (en) * 2020-03-25 2023-01-31 北京百度网讯科技有限公司 Digital human control method and device
CN112090076A (en) * 2020-08-14 2020-12-18 深圳中清龙图网络技术有限公司 Game character action control method, device, equipment and medium
CN113318424B (en) * 2020-12-23 2023-07-21 广州富港生活智能科技有限公司 Novel game device and control method
CN113318424A (en) * 2020-12-23 2021-08-31 广州富港万嘉智能科技有限公司 Novel game device and control method
CN113318425A (en) * 2020-12-23 2021-08-31 广州富港万嘉智能科技有限公司 Novel game device and control method
CN113318426A (en) * 2020-12-23 2021-08-31 广州富港万嘉智能科技有限公司 Novel game system
CN113318426B (en) * 2020-12-23 2023-05-26 广州富港生活智能科技有限公司 Novel game system
CN112936342B (en) * 2021-02-02 2023-04-28 福建天晴数码有限公司 Physical robot action evaluation system and method based on human body gesture recognition algorithm
CN112936342A (en) * 2021-02-02 2021-06-11 福建天晴数码有限公司 System and method for evaluating actions of entity robot based on human body posture recognition algorithm
CN112975993A (en) * 2021-02-22 2021-06-18 北京国腾联信科技有限公司 Robot teaching method, device, storage medium and equipment
CN113146634A (en) * 2021-04-25 2021-07-23 达闼机器人有限公司 Robot attitude control method, robot and storage medium
CN113077493A (en) * 2021-05-11 2021-07-06 德鲁动力科技(成都)有限公司 Method and system for following target of mobile robot
CN117340914A (en) * 2023-10-24 2024-01-05 哈尔滨工程大学 Humanoid robot human body feeling control method and control system
CN117340914B (en) * 2023-10-24 2024-05-14 哈尔滨工程大学 Humanoid robot human body feeling control method and control system

Also Published As

Publication number Publication date
CN105252532B (en) 2017-07-04

Similar Documents

Publication Publication Date Title
CN105252532A (en) Method of cooperative flexible attitude control for motion capture robot
CN111402290B (en) Action restoration method and device based on skeleton key points
Qiao et al. Real-time human gesture grading based on OpenPose
Qian et al. Developing a gesture based remote human-robot interaction system using kinect
Riley et al. Enabling real-time full-body imitation: a natural way of transferring human movement to humanoids
CN111443619B (en) Virtual-real fused human-computer cooperation simulation method and system
CN106843507B (en) Virtual reality multi-person interaction method and system
CN109243575B (en) Virtual acupuncture method and system based on mobile interaction and augmented reality
CN104570731A (en) Uncalibrated human-computer interaction control system and method based on Kinect
CN103529944A (en) Human body movement identification method based on Kinect
US20130202212A1 (en) Information processing apparatus, information processing method, and computer program
CN113103230A (en) Human-computer interaction system and method based on remote operation of treatment robot
JP2022084501A (en) Robot teaching system and method based on image segmentation and surface electromyography
Zhang et al. A real-time upper-body robot imitation system
Maycock et al. Robust tracking of human hand postures for robot teaching
Su et al. Development of an effective 3D VR-based manipulation system for industrial robot manipulators
Sreejith et al. Real-time hands-free immersive image navigation system using Microsoft Kinect 2.0 and Leap Motion Controller
Placidi et al. Data integration by two-sensors in a LEAP-based Virtual Glove for human-system interaction
Gao et al. Kinect-based motion recognition tracking robotic arm platform
Wang et al. Design and implementation of humanoid robot behavior imitation system based on skeleton tracking
Tian et al. Design and implementation of dance teaching system based on Unity3D
Infantino et al. A cognitive architecture for robotic hand posture learning
Jayasurya et al. Gesture controlled AI-robot using Kinect
Yu et al. Efficiency and learnability comparison of the gesture-based and the mouse-based telerobotic systems
KR20150044243A (en) Electronic learning apparatus and method for controlling contents by hand avatar

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20190517

Address after: Unit 4, Floor 5, Building 15, Qilu Cultural Creative Base, No. 101 Shunfeng Road, High-tech Zone, Jinan, Shandong Province, 250101

Patentee after: Shandong Muke Space Information Technology Co., Ltd.

Address before: No. 27, Shanda South Road, Jinan City, Shandong Province

Patentee before: Shandong University