CN115990878A - Three-degree-of-freedom control method for robot head posture and related equipment

Publication number: CN115990878A
Application number: CN202211644467.0A
Authority: CN (China)
Original language: Chinese (zh)
Inventors: 高松 (Gao Song), 刘旭东 (Liu Xudong), 姜琳 (Jiang Lin), 尹富珑 (Yin Fulong), 何俊培 (He Junpei)
Applicant/Assignee: Shenzhen Pengxing Intelligent Research Co Ltd
Legal status: Pending (an assumption, not a legal conclusion; no legal analysis has been performed)
Prior art keywords: head, neck, push rod, robot, expected

Abstract

The application discloses a three-degree-of-freedom control method for the head posture of a robot, together with related equipment, and relates to the technical field of robots; it aims to solve the technical problem of how to improve the anthropomorphic degree of the head posture of a robot. The method comprises the following steps: in response to a head pose control instruction, calculating a desired rotation angle of at least one motor and a desired movement distance of the push rod of each of at least two lifting mechanisms; obtaining a motor driving instruction according to the desired rotation angle of the at least one motor; obtaining a lifting mechanism driving instruction according to the desired movement distance of the push rod of each of the at least two lifting mechanisms; driving the at least one motor in response to the motor driving instruction; and driving the at least two lifting mechanisms in response to the lifting mechanism driving instruction, so as to control the head of the robot to move from its current pose to the desired pose.

Description

Three-degree-of-freedom control method for robot head posture and related equipment
Technical Field
The application relates to the technical field of robots, and in particular to a three-degree-of-freedom control method for the head posture of a robot and related equipment.
Background
With the development of robot technology, users place ever higher demands on how anthropomorphic a robot appears, and the head motion of the robot, which is driven by its neck mechanism, has a marked influence on this. At present, most robot necks adopt a three-degree-of-freedom serial-configuration neck mechanism, in which the roll axis and the pitch axis do not intersect; this seriously limits the anthropomorphic degree of the robot's head motion.
Disclosure of Invention
In view of the above, the present application provides a three-degree-of-freedom control method for the head posture of a robot and related devices, aiming to solve the technical problem of how to improve the anthropomorphic degree of the head posture of the robot.
A first aspect of the present application provides a three-degree-of-freedom control method for the head posture of a robot, the robot comprising a head, a body, and a neck connecting the head and the body, the neck comprising at least two lifting mechanisms connected in parallel and at least one motor connected to the push rods of the at least two lifting mechanisms through at least one connecting piece. The method comprises: in response to a head pose control instruction, calculating a desired rotation angle of the at least one motor and a desired movement distance of the push rod of each of the at least two lifting mechanisms; obtaining a motor driving instruction according to the desired rotation angle of the at least one motor; obtaining a lifting mechanism driving instruction according to the desired movement distance of the push rod of each of the at least two lifting mechanisms; driving the at least one motor in response to the motor driving instruction; and driving the at least two lifting mechanisms in response to the lifting mechanism driving instruction, to control the head of the robot to move from the current pose to the desired pose.
In the three-degree-of-freedom control method for the head posture of the robot, the desired rotation angle of the at least one motor and the desired movement distances of the push rods of the at least two lifting mechanisms are first calculated in response to the head pose control instruction. A motor driving instruction is then obtained from the desired rotation angle, and a lifting mechanism driving instruction from the desired movement distances. Finally, the at least one motor and the at least two lifting mechanisms are driven according to those instructions, so that the head of the robot is controlled to move from the current pose to the desired pose. In this way, the kinematics of the robot's neck mechanism can be solved whether the roll, pitch and yaw axes of the three-degree-of-freedom parallel-configuration neck mechanism are orthogonal or non-orthogonal to each other, improving the anthropomorphic degree of the robot's head motion.
A second aspect of the present application provides a robot comprising a processor, a head, a body, and a neck connecting the head and the body, the neck comprising at least two lifting mechanisms connected in parallel and at least one motor connected to the push rods of the at least two lifting mechanisms by at least one connecting piece. The robot is configured to: calculate, by the processor, a desired rotation angle of the at least one motor and a desired movement distance of the push rod of each of the at least two lifting mechanisms in response to a head pose control instruction; obtain a motor driving instruction according to the desired rotation angle of the at least one motor; obtain a lifting mechanism driving instruction according to the desired movement distance of the push rod of each of the at least two lifting mechanisms; drive the at least one motor in response to the motor driving instruction; and drive the at least two lifting mechanisms in response to the lifting mechanism driving instruction, to control the head of the robot to move from the current pose to the desired pose.
A third aspect of the present application provides a control terminal for a robot, the control terminal comprising a processor and being communicatively connected to the robot, the robot comprising a head, a body, and a neck connecting the head and the body, the neck comprising at least two lifting mechanisms connected in parallel and at least one motor connected to the push rods of the at least two lifting mechanisms through at least one connecting piece. The control terminal is configured to: calculate, by the processor, a desired rotation angle of the at least one motor of the robot's neck and a desired movement distance of the push rod of each of the at least two lifting mechanisms in response to a head pose control instruction; obtain a motor driving instruction according to the desired rotation angle of the at least one motor, the motor driving instruction being used to drive the at least one motor; obtain a lifting mechanism driving instruction according to the desired movement distance of the push rod of each of the at least two lifting mechanisms, the lifting mechanism driving instruction being used to drive the at least two lifting mechanisms; and send the motor driving instruction and the lifting mechanism driving instruction to the robot, to control the head of the robot to move from the current pose to the desired pose.
It can be understood that the specific embodiments and beneficial effects of the robot provided in the second aspect and of the control terminal provided in the third aspect are substantially the same as those of the three-degree-of-freedom control method for the head posture of the robot provided in the first aspect, and are not repeated here.
Drawings
Fig. 1 is a block diagram of a robot according to an embodiment of the present application.
Fig. 2 is a schematic view of a neck structure of a robot according to an embodiment of the present application.
Fig. 3 is a flowchart of a three-degree-of-freedom control method for the head posture of a robot according to an embodiment of the present application.
Fig. 4 is a flowchart of the substeps of step S301 shown in fig. 3 provided in an embodiment of the present application.
Fig. 5 is a flowchart of the substeps of step S406 shown in fig. 4 provided in an embodiment of the present application.
Fig. 6 is a flow chart of substeps of step S306 shown in fig. 3 provided in an embodiment of the present application.
Fig. 7 is a block diagram of a control terminal according to an embodiment of the present application.
Fig. 8 is a block diagram of a multi-legged robot according to one embodiment of the present application.
Fig. 9 is a schematic view of a scenario in which a control terminal provided in an embodiment of the present application controls a multi-legged robot.
Detailed Description
It should be noted that, in the embodiments of the present application, "at least one" refers to one or more, and "multiple" refers to two or more. "And/or" describes an association relationship between associated objects and can represent three relationships; for example, A and/or B may represent: A alone, A and B together, or B alone, where A and B may be singular or plural. The terms "first", "second", "third", "fourth" and the like in the description, claims and drawings, if any, are used to distinguish between similar objects and do not necessarily describe a particular sequence or chronological order.
It should be further noted that the methods disclosed in the embodiments of the present application, including the methods shown in the flowcharts, comprise one or more steps for implementing the method; the order of those steps may be interchanged, and some steps may be deleted, without departing from the scope of the claims.
Some technical terms of the embodiments of the present application are described below.
1. Kinematic solution
The head motion of the robot is driven by the neck mechanism, and a kinematic solution is required before the head motion is driven. Kinematic solutions include the forward kinematic solution and the inverse kinematic solution. The process of converting the rotation angle of a motor of the robot's neck, or the movement distance of a push rod of a lifting mechanism, into the head pose of the robot is called the forward kinematic solution. The reverse process, converting the head pose of the robot into the rotation angle of a neck motor or the movement distance of a lifting-mechanism push rod, is called the inverse kinematic solution.
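As an informal illustration (not part of the claimed method), the two directions of the solution can be written as the following Python interface sketch; the function names and the assumption of a single motor angle plus per-rod travel distances are illustrative only, with the actual computations developed step by step below:
```python
import numpy as np

def forward_kinematics(motor_angle: float, rod_travels: np.ndarray) -> np.ndarray:
    """Forward solution: joint-space values (neck motor rotation angle and the
    movement distances of the push rods) -> head pose [roll, pitch, yaw].
    The body is mechanism-specific; see steps S305-S306 below."""
    raise NotImplementedError  # illustrative interface only

def inverse_kinematics(head_pose: np.ndarray) -> tuple[float, np.ndarray]:
    """Inverse solution: desired head pose [roll, pitch, yaw] -> neck motor
    rotation angle and per-rod movement distances (steps S401-S406 below)."""
    raise NotImplementedError  # illustrative interface only
```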
2. Serial-configuration neck mechanism
A serial-configuration neck mechanism is one in which at least two lifting mechanisms of the robot's neck are connected in series. The roll axis and pitch axis of a three-degree-of-freedom serial-configuration neck mechanism do not intersect, which severely limits the anthropomorphic degree of the robot's head motion.
3. Parallel-configuration neck mechanism
A parallel-configuration neck mechanism is one in which at least two lifting mechanisms of the robot's neck are connected in parallel. The roll axis and pitch axis of a three-degree-of-freedom parallel-configuration neck mechanism intersect at one point, which can effectively improve the anthropomorphic degree of the robot's head motion. However, requiring the roll, pitch and yaw axes of the three-degree-of-freedom parallel-configuration neck mechanism to be mutually orthogonal results in a single and limited kinematic solution process.
Based on the above, the application provides a three-degree-of-freedom control method for the head posture of a robot and related equipment. First, the desired rotation angle of at least one motor and the desired movement distances of the push rods of at least two lifting mechanisms are calculated in response to a head pose control instruction. A motor driving instruction is then obtained from the desired rotation angle, and a lifting mechanism driving instruction from the desired movement distances. The at least one motor is driven according to the motor driving instruction and the at least two lifting mechanisms according to the lifting mechanism driving instruction, so that the head of the robot is controlled to move from the current pose to the desired pose. The kinematics of the robot's neck mechanism can thus be solved whether the roll, pitch and yaw axes of the three-degree-of-freedom parallel-configuration neck mechanism are orthogonal or non-orthogonal to each other, improving the anthropomorphic degree of the robot's head motion.
The robot according to the embodiment of the present application will be described below.
Fig. 1 is a block diagram of a robot 10 according to an embodiment of the present application.
Referring to fig. 1, a robot 10 includes a head 100, a body 200, a neck 300, and a processor 400. The neck 300 connects the head 100 and the body 200. The neck 300 includes at least two lifting mechanisms 310, at least one motor 320, and at least one connector 330. Wherein, at least two lifting mechanisms 310 are connected in parallel, and at least one motor 320 is connected with at least two lifting mechanisms 310 through at least one connecting piece 330. The lifting mechanism 310 includes a pushrod 311 and at least one motor 312.
In the present embodiment, the robot 10 may calculate a desired rotation angle of the at least one motor 320 and a desired movement distance of the push rods 311 of the at least two elevating mechanisms 310, respectively, in response to the head pose control command by the processor 400, and then generate a motor driving command and an elevating mechanism driving command, and then drive the at least one motor 320 and the at least two elevating mechanisms 310, respectively.
Specifically, the robot 10 may perform a rotational motion of the neck 300 around the yaw axis direction by driving at least one motor 320 in response to a motor driving command through the processor 400. The robot 10 may respond to the driving instruction of the lifting mechanism through the processor 400, and drive the motors 312 inside the at least two lifting mechanisms 310 to perform rotational movement, so as to drive the push rods 311 to move, and thus drive the neck 300 to perform rotational movement around the roll axis direction and/or the pitch axis direction. The rotational movement of the at least one motor 320 and the movement of the push rods 311 inside the at least two lifting mechanisms 310 brings the neck 300 into rotational movement about at least one of the yaw, roll and pitch axes, thereby bringing the head 100 connected to the neck 300 into movement.
It is understood that the pushrod 311 movement may include lengthening or shortening. The head 100 motion may include back and forth swing, side to side swing, or rotation.
For example, when a user issues a head posture control instruction to the robot 10 by voice, touch or the like, the robot 10, through the processor 400, calculates in response the desired rotation angle of the at least one motor 320 and the desired movement distance of the push rod 311 of each of the at least two lifting mechanisms 310. The robot 10 then, through the processor 400, generates a motor driving command according to the desired rotation angle of the at least one motor 320 and a lifting mechanism driving command according to the desired movement distances of the push rods 311. Next, in response to the motor driving command, the robot 10 drives the at least one motor 320 to rotate, driving the neck 300 to rotate about the yaw axis direction and thus the head 100 to rotate. In response to the lifting mechanism driving command, the robot 10 drives the motors 312 inside the at least two lifting mechanisms 310 to rotate, extending or shortening the push rods 311, thereby driving the neck 300 to rotate about the roll axis direction, so that the head 100 swings left or right; and/or driving the neck 300 to rotate about the pitch axis direction, so that the head 100 swings back and forth.
The neck of the robot according to the embodiment of the present application will be described below by taking a three-degree-of-freedom parallel configuration neck mechanism as an example.
Fig. 2 is a schematic view of the structure of a neck 300 of the robot 10 according to an embodiment of the present application.
Referring to fig. 2, the neck 300 of the robot 10 is a three-degree-of-freedom parallel-configuration neck mechanism. The neck 300 includes two lifting mechanisms 310, a motor 320, a connector 330, and a base 340. The two lifting mechanisms 310 are a left lifting mechanism 3101 and a right lifting mechanism 3102, connected in parallel, and the motor 320 connects them through the connector 330. The left lifting mechanism 3101 includes a left push rod 3111 and at least one motor (not shown); the right lifting mechanism 3102 includes a right push rod 3112 and at least one motor (not shown). The hinge joint of the left push rod 3111 with the connector 330 is the upper left hinge point 3301, and its hinge joint with the base 340 is the lower left hinge point 3401. The hinge joint of the right push rod 3112 with the connector 330 is the upper right hinge point 3302, and its hinge joint with the base 340 is the lower right hinge point 3402. The direction perpendicular to the left push rod 3111 and the right push rod 3112 is the pitch axis direction. The roll axis direction is orthogonal to the pitch axis direction and parallel to the lower surface of the base 340. The direction perpendicular to the upper and lower surfaces of the motor 320 is the yaw axis direction. It can be seen that the roll axis direction is non-orthogonal to the yaw axis direction.
In the present embodiment, the robot 10 may calculate the desired rotation angle of the motor 320, the desired movement distance of the left push rod 3111, and the desired movement distance of the right push rod 3112 by constructing three local coordinate systems: a base coordinate system, a neck coordinate system, and a head coordinate system. The origin of the base coordinate system is located near the base 340; it comprises mutually orthogonal X0, Y0 and Z0 axes, with the X0 and Y0 axes parallel to the lower surface of the base 340 and the Z0 axis perpendicular to it. The origin of the neck coordinate system is located near the connector 330; it comprises mutually orthogonal X1, Y1 and Z1 axes, with the Y1 axis parallel to the pitch axis direction and the Z1 axis parallel to the yaw axis direction. The origin of the head coordinate system is located near the motor 320; it comprises mutually orthogonal X2, Y2 and Z2 axes, parallel respectively to the X0, Y0 and Z0 axes of the base coordinate system.
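A minimal sketch of these three frames as 4x4 homogeneous transforms follows; the numeric offsets are hypothetical placeholders, since the actual origins depend on the mechanism geometry:
```python
import numpy as np

# Base frame: X0/Y0 parallel to the lower surface of base 340, Z0 perpendicular to it.
T_base = np.eye(4)

# Placeholder offsets (metres) from the base origin; real values come from the
# mechanism drawings. The neck origin sits near connector 330, the head origin
# near motor 320, with the head axes parallel to the base axes.
T_neck_frame = np.eye(4)
T_neck_frame[:3, 3] = [0.0, 0.0, 0.10]

T_head_frame = np.eye(4)
T_head_frame[:3, 3] = [0.0, 0.0, 0.18]
```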
The three-degree-of-freedom control method of the robot head posture according to the embodiment of the present application will be described below taking the neck 300 of the robot 10 shown in fig. 2 as an example.
Fig. 3 is a flowchart of a three-degree-of-freedom control method for the head posture of a robot according to an embodiment of the present application.
Referring to fig. 3, the three-degree-of-freedom control method for the head posture of the robot may include the following steps:
s301, in response to the head posture control instruction, a desired rotation angle of the motor 320, a desired movement distance of the left push rod 3111 of the left elevating mechanism 3101, and a desired movement distance of the right push rod 3112 of the right elevating mechanism 3102 are calculated.
In the present embodiment, the head pose control instruction is used to control the head 100 of the robot 10 to move from the current pose to the desired pose. The head pose control instruction includes a desired head pose, which refers to the desired pose of the head 100 of the robot 10 in the head coordinate system. The robot 10 may first recognize the desired head pose from the head pose control instruction, and then calculate the desired rotation angle of the motor 320, the desired movement distance of the left push rod 3111 of the left lifting mechanism 3101, and the desired movement distance of the right push rod 3112 of the right lifting mechanism 3102 from the desired head pose by an inverse kinematic solution.
It will be appreciated that rotating the motor 320 to the desired rotation angle rotates the neck 300 of the robot 10 about the yaw axis direction, and thus rotates the head 100. Moving the left push rod 3111 of the left lifting mechanism 3101 and the right push rod 3112 of the right lifting mechanism 3102 by their desired movement distances drives the neck 300 to rotate about the roll axis direction, so that the head 100 swings left or right; and/or about the pitch axis direction, so that the head 100 swings back and forth.
For example, when motor 320 rotates in a clockwise direction, neck 300 rotates in a clockwise direction about the yaw axis, thereby rotating head 100 in a clockwise direction. When motor 320 rotates in a counter-clockwise direction, neck 300 rotates in a counter-clockwise direction about the yaw axis, thereby rotating head 100 in a counter-clockwise direction. When the left push rod 3111 of the left elevating mechanism 3101 is extended a distance greater than the right push rod 3112 of the right elevating mechanism 3102, the neck 300 is rotated in the clockwise direction around the roll axis direction, thereby driving the head 100 to swing rightward. When the left push rod 3111 of the left elevating mechanism 3101 is extended a distance smaller than the right push rod 3112 of the right elevating mechanism 3102, the neck 300 is rotated in the counterclockwise direction around the roll axis direction, thereby driving the head 100 to swing leftward. When both the left push rod 3111 of the left elevating mechanism 3101 and the right push rod of the right elevating mechanism 3102 are elongated, the neck 300 rotates in the counterclockwise direction around the pitch axis direction, thereby driving the head 100 to swing backward. When both the left push rod 3111 of the left elevating mechanism 3101 and the right push rod of the right elevating mechanism 3102 are shortened, the neck 300 rotates in the clockwise direction around the pitch axis direction, thereby driving the head 100 to swing forward.
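The sign conventions above can be summarized by a small qualitative helper; this sketch assumes positive values mean a push rod lengthens and is purely illustrative:
```python
def head_motion_from_rod_travel(d_left: float, d_right: float) -> list[str]:
    """Qualitative reading of the conventions above: differential rod travel
    rolls the head (left rod longer -> swing right), common travel pitches it
    (both longer -> swing backward, both shorter -> swing forward)."""
    motions = []
    if d_left > d_right:
        motions.append("roll clockwise: head swings right")
    elif d_left < d_right:
        motions.append("roll counterclockwise: head swings left")
    if d_left > 0 and d_right > 0:
        motions.append("pitch counterclockwise: head swings backward")
    elif d_left < 0 and d_right < 0:
        motions.append("pitch clockwise: head swings forward")
    return motions
```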
S302, a motor driving instruction is acquired according to the desired rotation angle of the motor 320.
In the present embodiment, when the robot 10 calculates a desired rotation angle of the motor 320, a motor driving instruction may be generated. The motor driving command is used to drive the motor 320 to perform a rotational motion toward a desired rotational angle.
S303, an elevating mechanism driving command is acquired based on the desired movement distance of the left push rod 3111 of the left elevating mechanism 3101 and the desired movement distance of the right push rod 3112 of the right elevating mechanism 3102.
In the present embodiment, when the robot 10 has calculated the desired movement distance of the left push rod 3111 of the left lifting mechanism 3101 and the desired movement distance of the right push rod 3112 of the right lifting mechanism 3102, it may generate a lifting mechanism driving command. The lifting mechanism driving command is used to drive the left lifting mechanism 3101 to move its left push rod 3111 by the desired movement distance, and to drive the right lifting mechanism 3102 to move its right push rod 3112 by the desired movement distance.
S304 of controlling the head 100 of the robot 10 to move from the current posture to the desired posture in response to the motor driving instruction and the elevating mechanism driving instruction.
In the present embodiment, the robot 10 drives the motor 320 to rotate to the desired rotation angle in response to the motor driving instruction, and, in response to the lifting mechanism driving instruction, drives the left lifting mechanism 3101 to move its left push rod 3111 by the desired movement distance and the right lifting mechanism 3102 to move its right push rod 3112 by the desired movement distance, thereby controlling the head 100 of the robot 10 to move from the current pose to the desired pose.
For example, assume that the motor drive command is "the motor 320 rotates 60 degrees in the clockwise direction", and the lift mechanism drive command is "the left push rod 3111 of the left lift mechanism 3101 is elongated 30 cm, and the right push rod 3112 of the right lift mechanism 3102 is elongated 20 cm". In response to the motor driving command, the robot 10 drives the motor 320 to rotate 60 degrees in the clockwise direction, which rotates the neck 300 60 degrees in the clockwise direction around the yaw axis direction, thereby rotating the head 100 60 degrees in the clockwise direction. In response to the lifting mechanism driving instruction, the robot 10 drives the left push rod 3111 of the left lifting mechanism 3101 to extend for 30 cm, and drives the right push rod 3112 of the right lifting mechanism 3102 to extend for 20 cm, so as to drive the neck 300 to rotate around the roll axis direction by a corresponding desired angle in the clockwise direction, and drive the head 100 to swing rightward by a corresponding desired angle; at the same time, the neck 300 is rotated around the pitch axis direction by a corresponding desired angle in the counterclockwise direction, thereby driving the head 100 to swing backward by a corresponding desired angle.
S305, the rotation angle of the motor 320, the movement distance of the left push rod 3111 of the left elevating mechanism 3101, and the movement distance of the right push rod 3112 of the right elevating mechanism 3102 are detected.
In the present embodiment, the robot 10 may detect the rotation angle of the motor 320, and the movement distance of the left push rod 3111 of the left elevating mechanism 3101 and the movement distance of the right push rod 3112 of the right elevating mechanism 3102 in real time or periodically by means of sensors.
S306, calculating the current posture of the head 100 of the robot 10 based on the detected rotation angle of the motor 320, the movement distance of the left push rod 3111 of the left elevating mechanism 3101, and the movement distance of the right push rod 3112 of the right elevating mechanism 3102.
In the present embodiment, after detecting the rotation angle of the motor 320, the movement distance of the left push rod 3111 of the left lifting mechanism 3101, and the movement distance of the right push rod 3112 of the right lifting mechanism 3102, the robot 10 can calculate the current pose of the head 100 of the robot 10 from these values by a forward kinematic solution.
S307, it is determined whether the current posture of the head 100 of the robot 10 reaches the desired posture.
In the present embodiment, when the head 100 of the robot 10 moves from the current posture to the desired posture, the current posture of the head 100 is constantly changed, and the robot 10 may periodically calculate the current posture of the head 100, and detect whether the current posture of the head 100 reaches the desired posture.
When the current pose of the head 100 of the robot 10 has reached the desired pose, the method returns to step S301: the head pose control instruction of the next control cycle is received, and the desired rotation angle of the motor 320, the desired movement distance of the left push rod 3111 of the left lifting mechanism 3101, and the desired movement distance of the right push rod 3112 of the right lifting mechanism 3102 are calculated in response to it.
When the current pose of the head 100 of the robot 10 has not reached the desired pose, the method returns to step S305: the head 100 continues moving toward the desired pose while the robot continues to detect the rotation angle of the motor 320 and the movement distances of the left push rod 3111 and right push rod 3112, and recalculates the current pose of the head 100, until the current pose reaches the desired pose.
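Taken together, steps S301 to S307 form a command-then-monitor cycle. The following sketch assumes hypothetical robot methods (inverse_kinematics, drive, read_joint_sensors, forward_kinematics); the patent does not prescribe such an API:
```python
import numpy as np

def head_pose_control_cycle(robot, desired_pose: np.ndarray, tol: float = 1e-3):
    """One cycle of S301-S307 for the two-rod neck of Fig. 2."""
    # S301: inverse kinematic solution for the desired head pose
    motor_angle, d_left, d_right = robot.inverse_kinematics(desired_pose)
    # S302-S304: build and execute the motor and lifting-mechanism commands
    robot.drive(motor_angle, d_left, d_right)
    # S305-S307: monitor until the head reaches the desired pose
    while True:
        angle, l_left, l_right = robot.read_joint_sensors()              # S305
        current_pose = robot.forward_kinematics(angle, l_left, l_right)  # S306
        if np.linalg.norm(current_pose - desired_pose) < tol:            # S307
            break  # reached: the next instruction is handled next cycle
```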
It is understood that the robot 10 has a plurality of control periods T, and the control periods T may be set as needed.
For example, the control period T may include at least one first control period T1, at least one second control period T2, and at least one third control period T3. Two adjacent first control periods T1 are spaced apart from each other by at least two control periods T. The second control period T2 is a period of time after the first control period T1. For example, the second control period T2 and the first control period T1 may be two consecutive periods, i.e., the second control period T2 begins immediately after the first control period T1 ends; alternatively, the second control period T2 may be separated from the first control period T1 by at least one control period T. Thus a second control period T2 may lie between two adjacent first control periods T1, or between two first control periods T1 arranged at an interval; that is, two adjacent first control periods T1 may or may not include at least one second control period T2 between them. Similarly, the third control period T3 is a period of time after the second control period T2. For example, the second control period T2 and the third control period T3 may be two consecutive periods, i.e., the third control period T3 begins immediately after the second control period T2 ends; alternatively, the third control period T3 may be separated from the second control period T2 by at least one control period T. Thus a third control period T3 may lie between two adjacent second control periods T2, or between two second control periods T2 arranged at an interval; that is, two adjacent second control periods T2 may or may not include at least one third control period T3 between them.
Fig. 4 is a flowchart of the substeps of step S301 shown in fig. 3 provided in an embodiment of the present application.
Referring to fig. 4, calculating, in response to the head pose control instruction, the desired rotation angle of the motor 320, the desired movement distance of the left push rod 3111 of the left lifting mechanism 3101, and the desired movement distance of the right push rod 3112 of the right lifting mechanism 3102 may include the following steps:
s401, the desired head pose of the robot 10 is acquired according to the head pose control instruction.
In the present embodiment, the head pose control instruction includes a desired head pose comprising a head roll angle r2, a head pitch angle p2 and a head yaw angle y2. The head roll angle r2 is the rotation angle of the head 100 of the robot 10 about the X2-axis direction of the head coordinate system, the head pitch angle p2 is its rotation angle about the Y2-axis direction, and the head yaw angle y2 is its rotation angle about the Z2-axis direction.
For example, when a user issues a head pose control instruction of "swing the head forward 30 degrees, swing left 45 degrees, rotate clockwise 60 degrees" to the robot 10 by voice, touch or the like, the robot 10 may recognize the desired head pose from the instruction: head roll angle r2 = −45°, head pitch angle p2 = 30°, and head yaw angle y2 = 60°.
It will be appreciated that the head roll angle r2, head pitch angle p2 and head yaw angle y2 are all signed quantities whose positive directions can be set as needed. For example, when the head swings rightward, the head roll angle r2 is positive; when the head swings leftward, it is negative. When the head swings forward, the head pitch angle p2 is positive; when the head swings backward, it is negative. When the head rotates clockwise, the head yaw angle y2 is positive; when it rotates counterclockwise, it is negative.
S402, the head roll angle r2, head pitch angle p2 and head yaw angle y2 of the desired head pose are converted into a homogeneous coordinate transformation matrix.
In the present embodiment, after recognizing the desired head pose, the robot 10 can convert the head roll angle r2, head pitch angle p2 and head yaw angle y2 into a homogeneous coordinate transformation matrix of the desired head pose by homogeneous coordinate transformation, so that the desired head pose can be converted between coordinate systems.
For example, the robot 10 may convert the head roll angle r2, head pitch angle p2 and head yaw angle y2 into the following homogeneous coordinate transformation matrix of the desired head pose (written here in the standard Z-Y-X, i.e., yaw-pitch-roll, Euler form):
T_head =
[ cy2·cp2   cy2·sp2·sr2 − sy2·cr2   cy2·sp2·cr2 + sy2·sr2   x2 ]
[ sy2·cp2   sy2·sp2·sr2 + cy2·cr2   sy2·sp2·cr2 − cy2·sr2   y2 ]
[ −sp2      cp2·sr2                 cp2·cr2                 z2 ]
[ 0         0                       0                       1  ]
where T_head is the homogeneous coordinate transformation matrix of the desired head pose; sr2 and cr2 are the sine and cosine of the head roll angle r2, sp2 and cp2 are the sine and cosine of the head pitch angle p2, and sy2 and cy2 are the sine and cosine of the head yaw angle y2; (x2, y2, z2) are the coordinates of a reference point of the head 100 transformed from the head coordinate system into the base coordinate system.
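A Python sketch of this conversion, under the same Z-Y-X Euler assumption as above:
```python
import numpy as np

def rpy_to_homogeneous(r: float, p: float, y: float, t=(0.0, 0.0, 0.0)) -> np.ndarray:
    """Build a 4x4 homogeneous transform from roll/pitch/yaw (radians) and a
    translation, assuming the common Z-Y-X (yaw-pitch-roll) Euler convention."""
    sr, cr = np.sin(r), np.cos(r)
    sp, cp = np.sin(p), np.cos(p)
    sy, cy = np.sin(y), np.cos(y)
    return np.array([
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr, t[0]],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr, t[1]],
        [-sp,     cp * sr,                cp * cr,                t[2]],
        [0.0,     0.0,                    0.0,                    1.0],
    ])

# e.g. head swings left 45 deg, forward 30 deg, rotates clockwise 60 deg
T_head = rpy_to_homogeneous(np.radians(-45), np.radians(30), np.radians(60))
```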
S403, converting the homogeneous coordinate transformation matrix of the head desired posture into the first homogeneous coordinate transformation matrix of the neck desired posture by coordinate transformation.
In the present embodiment, after obtaining the homogeneous coordinate transformation matrix of the desired head pose, the robot 10 converts the desired head pose in the head coordinate system into the desired neck pose in the neck coordinate system according to the coordinate conversion relationship between the head coordinate system and the neck coordinate system.
For example, the robot 10 may convert the homogeneous coordinate transformation matrix of the desired head pose into the first homogeneous coordinate transformation matrix of the desired neck pose by the following matrix transformation:
T_neck = T_head × T_headtoneck
where T_neck is the first homogeneous coordinate transformation matrix of the desired neck pose, T_head is the homogeneous coordinate transformation matrix of the desired head pose, and T_headtoneck is the coordinate transformation matrix from the head coordinate system to the neck coordinate system.
S404, the first homogeneous coordinate transformation matrix of the desired neck pose is converted into a neck roll angle r1, a neck pitch angle p1 and a neck yaw angle y1.
In the present embodiment, after obtaining the first homogeneous coordinate transformation matrix of the desired neck pose, the robot 10 may convert it into the neck roll angle r1, neck pitch angle p1 and neck yaw angle y1 by the inverse of the homogeneous coordinate transformation. The neck yaw angle y1 is the desired rotation angle of the motor 320.
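Steps S403 and S404 can be sketched as follows, with T_head_to_neck standing in for the fixed head-to-neck transform; the angle extraction inverts the Z-Y-X composition assumed above:
```python
import numpy as np

def neck_angles_from_head(T_head: np.ndarray, T_head_to_neck: np.ndarray):
    """S403: change frames; S404: recover roll/pitch/yaw from the rotation
    block, inverting the Z-Y-X composition assumed above. The yaw angle y1 is
    the desired rotation angle of motor 320."""
    T_neck = T_head @ T_head_to_neck           # S403
    R = T_neck[:3, :3]
    p1 = -np.arcsin(R[2, 0])                   # neck pitch angle
    r1 = np.arctan2(R[2, 1], R[2, 2])          # neck roll angle
    y1 = np.arctan2(R[1, 0], R[0, 0])          # neck yaw angle -> motor 320
    return r1, p1, y1
```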
S405, the neck roll angle r1 and neck pitch angle p1 are converted into a second homogeneous coordinate transformation matrix of the desired neck pose.
In the present embodiment, after obtaining the neck roll angle r1, neck pitch angle p1 and neck yaw angle y1, the robot 10 sets the neck yaw angle y1 to 0 and converts the neck roll angle r1 and neck pitch angle p1 into a second homogeneous coordinate transformation matrix of the desired neck pose by homogeneous coordinate transformation, so that the desired neck pose can be converted between coordinate systems.
For example, the robot 10 may convert the neck roll angle r1 and neck pitch angle p1 into the following second homogeneous coordinate transformation matrix of the desired neck pose (with the yaw term set to zero, this is Ry(p1)·Rx(r1) together with a translation):
T'neck =
[ cp1    sp1·sr1   sp1·cr1   xn ]
[ 0      cr1       −sr1      yn ]
[ −sp1   cp1·sr1   cp1·cr1   zn ]
[ 0      0         0         1  ]
where T'neck is the second homogeneous coordinate transformation matrix of the desired neck pose; sr1 and cr1 are the sine and cosine of the neck roll angle r1, and sp1 and cp1 are the sine and cosine of the neck pitch angle p1; (xn, yn, zn) are the coordinates of a reference point of the head 100 transformed from the neck coordinate system into the base coordinate system.
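Equivalently in code, this is the earlier construction with the yaw term zeroed, leaving Ry(p1)·Rx(r1) plus a translation:
```python
import numpy as np

def neck_pose_yaw_zeroed(r1: float, p1: float, t=(0.0, 0.0, 0.0)) -> np.ndarray:
    """S405: second homogeneous transform of the desired neck pose, built from
    roll and pitch only (the yaw angle y1 is handled separately by motor 320)."""
    sr, cr = np.sin(r1), np.cos(r1)
    sp, cp = np.sin(p1), np.cos(p1)
    return np.array([
        [cp,   sp * sr, sp * cr, t[0]],
        [0.0,  cr,     -sr,      t[1]],
        [-sp,  cp * sr, cp * cr, t[2]],
        [0.0,  0.0,     0.0,     1.0],
    ])
```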
S406, a desired movement distance of the left push rod 3111 of the left elevating mechanism 3101 and a desired movement distance of the right push rod 3112 of the right elevating mechanism 3102 are calculated from the second homogeneous coordinate transformation matrix of the neck desired posture.
In the present embodiment, after obtaining the second homogeneous coordinate transformation matrix of the desired neck pose, the robot 10 may convert the desired head pose from the neck coordinate system into the base coordinate system by a matrix translation transformation, and then obtain the desired movement distance of the left push rod 3111 of the left lifting mechanism 3101 by calculating the desired position of the upper left hinge point 3301 of the left push rod 3111 in the base coordinate system, and the desired movement distance of the right push rod 3112 of the right lifting mechanism 3102 by calculating the desired position of the upper right hinge point 3302 of the right push rod 3112.
It will be appreciated that, since the position of the lower left hinge point 3401 of the left push rod 3111 and the position of the lower right hinge point 3402 of the right push rod 3112 are both fixed, the desired movement distance of the upper left hinge point 3301 is exactly the desired movement distance of the left push rod 3111, and the desired movement distance of the upper right hinge point 3302 is exactly the desired movement distance of the right push rod 3112.
Fig. 5 is a flowchart of the substeps of step S406 shown in fig. 4 provided in an embodiment of the present application.
Referring to fig. 5, the robot 10 calculates a desired movement distance of the left push rod 3111 of the left elevating mechanism 3101 and a desired movement distance of the right push rod 3112 of the right elevating mechanism 3102 according to the second homogeneous coordinate transformation matrix of the desired posture of the neck, and may include the steps of:
s501, a left upper hinge point 3301 desired position and a left lower hinge point 3401 desired position of the left push rod 3111 of the left elevating mechanism 3101, and a right upper hinge point 3302 desired position and a right lower hinge point 3402 desired position of the right push rod 3112 of the right elevating mechanism 3102 are calculated according to the second homogeneous coordinate transformation matrix of the neck desired posture.
In the present embodiment, since the coordinates of both the left lower hinge point 3401 of the left push rod 3111 and the right lower hinge point 3402 of the right push rod 3112 in the base coordinate system are fixed, the robot 10 can determine the coordinates of the left lower hinge point 3401 in the base coordinate system, that is, the position of the left lower hinge point 3401, before the left push rod 3111 moves; and determining the coordinates of the right lower hinge point 3402 in the base coordinate system, i.e., the position of the right lower hinge point 3402, before the right push rod 3112 moves. After the movement of the left push rod 3111, the robot 10 may directly determine the position of the left lower hinge point 3401 of the left push rod 3111 before the movement as the desired position of the left lower hinge point 3401 of the left push rod 3111 after the movement. After the movement of the right push rod 3112, the robot 10 may directly determine the position of the right lower hinge point 3402 of the right push rod 3112 before the movement as the desired position of the right lower hinge point 3402 of the right push rod 3112 after the movement.
The robot 10 may calculate the desired position of the upper left hinge point 3301 of the left push rod 3111 of the left elevating mechanism 3101 and the desired position of the upper right hinge point 3302 of the right push rod 3112 of the right elevating mechanism 3102 in the basic coordinate system through matrix translation transformation.
For example, the robot 10 may calculate the desired position of the upper left hinge point 3301 of the left push rod 3111 of the left lifting mechanism 3101 by the following matrix translation transformation:
T_hinge,left = T'neck × T_necktohinge,left
where T_hinge,left is the homogeneous coordinate matrix of the upper left hinge point 3301 of the left push rod 3111 of the left lifting mechanism 3101; T'neck is the second homogeneous coordinate transformation matrix of the desired neck pose; T_necktohinge,left is the translation transformation matrix from the origin of the neck coordinate system to the upper left hinge point 3301 of the left push rod 3111, and is fixed. The translation column (x_n,left, y_n,left, z_n,left) of T_hinge,left gives the coordinates of the upper left hinge point 3301 in the base coordinate system, i.e., the desired position of the upper left hinge point 3301 of the left push rod 3111.
Similarly, the robot 10 may calculate the desired position of the upper right hinge point 3302 of the right push rod 3112 of the right lifting mechanism 3102 by the following matrix translation transformation:
T_hinge,right = T'neck × T_necktohinge,right
where T_hinge,right is the homogeneous coordinate matrix of the upper right hinge point 3302 of the right push rod 3112 of the right lifting mechanism 3102; T'neck is the second homogeneous coordinate transformation matrix of the desired neck pose; T_necktohinge,right is the translation transformation matrix from the origin of the neck coordinate system to the upper right hinge point 3302 of the right push rod 3112, and is fixed. The translation column (x_n,right, y_n,right, z_n,right) of T_hinge,right gives the coordinates of the upper right hinge point 3302 in the base coordinate system, i.e., the desired position of the upper right hinge point 3302 of the right push rod 3112.
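A sketch of this computation; the two neck-to-hinge offsets are hypothetical placeholders for the fixed translation matrices named above:
```python
import numpy as np

# Hypothetical fixed translations (metres) from the neck-frame origin to each
# upper hinge point; real values come from the mechanism geometry.
T_neck_to_hinge_left = np.eye(4)
T_neck_to_hinge_left[:3, 3] = [0.0, 0.04, -0.02]
T_neck_to_hinge_right = np.eye(4)
T_neck_to_hinge_right[:3, 3] = [0.0, -0.04, -0.02]

def upper_hinge_desired_positions(T_neck_prime: np.ndarray):
    """S501: desired base-frame positions of upper hinge points 3301 and 3302,
    read from the translation column of the composed transforms."""
    left = (T_neck_prime @ T_neck_to_hinge_left)[:3, 3]
    right = (T_neck_prime @ T_neck_to_hinge_right)[:3, 3]
    return left, right
```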
S502, the first distance of the left push rod 3111, i.e., the distance between the position of the upper left hinge point 3301 and the position of the lower left hinge point 3401 of the left push rod 3111 of the left lifting mechanism 3101, and the first distance of the right push rod 3112, i.e., the distance between the position of the upper right hinge point 3302 and the position of the lower right hinge point 3402 of the right push rod 3112 of the right lifting mechanism 3102, are detected.
In the present embodiment, the robot 10 may detect, in real time or periodically by means of sensors, the positions in the base coordinate system of the upper left hinge point 3301 and lower left hinge point 3401 of the left push rod 3111 of the left lifting mechanism 3101, and of the upper right hinge point 3302 and lower right hinge point 3402 of the right push rod 3112 of the right lifting mechanism 3102. The robot 10 then calculates the first distance of the left push rod 3111 between the upper left hinge point 3301 position and the lower left hinge point 3401 position, and the first distance of the right push rod 3112 between the upper right hinge point 3302 position and the lower right hinge point 3402 position.
S503, a left push rod 3111 second distance between the left upper hinge point 3301 desired position and the left lower hinge point 3401 desired position of the left push rod 3111 of the left elevating mechanism 3101, and a right push rod 3112 second distance between the right upper hinge point 3302 desired position and the right lower hinge point 3402 desired position of the right push rod 3112 of the right elevating mechanism 3102 are calculated.
In the present embodiment, the robot 10 calculates the second distance of the left push rod 3111 between the desired position of the left upper hinge point 3301 and the desired position of the left lower hinge point 3401 of the left push rod 3111 after calculating the desired position of the left upper hinge point 3301 and the desired position of the left lower hinge point 3401 of the left elevating mechanism 3101. After calculating the desired position of the upper right hinge point 3302 and the desired position of the lower right hinge point 3402 of the right push rod 3112 of the right elevating mechanism 3102, the robot 10 calculates the second distance of the right push rod 3112 between the desired position of the upper right hinge point 3302 and the desired position of the lower right hinge point 3402 of the right push rod 3112.
It will be appreciated that the first distance of the left push rod 3111 is the distance the left push rod 3111 moves in the first control period, and the second distance of the left push rod 3111 is the desired distance for the left push rod 3111 to move in the second control period; in some embodiments, the first distance of the left push rod 3111 may be 0. Likewise, the first distance of the right push rod 3112 is the distance the right push rod 3112 moves in the first control period, and the second distance of the right push rod 3112 is the desired distance for the right push rod 3112 to move in the second control period; in some embodiments, the first distance of the right push rod 3112 may be 0. The second control period is a period of time after the first control period.
S504, subtracting the first distance of the left push rod 3111 from its second distance gives the desired movement distance of the left push rod 3111 of the left lifting mechanism 3101, and subtracting the first distance of the right push rod 3112 from its second distance gives the desired movement distance of the right push rod 3112 of the right lifting mechanism 3102.
In the present embodiment, after calculating the first and second distances of the left push rod 3111, the robot 10 subtracts the first distance from the second distance, i.e., subtracts the distance already moved in the first control period from the desired distance for the second control period, thereby obtaining the desired movement distance of the left push rod 3111 over the period from the first control period to the second control period. Likewise, after calculating the first and second distances of the right push rod 3112, the robot 10 subtracts the first distance from the second distance, thereby obtaining the desired movement distance of the right push rod 3112 over the same period.
For example, the robot 10 may calculate the desired movement distance of the left push rod 3111 of the left lifting mechanism 3101 by formula (1):
L_left = D_left2 − D_left1   (1)
where L_left is the desired movement distance of the left push rod 3111, D_left2 is the second distance of the left push rod 3111, and D_left1 is the first distance of the left push rod 3111.
The robot 10 may calculate the desired movement distance of the right push rod 3112 of the right lifting mechanism 3102 by formula (2):
L_right = D_right2 − D_right1   (2)
where L_right is the desired movement distance of the right push rod 3112, D_right2 is the second distance of the right push rod 3112, and D_right1 is the first distance of the right push rod 3112.
It will be appreciated that L_left, D_left2, D_left1, L_right, D_right2 and D_right1 are all Euclidean distances.
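In code, steps S502 to S504 reduce to Euclidean norms and a subtraction; a sketch:
```python
import numpy as np

def rod_desired_travel(upper_desired: np.ndarray, lower_fixed: np.ndarray,
                       first_distance: float) -> float:
    """Second distance minus first distance (formulas (1) and (2)): the desired
    hinge-to-hinge length minus the currently measured one, both Euclidean."""
    second_distance = float(np.linalg.norm(upper_desired - lower_fixed))
    return second_distance - first_distance

# Illustrative usage (names hypothetical):
# L_left = rod_desired_travel(left_hinge_target, LEFT_LOWER_HINGE, D_left1)
```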
Fig. 6 is a flow chart of substeps of step S306 shown in fig. 3 provided in an embodiment of the present application.
Referring to fig. 6, the robot 10 calculates a current posture of the head 100 of the robot 10 according to the detected rotation angle of the motor 320, the movement distance of the left push rod 3111 of the left elevating mechanism 3101, and the movement distance of the right push rod 3112 of the right elevating mechanism 3102, and may include the steps of:
s601, a deviation of the movement distance of the left push rod 3111 between the movement distance of the left push rod 3111 of the left elevating mechanism 3101 and the desired movement distance of the left push rod 3111, and a deviation of the movement distance of the right push rod 3112 between the movement distance of the right push rod 3112 of the right elevating mechanism 3102 and the desired movement distance of the right push rod 3112 are calculated.
In the present embodiment, the robot 10 calculates the deviation of the movement distance of the left push rod 3111 between the movement distance of the left push rod 3111 of the left elevating mechanism 3101 and the desired movement distance of the left push rod 3111 after detecting the movement distance of the left push rod 3111 of the left elevating mechanism 3101, so that the accuracy of the movement of the left push rod 3111 can be judged based on the deviation of the movement distance of the left push rod 3111. After detecting the movement distance of the right push rod 3112 of the right elevating mechanism 3102, the robot 10 calculates a deviation of the movement distance of the right push rod 3112 between the movement distance of the right push rod 3112 of the right elevating mechanism 3102 and the desired movement distance of the right push rod 3112, so that the accuracy of the movement of the right push rod 3112 can be judged based on the deviation of the movement distance of the right push rod 3112.
S602, it is determined whether the deviation of the movement distance of the left push rod 3111 and the deviation of the movement distance of the right push rod 3112 are both smaller than the accuracy threshold.
In the present embodiment, the robot 10 determines whether the deviation of the movement distance of the left push rod 3111 is smaller than the accuracy threshold after calculating the deviation of the movement distance of the left push rod 3111. After calculating the deviation of the movement distance of the right push rod 3112, the robot 10 determines whether the deviation of the movement distance of the right push rod 3112 is smaller than the accuracy threshold.
It will be appreciated that the accuracy threshold may be set as needed. Since the current pose of the head 100 of the robot 10 changes constantly while the head 100 moves from the current pose to the desired pose, the current pose of the neck 300 also changes constantly. Assuming the rotation angle of the motor 320 is fixed, the neck yaw angle y1 is also fixed, so the robot 10 can iteratively update the neck roll angle r1 and neck pitch angle p1 to update the current pose of the neck 300, and thus judge the accuracy of the movement of the left push rod 3111 and right push rod 3112 based on the current pose of the neck 300.
For example, the robot 10 may iteratively update the neck roll angle r1 and neck pitch angle p1 by formula (3):
x_new = x_new − J(x_new)^(−1) × (L_new − L_tar)   (3)
where x_new = [r1,new, p1,new] is the current pose of the neck 300, r1,new being the iteration variable of the neck roll angle r1 and p1,new the iteration variable of the neck pitch angle p1; J(x_new) is the Jacobian matrix from the neck roll angle r1 and neck pitch angle p1 to the push rod movement distances, and J(x_new)^(−1) is its inverse; L_new = [L_left0, L_right0] is the push rod movement distance, L_left0 being the movement distance of the left push rod 3111 and L_right0 that of the right push rod 3112; L_tar = [L_left,tar, L_right,tar] is the desired push rod movement distance, L_left,tar being the desired movement distance of the left push rod 3111 and L_right,tar that of the right push rod 3112.
It will be appreciated that when the neck roll angle r 1 And neck pitch angle p 1 When the number of iterations exceeds the upper limit of the number of iterations, the robot 10 may send out a message that the iteration update fails. When the neck roll angle r 1 And neck pitch angle p 1 When the number of iterations of (a) does not exceed the upper limit value of the number of iterations, the robot 10 updates the neck roll angle r each time 1 And neck pitch angle p 1 Then, judge left pushPrecision of the movement of the bar 3111 and the right push bar 3112. The upper limit of the iteration number can be set according to the requirement.
Specifically, after each update of the neck roll angle r_1 and the neck pitch angle p_1, if the deviation of the movement distance of the left push rod 3111 and the deviation of the movement distance of the right push rod 3112 are both smaller than the accuracy threshold, the robot 10 executes steps S603 to S605; if the deviation of the movement distance of the left push rod 3111 and/or the deviation of the movement distance of the right push rod 3112 is greater than or equal to the accuracy threshold, the process returns to step S601: the neck roll angle r_1 and the neck pitch angle p_1 are iteratively updated again, and the deviation between the movement distance of the left push rod 3111 of the left lifting mechanism 3101 and the desired movement distance of the left push rod 3111, as well as the deviation between the movement distance of the right push rod 3112 of the right lifting mechanism 3102 and the desired movement distance of the right push rod 3112, are recalculated.
In some embodiments, after obtaining the movement distance and the desired movement distance of the left push rod 3111 and of the right push rod 3112, the robot 10 calculates the two-norm of the difference between the push rod movement distance and the push rod desired movement distance, and determines whether this two-norm is smaller than the accuracy threshold.
For example, the robot 10 may determine whether the two-norm of the difference between the push rod movement distance and the push rod desired movement distance is smaller than the accuracy threshold by equation (4):
||L_new - L_tar||_2 < e   (4)

where L_new is the push rod movement distance, L_tar is the push rod desired movement distance, ||L_new - L_tar||_2 is the two-norm of the difference between the push rod movement distance and the push rod desired movement distance, and e is the accuracy threshold.
After each update of the neck roll angle r_1 and the neck pitch angle p_1, if the two-norm of the difference between the push rod movement distance and the push rod desired movement distance is smaller than the accuracy threshold, the robot 10 executes steps S603 to S605; if the two-norm is greater than or equal to the accuracy threshold, the robot 10 continues to iteratively update the neck roll angle r_1 and the neck pitch angle p_1, recalculates the movement distance and the desired movement distance of the left push rod 3111 and of the right push rod 3112, recalculates the two-norm of the difference between the push rod movement distance and the push rod desired movement distance, and again judges whether this two-norm is smaller than the accuracy threshold.
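Putting steps S601 and S602 together, the iteration of equations (3) and (4) can be sketched as follows. This is a minimal numpy illustration rather than the patent's implementation: forward_fn (the neck forward kinematics mapping [r_1, p_1] to push rod movement distances), the finite-difference Jacobian and all parameter values are assumptions made for the example.

```python
import numpy as np

def numerical_jacobian(f, x, h=1e-6):
    """Finite-difference Jacobian; stands in for the analytic J(x_new)."""
    f0 = np.asarray(f(x), dtype=float)
    J = np.zeros((f0.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = h
        J[:, i] = (np.asarray(f(x + dx), dtype=float) - f0) / h
    return J

def solve_neck_angles(x0, L_tar, forward_fn, e=1e-4, max_iter=50):
    """Iterate x_new = x_new - J(x_new)^(-1) (L_new - L_tar), equation (3),
    until ||L_new - L_tar||_2 < e, equation (4), or the iteration upper
    limit is exceeded, in which case the update is reported as failed."""
    x = np.asarray(x0, dtype=float)                # x = [r_1, p_1]
    for _ in range(max_iter):
        L_new = np.asarray(forward_fn(x), dtype=float)
        if np.linalg.norm(L_new - L_tar) < e:      # accuracy reached
            return x
        J = numerical_jacobian(forward_fn, x)      # 2x2 Jacobian wrt (r_1, p_1)
        x = x - np.linalg.solve(J, L_new - L_tar)  # Newton-style update
    raise RuntimeError("iterative update failed: iteration limit exceeded")
```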
S603, converting the neck roll angle r_1, the neck pitch angle p_1 and the neck yaw angle y_1 into a homogeneous coordinate transformation matrix of the neck pose.
In the present embodiment, after each update of the neck roll angle r_1 and the neck pitch angle p_1, when the deviation of the movement distance of the left push rod 3111 and the deviation of the movement distance of the right push rod 3112 are both smaller than the accuracy threshold, or when the two-norm of the difference between the push rod movement distance and the push rod desired movement distance is smaller than the accuracy threshold, the robot 10 may convert the neck roll angle r_1, the neck pitch angle p_1 and the neck yaw angle y_1 into a homogeneous coordinate transformation matrix of the neck pose, so that the neck pose can then be converted between coordinate systems.
For example, the robot 10 may convert the neck roll angle r_1, the neck pitch angle p_1 and the neck yaw angle y_1 into the following homogeneous coordinate transformation matrix of the neck pose:

[The 4x4 matrix T_neck1 appears as an image in the original document; it combines the rotation determined by r_1, p_1 and y_1 with the translation (x_n, y_n, z_n).]

where T_neck1 is the homogeneous coordinate transformation matrix of the neck pose; sr_1 and cr_1 are the sine and cosine of the neck roll angle r_1; sp_1 and cp_1 are the sine and cosine of the neck pitch angle p_1; sy_1 and cy_1 are the sine and cosine of the neck yaw angle y_1; and (x_n, y_n, z_n) are the coordinates of a reference point of the head 100 transformed from the neck coordinate system to the base coordinate system.
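As a concrete illustration of S603, the following sketch builds such a 4x4 homogeneous transform from the three neck angles and the translation (x_n, y_n, z_n). The Z-Y-X (yaw-pitch-roll) rotation order is an assumption made for the example, since the matrix itself appears only as an image in the original document.

```python
import numpy as np

def rpy_to_homogeneous(r, p, y, t):
    """4x4 homogeneous transform from roll r, pitch p, yaw y (radians)
    and translation t = (x_n, y_n, z_n); assumes R = Rz(y) @ Ry(p) @ Rx(r)."""
    sr, cr = np.sin(r), np.cos(r)
    sp, cp = np.sin(p), np.cos(p)
    sy, cy = np.sin(y), np.cos(y)
    T = np.eye(4)
    T[:3, :3] = [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]
    T[:3, 3] = t
    return T
```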
S604, the homogeneous coordinate transformation matrix of the neck pose is transformed into the homogeneous coordinate transformation matrix of the head pose by coordinate transformation.
In the present embodiment, after obtaining the homogeneous coordinate transformation matrix of the neck pose, the robot 10 converts the neck pose in the neck coordinate system into the head pose in the head coordinate system according to the coordinate transformation relationship between the neck coordinate system and the head coordinate system.
For example, the robot 10 may convert the homogeneous coordinate transformation matrix of the neck pose into the homogeneous coordinate transformation matrix of the head pose by matrix transformation:
T_head1 = T_neck1 × T_necktohead

where T_head1 is the homogeneous coordinate transformation matrix of the head pose, T_neck1 is the homogeneous coordinate transformation matrix of the neck pose, and T_necktohead is the coordinate transformation matrix from the neck coordinate system to the head coordinate system.
S605, converting the homogeneous coordinate transformation matrix of the head pose into a head roll angle r_2, a head pitch angle p_2 and a head yaw angle y_2 to obtain the head pose of the robot 10.
In the present embodiment, after obtaining the homogeneous coordinate transformation matrix of the head pose, the robot 10 can convert it into the head roll angle r_2, the head pitch angle p_2 and the head yaw angle y_2 by the inverse transformation of the homogeneous coordinate transformation matrix, thereby obtaining the head pose of the robot 10.
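Continuing the sketch above, and under the same assumed Z-Y-X rotation order, steps S604 and S605 amount to one matrix product followed by the inverse conversion back to angles. The function rpy_to_homogeneous is the one defined in the previous sketch, and T_necktohead below is a placeholder for the robot's actual neck-to-head geometry.

```python
import numpy as np  # also requires rpy_to_homogeneous from the sketch above

def homogeneous_to_rpy(T):
    """Recover (roll, pitch, yaw) from a 4x4 homogeneous transform,
    inverting the Z-Y-X composition assumed above; the gimbal-lock
    case |cos(pitch)| ~ 0 is ignored for brevity."""
    R = T[:3, :3]
    pitch = np.arctan2(-R[2, 0], np.hypot(R[0, 0], R[1, 0]))
    roll = np.arctan2(R[2, 1], R[2, 2])
    yaw = np.arctan2(R[1, 0], R[0, 0])
    return roll, pitch, yaw

# Usage mirroring S603-S605, with illustrative values:
r1, p1, y1 = 0.05, -0.10, 0.30                        # radians
T_neck1 = rpy_to_homogeneous(r1, p1, y1, (0.0, 0.0, 0.25))
T_necktohead = np.eye(4)                              # placeholder geometry
T_head1 = T_neck1 @ T_necktohead                      # S604
r2, p2, y2 = homogeneous_to_rpy(T_head1)              # S605: head pose angles
```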
The control terminal of the embodiment of the present application is described below.
Fig. 7 is a block diagram of a control terminal 20 according to an embodiment of the present application.
The control terminal 20 may be communicatively connected to the robot 10 to enable manipulation of the robot 10. Referring to fig. 7, the control terminal 20 includes a processor 21 and a memory 22, and the processor 21 may run a computer program or code stored in the memory 22 to implement the three degrees of freedom control method for the head pose of the robot according to the embodiment of the present application.
In the present embodiment, the control terminal 20 may, through the processor 21, calculate a desired rotation angle of at least one motor 320 of the neck 300 of the robot 10 and a desired movement distance of the push rods 311 of the at least two lifting mechanisms 310 in response to the head pose control instruction, then generate a motor driving instruction and a lifting mechanism driving instruction, and then send both instructions to the robot 10, thereby driving the at least one motor 320 of the neck 300 of the robot 10 through the motor driving instruction and driving the at least two lifting mechanisms 310 of the neck 300 of the robot 10 through the lifting mechanism driving instruction.
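The terminal-side pipeline just described (solve for actuator targets, package them as driving instructions, send them to the robot) might be organized as in the following sketch; the command classes and the solver and link objects are hypothetical, since the patent does not define a concrete message format or transport.

```python
from dataclasses import dataclass

@dataclass
class MotorCommand:
    desired_angle: float        # desired rotation angle of the motor

@dataclass
class LifterCommand:
    left_distance: float        # desired movement of the left push rod
    right_distance: float       # desired movement of the right push rod

def handle_head_pose_instruction(pose, solver, link):
    """Compute actuator targets for the desired head pose, then send
    the resulting driving instructions to the robot over `link`."""
    desired_angle, (l_left, l_right) = solver(pose)   # result of S601-S605
    link.send(MotorCommand(desired_angle))
    link.send(LifterCommand(l_left, l_right))
```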
The processor 21 may include one or more processing units. For example, the processor 21 may include, but is not limited to, an application processor (Application Processor, AP), a modem processor, a graphics processor (Graphics Processing Unit, GPU), an image signal processor (Image Signal Processor, ISP), a controller, a video codec, a digital signal processor (Digital Signal Processor, DSP), a baseband processor, a Neural-Network Processor (NPU), and the like. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
A memory may also be provided in the processor 21 for storing instructions and data. In some embodiments, the memory in the processor 21 is a cache. This memory may hold instructions or data that the processor 21 has just used or uses cyclically. If the processor 21 needs to use the instructions or data again, it can call them directly from this memory.
In some embodiments, the processor 21 may include one or more interfaces. The interfaces may include, but are not limited to, integrated circuit (Inter-Integrated Circuit, I2C) interfaces, integrated circuit built-in audio (Inter-Integrated Circuit Sound, I2S) interfaces, pulse code modulation (Pulse Code Modulation, PCM) interfaces, universal asynchronous receiver Transmitter (Universal Asynchronous Receiver/Transmitter, UART) interfaces, mobile industry processor interfaces (Mobile Industry Processor Interface, MIPI), general-Purpose Input/Output (GPIO) interfaces, subscriber identity module (Subscriber Identity Module, SIM) interfaces, universal serial bus (Universal Serial Bus, USB) interfaces, and the like.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is merely illustrative and does not limit the structure of the robot 10. In other embodiments, the robot 10 may also adopt an interfacing manner different from that of the above embodiments, or a combination of multiple interfacing manners.
The memory 22 may include an external memory interface and an internal memory. The external memory interface may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the robot 10. The external memory card communicates with the processor 21 through the external memory interface to implement a data storage function. The internal memory may be used to store computer-executable program code, which includes instructions. The internal memory may include a program storage area and a data storage area. The program storage area may store the operating system and an application program required for at least one function (e.g., a sound playing function, an image playing function), and the like. The data storage area may store data (e.g., audio data, a phonebook) created during use of the robot 10, and the like. In addition, the internal memory may include high-speed random access memory, and may also include non-volatile memory, such as at least one disk storage device, a flash memory device, or a universal flash storage (Universal Flash Storage, UFS). The processor 21 performs the various functional applications and data processing of the robot 10 by executing instructions stored in the internal memory and/or instructions stored in the memory provided in the processor 21, for example implementing the robot head pose three-degree-of-freedom control method of the embodiments of the present application.
It is to be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the robot 10 or the control terminal 20. In other embodiments, the robot 10 or control terminal 20 may include more or fewer components than shown, or certain components may be combined, certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
For example, referring to fig. 8 and fig. 9 together, fig. 8 is a block diagram of a multi-legged robot according to an embodiment of the present application, and fig. 9 is a schematic diagram of a scenario in which the control terminal 20 according to an embodiment of the present application controls the multi-legged robot 30.
As shown in fig. 8, the multi-legged robot 30 includes a mechanical unit 301, a communication unit 302, a sensing unit 303, an interface unit 304, a storage unit 305, a display unit 306, an input unit 307, a control module 308, and a power supply 309. The various components of the multi-legged robot 30 can be connected in any manner, including wired or wireless connections, and the like.
It will be appreciated that the particular configuration of the multi-legged robot shown in fig. 8 does not limit the multi-legged robot; the multi-legged robot may include more or fewer components than shown, certain components that are not essential to the multi-legged robot may be omitted entirely, and certain components may be combined as needed, without changing the essence of the application.
The various components of the multi-legged robot 30 are described in detail below in conjunction with fig. 8 and 9.
The mechanical unit 301 is hardware of the multi-legged robot 30. As shown in fig. 8, the mechanical unit 301 may include a drive plate 3011, a motor 3012, and a mechanical structure 3013.
As shown in fig. 9, the mechanical structure 3013 may include a body 3014, a rotatable head structure 3017, a neck structure (not shown) connecting the body 3014 and the rotatable head structure 3017, extendable legs 3015, foot ends 3016, a swingable tail structure 3018, a carrying structure 3019, a saddle structure 3020 and a camera structure 3021. In other embodiments, the mechanical structure 3013 may further include an extendable robotic arm (not shown), and the like.
It should be noted that the number of each component module of the mechanical unit 301 may be one or more and may be set according to the actual situation; for example, the number of legs 3015 may be 4, with 3 motors 3012 disposed on each leg 3015, so that the number of corresponding motors 3012 is 12.
The communication unit 302 may be used for receiving and transmitting signals, and may also be used for communicating with a network and other devices, for example receiving command information sent by a remote controller or another multi-legged robot 30 to move in a specific direction at a specific speed value according to a specific gait, and then transmitting the command information to the control module 308 for processing. The communication unit 302 includes, for example, a WiFi module, a 4G module, a 5G module, a Bluetooth module, an infrared module, and the like.
The sensing unit 303 is used for acquiring information data about the environment surrounding the multi-legged robot 30 and monitoring parameter data of the components inside the multi-legged robot 30, and sends these data to the control module 308. The sensing unit 303 includes sensors that acquire information about the surroundings: lidar (for long-range object detection, distance determination and/or speed value determination), millimeter-wave radar (for short-range object detection, distance determination and/or speed value determination), cameras, infrared cameras, a global navigation satellite system (GNSS, Global Navigation Satellite System), and the like. It also includes sensors that monitor the components inside the multi-legged robot 30: an inertial measurement unit (IMU, Inertial Measurement Unit) (for measuring velocity, acceleration and angular velocity values), plantar sensors (for monitoring the plantar force point position, plantar posture, and the magnitude and direction of the touchdown force), temperature sensors (for detecting component temperatures), and the like. Other sensors that may additionally be configured for the multi-legged robot 30, such as load sensors, touch sensors, motor angle sensors and torque sensors, are not described in detail here.
The interface unit 304 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more components within the multi-legged robot 30, or may be used to output (e.g., data information, power, etc.) to an external device. The interface unit 304 may include a power port, a data port (e.g., a USB port), a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, and the like.
The storage unit 305 is used to store software programs and various data. The storage unit 305 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system program, a motion control program, an application program (such as a text editor), and the like; the data storage area may store data generated by the multi-legged robot 30 in use (such as various sensed data acquired by the sensing unit 303 and log file data), and the like. In addition, the storage unit 305 may include high-speed random access memory, and may also include non-volatile memory, such as disk memory, flash memory, or other non-volatile solid-state memory.
The display unit 306 is used to display information input by a user or information provided to the user. The display unit 306 may include a display panel 3061, and the display panel 3061 may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an Organic Light-Emitting Diode (OLED), or the like.
The input unit 307 may be used to receive input numeric or character information. In particular, the input unit 307 may include a touch panel 3071 and other input devices 3072. The touch panel 3071, also referred to as a touch screen, may collect touch operations of a user (e.g., operations of the user on the touch panel 3071 or in the vicinity of the touch panel 3071 using a palm, a finger, or a suitable accessory), and drive the corresponding connection device according to a preset program. The touch panel 3071 may include two parts, a touch detection device 3073 and a touch controller 3074. Wherein, the touch detection device 3073 detects the touch orientation of the user, and detects a signal caused by the touch operation, and transmits the signal to the touch controller 3074; touch controller 3074 receives touch information from touch sensing device 3073 and converts it to touch point coordinates, which are then sent to control module 308, and can receive commands from control module 308 and execute them. The input unit 307 may include other input devices 3072 in addition to the touch panel 3071. In particular, other input devices 3072 may include, but are not limited to, one or more of a remote operated handle, etc., and are not limited herein in particular.
Further, the touch panel 3071 may overlay the display panel 3061, and when the touch panel 3071 detects a touch operation thereon or thereabout, the touch operation is transferred to the control module 308 to determine a type of touch event, and then the control module 308 provides a corresponding visual output on the display panel 3061 according to the type of touch event. Although in fig. 8, the touch panel 3071 and the display panel 3061 are implemented as two separate components to implement the input and output functions, in some embodiments, the touch panel 3071 and the display panel 3061 may be integrated to implement the input and output functions, which is not limited herein.
The control module 308 is a control center of the multi-legged robot 30, connects the respective components of the entire multi-legged robot 30 using various interfaces and lines, and performs overall control of the multi-legged robot 30 by running or executing a software program stored in the storage unit 305, and calling data stored in the storage unit 305.
The power supply 309 is used to power the various components, and the power supply 309 may include a battery and a power control board for controlling battery charging, discharging, and power consumption management functions. In the embodiment shown in fig. 8, the power supply 309 is electrically connected to the control module 308, and in other embodiments, the power supply 309 may be electrically connected to the sensing unit 303 (such as a camera, a radar, a speaker, etc.), and the motor 3012, respectively. It should be noted that each component may be connected to a different power source 309 or may be powered by the same power source 309.
In the scenario shown in fig. 9, the control terminal 20 is communicatively connected to the multi-legged robot 30. When the control terminal 20 communicates with the multi-legged robot 30, it may send a control instruction to the multi-legged robot 30; the multi-legged robot 30 receives the control instruction through the communication unit 302 and, upon receiving it, transmits the control instruction to the control module 308, so that the control module 308 can implement the corresponding function according to the control instruction. The control terminal 20 includes, but is not limited to: a mobile phone, a tablet computer, a server, a personal computer, a wearable smart device and other electronic equipment with an image capture function.
The control instruction may be determined according to a preset condition. In one embodiment, the multi-legged robot 30 may include a sensing unit 303, and the sensing unit 303 may generate control instructions according to the current environment in which the multi-legged robot 30 is located. The control module 308 can determine whether the current speed value of the multi-legged robot 30 satisfies the corresponding preset condition according to the control instruction. If the preset condition is met, the current speed value and current gait movement of the multi-legged robot 30 will be maintained. If the preset condition is not satisfied, the target speed value and the corresponding target gait are determined according to the corresponding preset condition, so that the multi-legged robot 30 can be controlled to move at the target speed value and the corresponding target gait. The communication mode between the sensing unit 303 and the control module 308 may be wired communication or wireless communication. Means of wireless communication include, but are not limited to: wireless networks, mobile communication networks (3G, 4G, 5G, etc.), bluetooth, infrared.
It can be appreciated that the multi-legged robot 30 can implement all the method steps of the three-degree-of-freedom control method for the head gesture of the robot provided in the embodiments of the present application, and the same method steps and beneficial effects will not be described in detail herein.
The embodiments of the present application have been described in detail above with reference to the accompanying drawings, but the present application is not limited to the above embodiments, and various changes can be made within the knowledge of one of ordinary skill in the art without departing from the spirit of the present application.

Claims (11)

1. A method for controlling three degrees of freedom of a robot head pose, wherein the robot comprises a head, a body and a neck connecting the head and the body, the neck comprises at least two lifting mechanisms and at least one motor, the at least two lifting mechanisms are connected in parallel, and the at least one motor is connected with push rods of the at least two lifting mechanisms through at least one connecting piece, the method comprising:
calculating a desired rotation angle of at least one motor and a desired movement distance of the push rod of each of at least two lifting mechanisms in response to a head pose control instruction;
acquiring a motor driving instruction according to the desired rotation angle of the at least one motor;
acquiring a lifting mechanism driving instruction according to the desired movement distance of the push rods of the at least two lifting mechanisms;
driving at least one of the motors in response to the motor driving instruction; and
driving at least two of the lifting mechanisms in response to the lifting mechanism driving instruction so as to control the head of the robot to move from the current pose to the desired pose.
2. The method for controlling three degrees of freedom of a head pose of a robot according to claim 1, wherein after driving at least one of the motors in response to the motor driving instruction and driving at least two of the lifting mechanisms in response to the lifting mechanism driving instruction to control the head of the robot to move from the current pose to the desired pose, the method further comprises:
detecting the rotation angle of at least one motor and the respective push rod movement distance of at least two lifting mechanisms;
calculating the current pose of the head of the robot according to the detected rotation angle of the at least one motor and the respective push rod movement distances of the at least two lifting mechanisms; and
determining whether the current pose of the head of the robot reaches the desired pose.
3. The method for controlling three degrees of freedom of a head pose of a robot according to claim 1, wherein the calculating a desired rotation angle of at least one of the motors and a desired movement distance of the respective push rods of at least two of the lifting mechanisms in response to the head pose control instruction comprises:
acquiring a desired head pose of the robot according to the head pose control instruction, wherein the desired head pose comprises a head roll angle, a head pitch angle and a head yaw angle;
converting the head roll angle, the head pitch angle and the head yaw angle into a homogeneous coordinate transformation matrix of the desired head pose;
converting the homogeneous coordinate transformation matrix of the desired head pose into a first homogeneous coordinate transformation matrix of the desired neck pose through coordinate transformation;
converting the first homogeneous coordinate transformation matrix of the desired neck pose into a neck roll angle, a neck pitch angle and a neck yaw angle, wherein the neck yaw angle is the desired rotation angle of the at least one motor;
converting the neck roll angle and the neck pitch angle into a second homogeneous coordinate transformation matrix of the desired neck pose;
and calculating the desired movement distance of the push rod of each of the at least two lifting mechanisms according to the second homogeneous coordinate transformation matrix of the desired neck pose.
4. The method for controlling three degrees of freedom of a head pose of a robot according to claim 3, wherein the calculating the desired movement distance of the push rod of each of the at least two lifting mechanisms according to the second homogeneous coordinate transformation matrix of the desired neck pose comprises:
calculating the desired positions of the upper hinge point and the lower hinge point of the push rod of each lifting mechanism according to the second homogeneous coordinate transformation matrix of the desired neck pose;
detecting a first push rod distance between the upper hinge point and the lower hinge point of the push rod of each lifting mechanism before movement;
calculating a second push rod distance between the upper hinge point and the lower hinge point of the push rod of each lifting mechanism after movement according to the desired positions of the upper hinge point and the lower hinge point of the push rod;
and subtracting the second push rod distance from the corresponding first push rod distance to obtain the desired movement distance of the push rod of each of the at least two lifting mechanisms.
5. The method for controlling three degrees of freedom of a head pose of a robot according to claim 3, wherein after calculating the desired movement distance of the push rod of each of the at least two lifting mechanisms according to the second homogeneous coordinate transformation matrix of the desired neck pose, the method further comprises:
detecting the push rod movement distance of each lifting mechanism;
calculating the deviation between the push rod movement distance of each lifting mechanism and the desired push rod movement distance;
determining whether the deviation is smaller than an accuracy threshold;
if the deviation is smaller than the accuracy threshold, converting the neck roll angle, the neck pitch angle and the neck yaw angle into a homogeneous coordinate transformation matrix of the neck pose; converting the homogeneous coordinate transformation matrix of the neck pose into the homogeneous coordinate transformation matrix of the head pose through coordinate transformation; and converting the homogeneous coordinate transformation matrix of the head pose into a head roll angle, a head pitch angle and a head yaw angle to obtain the head pose of the robot;
if the deviation is greater than or equal to the accuracy threshold, detecting the push rod movement distance of each lifting mechanism again; calculating the deviation between the push rod movement distance of each lifting mechanism and the desired push rod movement distance; and determining whether the deviation is smaller than the accuracy threshold.
6. A robot comprising a processor, a head, a body and a neck connecting the head and the body, characterized in that the neck comprises at least two lifting mechanisms and at least one motor, wherein at least two lifting mechanisms are connected in parallel, and at least one motor is connected with push rods of at least two lifting mechanisms through at least one connecting piece; the robot is configured to:
calculate, by the processor, a desired rotation angle of at least one of the motors and a desired movement distance of the respective push rods of at least two of the lifting mechanisms in response to a head pose control instruction;
acquire a motor driving instruction according to the desired rotation angle of the at least one motor;
acquire a lifting mechanism driving instruction according to the desired movement distance of the push rods of the at least two lifting mechanisms;
drive at least one of the motors in response to the motor driving instruction; and
drive at least two of the lifting mechanisms in response to the lifting mechanism driving instruction so as to control the head of the robot to move from the current pose to the desired pose.
7. The robot of claim 6, wherein after driving, by the processor, at least one of the motors in response to the motor driving instruction and driving at least two of the lifting mechanisms in response to the lifting mechanism driving instruction to control the head of the robot to move from the current pose to the desired pose, the robot is further configured to:
detect, through a sensor, the rotation angle of the at least one motor and the respective push rod movement distances of the at least two lifting mechanisms;
calculate, by the processor, the current pose of the head of the robot according to the detected rotation angle of the at least one motor and the respective push rod movement distances of the at least two lifting mechanisms; and
determine whether the current pose of the head of the robot reaches the desired pose.
8. The robot of claim 6, wherein calculating, by the processor in response to the head pose control instruction, the desired rotation angle of at least one of the motors and the desired movement distance of the respective push rods of at least two of the lifting mechanisms comprises:
acquiring, by the processor, a desired head pose of the robot according to the head pose control instruction, the desired head pose including a head roll angle, a head pitch angle and a head yaw angle;
converting the head roll angle, the head pitch angle and the head yaw angle into a homogeneous coordinate transformation matrix of the desired head pose;
converting the homogeneous coordinate transformation matrix of the desired head pose into a first homogeneous coordinate transformation matrix of the desired neck pose through coordinate transformation;
converting the first homogeneous coordinate transformation matrix of the desired neck pose into a neck roll angle, a neck pitch angle and a neck yaw angle, wherein the neck yaw angle is the desired rotation angle of the at least one motor;
converting the neck roll angle and the neck pitch angle into a second homogeneous coordinate transformation matrix of the desired neck pose;
and calculating the desired movement distance of the push rod of each of the at least two lifting mechanisms according to the second homogeneous coordinate transformation matrix of the desired neck pose.
9. The robot of claim 8, wherein calculating, by the processor, the desired movement distance of the push rod of each of the at least two lifting mechanisms according to the second homogeneous coordinate transformation matrix of the desired neck pose comprises:
calculating, by the processor, the desired positions of the upper hinge point and the lower hinge point of the push rod of each lifting mechanism according to the second homogeneous coordinate transformation matrix of the desired neck pose;
detecting, through a sensor, a first push rod distance between the upper hinge point and the lower hinge point of the push rod of each lifting mechanism before movement;
calculating, by the processor, a second push rod distance between the upper hinge point and the lower hinge point of the push rod of each lifting mechanism after movement according to the desired positions of the upper hinge point and the lower hinge point of the push rod;
and subtracting the second push rod distance from the corresponding first push rod distance to obtain the desired movement distance of the push rod of each of the at least two lifting mechanisms.
10. The robot of claim 8, wherein after calculating, by the processor, the desired movement distance of the push rod of each of the at least two lifting mechanisms according to the second homogeneous coordinate transformation matrix of the desired neck pose, the robot is further configured to:
detect the push rod movement distance of each lifting mechanism through a sensor;
calculate, by the processor, the deviation between the push rod movement distance of each lifting mechanism and the desired push rod movement distance;
determine whether the deviation is smaller than an accuracy threshold;
if the deviation is smaller than the accuracy threshold, convert, by the processor, the neck roll angle, the neck pitch angle and the neck yaw angle into a homogeneous coordinate transformation matrix of the neck pose;
convert the homogeneous coordinate transformation matrix of the neck pose into the homogeneous coordinate transformation matrix of the head pose through coordinate transformation; and
convert the homogeneous coordinate transformation matrix of the head pose into a head roll angle, a head pitch angle and a head yaw angle to obtain the head pose of the robot;
if the deviation is greater than or equal to the accuracy threshold, detect the push rod movement distance of each lifting mechanism again through the sensor;
calculate, by the processor, the deviation between the push rod movement distance of each lifting mechanism and the desired push rod movement distance; and
determine whether the deviation is smaller than the accuracy threshold.
11. A control terminal of a robot, the control terminal comprising a processor and being communicatively connected to the robot, wherein the robot comprises a head, a body and a neck connecting the head and the body, the neck comprises at least two lifting mechanisms and at least one motor, the at least two lifting mechanisms are connected in parallel, and the at least one motor is connected with push rods of the at least two lifting mechanisms through at least one connecting piece; the control terminal is configured to:
calculate, by the processor, a desired rotation angle of at least one of the motors of the neck of the robot and a desired movement distance of the respective push rods of at least two of the lifting mechanisms in response to a head pose control instruction;
acquire a motor driving instruction according to the desired rotation angle of the at least one motor, wherein the motor driving instruction is used for driving the at least one motor;
acquire a lifting mechanism driving instruction according to the desired movement distance of the push rods of the at least two lifting mechanisms, wherein the lifting mechanism driving instruction is used for driving the at least two lifting mechanisms;
and send the motor driving instruction and the lifting mechanism driving instruction to the robot so as to control the head of the robot to move from the current pose to the desired pose.
CN202211644467.0A 2022-12-20 2022-12-20 Three-degree-of-freedom control method for head gesture of robot and related equipment Pending CN115990878A (en)


Publications (1)

Publication Number Publication Date
CN115990878A true CN115990878A (en) 2023-04-21


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination