CN113910232A - Self-adaptive attitude tracking method and device, storage medium and electronic equipment - Google Patents

Self-adaptive attitude tracking method and device, storage medium and electronic equipment

Info

Publication number
CN113910232A
Authority
CN
China
Prior art keywords
robot
interpolation
posture
controller
current
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111252255.3A
Other languages
Chinese (zh)
Other versions
CN113910232B (en)
Inventor
谢胜文
王珂
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Elite Robot Co Ltd
Original Assignee
Suzhou Elite Robot Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Elite Robot Co Ltd filed Critical Suzhou Elite Robot Co Ltd
Priority to CN202111252255.3A priority Critical patent/CN113910232B/en
Publication of CN113910232A publication Critical patent/CN113910232A/en
Application granted granted Critical
Publication of CN113910232B publication Critical patent/CN113910232B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J18/00Arms
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • B25J11/005Manipulators for mechanical processing tasks
    • B25J11/0065Polishing or grinding
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • B25J9/161Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1661Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a self-adaptive attitude tracking method and device, a storage medium and electronic equipment. The method is applied to a robot and comprises the following steps: fusing at least two preset controllers to generate a hybrid controller; determining a position interpolation quantity and an initial posture interpolation quantity of the robot according to the hybrid controller, and clearing the initial posture interpolation quantity; calculating the posture interpolation quantity of the robot that adaptively changes according to the current posture; acquiring the joint space interpolation quantity of the robot according to the position interpolation quantity and the posture interpolation quantity; and interpolating the current pose of the robot according to the joint space interpolation quantity, and controlling the robot to operate based on the interpolated pose. This scheme simplifies operation of the robot and improves the robot's ability to respond to complex working conditions.

Description

Self-adaptive attitude tracking method and device, storage medium and electronic equipment
Technical Field
The invention belongs to the field of industrial robots, and particularly relates to a self-adaptive posture tracking method and device, a storage medium and electronic equipment.
Background
China is a large manufacturing country. As the demographic dividend declines, the traditional labor-intensive production mode is difficult to sustain, replacing manual labor with machines is imperative, and enterprises are developing mainly toward upgrading and retrofitting automated production. The industrial robot field includes traditional industrial robots and collaborative robots. Traditional industrial robots replace manual operation in industrial environments, while the newer collaborative robots are mainly used to optimize production line layout and facilitate human-machine collaboration; the working scenarios of collaborative robots place higher requirements on properties such as safety and portability.
A robot is configured with different controllers to implement different control functions, such as position-based control or force-based control. In the prior art, however, the control function of each controller is relatively single and the controllers are usually used independently, so the robot mainly handles simple scenes that a single controller can cover; a complex working scene cannot be handled directly by a single controller. In the prior art, complex working conditions are usually handled by manual teaching or software programming, but the precision of manual teaching is limited, while software programming is relatively complex to implement, places high demands on the user's programming expertise, and cannot be flexibly adjusted to the application scenario.
Disclosure of Invention
The application provides a self-adaptive attitude tracking method and device, a storage medium and electronic equipment. At least two controllers are fused and an overall fused interpolation quantity is calculated to realize a hybrid controller, so that force tracking and attitude tracking are achieved at the same time, which solves the prior-art problem of having to realize attitude tracking through manual programming or teaching.
In order to achieve the above object, the present invention can adopt the following technical solutions: an adaptive pose tracking method applied to a robot for connecting a tool to perform a predetermined operation on a work object, the method comprising: fusing at least two preset controllers to generate a hybrid controller; determining a position interpolation quantity and an initial posture interpolation quantity of the robot according to the hybrid controller, and clearing the initial posture interpolation quantity; calculating the attitude interpolation amount of the robot which adaptively changes according to the current attitude; acquiring joint space interpolation quantity of the robot according to the position interpolation quantity and the posture interpolation quantity; and interpolating the current pose of the robot according to the joint space interpolation quantity, and controlling the robot to operate based on the interpolated pose.
Further, the calculating a posture interpolation amount of the robot adaptively changing according to the current posture comprises: setting a target posture of the robot, wherein the target posture comprises a target operation angle of the tool on a working object; acquiring a current posture of the robot, wherein the current posture comprises a current operation angle of the tool on a working object; and calculating the posture interpolation quantity of the robot changing from the current posture to the target posture according to the target operation angle, the current operation angle and a pre-designed angle tracking controller.
Further, the acquiring the current posture of the robot includes: and acquiring an included angle between the TCP attitude of the robot and the tangent vector of the TCP track.
Further, the angle tracking controller is a PID controller.
Further, before calculating the pose interpolation amount of the robot changing from the current pose to the target pose, the method further includes: and carrying out filtering processing on the current operation angle according to a predetermined filter.
Further, the at least two controllers include at least two of a position controller, a force controller, and an impedance controller.
Further, the fusing at least two preset controllers to generate a hybrid controller includes: obtaining the spatial interpolation quantities of at least two controllers; and determining a fusion interpolation amount according to the space interpolation amount of each controller to generate a hybrid controller.
The invention can also adopt the following technical scheme: an adaptive attitude tracking device applied to a robot comprises: the fusion unit is used for fusing at least two preset controllers to generate a hybrid controller; the determining unit is used for determining the position interpolation quantity and the initial posture interpolation quantity of the robot according to the hybrid controller and clearing the initial posture interpolation quantity; the computing unit is used for computing the attitude interpolation amount of the robot which adaptively changes according to the current attitude; an obtaining unit, configured to obtain a joint space interpolation amount of the robot according to the position interpolation amount and the posture interpolation amount; and the control unit is used for interpolating the current pose of the robot according to the joint space interpolation quantity and controlling the robot to operate based on the interpolated pose.
The invention can also adopt the following technical scheme: a computer readable storage medium storing a computer program which, when executed, implements an adaptive pose tracking method as described in any of the preceding.
The invention can also adopt the following technical scheme: an electronic device, comprising: a memory storing a computer program; a processor for executing the computer program in the memory to implement the adaptive tracking method of any of the preceding.
Compared with the prior art, the specific implementation mode of the invention at least has the following beneficial effects:
In the present application, the generated hybrid controller is used to obtain the position interpolation quantity of the robot, which is combined with the recalculated posture interpolation quantity to obtain the joint space interpolation quantity of the robot, and the robot is interpolated accordingly. The robot can thus adapt to position changes and force changes according to the hybrid controller, while posture adaptation is performed based on the recalculated posture interpolation quantity that adaptively changes with the robot's current posture. As a result, the robot can perform force tracking and adaptive posture tracking at the same time without manually setting movement instructions, and can handle complex scenes that require both force tracking and posture tracking, such as curved surface polishing. This simplifies the user's programming and teaching work and improves the robot's ability to handle complex working conditions.
Drawings
FIG. 1 is a schematic view of a robot according to one embodiment of the present invention;
FIG. 2 is a schematic diagram of an adaptive pose tracking method of one embodiment of the invention;
FIG. 3 is a schematic diagram of an adaptive pose tracking method according to another embodiment of the invention;
FIG. 4 is a schematic diagram of an adaptive pose tracking method according to yet another embodiment of the invention;
FIG. 5 is a flow chart of an adaptive tracking method of one embodiment of the present invention;
FIG. 6 is a schematic diagram of an adaptive tracking device according to one embodiment of the present invention;
FIG. 7 is a schematic view of an electronic device of one embodiment of the invention.
Detailed Description
In order to make the technical solution of the present invention more clear, embodiments of the present invention will be described below with reference to the accompanying drawings. It should be understood that the detailed description of the embodiments is intended only to teach one skilled in the art how to practice the invention, and is not intended to be exhaustive of all possible ways of practicing the invention, nor is it intended to limit the scope of the practice of the invention. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, shall fall within the scope of protection of the present invention.
The application provides an adaptive attitude tracking method applied to a robot. Referring to fig. 1, a robot 100 comprises a base 30, joints 20 and connecting pieces 40. The end of the robot is connected with a tool 300 to perform a predetermined operation on a working object. The joints 20 of the robot 100 are the power sources and comprise elements such as a motor, a driver and a reducer; sensors such as a force sensor and a visual sensor are optionally arranged at the end of the robot 100 to acquire the operation information the robot needs. The robot includes a control system, integrated with the robot or formed as a separate control component, for processing the robot's operation information and issuing instructions to control its operation.
The present application provides an adaptive posture tracking method, applied to a robot 100, which can be executed by the above control system. Referring to fig. 2, the adaptive posture tracking method includes the steps of:
s201, fusing at least two preset controllers to generate a hybrid controller;
s202, determining a position interpolation quantity and an initial posture interpolation quantity of the robot according to the hybrid controller, and clearing the initial posture interpolation quantity;
s203, calculating a posture interpolation amount of the robot which adaptively changes according to the current posture;
s204, acquiring joint space interpolation quantity of the robot according to the position interpolation quantity and the posture interpolation quantity;
and S205, interpolating the current pose of the robot according to the joint space interpolation quantity, and controlling the robot to operate based on the interpolated pose.
In a specific embodiment, referring to fig. 3, step S201 includes steps S2011 to S2012.
S2011, spatial interpolation quantities of at least two controllers are obtained;
and S2012, determining fusion interpolation amount according to the space interpolation amount of each controller to generate a hybrid controller.
The robot comprises a plurality of joints and correspondingly has a plurality of degrees of freedom; for example, a common six-joint robot has six degrees of freedom. When a plurality of controllers of the robot are fused, the fusion is performed degree of freedom by degree of freedom, and the per-degree-of-freedom results together form the fusion over all degrees of freedom of the robot. Specifically, step S2011 includes obtaining the spatial interpolation quantities of at least two controllers in a target degree of freedom; step S2012 includes determining the interpolation quantity of the target degree of freedom from the spatial interpolation quantity of each controller in that degree of freedom, and fusing the interpolation quantities over the robot's degrees of freedom to determine a fused interpolation quantity and generate the hybrid controller. In a given degree of freedom of the robot, only some controllers may contribute an interpolation quantity, or several controllers may contribute interpolation quantities. By generating the hybrid controller, the position interpolation quantity and the posture interpolation quantity of the robot can be generated comprehensively from the acquired variables. The spatial interpolation quantity is a Cartesian space interpolation quantity or a joint space interpolation quantity: in Cartesian space, the interpolation quantity of the robot can be expressed as coordinates in a Cartesian coordinate system, and the Cartesian space interpolation quantities can be fused and then converted into a joint space interpolation quantity; in joint space, the interpolation quantity can be expressed by joint angles and the controllers fused there.
For example, before the robot performs a job, the user makes a selection based on the job scenario: the user may preset the at least two controllers to be fused through a teach pendant or other interactive device, and the robot automatically fuses the functions of these controllers. Alternatively, the user sets the type of job to be performed through the teach pendant or by other means, and the robot determines which controllers need to be fused according to the job type; or the robot manufacturer recommends default controllers for common job scenarios during design and production, and after the user selects a job scenario the robot automatically sets the corresponding at least two controllers and fuses them to support the subsequent job.
The at least two controllers include at least two of a position controller, a force controller, and an impedance controller. Taking curved surface polishing as an example, the robot needs at least a position controller to control the movement of the robot position, and a force controller to ensure that the acting force follows the target force while the robot position moves. Combined with attitude adaptation, the acting force can be made to follow the target force during robot movement while the robot attitude adaptively follows the target attitude.
Different controllers of the robot have different functions: the position controller realizes position following of the robot, force control makes the force follow the target force while the robot moves, and the impedance controller gives the robot compliance. The spatial interpolation quantities are calculated for each controller separately in order to realize the hybrid controller. The following is an exemplary description of how the controller outputs are converted into Cartesian space interpolation quantities for controller fusion:
(1) force tracking controller
The TCP has 6 degrees of freedom in the coordinate system {f}: 3 position degrees of freedom and 3 attitude degrees of freedom. Let the force/moment to be tracked in the target degree of freedom be fd and the actual force be f; the force tracking error is then fe = fd − f, and the interpolated acceleration a in the target degree of freedom is

a = Kp·fe + Ki·∫fe dt + Kd·(dfe/dt)   (1)

where Kp, Ki and Kd are PID parameters. The actual Cartesian space interpolation quantity δxtr is then obtained by integrating this acceleration over the interpolation period.
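For illustration only, a minimal discrete-time sketch of such a force tracking PID is given below. It assumes a fixed control period dt and treats the PID output as an acceleration that is integrated to give the Cartesian interpolation increment; the class name, signatures and gains are illustrative and not taken from the patent.

```python
class ForceTrackingController:
    """Discrete PID on the force error f_e = f_d - f in one degree of freedom.

    The PID output is interpreted as an interpolated acceleration, which is
    integrated over the control period to yield the Cartesian space
    interpolation increment delta_x_tr (cf. equation (1))."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0      # running integral of the force error
        self.prev_error = 0.0    # previous force error, for the derivative term
        self.velocity = 0.0      # integrated interpolation velocity

    def step(self, f_target, f_actual):
        error = f_target - f_actual                       # f_e = f_d - f
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        accel = self.kp * error + self.ki * self.integral + self.kd * derivative
        self.velocity += accel * self.dt                  # first integration
        return self.velocity * self.dt                    # second integration: delta_x_tr
```

For instance, `ForceTrackingController(kp=0.002, ki=0.0005, kd=0.0, dt=0.004).step(10.0, 8.5)` would return the position increment for one 4 ms period while tracking a 10 N target force.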
(2) Impedance controller
Let x denote the value of the target degree of freedom of the TCP in the coordinate system {f}; it may be, for example, the position x, y or z, or an attitude variable. The Cartesian space interpolation quantity in the target degree of freedom is δx.

If the TCP is expected to behave as a mass-spring-damper model under the action of an external force, and the external force in the target degree of freedom is fext, then the Cartesian space interpolation quantity in the target degree of freedom satisfies

M·d²(δx)/dt² + B·d(δx)/dt + K·δx = fext   (2)

where M, B and K are the mass-spring-damper model parameters. During operation of the manipulator, fext is obtained from the end force/torque sensor, and the interpolation quantity is obtained by solving differential equation (2) with the Euler method.
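As a sketch only, the Euler integration of the impedance equation (2) can be written as follows; the class and parameter names are illustrative, and a real implementation would run this once per control period with the measured end force.

```python
class ImpedanceController:
    """Mass-spring-damper impedance model in one degree of freedom.

    Integrates M*x'' + B*x' + K*x = f_ext with an explicit Euler step per
    control period and returns the resulting deflection delta_x_im."""

    def __init__(self, mass, damping, stiffness, dt):
        self.m, self.b, self.k, self.dt = mass, damping, stiffness, dt
        self.x = 0.0    # deflection delta_x
        self.dx = 0.0   # deflection velocity

    def step(self, f_ext):
        ddx = (f_ext - self.b * self.dx - self.k * self.x) / self.m  # from equation (2)
        self.dx += ddx * self.dt
        self.x += self.dx * self.dt
        return self.x
```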
(3) Motion controller
In a possible implementation, motion planning can be performed in Cartesian space to directly output the Cartesian space interpolation quantity, or it can be performed in joint space and the Cartesian space interpolation quantity obtained through the forward kinematics solution. The Cartesian coordinate system used is the coordinate system {f} in which the fusion is performed.
In one possible implementation, assume that the interpolation quantities in this degree of freedom from the force tracking controller, the impedance controller, the motion controller and any other controllers are δxtr, δxim, δxmov and δxother respectively. The interpolation quantity in this degree of freedom after fusion is then:

δx = f1(δxtr) + f2(δxim) + f3(δxmov) + f4(δxother)   (3)

where f1–f4 are custom fusion functions. For example, a common simple fusion function f takes the value 0 or x (x ≠ 0): if 0, the corresponding controller does not act in that degree of freedom; if x, the corresponding controller acts in that degree of freedom.
Fusing different controllers has different physical meanings. For example, if f2 and f3 are active in the fusion (3), the TCP behaves in this degree of freedom as follows: if the manipulator is not subject to an external force it is under pure motion control, and if it is subject to an external force it is under both motion and impedance control, so that when the manipulator end meets an obstacle during motion it is protected and exhibits a certain compliance. It can be seen that fusing different controllers in a single degree of freedom ultimately produces completely different effects.
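A minimal sketch of the fusion in equation (3), using the simple gating fusion function described above (each f_i returns either 0 or the controller's own interpolation value); the enable flags are an assumed way of expressing which controllers act in the degree of freedom.

```python
def fuse_interpolations(deltas, enabled):
    """Equation (3): delta_x = f1(dx_tr) + f2(dx_im) + f3(dx_mov) + f4(dx_other),
    where each f_i is the simple gating function: 0 if the controller is
    disabled in this degree of freedom, otherwise its own interpolation value."""
    return sum(dx if on else 0.0 for dx, on in zip(deltas, enabled))

# Example: motion control fused with impedance control in one degree of freedom,
# with force tracking and other controllers switched off.
delta_x = fuse_interpolations(
    deltas=[0.0, 0.0008, 0.0015, 0.0],       # [dx_tr, dx_im, dx_mov, dx_other]
    enabled=[False, True, True, False],
)
```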
The robot's controllers can output interpolation quantities based on position or moment, and after the hybrid controller is generated the output is adjusted according to changes in the position and moment parameters, so that the robot performs force tracking and position tracking adaptively. However, if the robot needs to track its attitude, it cannot adaptively generate the attitude interpolation quantity from the acquired position information, because the attitude of the robot can change even when its position does not; attitude adaptation therefore cannot be realized on the basis of position control or force control alone. The hybrid controller can derive attitude information from the input position and force information, but that attitude information cannot change adaptively.
In the prior art, the user edits program instructions to exactly match the robot's movement position, moment, attitude information and so on; by providing a hybrid controller, the robot automatically outputs a force tracking signal according to predetermined requirements. Yet even when several controllers are fused, the attitude cannot be tracked adaptively by the hybrid controller, because the interpolation quantity it outputs is calculated from the robot's position and/or moment. When the robot's attitude changes, the hybrid controller cannot adaptively adjust it: the hybrid controller can calculate an attitude interpolation quantity, but it cannot cope with the attitude interpolation quantity needing to change adaptively as the attitude changes, and thus cannot realize adaptive tracking of the attitude. Therefore, the attitude information output by the hybrid controller is cleared, and the adaptively changing attitude interpolation quantity of the robot is calculated by a method different from the hybrid controller, so as to realize attitude tracking of the robot. The joint space interpolation quantity of the robot is then acquired from the calculated attitude interpolation quantity and the position interpolation quantity output by the hybrid controller, in order to interpolate the current pose of the robot. Specifically, the attitude interpolation quantity changes adaptively according to the current posture of the robot, and there are various ways of obtaining it.
For example, in a specific embodiment, referring to fig. 4, step S203 includes:
s2031, setting a target posture of the robot, wherein the target posture comprises a target operation angle of the tool on a working object;
s2032, acquiring the current posture of the robot, wherein the current posture comprises the current operation angle of the tool on a working object;
s2033, calculating the gesture interpolation amount of the robot changing from the current gesture to the target gesture according to the target operation angle, the current operation angle and a pre-designed angle tracking controller.
The robot has a tool center point (TCP), which is typically the tip point of the tool, and acquiring the current posture of the robot includes acquiring the included angle between the robot's TCP attitude and the tangent vector of the TCP trajectory. The target posture of the robot is preset by the user. Taking curved surface polishing as an example, the included angle between the robot tool and the work object is the angle between the Z axis of the tool coordinate system and the tangent of the three-dimensional curve through which the TCP passes. In this way, the current posture adaptively follows the target posture while the robot moves, so the robot performs adaptive posture tracking at the same time as force tracking. Specifically, the control system presets a target posture of the robot and, based on the current posture detected while the robot is running, determines the difference between the current posture and the target posture, then determines the posture interpolation quantity from the current posture to the target posture in combination with the angle tracking controller. When the current posture of the robot changes, the difference between the current posture and the target posture changes, and by calculating in real time the robot keeps its current posture following the target posture. The current posture of the robot may be obtained by a six-dimensional sensor at the robot end, or by other possible means such as a visual sensor.
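A possible way to compute this current operation angle is sketched below: the tool Z axis is read from the third column of the TCP rotation matrix, and the trajectory tangent is approximated from the last two TCP positions. The function name and the finite-difference tangent are assumptions for illustration.

```python
import numpy as np

def current_operation_angle(tcp_rotation, tcp_positions):
    """Angle between the tool Z axis and the tangent of the TCP trajectory.

    tcp_rotation:  3x3 rotation matrix of the current TCP pose; its third
                   column is the tool Z axis.
    tcp_positions: array of recent TCP positions (N x 3); the tangent is
                   approximated by the difference of the last two samples."""
    z_axis = tcp_rotation[:, 2]
    tangent = tcp_positions[-1] - tcp_positions[-2]
    tangent = tangent / np.linalg.norm(tangent)
    cos_angle = np.clip(np.dot(z_axis, tangent), -1.0, 1.0)
    return np.arccos(cos_angle)   # current operation angle, in radians
```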
Calculating the joint space interpolation quantity from the position interpolation quantity and the posture interpolation quantity comprises calculating it through an inverse kinematics solution, which may use an analytical method or a numerical method and is not described in detail here.
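The patent leaves the inverse kinematics method open. One common numerical realization, shown only as a sketch, maps the 6-dimensional Cartesian increment to joint increments through the Jacobian pseudo-inverse; the Jacobian is assumed to be supplied by the robot's kinematic model.

```python
import numpy as np

def joint_space_interpolation(jacobian, delta_position, delta_attitude):
    """Map the Cartesian interpolation [delta_p; delta_theta] (6-vector) to a
    joint space interpolation via the Jacobian pseudo-inverse.

    jacobian: 6 x n Jacobian at the current joint configuration (assumed to
              come from the robot model); delta_position and delta_attitude
              are 3-vectors."""
    delta_cartesian = np.concatenate([delta_position, delta_attitude])
    return np.linalg.pinv(jacobian) @ delta_cartesian
```

This differential-kinematics approximation is reasonable for the small per-period increments used in interpolation; an analytical inverse kinematics solution could be substituted where one exists.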
Further, in a specific embodiment, the angle tracking controller is a PID controller, i.e. a proportional-integral-derivative controller. Let the current operation angle be θt and the target operation angle be θd; the angle error is then θe = θd − θt, and the output of the PID controller is the attitude interpolation quantity δθ, which the PID controller calculates from the angle error as

δθ = Kp·θe + Ki·∫θe dt + Kd·(dθe/dt)

where Kp, Ki and Kd are the proportional, integral and derivative coefficients respectively. When the current operation angle changes, the angle error changes and the attitude interpolation quantity output by the PID controller changes accordingly, so adaptive tracking of the robot attitude is achieved.
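A minimal discrete form of this angle tracking PID is sketched below; the class name and the handling of the first derivative sample are illustrative assumptions.

```python
class AngleTrackingPID:
    """Discrete PID that turns the angle error theta_e = theta_d - theta_t
    into the attitude interpolation quantity delta_theta."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def step(self, theta_target, theta_current):
        error = theta_target - theta_current              # theta_e = theta_d - theta_t
        self.integral += error * self.dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative  # delta_theta
```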
Preferably, before calculating the attitude interpolation quantity for the robot to change from the current attitude to the target attitude, the method further includes filtering the current operation angle with a predetermined filter, so as to handle sudden changes in the current operation angle caused by position changes during robot control.
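The patent does not specify the filter. A first-order low-pass filter (exponential smoothing) is one simple choice for suppressing sudden jumps in the measured operation angle, sketched here with an assumed smoothing factor alpha.

```python
class LowPassFilter:
    """First-order low-pass filter: y_k = alpha * x_k + (1 - alpha) * y_{k-1}."""

    def __init__(self, alpha=0.2):
        self.alpha = alpha
        self.state = None

    def step(self, value):
        if self.state is None:          # initialize on the first sample
            self.state = value
        self.state = self.alpha * value + (1.0 - self.alpha) * self.state
        return self.state
```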
Referring to fig. 5, fig. 5 is a flowchart of an adaptive posture tracking method in an embodiment of the present application. The current state information of the robot is obtained from the force sensor at the robot end, the current pose of the robot and a designated coordinate system, and is processed by the hybrid controller, which outputs the position interpolation quantity. The TCP trajectory tangent vector and the TCP attitude are obtained from the current pose of the robot, and the current operation angle between the robot tool and the working object is calculated from them; the current operation angle is filtered to remove interference and, combined with the preset target operation angle, the posture interpolation quantity is obtained through the angle tracking controller. The posture interpolation quantity and the position interpolation quantity output by the hybrid controller are passed through the inverse kinematics solution to obtain the joint space interpolation quantity. Finally, the current pose of the robot is interpolated according to the joint space interpolation quantity and the robot's servo drives are controlled, which in turn controls the motion angle of each joint and ultimately adjusts the pose of the robot.
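Tying the pieces together, one control period of the flow in FIG. 5 could look roughly like the sketch below. It reuses the illustrative helpers sketched earlier (joint_space_interpolation, AngleTrackingPID, LowPassFilter), and the robot / hybrid-controller interface (tcp_pose(), tcp_trajectory(), jacobian(), force_sensor(), send_joint_increment()) is a placeholder, not an actual API.

```python
import numpy as np

def control_cycle(robot, hybrid_controller, angle_pid, angle_filter, theta_target):
    # 1. Hybrid controller: take the position interpolation; the initial
    #    attitude interpolation it produces is cleared (ignored).
    delta_position, _cleared_attitude = hybrid_controller.step(
        robot.force_sensor(), robot.tcp_pose())

    # 2. Adaptive attitude interpolation from the filtered operation angle.
    rotation = robot.tcp_pose().rotation        # 3x3, third column = tool Z axis
    positions = robot.tcp_trajectory()          # recent TCP positions, N x 3
    tangent = positions[-1] - positions[-2]
    tangent = tangent / np.linalg.norm(tangent)
    z_axis = rotation[:, 2]
    theta_current = angle_filter.step(
        np.arccos(np.clip(np.dot(z_axis, tangent), -1.0, 1.0)))
    delta_theta = angle_pid.step(theta_target, theta_current)
    axis = np.cross(z_axis, tangent)            # rotate about the common normal
    axis = axis / max(np.linalg.norm(axis), 1e-9)
    delta_attitude = delta_theta * axis

    # 3. Inverse (differential) kinematics to the joint space interpolation.
    dq = joint_space_interpolation(robot.jacobian(), delta_position, delta_attitude)

    # 4. Interpolate the current pose and command the servo drives.
    robot.send_joint_increment(dq)
```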
The beneficial effects of the above preferred embodiment are that complex programming of the robot is avoided, the range of complex working conditions the robot can handle is enriched, and the robot can track the target posture adaptively, which improves the user experience.
The present application also provides an adaptive posture tracking apparatus, applied to a robot. Referring to fig. 6, the adaptive posture tracking apparatus includes:
a fusion unit 410 for fusing at least two controllers set in advance to generate a hybrid controller;
a determining unit 420, configured to determine a position interpolation quantity and an initial posture interpolation quantity of the robot according to the hybrid controller, and to clear the initial posture interpolation quantity;
a calculating unit 430, configured to calculate a posture interpolation amount that the robot adaptively changes according to the current posture;
an obtaining unit 440, configured to obtain a joint space interpolation amount of the robot according to the position interpolation amount and the posture interpolation amount;
and the control unit 450 is configured to interpolate the current pose of the robot according to the joint space interpolation amount, and control the robot to operate based on the interpolated pose.
The determining unit 420 calculates the initial attitude interpolation quantity and clears it; the attitude interpolation quantity of the robot is calculated by the calculating unit 430, which determines it in a manner different from the determining unit: the calculating unit 430 calculates the attitude interpolation quantity that adaptively changes according to the robot's current attitude. Preferably, the calculating unit 430 adaptively processes the current posture of the robot to determine the posture interpolation quantity and further determine the robot's joint space interpolation quantity; after the pose of the robot is interpolated, force tracking and posture tracking can both be achieved.
Wherein, the preset at least two controllers comprise at least two of a position controller, a force controller and an impedance controller. The fusing at least two preset controllers to generate a hybrid controller includes: obtaining the spatial interpolation quantities of at least two controllers; and determining a fusion interpolation amount according to the space interpolation amount of each controller to generate a hybrid controller.
In a specific embodiment, the calculating unit 430 calculates the posture interpolation quantity of the robot that adaptively changes according to the current posture, which may be implemented as: setting a target posture of the robot, wherein the target posture represents a target operation angle of the tool on a working object; acquiring the current posture of the robot, wherein the current posture comprises the current operation angle of the tool on the working object; and calculating the posture interpolation quantity of the robot changing from the current posture to the target posture according to the target operation angle, the current operation angle and a pre-designed angle tracking controller. Specifically, the target posture may be set manually; for example, the user may set it through a teach pendant or other portable device to indicate the desired operation angle of the tool on the working object and so ensure the operating effect of the robot. During operation the current posture and the target posture are compared continuously, the posture interpolation quantity is calculated in combination with the angle tracking controller, and the robot's posture after the final interpolation corresponds to the preset target posture, ensuring the operating effect of the robot.
Specifically, acquiring the current posture of the robot includes acquiring the included angle between the robot's TCP attitude and the tangent vector of the TCP trajectory. The TCP is the tool center point of the robot; in the initial state the tool center point is the origin of the tool coordinate system, and when the robot is moved manually or by program toward a certain point in space, it is essentially the tool center point that is moved toward that point, so the trajectory motion of the robot can be represented by the tool center point. The TCP attitude is the tool center point attitude, and the TCP trajectory tangent vector is the tangent vector of the tool center point trajectory. A tool coordinate system of the robot is established, whose Z axis represents the TCP attitude of the robot. Taking curved surface polishing as an example, the angle between the Z axis of the tool coordinate system and the tangent direction of the three-dimensional curve through which the TCP passes is the operation angle of the robot tool on the working object; by detecting this angle and combining it with the preset target posture, the angle difference that needs to be changed is obtained.
In one possible embodiment, the angle tracking controller is a PID controller. Before calculating the attitude interpolation quantity for the robot to change from the current attitude to the target attitude, the method further includes filtering the current operation angle according to a preset filter.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
In an exemplary embodiment, the present application further provides a computer readable storage medium, such as a memory, having a computer program stored thereon, the computer program being executable by a processor to perform an adaptive pose tracking method. Alternatively, the storage medium may be a non-transitory computer readable storage medium, which may be, for example, a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
In an exemplary embodiment, the present application further provides an electronic device comprising a memory and a processor, the memory storing a computer program; the processor is configured to execute the computer program in the memory to implement the adaptive pose tracking method described above.
In a particular embodiment, referring to FIG. 7, the electronic device 500 may include a processor 510, a memory 520, an input/output component 530, and a communication port 540. The processor 510 (e.g., a CPU) may be implemented as one or more processors that execute program instructions. The memory 520 includes various forms of program storage and data storage, such as a hard disk, read-only memory (ROM) and random access memory (RAM), for storing various data files that the computer processes and/or transmits. The input/output component 530 may be used to support input/output between the processing device and other components. The communication port 540 may be connected to a network for data communication. An exemplary processing device may include program instructions stored in read-only memory (ROM), random access memory (RAM), and/or another type of non-transitory storage medium that are executed by a processor. The methods and/or processes of the embodiments of the present specification may be implemented as program instructions.
Finally, it is to be noted that the above description is intended to be illustrative and not exhaustive, and that the invention is not limited to the disclosed embodiments, and that several modifications and variations may be resorted to by those skilled in the art without departing from the scope and spirit of the invention as set forth in the appended claims. Therefore, the protection scope of the present invention should be subject to the claims.

Claims (10)

1. An adaptive pose tracking method applied to a robot for connecting a tool to perform a predetermined operation on a work object, the method comprising:
fusing at least two preset controllers to generate a hybrid controller;
determining a position interpolation quantity and an initial posture interpolation quantity of the robot according to the hybrid controller, and clearing the initial posture interpolation quantity;
calculating the attitude interpolation amount of the robot which adaptively changes according to the current attitude;
acquiring joint space interpolation quantity of the robot according to the position interpolation quantity and the posture interpolation quantity;
and interpolating the current pose of the robot according to the joint space interpolation quantity, and controlling the robot to operate based on the interpolated pose.
2. The method of claim 1, wherein calculating a pose interpolation amount for the robot to adaptively change according to the current pose comprises:
setting a target posture of the robot, wherein the target posture comprises a target operation angle of the tool on a working object;
acquiring a current posture of the robot, wherein the current posture comprises a current operation angle of the tool on a working object;
and calculating the posture interpolation quantity of the robot changing from the current posture to the target posture according to the target operation angle, the current operation angle and a pre-designed angle tracking controller.
3. The method of claim 2, wherein the obtaining the current pose of the robot comprises: and acquiring an included angle between the TCP attitude of the robot and the tangent vector of the TCP track.
4. The method of claim 2, wherein the angle tracking controller is a PID controller.
5. The method of claim 2, wherein before calculating the pose interpolation amount for the robot to change from the current pose to the target pose, further comprising:
and carrying out filtering processing on the current operation angle according to a predetermined filter.
6. The method of claim 1, wherein the at least two controllers comprise at least two of a position controller, a force controller, and an impedance controller.
7. The method of claim 1, wherein the fusing the pre-set at least two controllers to generate a hybrid controller comprises:
obtaining the spatial interpolation quantities of at least two controllers;
and determining a fusion interpolation amount according to the space interpolation amount of each controller to generate a hybrid controller.
8. An adaptive attitude tracking device applied to a robot, comprising:
the fusion unit is used for fusing at least two preset controllers to generate a hybrid controller;
the determining unit is used for determining the position interpolation quantity and the initial posture interpolation quantity of the robot according to the hybrid controller and clearing the initial posture interpolation quantity;
the computing unit is used for computing the attitude interpolation amount of the robot which adaptively changes according to the current attitude;
an obtaining unit, configured to obtain a joint space interpolation amount of the robot according to the position interpolation amount and the posture interpolation amount;
and the control unit is used for interpolating the current pose of the robot according to the joint space interpolation quantity and controlling the robot to operate based on the interpolated pose.
9. A computer-readable storage medium storing a computer program, wherein the computer program when executed implements the adaptive pose tracking method of any of claims 1-7.
10. An electronic device, comprising:
a memory storing a computer program;
a processor for executing the computer program in the memory to implement the adaptive tracking method of any one of claims 1-7.
CN202111252255.3A 2021-10-27 2021-10-27 Self-adaptive attitude tracking method and device, storage medium and electronic equipment Active CN113910232B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111252255.3A CN113910232B (en) 2021-10-27 2021-10-27 Self-adaptive attitude tracking method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111252255.3A CN113910232B (en) 2021-10-27 2021-10-27 Self-adaptive attitude tracking method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN113910232A true CN113910232A (en) 2022-01-11
CN113910232B CN113910232B (en) 2022-12-20

Family

ID=79243007

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111252255.3A Active CN113910232B (en) 2021-10-27 2021-10-27 Self-adaptive attitude tracking method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN113910232B (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010253668A (en) * 2009-03-31 2010-11-11 Daihen Corp Control device of robot
US20140379131A1 (en) * 2013-06-19 2014-12-25 Gwangju Institute Of Science And Technology Control Method and Device for Position-Based Impedance Controlled Industrial Robot
CN107838920A (en) * 2017-12-20 2018-03-27 芜湖哈特机器人产业技术研究院有限公司 A kind of robot polishing Force control system and method
CN109015634A (en) * 2018-07-24 2018-12-18 西北工业大学 Multi-arm teleoperation robot power/Position Hybrid Control method based on performance function
CN111319036A (en) * 2018-12-15 2020-06-23 天津大学青岛海洋技术研究院 Self-adaptive algorithm-based mobile mechanical arm position/force active disturbance rejection control method
CN110315527A (en) * 2019-02-26 2019-10-11 浙江树人学院(浙江树人大学) A kind of flexible mechanical arm control method of adaptive Dynamic Programming
CN110488745A (en) * 2019-07-23 2019-11-22 上海交通大学 A kind of human body automatic ultrasonic scanning machine people, controller and control method
CN110948504A (en) * 2020-02-20 2020-04-03 中科新松有限公司 Normal constant force tracking method and device for robot machining operation
CN111633668A (en) * 2020-07-27 2020-09-08 山东大学 Motion control method for robot to process three-dimensional free-form surface

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114734436A (en) * 2022-03-24 2022-07-12 苏州艾利特机器人有限公司 Robot encoder calibration method and device and robot
CN114734436B (en) * 2022-03-24 2023-12-22 苏州艾利特机器人有限公司 Robot encoder calibration method and device and robot
CN117921683A (en) * 2024-03-19 2024-04-26 库卡机器人(广东)有限公司 Joint robot, control method and device thereof, and readable storage medium
CN117921683B (en) * 2024-03-19 2024-05-31 库卡机器人(广东)有限公司 Joint robot, control method and device thereof, and readable storage medium

Also Published As

Publication number Publication date
CN113910232B (en) 2022-12-20

Similar Documents

Publication Publication Date Title
CN108161882B (en) Robot teaching reproduction method and device based on augmented reality
CN113910232B (en) Self-adaptive attitude tracking method and device, storage medium and electronic equipment
González et al. Advanced teleoperation and control system for industrial robots based on augmented virtuality and haptic feedback
US9764462B2 (en) Robot apparatus and robot controlling method
CN110076772B (en) Grabbing method and device for mechanical arm
CN108340351B (en) Robot teaching device and method and teaching robot
Neto et al. High‐level robot programming based on CAD: dealing with unpredictable environments
JP2021000678A (en) Control system and control method
US11179793B2 (en) Automated edge welding based on edge recognition using separate positioning and welding robots
CN109968361B (en) Variable impedance teleoperation control device and method based on real-time force feedback
CN111459274B (en) 5G + AR-based remote operation method for unstructured environment
CN108687767B (en) Offline programming device and offline programming method
CN110948504A (en) Normal constant force tracking method and device for robot machining operation
Kamali et al. Real-time motion planning for robotic teleoperation using dynamic-goal deep reinforcement learning
Chang et al. Image feature command generation of contour following tasks for SCARA robots employing Image-Based Visual Servoing—A PH-spline approach
CN113021356B (en) Robot track planning method and system for ingot trimming process
CN114851209B (en) Industrial robot working path planning optimization method and system based on vision
Rea Minango et al. Combining the STEP-NC standard and forward and inverse kinematics methods for generating manufacturing tool paths for serial and hybrid robots
CN113634871A (en) Robot friction stir welding track planning method based on offline programming
CN106003067A (en) Industrial robot teaching method and teaching file manufacturing method and device
Cheng et al. Trajectory planning method with grinding compensation strategy for robotic propeller blade sharpening application
Dagioglou et al. Smoothing of human movements recorded by a single rgb-d camera for robot demonstrations
US6798416B2 (en) Generating animation data using multiple interpolation procedures
CN113954070B (en) Mechanical arm motion control method and device, storage medium and electronic equipment
CN109648563B (en) Method for controlling motion of serial robot and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant