CN111085996B - Control method, device and system of live working robot - Google Patents


Info

Publication number
CN111085996B
CN111085996B (application CN201911288390.6A)
Authority
CN
China
Prior art keywords
action
arm
working robot
live working
motion
Prior art date
Legal status
Expired - Fee Related
Application number
CN201911288390.6A
Other languages
Chinese (zh)
Other versions
CN111085996A (en)
Inventor
陈贤飞
俞葆青
夏益青
徐善军
杨冰
焦海斌
闫红雨
李威
王朝松
卢路
Current Assignee
Beijing Guodian Futong Science and Technology Development Co Ltd
State Grid Electric Power Research Institute
Original Assignee
Beijing Guodian Futong Science and Technology Development Co Ltd
State Grid Electric Power Research Institute
Priority date
Filing date
Publication date
Application filed by Beijing Guodian Futong Science and Technology Development Co Ltd, State Grid Electric Power Research Institute filed Critical Beijing Guodian Futong Science and Technology Development Co Ltd
Priority to CN201911288390.6A
Publication of CN111085996A
Application granted
Publication of CN111085996B
Legal status: Expired - Fee Related

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 - Programme-controlled manipulators
    • B25J9/16 - Programme controls
    • B25J9/1694 - Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 - Controls for manipulators
    • B25J13/006 - Controls for manipulators by means of a wireless system for controlling one or several manipulators
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 - Programme-controlled manipulators
    • B25J9/16 - Programme controls
    • B25J9/1602 - Programme controls characterised by the control system, structure, architecture
    • B25J9/161 - Hardware, e.g. neural networks, fuzzy logic, interfaces, processor

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a control method, device, and system for a live working robot. The method is applied to a controller; the controller is communicatively connected to the live working robot and to a motion acquisition device worn by an operator. The method comprises the following steps: receiving the operator's action data sent by the motion acquisition device; generating a control signal from the action data and a pre-established action mapping model; and controlling the motion of the live working robot with the control signal, so that the robot works by following the motion of the operator's arm. By having the live working robot follow the operator's arm motion during work, the invention improves the robot's working efficiency and its practicability in harsh working environments.

Description

Control method, device and system of live working robot
Technical Field
The invention relates to the technical field of robots, and in particular to a control method, device, and system of a live working robot.
Background
In the related art, a live working robot is usually controlled during work by teleoperation with a master manipulator and a slave mechanical arm: the operator remotely controls the slave arm through the master manipulator from an insulating bucket or a control room. Although this method keeps the operator safe, it demands a high level of technical proficiency from the operator, is difficult and slow to operate, and seriously limits working efficiency. In addition, some live working robots work in an autonomous mode, using various sensors to collect and verify data before operating; however, in highly complex environments such as heavy fog or strong light, existing sensors and cameras cannot complete the identification and positioning needed for the work, which greatly reduces the practicability of such robots.
Disclosure of Invention
In view of the above, the present invention provides a control method, device, and system for a live working robot, so as to improve the robot's working efficiency and its practicability in harsh working environments.
In a first aspect, an embodiment of the invention provides a control method of a live working robot, applied to a controller; the controller is communicatively connected to the live working robot and to a motion acquisition device worn by an operator. The method comprises: receiving the operator's action data sent by the motion acquisition device; generating a control signal from the action data and a pre-established action mapping model; and controlling the motion of the live working robot with the control signal, so that the robot works by following the motion of the operator's arm.
With reference to the first aspect, an embodiment of the invention provides a first possible implementation of the first aspect, wherein the live working robot comprises a mechanical arm, and the step of generating a control signal from the action data and the pre-established action mapping model comprises: calculating rotation angle information of the shoulder and elbow joints of the operator's arm from the action data, based on a preset biological arm motion model; inputting the rotation angle information into the pre-established action mapping model to obtain target degree-of-freedom parameters of the mechanical arm; and generating the control signal based on the target degree-of-freedom parameters of the mechanical arm.
With reference to the first possible implementation of the first aspect, an embodiment of the invention provides a second possible implementation, wherein the step of generating the control signal based on the target degree-of-freedom parameters comprises: acquiring the current degree-of-freedom parameters of the mechanical arm; calculating the difference between the target and current degree-of-freedom parameters; and generating the control signal based on that difference.
With reference to the first possible implementation of the first aspect, an embodiment of the invention provides a third possible implementation, wherein the action mapping model is built on a neural network and is established as follows: obtaining training samples, which comprise rotation angle information of the shoulder and elbow joints of a number of arms and the corresponding target degree-of-freedom parameters of the mechanical arm; building the network structure of the neural network according to a preset model architecture; and training the network structure on the training samples to obtain the action mapping model.
With reference to the first aspect, an embodiment of the invention provides a fourth possible implementation, wherein before receiving the operator's action data sent by the motion acquisition device, the method further comprises: receiving a working-mode selection instruction sent by a user; if the instruction indicates that the live working robot is to use a motion capture mode, receiving initialization data sent by the motion acquisition device; calculating initial rotation angle information of the shoulder and elbow joints of the operator's arm from the initialization data, based on the preset biological arm motion model; comparing the initial rotation angle information with preset initial action information; and if they do not match, continuing to receive initialization data until the initial rotation angle information matches the initial action information, whereupon the live working robot is determined to be in the motion capture mode.
In a second aspect, an embodiment of the invention further provides a control device of a live working robot, arranged on a controller; the controller is communicatively connected to the live working robot and to a motion acquisition device worn by an operator. The device includes: a data receiving module for receiving the operator's action data sent by the motion acquisition device; a control signal generation module for generating a control signal from the action data and a pre-established action mapping model; and a control module for controlling the motion of the live working robot with the control signal, so that the robot works by following the motion of the operator's arm.
In a third aspect, an embodiment of the invention further provides a working system of a live working robot, comprising a live working robot, a controller, and a motion acquisition device; the controller is communicatively connected to the live working robot and to the motion acquisition device; and the device described above is arranged in the controller.
With reference to the third aspect, an embodiment of the invention provides a first possible implementation of the third aspect, wherein the system further includes an insulating bucket arm vehicle and a double insulating bucket carried on the vehicle; the double insulating bucket comprises a first insulating bucket and a second insulating bucket; the live working robot and the controller are arranged in the first insulating bucket, and the motion acquisition device is arranged in the second insulating bucket.
In a fourth aspect, embodiments of the present invention further provide a controller, including a processor and a memory, where the memory stores machine executable instructions capable of being executed by the processor, and the processor executes the machine executable instructions to implement the above method.
In a fifth aspect, embodiments of the present invention also provide a machine-readable storage medium storing machine-executable instructions, which when called and executed by a processor, cause the processor to implement the above-mentioned method.
Embodiments of the invention bring the following beneficial effects:
The embodiments provide a control method, device, and system for a live working robot. After the operator's action data sent by the motion acquisition device is received, a control signal is generated from the action data and a pre-established action mapping model, and the motion of the live working robot is controlled with this signal so that the robot works by following the motion of the operator's arm. By having the live working robot follow the operator's arm motion, this approach improves the robot's working efficiency and its practicability in harsh working environments.
Additional features and advantages of the invention will be set forth in the description that follows; in part they will be obvious from the description, or may be learned by practice of the invention.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed for describing the embodiments or the prior art are briefly introduced below; obviously, the drawings described below show some embodiments of the present invention, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a flowchart of a control method of a live working robot according to an embodiment of the present invention;
Fig. 2 is a flowchart of another control method of a live working robot according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of a biological arm motion model according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of the coordinate systems established for the biological arm motion model according to an embodiment of the present invention;
Fig. 5 is a schematic structural diagram of a BP neural network according to an embodiment of the present invention;
Fig. 6 is a flowchart of another control method of a live working robot according to an embodiment of the present invention;
Fig. 7 is a layout diagram of a live working robot platform according to an embodiment of the present invention;
Fig. 8 is a schematic structural diagram of a control device of a live working robot according to an embodiment of the present invention;
Fig. 9 is a schematic structural diagram of a live working robot operating system according to an embodiment of the present invention;
Fig. 10 is a schematic structural diagram of a controller according to an embodiment of the present invention.
Icons: 1 - built-in industrial personal computer; 2 - special tool placing platform; 3 - robot platform body; 4 - double-mechanical-arm supporting structure; 51 - first mechanical arm; 52 - second mechanical arm; 61 - first camera; 62 - second camera; 71 - first end tool; 72 - second end tool; 8 - depth camera; 9 - handheld controller; 10 - portable data processing module; 11 - operator; 12 - control panel; 13 - double-bucket connecting mechanism; 14 - double insulating bucket; 700 - data receiving module; 702 - control signal generation module; 704 - control module; 80 - live working robot; 81 - controller; 82 - motion acquisition device; 1201 - processor; 1202 - memory; 1203 - bus; 1204 - forwarding chip.
Detailed Description
The technical solutions of the present invention will be described clearly and completely with reference to the following embodiments, and it should be understood that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Live working robots are generally used in high-voltage industrial scenarios, such as line inspection of high-voltage transmission lines. Using a live working robot in place of manual work effectively avoids injury from high voltage and strong electric fields and greatly improves working efficiency.
In the related art, a distribution-network live working robot is mainly operated by teleoperation with a master manipulator and a slave mechanical arm: the operator remotely controls the slave arm through the master manipulator from an insulating bucket or a control room. This method keeps the operator safe, but it places high demands on them: the operator must be highly proficient and fully concentrated throughout the operation; the work proceeds slowly, since each step can begin only after the safety of the working environment is confirmed; and improper operation may damage the mechanical arm. All of this seriously limits working efficiency.
With the development of science and technology, working robots have gained a certain autonomous working capability through modeling research on the working procedures of various operating conditions. An operator only needs to guide the robot into place with simple operations on a control panel; the robot then recognizes the scene and automatically controls the mechanical arm to complete the corresponding task, which greatly reduces the operator's workload and lowers the skill required.
For a live working robot with a higher degree of autonomy, operation requires only that various sensors collect and verify data before the work is carried out. However, this approach has limitations: in highly complex environments such as heavy fog or strong light, existing sensors and cameras cannot complete identification and positioning effectively, and practicability drops sharply. This is a major obstacle to advancing the automation of live working robots.
Based on this, embodiments of the present invention provide a control method, device, and system for a live working robot, applicable to live working robots in various industrial scenarios.
To facilitate understanding, the control method of a live working robot disclosed in the embodiments of the present invention is first described in detail.
An embodiment of the invention provides a control method of a live working robot, applied to a controller; the controller is communicatively connected to the live working robot and to a motion acquisition device worn by an operator. As shown in fig. 1, the method comprises the following steps:
Step S100: receive the operator's action data sent by the motion acquisition device.
The action data is collected by the motion acquisition device worn by the operator. The device may consist of sensors, a data transmission module, a power supply, and so on. The sensors can be arranged on the operator's arm and, following the arm's motion, produce sensing data such as the acceleration, angular velocity, or angular displacement of the shoulder or elbow joint; this sensing data serves as the action data and is sent to the controller by the device's data transmission module.
Step S102: generate a control signal from the action data and a pre-established action mapping model.
The action data is generally raw sensor measurements and needs preprocessing, such as amplification and denoising. The preprocessed action data is then analyzed according to the motion characteristics of the arm to obtain the motion state parameters of the operator's arm. Arm motion has been widely studied in the related art, and the arm can generally be simplified to a rigid-body model with 4 degrees of freedom.
The mechanical arm of the live working robot is modeled on the biological arm, so its degree-of-freedom parameters can be put in correspondence with the arm's motion state parameters; this correspondence is realized by the pre-established action mapping model. The model can be built on principles such as neural networks and machine learning and trained on a large amount of real data. Given the arm's motion state parameters as input, the action mapping model outputs the corresponding degree-of-freedom parameters of the mechanical arm; the controller then generates a control signal from the difference between the mechanical arm's current state and the degree-of-freedom parameters corresponding to the operator's arm, and uses the signal to adjust the mechanical arm to the motion state detected by the motion acquisition device.
Step S104: control the motion of the live working robot with the control signal, so that the robot works by following the motion of the operator's arm.
A live working robot generally comprises a mechanical arm, slider modules, a control cabinet, and other equipment. The control signal can be sent to the robot's control cabinet, which drives the motion of the mechanical arm through transmission mechanisms such as the slider modules, so that the arm follows the motion of the operator's arm during the work.
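As a concrete illustration of steps S100 to S104, a minimal controller-side loop might look as follows; this is a sketch only, and the transport, message format, `MappingModel` interface (`model.map`), and addresses are assumptions rather than part of the invention:

```python
# Minimal sketch of the S100-S104 loop; transport, ports, and message format
# are illustrative assumptions, not specified by the patent.
import json
import socket

def control_loop(model, robot_addr=("192.168.1.50", 9000), listen_port=9001):
    """Receive operator action data, map it to arm targets, forward to the robot."""
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(("", listen_port))
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        packet, _ = rx.recvfrom(4096)
        action = json.loads(packet)            # S100: operator action data
        dof_target = model.map(action)         # S102: action mapping model
        tx.sendto(json.dumps({"dof": dof_target}).encode(), robot_addr)  # S104
```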
The embodiment of the invention thus provides a control method of a live working robot: the operator's action data sent by the motion acquisition device is received, a control signal is generated from the action data and a pre-established action mapping model, and the motion of the live working robot is controlled with the signal so that the robot works by following the motion of the operator's arm. By having the robot follow the operator's arm motion, this approach improves the robot's working efficiency and its practicability in harsh working environments.
An embodiment of the invention also provides another control method of a live working robot, realized on the basis of the method above; as shown in fig. 2, the method comprises the following steps:
Step S200: receive a working-mode selection instruction sent by a user.
The live working robot may have several working modes, for example an autonomous working mode and the motion capture mode using the method provided by this embodiment. The working-mode selection instruction may be sent actively by the user; alternatively, the robot's sensors may collect ambient parameters (such as temperature and brightness) and feed them back to the controller, which then offers the currently applicable working modes for the user to choose from. Once the user makes a selection through a human-computer interaction device (such as a mouse and keyboard), the controller receives the working-mode selection instruction.
Step S202: if the working-mode selection instruction indicates that the live working robot is to use the motion capture mode, execute step S204; otherwise, end the process.
When the user selects the motion capture mode, the working-mode selection instruction carries an identifier set for that mode. The controller searches the instruction for this identifier: if it is found, the instruction indicates that the live working robot is to use the motion capture mode; if not, the instruction indicates another working mode, which is outside the scope of this embodiment, and the process ends.
Step S204: receive the initialization data sent by the motion acquisition device.
The initialization data may be the same kind of data as the action data; it is called initialization data because it is used to initialize the method. Like the action data, it may be sensing data produced by the sensors of the motion acquisition device in response to the arm's motion.
The motion acquisition device (also called an inertial measurement system) may consist of a lithium battery, a power management module, a nine-axis inertial unit, and a Wi-Fi control unit. The nine-axis inertial unit comprises a three-axis accelerometer, a three-axis gyroscope, and a three-axis magnetometer, and measures the carrier's acceleration, angular velocity, and angular displacement in real time. The lithium battery powers the other modules; the power management circuit provides charging and power-off functions for the whole system; and the Wi-Fi control unit, connected to the nine-axis inertial unit over a serial link, transmits the unit's signals wirelessly to the controller.
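For illustration, one way the controller might decode a nine-axis sample from such a unit is sketched below; the wire format (nine little-endian floats per sample) is a pure assumption, since the patent does not specify it:

```python
# Hypothetical decoding of one nine-axis sample; the real packet layout of the
# inertial measurement system is not specified in the patent.
import struct
from typing import NamedTuple, Tuple

class ImuSample(NamedTuple):
    accel: Tuple[float, float, float]  # three-axis accelerometer
    gyro: Tuple[float, float, float]   # three-axis gyroscope
    mag: Tuple[float, float, float]    # three-axis magnetometer

PACKET = struct.Struct("<9f")  # 9 little-endian floats (assumed layout)

def parse_sample(payload: bytes) -> ImuSample:
    ax, ay, az, gx, gy, gz, mx, my, mz = PACKET.unpack(payload)
    return ImuSample((ax, ay, az), (gx, gy, gz), (mx, my, mz))
```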
Step S206: based on a preset biological arm motion model, calculate the initial rotation angle information of the shoulder and elbow joints of the operator's arm from the initialization data.
The initialization data is attitude data obtained from the inertial sensors worn on the wrist and shoulder, which track the operator's upper arm and forearm. Taking the modeling of the biological right arm as an example, the whole arm is simplified into a rigid body with 4 degrees of freedom in total: the shoulder joint is a spherical hinge with 3 degrees of freedom (forward/backward extension, left/right rotation, and inward/outward adduction of the upper arm), while the elbow joint is simplified to a single-axis hinge model, so that the upper arm and the forearm have only 1 further degree of freedom (rotation of the forearm) and move in a common plane, as shown in the model diagram of fig. 3, where $L_1$ is the length of the upper arm and $L_2$ is the length of the forearm.
First, in the simplified arm model, the following coordinate systems are established: a spatial absolute coordinate system $O(X, Y, Z)$; a relative coordinate system $O_1(X_1, Y_1, Z_1)$ of the person with respect to absolute space; a joint relative coordinate system $O_3(X_3, Y_3, Z_3)$; and the coordinate systems $O_2(X_2, Y_2, Z_2)$ and $O_4(X_4, Y_4, Z_4)$ describing the two inertial sensors, as shown in fig. 4.
Next, the arm posture is analyzed. The motions of the two joints of the human arm are tracked and captured, and on the basis of the simplified arm model above, coordinate-system rotations are handled with the quaternion method. Let the quaternion of the spatial absolute coordinate system be $q_O$, the attitude quaternion of the joint at the center of the upper arm be $q_{O_1}$, the attitude quaternion of the joint at the center of the forearm be $q_{O_3}$, the attitude quaternion of the upper sensor be $q_{O_2}$, and the attitude quaternion of the lower sensor be $q_{O_4}$. Arm motion capture is realized by linking the two sensor nodes through the quaternion method; in this process, determining the position of the elbow rotation center is critical.
In the coordinate systems established above, the motions of the two joints are unified into the person's relative coordinate system. At the initial posture-calibration instant (arm hanging naturally), the coordinates of the three joints are $O_1(0,0,0)$, $O_3(0,0,-L_1)$, and $O_4(0,0,-(L_1+L_2))$, where $O_1$ is assumed fixed; only the position of $O_3$, the rotation center of the elbow-joint limb segment, needs to be determined, and it can be computed as in a single-node model using the quaternion method. Write the initial upper-arm vector as the four-dimensional (pure) quaternion
$$\tilde{v} = (0,\ 0,\ 0,\ -L_1).$$
Its expression after rotation at time $t$ in three-dimensional space is
$$\tilde{v}(t) = q_{O_1}(t) \otimes \tilde{v} \otimes q_{O_1}(t)^{-1},$$
where $\tilde{v}(t)$ is the rotated vector and $q_{O_1}(t)$ is the attitude quaternion of the shoulder joint at time $t$. The three-dimensional coordinate of the elbow joint rotation center is then
$$O_3(t) = O_1 + \operatorname{vec}\big(\tilde{v}(t)\big),$$
where $\operatorname{vec}(\cdot)$ takes the vector part of a quaternion.
Assume that the spatial position of the shoulder joint $O_1$ is fixed during the motion; the rotation of $O_1$ then affects the spatial positions of the joints $O_3$ and $O_4$. Using the idea of forward kinematics, starting from the parent joint $O_1$, the position of the joint $O_3$ can be calculated from the shoulder quaternion and the upper-arm length; based on the position of $O_3$, the position of the lower-sensor joint $O_4$ is then determined from the rotational quaternion of the elbow attitude relative to the shoulder rotation.
The attitude of the shoulder joint $O_1$ at time $t$ is
$$q_{O_1}(t) = q_{\text{up}}(t) \otimes q_{\text{up},0}^{-1},$$
where $q_{\text{up}}(t)$ is the attitude measured by the upper sensor at time $t$ and $q_{\text{up},0}$ is the attitude of the upper sensor in the $O_1$ coordinate system relative to the absolute coordinate system.
The spatial coordinate of the elbow joint $O_3$ in the relative coordinate system at time $t$ is
$$O_3(t) = q_{O_1}(t) \otimes (0,\ 0,\ 0,\ -L_1) \otimes q_{O_1}(t)^{-1},$$
whose three-dimensional coordinate is $\big(x_3(t),\ y_3(t),\ z_3(t)\big)$.
The rotational quaternion of the lower-sensor position $O_4$ relative to the elbow joint position $O_3$ is
$$q_{O_3}(t) = q_{\text{low}}(t) \otimes q_{\text{low},0}^{-1},$$
where $q_{\text{low}}(t)$ is the attitude measured by the lower sensor at time $t$ and $q_{\text{low},0}$ is its attitude at calibration. The relative-coordinate-system position at time $t$ of the joint $O_4$ where the sensor is located can then be inferred:
$$O_4(t) = O_3(t) + q_{O_3}(t) \otimes (0,\ 0,\ 0,\ -L_2) \otimes q_{O_3}(t)^{-1},$$
whose three-dimensional coordinate is $\big(x_4(t),\ y_4(t),\ z_4(t)\big)$.
According to the model, the included angle of the elbow joint can be simplified to the angle between the vector $\overrightarrow{O_3 O_1}$ and the vector $\overrightarrow{O_3 O_4}$. In the quaternion representation of spatial attitude, the vector $\overrightarrow{O_3 O_1}$ can be expressed through the attitude quaternion $q_{O_1}$ of the shoulder joint, and the vector $\overrightarrow{O_3 O_4}$ through the attitude quaternion $q_{O_3}$ of the elbow joint; the concrete expression of the joint angle is then computed by the quaternion method from the relative rotation
$$q_{\text{joint}} = q_{O_1}^{-1} \otimes q_{O_3}.$$
Writing $q_{\text{joint}} = (q_0, q_1, q_2, q_3)$, the included angle $\alpha$ about the $O_1 O_2$ axis is
$$\alpha = \operatorname{arctan2}\big(2(q_0 q_3 + q_1 q_2),\ 1 - 2(q_2^2 + q_3^2)\big).$$
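The quaternion operations above can be made concrete with a short sketch; it assumes unit quaternions in $(w, x, y, z)$ order, and the decomposition $q_{\text{joint}} = q_{O_1}^{-1} \otimes q_{O_3}$ is one consistent reading of the formulas rather than the patent's literal implementation:

```python
# Sketch of the quaternion computations above; unit quaternions (w, x, y, z)
# are assumed, so the conjugate serves as the inverse.
import math

def q_mul(q, r):
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def q_conj(q):
    w, x, y, z = q
    return (w, -x, -y, -z)

def rotate(q, v):
    """Rotate 3-vector v by unit quaternion q: vec(q * (0, v) * q^-1)."""
    return q_mul(q_mul(q, (0.0, *v)), q_conj(q))[1:]

def elbow_center(q_shoulder, L1):
    """Position of O3 relative to O1, from the shoulder attitude and upper-arm length."""
    return rotate(q_shoulder, (0.0, 0.0, -L1))

def elbow_angle(q_shoulder, q_elbow):
    """alpha = arctan2(2(q0*q3 + q1*q2), 1 - 2(q2^2 + q3^2)) of q_joint."""
    q0, q1, q2, q3 = q_mul(q_conj(q_shoulder), q_elbow)  # q_joint
    return math.atan2(2*(q0*q3 + q1*q2), 1 - 2*(q2*q2 + q3*q3))
```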
Step S208: compare the initial rotation angle information with the preset initial action information.
The initial action information corresponds to a preset operator action that marks the start of the motion capture mode. For example, after the live working robot enters the motion capture mode and the operator has put on the inertial sensors, the operator stretches the arm straight forward to signal the start of motion capture; the rotation angle information measured by the inertial sensors in this posture can serve as the preset initial action information.
Step S210: if the initial rotation angle information matches the initial action information, execute step S212; otherwise, return to step S204.
If the initial rotation angle information is the same as the initial action information, or the error between them lies within a preset range, the two are considered to match and the operator is ready to start motion capture. If the error exceeds the preset range, the operator is considered not yet ready, and the step of receiving initialization data is executed again until the initial rotation angle information matches the initial action information. This prevents operating accidents caused by starting before the operator is ready.
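A minimal sketch of this calibration loop (steps S204 to S210) follows; the tolerance value and the two helper callables are assumptions introduced for illustration:

```python
# Illustrative calibration loop for steps S204-S210; tolerance and helpers
# are assumed, not taken from the patent.
def wait_for_start_pose(receive_init_data, compute_angles, initial_pose, tol=0.05):
    """Block until the measured joint angles match the preset start pose."""
    while True:
        data = receive_init_data()          # S204: initialization data
        angles = compute_angles(data)       # S206: shoulder/elbow rotation angles
        if all(abs(a - b) <= tol for a, b in zip(angles, initial_pose)):
            return True                     # S210: matched, motion capture mode on
```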
Step S212: receive the operator's action data sent by the motion acquisition device.
Step S214: based on the preset biological arm motion model, calculate the rotation angle information of the shoulder and elbow joints of the operator's arm from the action data. The process is similar to that of step S206 and uses the same biological arm motion model; when the model receives the initialization data, the lengths of the upper arm and forearm in the model are initialized.
Step S216: input the rotation angle information into the pre-established action mapping model to obtain the target degree-of-freedom parameters of the mechanical arm.
The live working robot comprises a mechanical arm; the action mapping model can be built on a neural network, and is established in the following way:
(1) Obtain training samples; the samples comprise rotation angle information of the shoulder and elbow joints of a number of arms, together with the corresponding target degree-of-freedom parameters of the mechanical arm.
(2) Build the network structure of the neural network according to a preset model architecture.
(3) Train the network structure on the training samples to obtain the action mapping model.
Specifically, the action mapping model may be built on a BP (back-propagation) neural network. A BP neural network is a multilayer feed-forward network trained with the back-propagation algorithm; it has an input layer, an output layer, and one or more hidden layers, with the structure shown in fig. 5: the input layer consists of the inputs $X_1, X_2, \ldots, X_n$, the hidden layer is parameterized by $W_1, W_2, \ldots, W_p$, and the output layer consists of the outputs $Y_1, Y_2, \ldots, Y_m$. By the approximation theory, given enough hidden-layer neurons, a BP network with nonlinear transfer functions can realize a nonlinear mapping to arbitrary precision. For a nonlinear process or function of which only input and output data are known, the excellent fitting capability of the BP network allows it to be trained on those data so that it acquires a mapping function very close to the nonlinear process.
The control method provided by this embodiment mainly controls the upper arm and forearm of the mechanical arm and does not involve hand motion. Only 3 variables of the shoulder and 1 variable of the elbow are controlled; in other words, according to the model, 4 pieces of biological-arm motion information are collected and mapped to the 4 controlled degrees of freedom of the mechanical arm. A three-layer BP neural network with 4 inputs and 4 outputs is therefore established, whose forward computation is as follows.
Network input: $X = (x_1, x_2, x_3, x_4)$, where $x_1, x_2, x_3, x_4$ are the 4 pieces of rotation angle information of the biological arm's motion, computed from the data collected by the sensors.
Input layer: $a_i = x_i$, $i = 1, 2, 3, 4$.
Hidden layer:
$$s_j = \sum_{i=1}^{4} \beta_{ij} a_i - \theta_j, \qquad b_j = F_1(s_j), \qquad j = 1, 2, \ldots, p.$$
Output layer:
$$l_t = \sum_{j=1}^{p} v_{jt} b_j - \gamma_t, \qquad y_t = F_2(l_t), \qquad t = 1, 2, 3, 4.$$
Network output: $Y = (y_1, y_2, y_3, y_4)$.
Here $\beta_{ij}$, $\theta_j$, $v_{jt}$, $\gamma_t$ are the neural network parameters, whose values change continuously during training; $X$ is the network input vector, $Y$ is the network output vector, and the rest are intermediate variables of the network. $F_1(\cdot)$ and $F_2(\cdot)$ are the transfer functions of the hidden and output layers. $F_1(\cdot)$ is generally nonlinear; the invention uses a sigmoid function (the S-shaped growth curve), while $F_2(\cdot)$ is chosen to be linear. The nonlinearity of $F_1(\cdot)$ is the basis of the BP network's nonlinear mapping. For a fixed number of network neurons, different values of $\beta_{ij}$, $\theta_j$, $v_{jt}$, $\gamma_t$ determine different input-output mappings of the network; the BP network uses error back-propagation to adjust these values continuously during training, so that the network output approaches the target outputs in the training data.
Denoising and accurately reproducing the captured motion data is a complex nonlinear process. Training the BP network with a large amount of corresponding input and output data gives the trained network a nonlinear mapping capability close to that process; that is, the network structure models and identifies the complex nonlinear process, with $\beta_{ij}$, $\theta_j$, $v_{jt}$, $\gamma_t$ as the model parameters. The trained network can then be applied directly in the field.
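The forward pass above is small enough to write out directly; the sketch below follows one conventional reading of the equations (the weight shapes, the sign of the thresholds, and the random stand-in parameters are assumptions):

```python
# NumPy sketch of the 4-input, 4-output, p-hidden-neuron forward pass above.
import numpy as np

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

def forward(x, beta, theta, v, gamma):
    """x: (4,) rotation angles; beta: (4, p); theta: (p,); v: (p, 4); gamma: (4,)."""
    a = x                   # input layer: a_i = x_i
    s = a @ beta - theta    # hidden layer: s_j = sum_i beta_ij * a_i - theta_j
    b = sigmoid(s)          # b_j = F1(s_j), sigmoid transfer
    l = b @ v - gamma       # output layer: l_t = sum_j v_jt * b_j - gamma_t
    return l                # y_t = F2(l_t), F2 linear (identity)

# Example with p = 8 hidden neurons; random parameters stand in for trained ones.
rng = np.random.default_rng(0)
dof = forward(rng.normal(size=4),
              rng.normal(size=(4, 8)), rng.normal(size=8),
              rng.normal(size=(8, 4)), rng.normal(size=4))
```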
Step S218: generate a control signal based on the target degree-of-freedom parameters of the mechanical arm.
In this step, the current degree-of-freedom parameters of the mechanical arm must first be acquired; the mechanical arm is usually fitted with sensors, and processing the motion data they collect yields the current degree-of-freedom parameters. The difference between the target and current degree-of-freedom parameters is then calculated, and the control signal is generated from this difference; this is a common control-signal generation process.
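As an illustration of step S218, the simplest realization would be a command proportional to the degree-of-freedom error; the gain and command format below are assumptions, since the patent leaves the control law unspecified:

```python
# Sketch of step S218: per-joint commands from the DOF parameter difference.
# The proportional gain and the command format are illustrative assumptions.
def make_control_signal(target_dof, current_dof, gain=0.5):
    """Return per-joint velocity commands proportional to the DOF error."""
    return [gain * (t - c) for t, c in zip(target_dof, current_dof)]
```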
Step S220: control the motion of the live working robot with the control signal, so that the robot works by following the motion of the operator's arm.
In the control method above, the initialization data is used to judge whether the operator is ready, which reduces operating accidents; the control signals generated through the biological arm motion model and the action mapping model make the mechanical arm work by following the operator's arm motion, which improves the working efficiency of the live working robot.
An embodiment of the invention also provides another control method (also called a motion capture method) of a live working robot, realized on the basis of the control methods above; as shown in fig. 6, the method comprises the following steps:
Step S500: confirm the wearing positions of the operator's sensors, and model the person's two arms and elbows.
Step S502: acquire the sensor data obtained by motion capture, establish the coordinate systems, and determine the state of the biological arm.
Step S504: preprocess and deviation-correct the sensor data, and output the corresponding control data for the mechanical arm.
Specifically, the work flow of the control method is described below, taking a lead-wire connection operation performed by the live working robot as an example.
Fig. 7 shows the layout of the live working robot platform: the live working robot is on the left, and the operator with the motion capture equipment (corresponding to the motion acquisition device above) is on the right. On the robot platform, 1 is the built-in industrial personal computer; 2 is the special tool placing platform; 3 is the robot platform body; 4 is the double-mechanical-arm supporting structure; 51 and 52 are the first and second mechanical arms (distinguished as the left and right arms); 61 is the first camera, mounted on the first mechanical arm, and 62 the second camera, mounted on the second; 71 is the first end tool at the end of the first mechanical arm, and 72 the second end tool at the end of the second; 8 is the depth camera. On the right side, 9 is the handheld controller; 10 is the portable data processing module; 11 is the operator; 12 is the control panel (also called the operation panel); 13 is the double-bucket connecting mechanism; and 14 is the double insulating bucket.
In general, the live working process of the live working robot comprises: the insulating bucket arm vehicle moves into place; the system base is roughly positioned; the insulating bucket is adjusted to a suitable position; and it is manually judged whether the task can be completed autonomously. If so, the robot completes the task automatically; otherwise the motion capture device is started and the robot's arms follow the operator's arms. Once a mechanical arm is in place, the operator sends instructions with the handheld controller to control the motion of the end tool, which withdraws after the working action is finished; these actions are repeated until the task is complete, after which the operator resets both arms and the mechanical arms stop.
The operation process specifically comprises the following steps:
(1) The bucket arm vehicle moves into place and the double insulating bucket 14 is readied. The operator 11 confirms the task, selects and connects the lead-wire module on the control panel 12, initializes the first mechanical arm 51 and the second mechanical arm 52, selects a special-tool lead clamp as the first end tool 71 and a peeling tool as the second end tool 72, and prepares for the operation.
(2) The depth camera 8 scans the scene and matches its features against a database. The built-in industrial personal computer 1 processes the result and offers a choice of whether to operate autonomously; the operator confirms whether to proceed. If autonomous operation is possible, the robot works autonomously; if conditions do not allow autonomous operation, the motion capture system is started.
(3) After the operator puts the corresponding sensors on both arms, the motion capture system is started. The first mechanical arm 51 and the second mechanical arm 52 are set to their initial poses and await the next control command. The portable data processing module 10 starts working upon receiving the operator's action data and sends a ready signal to the control panel 12; the operator 11 receives the ready signal and confirms the connection with the handheld controller 9, completing the synchronization between the operator 11 and the mechanical arms.
(4) The operator 11 moves the left arm slowly, and the first mechanical arm 51 follows under this control until its first end tool 71 (the wire clamp) captures the lower drainage wire. The operator 11 uses the handheld controller 9 to send an actuation command to the first end tool 71, which performs the task of gripping the lower drainage wire. Meanwhile, the operator 11 controls the second mechanical arm 52 with the right arm to introduce the drainage wire handled by the first mechanical arm 51 (the left arm) into the second end tool 72 (the right end tool), and the right handheld controller 9 sends the second end tool 72 the commands for stripping and wire merging.
(5) After the left mechanical arm finishes the wire-feeding task, the operator 11 sends a reset instruction with the handheld controller 9; the data processing module stops sending left-arm commands to the industrial personal computer, and the left arm enters a standby state. The right arm then guides the second mechanical arm 52 to bring the lower drainage wire, initially treated by the second end tool 72, to the location of the drainage line to be connected, and the designated action command is sent so that the second end tool 72 completes the final working action. After sending this last command, the operator 11 may end the synchronized state of the second mechanical arm 52 and the right arm.
(6) Since the end-tool action may take several minutes, the operator 11 may issue a standby command so that the mechanical arm stops following, and the arm retracts automatically after the specified command is completed.
In this method, by modeling and analyzing the various working conditions of distribution-network live working, operation schemes for the mechanical arms are formulated for the different conditions; most foreseeable problems are avoided before the work starts, path planning is optimized, and the mechanical arms can work autonomously in permissible working environments, which greatly reduces operating time and improves working efficiency.
Corresponding to the method embodiments above, an embodiment of the invention further provides a control device of a live working robot, arranged on a controller; the controller is communicatively connected to the live working robot and to the motion acquisition device worn by an operator; as shown in fig. 8, the device comprises:
a data receiving module 700, configured to receive the operator's action data sent by the motion acquisition device;
a control signal generation module 702, configured to generate a control signal from the action data and a pre-established action mapping model;
and a control module 704, configured to control the motion of the live working robot with the control signal, so that the robot works by following the motion of the operator's arm.
The control device receives the operator's action data sent by the motion acquisition device, generates a control signal from the action data and the pre-established action mapping model, and controls the live working robot with the signal so that the robot works by following the operator's arm motion. By having the robot follow the operator's arm motion, this improves the robot's working efficiency and its practicability in harsh working environments.
Specifically, the live working robot includes a mechanical arm, and the control signal generation module is further configured to: calculate rotation angle information of the shoulder and elbow joints of the operator's arm from the action data, based on the preset biological arm motion model; input the rotation angle information into the action mapping model to obtain the target degree-of-freedom parameters of the mechanical arm; and generate the control signal based on the target degree-of-freedom parameters.
Further, the control signal generation module is also configured to: acquire the current degree-of-freedom parameters of the mechanical arm; calculate the difference between the target and current degree-of-freedom parameters; and generate the control signal based on that difference.
In an actual implementation, the action mapping model is built on a neural network and is established as follows: obtain training samples comprising rotation angle information of the shoulder and elbow joints of a number of arms and the corresponding target degree-of-freedom parameters of the mechanical arm; build the network structure of the neural network according to a preset model architecture; and train the network structure on the samples to obtain the action mapping model.
Further, the device also includes: an instruction receiving module for receiving a working-mode selection instruction sent by a user; an initialization data receiving module for receiving the initialization data sent by the motion acquisition device if the instruction indicates that the live working robot is to use the motion capture mode; an initial information calculation module for calculating initial rotation angle information of the shoulder and elbow joints of the operator's arm from the initialization data, based on the preset biological arm motion model; and a comparison module for comparing the initial rotation angle information with the preset initial action information, continuing to receive initialization data if they do not match, until the initial rotation angle information matches the initial action information and the live working robot is determined to be in the motion capture mode.
The implementation principle and technical effects of the control device are the same as those of the control method above; for brevity, where this embodiment is silent, refer to the corresponding content in the method embodiments.
Corresponding to the embodiments above, an embodiment of the invention further provides a working system of a live working robot; as shown in fig. 9, the system comprises a live working robot 80, a controller 81, and a motion acquisition device 82; the controller is communicatively connected to the live working robot and to the motion acquisition device; and the control device above is arranged in the controller.
The working system of the live working robot is also called a motion-capture control system of the live working robot. In a concrete implementation, the system may further include an insulating bucket arm vehicle and a double insulating bucket carried on the vehicle; the double insulating bucket comprises a first insulating bucket and a second insulating bucket; the live working robot and the controller are arranged in the first insulating bucket, and the motion acquisition device in the second insulating bucket, as shown in fig. 7.
The live working robot consists of mechanical arms, multiple sensors, slider modules, a quick tool-change device, a control cabinet, and other equipment; the sensors satisfy the recognition and perception requirements for position and angle, the working range of the dual-arm robot is optimized, and autonomous live working efficiency is improved.
The live working robot operating system based on motion capture control consists of an intelligent information acquisition system (corresponding to the motion acquisition device above), a data processing mechanism with an optimized algorithm, and a motion reproduction module that forward-resolves the decomposed motion data. The system basically comprises: information acquisition devices (such as the cameras mounted on the mechanical arms), signal transmission and processing devices (such as the Wi-Fi control unit above), the control panel, feedback and control modules, the synchronized mechanical arms, the handheld controllers, the sensors worn by the operator, and so on. Motion capture reproduces the motion of the operator's two arms, while the end-tool actions are issued as control commands from the handheld controller; this reduces the difficulty and computational load of motion capture and lets the robot's mechanical arms react faster. The handheld devices are divided into left-hand and right-hand units, with the relevant instructions and the necessary master-control instructions preset. After synchronization, the control panel activates the synchronous control device to prevent accidental touches. The joints of the mechanical arm controlled by the motion control system are driven separately by the data, enabling inverse motion decomposition that is not target-based; this increases flexibility in practical application and allows obstacles to be avoided flexibly according to the working conditions.
Based on analysis of the working characteristics and working environment of the distribution-network live working robot, the system can use vision and lidar as the main sensing system and achieve accurate identification and positioning of the target position through research on a data fusion model. Further, a data fusion mechanism can be established: the identification and positioning information, the operation control information, and the detection information are fused with one another, with each part decomposed into subtask modules (such as a positioning information module, an operation control information module, and a detection information module) whose processing is decided by the main controller, so that the robot finally performs live working efficiently and in an orderly manner.
The motion capture system is realized through mathematical modeling of the biological arm and an inertial-sensor-based information acquisition method; a data preprocessing model linking the acquired data to the actually required data implements the biological-arm control flow through motion capture.
In this way, the motion capture system models the biological arm effectively, so that the motion of the mechanical arm can be guided effectively, targets can be positioned accurately, and obstacles avoided; this improves the robot's adaptability to the environment and strengthens the automation of live working. Optimizing the model data with the BP neural network enhances robustness during work. The trained network has an associative capability and can be applied directly to field work; it effectively recognizes the operator's action intention, raises the intelligence of the motion capture system, and reduces the problem of excessive manual debugging time.
It should be noted that the motion capture system has various operation modes, and an operator can select motion capture devices with different operation principles according to actual application. The data processing module can move freely, the operation position can not only stand in the insulating hopper, but also can be used for remotely controlling the motion of the mechanical arm in a control room of effective information, the field condition can also be judged in advance, effective actions are designed, researched and recorded, and the mechanical arm can complete operation tasks according to the preset actions on the field.
Manual intervention during autonomous operation does not necessarily require a person standing in the insulating bucket; where conditions allow, the intervention can be performed from a remote system at another location.
The autonomous operation of the live working robot is accomplished on the basis of environment modeling, algorithm optimization and virtual experiments, but it is limited by the maturity of current technology and by insufficient adaptability to complex working environments. Manual intervention therefore remains necessary in actual operation; once the operating conditions mature, fully automatic operation will require no manual intervention.
An autonomous working robot controlled by the motion capture system adopts a modular, integrated design: the number of robot arms can be chosen according to the actual situation, and the two arms need not always work together. Where a single arm can complete the operation, that arm alone can be activated for the task, as in the sketch below.
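A toy illustration of this modular arm selection, with hypothetical configuration keys that are not drawn from the patent, could be as simple as:

    # Hypothetical modular arm activation: only the arms a task needs are enabled.
    ARM_INVENTORY = {"left": "arm-L", "right": "arm-R"}

    def activate_arms(task):
        """Return the arm identifiers to power up for this task."""
        required = task.get("arms", ["left"])  # single arm by default
        return [ARM_INVENTORY[name] for name in required if name in ARM_INVENTORY]

    print(activate_arms({"name": "strip insulated wire", "arms": ["left", "right"]}))
    print(activate_arms({"name": "tighten clamp"}))  # single-arm task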
Alternatively, a filtering algorithm can be used to achieve a degree of noise reduction and data fitting; similarly, other neural-network models or combinations of models may achieve a better motion recognition effect.
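For instance, an exponential moving-average filter, one of many possible filtering choices, could smooth the captured angle stream before it reaches the mapping model; the smoothing factor below is an arbitrary assumption:

    class AngleFilter:
        """Exponential moving-average filter for a stream of joint angles."""
        def __init__(self, alpha=0.3):  # alpha: assumed smoothing factor
            self.alpha = alpha
            self.state = None

        def update(self, sample):
            if self.state is None:
                self.state = dict(sample)
            else:
                for joint, angle in sample.items():
                    self.state[joint] += self.alpha * (angle - self.state[joint])
            return dict(self.state)

    # Usage: noisy capture frames in, smoothed frames out.
    f = AngleFilter()
    for frame in [{"elbow": 0.52}, {"elbow": 0.61}, {"elbow": 0.55}]:
        print(f.update(frame))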
As one scheme, the sensor data processing module in the embodiment of the invention is movable rather than integrated with the working robot, which preserves the flexibility of the operator's working scene.
The present embodiment also provides a controller. Fig. 10 is a schematic structural diagram of the controller; as shown in fig. 10, the apparatus includes a processor 1201 and a memory 1202. The memory 1202 is used for storing one or more computer instructions, and the one or more computer instructions are executed by the processor to implement the control method of the live working robot.
The implementation apparatus shown in fig. 10 further includes a bus 1203 and a forwarding chip 1204; the processor 1201, the forwarding chip 1204 and the memory 1202 are connected through the bus 1203. The message transmission implementation device may be a network edge device.
The memory 1202 may include a high-speed Random Access Memory (RAM) and may also include a non-volatile memory, such as at least one disk memory. The bus 1203 may be an ISA bus, a PCI bus, an EISA bus, or the like, and may be divided into an address bus, a data bus and a control bus. For ease of illustration, only one double-headed arrow is shown in fig. 10, but this does not mean that there is only one bus or one type of bus.
The forwarding chip 1204 is configured to be connected to at least one user terminal and other network units through a network interface, and send the packaged IPv4 message or IPv6 message to the user terminal through the network interface.
The processor 1201 may be an integrated circuit chip having signal processing capability. In implementation, the steps of the above method may be completed by integrated logic circuits of hardware in the processor 1201 or by instructions in the form of software. The processor 1201 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP) and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or execute the methods, steps and logic blocks disclosed in the embodiments of the present invention. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in connection with the embodiments of the present invention may be executed directly by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well known in the art, such as a RAM, a flash memory, a ROM, a PROM or an EPROM, or a register. The storage medium is located in the memory 1202, and the processor 1201 reads the information in the memory 1202 and completes the steps of the method of the foregoing embodiments in combination with its hardware.
An embodiment of the present invention further provides a machine-readable storage medium storing machine-executable instructions which, when called and executed by a processor, cause the processor to implement the above control method of the live working robot; for the specific implementation, reference may be made to the method embodiments, which will not be repeated here.
The control device and the implementation apparatus of the live working robot provided by the embodiments of the present invention have the same implementation principle and technical effects as the foregoing method embodiments; for brevity, where the device embodiments are not mentioned, reference may be made to the corresponding content of the method embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. The apparatus embodiments described above are merely illustrative, and the flowcharts and block diagrams in the figures, for example, illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, each functional module or unit in each embodiment of the present invention may be integrated together to form an independent part, or each module may exist alone, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present disclosure may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present disclosure. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
Finally, it should be noted that the above-mentioned embodiments are merely specific embodiments of the present disclosure, used to illustrate its technical solutions and not to limit them, and the scope of the present disclosure is not limited thereto. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person familiar with the art may, within the technical scope of the present disclosure, still modify the technical solutions described in the foregoing embodiments, easily conceive of changes, or make equivalent substitutions of some technical features; such modifications, changes or substitutions do not depart from the spirit and scope of the embodiments of the present disclosure and shall be covered by it. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (9)

1. A control method of a live working robot, characterized in that the method is applied to a controller; the controller is respectively in communication connection with the live working robot and the motion acquisition equipment; the motion acquisition equipment is worn on an operator; the method comprises the following steps:
receiving the action data of the operator sent by the motion acquisition equipment;
generating a control signal according to the action data and a pre-established action mapping model;
controlling the action of the live working robot through the control signal so that the live working robot works according to the action of the arm of the operator;
before receiving the action data of the operator sent by the motion acquisition equipment, the method further comprises the following steps:
receiving a working mode selection instruction sent by a user;
if the working mode selection instruction indicates that the live working robot adopts a motion capture mode, receiving initialization data sent by the motion acquisition equipment;
based on a preset biological arm motion model, calculating initial rotation angle information of shoulder joints and elbow joints of the arms of the operator through the initialization data;
comparing the initial rotation angle information with preset initial action information;
and if the initial rotation angle information is not matched with the initial action information, continuing to execute the step of receiving initialization data until the initial rotation angle information is matched with the initial action information.
2. The method according to claim 1, wherein the live working robot comprises a robot arm; generating a control signal according to the action data and a pre-established action mapping model, wherein the step comprises the following steps:
based on a preset biological arm motion model, calculating rotation angle information of shoulder joints and elbow joints of the arms of the operator according to the motion data;
inputting the rotation angle information into a pre-established action mapping model to obtain a target degree of freedom parameter of the mechanical arm;
and generating a control signal based on the target degree of freedom parameter of the mechanical arm.
3. The method of claim 2, wherein the step of generating a control signal based on the target degree of freedom parameter of the robotic arm comprises:
acquiring a current degree of freedom parameter of the mechanical arm;
calculating a degree-of-freedom parameter difference between the target degree-of-freedom parameter and the current degree-of-freedom parameter;
generating a control signal based on the degree of freedom parameter difference.
4. The method of claim 2, wherein the action mapping model is built based on a neural network; the action mapping model is established in the following way:
obtaining a training sample; the training sample comprises rotation angle information of shoulder joints and elbow joints of a plurality of arms and target freedom degree parameters of mechanical arms corresponding to the rotation angle information;
establishing a network structure of a neural network according to a preset model architecture;
and inputting the training sample into the network structure for training to obtain the action mapping model.
5. A control device of a live working robot is characterized in that the device is arranged on a controller; the controller is respectively in communication connection with the live working robot and the motion acquisition equipment; the motion acquisition equipment is worn on an operator; the device comprises:
the data receiving module is used for receiving the action data of the operator sent by the motion acquisition equipment;
the control signal generation module is used for generating a control signal according to the action data and a pre-established action mapping model;
the control module is used for controlling the action of the live working robot through the control signal, so that the live working robot works according to the action of the arm of the operator;
the device further comprises:
the instruction receiving module is used for receiving a working mode selection instruction sent by a user;
the initialization data receiving module is used for receiving initialization data sent by the motion acquisition equipment if the working mode selection instruction indicates that the live working robot adopts a motion capture mode;
the initial information calculation module is used for calculating initial rotation angle information of shoulder joints and elbow joints of the arms of the operator through the initialization data based on a preset biological arm motion model;
the comparison module is used for comparing the initial rotation angle information with preset initial action information; and if the initial rotation angle information is not matched with the initial action information, continuing to execute the step of receiving initialization data until the initial rotation angle information is matched with the initial action information.
6. The live working robot operating system is characterized by comprising a live working robot, a controller and motion acquisition equipment; the controller is respectively in communication connection with the live working robot and the motion acquisition equipment; the device of claim 5 disposed in the controller.
7. The system of claim 6, further comprising an insulating bucket-arm vehicle and a double insulating bucket; the double insulating bucket is mounted on the insulating bucket-arm vehicle;
the double insulating bucket comprises a first insulating bucket and a second insulating bucket; the live working robot and the controller are arranged in the first insulating bucket; the motion acquisition equipment is arranged in the second insulating bucket.
8. A controller comprising a processor and a memory, the memory storing machine executable instructions executable by the processor, the processor executing the machine executable instructions to implement the method of any one of claims 1 to 4.
9. A machine-readable storage medium having stored thereon machine-executable instructions which, when invoked and executed by a processor, cause the processor to implement the method of any of claims 1 to 4.
CN201911288390.6A 2019-12-12 2019-12-12 Control method, device and system of live working robot Expired - Fee Related CN111085996B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911288390.6A CN111085996B (en) 2019-12-12 2019-12-12 Control method, device and system of live working robot

Publications (2)

Publication Number Publication Date
CN111085996A CN111085996A (en) 2020-05-01
CN111085996B (en) 2021-05-07

Family

ID=70396157

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911288390.6A Expired - Fee Related CN111085996B (en) 2019-12-12 2019-12-12 Control method, device and system of live working robot

Country Status (1)

Country Link
CN (1) CN111085996B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111923009A (en) * 2020-09-14 2020-11-13 天津滨电电力工程有限公司 IMU-based control device and method for live-wire work robot of distribution line
CN112894831B (en) * 2021-04-21 2022-08-23 广东电网有限责任公司电力科学研究院 Double-arm robot insulated wire stripping system and method
CN116587327B (en) * 2023-06-20 2024-06-18 广东电网有限责任公司广州供电局 Motion control system, live working robot detection method and related equipment
CN116619422A (en) * 2023-06-21 2023-08-22 哈尔滨理工大学 Motion capture and wrist joint angle estimation mechanical arm mapping method based on human body characteristic signals

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102229146B (en) * 2011-04-27 2013-05-08 北京工业大学 Remote control humanoid robot system based on exoskeleton human posture information acquisition technology
CN102615637B (en) * 2012-04-01 2014-08-27 山东电力研究院 Master-slave control robot work platform for high-voltage live working
CN202556399U (en) * 2012-04-01 2012-11-28 山东电力研究院 High-voltage live-working robot device
US9586318B2 (en) * 2015-02-27 2017-03-07 Toyota Motor Engineering & Manufacturing North America, Inc. Modular robot with smart device
CN108858121A (en) * 2018-07-27 2018-11-23 国网江苏省电力有限公司徐州供电分公司 Hot line robot and its control method
CN110421559B (en) * 2019-06-21 2021-01-05 国网安徽省电力有限公司淮南供电公司 Teleoperation method and motion track library construction method of distribution network live working robot

Also Published As

Publication number Publication date
CN111085996A (en) 2020-05-01

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
Granted publication date: 20210507