CN101432103B - Apparatus for controlling robot arm - Google Patents


Info

Publication number
CN101432103B
CN101432103B (application CN2007800149419A)
Authority
CN
China
Prior art keywords
collision
robot arm
people
joint portion
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN2007800149419A
Other languages
Chinese (zh)
Other versions
CN101432103A (en)
Inventor
冈崎安直
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Panasonic Holdings Corp
Original Assignee
Matsushita Electric Industrial Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Matsushita Electric Industrial Co Ltd
Publication of CN101432103A
Application granted
Publication of CN101432103B
Legal status: Active (current)
Anticipated expiration: legal status listed


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1674 Programme controls characterised by safety, monitoring, diagnostic
    • B25J9/1676 Avoiding collision or forbidden zones
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 Controls for manipulators
    • B25J13/08 Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J13/081 Touching devices, e.g. pressure-sensitive
    • B25J13/084 Tactile sensors
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/39 Robotics, robotics to robotics hand
    • G05B2219/39201 Control of joint stiffness
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/39 Robotics, robotics to robotics hand
    • G05B2219/39347 Joint space impedance control
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/40 Robotics, robotics mapping to robotics vision
    • G05B2219/40203 Detect position of operator, create non material barrier to protect operator

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Manipulator (AREA)

Abstract

A control apparatus (1) controls a robot arm (8) by means of a unit (4) for controlling a collision-handling operation between a human and the robot arm (8): when a human proximity detecting means (3) detects the approach of a human, based on the human movement detected by a human movement detecting means (2), the mechanical impedance of each joint of the robot arm (8) is set individually.

Description

Control device for a robot arm
Technical field
The present invention relates to a control device and a control method for a robot arm that may come into physical contact with a person, such as the arm of a domestic robot, as well as to a robot equipped with such an arm and to a control program for the robot arm.
Background art
In recent years, the development of domestic robots such as pet robots has been very active, and practical household robots such as housework-support robots are expected to come into use in the future. Since a domestic robot enters the home and lives together with people, physical contact with people is unavoidable, and from the standpoint of safety the robot therefore needs to be compliant.

To address this problem, Patent Document 1 (JP-A H10-329071) discloses a control device that detects the contact force a person applies to a robot arm: when a large force is applied to the arm, the restoring force is reduced to improve safety, and when a small force is applied, the restoring force is increased to maintain operating accuracy.

Patent Document 2 discloses a control device which, as shown in Fig. 11A, comprises: a sensor mechanism 515 that detects obstacles in the surrounding environment and the contact point and contact force of anything touching a robot 513; a posture measuring device 514 that measures the postures of a moving mechanism 511 and a manipulator 512; a computer 517 that calculates a contact-avoidance motion of the robot 513 from the contact force and the postures; and a driver 516 that drives the moving mechanism 511 and the manipulator 512 according to the calculation results of the computer 517. When the robot 513 contacts the obstacle-containing environment, an avoidance motion is performed through the cooperative action of the moving mechanism 511 and the manipulator 512.

Patent Document 3 discloses an impedance control device which, as shown in Fig. 12, detects the force acting from the external environment on a robot 401 having a hand-end effector 411 by means of a force sensor 412, estimates the stiffness coefficient of the external environment in real time, and drives and controls the arm 410 of the robot 401 through a motor 413a.

Patent Document 1: JP-A H10-329071
Patent Document 2: JP-A 2005-59161
Patent Document 3: JP-A 2004-223663
However, the conventional control devices described above do not consider application to an articulated robot arm; for an articulated robot arm they cannot serve as control devices that guarantee safety.

Moreover, these conventional control devices do not take into account the movement of the person who makes contact, and therefore cannot provide the control best suited to the person's movement.

Further, in Patent Document 2, as shown in Figs. 11B and 11C, the contact point 532 between the robot 513 and the obstacle-containing environment is moved along an avoidance path 526 in the direction of the contact force 531 as the avoidance motion 527, reducing the contact force 531 and finally breaking contact with the environment. Likewise, as shown in Figs. 11D and 11E, the contact surface between the robot 513 and the environment is tilted in the direction of the moment 533 generated by the contact force 531, and the contact point is moved so as to describe a circle 526 of arbitrary size as the avoidance motion 527, again reducing the contact force 531 and finally breaking contact. However, nothing concrete is disclosed about how each avoidance motion 527 is actually controlled.

As for Patent Document 3, as shown in Fig. 12, the impedance characteristics on the robot side are changed in correlation with the calculated stiffness coefficient of the external environment, and these values are used to calculate the position of the hand-end effector 411. The document is concerned only with the position of the hand-end effector 411, and discloses nothing about concrete motion control of the arm 410 for avoiding contact injury to a person.
Summary of the invention
The object of the present invention is to solve the problems of the conventional control devices described above and to provide a control device and control method for a robot arm, a robot, and a control program for a robot arm, with which even an articulated robot arm can contact a person safely, can perform the contact motion best suited to the person's movement, and can coexist with people without inflicting contact injury, thereby realizing safe motion control of the robot arm.

To achieve this object, the present invention is configured as follows.
According to a 1st aspect of the present invention, there is provided a control device for an articulated robot arm, comprising:

a collision position acquiring unit that acquires the collision position between a person or moving body and the articulated robot arm; and

a collision-handling motion control unit that, based on the collision position acquired by the collision position acquiring unit, controls a collision-handling motion in which the stiffness of the joint portion nearer to the base than the link of the articulated robot arm at the collision position is made lower than the stiffness of the other joint portions.

According to a 9th aspect of the present invention, there is provided a control method for an articulated robot arm, wherein:

the movement of a person or moving body approaching the robot arm is detected by a movement detection unit;

based on the movement detection result of the person or moving body detected by the movement detection unit, the approach of the person or moving body to the robot arm is detected by an approach detection unit; and

when the approach detection unit detects the approach of the person or moving body, the collision position between the person or moving body and the articulated robot arm is estimated based on the movement detected by the movement detection unit and is acquired by a collision position acquiring unit, and, by means of a collision-handling motion control unit, the stiffness of the joint portion nearer to the base than the link of the articulated robot arm at the acquired collision position is controlled to be lower than the stiffness of the other joint portions, thereby performing a collision-handling motion corresponding to the collision between the person and the robot arm.

According to a 10th aspect of the present invention, there is provided a robot comprising:

the articulated robot arm; and

the control device for a robot arm according to any one of the 1st to 8th aspects, which controls the robot arm;

wherein the collision-handling motion corresponding to the collision between the person or moving body and the robot arm is performed under the control of the collision-handling motion control unit.

According to an 11th aspect of the present invention, there is provided a control program for a robot arm that causes a computer to function as:

a collision position acquiring unit that acquires the collision position between a person or moving body and an articulated robot arm; and

a collision-handling motion control unit that, based on the collision position acquired by the collision position acquiring unit, controls a collision-handling motion in which the stiffness of the joint portion nearer to the base than the link of the articulated robot arm at the collision position is made lower than the stiffness of the other joint portions.
According to the control device and control method for a robot arm, the robot, and the control program for a robot arm of the present invention, the collision position acquiring unit and the collision-handling motion control unit (or their functions) are provided. Therefore, when a person approaches the robot arm, the arm can take a suitable posture matched to the person's movement, or act with a suitable mechanical impedance at each joint. In other words, based on the collision position between the person or moving body and the articulated robot arm, the stiffness of the joint nearer to the base than the link at the collision position is made lower than that of the other joints, so that the base-side joint moves easily and absorbs the force applied at the moment of collision; the impact force imposed on the person is thereby reduced and safety is improved. Thus even an articulated robot arm can contact a person safely and perform the contact motion best suited to the person's movement, realizing safe motion control of a robot arm that coexists with people without inflicting contact injury.
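The stiffness rule described above can be illustrated with a short sketch. This code is not from the patent itself: the function name, the stiffness values, and the convention that joint i drives link i (0 = base) are all assumptions made for illustration.

```python
def assign_joint_stiffness(collided_link, n_joints, k_normal=100.0, k_soft=10.0):
    """Return a per-joint stiffness list for a serial arm.

    Joints at or nearer the base than the collided link are softened so
    they yield and absorb the impact; the remaining (wrist-side) joints
    keep their normal stiffness. Indexing convention (an assumption):
    joint i drives link i, with 0 at the base.
    """
    stiffness = []
    for joint in range(n_joints):
        if joint <= collided_link:
            stiffness.append(k_soft)    # base-side joints: low stiffness
        else:
            stiffness.append(k_normal)  # wrist-side joints: normal stiffness
    return stiffness
```

For example, a collision predicted on the 2nd link (index 1) would soften the first two joints while the wrist joints keep their normal stiffness.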
Description of drawings
The above objects and features of the present invention will become clear from the following description of preferred embodiments, given with reference to the accompanying drawings, in which:
Fig. 1 is an explanatory diagram showing the configuration of the control device for a robot arm in a 1st embodiment of the present invention and the configuration of the robot arm that is the object of control.
Fig. 2 is a schematic diagram explaining the collision warning region of the robot arm in the control device of the robot arm in the 1st embodiment.
Fig. 3A is a plan view explaining the functions of the human movement detection unit and the collision position estimation unit of the control device in the 1st embodiment.
Fig. 3B is a plan view explaining the functions of the human movement detection unit and the collision position estimation unit of the control device in the 1st embodiment.
Fig. 3C is a plan view explaining the functions of the human movement detection unit and the collision position estimation unit of the control device in the 1st embodiment.
Fig. 3D is a plan view explaining the functions of the human movement detection unit and the collision position estimation unit of the control device in the 1st embodiment.
Fig. 4 is a block diagram showing the detailed configuration of the motion control unit of the control device in the 1st embodiment.
Fig. 5 is a flowchart showing the operating steps of the control program in the motion control unit of the control device in the 1st embodiment.
Fig. 6 is a flowchart showing the overall operating steps of the control device in the 1st embodiment.
Fig. 7 is a schematic diagram explaining the target trajectory of the control device of the robot arm in the 1st embodiment.
Fig. 8A is a schematic diagram showing the configuration of the control device for a robot arm in a 2nd embodiment of the present invention and the configuration of the robot arm that is the object of control.
Fig. 8B is a block diagram showing the detailed configuration of the motion control unit of the control device in the 2nd embodiment.
Fig. 9A is a schematic diagram explaining the operation of the control device of the robot arm in the 2nd embodiment.
Fig. 9B is a schematic diagram explaining the operation of the control device of the robot arm in the 2nd embodiment.
Fig. 9C is a schematic diagram explaining the operation of the control device of the robot arm in the 2nd embodiment.
Fig. 9D is a schematic diagram explaining the operation of the control device of the robot arm in the 2nd embodiment.
Fig. 10 is a schematic diagram showing another configuration of the robot arm that is the object of control of the control device in the 2nd embodiment.
Fig. 11A is a schematic configuration diagram of the conventional robot control device of Patent Document 2.
Fig. 11B is a schematic diagram showing a concrete example of the contact-avoidance motion in the robot control device of Fig. 11A.
Fig. 11C is a schematic diagram showing the same concrete example of the contact-avoidance motion in the robot control device of Fig. 11A.
Fig. 11D is a schematic diagram showing another concrete example of the contact-avoidance motion in the robot control device of Fig. 11A.
Fig. 11E is a schematic diagram showing the same other concrete example of the contact-avoidance motion in the robot control device of Fig. 11A.
Fig. 12 is a block diagram of the conventional robot control device of Patent Document 3.
Detailed description of the embodiments
Before the description of the present invention proceeds, it is to be noted that like parts are designated by like reference numerals throughout the accompanying drawings.

Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. Before the detailed description of the embodiments, however, various aspects of the present invention are first explained.
According to the 1st aspect of the present invention, there is provided a control device for an articulated robot arm, comprising:

a collision position acquiring unit that acquires the collision position between a person or moving body and the articulated robot arm; and

a collision-handling motion control unit that, based on the collision position acquired by the collision position acquiring unit, controls a collision-handling motion in which the stiffness of the joint portion nearer to the base than the link of the articulated robot arm at the collision position is made lower than the stiffness of the other joint portions.

According to a 2nd aspect of the present invention, there is provided the control device for a robot arm according to the 1st aspect, wherein the collision-handling motion control unit controls the collision-handling motion so that the stiffness of the joint portion nearer to the base than the link of the articulated robot arm at the collision position is lower than the stiffness of the other joint portions.

According to a 3rd aspect of the present invention, there is provided the control device for a robot arm according to the 1st or 2nd aspect, wherein the collision-handling motion control unit controls the collision-handling motion by maintaining the stiffness of the joint portions nearer to the wrist than the link of the articulated robot arm at the collision position, or by giving those joint portions a higher stiffness.

According to a 4th aspect of the present invention, there is provided the control device for a robot arm according to the 1st or 2nd aspect, wherein the collision-handling motion control unit controls the collision-handling motion by controlling the joint angle of each joint portion nearer to the wrist than the link of the articulated robot arm at the collision position.
According to a 5th aspect of the present invention, there is provided the control device for a robot arm according to the 1st or 2nd aspect, further comprising:

a movement detection unit that detects the movement of the person or moving body;

wherein the collision position acquiring unit estimates the collision position between the person or moving body and the articulated robot arm based on the detection result of the movement detection unit, thereby acquiring the collision position; and

the collision-handling motion control unit controls the collision-handling motion based on the collision position estimated by the collision position acquiring unit.

According to a 6th aspect of the present invention, there is provided the control device for a robot arm according to the 5th aspect, further comprising:

an approach detection unit that detects the approach of the person or moving body to the robot arm based on the detection result of the movement detection unit;

wherein the collision-handling motion control unit controls the collision-handling motion when the approach detection unit detects the approach of the person or moving body.

According to a 7th aspect of the present invention, there is provided the control device for a robot arm according to the 5th or 6th aspect, wherein the collision-handling motion control unit controls the collision-handling motion based on the velocity component of the movement of the person or moving body detected by the movement detection unit.

According to an 8th aspect of the present invention, there is provided the control device for a robot arm according to any one of the 5th to 7th aspects, wherein the movement detection unit detects the position and moving speed of the person or moving body, thereby detecting the movement of the person.
According to the 9th aspect of the present invention, there is provided a control method for an articulated robot arm, wherein:

the movement of a person or moving body approaching the robot arm is detected by a movement detection unit;

based on the movement detection result of the person or moving body detected by the movement detection unit, the approach of the person or moving body to the robot arm is detected by an approach detection unit; and

when the approach detection unit detects the approach of the person or moving body, the collision position between the person or moving body and the articulated robot arm is estimated based on the detected movement and is acquired by a collision position acquiring unit, and a collision-handling motion control unit controls the stiffness of the joint portion nearer to the base than the link of the articulated robot arm at the acquired collision position to be lower than the stiffness of the other joint portions, thereby performing a collision-handling motion corresponding to the collision between the person and the robot arm.

According to the 10th aspect of the present invention, there is provided a robot comprising the articulated robot arm and the control device for a robot arm according to any one of the 1st to 8th aspects, which controls the robot arm, wherein the collision-handling motion corresponding to the collision between the person or moving body and the robot arm is performed under the control of the collision-handling motion control unit.

According to the 11th aspect of the present invention, there is provided a control program for a robot arm that causes a computer to function as: a collision position acquiring unit that acquires the collision position between a person or moving body and an articulated robot arm; and a collision-handling motion control unit that, based on the collision position acquired by the collision position acquiring unit, controls a collision-handling motion in which the stiffness of the joint portion nearer to the base than the link of the articulated robot arm at the collision position is made lower than the stiffness of the other joint portions.
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.
(1st Embodiment)
Fig. 1 is a schematic diagram showing the configuration of the control device 1 for an articulated robot arm in the 1st embodiment of the present invention and the configuration of the articulated robot arm 8 that is its object of control. The control device 1 implements the robot arm control method of the 1st embodiment; it can be used in a robot having the articulated robot arm 8 and, as described later, can also be realized as a control program for the articulated robot arm 8.
In Fig. 1, reference numeral 2 denotes a human movement detection unit as an example of a movement detection unit. From image data captured by an image pickup device 36 such as a camera, the human movement detection unit 2 performs image recognition processing and detects the position, moving speed, and moving direction of a person 38 near the robot arm 8.
Reference numeral 3 denotes a human approach detection unit as an example of an approach detection unit. From the relation between the position information of the person detected by the human movement detection unit 2 and the posture information of the robot arm 8 obtained from the joint angle information supplied by a motion control unit 7 described later, it detects the approach of the person 38 to the robot arm 8.
Reference numeral 5 denotes a collision position estimation unit as an example of a collision position acquiring unit. From the position information of the approaching person 38 detected by the human movement detection unit 2 and the person's moving velocity vector (moving speed and moving direction), it estimates the expected collision position of the robot arm 8 and the person 38 (which can be treated as the collision position), and outputs the estimation result to a collision-handling motion generation unit 6 described later.
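The estimation step can be illustrated with a simple stand-in: extrapolate the person's position along the detected velocity vector and take the point on an arm link closest to that path as the expected collision position. All names, the time horizon, and the sampling step below are assumptions for illustration, not the patent's actual algorithm.

```python
def predict_collision_point(person_pos, person_vel, link_start, link_end,
                            horizon=2.0, dt=0.05):
    """Extrapolate the person's straight-line path over `horizon` seconds and
    return the point on the link segment closest to that path."""
    def closest_on_segment(p):
        # Project p onto the segment link_start-link_end, clamped to its ends.
        ax, ay = link_start
        bx, by = link_end
        abx, aby = bx - ax, by - ay
        denom = abx * abx + aby * aby
        t = 0.0 if denom == 0 else max(0.0, min(1.0,
            ((p[0] - ax) * abx + (p[1] - ay) * aby) / denom))
        return (ax + t * abx, ay + t * aby)

    best, best_d2 = None, float("inf")
    for i in range(int(horizon / dt) + 1):
        t = i * dt
        p = (person_pos[0] + person_vel[0] * t, person_pos[1] + person_vel[1] * t)
        c = closest_on_segment(p)
        d2 = (p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2
        if d2 < best_d2:
            best, best_d2 = c, d2
    return best
```

A person walking straight toward the middle of a link yields that midpoint as the expected collision position.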
Reference numeral 4 denotes the collision-handling motion control unit. It controls the motion of the robot arm 8 when the robot arm 8 performs work (specifically, the value of the mechanical impedance of the robot arm 8), and when the human approach detection unit 3 detects the approach of the person 38 to the robot arm 8, it controls the collision-handling motion for the collision between the person 38 and the robot arm 8 based on the detection result of the human movement detection unit 2.
The collision-handling motion control unit 4 is composed of the collision-handling motion generation unit 6 and the motion control unit 7.
The collision-handling motion generation unit 6 calculates, from the expected collision position estimated by the collision position estimation unit 5, the joint target trajectory of the robot arm 8 for the moment the robot arm 8 contacts the person 38 and the target mechanical impedance values of the joints, and outputs the calculation results to the motion control unit 7.
The motion control unit 7 controls the hand position and posture of the robot arm 8 so that the hand of the robot arm 8 accomplishes the target task. When the person 38 approaches, the control is switched so as to follow the joint target trajectory generated by the collision-handling motion generation unit 6 while performing impedance control, so that the mechanical impedance value of each joint of the robot arm 8 reaches the joint target mechanical impedance value generated by the collision-handling motion generation unit 6. As described later, each joint of the robot arm 8 is driven under impedance control, via a motor driver 19 connected to a D/A converter 21 and an A/D converter 22, by a motor 34 as an example of the drive device of each joint, so that each joint performs a bending motion.
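Joint-space impedance control of the kind mentioned here is commonly written as tau = -K(q - q_d) - D q_dot, with K and D the target stiffness and damping of each joint. The following is a generic textbook sketch of that law with hypothetical names, not the patent's exact controller:

```python
def impedance_torque(q, dq, q_target, k, d):
    """Per-joint impedance control law: tau_i = -k_i (q_i - q_target_i) - d_i dq_i.

    q, dq: current joint angles and angular velocities; q_target: target
    angles; k, d: per-joint stiffness and damping. Lowering k[i] makes
    joint i compliant, which is what the collision-handling motion requires
    for the base-side joints.
    """
    return [-k[i] * (q[i] - q_target[i]) - d[i] * dq[i] for i in range(len(q))]
```

With a low k[i], a joint pushed away from its target produces only a small restoring torque, so it yields to the collision instead of resisting it.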
The control device 1 is constituted in hardware by a general-purpose personal computer. Except for the input/output interface (I/O IF) 20, the human movement detection unit 2, the human approach detection unit 3, the collision position estimation unit 5, the collision-handling motion generation unit 6, and the motion control unit 7 are realized in software as a control program 18 executed by the computer.
The input/output IF 20 is composed of the D/A converter 21, the A/D converter 22, and a counter converter 23 connected to an expansion slot, such as a PCI bus, of the computer.
The control device 1 functions by executing the control program 18 for controlling the motion of the robot arm 8. Joint angle information output by encoders 35 mounted on the motors 34 of the joints of the robot arm 8 is input to the control device 1 through the counter converter 23, and the control device 1 calculates control command values for the rotational motion of each joint. The calculated control command values are sent through the D/A converter 21 to the motor driver 19, which drives the motor 34 of each joint of the robot arm 8 according to the command values sent from it.
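The read-compute-write cycle described in this paragraph can be sketched as a fixed-period loop. The callback names below are hypothetical stand-ins for the counter-converter read, the control law, and the D/A-to-motor-driver write path:

```python
import time

def control_loop(read_encoders, write_commands, compute_commands,
                 period=0.001, steps=1000):
    """Fixed-period digital control loop sketch.

    read_encoders():    returns current joint angles (counter converter).
    compute_commands(q): control law, e.g. an impedance controller.
    write_commands(u):  sends command values (D/A converter -> motor driver).
    """
    for _ in range(steps):
        q = read_encoders()      # read joint angles from the encoders
        u = compute_commands(q)  # compute control command values
        write_commands(u)        # send commands to the motor driver
        time.sleep(period)       # wait for the next control tick
```

In a real controller the tick would come from a hardware timer rather than `time.sleep`, but the structure is the same.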
The robot arm 8 is a 4-degree-of-freedom multi-link manipulator having: a hand 9; a 2nd link 11 with a wrist portion 10 at its tip, to which the hand 9 is attached; a 1st link 12 rotatably coupled to the 2nd link 11; and a base portion 13 that rotatably supports the 1st link 12. The wrist portion 10 has the two rotation axes of a 3rd joint 16 and a 4th joint 17 (a vertical rotation axis and a horizontal rotation axis), so that the relative posture (orientation) of the hand 9 with respect to the 2nd link 11 can be changed. The other end of the 2nd link 11 can rotate about the vertical rotation axis of a 2nd joint 15 with respect to the tip of the 1st link 12, and the other end of the 1st link 12 can rotate about the vertical rotation axis of a 1st joint 14 with respect to the base portion 13; in total the arm can rotate about 4 axes, constituting the above 4-degree-of-freedom multi-link manipulator.
Each joint constituting a rotating portion is provided with: a motor 34 (in practice located inside each joint of the robot arm 8) as an example of a rotary drive device, mounted on one of the two members coupled by the joint and drive-controlled by the motor driver 19 described later; and an encoder 35 (in practice located inside each joint of the robot arm 8) that detects the rotational phase angle (i.e. the joint angle) of the rotation shaft of the motor 34. The rotation shaft of the motor 34 is coupled to the other of the two members, and by rotating the shaft forward or backward, the other member is rotated about each axis relative to the first member.
Reference numeral 32 denotes the absolute coordinate system, fixed in relative positional relation to the lower fixed portion of the base portion 13, with its origin at O0; 33 denotes the hand coordinate system, fixed in relative positional relation to the hand 9. The origin position Oe (x, y) of the hand coordinate system 33 as seen from the absolute coordinate system 32 is defined as the hand position of the robot arm 8, and the posture (φ, θ) of the hand coordinate system 33 as seen from the absolute coordinate system 32 is defined as the hand posture of the robot arm 8; the hand position/posture vector r is defined as r = [x, y, φ, θ]^T. When controlling the hand position and posture of the robot arm 8, the control device 1 performs control so that the hand position/posture vector r follows the target hand position/posture vector r_wd.
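For the two vertical-axis joints, the planar part (x, y) of the hand position in the absolute coordinate system follows from standard two-link forward kinematics. The sketch below assumes the usual textbook formula and omits the wrist posture angles (φ, θ); the function name and argument layout are assumptions for illustration:

```python
import math

def hand_position(q1, q2, L1, L2):
    """Planar forward kinematics for the 1st and 2nd (vertical-axis) joints:
    returns the (x, y) position of the wrist in the absolute coordinate
    system, given joint angles q1, q2 (radians) and link lengths L1, L2."""
    x = L1 * math.cos(q1) + L2 * math.cos(q1 + q2)
    y = L1 * math.sin(q1) + L2 * math.sin(q1 + q2)
    return x, y
```

With both joints at zero the arm is stretched out along the x axis, so the wrist sits at (L1 + L2, 0).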
Next, the detailed configuration and operation of each functional block constituting the control device 1 will be described.
Image data captured by the image pickup device 36 is input to the person motion detection unit 2. Through image recognition processing in the person motion detection unit 2, the two-dimensional position X_p(x_p, y_p) of the person 38 as seen looking down from the ceiling of the room in which robot arm 8 is arranged, and the moving velocity vector V_p(v_x, v_y) of the person 38, are detected; the person motion detection unit 2 outputs this position and velocity information to the person approach detection unit 3 and the collision position estimation unit 5.
The person approach detection unit 3 defines the collision warning region 100 shown in Fig. 2, and detects and judges whether the person 38 has entered this region 100, that is, whether the person 38 is approaching robot arm 8.
The sector region 100A on the base end side of the collision warning region 100 is the region whose radius is α_1 times the length L_1 of the 1st link 12, i.e. α_1 L_1, centered on the 1st joint axis (the rotation axis of the 1st joint portion 14), and spanning an angle of ±β_1° about the central axis of the 1st link 12 (at the joint angle q_1 of the 1st joint portion 14). Similarly, the sector region 100B on the front end side of the collision warning region 100 is the region whose radius is α_2 times the sum of the length L_2 of the 2nd link 11 and the length L_H of hand 9, i.e. α_2(L_2 + L_H), centered on the 2nd joint axis (the rotation axis of the 2nd joint portion 15), and spanning an angle of ±β_2° about the central axis of the 2nd link 11 (at the joint angle q_2 of the 2nd joint portion 15). Thus, even when the sector regions are enlarged in proportion to speed, the fingertip of hand 9 can reliably be prevented from extending outside the front sector region 100B.
The values α_1, α_2, β_1, β_2 defining the collision warning region 100 are arbitrary, but they can be determined based on, for example, the finger speed V_r = {(dx/dt)² + (dy/dt)²}^(1/2) at which robot arm 8 operates. If the values of α_1, α_2, β_1, β_2 are made to grow in proportion to the finger speed V_r, then the area of the collision warning region 100 increases when robot arm 8 moves quickly, and safety can be improved.
In the 1st embodiment, for example, the unit of the finger speed V_r is set to m/s, and V_K is set, according to the magnitude of V_r, either to a value proportional to V_r or to a constant; the parameters are set as α_1 = 1.2 V_K, α_2 = 1.2 V_K, β_1 = 24 V_K, β_2 = 24 V_K. Here, V_K is the value (parameter) defined by V_K = 1.9 V_r when the following [formula 1] holds, and is the constant value V_K = 0.95 when V_r < 0.5 [m/s].
[formula 1]
V_r ≥ 0.5 [m/s]
Thus, when the finger speed is 0.5 [m/s] or more, the size of the collision warning region 100 is determined in proportion to the finger speed. When the finger speed is below 0.5 [m/s], the parameters take the fixed values α_1 = 1.14, α_2 = 1.14, β_1 = 22.8, β_2 = 22.8, so that a collision warning region 100 of a certain size is secured even when the finger speed is low.
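As a concrete illustration of the sizing rule above, the following sketch computes α_1, α_2, β_1, β_2 from the finger speed (function and variable names are illustrative, not from the patent):

```python
def region_params(v_r):
    """Collision-warning-region parameters for finger speed v_r [m/s].

    Above 0.5 m/s the region grows in proportion to speed (V_K = 1.9 V_r);
    below that, V_K is held at the constant 0.95.
    """
    v_k = 1.9 * v_r if v_r >= 0.5 else 0.95
    return {"alpha1": 1.2 * v_k, "alpha2": 1.2 * v_k,   # radius scale factors
            "beta1": 24.0 * v_k, "beta2": 24.0 * v_k}   # half-angles [deg]
```

At v_r = 0.3 m/s this reproduces the fixed values α = 1.14 and β = 22.8° quoted above.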
From the information on the position of the person 38 detected by the person motion detection unit 2 and the joint angle information of robot arm 8 obtained from the action control unit 7, the person approach detection unit 3 calculates the relative relationship between the person 38 and robot arm 8, and if the person 38 enters the collision warning region 100, notifies the collision response action control unit 4 of the intrusion.
From the information on the position of the person 38 detected by the person motion detection unit 2 and the information on the moving velocity vector, the collision position estimation unit 5 estimates the predicted collision position [formula 2] with the person 38, and outputs it to the collision response action generation unit 6.
[formula 2]
X_p(x_i, y_i)
For example, as shown in Fig. 3A, imagine the straight line A that passes through the rotation axis of the 1st joint portion 14 of robot arm 8 and intersects the extension line of the velocity vector V_p(v_x, v_y) perpendicularly. If the angle formed by straight line A and the x axis of the absolute coordinate system 32 is set to q_1i, and the intersection of the extension line of V_p(v_x, v_y) with straight line A is set to the predicted collision position [formula 3], then the angle q_1i and the predicted collision position [formula 4] are given by the following [formula 5]~[formula 7].
[formula 3]
X_p(x_i, y_i)
[formula 4]
X_p(x_i, y_i)
[formula 5]
q_1i = π/2 + atan2(v_y, v_x)
[formula 6]
x_i = {x_p tan(π/2 − q_1i) + y_p} / {tan q_1i + tan(π/2 − q_1i)}
[formula 7]
y_i = x_i tan q_1i
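The geometry of [formula 5]~[formula 7] can be sketched as follows, assuming the standard atan2(y, x) argument order so that line A is perpendicular to the velocity, as the construction requires (the argument order printed in the source appears garbled); names are illustrative:

```python
import math

def predicted_collision_point(x_p, y_p, v_x, v_y):
    """Predicted collision position: intersection of the person's velocity
    extension line with the line A through the 1st joint axis (the origin)
    perpendicular to the velocity.  Returns (q_1i, x_i, y_i)."""
    q_1i = math.pi / 2 + math.atan2(v_y, v_x)      # angle of line A, cf. [formula 5]
    c = math.tan(math.pi / 2 - q_1i)
    x_i = (x_p * c + y_p) / (math.tan(q_1i) + c)   # [formula 6]
    y_i = x_i * math.tan(q_1i)                     # [formula 7]
    return q_1i, x_i, y_i
```

For a person at (1, 2) moving with velocity (−1, −1), line A is the line y = −x and the predicted collision point evaluates to (−0.5, 0.5), the closest point of the motion line to the 1st joint axis.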
Next, the details of the collision response action generation unit 6 will be described with reference to Figs. 3A~3C.
The collision response action generation unit 6 generates the joint target track from the state during normal operation (the current state) of Fig. 3A to the collision response posture of Fig. 3B.
The collision response action generation unit 6 determines the angle q_1i of the 1st joint portion 14 so that the line of the central axis of the 1st link 12 of robot arm 8 overlaps the predicted collision position [formula 8]; that is, q_1i = atan2(y_i, x_i), where atan2(y_i, x_i) = arg(x_i + j y_i), j being the imaginary unit and arg the argument of a complex number. In addition, the angle q_2i of the 2nd joint portion 15 is set, for example, to a fixed value of 60° so that robot arm 8 surrounds the person 38.
[formula 8]
X_p(x_i, y_i)
The angle of the 1st joint portion 14 of robot arm 8 in the collision response posture is set to q_1i. As for the angle q_2i of the 2nd joint portion 15, it is varied in the following three contact modes (1)~(3) according to the distance L_xi between the predicted collision position [formula 9] and the origin O_0 of the absolute coordinate system 32.
[formula 9]
X_p(x_i, y_i)
Contact mode (1), shown in Fig. 3B:
[formula 10]
L_xi ≤ L_1 × 0.8
In this case, q_2i = 90° is set.
Contact mode (2), shown in Fig. 3C:
[formula 11]
L_1 × 0.8 < L_xi ≤ L_1 × 1.2
In this case, q_2i = {(L_1 × 1.2 − L_xi)/(0.4 × L_1)} × 90° is set, so that q_2i decreases continuously from 90° to 0° over this range.
Contact mode (3), shown in Fig. 3D: when L_1 × 1.2 < L_xi, q_2i = 0° is set.
Thus, robot arm 8 can perform the impedance control action while following postures such as the following.
In contact mode (1), as shown in Fig. 3B, since the contact with the person 38 occurs at the 1st link 12, the posture in which the 2nd link 11 is bent 90° with respect to the 1st link 12 becomes a posture in which robot arm 8 surrounds the person 38.
In contact mode (2), as shown in Fig. 3C, the contact with the person 38 occurs near the 2nd joint portion 15, and the opening angle q_2i of the 2nd joint portion 15 takes a posture that changes according to the contact position with the person 38.
In contact mode (3), as shown in Fig. 3D, the contact with the person 38 occurs at the 2nd link 11; the angle q_2i of the 2nd joint portion 15 is 0°, and robot arm 8 takes the posture in which the 1st link 12 and the 2nd link 11 form a straight line.
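The selection of contact modes (1)~(3) and of q_2i can be sketched as follows; the continuous interpolation used for mode (2), from 90° at L_xi = 0.8 L_1 down to 0° at L_xi = 1.2 L_1, is an assumed reading of the formula:

```python
def contact_mode(l_xi, l1):
    """Contact mode (1)-(3) and 2nd-joint target angle q_2i [deg], chosen
    from the distance l_xi between the predicted collision position and
    the base origin, and the 1st-link length l1."""
    if l_xi <= 0.8 * l1:
        return 1, 90.0                               # arm wraps around the person
    if l_xi <= 1.2 * l1:
        return 2, (1.2 * l1 - l_xi) / (0.4 * l1) * 90.0   # assumed interpolation
    return 3, 0.0                                    # links form a straight line
```

With l1 = 1.0, a predicted collision at distance 1.0 falls in mode (2) with q_2i = 45°, halfway between the two adjacent modes.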
In such an impedance control action, the collision response action generation unit 6 calculates, from the current values (q_1, q_2) of the joint angles of robot arm 8 and by the 5th-order polynomial interpolation of the following [formula 12], the joint angle target value q_nd(t), i.e. the target track to the joint angles (q_1i, q_2i) for the contact action. Here, t is time.
[formula 12]
q_nd(t) = a_0 + a_1 t + a_2 t² + a_3 t³ + a_4 t⁴ + a_5 t⁵
where
[formula 13]
a_0 = q_n
[formula 14]
a_1 = q̇_n
[formula 15]
a_2 = (1/2) q̈_n
[formula 16]
a_3 = {1/(2 t_f³)} (20 q_ni − 20 q_n − 12 q̇_n t_f − 3 q̈_n t_f²)
[formula 17]
a_4 = {1/(2 t_f⁴)} (−30 q_ni + 30 q_n + 16 q̇_n t_f + 3 q̈_n t_f²)
[formula 18]
a_5 = {1/(2 t_f⁵)} (12 q_ni − 12 q_n − 6 q̇_n t_f − q̈_n t_f²)
[formula 19]
n = 1, 2
In addition, t_f is the completion time of the impedance control action.
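The 5th-order interpolation of [formula 12] can be sketched as follows; the rest-to-rest end condition (zero velocity and acceleration at t_f) is an assumption consistent with the coefficient formulas:

```python
def quintic_coeffs(q0, dq0, ddq0, qf, tf):
    """Coefficients a0..a5 of q(t) = a0 + a1*t + ... + a5*t^5 joining the
    current state (q0, dq0, ddq0) to rest at qf at time tf."""
    d = qf - q0
    a3 = (20.0 * d - 12.0 * dq0 * tf - 3.0 * ddq0 * tf**2) / (2.0 * tf**3)
    a4 = (-30.0 * d + 16.0 * dq0 * tf + 3.0 * ddq0 * tf**2) / (2.0 * tf**4)
    a5 = (12.0 * d - 6.0 * dq0 * tf - ddq0 * tf**2) / (2.0 * tf**5)
    return (q0, dq0, 0.5 * ddq0, a3, a4, a5)

def eval_poly(coeffs, t):
    """Evaluate the target track q_nd(t)."""
    return sum(a * t**k for k, a in enumerate(coeffs))
```

Starting from rest at q = 0 and moving to q = 1 in tf = 2, the track passes through 0.5 at the midpoint and arrives with zero velocity, as expected of a minimum-jerk quintic.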
The collision response action generation unit 6 outputs the joint angle target value q_nd(t) calculated by [formula 12] to the action control unit 7 as the joint angle target value.
In addition, the collision response action generation unit 6 determines the mechanical impedance value of each joint portion in the collision response postures of Figs. 3B~3D according to the contact modes (1)~(3) and the velocity component of the person 38 at the time of the collision. The setting parameters of the mechanical impedance value are inertia I, viscosity D and stiffness K.
In contact modes (1) and (2): from the components v_I1, v_I2 of the velocity of the person 38 at the time of contact that are perpendicular to the 1st and 2nd links 12 and 11, the mechanical impedance parameters of each joint portion are determined through I_n = K_I/v_In, D_n = K_D/v_In, K_n = K_K/v_In, where n = 1, 2. K_I, K_D, K_K are constant gains, whose concrete numerical values are obtained experimentally so as to achieve safe action. In addition, upper limits Imax_n, Dmax_n, Kmax_n of each parameter are defined; when a calculated value exceeds its upper limit, the parameter is taken to be equal to the upper limit.
In contact mode (3): for the 1st joint portion 14, I_1 = Imax_1, D_1 = Dmax_1, K_1 = Kmax_1 are set; for the 2nd joint portion 15, the mechanical impedance parameters are determined from the component v_I2 of the velocity of the person 38 at the time of contact perpendicular to the 2nd link 11, through I_2 = K_I/v_I2, D_2 = K_D/v_I2, K_2 = K_K/v_I2.
As for the mechanical impedance of the wrist portion 10, the upper limits I_n = Imax_n, D_n = Dmax_n, K_n = Kmax_n are normally used, with n = 3, 4.
The upper limits are set to sufficiently large values, for example Imax_n = Dmax_n = Kmax_n = 10000, with n = 1, 2, 3, 4.
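The impedance parameter rule with upper limits can be sketched per joint as follows (the gain values in the usage example are placeholders, not the experimentally obtained ones):

```python
def joint_impedance(v_in, K_I=1.0, K_D=1.0, K_K=1.0,
                    I_max=10000.0, D_max=10000.0, K_max=10000.0):
    """I, D, K of one joint from the perpendicular approach-speed component
    v_in: each parameter is gain / v_in, clipped at its upper limit."""
    return (min(K_I / v_in, I_max),
            min(K_D / v_in, D_max),
            min(K_K / v_in, K_max))
```

A fast perpendicular approach yields small (soft) parameters, while a very slow approach is clipped at the upper limits, i.e. the joint stays stiff.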
If the mechanical impedances of the joint portions are determined as above, then in contact modes (1) and (2), for example when the person 38 collides nearly perpendicularly with the 1st link 12, the perpendicular velocity component v_I1 is large and the parameters I_1 = K_I/v_I1, D_1 = K_D/v_I1, K_1 = K_K/v_I1 become small; therefore, at the 1st joint portion 14 the mechanical impedance values are set smaller than before the impedance control action and the 1st joint portion 14 becomes soft, while at the 2nd joint portion 15, the 3rd joint portion 16 and the 4th joint portion 17 the mechanical impedance values are set at the upper limits and these joint portions become stiff compared with before the impedance control action. That is, the collision response action control unit 4 can perform a collision response action in which the stiffness of the joint portion of the 1st link 12 on the side nearer the root (base portion 13) than the collision position, i.e. the 1st joint portion 14, is lower than the stiffness of the other joint portions, for example the 2nd joint portion 15, the 3rd joint portion 16 and the 4th joint portion 17. Therefore, the impact of the collision is relaxed by exploiting the flexibility of the 1st joint portion 14, while the 2nd joint portion 15 is kept stiff, so the 2nd link 11 does not move inadvertently under the reaction force of the collision and does not strike the person 38, and safety can be secured. In addition, since the 1st link 12 and the 2nd link 11 form an L shape surrounding the person 38, they can act as a support on both the front side and the lateral side of the person 38 and can keep the person 38 from falling (when the person 38 falls, the 1st link 12 or the 2nd link 11 catches the person 38), so safety can be improved further.
In addition, in contact mode (2), even when there is a velocity component v_I2 perpendicular to the 2nd link 11, the stiffness of the 2nd joint portion 15 is set relatively low according to the velocity component v_I2 and the 2nd joint portion 15 becomes soft, so safety can be secured.
In addition, in contact mode (3), the 1st joint portion 14 becomes stiff compared with before the impedance control action, while the joint portion of the 2nd link 11 on the side nearer the root (base portion 13) than the predicted collision position, i.e. the 2nd joint portion 15, is set softer than before the impedance control action according to the component v_I2 of the person's moving speed perpendicular to the 2nd link 11; the flexibility of the 2nd joint portion 15 can therefore absorb the impact of the person 38 colliding with the 2nd link 11, so safety can be secured. Moreover, compared with the opposite setting, that is, setting the 1st joint portion 14 softly and the 2nd joint portion 15 stiffly, the inertia applied at the collision is mainly the inertia of the 2nd link 11 and the inertia of the 1st link 12 is not applied, so the impact at the time of the collision can be reduced.
Furthermore, in contact mode (3), besides the setting in which the 1st joint portion 14 is stiff and the 2nd joint portion 15 is soft, a method can also be considered in which the 1st joint portion 14 and the 2nd joint portion 15 are made soft simultaneously, that is, all the joint portions nearer the root (base portion 13) than the predicted collision position on the 2nd link 11 are made softer than the other joint portions such as the 3rd joint portion 16 and the 4th joint portion 17.
With this method, safety can be further improved when the joint portions have limited movable ranges. When each joint portion has a movable range, if the collision moves a joint portion beyond its movable range, the joint cannot move any further and its flexibility can no longer be exerted. However, when all the joint portions nearer the root (base portion 13) are made softer than the joint portions beyond them, the movable ranges of the 1st joint portion 14 and the 2nd joint portion 15 add up and a larger total movable range is obtained; the range over which flexibility can be exerted is therefore enlarged, and safety can be improved.
Next, the detailed configuration of the action control unit 7 will be described with reference to Fig. 4.
In Fig. 4, reference numeral 8 denotes the control object, the robot arm shown in Fig. 1. From robot arm 8, the current values (joint angle vector) q = [q_1, q_2, q_3, q_4]^T of the joint angles measured by the encoders 35 of the respective joint shafts are output and input to the control device through the counting converter 23, where q_1, q_2, q_3, q_4 are the joint angles of the 1st joint portion 14, the 2nd joint portion 15, the 3rd joint portion 16 and the 4th joint portion 17, respectively.
Reference numeral 24 denotes the operation target track generation unit, which outputs the finger position/posture target vector r_wd used to realize the target operation of robot arm 8. As shown in Fig. 7, positions (r_wd0, r_wd1, r_wd2) of taught points are given in advance for the target action of robot arm 8 according to the intended operation, and the target track generation unit 24 interpolates the track between the point positions using polynomial interpolation to generate the finger position/posture target vector r_wd. Here, r_wd0 is the position assumed at time t = 0, r_wd1 the position assumed at time t = t_1, and r_wd2 the position assumed at time t = t_2.
The finger position/posture target vector r_wd from the target track generation unit 24 is input to the inverse kinematics computing unit 25 and transformed into the joint target vector q_wd by the inverse kinematics computing unit 25.
Reference numeral 26 denotes the target track switch unit. To the target track switch unit 26 are input the joint target vector q_wd from the inverse kinematics computing unit 25, the collision response action joint target vector q_id from the collision response action generation unit 6, and the collision response action switching command from the person approach detection unit 3. Normally, the finger position/posture target vector r_wd generated by the operation target track generation unit 24 is transformed into the joint target vector q_wd by the inverse kinematics computing unit 25, and the target track switch unit 26 selects this converted q_wd and outputs it as the joint target vector q_d. However, when a person approaches and the target track switch unit 26 receives the collision response action switching command from the person approach detection unit 3, the target track switch unit 26 selects the contact action joint target vector q_id input from the collision response action generation unit 6 and outputs it as the joint target vector q_d.
Reference numeral 27 denotes the torque estimation unit, which estimates the external torques applied to each joint portion of robot arm 8 by the contact between the person 38 and robot arm 8. To the torque estimation unit 27 are input, via the A/D converter 22, the current values i = [i_1, i_2, i_3, i_4]^T flowing through the motors 34 driving the joint portions of robot arm 8, measured by the current sensors of the motor driver 19, together with the current value q of the joint angles and the joint angle error compensation output u_qe from the joint error compensation unit 30 described later. The torque estimation unit 27 has the function of an observer and calculates, from the current value i, the joint angle current value q and the joint angle error compensation output u_qe, the torque τ_ext = [τ_1ext, τ_2ext, τ_3ext, τ_4ext]^T generated at each joint portion by the external force applied to robot arm 8.
The joint impedance computation unit 28 is the part bearing the function of realizing mechanical impedance in robot arm 8; during normal action, when the person 38 is away, the joint impedance computation unit 28 outputs 0 to the input side of the position control system 31. On the other hand, if the person 38 approaches and the contact action switching command is received from the collision response action generation unit 6, the joint impedance computation unit 28 calculates, by the following [formula 20], from the mechanical impedance parameters I, D, K set by the collision response action generation unit 6, the current value q of the joint angles and the external torque τ_ext estimated by the torque estimation unit 27, the joint target correction output q_dΔ used to realize the mechanical impedance at each joint portion of robot arm 8, and outputs it to the input side of the position control system 31. The joint target correction output q_dΔ is added, at the input side of the position control system 31, to the joint target q_d output by the target track switch unit 26 to generate the corrected joint target vector q_dm, which is input to the position control system 31.
[formula 20]
q_dΔ = (s² Î + s D̂ + K̂)^(−1) τ_ext
In the formula, s is the Laplace operator, and
[formula 21]
Î = diag(I_1, I_2, I_3, I_4)
[formula 22]
D̂ = diag(D_1, D_2, D_3, D_4)
[formula 23]
K̂ = diag(K_1, K_2, K_3, K_4)
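Since Î, D̂, K̂ are diagonal, [formula 20] decouples per joint into the second-order system I q̈ + D q̇ + K q = τ_ext; a minimal time-domain sketch, assuming a simple semi-implicit Euler discretization:

```python
def impedance_correction_step(q, dq, tau_ext, I, D, K, dt):
    """One semi-implicit Euler step of I*q'' + D*q' + K*q = tau_ext for a
    single joint: a time-domain realization of
    q_dDelta = (s^2 I + s D + K)^-1 tau_ext."""
    ddq = (tau_ext - D * dq - K * q) / I       # acceleration from the torque balance
    dq = dq + ddq * dt                         # update velocity first (semi-implicit)
    q = q + dq * dt                            # then position with the new velocity
    return q, dq
```

Under a constant external torque the correction settles at q_dΔ = τ_ext/K, i.e. the stiffer the joint is set, the smaller the target deflection.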
Reference numeral 30 denotes the joint error compensation unit in the position control system 31; the error q_e between the current value q of the joint angle vector measured in robot arm 8 and the corrected joint target vector q_dm is input to the joint error compensation unit 30, and the joint error compensation unit 30 outputs the joint error compensation output u_qe to robot arm 8.
The joint error compensation output u_qe is passed, via the D/A converter 21, to the motor driver 19 as a voltage command value; each joint shaft is driven in forward and reverse rotation, and robot arm 8 operates.
As described above, the action control unit 7 is constituted by the target track generation unit 24, the inverse kinematics computing unit 25, the target track switch unit 26, the torque estimation unit 27, the joint impedance computation unit 28 and the position control system 31.
The working principle of the action control unit 7 constituted as above will now be described.
The basis of the operation is feedback control (position control) of the joint vector (the current values of the joint angles) q by the joint error compensation unit 30; the part surrounded by the dotted line in Fig. 4 is the position control system 31. If, for example, a PID compensator is used as the joint error compensation unit 30, control is performed so that the joint error q_e converges to 0, and the target action of robot arm 8 is realized.
When the person approach detection unit 3 detects the intrusion of the person 38 into the collision warning region 100, the joint impedance computation unit 28 adds, at the input side of the position control system 31 described above, the joint target correction output q_dΔ to the joint target q_d output by the target track switch unit 26, thereby correcting the target value of the joint portions. As a result, the target value of the joint portions in the position control system 31 deviates slightly from the original value, and mechanical impedance is realized. Since the joint target correction output q_dΔ is calculated by the joint impedance computation unit 28 through [formula 20], mechanical impedance of inertia I, viscosity D and stiffness K is realized.
The actual working steps of the control program based on the above principle of operation of the action control unit 7 will be described based on the flow chart of Fig. 5.
In step 1, the joint angle data (joint angle vector q) measured by the encoders 35 is input to the control device 1.
Then, in step 2, the operation target track generation unit 24 of the action control unit 7 computes the finger position/posture target vector r_wd of robot arm 8, based on the operation control program 18 of robot arm 8 stored in advance in the memory (not shown) of the control device 1.
Then, in step 3, the inverse kinematics computing unit 25 transforms the finger position/posture target vector r_wd into the joint target q_wd.
Then, in step 4, the target track is switched by the target track switch unit 26. When no person 38 has intruded into the collision warning region 100, the operation action is performed and the process proceeds to step 5A. On the other hand, when a person 38 has intruded into the collision warning region 100, the collision response action is performed and the process proceeds to step 5B (processing in the target track switch unit 26).
In step 5B, the joint target vector q_d is set to the contact action joint target vector q_id generated by the collision response action generation unit 6 (processing in the target track switch unit 26). Then the process proceeds to step 6.
In step 5A, the joint target vector q_d is set to the joint target q_wd (processing in the target track switch unit 26). Then the process proceeds to step 8.
In step 6, the torque estimation unit 27 computes the external torque τ_ext on the joint portions of robot arm 8 from the drive current values i of the respective motors 34, the joint angle data (joint angle vector q) and the joint angle error compensation output u_qe (processing in the torque estimation unit 27).
Then, in step 7, the joint impedance computation unit 28 calculates the joint target correction output q_dΔ from the mechanical impedance parameters I, D, K set in the collision response action generation unit 6, the joint angle data (joint angle vector q) and the external torque τ_ext applied to robot arm 8 calculated by the torque estimation unit 27 (processing in the joint impedance computation unit 28).
Then, in step 8, the corrected joint target vector q_dm is calculated as the sum of the joint target vector q_d and the joint target correction output q_dΔ, and is input to the position control system 31.
Then, in step 9, the error q_e of the joint portions, i.e. the difference between the corrected joint target vector q_dm and the current joint vector q, is input to the joint error compensation unit 30, and the joint error compensation unit 30 computes the joint angle error compensation output u_qe (processing in the joint error compensation unit 30). A PID compensator is a concrete example of the joint error compensation unit 30; by appropriately adjusting the three gains of the diagonal constant matrices, proportional, differential and integral, the joint error is made to converge to 0.
Then, in step 10, the joint angle error compensation output u_qe is passed to the motor driver 19 through the D/A converter 21, the currents flowing through the motors 34 of the respective joint portions are changed, and the rotational motion of each joint shaft of robot arm 8 is generated.
The control of the action of robot arm 8 is realized by repeating the above steps 1~10 as the control computation cycle.
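The joint error compensation of step 9 can be sketched as a per-joint PID compensator; the gains and the toy plant in the test of convergence are illustrative assumptions, not the patent's values:

```python
class JointPID:
    """PID compensator for the joint error q_e = q_dm - q (step 9).

    Gains kp, ki, kd and the sample time dt are illustrative.
    """
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integ = 0.0      # integral of the error
        self.prev_e = 0.0     # previous error, for the discrete derivative

    def step(self, q_dm, q):
        e = q_dm - q
        self.integ += e * self.dt
        de = (e - self.prev_e) / self.dt
        self.prev_e = e
        return self.kp * e + self.ki * self.integ + self.kd * de
```

Driving a simple velocity-command plant toward a constant corrected joint target, the error shrinks toward 0, which is the behavior required of the joint error compensation unit 30.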
Next, the overall operation of the control device 1 in the 1st embodiment of the present invention will be described based on the flow chart of Fig. 6, taking as an example a transport operation in which an object is grasped with the hand 9 of robot arm 8.
In step 21, the person motion detection unit 2 performs image recognition processing based on the image data of the image pickup device 36 and detects information on the position, moving speed and moving direction of the person 38 near robot arm 8.
Then, in step 22, the person approach detection unit 3 judges the intrusion of the person 38 into the collision warning region 100; when the person approach detection unit 3 judges that there is no intrusion, the process proceeds to step 23.
In step 23, the action control unit 7 controls the action of robot arm 8 so that the hand 9 of robot arm 8 grasps the object and performs the transport operation action.
After step 23, the process returns to step 21; as long as the person approach detection unit 3 does not confirm the intrusion of the person 38, the loop step 21 → step 22 → step 23 → step 21 … is repeated, and robot arm 8 performs the normal object grasping and transport operation.
When the intrusion of the person 38 is confirmed in step 22, the process proceeds to step 24, and the collision position estimation unit 5 estimates the predicted collision position [formula 24].
[formula 24]
X_p(x_i, y_i)
Then, in step 24.1, the collision response action generation unit 6 calculates the distance L_xi between the predicted collision position X_p and the origin O_0 of the absolute coordinate system 32, compares the distance L_xi with the length L_1, and judges that the mode is contact mode (1) when the inequality of [formula 10] holds, contact mode (2) when the inequality of [formula 11] holds, and contact mode (3) otherwise.
Then, in step 25, the collision response action generation unit 6 generates, for the predicted collision position [formula 25], the collision response action target track q_d(t) for obtaining the collision response posture by [formula 12].
[formula 25]
X_p(x_i, y_i)
Then, in step 26, the mechanical impedance values of the joint portions to be set for the collision response posture are set by the collision response action generation unit 6 according to contact modes (1)~(3) and to the moving speed of the approaching person 38, by the method of determining the mechanical impedance of the joint portions introduced above.
Then, in step 27, in the action control unit 7, the target track switch unit 26 operates to select the collision response action target track q_d(t), and the position control system makes robot arm 8 move into the collision response posture. In addition, the joint impedance computation unit 28 also operates, and each joint portion of robot arm 8 is controlled independently to the mechanical impedance value set for it.
Then, in step 28, the person motion detection unit 2 performs image recognition processing on the image data of the image pickup device 36 and detects information on the position, moving speed and moving direction of the person 38 near robot arm 8.
Then, in step 29, the person approach detection unit 3 judges the intrusion of the person 38 into the collision warning region 100; when the person approach detection unit 3 judges that there is an intrusion, the process returns to step 27, and as long as it is not confirmed that the person 38 has left the collision warning region 100, the loop step 27 → step 28 → step 29 → step 27 … is repeated and robot arm 8 continues the collision response action.
On the other hand, when the person approach detection unit 3 judges in step 29 that no person 38 has intruded into the collision warning region 100, the process returns to step 21, and robot arm 8 recovers the normal object grasping and transport operation action.
Through the above action steps 21~29, the object grasping and transport operation using robot arm 8 can be realized, with switching to the collision response action when the person 38 approaches.
As described above, by providing the person motion detection unit 2, the person approach detection unit 3, the collision position estimation unit 5 and the collision response action control unit 4, when the person 38 approaches robot arm 8, a collision response action corresponding to the collision between the person 38 and robot arm 8 can be controlled with an appropriate posture matching the motion of the person 38 and with appropriate mechanical impedances of the joint portions.
Since robot arm 8 has its root, the base portion 13, fixed to the ground, if there were no joint portion on the root (base portion 13) side of the predicted collision position whose stiffness is lower than that of the other joint portions and which is thus in an easily movable state, resistance would be generated and the collision would produce an impact. According to the above 1st embodiment of the present invention, when the mechanical impedance is set individually on each joint portion of robot arm 8, the joint portion on the root (base portion 13) side can be set to a state of lower stiffness and easier mobility than the other joint portions, so the force applied at the collision can be released at that joint portion; the impact on the person is reduced, and safety can be improved.
Therefore, according to the control device 1 of the above 1st embodiment of the present invention, even the articulated robot arm 8 can contact a person safely, the optimum contact action corresponding to the person's motion can be performed, and safe action control of the robot arm 8 coexisting with people, in which contact does not injure the person, can be realized.
In the above 1st embodiment, the number of joint portions of robot arm 8 is set to 3, but the embodiment is not limited to this; the same effect can be exhibited by robot arms 8 with 2 or more joint portions.
In addition, in the above 1st embodiment, the collision between robot arm 8 and a person was described as an example, but the embodiment is not limited to this; the same effect can also be exhibited for collisions between robot arm 8 and other moving bodies, such as mobile robots or moving carts.
Furthermore, in the above 1st embodiment, the predicted collision position is estimated once when the person 38 enters the collision warning region 100, but the embodiment is not limited to this; a method of continuously estimating the predicted collision position can also be adopted. In that case, since the accuracy of the collision estimation improves, better correspondence of posture and contact mode is achieved and safety can be improved further. On the other hand, estimating the predicted collision position once, as in the above 1st embodiment, has the advantage of reducing the amount of calculation and lightening the load on the CPU.
(the 2nd embodiment)
Fig. 8A and Fig. 8B are a schematic diagram showing the configuration of the control device 1 of the robot arm in the 2nd embodiment of the present invention and of the robot arm 8 that is the object of control, and a block diagram showing the detailed configuration of the motion control unit of the control device. Since the basic configuration of the control device 1 in the 2nd embodiment is the same as that of the 1st embodiment shown in Fig. 1, the description of the common parts is omitted and only the differences are explained in detail below.

In the 2nd embodiment of the present invention, the person's motion and the person's approach are detected not by image recognition processing but by detecting the force produced when the robot arm 8 contacts the person 38. The human motion detection unit 2 is therefore configured to receive the external torque information from the torque estimation unit 27 of the motion control unit 7, and the image pickup device 36 is not needed in the control device 1 of the 2nd embodiment.
The human motion detection unit 2 estimates the relative positional relation between the robot arm 8 and the person 38 according to the following patterns, based on the external torque T_ext = [T_1ext, T_2ext, T_3ext, T_4ext]^T estimated by the torque estimation unit 27. Here, in Fig. 9A to Fig. 9D (viewing the robot arm 8 from above), a torque that moves a joint counterclockwise is defined as positive.
(1) When the external torque T_1ext of the 1st joint 14 satisfies T_1ext > 0 and the external torque T_2ext of the 2nd joint 15 satisfies T_2ext ≈ 0, the human motion detection unit 2 infers, as shown in Fig. 9A, that the person 38 is in contact with the 1st link 12.

(2) When T_1ext < 0 and T_2ext ≈ 0, the human motion detection unit 2 infers, as shown in Fig. 9B, that the person 38 is in contact with the 1st link 12.

(3) When T_1ext > 0 and T_2ext > 0, the human motion detection unit 2 infers, as shown in Fig. 9C, that the person 38 is in contact with the 2nd link 11.

(4) When T_1ext < 0 and T_2ext < 0, the human motion detection unit 2 infers, as shown in Fig. 9D, that the person 38 is in contact with the 2nd link 11.
Classification according to the above four patterns reveals the approximate positional relation between the robot arm 8 and the person 38. In addition, the human approach detection unit 3 can detect the approach of the person 38 by detecting that the state of the external torque T_1ext of the 1st joint 14 and the external torque T_2ext of the 2nd joint 15 has changed from the no-contact state to any of the patterns (1) to (4).
By estimating the approximate position of the person 38 with the above method and having the collision-response motion control unit 4 control the posture and joint torques corresponding to that positional relation, safe motion control of the robot arm 8 that allows coexistence with the person 38 can be performed.

Note that in the 1st embodiment the human motion detection unit 2 uses image recognition processing, but it is not limited to this; the same effect is obtained with any other sensing unit, for example a laser radar sensor or an ultrasonic sensor, that can detect the position and speed of the person 38.

In addition, in the 2nd embodiment the person is detected from the external torque estimated by the torque estimation unit 27, but the detection is not limited to this. As shown in Fig. 10, contact sensors 39 may be provided, as an example of a collision position acquisition unit, on the links 12 and 11 of the robot arm 8 at the positions that may contact the person 38, and the person may be detected through contact of the person 38 with the contact sensors 39; the same effect is obtained. Furthermore, if the person is detected using both the contact sensors and the external torque, the estimation accuracy can be improved and the robot arm 8 can be controlled even more safely.
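The combined use of contact sensors and torque-based estimation mentioned above can be sketched as a simple fusion rule. This is an illustrative assumption about how the two cues might be merged; the function and label names are not from the patent.

```python
def detect_contact(torque_link, sensor_hits):
    """Fuse torque-based and contact-sensor-based contact detection.

    torque_link: link name inferred from the external torques, or None.
    sensor_hits: set of link names whose contact sensors 39 fired.
    Agreement between the two sources gives the most confident estimate.
    """
    if torque_link is not None and torque_link in sensor_hits:
        return torque_link, "confirmed"          # both sources agree
    if sensor_hits:
        return next(iter(sensor_hits)), "sensor-only"
    if torque_link is not None:
        return torque_link, "torque-only"
    return None, "no-contact"
```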
In the 2nd embodiment the number of joints of the robot arm 8 is three, but the invention is not limited to this; the same effect is obtained for a robot arm 8 with two or more joints.

In addition, the 2nd embodiment describes a collision between the robot arm 8 and a person, but the invention is not limited to this; the same effect is obtained for a collision between the robot arm 8 and another moving body, such as a mobile robot or a moving cart.

Furthermore, the 1st and 2nd embodiments describe the robot arm 8 as an example, but the invention is not limited to an arm; the same effects are obtained by applying the present invention to contact between a person and the body of a moving body such as a mobile robot that moves on wheels, a biped walking robot, or a multi-legged walking robot.

In addition, by appropriately combining any of the above various embodiments, the effects of each can be obtained.
Industrial applicability
The control device and control method of a robot arm, the robot, and the control program of a robot arm of the present invention are useful as a control device and control method of a robot arm, a robot, and a control program of a robot arm that effectively control the motion of a robot arm that may contact people, such as a home-use robot. Moreover, they are not limited to home-use robots and can also be applied as control devices for movable mechanisms in industrial robots, production equipment, and the like.

The present invention has been fully described in connection with preferred embodiments thereof with reference to the accompanying drawings, but various modifications and corrections will be apparent to those skilled in the art. Such modifications and corrections, insofar as they do not depart from the scope of the present invention as defined by the appended claims, are to be understood as included therein.

Claims (2)

1. A control device of a robot arm, being a control device for a multi-joint robot arm having at least two links and at least two joints, comprising:

a collision position acquisition unit that acquires a collision position of a person or a moving body with said multi-joint robot arm; and

a collision-response motion control unit that, based on the collision position of said person or moving body with said multi-joint robot arm acquired by said collision position acquisition unit, controls a collision-response motion so as to make the rigidity of the joint closer to the root side than the link at said collision position lower than the rigidity of the other joints, and to maintain or increase the rigidity of the joint closer to the wrist side than the link at said collision position.
2. The control device of a robot arm according to claim 1, wherein said collision-response motion control unit performs control so that, among the joints closer to said root side than the link at said collision position, the rigidity of the joint nearest to the link at said collision position is lower than the rigidity of the other joints on said root side.
CN2007800149419A 2006-07-04 2007-06-28 Apparatus for controlling robot arm Active CN101432103B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP184172/2006 2006-07-04
JP2006184172 2006-07-04
PCT/JP2007/062984 WO2008004487A1 (en) 2006-07-04 2007-06-28 Apparatus and method for controlling robot arm, robot, and robot arm control program

Publications (2)

Publication Number Publication Date
CN101432103A CN101432103A (en) 2009-05-13
CN101432103B true CN101432103B (en) 2012-05-23

Family

ID=38894459

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2007800149419A Active CN101432103B (en) 2006-07-04 2007-06-28 Apparatus for controlling robot arm

Country Status (4)

Country Link
US (1) US8676379B2 (en)
JP (2) JP4243309B2 (en)
CN (1) CN101432103B (en)
WO (1) WO2008004487A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110154018A (en) * 2018-02-13 2019-08-23 佳能株式会社 The controller and control method of robot

Families Citing this family (82)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101984750B (en) * 2007-10-01 2013-01-09 Abb技术有限公司 A method for controlling a plurality of axes in an industrial robot system and an industrial robot system
JP5086778B2 (en) * 2007-11-26 2012-11-28 トヨタ自動車株式会社 Robot arm
WO2009098855A1 (en) * 2008-02-06 2009-08-13 Panasonic Corporation Robot, robot control apparatus, robot control method and program for controlling robot control apparatus
JP2009297810A (en) * 2008-06-11 2009-12-24 Panasonic Corp Posture control device of manipulator and posture control method thereof
DE102008041602B4 (en) * 2008-08-27 2015-07-30 Deutsches Zentrum für Luft- und Raumfahrt e.V. Robot and method for controlling a robot
JP2010120139A (en) * 2008-11-21 2010-06-03 New Industry Research Organization Safety control device for industrial robot
KR101537039B1 (en) * 2008-11-28 2015-07-16 삼성전자 주식회사 Robot and control method thereof
WO2010063319A1 (en) * 2008-12-03 2010-06-10 Abb Research Ltd. A robot safety system and a method
ES2424244T3 (en) * 2009-04-22 2013-09-30 Kuka Roboter Gmbh Procedure and device to regulate a manipulator
US8682482B2 (en) * 2009-05-22 2014-03-25 Toyota Motor East Japan, Inc. Working support robot system
JP5167548B2 (en) * 2009-09-01 2013-03-21 川田工業株式会社 Suspended collaborative robot
JP4962551B2 (en) * 2009-10-20 2012-06-27 株式会社安川電機 Robot system and control method of robot system
JP5528095B2 (en) * 2009-12-22 2014-06-25 キヤノン株式会社 Robot system, control apparatus and method thereof
JP4896276B2 (en) * 2010-01-04 2012-03-14 パナソニック株式会社 ROBOT, ROBOT CONTROL DEVICE, CONTROL METHOD, AND CONTROL PROGRAM
JP5059978B2 (en) * 2010-01-25 2012-10-31 パナソニック株式会社 Danger presentation device, danger presentation system, danger presentation method and program
IT1399248B1 (en) * 2010-03-11 2013-04-11 Uni Politecnica Delle Marche EQUIPMENT FOR THE MANAGEMENT OF A SYSTEM OF CONTROL OF A MACHINERY EQUIPPED WITH A MOBILE PART PROVIDED WITH AT LEAST ONE PROXIMITY AND CONTACT SENSOR
WO2011161765A1 (en) * 2010-06-22 2011-12-29 株式会社 東芝 Robot control device
CN102917843B (en) * 2010-11-26 2016-08-31 日商乐华股份有限公司 The control device of robot and control method
US8777818B1 (en) * 2010-12-22 2014-07-15 Larry E. Tate, Jr. Training device
WO2012124342A1 (en) * 2011-03-17 2012-09-20 パナソニック株式会社 Robot, robot control apparatus, control method, and control program
JP2012236244A (en) * 2011-05-10 2012-12-06 Sony Corp Robot device, method of controlling the same, and program for controlling the same
EP2708334B1 (en) * 2011-05-12 2020-05-06 IHI Corporation Device and method for controlling prediction of motion
TW201247373A (en) * 2011-05-23 2012-12-01 Hon Hai Prec Ind Co Ltd System and method for adjusting mechanical arm
DE102012108418A1 (en) 2011-09-14 2013-03-14 Robotics Technology Leaders Gmbh Device for enabling secure collaboration between human and selective complaint articulated arm robot, used in e.g. industrial automation field, has robotic controller having control unit connected in series with servo axle amplifiers
CN104169051A (en) 2012-03-19 2014-11-26 株式会社安川电机 Task robot and robot system
WO2013140579A1 (en) 2012-03-22 2013-09-26 株式会社安川電機 Work robot and robot system
DE202012101121U1 (en) * 2012-03-29 2013-07-16 Kuka Systems Gmbh separator
WO2013164470A1 (en) * 2012-05-04 2013-11-07 Leoni Cia Cable Systems Sas Imitation learning method for a multi-axis manipulator
EP2853359B1 (en) 2012-05-21 2022-07-20 Kabushiki Kaisha Yaskawa Denki Robot
JPWO2013175554A1 (en) * 2012-05-21 2016-01-12 株式会社安川電機 Robot and robot system
JP5907859B2 (en) * 2012-12-21 2016-04-26 本田技研工業株式会社 Link mechanism
JP5802191B2 (en) 2012-12-21 2015-10-28 本田技研工業株式会社 Link mechanism controller
CN103902020B (en) * 2012-12-25 2017-04-12 苏茂 Data glove wrist joint detection device
JP5668770B2 (en) * 2013-03-15 2015-02-12 株式会社安川電機 Robot system and control method of robot system
DE102013104265A1 (en) 2013-04-26 2014-10-30 Pilz Gmbh & Co. Kg Device and method for securing an automated machine
US9427871B2 (en) * 2013-05-06 2016-08-30 Abb Technology Ag Human safety provision in mobile automation environments
KR20140147267A (en) * 2013-06-19 2014-12-30 광주과학기술원 Control Method and Device for Position-Based Impedance Controlled Industrial Robot
CN103600354B (en) * 2013-11-08 2016-10-05 北京卫星环境工程研究所 Spacecraft mechanical arm flexible follow-up control gravity compensation
WO2015120864A1 (en) * 2014-02-13 2015-08-20 Abb Technology Ag Robot system and method for controlling the robot system
EP3120979A4 (en) * 2014-03-14 2017-11-08 Sony Corporation Robot arm device, robot arm control method and program
JP5946859B2 (en) * 2014-04-14 2016-07-06 ファナック株式会社 Robot control device and robot system for robots that move according to force
CN104020699A (en) * 2014-05-30 2014-09-03 哈尔滨工程大学 Movable type visual identification material sorting intelligent robot controlling apparatus
DE102014210544B4 (en) * 2014-06-04 2023-04-13 Robert Bosch Gmbh Procedure for controlling a manipulator in a point-to-point movement
CN106660215B (en) * 2014-07-02 2021-09-17 西门子公司 Early warning system and robot system
US10099609B2 (en) * 2014-07-03 2018-10-16 InfoMobility S.r.L. Machine safety dome
JP6140114B2 (en) * 2014-07-31 2017-05-31 ファナック株式会社 Mobile human cooperative robot
US9740193B2 (en) * 2014-08-08 2017-08-22 Roboticvisiontech, Inc. Sensor-based safety features for robotic equipment
JP6682120B2 (en) * 2014-10-23 2020-04-15 立花 純江 Robot teaching device
EP3017920B1 (en) * 2014-11-07 2017-08-23 Comau S.p.A. An industrial robot and a method for controlling an industrial robot
DE102014222809B3 (en) * 2014-11-07 2016-01-14 Kuka Roboter Gmbh Event-based redundancy angle configuration for articulated arm robots
RU2682195C1 (en) * 2015-05-21 2019-03-15 Ниссан Мотор Ко., Лтд. Troubleshooting diagnostics device and the problems diagnosing method
WO2016189740A1 (en) * 2015-05-28 2016-12-01 株式会社安川電機 Robot system, teaching jig and teaching method
US9868213B2 (en) 2015-08-11 2018-01-16 Empire Technology Development Llc Incidental robot-human contact detection
US10215852B1 (en) * 2015-10-05 2019-02-26 Google Llc Robotic radar assistance
JP6850538B2 (en) 2016-02-08 2021-03-31 川崎重工業株式会社 Working robot
JP6481635B2 (en) * 2016-02-15 2019-03-13 オムロン株式会社 Contact determination device, control device, contact determination system, contact determination method, and contact determination program
GB2549072B (en) * 2016-03-24 2020-07-29 Cmr Surgical Ltd Robot control
JP6570742B2 (en) * 2016-05-16 2019-09-04 三菱電機株式会社 Robot motion evaluation apparatus, robot motion evaluation method, and robot system
JP6755724B2 (en) * 2016-06-20 2020-09-16 キヤノン株式会社 Control methods, robot systems, and article manufacturing methods
CN106725861B (en) * 2017-02-15 2019-12-10 山东大学 Method for detecting collision position of end tool of surgical robot
JP6496335B2 (en) * 2017-03-03 2019-04-03 ファナック株式会社 Robot system
TWI774666B (en) * 2017-03-17 2022-08-21 達明機器人股份有限公司 Jam-proof method for a collaborative robot arm
JP7427358B2 (en) * 2017-07-20 2024-02-05 キヤノン株式会社 Robot system, article manufacturing method, control method, control program, and recording medium
KR102370879B1 (en) * 2017-09-12 2022-03-07 주식회사 한화 Method and Apparatus for controlling a collaborativve robot
JP6680752B2 (en) * 2017-11-28 2020-04-15 ファナック株式会社 Control device that limits the speed of the robot
EP3755504A1 (en) * 2018-02-23 2020-12-30 ABB Schweiz AG Robot system and operation method
JP2019150919A (en) * 2018-03-02 2019-09-12 オムロン株式会社 Robot system
JP7127316B2 (en) * 2018-03-22 2022-08-30 カシオ計算機株式会社 Robot, robot control method and program
DE102018109320A1 (en) * 2018-04-19 2019-10-24 Gottfried Wilhelm Leibniz Universität Hannover Method for detecting an intention of a partner in relation to a multi-membered actuated kinematics
IT201800005091A1 (en) 2018-05-04 2019-11-04 "Procedure for monitoring the operating status of a processing station, its monitoring system and IT product"
WO2019213825A1 (en) * 2018-05-07 2019-11-14 深圳蓝胖子机器人有限公司 Robot and mechanical gripper thereof
JP7155660B2 (en) * 2018-06-26 2022-10-19 セイコーエプソン株式会社 Robot controller and robot system
CN109240092B (en) * 2018-11-30 2021-09-10 长春工业大学 Reconfigurable modular flexible mechanical arm trajectory tracking control method based on multiple intelligent agents
JP2020089927A (en) * 2018-12-03 2020-06-11 学校法人立命館 Robot control system
CN109620410B (en) * 2018-12-04 2021-01-26 微创(上海)医疗机器人有限公司 Method and system for preventing collision of mechanical arm and medical robot
DE102018133349A1 (en) * 2018-12-21 2020-06-25 Pilz Gmbh & Co. Kg Method and device for moment estimation
JPWO2020158642A1 (en) * 2019-01-31 2021-12-02 ソニーグループ株式会社 Robot control device, robot control method, and program
EP3922418A4 (en) * 2019-02-08 2022-03-02 NEC Corporation Control device, control method, and recording medium
DE102020104364B3 (en) * 2020-02-19 2021-05-27 Franka Emika Gmbh Control of a robot manipulator when it comes into contact with a person
CN113319844A (en) * 2020-02-28 2021-08-31 东莞市李群自动化技术有限公司 Mechanical arm control method, control equipment and robot
WO2023037437A1 (en) * 2021-09-08 2023-03-16 東京ロボティクス株式会社 Unmanned aerial vehicle, flying body, and flying robot
WO2023042464A1 (en) * 2021-09-15 2023-03-23 ソニーグループ株式会社 Robot device and robot control method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002187090A (en) * 2000-12-21 2002-07-02 Matsushita Electric Works Ltd Manipulator
JP2004130460A (en) * 2002-10-11 2004-04-30 Sony Corp Operation controller of leg type mobile robot and its operation control method

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2196763A (en) 1986-10-29 1988-05-05 Philips Electronic Associated Solid modeling
US5150026A (en) * 1990-11-19 1992-09-22 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Obstacle avoidance for redundant robots using configuration control
US5737500A (en) * 1992-03-11 1998-04-07 California Institute Of Technology Mobile dexterous siren degree of freedom robot arm with real-time control system
JP3865158B2 (en) 1997-05-30 2007-01-10 株式会社安川電機 Impedance control device for robot arm
JP2000162062A (en) 1998-12-01 2000-06-16 Fujitsu Ltd Obstacle sensor apparatus
US7443115B2 (en) * 2002-10-29 2008-10-28 Matsushita Electric Industrial Co., Ltd. Apparatus and method for robot handling control
JP4228871B2 (en) * 2002-10-29 2009-02-25 パナソニック株式会社 Robot grip control device and robot grip control method
JP2004223663A (en) 2003-01-24 2004-08-12 Doshisha Impedance control device and impedance control program
JP3888310B2 (en) 2003-02-06 2007-02-28 トヨタ自動車株式会社 Data creation device for walking robot control and ZMP position calculation method
JP2005059161A (en) 2003-08-18 2005-03-10 Univ Waseda Robot control device
DE102004041821A1 (en) * 2004-08-27 2006-03-16 Abb Research Ltd. Device and method for securing a machine-controlled handling device
DE102004043514A1 (en) * 2004-09-08 2006-03-09 Sick Ag Method and device for controlling a safety-related function of a machine


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110154018A (en) * 2018-02-13 2019-08-23 佳能株式会社 The controller and control method of robot
US11872698B2 (en) 2018-02-13 2024-01-16 Canon Kabushiki Kaisha Controller of robot and control method

Also Published As

Publication number Publication date
CN101432103A (en) 2009-05-13
JP4243309B2 (en) 2009-03-25
JP2008302496A (en) 2008-12-18
US20090171505A1 (en) 2009-07-02
JPWO2008004487A1 (en) 2009-12-03
WO2008004487A1 (en) 2008-01-10
US8676379B2 (en) 2014-03-18

Similar Documents

Publication Publication Date Title
CN101432103B (en) Apparatus for controlling robot arm
US11045945B2 (en) Method for controlling walking of robot and robot
US8160745B2 (en) Robots with occlusion avoidance functionality
US8311731B2 (en) Robots with collision avoidance functionality
US9002519B2 (en) Robot control method, robot control device, and robot control system
US20200376666A1 (en) Robot system and operation method
CN103722565A (en) Self-collision monitoring system for humanoid robot and monitoring method thereof
JP2015157352A (en) Robot, control device and control method of robot, and control program for robot
US20120004775A1 (en) Robot apparatus and control method therefor
US20090295324A1 (en) Device and method for controlling manipulator
KR20190079322A (en) Robot control system
Zanchettin et al. A novel passivity-based control law for safe human-robot coexistence
US8483876B2 (en) Controller of mobile robot
Tsuji et al. Noncontact impedance control for redundant manipulators
JP7160118B2 (en) Control device, control method, program
Suárez et al. Development of a dexterous dual-arm omnidirectional mobile manipulator
CN110116424A (en) Robot
Sarić et al. Robotic surface assembly via contact state transitions
JPH02188809A (en) Controller for avoiding obstacle of traveling object
JP7024215B2 (en) Mobile control device and control system
JP2009012133A (en) Safety apparatus and manipulator equipped with the same
Omrčen et al. Autonomous motion of a mobile manipulator using a combined torque and velocity control
Tsetserukou et al. iSoRA: humanoid robot arm for intelligent haptic interaction with the environment
JP2006116635A5 (en)
Tsetserukou et al. Obstacle avoidance control of humanoid robot arm through tactile interaction

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant