CN109732610A - Human-robot collaboration robot grasping system and working method thereof - Google Patents

Human-robot collaboration robot grasping system and working method thereof

Info

Publication number
CN109732610A
CN109732610A (Application CN201910154643.4A)
Authority
CN
China
Prior art keywords
robot
variable
joint
module
man
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910154643.4A
Other languages
Chinese (zh)
Inventor
陶永
任帆
房增亮
邹遇
陈超勇
江山
张强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CN201910154643.4A priority Critical patent/CN109732610A/en
Publication of CN109732610A publication Critical patent/CN109732610A/en
Pending legal-status Critical Current

Landscapes

  • Manipulator (AREA)

Abstract

The invention discloses a human-robot collaboration robot grasping system and a working method thereof. The grasping system includes a wearable inertial sensor module, a camera ranging module, a Gaussian-model EM-algorithm training module, a Gaussian-model processing module, and an adaptive robot joint coordinate extraction module. In human-robot collaboration scenarios, the grasping system provided by the invention requires neither calibration of the robot vision system nor kinematic inversion, lowering the professional skill demanded of operators. In addition, the system can establish the mapping without a large number of samples, so that the robot trajectory is smooth and compliant.

Description

Human-robot collaboration robot grasping system and working method thereof
Technical field
The present invention relates to the field of robotics, and in particular to a human-robot collaboration robot grasping system and a working method thereof.
Background art
In industrial production and daily life, robots increasingly replace humans in performing various tasks such as welding, cutting, stamping, painting, material handling, and precision machining, where they hold significant advantages. During grasping operations, however, the pose of the target object changes frequently, and the robot must adjust its own motion according to the object's pose information to achieve adaptive grasping.
In certain emerging industries that apply robots, the main products, such as mobile phones, tablets, and wearable devices, are replaced very quickly, with life cycles as short as a few months. Traditional robotic solutions sink a large amount of time and resources into the production line, which does not fit this production model: these emerging industries feature many product categories and generally small production volumes. Humans take charge of the processes demanding high flexibility, touch, and compliance, while robots use their speed and accuracy for repetitive work.
Most current research combines grasping with vision. However, to obtain the robot joint coordinates from the camera's observation variables, the vision system must be calibrated, and traditional vision calibration methods demand considerable expertise from operators and a large amount of time to complete.
Summary of the invention
To overcome the limitations and defects of the prior art, the present invention provides a human-robot collaboration robot grasping system including a data acquisition part, a data processing part, and an industrial robot part. The data acquisition part includes a wearable inertial sensor module and a camera ranging module; the data processing part includes a Gaussian-model EM-algorithm training module and a Gaussian-model processing module; the industrial robot part includes an adaptive robot joint coordinate extraction module.
The wearable inertial sensor module is arranged on the operator's hand and extracts the posture variables of the operator grasping an object.
The camera ranging module extracts the coordinates of the object's edge points and takes their average as the coordinates (x, y) of the target object's center point.
The Gaussian-model EM-algorithm training module estimates the parameters of the Gaussian model from training data, forming the mapping between object observation variables and robot joint variables.
The Gaussian-model processing module associates the robot joint variables with the object observation variables and, through the mapping obtained by prior training, predicts the corresponding robot joint variables from new object observation variables.
The adaptive robot joint coordinate extraction module extracts the robot joint variables obtained by the Gaussian-model processing module.
Optionally, the wearable inertial sensor module includes a sensor node and an aggregation control node; the sensor node is arranged at the palm, and the aggregation control node is arranged on the back of the hand.
The sensor node collects the posture information of the operator's palm while grasping the object.
The aggregation control node collects the data of the sensor node and sends the collected data to the data processing part by wireless transmission.
Optionally, the sensor node is a three-axis gyroscope for obtaining the three-axis angular velocities (g_x, g_y, g_z) of the palm.
The three-axis gyroscope also integrates the three-axis angular velocities over the sampling time to obtain the posture information of the sensor node, which is expressed in the form of Euler angles.
Optionally, the industrial robot part further includes a robot body and a low-level controller.
The adaptive robot joint coordinate extraction module sends the robot joint variables to the robot body and the low-level controller, and the low-level controller controls the robot body to complete the grasping action on the object.
The present invention also provides a working method of the human-robot collaboration robot grasping system. The grasping system includes a data acquisition part, a data processing part, and an industrial robot part; the data acquisition part includes a wearable inertial sensor module and a camera ranging module; the data processing part includes a Gaussian-model EM-algorithm training module and a Gaussian-model processing module; the industrial robot part includes an adaptive robot joint coordinate extraction module.
The working method of the human-robot collaboration robot grasping system includes:
the wearable inertial sensor module extracting the posture variables of the operator grasping an object;
the camera ranging module extracting the coordinates of the object's edge points and taking their average as the coordinates (x, y) of the target object's center point;
the Gaussian-model EM-algorithm training module estimating the parameters of the Gaussian model from training data to form the mapping between object observation variables and robot joint variables;
the Gaussian-model processing module associating the robot joint variables with the object observation variables and, through the mapping obtained by prior training, predicting the corresponding robot joint variables from new object observation variables;
the adaptive robot joint coordinate extraction module extracting the robot joint variables obtained by the Gaussian-model processing module.
Optionally, the step in which the Gaussian-model processing module associates the robot joint variables with the object observation variables and, through the mapping obtained by prior training, predicts the corresponding robot joint variables from new object observation variables includes:
under the probability distribution of the Gaussian mixture model, the probability of occurrence of a sample x^o is

p(x^o) = Σ_{k=1}^{m} α_k · N(x^o; μ_k, Σ_k)

where m is the total number of Gaussian distributions, p(k) = α_k is the probability that sample x^o comes from the k-th Gaussian distribution, p(x^o | k) = N(x^o; μ_k, Σ_k) is the probability that the k-th Gaussian distribution generates sample x^o, and μ_k and Σ_k are the mean vector and covariance matrix of the k-th Gaussian distribution;
the posterior probability that a new observation variable o_new comes from the k-th Gaussian distribution is

p(k | o_new) ∝ p(k) · p(o_new | k)

k* = argmax_k p(k | o_new)

the mean vector μ_k and covariance matrix Σ_k of the k-th Gaussian distribution are partitioned into blocks

μ_k = [μ_r; μ_o],   Σ_k = [[K_rr, K_ro]; [K_or, K_oo]]

where μ_r is the mean of the joint variables formed by the robot joint coordinates in the training sample set, μ_o is the mean of the observation variables formed by the object's center point coordinates and the sensor posture information in the training sample set, K_rr is the covariance of the joint variables, K_oo is the covariance of the observation variables, K_or is the covariance between the observation variables and the joint variables, and K_ro is the covariance between the joint variables and the observation variables;
the conditional probability distribution matching the newly acquired observation variable o_new of the object with the robot joint angles r_new is

p(r_new | o_new) = N(r_new; r̂_new, Σ̂),   r̂_new = μ_r + K_ro · K_oo⁻¹ · (o_new − μ_o),   Σ̂ = K_rr − K_ro · K_oo⁻¹ · K_or

where r̂_new is the mean of the distribution of joint angles adapted to the new observation variable, and Σ̂ is the covariance matrix of the Gaussian distribution.
Optionally, the wearable inertial sensor module includes a sensor node and an aggregation control node; the sensor node is arranged at the palm, and the aggregation control node is arranged on the back of the hand.
The step in which the wearable inertial sensor module extracts the posture variables of the operator grasping the object includes:
the sensor node collecting the posture information of the operator's palm during grasping;
the aggregation control node collecting the data of the sensor node and sending the collected data to the data processing part by wireless transmission.
The present invention has the following beneficial effects:
In the human-robot collaboration robot grasping system and working method provided by the invention, the grasping system includes a wearable inertial sensor module, a camera ranging module, a Gaussian-model EM-algorithm training module, a Gaussian-model processing module, and an adaptive robot joint coordinate extraction module. The wearable inertial sensor module extracts the posture variables of the operator grasping an object; the camera ranging module extracts the coordinates of the object's edge points and takes their average as the coordinates (x, y) of the target object's center point; the Gaussian-model EM-algorithm training module estimates the parameters of the Gaussian model from training data, forming the mapping between object observation variables and robot joint variables; the Gaussian-model processing module associates the robot joint variables with the object observation variables and, through the mapping obtained by prior training, predicts the corresponding robot joint variables from new object observation variables; and the adaptive robot joint coordinate extraction module extracts the robot joint variables obtained by the Gaussian-model processing module. In human-robot collaboration scenarios, the system requires neither calibration of the robot vision system nor kinematic inversion, lowering the professional skill demanded of operators. In addition, the system can establish the mapping without a large number of samples, so that the robot trajectory is smooth and compliant.
Brief description of the drawings
Fig. 1 is a structural schematic diagram of the human-robot collaboration robot grasping system provided by Embodiment 1 of the present invention.
Fig. 2 is a flow chart of the human-robot collaboration robot grasping system provided by Embodiment 1 of the present invention.
Fig. 3 is a schematic diagram of the pixel coordinates of the object center point obtained by the human-robot collaboration robot grasping system provided by Embodiment 1 of the present invention.
Fig. 4 shows the robot joint variables predicted from the object observation variables by the human-robot collaboration robot grasping system provided by Embodiment 1 of the present invention.
Specific embodiment
To enable those skilled in the art to better understand the technical solution of the present invention, the human-robot collaboration robot grasping system and its working method provided by the present invention are described in detail below with reference to the accompanying drawings.
Embodiment 1
Fig. 1 is a structural schematic diagram of the human-robot collaboration robot grasping system provided by Embodiment 1 of the present invention. As shown in Fig. 1, the grasping system provided in this embodiment includes a wearable inertial sensor module, a camera ranging module, a Gaussian-model EM-algorithm training module, a Gaussian-model processing module, and an adaptive robot joint coordinate extraction module. The wearable inertial sensor module extracts the posture variables of the operator grasping an object; the camera ranging module extracts the coordinates of the object's edge points and takes their average as the coordinates (x, y) of the target object's center point; the Gaussian-model EM-algorithm training module estimates the parameters of the Gaussian model from training data, forming the mapping between object observation variables and robot joint variables; the Gaussian-model processing module associates the robot joint variables with the object observation variables and, through the mapping obtained by prior training, predicts the corresponding robot joint variables from new object observation variables; and the adaptive robot joint coordinate extraction module extracts the robot joint variables obtained by the Gaussian-model processing module. In human-robot collaboration scenarios, the grasping system provided in this embodiment requires neither calibration of the robot vision system nor kinematic inversion, lowering the professional skill demanded of operators. In addition, the system can establish the mapping without a large number of samples, so that the robot trajectory is smooth and compliant.
Fig. 2 is a flow chart of the human-robot collaboration robot grasping system provided by Embodiment 1 of the present invention. As shown in Fig. 2, the grasping system includes a wearable inertial sensor module, a camera ranging module, a Gaussian-model processing module, and an adaptive robot joint coordinate extraction module, and is broadly divided into a data acquisition part, a data processing part, and an industrial robot part. The data acquisition part includes the wearable inertial sensor module and the camera ranging module; the data processing part covers data reception, Gaussian process handling, and the training data samples; the industrial robot part includes the robot joint extraction module, the industrial robot body, and its low-level controller.
The wearable inertial sensor module provided in this embodiment includes one sensor node and one aggregation control node. The sensor node collects the posture information of the operator's palm while grasping the object block. The sensor node is mainly a three-axis gyroscope for collecting the three-axis angular velocities (g_x, g_y, g_z) of the palm. The aggregation control node gathers the data of the sensor node, processes them, and sends them to the data processing part by wireless transmission.
The sensor node is arranged at the center of the palm, and each sensor node consists of one three-axis gyroscope. The three-axis gyroscope collects the three-axis angular velocities (g_x, g_y, g_z) of each joint. The aggregation control node is arranged on the back of the hand and collects the angular velocity data. The gyroscope sensor measures the carrier's angular velocities along the X, Y, and Z axes; these are integrated over the sampling time to compute the posture information of the sensor, which is expressed in the form of Euler angles. Finally, the data are packaged and sent to the data processing part by radio.
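As an illustration of this integration step only, the following minimal Python sketch accumulates gyroscope readings into Euler angles. The sampling period, variable names, and the simple rectangular integration are assumptions made for the example; a real implementation would apply the proper Euler-rate transformation for large rotations.

```python
import numpy as np

def integrate_gyro(rates, dt):
    """Accumulate three-axis angular rates into Euler angles.

    rates: (N, 3) array of (g_x, g_y, g_z) samples in rad/s
    dt:    sampling period in seconds
    Rectangular integration under a small-angle assumption,
    adequate only for slow palm motion.
    """
    return np.cumsum(rates * dt, axis=0)

# Example: 100 samples at 100 Hz of a slow roll about the x-axis
rates = np.tile([0.1, 0.0, 0.0], (100, 1))   # rad/s
euler = integrate_gyro(rates, dt=0.01)
print(euler[-1])   # ~[0.1, 0.0, 0.0] rad after 1 s
```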
Fig. 3 is a schematic diagram of the pixel coordinates of the object center point obtained by the human-robot collaboration robot grasping system provided by Embodiment 1 of the present invention. As shown in Fig. 3, the working region is a 400 mm × 350 mm rectangular area at the center of the workbench. This embodiment expresses positions in pixels; one pixel represents approximately 0.4 mm of actual distance, with x pixel coordinates running from 200 to 1200 and y pixel coordinates from 200 to 1075. The object block to be grasped is a small yellow cube; this embodiment averages the coordinates of the color block's edge points to obtain the pixel coordinates x and y of the object's center point.
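A minimal sketch of this center-point computation, assuming the edge points of the yellow block have already been detected (for instance by color thresholding, which is not shown here):

```python
import numpy as np

def object_center(edge_points):
    """Average the pixel coordinates of the detected edge points.

    edge_points: (N, 2) array of (x, y) edge-point pixel coordinates
    Returns the (x, y) pixel coordinates of the object center point.
    """
    return edge_points.mean(axis=0)

# Example with four edge samples of a small block
edges = np.array([[598, 410], [642, 410], [598, 452], [642, 452]])
cx, cy = object_center(edges)
print(cx, cy)              # 620.0 431.0 (pixels)
print(cx * 0.4, cy * 0.4)  # rough conversion at ~0.4 mm per pixel
```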
The data processing part provided in this embodiment includes a data receiving part, the Gaussian-model EM-algorithm training module, and the Gaussian-model processing module. The data receiving part receives the data sent by the camera ranging module and the sensor and passes them to the Gaussian processing module. The training data are gathered by a person placing the object block in the working region and dragging the robot until it reaches the grasping pose, recording the gripper pose and the value of each robot joint at that moment.
This embodiment takes the data of 20 randomly executed acquisitions as the sample set. The samples are trained with the EM algorithm of the Gaussian mixture model and simultaneously classified, each class of samples corresponding to a different working region. This embodiment constructs the mapping between the object observation variables and the robot joint variables. After the model parameters are initialized, they are updated continuously through iterations of the E-step and M-step until convergence. In human-robot collaboration scenarios, the grasping system provided in this embodiment requires neither calibration of the robot vision system nor kinematic inversion, lowering the professional skill demanded of operators. In addition, the system can establish the mapping without a large number of samples, so that the robot trajectory is smooth and compliant.
E-step: compute the weight r_ik that the i-th sample x_i comes from the k-th Gaussian distribution:

r_ik = α_k · N(x_i; μ_k, Σ_k) / Σ_{j=1}^{m} α_j · N(x_i; μ_j, Σ_j)

M-step: update the parameters of each cluster:

α_k = (1/n) Σ_i r_ik,   μ_k = Σ_i r_ik · x_i / Σ_i r_ik,   Σ_k = Σ_i r_ik · (x_i − μ_k)(x_i − μ_k)^T / Σ_i r_ik
In training the model with the EM algorithm, this embodiment divides the training samples into m classes; each class of samples obeys one Gaussian distribution and corresponds to one region of the working space. Suppose the classes of samples respectively obey Gaussian distributions with means μ_1, μ_2, …, μ_m and covariance matrices Σ_1, Σ_2, …, Σ_m:

h_k = {x_i^(k) | i = 1, 2, …, n_k},   k = 1, 2, …, m

where h_k (k = 1, 2, …, m) denotes the training samples obeying the k-th Gaussian distribution, a subset of the training sample set X; x_i^(k) denotes the i-th sample belonging to the k-th Gaussian distribution; and n_k denotes the sample size of the k-th Gaussian distribution (corresponding to the k-th sub-region).
In this embodiment, when executing a task the robot needs to adjust its pose to adapt to the object: f : o → r,
where o is the observation variable of the object (likewise o_i below), r is the robot joint variable corresponding to the observation variable (likewise r_i below), and f is the mapping from observation variables to joint variables.
In this embodiment, suppose X = {x_1, x_2, …, x_n} is the training sample set obtained by teaching, where x_i = [r_i, o_i]^T is a single training sample (vector) composed of a joint variable and an observation variable. When the robot works, it first learns from the training sample set X to obtain the mapping function f; when a new observation variable o_new is obtained, the corresponding joint variable r_new is obtained through the mapping function f.
Fig. 4 shows the robot joint variables predicted from the object observation variables by the human-robot collaboration robot grasping system provided by Embodiment 1 of the present invention. As shown in Fig. 4, selecting a suitable model to describe the mapping function f is essential for learning adaptive robot grasping. This embodiment models with a Gaussian mixture model to construct the mapping between the object observation variables and the robot joint variables. The EM algorithm trains the model and divides the training samples into several classes; each class of samples obeys one Gaussian distribution and corresponds to one region. The Gaussian mixture model is a linear superposition of multiple Gaussian distributions, so the probability distribution of the training samples is described by multiple Gaussian processes. Under the probability distribution of the Gaussian mixture model, the probability of occurrence of a sample x^o is

p(x^o) = Σ_{k=1}^{m} α_k · N(x^o; μ_k, Σ_k)

where m is the total number of Gaussian distributions, p(k) = α_k is the probability that sample x^o comes from the k-th Gaussian distribution, p(x^o | k) = N(x^o; μ_k, Σ_k) is the probability that the k-th Gaussian distribution generates sample x^o, and μ_k and Σ_k are the mean vector and covariance matrix of the k-th Gaussian distribution.
This embodiment uses the EM algorithm to train the model and learn its parameters. While learning the model parameters with the EM algorithm, the training samples are clustered: the sample set is divided into m subsets, each of which obeys a Gaussian distribution and corresponds to one region of the working space.
When the robot works, it obtains the observation variable o_new of the object through the sensor and the camera, computes the posterior probability p(k | o_new) that this observation variable comes from each Gaussian distribution, and selects the Gaussian process regression corresponding to the maximum posterior probability to obtain the robot joint coordinates, driving the robot to grasp the object adaptively.
The posterior probability that the new observation variable o_new comes from the k-th Gaussian distribution is

p(k | o_new) ∝ p(k) · p(o_new | k)

Suppose the posterior probability corresponding to the k-th Gaussian distribution is the largest; the Gaussian process regression corresponding to the k-th Gaussian distribution is then selected to predict the robot joint angles. The mean vector μ_k and covariance matrix Σ_k of the k-th Gaussian distribution are partitioned into blocks

μ_k = [μ_r; μ_o],   Σ_k = [[K_rr, K_ro]; [K_or, K_oo]]

where μ_r is the mean of the joint variables formed by the robot joint coordinates in the training sample set, μ_o is the mean of the observation variables formed by the object's center point coordinates and the sensor posture information in the training sample set, K_rr is the covariance of the joint variables, K_oo is the covariance of the observation variables, K_or is the covariance between the observation variables and the joint variables, and K_ro is the covariance between the joint variables and the observation variables.
The conditional probability distribution matching the newly acquired observation variable o_new of the object with the robot joint angles r_new is

p(r_new | o_new) = N(r_new; r̂_new, Σ̂),   r̂_new = μ_r + K_ro · K_oo⁻¹ · (o_new − μ_o),   Σ̂ = K_rr − K_ro · K_oo⁻¹ · K_or

where r̂_new, the mean of the distribution of joint angles adapted to the new observation variable, corresponds to the maximum probability density of the Gaussian distribution, and Σ̂, the covariance matrix of the Gaussian distribution, represents the uncertainty of the prediction result. By driving each robot joint to r̂_new, the robot completes the grasping operation with the greatest likelihood.
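A minimal sketch of this prediction step, reusing the imports and the mixture parameters (`alpha`, `mu`, `Sigma`) fitted by the `em_gmm` sketch above; the convention that the first `d_r` entries of each sample vector are the joint coordinates follows x_i = [r_i, o_i] and is an assumption of the example:

```python
def predict_joints(o_new, alpha, mu, Sigma, d_r):
    """Pick the max-posterior component for o_new, then condition r | o.

    o_new: observation vector of the object, shape (d_o,)
    d_r:   number of joint coordinates at the front of each sample vector
    Returns the conditional mean and covariance of the joint variables.
    """
    # Posterior p(k | o_new) ~ p(k) * p(o_new | k), over the observation block
    post = np.array([
        alpha[k] * multivariate_normal.pdf(o_new, mu[k][d_r:],
                                           Sigma[k][d_r:, d_r:])
        for k in range(len(alpha))])
    k = int(np.argmax(post))
    # Block partition of the winning component's mean and covariance
    mu_r, mu_o = mu[k][:d_r], mu[k][d_r:]
    K_rr, K_ro = Sigma[k][:d_r, :d_r], Sigma[k][:d_r, d_r:]
    K_or, K_oo = Sigma[k][d_r:, :d_r], Sigma[k][d_r:, d_r:]
    # Gaussian conditioning: r_hat = mu_r + K_ro K_oo^-1 (o_new - mu_o)
    K_oo_inv = np.linalg.inv(K_oo)
    r_hat = mu_r + K_ro @ K_oo_inv @ (o_new - mu_o)
    Sigma_hat = K_rr - K_ro @ K_oo_inv @ K_or
    return r_hat, Sigma_hat
```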
Without requiring vision system calibration or kinematic inversion, the Gaussian mixture model directly associates the joint variables in multiple regions with the observation variables, enabling the robot to predict the adapted joint variables from a new observation variable.
The industrial robot part provided in this embodiment includes the joint extraction module, the robot body, and a low-level controller. The joint extraction module extracts the joint coordinates processed by the Gaussian model and sends them to the robot body and the controller, and the robot then smoothly completes the grasping of the object block.
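To make the overall data flow concrete, here is a hypothetical end-to-end sketch tying together the sketches above; `send_joint_command` stands in for whatever interface the low-level controller actually exposes and is purely illustrative:

```python
def grasp_once(edge_points, palm_euler, alpha, mu, Sigma, d_r,
               send_joint_command):
    """One adaptive grasp cycle: observe, predict joints, command robot."""
    cx, cy = object_center(edge_points)             # camera ranging module
    o_new = np.concatenate([[cx, cy], palm_euler])  # observation variable
    r_hat, _ = predict_joints(o_new, alpha, mu, Sigma, d_r)
    send_joint_command(r_hat)                       # low-level controller
```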
The human-robot collaboration robot grasping system provided in this embodiment includes a wearable inertial sensor module, a camera ranging module, a Gaussian-model EM-algorithm training module, a Gaussian-model processing module, and an adaptive robot joint coordinate extraction module. The wearable inertial sensor module extracts the posture variables of the operator grasping an object; the camera ranging module extracts the coordinates of the object's edge points and takes their average as the coordinates (x, y) of the target object's center point; the Gaussian-model EM-algorithm training module estimates the parameters of the Gaussian model from training data, forming the mapping between object observation variables and robot joint variables; the Gaussian-model processing module associates the robot joint variables with the object observation variables and, through the mapping obtained by prior training, predicts the corresponding robot joint variables from new object observation variables; and the adaptive robot joint coordinate extraction module extracts the robot joint variables obtained by the Gaussian-model processing module. In human-robot collaboration scenarios, the grasping system provided in this embodiment requires neither calibration of the robot vision system nor kinematic inversion, lowering the professional skill demanded of operators. In addition, the system can establish the mapping without a large number of samples, so that the robot trajectory is smooth and compliant.
It is to be understood that the above embodiments are merely exemplary embodiments adopted to illustrate the principle of the present invention; the present invention is not limited thereto. Those of ordinary skill in the art may make various variations and improvements without departing from the spirit and essence of the present invention, and such variations and improvements are also considered within the protection scope of the present invention.

Claims (7)

1. A human-robot collaboration robot grasping system, characterized by comprising a data acquisition part, a data processing part, and an industrial robot part, wherein the data acquisition part includes a wearable inertial sensor module and a camera ranging module, the data processing part includes a Gaussian-model EM-algorithm training module and a Gaussian-model processing module, and the industrial robot part includes an adaptive robot joint coordinate extraction module;
the wearable inertial sensor module is arranged on the operator's hand and extracts the posture variables of the operator grasping an object;
the camera ranging module extracts the coordinates of the object's edge points and takes their average as the coordinates (x, y) of the target object's center point;
the Gaussian-model EM-algorithm training module estimates the parameters of the Gaussian model from training data, forming the mapping between object observation variables and robot joint variables;
the Gaussian-model processing module associates the robot joint variables with the object observation variables and, through the mapping obtained by prior training, predicts the corresponding robot joint variables from new object observation variables;
the adaptive robot joint coordinate extraction module extracts the robot joint variables obtained by the Gaussian-model processing module.
2. The human-robot collaboration robot grasping system according to claim 1, characterized in that the wearable inertial sensor module includes a sensor node and an aggregation control node, the sensor node being arranged at the palm and the aggregation control node being arranged on the back of the hand;
the sensor node collects the posture information of the operator's palm while grasping the object;
the aggregation control node collects the data of the sensor node and sends the collected data to the data processing part by wireless transmission.
3. The human-robot collaboration robot grasping system according to claim 2, characterized in that the sensor node is a three-axis gyroscope for obtaining the three-axis angular velocities (g_x, g_y, g_z) of the palm;
the three-axis gyroscope also integrates the three-axis angular velocities over the sampling time to obtain the posture information of the sensor node, which is expressed in the form of Euler angles.
4. The human-robot collaboration robot grasping system according to claim 1, characterized in that the industrial robot part further includes a robot body and a low-level controller;
the adaptive robot joint coordinate extraction module sends the robot joint variables to the robot body and the low-level controller, and the low-level controller controls the robot body to complete the grasping action on the object.
5. A working method of a human-robot collaboration robot grasping system, characterized in that the grasping system comprises a data acquisition part, a data processing part, and an industrial robot part, the data acquisition part including a wearable inertial sensor module and a camera ranging module, the data processing part including a Gaussian-model EM-algorithm training module and a Gaussian-model processing module, and the industrial robot part including an adaptive robot joint coordinate extraction module;
the working method of the human-robot collaboration robot grasping system includes:
the wearable inertial sensor module extracting the posture variables of the operator grasping an object;
the camera ranging module extracting the coordinates of the object's edge points and taking their average as the coordinates (x, y) of the target object's center point;
the Gaussian-model EM-algorithm training module estimating the parameters of the Gaussian model from training data to form the mapping between object observation variables and robot joint variables;
the Gaussian-model processing module associating the robot joint variables with the object observation variables and, through the mapping obtained by prior training, predicting the corresponding robot joint variables from new object observation variables;
the adaptive robot joint coordinate extraction module extracting the robot joint variables obtained by the Gaussian-model processing module.
6. The working method of the human-robot collaboration robot grasping system according to claim 5, characterized in that the step in which the Gaussian-model processing module associates the robot joint variables with the object observation variables and, through the mapping obtained by prior training, predicts the corresponding robot joint variables from new object observation variables includes:
under the probability distribution of the Gaussian mixture model, the probability of occurrence of a sample x^o is

p(x^o) = Σ_{k=1}^{m} α_k · N(x^o; μ_k, Σ_k)

where m is the total number of Gaussian distributions, p(k) = α_k is the probability that sample x^o comes from the k-th Gaussian distribution, p(x^o | k) = N(x^o; μ_k, Σ_k) is the probability that the k-th Gaussian distribution generates sample x^o, and μ_k and Σ_k are the mean vector and covariance matrix of the k-th Gaussian distribution;
the posterior probability that a new observation variable o_new comes from the k-th Gaussian distribution is

p(k | o_new) ∝ p(k) · p(o_new | k)

the mean vector μ_k and covariance matrix Σ_k of the k-th Gaussian distribution are partitioned into blocks

μ_k = [μ_r; μ_o],   Σ_k = [[K_rr, K_ro]; [K_or, K_oo]]

where μ_r is the mean of the joint variables formed by the robot joint coordinates in the training sample set, μ_o is the mean of the observation variables formed by the object's center point coordinates and the sensor posture information in the training sample set, K_rr is the covariance of the joint variables, K_oo is the covariance of the observation variables, K_or is the covariance between the observation variables and the joint variables, and K_ro is the covariance between the joint variables and the observation variables;
the conditional probability distribution matching the newly acquired observation variable o_new of the object with the robot joint angles r_new is

p(r_new | o_new) = N(r_new; r̂_new, Σ̂),   r̂_new = μ_r + K_ro · K_oo⁻¹ · (o_new − μ_o),   Σ̂ = K_rr − K_ro · K_oo⁻¹ · K_or

where r̂_new is the mean of the distribution of joint angles adapted to the new observation variable, and Σ̂ is the covariance matrix of the Gaussian distribution.
7. The working method of the human-robot collaboration robot grasping system according to claim 5, characterized in that the wearable inertial sensor module includes a sensor node and an aggregation control node, the sensor node being arranged at the palm and the aggregation control node being arranged on the back of the hand;
the step in which the wearable inertial sensor module extracts the posture variables of the operator grasping the object includes:
the sensor node collecting the posture information of the operator's palm during grasping;
the aggregation control node collecting the data of the sensor node and sending the collected data to the data processing part by wireless transmission.
CN201910154643.4A 2019-03-01 2019-03-01 Human-robot collaboration robot grasping system and working method thereof Pending CN109732610A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910154643.4A CN109732610A (en) 2019-03-01 2019-03-01 Human-robot collaboration robot grasping system and working method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910154643.4A CN109732610A (en) 2019-03-01 2019-03-01 Human-robot collaboration robot grasping system and working method thereof

Publications (1)

Publication Number Publication Date
CN109732610A (en) 2019-05-10

Family

ID=66368946

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910154643.4A Pending Human-robot collaboration robot grasping system and working method thereof 2019-03-01 2019-03-01

Country Status (1)

Country Link
CN (1) CN109732610A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103895022A (en) * 2014-03-17 2014-07-02 东南大学 Wearable type somatosensory control mechanical arm
CN107016342A (en) * 2017-03-06 2017-08-04 武汉拓扑图智能科技有限公司 A kind of action identification method and system
CN107363813A (en) * 2017-08-17 2017-11-21 北京航空航天大学 A kind of desktop industrial robot teaching system and method based on wearable device
KR101948558B1 (en) * 2017-09-28 2019-02-18 김종태 Hand-operated programmable modular robot
CN109382828A (en) * 2018-10-30 2019-02-26 武汉大学 A kind of Robot Peg-in-Hole assembly system and method based on learning from instruction

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112388628A (en) * 2019-08-13 2021-02-23 罗伯特·博世有限公司 Apparatus and method for training a gaussian process regression model
CN111230873A (en) * 2020-01-31 2020-06-05 武汉大学 Teaching learning-based collaborative handling control system and method
CN111251277A (en) * 2020-01-31 2020-06-09 武汉大学 Human-computer collaboration tool submission system and method based on teaching learning
CN111251277B (en) * 2020-01-31 2021-09-03 武汉大学 Human-computer collaboration tool submission system and method based on teaching learning
CN111230873B (en) * 2020-01-31 2022-02-01 武汉大学 Teaching learning-based collaborative handling control system and method
CN111376269A (en) * 2020-03-04 2020-07-07 北京海益同展信息科技有限公司 Object grabbing method and device, storage medium and electronic equipment
CN111376269B (en) * 2020-03-04 2021-11-09 北京海益同展信息科技有限公司 Object grabbing method and device, storage medium and electronic equipment
CN111424380A (en) * 2020-03-31 2020-07-17 山东大学 Robot sewing system and method based on skill learning and generalization
CN111775153A (en) * 2020-07-17 2020-10-16 中国科学院宁波材料技术与工程研究所 Heavy-load robot calibration method
CN111775153B (en) * 2020-07-17 2022-08-26 中国科学院宁波材料技术与工程研究所 Heavy-load robot calibration method

Similar Documents

Publication Publication Date Title
CN109732610A (en) Man-machine collaboration robot grasping system and its working method
CN108972494B (en) Humanoid manipulator grabbing control system and data processing method thereof
Li et al. Asymmetric bimanual control of dual-arm exoskeletons for human-cooperative manipulations
CN106346485B (en) The Non-contact control method of bionic mechanical hand based on the study of human hand movement posture
Morales et al. Integrated grasp planning and visual object localization for a humanoid robot with five-fingered hands
CN108563995B (en) Human computer cooperation system gesture identification control method based on deep learning
CN106737664A (en) Sort the Delta robot control methods and system of multiclass workpiece
Alonso et al. Current research trends in robot grasping and bin picking
CN107160364A (en) A kind of industrial robot teaching system and method based on machine vision
CN110298886B (en) Dexterous hand grabbing planning method based on four-stage convolutional neural network
Kabir et al. Automated planning for robotic cleaning using multiple setups and oscillatory tool motions
CN104156068B (en) Virtual maintenance interaction operation method based on virtual hand interaction feature layer model
CN103112007A (en) Human-machine interaction method based on mixing sensor
CN108858193A (en) A kind of mechanical arm grasping means and system
Vianello et al. Human posture prediction during physical human-robot interaction
CN109968310A (en) A kind of mechanical arm interaction control method and system
CN106514667A (en) Human-computer cooperation system based on Kinect skeletal tracking and uncalibrated visual servo
CN108628260A (en) Multi items Tool set equipment based on robot and automatic assembling technique
Omrčen et al. Autonomous acquisition of pushing actions to support object grasping with a humanoid robot
Li et al. Neural learning and kalman filtering enhanced teaching by demonstration for a baxter robot
Nadon et al. Automatic selection of grasping points for shape control of non-rigid objects
Abad et al. Fuzzy logic-controlled 6-DOF robotic arm color-based sorter with machine vision feedback
Arsenic Developmental learning on a humanoid robot
CN113681565A (en) Man-machine cooperation method and device for realizing article transfer between robots
CN108051001A (en) A kind of robot movement control method, system and inertia sensing control device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20190510