CN106960099A - Manipulator grasp stability recognition method based on deep learning

Manipulator grasp stability recognition method based on deep learning

Info

Publication number
CN106960099A
Authority
CN
China
Prior art keywords
data
haptic data
image data
manipulator
haptic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710191189.0A
Other languages
Chinese (zh)
Other versions
CN106960099B (en)
Inventor
刘华平
覃杰
孙富春
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tsinghua University
Original Assignee
Tsinghua University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tsinghua University
Priority to CN201710191189.0A
Publication of CN106960099A
Application granted
Publication of CN106960099B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00 Computer-aided design [CAD]
    • G06F30/10 Geometric CAD
    • G06F30/17 Mechanical parametric or variational design
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1628 Programme controls characterised by the control loop
    • B25J9/163 Programme controls characterised by the control loop learning, adaptive, model based, rule based expert control
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1628 Programme controls characterised by the control loop
    • B25J9/1633 Programme controls characterised by the control loop compliant, force, torque control, e.g. combined with position control
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • General Physics & Mathematics (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Computer Hardware Design (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Medical Informatics (AREA)
  • Computational Mathematics (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Manipulator (AREA)

Abstract

The present invention relates to a manipulator grasp stability recognition method based on deep learning, and belongs to the technical field of robot perception. The method first collects tactile data under different grasp stabilities, then converts the time-series tactile data into pictures, trains a deep learning network on the tactile picture data, and finally uses the trained network to recognize tactile picture data of unknown stability. Because the deep learning network is trained on raw data of inconsistent time spans, the resulting network is robust to differences in data duration. Moreover, the network updates its parameters based on the input data as a whole, so it is affected very little when the labels of only a few samples are wrong; the method is therefore also robust to noisy data.

Description

Manipulator grasp stability recognition method based on deep learning
Technical field
The present invention relates to a manipulator grasp stability recognition method based on deep learning, and belongs to the technical field of robot perception.
Background art
With the continuing development of technology and the deepening of application demand, robot perception has become a research hotspot. Vision is the primary perception mechanism of robots, but visual information is easily affected by occlusion, light intensity and similar factors. Tactile information is not disturbed by these factors, so under such scenarios it is an important supplement to visual information. The tactile data collected while a manipulator grasps an object reflects the friction at the grasp points, the elasticity of the object, and the contact between the fingers and the object, and can therefore be used to recognize grasp stability. The present invention uses the tactile data of the grasping process in a dynamic environment to judge the stability of the grasp; the trained network can predict whether a grasping action in a dynamic environment is stable, which makes it more practical.
Robots typically obtain tactile information through piezoresistive, capacitive or piezoelectric sensors distributed on the manipulator, sampled in a periodic loop, so tactile information is normally time-series data. Traditional tactile recognition algorithms usually compute the similarity between tactile samples with a dynamic time warping algorithm and then classify with methods such as nearest neighbor or support vector machines. However, when the durations of the tactile samples differ greatly, when the valid data segments are not aligned, or when the data contain noise, dynamic time warping tends not to produce accurate similarity results.
Summary of the invention
The purpose of the present invention is to propose a manipulator grasp stability recognition method based on deep learning, in which the time-series tactile data are converted into pictures and used to train a suitably designed deep learning network, and the network parameters obtained by training are used to recognize new data.
The manipulator grasp stability recognition method based on deep learning proposed by the present invention comprises the following steps:
(1) Collect tactile data of the manipulator under different grasp stabilities, as follows:
(1-1) Let the manipulator have a fingers, each fitted with b tactile sensors; the b tactile sensors on each finger constitute a contact array;
(1-2) Control the manipulator to grasp an object with a random grasp point and grasp force, and collect the tactile data of the grasping process. Let the collected tactile data be T, a three-dimensional matrix whose three dimensions are the number of fingers a of the manipulator, the number of tactile sensors per finger b, and the number of samples t. Let U be the tactile data of the last of the t samples; U is an a × b matrix;
(1-3) Shake the manipulator and collect one more tactile sample. Let V be the tactile data collected after shaking; V is an a × b matrix. Compute the maximum change of the tactile data on any single contact before and after shaking: diff = max(abs(U - V)), where abs(·) computes the element-wise absolute value and max(·) the maximum;
(1-4) Compute the maximum of the tactile data U, Umax = max(abs(U)), and compare the maximum change diff with it. If diff ≥ 0.25 × Umax, judge that the tactile data U has changed and this grasp is unstable, recorded as label = 0; if diff < 0.25 × Umax, judge that U has not changed and this grasp is stable, recorded as label = 1;
(1-5) Repeat steps (1-2)-(1-4) k times to obtain the tactile data {Ti} of k grasping processes; record the number of samples of each group as {ti} and the stability of each group as {labeli}, where i = 1, 2, ..., k;
(2) Convert the tactile data {Ti} obtained in step (1) into pictures, as follows:
(2-1) Compute the maximum Tmax = max({Ti}) and the minimum Tmin = min({Ti}) of the tactile data, where max(·) computes the maximum and min(·) the minimum;
(2-2) Replace each element x of {Ti} with floor((x - Tmin) / (Tmax - Tmin) × 255), where floor(·) drops the fractional part and keeps the integer part, obtaining the picture data {Ri} of all the tactile data {Ti}; {Ri} are k pictures of width b, length ti and channel number a, where i = 1, 2, ..., k, a is the number of fingers of the manipulator, b is the number of tactile sensors per finger, and ti is the number of samples of the tactile data;
(2-3) Let L be the maximum number of samples among the tactile data {Ti}, L = max({ti}). Pad the end of every picture in {Ri} whose length is less than L with 0 until its length is L, obtaining k pictures {Pi} of width b, length L and channel number a;
(3) Train a deep learning network with the tactile data stabilities {labeli} of step (1) and the picture data {Pi} of step (2), as follows:
(3-1) Compute the mean of the picture data, Pmean = mean({Pi}), where mean(·) computes the mean of each position, and subtract Pmean from each picture obtained in step (2), obtaining picture data {Qi} with zero mean at every position: Qi = Pi - Pmean, where i = 1, 2, ..., k;
(3-2) Design the deep learning network: the convolution kernel of the first convolutional layer has size a × b × 5, and the first layer outputs 64 feature maps; the second layer is an activation layer; the third layer is a fully connected layer whose output size is 2, corresponding respectively to stable and unstable tactile data, i.e. the judgment of manipulator grasp stability; the fourth layer is a classification loss layer. Here a is the number of fingers of the manipulator and b is the number of tactile sensors per finger;
(3-3) Train the deep learning network designed in step (3-2) with the labels {labeli} of step (1) and the picture data {Qi} of step (3-1), obtaining the grasp stability recognition model;
(4) Use the deep learning network trained in step (3) to recognize manipulator grasp stability, as follows:
(4-1) Let X be a group of tactile data of unknown stability, a three-dimensional matrix whose three dimensions are the number of fingers a of the manipulator, the number of tactile sensors per finger b, and the number of samples c. Judge the number of samples c of X: if c ≤ L, convert the tactile data into picture data according to step (2) and record it as Y; if c > L, delete the first c - L samples from X, then convert the remaining tactile data into picture data according to step (2) and record it as Y. Judge each element y of Y: if y < 0, set y = 0; if y > 255, set y = 255;
(4-2) Subtract the picture data mean Pmean obtained in step (3-1) from the picture data Y obtained in step (4-1), obtaining Z = Y - Pmean;
(4-3) Substitute the picture data Z obtained in step (4-2) into the deep learning network trained in step (3-3). Let the two output values of the fully connected layer be [m, n], where m corresponds to the unstable grasp state label = 0 and n to the stable grasp state label = 1. Compare the two output values [m, n]: if m ≥ n, the picture data Z is judged to be an unstable grasp; if m < n, a stable grasp. This gives the judgment of the manipulator's grasp stability.
The manipulator tactile information recognition method based on deep learning proposed by the present invention has the following advantages:
1. The method converts the time-series tactile data into tactile picture data and recognizes the pictures with a deep learning network, which solves the problem of inconsistent tactile data durations. Since the position and direction at which the manipulator grasps the object cannot be identical from one grasp to the next, the tactile data of the grasping processes inevitably differ in time length. The method trains the deep learning network on raw data of inconsistent time spans, so the resulting network is robust to data duration.
2. The method solves the problem that the valid segments of the tactile data are not aligned. The manipulator first starts recording tactile data and only then grasps the object, so the position of the valid tactile data of actual contact with the object varies. The method trains the deep learning network on raw data whose valid segments lie at inconsistent positions, so the resulting network is robust to the position of the valid data.
3. The method solves the problem that the tactile data contain noise. Owing to hardware issues such as jitter, the tactile data carry a certain degree of systematic error, and the labels of individual samples may also be wrong. The method is based on a deep learning network whose parameter updates are driven by the input data as a whole, so the network is affected very little when the labels of only a few samples are wrong. The method is therefore robust to noisy data.
Brief description of the drawings
Fig. 1 shows the deep learning network model used in the method of the invention.
Detailed description of the embodiments
The manipulator grasp stability recognition method based on deep learning proposed by the present invention comprises the following steps:
(1) Collect tactile data of the manipulator under different grasp stabilities, as follows:
(1-1) Let the manipulator have a fingers (a = 3 in one embodiment of the invention), each fitted with b tactile sensors (b = 24 in one embodiment of the invention); the b tactile sensors on each finger constitute a contact array;
(1-2) Control the manipulator to grasp an object with a random grasp point and grasp force, and collect the tactile data of the grasping process. Let the collected tactile data be T, a three-dimensional matrix whose three dimensions are the number of fingers a of the manipulator, the number of tactile sensors per finger b, and the number of samples t. Let U be the tactile data of the last of the t samples; U is an a × b matrix;
(1-3) Shake the manipulator and collect one more tactile sample. Let V be the tactile data collected after shaking; V is an a × b matrix. Compute the maximum change of the tactile data on any single contact before and after shaking: diff = max(abs(U - V)), where abs(·) computes the element-wise absolute value and max(·) the maximum;
(1-4) Compute the maximum of the tactile data U, Umax = max(abs(U)), and compare the maximum change diff with it. If diff ≥ 0.25 × Umax, judge that the tactile data U has changed and this grasp is unstable, recorded as label = 0; if diff < 0.25 × Umax, judge that U has not changed and this grasp is stable, recorded as label = 1;
(1-5) Repeat steps (1-2)-(1-4) k times (k = 2000 in one embodiment of the invention) to obtain the tactile data {Ti} of k grasping processes; record the number of samples of each group as {ti} and the stability of each group as {labeli}, where i = 1, 2, ..., k. A minimal sketch of this labeling step follows.
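The shake-test labeling of steps (1-3) and (1-4) reduces to a few array operations. The following is a minimal NumPy sketch, not part of the patent; the synthetic U and V merely stand in for real a × b sensor frames:

```python
import numpy as np

def label_grasp(U: np.ndarray, V: np.ndarray, ratio: float = 0.25) -> int:
    """Label a grasp from the tactile frames before (U) and after (V) shaking.

    Implements steps (1-3)-(1-4): the grasp is unstable (label = 0) when the
    largest per-contact change exceeds 25% of the largest reading in U.
    """
    diff = np.max(np.abs(U - V))   # max change on any single contact
    u_max = np.max(np.abs(U))      # maximum of the pre-shake frame
    return 0 if diff >= ratio * u_max else 1

# Example with the embodiment's dimensions: a = 3 fingers, b = 24 sensors.
rng = np.random.default_rng(0)
U = rng.uniform(0, 100, size=(3, 24))
V = U + rng.normal(0, 5, size=(3, 24))  # small change, so a stable grasp is expected
print(label_grasp(U, V))
```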
(2) Convert the tactile data {Ti} obtained in step (1) into pictures, as follows:
(2-1) Compute the maximum Tmax = max({Ti}) and the minimum Tmin = min({Ti}) of the tactile data, where max(·) computes the maximum and min(·) the minimum;
(2-2) Replace each element x of {Ti} with floor((x - Tmin) / (Tmax - Tmin) × 255), where floor(·) drops the fractional part and keeps the integer part, obtaining the picture data {Ri} of all the tactile data {Ti}; {Ri} are k pictures of width b, length ti and channel number a, where i = 1, 2, ..., k, a is the number of fingers of the manipulator, b is the number of tactile sensors per finger, and ti is the number of samples of the tactile data;
(2-3) Let L be the maximum number of samples among the tactile data {Ti}, L = max({ti}) (L = 620 in one embodiment of the invention). Pad the end of every picture in {Ri} whose length is less than L with 0 until its length is L, obtaining k pictures {Pi} of width b, length L and channel number a. A minimal sketch of this conversion follows.
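The picture conversion of step (2) is a global min-max scaling to [0, 255] followed by zero-padding every sequence to the common length L. Below is a minimal sketch, assuming the tactile sequences are held as a list of NumPy arrays of shape (a, b, ti); the patent fixes only the math, not this interface:

```python
import numpy as np

def tactile_to_pictures(T_list):
    """Convert tactile sequences T_i of shape (a, b, t_i) into padded
    picture data P_i of shape (a, b, L), per steps (2-1)-(2-3)."""
    t_max = max(T.max() for T in T_list)   # Tmax over all tactile data
    t_min = min(T.min() for T in T_list)   # Tmin over all tactile data
    L = max(T.shape[2] for T in T_list)    # largest number of samples
    pictures = []
    for T in T_list:
        R = np.floor((T - t_min) / (t_max - t_min) * 255)  # step (2-2)
        pad = L - R.shape[2]
        P = np.pad(R, ((0, 0), (0, 0), (0, pad)))          # step (2-3): pad the end with 0
        pictures.append(P)
    return np.stack(pictures), t_min, t_max, L
```

The per-position mean Pmean of step (3-1) is then simply the mean of the stacked pictures over the first axis, e.g. pictures.mean(axis=0).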
(3) Train a deep learning network with the tactile data stabilities {labeli} of step (1) and the picture data {Pi} of step (2), as follows:
(3-1) Compute the mean of the picture data, Pmean = mean({Pi}), where mean(·) computes the mean of each position, and subtract Pmean from each picture obtained in step (2), obtaining picture data {Qi} with zero mean at every position: Qi = Pi - Pmean, where i = 1, 2, ..., k;
(3-2) Design the deep learning network: the convolution kernel of the first convolutional layer has size a × b × 5, and the first layer outputs 64 feature maps; the second layer is an activation layer, whose effect is to remove the negative values in the feature maps; the third layer is a fully connected layer whose output size is 2, corresponding respectively to stable and unstable tactile data, i.e. the judgment of manipulator grasp stability; the fourth layer is a classification loss layer, whose effect is to keep optimizing the network parameters after each iteration so that accuracy improves. Here a is the number of fingers of the manipulator and b is the number of tactile sensors per finger; the network model is shown in Fig. 1;
(3-3) Train the deep learning network designed in step (3-2) with the labels {labeli} of step (1) and the picture data {Qi} of step (3-1), obtaining the grasp stability recognition model. A minimal sketch of this network and one training step follows.
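The four-layer network of step (3-2) is small enough to write out directly. Below is a sketch in PyTorch; the patent names no framework, so the framework, the (batch, channels = a, height = b, width = L) tensor layout and the SGD hyperparameters are assumptions, not the patent's prescription:

```python
import torch
import torch.nn as nn

a, b, L = 3, 24, 620  # embodiment values: fingers, sensors per finger, padded length

class GraspStabilityNet(nn.Module):
    """Convolution (a x b x 5 kernel, 64 feature maps) -> activation -> fully
    connected layer with 2 outputs; the classification loss layer of step (3-2)
    corresponds to the cross-entropy loss used below."""
    def __init__(self):
        super().__init__()
        # The kernel spans all b sensor rows and 5 time steps over the a channels.
        self.conv = nn.Conv2d(in_channels=a, out_channels=64, kernel_size=(b, 5))
        self.act = nn.ReLU()                      # removes negative values in the feature maps
        self.fc = nn.Linear(64 * 1 * (L - 4), 2)  # two outputs [m, n]

    def forward(self, x):                         # x: (batch, a, b, L)
        x = self.act(self.conv(x))
        return self.fc(x.flatten(1))

net = GraspStabilityNet()
loss_fn = nn.CrossEntropyLoss()                   # classification loss layer
opt = torch.optim.SGD(net.parameters(), lr=1e-3)  # assumed optimizer and learning rate

x = torch.randn(8, a, b, L)                       # a batch of mean-subtracted pictures Q_i
y = torch.randint(0, 2, (8,))                     # labels: 0 = unstable, 1 = stable
opt.zero_grad()
loss = loss_fn(net(x), y)
loss.backward()
opt.step()
```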
(4) Use the deep learning network trained in step (3) to recognize manipulator grasp stability, as follows:
(4-1) Let X be a group of tactile data of unknown stability, a three-dimensional matrix whose three dimensions are the number of fingers a of the manipulator, the number of tactile sensors per finger b, and the number of samples c. Judge the number of samples c of X: if c ≤ L, convert the tactile data into picture data according to step (2) and record it as Y; if c > L, delete the first c - L samples from X, then convert the remaining tactile data into picture data according to step (2) and record it as Y. Judge each element y of Y: if y < 0, set y = 0; if y > 255, set y = 255;
(4-2) Subtract the picture data mean Pmean obtained in step (3-1) from the picture data Y obtained in step (4-1), obtaining Z = Y - Pmean;
(4-3) Substitute the picture data Z obtained in step (4-2) into the deep learning network trained in step (3-3). Let the two output values of the fully connected layer be [m, n], where m corresponds to the unstable grasp state label = 0 and n to the stable grasp state label = 1. Compare the two output values [m, n]: if m ≥ n, the picture data Z is judged to be an unstable grasp; if m < n, a stable grasp. This gives the judgment of the manipulator's grasp stability. A minimal sketch of this recognition procedure follows.
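The recognition procedure of step (4) truncates the sample to its last L frames, converts it with the training-time Tmin and Tmax, clamps the result to [0, 255], subtracts the training mean and compares the two network outputs. A minimal sketch, reusing net from the previous block (P_mean, t_min, t_max and L are assumed to be saved from training):

```python
import numpy as np
import torch

def recognize(X, net, P_mean, t_min, t_max, L):
    """Judge the stability of an unknown tactile sample X of shape (a, b, c),
    per steps (4-1)-(4-3). Returns 1 for a stable grasp, 0 for an unstable one."""
    c = X.shape[2]
    if c > L:
        X = X[:, :, c - L:]                    # delete the first c - L samples
    Y = np.floor((X - t_min) / (t_max - t_min) * 255)
    Y = np.pad(Y, ((0, 0), (0, 0), (0, L - Y.shape[2])))  # pad the end to length L
    Y = np.clip(Y, 0, 255)                     # clamp elements to [0, 255]
    Z = Y - P_mean                             # subtract the training mean
    with torch.no_grad():
        m, n = net(torch.as_tensor(Z, dtype=torch.float32).unsqueeze(0))[0]
    return 1 if n > m else 0                   # m: unstable (label 0), n: stable (label 1)
```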

Claims (1)

1. A manipulator grasp stability recognition method based on deep learning, characterized in that the method comprises the following steps:
(1) Collect tactile data of the manipulator under different grasp stabilities, as follows:
(1-1) Let the manipulator have a fingers, each fitted with b tactile sensors; the b tactile sensors on each finger constitute a contact array;
(1-2) Control the manipulator to grasp an object with a random grasp point and grasp force, and collect the tactile data of the grasping process; let the collected tactile data be T, a three-dimensional matrix whose three dimensions are the number of fingers a of the manipulator, the number of tactile sensors per finger b, and the number of samples t; let U be the tactile data of the last of the t samples, U being an a × b matrix;
(1-3) Shake the manipulator and collect one more tactile sample; let V be the tactile data collected after shaking, V being an a × b matrix; compute the maximum change of the tactile data on any single contact before and after shaking: diff = max(abs(U - V)), where abs(·) computes the element-wise absolute value and max(·) the maximum;
(1-4) Compute the maximum of the tactile data U, Umax = max(abs(U)), and compare the maximum change diff with it; if diff ≥ 0.25 × Umax, judge that the tactile data U has changed and this grasp is unstable, recorded as label = 0; if diff < 0.25 × Umax, judge that U has not changed and this grasp is stable, recorded as label = 1;
(1-5) Repeat steps (1-2)-(1-4) k times to obtain the tactile data {Ti} of k grasping processes; record the number of samples of each group as {ti} and the stability of each group as {labeli}, where i = 1, 2, ..., k;
(2) Convert the tactile data {Ti} obtained in step (1) into pictures, as follows:
(2-1) Compute the maximum Tmax = max({Ti}) and the minimum Tmin = min({Ti}) of the tactile data, where max(·) computes the maximum and min(·) the minimum;
(2-2) Replace each element x of {Ti} with floor((x - Tmin) / (Tmax - Tmin) × 255), where floor(·) drops the fractional part and keeps the integer part, obtaining the picture data {Ri} of all the tactile data {Ti}; {Ri} are k pictures of width b, length ti and channel number a, where i = 1, 2, ..., k, a is the number of fingers of the manipulator, b is the number of tactile sensors per finger, and ti is the number of samples of the tactile data;
(2-3) Let L be the maximum number of samples among the tactile data {Ti}, L = max({ti}); pad the end of every picture in {Ri} whose length is less than L with 0 until its length is L, obtaining k pictures {Pi} of width b, length L and channel number a;
(3) Train a deep learning network with the tactile data stabilities {labeli} of step (1) and the picture data {Pi} of step (2), as follows:
(3-1) Compute the mean of the picture data, Pmean = mean({Pi}), where mean(·) computes the mean of each position; subtract Pmean from each picture obtained in step (2), obtaining picture data {Qi} with zero mean at every position: Qi = Pi - Pmean, where i = 1, 2, ..., k;
(3-2) Design the deep learning network: the convolution kernel of the first convolutional layer has size a × b × 5, and the first layer outputs 64 feature maps; the second layer is an activation layer; the third layer is a fully connected layer whose output size is 2, corresponding respectively to stable and unstable tactile data, i.e. the judgment of manipulator grasp stability; the fourth layer is a classification loss layer; here a is the number of fingers of the manipulator and b is the number of tactile sensors per finger;
(3-3) Train the deep learning network designed in step (3-2) with the labels {labeli} of step (1) and the picture data {Qi} of step (3-1), obtaining the grasp stability recognition model;
(4) Use the deep learning network trained in step (3) to recognize manipulator grasp stability, as follows:
(4-1) Let X be a group of tactile data of unknown stability, a three-dimensional matrix whose three dimensions are the number of fingers a of the manipulator, the number of tactile sensors per finger b, and the number of samples c; judge the number of samples c of X: if c ≤ L, convert the tactile data into picture data according to step (2) and record it as Y; if c > L, delete the first c - L samples from X, then convert the remaining tactile data into picture data according to step (2) and record it as Y; judge each element y of Y: if y < 0, set y = 0; if y > 255, set y = 255;
(4-2) Subtract the picture data mean Pmean obtained in step (3-1) from the picture data Y obtained in step (4-1), obtaining Z = Y - Pmean;
(4-3) Substitute the picture data Z obtained in step (4-2) into the deep learning network trained in step (3-3); let the two output values of the fully connected layer be [m, n], where m corresponds to the unstable grasp state label = 0 and n to the stable grasp state label = 1; compare the two output values [m, n]: if m ≥ n, the picture data Z is judged to be an unstable grasp; if m < n, a stable grasp, giving the judgment of the manipulator's grasp stability.
CN201710191189.0A 2017-03-28 2017-03-28 Manipulator grasp stability recognition method based on deep learning Active CN106960099B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710191189.0A CN106960099B (en) 2017-03-28 2017-03-28 Manipulator grasp stability recognition method based on deep learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710191189.0A CN106960099B (en) 2017-03-28 2017-03-28 Manipulator grasp stability recognition method based on deep learning

Publications (2)

Publication Number Publication Date
CN106960099A 2017-07-18
CN106960099B (en) 2019-07-26

Family

ID=59471549

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710191189.0A Active CN106960099B (en) Manipulator grasp stability recognition method based on deep learning

Country Status (1)

Country Link
CN (1) CN106960099B (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108573059A (en) * 2018-04-26 2018-09-25 Time series classification method and device based on feature sampling
CN108673534A (en) * 2018-04-20 2018-10-19 A soft manipulator that uses an artificial synapse network system to realize intelligent sorting
CN109407603A (en) * 2017-08-16 2019-03-01 Method and device for controlling a mechanical arm to grasp an object
CN109591013A (en) * 2018-12-12 2019-04-09 Flexible assembly simulation system and its implementation method
CN110202583A (en) * 2019-07-09 2019-09-06 Humanoid manipulator control system based on deep learning and its control method
CN110413188A (en) * 2018-04-28 2019-11-05 Smart device control method and device
CN111055279A (en) * 2019-12-17 2020-04-24 Multi-modal object grasping method and system based on the combination of touch and vision
CN111459278A (en) * 2020-04-01 2020-07-28 Robot grasp state discrimination method based on a tactile array
CN113172663A (en) * 2021-03-24 2021-07-27 Manipulator grasp stability recognition method and device, and electronic equipment
CN113942009A (en) * 2021-09-13 2022-01-18 Robot bionic hand grasping method and system
CN114423574A (en) * 2019-09-15 2022-04-29 Determining a sequence of actions for environmental adjustments of a robot task

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2002043930A1 (en) * 2000-11-30 2002-06-06 Kuka Schweissanlagen Gmbh Lightweight grab for manipulators
CN102126221A (en) * 2010-12-23 2011-07-20 中国科学院自动化研究所 Method for grabbing object by mechanical hand based on image information
CN103759716A (en) * 2014-01-14 2014-04-30 清华大学 Dynamic target position and attitude measurement method based on monocular vision at tail end of mechanical arm
CN105082158A (en) * 2015-07-23 2015-11-25 河北省科学院应用数学研究所 System and method for grabbing and placing compressor based on image recognition
CN105956351A (en) * 2016-07-05 2016-09-21 上海航天控制技术研究所 Touch information classified computing and modelling method based on machine learning

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2002043930A1 (en) * 2000-11-30 2002-06-06 Kuka Schweissanlagen Gmbh Lightweight grab for manipulators
CN102126221A (en) * 2010-12-23 2011-07-20 中国科学院自动化研究所 Method for grabbing object by mechanical hand based on image information
CN103759716A (en) * 2014-01-14 2014-04-30 清华大学 Dynamic target position and attitude measurement method based on monocular vision at tail end of mechanical arm
CN105082158A (en) * 2015-07-23 2015-11-25 河北省科学院应用数学研究所 System and method for grabbing and placing compressor based on image recognition
CN105956351A (en) * 2016-07-05 2016-09-21 上海航天控制技术研究所 Touch information classified computing and modelling method based on machine learning

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
JIE WEI et al.: "Robotic grasping recognition using multi-modal deep extreme learning machine", Multidimensional Systems and Signal Processing *
LIU Yumei et al.: "Grasp stability analysis of a two-joint-link underactuated manipulator", Mechanical Engineering & Automation *
HAN Zheng et al.: "Kinect-based object grasping by a manipulator", CAAI Transactions on Intelligent Systems *

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109407603B (en) * 2017-08-16 2020-03-06 Method and device for controlling a mechanical arm to grasp an object
CN109407603A (en) * 2017-08-16 2019-03-01 Method and device for controlling a mechanical arm to grasp an object
CN108673534A (en) * 2018-04-20 2018-10-19 A soft manipulator that uses an artificial synapse network system to realize intelligent sorting
CN108573059B (en) * 2018-04-26 2021-02-19 Time series classification method and device based on feature sampling
CN108573059A (en) * 2018-04-26 2018-09-25 Time series classification method and device based on feature sampling
CN110413188A (en) * 2018-04-28 2019-11-05 Smart device control method and device
CN109591013A (en) * 2018-12-12 2019-04-09 Flexible assembly simulation system and its implementation method
CN110202583A (en) * 2019-07-09 2019-09-06 Humanoid manipulator control system based on deep learning and its control method
CN114423574A (en) * 2019-09-15 2022-04-29 Determining a sequence of actions for environmental adjustments of a robot task
CN111055279A (en) * 2019-12-17 2020-04-24 Multi-modal object grasping method and system based on the combination of touch and vision
CN111055279B (en) * 2019-12-17 2022-02-15 Multi-modal object grasping method and system based on the combination of touch and vision
CN111459278A (en) * 2020-04-01 2020-07-28 Robot grasp state discrimination method based on a tactile array
CN113172663A (en) * 2021-03-24 2021-07-27 Manipulator grasp stability recognition method and device, and electronic equipment
CN113942009A (en) * 2021-09-13 2022-01-18 Robot bionic hand grasping method and system

Also Published As

Publication number Publication date
CN106960099B (en) 2019-07-26

Similar Documents

Publication Publication Date Title
CN106960099A (en) A kind of manipulator grasp stability recognition methods based on deep learning
CN111417983B (en) Deformable object tracking based on event camera
Corona et al. Active garment recognition and target grasping point detection using deep learning
CN105082132B (en) Fast machine people's learning by imitation of power moment of torsion task
Cretu et al. Soft object deformation monitoring and learning for model-based robotic hand manipulation
JP4878842B2 (en) Robot drive method
CN110026987A (en) Generation method, device, equipment and the storage medium of a kind of mechanical arm crawl track
CN107316067B (en) A kind of aerial hand-written character recognition method based on inertial sensor
CN110532984A (en) Critical point detection method, gesture identification method, apparatus and system
CN108960192B (en) Action recognition method and neural network generation method and device thereof, and electronic equipment
Kiatos et al. Robust object grasping in clutter via singulation
CN107016342A (en) A kind of action identification method and system
CN107972026A (en) Robot, mechanical arm and its control method and device
Huang et al. Leveraging appearance priors in non-rigid registration, with application to manipulation of deformable objects
JP6587195B2 (en) Tactile information estimation device, tactile information estimation method, program, and non-transitory computer-readable medium
CN109685037A (en) A kind of real-time action recognition methods, device and electronic equipment
CN104573621A (en) Dynamic gesture learning and identifying method based on Chebyshev neural network
CN106671112A (en) Judging method of grabbing stability of mechanical arm based on touch sensation array information
El Zaatari et al. iTP-LfD: Improved task parametrised learning from demonstration for adaptive path generation of cobot
Faris et al. Proprioception and exteroception of a soft robotic finger using neuromorphic vision-based sensing
JPH0620055A (en) Method and device for picture signal processing
Chen et al. Robotic grasp control policy with target pre-detection based on deep Q-learning
Tada et al. Acquisition of multi-modal expression of slip through pick-up experiences
JP7249928B2 (en) Tactile information estimation device, tactile information estimation method and program
CN108829248A (en) A kind of mobile target selecting method and system based on the correction of user's presentation model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant