CN106346485B - Non-contact control method for a bionic mechanical hand based on learning human hand motion postures - Google Patents
- Publication number
- CN106346485B CN106346485B CN201610840052.9A CN201610840052A CN106346485B CN 106346485 B CN106346485 B CN 106346485B CN 201610840052 A CN201610840052 A CN 201610840052A CN 106346485 B CN106346485 B CN 106346485B
- Authority
- CN
- China
- Prior art keywords
- hand
- model
- dimensional
- finger
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
- 238000000034 method Methods 0.000 title claims abstract description 37
- 230000033001 locomotion Effects 0.000 title claims abstract description 27
- 238000013507 mapping Methods 0.000 claims abstract description 7
- 210000003811 finger Anatomy 0.000 claims description 99
- 230000006870 function Effects 0.000 claims description 75
- 210000000236 metacarpal bone Anatomy 0.000 claims description 17
- 210000003813 thumb Anatomy 0.000 claims description 16
- 210000000707 wrist Anatomy 0.000 claims description 15
- 210000000988 bone and bone Anatomy 0.000 claims description 13
- 210000002478 hand joint Anatomy 0.000 claims description 13
- 230000000875 corresponding effect Effects 0.000 claims description 12
- 210000004932 little finger Anatomy 0.000 claims description 12
- 238000000354 decomposition reaction Methods 0.000 claims description 11
- 238000001514 detection method Methods 0.000 claims description 8
- 239000011159 matrix material Substances 0.000 claims description 6
- 230000011218 segmentation Effects 0.000 claims description 6
- 230000003044 adaptive effect Effects 0.000 claims description 5
- 230000003068 static effect Effects 0.000 claims description 4
- 239000000284 extract Substances 0.000 claims description 3
- 210000003484 anatomy Anatomy 0.000 claims description 2
- 238000004040 coloring Methods 0.000 claims description 2
- 238000006073 displacement reaction Methods 0.000 claims description 2
- 238000000605 extraction Methods 0.000 claims description 2
- 210000004247 hand Anatomy 0.000 claims description 2
- 210000002411 hand bone Anatomy 0.000 claims description 2
- 238000005259 measurement Methods 0.000 claims description 2
- 238000010606 normalization Methods 0.000 claims description 2
- 238000013519 translation Methods 0.000 claims description 2
- 210000003857 wrist joint Anatomy 0.000 claims description 2
- 238000007689 inspection Methods 0.000 claims 1
- 230000000386 athletic effect Effects 0.000 abstract description 4
- 230000007812 deficiency Effects 0.000 abstract description 2
- 239000002245 particle Substances 0.000 description 7
- 238000005452 bending Methods 0.000 description 6
- 230000006399 behavior Effects 0.000 description 1
- 230000009286 beneficial effect Effects 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 230000018109 developmental process Effects 0.000 description 1
- 238000010586 diagram Methods 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 210000001145 finger joint Anatomy 0.000 description 1
- 230000000977 initiatory effect Effects 0.000 description 1
- 238000011160 research Methods 0.000 description 1
- 238000005070 sampling Methods 0.000 description 1
- 230000003238 somatosensory effect Effects 0.000 description 1
- 238000012546 transfer Methods 0.000 description 1
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1628—Programme controls characterised by the control loop
- B25J9/163—Programme controls characterised by the control loop learning, adaptive, model based, rule based expert control
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/107—Static hand or arm
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Image Analysis (AREA)
Abstract
The present invention provides a non-contact method for controlling a five-fingered bionic mechanical hand by learning human hand motion postures, and belongs to the field of intelligent control. The method proposes an adaptive three-dimensional hand modeling approach, tracks the motion postures of all joints of the operator's hand with the three-dimensional hand model, and establishes, through a mapping algorithm, the correspondence between the motion postures of the human hand and the action commands of the manipulator, so that the operator controls the five-fingered manipulator in a natural manner. The invention uses RGB-D images, establishes a three-dimensional hand model to describe the pose parameters of each joint of the human hand, and proposes an improved APSO algorithm to solve for the pose parameters, which effectively raises the convergence speed of the high-dimensional parameter solution. It avoids the limitation of wearing wearable devices such as data gloves, and overcomes the deficiency of existing sensor-based control methods, which can obtain the poses of only some hand joints and therefore cannot be applied to high-degree-of-freedom bionic dexterous hands.
Description
Technical field
The invention belongs to the field of intelligent control and relates to a non-contact method that takes RGB-D images as the input signal and controls a five-fingered bionic mechanical hand by learning human hand motion postures.
Background art
With the continuous expansion of the application range of robots, robots play an ever greater role in fields such as industrial control and virtual assembly, while the scenes and tasks of robot operation become increasingly complex. The manipulator is the main device and tool with which a robot completes its various tasks. Simple clamping devices and two-fingered manipulators can no longer meet these application demands, and manipulators are gradually developing toward multi-fingered, multi-joint, multi-degree-of-freedom mechanical dexterous hands. Although current five-fingered bionic dexterous hands come ever closer to the human hand in shape, their actual functionality and operational flexibility still differ greatly from those of the human hand. Their control methods usually follow a specific grasping target: the operator sends action commands to the five-fingered bionic hand directly through a computer or other control equipment, and the hand repeats pre-designed required actions according to a fixed routine. Such methods lack the human-machine coordination of a flexible human hand working with a mechanical hand, and cannot learn the movements of an actual human hand. How to let the robot's dexterous hand learn the various movements of the human hand in a natural manner is an important research problem in current bio-robotics.
In order to learn the movements of an actual human hand, the robot first needs to estimate the posture of the human hand's motion, and then map and convert the human hand's movements into its own action commands. At present the motion information of the human hand can mainly be obtained in two ways: contact-type data gloves and non-contact sensors. The first way collects the operator's gesture information through a worn data glove; in this mode the operator must wear the device, the deployment cost is high, and practical application scenes are considerably limited. For example, the operator needs to wear the glove throughout the grasping process, which is very inconvenient. The second way collects the motion information of the human hand through somatosensory devices and camera units, and then solves for the parameters of each hand joint by computer-vision methods. The operator controls the robotic bionic hand in a natural, non-contact manner; system setup is convenient and the user experience is better.
Existing non-contact control methods can be divided into two types: those based on gesture recognition and those based on hand-joint tracking. Methods based on gesture recognition preset fixed gesture types and match the operator's gesture against the data in a database, so as to make the manipulator complete the movement of the corresponding class. The manipulator's movements are still set according to a fixed routine, without any ability to learn or imitate. Methods based on hand-joint pose track the operator's joint points and use those joint parameters as the input parameters for controlling the manipulator, so that the manipulator imitates and learns human hand movements. Existing non-contact control methods mainly target simple two- or three-fingered manipulators with few degrees of freedom, so non-contact control of this kind of manipulator only needs to track part of the hand's joints, and only part of the human hand's movements can be learned. Such data cannot be applied to a five-fingered dexterous hand with a higher number of degrees of freedom: realizing non-contact control of a five-fingered dexterous hand requires estimating and tracking the poses of all joints of the human hand. Owing to the high degrees of freedom (26) and flexibility of the human hand, control of a five-fingered bionic mechanical hand based on learning human hand motion postures still presents certain difficulties.
Summary of the invention
In view of the limitations of the existing non-contact control modes for bionic mechanical hands, the purpose of the present invention is to provide a control method for a five-fingered bionic mechanical hand. By tracking the motion postures of all joints of the operator's hand, learning the motion posture of the human hand and mapping it to the movement posture of the manipulator, the operator controls the five-fingered manipulator in a natural manner. This avoids the limitation of wearing wearable devices such as data gloves, and overcomes the deficiency of existing sensor-based control methods, which can obtain the poses of only some hand joints and cannot be applied to high-degree-of-freedom bionic dexterous hands.
Technical solution of the present invention:
The non-contact control method of a bionic mechanical hand based on learning human hand motion postures comprises the following steps:
(1) Basic concepts
RGB-D image: acquired by an RGB-D camera and composed of two parts, an RGB image and a depth image (Depth Image). Each pixel value of the RGB image represents the color information of the image, and each pixel value of the depth image represents the actual distance from the sensor to the object. The RGB image and the depth image are usually registered, that is, there is a one-to-one correspondence between the pixels of the RGB image and those of the depth image.
Hand joint model: a hand joint model can be defined according to the skeletal structure of the human hand in anatomy and its kinematic constraints; this model is used to establish the three-dimensional hand model.
The hand joint model includes 5 metacarpals (Metacarpal), 5 proximal phalanges (Proximal phalanx), 4 middle phalanges (Middle phalanx) and 5 distal phalanges (Distal phalanx). The wrist joint point is the origin of the world coordinate system and has 6 degrees of freedom: global rotation (3 degrees of freedom) and global translation (3 degrees of freedom). The hand model includes four kinds of joint points, namely MCP, PIP, DIP and IP joint points. The connection point of a metacarpal and a proximal phalanx is an MCP joint point, the connection point of a proximal phalanx and a middle phalanx is a PIP joint point, and the connection point of a middle phalanx and a distal phalanx is a DIP joint point. The thumb has no middle phalanx; the connection point between its distal and proximal phalanges is the IP joint point. Each MCP joint point has 2 degrees of freedom: sideways extension (adduction and abduction, aa) and swing (flexion and extension, fe). The PIP, DIP and IP joint points each have only the 1 degree of freedom of swing.
Two-dimensional hand model: the collected hand depth image is subjected to quadtree decomposition; according to the similarity of depth values, the hand depth image is divided into multiple image blocks, and each image block is modeled with a two-dimensional Gaussian mixture model, yielding the two-dimensional hand model, denoted by the symbol C_I.
Three-dimensional hand model: the hand is modeled in three dimensions with an isotropic Gaussian mixture model, denoted by the symbol C_H.
Projection model: projecting the three-dimensional hand model under the current pose onto the image yields the corresponding projection model, denoted by the symbol C_P.
(2) Technical principle of the invention
All joints of the human hand in the RGB-D image are tracked to obtain their pose parameters, and all parameters are converted into the action commands of the manipulator, so as to synchronize the two. The technical solution is described in detail below: first the adaptive three-dimensional hand modeling and three-dimensional hand-joint tracking are introduced, and then the mapping algorithm between the human hand pose and the movement posture of the mechanical dexterous hand.
1) Adaptive hand modeling
The proportional coefficients of the hand bone lengths include the hand length L_hand, the hand width W_hand, the length ratios and angles between the metacarpals, and the finger lengths and the length ratios of the bones within each finger.
(a) Hand length and hand width: let m and n be the numbers of pixels occupied by the hand length and hand width in the collected depth image, and further compute the mean depth d_avg of the hand. The actual hand length L_hand and hand width W_hand are then obtained from the projection ratio:
L_hand = m · d_avg / f, W_hand = n · d_avg / f
where f is the focal length in pixels. After this rough hand length and width are obtained, the exact values are searched for within 0.9 to 1.1 times the rough values.
(b) Length ratios and angles of the metacarpals: the hand bones of the same name are numbered from little finger to thumb as 1 to 5, i.e. MC1 to MC5. Taking the middle-finger metacarpal as the reference position, the angles between the other 4 metacarpals and the middle-finger metacarpal are denoted θ_1 to θ_4.
The ratio range of the length of the middle-finger metacarpal MC_3 to the lengths of the other metacarpals is:
The angle ranges between the five metacarpals are:
(c) The finger lengths, in order from little finger to thumb, are defined as L_1 to L_5; each length is the sum of the lengths of the corresponding finger's bones, that is:
According to the above definitions, the positional relationship of each hand joint is described with the following three kinds of proportional relationships:
the ratio of the middle-finger length L_3 to the middle metacarpal MC_3:
the length ratios of the middle finger to the other fingers:
the ratio ranges of each finger's proximal phalanx PP to its middle phalanx MP and distal phalanx DP, respectively, as follows:
2) Three-dimensional hand tracking
First, the hand is segmented out of the acquired RGB-D image, and the palm center, wrist and fingertips are detected. Second, the segmented hand is modeled in two dimensions, in two steps: quadtree decomposition and image Gaussian-mixture modeling. Then the adaptive three-dimensional hand model is established from the RGB-D image. Finally, the projection model of the three-dimensional hand model is matched against the two-dimensional hand model: the model similarity term E_sim is computed and combined with the skin-color penalty term E_col_sim and the inter-frame continuity penalty term E_fr_sim, and the pose of the hand is obtained by solving the objective function. The final normalized objective function is:
ε(Θ) = E_sim − ω_fr · E_fr_sim − ω_col · E_col_sim (8)
where ω_fr is the weight of the inter-frame continuity penalty term E_fr_sim and ω_col is the weight of the skin-color penalty term E_col_sim.
3) Mapping algorithm
For the MCP, PIP and DIP joints of the hand joint model, the swing and sideways-extension angles are constrained separately. All three kinds of joints, MCP, PIP and DIP, can swing, but only the MCP joints can extend sideways.
(a) Denote the swing angle parameters of the MCP, PIP and DIP joints as θ_mcp_fe, θ_pip and θ_dip respectively; the static constraints are as follows:
(b) Denote the sideways-extension angle of an MCP joint as θ_mcp_aa; the static constraints are:
In formula (10), k = {1, 2, 3, 4, 5} from top to bottom corresponds to the little finger, ring finger, middle finger, index finger and thumb respectively, and the sign of θ_mcp_aa takes the middle finger as reference.
In the five-fingered bionic mechanical hand model used in the present invention, the thumb, index finger and middle finger each have two degrees of freedom, and the ring finger and little finger each have one degree of freedom; all fingers can close and open. The movements of all joints are controlled by 9 parameters: thumb flexion (Thumb_Flexion), thumb opposition (Thumb_Opposition), flexion of the index finger's proximal and distal joints (Index_Finger_Distal), flexion of the index finger's base joint (Index_Finger_Proximal), flexion of the middle finger's base joint (Middle_Finger_Proximal), flexion of the middle finger's proximal and distal joints (Middle_Finger_Distal), flexion of the ring finger's base joint (Ring_Finger), flexion of the little finger's base joint (Pinky), and the five-finger spread (Finger_Spread). The ranges of these parameters are:
Denote the parameters of the human hand movement as Θ = {(θ_mcp_fe)_k, (θ_mcp_aa)_k, (θ_pip)_k, (θ_dip)_k}, k = {1, 2, 3, 4, 5}, and the control parameters of the manipulator as Θ′; then the mapping function is defined as follows:
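The body of the mapping function (formula (12)) is not legible in this extraction. A minimal sketch, assuming a per-parameter linear normalization from the human joint range to the robot parameter range (the function name and the example ranges are illustrative, not taken from the patent):

```python
def map_angle(theta, human_range, robot_range):
    """Linearly normalize one human joint angle into the corresponding robot
    parameter range -- an assumed form of the mapping Theta -> Theta' (eq. 12)."""
    h_lo, h_hi = human_range
    r_lo, r_hi = robot_range
    t = (theta - h_lo) / (h_hi - h_lo)
    t = min(max(t, 0.0), 1.0)   # clamp to the static joint constraints
    return r_lo + t * (r_hi - r_lo)

# e.g. an MCP flexion of 45 deg in [0, 90] mapped to a hypothetical robot range [0, 0.8]
cmd = map_angle(45.0, (0.0, 90.0), (0.0, 0.8))
```

Each of the 9 robot parameters would be driven by one such normalization of the relevant entries of Θ.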
(3) Technical solution; the steps are as follows:
S1. Acquire an RGB-D image and perform preprocessing and feature extraction on the depth image, including hand segmentation, fingertip and wrist detection, and palm-center extraction. The specific steps are as follows:
S1.1. Let z denote the depth value of a point in the 16-bit depth image, and z_min the minimum pixel value greater than 0 in the 16-bit depth image. The pixel coordinates at z_min then give the position of the hand in the image. The image region whose depth lies in the range [z_min, z_min + 200] is taken as the hand region, and the binarized hand region is obtained by formula (13):
S1.2. Hand contour extraction is carried out on the binarized hand image to further obtain the positions of the palm center, fingertips and wrist. The palm center is the center of the maximum inscribed circle of the hand contour. The fingertip and wrist positions are then detected with the Graham scan of the two-dimensional convex-hull algorithm, and these positions serve as hand priors for computing the length of each hand joint.
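The nearest-object segmentation of S1.1 can be sketched as follows. This is a minimal illustration of the [z_min, z_min + 200] band threshold, assuming the hand is the closest object to the camera; the function name and toy data are not from the patent:

```python
import numpy as np

def segment_hand(depth, band=200):
    """Binarize a 16-bit depth image: keep pixels within `band` depth units
    of the nearest nonzero pixel (assumed to belong to the hand)."""
    valid = depth > 0
    z_min = depth[valid].min()
    mask = valid & (depth >= z_min) & (depth <= z_min + band)
    return mask.astype(np.uint8)

# toy depth map: a 2x2 "hand" at depth 500, background at 1500
depth = np.full((4, 4), 1500, dtype=np.uint16)
depth[1:3, 1:3] = 500
mask = segment_hand(depth)
```

The contour, maximum inscribed circle, and convex-hull steps would then run on `mask`.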
S2. Establish the two-dimensional hand model from the hand depth image.
S2.1. Quadtree decomposition based on depth similarity: first, judge whether the length and width of the binarized hand image are powers of 2, and interpolate if not. Second, decompose the binarized hand image recursively: divide it into four equal sub-blocks, judge for each sub-block whether the difference between its maximum and minimum depth is less than 12 mm, stop decomposing a sub-block if the condition is met, otherwise continue dividing that sub-block into four sub-blocks and judge whether each sub-block meets the threshold condition (14):
d_max − d_min ≤ 12mm (14)
where d_max and d_min denote the maximum and minimum depth values of the sub-block of the binarized hand image currently being decomposed.
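The recursive split of S2.1 can be sketched as follows; this assumes depth 0 marks background pixels, and the leaf representation (x, y, size) is an illustrative choice:

```python
import numpy as np

def quadtree(depth, x, y, size, thresh=12, blocks=None):
    """Recursively split a square block until (max - min) depth over its
    nonzero (hand) pixels is <= thresh mm; collect leaves as (x, y, size)."""
    if blocks is None:
        blocks = []
    vals = depth[y:y + size, x:x + size]
    vals = vals[vals > 0]
    if size == 1 or vals.size == 0 or vals.max() - vals.min() <= thresh:
        if vals.size:
            blocks.append((x, y, size))
        return blocks
    h = size // 2
    for dx, dy in ((0, 0), (h, 0), (0, h), (h, h)):
        quadtree(depth, x + dx, y + dy, h, thresh, blocks)
    return blocks

# a 4x4 map whose top-left quadrant is nearer than the rest splits once into 4 leaves
d = np.full((4, 4), 600, dtype=np.uint16)
d[0:2, 0:2] = 500
leaves = quadtree(d, 0, 0, 4)
```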
S2.2. Establish the two-dimensional hand model.
For the i-th image sub-block obtained by the decomposition of step S2.1, construct and fit a two-dimensional Gaussian function G_i: the center of sub-block i corresponds to the center point of the two-dimensional Gaussian function, and the standard deviation of the two-dimensional Gaussian function is determined by the sub-block. Let all Gaussian functions in the two-dimensional hand model have the same weight of 1; the two-dimensional hand model is then expressed by formula (15):
where C_I(p) denotes the two-dimensional hand model, n the number of Gaussian functions in the image model, p the two-dimensional coordinates of a pixel in the image, G_i the i-th two-dimensional Gaussian function of the model, μ_i the center position of the i-th Gaussian function, σ_i the standard deviation of the Gaussian function, and d_i the average depth value of the depth-image block corresponding to the Gaussian function.
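The block-to-Gaussian construction can be sketched as follows. The exact constant relating σ_i to the block size is not legible in this extraction, so σ_i = size/2 below is an assumption, as are the function names; the mixture is evaluated with unit weights as in formula (15):

```python
import numpy as np

def block_to_gaussian(depth, x, y, size):
    """Fit one 2D Gaussian to a quadtree leaf: the block centre is the mean,
    sigma is taken proportional to block size (assumed constant 1/2), and
    d_i is the mean depth of the block's nonzero pixels."""
    block = depth[y:y + size, x:x + size].astype(float)
    mu = np.array([x + size / 2.0, y + size / 2.0])
    sigma = size / 2.0                     # assumed proportionality
    d = block[block > 0].mean()
    return mu, sigma, d

def model_value(gaussians, p):
    """Evaluate C_I(p) as a sum of unnormalized Gaussians, all with weight 1."""
    total = 0.0
    for mu, sigma, _d in gaussians:
        r2 = float(np.sum((p - mu) ** 2))
        total += np.exp(-r2 / (2 * sigma ** 2))
    return total

gaussians = [block_to_gaussian(np.full((4, 4), 500.0), 0, 0, 4)]
val = model_value(gaussians, np.array([2.0, 2.0]))   # value at the block centre
```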
S2.3. The hand is modeled with a three-dimensional Gaussian mixture model, defined as follows:
where C_H(q) is the three-dimensional Gaussian mixture model of the hand, m denotes the number of Gaussian functions in the model, q the three-dimensional coordinates of a pixel in the depth image, G_j the j-th Gaussian function of the model, ω_j the weight of the j-th Gaussian function, and μ_H and σ_H the mean vector and covariance matrix of the Gaussian function.
S3. Solve the objective function in formula (8). The specific steps are as follows:
S3.1. Initialize the parameter vector of the human hand, including the hand length L_hand, the hand width W_hand, the length ratios of the metacarpals (MC_1 to MC_5), the angles θ_1 to θ_4 between the 4 metacarpals and the middle-finger metacarpal MC_3, the ratios of the middle-finger length L_3 to the other finger lengths, the ratio of the middle-finger length L_3 to the middle metacarpal MC_3, the ratios of each finger's proximal phalanx PP to its middle phalanx MP and distal phalanx DP respectively, and the hand pose parameters Θ.
S3.2. Compute the projection model.
Suppose a three-dimensional Gaussian function in the three-dimensional hand model is G_H(q; μ_H, σ_H), where μ_H and σ_H are its homogeneous mean vector and standard deviation, with μ_H = [μ_x, μ_y, μ_z, 1]; and suppose the two-dimensional Gaussian function of the orthographic projection of the three-dimensional hand model is G_P(q′; μ_P, σ_P), where μ_P and σ_P are its homogeneous mean vector and standard deviation. Given the intrinsic matrix K and the focal length f of the depth camera, the projection relation between the two is:
where I is the 3 × 3 identity matrix, O is the 3 × 1 zero vector, and μ_z is the depth value of the Gaussian function's center. According to formula (17), all Gaussian functions in the three-dimensional hand model are projected separately to obtain the corresponding two-dimensional projection model:
where C_P denotes the two-dimensional projection model, m the number of Gaussian functions, G_j the j-th Gaussian function in the projection model, q′ the two-dimensional coordinates to which the three-dimensional coordinates q of a pixel of the three-dimensional hand model project in the two-dimensional image, ω_j the weight of the projected Gaussian function, and μ_P and σ_P the mean vector and standard deviation of the projected Gaussian function. d_j denotes the average depth value of the projected Gaussian function, i.e. the depth of the front surface of the three-dimensional Gaussian function, obtained by subtracting its radius from the depth value of the Gaussian function's center.
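Since the body of formula (17) is not legible here, the following is a sketch under standard pinhole assumptions: the center projects through K[I|O] with perspective division, σ scales by f/μ_z, and d_j = μ_z minus the Gaussian's radius (taken as σ_H), per the text. The intrinsic values are illustrative:

```python
import numpy as np

def project_gaussian(mu_h, sigma_h, K):
    """Perspective projection of one isotropic 3D Gaussian (assumed form of
    eq. 17): pinhole projection of the centre, sigma scaled by f / mu_z,
    and front-surface depth d = mu_z - sigma_h."""
    x, y, z = mu_h[:3]
    f = K[0, 0]
    uvw = K @ np.array([x, y, z])
    mu_p = uvw[:2] / uvw[2]        # perspective division
    sigma_p = f * sigma_h / z
    d = z - sigma_h                # depth of the sphere's front surface
    return mu_p, sigma_p, d

K = np.array([[525.0, 0.0, 320.0],
              [0.0, 525.0, 240.0],
              [0.0,   0.0,   1.0]])
mu_p, sigma_p, d = project_gaussian(np.array([0.1, 0.0, 0.5, 1.0]), 0.02, K)
```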
S3.3. Compute the similarity between the two-dimensional hand model and the projection model of the three-dimensional hand model.
The two-dimensional hand model C_I and the projection model C_P are both two-dimensional Gaussian mixture models. All Gaussian functions of the two are matched against each other, and the similarity measure between them is defined as follows:
where C_I and C_P denote the two-dimensional hand model and the projection model respectively, i and j index the two-dimensional Gaussian functions of the corresponding models, and D_ij denotes the integral expression of two Gaussian functions of C_I and C_P:
where μ_i and σ_i are the mean and standard deviation of the i-th two-dimensional Gaussian function of the two-dimensional hand model, and μ_j and σ_j are the mean and standard deviation of the j-th Gaussian function of the hand projection model.
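The bodies of formulas (19) and (20) are not legible here. The product of two isotropic 2D Gaussians has a closed-form integral, which is one plausible realization of D_ij (the normalization convention below is an assumption, as is summing all pairs for E_sim):

```python
import numpy as np

def d_ij(mu_i, s_i, mu_j, s_j):
    """Closed-form integral of the product of two normalized isotropic 2D
    Gaussians -- an assumed realization of D_ij (eq. 20)."""
    s2 = s_i ** 2 + s_j ** 2
    r2 = float(np.sum((np.asarray(mu_i) - np.asarray(mu_j)) ** 2))
    return np.exp(-r2 / (2 * s2)) / (2 * np.pi * s2)

def similarity(model_i, model_p):
    """E_sim as the sum of pairwise overlaps between the two mixtures
    (assumed form of eq. 19). Each model is a list of (mu, sigma, depth)."""
    return sum(d_ij(mi, si, mj, sj)
               for mi, si, _ in model_i for mj, sj, _ in model_p)

m1 = [((0.0, 0.0), 1.0, 500.0)]
e_sim = similarity(m1, m1)
```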
S3.4. Compute the inter-frame continuity.
The smoothness of the hand pose parameters of the previous two frames and the pose parameters of the current frame is used to measure the plausibility of the current-frame pose parameters; the formula is as follows:
where Θ denotes the hand pose parameter vector, comprising in order 3 dimensions of global displacement, 3 dimensions of global rotation angles, and 14 dimensions of joint angles, 20 dimensions in total; θ_j denotes the j-th component of Θ, and t denotes the index of the current frame.
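The body of the continuity formula is not legible in this extraction. A minimal sketch, assuming a second-difference (acceleration) penalty over the 20-D pose vector so that constant-velocity motion is unpenalized; this is one common choice, not necessarily the patent's exact term:

```python
import numpy as np

def frame_penalty(theta_t, theta_t1, theta_t2):
    """Assumed inter-frame continuity penalty E_fr_sim: squared second
    difference of the pose vector over frames t, t-1, t-2."""
    theta_t, theta_t1, theta_t2 = map(np.asarray, (theta_t, theta_t1, theta_t2))
    return float(np.sum((theta_t - 2 * theta_t1 + theta_t2) ** 2))
```

Under this choice, a pose continuing at constant velocity incurs zero penalty, while abrupt jumps are penalized quadratically.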
S3.5. Compute the skin-color similarity.
The skin-color model establishes a penalty term that improves the robustness of the algorithm without increasing the complexity of the tracking system; the formula is as follows:
where C_P denotes the projection model, j indexes the Gaussian functions of C_P, and S_j is the skin-color indicator: its value is 0 if the color of the region of that Gaussian function is skin-colored, and 1 otherwise.
S3.6. Obtain the pose parameters Θ of the hand by solving formula (8).
S4. The parameters obtained by solving in step 3 are sent as input to the dexterous-hand control system; the parameters of each joint of the five-fingered bionic mechanical hand are computed according to formula (12), and corresponding action commands are generated by the control system, so that the five-fingered bionic mechanical hand completes the same actions as the operator.
Preferred embodiment: a Kinect is used as the RGB-D image acquisition device, and the acquired images are transmitted to the computer through a USB interface.
Preferred embodiment: a Schunk SVH five-fingered bionic hand is used as the five-fingered bionic mechanical hand model.
Beneficial effects of the present invention: the present invention achieves adaptive three-dimensional hand modeling from a single depth image of the human hand, tracks all joint points of the human hand in the RGB-D image sequence through the three-dimensional hand model, and realizes non-contact control of the five-fingered bionic mechanical hand according to the mapping relations between the joints of the human hand and those of the five-fingered bionic mechanical hand. The method breaks through the limitation of fixed-routine control methods and makes it easy for operators to control the five-fingered bionic mechanical hand intelligently.
Detailed description of the invention
Fig. 1 is the hand joint model of the method of the present invention. The rectangle in the figure is the position of the wrist; black circles are MCP joint points, gray circles are PIP joint points, white circles are DIP joint points, and the triangle is the IP joint point. Between the wrist and the MCP joint points are the metacarpals; between the MCP and PIP joint points are the proximal phalanges; between the PIP and DIP joint points are the middle phalanges; and between the DIP joint points and the fingertips are the distal phalanges. The serial numbers 1 to 5 correspond to the little finger, ring finger, middle finger, index finger and thumb respectively.
Fig. 2 is the two-dimensional hand model of the method of the present invention. Fig. 2(a) is the result of quadtree decomposition of the binarized hand image; Fig. 2(b) is the result of two-dimensional hand modeling on the basis of the quadtree decomposition.
Fig. 3 is the three-dimensional hand model of the method of the present invention, showing the hand in a naturally open configuration. Each finger joint point in the figure corresponds to 1 three-dimensional Gaussian function, and each segment between a fingertip and a joint point, or between two joint points, corresponds to 1 three-dimensional Gaussian function. The thumb is represented with 3 three-dimensional Gaussian functions, and the palm is filled uniformly with 4 three-dimensional Gaussian functions.
Fig. 4 illustrates the detection of the palm center in the method of the present invention. The circle in the figure is the maximum inscribed circle obtained by detection, and its center is the palm center of the hand.
Fig. 5 illustrates the detection of the fingertips and wrist in the method of the present invention. The circled points in the figure are the convex-hull points of the hand contour detected by the convex-hull algorithm, including the fingertips and the wrist.
Fig. 6 is the flow chart of the method for the present invention.
Fig. 7 is the three-dimensional hand joint track algorithm flow chart of the method for the present invention.
Specific embodiment
The specific implementation of the present invention is described in detail below in conjunction with the technical solution and the flow charts in the drawings (Fig. 6 and Fig. 7).
Embodiment:
A Kinect 2.0 is used as the acquisition device to obtain RGB-D images, and the acquired images are sent to the computer through a USB interface. The manipulator used is the Schunk SVH five-fingered bionic mechanical hand.
Step 1, obtain the RGB-D image, where the color image is C and the depth image is D.
Step 2, initialize the parameters: frame number frame = 1, hand scale parameters (hand length L_hand, width W_hand, θ_1 to θ_4, L_1 to L_5), and hand posture parameters (θ_mcp_fe, θ_pip, θ_mcp_aa)_k, where k = {1, 2, 3, 4, 5} corresponds to the five fingers from little finger to thumb.
Step 3,4 are thened follow the steps if frame=1, it is no to then follow the steps 5.
Step 4: detect the hand region I and establish the two-dimensional hand model CI and the three-dimensional model CH, as follows:
Step 4.1: obtain the binarized hand region image I according to formula (13);
Step 4.2: extract the hand contour from image I with the Sobel operator, compute the position of the centre of the maximum inscribed circle to obtain the palm centre O (see Fig. 4), and detect the positions of the fingertips and wrist with the Graham scan of the convex hull algorithm (see Fig. 5);
Step 4.3: if the length and width of I are powers of 2, go to step 4.4; otherwise interpolate first;
Step 4.4: divide I into four sub-blocks at half its length and width, then check for each sub-block whether the difference between its maximum and minimum depth is less than 12 mm; if so, stop decomposing that sub-block, otherwise continue dividing it into four parts (see Fig. 2(a));
Step 4.5: construct one Gaussian function for each square region of the hand region (see Fig. 2(b)), where the average depth value of an image block corresponds to the mean of its Gaussian function; compute the two-dimensional model of the entire hand according to formula (18).
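The recursive split of steps 4.3 and 4.4 can be sketched as follows. This is a minimal illustration assuming the depth image is a square list-of-lists with 0 for background pixels; the function and variable names are hypothetical, not taken from the patent:

```python
def quadtree_split(depth, x, y, size, threshold=12.0, leaves=None):
    """Recursively split a square depth patch until the depth range
    (max - min over valid pixels) inside each sub-block is at most
    `threshold` (12 mm in the patent); returns leaf blocks (x, y, size)."""
    if leaves is None:
        leaves = []
    block = [depth[r][x:x + size] for r in range(y, y + size)]
    vals = [v for row in block for v in row if v > 0]
    if size == 1 or not vals or max(vals) - min(vals) <= threshold:
        leaves.append((x, y, size))       # homogeneous block: stop here
        return leaves
    half = size // 2                      # otherwise divide into four parts
    for dx in (0, half):
        for dy in (0, half):
            quadtree_split(depth, x + dx, y + dy, half, threshold, leaves)
    return leaves
```

A uniform patch stays a single leaf, while a quadrant with a large depth jump forces one split into four sub-blocks.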
Step 4.6: iteratively update the hand scale parameters and hand posture parameters with the PSO algorithm.
Step 5: set the inter-frame tracking search range according to the pose parameters of the previous frame, and solve for the parameters with the improved PSO algorithm, as follows:
Step 5.1: set the number of particles M and the number of iterations N, initialize the velocity and position parameters of the particles in the swarm with a class-stochastic sampling method, initialize the individual history optimal solutions pi, and compute the global optimal solution gbest;
Step 5.2: determine the stage the particle swarm is currently in and update the parameters;
Step 5.3: update the velocity and position of each particle in the swarm according to the parameters;
Step 5.4: check whether the velocity and position of each particle lie in the admissible range; if so, update the particle's individual history optimal solution and the global optimal solution; otherwise correct the range of the particle's velocity and position and then update the individual history optimal solution pi and the global optimal solution gbest; increment the iteration counter by 1;
Step 5.5: if the number of iterations exceeds N or the fitness exceeds the threshold, return the parameter value Θ; otherwise go to step 5.2.
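Steps 5.1 to 5.5 follow the standard particle-swarm template. The sketch below shows that template only: the staged "improved" update of step 5.2 and the class-stochastic initialization are not reproduced, and all hyperparameter values are placeholders, not values from the patent:

```python
import random

def pso(fitness, dim, lo, hi, n_particles=20, n_iter=60,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Maximize `fitness` over [lo, hi]^dim with plain PSO
    (steps 5.1 and 5.3-5.5 of the embodiment)."""
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    P = [x[:] for x in X]                       # individual history bests p_i
    Pf = [fitness(x) for x in X]
    gi = max(range(n_particles), key=lambda i: Pf[i])
    g, gf = P[gi][:], Pf[gi]                    # global best g_best
    for _ in range(n_iter):
        for i in range(n_particles):
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * rng.random() * (P[i][d] - X[i][d])
                           + c2 * rng.random() * (g[d] - X[i][d]))
                # step 5.4: correct positions that leave the legal range
                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))
            f = fitness(X[i])
            if f > Pf[i]:                       # update p_i, then g_best
                P[i], Pf[i] = X[i][:], f
                if f > gf:
                    g, gf = X[i][:], f
    return g, gf
```

In the embodiment, `fitness` would be the normalized objective of formula (8) evaluated at a candidate pose Θ.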
Step 6: update the pose parameters of the previous frame to the parameter Θ returned by step 5.5, and set frame = frame + 1.
Step 7: compute the pose parameters Θ' of the five-finger bionic mechanical hand, input them to the manipulator control system, and make the manipulator complete the commanded action.
Step 8: repeat the above procedure until the system terminates.
Claims (3)
1. A non-contact control method for a bionic mechanical hand based on human hand motion posture learning, characterized in that the steps are as follows:
(1) Basic concepts
RGB-D image: acquired by an RGB-D camera and composed of two parts, an RGB image and a depth image. Each pixel value of the RGB image encodes the color information of the image; each pixel value of the depth image encodes the actual distance from the sensor to the object. The RGB image and the depth image are registered, i.e. there is a one-to-one correspondence between the pixels of the RGB image and those of the depth image;
Hand joint model: a hand joint model is defined according to the skeletal structure of the human hand in anatomy and its kinematic constraints; this model is used to establish the three-dimensional hand model.
The hand joint model comprises 5 metacarpal bones, 5 proximal phalanges, 4 middle phalanges and 5 distal phalanges. The wrist joint point is the origin of the world coordinate system and has 6 degrees of freedom, 3 for global rotation and 3 for global translation. The hand joint model contains four kinds of joint points, namely MCP, PIP, DIP and IP joints: the connection point between a metacarpal bone and a proximal phalanx is an MCP joint, the connection point between a proximal phalanx and a middle phalanx is a PIP joint, the connection point between a middle phalanx and a distal phalanx is a DIP joint, and the connection point between the distal and proximal phalanges of the thumb is the IP joint. Each MCP joint has 2 degrees of freedom, flexion/extension and left-right abduction/adduction; the PIP, DIP and IP joints have only the 1 flexion/extension degree of freedom;
Two-dimensional hand model: the collected hand depth image is subjected to quadtree decomposition and divided into multiple image blocks according to the similarity of their depth values; each image block is modeled with a two-dimensional Gaussian, yielding a two-dimensional mixture-of-Gaussians hand model denoted by the symbol CI;
Three-dimensional hand model: the hand is modeled in three dimensions with an isotropic mixture of Gaussians, denoted by the symbol CH;
Projection model: the three-dimensional hand model under the current pose is projected onto the image to obtain the corresponding projection model, denoted by the symbol CP;
(2) All joints of the human hand are tracked in the RGB-D image to obtain its pose parameters, and all parameters are converted into action commands for the manipulator, realizing synchronization between the two;
1) Adaptive hand modeling
The scale coefficients of the hand bones comprise the hand length Lhand, the hand width Whand, the length ratios and angles between the metacarpal bones, and the ratios between the finger lengths and the lengths of the individual finger bones;
(a) Hand length and hand width: let m and n be the numbers of pixels spanned by the hand length and hand width in the collected depth image, and compute the mean depth davg of the hand; the actual hand length Lhand and hand width Whand are then obtained from the projection ratio:
Wherein f denotes the focal length in pixels. After the rough hand length and width are obtained, the exact values are searched within 0.9 to 1.1 times these coarse values;
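The formula itself is given as an image above; under a pinhole camera model it reduces to back-projecting pixel counts by the mean depth. The sketch below shows that reading, as an assumption (the exact expression in the patent may differ):

```python
def hand_size(m_pixels, n_pixels, d_avg, f):
    """Pinhole sketch of the hand-size estimate: a span of m pixels
    observed at mean depth d_avg subtends roughly m * d_avg / f world
    units when the focal length f is expressed in pixels."""
    l_hand = m_pixels * d_avg / f   # coarse hand length L_hand
    w_hand = n_pixels * d_avg / f   # coarse hand width  W_hand
    return l_hand, w_hand
```

Per the text, the exact values are then searched within 0.9 to 1.1 times these coarse estimates.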
(b) Length ratios and angles of the metacarpal bones: the metacarpal bones are numbered 1 to 5 in order from the little finger to the thumb, i.e. MC1~MC5. Taking the middle-finger metacarpal as the reference, the angles between the other 4 metacarpal bones and the middle-finger metacarpal are denoted θ1~θ4.
The ranges of the length ratios of the middle-finger metacarpal MC3 to the other metacarpal bones are:
The ranges of the angles between the five metacarpal bones are:
(c) The finger lengths are denoted L1~L5 in order from the little finger to the thumb; each is the sum of the lengths of the bones of the corresponding finger, i.e.:
According to the above definitions, the positional relationships of the hand joints are described by the following three kinds of proportional relationships:
the ratio of the middle finger length L3 to the middle-finger metacarpal MC3:
the ratios of the middle finger length to the lengths of the other fingers:
the ratio ranges of the proximal phalanx PP of each finger to its middle phalanx MP and distal phalanx DP, as follows:
2) Three-dimensional hand tracking
First, the hand is segmented out of the acquired RGB-D image, and the palm centre, wrist and fingertips are detected. Second, the segmented hand is modeled in two dimensions, in two steps: quadtree decomposition and mixture-of-Gaussians image modeling. Then, the adaptive three-dimensional hand model is established from the RGB-D image. Finally, the projection model of the three-dimensional hand model is matched against the two-dimensional hand model to compute the model similarity term Esim, which is combined with the skin-color penalty term Ecol_sim and the inter-frame continuity penalty term Efr_sim; the pose of the hand is obtained by solving the objective function. The final normalized objective function is:
ε(Θ) = Esim - ωfr·Efr_sim - ωcol·Ecol_sim (8)
Wherein ωfr is the weight of the inter-frame continuity penalty term Efr_sim, and ωcol is the weight of the skin-color penalty term Ecol_sim;
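Formula (8) combines the three terms linearly. A minimal sketch, where the weight values shown are placeholders (the patent does not fix them numerically):

```python
def objective(e_sim, e_fr_sim, e_col_sim, w_fr=0.5, w_col=0.5):
    """Normalized objective of formula (8): reward model similarity,
    penalize inter-frame discontinuity and non-skin-colored projections."""
    return e_sim - w_fr * e_fr_sim - w_col * e_col_sim
```

A higher value means a better candidate pose Θ, so the solver maximizes this quantity.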
3) mapping algorithm
For the joint MCP, PIP and DIP in hand joint model, the angle of its swing, left and right stretching, extension is carried out respectively
Constraint;Wherein, the tri- kinds of joints MCP, PIP and DIP all have the ability of swing, and the only joint MCP is stretched with left and right is carried out
The ability of exhibition;
(a) remember that the angle parameter of tri- kinds of joint swings of MCP, PIP and DIP is respectively θmcp_fe、θpipAnd θdip, static constraint
It is as follows:
(b) the left and right stretching angle for remembering the joint MCP is θmcp_aa, static constraint are as follows:
{ 1,2,3,4,5 } k=from top to bottom in formula (10), respectively corresponds little finger, the third finger, middle finger, index finger and big thumb
Refer to, θmcp_aaWhat is be worth is positive and negative using middle finger as reference;
In the five-finger bionic mechanical hand model, the thumb, index finger and middle finger each have two degrees of freedom, the ring finger and little finger each have one degree of freedom, and all fingers can close together and spread apart. The motion of all the joints is governed by 9 parameters: thumb flexion, thumb side sway, combined flexion of the index-finger proximal and distal joints, index-finger base-joint flexion, middle-finger base-joint flexion, combined flexion of the middle-finger proximal and distal joints, ring-finger base-joint flexion, little-finger base-joint flexion, and finger spread. The parameter ranges are:
Let the pose parameters of the human hand motion be Θ = {(θmcp_fe)k, (θmcp_aa)k, (θpip)k, (θdip)k}, k = {1, 2, 3, 4, 5}, and let the control parameters of the manipulator be Θ'; the mapping function is then defined as follows:
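The mapping function of formula (12) appears as an image above and is not reproduced in the text. One simple realization, shown here purely as an illustration with hypothetical names, is a per-joint linear rescaling from the human static-constraint range onto the manipulator's parameter range:

```python
def map_joint(theta, lo_h, hi_h, lo_r, hi_r):
    """Hypothetical linear mapping of one human joint angle `theta`,
    with legal range [lo_h, hi_h], onto a manipulator parameter range
    [lo_r, hi_r] (the patent's actual formula (12) may differ)."""
    t = (theta - lo_h) / (hi_h - lo_h)  # normalize to [0, 1]
    t = min(1.0, max(0.0, t))           # clamp to the legal range
    return lo_r + t * (hi_r - lo_r)
```

Applying such a mapping to each of the tracked angles of Θ yields the 9 manipulator parameters of Θ'.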
(3) The non-contact control method of the bionic mechanical hand based on human hand motion posture learning proceeds as follows:
S1. Acquire an RGB-D image and perform preprocessing and feature extraction on the depth image, including hand segmentation, fingertip and wrist detection, and palm-centre extraction, as follows:
S1.1. Let z denote the depth value of a point in the 16-bit depth image, and zmin the minimum pixel value greater than 0 in the 16-bit depth image; the pixel coordinates at zmin then give the position of the hand in the image. The image region whose depth lies in the range [zmin, zmin + 200] is taken as the hand region, and the binarized hand region is obtained by formula (13):
S1.2. Extract the hand contour from the binarized hand image, and from it obtain the positions of the palm centre, fingertips and wrist. The position of the palm centre is the centre of the maximum inscribed circle of the hand contour; the fingertip and wrist positions are then detected with the Graham scan of the two-dimensional convex hull algorithm. The fingertip and wrist positions obtained from the two-dimensional convex hull serve as hand prior information for computing the lengths of the hand joints;
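The maximum-inscribed-circle search of S1.2 is, in effect, a distance transform: the palm centre is the foreground pixel farthest from any background pixel. A brute-force sketch on a small binary mask (in practice an efficient routine such as OpenCV's cv2.distanceTransform would be used; the names here are hypothetical):

```python
def palm_centre(mask):
    """Return ((x, y), radius) of the maximum inscribed circle of a
    binary hand mask, by exhaustively maximizing the distance from each
    foreground pixel to the nearest background pixel."""
    h, w = len(mask), len(mask[0])
    fg = [(y, x) for y in range(h) for x in range(w) if mask[y][x]]
    bg = [(y, x) for y in range(h) for x in range(w) if not mask[y][x]]
    best, best_d = None, -1.0
    for (y, x) in fg:
        d = min(((y - by) ** 2 + (x - bx) ** 2) ** 0.5 for by, bx in bg)
        if d > best_d:                 # farther from the contour: better
            best, best_d = (x, y), d
    return best, best_d
```

The returned centre is the palm centre O and the distance is the inscribed-circle radius.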
S2. Establish the two-dimensional hand model from the hand depth image
S2.1. Quadtree decomposition based on depth similarity: first, check whether the length and width of the binarized hand image are powers of 2, and interpolate if not. Second, decompose the binarized hand image recursively: divide it into four equal sub-blocks, and for each sub-block check whether the difference between its maximum and minimum depth is less than 12 mm; if so, stop decomposing that sub-block, otherwise continue dividing it into four sub-blocks and test each against threshold condition (14):
dmax - dmin ≤ 12mm (14)
Wherein dmax and dmin denote the maximum and minimum depth values of the sub-block of the binarized hand image currently being decomposed;
S2.2. Establish the two-dimensional hand model
For the i-th image sub-block produced by step S2.1, construct a two-dimensional Gaussian function Gi to fit it: the centre of sub-block i corresponds to the centre point of the two-dimensional Gaussian, whose standard deviation is set from the sub-block. Letting all Gaussian functions in the two-dimensional hand model share the same weight 1, the two-dimensional hand model is expressed by formula (15):
Wherein CI(p) denotes the two-dimensional hand model, n the number of Gaussian functions in the image model, p the two-dimensional coordinates of a pixel in the image, Gi the i-th two-dimensional Gaussian function in the model, μi the position of the centre point of the i-th Gaussian function, σi the standard deviation of the Gaussian function, and di the average depth value of the depth-image block corresponding to the Gaussian function;
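Formula (15) is an unweighted sum of one isotropic Gaussian per quadtree block. A minimal sketch, where each σi is simply taken as a per-block input (the patent's exact choice of σi is given by the image formula above):

```python
import math

def gaussian_2d(p, mu, sigma):
    """Unnormalized isotropic 2-D Gaussian G_i evaluated at pixel p."""
    d2 = (p[0] - mu[0]) ** 2 + (p[1] - mu[1]) ** 2
    return math.exp(-d2 / (2.0 * sigma ** 2))

def hand_model_2d(p, blocks):
    """C_I(p) of formula (15): the sum, with equal weight 1, of the
    Gaussians fitted to the quadtree blocks.  `blocks` is a list of
    (centre, sigma) pairs, one per decomposed image block."""
    return sum(gaussian_2d(p, mu, sigma) for mu, sigma in blocks)
```

At a block centre the corresponding Gaussian contributes 1, and the contribution decays with distance from the centre.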
S2.3. Model the hand with a three-dimensional mixture of Gaussians, defined as follows:
Wherein CH(q) is the three-dimensional mixture-of-Gaussians model of the hand, m the number of Gaussian functions in the model, q the three-dimensional coordinates of a pixel in the depth image, Gj the j-th Gaussian function in the model, ωj the weight of the j-th Gaussian function, and μH and σH the mean vector and covariance matrix of the Gaussian function;
S3. Solve the objective function of formula (8)
S3.1. Initialize the parameter vector of the human hand, including the hand length Lhand, the hand width Whand, the length ratios of the metacarpal bones MC1~MC5, the angles θ1~θ4 between the 4 metacarpal bones and the middle-finger metacarpal MC3, the ratio of the middle finger length L3 to the other finger lengths, the ratio of the middle finger length L3 to the length of the middle-finger metacarpal MC3, the ratios of each finger's proximal phalanx PP to its middle phalanx MP and distal phalanx DP, and the pose parameters Θ of the human hand motion;
S3.2. Compute the projection model
Assume a three-dimensional Gaussian of the three-dimensional hand model is GH(q; μH, σH), with homogeneous mean vector μH = [μx, μy, μz, 1] and standard deviation σH, and that the two-dimensional Gaussian obtained by orthographic projection of the three-dimensional hand model is GP(q'; μP, σP), with homogeneous mean vector μP and standard deviation σP. With the intrinsic matrix K and focal length f of the depth camera known, the projection relation between the two is:
Wherein I is the 3 × 3 identity matrix, O the 3 × 1 zero vector, and μz the depth value of the centre of the Gaussian function. Projecting every Gaussian function of the three-dimensional hand model according to formula (17) yields the corresponding two-dimensional projection model:
Wherein CP denotes the two-dimensional projection model, m the number of Gaussian functions, Gj the j-th Gaussian function in the projection model, q' the two-dimensional coordinates to which the three-dimensional coordinates q of a point of the three-dimensional hand model project in the image, ωj the weight of the projected Gaussian function, μP and σP the mean vector and standard deviation of the projected Gaussian function, and dj the average depth value of the projected Gaussian function, i.e. the depth of the front surface of the three-dimensional Gaussian, obtained by subtracting its radius from the depth value of the Gaussian centre;
S3.3. Compute the similarity between the two-dimensional hand model and the projection model of the three-dimensional hand model
The two-dimensional hand model CI and the projection model CP are both two-dimensional mixtures of Gaussians; all Gaussian functions of the two are matched pairwise, and the similarity metric function between the two is defined as:
Wherein CI and CP denote the two-dimensional hand model and the projection model respectively, i and j index the two-dimensional Gaussian functions of the corresponding models, and Dij denotes the integral expression of two Gaussian functions from CI and CP:
Wherein μi and σi are the mean and standard deviation of the i-th two-dimensional Gaussian function of the two-dimensional hand model, and μj and σj the mean and standard deviation of the j-th Gaussian function of the hand projection model;
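For unnormalized isotropic 2-D Gaussians, the product integral Dij has the closed form used below, and the similarity term is the double sum over all pairs. A sketch with hypothetical names, where each mixture is given as a list of (mean, sigma) pairs:

```python
import math

def gauss_overlap(mu_i, sig_i, mu_j, sig_j):
    """Closed-form integral D_ij of the product of two unnormalized
    isotropic 2-D Gaussians exp(-|p - mu|^2 / (2 sigma^2))."""
    s2 = sig_i ** 2 + sig_j ** 2
    d2 = (mu_i[0] - mu_j[0]) ** 2 + (mu_i[1] - mu_j[1]) ** 2
    return (2.0 * math.pi * sig_i ** 2 * sig_j ** 2 / s2) \
        * math.exp(-d2 / (2.0 * s2))

def similarity(model_i, model_p):
    """E_sim sketch: sum of the pairwise overlaps between the 2-D hand
    model C_I and the projection model C_P (normalization omitted)."""
    return sum(gauss_overlap(mi, si, mj, sj)
               for mi, si in model_i for mj, sj in model_p)
```

Two coincident unit Gaussians give the maximal overlap 2π·σi²σj²/(σi²+σj²) = π, and the overlap decays with the distance between the means.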
S3.4. Compute the inter-frame continuity
The plausibility of the current frame's pose parameters is measured by the smoothness between the hand pose parameters of the previous two frames and those of the current frame, with the formula:
Wherein Θ denotes the pose parameters of the human hand motion, comprising 20 dimensions in total: 3 dimensions of global displacement, 3 dimensions of global rotation angles, and 14 dimensions of joint angles; θj denotes the j-th entry of Θ, and t the index of the current frame;
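The smoothness formula itself appears as an image above. A common realization of such a term, shown here as an assumption rather than the patent's exact expression, penalizes deviation of the current 20-dimensional pose from a constant-velocity prediction built from the previous two frames:

```python
def frame_penalty(theta_t, theta_t1, theta_t2):
    """E_fr_sim sketch: squared deviation of the current pose vector
    theta_t from the constant-velocity prediction
    2 * theta(t-1) - theta(t-2)."""
    return sum((a - (2.0 * b - c)) ** 2
               for a, b, c in zip(theta_t, theta_t1, theta_t2))
```

A pose continuing the motion of the previous two frames incurs zero penalty; abrupt jumps are penalized quadratically.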
S3.5. Compute the skin-color similarity
A skin-color model is used to build a penalty term that improves the robustness of the algorithm without increasing the complexity of the tracking system, with the formula:
Wherein CP denotes the projection model, j indexes the Gaussian functions of CP, and Sj is the skin-color decision: if the color of the region of that Gaussian function is skin-colored, its value is 0; otherwise it is 1;
S3.6. Obtain the pose parameters Θ of the human hand motion by solving formula (8)
S4. The parameters solved in step S3 are sent as input to the dexterous-hand control system; the parameters of each joint of the five-finger bionic mechanical hand are computed according to formula (12), and the corresponding action commands are generated by the control system, so that the five-finger bionic mechanical hand performs the same action as the operator.
2. The non-contact control method according to claim 1, characterized in that a Kinect is used as the RGB-D image acquisition device, and the acquired images are transmitted to the computer over a USB interface.
3. The non-contact control method according to claim 1 or 2, characterized in that the Schunk SVH five-finger bionic hand is used as the five-finger bionic mechanical hand model.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610840052.9A CN106346485B (en) | 2016-09-21 | 2016-09-21 | The Non-contact control method of bionic mechanical hand based on the study of human hand movement posture |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106346485A CN106346485A (en) | 2017-01-25 |
CN106346485B true CN106346485B (en) | 2018-12-18 |
Family
ID=57859069
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610840052.9A Active CN106346485B (en) | 2016-09-21 | 2016-09-21 | The Non-contact control method of bionic mechanical hand based on the study of human hand movement posture |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106346485B (en) |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106960036A (en) * | 2017-03-09 | 2017-07-18 | 杭州电子科技大学 | A kind of database building method for gesture identification |
CN107009376A (en) * | 2017-04-26 | 2017-08-04 | 柳州西格尔汽车内饰件有限公司 | The drive mechanism of mechanical finger |
CN107160364B (en) * | 2017-06-07 | 2021-02-19 | 华南理工大学 | Industrial robot teaching system and method based on machine vision |
CN107729632B (en) * | 2017-09-28 | 2021-05-28 | 广州明珞汽车装备有限公司 | Method and system for automatically setting actions and colors of simulation mechanism of tooling equipment |
CN108133202A (en) * | 2018-01-17 | 2018-06-08 | 深圳市唯特视科技有限公司 | It is a kind of that hand gestures method of estimation is blocked based on layering mixture density network certainly |
CN108919943B (en) * | 2018-05-22 | 2021-08-03 | 南京邮电大学 | Real-time hand tracking method based on depth sensor |
KR102324001B1 (en) | 2018-08-20 | 2021-11-09 | 베이징 센스타임 테크놀로지 디벨롭먼트 컴퍼니 리미티드 | Position and posture detection method and device, electronic device and storage medium |
WO2020073245A1 (en) * | 2018-10-10 | 2020-04-16 | 深圳市道通智能航空技术有限公司 | Gesture recognition method, vr angle of view control method and vr system |
CN109961424B (en) * | 2019-02-27 | 2021-04-13 | 北京大学 | Hand X-ray image data generation method |
CN110271020B (en) * | 2019-05-29 | 2021-04-27 | 浙江大学 | Bionic mechanical kinematics optimization method |
WO2021000327A1 (en) * | 2019-07-04 | 2021-01-07 | 深圳市瑞立视多媒体科技有限公司 | Hand model generation method, apparatus, terminal device, and hand motion capture method |
CN110908512A (en) * | 2019-11-14 | 2020-03-24 | 光沦科技(杭州)有限公司 | Man-machine interaction method based on dynamic gesture coordinate mapping |
CN111152218B (en) * | 2019-12-31 | 2021-10-08 | 浙江大学 | Action mapping method and system of heterogeneous humanoid mechanical arm |
CN113496168B (en) * | 2020-04-02 | 2023-07-25 | 百度在线网络技术(北京)有限公司 | Sign language data acquisition method, device and storage medium |
CN112230769B (en) * | 2020-10-10 | 2023-03-31 | 哈尔滨工业大学(威海) | Joint motion angle measuring method of data glove based on flexible capacitive sensor |
CN112927290A (en) * | 2021-02-18 | 2021-06-08 | 青岛小鸟看看科技有限公司 | Bare hand data labeling method and system based on sensor |
CN116627262B (en) * | 2023-07-26 | 2023-10-13 | 河北大学 | VR interactive device control method and system based on data processing |
CN118238152B (en) * | 2024-05-28 | 2024-08-20 | 华东交通大学 | Design method and system of passive underactuated mechanical finger based on deep learning |
CN118305818B (en) * | 2024-06-07 | 2024-08-13 | 烟台大学 | Bionic manipulator control method and system based on double-hand interaction attitude estimation |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102789568A (en) * | 2012-07-13 | 2012-11-21 | 浙江捷尚视觉科技有限公司 | Gesture identification method based on depth information |
CN103152626A (en) * | 2013-03-08 | 2013-06-12 | 苏州百纳思光学科技有限公司 | Far infrared three-dimensional hand signal detecting device of intelligent television set |
CN104589356A (en) * | 2014-11-27 | 2015-05-06 | 北京工业大学 | Dexterous hand teleoperation control method based on Kinect human hand motion capturing |
CN104899600A (en) * | 2015-05-28 | 2015-09-09 | 北京工业大学 | Depth map based hand feature point detection method |
WO2015162158A1 (en) * | 2014-04-22 | 2015-10-29 | Université Libre de Bruxelles | Human motion tracking |
CN105929962A (en) * | 2016-05-06 | 2016-09-07 | 四川大学 | 360-DEG holographic real-time interactive method |
CN106354161A (en) * | 2016-09-26 | 2017-01-25 | 湖南晖龙股份有限公司 | Robot motion path planning method |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150327794A1 (en) * | 2014-05-14 | 2015-11-19 | Umm Al-Qura University | System and method for detecting and visualizing live kinetic and kinematic data for the musculoskeletal system |
- 2016-09-21: CN application CN201610840052.9A filed; patent CN106346485B (status: Active)
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||