CN108549489A - A gesture control method and system based on hand form, posture, position and motion features - Google Patents


Info

Publication number
CN108549489A
Authority
CN
China
Prior art keywords
hand
feature
gesture
characteristic
result
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810392063.4A
Other languages
Chinese (zh)
Other versions
CN108549489B (en)
Inventor
刘春燕
孙晅
李美娟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Top Technology Co Ltd
Original Assignee
Harbin Top Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Top Technology Co Ltd filed Critical Harbin Top Technology Co Ltd
Priority to CN201810392063.4A priority Critical patent/CN108549489B/en
Publication of CN108549489A publication Critical patent/CN108549489A/en
Application granted granted Critical
Publication of CN108549489B publication Critical patent/CN108549489B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/165Management of the audio stream, e.g. setting of volume, audio stream path
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V10/443Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/28Recognition of hand or arm movements, e.g. recognition of deaf sign language

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention proposes a gesture control method and system based on hand form, posture, position and motion features, which can be used to select and control multiple groups of devices and multiple function modules within a device. The method captures gesture motion with a camera, extracts the form, posture, position and motion features, analyses the features to recognise the user's gesture, and thereby selects and controls the device or function module.

Description

A gesture control method and system based on hand form, posture, position and motion features
Technical field
The present invention relates to a gesture control method and system based on hand form, posture, position and motion features, and belongs to the technical field of gesture recognition.
Background technology
Gesture control is non-contact and simple to operate, making it one of the important research directions in human-computer interaction. Among gesture control methods in common use, electromagnetic induction methods have difficulty capturing fine finger movements, while methods based on wearable devices require the user to wear additional hardware, adding to the user's burden. Image-based gesture control methods, for their part, typically recognise only the number of fingers or the two-dimensional trajectory of the hand, without fully exploiting the information contained in the image (such as pointing direction or three-dimensional motion trajectory); or they use only sliding, clicking and similar operations that resemble a conventional mouse or touch screen. Current gesture control methods therefore either rely on single simple actions, making it difficult to control a complex system comprising multiple groups of devices or multi-level functions, or they imitate mouse and touch-screen operation without fundamentally changing the mode of human-computer interaction. In addition, some methods depend on a conventional interactive interface and require the user to watch a display in real time, so blind (eyes-free) operation is impossible, which is a clear limitation in practice. During driving, for example, checking an additional display may distract the driver and create a safety hazard.
Summary of the invention
To solve the problem that gesture control methods in the prior art cannot be used to select and control multiple groups of devices and multiple function modules within a device, the present invention proposes a gesture control method based on hand form, posture, position and motion features. The technical solution adopted is as follows:
The gesture control method comprises:
Step 1: read the image data of the input image;
Step 2: extract the hand features from the image data of step 1 and obtain hand feature results; the hand features include hand form features, hand posture features, hand position features and hand motion features, and the hand feature results include hand form feature results, hand posture feature results, hand position feature results and hand motion feature results;
Step 3: fuse and analyse the hand feature data of step 2 and perform gesture recognition to obtain a gesture recognition result;
Step 4: store the gesture recognition result of step 3;
Step 5: judge whether the gesture recognition result of step 4 is a complete gesture; if it is, send the gesture command corresponding to the gesture recognition result; if it is not, repeat steps 1 to 4 until a complete gesture is recognised.
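The five steps form an acquire-extract-recognise loop. As a hedged illustration only (not the patent's implementation; every name below is invented), the iteration structure might be sketched as:

```python
from collections import deque

def control_loop(frames, extract, recognise, is_complete, send):
    """Skeleton of steps 1-5: per frame, extract hand features (step 2),
    fuse/analyse them into a recognition result (step 3), store it (step 4),
    and send the command once the stored results form a complete gesture (step 5)."""
    history = deque()                                 # stored recognition results
    for frame in frames:                              # step 1: read image data
        features = extract(frame)                     # step 2: hand features
        history.append(recognise(history, features))  # steps 3-4
        if is_complete(history):                      # step 5: complete gesture?
            send(history[-1])
            history.clear()                           # start on the next gesture

# toy run: a "gesture" counts as complete after two recognition results
sent = []
control_loop(
    frames=[1, 2, 3, 4],
    extract=lambda f: f,
    recognise=lambda hist, feats: feats,
    is_complete=lambda hist: len(hist) == 2,
    send=sent.append,
)
# sent is now [2, 4]
```

The `deque` plays the role of the stored partial results that are combined with subsequent images until a complete gesture appears.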
Further, the hand features are described in a world coordinate system and a hand coordinate system. The world coordinate system o-xyz serves as the absolute coordinate system of the whole system. The hand coordinate system o'-x'y'z' is defined with the palm stretched flat and open: the origin is the intersection of the line through the thumb and the line through the middle finger; the line through the middle finger is the x' axis, with the fingertip direction positive; the direction perpendicular to the palm is the y' axis, with the palm direction positive when the right hand is used as the control hand and the back-of-hand direction positive when the left hand is used; the z' axis is perpendicular to the plane formed by the x' and y' axes, with the thumb direction positive.
Further, the hand form features of step 2 are extracted by an image fractal method, whose extraction process comprises:
Step 1: read the image data of the input image;
Step 2: threshold the image using the grey-scale or colour information of the image data of step 1 to obtain the hand region;
Step 3: extract the edge contour of the hand region of step 2 to obtain a hand edge contour result;
Step 4: analyse the hand edge contour result of step 3, judging finger number and direction from the protrusions and recesses of the edge contour curve, to obtain the hand form feature result;
Step 5: output the hand form feature result of step 4.
Further, the hand form features of step 2 are extracted by a template matching method, whose extraction process comprises:
First step: read the image data of the input image and load the preset template data;
Second step: for preset templates based on grey-scale or colour features, perform full-image matching with a correlation filtering method; for preset templates based on HOG, Haar or LBP texture features, perform full-image matching with a cascade classifier; obtain the matching result;
Third step: determine the hand form features from the matching result of the second step, and output the hand form features.
Further, the extraction process of the hand posture features of step 2 comprises:
Step a: read the hand form feature result and load it into a preset three-dimensional hand model;
Step b: extract feature points using the hand form feature result of step a;
Step c: match the feature points of step b against the preset three-dimensional model according to their positions and pixel values, obtaining a preset three-dimensional model matching result;
Step d: establish the hand coordinate system from the preset three-dimensional model matching result of step c;
Step e: using the hand coordinate system of step d, solve the hand posture parameters with a PnP method; the hand posture parameters are the hand posture features;
Step f: output the hand posture feature results.
Further, the hand position features of step 2 are extracted using the hand posture feature results; the extraction process of the hand position features comprises:
Step I: read the hand posture feature results;
Step II: using the hand posture feature results of step I, compute the hand position parameters from the world coordinate system and the hand coordinate system; the hand position parameters are the hand position features;
Step III: output the hand position feature results of step II.
Further, the hand position features of step 2 are extracted using the hand form feature results; the extraction process of the hand position features comprises:
Step A: read the hand form feature result;
Step B: using the hand form feature result of step A, compute the hand position parameters from the hand centroid and the world coordinate system; the hand position parameters are the hand position features;
Step C: output the hand position feature results of step B.
Further, the extraction process of the hand motion features of step 2 comprises:
Step 1: read the hand form feature results, hand posture feature results and hand position feature results of the preceding image and the current image;
Step 2: perform a difference analysis on the hand form, posture and position feature results of step 1; the analysis yields the hand motion, which is the hand motion feature;
Step 3: output the hand motion feature results;
Step 4: store the hand motion feature results of step 3 for use by subsequent extraction steps.
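A minimal sketch of the difference analysis between the preceding and current frames (the feature names and data layout here are invented for illustration; the patent does not specify them):

```python
import numpy as np

def motion_features(prev, cur):
    """Per-feature difference between the preceding and the current frame;
    the sign and magnitude of each delta describe the hand motion."""
    return {name: np.subtract(cur[name], prev[name]) for name in cur}

prev_frame = {"position": (100.0, 40.0), "yaw_deg": 5.0}   # preceding image
cur_frame  = {"position": (112.0, 40.0), "yaw_deg": 20.0}  # current image
delta = motion_features(prev_frame, cur_frame)
# delta["position"] -> array([12., 0.]): the hand moved 12 units along x
# delta["yaw_deg"]  -> 15.0: the hand swung by 15 degrees
```

Storing `delta` per frame, as step 4 describes, lets later iterations accumulate a motion trajectory.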
Further, the hand features of step 2 are extracted using a depth-image feature extraction method; the extraction process is:
Step1: read the image data of the input image and load the preset three-dimensional skeleton model;
Step2: segment the hand region in the input image using the depth information;
Step3: extract the hand feature points within the hand region;
Step4: perform model matching using the hand feature points of step3 to obtain a model matching result;
Step5: establish the hand coordinate system;
Step6: extract the hand form features, hand posture features and hand position features from the model matching result of step4 combined with the hand coordinate system of step5;
Step7: compute the hand motion features from the difference between successive frames of the input image.
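The depth-based segmentation of Step2 can be sketched as a simple depth-band threshold: the hand is kept because it lies nearer to the camera than the body and background. The band limits and the toy depth map below are illustrative assumptions, not values from the patent:

```python
import numpy as np

def segment_hand_by_depth(depth_m, near=0.3, far=1.0):
    """Keep pixels whose depth (in metres) falls inside the band where the
    hand is expected, i.e. nearer than the body and background."""
    return (depth_m > near) & (depth_m < far)

depth = np.full((4, 4), 2.0)   # background roughly 2 m away
depth[1:3, 1:3] = 0.6          # hand roughly 0.6 m away
mask = segment_hand_by_depth(depth)
# mask contains exactly the four "hand" pixels
```

The resulting mask would then feed the feature-point extraction and model matching of Step3 and Step4.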
A gesture control system realising the gesture control method of claim 1; the technical solution adopted is as follows:
The control system comprises a master controller, a data acquisition module, a data processing module, a command output module and an operation feedback module. The data acquisition control terminal of the master controller is connected to the control terminal of the data acquisition module; the data processing control terminal of the master controller is connected to the control terminal of the data processing module; the command output control terminal of the master controller is connected to the control terminal of the command output module; and the operation feedback control terminal of the master controller is connected to the control terminal of the operation feedback module. The data output of the data acquisition module is connected to the data input of the data processing module, and the data output of the data processing module is connected to the data input of the command output module.
The present invention proposes a gesture control method based on hand form, posture, position and motion features, which can be used to select and control multiple groups of devices and multiple function modules within a device. The method captures gesture motion with a camera, extracts the form, posture, position and motion features, analyses the features to recognise the user's gesture, and thereby selects and controls the device or function module; the speed of a functional-state adjustment can also be controlled by the amplitude of the gesture motion. The user only needs to make simple gesture motions within a small range to complete the control, and gestures can be defined and preset according to actual needs, so the method is convenient to use.
The invention also provides a control system corresponding to the method, which captures and recognises gesture motion and then controls multiple groups of devices and multiple function modules within a device. The control system also provides feedback: the user receives real-time light, sound or vibration feedback while making gesture motions, so operation is concise and interactivity is good.
Advantageous effects of the present invention:
1. The invention can recognise gesture motion from either a monocular two-dimensional image or a depth image, giving it a wider scope of application than current mainstream methods that rely on a depth camera.
2. The proposed control system can use a monocular camera as the gesture-motion acquisition device.
3. The invention can select and control multiple groups of devices, and multiple function modules within a device, through a single coherent gesture.
4. When adjusting a functional state with a continuous range, the adjustment speed can be controlled by the amplitude of the gesture motion.
5. In terms of control effect, the proposed method and system can control a large number of devices with a complex functional hierarchy.
6. In terms of interaction, the proposed method and system are clearly different from the operation of a conventional mouse or touch screen, rather than an imitation of it.
7. The proposed gesture control method supports blind (eyes-free) operation.
Description of the drawings
Fig. 1 is the overview flow chart of gestural control method of the present invention.
Fig. 2 is world coordinate system of the present invention and hand coordinate system schematic diagram.
Fig. 3 is hand gestures examples of features figure of the present invention.
Fig. 4 is hand position examples of features figure of the present invention.
Fig. 5 is hand motion examples of features figure of the present invention.
Fig. 6 is image fractal method flow chart of the present invention.
Fig. 7 is template matching method flow chart of the present invention.
Fig. 8 is hand gestures feature extraction flow chart of the present invention.
Fig. 9 is hand position feature extraction flow chart of the present invention.
Figure 10 is hand motion feature extraction flow chart of the present invention.
Figure 11 is the feature extraction flow chart of depth image of the present invention.
Figure 12 is gestural control system structure chart of the present invention.
Figure 13 is the hardware installation diagram of the vehicle central control system.
Figure 14 is the gesture schematic diagram for opening the driver-side window.
Figure 15 is the gesture schematic diagram for closing the driver-side window.
Figure 16 is the sunroof opening and closing mode diagram.
Figure 17 is the slide-open sunroof gesture schematic diagram.
Figure 18 is the slide-close sunroof gesture schematic diagram.
Figure 19 is the tilt-open sunroof gesture schematic diagram.
Figure 20 is the tilt-close sunroof gesture schematic diagram.
Figure 21 is the installation schematic diagram of the smart-home central control equipment.
Figure 22 is the floor lamp switch-on gesture schematic diagram.
Figure 23 is the floor lamp switch-off gesture schematic diagram.
Figure 24 is the floor lamp brightness-decrease gesture schematic diagram.
Figure 25 is the floor lamp brightness-increase gesture schematic diagram.
Figure 26 is the multimedia device switch-on gesture schematic diagram.
Figure 27 is the multimedia device switch-off gesture schematic diagram.
Figure 28 is the multimedia device play/pause gesture schematic diagram.
Figure 29 is the gesture schematic diagram for switching the multimedia device to the previous file.
Figure 30 is the multimedia device current-file rewind gesture schematic diagram.
Figure 31 is the gesture schematic diagram for switching the multimedia device to the next file.
Figure 32 is the multimedia device current-file fast-forward gesture schematic diagram.
Figure 33 is the multimedia device volume-decrease gesture schematic diagram.
Figure 34 is the multimedia device volume-increase gesture schematic diagram.
Figure 35 is the decision flow chart for the relationship between curve protrusions, recesses, and finger number and direction.
Figure 36 is example one of judging a gesture from the relationship between curve protrusions, recesses, and finger number and direction.
Figure 37 is example two of judging a gesture from the relationship between curve protrusions, recesses, and finger number and direction.
Specific embodiments
The present invention is further described below with reference to specific embodiments, but the present invention is not limited to these examples.
Embodiment 1:
The present invention proposes a gesture control method based on hand form, posture, position and motion features: a camera captures and extracts the hand form, posture, position and motion features in real time, the user's gesture command is recognised, and multiple groups of devices and multiple function modules within a device are controlled accordingly. As shown in Fig. 1, the gesture control method first reads the input image, then, according to the needs of the actual application, extracts one or more of the hand form, posture, position and motion features from the image in turn, and fuses and analyses the features. When a complete gesture command is detected, the command is sent; otherwise the current analysis result is stored and processing continues with subsequent images. Specifically:
Step 1: read the image data of the input image;
Step 2: extract the hand features from the image data of step 1 and obtain hand feature results;
Step 3: fuse and analyse the hand feature data of step 2 and perform gesture recognition to obtain a gesture recognition result;
Step 4: store the gesture recognition result of step 3;
Step 5: judge whether the gesture recognition result of step 4 is a complete gesture; if it is, send the gesture command corresponding to the gesture recognition result; if it is not, repeat steps 1 to 4 until a complete gesture is recognised.
The hand features are described in a world coordinate system and a hand coordinate system. As shown in Fig. 2, the world coordinate system o-xyz serves as the absolute coordinate system of the whole system; its origin position and three axis directions are chosen according to the requirements of the practical system. The hand coordinate system o'-x'y'z' is defined with the palm stretched flat and open: the origin is the intersection of the line through the thumb and the line through the middle finger; the line through the middle finger is the x' axis, with the fingertip direction positive; the direction perpendicular to the palm is the y' axis, with the palm direction positive when the right hand is used as the control hand and the back-of-hand direction positive when the left hand is used; the z' axis is perpendicular to the plane formed by the x' and y' axes, with the thumb direction positive.
The hand features include hand form features, hand posture features, hand position features and hand motion features; the hand feature results include hand form feature results, hand posture feature results, hand position feature results and hand motion feature results.
Here, the hand form feature refers to the outer contour shape of the hand. When the user extends fingers, makes a fist or opens the palm, the action can be identified quickly from the form feature. In this embodiment, different hand forms can be used to realise the following functions:
(1) selecting a device from multiple groups of devices, or selecting a different function module of the same device; the selected device or module then responds to subsequent gesture control commands;
(2) adjusting a specific function of the selected device or function module to different states, for example extending one finger to switch the device "on" and making a fist to switch it "off";
(3) controlling the speed of a functional-state change: when a state needs to "increase", for example, the adjustment is slower when one finger is extended and faster when two fingers are extended.
The hand posture feature covers the pointing directions formed by pitching, swinging and rotating the hand. It can be expressed by the angles between the hand coordinate system and the world coordinate system, and the size of an angle indicates the posture amplitude, as with the angle a between o'x' and ox in Fig. 3.
In this embodiment, different hand postures can be used to realise the following functions:
(1) selecting a device from multiple groups of devices, or selecting a different function module of the same device; the selected device or module then responds to subsequent gesture control commands;
(2) adjusting a specific function of the selected device or function module to different states, for example using the hand pitch (up/down) posture to switch the device "on" and "off", and the swing posture to "increase" and "decrease" a functional state;
(3) controlling the speed of the functional-state change by the posture amplitude: when a state needs to "increase", for example, a large posture amplitude adjusts the state quickly and a small amplitude adjusts it slowly.
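The posture amplitude described above, such as the angle a between the hand axis o'x' and the world axis ox, reduces to the angle between two direction vectors. A small numpy sketch (the 30-degree swing is an arbitrary example, not a value from the patent):

```python
import numpy as np

def axis_angle_deg(hand_axis, world_axis):
    """Angle between a hand-coordinate axis and the corresponding world axis;
    its size is the posture amplitude."""
    u = hand_axis / np.linalg.norm(hand_axis)
    v = world_axis / np.linalg.norm(world_axis)
    return np.degrees(np.arccos(np.clip(u @ v, -1.0, 1.0)))

theta = np.radians(30)                                    # hand swung 30 degrees about z
x_hand = np.array([np.cos(theta), np.sin(theta), 0.0])    # rotated x' axis
amplitude = axis_angle_deg(x_hand, np.array([1.0, 0.0, 0.0]))
# amplitude is 30.0 (up to floating-point error)
```

In a full system the rotated axis would come from the pose solved by the PnP step rather than being constructed by hand.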
The hand position feature refers to the spatial position of the hand, expressed as the coordinates of the hand coordinate origin in the world coordinate system; the distance between the hand coordinate origin and the world coordinate origin indicates the displacement, as shown by line segment l in Fig. 4. When the accuracy requirement is modest, the coordinates of the hand centroid in the world coordinate system and its distance from the origin can be computed directly.
In this embodiment, different hand positions can be used to realise the following functions:
(1) selecting a device from multiple groups of devices, or selecting a different function module of the same device; the selected device or module then responds to subsequent gesture control commands;
(2) adjusting a specific function of the selected device or function module to different states;
(3) controlling the speed of the functional-state change by the displacement: when a state needs to "increase", for example, a large displacement adjusts the state quickly and a small displacement adjusts it slowly.
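The centroid-based variant of the position feature can be sketched directly from a segmented hand mask. Treating image coordinates as the world frame is an assumption made here for illustration; the toy blob is likewise invented:

```python
import numpy as np

def centroid_and_distance(mask, origin=(0.0, 0.0)):
    """Centre of form of the hand region and its distance from the
    world-coordinate origin (the distance plays the role of segment l)."""
    rows, cols = np.nonzero(mask)
    cy, cx = rows.mean(), cols.mean()
    return (cy, cx), float(np.hypot(cy - origin[0], cx - origin[1]))

mask = np.zeros((10, 10), dtype=bool)
mask[2:5, 4:8] = True                    # a 3x4 "hand" blob
(cy, cx), dist = centroid_and_distance(mask)
# cy == 3.0, cx == 5.5
```

Tracking `dist` over successive frames gives the displacement that the method uses to scale the adjustment speed.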
The hand motion feature refers to the change of the hand form, posture or position features over a period of time. Fig. 5 shows a group of hand-motion examples in which the hand rotates about the oy axis while translating along the ox direction. Hand motion may also include changes of form, such as extending or retracting a finger.
In this embodiment, different hand motions can be used to realise the following functions:
(1) selecting a device from multiple groups of devices, or selecting a different function module of the same device; the selected device or module then responds to subsequent gesture control commands;
(2) adjusting a specific function of the selected device or function module to different states;
(3) controlling the speed of the functional-state change by the speed or amplitude of the motion.
Depending on the input image, the hand features in this embodiment are extracted from a monocular two-dimensional image; the specific extraction methods are as follows.
The hand form features of step 2 above are extracted by the image fractal method, by the template matching method, or by a combination of the two. The flow of the image fractal method is shown in Fig. 6: threshold segmentation based on the grey-scale or colour information of the hand image first yields the hand pixels; the hand edge information is then extracted; and finger number and direction can be judged from the protrusions and recesses of the boundary curve. The extraction process of the image fractal method comprises:
Step 1: read the image data of the input image;
Step 2: threshold the image using the grey-scale or colour information of the image data of step 1 to obtain the hand region;
Step 3: extract the edge contour of the hand region of step 2 to obtain a hand edge contour result;
Step 4: analyse the hand edge contour result of step 3, judging finger number and direction from the protrusions and recesses of the edge contour curve, to obtain the hand form feature result;
Step 5: output the hand form feature result of step 4.
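Steps 1-3 (threshold segmentation followed by edge-contour extraction) can be sketched in a few lines. A library such as OpenCV would normally be used; the plain numpy version below shows the idea, with the threshold value and toy image as illustrative assumptions:

```python
import numpy as np

def segment_hand(gray, thresh=128):
    """Step 2: threshold segmentation on grey-scale values."""
    return gray > thresh

def edge_contour(mask):
    """Step 3: hand pixels with at least one non-hand 4-neighbour are edge pixels."""
    p = np.pad(mask, 1, constant_values=False)
    interior = p[:-2, 1:-1] & p[2:, 1:-1] & p[1:-1, :-2] & p[1:-1, 2:]
    return mask & ~interior

img = np.zeros((5, 5), dtype=np.uint8)
img[1:4, 1:4] = 200            # bright 3x3 "hand" on a dark background
mask = segment_hand(img)       # the 9 hand pixels
edges = edge_contour(mask)     # the 8 boundary pixels of the block
```

The edge pixels would then feed the protrusion/recess analysis of step 4.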
The decision flow for the relationship between curve protrusions, recesses, and finger number and direction involved in step 4 is shown in Figure 35. Specifically:
1. After obtaining the hand edge image, traverse all edge points in the image and compute their curvature.
2. Classify each edge point by its curvature value: a point whose curvature is large, above the protrusion decision threshold, is a protrusion point; a point whose curvature is small, below the recess decision threshold, is a recess point; the remainder are ordinary edge points, as shown in Figures 36 to 37 (for simplicity, the edges in these and subsequent figures are not drawn). In the example, the curvature at point A exceeds the protrusion decision threshold, so A is recorded as a protrusion point; the curvature at points B and C is below the recess decision threshold, so they are recorded as recess points.
3. Traverse each protrusion point and analyse its positional relationship with the adjacent recess points:
a) point A is adjacent to point B and to point C, and there is no other protrusion point between B and C;
b) connecting A to B and A to C, the lengths of segments AB and AC both satisfy the preset threshold;
c) the angle formed by AB and AC satisfies the preset threshold;
d) when a), b) and c) all hold, point A is considered a fingertip, and the finger direction can be described by segments AB and AC;
e) if a), b) and c) cannot all be satisfied, point A is not a fingertip.
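A hedged sketch of the protrusion/recess classification: here discrete "curvature" is approximated by the turning angle at each contour point, and protrusion versus recess by the sign of the cross product of the neighbour vectors. The thresholds and the toy contour are invented for illustration and are not the patent's values:

```python
import numpy as np

def classify_contour_points(contour, k=1, sharp_deg=60.0):
    """For each contour point, take the angle between the vectors to its k-th
    neighbours (a sharp angle means high curvature) and the cross-product sign
    (separates protrusion points from recess points)."""
    prev = np.roll(contour, k, axis=0)
    nxt = np.roll(contour, -k, axis=0)
    v1, v2 = prev - contour, nxt - contour
    cos = (v1 * v2).sum(1) / (np.linalg.norm(v1, axis=1) * np.linalg.norm(v2, axis=1))
    angles = np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))
    cross = v1[:, 0] * v2[:, 1] - v1[:, 1] * v2[:, 0]
    tips = np.where((angles < sharp_deg) & (cross > 0))[0]   # protrusions (point A)
    valleys = np.where((angles < 120.0) & (cross < 0))[0]    # recesses (points B, C)
    return tips, valleys

# toy contour: a flat hand edge with a single finger-like spike at index 2
c = np.array([[0, 0], [2, 0], [3, 3], [4, 0], [6, 0], [6, -2], [0, -2]], float)
tips, valleys = classify_contour_points(c)
# tips -> [2]: the spike apex; valleys -> [1, 3]: the points flanking the spike
```

The segments from the tip to its two flanking valleys play the role of AB and AC in checks b) and c).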
The flow of the template matching method is shown in Fig. 7: a series of feature templates is first trained from prior knowledge of the hand's grey-scale, colour or texture under different forms; the templates are then matched across the full input image, and the best match is the hand form in the current image. The extraction process of the template matching method comprises:
The first step, the image data for reading input picture, and be loaded into and preset template data;
Second step uses correlation filtering method to the default template based on gray scale or color characteristic preset described in the first step in template Carry out the full figure matching of described image;It is special to presetting the textural characteristics based on HOG, Haar or LPB in template described in the first step The full figure that the default template of sign carries out described image using cascade classifier matches, and obtains matching result;
Third step determines hand morphological feature according to matching result described in second step, and exports the hand morphological feature.
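The grayscale branch of the second step above can be sketched with plain normalized cross-correlation over the full image. This is a small illustrative sketch, not the patent's trained templates: the image, template values and score function are assumptions.

```python
# Hypothetical full-image template match via normalized cross-correlation.
import numpy as np

def match_template(image, template):
    """Slide the template over the full image; return the score map and the
    top-left corner of the best match."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    scores = np.full((ih - th + 1, iw - tw + 1), -1.0)
    for y in range(scores.shape[0]):
        for x in range(scores.shape[1]):
            win = image[y:y + th, x:x + tw]
            w = win - win.mean()
            denom = np.sqrt((w * w).sum() * (t * t).sum())
            if denom > 0:
                scores[y, x] = (w * t).sum() / denom
    best = np.unravel_index(np.argmax(scores), scores.shape)
    return scores, best

tmpl = np.array([[0.2, 0.8, 0.2],
                 [0.8, 1.0, 0.8],
                 [0.2, 0.8, 0.2]])      # a toy "hand" template
img = np.zeros((8, 8))
img[2:5, 3:6] = tmpl                     # the template pattern placed at (2, 3)
scores, best = match_template(img, tmpl)
print(int(best[0]), int(best[1]))        # 2 3: the best-match location
```

In practice a library routine (e.g. an OpenCV-style `matchTemplate`) would replace the explicit double loop; the loop is kept here to make the "full-image matching" of the second step visible.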
Extraction of the hand posture feature must build on the morphological feature. A series of three-dimensional models is first established for different hand forms; feature points are extracted from the acquired hand form and matched against the preset model by position and pixel value; the three-dimensional adjacency relations between the feature points are determined, establishing the hand coordinate system in the current state; finally, a PnP (Perspective-n-Point) method solves the hand's attitude parameters in the world coordinate system. As shown in Fig. 8, the extraction process of the hand posture feature in step 2 includes:
Step a: read the hand morphological feature result, and load it into the preset three-dimensional hand model;
Step b: extract feature points using the hand morphological feature result of step a;
Step c: match the feature point positions and pixel values of step b against the preset three-dimensional model, obtaining the preset three-dimensional model matching result;
Step d: establish the hand coordinate system from the preset three-dimensional model matching result of step c;
Step e: solve the hand posture parameters from the hand coordinate system of step d using the PnP method; these parameters are the hand posture feature;
Step f: output the characteristic result of the hand posture feature.
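The output of step e can be illustrated numerically: the PnP step ends with a rotation describing the hand coordinate system relative to the world frame, and the three axis angles (the posture feature) are read off that rotation. A minimal numpy sketch, assuming a ZYX Euler convention, which is an illustrative choice, not one specified by the patent:

```python
# Recover the three axis angles (posture feature) from a rotation matrix.
import numpy as np

def rot(ax, ay, az):
    """Rotation matrix R = Rz(az) @ Ry(ay) @ Rx(ax), angles in radians."""
    cx, sx = np.cos(ax), np.sin(ax)
    cy, sy = np.cos(ay), np.sin(ay)
    cz, sz = np.cos(az), np.sin(az)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def posture_angles(R):
    """Recover (ax, ay, az) from R, assuming cos(ay) != 0 (no gimbal lock)."""
    ay = -np.arcsin(R[2, 0])
    ax = np.arctan2(R[2, 1], R[2, 2])
    az = np.arctan2(R[1, 0], R[0, 0])
    return ax, ay, az

R = rot(0.1, -0.2, 0.3)            # stand-in for the PnP-recovered rotation
ax, ay, az = posture_angles(R)
print(round(float(ax), 3), round(float(ay), 3), round(float(az), 3))  # 0.1 -0.2 0.3
```

A production system would obtain R (and the translation used below for the position feature) from a PnP solver such as OpenCV's `solvePnP` rather than constructing it by hand.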
Extraction of the hand position feature must build on the posture or morphological feature. When extracted from the posture feature, it can be computed directly from the hand coordinate system obtained above and the world coordinate system; when extracted from the morphological feature, the centroid can be computed from the hand edge contour obtained above, giving the centroid's coordinates in the world coordinate system and its distance to the origin. As shown in Fig. 9, the process is as follows:
When the hand position feature of step 2 is extracted using the hand posture feature result, the extraction process includes:
Step I: read the hand posture feature result;
Step II: using the hand posture feature result of step I, compute the hand position parameters from the world coordinate system and the hand coordinate system; these parameters are the hand position feature. (The PnP algorithm applied during posture extraction outputs the rotation matrix and the translation matrix of the hand coordinate system relative to the world coordinate system. The rotation matrix describes the hand coordinate system's angles about the three x, y, z axes of the world coordinate system; this group of angles is the hand posture feature. The translation matrix describes the three-dimensional coordinates of the hand coordinate origin o' in the world coordinate system o-xyz; this group of coordinates is the hand position feature.)
Step III: output the hand position feature result of step II.
When the hand position feature of step 2 is extracted using the hand morphological feature result, the extraction process includes:
Step A: read the hand morphological feature result;
Step B: using the hand morphological feature result of step A, compute the hand position parameters from the hand centroid and the world coordinate system; these parameters are the hand position feature;
Step C: output the hand position feature result of step B.
The process of computing the hand position feature from the world coordinates and the hand centroid in step B is as follows:
1. When the world coordinate system is established, the mapping from the image coordinates of the camera image to world coordinates is included.
2. After the hand edge contour is extracted, the average of the image coordinates of all points on the contour is computed; this average is the image coordinate of the hand centroid.
3. From the mapping between image coordinates and world coordinates, the world coordinate of the hand centroid is obtained; this coordinate is the hand position feature.
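Points 1 to 3 above can be sketched directly: average the contour's pixel coordinates, then map the centroid to world coordinates. The affine image-to-world mapping (a single scale and a reference origin) is a stand-in assumption for the calibrated mapping the patent establishes with the world coordinate system.

```python
# Sketch of the centroid-based hand position feature.
import numpy as np

def hand_position(contour_px, scale=0.001, origin=(0.32, 0.24)):
    """contour_px: (N, 2) array of edge-point pixel coordinates (u, v).
    Returns the centroid's world (x, y) and its distance to the origin."""
    centroid_px = contour_px.mean(axis=0)     # point 2: mean of the edge points
    world = centroid_px * scale               # point 3: pixel -> world mapping
    dist = np.hypot(*(world - np.asarray(origin)))
    return world, dist

edge = np.array([[300, 200], [340, 200], [340, 280], [300, 280]], float)
world, dist = hand_position(edge)
print(world, dist)   # centroid (320, 240) px maps onto the assumed origin
```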
To extract the motion feature, the hand features of the preceding images must be recorded. By comparing one or more of the hand form, posture and position features between the current frame and a preceding frame, the hand motion feature over that interval is obtained; the extracted result is then stored for the processing and analysis of subsequent images. As shown in Figure 10, the extraction process of the hand motion feature in step 2 includes:
In the 1st step, the hand morphological, posture and position feature results of the preceding image and the current image are read;
In the 2nd step, the differences between the feature results of the 1st step are analyzed, yielding the hand motion; this motion is the hand motion feature;
In the 3rd step, the hand motion feature result is output;
In the 4th step, the hand motion feature result of the 3rd step is stored for later calls of the extraction process.
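The four steps above amount to differencing stored previous-frame features against the current frame and keeping the current features for the next call. A minimal sketch; the feature layout (a position triple and a posture triple) is an illustrative assumption:

```python
# Sketch of the motion-feature extraction loop described above.
import numpy as np

class MotionTracker:
    def __init__(self):
        self.prev = None   # 4th step: stored result for the next extraction

    def update(self, position, posture):
        """1st-3rd steps: read previous and current features, diff, output."""
        motion = None
        if self.prev is not None:
            motion = {
                "d_position": np.subtract(position, self.prev[0]),
                "d_posture": np.subtract(posture, self.prev[1]),
            }
        self.prev = (position, posture)
        return motion

trk = MotionTracker()
trk.update((0.30, 0.20, 0.50), (0.0, 0.0, 0.0))       # first frame: no motion yet
m = trk.update((0.30, 0.20, 0.45), (0.0, -0.5, 0.0))  # hand moved and palm rotated
print(m["d_position"])   # third component -0.05: moved 5 cm along that axis
```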
The hand form, posture, position and motion features can each be used individually for gesture control, or be combined according to actual needs.
When multiple features are used, they can be fused into one composite feature. For example, the following three features can be treated as one combination: when the user is detected making a gesture that satisfies all three at once, a certain function of a certain device is executed directly:
(1) The palm is open and flat;
(2) The palm translates forward;
(3) The palm flips from facing up to facing down.
In addition, each feature can also be mapped to a different control level; for example, the corresponding instruction is executed when each of the following is detected:
(1) The palm is detected open and flat, i.e. all five fingers extended: the device numbered 5 is selected;
(2) On the basis of instruction 1, the palm is further detected translating forward: function 1 of device No. 5 is selected;
(3) On the basis of instruction 2, if the palm is detected flipping from facing up to facing down, subfunction 1 of function 1 of device No. 5 is executed; if the palm is detected swinging, subfunction 2 of function 1 of device No. 5 is executed.
In this way, the user can complete device selection, function selection and function execution through one coherent gesture motion.
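The two usage patterns above can be sketched as code: a fused three-feature gesture that fires one function directly, and a hierarchical chain in which each feature narrows the selection from device to function to subfunction. All gesture, device and function names are illustrative assumptions.

```python
# Hypothetical sketch of fused vs. hierarchical feature use.
def dispatch(form, motion, palm_flip):
    """Fused mode: all three feature conditions together trigger one command."""
    if (form == "palm_open_flat" and motion == "translate_forward"
            and palm_flip == "up_to_down"):
        return "device_X.function_Y"
    return None

def dispatch_hierarchical(events):
    """Hierarchical mode: each feature selects one level of a control tree."""
    tree = {"palm_open_flat": ("device 5", {
        "translate_forward": ("function 1", {
            "palm_up_to_down": "subfunction 1",
            "palm_swing": "subfunction 2"})})}
    selection, node = [], tree
    for ev in events:
        if not isinstance(node, dict) or ev not in node:
            break
        step = node[ev]
        if isinstance(step, tuple):        # inner level: name + children
            selection.append(step[0])
            node = step[1]
        else:                              # leaf: final subfunction
            selection.append(step)
            node = step
    return selection

print(dispatch("palm_open_flat", "translate_forward", "up_to_down"))
# device_X.function_Y
print(dispatch_hierarchical(
    ["palm_open_flat", "translate_forward", "palm_up_to_down"]))
# ['device 5', 'function 1', 'subfunction 1']
```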
Embodiment 2
The present invention proposes a gesture control method based on hand form, posture, position and motion features. A camera captures the hand in real time; its form, posture, position and motion features are extracted; the user's gesture command is recognized; and multiple groups of devices, and multiple function modules within a device, are controlled. As shown in Fig. 1, the method first reads the input image, extracts one or more of the hand form, posture, position and motion features from the image in turn according to the needs of the actual application, and then fuses and analyzes the features. When a complete gesture command is detected, the command is sent; otherwise the current analysis result is stored and processed together with subsequent images. Specifically:
Step 1: read the image data of the input image;
Step 2: extract the hand features from the image data of step 1, and obtain the hand feature results;
Step 3: fuse and analyze the hand feature data of step 2 and perform gesture recognition, obtaining the gesture recognition result;
Step 4: store the gesture recognition result of step 3;
Step 5: judge whether the gesture recognition result of step 4 is a complete gesture; if it is, send the gesture command corresponding to the recognition result; if not, execute steps 1 to 4 iteratively until a complete gesture is recognized.
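Steps 1 to 5 above form a per-frame loop that accumulates partial results until a complete gesture appears. The control flow can be sketched as follows; the recognizer and the frame source are stubbed-out assumptions, not the patent's actual feature extraction:

```python
# Control-flow sketch of the five-step recognition loop.
def run_pipeline(frames, recognize):
    history = []                       # step 4: stored recognition results
    commands = []
    for frame in frames:               # step 1: read input image
        features = frame               # step 2 stand-in: features precomputed
        result = recognize(features, history)   # step 3: fuse + recognize
        history.append(result)
        if result.get("complete"):     # step 5: complete gesture -> send
            commands.append(result["command"])
            history.clear()
    return commands

def toy_recognizer(features, history):
    """Completes when a 'select' frame is followed by an 'act' frame."""
    if features == "act" and any(r.get("partial") for r in history):
        return {"complete": True, "command": "window.down"}
    return {"partial": features == "select"}

print(run_pipeline(["idle", "select", "act", "idle"], toy_recognizer))
# ['window.down']
```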
The hand features are described in the world coordinate system and the hand coordinate system. As shown in Fig. 2, the world coordinate system o-xyz serves as the absolute coordinate system of the whole system; its origin and the directions of its three axes are chosen according to the requirements of the actual system. For the hand coordinate system o'-x'y'z', with the palm flat and open, the origin is the intersection of the line of the thumb with the line of the middle finger; the x' axis lies along the line of the middle finger, with the fingertip direction positive; the y' axis is perpendicular to the palm, with the palm direction positive when the right hand is the control hand and the back-of-hand direction positive when the left hand is used; the z' axis is perpendicular to the plane formed by the x' and y' axes, with the thumb direction positive.
The hand features described in step 2 above are extracted by a depth-image feature extraction method. As shown in Figure 11, the extraction process is as follows:
Step1: read the image data of the input image, and load the preset three-dimensional skeleton model;
Step2: segment the hand region in the input image using depth information;
Step3: extract hand feature points within the hand region;
Step4: perform model matching with the hand feature points of Step3, obtaining the model matching result;
Step5: establish the hand coordinate system;
Step6: extract the hand morphological feature from the model matching result of Step4, and the hand posture feature and hand position feature from the hand coordinate system of Step5;
Step7: compute the hand motion feature from the differences between successive frames of the input image.
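Step2 above relies on the depth camera's key advantage: the hand is typically the closest object to the camera, so it can be isolated with a depth band. A minimal sketch; the band limits are illustrative assumptions, not calibrated values:

```python
# Hypothetical depth-based hand segmentation (Step2 above).
import numpy as np

def segment_hand(depth_mm, near=300, far=700):
    """Return a boolean mask of pixels whose depth lies in [near, far) mm;
    zeros (invalid depth readings) fall outside the band and are excluded."""
    return (depth_mm >= near) & (depth_mm < far)

depth = np.full((4, 6), 1500)       # background at ~1.5 m
depth[1:3, 2:5] = 450               # hand region at ~0.45 m
mask = segment_hand(depth)
print(int(mask.sum()))              # 6: pixels inside the hand band
```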
Embodiment 3
This embodiment proposes a gesture control system implementing the gesture control method of claim 1; as shown in Figure 12, the technical solution is as follows:
The control system includes a master controller, a data acquisition module, a data processing module, a command output module, and an operation feedback module. The data-acquisition control interaction terminal of the master controller is connected to the control interaction terminal of the data acquisition module; the data-processing control interaction terminal of the master controller is connected to the control interaction terminal of the data processing module; the command-output control interaction terminal of the master controller is connected to the control interaction terminal of the command output module; and the operation-feedback control interaction terminal of the master controller is connected to the control interaction terminal of the operation feedback module. The data output terminal of the data acquisition module is connected to the data input terminal of the data processing module, and the data output terminal of the data processing module is connected to the data input terminal of the command output module.
The master controller controls the operation of the other modules and obtains their operating status from them in real time. The data acquisition module captures gesture motion in real time through the camera and passes the data to the data processing module. The data processing module extracts hand features from the captured data, recognizes gesture commands, and passes them to the command output module. The command output module receives the gesture commands and sends function instructions to the relevant devices. The operation feedback module provides real-time feedback to the user, helping the user keep track of the current system state.
According to the actual application requirements, the control system can adopt the following hardware implementations:
The master controller, data processing module and command output module can be built on an embedded platform or another computing platform with image-processing capability. To guarantee real-time operation, safety and stability, the hardware platform must support multithreaded and parallel computation. Since the system must run uninterrupted for long periods, low power consumption is also a key factor for the hardware platform. The Tegra-series embedded platforms released by NVIDIA, such as the TK1, TX1 and TX2, can meet these requirements.
The data acquisition module may use a monocular camera or a depth camera as the video capture device. A monocular camera is a module composed of a single sensor and a single lens, whose function is to acquire a color or grayscale image of the scene in real time in response to an external trigger. Depth cameras mainly include sensors based on TOF (time of flight), structured light, or multi-view imaging; their main difference from a monocular camera is that they can also capture, in real time, the spatial distance between objects in the scene and the camera. The data acquisition module can connect to the master controller and the data processing module via USB cable, optical fiber, wireless link or other data transmission modes, and transmit data in real time.
The command output module serves as the interface between the gesture control system and the controlled devices, conveying the control instructions produced by the data processing module to the controlled devices according to a specified communication protocol. The module can send instructions to the controlled devices via USB cable, optical fiber or short-range wireless transmission protocols, the latter including ZigBee, NB-IoT, LoRa, etc. Both the protocol and the hardware module supporting it are components of the command output module.
The operation feedback module can give the user feedback through vibration, light or sound, and according to actual demand may include a vibration device, LEDs, a loudspeaker or other hardware of equivalent function.
The gesture control method proposed by the present invention can recognize gesture motion from monocular two-dimensional images or from depth images, and is therefore widely applicable. With a monocular camera as the gesture acquisition device, the hardware of the control system is low-cost and simple, and easy to build and maintain. The invention allows multiple groups of devices, and multiple function modules within a device, to be selected and controlled with a single coherent gesture, and can therefore control fairly complex hardware systems. The method supports eyes-free operation, avoiding diversion of the user's gaze; in application scenarios such as driving, it can help reduce the likelihood of accidents compared with traditional methods.
Embodiment 4
Embodiment 4 proposes a vehicle central control system based on the gesture control method of the present invention. The hardware installation is shown in Figure 13. Specifically:
1. The gesture operating area is located at the console, close to the gear lever.
2. The central control system uses an in-vehicle embedded computing platform, which controls each device in the vehicle through the in-vehicle bus.
3. A monocular grayscale camera or a depth camera serves as the acquisition device; the camera is placed between the steering wheel and the center console, and includes an infrared fill light for imaging in low-light scenes.
4. The vibration feedback device is located on the right side of the driver's seat back. In addition, the system can give the user sound feedback through the in-vehicle sound system.
The control method of the present invention is illustrated below using the driver-side window and the sunroof as examples.
The window control method in the vehicle central control system is as follows: extend the thumb and index finger with the thumb pointing up, as shown in Figure 14(a); this gesture is the selection gesture of the driver-side window. Keeping the hand form unchanged, rotating the hand so the palm faces down lowers the window, and palm up raises it. The specific steps are:
Step a: extend the thumb and index finger with the thumb pointing up, making the selection gesture of the driver-side window; the system selects the driver-side window as the controlled device and sends vibration and sound feedback;
Step b: when the window is not fully lowered, rotate the hand until the palm faces down, as shown in Figure 14(c); the system issues the window-lowering instruction and sends vibration and sound feedback to the user, and the driver-side window begins to descend. At this point:
a) If the current hand form is kept but moved out of the operating area, or the hand form is changed, e.g. relaxed, the system keeps executing the lowering instruction until the window is fully lowered;
b) If the control gesture of another device is made, the system executes that device's gesture command while the lowering instruction continues, and sends the corresponding vibration and sound feedback;
c) If, before the window is fully lowered, the hand returns to the state of Figure 14(a), the system issues the window-stop instruction and sends vibration and sound feedback to the user; the window stays at its current position and stops descending;
Step c: when the window is not fully raised, make the driver-side window selection gesture and rotate the hand until the palm faces up, as shown in Figure 15(c); the system sends the window-raising instruction and vibration and sound feedback to the user, and the driver-side window begins to rise. At this point:
a) If the current hand form is kept but moved out of the operating area, or the hand form is changed, e.g. relaxed, the system keeps executing the raising instruction until the window is fully raised;
b) If the control gesture of another device is made, the system executes that device's gesture command while the raising instruction continues, and sends the corresponding vibration and sound feedback;
c) If, before the window is fully raised, the hand returns to the state of Figure 15(a), the system issues the window-stop instruction and sends vibration and sound feedback to the user; the window stays at its current position and stops rising.
In the above procedure, step a can be skipped and step b or c executed directly; the system will directly issue the corresponding instruction and feedback.
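The window control logic above (selection gesture, palm-down to lower, palm-up to raise, return to the selection pose to stop, and the skippable step a) can be sketched as a small state machine. Gesture and state names are illustrative assumptions:

```python
# Toy state machine for the driver-side window control steps a-c above.
def window_controller():
    state = {"selected": False, "command": "idle"}

    def on_gesture(g):
        if g == "thumb_index_up":            # step a, or the stop case in b/c
            state["command"] = "stop" if state["selected"] else "idle"
            state["selected"] = True
        elif g in ("palm_down", "palm_up"):  # steps b and c; step a optional
            state["selected"] = True
            state["command"] = "lower" if g == "palm_down" else "raise"
        elif g == "leave_area":              # subcase a): command keeps running
            pass
        return state["command"]

    return on_gesture

ctl = window_controller()
print(ctl("thumb_index_up"))  # idle (selection only, no motion yet)
print(ctl("palm_down"))       # lower
print(ctl("leave_area"))      # lower (window keeps descending)
print(ctl("thumb_index_up"))  # stop
```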
In the window control module of the vehicle central control system, the opening and closing of the sunroof has two modes, translation and tilt. In translation mode, the sunroof slides backward to open and forward to close, as shown in Figure 16(a); in tilt mode, the rear of the sunroof lifts to open and drops to close, as shown in Figure 16(b).
The control is as follows: extend the thumb, index finger and middle finger with the thumb pointing up, as shown in Figure 17(a); this gesture is the sunroof selection gesture. Keeping the hand form unchanged, rotating the hand so the palm faces down opens the sunroof in translation, and palm up closes it; swinging the hand to the left opens the sunroof in tilt, and to the right closes it. The specific steps are:
Translational mode:
Step a: extend the thumb, index finger and middle finger with the thumb pointing up, making the sunroof selection gesture; the system selects the sunroof as the controlled device and sends vibration and sound feedback;
Step b: when the sunroof is in translation mode and not fully open, rotate the hand until the palm faces down, as shown in Figure 17(c); the system issues the sunroof translation-open instruction and sends vibration and sound feedback to the user, and the sunroof begins to slide backward. At this point:
a) If the current hand form is kept but moved out of the operating area, or the hand form is changed, e.g. relaxed, the system keeps executing the translation-open instruction until the sunroof is fully open;
b) If the tilt-mode gesture is made, the sunroof first closes in translation and then executes the tilt-mode instruction; the system sends the corresponding vibration and sound feedback;
c) If the control gesture of another device is made, the system executes that device's gesture command while the translation-open instruction continues, and sends the corresponding vibration and sound feedback;
d) If, before the sunroof is fully open, the hand returns to the state of Figure 17(a), the system issues the sunroof-stop instruction and sends vibration and sound feedback to the user; the sunroof stays at its current position and stops moving backward;
Step c: when the sunroof is in translation mode and not fully closed, make the sunroof selection gesture and rotate the hand until the palm faces up, as shown in Figure 18(c); the system sends the sunroof translation-close instruction and vibration and sound feedback to the user, and the sunroof begins to slide forward. At this point:
a) If the current hand form is kept but moved out of the operating area, or the hand form is changed, e.g. relaxed, the system keeps executing the translation-close instruction until the sunroof is fully closed;
b) If the tilt-mode gesture is made, the sunroof first closes in translation and then executes the tilt-mode instruction; the system sends the corresponding vibration and sound feedback;
c) If the control gesture of another device is made, the system executes that device's gesture command while the translation-close instruction continues, and sends the corresponding vibration and sound feedback;
d) If, before the sunroof is fully closed, the hand returns to the state of Figure 18(a), the system issues the sunroof-stop instruction and sends vibration and sound feedback to the user; the sunroof stays at its current position and stops moving forward.
In the above procedure, step a can be skipped and step b or c executed directly; the system will directly issue the corresponding instruction and feedback.
Tilt mode:
Step a: extend the thumb, index finger and middle finger with the thumb pointing up, making the sunroof selection gesture; the system selects the sunroof as the controlled device and sends vibration and sound feedback;
Step b: when the sunroof is in tilt mode and not fully open, swing the hand to the left, as shown in Figure 19(c); the system issues the sunroof tilt-open instruction and sends vibration and sound feedback to the user, and the rear of the sunroof begins to lift. At this point:
a) If the current hand form is kept but moved out of the operating area, or the hand form is changed, e.g. relaxed, the system keeps executing the tilt-open instruction until the sunroof is fully open;
b) If the translation-mode gesture is made, the sunroof first closes in tilt and then executes the translation-mode instruction; the system sends the corresponding vibration and sound feedback;
c) If the control gesture of another device is made, the system executes that device's gesture command while the tilt-open instruction continues, and sends the corresponding vibration and sound feedback;
d) If, before the sunroof is fully open, the hand returns to the state of Figure 19(a), the system issues the sunroof-stop instruction and sends vibration and sound feedback to the user; the sunroof stays at its current position and stops lifting;
Step c: when the sunroof is in tilt mode and not fully closed, make the sunroof selection gesture and swing the hand to the right, as shown in Figure 20(c); the system sends the sunroof tilt-close instruction and vibration and sound feedback to the user, and the rear of the sunroof begins to descend. At this point:
a) If the current hand form is kept but moved out of the operating area, or the hand form is changed, e.g. relaxed, the system keeps executing the tilt-close instruction until the sunroof is fully closed;
b) If the translation-mode gesture is made, the sunroof first closes in tilt and then executes the translation-mode instruction; the system sends the corresponding vibration and sound feedback;
c) If the control gesture of another device is made, the system executes that device's gesture command while the tilt-close instruction continues, and sends the corresponding vibration and sound feedback;
d) If, before the sunroof is fully closed, the hand returns to the state of Figure 20(a), the system issues the sunroof-stop instruction and sends vibration and sound feedback to the user; the sunroof stays at its current position and stops descending.
In the above procedure, step a can be skipped and step b or c executed directly; the system will directly issue the corresponding instruction and feedback.
Embodiment 5
Embodiment 5 is a smart-home central control device based on the gesture control method of the present invention; the hardware installation is shown in Figure 21. Specifically:
1. The gesture operating area is located at the right armrest of the user's seat.
2. The central control system uses an embedded computing platform, connected to the camera and to each controlled device by wireless links.
3. A monocular grayscale camera or a depth camera serves as the acquisition device; the camera is placed above the television and includes an infrared fill light for imaging in low-light scenes. Depending on the actual application scene, the camera can also be placed at any other position from which the user's gesture motions can be clearly captured.
4. The system sends sound feedback to the user through the controlled loudspeakers.
5. Among the controlled devices, two loudspeakers support the central control system in providing sound feedback; they are also connected to the television, together forming the multimedia equipment.
The control method of the present invention is illustrated below using the floor lamp and the multimedia equipment as examples.
The floor lamp control functions include on/off and brightness adjustment. The control is as follows: extend the thumb and index finger with the thumb pointing up, as shown in Figure 22(a); this gesture is the floor lamp's selection gesture. Keeping the hand form unchanged, swinging the hand to the right turns the lamp on, and to the left turns it off; rotating the hand so the palm faces down dims the lamp, and palm up brightens it.
(1) Floor lamp on/off control
Step a: extend the thumb and index finger with the thumb pointing up, making the floor lamp selection gesture; the system selects the floor lamp as the controlled device and sends sound feedback;
Step b: when the floor lamp is off, swing the hand to the right, as shown in Figure 22(c); the system issues the floor-lamp-on instruction and sends sound feedback to the user, and the floor lamp turns on;
Step c: when the floor lamp is on, swing the hand to the left, as shown in Figure 23(c); the system issues the floor-lamp-off instruction and sends sound feedback to the user, and the floor lamp turns off.
In the above procedure, step a can be skipped and step b or c executed directly; the system will directly issue the corresponding instruction and feedback.
(2) Floor lamp brightness adjustment
Step a: extend the thumb and index finger with the thumb pointing up, making the floor lamp selection gesture; the system selects the floor lamp as the controlled device and sends sound feedback;
Step b: when the floor lamp is on, rotate the hand until the palm faces down, as shown in Figure 24(c); the system issues the brightness-decrease instruction and sends sound feedback to the user, and the floor lamp begins to dim. At this point:
a) The rate of brightness change is determined by the hand's rotation angle: the larger the angle, the faster the change; the system keeps decreasing the brightness until it reaches the minimum;
b) If the hand form changes, the hand returns to the state of Figure 24(a), or the hand leaves the operating area, the system issues the brightness-change-stop instruction and sends sound feedback to the user; the floor lamp holds its current brightness;
c) If the control gesture of another device is made, the system first issues the brightness-change-stop instruction and then executes that device's gesture command; the floor lamp holds its current brightness, and the system sends the corresponding sound feedback;
Step c: when the floor lamp is on, rotate the hand until the palm faces up, as shown in Figure 25(c); the system issues the brightness-increase instruction and sends sound feedback to the user, and the floor lamp begins to brighten. At this point:
a) The rate of brightness change is determined by the hand's rotation angle: the larger the angle, the faster the change; the system keeps increasing the brightness until it reaches the maximum;
b) If the hand form changes, the hand returns to the state of Figure 25(a), or the hand leaves the operating area, the system issues the brightness-change-stop instruction and sends sound feedback to the user; the floor lamp holds its current brightness;
c) If the control gesture of another device is made, the system first issues the brightness-change-stop instruction and then executes that device's gesture command; the floor lamp holds its current brightness, and the system sends the corresponding sound feedback.
In the above procedure, step a can be skipped and step b or c executed directly; the system will directly issue the corresponding instruction and feedback.
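The rule that the rotation angle sets the dimming speed (subcase a) above) can be sketched as a simple rate law, clamped at the brightness limits. The gain constant and angle convention are assumed tuning parameters, not values from the patent:

```python
# Hypothetical angle-proportional brightness adjustment.
def adjust_brightness(brightness, angle_deg, dt, gain=0.01):
    """angle_deg < 0: palm rotated down (dim); angle_deg > 0: palm up
    (brighten). The rate of change is proportional to the rotation angle."""
    brightness += gain * angle_deg * dt
    return min(1.0, max(0.0, brightness))   # clamp to [min, max] brightness

b = 0.5
b = adjust_brightness(b, -30, dt=1.0)   # palm rotated 30 deg down for 1 s
print(round(b, 2))                      # 0.2
b = adjust_brightness(b, 60, dt=1.0)    # larger upward angle: faster change
print(round(b, 2))                      # 0.8
```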
Embodiment 6
This embodiment proposes a multimedia equipment control apparatus based on the gesture control method of the present invention. The multimedia equipment functions include power on, power off, play, pause, previous file, rewind within the current file, next file, fast-forward within the current file, and volume adjustment. The multimedia selection gesture is shown in Figure 26(a): the user extends the thumb, index finger and middle finger with the thumb pointing up to select the multimedia equipment as the controlled device.
(1) Powering the multimedia equipment on and off
Step a: extend the thumb, index finger and middle finger with the thumb pointing up, making the multimedia selection gesture; the system selects the multimedia equipment as the controlled device and sends sound feedback;
Step b: when the multimedia equipment is off, first lift the hand and forearm as shown in Figure 26(c), then rotate the hand until the palm faces up, as shown in Figure 26(f); the system issues the power-on instruction and sends sound feedback to the user, and the multimedia equipment turns on.
Step c: when the multimedia equipment is on, first lift the hand and forearm as shown in Figure 27(c), then rotate the hand until the palm faces down, as shown in Figure 27(f); the system issues the power-off instruction and sends sound feedback to the user, and the multimedia equipment turns off.
In the above procedure, step a can be skipped and step b or c executed directly; the system will directly issue the corresponding instruction and feedback.
(2) Play and pause of the multimedia device
Play and pause share the same gesture; the control toggles the device between the two states, as follows:
Step a: Extend the thumb, index finger, and middle finger with the thumb pointing upward to make the multimedia-device selection gesture, as shown in Figure 28(a); the system selects the multimedia device as the controlled device and sends sound feedback;
Step b: When the device is paused, first lift the hand and forearm as shown in Figure 28(c), then return the hand to its original position, as shown in Figure 28(f); the system then sends a multimedia-device play instruction and sound feedback to the user, and the multimedia device starts playing;
Step c: When the device is playing, first lift the hand and forearm as shown in Figure 28(c), then return the hand to its original position, as shown in Figure 28(f); the system then sends a multimedia-device pause instruction and sound feedback to the user, and the multimedia device pauses playback.
In the above procedure, step a may be skipped and step b or c executed directly; the system will issue the corresponding instruction and feedback immediately.
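Because play and pause share one gesture, the control reduces to a two-state toggle. A minimal sketch, with state names and instruction strings that are illustrative only:

```python
class MediaDevice:
    """Toggles between 'playing' and 'paused' on the shared gesture."""

    def __init__(self, state="paused"):
        self.state = state

    def on_lift_and_return_gesture(self):
        """One gesture toggles the state and returns the instruction issued."""
        if self.state == "paused":
            self.state = "playing"
            return "play_instruction"
        self.state = "paused"
        return "pause_instruction"

dev = MediaDevice()
assert dev.on_lift_and_return_gesture() == "play_instruction"
assert dev.on_lift_and_return_gesture() == "pause_instruction"
```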
(3) File switching and playing-progress adjustment of the multimedia device
Step a: Extend the thumb, index finger, and middle finger with the thumb pointing upward to make the multimedia-device selection gesture, as shown in Figure 29(a); the system selects the multimedia device as the controlled device and sends sound feedback;
Step b: When the media-playing device is on, swinging the hand to the left and returning it to its original position, as shown in Figure 29, causes the system to issue a switch-to-previous-file instruction and send sound feedback to the user; the multimedia device switches to the previous file;
Step c: When the media-playing device is on, swinging the hand to the right and returning it to its original position, as shown in Figure 31, causes the system to issue a switch-to-next-file instruction and send sound feedback to the user; the multimedia device switches to the next file;
Step d: When the media-playing device is on, swinging the hand to the left and holding it there, as shown in Figure 30(c), causes the system to issue a rewind instruction for the current file and send sound feedback to the user; the multimedia device moves the playing progress backward within the current file. At this point:
A) If the hand form changes, the hand returns to the state of Figure 30(a), or the hand leaves the operating space, the system issues a playing-progress-adjustment stop instruction and sends sound feedback to the user; the multimedia device keeps the current playing progress;
B) If the control gesture for another device is made, the system first issues a playing-progress-adjustment stop instruction and then executes the gesture control command for the other device; the multimedia device keeps the current playing progress, and the system sends the corresponding sound feedback;
Step e: When the media-playing device is on, swinging the hand to the right and holding it there, as shown in Figure 32(c), causes the system to issue a fast-forward instruction for the current file and send sound feedback to the user; the multimedia device moves the playing progress forward within the current file. At this point:
A) If the hand form changes, the hand returns to the state of Figure 32(a), or the hand leaves the operating space, the system issues a playing-progress-adjustment stop instruction and sends sound feedback to the user; the multimedia device keeps the current playing progress;
B) If the control gesture for another device is made, the system first issues a playing-progress-adjustment stop instruction and then executes the gesture control command for the other device; the multimedia device keeps the current playing progress, and the system sends the corresponding sound feedback.
In the above procedure, step a may be skipped and any of steps b to e executed directly; the system will issue the corresponding instruction and feedback immediately.
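Steps b to e above distinguish a swing-and-return from a swing-and-hold in each direction. A hypothetical dispatcher (the command names are illustrative, not from the patent):

```python
def media_command(direction, held):
    """Map a hand swing to a media instruction.

    direction: 'left' or 'right'.
    held: True if the hand stays at the end of the swing
          instead of returning to its original position.
    """
    table = {
        ("left", False): "previous_file",
        ("right", False): "next_file",
        ("left", True): "rewind_current_file",
        ("right", True): "fast_forward_current_file",
    }
    return table[(direction, held)]

assert media_command("left", False) == "previous_file"
assert media_command("right", True) == "fast_forward_current_file"
```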
(4) Volume adjustment of the multimedia device
Step a: Extend the thumb, index finger, and middle finger with the thumb pointing upward to make the multimedia-device selection gesture, as shown in Figure 33(a); the system selects the multimedia device as the controlled device and sends sound feedback;
Step b: When the multimedia device is on, rotating the hand so that the palm turns downward, as shown in Figure 33(c), causes the system to issue a volume-decrease instruction and send sound feedback to the user; the volume begins to decrease. At this point:
A) The volume-change speed is determined by the hand rotation angle: the larger the rotation angle, the faster the change. The system continues to decrease the volume until it reaches the minimum;
B) If the hand form changes, the hand returns to the state of Figure 33(a), or the hand leaves the operating space, the system issues a volume-adjustment stop instruction and sends sound feedback to the user; the multimedia device keeps the current volume;
C) If the control gesture for another device is made, the system first issues a volume-adjustment stop instruction and then executes the gesture control command for the other device; the multimedia device keeps the current volume, and the system sends the corresponding sound feedback;
Step c: When the multimedia device is on, rotating the hand so that the palm turns upward, as shown in Figure 34(c), causes the system to issue a volume-increase instruction and send sound feedback to the user; the volume begins to increase. At this point:
A) The volume-change speed is determined by the hand rotation angle: the larger the rotation angle, the faster the change. The system continues to increase the volume until it reaches the maximum;
B) If the hand form changes, the hand returns to the state of Figure 34(a), or the hand leaves the operating space, the system issues a volume-adjustment stop instruction and sends sound feedback to the user; the multimedia device keeps the current volume;
C) If the control gesture for another device is made, the system first issues a volume-adjustment stop instruction and then executes the gesture control command for the other device; the multimedia device keeps the current volume, and the system sends the corresponding sound feedback.
In the above procedure, step a may be skipped and step b or c executed directly; the system will issue the corresponding instruction and feedback immediately.
Although the present invention is disclosed above with preferred embodiments, they are not intended to limit the invention. Anyone skilled in the art may make various changes and modifications without departing from the spirit and scope of the present invention; therefore, the protection scope of the present invention shall be defined by the claims.

Claims (10)

1. A gestural control method based on hand form, posture, position and motion features, characterized in that the method comprises:
Step 1: reading the image data of an input image;
Step 2: extracting the hand features from the image data of Step 1 and obtaining hand-feature results; the hand features comprise a hand morphological feature, a hand posture feature, a hand position feature and a hand motion feature; the hand-feature results comprise a hand morphological feature result, a hand posture feature result, a hand position feature result and a hand motion feature result;
Step 3: fusing and analyzing the hand-feature data of Step 2 and performing gesture recognition to obtain a gesture recognition result;
Step 4: storing the gesture recognition result of Step 3;
Step 5: judging whether the gesture recognition result of Step 4 is a complete gesture; if it is a complete gesture, sending the gesture command corresponding to the gesture recognition result; if not, executing Steps 1 to 4 iteratively until a complete gesture is recognized.
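The iterate-until-complete loop of Steps 1 to 5 can be sketched as follows; the feature extractor and the completeness test are stand-ins for the concrete methods of the dependent claims:

```python
def recognize_gesture(frames, extract_features, is_complete):
    """Accumulate per-frame hand-feature results until they form a complete gesture.

    frames: iterable of image data; extract_features: frame -> feature result;
    is_complete: list of stored results -> bool.
    Returns the stored result history once a complete gesture is recognized,
    else None; the caller then sends the matching gesture command.
    """
    history = []
    for frame in frames:                         # Step 1: read image data
        history.append(extract_features(frame))  # Steps 2-4: extract and store
        if is_complete(history):                 # Step 5: complete gesture?
            return history
    return None

# Toy run: a gesture is deemed "complete" after three feature results.
result = recognize_gesture(range(10), lambda f: f, lambda h: len(h) == 3)
assert result == [0, 1, 2]
```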
2. The gestural control method according to claim 1, characterized in that the hand features are described in a world coordinate system and a hand coordinate system; the world coordinate system o-xyz is the absolute coordinate system of the whole system; in the hand coordinate system o'-x'y'z', with the palm stretched flat and open, the origin is the intersection of the line of the thumb and the line of the middle finger; the line of the middle finger is the x' axis, with the fingertip direction positive; the direction perpendicular to the palm is the y' axis, with the palm direction positive when the right hand is the control hand and the back-of-hand direction positive when the left hand is used; the z' axis is perpendicular to the plane formed by the x' and y' axes, with the thumb direction positive.
3. The gestural control method according to claim 1, characterized in that the hand morphological feature of Step 2 is extracted by an image segmentation method, the extraction process of which comprises:
Step 1, reading the image data of the input image;
Step 2, performing threshold segmentation using the gray-scale or color information of the image data of Step 1 to obtain the hand region;
Step 3, extracting the edge contour of the hand region of Step 2 to obtain a hand edge contour result;
Step 4, analyzing the hand edge contour result of Step 3, judging the number and directions of the fingers from the protrusions and recesses of the edge contour curve, and obtaining the hand morphological feature result;
Step 5, outputting the hand morphological feature result of Step 4.
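The protrusion-and-recess analysis of Step 4 can be illustrated with a simplified 1-D version: extended fingertips show up as peaks in the contour's distance-from-centroid profile. Real contour analysis (e.g. convex hulls and convexity defects) is more involved; this is only a toy sketch with made-up numbers:

```python
def count_fingers(radial_profile, threshold):
    """Count protrusions (finger candidates) in a contour's
    distance-from-centroid profile: each finger is a maximal run
    of samples above the palm-radius threshold."""
    fingers, inside = 0, False
    for r in radial_profile:
        if r > threshold and not inside:
            fingers, inside = fingers + 1, True
        elif r <= threshold:
            inside = False
    return fingers

# Three protrusions above the palm radius -> three extended fingers.
profile = [1, 1, 3, 3, 1, 4, 1, 1, 5, 5, 1]
assert count_fingers(profile, threshold=2) == 3
```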
4. The gestural control method according to claim 1, characterized in that the hand morphological feature of Step 2 is extracted by a template matching method, the extraction process of which comprises:
First step, reading the image data of the input image and loading preset template data;
Second step, performing full-image matching of the image, using a correlation filtering method for preset templates based on gray-scale or color features, and using a cascade classifier for preset templates based on HOG, Haar or LBP texture features, to obtain a matching result;
Third step, determining the hand morphological feature according to the matching result of the second step and outputting the hand morphological feature.
5. The gestural control method according to claim 1, characterized in that the extraction process of the hand posture feature of Step 2 comprises:
Step a, reading the hand morphological feature result and loading the hand morphological feature result into a preset three-dimensional hand model;
Step b, extracting feature points using the hand morphological feature result of step a;
Step c, matching the feature-point positions and pixel values of step b against the preset three-dimensional model to obtain a preset-three-dimensional-model matching result;
Step d, establishing the hand coordinate system according to the preset-three-dimensional-model matching result of step c;
Step e, solving the hand posture parameters from the hand coordinate system of step d using a PnP method, the hand posture parameters being the hand posture feature;
Step f, outputting the hand posture feature result.
6. The gestural control method according to claim 1, characterized in that the hand position feature of Step 2 is extracted using the hand posture feature result, the extraction process of the hand position feature comprising:
Step I, reading the hand posture feature result;
Step II, calculating the hand position parameters from the world coordinate system and the hand coordinate system using the hand posture feature result of Step I, the hand position parameters being the hand position feature;
Step III, outputting the hand position feature result of Step II.
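Step II's calculation amounts to applying the hand-to-world rigid transform recovered with the pose (rotation R and translation t). A minimal sketch; the pose values are made up for illustration:

```python
def hand_to_world(point_hand, R, t):
    """Transform a point from hand coordinates o'-x'y'z' to world
    coordinates o-xyz: p_world = R @ p_hand + t (written out by hand)."""
    return tuple(
        sum(R[i][j] * point_hand[j] for j in range(3)) + t[i]
        for i in range(3)
    )

# With the identity rotation, the hand origin's world position equals the
# translation t, which is exactly the hand position parameter.
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
t = (0.3, 0.1, 0.8)
assert hand_to_world((0, 0, 0), R, t) == (0.3, 0.1, 0.8)
```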
7. The gestural control method according to claim 1, characterized in that the hand position feature of Step 2 is extracted using the hand morphological feature result, the extraction process of the hand position feature comprising:
Step A, reading the hand morphological feature result;
Step B, calculating the hand position parameters from the hand centroid and the world coordinate system using the hand morphological feature result of Step A, the hand position parameters being the hand position feature;
Step C, outputting the hand position feature result of Step B.
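Step B's hand centroid ("centre of form") over a binary hand mask can be sketched as:

```python
def hand_centroid(mask):
    """Centroid (row, col) of the foreground pixels of a binary hand mask;
    returns None for an empty mask."""
    pts = [(r, c) for r, row in enumerate(mask)
           for c, v in enumerate(row) if v]
    if not pts:
        return None
    n = len(pts)
    return (sum(r for r, _ in pts) / n, sum(c for _, c in pts) / n)

mask = [[0, 1, 1],
        [0, 1, 1],
        [0, 0, 0]]
assert hand_centroid(mask) == (0.5, 1.5)
```

The pixel centroid would then be mapped into the world coordinate system using the camera calibration to yield the position parameters.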
8. The gestural control method according to claim 1, characterized in that the extraction process of the hand motion feature of Step 2 comprises:
1st step, reading the hand morphological feature results, hand posture feature results and hand position feature results of the preceding images and the current image;
2nd step, performing difference analysis on the hand morphological feature results, hand posture feature results and hand position feature results of the 1st step to obtain the hand motion; the hand motion is the hand motion feature;
3rd step, outputting the hand motion feature result;
4th step, storing the hand motion feature result of the 3rd step for use by subsequent extraction processes.
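For the position feature, the difference analysis of the 2nd step reduces to a finite difference across frames; a sketch, with the frame interval as an assumed parameter:

```python
def motion_feature(prev_pos, curr_pos, dt):
    """Velocity vector between the hand position results of two frames,
    i.e. (current - previous) / frame interval, component-wise."""
    return tuple((c - p) / dt for p, c in zip(prev_pos, curr_pos))

# Hand moved 6 cm along x between frames 0.2 s apart -> 30 cm/s along x.
assert motion_feature((0.0, 0.0, 0.0), (6.0, 0.0, 0.0), 0.2) == (30.0, 0.0, 0.0)
```

Analogous differences of the posture and morphological results would capture rotation and shape-change rates.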
9. The gestural control method according to claim 1, characterized in that the hand features of Step 2 are extracted by a depth-image feature extraction method, the extraction process being:
Step1: reading the image data of the input image and loading a preset three-dimensional skeleton model;
Step2: segmenting the hand region in the input image using depth information;
Step3: extracting hand feature points in the hand region;
Step4: performing model matching using the hand feature points of Step3 to obtain a model matching result;
Step5: establishing the hand coordinate system;
Step6: extracting the hand morphological feature, hand posture feature and hand position feature according to the model matching result of Step4 and the hand coordinate system of Step5;
Step7: calculating the hand motion feature from the difference between the preceding and following frame images of the input.
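Step2's depth-based segmentation can be sketched as a simple near-range threshold, assuming the hand is the object closest to the camera; the depth values and band limits below are illustrative:

```python
def segment_hand(depth_map, near, far):
    """Binary hand mask: pixels whose depth (in metres) falls inside the
    [near, far] band assumed to contain the hand in front of the body."""
    return [[1 if near <= d <= far else 0 for d in row]
            for row in depth_map]

depth = [[0.4, 0.5, 2.0],
         [0.5, 0.6, 2.1],
         [2.2, 2.2, 2.3]]
assert segment_hand(depth, 0.3, 1.0) == [[1, 1, 0],
                                         [1, 1, 0],
                                         [0, 0, 0]]
```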
10. A gestural control system for realizing the gestural control method according to claim 1, characterized in that the control system comprises a master controller, a data acquisition module, a data processing module, an instruction output module and an operation feedback module; the data-acquisition signal control interaction end of the master controller is connected with the data control interaction end of the data acquisition module; the data-processing signal control interaction end of the master controller is connected with the data control interaction end of the data processing module; the instruction output signal control interaction end of the master controller is connected with the data control interaction end of the instruction output module; the operation feedback signal control interaction end of the master controller is connected with the data control interaction end of the operation feedback module; the data output end of the data acquisition module is connected with the data input end of the data processing module; the data output end of the data processing module is connected with the data input end of the instruction output module.
CN201810392063.4A 2018-04-27 2018-04-27 gesture control method and system based on hand shape, posture, position and motion characteristics Active CN108549489B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810392063.4A CN108549489B (en) 2018-04-27 2018-04-27 gesture control method and system based on hand shape, posture, position and motion characteristics


Publications (2)

Publication Number Publication Date
CN108549489A true CN108549489A (en) 2018-09-18
CN108549489B CN108549489B (en) 2019-12-13

Family

ID=63512754

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810392063.4A Active CN108549489B (en) 2018-04-27 2018-04-27 gesture control method and system based on hand shape, posture, position and motion characteristics

Country Status (1)

Country Link
CN (1) CN108549489B (en)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109634415A (en) * 2018-12-11 2019-04-16 哈尔滨拓博科技有限公司 It is a kind of for controlling the gesture identification control method of analog quantity
CN109839827A (en) * 2018-12-26 2019-06-04 哈尔滨拓博科技有限公司 A kind of gesture identification intelligent home control system based on total space location information
CN109933194A (en) * 2019-03-05 2019-06-25 郑州万特电气股份有限公司 To the exchange method of virtual target object in a kind of mixed reality environment
CN110347266A (en) * 2019-07-23 2019-10-18 哈尔滨拓博科技有限公司 A kind of space gesture control device based on machine vision
CN110866450A (en) * 2019-10-21 2020-03-06 桂林医学院附属医院 Parkinson disease monitoring method and device and storage medium
CN110929616A (en) * 2019-11-14 2020-03-27 北京达佳互联信息技术有限公司 Human hand recognition method and device, electronic equipment and storage medium
CN111119645A (en) * 2019-12-10 2020-05-08 斑马网络技术有限公司 Vehicle window control method, device, equipment and computer readable storage medium
CN111228792A (en) * 2020-01-14 2020-06-05 深圳十米网络科技有限公司 Motion sensing game action recognition method and device, computer equipment and storage medium
CN111290585A (en) * 2020-03-31 2020-06-16 哈尔滨拓博科技有限公司 Intelligent space gesture control system based on display equipment
CN111381676A (en) * 2020-03-17 2020-07-07 哈尔滨拓博科技有限公司 TOF sensor and monocular camera fusion gesture recognition device and gesture recognition method
CN111752378A (en) * 2019-03-29 2020-10-09 福建天泉教育科技有限公司 Gesture instruction execution method and storage medium
CN112445326A (en) * 2019-09-03 2021-03-05 浙江舜宇智能光学技术有限公司 Projection interaction method based on TOF camera, system thereof and electronic equipment
WO2021098543A1 (en) * 2019-11-20 2021-05-27 Oppo广东移动通信有限公司 Gesture recognition method and apparatus, and storage medium
WO2021130548A1 (en) * 2019-12-23 2021-07-01 Sensetime International Pte. Ltd. Gesture recognition method and apparatus, electronic device, and storage medium
CN114701409A (en) * 2022-04-28 2022-07-05 东风汽车集团股份有限公司 Gesture interactive intelligent seat adjusting method and system
WO2024103994A1 (en) * 2022-11-16 2024-05-23 广州视琨电子科技有限公司 Method and apparatus for posture-based control of display device, device, and circuit board

Citations (12)

Publication number Priority date Publication date Assignee Title
WO2003071410A2 (en) * 2002-02-15 2003-08-28 Canesta, Inc. Gesture recognition system using depth perceptive sensors
CN102426480A (en) * 2011-11-03 2012-04-25 康佳集团股份有限公司 Man-machine interactive system and real-time gesture tracking processing method for same
US20120293408A1 (en) * 2004-04-15 2012-11-22 Qualcomm Incorporated Tracking bimanual movements
CN102999152A (en) * 2011-09-09 2013-03-27 康佳集团股份有限公司 Method and system for gesture recognition
CN103440033A (en) * 2013-08-19 2013-12-11 中国科学院深圳先进技术研究院 Method and device for achieving man-machine interaction based on bare hand and monocular camera
CN103530613A (en) * 2013-10-15 2014-01-22 无锡易视腾科技有限公司 Target person hand gesture interaction method based on monocular video sequence
CN106055091A (en) * 2016-05-16 2016-10-26 电子科技大学 Hand posture estimation method based on depth information and calibration method
CN106155299A (en) * 2015-04-23 2016-11-23 青岛海信电器股份有限公司 A kind of method and device that smart machine is carried out gesture control
CN106934334A (en) * 2015-12-31 2017-07-07 芋头科技(杭州)有限公司 A kind of gesture motion recognition methods and smart machine
CN107066935A (en) * 2017-01-25 2017-08-18 网易(杭州)网络有限公司 Hand gestures method of estimation and device based on deep learning
CN107578023A (en) * 2017-09-13 2018-01-12 华中师范大学 Man-machine interaction gesture identification method, apparatus and system
US20180075659A1 (en) * 2016-09-13 2018-03-15 Magic Leap, Inc. Sensory eyewear



Also Published As

Publication number Publication date
CN108549489B (en) 2019-12-13


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant