CN103376890A - Gesture remote control system based on vision - Google Patents


Info

Publication number: CN103376890A
Application number: CN201210121832.XA (authority: CN, China)
Granted publication: CN103376890B
Original language: Chinese (zh)
Prior art keywords: hand, image, scope, gesture, candidate
Legal status: Granted; Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Inventors: 王琪, 范伟, 谭志明
Assignee (original and current): Fujitsu Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis)


Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a vision-based gesture remote control system. The system comprises an image capture device for capturing a series of images of a subject; a gesture recognition device for recognizing the subject's gesture from the series of images captured by the image capture device and sending the recognition result to an operation command trigger device; and the operation command trigger device for triggering a predetermined operation command according to the recognition result sent by the gesture recognition device. The gesture recognition device comprises a hand detection component for detecting the subject's hand in the images captured by the image capture device; a hand tracking component for tracking the subject's hand in subsequent images once the hand detection component has detected it in one image; and a gesture recognition component for determining the motion of the subject's hand from the hand detected by the hand detection component and the hand tracked by the hand tracking component, and recognizing the subject's gesture from the determined motion.

Description

Gesture remote control system based on vision
Technical field
The present invention relates to the fields of image processing, pattern recognition, and object tracking, and more particularly to a vision-based gesture remote control system.
Background technology
As computers and numerous portable smart devices become ever more indispensable in daily life, people wish for more natural and more efficient interaction between humans and computers. However, traditional peripherals such as the mouse and keyboard, the remote control, and even the touch screen make human-computer interaction (HCI) inconvenient for the user in some situations (for example, in the bathroom or kitchen, or while driving), because what is needed in these settings is touch-free HCI. Therefore, in recent years, gesture remote control systems have attracted increasing attention as one of the potential solutions.
Basically, a gesture remote control system tracks the hand and analyzes meaningful hand expressions; if a gesture is recognized as one of the predefined gestures, the corresponding operation command is triggered to carry out a predetermined operation. Because gesture recognition is very complicated in many situations, many different tools have been applied to this problem, such as Hidden Markov Models (HMM), particle filters, finite state machines (FSM), and neural networks. Most gesture recognition systems demand high computational complexity; in addition, some of them have restrictions, for example requiring extra equipment (such as wearing gloves) or precise instruments (such as an infrared camera for collecting depth information), or operating only in a well-lit environment with a simple background (being unable to distinguish the hand from objects of a skin-like color, or recognizing only static gestures, and so on).
Therefore, a gesture recognition system is needed that has the low computational complexity required for real-time remote control and that works well in complex environments.
Summary of the invention
According to one aspect of the present invention, a vision-based gesture remote control system comprises: an image capture device for capturing a series of images of a subject; a gesture recognition device for recognizing the subject's gesture from the series of images captured by the image capture device and sending the recognition result to an operation command trigger device; and the operation command trigger device for triggering a predetermined operation command according to the recognition result sent from the gesture recognition device. The gesture recognition device comprises: a hand detection component for detecting the subject's hand in the images captured by the image capture device; a hand tracking component for tracking the subject's hand in subsequent images once the hand detection component has detected it in one image; and a gesture recognition component for determining the motion of the subject's hand from the hand detected by the hand detection component and the hand tracked by the hand tracking component, and recognizing the subject's gesture from the determined motion.
In one embodiment, the hand detection component converts the image captured by the image capture device into a grayscale image and detects the subject's hand in this grayscale image using a cascade classifier based on local binary patterns (LBP).
In one embodiment, the hand tracking component tracks the subject's hand as follows: using the range of the hand detected or tracked in the previous image, together with the difference image between the skin-color grayscale image of the current image and that of the previous image, it initially defines a search range for tracking the hand in the skin-color grayscale image of the current image, and then performs a template matching method to determine the range of the hand tracked in the current image, where the template matching method comprises:

defining a plurality of first candidate hand ranges within the search range, each having the same size as a target template, and defining a second candidate hand range of the same size in the difference image, where the target template is the range of the hand detected or tracked in the previous image;

looping over the plurality of first candidate hand ranges until all of them have passed through the following matching judgment, thereby determining the candidate hand range that best matches the target template as the range of the hand tracked in the skin-color grayscale image of the current image:

calculating the mean absolute difference between the pixels of a first candidate hand range and those of the target template as a first error;

if the first error is greater than a first predetermined threshold, the candidate hand range does not match the target template and is excluded;

if the first error is less than the first predetermined threshold, calculating a second error, obtained by subtracting from the first error the mean pixel value of the second candidate hand range and multiplying the result by a predetermined adjustment factor;

if the second error is less than a second predetermined threshold, a match is determined, i.e. this first candidate hand range is taken as the range of the hand tracked in the skin-color grayscale image of the current image, and the value of the second error is used as the second threshold when the next first candidate hand range is judged.
In one embodiment, after the search range has been determined in the skin-color grayscale image of the current image and before the template matching method is performed, the initially defined search range is revised with reference to the difference image, and the plurality of candidate hand ranges are defined within the reduced search range before performing the template matching method to determine the range of the hand in the current image. The revision comprises: shrinking each side of the initially defined search range gradually inward, and stopping a side when it encounters a pixel whose value is greater than a predetermined threshold.
In one embodiment, in the template matching method, whenever a first candidate hand range is determined to match the target template, the match is verified. The verification comprises: calculating the mean absolute difference between the pixels of the first candidate hand range currently identified as the best match and those of the range of the hand detected by the hand detection component, as a third error; and judging whether the third error is greater than a third predetermined threshold. If it is, the first candidate hand range currently identified as the best match is judged not to be a true match and is excluded.
In one embodiment, the skin-color grayscale image used in the present invention can be obtained as follows: the average of the G component and the B component of the RGB components of each pixel of the captured image is subtracted from the value of the R component to obtain a difference; the difference is then compared with a predetermined threshold, and if the difference is less than zero or greater than the predetermined threshold, the corresponding pixel of the skin-color grayscale image takes the value 0; otherwise it takes the difference.
In one embodiment, the gesture recognition component determines the motion of the subject's hand according to the position of the hand detected or tracked by the hand detection component and the hand tracking component in every image frame, the hand position orientation of every two adjacent frames calculated from the hand positions of each frame, and the displacement direction between two adjacent frames calculated from two adjacent hand position orientations; it then recognizes the subject's gesture by comparing the determined hand motion with a predefined gesture-to-hand-motion-trajectory mapping table.
According to a further aspect of the present invention, a vision-based gesture remote control method comprises: capturing a series of images of a subject; recognizing the subject's gesture from the captured series of images; and triggering a predetermined operation command according to the recognition result. Recognizing the subject's gesture comprises: detecting the subject's hand in the captured images; once the hand has been detected in one image, tracking it in subsequent images; and determining the motion of the subject's hand from the detected hand and the tracked hand, and recognizing the subject's gesture from the determined motion.
According to another aspect of the present invention, a method for converting an RGB image into a skin-color grayscale image comprises: subtracting the average of the G component and the B component of the RGB components of each pixel of the RGB image from the value of the R component to obtain a difference; and comparing the difference with a predetermined threshold, such that if the difference is less than zero or greater than the predetermined threshold, the corresponding pixel of the skin-color grayscale image takes the value 0, and otherwise it takes the difference.
According to another aspect of the present invention, a method for tracking a target in an image sequence comprises: using the range of the target detected or tracked in the previous image, together with the difference image between the skin-color grayscale image of the current image and that of the previous image, to initially define a search range for tracking the target in the skin-color grayscale image of the current image; and performing a template matching method to determine the range of the target tracked in the current image, where the template matching method comprises:

defining a plurality of first candidate target ranges within the search range, each having the same size as a target template, and defining a second candidate target range of the same size in the difference image, where the target template is the range of the target detected or tracked in the previous image;

looping over the plurality of first candidate target ranges until all of them have passed through the following matching judgment, thereby determining the candidate target range that best matches the target template as the range of the target tracked in the skin-color grayscale image of the current image:

calculating the mean absolute difference between the pixels of a first candidate target range and those of the target template as a first error;

if the first error is greater than a first predetermined threshold, the candidate target range does not match the target template and is excluded;

if the first error is less than the first predetermined threshold, calculating a second error, obtained by subtracting from the first error the mean pixel value of the second candidate target range and multiplying the result by a predetermined adjustment factor;

if the second error is less than a second predetermined threshold, a match is determined, i.e. this first candidate target range is taken as the range of the target tracked in the skin-color grayscale image of the current image, and the value of the second error is used as the second threshold when the next first candidate target range is judged.
With the vision-based gesture recognition system of the present invention, even when the hand moves across objects of a skin-like color, the hand can be detected simply, accurately, and efficiently from the continuously captured sequence of image frames, its motion can be tracked, and the gesture can be recognized.
Description of drawings
Fig. 1 shows the schematic structure of the gesture control system of the present invention;
Fig. 2 shows the flow chart of the gesture recognition process performed by the gesture recognition device of the present invention;
Fig. 3 shows an example of a captured RGB image and the converted grayscale image;
Fig. 4 shows an example of a hand successfully detected in the converted grayscale image, where the range of the hand is indicated by a rectangular frame;
Fig. 5 shows an example of the skin-color grayscale image used in the gesture recognition process of the present invention;
Fig. 6 shows the difference image between the skin-color grayscale images of two consecutive images in which the hand moves to the right;
Fig. 7(a), (b), and (c) show an example of the processing in the gesture recognition process of the present invention that defines the search range for tracking the hand in the skin-color grayscale image of the current image;
Fig. 8 shows an example of the result after the initially defined search range has been revised;
Fig. 9 shows an example of tracking a hand crossing the face using the improved template matching algorithm in the gesture recognition process of the present invention; and
Fig. 10 shows an example of the hand motion directions defined by the hand position orientation and the change of hand position orientation calculated in the gesture recognition device of the present invention.
Embodiment
Hereinafter, preferred embodiments of the present invention are described in detail with reference to the accompanying drawings. Note that in this specification and the drawings, structural elements having substantially the same function and structure are denoted by the same reference numerals, and repeated description of these structural elements is omitted.
The present invention provides an efficient and robust vision-based gesture remote control system that works with a common web camera, performs reliably in complex environments, and at the same time requires only a small amount of computation.
Fig. 1 shows a diagram of the schematic structure of the gesture control system of the present invention. As shown in Fig. 1, the vision-based gesture remote control system of the present invention mainly comprises three parts: an image capture device 101 for capturing a series of images of a subject; a gesture recognition device 102 for recognizing the subject's gesture from the series of images captured by the image capture device 101 and sending the recognition result to an operation command trigger device 103; and the operation command trigger device 103 for triggering a predetermined operation command according to the recognition result sent from the gesture recognition device 102.
In this specification, a web camera is used as the image capture device, but the present invention is not limited to this; any known or future capture device capable of capturing video images can be used as the image capture device. In the present invention, the gesture recognition device is the core of the gesture control system: it recognizes meaningful gestures and compares the recognized gesture motion with the predefined gestures, and if the recognized gesture motion matches one of the predefined gestures, the recognition result is sent to the operation command trigger device to trigger the predetermined operation command. The gesture recognition process of the present invention is described in detail below.
Before describing the gesture recognition process of the present invention, the predefined gestures supported by this process are given first, as shown in Table 1. As can be seen from Table 1, in contrast to methods limited to static gestures, the gesture recognition method of the present invention also supports dynamic gestures. For example, for "short stay" the hand stays still for a very short time (for example, less than 1 second), while for "stop" the hand stays still for more than 1 second; "1 second" here is only an example, and this value can be set arbitrarily by the user. Furthermore, when the hand moves left (right, up, down), the hand motion trajectory is a movement to the left (right, up, down) after a short stay, and when the hand waves, the hand motion trajectory is a right-left-right-left movement (that is, →, ←, →, ←). In addition, when the hand rotates clockwise, the hand motion trajectory is a clockwise circle [figure: clockwise circular trajectory], and when the hand rotates counterclockwise, the hand motion trajectory is a counterclockwise circle [figure: counterclockwise circular trajectory]. Here, the hand motion trajectory can be a complete circle or more than half a circle. In addition, if the user does not make any meaningful gesture, an arbitrary hand motion trajectory is recorded. Although the gesture definitions above are given here, the present invention is not limited to them; the user can define gestures arbitrarily as needed, so the gestures available for recognition can be more natural.

[Table 1 figure]

Table 1: gesture definitions
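Since Table 1 survives here only as a figure, the gesture-to-trajectory mapping described in the surrounding text can be sketched as a simple lookup. The direction encoding and the gesture names below are assumptions reconstructed from the description, not the patent's actual table:

```python
# Sketch of a gesture-trajectory mapping table, assuming the trajectories
# described in the text: each gesture is keyed by a tuple of coarse motion
# directions (R = right, L = left, U = up, D = down) observed after a short
# stay. The gesture names and direction encoding are illustrative.
GESTURE_TABLE = {
    ("L",): "move left",
    ("R",): "move right",
    ("U",): "move up",
    ("D",): "move down",
    ("R", "L", "R", "L"): "wave",
}

def classify(trajectory):
    """Return the gesture whose predefined trajectory matches, else None."""
    return GESTURE_TABLE.get(tuple(trajectory))
```

In a real implementation the rotation gestures would additionally need a circle-fitting or orientation-accumulation test, since a circle is not a fixed direction sequence.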
Next, the detailed flow of the gesture recognition process of the present invention will be described. The gesture recognition process detects the subject's hand in the captured images, tracks the motion of the hand, and then recognizes what kind of meaningful gesture command is expressed according to the hand motion trajectory and the predefined gestures; the detailed flow is shown in Fig. 2.
Fig. 2 shows the flow chart of the gesture recognition process performed by the gesture recognition device of the present invention. As shown in Fig. 2, the gesture recognition process of the present invention mainly consists of three processing stages: detection, tracking, and recognition. Specifically, the following process is performed for every frame in the series of images of the subject captured by the image capture device. First, at S201, the image of the subject captured by the image capture device is normalized, for example to 160 pixels x 120 pixels; of course, the present invention is not limited to this, and the normalized size can be set arbitrarily by the user. At S202, it is determined from the records of hand detection and hand tracking whether a hand was detected or tracked in the previous frame; if no hand was detected in the previous frame, the hand detection processing is performed, and if a hand was tracked in the previous frame, the hand tracking processing is performed. For example, in one embodiment, a flag isDetect is set to indicate whether a hand was detected: when a hand is detected, isDetect is assigned 1, and when no hand is detected, isDetect is assigned 0. Likewise, a flag isTrack is set to indicate whether a hand was tracked: when a hand is tracked, isTrack is assigned 1, and when no hand is tracked, isTrack is assigned 0. In this embodiment, whether a hand was detected and whether a hand was tracked in the previous frame can therefore be determined by checking the values of the flags isDetect and isTrack.
If no hand was detected and no hand was tracked in the previous frame, i.e. "No" at S202, the hand detection processing is performed. Specifically, as shown in Fig. 2, at S203 the current image is converted from an RGB image into a grayscale image; of course, if the captured image is already a grayscale image, this step is omitted. Fig. 3 shows an example of a captured RGB image and the converted grayscale image. At S204, a pre-trained cascade classifier based on LBP (local binary patterns) is used to detect the hand over the full range of the converted grayscale image. Fig. 4 shows an example of a hand successfully detected in the converted grayscale image, where the range of the hand is indicated by a rectangular frame. As described above, if a hand is detected, isDetect is assigned 1 at S206, and if no hand is detected, isDetect is assigned 0 at S207; these records are saved for use when the next frame is processed.
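The LBP features consumed by such a cascade classifier can be illustrated with a minimal sketch. This is the textbook 8-neighbor LBP code, not the patent's trained classifier, and the clockwise-from-top-left bit ordering is an assumed convention:

```python
import numpy as np

def lbp_code(patch):
    """Compute the basic 8-neighbor LBP code of the center pixel of a 3x3 patch.

    Each neighbor >= center contributes a 1 bit; bits are read clockwise
    from the top-left neighbor (an arbitrary but common convention).
    """
    c = patch[1, 1]
    # clockwise from top-left: TL, T, TR, R, BR, B, BL, L
    neighbors = [patch[0, 0], patch[0, 1], patch[0, 2], patch[1, 2],
                 patch[2, 2], patch[2, 1], patch[2, 0], patch[1, 0]]
    code = 0
    for bit, n in enumerate(neighbors):
        if n >= c:
            code |= 1 << (7 - bit)  # first neighbor is the most significant bit
    return code
```

A cascade classifier then compares histograms of such codes over sliding windows, which is why LBP cascades are fast enough for per-frame detection on modest hardware.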
If no hand is detected in the converted grayscale image, the gesture recognition method of the present invention returns to S201 to process the next frame. On the other hand, if a hand is successfully detected in the converted grayscale image, the current RGB image is converted into a skin-color grayscale image at S208; this facilitates separating the skin-color region from the background, thereby reducing the influence of the background when hand tracking is performed on the next frame. More specifically, the method used in the present invention to convert an RGB image into a skin-color grayscale image is as follows. Suppose the value of a pixel in the skin-color grayscale image is denoted by s, the values of the R, G, and B components of the corresponding pixel in the RGB image are denoted by r, g, and b respectively, and a temporary intermediate variable Temp is introduced; then the value of s is defined by the following formulas:
Temp = r - (g + b) / 2;
Temp = MAX(0, Temp);
s = Temp > 140 ? 0 : Temp;
That is: first, the average of the G component and the B component of the RGB components of each pixel of the RGB image is subtracted from the value of the R component to obtain a difference, and negative differences are clamped to zero; then the difference is compared with a predetermined threshold, and if the difference is greater than the predetermined threshold the corresponding pixel of the skin-color grayscale image is set to 0, otherwise the pixel takes the difference. For example, Fig. 5 shows the skin-color grayscale image obtained by converting a captured RGB image with the predetermined threshold 140, where the rectangular frame indicates the detected hand; as described above, the skin-color region is separated from the background. Although the above method of obtaining the skin-color grayscale image from the RGB image is described here, the user can also adopt other skin-color separation methods to obtain the skin-color grayscale image as needed. In addition, the skin-color grayscale image of the current image is saved for calculating the difference image when hand tracking is performed on the next image, and the parameters representing the range of the detected hand are also saved for template matching and error evaluation when hand tracking is performed on the next image.
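The three formulas above can be sketched with NumPy; the vectorization and function name are mine, but the arithmetic follows the pseudo-code as given:

```python
import numpy as np

def to_skin_gray(rgb, threshold=140):
    """Convert an H x W x 3 RGB image to the skin-color grayscale image.

    s = r - (g + b)/2, clamped below at 0; values above `threshold`
    are suppressed to 0, following the Temp/s formulas in the text.
    """
    rgb = rgb.astype(np.int32)              # avoid uint8 wraparound
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    temp = r - (g + b) // 2                 # redness relative to green/blue
    temp = np.maximum(temp, 0)              # clamp negative differences
    return np.where(temp > threshold, 0, temp)
```

A typical skin pixel such as (180, 120, 100) yields a moderate positive value, neutral backgrounds stay near zero, and extremely saturated reds above the threshold are suppressed.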
If a hand was detected or tracked in the previous image, i.e. "Yes" at S202, the hand tracking processing is performed. The hand tracking processing of the present invention adopts an improved template matching method, which computes quickly and can tolerate some variation in the hand's shape while the hand is in motion. Specifically, at S209, the current image is converted from an RGB image into a skin-color grayscale image by the conversion method described above. At S210, the difference image between the skin-color grayscale image of the current image and the saved skin-color grayscale image of the previous image is calculated. In the calculation of the difference image, only differences along the direction of hand motion are retained, and values in the direction opposite to the hand motion are discarded. Specifically, first the value of each pixel of the skin-color grayscale image of the previous image is subtracted from the value of the corresponding pixel of the skin-color grayscale image of the current image to obtain a difference. Then the difference is compared with 0: if the difference is greater than 0, the corresponding pixel of the difference image takes the difference, and if the difference is less than 0, the corresponding pixel of the difference image takes 0. Fig. 6 shows the difference image between the skin-color grayscale images of two consecutive images in which the hand moves to the right.
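The one-sided difference described above amounts to a clipped subtraction; a minimal NumPy sketch (names are mine):

```python
import numpy as np

def motion_difference(curr_skin, prev_skin):
    """Difference image keeping only positive changes (the motion direction).

    Pixels where the current skin-color image is brighter than the previous
    one keep the difference; everything else is set to 0, as in the text.
    """
    diff = curr_skin.astype(np.int32) - prev_skin.astype(np.int32)
    return np.maximum(diff, 0)
```

Keeping only the positive side means the bright region in the difference image marks where the hand has just arrived, which is exactly the region of interest for the search-range revision below.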
Then, at S211, the hand is tracked in the skin-color grayscale image of the current image according to the range of the hand detected or tracked in the previous image and the difference image between the skin-color grayscale images of the current and previous images. Specifically, first, the range of the hand detected or tracked in the previous image is used to initially define the search range for tracking the hand in the skin-color grayscale image of the current image. For example, with the range of the hand given by a rectangular frame as in the example shown, the search range can be defined according to the following formulas:
Range.x = MAX(0, Target.x - 15);
Range.y = MAX(0, Target.y - 15);
Range.width = MIN(a - Range.x, Target.width + 30);
Range.height = MIN(b - Range.y, Target.height + 30),
where Target.x and Target.y denote the abscissa and ordinate of the top-left vertex of the rectangular frame of the hand detected or tracked in the previous image; Target.width and Target.height denote the width and height of that rectangular frame; Range.x and Range.y denote the abscissa and ordinate of the top-left vertex of the search range in the skin-color grayscale image of the current image; Range.width and Range.height denote the width and height of that search range; a and b denote the horizontal and vertical pixel counts of the image, respectively; and the values "15" and "30" are set empirically and may, depending on circumstances, be other values that keep the initially defined search range as close as possible to the hand range to be tracked.
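The four Range.* formulas translate directly into code; the tuple representation and function name below are mine, the 15-pixel margin per side is from the text:

```python
def initial_search_range(target, image_w, image_h, margin=15):
    """Expand the previous hand rectangle by `margin` on each side,
    clamped to the image borders, following the Range.* formulas.

    `target` is (x, y, width, height) of the hand in the previous image.
    """
    tx, ty, tw, th = target
    x = max(0, tx - margin)
    y = max(0, ty - margin)
    w = min(image_w - x, tw + 2 * margin)   # 2 * 15 = the "+30" in the text
    h = min(image_h - y, th + 2 * margin)
    return (x, y, w, h)
```

With the 160 x 120 normalized frames used earlier, a hand box near the image corner is simply clipped rather than allowed to extend outside the frame.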
To reduce the amount of computation in the template matching method performed in the next step and to improve accuracy, after the search range has been initially defined as described above, the difference image between the skin-color grayscale images of the current and previous images is used to revise the initially defined search range, i.e. the region of interest in the difference image is taken as the revised search range. Fig. 7 shows (a) the range of the hand detected or tracked in the previous image, (b) the difference image between the skin-color grayscale images of the current and previous images, and (c) the search range defined in the skin-color grayscale image of the current image, where the label in the upper-left corner of each figure indicates the frame number of the image by way of example. Furthermore, Fig. 8 shows a diagram of the search range being reduced as an example of the revision: the four sides of the initially defined search range are gradually shrunk inward, and when any side encounters a pixel whose value is greater than a predetermined threshold (for example 5), that side stops shrinking.
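The side-shrinking revision can be sketched as follows. It assumes each side stops as soon as its entire edge row or column contains a pixel above the threshold; that per-edge stopping condition is my reading of the text, and the names are illustrative:

```python
import numpy as np

def shrink_to_motion(diff, rect, threshold=5):
    """Shrink each side of `rect` inward over the difference image until the
    edge touches a pixel brighter than `threshold`, as illustrated in Fig. 8.

    `rect` is (x, y, width, height); returns the revised rectangle.
    """
    x, y, w, h = rect
    x0, y0, x1, y1 = x, y, x + w, y + h
    while x0 < x1 and np.all(diff[y0:y1, x0] <= threshold):
        x0 += 1                             # left side moves right
    while x1 > x0 and np.all(diff[y0:y1, x1 - 1] <= threshold):
        x1 -= 1                             # right side moves left
    while y0 < y1 and np.all(diff[y0, x0:x1] <= threshold):
        y0 += 1                             # top side moves down
    while y1 > y0 and np.all(diff[y1 - 1, x0:x1] <= threshold):
        y1 -= 1                             # bottom side moves up
    return (x0, y0, x1 - x0, y1 - y0)
```

In effect this computes a thresholded bounding box of the motion region inside the initial search range, so the template matcher only scans where something actually moved.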
After the search range has been defined in the skin-color grayscale image of the current image, the template matching method is performed at S212 to determine the range of the hand tracked in the current image. Here, template matching means taking the range of the hand detected or tracked in the previous image as the target template and comparing candidate hand ranges with this template; if the error is less than a certain value, a match is determined. To reduce tracking errors when the hand crosses objects of skin color (for example, the face or the other hand), the above template matching method takes into account motion information from the difference image, specifically:
A plurality of first candidate hand ranges are defined in the search range, each having the same size as the target template; in addition, a second candidate hand range having the same size as the target template is defined in the difference image;
The following matching judgment process is performed in a loop over the plurality of first candidate hand ranges until all of them have been processed, so as to determine the candidate hand range that best matches the target template as the range of the hand tracked in the skin-color image of the current image: the mean of the absolute differences between corresponding pixels of a first candidate hand range and the target template is calculated as the first error; if the first error is greater than a first predetermined threshold, the candidate hand range does not match the target template and is excluded; if the first error is less than the first predetermined threshold, a second error is calculated, the second error being the value obtained by subtracting from the first error the mean pixel value of the second candidate hand range multiplied by a predetermined adjustment coefficient; if the second error is less than a second predetermined threshold, a match is determined, that is, the first candidate hand range is determined to be the range of the hand tracked in the skin-color image of the current image, and the value of the second error is used as the second threshold when the matching judgment is performed on the next first candidate hand range.
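The matching judgment loop above can be sketched as follows. This is an illustrative Python/NumPy sketch: the threshold values t1 and t2 and the adjustment coefficient alpha are placeholders, since the patent leaves the predetermined thresholds and coefficient unspecified.

```python
import numpy as np

def match_candidates(template, candidates, diff_candidate,
                     t1=20.0, t2=15.0, alpha=0.5):
    """Loop over the first candidate hand ranges, computing the first and
    second errors as described; t1, t2 and alpha are illustrative values."""
    best, best_err = None, t2
    # Mean pixel value of the second candidate range (from the difference
    # image): larger motion lowers the second error.
    motion = diff_candidate.mean()
    for i, cand in enumerate(candidates):
        # First error: mean absolute difference against the target template.
        e1 = np.abs(cand.astype(float) - template.astype(float)).mean()
        if e1 > t1:          # first error too large: exclude the candidate
            continue
        e2 = e1 - alpha * motion
        if e2 < best_err:    # the second threshold tightens to the best
            best, best_err = i, e2  # second error found so far
    return best, best_err
```

The returned index identifies the best-matching first candidate hand range, or None if every candidate was excluded.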
As described above, because this template matching method finds the best-matching hand range step by step with reference to previous candidate hand ranges, it can tolerate variations in the shape of a hand in continuous motion. In continuous tracking, however, this also causes matching errors to accumulate. Therefore, an additional verification process can be provided to limit the historical matching error. This verification process comprises: calculating, as a third error, the mean of the absolute differences between corresponding pixels of the currently identified best-matching first candidate hand range and the range of the hand detected by the hand detection component; and judging whether the third error is greater than a third predetermined threshold, and if it is, judging that the currently identified best-matching first candidate hand range does not actually match, so that this first candidate hand range is excluded. Fig. 9 shows the tracking result when the hand moves across the face; to clearly show the effect, frames are shown at intervals, and the rectangular frame indicates the tracked hand.
Then, in S213, it is judged whether the hand was successfully tracked in S212. If the hand was successfully tracked, the flag isTrack is set to 1 in S214 as described above; otherwise, the flag isTrack is set to 0 in S215 and the method flow proceeds to S203 to perform hand detection on the current image.
In the case where the flag isTrack is set to 1 in S214, that is, in the case where the hand is tracked in the current image, at S216 the skin-color image of the current image is saved for calculating the difference image used when processing the next frame image, the parameters representing the range of the hand tracked in the current image are saved for the template matching performed on the next frame image, and the abscissa and ordinate of the position of the hand tracked in the current image are defined as the abscissa of the tracked hand range plus half its width, and the ordinate of the tracked hand range plus half its height, respectively. If HandPos(x, y) denotes the coordinates of the position of the hand tracked in the current image, it is expressed by the following formula:
HandPos(x,y)=(Target.x+Target.width/2,Target.y+Target.height/2)
Then, at S217, after the hand detection or hand tracking processing has been performed on all the images in the series of images captured by the image capture device, the movement trajectory of the hand can be obtained from the plurality of recorded hand positions. By analyzing this movement trajectory with reference to a predefined gesture-to-movement-trajectory mapping table, the gesture of the object can be identified. Specifically, first, the hand position orientation Orient between the positions of the hand detected or tracked in every two adjacent images is calculated; that is, the angle between the horizontal direction and the straight line formed by the position of the hand detected or tracked in the current image and the position of the hand detected or tracked in the previous image is calculated.
Then, the displacement direction DeltaOrient between two adjacent images is calculated. Specifically, a difference is obtained by subtracting the hand position orientation LastOrient of the adjacent image pair from the hand position orientation Orient of the current image with respect to the previous image. If the absolute value of this difference is greater than 180, then: if the difference is greater than 0, 360 is subtracted from it and the resulting value is taken as the displacement direction DeltaOrient; if the difference is less than 0, 360 is added to it and the resulting value is taken as the displacement direction DeltaOrient. If the absolute value of the difference is not greater than 180, the difference itself is taken as the displacement direction DeltaOrient.
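The Orient and DeltaOrient computations above can be sketched as follows. This is an illustrative Python sketch, not part of the patent: the use of atan2 and the assumption that image y coordinates grow downward are illustrative choices not stated in the text.

```python
import math

def orient(prev_pos, cur_pos):
    """Angle in degrees (0-360) between the horizontal direction and the
    line from the previous hand position to the current one."""
    dx = cur_pos[0] - prev_pos[0]
    dy = prev_pos[1] - cur_pos[1]   # image y grows downward (assumption)
    return math.degrees(math.atan2(dy, dx)) % 360

def delta_orient(orient_cur, orient_last):
    """Wrap the orientation difference into (-180, 180] as described."""
    d = orient_cur - orient_last
    if abs(d) > 180:
        d = d - 360 if d > 0 else d + 360
    return d
```

The wrap-around handling keeps DeltaOrient small for a smooth rotation even when the raw orientation jumps across the 0/360 boundary.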
Then, for the eight meaningful gestures, recognition proceeds as follows. ("Arbitrary" is used to record the real-time movement trajectory, and "short stay" is established as the preliminary action for the up/down/left/right gestures. Stay, up/down/left/right and wave form one group, based on the value of Orient; clockwise and counterclockwise rotation form another group, based on the value of DeltaOrient and its sum accumulated over a period of time. In particular, the wave gesture consists of alternating leftward and rightward motions.)
Stay:
To identify the stay gesture, the hand must be detected to remain stationary for STAY_NUM consecutive frames; that is, for STAY_NUM consecutive frames, the position of the hand tracked in the current image is always identical to the position of the hand detected or tracked in the previous image, where STAY_NUM is a predefined number of frames.
Up/down/left/right:
To identify the up/down/left/right gestures, first the preliminary action "short stay" must be detected; then, over the next DIREC_NUM frames, the value of Orient must remain within the range corresponding to one direction, where DIREC_NUM is a predefined number of frames. As an example, suppose the "short stay" action is to remain stationary for 3 frames. Figure 10(b) shows the value ranges of Orient for the up/down/left/right directions. For example, if 46 ≤ Orient ≤ 134, the motion is regarded as upward.
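A sketch of mapping an Orient value to a direction follows. The upward range 46 to 134 is the example given in the text; the ranges for the other three directions are assumed here to be the symmetric quadrants of Figure 10(b), which the text does not spell out.

```python
def classify_direction(orient_value):
    """Map an Orient value (degrees, 0-360) to up/down/left/right.
    Only the upward range 46-134 is given in the text; the other three
    ranges are assumed symmetric quadrants."""
    if 46 <= orient_value <= 134:
        return "up"
    if 226 <= orient_value <= 314:
        return "down"
    if 136 <= orient_value <= 224:
        return "left"
    if orient_value >= 316 or orient_value <= 44:
        return "right"
    return None  # near a quadrant boundary: ambiguous
```

Values near the quadrant boundaries deliberately classify as None, so a noisy frame does not flip the detected direction.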
Wave:
To identify the wave gesture, four consecutive motion segments must be detected: leftward, rightward, leftward, rightward. To detect each motion segment, the Orient value must remain within the corresponding value range for N consecutive frames (MIN_SEG_NUM ≤ N ≤ MAX_SEG_NUM), where MIN_SEG_NUM and MAX_SEG_NUM are predefined frame-number thresholds.
Clockwise rotation:
To identify the clockwise rotation gesture, over a number of consecutive image frames, the value of DeltaOrient must remain positive, the absolute value of DeltaOrient must remain within a certain range (for example, greater than 10 and less than or equal to 50), and the sum of the absolute values of DeltaOrient must reach a predetermined threshold CIRCLE_DEGREE.
Counterclockwise rotation:
In the judgment of counterclockwise rotation, everything is identical to the judgment of clockwise rotation, except that the value of DeltaOrient must remain negative.
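The clockwise/counterclockwise judgment can be sketched as follows. The per-frame magnitude range (greater than 10, at most 50) is the example from the text, while the CIRCLE_DEGREE value of 270 and the mapping of positive DeltaOrient to clockwise are assumptions that depend on the coordinate convention.

```python
def detect_rotation(delta_orients, circle_degree=270.0, lo=10.0, hi=50.0):
    """Check a run of per-frame DeltaOrient values for a rotation gesture:
    all values share one sign, each magnitude stays in (lo, hi], and the
    magnitudes accumulate to at least circle_degree (illustrative value)."""
    if not delta_orients:
        return None
    positive = delta_orients[0] > 0
    total = 0.0
    for d in delta_orients:
        if (d > 0) != positive or not (lo < abs(d) <= hi):
            return None      # sign flip or out-of-range step: no rotation
        total += abs(d)
    if total >= circle_degree:
        return "clockwise" if positive else "counterclockwise"
    return None
```

Requiring a minimum per-frame magnitude filters out a nearly stationary hand, while the upper bound rejects jumps too large to be a smooth circle.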
After a gesture is identified as described above, all counters for the corresponding gesture candidates are reset. In addition, the system is "frozen" for several frames; that is, no gesture is recognized during those frames. This mechanism is introduced to avoid misrecognition of certain unintentional gestures. For example, if a user wishes to keep making rightward gestures, after finishing one rightward gesture he or she will naturally draw the hand back and then make the next rightward gesture. Without this "freezing" mechanism, the "drawing back" would very likely be misrecognized as a leftward gesture.
Thus, the gesture of the object is identified.
Then, in S219, the gesture is output to the operation command trigger device as the recognition result.
Then, in the operation command trigger device, which serves as the interface between gestures and the corresponding operation commands, the corresponding operation command is triggered, according to the gesture recognition result sent from the gesture recognition device, by looking up a predefined mapping table of gestures and operation commands. An example mapping table of gestures and operation commands is shown in Table 2 below. Table 2 gives the mapping between the eight gestures and operations of Windows picture viewer software. Although the mapping given here is between eight gestures and operations of Windows picture viewer software, the invention is not limited to this: more gestures and mappings to corresponding operations can be defined, and the operations can relate not only to various types of software but also to different functions of various electronic devices, as defined by the user as needed. For example, the present invention can be applied to gesture-based mouse control.
Gesture | Operation command
Short stay | Enter
Left | Previous picture; if the picture is zoomed in, move left
Right | Next picture; if the picture is zoomed in, move right
Up | If the picture is zoomed in, move up
Down | If the picture is zoomed in, move down
Wave | Exit
Clockwise rotation | Zoom in
Counterclockwise rotation | Zoom out
Table 2: Gesture-to-operation-command mapping table
According to the vision-based gesture recognition system of the present invention, even in the case where the hand moves across objects with colors similar to skin color, the hand can be detected simply, accurately and efficiently from a sequence of continuously captured image frames, the hand motion can be tracked, and the gesture can be recognized.
Each component of the vision-based gesture remote control system according to the present invention and the corresponding operations in each component have been described above; however, various changes and modifications can be made without departing from the spirit of the present invention, and such changes and modifications also fall within the scope of the present application.

Claims (10)

1. A vision-based gesture remote control system, comprising:
an image capture device for capturing a series of images of an object;
a gesture recognition device for recognizing a gesture of the object from the series of images captured by the image capture device and sending a recognition result to an operation command trigger device; and
the operation command trigger device for triggering a predetermined operation command according to the recognition result sent from the gesture recognition device, wherein the gesture recognition device comprises:
a hand detection component for detecting the hand of the object from the images captured by the image capture device;
a hand tracking component for tracking the hand of the object in subsequent images when the hand detection component detects the hand of the object in one image; and
a gesture recognition component for determining the motion of the hand of the object according to the hand of the object detected by the hand detection component and the hand of the object tracked by the hand tracking component, and recognizing the gesture of the object according to the determined motion of the hand of the object.
2. The vision-based gesture remote control system according to claim 1, wherein the hand detection component detects the hand of the object by converting an image captured by the image capture device into a grayscale image and applying a cascade classifier based on local binary patterns to the grayscale image.
3. The vision-based gesture remote control system according to claim 1, wherein the hand tracking component tracks the hand of the object by the following processing:
initially defining, in the skin-color image of the current image, a search range for tracking the hand, using the range of the hand detected or tracked in the previous image and the difference image between the skin-color image of the current image and the skin-color image of the previous image; and
performing a template matching method to determine the range of the hand tracked in the current image,
wherein the template matching method comprises:
defining a plurality of first candidate hand ranges in the search range, each having the same size as a target template, and defining, in the difference image, a second candidate hand range having the same size as the target template, wherein the target template is the range of the hand detected or tracked in the previous image; and
performing the following matching judgment process in a loop over the plurality of first candidate hand ranges until all of them have been processed, so as to determine the candidate hand range that best matches the target template as the range of the hand tracked in the skin-color image of the current image:
calculating, as a first error, the mean of the absolute differences between corresponding pixels of a first candidate hand range and the target template;
if the first error is greater than a first predetermined threshold, determining that the candidate hand range does not match the target template, so that it is excluded;
if the first error is less than the first predetermined threshold, calculating a second error, the second error being the value obtained by subtracting from the first error the mean pixel value of the second candidate hand range multiplied by a predetermined adjustment coefficient; and
if the second error is less than a second predetermined threshold, determining a match, that is, determining the first candidate hand range to be the range of the hand tracked in the skin-color image of the current image, and using the value of the second error as the second threshold for performing the matching judgment on the next first candidate hand range.
4. The vision-based gesture remote control system according to claim 3, wherein, after the search range in the skin-color image of the current image has been determined and before the template matching method is performed, the initially defined search range is revised with reference to the difference image, and the plurality of candidate hand ranges are defined in the reduced search range in which the template matching method is performed to determine the range of the hand in the current image, wherein the revision comprises: shrinking each edge of the initially defined search range inward step by step, and when any edge encounters a pixel whose value is greater than a predetermined threshold, stopping the shrinking of that edge.
5. The vision-based gesture remote control system according to claim 3, wherein, in the template matching method, when a match with the target template is determined for a first candidate hand range, the match is verified, the verification comprising:
calculating, as a third error, the mean of the absolute differences between corresponding pixels of the currently identified matching first candidate hand range and the range of the hand detected by the hand detection component; and
judging whether the third error is greater than a third predetermined threshold, and if the third error is greater than the third predetermined threshold, judging that the currently identified best-matching first candidate hand range does not actually match, so that this first candidate hand range is excluded.
6. The vision-based gesture remote control system according to any one of claims 3 to 5, wherein the skin-color image is obtained as follows:
subtracting the average of the G component and the B component from the value of the R component among the RGB components of each pixel of the captured image, to obtain a difference; and
comparing the difference with a predetermined threshold, wherein if the difference is less than the predetermined threshold, the value of the corresponding pixel of the skin-color image is set to 0, and if the difference is greater than the predetermined threshold, the value of the corresponding pixel of the skin-color image is set to the difference.
7. The vision-based gesture remote control system according to claim 1, wherein the gesture recognition component determines the motion of the hand of the object according to the positions of the hand of the object detected or tracked by the hand detection component and the hand tracking component in each image frame, the hand position orientations between every two adjacent image frames calculated from the hand positions of the individual frames, and the displacement directions between adjacent image frames calculated from every two adjacent hand position orientations, and recognizes the gesture of the object by comparing the determined motion of the hand of the object with a predefined gesture-to-hand-movement-trajectory mapping table.
8. A vision-based gesture remote control method, comprising:
capturing a series of images of an object;
recognizing a gesture of the object from the captured series of images; and
triggering a predetermined operation command according to the recognition result,
wherein recognizing the gesture of the object comprises:
detecting the hand of the object from the captured images;
when the hand of the object is detected in one image, tracking the hand of the object in subsequent images; and
determining the motion of the hand of the object according to the detected hand of the object and the tracked hand of the object, and recognizing the gesture of the object according to the determined motion of the hand of the object.
9. A method for converting an RGB image into a skin-color image, the method comprising:
subtracting the average of the G component and the B component from the value of the R component of the RGB components of each pixel of the RGB image, to obtain a difference; and
comparing the difference with a predetermined threshold, wherein if the difference is less than the predetermined threshold, the value of the corresponding pixel of the skin-color image is set to 0, and if the difference is greater than the predetermined threshold, the value of the corresponding pixel of the skin-color image is set to the difference.
10. A method for tracking a target in an image sequence, comprising:
initially defining, in the skin-color image of the current image, a search range for tracking the target, using the range of the target detected or tracked in the previous image and the difference image between the skin-color image of the current image and the skin-color image of the previous image; and
performing a template matching method to determine the range of the target tracked in the current image, wherein the template matching method comprises:
defining a plurality of first candidate target ranges in the search range, each having the same size as a target template, and defining, in the difference image, a second candidate target range having the same size as the target template, wherein the target template is the range of the target detected or tracked in the previous image; and
performing the following matching judgment process in a loop over the plurality of first candidate target ranges until all of them have been processed, so as to determine the candidate target range that best matches the target template as the range of the target tracked in the skin-color image of the current image:
calculating, as a first error, the mean of the absolute differences between corresponding pixels of a first candidate target range and the target template;
if the first error is greater than a first predetermined threshold, determining that the candidate target range does not match the target template, so that it is excluded;
if the first error is less than the first predetermined threshold, calculating a second error, the second error being the value obtained by subtracting from the first error the mean pixel value of the second candidate target range multiplied by a predetermined adjustment coefficient; and
if the second error is less than a second predetermined threshold, determining a match, that is, determining the first candidate target range to be the range of the target tracked in the skin-color image of the current image, and using the value of the second error as the second threshold for performing the matching judgment on the next first candidate target range.
CN201210121832.XA 2012-04-16 2012-04-16 Gesture remote control system based on vision Active CN103376890B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210121832.XA CN103376890B (en) 2012-04-16 2012-04-16 Gesture remote control system based on vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210121832.XA CN103376890B (en) 2012-04-16 2012-04-16 Gesture remote control system based on vision

Publications (2)

Publication Number Publication Date
CN103376890A true CN103376890A (en) 2013-10-30
CN103376890B CN103376890B (en) 2016-08-31

Family

ID=49462114

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210121832.XA Active CN103376890B (en) Gesture remote control system based on vision

Country Status (1)

Country Link
CN (1) CN103376890B (en)

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104050454A (en) * 2014-06-24 2014-09-17 深圳先进技术研究院 Movement gesture track obtaining method and system
CN104318218A (en) * 2014-10-29 2015-01-28 百度在线网络技术(北京)有限公司 Image recognition method and device
CN104699238A (en) * 2013-12-10 2015-06-10 现代自动车株式会社 System and method for gesture recognition of vehicle
CN105223957A (en) * 2015-09-24 2016-01-06 北京零零无限科技有限公司 A kind of method and apparatus of gesture manipulation unmanned plane
CN105242614A (en) * 2015-11-17 2016-01-13 广州新科佳都科技有限公司 Platform screen door safety protection control method and system
CN105657260A (en) * 2015-12-31 2016-06-08 宇龙计算机通信科技(深圳)有限公司 Shooting method and terminal
CN105677039A (en) * 2016-02-16 2016-06-15 北京博研智通科技有限公司 Method, device and wearable device for gesture-based driving status detection
CN105807783A (en) * 2014-12-30 2016-07-27 览意科技(上海)有限公司 Flight camera
CN106022211A (en) * 2016-05-04 2016-10-12 北京航空航天大学 Method using gestures to control multimedia device
CN106041912A (en) * 2016-06-16 2016-10-26 深圳先进技术研究院 Master-slave mode snake-like robot system and position control method thereof
CN106094861A (en) * 2016-06-02 2016-11-09 零度智控(北京)智能科技有限公司 Unmanned plane, unmanned aerial vehicle (UAV) control method and device
CN106295531A (en) * 2016-08-01 2017-01-04 乐视控股(北京)有限公司 A kind of gesture identification method and device and virtual reality terminal
WO2017000764A1 (en) * 2015-06-30 2017-01-05 芋头科技(杭州)有限公司 Gesture detection and recognition method and system
CN106491071A (en) * 2015-09-06 2017-03-15 中兴通讯股份有限公司 A kind of method for giving a test of one's eyesight and terminal
CN106920251A (en) * 2016-06-23 2017-07-04 阿里巴巴集团控股有限公司 Staff detecting and tracking method and device
CN106934333A (en) * 2015-12-31 2017-07-07 芋头科技(杭州)有限公司 A kind of gesture identification method and system
CN106951871A (en) * 2017-03-24 2017-07-14 北京地平线机器人技术研发有限公司 Movement locus recognition methods, device and the electronic equipment of operating body
WO2017190614A1 (en) * 2016-05-06 2017-11-09 深圳市国华识别科技开发有限公司 Intelligent terminal based man-machine interaction method and system
WO2018082331A1 (en) * 2016-11-07 2018-05-11 深圳光启合众科技有限公司 Image processing method and device, and robot
CN109558000A (en) * 2017-09-26 2019-04-02 京东方科技集团股份有限公司 A kind of man-machine interaction method and electronic equipment
CN110276292A (en) * 2019-06-19 2019-09-24 上海商汤智能科技有限公司 Intelligent vehicle motion control method and device, equipment and storage medium
CN110458095A (en) * 2019-08-09 2019-11-15 厦门瑞为信息技术有限公司 A kind of recognition methods, control method, device and the electronic equipment of effective gesture
WO2021077840A1 (en) * 2019-10-22 2021-04-29 上海商汤智能科技有限公司 Gesture control method and apparatus

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101853071A (en) * 2010-05-13 2010-10-06 重庆大学 Gesture identification method and system based on visual sense
CN102200830A (en) * 2010-03-25 2011-09-28 夏普株式会社 Non-contact control system and control method based on static gesture recognition
CN102339125A (en) * 2010-07-23 2012-02-01 夏普株式会社 Information equipment and control method and system thereof

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102200830A (en) * 2010-03-25 2011-09-28 夏普株式会社 Non-contact control system and control method based on static gesture recognition
CN101853071A (en) * 2010-05-13 2010-10-06 重庆大学 Gesture identification method and system based on visual sense
CN102339125A (en) * 2010-07-23 2012-02-01 夏普株式会社 Information equipment and control method and system thereof

Cited By (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104699238A (en) * 2013-12-10 2015-06-10 现代自动车株式会社 System and method for gesture recognition of vehicle
CN104699238B (en) * 2013-12-10 2019-01-22 现代自动车株式会社 System and method of the gesture of user to execute the operation of vehicle for identification
CN104050454B (en) * 2014-06-24 2017-12-19 深圳先进技术研究院 A kind of motion gesture track acquisition methods and system
CN104050454A (en) * 2014-06-24 2014-09-17 深圳先进技术研究院 Movement gesture track obtaining method and system
CN104318218A (en) * 2014-10-29 2015-01-28 百度在线网络技术(北京)有限公司 Image recognition method and device
CN105807783A (en) * 2014-12-30 2016-07-27 览意科技(上海)有限公司 Flight camera
WO2017000764A1 (en) * 2015-06-30 2017-01-05 芋头科技(杭州)有限公司 Gesture detection and recognition method and system
US10318800B2 (en) 2015-06-30 2019-06-11 Yutou Technology (Hangzhou) Co., Ltd. Gesture detection and recognition method and system
CN106325485B (en) * 2015-06-30 2019-09-10 芋头科技(杭州)有限公司 A kind of gestures detection recognition methods and system
CN106325485A (en) * 2015-06-30 2017-01-11 芋头科技(杭州)有限公司 Gesture detection and identification method and system
CN106491071A (en) * 2015-09-06 2017-03-15 中兴通讯股份有限公司 A kind of method for giving a test of one's eyesight and terminal
WO2017049817A1 (en) * 2015-09-24 2017-03-30 北京零零无限科技有限公司 Method and apparatus for operating unmanned aerial vehicle by means of gestures
CN105223957A (en) * 2015-09-24 2016-01-06 北京零零无限科技有限公司 A kind of method and apparatus of gesture manipulation unmanned plane
US10261507B2 (en) 2015-09-24 2019-04-16 Beijing Zero Zero Infinity Technology Co., Ltd Method and device for controlling unmanned aerial vehicle with gesture
CN105242614A (en) * 2015-11-17 2016-01-13 广州新科佳都科技有限公司 Platform screen door safety protection control method and system
CN106934333B (en) * 2015-12-31 2021-07-20 芋头科技(杭州)有限公司 Gesture recognition method and system
CN105657260A (en) * 2015-12-31 2016-06-08 宇龙计算机通信科技(深圳)有限公司 Shooting method and terminal
CN106934333A (en) * 2015-12-31 2017-07-07 芋头科技(杭州)有限公司 A kind of gesture identification method and system
CN105677039B (en) * 2016-02-16 2020-06-09 北京博研智通科技有限公司 Method and device for detecting driving state based on gesture and wearable device
CN105677039A (en) * 2016-02-16 2016-06-15 北京博研智通科技有限公司 Method, device and wearable device for gesture-based driving status detection
CN106022211A (en) * 2016-05-04 2016-10-12 北京航空航天大学 Method using gestures to control multimedia device
CN106022211B (en) * 2016-05-04 2019-06-28 北京航空航天大学 A method of utilizing gesture control multimedia equipment
WO2017190614A1 (en) * 2016-05-06 2017-11-09 深圳市国华识别科技开发有限公司 Intelligent terminal based man-machine interaction method and system
CN106094861A (en) * 2016-06-02 2016-11-09 零度智控(北京)智能科技有限公司 Unmanned plane, unmanned aerial vehicle (UAV) control method and device
CN106094861B (en) * 2016-06-02 2024-01-12 零度智控(北京)智能科技有限公司 Unmanned aerial vehicle, unmanned aerial vehicle control method and unmanned aerial vehicle control device
CN106041912B (en) * 2016-06-16 2018-06-22 深圳先进技术研究院 Master-slave mode snake-shaped robot system and its position control method
CN106041912A (en) * 2016-06-16 2016-10-26 深圳先进技术研究院 Master-slave mode snake-like robot system and position control method thereof
US10885639B2 (en) 2016-06-23 2021-01-05 Advanced New Technologies Co., Ltd. Hand detection and tracking method and device
US10885638B2 (en) 2016-06-23 2021-01-05 Advanced New Technologies Co., Ltd. Hand detection and tracking method and device
EP3477593A4 (en) * 2016-06-23 2019-06-12 Alibaba Group Holding Limited Hand detecting and tracking method and device
JP2019519049A (en) * 2016-06-23 2019-07-04 アリババ グループ ホウルディング リミテッド Hand detection and tracking method and apparatus
CN106920251A (en) * 2016-06-23 2017-07-04 阿里巴巴集团控股有限公司 Staff detecting and tracking method and device
WO2017219875A1 (en) * 2016-06-23 2017-12-28 阿里巴巴集团控股有限公司 Hand detecting and tracking method and device
CN106295531A (en) * 2016-08-01 2017-01-04 乐视控股(北京)有限公司 A kind of gesture identification method and device and virtual reality terminal
WO2018082331A1 (en) * 2016-11-07 2018-05-11 深圳光启合众科技有限公司 Image processing method and device, and robot
CN106951871A (en) * 2017-03-24 2017-07-14 北京地平线机器人技术研发有限公司 Movement locus recognition methods, device and the electronic equipment of operating body
US10866649B2 (en) 2017-09-26 2020-12-15 Boe Technology Group Co., Ltd. Gesture identification method and electronic device
CN109558000A (en) * 2017-09-26 2019-04-02 京东方科技集团股份有限公司 A kind of man-machine interaction method and electronic equipment
CN110276292A (en) * 2019-06-19 2019-09-24 上海商汤智能科技有限公司 Intelligent vehicle motion control method and device, equipment and storage medium
CN110276292B (en) * 2019-06-19 2021-09-10 上海商汤智能科技有限公司 Intelligent vehicle motion control method and device, equipment and storage medium
CN110458095A (en) * 2019-08-09 2019-11-15 Xiamen Reconova Information Technology Co., Ltd. Effective gesture recognition method, control method and device, and electronic device
CN110458095B (en) * 2019-08-09 2022-11-18 Xiamen Reconova Information Technology Co., Ltd. Effective gesture recognition method, control method and device, and electronic device
WO2021077840A1 (en) * 2019-10-22 2021-04-29 Shanghai SenseTime Intelligent Technology Co., Ltd. Gesture control method and apparatus

Also Published As

Publication number Publication date
CN103376890B (en) 2016-08-31

Similar Documents

Publication Publication Date Title
CN103376890A (en) Gesture remote control system based on vision
Zhang et al. Ergonomic posture recognition using 3D view-invariant features from single ordinary camera
Raheja et al. Real-time robotic hand control using hand gestures
Jain et al. Real-time upper-body human pose estimation using a depth camera
US9436872B2 (en) System and method for detecting and tracking multiple parts of an object
EP2957206B1 (en) Robot cleaner and method for controlling the same
CN103135753A (en) Gesture input method and system
CN108171133A (en) Dynamic gesture recognition method based on feature covariance matrix
TWI571772B (en) Virtual mouse driving apparatus and virtual mouse simulation method
Raheja et al. Hand gesture pointing location detection
Jaemin et al. A robust gesture recognition based on depth data
Oh et al. Using binary decision tree and multiclass SVM for human gesture recognition
Wang et al. A new hand gesture recognition algorithm based on joint color-depth superpixel earth mover's distance
KR20120089948A (en) Real-time gesture recognition using mhi shape information
CN105261038A (en) Bidirectional optical flow and perceptual hash based fingertip tracking method
Chai et al. Robust hand gesture analysis and application in gallery browsing
Ikemura et al. Human detection by Haar-like filtering using depth information
Liao et al. Design of real-time face position tracking and gesture recognition system based on image segmentation algorithm
Hsieh et al. A real time hand gesture recognition system based on DFT and SVM
Czupryna et al. Real-time vision pointer interface
Shaker et al. Real-time finger tracking for interaction
Xu et al. Bare hand gesture recognition with a single color camera
Araki et al. Real-time both hands tracking using camshift with motion mask and probability reduction by motion prediction
Kim et al. Visual multi-touch air interface for barehanded users by skeleton models of hand regions
Jadhav et al. Hand gesture recognition system to control slide show navigation

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant