CN101901339B - Hand movement detecting method - Google Patents


Info

Publication number
CN101901339B
CN101901339B
Authority
CN
China
Prior art keywords
hand
detected
camera
palm
face
Prior art date
Legal status
Expired - Fee Related
Application number
CN2010102428326A
Other languages
Chinese (zh)
Other versions
CN101901339A (en)
Inventor
Xu Xiangmin (徐向民)
Liang Zhuorui (梁卓锐)
Miao Jie (苗捷)
Current Assignee
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date
Filing date
Publication date
Application filed by South China University of Technology SCUT filed Critical South China University of Technology SCUT
Priority to CN2010102428326A
Publication of CN101901339A
Application granted
Publication of CN101901339B

Abstract

The invention discloses a hand movement detecting method comprising at least the following steps: A, start-up; B, capturing an image sequence with a camera, and searching for the user's hands in the image sequence captured by the camera; C, if a first hand is raised and a second hand supports the elbow joint of the first hand, judging the first hand to be the detected hand and the second hand to be the supporting hand; extracting the palm position information of the detected hand and of the supporting hand, and processing the extracted position information. Because the method requires neither a marker worn on the hand nor an exposed forearm, detection is more convenient and the detection speed is improved.

Description

Hand movement detecting method
Technical field
The present invention relates to a hand movement detecting method.
Background technology
Current methods for detecting the motion of the human hand and forearm obtain accurate hand information mainly in two ways: (1) wearing a marker on the hand, or (2) applying skin-color detection to the arm, which requires the arm to be exposed. The first way is inconvenient because a marker must be worn on the hand; the second is inconvenient when the user wears long sleeves, imposes many restrictions, and also reduces the detection speed.
Moreover, conventional detection requires two cameras viewing from two different directions (often orthogonal) in order to detect the three-dimensional motion of the forearm, which increases the cost of detection.
Summary of the invention
The object of the present invention is to provide a hand movement detecting method that requires neither a marker worn on the hand nor an exposed forearm, making detection more convenient and faster.
Its technical scheme is as follows:
A hand movement detecting method comprising at least the following steps: A, start-up; B, capturing an image sequence with a camera, and searching for the user's hands in the image sequence captured by the camera; C, if a first hand is found raised and a second hand supports the elbow joint of the first hand, judging the first hand to be the detected hand and the second hand to be the supporting hand; then extracting the palm position information of the detected hand and of the supporting hand, and processing the extracted position information.
The present invention locates the hands by skin-color detection, takes the palm of the detected hand as the end point of the forearm vector and the palm of the supporting hand as its start point, and builds a forearm model. The hand is thus detected easily, avoiding the prior-art drawbacks of wearing a marker on the hand or exposing the arm; detection is more convenient and faster.
The above scheme can be further refined as follows:
In said step C, the palm of the supporting hand is kept still while a single camera tracks both hands. When the detected hand is lifted vertically, the absolute distance L from the palm of the detected hand to the palm of the supporting hand is detected. When the detected hand is lifted at an incline, the projected distance d of the palm of the detected hand relative to the palm of the supporting hand in the horizontal or vertical direction is detected, together with the angle θ of the arm of the detected hand relative to the horizontal or vertical direction. The three-dimensional motion state of the palm of the detected hand is then determined from L, d and θ.
Said step B comprises the steps of: B1, the camera searches for a human face and judges whether a face is present; B2, if a face is present, a sensitive region is established near the face and the camera searches for the user's hands within the sensitive region.
Said step B2 comprises the steps of: B21, the camera searches the sensitive region for a palm facing the camera; if one is found, that hand is judged to be the detected hand; B22, the camera searches for a skin-color region below the detected hand; if a skin-color region meets the area and aspect-ratio requirements, it is judged to be the supporting hand.
In summary, the advantages of the present invention are:
1. Detection requires neither a marker worn on the hand nor an exposed arm, so it is more convenient and the detection speed is improved;
2. The values of L, d and θ are obtained with a single camera, from which the three-dimensional motion state of the palm of the detected hand can be determined, reducing the cost of detection.
Description of drawings
Fig. 1 shows the image sequence captured by the camera when the detection method of the embodiment of the invention is applied;
Fig. 2 is a state diagram of the detected hand lifted vertically;
Fig. 3 is a state diagram of the detected hand lifted at an incline;
Fig. 4 is a three-dimensional vector diagram of the human forearm.
Embodiment
Embodiments of the invention are described in detail below with reference to the drawings:
As shown in Fig. 1, a hand movement detecting method comprises at least the following steps: A, start-up; B, capturing an image sequence with a camera, and searching for the user's hands in the image sequence captured by the camera; C, if a first hand is found raised and a second hand supports the elbow joint of the first hand, judging the first hand to be the detected hand and the second hand to be the supporting hand; then extracting the palm position information of the detected hand and of the supporting hand, and processing the extracted position information.
Said step B comprises the steps of: B1, the camera searches for a human face and judges whether a face is present; B2, if a face is present, a sensitive region is established near the face and the camera searches for the user's hands within it (step B2 in turn comprises: B21, the camera searches the sensitive region for a palm facing the camera and, if one is found, judges that hand to be the detected hand; B22, the camera searches for a skin-color region below the detected hand and, if a skin-color region meets the area and aspect-ratio requirements, judges it to be the supporting hand).
In said step C, the palm of the supporting hand is kept still while a single camera tracks both hands. As shown in Fig. 2, when the detected hand is lifted vertically, the absolute distance L from the palm of the detected hand to the palm of the supporting hand is detected. As shown in Fig. 3, when the detected hand is lifted at an incline, the horizontal projected distance d of the palm of the detected hand relative to the palm of the supporting hand is detected, together with the angle θ of the arm of the detected hand relative to the vertical direction. As shown in Fig. 4, a geometric calculation with L, d and θ maps the measurements into three-dimensional space and determines the three-dimensional motion state of the palm of the detected hand.
The detection method of this embodiment is now described in detail:
1. The user enters the camera's field of view facing the camera; arm-motion detection is not yet activated. A human face is searched for in the captured images: features are extracted with the Haar wavelet transform, and an AdaBoost classifier performs the face recognition. Once a face is found, a rectangular region for searching for the hands is determined according to the face size, further refining and narrowing the search range of hand detection.
2. After the face is detected, the hands are searched for in the region near the face, mainly by skin-color detection and shape judgment. The specific algorithm is as follows:
2.1 First, convert the RGB color space to the YCbCr space with the following formulas:
Y = 0.257R + 0.504G + 0.098B + 16
Cb = −0.148R − 0.291G + 0.439B + 128
Cr = 0.439R − 0.368G − 0.071B + 128
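As an illustrative sketch (not part of the patent text), the conversion can be written directly from the standard ITU-R BT.601 limited-range coefficients; the function name is an assumption:

```python
def rgb_to_ycbcr(r, g, b):
    """Convert one 8-bit RGB pixel to YCbCr (BT.601 limited range)."""
    y = 0.257 * r + 0.504 * g + 0.098 * b + 16
    cb = -0.148 * r - 0.291 * g + 0.439 * b + 128
    cr = 0.439 * r - 0.368 * g - 0.071 * b + 128
    return y, cb, cr

# A pure black pixel has no chroma: Cb = Cr = 128.
print(rgb_to_ycbcr(0, 0, 0))  # → (16.0, 128.0, 128.0)
```

Working in the CbCr plane afterwards makes the skin model largely independent of brightness, which is why the conversion precedes the skin-color test.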
2.2 Skin color is judged with a Gaussian model, assuming that the skin-color distribution obeys a single Gaussian:
p(x) = (1 / (√(2π)·σ)) · exp(−(x − μ)² / (2σ²))
In the YCbCr color space, skin-color samples collected under different illumination conditions are used for training to obtain the skin-probability parameters. The skin-probability formula is:
p(Cb, Cr) = exp[−0.5 (x − m)ᵀ C⁻¹ (x − m)]
where:
x = (Cb, Cr)ᵀ is a pixel in the CbCr space;
m = E{x} is the mean of all pixels in the CbCr space;
C = E{(x − m)(x − m)ᵀ} is the covariance matrix of all pixels in the CbCr space.
The mean and covariance are obtained from the statistics of the collected skin-color samples. Finally, p(Cb, Cr) is computed for each pixel and compared with the threshold 0.6: a pixel whose probability exceeds 0.6 is regarded as a skin pixel.
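A minimal sketch of this test follows; the mean and covariance values below are illustrative assumptions standing in for statistics gathered from real skin samples, not trained parameters from the patent:

```python
import numpy as np

def skin_probability(cb, cr, m, C):
    """p(Cb,Cr) = exp[-0.5 (x-m)^T C^-1 (x-m)] with mean m, covariance C."""
    x = np.array([cb, cr], dtype=float)
    d = x - m
    return float(np.exp(-0.5 * d @ np.linalg.inv(C) @ d))

# Illustrative (untrained) parameters.
m = np.array([117.0, 156.0])
C = np.array([[160.0, 20.0], [20.0, 80.0]])

p = skin_probability(117.0, 156.0, m, C)  # at the mean, p = 1.0
is_skin = p > 0.6                         # threshold from the text
```

Note that this unnormalized form equals 1 at the mean, so a fixed threshold such as 0.6 corresponds to a Mahalanobis-distance cutoff.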
2.3 Morphological processing is applied to the binary image obtained after skin-color clustering; connected regions are labeled, and hand candidates are selected by region area and aspect ratio.
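The labeling-and-screening part of step 2.3 can be sketched as follows (the morphological pre-processing is omitted, and the function name and threshold values are illustrative assumptions):

```python
from collections import deque

def hand_candidates(mask, min_area=4, max_aspect=2.0):
    """Label 4-connected regions in a binary skin mask and keep those
    whose pixel area and bounding-box aspect ratio pass the thresholds."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    regions = []
    for i in range(h):
        for j in range(w):
            if mask[i][j] and not seen[i][j]:
                # Breadth-first flood fill collects one connected region.
                q, px = deque([(i, j)]), []
                seen[i][j] = True
                while q:
                    y, x = q.popleft()
                    px.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
                ys, xs = [p[0] for p in px], [p[1] for p in px]
                bh, bw = max(ys) - min(ys) + 1, max(xs) - min(xs) + 1
                aspect = max(bh, bw) / min(bh, bw)
                if len(px) >= min_area and aspect <= max_aspect:
                    regions.append(px)
    return regions

# A 2x2 blob passes; the isolated pixel is rejected by min_area.
mask = [[1, 1, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0]]
print(len(hand_candidates(mask)))  # → 1
```

In practice a library routine such as a connected-components function would be used; the point is only that each skin region is screened by area and bounding-box aspect ratio before being accepted as a hand candidate.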
3. After a hand is detected, whether forearm motion detection should start is judged by the following two criteria:
The supporting hand holds the elbow joint of the detected hand, i.e., another skin-color region can be found a certain distance (generally 1.5 times the face height) below the detected hand; that region is identified as the supporting hand;
The arm to be detected is in a roughly vertical position (generally taken as an angle greater than 85 degrees to the horizontal). Once a valid inclination of the detected hand is found, forearm motion detection is triggered.
4. The camera auto-focuses on the detected hand so that the detected hand occupies about 70% of the camera's output image, which narrows the tracking range and improves detection accuracy. From the detected hand and the supporting hand, the reference distance L from the palm of the detected hand to the supporting hand is calculated.
During forearm motion detection the supporting hand is kept still and the hands are searched for in the current frame. If a hand is found, the detected hand is considered valid; after the detected hand is found, a skin-color region is searched for below its position, and a region meeting the area and aspect-ratio requirements is judged to be the supporting hand.
The horizontal projected distance d from the detected hand to the supporting hand and the off-center angle θ of the forearm vector (the angle between OA and OD) are then calculated. From the relations among L, d and θ, the three-dimensional motion state of the forearm is computed as follows:
As shown in Fig. 4, OA is the forearm vector in the initial state, with length L; O is the coordinate origin and the OA direction is the Z axis. The dashed plane in Fig. 4 faces the camera, and the direction perpendicular to that plane is the X axis; this constructs the coordinate system. Given the range of motion of the arm, the forearm can only move in front of the upper arm. OB is the forearm vector in the current state, BC has length d, and angle DOA is θ. The coordinates (x, y, z) of B are then:
x = d;
y = d·tanθ;
z = √(L² − d² − y²)
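The mapping from the measured quantities to the 3-D coordinates of B can be sketched directly from the three relations above (the function name is an assumption; θ is taken in radians):

```python
import math

def palm_position(L, d, theta):
    """3-D coordinates of point B from the relations
    x = d, y = d*tan(theta), z = sqrt(L^2 - d^2 - y^2)."""
    x = d
    y = d * math.tan(theta)
    z = math.sqrt(L * L - d * d - y * y)
    return x, y, z

# With theta = 0 the palm stays in the X-Z plane:
print(palm_position(5.0, 3.0, 0.0))  # → (3.0, 0.0, 4.0)
```

The square root requires L² ≥ d² + y², which matches the physical constraint that the forearm length L bounds the palm's displacement.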
The above is merely a specific embodiment of the present invention and does not limit the scope of protection of the present invention; any replacement or improvement made without departing from the concept of the present invention falls within the scope of protection of the present invention.

Claims (1)

1. A hand movement detecting method, characterized in that the method comprises at least the following steps:
A, start-up;
B, capturing an image sequence with a camera, and searching for the user's hands in the image sequence captured by the camera;
C, if a first hand is found raised and a second hand supports the elbow joint of the first hand, judging the first hand to be the detected hand and the second hand to be the supporting hand; and extracting the palm position information of the detected hand and the palm position information of the supporting hand, and processing the extracted position information;
said step B comprising the steps of: B1, searching for a human face with the camera, and judging whether a face is present; B2, if a face is present, establishing a sensitive region near the face, and searching for the user's hands within the sensitive region with the camera;
said step B2 comprising the steps of: B21, searching the sensitive region with the camera for a palm facing the camera and, if one is found, judging that hand to be the detected hand; B22, searching with the camera for a skin-color region below the detected hand and, if a skin-color region meets the area and aspect-ratio requirements, judging that region to be the supporting hand;
in said step C, keeping the palm position of the supporting hand still, and tracking both hands with a single camera; when the detected hand is lifted vertically, detecting the absolute distance L from the palm of the detected hand to the palm of the supporting hand; when the detected hand is lifted at an incline, detecting the projected distance d of the palm of the detected hand relative to the palm of the supporting hand in the horizontal or vertical direction, and detecting the angle θ of the arm of the detected hand relative to the horizontal or vertical direction; and determining the three-dimensional motion state of the palm of the detected hand from L, d and θ.
CN2010102428326A 2010-07-30 2010-07-30 Hand movement detecting method Expired - Fee Related CN101901339B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2010102428326A CN101901339B (en) 2010-07-30 2010-07-30 Hand movement detecting method


Publications (2)

Publication Number Publication Date
CN101901339A CN101901339A (en) 2010-12-01
CN101901339B true CN101901339B (en) 2012-11-14

Family

ID=43226863




Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101073089A * 2004-04-15 2007-11-14 GestureTek Inc. Tracking bimanual movements

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
GB2418974B (en) * 2004-10-07 2009-03-25 Hewlett Packard Development Co Machine-human interface
CN100585329C (en) * 2007-04-10 2010-01-27 南京航空航天大学 Location system of video finger and location method based on finger tip marking
CN101344816B (en) * 2008-08-15 2010-08-11 华南理工大学 Human-machine interaction method and device based on sight tracing and gesture discriminating
JP2010079651A (en) * 2008-09-26 2010-04-08 Toshiba Corp Movement recognition device, method and program




Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20121114

Termination date: 20210730