CN106909872A - Staff outline identification method - Google Patents

Staff outline identification method

Info

Publication number
CN106909872A
CN106909872A (application CN201510971701.4A)
Authority
CN
China
Prior art keywords
staff
depth
image
profile
dimensional
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510971701.4A
Other languages
Chinese (zh)
Inventor
魏毅
刘周
王凯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangsu Dcom Intelligent Technology Co Ltd
Original Assignee
Jiangsu Dcom Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangsu Dcom Intelligent Technology Co Ltd filed Critical Jiangsu Dcom Intelligent Technology Co Ltd
Priority to CN201510971701.4A priority Critical patent/CN106909872A/en
Publication of CN106909872A publication Critical patent/CN106909872A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/107 Static hand or arm
    • G06V40/11 Hand-related biometrics; Hand pose recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/107 Static hand or arm
    • G06V40/113 Recognition of static hand signs

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The present invention provides a human hand contour recognition method comprising the following steps: acquiring a three-dimensional hand image with a Kinect color camera and acquiring depth data with a depth camera; performing image enhancement processing and contour detection processing on the three-dimensional hand image; calculating the palm-center coordinate; calculating the depth distance of the palm center; setting a hand-contour depth distance threshold; calculating the depth distance of each pixel within the hand contour that still contains background color; setting to white every pixel whose depth distance falls outside the threshold range, thereby removing the background color from the three-dimensional hand image; and performing a second contour detection to obtain the hand contour. The invention effectively removes background-color interference and accurately recognizes the palm contour, finger contours, and palm-center coordinate. With depth recognition capability, it can recognize gestures with depth, without using skeleton data and without requiring the whole human body to appear in the camera's field of view.

Description

Human hand contour recognition method
Technical field
The present invention relates to the field of gesture recognition, and more particularly to a human hand contour recognition method.
Background technology
In computer science, gesture recognition is the problem of interpreting human gestures by means of mathematical algorithms. Gestures may originate from the motion of any part of the body, but most commonly refer to motions of the hands. As a natural and intuitive means of human-computer interaction, gesture has been a research hotspot in human-computer interaction technology for the past two decades.
Vision-based gesture recognition, however, is easily disturbed by the environment during data acquisition, for example by the face, the skin of the neck, or clothing whose color is similar to skin tone. The most fundamental problem in gesture recognition technology is therefore determining the shape and position of the hand: only when the hand contour is accurately recognized can the gesture be accurately recognized. This requires good discrimination between the background color and the color of the hand, and especially when the background is changeable and complex, such environmental interference becomes more serious.
Prior-art gesture recognition techniques fall mainly into two categories. The first recognizes gestures with a color camera combined with image algorithms. Its drawback is that, because the camera is not a depth camera, it is difficult to recognize gestures with a depth component (e.g., pushing forward or bending); noise suppression is poor, environmental interference is hard to remove, and recognition accuracy is relatively low. The second uses the Kinect 3D motion-sensing camera. The SDK supplied with Kinect performs excellently at skeleton recognition, but to guarantee that effect the whole human body must appear within the camera's coverage area; for hand recognition specifically, a certain amount of environmental interference remains and recognition accuracy is relatively low.
Summary of the invention
An object of the present invention is to provide a human hand contour recognition method that effectively solves the technical problems of serious background-color interference and relatively low hand-recognition accuracy in existing gesture recognition technology.
To achieve the above object, the present invention provides a human hand contour recognition method comprising:
acquiring at least one three-dimensional hand image with at least one Kinect color camera, and acquiring at least one set of depth data with at least one depth camera;
performing image enhancement processing on the three-dimensional hand image;
performing contour detection processing on the three-dimensional hand image to obtain at least one hand image containing background color, and storing the contour coordinates of the hand image containing background color;
calculating and storing at least one palm-center coordinate from the contour coordinates of the hand image containing background color;
calculating the depth distance of the palm center from the palm-center coordinate and the depth data;
setting at least one hand-contour depth distance threshold according to a preset hand shape and the depth distance of the palm center;
calculating the depth distance of each pixel within the hand contour containing background color, using that contour and the depth data;
comparing the depth distance of each pixel within the hand contour containing background color against the hand-contour depth distance threshold, and setting to white every pixel whose depth distance falls outside the threshold range, thereby removing the background color from the three-dimensional hand image; and
performing a second contour detection processing on the three-dimensional hand image, and obtaining and storing at least one hand contour.
An advantage of the present invention is that it effectively removes background-color interference and accurately recognizes the palm contour, finger contours, and palm-center coordinate. With depth recognition capability, it can recognize gestures with depth (e.g., pushing forward or bending) without using skeleton data and without requiring the whole human body to appear in the camera's field of view.
Brief description of the drawings
Fig. 1 shows the flow chart of the human hand contour recognition method of the present invention.
Specific embodiments
Preferred embodiments of the present invention are described below with reference to the accompanying drawings to demonstrate that the invention can be implemented. These embodiments give those skilled in the art a complete description of the technical content of the invention, so that it is clearer and easier to understand. The invention may, however, be embodied in many different forms, and its scope of protection is not limited to the embodiments mentioned herein.
As shown in Fig. 1, the present invention provides a human hand contour recognition method comprising:
Step S1) Acquire at least one three-dimensional hand image with at least one Kinect color camera, and acquire at least one set of depth data with at least one depth camera. Kinect is a 3D motion-sensing camera designed by Microsoft that connects to a host computer over USB and integrates functions such as real-time motion capture, image recognition, microphone input, speech recognition, and community interaction. Kinect is more intelligent than an ordinary camera: it emits infrared light to perform stereoscopic localization of the whole room, its color camera can recognize human motion through the infrared data, recognize full RGB color, and log users in automatically through facial recognition. With suitable software on the host, it can track 48 positions of the human body in real time. With the Kinect color camera shooting at 30 frames per second, 30 three-dimensional hand images and 30 corresponding sets of depth data are obtained per second.
Step S2) Perform image enhancement processing on the three-dimensional hand image, specifically comprising: a histogram modification step; a color adjustment step; and a resolution adjustment step that sets the resolution to 512 × 424. This step is carried out by software on the host connected to the Kinect color camera.
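As a hedged illustration of what step S2) might look like in code — the patent publishes no implementation, and the function names below are invented here — a minimal NumPy sketch of histogram modification (claim 2 describes it as evening out the gray scale) and resolution adjustment to 512 × 424:

```python
import numpy as np

def equalize_gray(img):
    """Classic histogram equalization on a 2-D uint8 image: build the
    cumulative histogram and use it as a lookup table.
    Assumes the image is not a single flat color."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]
    lut = np.clip(np.round((cdf - cdf_min) / (img.size - cdf_min) * 255), 0, 255)
    return lut.astype(np.uint8)[img]

def resize_nearest(img, width, height):
    """Nearest-neighbour resize to the 512 x 424 target named in the patent."""
    h, w = img.shape[:2]
    rows = np.arange(height) * h // height
    cols = np.arange(width) * w // width
    return img[rows][:, cols]
```

In practice an OpenCV pipeline would use `cv2.equalizeHist` and `cv2.resize` for the same effect; the NumPy version is shown only to make the arithmetic of the step explicit.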
Step S3) Perform contour detection processing on the three-dimensional hand image to obtain at least one hand image containing background color, and store the contour coordinates of that hand image. This specifically comprises the following steps: Step S301) define the distribution of human skin color in the YCbCr color space; Step S302) compute, for each pixel of the three-dimensional hand image, its mapping into the YCbCr color space; Step S303) in the YCbCr color space, define the set of pixels whose mapped values match human skin color as the hand image, and set the pixels that do not match human skin color to white; and Step S304) define the set of pixels at the boundary of the hand image as the hand contour corresponding to the three-dimensional hand image. This step is carried out by software on the host connected to the Kinect color camera, for example OpenCV. OpenCV is a cross-platform, open-source computer vision library that runs on Linux, Windows, and Mac OS. It is lightweight and efficient, built from a set of C functions and a small number of C++ classes; it provides interfaces for languages such as Python, Ruby, and MATLAB, and implements many general-purpose algorithms in image processing and computer vision. In essence, the above steps use OpenCV's cvFindContours method to find the contour of the hand image. Because human skin color is distributed in the YCbCr color space within 100 <= Cb <= 127 and 138 <= Cr <= 170, pixels within this range can be regarded as part of the hand image and all other pixels set to white. This preliminarily removes background colors with a large color difference and yields a relatively coarse hand image that still includes some background color; within it, the pixels adjacent to the white region approximate the hand contour.
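The YCbCr skin test described above can be sketched as follows — an illustrative NumPy implementation using the ITU-R BT.601 conversion and the Cb/Cr ranges quoted in the text, not the patent's actual code; in the real pipeline, OpenCV's findContours would then be run on the resulting mask:

```python
import numpy as np

def skin_mask(image_bgr):
    """Map each pixel to YCbCr (ITU-R BT.601) and keep those inside the
    skin range the patent cites: 100 <= Cb <= 127, 138 <= Cr <= 170.
    Everything else is treated as background and set to white."""
    b, g, r = [image_bgr[:, :, i].astype(np.float64) for i in range(3)]
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    mask = (cb >= 100) & (cb <= 127) & (cr >= 138) & (cr <= 170)
    out = image_bgr.copy()
    out[~mask] = 255          # non-skin pixels become white
    return mask, out
```

(The "cvFindCounters" in the original text is the legacy C-API name cvFindContours; in the Python binding the equivalent call is `cv2.findContours` applied to this mask.)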
Step S4) Calculate and store at least one palm-center coordinate from the contour coordinates of the hand image containing background color. This step is carried out by software on the host connected to the Kinect color camera, for example OpenCV, whose moments method can compute the position of the palm center and the three-dimensional coordinate of the palm-center point. First, findContours is used to locate the area of the hand (denoted HandArea) from the contour found above; then moments, taking HandArea as its parameter, yields the position of the palm center.
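A minimal sketch of the moments-based palm-center computation — the centroid (m10/m00, m01/m00) that OpenCV's moments method returns for the hand area. `palm_center` is an illustrative name, not from the patent:

```python
import numpy as np

def palm_center(mask):
    """Centroid of the hand region via image moments:
    cx = m10/m00, cy = m01/m00 over the True pixels of the mask."""
    ys, xs = np.nonzero(mask)
    m00 = xs.size                      # zeroth moment = pixel count
    if m00 == 0:
        return None                    # no hand region found
    return (xs.sum() / m00, ys.sum() / m00)   # (cx, cy)
```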
Step S5) Calculate the depth distance of the palm center from the palm-center coordinate and the depth data. This step is carried out by software on the host connected to the Kinect color camera, for example the Kinect SDK, which can compute the depth distance of the palm-center point.
Step S6) Set at least one hand-contour depth distance threshold according to a preset hand shape and the depth distance of the palm center. This step is carried out by software on the host connected to the Kinect color camera. The threshold is a distance from the camera, implemented as a single conditional test — for example 1 m to 3 m; the range can be set anywhere between 0.5 m and 5 m according to the demands of different systems. Hand shapes are preset in a computer database in advance, with different hand shapes corresponding to different gestures.
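A sketch of one way such a threshold band might be derived from the palm-center depth. The ±150 mm half-width is purely an assumption for illustration (roughly the extent of a hand around its palm center); the patent only fixes the admissible range at 0.5 m to 5 m:

```python
def depth_band(palm_depth_mm, half_width_mm=150):
    """Illustrative threshold: keep pixels whose depth lies within a band
    around the palm-center depth, clamped to the 0.5 m - 5 m range the
    patent allows."""
    near = max(500, palm_depth_mm - half_width_mm)
    far = min(5000, palm_depth_mm + half_width_mm)
    return near, far
```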
Step S7) Calculate the depth distance of each pixel within the hand contour containing background color, using that contour and the depth data. This step is carried out by software on the host connected to the Kinect color camera.
Step S8) Compare the depth distance of each pixel within the hand contour containing background color against the hand-contour depth distance threshold, and set to white every pixel whose depth distance falls outside the threshold range, thereby further removing the background color from the three-dimensional hand image. This step is carried out by software on the host connected to the Kinect color camera.
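Step S8) reduces to an elementwise comparison and a mask; a minimal NumPy sketch (function name illustrative, assuming a per-pixel depth map aligned with the color image):

```python
import numpy as np

def remove_background_by_depth(image_bgr, depth_mm, near, far):
    """Set to white every pixel whose depth distance falls outside
    the [near, far] threshold band, as in step S8 of the method."""
    outside = (depth_mm < near) | (depth_mm > far)
    out = image_bgr.copy()
    out[outside] = 255
    return out
```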
Step S9) Perform a second contour detection processing on the three-dimensional hand image, and obtain and store at least one hand contour. This step is carried out by software on the host connected to the Kinect color camera, for example OpenCV: its cvFindContours method finds the contour of the hand image a second time, with the same details as step S3). This further removes background color and yields a more accurate hand image, in which the pixels adjacent to the white region form the accurate hand contour.
The present invention uses a depth camera in combination with a color camera and superimposes the data of both, optimizing the evaluation algorithm and improving the precision of finger determination, so gesture recognition is more accurate. It effectively removes background-color interference and accurately recognizes the palm contour and palm-center coordinate; with depth recognition capability, it can recognize gestures with depth (e.g., pushing forward or bending) without using skeleton data and without requiring the whole human body to appear in the camera's field of view.
The above are only preferred embodiments of the present invention. It should be pointed out that those skilled in the art can make improvements and modifications without departing from the principles of the invention, and such improvements and modifications should also be regarded as falling within the scope of protection of the invention.

Claims (4)

1. A human hand contour recognition method, characterized by comprising:
acquiring at least one three-dimensional hand image with at least one Kinect color camera, and acquiring at least one set of depth data with at least one depth camera;
performing image enhancement processing on the three-dimensional hand image;
performing contour detection processing on the three-dimensional hand image to obtain at least one hand image containing background color, and storing the contour coordinates of the hand image containing background color;
calculating and storing at least one palm-center coordinate from the contour coordinates of the hand image containing background color;
calculating the depth distance of the palm center from the palm-center coordinate and the depth data;
setting at least one hand-contour depth distance threshold according to a preset hand shape and the depth distance of the palm center;
calculating the depth distance of each pixel within the hand contour containing background color, using that contour and the depth data;
comparing the depth distance of each pixel within the hand contour containing background color against the hand-contour depth distance threshold, and setting to white every pixel whose depth distance falls outside the threshold range, thereby removing the background color from the three-dimensional hand image; and
performing a second contour detection processing on the three-dimensional hand image, and obtaining and storing at least one hand contour.
2. The human hand contour recognition method of claim 1, characterized in that the image enhancement processing performed on the three-dimensional hand image comprises the following steps:
modifying the histogram, to adjust the image colors and even out the gray-scale distribution of the image; and
adjusting the resolution, setting it to 512 × 424.
3. The human hand contour recognition method of claim 1, characterized in that the contour detection processing performed on the three-dimensional hand image comprises the following steps:
defining the distribution of human skin color in the YCbCr color space;
computing, for each pixel of the three-dimensional hand image, its mapping into the YCbCr color space;
in the YCbCr color space, defining the set of pixels whose mapped values match human skin color as the hand image; and
defining the set of pixels at the boundary of the hand image as the hand contour corresponding to the three-dimensional hand image.
4. The human hand contour recognition method of claim 1, characterized in that the second contour detection processing performed on the three-dimensional hand image comprises the following steps:
defining the distribution of human skin color in the YCbCr color space;
computing, for each pixel of the three-dimensional hand image, its mapping into the YCbCr color space;
in the YCbCr color space, defining the set of pixels whose mapped values match human skin color as the hand image; and
defining the set of pixels at the boundary of the hand image as the hand contour corresponding to the three-dimensional hand image.
CN201510971701.4A 2015-12-22 2015-12-22 Staff outline identification method Pending CN106909872A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510971701.4A CN106909872A (en) 2015-12-22 2015-12-22 Staff outline identification method


Publications (1)

Publication Number Publication Date
CN106909872A true CN106909872A (en) 2017-06-30

Family

ID=59200716

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510971701.4A Pending CN106909872A (en) 2015-12-22 2015-12-22 Staff outline identification method

Country Status (1)

Country Link
CN (1) CN106909872A (en)



Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102368290A (en) * 2011-09-02 2012-03-07 华南理工大学 Hand gesture identification method based on finger advanced characteristic
WO2014009561A3 (en) * 2012-07-13 2014-05-01 Softkinetic Software Method and system for human-to-computer gesture based simultaneous interactions using singular points of interest on a hand
CN102982557A (en) * 2012-11-06 2013-03-20 桂林电子科技大学 Method for processing space hand signal gesture command based on depth camera
CN103941866A (en) * 2014-04-08 2014-07-23 河海大学常州校区 Three-dimensional gesture recognizing method based on Kinect depth image
CN103927016A (en) * 2014-04-24 2014-07-16 西北工业大学 Real-time three-dimensional double-hand gesture recognition method and system based on binocular vision
CN104778460A (en) * 2015-04-23 2015-07-15 福州大学 Monocular gesture recognition method under complex background and illumination

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108022227A (en) * 2017-12-29 2018-05-11 努比亚技术有限公司 A kind of black and white background photo acquisition methods, device and computer-readable recording medium
CN108564070A (en) * 2018-05-07 2018-09-21 京东方科技集团股份有限公司 Method for extracting gesture and its device
CN109164924A (en) * 2018-08-29 2019-01-08 陈介水 A kind of character entry method and the system for identifying character entry method
CN109164924B (en) * 2018-08-29 2022-06-24 陈介水 Character input method and system for recognizing character input method

Similar Documents

Publication Publication Date Title
CN103927016B (en) Real-time three-dimensional double-hand gesture recognition method and system based on binocular vision
CN104317391B (en) A kind of three-dimensional palm gesture recognition exchange method and system based on stereoscopic vision
CN110443205B (en) Hand image segmentation method and device
CN106705837B (en) Object measuring method and device based on gestures
CN106909871A (en) Gesture instruction recognition methods
CN114140867A (en) Eye pose recognition using eye features
CN104778460B (en) A kind of monocular gesture identification method under complex background and illumination
CN106504751A (en) Self adaptation lip reading exchange method and interactive device
CN106709931B (en) Method for mapping facial makeup to face and facial makeup mapping device
CN106774856A (en) Exchange method and interactive device based on lip reading
AU2011301774A1 (en) A method for enhancing depth maps
CN110288715B (en) Virtual necklace try-on method and device, electronic equipment and storage medium
WO2018082388A1 (en) Skin color detection method and device, and terminal
CN111047511A (en) Image processing method and electronic equipment
EP3905104B1 (en) Living body detection method and device
Lee et al. Emotional recognition from facial expression analysis using bezier curve fitting
CN106529502A (en) Lip language identification method and apparatus
CN106909872A (en) Staff outline identification method
Gu et al. Hand gesture interface based on improved adaptive hand area detection and contour signature
Geetha et al. Dynamic gesture recognition of Indian sign language considering local motion of hand using spatial location of Key Maximum Curvature Points
Abdallah et al. An overview of gesture recognition
CN110597397A (en) Augmented reality implementation method, mobile terminal and storage medium
CN111831123B (en) Gesture interaction method and system suitable for desktop mixed reality environment
JP5051671B2 (en) Information processing apparatus, information processing method, and program
CN107506731A (en) A kind of face identification system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication
Application publication date: 20170630