CN105447466B - Identity fusion recognition method based on a Kinect sensor - Google Patents

Identity fusion recognition method based on a Kinect sensor


Publication number
CN105447466B
CN105447466B (application CN201510862672.8A)
Authority
CN
China
Prior art keywords: accredited personnel, information, personnel, indicate, human
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510862672.8A
Other languages: Chinese (zh)
Other versions: CN105447466A (en)
Inventor
夏鹏
张倩
丘宇彬
朱易华
黄佳洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Turing Robot Co Ltd
Original Assignee
Shenzhen Turing Robot Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Turing Robot Co Ltd filed Critical Shenzhen Turing Robot Co Ltd
Priority to CN201510862672.8A
Publication of CN105447466A
Application granted
Publication of CN105447466B
Legal status: Active


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16: Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172: Classification, e.g. identification

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an identity fusion recognition method based on a Kinect sensor, comprising the following steps: one, use a Kinect sensor to acquire multiple groups of body features of a registered person, each group comprising face-image information, skin/hair-color information and body-height information; two, extract Haar-like features from the face-image information and obtain the registered person's face recognition classifier with an SVM; three, from the skin/hair-color information, obtain the registered person's skin/hair-color mixture-of-Gaussians model; four, from the body-height information, obtain the registered person's height mean and standard deviation; five, store the results of steps two, three and four in a database to complete the registered person's registration; six, capture the body features of the current person with the Kinect sensor and compare them with the registration records in the database to determine the current person's identity.

Description

Identity fusion recognition method based on a Kinect sensor
Technical field
The present invention relates to identity recognition methods, and in particular to an identity fusion recognition method based on a Kinect sensor that fuses height information, skin/hair-color information and face-image information.
Background technique
In recent years, with breakthroughs in computing, the internet and artificial intelligence, robots have gradually entered people's daily lives and now provide services of many kinds. In service-robot applications, the exchange of information between robot and human is the premise of good service, and the robot's ability to quickly and correctly identify a person, as owner, client, stranger and so on, is the basic guarantee that it can treat different people differently.
In the field of artificial intelligence, identity authentication can be realized by many techniques, such as password entry, ID-card swiping, fingerprint authentication and iris recognition. Although these techniques are widely used and offer good uniqueness and confidentiality, they are ill-suited to service robots. The reason is simple: people prefer to interact with a robot naturally, as they would with another person, rather than gain control of it through cumbersome means such as typing a password or swiping a card. Such authentication methods are therefore better reserved for robot administrators or maintenance staff.
At present, the most common authentication means in robotics is image-based face recognition, which is simple, natural and contactless. Face recognition nevertheless has inherent shortcomings. On the one hand, the process requires the subject's cooperation in presenting a frontal face, recognition degrades under poor lighting, and accuracy is low; on the other hand, the robot is easily deceived by an intruder holding up a photograph. In household scenarios in particular, where the robot mainly serves family members, relying on face recognition alone means frequently asking the owner to pose frontally, which is inconvenient, inflexible and uncomfortable to use.
Summary of the invention
The object of the present invention is to provide an identity fusion recognition method based on a Kinect sensor that is easy to implement, fast to recognize and highly accurate. In household settings, a service robot using the invention does not need to ask the owner for frequent cooperation and can perform identity detection and recognition in most situations; the method is therefore highly practical.
To solve the technical problems that the authentication methods commonly used in the prior art are unsuitable for service robots, and that face recognition based on images alone requires frequent cooperation from the owner, performs poorly under bad lighting and is easily deceived by an intruder's photograph, the identity fusion recognition method based on a Kinect sensor provided by the invention comprises a registration process and an identification process, and specifically includes the following steps:
One, have the registered person rotate his or her face at multiple angles in front of the Kinect sensor and perform different limb movements at different positions, so as to acquire multiple groups of the registered person's body features, each group comprising face-image information, skin/hair-color information and body-height information;
Two, extract Haar-like features from the registered person's multiple groups of face-image information and separately train a face recognition classifier with an SVM, obtaining the registered person's face recognition classifier;
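Step two can be sketched as follows. This is a minimal illustration, assuming scikit-learn for the SVM; the Haar-like extractor below is a simplified two-rectangle response over an integral image, not the patent's exact feature set, and the training images and labels are synthetic stand-ins for the registered person's captured faces.

```python
# Sketch of step two: Haar-like features + SVM classifier (illustrative only).
import numpy as np
from sklearn.svm import SVC

def haar_like_features(gray):
    """Crude two-rectangle Haar responses over a fixed grid (illustrative)."""
    ii = gray.cumsum(0).cumsum(1)          # integral image
    h, w = gray.shape
    feats = []
    for y in range(0, h - 8, 8):
        for x in range(0, w - 8, 8):
            left  = ii[y+8, x+4] - ii[y, x+4] - ii[y+8, x] + ii[y, x]
            right = ii[y+8, x+8] - ii[y, x+8] - ii[y+8, x+4] + ii[y, x+4]
            feats.append(left - right)     # horizontal edge response
    return np.asarray(feats, dtype=np.float64)

# Toy training set: faces of the registered person (label 0) vs. others (label 1)
rng = np.random.default_rng(0)
X = np.stack([haar_like_features(rng.integers(0, 256, (32, 32)).astype(np.float64))
              for _ in range(20)])
y = np.array([0] * 10 + [1] * 10)          # hypothetical labels
clf = SVC(kernel="linear").fit(X, y)       # the per-person classifier result
```

In practice one such classifier would be trained per registered person from the multiple face images captured during registration.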
Three, the multiple groups colour of skin/color development information based on the accredited personnel passes through the accumulative colour of skin/hair for obtaining the accredited personnel Mixture of colours Gauss model;
Four, the multiple groups Human Height information based on the accredited personnel, the height for obtaining the accredited personnel by calculating are average Value and standard deviation;
It five, will be Step 2: the result that step 3 and step 4 obtain be stored in database to complete the information of the accredited personnel It registers, and completes the information registering of all accredited personnel according to the logon mode of the accredited personnel;
Six, after the completion of registering, using the characteristics of human body of Kinect sensor capture current persons, by the human body of current persons Feature is made comparisons with the registration information of accredited personnel in database, and the identity of current persons is determined according to comparison result.
Further, in step one of the identity fusion recognition method based on a Kinect sensor of the present invention, the face-image information of the registered person is obtained in the following way:
(1) acquire a depth image and a color image containing the registered person with the Kinect sensor, and recover the registered person's skeleton-joint information from the depth data in the depth image, wherein
the torso comprises the crown, jaw, chest, abdomen and hip, denoted in order C1, C2, C3, C4, C5;
the left arm comprises the left fingertip, left wrist, left elbow and left shoulder, denoted in order L1, L2, L3, L4;
the right arm comprises the right fingertip, right wrist, right elbow and right shoulder, denoted in order R1, R2, R3, R4;
the left leg comprises the left toe, left ankle, left knee and left hip, denoted in order E1, E2, E3, E4;
the right leg comprises the right toe, right ankle, right knee and right hip, denoted in order F1, F2, F3, F4;
(2) take the line through joints C1 and C2 of the registered person's skeleton as an axis and extract the head region of the color image with a human-body segmentation method, giving the head image;
(3) test the head image for a face with a face detection algorithm; if a face is present, capture the face image as the registered person's face-image information, otherwise conclude that no face is present.
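The head-region crop of sub-step (2) can be sketched as follows. The joint pixel coordinates and the box proportions around the crown-to-jaw axis are illustrative assumptions, not values from the patent; the face test of sub-step (3) (e.g. a cascade detector run on the crop) is left as a comment.

```python
# Sketch of sub-step (2): crop a head box around the crown-to-jaw axis C1->C2.
import numpy as np

def head_region(color, c1, c2, width_ratio=1.4):
    """Crop the head from `color` given crown c1 and jaw c2 pixel coords (x, y)."""
    c1, c2 = np.asarray(c1, float), np.asarray(c2, float)
    axis_len = np.linalg.norm(c2 - c1)         # crown-to-jaw length in pixels
    cx, cy = (c1 + c2) / 2.0                   # head centre on the axis
    half_w = width_ratio * axis_len / 2.0      # assumed head width ratio
    half_h = axis_len * 0.75                   # assumed vertical margin
    y0, y1 = int(max(cy - half_h, 0)), int(min(cy + half_h, color.shape[0]))
    x0, x1 = int(max(cx - half_w, 0)), int(min(cx + half_w, color.shape[1]))
    return color[y0:y1, x0:x1]

frame = np.zeros((480, 640, 3), np.uint8)      # stand-in for the color image
head = head_region(frame, c1=(320, 100), c2=(320, 160))
# Sub-step (3) would then run a face detector on `head`,
# e.g. an OpenCV Haar cascade, and keep the crop only if a face is found.
```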
Further, in step one of the identity fusion recognition method based on a Kinect sensor of the present invention, the skin/hair-color information of the registered person is obtained in the following way:
(1) convert the registered person's head image from the RGB color space to YCbCr and, for each pixel of the head image, test whether its CbCr chroma components belong to the basic skin-color distribution U(Cb, Cr); label the pixel 1 if so and 0 otherwise;
(2) from the judgement and labels of step (1), take all pixels labeled 1 as one set and compute the mean of their CbCr components and the corresponding covariance matrix, giving the skin-color single-Gaussian model; the CbCr mean is denoted μ1, the covariance matrix σ1², and the skin-color single-Gaussian model N1(μ1, σ1²);
(3) from the judgement and labels of step (1), take all pixels labeled 0 as one set and compute the mean of their CbCr components and the corresponding covariance matrix, giving the hair-color single-Gaussian model; the CbCr mean is denoted μ2, the covariance matrix σ2², and the hair-color single-Gaussian model N2(μ2, σ2²).
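Sub-steps (1) to (3) can be sketched as follows: convert to YCbCr, split pixels with a chroma gate, and fit one Gaussian per set over (Cb, Cr). The gate bounds below are a common skin-chroma heuristic, not the patent's exact U(Cb, Cr) distribution, and the two-row test patch is a synthetic stand-in for a head image.

```python
# Sketch of the skin/hair single-Gaussian models over CbCr chroma.
import numpy as np

def rgb_to_cbcr(rgb):
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b   # ITU-R BT.601
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return np.stack([cb, cr], axis=-1)

def skin_hair_gaussians(head_rgb):
    cbcr = rgb_to_cbcr(head_rgb.astype(np.float64)).reshape(-1, 2)
    skin = (cbcr[:, 0] > 77) & (cbcr[:, 0] < 127) & \
           (cbcr[:, 1] > 133) & (cbcr[:, 1] < 173)      # label-1 pixels
    def fit(px):                                        # N(mu, Sigma)
        return px.mean(axis=0), np.cov(px, rowvar=False)
    mu1, s1 = fit(cbcr[skin])                           # skin model N1
    mu2, s2 = fit(cbcr[~skin])                          # hair model N2
    return mu1, s1, mu2, s2                             # mixed model N

patch = np.zeros((2, 32, 3), np.uint8)
patch[0, :] = (200, 140, 120)   # skin-toned row (passes the chroma gate)
patch[1, :] = (10, 10, 60)      # dark bluish row (fails the gate: hair set)
mu1, s1, mu2, s2 = skin_hair_gaussians(patch)
```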
Further, in step three of the identity fusion recognition method based on a Kinect sensor of the present invention, the skin/hair-color mixture-of-Gaussians model is N = (μ1, σ1², μ2, σ2²).
Further, in step one of the identity fusion recognition method based on a Kinect sensor of the present invention, the body-height information of the registered person is obtained in the following way:
(1) divide the skeleton joints into five groups: group 1 is (C1, C2, C3, C4, C5), group 2 is (L1, L2, L3, L4), group 3 is (R1, R2, R3, R4), group 4 is (E1, E2, E3, E4) and group 5 is (F1, F2, F3, F4);
(2) fit a three-dimensional straight line to each joint group by least squares and compute each line's fitting error, denoted Δ1, Δ2, Δ3, Δ4, Δ5 respectively;
(3) when the errors Δ1, Δ2, Δ3, Δ4, Δ5 are all below a set threshold T, consider the body to be straight at every joint and compute the body height, denoted H, as follows;
α = (Δ4 + Δ5) / (Δ1 + Δ2 + Δ3 + Δ4 + Δ5)   (6)
H = α(H1 + max(H2, H3)) + 2(1 − α)max(A1, A2)   (7)
In formulas (1) to (5), which give the segment lengths H1, H2, H3, A1 and A2, |C1C2| denotes the three-dimensional distance between joints C1 and C2, and likewise |C2C3|, |C3C4| and |C4C5| along the torso; |E1E2|, |E2E3| and |E3E4| along the left leg; |F1F2|, |F2F3| and |F3F4| along the right leg; |L1L2|, |L2L3|, |L3L4| and |L4C3| along the left arm; and |R1R2|, |R2R3|, |R3R4| and |R4C3| along the right arm.
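The posture test of sub-steps (2) and (3) can be sketched as follows: fit a 3-D line to each joint group by least squares (via the principal direction of the centered points) and use the RMS perpendicular residual as the fitting error Δ. The height blend of formula (7) needs the segment lengths H1, H2, H3, A1 and A2 from formulas (1) to (5), so only the straightness test and the weight α of formula (6) are shown; the threshold T, the torso coordinates and the Δ2 to Δ5 values are illustrative assumptions.

```python
# Sketch of the straightness test and the weight alpha of formula (6).
import numpy as np

def line_fit_error(points):
    """RMS distance of 3-D points to their least-squares line."""
    p = np.asarray(points, float)
    centered = p - p.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    proj = centered @ vt[0]                      # coords along the fitted line
    resid = centered - np.outer(proj, vt[0])     # perpendicular offsets
    return float(np.sqrt((resid ** 2).sum(axis=1).mean()))

# Collinear torso joints C1..C5 (assumed metre coordinates) -> error near zero
torso = [(0, 1.70, 2), (0, 1.55, 2), (0, 1.30, 2), (0, 1.10, 2), (0, 0.95, 2)]
deltas = [line_fit_error(torso), 0.004, 0.004, 0.006, 0.006]  # D1; D2..D5 assumed
T = 0.02                                         # assumed threshold (metres)
upright = all(d < T for d in deltas)             # sub-step (3) posture check
alpha = (deltas[3] + deltas[4]) / sum(deltas)    # formula (6)
```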
Further, in step six of the identity fusion recognition method based on a Kinect sensor of the present invention, the body features of the current person are captured with the Kinect sensor and compared with the registration records in the database to determine the current person's identity, specifically through the following steps:
(1) obtain the current person's body-height information and skin/hair-color information with the Kinect sensor;
(2) compute the current person's height from the body-height information, query the registration records in the database and traverse each registered person's interval [h − 3Δh, h + 3Δh], where h is the registered person's height mean and Δh the standard deviation, to decide whether any registered person matches the current person's height; if there is a match and it is unique, directly identify the current person as the corresponding registered person, end the identification and output the result; if there are matches but none is unique, go to step (3); if there is no match, go to step (4);
(3) build the current person's skin/hair-color mixture-of-Gaussians model from the current person's skin/hair-color information; since step (2) found height matches but no unique one, take those matches as the candidate set of registered persons and test whether exactly one candidate's skin/hair-color model matches the current person's; if a unique match exists, identify the current person as that registered person, end the identification and output the result; otherwise go to step (4);
(4) issue a voice instruction asking the current person to turn a frontal face toward the Kinect sensor, and acquire the current person's face-image information;
(5) query the registered persons' information in the database with the current person's face-image information and decide whether a matching registered person exists; if so, identify the current person as the corresponding registered person, end the identification and output the result; if not, identify the current person as a stranger or ask the current person to register;
wherein the current person's body-height, skin/hair-color and face-image information is obtained with the Kinect sensor in the same way as during registration.
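The step-six cascade above can be sketched as follows: match height within [h − 3Δh, h + 3Δh], fall back to the skin/hair-color model on ambiguity, then to face recognition. The `Registrant` record and the `chroma_match`/`face_match` callables are illustrative stand-ins for the database and the trained models; they are not part of the patent's text.

```python
# Hedged sketch of the identification cascade of step six.
from dataclasses import dataclass

@dataclass
class Registrant:
    name: str
    h: float        # registered height mean
    dh: float       # registered height standard deviation

def identify(height, registrants, chroma_match=None, face_match=None):
    """Return a registrant name or 'stranger' following the cascade."""
    hits = [r for r in registrants if r.h - 3*r.dh <= height <= r.h + 3*r.dh]
    if len(hits) == 1:                       # unique height match: done
        return hits[0].name
    if len(hits) > 1 and chroma_match:       # narrow by skin/hair-color model
        chroma_hits = [r for r in hits if chroma_match(r)]
        if len(chroma_hits) == 1:
            return chroma_hits[0].name
    if face_match:                           # ask for a frontal face
        for r in registrants:
            if face_match(r):
                return r.name
    return "stranger"                        # or ask the person to register

db = [Registrant("A", 1.75, 0.01), Registrant("B", 1.60, 0.01)]
print(identify(1.74, db))                    # unique height match: prints A
```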
Compared with the prior art, the identity fusion recognition method based on a Kinect sensor of the present invention has the following advantages. The method is built on Microsoft's Kinect sensor, which performs depth measurement with active infrared and thus largely escapes the influence of lighting conditions and occlusion, delivering a color image and a depth image of the scene in real time. From these two images the method extracts a person's face-image, skin/hair-color and height information and fuses the three during both registration and identification, which markedly improves recognition accuracy. The method is easy to implement, unsophisticated to use, contactless and requires no wearable device; in household settings in particular, a robot using it can identify different family members conveniently and quickly without repeatedly asking for the owner's cooperation, and is therefore highly practical.
The identity fusion recognition method based on a Kinect sensor of the present invention is described in further detail below with reference to the accompanying drawings and an illustrated embodiment:
Detailed description of the invention
Fig. 1 is a schematic diagram of the human skeleton used in the identity fusion recognition method based on a Kinect sensor of the present invention;
Fig. 2 is a flow chart of the identification process of the identity fusion recognition method based on a Kinect sensor of the present invention.
Specific embodiment
The application of a robot equipped with a Kinect sensor to recognizing different family members in a home environment is described below as a specific embodiment of the identity fusion recognition method based on a Kinect sensor of the present invention. It should first be noted that Microsoft's Kinect sensor can capture depth and color images in real time, and that the Kinect SDK can detect a human target within the field of view and recover its skeleton.
The identity fusion recognition method based on a Kinect sensor of the present invention comprises a registration process and an identification process, and specifically includes the following steps:
One, have the registered person rotate his or her face at multiple angles in front of the Kinect sensor and perform different limb movements at different positions, so as to acquire multiple groups of the registered person's body features, each group comprising face-image information, skin/hair-color information and body-height information;
Two, extract Haar-like features from the registered person's multiple groups of face-image information and separately train a face recognition classifier with an SVM, obtaining the registered person's face recognition classifier;
Three, from the registered person's multiple groups of skin/hair-color information, accumulate the registered person's skin/hair-color mixture-of-Gaussians model;
Four, from the registered person's multiple groups of body-height information, compute the registered person's height mean and standard deviation;
Five, store the results of steps two, three and four in a database to complete the registered person's registration, and register all persons in the same manner;
Six, after registration is complete, capture the body features of the current person with the Kinect sensor, compare the current person's body features with the registration records in the database, and determine the current person's identity from the comparison.
In step one above, the face-image information of the registered person is obtained in the following way:
(1) acquire a depth image and a color image containing the registered person with the Kinect sensor, and recover the registered person's skeleton-joint information from the depth data in the depth image, as shown in Fig. 1, wherein
the torso comprises the crown, jaw, chest, abdomen and hip, denoted in order C1, C2, C3, C4, C5;
the left arm comprises the left fingertip, left wrist, left elbow and left shoulder, denoted in order L1, L2, L3, L4;
the right arm comprises the right fingertip, right wrist, right elbow and right shoulder, denoted in order R1, R2, R3, R4;
the left leg comprises the left toe, left ankle, left knee and left hip, denoted in order E1, E2, E3, E4;
the right leg comprises the right toe, right ankle, right knee and right hip, denoted in order F1, F2, F3, F4;
(2) take the line through joints C1 and C2 of the registered person's skeleton as an axis and extract the head region of the color image with a human-body segmentation method, giving the head image;
(3) test the head image for a face with a face detection algorithm; if a face is present, capture the face image as the registered person's face-image information, otherwise conclude that no face is present.
In step one above, the skin/hair-color information of the registered person is obtained in the following way:
(1) convert the registered person's head image from the RGB color space to YCbCr and, for each pixel of the head image, test whether its CbCr chroma components belong to the basic skin-color distribution U(Cb, Cr); label the pixel 1 if so and 0 otherwise;
(2) from the judgement and labels of step (1), take all pixels labeled 1 as one set and compute the mean of their CbCr components and the corresponding covariance matrix, giving the skin-color single-Gaussian model; the CbCr mean is denoted μ1, the covariance matrix σ1², and the skin-color single-Gaussian model N1(μ1, σ1²);
(3) from the judgement and labels of step (1), take all pixels labeled 0 as one set and compute the mean of their CbCr components and the corresponding covariance matrix, giving the hair-color single-Gaussian model; the CbCr mean is denoted μ2, the covariance matrix σ2², and the hair-color single-Gaussian model N2(μ2, σ2²).
In step three above, the skin/hair-color mixture-of-Gaussians model is N = (μ1, σ1², μ2, σ2²).
In step one above, the body-height information of the registered person is obtained in the following way:
(1) divide the skeleton joints into five groups: group 1 is (C1, C2, C3, C4, C5), group 2 is (L1, L2, L3, L4), group 3 is (R1, R2, R3, R4), group 4 is (E1, E2, E3, E4) and group 5 is (F1, F2, F3, F4);
(2) fit a three-dimensional straight line to each joint group by least squares and compute each line's fitting error, denoted Δ1, Δ2, Δ3, Δ4, Δ5 respectively;
(3) when the errors Δ1, Δ2, Δ3, Δ4, Δ5 are all below a set threshold T, consider the body to be straight at every joint and compute the body height, denoted H, as follows;
α = (Δ4 + Δ5) / (Δ1 + Δ2 + Δ3 + Δ4 + Δ5)   (6)
H = α(H1 + max(H2, H3)) + 2(1 − α)max(A1, A2)   (7)
In formulas (1) to (5), which give the segment lengths H1, H2, H3, A1 and A2, |C1C2| denotes the three-dimensional distance between joints C1 and C2, and likewise |C2C3|, |C3C4| and |C4C5| along the torso; |E1E2|, |E2E3| and |E3E4| along the left leg; |F1F2|, |F2F3| and |F3F4| along the right leg; |L1L2|, |L2L3|, |L3L4| and |L4C3| along the left arm; and |R1R2|, |R2R3|, |R3R4| and |R4C3| along the right arm.
In step six above, the body features of the current person are captured with the Kinect sensor and compared with the registration records in the database to determine the current person's identity, as charted in Fig. 2, specifically through the following steps:
(1) obtain the current person's body-height information and skin/hair-color information with the Kinect sensor;
(2) compute the current person's height from the body-height information, query the registration records in the database and traverse each registered person's interval [h − 3Δh, h + 3Δh], where h is the registered person's height mean and Δh the standard deviation, to decide whether any registered person matches the current person's height; if there is a match and it is unique, directly identify the current person as the corresponding registered person, end the identification and output the result; if there are matches but none is unique, go to step (3); if there is no match, go to step (4);
(3) build the current person's skin/hair-color mixture-of-Gaussians model from the current person's skin/hair-color information; since step (2) found height matches but no unique one, take those matches as the candidate set of registered persons and test whether exactly one candidate's skin/hair-color model matches the current person's; if a unique match exists, identify the current person as that registered person, end the identification and output the result; otherwise go to step (4);
(4) issue a voice instruction asking the current person to turn a frontal face toward the Kinect sensor, and acquire the current person's face-image information;
(5) query the registered persons' information in the database with the current person's face-image information and decide whether a matching registered person exists; if so, identify the current person as the corresponding registered person, end the identification and output the result; if not, identify the current person as a stranger or ask the current person to register.
It should be noted that the current person's body-height, skin/hair-color and face-image information is obtained with the Kinect sensor in the same way as the registered person's corresponding information during registration.
The above embodiment describes only a preferred embodiment of the present invention and does not limit the scope of protection claimed; any variation of the technical solution made by engineers and technicians in this field without departing from the design principle and spirit of the present invention shall fall within the protection scope determined by the claims of the present invention.

Claims (3)

1. An integrated identity recognition method based on a Kinect sensor, characterized by comprising a registration process and an identification process, and specifically comprising the following steps:
Step 1: Have the person to be registered (the registrant) rotate his or her face through multiple angles in front of the Kinect sensor and perform different limb movements at different positions, so as to obtain multiple groups of human-body features of the registrant, each group comprising face image information, skin/hair color information, and body height information; the registrant's skin/hair color information is obtained as follows:
(1) Convert the registrant's head image from the RGB color space to the YCbCr color space, and for each pixel of the head image judge whether its CbCr chrominance components belong to the basic skin-color distribution U(Cb, Cr); label the pixel 1 if they do, 0 if they do not;
(2) Based on the judgments and labels of step (1), gather all pixels labeled 1 into one set and compute the mean of their CbCr chrominance components and the corresponding CbCr covariance matrix, which together form the single-Gaussian skin-color model; the CbCr mean is denoted μ1, the covariance matrix σ1², and the single-Gaussian skin-color model N1(μ1, σ1²);
(3) Based on the judgments and labels of step (1), gather all pixels labeled 0 into one set and compute the mean of their CbCr chrominance components and the corresponding CbCr covariance matrix, which together form the single-Gaussian hair-color model; the CbCr mean is denoted μ2, the covariance matrix σ2², and the single-Gaussian hair-color model N2(μ2, σ2²);
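As an illustrative sketch (not part of the patent), the single-Gaussian CbCr models of steps (2) and (3) can be computed as below; the ITU-R BT.601 conversion coefficients and all function names are assumptions made for this example:

```python
import numpy as np

def rgb_to_cbcr(rgb):
    """BT.601 chrominance (Cb, Cr) of an (N, 3) RGB array in [0, 255]."""
    rgb = np.asarray(rgb, dtype=float)
    r, g, b = rgb[:, 0], rgb[:, 1], rgb[:, 2]
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return np.stack([cb, cr], axis=1)

def single_gaussian(cbcr):
    """Mean vector and covariance matrix of CbCr samples, i.e. the
    single-Gaussian model N(mu, sigma^2) described in the claim."""
    cbcr = np.asarray(cbcr, dtype=float)
    mu = cbcr.mean(axis=0)
    sigma2 = np.cov(cbcr, rowvar=False)
    return mu, sigma2
```

Applied once to the pixels labeled 1 and once to those labeled 0, this yields N1(μ1, σ1²) and N2(μ2, σ2²) respectively.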
The registrant's body height information is obtained as follows:
(1) Divide the joints of the human skeleton into five groups: group 1 is (C1, C2, C3, C4, C5), group 2 is (L1, L2, L3, L4), group 3 is (R1, R2, R3, R4), group 4 is (E1, E2, E3, E4), and group 5 is (F1, F2, F3, F4);
(2) Fit a straight line in 3D space to each joint group by the least-squares method, and compute each line's fitting error, denoted Δ1, Δ2, Δ3, Δ4, Δ5 respectively;
(3) When the errors Δ1, Δ2, Δ3, Δ4, Δ5 are all below a given threshold T1, consider the body to be straight at every joint, and compute the body height H as follows:
α = (Δ4 + Δ5) / (Δ1 + Δ2 + Δ3 + Δ4 + Δ5)   (6)
H = α(H1 + max(H2, H3)) + 2(1 − α)max(A1, A2)   (7)
In formulas (1) through (5) above, d(X, Y) denotes the 3D distance between two joints X and Y; these formulas use the distances d(C1, C2), d(C2, C3), d(C3, C4), d(C4, C5) along the torso; d(E1, E2), d(E2, E3), d(E3, E4) along the left leg; d(F1, F2), d(F2, F3), d(F3, F4) along the right leg; d(L1, L2), d(L2, L3), d(L3, L4), d(L4, C3) along the left arm; and d(R1, R2), d(R2, R3), d(R3, R4), d(R4, C3) along the right arm;
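A minimal sketch, under stated assumptions, of the geometric primitives behind steps (2) and (3): a least-squares 3D line fit via SVD (one reasonable reading of the unspecified "fitting error", taken here as the RMS point-to-line distance) and the summed inter-joint distances that formulas (1) through (5) presumably accumulate. Function names are illustrative, not the patent's:

```python
import numpy as np

def line_fit_error(points):
    """RMS distance of 3D joint points to their least-squares line,
    i.e. the line through the centroid along the principal direction."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    centered = pts - centroid
    # Principal direction via SVD of the centered point cloud.
    _, _, vt = np.linalg.svd(centered)
    direction = vt[0]
    # Residual = component of each centered point orthogonal to the line.
    along = centered @ direction
    residual = centered - np.outer(along, direction)
    return float(np.sqrt((residual ** 2).sum(axis=1).mean()))

def chain_length(points):
    """Sum of 3D distances between consecutive joints of one group."""
    pts = np.asarray(points, dtype=float)
    return float(np.linalg.norm(np.diff(pts, axis=0), axis=1).sum())
```

A joint group whose `line_fit_error` falls below the threshold T1 is treated as straight, and its `chain_length` contributes to the segment lengths combined by formula (7).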
The registrant's face image information is obtained as follows:
(1) Use the Kinect sensor to acquire a depth image and a color image containing the registrant, and recover the joints of the registrant's skeleton from the data in the depth image, where:
the torso comprises the crown of the head, jaw, chest, abdomen, and hip, denoted C1, C2, C3, C4, C5 in that order;
the left arm comprises the left fingertip, left wrist, left elbow, and left shoulder, denoted L1, L2, L3, L4 in that order;
the right arm comprises the right fingertip, right wrist, right elbow, and right shoulder, denoted R1, R2, R3, R4 in that order;
the left leg comprises the left toe, left ankle, left knee, and left hip, denoted E1, E2, E3, E4 in that order;
the right leg comprises the right toe, right ankle, right knee, and right hip, denoted F1, F2, F3, F4 in that order;
(2) Taking the line through joints C1 and C2 of the registrant's skeleton as an axis, extract the head region from the color image with a human-body segmentation method to obtain the head image;
(3) Use a face detection algorithm to judge whether the head image contains a face; if it does, capture the face image as the registrant's face image information; otherwise conclude that no face is present;
Step 2: From the registrant's multiple groups of face image information, extract Haar-like features and train an individual face recognition classifier with the SVM algorithm, obtaining that registrant's face recognition classifier;
Step 3: From the registrant's multiple groups of skin/hair color information, accumulate the registrant's skin/hair Gaussian mixture model;
Step 4: From the registrant's multiple groups of body height information, compute the mean and standard deviation of the registrant's height;
Step 5: Store the results of Steps 2, 3, and 4 in the database to complete that registrant's registration, and register all registrants in the same manner;
Step 6: After registration is complete, capture the human-body features of the current person with the Kinect sensor, compare them against the registered persons' information in the database, and determine the current person's identity from the comparison result.
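For illustration, one two-rectangle Haar-like feature of the kind used in Step 2 can be computed from an integral image as follows; this is a sketch with assumed names, not the patent's implementation, and a real classifier would evaluate many such features over the face window:

```python
import numpy as np

def integral_image(img):
    """Summed-area table: ii[y, x] = sum of img[0..y, 0..x] inclusive."""
    return np.cumsum(np.cumsum(np.asarray(img, dtype=float), axis=0), axis=1)

def haar_two_rect_vertical(ii, y, x, h, w):
    """Two-rectangle Haar-like feature: top half minus bottom half of a
    (2h, w) window anchored at (y, x), from the integral image ii."""
    def rect_sum(y0, x0, hh, ww):
        y1, x1 = y0 + hh, x0 + ww
        total = ii[y1 - 1, x1 - 1]
        if y0 > 0:
            total -= ii[y0 - 1, x1 - 1]
        if x0 > 0:
            total -= ii[y1 - 1, x0 - 1]
        if y0 > 0 and x0 > 0:
            total += ii[y0 - 1, x0 - 1]
        return total
    return rect_sum(y, x, h, w) - rect_sum(y + h, x, h, w)
```

The integral image makes each rectangle sum a constant-time lookup, which is what makes dense Haar-like feature extraction cheap enough for face classification.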
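The per-registrant record built by Steps 2 through 5 can be sketched as below; this is an illustrative data layout under assumed names, with the classifier and color model kept as opaque objects, not the patent's database schema:

```python
import statistics

class Registration:
    """Aggregate the multi-group features captured during registration.
    Heights are reduced to a mean and standard deviation (Step 4); the
    color model (Step 3) and face classifier (Step 2) are stored as-is."""
    def __init__(self):
        self.records = {}

    def register(self, person_id, heights, color_model, face_classifier):
        h = statistics.mean(heights)
        dh = statistics.stdev(heights) if len(heights) > 1 else 0.0
        self.records[person_id] = {
            "h": h, "dh": dh,            # height statistics for Step 6
            "color": color_model,        # GMM N = (mu1, s1^2, mu2, s2^2)
            "face": face_classifier,     # SVM over Haar-like features
        }
```

Step 6 then reads these records back to perform the height, color, and face comparisons against the current person.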
2. The integrated identity recognition method based on a Kinect sensor according to claim 1, characterized in that: in Step 3, the skin/hair Gaussian mixture model is N = (μ1, σ1², μ2, σ2²).
3. The integrated identity recognition method based on a Kinect sensor according to claim 1, characterized in that: in Step 6, capturing the human-body features of the current person with the Kinect sensor and comparing them against the registered persons' information in the database to determine the current person's identity specifically comprises the following steps:
(1) Obtain the current person's body height information and skin/hair color information with the Kinect sensor;
(2) Compute the current person's height from the body height information, query the registered persons' information in the database, and traverse each registered person's interval [h − 3Δh, h + 3Δh] to judge whether any registered person's height matches the current person's. If there is a unique match, identify the current person directly as that registered person, end the identification, and output the result; if there are matches but none is unique, proceed to step (3) below; if there is no match, proceed to step (4) below; here h denotes a registered person's mean height and Δh the corresponding standard deviation;
(3) From the current person's skin/hair color information, build the current person's skin/hair Gaussian mixture model. On the basis of step (2), under the condition that registered persons match the current person's height but not uniquely, restrict the candidate set to those registered persons and judge whether exactly one of them has a skin/hair color model matching the current person's. If a uniquely matching registered person exists, identify the current person as that registered person, end the identification, and output the result; if no unique match exists, proceed to step (4) below;
(4) Issue a voice instruction asking the current person to face the Kinect sensor frontally, and capture the current person's face image information;
(5) Using the current person's face image information, query the registered persons' information in the database and judge whether a matching registered person exists. If one exists, identify the current person as that registered person, end the identification, and output the result; if no registered person matches, classify the current person as a stranger or require the current person to re-register;
Here, the procedures for obtaining the current person's body height information, skin/hair color information, and face image information with the Kinect sensor are the same as the corresponding procedures used during registration.
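The screening tests of steps (2) and (3) can be sketched as follows; the function names are illustrative, and the color comparison here uses a simple mean-difference threshold as a stand-in for a proper comparison of the Gaussian models, which the claim does not spell out:

```python
def in_height_band(height, h, dh):
    """Step (2): does `height` fall in a registrant's [h-3dh, h+3dh] band?"""
    return h - 3 * dh <= height <= h + 3 * dh

def unique_color_match(person_model, candidates, threshold):
    """Step (3): among the height candidates, keep those whose skin/hair
    model mean is close to the current person's; succeed only if exactly
    one remains. `person_model` and each candidate value are (mu, sigma2)
    pairs; the mean-difference test is an illustrative stand-in."""
    close = [pid for pid, (mu, sigma2) in candidates.items()
             if abs(person_model[0] - mu) <= threshold]
    return close[0] if len(close) == 1 else None
```

If `unique_color_match` returns None, either no candidate or several candidates remain, and the method falls through to the face recognition of steps (4) and (5).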
CN201510862672.8A 2015-12-01 2015-12-01 A kind of identity integrated recognition method based on Kinect sensor Active CN105447466B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510862672.8A CN105447466B (en) 2015-12-01 2015-12-01 A kind of identity integrated recognition method based on Kinect sensor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510862672.8A CN105447466B (en) 2015-12-01 2015-12-01 A kind of identity integrated recognition method based on Kinect sensor

Publications (2)

Publication Number Publication Date
CN105447466A CN105447466A (en) 2016-03-30
CN105447466B true CN105447466B (en) 2019-07-23

Family

ID=55557626

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510862672.8A Active CN105447466B (en) 2015-12-01 2015-12-01 A kind of identity integrated recognition method based on Kinect sensor

Country Status (1)

Country Link
CN (1) CN105447466B (en)

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105773633B (en) * 2016-04-14 2018-04-20 中南大学 Mobile robot man-machine control system based on face location and sensitivity parameter
JP6688990B2 (en) * 2016-04-28 2020-04-28 パナソニックIpマネジメント株式会社 Identification device, identification method, identification program, and recording medium
CN106599785B (en) * 2016-11-14 2020-06-30 深圳奥比中光科技有限公司 Method and equipment for establishing human body 3D characteristic identity information base
CN106652291A (en) * 2016-12-09 2017-05-10 华南理工大学 Indoor simple monitoring and alarming system and method based on Kinect
CN106778615B (en) * 2016-12-16 2019-10-18 中新智擎科技有限公司 A kind of method, apparatus and service for infrastructure robot identifying user identity
CN106934377B (en) * 2017-03-14 2020-03-17 新疆智辰天林信息科技有限公司 Improved human face detection system
TWI604332B (en) * 2017-03-24 2017-11-01 緯創資通股份有限公司 Method, system, and computer-readable recording medium for long-distance person identification
CN107192342A (en) * 2017-05-11 2017-09-22 广州帕克西软件开发有限公司 A kind of measuring method and system of contactless build data
CN107292252B (en) * 2017-06-09 2020-09-15 南京华捷艾米软件科技有限公司 Identity recognition method for autonomous learning
CN109426785B (en) * 2017-08-31 2021-09-10 杭州海康威视数字技术股份有限公司 Human body target identity recognition method and device
CN109426787A (en) * 2017-08-31 2019-03-05 杭州海康威视数字技术股份有限公司 A kind of human body target track determines method and device
CN108451534B (en) * 2018-01-26 2021-08-27 仰人杰 Human body motion detection method based on dielectric elastomer sensor
CN108734083B (en) * 2018-03-21 2023-04-25 北京猎户星空科技有限公司 Control method, device, equipment and storage medium of intelligent equipment
CN108960078A (en) * 2018-06-12 2018-12-07 温州大学 A method of based on monocular vision, from action recognition identity
CN109272347A (en) * 2018-08-16 2019-01-25 苏宁易购集团股份有限公司 A kind of statistical analysis technique and system of shops's volume of the flow of passengers
CN110594856A (en) * 2019-08-12 2019-12-20 青岛经济技术开发区海尔热水器有限公司 Hot water circulation control method and hot water system
CN110503022A (en) * 2019-08-19 2019-11-26 北京积加科技有限公司 A kind of personal identification method, apparatus and system
CN110524559B (en) * 2019-08-30 2022-06-10 成都未至科技有限公司 Intelligent man-machine interaction system and method based on personnel behavior data
CN111064925B (en) * 2019-12-04 2021-05-04 常州工业职业技术学院 Subway passenger ticket evasion behavior detection method and system
CN111079644B (en) * 2019-12-13 2023-06-06 四川新网银行股份有限公司 Method for assisting photographing based on distance and joint point identification external force and storage medium
CN111199198B (en) * 2019-12-27 2023-08-04 深圳市优必选科技股份有限公司 Image target positioning method, image target positioning device and mobile robot
CN111292087A (en) * 2020-01-20 2020-06-16 北京沃东天骏信息技术有限公司 Identity verification method and device, computer readable medium and electronic equipment
CN114582003B (en) * 2022-04-24 2022-07-29 慕思健康睡眠股份有限公司 Sleep health management system based on cloud computing service
CN115637901A (en) * 2022-10-09 2023-01-24 东风汽车集团股份有限公司 Child lock control system, method and equipment based on OMS

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103529944A (en) * 2013-10-17 2014-01-22 合肥金诺数码科技股份有限公司 Human body movement identification method based on Kinect
CN103606093A (en) * 2013-10-28 2014-02-26 燕山大学 Intelligent chain VIP customer service system based on human characteristics
CN104167016A (en) * 2014-06-16 2014-11-26 西安工业大学 Three-dimensional motion reconstruction method based on RGB color and depth image
CN104766230A (en) * 2015-04-21 2015-07-08 东华大学 Advertising effect evaluation method based on human skeletal tracking

Also Published As

Publication number Publication date
CN105447466A (en) 2016-03-30

Similar Documents

Publication Publication Date Title
CN105447466B (en) A kind of identity integrated recognition method based on Kinect sensor
CN106529468B (en) A kind of finger vein identification method and system based on convolutional neural networks
CN105574518B (en) Method and device for detecting living human face
US6920236B2 (en) Dual band biometric identification system
CN103914699B (en) A kind of method of the image enhaucament of the automatic lip gloss based on color space
CN101561710B (en) Man-machine interaction method based on estimation of human face posture
CN106778785B (en) Construct the method for image Feature Selection Model and the method, apparatus of image recognition
Roomi et al. Race classification based on facial features
CN109543640A (en) A kind of biopsy method based on image conversion
Tayal et al. Automatic face detection using color based segmentation
CN109558825A (en) A kind of pupil center's localization method based on digital video image processing
Shangeetha et al. Computer vision based approach for Indian Sign Language character recognition
CN109063671A (en) Method and device for intelligent cosmetic
CN103984922B (en) Face identification method based on sparse representation and shape restriction
CN107392151A (en) Face image various dimensions emotion judgement system and method based on neutral net
CN109002799A (en) Face identification method
CN109766782A (en) Real-time body action identification method based on SVM
CN104898971B (en) A kind of mouse pointer control method and system based on Visual Trace Technology
Seal et al. Minutiae based thermal face recognition using blood perfusion data
Akhloufi et al. Thermal faceprint: A new thermal face signature extraction for infrared face recognition
CN114863499A (en) Finger vein and palm vein identification method based on federal learning
Sokhib et al. A combined method of skin-and depth-based hand gesture recognition.
Akhloufi et al. Infrared face recognition using distance transforms
CN108288040A (en) Multi-parameter face identification system based on face contour
CN109711306A (en) A kind of method and apparatus obtaining facial characteristics based on depth convolutional neural networks

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant