CN105574386A - Terminal mode management method and apparatus - Google Patents
Terminal mode management method and apparatus
- Publication number
- CN105574386A (application CN201510332763.0A)
- Authority
- CN
- China
- Prior art keywords
- described user
- user
- sex
- iris
- age range
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/165—Detection; Localisation; Normalisation using facial parts and geometric relationships
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/174—Facial expression recognition
- G06V40/175—Static expression
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/197—Matching; Classification
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2221/00—Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/21—Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/2141—Access rights, e.g. capability lists, access control lists, access tables, access matrices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/178—Human faces, e.g. facial parts, sketches or expressions estimating age from face image; using age information for improving recognition
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- General Health & Medical Sciences (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Computer Security & Cryptography (AREA)
- Geometry (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- General Engineering & Computer Science (AREA)
- Ophthalmology & Optometry (AREA)
- Collating Specific Patterns (AREA)
Abstract
Embodiments of the invention disclose a terminal mode management method and apparatus. The terminal mode management method comprises the steps of obtaining facial image information of a user; collecting iris feature information of the user from the facial image information; checking whether iris verification information matching the iris feature information of the user exists; if so, obtaining the age range and gender corresponding to the iris feature information of the user; otherwise, extracting facial feature data from the facial image information of the user to determine the age range and gender of the user; and opening the corresponding user mode according to the obtained age range and gender of the user. The invention also discloses a terminal mode management apparatus. With the terminal mode management method and apparatus, the age range and gender of the user can be determined from the iris feature information in the facial image information or from the facial feature data, so that a user mode suited to the user is presented adaptively and the user is spared repeated mode switching and mode configuration.
Description
Technical field
The present invention relates to the field of electronic technology, and in particular to a terminal mode management method and apparatus.
Background technology
With the development of electronic technology, electronic terminals are used ever more widely. Many electronic terminals serve not only as personal devices but are shared within a family, a community, or an even broader group of users. For different user groups, therefore, an electronic terminal needs to provide different user modes so that it remains convenient to use.
At present, electronic terminals have begun to offer different user modes, but whenever a user wants to enter a particular user mode, the terminal must first enter the primary user mode and then switch from the primary user mode to the desired one. In addition, the primary user mode is usually protected by identity verification such as fingerprint recognition or a password. That is, if someone else wants to use the electronic terminal, the person holding the highest privilege on the terminal must first pass authentication and enter the primary user mode, and only then switch to another user mode for that person. The operation is very cumbersome and the user experience is poor.
Summary of the invention
The technical problem to be solved by the embodiments of the present invention is to provide a terminal mode management method and apparatus that determine the age range and gender of a user from the iris feature information or the facial feature data in the user's facial image information, so that a user mode suited to the user is presented adaptively and the user is spared repeated mode switching and mode configuration.
To solve the above technical problem, an embodiment of the present invention provides a terminal mode management method, the method comprising:
obtaining facial image information of a user;
collecting iris feature information of the user from the facial image information of the user;
searching a preset iris database for iris verification information that matches the iris feature information of the user;
if such information exists, obtaining, from the preset iris database, the age range and gender corresponding to the iris feature information of the user;
if not, extracting facial feature data from the facial image information of the user, and determining the age range and gender of the user according to the facial feature data in the facial image information of the user;
opening, according to the obtained age range and gender of the user, the user mode corresponding to the age range and gender of the user.
Correspondingly, an embodiment of the present invention further provides a terminal mode management apparatus, comprising:
a facial image acquisition module, configured to obtain facial image information of a user;
an iris information collection module, configured to collect iris feature information of the user from the facial image information of the user;
an iris information matching module, configured to search a preset iris database for iris verification information that matches the iris feature information of the user;
a first identification module, configured to obtain, when iris verification information matching the iris feature information of the user exists, the age range and gender corresponding to the iris feature information of the user from the preset iris database;
a second identification module, configured to extract, when no iris verification information matching the iris feature information of the user exists, the facial feature data from the facial image information of the user and determine the age range and gender of the user according to the facial feature data;
a user mode opening module, configured to open, according to the obtained age range and gender of the user, the user mode corresponding to the age range and gender of the user.
In the embodiments of the present invention, the facial image information of a user is obtained; the iris feature information or the facial feature data of the user is collected from the facial image information to determine the age range and gender of the user; and the user mode corresponding to the obtained age range and gender of the user is opened. A user mode suited to the user's age range and gender is thus presented adaptively, and the user is spared repeated mode switching and mode configuration.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings needed for describing the embodiments or the prior art are briefly introduced below. Apparently, the accompanying drawings in the following description show merely some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of a terminal mode management method according to an embodiment of the present invention;
Fig. 2 is a schematic flowchart of a terminal mode management method according to another embodiment of the present invention;
Fig. 3 is a schematic structural diagram of a terminal mode management apparatus according to an embodiment of the present invention;
Fig. 4 is a structural diagram of the second identification module of Fig. 3 according to an embodiment of the present invention;
Fig. 5 is a structural diagram of the user mode opening module of Fig. 3 according to an embodiment of the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Apparently, the described embodiments are merely some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
Fig. 1 is a schematic flowchart of a terminal mode management method according to an embodiment of the present invention. The method may be performed by a terminal mode management apparatus, which may be a user terminal or a software program running on a user terminal; the user terminal may include a mobile phone, a notebook computer, a tablet computer, an in-vehicle computer, a POS (Point of Sale) machine, and the like. As shown in the figure, the method includes at least the following steps.
Step S101: obtain facial image information of a user.
Specifically, when it is detected that the user powers on the terminal or enters the terminal from a standby interface, the facial image information of the user may be obtained through the terminal's camera or through another imaging device. For example, a mobile phone may capture the image with its front-facing camera, while a computer without a camera may obtain the facial image information through an external camera. In a specific implementation, the terminal may prompt the user to align the face with the camera so that the facial image information can be collected, or it may automatically detect the position of the user's face and adjust the direction and angle of the camera. The terminal then checks the obtained facial image information: if the facial image is complete and its details are clear, the following steps are performed to collect further information from it; if the facial image is incomplete or its details are blurred, for example the camera captured only part of the user's face or the iris feature information is not clear enough, the terminal re-acquires the facial image information of the user until the facial image is complete and clear. Further, the terminal may set a completeness-and-clarity threshold: if the image exceeds the threshold, it is regarded as complete and clear; if it falls below the threshold, the camera is started again to re-acquire the facial image information of the user.
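By way of illustration only, the following sketch shows one way the acquisition-and-retry loop described above could look, assuming OpenCV (cv2) for capture; the Haar cascade, the single-face check, and the Laplacian-variance clarity score with its threshold are assumptions made for this sketch, not details specified by the patent.

```python
import cv2

FACE_CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
CLARITY_THRESHOLD = 100.0   # assumed completeness-and-clarity threshold


def acquire_face_image(camera_index: int = 0):
    """Grab frames until a single, complete, sufficiently sharp face is found."""
    cap = cv2.VideoCapture(camera_index)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                continue
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            faces = FACE_CASCADE.detectMultiScale(gray, 1.3, 5)
            if len(faces) != 1:
                continue                      # no face, or partial/extra faces: re-acquire
            x, y, w, h = faces[0]
            face = gray[y:y + h, x:x + w]
            # Variance of the Laplacian as a simple sharpness (clarity) score.
            clarity = cv2.Laplacian(face, cv2.CV_64F).var()
            if clarity >= CLARITY_THRESHOLD:
                return frame, (x, y, w, h)    # complete and clear: proceed to step S102
    finally:
        cap.release()
```

A real terminal would replace the polling loop with its own camera pipeline and surface a prompt to the user when re-acquisition is needed.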
Step S102: collect the iris feature information of the user from the facial image information of the user.
Specifically, the terminal may detect the eye region in the facial image information and extract the detected eye image. It then determines the position of the iris in the eye image by detecting boundary characteristics such as the boundary between the iris and the pupil, the boundary between the iris and the sclera, and the boundaries between the iris and the upper and lower eyelids. After the iris has been located accurately, the terminal extracts the iris feature information of the user with reference to the position of the iris and normalizes it, adjusting the iris feature information to the fixed size preset by the iris recognition system so as to ensure accurate recognition. Further, enhancement processing, for example adjusting brightness, contrast, and smoothness, may be applied to the normalized iris feature information to improve the recognition rate.
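A much-simplified sketch of this collection step is given below, with the boundary detection reduced to a Hough-circle search on a detected eye region; the eye cascade, the circle parameters, the fixed 64x64 template size, and the histogram-equalization enhancement are illustrative assumptions rather than the patent's exact pipeline.

```python
from typing import Optional

import cv2
import numpy as np

EYE_CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")
IRIS_TEMPLATE_SIZE = (64, 64)   # assumed fixed size preset by the recognition system


def collect_iris_features(face_gray: np.ndarray) -> Optional[np.ndarray]:
    """Locate one eye, approximate the iris boundary, return a normalized patch."""
    eyes = EYE_CASCADE.detectMultiScale(face_gray, 1.1, 5)
    if len(eyes) == 0:
        return None
    x, y, w, h = eyes[0]
    eye = face_gray[y:y + h, x:x + w]
    # Approximate the iris/sclera boundary with a Hough circle search.
    circles = cv2.HoughCircles(eye, cv2.HOUGH_GRADIENT, dp=1.5, minDist=w,
                               param1=100, param2=30,
                               minRadius=w // 8, maxRadius=w // 2)
    if circles is None:
        return None
    cx, cy, r = np.round(circles[0, 0]).astype(int)
    iris = eye[max(cy - r, 0):cy + r, max(cx - r, 0):cx + r]
    if iris.size == 0:
        return None
    # Normalize to the preset fixed size, then enhance contrast.
    iris = cv2.resize(iris, IRIS_TEMPLATE_SIZE)
    return cv2.equalizeHist(iris)
```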
Step S103: search a preset iris database for iris verification information that matches the iris feature information of the user.
Specifically, the preset iris database here is a database of big data consisting of personal information and the corresponding iris verification information collected in various ways and through various channels; for example, the personal information in identity cards or passports and the iris verification information extracted from the corresponding photos may be collected and stored in the iris database. The preset iris database may be collected by the operator itself, or the usage rights to an iris database collected by others may be obtained; in the latter case, when the iris verification information in that database needs to be matched against the iris feature information of the user, the database is entered for matching after the usage rights are verified. In a specific implementation, the terminal compares the obtained iris feature information of the user with the iris verification information in the iris database one by one. If iris verification information consistent with the iris feature information of the user is found, iris verification information matching the user's iris feature information exists in the database, and step S104 is performed; otherwise, if no consistent iris verification information is found after the comparison, no matching iris verification information exists in the database, and step S105 is performed.
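The lookup itself can be sketched as below, assuming the enrolled iris verification information is stored as binary iris codes compared by normalized Hamming distance; the record layout, the distance measure, and the threshold value are assumptions for illustration, since the patent does not fix a matching algorithm.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class IrisRecord:
    iris_code: bytes            # enrolled iris verification information
    age_range: Tuple[int, int]  # e.g. (16, 30), taken from the personal information
    gender: str                 # e.g. "female"


HAMMING_THRESHOLD = 0.32        # assumed decision threshold


def hamming_distance(a: bytes, b: bytes) -> float:
    bits = 8 * len(a)
    diff = sum(bin(x ^ y).count("1") for x, y in zip(a, b))
    return diff / bits


def find_matching_record(user_code: bytes,
                         database: List[IrisRecord]) -> Optional[IrisRecord]:
    """Step S103: compare the user's iris code with every enrolled record."""
    best = min(database,
               key=lambda rec: hamming_distance(user_code, rec.iris_code),
               default=None)
    if best is not None and hamming_distance(user_code, best.iris_code) < HAMMING_THRESHOLD:
        return best    # step S104: age_range and gender are read from this record
    return None        # step S105: fall back to facial feature analysis
```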
Step S104: if such information exists, obtain, from the preset iris database, the age range and gender corresponding to the iris feature information of the user.
Specifically, every piece of iris verification information in the preset iris database has corresponding personal information, which includes details such as age, gender, address, and blood type. Therefore, when iris verification information matching the iris feature information of the user is found in the preset iris database, the terminal can obtain the age range and gender from the corresponding personal information according to the iris feature information of the user.
Step S105: if no such information exists, extract the facial feature data from the facial image information of the user, and determine the age range and gender of the user according to the facial feature data.
Specifically, because the image acquisition system is subject to external constraints and interference that affect the image, the facial image information of the user needs to be pre-processed before the facial feature data is extracted. First, processing such as grayscale conversion, smoothing, and denoising is applied to the facial image information to remove the influence of illumination. In addition, because factors such as shooting distance and face tilt cause differences in how the facial image information appears, the facial image information also needs to be normalized. Since the distance between the two eyes varies relatively little across a face, the positions of the eyes and the distance between them can serve as the basis for normalization; the centers of the nose and the mouth may further be used as references for adjusting the size and alignment of the facial image information. Finally, the normalized facial image information is cropped and enhanced. After pre-processing is completed, the facial feature data is extracted from the facial image information; the gender of the user is first determined from the facial feature data, and then, based on the facial feature reference data corresponding to the determined gender, the facial feature data of the user is compared with that reference data to finally determine the age range of the user.
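A sketch of the pre-processing chain (grayscale, smoothing, eye-based normalization, cropping, enhancement) might look as follows, assuming the two eye centers have already been located; the target inter-eye distance, the crop window, and the output size are illustrative values, not figures from the patent.

```python
from typing import Tuple

import cv2
import numpy as np

TARGET_EYE_DIST = 60        # assumed normalized distance between the eyes (pixels)
OUTPUT_SIZE = (128, 160)    # assumed cropped face size (width, height)


def preprocess_face(image_bgr: np.ndarray,
                    left_eye: Tuple[float, float],
                    right_eye: Tuple[float, float]) -> np.ndarray:
    """Grayscale, denoise, align by the eye positions, crop, and enhance."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (3, 3), 0)
    # Normalize scale and in-plane rotation using the eye positions and distance.
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    angle = float(np.degrees(np.arctan2(dy, dx)))
    scale = TARGET_EYE_DIST / max(float(np.hypot(dx, dy)), 1e-6)
    center = ((left_eye[0] + right_eye[0]) / 2.0,
              (left_eye[1] + right_eye[1]) / 2.0)
    M = cv2.getRotationMatrix2D(center, angle, scale)
    aligned = cv2.warpAffine(gray, M, (gray.shape[1], gray.shape[0]))
    # Crop a fixed window around the eye midpoint, then enhance contrast.
    cx, cy = int(center[0]), int(center[1])
    crop = aligned[max(cy - 50, 0):cy + 110, max(cx - 64, 0):cx + 64]
    crop = cv2.resize(crop, OUTPUT_SIZE)
    return cv2.equalizeHist(crop)
```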
Step S106: open, according to the obtained age range and gender of the user, the user mode corresponding to the age range and gender of the user.
Specifically, the terminal presets different user modes for different ages and genders, including an elderly mode, a child mode, a young-woman mode, and a young-man mode. It should be noted that the user modes here are not limited to the four modes mentioned above; modes such as a middle-aged-woman mode and a middle-aged-man mode may also be added according to a finer division of age ranges, which is not limited here. According to the age range and gender of a user group, the terminal can configure, in the corresponding user mode, settings such as the applications commonly used by that group and the icon display style. Further, users may be granted permission to modify a user mode so as to add to or delete from its configuration. In a specific implementation, when the obtained age range of the user is consistent with the age range set for a certain user mode and the gender also matches, the application collection preset for that user mode is obtained, and the icons of the applications in the collection are displayed according to the display style of that user mode. For example, each user mode may be assigned an age range: if the age range of the elderly mode is set to 40 years and above, a user whose age range is above 40 directly opens the elderly mode; if the age range of the young-woman mode is set to 16 to 30 years, a user whose age range is 16 to 30 and whose gender is female opens the young-woman mode.
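A minimal sketch of this mode-selection step is shown below; the mode table mirrors the examples in the text, but the data structure, the containment test, and the concrete age bounds for the young-man mode are assumptions of this sketch.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Set, Tuple


@dataclass
class UserMode:
    name: str
    age_range: Tuple[float, float]   # (min_age, max_age); max may be float("inf")
    genders: Set[str]                # {"male"}, {"female"} or {"male", "female"}
    apps: List[str] = field(default_factory=list)
    display_style: str = "default"


# Assumed preset modes mirroring the examples in the text.
MODES = [
    UserMode("elderly", (40, float("inf")), {"male", "female"}),
    UserMode("child", (4, 12), {"male", "female"}),
    UserMode("young_woman", (16, 30), {"female"}),
    UserMode("young_man", (16, 30), {"male"}),
]


def select_mode(user_age_range: Tuple[float, float],
                user_gender: str) -> Optional[UserMode]:
    """Step S106: open the mode whose configured age range contains the user's
    and whose gender setting matches."""
    lo, hi = user_age_range
    for mode in MODES:
        m_lo, m_hi = mode.age_range
        if m_lo <= lo and hi <= m_hi and user_gender in mode.genders:
            return mode
    return None


# Example from the text: select_mode((7, 10), "female") returns the child mode.
```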
In the embodiment of the present invention, the facial image information of a user is obtained; the iris feature information or the facial feature data of the user is collected from the facial image information to determine the age range and gender of the user; and the user mode corresponding to the obtained age range and gender of the user is opened. A user mode suited to the user's age range and gender is thus presented adaptively, and the user is spared repeated mode switching and mode configuration.
Fig. 2 is a schematic flowchart of a terminal mode management method according to another embodiment of the present invention. As shown in the figure, the method includes at least the following steps.
Step S201: set the age range and gender corresponding to each user mode.
Specifically, during terminal initialization or user mode configuration, a corresponding age range and gender are set for each user mode. The age range and gender set here are compared with the age range and gender of the user in order to decide which user mode to open for the user. For example, the age range of the child mode may be set to 4 to 12 years and its gender to male or female, which means the child mode will be opened for users aged 4 to 12 of either gender. If the terminal obtains that the age range of the user is 7 to 10 years and the gender is female, then after comparing it with the age range and gender of each user mode, the terminal detects that the user's age range of 7 to 10 years falls within the 4-to-12-year range set for the child mode, and that the gender female also satisfies the male-or-female requirement set for the child mode, so the child mode is opened.
Step S202: obtain facial image information of a user.
For details, refer to step S101.
Step S203: collect the iris feature information of the user from the facial image information of the user.
For details, refer to step S102.
Step S204: search a preset iris database for iris verification information that matches the iris feature information of the user.
For details, refer to step S103.
Step S205: if such information exists, obtain, from the preset iris database, the age range and gender corresponding to the iris feature information of the user.
For details, refer to step S104.
Step S206: if no such information exists, extract the facial feature data from the facial image information of the user, and determine the gender of the user according to the facial feature data.
Specifically, before the facial feature data is extracted from the facial image information of the user, the facial image information is first pre-processed, including grayscale conversion, smoothing, denoising, normalization, cropping, image enhancement, and so on, to obtain a standardized facial image. The facial feature information is then extracted from the pre-processed facial image. A face is composed of local organs such as the eyes, the nose, and the mouth; although the local organs of different people have different features and the geometric structures they form also differ, there are many similar rules, and according to these rules a person's gender and age range can be judged from the facial feature information. In a specific implementation, the facial image information may be divided into blocks, and facial feature information including feature vectors, feature curves, and the like is extracted from each block. The terminal compares the extracted facial feature information with the facial feature reference data collected in a database; whenever similar facial feature information is detected, the corresponding gender is obtained, and the gender of the user is determined from the majority of these results. Step S207 is then performed to determine the age range based on the gender of the user.
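The block-wise comparison and majority vote described for step S206 can be sketched as follows; the per-block features, the reference set, the similarity function, and the threshold are stand-ins, since the patent does not specify the feature extractor or the comparison metric.

```python
from collections import Counter
from typing import Callable, Iterable, List, Tuple


def determine_gender(block_features: Iterable,                 # feature vector per image block
                     reference_set: List[Tuple[object, str]],  # [(features, gender), ...]
                     similarity: Callable[[object, object], float],
                     threshold: float = 0.8) -> str:
    """Step S206: collect the gender of every similar reference sample and
    decide by majority vote."""
    votes = Counter()
    for features in block_features:
        for ref_features, gender in reference_set:
            if similarity(features, ref_features) >= threshold:
                votes[gender] += 1
    return votes.most_common(1)[0][0] if votes else "unknown"
```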
Step S207: determine the age range of the user according to the facial feature reference data corresponding to the gender of the user and the facial feature data of the user.
Specifically, because facial feature data differ between men and women, when the age range is judged, comparing the same facial feature data may yield different age ranges depending on the gender, and different facial feature data may even be used as the object of comparison. Therefore, based on the gender of the user determined in step S206, the facial feature data of the user only needs to be compared with the facial feature reference data corresponding to that gender, which halves the workload and yields a more accurate age range. For example, if the gender of the user determined in step S206 is male, only the facial feature reference data of males needs to be compared with the facial feature data of the user; whenever similar facial feature information is detected, the corresponding age is obtained, and the age range of the user is determined from the average of these results.
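A corresponding sketch of step S207 searches only the reference data of the already-determined gender and averages the matched ages into an interval; the similarity threshold and the plus-or-minus five-year window around the averaged age are assumptions of this sketch.

```python
from typing import Callable, Dict, List, Tuple


def determine_age_range(face_features: object,
                        gender: str,
                        reference_by_gender: Dict[str, List[Tuple[object, int]]],
                        similarity: Callable[[object, object], float],
                        threshold: float = 0.8) -> Tuple[int, int]:
    """Step S207: compare only against the reference data of the user's gender."""
    matched_ages = [age for ref_features, age in reference_by_gender.get(gender, [])
                    if similarity(face_features, ref_features) >= threshold]
    if not matched_ages:
        return (0, 120)                       # no evidence: widest possible range
    mean_age = sum(matched_ages) / len(matched_ages)
    # Turn the averaged result into an interval; the window width is an assumption.
    return (max(int(mean_age) - 5, 0), int(mean_age) + 5)
```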
Step S208: obtain the application collection of the user mode corresponding to the age range and gender of the user.
Specifically, the terminal presets different user modes for different ages and genders, including an elderly mode, a child mode, a young-woman mode, and a young-man mode. It should be noted that the user modes here are not limited to the four modes mentioned above; modes such as a middle-aged-woman mode and a middle-aged-man mode may also be added according to a finer division of age ranges, which is not limited here. According to the age range and gender of a user group, the terminal can configure, in the corresponding user mode, settings such as the applications commonly used by that group and the icon display style. Further, users may be granted permission to modify a user mode so as to add to or delete from its configuration. Therefore, after the age range and gender of the user are determined, the corresponding user mode can be determined, and the application collection of that user mode is then obtained. The application collection is a set of applications configured by the terminal according to the characteristics of each age range and gender. For example, in addition to basic applications such as contacts, messaging, and phone, the young-woman mode may be configured with applications commonly used by young women, such as photo retouching, camera, and casual games, while the young-man mode may be configured with applications such as large games, sports video, and personal finance. Further, users may also add or remove applications in the collection according to their own preferences.
Step S209: display the icons of the applications in the application collection according to the display style of the user mode.
Specifically, after the application collection is obtained, the applications in it are displayed according to the preset display style of the user mode. For example, the display style of the elderly mode may use simple and larger icons, which makes reading, tapping, and understanding easier for elderly users; the display style of the child mode may use cartoon-style icons to match the preferences of young users.
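Steps S208 and S209 can be sketched together as below, reusing the UserMode structure from the earlier mode-selection sketch; the per-mode style table and the render_icon stand-in for the launcher call are hypothetical, since the patent leaves the display mechanism to the terminal.

```python
# Assumed per-mode display styles mirroring the examples in the text.
DISPLAY_STYLES = {
    "elderly": {"icon_size": 128, "theme": "plain"},    # simple, larger icons
    "child":   {"icon_size": 96,  "theme": "cartoon"},  # cartoon-style icons
    "default": {"icon_size": 72,  "theme": "standard"},
}


def render_icon(app: str, icon_size: int, theme: str) -> None:
    # Stand-in for the launcher call that actually draws the icon.
    print(f"draw {app!r} at {icon_size}px with the {theme} theme")


def open_user_mode(mode) -> None:
    """Step S208: take the mode's application collection; step S209: show its
    icons in the mode's display style."""
    style = DISPLAY_STYLES.get(mode.name, DISPLAY_STYLES["default"])
    for app in mode.apps:       # the preset application collection of the mode
        render_icon(app, style["icon_size"], style["theme"])
```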
In the embodiment of the present invention, the facial image information of a user is obtained; the iris feature information or the facial feature data of the user is collected from the facial image information to determine the age range and gender of the user; and the user mode corresponding to the obtained age range and gender of the user is opened. A user mode suited to the user's age range and gender is thus presented adaptively, and the user is spared repeated mode switching and mode configuration.
Fig. 3 is a schematic structural diagram of a terminal mode management apparatus according to an embodiment of the present invention. The apparatus may be a user terminal or a software program running on a user terminal; the user terminal may include a mobile phone, a notebook computer, a tablet computer, an in-vehicle computer, a POS (Point of Sale) machine, and the like. As shown in the figure, the apparatus includes at least the following modules.
Facial image acquisition module 310, configured to obtain facial image information of a user.
Specifically, when it is detected that the user powers on the terminal or enters the terminal from a standby interface, the facial image acquisition module 310 may obtain the facial image information of the user through the terminal's camera or through another imaging device. For example, a mobile phone may capture the image with its front-facing camera, while a computer without a camera may obtain the facial image information through an external camera. In a specific implementation, the facial image acquisition module 310 may prompt the user to align the face with the camera so that the facial image information can be collected, or it may automatically detect the position of the user's face and adjust the direction and angle of the camera. The facial image acquisition module 310 then checks the obtained facial image information: if the facial image is complete and its details are clear, further information is collected from it; if the facial image is incomplete or its details are blurred, for example the camera captured only part of the user's face or the iris feature information is not clear enough, the module re-acquires the facial image information of the user until the facial image is complete and clear. Further, the facial image acquisition module 310 may set a completeness-and-clarity threshold: if the image exceeds the threshold, it is regarded as complete and clear; if it falls below the threshold, the camera is started again to re-acquire the facial image information of the user.
Iris information collection module 320, configured to collect the iris feature information of the user from the facial image information of the user.
Specifically, the iris information collection module 320 may detect the eye region in the facial image information and extract the detected eye image. It then determines the position of the iris in the eye image by detecting boundary characteristics such as the boundary between the iris and the pupil, the boundary between the iris and the sclera, and the boundaries between the iris and the upper and lower eyelids. After the iris has been located accurately, the iris information collection module 320 extracts the iris feature information of the user with reference to the position of the iris and normalizes it, adjusting the iris feature information to the fixed size preset by the iris recognition system so as to ensure accurate recognition. Further, enhancement processing, for example adjusting brightness, contrast, and smoothness, may be applied to the normalized iris feature information to improve the recognition rate.
Iris information matching module 330, configured to search a preset iris database for iris verification information that matches the iris feature information of the user.
Specifically, the preset iris database here is a database of big data consisting of personal information and the corresponding iris verification information collected in various ways and through various channels; for example, the personal information in identity cards or passports and the iris verification information extracted from the corresponding photos may be collected and stored in the iris database. The preset iris database may be collected by the operator itself, or the usage rights to an iris database collected by others may be obtained; in the latter case, when the iris verification information in that database needs to be matched against the iris feature information of the user, the database is entered for matching after the usage rights are verified. In a specific implementation, the iris information matching module 330 compares the obtained iris feature information of the user with the iris verification information in the iris database one by one. If iris verification information consistent with the iris feature information of the user is found, iris verification information matching the user's iris feature information exists in the database, and the first identification module 340 is triggered; otherwise, if no consistent iris verification information is found after the comparison, no matching iris verification information exists in the database, and the second identification module 350 is triggered.
First identification module 340, configured to obtain, when iris verification information matching the iris feature information of the user exists, the age range and gender corresponding to the iris feature information of the user from the preset iris database.
Specifically, every piece of iris verification information in the preset iris database has corresponding personal information, which includes details such as age, gender, address, and blood type. Therefore, when the first identification module 340 finds iris verification information matching the iris feature information of the user in the preset iris database, it can obtain the age range and gender from the corresponding personal information according to the iris feature information of the user.
Second identification module 350, configured to extract, when no iris verification information matching the iris feature information of the user exists, the facial feature data from the facial image information of the user and determine the age range and gender of the user according to the facial feature data.
Specifically, because the image acquisition system is subject to external constraints and interference that affect the image, the facial image information of the user needs to be pre-processed before the facial feature data is extracted. The second identification module 350 first applies processing such as grayscale conversion, smoothing, and denoising to the facial image information to remove the influence of illumination. In addition, because factors such as shooting distance and face tilt cause differences in how the facial image information appears, the facial image information also needs to be normalized. Since the distance between the two eyes varies relatively little across a face, the positions of the eyes and the distance between them can serve as the basis for normalization; the centers of the nose and the mouth may further be used as references for adjusting the size and alignment of the facial image information. Finally, the normalized facial image information is cropped and enhanced.
Further, the second identification module 350 also includes a gender determining unit 351 and an age determining unit 352, as shown in Fig. 4.
Gender determining unit 351, configured to determine the gender of the user according to the facial feature data.
Specifically, a face is composed of local organs such as the eyes, the nose, and the mouth; although the local organs of different people have different features and the geometric structures they form also differ, there are many similar rules, and according to these rules a person's gender and age range can be judged from the facial feature information. In a specific implementation, the facial image information may be divided into blocks, and facial feature information including feature vectors, feature curves, and the like is extracted from each block. The gender determining unit 351 compares the extracted facial feature information with the facial feature reference data collected in a database; whenever similar facial feature information is detected, the corresponding gender is obtained, and the gender of the user is determined from the majority of these results. The age determining unit 352 is then triggered to determine the age range based on the gender of the user.
Age determining unit 352, configured to determine the age range of the user according to the facial feature reference data corresponding to the gender of the user and the facial feature data of the user.
Specifically, because facial feature data differ between men and women, when the age range is judged, comparing the same facial feature data may yield different age ranges depending on the gender, and different facial feature data may even be used as the object of comparison. Therefore, based on the gender of the user determined by the gender determining unit 351, the age determining unit 352 only needs to compare the facial feature data of the user with the facial feature reference data corresponding to that gender, which halves the workload and yields a more accurate age range. For example, if the gender of the user determined by the gender determining unit 351 is male, the age determining unit 352 only needs to compare the facial feature reference data of males with the facial feature data of the user; whenever similar facial feature information is detected, the corresponding age is obtained, and the age range of the user is determined from the majority of these results.
User mode opening module 360, configured to open, according to the obtained age range and gender of the user, the user mode corresponding to the age range and gender of the user.
Specifically, the terminal presets different user modes for different ages and genders, including an elderly mode, a child mode, a young-woman mode, and a young-man mode. It should be noted that the user modes here are not limited to the four modes mentioned above; modes such as a middle-aged-woman mode and a middle-aged-man mode may also be added according to a finer division of age ranges, which is not limited here. According to the age range and gender of a user group, the user mode setting module 370 can configure, in the corresponding user mode, settings such as the applications commonly used by that group and the icon display style. Further, users may be granted permission to modify a user mode so as to add to or delete from its configuration. In a specific implementation, when the obtained age range of the user is consistent with the age range set for a certain user mode and the gender also matches, the user mode opening module 360 obtains the application collection preset for that user mode and displays the icons of the applications in the collection according to the display style of that user mode. For example, each user mode may be assigned an age range: if the age range of the elderly mode is set to 40 years and above, a user whose age range is above 40 directly opens the elderly mode; if the age range of the young-woman mode is set to 16 to 30 years, a user whose age range is 16 to 30 and whose gender is female opens the young-woman mode.
Further, the user mode opening module 360 also includes an application collection acquiring unit 361 and an application display unit 362, as shown in Fig. 5.
Application collection acquiring unit 361, configured to obtain the application collection of the user mode corresponding to the age range and gender of the user.
Specifically, after the age range and gender of the user are determined, the corresponding user mode can be determined, and the application collection acquiring unit 361 then obtains the application collection of that user mode. The application collection is a set of applications configured by the terminal according to the characteristics of each age range and gender. For example, in addition to basic applications such as contacts, messaging, and phone, the young-woman mode may be configured with applications commonly used by young women, such as photo retouching, camera, and casual games, while the young-man mode may be configured with applications such as large games, sports video, and personal finance. Further, users may also add or remove applications in the collection according to their own preferences.
Application display unit 362, configured to display the icons of the applications in the application collection according to the display style of the user mode.
Specifically, after the application collection is obtained, the application display unit 362 displays the applications in it according to the preset display style of the user mode. For example, the display style of the elderly mode may use simple and larger icons, which makes reading, tapping, and understanding easier for elderly users; the display style of the child mode may use cartoon-style icons to match the preferences of young users.
User mode setting module 370, configured to set the age range and gender corresponding to each user mode.
Specifically, during terminal initialization or user mode configuration, the user mode setting module 370 sets a corresponding age range and gender for each user mode. The age range and gender set here are compared with the age range and gender of the user in order to decide which user mode to open for the user. For example, if the user mode setting module 370 sets the age range of the child mode to 4 to 12 years and its gender to male or female, the child mode will be opened for users aged 4 to 12 of either gender. If the user mode opening module 360 obtains that the age range of the user is 7 to 10 years and the gender is female, then after comparing it with the age range and gender of each user mode, the terminal detects that the user's age range of 7 to 10 years falls within the 4-to-12-year range set for the child mode, and that the gender female also satisfies the male-or-female requirement set for the child mode, so the child mode is opened.
In the embodiment of the present invention, the facial image information of a user is obtained; the iris feature information or the facial feature data of the user is collected from the facial image information to determine the age range and gender of the user; and the user mode corresponding to the obtained age range and gender of the user is opened. A user mode suited to the user's age range and gender is thus presented adaptively, and the user is spared repeated mode switching and mode configuration.
A person of ordinary skill in the art may understand that all or some of the processes of the methods in the above embodiments may be implemented by a computer program instructing the relevant hardware. The program may be stored in a computer-readable storage medium, and when executed, it may include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.
What is disclosed above is merely a preferred embodiment of the present invention and certainly cannot be used to limit the scope of rights of the present invention; therefore, equivalent variations made according to the claims of the present invention still fall within the scope covered by the present invention.
Claims (8)
1. A terminal mode management method, characterized in that the method comprises:
obtaining facial image information of a user;
collecting iris feature information of the user from the facial image information of the user;
searching a preset iris database for iris verification information that matches the iris feature information of the user;
if such information exists, obtaining, from the preset iris database, the age range and gender corresponding to the iris feature information of the user;
if not, extracting facial feature data from the facial image information of the user, and determining the age range and gender of the user according to the facial feature data in the facial image information of the user;
opening, according to the obtained age range and gender of the user, the user mode corresponding to the age range and gender of the user.
2. The terminal mode management method according to claim 1, characterized in that the determining the age range and gender of the user according to the facial feature data in the facial image information of the user comprises:
determining the gender of the user according to the facial feature data;
determining the age range of the user according to the facial feature reference data corresponding to the gender of the user and the facial feature data of the user.
3. The terminal mode management method according to claim 1, characterized in that, before the obtaining facial image information of a user, the method comprises:
setting the age range and gender corresponding to each user mode.
4. The terminal mode management method according to claim 1, characterized in that the opening the user mode corresponding to the age range and gender of the user comprises:
obtaining the application collection of the user mode corresponding to the age range and gender of the user;
displaying the icons of the applications in the application collection according to the display style of the user mode.
5. A terminal mode management apparatus, characterized in that the apparatus comprises:
a facial image acquisition module, configured to obtain facial image information of a user;
an iris information collection module, configured to collect iris feature information of the user from the facial image information of the user;
an iris information matching module, configured to search a preset iris database for iris verification information that matches the iris feature information of the user;
a first identification module, configured to obtain, when iris verification information matching the iris feature information of the user exists, the age range and gender corresponding to the iris feature information of the user from the preset iris database;
a second identification module, configured to extract, when no iris verification information matching the iris feature information of the user exists, the facial feature data from the facial image information of the user and determine the age range and gender of the user according to the facial feature data;
a user mode opening module, configured to open, according to the obtained age range and gender of the user, the user mode corresponding to the age range and gender of the user.
6. The terminal mode management apparatus according to claim 5, characterized in that the second identification module comprises:
a gender determining unit, configured to determine the gender of the user according to the facial feature data;
an age determining unit, configured to determine the age range of the user according to the facial feature reference data corresponding to the gender of the user and the facial feature data of the user.
7. The terminal mode management apparatus according to claim 5, characterized in that the apparatus further comprises:
a user mode setting module, configured to set the age range and gender corresponding to each user mode.
8. The terminal mode management apparatus according to claim 5, characterized in that the user mode opening module comprises:
an application collection acquiring unit, configured to obtain the application collection of the user mode corresponding to the age range and gender of the user;
an application display unit, configured to display the icons of the applications in the application collection according to the display style of the user mode.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510332763.0A CN105574386A (en) | 2015-06-16 | 2015-06-16 | Terminal mode management method and apparatus |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510332763.0A CN105574386A (en) | 2015-06-16 | 2015-06-16 | Terminal mode management method and apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
CN105574386A true CN105574386A (en) | 2016-05-11 |
Family
ID=55884507
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510332763.0A Pending CN105574386A (en) | 2015-06-16 | 2015-06-16 | Terminal mode management method and apparatus |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105574386A (en) |
Cited By (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106096585A (en) * | 2016-06-29 | 2016-11-09 | 深圳市金立通信设备有限公司 | A kind of auth method and terminal |
CN106096366A (en) * | 2016-06-08 | 2016-11-09 | 北京奇虎科技有限公司 | A kind of information processing method, device and equipment |
CN106599862A (en) * | 2016-12-20 | 2017-04-26 | 北京奇虎科技有限公司 | Method and device for detecting face age of human face and mobile terminal |
CN106650632A (en) * | 2016-11-28 | 2017-05-10 | 深圳超多维科技有限公司 | Identity identification method and device, and electronic equipment |
CN106682634A (en) * | 2016-12-30 | 2017-05-17 | 维沃移动通信有限公司 | Data processing method of virtual reality terminal and virtual reality terminal |
CN106777089A (en) * | 2016-12-13 | 2017-05-31 | 北京奇虎科技有限公司 | Content screen method, device and mobile terminal |
CN107480498A (en) * | 2017-07-31 | 2017-12-15 | 广东欧珀移动通信有限公司 | Unlocking processing method and Related product |
WO2018027408A1 (en) * | 2016-08-06 | 2018-02-15 | 吕秋萍 | Method for pushing information during speed limiting and speed limiting system |
WO2018027407A1 (en) * | 2016-08-06 | 2018-02-15 | 吕秋萍 | Data collection method for iris matched speed limiting technology, and speed limiting system |
WO2018027405A1 (en) * | 2016-08-06 | 2018-02-15 | 吕秋萍 | Method for pushing information during speed limiting and speed limiting system |
CN107748646A (en) * | 2017-10-11 | 2018-03-02 | 上海展扬通信技术有限公司 | A kind of interface control method and interface control system based on intelligent terminal |
CN107995969A (en) * | 2016-11-30 | 2018-05-04 | 深圳市柔宇科技有限公司 | Electronic device and its soft keyboard display method |
CN108153568A (en) * | 2017-12-21 | 2018-06-12 | 联想(北京)有限公司 | A kind of information processing method and electronic equipment |
CN108537026A (en) * | 2018-03-30 | 2018-09-14 | 百度在线网络技术(北京)有限公司 | application control method, device and server |
CN109299595A (en) * | 2018-09-08 | 2019-02-01 | 太若科技(北京)有限公司 | Method, apparatus and AR equipment based on hand skin texture information unlock AR equipment |
CN109670386A (en) * | 2017-10-16 | 2019-04-23 | 深圳泰首智能技术有限公司 | Face identification method and terminal |
CN109727369A (en) * | 2018-12-27 | 2019-05-07 | 施嘉豪 | Cigarette selling system |
CN110211282A (en) * | 2019-05-23 | 2019-09-06 | 深兰科技(上海)有限公司 | A kind of automatic vending method and vending machine |
CN110315973A (en) * | 2018-03-30 | 2019-10-11 | 比亚迪股份有限公司 | The control method of in-vehicle display system, vehicle and in-vehicle display system |
CN110377377A (en) * | 2019-06-06 | 2019-10-25 | 北京梧桐车联科技有限责任公司 | Display methods and device, electronic equipment and storage medium |
CN111125663A (en) * | 2019-11-27 | 2020-05-08 | 宇龙计算机通信科技(深圳)有限公司 | Control method and device for child mode, storage medium and terminal |
CN113033245A (en) * | 2019-12-09 | 2021-06-25 | 宇龙计算机通信科技(深圳)有限公司 | Function adjusting method and device, storage medium and electronic equipment |
CN113126859A (en) * | 2019-12-31 | 2021-07-16 | 宇龙计算机通信科技(深圳)有限公司 | Contextual model control method, contextual model control device, storage medium and terminal |
CN113436353A (en) * | 2021-05-11 | 2021-09-24 | 西安艾润物联网技术服务有限责任公司 | Vehicle charging method and device based on voice interaction |
CN114766027A (en) * | 2019-11-21 | 2022-07-19 | 株式会社燕孵 | Information processing method, information processing apparatus, and control program |
CN116055806A (en) * | 2022-12-30 | 2023-05-02 | 深圳创维-Rgb电子有限公司 | Mode switching processing method and device of intelligent terminal, terminal and storage medium |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN201213278Y (en) * | 2008-07-02 | 2009-03-25 | 希姆通信息技术(上海)有限公司 | Intelligent human face tracing device taking image by mobile phone |
CN101499018A (en) * | 2008-01-28 | 2009-08-05 | 联想(北京)有限公司 | Data processing unit and method |
CN102760077A (en) * | 2011-04-29 | 2012-10-31 | 广州三星通信技术研究有限公司 | Method and device for self-adaptive application scene mode on basis of human face recognition |
CN104008320A (en) * | 2014-05-19 | 2014-08-27 | 惠州Tcl移动通信有限公司 | Using permission and user mode control method and system based on face recognition |
CN104063150A (en) * | 2014-06-30 | 2014-09-24 | 惠州Tcl移动通信有限公司 | Mobile terminal capable of entering corresponding scene modes by means of face recognition and implementation method thereof |
CN104077517A (en) * | 2014-06-30 | 2014-10-01 | 惠州Tcl移动通信有限公司 | Mobile terminal user mode start method and system based on iris identification |
CN104462917A (en) * | 2014-11-27 | 2015-03-25 | 柳州市网中网络策划中心 | Internet data management system based on iris verification |
- 2015-06-16: CN CN201510332763.0A patent/CN105574386A/en active Pending
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101499018A (en) * | 2008-01-28 | 2009-08-05 | 联想(北京)有限公司 | Data processing unit and method |
CN201213278Y (en) * | 2008-07-02 | 2009-03-25 | 希姆通信息技术(上海)有限公司 | Intelligent human face tracking device for capturing images with a mobile phone |
CN102760077A (en) * | 2011-04-29 | 2012-10-31 | 广州三星通信技术研究有限公司 | Method and device for self-adaptive application scene mode based on human face recognition |
CN104008320A (en) * | 2014-05-19 | 2014-08-27 | 惠州Tcl移动通信有限公司 | Usage permission and user mode control method and system based on face recognition |
CN104063150A (en) * | 2014-06-30 | 2014-09-24 | 惠州Tcl移动通信有限公司 | Mobile terminal capable of entering corresponding scene modes by means of face recognition and implementation method thereof |
CN104077517A (en) * | 2014-06-30 | 2014-10-01 | 惠州Tcl移动通信有限公司 | Mobile terminal user mode start method and system based on iris identification |
CN104462917A (en) * | 2014-11-27 | 2015-03-25 | 柳州市网中网络策划中心 | Internet data management system based on iris verification |
Cited By (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106096366A (en) * | 2016-06-08 | 2016-11-09 | 北京奇虎科技有限公司 | Information processing method, device and equipment |
CN106096585A (en) * | 2016-06-29 | 2016-11-09 | 深圳市金立通信设备有限公司 | Authentication method and terminal |
WO2018027405A1 (en) * | 2016-08-06 | 2018-02-15 | 吕秋萍 | Method for pushing information during speed limiting and speed limiting system |
WO2018027408A1 (en) * | 2016-08-06 | 2018-02-15 | 吕秋萍 | Method for pushing information during speed limiting and speed limiting system |
WO2018027407A1 (en) * | 2016-08-06 | 2018-02-15 | 吕秋萍 | Data collection method for iris matched speed limiting technology, and speed limiting system |
CN106650632A (en) * | 2016-11-28 | 2017-05-10 | 深圳超多维科技有限公司 | Identity identification method and device, and electronic equipment |
CN107995969A (en) * | 2016-11-30 | 2018-05-04 | 深圳市柔宇科技有限公司 | Electronic device and its soft keyboard display method |
CN106777089A (en) * | 2016-12-13 | 2017-05-31 | 北京奇虎科技有限公司 | Content screening method, device and mobile terminal |
CN106599862A (en) * | 2016-12-20 | 2017-04-26 | 北京奇虎科技有限公司 | Method and device for detecting the age of a human face, and mobile terminal |
CN106682634A (en) * | 2016-12-30 | 2017-05-17 | 维沃移动通信有限公司 | Data processing method of virtual reality terminal and virtual reality terminal |
CN107480498A (en) * | 2017-07-31 | 2017-12-15 | 广东欧珀移动通信有限公司 | Unlocking processing method and related product |
CN107480498B (en) * | 2017-07-31 | 2020-02-14 | Oppo广东移动通信有限公司 | Unlocking processing method and related product |
CN107748646A (en) * | 2017-10-11 | 2018-03-02 | 上海展扬通信技术有限公司 | Interface control method and interface control system based on an intelligent terminal |
CN109670386A (en) * | 2017-10-16 | 2019-04-23 | 深圳泰首智能技术有限公司 | Face identification method and terminal |
CN108153568A (en) * | 2017-12-21 | 2018-06-12 | 联想(北京)有限公司 | Information processing method and electronic equipment |
CN108153568B (en) * | 2017-12-21 | 2021-04-13 | 联想(北京)有限公司 | Information processing method and electronic equipment |
CN108537026A (en) * | 2018-03-30 | 2018-09-14 | 百度在线网络技术(北京)有限公司 | Application control method, device and server |
CN110315973B (en) * | 2018-03-30 | 2022-01-07 | 比亚迪股份有限公司 | Vehicle-mounted display system, vehicle and control method of vehicle-mounted display system |
CN110315973A (en) * | 2018-03-30 | 2019-10-11 | 比亚迪股份有限公司 | Vehicle-mounted display system, vehicle and control method of vehicle-mounted display system |
CN109299595A (en) * | 2018-09-08 | 2019-02-01 | 太若科技(北京)有限公司 | Method and apparatus for unlocking AR equipment based on hand skin texture information, and AR equipment |
CN109727369A (en) * | 2018-12-27 | 2019-05-07 | 施嘉豪 | Cigarette selling system |
CN110211282A (en) * | 2019-05-23 | 2019-09-06 | 深兰科技(上海)有限公司 | Automatic vending method and vending machine |
CN110377377A (en) * | 2019-06-06 | 2019-10-25 | 北京梧桐车联科技有限责任公司 | Display method and device, electronic equipment and storage medium |
CN114766027A (en) * | 2019-11-21 | 2022-07-19 | 株式会社燕孵 | Information processing method, information processing apparatus, and control program |
US20220276705A1 (en) * | 2019-11-21 | 2022-09-01 | Swallow Incubate Co., Ltd. | Information processing method, information processing device, and non-transitory computer readable storage medium |
CN111125663A (en) * | 2019-11-27 | 2020-05-08 | 宇龙计算机通信科技(深圳)有限公司 | Control method and device for child mode, storage medium and terminal |
CN111125663B (en) * | 2019-11-27 | 2022-04-19 | 宇龙计算机通信科技(深圳)有限公司 | Control method and device for child mode, storage medium and terminal |
CN113033245A (en) * | 2019-12-09 | 2021-06-25 | 宇龙计算机通信科技(深圳)有限公司 | Function adjusting method and device, storage medium and electronic equipment |
CN113126859A (en) * | 2019-12-31 | 2021-07-16 | 宇龙计算机通信科技(深圳)有限公司 | Contextual model control method, contextual model control device, storage medium and terminal |
CN113436353A (en) * | 2021-05-11 | 2021-09-24 | 西安艾润物联网技术服务有限责任公司 | Vehicle charging method and device based on voice interaction |
CN116055806A (en) * | 2022-12-30 | 2023-05-02 | 深圳创维-Rgb电子有限公司 | Mode switching processing method and device of intelligent terminal, terminal and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105574386A (en) | Terminal mode management method and apparatus | |
CN107450708B (en) | Unlocking control method and related product | |
CN109117803B (en) | Face image clustering method and device, server and storage medium | |
CN107886032B (en) | Terminal device, smart phone, authentication method and system based on face recognition | |
De Marsico et al. | Firme: Face and iris recognition for mobile engagement | |
RU2670798C9 (en) | Method of iris authentication of user and device therefor | |
Boehnen et al. | A fast multi-modal approach to facial feature detection | |
WO2016010720A1 (en) | Multispectral eye analysis for identity authentication | |
CN106778450A (en) | Face recognition method and device |
KR20090008256A (en) | Face recognition system | |
CN108629261A (en) | Remote identity recognition method and system and computer readable recording medium | |
CN110287813A (en) | Personal identification method and system | |
Kurkovsky et al. | Experiments with simple iris recognition for mobile phones | |
CN112818909A (en) | Image updating method and device, electronic equipment and computer readable medium | |
CN110705454A (en) | Face recognition method with living body detection function | |
CN106650632A (en) | Identity identification method and device, and electronic equipment | |
CN108009532A (en) | Personal identification method and terminal based on 3D imaging |
CN104978546A (en) | Reading content processing method, apparatus and terminal | |
Vairavel et al. | Performance analysis on feature extraction using dorsal hand vein image | |
Masaoud et al. | A review paper on ear recognition techniques: models, algorithms and methods | |
Puhan et al. | Robust eyeball segmentation in noisy iris images using fourier spectral density | |
CN105809101A (en) | Eye white texture identifying method and terminal | |
WO2022000334A1 (en) | Biological feature recognition method and apparatus, and device and storage medium | |
Mohammed et al. | Conceptual analysis of Iris Recognition Systems | |
CN113553890A (en) | Multi-modal biological feature fusion method and device, storage medium and equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20160511 |