WO2017020476A1 - Method and apparatus for determining an associated user (关联用户的确定方法及装置) - Google Patents
Method and apparatus for determining an associated user
- Publication number
- WO2017020476A1 (PCT/CN2015/097611)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- face
- associated user
- user
- gender
- age
- Prior art date
Classifications
- G06F16/5838 — Retrieval of still image data characterised by using metadata automatically derived from the content, using colour
- G06F16/583 — Retrieval of still image data characterised by using metadata automatically derived from the content
- G06Q50/01 — Social networking
- G06F16/35 — Information retrieval of unstructured textual data: clustering; classification
- G06F16/51 — Indexing; data structures therefor; storage structures
- G06F16/5866 — Retrieval characterised by using manually generated metadata, e.g. tags, keywords, comments, manually generated location and time information
- G06F18/214 — Pattern recognition: generating training patterns; bootstrap methods, e.g. bagging or boosting
- G06F18/22 — Pattern recognition: matching criteria, e.g. proximity measures
- G06N20/00 — Machine learning
- G06V20/30 — Scenes; scene-specific elements in albums, collections or shared content, e.g. social network photos or video
- G06V40/10 — Human or animal bodies; body parts, e.g. hands
- G06V40/16 — Human faces, e.g. facial parts, sketches or expressions
- G06V40/172 — Human faces: classification, e.g. identification
- G06V40/178 — Human faces: estimating age from face image; using age information for improving recognition
- G06V40/179 — Human faces: metadata assisted face recognition
Definitions
- The present disclosure relates to the field of image processing technology, and in particular, to a method and apparatus for determining an associated user.
- The embodiments of the present disclosure provide a method and an apparatus for determining an associated user, which are used to solve the problem that the operation of determining an associated user is relatively cumbersome.
- A method for determining an associated user includes: acquiring a face album, determining a target user in the face album, screening at least one candidate associated user of the target user from the face album, acquiring attribute information of the candidate associated user, determining the associated user of the target user according to the attribute information, and setting tag information for the associated user.
- The face album comprises face sets of a plurality of users.
- The screening of at least one candidate associated user of the target user from the face album includes:
- obtaining the face source photos of all users in the face album, comparing the face source photos of users other than the target user with those of the target user, and taking, as the candidate associated users, users whose number of face source photos shared with the target user is greater than a preset number.
- Acquiring the attribute information of the candidate associated user and determining the associated user of the target user according to the attribute information includes: obtaining gender and age information of the candidate associated users, deleting candidates that do not meet the age requirement according to the age information, and judging the remaining candidates according to their genders.
- Obtaining the gender and age information of the candidate associated user includes: training a classifier and using it to acquire the gender of the candidate associated user and the age group to which the candidate belongs.
- Determining the associated user according to a preset condition includes: obtaining the number of faces of each remaining candidate associated user and taking the candidate with the largest number of faces as the associated user.
- Acquiring, by the classifier, the gender of the candidate associated user and the age group to which the candidate belongs includes:
- for each candidate associated user, using the classifier to obtain the age of each face of the current candidate, obtaining the shooting time of the photo corresponding to each face, calculating each face's birth time from the age and the shooting time, and determining the candidate's age group from the calculated birth times;
- for each candidate associated user, using the classifier to obtain the gender corresponding to each face of the current candidate; if all acquired genders are the same, that gender is taken as the candidate's gender; if they differ, the faces belonging to each gender are counted, and the gender with the larger face count is taken as the candidate's gender.
- Using the classifier to obtain the age and gender of each face of the current candidate includes: acquiring the age and gender of the current face with the classifier, computing the illumination and posture information of the current face, and, when the computed result fails the illumination and posture requirements, substituting the age and gender of a matching face retrieved from a database.
- An apparatus for determining an associated user includes:
- an obtaining module configured to acquire a face album, the face album comprising face sets of a plurality of users;
- a determining and screening module configured to determine a target user in the face album acquired by the obtaining module and screen at least one candidate associated user of the target user from the face album;
- a determining and setting module configured to acquire attribute information of the candidate associated user screened by the determining and screening module, determine the associated user of the target user according to the attribute information, and set tag information for the associated user.
- The determining and screening module comprises:
- an obtaining and comparison sub-module configured to acquire the face source photos of all users in the face album and compare the face source photos of users other than the target user with those of the target user;
- a determining sub-module configured to take, as candidate associated users, users whose number of face source photos shared with the target user is greater than a preset number.
- The determining and setting module comprises:
- an obtaining and deletion sub-module configured to acquire the gender and age information of the candidate associated users and delete candidates that do not meet the age requirement according to the age information;
- a judging and determining sub-module configured to judge, according to the genders of the remaining candidate associated users, whether they exceed the number of associated users; if not, the remaining candidates are determined to be the associated users; if so, the associated users are determined according to a preset condition.
- The obtaining and deletion sub-module comprises:
- a collecting and extracting training unit configured to collect training samples, extract features of the training samples, and train a classifier according to the features, the features including Gabor features and the classifier including an SVM classifier;
- an obtaining unit configured to acquire, by using the classifier trained by the collecting and extracting training unit, the gender of each candidate associated user and the age group to which the candidate belongs.
- The judging and determining sub-module is configured to obtain the face counts of the remaining candidate associated users and take the candidate with the largest face count as the associated user.
- The obtaining unit is configured to:
- for each candidate associated user, obtain with the classifier the age of each face of the current candidate, obtain the shooting time of the photo corresponding to each face, calculate each face's birth time from the age and the shooting time, and determine the candidate's age group from the calculated birth times;
- for each candidate associated user, obtain with the classifier the gender corresponding to each face of the current candidate; if all acquired genders agree, take that gender as the candidate's gender; otherwise count the faces belonging to each gender and take the gender with the larger count.
- The obtaining unit is further configured to check the illumination and posture of each face and, when a face fails the check, substitute the age and gender of a matching face from the database.
- An apparatus for determining an associated user includes: a processor and a memory for storing processor-executable instructions, wherein the processor is configured to:
- acquire a face album, the face album comprising face sets of a plurality of users;
- determine a target user in the face album, and screen at least one candidate associated user of the target user from the face album;
- acquire attribute information of the candidate associated user, determine the associated user of the target user according to the attribute information, and set tag information for the associated user.
- The technical solutions provided by the embodiments of the present disclosure may have the following beneficial effects: a target user in a face album is determined by acquiring the face album; at least one candidate associated user of the target user is screened from the face album; the attribute information of the candidate associated user is then acquired; the associated user of the target user is determined according to the attribute information; and finally tag information is set for the associated user.
- The implementation process is fast and simple, requires no cumbersome operations from the user, and saves the user a great deal of time.
- The way of screening candidate associated users is simple and easy to implement.
- Determining the associated user according to the attribute information of the candidate associated users is simple and easy to implement.
- The manner of determining the associated user according to the preset condition is simple and accurate.
- When the current face meets the illumination and posture requirements, the obtained age and gender are used directly as the age and gender of the current face; otherwise, a matching face of the current face is obtained from the database and its age and gender are used instead, which ensures the recognition accuracy of the current face's gender and age.
- FIG. 1 is a flowchart of a method for determining an associated user, according to an exemplary embodiment;
- FIG. 2a is a schematic diagram of a face album, according to an exemplary embodiment;
- FIG. 2b is a schematic diagram of a set of faces, according to an exemplary embodiment;
- FIG. 3 is a scene diagram of a method for determining an associated user, according to an exemplary embodiment;
- FIG. 4a is a flowchart of acquiring user attribute information, according to an exemplary embodiment;
- FIG. 4b is a flowchart of acquiring the age information of a current candidate associated user, according to an exemplary embodiment;
- FIG. 5 is a flowchart of obtaining the age and gender of a face, according to an exemplary embodiment;
- FIG. 6 is a block diagram of an apparatus for determining an associated user, according to an exemplary embodiment;
- FIG. 7 is a block diagram of another apparatus for determining an associated user, according to an exemplary embodiment;
- FIG. 8 is a block diagram of still another apparatus for determining an associated user, according to an exemplary embodiment;
- FIG. 9 is a block diagram of still another apparatus for determining an associated user, according to an exemplary embodiment;
- FIG. 10 is a block diagram of an apparatus suitable for determining an associated user, according to an exemplary embodiment.
- FIG. 1 is a flowchart of a method for determining an associated user, according to an exemplary embodiment. As shown in FIG. 1, the method is applicable to a mobile terminal, including but not limited to a mobile phone, and includes the following steps S101-S103.
- In step S101, a face album is acquired, the face album containing face sets of a plurality of users.
- The face album may be obtained from a server and may include face sets of a plurality of users.
- FIG. 2a shows an example of a face album containing the face sets of a plurality of users; FIG. 2b shows the face set of one user.
- In step S102, the target user in the face album is determined, and at least one candidate associated user of the target user is screened from the face album.
- The target user may be, for example, a baby.
- Baby face sets may be identified in the face album, and the target user may be determined according to the number of faces in each baby face set. For example, suppose the current face album contains the face sets of two babies, where the first baby's face set contains 4 faces and the second baby's face set contains 50 faces; it can then be determined that the second baby is the target user.
- At least one candidate associated user of the target user may be screened from the face album by, but not limited to, the following: obtaining the face source photos of all users in the face album, comparing the face source photos of users other than the target user with those of the target user, and then taking, as candidate associated users, users whose number of face source photos shared with the target user is greater than a preset number.
- A face source photo is the photo a face comes from. Suppose photo 1 contains face 1 and face 2, and photo 2 contains face 3; then the source photo of face 1 and face 2 is photo 1, and the source photo of face 3 is photo 2.
- The preset number can be set flexibly as needed, for example to 10 or 15.
- Suppose the current face album contains the face sets of five users, users 1-5, where user 1 is the target user. The face source photos of the five users are obtained, and the face source photos of users 2-5 are compared with those of user 1. Assume user 2 shares 2 face source photos with user 1, that is, user 2 appears in 2 photos together with user 1; user 3 shares 30 photos with user 1; user 4 shares 33 photos; and user 5 shares 20 photos. With a preset number of 10, it can be determined that user 3, user 4, and user 5 are candidate associated users of the target user.
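- The screening rule above amounts to counting, for every other user, how many photos contain both that user's face and the target user's face. Below is a minimal sketch of this counting logic; the data layout (a mapping from each user to the set of photo IDs containing that user's face) is an illustrative assumption, not the patent's implementation.

```python
# Sketch of candidate screening: each user maps to the set of photo IDs
# in which that user's face appears (the "face source photos").
def screen_candidates(face_album, target_user, preset_number=10):
    """Return users sharing more than `preset_number` photos with the target."""
    target_photos = face_album[target_user]
    candidates = []
    for user, photos in face_album.items():
        if user == target_user:
            continue
        shared = len(target_photos & photos)  # photos containing both faces
        if shared > preset_number:
            candidates.append(user)
    return candidates

# Worked example from the description: users 2-5 share 2, 30, 33 and 20
# photos with user 1; with a preset number of 10, users 3-5 are candidates.
album = {
    "user1": set(range(100)),
    "user2": set(range(2)),
    "user3": set(range(30)),
    "user4": set(range(33)),
    "user5": set(range(20)),
}
print(screen_candidates(album, "user1"))  # ['user3', 'user4', 'user5']
```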
- In step S103, the attribute information of the candidate associated users is acquired, the associated user of the target user is determined according to the attribute information, and tag information is set for the associated user.
- The attribute information of the candidate associated users may be acquired so that the associated user of the target user can be determined from it.
- For example, the gender and age information of the candidate associated users may be obtained, candidates that do not meet the age requirement are deleted according to the age information, and it is then judged, according to the genders of the remaining candidates, whether they exceed the number of associated users. If not, the remaining candidates are determined to be the associated users; if so, the associated users are determined according to a preset condition, such as the number of faces of each candidate associated user.
- Continuing the example, assume the gender of user 3 is male and his age group is 10-15, the gender of user 4 is female and her age group is 25-30, and the gender of user 5 is male and his age group is 28-35. User 3 is deleted because his age group does not meet the age requirement. Since user 4 and user 5 differ in gender, they do not exceed the number of associated users, and it is therefore determined that user 4 and user 5 are the associated users of the target user; for example, user 4 is the target user's mother and user 5 is the target user's father.
- Now assume instead that users 3-5 all meet the age requirement. Since user 3 and user 5 are both male, the number of associated users is exceeded, and user 3 and user 5 are further screened according to the preset condition. For example, the face counts of user 3 and user 5 can be obtained; since user 3 has more faces (30) than user 5 (20), user 3 is determined to be the associated user of the target user.
- This manner of determining the associated user according to the preset condition is simple and accurate.
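- Assuming the candidates' genders, ages, and face counts have already been recognized, the selection logic of step S103 can be sketched as follows. The concrete age threshold and the one-associated-user-per-gender quota are illustrative assumptions drawn from the parent example above, not requirements stated by the disclosure.

```python
# Sketch of step S103: drop candidates that fail the age requirement, then
# keep at most one associated user per gender, breaking ties by face count.
def determine_associated_users(candidates, min_age=18):
    """candidates: dicts with 'name', 'gender', 'age' and 'face_count' keys."""
    of_age = [c for c in candidates if c["age"] >= min_age]
    associated = []
    for gender in ("male", "female"):
        same_gender = [c for c in of_age if c["gender"] == gender]
        if same_gender:
            # Preset condition: the candidate with the most faces wins.
            associated.append(max(same_gender, key=lambda c: c["face_count"]))
    return associated

# Worked example from the description: user 3 (age group 10-15) is deleted
# by the age requirement, leaving user 4 and user 5 as the associated users.
candidates = [
    {"name": "user3", "gender": "male", "age": 12, "face_count": 30},
    {"name": "user4", "gender": "female", "age": 27, "face_count": 33},
    {"name": "user5", "gender": "male", "age": 30, "face_count": 20},
]
print([c["name"] for c in determine_associated_users(candidates)])  # user5, user4
```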
- Tag information may be set for the associated user so that subsequent operations can be performed based on it.
- For example, the tag information may be "baby's father" or "baby's mother", or a mark indicating "baby's father" or "baby's mother".
- The mobile terminal can display the set tag information, for example at the bottom or top of the user's face in the face album, or on the user's face itself, for example in its upper right corner. The style of the tag information and its position are not specifically limited here.
- In this way, the mobile terminal can extract the faces of the target user and of the target user's associated users at the same time, instead of the user manually finding the associated users of the target user and then extracting their faces one by one, which is simple and fast.
- As shown in FIG. 3, a user may take many photos of a baby and himself or herself with a mobile phone 31 and upload them to a server 32. When the user taps the "face album" option, the mobile phone 31 obtains the face album from the server 32.
- The mobile phone 31 can then automatically identify the target user, for example the current user's baby, screen at least one candidate associated user of the target user from the face album, acquire the attribute information of the candidates, and determine the associated users of the target user, that is, the baby's father and mother.
- Tag information is then set for the baby's father and mother, which facilitates subsequent operations based on the tag information.
- In the above method, the target user in the face album is determined by acquiring the face album, at least one candidate associated user of the target user is screened from the face album, the attribute information of the candidates is then acquired, the associated user of the target user is determined according to the attribute information, and finally tag information is set for the associated user.
- FIG. 4a is a flowchart of acquiring user attribute information, according to an exemplary embodiment; through this process the gender and age information of a candidate associated user can be obtained. As shown in FIG. 4a, the process includes:
- In step S401, training samples are collected, features of the training samples are extracted, and a classifier is trained according to the features.
- Gender training samples and age training samples need to be collected, and the features of the corresponding training samples are extracted; the features may include, but are not limited to, Gabor features. A corresponding classifier is then trained according to the features.
- The classifier may include, but is not limited to, a Support Vector Machine (SVM) classifier.
- The Gabor feature is a local image feature descriptor mainly used to characterize local texture.
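- As a rough illustration, such a classifier could be built with scikit-image's Gabor filters and scikit-learn's SVM. The filter-bank parameters and the library choice are assumptions of this sketch, not prescribed by the disclosure; the age classifier can be trained the same way on age-labeled samples.

```python
import numpy as np
from skimage.filters import gabor   # Gabor filter responses for texture
from sklearn.svm import SVC         # SVM classifier

def gabor_features(face, frequencies=(0.1, 0.2, 0.3),
                   thetas=(0.0, np.pi / 4, np.pi / 2)):
    """Describe a grayscale face crop by the mean and variance of the
    magnitudes of its Gabor filter responses (illustrative filter bank)."""
    feats = []
    for frequency in frequencies:
        for theta in thetas:
            real, imag = gabor(face, frequency=frequency, theta=theta)
            magnitude = np.hypot(real, imag)
            feats += [magnitude.mean(), magnitude.var()]
    return np.asarray(feats)

def train_classifier(faces, labels):
    """faces: iterable of grayscale arrays; labels: e.g. genders or age bins."""
    X = np.stack([gabor_features(face) for face in faces])
    classifier = SVC(kernel="rbf")
    classifier.fit(X, labels)
    return classifier
```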
- In step S402, the gender of each candidate associated user and the age group to which the candidate belongs are obtained by using the classifier.
- In this embodiment, the gender and age information of all faces of each candidate associated user can be obtained through the corresponding classifiers; statistics are then taken over the gender and age information of all the faces, and the gender and age group of the corresponding candidate are derived from the statistical results.
- The process of obtaining the age information of the current candidate associated user may be as shown in FIG. 4b, and includes:
- In step S4031, the age of each face of the current candidate associated user is acquired by the classifier, and the shooting time of the photo corresponding to each face is acquired.
- In step S4032, the birth time corresponding to each face is calculated from the acquired age and photo shooting time.
- In step S4033, the age group to which the current candidate associated user belongs is determined according to the calculated birth times.
- For example, suppose the current candidate associated user has 40 faces, where 10 faces give a birth year of 1988, 8 faces give 1990, 7 faces give 1989, 8 faces give 1987, 2 faces give 1980, 2 faces give 1981, 2 faces give 1995, and 1 face gives 1996. It can then be determined that the current candidate's age group is 25-28.
- Determining the age group in this way gives high accuracy.
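- The birth-time statistic of steps S4031-S4033 is plain arithmetic: shooting year minus estimated age gives a birth year per face, and the span of the most common birth years yields the age group. A sketch under those assumptions follows; the outlier-trimming rule is an added assumption, since the disclosure only states that the age group is determined from the calculated birth times.

```python
from collections import Counter
from datetime import date

def age_segment(per_face, current_year=None, trim=0.1):
    """per_face: list of (estimated_age, shooting_year) pairs for one candidate.

    Each face votes for a birth year; rare outlier votes are trimmed and the
    remaining span is converted to an age range for the current year."""
    current_year = current_year or date.today().year
    births = Counter(year - age for age, year in per_face)
    keep = max(1, int(len(per_face) * (1 - 2 * trim)))  # drop rare outliers
    kept_years = []
    for year, count in births.most_common():
        kept_years += [year] * count
        if len(kept_years) >= keep:
            break
    return current_year - max(kept_years), current_year - min(kept_years)

# The example from the description: most faces imply births in 1987-1990,
# so around 2015 the candidate's age segment is 25-28.
faces = [(27, 2015)] * 10 + [(25, 2015)] * 8 + [(26, 2015)] * 7 \
      + [(28, 2015)] * 8 + [(35, 2015)] * 2 + [(34, 2015)] * 2 \
      + [(20, 2015)] * 2 + [(19, 2015)]
print(age_segment(faces, current_year=2015))  # (25, 28)
```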
- The classifier is likewise used to obtain the gender corresponding to each face of the current candidate associated user. If all acquired genders are the same, that gender is taken as the gender of the current candidate; if they differ, the numbers of faces belonging to each gender are counted, and the gender with the larger face count is taken as the gender of the current candidate.
- For example, suppose the current candidate associated user has 40 faces, of which 38 are recognized as male and 2 as female; the gender of the current candidate is then determined to be male.
- Determining the gender in this way gives high accuracy.
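- The per-candidate gender decision is a straight majority vote over the per-face predictions; a minimal sketch:

```python
from collections import Counter

def candidate_gender(face_genders):
    """face_genders: per-face predictions such as ['male', 'male', 'female']."""
    counts = Counter(face_genders)
    if len(counts) == 1:                    # all faces agree
        return face_genders[0]
    return counts.most_common(1)[0][0]      # otherwise the larger count wins

print(candidate_gender(["male"] * 38 + ["female"] * 2))  # male
```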
- In this way, the gender and age information of all faces of each candidate associated user is obtained, and the candidate's gender and age group are then determined from that information, with high accuracy.
- FIG. 5 is a flowchart of obtaining the age and gender of a face, according to an exemplary embodiment. As shown in FIG. 5, for each face of the current candidate associated user, obtaining the age and gender of the current face may include:
- In step S501, the age and gender of the current face of the current candidate associated user are acquired by the classifier, and the illumination and posture information of the current face is calculated to obtain a calculation result.
- The illumination information can be calculated from the mean and variance of the pixel gray values.
- In step S502, it is determined whether the calculation result meets the illumination and posture requirements; if yes, step S503 is executed, and if not, step S504 is executed.
- Whether the user's posture is frontal can be judged in various ways. For example, the positions of several landmark points on the current face, such as the left eye and the right eye, can be extracted, and it is then determined whether the left eye and the right eye are symmetric; if they are, the posture is frontal.
- In step S503, the acquired age and gender are taken as the age and gender of the current face, and the current face and its corresponding age and gender are saved in a database.
- That is, if the current face meets the illumination and posture requirements, the obtained age and gender can be used directly as the age and gender of the current face, and the current face with its age and gender can be saved in the database for subsequent matching.
- In step S504, a matching face of the current face is obtained from the database, and the age and gender of the matching face are taken as the age and gender of the current face.
- If the current face does not meet the illumination and posture requirements, for example because the face is in profile or the lighting is too dark, the directly acquired age and gender would be inaccurate; a matching face of the current face is therefore obtained from the database, and its age and gender are used as the age and gender of the current face to improve accuracy.
- In this way, when the current face meets the illumination and posture requirements, the obtained age and gender are used directly; when it does not, a matching face is obtained from the database and its age and gender are used instead, which ensures the recognition accuracy of the current face's gender and age.
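- Steps S501-S504 gate each per-face prediction on simple quality checks. The sketch below illustrates the two checks named in the description, judging illumination from the mean and variance of pixel gray values and posture from eye symmetry about the face midline; all thresholds are illustrative assumptions.

```python
import numpy as np

def illumination_ok(gray_face, mean_range=(60.0, 200.0), min_var=100.0):
    """Judge lighting from the mean and variance of pixel gray values
    (thresholds here are illustrative, not from the disclosure)."""
    return (mean_range[0] <= gray_face.mean() <= mean_range[1]
            and gray_face.var() >= min_var)

def posture_ok(left_eye_x, right_eye_x, face_width, tol=0.1):
    """Treat the pose as frontal when the two eyes sit symmetrically
    about the vertical midline of the face crop."""
    midline = face_width / 2.0
    asymmetry = abs((midline - left_eye_x) - (right_eye_x - midline))
    return asymmetry <= tol * face_width

# A well-lit 64x64 crop with eyes at symmetric x-positions passes both checks;
# a failing face would instead be matched against the database (step S504).
face = np.full((64, 64), 128.0) + np.random.default_rng(0).normal(0, 15, (64, 64))
print(illumination_ok(face), posture_ok(20, 44, 64))  # True True
```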
- Corresponding to the foregoing method embodiments, embodiments of the present disclosure further provide embodiments of an apparatus for determining an associated user.
- FIG. 6 is a block diagram of an apparatus for determining an associated user, according to an exemplary embodiment. As shown in FIG. 6, the apparatus includes an obtaining module 61, a determining and screening module 62, and a determining and setting module 63.
- The obtaining module 61 is configured to acquire a face album, the face album comprising face sets of a plurality of users.
- The determining and screening module 62 is configured to determine the target user in the face album acquired by the obtaining module 61, and screen at least one candidate associated user of the target user from the face album.
- The determining and setting module 63 is configured to acquire the attribute information of the candidate associated users screened by the determining and screening module 62, determine the associated user of the target user according to the attribute information, and set tag information for the associated user.
- Thus, the obtaining module acquires the face album, the determining and screening module determines the target user in the face album and screens at least one candidate associated user of the target user from the face album, and the determining and setting module acquires the attribute information of the candidates, determines the associated user of the target user according to the attribute information, and sets tag information for the associated user.
- FIG. 7 is a block diagram of another apparatus for determining an associated user, according to an exemplary embodiment. As shown in FIG. 7, on the basis of the embodiment shown in FIG. 6, the determining and screening module 62 may include an obtaining and comparison sub-module 621 and a determining sub-module 622.
- The obtaining and comparison sub-module 621 is configured to obtain the face source photos of all users in the face album and compare the face source photos of users other than the target user with those of the target user.
- The determining sub-module 622 is configured to take, as candidate associated users, users whose number of face source photos shared with the target user is greater than the preset number.
- This manner of determining the candidate associated users is simple and easy to implement.
- FIG. 8 is a block diagram of still another apparatus for determining an associated user, according to an exemplary embodiment. As shown in FIG. 8, on the basis of the embodiment shown in FIG. 6, the determining and setting module 63 may include an obtaining and deletion sub-module 631 and a judging and determining sub-module 632.
- The obtaining and deletion sub-module 631 is configured to acquire the gender and age information of the candidate associated users and delete candidates that do not meet the age requirement according to the age information.
- The judging and determining sub-module 632 is configured to judge, according to the genders of the remaining candidate associated users, whether they exceed the number of associated users; if not, the remaining candidates are determined to be the associated users, and if so, the associated users are determined according to the preset condition.
- The judging and determining sub-module 632 can be configured to obtain the face counts of the remaining candidate associated users and take the candidate with the largest face count as the associated user.
- This manner of determining the associated user according to the attribute information of the candidates is simple and easy to implement.
- FIG. 9 is a block diagram of still another apparatus for determining an associated user, according to an exemplary embodiment. As shown in FIG. 9, on the basis of the embodiment shown in FIG. 8, the obtaining and deletion sub-module 631 may include a collecting and extracting training unit 6311 and an obtaining unit 6312.
- The collecting and extracting training unit 6311 is configured to collect training samples, extract features of the training samples, and train a classifier according to the features, the features including Gabor features and the classifier including an SVM classifier.
- The obtaining unit 6312 is configured to acquire, by using the classifier trained by the collecting and extracting training unit 6311, the gender of each candidate associated user and the age group to which the candidate belongs.
- The obtaining unit 6312 may be configured to: for each candidate associated user, acquire with the classifier the age of each face of the current candidate, obtain the shooting time of the photo corresponding to each face, calculate each face's birth time from the age and the shooting time, and determine the candidate's age group from the calculated birth times; and, for each candidate associated user, acquire with the classifier the gender corresponding to each face of the current candidate; if all acquired genders are the same, take that gender as the candidate's gender; if they differ, count the faces belonging to each gender and take the gender with the larger face count as the candidate's gender.
- The obtaining unit 6312 may further be configured to: for each face of the current candidate associated user, acquire the age and gender of the current face with the classifier and calculate the illumination and posture information of the current face; if the calculation result meets the illumination and posture requirements, take the acquired age and gender as the age and gender of the current face and save the current face with its age and gender in the database; if the calculation result does not meet the requirements, obtain a matching face of the current face from the database and take the age and gender of the matching face as those of the current face.
- These manners of obtaining the attribute information of the associated users are flexible, diverse, and highly accurate.
- FIG. 10 is a block diagram of an apparatus suitable for determining an associated user, according to an exemplary embodiment.
- For example, the apparatus 1000 can be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a gaming console, a tablet device, a medical device, a fitness device, a personal digital assistant, an aircraft, or the like.
- Referring to FIG. 10, the apparatus 1000 can include one or more of the following components: a processing component 1002, a memory 1004, a power component 1006, a multimedia component 1008, an audio component 1010, an input/output (I/O) interface 1012, a sensor component 1014, and a communication component 1016.
- Processing component 1002 typically controls the overall operation of device 1000, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations.
- Processing component 1002 can include one or more processors 1020 to execute instructions to perform all or part of the steps of the above described methods.
- Further, the processing component 1002 can include one or more modules to facilitate interaction between the processing component 1002 and other components. For example, the processing component 1002 can include a multimedia module to facilitate interaction between the multimedia component 1008 and the processing component 1002.
- the memory 1004 is configured to store various types of data to support operation at the device 1000. Examples of such data include instructions for any application or method operating on device 1000, contact data, phone book data, messages, pictures, videos, and the like.
- The memory 1004 can be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
- Power component 1006 provides power to various components of device 1000.
- Power component 1006 can include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for device 1000.
- The multimedia component 1008 includes a screen that provides an output interface between the device 1000 and the user.
- the screen can include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen can be implemented as a touch screen to receive input signals from the user.
- the touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor may sense not only the boundary of the touch or sliding action, but also the duration and pressure associated with the touch or slide operation.
- the multimedia component 1008 includes a front camera and/or a rear camera. When the device 1000 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front and rear camera can be a fixed optical lens system or have focal length and optical zoom capabilities.
- the audio component 1010 is configured to output and/or input an audio signal.
- the audio component 1010 includes a microphone (MIC) that is configured to receive an external audio signal when the device 1000 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode.
- the received audio signal may be further stored in memory 1004 or transmitted via communication component 1016.
- the audio component 1010 also includes a speaker for outputting an audio signal.
- the I/O interface 1012 provides an interface between the processing component 1002 and the peripheral interface module, which may be a keyboard, a click wheel, a button, or the like. These buttons may include, but are not limited to, a home button, a volume button, a start button, and a lock button.
- The sensor assembly 1014 includes one or more sensors for providing status assessments of various aspects of the device 1000.
- For example, the sensor assembly 1014 can detect an open/closed state of the device 1000 and the relative positioning of components, such as the display and keypad of the device 1000; the sensor assembly 1014 can also detect a change in position of the device 1000 or a component of the device 1000, the presence or absence of user contact with the device 1000, the orientation or acceleration/deceleration of the device 1000, and a change in its temperature.
- Sensor assembly 1014 can include a proximity sensor configured to detect the presence of nearby objects without any physical contact.
- Sensor assembly 1014 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications.
- the sensor assembly 1014 can also include an acceleration sensor, a gyro sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
- Communication component 1016 is configured to facilitate wired or wireless communication between device 1000 and other devices.
- the device 1000 can access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof.
- In an exemplary embodiment, the communication component 1016 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel.
- In an exemplary embodiment, the communication component 1016 also includes a near field communication (NFC) module to facilitate short-range communication.
- the NFC module can be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
- In an exemplary embodiment, the apparatus 1000 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the above methods.
- In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium comprising instructions, such as the memory 1004 comprising instructions executable by the processor 1020 of the apparatus 1000 to perform the above methods.
- For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
- In summary, embodiments of the present disclosure determine the target user in a face album by acquiring the face album, screen at least one candidate associated user of the target user from the face album, acquire the attribute information of the candidates, determine the associated user of the target user according to the attribute information, and finally set tag information for the associated user. The implementation process is fast and simple, requires no cumbersome operations from the user, and saves the user a great deal of time.
Claims (15)
- 1. A method for determining an associated user, the method comprising: acquiring a face album, the face album comprising face sets of a plurality of users; determining a target user in the face album, and screening at least one candidate associated user of the target user from the face album; and acquiring attribute information of the candidate associated user, determining an associated user of the target user according to the attribute information, and setting tag information for the associated user.
- 2. The method for determining an associated user according to claim 1, wherein the screening at least one candidate associated user of the target user from the face album comprises: acquiring face source photos of all users in the face album, and comparing the acquired face source photos of users other than the target user with the face source photos of the target user; and taking, as the candidate associated users, users whose number of face source photos shared with the target user is greater than a preset number.
- 3. The method for determining an associated user according to claim 1, wherein the acquiring attribute information of the candidate associated user and determining the associated user of the target user according to the attribute information comprises: acquiring gender and age information of the candidate associated users, and deleting, according to the age information, candidate associated users that do not meet an age requirement; and judging, according to the genders of the remaining candidate associated users, whether the remaining candidate associated users exceed the number of associated users; if not, determining the remaining candidate associated users to be the associated users; and if so, determining the associated users according to a preset condition.
- 4. The method for determining an associated user according to claim 3, wherein the acquiring gender and age information of the candidate associated users comprises: collecting training samples, extracting features of the training samples, and training a classifier according to the features, the features comprising Gabor features and the classifier comprising an SVM classifier; and acquiring, by using the classifier, the gender of each candidate associated user and the age group to which the candidate belongs.
- 5. The method for determining an associated user according to claim 3, wherein the determining the associated users according to a preset condition comprises: obtaining the number of faces of each of the remaining candidate associated users, and taking the candidate associated user with the largest number of faces as the associated user.
- 6. The method for determining an associated user according to claim 4, wherein the acquiring, by using the classifier, the gender of each candidate associated user and the age group to which the candidate belongs comprises: for each candidate associated user, acquiring, by using the classifier, the age of each face of the current candidate associated user, acquiring a photo shooting time corresponding to each face, calculating a birth time corresponding to each face according to the age and the photo shooting time, and determining, according to the calculated birth times, the age group to which the current candidate associated user belongs; and for each candidate associated user, acquiring, by using the classifier, the gender corresponding to each face of the current candidate associated user; if the acquired genders are the same, taking the acquired gender as the gender of the current candidate associated user; and if the acquired genders are different, counting the numbers of faces of the current candidate associated user belonging to each gender, and taking the gender corresponding to the larger number of faces as the gender of the current candidate associated user.
- 7. The method for determining an associated user according to claim 6, wherein the acquiring, by using the classifier, the age and gender of each face of the current candidate associated user comprises: for each face of the current candidate associated user, acquiring, by using the classifier, the age and gender of the current face, and calculating illumination and posture information of the current face; if the calculation result meets illumination and posture requirements, taking the acquired age and gender as the age and gender of the current face, and saving the current face and its corresponding age and gender in a database; and if the calculation result does not meet the illumination and posture requirements, obtaining a matching face of the current face from the database, and taking the age and gender of the matching face as the age and gender of the current face.
- 8. An apparatus for determining an associated user, the apparatus comprising: an obtaining module configured to acquire a face album, the face album comprising face sets of a plurality of users; a determining and screening module configured to determine a target user in the face album acquired by the obtaining module, and screen at least one candidate associated user of the target user from the face album; and a determining and setting module configured to acquire attribute information of the candidate associated user screened by the determining and screening module, determine an associated user of the target user according to the attribute information, and set tag information for the associated user.
- 9. The apparatus for determining an associated user according to claim 8, wherein the determining and screening module comprises: an obtaining and comparison sub-module configured to acquire face source photos of all users in the face album, and compare the acquired face source photos of users other than the target user with the face source photos of the target user; and a determining sub-module configured to take, as the candidate associated users, users whose number of face source photos shared with the target user is greater than a preset number.
- 10. The apparatus for determining an associated user according to claim 8, wherein the determining and setting module comprises: an obtaining and deletion sub-module configured to acquire gender and age information of the candidate associated users, and delete, according to the age information, candidate associated users that do not meet an age requirement; and a judging and determining sub-module configured to judge, according to the genders of the remaining candidate associated users, whether the remaining candidate associated users exceed the number of associated users; if not, determine the remaining candidate associated users to be the associated users; and if so, determine the associated users according to a preset condition.
- 11. The apparatus for determining an associated user according to claim 10, wherein the obtaining and deletion sub-module comprises: a collecting and extracting training unit configured to collect training samples, extract features of the training samples, and train a classifier according to the features, the features comprising Gabor features and the classifier comprising an SVM classifier; and an obtaining unit configured to acquire, by using the classifier trained by the collecting and extracting training unit, the gender of each candidate associated user and the age group to which the candidate belongs.
- 12. The apparatus for determining an associated user according to claim 10, wherein the judging and determining sub-module is configured to: obtain the number of faces of each of the remaining candidate associated users, and take the candidate associated user with the largest number of faces as the associated user.
- 13. The apparatus for determining an associated user according to claim 11, wherein the obtaining unit is configured to: for each candidate associated user, acquire, by using the classifier, the age of each face of the current candidate associated user, acquire a photo shooting time corresponding to each face, calculate a birth time corresponding to each face according to the age and the photo shooting time, and determine, according to the calculated birth times, the age group to which the current candidate associated user belongs; and for each candidate associated user, acquire, by using the classifier, the gender corresponding to each face of the current candidate associated user; if the acquired genders are the same, take the acquired gender as the gender of the current candidate associated user; and if the acquired genders are different, count the numbers of faces of the current candidate associated user belonging to each gender, and take the gender corresponding to the larger number of faces as the gender of the current candidate associated user.
- 14. The apparatus for determining an associated user according to claim 13, wherein the obtaining unit is configured to: for each face of the current candidate associated user, acquire, by using the classifier, the age and gender of the current face, and calculate illumination and posture information of the current face; if the calculation result meets illumination and posture requirements, take the acquired age and gender as the age and gender of the current face, and save the current face and its corresponding age and gender in a database; and if the calculation result does not meet the illumination and posture requirements, obtain a matching face of the current face from the database, and take the age and gender of the matching face as the age and gender of the current face.
- 15. An apparatus for determining an associated user, comprising: a processor; and a memory for storing instructions executable by the processor; wherein the processor is configured to: acquire a face album, the face album comprising face sets of a plurality of users; determine a target user in the face album, and screen at least one candidate associated user of the target user from the face album; and acquire attribute information of the candidate associated user, determine an associated user of the target user according to the attribute information, and set tag information for the associated user.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
MX2016006745A MX361672B (es) | 2015-07-31 | 2015-12-16 | Method and apparatus for determining an associated user |
KR1020167013623A KR101771153B1 (ko) | 2015-07-31 | 2015-12-16 | Method and apparatus for determining an associated user |
JP2016532556A JP6263263B2 (ja) | 2015-07-31 | 2015-12-16 | Method and apparatus for determining an associated user |
RU2016119495A RU2664003C2 (ru) | 2015-07-31 | 2015-12-16 | Method and apparatus for determining an associated user |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510463635.X | 2015-07-31 | ||
CN201510463635.XA CN105069083B (zh) | 2015-07-31 | Method and apparatus for determining an associated user |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017020476A1 (zh) | 2017-02-09 |
Family
ID=54498453
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2015/097611 WO2017020476A1 (zh) | Method and apparatus for determining an associated user | 2015-07-31 | 2015-12-16 |
Country Status (8)
Country | Link |
---|---|
US (1) | US9892314B2 (zh) |
EP (1) | EP3125188A1 (zh) |
JP (1) | JP6263263B2 (zh) |
KR (1) | KR101771153B1 (zh) |
CN (1) | CN105069083B (zh) |
MX (1) | MX361672B (zh) |
RU (1) | RU2664003C2 (zh) |
WO (1) | WO2017020476A1 (zh) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105069083B (zh) | 2015-07-31 | 2019-03-08 | 小米科技有限责任公司 | Method and apparatus for determining an associated user |
CN105488470A (zh) | 2015-11-30 | 2016-04-13 | 小米科技有限责任公司 | Method and device for determining person attribute information |
CN106295499B (zh) | 2016-07-21 | 2019-10-11 | 北京小米移动软件有限公司 | Age estimation method and device |
CN110020155A (zh) | 2017-12-06 | 2019-07-16 | 广东欧珀移动通信有限公司 | User gender identification method and device |
CN110162956B (zh) | 2018-03-12 | 2024-01-19 | 华东师范大学 | Method and device for determining associated accounts |
CN108806699B (zh) | 2018-05-30 | 2021-03-23 | Oppo广东移动通信有限公司 | Voice feedback method and device, storage medium, and electronic device |
US11928181B2 (en) * | 2018-12-27 | 2024-03-12 | Nec Corporation | Information processing apparatus, information processing method, and program |
CN109886158B (zh) | 2019-01-30 | 2023-01-10 | 广州轨道交通建设监理有限公司 | Method and device for monitoring the wearing of positioning tags on a construction site |
CN110351389B (zh) | 2019-08-07 | 2020-12-25 | 北京瑞策科技有限公司 | Method and device for uploading user community association data to a blockchain |
CN112256982B (zh) | 2020-09-15 | 2022-08-16 | 中国科学院信息工程研究所 | Target companion-relationship analysis method based on sparsely sampled spatio-temporal data, and electronic device |
CN112528842A (zh) | 2020-12-07 | 2021-03-19 | 北京嘀嘀无限科技发展有限公司 | Method, apparatus, device and storage medium for posture detection |
CN112817920A (zh) | 2021-03-03 | 2021-05-18 | 深圳市知小兵科技有限公司 | Distributed big data cleaning method |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090248703A1 (en) * | 2008-03-26 | 2009-10-01 | Fujifilm Corporation | Saving device for image sharing, image sharing system, and image sharing method |
CN103399896A (zh) * | 2013-07-19 | 2013-11-20 | 广州华多网络科技有限公司 | Method and system for identifying association relationships between users |
CN104021150A (zh) * | 2009-08-07 | 2014-09-03 | 谷歌公司 | Facial recognition with social network aiding |
CN104299001A (zh) * | 2014-10-11 | 2015-01-21 | 小米科技有限责任公司 | Method and device for generating photo album |
CN105069083A (zh) * | 2015-07-31 | 2015-11-18 | 小米科技有限责任公司 | Method and apparatus for determining an associated user |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
SU1755268A1 (ru) * | 1989-01-06 | 1992-08-15 | Рижское Высшее Военно-Политическое Краснознаменное Училище Им.Бирюзова С.С. | Image analyzer |
JP2004227158A (ja) * | 2003-01-21 | 2004-08-12 | Omron Corp | Information providing device and information providing method |
US8041082B1 (en) | 2007-11-02 | 2011-10-18 | Google Inc. | Inferring the gender of a face in an image |
CN104866553A (zh) | 2007-12-31 | 2015-08-26 | 应用识别公司 | Method, system and computer program for identifying and sharing digital images with face signatures |
US7953690B2 (en) * | 2008-01-25 | 2011-05-31 | Eastman Kodak Company | Discovering social relationships from personal photo collections |
US9135277B2 (en) | 2009-08-07 | 2015-09-15 | Google Inc. | Architecture for responding to a visual query |
US9087059B2 (en) | 2009-08-07 | 2015-07-21 | Google Inc. | User interface for presenting search results for multiple regions of a visual query |
CN102043820A (zh) * | 2009-10-26 | 2011-05-04 | 鸿富锦精密工业(深圳)有限公司 | System and method for analyzing interpersonal relationships |
KR101138822B1 (ko) | 2009-11-19 | 2012-05-10 | 한국과학기술원 | Method and system for managing the names of persons attached to digital photos |
US8805079B2 (en) | 2009-12-02 | 2014-08-12 | Google Inc. | Identifying matching canonical documents in response to a visual query and in accordance with geographic information |
US8462224B2 (en) * | 2010-06-01 | 2013-06-11 | Hewlett-Packard Development Company, L.P. | Image retrieval |
JP2013069024A (ja) * | 2011-09-21 | 2013-04-18 | Fuji Xerox Co Ltd | Image search program and image search device |
US8929615B2 (en) * | 2011-11-03 | 2015-01-06 | Facebook, Inc. | Feature-extraction-based image scoring |
JP2012079354A (ja) * | 2012-01-26 | 2012-04-19 | Casio Comput Co Ltd | Image display control device, image display control method, and program |
US20150032535A1 (en) * | 2013-07-25 | 2015-01-29 | Yahoo! Inc. | System and method for content based social recommendations and monetization thereof |
KR101479260B1 (ko) * | 2013-09-10 | 2015-01-09 | 부산대학교 산학협력단 | Photo-based person intimacy search method |
US9420442B2 (en) * | 2014-10-06 | 2016-08-16 | Facebook, Inc. | Ping compensation factor for location updates |
CN104408402B (zh) * | 2014-10-29 | 2018-04-24 | 小米科技有限责任公司 | Face recognition method and device |
CN104715007A (zh) * | 2014-12-26 | 2015-06-17 | 小米科技有限责任公司 | User identification method and device |
2015
- 2015-07-31 CN CN201510463635.XA patent/CN105069083B/zh active Active
- 2015-12-16 WO PCT/CN2015/097611 patent/WO2017020476A1/zh active Application Filing
- 2015-12-16 MX MX2016006745A patent/MX361672B/es active IP Right Grant
- 2015-12-16 KR KR1020167013623A patent/KR101771153B1/ko active IP Right Grant
- 2015-12-16 RU RU2016119495A patent/RU2664003C2/ru active
- 2015-12-16 JP JP2016532556A patent/JP6263263B2/ja active Active
2016
- 2016-07-13 US US15/209,148 patent/US9892314B2/en active Active
- 2016-07-29 EP EP16181897.6A patent/EP3125188A1/en not_active Withdrawn
Also Published As
Publication number | Publication date |
---|---|
MX361672B (es) | 2018-12-13 |
KR20170023768A (ko) | 2017-03-06 |
RU2016119495A (ru) | 2017-11-23 |
CN105069083B (zh) | 2019-03-08 |
RU2664003C2 (ru) | 2018-08-14 |
US20170032180A1 (en) | 2017-02-02 |
US9892314B2 (en) | 2018-02-13 |
KR101771153B1 (ko) | 2017-08-24 |
JP6263263B2 (ja) | 2018-01-17 |
CN105069083A (zh) | 2015-11-18 |
MX2016006745A (es) | 2017-06-28 |
EP3125188A1 (en) | 2017-02-01 |
JP2017526989A (ja) | 2017-09-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017020476A1 (zh) | | Method and apparatus for determining an associated user |
US9953212B2 (en) | 2018-04-24 | Method and apparatus for album display, and storage medium |
WO2017088470A1 (zh) | | Image classification method and device |
EP3125135B1 (en) | 2018-07-25 | Picture processing method and device |
WO2017084182A1 (zh) | | Picture processing method and device |
US20170154206A1 (en) | 2017-06-01 | Image processing method and apparatus |
TWI702544B (zh) | 2020-08-21 | Image processing method, electronic device and computer-readable storage medium |
WO2021036382A1 (zh) | | Image processing method and device, electronic device and storage medium |
WO2017214793A1 (zh) | | Fingerprint template generation method and device |
RU2643464C2 (ru) | 2018-02-01 | Method and device for image classification |
EP3173969B1 (en) | 2019-07-17 | Method, apparatus and terminal device for playing music based on a target face photo album |
JP6305565B2 (ja) | 2018-04-04 | Method and device for grouping photos |
WO2017000491A1 (zh) | | Method and device for acquiring an iris image, and iris recognition device |
US9779294B2 (en) | 2017-10-03 | Methods and devices for classifying pictures |
US11551465B2 (en) | 2023-01-10 | Method and apparatus for detecting finger occlusion image, and storage medium |
WO2017140109A1 (zh) | | Pressure detection method and device |
CN105335714A (zh) | 2016-02-17 | Photo processing method, apparatus and device |
CN112069951A (zh) | 2020-12-11 | Video clip extraction method, video clip extraction device, and storage medium |
CN111797746B (zh) | 2024-06-14 | Face recognition method and device, and computer-readable storage medium |
CN105426904A (zh) | 2016-03-23 | Photo processing method, apparatus and device |
CN109145151B (zh) | 2021-05-18 | Method and device for acquiring emotion classification of a video |
CN110020117B (zh) | 2021-01-29 | Interest information acquisition method and device, and electronic device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2016532556 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2016119495 Country of ref document: RU Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 20167013623 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: MX/A/2016/006745 Country of ref document: MX |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15900243 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15900243 Country of ref document: EP Kind code of ref document: A1 |