WO2016200405A1 - Search based on features of a person - Google Patents

Search based on features of a person

Info

Publication number
WO2016200405A1
WO2016200405A1 (PCT/US2015/035631)
Authority
WO
WIPO (PCT)
Prior art keywords
person
data
individual
computing device
user
Prior art date
Application number
PCT/US2015/035631
Other languages
French (fr)
Inventor
Babak Makkinejad
Original Assignee
Hewlett Packard Enterprise Development Lp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hewlett Packard Enterprise Development Lp
Priority to PCT/US2015/035631
Publication of WO2016200405A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval of still image data
    • G06F16/53 Querying
    • G06F16/532 Query formulation, e.g. graphical querying
    • G06F16/60 Information retrieval of audio data
    • G06F16/63 Querying

Abstract

Example implementations relate to a search based on features of a person. For example, a computing device may include a processor. The processor may receive first data associated with features of a person, where the first data includes a user-generated image or an audio file. The processor may access parameters relating to a search of the person and identify a database record associated with an individual based on a comparison between the first data associated with the features of the person and other feature data associated with records in a database, where the comparison is based on the parameters relating to the search. The processor may provide information relating to the database record associated with the individual.

Description

SEARCH BASED ON FEATURES OF A PERSON
BACKGROUND
[0001] Communication using computing devices has become prevalent. For example, people often communicate through email using a variety of available email client applications. Some email client applications manage contact information for each user.
BRIEF DESCRIPTION OF THE DRAWINGS
[0002] Some examples of the present application are described with respect to the following figures:
[0003] FIG. 1 is a block diagram of an example system for searching records based on features of a person;
[0004] FIG. 2 is a block diagram of an example computing device for searching records based on features of a person; and
[0005] FIG. 3 is a flowchart of an example method for searching records based on features of a person.
DETAILED DESCRIPTION
[0006] As described above, email has become a prevalent way to communicate with people. For example, a user may send an email to another person if that person's contact information is known (e.g., name, email address, etc.). However, some users are more visual and can remember a person's face better than they can remember the person's name. Others may recall a person's voice more accurately and reliably than the person's face and/or name. As such, a user may have trouble finding contact information for a person to whom the user wishes to send a communication if that person's name and/or contact information is unknown to the user.
[0007] The examples discussed herein relate to searching records based on features of a person, where the records may each be associated with a particular person and contain information associated with that particular person. A user may search for a person's information using features of that person. For example, a user may submit an image of the person, a user-generated image of the person (e.g., a digital sketch of the person), an audio file having audio of the person's voice, and the like. That submitted data may be compared to data associated with records of other people, and results for people and/or information associated with people may be returned based on the comparison.
[0008] Referring now to the figures, FIG. 1 is a block diagram of an example system 100 for searching records based on features of a person. System 100 may include computing device 104, which may be any suitable computing device managing any suitable service associated with users having accounts with the service (e.g., an email service), such as a web-based server, a local area network server, a cloud-based server, and the like. While examples discussed throughout describe an email service, one of ordinary skill in the art will appreciate that the techniques disclosed herein may be used with any suitable service. Computing device 104 may include a processor 106 and a records search engine 108 that may be any suitable engine that may cause the processor 106 to search for records associated with users based on features of a person. In some examples, the records search engine 108 may be a software add-on for an email application managed by computing device 104. For example, the records search engine 108 of computing device 104 may cause processor 106 to receive data associated with features of a person. The data may be received from a user of any of the user systems 102 and may include a user-generated image and/or an audio file associated with the person. For example, the user may wish to obtain information associated with a particular person, and the user may provide data associated with that particular person (e.g., a user-generated image of the person, an audio file with audio of the person's voice, etc.) such that additional information about that person may be found based on the provided data.
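As a rough, non-authoritative illustration of the role described for records search engine 108, the Python sketch below (all class and function names are hypothetical and not part of the disclosure) walks through the four processor actions this paragraph and the next two describe: receiving feature data, accessing search parameters, identifying a matching record, and providing the associated information.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class FeatureQuery:
    """Data submitted by a user: a user-generated image and/or an audio clip."""
    image_bytes: Optional[bytes] = None   # e.g., a scanned sketch of the person
    audio_bytes: Optional[bytes] = None   # e.g., a voicemail from the person


class RecordsSearchEngine:
    """Hypothetical sketch of a records search engine; not the patented implementation."""

    def __init__(self, database, min_confidence=0.8):
        self.database = database              # iterable of per-person records (dicts)
        self.min_confidence = min_confidence  # an accessed search parameter

    def search(self, query: FeatureQuery) -> Optional[dict]:
        # Score every record against the submitted feature data and keep the best
        # match that satisfies the acceptable confidence level.
        scored = [(self._similarity(query, record), record) for record in self.database]
        scored = [pair for pair in scored if pair[0] >= self.min_confidence]
        if not scored:
            return None
        best_score, best_record = max(scored, key=lambda pair: pair[0])
        # Provide information relating to the identified record.
        return {"contact_info": best_record.get("contact_info"), "confidence": best_score}

    def _similarity(self, query: FeatureQuery, record: dict) -> float:
        # Placeholder: a real system would apply face and/or voice recognition here.
        return 0.0
```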
[0009] In response to receiving the data from a user system 102, the records search engine 108 of the computing device 104 may cause the processor 106 to access parameters relating to a search of the person. The parameters may include any suitable parameters associated with a search for information associated with the person. For example, the parameters may include user-specified parameters, such as an acceptable confidence level of a match between the data submitted by the user and data associated with database records that are searched. In some examples, the parameters may include additional data provided by a user (e.g., a location associated with the person being searched, business and/or organizational affiliation, age, etc.).
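The parameters described here could be collected in a small structure such as the following; the field names are assumptions made for illustration and do not come from the disclosure.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class SearchParameters:
    """Hypothetical container for the user-specified search parameters."""
    # Acceptable confidence level (0.0 to 1.0) for a match between the submitted
    # data and the feature data stored with a database record.
    min_confidence: float = 0.8
    # Optional additional data about the person being searched.
    location: Optional[str] = None
    organization: Optional[str] = None
    estimated_age: Optional[int] = None


# Example: the user accepts only high-confidence matches and knows the
# person's organizational affiliation.
params = SearchParameters(min_confidence=0.9, organization="Example Corp")
```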
[0010] The records search engine 108 of the computing device 104 may cause the processor 106 to identify a database record associated with an individual based on a comparison between the data associated with the features of the person and other feature data associated with records in a database 110, where the comparison may be based on the parameters relating to the search. The database 110 may be any suitable database that may be accessed by the computing device 104 and may store information associated with users of the email service provided by the computing device 104. The records search engine 108 of the computing device 104 may cause the processor 106 to provide information relating to the identified database record associated with the individual.
[0011] Computing device 104 and user systems 102 may be in communication with each other directly or over a network, which may be any suitable network, such as an ad hoc network, an intranet, an extranet, a virtual private network (VPN), a local area network (LAN), a wireless LAN (WLAN), a wide area network (WAN), a wireless WAN (WWAN), a metropolitan area network (MAN), a portion of the Internet, a portion of the Public Switched Telephone Network (PSTN), a cellular telephone network, or any other type of network, or a combination of two or more such networks. Each user system 102 may be a computing device used by a particular user of the email service associated with the computing device 104. For example, a user system 102 may be a computing device through which a user may initiate a search for information about a particular person using a user-generated image and/or an audio file.
[0012] FIG. 2 is a block diagram of an example computing device 200 for searching records based on features of a person. Computing device 200 may be any suitable computing device (e.g., computing device 104 of FIG. 1) that may manage an email service and may perform a search for information associated with a person based on data received from a user.
[0013] Computing device 200 may be, for example, a web-based server, a local area network server, a cloud-based server, a notebook computer, a desktop computer, an all-in-one system, a tablet computing device, a mobile phone, an electronic book reader, a printing device, or any other electronic device suitable for searching records based on features of a person. Computing device 200 may include a processor 202 and a machine-readable storage medium 204. Computing device 200 may receive data associated with features of a person (e.g., an image of the person generated by the user through some other system or tool, an audio file with an audio imprint of the person's voice, etc.) and may return information relating to that person based on the data received.
[0014] Processor 202 is a tangible hardware component that may be a central processing unit (CPU), a semiconductor-based microprocessor, and/or other hardware devices suitable for retrieval and execution of instructions stored in machine-readable storage medium 204. Processor 202 may fetch, decode, and execute instructions 206, 208, 210, and 212 to control a process of searching records based on features of a person. As an alternative or in addition to retrieving and executing instructions, processor 202 may include at least one electronic circuit that includes electronic components for performing the functionality of instructions 206, 208, 210, 212, or a combination thereof.
[0015] Machine-readable storage medium 204 may be any electronic, magnetic, optical, or other physical storage device that contains or stores executable instructions. Thus, machine-readable storage medium 204 may be, for example, Random Access Memory (RAM), an EPROM, an Electrically Erasable Programmable Read-Only Memory (EEPROM), a storage device, an optical disc, and the like. In some examples, machine-readable storage medium 204 may be a non-transitory storage medium, where the term "non-transitory" does not encompass transitory propagating signals. As described in detail below, machine-readable storage medium 204 may be encoded with a series of processor executable instructions 206, 208, 210, and 212 for receiving data associated with features of a person, where the data includes a user-generated image or an audio file; determining parameters relating to a search of the person; accessing records in a database, each record in the database being associated with a particular individual; identifying a database record associated with an individual based on a comparison between the data associated with the features of the person and other feature data associated with the records in the database, the comparison being based on the parameters relating to the search; and providing information relating to the database record associated with the individual.
[0016] Data receipt instructions 206 may manage and control the receipt of data associated with features of a person. The data may be received from a user device of a user wishing to search for information associated with the person. The data may include any data associated with features of the person being searched, such as an image of the person, a user-generated image of the person, an audio file with audio of the person's voice, and the like. In some examples, the data receipt instructions 206 may allow a user to provide a digital sketch of the person. For example, the user may sketch an image of the person and scan the image such that the data receipt instructions 206 may receive the scanned user-generated image. In some examples, a user may be provided with digital sketching features that may allow a user to select various characteristics of the person. For example, the user may be provided with options about eye shape characteristics, and the user may specify a particular eye shape that the person being searched has. The characteristic selections may be used to create a user-generated image of the person being searched. In some examples, the user may scan an image of the person such that the data receipt instructions 206 may receive the scanned image.
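Purely as an assumption about how such a characteristic-selection front end might be wired, the sketch below records the user's choices and turns them into the feature description that stands in for a user-generated image; every name here is hypothetical.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Selectable characteristics and a few illustrative options for each.
CHARACTERISTIC_OPTIONS: Dict[str, List[str]] = {
    "eye_shape": ["round", "almond", "hooded"],
    "hair_color": ["black", "brown", "blond", "red", "gray"],
    "face_shape": ["oval", "square", "heart"],
}


@dataclass
class SketchSelections:
    """Characteristics the user picked while building a digital sketch."""
    choices: Dict[str, str] = field(default_factory=dict)

    def select(self, characteristic: str, option: str) -> None:
        if option not in CHARACTERISTIC_OPTIONS.get(characteristic, []):
            raise ValueError(f"unknown option {option!r} for {characteristic!r}")
        self.choices[characteristic] = option

    def to_query(self) -> Dict[str, str]:
        """The accumulated selections become the search query for the person."""
        return dict(self.choices)


# Example: the user remembers the person's eye shape and hair color.
selections = SketchSelections()
selections.select("eye_shape", "almond")
selections.select("hair_color", "brown")
query = selections.to_query()
```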
[0017] Search parameter instructions 208 may manage and control the determination of parameters relating to a search of the person. The parameters may be any suitable parameters relating to the search and may be determined in any suitable manner. For example, a user may specify parameters (e.g., preferences settings) that indicate an acceptable confidence level of a match between the data received and feature data of records associated with people using the email service, and those parameters may be accessed and determined. In some examples, a user may provide additional data associated with features of the person (e.g., a location associated with the person, an organizational affiliation associated with the person, a time zone associated with the person, a gender of the person, a skin tone of the person, a hair color of the person, an ethnicity of the person, estimated age of the person, etc.), and the additional data may be used as parameters relating to the search of the person.
[0018] Database search instructions 210 may manage and control the searching of database records based on the data received and the parameters associated with the search. Database search instructions 210 may access records in a database (e.g., database 110 of FIG. 1), where each record in the database may be associated with a particular individual (e.g., an account of an individual using the email service). Database search instructions 210 may identify a database record associated with an individual based on a comparison between the data associated with features of the person and other feature data associated with the records in the database. The comparison may be based on the determined parameters relating to the search. For example, if the data received includes an image (e.g., a user-generated image), facial and/or feature recognition techniques may be used to compare the image received and facial and/or feature data in the database to identify records for people whose faces and/or features are similar to the image received. In another example, if the data received includes an audio file, voice recognition techniques may be used to compare the audio file received and voice imprint data in the database to identify records for people whose voices are similar to the voice in the audio file received. The identified records may be further filtered based on the parameters of the search.
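A minimal sketch of this comparison step, assuming the submitted data and each stored record have already been reduced to numeric feature vectors by a face- or voice-recognition front end (that front end is outside the sketch, and all names are illustrative):

```python
import math
from typing import Dict, List, Optional, Sequence


def cosine_similarity(a: Sequence[float], b: Sequence[float]) -> float:
    """Similarity score between two feature vectors; 0.0 if either is all zeros."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


def find_matches(query_vector: Sequence[float],
                 records: List[Dict],
                 min_confidence: float,
                 location: Optional[str] = None) -> List[Dict]:
    """Compare the query against each record's stored feature vector, keep matches
    above the acceptable confidence level, and filter by any additional parameters
    (here, an optional location)."""
    matches = []
    for record in records:
        score = cosine_similarity(query_vector, record["feature_vector"])
        if score < min_confidence:
            continue
        if location is not None and record.get("location") != location:
            continue
        matches.append({**record, "confidence": score})
    # Highest-confidence matches first, e.g., to present the top few to the user.
    return sorted(matches, key=lambda r: r["confidence"], reverse=True)
```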
[0019] Information retrieval instructions 212 may manage, control, access, and provide information relating to the database record associated with the individual. The information provided may be any suitable information associated with the individual's record. For example, the information provided may include at least one email associated with the individual (e.g., an email the individual sent and/or received), contact information associated with the individual, an image associated with the individual, and the like. In some examples, the information provided may include voice data associated with the individual, where the voice data may be used to provide audio that mimics the individual's voice. For example, the voice data may be used to provide an audio reading of an email associated with the individual in a voice similar to the individual, such that it may sound as if the individual is reading the email.
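The retrieval step could be as simple as projecting out the presentable fields of the identified record; the voice-mimicking readout depends on a speech-synthesis engine that the text does not name, so it is left as a documented stub. Field and function names are assumptions.

```python
from typing import Dict


def retrieve_information(record: Dict) -> Dict:
    """Pull the presentable fields from an identified record."""
    return {
        "contact_info": record.get("contact_info"),
        "recent_emails": record.get("emails", [])[:5],
        "image": record.get("profile_image"),
    }


def read_email_aloud(email_body: str, voice_data: bytes) -> bytes:
    """Stub for the voice-mimicking readout: a real system would hand the email
    text and the individual's stored voice data to a speech-synthesis engine and
    return the synthesized audio. No particular engine is specified here."""
    raise NotImplementedError("speech synthesis engine not specified")
```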
[0020] FIG. 3 is a flowchart of an example method 300 for searching records based on features of a person. Method 300 may be implemented using computing device 200 of FIG. 2.
[0021] Method 300 includes, at 302, receiving first data associated with features of a person. The first data may include a user-generated image or an audio file. For example, a user may generate an image of the person (e.g., a digital sketch) or provide an audio file having audio of the person's voice (e.g., a voicemail file of a voicemail from the person).
[0022] Method 300 also includes, at 304, determining parameters relating to a search of the person. The parameters may include any suitable parameters (e.g., an acceptable confidence level of a match between the received data and data associated with records for individuals associated with the email service, additional data associated with the person being searched, etc.).
[0023] Method 300 also includes, at 306, comparing the first data associated with the features of the person and other feature data associated with records in a database. The comparison may be based on the parameters relating to the search (e.g., based on the acceptable confidence level, the additional data provided by the user, etc.). For example, the first data may be compared with the other feature data associated with records in a database to determine how closely the data matches.
[0024] Method 300 also includes, at 308, identifying a database record associated with an individual based on comparing the first data and the other feature data. In some examples, a set of database records may be identified based on the comparison (e.g., the top 5 most closely-matching records).
[0025] Method 300 also includes, at 310, providing information relating to the database record associated with the individual. Any suitable information may be provided, such as contact information associated with the individual. In some examples, information associated with a set of records may be provided such that a user may view, select, and/or use information the user believes corresponds to the person being searched.
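Tying blocks 302 through 310 together, a compact end-to-end sketch of method 300 might look like the following; the feature extractor and similarity function are passed in because the text does not commit to particular recognition techniques, and all names are illustrative.

```python
from typing import Callable, Dict, List, Sequence


def method_300(first_data: bytes,
               extract_features: Callable[[bytes], Sequence[float]],
               similarity: Callable[[Sequence[float], Sequence[float]], float],
               records: List[Dict],
               min_confidence: float = 0.8,
               top_n: int = 5) -> List[Dict]:
    """End-to-end sketch of method 300 (blocks 302-310)."""
    # 302: receive first data (a user-generated image or an audio file).
    query_vector = extract_features(first_data)

    # 304: determine parameters relating to the search
    #      (min_confidence and top_n stand in for the user's preferences here).

    # 306: compare the first data with the feature data of each record.
    scored = [(similarity(query_vector, rec["feature_vector"]), rec) for rec in records]

    # 308: identify the most closely matching records above the confidence level.
    matches = sorted((pair for pair in scored if pair[0] >= min_confidence),
                     key=lambda pair: pair[0], reverse=True)[:top_n]

    # 310: provide information relating to the identified record(s).
    return [{"contact_info": rec.get("contact_info"), "confidence": score}
            for score, rec in matches]
```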
[0026] Examples provided herein (e.g., methods) may be implemented in hardware, software, or a combination of both. Example systems may include a controller/processor and memory resources for executing instructions stored in a tangible non-transitory medium (e.g., volatile memory, non-volatile memory, and/or machine-readable media). Non-transitory machine-readable media can be tangible and have machine-readable instructions stored thereon that are executable by a processor to implement examples according to the present disclosure.
[0027] An example system can include and/or receive a tangible non-transitory machine-readable medium storing a set of machine-readable instructions (e.g., software). As used herein, the controller/processor can include one or a plurality of processors such as in a parallel processing system. The memory can include memory addressable by the processor for execution of machine-readable instructions. The machine-readable medium can include volatile and/or non-volatile memory such as a random access memory ("RAM"), magnetic memory such as a hard disk, floppy disk, and/or tape memory, a solid state drive ("SSD"), flash memory, phase change memory, and the like.

Claims

What is claimed is:
1. A computing device, comprising:
a processor to:
receive first data associated with features of a person, wherein the first data includes a user-generated image or an audio file;
access parameters relating to a search of the person;
identify a database record associated with an individual based on a comparison between the first data associated with the features of the person and other feature data associated with records in a database, the comparison being based on the parameters relating to the search; and
provide information relating to the database record associated with the individual.
2. The computing device of claim 1, wherein the parameters specify an acceptable confidence level of a match between the first data and the other feature data.
3. The computing device of claim 1, wherein the parameters specify additional data associated with the features of the person, the additional data including at least one of a location associated with the person, an organizational affiliation associated with the person, a time zone associated with the person, a gender of the person, a skin tone of the person, a hair color of the person, an ethnicity of the person, and estimated age of the person.
4. The computing device of claim 1, wherein the first data includes the user-generated image and wherein the user-generated image is a sketch of the person generated by a user providing the first data.
5. The computing device of claim 1, wherein the first data includes the audio file and wherein the audio file includes audio of a voice of the person.
6. The computing device of claim 1, wherein the information relating to the database record associated with the individual includes at least one email associated with the individual or contact information associated with the individual.
7. The computing device of claim 1, wherein the information relating to the database record associated with the individual includes an email associated with the individual, and wherein the processor is further to: provide an audio reading of the email in a voice similar to the individual.
8. A method, comprising:
receiving, by a computing device, first data associated with features of a person, wherein the first data includes a user-generated image or an audio file;
determining, by the computing device, parameters relating to a search of the person;
comparing, by the computing device, the first data associated with the features of the person and other feature data associated with records in a database, the comparison being based on the parameters relating to the search;
identifying, by the computing device, a database record associated with an individual based on comparing the first data and the other feature data; and
providing, by the computing device, information relating to the database record associated with the individual.
9. The method of claim 8, wherein the first data includes the user-generated image and wherein the user-generated image is a sketch of the person generated by a user providing the first data.
10. The method of claim 8, wherein the first data includes the audio file and wherein the audio file includes audio of a voice of the person.
11. The method of claim 8, wherein the information relating to the database record associated with the individual includes at least one email associated with the individual or contact information associated with the individual.
12. A non-transitory machine-readable storage medium storing instructions that, when executed by at least one processor of a computing device, cause the computing device to:
receive first data associated with features of a person, wherein the first data includes a user-generated image or an audio file;
determine parameters relating to a search of the person;
access records in a database, each record in the database being associated with a particular individual;
identify a database record associated with an individual based on a comparison between the first data associated with the features of the person and other feature data associated with the records in the database, the comparison being based on the parameters relating to the search; and
provide information relating to the database record associated with the individual.
13. The non-transitory machine-readable storage medium of claim 12, wherein the first data includes the user-generated image and wherein the user-generated image is a sketch of the person generated by a user providing the first data.
14. The non-transitory machine-readable storage medium of claim 12, wherein the first data includes the audio file and wherein the audio file includes audio of a voice of the person.
15. The non-transitory machine-readable storage medium of claim 12, wherein the information relating to the database record associated with the individual includes at least one email associated with the individual or contact information associated with the individual.
PCT/US2015/035631 2015-06-12 2015-06-12 Search based on features of a person WO2016200405A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2015/035631 WO2016200405A1 (en) 2015-06-12 2015-06-12 Search based on features of a person

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2015/035631 WO2016200405A1 (en) 2015-06-12 2015-06-12 Search based on features of a person

Publications (1)

Publication Number Publication Date
WO2016200405A1 (en) 2016-12-15

Family

ID=57504589

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2015/035631 WO2016200405A1 (en) 2015-06-12 2015-06-12 Search based on features of a person

Country Status (1)

Country Link
WO (1) WO2016200405A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040228504A1 (en) * 2003-05-13 2004-11-18 Viswis, Inc. Method and apparatus for processing image
US20050076004A1 (en) * 2003-09-30 2005-04-07 Hiroyuki Yanagisawa Computer, database generating method for electronic picture book service, photographed subject information providing method, recording medium, and computer data signal
US20100150410A1 (en) * 2005-09-28 2010-06-17 Facedouble Incorporated Image Classification And Information Retrieval Over Wireless Digital Networks And The Internet
US20120114197A1 (en) * 2010-11-09 2012-05-10 Microsoft Corporation Building a person profile database
WO2014140834A1 (en) * 2013-03-15 2014-09-18 Orcam Technologies Ltd. Systems and methods for audible facial recognition


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 15895119; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 15895119; Country of ref document: EP; Kind code of ref document: A1)