AU2005306571A1 - Computer-based method and system for identifying a potential partner - Google Patents


Info

Publication number
AU2005306571A1
Authority
AU
Australia
Prior art keywords
user
database
face
server
biometric data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
AU2005306571A
Inventor
Sinisa Cupac
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2004906566A external-priority patent/AU2004906566A0/en
Application filed by Individual filed Critical Individual
Priority to AU2005306571A priority Critical patent/AU2005306571A1/en
Priority claimed from PCT/AU2005/001733 external-priority patent/WO2006053375A1/en
Publication of AU2005306571A1 publication Critical patent/AU2005306571A1/en
Priority to AU2012201564A priority patent/AU2012201564A1/en
Abandoned legal-status Critical Current

Landscapes

  • Collating Specific Patterns (AREA)

Description

COMPUTER-BASED METHOD AND SYSTEM FOR IDENTIFYING A POTENTIAL PARTNER

FIELD OF THE INVENTION

The present invention relates generally to the field of communications. More particularly, the invention comprises a method and system for identifying a potential partner using a computer-based system.

BACKGROUND TO THE INVENTION

Dating and introduction services have historically relied on approaches whereby a client completes a questionnaire to ascertain and classify their interests, hobbies, likes, and dislikes. The dating service will then cross-match the questionnaire responses of the new client with those of existing clients in an effort to identify a number of potentially compatible individuals for the new client.

Generally, this approach is convenient only if the database of the dating service is quite small, since many people share interests such as reading, going to the movies and the like. However, where the database is large, there will be many potential matches for a new client, especially where the new client provides fairly generic responses to the questionnaire (as most are wont to do). Accordingly, a problem is that a large number of potential partners will be identified on the basis that they have provided a similarly generic response.

With the advent of Internet-based dating services, databases of individuals searching for a suitable partner have become particularly large; for example, the service known as "America's Online Dating" currently has 3.5 million users. Clearly, a database having this number of individuals will expose a new client to a very large number of potential partners having similar interests. It is tedious for the new client to search through the many potential partners identified by these services to find a suitable one. Of course, the search could be narrowed to exclude individuals outside the geographical area of the new client, but for a large city there may still remain an unmanageably large number of potential partners to screen.
Another problem is that many dating services require a new client to fill out lengthy questionnaires. This approach is intended to overcome the problem of a very large number of potential partners being identified for a new client, as is often the case where the questionnaire is simplistic and unable to discriminate. While lengthy questionnaires certainly provide greater discriminatory power, they are tedious to prepare, and many clients find the questions overly personal and invasive.

A further problem is that even where the new client is prepared to screen a very large number of potential partners identified on the basis of a questionnaire with low discriminatory power, or is prepared to complete a more detailed questionnaire to provide a higher level of discrimination, the potential partners identified can often disappoint.

It is an aspect of the present invention to at least alleviate a problem of the prior art by providing a system and method for identifying a potential partner over a computer network.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG 1 shows a schematic diagram of a generic biometrics-based system.
FIG 2 shows a schematic diagram of the communication paths between the face recognition software (BioID), the web server, the database server, the website (front end), and the BioID server.
FIG 3 shows a schematic representation of the system process, including interaction with the BioID components.
FIG 4 is a flow chart representing the system overview.
FIGS 5 and 6 represent the database structure in the form of an entity relationship diagram.
FIG 7 shows a flow chart representation of the user registration process.
FIG 8 shows a flow chart representation of the method by which a user creates a user profile.
FIG 9 shows an overview of the photo uploading and matching process.
FIG 10 shows a flow chart representation of the process by which a user logs into their account and uploads a photo via a website.
FIG 11 shows the process by which, after upload of the user's photo, the image is passed to BioID for enrolment.
FIG 12 shows a flow chart representation for uploading a user's image by mobile telephony device.
FIG 13 shows SMS notification confirming upload of a photograph as shown in FIG 12.
FIG 14 shows a flow chart representation of the process for matching a user with another user on the database having similar facial features.
FIG 15 shows delivery of matches to the user via web page.
FIG 16 shows delivery of matches to the user via mobile telephony device.
FIG 17 shows a schematic structure of an intermediate interface for the interaction between BioID components, the database, and the client browser.

SUMMARY OF THE INVENTION

In a first aspect, the present invention provides a method for identifying a potential partner for a user, the method including the steps of: providing biometric data characterising a physical feature of the user and/or a parent of the user, providing a database having biometric data characterising a physical feature of a plurality of individuals, comparing the biometric data of the user with at least a proportion of the biometric data on the database, and identifying at least one individual characterised by biometric data that is the same as or similar to that of the user.

Applicants propose that in computer-based match making services, the use of biometric data for screening a database of potential partners may dramatically cut down on the number of potential partners identified as potentially compatible with a user. Furthermore, the potential partners identified by the methods and systems described herein may be of greater compatibility than those obtainable by methods of the prior art, where no biometric comparisons are made between individuals.

Without wishing to be limited by theory, the invention is proposed to rely on the subconscious desire for people to form relationships with others having similar physical characteristics or a similar level of attractiveness to their own. It is a widespread belief that human partners look alike. Positive assortative mating, mating with partners more similar than expected by chance, may result in more stable partnerships and may have genetic benefits, although the costs of inbreeding may limit the amount of self-similarity that should be tolerated. Research has shown positive assortment for many physical features, and partners' faces resemble each other in ways that allow them to be identified as partners at levels above chance. Furthermore, it is proposed that there is a subconscious desire for people to form relationships with others having similar characteristics to their opposite-sex parent, whether that parent is the biological or non-biological parent.

In a preferred form of the method, the level of similarity is not so great that the user is presented with potential partners of such high similarity that the user could view them as a relative. Instinctively, humans (like many other animals) avoid mating with relatives for the simple biological reason of avoiding congenital disorders in their offspring. The avoidance of exact matches is a difference from methods in the prior art using biometric data for validation of an individual's identity. In the prior art methods, it is desirable for face recognition software to find an exact match to the individual under examination such that their identity can be validated.
By contrast, the methods of the present invention are not directed to that result, since only similarities are required between the user and the potential partner rather than strict congruence. Even in light of this, face recognition software packages of the prior art will be useful in the context of the present invention, as the algorithms used in such software generally have difficulty in identifying exact matches. Accordingly, the present invention also provides the use of face recognition software for identifying a potential partner for a user.

In a preferred form of the invention, the biometric data relates to the position or shape of anatomical features of the face and/or head, such as the eyes, ears, nose and mouth. In another form of the invention the biometric data relates to the colouring of features such as the skin, eyes and hair.
In a further aspect the present invention provides a computer-based system capable of identifying a potential partner for a user. The computer-based system includes software capable of executing a method as described herein. The system may be implemented over the Internet, incorporated into a standard Internet match making website.

In one form of the invention the system includes six major components. The first component of an automated biometric identification/verification system is a data acquisition component that acquires the biometric data in digital format by using a sensor. For face images the sensor is typically a digital camera. This component is optional, since the user may use their own digital camera. The second and third components of the system are also optional. They are the data compression and decompression mechanisms, which are designed to meet the data transmission and storage requirements of the system. The fourth component is the feature extraction algorithm. The feature extraction algorithm may produce a feature vector, in which the components are numerical characterizations of the underlying biometrics. The feature vectors are designed to characterize the underlying biometrics of the user for comparison with potential partners on the database. The fifth component of the system is the "matcher", which compares feature vectors obtained from the feature extraction algorithm to produce a similarity score. This score indicates the degree of similarity between a pair of biometric data under consideration. The sixth component of the system is a decision-maker, whereby a decision is made as to whether an individual on the database has sufficient biometric similarity to the user to warrant an introduction.
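Purely by way of illustration, the six components may be pictured as a simple processing pipeline. The sketch below is a minimal, hypothetical outline in Python; every function shown is a placeholder rather than part of any particular product, and the similarity measure and acceptance band are illustrative assumptions.

```python
"""Minimal sketch of the six-component biometric matching pipeline.
All function names are hypothetical placeholders; the patent does not
prescribe a particular implementation."""

import zlib


def acquire_image(path):
    # Component 1 (optional if the user supplies their own photo):
    # read raw image bytes from a sensor or an uploaded file.
    with open(path, "rb") as f:
        return f.read()


def compress(data):
    # Component 2 (optional): reduce transmission/storage size.
    return zlib.compress(data)


def decompress(data):
    # Component 3 (optional): restore the image for processing.
    return zlib.decompress(data)


def extract_features(image_bytes):
    # Component 4: reduce the image to a numerical feature vector.
    # A real system would locate the face and measure landmarks;
    # here we simply return a fixed-length toy vector.
    return [float(b) for b in image_bytes[:16]]


def match(vec_a, vec_b):
    # Component 5: the "matcher" turns two feature vectors into a
    # similarity score between 0.0 and 1.0 (1.0 = identical).
    dist = sum((a - b) ** 2 for a, b in zip(vec_a, vec_b)) ** 0.5
    return 1.0 / (1.0 + dist)


def decide(score, low=0.6, high=0.95):
    # Component 6: the decision-maker. The invention prefers similar,
    # but not near-identical, faces, so accept a band of scores.
    # The band limits here are illustrative assumptions.
    return low <= score < high
```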
DETAILED DESCRIPTION OF THE INVENTION

In a first aspect, the present invention provides a method for identifying a potential partner for a user and/or a parent of the user, the method including the steps of: providing biometric data characterising a physical feature of the user, providing a database having biometric data characterising a physical feature of a plurality of individuals, comparing the biometric data of the user with at least a proportion of the biometric data on the database, and identifying at least one individual characterised by biometric data that is similar to that of the user.

Applicants propose that the use of biometric data for screening a database of potential partners may dramatically cut down on the number of potential partners identified as potentially compatible. Furthermore, the potential partners identified by the methods and systems described herein may be of greater compatibility than those obtainable by methods of the prior art.

Without wishing to be limited by theory, the invention is proposed to rely on the subconscious desire for people to form relationships with others having similar physical characteristics or a similar level of attractiveness to their own. Thus, the level of similarity between the user and a potential partner is not less than a minimum value, such that the user has an attraction or affinity for the potential partner. It is proposed that when the present methods are implemented, the attraction or affinity between the user and the potential partner is greater than where there is no comparison of biometric data. However, it will be appreciated that the level of similarity should not be greater than a maximum value, beyond which the user has little or no attraction or affinity for the potential partner.

The attraction or affinity may be on a physical, emotional or spiritual basis. Alternatively, the affinity or attraction may only be at the level of friendship. However, it is preferred that the affinity or attraction is predominantly physical in nature, at least in the first instance.

In a preferred form of the invention, the biometric data relates to the face. There are many algorithms known in the art capable of converting a digital image to a set of numerical values for the purposes of characterising a person's facial features. Facial biometric-based systems are often used in security settings such as airports, or for identity authentication applications. However, they have yet to fulfil their promise, given that they rarely achieve even 90% accuracy. By contrast, these facial biometric-based systems are well suited to applications of the present invention, since it is not necessary (or desired) for the software to find an exact match between two faces.

Some of the methods by which facial recognition technology identifies a match between two photographs include consideration of the shape or size of the upper outlines of the eye sockets, the geometry of the cheekbone area, the shape and size of the sides of the mouth, the distance between the eyes, and the length or shape of the nose. In an alternative form of the invention the similarity between two faces could be ascertained by "eigenface" technology. This methodology uses the whole face by slicing it into hundreds of gray-scale layers, each with distinctive features.

The invention relies on a computer-based comparison of two sets of biometric data and making a decision about whether or not they relate to persons having similar facial features. The computer may perform this function by providing a similarity measurement in the form of a numerical score which informs as to the similarity of the pair of underlying biometric data. Alternatively, the computer may generate a list of pair-wise biometric data comparisons arranged in order of similarity, commonly known as a candidate list.
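Purely by way of illustration, such a candidate list may be generated as sketched below. The similarity measure and the minimum and maximum similarity values used here are illustrative assumptions rather than values prescribed by the invention; the list is returned with the most similar candidates first.

```python
def candidate_list(user_vec, database, min_sim=0.6, max_sim=0.95):
    """Return (individual_id, similarity) pairs for database entries whose
    similarity to the user falls inside the preferred band: high enough to
    suggest attraction, but below the near-identical level the description
    warns against. `database` maps individual ids to feature vectors."""
    def similarity(a, b):
        # Toy similarity: inverse of Euclidean distance, mapped to (0, 1].
        dist = sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
        return 1.0 / (1.0 + dist)

    scored = [(pid, similarity(user_vec, vec)) for pid, vec in database.items()]
    shortlist = [(pid, s) for pid, s in scored if min_sim <= s < max_sim]
    # Most similar candidates first.
    return sorted(shortlist, key=lambda item: item[1], reverse=True)


# Example: three individuals on the database; an identical face ("A") is
# excluded as too similar, a dissimilar face ("C") falls below the minimum.
db = {"A": [1.0, 2.0, 3.0], "B": [1.1, 2.1, 2.9], "C": [9.0, 9.0, 9.0]}
print(candidate_list([1.0, 2.0, 3.0], db))  # only "B" is returned
```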
It is stressed that the present invention is not limited to any particular facial feature mapping function, but can include any known or yet to be created algorithm suitable for the purposes described herein for recognising facial features, whether two-dimensional or three-dimensional, which may then also be used for ranging functions. Further, according to the present invention, ranging algorithms are used in combination with the known face recognition software.

The skilled person will understand that, for the purposes of the present invention, the particular facial features used by the face comparison algorithm can be optimised by routine experimentation. For example, many facial recognition software packages utilise the position of various anatomical "landmarks" in deciding whether two faces are the same. Software packages often define these landmarks as nodal points. There are about 80 nodal points on a human face. Useful anatomical landmarks may be selected from the group including: right eye pupil, left eye pupil, right mouth corner, left mouth corner, outer end of right eyebrow, inner end of right eyebrow, inner end of left eyebrow, outer end of left eyebrow, right temple, outer corner of right eye, inner corner of right eye, inner corner of left eye, outer corner of left eye, left temple, tip of nose, right nostril, left nostril, centre point on outer edge of upper lip, centre point on outer edge of lower lip, and tip of chin.

The various distances between these nodal points may be measured to create a numerical code that represents the face in a database. For some software packages only 14 to 22 nodal points are needed to complete the face matching process. For the purposes of the present invention, it is not necessarily desirable for the algorithm to identify a very similar face (such that the two faces could be considered to be of the same person). All that is required is for the algorithm to identify faces that are similar to that of the user. The skilled person could trial the use of different numbers of nodal points to find an optimal number that gives the greatest attractiveness to the user.

Apart from the use of distances between nodal points, consideration could also be given to the shape of various anatomical structures. For example, the shape of the jaw (square, pointed or rounded), the shape of the eyes (round or almond), the shape of the nose (wide or narrow), and the fullness of the lips could be considered. Other anatomical structures including the eye socket, nostril, ear, chin, cheek, forehead, head, teeth, eyebrow and eyelash could be incorporated into the biometric comparison.
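Purely by way of illustration, a numerical code built from distances between nodal points may be sketched as follows. The particular landmarks, coordinates and normalisation used here are illustrative assumptions.

```python
from itertools import combinations
from math import dist

# Hypothetical nodal points (x, y pixel coordinates) for one face.
# Real systems use roughly 14 to 80 such landmarks, as noted above.
landmarks = {
    "right_eye_pupil": (120.0, 95.0),
    "left_eye_pupil": (180.0, 96.0),
    "tip_of_nose": (150.0, 140.0),
    "right_mouth_corner": (128.0, 175.0),
    "left_mouth_corner": (172.0, 176.0),
    "tip_of_chin": (150.0, 215.0),
}


def face_code(points):
    """Build the numerical code that represents the face: the distances
    between every pair of nodal points, normalised by the inter-pupil
    distance so that the code does not depend on image scale."""
    scale = dist(points["right_eye_pupil"], points["left_eye_pupil"])
    names = sorted(points)
    return [dist(points[a], points[b]) / scale
            for a, b in combinations(names, 2)]


print(face_code(landmarks))  # 15 pairwise distances for 6 landmarks
```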
The present invention is not limited to the use of anatomical landmark or anatomical structure information, but extends to the colour of the skin, hair or eyes. The importance of these variables has been discovered during studies showing that humans select long-term partners who not only look like themselves, but look like their opposite-sex parents. It has been discovered that men are attracted to women who look like their mothers, and women prefer men who resemble their fathers. The same research has also shown that humans select partners who remind them of themselves, particularly in relation to traits such as hair and eye colour.

Studies examining hair and eye colour have shown evidence of positive assortment, which may reflect attraction to self-similar characteristics but is also consistent with attraction to parental traits (Little, A.C., Penton-Voak, I.S., Burt, D.M. & Perrett, D.I. (2003) An imprinting-like phenomenon in humans: partners and opposite-sex parents have similar hair and eye colour. Evolution and Human Behavior, 24: 43-51). This paper set out to establish whether the colouring of parents influenced choice of partner, and found significant correlations between parental characteristics and actual partner characteristics for both men and women, proving that parental colouring has an effect on human partner choice. In particular, it was found that colour traits in opposite-sex parents had more of an effect on partner choice than colour traits in the self or the same-sex parent. In other words, the subjects were more likely to choose partners who resembled their opposite-sex parent. The group found that the eye colour of opposite-sex parents significantly affected the choice of partner eye colour in both males and females. They also found that males' choice of partner hair colour was significantly positively affected by maternal hair colour.

Without wishing to be limited by theory, it is proposed that humans (and some animals) are attracted to elements which are familiar, or are in some way 'imprinted' with certain familiar characteristics from birth which they are then comfortable with, or attracted to, in the future. Traits such as hair and eye colour are examples of parental characteristics that offspring may 'learn' or be imprinted with. The theory of 'imprinting' is also thought to be one reason why individuals have different ideas of what is 'attractive'. Despite a high degree of agreement over what is and what is not 'attractive' throughout the world and across different cultures, this learning of parental characteristics may explain some individual differences in opinion about which characteristics are attractive in a partner.

The level of similarity between the user and the potential partner may be varied. In general, the higher the similarity the better. However, at a certain point further similarity does not improve attractiveness, or may even decrease attractiveness. Without wishing to be limited by theory, it is thought that too similar a match will trigger the user's instinct to avoid mating with family members. This trigger point may differ depending on race, sex, individual preference and so on. However, the skilled person could ascertain the point at which similarity becomes negatively correlated with attractiveness by simple trial and error.

The user may choose to exclude any of their own features from the similarity analysis. For example, if the user was particularly dissatisfied with the size or shape of their nose, this feature could be excluded from the similarity analysis such that potential partners having a similarly large or misshapen nose are excluded. Of course, this may lead to less than optimal matches being generated by the computer, but the similarity of other features such as the eyes or mouth may still result in the user finding attractiveness in a less than optimal match.
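As a purely illustrative sketch, the exclusion of a user-selected feature (such as the nose) from the similarity analysis could be implemented by masking the corresponding entries of the feature vectors before comparison. The feature labels and the similarity measure below are assumptions made for the example only.

```python
def masked_similarity(user_vec, other_vec, feature_names, excluded=()):
    """Compare two feature vectors while ignoring any features the user
    has chosen to exclude (e.g. everything derived from the nose).
    `feature_names[i]` labels position i of both vectors."""
    keep = [i for i, name in enumerate(feature_names)
            if not any(tag in name for tag in excluded)]
    a = [user_vec[i] for i in keep]
    b = [other_vec[i] for i in keep]
    distance = sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return 1.0 / (1.0 + distance)


names = ["eye_spacing", "nose_width", "nose_length", "mouth_width"]
u = [1.00, 0.45, 0.80, 0.62]
p = [1.02, 0.90, 1.30, 0.60]  # very different nose, similar eyes and mouth
print(masked_similarity(u, p, names))                      # penalised by the nose
print(masked_similarity(u, p, names, excluded=("nose",)))  # nose ignored
```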
Many algorithms for determining facial similarity are known in the art, and it will be within the capacity of the skilled person to choose one or more algorithms suitable for use with the present invention. Algorithms that may be used in the context of the present invention include those detailed by: Wiskott, L., et al, "Phantom Faces for Face Analysis", 1997, Institut fur Neuroinformatik, Germany, pp. 308-311; Wiskott, L., et al, "Face Recognition by Elastic Bunch Graph Matching", 1997, IEEE, pp. 775-779; Tomasi, C., et al, "Stereo Without Search", Proceedings of the European Conference on Computer Vision, Cambridge, UK, 1996, 14 pp. (7 sheets); Turk, M., et al, "Eigenfaces for Recognition", Journal of Cognitive Neuroscience, vol. 3, No. 1, pp. 71-86, 1991; Wiskott, L., et al, "Face Recognition by Elastic Bunch Graph Matching", Internal Report IR-INI 96-08, Institut fur Neuroinformatik, Ruhr-Universitat, Bochum, pp. 1-21, Apr. 1996; Wiskott, L., "Labeled Graphs and Dynamic Link Matching for Face Recognition and Scene Analysis", Verlag Harri Deutsch, Thun-Frankfurt am Main, Reihe Physik, Dec. 1995, pp. 1-109; Wiskott, L., "Phantom Faces for Face Analysis", Proceedings of the 3rd Joint Symposium on Neural Computation, Pasadena, CA, vol. 6, pp. 46-52, Jun. 1996; Wiskott, L., "Phantom Faces for Face Analysis", Internal Report IR-INI 96-06, Institut fur Neuroinformatik, Ruhr-Universitat, Bochum, Germany, Apr. 1996, 12 pp; Wiskott, L., "Phantom Faces for Face Analysis", Pattern Recognition, vol. 30, No. 6, pp. 837-846, 1997; Wiskott, L., et al, "Face Recognition by Elastic Bunch Graph Matching", IEEE Transactions on Pattern Analysis and Machine Intelligence, 19(7), pp. 775-779, 1997; Wong, R., et al, "PC-Based Human Face Recognition System", IEEE, pp. 641-644, 1992; Kruger, N., et al, "Object Recognition with a Sparse and Autonomously Learned Representation Based on Banana Wavelets", Internal Report 96-11, Institut fur Neuroinformatik, Dec. 1996, pp. 1-24; Kruger, N., et al, "Object Recognition with Banana Wavelets", European Symposium on Artificial Neural Networks (ESANN97), 1997, 6 pp; Lades, M., et al, "Distortion Invariant Object Recognition in the Dynamic Link Architecture", IEEE Transactions on Computers, vol. 42, No. 3, 1993, 11 pp; Manjunath, B.S., et al, "A Feature Based Approach to Face Recognition", Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 373-378, Mar. 1992; Mauer, T., et al, "Single-View Based Recognition of Faces Rotated in Depth", Proceedings of the International Workshop on Automatic Face and Gesture Recognition, pp. 248-253, Zurich, CH, Jun. 26, 1995; Mauer, T., et al, "Learning Feature Transformations to Recognize Faces Rotated in Depth", Proceedings of the International Conference on Artificial Neural Networks, vol. 1, pp. 353-358, Paris, France, Oct. 9-13, 1995; Mauer, T., et al, "Tracking and Learning Graphs and Pose on Image Sequences of Faces", Proceedings of the 2nd International Conference on Automatic Face and Gesture Recognition, Oct. 14-16, 1996, pp. 176-181; Peters, G., et al, "Learning Object Representations by Clustering Banana Wavelet Responses", Tech. Report IR-INI 96-09, Institut fur Neuroinformatik, Ruhr-Universitat, Bochum, 1996, 6 pp; Phillips, P.J., et al, "The Face Recognition Technology (FERET) Program", Proceedings of the Office of National Drug Control Policy, CTAC International Technology Symposium, Aug. 18-22, 1997, 10 pp; Roy, S., et al, "A Maximum Flow Formulation of the N Camera Stereo Correspondence Problem", IEEE, Proceedings of the International Conference on Computer Vision, Bombay, India, Jan. 1998, pp. 1-6;
Sara, R., et al, "3-D Data Acquisition and Interpretation for Virtual Reality and Telepresence", Proceedings of the IEEE Workshop on Computer Vision for Virtual Reality Based Human Communication, Bombay, Jan. 1998, 7 pp; Sara, R., et al, "On Occluding Contour Artifacts in Stereo Vision", IEEE, Proceedings of the International Conference on Computer Vision and Pattern Recognition, Puerto Rico, 1997, 6 pp; Steffens, J., et al, "PersonSpotter - Fast and Robust System for Human Detection, Tracking, and Recognition", Proceedings of the International Conference on Automatic Face and Gesture Recognition, 6 pp, Japan, Apr. 1998; Hall, E.L., "Computer Image Processing and Recognition", Academic Press, 1979, pp. 468-484; Hong, H., et al, "Online Facial Recognition based on Personalized Gallery", Proceedings of the International Conference on Automatic Face and Gesture Recognition, pp. 1-6, Japan, Apr. 1997; Kolocsai, P., et al, "Statistical Analysis of Gabor-Filter Representation", Proceedings of the International Conference on Automatic Face and Gesture Recognition, 1997, 4 pp; Ayache, N., et al, "Rectification of Images for Binocular and Trinocular Stereovision", Proceedings of the 9th International Conference on Pattern Recognition, 1, pp. 11-16, Italy, 1988; Beymer, D.J., "Face Recognition Under Varying Pose", MIT A.I. Lab, Memo No. 1461, pp. 1-13, Dec. 1993; Beymer, D.J., "Face Recognition Under Varying Pose", MIT A.I. Lab Research Report, 1994, pp. 756-761; Buhmann, J., et al, "Distortion Invariant Object Recognition By Matching Hierarchically Labeled Graphs", Proceedings of the IJCNN International Conference on Neural Networks, Washington, D.C., Jun. 1989, pp. 155-159; DeCarlo, D., et al, "The Integration of Optical Flow and Deformable Models with Applications to Human Face Shape and Motion Estimation", pp. 1-15; in Proc. CVPR '96, pp. 231-238, Sep. 1996; Dhond, U., "Structure from Stereo: a Review", IEEE Transactions on Systems, Man, and Cybernetics, 19(6), pp. 1489-1510, 1989; and Yang, Tzong Jer, "Face Analysis and Synthesis", Jun. 1, 1999, retrieved from the Internet, http://www.cmlab.csie.ntu.edu.tw/, on Oct. 25, 2002, 2 pp.

In a further aspect the present invention provides a computer-based system adapted to identify a potential partner for a user. In one preferred form of the invention the system includes six major components, depicted in FIG 1. The first component of an automated biometric identification/verification system is a data acquisition component that acquires the biometric data in digital format by using a sensor. For face images the sensor is typically a camera. The second and third components of the system are optional. They are the data compression and decompression mechanisms, which are designed to meet the data transmission and storage requirements of the system. The fourth component is the feature extraction algorithm. The feature extraction algorithm may produce a feature vector, in which the components are numerical characterizations of the underlying biometrics. The feature vectors are designed to characterize the underlying biometrics of the user for comparison with potential partners on the database. In general, the larger the size of a feature vector (without much redundancy), the higher its discrimination power.
The discrimination power is the difference between a pair of feature vectors representing two different individuals. The fifth component of the system is the "matcher", which compares feature vectors obtained from the feature extraction algorithm to produce a similarity score. This score indicates the degree of similarity between a pair of biometric data under consideration. The sixth component of the system is a decision-maker.

One class of biometric data that may find use in the present invention is that of facial asymmetry. A person may be attracted to a person having a similar level of asymmetry in their face. It has been demonstrated that the asymmetry of specific facial regions captures individual differences that are robust to variation in facial expression. It has been further shown that facial asymmetry provides discriminating power orthogonal to conventional face identification methods. The degree of asymmetry can be quantitated by consideration of two-dimensional or three-dimensional measurements of the face and head.

Three-dimensional (3D) face recognition has advantages over 2D face recognition, since it compares the 3D shape of the face, which is invariant under different lighting conditions. As long as the illumination of the face is in a range which allows 3D reconstruction of a sufficiently large portion of the face, a detailed analysis of the face is possible.
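Purely by way of illustration, one simple way of quantitating facial asymmetry from two-dimensional landmark positions is to reflect each right-side landmark about the facial midline and measure how far the reflected point falls from its left-side counterpart. The sketch below is an illustrative assumption and is not the specific asymmetry measure referred to above.

```python
from math import dist

# Hypothetical left/right landmark pairs (x, y) and a midline position.
pairs = [
    (("right_eye", (118.0, 95.0)), ("left_eye", (181.0, 97.0))),
    (("right_mouth", (127.0, 175.0)), ("left_mouth", (174.0, 178.0))),
]
midline_x = 150.0  # x position of the facial midline (e.g. nose bridge/chin)


def asymmetry_score(pairs, midline_x):
    """Average displacement between each right-side landmark reflected
    about the midline and its matching left-side landmark. 0.0 means a
    perfectly symmetric landmark set; larger values mean more asymmetry.
    Two users with similar scores could then be matched."""
    errors = []
    for (_, (rx, ry)), (_, (lx, ly)) in pairs:
        reflected = (2 * midline_x - rx, ry)  # mirror the right landmark
        errors.append(dist(reflected, (lx, ly)))
    return sum(errors) / len(errors)


print(asymmetry_score(pairs, midline_x))
```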
A number of commercially available software systems will have use in the present invention. Accordingly, the present invention also provides the use of face recognition software for identifying a potential partner for a user.

For example, SecureIDent Products by BioDentity Systems Corporation (Ottawa, Canada) encompass everything from hardware to middleware to specialized application software. The PreProcessor offers comprehensive facial capture and image recognition support regardless of lighting and other environmental factors. The software has the capability of separating the facial image from the background for easy processing of information. The SecureIDent PreProcessor offers onboard processing, and can provide front-end enhancement to any other facial-recognition system or application. Other offerings in the family include the SecureIDent Face Recognition Controller, which is the primary interface between the PreProcessors and the rest of the application; the SecureIDent Photo Enrolment System, which automatically optimizes all images to create a high-quality database; and the SecureIDent Search Engines, which compare biographical or face biometric details, or a combination of the two, against very large databases.

The Single Sign-On technology by BioID America Inc (NC) offers the ability to analyze face, voice, and lip movement simultaneously, and requires only a standard USB camera and microphone for implementation. The BioID dataset features 1,521 gray-level images showing a frontal view of the face of one out of 23 different test persons. Images are stored in single files using the portable graymap (pgm) data format.

The FaceVACS-Logon system (Cognitec, OR) offers automatic facial identification. It may be integrated with conventional access control or time and attendance systems, and a combination with card terminals is possible for high security areas. Users' faces are captured by video camera, and image processing algorithms extract a feature set from the digitized image, which the software compares to the user's reference set stored on the computer. Features of the system include flexible operating modes, which enable it to be used as a stand-alone facial-recognition solution. The package includes standard webcam support, and support of Windows 98/2000/NT/XP/Me.

The FaceTrac facial-recognition system (Graphco Technologies, PA) performs image capture, comparison against images in a database, and matching of images. This open system can incorporate facial-recognition engine components from vendors such as Viisage, Visionics, and AcSys Biometrics. FaceTrac can match the facial geometry of an individual against portraits in a database.

The Face ID facial-recognition program (ImageWare Systems, CA) uses biometrics in combination with parallel processing to match faces through a mathematical formula that uses the eyes as a reference point. The formula generates a data record representing the face, which is used to compare against a digital database of enrolled images. Images from scanned photographs or video may be queried using the program, and millions of images may be searched to identify matches.
Using more than 200 facial descriptors generated from an image analysis algorithm, the ID-2000 software (Imagis Technologies Inc., BC, Canada) captures, compares, and quickly and efficiently displays an individual's face against a database. It enables an individual to be matched in seconds using only an image or photograph as the primary search criterion.

The IRID face-recognition technology (Infrared Identification Inc., VA) can perform infrared facial recognition as well as continuous condition monitoring of individuals by using passive infrared imaging that is non-contact, non-invasive, and works under any lighting conditions or in total darkness.

The Tridentity 3 Dimensional Face system (Recognition Neurodynamics Limited, Cambridge, UK) offers a three-dimensional approach to facial recognition, can analyze subtle features of the face such as bone structure, and enables images to be rotated to offer a better view of the subject. The technology uses patterned light to create a three-dimensional image of the face, and once an image is captured, a 3D representation of the subject's face can be built from a single frame of video footage. The solution can operate on single or multiple scans, and each scan can be processed in under one second on a 400 MHz Pentium system. The search database size is limited by disk space and processor speed only, and the system may be expanded to scale up to multiple cameras and workstations. The system is based on an open architecture, uses COTS components, and may be easily integrated as a component of a larger system.

The FaceOn technology (Symtron Technology, CA) uses neural network and artificial intelligence techniques to capture faceprints and determine or verify identity. The FaceOn Logon AdminTool enables complete faceprint enrollment, as well as the addition and deletion of faceprints from the database. The enhanced visual Access Log enables administrators to keep track of all users' access settings. The FaceOn Surveillance system may be integrated with various types of CCTV systems, and offers multiple, real-time image enrollment, retrieval, and recognition (using the Invariant Feature Analysis technology).
The Viisage face-recognition (FR) technology (Viisage Technology, MA) is based on an algorithm developed at MIT, and enables software to translate the characteristics of a face into a unique set of numbers called an eigenface. This is used by identification and verification systems for facial comparisons made in real time, and may be used with databases containing millions of faces. The technology enables software to instantly calculate an individual's eigenface from live video or a still digital image and then search a database to find similar or matching images. The family of products includes the FaceFINDER, FaceEXPLORER, FacePASS, FacePIN, and FaceTOOLS applications. They offer the ability to search large databases of images, and a software development kit for developing additional applications.
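As a purely illustrative sketch, the general eigenface technique referred to above (principal component analysis over a gallery of aligned gray-level face images, as in the Turk et al. reference cited earlier) may be outlined as follows. The gallery, image size and number of components are assumptions made for the example only, and real systems operate on aligned photographs rather than random data.

```python
import numpy as np


def eigenfaces(face_matrix, k=8):
    """Compute the first k eigenfaces from a gallery of aligned gray-level
    face images. `face_matrix` has one flattened image per row. A new face
    is described by its projection onto the eigenfaces (its set of numbers),
    and two faces are compared by the distance between their projections."""
    mean_face = face_matrix.mean(axis=0)
    centred = face_matrix - mean_face
    # SVD of the centred data gives the principal components (eigenfaces).
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    return mean_face, vt[:k]


def project(face, mean_face, components):
    return components @ (face - mean_face)


# Toy gallery: 20 random 32x32 "faces" (placeholders for aligned photos).
rng = np.random.default_rng(0)
gallery = rng.random((20, 32 * 32))
mean_face, basis = eigenfaces(gallery, k=8)

probe = gallery[0] + 0.01 * rng.random(32 * 32)   # near-duplicate of face 0
codes = np.array([project(f, mean_face, basis) for f in gallery])
distances = np.linalg.norm(codes - project(probe, mean_face, basis), axis=1)
print(distances.argsort()[:3])  # indices of the most similar gallery faces
```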
The FaceIt facial-recognition software engine from Visionics Corporation (NJ) enables computers to rapidly and accurately detect and recognize faces, for everything from ID solutions to banking and e-commerce applications. The software can detect one or multiple faces, and can also provide one-to-one or one-to-many matching. It also evaluates the quality of the image and prompts for an improved image if needed, and can crop faces from background imagery. Other features include the ability to generate a faceprint, a digital code/template unique to an individual, as well as the ability to track faces over time. FaceIt can also compress facial images to 84 bytes for easy storage and transfer. It uses the local feature analysis technique to represent facial images in terms of local, statistically derived building blocks. The software is resistant to changes in skin tone, lighting, facial expression, eyeglasses, and hair, and allows up to 35 degrees of change in pose in all directions.

The UnMask system (Visionsphere Technologies Inc., ON, Canada) offers face detection and location of key features, extraction of facial descriptors, and comparison of the extracted information against a database. It locates the face and the eyes automatically through proprietary search algorithms, and then normalizes and crops the image to offer invariance to variations in head rotation, lighting, hairstyle, and facial expression. The system then uses VisionSphere's Holistic Feature Code (HFC) to provide discrimination for comparing faces at high confidence rates and fast processing speeds. Faces are then compared using a proprietary distance function, which stresses significant differences between faces.

VisionSphere also provides the UnMask Plus software artificial intelligence (AI) system, which provides identification and removal of duplicate or multiple images from large databases. The offering also includes an automatic computer logon authentication system, which provides hardware and software components for verifying the identity of a network or workstation user. The FaceCam biometric user verification terminal offers integration with applications for physical access control, time and attendance, and registration systems.

The ZN-Face physical access control system (ZN Security, a division of ZN Vision Technologies, Bochum, Germany) enables automation of identity checks for access to secure areas. The system uses a neural face-recognition routine to verify individuals, and also offers a refined optical filter system and a LiveCheck analysis procedure to prevent attempts at spoofing through photos or masks. The system may be administered via Windows NT/2000, and supports the ODBC database interface standard to enable acceptance of master data from external databases, and also features the ZN-SmartEye technology. This enables evaluation of pictures from a video camera, and reports the similarity of a face compared with others on the database. The system also works with ZN Phantomas, a computerized image database that can automatically compare faces.

In another aspect, the present invention provides a method for identifying a potential partner for a user over a computer network, the method including the use of a system described herein. The systems and methods described herein may be implemented over any type of computer network. In a highly preferred form of the invention the systems and processes may be implemented over the Internet. However, any other network such as a WAN or LAN could be utilised. It is contemplated that wired or wireless networking protocols could be used.

It is further contemplated that the network could be implemented by a user carrying their own biometric information on a portable data storage device, and connecting the device to a computer holding the database of individuals. Upon connection of the device to the computer, the user's biometric data is compared with the biometric information on the database. The portable data storage device may be a flash disk, micro hard drive, compact disc, magnetic medium such as a floppy disk, punched card, or EPROM device.

It is further contemplated that the user's image may be forwarded to the server for biometric analysis by means of mobile telephone equipment. Many consumer telephones have the ability to take a digital photograph and transmit the photograph to a computer via a cellular network. It is envisaged that an image of the potential partner identified by the computer could be returned to the user's mobile telephone, along with the potential partner's contact details.

The methods and systems may incorporate other known methods useful for identifying a potential partner, such as standard questionnaires and zodiac sign compatibility. Also incorporated may be other screening criteria such as hair colour, skin colour, ethnicity, height, weight, and the like. These further criteria could be selected for or against either before or after the computer selects a potential partner for the user.

The user and the potential partner may be of a different sex or the same sex. The invention may even be useful for identifying a potential animal companion, based on similarities between the features of the potential owner and the potential pet.

The present invention will now be more fully described by reference to the following example. It is emphasised that this example is not intended to be restrictive on the general disclosure supra.
EXAMPLE 1: HARDWARE AND SOFTWARE CONFIGURATION

Face analysis software

The BioID SDK V3.1 is used to compare the level of similarity between user faces on the match making database. BioID is available from HumanScan AG, Grundstrasse 1, CH-6060 Sarnen, Switzerland.

Below are the installation procedures for the HumanScan BioID SDK.

Assumption: Windows 2000 (or above), SQL Server 2000 (or above) and the .NET environment are already installed, and we have a Remote Desktop Connection to the server with Administrator privileges.

Create Database: Create a database in SQL Server that is to be used for this project.

Upload Necessary Files: All the necessary files related to the HumanScan BioID software need to be uploaded. They are: BioIDAdmin.zip, BioIDServer.zip, BioIDClient.zip, BioIDSDK31.zip.

Install Admin: Unzip BioIDAdmin.zip and double click on setup.exe. Follow the instructions on the screen and complete the installation.

Install Server: Unzip BioIDServer.zip and double click on setup.exe. Choose SQL Server and the "update existing database" option, then choose the database created in the Create Database step. Follow the instructions on screen and complete the installation.

Setup Client: After installing the server part, go to Start Menu -> Program -> HumanScan -> BioID Management. Click on "clients" on the left hand side and click on the "+" sign to add a client on the right hand side. It will show the computer name in the next window. Then set the security setting to 1. Keep pressing "next" and then "finish".

Install Client: Unzip BioIDClient.zip and double click on setup.exe. Follow the instructions on screen and complete the installation. Later on it will ask about the client; select the client created in the Setup Client step and move forward.

Install SDK: Unzip BioIDSDK31.zip and double click on setup.exe. Follow the instructions on screen and complete the installation.
Proposed Technology and Hardware

Technology

The following hardware/software configuration is used.

Server Side Scripting Language: ASP.NET
Database: SQL Server 2000 or above
Operating System: Windows 2000 or above
Web Server: IIS 5.0 or above
Browser Compatibility: Microsoft Internet Explorer 5.x, Netscape Communicator 6.x, Mozilla Firefox 1.0 and above
Scripting Language/User Interface: HTML
Client Side Scripting: JavaScript or VBScript

Hardware (Server Configuration)

Dual or Quad Xeon 2.0 GHz
2 to 4 GB DDR RAM
SuperMicro S811i Chassis
SuperMicro P4Sci Main Board
4 x 73 GB SCSI
RAID 5
500 to 1000 GB Bandwidth/month
Dell Hardware
Plesk, Ensim or Cpanel - Control Panel
At least 1 dedicated IP address
Firewall Protection
Remote Desktop Connection
Remote reboot option
MailEnable Pro Mail Server
Full Backup options

Description of Communication Paths

The potential partner match application will consist of several components, which will perform different tasks. Referring now to FIG 2, the components and communication paths of special interest are identified and described as follows.

(A) Web Server

The Web Server (Internet Information Server) will run as a service on Windows NT4, Windows 2000 or Windows XP. Internet Information Services (IIS) is a powerful Web server that provides a highly reliable, manageable, and scalable Web application infrastructure for all versions of Windows Server. IIS assists in increasing Web site and application availability while lowering system administration costs.

The Web server service uses an account to log on to the SQL server, depending on the SQL server that is run. The database server is on the same PC as the Web Server Service; therefore a local account can be used.

(B) BioID Server Service

The BioID Server Service runs as a service on Windows XP and runs independently with an existing or a new account. The BioID Server Service can use an existing account to log on to the SQL Server.

If a new account is required, several factors need to be considered in deciding where the account will be created (domain global or locally on the PC), as set out below:

- The BioID Server Service will not be installed on a domain controller:
  - the database server is on the same PC as the BioID Server Service: local account
  - the database server is on a different PC from the BioID Server Service:
    - the PC is a member of a domain: domain global account
    - the PC is standalone / not a member of a domain: local account
- The BioID Server Service will be installed on a domain controller: domain global account

To allow the creation of the account, it is necessary to run the BioID server setup as an administrator. If it is necessary to create a domain global account during the installation, administrator access is required to install the BioID server.

If using an existing account for the BioID Server Service, it is important that this account has the right to "log on as a service"; otherwise the BioID Server Service will not be able to start. To set this right, use the following:

- NT4: User Manager/Policies/User Rights
- W2K, XP: MMC Snap-In for Security Settings, Local Policies/User Rights Assignment/Log on as a service

(C) BioID and Web Server <-> SQL Server / BioID Database

The BioID Server stores all data about the clients, users, etc. in a SQL database. To connect to the SQL Server the BioID Server Service requires login account information.
Depending on the type or version of SQL server used, the required data differs as follows:

- MS SQL Server 7.0 or higher with integrated or mixed security: The BioID Server Service needs the names of the database server and of the database on that database server. It will log on to the database with the BioID Server Service's existing account. To work with the database this account needs to have the right to log on to that database and to read, modify, create and delete datasets in any of the BioID database tables.

- MS SQL Server 7.0 or higher with SQL or mixed security: The BioID Server Service needs the names of the database server and of the database on that database server, and a SQL server user account name and password. It will log on to the database with the existing account. To work with the database this account needs to have the right to log on to that database and to read, modify, create and delete datasets in any of the BioID database tables.

On the BioID server side the communication runs via OLE DB.

(D) BioID Server Setup <-> SQL Server

The BioID Server Setup can create the basic BioID database when installing the database on an MS SQL server. The BioID Server Setup needs an account which is a member of the SQL server's System Administrators role (sysadmin) to perform all necessary actions:

- Create the new database
- Create all required tables in the database
- Create the new login for the BioID Server Service
- Give all rights for the new BioID database to this new login

It depends on the specific MS SQL server installation which account can be used to perform these actions:

- MS SQL Server 7.0 or higher with integrated or mixed security: The BioID Server Setup can log on with the account under which the BioID Server Setup is currently running, i.e. the administrator's account. This account must be a member of the sysadmin Server Role on the SQL server.

- MS SQL Server 7.0 or higher with SQL or mixed security: The BioID Server Setup needs to know a SQL server user account name and password. It will log on to the SQL server with this account information. This account has to be a member of the sysadmin Server Role on the SQL server (like the standard "SA" account).

If it is necessary to create the database de novo, there is no need to allow the BioID Server Setup access to the SQL server: the BioID Server Setup will install a SQL script which can be used to create all necessary tables in the database.
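Purely by way of illustration, the two login modes described above correspond to two different connection strings on the application side. The sketch below uses the pyodbc library in Python; the driver, server, database and account names are placeholders, and this does not reproduce the BioID server's own OLE DB mechanism.

```python
import pyodbc

SERVER = "DBSERVER"        # placeholder database server name
DATABASE = "bioid_db"      # placeholder BioID database name

# Integrated (Windows) security: the service logs on with its own account,
# so no SQL user name or password is embedded in the connection string.
conn_integrated = pyodbc.connect(
    f"DRIVER={{SQL Server}};SERVER={SERVER};DATABASE={DATABASE};"
    "Trusted_Connection=yes;"
)

# SQL (or mixed) security: an explicit SQL Server login is supplied; this
# account needs rights to read, modify, create and delete rows in the
# BioID database tables.
conn_sql_login = pyodbc.connect(
    f"DRIVER={{SQL Server}};SERVER={SERVER};DATABASE={DATABASE};"
    "UID=bioid_service;PWD=example_password;"
)
```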
(E) Client Browser <-> BioID and Web Server

The communication between the Client Browser and the Web Server will be done using HTTP. The Web Server will in turn interact with the BioID Server.

HTTP stands for Hypertext Transfer Protocol. This is the network protocol used to deliver virtually all files and other data (collectively called resources) on the World Wide Web, such as HTML files, image files and query results. Typically, HTTP transfer takes place through TCP/IP sockets. A browser is an HTTP client because it sends requests to an HTTP server (Web server), which then sends responses back to the client. The standard (and default) port for HTTP servers to listen on is port 80. HTTP is used to transmit resources, not just files. A resource is a package of information that can be identified by a URL. The most common type of resource is a file, but a resource may also be a dynamically generated query result, the output of a CGI script, or a document that is available in several languages.

System Process

FIG 3 is a schematic representation of the system process (interaction with the BioID SDK), described as follows:

1. On the front end, the client application will run in a Web Browser. It will send an HTTP request to the web server through an ASP.NET script.

2. The Web Server, i.e. Internet Information Server (IIS), will receive the HTTP request and initiate the respective process.

3. The processes can include, for example, fetching the user's profile from the database. The database resides on the back end and will store all the data related to users and their photos.

4. There can be the following types of HTTP Requests and Responses:

a. The user can send an HTTP Request for registration. In this case the Web Server will get the user's data and pass it on to the Database. The user's data is saved on the database and an HTTP Response will be sent to the Web Browser.

b. The user can send an HTTP Request for login. Here the Web Server will authenticate the user's username and password against the Database and send the HTTP Response back to the Web Browser.

c. The user can send an HTTP Request to upload a photo. The Web Server will get the photo data from the client browser, save the photo on the server and update the Database. The Web Server will also call the BioID server and components to enrol and create a template for the photo. The Web Server will then send the HTTP Response confirming the photo upload back to the Web Browser.

d. The user can send an HTTP Request for matching the photo. In this case the Web Server will fetch all other users' photos from the Database. The Web Server will then call the BioID server and components to match the photo, using processes such as classification, verify and identify, which return the match percentage to the Web Server. The Web Server will then send an HTTP Response containing the match percentage back to the Web Browser, but only if the match percentage is greater than 60%.
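As a purely illustrative sketch, the four request types described above may be summarised as follows. The BioID calls and the in-memory storage shown are hypothetical stand-ins, since the BioID API itself is not reproduced here, and the deployed system described in this example uses ASP.NET and IIS rather than Python.

```python
# Sketch of the four request types described above. The BioID calls and the
# in-memory "database" are hypothetical stand-ins for the real components.

MATCH_THRESHOLD = 0.6   # only matches above 60% are reported

users = {}      # username -> account details and uploaded photo
templates = {}  # username -> biometric template created at enrolment


def bioid_enroll(photo_bytes):
    """Placeholder for BioID enrolment: locate the face, build a template."""
    return {"template_size": len(photo_bytes)}


def bioid_identify(template, other_templates):
    """Placeholder for BioID identification: similarity score per candidate."""
    return {name: 0.75 for name in other_templates}  # dummy scores


def register(username, password, email, mobile):
    users[username] = {"password": password, "email": email, "mobile": mobile}
    return "confirmation email sent"                  # contains activation link


def login(username, password):
    return users.get(username, {}).get("password") == password


def upload_photo(username, photo_bytes):
    users[username]["photo"] = photo_bytes            # save photo on the server
    templates[username] = bioid_enroll(photo_bytes)   # enrol and create template
    return "photo upload confirmed"


def find_matches(username):
    others = {name: t for name, t in templates.items() if name != username}
    scores = bioid_identify(templates[username], others)
    # Return match percentages, but only where the score exceeds the threshold.
    return {name: round(s * 100) for name, s in scores.items()
            if s > MATCH_THRESHOLD}
```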
FIG 4 is a flow chart representing the system overview:

1. The user can register on the site by entering their personal details, email and mobile number. The user needs to choose their mobile service provider to allow receipt of mobile messages from the site and to send their pictures. In order to use the match making service, a username and password is required.

2. After registration the user will receive a confirmation email to activate their account. This email will contain an activation link for that purpose.

3. Once the account is activated, the user can log in to the website by using their username and password. On successful login, the user's session will be created to validate the user on each requested page.

4. Once the account is activated, the user is required to create their profile. The user can enter various information such as an introduction title, description and personal characteristics (smoker/drinker etc.), as well as details of the person he/she is interested in.

5. The user can upload his/her picture in order to find a match. The picture can be uploaded either through the website or sent via email using an Internet-enabled mobile phone. In order to upload a picture via mobile phone, it is required that the picture meets the defined specification and is sent to the provided email address, from where the photos will be fetched.

6. The user's image will then be passed to the BioID SDK and enrolled. After enrolment a template is created in BioID for the user's image.

7. The user can then search for members and choose members to perform a match with, or otherwise update their match preferences, in order for the system to identify a match.

8. Matching is performed using the BioID SDK functions and processes: classification, verify and identify.

9. The user will receive any result of a >60% match either on their web page or on their mobile device through SMS.

Database Entity Relationship

FIGS 5 and 6 show two detailed database structures in the form of entity relationship diagrams.

Registration Process

The user registers online in order to use the match making services. The flow chart shown at FIG 7 represents this process.

Create profile process

Once a user is registered on the website, the user creates their profile. This profile will be visible to other users for match making purposes. A flow chart of this process is shown at FIG 8.

Photo Upload and Matching Process Overview

The overview of the photo uploading and matching process is shown at FIG 9.

The user registers on the site by entering their personal details, email and mobile telephone number. The user chooses their mobile service provider to allow receipt of mobile messages from the site and also to upload their photograph. In order to use the match making services, a username and password is required. After registration the user will receive a confirmation email to activate their account. This email will contain an activation link for that purpose.

Once the account is activated, the user can log in to the website by using their username and password. On successful login, the user's session will be created to validate the user on each requested page.

Once the account is activated, the user is required to create their profile. The user can include various details such as an introduction title, description and personal characteristics (e.g. smoker or drinker etc.). The user may also include preferred personal characteristics of their desired partner.

The user uploads his/her digital photograph in order to find a match. The photograph may be uploaded via the website or via email using an Internet-enabled mobile phone. In order to upload a picture via mobile phone, it is required that the picture meets the defined specification and is sent to the provided email address, from where the photos will be retrieved. The photograph must be of an acceptable format (jpg, gif) as mentioned on the website and must not exceed the maximum allowed size (this is Admin controlled). When the user uploads the photograph, it is saved into a separate folder on the server.
Enrollment is the process of collecting biometric samples from an applicant and the subsequent preparation and storage of biometric reference templates representing that person's identity. The user's picture is passed to the BioID SDK and is enrolled using the following steps.

Locating Faces

The BioID SDK uses a two-stage model-based algorithm to detect the location of a human face in an arbitrary image: a binary face model is matched in a binarized version of the current scene. The comparison is performed with the modified Hausdorff distance, which determines the optimal location, scaling and rotation of the model.

The estimated face position is refined by matching a more detailed eye region model, again using the Hausdorff distance for comparison. The exact eye positions are determined by checking the output of an artificial neural network (ANN) specialized in detecting eye centers.

The eye positions allow for all further processing: using anthropomorphic knowledge, a normalized portion of the face and of the mouth region can be extracted.

Face Features

The face is transformed to a uniform size. This procedure ensures that the appropriate biometric features of the face are analyzed, and not, for example, the size of the head, hair style, a tie, or a piece of jewelry. Some further preprocessing steps reduce the impact of lighting conditions and color variance.

Subsequently, feature extraction methods are applied to the normalized image, resulting in a face feature vector, which is then used by the classifier.

Create Template

The template is made up of a separate part for each classification trait and can be understood as a compact representation of the collected feature data, where useless or redundant information is discarded. Independent of the number of training sequences, each part consists of a fixed amount of data representing each person's characteristics. A sketch of this enrollment pipeline (locate, normalize, extract features, store template) follows.
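By way of non-limiting illustration, the sketch below shows the shape of the enrollment pipeline just described: locate the face, normalize it to a uniform size, extract a face feature vector and store it as part of the user's template. The helper functions are deliberate stand-ins; the actual face location (Hausdorff-based model matching and ANN eye detection), normalization and feature extraction are performed inside the BioID SDK and are not reproduced here.

```python
# Sketch of the enrollment pipeline described above (hypothetical helpers;
# the actual face location, normalization and feature extraction are done
# by the BioID SDK, not by this code).

templates = {}  # user_id -> list of enrolled feature vectors (the "template")


def locate_face(image):
    """Stand-in for the two-stage model-based face and eye localisation."""
    raise NotImplementedError("performed by the BioID SDK in the described system")


def normalize_face(image, eye_positions):
    """Stand-in for cropping and scaling the face to a uniform size."""
    raise NotImplementedError("performed by the BioID SDK in the described system")


def extract_features(normalized_face):
    """Stand-in for feature extraction that yields a face feature vector."""
    raise NotImplementedError("performed by the BioID SDK in the described system")


def enroll(user_id, image):
    """Locate, normalize and extract features, then store them as a template."""
    eyes = locate_face(image)
    face = normalize_face(image, eyes)
    feature_vector = extract_features(face)
    templates.setdefault(user_id, []).append(feature_vector)
    return feature_vector
```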
The user's picture is now compared with other, already stored pictures. This is the process of comparing biometric data with a previously stored reference template or templates.

Classification

Each person enrolled in BioID is assigned a unique class, and the classifier compares a new recording (i.e. the feature vectors extracted from that recording) with all (formerly trained and stored) prototypes of each class. The prototype with the highest similarity determines the class ("winner takes all" principle). The identify function is used to match the pictures of existing users in order to find a match (where the match score is greater than 0.6, i.e. 60%).

The output of a classifier is therefore the Class ID (the person with the best matching features) and the similarity value. Similarity values are typically in the range between 0.0 and 1.0; a match of 1.0 would mean a perfect match. Note that values in biometrics are never really 100%: the output of a classifier will almost never be 1.0, and values of 0.8 to 0.9 are typical.

BioID (like almost all classification systems) uses thresholds to qualify a classification. Only if the similarity value exceeds a certain threshold will the user be recognized. This prevents poor matches from being falsely identified. These scores are converted into percentages, and a match greater than 60%, for example, which corresponds to a score of 0.6, is considered a good match.

Since BioID uses individual classification mechanisms for each biometric channel, each channel has its own threshold. Thresholds can be altered in the BioID User Interface within reasonable limits (the system does not allow extreme threshold values such as 0.0 or 1.0). The sketch below illustrates this classification and thresholding step.
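By way of non-limiting illustration, the sketch below shows a nearest-prototype ("winner takes all") decision with a threshold, together with the score-to-percentage conversion behind the 60% cut-off. The cosine similarity measure and the data layout are assumptions made only for this sketch; they are not BioID's actual per-channel classifiers.

```python
import math

THRESHOLD = 0.6  # per-channel threshold; a score of 0.6 corresponds to 60%


def cosine_similarity(a, b):
    """Assumed similarity measure; BioID's actual comparison is not reproduced here."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


def classify(feature_vector, prototypes):
    """Compare against all stored prototypes; the best one wins ("winner takes all").

    `prototypes` maps a class ID (enrolled person) to a list of stored feature
    vectors. Returns (class_id, similarity), or (None, best) if the best
    similarity does not exceed the threshold.
    """
    best_class, best_score = None, 0.0
    for class_id, vectors in prototypes.items():
        for stored in vectors:
            score = cosine_similarity(feature_vector, stored)
            if score > best_score:
                best_class, best_score = class_id, score
    if best_score <= THRESHOLD:
        return None, best_score            # poor match, not recognized
    return best_class, best_score


def as_percentage(score):
    """Convert a 0.0-1.0 similarity into the match percentage shown to users."""
    return round(score * 100, 1)
```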
Upload picture through website process

The user uploads their photograph to find a match by logging into their account, as shown schematically in FIG 10.

The user can upload his/her picture in order to find a match. In order to upload the picture through the website, the user must log in to their respective account by entering their username and password. The user's image will then be passed to the BioID SDK and enrolled. After enrollment a template is created in BioID for the user's image and a confirmation email is sent to the user (see FIG 11).

Uploading picture through mobile phone process

The user can upload their photograph to find a match by using their mobile phone, as shown schematically in FIG 12.

A photograph is uploaded either through the website or sent via email using an Internet enabled mobile phone. In order to upload the photograph via mobile phone, the picture must meet a defined specification and be sent to the provided email address, from where the photos will be retrieved. The user's picture will be enrolled using the BioID SDK enroll function and a template will be created for matching purposes.

A photograph must be of a proper format (jpg, gif) as mentioned on the website and must not exceed the maximum allowed size (this is Admin controlled). When a user uploads the photograph, it is saved into a separate folder on the server. The photograph name will then be added to the database; if the same user already has pictures with that name, 1, 2, 3... will be appended to the picture name. Once the picture is uploaded the user is sent a confirmation message via SMS (see FIG 13). A sketch of how photographs might be collected from the provided email address is given below.
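The specification states only that mobile photographs are emailed to a provided address from where they are retrieved; it does not prescribe how. As one possible, non-limiting reading, the sketch below polls an IMAP mailbox for image attachments and hands each one to the same save routine used for website uploads. The mailbox host, credentials and the save_photo helper are assumptions made only for this sketch.

```python
import imaplib
from email import message_from_bytes
from email.utils import parseaddr

# Assumed mailbox details; the specification only says photos are sent to a
# "provided email address" and later fetched.
IMAP_HOST = "imap.example.com"
MAILBOX_USER = "photos@example.com"
MAILBOX_PASSWORD = "secret"


def fetch_emailed_photos(save_photo):
    """Poll the mailbox and pass each image attachment to `save_photo`.

    `save_photo(sender, filename, data)` is assumed to perform the same
    validation and unique-naming as the website upload path.
    """
    conn = imaplib.IMAP4_SSL(IMAP_HOST)
    conn.login(MAILBOX_USER, MAILBOX_PASSWORD)
    conn.select("INBOX")

    _, data = conn.search(None, "UNSEEN")          # only messages not yet processed
    for num in data[0].split():
        _, msg_data = conn.fetch(num, "(RFC822)")
        message = message_from_bytes(msg_data[0][1])
        sender = parseaddr(message.get("From", ""))[1]
        for part in message.walk():
            if part.get_content_maintype() == "image":
                save_photo(sender, part.get_filename(), part.get_payload(decode=True))
    conn.logout()
```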
Matching Process

The matching process is shown generally in FIGS 14, 15 and 16:

1. The user can start the matching process through the website after uploading their pictures. The match is performed using BioID functions and processes, namely classification, verify and identify.

2. Each person enrolled in BioID is assigned a unique class, and the classifier compares a new picture with all (formerly trained and stored) prototypes of each class.

3. The output of a classifier is therefore the Class ID (the person with the best matching features) and the similarity value. Similarity values are typically in the range between 0.0 and 1.0; a match of 1.0 would mean a perfect match. Note that values in biometrics are never really 100%. The score is converted to a percentage, and a match of greater than 60% is considered a potential match.

4. BioID components are called using the API; suppose, for example, that the BioID component object is named BioIDAPI.

5. The identification functions, BioIDCtrlIdentificationReady and identifyClick, are called to identify similarity in the photos.

6. The verify functions, BioIDCtrlVerificationReady and verify_Click, are then called to verify the matches.

7. The user will receive any result of a match greater than 60%, in descending order of percentage, either on their web page or on their mobile device through SMS.

Intermediate interface to interact with BioID SDK

An intermediate script to transfer data between the BioID SDK and the web browser is required. The BioID SDK includes a set of components and a database, so a script (ASP or VB script) is required to call the BioID components, and that script is then executed through the browser (see FIG 17). An illustrative sketch of such a layer is given at the end of this description.

Finally, it is to be understood that various other modifications and/or alterations may be made without departing from the spirit of the present invention as outlined herein.
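By way of non-limiting illustration only, the following sketch shows the shape of such an intermediate layer: it receives data from the web front end, calls the BioID components, and returns only matches above 60% in descending order of percentage. Python is used solely to keep all sketches in one language; the described system uses an ASP or VB script, and the bioid_api object with its enroll, identify and verify methods is a hypothetical stand-in for the BioID component calls (BioIDAPI, BioIDCtrlIdentificationReady, identifyClick, BioIDCtrlVerificationReady and verify_Click) referred to above.

```python
# Illustrative intermediate layer between the web front end and the BioID
# components. The real system uses an ASP or VB script; the bioid_api object
# and its methods below are hypothetical stand-ins for the BioID component calls.

MATCH_THRESHOLD_PCT = 60.0


class MatchService:
    def __init__(self, bioid_api, database):
        self.bioid = bioid_api        # stand-in for the BioIDAPI component object
        self.db = database

    def enroll_photo(self, user_id, photo_path):
        """Enroll a newly uploaded photo so a template exists for matching."""
        self.bioid.enroll(user_id, photo_path)

    def find_matches(self, user_id):
        """Identify then verify candidates; return >60% matches, best first."""
        results = []
        for other_id, photo_path in self.db.other_users_photos(user_id):
            score = self.bioid.identify(user_id, photo_path)       # 0.0 .. 1.0
            pct = score * 100.0
            if pct > MATCH_THRESHOLD_PCT and self.bioid.verify(user_id, other_id):
                results.append({"user": other_id, "match_pct": round(pct, 1)})
        return sorted(results, key=lambda r: r["match_pct"], reverse=True)
```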

Claims (22)

1. A method for identifying a potential partner for a user, the method including the steps of:
providing biometric data characterising a physical feature of the user and/or a parent of the user,
providing a database having biometric data characterising a physical feature of a plurality of individuals,
comparing the biometric data of the user with at least a proportion of the biometric data on the database,
identifying at least one individual characterised by biometric data that is at least similar to that of the user and/or a parent of the user.
2. A method according to claim 1 wherein the level of similarity is not less than a minimum value, such that the user has an attraction or affinity for the potential partner.
3. A method according to claim 2 wherein the attraction or affinity is greater than that where there is no comparison of biometric data.
4. A method according to claim 2 or claim 3 wherein the level of similarity is not greater than a maximum value, such that the user has little or no attraction or affinity for the potential partner.
5. A method according to any one of claims 2 to 4 wherein the attraction or affinity is physical.
6. A method according to any one of claims 1 to 5 implemented at least in part over a computer network.
7. A method according to claim 6 wherein the computer network is the Internet.
8. A method according to claim 6 or claim 7 wherein the method is at least partially implemented over a mobile telephone network.
9. A method according to any one of claims 6 to 8 wherein the potential partner(s) are returned to the user as a candidate list in descending order of similarity.
10. A method according to any one of claims 1 to 9 wherein the biometric data relates to the head and/or face of the user.
11. A method according to claim 10 wherein the biometric data relates to the position of anatomical landmarks on the head and/or face.
12. A method according to claim 11 wherein the anatomical landmark is selected from the group consisting of the right eye pupil, left eye pupil, right mouth corner, left mouth corner, outer end of right eye brow, inner end of right eye brow, inner end of left eye brow, outer end of left eye brow, right temple, outer corner of right eye, inner corner of right eye, inner corner of left eye, outer corner of left eye, left temple, tip of nose, right nostril, left nostril, centre point on outer edge of upper lip, centre point on outer edge of lower lip, and tip of chin.
13. A method according to any one of claims 10 to 12 wherein the biometric data relates to the shape of an anatomical structure of the face and/or head.
14. A method according to claim 13 wherein the anatomical structure is selected from the group consisting of the eye, eye socket, nose, nostril, ear, chin, jaw, cheek, forehead, head, mouth, lip, teeth, eyebrow, and eyelash.
15. A method according to any one of claims 1 to 14 wherein the step of comparing does not include a biometric characteristic of the user that the user wishes to exclude.
16. A method according to any one of claims 1 to 15 wherein the biometric data relates to colouring.
17. A method according to claim 16 wherein the colouring is of a feature selected from the group consisting of the skin, eyes, and hair.
18. A method according to any one of claims 1 to 17 wherein the biometric data is selected from the group consisting of an anatomical landmark, shape of an anatomical structure, and colouring.
19. Computer executable code capable of implementing a method according to any one of claims 1 to 18.
20. A computer system including a computer executable code according to claim 19.
21. A computer system according to claim 20 including a component selected from the group consisting of a data acquisition component, a data compression component, a data decompression component, a feature extraction component, a matcher component, and a decision maker component.
22. Use of face recognition software for identifying a potential partner for a user.
AU2005306571A 2004-11-16 2005-11-16 Computer-based method and system for identifying a potential partner Abandoned AU2005306571A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU2005306571A AU2005306571A1 (en) 2004-11-16 2005-11-16 Computer-based method and system for identifying a potential partner
AU2012201564A AU2012201564A1 (en) 2004-11-16 2012-03-16 Computer based method and system for identifying a potential partner

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
AU2004906566A AU2004906566A0 (en) 2004-11-16 Computer-based method and system for identifying a potential partner
AU2004906566 2004-11-16
AU2005306571A AU2005306571A1 (en) 2004-11-16 2005-11-16 Computer-based method and system for identifying a potential partner
PCT/AU2005/001733 WO2006053375A1 (en) 2004-11-16 2005-11-16 Computer-based method and system for identifying a potential partner

Related Child Applications (1)

Application Number Title Priority Date Filing Date
AU2012201564A Division AU2012201564A1 (en) 2004-11-16 2012-03-16 Computer based method and system for identifying a potential partner

Publications (1)

Publication Number Publication Date
AU2005306571A1 true AU2005306571A1 (en) 2006-05-26

Family

ID=38283763

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2005306571A Abandoned AU2005306571A1 (en) 2004-11-16 2005-11-16 Computer-based method and system for identifying a potential partner

Country Status (1)

Country Link
AU (1) AU2005306571A1 (en)


Legal Events

Date Code Title Description
MK5 Application lapsed section 142(2)(e) - patent request and compl. specification not accepted