WO2022044036A1 - System and method to determine true facial feature measurements of a face in a 2d image - Google Patents

System and method to determine true facial feature measurements of a face in a 2d image Download PDF

Info

Publication number
WO2022044036A1
Authority
WO
WIPO (PCT)
Prior art keywords
face
center
user
main server
image
Prior art date
Application number
PCT/IN2021/050809
Other languages
French (fr)
Inventor
Peyush Bansal
Original Assignee
Lenskart Solutions Pvt. Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenskart Solutions Pvt. Ltd.
Publication of WO2022044036A1 publication Critical patent/WO2022044036A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0621Item configuration or customization
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0641Shopping interfaces
    • G06Q30/0643Graphical representation of items or shoppers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/62Analysis of geometric attributes of area, perimeter, diameter or volume
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30196Human being; Person
    • G06T2207/30201Face

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • Marketing (AREA)
  • Multimedia (AREA)
  • Strategic Management (AREA)
  • Economics (AREA)
  • Development Economics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • General Business, Economics & Management (AREA)
  • Human Computer Interaction (AREA)
  • Ophthalmology & Optometry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Geometry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

A face detection system comprises a communication device, a main server, and a third-party server. The communication device is controlled by a first processor and comprises a camera, which is focused on the face of a user to click an image of the face of the user. The main server is controlled by a second processor that is in communication with the communication device via a communication network, which receives the captured image via the communication device. The third-party server is controlled by a third processor and is in communication with the main server, which receives the captured image and identifies the landmarks on the face of the user as a measurable value. The identified measurable value is transferred back to the main server to determine the diameter of both irises of the user's eyes and the distance between the center of the left iris and the center of the right iris.

Description

SYSTEM AND METHOD TO DETERMINE TRUE FACIAL FEATURE MEASUREMENTS OF A FACE IN A 2D IMAGE
FIELD OF THE INVENTION
The present invention relates to the detection of true facial feature measurements. More specifically, the present invention relates to a face detection system, and a method associated with the system, that determines true facial measurements for use with optical products that require an appropriate size. For optical products such as eyeglasses or sunglasses, the facial measurements help in determining the correct size of the product of interest.
BACKGROUND OF THE INVENTION
In the current scenario, online purchase of accessories and clothing has become a matter of great interest, especially since most web portals offer technologies that let the user, for example, click, scan, or upload a picture of himself/herself, from which the website can identify suitable measurements for the accessory of his/her choice. However, when it comes to choosing a pair of spectacles, measurements of the face have to be taken carefully to determine the right pair for the user. As an example, Nike Fit uses augmented reality available on iPhone X (iPhone X™ by Apple Inc.™) grade phones to determine size. This is possible because the iPhone X has an infrared camera that helps determine the depth of an object in an image. Unfortunately, this technology is limited to only those grades of phones.
Another method is the use of reference objects in the picture. To find the measurement of the eyeglass or sunglass frame that would fit correctly on a face, a few companies use a magnetic-stripe card (an ATM card or credit card) as a reference object. These cards have a standardized length and width under international standards. If a user places the card flat over his/her forehead and takes a picture, then, using image processing techniques, the card is detected and the mm-per-pixel scale is derived (since the length and width of the card are fixed and hence known). As mentioned above, most of these techniques rely on width, length, as well as depth, based on 3D imaging systems. Therefore, there is a need for a method or system that detects correct facial measurements for selecting the right product of interest, for example, spectacles, without using augmented reality and 3D depth analysis.
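For concreteness, the mm-per-pixel scaling behind the reference-card approach described above can be sketched as follows. This is a minimal illustration in Python, assuming the card has already been detected and its width in pixels is available; the card dimensions follow the ISO/IEC 7810 ID-1 standard used for ATM and credit cards, and the numbers in the usage example are made up.

```python
# Minimal sketch of the reference-card scaling described above.
# Assumes a separate detector has already located the card in the image
# and returned its width in pixels; the card dimensions follow the
# ISO/IEC 7810 ID-1 standard used by ATM/credit cards.

CARD_WIDTH_MM = 85.60   # standardized card width
CARD_HEIGHT_MM = 53.98  # standardized card height (not used below)

def mm_per_pixel(card_width_px: float) -> float:
    """Scale factor derived from the known physical width of the card."""
    return CARD_WIDTH_MM / card_width_px

def to_millimeters(measurement_px: float, card_width_px: float) -> float:
    """Convert any facial measurement taken in pixels to millimeters."""
    return measurement_px * mm_per_pixel(card_width_px)

# Example (made-up numbers): a card detected as 430 px wide and a pixel
# distance of 310 px between the pupils give an estimated pupillary
# distance of roughly 61.7 mm.
print(to_millimeters(310, 430))
```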
These and other advantages of the present invention will become readily apparent from the following detailed description read in conjunction with the accompanying drawings.
SUMMARY OF THE INVENTION
The following presents a simplified summary of the subject matter in order to provide a basic understanding of some of the aspects of subject matter embodiments. This summary is not an extensive overview of the subject matter. It is not intended to identify key/critical elements of the embodiments or to delineate the scope of the subject matter. Its sole purpose is to present some concepts of the subject matter in a simplified form as a prelude to the more detailed description that is presented later.
The face detection system disclosed here addresses the above-mentioned need to detect correct facial measurements for selecting the right product of interest, for example, spectacles, without using augmented reality and 3D depth analysis. The face detection system comprises a communication device, a main server, and a third-party server. The communication device is controlled by a first processor and comprises a camera, where the camera is focused on the face of a user to click an image of the face of the user. The main server is controlled by a second processor that is in communication with the communication device via a communication network, where the main server receives the captured image via the communication device. The third-party server is controlled by a third processor and is in communication with the main server, wherein the third-party server receives the captured image and identifies the landmarks on the face of the user as a measurable value. The identified measurable value is transferred back to the main server to determine the diameter of both irises of the user’s eyes and the distance between the center of the left iris and the center of the right iris.
In an embodiment, the diameter of both the irises of the user’s eyes and the distance between the center of the left iris and the center of the right iris are measured in pixels. In an embodiment, the main server in communication with the second processor determines a ratio of an average of the diameter of both the irises in pixels by the distance between the center of the left iris and the center of the right iris in pixels. In an embodiment, the communication device receives processed data that comprises the ratio from the main server, where the user is enabled to choose a product of interest by accessing a set of tables that is derived based on the above determined ratio.
In an embodiment, the face detection system further comprises an image acquisition module of the communication device that is controlled by the first processor and is in communication with the camera to capture and receive the image using the camera. In an embodiment, the face detection system further comprises a face detection module of the third-party server that detects the landmarks of the image of the face in pixels using image processing techniques. In an embodiment, the face detection system further comprises a facial features quantifying module of the main server that detects the diameter of both the irises in pixels and the distance between the center of the left iris and the center of the right iris in pixels.
In an embodiment, the face detection system further comprises a unit determination module of the main server that receives data that includes the diameter and the distance that is retrieved from the facial features quantifying module. The unit determination module determines the ratio of the average of the diameter of both the irises in pixels by the distance between the center of the left iris and the center of the right iris in pixels. In an embodiment, the ratio is within a range of 0.11 to 0.31. Other aspects, advantages, and salient features of the present invention will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses exemplary embodiments of the invention. Reference will now be made to the accompanying diagrams which illustrate, by way of an example and not by way of limitation, one possible embodiment of the invention.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
The following drawings are illustrative of particular examples for enabling systems and methods of the present invention, are descriptive of some of the methods and mechanisms, and are not intended to limit the scope of the invention. The drawings are not to scale (unless so stated) and are intended for use in conjunction with the explanations in the following detailed description.
Figure 1 is a schematic diagram showing the face detection system that determines true facial feature measurements in a 2D image.
Figure 2 is a front view of a reference human face, showing the different dimensions that are retrieved from the face of the user to determine the facial feature measurements.
Figure 3 is a block diagram showing the different modules of the face detection system that determine facial feature measurements from a 2D image of the face of a user.
Figure 4 is a method flow diagram showing the process involved in the face detection system that determines facial feature measurements from a 2D image of the face of a user.
Persons skilled in the art will appreciate that elements in the figures are illustrated for simplicity and clarity and may represent hardware components of the face detection system. Further, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help improve understanding of various exemplary embodiments of the present disclosure.
Throughout the drawings, it should be noted that like reference numbers are used to depict the same or similar elements, features, and structures.
DETAILED DESCRIPTION OF THE INVENTION
The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of exemplary embodiments of the invention. It includes various specific details to assist in that understanding but these are to be regarded as merely exemplary. Accordingly, persons skilled in the art will recognize that various changes and modifications to the embodiments described herein can be made without departing from the scope and spirit of the invention. In addition, descriptions of well-known functions and constructions are omitted for clarity and conciseness.
The terms and words used in the following description are not limited to their bibliographical meanings, but are merely used by the inventor to enable a clear and consistent understanding of the invention. Accordingly, it should be apparent to the person skilled in the art that the following description of exemplary embodiments of the present invention is provided for illustration purposes only.
It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise.
The present invention relates to a face detection system that determines true facial measurements for use with optical products that require an appropriate size. For optical products such as eyeglasses or sunglasses, the facial measurements help in determining the correct size of the product of interest.
Referring to Figures 1 and 3, Figure 1 is a schematic diagram showing the face detection system 100 that determines true facial feature measurements of a face in a 2D image. The face detection system 100 includes a communication device 102, a communication network 104, a main server 106, and a third-party server 108. The communication device 102 is, for example, a mobile device, a desktop, a laptop, or any device that includes a software application and an image capturing feature capable of communicating with a server through a network.
Referring to Figures 1-4, the communication device 102 is controlled by a first processor 310 and comprises a camera 110, which is focused 401 on the face of a user to click an image of the face 200 of the user. The main server 106 is controlled by a second processor 312 that is in communication with the communication device 102 via a communication network 104, which receives 402 the captured image via the communication device 102. The third-party server 108 is controlled by a third processor 314 and is in communication with the main server 106, which receives 403 the captured image and identifies the landmarks on the face 200 of the user as a measurable value. The identified measurable value is transferred back to the main server 106 to determine 404 the diameter 202 of both irises of the user’s eyes and the distance 204 between the center of the left iris and the center of the right iris, as shown in Figure 2.
In other words, in order to detect the facial measurements, the user has to access the software application installed in the communication device 102 that is controlled by the first processor 310, access the camera 110 included in the communication device 102 through the option available on the software application, position the communication device 102 at a distance from his/her face, and capture (or click) the image of the face 200 using the camera 110. The captured image is transferred to the main server 106 via the communication network 104. The main server 106 transfers the image to the third-party server 108, where the landmarks of the face 200 of the user are identified in measurable values, such as pixels, using image processing techniques.
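The capture-and-upload step described above could, for example, look like the following client-side sketch in Python. The server URL, path, and form-field name are hypothetical placeholders, not part of the disclosure; the sketch only illustrates sending the captured 2D image to the main server and receiving the processed data in return.

```python
# Hypothetical client-side upload of the captured face image to the main
# server. The URL, path, and form-field name are illustrative only.
import requests

MAIN_SERVER_URL = "https://main-server.example.com/api/face-image"  # placeholder

def upload_face_image(image_path: str) -> dict:
    """Send the captured 2D image to the main server and return its response."""
    with open(image_path, "rb") as image_file:
        response = requests.post(
            MAIN_SERVER_URL,
            files={"image": ("face.jpg", image_file, "image/jpeg")},
            timeout=30,
        )
    response.raise_for_status()
    # The response is expected to eventually carry the processed ratio.
    return response.json()

if __name__ == "__main__":
    print(upload_face_image("face.jpg"))
```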
This data, identified using the third-party server 108 and including the landmarks of the face 200, is transferred back to the main server 106 for further processing that includes, for example, determining the diameter 202 of both the irises in pixels and the distance 204 between the center of the left iris and the center of the right iris in pixels, and determining a ratio of an average of the diameter 202 of both the irises in pixels by the distance 204 between the center of the left iris and the center of the right iris in pixels. The communication device 102 finally receives processed data that comprises the ratio from the main server 106, so that a user is enabled to choose the product of interest, for example, a pair of spectacles, by accessing a set of tables derived from the above determined value, or in other words, the above determined ratio.
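A minimal sketch of this main-server computation is given below in Python, assuming the third-party server returns iris-related landmarks as (x, y) pixel coordinates; the landmark key names are hypothetical, since the disclosure does not specify the third-party service's output format.

```python
# Sketch of the main-server computation on the landmarks returned by the
# third-party server. The landmark keys below are hypothetical; the disclosure
# only requires that iris edges and iris centers are available in pixels.
import math

def distance_px(p1, p2):
    """Euclidean distance between two (x, y) landmark points, in pixels."""
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

def iris_ratio(landmarks: dict) -> float:
    # Diameter 202 of each iris, in pixels.
    left_diam = distance_px(landmarks["left_iris_left_edge"],
                            landmarks["left_iris_right_edge"])
    right_diam = distance_px(landmarks["right_iris_left_edge"],
                             landmarks["right_iris_right_edge"])
    # Distance 204 between the centers of the two irises, in pixels.
    center_dist = distance_px(landmarks["left_iris_center"],
                              landmarks["right_iris_center"])
    # Ratio of the average iris diameter to the inter-iris-center distance;
    # per the disclosure this typically falls between 0.11 and 0.31.
    return ((left_diam + right_diam) / 2.0) / center_dist

# Example with made-up pixel coordinates (purely illustrative):
example = {
    "left_iris_left_edge": (180, 220), "left_iris_right_edge": (214, 220),
    "right_iris_left_edge": (366, 220), "right_iris_right_edge": (400, 220),
    "left_iris_center": (197, 220), "right_iris_center": (383, 220),
}
print(round(iris_ratio(example), 3))  # ~0.183
```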
Figure 2 is a front view of a reference human face 200, showing the different dimensions that are retrieved from the face 200 of the user to determine the facial feature measurements. Here, the reference numeral 202 represents the diameter of both the irises in pixels and the reference numeral 204 represents the distance between the center of the left iris and the center of the right iris in pixels.
Figure 3 is a block diagram showing the different modules of the face detection system 100 that determines true facial feature measurements from a 2D image of the face 200 of a user. An image acquisition module 302 of the communication device 102 that is in communication with the camera 110 captures and receives the image of the face 200 of the user from the camera 110. The data of the captured image of the user’s face 200 is transferred to the main server 106 and then to the third-party server 108 via the communication network 104. A face detection module 304 of the third-party server 108 detects the landmarks of the image of the face 200 in pixels using image processing techniques. The detected data that includes the landmarks of the image of the face 200 is fed back from the third-party server 108 to the main server 106 via the communication network 104. A facial features quantifying module 306 of the main server 106 determines the diameter 202 of both the irises in pixels and the distance 204 between the center of the left iris and the center of the right iris in pixels. A unit determination module 308 of the main server 106 receives the data including the diameter 202 and the distance 204 that is retrieved from the facial features quantifying module 306. The unit determination module 308 determines the ratio of an average of the diameter 202 of both the irises in pixels by the distance 204 between the center of the left iris and the center of the right iris in pixels. This ratio will range between, for example, 0.11 and 0.31.
The unit determination module 308 also derives, based on the ratio, the real measurement of the distance between the center of the left iris and the center of the right iris in millimeters (mm), which is defined as the pupillary distance. In an embodiment, the unit determination module 308 further determines the measurements of the rest of the facial features, such as the eyes, nose, lips, ears and forehead, using the pupillary distance as a reference scale. In order to enable a user to choose the product of interest, for example, a pair of spectacles with a custom frame size, a table is provided for the user’s reference to derive the final facial measurements. For example:
a. Male, ratio 0.15 corresponds to a pupillary distance of 72 mm
b. Male, ratio 0.16 corresponds to a pupillary distance of 70 mm
c. Male, ratio 0.17 corresponds to a pupillary distance of 68 mm
d. Male, ratio 0.18 corresponds to a pupillary distance of 66 mm
e. Male, ratio 0.19 corresponds to a pupillary distance of 64 mm
f. Male, ratio 0.20 corresponds to a pupillary distance of 62 mm
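A minimal sketch of how such a table could be applied is shown below in Python, together with the use of the derived pupillary distance as a reference scale for converting other facial measurements from pixels to millimeters. The nearest-entry lookup and the helper names are assumptions for illustration; the disclosure only lists the discrete example rows above.

```python
# Lookup table from the example above (male users): ratio -> pupillary
# distance in mm. Intermediate ratios are handled by nearest-entry lookup,
# which is an assumption; the disclosure lists only discrete rows.
RATIO_TO_PD_MM_MALE = {
    0.15: 72, 0.16: 70, 0.17: 68, 0.18: 66, 0.19: 64, 0.20: 62,
}

def pupillary_distance_mm(ratio: float) -> float:
    nearest = min(RATIO_TO_PD_MM_MALE, key=lambda r: abs(r - ratio))
    return RATIO_TO_PD_MM_MALE[nearest]

def pixels_to_mm(measurement_px: float, inter_iris_center_px: float, ratio: float) -> float:
    """Scale any facial measurement from pixels to mm, using the pupillary
    distance (derived from the ratio) as the reference length."""
    pd_mm = pupillary_distance_mm(ratio)
    return measurement_px * (pd_mm / inter_iris_center_px)

# Example (made-up numbers): ratio 0.18 maps to a 66 mm pupillary distance;
# a face width of 700 px with an inter-iris-center distance of 330 px then
# scales to roughly 140 mm.
print(pixels_to_mm(700, 330, 0.18))
```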
As will be appreciated by one of skill in the art, the present disclosure may be embodied as a method and system. Accordingly, the present disclosure may take the form of an entirely hardware embodiment, a software embodiment or an embodiment combining software and hardware aspects.
It will be understood that the functions of any of the units as described above can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts performed by any of the units as described above.
Instructions may also be stored in a computer- readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function/act performed by any of the units as described above.
Instructions may also be loaded onto a computer or other programmable data processing apparatus like a scanner/check scanner to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts performed by any of the units as described above.
In the specification, there has been disclosed exemplary embodiments of the invention. Although specific terms are employed, they are used in a generic and descriptive sense only and not for purposes of limitation of the scope of the invention.

Claims

We Claim:
1. A face detection system comprising: a communication device controlled by a first processor and comprising a camera, wherein the camera is focused on the face of a user to click an image of the face of the user; a main server controlled by a second processor, in communication with the communication device via a communication network, wherein the main server receives the captured image via the communication device; a third-party server controlled by a third processor, in communication with the main server, wherein the third-party server receives the captured image and identifies the landmarks on the face of the user as a measurable value, wherein the identified measurable value is transferred back to the main server to determine the diameter of both irises of the user’s eyes and the distance between the center of the left iris and the center of the right iris.
2. The face detection system as claimed in claim 1, wherein the diameter of both the irises of the user’s eyes and the distance between the center of the left iris and the center of the right iris are measured in pixels.
3. The face detection system as claimed in claim 2, wherein the main server in communication with the second processor determines a ratio of an average of the diameter of both the irises in pixels by the distance between the center of the left iris and the center of the right iris in pixels.
4. The face detection system as claimed in claim 3, wherein the communication device receives processed data that comprises the ratio from the main server, wherein the user is enabled to choose a product of interest by accessing a set of tables that is derived based on the above determined ratio.
5. The face detection system as claimed in claim 1, further comprising an image acquisition module of the communication device that is controlled by the first processor and is in communication with the camera to capture and receive the image using the camera.
6. The face detection system as claimed in claim 1, further comprising a face detection module of the third-party server that detects the landmarks of the image of the face in pixels using image processing techniques.
7. The face detection system as claimed in claim 1, further comprises a facial features quantifying module of the main server that detects the diameter of both the irises in pixels and the distance between the center of the left iris and the center of the right iris in pixels.
8. The face detection system as claimed in claim 1, further comprises a unit determination module of the main server that receives data that includes the diameter and the distance that is retrieved from the facial features quantifying module, wherein the unit determination module determines the ratio of the average of the diameter of both the irises in pixels by the distance between the center of the left iris and the center of the right iris in pixels.
9. The face detection system as claimed in claim 8, wherein the ratio is within a range between 0.11 to 0.31.
10. A method of face detection comprising: focusing a camera of a communication device on the face of a user to click an image of the face of the user, wherein the communication device is controlled by a first processor; receiving the captured image in a main server that is controlled by a second processor, wherein the main server is in communication with the communication device via a communication network; receiving the captured image at a third-party server that is controlled by a third processor and identifying the landmarks on the face of the user as a measurable value, wherein the third-party server is in communication with the main server; and determining the diameter of both irises of the user’s eyes and the distance between the center of the left iris and the center of the right iris after transferring the identified measurable value to the main server.
PCT/IN2021/050809 2020-08-25 2021-08-24 System and method to determine true facial feature measurements of a face in a 2d image WO2022044036A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN202011036580 2020-08-25
IN202011036580 2020-08-25

Publications (1)

Publication Number Publication Date
WO2022044036A1 true WO2022044036A1 (en) 2022-03-03

Family

ID=80352901

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IN2021/050809 WO2022044036A1 (en) 2020-08-25 2021-08-24 System and method to determine true facial feature measurements of a face in a 2d image

Country Status (1)

Country Link
WO (1) WO2022044036A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013052132A2 (en) * 2011-10-03 2013-04-11 Qualcomm Incorporated Image-based head position tracking method and system
US10451900B2 (en) * 2013-08-22 2019-10-22 Bespoke, Inc. Method and system to create custom, user-specific eyewear
CN110477851A (en) * 2019-07-15 2019-11-22 广东工业大学 A kind of method of accurate measurement pupil and iris absolute diameter

Similar Documents

Publication Publication Date Title
US11036991B2 (en) Information display method, device, and system
CN108764071B (en) Real face detection method and device based on infrared and visible light images
USRE47925E1 (en) Method and multi-camera portable device for producing stereo images
US9236024B2 (en) Systems and methods for obtaining a pupillary distance measurement using a mobile computing device
US20220400246A1 (en) Gaze correction of multi-view images
US20220301217A1 (en) Eye tracking latency enhancements
JP2015527625A (en) physical measurement
US20150049952A1 (en) Systems and methods of measuring facial characteristics
Reddy et al. A robust scheme for iris segmentation in mobile environment
WO2018219290A1 (en) Information terminal
US9924865B2 (en) Apparatus and method for estimating gaze from un-calibrated eye measurement points
KR102444768B1 (en) Method and apparatus for measuring local power and/or power distribution of spectacle lenses
WO2022044036A1 (en) System and method to determine true facial feature measurements of a face in a 2d image
US9779328B2 (en) Range image generation
CN111383256B (en) Image processing method, electronic device, and computer-readable storage medium
EP2866446B1 (en) Method and multi-camera portable device for producing stereo images
KR101777328B1 (en) Method for providing data related to product using smart glass and mobile device
US20180199810A1 (en) Systems and methods for pupillary distance estimation from digital facial images
US11915521B2 (en) Zero delay gaze filter
CN109034112B (en) Hospital's face checkout system
Li et al. openEyes: an open-hardware open-source system for low-cost eye tracking
US20190156146A1 (en) Frame recognition system and method
CN109328459B (en) Intelligent terminal, 3D imaging method thereof and 3D imaging system
KR20170032743A (en) Method for providing coupon using smart glass and mobile device
JP6510451B2 (en) Device, method and program for specifying the pupil area of a person in an image

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21860774

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21860774

Country of ref document: EP

Kind code of ref document: A1