US9990537B2 - Facial feature location using symmetry line - Google Patents


Info

Publication number
US9990537B2
Authority
US
United States
Prior art keywords
symmetry line
symmetry
face
facial feature
template
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related, expires
Application number
US14/803,142
Other versions
US20170024607A1 (en)
Inventor
Zvi Kons
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
International Business Machines Corp
Original Assignee
International Business Machines Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by International Business Machines Corp filed Critical International Business Machines Corp
Priority to US14/803,142
Assigned to INTERNATIONAL BUSINESS MACHINES CORPORATION. Assignors: KONS, ZVI
Publication of US20170024607A1
Application granted
Publication of US9990537B2
Status: Expired - Fee Related
Adjusted expiration

Classifications

    • G06K9/00248
    • G06K9/00281
    • G06T7/68 Analysis of geometric attributes of symmetry
    • G06V40/165 Detection; Localisation; Normalisation using facial parts and geometric relationships
    • G06V40/171 Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships
    • G06T2207/30201 Subject of image: Face


Abstract

A method, system and product for locating facial features using a symmetry line of the face. The method comprises: obtaining an image of a face of a subject; automatically detecting a symmetry line of the face, wherein the symmetry line intersects at least a mouth region of the face; and automatically locating a facial feature of the face using the symmetry line. Optionally, a rotation of the symmetry line is used to select a template, to rotate a template, or to rotate the image. Optionally, the facial feature is symmetrical and is searched for using a symmetrical template. Optionally, the automatic locating comprises performing a one-dimensional correlation of an intensity cross section defined by the symmetry line with a cross section template. Optionally, the automatic locating comprises correlating a curve defined by the symmetry line with a template curve.

Description

TECHNICAL FIELD
The present disclosure relates to image analysis in general, and to facial features identification, in particular.
BACKGROUND
In many image processing applications, identifying facial features of the subject may be desired. Currently, locating facial features requires a search in four dimensions using local templates that match the target features. Such a search tends to be complex and prone to errors because it has to locate (x, y) coordinates, a scale parameter, and a rotation parameter.
Additionally, some templates are made to be rotation invariant. However, such templates are generally less selective and can therefore produce false detections.
BRIEF SUMMARY
One exemplary embodiment of the disclosed subject matter is a computer-implemented method comprising: obtaining an image of a face of a subject; automatically detecting a symmetry line of the face in the image, wherein the symmetry line intersects at least a mouth region of the face; and automatically locating a facial feature of the face using the symmetry line.
Another exemplary embodiment of the disclosed subject matter is a computerized apparatus having a processor, the processor being adapted to perform the steps of: obtaining an image of a face of a subject; automatically detecting a symmetry line of the face in the image, wherein the symmetry line intersects at least a mouth region of the face; and automatically locating a facial feature of the face using the symmetry line.
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
The present disclosed subject matter will be understood and appreciated more fully from the following detailed description taken in conjunction with the drawings in which corresponding or like numerals or characters indicate corresponding or like components. Unless indicated otherwise, the drawings provide exemplary embodiments or aspects of the disclosure and do not limit the scope of the disclosure. In the drawings:
FIG. 1 shows a flowchart diagram of a method, in accordance with some exemplary embodiments of the disclosed subject matter;
FIG. 2A shows a flowchart diagram of a method, in accordance with some exemplary embodiments of the disclosed subject matter;
FIG. 2B shows a flowchart diagram of a method, in accordance with some exemplary embodiments of the disclosed subject matter;
FIGS. 3A and 3B show illustrations of an image of a face, in accordance with some exemplary embodiments of the disclosed subject matter;
FIGS. 4A and 4B show an illustration of an intensity curve along the symmetry lines of FIGS. 3A and 3B, in accordance with the disclosed subject matter;
FIG. 5 illustrates the geometry of the symmetry operator, in accordance with the disclosed subject matter; and
FIG. 6 shows a block diagram of an apparatus, in accordance with some exemplary embodiments of the disclosed subject matter.
DETAILED DESCRIPTION
One technical problem dealt with by the disclosed subject matter is to implement a face feature location tool that can accurately and precisely identify a facial feature in an image of a face.
One technical solution is to detect a symmetry line of the face appearing in the image and detecting the face feature using the symmetry line.
In some exemplary embodiments, the angle of the symmetry line may be used to select a template to be used in searching for the face features. In some exemplary embodiments, the template may not be rotation invariant and may instead be specific to a given rotation. The rotation angle of the symmetry line may be used to rotate a template or to select a template having a suitable rotation, thereby improving the accuracy of the process.
In some exemplary embodiments, a symmetry cross section, which may be defined using values along the symmetry line, may be used to search for facial features that are intersected by the symmetry line, such as the mouth or nose positions. The locating may be performed by a one-dimensional correlation of the symmetry cross section with cross section templates, such as templates depicting lip and mouth openings, the nose, the forehead, the chin, or the like.
In some exemplary embodiments, symmetric facial features, such as the eyes, may be located using the symmetry line. In some exemplary embodiments, detected facial features may be verified against properties relating to the symmetry line, such as each feature being equidistant from the symmetry line, a line connecting the two features being perpendicular to the symmetry line, or the like.
In some other exemplary embodiments, the search for symmetrical features such as the eyes is done in unison at both sides of the symmetry line. Two corresponding templates are matched at opposite points over the symmetry line as if they were one symmetric template.
In some exemplary embodiments, the whole image can be rotated, translated and scaled so as to make the symmetry line vertical, centered, and of a certain length. Further processing or template matching steps can then assume that the image of the face is properly aligned.
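As an illustration of this normalization step, the following is a minimal sketch (not the patent's reference implementation) that rotates, scales, and recenters an image so a detected symmetry line becomes vertical and horizontally centered. It assumes OpenCV and NumPy are available; `theta` is the line's tilt in radians, (xs, ys) is a point on the line, and the rotation sign may need flipping depending on the coordinate convention.

```python
import cv2
import numpy as np

def normalize_to_symmetry_line(img, xs, ys, theta, scale=1.0):
    """Warp img so the symmetry line through (xs, ys) at tilt theta is vertical."""
    h, w = img.shape[:2]
    # Rotate about a point on the line; getRotationMatrix2D takes degrees.
    M = cv2.getRotationMatrix2D((float(xs), float(ys)), -np.degrees(theta), scale)
    # Translate so the symmetry point lands on the horizontal center of the frame.
    M[0, 2] += w / 2.0 - xs
    return cv2.warpAffine(img, M, (w, h))
```

Subsequent template matching steps can then operate on an upright, centered face.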
One technical effect of the disclosed subject matter is in providing a relatively stable and accurate method of identifying facial features.
Another technical effect is reducing the computational complexity of the facial feature detection, such as from a two-dimensional search to a one-dimensional search.
Referring now to FIG. 1 showing a flowchart diagram of a method, in accordance with some exemplary embodiments of the subject matter.
In Step 100, an image of a face of a subject is obtained. In some exemplary embodiments, the image may be originally in grayscale, or processed into grayscale. Additionally or alternatively, the image may be a color image.
In Step 110, a symmetry line of the face of the subject is detected. In some exemplary embodiments, a face detector may detect the face within the image. The symmetry line may be detected within the face.
Referring now to FIG. 3A illustrating Symmetry Line 310 of Face 300. As can be appreciated, Symmetry Line 310 is tilted based on an angle of Face 300. Symmetry Line 310 crosses Mouth 320. In FIG. 3B, another image of the same subject is illustrated. Symmetry Line 310′ crosses Mouth 320′, which is open. As can be appreciated, each image has a different symmetry line, as a tilt angle (also referred to as a rotation angle) may be different in each image.
In some exemplary embodiments, a symmetry operator, S, may be defined as follows:
$$S(x, y, \theta) = \int_{(x_1, y_1) \in \Omega} \left| m(x_1, y_1) - m(x_2, y_2) \right|,$$
wherein m is an intensity value of the image at a point, (x2, y2) is the reflection of a point (x1, y1) over the symmetry line, θ is the angle of the symmetry line, and Ω is half a circle with center at (x, y) and a radius in accordance with the size of the face. In some exemplary embodiments, the reflection point is defined by: x2 = x1 + 2d·cos(θ), y2 = y1 − 2d·sin(θ), where d = −(x1 − x)·cos(θ) + (y1 − y)·sin(θ).
FIG. 5 illustrates the geometry of the symmetry operator. Symmetry Line 500 is tilted at an angle of θ. Symmetry Line 500 passes through Center 520 (x, y). A half circle having Center 520 (x, y) and a radius in accordance with a size of the face of the subject (e.g., about the lateral radius of the face, 80% of the lateral radius of the face, 75% of the vertical radius of the face, or the like) is denoted as Ω. Every point in Ω is reflected by Symmetry Line 500, and its corresponding point is expected to be substantially similar to it (e.g., similar intensity values). As an example, Point 512 (x1, y1) is reflected by Symmetry Line 500 to Point 514 (x2, y2), each of which is at a distance of d from Symmetry Line 500.
In some exemplary embodiments, if the symmetry operator is applied with correct parameters (e.g., correct (x, y) and θ) which match the symmetry line of the face in the image, the value returned is expected to be close to zero, as the image intensity of each point in Ω is expected to be similar to that of its reflection over the symmetry line.
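A direct NumPy rendering of the operator may look as follows. This is a naive sketch under the definitions above: `m` is a grayscale image indexed as m[y, x], the half-disk Ω is traversed with explicit loops, and reflections that fall outside the image are simply skipped; a production implementation would vectorize this.

```python
import numpy as np

def symmetry_operator(m, x, y, theta, radius):
    """Sum |m(p) - m(reflection of p)| over the half-disk Omega, per the patent."""
    h, w = m.shape
    total = 0.0
    for y1 in range(int(y - radius), int(y + radius) + 1):
        for x1 in range(int(x - radius), int(x + radius) + 1):
            if (x1 - x) ** 2 + (y1 - y) ** 2 > radius ** 2:
                continue
            d = -(x1 - x) * np.cos(theta) + (y1 - y) * np.sin(theta)
            if d <= 0:  # keep one side of the line only (half circle)
                continue
            # Reflection of (x1, y1) over the symmetry line.
            x2 = int(round(x1 + 2 * d * np.cos(theta)))
            y2 = int(round(y1 - 2 * d * np.sin(theta)))
            if 0 <= x1 < w and 0 <= y1 < h and 0 <= x2 < w and 0 <= y2 < h:
                total += abs(float(m[y1, x1]) - float(m[y2, x2]))
    return total
```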
In some exemplary embodiments, two points that are expected to be on opposite sides of the symmetry line may be selected (e.g., Points 512 and 514; denoted as (xR, yR) and (xL, yL)). In some exemplary embodiments, the locations of the two eyes can be used (e.g., 332 and 334 of FIG. 3A). A search may be performed between those two points and over various angles to find the minimal value of the symmetry operator. The results of such a search may be the parameters of the symmetry line. In some exemplary embodiments, the search may be formally defined as follows:
$$(x_s, y_s, \theta_s) = \operatorname*{argmin}_{\alpha \in [0,1],\; \theta \in [-\pi/2,\, \pi/2]} S\big(\alpha \cdot x_L + (1-\alpha) \cdot x_R,\ \alpha \cdot y_L + (1-\alpha) \cdot y_R,\ \theta\big).$$
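In code, this search can be a brute-force scan over candidate centers on the segment between the two seed points and over candidate angles, reusing `symmetry_operator` from the sketch above. The grid resolutions are arbitrary illustrative choices, not values from the patent.

```python
import numpy as np

def find_symmetry_line(m, p_left, p_right, radius, n_alpha=21, n_theta=37):
    """Return (xs, ys, theta_s) minimizing the symmetry operator."""
    (xL, yL), (xR, yR) = p_left, p_right
    best, best_val = None, float("inf")
    for alpha in np.linspace(0.0, 1.0, n_alpha):
        # Candidate center on the segment between the two seed points.
        x = alpha * xL + (1 - alpha) * xR
        y = alpha * yL + (1 - alpha) * yR
        for theta in np.linspace(-np.pi / 2, np.pi / 2, n_theta):
            val = symmetry_operator(m, x, y, theta, radius)
            if val < best_val:
                best, best_val = (x, y, theta), val
    return best
```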
In some exemplary embodiments, the symmetry line may be represented by the parametric equation:
$$\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} x_s \\ y_s \end{pmatrix} + \begin{pmatrix} \sin \theta_s \\ \cos \theta_s \end{pmatrix} t.$$
In some exemplary embodiments, the symmetry line may be used to extract the image intensity cross section along the symmetry line: v(t) = m(xs + t·sin(θs), ys + t·cos(θs)), where t is a value within the upper and lower face boundaries.
Referring back to FIG. 1. In Step 120, a template to be used for locating the desired facial feature is selected. The template may be a rotation-dependent template, whose rotation parameter is the tilt angle of the symmetry line. Additionally or alternatively, a template may be rotated to the desired tilt angle. In some exemplary embodiments, the selection is made from a database retaining different templates, each associated with a different angle. In some exemplary embodiments, there may be a single template that can be rotated to a desired angle, in which case rotating the template to the desired angle is considered a "selection".
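Both variants of Step 120 are easy to sketch. Below, `templates` is assumed to be a dict mapping tilt angles (in radians) to pre-rotated template images, and the single-template variant rotates a canonical upright template with SciPy; neither the data layout nor the interpolation mode is prescribed by the patent.

```python
import numpy as np
from scipy.ndimage import rotate

def select_template(templates, theta):
    """Pick the pre-rotated template whose angle is closest to the line's tilt."""
    nearest = min(templates, key=lambda a: abs(a - theta))
    return templates[nearest]

def rotate_template(template, theta):
    """Alternatively, rotate one canonical template to the desired tilt angle."""
    return rotate(template, np.degrees(theta), reshape=False, mode="nearest")
```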
In Step 130, the selected template is used for locating the desired facial feature.
Referring now to FIG. 2A showing a flowchart diagram of a method, in accordance with some exemplary embodiments of the subject matter.
In Step 200, an image of a face is obtained.
In Step 210, a symmetry line of the face is detected.
In Step 220, a symmetry cross section is determined based on the symmetry line. In some exemplary embodiments, the symmetry line may be used to extract a symmetry cross section along the symmetry line: v(t) = m(xs + t·sin(θs), ys + t·cos(θs)), where t is a value within the upper and lower face boundaries. In some exemplary embodiments, the image may be a grayscale image, for which the value m(x, y) is an intensity value and the symmetry cross section is an intensity cross section. Additionally or alternatively, color images may be used, with a vector of values in each pixel (e.g., m(x, y) comprising RGB values).
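A sketch of this extraction, using nearest-pixel sampling along the parametric line (interpolated sampling would be smoother); the t range corresponding to the face boundaries is supplied by the caller:

```python
import numpy as np

def symmetry_cross_section(m, xs, ys, theta_s, t_min, t_max):
    """Sample v(t) = m(xs + t*sin(theta_s), ys + t*cos(theta_s)) along the line."""
    t = np.arange(t_min, t_max)
    xi = np.clip(np.round(xs + t * np.sin(theta_s)).astype(int), 0, m.shape[1] - 1)
    yi = np.clip(np.round(ys + t * np.cos(theta_s)).astype(int), 0, m.shape[0] - 1)
    return m[yi, xi]  # one value (or RGB vector) per sample along the line
```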
In Step 230, the symmetry cross section may be correlated with a cross section template. The cross section template may include sections which are a-priori known to be attributable to specific facial features. The correlation may be performed based on static facial features, which tend to maintain a static size and position, such as the nose and forehead, as opposed to the mouth or chin.
In some exemplary embodiments, the symmetry cross section may be aligned with the cross section template. In some exemplary embodiments, the symmetry cross section around static face landmarks may be relatively constant and similar in different images of the subject. For example, referring to FIGS. 4A and 4B, Intensity Curve 400 and Intensity Curve 450 may depict the intensity values along Symmetry Lines 310 and 310′, respectively. As can be appreciated from the figures, different sections of the curves relate to different areas of the face, such as hair, forehead, nose, mouth and chin. In some exemplary embodiments, some face landmarks remain relatively constant, such as the hair, forehead and nose, while others may change from one picture to another, such as the mouth and chin. In some exemplary embodiments, it may be expected that the intensity levels from the area of the forehead and the nose are similar in both images. Such information may be used to align one cross section based on the other. The alignment may compensate for scale changes, translation, or the like.
In some exemplary embodiments, alignment may be performed by solving the following:
$$(\alpha_s, \beta_s) = \operatorname*{argmin}_{\alpha, \beta} \left( \int_{t \in r_U} \left| v_1(t) - v_2(\alpha \cdot t + \beta) \right| \, dt \right),$$
where α and β are scale and translation parameters, respectively, and the alignment is performed over rU, which includes the expected range of the static landmarks (e.g., forehead and nose; for example, ⅔ of the image intensity cross section). The aligned symmetry cross section may be represented by a modified function: v′(t) = v(αs·t + βs).
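A coarse grid search is one simple way to solve this minimization. The sketch below treats r_U as an array of sample indices covering the static-landmark range and replaces the integral with a mean absolute difference; the search ranges for α and β are illustrative guesses, not values from the patent.

```python
import numpy as np

def align_cross_sections(v1, v2, r_u,
                         alphas=np.linspace(0.8, 1.25, 46),
                         betas=np.arange(-30, 31)):
    """Find scale alpha and shift beta minimizing |v1(t) - v2(alpha*t + beta)| over r_u."""
    best, best_cost = (1.0, 0.0), float("inf")
    for alpha in alphas:
        for beta in betas:
            idx = np.round(alpha * r_u + beta).astype(int)
            ok = (idx >= 0) & (idx < len(v2))
            if ok.sum() < len(r_u) // 2:  # too little overlap to trust
                continue
            cost = np.abs(v1[r_u][ok].astype(float) - v2[idx[ok]].astype(float)).mean()
            if cost < best_cost:
                best, best_cost = (alpha, beta), cost
    return best  # (alpha_s, beta_s)
```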
In Step 240, the facial feature may be identified based on the symmetry cross section. Referring again to FIGS. 4A and 4B, in one example Curve 400 is the symmetry cross section, and the attribution of its different portions to facial features may not be known. Curve 450 may be a cross section template for which at least the desired facial feature is a-priori identified. For example, in Curve 450 the "hair", "forehead", "nose", "mouth" and "chin" features are a-priori attributed to separate portions of the curve. By correlating Curve 400 with Curve 450, sections of Curve 400 may be attributed to the "hair", "forehead", "nose", "mouth" and "chin" features.
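Once αs and βs are known, a labeled section of the template curve maps onto the observed curve by inverting the alignment (in the formulation above, observed index t corresponds to template index αs·t + βs). A hypothetical helper:

```python
def locate_feature(template_range, alpha_s, beta_s):
    """Map an a-priori labeled template index range (e.g., the "mouth" section)
    onto the observed cross section by inverting t' = alpha_s * t + beta_s."""
    t0, t1 = template_range
    return (int(round((t0 - beta_s) / alpha_s)),
            int(round((t1 - beta_s) / alpha_s)))
```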
Referring now to FIG. 2B showing a flowchart diagram of a method, in accordance with some exemplary embodiments of the subject matter. In Step 250, after a symmetry line is detected (210), symmetric facial features are identified. The identification may be performed using the symmetry line as a reference object.
As an example, consider a situation in which the eyes are to be identified. In traditional algorithms, mistakes can happen when locating the eyes because of various artifacts such as eye glasses and light reflections. By using the properties of the symmetry line, incorrect eye positions may be avoided. In some cases, one or more of the requirements that are related to the symmetry line may be enforced when searching for the eye features. As an example, a line connecting the two symmetric features is expected to be perpendicular to the symmetry line. In some cases, the angle of the above-mentioned line is expected to be θ+90°. As another example, a distance between each eye and the symmetry line is expected to be equal as the eyes are expected to appear in symmetric locations.
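These constraints can be checked directly. The sketch below accepts a candidate eye pair only if the two points lie on opposite sides of the symmetry line, at roughly equal distances from it, with their connecting line roughly perpendicular to it; the tolerances are illustrative, not from the patent.

```python
import numpy as np

def plausible_eye_pair(p1, p2, xs, ys, theta, ang_tol=0.1, dist_tol=0.15):
    """Verify a candidate symmetric feature pair against the symmetry line."""
    def signed_dist(p):  # same signed distance d as in the symmetry operator
        return -(p[0] - xs) * np.cos(theta) + (p[1] - ys) * np.sin(theta)
    d1, d2 = signed_dist(p1), signed_dist(p2)
    if d1 * d2 >= 0:  # must lie on opposite sides of the line
        return False
    if abs(abs(d1) - abs(d2)) > dist_tol * (abs(d1) + abs(d2)):
        return False  # not (approximately) equidistant from the line
    v = np.asarray(p2, dtype=float) - np.asarray(p1, dtype=float)
    line_dir = np.array([np.sin(theta), np.cos(theta)])
    # Connecting line should be perpendicular to the symmetry line (theta + 90°).
    return abs(np.dot(v, line_dir)) / np.linalg.norm(v) < np.sin(ang_tol)
```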
Referring now to FIG. 6 showing a block diagram of an apparatus, in accordance with some exemplary embodiments of the disclosed subject matter.
An Apparatus 600 may be configured to provide for biometric verification, in accordance with the disclosed subject matter. In some exemplary embodiments, Apparatus 600 may be a computing device, such as a personal computer, a server, a smartphone, a Personal Digital Assistant (PDA), or the like.
In some exemplary embodiments, Apparatus 600 may comprise one or more processor(s) 602. Processor 602 may be a Central Processing Unit (CPU), a microprocessor, an electronic circuit, an Integrated Circuit (IC) or the like. Processor 602 may be utilized to perform computations required by Apparatus 600 or any of its subcomponents.
In some exemplary embodiments of the disclosed subject matter, Apparatus 600 may comprise an Input/Output (I/O) Module 605. I/O Module 605 may be utilized to receive images captured by a camera (not shown), audio recordings captured by a microphone (not shown), or the like. I/O Module 605 may be used to provide output indicating a successful or unsuccessful verification attempt. In some exemplary embodiments, I/O Module 605 may be configured to issue an alert when a spoof attempt is detected by Liveness Detector 630.
In some exemplary embodiments, Apparatus 600 may comprise a Memory 607. Memory 607 may be a hard disk drive, a Flash disk, a Random Access Memory (RAM), a memory chip, or the like. In some exemplary embodiments, Memory 607 may retain program code operative to cause Processor(s) 602 to perform acts associated with any of the subcomponents of Apparatus 600.
In some exemplary embodiments, Apparatus 600 may comprise a Face Recognition Unit 610, which is configured to analyze an image and determine facial features in a face of a subject appearing in the image. In some exemplary embodiments, Face Recognition Unit 610 may analyze images obtained by I/O Module 605.
In some exemplary embodiments, Face Recognition Unit 610 may utilize a Symmetry Line Detector 320 to detect a symmetry line of a face in an image. The symmetry line may be used by Face Recognition Unit 610 to identify the target facial features.
In some exemplary embodiments, Template Obtainer 630 may obtain a template based on the symmetry line detected by Symmetry Line Detector 320. The template may be a template to be used in searching for a face feature. Additionally or alternatively, the template may be a cross section template. The template may be obtained from a data storage, such as Memory 607, local storage, external storage, remote storage, or the like.
Cross Section Correlator 640 may be configured to correlate a symmetry cross section with a cross section template. Correlator 640 may be configured to align the symmetry cross section using the cross section template and to correlate a portion of the cross section template which is a-priori attributed to a target facial feature with a corresponding portion in the symmetry cross section. The corresponding portion may be deemed as including the target facial feature.
The present invention may be a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention.
The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
Computer readable program instructions for carrying out operations of the present invention may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present invention.
Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.
These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of the present invention has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the invention in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the invention. The embodiment was chosen and described in order to best explain the principles of the invention and the practical application, and to enable others of ordinary skill in the art to understand the invention for various embodiments with various modifications as are suited to the particular use contemplated.

Claims (17)

What is claimed is:
1. A computer-implemented method comprising:
obtaining an image of a face of a subject;
automatically detecting a symmetry line of the face in the image, wherein the symmetry line intersects at least a mouth region of the face, wherein the symmetry line is a non-vertical line that is tilted by an angle, wherein said automatically detecting comprises:
obtaining a pair of points; and
minimizing a value of a symmetry operator to determine parameters of the symmetry line, wherein a value of the symmetry operator depends on the angle and a center point, wherein said minimizing is performed with respect to a variety of potential points as the center point, wherein the variety of potential points are points in between a first point of the pair of points and a second point of the pair of points; and
automatically locating a facial feature of the face using the symmetry line, wherein said automatically detecting the symmetry line comprises: computing a value of the symmetry operator S(x, y, θ)=∫(x1, y1)∈Ω |m(x1, y1)−m(x2, y2)|, wherein m is an intensity value of the image in a point, wherein (x2, y2) is a reflection of a point (x1, y1) over the symmetry line, wherein θ is the angle, wherein (x2, y2) is determined based on (x1, y1) using θ, and wherein Ω is half a circle with center at (x, y) and a radius in accordance with a size of the face.
2. The computer-implemented method of claim 1,
wherein said detecting the symmetry line comprises determining a rotation value of the symmetry line; and
wherein said automatically locating comprises: setting a rotation of a template based on the rotation value.
3. The computer-implemented method of claim 1,
wherein the facial feature comprises two eyes of the subject; and
wherein said automatically locating comprises: searching for the two eyes based on an angle of a line connecting the two eyes being perpendicular to the symmetry line.
4. The computer-implemented method of claim 1,
wherein the facial feature is symmetrical; and
wherein said automatically locating comprises: searching for the facial feature using a symmetrical template that is rotated and centered with accordance to the symmetry line.
5. The computer-implemented method of claim 1, wherein said automatically locating comprises:
defining a curve based on values of the image at the symmetry line;
correlating the curve with a template curve, wherein the template curve comprises a section attributable to the facial feature; and
determining the location of the facial feature of the face as the location in the curve that correlates to the section attributable to the facial feature in the template.
6. The computer-implemented method of claim 5, wherein the facial feature is a feature that is intersected by the symmetry line.
7. The computer-implemented method of claim 5, wherein the facial feature is selected from the group consisting of: a mouth and a lip.
8. The computer-implemented method of claim 5, wherein said correlating comprises performing a linear transformation on the curve, wherein the linear transformation is configured to compensate for translation and scale changes.
9. The computer-implemented method of claim 1, wherein said automatically locating comprises performing a one dimension correlation of an intensity cross section defined by the symmetry line with a cross section template.
10. The computer-implemented method of claim 1, wherein the image is in grayscale, wherein a value of an image at the symmetry line is an intensity value.
11. The computer-implemented method of claim 1, wherein the image is rotated and centered using an angle and a position of the symmetry line prior to said automatically locating.
12. A computerized apparatus having a processor, the processor being adapted to perform steps of:
obtaining an image of a face of a subject;
automatically detecting a symmetry line of the face in the image, wherein the symmetry line intersects at least a mouth region of the face, wherein the symmetry line is a non-vertical line that is tilted by an angle, wherein said automatically detecting comprises:
obtaining a pair of points; and
minimizing a value of a symmetry operator to determine parameters of the symmetry line wherein a value of the symmetry operator depends on the angle and a center point, wherein said minimizing is performed with respect to a variety of potential points as the center point, wherein the variety of potential points are points in between a first point of the pair of points and a second point of the pair of points; and
automatically locating a facial feature of the face using the symmetry line, wherein said automatically detecting the symmetry line comprises: computing a value of the symmetry operator S(x, y, θ)=∫(x1, y1)∈Ω |m(x1, y1)−m(x2, y2)|, wherein m is an intensity value of the image in a point, wherein (x2, y2) is a reflection of a point (x1, y1) over the symmetry line, wherein θ is the angle, wherein (x2, y2) is determined based on (x1, y1) using θ, and wherein Ω is half a circle with center at (x, y) and a radius in accordance with a size of the face.
13. The computerized apparatus of claim 12,
wherein said detecting the symmetry line comprises determining a rotation value of the symmetry line; and
wherein said automatically locating comprises: setting a rotation of a template based on the rotation value.
14. The computerized apparatus of claim 12,
wherein the facial feature comprises two eyes of the subject; and
wherein said automatically locating comprises: searching for the two eyes based on an angle of a line connecting the two eyes being perpendicular to the symmetry line.
15. The computerized apparatus of claim 12,
wherein the facial feature is symmetrical; and
wherein said automatically locating comprises: searching for the facial feature using a symmetrical template that is rotated and centered in accordance with the symmetry line.
16. The computerized apparatus of claim 12, wherein said automatically locating comprises:
defining a curve based on values of the image at the symmetry line;
correlating the curve with a template curve, wherein the template curve comprises a section attributable to the facial feature; and
determining the location of the facial feature of the face as the location in the curve that correlates to the section attributable to the facial feature in the template curve.
17. The computerized apparatus of claim 12, wherein said automatically locating comprises performing a one-dimensional correlation of an intensity cross section defined by the symmetry line with a cross section template.
US14/803,142 2015-07-20 2015-07-20 Facial feature location using symmetry line Expired - Fee Related US9990537B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/803,142 US9990537B2 (en) 2015-07-20 2015-07-20 Facial feature location using symmetry line

Publications (2)

Publication Number Publication Date
US20170024607A1 (en) 2017-01-26
US9990537B2 (en) 2018-06-05

Family

ID=57837709

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/803,142 Expired - Fee Related US9990537B2 (en) 2015-07-20 2015-07-20 Facial feature location using symmetry line

Country Status (1)

Country Link
US (1) US9990537B2 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110032941B (en) * 2019-03-15 2022-06-17 深圳英飞拓科技股份有限公司 Face image detection method, face image detection device and terminal equipment
US20240070891A1 (en) * 2022-08-26 2024-02-29 Adobe Inc. Generating symmetrical repeat edits for images

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5859921A (en) * 1995-05-10 1999-01-12 Mitsubishi Denki Kabushiki Kaisha Apparatus for processing an image of a face
EP0889438A2 (en) * 1997-07-04 1999-01-07 Agfa-Gevaert N.V. Method of determining a symmetry line in a radiation image
US20070286490A1 (en) * 2006-06-09 2007-12-13 Samsung Electronics Co., Ltd. Facial feature detection method and device
US8355530B2 (en) 2007-02-14 2013-01-15 Samsung Electronics Co., Ltd. Liveness detection method and apparatus of video image
US20110075933A1 (en) * 2009-07-12 2011-03-31 Samsung Electronics Co., Ltd. Method for determining frontal face pose
US8848986B2 (en) 2011-07-11 2014-09-30 Accenture Global Services Limited Liveness detection
US20130188840A1 (en) 2012-01-20 2013-07-25 Cyberlink Corp. Liveness detection system based on face behavior
US8856541B1 (en) 2013-01-10 2014-10-07 Google Inc. Liveness detection
US20140337948A1 (en) 2013-05-13 2014-11-13 Hoyos Labs Corp. System and method for determining liveness

Non-Patent Citations (11)

* Cited by examiner, † Cited by third party
Title
Bredin et al., "Detecting Replay Attacks in Audiovisual Identity Verification", 2006 IEEE International Conference on Acoustics, Speech and Signal Processing Proceedings, May 2006, pp. I-621-I-624, vol. 5.
Chakraborty et al., "An Overview of Face Liveness Detection", International Journal on Information Theory (IJIT), Apr. 2014, pp. 11-25, vol. 3, No. 2.
Kahm et al., "2D face liveness detection: An overview", 2012 BIOSIG-Proceedings of the International Conference of the Biometrics Special Interest Group (BIOSIG), Sep. 2012.
Kollreider et al., "Real-Time Face Detection and Motion Analysis With Application in 'Liveness' Assessment", IEEE Transactions on Information Forensics and Security, Sep. 2007, pp. 548-558, vol. 2, No. 3.
Nalinakshi et al., "Liveness Detection Technique for Prevention of Spoof Attack in Face Recognition System", International Journal of Emerging Technology and Advanced Engineering, Dec. 2013, vol. 3, Issue 12.
Ohyama et al., "Automatic detection of facial midline and its contributions to facial feature extraction", Electronic Letters on Computer Vision and Image Analysis (ELCVIA), 2007, pp. 55-65, vol. 6, No. 3. *
Saber et al., "Frontal-view face detection and facial feature extraction using color, shape and symmetry based cost functions", Pattern Recognition Letters, 1998, pp. 669-680, vol. 19, No. 8. *
Singh et al., "Face Recognition Using Facial Symmetry", Proceedings of the Second International Conference on Computational Science, Engineering and Information Technology, Oct. 2012, pp. 550-554.
Singh et al., "Face Recognition with Liveness Detection using Eye and Mouth Movement", 2014 International Conference on Signal Propagation and Computer Technology (ICSPCT 2014), Jul. 2014, pp. 592-597.
Smith, "The Scientist and Engineer's Guide to Digital Signal Processing", 2nd ed., 1999, pp. 136-140. *
Szeliski, "Computer Vision", 2011, pp. 337-338. *

Also Published As

Publication number Publication date
US20170024607A1 (en) 2017-01-26

Similar Documents

Publication Publication Date Title
US9996732B2 (en) Liveness detector for face verification
US10612932B2 (en) Method and system for correcting a pre-generated navigation path for an autonomous vehicle
US10922794B2 (en) Image correction method and device
US10482681B2 (en) Recognition-based object segmentation of a 3-dimensional image
US9349189B2 (en) Occlusion resistant image template matching using distance transform
US8189961B2 (en) Techniques in optical character recognition
US20070071289A1 (en) Feature point detection apparatus and method
US20220215557A1 (en) Edge detection method and device, electronic equipment, and computer-readable storage medium
US9760797B2 (en) Protecting specific information
CN109086734B (en) Method and device for positioning pupil image in human eye image
US11657644B2 (en) Automatic ruler detection
US20200104990A1 (en) Region of interest weighted anomaly detection
US9536298B2 (en) Electronic device and method for detecting surface flaw of object
CN109345460B (en) Method and apparatus for rectifying image
Song et al. Estimation of kinect depth confidence through self-training
US8971592B2 (en) Method for determining eye location on a frontal face digital image to validate the frontal face and determine points of reference
US9990537B2 (en) Facial feature location using symmetry line
KR20200065590A (en) Method and apparatus for detecting lane center point for accurate road map
US11941835B2 (en) Eye information estimate apparatus, eye information estimate method, and program
US9536144B2 (en) Automatic image classification
US20150379502A1 (en) Image processing method of enabling financial transaction and an image processing system thereof
Dayananda Kumar et al. Automated parameter-less optical mark recognition
KR102303851B1 (en) Method for searching location through image analysis and system thereof
US10331928B2 (en) Low-computation barcode detector for egocentric product recognition
US20220284690A1 (en) Reading system, reading method, and storage medium

Legal Events

Date Code Title Description
AS Assignment

Owner name: INTERNATIONAL BUSINESS MACHINES CORPORATION, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KONS, ZVI;REEL/FRAME:036138/0494

Effective date: 20150720

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20220605