WO2007108225A1 - Face recognition system - Google Patents
Face recognition system
- Publication number
- WO2007108225A1 (PCT/JP2007/050802)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- face
- user
- feature
- recognition system
- face recognition
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/22—Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
- G06V10/235—Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition based on user input or interaction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
- G06V40/171—Local features and components; Facial parts ; Occluding parts, e.g. glasses; Geometrical relationships
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C9/00—Individual registration on entry or exit
- G07C9/30—Individual registration on entry or exit not involving the use of a pass
- G07C9/32—Individual registration on entry or exit not involving the use of a pass in combination with an identity check
- G07C9/37—Individual registration on entry or exit not involving the use of a pass in combination with an identity check using biometric data, e.g. fingerprints, iris scans or voice recognition
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L9/00—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
- H04L9/32—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials
Definitions
- the present invention relates to a face recognition system.
- in a building, a personal authentication device is used to prevent an unqualified person from entering a specific place in the building, and to prevent an unqualified person from operating a device such as an information processing device.
- personal authentication devices for verifying identity, including in electronic commerce using the Internet, use the biological characteristics of the individual without using a key or PIN; such a personal authentication device using a biometrics authentication system has been provided (for example, see Patent Document 1).
- examples of biometrics authentication systems include systems that authenticate using fingerprints, systems that authenticate using voiceprints, systems that authenticate using facial features, and the like.
- among these is a face authentication system, that is, a face recognition system.
- when face recognition is performed, facial feature quantities are extracted from the face image and compared with pre-registered feature quantities, and the person is recognized according to the degree of similarity.
- in a conventional system, the positions of both eyes and the mouth are detected, and face recognition is performed by a template matching process using a template corrected based on these positions as the facial feature quantity.
- such a system assumes that the face image includes both eyes and the mouth; if either eye detection or mouth detection fails, it is judged that no face exists in the image and a warning is issued.
- Patent Document 1 Japanese Unexamined Patent Publication No. 2000-163600
- the present invention solves the problems of the conventional face recognition system by providing a face recognition system that includes a feature quantity selection unit for selecting the facial part from which feature quantities are extracted.
- the user can select the part of the face from which features are to be extracted, so that recognition emphasizes the features of the selected part even when the user wears a mask or eye patch or when the hairstyle has changed.
- a face recognition system according to the present invention includes an image input unit that acquires a user's face image, a database that stores feature quantities of facial parts of registered persons, a face position detection unit that detects the positions of facial parts from the user's face image, a feature quantity extraction unit that extracts feature quantities of the facial parts, and a recognition unit that recognizes the user by comparing the feature quantities extracted by the feature quantity extraction unit with the feature quantities of the facial parts of the registered persons stored in the database, wherein the user selects the facial part from which feature quantities are extracted.
- the face position detecting unit detects only the position of the face part selected by the user.
- the face recognition system further includes a feature quantity selection unit, operated by the user, that selects the facial part from which feature quantities are extracted.
- another face recognition system according to the present invention includes an image input unit that acquires a user's face image, a database that stores feature quantities of facial parts of registered persons, a face position detection unit that detects the positions of facial parts from the user's face image, a feature quantity extraction unit that extracts feature quantities of the facial parts, and a recognition unit that recognizes the user by comparing the extracted feature quantities with those stored in the database, wherein the facial part from which feature quantities are extracted is selected automatically according to the state of the user.
- a user state detection unit further selects the facial part from which feature quantities are extracted by matching the user's face image with registered templates.
- the face recognition system includes a feature quantity selection unit that selects the facial part from which feature quantities are to be extracted. The user performing face recognition can therefore select, by himself or herself, the part of the face from which feature quantities are extracted, even when wearing a mask or eye patch or when the hairstyle has changed. Face recognition can be performed appropriately by emphasizing the feature quantities of the selected parts and deemphasizing those of the other parts.
- FIG. 1 is a diagram showing a configuration of a face recognition system according to a first embodiment of the present invention.
- FIG. 2 is a diagram showing an example of a feature amount selection screen in the first embodiment of the present invention.
- FIG. 3 is a flowchart showing the operation of face position detection processing in the first embodiment of the present invention.
- FIG. 4 is a flowchart showing an operation of feature quantity extraction processing in the first exemplary embodiment of the present invention.
- FIG. 5 is a diagram showing a configuration of a face recognition system according to a second embodiment of the present invention.
- FIG. 1 is a diagram showing a configuration of a face recognition system according to the first embodiment of the present invention.
- reference numeral 10 denotes a face recognition system that performs face recognition by acquiring a face image of the user 19 taken by the camera 21, and is a kind of computer system that operates according to a program.
- the face recognition system 10 is used for personal authentication of the user 19 based on the facial appearance, which is one of the biological characteristics of the user 19, and may be used in various applications. For example, it can be used to verify the identity of a user entering a specific place in a specific building such as a condominium, factory, or office, or to verify the identity of a ticket holder at check-in at an airport.
- it can also be used to verify identity when performing financial transactions at automatic transaction devices such as ATMs (Automatic Teller Machines) at sales offices of financial institutions such as banks, post offices, and credit unions, and to verify identity in electronic commerce using the Internet.
- the user 19 may be a resident or employee entering a specific place in a specific building such as a condominium, factory, or office, a traveler checking in at an airport, a customer requesting financial transactions at a sales office of a financial institution or operating an automatic transaction apparatus, a person conducting electronic commerce using the Internet, or any other person whose identity is verified using the face recognition system 10.
- the camera 21 includes an imaging element such as a charge coupled device (CCD) and a complementary metal oxide semiconductor (CMOS), an optical lens, a communication interface, and the like.
- any form of camera may be used, such as a digital still camera that shoots still images or a digital video camera that shoots moving images, as long as it can be connected and can output the captured images in the form of electronic data.
- the camera 21 may be, for example, a monitoring camera disposed at the entrance of a specific building or of a specific place in the building, or a built-in camera of an automatic transaction apparatus.
- it may also be a camera connected to a personal computer or the like, or a camera built into a notebook personal computer, PDA (Personal Digital Assistant), electronic notebook, mobile phone, PHS (Personal Handy-phone System) phone, or the like.
- the face recognition system 10 includes a feature quantity selection unit 11 for selecting the facial part from which the facial feature quantities used for face recognition are extracted, an image input unit 12 that acquires a face image for each frame from the camera 21, a face position detection unit 13 that detects the positions of facial parts from the acquired face image, a feature quantity extraction unit 14 that extracts the feature quantities of parts such as the eyes and mouth, a database 15 that stores the facial feature quantities of registered persons registered in advance, a recognition unit 16 that recognizes the person in the image by comparing the feature quantities of each registered person stored in the database 15 with the feature quantities extracted by the feature quantity extraction unit 14, and a recognition result output unit 17 that outputs the recognition result of the recognition unit 16.
- the facial parts from which the features used for face recognition are extracted are typically the eyes, mouth, ears, nose, and so on, but may be any part of the face that can be identified using biological features. In the following description, the parts are assumed to be the eyes and the mouth.
- the feature quantity selection unit 11 is a means by which the user 19 selects, according to the situation, the facial part from which feature quantities are to be extracted. For example, if the user 19 is wearing a mask, use of only the eye feature quantities can be selected; if not, that is, if no mask is worn, use of both eye and mouth feature quantities can be selected.
- the image input unit 12 is communicably connected, via a wired or wireless network, to the camera 21 serving as photographing means for photographing the face image of the user 19, and acquires the face image of the user 19 from the camera 21. The face image is preferably digital image data.
- the network may be a communication cable such as a USB (Universal Serial Bus) cable, or a communication line network such as a telephone line network, the Internet, a LAN (Local Area Network), or a WAN (Wide Area Network).
- the recognition result output unit 17 is communicably connected to the speaker 22 and the monitor 23 via a wired or wireless network, and outputs the recognition result as a message to the speaker 22 or the monitor 23.
- the speaker 22 may have any form, and may be disposed at any place.
- it may be, for example, an interphone disposed at the entrance of a specific building or of a specific place in the building, a built-in speaker of an automatic transaction apparatus, a speaker connected to a personal computer or the like, or a speaker built into a notebook personal computer, PDA, electronic notebook, mobile phone, PHS phone, or the like.
- the monitor 23 includes display means such as a CRT, a liquid crystal display, or an LED (Light Emitting Diode) display, may be in any form, and may be disposed in any place.
- it may be, for example, a monitor arranged at the entrance of a specific building or of a specific place in the building, the display means of an automatic transaction apparatus, a monitor connected to a personal computer or the like, or a monitor provided in a notebook personal computer, PDA, electronic notebook, mobile phone, PHS phone, or the like.
- the network may be a communication cable such as a USB cable, or may be a communication line network such as a telephone line network, the Internet, a LAN, or a WAN.
- the face recognition system 10 may be incorporated in any kind of computer as long as it has arithmetic means such as a CPU or MPU, storage means such as a magnetic disk or semiconductor memory, and an input/output interface. For example, it may be incorporated in a personal computer, in a server, or in a computer network in which a plurality of computers are connected by a network.
- FIG. 2 is a diagram showing an example of a feature amount selection screen in the first embodiment of the present invention.
- FIG. 3 is a flowchart showing an operation of face position detection processing in the first embodiment of the present invention.
- FIG. 4 is a flowchart showing the operation of the feature quantity extraction processing in the first embodiment of the present invention.
- the user 19 uses the feature quantity selection unit 11 to select the face part from which the feature quantity is to be extracted, that is, which part of the face is important as the face feature quantity.
- the face recognition system 10 displays a feature quantity selection screen as shown in FIG. 2 on the monitor 23, and lets the user 19 select whether or not a mask is worn. If it is selected that a mask is worn, it is assumed that use of only the eye feature quantities is selected as the facial feature quantities. If it is selected that no mask is worn, it is assumed that use of both eye and mouth feature quantities is selected.
- similarly, the user 19 can select whether or not an eye patch is worn. When it is selected that the eye patch is worn, it is assumed that use of the mouth feature quantities is selected as the facial feature quantities; when it is selected that the eye patch is not worn, it is assumed that use of both eye and mouth feature quantities is selected. The user 19 can likewise select whether or not the hairstyle has changed greatly.
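The selection logic described above (mask selected → eyes only, eye patch selected → mouth only, otherwise both) can be sketched as follows. This is an illustrative sketch only; the function and part names are assumptions, not taken from the patent.

```python
def select_parts(wearing_mask: bool, wearing_eye_patch: bool) -> set:
    """Map the user's answers on the selection screen to the facial
    parts whose feature quantities will be used for recognition."""
    parts = {"eyes", "mouth"}      # default: use both eyes and mouth
    if wearing_mask:
        parts.discard("mouth")     # mouth covered by a mask -> eyes only
    if wearing_eye_patch:
        parts.discard("eyes")      # eyes covered by an eye patch -> mouth only
    return parts
```

With this sketch, `select_parts(True, False)` yields `{"eyes"}`, mirroring the selection-screen behavior of FIG. 2.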
- the image input unit 12 acquires the face image of the user 19 captured by the camera 21 for each frame.
- the face position detection unit 13 detects the position of each part of the face from the face image acquired by the image input unit 12. In this case, only the position of the part selected by the feature quantity selection unit 11 is detected. For example, when using the feature amount for each of the eyes and the mouth, the respective positions are detected in order. First, it is determined whether or not the use of eye feature is selected, and when it is selected, the position of the eye is detected. If it is not selected, the eye position is not detected. Subsequently, it is determined whether or not the use of the mouth feature is selected, and if it is selected, the position of the mouth is detected. If it is not selected, the mouth position is not detected.
- Patent Document 2 Japanese Unexamined Patent Publication No. 2003-281539
- the feature quantity extraction unit 14 extracts feature quantities used for face recognition.
- the feature quantity is extracted only for the part selected by the feature quantity selection unit 11.
- feature quantities are extracted for selected eyes or mouths.
- as a method of extracting feature quantities used for face recognition, for example, a method of filtering the eyes, nose, and mouth using several types of Gabor filters is known (for example, see Non-Patent Document 1), so a detailed explanation is omitted here.
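As a rough illustration of the Gabor-filter approach cited from Non-Patent Document 1, the following NumPy sketch filters a grayscale patch at several orientations and summarizes the response magnitudes into a feature vector. The kernel parameters (size, sigma, wavelength, number of orientations) are arbitrary assumptions, not values from the patent.

```python
import numpy as np

def gabor_kernel(size=9, sigma=2.0, theta=0.0, lam=4.0):
    """Real-valued Gabor kernel with orientation theta and wavelength lam."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr ** 2 + yr ** 2) / (2 * sigma ** 2)) * np.cos(2 * np.pi * xr / lam)

def gabor_features(patch, orientations=4):
    """Filter a grayscale patch (2-D array) at several orientations and
    return the mean response magnitude per orientation as a feature vector."""
    feats = []
    for k in range(orientations):
        kern = gabor_kernel(theta=k * np.pi / orientations)
        s = kern.shape[0]
        # valid-mode 2-D correlation via sliding windows (no SciPy needed)
        windows = np.lib.stride_tricks.sliding_window_view(patch, (s, s))
        resp = (windows * kern).sum(axis=(-1, -2))
        feats.append(np.abs(resp).mean())
    return np.array(feats)
```

In practice one would apply this to the detected eye and mouth regions and use a richer summary than the mean magnitude; this sketch only shows the filtering step.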
- Non-Patent Document 1 “Face recognition by Gabor transformation using automatic feature point extraction” Yoshida, Kure, Shioyama, IEICE Technical Report PRMU2001—202
- the database 15 stores the facial feature quantities of one or more registered persons; for each registered person, the feature quantities extracted from the face image for all facial parts (for example, eyes, mouth, ears, nose, and so on) are saved.
- the recognition unit 16 uses the feature amount extracted by the feature amount extraction unit 14 from the face image of the user 19 and the facial feature amount of the registered person stored in the database 15. Compare and perform face recognition. In this case, the recognition unit 16 compares the feature amount extracted from the face image of the user 19 with the feature amount of the registered person with respect to the part (for example, the eyes or the mouth) selected by the feature amount selection unit 11. The recognition score is calculated.
- as a pattern matching method for calculating the recognition score, for example, a method using normalized cross-correlation is known (for example, see Non-Patent Document 2), so a detailed description is omitted here.
- the recognition score is calculated for the feature amount of each registered person.
- when the recognition score calculated for a registered person is sufficiently high (for example, equal to or greater than a predetermined threshold), the recognition unit 16 determines that the corresponding registered person is the user 19, that is, that the user 19 is the same person as the registered person and face recognition has succeeded. In other cases, the recognition unit 16 determines that face recognition has failed.
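A minimal sketch of the comparison in the recognition unit 16, using normalized cross-correlation over the selected parts only. The averaging of per-part scores and the threshold value are illustrative assumptions, not details from the patent.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two 1-D feature vectors."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

def recognize(user_feats, database, selected_parts, threshold=0.9):
    """Compare the user's feature vectors with each registered person,
    over the selected parts only; return the best match whose average
    score reaches the threshold, or None on failure."""
    best_name, best_score = None, -1.0
    for name, reg_feats in database.items():
        scores = [ncc(user_feats[p], reg_feats[p]) for p in selected_parts]
        score = sum(scores) / len(scores)   # average over selected parts
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None
```

Because the score is computed only over `selected_parts`, an occluded part (for example, a masked mouth) simply never enters the comparison, which is the effect the patent describes.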
- the recognition result output unit 17 outputs the recognition result of the recognition unit 16, that is, whether the face recognition of the user 19 has succeeded or failed, by voice or image from the speaker 22 and the monitor 23. The recognition result is thereby conveyed to the user 19.
- Step S1: It is determined whether or not use of the eye feature quantities is selected. If it is selected, the process proceeds to step S2; if not, the process proceeds to step S3.
- Step S2: The eye positions are detected.
- Step S3: It is determined whether or not use of the mouth feature quantities is selected. If it is selected, the process proceeds to step S4; if not, the process ends.
- Step S4: The mouth position is detected, and the process ends.
- Step S11: It is determined whether or not use of the eye feature quantities is selected. If it is selected, the process proceeds to step S12; if not, the process proceeds to step S13.
- Step S12: The eye feature quantities are extracted.
- Step S13: It is determined whether or not use of the mouth feature quantities is selected. If it is selected, the process proceeds to step S14; if not, the process ends.
- Step S14: The mouth feature quantities are extracted, and the process ends.
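The two flowcharts (steps S1–S4 and S11–S14) reduce to the same pattern: skip both detection and extraction for any part the user did not select. A schematic sketch, with the detectors and extractors passed in as placeholders (their names and signatures are assumptions for illustration):

```python
def detect_and_extract(image, selected, detectors, extractors):
    """Run position detection and feature extraction only for the
    facial parts in `selected`; unselected parts are skipped."""
    features = {}
    for part in ("eyes", "mouth"):
        if part not in selected:            # S1/S3 and S11/S13: part not selected -> skip
            continue
        pos = detectors[part](image)        # S2/S4: detect the part's position
        features[part] = extractors[part](image, pos)  # S12/S14: extract its features
    return features
```

Any detection or extraction method (for example, the template matching of Patent Document 2 and the Gabor filtering of Non-Patent Document 1) can be plugged in through the two dictionaries.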
- as described above, the user 19 can intentionally set the feature quantities by selecting, according to his or her own state, which part of the face is emphasized as the facial feature quantity. Accordingly, the face recognition of the user 19 can be performed appropriately even when a mask is worn, when an eye patch is worn, when the hairstyle has changed greatly since registration, and so on. It is therefore unnecessary to remove the mask or eye patch at the time of authentication, and the burden on the user 19 can be reduced.
- FIG. 5 is a diagram showing a configuration of a face recognition system according to the second embodiment of the present invention.
- the face recognition system 10 includes an image input unit 12, a face position detection unit 13, a feature amount extraction unit 14, a database 15, a recognition unit 16, and a recognition result output unit 17. And a user state detector 18.
- the feature quantity selection unit 11 is omitted.
- the user state detection unit 18 detects the state of the user 19 based on the result detected by the face position detection unit 13; for example, it determines whether or not the user 19 is wearing a mask.
- the configuration of other points is the same as that of the first embodiment, and a description thereof will be omitted.
- the image input unit 12 acquires a face image of the user 19 captured by the camera 21 for each frame. Subsequently, the face position detection unit 13 detects the position of each part of the face from the face image acquired by the image input unit 12.
- the user state detection unit 18 detects the state of the user 19 using the position information of each facial part detected by the face position detection unit 13. For example, the position of the mouth is estimated from the positions of the eyes, and the vicinity of the estimated mouth position is matched with a mouth template registered in advance as registration data. When the cross-correlation value between the estimated mouth region and the mouth template is equal to or less than a predetermined threshold, the user state detection unit 18 determines that the mouth is not visible due to the influence of a mask, and selects the feature quantities so that the mouth feature quantities are not used for face recognition.
- similarly, the user state detection unit 18 matches, for example, the vicinity of the estimated eye positions with an eye template registered in advance as registration data. If the cross-correlation value between the estimated eye region and the eye template is equal to or less than the predetermined threshold, the user state detection unit 18 determines that the eyes are not visible due to the influence of an eye patch or the hairstyle, and selects the feature quantities so that the eye feature quantities are not used for face recognition.
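The automatic selection performed by the user state detection unit 18 can be sketched as template matching with a correlation threshold: a part whose correlation with its registered template is too low is judged occluded and excluded. The ROI handling, template sizes, and threshold value below are illustrative assumptions.

```python
import numpy as np

def ncc2d(patch, template):
    """Normalized cross-correlation of an image patch with a template."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.linalg.norm(p) * np.linalg.norm(t)
    return float((p * t).sum() / denom) if denom else 0.0

def auto_select_parts(face, eye_roi, mouth_roi, eye_tmpl, mouth_tmpl, thr=0.5):
    """Sketch of unit 18: a part whose template correlation is at or
    below the threshold is judged occluded (mask, eye patch, hair)
    and excluded from the parts used for recognition."""
    parts = set()
    if ncc2d(face[eye_roi], eye_tmpl) > thr:
        parts.add("eyes")
    if ncc2d(face[mouth_roi], mouth_tmpl) > thr:
        parts.add("mouth")
    return parts
```

The returned set plays the same role as the user's manual selection in the first embodiment, but is derived from the image itself.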
- the feature quantity extraction unit 14 extracts feature quantities used for face recognition. At this time, the feature quantity is extracted only for the part selected by the user state detection unit 18.
- the recognizing unit 16 uses the feature amount extracted from the face image of the user 19 by the feature amount extracting unit 14 and the facial feature amount of the registered person stored in the database 15. Compare and perform face recognition.
- the recognition result output unit 17 outputs the recognition result of the recognition unit 16, that is, whether the face recognition of the user 19 has succeeded or failed, by voice or image from the speaker 22 and the monitor 23. The recognition result is thereby conveyed to the user 19.
- the face recognition of the user 19 can be appropriately performed even in a state where a mask is put on, a state where an eyepatch is worn, a state where the hairstyle changes greatly from the time of registration, and the like. Therefore, it is not necessary to remove the mask or eye patch at the time of authentication, and the burden on the user 19 can be reduced.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Security & Cryptography (AREA)
- Health & Medical Sciences (AREA)
- Oral & Maxillofacial Surgery (AREA)
- General Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Engineering & Computer Science (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Collating Specific Patterns (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
Abstract
Description
Claims
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP07707090A EP2026233A4 (en) | 2006-03-23 | 2007-01-19 | FACE RECOGNITION SYSTEM |
US12/225,423 US8340366B2 (en) | 2006-03-23 | 2007-01-19 | Face recognition system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2006079844A JP4862447B2 (ja) | 2006-03-23 | 2006-03-23 | 顔認識システム |
JP2006-079844 | 2006-03-23 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2007108225A1 true WO2007108225A1 (ja) | 2007-09-27 |
Family
ID=38522261
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2007/050802 WO2007108225A1 (ja) | 2006-03-23 | 2007-01-19 | 顔認識システム |
Country Status (7)
Country | Link |
---|---|
US (1) | US8340366B2 (ja) |
EP (1) | EP2026233A4 (ja) |
JP (1) | JP4862447B2 (ja) |
KR (1) | KR20090008256A (ja) |
CN (1) | CN101405744A (ja) |
TW (1) | TW200741562A (ja) |
WO (1) | WO2007108225A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111882692A (zh) * | 2020-07-21 | 2020-11-03 | 苏州盖雅信息技术有限公司 | 一种智能移动考勤方法及装置 |
WO2022130616A1 (ja) | 2020-12-18 | 2022-06-23 | 富士通株式会社 | 認証方法、情報処理装置、及び認証プログラム |
Families Citing this family (61)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100795160B1 (ko) * | 2007-03-22 | 2008-01-16 | 주식회사 아트닉스 | 얼굴영역검출장치 및 검출방법 |
JP2010080993A (ja) * | 2008-09-23 | 2010-04-08 | Brother Ind Ltd | インターホンシステム |
US8788977B2 (en) | 2008-11-20 | 2014-07-22 | Amazon Technologies, Inc. | Movement recognition as input mechanism |
US8401294B1 (en) * | 2008-12-30 | 2013-03-19 | Lucasfilm Entertainment Company Ltd. | Pattern matching using convolution of mask image and search image |
JP5480532B2 (ja) * | 2009-04-30 | 2014-04-23 | グローリー株式会社 | 画像処理装置、画像処理方法、及び同方法をコンピュータに実行させるプログラム |
US20100278395A1 (en) * | 2009-05-04 | 2010-11-04 | Jonathan Yen | Automatic backlit face detection |
JP4844670B2 (ja) * | 2009-11-13 | 2011-12-28 | 日本ビクター株式会社 | 映像処理装置および映像処理方法 |
US8725579B2 (en) * | 2010-04-30 | 2014-05-13 | Qualcomm Incorporated | Device ID and financial information |
US8522021B2 (en) | 2010-04-30 | 2013-08-27 | Hewlett-Packard Development Company, L.P. | Communication channel of a device |
US8878773B1 (en) * | 2010-05-24 | 2014-11-04 | Amazon Technologies, Inc. | Determining relative motion as input |
US8705813B2 (en) * | 2010-06-21 | 2014-04-22 | Canon Kabushiki Kaisha | Identification device, identification method, and storage medium |
TWI413004B (zh) * | 2010-07-29 | 2013-10-21 | Univ Nat Taiwan Science Tech | 人臉特徵辨識方法及系統 |
CN102004899B (zh) * | 2010-11-03 | 2012-09-26 | 无锡中星微电子有限公司 | 一种人脸认证系统及方法 |
US8577092B2 (en) * | 2010-11-11 | 2013-11-05 | Lg Electronics Inc. | Multimedia device, multiple image sensors having different types and method for controlling the same |
JP2012169777A (ja) * | 2011-02-10 | 2012-09-06 | Sony Corp | 情報処理装置、情報処理方法、およびプログラム |
DE102011015730A1 (de) * | 2011-03-31 | 2012-10-04 | Land Rheinland-Pfalz, vertreten durch das Landeskriminalamt Rheinland-Pfalz | Phantombilddatenbank (3D) |
US9082235B2 (en) | 2011-07-12 | 2015-07-14 | Microsoft Technology Licensing, Llc | Using facial data for device authentication or subject identification |
US11195057B2 (en) | 2014-03-18 | 2021-12-07 | Z Advanced Computing, Inc. | System and method for extremely efficient image and pattern recognition and artificial intelligence platform |
US11074495B2 (en) | 2013-02-28 | 2021-07-27 | Z Advanced Computing, Inc. (Zac) | System and method for extremely efficient image and pattern recognition and artificial intelligence platform |
US8873813B2 (en) | 2012-09-17 | 2014-10-28 | Z Advanced Computing, Inc. | Application of Z-webs and Z-factors to analytics, search engine, learning, recognition, natural language, and other utilities |
US9916538B2 (en) | 2012-09-15 | 2018-03-13 | Z Advanced Computing, Inc. | Method and system for feature detection |
US11914674B2 (en) | 2011-09-24 | 2024-02-27 | Z Advanced Computing, Inc. | System and method for extremely efficient image and pattern recognition and artificial intelligence platform |
RU2543950C2 (ru) * | 2011-12-28 | 2015-03-10 | Кэнон Кабусики Кайся | Устройство формирования изображения и способ управления указанным устройством |
AU2013205535B2 (en) * | 2012-05-02 | 2018-03-15 | Samsung Electronics Co., Ltd. | Apparatus and method of controlling mobile terminal based on analysis of user's face |
JP5964190B2 (ja) * | 2012-09-27 | 2016-08-03 | 京セラ株式会社 | 端末装置 |
TWI582708B (zh) | 2012-11-22 | 2017-05-11 | 緯創資通股份有限公司 | 臉部表情控制系統、表情控制之方法及其電腦系統 |
US9094576B1 (en) | 2013-03-12 | 2015-07-28 | Amazon Technologies, Inc. | Rendered audiovisual communication |
US9262671B2 (en) | 2013-03-15 | 2016-02-16 | Nito Inc. | Systems, methods, and software for detecting an object in an image |
US9552421B2 (en) * | 2013-03-15 | 2017-01-24 | Microsoft Technology Licensing, Llc | Simplified collaborative searching through pattern recognition |
KR102057947B1 (ko) * | 2013-04-01 | 2019-12-20 | 삼성전자주식회사 | 사용자 인증을 수행하는 디스플레이 장치 및 그 사용자 인증 방법 |
US20160070985A1 (en) * | 2013-05-02 | 2016-03-10 | Konica Minolta Inc. | Image processing apparatus, image processing method, and storage medium storing image processing program thereon |
US9405978B2 (en) | 2013-06-10 | 2016-08-02 | Globalfoundries Inc. | Prioritization of facial recognition matches based on likely route |
JP6123893B2 (ja) * | 2013-06-25 | 2017-05-10 | Fujitsu Limited | Information processing device, terminal device, information processing program, and information processing method |
US9892413B2 (en) | 2013-09-05 | 2018-02-13 | International Business Machines Corporation | Multi factor authentication rule-based intelligent bank cards |
US9829480B2 (en) | 2013-09-26 | 2017-11-28 | Alcohol Monitoring Systems, Inc. | Remote breath alcohol monitor |
WO2015164584A1 (en) | 2014-04-23 | 2015-10-29 | Google Inc. | User interface control using gaze tracking |
EP3035238A1 (en) * | 2014-12-19 | 2016-06-22 | Tata Consultancy Services Limited | Video surveillance system and method for fraud detection |
WO2016186649A1 (en) | 2015-05-19 | 2016-11-24 | Hewlett Packard Enterprise Development Lp | Database comparison operation to identify an object |
CN109074484B (zh) | 2016-03-02 | 2022-03-01 | Tinoq Inc. | System and method for efficient face recognition |
CN105631441A (zh) * | 2016-03-03 | 2016-06-01 | Jinan University | Face recognition method |
US10728694B2 (en) | 2016-03-08 | 2020-07-28 | Tinoq Inc. | Systems and methods for a compound sensor system |
CN109479181B (zh) | 2016-03-30 | 2020-12-01 | Tinoq Inc. | System and method for user detection and recognition |
KR101760211B1 (ko) * | 2016-04-04 | 2017-07-21 | NHN Entertainment Corporation | Authentication method and system with enhanced security through eye recognition |
US10282595B2 (en) | 2016-06-24 | 2019-05-07 | International Business Machines Corporation | Facial recognition encode analysis |
KR101810190B1 (ko) * | 2016-07-14 | 2017-12-18 | Kim Yong Sang | User authentication method using face recognition and apparatus therefor |
US20190031145A1 (en) | 2017-07-28 | 2019-01-31 | Alclear, Llc | Biometric identification system connected vehicle |
CN111373408B (zh) | 2017-11-27 | 2023-05-02 | Mitsubishi Electric Corporation | Facial expression recognition device |
JP6760318B2 (ja) * | 2018-03-14 | 2020-09-23 | Omron Corporation | Face image identification system, classifier generation device, identification device, image identification system, and identification system |
TWI661398B (zh) * | 2018-03-31 | 2019-06-01 | Hua Nan Commercial Bank, Ltd. | Transaction system performing verification based on facial recognition, and method thereof |
TWI687872B (zh) * | 2018-03-31 | 2020-03-11 | Hua Nan Commercial Bank, Ltd. | Transaction system performing verification based on facial recognition, and method thereof |
US11087121B2 (en) | 2018-04-05 | 2021-08-10 | West Virginia University | High accuracy and volume facial recognition on mobile platforms |
CN108650408B (zh) * | 2018-04-13 | 2021-01-08 | Vivo Mobile Communication Co., Ltd. | Screen unlocking method and mobile terminal |
WO2020041352A1 (en) | 2018-08-21 | 2020-02-27 | Tinoq Inc. | Systems and methods for member facial recognition based on context information |
TWI676136B (zh) * | 2018-08-31 | 2019-11-01 | Yun Yun Technology Co., Ltd. | Image detection method and image detection device using dual analysis |
JP7302329B2 (ja) * | 2019-06-26 | 2023-07-04 | NEC Corporation | Image processing device, person determination method, and program |
US10867460B1 (en) | 2019-10-02 | 2020-12-15 | Motorola Solutions, Inc. | System and method to provide public safety access to an enterprise |
KR102299080B1 (ko) | 2019-12-24 | 2021-09-07 | Hajon Solution Co., Ltd. | Shop-in-shop integrated operation system based on customer image recognition |
CN111462381A (zh) * | 2020-04-01 | 2020-07-28 | Shenzhen Shenyun Zhihui Technology Co., Ltd. | Access control method based on face temperature recognition, electronic device, and storage medium |
US11328532B2 (en) * | 2020-04-20 | 2022-05-10 | Scott C Harris | Mask aware biometric identification system |
CN111986372A (zh) * | 2020-08-21 | 2020-11-24 | Zhongnan Information Technology (Shenzhen) Co., Ltd. | High-precision facial recognition access control device |
CN115529837A (zh) * | 2021-04-09 | 2022-12-27 | Hongfujin Precision Industry (Wuhan) Co., Ltd. | Masked face recognition method, apparatus, and computer storage medium |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003050783A (ja) * | 2001-05-30 | 2003-02-21 | Fujitsu Ltd | Composite authentication system |
JP2003281539A (ja) | 2002-03-25 | 2003-10-03 | Oki Electric Ind Co Ltd | Face part search device and face part search method |
JP2004118627A (ja) * | 2002-09-27 | 2004-04-15 | Toshiba Corp | Person authentication device and person authentication method |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3279913B2 (ja) * | 1996-03-18 | 2002-04-30 | Toshiba Corporation | Person authentication device, feature point extraction device, and feature point extraction method |
GB2343945B (en) | 1998-11-18 | 2001-02-28 | Sintec Company Ltd | Method and apparatus for photographing/recognizing a face |
JP3970573B2 (ja) * | 2001-10-19 | 2007-09-05 | Alpine Electronics, Inc. | Face image recognition device and method |
KR100456619B1 (ko) * | 2001-12-05 | 2004-11-10 | Electronics and Telecommunications Research Institute | Face registration/authentication system and method using SVM |
KR100438841B1 (ko) * | 2002-04-23 | 2004-07-05 | Samsung Electronics Co., Ltd. | User verification and automatic database update method, and face recognition system using the same |
JP2003331264A (ja) * | 2002-05-13 | 2003-11-21 | NEC Soft, Ltd. | Face image matching system and face image matching method |
US7440593B1 (en) * | 2003-06-26 | 2008-10-21 | Fotonation Vision Limited | Method of improving orientation and color balance of digital images using face detection information |
US20060146062A1 (en) * | 2004-12-30 | 2006-07-06 | Samsung Electronics Co., Ltd. | Method and apparatus for constructing classifiers based on face texture information and method and apparatus for recognizing face using statistical features of face texture information |
US20070052726A1 (en) * | 2005-09-08 | 2007-03-08 | David Wright | Method and system for likeness reconstruction |
JP4696857B2 (ja) * | 2005-11-02 | 2011-06-08 | Omron Corporation | Face matching device |
2006
- 2006-03-23 JP JP2006079844A patent/JP4862447B2/ja not_active Expired - Fee Related
2007
- 2007-01-17 TW TW096101788A patent/TW200741562A/zh unknown
- 2007-01-19 US US12/225,423 patent/US8340366B2/en active Active
- 2007-01-19 CN CNA2007800098648A patent/CN101405744A/zh active Pending
- 2007-01-19 EP EP07707090A patent/EP2026233A4/en not_active Withdrawn
- 2007-01-19 KR KR1020087025723A patent/KR20090008256A/ko not_active Application Discontinuation
- 2007-01-19 WO PCT/JP2007/050802 patent/WO2007108225A1/ja active Application Filing
Non-Patent Citations (1)
Title |
---|
See also references of EP2026233A4 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111882692A (zh) * | 2020-07-21 | 2020-11-03 | Suzhou Gaiya Information Technology Co., Ltd. | Intelligent mobile attendance method and device |
CN111882692B (zh) * | 2020-07-21 | 2023-04-18 | Suzhou Gaiya Information Technology Co., Ltd. | Intelligent mobile attendance method and device |
WO2022130616A1 (ja) | 2020-12-18 | Fujitsu Limited | Authentication method, information processing device, and authentication program |
Also Published As
Publication number | Publication date |
---|---|
JP4862447B2 (ja) | 2012-01-25 |
JP2007257221A (ja) | 2007-10-04 |
US20090110248A1 (en) | 2009-04-30 |
US8340366B2 (en) | 2012-12-25 |
CN101405744A (zh) | 2009-04-08 |
KR20090008256A (ko) | 2009-01-21 |
EP2026233A4 (en) | 2009-12-02 |
EP2026233A1 (en) | 2009-02-18 |
TW200741562A (en) | 2007-11-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4862447B2 (ja) | Face recognition system | |
US8558663B2 (en) | Integration of facial recognition into cross channel authentication | |
JP4899551B2 (ja) | Authentication device, authentication method, authentication program, and computer-readable recording medium | |
US20140380446A1 (en) | Method and apparatus for protecting browser private information | |
US8422746B2 (en) | Face authentication system and authentication method thereof | |
US20140241593A1 (en) | Authentication apparatus, authentication method, and non-transitory computer readable medium | |
CN104933344A (zh) | Mobile terminal user identity authentication device and method based on multiple biometric modalities | |
CN103310339A (zh) | Identity recognition device and method, and payment system and method | |
KR20160147515A (ko) | User authentication method and electronic device supporting the same | |
EP3739482B1 (en) | Facial recognition device | |
US10885171B2 (en) | Authentication verification using soft biometric traits | |
CN204791017U (zh) | Mobile terminal user identity authentication device based on multiple biometric modalities | |
KR20010074059A (ko) | Face-based personal identity verification method and apparatus for mobile terminals | |
JP4899552B2 (ja) | Authentication device, authentication method, authentication program, and computer-readable recording medium recording the same | |
JP5730044B2 (ja) | Face image authentication device | |
JP2001256496A (ja) | Face image recognition device and face image recognition method | |
US20220277311A1 (en) | A transaction processing system and a transaction method based on facial recognition | |
CN110717428A (zh) | Identity recognition method, apparatus, system, medium, and device fusing multiple features | |
JP6767685B2 (ja) | Face authentication device | |
JP7260145B2 (ja) | Authentication device, authentication terminal, authentication method, program, and recording medium | |
JPH10137221A (ja) | Personal identification device | |
KR101766829B1 (ko) | Cash withdrawal device | |
WO2014092665A1 (en) | Integrated user authentication system in self-service machines | |
KR102583982B1 (ko) | Contactless access control method and access control system performing the same | |
WO2023073838A1 (ja) | Authentication device, authentication system, authentication method, and non-transitory computer-readable medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 07707090 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 200780009864.8 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 12225423 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2007707090 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020087025723 Country of ref document: KR |