WO2005022462A1 - Electronic device with user authentication function - Google Patents
Electronic device with user authentication function
- Publication number
- WO2005022462A1 PCT/JP2004/012884 JP2004012884W
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- data
- electronic device
- image
- authentication
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/66—Substation equipment, e.g. for use by subscribers with means for preventing unauthorised or fraudulent calling
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/72427—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for supporting games or graphical animations
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/52—Details of telephonic subscriber devices including functional features of a camera
Definitions
- the present invention relates to an electronic device with a user authentication function, which performs an authentication process using a face image of a user.
- as the personal authentication method, a method using a password and a method using biometric information such as a fingerprint, a voiceprint, and an iris, which show the characteristics of the human body, are known.
- Patent Document 1 describes an application of a personal authentication technique using a fingerprint to a mobile phone.
- Patent Document 2 discloses a technique using a face image as biometric information for authentication.
- Patent Document 3 discloses a mobile phone in which a telephone function is stopped on condition that a pet breeding game is in a predetermined state. This mobile phone requires a certain amount of time to elapse before the pet breeding game reaches the predetermined state, and security during that time is a problem.
- as electronic devices for communicating with virtual characters such as pets, those described in Patent Documents 4 and 5, in addition to Patent Document 3, have been proposed.
- Patent Document 1 Japanese Patent Application Laid-Open No. 2002-44727
- Patent Document 2 Japanese Patent Application Laid-Open No. 2002-261914
- Patent Document 3 Japanese Patent Application Laid-Open No. 2003-169132
- Patent Document 4 WO 00/53281 pamphlet
- Patent Document 5 Japanese Patent Application Laid-Open No. 2001-51970
- the present invention has been made in view of the above circumstances, and its purpose is to provide an electronic device that authenticates a user using face image information and that allows registration of an authentication image and authentication to be performed while enjoying the operation.
- the electronic device of the present invention is an electronic device with a user authentication function that authenticates a user and determines whether or not the user is an appropriate user. It comprises: an imaging means capable of capturing a face image of the user; verification image storage means for storing verification data, which is face image data of a proper user; and authentication processing means for performing user authentication processing, including acquisition processing of the verification data and verification processing of input image data captured by the imaging means against the verification data.
- it further comprises: acquisition start determining means for performing a necessity determination process, which determines whether new verification data needs to be acquired, and a possibility determination process, which determines whether a face image serving as the source of the required verification data can be captured; character simulating means for simulating a character; and display means capable of displaying an image of the character simulated by the character simulating means.
- when the results of the necessity determination process and the possibility determination process indicate that new verification data needs to be acquired and that a face image serving as the source of the required verification data can be captured, the authentication processing means displays a character image on the display means to ask the user whether to start the acquisition processing.
- ADVANTAGE OF THE INVENTION: According to this invention, acquisition processing of the collation data of a user is performed while enjoying communication with a character.
- the electronic device of the present invention includes a process in which the authentication processing unit displays the character image on the display unit and responds to the user during the storage process and the collation process. According to the present invention, a user authentication process is performed while enjoying communication with a virtual character. In addition, the process of acquiring the collation data can be performed while enjoying communication with the virtual character. Therefore, the trouble in authenticating the user can be reduced.
- the matching data may include partial area image data for each face part or feature data for each face part, and the matching processing of the authentication processing unit may include calculation of the degree of similarity between the image data for each face part extracted from the input image data and the matching data. According to the present invention, since the collation processing is performed simply by photographing the face with the imaging means, collation can be performed without the user performing a difficult operation.
- the comparison data is created from a plurality of pieces of photographed image data having different photographing dates and times or photographing locations. According to the present invention, since the collation data is systematically stored according to the photographing conditions, it is possible to improve the authentication accuracy of the user by the face image.
- the electronic device of the present invention includes an electronic device in which the acquisition processing of the authentication processing unit includes a photographing support processing of a proper user's face image at a predetermined photographing date or time or a photographing place. According to the present invention, it is possible to efficiently store collation data for each imaging condition.
- the electronic device according to the present invention is characterized in that the necessity determination process or the possibility determination process is a determination using date and time or information of a shooting place. According to the present invention, an appropriate face image of a user can be photographed at a plurality of photographing dates and times and photographing locations, so that the accuracy of user authentication can be improved. According to the present invention, it is possible to provide an electronic device that authenticates a user using face image information and that can perform registration and authentication of an authentication image while enjoying the operation.
- FIG. 1 is a diagram showing a schematic configuration of a camera-equipped mobile phone according to an embodiment of the present invention.
- FIG. 2 is a diagram showing an example of items of collation data of the camera-equipped mobile phone according to the embodiment of the present invention.
- FIG. 3 is a diagram showing an example of the time of shooting of collation data of the camera-equipped mobile phone according to the embodiment of the present invention and an example of a setting section of a shooting location.
- FIG. 4 is a diagram showing a schematic operation flow when the operation unit of the camera-equipped mobile phone according to the embodiment of the present invention is operated.
- FIG. 5 is a diagram showing a schematic flow of face image data collation processing of the camera-equipped mobile phone according to the embodiment of the present invention.
- FIG. 6 is a diagram showing a schematic flow of a process of storing collation data of the camera-equipped mobile phone according to the embodiment of the present invention.
- FIG. 7 is a diagram showing a display example of the display unit when a response is made in accordance with an operation of the operation unit in the camera-equipped mobile phone according to the embodiment of the present invention.
- reference numeral 1 denotes a control unit
- 2 denotes a ROM
- 3 denotes a RAM
- 4 denotes a nonvolatile memory
- 5 denotes an image pickup unit
- 6 denotes a display unit
- 7 denotes an operation unit
- 10 denotes an internal bus
- 20 denotes a communication unit
- 21 denotes an antenna
- 30 is an audio processing unit
- 31 is a microphone
- 32 is a speaker
- 100 is a mobile phone with a camera.
- the electronic device to which the present invention is applied is a mobile phone with a camera.
- the target of application is not limited to a mobile phone with a camera, but may be any electronic device having an imaging unit.
- FIG. 1 is a diagram showing a schematic configuration of a camera-equipped mobile phone which is an electronic apparatus according to an embodiment of the present invention.
- the camera-equipped mobile phone 100 in FIG. 1 has a control unit 1, a ROM 2, a RAM 3, a nonvolatile memory 4, an imaging unit 5, a display unit 6, an operation unit 7, an internal bus 10, a communication unit 20, an antenna 21, an audio processing unit 30, a microphone 31, and a speaker 32.
- the control unit 1 controls the entire operation of the mobile phone 100, and mainly includes a processor (not shown) that executes a predetermined program.
- the control unit 1 controls transmission and reception of data and instructions between the elements of the mobile phone 100 via the internal bus 10.
- the control unit 1 has an authentication processing function using the user's face image data captured by the imaging unit 5 and the appropriate user's face image data stored in the nonvolatile memory 4. Furthermore, it has an acquisition start judgment function for judging whether photographing of new face image data of the user is necessary and possible. It also has a function to simulate a virtual character. Since the function of simulating a virtual character can be realized in the same manner as the realization of the training game function, detailed description is omitted.
- when outputting an image simulating the behavior of the character, the image is output via the display unit 6, and when outputting a sound simulating the behavior of the character, the sound is output via the audio processing unit 30 and the speaker 32.
- the ROM 2 stores a program executed by a processor constituting the control unit 1 and various data used by the mobile phone 100.
- the RAM 3 is a memory for temporarily storing data, and is also used as a work memory when the control unit 1 executes various processes.
- the non-volatile memory 4 is composed of, for example, an EEPROM, and stores collation data, which is facial image data of a proper user, which will be described later, and image data for outputting an image simulating the behavior of a character. It is also used for various data files when a user uses the camera-equipped mobile phone 100.
- the imaging unit 5 includes an optical system such as a lens, an imaging device, an image processing unit (none is shown), and the like, and outputs digital image data based on a captured image signal.
- the imaging unit 5 is the same as that provided in a conventional camera-equipped mobile phone, and the operation in the normal imaging mode is also the same. That is, a through image in the shooting mode is displayed on the display unit 6; when the shutter button of the operation unit 7 is operated, digital image data based on the captured image signal at that time is temporarily stored in the RAM 3, and when storage is instructed from the operation unit 7, the data is stored in the nonvolatile memory 4. Since camera-equipped mobile phones that perform such photographing operations are well known, a detailed description thereof will be omitted.
- the imaging unit 5 is used for acquiring collation data stored in the nonvolatile memory 4. It is also used for obtaining face image data at the time of authentication of the user of the mobile phone 100. When used for such authentication processing, the user often operates while looking at the display screen of the display unit 6, so it is preferable that the lens of the imaging unit 5 be directed toward the display surface side of the display unit 6.
- this can be realized by providing a plurality of imaging units 5, one of which is used for photographing on the display surface side of the display unit 6, or by making the imaging direction of the imaging unit 5 variable and using the display surface side of the display unit 6 as the imaging direction at the time of authentication processing.
- the display section 6 displays various information of the mobile phone 100, and includes a liquid crystal display panel for displaying and a display control circuit (not shown) for driving the liquid crystal display panel.
- the operation unit 7 is for the user to input commands and data for operating the mobile phone 100, and includes a numeric keypad for inputting a telephone number and various data, various function keys, and the like. These keys have different functions depending on the operation mode, and also function as a shutter button and a zoom button in a normal photographing mode, an image photographing operation in an authentication process described later, and an input key for various data. It is also used for data input for communicating with a virtual character simulated by the control unit 1.
- the communication unit 20 connected to the antenna 21 performs wireless communication with the outside.
- the transmission data is modulated onto a carrier and transmitted from the antenna 21, and the received data received by the antenna 21 is demodulated. If the demodulated data is audio data, it is sent to the audio processing unit 30; if it is other data, it is sent to the control unit 1, the RAM 3, the nonvolatile memory 4, or the like.
- the transmission data is input directly from the audio processing unit 30 or from another element via the internal bus 10.
- the audio processing unit 30 converts an audio signal input from the microphone 31 into digital data and outputs the digital data as transmission data to the communication unit 20, and converts received data (audio data) output from the communication unit 20 into an analog audio signal and outputs it to the speaker 32.
- digital data based on the audio signal from the microphone 31 can also be sent to the control unit 1 and the like via the internal bus 10, and digital data input via the internal bus 10 can be converted into an audio signal and output to the speaker 32.
- the mobile phone 100 has a camera function using the imaging unit 5 and a data communication function using the communication unit 20 as well as making a voice call. These functions can be selectively operated by operating predetermined keys of the operation unit 7. Note that the voice communication function, camera function, and data communication function are the same as those of the conventional one, and therefore, description thereof will be omitted.
- the mobile phone 100 has a user authentication function, and it is possible to restrict the use of the mobile phone except for an appropriate user. The use restriction may be applied to all functions of the mobile phone 100 or to only a part of the functions. Whether a user is appropriate is determined by comparing the user's face image and by entering a password.
- the verification data is obtained by a proper user of the mobile phone 100 capturing at least one face image of the user with the imaging unit 5 and storing the image data or the characteristic data in advance.
- the collation data includes, for example, individual face information and average face information.
- the individual face information is data obtained by analyzing data of each captured face image for each item as shown in FIG. For example, for the eyes, nose, mouth, ears, eyebrows, etc., the coordinates of their vertices, for the contour, multiple point coordinates on the contour, and for the hairstyle, data that distinguishes the hair area and other areas using binary values. Can be used as collation data.
- as element arrangement data, relative position data of representative elements such as the eyes, nose, and mouth are stored as collation data.
- a reduced image may be used as the matching data instead of the vertex coordinates.
- an image around the eye may be cut out and reduced so as to include the eye, and the reduced image may be used as the matching data.
- the collation data is individually compared with the face image data captured and acquired at the time of user authentication, and the similarity is calculated.
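As a rough illustration of the per-part collation data and item-by-item comparison described above, a toy sketch follows (the item names, coordinate layout, and similarity measure are illustrative assumptions, not taken from the publication):

```python
# Hypothetical sketch: per-part collation data as coordinate lists, compared
# item by item against data extracted from a newly captured face image.

def item_similarity(stored, captured):
    """Toy similarity: inverse of the mean absolute coordinate difference."""
    diffs = [abs(a - b) for (a, b) in zip(stored, captured)]
    return 1.0 / (1.0 + sum(diffs) / len(diffs))

# Stored collation data for one proper user (illustrative vertex coordinates)
collation_data = {
    "eyes":  [10.0, 12.0, 30.0, 12.5],
    "nose":  [20.0, 25.0],
    "mouth": [18.0, 40.0, 26.0, 40.5],
}

# Corresponding items extracted from a captured image at authentication time
captured = {
    "eyes":  [10.5, 12.2, 29.5, 12.4],
    "nose":  [20.3, 24.8],
    "mouth": [18.2, 40.1, 25.8, 40.6],
}

similarities = {item: item_similarity(collation_data[item], captured[item])
                for item in collation_data}
print(similarities)
```

Each value lies in (0, 1], with 1.0 meaning an exact coordinate match for that item.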
- Fig. 3 shows an example of the setting sections of the shooting date and time and the shooting location. Storing one or more pieces of face image data or feature data for each combination of these categories allows correct discrimination of the proper user to be expected regardless of the shooting conditions, the condition of the user's face, and so on.
- the authentication based on the face image data is made available after a plurality of pieces of verification data of a proper user have been stored across a plurality of sections, but it may be made available after only one piece of data has been stored.
- if face image authentication is enabled only after a predetermined set of face images has been captured covering the combinations of the shooting date and time and shooting location categories, the face image shooting for authentication itself becomes a game-like experience that the user can enjoy.
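Tracking which shooting-condition categories still lack collation data could be sketched as follows (the three category axes are assumptions loosely following the idea of FIG. 3; the publication's actual sections may differ):

```python
from itertools import product

# Assumed category axes (illustrative, not from the publication)
TIMES = ("morning", "daytime", "night")
DAYS = ("weekday", "holiday")
PLACES = ("indoor", "outdoor")

def missing_categories(stored):
    """Return category combinations for which no collation data is stored yet."""
    all_combos = set(product(TIMES, DAYS, PLACES))
    return sorted(all_combos - set(stored))

# Two combinations already covered; the rest still need a photo
stored = {("daytime", "weekday", "indoor"), ("night", "holiday", "indoor")}
gaps = missing_categories(stored)
print(len(gaps))  # 10 of the 12 combinations remain
```

Face-image authentication would be enabled once `gaps` (or some required subset of it) is empty.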
- the average face information is obtained by analyzing the face image data of a plurality of appropriate users for each item shown in FIG. 2 and averaging the data for each item over the plurality of face images; it therefore shows the characteristics of the average face image of the appropriate user.
- FIG. 4 is a diagram showing a schematic operation flow when the user operates the operation unit 7.
- the operation in FIG. 4 is controlled by the program of the control unit 1.
- the mobile phone 100 has a function of restricting the use of functions other than the call function by non-authorized users, and the explanation below assumes that authentication of an appropriate user is performed by input of the user's face image data or by input of a password.
- in step S401, it is determined whether or not the operation is an operation related to a call process, that is, whether the operation is for receiving or transmitting a call. If it is determined that it is not a call operation, it is determined whether or not the mobile phone 100 is in the function restriction mode, that is, whether the mode for restricting use by users other than an appropriate user is set (step S402).
- in step S403, it is determined whether user authentication using a face image is possible. In this step, for example, it is determined whether or not collation data has been stored for a predetermined number of combinations of the shooting date and time and shooting location setting sections shown in FIG. 3. Note that such a determination is not necessarily required; if at least one piece of collation data is stored in the nonvolatile memory 4, user authentication may be enabled.
- step S404 a face image of the user is captured, and the captured face image data is input.
- an image simulating the action of the virtual character may be displayed on the display unit 6 to present information indicating that image capturing for authentication is required, information indicating the shooting method, and the like.
- audio information may be output from the speaker 32.
- Figure 7 shows a display example in that case.
- the character simulated by the control unit 1 is a cat, and until the operation unit 7 is operated, an image of the cat sleeping or playing freely is displayed as shown in FIG. 7 (a). If authentication using a face image is required, the cat faces the front and asks a question or the like as shown in FIG. 7 (b).
- the imaging unit 5 shifts to the face image shooting state.
- the composition to be captured may be displayed as shown in FIG. 7 (c).
- when the user operates the shutter button to take a picture of the user's face and the image data is stored in the RAM 3, in step S405, collation processing is performed between the image data stored in the RAM 3 and the collation data stored in the nonvolatile memory 4 to determine whether or not the captured and input face image data is face image data of the appropriate user. The collation processing and the determination as to whether the user is a proper user will be described later.
- if the input face image data is determined to be a proper face image of the user (step S406), a response such as "Sure, Mr. X. You can use it." is made, and the collation data stored in the nonvolatile memory 4 is updated (step S407). This updating process can be skipped, or may be performed only when the face image corresponds to a shooting date and time or a shooting location for which no data is stored.
- the character simulated by the control unit 1 may simulate a real creature such as a cat, an imaginary creature, or a non-living thing such as a robot.
- the technique of displaying an image simulating such a character is well-known in various game devices, and a description thereof will be omitted.
- the message from the character may be output not only by characters but also by voice.
- in step S408, various processes can be performed according to operations of the operation unit 7, so that the user can perform an arbitrary process. Setting and release of the function restriction mode can also be operated here. Further, in this state, the mobile phone 100 performs the process of storing collation data as necessary (step S409). The storage processing in step S409 will be described later.
- in step S410, it is determined whether or not the operation of terminating the mobile phone 100 has been performed. If so, the operation ends; if not, the process returns to step S408.
- if it is determined in step S406 that the input face image data is not a proper face image of the user, a display prompting for a password is shown. For example, an instruction such as "Are you really Mr. X? If so, enter your password." is displayed.
- step S411 When the password is input (step S411), the password is collated (step S412), and it is determined whether the user is a proper user.
- the password to be verified is set in advance and stored in the nonvolatile memory 4.
- if it is determined that the user who input the password is the proper user (step S413), a response such as "You really are Mr. X. You can use it." is made.
- if it is determined that the user is not a proper user, error processing is performed in step S414. In this step, for example, a response such as "Your eyes are different. You are not Mr. X, so you cannot use it." is made.
- if it is determined in step S402 that the mobile phone 100 has not been set to the function restriction mode, the flow shifts to step S408 to wait for operation of the operation unit 7. Also, if it is determined in step S401 that the operation of the operation unit 7 is a call processing operation, the call processing is performed (step S415) until a call end operation is detected in step S416.
- step S501 image data for each face part is extracted from the face image data stored in RAM3.
- the items to be extracted are the items shown in Fig. 2.
- in step S502, the extracted data is compared with the collation data for each item, and the degree of similarity for each item is obtained.
- step S503 a matching score indicating the likelihood that the user who has taken the face image is an appropriate user is calculated based on the similarity for each item.
- for individual face information, the comparison is performed against each stored face image, and the resulting score is used as the matching score. If the matching score based on the degree of similarity with any individual face information is large, it is determined that the user is an appropriate user.
- the comparison is also performed with the average face information, which is the average value of the analysis results for each item of the face image data of a plurality of appropriate users. A weighted average of the similarity for each item is then used as the matching score. As with individual face information, the user is determined to be an appropriate user when the matching score is large, but a condition that the similarity of every item be equal to or greater than a predetermined value may also be imposed.
- the above collation processing standard is an example, and can be changed as appropriate depending on the required accuracy of the personal authentication and the like.
- the comparison with the average face information may be performed first, or both the comparison with the individual face information and the comparison with the average face information may be performed, and the user may be determined to be the appropriate user if the matching scores obtained from both are equal to or greater than a predetermined value.
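The weighted-average matching score with an optional all-items threshold condition might look like this (the weights and threshold are illustrative assumptions):

```python
def matching_score(similarities, weights, min_item_sim=None):
    """Weighted average of per-item similarities. If min_item_sim is given,
    every item must clear that minimum before the score is accepted."""
    if min_item_sim is not None and any(s < min_item_sim
                                        for s in similarities.values()):
        return 0.0  # the all-items condition fails outright
    total_w = sum(weights[k] for k in similarities)
    return sum(similarities[k] * weights[k] for k in similarities) / total_w

sims = {"eyes": 0.9, "nose": 0.8, "mouth": 0.7}
w = {"eyes": 2.0, "nose": 1.0, "mouth": 1.0}  # eyes weighted more heavily
score = matching_score(sims, w)                      # (1.8 + 0.8 + 0.7) / 4
gated = matching_score(sims, w, min_item_sim=0.75)   # mouth fails threshold
print(score, gated)
```

The user would then be judged appropriate when the score exceeds some acceptance threshold, tuned to the required authentication accuracy.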
- the comparison may be performed using the statistics of the collation data.
- the matching data is configured to include a reduced image for each face part.
- an n-dimensional column vector x can be created by arranging the density values of the pixels of the reduced image of one face part. For the reduced images of the same part of p faces, the vectors x_f (f is a natural number not less than 1 and not more than p) are composed of the density values of the pixels of the reduced images.
- the eigenvectors l_f of the variance-covariance matrix Σ of the p deviation vectors (x_f − m), where m is the average vector, are calculated.
- writing an eigenvalue of the matrix Σ as λ and the unit matrix as I, the eigenvectors satisfy Σ l_j = λ_j l_j, that is, (Σ − λ_j I) l_j = 0.
- the eigenvectors l_j and the eigenvalues λ_j can be calculated using the Jacobi method or the like.
- the transformation matrix B is the transposed matrix A^t of the matrix A = (l_1, l_2, ..., l_p) in which the eigenvectors are arranged as columns. Each row of the transformation matrix B means a principal component direction of the column vectors (x_f − m) of the matrix X = (x_1 − m, ..., x_p − m), whose columns are the deviations of the collation data from the average of the density values of each pixel of the reduced image.
- at the time of collation, a vector y composed of the density values of each pixel of the reduced image of the face part of the face image data stored in the RAM 3 is created.
- the difference (y − m) between the vector y and the average vector m of the vectors x_f (f is a natural number not less than 1 and not more than p), which are the collation data of the same part, is calculated.
- the difference (y − m) is multiplied from the left by the transformation matrix B, and the result is multiplied from the left by the transposed matrix B^t of the transformation matrix, to obtain the vector B^t B (y − m).
- obtaining B^t B (y − m) corresponds to projecting the vector (y − m), which represents one face part of the face image data stored in the RAM 3, into the space spanned by the column vectors (x_f − m) (f is a natural number from 1 to p).
- the norm of the difference (y − m) − B^t B (y − m) is equal to the length of the perpendicular at the time of projection, and can be used as the similarity between the face part of the verification data and the face part of the face image data stored in the RAM 3.
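The projection-residual similarity described above can be sketched with NumPy as follows (the dimensions and pixel data are toy values; the eigendecomposition here uses `numpy.linalg.eigh` rather than the Jacobi method mentioned in the text):

```python
import numpy as np

rng = np.random.default_rng(0)

# p stored reduced-image vectors of one face part, each of dimension n (toy data)
n, p = 16, 5
X = rng.normal(size=(n, p))          # columns are the vectors x_f
m = X.mean(axis=1, keepdims=True)    # average vector m
D = X - m                            # deviation vectors (x_f - m)

# Eigenvectors of the variance-covariance matrix of the deviations
cov = D @ D.T / p
eigvals, eigvecs = np.linalg.eigh(cov)
A = eigvecs[:, -p:]                  # p leading eigenvectors as columns of A
B = A.T                              # transformation matrix B = A^t

# Collation: project the captured vector's deviation (y - m) onto the
# principal-component subspace; the residual norm is the perpendicular length.
y = rng.normal(size=(n, 1))          # reduced image of the captured face part
d = y - m
proj = B.T @ (B @ d)                 # B^t B (y - m)
residual = np.linalg.norm(d - proj)  # smaller residual → more similar
print(residual)
```

A vector already lying in the stored subspace yields a residual of zero, matching the "length of the perpendicular" interpretation in the text.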
- in step S601, it is determined whether collation data needs to be stored. If it is determined in step S403 of FIG. 4 that authentication using a face image is not possible, it is determined that collation data needs to be stored. Even if it is determined that authentication using a face image is possible, if collation data has not been sufficiently stored for the combinations of the shooting date and time and shooting location setting categories (see Fig. 3), it is determined that collation data storage is necessary. It is also possible to keep a log of the collation results and determine that collation data needs to be stored when the accuracy rate of authentication is low.
- in step S602, it is determined whether or not the necessary face image can be captured. For example, if data for the daytime on holidays is required but the clock means (not shown) in the mobile phone 100 indicates a weekday or nighttime, acquisition of the necessary collation data is impossible, so no image is taken. In order to prevent the determination that a face image can be captured from occurring more frequently than necessary, it may be forcibly determined that the required face image cannot be captured if a certain period of time has not elapsed since the user was last asked about the possibility of taking a picture.
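A minimal sketch of this clock-based possibility determination (the weekday/holiday and daytime boundaries are illustrative assumptions):

```python
from datetime import datetime

def can_capture(needed_day, needed_time, now):
    """Check whether the phone's clock matches the needed shooting category.
    Boundaries (Sat/Sun as holidays, daytime 9-18h) are illustrative."""
    day = "holiday" if now.weekday() >= 5 else "weekday"
    time_of_day = "daytime" if 9 <= now.hour < 18 else "night"
    return day == needed_day and time_of_day == needed_time

# Holiday-daytime data is needed, but it is a weekday night → cannot capture
now = datetime(2024, 5, 15, 22, 30)            # a Wednesday, 22:30
print(can_capture("holiday", "daytime", now))  # False
```

The possibility determination would return True only when the clock (and, for location categories, other cues) matches a category still missing collation data.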
- if the necessary face image can be captured, an inquiry is made in step S603 as to whether the user is ready to shoot. This inquiry is made by displaying an image simulating the behavior of the virtual character on the display unit 6. In step S409 of FIG. 4, since processing other than the call processing is being performed, it is easy to have the virtual character express the desire to take a face image between user operations.
- the determination of the shooting date and time shown in FIG. 3 is performed by the clock means (not shown) provided in the mobile phone 100 or the like. The determination as to whether the location is outdoors or indoors can be made based on the illuminance detected by the imaging means, settings specified by the user, and the like. Examples of user-specified settings include white balance (bulb, sunny, cloudy) and flash ON/OFF. If the determination is impossible or inaccurate with the mobile phone 100, an inquiry is made in step S603 to make the determination.
- in step S603, the virtual character says, for example, "I want to photograph Mr. X's face now. Is it OK?" or "If you are outside now, I want to photograph your face."
- in step S604, an operation indicating that photographing is possible is detected (for example, an affirmative response is made with the enter key of the operation unit 7).
- step S605 the mode is shifted to a face image photographing state.
- a shooting guide may be presented.
- the presentation of the shooting guide may be, for example, the same display as in step S404 (see FIG. 7 (c)).
- a process of prompting for a password and determining whether or not the user is a proper user may be performed based on the password.
- the face image data is stored in the RAM 3, analyzed for each item as shown in Fig. 2, and the analysis result is stored in the nonvolatile memory 4 (step S606).
- step S607 it is determined whether or not it is necessary to take a plurality of face images. If necessary, the process returns to step S603 and repeats.
- the present invention is applicable to an electronic device or the like that performs user authentication using face image information.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP04772833A EP1662438A1 (en) | 2003-09-01 | 2004-08-30 | Electronic device having user authentication function |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2003-308616 | 2003-09-01 | ||
JP2003308616 | 2003-09-01 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005022462A1 true WO2005022462A1 (ja) | 2005-03-10 |
Family
ID=34269519
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/012884 WO2005022462A1 (ja) | 2004-08-30 | Electronic device with user authentication function |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP1662438A1 (ja) |
CN (1) | CN1846228A (ja) |
WO (1) | WO2005022462A1 (ja) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7991388B1 (en) | 2011-05-10 | 2011-08-02 | CommerceTel, Inc. | Geo-bio-metric PIN |
CN105354461 (zh) * | 2014-08-22 | 2016-02-24 | Shenzhen ZTE Microelectronics Technology Co., Ltd. | Authentication method and terminal |
CN105991813 (zh) * | 2015-01-28 | 2016-10-05 | ZTE Corporation | Call processing method and device |
CN105120089 (zh) * | 2015-08-17 | 2020-01-03 | Huizhou TCL Mobile Communication Co., Ltd. | Method for automatically answering a call on a mobile terminal, and mobile terminal |
JP6687488 (ja) * | 2015-12-24 | 2020-04-22 | Panasonic Intellectual Property Corporation of America | Unmanned aerial vehicle and control method therefor |
CN106294726 (zh) * | 2016-08-09 | 2017-01-04 | Beijing Guangnian Wuxian Technology Co., Ltd. | Processing method and device based on robot character interaction |
CN117336102 (zh) * | 2023-11-30 | 2024-03-01 | Beijing Guancheng Technology Co., Ltd. | Multi-verification identity authentication system and authentication method therefor |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1173092 (ja) * | 1997-08-28 | 1999-03-16 | Omron Corp | Virtual pet breeding device, method, and program recording medium |
JP2002261914 (ja) * | 2001-03-05 | 2002-09-13 | Hitachi Ltd | Portable communication terminal with user authentication function |
JP2003058888 (ja) * | 2001-08-15 | 2003-02-28 | Secom Co Ltd | Personal verification device |
JP2003169132 (ja) * | 2001-11-30 | 2003-06-13 | Yamaha Corp | Mobile phone |
- 2004-08-30 WO PCT/JP2004/012884 patent/WO2005022462A1/ja not_active Application Discontinuation
- 2004-08-30 CN CNA2004800250124A patent/CN1846228A/zh active Pending
- 2004-08-30 EP EP04772833A patent/EP1662438A1/en not_active Withdrawn
Also Published As
Publication number | Publication date |
---|---|
EP1662438A1 (en) | 2006-05-31 |
CN1846228A (zh) | 2006-10-11 |
Legal Events
Code | Title | Description |
---|---|---|
WWE | WIPO information: entry into national phase | Ref document number: 200480025012.4; Country of ref document: CN |
AK | Designated states | Kind code of ref document: A1; Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
AL | Designated countries for regional patents | Kind code of ref document: A1; Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
121 | EP: the EPO has been informed by WIPO that EP was designated in this application | |
WWE | WIPO information: entry into national phase | Ref document number: 2004772833; Country of ref document: EP |
WWP | WIPO information: published in national office | Ref document number: 2004772833; Country of ref document: EP |
WWW | WIPO information: withdrawn in national office | Ref document number: 2004772833; Country of ref document: EP |