WO2018003336A1 - User authentication device, user authentication device control method, control program, and recording medium - Google Patents


Info

Publication number
WO2018003336A1
WO2018003336A1 (PCT/JP2017/018443)
Authority
WO
WIPO (PCT)
Prior art keywords: infrared image, image, user authentication, unit, infrared
Prior art date
Application number
PCT/JP2017/018443
Other languages
French (fr)
Japanese (ja)
Inventor
成文 後田
Original Assignee
シャープ株式会社
Priority date
Filing date
Publication date
Application filed by シャープ株式会社 (Sharp Corporation)
Priority to JP2018524952A (granted as patent JP6754834B2)
Publication of WO2018003336A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 1/00: General purpose image data processing
    • G06T 7/00: Image analysis

Definitions

  • the present invention relates to a user authentication device that performs user authentication using an infrared image obtained by imaging at least a part of a user's face.
  • Patent Document 1 discloses an infrared face authentication apparatus that performs face recognition using an infrared image.
  • the infrared face authentication device displays a captured infrared image on a display unit in order to assist adjustment of the positional relationship between the user's face and the device.
  • JP 2008-181468 A (published on August 7, 2008)
  • The conventional technique has a problem in that an inappropriate infrared image may be displayed on the display unit.
  • For example, an infrared image in which the subject's clothing appears transparent may be displayed on the display unit.
  • Moreover, the user of the mobile terminal could maliciously image another person and view an infrared image in which that person's clothing appears transparent; in a portable device this problem becomes more prominent.
  • The present invention has been made in view of the above problems, and an object of the present invention is to prevent an inappropriate image from being displayed on the display unit when user authentication is performed using an infrared image.
  • To solve the above problem, a user authentication device according to one aspect of the present invention is a user authentication device that performs user authentication using an infrared image obtained by imaging at least a part of a user's face, and includes: an image determination unit that determines whether or not the infrared image includes a face part used for the user authentication; and a display control unit that displays the infrared image on the display unit when it is determined that the face part is included, and does not display the infrared image on the display unit when it is determined that the face part is not included.
  • Likewise, a control method for a user authentication device according to one aspect of the present invention is a method of controlling a user authentication device that performs user authentication using an infrared image obtained by imaging at least a part of a user's face, and includes: an image determination step of determining whether or not the infrared image includes a face part used for the user authentication; and a display control step of displaying the infrared image on the display unit when it is determined that the face part is included, and not displaying the infrared image on the display unit when it is determined that the face part is not included.
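The claimed image-determination and display-control steps can be condensed into a single gating function. The following is an illustrative sketch only, not the patented implementation: `detect_face_part`, the frame representation, and the substitute-image label are hypothetical stand-ins.

```python
def choose_display(ir_image, detect_face_part, substitute_image):
    """Image determination step followed by the display control step:
    return the infrared image only when the face part used for user
    authentication is detected in it; otherwise return the substitute."""
    if detect_face_part(ir_image):     # image determination step
        return ir_image                # face part present: show the infrared image
    return substitute_image            # face part absent: never show the infrared image

# Hypothetical stand-in detector: reads a flag instead of analysing pixels.
frame_with_eyes = {"pixels": b"...", "eyes_visible": True}
frame_no_eyes = {"pixels": b"...", "eyes_visible": False}
detector = lambda frame: frame["eyes_visible"]

assert choose_display(frame_with_eyes, detector, "substitute_141") is frame_with_eyes
assert choose_display(frame_no_eyes, detector, "substitute_141") == "substitute_141"
```

The key property, as claimed, is that the infrared image can never reach the display unit without first passing the face-part determination.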
  • FIG. 1 is a block diagram illustrating an example of the configuration of the main parts of the mobile terminal according to Embodiment 1.
  • FIG. 2 is a transition diagram of images captured using the mobile terminal shown in FIG. 1 and of images displayed on that mobile terminal.
  • FIG. 3 is a flowchart showing an example of the flow of processing executed by the mobile terminal shown in FIG. 1.
  • FIG. 4 is a block diagram illustrating an example of the configuration of the main parts of the mobile terminal according to Embodiment 2.
  • FIG. 5 is a transition diagram of images captured using the mobile terminal shown in FIG. 4 and of images displayed on that mobile terminal.
  • Embodiment 1: Hereinafter, Embodiment 1 of the present invention will be described in detail with reference to FIGS. 1 to 3.
  • The application of the present invention is not limited to mobile terminals; it can be applied to any device having a function of determining whether or not a subject is a pre-registered user using an image obtained by irradiating the subject with infrared light and capturing the subject.
  • FIG. 2 is a transition diagram of an image captured using the mobile terminal 1 and a transition diagram of an image displayed on the mobile terminal 1.
  • the mobile terminal 1 is not limited to a smartphone as illustrated, and may be a feature phone or a tablet terminal.
  • the mobile terminal 1 (user authentication device) according to the present embodiment has a function of irradiating a subject with light in the infrared region and imaging the subject.
  • an image captured by this function is referred to as an infrared image.
  • the portable terminal 1 includes an infrared irradiation unit 11 that irradiates a subject with infrared rays and an infrared imaging unit 12 that captures infrared images (see FIG. 2B and the like).
  • The infrared irradiation unit 11 according to the present embodiment is configured to emit light in the near-infrared region (approximately 700 nm to 2500 nm) as the infrared light.
  • The mobile terminal 1 compares the iris feature points shown in the infrared image with the pre-registered iris feature points of the user of the mobile terminal 1 (that is, it performs iris authentication), and if they match, the mobile terminal 1 is unlocked.
  • To perform iris authentication accurately, the positional relationship between the user's eyes and the mobile terminal 1 (specifically, the distance between the user's eyes and the mobile terminal 1, and the inclination of the mobile terminal 1) must be set appropriately.
  • The captured infrared image is displayed on the display unit 13 so that the user can make this adjustment. For this reason, the infrared irradiation unit 11 and the infrared imaging unit 12 are provided on the same surface as the display unit 13 of the mobile terminal 1, as shown in FIG. 2B.
  • The infrared irradiation unit 11 and the infrared imaging unit 12 need only be provided on the same surface as the display unit 13; their positions on that surface are not particularly limited.
  • The mobile terminal 1 displays the infrared image on the display unit 13 only when both eyes of the person (the face part to be compared, that is, the face part necessary for user authentication) appear in the captured infrared image.
  • FIGS. 2A, 2C, and 2E respectively show infrared images 21a, 21b, and 21c captured by the mobile terminal 1.
  • FIGS. 2B, 2D, 2F, and 2G show screens displayed on the display unit 13 of the mobile terminal 1.
  • When it is not necessary to distinguish between the infrared images 21a, 21b, and 21c, they are collectively referred to as the "infrared image 21".
  • the person shown in the infrared image 21 is a user of the mobile terminal 1 (in other words, a person who has registered iris feature points in the mobile terminal 1 in advance).
  • the human eyes are not shown in the infrared image 21 a captured by the mobile terminal 1.
  • In this case, the mobile terminal 1 causes the display unit 13 to display a substitute image 141 in place of the infrared image 21, as illustrated in FIG. 2B.
  • In addition, the mobile terminal 1 causes the display unit 13 to display text 31 that prompts the user to face the imaging device (camera) that captures the infrared image 21.
  • Specifically, the text 31 prompts the user to adjust the distance between the user's eyes and the mobile terminal 1 and the inclination of the mobile terminal 1 so that both eyes appear in the infrared image 21.
  • When the mobile terminal 1 executes an application for performing user authentication, an alignment image 142 (positioning image) is displayed on the display unit 13 regardless of whether or not both eyes of the person appear in the infrared image 21, as shown in FIG. 2B. Details of the alignment image 142 will be described later.
  • When the infrared image 21b shown in FIG. 2C is captured, the mobile terminal 1 displays on the display unit 13 the infrared image 22b, which is an image obtained by performing image processing on the infrared image 21b, as shown in FIG. 2D.
  • At this time, the mobile terminal 1 starts user authentication.
  • In addition, text 32 indicating that user authentication is in progress may be displayed on the display unit 13.
  • The infrared image 22b is displayed behind the alignment image 142, as shown in FIG. 2D.
  • the infrared images displayed on the display unit 13 may be collectively referred to as “infrared image 22”.
  • the text 32 shown in FIG. 2 is an example, and is not limited to the illustrated example.
  • the alignment image 142 is an image for causing the user to recognize an eye position suitable for performing user authentication by being displayed on the display unit 13 together with the infrared image 22.
  • The alignment image 142 has circular portions 421a and 421b through which the infrared image 22 remains visible to the user.
  • When it is not necessary to distinguish the circular portions 421a and 421b, they are collectively referred to as the "circular portion 421".
  • The circular portions 421 are arranged so that the user's eyes shown in the infrared image 22 fit in the circular portions 421, one eye per circle.
  • If the positional relationship is not adjusted in this way, the iris authentication may fail and the lock may not be released.
  • The user therefore performs the following adjustment to release the lock of the mobile terminal 1: while checking the infrared image 22b and the alignment image 142, the user adjusts the positional relationship between his/her eyes and the mobile terminal 1 so that his/her eyes are positioned one per circular portion 421.
  • When the infrared image 21c shown in FIG. 2E is captured as a result of this adjustment, a screen in which the user's eyes are located one per circular portion 421 is displayed, as shown in FIG. 2F.
  • By displaying the alignment image 142, the mobile terminal 1 can make the user recognize the eye position suitable for user authentication (in other words, the appropriate positional relationship between the user's eyes (irises) and the mobile terminal 1).
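The role of the circular portions can be illustrated with a simple containment test: the position is suitable when each detected eye centre falls inside its own circle. The coordinates, radius, and function names below are hypothetical; the patent does not specify how (or whether) the terminal checks alignment numerically.

```python
import math

def eye_in_circle(eye_centre, circle_centre, radius):
    """True when the detected eye centre lies inside one circular
    portion (421a or 421b) of the alignment image."""
    return math.dist(eye_centre, circle_centre) <= radius

def eyes_aligned(eye_centres, circle_centres, radius=40):
    """Suitable for authentication when each eye falls in its own
    circular portion, one eye per circle."""
    return len(eye_centres) == len(circle_centres) and all(
        eye_in_circle(eye, circle, radius)
        for eye, circle in zip(eye_centres, circle_centres)
    )

circles = [(180, 300), (330, 300)]  # hypothetical screen coordinates of 421a, 421b
assert eyes_aligned([(185, 305), (325, 296)], circles) is True   # both eyes in place
assert eyes_aligned([(120, 305), (325, 296)], circles) is False  # left eye outside
```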
  • If the feature points of the user's irises shown in the infrared image 21 (for example, the infrared image 21c) match the iris feature points registered in advance in the mobile terminal 1, the mobile terminal 1 is unlocked. The mobile terminal 1 then displays a screen indicating that the lock has been released (for example, the menu screen shown in FIG. 2G) on the display unit 13.
  • Note that, as shown in FIG. 2D, the lock of the mobile terminal 1 will be released when the authentication succeeds even if the user's eyes in the infrared image 22 are not positioned one per circular portion 421.
  • The alignment image 142 shown in FIG. 2 is an example, and the present invention is not limited to it. Specifically, the alignment image 142 may be any image that allows the user to adjust the positional relationship between the user's own eyes and the mobile terminal 1 so that the iris feature points of the person shown in the infrared image can be accurately compared with the iris feature points registered in advance.
  • For example, it may be an image that hides the regions of the infrared image other than the areas around the eyes, or an image that displays the face regions other than the areas around the eyes in a semi-transparent manner.
  • FIG. 1 is a block diagram illustrating an example of a main configuration of the mobile terminal 1.
  • FIG. 1 shows only those members of the mobile terminal 1 that are highly relevant to the present invention. Illustration and detailed description of members of low relevance (for example, the member that receives user operations) are therefore omitted.
  • the mobile terminal 1 includes a control unit 10, an infrared irradiation unit 11, an infrared imaging unit 12, a display unit 13, and a storage unit 14.
  • the control unit 10 controls each unit of the mobile terminal 1 in an integrated manner.
  • The infrared irradiation unit 11 irradiates the subject with infrared rays so that an infrared image can be captured.
  • the infrared imaging unit 12 captures an infrared image.
  • the display unit 13 displays a screen corresponding to the result of the process executed by the control unit 10.
  • the storage unit 14 stores various data used in the mobile terminal 1.
  • The infrared imaging unit 12 is a so-called camera; it may be a camera (infrared camera) that captures only infrared images.
  • Alternatively, it may be a camera that can switch, in accordance with an instruction from the control unit 10, between a mode for capturing normal images (images captured by illuminating the subject with visible light) and a mode for capturing infrared images.
  • the control unit 10 includes an imaging processing unit 101, an image processing unit 102, an eye detection unit 103 (image determination unit), an authentication unit 104, and a display screen generation unit 105 (display control unit). Further, the storage unit 14 stores at least a substitute image 141, an alignment image 142, and authentication data 143. Since the substitute image 141 and the alignment image 142 have already been described, description thereof is omitted here.
  • the imaging processing unit 101 drives and controls the infrared irradiation unit 11 and the infrared imaging unit 12. Specifically, the imaging processing unit 101 drives the infrared irradiation unit 11 (d1 in FIG. 1) when an application for performing user authentication is executed by a user operation on an operation unit (not shown), and infrared rays are emitted. Irradiate. Further, when the application is executed, the imaging processing unit 101 drives the infrared imaging unit 12 (d2 in FIG. 1) to capture an infrared image. The infrared imaging unit 12 is controlled by the imaging processing unit 101 to capture an infrared image every predetermined time (for example, every 33 milliseconds (30 fps)), and outputs the infrared image to the image processing unit 102.
  • Note that the imaging processing unit 101 may acquire a detection result from a sensor (not shown) that detects infrared rays and determine whether the device is already sufficiently irradiated with infrared rays from an external source (for example, the sun). If it determines that the device is sufficiently irradiated, it may regard the subject as sufficiently irradiated as well and refrain from having the infrared irradiation unit 11 irradiate the subject.
  • In other words, the imaging processing unit 101 may be configured to drive the infrared irradiation unit 11 and irradiate the subject with infrared rays only when the amount of infrared light reaching the device from outside is less than a predetermined threshold.
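This conditional irradiation reduces to a single threshold comparison. A minimal sketch, assuming an arbitrary sensor scale; the threshold value and function name are hypothetical.

```python
AMBIENT_IR_THRESHOLD = 500  # hypothetical sensor reading; units are arbitrary

def should_irradiate(ambient_ir_level, threshold=AMBIENT_IR_THRESHOLD):
    """Drive the infrared irradiation unit 11 only when the ambient
    infrared light reaching the device (e.g. from the sun) is below the
    predetermined threshold; otherwise the subject is treated as
    already sufficiently irradiated."""
    return ambient_ir_level < threshold

assert should_irradiate(120) is True    # dim surroundings: irradiate the subject
assert should_irradiate(2000) is False  # bright sunlight: skip irradiation
```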
  • the image processing unit 102 performs image processing for use in iris authentication on the acquired infrared image. Specifically, the image processing unit 102 performs processing such as noise reduction and edge enhancement. Then, the image processing unit 102 outputs the infrared image subjected to the image processing to the eye detection unit 103. Note that the above-described image processing is an example, and the image processing performed by the image processing unit 102 is not limited to this example.
  • the eye detection unit 103 determines whether or not human eyes are included in the infrared image. Specifically, the eye detection unit 103 first detects a face from the acquired infrared image. When the face can be detected, the face area is cut out, and the eye area is detected from the cut out face area. If the eye region can be detected, it is determined that both eyes of the person are included in the infrared image, and the acquired infrared image is output to the authentication unit 104 (d3 in FIG. 1). Further, the acquired infrared image and determination result are output to the display screen generation unit 105 (d4 in FIG. 1).
  • If the eye region cannot be detected, the eye detection unit 103 determines that the human eyes are not included in the infrared image and outputs only the determination result to the display screen generation unit 105.
  • An existing technique can be used for detection of the face area and the eye area.
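In practice the detection would use an existing technique such as a trained cascade classifier. The sketch below only mirrors the described control flow (detect a face, cut out the face region, then require two eye regions inside it) using plain bounding-box arithmetic; the rectangle representation and function names are assumptions for illustration.

```python
from typing import Optional, Tuple, List

Rect = Tuple[int, int, int, int]  # (x, y, width, height)

def contains(outer: Rect, inner: Rect) -> bool:
    """True when `inner` lies entirely within `outer`."""
    ox, oy, ow, oh = outer
    ix, iy, iw, ih = inner
    return ox <= ix and oy <= iy and ix + iw <= ox + ow and iy + ih <= oy + oh

def both_eyes_detected(face: Optional[Rect], eye_candidates: List[Rect]) -> bool:
    """Mirror of the eye detection unit 103: a face must be found first,
    then at least two eye regions inside the cut-out face region."""
    if face is None:                # no face: determination fails immediately
        return False
    eyes = [e for e in eye_candidates if contains(face, e)]
    return len(eyes) >= 2

face = (100, 100, 200, 220)
eyes = [(140, 160, 30, 15), (230, 160, 30, 15)]
assert both_eyes_detected(face, eyes) is True       # both eyes inside the face region
assert both_eyes_detected(None, eyes) is False      # no face detected at all
assert both_eyes_detected(face, eyes[:1]) is False  # only one eye region found
```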
  • the authentication unit 104 compares (collates) the iris feature points included in the infrared image with the iris feature points registered in advance, and determines whether the two feature points match.
  • An existing technique can be used for iris authentication performed by the authentication unit 104. Specifically, when acquiring the infrared image, the authentication unit 104 reads the authentication data 143 from the storage unit 14.
  • the authentication data 143 is information indicating the iris feature points registered in advance as described above.
  • The authentication data 143 may take any form that indicates the iris feature points: for example, an infrared image captured in advance, an image of only the eye region cut out from such an image, or digitized iris feature points.
  • the authentication unit 104 compares the feature point of the iris included in the acquired infrared image with the feature point of the iris indicated by the read authentication data 143, and determines whether or not they match. When it is determined that they match, the display screen generation unit 105 is notified of this.
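Iris collation itself relies on existing techniques; one common formulation compares binary iris codes by normalized Hamming distance. The 32-bit codes and the 0.32 decision threshold below are illustrative only (real iris codes are far longer), and this is a stand-in for whatever matching method the authentication unit 104 actually uses.

```python
def hamming_ratio(code_a: int, code_b: int, bits: int) -> float:
    """Fraction of differing bits between two fixed-length iris codes."""
    return bin((code_a ^ code_b) & ((1 << bits) - 1)).count("1") / bits

def irises_match(code_a: int, code_b: int, bits: int = 32, threshold: float = 0.32) -> bool:
    """Declare a match when the normalized Hamming distance is below the
    threshold; 0.32 is a commonly cited iris-recognition operating point."""
    return hamming_ratio(code_a, code_b, bits) < threshold

enrolled = 0b1011_0110_1100_1010_0101_1101_0011_0110  # pre-registered code (authentication data 143)
probe_same = enrolled ^ 0b0100                        # one noisy bit: same iris
probe_other = ~enrolled & 0xFFFFFFFF                  # all bits differ: different iris

assert irises_match(enrolled, probe_same) is True
assert irises_match(enrolled, probe_other) is False
```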
  • the display screen generation unit 105 generates a screen to be displayed on the display unit 13 and causes the display unit 13 to display the screen. Specifically, when an application for user authentication is activated, the display screen generation unit 105 receives the substitute image 141, the alignment image 142, and the text 31 (not shown in FIG. 1) from the storage unit 14. Read and display on the display unit 13.
  • When the display screen generation unit 105 acquires the determination result that both eyes of the person are included in the infrared image, it displays on the display unit 13 the infrared image acquired together with that determination result, as well as the text 32 (not shown in FIG. 1).
  • As a result, the screen shown in FIG. 2D or FIG. 2F is displayed on the display unit 13.
  • The display screen generation unit 105 updates the infrared image displayed on the display unit 13 every time it acquires an infrared image from the eye detection unit 103 together with the determination result that the human eyes are included.
  • When it acquires the determination result that the human eyes are not included, the display screen generation unit 105 changes the infrared image 22 and the text 32 displayed on the display unit 13 to the substitute image 141 and the text 31. The screen shown in FIG. 2B is thereby displayed on the display unit 13.
  • the display screen generation unit 105 may perform image processing on the infrared image when displaying the infrared image on the display unit 13. For example, a configuration in which an area corresponding to the circular portion 421 in the acquired infrared image is specified and only the area is displayed on the display unit 13 may be used.
  • When notified by the authentication unit 104 that the feature points match, the display screen generation unit 105 causes the display unit 13 to display the menu screen (the screen shown in FIG. 2G).
  • FIG. 3 is a flowchart illustrating an example of a flow of processing executed by the mobile terminal 1.
  • When an application for performing user authentication is started by a user operation, the imaging processing unit 101 drives the infrared irradiation unit 11 and the infrared imaging unit 12. The display screen generation unit 105 then reads the substitute image 141 and the alignment image 142 from the storage unit 14 and displays them on the display unit 13 (step S1; hereinafter, "step" is omitted).
  • the image processing unit 102 acquires an infrared image captured by the infrared imaging unit 12 (step S2), performs image processing on the infrared image (S3), and outputs the image to the eye detection unit 103.
  • The eye detection unit 103 attempts to detect both eyes in the acquired infrared image (S4, image determination step). When both eyes cannot be detected (NO in S4), it determines that the human eyes are not included in the infrared image and outputs the determination result to the display screen generation unit 105.
  • When the display screen generation unit 105 obtains the determination result that the human eyes are not included in the infrared image, it reads the substitute image 141 from the storage unit 14 and displays it on the display unit 13 (S9, display control step). The process shown in FIG. 3 then returns to S2. That is, until both eyes are detected in S4, the infrared image is not displayed on the display unit 13 and the substitute image 141 is displayed instead.
  • the eye detection unit 103 detects both eyes (YES in S4), the eye detection unit 103 determines that both human eyes are included in the infrared image, and outputs the infrared image to the authentication unit 104. Further, the determination result and the infrared image are output to the display screen generation unit 105.
  • When the display screen generation unit 105 acquires the determination result and the infrared image, it displays the acquired infrared image on the display unit 13 (S5, display control step).
  • the authentication unit 104 starts user authentication (S6). Specifically, the feature point of the iris included in the acquired infrared image is compared with the feature point of the iris stored in the storage unit 14 (the feature point indicated by the authentication data 143). Note that the processing of S5 and S6 may not be performed in the order shown. For example, the process of S6 may be performed before S5, or the processes of S5 and S6 may be performed simultaneously.
  • the authentication unit 104 determines whether or not the two feature points match, that is, whether or not the person shown in the infrared image is a user of the mobile terminal by performing the above comparison (S7). When it is determined that the person shown in the infrared image is the user of the mobile terminal (YES in S7), the authentication unit 104 notifies the display screen generation unit 105 to that effect. Receiving the notification, the display screen generation unit 105 displays the menu screen on the display unit 13 (S8). This is the end of the process shown in FIG.
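The S2 to S9 flow above can be condensed into a loop over captured frames. The string labels and the injected detector/matcher callables are hypothetical placeholders for the units described in the flowchart, not the actual implementation.

```python
def authentication_flow(frames, detect_both_eyes, irises_match):
    """Steps S2-S9: for each captured infrared frame, show the substitute
    image until both eyes are detected, then show the processed infrared
    image and attempt iris authentication."""
    display = "substitute_image"                 # S1: initial screen
    for frame in frames:                         # S2: acquire next frame
        processed = frame                        # S3: noise reduction etc. elided here
        if not detect_both_eyes(processed):      # S4: image determination step
            display = "substitute_image"         # S9, then back to S2
            continue
        display = "infrared_image"               # S5: display control step
        if irises_match(processed):              # S6-S7: iris authentication
            return "menu_screen"                 # S8: unlock and show the menu
    return display

frames = ["no_eyes", "no_eyes", "eyes_stranger", "eyes_owner"]
has_eyes = lambda f: f.startswith("eyes")
is_owner = lambda f: f == "eyes_owner"
assert authentication_flow(frames, has_eyes, is_owner) == "menu_screen"
assert authentication_flow(["no_eyes"], has_eyes, is_owner) == "substitute_image"
```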
  • the user authentication is started only when both eyes of the person are detected in the infrared image 21.
  • the conditions for starting user authentication are not limited to this example.
  • For example, the configuration may be such that user authentication is started when only one eye is detected.
  • Specifically, when an eye region is detected in the left half of the face region cut out from the acquired infrared image, the eye detection unit 103 may determine that the infrared image includes the left eye.
  • Likewise, when an eye region is detected in the right half of the face region, the eye detection unit 103 may determine that the infrared image includes the right eye.
  • That is, the eye detection unit 103 may be configured to determine that an eye is included in the infrared image when an eye region is detected in at least one of the left half and the right half of the cut-out face region. In this case, the eye detection unit 103 outputs the acquired infrared image to the authentication unit 104, and outputs the acquired infrared image and the determination result to the display screen generation unit 105.
  • In this configuration, the authentication unit 104 compares the iris feature points of the detected eye with the pre-registered iris feature points of the corresponding (right or left) eye and determines whether the two match.
  • By performing iris authentication when at least one of the left and right eye regions is detected, user authentication can be performed even when the feature points of only one iris can be acquired and verified, such as when the user is wearing an eye patch over one eye.
  • user authentication is performed by iris authentication using an infrared image.
  • user authentication using an infrared image is not limited to iris authentication.
  • In the present embodiment, a mobile terminal 1a will be described that performs user authentication by face recognition and displays the infrared image on the display unit 13 only when a face part used for the face recognition (for example, eyes, nose, mouth, ears, or contour) is included in the infrared image.
  • The face recognition in the present embodiment collates the contour of the person's face, the positions of both eyes, the position of the nostrils, and the position of the mouth shown in the infrared image against the contour, eye positions, nostril position, and mouth position registered in advance.
  • However, face recognition is not limited to this example.
  • FIG. 4 is a block diagram illustrating an example of a main configuration of the mobile terminal 1a.
  • Compared with the mobile terminal 1, the mobile terminal 1a includes a control unit 10a and a storage unit 14a instead of the control unit 10 and the storage unit 14.
  • The control unit 10a includes a collation target detection unit 106 instead of the eye detection unit 103, and an authentication unit 104a instead of the authentication unit 104.
  • The storage unit 14a stores an alignment image 142a instead of the alignment image 142, and authentication data 143a instead of the authentication data 143.
  • the collation target detection unit 106 determines whether or not the infrared image includes a part used for face recognition in the human face.
  • In the present embodiment, the collation target detection unit 106 determines whether or not the infrared image includes the face contour, both eyes, the nose, and the mouth. Specifically, the collation target detection unit 106 first detects a face in the acquired infrared image. When a face is detected, the face region is cut out, and the eye regions, nose region, and mouth region are detected from the cut-out face region.
  • When all of these regions are detected, the collation target detection unit 106 determines that the infrared image includes the parts used for face recognition, and outputs the acquired infrared image to the authentication unit 104a (d3 in FIG. 4). It also outputs the acquired infrared image and the determination result to the display screen generation unit 105 (d4 in FIG. 4).
  • An existing technique can be used for detecting each region.
  • The authentication unit 104a collates the contour of the person's face, the positions of both eyes, the position of the nostrils, and the position of the mouth shown in the infrared image against the contour and positions registered in advance, and determines whether or not they all match. Specifically, when it acquires the infrared image, the authentication unit 104a reads the authentication data 143a from the storage unit 14a.
  • The authentication data 143a is information indicating the face contour, the positions of both eyes, the position of the nostrils, and the position of the mouth described above.
  • Like the authentication data 143, it may take any form that indicates these features: for example, an infrared image captured in advance, or digitized values of the contour and of the positions of the eyes, nostrils, and mouth.
  • When the face parts used in the face recognition change, the information included in the authentication data 143a changes accordingly.
  • For example, when the positions of both ears are collated instead of the mouth position, the authentication data 143a is information indicating the face contour, the positions of both eyes, the position of the nostrils, and the positions of both ears.
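A minimal sketch of the kind of collation the authentication unit 104a performs, assuming the authentication data stores normalized landmark coordinates. The landmark names, coordinate scale, and distance tolerance are hypothetical; the source does not specify the matching method.

```python
import math

def landmarks_match(enrolled, probe, tolerance=0.05):
    """Collate facial landmark positions (eyes, nostrils, mouth, contour
    points): every probe point must lie within `tolerance` of its
    enrolled counterpart for all features to be considered matching."""
    if enrolled.keys() != probe.keys():   # a required face part is missing
        return False
    return all(
        math.dist(enrolled[name], probe[name]) <= tolerance
        for name in enrolled
    )

enrolled = {"left_eye": (0.35, 0.40), "right_eye": (0.65, 0.40),
            "nostrils": (0.50, 0.60), "mouth": (0.50, 0.78)}
probe_same = {k: (x + 0.01, y - 0.01) for k, (x, y) in enrolled.items()}   # slight noise
probe_other = {k: (x + 0.20, y) for k, (x, y) in enrolled.items()}         # different face

assert landmarks_match(enrolled, probe_same) is True
assert landmarks_match(enrolled, probe_other) is False
```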
  • The alignment image 142a is an image, different from the alignment image 142 described in Embodiment 1, for allowing the user to recognize the positional relationship between the user's face and the mobile terminal 1a that is appropriate for accurate face recognition. A specific example of this image will be described later.
  • As described above, the collation target detection unit 106 determines whether or not the infrared image includes the face contour, both eyes, nose, and mouth, and the display screen generation unit 105 according to the present embodiment causes the display unit 13 to display the substitute image 141 when the collation target detection unit 106 determines that these face parts are not included. Specifically, when the collation target detection unit 106 acquires the infrared image 21d illustrated in FIG. 5A, which does not include the face parts described above, the display screen generation unit 105 displays the substitute image 141 and the text 31 on the display unit 13. This display lets the user recognize that the face parts used for face recognition do not appear in the infrared image 21.
  • By contrast, when the captured infrared image 21e includes all of the face parts described above, the display screen generation unit 105 displays the infrared image 22e and the text 32 on the display unit 13, as shown in FIG. 5.
  • Unlike the infrared image 22 described in Embodiment 1, the infrared image 22e in the present embodiment is an image in which at least the face contour, both eyes, nose, and mouth appear.
  • the display screen generation unit 105 displays the alignment image 142a on the display unit 13 regardless of whether or not the infrared image 21 includes a face part used for face recognition.
  • As shown in FIG. 5, the alignment image 142a in the present embodiment is a quadrangle drawn with a broken line; it is an image for making the user adjust the positional relationship between the user's face and the mobile terminal 1a so that the face shown in the infrared image 22 is positioned within the quadrangle. Note that the alignment image 142a is not limited to the example of FIG. 5 as long as it lets the user adjust that positional relationship.
  • In the above examples, the alignment image 142 and the alignment image 142a are displayed on the display unit 13 regardless of the determination results of the eye detection unit 103 and the collation target detection unit 106. However, the condition for displaying the alignment image is not limited to this example.
  • For example, the display screen generation unit 105 may display the alignment image 142 on the display unit 13 only when the eye detection unit 103 determines that both eyes are included in the infrared image.
  • When the authentication unit 104 acquires an infrared image, it drives a timer (not shown) and measures the elapsed time from the acquisition. If, within a predetermined time, the authentication unit 104 does not determine that the iris feature points included in the acquired infrared images match the iris feature points indicated by the read authentication data 143, it notifies the imaging processing unit 101 and the display screen generation unit 105. In response to the notification, the imaging processing unit 101 stops the infrared irradiation unit 11 and the infrared imaging unit 12.
  • In this case, a screen for notifying the user of the failure may be displayed on the display unit 13. The screen may be, for example, a screen including text indicating that user authentication has failed and a UI that allows the user to select whether to perform user authentication again. If the user selects to perform user authentication again by operating the UI, the infrared irradiation unit 11 and the infrared imaging unit 12 are driven again, and an infrared image is acquired.
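The timeout-and-retry behavior described above can be sketched as follows. This is an illustrative sketch, not the disclosed implementation: the frame source, the matching predicate, the 10-second limit, and the injectable clock are assumptions introduced for the example.

```python
import time

AUTH_TIMEOUT_SEC = 10.0  # assumed value for the "predetermined time"

def run_authentication(get_frame, irises_match,
                       timeout=AUTH_TIMEOUT_SEC, clock=time.monotonic):
    """Acquire frames until the irises match or the timeout expires.

    Returns "unlocked" on a match, or "failed" when the predetermined
    time elapses, at which point the caller would stop the infrared
    irradiation unit and imaging unit and may show a retry screen.
    """
    start = clock()
    while clock() - start < timeout:
        frame = get_frame()
        if frame is not None and irises_match(frame):
            return "unlocked"
    return "failed"
```

The clock is passed in as a parameter so that the predetermined time can be simulated without waiting in real time.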
  • Although the portable terminal 1 and the portable terminal 1a described above are configured to include the infrared irradiation unit 11, the infrared irradiation unit 11 is not essential in the portable terminal according to the present invention.
  • Specifically, such a portable terminal may capture an infrared image using infrared rays with which the subject is irradiated by the sun, or may capture an infrared image using infrared rays with which the subject is irradiated by another light source.
  • The control blocks of the mobile terminal 1 and the mobile terminal 1a may be realized by a logic circuit (hardware) formed in an integrated circuit (IC chip) or the like, or may be realized by software using a CPU (Central Processing Unit).
  • In the latter case, the portable terminal 1 and the portable terminal 1a include a CPU that executes instructions of a program that is software for realizing each function, a ROM (Read Only Memory) or a storage device (these are referred to as a "recording medium") in which the program and various data are recorded so as to be readable by a computer (or CPU), a RAM (Random Access Memory) that expands the program, and the like. The object of the present invention is achieved when the computer (or CPU) reads the program from the recording medium and executes it.
  • As the recording medium, a "non-transitory tangible medium" such as a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used.
  • Note that the program may be supplied to the computer via an arbitrary transmission medium (such as a communication network or a broadcast wave) that can transmit the program.
  • The present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the program is embodied by electronic transmission.
  • A user authentication device (portable terminal 1, portable terminal 1a) according to aspect 1 of the present invention is a user authentication device that performs user authentication using an infrared image obtained by imaging at least a part of a user's face, and includes: an image determination unit (eye detection unit 103, collation target detection unit 106) that determines whether or not the infrared image includes a face portion used for the user authentication; and a display control unit (display screen generation unit 105) that displays the infrared image on the display unit when it is determined that the face portion is included, and does not display the infrared image on the display unit when it is determined that the face portion is not included.
  • According to the above configuration, the infrared image is displayed only when it includes a face portion used for user authentication, and is not displayed when it does not. In other words, an infrared image captured when user authentication is to be performed (that is, an infrared image in which a person's face is captured) is displayed, while an infrared image captured without the intention of performing user authentication is not displayed. Therefore, it is possible to prevent an inappropriate infrared image from being displayed on the display unit.
  • An inappropriate infrared image is, for example, an image through which the clothes of a subject can be seen.
  • In addition, since the infrared image is not displayed on the display unit when the face portion used for user authentication is not included, the user can recognize that an infrared image necessary for performing user authentication (that is, an infrared image in which the face portion is captured) has not been captured, which assists the user in performing user authentication smoothly.
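The display rule of aspect 1, together with the substitute image of aspect 2, can be sketched as a simple gate in front of the display unit. The predicate and the image values below are placeholders introduced for this illustration only.

```python
def choose_display_image(infrared_image, contains_face_part, substitute_image=None):
    """Return the image to show on the display unit.

    contains_face_part plays the role of the image determination unit.
    When the face portion used for authentication is absent, the
    infrared image is withheld, and a predetermined substitute image
    (as in aspect 2) is shown in its place if one is provided.
    """
    if contains_face_part(infrared_image):
        return infrared_image
    return substitute_image
```

Withholding the frame at this single choke point guarantees that no infrared image reaches the display unless the determination step has approved it.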
  • In the user authentication device according to aspect 2 of the present invention, in the aspect 1, the display control unit may display a predetermined image different from the infrared image on the display unit when it is determined that the face portion is not included in the infrared image.
  • According to the above configuration, when the face portion to be compared in the user authentication is not included in the infrared image, a predetermined image different from the infrared image is displayed on the display unit instead of the infrared image. Thereby, compared to a case where nothing is displayed on the display unit when it is determined that the face portion is not included in the infrared image, the user can recognize more clearly that an infrared image necessary for user authentication has not been captured, and can perform user authentication more smoothly.
  • In the user authentication device according to aspect 3 of the present invention, in the aspect 1 or 2, the display control unit may display, on the display unit, a positioning image indicating the position of the face portion suitable for performing user authentication.
  • According to the above configuration, since the positioning image indicating the position of the face portion suitable for user authentication is displayed, the user can, by checking the positioning image and the infrared image, recognize the position suitable for performing user authentication and adjust the positional relationship between the face portion and the user authentication device to a position suitable for performing user authentication.
  • The user authentication device according to aspect 4 of the present invention is the user authentication device according to any one of aspects 1 to 3, wherein the display control unit may display only the face portion on the display unit when it is determined that the face portion is included.
  • According to the above configuration, when the face portion to be compared in the user authentication is included in the infrared image, only the face portion is displayed on the display unit. As a result, even if a person's face and the part below the neck are shown in the infrared image at the same time, the part below the neck is not displayed, so that an image showing the subject's clothes can be prevented from being displayed on the display unit.
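Displaying only the face portion, as in aspect 4, amounts to cropping the captured frame to the detected face region before it reaches the display unit, so the part below the neck is never shown. The row-major list representation of the image and the bounding-box format are assumptions for this sketch.

```python
def crop_to_face(image, face_box):
    """Crop a row-major 2-D image (list of rows) to the face bounding box.

    face_box is (top, left, height, width) as reported by a face
    detector; when no face was detected (face_box is None), nothing is
    returned for display.
    """
    if face_box is None:
        return None
    top, left, h, w = face_box
    return [row[left:left + w] for row in image[top:top + h]]
```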
  • The user authentication device according to aspect 5 of the present invention is the user authentication device according to any one of aspects 1 to 4, wherein the face portion is the iris of the user, and the image determination unit may determine whether or not the user's eyes are included in the infrared image.
  • According to the above configuration, it is determined whether or not the user's eyes are included in the infrared image, and the infrared image is displayed on the display unit only when they are included. Thereby, the user can perform user authentication by the simple method of capturing an infrared image so that his or her eyes are shown. In addition, since user authentication is performed by collation with respect to the iris, of which no two identical instances exist among living bodies, a user authentication device with high security can be realized.
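The specification states only that iris feature points are compared. A common concrete realization, shown here purely as an illustrative assumption (not taken from the source), compares binary iris codes by normalized Hamming distance and accepts the user when the distance falls below a threshold.

```python
def hamming_distance(code_a, code_b):
    """Fraction of differing bits between two equal-length iris codes."""
    if len(code_a) != len(code_b):
        raise ValueError("iris codes must have equal length")
    diff = sum(a != b for a, b in zip(code_a, code_b))
    return diff / len(code_a)

def irises_match(code_a, code_b, threshold=0.32):
    """Accept the user when the codes are sufficiently close.

    The 0.32 threshold is an assumed value typical of iris-code
    matching, not one given in the specification.
    """
    return hamming_distance(code_a, code_b) <= threshold
```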
  • A control method for a user authentication device according to aspect 6 of the present invention is a control method for a user authentication device that performs user authentication using an infrared image obtained by imaging at least a part of a user's face, and includes: an image determination step (step S4) of determining whether or not the infrared image includes a face portion used for the user authentication; and a display control step (steps S5 and S9) of displaying the infrared image on the display unit when it is determined that the face portion is included, and not displaying the infrared image on the display unit when it is determined that the face portion is not included.
  • The user authentication device according to each aspect of the present invention may be realized by a computer. In this case, a control program for the user authentication device that realizes the user authentication device by the computer by causing the computer to operate as each unit (software element) included in the user authentication device, and a computer-readable recording medium on which the control program is recorded, also fall within the scope of the present invention.

Abstract

The purpose of the present invention is, when carrying out user authentication using an infrared image, to prevent an inappropriate image from being displayed on a display unit. A portable terminal (1) comprises an eye detection unit (103) which assesses whether both of a person's eyes are included in a captured infrared image, and a display screen generation unit (105) which, if it has been assessed that both of the person's eyes are included in the infrared image, causes a display unit (13) to display the infrared image, and, if it has been assessed that both of the person's eyes are not included in the infrared image, causes the display unit (13) not to display the infrared image.

Description

User authentication apparatus, control method for user authentication apparatus, control program, and recording medium
 The present invention relates to a user authentication device that performs user authentication using an infrared image obtained by imaging at least a part of a user's face.
 A technique for user authentication using an image (infrared image) obtained by irradiating a subject with light in the infrared region and capturing the subject is known as a conventional technique. For example, Patent Document 1 below discloses an infrared face authentication device that performs face recognition using an infrared image. The infrared face authentication device displays a captured infrared image on a display unit in order to assist adjustment of the positional relationship between the user's face and the device.
Japanese Patent Publication "JP 2008-181468 A (published on August 7, 2008)"
 However, in the conventional technology described above, all captured infrared images are displayed on the display unit. That is, an infrared image captured without the user's intention to perform user authentication is also displayed on the display unit. Therefore, the conventional technique has a problem in that an inappropriate infrared image is displayed on the display unit. For example, in the related art, an infrared image through which the subject's clothing can be seen is displayed on the display unit.
 In particular, when the conventional technology is applied to a mobile terminal, the user of the mobile terminal may maliciously image another person and view an infrared image through which the subject's clothing can be seen, so the above problem becomes even more prominent.
 The present invention has been made in view of the above problems, and an object of the present invention is to realize a user authentication device that prevents an inappropriate image from being displayed on the display unit when user authentication is performed using an infrared image.
 In order to solve the above problems, a user authentication device according to an aspect of the present invention is a user authentication device that performs user authentication using an infrared image obtained by imaging at least a part of a user's face, and includes: an image determination unit that determines whether or not the infrared image includes a face portion used for the user authentication; and a display control unit that displays the infrared image on a display unit when it is determined that the face portion is included, and does not display the infrared image on the display unit when it is determined that the face portion is not included.
 In order to solve the above problems, a control method for a user authentication device according to an aspect of the present invention is a control method for a user authentication device that performs user authentication using an infrared image obtained by imaging at least a part of a user's face, and includes: an image determination step of determining whether or not the infrared image includes a face portion used for the user authentication; and a display control step of displaying the infrared image on a display unit when it is determined that the face portion is included, and not displaying the infrared image on the display unit when it is determined that the face portion is not included.
 According to one aspect of the present invention, when user authentication is performed using an infrared image, it is possible to prevent an inappropriate image from being displayed on the display unit.
FIG. 1 is a block diagram illustrating an example of the main configuration of the mobile terminal according to Embodiment 1.
FIG. 2 is a transition diagram of images captured using the mobile terminal shown in FIG. 1 and a transition diagram of images displayed on the mobile terminal.
FIG. 3 is a flowchart showing an example of the flow of processing executed by the mobile terminal shown in FIG. 1.
FIG. 4 is a block diagram illustrating an example of the main configuration of the mobile terminal according to Embodiment 2.
FIG. 5 is a transition diagram of images captured using the mobile terminal shown in FIG. 4 and a transition diagram of images displayed on the mobile terminal.
Embodiment 1
 Hereinafter, Embodiment 1 of the present invention will be described in detail with reference to FIGS. 1 to 3.
(Outline of the present invention)
 First, the outline of the present invention will be described with reference to FIG. 2. In the present embodiment, an example in which the user authentication device according to the present invention is applied to the mobile terminal 1 will be described. However, the application of the present invention is not limited to a mobile terminal; it can be applied to any device having a function of determining whether or not a subject is a pre-registered user, using an image obtained by irradiating the subject with light in the infrared region and capturing the subject.
 FIG. 2 is a transition diagram of images captured using the mobile terminal 1 and a transition diagram of images displayed on the mobile terminal 1. Note that the mobile terminal 1 is not limited to a smartphone as illustrated, and may be a feature phone, a tablet terminal, or the like.
 The mobile terminal 1 (user authentication device) according to the present embodiment has a function of irradiating a subject with light in the infrared region and imaging the subject. Hereinafter, an image captured by this function is referred to as an infrared image. In order to realize this function, the mobile terminal 1 includes an infrared irradiation unit 11 that irradiates a subject with infrared rays and an infrared imaging unit 12 that captures infrared images (see FIG. 2(b) and elsewhere). The infrared irradiation unit 11 according to the present embodiment irradiates light in the near-infrared region of approximately 700 nm to 2500 nm as the infrared light.
 The mobile terminal 1 compares the iris feature points shown in the infrared image with the pre-registered iris feature points of the user of the mobile terminal 1 (performs iris authentication), and unlocks the mobile terminal 1 if they match. When performing iris authentication, the mobile terminal 1 displays the captured infrared image on the display unit 13 so that the user can adjust the positional relationship between the user's eyes and the mobile terminal 1, specifically, the distance between the user's eyes and the mobile terminal 1 and the inclination of the mobile terminal 1, to an appropriate one. For this reason, the infrared irradiation unit 11 and the infrared imaging unit 12 are provided on the same surface as the display unit 13 of the mobile terminal 1, as shown in FIG. 2(b). Note that the infrared irradiation unit 11 and the infrared imaging unit 12 only need to be provided on the same surface as the display unit 13, and their positions on that surface are not particularly limited.
 However, the mobile terminal 1 displays the infrared image on the display unit 13 only when both eyes of a person (the face portion to be compared, that is, the face portion necessary for user authentication) are shown in the captured infrared image. This will be described in more detail with reference to FIG. 2. FIGS. 2(a), (c), and (e) show infrared images 21a, 21b, and 21c captured by the mobile terminal 1, respectively. FIGS. 2(b), (d), (f), and (g) show screens displayed on the display unit 13 of the mobile terminal 1. Hereinafter, when there is no need to distinguish between the infrared images 21a, 21b, and 21c, they are collectively referred to as the "infrared image 21". In the example of FIG. 2, it is assumed that the person shown in the infrared image 21 is the user of the mobile terminal 1 (in other words, a person who has registered iris feature points in the mobile terminal 1 in advance).
 As shown in FIG. 2(a), both eyes of a person are not shown in the infrared image 21a captured by the mobile terminal 1. In this case, as shown in FIG. 2(b), the mobile terminal 1 causes the display unit 13 to display a substitute image 141, which is an image displayed in place of the infrared image 21. Furthermore, the mobile terminal 1 causes the display unit 13 to display a text 31 prompting the user to turn his or her face toward the imaging device (camera) that captures the infrared image 21. In other words, the text 31 prompts the user to adjust the distance between the user's eyes and the mobile terminal 1 and the inclination of the mobile terminal 1 so that both eyes appear in the infrared image 21. This allows the user to recognize that his or her eyes (more specifically, the irises of both eyes) are not shown in the infrared image 21 and to adjust the positional relationship between the user's eyes and the mobile terminal 1. It also prevents use for inappropriate purposes other than authentication (for example, capturing an image through which another person's clothes can be seen and displaying it on the display unit 13). Note that the substitute image 141 and the text 31 shown in FIG. 2 are examples, and the present invention is not limited to the illustrated examples. Instead of, or in addition to, displaying the text on the display unit 13, voice guidance prompting the user to turn his or her face toward the imaging device may be output from a speaker (not shown).
 When the mobile terminal 1 executes an application for performing user authentication, it displays an alignment image 142 (positioning image) on the display unit 13 as shown in FIG. 2(b), regardless of whether or not both eyes of a person are shown in the infrared image 21. Details of the alignment image 142 will be described later.
 On the other hand, when both eyes of a person are shown in the infrared image 21b captured by the mobile terminal 1 as shown in FIG. 2(c), the mobile terminal 1 causes the display unit 13 to display an infrared image 22b, which is an image obtained by performing image processing on the infrared image 21b, as shown in FIG. 2(d). Then, the mobile terminal 1 starts user authentication. At this time, as shown in FIG. 2(d), a text 32 indicating that user authentication is being performed may be displayed on the display unit 13. The infrared image 22b is displayed behind the alignment image 142, as shown in FIG. 2(d). Hereinafter, the infrared images displayed on the display unit 13 may be collectively referred to as the "infrared image 22". The text 32 shown in FIG. 2 is an example, and the present invention is not limited to the illustrated example.
 Here, the alignment image 142 will be described. The alignment image 142 is an image that, when displayed on the display unit 13 together with the infrared image 22, allows the user to recognize the eye positions suitable for performing user authentication.
 The alignment image 142 has circular portions 421a and 421b that are displayed so that the user can recognize the infrared image 22. Hereinafter, when there is no need to distinguish between the circular portions 421a and 421b, they are collectively referred to as the "circular portions 421". The circular portions 421 are arranged in the alignment image 142 so that, when the positional relationship between the user's eyes (irises) and the mobile terminal 1 is suitable for performing user authentication, each of the user's eyes shown in the infrared image 22 is positioned within one of the circular portions 421. Here, as shown in FIG. 2(d), when the user's eyes in the infrared image 22 are not each positioned within a circular portion 421, the lock may not be released even if the person shown in the infrared image 22 is the user of the mobile terminal 1. When the lock is not released, the user performs the following adjustment to unlock the mobile terminal 1 after the infrared image 22 is displayed on the display unit 13. That is, while checking the infrared image 22b and the alignment image 142, the user adjusts the positional relationship between his or her eyes and the mobile terminal 1 so that each eye is positioned within a circular portion 421. When the infrared image 21c shown in FIG. 2(e) is captured as a result of the adjustment, a screen in which each of the user's eyes is positioned within a circular portion 421 is displayed, as shown in FIG. 2(f).
 By displaying the alignment image 142 together with the infrared image 22 on the display unit 13 in this way, the mobile terminal 1 can make the user recognize the eye positions suitable for user authentication (in other words, the positional relationship between the user's eyes (irises) and the mobile terminal 1).
 When the feature points of the user's irises shown in the infrared image 21 (for example, the infrared image 21c) match the iris feature points registered in advance in the mobile terminal 1 through user authentication, the mobile terminal 1 is unlocked. The mobile terminal 1 then causes the display unit 13 to display a screen indicating that the lock has been released (for example, the menu screen shown in FIG. 2(g)). Note that the lock of the mobile terminal 1 is released if the iris feature points match, even when the user's eyes in the infrared image 22 are not each positioned within a circular portion 421 as shown in FIG. 2(d).
 The alignment image 142 shown in FIG. 2 is an example, and the present invention is not limited to this example. Specifically, the alignment image 142 may be any image that allows the user to adjust the positional relationship between his or her eyes and the mobile terminal 1 so that the iris feature points of the person shown in the infrared image can be accurately collated with the pre-registered iris feature points. For example, it may be an image that does not display the region of the infrared image other than the vicinity of the eyes, or an image that displays the facial region other than the vicinity of the eyes in a lightly transparent manner.
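Deciding whether the eyes shown in the infrared image 22 sit inside the circular portions 421 reduces to a point-in-circle test between the detected eye centers and the circle centers of the alignment image. The pixel coordinates and radius below are invented for this example; the specification does not give concrete geometry.

```python
import math

def eye_in_circle(eye_center, circle_center, radius):
    """True when the detected eye center lies within the circle."""
    dx = eye_center[0] - circle_center[0]
    dy = eye_center[1] - circle_center[1]
    return math.hypot(dx, dy) <= radius

def eyes_aligned(eye_centers, circle_centers, radius):
    """True when each eye is inside its corresponding circular portion 421."""
    return all(
        eye_in_circle(eye, circle, radius)
        for eye, circle in zip(eye_centers, circle_centers)
    )
```

Note that, as the text above states, alignment is only guidance: the terminal still unlocks on a feature-point match even when this test fails.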
(Main configuration of the mobile terminal 1)
 Next, the main configuration of the mobile terminal 1 will be described with reference to FIG. 1. FIG. 1 is a block diagram illustrating an example of the main configuration of the mobile terminal 1. Note that FIG. 1 shows only those members of the mobile terminal 1 that are highly relevant to the present invention. Therefore, illustration and detailed description of members that are less relevant to the present invention (for example, members that receive user operations) are omitted.
 As illustrated, the mobile terminal 1 includes a control unit 10, an infrared irradiation unit 11, an infrared imaging unit 12, a display unit 13, and a storage unit 14. The control unit 10 centrally controls each unit of the mobile terminal 1. The infrared irradiation unit 11 irradiates a subject with infrared rays in order to capture an infrared image. The infrared imaging unit 12 captures infrared images. The display unit 13 displays a screen corresponding to the result of processing executed by the control unit 10. The storage unit 14 stores various data used by the mobile terminal 1. The infrared imaging unit 12 is a so-called camera, and may be a camera that captures only infrared images (an infrared camera). Alternatively, the infrared imaging unit 12 may be a camera that can switch, in accordance with instructions from the control unit 10, between a mode for capturing normal images (images captured by irradiating the subject with visible light) and a mode for capturing infrared images.
 The control unit 10 includes an imaging processing unit 101, an image processing unit 102, an eye detection unit 103 (image determination unit), an authentication unit 104, and a display screen generation unit 105 (display control unit). The storage unit 14 stores at least the substitute image 141, the alignment image 142, and authentication data 143. Since the substitute image 141 and the alignment image 142 have already been described, their description is omitted here.
 The imaging processing unit 101 drives and controls the infrared irradiation unit 11 and the infrared imaging unit 12. Specifically, when an application for performing user authentication is executed by a user operation on an operation unit (not shown), the imaging processing unit 101 drives the infrared irradiation unit 11 (d1 in FIG. 1) to irradiate infrared rays. When the application is executed, the imaging processing unit 101 also drives the infrared imaging unit 12 (d2 in FIG. 1) to capture infrared images. Controlled by the imaging processing unit 101, the infrared imaging unit 12 captures an infrared image at predetermined intervals (for example, every 33 milliseconds, i.e., 30 fps) and outputs it to the image processing unit 102.
 The imaging processing unit 101 may also acquire a detection result from a sensor (not shown) that detects infrared light, and determine whether the device is sufficiently irradiated with infrared light from an external source (for example, the sun). When it determines that infrared irradiation is sufficient, the subject can be regarded as sufficiently irradiated as well, and the infrared irradiation unit 11 need not irradiate the subject. In other words, the imaging processing unit 101 may be configured to drive the infrared irradiation unit 11 and irradiate the subject with infrared light only when the amount of external infrared irradiation on the device is below a predetermined threshold.
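This threshold decision can be written as a minimal sketch. The function name, the sensor units, and the threshold value below are illustrative assumptions, not values from the specification:

```python
# Hypothetical sketch of the ambient-infrared check described above.
# The threshold value and sensor units are assumptions for illustration.
AMBIENT_IR_THRESHOLD = 100.0  # arbitrary sensor units

def should_drive_illuminator(ambient_ir_level: float) -> bool:
    """Drive the infrared irradiation unit 11 only when external infrared
    irradiation on the device is below the predetermined threshold."""
    return ambient_ir_level < AMBIENT_IR_THRESHOLD
```

When ambient infrared is at or above the threshold, the illuminator stays off and the subject is assumed to be lit by the external source alone.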
 The image processing unit 102 performs image processing on the acquired infrared image for use in iris authentication. Specifically, the image processing unit 102 performs processing such as noise reduction and edge enhancement, and then outputs the processed infrared image to the eye detection unit 103. The image processing described above is only an example; the image processing performed by the image processing unit 102 is not limited to it.
 The eye detection unit 103 determines whether both of a person's eyes are included in the infrared image. Specifically, the eye detection unit 103 first detects a face in the acquired infrared image. If a face is detected, it crops the face region and detects the eye regions within the cropped face region. If the eye regions are detected, it determines that both eyes are included in the infrared image and outputs the acquired infrared image to the authentication unit 104 (d3 in FIG. 1); it also outputs the acquired infrared image and the determination result to the display screen generation unit 105 (d4 in FIG. 1). If the eye regions cannot be detected, it determines that both eyes are not included in the infrared image and outputs only the determination result to the display screen generation unit 105. Existing techniques can be used to detect the face region and the eye regions.
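The two-stage flow (face detection, then eye detection within the cropped face region) can be sketched as follows. Since the specification only says that existing techniques may be used, the detector callables are placeholders for whatever face and eye detectors are actually chosen:

```python
from typing import Callable, List, Optional, Tuple

Rect = Tuple[int, int, int, int]  # (x, y, width, height)

def contains_both_eyes(
    image: object,
    detect_face: Callable[[object], Optional[Rect]],
    detect_eyes: Callable[[object, Rect], List[Rect]],
) -> bool:
    """Return True only when a face is found and at least two eye
    regions are detected inside the cropped face region."""
    face = detect_face(image)
    if face is None:
        return False  # no face, so both eyes cannot be judged present
    eyes = detect_eyes(image, face)
    return len(eyes) >= 2
```

The eye detection unit 103 would then route the image to the authentication unit 104 when this returns True, and send only the negative determination result otherwise.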
 The authentication unit 104 compares (collates) iris feature points included in the infrared image with iris feature points registered in advance, and determines whether the two sets of feature points match. Existing techniques can be used for the iris authentication performed by the authentication unit 104. Specifically, upon acquiring an infrared image, the authentication unit 104 reads the authentication data 143 from the storage unit 14. The authentication data 143 according to the present embodiment is the information, described above, indicating the pre-registered iris feature points. Any information indicating iris feature points may be used: for example, an infrared image captured in advance, an image of only the eye region cropped from such an infrared image, or a numerical representation of the iris feature points. The authentication unit 104 compares the iris feature points included in the acquired infrared image with the iris feature points indicated by the read authentication data 143 and determines whether they match; when it determines that they match, it notifies the display screen generation unit 105 accordingly.
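The specification leaves the comparison technique open ("existing techniques can be used"). One common existing technique, assumed here purely for illustration, represents each iris as a fixed-width binary code and accepts a match when the fraction of differing bits (the normalized Hamming distance) is below a threshold; the code width and threshold below are illustrative values, not from the specification:

```python
def hamming_distance(code_a: int, code_b: int, bits: int) -> float:
    """Fraction of differing bits between two fixed-width iris codes."""
    differing = bin((code_a ^ code_b) & ((1 << bits) - 1)).count("1")
    return differing / bits

def irises_match(code_a: int, code_b: int, bits: int = 64,
                 threshold: float = 0.32) -> bool:
    # 0.32 is an illustrative acceptance threshold, not a value
    # taken from the specification.
    return hamming_distance(code_a, code_b, bits) <= threshold
```

Here `code_b` would play the role of the pre-registered feature points indicated by the authentication data 143.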
 The display screen generation unit 105 generates screens to be displayed on the display unit 13 and causes the display unit 13 to display them. Specifically, when the application for performing user authentication is launched, the display screen generation unit 105 reads the substitute image 141, the alignment image 142, and the text 31 (not shown in FIG. 1) from the storage unit 14 and displays them on the display unit 13.
 When the display screen generation unit 105 receives a determination result indicating that both eyes are included in the infrared image, it causes the display unit 13 to display the infrared image received together with that determination result, along with the text 32 (not shown in FIG. 1). As a result, the screens shown in FIG. 2(d) and FIG. 2(f) are displayed on the display unit 13. Each time the display screen generation unit 105 receives an infrared image from the eye detection unit 103 together with a determination result indicating that both eyes are included, it updates the infrared image displayed on the display unit 13.
 Conversely, when the display screen generation unit 105 receives a determination result indicating that both eyes are not included in the infrared image, it replaces the infrared image 22 and the text 32 displayed on the display unit 13 with the substitute image 141 and the text 31. As a result, the screen shown in FIG. 2(b) is displayed on the display unit 13.
 When displaying an infrared image on the display unit 13, the display screen generation unit 105 may perform image processing on the infrared image. For example, it may identify the region of the acquired infrared image that corresponds to the circular portion 421 and display only that region on the display unit 13.
 When notified by the authentication unit 104 that the compared iris feature points match, the display screen generation unit 105 causes the display unit 13 to display the menu screen (the screen shown in FIG. 2(g)).
 (Flow of processing executed by the mobile terminal 1)
 Next, the flow of processing executed by the mobile terminal 1 will be described with reference to FIG. 3. FIG. 3 is a flowchart illustrating an example of the flow of processing executed by the mobile terminal 1.
 When the application for performing user authentication is launched by a user operation, the imaging processing unit 101 drives the infrared irradiation unit 11 and the infrared imaging unit 12. The display screen generation unit 105 then reads the substitute image 141 and the alignment image 142 from the storage unit 14 and displays them on the display unit 13 (step S1; hereinafter, "step" is omitted).
 Subsequently, the image processing unit 102 acquires the infrared image captured by the infrared imaging unit 12 (S2), performs image processing on it (S3), and outputs it to the eye detection unit 103.
 The eye detection unit 103 attempts to detect both eyes in the acquired infrared image (S4, image determination step). If both eyes cannot be detected (NO in S4), it determines that both eyes are not included in the infrared image and outputs the determination result to the display screen generation unit 105.
 Upon receiving the determination result indicating that both eyes are not included in the infrared image, the display screen generation unit 105 reads the substitute image 141 from the storage unit 14 and displays it on the display unit 13 (S9, display control step). The process shown in FIG. 3 then returns to S2. In other words, until both eyes are detected in S4, the display unit 13 shows the substitute image 141 rather than the infrared image.
 Conversely, if the eye detection unit 103 detects both eyes (YES in S4), it determines that both eyes are included in the infrared image and outputs the infrared image to the authentication unit 104; it also outputs the determination result and the infrared image to the display screen generation unit 105. Upon receiving the determination result and the infrared image, the display screen generation unit 105 displays the acquired infrared image on the display unit 13 (S5, display control step). The authentication unit 104 then starts user authentication (S6): specifically, it compares the iris feature points included in the acquired infrared image with the iris feature points stored in the storage unit 14 (the feature points indicated by the authentication data 143). S5 and S6 need not be performed in the order shown: S6 may precede S5, or the two may be performed simultaneously.
 By performing the above comparison, the authentication unit 104 determines whether the two sets of feature points match, that is, whether the person shown in the infrared image is the user of the mobile terminal (S7). If the person shown in the infrared image is identified as the user of the mobile terminal (YES in S7), the authentication unit 104 notifies the display screen generation unit 105 accordingly, and upon receiving the notification, the display screen generation unit 105 displays the menu screen on the display unit 13 (S8). The process shown in FIG. 3 then ends.
 Conversely, if the person shown in the infrared image cannot be identified as the user of the mobile terminal (NO in S7), the process shown in FIG. 3 returns to S2.
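The S1–S9 flow of FIG. 3 can be sketched as a loop, under the assumption that each step is represented by an injected callable (the `max_frames` guard is added only so the sketch terminates and is not part of the flowchart):

```python
from typing import Callable

def authentication_loop(
    capture: Callable[[], object],           # S2: acquire an infrared image
    preprocess: Callable[[object], object],  # S3: noise reduction etc.
    eyes_detected: Callable[[object], bool], # S4: image determination step
    show_substitute: Callable[[], None],     # S9: display the substitute image
    show_infrared: Callable[[object], None], # S5: display the infrared image
    authenticate: Callable[[object], bool],  # S6/S7: iris comparison
    show_menu: Callable[[], None],           # S8: display the menu screen
    max_frames: int = 100,                   # guard so the sketch terminates
) -> bool:
    for _ in range(max_frames):
        image = preprocess(capture())
        if not eyes_detected(image):
            show_substitute()
            continue  # NO in S4: back to S2
        show_infrared(image)
        if authenticate(image):
            show_menu()
            return True
        # NO in S7: back to S2
    return False
```

A frame without eyes shows the substitute image; a frame with eyes is shown live, and a successful match ends the loop on the menu screen.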
 [Modification of Embodiment 1]
 In Embodiment 1 described above, user authentication is started only when both eyes are detected in the infrared image 21. However, the condition for starting user authentication is not limited to this example. For instance, user authentication may be started when only one eye is detected. For example, when the eye detection unit 103 detects an eye region in the left half of the face region cropped from the acquired infrared image, it may determine that the infrared image includes an eye (the left eye). Similarly, when it detects an eye region in the right half of the cropped face region, it may determine that the infrared image includes an eye (the right eye). That is, the eye detection unit 103 may be configured to determine that an eye is included in the infrared image when it detects an eye region in at least one of the left half and the right half of the cropped face region. In this case, the eye detection unit 103 may output the acquired infrared image to the authentication unit 104 and output the acquired infrared image and the determination result to the display screen generation unit 105.
 When only one eye is included in the infrared image, the authentication unit 104 according to this modification compares the iris feature points of that eye with the pre-registered iris feature points of the right eye or the left eye, and determines whether the two sets of feature points match.
 Performing iris authentication upon detecting at least one of the left and right eye regions in this way allows user authentication to be carried out even when only one eye's iris feature points can be acquired and collated, for example when the user is wearing an eye patch over one eye.
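The half-region rule of this modification can be sketched as follows, assuming eye positions are given as rectangles relative to the cropped face region (the function name and the straddling case are illustrative assumptions):

```python
from typing import Optional, Tuple

Rect = Tuple[int, int, int, int]  # (x, y, width, height) within the face region

def classify_eye_side(face_width: int, eye: Rect) -> Optional[str]:
    """Return 'left' when the eye region lies in the left half of the
    face region, 'right' when it lies in the right half (the rule in
    this modification), and None when it straddles both halves."""
    x, _, w, _ = eye
    if x + w <= face_width / 2:
        return "left"
    if x >= face_width / 2:
        return "right"
    return None
```

A non-None result would let the authentication unit pick which pre-registered eye (left or right) to collate against.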
 [Embodiment 2]
 Another embodiment of the present invention is described below with reference to FIGS. 4 and 5. For convenience of explanation, members having the same functions as those described in the above embodiment are given the same reference numerals, and their description is omitted.
 In Embodiment 1 described above, user authentication is performed by iris authentication using an infrared image. However, user authentication using an infrared image is not limited to iris authentication. The present embodiment describes a mobile terminal 1a that performs user authentication by face recognition and displays an infrared image on the display unit 13 only when the facial parts used for face recognition (for example, eyes, nose, mouth, ears, and contour) are included in the infrared image. Face recognition in the present embodiment is described as comparing the contour of the face, the positions of both eyes, the positions of the nostrils, and the position of the mouth shown in the infrared image with a pre-registered contour and positions of both eyes, nostrils, and mouth; however, face recognition is not limited to this example.
 (Main configuration of the mobile terminal 1a)
 First, the main configuration of the mobile terminal 1a will be described. FIG. 4 is a block diagram illustrating an example of the main configuration of the mobile terminal 1a.
 As illustrated, the mobile terminal 1a according to the present embodiment differs from the mobile terminal 1 described in Embodiment 1 in that it includes a control unit 10a and a storage unit 14a in place of the control unit 10 and the storage unit 14. The control unit 10a differs from the control unit 10 in that it includes a collation target detection unit 106 in place of the eye detection unit 103, and an authentication unit 104a in place of the authentication unit 104. The storage unit 14a differs from the storage unit 14 in that it stores an alignment image 142a in place of the alignment image 142, and authentication data 143a in place of the authentication data 143.
 The collation target detection unit 106 determines whether the infrared image includes the parts of a person's face that are used for face recognition. The collation target detection unit 106 according to the present embodiment determines whether the infrared image includes the contour of the face, both eyes, the nose, and the mouth. Specifically, the collation target detection unit 106 first detects a face in the acquired infrared image. If a face is detected, it crops the face region and detects the eye, nose, and mouth regions within the cropped face region. If all four of these regions are detected, it determines that the infrared image includes the parts used for face recognition and outputs the acquired infrared image to the authentication unit 104a (d3 in FIG. 4); it also outputs the acquired infrared image and the determination result to the display screen generation unit 105 (d4 in FIG. 4). If not all four regions can be detected, it determines that the infrared image does not include the parts used for face recognition and outputs only the determination result to the display screen generation unit 105. Existing techniques can be used to detect each region.
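The four-region check can be sketched in the same style as the eye-detection flow of Embodiment 1; the detector callables are placeholders for the existing techniques the specification refers to:

```python
from typing import Callable, Dict, Optional, Tuple

Rect = Tuple[int, int, int, int]  # (x, y, width, height)

def contains_face_parts(
    image: object,
    detect_face: Callable[[object], Optional[Rect]],
    part_detectors: Dict[str, Callable[[object, Rect], Optional[Rect]]],
) -> bool:
    """True only when a face is detected and every required part
    (e.g. eyes, nose, mouth) is found inside the cropped face region."""
    face = detect_face(image)
    if face is None:
        return False
    return all(
        detector(image, face) is not None
        for detector in part_detectors.values()
    )
```

If any single part detector fails, the whole determination is negative and only the determination result is forwarded to the display screen generation unit 105.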
 Unlike the authentication unit 104 described in Embodiment 1, the authentication unit 104a collates the contour of the face, the positions of both eyes, the positions of the nostrils, and the position of the mouth shown in the infrared image with a pre-registered contour and positions of both eyes, nostrils, and mouth, and determines whether all of them match. Specifically, upon acquiring an infrared image, the authentication unit 104a reads the authentication data 143a from the storage unit 14a. The authentication data 143a according to the present embodiment is information indicating the facial contour, the positions of both eyes, the positions of the nostrils, and the position of the mouth described above. Any information indicating these may be used: for example, an infrared image captured in advance, or a numerical representation of the facial contour and the positions of both eyes, the nostrils, and the mouth. If the facial parts used for face recognition change, the information included in the authentication data 143a changes accordingly. For example, if face recognition in the present embodiment compares the positions of both ears instead of the position of the mouth, the authentication data 143a becomes information indicating the facial contour, the positions of both eyes, the positions of the nostrils, and the positions of both ears.
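One way to realize the "all must match" comparison is to store each landmark as normalized coordinates within the face region and accept only when every observed position lies within a tolerance of its enrolled position. The landmark representation and the tolerance value below are illustrative assumptions, not part of the specification:

```python
from typing import Dict, Tuple

Point = Tuple[float, float]  # normalized (x, y) within the face region

def landmarks_match(
    observed: Dict[str, Point],
    enrolled: Dict[str, Point],
    tolerance: float = 0.05,  # illustrative tolerance, not from the spec
) -> bool:
    """Every enrolled landmark (e.g. eyes, nostrils, mouth) must be
    present in the observed image and within the tolerance for the
    collation to succeed."""
    for name, (ex, ey) in enrolled.items():
        if name not in observed:
            return False
        ox, oy = observed[name]
        if abs(ox - ex) > tolerance or abs(oy - ey) > tolerance:
            return False
    return True
```

Swapping the mouth landmark for two ear landmarks, as in the example above, only changes the keys of the enrolled dictionary.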
 Unlike the alignment image 142 described in Embodiment 1, the alignment image 142a is an image for making the user aware of the appropriate positional relationship between the user's face and the mobile terminal 1a so that face recognition can be performed accurately. A specific example of this image will be described later.
 (Screen examples in the present embodiment)
 Next, examples of screens displayed on the display unit 13 in the present embodiment will be described with reference to FIG. 5, which shows transition diagrams of images captured by the mobile terminal 1a and of images displayed on the mobile terminal 1a.
 As described above, the collation target detection unit 106 according to the present embodiment determines whether the infrared image includes the contour of the face, both eyes, the nose, and the mouth. When the collation target detection unit 106 determines that these facial parts are not included, the display screen generation unit 105 according to the present embodiment causes the display unit 13 to display the substitute image 141. Specifically, when the collation target detection unit 106 acquires an infrared image 21d such as that shown in FIG. 5(a), the display screen generation unit 105 causes the display unit 13 to display the substitute image 141 and the text 31 as shown in FIG. 5(c), because the infrared image 21d does not include the facial parts described above. This display makes the user aware that the facial parts used for face recognition are not shown in the infrared image 21.
 Conversely, when the collation target detection unit 106 acquires an infrared image 21e such as that shown in FIG. 5(b), the display screen generation unit 105 causes the display unit 13 to display the infrared image 22e and the text 32 as shown in FIG. 5(d), because the infrared image 21e includes all of the facial parts described above. Unlike the infrared image 22 described in Embodiment 1, the infrared image 22e in the present embodiment is an image showing at least the contour of the face, both eyes, the nose, and the mouth.
 As in Embodiment 1, the display screen generation unit 105 causes the display unit 13 to display the alignment image 142a regardless of whether the facial parts used for face recognition are shown in the infrared image 21. As shown in FIGS. 5(c) and 5(d), the alignment image 142a is a rectangle formed by a broken line; it prompts the user to adjust the positional relationship between the user's face and the mobile terminal 1a so that the user's face shown in the infrared image 22 is positioned within the rectangle. The alignment image 142a may be any image that allows the user to adjust the positional relationship between the user's face and the mobile terminal 1a, and is not limited to the example of FIG. 5.
 [Modification common to the embodiments]
 In Embodiments 1 and 2 described above, the alignment image 142 and the alignment image 142a are displayed on the display unit 13 regardless of the determination results of the eye detection unit 103 and the collation target detection unit 106. However, the condition for displaying the alignment image is not limited to this example. For example, in Embodiment 1, the display screen generation unit 105 may be configured to display the alignment image 142 on the display unit 13 only when the eye detection unit 103 determines that both eyes are included in the infrared image.
 Embodiments 1 and 2 described above assume that there is no time limit on iris authentication or face recognition. However, iris authentication and face recognition may be subject to a time limit. For example, in Embodiment 1, when the authentication unit 104 acquires an infrared image, it may drive a timer (not shown) to measure the time elapsed since the infrared image was acquired. If it has not determined within a predetermined time that the iris feature points included in an acquired infrared image match the iris feature points indicated by the read authentication data 143, it notifies the imaging processing unit 101 and the display screen generation unit 105 accordingly. Upon receiving the notification, the imaging processing unit 101 stops the infrared irradiation unit 11 and the infrared imaging unit 12.
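The timer-based limit can be sketched as follows; the use of a monotonic clock and the default timeout value are assumptions for illustration:

```python
import time
from typing import Callable

def authenticate_with_timeout(
    try_match: Callable[[], bool],  # one capture-and-compare attempt
    timeout_s: float = 10.0,        # illustrative time limit, not from the spec
) -> bool:
    """Retry the feature-point comparison until it succeeds or the
    predetermined time elapses. On timeout the caller would stop the
    infrared irradiation and imaging units, as described above."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if try_match():
            return True
    return False
```

A False return corresponds to the notification that triggers stopping the infrared irradiation unit 11 and the infrared imaging unit 12.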
 When notified by the authentication unit 104 that the predetermined time has elapsed without the compared iris feature points matching, the display screen generation unit 105 may cause the display unit 13 to display a screen (not shown) for informing the user of that fact. This screen may be, for example, a screen including text indicating that user authentication has failed and a UI that lets the user choose whether to perform user authentication again. If the user chooses via this UI to perform user authentication again, the infrared irradiation unit 11 and the infrared imaging unit 12 are driven again and an infrared image is acquired.
 Although both the mobile terminal 1 and the mobile terminal 1a described above include the infrared irradiation unit 11, the infrared irradiation unit 11 is not essential to a mobile terminal according to the present invention. If a mobile terminal according to the present invention does not include the infrared irradiation unit 11, it may capture infrared images using infrared light with which the sun irradiates the subject, or using infrared light with which a separate light source irradiates the subject.
 [Embodiment 3]
 The control blocks of the mobile terminal 1 and the mobile terminal 1a (in particular, the members included in the control unit 10 and the control unit 10a) may be realized by logic circuits (hardware) formed in an integrated circuit (IC chip) or the like, or may be realized by software using a CPU (Central Processing Unit).
 In the latter case, the mobile terminal 1 and the mobile terminal 1a each include a CPU, a ROM (Read Only Memory) or storage device (referred to as a "recording medium"), a RAM (Random Access Memory), and the like. The CPU executes the instructions of a program, which is software realizing each function. The program and various data are recorded on the recording medium so as to be readable by a computer (or CPU), and the RAM loads the program. The object of the present invention is achieved by the computer (or CPU) reading the program from the recording medium and executing it. As the recording medium, a "non-transitory tangible medium" such as a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used. The program may also be supplied to the computer via any transmission medium (such as a communication network or broadcast wave) capable of transmitting the program. The present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the program is embodied by electronic transmission.
 [Summary]
 A user authentication device (mobile terminal 1, mobile terminal 1a) according to Aspect 1 of the present invention is a user authentication device that performs user authentication using an infrared image obtained by imaging at least a part of a user's face, and includes: an image determination unit (eye detection unit 103, collation target detection unit 106) that determines whether the infrared image includes the facial part used for the user authentication; and a display control unit (display screen generation unit 105) that causes a display unit (display unit 13) to display the infrared image when it is determined that the facial part is included, and does not cause the display unit to display the infrared image when it is determined that the facial part is not included.
 According to the above configuration, the infrared image is displayed only when it includes the face portion used for user authentication; when it does not, the infrared image is not displayed. As a result, only an infrared image captured when user authentication is being attempted (that is, an infrared image in which a person's face appears) is displayed; in other words, an infrared image captured without the intention of user authentication is not displayed. An inappropriate infrared image can therefore be prevented from being displayed on the display unit. An inappropriate infrared image is, for example, an image in which the subject's clothes appear transparent. Furthermore, because the infrared image is not displayed on the display unit when the face portion used for user authentication is not included, the user can recognize that an infrared image necessary for performing user authentication (that is, an infrared image showing the face portion) has not been captured, which assists the user in performing user authentication smoothly.
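 The display-control behavior of Aspect 1 can be sketched as follows. This is an illustrative outline only, not the claimed implementation: the function names and the dictionary-based image representation are assumptions made for the sketch, and a real device would run eye or collation-target detection on pixel data.

```python
def contains_auth_face_part(infrared_image):
    # Stand-in for the image determination unit (eye detection unit /
    # collation target detection unit); here we merely read a flag,
    # whereas a real device would analyze the image itself.
    return infrared_image.get("face_detected", False)

def update_display(infrared_image):
    # Display control unit of Aspect 1: return the image to show,
    # or None to suppress display of the infrared image entirely.
    if contains_auth_face_part(infrared_image):
        return infrared_image
    return None
```

 In this sketch a return value of None corresponds to the display unit not displaying the infrared image.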
 In the user authentication device according to Aspect 2 of the present invention, in Aspect 1, the display control unit may cause the display unit to display a predetermined image different from the infrared image when it is determined that the face portion is not included in the infrared image.
 According to the above configuration, when the face portion to be compared in the user authentication is not included in the infrared image, a predetermined image different from the infrared image is displayed on the display unit instead of the infrared image. Compared with displaying nothing when the face portion is determined not to be included, this allows the user to recognize more clearly that an infrared image necessary for performing user authentication has not been captured, so the user can perform user authentication more smoothly.
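 Aspect 2 replaces the blank display with a predetermined substitute image. A minimal sketch, in which the placeholder value and function name are assumptions and not part of the specification:

```python
# Predetermined image shown in place of the infrared image (assumed placeholder).
GUIDANCE_IMAGE = "please_show_your_face"

def update_display_aspect2(face_detected, infrared_image):
    # Aspect 2: when the face portion is absent, display the
    # predetermined image rather than the raw frame or a blank screen.
    return infrared_image if face_detected else GUIDANCE_IMAGE
```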
 In the user authentication device according to Aspect 3 of the present invention, in Aspect 1 or 2, the display control unit may cause the display unit to display a positioning image indicating a position of the face portion suitable for performing user authentication.
 According to the above configuration, a positioning image indicating the position of the face portion suitable for performing user authentication is displayed. By checking the positioning image against the infrared image, the user can recognize the eye position suitable for user authentication and can adjust the positional relationship between the face portion and the user authentication device to a position suitable for performing user authentication.
 In the user authentication device according to Aspect 4 of the present invention, in any one of Aspects 1 to 3, the display control unit may cause the display unit to display only the face portion when it is determined that the face portion is included.
 According to the above configuration, when the face portion to be compared in the user authentication is included in the infrared image, only that face portion is displayed on the display unit. Consequently, even if the infrared image simultaneously shows a person's face and the body below the neck, the portion below the neck is not displayed, which prevents an image in which the subject's clothes appear transparent from being displayed on the display unit.
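 Aspect 4 can be illustrated by cropping the displayed image to the detected face region. The row-list representation of the image and the function name are assumptions for illustration only:

```python
def crop_to_face(image_rows, face_top, face_bottom):
    # Keep only the pixel rows inside the detected face region
    # [face_top, face_bottom); rows below the neck are discarded and
    # therefore never reach the display unit.
    return image_rows[face_top:face_bottom]
```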
 In the user authentication device according to Aspect 5 of the present invention, in any one of Aspects 1 to 4, the face portion is an iris of the user, and the image determination unit may determine whether the infrared image includes an eye of the user.
 According to the above configuration, it is determined whether the user's eye is included in the infrared image, and the infrared image is displayed on the display unit only when the eye is included. This allows the user to perform user authentication by the simple method of capturing an infrared image in which his or her own eyes appear. Moreover, according to the above configuration, user authentication is performed by comparing irises, of which no two identical instances exist among living bodies, so a user authentication device with high security can be realized.
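 The eye determination of Aspect 5 could be realized by any detector; the specification does not prescribe a particular algorithm. Purely as an assumed toy example, a dark-pupil intensity heuristic (near-infrared iris imaging typically yields a dark pupil region) might look like this:

```python
def eye_candidate_present(pixels, dark_threshold=40, min_dark_pixels=5):
    # Crude heuristic, not the patented method: count sufficiently
    # dark pixels; a cluster of them suggests a pupil, hence an eye.
    dark = sum(1 for p in pixels if p < dark_threshold)
    return dark >= min_dark_pixels
```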
 A control method for a user authentication device according to Aspect 6 of the present invention is a method for controlling a user authentication device that performs user authentication using an infrared image obtained by imaging at least a part of a user's face, the method including: an image determination step (step S4) of determining whether the infrared image includes the face portion used for the user authentication; and a display control step (steps S5, S9) of causing a display unit to display the infrared image when it is determined that the face portion is included, and not causing the display unit to display the infrared image when it is determined that the face portion is not included. This configuration provides the same effects as the user authentication device according to Aspect 1.
 The user authentication device according to each aspect of the present invention may be realized by a computer. In this case, a control program for the user authentication device that realizes the user authentication device on a computer by causing the computer to operate as each unit (software element) of the user authentication device, and a computer-readable recording medium on which the program is recorded, also fall within the scope of the present invention.
 The present invention is not limited to the embodiments described above, and various modifications are possible within the scope of the claims; embodiments obtained by appropriately combining technical means disclosed in different embodiments are also included in the technical scope of the present invention. Furthermore, new technical features can be formed by combining the technical means disclosed in each embodiment.
   1 Portable terminal (user authentication device)
  1a Portable terminal (user authentication device)
 103 Eye detection unit (image determination unit)
 105 Display screen generation unit (display control unit)
 106 Collation target detection unit (image determination unit)
  S4 Image determination step
  S5 Display control step
  S9 Display control step

Claims (8)

  1.  A user authentication device that performs user authentication using an infrared image obtained by imaging at least a part of a user's face, the device comprising:
     an image determination unit that determines whether the infrared image includes a face portion used for the user authentication; and
     a display control unit that causes a display unit to display the infrared image when it is determined that the face portion is included, and does not cause the display unit to display the infrared image when it is determined that the face portion is not included.
  2.  The user authentication device according to claim 1, wherein the display control unit causes the display unit to display a predetermined image different from the infrared image when it is determined that the face portion is not included in the infrared image.
  3.  The user authentication device according to claim 1 or 2, wherein the display control unit causes the display unit to display a positioning image indicating a position of the face portion suitable for performing user authentication.
  4.  The user authentication device according to any one of claims 1 to 3, wherein the display control unit causes the display unit to display only the face portion when it is determined that the face portion is included.
  5.  The user authentication device according to any one of claims 1 to 4, wherein the face portion is an iris of the user, and
     the image determination unit determines whether the infrared image includes an eye of the user.
  6.  A method for controlling a user authentication device that performs user authentication using an infrared image obtained by imaging at least a part of a user's face, the method comprising:
     an image determination step of determining whether the infrared image includes a face portion used for the user authentication; and
     a display control step of causing a display unit to display the infrared image when it is determined that the face portion is included, and not causing the display unit to display the infrared image when it is determined that the face portion is not included.
  7.  A control program for causing a computer to function as the user authentication device according to claim 1, the control program causing the computer to function as the image determination unit and the display control unit.
  8.  A computer-readable recording medium on which the control program according to claim 7 is recorded.
PCT/JP2017/018443 2016-06-29 2017-05-17 User authentication device, user authentication device control method, control program, and recording medium WO2018003336A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2018524952A JP6754834B2 (en) 2016-06-29 2017-05-17 User authentication device, control method of user authentication device, control program, and recording medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016129268 2016-06-29
JP2016-129268 2016-06-29

Publications (1)

Publication Number Publication Date
WO2018003336A1 true WO2018003336A1 (en) 2018-01-04

Family

ID=60786863

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/018443 WO2018003336A1 (en) 2016-06-29 2017-05-17 User authentication device, user authentication device control method, control program, and recording medium

Country Status (2)

Country Link
JP (1) JP6754834B2 (en)
WO (1) WO2018003336A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7384772B2 2020-04-17 2023-11-21 技嘉科技股份有限公司 Face recognition device and face recognition method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005334402A (en) * 2004-05-28 2005-12-08 Sanyo Electric Co Ltd Method and device for authentication
JP2008181468A (en) * 2006-02-13 2008-08-07 Smart Wireless Kk Infrared face authentication apparatus, and portable terminal and security apparatus including the same
JP2008244753A (en) * 2007-03-27 2008-10-09 Oki Electric Ind Co Ltd Peeping prevention method and peeping prevention device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000293663A (en) * 1999-04-01 2000-10-20 Oki Electric Ind Co Ltd Individual identifying device
JP2006181012A (en) * 2004-12-27 2006-07-13 Matsushita Electric Ind Co Ltd Eye image photographing device, authentication device and guiding method
JP2009199392A (en) * 2008-02-22 2009-09-03 Oki Electric Ind Co Ltd Iris authentication method and iris authentication apparatus

Also Published As

Publication number Publication date
JP6754834B2 (en) 2020-09-16
JPWO2018003336A1 (en) 2019-03-07

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2018524952

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17819703

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17819703

Country of ref document: EP

Kind code of ref document: A1