WO2011078056A1 - Authentication device and authentication method - Google Patents


Info

Publication number
WO2011078056A1
WO2011078056A1 (PCT/JP2010/072683)
Authority
WO
WIPO (PCT)
Prior art keywords
image
authentication
information
display device
input
Prior art date
Application number
PCT/JP2010/072683
Other languages
English (en)
Japanese (ja)
Inventor
正木 賢治
Original Assignee
Konica Minolta Holdings, Inc. (コニカミノルタホールディングス株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konica Minolta Holdings, Inc.
Publication of WO2011078056A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16 Constructional details or arrangements
    • G06F 1/1613 Constructional details or arrangements for portable computers
    • G06F 1/1626 Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 User authentication
    • G06F 21/36 User authentication by graphic or iconic representation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Definitions

  • The present invention relates to an authentication device and an authentication method, and more particularly to an authentication device and an authentication method that perform authentication based on information input from the outside.
  • Japanese Patent Laid-Open No. 2005-072102 (Document 1) discloses a method of displaying a plurality of images on a screen and having the user select a specific image from among them during user authentication.
  • The present invention has been conceived in view of such circumstances, and an object thereof is to achieve both user convenience and security strength in an authentication device.
  • An authentication apparatus according to the present invention includes: a storage unit that stores an authentication image; a display control unit that causes a display device to display an image; an input unit that accepts information input for the image displayed on the display device; a generation unit that, based on the information input to the input unit, generates information designating a partial region of the image displayed on the display device; an update unit that updates the image displayed on the display device based on the information generated by the generation unit; and a determination unit that determines whether or not the image updated by the update unit matches the authentication image.
  • Preferably, the input unit accepts input of information that draws a trajectory on the display device.
  • Preferably, the generation unit generates continuous line segment information by interpolating the information input to the input unit.
  • Preferably, the display control unit causes the display device to display an image including a plurality of objects, and the generation unit generates information specifying an object included in the image displayed on the display device.
  • Preferably, the generation unit generates, as the information specifying the object, information specifying an object that is entirely contained in the designated area.
  • Preferably, the update unit updates the image by changing the image of a partial area within the image displayed on the display device.
  • Preferably, the storage unit further stores a replacement image, and the update unit updates the image displayed on the display device by replacing the partial area with the replacement image.
  • Preferably, the replacement image stored in the storage unit includes a moving image.
  • An authentication method according to the present invention is an authentication method executed in an authentication apparatus that displays an image on a display device, and includes the steps of: storing an authentication image; displaying an image on the display device; accepting information input for the image displayed on the display device; generating, based on the accepted information, information designating a partial area of the image displayed on the display device; updating the image displayed on the display device based on the generated information; and determining whether or not the updated image matches the authentication image.
  • Preferably, the step of accepting input accepts input of information that draws a trajectory on the display device.
  • Preferably, the step of generating information generates continuous line segment information by interpolating the accepted information.
  • Preferably, the step of displaying an image causes the display device to display an image including a plurality of objects, and the step of generating information generates information specifying an object included in the image displayed on the display device.
  • Preferably, the step of generating information generates, as the information specifying the object, information specifying an object that is entirely contained in the designated area.
  • Preferably, the updating step updates the image by changing the image of a partial area within the image displayed on the display device.
  • Preferably, the authentication method of the present invention further includes a step of storing a replacement image, and the updating step updates the image displayed on the display device by replacing the partial area with the replacement image.
  • Preferably, the authentication image includes a moving image.
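The claimed method steps above amount to a simple loop: display a correction image, accept edits, update the image, and compare it with the stored authentication image. The following is a minimal illustrative sketch, not the patent's implementation; the set-of-objects image model and all names are hypothetical.

```python
# Minimal sketch of the claimed authentication loop. The image model
# (a set of object names) and all function names are hypothetical.

def authenticate(auth_image, correction_image, edits):
    """Apply user edits to the correction image until it matches the
    stored authentication image; return the number of edits used,
    or None if the edits never produce a match."""
    image = set(correction_image)
    count = 0
    for op, obj in edits:                 # information input by the user
        image = image | {obj} if op == "add" else image - {obj}
        count += 1
        if image == set(auth_image):      # determination step: images match
            return count
    return None                           # never matched: not authenticated

# Example mirroring FIGS. 7-10: delete the sun, add the moon and chimney.
auth = {"house", "roof", "crescent_moon", "chimney"}
shown = {"house", "roof", "sun"}
steps = [("del", "sun"), ("add", "crescent_moon"), ("add", "chimney")]
result = authenticate(auth, shown, steps)
```

In this toy run the third edit makes the images coincide, which corresponds to the point where step S320 succeeds and the subroutine ends.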
  • FIG. 1 shows the appearance of an information terminal according to a first embodiment of the authentication apparatus of the present invention. FIG. 2 shows the hardware configuration of the information terminal of FIG. 1. FIG. 3 is a control block diagram of the information terminal of FIG. 1. FIG. 4 is a flowchart of the main routine executed in the information terminal of FIG. 1. FIG. 5 shows an example of the authentication image used in the user authentication process of the information terminal of FIG. 1. FIG. 6 is a flowchart of the user authentication processing subroutine of FIG. 4. FIGS. 7 to 10 each show an example of an image displayed on the display unit of FIG. 1.
  • FIG. 1 shows an appearance of an information terminal 100 that is the first embodiment of the authentication apparatus of the present invention.
  • The information terminal 100 has a display unit 110 on its main surface.
  • The display unit 110 is configured by a general display device such as an LCD (liquid crystal display).
  • Operation keys 112 to 115 are provided below the display unit 110 on the main surface of the information terminal 100.
  • The function assigned to each of the operation keys 112 to 115 can be changed according to the type of application being executed on the information terminal 100 and the state of its execution.
  • A power key 111 for switching the power of the information terminal 100 on and off is provided on the right side surface of the information terminal 100.
  • A card slot 120 for inserting a recording medium such as a memory card into the main body of the information terminal 100 is provided on the lower surface of the information terminal 100.
  • FIG. 2 shows the hardware configuration of the information terminal 100 of FIG. 1.
  • The information terminal 100 includes a CPU (Central Processing Unit) 101 that controls the entire terminal, a RAM (Random Access Memory) 102 that temporarily stores data, a ROM (Read Only Memory) that stores programs and constants, an HDD (hard disk drive) 104, a communication interface (I/F), and a media drive 108.
  • The media drive 108 reads information from and/or writes information to the recording medium 108A inserted into the card slot 120.
  • The recording medium 108A may be, for example, a CD-ROM (Compact Disc Read-Only Memory), a DVD-ROM (Digital Versatile Disc Read-Only Memory), a USB (Universal Serial Bus) memory, a memory card, an FD (flexible disk), a hard disk, a magnetic tape, a cassette tape, an MO (magneto-optical disc), an MD (Mini Disc), an IC (Integrated Circuit) card (excluding memory cards), an optical card, a mask ROM, an EPROM, or an EEPROM (Electronically Erasable Programmable Read-Only Memory).
  • FIG. 3 is a control block diagram of the information terminal 100.
  • The input determination unit 151 detects that an operation has been performed and determines its content.
  • The information input unit 150 is configured by the touch panel 107, the operation keys 112 to 115, and the input determination unit 151.
  • The display control unit 155 controls the display content of the display unit 110 via the drawing control unit 156, and stores the screen data displayed on the display unit 110 in the second storage unit 153.
  • The authentication information generation unit 152 generates the correct image used when authenticating the user (hereinafter referred to as the "authentication image" as appropriate) and stores it in the third storage unit 157.
  • During user authentication, an image different from the authentication image is displayed on the display unit 110, and the user inputs information correcting that image so that it reproduces the authentication image; user authentication is performed on that basis.
  • The image displayed on the display unit 110 at the time of user authentication is referred to as a correction image.
  • The authentication information generation unit 152 generates the image to be displayed first as the correction image and stores it in the first storage unit 158.
  • The authentication information generation unit 152 can generate the authentication image and the correction image using, for example, parts stored in advance in the first storage unit 158.
  • The authentication determination unit 154 determines whether the image displayed on the display unit 110 (the image stored in the second storage unit 153) matches the image stored in the third storage unit 157.
  • The first storage unit 158, the second storage unit 153, and the third storage unit 157 are configured by the recording medium 108A, the HDD 104, and/or the RAM 102.
  • The image update unit 150B, the input determination unit 151, the authentication determination unit 154, the display control unit 155, the drawing control unit 156, the authentication information generation unit 152, and the input information generation unit 159 are realized by the CPU 101 executing a program stored in the HDD 104. Each unit may instead be realized by a dedicated hardware component, such as a dedicated LSI (Large-Scale Integration) chip having an equivalent function.
  • the contents of the main routine executed in the information terminal 100 will be described with reference to FIG.
  • The main routine is started, for example, by operating the power key 111 of the information terminal 100 while it is in the power-off state.
  • In step S1, the CPU 101 determines whether another key (any one of the operation keys 112 to 115) was being operated when the power key 111, whose operation is the condition for starting the main routine, was operated. That is, it determines whether or not the power key 111 was operated while another key was being operated. If so, the process proceeds to step S2; otherwise, it proceeds to step S3.
  • In step S2, for example, image information is generated by combining parts stored in the first storage unit 158 based on information input via the operation keys 112 to 115 and the touch panel 107, and the generated image information is stored in the third storage unit 157 as the authentication image.
  • Alternatively, in step S2, information for selecting one authentication image from a plurality of images may be received. In this case, an authentication image based on the received information is stored in the third storage unit 157.
  • In step S3, the CPU 101 executes the user authentication process and advances the process to step S4. Details of the user authentication process will be described later with reference to FIG. 6.
  • In step S4, the CPU 101 determines whether or not the power key 111 has been operated. If it determines that the power key 111 has not been operated, the CPU 101 executes processing corresponding to operations on the touch panel 107 and the operation keys 112 to 115 in step S5. On the other hand, if it determines in step S4 that the power key 111 has been operated, the main routine is terminated.
  • FIG. 5 shows an example of the authentication image.
  • The authentication image 900 includes an image of a house 920 and a crescent moon 910, and two background regions, a region 901 and a region 902.
  • The house 920 includes a roof 921, a main body 925, a window 923, a door 924, and a chimney 922.
  • Table 1 shows an example of the authentication image information stored in the third storage unit 157.
  • The authentication image information is information on each part (object) constituting the authentication image.
  • Each object is stored in the first storage unit 158.
  • Table 1 includes, for each object, a serial number (No.) in the authentication image, a serial number (object number) in the first storage unit 158, and block numbers indicating the display position on the display unit 110.
  • The display unit 110 is configured by, for example, a 6-inch, 800 × 600 pixel, 8-gradation display device.
  • A block is defined for each predetermined number of adjacent pixels; that is, every pixel constituting the display unit 110 belongs to some block.
  • In Table 1, the position where each object is displayed is given by listing the blocks of the display unit 110 as the block numbers described above.
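The block-number scheme of Table 1 can be illustrated with a small sketch. The 40 × 40 pixel block size is an assumption (the patent only states that blocks group a predetermined number of adjacent pixels), and the object data shown is invented for illustration.

```python
# Hypothetical sketch of the Table 1 representation: an 800 x 600
# display partitioned into blocks (40 x 40 pixels assumed), with each
# object stored as (object number, set of occupied block numbers).

DISPLAY_W, DISPLAY_H = 800, 600
BLOCK_W = BLOCK_H = 40                       # assumed block size
BLOCKS_PER_ROW = DISPLAY_W // BLOCK_W        # 20 blocks per row

def block_of(x, y):
    """Map a pixel coordinate to its row-major block number."""
    return (y // BLOCK_H) * BLOCKS_PER_ROW + (x // BLOCK_W)

# Authentication image information: No. -> (object number, block numbers).
auth_image_info = {
    1: (1, {44, 45, 64, 65}),    # house main body (illustrative blocks)
    2: (2, {24}),                # chimney
    3: (3, {2, 3}),              # crescent moon
}
```

With this layout, every pixel maps to exactly one block, matching the statement that all pixels of the display unit 110 belong to some block.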
  • FIG. 6 is a flowchart of a subroutine of user authentication processing (step S3 in FIG. 4).
  • In step S300, the CPU 101 first generates a correction image and advances the process to step S302.
  • The correction image is an image that is not identical to the image stored in the third storage unit 157; it is generated, for example, by exchanging the type or display position of one or more objects of the authentication image, or by adding or deleting objects.
  • Table 2 shows an example of data indicating the correction image generated in step S300.
  • The correction image shown in Table 2 is generated from the authentication image shown in Table 1 by deleting the chimney image indicated by object number 2 and replacing the crescent moon image of object number 3 with the sun image of object number 21.
  • The CPU 101 stores the generated correction image in the first storage unit 158.
  • FIG. 7 shows a correction image 950 corresponding to the data shown in Table 2.
  • The correction image 950 includes the sun 911 and a house 920A.
  • Unlike the house 920 illustrated in FIG. 5, the house 920A does not include the chimney 922. This corresponds to the fact that object number 2 is not included in Table 2.
  • The sun 911 is displayed instead of the crescent moon 910 of the authentication image 900 in FIG. 5. This corresponds to the fact that Table 2 includes object number 21 (sun) instead of object number 3 (crescent moon) of Table 1.
  • In step S302, the CPU 101 displays the correction image generated in step S300 on the display unit 110 and advances the process to step S304.
  • In step S304, the CPU 101 activates the input devices (the touch panel 107 and the operation keys 112 to 115), and advances the process to step S306.
  • In step S306, the CPU 101 causes the information terminal 100 to wait for input to the input devices.
  • When input is made, the process proceeds to step S308.
  • In step S308, if the CPU 101 determines that the information input to the input device is information dividing the display area of the display unit 110, it proceeds to step S310; if it determines that the information designates addition, deletion, or change of an object, it proceeds to step S314.
  • Alternatively, the process may proceed to step S310 on the assumption that information regarding area division has been input, and, if it is determined that any of the operation keys 112 to 115 has been operated, information regarding addition of an object or the like may be input and the process may proceed to step S314.
  • In step S310, the CPU 101 generates a plurality of divided areas by dividing the display area of the display unit 110 based on the input information, and further determines the cut area by receiving input from the input device. The process then proceeds to step S312.
  • The cut area refers to the area to be cut out among the plurality of divided areas generated from the correction image.
  • The correction image is corrected so as to delete the objects included in the cut area, and becomes a new correction image.
  • FIG. 8 shows the correction image 950 of FIG. 7 with a line 800 added.
  • The line 800 is input on the touch panel 107 and divides the display area of the display unit 110.
  • The CPU 101 divides the display area of the display unit 110 into an area 801 and an area 802 along the line 800.
  • The correction image 950 is displayed in the entire display area of the display unit 110.
  • The CPU 101 accepts input of information selecting either the area 801 or the area 802 as the area whose displayed objects are to be deleted. By accepting this input, the CPU 101 determines the cut area.
  • The CPU 101 can execute control for allowing the user to clearly recognize the contents of each area, for example by alternately blinking the display of the area 801 and the area 802.
  • The user inputs the information selecting the area 801 or the area 802 by, for example, touching one of them with the touch pen 190.
  • When the cut area is determined, the CPU 101 updates the correction image displayed on the display unit 110 so that the objects displayed in the cut area are deleted.
  • For example, the CPU 101 updates the image displayed on the display unit 110 to the correction image 951 of FIG. 9.
  • The correction image 951 of FIG. 9 includes the regions 901 and 902 and the house 920A, but does not include the sun 911. This is because the sun 911 was displayed entirely within the area 801 selected as described above.
  • After the generation of the divided areas, the determination of the cut area, and the cutting of the image in step S310, the CPU 101 in step S312 stores the cut image in the second storage unit 153, updates the display content of the display unit 110 so that this image is displayed, and advances the process to step S318.
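Using a hypothetical block model (object name mapped to occupied block numbers), the cut step of steps S310 to S312, which deletes every object displayed entirely within the determined cut area, can be sketched as:

```python
# Sketch of the cut step: objects whose blocks all lie inside the cut
# area are removed from the correction image. The data model (object
# name -> set of occupied block numbers) is a hypothetical illustration.

def apply_cut(correction, cut_blocks):
    """Return the correction image with objects entirely contained in
    the cut area deleted; partially contained objects are kept."""
    return {obj: blocks
            for obj, blocks in correction.items()
            if not blocks <= cut_blocks}    # subset test: fully inside?

# The sun lies wholly in the selected area; the roof straddles the line.
correction = {"sun": {2, 3}, "house": {44, 45, 64, 65}, "roof": {43, 44}}
updated = apply_cut(correction, {0, 1, 2, 3, 43})
```

Objects that straddle the dividing line (like the roof here) survive the cut, matching the description that only objects wholly contained in the cut area are deleted.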
  • In step S314, the CPU 101 receives, via the operation keys 112 to 115 and/or the touch panel 107, input of information specifying the specific content of the addition, deletion, or change of an object, changes the type and position of the objects constituting the image based on that information, and advances the process to step S316.
  • Suppose, for example, that in step S314 an operation for adding a crescent moon to the upper left of the screen is performed.
  • The updated correction image 952 then includes a crescent moon 910 in the upper left area.
  • In step S316, the CPU 101 stores in the second storage unit 153 the image changed in step S314 based on the content input to the input device, updates the display content of the display unit 110 so that this image is displayed, and advances the process to step S318.
  • In step S318, the CPU 101 compares the image stored in the second storage unit 153 with the image (authentication image) stored in the third storage unit 157, and advances the process to step S320.
  • In step S320, it is determined whether the two images match as a result of the comparison in step S318. If it is determined that the images match, the process proceeds to step S322; if not, the process returns to step S306.
  • In step S322, the CPU 101 deactivates the input devices and returns the process to the routine of FIG. 4.
  • Whether or not the two images match is determined, for example, by whether the same objects are displayed over the same block numbers in the authentication image of the third storage unit 157 and the correction image of the second storage unit 153. As for the block numbers, the two images may be determined to match even if the display position is shifted left or right by up to several blocks.
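The match criterion described here, same objects over the same block numbers with a small left/right shift tolerated, might be sketched as follows. The tolerance of two blocks and the per-object data model are assumptions, not stated in the patent.

```python
# Sketch of the tolerant comparison: two per-object block placements
# match if one can be shifted left/right by at most `tolerance` blocks
# (within a row) to coincide with the other. Tolerance is an assumption.

def blocks_match(a, b, tolerance=2):
    """True if block sets a and b coincide up to a small horizontal shift."""
    return any({blk + d for blk in a} == b
               for d in range(-tolerance, tolerance + 1))

def images_match(auth, shown, tolerance=2):
    """Per-object comparison of the authentication and correction images."""
    return (auth.keys() == shown.keys()
            and all(blocks_match(auth[o], shown[o], tolerance) for o in auth))
```

A per-object comparison of block sets avoids pixel-level comparison entirely, which is consistent with the table-driven representation of Tables 1 and 2.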
  • The correction image is thus changed based on input to the input device and compared with the authentication image stored in the third storage unit 157.
  • When the images match, the user authentication process is terminated and the processes from step S4 onward are executed in the main routine (FIG. 4).
  • The user authentication process does not end until the correction image matches the authentication image stored in the third storage unit 157.
  • The display area of the display unit 110 is divided into two with the line 800 as a boundary.
  • Two divided areas, the area 801 and the area 802, are thereby generated.
  • The correction image displayed on the display unit 110 is then updated so that the sun 911, displayed entirely within the area 801, is deleted.
  • The updated image is shown as the correction image 951 in FIG. 11C.
  • The correction image 951 of FIG. 11C does not match the authentication image 900 of FIG. 5. Therefore, with the correction image 951 of FIG. 11C displayed, the process returns from step S320 to step S306 (see FIG. 6).
  • Next, the correction image 952 of FIG. 11D is displayed on the display unit 110.
  • The correction image 952 is obtained by adding the crescent moon 910 to the correction image 951 of FIG. 11C.
  • Then, the correction image 953 of FIG. 11E is displayed on the display unit 110.
  • The correction image 953 is obtained by adding the chimney 922 to the correction image 952 of FIG. 11D.
  • When this image matches the authentication image, the processes from step S4 onward are performed.
  • The image displayed on the display unit 110 is updated so that objects entirely contained in the cut area are deleted.
  • The information indicating that an operation has been performed on the touch panel 107, represented by the line 800 in FIG. 8, corresponds to the information for dividing the area input to the input unit.
  • The input information generation unit 159 generates information indicating that the display area of the display unit 110 is divided into two areas along the line 800.
  • The image update unit 150B divides the correction image 950 into the area 801 and the area 802 along the line 800, and updates the image displayed on the display unit 110 so as to delete the objects included in the selected area.
  • The image updated in this way is stored in the second storage unit 153, and the authentication determination unit 154 determines whether or not the authentication image stored in the third storage unit 157 and the image (correction image) stored in the second storage unit 153 match.
  • The input information generation unit 159 generates, based on the input information, the information for dividing the display area of the display unit 110. If the input trajectory is not sufficient to divide the display area, that is, if the display unit 110 cannot be divided into a plurality of closed areas by the input trajectory alone, the input information generation unit 159 interpolates the trajectory so as to generate a plurality of closed areas.
  • As shown in FIG. 12A, an input may be made on the touch panel 107 with a trajectory such as the trajectory 810 while the correction image 950 is displayed.
  • The upper end of the trajectory 810 is in contact with the upper end of the display unit 110 (the upper end of the correction image 950 in FIG. 12A), but its lower end is not in contact with any edge of the display unit 110.
  • A closed area is therefore not formed by the trajectory 810 itself, and the display unit 110 cannot be divided into a plurality of closed areas by the trajectory 810 alone.
  • In this case, when the trajectory 810 is a straight line, it is extended to the edge of the display unit 110; when the trajectory 810 is a curve, it is extended by an existing interpolation method such as Lagrange interpolation so as to reach the edge of the display unit 110 (or so that a closed area can be formed by the trajectory 810 alone).
  • As with the trajectory 820 in FIG. 12B, when the input trajectories are not continuous, portions close to each other can be interpolated by Lagrange interpolation or the like to obtain a continuous trajectory.
  • The lines connecting the fragments of the separated trajectory 820 are indicated by broken lines 821 to 824.
  • When a plurality of points are input on the touch panel 107 as information for dividing an area, the input information generation unit 159 can generate the information for dividing the display area of the display unit 110 by generating a line connecting the plurality of points.
  • For example, when two points are input, the input information generation unit 159 generates a straight line 830 that connects them and whose both ends are in contact with the edge of the display area of the display unit 110, and the display area of the display unit 110 is divided by the straight line 830.
  • As shown in FIG. 13B, three points, a point 841, a point 842, and a point 843, may be input on the touch panel 107.
  • In this case, a line 840 that passes through these three points and whose both ends are in contact with the edge of the display unit 110 is generated by performing interpolation using an existing interpolation method (for example, spline interpolation).
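Completing a trajectory from a few touched points, as in FIG. 13B, can be illustrated with plain Lagrange interpolation (the patent also mentions spline interpolation as an alternative; the point values below are invented for illustration).

```python
# Sketch of completing a trajectory from a few touch points: the
# Lagrange polynomial through the points is sampled across the full
# display width, so the generated line reaches both side edges.
# The three points stand in for points 841-843 and are hypothetical.

def lagrange(points, x):
    """Evaluate the Lagrange interpolating polynomial through `points` at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        term = float(yi)
        for j, (xj, _) in enumerate(points):
            if i != j:
                term *= (x - xj) / (xi - xj)   # Lagrange basis factor
        total += term
    return total

pts = [(100, 300), (400, 200), (700, 350)]           # touched points
curve = [(x, lagrange(pts, x)) for x in range(800)]  # spans x = 0..799
```

Sampling from x = 0 to x = 799 realizes the extension of the curve to both side edges of an 800-pixel-wide display, so the resulting line can divide the display area into closed regions.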
  • The information terminal according to a second embodiment of the authentication apparatus of the present invention can have the same appearance and hardware configuration as the information terminal 100 of the first embodiment.
  • FIG. 14 is a control block diagram of the information terminal 100 according to the present embodiment.
  • The information terminal 100 according to the present embodiment has a key input detection unit 154A in addition to the components of the information terminal 100 according to the first embodiment.
  • In the present embodiment, the determination of whether or not the authentication image matches the correction image (the authentication determination, steps S318 and S320 in the first embodiment) is performed on condition that any one of the operation keys 112 to 115 has been operated.
  • When the key input detection unit 154A detects that the key has been operated, it outputs a command to the authentication determination unit 154 to execute the authentication determination.
  • The authentication determination unit 154 performs the authentication determination on condition that it receives this command.
  • FIG. 15 is a flowchart of a user authentication processing subroutine executed in the information terminal 100 according to the present embodiment.
  • In step S330, it is determined whether the input made to the input device designates coordinates on the display unit 110 or whether the authentication key has been operated.
  • The authentication key is a key operated to cause the authentication determination unit 154 to execute the authentication determination, and is any one of the operation keys 112 to 115.
  • The CPU 101 proceeds to step S332 when it determines that the input designates coordinates, and proceeds to step S340 when it determines that the authentication key has been operated.
  • In step S332, the CPU 101 determines the coordinates designated by the information input to the input device, and advances the process to step S334.
  • In step S334, the CPU 101 identifies, among the objects displayed on the display unit 110, the object containing the coordinates determined in step S332, and advances the process to step S336.
  • The CPU 101 identifies the object based on information about the image displayed on the display unit 110, as shown in Table 2. Specifically, the block number of the block containing the coordinates acquired in step S332 is obtained, and the object associated with that block number is identified by referring to Table 2.
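The block-number lookup can be sketched as follows. The block size, grid width, and the `BLOCK_TO_OBJECT` table contents are illustrative assumptions standing in for Table 2, whose actual values are not reproduced here.

```python
# Map tap coordinates to a displayed object via a block-number table
# (in the spirit of Table 2). Block size and table contents are assumed.
BLOCK_SIZE = 40  # pixels per block edge (assumption)

# Hypothetical Table 2: block number -> object identifier
BLOCK_TO_OBJECT = {
    0: "sun", 1: "sun",
    5: "roof", 6: "roof",
    10: "area_901",
}

def block_number(x, y, blocks_per_row=8):
    """Compute the block number of the block containing pixel (x, y)."""
    return (y // BLOCK_SIZE) * blocks_per_row + (x // BLOCK_SIZE)

def identify_object(x, y):
    """Return the object whose block contains the coordinates, if any."""
    return BLOCK_TO_OBJECT.get(block_number(x, y))

print(identify_object(30, 10))   # block 0 -> sun
print(identify_object(210, 20))  # block 5 -> roof
```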
  • In step S336, information for updating the display mode and the like of the object identified in step S334 is acquired, and the process proceeds to step S338.
  • It is assumed here that, for the object containing the designated coordinates, an operation is performed to input information for changing its display mode, such as deleting the object or changing its display color.
  • In step S336, the information input to the input device by this operation is acquired.
  • In step S338, the CPU 101 updates the display mode of the object identified in step S334 based on the information acquired in step S336, registers information about the updated image in the second storage unit 153, and returns the process to step S306.
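The display-mode update of steps S336 to S338 can be sketched as follows. Representing the correction image as a mapping of object names to display colors is an assumption for illustration, as are the object names and the "delete"/"recolor" vocabulary.

```python
# Sketch of steps S336-S338: apply a display-mode update (delete an
# object or change its color) to a correction image, here modeled as a
# simple object -> color mapping. All names and values are illustrative.
def apply_update(correction_image, target, action, color=None):
    """Return a new correction image with the target's display mode updated."""
    updated = dict(correction_image)  # do not mutate the stored image
    if action == "delete":
        updated.pop(target, None)     # e.g. erase the sun 911
    elif action == "recolor":
        updated[target] = color       # e.g. change the roof 921's color
    return updated

image = {"sun": "yellow", "roof": "red", "area_901": "white"}
image = apply_update(image, "sun", "delete")
image = apply_update(image, "roof", "recolor", "blue")
print(image)
```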
  • The area 901 is identified as the designated object.
  • In response, the correction image 954 of FIG. 16, updated so that the display color of the area 901 is changed, is displayed on the display unit 110.
  • In another example, the roof 921 is identified as the object containing the designated coordinates.
  • In response, the correction image 955 of FIG. 17, updated so that the display color of the roof 921 is changed, is displayed on the display unit 110.
  • In step S340, as in step S320 (see FIG. 6), the CPU 101 determines whether the image in the second storage unit 153 matches the image in the third storage unit 157. If it determines that they do not match, the process returns to step S306; if it determines that they match, the process proceeds to step S342.
  • In step S342, as in step S322 (see FIG. 6), the CPU 101 deactivates the input device and returns the process to FIG.
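Steps S340 and S342 can be sketched together as follows. The class and attribute names are assumptions; the images are modeled as opaque comparable values, and the "storage units" as plain attributes.

```python
# Sketch of steps S340-S342: compare the updated correction image
# (second storage unit) with the authentication image (third storage
# unit), and deactivate the input device on a match. Names are assumed.
class AuthenticationSession:
    def __init__(self, authentication_image):
        self.authentication_image = authentication_image  # third storage unit
        self.correction_image = None                      # second storage unit
        self.input_active = True

    def update_correction_image(self, image):
        self.correction_image = image

    def authenticate(self):
        """Run only when the authentication key has been operated (S340)."""
        if self.correction_image == self.authentication_image:
            self.input_active = False  # step S342: deactivate the input device
            return True
        return False                   # mismatch: return to accepting input

session = AuthenticationSession(authentication_image=("house", "no_sun"))
session.update_correction_image(("house", "sun"))
print(session.authenticate())  # False: images differ, input stays active
session.update_correction_image(("house", "no_sun"))
print(session.authenticate())  # True: input device deactivated
```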
  • In the embodiments described above, the authentication image, the correction image, and the objects used for updating the correction image are selected from those stored in the first storage unit 158.
  • However, the storage location of the objects used in the information terminal 100 is not limited to this.
  • For example, the CPU 101 may use an object acquired from another device via the communication I/F 105 as an authentication image.
  • In the embodiments above, the authentication image is generated in the information terminal 100 based on information input to the input device.
  • However, the authentication image used in the authentication device is not limited to this.
  • For example, an image received from another device via the communication I/F 105 may be used as the authentication image.
  • Further, an object used as a component of the authentication image or the like is not limited to a still image.
  • By using a moving image such as an animation as a component of the authentication image or the like, diversity can be given to the images used for user authentication.
  • In the embodiments described above, when information is input, the displayed image is updated based on that information, and it is then determined whether the updated image matches the authentication image. That is, the displayed image is updated and then compared with the authentication image.
  • Because the image is updated based on information input by the user, the user can update the displayed image by inputting information until it matches the authentication image.
  • 100 information terminal, 101 CPU, 102 RAM, 103 ROM, 104 HDD, 105 communication I/F, 106 input unit, 107 touch panel, 108 media drive, 110 display unit, 150 information input unit, 151 input determination unit, 152 authentication information generation unit, 153 second storage unit, 154 authentication determination unit, 155 display control unit, 156 drawing control unit, 157 third storage unit, 158 first storage unit, 159 input information generation unit, 900 authentication image, 950-955 correction images.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

When information partitioning a region is input to the input device while a correction image (950) is displayed, a line (800) representing the trace is displayed on the display unit, and the correction image (950) is partitioned with the line (800) as a boundary. When a region (801) is selected as the region to be cut, a sun (911) displayed entirely within the region (801) is erased from the correction image (950). The updated correction image (951) is then displayed on the display unit. Following further instructions, the updated images (correction image (952) and correction image (953)) are displayed one after another. The correction image (953) matches the authentication image, and therefore, once the correction image (953) is displayed, the authentication processing ends.
PCT/JP2010/072683 2009-12-25 2010-12-16 Authentication device and authentication method WO2011078056A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009294978A JP2013047859A (ja) 2009-12-25 2009-12-25 認証装置および認証方法
JP2009-294978 2009-12-25

Publications (1)

Publication Number Publication Date
WO2011078056A1 true WO2011078056A1 (fr) 2011-06-30

Family

ID=44195585

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/072683 WO2011078056A1 (fr) 2009-12-25 2010-12-16 Dispositif d'authentification et procédé d'authentification

Country Status (2)

Country Link
JP (1) JP2013047859A (fr)
WO (1) WO2011078056A1 (fr)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014010764A (ja) * 2012-07-02 2014-01-20 Sharp Corp 表示装置、削除方法、コンピュータプログラム、及び記録媒体
JP2014052837A (ja) * 2012-09-07 2014-03-20 Felica Networks Inc 情報処理装置および情報処理方法、並びにプログラム
US8938780B2 (en) 2012-03-27 2015-01-20 Telefonaktiebolaget L M Ericsson (Publ) Display authentication
CN110568982A (zh) * 2019-09-12 2019-12-13 北京字节跳动网络技术有限公司 在线演示文稿中的图片裁剪方法、装置、存储介质及设备
CN113947955A (zh) * 2021-10-18 2022-01-18 贵州振华信息技术有限公司 一种具有放大功能的文稿演示系统以及演示方法
WO2022050847A1 (fr) * 2020-09-07 2022-03-10 Protectoria Venture As Procédé de protection de l'interface utilisateur visuelle d'applications mobiles
WO2023176364A1 (fr) * 2022-03-18 2023-09-21 ソニーグループ株式会社 Dispositif de traitement d'informations, programme, et procédé de traitement d'informations

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113168271A (zh) * 2018-12-07 2021-07-23 三菱电机株式会社 输入显示控制装置、输入显示控制方法以及输入显示系统
US20240071045A1 (en) * 2022-08-24 2024-02-29 Capital One Services, Llc Systems and methods for authenticating via photo modification identification

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002297546A (ja) * 2001-03-29 2002-10-11 Just Syst Corp ユーザ認証方法および装置
JP2007264929A (ja) * 2006-03-28 2007-10-11 Pioneer Electronic Corp ユーザ認証システム、ユーザ認証方法、操作端末及びサーバ等
JP2009282634A (ja) * 2008-05-20 2009-12-03 Canon Inc 情報処理装置及びその制御方法、プログラム、記憶媒体


Also Published As

Publication number Publication date
JP2013047859A (ja) 2013-03-07

Similar Documents

Publication Publication Date Title
WO2011078056A1 (fr) Authentication device and authentication method
JP6034138B2 (ja) 電子機器、筆跡表示方法およびプログラム
JP4742132B2 (ja) 入力装置、画像処理プログラムおよびコンピュータ読み取り可能な記録媒体
US9774778B2 (en) Electronic camera, image display device, and storage medium storing image display program, including filter processing
US10606476B2 (en) Techniques for interacting with handheld devices
CN104793913A (zh) 对象显示系统、对象显示控制程序及对象显示控制方法
WO2012114876A1 (fr) Dispositif électronique, procédé d'affichage de contenu et programme d'affichage de contenu
CN106910232A (zh) 画线擦除方法及装置
CN104765528A (zh) 一种虚拟键盘显示方法及装置
US20140225932A1 (en) Display device, display device control method, and recording medium
CN103164644A (zh) 终端设备及其开机控制方法
CN112783346A (zh) 一种笔迹数据处理方法、系统、计算机设备及存储介质
WO2013136572A1 (fr) Appareil d'affichage, procédé d'affichage, programme de commande et support d'enregistrement
CN113641416A (zh) 指纹识别模式下的护眼模式处理方法、装置、终端及介质
KR101030177B1 (ko) 데이터 입력장치 및 데이터 입력방법
CN108139847B (zh) 显示装置
JP5597441B2 (ja) 手書き文字入力装置
CN102385453B (zh) 信息处理装置及其控制方法
JP2012221358A (ja) 電子機器、手書き入力方法、および手書き入力プログラム
US20110172010A1 (en) Information processing device, information processing device control method, program, and information storage medium
US9047707B2 (en) Graph display device
JP2007264765A (ja) 数式編集装置及び数式編集プログラム
JP4361118B2 (ja) 情報処理装置、情報処理方法、およびプログラム
CN115617225A (zh) 应用界面显示方法、装置、电子设备及存储介质
CN114721565A (zh) 应用程序启动方法、装置、电子设备及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10839289

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10839289

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP