KR20150092441A - Method for user Authentication through touch dragging, device therefor - Google Patents
- Publication number
- KR20150092441A (application KR1020140012748A)
- Authority
- KR
- South Korea
- Prior art keywords
- touch drag
- touch
- terminal
- user
- attribute information
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Security & Cryptography (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A user authentication method using a touch drag, and an apparatus therefor, are disclosed. Specifically, a method for authenticating a terminal user using the user's touch drag attribute information according to the present invention includes: extracting touch drag attribute information of the terminal user from a touch drag input of the terminal user upon receiving that input; storing the extracted touch drag attribute information; performing machine learning based on the stored touch drag attribute information of the terminal user and previously stored touch drag attribute information of others, thereby determining a criterion for distinguishing a touch drag input of the terminal user from a touch drag input of another person; and authenticating the terminal user by determining, based on the determined criterion, whether a touch drag input entered into the terminal is a touch drag input of the terminal user.
Description
The present invention relates to a user authentication method, and more particularly, to a method and apparatus for authenticating a user using a touch drag in a terminal equipped with a touch screen.
Mobile communication systems were developed to provide voice service while guaranteeing the mobility of the user, and have gradually expanded beyond voice to data services as well. With the provision of a variety of services, more personal information, beyond items such as call history and the address book, is stored in the smartphone, so interest in security systems is increasing.
Security systems in existing mobile communication terminals are divided into two types: password-based security systems that use information specified by a person, such as a password or a graphic pattern, and attribute-based security systems that use attributes of individuals, such as face recognition and fingerprint recognition.
However, existing password-based security systems have the disadvantage that others can easily learn the password designated by the user, and attribute-based security systems such as face recognition and fingerprint recognition have not been widely commercialized due to their accuracy and cost.
It is an object of the present invention to provide a method and apparatus for authenticating a legitimate user of a terminal using a touch drag input according to a user's daily operation at a terminal.
It is another object of the present invention to provide a method and apparatus for further enhancing the security of a terminal by using, as a means of security, an attribute of the user that cannot easily be imitated by others even if the user authentication means is exposed.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the invention as defined by the appended claims.
According to an aspect of the present invention, there is provided a method of authenticating a terminal user using the user's touch drag attribute information, the method comprising: extracting touch drag attribute information of the terminal user from a touch drag input of the terminal user upon receiving that input; storing the extracted touch drag attribute information; performing machine learning based on the stored touch drag attribute information of the terminal user and previously stored touch drag attribute information of others to determine a criterion for distinguishing the touch drag input of the terminal user from the touch drag input of another person; and authenticating the terminal user by determining, based on the determined criterion, whether a touch drag input entered into the terminal is a touch drag input of the terminal user.
According to another aspect of the present invention, there is provided a terminal for authenticating a terminal user using the user's touch drag attribute information, the terminal including: a touch screen through which a touch drag is input; a storage unit that stores touch drag attribute information of the terminal user and touch drag attribute information of others; an attribute information collecting unit that, when a touch drag input of the terminal user is received through the touch screen, extracts the touch drag attribute information of the terminal user from the input and stores it in the storage unit; a machine learning unit that determines a criterion for distinguishing a touch drag input of the terminal user from a touch drag input of another person by performing machine learning based on the stored attribute information; and a user authentication unit that authenticates the terminal user by determining, based on the determined criterion, whether a touch drag input entered on the touch screen is a touch drag input of the terminal user.
According to an embodiment of the present invention, a terminal can authenticate its legitimate user using only the user's everyday operations, by extracting and using unique features of the user found in the user's touch drag input.
In addition, according to an embodiment of the present invention, even if another person watches a legitimate user release the lock using a touch drag, the probability that the other person can replay the touch drag operation like the legitimate user is very low, so security can be improved.
The effects obtainable in the present invention are not limited to the effects mentioned above, and other effects not mentioned can be clearly understood by those skilled in the art from the following description.
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of the specification, illustrate embodiments of the invention and, together with the description, serve to explain the technical features of the invention.
FIG. 1 is a diagram illustrating a configuration of a terminal according to an embodiment of the present invention.
FIG. 2 is a diagram for explaining a user authentication method using a touch drag according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating a method for setting criteria for user authentication according to an embodiment of the present invention.
FIGS. 4 to 6 are diagrams illustrating touch drag attribute information according to an embodiment of the present invention.
FIGS. 7 to 10 are diagrams for explaining a method of authenticating a user using a touch drag according to an embodiment of the present invention.
FIG. 11 is a diagram illustrating experimental data comparing attribute information on touch drag inputs of a terminal user and of others according to an embodiment of the present invention.
FIG. 12 is a diagram illustrating experimental data comparing, per attribute, the touch drag attribute information of a terminal user and of others according to an embodiment of the present invention.
Hereinafter, preferred embodiments according to the present invention will be described in detail with reference to the accompanying drawings. The following detailed description, together with the accompanying drawings, is intended to illustrate exemplary embodiments of the invention and is not intended to represent the only embodiments in which the invention may be practiced. The following detailed description includes specific details in order to provide a thorough understanding of the present invention. However, those skilled in the art will appreciate that the present invention may be practiced without these specific details.
In some instances, well-known structures and devices may be omitted or may be shown in block diagram form, centering on the core functionality of each structure and device, to avoid obscuring the concepts of the present invention.
Throughout the specification, when an element is described as "comprising" or "including" another element, this does not exclude the presence of other elements unless stated otherwise. Also, the terms "part" and "module" in the specification mean a unit that processes at least one function or operation, and may be implemented by hardware, software, or a combination of the two. Also, singular expressions such as "a", "an", and "the" may be used in a sense including both the singular and the plural (particularly in the context of the following claims), unless the context clearly dictates otherwise.
The specific terminology used in the following description is provided to aid understanding of the present invention, and the use of such specific terminology may be changed into other forms without departing from the technical idea of the present invention.
According to the present invention, the usual operation of the terminal user, that is, the touch drag operation, is analyzed through machine learning to find a unique attribute of the terminal user in this natural motion and use it for user authentication.
In the following description, a legitimate user of the terminal is referred to as the 'terminal user', and any of the N users other than the terminal user is referred to as an 'other'. In addition, 'touch drag' means touching the touch screen of a terminal with a touch pen or the user's finger and dragging in a certain direction while maintaining contact. An example of such a touch drag is a flicking touch, a quick, short drag across the screen. In addition, 'touch drag attribute information' refers to information indicating characteristics of a touch drag input that differ from user to user; 'touch drag attribute information of the terminal user' means attribute information of the terminal user's touch drag input, and 'touch drag attribute information of an other' means attribute information of another person's touch drag input. User 'authentication' means identifying (or distinguishing) whether a user is a legitimate user (i.e., the terminal user) or another person; the same applies if the number of terminal users is two or more.
In the present specification, a 'terminal' may refer to a user equipment (UE), a mobile station (MS), a mobile subscriber station (MSS), a subscriber station (SS), an advanced mobile station (AMS), a machine-type communication (MTC) device, a machine-to-machine (M2M) device, a device-to-device (D2D) device, or a station (STA). However, the present invention is not limited thereto, and any terminal equipped with a touch pad or touch screen capable of receiving touch input from a user may correspond to the terminal according to the present invention.
FIG. 1 is a diagram illustrating a configuration of a terminal according to an embodiment of the present invention.
Referring to FIG. 1, a
More specifically, the
The
The
In the case where the
The
Also, in the present invention, the
The
The
The geomagnetic sensor is a sensor that can detect the direction of the magnetic field generated by the earth and thereby detect orientation like a compass. Geomagnetism is the Earth's own magnetic field, and it changes periodically or irregularly rather than remaining constant. To determine the Earth's magnetic field at any point, three independent components are required, such as the horizontal component, azimuth, and dip. The X-axis, Y-axis, and Z-axis sensors sense the left-right, front-back, and up-down directions of the terminal 100, respectively, thereby measuring the direction of movement of the terminal 100. The acceleration sensor is a sensor that can directly detect a dynamic change of speed (acceleration) of the moving body carrying the sensor; a typical example is the piezoelectric acceleration sensor. The silicon structure of the sensor deforms according to the acceleration change caused by movement of the moving body, this deformation generates a charge in the silicon, and the charge changes the capacitance of the sensor; the sensor detects the motion of the moving body on the principle of converting the amount of capacitance change into a corresponding voltage value. The gyro sensor detects, on the same principle as the acceleration sensor, the Coriolis force generated perpendicular to the rotational angular acceleration. In addition, the altimeter is a sensor that measures the air pressure, which changes according to altitude.
The
Each function of the
The attribute
The
Since the touch drag attribute information of the terminal user can be accumulated and stored in the
In addition, the
The
As described above, the
Also, the
The terminal lock /
In addition, the terminal lock /
Hereinafter, a user authentication method using a touch drag according to the present invention will be described in detail with reference to the drawings. In particular, it is assumed that the terminal 100 to which the present invention is applicable includes a display unit 130 (touch screen).
FIG. 2 is a diagram for explaining a user authentication method using a touch drag according to an embodiment of the present invention.
Referring to FIG. 2, when the terminal receives the touch drag of the terminal user (S201), the terminal extracts the touch drag attribute information of the terminal user from the inputted touch drag input of the terminal user (S203).
Then, the terminal stores touch drag attribute information of the extracted terminal user (S205). The touch drag attribute information of the terminal user can be extracted and accumulated in the terminal every time a touch drag input of the terminal user is made. That is, steps S201 to S205 may be repeatedly performed every time a touch drag input of the terminal user is input to the terminal.
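The accumulation loop of steps S201 to S205 can be sketched as follows. This is an illustrative sketch, not the patent's implementation; the feature extractor and the storage list are assumed stand-ins.

```python
# Every time a drag completes, extract its attribute vector and append it
# to the stored history of the terminal user, for later machine learning.
stored_user_features = []

def on_drag_completed(points, extract):
    """points: sampled (x, y) trace of one drag; extract: feature function."""
    stored_user_features.append(extract(points))

# Toy extractor: just start and end coordinates as a 4-element vector.
extract = lambda pts: [pts[0][0], pts[0][1], pts[-1][0], pts[-1][1]]
on_drag_completed([(0, 0), (5, 5)], extract)
on_drag_completed([(1, 2), (6, 7)], extract)
print(len(stored_user_features))  # 2 drags accumulated
```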
The terminal performs machine learning based on the stored touch drag attribute information of the terminal user and previously stored touch drag attribute information of others to determine a criterion for distinguishing the touch drag input of the terminal user from the touch drag input of another person (i.e., a criterion for authenticating the terminal user) (S207). Here, the terminal may newly determine the criterion each time the touch drag attribute information of the terminal user and/or the touch drag attribute information of others is updated.
In step S209, the terminal determines, based on the criterion determined in step S207, whether a touch drag input entered into the terminal is a touch drag input of the terminal user or a touch drag input of another person.
A machine learning algorithm takes given data, performs learning based on a specific algorithm to construct a discrimination criterion for classifying the data into specific groups (or sets, classes), and then, when new data is given, predicts which group the new data belongs to. An SVM (support vector machine) algorithm can be used as an example of machine learning, and the terminal can determine a hyperplane-form criterion for authenticating the terminal user by performing machine learning through the SVM algorithm. The SVM algorithm is an algorithm for finding an optimal hyperplane that separates given data into two groups. Hereinafter, the user authentication method using the touch drag will be described in more detail using the SVM algorithm as an example.
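The learning and classification described above can be illustrated with a minimal sketch. A real system would use an actual SVM solver; here a simple perceptron stands in as a simplified substitute, since both learn a weight vector W and bias b defining a hyperplane W.X + b = 0. All data and names are illustrative.

```python
def train_linear(samples, labels, epochs=100, lr=0.1):
    """samples: feature vectors; labels: -1 (terminal user) or +1 (other)."""
    n = len(samples[0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            score = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * score <= 0:  # misclassified: nudge hyperplane toward sample
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

def classify(w, b, x):
    """Sign of W.X + b decides the group, as in the decision rule below."""
    return -1 if sum(wi * xi for wi, xi in zip(w, x)) + b < 0 else 1

# Toy data: terminal-user drags cluster low, others cluster high.
user = [[1.0, 1.2], [0.8, 1.0], [1.1, 0.9]]    # label -1
others = [[3.0, 3.2], [2.8, 3.1], [3.1, 2.9]]  # label +1
w, b = train_linear(user + others, [-1] * 3 + [1] * 3)
print(classify(w, b, [1.0, 1.0]))  # -1: classified as terminal user
print(classify(w, b, [3.0, 3.0]))  # 1: classified as other
```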
FIG. 3 is a diagram illustrating a method for setting criteria for user authentication according to an embodiment of the present invention.
In FIG. 3, it is assumed that a criterion for user authentication is set using two types of touch drag attribute information (x1, x2).
First, the terminal converts each piece of the stored touch drag attribute information of the terminal user and of others into a vector value, and maps the vector values onto a multidimensional plane. Here, the dimension of the vector values and of the multidimensional plane equals the number of touch drag attribute information types used for user authentication. In FIG. 3, the vector values are mapped onto the two-dimensional plane of x1 and x2; in general, when N types of touch drag attribute information are used, a vector value has the components x1, x2, ..., xN.
Specifically, when the number of touch drag attribute information types used for user authentication is N and there are i samples of touch drag attribute information (of the terminal user or of others) to be analyzed, the data vectors for the touch drag attribute information are X1, X2, ..., Xi, the dimension of each data vector is N, and Xi can be expressed as the following Equation (1):

Xi = (xi1, xi2, ..., xiN) ... (1)

Here, xiN denotes the N-th vector element of the data vector Xi.
Then, each touch drag attribute information vector of the terminal user and of others is classified as belonging to one of the groups (or sets, classes). In FIG. 3, "O" represents a vector of the touch drag attribute information of the terminal user, and "X" represents a vector of the touch drag attribute information of an other. That is, the terminal can set the set of touch drag attribute information of the terminal user as the first label and the set of touch drag attribute information of others as the second label. For example, the group of touch drag attribute information vectors of the terminal user may be assigned a negative value (e.g., '-1'), and the group of touch drag attribute information vectors of others a positive value (e.g., '1').
Then, the terminal determines a criterion (i.e., a hyperplane) for distinguishing the two groups from the distribution of the touch drag attribute information vectors of the terminal user and of others. For example, in the case of FIG. 3, the terminal can determine the criterion so that the distance (i.e., the margin) from the straight line classifying the two groups to the nearest data vectors of each group is maximized. The hyperplane can be expressed as the following Equation (2):

W · X + b = 0 ... (2)

In Equation (2), X represents a vector of the touch drag attribute information of the terminal user or of an other, W denotes a weight vector, and b denotes a bias term; both W and b can be determined by machine learning.
In the case of FIG. 3, the straight line located between the group of the touch drag attribute information of the terminal user and the group of the touch drag attribute information of others represents the hyperplane for authenticating the terminal user. In the two-dimensional example of FIG. 3, the hyperplane has a one-dimensional (i.e., (N-1)-dimensional) linear shape; in general, since N types of touch drag attribute information are used, the hyperplane has an (N-1)-dimensional form.
When the user inputs a touch drag after the hyperplane has been determined, the terminal determines whether the input is a touch drag of the terminal user according to which side of the determined hyperplane the vector value of the attribute information of the input touch drag lies on.
This can be expressed as the following Equation (3):

y = sign(W · X + b) ... (3)

In Equation (3), y may have a negative value (e.g., '-1') or a positive value (e.g., '1'), and the group of the vector is determined according to the value of y. That is, the terminal can determine whether a touch drag input is that of the terminal user or of another person from the y value derived by applying the attribute information of the touch drag input entered into the terminal to Equation (3).
Also, the terminal may determine the suitability of the hyperplane by applying the touch drag attribute information of the terminal user or of others to Equation (3). For example, if the touch drag attribute information of the terminal user was labeled with a negative value (e.g., '-1') but applying it to Equation (3) yields a positive value, the hyperplane may be producing inaccurate results. In the example of FIG. 3, a vector (i.e., 'X') of the touch drag attribute information of an other appears on the terminal user's side of the hyperplane, and a vector (i.e., 'O') of the touch drag attribute information of the terminal user appears on the other side. The latter is a touch drag input of the terminal user that is determined to be that of another person (a false negative); conversely, a touch drag input of another person determined to be that of the terminal user is a false positive. In such cases, the hyperplane, i.e., the criterion for user authentication, needs to be readjusted to increase the accuracy of user authentication; a detailed description thereof will be given later.
For convenience of description, it is assumed that two pieces of touch drag attribute information are used in FIG. 3, but the touch drag attribute information used for user authentication is not limited thereto.
Table 1 illustrates touch drag attribute information according to an embodiment of the present invention.
- radius: rotation radius of the touch drag according to the estimated length of the touch finger
- x_center, y_center: x and y coordinates of the estimated touch finger rotation axis
- drag: length of the touch drag
- start/end coordinates: x and y coordinates of the points at which the touch drag starts and ends
- gravity_x, gravity_y, gravity_z: x, y, and z coordinates of the gravity sensor at the start of the touch drag
- tMA, tMV: mean and variance of the major radius of the touched elliptical plane
- tmA, tmV: mean and variance of the minor radius of the touched elliptical plane
- wMA, wMV: mean and variance of the major radius of the elliptical plane actually recognized by the terminal
- xvA, xvV: mean and variance of the touch drag velocity along the x coordinate
- yvA, yvV: mean and variance of the touch drag velocity along the y coordinate
- vA, vV: mean and variance of the overall touch drag speed
- ppA, ppV: mean and variance of the lengths of the perpendiculars from each point of the touch drag to the line segment connecting the start and end points of the touch drag

Referring to Table 1, the touch drag attribute information (i.e., of the terminal user and of others) may include: the estimated rotation radius and rotation axis coordinates of the touch finger; the length and start/end coordinates of the touch drag; the gravity sensor readings at the start of the drag; the means and variances of the major and minor radii of the touched elliptical plane and of the elliptical plane actually recognized by the terminal; the means and variances of the drag velocity, overall and per coordinate; and the mean and variance of the perpendicular distances from each point of the drag to the segment connecting its start and end points.
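A few of the Table 1 attributes (the drag length and the start/end coordinates) can be sketched as follows, assuming the drag is recorded as a list of (x, y) sample points; the function and field names are illustrative, not taken from the patent.

```python
import math

def drag_features(points):
    """Extract drag length and start/end coordinates from a sampled trace."""
    (sx, sy), (ex, ey) = points[0], points[-1]
    # Drag length: sum of distances between consecutive sample points.
    length = sum(math.hypot(x2 - x1, y2 - y1)
                 for (x1, y1), (x2, y2) in zip(points, points[1:]))
    return {"start_x": sx, "start_y": sy, "end_x": ex, "end_y": ey,
            "length": length}

trace = [(0, 0), (3, 4), (6, 8)]
f = drag_features(trace)
print(f["length"])  # 10.0 (two segments of length 5)
```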
Each of the touch drag attribute information illustrated in Table 1 will be described in detail with reference to FIGS. 4 to 6 below.
FIGS. 4 to 6 are diagrams illustrating touch drag attribute information according to an embodiment of the present invention.
Referring to FIG. 4, it is assumed that a touch drag input of a user exists from a
First, the touch rotation radius information according to the length of the user's finger may be used as attribute information of the touch drag. 'radius' means the radius of rotation of the touch drag according to the length of the estimated touch finger, and the terminal displays a touch drag from the touch
'X_center' and 'y_center' mean the x coordinate of the estimated touch finger rotational axis and the y coordinate of the estimated touch finger rotational axis, respectively, and the distance from the touch
Information about the shape (length, start point, end point, etc.) of the touch drag can be used as attribute information of the touch drag. The length 'drag' of the touch drag means the length from the touch
As the attribute information of the touch drag, the sensing value of a sensor at the touch start point or touch end point can be used. Although only the gravity sensor is illustrated in Table 1, sensing values of sensors other than the gravity sensor can of course be used as well. 'gravity_x', 'gravity_y', and 'gravity_z' mean the x, y, and z coordinates of the gravity sensor mounted on the terminal at the time the start of the touch drag is recognized by the terminal.
Also, as the attribute information of the touch drag, information on the strength of the user's touch input, determined from the contact area of the touch point, can be used. 'tMA' and 'tMV' mean the mean and variance of the major axis of the touched elliptical plane, and 'tmA' and 'tmV' mean the mean and variance of its minor axis. Also, 'wMA' and 'wMV' mean the mean and variance of the major axis of the elliptical plane in which the touch is actually recognized by the terminal. This will be described in more detail with reference to FIG. 5.
FIG. 5(a) illustrates the ellipse of a touch as actually recognized by the terminal, FIG. 5(b) illustrates the ellipse of the user's finger touching the terminal, and FIG. 5(c) compares the sizes of the two ellipses according to the touch pressure of the user's finger.
The ellipse of the user's finger actually touching the terminal means the elliptical region of the finger in contact with the touch screen substrate of the terminal (e.g., a film, plastic, glass, or the like). Depending on the touch pressure applied to the touch screen substrate, the size (major and minor axes) of the ellipse of the touch actually recognized by the terminal and the size of the ellipse of the user's finger touching the terminal may have different values.
As shown in FIG. 5(c), when the touch pressure is 1, the ellipse of the touch actually recognized by the terminal and the ellipse of the user's finger touching the terminal have the same size. If the touch pressure is greater than 1, the ellipse of the user's finger touching the terminal may be larger than the ellipse of the touch actually recognized by the terminal; if the touch pressure is less than 1, it may be smaller. For convenience of explanation it is assumed here that the ellipse of the touch actually recognized by the terminal is fixed, but the present invention is not limited thereto. That is, when the touch pressure is within a predetermined range (e.g., 0.5 < touch pressure < 1.5), the two ellipses may have the same size, and outside that range the size of the ellipse recognized by the terminal may differ from the finger's contact ellipse in proportion to the touch pressure.
As the attribute information of the touch drag, speed information indicating the speed of the touch input can be used. 'xvA' and 'xvV' respectively denote the mean and variance of the touch drag velocity along the x coordinate, and 'yvA' and 'yvV' the mean and variance along the y coordinate. 'vA' and 'vV' mean the mean and variance of the overall touch drag speed. Assuming that the touch drag input of the user is from the
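The speed attributes above (means and variances in the style of 'vA'/'vV') can be sketched as follows, assuming a timestamp is recorded per sample point; this is an illustrative sketch, not the patent's implementation.

```python
import math

def speed_stats(points, times):
    """Mean and variance of per-segment drag speed (distance / time)."""
    speeds = [math.hypot(x2 - x1, y2 - y1) / (t2 - t1)
              for (x1, y1), (x2, y2), t1, t2
              in zip(points, points[1:], times, times[1:])]
    mean = sum(speeds) / len(speeds)
    var = sum((s - mean) ** 2 for s in speeds) / len(speeds)
    return mean, var

pts = [(0, 0), (3, 4), (6, 8)]
ts = [0.0, 0.5, 1.0]
vA, vV = speed_stats(pts, ts)
print(vA, vV)  # both segments move 5 px in 0.5 s: mean 10.0, variance 0.0
```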
As attribute information of the touch drag, the lengths of the perpendiculars from the points of the drag to the line segment connecting the start and end points of the drag can be used. 'ppA' and 'ppV' respectively represent the mean and variance of the lengths of the perpendiculars from each point of the touch drag to the line segment connecting the start point and end point of the touch drag.
Referring to FIG. 6, it is assumed that a touch drag input of a user is on the screen of the terminal from the
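The perpendicular-distance attributes 'ppA' and 'ppV' can be sketched as follows, again assuming the drag is a list of (x, y) sample points; the helper name is illustrative.

```python
import math

def perp_stats(points):
    """Mean and variance of perpendicular distances from interior drag
    points to the chord joining the drag's start and end points."""
    (x1, y1), (x2, y2) = points[0], points[-1]
    chord = math.hypot(x2 - x1, y2 - y1)
    # Standard point-to-line distance via the cross product magnitude.
    dists = [abs((x2 - x1) * (y1 - y) - (x1 - x) * (y2 - y1)) / chord
             for (x, y) in points[1:-1]]
    mean = sum(dists) / len(dists)
    var = sum((d - mean) ** 2 for d in dists) / len(dists)
    return mean, var

pts = [(0, 0), (1, 1), (2, 0)]
ppA, ppV = perp_stats(pts)
print(ppA, ppV)  # midpoint sits 1 unit off the chord: mean 1.0, variance 0.0
```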
On the other hand, as described above, a vector (i.e., 'X') of the touch drag attribute information of another person may appear on the terminal user's side of the hyperplane, and a vector (i.e., 'O') of the touch drag attribute information of the terminal user may appear on the other side, as in the example of FIG. 3. That is, a touch drag input of the terminal user may be determined to be that of another person, and a touch drag input of another person may be determined to be that of the terminal user. In this case, the hyperplane may be readjusted, or the number of user authentication attempts may be increased, to improve the accuracy of user authentication; this will be described with reference to FIG. 7 and FIG.
FIG. 7 is a diagram for explaining a user authentication method using a touch drag according to an embodiment of the present invention.
Referring to FIG. 7, when the terminal receives a touch drag input of the terminal user (S701), the terminal extracts the touch drag attribute information of the terminal user from the input touch drag (S703).
Then, the terminal stores touch drag attribute information of the extracted terminal user (S705). The touch drag attribute information of the terminal user can be extracted and accumulated in the terminal every time a touch drag input of the terminal user is made. That is, steps S701 to S705 may be repeatedly performed every time the touch drag input of the terminal user is input to the terminal.
The terminal performs machine learning based on the stored touch drag attribute information of the terminal user and previously stored touch drag attribute information of another person, and determines a criterion for distinguishing the terminal user's touch drag input from the other person's touch drag input (i.e., a criterion for authenticating the terminal user) (S707). Here, the terminal may newly determine the criterion for authenticating the terminal user each time the touch drag attribute information of the terminal user and/or of the other person is updated.
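The learning step S707 fits a separating hyperplane over feature vectors labelled for the terminal user versus others. The disclosure names SVM or Bayesian learning; as a dependency-free stand-in, the sketch below learns such a hyperplane with the classic perceptron rule on toy two-dimensional features — the data, function names, and learning rule are illustrative assumptions:

```python
def train_hyperplane(samples, labels, epochs=100, lr=0.1):
    """Fit a linear separator w.x + b over feature vectors.

    labels: -1 for the terminal user's drags, +1 for others' drags
    (a simplified stand-in for the SVM named in the disclosure).
    """
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            # Perceptron update on every misclassified (or boundary) sample.
            if y * (sum(wi * xi for wi, xi in zip(w, x)) + b) <= 0:
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

def classify(w, b, x, threshold=0.0):
    """Negative score -> terminal user (-1), otherwise other person (+1).
    `threshold` is the boundary shift described later in S711."""
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return -1 if score < threshold else +1

# Toy data: owner drags cluster near (1, 1), others near (4, 4).
owner = [[1.0, 1.2], [0.8, 1.0], [1.1, 0.9]]
others = [[4.0, 4.1], [3.9, 4.2], [4.2, 3.8]]
w, b = train_hyperplane(owner + others, [-1, -1, -1, +1, +1, +1])
```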
Then, the terminal determines whether an error has occurred in user authentication (S709). That is, although the vector of the terminal user's touch drag attribute information is set to the first label and given a negative value (for example, '-1'), the vector may be judged positive when evaluated against the hyperplane, or vice versa; in such a case it can be determined that an error has occurred in user authentication.
If it is determined in step S709 that an error has occurred in the user authentication, the terminal adjusts the criterion for authenticating the terminal user (i.e., the hyperplane) in order to increase the accuracy of the user authentication determination (S711). That is, the terminal can move the hyperplane in the positive or negative direction by applying a predetermined threshold. Specifically, when an error occurs in which another person's touch drag input is judged to be the terminal user's, it is desirable to narrow the region judged to be the terminal user's touch drag and widen the region judged to be the other person's, so a negative threshold can be applied. On the other hand, when an error occurs in which the terminal user's touch drag input is judged to be another person's, it is desirable to widen the region judged to be the terminal user's touch drag and narrow the region judged to be the other person's, so a positive threshold can be applied.
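The boundary adjustment of S711 can be sketched as a threshold nudge, assuming the convention that scores below the threshold are attributed to the terminal user; the step size `delta` and the error labels are illustrative assumptions:

```python
def adjust_threshold(threshold, error_kind, delta=0.05):
    """Shift the decision threshold after an authentication error (S711).

    Convention (assumed): scores below `threshold` mean "terminal user".
    """
    if error_kind == "other_accepted":
        # Another person's drag was judged as the terminal user's:
        # lower the threshold to narrow the terminal-user region.
        return threshold - delta
    if error_kind == "owner_rejected":
        # The terminal user's drag was judged as another person's:
        # raise the threshold to widen the terminal-user region.
        return threshold + delta
    return threshold
```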
Then, the terminal authenticates the terminal user by determining, based on the criterion (i.e., hyperplane) adjusted in step S711, whether a touch drag input input to the terminal is the terminal user's touch drag input or another person's (S713).
On the other hand, if it is determined in step S709 that no error has occurred in the user authentication, the terminal authenticates the terminal user by determining, based on the criterion (i.e., hyperplane) determined in step S707, whether a touch drag input input to the terminal is the terminal user's touch drag input or another person's (S713).
FIG. 8 is a diagram illustrating a user authentication method using a touch drag according to an embodiment of the present invention.
Referring to FIG. 8, when the terminal receives a touch drag input of the terminal user (S801), the terminal extracts the touch drag attribute information of the terminal user from the input touch drag (S803).
Then, the terminal stores touch drag attribute information of the extracted terminal user (S805). The touch drag attribute information of the terminal user can be extracted and accumulated in the terminal every time a touch drag input of the terminal user is made. That is, steps S801 to S805 may be repeatedly performed every time the touch drag input of the terminal user is input to the terminal.
The terminal performs machine learning based on the stored touch drag attribute information of the terminal user and previously stored touch drag attribute information of another person, and determines a criterion for distinguishing the terminal user's touch drag input from the other person's touch drag input (i.e., a criterion for authenticating the terminal user) (S807). Here, the terminal may newly determine the criterion for authenticating the terminal user each time the touch drag attribute information of the terminal user and/or of the other person is updated.
Then, the terminal determines whether the accuracy of user authentication exceeds (or is equal to or greater than) a preset threshold (S809). As described above, although the vector of the terminal user's touch drag attribute information is set to the first label and given a negative value (for example, '-1'), the vector may be judged positive when evaluated against the hyperplane, or vice versa; in such a case it can be determined that an error has occurred in user authentication. The terminal can calculate the accuracy of user authentication from the number of such errors relative to the number of the terminal user's touch drag inputs and/or the number of other persons' touch drag inputs used as the basic information for determining the criterion for user authentication (i.e., the hyperplane).
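The accuracy check of S809 reduces to counting how many of the stored drags the current hyperplane classifies correctly. A minimal sketch, with illustrative parameter names:

```python
def authentication_accuracy(n_owner, n_other, owner_errors, other_errors):
    """Fraction of stored owner/other drags classified correctly (S809).

    owner_errors: owner drags judged as another person's
    other_errors: other persons' drags judged as the owner's
    """
    total = n_owner + n_other
    correct = total - owner_errors - other_errors
    return correct / total

# Example: 200 stored drags, 5 misclassified -> accuracy 0.975.
acc = authentication_accuracy(n_owner=80, n_other=120,
                              owner_errors=2, other_errors=3)
```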
As a result of the determination in step S809, if the accuracy of user authentication exceeds (or is equal to or greater than) the preset threshold, the terminal authenticates the terminal user by determining, based on the criterion (i.e., hyperplane) determined in step S807, whether the touch drag input input to the terminal is the terminal user's touch drag input or another person's (S811).
On the other hand, if it is determined in step S809 that the accuracy of the user authentication is equal to or less than (or less than) the preset threshold, the terminal repeatedly determines, k times (where k is, for example, an integer equal to or larger than 2), whether the touch drag input input to the terminal is the terminal user's touch drag input or another person's (S813), and authenticates the terminal user based on the result of the repeated determinations (S815). In step S813, the terminal may request y touch drags (for example, by displaying a message on the screen of the terminal asking the user to input a touch drag) and determine that the user is the terminal user if at least k of the y inputs (y > k) are judged to be the terminal user's touch drag inputs. Alternatively, the terminal may determine that the user is the terminal user if the touch drag input is judged to be the terminal user's k times consecutively in step S813. Thus, the accuracy of user authentication can be improved by repeatedly determining whether the touch drag input is the terminal user's.
Here, the value of k may be set variably, in inverse proportion to the accuracy of user authentication determined in step S809. For example, the k value may be set to 5 if the accuracy of user authentication is less than 99%, and to 3 if the accuracy is 99% or more and less than 99.5%.
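Both repeated-decision variants described above (k consecutive owner judgements, or k out of y) can be sketched as follows; the boolean-list interface and function name are illustrative assumptions:

```python
def authenticate_repeated(judgements, k, y=None):
    """Decide authentication from repeated per-drag judgements (S813).

    judgements: booleans, True if a drag was judged to be the owner's.
    With y=None, require k consecutive True judgements; otherwise
    require at least k True judgements among the first y (y > k).
    """
    if y is None:
        run = 0
        for ok in judgements:
            run = run + 1 if ok else 0
            if run >= k:
                return True
        return False
    return sum(judgements[:y]) >= k

# Three owner-judged drags in a row authenticate the user.
accepted = authenticate_repeated([True, True, False, True, True, True], k=3)
```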
The terminal can lock or unlock itself through the user authentication method using the touch drag input according to the present invention described above. This will be described with reference to FIGS. 9 and 10.
FIG. 9 is a diagram for explaining a user authentication method using a touch drag according to an embodiment of the present invention.
Referring to FIG. 9, when the terminal receives, on its lock screen, a touch drag input from the user for releasing the lock screen state (S901), the terminal extracts the touch drag attribute information from the input touch drag (S903). At this time, the terminal may display the entire touch screen area or a predetermined area on the touch screen and induce the user to make a touch drag input for unlocking. For example, a touch drag input area may be displayed in a straight-line or arc form on the touch screen of the terminal so as to induce the user to make the touch drag input in the corresponding area.
Then, based on the predetermined user authentication criterion and the touch drag attribute information extracted in step S903, the terminal determines whether the input touch drag is the terminal user's touch drag input or another person's (S905).
If it is determined in step S905 that the touch drag input is the terminal user's, the terminal releases its locked state (S907). FIG. 9 exemplifies releasing the terminal's locked state by a touch drag input alone; however, as described above, other unlocking methods such as a password method or a touch pattern method can be used together with the touch drag input for unlocking the terminal.
On the other hand, if it is determined in step S905 that the input is not the terminal user's touch drag input, the terminal maintains its locked state. The terminal then branches back to step S901, and when it receives a touch drag input from the user again, it may perform steps S903 to S907 again.
FIG. 10 is a diagram for explaining a user authentication method using a touch drag according to an embodiment of the present invention.
Referring to FIG. 10, when the terminal receives a touch drag input from the user on a screen other than the lock screen (S1001), the terminal extracts the touch drag attribute information from the input touch drag (S1003).
Then, based on the touch drag attribute information extracted in step S1003, the terminal determines whether the input touch drag is the terminal user's touch drag input or another person's (S1005).
If it is determined in step S1005 that the input is not the terminal user's touch drag input, the terminal places itself in a locked state (S1007). As in the example of FIG. 8, the terminal may repeatedly determine (k times) whether the touch drag input is the terminal user's or another person's. The terminal may lock itself if the input is judged to be another person's touch drag k times consecutively, or if it is judged to be another person's at least k times out of y inputs (y > k). Thus, the accuracy of user authentication can be improved by repeatedly determining whether the touch drag input is the terminal user's.
On the other hand, if it is determined in step S1005 that the input is the terminal user's touch drag input, the terminal maintains its current screen (e.g., a home screen, an application execution screen, etc.). The terminal then branches back to step S1001, and when it receives a touch drag input from the user again, it can perform steps S1003 to S1007 again.
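The two flows of FIGS. 9 and 10 can be summarized as a small state transition: on the lock screen an owner-judged drag unlocks the terminal, and on any other screen a non-owner drag locks it. The function name and string states below are illustrative assumptions:

```python
def next_state(state, is_owner_drag):
    """One lock-state transition per judged touch drag.

    FIG. 9: from "locked", an owner drag unlocks (S907), anything
    else stays locked.  FIG. 10: from "unlocked", a non-owner drag
    locks the terminal (S1007), an owner drag keeps the screen.
    """
    if state == "locked":
        return "unlocked" if is_owner_drag else "locked"
    return "unlocked" if is_owner_drag else "locked"
```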
FIG. 11 is a diagram illustrating experimental data comparing attribute information on the touch drag inputs of a terminal user and of others according to an embodiment of the present invention.
Referring to FIG. 11, (a) shows attribute values of touch drag inputs made in the natural motion of the terminal user; the terminal user's touch drag inputs have values in the range of about 450 to about 700 on the x-axis and about 500 to about 1100 on the y-axis. FIG. 11(b) shows attribute values of touch drag inputs made in the ordinary motion of another person; the other person's touch drag inputs have values in the range of about 500 to about 650 on the x-axis, and up to about 550 on the y-axis.
As described above, the touch drag attribute information used for user authentication is not limited to two attributes; all 25 types of touch drag attribute information exemplified in Table 1 above can be used for user authentication.
FIG. 12 is a diagram illustrating experimental data comparing attribute information on the touch drag inputs of a terminal user and of others, for each item of touch drag attribute information, according to an embodiment of the present invention.
Referring to FIG. 12, the touch drag attribute values in the natural motion of the terminal user and of others are shown for the 25 attributes used for the touch drag learning exemplified in Table 1 above. In FIG. 12, the learning data (a) for the terminal user's touch drag inputs is represented by "O", and the learning data (b) for other persons' touch drag inputs is represented by "×". As described above, each user's touch drag inputs have unique attribute values for each of the 25 items of touch drag attribute information exemplified in Table 1; by using these user-specific attribute values, it can be determined whose touch drag input a given input is.
As described above, according to the present invention, attribute values unique to each user can be extracted from touch drag inputs made in the user's daily operation of the terminal, and whose touch drag input a given input is can be discriminated from those attribute values.
Embodiments in accordance with the present invention may be implemented by various means, for example, hardware, firmware, software, or a combination thereof. In the case of a hardware implementation, an embodiment of the present invention may be implemented with one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, and the like.
In addition, in the case of an implementation by firmware or software, an embodiment of the present invention may be embodied in the form of a module, a procedure, a function, or the like that performs the functions or operations described above, and recorded on a computer-readable recording medium. The recording medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the recording medium may be those specially designed and constructed for the present invention, or those known and available to those skilled in the art of computer software. Examples of the recording medium include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as compact disc read-only memory (CD-ROM) and digital video discs (DVD); and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include machine language code such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like. Such hardware devices may be configured to operate as one or more software modules to perform the operations of the present invention, and vice versa.
While the present invention has been described in connection with what are presently considered to be practical exemplary embodiments, it is to be understood that the invention is not limited to the disclosed embodiments; various modifications will be apparent to those skilled in the art. Furthermore, although specific terms are used in this specification and the drawings, they are used in a generic sense only to facilitate the description and understanding of the invention, and are not intended to limit its scope. Accordingly, the foregoing detailed description is to be considered in all respects illustrative and not restrictive. The scope of the present invention should be determined by reasonable interpretation of the appended claims, and all changes within the scope of equivalents of the present invention are included in the scope of the present invention.
In addition, a device or terminal according to the present invention may be driven by instructions that cause one or more processors to perform the functions and processes described above. Such instructions may include, for example, interpreted instructions such as script instructions (e.g., JavaScript or ECMAScript instructions), executable code, or other instructions stored on a computer-readable medium. Further, the apparatus according to the present invention may be implemented in a distributed manner across a network, such as a server farm, or may be implemented in a single computer device.
Further, a computer program (also known as a program, software, software application, script, or code) that is embedded in the apparatus according to the present invention and implements the method according to the present invention may be written in any form of programming language, including compiled or interpreted languages and declarative or procedural languages, and may be deployed in any form, including as a standalone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system. A program may be stored in a part of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files storing one or more modules, subprograms, or portions of code). A computer program may be deployed to run on one computer, or on multiple computers located at a single site or distributed across multiple sites and interconnected by a communications network.
Moreover, although operations are depicted in the drawings in a particular order in describing the embodiments according to the present invention, this should not be understood as requiring that such operations be performed in the particular order shown, or in sequential order, to obtain the desired results. In certain cases, multitasking and parallel processing may be advantageous. Also, the separation of the various system components in the above-described embodiments should not be understood as requiring such separation in all embodiments; the described program components and systems may generally be integrated together into a single software product or packaged into multiple software products.
The terminal user authentication method using the touch drag according to the present invention can be applied to various devices equipped with a touch screen.
100: terminal 110:
120: input unit 130: display unit
140: storage unit 150: sensor unit
160: control unit 161: attribute information collecting unit
163: Machine learning unit 165: User authentication unit
167: terminal lock / unlock processing unit
Claims (9)
Extracting touch drag attribute information of the terminal user from the touch drag input of the terminal user upon receiving the touch drag input of the terminal user;
Storing touch drag attribute information of the extracted terminal user;
Performing machine learning based on the stored touch drag attribute information of the terminal user and previously stored touch drag attribute information of another person, and determining a criterion for distinguishing the touch drag input of the terminal user from the touch drag input of the other person; And
Determining whether the touch drag input input to the terminal is a touch drag input of the terminal user based on the determined criteria, and authenticating the terminal user.
Setting the touch drag attribute information of the terminal user and the touch drag attribute information of the other person to a first label and a second label, respectively; And
And performing the machine learning through an SVM (Support Vector Machine) algorithm or a Bayesian algorithm based on the set labels to determine the criterion.
And adjusting the determined criterion in a positive or negative direction by a predetermined value when an error occurs in the determination, according to the determined criterion, of whether an input is the terminal user's touch drag input.
When the determination accuracy according to the determined criterion is lower than a preset threshold value, repeatedly determining, k times (k being, for example, an integer equal to or larger than 2), whether the touch drag input input to the terminal is the terminal user's touch drag input, and authenticating the terminal user accordingly.
Further comprising: releasing the locked state of the terminal when a touch drag input input on the lock screen of the terminal is determined to be the terminal user's touch drag input; and placing the terminal in a locked state when a touch drag input input on a screen other than the lock screen of the terminal is determined to be another person's touch drag input.
And receiving the touch drag attribute information of the other person from an external device.
And transmitting the touch drag attribute information of the user to an external device so that the user's touch drag attribute information is used as the touch drag attribute information of the other person.
The touch drag attribute information of the user and the touch drag attribute information of the other person include the rotation radius of the estimated touch finger, the x coordinate of the estimated touch finger rotation axis, the y coordinate of the estimated touch finger rotation axis, the length of the touch drag, the x coordinate of the point at which the touch drag starts, the y coordinate of the point at which the touch drag starts, the x coordinate of the point at which the touch drag ends, the y coordinate of the point at which the touch drag ends, the x coordinate of the gravity sensor at the start of the touch drag, the y coordinate of the gravity sensor at the start of the touch drag, the z coordinate of the gravity sensor at the start of the touch drag, the average of the major axis of the touch ellipse plane, the variance of the major axis of the touch ellipse plane, the average of the minor axis of the touch ellipse plane, the variance of the minor axis of the touch ellipse plane, the average of the major axis of the elliptical plane of the touch actually recognized at the terminal, the variance of the major axis of the elliptical plane of the touch actually recognized at the terminal, the average of the velocity with respect to the x coordinate of the touch drag, the variance of the velocity with respect to the x coordinate of the touch drag, the average of the velocity with respect to the y coordinate of the touch drag, the variance of the velocity with respect to the y coordinate of the touch drag, the average of the touch drag speed, the variance of the touch drag speed, the average of the lengths of the perpendiculars from each point of the touch drag to the line segment connecting the start point and the end point of the touch drag, and the variance of the lengths of those perpendiculars.
Touch screen for touch drag input;
A storage unit for storing the touch drag attribute information of the terminal user and the touch drag attribute information of the other user;
An attribute information collecting unit for extracting touch drag attribute information of the terminal user from the touch drag input of the terminal user when the touch drag input of the terminal user is received through the touch screen and storing the extracted attribute information in the storage unit;
A machine learning unit for performing machine learning based on the touch drag attribute information of the terminal user and the touch drag attribute information of the other person to determine a criterion for distinguishing the touch drag input of the terminal user from the touch drag input of the other person; And
And a user authentication unit for authenticating the terminal user by determining whether the touch drag input inputted to the touch screen is a touch drag input of the terminal user based on the determined criteria.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020140012748A KR20150092441A (en) | 2014-02-04 | 2014-02-04 | Method for user Authentication through touch dragging, device therefor |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020140012748A KR20150092441A (en) | 2014-02-04 | 2014-02-04 | Method for user Authentication through touch dragging, device therefor |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20150092441A true KR20150092441A (en) | 2015-08-13 |
Family
ID=54056718
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020140012748A KR20150092441A (en) | 2014-02-04 | 2014-02-04 | Method for user Authentication through touch dragging, device therefor |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20150092441A (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20190013307A (en) * | 2017-08-01 | 2019-02-11 | 연세대학교 산학협력단 | Method and Apparatus for Inputting Pattern to Prevent Shoulder Surfing |
KR20190026492A (en) * | 2017-09-05 | 2019-03-13 | 세종대학교산학협력단 | Method for certification and apparatus for executing the method |
KR101980483B1 (en) * | 2018-03-05 | 2019-05-20 | 인하대학교 산학협력단 | Pin input method and system based on user behavior recognition using machine learning |
KR20190129672A (en) * | 2018-05-10 | 2019-11-20 | 세종대학교산학협력단 | Neural network based pattern authentication method and device |
WO2023043118A1 (en) * | 2021-09-16 | 2023-03-23 | 삼성전자 주식회사 | Electronic device and touch recognition method of electronic device |
US11899884B2 (en) | 2021-09-16 | 2024-02-13 | Samsung Electronics Co., Ltd. | Electronic device and method of recognizing a force touch, by electronic device |
-
2014
- 2014-02-04 KR KR1020140012748A patent/KR20150092441A/en active Search and Examination
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E601 | Decision to refuse application | ||
N231 | Notification of change of applicant | ||
AMND | Amendment |