KR20150092441A - Method for user Authentication through touch dragging, device therefor - Google Patents

Method for user authentication through touch dragging, device therefor

Info

Publication number
KR20150092441A
KR20150092441A (application KR1020140012748A)
Authority
KR
South Korea
Prior art keywords
touch drag
touch
terminal
user
attribute information
Prior art date
Application number
KR1020140012748A
Other languages
Korean (ko)
Inventor
전진수
이경한
차정석
Original Assignee
에스케이텔레콤 주식회사
국립대학법인 울산과학기술대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 에스케이텔레콤 주식회사 and 국립대학법인 울산과학기술대학교 산학협력단
Priority to KR1020140012748A priority Critical patent/KR20150092441A/en
Publication of KR20150092441A publication Critical patent/KR20150092441A/en

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 — Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 — Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 — User authentication
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 — Interaction techniques based on graphical user interfaces [GUI]

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Security & Cryptography (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A user authentication method using touch drag, and an apparatus therefor, are disclosed. Specifically, a method according to the present invention for authenticating a terminal user using the user's touch drag attribute information includes: extracting, upon receiving a touch drag input of the terminal user, the user's touch drag attribute information from that input; storing the extracted touch drag attribute information; performing machine learning based on the stored touch drag attribute information of the terminal user and previously stored touch drag attribute information of others, thereby determining a criterion for distinguishing the terminal user's touch drag input from others' touch drag input; and authenticating the terminal user by determining, based on the determined criterion, whether a touch drag input received at the terminal is the terminal user's touch drag input.

Description

TECHNICAL FIELD — The present invention relates to a method for authenticating a user through touch dragging, and a device therefor.

The present invention relates to a user authentication method and, more particularly, to a method and apparatus for authenticating a user through touch dragging on a terminal equipped with a touch screen.

Mobile communication systems were developed to provide voice service while guaranteeing user mobility, but have gradually expanded to data services as well. As a wider variety of services is provided, smartphones now store far more personal information than just call histories and address books, and interest in security systems is therefore increasing.

Security systems in existing mobile communication terminals fall into two types: password-based systems, which use information designated by the user such as a password or a graphical pattern, and attribute-based systems, which use personal attributes such as face recognition or fingerprint recognition.

Korean Registered Patent No. 10-1032351, April 25, 2011 (Name: Pattern recognition and link system of touch input method and method thereof)

However, existing password-based security systems have the disadvantage that others can easily learn the password designated by the user, while attribute-based systems such as face recognition and fingerprint recognition have not been widely commercialized because of accuracy and cost issues.

It is an object of the present invention to provide a method and apparatus for authenticating the legitimate user of a terminal using the touch drag inputs that arise from the user's everyday operation of the terminal.

It is another object of the present invention to provide a method and apparatus for further enhancing terminal security by using, as a security means, attributes of the user that others cannot easily imitate, even if the user authentication means itself is exposed.

It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory, and are not restrictive of the invention as defined by the appended claims.

According to an aspect of the present invention, there is provided a method of authenticating a terminal user using the user's touch drag attribute information, the method comprising: extracting, upon receiving a touch drag input of the terminal user, the user's touch drag attribute information from that input; storing the extracted touch drag attribute information; performing machine learning based on the stored touch drag attribute information of the terminal user and previously stored touch drag attribute information of others, thereby determining a criterion for distinguishing the terminal user's touch drag input from others' touch drag input; and authenticating the terminal user by determining, based on the determined criterion, whether a touch drag input received at the terminal is the terminal user's touch drag input.

According to another aspect of the present invention, there is provided a terminal for authenticating a terminal user using the user's touch drag attribute information, the terminal comprising: a touch screen through which a touch drag is input; a storage unit for storing the touch drag attribute information of the terminal user and of others; an attribute information collecting unit which, when a touch drag input of the terminal user is received through the touch screen, extracts the user's touch drag attribute information from the input and stores it in the storage unit; a machine learning unit which performs machine learning based on the stored touch drag attribute information of the terminal user and of others, thereby determining a criterion for distinguishing between the terminal user's touch drag input and others' touch drag input; and a user authentication unit which authenticates the terminal user by determining, based on the determined criterion, whether a touch drag input received on the touch screen is the terminal user's touch drag input.

According to an embodiment of the present invention, the terminal can authenticate its legitimate user from nothing more than the user's everyday operation, by extracting and using the unique characteristics of the user's touch drag input.

In addition, according to an embodiment of the present invention, even if another person watches a legitimate user release the lock using touch dragging, the probability that this person can replay the touch drag operation exactly like the legitimate user is very low, so the security of the terminal can be improved.

The effects obtainable from the present invention are not limited to those mentioned above; other effects not mentioned will be clearly understood by those skilled in the art from the following description.

The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of the specification, illustrate embodiments of the invention and, together with the description, serve to explain the technical features of the invention.
FIG. 1 is a diagram illustrating a configuration of a terminal according to an embodiment of the present invention.
FIG. 2 is a diagram for explaining a user authentication method using a touch drag according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating a method for setting a criterion for user authentication according to an embodiment of the present invention.
FIGS. 4 to 6 are diagrams illustrating touch drag attribute information according to an embodiment of the present invention.
FIGS. 7 to 10 are diagrams for explaining a method of authenticating a user using a touch drag according to an embodiment of the present invention.
FIG. 11 is a diagram illustrating experimental data comparing attribute information of touch drag inputs of a terminal user and of others according to an embodiment of the present invention.
FIG. 12 is a diagram illustrating experimental data comparing, for each kind of touch drag attribute information, the attribute information of touch drag inputs of a terminal user and of others according to an embodiment of the present invention.

Hereinafter, preferred embodiments according to the present invention will be described in detail with reference to the accompanying drawings. The following detailed description, together with the accompanying drawings, is intended to illustrate exemplary embodiments of the invention and is not intended to represent the only embodiments in which the invention may be practiced. The detailed description includes specific details in order to provide a thorough understanding of the present invention; however, those skilled in the art will appreciate that the present invention may be practiced without these specific details.

In some instances, well-known structures and devices may be omitted or may be shown in block diagram form, centering on the core functionality of each structure and device, to avoid obscuring the concepts of the present invention.

Throughout the specification, when an element is said to "comprise" or "include" a component, this means that it may further include other components, not that it excludes other components, unless stated otherwise. The terms "unit" and "module" in the specification mean a unit that processes at least one function or operation, and may be implemented in hardware, in software, or in a combination of the two. Also, singular expressions such as "a", "an", and "the" may be used in a sense that includes the plural, particularly in the context of the claims, unless the context clearly dictates otherwise.

The specific terminology used in the following description is provided to aid understanding of the present invention, and the use of such specific terminology may be changed into other forms without departing from the technical idea of the present invention.

According to the present invention, the everyday operation of the terminal user, that is, the touch drag operation, is analyzed through machine learning to find the user's unique attributes in this natural motion, and those attributes are used to authenticate the user.

In the following description, the legitimate user of the terminal is referred to as the 'terminal user', and any of the N users other than the terminal user is referred to as 'another person' (or 'others'). 'Touch dragging' means touching the touch screen of the terminal with a touch pen or a finger and then dragging in a certain direction while maintaining contact; one example of touch dragging is the flicking touch. 'Touch drag attribute information' refers to information representing the characteristics of a touch drag input that are unique to each user; the 'touch drag attribute information of the terminal user' means the attribute information of the terminal user's touch drag input, and the 'touch drag attribute information of another person' means the attribute information of another person's touch drag input. 'Authentication' of a user means identifying (or distinguishing) whether the user is the legitimate user (i.e., the terminal user) or another person; if there are two or more terminal users, it means distinguishing each of them from others.

In this specification, a 'terminal' may refer to a user equipment (UE), a mobile station (MS), a mobile subscriber station (MSS), a subscriber station (SS), an advanced mobile station (AMS), a machine-type communication (MTC) device, a machine-to-machine (M2M) device, a device-to-device (D2D) device, a station (STA), and the like. However, the present invention is not limited to these; any terminal equipped with a touch pad or touch screen capable of receiving touch input from a user may correspond to the terminal according to the present invention.

FIG. 1 is a diagram illustrating a configuration of a terminal according to an embodiment of the present invention.

Referring to FIG. 1, a terminal 100 according to an embodiment of the present invention includes a communication unit 110, an input unit 120, a display unit 130, a storage unit 140, a sensor unit 150, and a control unit 160, which together perform the functions for authenticating a terminal user using the user's touch drag input. FIG. 1 is merely an example; the terminal 100 according to the present invention includes at least one of the components shown in FIG. 1, but is not limited thereto. The control unit 160 may include, by function, an attribute information collection unit 161, a machine learning unit 163, a user authentication unit 165, and a terminal lock/unlock processing unit 167. The configuration of the terminal 100 shown in FIG. 1 represents functionally separated elements; to perform the functions according to the present invention, the components may be functionally connected, and any one or more of them may be physically integrated with each other.

More specifically, the communication unit 110 is configured to enable communication with a device other than the terminal (hereinafter referred to as a 'service device') that supports the user authentication method using touch drag according to the present invention, and may include one or more modules enabling wired or wireless communication. In particular, in the present invention, the communication unit 110 can receive the touch drag attribute information of persons other than the terminal user from the service device, and can transmit the attribute information of the terminal user's touch drag input to the service device. By transmitting the terminal user's touch drag attribute information to the service device in this way, it can in turn be used as the others' touch drag attribute information with which another person's terminal authenticates the touch drag input of its own legitimate user.

The input unit 120 generates input data for a user to control the operation of the terminal 100. The input unit 120 may be implemented by a variety of input means; for example, it may include at least one of a key pad, a dome switch, a touch pad (pressure-sensitive/capacitive), a jog wheel, a jog switch, and a voice input unit. In particular, in the present invention, the input unit 120 may include a touch pad (pressure-sensitive/capacitive).

The display unit 130 outputs the operation status and operation results of the terminal 100 to the user under the control of the control unit 160. In particular, in the present invention, the display unit 130 may display the result of determining, from a touch drag input, whether the user is the terminal user, so that the user can recognize the result. The display unit 130 may be implemented by various display means, for example a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT-LCD), a light emitting diode (LED) display, an organic light emitting diode (OLED) display, a flexible display, or a 3D display.

When the display unit 130 and a sensor for sensing a touch operation form a mutual layer structure (hereinafter referred to as a 'touch screen'), the display unit 130 may be used as an input device in addition to an output device. The touch sensor may take the form of, for example, a touch film, a touch sheet, or a touch pad. The touch sensor may be configured to convert a change in pressure applied to a specific portion of the display unit 130, or a change in capacitance generated at a specific portion of the display unit 130, into an electrical input signal. The touch sensor can be configured to detect not only the touched position and area but also the pressure at the time of the touch. When there is a touch input to the touch sensor, the corresponding signal(s) are transmitted to the control unit 160, which can thereby determine which region of the display unit 130 was touched.

The storage unit 140 stores the data and/or programs necessary for the operation of the terminal 100. In particular, in the present invention, when a touch drag input of the terminal user is received through the input unit 120, the storage unit 140 can store, under the control of the control unit 160, the touch drag attribute information of the terminal user extracted by the control unit 160. The terminal user's touch drag attribute information can be accumulated in the storage unit 140 every time the terminal user makes a touch drag input, and the stored attribute information can be used as base data when the control unit 160 performs machine learning to set the authentication criterion.

Also, in the present invention, the storage unit 140 may store the touch drag attribute information of others. The others' touch drag attribute information may be collected by the service provider and downloaded in advance from the service device through the communication unit 110 under the control of the control unit 160, or may be stored in the storage unit 140 when the terminal is manufactured or the service is provided. The others' touch drag attribute information stored in the storage unit 140 may likewise be used as base data when the control unit 160 performs machine learning to set the criterion for user authentication.

The storage unit 140 may be located inside or outside the control unit 160 and may be connected to the control unit 160 by various well-known means.

The sensor unit 150 senses the current state of the terminal 100, such as its position, the presence or absence of user contact, its orientation, and its acceleration or deceleration, and generates a corresponding sensing signal. In particular, in the present invention, the sensor unit 150 may include motion sensors for sensing the movement, direction, and speed of the terminal 100; examples of such sensors include a geomagnetic sensor, an acceleration sensor, a gyro sensor, and an altimeter.

The geomagnetic sensor detects the direction of the magnetic field generated by the Earth and can determine orientation like a compass. Geomagnetism is a property of the Earth itself, and it changes periodically or irregularly rather than remaining constant; to determine the Earth's magnetic field at any point, three independent components are required, such as the horizontal component, the azimuth, and the dip. With X-, Y-, and Z-axis sensors arranged in the terminal 100, the X- and Y-axis sensors sense the lateral directions and the Z-axis sensor senses the up and down directions, so that the direction of movement of the terminal 100 can be measured. The acceleration sensor directly detects the dynamic change (acceleration) of a moving body carrying the sensor, acceleration being the physical quantity that specifies a change in speed; a typical example is the piezoelectric acceleration sensor. The silicon structure of the sensor deforms according to the acceleration caused by the movement of the body, the deformation generates a charge in the silicon, and the generated charge changes the capacitance of the sensor; the motion of the moving body is detected by converting the amount of capacitance change into a corresponding voltage value. The gyro sensor detects rotation: when a Coriolis force is generated perpendicular to the rotational angular velocity, the gyro sensor detects this perpendicular force on the same principle as the acceleration sensor. Finally, the altimeter is a sensor that measures the air pressure difference, which changes with altitude.

The control unit 160 performs overall control of the terminal 100 and controls the signal flow between the communication unit 110, the input unit 120, the display unit 130, the storage unit 140, and the sensor unit 150. In particular, in the present invention, the control unit 160 implements the functions, processes, and/or methods for authenticating a user using a touch drag input proposed in FIGS. 2 to 10.

Each function of the control unit 160 will now be described in detail. First, when there is a touch drag input of the terminal user through the input unit 120, the attribute information collection unit 161 extracts the terminal user's touch drag attribute information from that input and stores the extracted attribute information in the storage unit 140. The attribute information collection unit 161 may extract the terminal user's touch drag attribute information each time a touch drag input is received and accumulate it in the storage unit 140. As described above, since touch drag attribute information represents the characteristics of a touch drag, the attribute information collection unit 161 may skip extraction for touch inputs that are not drags, such as a tap, a double tap, or a long tap.
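The extraction step above can be sketched as follows. This is a minimal illustration, not the patent's method: the event format `(t, x, y)`, the chosen features (duration, path length, mean speed, direction), and the tap-rejection threshold are all illustrative assumptions.

```python
import math

def extract_drag_attributes(events):
    """Extract touch drag attribute information from a sequence of
    touch samples (t, x, y). Returns None for non-drag inputs such as
    taps, which the attribute information collection unit skips.
    Feature names and thresholds are illustrative assumptions."""
    if len(events) < 2:
        return None  # a single sample is a tap, not a drag
    t0, x0, y0 = events[0]
    t1, x1, y1 = events[-1]
    duration = t1 - t0
    # Path length summed over consecutive samples.
    length = sum(math.hypot(bx - ax, by - ay)
                 for (_, ax, ay), (_, bx, by) in zip(events, events[1:]))
    if length < 10.0:  # too little movement: treat as a tap/long tap, skip
        return None
    return {
        "duration": duration,
        "length": length,
        "speed": length / duration if duration > 0 else 0.0,
        "direction": math.atan2(y1 - y0, x1 - x0),
    }
```

Each accepted drag would then be appended to the storage unit's accumulated sample set for later machine learning.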

The attribute information collection unit 161 may also receive the touch drag attribute information of persons other than the terminal user from the service device through the communication unit 110 and store it in the storage unit 140. To do so, it can transmit a message requesting others' touch drag attribute information to the service device through the communication unit 110 and receive the attribute information in response. The attribute information collection unit 161 may request others' touch drag attribute information from the service device at predetermined time intervals, when an incorrect user authentication result occurs, or when the accuracy of user authentication falls below a preset threshold. Alternatively, upon receiving a notification message informing it that the others' touch drag attribute information has been updated, it can access the service device and receive the updated attribute information.
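The three request triggers described above (a periodic timer, an authentication error, and accuracy below a threshold) can be combined in a small predicate. All parameter names here are illustrative, not from the patent.

```python
def should_request_others_data(now, last_request, interval,
                               auth_accuracy, threshold,
                               auth_error_occurred):
    """Decide whether to request others' touch drag attribute
    information from the service device, per the triggers described
    above. Names and the exact combination are illustrative."""
    if now - last_request >= interval:   # predetermined time interval
        return True
    if auth_error_occurred:              # incorrect authentication result
        return True
    if auth_accuracy < threshold:        # accuracy below preset threshold
        return True
    return False
```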

The machine learning unit 163 of the control unit 160 performs machine learning using, as base data, the terminal user's touch drag attribute information extracted by the attribute information collection unit 161 and stored in the storage unit 140, together with the others' touch drag attribute information stored in the storage unit 140, and thereby determines a criterion for distinguishing the terminal user's touch drag input from others' touch drag input (i.e., a criterion for authenticating the terminal user). For example, the machine learning unit 163 may determine the criterion, namely a hyperplane, by performing machine learning with a Support Vector Machine (SVM) algorithm: the terminal user's touch drag attribute information is assigned a first label, the others' touch drag attribute information is assigned a second label, and the SVM algorithm determines the criterion that separates the two. If N kinds of attribute information of the touch drag input are used for user authentication, the hyperplane is of dimension N-1; for example, when two kinds of attribute information are used, the hyperplane is a one-dimensional line. The machine learning unit 163 may alternatively perform machine learning with a Bayesian algorithm to determine the criterion for authenticating the terminal user. The machine learning algorithms performed by the machine learning unit 163 are described in more detail later.
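The two-label learning step can be sketched in a few lines. Note the hedge: a plain perceptron is used here as a minimal stand-in for the SVM the text describes, because both learn a separating hyperplane w·x + b = 0; an actual SVM would additionally choose the hyperplane with maximum margin. All names are illustrative.

```python
def train_linear_criterion(owner_samples, other_samples, epochs=100, lr=0.1):
    """Learn a linear separating criterion (hyperplane w.x + b = 0)
    between the terminal user's drag features (first label, +1) and
    others' (second label, -1). Perceptron stand-in for the SVM."""
    dim = len(owner_samples[0])
    w = [0.0] * dim
    b = 0.0
    data = [(x, 1) for x in owner_samples] + [(x, -1) for x in other_samples]
    for _ in range(epochs):
        for x, y in data:
            score = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * score <= 0:            # misclassified: nudge the hyperplane
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

def is_owner(w, b, x):
    """Classify a new drag by which side of the hyperplane it falls on."""
    return sum(wi * xi for wi, xi in zip(w, x)) + b > 0
```

With two feature kinds the learned hyperplane is exactly the one-dimensional line mentioned above.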

Since the terminal user's touch drag attribute information is accumulated in the storage unit 140, the machine learning unit 163 may newly determine the criterion for authenticating the terminal user each time the terminal user's touch drag attribute information and/or the others' touch drag attribute information is updated, or may newly determine the criterion at predetermined intervals.

In addition, the machine learning unit 163 can detect that an error has occurred in user authentication, either by verifying the terminal user's and/or others' touch drag attribute information against the determined criterion, or from the result of the user authentication unit 165's judgment of an input touch drag based on that criterion. If an incorrect result is obtained when judging, based on the criterion determined by the machine learning unit 163, whether a touch drag input is the terminal user's or another person's, the machine learning unit 163 can reset (or adjust) the criterion for user authentication in order to increase the accuracy of the authentication decision.

The user authentication unit 165 of the control unit 160 receives a touch drag input through the input unit 120 and authenticates the user of that input based on the criterion determined by the machine learning unit 163. That is, it determines whether the input touch drag is the terminal user's touch drag input, and authenticates the user accordingly.

As described above, the machine learning unit 163 can determine the criterion for authenticating the terminal user, namely the hyperplane, by performing machine learning with the SVM algorithm. The user authentication unit 165 then determines on which side of the hyperplane determined by the machine learning unit 163 the attribute information of a touch drag input received through the input unit 120 lies, and thereby determines whether the input is the terminal user's touch drag input or another person's touch drag input.
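The side-of-hyperplane test itself is just the sign of the linear score. A minimal sketch, assuming the hyperplane is given as coefficients `w` and offset `b` (whatever training procedure produced them); the sign convention is an illustrative assumption.

```python
def hyperplane_side(w, b, features):
    """Return which side of the hyperplane w.x + b = 0 a new drag's
    feature vector lies on: positive is taken as the terminal user's
    side, negative as another person's. Convention is illustrative."""
    score = sum(wi * xi for wi, xi in zip(w, features)) + b
    return "owner" if score > 0 else "other"
```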

Also, when the machine learning unit 163 verifies the terminal user's and/or others' touch drag attribute information against the determined criterion, or when the user authentication unit 165 judges input touch drags based on that criterion, the accuracy of user authentication may fall short of a preset threshold. In that case, the user authentication unit 165 can determine that the user is the terminal user only when a plurality of touch drags are input and the touch drags are judged to be the terminal user's consecutively, or more than a predetermined number of times.
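The consecutive-decision fallback described above can be sketched as follows; the required streak length is an illustrative parameter, not a value from the patent.

```python
def authenticate_low_confidence(decisions, required=3):
    """When single-drag accuracy is below the preset threshold, accept
    the user only after `required` consecutive drags are all judged to
    be the terminal user's. `decisions` is a sequence of per-drag
    owner/other judgments (True = judged owner)."""
    streak = 0
    for judged_owner in decisions:
        streak = streak + 1 if judged_owner else 0
        if streak >= required:
            return True
    return False
```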

The terminal lock/unlock processing unit 167 of the control unit 160 locks or unlocks the terminal 100 based on the result determined by the user authentication unit 165 from the touch drag input. Using the touch drag attribute information of a touch drag input made on the lock screen of the terminal 100, the terminal lock/unlock processing unit 167 can unlock the terminal 100 based on the result determined by the user authentication unit 165. Conversely, if the user authentication unit 165 determines, using the touch drag attribute information of a touch drag input made on a screen other than the lock screen, that the user is not the terminal user, the terminal lock/unlock processing unit 167 can lock the terminal 100. Examples of screens other than the lock screen include an application execution screen of the terminal 100 and the home screen. Here, the home screen refers to a state in which no application execution screen is displayed, generally the screen displayed first when the terminal 100 is unlocked; it may also include a state in which the terminal 100 is executing an application whose execution screen is not displayed. The home screen image can be freely changed according to the user's settings, and the user can freely arrange the icons of desired applications on it.
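The lock/unlock behaviour described above reduces to a small state machine: an owner-judged drag on the lock screen unlocks, and an other-judged drag on any other screen locks. A minimal sketch with illustrative names:

```python
class LockController:
    """Sketch of the terminal lock/unlock processing unit's behaviour
    based on the user authentication unit's per-drag judgment."""
    def __init__(self):
        self.locked = True
    def on_drag(self, judged_owner, on_lock_screen):
        """Update lock state for one judged drag; returns new state."""
        if self.locked and on_lock_screen and judged_owner:
            self.locked = False            # owner's drag releases the lock
        elif not self.locked and not on_lock_screen and not judged_owner:
            self.locked = True             # stranger's drag re-locks the terminal
        return self.locked
```

In practice this result could also be combined with a password or touch pattern check, as the next paragraph notes.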

In addition, the terminal lock/unlock processing unit 167 may lock or unlock the terminal 100 by combining the result determined by the user authentication unit 165 from the touch drag input with one or more other user authentication methods. Examples of such other methods include a password method and a touch pattern method. Here, the touch pattern method refers to comparing whether a touch pattern input on the lock screen is the same as a touch pattern previously designated by the user (for example, an 'A'-shaped or 'b'-shaped pattern).

Hereinafter, the user authentication method using a touch drag according to the present invention will be described in detail with reference to the drawings. In particular, it is assumed that the terminal 100 to which the present invention is applicable includes the display unit 130 as a touch screen.

FIG. 2 is a diagram for explaining a user authentication method using a touch drag according to an embodiment of the present invention.

Referring to FIG. 2, when the terminal receives a touch drag input of the terminal user (S201), it extracts the terminal user's touch drag attribute information from the input (S203).

Then, the terminal stores the extracted touch drag attribute information of the terminal user (S205). The terminal user's touch drag attribute information can be extracted and accumulated in the terminal every time the terminal user makes a touch drag input; that is, steps S201 to S205 may be repeated each time a touch drag input of the terminal user is received at the terminal.

The terminal performs machine learning based on the stored touch drag attribute information of the terminal user and previously stored touch drag attribute information of other people, and thereby determines a criterion for distinguishing the terminal user's touch drag input from other people's touch drag input (i.e., a criterion for authenticating the terminal user) (S207). Here, the terminal may determine the criterion anew each time the touch drag attribute information of the terminal user and/or of the other people is updated, or may newly determine it periodically.

In step S209, the terminal determines, based on the criterion determined in step S207, whether a touch drag entered into the terminal is a touch drag input of the terminal user or of another person.

Machine learning is the process of learning from given data according to a specific algorithm to construct a discrimination criterion that classifies the data into specific groups (or sets, classes), so that when new data arrives, the group to which it belongs can be predicted. The SVM algorithm can be used as one example of machine learning: by performing machine learning with the SVM algorithm, the terminal can determine the criterion for authenticating the terminal user in the form of a hyperplane. The SVM algorithm finds an optimal hyperplane that separates the given data into two groups. Hereinafter, the user authentication method using touch dragging will be described in more detail taking the SVM algorithm as an example.
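As an illustration of this step (not part of the patent disclosure), the following Python sketch trains a minimal linear boundary with hinge-loss subgradient descent, which is the core of a linear SVM. The attribute values, the label convention (-1 for the terminal user, +1 for another person), and the hyperparameters are all hypothetical:

```python
def train_linear_svm(X, y, lr=0.01, lam=0.01, epochs=300):
    """Fit a linear boundary W.X + b = 0 by hinge-loss subgradient descent
    (the core of a linear SVM). X: feature vectors; y: labels, -1 for the
    terminal user and +1 for another person (an assumed convention)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            if margin < 1:
                # point inside the margin: step toward classifying it correctly
                w = [wj - lr * (lam * wj - yi * xj) for wj, xj in zip(w, xi)]
                b += lr * yi
            else:
                # only the regularization term shrinks w
                w = [wj * (1 - lr * lam) for wj in w]
    return w, b

def classify(w, b, x):
    """Sign of W.X + b: -1 -> terminal user, +1 -> another person."""
    return -1 if sum(wj * xj for wj, xj in zip(w, x)) + b < 0 else 1

# hypothetical two-attribute samples: the user's drags cluster near (1, 1),
# other people's drags cluster near (3, 3)
X = [[1.0, 1.2], [0.8, 1.0], [1.1, 0.9], [3.0, 3.1], [2.9, 3.3], [3.2, 2.8]]
y = [-1, -1, -1, 1, 1, 1]
w, b = train_linear_svm(X, y)
print(classify(w, b, [1.0, 1.0]), classify(w, b, [3.0, 3.0]))
```

In practice the patent's terminal would use the 25 attributes of Table 1 rather than two, but the structure of the learning step is the same.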

FIG. 3 is a diagram illustrating a method for setting criteria for user authentication according to an embodiment of the present invention.

In FIG. 3, it is assumed that a criterion for user authentication is set using two kinds of touch drag attribute information (x1, x2).

First, the terminal converts each stored item of touch drag attribute information of the terminal user and of the other person into a vector value, and maps these vector values onto a multidimensional plane. Here, the dimension of the vector values and of the multidimensional plane equals the number of touch drag attribute items used for user authentication. In FIG. 3, the vector values of the touch drag attribute information are mapped onto the two-dimensional plane spanned by x1 and x2; when N attribute items are used, each vector value has components x1, x2, ..., xN.

Specifically, when the number of touch drag attribute items used for user authentication is N and the touch drag samples of the terminal user or of other people to be analyzed are indexed by i, the data vectors for the touch drag attribute information are X1, X2, ..., Xi, each of dimension N, and Xi can be expressed as in Equation (1) below.

Xi = (xi1, xi2, ..., xiN)   (1)

Here, xiN denotes the N-th vector element of the data vector Xi.

Then, each touch drag attribute vector of the terminal user and of the other person is classified into one of the groups (or sets, classes). In FIG. 3, "O" represents a vector of the terminal user's touch drag attribute information, and "X" represents a vector of another person's. That is, the terminal can set the set of the terminal user's touch drag attribute information as the first label and the set of the other people's touch drag attribute information as the second label. For example, the group of the terminal user's attribute vectors may be assigned a negative value (for example, '-1'), and the group of the other people's attribute vectors a positive value (for example, '1').

Then, from the distribution of the terminal user's and the other people's touch drag attribute vectors, the terminal determines a criterion (i.e., a hyperplane) for distinguishing the two groups. For example, in the case of FIG. 3, the terminal can choose the straight line separating the two groups so that the distance from the line to the nearest data vectors of each group, i.e., the margin, is maximized; this line can be expressed as Equation (2) below.

W · X + b = 0   (2)

In Equation (2), X represents a vector of touch drag attribute information of the terminal user or of another person, W denotes a weight vector, and b denotes a bias; both W and b can be determined by machine learning.

In the case of FIG. 3, the straight line located between the group of the terminal user's touch drag attribute information and the group of the other people's represents the hyperplane for authenticating the terminal user. Because FIG. 3 uses two kinds of attribute information, the hyperplane is a one-dimensional (i.e., (N-1)-dimensional, with N = 2) straight line; in general, when N kinds of touch drag attribute information are used, the hyperplane has (N-1) dimensions.

After the hyperplane is determined, when a user enters a touch drag, the terminal determines whether it is the terminal user's touch drag input according to which side of the determined hyperplane the vector value of the attribute information of that input falls on.

This can be expressed by the following equation (3).

y = W · X + b   (3)

In Equation (3), y may take a negative value (for example, '-1') or a positive value (for example, '1'), and the group of the vector is determined by the value of y. That is, by applying the attribute information of a touch drag entered into the terminal to Equation (3) and examining the derived y value, the terminal can determine whether that input is a touch drag input of the terminal user or of another person.

The terminal may also assess the suitability of the hyperplane by applying the stored touch drag attribute information of the terminal user or of other people to Equation (3). For example, if a vector of the terminal user's touch drag attribute information was assigned a negative value (for example, '-1') but applying it to Equation (3) yields a positive value, the hyperplane has produced an inaccurate result. In the example of FIG. 3, a vector of another person's attribute information (i.e., an 'X') appears on the terminal user's side of the hyperplane, and a vector of the terminal user's attribute information (i.e., an 'O') appears on the other side. The latter is a touch drag input of the terminal user that is judged to be another person's (a false negative); the former, conversely, is another person's touch drag input that is judged to be the terminal user's (a false positive). In such cases the hyperplane, i.e., the criterion for user authentication, needs to be readjusted in order to increase the accuracy of the user authentication decision, as will be described in detail later.
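To make this suitability check concrete, the following Python sketch (illustrative only; the toy hyperplane and the label convention of -1 for the terminal user and +1 for another person are assumptions) counts both error types against a fixed hyperplane:

```python
def error_rates(w, b, X, y):
    """Count both error types of a hyperplane W.X + b = 0 over labelled
    drags. Label convention (an assumption): -1 = terminal user, +1 = other."""
    false_neg = false_pos = 0
    for xi, yi in zip(X, y):
        pred = -1 if sum(wj * xj for wj, xj in zip(w, xi)) + b < 0 else 1
        if yi == -1 and pred == 1:
            false_neg += 1  # the user's drag judged to be another person's
        elif yi == 1 and pred == -1:
            false_pos += 1  # another person's drag judged to be the user's
    return false_neg / len(y), false_pos / len(y)

# toy hyperplane x1 + x2 - 4 = 0 and four labelled drags; the last sample
# falls on the user's side despite being another person's (a false positive)
X = [[1.0, 1.0], [0.9, 1.1], [3.0, 3.0], [1.9, 1.9]]
y = [-1, -1, 1, 1]
fn, fp = error_rates([1.0, 1.0], -4.0, X, y)
print(fn, fp)  # 0.0 0.25
```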

For convenience of description, two items of touch drag attribute information are assumed in FIG. 3, but the touch drag attribute information used for user authentication is not limited thereto.

Table 1 illustrates touch drag attribute information according to an embodiment of the present invention.

 1. r (radius): estimated turning radius of the thumb
 2. x (x_center): estimated x coordinate of the thumb rotation axis
 3. y (y_center): estimated y coordinate of the thumb rotation axis
 4. l (length): length of the touch drag
 5. xs (x_start): x coordinate of the point where the touch started
 6. ys (y_start): y coordinate of the point where the touch started
 7. xe (x_end): x coordinate of the point where the touch ended
 8. ye (y_end): y coordinate of the point where the touch ended
 9. gx (gravity_x): x coordinate of the gravity sensor at the start of the touch
10. gy (gravity_y): y coordinate of the gravity sensor at the start of the touch
11. gz (gravity_z): z coordinate of the gravity sensor at the start of the touch
12. tMA (touchMajor Average): average of the major radius of the touched ellipse plane
13. tMV (touchMajor Variance): variance of the major radius of the touched ellipse plane
14. tmA (touchMinor Average): average of the minor radius of the touched ellipse plane
15. tmV (touchMinor Variance): variance of the minor radius of the touched ellipse plane
16. wMA (widthMajor Average): average of the major radius of the ellipse plane actually recognized by the terminal
17. wMV (widthMajor Variance): variance of the major radius of the ellipse plane actually recognized by the terminal
18. xvA (velocity of x Average): average of the touch speed along the x coordinate
19. xvV (velocity of x Variance): variance of the touch speed along the x coordinate
20. yvA (velocity of y Average): average of the touch speed along the y coordinate
21. yvV (velocity of y Variance): variance of the touch speed along the y coordinate
22. vA (velocity Average): average of the touch speed
23. vV (velocity Variance): variance of the touch speed
24. ppA (perpendicular Average): average length of the perpendiculars from each touch point to the segment connecting the start and end points
25. ppV (perpendicular Variance): variance of the lengths of the perpendiculars from each touch point to the segment connecting the start and end points

Referring to Table 1, the touch drag attribute information (i.e., the attribute information of both the terminal user's and other people's touch drag inputs) comprises the estimated turning radius of the touch finger; the x and y coordinates of the estimated rotation axis of the touch finger; the length of the touch drag and the x and y coordinates of its start and end points; the x, y, and z coordinates of the gravity sensor at the start of the touch drag; the averages and variances of the major and minor radii of the touched ellipse plane and of the ellipse plane actually recognized by the terminal; the averages and variances of the touch drag speed overall and along each of the x and y coordinates; and the average and variance of the lengths of the perpendiculars from each point of the touch drag to the line segment connecting its start and end points.
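As a hedged illustration of how a few of the Table 1 attributes might be computed from raw drag samples (the (x, y, t) sampling format and the attribute names used here are assumptions for the sketch, not specified by the patent):

```python
import math

def extract_attributes(points):
    """Sketch of extracting a few Table-1-style attributes from one drag.
    points: list of (x, y, t) samples; names follow Table 1 loosely."""
    xs, ys, ts = zip(*points)
    attrs = {
        "x_start": xs[0], "y_start": ys[0],
        "x_end": xs[-1], "y_end": ys[-1],
    }
    # drag length: sum of distances between consecutive samples
    attrs["length"] = sum(
        math.hypot(xs[i + 1] - xs[i], ys[i + 1] - ys[i])
        for i in range(len(points) - 1)
    )
    # per-segment speeds, then their mean and variance (vA / vV)
    speeds = [
        math.hypot(xs[i + 1] - xs[i], ys[i + 1] - ys[i]) / (ts[i + 1] - ts[i])
        for i in range(len(points) - 1)
    ]
    mean = sum(speeds) / len(speeds)
    attrs["velocity_avg"] = mean
    attrs["velocity_var"] = sum((s - mean) ** 2 for s in speeds) / len(speeds)
    return attrs

drag = [(0.0, 0.0, 0.0), (3.0, 4.0, 1.0), (6.0, 8.0, 2.0)]
a = extract_attributes(drag)
print(a["length"], a["velocity_avg"])  # 10.0 5.0
```

The sensor-based attributes (gravity_x/y/z) and the contact-ellipse attributes would come from platform APIs rather than from the coordinate trace alone.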

Each item of the touch drag attribute information illustrated in Table 1 will be described in detail with reference to FIGS. 4 to 6 below.

FIGS. 4 to 6 are diagrams illustrating touch drag attribute information according to an embodiment of the present invention.

Referring to FIG. 4, it is assumed that the user's touch drag runs from a start point 401 to an end point 403 on the screen of the terminal, as shown in FIG. 4(a).

First, touch rotation radius information reflecting the length of the user's finger may be used as attribute information of the touch drag. 'radius' means the turning radius of the touch drag according to the estimated length of the touch finger, and the terminal can estimate it using the arc 405 drawn from the touch drag start point 401 to the end point 403, as shown in FIG. 4(b). That is, the arc 405 through the touch drag start point 401 and the end point 403 is extended to generate a virtual circle (or ellipse) 407, and the radius 409 of the virtual circle 407 can be taken as the estimated turning radius of the touch finger.

'x_center' and 'y_center' mean the x and y coordinates of the estimated rotation axis of the touch finger, respectively, and correspond to the center point 411 of the virtual circle (or ellipse) 407 generated by extending the arc from the touch drag start point 401 to the end point 403.
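One simple way to realize this estimation, sketched here in Python under the assumption that three sampled points of the arc (the start point, an intermediate point, and the end point) are available, is the standard circumscribed-circle construction; this is an illustration, not the patent's stated method:

```python
def circle_through(p1, p2, p3):
    """Circle through three sampled points of the drag arc.
    Returns (cx, cy, r): estimated rotation axis and turning radius."""
    ax, ay = p1
    bx, by = p2
    cx_, cy_ = p3
    d = 2 * (ax * (by - cy_) + bx * (cy_ - ay) + cx_ * (ay - by))
    if abs(d) < 1e-9:
        return None  # collinear points: no finite turning radius
    ux = ((ax**2 + ay**2) * (by - cy_) + (bx**2 + by**2) * (cy_ - ay)
          + (cx_**2 + cy_**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx_ - bx) + (bx**2 + by**2) * (ax - cx_)
          + (cx_**2 + cy_**2) * (bx - ax)) / d
    r = ((ax - ux) ** 2 + (ay - uy) ** 2) ** 0.5
    return ux, uy, r

# three points lying on a circle of radius 5 centred at the origin
cx, cy, r = circle_through((5.0, 0.0), (0.0, 5.0), (-5.0, 0.0))
print(cx, cy, r)
```

With more than three samples, a least-squares circle fit over all drag points would give a more robust estimate of 'radius', 'x_center', and 'y_center'.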

Information about the shape of the touch drag (its length, start point, end point, and so on) can also be used as attribute information. The length 'length' of the touch drag means the length from the touch drag start point 401 to the end point 403. The x coordinate 'x_start' and the y coordinate 'y_start' of the point where the touch drag starts are the coordinates of the touch drag start point 401, and the x coordinate 'x_end' and the y coordinate 'y_end' of the point where the touch drag ends are the coordinates of the touch drag end point 403.

As attribute information of the touch drag, the sensing value of a sensor at the touch start point or end point can also be used. Although only the gravity sensor is illustrated in Table 1, sensing values of sensors other than the gravity sensor may of course be used. 'gravity_x', 'gravity_y', and 'gravity_z' are, respectively, the x, y, and z coordinates of the gravity sensor mounted on the terminal at the time the terminal recognizes the start point 401 of the touch drag.

As attribute information of the touch drag, information on the strength of the user's touch, inferred from the contact area at the touch point, can also be used. 'tMA' and 'tMV' mean the average and variance of the major radius of the touched ellipse plane, and 'tmA' and 'tmV' mean the average and variance of its minor radius. Likewise, 'wMA' and 'wMV' mean the average and variance of the major radius of the ellipse plane actually recognized by the terminal. This will be described in more detail with reference to FIG. 5.

FIG. 5(a) illustrates the ellipse of a touch as actually recognized by the terminal, FIG. 5(b) illustrates the ellipse of the user's finger touching the terminal, and FIG. 5(c) compares, according to the touch pressure of the user's finger, the sizes of the ellipse actually recognized by the terminal and the ellipse of the finger touching it.

The ellipse of the user's finger actually touching the terminal means the (roughly elliptical) portion of the finger in contact with the touch screen substrate of the terminal (e.g., a film, plastic, glass, or the like). Depending on the touch pressure applied to the touch screen substrate, the size (major and minor radii) of the ellipse of the touch actually recognized by the terminal and the size of the ellipse of the user's finger touching the terminal may differ.

As shown in FIG. 5(c), when the touch pressure is 1, the ellipse of the touch actually recognized by the terminal and the ellipse of the user's finger touching the terminal have the same size; in general, the size of the recognized ellipse varies in proportion to the ellipse of the finger according to the touch pressure. If the touch pressure is greater than 1, the ellipse of the touch actually recognized by the terminal may be larger than the ellipse of the user's finger, and if the touch pressure is less than 1, it may be smaller. For convenience of explanation the ellipse of the finger is assumed fixed here, but the present invention is not limited thereto. Also, when the touch pressure lies within a predetermined range (for example, 0.5 < touch pressure < 1.5), the two ellipses may be treated as having the same size; only when the touch pressure falls above or below that range is the recognized ellipse treated as larger or smaller, respectively, than the ellipse of the finger.

As attribute information of the touch drag, speed information indicating how fast the touch input was made can be used. 'xvA' and 'xvV' respectively denote the mean and variance of the touch drag speed along the x coordinate, and 'yvA' and 'yvV' the mean and variance along the y coordinate; 'vA' and 'vV' mean the average and variance of the overall touch drag speed. Assuming, as in the example of FIG. 4, that the user's touch drag runs from the start point 401 to the end point 403 on the screen, these speeds can be calculated from the coordinate values of the start point 401 and the end point 403 and the time taken between them. It is also possible to calculate the speed of the touch input using a sensor.

As attribute information of the touch drag, the lengths of the perpendiculars from each touch point to the line segment connecting the start and end points of the drag can be used. 'ppA' and 'ppV' respectively represent the average and the variance of the lengths of the perpendiculars dropped from each point of the touch drag onto the line segment connecting its start point and end point.

Referring to FIG. 6, it is assumed that the user's touch drag runs on the screen of the terminal from the start point 601 to the end point 603, as in FIG. 4. In this case, the values of 'ppA' and 'ppV' shown in FIG. 6(b) can be calculated from the lengths of the perpendiculars dropped from each point of the drag trajectory onto the line segment from the touch start point 601 to the end point 603.
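A minimal Python sketch of this computation (illustrative only; the point-sampling format is an assumption):

```python
import math

def perpendicular_stats(points):
    """ppA / ppV: mean and variance of the perpendicular distances from each
    sampled drag point to the segment joining the start and end points."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    seg = math.hypot(dx, dy)
    # distance from point P to the line through start and end (cross product)
    dists = [
        abs(dy * (px - x0) - dx * (py - y0)) / seg
        for px, py in points[1:-1]  # interior points only
    ]
    mean = sum(dists) / len(dists)
    var = sum((d - mean) ** 2 for d in dists) / len(dists)
    return mean, var

# a straight drag bowed upward by 2 units at its midpoint
pts = [(0.0, 0.0), (5.0, 2.0), (10.0, 0.0)]
ppA, ppV = perpendicular_stats(pts)
print(ppA, ppV)  # 2.0 0.0
```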

Meanwhile, as described above and as in the example of FIG. 3, a vector of another person's touch drag attribute information (i.e., an 'X') may fall on the terminal user's side of the hyperplane, and a vector of the terminal user's touch drag attribute information (i.e., an 'O') on the other side. That is, a touch drag input of the terminal user may be judged to be another person's, and another person's touch drag input may be judged to be the terminal user's. In this case, the hyperplane may be readjusted, or the number of authentication attempts may be increased, in order to improve the accuracy of user authentication; this is described with reference to FIGS. 7 and 8.

FIG. 7 is a diagram for explaining a user authentication method using a touch drag according to an embodiment of the present invention.

Referring to FIG. 7, when the terminal receives a touch drag from the terminal user (S701), the terminal extracts the touch drag attribute information of the terminal user from the input touch drag (S703).

Then, the terminal stores the extracted touch drag attribute information of the terminal user (S705). The touch drag attribute information can be extracted and accumulated in the terminal every time the terminal user makes a touch drag input. That is, steps S701 to S705 may be repeated each time a touch drag input of the terminal user is entered into the terminal.

The terminal performs machine learning based on the stored touch drag attribute information of the terminal user and previously stored touch drag attribute information of other people, and thereby determines a criterion for distinguishing the terminal user's touch drag input from other people's touch drag input (i.e., a criterion for authenticating the terminal user) (S707). Here, the terminal may determine the criterion anew each time the touch drag attribute information of the terminal user and/or of the other people is updated, or may newly determine it periodically.

Then, the terminal determines whether an error has occurred in user authentication (S709). That is, when a vector of the terminal user's touch drag attribute information has been assigned the first label with a negative value (for example, '-1'), but evaluating that vector against the hyperplane yields a positive value, or vice versa, the terminal can determine that an error has occurred in user authentication.

If it is determined in step S709 that an error has occurred in user authentication, the terminal adjusts the criterion for authenticating the terminal user (i.e., the hyperplane) in order to increase the accuracy of the user authentication decision (S711). That is, the terminal can move the hyperplane in the positive or negative direction by applying a predetermined threshold to it. Specifically, when the error is that another person's touch drag input was judged to be the terminal user's, it is desirable to narrow the region judged to be the terminal user's touch drag and widen the region judged to be another person's, so a negative threshold can be applied. Conversely, when the error is that the terminal user's touch drag input was judged to be another person's, it is desirable to widen the region judged to be the terminal user's touch drag and narrow the region judged to be another person's, so a positive threshold can be applied.
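The threshold adjustment can be sketched as follows in Python (illustrative; the label convention of -1 for the terminal user and +1 for another person matches the earlier sketches, and the hyperplane values are made up):

```python
def adjusted_decision(w, b, x, threshold=0.0):
    """Decision with a shifted boundary: classify as the terminal user (-1)
    only if W.X + b < threshold. With the user on the negative side, a
    negative threshold narrows the region judged to be the user, and a
    positive threshold widens it."""
    score = sum(wj * xj for wj, xj in zip(w, x)) + b
    return -1 if score < threshold else 1

w, b = [1.0, 1.0], -4.0  # toy hyperplane x1 + x2 - 4 = 0
x = [1.8, 2.0]           # score = -0.2, i.e. just on the user's side
print(adjusted_decision(w, b, x))                   # -1: accepted as the user
print(adjusted_decision(w, b, x, threshold=-0.5))   # 1: rejected after narrowing
```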

In step S713, the terminal determines, based on the criterion (i.e., hyperplane) adjusted in step S711, whether a touch drag entered into the terminal is a touch drag input of the terminal user or of another person, and thereby authenticates the terminal user.

On the other hand, if it is determined in step S709 that no error has occurred in user authentication, the terminal determines, based on the criterion (i.e., hyperplane) determined in step S707, whether a touch drag entered into the terminal is a touch drag input of the terminal user or of another person, and authenticates the terminal user (S713).

FIG. 8 is a diagram illustrating a user authentication method using a touch drag according to an embodiment of the present invention.

Referring to FIG. 8, when the terminal receives a touch drag from the terminal user (S801), the terminal extracts the touch drag attribute information of the terminal user from the input touch drag (S803).

Then, the terminal stores the extracted touch drag attribute information of the terminal user (S805). The touch drag attribute information can be extracted and accumulated in the terminal every time the terminal user makes a touch drag input. That is, steps S801 to S805 may be repeated each time a touch drag input of the terminal user is entered into the terminal.

The terminal performs machine learning based on the stored touch drag attribute information of the terminal user and previously stored touch drag attribute information of other people, and thereby determines a criterion for distinguishing the terminal user's touch drag input from other people's touch drag input (i.e., a criterion for authenticating the terminal user) (S807). Here, the terminal may determine the criterion anew each time the touch drag attribute information of the terminal user and/or of the other people is updated, or may newly determine it periodically.

Then, the terminal determines whether the accuracy of user authentication exceeds (or is equal to or greater than) a preset threshold (S809). As described above, when a vector of the terminal user's touch drag attribute information was assigned the first label with a negative value (for example, '-1') but evaluating it against the hyperplane yields a positive value, or vice versa, an error has occurred in user authentication. The terminal can calculate the accuracy of user authentication from the proportion of such errors among the terminal user's and/or other people's touch drag inputs used as the basic information for determining the criterion for user authentication (i.e., the hyperplane).

If the accuracy of user authentication is determined in step S809 to exceed (or be equal to or greater than) the preset threshold, the terminal authenticates the terminal user by determining, based on the criterion (i.e., hyperplane) determined in step S807, whether a touch drag entered into the terminal is a touch drag input of the terminal user or of another person (S811).

On the other hand, if the accuracy of user authentication is determined in step S809 to be less than (or equal to or less than) the preset threshold, the terminal repeatedly determines, k times (where k is, for example, an integer equal to or greater than 2), whether the touch drags entered into the terminal are touch drag inputs of the terminal user or of another person (S813), and authenticates the terminal user through the results of these repeated determinations (S815). In step S813, the terminal may request touch drags y times (for example, by displaying a message on the screen of the terminal asking the user to enter a touch drag) and determine that the person is the terminal user if at least k of the y inputs are judged to be the terminal user's touch drag. Alternatively, the terminal may determine that the person is the terminal user if the touch drag input is judged to be the terminal user's k times consecutively in step S813. By thus repeatedly checking whether the touch drag inputs are the terminal user's, the accuracy of user authentication can be improved.

Here, the value of k may be set adaptively, in inverse proportion to the accuracy of user authentication determined in step S809. For example, k may be set to 5 if the accuracy of user authentication is less than 99%, and to 3 if the accuracy is 99% or more and less than 99.5%.
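A sketch of the repeat-based decision in Python (the k schedule and the consecutive-streak rule are one possible reading of steps S813 to S815, and the accuracy cut-offs are hypothetical):

```python
def authenticate(decisions, accuracy):
    """Repeat-based authentication sketch: choose k from the model accuracy
    (lower accuracy -> more repetitions, a hypothetical schedule), then
    accept only if k consecutive drags are judged to be the user (-1)."""
    if accuracy >= 0.995:
        k = 1  # confident model: a single drag suffices
    elif accuracy >= 0.99:
        k = 3
    else:
        k = 5
    run = 0
    for d in decisions:  # d: -1 judged the user, +1 judged another person
        run = run + 1 if d == -1 else 0
        if run >= k:
            return True
    return False

print(authenticate([-1, -1, -1], 0.992))      # True: 3 consecutive, k = 3
print(authenticate([-1, 1, -1, -1], 0.992))   # False: the streak is broken
```

The same structure, with the acceptance condition inverted, could drive the lock-on-suspicion flow of FIG. 10.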

The terminal can lock or unlock itself through the user authentication method using a touch drag input according to the present invention described above. This will be described with reference to FIGS. 9 and 10.

FIG. 9 is a diagram for explaining a user authentication method using a touch drag according to an embodiment of the present invention.

Referring to FIG. 9, when the terminal receives, on its lock screen, a touch drag input from the user for releasing the lock screen state (S901), the terminal extracts the touch drag attribute information from the input touch drag (S903). At this time, the terminal may designate the entire area of the touch screen, or a predetermined area on it, to guide the user to make the unlocking touch drag. For example, a touch drag input area may be displayed in a straight-line or arc form on the touch screen of the terminal so as to induce the user to make the touch drag input within that area.

Then, based on the predetermined user authentication criterion, the terminal determines from the touch drag attribute information extracted in step S903 whether the input touch drag is a touch drag input of the terminal user or of another person (S905).

If it is determined in step S905 that the input is the terminal user's touch drag, the terminal releases its locked state (S907). FIG. 9 exemplifies releasing the terminal's locked state through a touch drag input alone; however, as described above, another user authentication method such as a password method or a touch pattern method may be used together with the touch drag input for unlocking the terminal.

On the other hand, if it is determined in step S905 that the input is not the terminal user's touch drag, the terminal maintains its locked state. When the terminal branches back to step S901 and receives a touch drag input from the user again, it may perform steps S903 to S907 again.

FIG. 10 is a diagram for explaining a user authentication method using a touch drag according to an embodiment of the present invention.

Referring to FIG. 10, when the terminal receives a touch drag input from a user on a screen other than its lock screen (S1001), the terminal extracts the touch drag attribute information from the input touch drag (S1003).

Then, based on the predetermined user authentication criterion, the terminal determines from the touch drag attribute information extracted in step S1003 whether the input touch drag is a touch drag input of the terminal user or of another person (S1005).

If it is determined in step S1005 that the input is not the terminal user's touch drag, the terminal puts itself into the locked state (S1007). At this time, as in the example of FIG. 8, the terminal may repeatedly determine (according to the k value) whether the user's touch drags are the terminal user's inputs or another person's. The terminal may be locked if the input is judged to be another person's k times consecutively, or if at least k out of y inputs (y > k) are judged to be another person's. By thus repeatedly checking whether the touch drag inputs are the terminal user's, the accuracy of user authentication can be improved.

On the other hand, if it is determined in step S1005 that the input is the terminal user's touch drag, the terminal maintains its current screen (e.g., the home screen, an application execution screen, etc.). Then, if the terminal branches back to step S1001 and receives a touch drag input from the user again, it can perform steps S1003 to S1007 again.

FIG. 11 is a diagram illustrating experimental data comparing attribute information of the touch drag inputs of a terminal user and of another person according to an embodiment of the present invention.

In FIG. 11, (a) shows the attribute values of touch drag inputs made during the terminal user's natural motion: the terminal user's touch drags lie in the range of about 450 to about 700 on the x axis and about 500 to about 1100 on the y axis. FIG. 11(b) shows the attribute values of touch drag inputs in another person's ordinary operation: the other person's touch drags lie in the range of about 500 to about 650 on the x axis, and in a range reaching about 550 on the y axis.

As described above, the touch drag attribute information used for user authentication is not limited to two types; all of the 25 types of touch drag attribute information exemplified in Table 1 above can be used for user authentication.

FIG. 12 is a diagram illustrating experimental data comparing, for each type of touch drag attribute information, the attribute information on the touch drag inputs of a terminal user and of other persons according to an embodiment of the present invention.

Referring to FIG. 12, the touch drag attribute values in the natural motion of the terminal user and of other persons are shown for the 25 attributes used for the touch drag learning illustrated in Table 1 above. In FIG. 12, the learning data (a) for the touch drag inputs of the terminal user are represented by "O", and the data (b) for the touch drag inputs of other persons used for user authentication are represented by "×". As described above, each user's touch drag inputs have unique attribute values for each of the 25 types of touch drag attribute information exemplified in Table 1. By using these attribute values, which are unique to each user, it can be determined which user produced a given touch drag input.
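The labeled attribute vectors above ("O" for the terminal user, "×" for others) feed the learning step, for which the claims name an SVM or a Bayesian algorithm. As an illustrative stand-in for the Bayesian option (a minimal sketch, not the patent's implementation; the function names, the two-attribute vectors, and the data are hypothetical), a Gaussian naive Bayes classifier can be built from the per-class mean and variance of each attribute:

```python
import math
from statistics import mean, variance

def train_gnb(owner_vectors, other_vectors):
    """Fit per-class Gaussian parameters (mean, variance) for each
    touch-drag attribute; classes are 'owner' and 'other'."""
    def fit(vectors):
        cols = list(zip(*vectors))  # one column per attribute
        return [(mean(c), variance(c) + 1e-9) for c in cols]
    return {"owner": fit(owner_vectors), "other": fit(other_vectors)}

def classify(model, x):
    """Return the class whose Gaussian log-likelihood for x is higher."""
    def loglik(params):
        return sum(-0.5 * math.log(2 * math.pi * v) - (xi - m) ** 2 / (2 * v)
                   for xi, (m, v) in zip(x, params))
    return max(model, key=lambda cls: loglik(model[cls]))
```

A drag whose attribute vector falls inside the owner's cluster in FIG. 12 would be classified as "owner", and one in the "×" cluster as "other"; an SVM would instead learn a separating hyperplane between the two labeled sets.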

As described above, according to the present invention, attribute values unique to each user can be extracted from the touch drag inputs made during the user's everyday operation of the terminal, and a newly input touch drag can be discriminated as the terminal user's or another person's based on its attribute values.
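As an illustration of the attribute extraction described above (a hypothetical sketch, not the patent's implementation; the function name and the raw sample format are assumptions), a few of the Table 1 attributes — the start and end coordinates, the drag length, the x-velocity statistics, and the variance of the perpendicular distances to the start–end line — can be computed from raw (x, y, t) touch samples:

```python
import math
from statistics import mean, pvariance

def drag_features(points):
    """Compute a subset of touch-drag attribute values from a list of
    (x, y, t) touch samples along one drag."""
    (x0, y0, _), (x1, y1, _) = points[0], points[-1]
    # path length: sum of distances between consecutive samples
    length = sum(math.hypot(bx - ax, by - ay)
                 for (ax, ay, _), (bx, by, _) in zip(points, points[1:]))
    # per-segment velocity along the x coordinate
    vx = [(bx - ax) / (bt - at)
          for (ax, _, at), (bx, _, bt) in zip(points, points[1:])]
    # perpendicular distance of each sample to the start-end chord
    chord = math.hypot(x1 - x0, y1 - y0)
    perp = [abs((x1 - x0) * (y0 - py) - (x0 - px) * (y1 - y0)) / chord
            for (px, py, _) in points]
    return {
        "start": (x0, y0), "end": (x1, y1),
        "length": length,
        "vx_mean": mean(vx), "vx_var": pvariance(vx),
        "perp_var": pvariance(perp),
    }
```

A perfectly straight drag yields zero perpendicular-distance variance, while a curved drag (characteristic of a particular finger's rotation radius) yields a larger one, which is what makes such attributes discriminative between users.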

Embodiments in accordance with the present invention may be implemented by various means, for example, hardware, firmware, software, or a combination thereof. In the case of hardware implementation, an embodiment of the present invention may be implemented by one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, and the like.

In addition, in the case of an implementation by firmware or software, an embodiment of the present invention may be embodied in the form of a module, a procedure, a function, or the like that performs the functions or operations described above, and may be recorded on a computer-readable recording medium. Here, the recording medium may include program commands, data files, data structures, and the like, alone or in combination. The program instructions recorded on the recording medium may be those specially designed and constructed for the present invention or those known and available to those skilled in the art of computer software. Examples of the recording medium include magnetic media such as hard disks, floppy disks and magnetic tape; optical media such as compact disc read-only memory (CD-ROM) and digital video discs (DVD); and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM and flash memory. Examples of program instructions include machine language code such as that generated by a compiler, as well as high-level language code that may be executed by a computer using an interpreter or the like. Such hardware devices may be configured to operate as one or more software modules to perform the operations of the present invention, and vice versa.

While the present invention has been described in connection with what is presently considered to be practical exemplary embodiments, it is to be understood that the invention is not limited to the disclosed embodiments; on the contrary, various modifications will be apparent to those skilled in the art. Furthermore, although specific terms are used in this specification and the drawings, they are used in a generic sense only to facilitate the description and understanding of the invention, and are not intended to limit the scope of the invention. Accordingly, the foregoing detailed description is to be considered in all respects illustrative and not restrictive. The scope of the present invention should be determined by a rational interpretation of the appended claims, and all changes within the scope of equivalents of the present invention are included in the scope of the present invention.

In addition, a device or terminal according to the present invention may be driven by instructions that cause one or more processors to perform the functions and processes described above. Such instructions may include, for example, interpreted instructions such as script instructions (e.g., JavaScript or ECMAScript instructions), executable code, or other instructions stored on a computer-readable medium. Further, the apparatus according to the present invention may be implemented in a distributed manner across a network, such as a server farm, or may be implemented in a single computer device.

Further, a computer program (also known as a program, software, software application, script or code) that is embedded in the apparatus according to the present invention and implements the method according to the present invention may be written in any form of programming language, including compiled or interpreted languages and declarative or procedural languages, and may be deployed in any form, including as a standalone program or as a module, component, subroutine or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system. A program may be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files storing one or more modules, subprograms, or portions of code). A computer program may be deployed to run on one computer, or on multiple computers located at a single site or distributed across multiple sites and interconnected by a communications network.

Moreover, although operations are depicted in the drawings in a particular order in describing the embodiments according to the present invention, this should not be understood as requiring that such operations be performed in the particular order shown, or in sequential order, to obtain the desired results. In certain cases, multitasking and parallel processing may be advantageous. Also, the separation of the various system components in the above-described embodiments should not be understood as requiring such separation in all embodiments; it should be understood that the described program components and systems can generally be integrated together into a single software product or packaged into multiple software products.

The terminal user authentication method using the touch drag according to the present invention can be applied to various devices equipped with a touch screen.

100: terminal 110:
120: input unit 130: display unit
140: storage unit 150: sensor unit
160: control unit 161: attribute information collecting unit
163: Machine learning unit 165: User authentication unit
167: terminal lock / unlock processing unit

Claims (9)

A method for authenticating a terminal user using a touch drag attribute information of a user in a terminal,
Extracting touch drag attribute information of the terminal user from the touch drag input of the terminal user upon receiving the touch drag input of the terminal user;
Storing touch drag attribute information of the extracted terminal user;
Determining a criterion for distinguishing the touch drag input of the terminal user from the touch drag input of another person by performing machine learning based on the stored touch drag attribute information of the terminal user and previously stored touch drag attribute information of the other person; and
Authenticating the terminal user by determining, based on the determined criterion, whether a touch drag input inputted to the terminal is the touch drag input of the terminal user.
2. The method of claim 1, wherein determining the criteria comprises:
Setting the touch drag attribute information of the terminal user and the touch drag attribute information of the other person to a first label and a second label, respectively; and
Determining the criterion by performing the machine learning through a Support Vector Machine (SVM) algorithm or a Bayesian algorithm based on the set labels.
3. The method of claim 2,
And adjusting the determined criterion in a positive or negative direction by a predetermined value when an error occurs in the determination, according to the determined criterion, as to whether an input is the touch drag input of the terminal user.
3. The method of claim 2, wherein authenticating the terminal user comprises:
Authenticating the terminal user only when, in a case where the determination accuracy according to the determined criterion is lower than a preset threshold value, the touch drag input inputted to the terminal is determined to be the touch drag input of the terminal user k consecutive times.
3. The method of claim 2,
Further comprising the steps of: releasing the locked state of the terminal when a touch drag input inputted on the lock screen of the terminal is determined to be the touch drag input of the terminal user; and processing the terminal into a locked state when a touch drag input inputted on a screen other than the lock screen of the terminal is determined to be the touch drag input of another person.
3. The method of claim 2,
And receiving the touch drag attribute information of the other person from an external device.
3. The method of claim 2,
And transmitting the touch drag attribute information of the user to an external device so that the user's touch drag attribute information is used as the touch drag attribute information of the other person.
3. The method of claim 2,
The touch drag attribute information of the user and the touch drag attribute information of the other person include the rotation radius of the estimated touch finger, the x coordinate of the estimated touch finger rotation axis, the y coordinate of the estimated touch finger rotation axis, the length of the touch drag, the x coordinate of the point at which the touch drag starts, the y coordinate of the point at which the touch drag starts, the x coordinate of the point at which the touch drag ends, the y coordinate of the point at which the touch drag ends, the x coordinate of the gravity sensor at the start of the touch drag, the y coordinate of the gravity sensor at the start of the touch drag, the z coordinate of the gravity sensor at the start of the touch drag, the average of the major axis of the touch ellipse plane, the variance of the major axis of the touch ellipse plane, the average of the minor axis of the touch ellipse plane, the variance of the minor axis of the touch ellipse plane, the average of the major axis of the elliptical plane in which the touch is actually recognized at the terminal, the variance of the major axis of that elliptical plane, the average of the velocity with respect to the x coordinate of the touch drag, the variance of the velocity with respect to the x coordinate of the touch drag, the average of the velocity with respect to the y coordinate of the touch drag, the variance of the velocity with respect to the y coordinate of the touch drag, the average of the lengths of the perpendiculars from each point of the touch drag to the line connecting the start point and the end point of the touch drag, and the variance of the lengths of those perpendiculars.
A terminal for authenticating a terminal user using touch drag attribute information of a user,
Touch screen for touch drag input;
A storage unit for storing the touch drag attribute information of the terminal user and the touch drag attribute information of the other user;
An attribute information collecting unit for extracting touch drag attribute information of the terminal user from the touch drag input of the terminal user when the touch drag input of the terminal user is received through the touch screen and storing the extracted attribute information in the storage unit;
A machine learning unit for determining a criterion for distinguishing the touch drag input of the terminal user from the touch drag input of another person by performing machine learning based on the touch drag attribute information of the terminal user and the touch drag attribute information of the other person; and
And a user authentication unit for authenticating the terminal user by determining whether the touch drag input inputted to the touch screen is a touch drag input of the terminal user based on the determined criteria.
KR1020140012748A 2014-02-04 2014-02-04 Method for user Authentication through touch dragging, device therefor KR20150092441A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020140012748A KR20150092441A (en) 2014-02-04 2014-02-04 Method for user Authentication through touch dragging, device therefor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020140012748A KR20150092441A (en) 2014-02-04 2014-02-04 Method for user Authentication through touch dragging, device therefor

Publications (1)

Publication Number Publication Date
KR20150092441A true KR20150092441A (en) 2015-08-13

Family

ID=54056718

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020140012748A KR20150092441A (en) 2014-02-04 2014-02-04 Method for user Authentication through touch dragging, device therefor

Country Status (1)

Country Link
KR (1) KR20150092441A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20190013307A (en) * 2017-08-01 2019-02-11 연세대학교 산학협력단 Method and Apparatus for Inputting Pattern to Prevent Shoulder Surfing
KR20190026492A (en) * 2017-09-05 2019-03-13 세종대학교산학협력단 Method for certification and apparatus for executing the method
KR101980483B1 (en) * 2018-03-05 2019-05-20 인하대학교 산학협력단 Pin input method and system based on user behavior recognition using machine learning
KR20190129672A (en) * 2018-05-10 2019-11-20 세종대학교산학협력단 Neural network based pattern authentication method and device
WO2023043118A1 (en) * 2021-09-16 2023-03-23 삼성전자 주식회사 Electronic device and touch recognition method of electronic device
US11899884B2 (en) 2021-09-16 2024-02-13 Samsung Electronics Co., Ltd. Electronic device and method of recognizing a force touch, by electronic device


Similar Documents

Publication Publication Date Title
KR20150092441A (en) Method for user Authentication through touch dragging, device therefor
US9706406B1 (en) Security measures for an electronic device
KR102133534B1 (en) Method and Apparatus for User Authentication
EP3482331B1 (en) Obscuring data when gathering behavioral data
US10127370B2 (en) Computing device chording authentication and control
KR20140027606A (en) Comtrol method for terminal using text recognition and terminal thereof
CN109240554B (en) Method and system for detecting the presence of a finger in the vicinity of a touchless screen
JP6039822B2 (en) Electrostatic Touch Authentication Method (Method for Authenticating Capacitive Touch)
KR20200009916A (en) Electronic device and method for controlling the same
CN105723374A (en) Secure remote modification of device credentials using device-generated credentials
CN105447350B (en) A kind of identity identifying method and device
KR101958878B1 (en) Method for security unlocking of terminal and terminal thereof
KR101228336B1 (en) Personalization Service Providing Method by Using Mobile Terminal User&#39;s Activity Pattern and Mobile Terminal therefor
US9785863B2 (en) Fingerprint authentication
JP2020098638A (en) Trigger regions
WO2012152995A1 (en) Method and apparatus for navigation-based authentication
JP2016081071A (en) Biometric authentication device, and method and program for biometric authentication
KR20210130856A (en) Electronic device and its control method
US20200167553A1 (en) Method, system and apparatus for gesture recognition
KR20130015978A (en) Apparatus for detecting lane and method thereof
Nohara et al. Personal identification by flick input using self-organizing maps with acceleration sensor and gyroscope
US9158380B2 (en) Identifying a 3-D motion on 2-D planes
KR101428909B1 (en) System and method for interaction between pen-user terminal using magnetic field
KR20190130546A (en) Comtrol method for electronic device using text recognition and electronic device thereof
KR20150019125A (en) Authentication apparatus based onfg motion

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E601 Decision to refuse application
N231 Notification of change of applicant
AMND Amendment