CN105207979A - User input feature-based user authentication method and device - Google Patents


Info

Publication number: CN105207979A (application CN201410295075.7A; granted as CN105207979B)
Authority: CN (China)
Prior art keywords: user, input feature, input, feature vector, input position
Legal status: Granted (active)
Application number: CN201410295075.7A
Other languages: Chinese (zh)
Other versions: CN105207979B (en)
Inventors: 胡育辉, 张龙攀, 金朝林, 王宇飞
Current assignee: Tencent Technology Shenzhen Co Ltd; Tencent Cloud Computing Beijing Co Ltd
Original assignee: Tencent Technology Shenzhen Co Ltd
Application filed by Tencent Technology Shenzhen Co Ltd
Priority: CN201410295075.7A
Publications: CN105207979A (application); CN105207979B (grant)


Abstract

The invention provides a user authentication method and device based on user input features. The method comprises the following steps: a user's original input features are collected in advance, the collected original input features are associated one by one with input positions of a display interface, and they are then stored, wherein the display interface comprises at least two input positions and the original input features comprise touch data of a human body part; when user verification is performed for a subsequent operation, the user's current input features at the input positions are collected; the current input features entered at the different input positions are compared with the pre-stored original input features of the corresponding input positions; and if the collected current input features are consistent with the pre-stored original input features, the user's operation is responded to. With the user authentication method and device based on user input features of the invention, a machine can be effectively prevented from performing account-related operations, the user's risk in online transactions is reduced, property is protected, and private information is kept confidential.

Description

User authentication method and device based on user input features
Technical field
The present invention belongs to the field of Internet security technology, and in particular relates to a user authentication method and device based on user input features.
Background
With the rapid development of the Internet, an open Internet has become an inevitable trend, and the interconnection of accounts has become one of its most important aspects. For example, a user can log in to a third-party website with accounts such as Facebook, Microsoft Service Network (MSN) or QQ to meet the user's needs.
When users currently log in to third-party websites, the value and role of user accounts become ever more important, but the risk of network accounts being stolen through password leaks, phishing, social engineering and similar situations also keeps rising. In particular, a machine, for example a phishing-website server, can use an obtained account and password to perform illegal operations. That is, when a user performs a network operation involving a user account, the operation may in fact be controlled by a machine rather than initiated by the user; if it is a machine-controlled operation, it brings risk to the user's online transactions and may cause loss of property or disclosure of private information.
Therefore, how to distinguish whether a network operation is a manual operation or a machine operation, so as to prevent a user's private and property-related account information from being stolen by others and to strengthen the security protection of user accounts, is a technical problem to be solved in the security field.
Summary of the invention
The object of the present invention is to provide a user authentication method and device based on user input features, intended to solve the technical problem in the prior art that, when a network operation is the behavior of a machine rather than of the user, risk is brought to the user's online transactions, causing loss of property or disclosure of private information.
To solve the above technical problem, embodiments of the present invention provide the following technical solution:
A user authentication method based on user input features, the method comprising:
collecting a user's original input features in advance, associating the collected original input features one by one with input positions of a display interface, and storing them, wherein the display interface comprises at least two input positions, each input position corresponds to a different original input feature, and the user's original input features comprise touch data of a human body part;
when performing user verification for a subsequent operation, collecting the current input features entered by the user at the different input positions;
comparing the current input features entered by the user at the different input positions with the pre-stored original input features of the corresponding input positions; and
responding to the user's operation if the collected current input features are consistent with the pre-stored original input features.
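For illustration only, the following Kotlin sketch shows one way the claimed steps could be organised on a client: original features are stored against input positions, and the operation is responded to only when every current feature matches. The position identifiers, the TouchManner categories and all other names are assumptions, not part of the claim.

```kotlin
// Hypothetical sketch of the claimed method; names and data model are assumed.
enum class TouchManner { FINGER_PAD, KNUCKLE, NAIL }   // touch data of a human body part

data class InputFeature(val manner: TouchManner)

class InputFeatureVerifier {
    // Original input features, stored one by one against input positions (at least two).
    private val originalFeatures = mutableMapOf<Int, InputFeature>()

    // Collect the user's original input features in advance and associate them with positions.
    fun enroll(position: Int, feature: InputFeature) {
        originalFeatures[position] = feature
    }

    // Compare the current features entered at the different positions with the stored originals;
    // the user's operation is responded to only when every position is consistent.
    fun verify(currentFeatures: Map<Int, InputFeature>): Boolean =
        originalFeatures.size >= 2 &&
            originalFeatures.all { (pos, original) -> currentFeatures[pos] == original }
}

fun main() {
    val verifier = InputFeatureVerifier()
    verifier.enroll(0, InputFeature(TouchManner.FINGER_PAD))
    verifier.enroll(1, InputFeature(TouchManner.KNUCKLE))

    val current = mapOf(0 to InputFeature(TouchManner.FINGER_PAD),
                        1 to InputFeature(TouchManner.KNUCKLE))
    println(if (verifier.verify(current)) "respond to the user's operation" else "do not respond")
}
```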
To solve the above technical problem, embodiments of the present invention further provide the following technical solution:
A user authentication device based on user input features, the device comprising:
a presetting module, configured to collect a user's original input features in advance, associate the collected original input features one by one with input positions of a display interface, and store them, wherein the display interface comprises at least two input positions, each input position corresponds to a different original input feature, and the user's original input features comprise touch data of a human body part;
an acquisition module, configured to collect, when user verification is performed for a subsequent operation, the current input features entered by the user at the different input positions;
a comparison module, configured to compare the current input features entered by the user at the different input positions with the pre-stored original input features of the corresponding input positions; and
a response module, configured to respond to the user's operation if the collected current input features are consistent with the pre-stored original input features.
Compared with the prior art, this embodiment sets at least two input positions on the display interface in advance, associates the collected original input features of the user one by one with the input positions of the display interface, and stores them, where the user's original input features include touch data of a human body part. When user verification is performed for a subsequent operation, the current input features entered by the user at the input positions are collected and compared with the original input features; when the collected current input features are consistent with the pre-stored original input features, the user's operation is responded to. By using touch data of a human body part as the user's input feature, the present invention determines whether a network operation is the behavior of the user or of a machine, and authorizes the operation only if it is a manual operation. This effectively prevents a machine from performing account-related operations, reduces the user's risk in online transactions, protects property, and keeps private information confidential.
Brief description of the drawings
Fig. 1 is a schematic structural diagram of the operating environment of the server in which a user authentication device based on user input features according to an embodiment of the present invention is located;
Fig. 2 is a schematic flowchart of a user authentication method based on user input features according to the first embodiment of the present invention;
Fig. 3 is a schematic flowchart of a user authentication method based on user input features according to the second embodiment of the present invention;
Fig. 4 is a schematic flowchart of a user authentication method based on user input features according to the third embodiment of the present invention;
Fig. 5 is a schematic flowchart of a user authentication method based on user input features according to the fourth embodiment of the present invention;
Fig. 6 is a schematic structural diagram of a user authentication device based on user input features according to an embodiment of the present invention;
Fig. 7 is another schematic structural diagram of a user authentication device based on user input features according to an embodiment of the present invention.
Detailed description of the embodiments
To make the objects, technical solutions and beneficial effects of the present invention clearer, the present invention is further described below with reference to the drawings and embodiments. It should be understood that the specific embodiments described herein are only used to explain the present invention and are not intended to limit it.
In the following description, specific embodiments of the invention are described with reference to steps and symbols of operations performed by one or more computers, unless indicated otherwise. These steps and operations, several of which are referred to as being computer-executed, include the manipulation by a computer processing unit of electrical signals representing data in a structured form. This manipulation transforms the data or maintains them at locations in the computer's memory system, which reconfigures or otherwise alters the operation of the computer in a manner well understood by those skilled in the art. The data structures in which the data are maintained are physical locations of the memory that have particular properties defined by the format of the data. However, while the principles of the invention are described in the foregoing terms, this is not meant to be limiting, and those skilled in the art will appreciate that the steps and operations described below may also be implemented in hardware.
As used herein, the terms "component", "module", "system", "interface", "process" and the like are generally intended to refer to a computer-related entity: hardware, a combination of hardware and software, software, or software in execution. For example, a component may be, but is not limited to, a process running on a processor, a processor, an object, an executable, a thread of execution, a program and/or a computer. By way of illustration, both an application running on a controller and the controller itself can be a component. One or more components may reside within a process and/or thread of execution, and a component may be localized on one computer and/or distributed between two or more computers.
Furthermore, the claimed subject matter may be implemented as a method, device or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware or any combination thereof to control a computer to implement the disclosed subject matter. The term "article of manufacture" as used herein is intended to encompass a computer program accessible from any computer-readable device, carrier or medium. Of course, those skilled in the art will recognize that many modifications may be made to this configuration without departing from the scope or spirit of the claimed subject matter.
Fig. 1 and the following discussion provide a brief, general description of the operating environment of the server in which the user authentication device based on user input features of the present invention is implemented. The operating environment of Fig. 1 is only one example of a suitable operating environment and is not intended to suggest any limitation as to the scope of use or functionality of the operating environment. Examples of the server 112 include, but are not limited to, personal computers, server computers, hand-held or laptop devices, mobile devices (such as mobile phones, personal digital assistants (PDAs), media players and the like), multiprocessor systems, consumer electronics, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
Although not required, embodiments are described in the general context of "computer-readable instructions" being executed by one or more servers. Computer-readable instructions may be distributed via computer-readable media (discussed below). Computer-readable instructions may be implemented as program modules, such as functions, objects, application programming interfaces (APIs), data structures and the like, that perform particular tasks or implement particular abstract data types. Typically, the functionality of the computer-readable instructions may be combined or distributed as desired in various environments.
Fig. 1 illustrates an example of a server 112 comprising one or more embodiments of the user authentication device based on user input features of the present invention. In one configuration, the server 112 comprises at least one processing unit 116 and a memory 118. Depending on the exact configuration and type of server, the memory 118 may be volatile (such as RAM), non-volatile (such as ROM, flash memory, etc.) or some combination of the two. This configuration is illustrated in Fig. 1 by dashed line 114.
In other embodiments, the server 112 may include additional features and/or functionality. For example, the server 112 may also include additional storage (e.g. removable and/or non-removable), including, but not limited to, magnetic storage, optical storage and the like. Such additional storage is illustrated in Fig. 1 by storage 120. In one embodiment, computer-readable instructions for implementing one or more embodiments provided herein may reside in the storage 120. The storage 120 may also store other computer-readable instructions for implementing an operating system, an application program and the like. Computer-readable instructions may be loaded into the memory 118 and executed by, for example, the processing unit 116.
The term "computer-readable media" as used herein includes computer storage media. Computer storage media includes volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions or other data. The memory 118 and the storage 120 are examples of computer storage media. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile discs (DVDs) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and that can be accessed by the server 112. Any such computer storage media may be part of the server 112.
The server 112 may also include a communication connection 126 that allows the server 112 to communicate with other devices. The communication connection 126 may include, but is not limited to, a modem, a network interface card (NIC), an integrated network interface, a radio-frequency transmitter/receiver, an infrared port, a USB connection or other interfaces for connecting the server 112 to other servers. The communication connection 126 may include a wired connection or a wireless connection. The communication connection 126 may transmit and/or receive communication media.
The term "computer-readable media" may include communication media. Communication media typically embodies computer-readable instructions or other data in a "modulated data signal" such as a carrier wave or other transport mechanism, and includes any information delivery media. The term "modulated data signal" may include a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
The server 112 may include an input device 124, such as a keyboard, mouse, pen, voice input device, touch input device, infrared camera, video input device and/or any other input device. An output device 122, such as one or more displays, speakers, printers and/or any other output device, may also be included in the server 112. The input device 124 and the output device 122 may be connected to the server 112 via a wired connection, a wireless connection or any combination thereof. In one embodiment, an input device or output device from another server may be used as the input device 124 or the output device 122 of the server 112.
Components of the server 112 may be connected by various interconnects, such as a bus. Such interconnects may include Peripheral Component Interconnect (PCI), such as PCI Express, Universal Serial Bus (USB), FireWire (IEEE 1394), an optical bus structure and the like. In another embodiment, components of the server 112 may be interconnected by a network. For example, the memory 118 may be made up of multiple physical memory units located in different physical locations and interconnected by a network.
Those skilled in the art will recognize that storage devices used to store computer-readable instructions may be distributed across a network. For example, a computing device 130 accessible via a network 128 may store computer-readable instructions for implementing one or more embodiments provided by the present invention. The server 112 may access the computing device 130 and download part or all of the computer-readable instructions for execution. Alternatively, the server 112 may download pieces of the computer-readable instructions as needed, or some instructions may be executed at the server 112 and some at the computing device 130.
Various operations of embodiments are provided herein. In one embodiment, one or more of the described operations may constitute computer-readable instructions stored on one or more computer-readable media which, when executed by a server, cause the computing device to perform the described operations. The order in which some or all of the operations are described should not be construed as implying that these operations are necessarily order-dependent. Alternative orderings will be appreciated by those skilled in the art having the benefit of this description. Furthermore, it should be understood that not all operations are necessarily present in each embodiment provided herein.
Moreover, the word "preferably" is used herein to mean serving as an example, instance or illustration. Any aspect or design described herein as "preferable" is not necessarily to be construed as more advantageous than other aspects or designs. Rather, use of the word "preferably" is intended to present concepts in a concrete fashion. As used in this application, the term "or" is intended to mean an inclusive "or" rather than an exclusive "or". That is, unless specified otherwise or clear from context, "X uses A or B" is intended to mean any of the natural inclusive permutations; that is, if X uses A, X uses B, or X uses both A and B, then "X uses A or B" is satisfied under any of the foregoing instances.
Also, although the disclosure has been shown and described with respect to one or more implementations, equivalent alterations and modifications will occur to others skilled in the art based upon a reading and understanding of this specification and the drawings. The disclosure includes all such modifications and alterations and is limited only by the scope of the appended claims. In particular regard to the various functions performed by the above-described components (such as elements, resources, etc.), the terms used to describe such components are intended to correspond, unless otherwise indicated, to any component that performs the specified function of the described component (e.g. that is functionally equivalent), even though not structurally equivalent to the disclosed structure that performs the function in the exemplary implementations illustrated herein. In addition, although a particular feature of the disclosure may have been disclosed with respect to only one of several implementations, such a feature may be combined with one or more other features of the other implementations as may be desired and advantageous for any given or particular application. Furthermore, to the extent that the terms "includes", "having", "contains" or variants thereof are used in either the detailed description or the claims, such terms are intended to be inclusive in a manner similar to the term "comprising".
Referring to Fig. 2, Fig. 2 is a schematic flowchart of a user authentication method based on user input features according to the first embodiment of the present invention.
In step S201, the user's original input features are collected in advance, and the collected original input features are associated one by one with input positions of a display interface and then stored.
The display interface comprises at least two input positions, each input position corresponds to a different original input feature, and the user's original input features comprise touch data of a human body part.
In the embodiment of the present invention, the entity executing the user authentication method based on user input features is a client. The client includes, but is not limited to, the following devices: a mobile terminal with a touch screen, a computer with a touch screen, a personal handheld computer with a touch screen, and the like. It should be understood that any electronic device with a touch screen falls within the protection scope of the present invention.
Further, the correspondence between the user's original input features and the input positions of the client's display interface is established in the client in advance. The display interface comprises at least two input positions, each input position corresponds to an original input feature, and the user's original input features comprise touch data of a human body part, where the touch data of a human body part is the user's touch data collected by the client when the user touches the display screen at the input position.
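As a rough sketch of what this stored correspondence might look like in the client, the touch data collected at each input position could be recorded as one small sample per position; the field names, units and scales below are assumptions, since the text does not fix a concrete representation.

```kotlin
// Hypothetical representation of the touch data collected at one input position.
data class TouchSample(
    val contactAreaMm2: Double,   // contact area of the touch (assumed unit: mm^2)
    val pressure: Double,         // touch pressure, normalised 0.0..1.0 (assumed scale)
    val soundAmplitude: Double    // amplitude of the sound produced by the touch (assumed scale)
)

// The pre-established correspondence: one original sample per input position of the display interface.
class OriginalFeatureStore {
    private val samplesByPosition = mutableMapOf<Int, TouchSample>()

    fun store(position: Int, sample: TouchSample) { samplesByPosition[position] = sample }
    fun originalFor(position: Int): TouchSample? = samplesByPosition[position]
    fun positions(): Set<Int> = samplesByPosition.keys
}
```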
In step S202, when user verification is performed for a subsequent operation, the current input features entered by the user at the input positions are collected.
In step S203, the current input features entered by the user at the different input positions are compared with the pre-stored original input features of the corresponding input positions.
Steps S202 and S203 may specifically be as follows:
When the client user performs a network operation, user verification is required first, that is, it must be distinguished whether the network operation is a machine operation or a manual operation. If the user verification passes, that is, if the current network operation is determined to be a manual operation, the user's network operation is responded to, so as to ensure the security of the user's account.
In this embodiment, the client collects the current input features entered by the user at the input positions and compares the current input features entered at the different input positions with the pre-stored original input features of the corresponding input positions, so as to determine, from the current input features entered at the input positions and the pre-stored association between original input features and input positions, whether the network operation is a machine operation or a manual operation.
In step S204, if the collected current input features are consistent with the pre-stored original input features, the user's operation is responded to.
If the current input features are consistent with the pre-stored original input features, the user verification can be considered passed, that is, the network operation is judged to be a manual operation, and the user's operation is then responded to.
As can be seen from the above, this embodiment sets at least two input positions on the display interface in advance, associates the collected original input features of the user one by one with the input positions of the display interface, and stores them, where the user's original input features include touch data of a human body part. When user verification is performed for a subsequent operation, the current input features entered by the user at the input positions are collected and compared with the original input features; when the collected current input features are consistent with the pre-stored original input features, the user's operation is responded to. By using touch data of a human body part as the user's input feature, the present invention determines whether a network operation is the behavior of the user or of a machine, and authorizes the operation only if it is a manual operation. This effectively prevents a machine from performing account-related operations, reduces the user's risk in online transactions, protects property, and keeps private information confidential.
Referring to Fig. 3, Fig. 3 is a schematic flowchart of a user authentication method based on user input features according to the second embodiment of the present invention.
In step S301, the user's original input features are collected in advance, and the collected original input features are associated one by one with input positions of a display interface and then stored.
The display interface comprises at least two input positions, each input position corresponds to a different original input feature, and the user's original input features comprise touch data of a human body part.
In the embodiment of the present invention, the entity executing the user authentication method based on user input features is a client. The client includes, but is not limited to, the following devices: a mobile terminal with a touch screen, a computer with a touch screen, a personal handheld computer with a touch screen, and the like. It should be understood that any electronic device with a touch screen falls within the protection scope of the present invention.
Preferably, the touch data of a human body part is the user's touch data collected by the client when the user touches the display screen at the input position.
Further, in the embodiment of the present invention, the input features may include touch data based on the finger pad, the knuckle and the nail. That is, the touch data of a human body part is specifically touch data of the user's finger pad, knuckle or nail.
Further, the touch data based on the finger pad, knuckle or nail may include any combination of one or more of: the contact area of the touch at the input position, the pressure of the touch at the input position, and the sound produced by the touch at the input position. That is, from any combination of one or more of these touch data it can be determined whether the current input is made with the finger pad, the knuckle or the nail; no specific limitation is placed on the implementation herein.
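One simple reading of this paragraph is that a measured contact area is mapped to a touch manner through preset ranges. The sketch below does exactly that; the threshold values are invented for illustration, and a real client would need to calibrate them per device and per user.

```kotlin
// Hypothetical single-signal classifier: contact area range -> touch manner.
// The ranges are assumptions; the patent leaves the concrete values open.
enum class TouchManner { FINGER_PAD, KNUCKLE, NAIL }

fun mannerFromContactArea(contactAreaMm2: Double): TouchManner = when {
    contactAreaMm2 < 20.0 -> TouchManner.NAIL        // a nail tends to leave the smallest contact
    contactAreaMm2 < 60.0 -> TouchManner.KNUCKLE     // a knuckle an intermediate contact
    else                  -> TouchManner.FINGER_PAD  // the finger pad the largest contact
}
```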
In step S302, when user verification is performed for a subsequent operation, the user is prompted to enter input in a different input manner at each of the different input positions.
It should be understood that each input position corresponds to an original input feature; the different input manners mean that the user inputs at the different input positions according to the original input feature corresponding to each input position.
For example, in the pre-stored correspondence between input positions and original input features, input position A corresponds to touch data based on the finger pad; the client then prompts the user, at input position A of its touch-screen display interface, to enter the touch input with the finger pad.
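A small sketch of how such prompting could be generated from the stored correspondence; the prompt wording and the position identifiers are assumptions.

```kotlin
// Hypothetical prompt generation: each input position is announced together with the
// input manner recorded for it in the pre-stored correspondence.
enum class TouchManner { FINGER_PAD, KNUCKLE, NAIL }

fun promptFor(position: Int, manner: TouchManner): String {
    val how = when (manner) {
        TouchManner.FINGER_PAD -> "the pad of your finger"
        TouchManner.KNUCKLE    -> "a knuckle"
        TouchManner.NAIL       -> "a fingernail"
    }
    return "At input position $position, please touch the screen with $how."
}

fun main() {
    val stored = mapOf(0 to TouchManner.FINGER_PAD, 1 to TouchManner.KNUCKLE)
    stored.forEach { (position, manner) -> println(promptFor(position, manner)) }
}
```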
In step S303, the current input features entered by the user at the input positions are collected.
In step S304, the current input features entered by the user at the different input positions are compared with the pre-stored original input features of the corresponding input positions.
Steps S303 and S304 may specifically be as follows:
Preferably, the touch data based on the finger pad, knuckle or nail entered by the user at each input position is collected.
After the touch data is collected, the current input features entered by the user at the different input positions are compared one by one with the original input features of the corresponding input positions, so as to determine, from the current input features entered at the input positions and the pre-stored association between original input features and input positions, whether the network operation is a machine operation or a manual operation.
For example, in the pre-stored correspondence between input positions and original input features, input position A corresponds to touch data based on the finger pad; the client then collects the current input feature entered by the user at input position A and compares it with the preset finger-pad touch data.
In step S305, if the collected current input features are consistent with the pre-stored original input features, the user's operation is responded to.
In step S306, the comparison result is displayed on the display interface, and the user is prompted that the verification has passed.
Steps S305 and S306 may specifically be as follows:
It should be understood that, in this embodiment, the input features include touch data of a human body part such as the finger pad, knuckle or nail, and the touch data based on the finger pad, knuckle or nail may include any combination of one or more of: the contact area of the touch at the input position, the pressure of the touch at the input position, the sound produced by the touch at the input position, and other touch data.
That is, the client can determine, from any combination of one or more of these touch data, whether the current input is made with the finger pad, the knuckle, the nail or another touch manner.
For example, in the pre-stored correspondence between input positions and original input features, input position A corresponds to touch data based on the finger pad; the client collects the current input feature entered by the user at input position A and compares it with the preset finger-pad touch data:
If data on the contact area of the touch at the input position is collected at input position A, then from the collected data and the preset correspondence between contact area and touch manner it can be judged by which input manner the current input feature was entered; if the current input feature is judged to be touch data based on the finger pad, the current input feature collected at input position A can be considered consistent with the pre-stored original input feature.
Alternatively, if data on both the contact area and the pressure of the touch at the input position is collected at input position A, then from the collected data, the preset correspondence between contact area and touch manner, and the preset correspondence between pressure and touch manner, it can be judged by which input manner the current input feature was entered; likewise, if the current input feature is judged to be touch data based on the finger pad, the current input feature collected at input position A can be considered consistent with the pre-stored original input feature.
It is conceivable that, in more demanding scenarios, a combination of two or more of the contact area of the touch, the pressure of the touch, the sound produced by the touch and other touch data may be used to judge by which input manner the current input feature was entered; no specific limitation is made herein.
The correspondence between contact area and touch manner, and/or between pressure and touch manner, and/or between the type of touch sound and touch manner, and similar correspondences between touch data and touch manners, may be preset in the client, where the touch manners include input with the finger pad, the knuckle, the nail or another human body part.
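The preset correspondences described here could be held as per-manner ranges for each kind of touch data. The sketch below accepts a manner only when every collected signal for which a range is preset falls inside that range; all range values are placeholders, not values taken from the text.

```kotlin
// Hypothetical preset correspondences between touch data and touch manners.
enum class TouchManner { FINGER_PAD, KNUCKLE, NAIL }

data class MannerRanges(
    val area: ClosedRange<Double>? = null,      // contact area range for this manner, if preset
    val pressure: ClosedRange<Double>? = null,  // pressure range for this manner, if preset
    val sound: ClosedRange<Double>? = null      // touch-sound amplitude range, if preset
)

// Placeholder configuration; a real client would preset and calibrate these per device.
val presetRanges = mapOf(
    TouchManner.NAIL       to MannerRanges(area = 0.0..20.0,   sound = 0.6..1.0),
    TouchManner.KNUCKLE    to MannerRanges(area = 20.0..60.0,  pressure = 0.5..1.0),
    TouchManner.FINGER_PAD to MannerRanges(area = 60.0..200.0, pressure = 0.0..0.5)
)

// Classify using whichever of area / pressure / sound were actually collected; a manner matches
// when every collected signal that has a preset range for that manner falls inside the range.
fun classify(area: Double?, pressure: Double?, sound: Double?): TouchManner? =
    presetRanges.entries.firstOrNull { (_, r) ->
        (area == null || r.area == null || area in r.area) &&
        (pressure == null || r.pressure == null || pressure in r.pressure) &&
        (sound == null || r.sound == null || sound in r.sound)
    }?.key
```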
It should be understood that the current input feature entered by the user at each of the different input positions can be checked in the same way as described above for input position A, being compared one by one with the pre-stored original input feature of the corresponding input position. If all the collected current input features are consistent with the pre-stored original input features, the network operation can be judged to be a manual operation and the user's operation is responded to; at the same time, the comparison result is displayed on the display interface (i.e. the client's touch screen), and the client user is prompted that the verification has passed.
As can be seen from the above, this embodiment sets at least two input positions on the display interface in advance, associates the collected original input features of the user one by one with the input positions of the display interface, and stores them, where the user's original input features include touch data of a human body part. When user verification is performed for a subsequent operation, the current input features entered by the user at the input positions are collected and compared with the original input features; when the collected current input features are consistent with the pre-stored original input features, the user's operation is responded to. By using touch data of a human body part as the user's input feature, the present invention determines whether a network operation is the behavior of the user or of a machine, and authorizes the operation only if it is a manual operation. This effectively prevents a machine from performing account-related operations, reduces the user's risk in online transactions, protects property, and keeps private information confidential. Further, collecting the touch data based on the finger pad, knuckle or nail entered by the user at each input position allows a manual operation to be judged more accurately and further improves the reliability of user verification.
Referring to Fig. 4, Fig. 4 is a schematic flowchart of a user authentication method based on user input features according to the third embodiment of the present invention. Unlike the second embodiment above, this embodiment analyses the case where the collected current input features are inconsistent with the pre-stored original input features.
In step S401, the user's original input features are collected in advance, and the collected original input features are associated one by one with input positions of a display interface and then stored.
The display interface comprises at least two input positions, each input position corresponds to a different original input feature, and the user's original input features comprise touch data of a human body part.
In the embodiment of the present invention, the entity executing the user authentication method based on user input features is a client. The client includes, but is not limited to, the following devices: a mobile terminal with a touch screen, a computer with a touch screen, a personal handheld computer with a touch screen, and the like. It should be understood that any electronic device with a touch screen falls within the protection scope of the present invention.
Preferably, the touch data of a human body part is the user's touch data collected by the client when the user touches the display screen at the input position.
Further, in the embodiment of the present invention, the input features may include touch data based on the finger pad, the knuckle and the nail. That is, the touch data of a human body part is specifically touch data of the user's finger pad, knuckle or nail.
Further, the touch data based on the finger pad, knuckle or nail may include any combination of one or more of: the contact area of the touch at the input position, the pressure of the touch at the input position, and the sound produced by the touch at the input position. That is, from any combination of one or more of these touch data it can be determined whether the current input is made with the finger pad, the knuckle or the nail; no specific limitation is placed on the implementation herein.
In step S402, when user verification is performed for a subsequent operation, the user is prompted to enter input in a different input manner at each of the different input positions.
It should be understood that each input position corresponds to an original input feature; the different input manners mean that the user inputs at the different input positions according to the original input feature corresponding to each input position.
For example, in the pre-stored correspondence between input positions and original input features, input position A corresponds to touch data based on the finger pad; the client then prompts the user, at input position A of its touch-screen display interface, to enter the touch input with the finger pad.
In step S403, the current input features entered by the user at the input positions are collected.
In step S404, the current input features entered by the user at the different input positions are compared with the pre-stored original input features of the corresponding input positions.
Steps S403 and S404 may specifically be as follows:
Preferably, the touch data based on the finger pad, knuckle or nail entered by the user at each input position is collected.
After the touch data is collected, the current input features entered by the user at the different input positions are compared one by one with the original input features of the corresponding input positions, so as to determine, from the current input features entered at the input positions and the pre-stored association between original input features and input positions, whether the network operation is a machine operation or a manual operation.
For example, in the pre-stored correspondence between input positions and original input features, input position A corresponds to touch data based on the finger pad; the client then collects the current input feature entered by the user at input position A and compares it with the preset finger-pad touch data.
In step S405a, if the collected current input features are consistent with the pre-stored original input features, the user's operation is responded to.
It should be understood that, in this embodiment, the input features include touch data of a human body part such as the finger pad, knuckle or nail, and the touch data based on the finger pad, knuckle or nail may include any combination of one or more of: the contact area of the touch at the input position, the pressure of the touch at the input position, the sound produced by the touch at the input position, and other touch data.
That is, the client can determine, from any combination of one or more of these touch data, whether the current input is made with the finger pad, the knuckle, the nail or another touch manner.
For example, in the pre-stored correspondence between input positions and original input features, input position A corresponds to touch data based on the finger pad; the client collects the current input feature entered by the user at input position A and compares it with the preset finger-pad touch data:
If data on the contact area of the touch at the input position is collected at input position A, then from the collected data and the preset correspondence between contact area and touch manner it can be judged by which input manner the current input feature was entered; if the current input feature is judged to be touch data based on the finger pad, the current input feature collected at input position A can be considered consistent with the pre-stored original input feature.
Alternatively, if data on both the contact area and the pressure of the touch at the input position is collected at input position A, then from the collected data, the preset correspondence between contact area and touch manner, and the preset correspondence between pressure and touch manner, it can be judged by which input manner the current input feature was entered; likewise, if the current input feature is judged to be touch data based on the finger pad, the current input feature collected at input position A can be considered consistent with the pre-stored original input feature.
It is conceivable that, in more demanding scenarios, a combination of two or more of the contact area of the touch, the pressure of the touch, the sound produced by the touch and other touch data may be used to judge by which input manner the current input feature was entered; no specific limitation is made herein.
The correspondence between contact area and touch manner, and/or between pressure and touch manner, and/or between the type of touch sound and touch manner, and similar correspondences between touch data and touch manners, may be preset in the client, where the touch manners include input with the finger pad, the knuckle, the nail or another human body part.
It should be understood that the current input feature entered by the user at each of the different input positions can be checked in the same way as described above for input position A, being compared one by one with the pre-stored original input feature of the corresponding input position. If all the collected current input features are consistent with the pre-stored original input features, the network operation can be judged to be a manual operation and the user's operation is responded to; at the same time, the comparison result is displayed on the display interface (i.e. the client's touch screen), and the client user is prompted that the verification has passed.
In step S405b, if the collected current input features are inconsistent with the pre-stored original input features, the user is prompted to re-enter the input.
After the user is prompted to re-enter the input, the process may return to step S402 and continue.
It should be understood that, in one implementable manner, during the collection of the current input features entered by the user at the different input positions, each current input feature is compared as soon as it is received; once a current input feature is inconsistent with the pre-stored original input feature, the user is prompted to re-enter the input.
That is, in the process of collecting the current input features entered at the input positions one by one, as soon as any collected current input feature is inconsistent with the pre-stored original input feature, the collection of the user's current input features is stopped and the user is prompted to re-enter the input. The re-entry can take two forms: first, the user is prompted to re-enter only at the input position where the comparison result was inconsistent; second, the user is prompted to re-enter at the input positions one by one.
In another implementable manner, after the current input features entered at all the input positions have been collected, all the current input features are compared one by one in a single pass; once any of them is inconsistent with the original input feature of the corresponding input position, the user is prompted to re-enter the input.
That is, after input has been completed at all the input positions, if any collected current input feature is inconsistent with the pre-stored original input feature, the user is prompted to re-enter at the input positions one by one.
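A sketch of the two checking strategies just described: comparing each feature as it is received and stopping at the first mismatch, versus comparing everything after all positions have been entered and reporting every inconsistent position. The function names and types are assumptions.

```kotlin
// Hypothetical comparison strategies; `original` holds the pre-stored feature per position.
enum class TouchManner { FINGER_PAD, KNUCKLE, NAIL }

// Form 1: check each current feature as soon as it is received and stop at the first
// mismatch, returning the single position that has to be re-entered (or null if all pass).
fun firstMismatch(original: Map<Int, TouchManner>, entries: Sequence<Pair<Int, TouchManner>>): Int? =
    entries.firstOrNull { (position, manner) -> original[position] != manner }?.first

// Form 2: collect everything first, then compare all positions one by one and
// report every position that has to be re-entered (an empty list means verification passed).
fun allMismatches(original: Map<Int, TouchManner>, current: Map<Int, TouchManner>): List<Int> =
    original.filter { (position, manner) -> current[position] != manner }.keys.toList()
```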
It should be understood that, in some more demanding scenarios, if a collected current input feature is detected to be inconsistent with the pre-stored original input feature, the user's operation may simply not be responded to: the network operation is refused and the current display interface is exited, and the next user verification for a subsequent network operation can only be performed after a preset period of time. Alternatively, the number of re-entries allowed may be limited to no more than 2; if this limit is exceeded, the current display interface is likewise exited, and the next user verification for a subsequent network operation can only be performed after a preset period of time. These are only examples and do not constitute a limitation of the present invention.
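The stricter handling suggested here could be tracked with a small state holder such as the one below. The two-re-entry limit follows the text; the ten-minute lockout is an assumed value standing in for the unspecified preset time period.

```kotlin
import java.time.Duration
import java.time.Instant

// Hypothetical re-entry policy: at most 2 re-entries, then exit the current display
// interface and refuse further verification attempts until the lockout period has passed.
class ReentryPolicy(
    private val maxReentries: Int = 2,
    private val lockout: Duration = Duration.ofMinutes(10)   // assumed value
) {
    private var reentries = 0
    private var lockedUntil: Instant? = null

    fun canAttempt(now: Instant = Instant.now()): Boolean =
        lockedUntil?.isAfter(now) != true

    // Call when a comparison fails; returns true if the user may be prompted to re-enter.
    fun registerFailure(now: Instant = Instant.now()): Boolean {
        reentries += 1
        if (reentries > maxReentries) {
            lockedUntil = now.plus(lockout)   // exit the interface; retry only after the period
            return false
        }
        return true
    }

    fun reset() { reentries = 0; lockedUntil = null }
}
```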
As can be seen from the above, this embodiment sets at least two input positions on the display interface in advance, associates the collected original input features of the user one by one with the input positions of the display interface, and stores them, where the user's original input features include touch data of a human body part. When user verification is performed for a subsequent operation, the current input features entered by the user at the input positions are collected and compared with the original input features; when the collected current input features are consistent with the pre-stored original input features, the user's operation is responded to. By using touch data of a human body part as the user's input feature, the present invention determines whether a network operation is the behavior of the user or of a machine; the operation is authorized if it is a manual operation and refused if it is a machine operation. This effectively prevents a machine from performing account-related operations, reduces the user's risk in online transactions, protects property, and keeps private information confidential. Further, collecting the touch data based on the finger pad, knuckle or nail entered by the user at each input position allows a manual operation to be judged more accurately and further improves the reliability of user verification.
Referring to Fig. 5, Fig. 5 shows a specific application embodiment of the user authentication method based on user input features according to an embodiment of the present invention. As shown in the figure, the user verification system comprises a user 11, a user interface (UI) 12 and a verification processing module 13. It should be understood that the user is mentioned only to better describe the user authentication method and is not part of the user verification system; the user interface is the front-end display module of the client, and the verification processing module 13 is the back-end processing module of the client.
In step S501, the verification processing module 13 establishes and stores the correspondence between the user's original input features and the input positions of the display interface.
The input features may include touch data based on the finger pad, knuckle and nail; in that case, the touch data of a human body part is specifically touch data of the user's finger pad, knuckle or nail.
In step S502, the user 11 requests user verification from the user interface 12 in order to perform a subsequent operation,
and the request is sent to the verification processing module 13 through the user interface 12.
In step S503, according to the one-to-one correspondence between the original input features and the input positions, the verification processing module 13 prompts the user to enter input at each of the different input positions, and the prompts are displayed on the user interface 12.
In step S504, the user 11 enters input at the input positions of the user interface 12,
and the input features are sent to the verification processing module 13 through the user interface 12.
In step S505, the verification processing module 13 collects the current input features entered by the user 11 at the input positions.
In step S506, the verification processing module 13 compares the current input features entered by the user at the different input positions with the pre-stored original input features of the corresponding input positions.
In step S507, if the collected current input features are consistent with the pre-stored original input features, the verification processing module 13 prompts the user 11 through the user interface 12 that the verification has passed.
In step S508, if the collected current input features are consistent with the pre-stored original input features, the verification processing module 13 also responds to the operation of the user 11.
In step S509, if the collected current input features are inconsistent with the pre-stored original input features, the verification processing module 13 prompts the user through the user interface 12 to re-enter the input.
For example, in the pre-stored correspondence between input positions and original input features, a certain input position corresponds to touch data based on the finger pad. If data on the contact area of the touch is collected at this input position, then from the collected data and the preset correspondence between contact area and touch manner it can be judged by which input manner the current input feature was entered; if the current input feature is judged to be touch data based on the finger pad, the current input feature collected at this input position can be considered consistent with the pre-stored original input feature.
If, after the current input features entered at each of the different input positions have been compared one by one with the pre-stored original input features of the corresponding input positions, all the collected current input features are judged to be consistent with the pre-stored original input features, the network operation can be judged to be a manual operation and the user's operation is responded to, thereby preventing the user's private and property-related account information from being stolen and strengthening the security protection of the user account.
It should be understood that steps S501 to S509 can be implemented with reference to the related description of the third embodiment above and are not repeated here.
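As an illustration of how the user interface 12 and the verification processing module 13 might interact in the sequence of Fig. 5, the sketch below drives the prompt / collect / compare / respond loop through a small interface. Everything beyond the order of the steps, including the UserInterface abstraction, is assumed.

```kotlin
// Hypothetical wiring of the Fig. 5 sequence; the UserInterface abstraction is assumed.
enum class TouchManner { FINGER_PAD, KNUCKLE, NAIL }

interface UserInterface {                                  // front-end display module of the client
    fun prompt(position: Int, manner: TouchManner)         // S503: show the prompt at a position
    fun collect(position: Int): TouchManner                // S504/S505: obtain the current input feature
    fun showResult(passed: Boolean)                        // S507 / S509
}

class VerificationProcessingModule(private val ui: UserInterface) {   // back-end processing module
    private val original = mutableMapOf<Int, TouchManner>()

    fun establishCorrespondence(position: Int, manner: TouchManner) {  // S501
        original[position] = manner
    }

    // S502..S509: prompt at each position, collect, compare, then report the result.
    fun handleVerificationRequest(): Boolean {
        val passed = original.all { (position, manner) ->
            ui.prompt(position, manner)                    // S503
            ui.collect(position) == manner                 // S505 / S506
        }
        ui.showResult(passed)                              // S507 or S509
        return passed                                      // caller responds to the operation on true (S508)
    }
}
```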
To better implement the user authentication method based on user input features provided by the embodiments of the present invention, an embodiment of the present invention also provides a user authentication device based on user input features. The terms used here have the same meaning as in the user authentication method described above, and implementation details can be found in the method embodiments. Referring to Fig. 6, Fig. 6 is a schematic structural diagram of a user authentication device based on user input features according to an embodiment of the present invention, where the device comprises a presetting module 61, an acquisition module 62, a comparison module 63 and a response module 64.
The presetting module 61 collects the user's original input features in advance, associates the collected original input features one by one with the input positions of the display interface, and stores them, where the display interface comprises at least two input positions, each input position corresponds to a different original input feature, and the user's original input features comprise touch data of a human body part. The acquisition module 62 collects, when user verification is performed for a subsequent operation, the current input features entered by the user at the input positions.
In the embodiment of the present invention, the user authentication device based on user input features may be a software unit built into the client, a hardware unit, or a unit combining software and hardware. The client includes, but is not limited to, the following devices: a mobile terminal with a touch screen, a computer with a touch screen, a personal handheld computer with a touch screen, and the like. It should be understood that any electronic device with a touch screen falls within the protection scope of the present invention.
The comparison module 63 compares the current input features entered by the user at the different input positions with the pre-stored original input features of the corresponding input positions. The response module 64 responds to the user's operation if the collected current input features are consistent with the pre-stored original input features.
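A structural sketch of the four modules of Fig. 6 as plain classes; the method signatures are assumptions made for illustration only.

```kotlin
// Hypothetical skeleton of the device of Fig. 6; signatures are assumed.
enum class TouchManner { FINGER_PAD, KNUCKLE, NAIL }

class PresettingModule {                                  // module 61
    val original = mutableMapOf<Int, TouchManner>()
    fun preset(position: Int, manner: TouchManner) { original[position] = manner }
}

class AcquisitionModule {                                 // module 62
    fun acquire(positions: Set<Int>, read: (Int) -> TouchManner): Map<Int, TouchManner> =
        positions.associateWith(read)
}

class ComparisonModule {                                  // module 63
    fun consistent(original: Map<Int, TouchManner>, current: Map<Int, TouchManner>): Boolean =
        original.all { (position, manner) -> current[position] == manner }
}

class ResponseModule {                                    // module 64
    fun respond(consistent: Boolean, operation: () -> Unit) {
        if (consistent) operation()                       // respond to the user's operation
    }
}
```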
When the client user performs a network operation, user verification is first required, that is, the client distinguishes whether the network operation is a machine operation or a manual operation. If the user verification passes, the client responds to the user's network operation; in other words, once the current network operation is determined to be a manual operation, the operation is responded to, so as to ensure the security of the user account.
In this embodiment, the client collects the current input features entered by the user at the input positions and compares the current input features entered at the different input positions with the prestored original input features of the corresponding input positions. Based on the current input features entered at the input positions and the prestored association between original input features and input positions, the client determines whether the network operation is a machine operation or a manual operation.
If the current input features are consistent with the prestored original input features, the user verification can be considered passed, that is, the network operation is judged to be a manual operation, and the user's operation is responded to.
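For illustration only, the following is a minimal sketch of this verification flow, assuming each stored original input feature and each collected current input feature is reduced to an input-mode label per position; the names and data shapes are illustrative, not taken from the patent:

```python
# Hypothetical sketch of the verification flow described above.
# Input-mode labels and data shapes are assumptions for illustration only.

ORIGINAL_FEATURES = {          # prestored: input position -> original input feature
    "A": "finger_pad",
    "B": "knuckle",
}

def verify_user(current_features: dict) -> bool:
    """Return True (manual operation) only if every input position matches."""
    for position, original in ORIGINAL_FEATURES.items():
        if current_features.get(position) != original:
            return False       # treated as a possible machine operation
    return True

# Example: respond to the operation only when verification passes.
if verify_user({"A": "finger_pad", "B": "knuckle"}):
    print("verification passed: respond to the user's operation")
else:
    print("verification failed: do not respond")
```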
In summary, in this embodiment at least two input positions are set on the display interface in advance, and the collected original input features of the user are associated one by one with the input positions of the display interface and then stored, where the original input features of the user comprise touch data of a human organ. When user verification is performed for a subsequent operation, the current input features entered by the user at the input positions are collected and compared with the original input features; when the collected current input features are consistent with the prestored original input features, the user's operation is responded to. By using the touch data of a human organ as the user's input features, the present invention judges whether a network operation is the operation behavior of a user or of a machine, and authorizes the operation only if it is a manual operation. This effectively prevents a machine from performing account-related operations, reduces the user's risk in online transactions, effectively protects property, and keeps private information confidential.
Referring to Fig. 7, Fig. 7 is a schematic structural diagram of the user authentication device based on user input features provided by an embodiment of the present invention. The device comprises a presetting module 71, an acquisition module 72, a comparison module 73, a response module 74, a first reminding module 75, a second reminding module 76 and a third reminding module 77. In this embodiment, preferably, the input features may comprise touch data based on the finger pad, knuckle, or nail.
The presetting module 71 collects in advance the user's original touch data based on the finger pad, knuckle, or nail, associates the collected original input features one by one with the input positions of the display interface, and stores them. The display interface comprises at least two input positions, each input position corresponds to a different original input feature, and the original input features of the user comprise touch data of a human organ.
That is, the touch data of a human organ is the touch data collected by the client when the user provides input at the input positions on the display screen by touch.
Further, the touch data based on the finger pad, knuckle, or nail may comprise any combination of one or more of: the area range of the touch on the input position, the pressure of the touch on the input position, and the sound emitted by the touch on the input position. That is, according to any combination of one or more of these touch data, it can be determined whether the current input feature was entered with the finger pad, the knuckle, or the nail; the specific realization is not limited here.
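As an illustrative sketch only, one possible way to classify the touch manner from such data is shown below; the thresholds and field names are hypothetical and serve only to show the idea:

```python
# Hypothetical classification of the touch manner from touch data.
# Thresholds are illustrative assumptions, not values from the patent.

def classify_touch(area_mm2: float, pressure: float, sound_db: float) -> str:
    """Map raw touch data to an input-mode label."""
    if area_mm2 > 60 and pressure < 0.6:
        return "finger_pad"    # large contact area, moderate pressure
    if pressure >= 0.6 and sound_db > 40:
        return "knuckle"       # harder tap, audible knock
    if area_mm2 < 20:
        return "nail"          # small, sharp contact area
    return "unknown"

print(classify_touch(area_mm2=75, pressure=0.4, sound_db=20))  # -> finger_pad
```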
The first reminding module 75 prompts the user to enter input at each input position using a different input mode, where the different input modes are determined by the original input feature corresponding to each input position.
It can be understood that each input position corresponds to one original input feature. For example, if the prestored correspondence between input positions and original input features maps input position A to touch data based on the finger pad, the client prompts the user, at input position A of the display interface on its touch screen, to provide the touch input with the finger pad.
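A prompt for each input position could, for instance, be derived directly from the prestored correspondence; the labels and wording below are assumptions for illustration:

```python
# Hypothetical sketch of the first reminding module: derive the prompt shown at each
# input position from the prestored original input feature for that position.

PROMPT_TEXT = {
    "finger_pad": "please touch here with your finger pad",
    "knuckle":    "please knock here with your knuckle",
    "nail":       "please tap here with your nail",
}

def prompts_for(original_features: dict) -> dict:
    """original_features: input position -> original input feature label."""
    return {pos: PROMPT_TEXT[feat] for pos, feat in original_features.items()}

print(prompts_for({"A": "finger_pad", "B": "knuckle"}))
```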
The acquisition module 72 collects, when user verification is performed for a subsequent operation, the current touch data based on the finger pad, knuckle, or nail entered by the user at the input positions. The comparison module 73 compares the current input features entered by the user at the different input positions with the prestored original input features of the corresponding input positions.
Preferably, the touch data based on the finger pad, knuckle, or nail entered by the user at each input position is collected. After the touch data has been collected, the current input features entered by the user at the different input positions are compared one by one with the original input features of the corresponding input positions. Based on the current input features entered at the input positions and the prestored association between original input features and input positions, it is determined whether the network operation is a machine operation or a manual operation.
The response module 74 responds to the user's operation if the collected current input features are consistent with the prestored original input features.
For example, suppose that in the prestored correspondence between input positions and original input features, input position A corresponds to touch data based on the finger pad. The client collects the current input feature entered by the user at input position A and compares it with the preset finger-pad touch data:
If data on the area range of the touch is collected at input position A, then according to the collected data and the preset correspondence between touch area range and touch manner, the input mode used for the current input feature can be judged. If the current input feature is judged to be touch data based on the finger pad, the current input feature collected at input position A can be considered consistent with the prestored original input feature.
Alternatively, if data on both the area range and the pressure of the touch is collected at input position A, then according to the collected data, combined with the preset correspondence between touch area range and touch manner and the preset correspondence between touch pressure and touch manner, the input mode used for the current input feature can be judged. Likewise, if the current input feature is judged to be touch data based on the finger pad, the current input feature collected at input position A can be considered consistent with the prestored original input feature.
It is conceivable that, in scenarios requiring higher accuracy, a combination of two or more of the area range of the touch, the pressure of the touch, the sound emitted by the touch, and other touch data may be used to judge the input mode of the current input feature; no specific limitation is made here.
The correspondence between touch area range and touch manner, and/or the correspondence between touch pressure and touch manner, and/or the correspondence between touch data such as the sound type of the touch and touch manner, may be preset in the client, where the touch manners include input with the finger pad, the knuckle, the nail, or another human organ.
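Such preset correspondences could, for instance, be represented as simple lookup tables on the client; the following sketch is an assumption for illustration, and the range values are invented:

```python
# Hypothetical preset correspondence tables between touch data and touch manner.
# Range values are illustrative assumptions.

AREA_TO_MANNER = [             # (min_area_mm2, max_area_mm2, touch manner)
    (0, 20, "nail"),
    (20, 60, "knuckle"),
    (60, 200, "finger_pad"),
]

PRESSURE_TO_MANNER = [         # (min_pressure, max_pressure, touch manner)
    (0.0, 0.3, "nail"),
    (0.3, 0.6, "finger_pad"),
    (0.6, 1.0, "knuckle"),
]

def lookup(table, value):
    """Return the touch manner whose range contains the value, or None."""
    for low, high, manner in table:
        if low <= value < high:
            return manner
    return None

def judge_manner(area_mm2, pressure):
    """Combine two tables: accept only when both agree on the same manner."""
    a = lookup(AREA_TO_MANNER, area_mm2)
    p = lookup(PRESSURE_TO_MANNER, pressure)
    return a if a == p else None
```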
It can be understood that the current input feature entered by the user at each of the different input positions can be checked in the same way as described above for input position A, that is, compared one by one with the prestored original input feature of the corresponding input position. If all the collected current input features are consistent with the prestored original input features, the network operation can be judged to be a manual operation and the user's operation is responded to; at the same time, the comparison result is shown on the display interface (i.e. the client touch screen) to prompt the client user that the verification has passed.
Further preferably, if a collected current input feature is inconsistent with the prestored original input feature, the user is prompted to enter the input again.
It can be understood that, in one implementable manner, the second reminding module 76, in the process of collecting the current input features entered by the user at the different input positions, compares each current input feature as soon as it is received; once a current input feature is inconsistent with the prestored original input feature, the user is prompted to enter the input again.
That is, in the process of collecting the current input features entered at the input positions one by one, as soon as any collected current input feature is inconsistent with the prestored original input feature, collection of the user's current input features stops and the user is prompted to re-enter the input. Re-entering may take two forms: first, the user is prompted to re-enter only at the input position whose comparison result was inconsistent; second, the user is prompted to re-enter at each of the input positions one by one.
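A minimal sketch of this per-feature check, under the assumption that features are already reduced to comparable labels and that `read_feature` is a hypothetical collection callback, could look as follows:

```python
# Hypothetical sketch of the per-feature check performed by the second reminding module:
# each feature is compared as soon as it is received, and collection stops on a mismatch.

def verify_incrementally(positions, originals, read_feature):
    """positions: ordered input positions; originals: position -> original feature;
    read_feature: callable that collects the current feature at one position."""
    for position in positions:
        current = read_feature(position)
        if current != originals[position]:
            # Prompt form 1: re-enter at the mismatching position only.
            print(f"please re-enter your input at position {position}")
            return False       # stop collecting further features
    return True
```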
In another implementable manner, the third reminding module 77, after the current input features entered by the user at all input positions have been collected, compares all the current input features one by one in a single pass; once any current input feature is inconsistent with the original input feature of the corresponding input position, the user is prompted to enter the input again.
That is, after input has been completed at all input positions, if any collected current input feature is inconsistent with the prestored original input feature, the user is prompted to re-enter the input at each of the input positions one by one.
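By contrast, a batch check under the same assumptions could be sketched as:

```python
# Hypothetical sketch of the batch check performed by the third reminding module:
# all features are collected first, then compared in a single pass.

def verify_in_batch(originals: dict, currents: dict) -> bool:
    mismatches = [p for p, orig in originals.items() if currents.get(p) != orig]
    if mismatches:
        # Prompt the user to re-enter at each input position one by one.
        print("please re-enter your input at every input position")
        return False
    return True
```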
It can be understood that, in scenarios requiring higher accuracy, if a collected current input feature is detected to be inconsistent with the prestored original input feature, the user operation may simply not be responded to: the network operation is refused and the current display interface is exited, and user verification for a subsequent network operation can only be performed again after a preset time period. Alternatively, the number of times the user is allowed to re-enter may be limited, for example to no more than 2; if the limit is exceeded, the current display interface is likewise exited, and user verification for a subsequent network operation can only be performed again after a preset time period. These are merely examples and do not limit the present invention.
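For illustration, the retry limit and lockout described above might be sketched as follows; the limit of 2 re-entries comes from the text, while the lockout length and function names are assumptions:

```python
# Hypothetical sketch of the retry limit and lockout described above.
# The limit of 2 re-entries comes from the text; the lockout length is an assumption.

import time

MAX_REENTRIES = 2
LOCKOUT_SECONDS = 60           # "preset time period" -- illustrative value

def verify_with_retries(attempt_verification):
    """attempt_verification: callable returning True if all features match.
    Returns (passed, locked_until_timestamp)."""
    for _ in range(1 + MAX_REENTRIES):     # initial attempt plus allowed re-entries
        if attempt_verification():
            return True, None
    # Refuse the operation, exit the interface, and lock verification for a while.
    return False, time.time() + LOCKOUT_SECONDS
```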
In summary, in this embodiment at least two input positions are set on the display interface in advance, and the collected original input features of the user are associated one by one with the input positions of the display interface and then stored, where the original input features of the user comprise touch data of a human organ. When user verification is performed for a subsequent operation, the current input features entered by the user at the input positions are collected and compared with the original input features; when the collected current input features are consistent with the prestored original input features, the user's operation is responded to. By using the touch data of a human organ as the user's input features, the present invention judges whether a network operation is the operation behavior of a user or of a machine, authorizes the operation if it is a manual operation, and refuses the operation if it is a machine operation. This effectively prevents a machine from performing account-related operations, reduces the user's risk in online transactions, effectively protects property, and keeps private information confidential. Further, by collecting the touch data based on the finger pad, knuckle, or nail entered by the user at each input position, a manual operation can be judged more accurately, which further improves the reliability of user verification.
It can be understood that parts not described in detail in this embodiment of the present invention may be implemented with reference to the associated description of the above related embodiments, and are not repeated here.
The user authentication device based on user input features provided by the embodiments of the present invention may be, for example, a computer, a tablet computer, or a mobile phone with a touch function. The device and the user authentication method based on user input features in the foregoing embodiments belong to the same concept; the device can run any of the methods provided in the embodiments of the user authentication method based on user input features, and for its specific implementation process, reference is made to the embodiments of the user authentication method based on user input features, which are not repeated here.
It should be noted that, for the user authentication method based on user input features of the present invention, those of ordinary skill in the art can understand that all or part of the flow of the user authentication method based on user input features described in the embodiments of the present invention can be implemented by a computer program controlling the relevant hardware. The computer program may be stored in a computer-readable storage medium, for example in the memory of a terminal, and executed by at least one processor in the terminal; during execution, the flow of the embodiments of the user authentication method based on user input features may be included. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.
For the user authentication device based on user input features described in the embodiments of the present invention, its functional modules may be integrated in one processing chip, may each exist separately and physically, or two or more modules may be integrated in one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk, or an optical disc.
In summary, although the present invention has been disclosed above with preferred embodiments, the above preferred embodiments are not intended to limit the present invention. Those of ordinary skill in the art can make various changes and modifications without departing from the spirit and scope of the present invention; therefore, the protection scope of the present invention is defined by the claims.

Claims (12)

1. A user authentication method based on user input features, characterized by comprising:
collecting the original input features of a user in advance, associating the collected original input features one by one with the input positions of a display interface, and then storing them, wherein the display interface comprises at least two input positions, each input position corresponds to a different original input feature, and the original input features of the user comprise touch data of a human organ;
when user verification is performed for a subsequent operation, collecting the current input features entered by the user at the different input positions;
comparing the current input features entered by the user at the different input positions with the prestored original input features of the corresponding input positions; and
if the collected current input features are consistent with the prestored original input features, responding to the user's operation.
2. The user authentication method based on user input features according to claim 1, characterized in that the input features comprise touch data based on the finger pad, knuckle, or nail.
3. The user authentication method based on user input features according to claim 2, characterized in that the touch data based on the finger pad, knuckle, or nail comprises any combination of one or more of: the area range of the touch on the input position, the pressure of the touch on the input position, and the sound emitted by the touch on the input position.
4. The user authentication method based on user input features according to any one of claims 1 to 3, characterized in that, before the step of collecting the current input features entered by the user at the input positions, the method further comprises:
prompting the user to enter input at each input position using a different input mode, wherein the different input modes are determined by the original input feature corresponding to each input position.
5. The user authentication method based on user input features according to claim 1, characterized in that, after the step of comparing the current input features entered by the user at the different input positions with the prestored original input features of the corresponding input positions, the method further comprises:
in the process of collecting the current input features entered by the user at the different input positions, comparing each current input feature as soon as it is received, and once a current input feature is inconsistent with the prestored original input feature, prompting the user to enter the input again.
6. The user authentication method based on user input features according to claim 1, characterized in that, after the step of comparing the current input features entered by the user at the different input positions with the prestored original input features of the corresponding input positions, the method further comprises:
after the current input features entered by the user at all input positions have been collected, comparing all the current input features one by one in a single pass, and once any current input feature is inconsistent with the original input feature of the corresponding input position, prompting the user to enter the input again.
7. A user authentication device based on user input features, characterized by comprising:
a presetting module, configured to collect the original input features of a user in advance, associate the collected original input features one by one with the input positions of a display interface, and then store them, wherein the display interface comprises at least two input positions, each input position corresponds to a different original input feature, and the original input features of the user comprise touch data of a human organ;
an acquisition module, configured to collect, when user verification is performed for a subsequent operation, the current input features entered by the user at the different input positions;
a comparison module, configured to compare the current input features entered by the user at the different input positions with the prestored original input features of the corresponding input positions; and
a response module, configured to respond to the user's operation if the collected current input features are consistent with the prestored original input features.
8. The user authentication device based on user input features according to claim 7, characterized in that:
the presetting module is further configured to collect in advance the user's original touch data based on the finger pad, knuckle, or nail;
the acquisition module is further configured to collect, when user verification is performed for a subsequent operation, the current touch data based on the finger pad, knuckle, or nail entered by the user at the input positions.
9. The user authentication device based on user input features according to claim 8, characterized in that:
the touch data based on the finger pad, knuckle, or nail comprises any combination of one or more of: the area range of the touch on the input position, the pressure of the touch on the input position, and the sound emitted by the touch on the input position.
10. The user authentication device based on user input features according to any one of claims 7 to 9, characterized in that the device further comprises:
a first reminding module, configured to prompt the user to enter input at each input position using a different input mode, wherein the different input modes are determined by the original input feature corresponding to each input position.
11. The user authentication device based on user input features according to claim 1, characterized in that the device further comprises:
a second reminding module, configured to, in the process of collecting the current input features entered by the user at the different input positions, compare each current input feature as soon as it is received, and once a current input feature is inconsistent with the prestored original input feature, prompt the user to enter the input again.
12. The user authentication device based on user input features according to claim 1, characterized in that the device further comprises:
a third reminding module, configured to, after the current input features entered by the user at all input positions have been collected, compare all the current input features one by one in a single pass, and once any current input feature is inconsistent with the original input feature of the corresponding input position, prompt the user to enter the input again.
CN201410295075.7A 2014-06-25 2014-06-25 A kind of user authentication method and device based on user's input feature vector Active CN105207979B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410295075.7A CN105207979B (en) 2014-06-25 2014-06-25 A kind of user authentication method and device based on user's input feature vector

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410295075.7A CN105207979B (en) 2014-06-25 2014-06-25 A kind of user authentication method and device based on user's input feature vector

Publications (2)

Publication Number Publication Date
CN105207979A true CN105207979A (en) 2015-12-30
CN105207979B CN105207979B (en) 2018-01-26

Family

ID=54955417

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410295075.7A Active CN105207979B (en) 2014-06-25 2014-06-25 A kind of user authentication method and device based on user's input feature vector

Country Status (1)

Country Link
CN (1) CN105207979B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108701310A (en) * 2016-02-03 2018-10-23 万事达卡国际股份有限公司 Biological attribute data based on capture explains that user expresses and is based on this and provides service
CN112346793A (en) * 2020-09-18 2021-02-09 长沙市到家悠享网络科技有限公司 Data processing method and device, electronic equipment and computer readable medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1223416A (en) * 1998-01-14 1999-07-21 日本电气株式会社 Method for registering and contrasting palmprint and the registering/contrasting device thereof
CN102609129A (en) * 2010-12-29 2012-07-25 微软公司 User identification with biokinematic input
CN102812471A (en) * 2010-03-12 2012-12-05 奥斯-纽赫瑞森个人计算机解决方案公司 A secured personal data handling and management system


Also Published As

Publication number Publication date
CN105207979B (en) 2018-01-26

Similar Documents

Publication Publication Date Title
US11757872B2 (en) Contextual and risk-based multi-factor authentication
US10248910B2 (en) Detection mitigation and remediation of cyberattacks employing an advanced cyber-decision platform
US9529990B2 (en) Systems and methods for validating login attempts based on user location
CN104426885B (en) Abnormal account providing method and device
US9009814B1 (en) Systems and methods for generating secure passwords
US11119648B2 (en) Obfuscating mobile device passwords through pattern generation
US8875279B2 (en) Passwords for touch-based platforms using time-based finger taps
EP3029593A1 (en) System and method of limiting the operation of trusted applications in the presence of suspicious programs
KR20190014124A (en) Two factor authentication
US9563763B1 (en) Enhanced captchas
US10200359B1 (en) Systems and methods for creating credential vaults that use multi-factor authentication to automatically authenticate users to online services
CN106096441A (en) Date storage method and data storage device
US11379568B2 (en) Method and system for preventing unauthorized computer processing
CN105207979A (en) User input feature-based user authentication method and device
US11496511B1 (en) Systems and methods for identifying and mitigating phishing attacks
CN104980279A (en) Identity authentication method, and related equipment and system
CN105279164A (en) File processing method and device based on IOS system
US11914710B2 (en) System and method for application tamper discovery
CN110222508A (en) Extort virus defense method, electronic equipment, system and medium
US10193880B1 (en) Systems and methods for registering user accounts with multi-factor authentication schemes used by online services
US9563752B2 (en) License information access based on developer profiles
US20160292685A1 (en) Authenticating a user with a passcode using a passcode entry table
US10044846B2 (en) Method for executing dual operating systems of smart phone
CN115334698B (en) Construction method, device, terminal and medium of target 5G safety network of target range
US20230094066A1 (en) Computer-implemented systems and methods for application identification and authentication

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20190808

Address after: 35th floor, Tencent Building, Hi-tech Zone, Nanshan District, Shenzhen, Guangdong Province, 518057

Co-patentee after: Tencent Cloud Computing (Beijing) Co., Ltd.

Patentee after: Tencent Technology (Shenzhen) Co., Ltd.

Address before: Room 403, East Block 2, SEG Science Park, Zhenxing Road, Futian District, Shenzhen, Guangdong Province, 518000

Patentee before: Tencent Technology (Shenzhen) Co., Ltd.

TR01 Transfer of patent right