Detailed Description of the Embodiments
To make the objects, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are intended only to explain the present invention and are not intended to limit it.
In the following description, unless otherwise indicated, specific embodiments of the invention are described with reference to steps and symbols of operations performed by one or more computers. As such, these steps and operations, several of which are described as being computer-executed, include the manipulation by a computer processing unit of electrical signals representing data in a structured form. This manipulation transforms the data or maintains it at locations in the computer's memory system, which reconfigures or otherwise alters the operation of the computer in a manner well understood by those skilled in the art. The data structures in which the data is maintained are physical locations of the memory that have particular properties defined by the format of the data. However, while the principles of the invention are described in the foregoing terms, this is not meant to be limiting, as those skilled in the art will appreciate that the various steps and operations described hereinafter may also be implemented in hardware.
As used herein, the terms "component", "module", "system", "interface", "process" and the like are generally intended to refer to a computer-related entity: hardware, a combination of hardware and software, software, or software in execution. For example, a component may be, but is not limited to, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a controller and the controller itself can be a component. One or more components may reside within a process and/or thread of execution, and a component may be localized on one computer and/or distributed between two or more computers.
Furthermore, the claimed subject matter may be implemented as a method, apparatus, or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof to control a computer to implement the disclosed subject matter. The term "article of manufacture" as used herein is intended to encompass a computer program accessible from any computer-readable device, carrier, or medium. Of course, those skilled in the art will recognize that many modifications may be made to this configuration without departing from the scope or spirit of the claimed subject matter.
Fig. 1 and the following discussion provide a brief, general description of an operating environment of a server in which the user authentication apparatus based on user input features of the present invention may be implemented. The operating environment of Fig. 1 is only one example of a suitable operating environment and is not intended to suggest any limitation as to the scope of use or functionality of the operating environment. Example servers 112 include, but are not limited to, personal computers, server computers, hand-held or laptop devices, mobile devices (such as mobile phones, personal digital assistants (PDAs), media players, and the like), multiprocessor systems, consumer electronic devices, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
Although not required, embodiments are described in the general context of "computer-readable instructions" being executed by one or more servers. Computer-readable instructions may be distributed via computer-readable media (discussed below). Computer-readable instructions may be implemented as program modules, such as functions, objects, application programming interfaces (APIs), data structures, and the like, that perform particular tasks or implement particular abstract data types. Typically, the functionality of the computer-readable instructions may be combined or distributed as desired in various environments.
Fig. 1 illustrates an example of a server 112 comprising one or more embodiments of the user authentication apparatus based on user input features of the present invention. In one configuration, server 112 includes at least one processing unit 116 and memory 118. Depending on the exact configuration and type of server, memory 118 may be volatile (such as RAM), non-volatile (such as ROM, flash memory, etc.), or some combination of the two. This configuration is illustrated in Fig. 1 by dashed line 114.
In other embodiments, server 112 may include additional features and/or functionality. For example, server 112 may also include additional storage (e.g., removable and/or non-removable) including, but not limited to, magnetic storage, optical storage, and the like. Such additional storage is illustrated in Fig. 1 by storage 120. In one embodiment, computer-readable instructions to implement one or more embodiments provided herein may reside in storage 120. Storage 120 may also store other computer-readable instructions to implement an operating system, an application program, and the like. Computer-readable instructions may be loaded into memory 118 for execution by, for example, processing unit 116.
The term "computer-readable media" as used herein includes computer storage media. Computer storage media includes volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions or other data. Memory 118 and storage 120 are examples of computer storage media. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROMs, digital versatile discs (DVDs) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by server 112. Any such computer storage media may be part of server 112.
Server 112 may also include communication connection(s) 126 that allow server 112 to communicate with other devices. Communication connection(s) 126 may include, but is not limited to, a modem, a network interface card (NIC), an integrated network interface, a radio frequency transmitter/receiver, an infrared port, a USB connection, or other interfaces for connecting server 112 to other servers. Communication connection(s) 126 may include a wired connection or a wireless connection. Communication connection(s) 126 may transmit and/or receive communication media.
The term "computer-readable media" may include communication media. Communication media typically embodies computer-readable instructions or other data in a "modulated data signal" such as a carrier wave or other transport mechanism, and includes any information delivery media. The term "modulated data signal" may include a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
Server 112 may include input device(s) 124 such as a keyboard, mouse, pen, voice input device, touch input device, infrared camera, video input device, and/or any other input device. Output device(s) 122 such as one or more displays, speakers, printers, and/or any other output device may also be included in server 112. Input device(s) 124 and output device(s) 122 may be connected to server 112 via a wired connection, a wireless connection, or any combination thereof. In one embodiment, an input device or an output device from another server may be used as input device(s) 124 or output device(s) 122 for server 112.
Components of server 112 may be connected by various interconnects, such as a bus. Such interconnects may include a Peripheral Component Interconnect (PCI), such as PCI Express, a Universal Serial Bus (USB), FireWire (IEEE 1394), an optical bus structure, and the like. In another embodiment, components of server 112 may be interconnected by a network. For example, memory 118 may be comprised of multiple physical memory units located in different physical locations and interconnected by a network.
Those skilled in the art will realize that storage devices utilized to store computer-readable instructions may be distributed across a network. For example, a computing device 130 accessible via network 128 may store computer-readable instructions to implement one or more embodiments provided by the present invention. Server 112 may access computing device 130 and download part or all of the computer-readable instructions for execution. Alternatively, server 112 may download pieces of the computer-readable instructions as needed, or some instructions may be executed at server 112 and some at computing device 130.
Various operations of embodiments are provided herein. In one embodiment, one or more of the operations described may constitute computer-readable instructions stored on one or more computer-readable media, which when executed by a server will cause the computing device to perform the operations described. The order in which some or all of the operations are described should not be construed as implying that these operations are necessarily order-dependent. Alternative orderings will be appreciated by those skilled in the art having the benefit of this description. Further, it will be understood that not all operations are necessarily present in each embodiment provided herein.
Moreover, the word "preferably" is used herein to mean serving as an example, instance, or illustration. Any aspect or design described herein as "preferable" is not necessarily to be construed as advantageous over other aspects or designs. Rather, use of the word "preferably" is intended to present concepts in a concrete fashion. As used in this application, the term "or" is intended to mean an inclusive "or" rather than an exclusive "or". That is, unless specified otherwise, or clear from context, "X employs A or B" is intended to mean any of the natural inclusive permutations. That is, if X employs A; X employs B; or X employs both A and B, then "X employs A or B" is satisfied under any of the foregoing instances.
Also, although the disclosure has been shown and described with respect to one or more implementations, equivalent alterations and modifications will occur to others skilled in the art based upon a reading and understanding of this specification and the annexed drawings. The disclosure includes all such modifications and alterations and is limited only by the scope of the appended claims. In particular regard to the various functions performed by the above described components (e.g., elements, resources, etc.), the terms used to describe such components are intended to correspond, unless otherwise indicated, to any component which performs the specified function of the described component (e.g., one that is functionally equivalent), even though not structurally equivalent to the disclosed structure which performs the function in the herein illustrated exemplary implementations of the disclosure. In addition, while a particular feature of the disclosure may have been disclosed with respect to only one of several implementations, such feature may be combined with one or more other features of the other implementations as may be desired and advantageous for any given or particular application. Furthermore, to the extent that the terms "includes", "having", "contains", or variants thereof are used in either the detailed description or the claims, such terms are intended to be inclusive in a manner similar to the term "comprising".
Referring to Fig. 2, Fig. 2 is a schematic flowchart of the implementation of the user authentication method based on user input features provided by the first embodiment of the present invention.
In step S201, original input features of the user are collected in advance, and the collected original input features are stored after being associated one-to-one with input positions of a display interface.
Here, the display interface includes at least two input positions, each input position corresponds to a different original input feature, and the original input features of the user include touch data of a human organ.
In the embodiments of the present invention, the execution subject of the user authentication method based on user input features is a client, which includes but is not limited to the following devices: a mobile terminal with a touch screen, a computer with a touch screen, a personal palmtop computer with a touch screen, and the like. It will be understood, however, that any electronic device with a touch screen shall fall within the protection scope of the present invention.
Further, a correspondence between the original input features of the user and the input positions of the client display interface is established in advance in the client. The display interface includes at least two input positions, each input position corresponds to an original input feature, and the original input features of the user include touch data of a human organ, where the touch data of a human organ is the user touch data collected by the client when the user makes an input at an input position of the display screen by touching it.
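The enrollment described above can be sketched as follows. This is a minimal illustration assuming a simple in-memory store; the names (`ORIGINAL_FEATURES`, `enroll`) and the choice to record only a touch manner per position are assumptions for the example, not the patent's actual data layout.

```python
# Each input position of the display interface is associated one-to-one
# with the user's original input feature collected in advance (step S201).
ORIGINAL_FEATURES = {}

def enroll(position, touch_manner):
    """Store the original input feature for one input position."""
    ORIGINAL_FEATURES[position] = touch_manner

# The display interface comprises at least two input positions, each
# corresponding to a different original input feature.
enroll("A", "fingertip")
enroll("B", "knuckle")
```

A real client would persist this correspondence rather than keep it in a module-level dictionary; the structure shown is only the association the text describes.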
In step S202, when user verification is performed before a subsequent operation is carried out, the current input features that the user inputs at the input positions are collected.
In step S203, the current input features that the user inputs at the different input positions are compared with the prestored original input features corresponding to those input positions.
Step S202 and step S203 may specifically be as follows:
When the client user performs a network operation, user verification is required first, that is, it is distinguished whether the network operation is a machine operation or a manual operation. If the user verification passes, the network operation of the user is responded to; in other words, only when the current network operation is determined to be a manual operation is the network operation responded to, so as to ensure the security of the user account.
In this embodiment, the client collects the current input features that the user inputs at the input positions and compares the current input features input at the different input positions with the prestored original input features corresponding to those input positions, so as to determine, according to the current input features input at the input positions and the prestored association between the original input features and the input positions, whether the network operation is a machine operation or a manual operation.
In step S204, if the collected current input features are consistent with the prestored original input features, the operation of the user is responded to.
If the current input features are consistent with the prestored original input features, the user verification may be regarded as having passed, that is, the network operation is judged to be a manual operation, and the operation of the user is accordingly responded to.
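Steps S202 to S204 can be sketched as a single comparison routine. This is a hedged illustration: the function and variable names are assumptions, and features are reduced here to a touch-manner label per position.

```python
def verify(current_features, original_features):
    """Return True (manual operation; respond to the user) only if the
    current feature at every input position is consistent with the
    prestored original feature; otherwise treat the request as a
    suspected machine operation and do not respond."""
    for position, original in original_features.items():
        if current_features.get(position) != original:
            return False
    return True
```

Under this sketch, a replaying machine that cannot reproduce the expected touch manner at each position fails the comparison and its network operation is refused.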
As can be seen from the above, in this embodiment, at least two input positions are set on the display interface in advance, and the collected original input features of the user are stored after being associated one-to-one with the input positions of the display interface, where the original input features of the user include touch data of a human organ; when user verification is performed before a subsequent operation, the current input features that the user inputs at the input positions are collected; and the current input features are compared with the original input features, and when the collected current input features are consistent with the prestored original input features, the operation of the user is responded to. By using touch data of a human organ as the input features of the user, thereby judging whether a network operation is the operation behavior of a user or of a machine, and authorizing the operation only if it is a manual operation, the present invention can effectively prevent a machine from performing operations relating to an account, reduce the risk of online transactions for the user, effectively safeguard property, and keep private information well protected.
Referring to Fig. 3, Fig. 3 is a schematic flowchart of the implementation of the user authentication method based on user input features provided by the second embodiment of the present invention.
In step S301, original input features of the user are collected in advance, and the collected original input features are stored after being associated one-to-one with input positions of a display interface.
Here, the display interface includes at least two input positions, each input position corresponds to a different original input feature, and the original input features of the user include touch data of a human organ.
In the embodiments of the present invention, the execution subject of the user authentication method based on user input features is a client, which includes but is not limited to the following devices: a mobile terminal with a touch screen, a computer with a touch screen, a personal palmtop computer with a touch screen, and the like. It will be understood, however, that any electronic device with a touch screen shall fall within the protection scope of the present invention.
Preferably, the touch data of a human organ is the user touch data collected by the client when the user makes an input at an input position of the display screen by touching it.
Further, in the embodiments of the present invention, the input features may include touch data based on a fingertip, a knuckle, or a fingernail. That is, the touch data of a human organ is specifically touch data of the user's fingertip, knuckle, or fingernail.
Further, the touch data based on a fingertip, a knuckle, or a fingernail may include any combination of one or more of: the area range of the touch on the input position, the pressure of the touch on the input position, and the sound emitted by the touch on the input position. That is, according to any combination of one or more items of the touch data, it can be determined whether the currently input feature was input with a fingertip, a knuckle, or a fingernail; no specific limitation is placed on the implementation here.
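The classification described above might be sketched as follows. All thresholds and units here are invented for illustration only; the patent deliberately leaves the implementation open, and a real client would calibrate such values per device.

```python
def classify_touch(area=None, pressure=None, sound=None):
    """Guess the input manner from whichever of the three touch data
    items (contact area, pressure, sound) is available."""
    if area is not None:
        if area >= 60:
            return "fingertip"   # soft finger pad: largest contact area
        if area >= 20:
            return "knuckle"
        return "fingernail"      # hard nail tip: smallest contact area
    if pressure is not None:
        return "knuckle" if pressure >= 0.8 else "fingertip"
    if sound is not None:
        return "fingernail" if sound == "click" else "fingertip"
    return None  # no touch data supplied
```

The ordering of the checks encodes the idea that any one item of touch data suffices, while combinations of items (shown in a later sketch) can make the judgment more accurate.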
In step S302, when user verification is performed before a subsequent operation is carried out, the user is prompted, at each different input position, to make an input in a different input manner.
It will be understood that each input position corresponds to an original input feature; the different input manners mean that the input at each input position is made according to the original input feature corresponding to that position.
For example, in the prestored correspondence between input positions and original input features, input position A corresponds to fingertip-based touch data; the client then prompts the user, at input position A of its touch-screen display interface, to perform the touch input with a fingertip.
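The prompting of step S302 can be sketched as below; the prompt wording and the `prompts_for` helper are hypothetical, since the patent does not specify how the prompt is rendered.

```python
PROMPT_TEXT = {
    "fingertip": "Please touch position {pos} with your fingertip",
    "knuckle": "Please touch position {pos} with your knuckle",
    "fingernail": "Please touch position {pos} with your fingernail",
}

def prompts_for(original_features):
    """Build one prompt per input position, each asking for the input
    manner dictated by that position's prestored original feature."""
    return [PROMPT_TEXT[manner].format(pos=pos)
            for pos, manner in sorted(original_features.items())]
```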
In step S303, the current input features that the user inputs at the input positions are collected.
In step S304, the current input features that the user inputs at the different input positions are compared with the prestored original input features corresponding to those input positions.
Step S303 and step S304 may specifically be as follows:
Preferably, the touch data based on a fingertip, a knuckle, or a fingernail that the user inputs at each input position is collected.
After the touch data is collected, the current input features that the user inputs at the different input positions are compared one by one with the prestored original input features corresponding to those input positions, so as to determine, according to the current input features input at the input positions and the prestored association between the original input features and the input positions, whether the network operation is a machine operation or a manual operation.
For example, in the prestored correspondence between input positions and original input features, input position A corresponds to fingertip-based touch data; the client then collects the current input feature that the user inputs at input position A and compares this current input feature with the preset fingertip-based touch data.
In step S305, if the collected current input features are consistent with the prestored original input features, the operation of the user is responded to.
In step S306, the comparison result is displayed on the display interface, and the user is prompted that the verification has passed.
Step S305 and step S306 may specifically be as follows:
It will be understood that, in this embodiment, the input features include touch data of a human organ such as a fingertip, a knuckle, or a fingernail, and the touch data based on a fingertip, a knuckle, or a fingernail may include any combination of one or more of the following touch data: the area range of the touch on the input position, the pressure of the touch on the input position, the sound emitted by the touch on the input position, and the like.
That is, the client can determine, according to any combination of one or more items of the touch data, whether the current input feature was produced by a fingertip, a knuckle, a fingernail, or another touch manner.
For example, in the prestored correspondence between input positions and original input features, input position A corresponds to fingertip-based touch data; the client collects the current input feature that the user inputs at input position A and compares this current input feature with the preset fingertip-based touch data:
If data on the area range of the touch on the input position is collected at input position A, it can be judged, according to the collected data and the preset correspondence between touch area ranges and touch manners, by which input manner the current input feature was input; if the current input feature is judged to be fingertip-based touch data, the current input feature collected at input position A may be regarded as consistent with the prestored original input feature.
Alternatively, if data on both the area range and the pressure of the touch on the input position is collected at input position A, it can be judged, according to the collected data in combination with the preset correspondence between touch area ranges and touch manners and the preset correspondence between touch pressures and touch manners, by which input manner the current input feature was input; likewise, if the current input feature is judged to be fingertip-based touch data, the current input feature collected at input position A may be regarded as consistent with the prestored original input feature.
It is conceivable that, on occasions requiring more accuracy, a combination of two or more of the area range of the touch on the input position, the pressure of the touch on the input position, the sound emitted by the touch on the input position, and other touch data may be employed to judge by which input manner the current input feature was input; no specific limitation is placed on this here.
The correspondence between touch area ranges and touch manners, and/or the correspondence between touch pressures and touch manners, and/or correspondences between other touch data such as touch sound types and touch manners, may be preset in the client, where the touch manners include input with a fingertip, a knuckle, a fingernail, or another human organ.
It will be understood that the current input feature that the user inputs at each different input position can be detected in the manner described above for input position A, being compared one by one with the prestored original input feature corresponding to that position. If all the collected current input features are consistent with the prestored original input features, the network operation can be judged to be a manual operation and the operation of the user is responded to; at the same time, the comparison result is displayed on the display interface (i.e., the client touch screen), and the client user is prompted that the verification has passed.
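The position-by-position check and the display of the result (steps S305 and S306) can be sketched together; the display interface is modeled here as a plain list of lines, which is an assumption made for the example.

```python
def authenticate(collected, original, display):
    """Compare each input position's current feature with the prestored
    original feature one by one; show the comparison result on the
    display interface and report whether verification passed."""
    for position, feature in sorted(original.items()):
        if collected.get(position) != feature:
            display.append(f"Position {position}: verification failed")
            return False  # suspected machine operation; do not respond
    display.append("User verification passed")
    return True
```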
As can be seen from the above, in this embodiment, at least two input positions are set on the display interface in advance, and the collected original input features of the user are stored after being associated one-to-one with the input positions of the display interface, where the original input features of the user include touch data of a human organ; when user verification is performed before a subsequent operation, the current input features that the user inputs at the input positions are collected; and the current input features are compared with the original input features, and when the collected current input features are consistent with the prestored original input features, the operation of the user is responded to. By using touch data of a human organ as the input features of the user, thereby judging whether a network operation is the operation behavior of a user or of a machine, and authorizing the operation only if it is a manual operation, the present invention can effectively prevent a machine from performing operations relating to an account, reduce the risk of online transactions for the user, effectively safeguard property, and keep private information well protected. Further, by collecting the touch data based on a fingertip, a knuckle, or a fingernail that the user inputs at each input position, a manual operation can be judged more accurately, further improving the reliability of user verification.
Referring to Fig. 4, Fig. 4 is a schematic flowchart of the implementation of the user authentication method based on user input features provided by the third embodiment of the present invention. Unlike the second embodiment above, this embodiment analyzes and explains the case where the collected current input features are inconsistent with the prestored original input features.
In step S401, original input features of the user are collected in advance, and the collected original input features are stored after being associated one-to-one with input positions of a display interface.
Here, the display interface includes at least two input positions, each input position corresponds to a different original input feature, and the original input features of the user include touch data of a human organ.
In the embodiments of the present invention, the execution subject of the user authentication method based on user input features is a client, which includes but is not limited to the following devices: a mobile terminal with a touch screen, a computer with a touch screen, a personal palmtop computer with a touch screen, and the like. It will be understood, however, that any electronic device with a touch screen shall fall within the protection scope of the present invention.
Preferably, the touch data of a human organ is the user touch data collected by the client when the user makes an input at an input position of the display screen by touching it.
Further, in the embodiments of the present invention, the input features may include touch data based on a fingertip, a knuckle, or a fingernail. That is, the touch data of a human organ is specifically touch data of the user's fingertip, knuckle, or fingernail.
Further, the touch data based on a fingertip, a knuckle, or a fingernail may include any combination of one or more of: the area range of the touch on the input position, the pressure of the touch on the input position, and the sound emitted by the touch on the input position. That is, according to any combination of one or more items of the touch data, it can be determined whether the currently input feature was input with a fingertip, a knuckle, or a fingernail; no specific limitation is placed on the implementation here.
In step S402, when user verification is performed before a subsequent operation is carried out, the user is prompted, at each different input position, to make an input in a different input manner.
It will be understood that each input position corresponds to an original input feature; the different input manners mean that the input at each input position is made according to the original input feature corresponding to that position.
For example, in the prestored correspondence between input positions and original input features, input position A corresponds to fingertip-based touch data; the client then prompts the user, at input position A of its touch-screen display interface, to perform the touch input with a fingertip.
In step S403, the current input features that the user inputs at the input positions are collected.
In step S404, the current input features that the user inputs at the different input positions are compared with the prestored original input features corresponding to those input positions.
Step S403 and step S404 may specifically be as follows:
Preferably, the touch data based on a fingertip, a knuckle, or a fingernail that the user inputs at each input position is collected.
After the touch data is collected, the current input features that the user inputs at the different input positions are compared one by one with the prestored original input features corresponding to those input positions, so as to determine, according to the current input features input at the input positions and the prestored association between the original input features and the input positions, whether the network operation is a machine operation or a manual operation.
For example, in the prestored correspondence between input positions and original input features, input position A corresponds to fingertip-based touch data; the client then collects the current input feature that the user inputs at input position A and compares this current input feature with the preset fingertip-based touch data.
In step S405a, the current input feature collected if above-mentioned is consistent with the original input feature vector prestored, then respond the operation of user.
Be understandable that, in the present embodiment, described input feature vector comprises the touch data of human organ such as finger finger, articulations digitorum manus, nail etc., and the described touch data based on finger finger, articulations digitorum manus, nail can comprise: one or more combination in any of other touch datas such as the sound that described input position sends that touch the areal extent of described input position, touch the pressure size of described input position, touch.
That is, described client can according to one or more combination in any of described touch data, determines that the current input feature taked is for inputting by finger finger or articulations digitorum manus or nail or other touch manners.
For example, in the pre-stored correspondence between input positions and original input features, input position A corresponds to finger-pad touch data. The client collects the current input feature that the user enters at input position A and compares it with the preset finger-pad touch data:
If contact-area data is collected at input position A, the client can judge, from the collected data and the preset correspondence between contact areas and touch manners, which touch manner produced the current input feature. If the current input feature is judged to be finger-pad touch data, the current input feature collected at input position A can be considered consistent with the pre-stored original input feature.
Alternatively, if both contact-area data and pressure data are collected at input position A, the client can judge the touch manner from the collected data together with the combination of the preset correspondence between contact areas and touch manners and the preset correspondence between pressures and touch manners. Likewise, if the current input feature is judged to be finger-pad touch data, the current input feature collected at input position A can be considered consistent with the pre-stored original input feature.
It is conceivable that, in scenarios demanding higher accuracy, a combination of two or more of the contact area at the input position, the pressure applied at the input position, the sound produced by the touch, and other touch data may be used to judge which touch manner produced the current input feature; no specific limitation is imposed here.
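The classification described above can be sketched as follows. This is only an illustrative sketch: the threshold values, the function name, and the majority-vote rule are assumptions made for this example, not values taken from the specification; a real client would calibrate such thresholds per device.

```python
def classify_touch_manner(area_mm2=None, pressure=None, sound_freq_hz=None):
    """Judge the touch manner from whichever touch-data dimensions were
    collected, by voting across the available dimensions."""
    votes = []
    if area_mm2 is not None:
        # A larger contact area suggests a finger pad; the smallest, a nail edge.
        if area_mm2 > 60:
            votes.append("finger_pad")
        elif area_mm2 > 25:
            votes.append("knuckle")
        else:
            votes.append("fingernail")
    if pressure is not None:
        # A knuckle tap typically registers with higher pressure (illustrative).
        votes.append("knuckle" if pressure > 0.7 else "finger_pad")
    if sound_freq_hz is not None:
        # A fingernail tap produces a sharper, higher-frequency sound.
        votes.append("fingernail" if sound_freq_hz > 2000 else "finger_pad")
    if not votes:
        return "unknown"
    # Majority vote across the dimensions that were actually collected.
    return max(set(votes), key=votes.count)
```

When only one dimension is collected, the sketch degenerates to the single-correspondence lookup of the first case; when two or more are collected, it realizes the combined judgment of the stricter case.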
The correspondence between contact areas and touch manners, and/or the correspondence between pressures and touch manners, and/or the correspondence between touch-sound types and touch manners, among other correspondences between touch data and touch manners, can be preset in the client, where the touch manners include input by the finger pad, the knuckle, the fingernail, or other body parts.
It should be understood that the current input feature entered at each of the different input positions can be checked in the same way as described above for input position A, by comparing it one by one with the pre-stored original input feature of the corresponding position. If all the collected current input features are consistent with the pre-stored original input features, the network operation can be judged to be a human operation and the user's operation is responded to; at the same time, the comparison result is shown on the display interface (i.e. the client touch screen), prompting the user that verification has passed.
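The position-by-position check above can be sketched as a simple loop. The feature representation (strings such as `"finger_pad"`) and the example table are assumptions made for this sketch; verification succeeds only if every position matches its pre-stored original.

```python
# Pre-stored correspondence between input positions and original input
# features (illustrative example data).
PRESTORED = {"A": "finger_pad", "B": "knuckle", "C": "fingernail"}

def verify(current_features, prestored=PRESTORED):
    """Return True (human operation, respond to the user) only if the
    current input feature at every position matches the pre-stored one."""
    for position, original in prestored.items():
        if current_features.get(position) != original:
            return False  # at least one mismatch: verification fails
    return True  # all positions consistent: judged a human operation
```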
In step S405b, if a collected current input feature is inconsistent with the pre-stored original input feature, the user is prompted to re-enter the input.
After the user is prompted to re-enter, the flow may return to step S402 and continue.
It should be understood that, in one implementable manner, while the current input features entered at the different input positions are being collected, each current input feature is compared as soon as it is received; once a current input feature is found inconsistent with the pre-stored original input feature, the user is prompted to re-enter.
That is, while the current input features are being collected position by position, as soon as any collected current input feature is inconsistent with the pre-stored original input feature, collection of the user's current input features stops and the user is prompted to re-enter. The re-entry can take two forms: first, prompting the user to re-enter only at the input positions whose comparison failed; second, prompting the user to re-enter at all the input positions one by one.
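The first implementable manner can be sketched as follows: each feature is compared as soon as it arrives, and collection stops at the first mismatch. The data shapes and the returned failed-position value are assumptions made for this sketch.

```python
def collect_with_early_stop(inputs, prestored):
    """inputs: iterable of (position, feature) pairs in entry order.
    Compare each feature as it is received; stop at the first mismatch.
    Returns (True, None) on success, or (False, failed_position) so the
    caller can prompt re-entry at that position (or at all positions)."""
    for position, feature in inputs:
        if prestored.get(position) != feature:
            # Stop collecting immediately and prompt the user to re-enter.
            return False, position
    return True, None
```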
In another implementable manner, after the current input features entered at all input positions have been collected, all of them are compared one by one in a single pass; once any current input feature is inconsistent with the original input feature of its corresponding position, the user is prompted to re-enter.
That is, after input has been completed at all positions, if any collected current input feature is inconsistent with the pre-stored original input feature, the user is prompted to re-enter at all the input positions one by one.
It should be understood that, in scenarios demanding higher accuracy, if a collected current input feature is found inconsistent with the pre-stored original input feature, the user's operation may simply not be responded to; that is, the network operation is refused and the current display interface is exited, and the next user verification for subsequent network operations can only be performed after a preset period of time. Alternatively, the number of re-entries the user is allowed may be limited, for example to two; if the limit is exceeded, the current display interface is likewise exited, and the next user verification for subsequent network operations can only be performed after a preset period of time. These are merely illustrations and do not limit the invention.
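The stricter variant above, with a re-entry limit and a lockout period, can be sketched as a small session object. The class name, the 300-second lockout, and the string states are all illustrative assumptions; only the two-retry limit comes from the example in the text.

```python
import time

class VerificationSession:
    MAX_RETRIES = 2          # per the example: at most two re-entries
    LOCKOUT_SECONDS = 300    # illustrative "preset time period"

    def __init__(self):
        self.failures = 0
        self.locked_until = 0.0

    def attempt(self, passed, now=None):
        """Record one verification attempt and return the session state."""
        now = time.time() if now is None else now
        if now < self.locked_until:
            return "locked"      # must wait before the next verification
        if passed:
            self.failures = 0
            return "verified"    # respond to the user's operation
        self.failures += 1
        if self.failures > self.MAX_RETRIES:
            self.locked_until = now + self.LOCKOUT_SECONDS
            return "locked"      # exit the current display interface
        return "retry"           # prompt the user to re-enter
```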
From the above, this embodiment sets at least two input positions on the display interface in advance, and stores the collected original input features of the user after associating them one by one with the input positions of the display interface, where the original input features of the user include touch data of human body parts. When user verification is performed for a subsequent operation, the current input features that the user enters at the input positions are collected and compared with the original input features, and the user's operation is responded to when the collected current input features are consistent with the pre-stored originals. By using the touch data of human body parts as the user's input features, the present invention judges whether a network operation is the operating behavior of a user or of a machine, authorizing the operation if it is a human operation and refusing it if it is a machine operation. This effectively prevents machines from performing account-related operations, reduces the user's online-transaction risk, effectively protects property, and keeps private information well protected. Further, by collecting the finger-pad, knuckle, and fingernail touch data that the user enters at each input position, human operation can be judged more accurately, further improving the reliability of user verification.
Refer to Fig. 5, which shows a specific application embodiment of the user authentication method based on user input features according to the embodiment of the present invention. As shown in the figure, the user authentication system comprises a user 11, a user interface (UI) 12, and a verification processing module 13. It should be understood that the user is introduced only to better describe the authentication method and is not part of the authentication system; the user interface is the front-end display module of the client, and the verification processing module 13 is the background processing module of the client.
In step S501, the verification processing module 13 establishes and stores the correspondence between the user's original input features and the input positions of the display interface.
The input features may include touch data based on the finger pad, the knuckle, or the fingernail; in that case, the touch data of human body parts is specifically the touch data of the user's finger pad, knuckle, or fingernail.
In step S502, the user 11 requests user verification from the user interface 12 in order to carry out a subsequent operation;
and the user interface 12 forwards the request to the verification processing module 13.
In step S503, the verification processing module 13, according to the one-to-one correspondence between the original input features and the input positions, prompts the user via the user interface 12 to input at each of the different input positions.
In step S504, the user 11 inputs at the input positions of the user interface 12;
and the user interface 12 sends the input features to the verification processing module 13.
In step S505, the verification processing module 13 collects the current input features that the user 11 enters at the input positions.
In step S506, the verification processing module 13 compares the current input features entered at the different input positions with the pre-stored original input features of the corresponding positions.
In step S507, if the collected current input features are consistent with the pre-stored original input features, the verification processing module 13 prompts the user 11, via the user interface 12, that verification has passed.
In step S508, likewise when the collected current input features are consistent with the pre-stored original input features, the verification processing module 13 responds to the operation of the user 11.
In step S509, if a collected current input feature is inconsistent with the pre-stored original input feature, the verification processing module 13 prompts the user, via the user interface 12, to re-enter the input.
For example, suppose that in the pre-stored correspondence between input positions and original input features, a certain input position corresponds to finger-pad touch data. If contact-area data is collected at that position, the client can judge, from the collected data and the preset correspondence between contact areas and touch manners, which touch manner produced the current input feature; if the current input feature is judged to be finger-pad touch data, the current input feature collected at that position can be considered consistent with the pre-stored original input feature.
If, after the current input features entered at each of the different input positions have been compared one by one with the pre-stored original input features of the corresponding positions, all the collected current input features are judged consistent with the pre-stored originals, the network operation can be judged to be a human operation and the user's operation is responded to, thereby preventing the user's private and financial account information from being stolen by others and strengthening the protection of the user's account.
It should be understood that steps S501 to S509 can be implemented with reference to the related description of the third embodiment above, and are not repeated here.
To better implement the user authentication method based on user input features provided by the embodiment of the present invention, the embodiment of the present invention also provides a user authentication apparatus based on user input features. The terms used have the same meanings as in the user authentication method described above, and implementation details can be found in the method embodiments. Refer to Fig. 6, which is a schematic structural diagram of the user authentication apparatus based on user input features provided by the embodiment of the present invention; the apparatus comprises a presetting module 61, an acquisition module 62, a comparison module 63, and a response module 64.
The presetting module 61 collects the user's original input features in advance and stores the collected original input features after associating them one by one with the input positions of the display interface, where the display interface comprises at least two input positions, each input position corresponds to a different original input feature, and the user's original input features include touch data of human body parts. The acquisition module 62 collects, when user verification is performed for a subsequent operation, the current input features that the user enters at the input positions.
In the embodiment of the present invention, the user authentication apparatus based on user input features can be a software unit built into a client, a hardware unit, or a unit combining software and hardware. The client includes, but is not limited to, the following devices: a mobile terminal with a touch screen, a computer with a touch screen, a personal handheld computer with a touch screen, and the like; it should be understood, however, that any electronic device with a touch screen falls within the protection scope of the present invention.
The comparison module 63 compares the current input features that the user enters at the different input positions with the pre-stored original input features of the corresponding positions. The response module 64 responds to the user's operation if the collected current input features are consistent with the pre-stored original input features.
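The four-module apparatus of Fig. 6 can be sketched as plain classes wired together by a client object. The class and method names are assumptions made for this sketch; the specification describes the modules only functionally.

```python
class PresettingModule:           # module 61: pre-store original features
    def __init__(self):
        self.prestored = {}
    def store(self, position, original_feature):
        self.prestored[position] = original_feature

class AcquisitionModule:          # module 62: collect current features
    def collect(self, raw_inputs):
        return dict(raw_inputs)   # position -> current input feature

class ComparisonModule:           # module 63: compare current vs. original
    def compare(self, current, prestored):
        return all(current.get(p) == f for p, f in prestored.items())

class ResponseModule:             # module 64: respond if consistent
    def respond(self, consistent):
        return "operation responded" if consistent else "operation refused"

class Client:
    def __init__(self):
        self.presetting = PresettingModule()
        self.acquisition = AcquisitionModule()
        self.comparison = ComparisonModule()
        self.response = ResponseModule()
    def handle(self, raw_inputs):
        current = self.acquisition.collect(raw_inputs)
        ok = self.comparison.compare(current, self.presetting.prestored)
        return self.response.respond(ok)
```

In a combined software/hardware implementation, each class would correspond to a functional unit rather than a Python object, but the data flow between the four modules is the same.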
When a client user performs a network operation, user verification is required first, i.e. distinguishing whether the network operation is a machine operation or a human operation. If user verification passes, the user's network operation is responded to; that is, once the current network operation is determined to be a human operation, it is responded to, so as to ensure the safety of the user's account.
In this embodiment, the client collects the current input features that the user enters at the input positions, compares the current input features entered at the different input positions with the pre-stored original input features of the corresponding positions, and determines, from the current input features entered at the input positions and the pre-stored association between original input features and input positions, whether the network operation is a machine operation or a human operation.
If the current input features are consistent with the pre-stored original input features, user verification can be considered passed, i.e. the network operation is judged to be a human operation, and the user's operation is responded to.
From the above, this embodiment sets at least two input positions on the display interface in advance, and stores the collected original input features of the user after associating them one by one with the input positions of the display interface, where the original input features of the user include touch data of human body parts. When user verification is performed for a subsequent operation, the current input features that the user enters at the input positions are collected and compared with the original input features, and the user's operation is responded to when the collected current input features are consistent with the pre-stored originals. By using the touch data of human body parts as the user's input features, the present invention judges whether a network operation is the operating behavior of a user or of a machine, authorizing the operation if it is a human operation. This effectively prevents machines from performing account-related operations, reduces the user's online-transaction risk, effectively protects property, and keeps private information well protected.
Refer to Fig. 7, which is a schematic structural diagram of the user authentication apparatus based on user input features provided by the embodiment of the present invention; the apparatus comprises a presetting module 71, an acquisition module 72, a comparison module 73, a response module 74, a first prompting module 75, a second prompting module 76, and a third prompting module 77. In this embodiment, preferably, the input features can include touch data based on the finger pad, the knuckle, or the fingernail.
The presetting module 71 collects in advance the user's original touch data based on the finger pad, knuckle, or fingernail, and stores the collected original input features after associating them one by one with the input positions of the display interface. The display interface comprises at least two input positions, each input position corresponds to a different original input feature, and the user's original input features include touch data of human body parts.
That is, the touch data of human body parts is the touch data that the client collects when the user inputs by touch at the input positions of the display screen.
Further, the touch data based on the finger pad, knuckle, or fingernail may include any one or combination of: the contact area at the input position, the pressure applied at the input position, and the sound produced by the touch at the input position. That is, according to any one or combination of the touch data, it can be determined whether the current input feature was entered with the finger pad, the knuckle, or the fingernail; no specific limitation on the implementation is imposed here.
The first prompting module 75 prompts the user, at each of the different input positions, to input by a different input manner, where the different input manners are determined by the original input feature corresponding to each input position.
It should be understood that each input position corresponds to one original input feature. For example, if in the pre-stored correspondence between input positions and original input features, input position A corresponds to finger-pad touch data, the client prompts the user, at input position A of its touch-screen display interface, to perform the touch input with the finger pad.
The acquisition module 72 collects, when user verification is performed for a subsequent operation, the current finger-pad, knuckle, or fingernail touch data that the user enters at the input positions. The comparison module 73 compares the current input features entered at the different input positions with the pre-stored original input features of the corresponding positions.
Preferably, the finger-pad, knuckle, and fingernail touch data that the user enters at each input position is collected. After the touch data has been collected, the current input features entered at the different input positions are compared one by one with the pre-stored original input features of those positions, and from the current input features entered at the input positions and the pre-stored association between original input features and input positions, it is determined whether the network operation is a machine operation or a human operation.
The response module 74 responds to the user's operation if the collected current input features are consistent with the pre-stored original input features.
For example, in the pre-stored correspondence between input positions and original input features, input position A corresponds to finger-pad touch data. The client collects the current input feature that the user enters at input position A and compares it with the preset finger-pad touch data:
If contact-area data is collected at input position A, the client can judge, from the collected data and the preset correspondence between contact areas and touch manners, which touch manner produced the current input feature. If the current input feature is judged to be finger-pad touch data, the current input feature collected at input position A can be considered consistent with the pre-stored original input feature.
Alternatively, if both contact-area data and pressure data are collected at input position A, the client can judge the touch manner from the collected data together with the combination of the preset correspondence between contact areas and touch manners and the preset correspondence between pressures and touch manners. Likewise, if the current input feature is judged to be finger-pad touch data, the current input feature collected at input position A can be considered consistent with the pre-stored original input feature.
It is conceivable that, in scenarios demanding higher accuracy, a combination of two or more of the contact area at the input position, the pressure applied at the input position, the sound produced by the touch, and other touch data may be used to judge which touch manner produced the current input feature; no specific limitation is imposed here.
The correspondence between contact areas and touch manners, and/or the correspondence between pressures and touch manners, and/or the correspondence between touch-sound types and touch manners, among other correspondences between touch data and touch manners, can be preset in the client, where the touch manners include input by the finger pad, the knuckle, the fingernail, or other body parts.
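One of the preset correspondence tables mentioned above might be sketched as a simple range table. The threshold ranges here are illustrative assumptions; a real client would calibrate them per device and store one such table per touch-data dimension.

```python
# Preset correspondence between contact area and touch manner
# (illustrative ranges in square millimetres).
AREA_TO_MANNER = [            # (min_mm2, max_mm2, manner)
    (40, 120, "finger_pad"),
    (15, 40, "knuckle"),
    (1, 15, "fingernail"),
]

def manner_from_area(area_mm2, table=AREA_TO_MANNER):
    """Look up the touch manner whose preset range contains the area."""
    for lo, hi, manner in table:
        if lo <= area_mm2 < hi:
            return manner
    return "other"            # e.g. a stylus or some other body part
```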
It should be understood that the current input feature entered at each of the different input positions can be checked in the same way as described above for input position A, by comparing it one by one with the pre-stored original input feature of the corresponding position. If all the collected current input features are consistent with the pre-stored original input features, the network operation can be judged to be a human operation and the user's operation is responded to; at the same time, the comparison result is shown on the display interface (i.e. the client touch screen), prompting the user that verification has passed.
Further preferably, if a collected current input feature is inconsistent with the pre-stored original input feature, the user is prompted to re-enter the input.
It should be understood that, in one implementable manner, the second prompting module 76, while the current input features entered at the different input positions are being collected, compares each current input feature as soon as it is received; once a current input feature is found inconsistent with the pre-stored original input feature, it prompts the user to re-enter.
That is, while the current input features are being collected position by position, as soon as any collected current input feature is inconsistent with the pre-stored original input feature, collection of the user's current input features stops and the user is prompted to re-enter. The re-entry can take two forms: first, prompting the user to re-enter only at the input positions whose comparison failed; second, prompting the user to re-enter at all the input positions one by one.
In another implementable manner, the third prompting module 77, after the current input features entered at all input positions have been collected, compares all of them one by one in a single pass; once any current input feature is inconsistent with the original input feature of its corresponding position, it prompts the user to re-enter.
That is, after input has been completed at all positions, if any collected current input feature is inconsistent with the pre-stored original input feature, the user is prompted to re-enter at all the input positions one by one.
It should be understood that, in scenarios demanding higher accuracy, if a collected current input feature is found inconsistent with the pre-stored original input feature, the user's operation may simply not be responded to; that is, the network operation is refused and the current display interface is exited, and the next user verification for subsequent network operations can only be performed after a preset period of time. Alternatively, the number of re-entries the user is allowed may be limited, for example to two; if the limit is exceeded, the current display interface is likewise exited, and the next user verification for subsequent network operations can only be performed after a preset period of time. These are merely illustrations and do not limit the invention.
From the above, this embodiment sets at least two input positions on the display interface in advance, and stores the collected original input features of the user after associating them one by one with the input positions of the display interface, where the original input features of the user include touch data of human body parts. When user verification is performed for a subsequent operation, the current input features that the user enters at the input positions are collected and compared with the original input features, and the user's operation is responded to when the collected current input features are consistent with the pre-stored originals. By using the touch data of human body parts as the user's input features, the present invention judges whether a network operation is the operating behavior of a user or of a machine, authorizing the operation if it is a human operation and refusing it if it is a machine operation. This effectively prevents machines from performing account-related operations, reduces the user's online-transaction risk, effectively protects property, and keeps private information well protected. Further, by collecting the finger-pad, knuckle, and fingernail touch data that the user enters at each input position, human operation can be judged more accurately, further improving the reliability of user verification.
It should be understood that the parts not described in detail in the embodiment of the present invention can be implemented with reference to the related descriptions of the above related embodiments, and are not repeated here.
The user authentication apparatus based on user input features provided by the embodiment of the present invention may be, for example, a computer, a tablet computer, a mobile phone with touch function, or the like. The apparatus and the user authentication method based on user input features in the foregoing embodiments belong to the same concept; any of the methods provided in the method embodiments can be run on the apparatus, and the specific implementation process is described in the method embodiments and is not repeated here.
It should be noted that, for the user authentication method based on user input features of the present invention, those of ordinary skill in the art can understand that all or part of the flow of the method described in the embodiment of the present invention can be completed by a computer program controlling related hardware. The computer program can be stored in a computer-readable storage medium, for example in the memory of a terminal, and executed by at least one processor in the terminal; during execution, the flow of the embodiment of the method can be included. The storage medium can be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.
For the user authentication apparatus based on user input features described in the embodiment of the present invention, its functional modules can be integrated into one processing chip, or each module can exist physically alone, or two or more modules can be integrated into one module. The integrated module can be implemented in the form of hardware or in the form of a software functional module. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it can also be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk, or an optical disc.
In summary, although the present invention is disclosed above with preferred embodiments, the preferred embodiments are not intended to limit the present invention. Those of ordinary skill in the art can make various changes and modifications without departing from the spirit and scope of the present invention; therefore, the protection scope of the present invention is subject to the scope defined by the claims.