WO2019184011A1 - Method for managing terminal device and terminal device - Google Patents

Method for managing terminal device and terminal device

Info

Publication number
WO2019184011A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
user
terminal device
determination model
touch
Prior art date
Application number
PCT/CN2018/083057
Other languages
English (en)
French (fr)
Inventor
李腾
李向东
胡峥
柏亚欣
于雪松
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority to US16/977,041 (US11468153B2)
Priority to CN201880088819.4A (CN111684762B)
Publication of WO2019184011A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 - Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 - User authentication
    • G06F21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 - Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 - User authentication
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/10 - Protecting distributed programs or content, e.g. vending or licensing of copyrighted material; Digital rights management [DRM]
    • G06F21/12 - Protecting executable software
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/70 - Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer
    • G06F21/71 - Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer to assure secure computing or processing of information
    • G06F21/73 - Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer to assure secure computing or processing of information by creating or determining hardware identification, e.g. serial numbers
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00 - Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/32 - Cryptographic mechanisms or cryptographic arrangements including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials

Definitions

  • the present application relates to the field of human-computer interaction, and more particularly, to a method and terminal device for managing a terminal device.
  • a user of a terminal device can be verified by using, for example, a password (for example, a digital password or a gesture password) before starting an application or entering an operation interface, and the application is allowed to start, or access to the operation interface is allowed, only after the verification is passed.
  • the present application provides a method and apparatus for managing a terminal device, which can improve the security of the terminal device.
  • the first aspect provides a method for managing a terminal device, including: acquiring operation information corresponding to a first operation, where the operation information includes touch information and/or posture information of the terminal device; and managing the terminal device according to the degree of matching between the operation information corresponding to the first operation and a first determination model, wherein the first determination model is determined based on operation information of operations performed by a first user.
  • since a user's operations are habitual, the same user (i.e., the first user) may generate a large number of similar operations in the process of operating the terminal device; by training on the plurality of operations of the first user to obtain a determination model, it is possible to determine, based on the determination model, whether an operation is performed by the first user, thereby improving the use security of the terminal device.
  • the touch information includes at least one of the following: information on the strength of the touch operation, information on the position of the touch operation, information on the contact area of the touch operation, information on the contact time of the touch operation, information on the sliding angle of the touch operation, information on the sliding direction of the touch operation, and information on the sliding distance of the touch operation.
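  • The touch information above can be viewed as a fixed-length feature vector describing one operation. The following Python sketch is illustrative only and is not taken from the patent; every field name and value is an assumption chosen for readability.

```python
from dataclasses import dataclass, astuple

@dataclass
class TouchFeatures:
    """Hypothetical per-operation feature vector (field names are illustrative)."""
    strength: float         # pressing force reported by a pressure sensor
    pos_x: float            # relative x position of the touch in the screen
    pos_y: float            # relative y position of the touch in the screen
    contact_area: float     # finger-screen contact area
    contact_time: float     # seconds from touch-down to touch-up
    slide_angle: float      # sliding angle of a slide operation (degrees)
    slide_direction: float  # sliding direction, encoded e.g. as a signed angle
    slide_distance: float   # total sliding distance (pixels); 0 for a tap

    def as_vector(self) -> list:
        return list(astuple(self))

# One tap and one swipe encoded as vectors suitable for clustering or matching
tap = TouchFeatures(0.35, 0.52, 0.80, 1.2, 0.11, 0.0, 0.0, 0.0)
swipe = TouchFeatures(0.28, 0.50, 0.30, 1.0, 0.42, 75.0, 1.0, 320.0)
print(tap.as_vector())
```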
  • the first user includes the owner of the terminal device.
  • the managing the terminal device according to the degree of matching between the operation information corresponding to the first operation and the first determination model comprises: when the degree of matching between the operation information corresponding to the first operation and the first determination model is higher than a preset first threshold, performing the processing corresponding to the first operation.
  • for example, if the first operation is an operation on a picture (for example, an operation of deleting the picture), the picture can be processed based on the operation (for example, the picture is deleted).
  • the managing the terminal device according to the degree of matching between the operation information corresponding to the first operation and the first determination model comprises: when the degree of matching between the operation information corresponding to the first operation and the first determination model is higher than the preset first threshold, unlocking a first application.
  • for example, if the first operation is a pattern unlocking operation, unlocking succeeds only when the pattern is also drawn correctly.
  • the managing the terminal device according to the degree of matching between the operation information corresponding to the first operation and the first determination model comprises: when the degree of matching between the operation information corresponding to the first operation and the first determination model is lower than a preset first threshold, prohibiting the processing corresponding to the first operation.
  • for example, if the first operation is an operation on a picture (for example, an operation of deleting the picture), and the degree of matching between the operation information corresponding to the first operation and the first determination model is lower than the preset first threshold, processing the picture based on the operation can be prohibited (for example, deletion of the picture is prohibited).
  • the managing the terminal device according to the degree of matching between the operation information corresponding to the first operation and the first determination model comprises: when the degree of matching between the operation information corresponding to the first operation and the first determination model is lower than the preset first threshold, switching the interface currently displayed by the terminal device to the lock screen interface.
  • the managing the terminal device comprises: when the degree of matching between the operation information corresponding to the first operation and the first determination model is lower than the preset first threshold, playing a preset alarm signal.
  • the managing the terminal device according to the degree of matching between the operation information corresponding to the first operation and the first determination model comprises: when the degree of matching between the operation information corresponding to the first operation and the first determination model is lower than the preset first threshold, locking the first application.
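  • The bullets above all branch on whether the degree of matching between the first operation and the first determination model exceeds the preset first threshold. The sketch below is a hedged, illustrative rendering of that dispatch in Python; the score range, threshold value, and action names are assumptions, not the patent's API.

```python
def manage_device(match_degree: float, first_threshold: float = 0.8) -> str:
    """Illustrative threshold-based management described in the aspects above.

    match_degree: degree of matching between the operation information of the
    first operation and the first determination model (assumed to lie in [0, 1]).
    """
    if match_degree >= first_threshold:
        # Treated as the first user (e.g. the owner): perform the processing
        # corresponding to the operation, or unlock the first application.
        return "perform_operation"
    # Otherwise: prohibit the processing, lock the first application,
    # switch to the lock screen interface, or play a preset alarm signal.
    return "switch_to_lock_screen"

print(manage_device(0.92))  # -> perform_operation
print(manage_device(0.41))  # -> switch_to_lock_screen
```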
  • the first operation may be an operation for the first application, for example, an operation on an interface of the first application.
  • the first operation may be an operation performed before the first application is started, or an operation performed while the first application is running in the background.
  • when the first operation is an operation detected before the second operation for starting the first application is detected, the unlocking the first application includes: when the second operation is detected, starting the first application without displaying an unlock interface.
  • when the first operation is an operation detected before the second operation for starting the first application is detected, the locking the first application comprises: displaying the unlock interface when the second operation is detected.
  • alternatively, when the first operation is an operation detected before a second operation for starting the first application is detected, starting the first application may be prohibited.
  • the first operation is an operation for unlocking the first application.
  • the first application includes at least one of the following applications: an application operated by the first operation, an application preset by the owner of the terminal device, and an application preset by a manufacturer of the terminal device.
  • the managing the terminal device according to the degree of matching between the operation information corresponding to the first operation and the first determination model comprises: when the degree of matching between the operation information corresponding to the first operation and the first determination model is lower than the preset first threshold, determining a second determination model from a plurality of determination models according to the operation information corresponding to the first operation, wherein the degree of matching between the operation information corresponding to the first operation and the second determination model is higher than a preset second threshold, or the second determination model is the determination model having the largest degree of matching with the operation information corresponding to the first operation, wherein the plurality of determination models are in one-to-one correspondence with a plurality of users, and each determination model is determined based on the operation information of operations performed by the corresponding user; and managing the terminal device according to the user authority of the user corresponding to the second determination model.
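  • The selection of the second determination model described above can be pictured as scoring the operation against every per-user model and keeping the best match, then applying that user's permissions. A minimal, hypothetical sketch (the scoring interface and permission table are assumptions):

```python
from typing import Callable, Dict, Optional, Tuple

def select_second_model(op_features,
                        models: Dict[str, Callable],
                        second_threshold: float = 0.6) -> Optional[Tuple[str, float]]:
    """Pick the per-user determination model that best matches the operation.

    `models` maps a user id to a callable returning a match degree in [0, 1].
    Returns (user_id, degree) when the best degree reaches the second threshold,
    otherwise None (no known user matches well enough).
    """
    best_user, best_degree = None, -1.0
    for user_id, score in models.items():
        degree = score(op_features)
        if degree > best_degree:
            best_user, best_degree = user_id, degree
    return (best_user, best_degree) if best_degree >= second_threshold else None

# Hypothetical per-user authority applied once the second model identifies the user
permissions = {"owner": {"delete_photos", "pay", "install_apps"}, "guest": {"browse"}}
```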
  • the method further includes: determining a plurality of pieces of training information according to user operations detected in a first time period, where the user operations include operations performed by the plurality of users, and the training information includes touch operation information of a user operation and/or posture information of the terminal device under the user operation; clustering the plurality of pieces of training information to determine at least one training information set; determining, according to information of a second user among the plurality of users, a first training information set corresponding to the second user from the at least one training information set; and determining a determination model for the second user according to the training information in the first training information set.
  • the performing clustering processing on the plurality of pieces of training information includes: performing clustering processing on the plurality of pieces of training information based on a preset third threshold, wherein the density of the training information in each training information set is greater than or equal to the third threshold; and the determining, according to the information of the second user among the plurality of users, the first training information set corresponding to the second user from the plurality of training information sets includes: when the information of the second user indicates that the second user is the owner of the terminal device, determining the training information set having the highest density of training information among the plurality of training information sets as the first training information set.
  • the performing clustering processing on the plurality of pieces of training information includes: performing clustering processing on the plurality of pieces of training information according to the ordering points to identify the clustering structure (OPTICS) algorithm.
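  • One plausible realization of this clustering step, assuming per-operation feature vectors like the sketch earlier and using the OPTICS implementation in scikit-learn; the parameter values and the heuristic of treating the most populous (densest) cluster as the owner's training information set follow the description above but are otherwise illustrative.

```python
import numpy as np
from sklearn.cluster import OPTICS

def cluster_training_info(X: np.ndarray, min_samples: int = 10) -> dict:
    """Cluster per-operation feature vectors (rows of X) with OPTICS.

    Returns {cluster_label: row indices}, ignoring noise points (label -1).
    min_samples plays the role of the preset density threshold: every returned
    training information set contains at least that many operations.
    """
    labels = OPTICS(min_samples=min_samples).fit(X).labels_
    return {label: np.flatnonzero(labels == label) for label in set(labels) - {-1}}

def pick_owner_cluster(clusters: dict):
    """Heuristic from the description: the owner operates the device the most,
    so the largest (most densely populated) training information set is taken
    as the owner's first training information set."""
    return max(clusters, key=lambda label: len(clusters[label]))
```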
  • the terminal device has at least two operation modes, wherein the terminal device needs to identify whether the user is the owner in the first operation mode, and does not need to identify whether the user is the owner in the second operation mode; and before the operation information corresponding to the first operation is matched against the first determination model, the method further comprises: determining that the current operation mode of the terminal device is the first operation mode.
  • the second aspect provides a method for managing a terminal device, including: displaying a first interface; receiving a first operation of the user, and acquiring operation information corresponding to the first operation, where the operation information includes touch information and/or posture information of the terminal device; in response to the operation information corresponding to the first operation being of a first type, displaying a second interface, the second interface being different from the first interface; and in response to the operation information corresponding to the first operation being of a second type, the second type being different from the first type, displaying a third interface.
  • the third interface includes a user verification interface, and the second interface is an interface after verification.
  • the “operation information corresponding to the first operation is of the first type” may mean that the degree of matching between the operation information corresponding to the first operation and a first determination model is higher than a preset first threshold.
  • the “operation information corresponding to the first operation is of the second type” may mean that the degree of matching between the operation information corresponding to the first operation and the first determination model is lower than the preset first threshold.
  • the first determination model may be determined based on operation information of operations performed by the owner of the terminal device.
  • the touch information includes at least one of the following: information on the strength of the touch operation, information on the position of the touch operation, information on the contact area of the touch operation, information on the contact time of the touch operation, information on the sliding angle of the touch operation, information on the sliding direction of the touch operation, and information on the sliding distance of the touch operation.
  • the second interface is an interface of an application.
  • the application may be an application to which the first interface belongs.
  • the application can be an application operated by the first operation.
  • the method further includes: receiving a second operation of the user before displaying the second interface; in response to the operation information corresponding to the second operation and the first operation being of the first type, displaying the second interface; and in response to the operation information corresponding to the second operation and the first operation being of the second type, displaying the third interface.
  • the operation information corresponding to the second operation and the first operation being of the first type may mean that the degree of matching between the operation information corresponding to the second operation and the first determination model is higher than the preset first threshold, and the degree of matching between the operation information corresponding to the first operation and the first determination model is higher than the preset first threshold.
  • the operation information corresponding to the second operation and the first operation being of the second type may mean that the degree of matching between the operation information corresponding to the second operation and the first determination model is lower than the preset first threshold, and/or the degree of matching between the operation information corresponding to the first operation and the first determination model is lower than the preset first threshold.
  • the method further includes: when the third interface is displayed, receiving a third operation of the user and acquiring operation information corresponding to the third operation; and in response to the operation information of the third operation being of a third type, displaying the second interface.
  • the “operation information of the third operation is of the third type” may mean that the degree of matching between the operation information corresponding to the third operation and the first determination model is higher than the preset first threshold, and unlocking information corresponding to the third operation satisfies the unlocking condition.
  • a third aspect provides a method for user identification, including: determining a plurality of pieces of training information according to user operations detected in a first time period, the user operations including operations performed by a plurality of users, and the training information including touch operation information of a user operation and/or posture information of the terminal device under the user operation; clustering the plurality of pieces of training information to determine at least one training information set; determining, according to information of a first user among the plurality of users, a first training information set corresponding to the first user from the at least one training information set; determining a determination model for the first user according to the training information in the first training information set; and when a first operation is detected in a second time period, determining, according to the determination model, whether the user performing the first operation is the first user.
  • since a user's operations are habitual, the same user may generate a large number of similar operations in the process of operating the terminal device; by clustering the plurality of pieces of training information determined according to the user operations detected by the terminal device in the first time period, the training information in the same training information set after clustering can be attributed to the same user, and therefore the determination model generated based on the training information in a training information set can effectively identify that user.
  • in addition, no dedicated biometric identification device is required to implement user identification, so the cost of the terminal device can be reduced.
  • moreover, the user operations used to produce the training information do not need to be deliberately performed by the user; that is, no additional operation burden is imposed on the user in order to realize user identification, so the user experience can be improved and the practicality of the user identification of the present application can be improved.
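  • The patent does not fix a concrete model family for the determination model. A one-class classifier trained only on the first user's operations is one plausible way to realize "determining whether the user performing the first operation is the first user"; the sketch below uses scikit-learn's OneClassSVM purely as an illustrative stand-in, and all parameter values are assumptions.

```python
import numpy as np
from sklearn.svm import OneClassSVM

def train_determination_model(first_user_ops: np.ndarray) -> OneClassSVM:
    """Fit a one-class model on the first user's training information
    (rows are per-operation feature vectors from that user's cluster)."""
    return OneClassSVM(kernel="rbf", gamma="scale", nu=0.1).fit(first_user_ops)

def is_first_user(model: OneClassSVM, op_features: np.ndarray) -> bool:
    """Judge a newly detected first operation: predict() returns +1 when the
    operation resembles the first user's habitual operations, -1 otherwise."""
    return model.predict(op_features.reshape(1, -1))[0] == 1
```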
  • the performing clustering processing on the plurality of pieces of training information includes: performing clustering processing on the plurality of pieces of training information based on a preset first threshold, wherein the density of the training information in each training information set is greater than or equal to the first threshold; and the determining, according to the information of the first user among the plurality of users, the first training information set corresponding to the first user from the plurality of training information sets includes: when the information of the first user indicates that the first user is the owner of the terminal device, determining the training information set having the highest density of training information among the plurality of training information sets as the first training information set.
  • since the owner performs the largest share of operations on the terminal device, the training information set with the highest density can be determined as the training information set used to train the determination model that identifies the owner, so that identification of the owner can be easily realized.
  • the plurality of pieces of training information may be clustered according to the ordering points to identify the clustering structure (OPTICS) algorithm.
  • the performing clustering processing on the plurality of pieces of training information to determine a plurality of training information sets includes: performing clustering processing on the plurality of pieces of training information to determine a plurality of training information sets and feature information corresponding to each training information set, wherein the plurality of training information sets are in one-to-one correspondence with the plurality of users; and the determining, according to the information of the first user among the plurality of users, the first training information set corresponding to the first user from the plurality of training information sets includes: when the information of the first user is feature information of the first user, determining, as the first training information set, the training information set whose feature information and the feature information of the first user satisfy a second preset condition.
  • since the feature information corresponding to each training information set can be determined by the clustering process, determination models for the plurality of users can be determined in the case where the feature information of the plurality of users can be acquired, and thus the utility of the user identification method of the present application can be improved.
  • the performing clustering processing on the plurality of pieces of training information includes: performing clustering processing on the plurality of pieces of training information according to a K-means algorithm.
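  • Where the number of users sharing the device is known or can be estimated, the K-means alternative mentioned above might look like the following sketch (again illustrative; the cluster count and parameters are assumptions):

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_with_kmeans(X: np.ndarray, n_users: int):
    """Cluster per-operation feature vectors into one training information set
    per assumed user; returns the per-operation labels and cluster centers
    (the centers can serve as the feature information of each set)."""
    km = KMeans(n_clusters=n_users, n_init=10, random_state=0).fit(X)
    return km.labels_, km.cluster_centers_
```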
  • the determining the plurality of pieces of training information according to the user operations detected in the first time period includes: determining the plurality of pieces of training information according to user operations for a first application detected in the first time period; and the determining, according to the determination model, whether the user performing the first operation is the first user includes: when the first operation for the first application is detected in the second time period, determining, according to the determination model, whether the user performing the first operation is the first user.
  • since the determination model is trained on training information determined from operations for the first application, and the determination model is used to identify the user performing an operation for the first application, the recognition accuracy and reliability of the user identification method of the present application can be increased.
  • the determining the plurality of pieces of training information according to the user operations detected in the first time period includes: determining the plurality of pieces of training information according to user operations for a first operation interface detected in the first time period; and the determining, according to the determination model, whether the user performing the first operation is the first user includes: when the first operation for the first operation interface is detected in the second time period, determining, according to the determination model, whether the user performing the first operation is the first user.
  • since the determination model is trained on training information determined from operations for the first operation interface, and the determination model is used to identify the user performing an operation for the first operation interface, the recognition accuracy and reliability of the user identification method of the present application can be increased.
  • the determining the plurality of pieces of training information according to the user operations detected in the first time period includes: determining the plurality of pieces of training information according to user operations of a first operation type detected in the first time period; and the determining, according to the determination model, whether the user performing the first operation is the first user includes: when the first operation of the first operation type is detected in the second time period, determining, according to the determination model, whether the user performing the first operation is the first user.
  • the first operation type includes a sliding operation type, a click operation type, or a long press operation type.
  • since the determination model is trained on training information determined from operations of the first operation type, and the determination model is used to identify the user performing an operation of the first operation type, the recognition accuracy and reliability of the user identification method of the present application can be increased.
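  • Training one determination model per application, per operation interface, or per operation type, as described above, amounts to keying models by an operation context. A minimal hypothetical sketch of such a registry (the key layout and scoring interface are assumptions):

```python
from typing import Callable, Dict, Optional, Tuple

# Context key: (application, operation interface, operation type).
ContextKey = Tuple[str, str, str]
models: Dict[ContextKey, Callable] = {}

def register_model(app: str, interface: str, op_type: str, model: Callable) -> None:
    """Store the determination model trained on operations of this context."""
    models[(app, interface, op_type)] = model

def judge(app: str, interface: str, op_type: str, op_features) -> Optional[float]:
    """Score an operation with the model trained on the same context, which is
    what makes the per-context variants above more accurate and reliable."""
    model = models.get((app, interface, op_type))
    return None if model is None else model(op_features)
```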
  • the touch operation information includes at least one of the following: information on the strength of the touch operation, information on the position of the touch operation, information on the contact area of the touch operation, information on the contact time of the touch operation, information on the sliding angle of the touch operation, information on the sliding direction of the touch operation, and information on the sliding distance of the touch operation.
  • when it is determined that the user performing the first operation is not the first user, the method further includes: disabling a target application specified in the terminal device; or performing lock screen processing; or performing alarm processing.
  • the terminal device has at least two operation modes, wherein the terminal device needs to identify whether the user is the owner in the first operation mode, and does not need to identify whether the user is the owner in the second operation mode; and before determining, according to the determination model and the information of the first operation, whether the user performing the first operation is the first user, the method further comprises: determining that the current operation mode of the terminal device is the first operation mode.
  • a fourth aspect provides a method for user identification, including: receiving a plurality of pieces of training information sent by a terminal device, where the training information is determined based on user operations detected by the terminal device in a first time period, the user operations include operations performed by a plurality of users, and the training information includes touch operation information of a user operation and/or posture information of the terminal device under the user operation; clustering the plurality of pieces of training information to determine at least one training information set; determining, according to information of a first user among the plurality of users, a first training information set corresponding to the first user from the at least one training information set; determining a determination model for the first user according to the training information in the first training information set; and sending the determination model to the terminal device, so that when the terminal device detects a first operation in a second time period, the terminal device determines, according to the determination model, whether the user performing the first operation is the first user.
  • since a user's operations are habitual, the same user may generate a large number of similar operations in the process of operating the terminal device; by clustering the plurality of pieces of training information determined according to the user operations detected by the terminal device in the first time period, the training information in the same training information set after clustering can be attributed to the same user, and therefore the determination model generated based on the training information in a training information set can effectively identify that user.
  • in addition, no dedicated biometric identification device is required to implement user identification, so the cost of the terminal device can be reduced.
  • moreover, the user operations used to produce the training information do not need to be deliberately performed by the user; that is, no additional operation burden is imposed on the user in order to realize user identification, so the user experience can be improved and the practicality of the user identification of the present application can be improved.
  • since the process of determining the determination model is executed by the server, the requirement on the processing performance of the terminal device can be reduced and the processing load of the terminal device can be alleviated, whereby the practicality of the user identification method of the present application can be further improved.
  • the performing clustering processing on the plurality of pieces of training information includes: performing clustering processing on the plurality of pieces of training information based on a preset first threshold, wherein the density of the training information in each training information set is greater than or equal to the first threshold; and the determining, according to the information of the first user among the plurality of users, the first training information set corresponding to the first user from the plurality of training information sets includes: when the information of the first user indicates that the first user is the owner of the terminal device, determining the training information set having the highest density of training information among the plurality of training information sets as the first training information set.
  • the performing clustering processing on the plurality of pieces of training information to determine a plurality of training information sets includes: performing clustering processing on the plurality of pieces of training information to determine a plurality of training information sets and feature information corresponding to each training information set, wherein the plurality of training information sets are in one-to-one correspondence with the plurality of users; and the determining, according to the information of the first user among the plurality of users, the first training information set corresponding to the first user from the plurality of training information sets includes: when the information of the first user is feature information of the first user, determining, as the first training information set, the training information set whose feature information and the feature information of the first user satisfy a second preset condition.
  • the receiving the plurality of pieces of training information sent by the terminal device comprises: receiving the plurality of pieces of training information and third indication information sent by the terminal device, where the third indication information is used to indicate that the plurality of pieces of training information are determined based on user operations for a first operation interface detected by the terminal device in the first time period; and the sending the determination model to the terminal device comprises: sending the determination model and fourth indication information to the terminal device, where the fourth indication information is used to indicate that the determination model is specifically used to determine whether a user performing an operation for the first operation interface is the first user.
  • the receiving the plurality of pieces of training information sent by the terminal device comprises: receiving the plurality of pieces of training information and fifth indication information sent by the terminal device, where the fifth indication information is used to indicate that the plurality of pieces of training information are determined based on user operations of a first operation type detected by the terminal device in the first time period; and the sending the determination model to the terminal device comprises: sending the determination model and sixth indication information to the terminal device, where the sixth indication information is used to indicate that the determination model is specifically used to determine whether a user performing an operation of the first operation type is the first user.
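  • In this server-assisted variant, the terminal uploads training information together with indication information, and the server returns the determination model plus indication information restricting what the model should be used to judge. A hedged sketch of such messages (all field names are assumptions, not the patent's protocol):

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TrainingUpload:
    """Sent by the terminal: training information plus optional indication
    information saying which operation interface / operation type it came from."""
    training_info: List[List[float]]           # per-operation feature vectors
    operation_interface: Optional[str] = None  # e.g. the third/fifth indication info
    operation_type: Optional[str] = None       # e.g. "slide", "click", "long_press"

@dataclass
class ModelResponse:
    """Returned by the server: the serialized determination model plus indication
    information (e.g. the fourth/sixth indication info) scoping its use."""
    model_bytes: bytes
    operation_interface: Optional[str] = None
    operation_type: Optional[str] = None
```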
  • the first operation type includes a sliding operation type, a click operation type, or a long press operation type.
  • the touch operation information includes at least one of the following: information on the strength of the touch operation, information on the position of the touch operation, information on the contact area of the touch operation, information on the contact time of the touch operation, information on the sliding angle of the touch operation, information on the sliding direction of the touch operation, and information on the sliding distance of the touch operation.
  • an apparatus for managing a terminal device is provided, comprising units configured to perform the steps of the method in any one of the first to third aspects and the implementations thereof.
  • the device for managing the terminal device may be configured in the terminal device.
  • the detecting unit in the apparatus for managing the terminal device can be implemented using a sensor in the terminal device.
  • the processing unit in the apparatus for managing the terminal device may be independent of the processor in the terminal device.
  • the processing unit in the device managing the terminal device can be implemented using a processor in the terminal device.
  • the device managing the terminal device itself may be a terminal device.
  • an apparatus for user identification is provided, comprising units configured to perform the steps of the method of the fourth aspect and its implementations described above.
  • the apparatus for user identification may be configured in a server capable of communicating with the terminal device.
  • the processing unit in the apparatus for user identification may be independent of the processor in the server.
  • the processing unit in the apparatus for user identification can be implemented using a processor in the server.
  • the apparatus for user identification may itself be a server.
  • a terminal device is provided, comprising a sensor, a processor, and a memory, where the sensor is configured to detect a user operation and/or a posture of the terminal device, the memory is configured to store a computer program, and the processor is configured to invoke and run the computer program from the memory, such that the terminal device performs the method of any one of the first to third aspects and the various possible implementations thereof.
  • the processor is one or more, and the memory is one or more.
  • the memory may be integrated with the processor or the memory may be separate from the processor.
  • there may be one or more sensors; the one or more sensors may collectively detect the same parameter, or different sensors may be used to detect different parameters.
  • the terminal device may further include a transmitter (transmitter) and a receiver (receiver).
  • a server is provided, comprising a processor, a memory, a transmitter, and a receiver, where the memory is configured to store a computer program, and the processor is configured to invoke and run the computer program from the memory, such that the server performs the method of user identification in the fourth aspect and its various implementations.
  • the processor is one or more, and the memory is one or more.
  • the memory may be integrated with the processor or the memory may be separate from the processor.
  • a computer program product is provided, comprising a computer program (which may also be referred to as code or instructions) that, when run, causes the computer to perform the method of any one of the first to fourth aspects and the various implementations thereof.
  • a computer readable medium is provided, storing a computer program (which may also be referred to as code or instructions) that, when run on a computer, causes the computer to perform the method of any one of the first to fourth aspects and the various implementations thereof.
  • a chip system is provided, comprising a memory and a processor, where the memory is configured to store a computer program, and the processor is configured to invoke and run the computer program from the memory, such that a device on which the chip system is installed performs the methods of the above aspects.
  • the chip system can invoke data collected by sensors of the terminal device and generate a determination model based on the relevant steps in the method of any of the possible implementations of the first aspect or the second aspect.
  • the chip system may include an input circuit or interface for transmitting information or data, and an output circuit or interface for receiving information or data.
  • since a user's operations are habitual, the same user (i.e., the first user) may generate a large number of similar operations in the process of operating the terminal device; by training on these operations to obtain a determination model, it is possible to determine, based on the determination model, whether an operation is performed by the first user, thereby improving the use security of the terminal device.
  • FIG. 1 is a schematic diagram showing an example of a terminal device to which the method for managing a terminal device of the present application is applied.
  • FIG. 2 is a schematic flowchart of an example of a method of managing a terminal device according to the present application.
  • FIG. 3 is a schematic diagram showing an example of a user operation corresponding to the determination model.
  • FIG. 4 is a schematic diagram showing an example of interface control implemented by a method of managing a terminal device according to the present application.
  • FIG. 5 is a schematic diagram of another example of interface control implemented by the method of managing a terminal device of the present application.
  • FIG. 6 is a schematic flowchart of an example of a method of determining a determination model according to the present application.
  • FIG. 7 is a schematic interaction diagram of an example of a determination method of a determination model of the present application.
  • FIG. 8 is a schematic block diagram of an example of an apparatus for user identification according to the present application.
  • a terminal device may also be called user equipment (UE), an access terminal, a subscriber unit, a subscriber station, a mobile station, a remote station, a remote terminal, a mobile device, a user terminal, a terminal, a wireless communication device, a user agent, or a user apparatus. The terminal device may be a station (ST) in a WLAN, a cellular phone, a cordless phone, a Session Initiation Protocol (SIP) phone, a Wireless Local Loop (WLL) station, a personal digital assistant (PDA), a handheld device with wireless communication capabilities, a computing device or other processing device connected to a wireless modem, an in-vehicle device, a car networking terminal, a computer, a laptop, a handheld communication device, a handheld computing device, a satellite wireless device, a wireless modem card, a set top box (STB), customer premise equipment (CPE), a terminal device in a next generation communication system (for example, a terminal device in a 5G network), or a terminal device in a future evolved Public Land Mobile Network (PLMN).
  • the terminal device may also be a wearable device.
  • a wearable device, which can also be called a wearable smart device, is a general term for devices that can be worn, such as glasses, gloves, watches, clothing, and shoes, developed by applying wearable technology to the intelligent design of everyday wear.
  • a wearable device is a portable device that is worn directly on the body or integrated into the user's clothing or accessories. A wearable device is more than just a hardware device; it also implements powerful functions through software support, data interaction, and cloud interaction.
  • in a broad sense, wearable smart devices include full-featured, large-size devices that can implement complete or partial functions without relying on a smartphone, such as smart watches or smart glasses, as well as devices that focus on only one type of application function and need to be used together with other devices such as smartphones, for example various smart bracelets and smart jewelry for vital signs monitoring.
  • the terminal device may also be a terminal device in an Internet of Things (IoT) system; the IoT is an important component of future information technology development, and its main technical feature is connecting things to the network through communication technology, thereby realizing an intelligent network of human-machine interconnection and thing-to-thing interconnection.
  • FIG. 1 shows a schematic diagram of an example of the terminal device.
  • the terminal device 100 may include the following components.
  • the RF circuit 110 can be used for receiving and sending signals during transmission and reception of information or during a call; specifically, after receiving downlink information from a base station, the RF circuit 110 delivers it to the processor 180 for processing, and in addition, sends uplink data to the base station.
  • RF circuits include, but are not limited to, an antenna, at least one amplifier, a transceiver, a coupler, an LNA (Low Noise Amplifier), a duplexer, and the like.
  • RF circuitry 110 can also communicate with the network and other devices via wireless communication.
  • the wireless communication may use any communication standard or protocol, including but not limited to Wireless Local Area Networks (WLAN), the Global System of Mobile communication (GSM) system, the Code Division Multiple Access (CDMA) system, the Wideband Code Division Multiple Access (WCDMA) system, the General Packet Radio Service (GPRS), the Long Term Evolution (LTE) system, LTE Frequency Division Duplex (FDD), LTE Time Division Duplex (TDD), the Universal Mobile Telecommunication System (UMTS), the Worldwide Interoperability for Microwave Access (WiMAX) communication system, the future fifth generation (5G) system, or New Radio (NR).
  • the memory 120 can be used to store software programs and modules, and the processor 180 executes various functional applications and data processing of the terminal device 100 by running software programs and modules stored in the memory 120.
  • the memory 120 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application required for at least one function (such as a sound playing function or an image playing function), and the like, and the storage data area may store data (such as audio data or a phone book) created according to the use of the terminal device 100, and the like.
  • memory 120 can include high speed random access memory, and can also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
  • Other input devices 130 can be used to receive input digital or character information, as well as to generate key signal inputs related to user settings and function control of terminal device 100.
  • other input devices 130 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control buttons or switch buttons), a trackball, a mouse, a joystick, and a light mouse (a light mouse is a touch-sensitive surface that does not display visual output, or an extension of a touch-sensitive surface formed by a touch screen).
  • Other input devices 130 are coupled to other input device controllers 171 of I/O subsystem 170 for signal interaction with processor 180 under the control of other device input controllers 171.
  • the display screen 140 can be used to display information input by the user or information provided to the user as well as various menus of the terminal device 100, and can also accept user input.
  • specifically, the display screen 140 may include a display panel 141 and a touch panel 142.
  • the display panel 141 can be configured by using a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
  • the touch panel 142, also referred to as a touch screen or a touch-sensitive screen, can collect contact or non-contact operations performed by the user on or near it (for example, operations performed by the user on or near the touch panel 142 using any suitable object or accessory such as a finger or a stylus; operations in the vicinity of the touch panel 142 may also include somatosensory operations), where the operations include single-point control operations, multi-point control operations, and the like, and can drive the corresponding connection device according to a preset program.
  • the touch panel 142 may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch orientation and posture of the user, detects a signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into information that the processor can process, sends the information to the processor 180, and can receive and execute commands sent by the processor 180.
  • the touch panel 142 can be implemented by using various types such as resistive, capacitive, infrared, and surface acoustic waves, and the touch panel 142 can be implemented by any technology developed in the future.
  • the touch panel 142 can cover the display panel 141, and the user can perform operations on or near the touch panel 142 covering the display panel 141 according to the content displayed by the display panel 141 (the displayed content includes, but is not limited to, a soft keyboard, a virtual mouse, a virtual button, an icon, and the like).
  • after detecting an operation on or near it, the touch panel 142 transmits the operation to the processor 180 through the I/O subsystem 170 to determine the user input, and the processor 180 then provides a corresponding visual output on the display panel 141 via the I/O subsystem 170 according to the user input.
  • although the touch panel 142 and the display panel 141 in FIG. 1 are two independent components implementing the input and output functions of the terminal device 100, in some embodiments the touch panel 142 and the display panel 141 may be integrated to implement the input and output functions of the terminal device 100.
  • the sensor 150 can be one or more, for example, it can include a light sensor, a motion sensor, and other sensors.
  • the light sensor may include an ambient light sensor and a proximity sensor, wherein the ambient light sensor may adjust the brightness of the display panel 141 according to the brightness of the ambient light, and the proximity sensor may close the display panel 141 when the terminal device 100 moves to the ear. And / or backlight.
  • the acceleration sensor can detect the magnitude of acceleration in each direction (usually three axes), and can detect the magnitude and direction of gravity when stationary; it can be used for applications that identify the attitude of the terminal device (such as landscape/portrait switching, related games, and magnetometer attitude calibration) and for vibration-recognition related functions (such as a pedometer or tapping).
  • the terminal device 100 may also be configured with a gravity sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which will not be described herein.
  • on one hand, the audio circuit 160 can transmit converted audio data to the speaker 161, which converts it into a sound signal for output; on the other hand, the microphone 162 converts a collected sound signal into an electrical signal, which is received by the audio circuit 160 and converted into audio data; the audio data is then output to the RF circuit 110 for transmission to, for example, another terminal device, or output to the memory 120 for further processing.
  • the I/O subsystem 170 is used to control external devices for input and output, and may include other device input controllers 171, sensor controllers 172, and display controllers 173.
  • one or more other input control device controllers 171 receive signals from other input devices 130 and/or send signals to other input devices 130.
  • Other input devices 130 may include physical buttons (press buttons, rocker buttons, etc.), a dial, a slide switch, a joystick, a click wheel, and a light mouse (a light mouse is a touch-sensitive surface that does not display visual output, or an extension of a touch-sensitive surface formed by a touch screen). It is worth noting that other input control device controllers 171 can be connected to any one or more of the above devices.
  • Display controller 173 in I/O subsystem 170 receives signals from display 140 and/or transmits signals to display 140. After the display 140 detects the user input, the display controller 173 converts the detected user input into an interaction with the user interface object displayed on the display screen 140, ie, implements human-computer interaction. Sensor controller 172 can receive signals from one or more sensors 150 and/or send signals to one or more sensors 150.
  • the processor 180 is the control center of the terminal device 100; it connects the various parts of the entire terminal device using various interfaces and lines, and performs the various functions of the terminal device 100 and processes data by running or executing the software programs and/or modules stored in the memory 120 and invoking the data stored in the memory 120, thereby performing overall monitoring of the terminal device.
  • the processor 180 may include one or more processing units; preferably, the processor 180 may integrate an application processor and a modem processor, where the application processor mainly processes an operating system, a user interface, an application, and the like.
  • the modem processor primarily handles wireless communications. It can be understood that the above modem processor may not be integrated into the processor 180.
  • the terminal device 100 also includes a power source 190 (e.g., a battery) that supplies power to the various components.
  • the power source can be logically coupled to the processor 180 through a power management system to manage functions such as charging, discharging, and power consumption through the power management system.
  • the terminal device 100 may further include a camera, a Bluetooth module, and the like, and details are not described herein again.
  • FIG. 2 shows a schematic flowchart of an example of a method 200 of managing a terminal device according to the present application; for example, the method 200 can be applied to the terminal device 100 described above.
  • a processor of the terminal device can acquire related data (data #1) of user operation #1; for example, after a sensor (e.g., a touch screen or an attitude sensor) detects user operation #1, the data #1 may be transmitted to the processor.
  • execution of the method 200 can be triggered in any of the following manners.
  • Mode 1: Trigger based on a prescribed detection period
  • specifically, the terminal device can check the identity of the user of the terminal device according to a prescribed detection period, that is, determine the degree of matching between the user operation detected in the current period and the determination model (described later in detail).
  • in this case, user operation #1 may be one or more operations detected during the current detection cycle.
  • Mode 2: Trigger based on the duration of operation of the terminal device
  • the terminal device may record the duration for which it is continuously operated, where continuous operation may mean that the time interval between any two operations within the duration is less than or equal to a preset time interval.
  • the terminal device can initiate the method 200 when it is determined that the duration of the continuous operation is greater than or equal to the preset duration.
  • the user operation #1 may be one or more operations detected after the terminal device determines that the duration of the continuous operation is greater than or equal to the preset duration.
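  • The following is an illustrative sketch of the Mode 2 trigger described above, assuming operation timestamps in seconds; the gap and duration values are placeholders, not values from the patent.

```python
def continuously_operated(timestamps, max_gap=5.0, min_duration=60.0) -> bool:
    """Decide whether recent operations form one continuous session long enough
    to trigger the identity check (method 200).

    timestamps: ascending operation times in seconds; max_gap is the preset time
    interval between operations, min_duration the preset continuous duration.
    """
    if not timestamps:
        return False
    session_start = timestamps[0]
    for prev, cur in zip(timestamps, timestamps[1:]):
        if cur - prev > max_gap:   # gap too large: the continuous session restarts
            session_start = cur
    return timestamps[-1] - session_start >= min_duration

print(continuously_operated([0, 2, 3, 7, 9, 70]))    # False: broken by a long gap
print(continuously_operated(list(range(0, 66, 3))))  # True: 63 s of steady use
```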
  • Mode 3: Trigger based on the currently operated application
  • specifically, the terminal device can determine the currently operated application, and if the application belongs to a prescribed application (e.g., a chat application or a payment application), the method 200 can be initiated.
  • the user operation #1 may be a touch operation for a prescribed application.
  • the user operation #1 can be any of the following operations:
  • A. Screen unlocking operation. Specifically, in order to prevent misoperation and improve the security of the terminal device, the user can lock the screen, or the terminal device can lock the screen by itself if it does not detect any user operation within a prescribed time. Therefore, when the user needs to unlock the screen, a correct unlocking operation is required; examples include various unlocking operations such as slide unlocking, password unlocking, or pattern unlocking.
  • B. Application unlocking operation. Specifically, in order to improve the security of the terminal device, when the user needs to open an application (for example, a chat application or a payment application), the terminal device may pop up an unlock interface, so that the application can be started normally only after the user performs the correct unlock operation; or, when the user needs to use a certain function of the application (for example, a transfer function or a query function), the terminal device or the application can also pop up the unlock interface, so that the function can be started normally only after the user performs the correct unlock operation.
  • the prescribed application may be an application set by the user, such as a chat application or a payment application; or, the prescribed application may be an application set by a manufacturer or an operator.
  • the "touch operation for a prescribed application” may refer to an operation on a prescribed interface of the application, for example, the operation may include an operation on a prescribed interface control (for example, a click operation on a payment button)
  • the operation may include an operation set by the user, for example, a sliding operation performed on the browsing interface to implement page turning, and the like.
  • the "touch operation for a prescribed application” may refer to an operation of management of an application, and may include, for example, operations such as deletion of an application or change of authority.
  • the specific content included in the above-mentioned user operation #1 is merely an exemplary description, and the present application is not limited thereto, and the user can arbitrarily set the specific content of the user operation #1 as needed.
  • the user operation #1 can also be determined in conjunction with the above triggering method.
  • the user operation may include a touch operation.
  • the related data of the user operation may include: related data of a touch operation on the touch screen by the user; and/or related data of the posture of the terminal device when the user operates the terminal device. The following two types of data are respectively described in detail.
  • the user operation #1 may be a touch operation, in which case the related data of the user operation #1 may be data of the touch operation.
  • the above-described touch detection device may detect data of a user's touch operation. By way of example and not limitation, the touch operation may include, but is not limited to, a click operation (e.g., a single click or a double click) or a slide operation, and the data of the touch operation may include, but is not limited to, data of at least one of the following parameters (an illustrative feature-vector sketch is given after this list):
  • the terminal device can configure a pressure sensor under the touch screen, so that the strength of the touch operation can be detected, and the data of the detected strength is transmitted to the above processor.
  • the manner of detecting the strength of the touch operation is merely exemplary.
  • the present invention is not limited thereto.
  • the touch screen may also integrate the above pressure sensor; for example, the touch screen may be a pressure screen or a pressure-sensing screen.
  • the location of the touch operation may be the relative position of the touched area within the entire touch screen; for example, when the touch operation is a click operation, the location of the touch operation may be the position of the touch point in the screen.
  • when the touch operation is a slide operation, the location of the touch operation may include the position of the start point of the slide operation in the screen, the position of the end point of the slide operation in the screen, or the location of the slide track on the screen.
  • the contact area of the touch operation may be the contact area of the user's finger with the touch screen detected by the touch detection device.
  • the contact time of the touch operation may be the time of one touch operation detected by the touch detection device, where "one touch operation" may refer to the period from when the touch screen is touched by the finger until the finger leaves the touch screen. For example, when the touch operation is a click operation, the contact time may refer to the time during which the touch screen is touched in a single click operation; and when the touch operation is a sliding operation, the contact time may refer to the time during which the touch screen is touched in one sliding operation.
  • Alternatively, the contact time may cover a plurality of touch operations whose detected intervals are less than or equal to a preset time interval (for example, when the touch operation is a double-click operation, the finger briefly leaves the touch screen between the two clicks); in this case, the contact time of the touch operation may refer to the total time that the touch screen is touched during the whole operation.
  • the angle of the sliding operation may be an angle between a line from the start position of the sliding operation to the end position of the sliding operation and a horizontal (or vertical) direction of the screen.
  • Alternatively, the direction of the sliding operation may be the direction from the start position of the sliding operation to the end position of the sliding operation.
  • the sliding distance of the sliding operation may be a straight line length from the start position of the sliding operation to the end position of the sliding operation.
  • the sliding distance of the sliding operation may be the total length of the trajectory of the sliding operation.
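  • For illustration only, the touch parameters listed above can be arranged into a fixed-order feature vector per operation; this is a sketch under the assumption that each parameter is available as a plain number, and the field names are hypothetical:

```python
import numpy as np

def touch_feature_vector(op: dict) -> np.ndarray:
    """Arrange the detected touch parameters of one operation into a
    fixed-order vector; each entry is one 'dimension' of the operation."""
    return np.array([
        op["force"],           # strength of the touch operation
        op["x"], op["y"],      # location (relative position on the screen)
        op["contact_area"],    # contact area of the finger
        op["contact_time"],    # contact time
        op["slide_angle"],     # angle of the sliding operation
        op["slide_distance"],  # sliding distance
    ], dtype=float)

# Hypothetical single operation as reported by the touch detection device.
sample = {"force": 0.42, "x": 0.31, "y": 0.77, "contact_area": 1.8,
          "contact_time": 0.12, "slide_angle": 85.0, "slide_distance": 0.45}
print(touch_feature_vector(sample))
```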
  • the relevant data of the attitude of the terminal device may include data of the angle formed by the terminal device with respect to the horizontal or gravity direction.
  • the attitude information may include, but is not limited to, data detected by at least one of the following sensors:
  • the processor of the terminal device may manage the terminal device based on one or more decision models and related data of the user operation #1.
  • different operating habits of different users may result in different sliding directions, angles, or distances of sliding operations of different users.
  • different operating habits of different users may result in different postures (eg, tilt angles, etc.) of the terminal devices of different users when operating the terminal devices.
  • the data detected by the sensor from the same user has similarity due to the user's operating habits and biometrics.
  • the similarity of the touch contact area of the touch operation of the same user is high.
  • the similarity of the sliding distance of the sliding operation of the same user is high.
  • the similarity of the sliding direction, angle, or distance of the sliding operation of the same user is high.
  • the similarity of the touch time or the touch force of the touch operation of the same user is high.
  • the similarity of the posture (for example, the tilt angle, etc.) of the terminal device of the same user when operating the terminal device is high.
  • training can be performed based on data of the same user, thereby determining a determination model for the user.
  • the determination model can be used to determine whether a certain user operation is an operation performed by a user corresponding to the determination model.
  • the related data of the user operation #1 and the one or more determination models may be used by at least one of the following methods.
  • the determination model #1 may be stored in the terminal device.
  • the determination model #1 may be a model for determining whether a user operation is an operation performed by the owner of the terminal device; that is, the determination model #1 may be a model generated by pre-training based on operations performed by the owner.
  • the processor may determine whether the user who executed the user operation #1 is the owner based on the data of the user operation #1 and the determination model #1.
  • For example, if the degree of difference (for example, the absolute value of the difference) between the value of each dimension of the data #1 and the value of the corresponding dimension of the determination model #1 is less than or equal to a prescribed threshold #c, the processor may determine that the user who performs the user operation #1 is the owner.
  • Alternatively, if, for Y dimensions among the X dimensions of the data #1, the difference between the value of the data #1 and the value of the determination model #1 is less than or equal to the prescribed threshold #c, the processor can determine that the user performing the user operation #1 is the owner, where Y is less than or equal to X and the ratio of Y to X is greater than or equal to a prescribed threshold #d.
  • Conversely, if the degree of difference (for example, the absolute value of the difference) between the value of each dimension of the data #1 and the value of the corresponding dimension of the determination model #1 is larger than the prescribed threshold #c, the processor may determine that the user who performs the user operation #1 is not the owner.
  • Alternatively, if, for Y dimensions among the X dimensions of the data #1, the difference between the value of the data #1 and the value of the determination model #1 is larger than the prescribed threshold #c, the processor can determine that the user who performs the user operation #1 is not the owner, where Y is less than or equal to X and the ratio of Y to X is greater than or equal to the prescribed threshold #d.
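  • A minimal sketch of the dimension-wise comparison described above (the thresholds #c and #d and all values below are hypothetical placeholders; this application does not prescribe concrete values):

```python
import numpy as np

def is_owner(data1: np.ndarray, model1: np.ndarray,
             threshold_c: float, threshold_d: float) -> bool:
    """Compare the X-dimensional data #1 with the determination model #1:
    count the Y dimensions whose absolute difference is within threshold #c,
    and decide 'owner' if the ratio Y/X reaches threshold #d."""
    diffs = np.abs(data1 - model1)          # per-dimension degree of difference
    y = int(np.sum(diffs <= threshold_c))   # Y: dimensions that match
    x = data1.size                          # X: total number of dimensions
    return y / x >= threshold_d

# Hypothetical example with 5 dimensions.
data1 = np.array([0.40, 0.30, 0.80, 1.70, 0.10])
model1 = np.array([0.42, 0.31, 0.77, 1.80, 0.12])
print(is_owner(data1, model1, threshold_c=0.05, threshold_d=0.8))
```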
  • the processor may allow the processing corresponding to the user operation #1 to be performed.
  • the processor can launch the application #1.
  • the processor can switch the current interface to the interface #1.
  • the user operation #1 may be a screen unlocking operation.
  • the processor may control the terminal device to present an interface that needs to be presented after the screen is unlocked.
  • the processor may control the terminal device to execute the function #1.
  • the processor may unlock the specified application.
  • the application #X can be directly launched, where the application #X is an application that is allowed to start only after being unlocked (for example, only after a password such as a fingerprint password, a numeric password, or a graphical password has been entered and verified as correct).
  • That is, the unlocking process for the application #X may be skipped; for example, the password input interface may not be popped up, and the interface of the application #X may be accessed directly.
  • the processor may prohibit execution of the processing corresponding to the user operation #1.
  • the processor may prohibit launching the application #1.
  • the processor can lock the terminal device, that is, the terminal device needs to be unlocked again before the terminal device can continue to be used.
  • the processor may prohibit switching the current interface to the interface #1.
  • the user operation #1 may be a screen unlocking operation.
  • the processor may control the terminal device to maintain the current interface on the lock screen interface.
  • the processor may prohibit the terminal device from executing the function #1.
  • the processor can lock the terminal device, that is, the terminal device needs to be unlocked again before the terminal device can continue to be used.
  • the processor may lock the specified application.
  • the application #X can be directly locked, where the application #X is an application that is allowed to start only after being unlocked (for example, only after a password such as a fingerprint password, a numeric password, or a graphical password has been entered and verified as correct).
  • application #X can also be an application that is allowed to be launched without unlocking.
  • In this case, the application #X is locked immediately, that is, even if an operation for starting the application #X is detected, the application #X is not started.
  • the specified one or more applications may be locked, for example, the application is prohibited from being operated by the user, or the application is prohibited from running.
  • the application may be an application capable of realizing a payment function or a consumption function, thereby avoiding property loss of the owner caused by operation of the terminal device by a user other than the owner.
  • the application may be an application set by the owner, thereby ensuring that the owner's privacy is not compromised.
  • the anti-theft function can be turned on.
  • At least one of the following actions may be employed:
  • An alarm is issued to the operator or the police, and the current location information of the terminal device can be carried in the alarm.
  • the lock screen function can be disabled.
  • the display interface of the terminal device can be switched to the lock screen interface, and the user needs to unlock the terminal device correctly before it can be used normally.
  • For example, the determination model may be determined based on the sliding operations detected during the period #A while the owner browses pictures, and an example of the trajectory 300 of a sliding operation that the owner is accustomed to is shown in FIG.
  • When the owner performs a sliding operation, the sliding operation (for example, its sliding direction and sliding distance) has high similarity with the sliding operation corresponding to the determination model because of the owner's operating habit; therefore, the determination model can determine that the user who performed the sliding operation is the owner, and the terminal device can be allowed to perform processing based on the sliding operation.
  • In contrast, when a sliding operation has low similarity with the sliding operation corresponding to the determination model, the determination model can determine that the user who performs the sliding operation is not the owner; thus, the terminal device can be prohibited from performing processing based on the sliding operation, and the current interface can be switched to the lock screen interface.
  • In this way, the security of the terminal device can be improved. For example, since the finger size and the touch force of an adult and a child differ significantly, according to the solution of the present invention, the above-described determination model can effectively distinguish whether the user performing an operation is an adult or a child; thus, even if a child knows the password of a parent's terminal device, property loss of the parent caused by the child's operation (for example, initiating a transfer or a purchase) can be avoided.
  • At least one determination model #2 may be further stored in the terminal device.
  • the at least one determination model #2 may be in one-to-one correspondence with at least one non-owner user.
  • the determination model #2 can be used to determine whether a certain user operation is an operation performed by the corresponding non-owner user; that is, each determination model #2 may be a model generated by pre-training based on operations performed by the corresponding non-owner user.
  • the processor may further determine whether a determination model #2A matching the data #1 exists in the at least one determination model #2. For example, if the degree of difference (e.g., the absolute value of the difference) between the value of each dimension of the data #1 and the value of each dimension of the determination model #2A is less than or equal to a prescribed threshold #e, the processor may determine that the user who executes the user operation #1 is the non-owner user corresponding to the determination model #2A (denoted as the user #2).
  • Alternatively, if, for Y dimensions among the X dimensions of the data #1, the difference between the value of the data #1 and the value of the determination model #2A is less than or equal to the prescribed threshold #e, the processor can determine that the user who performed the user operation #1 is the user #2, where Y is less than or equal to X and the ratio of Y to X is greater than or equal to a prescribed threshold #f.
  • the processor can determine the operation authority corresponding to the user #2.
  • the operation authority may be used to indicate an application that the user #2 can use; or the operation authority may be used to indicate an application (for example, a payment application or a chat application, etc.) in which the user #2 is prohibited from being used.
  • the operational authority may be used to indicate a function that the user #2 can use; or, the operational authority may be used to indicate a function (eg, making a call, etc.) that the user #2 is prohibited from using.
  • the operational authority may be used to indicate an operational interface accessible to the user #2; or the operational authority may be used to indicate an interface (eg, a photo browsing interface, etc.) that the user #2 is prohibited from accessing.
  • the processor can determine, according to the authority of the user #2, whether the processing corresponding to the user operation #1 is allowed to be performed.
  • For example, the processor can determine whether the application #1 can be used by the user #2 based on the authority of the user #2. If the authority of the user #2 indicates that the application #1 can be used by the user #2, the processor can start the application #1; if the authority of the user #2 indicates that the application #1 cannot be used by the user #2, the processor can prohibit launching the application #1. And, optionally, in the latter case, the processor can lock the terminal device, that is, the terminal device needs to be unlocked again before it can continue to be used.
  • the processor may determine whether the interface #1 can be accessed by the user #2 according to the authority of the user #2.
  • the user operation #1 may be a screen unlocking operation.
  • If the permission of the user #2 indicates that the user #2 is allowed to unlock the screen, the processor may control the terminal device to present the interface that needs to be presented after the screen is unlocked; if the permission of the user #2 indicates that the user #2 is not allowed to unlock the screen, the processor can control the terminal device to keep the current interface on the lock screen interface.
  • Alternatively, the processor may determine, according to the authority of the user #2, whether the function #1 can be used by the user #2. If the authority of the user #2 indicates that the function #1 can be used by the user #2, the processor can control the terminal device to execute the function #1; if the authority of the user #2 indicates that the function #1 cannot be used by the user #2, the processor can prohibit the terminal device from executing the function #1. And, optionally, in the latter case, the processor can lock the terminal device, that is, the terminal device needs to be unlocked again before it can continue to be used.
  • In this way, the flexibility of control can be improved. For example, when the performer of the user operation is determined to be a non-owner based on the above scheme, an application that does not affect the security of the terminal (for example, photographing) can still be allowed to run.
  • A plurality of determination models may be stored in the terminal device, the plurality of determination models being in one-to-one correspondence with a plurality of users; each determination model may be used to determine whether a certain user operation is an operation performed by the corresponding user, that is, each determination model may be a model generated by pre-training based on operations performed by the corresponding user.
  • the processor may further determine whether or not the determination model matching the data #1 exists in the plurality of determination models (hereinafter, it is referred to as the determination model #B for ease of understanding and distinction). For example, if the degree of difference between the value of each dimension of the data #1 and the value of each dimension of the decision model #B (eg, the absolute value of the difference) is less than or equal to the prescribed threshold #g, the processor may It is determined that the user who executes the user operation #1 is the non-owner user corresponding to the determination model #B (indicated as user #3).
  • Alternatively, if, for Y dimensions among the X dimensions of the data #1, the difference between the value of the data #1 and the value of the determination model #B is less than or equal to the prescribed threshold #g, the processor can determine that the user who performed the user operation #1 is the user #3, where Y is less than or equal to X and the ratio of Y to X is greater than or equal to a prescribed threshold #h.
  • the processor can determine the operation authority corresponding to the user #3.
  • the operation authority may be used to indicate an application that the user #3 can use; or the operation authority may be used to indicate an application (for example, a payment application or a chat application, etc.) in which the user #3 is prohibited from being used.
  • the operational authority may be used to indicate a function that the user #3 can use; or, the operational authority may be used to indicate a function (eg, making a call, etc.) that the user #3 is prohibited from using.
  • the operational authority may be used to indicate an operational interface accessible to the user #3; or the operational authority may be used to indicate an interface (eg, a photo browsing interface, etc.) that the user #3 is prohibited from accessing.
  • the processor can determine, according to the authority of the user #3, whether the processing corresponding to the user operation #1 is allowed to be performed.
  • For example, the processor can determine whether the application #a can be used by the user #3 based on the authority of the user #3. If the authority of the user #3 indicates that the application #a can be used by the user #3, the processor can start the application #a; if the authority of the user #3 indicates that the application #a cannot be used by the user #3, the processor can prohibit launching the application #a. And, optionally, in the latter case, the processor can lock the terminal device, that is, the terminal device needs to be unlocked again before it can continue to be used.
  • the processor may determine whether the interface #a can be accessed by the user #3 according to the authority of the user #3.
  • the user operation #1 may be a screen unlocking operation.
  • If the permission of the user #3 indicates that the user #3 is allowed to unlock the screen, the processor may control the terminal device to present the interface that needs to be presented after the screen is unlocked; if the permission of the user #3 indicates that the user #3 is not allowed to unlock the screen, the processor can control the terminal device to keep the current interface on the lock screen interface.
  • Alternatively, the processor may determine, according to the authority of the user #3, whether the function #a can be used by the user #3. If the authority of the user #3 indicates that the function #a can be used by the user #3, the processor can control the terminal device to execute the function #a; if the authority of the user #3 indicates that the function #a cannot be used by the user #3, the processor can prohibit the terminal device from executing the function #a. And, optionally, in the latter case, the processor can lock the terminal device, that is, the terminal device needs to be unlocked again before it can continue to be used.
  • the authority of the user may be set by the owner, or may be sent by the manufacturer or the operator to the terminal device, and the application is not particularly limited.
  • Optionally, the terminal device may provide two modes. In mode 1, the terminal device may determine whether the user who performs an operation is a prescribed user (e.g., the owner) based on the determination model determined as described above; in contrast, in mode 2, the terminal device may refrain from determining, based on that determination model, whether the user who performs the operation is the prescribed user, or the terminal device may not confirm the identity of the user who performs the operation at all. Therefore, when the owner allows another person to use the terminal device, the situation in which that person is prevented from using it by the above method can be avoided, which further improves the practicality of the present application and realizes a user-friendly setting.
  • Since a user's operations are habitual, the same user may generate a large number of similar operations in the process of operating the terminal device. A determination model can therefore be obtained by training with multiple operations of the same user X, and whether a certain user operation is performed by the user X can be determined based on the determination model, thereby improving the use security of the terminal device.
  • FIG. 6 shows an exemplary illustration of an example of the determination method 600 of the above-described determination model, for example, the method 600 can be applied to the above-described terminal device 100.
  • a processor of the terminal device can acquire data detected by a sensor of the terminal device (that is, an example of training information).
  • the data can include related data of the user's touch operation on the touch screen (for example, the strength, location, contact area, contact time, sliding angle, sliding direction, and sliding distance of the touch operation) and/or related data of the posture of the terminal device when the user operates it (for example, the angle formed by the terminal device with respect to the horizontal or gravity direction); these parameters are the same as those described above in connection with the method 200, and a detailed description thereof is omitted here to avoid redundancy.
  • Likewise, as already explained above, due to the user's operating habits and biometrics, the data detected by the sensors for operations of the same user has high similarity (for example, in contact area, sliding direction, angle, distance, touch time, touch force, and device posture), while such data differs between different users.
  • the processor of the terminal device can perform clustering processing on the data acquired from the above-mentioned sensors, and can divide the operations of the same user into the same class or group.
  • For example, a sliding operation may cause data to be obtained by both the touch screen and the pressure sensor, and for a sliding operation, the touch screen may obtain parameters of multiple dimensions such as the sliding direction, the sliding distance, and the sliding angle.
  • Multiple data detected by different sensors can be used as information of different dimensions of the operation, thereby generating, in a multi-dimensional (e.g., two-dimensional or three-dimensional) space, a data point (or coordinate point) having a specific value in each dimension.
  • data of various parameters detected by the same sensor may be used as information of different dimensions of the operation, thereby generating points having specific values for each dimension in a multi-dimensional (eg, two-dimensional or three-dimensional) space.
  • multiple data points can be determined.
  • the plurality of data points can be clustered.
  • clustering may be performed by any of the following methods.
  • The density-based clustering method works well for densely concentrated regions. In order to find clusters of arbitrary shape, this method regards clusters as dense object regions separated by low-density regions in the data space; it divides regions with sufficiently high density into clusters and can find clusters of arbitrary shape in spatial data with noise.
  • ε-neighborhood: the area within a given radius ε of an object is called the ε-neighborhood of the object.
  • Core object: if the number of sample points in the ε-neighborhood of a given object is greater than or equal to MinPts, the object is called a core object.
  • Directly density-reachable: for a sample set D, if a sample point q is in the ε-neighborhood of p and p is a core object, then the object q is directly density-reachable from the object p.
  • Density-connected: if there is a point o in the sample set D such that both the object p and the object q are density-reachable from o, then p and q are density-connected.
  • Density-reachability is the transitive closure of direct density-reachability, and this relationship is asymmetric; density-connectedness is symmetric. The purpose of density-based clustering is to find the largest set of density-connected objects.
  • Density-based clustering searches for clusters by examining the ε-neighborhood of each point in the database. If the ε-neighborhood of a point p contains more than MinPts points, a new cluster is created with p as a core object. Then, objects that are directly density-reachable from these core objects are iteratively collected, which may involve merging some density-reachable clusters. The process ends when no new point can be added to any cluster.
  • For example, assume MinPts = 3 and that: the ε-neighborhood of point p contains the points {m, p, p1, p2, o}; the ε-neighborhood of point m contains the points {m, q, p, m1, m2}; the ε-neighborhood of point q contains the points {q, m}; the ε-neighborhood of point o contains the points {o, p, s}; and the ε-neighborhood of point s contains the points {o, s, s1}.
  • the core objects are p, m, o, s.
  • Point m is directly density-reachable from point p, because m is in the ε-neighborhood of p and p is a core object;
  • Point q is density-reachable from point p, because point q is directly density-reachable from point m and point m is directly density-reachable from point p;
  • Point q and point s are density-connected, because point q is density-reachable from point p and point s is density-reachable from point p.
  • the density-based clustering method needs to select a distance metric.
  • The distance between any two points reflects the density between them and indicates whether the points can be clustered into the same class. Since it is difficult for density-based clustering methods to define density for high-dimensional data, the Euclidean distance can be used as the metric for points in two-dimensional space.
  • The density-based clustering method requires two input parameters: one is the radius (Eps), which defines the circular neighborhood centered on a given point P; the other is the minimum number of points (MinPts) in the neighborhood centered on the point P. If the number of points in the neighborhood of radius Eps centered on the point P is not less than MinPts, the point P is called a core point.
  • The k-distance of a point p(i) is the k-th smallest distance from p(i) to all other points (excluding p(i) itself).
  • clusters can be generated by connecting the core points.
  • Core points can be connected (in some texts, this is called being "density-reachable") when the circular neighborhoods of radius Eps in which they lie are connected or overlap. These connected core points, together with all the points in their neighborhoods, form a cluster.
  • The idea of computing the connected core points is based on breadth-first or depth-first traversal: take a point p from the core point set S and compute whether p is connected to each other point in S, obtaining a set C1 of connected core points; then delete the point p and the points of C1 from S to obtain the core point set S1. Next, take a point p1 from S1, compute whether p1 is connected to each other point in S1, obtaining a connected core point set C2, and delete the point p1 and all points of C2 from S1 to obtain the core point set S2, and so on. Finally, p, p1, p2, ..., together with C1, C2, ..., form the core points of the clusters; when all points in the core point set S have been traversed, all the clusters are obtained.
  • Boundary point: a point that is not a core point but whose neighborhood contains at least one core point.
  • Noise point: a point that is neither a core point nor a boundary point.
  • the aggregation can be performed in such a way that each core point is placed in the same cluster as all the core points in its neighborhood, and the boundary point is placed in the same cluster as a core point in its neighborhood.
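  • The density-based clustering described above corresponds to the classical DBSCAN algorithm; a minimal sketch using scikit-learn, with hypothetical operation data and parameter values, is:

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Each row is one user operation as a multi-dimensional data point
# (e.g., force, contact area, contact time, sliding distance).
X = np.array([
    [0.42, 1.8, 0.12, 0.45],
    [0.40, 1.7, 0.11, 0.47],
    [0.43, 1.9, 0.13, 0.44],
    [0.90, 3.2, 0.30, 0.10],   # an operation with very different characteristics
])

# eps is the neighborhood radius; min_samples corresponds to MinPts.
labels = DBSCAN(eps=0.3, min_samples=2).fit_predict(X)
print(labels)   # points in the same cluster share a label; -1 marks noise points
```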
  • OPTICS (Ordering Points To Identify the Clustering Structure)
  • OPTICS is one of the density-based clustering methods; it first searches for high-density regions and then sets the parameters according to the high-density characteristics, thereby improving the effect of density-based clustering.
  • the goal of OPTICS is to cluster the data in space according to the density distribution. That is, after processing by the OPTICS algorithm, it is theoretically possible to obtain clusters of arbitrary density. Because the OPTICS algorithm outputs an ordered queue of samples, clusters of arbitrary density can be obtained from this queue.
  • Parameters: the input parameters include the radius ε and the minimum number of points MinPts.
  • From these parameters, the core distance can be defined: for a core point P, the core distance coreDist(P) is the distance from P to its MinPts-th nearest point.
  • Reachability distance: for a core point P, the reachability distance of a point O with respect to P is defined as the larger of the distance from O to P and the core distance of P, i.e., reachDist(O, P) = max(coreDist(P), dist(O, P)).
  • Here, O is directly density-reachable from P, that is, P is a core point and the distance from P to O is less than the radius.
  • the calculation process of the OPTICS algorithm is as follows:
  • Step 1: Input the data samples D, the radius ε, and the minimum number of points MinPts, and initialize the reachability distance and the core distance of all points to MAX.
  • Step 2: Create two queues: an ordered queue (holding core points and the points directly density-reachable from them) and a result queue (storing the sample output and the processing order).
  • Step 3: If all the data in D has been processed, the algorithm ends; otherwise, select from D an unprocessed point that is a core object, place that core point in the result queue, put the points directly density-reachable from it into the ordered queue, and sort them in ascending order of reachability distance.
  • Step 4: If the ordered queue is empty, return to step 3; otherwise, take the first point from the ordered queue. First, determine whether the point is a core point; if not, take the next point from the ordered queue; if it is, store the point in the result queue (if it is not already there), find all of its directly density-reachable points, put these points into the ordered queue, and reorder the points in the ordered queue according to the reachability distance (if a point is already in the ordered queue and its new reachability distance is smaller, update its reachability distance). Repeat this step until the ordered queue is empty.
  • Step 5: The algorithm ends.
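  • A minimal sketch of OPTICS using scikit-learn (the data and parameters are hypothetical; scikit-learn exposes the processing order, the reachability distances, and the extracted labels as attributes):

```python
import numpy as np
from sklearn.cluster import OPTICS

# Hypothetical multi-dimensional operation data points from two users.
rng = np.random.default_rng(0)
cluster_a = rng.normal(loc=[0.4, 1.8], scale=0.05, size=(20, 2))
cluster_b = rng.normal(loc=[0.9, 3.2], scale=0.05, size=(20, 2))
X = np.vstack([cluster_a, cluster_b])

optics = OPTICS(min_samples=5, max_eps=1.0).fit(X)
print(optics.ordering_[:10])       # processing order of the samples (the ordered output)
print(optics.reachability_[:10])   # reachability distance of each sample
print(optics.labels_)              # cluster labels extracted from the ordering
```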
  • For example, a class may be determined according to the ratio of the number of operations performed on the terminal device by the user who needs to be authenticated to the total number of operations performed on the terminal device by its users; this ratio can be used to set the minimum number of points (for example, if there are 500 points in total, the minimum number of points can be set to 300).
  • When the user who needs to be authenticated is the owner of the terminal device, only one class may be determined, and the data that can be clustered into the class can be marked as 1, while the data that cannot be clustered into the class can be marked as 0; that is, the data marked as 1 can be considered to belong to the training information set.
  • Partition-based methods: the principle is to first determine how many classes the scattered points should be clustered into, then pick a few points as initial centers, and then iteratively relocate the data points according to a predetermined heuristic algorithm (iterative relocation) until the final goal that "points within a class are close enough and points between classes are far enough apart" is reached.
  • the K-means algorithm is an example of a clustering method based on partitioning.
  • With k as a parameter, the k-means algorithm divides n objects into k clusters so that the similarity within a cluster is high while the similarity between clusters is low.
  • The processing of the k-means algorithm is as follows: first, k objects are randomly selected, each initially representing the average or center of a cluster, that is, k initial centroids are selected; each of the remaining objects is then assigned to the nearest cluster according to its distance from each cluster center; the average of each cluster is then recalculated. This process is repeated until the criterion function converges, that is, until the centroids no longer change significantly.
  • the squared error criterion is used, and the sum of squared errors (SSE) is used as a global objective function, that is, the sum of the squares of the Euclidean distances from each point to the nearest centroid is minimized.
  • Step 1: Select K points as the initial centroids.
  • Step 2: Assign each point to the nearest centroid to form K clusters.
  • Step 3: Recalculate the centroid of each cluster.
  • Step 4: Repeat steps 2 and 3 until the centroids no longer change or the maximum number of iterations is reached.
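  • A minimal k-means sketch using scikit-learn (the data and the number of clusters are hypothetical):

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical operation data points; k = 2 assumes two users.
rng = np.random.default_rng(1)
user1_ops = rng.normal(loc=[0.4, 1.8], scale=0.05, size=(30, 2))
user2_ops = rng.normal(loc=[0.9, 3.2], scale=0.05, size=(30, 2))
X = np.vstack([user1_ops, user2_ops])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0)
labels = kmeans.fit_predict(X)     # repeats assignment/update steps until convergence
print(kmeans.cluster_centers_)     # final centroids
print(kmeans.inertia_)             # SSE: sum of squared distances to the nearest centroid
```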
  • The principle of hierarchical methods is to first calculate the distances between samples and combine the closest points into the same class each time; then calculate the distances between classes and merge the closest classes into a larger class; and continue merging until a single class is formed.
  • the calculation methods for the distance between the class and the class are: the shortest distance method, the longest distance method, the intermediate distance method, and the class average method.
  • The shortest distance method defines the distance between two classes as the shortest distance between samples of the two classes.
  • Hierarchical clustering algorithms are divided according to the order of hierarchical decomposition: from bottom to top or from top to bottom, that is, agglomerative hierarchical clustering and divisive hierarchical clustering, which can also be understood as bottom-up and top-down.
  • In the bottom-up approach, each object is initially a class of its own; similar classes are then found and merged according to the linkage, finally forming a single "class".
  • the top-down approach is the reverse.
  • the strategy of condensed hierarchical clustering is to first treat each object as a cluster, then merge the clusters into larger and larger clusters until all objects are in one cluster, or a certain termination condition is satisfied.
  • Most hierarchical clustering belongs to agglomerative hierarchical clustering, which differs only in the definition of similarity between clusters.
  • Step 1: Treat each object as a class and calculate the pairwise distances between every two classes.
  • Step 2: Merge the two classes with the smallest distance into one new class.
  • Step 3: Recalculate the distances between the new class and all other classes.
  • Step 4: Repeat steps 2 and 3 until all classes are finally merged into one class.
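  • A minimal sketch of agglomerative hierarchical clustering using scikit-learn (the data are hypothetical; the linkage parameter selects the inter-class distance method):

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering

# Hypothetical operation data points from two users.
rng = np.random.default_rng(2)
X = np.vstack([
    rng.normal(loc=[0.4, 1.8], scale=0.05, size=(15, 2)),
    rng.normal(loc=[0.9, 3.2], scale=0.05, size=(15, 2)),
])

# linkage="single" corresponds to the shortest distance method;
# "complete" and "average" correspond to the longest distance and class average methods.
agg = AgglomerativeClustering(n_clusters=2, linkage="single")
labels = agg.fit_predict(X)
print(labels)
```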
  • any one of the plurality of classes determined according to the clustering algorithm may correspond to one user, that is, data in one class may be considered as an operation performed by the corresponding user.
  • Through the above clustering, N pieces of training information can theoretically be grouped into a number of training information sets (or classes), where the scale of each class can be determined according to parameters used by the clustering algorithm, such as density. That is, M training information sets (or M classes) can theoretically be obtained according to the processing procedure described in the above S620, where N can be an integer greater than or equal to 1, M can be an integer greater than or equal to 1, and M is less than N.
  • the processor may determine the training information set #A corresponding to the user #A from the M training information sets.
  • the following method of determining the training information set #A can be cited.
  • When the density-based clustering method is adopted in S620, if the user #A is the owner of the terminal device, more training information is obtained based on the operations of the user #A; in this case, the one of the determined plurality of training information sets that contains the largest number of points (or pieces of training information) can be determined as the training information set #A.
  • Alternatively, when the density-based clustering method is adopted in S620, if the user #A is the owner of the terminal device and other users rarely operate the terminal device, there may be only one training information set after clustering; in this case, that unique training information set can be determined as the training information set #A.
  • Alternatively, the processor may acquire information on the frequency with which the user #A uses the terminal device and determine the training information set #A based on that information. For example, if the frequency-of-use information indicates that the user #A is the user who uses the terminal device most frequently among the plurality of users, the one of the determined training information sets that contains the largest number of points (or pieces of training information) can be determined as the training information set #A; conversely, if the information indicates that the user #A uses the terminal device least frequently, the one that contains the smallest number of points (or pieces of training information) may be determined as the training information set #A.
  • the information of the frequency of use may be input by the user to the terminal device, or the information of the frequency of use may also be provided by the server or the operator to the terminal device, and the invention is not particularly limited.
  • Alternatively, the processor can acquire feature information of the user #A and determine, as the training information set #A, the training information set whose feature information is similar to that of the user #A; that is, the similarity between the feature information of the training information set #A determined in this way and the feature information of the user #A is greater than or equal to a specified threshold #a, or the difference between the value of the feature information of the training information set #A and the value of the feature information of the user #A is less than or equal to a specified threshold #b.
  • The above-exemplified methods for determining the training information set #A are merely illustrative, and the present invention is not limited thereto. For example, in the case described above where the data that can be clustered into the class is marked as 1 and the data that cannot be clustered into the class is marked as 0, the data marked as 1 may also be determined as the data (or training information) of the training information set #A.
  • the processor may determine the decision model #A for the user #A based on some or all of the training information in the training information set #A.
  • Each piece of training information in the training information set #A may have values in a plurality of dimensions, so the processor may average the values of the same dimension across the plurality of pieces of training information to obtain reference information (that is, an example of a determination model), in which the value of the i-th dimension of the reference information is the average of the values of the i-th dimension of the plurality of pieces of training information.
  • Alternatively, the processor may determine the value that occurs most frequently among the values of the i-th dimension of the plurality of pieces of training information as the value of the i-th dimension of the reference information.
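  • A minimal sketch of deriving such reference information from a training information set, assuming each piece of training information is already a numeric vector (an illustration only, not the prescribed implementation):

```python
import numpy as np

def decision_model_mean(training_set: np.ndarray) -> np.ndarray:
    """Reference information whose i-th dimension is the average of the
    i-th dimension over all training information in the set."""
    return training_set.mean(axis=0)

def decision_model_mode(training_set: np.ndarray) -> np.ndarray:
    """Alternative: take, per dimension, the most frequently occurring value."""
    model = np.empty(training_set.shape[1])
    for i in range(training_set.shape[1]):
        values, counts = np.unique(training_set[:, i], return_counts=True)
        model[i] = values[np.argmax(counts)]
    return model

# Hypothetical training information set #A: rows are operations, columns are dimensions.
training_set_a = np.array([[0.42, 1.8, 0.12],
                           [0.40, 1.8, 0.11],
                           [0.43, 1.8, 0.13]])
print(decision_model_mean(training_set_a))
print(decision_model_mode(training_set_a))
```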
  • the decision model may be determined according to an Adaptive Boosting (Adaboost) algorithm.
  • Adaboost is an iterative algorithm whose core idea is to train different classifiers (weak classifiers) for the same training set, and then combine these weak classifiers to form a stronger final classifier (strong Classifier).
  • The algorithm itself is implemented by changing the data distribution: it determines the weight of each sample based on whether the classification of each sample in each training set is correct and on the accuracy of the previous overall classification. The new data set with the modified weights is sent to the next-level classifier for training, and the classifiers obtained from each round of training are finally combined as the final decision classifier.
  • Using the Adaboost classifier, unnecessary training data features can be eliminated and the training can be focused on the key training data.
  • The algorithm is essentially a boosting process for a simple weak classification algorithm, which can improve the classification ability of the data through continuous training.
  • the whole process is as follows:
  • First, a first weak classifier is obtained by learning from the N initial training samples; the misclassified samples are then combined with other new data to form a new set of N training samples, and a second weak classifier is obtained by learning from this set; this process is repeated to obtain further weak classifiers, which are finally combined into the strong classifier.
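  • A minimal sketch of training such a combined classifier with scikit-learn's AdaBoost implementation (the data and the owner/non-owner labels are hypothetical; the default weak learner is a depth-1 decision tree):

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

# Hypothetical training data: rows are operation feature vectors,
# y = 1 for operations labelled as the owner's, 0 otherwise.
rng = np.random.default_rng(3)
owner_ops = rng.normal(loc=[0.4, 1.8, 0.12], scale=0.05, size=(50, 3))
other_ops = rng.normal(loc=[0.7, 2.6, 0.25], scale=0.05, size=(50, 3))
X = np.vstack([owner_ops, other_ops])
y = np.array([1] * 50 + [0] * 50)

# AdaBoost reweights the samples after every round and combines the
# weak classifiers into a single strong classifier.
clf = AdaBoostClassifier(n_estimators=50, random_state=0)
clf.fit(X, y)
print(clf.predict([[0.41, 1.79, 0.13]]))   # expected to be classified as the owner (1)
```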
  • Since a user's operations are habitual, the same user may generate a large number of similar operations in the process of operating the terminal device. By clustering the plurality of pieces of training information determined according to the user operations detected by the terminal device in the first time period, the training information in the same training information set after clustering can be matched to the same user; further, the determination model generated based on the training information in a training information set can effectively determine whether the user performing an operation is the user corresponding to that training information set. Thus, no dedicated biometric identification device is required for user identification, and the cost of the terminal device can be reduced.
  • Moreover, since the user operations used to produce the training information do not need to be deliberately performed by the user, that is, no additional operation burden is imposed on the user in order to realize user identification, the user experience can be improved and the practicality of the user identification of the present application can be improved.
  • Optionally, the training information acquired in the above S610 may be generated based on user operations for a specified application (hereinafter, for ease of understanding and distinction, denoted as the application #A).
  • In this case, the processor can record the correspondence between the generated determination model (denoted as the determination model #A1) and the application #A. Therefore, in S650, when the terminal device detects a user operation, the application targeted by the user operation may be determined; for example, when it is determined that the user operation is an operation for the application #A, the determination model #A1 may be selected based on the correspondence, and it may be determined, based on the determination model #A1, whether the detected user operation is performed by the user #A.
  • Since a user's operations on a given application may be more habitual, for example, when the same user operates the same application (for example, a browsing application such as a news or e-book application), the similarity of the distance, direction, or angle between the multiple sliding operations performed by the user may be greater than the similarity of the distance, direction, or angle between sliding operations performed by the user when operating different applications (for example, a news browsing application versus a game).
  • the accuracy and reliability of the user identification of the present application can be further improved by the above process.
  • Optionally, the training information acquired in the above S610 may be generated based on user operations on a specified operation interface (hereinafter, for ease of understanding and distinction, denoted as the operation interface #A).
  • In this case, the processor can record the correspondence between the generated determination model (denoted as the determination model #A2) and the operation interface #A. Therefore, in S650, when the terminal device detects a user operation, the operation interface targeted by the user operation may be determined; for example, when it is determined that the user operation is an operation on the operation interface #A, the determination model #A2 may be selected based on the correspondence, and it may be determined, based on the determination model #A2, whether the detected user operation is performed by the user #A.
  • Since a user's operations on a given operation interface may be more habitual, when the same user operates the same operation interface, the similarity of the distance, direction, or angle between the plurality of sliding operations performed by the user may be greater than the similarity of the distance, direction, or angle between sliding operations performed by the user when operating different operation interfaces (e.g., a text reading interface versus a game interface).
  • the accuracy and reliability of the user identification of the present application can be further improved by the above process.
  • Optionally, the training information acquired in the above S610 may be generated based on user operations of a prescribed type (hereinafter, for ease of understanding and distinction, denoted as the type #A).
  • In this case, the processor can record the correspondence between the generated determination model (denoted as the determination model #A3) and the type #A. Therefore, in S650, when the terminal device detects a user operation, the type of the user operation may be determined; for example, when it is determined that the user operation is of the type #A, the determination model #A3 may be selected based on the correspondence, and it may be determined, based on the determination model #A3, whether the detected user operation is performed by the user #A.
  • Since a user's operations of the same type may be more habitual, the similarity between the parameters (for example, the touch force) of operations of the same type (for example, sliding operations) performed by the same user may be greater than the similarity between the parameters (for example, the touch force) of operations of different types (for example, a sliding operation and a click operation).
  • FIG. 7 illustrates an exemplary illustration of an example of a method 700 of user identification of the present application, for example, the method 700 can be performed in conjunction with the terminal device 100 and a server described above.
  • the server may be a computing device or the like, and the server may implement a communication connection with the terminal device via, for example, the Internet.
  • the terminal device may determine the training information based on the user operation, and the process may be similar to the process described in the above S610.
  • To avoid redundancy, a detailed description thereof is omitted here.
  • the terminal device may transmit the obtained training information to the server.
  • the server may perform clustering processing on the training information to determine at least one training information set, where the process may be similar to the process described in the above S620.
  • To avoid redundancy, a detailed description thereof is omitted here.
  • the server may determine the training information set #A from the determined at least one training information set, wherein the process may be similar to the process described in the above S630, and a detailed description thereof is omitted herein to avoid redundancy.
  • the server may determine the determination model #A based on the training information set #A, wherein the process may be similar to the process described in the above S640, and a detailed description thereof is omitted here to avoid redundancy.
  • the server may transmit the decision model #A to the terminal device.
  • The determination model generated in the foregoing manner may be used to determine whether the user operating the mobile phone is the owner; that is, the training information obtained in the foregoing S610 or S710 may include training information generated by the owner's operations. For example, when a density-based clustering method is adopted, the number of collected samples may be increased so as to raise the proportion of training information generated by the owner's operations and ensure that this training information is grouped into one cluster. In S630 or S730, the training information set corresponding to the owner's operations may then be determined according to the owner's information, so that in S640 or S740 a determination model for deciding whether the operating user is the owner can be generated based on the clustered training information set; further, based on the determined determination model, it can be decided whether a given user operation was performed by the owner. One way S640/S740 might turn that set into a model is sketched below.
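```python
# Minimal sketch of building a determination model from the owner's clustered
# training information, following the earlier description of averaging each
# dimension. Treating the tolerance as a multiple of the per-dimension standard
# deviation is an added assumption for illustration only.
import numpy as np

def build_owner_model(owner_cluster, k=2.0):
    """owner_cluster: (n_samples, n_features) array of the owner's training information."""
    samples = np.asarray(owner_cluster, dtype=float)
    reference = samples.mean(axis=0)            # each dimension = mean over the cluster
    tolerance = k * samples.std(axis=0) + 1e-9  # assumed band around the reference
    return reference, tolerance

def matches_owner(features, model, min_ratio=0.8):
    reference, tolerance = model
    within = np.abs(np.asarray(features, dtype=float) - reference) <= tolerance
    return within.mean() >= min_ratio
```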
  • FIG. 8 is a schematic block diagram of an apparatus 800 for managing a terminal device according to an embodiment of the present application.
  • the apparatus 800 includes:
  • the detecting unit 810 is configured to acquire operation information corresponding to the first operation, where the operation information includes touch information and/or posture information of the terminal device;
  • The processing unit 820 is configured to manage the terminal device according to the matching degree between the operation information corresponding to the first operation and the first determination model, where the first determination model is determined based on operation information of operations performed by the first user.
  • The touch information includes at least one of the following: information of the strength of the touch operation, information of the position of the touch operation, information of the contact area of the touch operation, information of the contact time of the touch operation, information of the sliding angle of the touch operation, information of the sliding direction of the touch operation, and information of the sliding distance of the touch operation.
  • The first user includes the owner of the terminal device.
  • The processing unit 820 is specifically configured to: when the matching degree between the operation information corresponding to the first operation and the first determination model is higher than a preset first threshold, perform the processing corresponding to the first operation.
  • For example, when the first operation is an operation on a picture (for example, an operation of deleting the picture), and the matching degree between the operation information corresponding to the first operation and the first determination model is determined to be higher than the preset first threshold, the picture may be processed based on the operation (for example, deleted).
  • The processing unit 820 is specifically configured to: when the matching degree between the operation information corresponding to the first operation and the first determination model is higher than a preset first threshold, unlock the first application.
  • For example, when the first operation is a pattern unlock operation, and the matching degree between the operation information corresponding to the first operation and the first determination model is determined to be higher than the preset first threshold, unlocking may be allowed provided the pattern itself is drawn correctly.
  • The processing unit 820 is specifically configured to: when the matching degree between the operation information corresponding to the first operation and the first determination model is lower than a preset first threshold, prohibit the processing corresponding to the first operation.
  • For example, when the first operation is an operation on a picture (for example, an operation of deleting the picture), and the matching degree between the operation information corresponding to the first operation and the first determination model is determined to be lower than the preset first threshold, processing the picture based on the operation may be prohibited (for example, deletion of the picture may be prohibited).
  • The processing unit 820 is configured to: when the matching degree between the operation information corresponding to the first operation and the first determination model is lower than a preset first threshold, switch the interface currently displayed by the terminal device to the lock-screen interface.
  • The processing unit 820 is configured to: when the matching degree between the operation information corresponding to the first operation and the first determination model is lower than a preset first threshold, play a preset alarm signal.
  • The processing unit 820 is specifically configured to: when the matching degree between the operation information corresponding to the first operation and the first determination model is lower than a preset first threshold, lock the first application. A combined sketch of computing the matching degree and dispatching these actions is given below.
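```python
# Illustrative dispatch step for the branches above. The matching degree is
# computed as the fraction of feature dimensions lying within the model's
# tolerance (one reading of the earlier description); the threshold value and
# the device action names are assumptions for the example only.

def matching_degree(features, model):
    reference, tolerance = model
    within = sum(abs(f - r) <= t for f, r, t in zip(features, reference, tolerance))
    return within / len(reference)

def handle_first_operation(features, model, device, first_threshold=0.8):
    if matching_degree(features, model) >= first_threshold:
        device.execute_pending_operation()    # e.g. delete the picture, unlock the app
    else:
        device.prohibit_pending_operation()   # and, depending on policy, one or more of:
        device.switch_to_lock_screen()
        device.play_alarm()
        device.lock_application("payments")
```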
  • the first operation may be an operation for the first application, for example, an operation on an interface of the first application.
  • Alternatively, the first operation may be an operation performed before the first application is started, or an operation performed while the first application is running in the background.
  • The first operation is an operation detected before the second operation for starting the first application is detected, and the processing unit 820 is specifically configured to: when the detecting unit 810 detects the second operation, start the first application without displaying an unlock interface.
  • The first operation is an operation detected before the second operation for starting the first application is detected, and the processing unit 820 is specifically configured to: when the detecting unit 810 detects the second operation, display an unlock interface.
  • The first operation is an operation detected before the second operation for starting the first application is detected, and the processing unit 820 is specifically configured to: prohibit starting the first application.
  • the first operation is an operation for unlocking the first application.
  • The first application includes at least one of the following applications: the application operated by the first operation, an application preset by the owner of the terminal device, and an application preset by the manufacturer of the terminal device.
  • The processing unit 820 is specifically configured to: when the matching degree between the operation information corresponding to the first operation and the first determination model is lower than a preset first threshold, determine a second determination model from a plurality of determination models according to the operation information corresponding to the first operation, where the matching degree between the second determination model and the operation information corresponding to the first operation is higher than a preset second threshold, or the second determination model is, among the plurality of determination models, the determination model whose matching degree with the operation information corresponding to the first operation is the largest; the plurality of determination models are in one-to-one correspondence with a plurality of users, and each determination model is determined based on operation information of operations performed by the corresponding user; and manage the terminal device according to the user rights of the user corresponding to the second determination model. A sketch of this fallback selection among per-user models is given below.
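```python
# Sketch of the fallback described above: when the owner model does not match,
# compare against other users' models and manage the device by that user's
# permissions. The tables and helper names are assumptions for illustration.

USER_MODELS = {   # user id -> determination model (reference vector, per-dimension tolerance)
    "owner": ([0.42, 180.0], [0.10, 40.0]),
    "child": ([0.20,  90.0], [0.08, 30.0]),
}
PERMISSIONS = {   # user id -> applications that user is allowed to operate
    "owner": {"gallery", "payments", "chat"},
    "child": {"gallery"},
}

def _degree(features, model):
    reference, tolerance = model
    within = sum(abs(f - r) <= t for f, r, t in zip(features, reference, tolerance))
    return within / len(reference)

def identify_user(features, second_threshold=0.7):
    """Return the user whose model matches best, if that match clears the second threshold."""
    user, best = max(((u, _degree(features, m)) for u, m in USER_MODELS.items()),
                     key=lambda item: item[1])
    return user if best >= second_threshold else None

def may_operate(features, app):
    user = identify_user(features)
    return user is not None and app in PERMISSIONS[user]
```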
  • The processing unit 820 is specifically configured to: determine a plurality of pieces of training information according to user operations detected by the detecting unit 810 in a first time period, where the user operations include operations performed by a plurality of users, and the training information includes touch operation information of the user operations and/or posture information of the terminal device under the user operations; perform clustering processing on the plurality of pieces of training information to determine at least one training information set; determine, according to information about a second user among the plurality of users, a first training information set corresponding to the second user from the at least one training information set; and determine a determination model for the second user according to the training information in the first training information set.
  • The processing unit 820 is specifically configured to perform clustering processing on the plurality of pieces of training information based on a preset third threshold, where the density of the training information in each training information set is greater than or equal to the third threshold; and determining, according to the information about the second user among the plurality of users, the first training information set corresponding to the second user from the plurality of training information sets includes: when the information about the second user indicates that the second user is the owner of the terminal device, determining the training information set with the highest density of training information among the plurality of training information sets as the first training information set.
  • The processing unit 820 is specifically configured to: perform clustering processing on the plurality of pieces of training information according to the Ordering Points To Identify the Clustering Structure (OPTICS) algorithm. A hedged sketch using an off-the-shelf OPTICS implementation follows.
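```python
# Illustration only: scikit-learn's OPTICS is used for convenience; the
# application does not mandate any particular library. Taking the most
# populated cluster as the owner's training information set follows the
# density-based heuristic described earlier.
import numpy as np
from sklearn.cluster import OPTICS

def cluster_training_info(samples, min_samples=20):
    """samples: (n, d) array, one row per user operation (touch + posture features)."""
    labels = OPTICS(min_samples=min_samples).fit_predict(np.asarray(samples, dtype=float))
    clusters = [np.flatnonzero(labels == c) for c in set(labels) if c != -1]  # -1 = noise
    if not clusters:
        return None
    owner = max(clusters, key=len)  # heuristic: owner's operations dominate the data
    return owner                    # indices of the training information attributed to the owner
```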
  • the terminal device has at least two operation modes, wherein the terminal device needs to identify whether the user is the owner in the first operation mode, and the terminal device does not need to identify whether the user is the owner in the second operation mode.
  • The processing unit 820 is specifically configured to: before the terminal device is managed according to the matching degree between the operation information corresponding to the first operation and the first determination model, determine that the current operation mode of the terminal device is the first operation mode.
  • The apparatus 800 for managing a terminal device may correspond to the terminal device described in the foregoing method 200, and each module or unit in the apparatus 800 is configured to perform the corresponding actions and processing performed by the terminal device in the foregoing method 200; to avoid repetition, a detailed description thereof is omitted here.
  • the disclosed systems, devices, and methods may be implemented in other manners.
  • the device embodiments described above are merely illustrative.
  • For example, the division of the units is merely a logical function division; in actual implementation there may be other division manners, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, device or unit, and may be in an electrical, mechanical or other form.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the functions may be stored in a computer readable storage medium if implemented in the form of a software functional unit and sold or used as a standalone product.
  • Based on such an understanding, the technical solutions of the present application essentially, or the part contributing to the prior art, or a part of the technical solutions, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of the present application.
  • The foregoing storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Multimedia (AREA)
  • Technology Law (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

本申请提供了一种获取第一操作对应的操作信息,其中,该操作信息包括触摸信息和/或该终端设备的姿态信息;根据该第一操作对应的操作信息和第一判定模型的匹配程度,管理该终端设备,其中,该第一判定模型是基于第一用户所进行的操作的操作信息确定的,根据本申请的管理终端设备的方法,由于用户的操作具有习惯性,因此,同一用户(即,第一用户)在对终端设备操作的过程中可能产生大量的相似的操作,通过使用基于第一用户的多个操作进行训练并获得判定模型,能够基于该判定模型判定某一操作是否为该第一用户进行的,从而,能够提高终端设备的使用安全性。

Description

管理终端设备方法和终端设备
本申请要求于2018年03月28日提交中国专利局、申请号为201810264881.6、申请名称为“管理终端设备方法和装置”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及人机交互领域,并且更具体地,涉及管理终端设备的方法和终端设备。
背景技术
目前,已知一种技术,在终端设备启动某个应用程序或进入某个操作界面之前,可以利用例如,密码(例如,数字密码或手势密码)输入等方式进行验证,并在验证通过后允许启动应用程序或允许进入操作界面。
但是,上述技术中存在密码泄露的可能,例如,机主的家庭成员(例如,机主的孩子)可能获知机主设置的密码,从而能够对终端设备进行操作,严重降低了终端设备的安全性。
因此,希望提供一种技术,能够提高终端设备的安全性。
发明内容
本申请提供一种管理终端设备的方法和装置,能够提高终端设备的安全性。
第一方面,提供了一种管理终端设备的方法,包括:获取第一操作对应的操作信息,其中,该操作信息包括触摸信息和/或该终端设备的姿态信息;根据该第一操作对应的操作信息和第一判定模型的匹配程度,管理该终端设备,其中,该第一判定模型是基于第一用户所进行的操作的操作信息确定的。
根据本申请的管理终端设备的方法,由于用户的操作具有习惯性,因此,同一用户(即,第一用户)在对终端设备操作的过程中可能产生大量的相似的操作,通过使用基于第一用户的多个操作进行训练并获得判定模型,能够基于该判定模型判定某一操作是否为该第一用户进行的,从而,能够提高终端设备的使用安全性。
可选地,该触摸信息包括以下至少一种信息:触摸操作的力度的信息、触摸操作的位置的信息、触摸操作的接触面积的信息、触摸操作的接触时间的信息、触摸操作的滑动角度的信息、触摸操作的滑动方向的信息、触摸操作的滑动距离的信息。
可选地,该第一用户包括该终端设备的机主。
可选地,根据该第一操作对应的操作信息和第一判定模型的匹配程度,管理该终端设备包括:当根据该第一操作对应的操作信息与第一判定模型的匹配程度高于预设的第一阈值时,执行该第一操作对应的处理。
例如,当该第一操作是对图片的操作(例如,删除图片的操作)时,如果判定为根据该第一操作对应的操作信息与第一判定模型的匹配程度高于预设的第一阈值,则可以基于 该操作对图像进行处理(例如,删除图片)。
可选地,根据该第一操作对应的操作信息和第一判定模型的匹配程度,管理该终端设备包括:当根据该第一操作对应的操作信息与第一判定模型的匹配程度高于预设的第一阈值时,解锁第一应用。
例如,当该第一操作是图案解锁操作时,如果判定为根据该第一操作对应的操作信息与第一判定模型的匹配程度高于预设的第一阈值,则可以在图案解锁正确时,确定各位可以解锁。
可选地,根据该第一操作对应的操作信息和第一判定模型的匹配程度,管理该终端设备包括:当根据该第一操作对应的操作信息与第一判定模型的匹配程度低于预设的第一阈值时,禁止执行该第一操作对应的处理。
例如,当该第一操作是对图片的操作(例如,删除图片的操作)时,如果判定为根据该第一操作对应的操作信息与第一判定模型的匹配程度低于预设的第一阈值,则可以禁止基于该操作对图像进行处理(例如,禁止删除图片)。
可选地,根据该第一操作对应的操作信息和第一判定模型的匹配程度,管理该终端设备包括:当根据该第一操作对应的操作信息与第一判定模型的匹配程度低于预设的第一阈值时,将该终端设备当前显示的界面切换至锁屏界面。
可选地,根据该第一操作对应的操作信息和第一判定模型的匹配程度,管理该终端设备包括:当根据该第一操作对应的操作信息与第一判定模型的匹配程度低于预设的第一阈值时,播放预设的警报信号。
可选地,根据该第一操作对应的操作信息和第一判定模型的匹配程度,管理该终端设备包括:当根据该第一操作对应的操作信息与第一判定模型的匹配程度低于预设的第一阈值时,锁定第一应用。
其中,该第一操作可以是针对第一应用的操作,例如,在第一应用的界面上的操作。
或者,该第一操作可以是在启动第一应用之前的操作,或者,该第一操作可以是第一应用在后台运行期间的操作。
可选地,该第一操作是在检测到用于启动该第一应用的第二操作之前所检测到的操作,以及该解锁第一应用,包括:在检测到该第二操作时,不显示解锁界面,并启动该第一应用。
可选地,该第一操作是在检测到用于启动该第一应用的第二操作之前所检测到的操作,以及该锁定第一应用,包括:在检测到该第二操作时,显示解锁界面。
可选地,该第一操作是在检测到用于启动该第一应用的第二操作之前所检测到的操作,以及禁止启动该第一应用。
可选地,该第一操作是用于解锁该第一应用的操作。
可选地,该第一应用包括以下至少一种应用:该第一操作所操作的应用、该终端设备的机主预先设置的应用和该终端设备的制造商预先设置的应用。
可选地,根据该第一操作对应的操作信息和第一判定模型的匹配程度,管理该终端设备包括:当根据该第一操作对应的操作信息与第一判定模型的匹配程度低于预设的第一阈值时,根据第一操作对应的操作信息,从多个判定模型中确定第二判定模型,该第二判定模型与该第一操作对应的操作信息的匹配程度高于预设的第二阈值,或该第二判定模型是 该多个判定模型中与该第一操作对应的操作信息的匹配程度最大的判定模型,其中,该多个判定模型与多个用户一一对应,每个判定模型是基于所对应的用户所进行的操作的操作信息确定的;根据该第二判定模型对应的用户的用户权限,管理该终端设备。
可选地,该方法还包括:根据第一时段内检测到的用户操作,确定多个训练信息,该用户操作包括多个用户进行的操作,该训练信息包括该用户操作的触摸操作信息和/或该终端设备在该用户操作下的姿态信息;对该多个训练信息进行聚类处理,以确定至少一个训练信息集合;根据该多个用户中的第二用户的信息,从该至少一个训练信息集合中,确定与该第二用户相对应的第一训练信息集合;根据该第一训练信息集合中的训练信息,确定针对该第二用户的判定模型。
可选地,该对该多个训练信息进行聚类处理,包括:基于预设的第三阈值,对该多个训练信息进行聚类处理,其中,每个训练信息集合中的训练信息的密度大于或等于该第三阈值;该根据该多个用户中的第二用户的信息,从该多个训练信息集合中,确定与该第二用户相对应的第一训练信息集合,包括:当该第二用户的信息指示该第二用户为该终端设备的机主时,将该多个训练信息集合中训练信息的密度最大的训练信息集合确定为该第一训练信息集合。
可选地,该对该多个训练信息进行聚类处理,包括:根据对象排序识别聚类结构OPTICS算法,对该多个训练信息进行聚类处理。
可选地,该终端设备具有至少两种操作模式,其中,在第一操作模式下该终端设备需要识别用户是否为机主,在第二操作模式下该终端设备不需要识别用户是否为机主,以及在根据该第一操作对应的操作信息和第一判定模型的匹配程度之前,该方法还包括:确定该终端设备当前的操作模式为该第一操作模式。
第二方面,提供了一种管理终端设备的方法,包括:显示第一界面;接收用户的第一操作并获取所述第一操作对应的操作信息,其中,所述操作信息包括触摸信息和/或所述终端设备的姿态信息;响应于所述第一操作对应的操作信息为第一类型,显示第二界面,所述第二界面不同于所述第一界面;及响应于所述第一操作对应的操作信息为第二类型,所述第二类型不同于所述第一类型,显示第三界面。
其中,所述第三界面包括用户验证界面,所述第二界面是验证通过后的界面。
这里“所述第一操作对应的操作信息为第一类型”可以是指所述第一操作对应的操作信息与第一判定模型的匹配程度高于预设的第一阈值。
“所述第一操作对应的操作信息为第二类型”可以是指所述第一操作对应的操作信息与第一判定模型的匹配程度低于预设的第一阈值。
其中,该第一判定模型可以是基于所述终端设备的机主所进行的操作的操作信息确定的。
可选地,所述触摸信息包括以下至少一种信息:触摸操作的力度的信息、触摸操作的位置的信息、触摸操作的接触面积的信息、触摸操作的接触时间的信息、触摸操作的滑动角度的信息、触摸操作的滑动方向的信息、触摸操作的滑动距离的信息。
可选地,所述第二界面为应用程序的界面。
其中,该应用程序可以是第一界面所属于的应用程序。
或者,该应用程序可以是第一操作所操作的应用程序。
可选地,该方法进一步包括:在显示第二界面之前,接收用户的第二操作;响应于所述第二操作及所述第一操作对应的操作信息为所述第一类型,所述第二界面被显示;及响应于所述第二操作及所述第一操作对应的操作信息为所述第二类型,所述第三界面被显示。
这里“所述第二操及所述第一操作作对应的操作信息为第一类型”可以是指所述第二操作对应的操作信息与第一判定模型的匹配程度高于预设的第一阈值,且所述第一操作对应的操作信息与第一判定模型的匹配程度高于预设的第一阈值。
“所述第一操作及所述第一操作对应的操作信息为第二类型”可以是指所述第操作对应的操作信息与第一判定模型的匹配程度低于预设的第一阈值,和/或所述第一操作对应的操作信息与第一判定模型的匹配程度低于预设的第一阈值。
可选地,该方法进一步包括:当显示第三界面时,接收用户的第三操作并获取所述第三操作对应的操作信息;响应于所述第三操作的操作信息为第三类型,显示所述第二界面。
其中,“所述第三操作的操作信息为第三类型”可以是指:所述第三操作对应的操作信息与第一判定模型的匹配程度高于预设的第一阈值,且所述第三操作对应的解锁信息满足解锁条件。
第三方面,提供了一种用户识别的方法,包括:根据第一时段内检测到的用户操作,确定多个训练信息,该用户操作包括多个用户进行的操作,该训练信息包括该用户操作的触摸操作信息和/或该终端设备在该用户操作下的姿态信息;对该多个训练信息进行聚类处理,以确定至少一个训练信息集合;根据该多个用户中的第一用户的信息,从该至少一个训练信息集合中,确定与该第一用户相对应的第一训练信息集合;根据该第一训练信息集合中的训练信息,确定针对该第一用户的判定模型;当在第二时段检测到第一操作时,根据该判定模型,确定执行该第一操作的用户是否为该第一用户。
根据本申请的用户识别的方法,由于用户的操作具有习惯性,因此,同一用户在对终端设备操作的过程中可能产生大量的相似的操作,通过使终端设备对根据在第一时段内检测到的用户操作确定的多个训练信息进行聚类,能够使聚类后的同一训练信息集合中的训练信息对应同一用户,进而,基于该训练信息集合中的训练信息所生成的判定模型能够有效判定某一操作的用户是否为该训练信息集合所对应的用户,从而,无需为实现用户识别,额外配置生物特征识别设备,能够降低终端设备的成本。并且,由于生产训练信息的用户操作无需用户刻意进行,或者说,无需为了实现用户识别额外增加使用者的操作负担,因此,能够改善用户体验,提高本生情的用户识别的实用性。
可选地,该对该多个训练信息进行聚类处理,包括:基于预设的第一阈值,对该多个训练信息进行聚类处理,其中,每个训练信息集合中的训练信息的密度大于或等于该第一阈值;该根据该多个用户中的第一用户的信息,从该多个训练信息集合中,确定与该第一用户相对应的第一训练信息集合,包括:当该第一用户的信息指示该第一用户为该终端设备的机主时,将该多个训练信息集合中训练信息的密度最大的训练信息集合确定为该第一训练信息集合。
根据本申请的用户识别的方法,由于机主对终端设备操作的频率较大,因此,机主的操作较多,通过采用基于密度的聚类算法,可以将密度最大的训练信息集合确定为用于训练识别机主的判定模型的训练信息集合,从而,能够容易地实现对机主的识别。
可选地,根据对象排序识别聚类结构OPTICS算法,对该多个训练信息进行聚类处理。
可选地,该对该多个训练信息进行聚类处理,以确定多个训练信息集合,包括:对该多个训练信息进行聚类处理,以确定多个训练信息集合以及每个训练信息集合对应的特征信息,其中,该多个训练信息集合与该多个用户一一对应,以及每个训练信息集合的特征信息;该根据该多个用户中的第一用户的信息,从该多个训练信息集合中,确定与该第一用户相对应的第一训练信息集合,包括:当该第一用户的信息为该第一用户的特征信息时,将该多个训练信息集合中特征信息与该第一用户的特征信息的相似度满足第二预设条件的训练信息集合,确定为该第一训练信息集合。
根据本申请的用户识别的方法,通过聚类处理,能够确定每个训练信息集合对应的特征信息,从而,在能够获取多个用户的特征信息的情况下,能够实现针对多个用户的判定模型的确定,从而,能够提高本申请的用户识别方法的实用性。
可选地,该对该多个训练信息进行聚类处理,包括:根据K均值算法,对该多个训练信息进行聚类处理。
可选地,该根据第一时段内检测到的用户操作,确定多个训练信息,包括:根据第一时段内检测到的针对第一应用的用户操作,确定该多个训练信息;以及该当在第二时段检测到第一操作时,根据该判定模型,确定执行该第一操作的用户是否为该第一用户,包括:当在第二时段检测到针对该第一应用的第一操作时,根据该判定模型,确定执行该第一操作的用户是否为该第一用户。
由于同一用户对同一应用的操作的相似度可能较大,因此,通过根据针对第一应用的操作确定的训练信息确定判定模型,并使用该判定模型判定针对第一应用的操作的用户,能够增大本申请的用户识别的方法的识别准确性和可靠性。
可选地,该根据第一时段内检测到的用户操作,确定多个训练信息,包括:根据第一时段内检测到的针对第一操作界面的用户操作,确定多个训练信息;以及该当在第二时段检测到第一操作时,根据该判定模型,确定执行该第一操作的用户是否为该第一用户,包括:当在第二时段检测到针对该第一操作界面的第一操作时,根据该判定模型,确定执行该第一操作的用户是否为该第一用户。
由于同一用户对同一操作界面的操作的相似度可能较大,因此,通过根据针对第一操作界面的操作确定的训练信息确定判定模型,并使用该判定模型判定针对第一操作界面的操作的用户,能够增大本申请的用户识别的方法的识别准确性和可靠性。
可选地,该根据第一时段内检测到的用户操作,确定多个训练信息,包括:根据第一时段内检测到的第一操作类型的用户操作,确定多个训练信息;以及该当在第二时段检测到第一操作时,根据该判定模型,确定执行该第一操作的用户是否为该第一用户,包括:当在第二时段检测到该第一操作类型的第一操作时,根据该判定模型,确定执行该第一操作的用户是否为该第一用户。
可选地,该第一操作类型包括滑动操作类型、点击操作类型或长按操作类型。
由于同一用户的同一操作类型的操作的相似度可能较大,因此,通过根据第一操作类型的操作确定的训练信息确定判定模型,并使用该判定模型判定针对第一操作类型的操作的用户,能够增大本申请的用户识别的方法的识别准确性和可靠性。
可选地,该触摸操作信息包括以下至少一种信息:触摸操作的力度的信息、触摸操作 的位置的信息、触摸操作的接触面积的信息、触摸操作的接触时间的信息、触摸操作的滑动角度的信息、触摸操作的滑动方向的信息、触摸操作的滑动距离的信息。
可选地,当该第一用户为机主时,在根据该判定模型和该第一操作的信息,确定执行该第一操作的用户不为该第一用户之后,该方法还包括:禁用该终端设备中规定的目标应用;或执行锁屏处理;或执行告警处理。
从而,能够提高终端设备的安全性。
可选地,该终端设备具有至少两种操作模式,其中,在第一操作模式下该终端设备需要识别用户是否为机主,在第二操作模式下该终端设备不需要识别用户是否为机主,以及在根据该判定模型和该第一操作的信息,确定执行该第一操作的用户是否为该第一用户之前,该方法还包括:确定该终端设备当前的操作模式为该第一操作模式。
从而,能够提高本申请的用户识别的方法的灵活性和实用性。
第四方面,提供一种用户识别的方法,包括:接收终端设备发送的多个训练信息,该训练信息是基于该终端设备在第一时段内检测到的用户操作确定的,其中,该用户操作包括多个用户进行的操作,该训练信息包括该用户操作的触摸操作信息和/或该终端设备在该用户操作下的姿态信息;对该多个训练信息进行聚类处理,以确定至少一个训练信息集合;根据该多个用户中的第一用户的信息,从该至少一个训练信息集合中,确定与该第一用户相对应的第一训练信息集合;根据该第一训练信息集合中的训练信息,确定针对该第一用户的判定模型;向该终端设备发送该判定模型,以便于该终端设备在第二时段检测到第一操作时,根据该判定模型,确定执行该第一操作的用户是否为该第一用户。
根据本申请的用户识别的方法,由于用户的操作具有习惯性,因此,同一用户在对终端设备操作的过程中可能产生大量的相似的操作,通过使终端设备对根据在第一时段内检测到的用户操作确定的多个训练信息进行聚类,能够使聚类后的同一训练信息集合中的训练信息对应同一用户,进而,基于该训练信息集合中的训练信息所生成的判定模型能够有效判定某一操作的用户是否为该训练信息集合所对应的用户,从而,无需为实现用户识别,额外配置生物特征识别设备,能够降低终端设备的成本。并且,由于生产训练信息的用户操作无需用户刻意进行,或者说,无需为了实现用户识别额外增加使用者的操作负担,因此,能够改善用户体验,提高本生情的用户识别的实用性。并且,由于确定判定模型的过程由服务器执行,能够降低对终端设备的处理性能的要求,并减轻终端设备的处理负担,从而,能够进一步提高本申请的用户识别的方法的实用性。
可选地,该对该多个训练信息进行聚类处理,包括:基于预设的第一阈值,对该多个训练信息进行聚类处理,其中,每个训练信息集合中的训练信息的密度大于或等于该第一阈值;该根据该多个用户中的第一用户的信息,从该多个训练信息集合中,确定与该第一用户相对应的第一训练信息集合,包括:当该第一用户的信息指示该第一用户为该终端设备的机主时,将该多个训练信息集合中训练信息的密度最大的训练信息集合确定为该第一训练信息集合。
可选地,该对该多个训练信息进行聚类处理,以确定多个训练信息集合,包括:对该多个训练信息进行聚类处理,以确定多个训练信息集合以及每个训练信息集合对应的特征信息,其中,该多个训练信息集合与该多个用户一一对应,以及每个训练信息集合的特征信息;该根据该多个用户中的第一用户的信息,从该多个训练信息集合中,确定与该第一 用户相对应的第一训练信息集合,包括:当该第一用户的信息为该第一用户的特征信息时,将该多个训练信息集合中特征信息与该第一用户的特征信息的相似度满足第二预设条件的训练信息集合,确定为该第一训练信息集合。
可选地,该接收终端设备发送的多个训练信息,包括:接收该终端设备发送的多个训练信息和第一指示,其中,该第一指示信息用于指示该多个训练信息是根据该终端设备在该第一时段内检测到的针对该第一应用的用户操作确定的;以及该向该终端设备发送该判定模型包括:向该终端设备发送该判定模型和第二指示信息,该第二指示信息用于指示该判定模型具体用于判定进行针对该第一应用的操作的用户是否为该第一用户。
可选地,该接收终端设备发送的多个训练信息,包括:接收该终端设备发送的多个训练信息和第三指示信息,其中,该第三指示信息用于指示该多个训练信息是根据该终端设备在该第一时段内检测到的针对该第一操作界面的用户操作确定的;以及该向该终端设备发送该判定模型包括:向该终端设备发送该判定模型和第四指示信息,该第四指示信息用于指示该判定模型具体用于判定进行针对该第一操作界面的操作的用户是否为该第一用户。
可选地,该接收终端设备发送的多个训练信息,包括:接收该终端设备发送的多个训练信息和第五指示信息,其中,该第五指示信息用于指示该多个训练信息是根据该终端设备在该第一时段内检测到的该第一操作类型的用户操作确定的;以及该向该终端设备发送该判定模型包括:向该终端设备发送该判定模型和第六指示信息,该第六指示信息用于指示该判定模型具体用于判定进行针对该第一操作类型的操作的用户是否为该第一用户。
可选地,该第一操作类型包括滑动操作类型、点击操作类型或长按操作类型。
可选地,该触摸操作信息包括以下至少一种信息:触摸操作的力度的信息、触摸操作的位置的信息、触摸操作的接触面积的信息、触摸操作的接触时间的信息、触摸操作的滑动角度的信息、触摸操作的滑动方向的信息、触摸操作的滑动距离的信息。
第五方面,提供了一种管理终端设备的装置,包括用于执行上述第一方面至第三方面中的任一方面及其各实现方式中的方法的各步骤的单元。
其中,该管理终端设备的装置可以配置在终端设备中。此情况下,该管理终端设备的装置中的检测单元可以使用终端设备中的传感器来实现。该用户识别的装置中的处理单元可以独立于终端设备中的处理器。或者,该管理终端设备的装置中的处理单元可以使用终端设备中的处理器来实现。
或者,该管理终端设备的装置本身即可以为终端设备。
第六方面,提供了一种用户识别的装置,包括用于执行上述第四方面及其各实现方式中的方法的各步骤的单元
其中,该用户识别的装置可以配置在能够与终端设备通信的服务器中。此情况下,该用户识别的装置中的处理单元可以独立于服务器中的处理器。或者,该用户识别的装置中的处理单元可以使用服务器中的处理器来实现。
或者,该用户识别的装置本身即可以为服务器。
第七方面,提供了一种终端设备,包括,传感器,处理器,存储器,该传感器用于检测用户操作和/或终端设备的姿态,该存储器用于存储计算机程序,该处理器用于从存储器中调用并运行该计算机程序,使得该通信设备执行第一方面至第三方面中的任一方面及 其各种可能实现方式中的方法。
可选地,所述处理器为一个或多个,所述存储器为一个或多个。
可选地,所述存储器可以与所述处理器集成在一起,或者所述存储器与处理器分离设置。
可选地,该传感器可以为一个或多个,该一个或多个传感器可以共同检测同一参量,或者,不同的传感器可以用于检测不同的参量。
可选地,该终端设备还可以包括,发射机(发射器)和接收机(接收器)。
第八方面,提供了一种服务器,包括,处理器,存储器,发射机(发射器)和接收机(接收器),该存储器用于存储计算机程序,该处理器用于从存储器中调用并运行该计算机程序,使得该通信设备执行第四方面及其各种实现方式中的用户识别的方法。
可选地,所述处理器为一个或多个,所述存储器为一个或多个。
可选地,所述存储器可以与所述处理器集成在一起,或者所述存储器与处理器分离设置。
第九方面,提供了一种计算机程序产品,所述计算机程序产品包括:计算机程序(也可以称为代码,或指令),当所述计算机程序被运行时,使得计算机执行上述第一方面至第四方面中的任一方面及其各种实现方式中的方法。
第十方面,提供了一种计算机可读介质,所述计算机可读介质存储有计算机程序(也可以称为代码,或指令)当其在计算机上运行时,使得计算机执行上述第一方面至第四方面中的任一方面及其各种实现方式中的方法。
第十一方面,提供了一种芯片系统,包括存储器和处理器,该存储器用于存储计算机程序,该处理器用于从存储器中调用并运行该计算机程序,使得安装有该芯片系统的设备执行上述第一方面至第四方面中的任一方面及其各种实现方式中的方法。
例如,该芯片系统可以调用终端设备的传感器采集的数据,并基于上述第一方面或第二方面中任一种可能实现方式中的方法中的相关步骤生成判定模型。
其中,该芯片系统可以包括用于发送信息或数据的输入电路或者接口,以及用于接收信息或数据的输出电路或者接口。
由于用户的操作具有习惯性,同一用户(即,第一用户)在对终端设备操作的过程中可能产生大量的相似的操作,因此,根据本申请的技术,通过使用基于第一用户的多个操作进行训练并获得判定模型,能够基于该判定模型判定某一操作是否为该第一用户进行的,从而,能够提高终端设备的使用安全性。
附图说明
图1是本申请的管理终端设备的方法所适用于的终端设备的一例的示意图。
图2是本申请的管理终端设备的方法的一例的示意性流程图。
图3是判定模型对应的用户操作的一例的示意图。
图4是基于本申请的管理终端设备的方法实现的界面控制的一例的示意图。
图5是基于本申请的管理终端设备的方法实现的界面控制的另一例的示意图。
图6是本申请的判定模型的确定方法的一例的示意性流程图。
图7是本申请的判定模型的确定方法的一例的示意性交互图。
图8是本申请的用户识别的装置的一例的示意性框图。
具体实施方式
下面将结合附图,对本申请中的技术方案进行描述。
本申请的用户识别的方法可以应用于针对终端设备的用户的识别。终端设备也可以称为用户设备(User Equipment,UE)、接入终端、用户单元、用户站、移动站、移动台、远方站、远程终端、移动设备、用户终端、终端、无线通信设备、用户代理或用户装置。终端设备可以是WLAN中的站点(STAION,ST),可以是蜂窝电话、无绳电话、会话启动协议(Session Initiation Protocol,SIP)电话、无线本地环路(Wireless Local Loop,WLL)站、个人数字处理(Personal Digital Assistant,PDA)设备、具有无线通信功能的手持设备、计算设备或连接到无线调制解调器的其它处理设备、车载设备、车联网终端、电脑、膝上型计算机、手持式通信设备、手持式计算设备、卫星无线设备、无线调制解调器卡、电视机顶盒(set top box,STB)、用户驻地设备(customer premise equipment,CPE)和/或用于在无线系统上进行通信的其它设备以及下一代通信系统,例如,5G网络中的终端设备或者未来演进的公共陆地移动网络(Public Land Mobile Network,PLMN)网络中的终端设备等。
作为示例而非限定,在本申请实施例中,该终端设备还可以是可穿戴设备。可穿戴设备也可以称为穿戴式智能设备,是应用穿戴式技术对日常穿戴进行智能化设计、开发出可以穿戴的设备的总称,如眼镜、手套、手表、服饰及鞋等。可穿戴设备即直接穿在身上,或是整合到用户的衣服或配件的一种便携式设备。可穿戴设备不仅仅是一种硬件设备,更是通过软件支持以及数据交互、云端交互来实现强大的功能。广义穿戴式智能设备包括功能全、尺寸大、可不依赖智能手机实现完整或者部分的功能,例如:智能手表或智能眼镜等,以及只专注于某一类应用功能,需要和其它设备如智能手机配合使用,如各类进行体征监测的智能手环、智能首饰等。
此外,在本申请实施例中,终端设备还可以是物联网(Internet of Things,IoT)系统中的终端设备,IoT是未来信息技术发展的重要组成部分,其主要技术特点是将物品通过通信技术与网络连接,从而实现人机互连,物物互连的智能化网络。
图1示出了该终端设备的一例的示意图,如图1所示,该终端设备100可以包括以下部件。
A.RF电路110
RF电路110可用于收发信息或通话过程中,信号的接收和发送,特别地,将基站的下行信息接收后,给处理器180处理;另外,将设计上行的数据发送给基站。通常,RF电路包括但不限于天线、至少一个放大器、收发信机、耦合器、LNA(Low Noise Amplifier,低噪声放大器)、双工器等。此外,RF电路110还可以通过无线通信与网络和其他设备通信。所述无线通信可以使用任一通信标准或协议,包括但不限于无线局域网(Wireless Local Area Networks,WLAN)全球移动通讯(Global System of Mobile communication,GSM)系统、码分多址(Code Division Multiple Access,CDMA)系统、宽带码分多址(Wideband Code Division Multiple Access,WCDMA)系统、通用分组无线业务(General Packet Radio Service,GPRS)、长期演进(Long Term Evolution,LTE)系统、LTE频分 双工(Frequency Division Duplex,FDD)系统、LTE时分双工(Time Division Duplex,TDD)、通用移动通信系统(Universal Mobile Telecommunication System,UMTS)、全球互联微波接入(Worldwide Interoperability for Microwave Access,WiMAX)通信系统、未来的第五代(5th Generation,5G)系统或新无线(New Radio,NR)等。
B.存储器120
存储器120可用于存储软件程序以及模块,处理器180通过运行存储在存储器120的软件程序以及模块,从而执行终端设备100的各种功能应用以及数据处理。存储器120可主要包括存储程序区和存储数据区,其中,存储程序区可存储操作系统、至少一个功能所需的应用程序(比如声音播放功能、图象播放功能等)等;存储数据区可存储根据终端设备100的使用所创建的数据(比如音频数据、电话本等)等。此外,存储器120可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件、闪存器件、或其他易失性固态存储器件。
C.其他输入设备130
其他输入设备130可用于接收输入的数字或字符信息,以及产生与终端设备100的用户设置以及功能控制有关的键信号输入。具体地,其他输入设备130可包括但不限于物理键盘、功能键(比如音量控制按键、开关按键等)、轨迹球、鼠标、操作杆、光鼠(光鼠是不显示可视输出的触摸敏感表面,或者是由触摸屏形成的触摸敏感表面的延伸)等中的一种或多种。其他输入设备130与I/O子系统170的其他输入设备控制器171相连接,在其他设备输入控制器171的控制下与处理器180进行信号交互。
D.显示屏140
显示屏140可用于显示由用户输入的信息或提供给用户的信息以及终端设备100的各种菜单,还可以接受用户输入。具体的显示屏140可包括显示面板141,以及触控面板142。其中显示面板141可以采用液晶显示器(Liquid Crystal Display,LCD)、有机发光二极管(Organic Light-Emitting Diode,OLED)等形式来配置显示面板141。触控面板142,也称为触摸屏、触敏屏等,可收集用户在其上或附近的接触或者非接触操作(比如用户使用手指、触笔等任何适合的物体或附件在触控面板142上或在触控面板142附近的操作,也可以包括体感操作;该操作包括单点控制操作、多点控制操作等操作类型),并根据预先设定的程式驱动相应的连接装置。可选的,触控面板142可包括触摸检测装置和触摸控制器两个部分。其中,触摸检测装置检测用户的触摸方位、姿势,并检测触摸操作带来的信号,将信号传送给触摸控制器;触摸控制器从触摸检测装置上接收触摸信息,并将它转换成处理器能够处理的信息,再送给处理器180,并能接收处理器180发来的命令并加以执行。此外,可以采用电阻式、电容式、红外线以及表面声波等多种类型实现触控面板142,也可以采用未来发展的任何技术实现触控面板142。进一步的,触控面板142可覆盖显示面板141,用户可以根据显示面板141显示的内容(该显示内容包括但不限于,软键盘、虚拟鼠标、虚拟按键、图标等等),在显示面板141上覆盖的触控面板142上或者附近进行操作,触控面板142检测到在其上或附近的操作后,通过I/O子系统170传送给处理器180以确定用户输入,随后处理器180根据用户输入通过I/O子系统170在显示面板141上提供相应的视觉输出。虽然在图4中,触控面板142与显示面板141是作为两个独立的部件来实现终端设备100的输入和输入功能,但是在某些实施例中,可以将触控面板142 与显示面板141集成而实现终端设备100的输入和输出功能。
E.传感器150
传感器150可以为一种或多种,例如,该可以包括光传感器、运动传感器以及其他传感器。
具体地,光传感器可包括环境光传感器及接近传感器,其中,环境光传感器可根据环境光线的明暗来调节显示面板141的亮度,接近传感器可在终端设备100移动到耳边时,关闭显示面板141和/或背光。
作为运动传感器的一种,加速度传感器可检测各个方向上(一般为三轴)加速度的大小,静止时可检测出重力的大小及方向,可用于识别终端设备姿态的应用(比如横竖屏切换、相关游戏、磁力计姿态校准)、振动识别相关功能(比如计步器、敲击)等。
此外,终端设备100还可配置的重力感应器(也可以称为重力传感器)、陀螺仪、气压计、湿度计、温度计、红外线传感器等其他传感器,在此不再赘述。
F.音频电路160、扬声器161、麦克风162
可提供用户与终端设备100之间的音频接口。音频电路160可将接收到的音频数据转换后的信号,传输到扬声器161,由扬声器161转换为声音信号输出;另一方面,麦克风162将收集的声音信号转换为信号,由音频电路160接收后转换为音频数据,再将音频数据输出至RF电路108以发送给比如另一终端设备,或者将音频数据输出至存储器120以便进一步处理。
G.I/O子系统170
I/O子系统170用来控制输入输出的外部设备,可以包括其他设备输入控制器171、传感器控制器172、显示控制器173。可选的,一个或多个其他输入控制设备控制器171从其他输入设备130接收信号和/或者向其他输入设备130发送信号,其他输入设备130可以包括物理按钮(按压按钮、摇臂按钮等)、拨号盘、滑动开关、操纵杆、点击滚轮、光鼠(光鼠是不显示可视输出的触摸敏感表面,或者是由触摸屏形成的触摸敏感表面的延伸)。值得说明的是,其他输入控制设备控制器171可以与任一个或者多个上述设备连接。所述I/O子系统170中的显示控制器173从显示屏140接收信号和/或者向显示屏140发送信号。显示屏140检测到用户输入后,显示控制器173将检测到的用户输入转换为与显示在显示屏140上的用户界面对象的交互,即实现人机交互。传感器控制器172可以从一个或者多个传感器150接收信号和/或者向一个或者多个传感器150发送信号。
H.处理器180
处理器180是终端设备100的控制中心,利用各种接口和线路连接整个终端设备的各个部分,通过运行或执行存储在存储器120内的软件程序和/或模块,以及调用存储在存储器120内的数据,执行终端设备100的各种功能和处理数据,从而对终端设备进行整体监控。可选的,处理器180可包括一个或多个处理单元;优选的,处理器180可集成应用处理器和调制解调处理器,其中,应用处理器主要处理操作系统、用户界面和应用程序等,调制解调处理器主要处理无线通信。可以理解的是,上述调制解调处理器也可以不集成到处理器180中。
终端设备100还包括给各个部件供电的电源190(比如电池),优选的,电源可以通过电源管理系统与处理器180逻辑相连,从而通过电源管理系统实现管理充电、放电、以 及功耗等功能。
另外,尽管未示出,终端设备100还可以包括摄像头、蓝牙模块等,在此不再赘述。
图2示出了本申请的管理终端设备的方法200的一例的示例性说明,例如,该方法200可以应用于上述终端设备100中。
如图2所示,在S210,终端设备的处理器(例如,上述处理器180),可以获取用户操作#1的相关数据。
例如,当传感器(例如,触摸屏或姿态传感器等)检测到基于某一用户操作#1而生成的数据#1时,可以将该数据#1发送给处理器。
首先,对本申请的方法200的执行时机,或者说,S210的执行时机进行说明。
作为示例而非限定,可以通过以下任意方式触发该方法200的执行。
方式1:周期性触发
具体地说,终端设备可以按照规定的检测周期检测对终端设备的使用者的身份进行检测,或者说,确定当前周期内检测到的用户操作与判定模型(随后进行详细说明)的匹配程度。此情况下,用户操作#1可以是在当前所处的检测周期捏检测到的一个或多个操作。
方式2:基于终端设备被操作的时长触发
具体地说,终端设备可以记录终端设备连续被连续操作的时长,其中,该被连续操作的时长可以是指:该时长范围内的任意两个操作之间的时间间隔小于或等于预设的时间间隔。从而,终端设备可以在确定在该被连续操作的时长大于或等于预设时长时,启动方法200。此情况下,用户操作#1可以是在终端设备在确定该被连续操作的时长大于或等于预设时长之后所检测到的一个或多个操作。
方式3:基于被操作的应用触发
具体地说,终端设备可以确定当前被操作的应用,如果该应用属于规定的应用(例如,聊天类应用或支付类应用等),则可以启动方法200。此情况下,用户操作#1可以是针对规定的应用程序的触摸操作。
下面,对该“用户操作#1”进行说明。
作为示例而非限定,该用户操作#1可以是以下任意一种操作:
A.屏幕解锁操作,具体地说,为了防止误操作以及提高终端设备的安全向,用户可以将屏幕锁定,或者,终端设备在规定时间内未检测到用户对终端设备的操作,可以自行锁定屏幕,从而,当用户需要解锁屏幕时,需要进行正确的解锁操作,例如,可以列举滑动解锁、密码解锁或图形解锁等各种解锁操作。
B.应用解锁操作,具体地说,为了提高终端设备的安全性,当用户需要打开某一应用(例如,聊天类应用或支付类应用等)终端设备或该应用可以弹出解锁界面,从而,在用户进行正确的解锁操作后,可以正常启动该应用;或者,当用户需要使用应用的某一功能(例如,转账功能或查询功能等),终端设备或该应用同样可以弹出解锁界面,从而,在用户进行正确的解锁操作后,可以正常启动该功能。
C.针对规定的应用程序的触摸操作,例如,作为示例而非限定,该规定的应用程序可以是用户设定的应用程序,例如,聊天类应用或支付类应用等;或者,该规定的应用程序可以是制造商或运营商等设定的应用程序。
其中,该“针对规定的应用程序的触摸操”可以是指在该应用程序的规定的界面上的操 作,例如,该操作可以包括对规定的界面控件的操作(例如,对支付按钮的点击操作);再例如,该操作可以包括用户设定的操作,例如,在浏览界面上为了实现翻页而进行的滑动操作等。
或者,该“针对规定的应用程序的触摸操”可以是指对应用程序的管理的操作,例如,可以包括对应用程序的删除或权限变更等操作。
应理解,以上列举的用户操作#1所包括的具体内容仅为示例性说明,本申请并未限定于此,使用者可以根据需要任意设置该用户操作#1的具体内容。例如,也可以结合上述触发方式确定该用户操作#1。
下面,对该“相关数据”进行详细说明。
作为示例而非限定,在本发明实施例中,该用户操作可以包括触摸操作,此情况下,该用户对触摸屏的触摸操作的相关数据据可以包括:用户对触摸屏的触摸操作的相关数据;或者,该用户对触摸屏的触摸操作的相关数据据可以包括用户对终端设备进行操作时,终端设备所处于的姿态的相关数据,下面,分别对上述两种类型的数据进行详细说明。
1.用户对触摸屏的触摸操作的相关数据
即,该用户操作#1可以是触摸操作,此情况下,该用户操作#1的相关数据可以是该触摸操作的数据。
作为示例而非限定,上述触摸检测装置可以检测用户的触摸操作的数据,作为示例而非限定,该触摸操作可以包括但不限于点击(例如,单击或双击等)操作或滑动操作(例如,单指滑动或多指滑动等)等,该触摸操作的数据可以包括但不限于以下至少一种参数的数据:
a.触摸操作的力度
例如,终端设备可以在触摸屏下配置压力传感器,从而,可以检测触摸操作的力度,并将所检测到的力度的数据发送给上述处理器。
应理解,以上列举的检测触摸操作的力度的方式仅为示例性说明,本发明并未限定于此,例如,触摸屏也可以继承上述压力传感器,例如,该触摸屏可以是压力屏,或者说压力感应屏。
b.触摸操作的位置
例如,该触摸操作的位置可以是触摸屏中被触摸的区域在整个触摸屏中的相对位置,其中,当该触摸操作为点击操作时,该触摸操作为位置可以是指触摸点在屏幕中的位置。当该触摸操作为滑动操作时,该触摸操作为位置可以包括滑动操作的起始点在屏幕中的位置,或者,也可以包括滑动操作的结束点在屏幕中的位置,或者,也可以包括滑动轨迹在屏幕中的位置。
c.触摸操作的接触面积
例如,该触摸操作的接触面积可以是触摸检测装置检测到的用户的手指与触摸屏的接触面积。
d.触摸操作的接触时间
例如,该触摸操作的接触时间可以是触摸检测装置检测到的一次触摸操作的时间,其中,该“一次触摸操作”可以是指自触摸屏被手指接触到至手指离开触摸屏),例如,当该触摸操作为点击操作时,该触摸操作的接触时间可以是指一次单击操作中触摸屏被触摸 的时间,再例如,当该触摸操作为滑动操作时,该触摸操作的接触时间可以是指一次滑动操作中触摸屏被触摸的时间。
或者,该触摸操作的接触时间可以是触摸检测装置检测到的间隔小于或等于预设时间间隔的多次触摸操作的时间,例如,当该触摸操作为双击操作等手指有离开触摸屏的操作时,该触摸操作的接触时间可以是指一次操作过程中触摸屏被触摸的总的时间。
e.滑动操作的滑动角度
例如,该滑动操作的角度可以是自滑动操作的起始位置至滑动操作的终止位置之间的连线与屏幕的水平(或竖直)方向之间的角度。
f.滑动操作的滑动方向
例如,该滑动操作的角度可以是自滑动操作的起始位置至滑动操作的终止位置的方向。
g.滑动操作的滑动距离
例如,该滑动操作的滑动距离可以是自滑动操作的起始位置至滑动操作的终止位置的直线长度。或者,该滑动操作的滑动距离可以是自滑动操作的轨迹的总长度。
应理解,上述参数的具体实例及获取方法仅为示例性说明,本申请并未特别限定,可以使用现有技术提供的各种方法获取上述参数的数据或信息。
2.用户对终端设备进行操作(例如,对触摸屏进行触摸操作)时,终端设备所处于的姿态的相关数据(即,姿态信息的一例)。
例如,该终端设备的姿态的相关数据可以包括终端设备与水平或重力方向所成的角度的数据。
作为示例而非限定,该姿态信息可以包括但不限于来自以下至少一种传感器检测到的数据:
重力传感器、加速度传感器、陀螺仪。
在S220,终端设备的处理器(例如,上述处理器180),可以基于一个或多个判定模型以及该用户操作#1的相关数据,对终端设备进行管理。
首先,对该判定模型进行详细说明。
由于不同用户具有不同的操作习惯和生物特征,可能导致不同用户的操作具有不同给的特征。
例如,不同用户的手指的大小不同可能导致不同用户的触摸操作的触摸接触面积不同。
再例如,不同用户的手大小不同可能导致不同用户的滑动操作的滑动距离不同。
再例如,不同用户的操作习惯不同可能导致不同用户的滑动操作的滑动方向、角度或距离不同。
再例如,不同用户的操作习惯不同可能导致不同用户的触摸操作的触摸时间或触摸力度不同。
再例如,不同用户的操作习惯不同可能导致不同用户的在对终端设备进行操作时终端设备的姿态(例如,倾斜角度等)不同。
并且,受用户的操作习惯和生物特征的影响,传感器检测到的来自同一用户的操作的数据具有相似性。
例如,同一用户的触摸操作的触摸接触面积的相似度较高。
再例如,同一用户的滑动操作的滑动距离的相似度较高。
再例如,同一用户的滑动操作的滑动方向、角度或距离的相似度较高。
再例如,同一用户的触摸操作的触摸时间或触摸力度的相似度较高。
再例如,同一用户的在对终端设备进行操作时终端设备的姿态(例如,倾斜角度等)的相似度较高。
因此,可以基于同一用户的数据进行训练,从而确定针对该用户的判定模型。
即,该判定模型可以用于判定某一用户操作是否是该判定模型对应的用户进行的操作。
随后,结合图3对该判定模型的训练过程以及判定模型的使用方法进行详细说明。
作为示例而非限定,在本申请实施例中,可以采用以下至少一种方法使用该用户操作#1的相关数据以及该一个或多个判定模型。
方法1
在本申请实施例中,在终端设备中可以保存有判定模型#1,具体地说,该判定模型#1可以是用于判定某一用户操作是否是该终端设备的机主进行的操作的模型,即,该判定模型可以是基于该机主进行的操作预先训练而生成的模型。
此情况下,处理器可以基于该用户操作#1的数据和该判定模型#1,确定执行该用户操作#1的用户是否为机主。
例如,如果数据#1的每个维度的值与判定模型#1的每个维度的值之间的差异度(例如,差值的绝对值)小于或等于规定的阈值#c,则处理器可以确定执行该用户操作#1的用户为机主。
或者,如果数据#1的X个维度中的Y个维度的值与判定模型#1的X个维度中的Y个维度的值之间的差异度(例如,差值的绝对值)小于或等于规定的阈值#c,则处理器可以确定执行该用户操作#1的用户为机主。其中,Y小于或等于X,Y与X的比值大于或等于规定的阈值#d。
相反,例如,如果数据#1的每个维度的值与判定模型#1的每个维度的值之间的差异度(例如,差值的绝对值)大于规定的阈值#c,则处理器可以确定该用户操作#1的用户不为机主。
或者,如果数据#1的X个维度中的Y个维度的值与判定模型#1的X个维度中的Y个维度的值之间的差异度(例如,差值的绝对值)大于规定的阈值#c,则处理器可以确定该用户操作#1的用户不为机主。其中,Y小于或等于X,Y与X的比值大于或等于规定的阈值#d。
如上所述,可能存在执行该用户操作#1的用户为机主的情况(即,情况1),或者,执行该用户操作#1的用户不为机主的情况(即,情况2),下面,分别对上述两种判定
情况1
此情况下,处理器可以允许执行该用户操作#1对应的处理。
例如,如果该用户操作#1是用于启动应用程序#1的操作,则处理器可以启动该应用程序#1。
再例如,如果该用户操作#1是用于使当前界面切换为界面#1的操作,则处理器可以 将当前界面切换至该界面#1。例如,该用户操作#1可以是屏幕解锁操作,此情况下,如果判定为执行该用户操作#1的用户为机主,则处理器可以控制终端设备呈现屏幕解锁后需要呈现的界面。
再例如,如果该用户操作#1是用于执行功能#1(例如,转账、付款、拨打电话或收发信息等)的操作,则处理器可以控制终端设备执行该功能#1。
或者,此情况下,处理器可以对规定的应用程序进行解锁,例如,在基于上述方式1或方式2启动该方法200时,如果判定为执行该用户操作#1的用户为机主,则可以在检测到启动应用程序#X的操作(非该该用户操作#1)时,可以直接启动应用程序#X,其中,应用程序#X可以是需要解锁(例如,输入密码(例如,指纹密码、数字密码或图形密码)并在密码正确后)才允许启动的应用。根据本申请的方案,在确定执行该用户操作#1的用户为机主,可以不必执行针对该应用程序#X的解锁过程,例如,可以不弹出密码输入界面,直接进入应用程序#X的界面。
情况2
此情况下,处理器可以禁止执行该用户操作#1对应的处理。
例如,如果该用户操作#1是用于启动应用程序#1的操作,则处理器可以禁止启动该应用程序#1。并且,可选地,处理器可以将终端设备锁定,即,需要重新解锁终端设备后才能继续使用终端设备。
再例如,如果该用户操作#1是用于使当前界面切换为界面#1的操作,则处理器可以禁止将当前界面切换至该界面#1。例如,该用户操作#1可以是屏幕解锁操作,此情况下,如果判定为执行该用户操作#1的用户不为机主,则处理器可以控制终端设备使当前界面保持在锁屏界面。
再例如,如果该用户操作#1是用于执行功能#1(例如,转账、付款、拨打电话或收发信息等)的操作,则处理器可以禁止终端设备执行该功能#1。并且,可选地,处理器可以将终端设备锁定,即,需要重新解锁终端设备后才能继续使用终端设备。
或者,此情况下,处理器可以对规定的应用程序进行锁定,例如,在基于上述方式1或方式2启动该方法200时,如果判定为执行该用户操作#1的用户为机主,则可以在检测到启动应用程序#X的操作(非该该用户操作#1)时,可以直接锁定应用程序#X,其中,应用程序#X可以是需要解锁(例如,输入密码(例如,指纹密码、数字密码或图形密码)并在密码正确)才允许启动的应用。或者,应用程序#X也可以是不需要解锁就允许启动的应用。根据本申请的方案,在确定执行该用户操作#1的用户不为机主,立即锁定该应用程序#X,即,及时检测到用于启动应用程序#X的操作,也不会启动该应用程序#X。
应理解,以上列举的处理方式仅为示例性说明,本发明并未限定于此,其他能够确保终端设备的使用安全性以及机主的隐私或财产的安全性的处理方式均落入本发明的保护范围内。
应理解,以上列举的情况2下的处理过程仅为示例性说明,本申请并未限定于此。本申请还可以应用于以下任意场景。
A.应用锁功能
具体地说,当判定结果为所检测到的用户操作不是机主进行的时,可以将规定的一个或多个应用锁定,例如,禁止该应用被用户操作,或者说,禁止该应用运行。
作为示例而限定,该应用可以是能够实现支付功能或消费功能的应用,从而,能够避免因终端被机主以外的用户操作而使机主的财产损失。或者,该应用可以是机主设定的应用,从而,能够确保机主的隐私不会被泄露。
B.防盗功能
具体地说,如果判定结果为所检测到的用户操作不是机主进行的时,可以开启防盗功能。
作为实现防盗功能的方法,可以采用以下至少一种动作:
1.关机。
2.向运营商或警方发出警报,并且,该警报中可以携带有终端设备当前的位置信息。
应理解,以上列举的防盗功能的动作仅为示例性说明,本发明并未限定于此。
C.锁屏功能
具体地说,如图4所示如果的判定结果为所检测到的用户操作是机主进行的时,可以禁止开启锁屏功能。
相反,如图5所示,如果判定结果为所检测到的用户操作不是机主进行的时,可以将终端设备的显示界面切换至锁屏界面,并需要用户正确解锁后,才能正常使用该终端设备。
具体地说,如图3所示,判定模型可以是基于在时段#A图片时的滑动操作确定的,并且,图3中示出了该机主习惯的滑动操作的轨迹300一例。
如图4所示,当机主浏览图片而进行滑动操作(如轨迹400所示)时,由于机主的操作习惯,该滑动操作(例如,滑动方向和滑动距离)与判定模型所对应的滑动操作的相似度较高,因此,判定模型可以判定执行该滑动操作的用户为机主,从而,可以允许终端设备基于该滑动操作执行处理。
与此相对,如图5所示,当其他用户浏览图片而进行滑动操作(如轨迹500所示)时,由于其他用户的操作习惯与机主相异,该滑动操作(例如,滑动方向和滑动距离)与判定模型所对应的滑动操作的相似度较低,因此,判定模型可以判定执行该滑动操作的用户不为机主,从而,可以禁止终端设备基于该滑动操作执行处理,并且,可以将界面切换至锁屏界面。
根据本申请提供的方法,能够提高终端设备的安全性,例如,由于成人和儿童的手指大小或触摸力度明显不同,因此,根据本发明的方案,通过上述判定模型能够有效区分执行用户操作的是成人还是儿童,从而,即使在儿童知道家长的终端设备的密码的情况下,仍然能够避免因儿童的操作,例如,启动转账或购买等功能,而对家长的财产造成损失。
可选地,在本申请实施例中,在终端设备中还可以保存有至少一个判定模型#2,具体地说,该至少一个判定模型#2可以与至少一个非机主用户一一对应,每个该判定模型#2可以用于判定某一用户操作是否是所对应的非机主用户进行的操作,即,每个判定模型可以是基于所对应的非机主用户进行的操作预先训练而生成的模型。
此情况下,处理器还可以进一步判定该至少一个判定模型#2中是否存在与该数据#1匹配的判定模型#2A。例如,如果数据#1的每个维度的值与判定模型#2A的每个维度的值之间的差异度(例如,差值的绝对值)小于或等于规定的阈值#e,则处理器可以确定执行该用户操作#1的用户为判定模型#2A对应的非机主用户(记作,用户#2)。
或者,如果数据#1的X个维度中的Y个维度的值与判定模型#2A的X个维度中的Y 个维度的值之间的差异度(例如,差值的绝对值)小于或等于规定的阈值#e,则处理器可以确定执行该用户操作#1的用户为用户#2。其中,Y小于或等于X,Y与X的比值大于或等于规定的阈值#f。
其后,处理器可以确定该用户#2对应的操作权限。
其中,操作权限可以用于指示该用户#2能够使用的应用程序;或者,操作权限可以用于指示该用户#2被禁止使用的应用程序(例如,支付类应用或聊天类应用等)。
或者,该操作权限可以用于指示该用户#2能够使用的功能;或者,操作权限可以用于指示该用户#2被禁止使用的功能(例如,拨打电话等)。
再或者,该操作权限可以用于指示该用户#2能够访问的操作界面;或者,操作权限可以用于指示该用户#2被禁止访问的界面(例如,照片浏览界面等)。
从而,处理器可以根据用户#2的权限,确定是否允许该用户操作#1对应的处理。
例如,如果该用户操作#1是用于启动应用程序#1的操作,则处理器可以根据该用户#2的权限,判定该应用程序#1是否能够被用户#2使用。如果该用户#2的权限指示应用程序#1能够被用户#2使用,则处理器可以启动应用程序#1;如果该用户#2的权限指示应用程序#1不能够被用户#2使用,则处理器可以禁止应用程序#1。并且,可选地,如果该用户#2的权限指示应用程序#1不能够被用户#2使用,处理器可以将终端设备锁定,即,需要重新解锁终端设备后才能继续使用终端设备。
再例如,如果该用户操作#1是用于使当前界面切换为界面#1的操作,则处理器可以可以根据该用户#2的权限,判定该界面#1是否能够被用户#2访问。例如,该用户操作#1可以是屏幕解锁操作,此情况下,如果该用户#2的权限指示用户#2被允许解锁屏幕,则处理器可以控制终端设备呈现屏幕解锁后需要呈现的界面;如果该用户#2的权限指示用户#2不被允许解锁屏幕,则处理器可以控制终端设备使当前界面保持在锁屏界面。
再例如,如果该用户操作#1是用于执行功能#1(例如,转账、付款、拨打电话或收发信息等)的操作,则处理器可以根据该用户#2的权限,判定该功能#1是否能够被用户#2使用。如果该用户#2的权限指示功能#1能够被用户#2使用,则处理器可以控制终端设备功能#1;如果该用户#2的权限指示功能#1不能够被用户#2使用,则处理器可以禁止终端设备执行功能#1。并且,可选地,如果该用户#2的权限指示功能#1不能够被用户#2使用,处理器可以将终端设备锁定,即,需要重新解锁终端设备后才能继续使用终端设备。
根据本申请提供的方法,能够提高控制的灵活性,例如,当基于上述方案判定用户操作的执行者非机主时,仍然可以允许不影响终端的安全性的应用程序(例如,拍照等)的执行。
方法2
在本申请实施例中,在终端设备中可以保存有多个判定模型,该多个判定模型与多个用户一一对应,每个该判定模型可以用于判定某一用户操作是否是所对应的用户进行的操作,即,每个判定模型可以是基于所对应的用户进行的操作预先训练而生成的模型。
此情况下,处理器还可以判定该多个判定模型中是否存在与该数据#1匹配的判定模型(以下,为了便于理解和区分,记作判定模型#B)。例如,如果数据#1的每个维度的值与判定模型#B的每个维度的值之间的差异度(例如,差值的绝对值)小于或等于规定的阈值#g,则处理器可以确定执行该用户操作#1的用户为判定模型#B对应的非机主用户 (记作,用户#3)。
或者,如果数据#1的X个维度中的Y个维度的值与判定模型#B的X个维度中的Y个维度的值之间的差异度(例如,差值的绝对值)小于或等于规定的阈值#g,则处理器可以确定执行该用户操作#1的用户为用户#3。其中,Y小于或等于X,Y与X的比值大于或等于规定的阈值#h。
其后,处理器可以确定该用户#3对应的操作权限。
其中,操作权限可以用于指示该用户#3能够使用的应用程序;或者,操作权限可以用于指示该用户#3被禁止使用的应用程序(例如,支付类应用或聊天类应用等)。
或者,该操作权限可以用于指示该用户#3能够使用的功能;或者,操作权限可以用于指示该用户#3被禁止使用的功能(例如,拨打电话等)。
再或者,该操作权限可以用于指示该用户#3能够访问的操作界面;或者,操作权限可以用于指示该用户#3被禁止访问的界面(例如,照片浏览界面等)。
从而,处理器可以根据用户#3的权限,确定是否允许该用户操作#1对应的处理。
例如,如果该用户操作#1是用于启动应用程序#a的操作,则处理器可以根据该用户#3的权限,判定该应用程序#a是否能够被用户#3使用。如果该用户#3的权限指示应用程序#a能够被用户#3使用,则处理器可以启动应用程序#a;如果该用户#3的权限指示应用程序#a不能够被用户#3使用,则处理器可以禁止应用程序#a。并且,可选地,如果该用户#3的权限指示应用程序#a不能够被用户#3使用,处理器可以将终端设备锁定,即,需要重新解锁终端设备后才能继续使用终端设备。
再例如,如果该用户操作#1是用于使当前界面切换为界面#a的操作,则处理器可以可以根据该用户#3的权限,判定该界面#a是否能够被用户#3访问。例如,该用户操作#1可以是屏幕解锁操作,此情况下,如果该用户#3的权限指示用户#3被允许解锁屏幕,则处理器可以控制终端设备呈现屏幕解锁后需要呈现的界面;如果该用户#3的权限指示用户#3不被允许解锁屏幕,则处理器可以控制终端设备使当前界面保持在锁屏界面。
再例如,如果该用户操作#1是用于执行功能#a(例如,转账、付款、拨打电话或收发信息等)的操作,则处理器可以根据该用户#3的权限,判定该功能#a是否能够被用户#3使用。如果该用户#3的权限指示功能#a能够被用户#3使用,则处理器可以控制终端设备功能#a;如果该用户#3的权限指示功能#a不能够被用户#3使用,则处理器可以禁止终端设备执行功能#a。并且,可选地,如果该用户#3的权限指示功能#a不能够被用户#3使用,处理器可以将终端设备锁定,即,需要重新解锁终端设备后才能继续使用终端设备。
需要说明的是,上述用户的权限可以是机主设定的,或者,也可以是制造商或运营商下发给终端设备的,本申请并未特别限定。
另外,在本申请中,终端设备可以提供两种模式,在模式1中,终端设备可以基于如上所述确定的判定模型判定执行操作的用户是否为规定的用户(例如,机主);与此相对,在模式2中,终端设备可以禁止基于如上所述确定的判定模型判定执行操作的用户是否为规定的用户(例如,机主,),或者说,在模式2中,终端设备可以不对执行操作的用户的身份进行确认。从而,能够在机主允许他人使用终端设备的情况下,避免因执行上述方法而出现禁止他人使用的情况,从而能够进一步提高本申请的实用性,实现人性化设置。
根据本申请的方案,由于用户的操作具有习惯性,因此,同一用户在对终端设备操作 的过程中可能产生大量的相似的操作,通过使用基于同一用户X的多个操作进行训练并获得判定模型,能够基于该判定模型判定某一用户操作是否为该用户X,从而,能够提高终端设备的使用安全性。
图6示出了上述判定模型的确定方法600一例的示例性说明,例如,该方法600可以应用于上述终端设备100中。
如图6所示,在S610,终端设备的处理器(例如,上述处理器180)可以获取终端设备的传感器检测到的数据(即,训练信息的一例)。
作为示例而非限定,该数据可以包括以下至少一种数据:
1.用户对触摸屏的触摸操作的相关数据(即,触摸操作信息的一例)。
作为示例而非限定,上述触摸检测装置可以检测用户的触摸操作的数据,作为示例而非限定,该触摸操作可以包括但不限于点击(例如,单击或双击等)操作或滑动操作(例如,单指滑动或多指滑动等)等,该触摸操作的数据可以包括但不限于以下至少一种参数的数据:
a.触摸操作的力度
例如,终端设备可以在触摸屏下配置压力传感器,从而,可以检测触摸操作的力度,并将所检测到的力度的数据发送给上述处理器。
应理解,以上列举的检测触摸操作的力度的方式仅为示例性说明,本发明并未限定于此,例如,触摸屏也可以继承上述压力传感器,例如,该触摸屏可以是压力屏,或者说压力感应屏。
b.触摸操作的位置
例如,该触摸操作的位置可以是触摸屏中被触摸的区域在整个触摸屏中的相对位置,其中,当该触摸操作为点击操作时,该触摸操作为位置可以是指触摸点在屏幕中的位置。当该触摸操作为滑动操作时,该触摸操作为位置可以包括滑动操作的起始点在屏幕中的位置,或者,也可以包括滑动操作的结束点在屏幕中的位置,或者,也可以包括滑动轨迹在屏幕中的位置。
c.触摸操作的接触面积
例如,该触摸操作的接触面积可以是触摸检测装置检测到的用户的手指与触摸屏的接触面积。
d.触摸操作的接触时间
例如,该触摸操作的接触时间可以是触摸检测装置检测到的一次触摸操作的时间,其中,该“一次触摸操作”可以是指自触摸屏被手指接触到至手指离开触摸屏),例如,当该触摸操作为点击操作时,该触摸操作的接触时间可以是指一次单击操作中触摸屏被触摸的时间,再例如,当该触摸操作为滑动操作时,该触摸操作的接触时间可以是指一次滑动操作中触摸屏被触摸的时间。
或者,该触摸操作的接触时间可以是触摸检测装置检测到的间隔小于或等于预设时间间隔的多次触摸操作的时间,例如,当该触摸操作为双击操作等手指有离开触摸屏的操作时,该触摸操作的接触时间可以是指一次操作过程中触摸屏被触摸的总的时间。
e.滑动操作的滑动角度
例如,该滑动操作的角度可以是自滑动操作的起始位置至滑动操作的终止位置之间的 连线与屏幕的水平(或竖直)方向之间的角度。
f.滑动操作的滑动方向
例如,该滑动操作的角度可以是自滑动操作的起始位置至滑动操作的终止位置的方向。
g.滑动操作的滑动距离
例如,该滑动操作的滑动距离可以是自滑动操作的起始位置至滑动操作的终止位置的直线长度。或者,该滑动操作的滑动距离可以是自滑动操作的轨迹的总长度。
应理解,上述参数的具体实例及获取方法仅为示例性说明,本申请并未特别限定,可以使用现有技术提供的各种方法获取上述参数的数据或信息。
2.用户对终端设备进行操作(例如,对触摸屏进行触摸操作)时,终端设备所处于的姿态的相关数据(即,姿态信息的一例)。
例如,该终端设备的姿态的相关数据可以包括终端设备与水平或重力方向所成的角度的数据。
作为示例而非限定,该姿态信息可以包括但不限于来自以下至少一种传感器检测到的数据:
重力传感器、加速度传感器、陀螺仪。
由于不同用户具有不同的操作习惯和生物特征,可能导致不同用户的操作具有不同给的特征。
例如,不同用户的手指的大小不同可能导致不同用户的触摸操作的触摸接触面积不同。
再例如,不同用户的手大小不同可能导致不同用户的滑动操作的滑动距离不同。
再例如,不同用户的操作习惯不同可能导致不同用户的滑动操作的滑动方向、角度或距离不同。
再例如,不同用户的操作习惯不同可能导致不同用户的触摸操作的触摸时间或触摸力度不同。
再例如,不同用户的操作习惯不同可能导致不同用户的在对终端设备进行操作时终端设备的姿态(例如,倾斜角度等)不同。
并且,受用户的操作习惯和生物特征的影响,传感器检测到的来自同一用户的操作的数据具有相似性。
例如,同一用户的触摸操作的触摸接触面积的相似度较高。
再例如,同一用户的滑动操作的滑动距离的相似度较高。
再例如,同一用户的滑动操作的滑动方向、角度或距离的相似度较高。
再例如,同一用户的触摸操作的触摸时间或触摸力度的相似度较高。
再例如,同一用户的在对终端设备进行操作时终端设备的姿态(例如,倾斜角度等)的相似度较高。
因此,在S620,终端设备的处理器可以对从上述传感器获取的数据进行聚类处理,能够将同一用户的操作划分至同一类或组中。
例如,作为示例而非限定,同一操作可能会被多种传感器检测到,例如,滑动操作可能会使触摸屏和压力传感器得到数据,并且,对于滑动操作,触摸屏可以得到滑动方向、 滑动距离、滑动角度等多种维度的参数。
此情况下,对于一个操作,可以将不同传感器检测到的多个数据作为该操作的不同维度的信息,从而生成在多维(例如、二维或三维)空间中的每个维度具有具体值的数据点(或者说,坐标点)。
或者,对于一个操作,可以将同一传感器检测到的多种参数的数据作为该操作的不同维度的信息,从而生成在多维(例如、二维或三维)空间中的每个维度具有具体值的点。
由此,对于多个操作,能够确定多个数据点。
从而,可以对该多个数据点进行聚类。
作为示例而非限定,在本申请实施例中,可以采用以下任意一种方法进行聚类。
一、基于密度的聚类方法
基于密度的方法(Density-based methods)的思想是:设定一个距离半径,最少有多少个点,然后把可以到达的点都连起来,判定为同类。其原理简单说画圈儿,其中要定义两个参数,一个是圈儿的最大半径,一个是一个圈儿里最少应容纳几个点。最后在一个圈里点的就是属于同一个类的点。
基于密度的聚类方法,对于集中区域效果较好,为了发现任意形状的簇,这类方法将簇看做是数据空间中被低密度区域分割开的稠密对象区域;一种基于高密度连通区域的基于密度的聚类方法,该算法将具有足够高密度的区域划分为簇,并在具有噪声的空间数据中发现任意形状的簇。
下面,简单介绍以下基于密度的聚类方法中的几个关键指标:
Ε邻域:给定对象半径为Ε内的区域称为该对象的Ε邻域;
核心对象:如果给定对象Ε领域内的样本点数大于等于MinPts,则称该对象为核心对象;
直接密度可达:对于样本集合D,如果样本点q在p的Ε领域内,并且p为核心对象,那么对象q从对象p直接密度可达。
密度可达:对于样本集合D,给定一串样本点p1,p2….pn,p=p1,q=pn,假如对象pi从pi-1直接密度可达,那么对象q从对象p密度可达。注意:密度可达是单向的,密度可达即可容纳同一类。
密度相连:存在样本集合D中的一点o,如果对象o到对象p和对象q都是密度可达的,那么p和q密度相联。
密度可达是直接密度可达的传递闭包,并且这种关系是非对称的。密度相连是对称关系。基于密度的聚类目的是找到密度相连对象的最大集合。
基于密度的聚类通过检查数据库中每点的r邻域来搜索簇。如果点p的r邻域包含的点多于MinPts个,则创建一个以p为核心对象的新簇。然后,基于密度的聚类迭代的聚集从这些核心对象直接密度可达的对象,这个过程可能涉及一些密度可达簇的合并。当没有新的点可以添加到任何簇时,该过程结束。
例如:假设半径Ε=3,MinPts=3,点p的E领域中有点{m,p,p1,p2,o},点m的E领域中有点{m,q,p,m1,m2},点q的E领域中有点{q,m},点o的E领域中有点{o,p,s},点s的E领域中有点{o,s,s1}。
那么核心对象有p,m,o,s。其中,q不是核心对象,因为它对应的E领域中点数 量等于2,小于MinPts=3;
点m从点p直接密度可达,因为m在p的E领域内,并且p为核心对象;
点q从点p密度可达,因为点q从点m直接密度可达,并且点m从点p直接密度可达;
点q到点s密度相连,因为点q从点p密度可达,并且s从点p密度可达。
下面,简单介绍以下簇的生成原理及过程
基于密度的聚类方法原理的基本要点:确定半径eps的值。
基于密度的聚类方法需要选择一种距离度量,对于待聚类的数据集中,任意两个点之间的距离,反映了点之间的密度,说明了点与点是否能够聚到同一类中。由于基于密度的聚类方法对高维数据定义密度很困难,所以对于二维空间中的点,可以使用欧几里德距离来进行度量。
基于密度的聚类方法需要用户输入2个参数:一个参数是半径(Eps),表示以给定点P为中心的圆形邻域的范围;另一个参数是以点P为中心的邻域内最少点的数量(MinPts)。如果满足:以点P为中心、半径为Eps的邻域内的点的个数不少于MinPts,则称点P为核心点。
基于密度的聚类方法使用到一个k-距离的概念,k-距离是指:给定数据集P={p(i);i=0,1,…n},对于任意点P(i),计算点P(i)到集合D的子集S={p(1),p(2),…,p(i-1),p(i+1),…,p(n)}中所有点之间的距离,距离按照从小到大的顺序排序,假设排序后的距离集合为D={d(1),d(2),…,d(k-1),d(k),d(k+1),…,d(n)},则d(k)就被称为k-距离。也就是说,k-距离是点p(i)到所有点(除了p(i)点)之间距离第k近的距离。对待聚类集合中每个点p(i)都计算k-距离,最后得到所有点的k-距离集合E={e(1),e(2),…,e(n)}。
根据经验计算半径Eps:根据得到的所有点的k-距离集合E,对集合E进行升序排序后得到k-距离集合E’,需要拟合一条排序后的E’集合中k-距离的变化曲线图,然后绘出曲线,通过观察,将急剧发生变化的位置所对应的k-距离的值,确定为半径Eps的值。
根据经验计算最少点的数量MinPts:确定MinPts的大小,实际上也是确定k-距离中k的值,基于密度的聚类方法取k=4,则MinPts=4。
另外,如果觉得经验值聚类的结果不满意,可以适当调整Eps和MinPts的值,经过多次迭代计算对比,选择最合适的参数值。可以看出,如果MinPts不变,Eps取得值过大,会导致大多数点都聚到同一个簇中,Eps过小,会导致一个簇的分裂;如果Eps不变,MinPts的值取得过大,会导致同一个簇中点被标记为噪声点,MinPts过小,会导致发现大量的核心点。
需要说明的是,基于密度的聚类方法,需要输入2个参数,这两个参数的计算都来自经验知识。半径Eps的计算依赖于计算k-距离,DBSCAN取k=4,也就是设置MinPts=4,然后需要根据k-距离曲线,根据经验观察找到合适的半径Eps的值。
其后,可以连通核心点生成簇。
核心点能够连通(有些书籍中称为:“密度可达”),它们构成的以Eps长度为半径的圆形邻域相互连接或重叠,这些连通的核心点及其所处的邻域内的全部点构成一个簇。
计算连通的核心点的思路是,基于广度遍历与深度遍历集合的方式:从核心点集合S 中取出一个点p,计算点p与S集合中每个点(除了p点)是否连通,可能会得到一个连通核心点的集合C1,然后从集合S中删除点p和C1集合中的点,得到核心点集合S1;再从S1中取出一个点p1,计算p1与核心点集合S1集中每个点(除了p1点)是否连通,可能得到一个连通核心点集合C2,再从集合S1中删除点p1和C2集合中所有点,得到核心点集合S2,……最后得到p、p1、p2、……,以及C1、C2、……就构成一个簇的核心点。最终将核心点集合S中的点都遍历完成,得到所有的簇。
需要说明的是,如果eps设置过大,则所有的点都会归为一个簇,如果设置过小,那么簇的数目会过多。如果MinPts设置过大的话,很多点将被视为噪声点。
根据数据点的密度可以分为三类点:
(1)核心点:该点在邻域内的密度超过给定的阀值MinPs。
(2)边界点:该点不是核心点,但是其邻域内包含至少一个核心点。
(3)噪音点:不是核心点,也不是边界点。
有了以上对数据点的划分,聚合可以这样进行:各个核心点与其邻域内的所有核心点放在同一个簇中,把边界点跟其邻域内的某个核心点放在同一个簇中。
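A compact, self-contained sketch of the point classification just described (core, border and noise points, followed by cluster expansion via density-reachability) is given below for illustration; the brute-force neighbour search and the parameter values are simplifying assumptions, not part of the original text.

```python
import numpy as np

def dbscan(points, eps=3.0, min_pts=3):
    """points: (n, d) array. Returns labels: -1 for noise, otherwise a cluster id."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    neighbours = [np.flatnonzero(dist[i] <= eps) for i in range(n)]  # Eps-neighbourhoods
    core = np.array([len(nb) >= min_pts for nb in neighbours])       # core points
    labels = np.full(n, -1)
    cluster = 0
    for i in range(n):
        if labels[i] != -1 or not core[i]:
            continue                          # only an unassigned core point seeds a cluster
        labels[i] = cluster
        frontier = list(neighbours[i])
        while frontier:                       # expand through density-reachable points
            j = frontier.pop()
            if labels[j] == -1:
                labels[j] = cluster           # border or core point joins the cluster
                if core[j]:
                    frontier.extend(neighbours[j])
        cluster += 1
    return labels
```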
点排序识别聚类结构(Ordering Points To Identify Clustering Structure,OPTICS)是基于密度的聚类方法中的一种,其通过优先对高密度(high density)进行搜索,然后根据高密度的特点设置参数,提高了基于密度的聚类的效果。OPTICS的目标是将空间中的数据按照密度分布进行聚类,即,经过OPTICS算法的处理,理论上可以获得任意密度的聚类。因为OPTICS算法输出的是样本的一个有序队列,从这个队列里面可以获得任意密度的聚类。
OPTICS算法的基础有两点:
1.参数(半径,最少点数):一个是输入的参数,包括:半径ε,和最少点数MinPts。
2.定义(核心点,核心距离,可达距离,直接密度可达):
OPTICS算法核心点的定义如下:如果一个点的半径内包含点的数量不少于最少点数,则该点为核心点,数学描述即:Nε(P)>=MinPts。
在这个基础上可以引出核心距离的定义,即对于核心点,距离其第MinPtsth近的点与之的距离coreDist(P)。
coreDist(P) = UNDEFINED（若 |Nε(P)| < MinPts）；否则 coreDist(P) = dist(P, P 的第 MinPts 个最近邻)。
可达距离,对于核心点P,O到P的可达距离定义为O到P的距离或者P的核心距离,即公式
reachDist(O, P) = UNDEFINED（若 |Nε(P)| < MinPts）；否则 reachDist(O, P) = max(coreDist(P), dist(O, P))。
O到P直接密度可达,即P为核心点,且P到O的距离小于半径。
作为示例而非限定,OPTICS算法的计算过程如下:
步骤1、输入数据样本D,初始化所有点的可达距离和核心距离为MAX,半径ε,和最少点数MinPts。
步骤2、建立两个队列,有序队列(核心点及该核心点的直接密度可达点),结果队 列(存储样本输出及处理次序)。
步骤3、如果D中数据全部处理完,则算法结束,否则从D中选择一个未处理且未核心对象的点,将该核心点放入结果队列,该核心点的直接密度可达点放入有序队列,直接密度可达点并按可达距离升序排列。
步骤4、如果有序序列为空,则回到步骤2,否则从有序队列中取出第一个点;例如,在该过程中,首先,判断该点是否为核心点,不是则回到步骤3,是的话则将该点存入结果队列,如果该点不在结果队列;其后,该点是核心点的话,找到其所有直接密度可达点,并将这些点放入有序队列,且将有序队列中的点按照可达距离重新排序,如果该点已经在有序队列中且新的可达距离较小,则更新该点的可达距离;重复上述过程,直至有序队列为空。
步骤5、算法结束。
另外,作为示例而非限定,在本发明实施例中,可以根据需要认证的用户对终端设备的操作的次数在所有使用该终端设备的用户对该终端设备的操作的次数的比例,确定一个类中的点的最低数量,例如,如果有500个点的情况下,可以将该最低数量设为300个。
并且,例如,如果需要认证的用户为终端设备的机主,则可以在S210中仅确定一个类,并且,可以将能够聚类在该类的数据标记为1,并将不能够聚类在该类的数据标记为0。即,标记为1的数据可以被认为是属于训练信息集合。
二、基于划分的聚类方法
基于划分的方法(Partition-based methods)的原理是,首先确定散点需要聚成几类,然后挑选几个点作为初始中心点,再然后依据预先定好的启发式算法(heuristicalgorithms)给数据点做迭代重置(iterativerelocation),直到最后到达“类内的点都足够近,类间的点都足够远”的目标效果。
K均值(k-means)算法是基于划分的聚类方法的一例,k-means算法以k为参数,把n个对象分成k个簇,使簇内具有较高的相似度,而簇间的相似度较低。k-means算法的处理过程如下:首先,随机地选择k个对象,每个对象初始地代表了一个簇的平均值或中心,即选择K个初始质心;对剩余的每个对象,根据其与各簇中心的距离,将它赋给最近的簇;然后重新计算每个簇的平均值。这个过程不断重复,直到准则函数收敛,直到质心不发生明显的变化。通常,采用平方误差准则,误差的平方和(SSE)作为全局的目标函数,即最小化每个点到最近质心的欧几里得距离的平方和。此时,簇的质心就是该簇内所有数据点的平均值。具体步骤如下:
步骤1:选择K个点作为初始质心
步骤2:将每个点指派到最近的质心,形成K个簇
步骤3:重新计算每个簇的质心,
步骤4:重复步骤2和3簇不发生变化或达到最大迭代次数。
K-Means算法的详细过程
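As an illustration of the four steps above, a minimal NumPy sketch of k-means follows; random initialisation and a fixed iteration cap are simplifying assumptions added for the example.

```python
import numpy as np

def kmeans(points, k=2, max_iter=100, seed=0):
    pts = np.asarray(points, dtype=float)
    rng = np.random.default_rng(seed)
    centroids = pts[rng.choice(len(pts), size=k, replace=False)]  # step 1: pick K initial centroids
    for _ in range(max_iter):
        # step 2: assign every point to its nearest centroid
        labels = np.argmin(np.linalg.norm(pts[:, None] - centroids[None, :], axis=-1), axis=1)
        # step 3: recompute each centroid as the mean of its cluster
        new_centroids = np.array([pts[labels == c].mean(axis=0) if np.any(labels == c)
                                  else centroids[c] for c in range(k)])
        if np.allclose(new_centroids, centroids):                 # step 4: stop when stable
            break
        centroids = new_centroids
    return labels, centroids
```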
三、基于层次的聚类方法
基于层次的聚类方法(Hierarchical Methods)的原理是,先计算样本之间的距离。每次将距离最近的点合并到同一个类。然后,再计算类与类之间的距离,将距离最近的类合并为一个大类。不停的合并,直到合成了一个类。其中类与类的距离的计算方法有:最短 距离法,最长距离法,中间距离法,类平均法等。比如最短距离法,将类与类的距离定义为类与类之间样本的最短距离。
层次聚类算法根据层次分解的顺序分为:自下底向上和自上向下,即凝聚的层次聚类算法和分裂的层次聚类算法(agglomerative和divisive),也可以理解为自下而上法(bottom-up)和自上而下法(top-down)。自下而上法就是一开始每个个体(object)都是一个类,然后根据linkage寻找同类,最后形成一个“类”。自上而下法就是反过来,一开始所有个体都属于一个“类”,然后根据linkage排除异己,最后每个个体都成为一个“类”。这两种路方法没有孰优孰劣之分,只是在实际应用的时候要根据数据特点以及你想要的“类”的个数,来考虑是自上而下更快还是自下而上更快。至于根据Linkage判断“类”的方法就是最短距离法、最长距离法、中间距离法、类平均法等等(其中类平均法往往被认为是最常用也最好用的方法,一方面因为其良好的单调性,另一方面因为其空间扩张/浓缩的程度适中)。为弥补分解与合并的不足,层次合并经常要与其它聚类方法相结合,如循环定位。
凝聚型层次聚类的策略是先将每个对象作为一个簇,然后合并这些原子簇为越来越大的簇,直到所有对象都在一个簇中,或者某个终结条件被满足。绝大多数层次聚类属于凝聚型层次聚类,它们只是在簇间相似度的定义上有所不同。这里给出采用最小距离的凝聚层次聚类算法流程:
步骤1:将每个对象看作一类,计算两两之间的最小距离;
步骤2:将距离最小的两个类合并成一个新类;
步骤3:重新计算新类与所有类之间的距离;
步骤4:重复步骤2/3,直到所有类最后合并成一类。
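The merging procedure in steps 1 to 4 can be illustrated with the short single-linkage sketch below; stopping at a chosen number of clusters is an added convenience, not part of the original description.

```python
import numpy as np
from itertools import combinations

def single_linkage(points, num_clusters=2):
    pts = np.asarray(points, dtype=float)
    clusters = [[i] for i in range(len(pts))]        # step 1: every sample starts as its own class
    def cluster_dist(a, b):                          # shortest distance between two classes
        return min(np.linalg.norm(pts[i] - pts[j]) for i in a for j in b)
    while len(clusters) > num_clusters:              # steps 2-4: merge the closest pair repeatedly
        ia, ib = min(combinations(range(len(clusters)), 2),
                     key=lambda ij: cluster_dist(clusters[ij[0]], clusters[ij[1]]))
        clusters[ia] = clusters[ia] + clusters[ib]
        del clusters[ib]
    return clusters                                  # lists of sample indices, one per class
```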
应理解,以上列举的聚类的方法和过程仅为示例性说明,其他能够将多个数据中由同一用户的操作产生的数据聚为一类的方法均落入本发明的保护范围内。
从而,根据上述聚类算法确定的多个类中的任意一个类(即,多个训练信息集合)可以对应一个用户,即,一个类中的数据可以被认为是所对应的用户所进行的操作所产生的数据。
需要说明的是,在S610获取上述训练信息期间,如果终端设备被N个用户使用,且每个用户的操作次数达到一定规模,则根据上述S620中描述的处理过程,理论上能够获得N个训练信息集合(或者说,N个类),其中,上述“规模”可以根据聚类算法使用的参数,例如,密度等确定。相反,在S610获取上述训练信息期间,如果终端设备被N个用户使用,且只有M个用户的操作次数达到一定规模,则根据上述S620中描述的处理过程,理论上能够获得M个训练信息集合(或者说,M个类),其中,N可以为大于或等于1的整数,M可以为大于或等于1的整数,且M小于N。
以下,为了便于理解和说明,以在S620中获得了M个训练信息集合为例,进行说明。
在S630,处理器可以从该M个训练信息集合中确定与用户#A对应的训练信息集合#A。
作为示例而非限定,可以列举以下确定训练信息集合#A的方法。
方法1
或者,当S620中采用基于密度的聚类方法时,如果用户#A为终端设备的机主,则存 在较多的基于该用户#A的操作获得的训练信息,此情况下,可以将所确定的多个训练信息集合中点(或者说,训练信息)的数量最多集合确定为训练信息集合#A。
或者,当S620中采用基于密度的聚类方法时,如果用户#A为终端设备的机主,且其他用户对终端设备的操作较少,则存在可能存在聚类后仅能够获得一个训练信息集合的情况,此情况下,可以将该唯一的训练信息集合确定为训练信息集合#A。
或者,当S620中采用基于密度的聚类方法时,如果多个用户所进行的操作的数量均满足聚类的最低条件,即在S620中可以聚类成多个训练信息集合,则处理器可以获取据用户#A对终端设备的使用频率的信息,并基于该使用频率的信息,确定训练信息集合#A。例如,如果该使用频率的信息指示用户#A是多个用户中使用终端设备最频繁的用户,则可以将所确定的多个训练信息集合中点(或者说,训练信息)的数量最多集合确定为训练信息集合#A;如果该使用频率的信息指示用户#A是多个用户中使用终端设备最不频繁的用户,则可以将所确定的多个训练信息集合中点(或者说,训练信息)的数量最少集合确定为训练信息集合#A。
并且,作为示例而非限定,该使用频率的信息可以是用户输入至终端设备的,或者,使用频率的信息也可以是服务器或运营商提供给终端设备的,本发明并未特别限定。
方法2
例如,当S620中采用K-means聚类方法时,则可以确定出各个类的特征信息,或者说,各个类的特征信息的值。从而,处理器可以获取据用户#A的特征信息,并将特征信息与该用户#A相似的训练信息集合确定为训练信息集合#A,即,如上所述确定的训练信息集合#A的特征信息与用户#A的特征信息的相似度大于或等于规定的阈值#a,或者说,如上所述确定的训练信息集合#A的特征信息的值与用户#A的特征信息的值之间的差异小于或等于规定的阈值#b。
并且,作为示例而非限定,该使用频率的信息可以是用户输入至终端设备的,或者,使用频率的信息也可以是服务器或运营商提供给终端设备的,本发明并未特别限定。
应理解,以上列举的确定训练信息集合#A的方法仅为示例性说明,本发明并未限定于此,例如,在如上所述将能够聚类在该类的数据标记为1,并将能够聚类在该类的数据标记为0的情况下,也可以将被标记为1的数据确定为训练信息集合#A中的数据(或者说,训练信息)。
在S640,处理器可以基于训练信息集合#A中的部分或全部训练信息,确定针对用户#A的判定模型#A。
例如,作为示例而非限定,训练信息集合#A中的每个训练信息可以具有多个维度的值,从而,处理器可以将多个训练信息的同一维度的值进行平均处理,从而获得一个基准信息(即判定模型的一例),其中,基准信息的第i个维度的值为多个训练信息的第i个维度的值的平均值。
再例如,处理器可以将多个训练信息的第i个维度的值中出现次数最多的值确定为基准信息的第i个维度的值。
应理解,以上列举的确定判定模型#A的方法仅为示例性说明,本申请并未限定于此,其他能够基于训练数据训练处判定模型的方法和过程均落入本申请的保护范围内。
例如,在本发明实施例中,可以根据自适应增强(AdaptiveBoosting,Adaboost)算法 确定判定模型。具体地说,Adaboost是一种迭代算法,其核心思想是针对同一个训练集训练不同的分类器(弱分类器),然后把这些弱分类器集合起来,构成一个更强的最终分类器(强分类器)。其算法本身是通过改变数据分布来实现的,它根据每次训练集之中每个样本的分类是否正确,以及上次的总体分类的准确率,来确定每个样本的权值。将修改过权值的新数据集送给下层分类器进行训练,最后将每次训练得到的分类器最后融合起来,作为最后的决策分类器。使用adaboost分类器可以排除一些不必要的训练数据特征,并放在关键的训练数据上面。
该算法其实是一个简单的弱分类算法提升过程,这个过程通过不断的训练,可以提高对数据的分类能力。整个过程如下所示:
1.先通过对N个训练样本(即,一个训练信息集合中的多个训练信息)的学习得到第一个弱分类器;
2.将分错的样本和其他的新数据一起构成一个新的N个的训练样本,通过对这个样本的学习得到第二个弱分类器;
3.将1和2都分错了的样本加上其他的新样本构成另一个新的N个的训练样本,通过对这个样本的学习得到第三个弱分类器;
4.最终经过提升的强分类器。即某个数据被分为哪一类要由各分类器权值决定。
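As an illustration of this boosting idea, the sketch below trains an AdaBoost classifier on labelled training information (1 for the owner's operations, 0 for others) using scikit-learn's stump-based implementation; the library choice, the feature layout, and the toy data are assumptions made only for the example.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

# X: one row per user operation (e.g. force, contact area, slide distance, tilt angle)
# y: 1 if the operation is attributed to the owner's cluster, 0 otherwise
X = np.array([[0.42, 1.1, 180.0, 12.0],
              [0.45, 1.0, 172.0, 11.0],
              [0.20, 0.6,  90.0, 25.0],
              [0.22, 0.7,  95.0, 27.0]])
y = np.array([1, 1, 0, 0])

clf = AdaBoostClassifier(n_estimators=50).fit(X, y)       # weak learners combined into a strong one
print(clf.predict([[0.44, 1.05, 175.0, 11.5]]))            # -> [1], i.e. attributed to the owner
```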
根据本申请的方案,由于用户的操作具有习惯性,因此,同一用户在对终端设备操作的过程中可能产生大量的相似的操作,通过使终端设备对根据在第一时段内检测到的用户操作确定的多个训练信息进行聚类,能够使聚类后的同一训练信息集合中的训练信息对应同一用户,进而,基于该训练信息集合中的训练信息所生成的判定模型能够有效判定某一操作的用户是否为该训练信息集合所对应的用户,从而,无需为实现用户识别,额外配置生物特征识别设备,能够降低终端设备的成本。并且,由于生产训练信息的用户操作无需用户刻意进行,或者说,无需为了实现用户识别额外增加使用者的操作负担,因此,能够改善用户体验,提高本生情的用户识别的实用性。
可选地,在上述S610中获取的训练信息(即,由一个用户操作触发的多个传感器检测到的数据所构成的信息)可以是基于针对规定的应用(以下为了便于理解和区分,记作应用#A)的用户操作生成的。并且,处理器可以记录所生成的判定模型(记作,判定模型#A1)可以与该应用#A的对应关系。从而,在S650中,当终端设备检测到用户操作时,可以确定该用户操作所针对的应用,例如,在确定该用户操作是针对应用#A的操作时,可以基于上述对应关系,确定该判定模型#A1,并基于该判定模型#A1判定所检测到的用户操作是否是用户#A进行的。
由于用户针对某一应用的操作可能更具习惯性,例如,同一用户在操作同一应用(例如,浏览新闻或电子书等应用)时,该用户所进行的多个滑动操作之间的距离、方向或角度等的相似度可能大于该用户操作不同应用(例如,浏览新闻和游戏等应用)时所进行的多个滑动操作之间的距离、方向或角度等的相似度。从而,通过上述过程,能够进一步提高本申请的用户识别的准确性和可靠性。
可选地,在上述S610中获取的训练信息(即,由一个用户操作触发的多个传感器检测到的数据所构成的信息)可以是基于针对规定的操作界面(以下为了便于理解和区分,记作操作界面#A)的用户操作生成的。并且,处理器可以记录所生成的判定模型(记作, 判定模型#A2)可以与该操作界面#A的对应关系。从而,在S650中,当终端设备检测到用户操作时,可以确定该用户操作所针对的操作界面,例如,在确定该用户操作是针对操作界面#A的操作时,可以基于上述对应关系,确定该判定模型#A2,并基于该判定模型#A2判定所检测到的用户操作是否是用户#A进行的。
由于用户针对某一操作界面的操作可能更具习惯性,例如,同一用户在操作同一操作界面(例如,文字阅读界面)时,该用户所进行的多个滑动操作之间的距离、方向或角度等的相似度可能大于该用户操作不同操作界面(例如,文字阅读界面和游戏界面等)时所进行的多个滑动操作之间的距离、方向或角度等的相似度。从而,通过上述过程,能够进一步提高本申请的用户识别的准确性和可靠性。
可选地,在上述610中获取的训练信息(即,由一个用户操作触发的多个传感器检测到的数据所构成的信息)可以是规定的类型(以下为了便于理解和区分,记作类型#A)的用户操作生成的。并且,处理器可以记录所生成的判定模型(记作,判定模型#A3)可以与类型#A的对应关系。从而,在S650中,当终端设备检测到用户操作时,可以确定该用户操作的类型,例如,在确定该用户操作的类型时类型#A时,可以基于上述对应关系,确定该判定模型#A3,并基于该判定模型#A3判定所检测到的用户操作是否是用户#A进行的。
由于用户的同一类型的操作可能更具习惯性,例如,同一用户的同一类型(例如,滑动)的操作的参数(例如,触摸力度)之间的相似度,可能大于不同类型(例如,滑动和点击)的操作的参数(例如,触摸力度)之间的相似度。从而,通过上述过程,能够进一步提高本申请的用户识别的准确性和可靠性。
图7示出了本申请的用户识别的方法700的一例的示例性说明,例如,该方法700可以由上述终端设备100和服务器配合执行。
其中,该服务器可以是计算机等具有计算功能的设备,并且,该服务器可以与终端设备之间可以经过例如,互联网等实现通信连接。
如图7所示,在S710,该终端设备可以基于用户操作确定训练信息,并且,该过程可以与上述S610描述的过程相似,这里,为了避免赘述,省略其详细说明。
在S720,终端设备可以将所获得的训练信息发送给服务器。
在S430,服务器可以对该训练信息进行聚类处理,以确定至少一个训练信息集合,其中,该过程可以与上述S620描述的过程相似,这里,为了避免赘述,省略其详细说明。
在S740,服务器可以从所确定的至少一个训练信息集合中确定训练信息集合#A,其中,该过程可以与上述S630描述的过程相似,这里,为了避免赘述,省略其详细说明。
在S750,服务器可以基于训练信息集合#A,确定判定模型#A,其中,该过程可以与上述S740描述的过程相似,这里,为了避免赘述,省略其详细说明。
在S760,服务器可以向终端设备发送该判定模型#A。
可选地,在本发明实施例中,在如上所述生成的判定模型可以用于判定操作手机的用户是否是机主,即,上述S610或S710中获得的训练信息可以包括基于机主的操作生成的训练信息,例如,在采用基于密度的聚类方法时,例如,可以采用增大采集数量等方式,以增大由机主的操作生成的训练信息的比例,以确保能够将由机主的操作生成的训练信息聚为一类,在S630或S730中可以根据机主的信息确定与机主的操作相对应的训练信息集 合,从而,在S640或S740中能够基于聚类后的训练信息集合生成用于判定操作的用户是否为机主的判定模型,进而,能够基于所确定的判定模型,判定某一用户操作是否为机主执行的。
图8是本申请实施例提供的管理终端设备的装置800的示意性框图。
如图8所示,该装置800包括:
检测单元810,用于获取第一操作对应的操作信息,其中,该操作信息包括触摸信息和/或该终端设备的姿态信息;
处理单元820,根据该第一操作对应的操作信息和第一判定模型的匹配程度,管理该终端设备,其中,该第一判定模型是基于第一用户所进行的操作的操作信息确定的。
可选地,该触摸信息包括以下至少一种信息:触摸操作的力度的信息、触摸操作的位置的信息、触摸操作的接触面积的信息、触摸操作的接触时间的信息、触摸操作的滑动角度的信息、触摸操作的滑动方向的信息、触摸操作的滑动距离的信息。
可选地,该第一用户包括该终端设备的机主。
可选地,处理单元820具体用于:当根据该第一操作对应的操作信息与第一判定模型的匹配程度高于预设的第一阈值时,执行该第一操作对应的处理。
例如,当该第一操作是对图片的操作(例如,删除图片的操作)时,如果判定为根据该第一操作对应的操作信息与第一判定模型的匹配程度高于预设的第一阈值,则可以基于该操作对图像进行处理(例如,删除图片)。
可选地,处理单元820具体用于:当根据该第一操作对应的操作信息与第一判定模型的匹配程度高于预设的第一阈值时,解锁第一应用。
例如,当该第一操作是图案解锁操作时,如果判定为根据该第一操作对应的操作信息与第一判定模型的匹配程度高于预设的第一阈值,则可以在图案解锁正确时,确定各位可以解锁。
可选地,处理单元820具体用于:当根据该第一操作对应的操作信息与第一判定模型的匹配程度低于预设的第一阈值时,禁止执行该第一操作对应的处理。
例如,当该第一操作是对图片的操作(例如,删除图片的操作)时,如果判定为根据该第一操作对应的操作信息与第一判定模型的匹配程度低于预设的第一阈值,则可以禁止基于该操作对图像进行处理(例如,禁止删除图片)。
可选地,处理单元820具体用于:当根据该第一操作对应的操作信息与第一判定模型的匹配程度低于预设的第一阈值时,将该终端设备当前显示的界面切换至锁屏界面。
可选地,处理单元820具体用于:当根据该第一操作对应的操作信息与第一判定模型的匹配程度低于预设的第一阈值时,播放预设的警报信号。
可选地,处理单元820具体用于:当根据该第一操作对应的操作信息与第一判定模型的匹配程度低于预设的第一阈值时,锁定第一应用。
其中,该第一操作可以是针对第一应用的操作,例如,在第一应用的界面上的操作。
或者,该第一操作可以是在启动第一应用之前的操作,或者,该第一操作可以是第一应用在后台运行期间的操作。
可选地,该第一操作是在检测到用于启动该第一应用的第二操作之前所检测到的操作,以及处理单元820具体用于:在该检测单元810检测到该第二操作时,不显示解锁界 面,并启动该第一应用。
可选地,该第一操作是在检测到用于启动该第一应用的第二操作之前所检测到的操作,以及处理单元820具体用于:在检测单元810检测到该第二操作时,显示解锁界面。
可选地,该第一操作是在检测到用于启动该第一应用的第二操作之前所检测到的操作,以及处理单元820具体用于:禁止启动该第一应用。
可选地,该第一操作是用于解锁该第一应用的操作。
可选地,该第一应用包括以下至少一种应用:该第一操作所操作的应用、该终端设备的机主预先设置的应用和该终端设备的制造商预先设置的应用。
可选地,处理单元820具体用于:当根据该第一操作对应的操作信息与第一判定模型的匹配程度低于预设的第一阈值时,根据第一操作对应的操作信息,从多个判定模型中确定第二判定模型,该第二判定模型与该第一操作对应的操作信息的匹配程度高于预设的第二阈值,或该第二判定模型是该多个判定模型中与该第一操作对应的操作信息的匹配程度最大的判定模型,其中,该多个判定模型与多个用户一一对应,每个判定模型是基于所对应的用户所进行的操作的操作信息确定的;根据该第二判定模型对应的用户的用户权限,管理该终端设备。
可选地,处理单元820具体用于:根据检测单元810第一时段内检测到的用户操作,确定多个训练信息,该用户操作包括多个用户进行的操作,该训练信息包括该用户操作的触摸操作信息和/或该终端设备在该用户操作下的姿态信息;对该多个训练信息进行聚类处理,以确定至少一个训练信息集合;根据该多个用户中的第二用户的信息,从该至少一个训练信息集合中,确定与该第二用户相对应的第一训练信息集合;根据该第一训练信息集合中的训练信息,确定针对该第二用户的判定模型。
可选地,处理单元820具体用于:基于预设的第三阈值,对该多个训练信息进行聚类处理,其中,每个训练信息集合中的训练信息的密度大于或等于该第三阈值;该根据该多个用户中的第二用户的信息,从该多个训练信息集合中,确定与该第二用户相对应的第一训练信息集合,包括:当该第二用户的信息指示该第二用户为该终端设备的机主时,将该多个训练信息集合中训练信息的密度最大的训练信息集合确定为该第一训练信息集合。
可选地,处理单元820具体用于:根据对象排序识别聚类结构OPTICS算法,对该多个训练信息进行聚类处理。
可选地,该终端设备具有至少两种操作模式,其中,在第一操作模式下该终端设备需要识别用户是否为机主,在第二操作模式下该终端设备不需要识别用户是否为机主,以及可选地,处理单元820具体用于:在根据该第一操作对应的操作信息和第一判定模型的匹配程度之前,确定该终端设备当前的操作模式为该第一操作模式。
该管理终端设备的装置800可以对应上述方法200中描述的终端设备,并且,该用户识别的装置800中各个模块或单元分别用于执行上述方法200中的终端设备所执行的各动作和处理过程,这里,为了避免赘述,省略其详细说明。
本领域普通技术人员可以意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本 申请的范围。
所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,上述描述的系统、装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请所提供的几个实施例中,应该理解到,所揭露的系统、装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。
所述功能如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(Read-Only Memory,ROM)、随机存取存储器(Random Access Memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (32)

  1. 一种管理终端设备的方法,其特征在于,包括:
    获取第一操作对应的操作信息,其中,所述操作信息包括触摸信息和/或所述终端设备的姿态信息;
    根据所述第一操作对应的操作信息和第一判定模型的匹配程度,管理所述终端设备,其中,所述第一判定模型是基于第一用户所进行的操作的操作信息确定的。
  2. 根据权利要求1所述的管理终端设备的方法,其特征在于,所述触摸信息包括以下至少一种信息:
    触摸操作的力度的信息、触摸操作的位置的信息、触摸操作的接触面积的信息、触摸操作的接触时间的信息、触摸操作的滑动角度的信息、触摸操作的滑动方向的信息、触摸操作的滑动距离的信息。
  3. 根据权利要求1或2所述的管理终端设备的方法,其特征在于,所述第一用户包括所述终端设备的机主。
  4. 根据权利要求3所述的管理终端设备的方法,其特征在于,根据所述第一操作对应的操作信息和第一判定模型的匹配程度,管理所述终端设备包括:
    当根据所述第一操作对应的操作信息与第一判定模型的匹配程度高于预设的第一阈值时,执行所述第一操作对应的处理;或者
    当根据所述第一操作对应的操作信息与第一判定模型的匹配程度高于预设的第一阈值时,解锁第一应用;或者
    当根据所述第一操作对应的操作信息与第一判定模型的匹配程度低于预设的第一阈值时,禁止执行所述第一操作对应的处理;或者
    当根据所述第一操作对应的操作信息与第一判定模型的匹配程度低于预设的第一阈值时,将所述终端设备当前显示的界面切换至锁屏界面;或者
    当根据所述第一操作对应的操作信息与第一判定模型的匹配程度低于预设的第一阈值时,播放预设的警报信号;或者
    当根据所述第一操作对应的操作信息与第一判定模型的匹配程度低于预设的第一阈值时,锁定第一应用。
  5. 根据权利要求4所述的管理终端设备的方法,其特征在于,所述第一操作是在检测到用于启动所述第一应用的第二操作之前所检测到的操作,以及
    所述解锁第一应用,包括:
    在检测到所述第二操作时,不显示解锁界面,并启动所述第一应用;
    所述锁定第一应用,包括:
    在检测到所述第二操作时,显示解锁界面;或
    禁止启动所述第一应用。
  6. 根据权利要求4所述的管理终端设备的方法,其特征在于,所述第一操作是用于解锁所述第一应用的操作。
  7. 根据权利要求4至6中任一项所述的管理终端设备的方法,其特征在于,所述第 一应用包括以下至少一种应用:
    所述第一操作所操作的应用、所述终端设备的机主预先设置的应用和所述终端设备的制造商预先设置的应用。
  8. 根据权利要求3所述的管理终端设备的方法,其特征在于,根据所述第一操作对应的操作信息和第一判定模型的匹配程度,管理所述终端设备包括:
    当根据所述第一操作对应的操作信息与第一判定模型的匹配程度低于预设的第一阈值时,根据第一操作对应的操作信息,从多个判定模型中确定第二判定模型,所述第二判定模型与所述第一操作对应的操作信息的匹配程度高于预设的第二阈值,或所述第二判定模型是所述多个判定模型中与所述第一操作对应的操作信息的匹配程度最大的判定模型,其中,所述多个判定模型与多个用户一一对应,每个判定模型是基于所对应的用户所进行的操作的操作信息确定的;
    根据所述第二判定模型对应的用户的用户权限,管理所述终端设备。
  9. 根据权利要求1至8中任一项所述的管理终端设备的方法,其特征在于,所述方法还包括:
    根据第一时段内检测到的用户操作,确定多个训练信息,所述用户操作包括多个用户进行的操作,所述训练信息包括所述用户操作的触摸操作信息和/或所述终端设备在所述用户操作下的姿态信息;
    对所述多个训练信息进行聚类处理,以确定至少一个训练信息集合;
    根据所述多个用户中的第二用户的信息,从所述至少一个训练信息集合中,确定与所述第二用户相对应的第一训练信息集合;
    根据所述第一训练信息集合中的训练信息,确定针对所述第二用户的判定模型。
  10. 根据权利要求9所述的管理终端设备的方法,其特征在于,所述对所述多个训练信息进行聚类处理,包括:
    基于预设的第三阈值,对所述多个训练信息进行聚类处理,其中,每个训练信息集合中的训练信息的密度大于或等于所述第三阈值;
    所述根据所述多个用户中的第二用户的信息,从所述多个训练信息集合中,确定与所述第二用户相对应的第一训练信息集合,包括:
    当所述第二用户的信息指示所述第二用户为所述终端设备的机主时,将所述多个训练信息集合中训练信息的密度最大的训练信息集合确定为所述第一训练信息集合。
  11. 根据权利要求10所述的管理终端设备的方法,其特征在于,所述对所述多个训练信息进行聚类处理,包括:
    根据对象排序识别聚类结构OPTICS算法,对所述多个训练信息进行聚类处理。
  12. 根据权利要求1至11中任一项所述的管理终端设备的方法,其特征在于,所述终端设备具有至少两种操作模式,其中,在第一操作模式下所述终端设备需要识别用户是否为机主,在第二操作模式下所述终端设备不需要识别用户是否为机主,以及
    在根据所述第一操作对应的操作信息和第一判定模型的匹配程度之前,所述方法还包括:
    确定所述终端设备当前的操作模式为所述第一操作模式。
  13. 一种终端设备,其特征在于,包括:
    传感器,用于检测第一操作对应的操作信息,其中,所述操作信息包括触摸信息和/或所述终端设备的姿态信息;
    处理器,用于根据所述第一操作对应的操作信息和第一判定模型的匹配程度,管理所述终端设备,其中,所述第一判定模型是基于第一用户所进行的操作的操作信息确定的。
  14. 根据权利要求13所述的终端设备,其特征在于,所述触摸信息包括以下至少一种信息:
    触摸操作的力度的信息、触摸操作的位置的信息、触摸操作的接触面积的信息、触摸操作的接触时间的信息、触摸操作的滑动角度的信息、触摸操作的滑动方向的信息、触摸操作的滑动距离的信息。
  15. 根据权利要求13或14所述的终端设备,其特征在于,所述第一用户包括所述终端设备的机主。
  16. 根据权利要求15所述的终端设备,其特征在于,所述处理器具体用于:
    当根据所述第一操作对应的操作信息与第一判定模型的匹配程度高于预设的第一阈值时,执行所述第一操作对应的处理;或者
    当根据所述第一操作对应的操作信息与第一判定模型的匹配程度高于预设的第一阈值时,解锁第一应用;或者
    当根据所述第一操作对应的操作信息与第一判定模型的匹配程度低于预设的第一阈值时,禁止执行所述第一操作对应的处理;或者
    当根据所述第一操作对应的操作信息与第一判定模型的匹配程度低于预设的第一阈值时,将所述终端设备当前显示的界面切换至锁屏界面;或者
    当根据所述第一操作对应的操作信息与第一判定模型的匹配程度低于预设的第一阈值时,播放预设的警报信号;或者
    当根据所述第一操作对应的操作信息与第一判定模型的匹配程度低于预设的第一阈值时,锁定第一应用。
  17. 根据权利要求16所述的终端设备,其特征在于,所述第一操作是在检测到用于启动所述第一应用的第二操作之前所检测到的操作,以及
    在解锁第一应用时,所述处理器具体用于在检测到所述第二操作时,不显示解锁界面,并启动所述第一应用;或者
    在锁定第一应用时,所述处理器具体用于在检测到所述第二操作时,在检测到所述第二操作时,显示解锁界面,或禁止启动所述第一应用。
  18. 根据权利要求16所述的终端设备,其特征在于,所述第一操作是用于解锁所述第一应用的操作。
  19. 根据权利要求16至18中任一项所述的终端设备,其特征在于,所述第一应用包括以下至少一种应用:
    所述第一操作所操作的应用、所述终端设备的机主预先设置的应用和所述终端设备的制造商预先设置的应用。
  20. 根据权利要求15所述的终端设备,其特征在于,所述处理器具体用于当根据所述第一操作对应的操作信息与第一判定模型的匹配程度低于预设的第一阈值时,根据第一操作对应的操作信息,从多个判定模型中确定第二判定模型,所述第二判定模型与所述第 一操作对应的操作信息的匹配程度高于预设的第二阈值,或所述第二判定模型是所述多个判定模型中与所述第一操作对应的操作信息的匹配程度最大的判定模型,其中,所述多个判定模型与多个用户一一对应,每个判定模型是基于所对应的用户所进行的操作的操作信息确定的;
    用于根据所述第二判定模型对应的用户的用户权限,管理所述终端设备。
  21. 根据权利要求13至20中任一项所述的终端设备,其特征在于,所述传感器还用于根据第一时段内检测到的用户操作,确定多个训练信息,所述用户操作包括多个用户进行的操作,所述训练信息包括所述用户操作的触摸操作信息和/或所述终端设备在所述用户操作下的姿态信息;
    所述处理器还用于对所述多个训练信息进行聚类处理,以确定至少一个训练信息集合,用于根据所述多个用户中的第二用户的信息,从所述至少一个训练信息集合中,确定与所述第二用户相对应的第一训练信息集合,用于根据所述第一训练信息集合中的训练信息,确定针对所述第二用户的判定模型。
  22. 根据权利要求21所述的终端设备,其特征在于,所述处理器具体用于基于预设的第三阈值,对所述多个训练信息进行聚类处理,其中,每个训练信息集合中的训练信息的密度大于或等于所述第三阈值,并且当所述第二用户的信息指示所述第二用户为所述终端设备的机主时,将所述多个训练信息集合中训练信息的密度最大的训练信息集合确定为所述第一训练信息集合。
  23. 根据权利要求22所述的终端设备,其特征在于,所述处理器具体用于根据对象排序识别聚类结构OPTICS算法,对所述多个训练信息进行聚类处理。
  24. 根据权利要求13至23中任一项所述的终端设备,其特征在于,所述终端设备具有至少两种操作模式,其中,在第一操作模式下所述终端设备需要识别用户是否为机主,在第二操作模式下所述终端设备不需要识别用户是否为机主,以及
    所述处理器还用于在根据所述第一操作对应的操作信息和第一判定模型的匹配程度之前,确定所述终端设备当前的操作模式为所述第一操作模式。
  25. 一种管理终端设备的方法,其特征在于,包括:
    显示第一界面;
    接收用户的第一操作并获取所述第一操作对应的操作信息,其中,所述操作信息包括触摸信息和/或所述终端设备的姿态信息;
    响应于所述第一操作对应的操作信息为第一类型,显示第二界面,所述第二界面不同于所述第一界面;及
    响应于所述第一操作对应的操作信息为第二类型,所述第二类型不同于所述第一类型,显示第三界面,所述第三界面包括用户验证界面,响应于验证通过,显示所述第二界面。
  26. The method according to claim 25, wherein the touch information comprises at least one of the following:
    information about the force of a touch operation, information about the position of a touch operation, information about the contact area of a touch operation, information about the contact time of a touch operation, information about the sliding angle of a touch operation, information about the sliding direction of a touch operation, and information about the sliding distance of a touch operation.
  27. The method according to claim 25 or 26, wherein the second interface is an interface of an application program.
  28. The method according to any one of claims 25 to 27, wherein the method further comprises:
    before the second interface is displayed, receiving a second operation of the user;
    in response to the second operation and the operation information corresponding to the first operation being of the first type, displaying the second interface; and
    in response to the second operation and the operation information corresponding to the first operation being of the second type, displaying the third interface.
  29. The method according to any one of claims 25 to 28, wherein the method further comprises:
    when the third interface is displayed, receiving a third operation of the user and obtaining operation information corresponding to the third operation, wherein the operation information of the third operation comprises touch information and/or posture information of the terminal device; and
    in response to the operation information of the third operation being of a third operation type, displaying the second interface.
  30. A terminal device, comprising a processor and a memory, wherein the memory stores computer instructions which, when run on the terminal device, cause the terminal device to perform the method according to any one of claims 1 to 12 and 25 to 29.
  31. A computer-readable storage medium storing computer instructions which, when run on a terminal device, cause the terminal device to perform the method according to any one of claims 1 to 12 and 25 to 29.
  32. A computer program product which, when run on a terminal device, causes the terminal device to perform the method according to any one of claims 1 to 12 and 25 to 29.
PCT/CN2018/083057 2018-03-28 2018-04-13 Terminal device management method and terminal device WO2019184011A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US16/977,041 US11468153B2 (en) 2018-03-28 2018-04-13 Terminal device management method and terminal device
CN201880088819.4A CN111684762B (zh) Terminal device management method and terminal device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810264881.6 2018-03-28
CN201810264881 2018-03-28

Publications (1)

Publication Number Publication Date
WO2019184011A1 true WO2019184011A1 (zh) 2019-10-03

Family

ID=68060834

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/083057 WO2019184011A1 (zh) 2018-03-28 2018-04-13 Terminal device management method and terminal device

Country Status (3)

Country Link
US (1) US11468153B2 (zh)
CN (1) CN111684762B (zh)
WO (1) WO2019184011A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112929491B (zh) * 2021-02-07 2022-08-26 Spreadtrum Communications (Shanghai) Co., Ltd. Application program starting method and related apparatus
CN117015014A (zh) * 2022-04-28 2023-11-07 ZTE Corporation Data transmission method, model training method, device, and computer-readable medium

Family Cites Families (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7697729B2 (en) 2004-01-29 2010-04-13 Authentec, Inc. System for and method of finger initiated actions
US8447077B2 (en) 2006-09-11 2013-05-21 Validity Sensors, Inc. Method and apparatus for fingerprint motion tracking using an in-line array
US8165355B2 (en) 2006-09-11 2012-04-24 Validity Sensors, Inc. Method and apparatus for fingerprint motion tracking using an in-line array for use in navigation applications
US8358815B2 (en) 2004-04-16 2013-01-22 Validity Sensors, Inc. Method and apparatus for two-dimensional finger motion tracking and control
US8229184B2 (en) 2004-04-16 2012-07-24 Validity Sensors, Inc. Method and algorithm for accurate finger motion tracking
US8175345B2 (en) 2004-04-16 2012-05-08 Validity Sensors, Inc. Unitized ergonomic two-dimensional fingerprint motion tracking device and method
KR100856203B1 (ko) 2006-06-27 2008-09-03 삼성전자주식회사 지문 인식 센서를 이용한 사용자 입력 장치 및 방법
JP2008009835A (ja) 2006-06-30 2008-01-17 Kyocera Mita Corp 操作表示装置
US8782775B2 (en) 2007-09-24 2014-07-15 Apple Inc. Embedded authentication systems in an electronic device
US8278946B2 (en) 2009-01-15 2012-10-02 Validity Sensors, Inc. Apparatus and method for detecting finger activity on a fingerprint sensor
KR101549558B1 (ko) 2009-03-18 2015-09-03 엘지전자 주식회사 휴대 단말기 및 그 제어방법
US9201539B2 (en) 2010-12-17 2015-12-01 Microsoft Technology Licensing, Llc Supplementing a touch input mechanism with fingerprint detection
CN102752441A (zh) 2011-04-22 2012-10-24 比亚迪股份有限公司 一种具有触控屏的移动终端及其控制方法
JP5799628B2 (ja) 2011-07-15 2015-10-28 ソニー株式会社 情報処理装置、情報処理方法、及びプログラム
KR101853856B1 (ko) 2011-10-04 2018-05-04 엘지전자 주식회사 이동 단말기 및 이의 제어방법
US9071970B2 (en) * 2011-12-05 2015-06-30 Sony Corporation Terminal device
CN102594980A (zh) 2011-12-19 2012-07-18 广东步步高电子工业有限公司 一种基于指纹识别技术的多级菜单显示方法及系统
CN103176727B (zh) 2011-12-23 2016-01-27 宇龙计算机通信科技(深圳)有限公司 应用程序的启动方法及通信终端
JP2013140440A (ja) 2011-12-28 2013-07-18 Sharp Corp 情報処理装置およびその駆動方法、制御プログラム、可読記憶媒体
CN103530047B (zh) 2012-07-06 2019-12-24 百度在线网络技术(北京)有限公司 一种触摸屏设备事件触发方法及装置
CN103678965B (zh) 2012-09-14 2018-10-16 百度在线网络技术(北京)有限公司 一种保护移动设备安全的方法及装置
CN103870041B (zh) 2012-12-14 2017-09-22 联想(北京)有限公司 终端设备及其用户识别方法
KR20140079960A (ko) 2012-12-20 2014-06-30 크루셜텍 (주) 지문 인식 이용한 애플리케이션을 실행하기 위한 방법, 장치 및 컴퓨터 판독 가능 기록 매체
US9298361B2 (en) 2013-03-15 2016-03-29 Apple Inc. Analyzing applications for different access modes
KR102127381B1 (ko) 2013-03-29 2020-06-26 엘지전자 주식회사 전자 종이 디스플레이 패널을 이용하는 모바일 디바이스 및 제어 방법
KR101419784B1 (ko) 2013-06-19 2014-07-21 크루셜텍 (주) 지문 인식 및 인증을 위한 방법 및 장치
CN104346549A (zh) 2013-08-08 2015-02-11 联想(北京)有限公司 一种信息处理方法以及一种电子设备
CN104346063A (zh) 2013-08-08 2015-02-11 联想(北京)有限公司 一种信息处理的方法及一种电子设备
KR20150018256A (ko) 2013-08-09 2015-02-23 엘지전자 주식회사 모바일 디바이스 및 그 제어 방법
CN103440445A (zh) 2013-08-14 2013-12-11 深圳市亚略特生物识别科技有限公司 电子设备的解锁控制方法及系统
CN103516907A (zh) 2013-09-27 2014-01-15 朱鹏 一种唤醒和熄灭屏幕的方法及移动终端
CN103530543B (zh) 2013-10-30 2017-11-14 无锡赛思汇智科技有限公司 一种基于行为特征的用户识别方法及系统
CN104077518A (zh) 2014-07-03 2014-10-01 南昌欧菲生物识别技术有限公司 解锁并执行应用程序的装置及方法
CN104036177B (zh) 2014-07-03 2017-11-21 南昌欧菲生物识别技术有限公司 智能终端指纹解锁装置及方法
CN104217151B (zh) 2014-09-11 2017-10-27 三星电子(中国)研发中心 智能终端应用程序的加锁方法及智能终端
CN104318138B (zh) 2014-09-30 2018-05-08 杭州同盾科技有限公司 一种验证用户身份的方法和装置
KR101552116B1 (ko) 2014-11-20 2015-09-15 주하영 지문 입력 방향 및 손가락의 종류를 이용한 화면 잠금 기능을 가지는 이동 통신 단말기
CN104572175B (zh) 2014-12-17 2018-01-19 广东欧珀移动通信有限公司 一种快速启动非隐私类型应用的方法及装置
CN105893809A (zh) 2015-01-06 2016-08-24 江南大学 使用svm分类器识别智能终端用户身份的方法
CN104572127B (zh) 2015-01-28 2019-03-01 努比亚技术有限公司 终端界面布局的方法及终端
US11093988B2 (en) 2015-02-03 2021-08-17 Fair Isaac Corporation Biometric measures profiling analytics
CN104598134B (zh) 2015-02-12 2017-07-21 广东欧珀移动通信有限公司 一种移动终端的指纹操作方法及系统
CN104834520A (zh) 2015-04-17 2015-08-12 惠州Tcl移动通信有限公司 智能终端应用启动的方法及智能终端
CN104850433A (zh) 2015-04-30 2015-08-19 广东欧珀移动通信有限公司 一种移动终端应用启动方法及移动终端
CN105141768A (zh) 2015-08-31 2015-12-09 努比亚技术有限公司 多用户识别方法、装置及移动终端
CN106714163B (zh) 2016-12-05 2020-07-14 同济大学 一种基于姿势变化的手势行为认证模式的构建方法及系统
US20190236249A1 (en) * 2018-01-31 2019-08-01 Citrix Systems, Inc. Systems and methods for authenticating device users through behavioral analysis

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120293404A1 (en) * 2011-05-19 2012-11-22 Panasonic Corporation Low Cost Embedded Touchless Gesture Sensor
CN105068743A (zh) * 2015-06-12 2015-11-18 Xi'an Jiaotong University Mobile terminal user identity authentication method based on multi-finger touch behavior characteristics
CN107026731A (zh) * 2016-01-29 2017-08-08 Alibaba Group Holding Limited Method and apparatus for user identity verification

Also Published As

Publication number Publication date
CN111684762B (zh) 2022-11-18
CN111684762A (zh) 2020-09-18
US20210042402A1 (en) 2021-02-11
US11468153B2 (en) 2022-10-11

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 18912454
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 18912454
    Country of ref document: EP
    Kind code of ref document: A1