CN115600171B - Computer user action recognition system based on user habit judgment - Google Patents

Computer user action recognition system based on user habit judgment

Info

Publication number
CN115600171B
Authority
CN
China
Prior art keywords
action
user
unit
common
actions
Prior art date
Legal status
Active
Application number
CN202211303559.2A
Other languages
Chinese (zh)
Other versions
CN115600171A (en)
Inventor
徐智
宋紫林
Current Assignee
Xingkong Yinghua Wuhan Technology Co ltd
Original Assignee
Xingkong Yinghua Wuhan Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Xingkong Yinghua Wuhan Technology Co ltd
Priority to CN202211303559.2A
Publication of CN115600171A
Application granted
Publication of CN115600171B
Active legal status
Anticipated expiration


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The invention relates to the technical field of identity recognition, and in particular to a computer user action recognition system based on user habit judgment. The system comprises a user action recognition unit, an ending operation information recognition unit and an action corresponding operation binding unit. The ending operation information recognition unit determines which computer program operation the user performs after an action ends, and the action corresponding operation binding unit derives from these observations the frequency of each program operation that follows each of the user's actions. The user's identity can therefore be judged from action habits, which helps the computer recognize the user and improves the security of the computer system. In addition, the end point of a user action can be recognized from the program operation that habitually follows it, and the corresponding program operation can be invoked automatically once the action ends, further improving the intelligence of the computer system and reducing the user's operating time.

Description

Computer user action recognition system based on user habit judgment
Technical Field
The invention relates to the technical field of identity recognition, in particular to a computer user action recognition system based on user habit judgment.
Background
A computer is a modern electronic machine for high-speed computing: it performs numerical and logical computation, has a memory function, and can automatically process large amounts of data at high speed according to a stored program.
As computers have become more widespread, their functions have improved and so has the user experience. Users often produce habitual actions while using a computer. For example, after watching the screen for a long time a user's eyes become tired, the user rubs their eyes, and then lowers the screen brightness to relieve the fatigue; at present this whole sequence must be adjusted by the user manually. Likewise, some program software on the computer must be opened by hand, and for software that is opened through a complex sequence of steps, fully manual opening is inefficient and forces the user to memorize the steps in order, which increases the difficulty of using such software.
Disclosure of Invention
The invention aims to provide a computer user action recognition system based on user habit judgment, so as to solve the problems raised in the background art.
In order to achieve the above object, a computer user action recognition system based on user habit judgment is provided. The system comprises a video monitoring unit used for monitoring a user using the computer. The output end of the video monitoring unit is connected with a user action recognition unit, which recognizes the monitored user actions. The output end of the user action recognition unit is connected with an ending operation information recognition unit, which recognizes the ending time point of a user action and determines the computer program operation performed after the action ends. The output end of the ending operation information recognition unit is connected with an action corresponding operation binding unit, which obtains, from the program operations performed after the user's actions end, the frequency of each program operation following each action, binds the most frequent program operation with the corresponding user action, and generates binding information. The output end of the action corresponding operation binding unit is connected with a computer program response unit, which formulates a response program corresponding to the binding information.
As a further improvement of the technical scheme, the output end of the ending operation information identification unit is connected with a data storage unit, the input end of the data storage unit is connected with the output end of the user action identification unit, and the data storage unit is used for storing the user action and each computer program operation after the user action is ended in real time.
As a further improvement of the technical scheme, the user action identification unit comprises a unit time making module, the unit time making module is used for making calculation time, the output end of the unit time making module is connected with a user common action statistics module, and the user common action statistics module is used for counting user common actions according to the made calculation time.
As a further improvement of the technical scheme, the user action identification unit further comprises an action micro-variation integration module, and the output end of the action micro-variation integration module is connected with the input end of the user common action statistics module.
As a further improvement of the technical scheme, the user action identification unit adopts a common action statistical algorithm, and the algorithm formula is as follows:
X_T = {x_1, x_2, ..., x_n},  F(x_i) = 0 when x_i < x_0, and F(x_i) = 1 when x_i ≥ x_0,

wherein X_T is the set of user action counts recorded within the unit time, T is the formulated unit time, x_1 to x_n are the counts of each type of user action within the unit time, n is the number of user action types counted, x_0 is the user action count threshold, F is the common action judging function, and x_i is the count of the judged user action. When x_i is less than the threshold x_0, the common action judging function F outputs 0, indicating that the judged user action is not a common action; when x_i is not less than the threshold x_0, F outputs 1, indicating that the judged user action is a common action.
As a further improvement of the technical scheme, the data storage unit comprises a common action recording module, the common action recording module is used for recording all counted common actions, the output end of the common action recording module is connected with a common action real-time sequencing module, the common action real-time sequencing module is used for sequencing common actions in real time, the output end of the common action real-time sequencing module is connected with a common action real-time replacement module, and the common action real-time replacement module updates the common actions in real time according to the common action real-time sequencing.
As a further improvement of the technical scheme, the output end of the user action identification unit is connected with an accidental action eliminating unit, and the accidental action eliminating unit eliminates accidental actions.
As a further improvement of the technical scheme, the output end of the accidental action eliminating unit is connected with the input end of the data storage unit.
As a further improvement of the technical scheme, the output end of the accidental action eliminating unit is connected with an action operation unbinding unit, and the output end of the action operation unbinding unit is connected with the input end of the action corresponding operation binding unit.
Compared with the prior art, the invention has the beneficial effects that:
1. In the computer user action recognition system based on user habit judgment, the ending operation information recognition unit determines the computer program operation performed after a user action ends, and the action corresponding operation binding unit derives from these observations the frequency of each program operation that follows each action. The user's identity can therefore be judged from action habits, helping the computer recognize the user better and improving the security of the computer system. In addition, the end point of a user action can be recognized from the program operation that habitually follows it, and the corresponding program operation can be invoked automatically when the action ends, which further improves the intelligence of the computer system and reduces the user's operating time.
2. In the computer user action recognition system based on user habit judgment, the data storage unit stores user actions and each computer program operation performed after they end in real time, compares the occurrence rate of each program operation in real time, and switches the bound program operation according to the distribution of these occurrence rates, so as to adapt to the user's needs in different periods.
3. In the computer user action recognition system based on user habit judgment, actions that differ only slightly from a common action are integrated by the action micro-variation integration module and subsumed into that common action, which improves the integration rate of common actions and reduces the misjudgment rate.
4. In the computer user action recognition system based on user habit judgment, accidental actions are counted by the accidental action eliminating unit and removed in time, preventing them from affecting the later statistics of common actions.
Drawings
FIG. 1 is an overall flow chart of the present invention;
FIG. 2 is a flowchart of a user action recognition unit according to the present invention;
FIG. 3 is a flow chart of a data storage unit according to the present invention.
The meaning of each reference sign in the figure is:
10. a video monitoring unit;
20. a user action recognition unit; 210. a unit time making module; 220. a user common action statistics module; 230. an action micro-variation integration module;
30. an end operation information identification unit;
40. a data storage unit; 410. a common action recording module; 420. a common action real-time sequencing module; 430. a common action real-time replacement module;
50. an action corresponding operation binding unit;
60. a computer program response unit;
70. an accidental action eliminating unit;
80. the action operation unbinding unit.
Description of the embodiments
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring to fig. 1-3, a computer user action recognition system based on user habit judgment is provided. It comprises a video monitoring unit 10 used for monitoring a user using the computer. The output end of the video monitoring unit 10 is connected with a user action recognition unit 20, which recognizes the monitored user actions. The output end of the user action recognition unit 20 is connected with an ending operation information recognition unit 30, which recognizes the ending time point of a user action and determines the computer program operation performed after the action ends. The output end of the ending operation information recognition unit 30 is connected with an action corresponding operation binding unit 50, which obtains, from the program operations performed after the user's actions end, the frequency of each program operation following each action, binds the most frequent program operation with the corresponding user action, and generates binding information. The output end of the action corresponding operation binding unit 50 is connected with a computer program response unit 60, which formulates a response program corresponding to the binding information.
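The response side of this data flow can be summarized in a short Python sketch. It is only an illustrative reading of the unit connections described above; the class and method names (ComputerProgramResponseUnit, register, respond) and the string-based action and operation names are assumptions, not details disclosed in the patent.

```python
from typing import Optional

class ComputerProgramResponseUnit:          # corresponds to unit (60)
    """Stores bindings produced by the action corresponding operation binding
    unit (50) and replays the bound operation when a bound action recurs."""
    def __init__(self) -> None:
        self.bindings = {}                  # action name -> program operation

    def register(self, action: str, operation: str) -> None:
        self.bindings[action] = operation   # binding information from unit (50)

    def respond(self, recognized_action: str) -> Optional[str]:
        # When unit (20) recognizes a bound action again, return the operation to invoke.
        return self.bindings.get(recognized_action)

# Usage: unit (30) reports that "rub_eyes" ended and was followed by
# "lower_brightness"; unit (50) binds them; unit (60) replays the operation.
response_unit = ComputerProgramResponseUnit()
response_unit.register("rub_eyes", "lower_brightness")
print(response_unit.respond("rub_eyes"))    # prints: lower_brightness
```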
In particular use, the video monitoring unit 10 monitors the user using the computer, generates user monitoring information and transmits it to the user action recognition unit 20. The user action recognition unit 20 recognizes the monitored user actions, determines the user's common actions during computer use, generates user action recognition information and transmits it to the ending operation information recognition unit 30. The ending operation information recognition unit 30 recognizes the ending time point of a user action and determines the computer program operation performed after the action ends. For example, when the user rubs their eyes for a long time, this indicates eye fatigue after prolonged computer use; the user action recognition unit 20 judges this action as a common action, and the ending operation information recognition unit 30 determines the program operation performed after the action ends, namely lowering the screen brightness. The ending operation information recognition unit 30 generates this judgment information and transmits it to the action corresponding operation binding unit 50, which obtains the frequency of the program operations performed after each of the user's actions ends. For example, two program operations may follow the eye-rubbing action: lowering the screen brightness and opening music software. If, within the unit time, the first occurs far more often than the second, the action corresponding operation binding unit 50 determines that lowering the brightness is the program operation performed after the user rubs their eyes, binds this operation with the corresponding user action, generates binding information and transmits it to the computer program response unit 60. The computer program response unit 60 formulates a corresponding response program according to the binding information, so that when the same action occurs later it can recognize the action and respond by invoking the bound program operation. In this way the user's identity can be judged from action habits, helping the computer recognize the user better and improving the security of the computer system; at the same time, the end point of a user action can be recognized from the program operation that habitually follows it, and the corresponding operation can be invoked automatically once the action ends, further improving the intelligence of the computer system and reducing the user's operating time.
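The frequency-based binding step performed by unit 50 can be sketched as follows. This is a minimal illustration under the assumption that the observations are available as (action, following operation) pairs; the function name bind_most_frequent_operations is hypothetical.

```python
from collections import Counter, defaultdict

def bind_most_frequent_operations(event_log):
    """For each action, count the program operations observed after it ends
    and bind the most frequent one, mirroring the behaviour described for
    the action corresponding operation binding unit (50)."""
    per_action = defaultdict(Counter)
    for action, operation in event_log:
        per_action[action][operation] += 1
    return {action: ops.most_common(1)[0][0] for action, ops in per_action.items()}

# The example from the description: lowering brightness follows eye-rubbing far
# more often than opening music software, so it becomes the bound operation.
log = [("rub_eyes", "lower_brightness")] * 9 + [("rub_eyes", "open_music")]
print(bind_most_frequent_operations(log))   # {'rub_eyes': 'lower_brightness'}
```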
In addition, the output end of the ending operation information recognition unit 30 is connected with a data storage unit 40, and the input end of the data storage unit 40 is connected with the output end of the user action recognition unit 20. The data storage unit 40 stores user actions and each computer program operation performed after they end in real time, compares the occurrence rate of each program operation in real time, and switches the bound program operation according to the distribution of these occurrence rates, so as to adapt to the user's needs in different periods.
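One way to realize this real-time comparison is a sliding window over the stored records, so that the most frequent follow-up operation can change as habits change. The fixed window size and the class name RollingBindingStore below are assumptions; the patent only states that unit 40 compares occurrence rates in real time.

```python
from collections import Counter, deque
from typing import Optional

class RollingBindingStore:
    """Keeps only the most recent (action, operation) records so that the
    operation bound to an action can shift as the user's habits change."""
    def __init__(self, window: int = 200) -> None:
        self.records = deque(maxlen=window)   # oldest records drop out automatically

    def add(self, action: str, operation: str) -> None:
        self.records.append((action, operation))

    def current_operation(self, action: str) -> Optional[str]:
        # Occurrence-rate comparison restricted to the recent window.
        counts = Counter(op for a, op in self.records if a == action)
        return counts.most_common(1)[0][0] if counts else None
```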
Further, the user action recognition unit 20 includes a unit time making module 210, which is used for formulating the calculation time; the output end of the unit time making module 210 is connected with a user common action statistics module 220, which counts the user's common actions within the formulated calculation time. In particular use, the unit time making module 210 formulates the calculation time, generates time formulation information and transmits it to the user common action statistics module 220. The user common action statistics module 220 counts the user's actions within the formulated calculation time, sorts the actions by their counts, formulates a user action count threshold, takes the actions whose counts exceed the threshold as common actions, and rejects the other actions.
Still further, the user action recognition unit 20 further includes an action micro-variation integration module 230, whose output end is connected with the input end of the user common action statistics module 220. Because each occurrence of a common action tends to differ slightly from the last, actions that differ only slightly from a common action are integrated by the action micro-variation integration module 230 and subsumed into that common action, which improves the integration rate of common actions and reduces the misjudgment rate.
Specifically, the user action recognition unit 20 adopts a common action statistical algorithm, and the algorithm formula is as follows:
X_T = {x_1, x_2, ..., x_n},  F(x_i) = 0 when x_i < x_0, and F(x_i) = 1 when x_i ≥ x_0,

wherein X_T is the set of user action counts recorded within the unit time, T is the formulated unit time, x_1 to x_n are the counts of each type of user action within the unit time, n is the number of user action types counted, x_0 is the user action count threshold, F is the common action judging function, and x_i is the count of the judged user action. When x_i is less than the threshold x_0, the common action judging function F outputs 0, indicating that the judged user action is not a common action; when x_i is not less than the threshold x_0, F outputs 1, indicating that the judged user action is a common action.
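The judging function can be expressed directly in Python. The sketch below is an illustrative reading of the reconstructed formula, not an implementation disclosed in the patent; the helper name judge_common_actions and the list-of-action-names input format are assumptions, and the windowing over T is taken to be handled by the unit time making module 210.

```python
from collections import Counter

def judge_common_actions(actions_in_window, x0):
    """Count each action type within one unit time T and apply the judging
    function F: 1 (common) when the count is at least the threshold x0,
    otherwise 0. Variable names mirror the formula above."""
    counts = Counter(actions_in_window)          # x_1 ... x_n per action type

    def F(x_i):
        return 1 if x_i >= x0 else 0

    return {action: F(x_i) for action, x_i in counts.items()}

# Rubbing eyes 5 times and stretching once within one window, threshold 3:
print(judge_common_actions(["rub_eyes"] * 5 + ["stretch"], x0=3))
# {'rub_eyes': 1, 'stretch': 0}
```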
In addition, the data storage unit 40 includes a common action recording module 410, which records each counted common action. The output end of the common action recording module 410 is connected with a common action real-time sequencing module 420, which sorts the common actions in real time, and the output end of the common action real-time sequencing module 420 is connected with a common action real-time replacement module 430, which updates the common actions in real time according to the real-time ordering. In particular use, the common action recording module 410 records each counted common action, generates common action record information and transmits it to the common action real-time sequencing module 420. The common action real-time sequencing module 420 sorts the common actions in real time, generates real-time ordering information and transmits it to the common action real-time replacement module 430, which updates the common actions according to this ordering and replaces in time any action whose count has fallen below the user action count threshold. This ensures that the common actions bound to computer program operations later on still match the user's current habits, improving the adaptability of the whole system.
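A compact sketch of these three modules is given below. The capacity limit on the number of retained common actions is an added assumption, and the class name CommonActionRegistry is hypothetical; the patent only defines the recording, real-time ordering and replacement behaviour.

```python
from collections import Counter

class CommonActionRegistry:
    """Record counted actions, keep them ordered by frequency, and let actions
    that fall below the threshold drop out so new candidates replace them."""
    def __init__(self, threshold: int, capacity: int = 10) -> None:
        self.threshold = threshold
        self.capacity = capacity
        self.counts = Counter()

    def record(self, action: str) -> None:
        self.counts[action] += 1            # common action recording module (410)

    def common_actions(self):
        # Real-time ordering (420) plus replacement of below-threshold actions (430).
        ranked = [a for a, c in self.counts.most_common() if c >= self.threshold]
        return ranked[: self.capacity]
```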
Further, the output end of the user action recognition unit 20 is connected with an accidental action eliminating unit 70, which eliminates accidental actions. When user actions are counted, it can easily happen that the user performs no computer program operation after an action ends; such a counted action is an accidental action. The accidental action eliminating unit 70 counts these accidental actions and eliminates them in time, preventing them from affecting the later statistics of common actions.
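Under the assumption that each observation records the action name together with the follow-up program operation (or None when nothing followed), the elimination rule can be sketched as follows; the function name find_accidental_actions is hypothetical.

```python
def find_accidental_actions(observations):
    """An action that is never followed by any computer program operation is
    treated as accidental and excluded from later common-action statistics."""
    seen, followed = set(), set()
    for action, operation in observations:
        seen.add(action)
        if operation is not None:
            followed.add(action)
    return seen - followed

obs = [("rub_eyes", "lower_brightness"), ("scratch_head", None)]
print(find_accidental_actions(obs))   # {'scratch_head'}
```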
Still further, the output end of the accidental action eliminating unit 70 is connected with the input end of the data storage unit 40. In particular use, the accidental action eliminating unit 70 counts accidental actions, generates accidental action statistics information and transmits it to the data storage unit 40, where it is pre-stored. When it is later necessary to judge whether a user action is accidental, the accidental action information can be retrieved directly from the data storage unit 40 without a second judgment, improving the efficiency of accidental action judgment.
In addition, the output end of the accidental action eliminating unit 70 is connected with an action operation unbinding unit 80, and the output end of the action operation unbinding unit 80 is connected with the input end of the action corresponding operation binding unit 50. In particular use, the accidental action eliminating unit 70 transmits the accidental action statistics information to the action operation unbinding unit 80, which judges whether any accidental action is present among the common actions bound by the action corresponding operation binding unit 50. When an accidental action appears in the action corresponding operation binding unit 50, the action operation unbinding unit 80 identifies it and unbinds it from the bound computer program operation, preventing an accidental action from triggering a program operation, which would impair the user experience and make the user resistant to the system.
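The unbinding step can be illustrated with a small helper that removes bindings whose actions were flagged as accidental; the function name unbind_accidental and the dictionary representation of bindings are assumptions made for this sketch.

```python
def unbind_accidental(bindings, accidental):
    """Remove any binding whose action was flagged as accidental by unit (70),
    so an accidental action never triggers a program operation automatically."""
    return {action: op for action, op in bindings.items() if action not in accidental}

bindings = {"rub_eyes": "lower_brightness", "scratch_head": "open_music"}
print(unbind_accidental(bindings, {"scratch_head"}))
# {'rub_eyes': 'lower_brightness'}
```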
The foregoing has shown and described the basic principles, principal features and advantages of the invention. It will be understood by those skilled in the art that the present invention is not limited to the above-described embodiments, and that the above-described embodiments and descriptions are only preferred embodiments of the present invention, and are not intended to limit the invention, and that various changes and modifications may be made therein without departing from the spirit and scope of the invention as claimed. The scope of the invention is defined by the appended claims and equivalents thereof.

Claims (3)

1. A computer user action recognition system based on user habit judgment, comprising a video monitoring unit (10) used for monitoring a user using a computer, wherein the output end of the video monitoring unit (10) is connected with a user action identification unit (20) used for recognizing the monitored user actions, characterized in that: the output end of the user action identification unit (20) is connected with an ending operation information identification unit (30), which identifies the ending time point of a user action and determines the computer program operation performed after the action ends; the output end of the ending operation information identification unit (30) is connected with an action corresponding operation binding unit (50), which obtains, from the program operations performed after the user's actions end, the frequency of each program operation following each action, binds the most frequent program operation with the corresponding user action and generates binding information; the output end of the action corresponding operation binding unit (50) is connected with a computer program response unit (60), which formulates a corresponding response program according to the binding information;
the output end of the ending operation information identification unit (30) is connected with a data storage unit (40), the input end of the data storage unit (40) is connected with the output end of the user action identification unit (20), and the data storage unit (40) is used for storing user actions and various computer program operations after the user actions are ended in real time;
the user action identification unit (20) comprises a unit time making module (210), wherein the unit time making module (210) is used for making calculation time, the output end of the unit time making module (210) is connected with a user common action statistics module (220), and the user common action statistics module (220) is used for counting common actions of a user according to the made calculation time;
the user action identification unit (20) further comprises an action micro-change integration module (230), wherein the output end of the action micro-change integration module (230) is connected with the input end of the user common action statistics module (220), and actions with nuances with common actions are integrated and processed through the action micro-change integration module (230) to be summarized into common actions;
the user action recognition unit (20) adopts a common action statistical algorithm, and the algorithm formula is as follows:
X_T = {x_1, x_2, ..., x_n},  F(x_i) = 0 when x_i < x_0, and F(x_i) = 1 when x_i ≥ x_0,

wherein X_T is the set of user action counts recorded within the unit time, T is the formulated unit time, x_1 to x_n are the counts of each type of user action within the unit time, n is the number of user action types counted, x_0 is the user action count threshold, F is the common action judging function, and x_i is the count of the judged user action; when x_i is less than the threshold x_0, the common action judging function F outputs 0, indicating that the judged user action is not a common action; when x_i is not less than the threshold x_0, F outputs 1, indicating that the judged user action is a common action;
the data storage unit (40) comprises a common action recording module (410), the common action recording module (410) is used for recording each counted common action, the output end of the common action recording module (410) is connected with a common action real-time sequencing module (420), the common action real-time sequencing module (420) is used for sequencing the common actions in real time, the output end of the common action real-time sequencing module (420) is connected with a common action real-time replacement module (430), and the common action real-time replacement module (430) performs real-time update on the common actions according to the common action real-time sequencing;
the output end of the user action identification unit (20) is connected with an accidental action eliminating unit (70), and the accidental action eliminating unit (70) eliminates accidental actions.
2. The computer user action recognition system according to claim 1, wherein: the output end of the accidental action eliminating unit (70) is connected with the input end of the data storage unit (40).
3. The computer user action recognition system according to claim 2, wherein: the output end of the accidental action eliminating unit (70) is connected with an action operation unbinding unit (80), the output end of the action operation unbinding unit (80) is connected with the input end of the action corresponding operation binding unit (50), the action operation unbinding unit (80) judges whether accidental actions exist in common actions after the action corresponding operation binding unit (50) is bound, and after the accidental actions occur in the action corresponding operation binding unit (50), the accidental actions are identified through the action operation unbinding unit (80) and unbinding the accidental actions and the bound computer program operations.
CN202211303559.2A (priority and filing date 2022-10-24): Computer user action recognition system based on user habit judgment. Status: Active. Granted as CN115600171B.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211303559.2A CN115600171B (en) 2022-10-24 2022-10-24 Computer user action recognition system based on user habit judgment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211303559.2A CN115600171B (en) 2022-10-24 2022-10-24 Computer user action recognition system based on user habit judgment

Publications (2)

Publication Number Publication Date
CN115600171A (en) 2023-01-13
CN115600171B (en) 2023-08-04

Family

ID=84848445

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211303559.2A Active CN115600171B (en) 2022-10-24 2022-10-24 Computer user action recognition system based on user habit judgment

Country Status (1)

Country Link
CN (1) CN115600171B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106384027A (en) * 2016-09-05 2017-02-08 四川长虹电器股份有限公司 User identity recognition system and recognition method thereof
CN109298782A (en) * 2018-08-31 2019-02-01 阿里巴巴集团控股有限公司 Eye movement exchange method, device and computer readable storage medium
WO2020038108A1 (en) * 2018-08-24 2020-02-27 上海商汤智能科技有限公司 Dynamic motion detection method and dynamic motion control method and device
CN110941197A (en) * 2019-12-03 2020-03-31 刘知硕 Intelligent control system and control method for household electrical appliance
CN111639628A (en) * 2020-06-15 2020-09-08 周玉贵 Eye feature and action recognition method and system
CN114021181A (en) * 2021-10-13 2022-02-08 哈尔滨工业大学 Mobile intelligent terminal privacy continuous protection system and method based on use habits
CN114501075A (en) * 2020-11-11 2022-05-13 深圳Tcl新技术有限公司 Program recommendation method, smart television and computer readable storage medium


Also Published As

Publication number Publication date
CN115600171A (en) 2023-01-13

Similar Documents

Publication Publication Date Title
CN108665120B (en) Method and device for establishing scoring model and evaluating user credit
WO2017143948A1 (en) Method for awakening intelligent robot, and intelligent robot
CN110148405B (en) Voice instruction processing method and device, electronic equipment and storage medium
CN110136714A (en) Natural interaction sound control method and device
CN103164691A (en) System and method for recognition of emotion based on mobile phone user
CN106486127A (en) A kind of method of speech recognition parameter adjust automatically, device and mobile terminal
CN109977906A (en) Gesture identification method and system, computer equipment and storage medium
CN107016046A (en) The intelligent robot dialogue method and system of view-based access control model displaying
CN106940868A (en) In real time with the transaction risk recognition methods being combined offline and device
CN106648760A (en) Terminal and method thereof for cleaning background application programs based on face recognition
CN109118447A (en) A kind of image processing method, picture processing unit and terminal device
CN111737670A (en) Multi-mode data collaborative man-machine interaction method and system and vehicle-mounted multimedia device
CN115600171B (en) Computer user action recognition system based on user habit judgment
CN112116025A (en) User classification model training method and device, electronic equipment and storage medium
CN107037951B (en) Automatic operation mode identification method and terminal
WO2022222045A1 (en) Speech information processing method, and device
CN112784926A (en) Gesture interaction method and system
CN116567127B (en) Smart phone with fault monitoring function
CN110610099A (en) Financial risk intelligent early warning and wind control system based on FPGA hardware acceleration
CN108363915A (en) unlocking method, mobile terminal and computer readable storage medium
CN116839154A (en) Air conditioner control method and device based on intelligent learning and intelligent air conditioner
WO2021081768A1 (en) Interface switching method and apparatus, wearable electronic device and storage medium
CN109129467B (en) Robot interaction method and system based on cognition
WO2020078093A1 (en) Door lock control method and apparatus, and control device
CN107477970B (en) Refrigerator door opening control method and refrigerator adopting same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant