CN112230760B - Analysis system and method based on combination of user operation and biological characteristics - Google Patents

Analysis system and method based on combination of user operation and biological characteristics

Info

Publication number
CN112230760B
CN112230760B
Authority
CN
China
Prior art keywords
user
state
eye movement
module
working
Prior art date
Legal status
Active
Application number
CN202010981244.8A
Other languages
Chinese (zh)
Other versions
CN112230760A (en)
Inventor
刘磊
刘庆俞
程庆
Current Assignee
Huainan Normal University
CERNET Corp
Original Assignee
Huainan Normal University
CERNET Corp
Priority date
Filing date
Publication date
Application filed by Huainan Normal University, CERNET Corp filed Critical Huainan Normal University
Priority to CN202010981244.8A
Publication of CN112230760A
Application granted
Publication of CN112230760B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/40Scenes; Scene-specific elements in video content
    • G06V20/41Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/011Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns

Abstract

The invention discloses an analysis system based on the combination of user operation and biological characteristics, characterized in that it comprises a main program module on a computer, a camera module and a status indicator light module, the modules being connected through a USB connecting line. The invention provides a comprehensive analysis scheme that combines the user's mouse and keyboard operation characteristics with biological characteristics: by means of a system input device monitoring program and a video camera, it collects and aggregates the user's input operation state information, face region information and eye movement state information, and reflects the user's real-time working state scientifically through real-time algorithmic calculation. The scheme displays the user's working state intelligently and in real time without the user being aware of it, safeguards the concentration and continuity of the user's long working sessions, and avoids unnecessary disturbance.

Description

Analysis system and method based on combination of user operation and biological characteristics
Technical Field
The invention relates to a system and a method for data acquisition, and in particular to an analysis system and an analysis method based on the combination of user operation and biological characteristics.
Background
In the early 19th century, researchers began to study human mental activities by observing eye movements, exploring the relationship between eye movements and mental activities by analyzing recorded eye movement data. More specifically, by recording a subject's eye movements in a given situation, the subject's selective orientation can be detected, making it possible to study the motivation and attitude orientation of different individuals in the same situation. Eye movement ergonomics uses eye movement indexes to examine visual information extraction and visual control in human-machine interaction, so that people can work more easily, effectively, comfortably and safely.
At work, when others do not know a worker's actual working state, they often disturb the worker. The invention therefore provides a system that collects the user's input operation state information, face region information and eye movement state information, and uses a program to control an indicator lamp that displays the worker's actual working state, thereby avoiding unnecessary disturbance by others and safeguarding the concentration and continuity of the user's long working sessions.
Disclosure of Invention
The invention provides an analysis system based on the combination of user operation and biological characteristics, which displays the user's working state intelligently and in real time without the user being aware of it, safeguards the concentration and continuity of the user's long working sessions, and avoids unnecessary disturbance.
The invention provides an analysis system based on the combination of user operation and biological characteristics, which comprises a main program module on a computer, a camera module and a status indicator light module, the modules being connected through a USB connecting line; wherein:
the main program module comprises:
a monitoring module: collects the user's input operation state information and biological state information and judges the user's working state from them;
an indicator lamp control module: responsible for connection management of the status indicator lamp, sending display control signals to the status indicator lamp according to the user's working state;
a video acquisition module: responsible for connection management of the camera and for real-time processing of the collected video data;
the camera module is responsible for collecting the user's face and eye movement state information and sending it to the main program module;
the status indicator light module is responsible for controlling the on-off state of the status lights according to the display control signals received from the main program module.
Further, the monitoring module includes:
face area monitoring module: defines a user working area and judges the user's idle working state according to whether the user's face appears in the working area;
user operation monitoring module: records and analyzes in real time the mouse and keyboard operation frequency while the user works, used as the judgment basis for the user's general working state (a brief sketch of this idea follows the list);
eye movement state monitoring module: records and analyzes the user's eye movement state parameters in real time, used as the judgment basis for the user's busy working state.
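For illustration, the following minimal sketch shows how the user operation monitoring module's mouse and keyboard frequencies could be counted over one recording period T. The patent does not name an implementation; the pynput library, the OperationMonitor class and the default 10-second period are assumptions made here for the example.

```python
import threading
import time

from pynput import keyboard, mouse  # assumed third-party dependency


class OperationMonitor:
    """Counts mouse clicks and key presses over one recording period T."""

    def __init__(self):
        self._lock = threading.Lock()
        self._clicks = 0
        self._keys = 0

    def _on_click(self, x, y, button, pressed):
        if pressed:  # count button-down events only
            with self._lock:
                self._clicks += 1

    def _on_press(self, key):
        with self._lock:
            self._keys += 1

    def sample(self, period_t=10.0):
        """Return (r_mouse, r_keyboard) as events per second over T seconds."""
        with self._lock:
            self._clicks = 0
            self._keys = 0
        m = mouse.Listener(on_click=self._on_click)
        k = keyboard.Listener(on_press=self._on_press)
        m.start()
        k.start()
        time.sleep(period_t)
        m.stop()
        k.stop()
        with self._lock:
            return self._clicks / period_t, self._keys / period_t
```

A caller would invoke sample(T) once per recording period and combine the two returned frequencies into the operation state value U_current.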
Further, the status lights are of three types: green, yellow and red.
Further, the user eye movement state parameter is a user blinking frequency.
The method based on the above system combining user operation characteristics and biological characteristics comprises:
an initial debugging stage: based on differences in users' biological characteristics, the user's operation state parameters and eye movement state parameters are statistically analyzed to serve as the basis for monitoring and judging the user's working state;
a formal use stage: the main program module automatically calls the face area monitoring module, the user operation monitoring module and the eye movement state monitoring module to collect and analyze in real time the user's face position information, operation state information and eye movement state information respectively, judges the user's working state from the analysis results, and sends display control signals to the status indicator lamp.
Further, the process of performing statistical analysis on the operation state parameters and the eye movement state parameters of the user in the initial debugging stage comprises:
Step1, electrifying the camera;
Step2, initializing the system main program, performing self-checking, and starting each functional module;
Step3, the user sets a busy working state or an idle working state;
Step4, counting the mouse and keyboard operation frequency and the user's eye movement state parameters, and analyzing them to obtain the user operation state threshold U and eye movement state threshold E within one working state recording period T.
Further, the specific algorithm flow in the formal use phase is as follows:
Step1, electrifying the status indicator lamp and the camera, and connecting them to the user's office computer through a USB connecting line;
Step2, initializing the system main program and performing self-checking; starting each functional module, configuring the video acquisition camera parameters and the status indicator lamp connection parameters, setting the system's user working state to 'unmanned state', and sending the display control signal 'off' to the status indicator lamp;
Step3, the system cyclically acquires the camera video data and analyzes it in real time; first, the face region monitoring module monitors the boundary of the defined working area in real time to judge whether a face appears; if yes, the system sets the user working state to 'idle working state', sends the display control signal 'green' to the status indicator lamp, and executes Step4; otherwise, Step7 is executed;
Step4, the system records the user's real-time operation state information, i.e. the mouse click frequency r_mouse and keyboard click frequency r_keyboard, and calculates the user operation state value U_current; U_current is compared with the user operation state threshold U: if U_current > U, the system sets the user working state to 'general working state', sends the display control signal 'yellow' to the status indicator lamp, and executes Step5; otherwise, Step7 is executed;
Step5, the system records the user's real-time eye movement state information, i.e. the blinking frequency r_eye, and calculates the user eye movement state value E_current; E_current is compared with the user eye movement state threshold E: if E_current < E, the system sets the user working state to 'busy working state', sends the display control signal 'red' to the status indicator lamp, and executes Step6; otherwise, Step4 is executed;
Step6, after maintaining the busy working state for 5 recording periods (5T), the system calculates the user eye movement state value E_current′; E_current′ is compared with the user eye movement state threshold E: if E_current′ < E, the system keeps the user working state at 'busy working state', sends the display control signal 'red' to the status indicator lamp, and executes Step6 again; otherwise, Step5 is executed;
Step7, judging whether the system has finished running; if not, Step3 is executed; otherwise, all functional modules exit, the status indicator light is turned off, and system operation ends.
The invention has the advantages that:
By means of a system input device monitoring program and a video camera, the system and method collect and aggregate the user's input operation state information, face region information and eye movement state information; a reasonably designed and efficient algorithm computes in real time a result that scientifically reflects the user's real-time working state (idle working state, general working state or busy working state), which is finally displayed through the three colors of the status indicator lamp. The scheme displays the user's working state intelligently and in real time without the user being aware of it, safeguards the concentration and continuity of the user's long working sessions, and avoids unnecessary disturbance.
Drawings
FIG. 1 shows an overall framework of the system of the present invention;
FIG. 2 is a flow chart of the algorithm during the initial debug phase of the present invention;
FIG. 3 shows a flowchart of the algorithm at the formal use stage of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The invention is further illustrated by the following examples:
The invention designs an analysis system based on the combination of user operation and biological characteristics, as shown in fig. 1, which comprises a main program module on a computer, a camera module and a status indicator light module, the modules being connected through a USB connecting line; wherein:
the main program module comprises:
The monitoring module specifically comprises: a face area monitoring module, which defines a user working area and judges the user's idle working state according to whether the user's face appears in the working area; a user operation monitoring module, which records and analyzes in real time the mouse and keyboard operation frequency while the user works, used as the judgment basis for the general working state; and an eye movement state monitoring module, which records and analyzes the user's eye movement state parameters in real time, used as the judgment basis for the busy working state.
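As an illustration of the face area monitoring module, the sketch below tests whether a detected face lies inside the defined working area. The patent does not specify a detection method; OpenCV's stock Haar cascade and the WORK_AREA rectangle are assumptions made for this example.

```python
import cv2  # assumed dependency (opencv-python)

# Illustrative working-area rectangle (x, y, width, height); in practice it
# would be defined by the user during the initial debugging stage.
WORK_AREA = (100, 50, 400, 300)

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def face_in_work_area(frame):
    """Return True if the center of any detected face lies inside WORK_AREA."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    ax, ay, aw, ah = WORK_AREA
    for (x, y, w, h) in faces:
        cx, cy = x + w // 2, y + h // 2  # face center
        if ax <= cx <= ax + aw and ay <= cy <= ay + ah:
            return True
    return False
```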
The indicator lamp control module: responsible for connection management of the status indicator lamp, sending display control signals to the status indicator lamp according to the user's working state;
The video acquisition module: responsible for connection management of the camera and for real-time processing of the collected video data;
The camera module is responsible for collecting the user's face and eye movement state information and sending it to the main program module;
The status indicator light module is responsible for controlling the on-off states of the green, yellow and red status lights according to the display control signals received from the main program module.
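To make the display control signal concrete, the following sketch drives a hypothetical USB-serial status lamp; the patent does not disclose the lamp's actual protocol, so the one-byte color commands, the port name and the baud rate are assumptions.

```python
import serial  # assumed dependency (pyserial)

# Hypothetical one-byte protocol for the three status lights plus 'off'.
COMMANDS = {"off": b"0", "green": b"G", "yellow": b"Y", "red": b"R"}

class StatusLamp:
    def __init__(self, port="/dev/ttyUSB0", baudrate=9600):
        self._conn = serial.Serial(port, baudrate, timeout=1)

    def set(self, color):
        """Send a display control signal ('off', 'green', 'yellow' or 'red')."""
        self._conn.write(COMMANDS[color])

    def close(self):
        self._conn.write(COMMANDS["off"])
        self._conn.close()
```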
Further, the user eye movement state parameter is the user's blinking frequency.
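The patent states only that blink frequency is the eye movement parameter, not how it is measured. One common approach, sketched below under that assumption, is the eye aspect ratio (EAR) of Soukupová and Čech: the ratio dips when the eye closes, and dips lasting a few frames are counted as blinks. The landmark ordering, the 0.2 threshold and the two-frame minimum are illustrative values, not values from the patent.

```python
import math

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks around one eye, ordered p1..p6 as in the
    EAR formulation: EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|)."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return (dist(eye[1], eye[5]) + dist(eye[2], eye[4])) / (2.0 * dist(eye[0], eye[3]))

def blink_frequency(ear_series, fps, threshold=0.2, min_frames=2):
    """Count dips of EAR below threshold lasting >= min_frames, per second."""
    blinks, run = 0, 0
    for ear in ear_series:
        if ear < threshold:
            run += 1
        else:
            if run >= min_frames:
                blinks += 1
            run = 0
    if run >= min_frames:  # dip still open at the end of the series
        blinks += 1
    return blinks * fps / max(len(ear_series), 1)
```

The landmarks themselves would come from a facial landmark detector applied to each video frame; any detector that yields six points per eye can feed eye_aspect_ratio.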
The method comprises an initial debugging stage and a formal use stage, embodied respectively in a statistical algorithm for the user operation state threshold and eye movement state threshold, and a real-time user working state monitoring algorithm;
In more detail, the main work of the initial debugging stage is to statistically analyze the user's operation state parameters and eye movement state parameters, based on differences in users' biological characteristics, to serve as the basis for monitoring and judging the user's working state; here, T denotes the time period over which the main program monitors each working state parameter;
at this stage, the user needs to actively participate in the statistical analysis process of the system on two types of parameters, namely, the operation state parameter and the eye movement state parameter of the user, and the specific flow is shown in fig. 2:
Step1, electrifying the camera and connecting it to the user's office computer through a USB connecting line;
Step2, initializing the system main program, starting each functional module, and configuring the video acquisition camera parameters;
Step3, the user adjusts the camera position so that the video acquisition range covers the user's working area;
Step4, when in the idle working state or busy working state, the user actively starts the user operation monitoring module and eye movement state monitoring module of the system main program by clicking the 'idle test' or 'busy test' button, and clicks the 'end test' button when the working state changes, ending this round of debugging; the main program module records and analyzes the data to obtain the idle operation state value U_k1 and eye movement state value E_k1, or the busy operation state value U_f1 and eye movement state value E_f1;
Step5, repeating the operation of Step4 multiple times to obtain the set of idle operation state values {U_k1, U_k2, U_k3, ..., U_kn} and eye movement state values {E_k1, E_k2, E_k3, ..., E_kn}, and the set of busy operation state values {U_f1, U_f2, U_f3, ..., U_fm} and eye movement state values {E_f1, E_f2, E_f3, ..., E_fm};
Step6, using an arithmetic mean filtering algorithm, calculating the user's idle operation state value U_k and eye movement state value E_k (and likewise the busy values U_f and E_f) within one working state recording period T, and finally calculating the user operation state threshold U and eye movement state threshold E;
wherein:
U_k = (1/n) · (U_k1 + U_k2 + ... + U_kn)
E_k = (1/n) · (E_k1 + E_k2 + ... + E_kn)
U_f = (1/m) · (U_f1 + U_f2 + ... + U_fm)
E_f = (1/m) · (E_f1 + E_f2 + ... + E_fm)
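A minimal sketch of this statistics step follows. The four arithmetic means come directly from the formulas above; taking each threshold as the midpoint between the idle and busy means is an assumption, chosen because it is consistent with the later comparisons U_current > U and E_current < E — the patent does not state the exact threshold formula.

```python
def thresholds(U_idle, E_idle, U_busy, E_busy):
    """U_idle/E_idle: idle-state samples; U_busy/E_busy: busy-state samples."""
    mean = lambda xs: sum(xs) / len(xs)
    U_k, E_k = mean(U_idle), mean(E_idle)  # idle means over n repetitions
    U_f, E_f = mean(U_busy), mean(E_busy)  # busy means over m repetitions
    U = (U_k + U_f) / 2  # assumed midpoint rule for the operation threshold
    E = (E_k + E_f) / 2  # assumed midpoint rule for the eye movement threshold
    return U, E
```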
After the initial debugging stage, the user no longer needs to participate actively in system operation during the formal use stage. The main program module automatically calls the face area monitoring module, the user operation monitoring module and the eye movement state monitoring module to collect and analyze in real time the user's face position information, operation state information and eye movement state information respectively, judges the user's working state from the analysis results, and sends display control signals to the status indicator lamp.
As shown in fig. 3, the user's face area monitoring information, operation state monitoring information and eye movement state monitoring information are used in turn as the main judgment bases for the idle, general and busy working states; special conditions, such as a low operation frequency while the user is deeply concentrated, are also considered, and the user's actual working state is reflected as scientifically as possible by methods such as extending the monitoring period and optimizing the algorithm flow.
The specific algorithm flow is as follows:
Step1, electrifying the status indicator lamp and the camera, and connecting them to the user's office computer through a USB connecting line;
Step2, initializing the system main program and performing self-checking; starting each functional module, configuring the video acquisition camera parameters and the status indicator lamp connection parameters, setting the system's user working state to 'unmanned state', and sending the display control signal 'off' to the status indicator lamp;
Step3, the system cyclically acquires the camera video data and analyzes it in real time; first, the face region monitoring module monitors the boundary of the defined working area in real time to judge whether a face appears; if yes, the system sets the user working state to 'idle working state', sends the display control signal 'green' to the status indicator lamp, and executes Step4; otherwise, Step7 is executed;
Step4, the system records the user's real-time operation state information, i.e. the mouse click frequency r_mouse and keyboard click frequency r_keyboard, and calculates the user operation state value U_current; U_current is compared with the user operation state threshold U: if U_current > U, the system sets the user working state to 'general working state', sends the display control signal 'yellow' to the status indicator lamp, and executes Step5; otherwise, Step7 is executed;
Step5, the system records the user's real-time eye movement state information, i.e. the blinking frequency r_eye, and calculates the user eye movement state value E_current; E_current is compared with the user eye movement state threshold E: if E_current < E, the system sets the user working state to 'busy working state', sends the display control signal 'red' to the status indicator lamp, and executes Step6; otherwise, Step4 is executed;
Step6, after maintaining the busy working state for 5 recording periods (5T), the system calculates the user eye movement state value E_current′; E_current′ is compared with the user eye movement state threshold E: if E_current′ < E, the system keeps the user working state at 'busy working state', sends the display control signal 'red' to the status indicator lamp, and executes Step6 again; otherwise, Step5 is executed;
Step7, judging whether the system has finished running; if not, Step3 is executed; otherwise, all functional modules exit, the status indicator light is turned off, and system operation ends.
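The control flow of Steps 3-7 can be transcribed schematically as the loop below. All callables are hypothetical placeholders for the modules described above (for instance, lamp could be a StatusLamp as sketched earlier), and the transition from a non-busy blink reading back to the face check approximates the published Step5-to-Step4 jump; this is a sketch of the flow, not the patented implementation.

```python
def run_monitor(U, E, T, lamp, system_running,
                face_in_area, operation_rate, blink_rate):
    """Schematic loop for Steps 3-7 of the formal use stage."""
    while system_running():                      # Step7: repeat until shutdown
        if not face_in_area():                   # Step3: no face in work area
            lamp.set("off")                      # unmanned state
            continue
        lamp.set("green")                        # idle working state
        if operation_rate(T) <= U:               # Step4: U_current vs threshold U
            continue                             # back to Step3
        lamp.set("yellow")                       # general working state
        while system_running():
            if blink_rate(T) >= E:               # Step5: E_current vs threshold E
                break                            # not busy: re-enter outer loop
            lamp.set("red")                      # busy working state
            while blink_rate(5 * T) < E and system_running():
                pass                             # Step6: stay busy across 5T windows
            lamp.set("yellow")                   # leave busy, re-check at Step5
    lamp.set("off")                              # shutdown: lights off
```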
Furthermore, it should be understood that although this description is organized by embodiments, each embodiment does not necessarily contain only a single technical solution; the description is presented this way merely for clarity, and those skilled in the art should take the description as a whole; the embodiments may be combined as appropriate to form other embodiments understood by those skilled in the art.
The present invention is not limited to the embodiments described above; in light of the present disclosure, those skilled in the art will appreciate that many changes and modifications can be made without departing from the spirit and scope of the invention.

Claims (6)

1. An analysis system based on the combination of user operation and biological characteristics, used to judge a user's idle working state, general working state and busy working state and to light indicator lights of corresponding colors;
characterized in that it comprises a main program module on a computer, a camera module and a status indicator light module, the modules being connected through a USB connecting line; wherein:
the main program module comprises:
a monitoring module: collects the user's input operation state information and biological state information and judges the user's working state from them; the monitoring module comprises a user operation monitoring module for recording and analyzing in real time the mouse and keyboard operation frequency while the user works;
an indicator lamp control module: responsible for connection management of the status indicator lamp, sending display control signals to the status indicator lamp according to the user's working state;
a video acquisition module: responsible for connection management of the camera and for real-time processing of the collected video data;
the camera module is responsible for collecting the user's face and eye movement state information and sending it to the main program module;
the status indicator light module is responsible for controlling the on-off state of the status lights according to the display control signals received from the main program module;
the analysis result of the user operation monitoring module is used as the judgment basis for the user's general working state;
the monitoring module further comprises a face area monitoring module and an eye movement state monitoring module; the face area monitoring module defines a user working area and judges the user's idle working state according to whether the user's face appears in the working area; the eye movement state monitoring module records and analyzes the user's eye movement state parameters in real time, used as the judgment basis for the user's busy working state.
2. The analysis system based on the combination of user operation and biological characteristics of claim 1, characterized in that:
the status lights are of three types: green, yellow and red.
3. The analysis system based on the combination of user operation and biological characteristics of claim 1, characterized in that:
the user eye movement state parameter is the user's blinking frequency.
4. A method based on the system of claim 1, characterized in that the method comprises:
an initial debugging stage: based on differences in users' biological characteristics, the user's operation state parameters and eye movement state parameters are statistically analyzed to serve as the basis for monitoring and judging the user's working state;
a formal use stage: the main program module automatically calls the face area monitoring module, the user operation monitoring module and the eye movement state monitoring module to collect and analyze in real time the user's face position information, operation state information and eye movement state information respectively, judges the user's working state from the analysis results, and sends display control signals to the status indicator lamp.
5. The method of claim 4, characterized in that the statistical analysis of the user's operation state parameters and eye movement state parameters in the initial debugging stage comprises:
Step1, electrifying the camera;
Step2, initializing the system main program, performing self-checking, and starting each functional module;
Step3, the user sets a busy working state or an idle working state;
Step4, counting the mouse and keyboard operation frequency and the user's eye movement state parameters, and analyzing them to obtain the user operation state threshold U and eye movement state threshold E within one working state recording period T.
6. The method of claim 4, characterized in that the specific algorithm flow of the formal use stage is as follows:
Step1, electrifying the status indicator lamp and the camera;
Step2, initializing the system main program and performing self-checking, starting each functional module, configuring the video acquisition camera parameters and the status indicator lamp connection parameters, setting the system's user working state to 'unmanned state', and sending the display control signal 'off' to the status indicator lamp;
Step3, the system cyclically acquires the camera video data and analyzes it in real time; first, the face region monitoring module monitors the boundary of the defined working area in real time to judge whether a face appears; if yes, the system sets the user working state to 'idle working state', sends the display control signal 'green' to the status indicator lamp, and executes Step4; otherwise, Step7 is executed;
Step4, the system records the user's real-time operation state information, i.e. the mouse click frequency r_mouse and keyboard click frequency r_keyboard, and calculates the user operation state value U_current; U_current is compared with the user operation state threshold U: if U_current > U, the system sets the user working state to 'general working state', sends the display control signal 'yellow' to the status indicator lamp, and executes Step5; otherwise, Step7 is executed;
Step5, the system records the user's real-time eye movement state information, i.e. the blinking frequency r_eye, and calculates the user eye movement state value E_current; E_current is compared with the user eye movement state threshold E: if E_current < E, the system sets the user working state to 'busy working state', sends the display control signal 'red' to the status indicator lamp, and executes Step6; otherwise, Step4 is executed;
Step6, after maintaining the busy working state for 5 recording periods (5T), the system calculates the user eye movement state value E_current′; E_current′ is compared with the user eye movement state threshold E: if E_current′ < E, the system keeps the user working state at 'busy working state', sends the display control signal 'red' to the status indicator lamp, and executes Step6 again; otherwise, Step5 is executed;
Step7, judging whether the system has finished running; if not, Step3 is executed; otherwise, all functional modules exit, the status indicator light is turned off, and system operation ends.
CN202010981244.8A 2020-09-17 2020-09-17 Analysis system and method based on combination of user operation and biological characteristics Active CN112230760B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010981244.8A CN112230760B (en) 2020-09-17 2020-09-17 Analysis system and method based on combination of user operation and biological characteristics

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010981244.8A CN112230760B (en) 2020-09-17 2020-09-17 Analysis system and method based on combination of user operation and biological characteristics

Publications (2)

Publication Number Publication Date
CN112230760A CN112230760A (en) 2021-01-15
CN112230760B (en) 2022-09-20

Family

ID=74108707

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010981244.8A Active CN112230760B (en) 2020-09-17 2020-09-17 Analysis system and method based on combination of user operation and biological characteristics

Country Status (1)

Country Link
CN (1) CN112230760B (en)


Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6907375B2 (en) * 2002-11-06 2005-06-14 Varco I/P, Inc. Method and apparatus for dynamic checking and reporting system health
KR100631781B1 (en) * 2004-08-25 2006-10-11 삼성전자주식회사 Private video recorder providing user interface showing history of storing status of contents and method there of
JP5036177B2 (en) * 2005-12-12 2012-09-26 オリンパス株式会社 Information display device
CN103426275B (en) * 2013-08-01 2016-01-06 步步高教育电子有限公司 The device of detection eye fatigue, desk lamp and method
CN104732251B (en) * 2015-04-23 2017-12-22 郑州畅想高科股份有限公司 A kind of trainman's driving condition detection method based on video
CN108304764B (en) * 2017-04-24 2021-12-24 中国民用航空局民用航空医学中心 Fatigue state detection device and detection method in simulated flight driving process
CN108551699B (en) * 2018-04-20 2019-10-01 哈尔滨理工大学 Eye control intelligent lamp and control method thereof

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020034902A1 (en) * 2018-08-11 2020-02-20 昆山美卓智能科技有限公司 Smart desk having status monitoring function, monitoring system server, and monitoring method

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
An Eye Location based Head Posture Recognition Method and Its Application in Mouse Operation; Chen, Zhe; KSII Transactions on Internet and Information Systems; 2015-05-13; Vol. 9, No. 3; full text *
Eye movement behavior recognition based on LSTM and its human-computer interaction applications; Huang Junhao et al.; Computer Systems & Applications; 2020-03-15 (No. 03); full text *
Design of a driver fatigue early-warning system based on eye movement features; Chen Yu et al.; Software Guide; May 2020 (No. 05); full text *
Research and engineering implementation of driver fatigue state detection technology; Li Zhichun; Information Science and Technology series; 2009-09-15 (No. 1); full text *

Also Published As

Publication number Publication date
CN112230760A (en) 2021-01-15

Similar Documents

Publication Publication Date Title
CN109646022B (en) Child attention assessment system and method thereof
JP4856791B2 (en) EEG interface system, EEG interface providing apparatus, EEG interface execution method, and program
CN116304766B (en) Multi-sensor-based quick assessment method for state of switch cabinet
CN103529343B (en) A kind of intelligent diagnosing method of electric equipment and system
CN102999418A (en) Mobile phone monitoring method based on PC (personal computer) side
CN107785066B (en) Method, device and system for modifying heartbeat type
CN111179109A (en) Electricity consumption data processing method for detecting elderly people living alone
CN104490391A (en) Combatant state monitoring system based on electroencephalogram signals
CN112230760B (en) Analysis system and method based on combination of user operation and biological characteristics
CN103845038A (en) Physical sign signal acquiring method and physical sign signal acquiring equipment
CN110555972A (en) Household appliance use frequency optimization method based on mass data acquired by intelligent socket
CN114041793A (en) RSVP target recognition system and method integrating multi-mode fatigue monitoring and regulation
CN110575165A (en) APP used for brain monitoring and intervention in cooperation with EEG equipment
CN107356835A (en) A kind of intelligent electrical energy monitoring analysis system
CN104627385A (en) Process visualization decision making diagnostic system and inference control method of process visualization decision making diagnostic system
CN117093108B (en) Data regulation and control system and method applied to intelligent exhibition hall interaction
CN112231037A (en) Method for designing corresponding icon based on emotion
CN110197719B (en) Guardianship data processing system
CN108880945A (en) A kind of cloud monitoring system and method
CN115101205A (en) Health state monitoring method, system, equipment and medium based on intelligent gateway
CN110135744B (en) Construction worker safety behavior habit evaluation method
CN103258107A (en) Monitoring method and assistant monitoring system
CN113325728A (en) Intelligent home control method, system and control equipment based on electroencephalogram
CN106850783B (en) Across hole CT automation collection and long-distance monitoring method
CN111741084A (en) Crop growth monitoring and diagnosing system based on wireless sensor network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant