CN112230760A - Analysis system and method based on combination of user operation and biological characteristics - Google Patents


Info

Publication number
CN112230760A
CN112230760A (application number CN202010981244.8A)
Authority
CN
China
Prior art keywords
user
state
eye movement
module
working
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010981244.8A
Other languages
Chinese (zh)
Other versions
CN112230760B (en)
Inventor
刘磊
刘庆俞
程庆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huainan Normal University
CERNET Corp
Original Assignee
Huainan Normal University
CERNET Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huainan Normal University, CERNET Corp filed Critical Huainan Normal University
Priority to CN202010981244.8A priority Critical patent/CN112230760B/en
Publication of CN112230760A publication Critical patent/CN112230760A/en
Application granted granted Critical
Publication of CN112230760B publication Critical patent/CN112230760B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G06V20/41 Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • G06V40/161 Human faces: Detection; Localisation; Normalisation
    • G06F2203/011 Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns


Abstract

The invention discloses an analysis system based on the combination of user operation and biological characteristics, characterized by comprising a main program module on a computer, a camera module, and a status indicator light module, the modules being connected by USB cables. The invention provides a comprehensive analysis scheme that combines the user's mouse and keyboard operation characteristics with biological characteristics. Using an input-device monitoring program and a video camera, the system collects and aggregates the user's input operation state, face region, and eye movement state information; real-time calculation by the algorithm then scientifically reflects the user's current working state, which is displayed intelligently and in real time without the user's awareness, preserving the user's concentration and continuity during long working sessions and avoiding unnecessary interruptions.

Description

Analysis system and method based on combination of user operation and biological characteristics
Technical Field
The invention relates to a system and method for data acquisition, and in particular to an analysis system and method based on the combination of user operation and biological characteristics.
Background
Since the early 19th century, researchers have studied human psychological activity by examining eye movements, analyzing recorded eye movement data to explore the relationship between eye movements and mental processes. More specifically, by recording a subject's eye movements in a given situation, the subject's selective orientation can be detected, which allows the motivations and attitudes of different individuals in the same situation to be studied. Eye movement ergonomics uses eye movement indices to detect problems of visual information extraction and visual control in human-machine interaction, so that people can work more easily, effectively, comfortably, and safely.
At work, colleagues who cannot tell a worker's actual state often interrupt the worker. The invention therefore provides a system that collects the user's input operation state, face region, and eye movement state information, and uses a program-controlled indicator light to display the worker's actual working state, avoiding unnecessary interruptions by others and preserving the user's concentration and continuity during long working sessions.
Disclosure of Invention
The invention provides an analysis system based on the combination of user operation and biological characteristics, which displays the user's working state intelligently and in real time without the user's awareness, preserving the user's concentration and continuity during long working sessions and avoiding unnecessary interruptions.
The invention provides an analysis system based on the combination of user operation and biological characteristics, comprising a main program module on a computer, a camera module, and a status indicator light module, the modules being connected by USB cables; wherein,
the main program module comprises:
a monitoring module: collects the user's input operation state information and biological state information and judges the user's working state from them;
an indicator light control module: responsible for connection management of the status indicator light and for sending display control signals to it according to the user's working state;
a video acquisition module: responsible for connection management of the camera and for real-time processing of the collected video data;
the camera module is responsible for collecting the user's face and eye movement state information and sending it to the main program module;
and the status indicator light module is responsible for switching the status light on and off according to the display control signals received from the main program module.
Further, the monitoring module includes:
a face area monitoring module: defines a user working area and judges the user's idle working state by whether the user's face appears in that area;
a user operation monitoring module: records and analyzes the user's mouse and keyboard operating frequency in real time as the basis for judging the general working state;
an eye movement state monitoring module: records and analyzes the user's eye movement state parameters in real time as the basis for judging the busy working state.
Further, the status light has three colors: green, yellow, and red.
Further, the user eye movement state parameter is a user blinking frequency.
The method based on the above system combining user operation characteristics and biological characteristics comprises the following steps:
an initial debugging stage: based on differences in users' biological characteristics, the user's operation state parameters and eye movement state parameters are statistically analyzed to serve as the basis for monitoring and judging the user's working state;
a formal use stage: the main program module automatically calls the face area monitoring module, the user operation monitoring module, and the eye movement state monitoring module to collect and analyze the user's face position, operation state, and eye movement state information in real time, judges the user's working state from the results, and sends display control signals to the status indicator light.
Further, the process of performing statistical analysis on the operation state parameters and the eye movement state parameters of the user in the initial debugging stage comprises:
step1, powering on the camera;
step2, initializing the system main program, performing self-checking, and starting each functional module;
step3, the user sets a busy or idle working state;
step4, within one working-state recording period T, counting the mouse and keyboard operation frequency and the user's eye movement state parameters, and obtaining by analysis a user operation state threshold U and an eye movement state threshold E.
Further, the specific algorithm flow in the formal use phase is as follows:
step1, powering on the status indicator light and the camera, and connecting them to the user's office computer via USB cables;
step2, initializing a system main program and performing self-checking; starting each functional module, configuring video acquisition camera parameters and status indicator lamp connection parameters, setting the working state of a system user to be an unmanned state, and sending a display control signal to the status indicator lamp to be turned off;
step3, circularly acquiring video data of the camera by the system and analyzing the video data in real time; firstly, a face region monitoring module monitors the boundary of a defined working area in real time to judge whether a face appears; if yes, setting the working state of the system user as an idle working state, sending a display control signal green to a state indicator lamp, and executing Step 4; otherwise, Step7 is executed;
step4, the system records the user's real-time operation state information: the mouse click frequency r_mouse and the keyboard click frequency r_keyboard, and calculates the user operation state value U_current; U_current is compared with the user operation state threshold U; if U_current > U, the system sets the user working state to the "general working state", sends the display control signal "yellow" to the status indicator light, and executes Step 5; otherwise, Step 7 is executed;
step5, the system records the user's real-time eye movement state information: the blinking frequency r_eye, and calculates the user eye movement state value E_current; E_current is compared with the user eye movement state threshold E; if E_current < E, the system sets the user working state to the "busy working state", sends the display control signal "red" to the status indicator light, and executes Step 6; otherwise, Step 4 is executed;
step6, the system maintains the busy working state for five recording periods (5T) and recalculates the user eye movement state value E_current'; E_current' is compared with the eye movement state threshold E; if E_current' < E, the user working state remains the "busy working state", the display control signal "red" is sent to the status indicator light, and Step 6 is executed again; otherwise, Step 5 is executed;
step7, judging whether the system is finished running; if not, executing Step 3; otherwise, quitting all the functional modules, turning off the status indicator light and ending the system operation.
The invention has the advantages that:
According to the system and method, an input-device monitoring program and a video camera are used to collect and aggregate the user's input operation state, face region, and eye movement state information; a reasonably designed, efficient algorithm computes in real time a result that scientifically reflects the user's current working state (idle, general, or busy), which is finally displayed by the three colors of the status indicator light. The scheme displays the user's working state intelligently and in real time without the user's awareness, preserves the user's concentration and continuity during long working sessions, and avoids unnecessary interruptions.
Drawings
FIG. 1 shows an overall framework of the system of the present invention;
FIG. 2 is a flow chart of the algorithm during the initial debug phase of the present invention;
FIG. 3 shows a flowchart of the algorithm at the formal use stage of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The invention is further illustrated by the following examples:
The invention designs an analysis system based on the combination of user operation and biological characteristics which, as shown in fig. 1, comprises a main program module on a computer, a camera module, and a status indicator light module, the modules being connected by USB cables; wherein,
the main program module comprises:
The monitoring module specifically comprises: a face area monitoring module, which defines a user working area and judges the user's idle working state by whether the user's face appears in that area; a user operation monitoring module, which records and analyzes the user's mouse and keyboard operating frequency in real time as the basis for judging the general working state; and an eye movement state monitoring module, which records and analyzes the user's eye movement state parameters in real time as the basis for judging the busy working state.
The indicator light control module: responsible for connection management of the status indicator light and for sending display control signals to it according to the user's working state;
The video acquisition module: responsible for connection management of the camera and for real-time processing of the collected video data;
The camera module is responsible for collecting the user's face and eye movement state information and sending it to the main program module;
The status indicator light module is responsible for switching the green, yellow, and red status lights on and off according to the display control signals received from the main program module.
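The face area monitoring module's presence test can be sketched as a simple geometric check once an external face detector supplies bounding boxes. The patent does not name a detection algorithm, so the detector (for example an OpenCV Haar cascade) and the centre-point criterion used below are illustrative assumptions:

```python
def face_in_work_area(face_boxes, work_area):
    """Judge face presence for the face area monitoring module.

    face_boxes: list of (x, y, w, h) rectangles produced by an external
        face detector (the patent does not specify one).
    work_area: (x, y, w, h) rectangle delimiting the defined working area.
    Returns True if any detected face lies inside the working area.
    """
    ax, ay, aw, ah = work_area
    for x, y, w, h in face_boxes:
        # Count a face as present when its centre falls inside the area
        # (an assumed criterion; the patent only speaks of the face
        # "appearing" in the working area).
        cx, cy = x + w / 2.0, y + h / 2.0
        if ax <= cx <= ax + aw and ay <= cy <= ay + ah:
            return True
    return False
```

An empty detection list then maps naturally to the "unmanned" branch of the formal-use algorithm.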
Further, the user eye movement state parameter is a user blinking frequency.
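Since the eye movement state parameter is the blinking frequency, the eye movement state value can be obtained by counting open-to-closed transitions in a per-frame eye state series. How each frame is classified as open or closed is not specified in the patent (an eye-aspect-ratio test is one common choice), so the boolean input below is an assumed interface:

```python
def blink_frequency(eye_closed_frames, period_seconds):
    """Estimate blinks per minute over one recording period T.

    eye_closed_frames: iterable of booleans, one per video frame,
        True when the eye is classified as closed (classification
        method assumed external).
    period_seconds: duration of the recording period in seconds.
    """
    blinks = 0
    prev_closed = False
    for closed in eye_closed_frames:
        if closed and not prev_closed:
            # An open-to-closed transition marks the start of one blink.
            blinks += 1
        prev_closed = closed
    return blinks * 60.0 / period_seconds
```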
The method comprises an initial debugging stage and a formal use stage, corresponding respectively to a statistical algorithm for the user operation state threshold and eye movement state threshold, and a real-time monitoring algorithm for the user's working state;
In more detail, the main work of the initial debugging stage is to statistically analyze the user's operation state parameters and eye movement state parameters, based on differences in users' biological characteristics, to serve as the basis for monitoring and judging the user's working state; the time period over which the main program monitors each working-state parameter is defined as T;
at this stage, the user needs to actively participate in the statistical analysis process of the system on two types of parameters, namely, the operation state parameter and the eye movement state parameter of the user, and the specific flow is shown in fig. 2:
step1, powering on the camera and connecting it to the user's office computer via a USB cable;
step2, initializing a system main program, starting each functional module, and configuring parameters of a video acquisition camera;
step3, adjusting the position of the camera by the user to enable the video acquisition range to include the working area of the user;
step4, when in an idle or busy working state, the user actively starts the user operation state monitoring module and the eye movement state monitoring module of the system main program by clicking the "idle test" or "busy test" button, and clicks the "end test" button when the working state changes, ending that debugging pass; the main program module records and analyzes the data to obtain the idle-work operation state value U_k1 and eye movement state value E_k1, or the busy-work operation state value U_f1 and eye movement state value E_f1;
Step5, repeating the operation of Step3 multiple times to obtain the set of idle-work operation state values {U_k1, U_k2, U_k3, ..., U_kn} and eye movement state values {E_k1, E_k2, E_k3, ..., E_kn}, and the set of busy-work operation state values {U_f1, U_f2, U_f3, ..., U_fm} and eye movement state values {E_f1, E_f2, E_f3, ..., E_fm};
Step6, using an arithmetic mean filtering algorithm, calculating the user's idle-work operation state value U_k and eye movement state value E_k (and, correspondingly, the busy-work values U_f and E_f) within one working-state recording period T, and finally calculating the user operation state threshold U and the eye movement state threshold E;
wherein:
U_k = (U_k1 + U_k2 + ... + U_kn) / n
E_k = (E_k1 + E_k2 + ... + E_kn) / n
U_f = (U_f1 + U_f2 + ... + U_fm) / m
E_f = (E_f1 + E_f2 + ... + E_fm) / m
U = (U_k + U_f) / 2, E = (E_k + E_f) / 2
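A minimal sketch of the Step 6 statistics: each mean is obtained by arithmetic mean filtering of the repeated debug-phase samples, and the final thresholds U and E are taken as midpoints of the idle and busy means. The midpoint combination is an assumption, since the original formula images are not legible in this text:

```python
def debug_thresholds(idle_ops, idle_eye, busy_ops, busy_eye):
    """Compute the operation state threshold U and the eye movement
    state threshold E from repeated debug-phase measurements.

    idle_ops / idle_eye: samples {U_k1..U_kn} / {E_k1..E_kn}.
    busy_ops / busy_eye: samples {U_f1..U_fm} / {E_f1..E_fm}.
    """
    def mean(samples):
        # Arithmetic mean filtering over the recorded samples.
        return sum(samples) / len(samples)

    u_k, e_k = mean(idle_ops), mean(idle_eye)
    u_f, e_f = mean(busy_ops), mean(busy_eye)
    # Midpoint of the idle and busy means (assumed combination).
    return (u_k + u_f) / 2.0, (e_k + e_f) / 2.0
```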
After the initial debugging stage, the user no longer needs to participate actively in system operation during the formal use stage. The main program module automatically calls the face area monitoring module, the user operation monitoring module, and the eye movement state monitoring module to collect and analyze the user's face position, operation state, and eye movement state information in real time, judges the user's working state from the results, and sends display control signals to the status indicator light.
As shown in fig. 3, the user's face area, operation state, and eye movement state monitoring information are used in turn as the main basis for judging the idle, general, and busy working states; special cases, such as a low operation frequency while the user is concentrating, are taken into account, and the actual working state is reflected as scientifically as possible by extending the monitoring period and optimizing the algorithm flow.
The specific algorithm flow is as follows:
step1, powering on the status indicator light and the camera, and connecting them to the user's office computer via USB cables;
step2, initializing a system main program and performing self-checking; starting each functional module, configuring video acquisition camera parameters and status indicator lamp connection parameters, setting the working state of a system user to be an unmanned state, and sending a display control signal to the status indicator lamp to be turned off;
step3, circularly acquiring video data of the camera by the system and analyzing the video data in real time; firstly, a face region monitoring module monitors the boundary of a defined working area in real time to judge whether a face appears; if yes, setting the working state of the system user as an idle working state, sending a display control signal green to a state indicator lamp, and executing Step 4; otherwise, Step7 is executed;
step4, the system records the user's real-time operation state information: the mouse click frequency r_mouse and the keyboard click frequency r_keyboard, and calculates the user operation state value U_current; U_current is compared with the user operation state threshold U; if U_current > U, the system sets the user working state to the "general working state", sends the display control signal "yellow" to the status indicator light, and executes Step 5; otherwise, Step 7 is executed;
step5, the system records the user's real-time eye movement state information: the blinking frequency r_eye, and calculates the user eye movement state value E_current; E_current is compared with the user eye movement state threshold E; if E_current < E, the system sets the user working state to the "busy working state", sends the display control signal "red" to the status indicator light, and executes Step 6; otherwise, Step 4 is executed;
step6, the system maintains the busy working state for five recording periods (5T) and recalculates the user eye movement state value E_current'; E_current' is compared with the eye movement state threshold E; if E_current' < E, the user working state remains the "busy working state", the display control signal "red" is sent to the status indicator light, and Step 6 is executed again; otherwise, Step 5 is executed;
step7, judging whether the system is finished running; if not, executing Step 3; otherwise, quitting all the functional modules, turning off the status indicator light and ending the system operation.
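One pass of the Step 3 to Step 5 decision cascade can be condensed into a pure classification function evaluated once per recording period T. The summation of the mouse and keyboard frequencies into U_current is an assumption (the patent only names the two inputs), and the 5T busy-state hold of Step 6 is omitted for brevity:

```python
def classify_state(face_present, r_mouse, r_keyboard, r_eye, U, E):
    """Map one period's measurements to (working state, light colour).

    face_present: result of the face area monitoring module.
    r_mouse, r_keyboard: mouse and keyboard click frequencies.
    r_eye: blinking frequency for the period.
    U, E: thresholds obtained in the initial debugging stage.
    """
    if not face_present:
        return "unmanned", "off"       # Step 3: no face in the work area
    u_current = r_mouse + r_keyboard   # assumed combination into U_current
    if u_current <= U:
        return "idle", "green"         # Step 3: face present, low activity
    if r_eye < E:
        return "busy", "red"           # Step 5: low blink rate under load
    return "general", "yellow"         # Step 4: active but not busy
```

Reintroducing the Step 6 hysteresis would wrap this function in a small state machine that re-checks only the eye movement value for five periods after entering the busy state.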
Furthermore, it should be understood that, although this description is organized by embodiment, each embodiment need not contain only a single technical solution; the description is arranged this way only for clarity. Those skilled in the art should read the description as a whole; the embodiments may be combined as appropriate to form further embodiments.
The present invention is not limited to the above description of the embodiments, and those skilled in the art should, in light of the present disclosure, appreciate that many changes and modifications can be made without departing from the spirit and scope of the invention.

Claims (7)

1. An analysis system based on combination of user operation and biological characteristics, characterized by comprising a main program module on a computer, a camera module, and a status indicator light module, the modules being connected by USB cables; wherein,
the main program module comprises:
a monitoring module: collects the user's input operation state information and biological state information and judges the user's working state from them;
an indicator light control module: responsible for connection management of the status indicator light and for sending display control signals to it according to the user's working state;
a video acquisition module: responsible for connection management of the camera and for real-time processing of the collected video data;
the camera module is responsible for collecting the user's face and eye movement state information and sending it to the main program module;
and the status indicator light module is responsible for switching the status light on and off according to the display control signals received from the main program module.
2. The analysis system based on combination of user operation and biological characteristics of claim 1, wherein the monitoring module comprises:
a face area monitoring module: defines a user working area and judges the user's idle working state by whether the user's face appears in that area;
a user operation monitoring module: records and analyzes the user's mouse and keyboard operating frequency in real time as the basis for judging the general working state;
an eye movement state monitoring module: records and analyzes the user's eye movement state parameters in real time as the basis for judging the busy working state.
3. The analysis system based on combination of user operation and biological characteristics of claim 1, wherein the status light has three colors: green, yellow, and red.
4. The analysis system based on combination of user operation and biological characteristics of claim 2, wherein the user eye movement state parameter is the user's blinking frequency.
5. A method based on the system of claim 2, wherein the method comprises the steps of:
an initial debugging stage: based on differences in users' biological characteristics, the user's operation state parameters and eye movement state parameters are statistically analyzed to serve as the basis for monitoring and judging the user's working state;
a formal use stage: the main program module automatically calls the face area monitoring module, the user operation monitoring module, and the eye movement state monitoring module to collect and analyze the user's face position, operation state, and eye movement state information in real time, judges the user's working state from the results, and sends display control signals to the status indicator light.
6. The method of claim 5, wherein the statistical analysis of the user's operation state parameters and eye movement state parameters in the initial debugging stage comprises:
step1, powering on the camera;
step2, initializing the system main program, performing self-checking, and starting each functional module;
step3, the user sets a busy or idle working state;
step4, within one working-state recording period T, counting the mouse and keyboard operation frequency and the user's eye movement state parameters, and obtaining by analysis a user operation state threshold U and an eye movement state threshold E.
7. The method of claim 5, wherein the specific algorithm flow of the formal use stage is as follows:
step1, powering on the status indicator light and the camera;
step2, initializing and self-checking a system main program, starting each functional module, configuring video acquisition camera parameters and state indicator lamp connection parameters, setting the working state of a system user to be an unmanned state, and sending a display control signal to a state indicator lamp to be turned off;
step3, circularly acquiring video data of the camera by the system and analyzing the video data in real time; firstly, a face region monitoring module monitors the boundary of a defined working area in real time to judge whether a face appears; if yes, setting the working state of the system user as an idle working state, sending a display control signal green to a state indicator lamp, and executing Step 4; otherwise, Step7 is executed;
step4, the system records the user's real-time operation state information: the mouse click frequency r_mouse and the keyboard click frequency r_keyboard, and calculates the user operation state value U_current; U_current is compared with the user operation state threshold U; if U_current > U, the system sets the user working state to the "general working state", sends the display control signal "yellow" to the status indicator light, and executes Step 5; otherwise, Step 7 is executed;
step5, the system records the user's real-time eye movement state information: the blinking frequency r_eye, and calculates the user eye movement state value E_current; E_current is compared with the user eye movement state threshold E; if E_current < E, the system sets the user working state to the "busy working state", sends the display control signal "red" to the status indicator light, and executes Step 6; otherwise, Step 4 is executed;
step6, the system maintains the busy working state for five recording periods (5T) and recalculates the user eye movement state value E_current'; E_current' is compared with the eye movement state threshold E; if E_current' < E, the user working state remains the "busy working state", the display control signal "red" is sent to the status indicator light, and Step 6 is executed again; otherwise, Step 5 is executed;
step7, judging whether the system is finished running; if not, executing Step 3; otherwise, quitting all the functional modules, turning off the status indicator light and ending the system operation.
CN202010981244.8A 2020-09-17 2020-09-17 Analysis system and method based on combination of user operation and biological characteristics Active CN112230760B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010981244.8A CN112230760B (en) 2020-09-17 2020-09-17 Analysis system and method based on combination of user operation and biological characteristics

Publications (2)

Publication Number Publication Date
CN112230760A true CN112230760A (en) 2021-01-15
CN112230760B CN112230760B (en) 2022-09-20

Family

ID=74108707

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010981244.8A Active CN112230760B (en) 2020-09-17 2020-09-17 Analysis system and method based on combination of user operation and biological characteristics

Country Status (1)

Country Link
CN (1) CN112230760B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040088115A1 (en) * 2002-11-06 2004-05-06 Varco International, Inc. Method and apparatus for dynamic checking and reporting system health
US20060047346A1 (en) * 2004-08-25 2006-03-02 Samsung Electronics Co., Ltd. Private video recorder providing user interface showing history of storing status of content and method therefor
US20070132663A1 (en) * 2005-12-12 2007-06-14 Olympus Corporation Information display system
CN103426275A (en) * 2013-08-01 2013-12-04 步步高教育电子有限公司 Device, desk lamp and method for detecting eye fatigue
CN104732251A (en) * 2015-04-23 2015-06-24 郑州畅想高科股份有限公司 Video-based method of detecting driving state of locomotive driver
CN108304764A (en) * 2017-04-24 2018-07-20 中国民用航空局民用航空医学中心 Fatigue state detection device and detection method in simulated flight driving procedure
CN108551699A (en) * 2018-04-20 2018-09-18 哈尔滨理工大学 Eye control intelligent lamp and control method thereof
WO2020034902A1 (en) * 2018-08-11 2020-02-20 昆山美卓智能科技有限公司 Smart desk having status monitoring function, monitoring system server, and monitoring method

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
CHEN, ZHE: "An Eye Location based Head Posture Recognition Method and Its Application in Mouse Operation", 《KSII TRANSACTIONS ON INTERNET AND INFORMATION SYSTEMS》 *
LI, ZHICHUN: "Research and Engineering Implementation of Driver Fatigue State Detection Technology", 《INFORMATION SCIENCE AND TECHNOLOGY》 *
CHEN, YU ET AL.: "Design of a Driver Fatigue Early-Warning System Based on Eye Movement Features", 《SOFTWARE GUIDE》 *
HUANG, JUNHAO ET AL.: "LSTM-Based Eye Movement Behavior Recognition and Its Human-Computer Interaction Applications", 《COMPUTER SYSTEMS & APPLICATIONS》 *

Also Published As

Publication number Publication date
CN112230760B (en) 2022-09-20

Similar Documents

Publication Publication Date Title
CN109646022B (en) Child attention assessment system and method thereof
US10163527B2 (en) User interface displaying a temporal relationship between a health event indicator and monitored health conditions
CN116304766B (en) Multi-sensor-based quick assessment method for state of switch cabinet
CN111179109A (en) Electricity consumption data processing method for detecting elderly people living alone
CN110737201B (en) Monitoring method and device, storage medium and air conditioner
Santhanam et al. An extensible infrastructure for fully automated spike sorting during online experiments
CN103116405A (en) Real-time detection and control device and method for brain and muscle electricity in tooth movement states
CN110555972A (en) Household appliance use frequency optimization method based on mass data acquired by intelligent socket
CN109620158B (en) Sleep monitoring method, intelligent terminal and storage device
CN103845038A (en) Physical sign signal acquiring method and physical sign signal acquiring equipment
CN112230760B (en) Analysis system and method based on combination of user operation and biological characteristics
CN108489872B (en) Online granularity monitoring method and system
CN114041793A (en) RSVP target recognition system and method integrating multi-mode fatigue monitoring and regulation
CN104627385A (en) Process visualization decision making diagnostic system and inference control method of process visualization decision making diagnostic system
CN107356835A (en) A kind of intelligent electrical energy monitoring analysis system
CN114973628A (en) Intelligent monitoring terminal based on artificial intelligence and big data analysis
CN112231037A (en) Method for designing corresponding icon based on emotion
CN117093108A (en) Data regulation and control system and method applied to intelligent exhibition hall interaction
CN110197719B (en) Guardianship data processing system
CN112362102A (en) Monitoring system for immediately monitoring agricultural production environment
CN111741084A (en) Crop growth monitoring and diagnosing system based on wireless sensor network
CN110988584A (en) Intelligent monitoring and automatic control system for electric power system
CN113191191B (en) Community epidemic situation management method and system based on user habit analysis
CN108880945A (en) A kind of cloud monitoring system and method
CN110261528B (en) Oil chromatographic control unit capable of adaptively adjusting working time

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant