CN112631422B - Media interaction system based on human behavior triggering - Google Patents

Media interaction system based on human behavior triggering

Info

Publication number
CN112631422B
CN112631422B
Authority
CN
China
Prior art keywords
module
gesture
data
electrically connected
output end
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011438514.7A
Other languages
Chinese (zh)
Other versions
CN112631422A (en)
Inventor
孙浩章
王丰
周眩
孙心玥儿
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xian University of Technology
Original Assignee
Xian University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian University of Technology filed Critical Xian University of Technology
Priority to CN202011438514.7A
Publication of CN112631422A
Application granted
Publication of CN112631422B
Legal status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 - Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 - User authentication
    • G06F21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 - Detection; Localisation; Normalisation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172 - Classification, e.g. identification

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a media interaction system based on human behavior triggering, which comprises an infrared sensor and a pressure sensor whose output ends are electrically connected with a gesture sensing module. The gesture sensing module is electrically connected with the input ends of the gesture recognition module and the face recognition module; the output ends of the gesture recognition module and the face recognition module are electrically connected with the input ends of the data storage module and the database respectively, and the output end of the data storage module is electrically connected with the input end of the database. The output end of the database is electrically connected with the input end of the recognition algorithm module, the recognition algorithm module is electrically connected with the processor, the processor is electrically connected with the mobile client, and the mobile client is electrically connected with the display screen. The invention uses the infrared sensor to monitor a person's gestures and face while the person is not in contact with the screen, with the gestures sensed by the gesture sensing module, so the invention has a wide range of applications, is difficult to replace, and has a very strong market prospect.

Description

Media interaction system based on human behavior triggering
Technical Field
The invention belongs to the technical field of human-computer interaction, and relates to a media interaction system based on human behavior triggering.
Background
Human-computer interaction technology enables effective dialogue between people and computers through computer input and output devices: the machine provides people with large amounts of relevant information and prompts through output or display devices, while people supply relevant information, answer questions, and respond to prompts through input devices. In traditional human-computer interaction systems, the person is regarded only as an operator who manipulates the machine, without genuine interactive activity.
Most existing human-computer interaction is performed through touch gestures, for example taking a screenshot by double-tapping the screen or swiping down with three fingers, and unlocking the screen with a fingerprint; all of these require direct contact with the screen.
Disclosure of Invention
The invention aims to provide a media interaction system based on human behavior triggering, which makes it possible to operate a screen without touching it.
The invention adopts the technical scheme that a media interaction system based on human behavior triggering comprises an infrared sensor and a pressure sensor, wherein the infrared sensor and the pressure sensor respectively receive gesture operation signals, and the output ends of the infrared sensor and the pressure sensor are electrically connected with a gesture sensing module; the output end of the gesture sensing module is electrically connected with the input end of the signal transmission module, the output end of the signal transmission module is electrically connected with the input end of the signal receiving module, the output end of the signal receiving module is electrically connected with the input ends of the gesture recognition module and the face recognition module respectively, the output ends of the gesture recognition module and the face recognition module are electrically connected with the input ends of the data storage module and the database respectively, and the output end of the data storage module is electrically connected with the input end of the database; the output end of the database is electrically connected with the input end of the recognition algorithm module, the output end of the recognition algorithm module is electrically connected with the input end of the processor, the output end of the processor is electrically connected with the input end of the mobile client, and the output end of the mobile client is electrically connected with the input end of the display screen.
The data storage module comprises a data classification module, an information input module, an information storage module and a user information storage module;
the gesture recognition module and the face recognition module are both electrically connected with the input end of the data classification module, and the output end of the data classification module is electrically connected with the input end of the information input module; the output end of the information input module is electrically connected with the input end of the information storage module, the output end of the information storage module is electrically connected with the input end of the user information storage module, and the output end of the user information storage module is electrically connected with the input end of the database.
The detection distance of the infrared sensor is 10-200cm.
The recognition algorithm module performs gesture recognition based on MATLAB.
The mobile client is a mobile phone, a computer, a tablet, a camera, or a television device.
The display screen is a touch display screen, and the infrared sensor, the pressure sensor, the database, the processor and the display screen are all integrated on the mobile terminal.
There are three gesture operation modes, each with a corresponding response scheme.
The beneficial effects of the invention are:
(1) The invention uses the infrared sensor to monitor a person's gestures and face while the person is not in contact with the screen. The infrared sensor senses the gestures and face through the gesture sensing module, and the sensed information is transmitted via the signal transmission module to the signal receiving module. After the signal receiving module receives the signal, the gesture recognition module recognizes it, and the recognized signal is transmitted to the database for comparison; the recognition algorithm module then identifies the data and transmits the recognized gesture information to the processor, which controls the mobile terminal and integrates the information on the display screen. In this way the mobile device can be operated effectively without being touched, and a corresponding response scheme is carried out for each different gesture operation.
(2) The invention uses the infrared sensor and the pressure sensor to provide both touch and non-touch operating modes, so it can handle remote control as well as touch operation, widening the range of application; the face recognition module also makes it convenient for a user to unlock the device through face recognition.
(3) The invention uses the infrared sensor and the pressure sensor to collect gesture and face data, and the data storage module makes it convenient to store these data in the database. Operating gestures can therefore be customized so that people can operate the device in the way they prefer, and the facial data of users can also be stored.
Drawings
FIG. 1 is a diagram of a behavior interaction triggering system of a media interaction system based on human behavior triggering according to the present invention;
FIG. 2 is a system diagram of the data storage module of FIG. 1;
FIG. 3 is a diagram of a behavior entry storage system of the media interaction system triggered based on human behavior according to the present invention;
FIG. 4 is a simplified diagram of the behavioral interaction trigger system of FIG. 1.
Detailed Description
The present invention will be described in detail below with reference to the accompanying drawings and specific embodiments.
As shown in fig. 1-3, a media interaction system triggered based on human behavior includes an infrared sensor and a pressure sensor, where the infrared sensor and the pressure sensor respectively receive gesture operation signals, and output ends of the infrared sensor and the pressure sensor are electrically connected to a gesture sensing module; the output end of the gesture sensing module is electrically connected with the input end of the signal transmission module, the output end of the signal transmission module is electrically connected with the input end of the signal receiving module, the output end of the signal receiving module is electrically connected with the input ends of the gesture recognition module and the face recognition module respectively, the output ends of the gesture recognition module and the face recognition module are electrically connected with the input ends of the data storage module and the database respectively, and the output end of the data storage module is electrically connected with the input end of the database; the output end of the database is electrically connected with the input end of the recognition algorithm module, the output end of the recognition algorithm module is electrically connected with the input end of the processor, the output end of the processor is electrically connected with the input end of the mobile client, and the output end of the mobile client is electrically connected with the input end of the display screen. The infrared sensor and the pressure sensor provide both touch and non-touch operating modes, so the system can handle remote control as well as touch operation, which enlarges the range of application.
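The module chain described above can be summarized in code. The following is a minimal illustrative sketch only, not part of the patented implementation: all class and function names are assumed, and Python is used purely for illustration (the patent itself specifies a MATLAB-based recognition algorithm module).

from dataclasses import dataclass
from typing import Optional

@dataclass
class SensedEvent:
    source: str     # "infrared" (non-contact) or "pressure" (touch) - assumed labels
    kind: str       # "gesture" or "face"
    payload: bytes  # raw trajectory or face scan data

class GestureSensingModule:
    # Both sensors feed the same gesture sensing module.
    def sense(self, source: str, kind: str, payload: bytes) -> SensedEvent:
        return SensedEvent(source=source, kind=kind, payload=payload)

class SignalLink:
    # Stands in for the signal transmission and signal receiving modules.
    def transmit(self, event: SensedEvent) -> SensedEvent:
        return event  # in hardware this would be a wired or wireless hop

class Database:
    def __init__(self) -> None:
        self.templates: dict[str, bytes] = {}  # label -> stored gesture/face data

    def match(self, payload: bytes) -> Optional[str]:
        # Placeholder comparison; the patent delegates this to the recognition algorithm module.
        for label, stored in self.templates.items():
            if stored == payload:
                return label
        return None

def pipeline(event: SensedEvent, link: SignalLink, db: Database) -> str:
    received = link.transmit(event)     # transmission -> receiving
    label = db.match(received.payload)  # recognition + database comparison
    if label is None:
        return "no matching template - rescan"
    response = "response scheme for " + label  # recognition algorithm module
    return "processor dispatches '" + response + "' to the mobile client and display screen"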
The data storage module comprises a data classification module, an information input module, an information storage module and a user information storage module. The infrared sensor and the pressure sensor collect gesture and face data, and the internal data storage module stores these data in the database; this makes it convenient to customize operating gestures, so that people can operate the device in the way they prefer, and the facial data of users can also be stored.
The gesture recognition module and the face recognition module are both electrically connected with the input end of the data classification module, and the output end of the data classification module is electrically connected with the input end of the information input module; the output end of the information input module is electrically connected with the input end of the information storage module, the output end of the information storage module is electrically connected with the input end of the user information storage module, and the output end of the user information storage module is electrically connected with the input end of the database.
The detection distance of the infrared sensor is 10-200cm.
The recognition algorithm module performs gesture recognition based on MATLAB.
The mobile client is a mobile phone, a computer, a tablet, a camera, or a television device.
The display screen is a touch display screen, and the infrared sensor, the pressure sensor, the database, the processor and the display screen are all integrated on the mobile terminal.
There are three gesture operation modes, each with a corresponding response scheme. A user can define different operation gestures or operation behaviors according to preference and then set the corresponding response operation (for example, when using a liquid crystal television, the user can map left and right swipes to video playback speed and up and down swipes to volume, which is more convenient than operating with a remote controller).
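As an illustration of such user-defined gesture-to-response mappings, the sketch below stores a few gestures and the responses a user might assign to them, as in the television example above. It is a sketch only: the gesture names, actions, and Python representation are assumptions and are not part of the patent.

# Hypothetical user-defined mapping of operation gestures to response schemes.
user_response_schemes = {
    "swipe_left":  "decrease_playback_speed",
    "swipe_right": "increase_playback_speed",
    "swipe_up":    "volume_up",
    "swipe_down":  "volume_down",
}

def respond(gesture_label: str) -> str:
    action = user_response_schemes.get(gesture_label)
    if action is None:
        return "no response scheme configured for this gesture"
    return "executing " + action

print(respond("swipe_up"))  # -> executing volume_up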
In use, when face data needs to be stored, the infrared sensor scans the face, and the data are passed through the signal transmission module into the signal receiving module and then transmitted to the data storage module for storage. During storage the data classification module classifies the face data, so that multiple sets of face data can be stored; the user information storage module builds a user information repository for the data of multiple users, and all data are finally transferred to the database for storage.
When gesture data for the non-contact screen needs to be stored, the infrared sensor scans the gesture, and the data are passed through the signal transmission module into the signal receiving module and then transmitted to the data storage module for storage. During storage the data classification module classifies the gesture data, so that multiple gestures can be stored; the user information storage module builds repositories for the gesture data of multiple users, and all data are finally transferred to the database for storage.
When gesture data for the touch screen needs to be stored, the pressure sensor receives the gesture and the gesture sensing module senses the gesture trajectory; the data are passed through the signal transmission module into the signal receiving module and then transmitted to the data storage module for storage. During storage the data classification module classifies the gesture data, so that multiple gestures can be stored; the user information storage module builds repositories for the gesture data of multiple users, and all data are finally transferred to the database for storage.
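The three storage workflows above (face data, non-contact gesture data, touch gesture data) follow the same enrollment pattern: classify the incoming data, file it under a user, and finally transfer everything to the database. The sketch below illustrates that pattern only; the data structures and names are assumptions and not the patented data storage module.

from collections import defaultdict

class DataStorageModule:
    def __init__(self) -> None:
        # user id -> category -> list of stored samples (user information storage)
        self.user_repositories = defaultdict(lambda: defaultdict(list))

    def classify(self, kind: str, source: str) -> str:
        # Data classification module: decide which category a sample belongs to.
        if kind == "face":
            return "face"
        return "gesture_touch" if source == "pressure" else "gesture_non_contact"

    def store(self, user_id: str, kind: str, source: str, payload: bytes) -> str:
        category = self.classify(kind, source)                      # classification
        self.user_repositories[user_id][category].append(payload)   # information storage
        return category

    def flush_to_database(self, database: dict) -> None:
        # All data are finally transferred to the database for storage.
        for user_id, categories in self.user_repositories.items():
            database.setdefault(user_id, {}).update(
                {cat: list(samples) for cat, samples in categories.items()}
            )

storage = DataStorageModule()
storage.store("user_1", "face", "infrared", b"...face scan...")
storage.store("user_1", "gesture", "infrared", b"...wave trajectory...")
storage.store("user_1", "gesture", "pressure", b"...swipe trajectory...")
database: dict = {}
storage.flush_to_database(database)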
When face unlocking is needed, the infrared sensor scans the face data and passes it through the signal transmission module into the signal receiving module; the data are then transmitted to the face recognition module for face recognition. The scanned face data are compared against the face data stored in the database: if the data match, the device is unlocked; if they do not match, scanning and detection are repeated.
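The comparison step of this face-unlock workflow could, for example, be a distance check between the scanned face data and each stored template, repeated until a match is found. The threshold, the feature-vector representation, and the function name below are assumptions for illustration only; the patent does not specify the comparison algorithm.

from math import dist

MATCH_THRESHOLD = 0.6  # assumed value; a real system would calibrate this

def unlock_by_face(scanned_features: list[float],
                   stored_templates: dict[str, list[float]]) -> str:
    # Compare the scanned face data against every stored user template.
    for user_id, template in stored_templates.items():
        if dist(scanned_features, template) < MATCH_THRESHOLD:
            return "device unlocked for " + user_id
    return "no match - scan and detect again"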
When the device is to be operated by remote (non-contact) gestures, the infrared sensor scans the gesture data and the gesture sensing module senses and analyses it; the data are passed through the signal transmission module into the signal receiving module and then transmitted to the gesture recognition module. The gesture recognition module analyses the data and compares the result with the data in the database. If corresponding gesture data are found, the recognition algorithm module computes the corresponding response scheme, the processor processes the scheme, and the processing result is transmitted to the mobile terminal for execution.
When the device is to be operated by contact gestures, the pressure sensor captures the gesture data and the gesture sensing module senses and analyses it; the signal transmission module passes the data into the signal receiving module, and the data are transmitted to the gesture recognition module. The gesture recognition module analyses the data and compares the result with the data in the database. If corresponding gesture data are found, the recognition algorithm module computes the corresponding response scheme, the processor processes the scheme, and the processing result is transmitted to the mobile terminal for execution.
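Both gesture operation paths (remote and contact) end in the same step: compare the analysed gesture with the gesture data in the database and, if corresponding data are found, have the recognition algorithm module produce the response scheme for the processor. The sketch below shows one possible form of this matching-and-dispatch step; the nearest-template comparison and all names are assumptions, not the patented MATLAB-based algorithm.

from math import dist
from typing import Optional

def match_gesture(features: list[float],
                  gesture_db: dict[str, list[float]],
                  threshold: float = 0.5) -> Optional[str]:
    # Find the stored gesture template closest to the analysed gesture.
    best_label, best_distance = None, float("inf")
    for label, template in gesture_db.items():
        d = dist(features, template)
        if d < best_distance:
            best_label, best_distance = label, d
    return best_label if best_distance < threshold else None

def operate(features: list[float],
            gesture_db: dict[str, list[float]],
            response_schemes: dict[str, str]) -> str:
    label = match_gesture(features, gesture_db)
    if label is None:
        return "no corresponding gesture data found"
    scheme = response_schemes.get(label, "default response")
    # The processor would now process the scheme and pass the result to the mobile terminal.
    return "processor executes '" + scheme + "' on the mobile terminal"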
The invention has the advantages that:
(1) The invention uses the infrared sensor to monitor a person's gestures and face while the person is not in contact with the screen. The infrared sensor senses the gestures and face through the gesture sensing module, and the sensed information is transmitted via the signal transmission module to the signal receiving module. After the signal receiving module receives the signal, the gesture recognition module recognizes it, and the recognized signal is transmitted to the database for comparison; the recognition algorithm module then identifies the data and transmits the recognized gesture information to the processor, which controls the mobile terminal and integrates the information on the display screen. In this way the mobile device can be operated effectively without being touched, and a corresponding response scheme is carried out for each different gesture operation.
(2) The invention uses the infrared sensor and the pressure sensor to provide both touch and non-touch operating modes, so it can handle remote control as well as touch operation, widening the range of application; the face recognition module also makes it convenient for a user to unlock the device through face recognition.
(3) The invention uses the infrared sensor and the pressure sensor to collect gesture and face data, and the data storage module makes it convenient to store these data in the database. Operating gestures can therefore be customized so that people can operate the device in the way they prefer, and the facial data of users can also be stored.

Claims (7)

1. A media interaction system based on human behavior triggering is characterized by comprising an infrared sensor and a pressure sensor, wherein the infrared sensor and the pressure sensor respectively receive gesture operation signals, and the output ends of the infrared sensor and the pressure sensor are electrically connected with a gesture sensing module; the output end of the gesture sensing module is electrically connected with the input end of the signal transmission module, the output end of the signal transmission module is electrically connected with the input end of the signal receiving module, the output end of the signal receiving module is electrically connected with the input ends of the gesture recognition module and the face recognition module respectively, the output ends of the gesture recognition module and the face recognition module are electrically connected with the input ends of the data storage module and the database respectively, and the output end of the data storage module is electrically connected with the input end of the database; the output end of the database is electrically connected with the input end of the recognition algorithm module, the output end of the recognition algorithm module is electrically connected with the input end of the processor, the output end of the processor is electrically connected with the input end of the mobile client, and the output end of the mobile client is electrically connected with the input end of the display screen;
when the remote gesture operation equipment is needed, the infrared sensor scans gesture data, the gesture data are subjected to sensing analysis through the gesture sensing module, the data are led into the signal receiving module through the signal transmission module and then transmitted to the gesture recognition module, the gesture recognition module analyzes the data, the analyzed data and the data in the database are calibrated, if corresponding gesture data are found, a corresponding response scheme is calculated through the recognition algorithm module, the corresponding scheme is processed through the processor, and a processing result is transmitted to the mobile terminal for operation;
when the contact type gesture operation equipment is needed, the pressure sensor scans gesture data, the gesture sensing module senses and analyzes the gesture data, the signal transmission module guides the data into the signal receiving module, the data are transmitted to the gesture recognition module, the gesture recognition module analyzes the data, the analyzed data and the data in the database are calibrated, if corresponding gesture data are found, a corresponding response scheme is calculated through the recognition algorithm module, the corresponding scheme is processed through the processor, and a processing result is transmitted to the mobile terminal to be operated.
2. The human behavior trigger-based media interaction system of claim 1, wherein the data storage module comprises a data classification module, an information entry module, an information storage module and a user information storage module;
the gesture recognition module and the face recognition module are electrically connected with the input end of the data classification module, the output end of the data classification module is electrically connected with the input end of the information input module, the output end of the information input module is electrically connected with the input end of the information storage module, the output end of the information storage module is electrically connected with the input end of the user information storage module, and the output end of the user information storage module is electrically connected with the input end of the database.
3. The human behavior trigger-based media interaction system as claimed in claim 1, wherein the detection distance of the infrared sensor is 10-200cm.
4. The human behavior trigger-based media interaction system of claim 1, wherein the recognition algorithm module performs gesture recognition based on MATLAB.
5. The human behavior trigger-based media interaction system of claim 1, wherein: the mobile client is a mobile phone, a computer, a tablet, a camera and television equipment.
6. The human behavior trigger-based media interaction system of claim 1, wherein: the display screen is a touch display screen, and the infrared sensor, the pressure sensor, the database, the processor and the display screen are all integrated on the mobile terminal.
7. The human behavior trigger-based media interaction system of claim 1, wherein: the gesture operation modes are three, and the corresponding response schemes are three.
CN202011438514.7A 2020-12-10 2020-12-10 Media interaction system based on human behavior triggering Active CN112631422B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011438514.7A CN112631422B (en) 2020-12-10 2020-12-10 Media interaction system based on human behavior triggering

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011438514.7A CN112631422B (en) 2020-12-10 2020-12-10 Media interaction system based on human behavior triggering

Publications (2)

Publication Number Publication Date
CN112631422A CN112631422A (en) 2021-04-09
CN112631422B (en) 2023-04-07

Family

ID=75309166

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011438514.7A Active CN112631422B (en) 2020-12-10 2020-12-10 Media interaction system based on human behavior triggering

Country Status (1)

Country Link
CN (1) CN112631422B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113821103B (en) * 2021-08-20 2023-12-05 苏州创捷传媒展览股份有限公司 On-screen object recognition display system

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102523395A (en) * 2011-11-15 2012-06-27 中国科学院深圳先进技术研究院 Television system having multi-point touch function, touch positioning identification method and system thereof

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10025431B2 (en) * 2013-11-13 2018-07-17 At&T Intellectual Property I, L.P. Gesture detection
CN103713738B (en) * 2013-12-17 2016-06-29 武汉拓宝科技股份有限公司 A kind of view-based access control model follows the tracks of the man-machine interaction method with gesture identification
US10534436B2 (en) * 2015-01-30 2020-01-14 Sony Depthsensing Solutions Sa/Nv Multi-modal gesture based interactive system and method using one single sensing system
CN106814532A (en) * 2015-11-30 2017-06-09 深圳市三川叶传媒有限公司 A kind of 270 degree of line holographic projections showcases
CN207399418U (en) * 2017-10-24 2018-05-22 北京蓝海华业工程技术有限公司 A kind of TV based on recognition of face
CN108932060A (en) * 2018-09-07 2018-12-04 深圳众赢时代科技有限公司 Gesture three-dimensional interaction shadow casting technique
CN211603864U (en) * 2019-10-12 2020-09-29 巢湖学院 Intelligent home control system based on gesture recognition
CN210222780U (en) * 2019-10-22 2020-03-31 嘉应学院 Bimodal recognition system of people's face and hand type gesture

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102523395A (en) * 2011-11-15 2012-06-27 中国科学院深圳先进技术研究院 Television system having multi-point touch function, touch positioning identification method and system thereof

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Contactless gesture recognition system using proximity sensors; Heng-Tze Cheng et al.; IEEE; full text *

Also Published As

Publication number Publication date
CN112631422A (en) 2021-04-09


Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant