CN113190104A - Method for realizing man-machine interaction by recognizing human actions through visual analysis by intelligent equipment

Method for realizing man-machine interaction by recognizing human actions through visual analysis by intelligent equipment

Info

Publication number
CN113190104A
Authority
CN
China
Prior art keywords: actions, visual analysis, human, intelligent equipment, preset
Prior art date: 2021-01-18
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110065155.3A
Other languages
Chinese (zh)
Inventor
郭奕忠 (Guo Yizhong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.): 2021-01-18
Filing date: 2021-01-18
Publication date: 2021-07-30
Application filed by Individual
Priority to CN202110065155.3A
Publication of CN113190104A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Abstract

An intelligent device uses visual analysis to recognize human actions and thereby realize human-computer interaction. The device analyzes body movements through its camera, treats the recognized action type as an instruction, and responds accordingly. The system can be used for fitness exercise, human-computer interactive educational games, human-machine command interaction in noisy outdoor environments, and similar scenarios.

Description

Method for realizing man-machine interaction by recognizing human actions through visual analysis by intelligent equipment
Technical Field
The invention relates to the field of software.
Background
With the growing computing power of intelligent devices, particularly everyday portable devices such as smartphones and tablet computers, it has become possible to perform edge computing on the device itself and to recognize and analyze human actions accurately in real time.
This enables a new scheme for human-computer interaction: human actions are recognized and used as instructions to the intelligent device, making the interaction more convenient.
The scheme is suitable for many scenarios, in particular fitness exercise and entertainment games.
Disclosure of Invention
Using the intelligent device's computing power and video acquisition module, the person's actions are captured and analyzed in real time and the recognized action is compared with a preset action. If they are the same, a new interactive scene is started; if not, the user is prompted or informed to adjust the action according to the scene requirements until it matches the preset action; this interaction loop is sketched below for illustration.
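For illustration only, the interaction loop described above can be sketched in Python. The names recognize, start_scene, and prompt_adjustment are hypothetical placeholders for the device's recognition, scene-switching, and feedback modules; they are not components defined by this application.

```python
from typing import Any, Callable, Iterable

def interaction_loop(
    frames: Iterable[Any],
    preset_action: str,
    recognize: Callable[[Any], str],
    start_scene: Callable[[str], None],
    prompt_adjustment: Callable[[str, str], None],
) -> None:
    """Compare the action recognized in each camera frame with the preset action."""
    for frame in frames:
        action = recognize(frame)                 # e.g. "squat", "wave", "jump"
        if action == preset_action:
            start_scene(preset_action)            # match: start the new interactive scene
            return
        prompt_adjustment(action, preset_action)  # mismatch: ask the user to adjust the action
```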
The method is briefly described as follows:
First, according to the requirements of the interactive scene, skeleton joint data for the various actions are collected, produced, and stored as the reference standard for judging interaction instructions.
Second, the intelligent device captures and analyzes the person's movements in real time through the video acquisition module and extracts the person's current skeleton joint data.
Finally, the skeleton joint data analyzed in real time are compared with the preset action's skeleton joint data and, combined with a logic algorithm and the scene requirements, the intelligent device produces the corresponding response and feedback; one possible comparison is sketched after these steps.
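The application does not specify how the skeleton joint data are compared. Below is a minimal sketch of one possible comparison, assuming each action is represented as an (N, 2) array of 2-D joint coordinates and using a normalized Euclidean distance with an arbitrary tolerance; the joint layout, normalization, and threshold are illustrative assumptions, not the claimed algorithm.

```python
import numpy as np

def normalize_joints(joints: np.ndarray) -> np.ndarray:
    """Centre the skeleton on its mean joint and scale it to unit size, so the
    comparison tolerates the person's position and distance from the camera.
    `joints` is an (N, 2) array of joint coordinates."""
    centred = joints - joints.mean(axis=0)
    scale = np.linalg.norm(centred)
    return centred / scale if scale > 0 else centred

def matches_preset(current: np.ndarray, preset: np.ndarray, tol: float = 0.15) -> bool:
    """Return True if the live skeleton is close enough to the preset action.
    `tol` is an arbitrary per-joint RMS distance threshold in normalized units."""
    diff = normalize_joints(current) - normalize_joints(preset)
    rms = float(np.linalg.norm(diff)) / np.sqrt(len(current))
    return rms < tol

# Example with a hypothetical 4-joint "arms raised" template and a live measurement.
preset = np.array([[0.0, 0.0], [0.0, 1.0], [-1.0, 2.0], [1.0, 2.0]])
live = np.array([[0.1, 0.0], [0.1, 1.05], [-0.9, 2.1], [1.0, 2.0]])
print(matches_preset(live, preset))  # True: within tolerance of the preset action
```

In practice, the live joint coordinates themselves could come from any off-the-shelf pose estimator running on the device; the choice of estimator is left open by the application.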
Drawings
FIG. 1: collecting certain motion bone joint data demonstration
FIG. 2: collecting data demonstration of certain motion bone joints.

Claims (3)

1. A method for realizing human-computer interaction by recognizing human actions through visual analysis on an intelligent device, comprising: the intelligent device captures and analyzes the person's actions in real time through a video acquisition module and compares the recognized action with a preset action; if they are the same, a new interactive scene is started; if not, the user is prompted or informed to adjust the action according to the scene requirements until it is the same as the preset action.
2. Intelligent devices, including but not limited to: mobile phones, tablet devices, and smart televisions.
3. Application scenarios, including but not limited to: sports and fitness, and entertainment and games.
CN202110065155.3A 2021-01-18 2021-01-18 Method for realizing man-machine interaction by recognizing human actions through visual analysis by intelligent equipment Pending CN113190104A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110065155.3A CN113190104A (en) 2021-01-18 2021-01-18 Method for realizing man-machine interaction by recognizing human actions through visual analysis by intelligent equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110065155.3A CN113190104A (en) 2021-01-18 2021-01-18 Method for realizing man-machine interaction by recognizing human actions through visual analysis by intelligent equipment

Publications (1)

Publication Number Publication Date
CN113190104A 2021-07-30

Family

ID=76972614

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110065155.3A Pending CN113190104A (en) 2021-01-18 2021-01-18 Method for realizing man-machine interaction by recognizing human actions through visual analysis by intelligent equipment

Country Status (1)

Country Link
CN (1) CN113190104A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120143358A1 (en) * 2009-10-27 2012-06-07 Harmonix Music Systems, Inc. Movement based recognition and evaluation
CN104898828A (en) * 2015-04-17 2015-09-09 杭州豚鼠科技有限公司 Somatosensory interaction method using somatosensory interaction system
CN110858277A (en) * 2018-08-22 2020-03-03 阿里巴巴集团控股有限公司 Method and device for obtaining attitude classification model
CN111047925A (en) * 2019-12-06 2020-04-21 山东大学 Action learning system and method based on room type interactive projection
CN111617464A (en) * 2020-05-28 2020-09-04 西安工业大学 Treadmill body-building method with action recognition function

Similar Documents

Publication Publication Date Title
Rautaray Real time hand gesture recognition system for dynamic applications
CN103336576B (en) A kind of moving based on eye follows the trail of the method and device carrying out browser operation
CN108983636B (en) Man-machine intelligent symbiotic platform system
CN110428486B (en) Virtual interaction fitness method, electronic equipment and storage medium
CN108460329B (en) Face gesture cooperation verification method based on deep learning detection
CN110688910B (en) Method for realizing wearable human body basic gesture recognition
CN107894836B (en) Human-computer interaction method for processing and displaying remote sensing image based on gesture and voice recognition
WO2018000519A1 (en) Projection-based interaction control method and system for user interaction icon
US9612688B2 (en) Projection method and electronic device
CN110298220B (en) Action video live broadcast method, system, electronic equipment and storage medium
CN102222342A (en) Tracking method of human body motions and identification method thereof
CN103605466A (en) Facial recognition control terminal based method
CN103092332A (en) Digital image interactive method and system of television
CN114779922A (en) Control method for teaching apparatus, control apparatus, teaching system, and storage medium
CN105042789A (en) Control method and system of intelligent air conditioner
CN111103982A (en) Data processing method, device and system based on somatosensory interaction
CN111223549A (en) Mobile end system and method for disease prevention based on posture correction
CN110568931A (en) interaction method, device, system, electronic device and storage medium
CN108646578B (en) Medium-free aerial projection virtual picture and reality interaction method
CN103135746A (en) Non-touch control method and non-touch control system and non-touch control device based on static postures and dynamic postures
Thakoor et al. Attention biased speeded up robust features (ab-surf): A neurally-inspired object recognition algorithm for a wearable aid for the visually-impaired
CN113556599A (en) Video teaching method and device, television and storage medium
CN113190104A (en) Method for realizing man-machine interaction by recognizing human actions through visual analysis by intelligent equipment
Loewenich et al. Hands-free mouse-pointer manipulation using motion-tracking and speech recognition
Zidianakis et al. Building a sensory infrastructure to support interaction and monitoring in ambient intelligence environments

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 2021-07-30