CN106325527A - Human body action identification system - Google Patents

Human body action identification system

Info

Publication number
CN106325527A
CN106325527A (application CN201610908475.XA)
Authority
CN
China
Prior art keywords
module
human body
human action
human
identification system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610908475.XA
Other languages
Chinese (zh)
Inventor
吕学刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
GH DEVELOPMENT HOLDINGS Ltd
Original Assignee
GH DEVELOPMENT HOLDINGS Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by GH DEVELOPMENT HOLDINGS Ltd filed Critical GH DEVELOPMENT HOLDINGS Ltd
Priority to CN201610908475.XA priority Critical patent/CN106325527A/en
Publication of CN106325527A publication Critical patent/CN106325527A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 - Movements or behaviour, e.g. gesture recognition
    • G06V 40/23 - Recognition of whole body movements, e.g. for sport training
    • G06V 40/25 - Recognition of walking or running movements, e.g. gait recognition
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 - Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/01 - Indexing scheme relating to G06F3/01
    • G06F 2203/013 - Force feedback applied to a game
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2218/00 - Aspects of pattern recognition specially adapted for signal processing
    • G06F 2218/02 - Preprocessing
    • G06F 2218/04 - Denoising

Abstract

The invention relates to a human body action identification system. The system comprises a human body action identification unit and an intelligent terminal in communication connection with the human body action identification unit. The intelligent terminal comprises a master controller, a display module and a first communication module; the first communication module is connected with the master controller and is used for data transmission between the master controller and the human body action identification unit; the display module is connected with the master controller and displays the display information output by the master controller. The human body action identification unit senses the actions of a human body, outputs corresponding gesture information and uploads the gesture information to the intelligent terminal. The master controller of the intelligent terminal processes the gesture information and outputs a gesture information display instruction to the display module, and the display module displays the corresponding gesture according to the instruction. With the human body action identification system, the user can move the limbs to perform corresponding actions according to the beats and instructions of a game, the specific actions of the limbs can be sensed, games and physical exercise are combined, and the user experience is improved.

Description

Human action identification system
Technical field
The present invention relates to the field of electronic technology, and more particularly to a human body action identification system.
Background art
Smart devices such as smartphones are now increasingly common, and a wide variety of games and applications have appeared for them. However, these games and applications are difficult to combine with the user's own physical activity; they cannot achieve genuine interaction between the game and the player, so the interactive experience is poor.
Summary of the invention
The technical problem to be solved by the present invention is that, in the prior art described above, games and applications are difficult to combine with the user's own body movement and cannot achieve interaction between the game and the player, resulting in a poor interactive experience. To overcome this defect, the present invention provides a human action identification system.
The technical solution adopted by the present invention to solve this problem is to construct a human action identification system comprising a human action recognition unit and an intelligent terminal in communication connection with the human action recognition unit.
The intelligent terminal includes a master controller, a display module and a first communication module. The first communication module is connected with the master controller and is used for data transmission between the master controller and the human action recognition unit. The display module is connected with the master controller and is used for displaying the display information output by the master controller.
The human action recognition unit includes a microprocessor and a sensing module connected with the microprocessor. The sensing module senses the action of the human body and outputs a corresponding sensing signal to the microprocessor; the microprocessor processes the sensing signal, outputs corresponding attitude information and uploads the attitude information to the intelligent terminal.
The master controller of the intelligent terminal processes the attitude information and outputs an attitude information display instruction to the display module, and the display module displays the corresponding attitude according to the attitude information display instruction.
Preferably, in the human action identification system of the present invention, the human action recognition unit further includes a second communication module connected with the microprocessor, and the second communication module is used for data transmission with the first communication module.
Preferably, in the human action identification system of the present invention, the human action recognition unit further includes an attitude calculation module connected with the microprocessor. The microprocessor sends the received sensing signal to the attitude calculation module; the attitude calculation module processes the sensing signal, calculates the attitude information of the human body and returns it to the microprocessor, and the microprocessor outputs the corresponding attitude information according to the attitude information of the human body.
Preferably, in the human action identification system of the present invention, the attitude calculation module includes a Kalman filter. The Kalman filter filters the sensing signal sent by the microprocessor and outputs a Kalman-filtered signal.
Preferably, in the human action identification system of the present invention, the attitude calculation module further includes a quaternion calculation module connected with the Kalman filter. The quaternion calculation module receives the Kalman-filtered signal output by the Kalman filter, obtains three-dimensional Euler angles from the Kalman-filtered signal by quaternion calculation, calculates the three-dimensional attitude of the human body, and thereby obtains the attitude information of the human body.
Preferably, in the human action identification system of the present invention, the sensing module includes a three-axis acceleration sensor, a three-axis geomagnetic sensor and/or a three-axis gyroscope, and the action of the human body is sensed by the three-axis acceleration sensor, the three-axis geomagnetic sensor and/or the three-axis gyroscope to produce the corresponding sensing signal.
Preferably, in the human action identification system of the present invention, the sensing signal includes orientation data, rotation data and displacement data of the limbs of the human body in three-dimensional space.
Preferably, in the human action identification system of the present invention, the human action recognition unit further includes a power module connected with the microprocessor, and the power module supplies power to the human action recognition unit.
Preferably, in the human action identification system of the present invention, the human action recognition unit further includes an LED connected with the microprocessor, and the LED emits different colors according to the attitude information of the human body.
Preferably, in the human action identification system of the present invention, the first communication module and the second communication module transmit data by means of Bluetooth or Wi-Fi.
Implementing the human action identification system of the present invention has the following beneficial effects. The system includes a human action recognition unit and an intelligent terminal in communication connection with the human action recognition unit. The intelligent terminal includes a master controller, a display module and a first communication module; the first communication module is connected with the master controller and is used for data transmission between the master controller and the human action recognition unit; the display module is connected with the master controller and displays the display information output by the master controller. The human action recognition unit includes a microprocessor and a sensing module connected with the microprocessor; the sensing module senses the action of the human body to produce a corresponding sensing signal and outputs it to the microprocessor; the microprocessor processes the sensing signal, outputs corresponding attitude information and uploads it to the intelligent terminal. The master controller of the intelligent terminal processes the attitude information and outputs an attitude information display instruction to the display module, and the display module displays the corresponding attitude according to the instruction. By interconnecting the human action recognition unit with the intelligent terminal, the user can move the limbs to perform corresponding actions according to the rhythm and instructions of the game shown on the intelligent terminal, while the human action recognition unit senses the specific actions of the user's limbs. The system thereby combines gaming with physical exercise, achieves better interaction between the game and the player, strengthens interactivity and improves the user experience.
Brief description of the drawings
The invention will be further described below with reference to the drawings and embodiments, in which:
Fig. 1 is a functional block diagram of the human body action identification system of the present invention.
Detailed description of the invention
In order that the technical features, objects and effects of the present invention may be understood more clearly, specific embodiments of the present invention are described in detail below with reference to the accompanying drawings.
As shown in Fig. 1, in an embodiment of the human action identification system of the present invention, the system includes a human action recognition unit 100 and an intelligent terminal 200 in communication connection with the human action recognition unit 100. The intelligent terminal 200 may include a master controller 201, a display module 203 and a first communication module 202. The first communication module 202 is connected with the master controller 201 and is used for data transmission between the master controller 201 and the human action recognition unit 100. The display module 203 is connected with the master controller 201 and displays the display information output by the master controller 201. It should be understood that, in some embodiments, the intelligent terminal 200 may be any one of a smartphone, a tablet computer, a desktop computer (PC), a PlayStation, an Xbox One or an Xbox 360.
The human action recognition unit 100 includes a microprocessor 101 and a sensing module 102 connected with the microprocessor 101. The sensing module 102 senses the action of the human body to produce a corresponding sensing signal and outputs it to the microprocessor 101; the microprocessor 101 processes the sensing signal, outputs corresponding attitude information and uploads the attitude information to the intelligent terminal 200. The master controller 201 of the intelligent terminal 200 processes the received attitude information and outputs an attitude information display instruction to the display module 203, and the display module 203 displays the corresponding attitude according to the attitude information display instruction. It should be understood that the attitude referred to in the present invention is the specific action performed by the user during the game; for example, in a dancing game, when the user's left hand is raised towards the upper left at an angle of 45 degrees, the display module 203 of the intelligent terminal 200 shows the same raising action.
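As an illustration of this data flow, the following minimal Python sketch models the attitude information produced by the recognition unit and the display instruction the master controller derives from it; the field names and the 30-degree threshold are assumptions made for the example, not the message format of the described system.

```python
from dataclasses import dataclass

@dataclass
class AttitudeInfo:
    limb: str             # e.g. "left_hand"
    azimuth_deg: float    # orientation in the horizontal plane
    elevation_deg: float  # how far the limb is raised
    timestamp_ms: int

@dataclass
class DisplayInstruction:
    animation: str        # posture for the display module to render
    angle_deg: float

def to_display_instruction(info: AttitudeInfo) -> DisplayInstruction:
    """Master-controller step: turn uploaded attitude information into a display instruction."""
    name = f"{info.limb}_raise" if info.elevation_deg > 30 else f"{info.limb}_rest"
    return DisplayInstruction(animation=name, angle_deg=info.elevation_deg)

# Example: the left hand raised towards the upper left at roughly 45 degrees.
instruction = to_display_instruction(AttitudeInfo("left_hand", 135.0, 45.0, 0))
```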
In practical applications, in some embodiments, the human action recognition unit 100 is a recognition device worn on any one of the user's foot, leg, hand or another position on the body. Specifically, the human action recognition unit 100 may be an ankle band, a wristband or another recognition unit that can be worn on a certain position of the user's body. The recognition unit selected may be determined according to the actual application; the present invention is not limited in this respect.
The human action recognition unit 100 further includes a second communication module 104 connected with the microprocessor 101, and the second communication module 104 is used for data transmission with the first communication module 202. It should be understood that, in some embodiments, the first communication module 202 and the second communication module 104 may transmit data by means of Bluetooth or Wi-Fi.
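For the Wi-Fi case, a minimal sketch of how the second communication module might upload one attitude packet to the terminal over UDP; the address, port and JSON payload are assumptions made for the example (the Bluetooth case would use a BLE stack instead and is not shown).

```python
import json
import socket

TERMINAL_ADDR = ("192.168.1.50", 9000)  # hypothetical address of the intelligent terminal

def upload_attitude(attitude: dict) -> None:
    """Send one attitude packet from the recognition unit to the terminal."""
    payload = json.dumps(attitude).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, TERMINAL_ADDR)

# Example: one packet for a 45-degree raise of the left hand.
upload_attitude({"limb": "left_hand", "elevation_deg": 45.0})
```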
In some embodiments, the human action recognition unit 100 further includes an attitude calculation module 103 connected with the microprocessor 101. The microprocessor 101 sends the received sensing signal to the attitude calculation module 103; the attitude calculation module 103 processes the sensing signal, calculates the attitude information of the human body and returns it to the microprocessor 101, and the microprocessor 101 outputs the corresponding attitude information according to the attitude information of the human body. Preferably, the attitude calculation module 103 includes a Kalman filter. The Kalman filter filters the sensing signal sent by the microprocessor 101 and outputs a Kalman-filtered signal, i.e. the data signal obtained after Kalman filtering. It should be understood that Kalman filtering is an algorithm that uses the state equation of a linear system together with the observed input and output data of the system to perform an optimal estimation of the system state. Because the observed data include the effects of noise and disturbances in the system, this optimal estimation can also be regarded as a filtering process. Data filtering is a data processing technique that removes noise and recovers the true data; Kalman filtering can estimate the state of a dynamic system from a series of measurements containing noise, provided the measurement variance is known. Since the algorithm is easy to implement in a computer program and can update and process data collected in the field in real time, the human action recognition unit 100 of the present invention uses a Kalman filter to filter the sensing signal, so that the attitude data of the human body's action can be obtained faster and more accurately.
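As an illustration of the filtering step, a minimal scalar Kalman filter in Python that smooths one noisy sensor channel (for example one acceleration axis); the process and measurement variances are assumed values, not parameters of the described system.

```python
import numpy as np

def kalman_1d(measurements, process_var=1e-3, meas_var=0.05):
    """Estimate a slowly varying signal from noisy samples."""
    x = measurements[0]   # initial state estimate
    p = 1.0               # initial estimate covariance
    estimates = []
    for z in measurements:
        # Predict: the state is assumed roughly constant between samples,
        # so only the uncertainty grows by the process variance.
        p = p + process_var
        # Update: blend prediction and measurement using the Kalman gain.
        k = p / (p + meas_var)
        x = x + k * (z - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates)

# Example: a noisy accelerometer axis while the hand is held still.
noisy = 9.81 + 0.2 * np.random.randn(200)
smooth = kalman_1d(noisy)
```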
In addition, Kalman filtering does not require the signal and the noise to be stationary processes. For the system disturbance and the observation error (i.e. the noise) at each moment, as long as suitable assumptions are made about their statistical properties, processing the noisy observation signal yields an estimate of the true signal with minimum error in the mean-square sense. For example, in image processing, Kalman filtering can be applied to restore an image blurred by certain kinds of noise: after suitable statistical assumptions about the noise have been made, the Kalman algorithm can recursively recover from the blurred image the true image with minimum mean-square error. In an embodiment of the present invention the same principle is used to recover the action of the human body, which is then displayed by the display module 203. By using a Kalman filter, the motion of the human body can be estimated and reconstructed while the user is playing the game.
In some embodiments, the attitude calculation module 103 further includes a quaternion calculation module connected with the Kalman filter. It should be understood that the quaternion calculation module receives the Kalman-filtered signal output by the Kalman filter, obtains three-dimensional Euler angles from the Kalman-filtered signal by quaternion calculation, calculates the three-dimensional attitude of the human body, and thereby obtains the attitude information of the human body.
Further, Euler angles are the three independent angular quantities that determine the orientation of a body rotating about a fixed point, and can be composed of the nutation angle, the precession angle and the spin angle. By using the Euler kinematic equations to compute the nutation, precession and spin angles, the attitude of the human body moving in real time in three-dimensional space can be calculated and the attitude information of the human body obtained. In addition, the attitude calculation module 103 of the present invention first applies Kalman filtering to the sensing signal and then performs the quaternion calculation, using the quaternion as a mathematical tool to compensate for the limitations of describing the angular motion of the human body with three Euler angles (for example, the gimbal-lock singularity). It should be understood that a quaternion can describe the rotation of a coordinate system or a vector relative to some coordinate system: its scalar part represents the cosine of half the rotation angle, and its vector part represents the direction of the instantaneous rotation axis, i.e. the direction cosines of the instantaneous axis with respect to the reference frame. A quaternion therefore expresses both the direction of the rotation axis and the magnitude of the rotation angle. Accordingly, in an embodiment of the present invention, the attitude information of the human body can be obtained quickly by quaternion calculation, improving the efficiency of the computation.
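For illustration, a minimal Python sketch of the final step of such an attitude solution, converting a unit quaternion into roll, pitch and yaw Euler angles under the common Z-Y-X convention; the convention is an assumption, as the description does not fix one.

```python
import math

def quaternion_to_euler(w, x, y, z):
    """Return (roll, pitch, yaw) in radians for a unit quaternion (w, x, y, z)."""
    # roll: rotation about the x axis
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    # pitch: rotation about the y axis, clamped to avoid domain errors
    s = max(-1.0, min(1.0, 2.0 * (w * y - z * x)))
    pitch = math.asin(s)
    # yaw: rotation about the z axis
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw

# Example: a 90-degree rotation about the z axis.
angles = quaternion_to_euler(math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
```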
In some embodiments, the sensing module 102 includes a three-axis acceleration sensor, a three-axis geomagnetic sensor and/or a three-axis gyroscope. It should be understood that the sensing module 102 may include all three of these sensors, or any combination of the three-axis acceleration sensor, the three-axis geomagnetic sensor and the three-axis gyroscope. The action of the human body is sensed by the three-axis acceleration sensor, the three-axis geomagnetic sensor and/or the three-axis gyroscope to produce the corresponding sensing signal.
In some embodiments, the sensing signal includes orientation data, rotation data and displacement data of the limbs of the human body in three-dimensional space.
For example, while the user performs different movements, the body carries out different actions at different angles, rotations and positions depending on the state of motion. The human action recognition unit worn on the body senses the corresponding sensing signals as the user's body action changes: it can sense the orientation data within a certain range, the continuously changing rotation data and the straight-line displacement data within a certain range, i.e. the trajectory data of the user's motion. From the obtained trajectory data, the azimuth angle, the direction of rotation (for example clockwise or counter-clockwise) and the direction of displacement (the same as or opposite to the azimuth) are then calculated. By processing these data, the attitude of the user's limbs in three-dimensional space, i.e. the attitude information, is obtained.
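As an illustration, a minimal Python sketch of how a rotation angle and a straight-line displacement for one axis could be derived from gyroscope and accelerometer samples; the fixed sample rate and the assumption that gravity has already been removed are simplifications made for the example.

```python
import numpy as np

def integrate_motion(gyro_z, accel_x, dt=0.01):
    """gyro_z in rad/s, accel_x in m/s^2 (gravity already removed), sampled every dt seconds."""
    # Rotation about the z axis: integrate the angular rate once.
    angle = np.cumsum(gyro_z) * dt
    # Displacement along the x axis: integrate the acceleration twice.
    # Note: in practice double integration drifts quickly, which is one reason
    # the description filters the sensing signal before solving the attitude.
    velocity = np.cumsum(accel_x) * dt
    displacement = np.cumsum(velocity) * dt
    return angle, displacement

# Example: a steady 1 rad/s turn and a gentle 0.5 m/s^2 push lasting one second.
angle, disp = integrate_motion(np.full(100, 1.0), np.full(100, 0.5))
```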
In some embodiments, the human action recognition unit 100 further includes a power module 105 connected with the microprocessor 101. It should be understood that the power module 105 supplies power to the human action recognition unit 100.
In some embodiments, the human action recognition unit 100 further includes an LED 106 connected with the microprocessor 101. It should be understood that the LED 106 emits different colors according to the attitude information of the human body. Specifically, while the user is playing the game, the microprocessor 101 processes the sensing signal received from the sensing module 102 and outputs a corresponding LED display control instruction, which controls the LED 106 to show different color changes. In other words, the LED 106 referred to in the present invention may be what is often called a "breathing light" in the art: when the user performs a certain action, such as a jump, the microprocessor 101 processes the sensing signal and outputs a display control instruction to the LED 106, so that the corresponding LED lights up and shows the color corresponding to the jump. Preferably, in an embodiment of the present invention, 255 colors can be displayed through combinations and changes of the LED colors.
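For illustration, a minimal Python sketch of one possible mapping from an attitude value to an LED color; the gradient and the use of the pitch angle are assumptions made for the example, not the color scheme of the described system.

```python
def attitude_to_led_color(pitch_deg):
    """Map a pitch angle in [-90, 90] degrees to an (r, g, b) tuple."""
    # Normalize the angle to 0..1, then spread it across a simple two-color gradient.
    t = (max(-90.0, min(90.0, pitch_deg)) + 90.0) / 180.0
    level = int(t * 254) + 1   # 1..255, one of 255 possible levels
    r = level                  # warmer as the arm is raised
    g = 255 - level            # cooler as the arm is lowered
    b = 64                     # constant accent channel
    return r, g, b

# Example: arm raised 45 degrees towards the upper left.
color = attitude_to_led_color(45.0)
```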
It should be understood that, in some embodiments, the human action identification system of the present invention can achieve the purpose of combining games with physical exercise, for example sports games such as dancing, skiing and cycling. The human action recognition unit 100 worn by the user senses and identifies the action of the human body and uploads the attitude information of the identified action to the intelligent terminal 200; the intelligent terminal 200 displays the attitude information and at the same time shows the corresponding game interface. The user is then instructed to perform the corresponding actions according to the rhythm and instructions of the game displayed on the intelligent terminal 200, thereby achieving interaction between the player and the game and improving the user experience.
To further illustrate the working principle of the human action identification system of the present invention, a detailed description is now given taking a dancing ring as an example.
First, the user puts on the dancing ring (including an ankle band and a wristband), then switches on the dancing ring, switches on the intelligent terminal 200, connects the dancing ring to the intelligent terminal 200, opens the corresponding dancing game on the intelligent terminal 200 and enters the dancing game interface. Following the rhythm and prompts shown on the dancing game interface, the user begins to move both feet and both hands. The three-axis acceleration sensor, three-axis geomagnetic sensor and/or three-axis gyroscope in the wristband and ankle band sense the user's movement and produce the corresponding sensing signals, such as the orientation data, rotation data and displacement data of the body moving in three-dimensional space, and transmit these sensing signals (orientation data, rotation data and displacement data) to the microprocessor 101. The microprocessor 101 receives the sensing signals, processes them and sends them to the attitude calculation module 103. In the attitude calculation module 103, the Kalman filter filters the orientation, rotation and displacement data to obtain the Kalman-filtered signal and passes it to the quaternion calculation module, which processes the Kalman-filtered signal and calculates the user's three-dimensional attitude, i.e. the attitude information of the user. The microprocessor 101 uploads the obtained attitude information to the intelligent terminal 200; the intelligent terminal 200 processes the attitude information and displays it on the display module 203, and the user adjusts the body movement, such as the direction, distance or rotation angle, according to what is shown on the display module 203, thereby completing the corresponding dancing posture. Meanwhile, the microprocessor 101 outputs the corresponding LED display instruction according to the attitude information, controlling the LED to display the corresponding color and enhancing the effect and atmosphere of the dance. A dancing game combining the player with the game is thus completed, the interaction between the user and the game is further improved, and the user experience is enhanced.
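To summarise the chain described above, a minimal Python sketch of one sensing cycle on the dancing ring; the smoothing step is reduced to a simple low-pass stand-in for the Kalman filter, the attitude solution to a single tilt angle, and the JSON message format is an assumption made for the example.

```python
import json
import math

def low_pass(prev, new, alpha=0.2):
    """Simple smoothing stand-in for the Kalman step sketched earlier."""
    return prev + alpha * (new - prev)

def process_cycle(sample, state):
    """sample: raw accelerometer dict {'ax','ay','az'}; state: smoothed values carried between cycles."""
    for axis in ("ax", "ay", "az"):
        state[axis] = low_pass(state.get(axis, sample[axis]), sample[axis])
    # Attitude solution reduced to one tilt (pitch) angle from the gravity direction.
    pitch = math.degrees(math.atan2(state["ax"],
                                    math.hypot(state["ay"], state["az"])))
    # Attitude information packaged for upload to the intelligent terminal.
    return json.dumps({"limb": "left_wrist", "pitch_deg": round(pitch, 1)})

# Example cycle with the wrist tilted forward.
msg = process_cycle({"ax": 4.9, "ay": 0.0, "az": 8.5}, {})
```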
The human action recognition unit 100 of the human action identification system disclosed in the present invention senses the action of the human body in order to identify the movement the user is currently performing, i.e. the limb action of the body, such as waving, jumping, walking or turning, including actions in dancing, skiing, cycling and the like. By recognising the different regular patterns of these limb actions, the human action recognition unit 100 can identify the action the user is currently performing, obtain the user's current attitude information through the attitude calculation, and display that attitude information on the intelligent terminal 200, so that the user can watch his or her own current attitude in real time. The user can then adjust his or her own movement according to the action displayed on the intelligent terminal 200 and complete the exercise better, further enhancing the interaction between the user and the game.
The above embodiments merely illustrate the technical concept and features of the present invention; their purpose is to enable a person skilled in the art to understand the content of the present invention and implement it accordingly, and they do not limit the scope of protection of the present invention. All equivalent changes and modifications made within the scope of the claims of the present invention shall fall within the scope covered by the claims of the present invention.
It should be understood that a person of ordinary skill in the art may make improvements or variations based on the above description, and all such improvements and variations shall fall within the scope of protection of the appended claims of the present invention.

Claims (10)

1. A human action identification system, characterised in that it comprises a human action recognition unit and an intelligent terminal in communication connection with the human action recognition unit;
wherein the intelligent terminal comprises a master controller, a display module and a first communication module; the first communication module is connected with the master controller and is used for data transmission between the master controller and the human action recognition unit; the display module is connected with the master controller and is used for displaying the display information output by the master controller;
the human action recognition unit comprises a microprocessor and a sensing module connected with the microprocessor; the sensing module is used for sensing the action of a human body to produce a corresponding sensing signal and outputting it to the microprocessor; the microprocessor processes the sensing signal, outputs corresponding attitude information and uploads the attitude information to the intelligent terminal;
and the master controller of the intelligent terminal processes the attitude information and outputs an attitude information display instruction to the display module, and the display module displays the corresponding attitude according to the attitude information display instruction.
2. The human action identification system according to claim 1, characterised in that the human action recognition unit further comprises a second communication module connected with the microprocessor, and the second communication module is used for data transmission with the first communication module.
3. The human action identification system according to claim 2, characterised in that the human action recognition unit further comprises an attitude calculation module connected with the microprocessor; the microprocessor sends the received sensing signal to the attitude calculation module; the attitude calculation module processes the sensing signal, calculates the attitude information of the human body and returns the attitude information of the human body to the microprocessor; and the microprocessor outputs the corresponding attitude information according to the attitude information of the human body.
4. The human action identification system according to claim 3, characterised in that the attitude calculation module comprises a Kalman filter, and the Kalman filter filters the sensing signal sent by the microprocessor and outputs a Kalman-filtered signal.
5. The human action identification system according to claim 4, characterised in that the attitude calculation module further comprises a quaternion calculation module connected with the Kalman filter; the quaternion calculation module receives the Kalman-filtered signal output by the Kalman filter, obtains three-dimensional Euler angles from the Kalman-filtered signal by quaternion calculation, calculates the three-dimensional attitude of the human body, and obtains the attitude information of the human body.
6. The human action identification system according to claim 1, characterised in that the sensing module comprises a three-axis acceleration sensor, a three-axis geomagnetic sensor and/or a three-axis gyroscope, and the action of the human body is sensed by the three-axis acceleration sensor, the three-axis geomagnetic sensor and/or the three-axis gyroscope to produce the corresponding sensing signal.
7. The human action identification system according to claim 6, characterised in that the sensing signal comprises orientation data, rotation data and displacement data of the limbs of the human body in three-dimensional space.
8. The human action identification system according to claim 7, characterised in that the human action recognition unit further comprises a power module connected with the microprocessor, and the power module is used for supplying power to the human action recognition unit.
9. The human action identification system according to claim 8, characterised in that the human action recognition unit further comprises an LED connected with the microprocessor, and the LED is used for emitting different colors according to the attitude information of the human body.
10. The human action identification system according to any one of claims 1 to 9, characterised in that the first communication module and the second communication module transmit data by means of Bluetooth or Wi-Fi.
CN201610908475.XA 2016-10-18 2016-10-18 Human body action identification system Pending CN106325527A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610908475.XA CN106325527A (en) 2016-10-18 2016-10-18 Human body action identification system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610908475.XA CN106325527A (en) 2016-10-18 2016-10-18 Human body action identification system

Publications (1)

Publication Number Publication Date
CN106325527A 2017-01-11

Family

ID=57819290

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610908475.XA Pending CN106325527A (en) 2016-10-18 2016-10-18 Human body action identification system

Country Status (1)

Country Link
CN (1) CN106325527A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103558915A (en) * 2013-11-01 2014-02-05 王洪亮 Human body coupling intelligent information input system and method
US20160283189A1 (en) * 2013-11-01 2016-09-29 Beijing Xingyun Shikong Technology Co.,Ltd. Human Body Coupled Intelligent Information Input System and Method
CN104898828A (en) * 2015-04-17 2015-09-09 杭州豚鼠科技有限公司 Somatosensory interaction method using somatosensory interaction system
CN206757529U (en) * 2016-10-18 2017-12-15 深圳市华海技术有限公司 Human action identifying system

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106909218B (en) * 2017-02-08 2020-06-09 惠州Tcl移动通信有限公司 Terminal display control method and system based on user behaviors
CN106909218A (en) * 2017-02-08 2017-06-30 惠州Tcl移动通信有限公司 A kind of terminal display control method and system based on user behavior
CN107966146A (en) * 2017-12-26 2018-04-27 北京风语智格物联科技有限公司 A kind of human body is bent over gesture recognition device
CN108703760A (en) * 2018-06-15 2018-10-26 安徽中科智链信息科技有限公司 Human motion gesture recognition system and method based on nine axle sensors
CN108827290A (en) * 2018-06-15 2018-11-16 安徽中科智链信息科技有限公司 A kind of human motion state inverting device and method
CN109190762A (en) * 2018-07-26 2019-01-11 北京工业大学 Upper limb gesture recognition algorithms based on genetic algorithm encoding
CN109190762B (en) * 2018-07-26 2022-06-07 北京工业大学 Mobile terminal information acquisition system
CN109171735A (en) * 2018-08-03 2019-01-11 郑州飞铄电子科技有限公司 A kind of angle measurement system for human action attitude algorithm
CN109688655A (en) * 2018-11-09 2019-04-26 重庆唯哲科技有限公司 The synchronous display systems of equipment moving based on lighting effects
CN109673076A (en) * 2018-11-29 2019-04-23 重庆唯哲科技有限公司 The synchronous display systems of the light of more equipment movings
CN109673076B (en) * 2018-11-29 2021-08-20 重庆唯哲科技有限公司 Lamplight synchronous display system for multi-equipment movement
CN109743809A (en) * 2018-12-19 2019-05-10 重庆秉为科技有限公司 A method of LED is controlled according to human body attitude and is lighted
CN110879662A (en) * 2019-11-27 2020-03-13 云南电网有限责任公司电力科学研究院 Action recognition device and method based on AHRS algorithm
CN110855797A (en) * 2019-11-28 2020-02-28 河北农业大学 Sheep behavior monitoring system and method based on action recognition
CN113128280A (en) * 2019-12-31 2021-07-16 苏州昶升明旸文化传播有限公司 Motion recognition system for display
CN114796986A (en) * 2022-06-08 2022-07-29 深圳市汇泰科电子有限公司 Method for identifying kettle-bell movement information
CN114796986B (en) * 2022-06-08 2024-03-08 深圳市汇泰科电子有限公司 Method for identifying movement information of kettle bell

Similar Documents

Publication Publication Date Title
CN106325527A (en) Human body action identification system
EP3436867B1 (en) Head-mounted display tracking
US11389686B2 (en) Robotically assisted ankle rehabilitation systems, apparatuses, and methods thereof
US11836284B2 (en) Sensor fusion for electromagnetic tracking
US8287377B2 (en) Movement direction calculator and method for calculating movement direction
CN111228752B (en) Method for automatically configuring sensor, electronic device, and recording medium
CN206757529U (en) Human action identifying system
CN104898829A (en) Somatosensory interaction system
US20180224945A1 (en) Updating a Virtual Environment
JP7316282B2 (en) Systems and methods for augmented reality
CN104898827A (en) Somatosensory interaction method applying somatosensory interaction system
WO2020110659A1 (en) Information processing device, information processing method, and program
US11112857B2 (en) Information processing apparatus, information processing method, and program
US20210373652A1 (en) System and method for a virtual reality motion controller
WO2020261595A1 (en) Virtual reality system, program, and computer readable storage medium
KR102182974B1 (en) System for supporting virtual reality indoor bike exercise and method thereof
CN204631773U (en) Body sense interactive system
CN105771240A (en) Simulation interactive game method
KR102053501B1 (en) VR haptic Tracking System and VR haptic tracking method of walking with Roller based Treadmill system
Kawarazaki et al. A supporting system of choral singing for visually impaired persons using depth image sensor
Arsenault et al. Wearable sensor networks for motion capture
Karimi et al. A wearable 3D motion sensing system integrated with a Bluetooth smart phone application: A system level overview
US20200183487A1 (en) Walkable device and method for controlling a walkable device
Kadam et al. Development of Cost Effective Motion Capture System based on Arduino
CN117180744A (en) Game operation method based on bar-shaped somatosensory controller

Legal Events

Date Code Title Description
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination