CN112274872A - Cloud-based intelligent walnut motion mode identification method and system - Google Patents

Info

Publication number
CN112274872A
Authority
CN
China
Prior art keywords
motion
walnut
intelligent
intelligent walnut
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011124307.4A
Other languages
Chinese (zh)
Other versions
CN112274872B (en)
Inventor
金书易
丁珺淳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN202011124307.4A priority Critical patent/CN112274872B/en
Publication of CN112274872A publication Critical patent/CN112274872A/en
Application granted granted Critical
Publication of CN112274872B publication Critical patent/CN112274872B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A — HUMAN NECESSITIES
        • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
            • A61H — PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
                • A61H39/00 — Devices for locating or stimulating specific reflex points of the body for physical therapy, e.g. acupuncture
                    • A61H39/04 — Devices for pressing such points, e.g. Shiatsu or acupressure
                • A61H2201/00 — Characteristics of apparatus not provided for in the preceding codes
                    • A61H2201/12 — Driving means
                        • A61H2201/1253 — driven by a human being, e.g. hand driven
                            • A61H2201/1261 — combined with active exercising of the patient
                    • A61H2201/16 — Physical interface with patient
                        • A61H2201/1602 — kind of interface, e.g. head rest, knee support or lumbar support
                            • A61H2201/1635 — Hand or arm, e.g. handle
                    • A61H2201/50 — Control means thereof
                        • A61H2201/5058 — Sensors or detectors
                            • A61H2201/5084 — Acceleration sensors
                • A61H2205/00 — Devices for specific parts of the body
                    • A61H2205/06 — Arms
                        • A61H2205/065 — Hands
        • A63 — SPORTS; GAMES; AMUSEMENTS
            • A63B — APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
                • A63B23/00 — Exercising apparatus specially adapted for particular parts of the body
                    • A63B23/035 — for limbs, i.e. upper or lower limbs, e.g. simultaneously
                        • A63B23/12 — for upper limbs or related muscles, e.g. chest, upper back or shoulder muscles
                            • A63B23/16 — for hands or fingers
                • A63B71/00 — Games or sports accessories not covered in groups A63B1/00–A63B69/00
                    • A63B71/06 — Indicating or scoring devices for games or players, or for other sports activities
                        • A63B71/0619 — Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
                        • A63B2071/0691 — Maps, e.g. yardage maps or electronic maps
                • A63B2220/00 — Measuring of physical parameters relating to sporting activity
                    • A63B2220/40 — Acceleration
                        • A63B2220/44 — Angular acceleration
                    • A63B2220/80 — Special sensors, transducers or devices therefor
                        • A63B2220/803 — Motion sensors
    • G — PHYSICS
        • G06 — COMPUTING; CALCULATING OR COUNTING
            • G06F — ELECTRIC DIGITAL DATA PROCESSING
                • G06F18/00 — Pattern recognition
                    • G06F18/20 — Analysing
                        • G06F18/21 — Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
                            • G06F18/214 — Generating training patterns; Bootstrap methods, e.g. bagging or boosting
                        • G06F18/24 — Classification techniques
                            • G06F18/241 — relating to the classification model, e.g. parametric or non-parametric approaches
                                • G06F18/2411 — based on the proximity to a decision surface, e.g. support vector machines
                            • G06F18/243 — relating to the number of classes
                                • G06F18/24323 — Tree-organised classifiers
        • G16 — INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
            • G16H — HEALTHCARE INFORMATICS, i.e. ICT SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
                • G16H20/00 — ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
                    • G16H20/30 — relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Rehabilitation Therapy (AREA)
  • Public Health (AREA)
  • Epidemiology (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Pain & Pain Management (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a cloud-based intelligent walnut motion pattern recognition method and system, wherein the method comprises the following steps: S601, acquiring user motion data in real time with the sensors inside the intelligent walnut; S602, applying smoothing preprocessing to the constructed time series and extracting the motion trend of the processed time series at a fixed interval; S603, computing the acceleration and angular acceleration along the three coordinate axes and performing correlation analysis on the motion trend of the intelligent walnut; S604, constructing correlation-coefficient features for the three coordinate-axis dimensions from the correlation results computed in step S603; S605, training a motion recognition model with the correlation-coefficient features constructed in step S604; and S606, performing intelligent walnut action recognition on newly collected user motion data and feeding the recognition result back to the user. The beneficial effects of the invention are that the exercise time and state of the intelligent walnut user are evaluated effectively and the user is reminded to exercise regularly.

Description

Cloud-based intelligent walnut motion mode identification method and system
[ technical field ]
The invention relates to the technical field of motion recognition for intelligent wearable equipment, and in particular to a cloud-based intelligent walnut motion pattern recognition method and system.
[ background of the invention ]
With social development, population aging in China is becoming increasingly serious. As people age, their physical condition declines year by year, easily causing various diseases that affect daily life. Among these diseases, senile dementia is very common, and its probability of onset rises with age. Modern medical research shows that exercise can enhance cardiovascular function, strengthen the cognitive ability of the brain, and delay the progress of senile dementia. Since the elderly are not suited to strenuous exercise, light and flexible exercise is more popular with them. Researchers have proposed that turning walnuts in the hand can effectively stimulate hand acupuncture points, help patients exercise the arm nerves on the hemiplegic side and the coordination of both arms, and improve the recovery of hand and upper-limb function. However, the walnut playthings on the market have a single function, and the effective exercise time of the user cannot be calculated quantitatively. Moreover, as they age, the elderly suffer memory decline and are unable to exercise regularly.
Each FFT can only transform time-domain data of limited length, so the time-domain signal must be truncated; even for a periodic signal, if the truncation length is not an integer multiple of the period, the truncated signal will exhibit leakage. Minimizing this leakage error requires a weighting function, also called a window function; windowing is applied mainly so that the time-domain signal better satisfies the periodicity assumption of FFT processing, thereby reducing leakage. LOESS (locally weighted regression) is a nonparametric method for local regression analysis: it divides the sample into small intervals, performs polynomial fitting on the samples within each interval, repeats this process to obtain weighted regression curves over the different intervals, and finally connects the centers of these regression curves into a complete regression curve. Bluetooth Low Energy (Bluetooth LE, BLE; formerly marketed as Bluetooth Smart) is a personal area network technology designed and marketed by the Bluetooth Special Interest Group, intended for emerging applications in healthcare, sports and fitness, beacons, security, home entertainment, and similar fields; compared with classic Bluetooth, Bluetooth Low Energy aims to reduce power consumption and cost markedly while maintaining the same communication range. MQTT (Message Queuing Telemetry Transport) is a messaging protocol developed by IBM that is likely to become an important component of the Internet of Things; it supports all platforms, can connect almost any networked item to the outside world, and serves as a communication protocol for sensors and actuators.
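The leakage behaviour described above can be sketched numerically (a minimal NumPy illustration; the signal, window length, and the choice of a Hann window are assumptions for demonstration, not taken from the patent):

```python
import numpy as np

n = 1024
t = np.arange(n)
# 10.37 cycles in the window: not an integer multiple of the period,
# so plain truncation leaks energy into neighbouring frequency bins.
sig = np.sin(2 * np.pi * 10.37 * t / n)

spec_rect = np.abs(np.fft.rfft(sig))                  # rectangular truncation
spec_hann = np.abs(np.fft.rfft(sig * np.hanning(n)))  # Hann-windowed

def leakage(spec):
    """Fraction of spectral magnitude outside the peak bin and its neighbours."""
    k = int(spec.argmax())
    mask = np.ones(spec.size, dtype=bool)
    mask[max(k - 2, 0):k + 3] = False
    return float(spec[mask].sum() / spec.sum())
```

Windowing trades a slightly wider main lobe for much lower side lobes, which is why the windowed spectrum concentrates its energy near the true frequency.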
To overcome the defect that conventional cultural playing walnuts serve only as passive health-care playthings, the invention provides an intelligent walnut information processing method and system based on cloud information acquisition and processing technology.
[ summary of the invention ]
The invention aims to provide a motion pattern recognition method for effectively evaluating motion time and state of an intelligent walnut user and reminding the user of regular motion.
In order to achieve the purpose, the invention adopts the technical scheme that the cloud-based intelligent walnut motion pattern recognition method comprises the following steps:
s601, acquiring user motion data in real time by using an internal sensor of the intelligent walnut, wherein the acquired data comprises horizontal acceleration and angular acceleration of the intelligent walnut in three coordinate axis directions, so that six-dimensional intelligent walnut motion data is formed;
s602, respectively converting the sensor data of the six dimensions into time series at a fixed time frequency, performing smoothing preprocessing on the constructed time series, and extracting the motion trend of the processed time series at a fixed interval;
s603, respectively calculating acceleration and angular acceleration of the three coordinate axis directions, and carrying out correlation analysis on the motion trend of the intelligent walnut;
s604, constructing correlation coefficient characteristics of three coordinate axis direction dimensions according to the correlation result of the intelligent walnut motion trend calculated in the step S603;
s605, training a motion recognition model by using the correlation coefficient characteristics of the three coordinate axis direction dimensions constructed in the step S604;
and S606, carrying out intelligent walnut action recognition on newly acquired user motion data by using the trained action recognition model, and feeding back a recognition result to the user.
Preferably, in the cloud-based intelligent walnut motion pattern recognition method, the user's horizontal acceleration data and angular acceleration data along the three coordinate axes are acquired with the accelerometer and gyroscope inside the intelligent walnut; the intelligent walnut motion is divided into a rotating state mode and a walking state mode. The rotating state mode means that the user rotates the intelligent walnut in place, or rotates the intelligent walnut while walking; the walking state mode means that the user is walking but the intelligent walnut is not rotating in the hand.
Preferably, in the cloud-based intelligent walnut motion pattern recognition method, step S606, the intelligent walnut motion recognition specifically includes the following steps:
s801, performing three coordinate axis direction time sequence modeling on acceleration data and angular acceleration data acquired by an accelerometer and a gyroscope;
s802, smoothing the time sequence constructed in the step S801 by using window function mean value smoothing, wherein the width of a window is determined by the motion state identification frequency;
s803, decomposing the smoothed time sequence obtained in the step S802 based on an LOESS algorithm, and decomposing each coordinate axis direction time sequence into a trend component, a periodic component and a remainder;
s804, calculating acceleration data and angular acceleration data in the same coordinate axis direction, and measuring the motion consistency of a certain moment along the coordinate axis direction by adopting a local correlation coefficient;
s805, constructing a motion trend three-dimensional correlation characteristic sequence based on different coordinate axis directions from the correlation coefficient time sequence calculated in the step S804;
and S806, constructing a classifier by using the three-dimensional correlation characteristic sequence in the step S805 as a data set, identifying a motion mode by the classifier according to the correlation of motion trends represented by the three-dimensional correlation characteristics, wherein when the three-dimensional motion trends are strongly correlated, the intelligent walnut is in a rotating state, and when the three-dimensional motion trends are weakly correlated, the intelligent walnut is in a walking state.
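Steps S802–S803 above can be sketched as follows (a simplified stand-in: a moving mean for the window smoothing and a crude periodic-profile split instead of a full LOESS/STL decomposition; the window and period values are illustrative assumptions):

```python
import numpy as np

def moving_mean(x, w):
    """S802-style mean smoothing: moving average over a window of w samples."""
    return np.convolve(x, np.ones(w) / w, mode="same")

def simple_decompose(x, period):
    """Crude stand-in for the LOESS-based decomposition of S803: split a
    series into trend, periodic component, and remainder.  The trend is a
    wide moving mean; the periodic component is the mean cycle profile of
    the detrended signal; the remainder is whatever is left."""
    trend = moving_mean(x, period)
    detrended = x - trend
    k = len(x) // period
    profile = detrended[:k * period].reshape(k, period).mean(axis=0)
    periodic = np.concatenate([np.tile(profile, k), profile[:len(x) - k * period]])
    remainder = x - trend - periodic
    return trend, periodic, remainder

# Illustrative series: slow drift plus a 12-sample oscillation
x = np.linspace(0.0, 1.0, 120) + np.sin(np.arange(120) * 2 * np.pi / 12)
trend, periodic, remainder = simple_decompose(x, period=12)
```

By construction the three components sum back to the original series, mirroring the trend/periodic/remainder split described for each coordinate-axis time series.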
Preferably, in the cloud-based intelligent walnut motion pattern recognition method, in step S801 the accelerometer data d represent the three-axis acceleration, and the accelerometer time series is represented by the set {d_m[T_i]}, where m ∈ {x, y, z}, i ∈ {1, 2, ..., N}, x, y, z denote the three directions along the X axis, Y axis, and Z axis, and N is the length of the time series; the gyroscope data g represent the three-axis angular acceleration, and the angular acceleration time series is represented by the set {g_m[T_i]}, where m ∈ {x, y, z}, i ∈ {1, 2, ..., N}, with the same meanings as above. In step S804 the local correlation coefficient between the acceleration and the angular acceleration along each of the x, y, z coordinate directions is h_m[T_i] = corr(d_m[T_i], g_m[T_i]), where m ∈ {x, y, z}, i ∈ {1, 2, ..., N}, corr is the Pearson correlation coefficient function computed over a local time window n; computing the correlation coefficient at each time point T_i yields a correlation-coefficient time series for each axis. Step S805 constructs the three-dimensional correlation feature sequence expressed as H[T_i] = (h_x[T_i], h_y[T_i], h_z[T_i]), where i ∈ {1, 2, ..., L} and L is the length of the time series; step S806 constructs a classifier using the H of step S805 as the data set.
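The per-axis local correlation coefficient of step S804 can be sketched as a sliding-window Pearson correlation (NumPy, on synthetic signals; the window length n = 50 and the noise levels are illustrative assumptions):

```python
import numpy as np

def local_corr(d, g, n):
    """Sliding-window Pearson correlation: one coefficient per position of
    an n-sample window over the two equally long series."""
    return np.array([np.corrcoef(d[i:i + n], g[i:i + n])[0, 1]
                     for i in range(len(d) - n + 1)])

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 500)
# Rotating-like axis: acceleration and angular acceleration share one trend,
# so their local correlation stays high for most windows.
d_x = np.sin(t) + 0.02 * rng.standard_normal(t.size)
g_x = np.sin(t) + 0.02 * rng.standard_normal(t.size)
h_x = local_corr(d_x, g_x, n=50)
```

Repeating this per axis gives the three correlation-coefficient time series that are stacked into the three-dimensional feature sequence of step S805.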
Preferably, the construction of a classifier for intelligent walnut motion pattern recognition in step S806 comprises a classifier model training process, and a real-time recognition process, based on the trained classifier, for the acceleration and angular acceleration data acquired by the accelerometer and gyroscope.
Preferably, the cloud-based intelligent walnut motion pattern recognition method trains the classifier model with a support vector machine (SVM), logistic regression, or a decision tree.
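For step S605, any of the three named model families will do; below is a minimal sketch with a support vector machine on synthetic correlation features (scikit-learn is an assumed tooling choice and the class means are invented for illustration, not taken from the patent):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(42)
# Hypothetical training set: each sample is (h_x, h_y, h_z), the mean local
# correlation per axis over one window.  Rotating windows sit near 1,
# walking windows near 0 (values invented for the sketch).
rotating = rng.normal(0.85, 0.05, size=(100, 3))
walking = rng.normal(0.10, 0.10, size=(100, 3))
X = np.vstack([rotating, walking])
y = np.array([1] * 100 + [0] * 100)  # 1 = rotating, 0 = walking

clf = SVC(kernel="rbf").fit(X, y)
train_acc = clf.score(X, y)
```

With classes this well separated, logistic regression or a decision tree would fit essentially the same boundary; the choice mainly trades interpretability against margin robustness.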
The invention further aims to provide a motion pattern recognition system for effectively evaluating the motion time and state of the intelligent walnut user and reminding the user of regular motion.
In order to achieve the purpose, the technical scheme is that the cloud-based intelligent walnut motion pattern recognition system comprises a plurality of intelligent walnuts, a mobile phone matched with the intelligent walnuts and a cloud platform; the intelligent walnut is communicated with the mobile phone through Bluetooth, and the mobile phone is communicated with the cloud platform through a communication link; the intelligent walnut built-in sensor is used for collecting user motion data and transmitting the user motion data to the mobile phone; the mobile phone is used for receiving the motion data and transmitting the motion data to the cloud platform; the cloud platform is used for receiving the motion data transmitted by the mobile phone, storing, calculating and analyzing the motion data, and feeding back a calculation and analysis result to the mobile phone; the cloud-based intelligent walnut motion pattern recognition system executes the cloud-based intelligent walnut motion pattern recognition method.
Preferably, the intelligent walnuts are communicated with the mobile phone through a low-power-consumption Bluetooth protocol, and the mobile phone is communicated with the cloud platform through a message queue telemetry transmission protocol.
Preferably, the intelligent walnut is internally provided with a sensor unit, a processor unit and a signal indicating unit; the sensor unit collects user motion data, and the processor unit transmits the collected motion data to the mobile phone end.
Preferably, the intelligent walnut comprises an intelligent walnut shell and a sensor control board; the sensor control board comprises a sensor unit, a signal indicating unit and a processor unit; the intelligent walnut shell is similar in size and shape to the shell of a common Chinese playing walnut and is provided with striated bulges, so that it feels the same in the hand as a common playing walnut; the intelligent walnut is hollow inside, and the sensor control board is located between the two hemispheres of the intelligent walnut.
Preferably, the mobile phone relays the collected motion data to the cloud platform, stamps the collected motion data with time information, provides the intelligent walnut action recognition service, and displays the user's motion state.
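The phone-side relay can be sketched as follows (a hypothetical message schema: the field names, device id, and JSON-over-MQTT framing are assumptions for illustration, not specified by the patent):

```python
import json
import time

def make_sample_message(device_id, accel, gyro, ts=None):
    """Build one hypothetical phone->cloud message: a six-dimensional sensor
    reading (3-axis acceleration + 3-axis angular acceleration) stamped with
    time information and serialized as JSON, ready for an MQTT publish."""
    payload = {
        "device_id": device_id,
        "timestamp": ts if ts is not None else time.time(),
        "accel": dict(zip("xyz", accel)),
        "gyro": dict(zip("xyz", gyro)),
    }
    return json.dumps(payload)

msg = make_sample_message("walnut-01", (0.12, -0.03, 9.81), (0.5, 0.1, 2.0),
                          ts=1600000000)
```

A compact self-describing payload like this keeps the phone a thin relay: all storage, computation, and analysis stay on the cloud side, as the system description requires.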
The invention has the following beneficial effects: the motion sensors collect the motion data of the intelligent walnut in real time, and the chip built into the intelligent walnut transmits the data to the cloud. At the cloud, an algorithm builds time series from the sensor data and extracts the motion trends along the different coordinate axes; motion correlation coefficients along each coordinate axis are computed from the trend data, and motion-trend correlation sequence features are constructed across the coordinate axes. These features serve as samples for model training, yielding a model that can recognize the motion pattern of the intelligent walnut. The model is then applied to newly collected motion data, the user's exercise time in the different modes is calculated and statistically analyzed, the analysis result is sent back to the smartphone, and the phone reminds the user to exercise regularly.
[ description of the drawings ]
Fig. 1 is a flow chart of a cloud-based intelligent walnut motion pattern recognition method.
Fig. 2 is a schematic diagram of intelligent walnut rotation actions of the cloud-based intelligent walnut motion pattern recognition method.
Fig. 3 is a schematic diagram of a cloud-based intelligent walnut motion pattern recognition method for intelligent walnut walking states.
Fig. 4 is a flow chart of an intelligent walnut recognition algorithm of the cloud-based intelligent walnut motion pattern recognition method.
Fig. 5 is a cloud-based intelligent walnut motion pattern recognition system architecture diagram.
Fig. 6 is a functional module diagram of an intelligent walnut control panel of an intelligent walnut motion pattern recognition system based on a cloud.
Fig. 7 is a perspective view of a cloud-based intelligent walnut of the intelligent walnut motion pattern recognition system.
Fig. 8 is a mobile phone application function module diagram of an intelligent walnut motion pattern recognition system based on a cloud.
FIG. 9 is a flow chart of a cloud-based intelligent walnut motion pattern recognition system classification model and motion pattern recognition by using the model.
[ detailed description ]
The invention is further described with reference to the following examples and with reference to the accompanying drawings.
In the present invention, a server is a computer or apparatus that provides and manages network resources on a network, and a terminal may refer to various types of devices including, but not limited to, wireless phones, cellular phones, laptop computers, multimedia wireless devices, wireless communication Personal Computer (PC) cards, Personal Digital Assistants (PDAs), external or internal modems, and the like. A client device, i.e., a terminal, can be any data device that communicates with a server over a wireless channel and/or over a wired channel, e.g., fiber optic or coaxial cables. A terminal can have a variety of names such as mobile station, mobile device, mobile unit, mobile phone, remote station, remote terminal, remote unit, user device, user equipment, handheld device, etc. Different terminals may be incorporated into one system. Terminals may be mobile or stationary and may be dispersed throughout a communication network.
Example 1
The embodiment realizes an intelligent walnut motion mode identification method based on the cloud.
Fig. 1 is a flow chart of a cloud-based intelligent walnut motion pattern recognition method, and as shown in fig. 1, the cloud-based intelligent walnut motion pattern recognition method of the embodiment includes the following steps:
s601, acquiring user motion data in real time by using an internal sensor of the intelligent walnut, wherein the acquired data comprises horizontal acceleration and angular acceleration of the intelligent walnut in three coordinate axis directions, so that six-dimensional intelligent walnut motion data is formed;
s602, respectively converting the sensor data of the six dimensions into time series at a fixed time frequency, performing smoothing preprocessing on the constructed time series, and extracting the motion trend of the processed time series at a fixed interval;
s603, respectively calculating acceleration and angular acceleration of the three coordinate axis directions, and carrying out correlation analysis on the motion trend of the intelligent walnut;
s604, constructing correlation coefficient characteristics of three coordinate axis direction dimensions according to the correlation result of the intelligent walnut motion trend calculated in the step S603;
s605, training a motion recognition model by using the correlation coefficient characteristics of the three coordinate axis direction dimensions constructed in the step S604;
and S606, carrying out intelligent walnut action recognition on newly acquired user motion data by using the trained action recognition model, and feeding back a recognition result to the user.
Preferably, in the cloud-based intelligent walnut motion mode identification method, a user acquires horizontal acceleration data and angular acceleration data of the intelligent walnut in three coordinate axis directions by using the accelerometer and the gyroscope inside the intelligent walnut; the intelligent walnut movement is divided into a rotation state mode and a walking state mode.
Fig. 2 is a schematic diagram of the intelligent walnut rotation action in the cloud-based intelligent walnut motion pattern recognition method. As shown in fig. 2, the rotating state mode means that the user rotates the intelligent walnut in place, or rotates the intelligent walnut while walking.
Fig. 3 is a schematic diagram of the intelligent walnut walking state in the cloud-based intelligent walnut motion pattern recognition method. As shown in fig. 3, the walking state mode means that the user is walking but the intelligent walnut is not rotating in the hand.
Fig. 4 is a flowchart of an intelligent walnut recognition algorithm of the cloud-based intelligent walnut motion pattern recognition method, as shown in fig. 4, preferably, the step S606 of the cloud-based intelligent walnut motion pattern recognition method specifically includes the following steps:
s801, performing three coordinate axis direction time sequence modeling on acceleration data and angular acceleration data acquired by an accelerometer and a gyroscope;
s802, smoothing the time sequence constructed in the step S801 by using window function mean value smoothing, wherein the width of a window is determined by the motion state identification frequency;
s803, decomposing the smoothed time sequence obtained in the step S802 based on an LOESS algorithm, and decomposing each coordinate axis direction time sequence into a trend component, a periodic component and a remainder;
s804, calculating acceleration data and angular acceleration data in the same coordinate axis direction, and measuring the motion consistency of a certain moment along the coordinate axis direction by adopting a local correlation coefficient;
s805, constructing a motion trend three-dimensional correlation characteristic sequence based on different coordinate axis directions from the correlation coefficient time sequence calculated in the step S804;
and S806, constructing a classifier by using the three-dimensional correlation characteristic sequence in the step S805 as a data set, identifying a motion mode by the classifier according to the correlation of motion trends represented by the three-dimensional correlation characteristics, wherein when the three-dimensional motion trends are strongly correlated, the intelligent walnut is in a rotating state, and when the three-dimensional motion trends are weakly correlated, the intelligent walnut is in a walking state.
Preferably, in the cloud-based intelligent walnut motion pattern recognition method, in step S801 the accelerometer data D represents the three-axis acceleration, and the accelerometer time series is represented by the set {D_m[T_i]}, where m ∈ {X, Y, Z}, i ∈ {1, 2, ..., N}, X, Y, Z denote the three directions along the X, Y and Z axes, and N is the length of the time series; likewise, the gyroscope data G represents the three-axis angular acceleration, and the angular acceleration time series is represented by the set {G_m[T_i]}, with the same m, i and N. In step S804, the local correlation coefficient between the acceleration and the angular acceleration along each of the X, Y, Z coordinate directions is H_m[T_i] = corr(d_m[T_i], g_m[T_i]), where m ∈ {x, y, z}, i ∈ {1, 2, ..., N}, corr is the Pearson correlation coefficient function computed over a local time window, and a correlation coefficient is calculated at each time point T_i to obtain a correlation coefficient time series in each axial direction. Step S805 constructs the three-dimensional correlation feature sequence
H_i = {H_x[T_i], H_y[T_i], H_z[T_i]}
wherein i ∈ {1, 2, ..., L} and L is the length of the time series; in step S806 a classifier is constructed using the H_m sequence of step S805 as the data set.
Preferably, the step S806 of constructing a classifier to perform intelligent walnut motion pattern recognition includes a classifier model training process, and a real-time recognition process of acceleration data and angular acceleration data acquired by an accelerometer and a gyroscope based on the trained classifier.
Preferably, the cloud-based intelligent walnut motion pattern recognition method adopts a Support Vector Machine (SVM), logistic regression and a decision tree to train a classifier model.
Example 2
The embodiment realizes an intelligent walnut motion pattern recognition system based on the cloud.
Fig. 5 is a configuration diagram of a cloud-based intelligent walnut motion pattern recognition system, and as shown in fig. 5, the cloud-based intelligent walnut motion pattern recognition system of the embodiment includes a plurality of intelligent walnuts, a mobile phone paired with the intelligent walnuts, and a cloud platform; the intelligent walnut is communicated with the mobile phone through Bluetooth, and the mobile phone is communicated with the cloud platform through a communication link; the intelligent walnut built-in sensor is used for collecting user motion data and transmitting the user motion data to the mobile phone; the mobile phone is used for receiving the motion data and transmitting the motion data to the cloud platform; the cloud platform is used for receiving the motion data transmitted by the mobile phone, storing, calculating and analyzing the motion data, and feeding back a calculation and analysis result to the mobile phone; the cloud-based intelligent walnut motion pattern recognition system executes the cloud-based intelligent walnut motion pattern recognition method.
Preferably, the intelligent walnuts are communicated with the mobile phone through a low-power-consumption Bluetooth protocol, and the mobile phone is communicated with the cloud platform through a message queue telemetry transmission protocol.
Fig. 6 is a functional block diagram of an intelligent walnut control panel of the cloud-based intelligent walnut motion pattern recognition system, as shown in fig. 6, preferably, the intelligent walnut is provided with a built-in sensor unit, a processor unit and a signal indication unit; the sensor unit collects user motion data, and the processor unit transmits the collected motion data to the mobile phone end.
Fig. 7 is a perspective view of an intelligent walnut of a cloud-based intelligent walnut motion pattern recognition system, as shown in fig. 7, preferably, the intelligent walnut comprises an intelligent walnut shell and a sensor control panel; the sensor control board comprises a sensor unit, a signal indicating unit and a processor unit; the intelligent walnut shell is similar to the shell of a common Chinese playing walnut in size and shape, and is provided with striated bulges, so that the holding feeling is the same as that of the common Chinese playing walnut; the intelligent walnut is hollow inside, and the sensor control panel is located between two hemispheres of the intelligent walnut.
Fig. 8 is a functional block diagram of a mobile phone application of a cloud-based intelligent walnut motion pattern recognition system, and as shown in fig. 8, preferably, the mobile phone is used for an information transmission relay to transmit collected motion data to a cloud platform, to mark time information for the collected motion data, to provide an intelligent walnut motion recognition service, and to display a user motion state.
Example 3
The embodiment realizes an intelligent walnut motion pattern recognition system based on the cloud. The embodiment is specifically implemented on the basis of the embodiments 1 and 2, and will be described in terms of system design and intelligent recognition algorithm.
Fig. 5 is a cloud-based intelligent walnut motion pattern recognition system architecture diagram, and as shown in fig. 5, the cloud-based intelligent walnut motion pattern recognition system of the embodiment includes an intelligent walnut module, a smartphone data acquisition application module, and a cloud system data storage and calculation module. And a low-power Bluetooth protocol (BLE) is adopted between the intelligent walnut module and the data acquisition application for sensor data transmission. The smart phone and the cloud system transmit data by using a message queue telemetry transmission protocol (MQTT-SN) based on a sensor network. And finally, storing the sensor data in a cloud end for analyzing the motion condition of the user. Meanwhile, the analysis result is fed back to the user through the smart phone application.
Each module of the intelligent walnut system based on cloud motion recognition is explained in detail below.
Fig. 7 is a perspective view of an intelligent walnut motion pattern recognition system based on a cloud, and as shown in fig. 7, the intelligent walnut comprises an intelligent walnut shell and a sensor control panel. Wherein, the sensor control board includes sensor unit, signal indication unit and processor unit. The intelligent walnut shell is similar to the shell of a common Chinese playing walnut in size and shape, and the walnut shell is provided with striated bulges and has no difference from the common walnut in holding feeling. The interior of the walnut is hollowed, and the sensor control panel is positioned between two hemispheres of the walnut. The sensor unit is used for collecting user motion data in real time, the signal indicating unit is used for indicating the working state of the equipment, and the processor unit transmits the collected data to the smart phone.
Fig. 6 is a functional module diagram of an intelligent walnut control panel of an intelligent walnut motion pattern recognition system based on a cloud, and as shown in fig. 6, the control panel comprises a processor, a switch button, a signal indicator lamp, a battery module, an accelerometer, a gyroscope module and a data interface module. The accelerometer and the gyroscope module form a sensor module for data acquisition. The sensor module and the control chip are communicated by an I2C mode. The control panel is also provided with a starting switch and a signal indicator lamp. After the motion sensor collects data, the data are transmitted to the smart phone through the control chip and are transmitted to the cloud end through application.
Fig. 8 is a functional block diagram of the mobile phone application of the cloud-based intelligent walnut motion pattern recognition system. As shown in fig. 8, the application on the smartphone comprises a data receiving/sending module, a data analyzing module, a sending configuration module, a cloud data receiving/sending module, and an information display module. The data receiving/sending module receives the byte data sent by the intelligent walnut control panel; the data analyzing module converts the byte data into character string data, which is finally sent to the cloud database for storage. The configuration module sets the frequency at which the smartphone sends data to the cloud, the timestamp of the character string, the unique identification code, and other information. The information display module mainly displays, in real time, the motion state of the user and the service information sent to the user by the cloud. In this system, the smartphone has three roles: (a) acting as an information transmission relay that forwards the collected sensor data to the cloud database; (b) tagging the sensor data with time information, since the sensor readings themselves carry none; (c) displaying the user's motion state and providing the intelligent recognition service.
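The phone's relay-and-timestamp roles can be sketched as follows. The frame layout (six big-endian signed 16-bit values: three accelerometer axes followed by three gyroscope axes), the function name, and the device identifier are illustrative assumptions; the patent does not specify the byte format.

```python
import struct
from datetime import datetime, timezone

def parse_walnut_frame(frame: bytes, device_id: str) -> str:
    """Parse one BLE frame from the walnut control board into a string record.

    Assumed payload layout (not given in the patent): six big-endian signed
    16-bit integers -- ax, ay, az from the accelerometer, then gx, gy, gz
    from the gyroscope.
    """
    ax, ay, az, gx, gy, gz = struct.unpack(">6h", frame)
    # The sensor stream carries no clock, so the phone stamps receive time.
    ts = datetime.now(timezone.utc).isoformat(timespec="milliseconds")
    return f"{device_id},{ts},{ax},{ay},{az},{gx},{gy},{gz}"

record = parse_walnut_frame(struct.pack(">6h", 120, -33, 980, 5, -7, 210),
                            "walnut-01")
```

The resulting character string is what the application would forward to the cloud for storage.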
The cloud system comprises a data receiving module, a data conversion module, a model training and motion recognition module, and a Web information display control module, with the model training and motion recognition module forming the calculation module. The data receiving and conversion modules act as message middleware: when data from multiple sensors reaches the cloud, it is not written directly to the cloud database but is first cached in a message queue inside the conversion module. A parsing submodule in the conversion module continuously takes data off the message queue, parses it into the format required by the database storage specification, and stores it in the database. The data calculation module mainly provides the model training and action recognition algorithms: it trains the intelligent recognition model, recognizes the motion state represented by the acquired data, and feeds the recognition information back to the user. Meanwhile, the Web information display module displays the information acquisition state and the user motion state in real time, and access control protects the privacy of the user's Web information.
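The cache-then-drain behaviour of the conversion module can be sketched with the standard library's `queue`; the record format, the `drain_and_store` helper, and the use of a plain list as a stand-in for the cloud database are hypothetical.

```python
import queue

def drain_and_store(mq: queue.Queue, db: list) -> int:
    """Take raw messages off the buffer queue, parse each into the storage
    format, and append it to the database (a plain list stands in for the
    real cloud store here). Returns the number of records written."""
    written = 0
    while True:
        try:
            raw = mq.get_nowait()
        except queue.Empty:
            break
        # Parsing step: raw byte payload -> decoded string record.
        db.append(raw.decode("utf-8").strip())
        written += 1
    return written

buffer_q: queue.Queue = queue.Queue()
for msg in (b"walnut-01,2020-10-20T08:00:00Z,1,2,3,4,5,6\n",
            b"walnut-01,2020-10-20T08:00:00Z,7,8,9,1,2,3\n"):
    buffer_q.put(msg)
database: list = []
n = drain_and_store(buffer_q, database)
```

Decoupling ingestion from storage this way lets many walnuts report simultaneously without overloading the database writer.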
Based on the intelligent walnut system, the embodiment provides a reliable and accurate motion pattern recognition method. Fig. 1 is a flow chart of a cloud-based intelligent walnut motion pattern recognition method, as shown in fig. 1, the method comprises the following steps:
step S601, acquiring user motion data in real time by using an internal sensor of an intelligent walnut, wherein the acquired data comprises horizontal acceleration and angular acceleration of the intelligent walnut in the directions of three free axes, so that intelligent walnut motion data with six dimensions is formed;
step S602, the sensor data of six dimensions are respectively converted into time series according to fixed time frequency. Then, according to a time interval with a set length, carrying out trend decomposition on the time sequence of each dimension, and extracting the motion trend in the time interval;
in step S603, as can be seen from step S601, the acquired data describes movement along the directions of three free axes. Taking these three axes as the X, Y, Z axes, there are horizontal acceleration data and rotational angular acceleration data for each axis direction. When the walnuts are played, the motion comprises two processes: rotation, in which each intelligent walnut spins about the X, Y, Z axes, and revolution, in which the two walnuts orbit their common geometric center. The rotation process exhibits a certain periodicity. Therefore, taking the X, Y, Z axes as directions, the correlation between the acceleration and angular acceleration motion trends is calculated for each axis;
step S604, constructing X, Y, Z three-dimensional features according to the correlation result calculated in step S603;
step S605, training an action recognition model by using the characteristics of the step S604;
and step S606, performing action recognition on the new data by using the training model, and feeding back a recognition result to the user.
Thus, the embodiment discloses an intelligent walnut system and a walnut motion identification process. In practical application, when a user plays with the intelligent walnuts, three motion forms arise: (a) the user rotates the walnuts in place, so the walnut is in a rotation state, referred to as state a; (b) the user rotates the walnuts while walking, so the walnut is in a combined walking-and-rotating state, referred to as state b; (c) the user is walking but the walnuts do not rotate in the hand, referred to as state c. The case where the user stops walking and stops rotating the walnuts is not considered, because in that state the readings of the intelligent walnut's motion sensors are 0 in every direction. States a and b share the same rotation behaviour; the difference is that the acceleration of state b in the horizontal direction of the X, Y, Z axes is larger than that of state a along the same axes. Since the motion trends of states a and b are consistent, both can be regarded as a rotating state: the acceleration and angular acceleration are non-zero in the motion data, and the motion trends in the same direction agree. State c, by contrast, has motion acceleration only in the horizontal direction, while the angular acceleration tends to 0. In practical application, states a and b represent rotation, which massages the nerves of the user's hand, and both count as reasonable, effective exercise time; state c represents leisure time in a walking state and does not relieve the motor nerves of the hand.
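The three-state distinction described above (non-zero angular acceleration for states a and b, angular acceleration tending to 0 for state c, and larger horizontal acceleration distinguishing b from a) can be sketched as a coarse heuristic rule. The thresholds, function name, and labels below are illustrative assumptions, not values from the patent; the actual method uses the trained classifier described later.

```python
def coarse_state(accel_mag: float, gyro_mag: float,
                 accel_walk: float = 1.0, gyro_eps: float = 0.05) -> str:
    """Coarse labelling of the three motion forms from magnitude alone.

    accel_mag: magnitude of horizontal acceleration
    gyro_mag:  magnitude of angular acceleration
    Thresholds are illustrative placeholders.
    """
    if gyro_mag <= gyro_eps and accel_mag <= gyro_eps:
        return "idle"        # user stopped; excluded by the method
    if gyro_mag > gyro_eps:
        # states a and b: rotation, with b additionally showing the
        # larger horizontal acceleration that walking produces
        return "rotate+walk" if accel_mag > accel_walk else "rotate"
    return "walk"            # state c: horizontal acceleration only
```

A magnitude threshold like this cannot separate rotation from walking reliably in noisy data, which is precisely why the patent builds trend-correlation features and a learned classifier instead.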
Therefore, in the embodiment, the motion recognition mainly solves the problems of recognizing the rotation state and the walking state of the intelligent walnut.
In practical application, while the user walks, the normally operating intelligent walnut exhibits two motion states: rotation and walking. Fig. 2 is a schematic diagram of the intelligent walnut rotation action of the cloud-based intelligent walnut motion pattern recognition method; as shown in fig. 2, the walnuts rotate around the geometric center of the two walnuts, generating angular acceleration, while the user walking in a certain direction generates acceleration in the horizontal direction. Both types of data are acquired by the sensors inside the intelligent walnut. Fig. 3 is a schematic diagram of the intelligent walnut walking state of the cloud-based intelligent walnut motion pattern recognition method; as shown in fig. 3, when the walnut is still in the hand, it only has acceleration in the horizontal direction. Fig. 4 is a flow chart of the intelligent walnut recognition algorithm of the cloud-based intelligent walnut motion pattern recognition method; as shown in fig. 4, the recognition algorithm distinguishes these two motion modes of the intelligent walnut. The method comprises the following steps:
Step S801, time series modeling is performed on the accelerometer and gyroscope data. The accelerometer data D represents the three-axis acceleration, and the accelerometer time series is represented by the set {D_m[T_i]}, where m ∈ {X, Y, Z}, i ∈ {1, 2, ..., N}, X, Y, Z denote the three directions along the X, Y and Z axes, and N is the length of the time series; likewise, the gyroscope data G represents the three-axis angular acceleration, and the angular acceleration time series is represented by the set {G_m[T_i]}, with the same m, i and N.
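The per-axis series {D_m[T_i]} and {G_m[T_i]} can be built from the raw six-dimensional samples as follows; the `build_series` helper is illustrative.

```python
def build_series(samples):
    """samples: iterable of (ax, ay, az, gx, gy, gz) tuples taken at a
    fixed sampling frequency. Returns the per-axis accelerometer and
    gyroscope time series, d[m][i] ~ D_m[T_i] and g[m][i] ~ G_m[T_i]
    with m in {x, y, z}."""
    d = {"x": [], "y": [], "z": []}
    g = {"x": [], "y": [], "z": []}
    for ax, ay, az, gx, gy, gz in samples:
        d["x"].append(ax); d["y"].append(ay); d["z"].append(az)
        g["x"].append(gx); g["y"].append(gy); g["z"].append(gz)
    return d, g

d, g = build_series([(1, 2, 3, 4, 5, 6), (7, 8, 9, 10, 11, 12)])
```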
Step S802, smoothing the time series of step S801. Due to the noise in the data collected by the sensor, the time sequence constructed in step S801 is smoothed by window function mean smoothing, and the width of the window is determined by the motion state identification frequency.
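The window-mean smoothing of step S802 can be sketched as a centred moving average; the edge handling (shrinking the window near the boundaries) is one reasonable choice, not mandated by the patent.

```python
def window_mean_smooth(series, width):
    """Mean smoothing over a sliding window of the given width; per S802,
    the width would be chosen from the motion state recognition frequency.
    Near the boundaries the window shrinks to the available samples."""
    n = len(series)
    out = []
    for i in range(n):
        lo = max(0, i - width // 2)
        hi = min(n, i + width // 2 + 1)
        out.append(sum(series[lo:hi]) / (hi - lo))
    return out

smoothed = window_mean_smooth([0, 10, 0, 10, 0], 3)
```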
And step S803, decomposing the smoothed time series acquired in the step S802 by using a LOESS-based algorithm, and decomposing each coordinate axis direction time series into a trend component, a periodic component and a remainder. When the intelligent walnut is in a rotating state, the acceleration and the angular acceleration in the same axial direction are consistent in motion trend, and the motion trend after decomposition is adopted as motion characteristic data of the intelligent walnut in the embodiment. The motion direction and the motion trend of the intelligent walnut during rotation are consistent with the motion trend of the acceleration and the angular acceleration along the same coordinate axis direction.
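The trend/periodic/remainder split of step S803 can be illustrated with a toy additive decomposition: a centred moving average for the trend, per-phase means of the detrended series for the periodic component, and the rest as remainder. This is a simplified stand-in for the LOESS-based (STL-style) decomposition the patent names, not that algorithm itself.

```python
def decompose(series, period):
    """Toy additive decomposition: series = trend + periodic + remainder.
    A stand-in for the LOESS/STL decomposition of S803 (which fits local
    regressions instead of plain means)."""
    n = len(series)
    half = period // 2
    trend = []
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        trend.append(sum(series[lo:hi]) / (hi - lo))
    detrended = [s - t for s, t in zip(series, trend)]
    # Periodic component: mean of the detrended values at each phase.
    phase_mean = []
    for p in range(period):
        vals = detrended[p::period]
        phase_mean.append(sum(vals) / len(vals))
    periodic = [phase_mean[i % period] for i in range(n)]
    remainder = [series[i] - trend[i] - periodic[i] for i in range(n)]
    return trend, periodic, remainder

trend, periodic, remainder = decompose([1, 3, 1, 3, 1, 3, 1, 3], period=2)
```

As in S803, it is the trend component that is carried forward as the motion feature data.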
Step S804, the consistency of the sensor data along the same coordinate axis direction is calculated. In the embodiment, the local correlation coefficient is used to measure the motion consistency along a coordinate axis direction at a given moment. With the notation of step S801, the local correlation coefficient between the acceleration and the angular acceleration along each of the X, Y, Z coordinate directions is: H_m[T_i] = corr(d_m[T_i], g_m[T_i]), where m ∈ {x, y, z}, i ∈ {1, 2, ..., N}, and corr is the Pearson correlation coefficient function computed over a local time window.
For each time point T_i, the correlation coefficient can be calculated from the above equation; step S804 therefore yields a correlation coefficient time series for each axial direction.
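The per-time-point H_m[T_i] = corr(d_m, g_m) computation can be sketched as a Pearson correlation over a trailing local window; the trailing (rather than centred) window is one possible interpretation of the patent's "local time window".

```python
from math import sqrt

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sqrt(sum((x - ma) ** 2 for x in a))
    vb = sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb) if va and vb else 0.0

def local_corr(d_m, g_m, window):
    """H_m[T_i] = corr(d_m, g_m) over a trailing window of the given
    length, yielding one coefficient per time point once the window
    is full."""
    return [pearson(d_m[i - window:i], g_m[i - window:i])
            for i in range(window, len(d_m) + 1)]

# Acceleration and angular acceleration moving in lock-step (rotation-like):
h = local_corr([1, 2, 3, 4, 5, 6], [2, 4, 6, 8, 10, 12], window=4)
```

In the rotating state the trends agree, so h stays near 1; in the walking state the angular acceleration decouples from the acceleration and h drops.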
Taking rotation about the Z axis and walking as examples, the distributions of the acceleration and angular acceleration data show that in the rotating state the two quantities vary with an approximately linear correlation, whereas in the walking state the two distributions show no obvious correlation. The distribution of the correlation coefficients therefore differs between motion modes: in the rotation mode the correlation coefficient approaches 1 and the distribution is concentrated, while in the walking mode the correlation coefficient is less than 0.6 and the distribution is spread roughly uniformly.
Step S805, the correlation coefficient time series calculated in step S804 form a three-dimensional feature, expressed as
H_i = {H_x[T_i], H_y[T_i], H_z[T_i]}
where i ∈ {1, 2, ..., L} and L is the time series length.
Step S806, a classifier is constructed using the H_m sequence of step S805 as the data set. The classifier identifies the motion mode according to the correlation of the motion trends represented by the three-dimensional features: when the three-dimensional motion trends are strongly correlated, the intelligent walnut is in a rotating state; otherwise, it is in a walking state.
The method of constructing a classifier for motion pattern recognition comprises two processes: classifier model training, and real-time recognition of the sensor data based on the trained classifier. In the embodiment, the classifier can adopt a support vector machine (SVM), logistic regression, or a decision tree, and the user can select a suitable classifier model to train according to the requirement.
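The train-then-predict split on the (H_x, H_y, H_z) features can be illustrated with a minimal decision stump, a one-level stand-in for the decision-tree option; the feature values, labels, and midpoint-threshold rule below are illustrative, and in practice an SVM, logistic regression, or full decision tree would be trained on the same feature sequence.

```python
def train_stump(features, labels):
    """Learn a single threshold on the mean of the three per-axis
    correlation coefficients (Hx, Hy, Hz): strong correlation -> rotate,
    weak correlation -> walk. Midpoint between the classes is used."""
    means = [sum(f) / 3 for f in features]
    rot = [m for m, y in zip(means, labels) if y == "rotate"]
    walk = [m for m, y in zip(means, labels) if y == "walk"]
    return (min(rot) + max(walk)) / 2

def predict(threshold, feature):
    """Real-time recognition step: classify one three-dimensional feature."""
    return "rotate" if sum(feature) / 3 > threshold else "walk"

# Illustrative training set: correlations near 1 in all axes for rotation,
# low and scattered correlations for walking.
X = [(0.95, 0.92, 0.97), (0.90, 0.96, 0.94),
     (0.20, 0.35, 0.10), (0.40, 0.15, 0.30)]
y = ["rotate", "rotate", "walk", "walk"]
thr = train_stump(X, y)
```

The trained threshold plays the role of the deployed cloud model; `predict` corresponds to the real-time recognition interface that incoming feature vectors are fed through.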
Fig. 9 is a flow chart of classification model training and motion pattern recognition in the cloud-based intelligent walnut motion pattern recognition system. As shown in fig. 9, model training comprises three processes: sensor data acquisition, feature construction, and model training. The data is acquired by the intelligent walnut and transmitted to the cloud through the mobile phone; the features are constructed according to steps S801 to S805; the classification model is trained on the constructed features, and the trained model is deployed in the cloud behind a generic model interface. The motion pattern recognition process likewise comprises data acquisition, feature construction, and action recognition: after the features are built, a program calls the model interface to recognize the data collected in real time, thereby judging the motion state of the intelligent walnut, and the result is stored in the database. The system builds applications on the recognition results in the database and pushes them to the user's mobile phone in real time.
In summary, in this embodiment the intelligent walnuts collect acceleration and angular acceleration data, which are transmitted to the cloud through the smartphone. The cloud constructs sensor time series and, from them, analyzes the motion trends of the intelligent walnut along the different coordinate axis directions. Intelligent walnut motion features are constructed from the correlation of the motion trends along the same coordinate axis direction, a classification model is built with a machine learning algorithm, and the rotation mode and the walking mode of the intelligent walnut are identified. Through this recognition of the intelligent walnut motion modes, the user obtains a quantitative evaluation of the motion state, and can therefore make a reasonable exercise plan and achieve a good health care effect.
The embodiment provides an electronic device and a cloud mode identification system. Wherein the electronic device includes an accelerometer, a gyroscope, and a processor. The gyroscope and the accelerometer are connected with the processor through an internal bus. The processor transmits the acquired data to the smart phone in a Bluetooth communication mode. The motion pattern system includes an information receiving queue, a database, and an algorithm identification module. The smart phone transmits data collected by the electronic equipment to the cloud information receiving queue, the data analysis module analyzes the data into standard records and stores the standard records in the database, the algorithm recognition module extracts features from the data, and the motion mode of the electronic equipment is recognized through a trained model. The system analyzes the recognition result and feeds back the result to the user.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by hardware related to instructions of a computer program, which can be stored in a computer readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that, for those skilled in the art, various modifications and additions can be made without departing from the principle of the present invention, and these should also be considered as the protection scope of the present invention.

Claims (11)

1. A cloud-based intelligent walnut motion pattern recognition method is characterized by comprising the following steps:
s601, acquiring user motion data in real time by using an internal sensor of the intelligent walnut, wherein the acquired data comprises horizontal acceleration and angular acceleration of the intelligent walnut in three coordinate axis directions, so that six-dimensional intelligent walnut motion data is formed;
s602, respectively converting sensor data of six dimensions into time sequences according to fixed time frequency, performing smooth preprocessing on the constructed time sequences, and extracting a motion trend of the processed time sequences according to a certain interval;
s603, respectively calculating acceleration and angular acceleration of the three coordinate axis directions, and carrying out correlation analysis on the motion trend of the intelligent walnut;
s604, constructing correlation coefficient characteristics of three coordinate axis direction dimensions according to the correlation result of the intelligent walnut motion trend calculated in the step S603;
s605, training a motion recognition model by using the correlation coefficient characteristics of the three coordinate axis direction dimensions constructed in the step S604;
and S606, carrying out intelligent walnut action recognition on newly acquired user motion data by using the trained action recognition model, and feeding back a recognition result to the user.
2. The cloud-based intelligent walnut motion pattern recognition method of claim 1, wherein: a user acquires horizontal acceleration data and angular acceleration data of the intelligent walnut in three coordinate axis directions by using the accelerometer and the gyroscope inside the intelligent walnut; the intelligent walnut movement is divided into a rotation state mode and a walking state mode; the rotating state mode refers to that the user rotates the intelligent walnut in situ and the user rotates the intelligent walnut in a walking state at the same time; the walking state mode means that the user is in a walking state, but the intelligent walnut does not rotate in the hand.
3. The cloud-based intelligent walnut motion pattern recognition method of claim 2, wherein the step S606 intelligent walnut motion recognition specifically comprises the following steps:
s801, performing three coordinate axis direction time sequence modeling on acceleration data and angular acceleration data acquired by an accelerometer and a gyroscope;
s802, smoothing the time sequence constructed in the step S801 by using window function mean value smoothing, wherein the width of a window is determined by the motion state identification frequency;
s803, decomposing the smoothed time sequence obtained in the step S802 based on an LOESS algorithm, and decomposing each coordinate axis direction time sequence into a trend component, a periodic component and a remainder;
s804, calculating acceleration data and angular acceleration data in the same coordinate axis direction, and measuring the motion consistency of a certain moment along the coordinate axis direction by adopting a local correlation coefficient;
s805, constructing a motion trend three-dimensional correlation characteristic sequence based on different coordinate axis directions from the correlation coefficient time sequence calculated in the step S804;
and S806, constructing a classifier by using the three-dimensional correlation characteristic sequence in the step S805 as a data set, identifying a motion mode by the classifier according to the correlation of motion trends represented by the three-dimensional correlation characteristics, wherein when the three-dimensional motion trends are strongly correlated, the intelligent walnut is in a rotating state, and when the three-dimensional motion trends are weakly correlated, the intelligent walnut is in a walking state.
4. The cloud-based intelligent walnut motion pattern recognition method of claim 3, wherein: in step S801 the accelerometer data D represents the three-axis acceleration, and the accelerometer time series is represented by the set {D_m[T_i]}, where m ∈ {X, Y, Z}, i ∈ {1, 2, ..., N}, X, Y, Z denote the three directions along the X, Y and Z axes, and N is the length of the time series; the gyroscope data G represents the three-axis angular acceleration, and the angular acceleration time series is represented by the set {G_m[T_i]}, with the same m, i and N; in step S804 the local correlation coefficient between the acceleration and the angular acceleration along each of the X, Y, Z coordinate directions is H_m[T_i] = corr(d_m[T_i], g_m[T_i]), where m ∈ {x, y, z}, i ∈ {1, 2, ..., N}, corr is the Pearson correlation coefficient function computed over a local time window, and a correlation coefficient is calculated at each time point T_i to obtain a correlation coefficient time series in each axial direction; step S805 constructs the three-dimensional correlation feature sequence
H_i = {H_x[T_i], H_y[T_i], H_z[T_i]}
wherein i ∈ {1, 2, ..., L} and L is the length of the time series; and in step S806 a classifier is constructed using the H_m sequence of step S805 as the data set.
5. The cloud-based intelligent walnut motion pattern recognition method of claim 3, wherein: the step S806 of constructing the classifier for intelligent walnut motion pattern recognition comprises a classifier model training process and a real-time recognition process of acceleration data and angular acceleration data collected by the accelerometer and the gyroscope based on the trained classifier.
6. The cloud-based intelligent walnut motion pattern recognition method of claim 5, wherein: and (3) carrying out classifier model training by adopting a Support Vector Machine (SVM), logistic regression and a decision tree.
7. A cloud-based intelligent walnut motion pattern recognition system comprises a plurality of intelligent walnuts, a mobile phone matched with the intelligent walnuts and a cloud platform; the intelligent walnut is communicated with the mobile phone through Bluetooth, and the mobile phone is communicated with the cloud platform through a communication link; the intelligent walnut built-in sensor is used for collecting user motion data and transmitting the user motion data to the mobile phone; the mobile phone is used for receiving the motion data and transmitting the motion data to the cloud platform; the cloud platform is used for receiving the motion data transmitted by the mobile phone, storing, calculating and analyzing the motion data, and feeding back a calculation and analysis result to the mobile phone; the method is characterized in that: the cloud-based intelligent walnut motion pattern recognition system executes the cloud-based intelligent walnut motion pattern recognition method of any one of claims 1 to 6.
8. The cloud-based intelligent walnut motion pattern recognition system of claim 7, wherein: the intelligent walnut communicates with the mobile phone via a Bluetooth Low Energy protocol, and the mobile phone communicates with the cloud platform via the Message Queuing Telemetry Transport (MQTT) protocol.
9. The cloud-based intelligent walnut motion pattern recognition system of claim 7, wherein: the intelligent walnut has a built-in sensor unit, processor unit and signal indicating unit; the sensor unit collects user motion data, and the processor unit transmits the collected motion data to the mobile phone.
10. The cloud-based intelligent walnut motion pattern recognition system of claim 9, wherein: the intelligent walnut comprises an intelligent walnut shell and a sensor control panel; the sensor control board comprises a sensor unit, a signal indicating unit and a processor unit; the intelligent walnut shell is similar to the shell of a common Chinese playing walnut in size and shape, and is provided with striated bulges, so that the holding feeling is the same as that of the common Chinese playing walnut; the intelligent walnut is hollow, and the sensor control panel is located between two hemispheres of the intelligent walnut.
11. The cloud-based intelligent walnut motion pattern recognition system of claim 7, wherein: the mobile phone serves as an information relay that transmits the collected motion data to the cloud platform, marks the collected motion data with time information, provides the intelligent walnut action recognition service, and displays the motion state of the user.
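Claims 8 and 11 have the mobile phone stamp the collected motion data with time information before relaying it to the cloud platform over MQTT. A minimal sketch of such a timestamped message, assuming a JSON payload; the field names, sample layout and millisecond units are illustrative choices, not specified by the patent:

```python
import json
import time

def make_payload(device_id, samples, timestamp_ms=None):
    """Wrap a batch of sensor samples in a JSON message, stamping it with
    the phone's clock before it is relayed to the cloud platform."""
    return json.dumps({
        "device": device_id,
        "timestamp": timestamp_ms if timestamp_ms is not None
                     else int(time.time() * 1000),
        "samples": samples,   # e.g. [[ax, ay, az, gx, gy, gz], ...]
    })
```

Such a payload would typically be published to a per-device MQTT topic (for example `walnut/<device_id>/motion`), with the cloud platform subscribing to store and analyze the stream.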
CN202011124307.4A 2020-10-20 2020-10-20 Cloud-based intelligent walnut motion mode identification method and system Active CN112274872B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011124307.4A CN112274872B (en) 2020-10-20 2020-10-20 Cloud-based intelligent walnut motion mode identification method and system

Publications (2)

Publication Number Publication Date
CN112274872A true CN112274872A (en) 2021-01-29
CN112274872B CN112274872B (en) 2022-01-28

Family

ID=74423419

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011124307.4A Active CN112274872B (en) 2020-10-20 2020-10-20 Cloud-based intelligent walnut motion mode identification method and system

Country Status (1)

Country Link
CN (1) CN112274872B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102246125A (en) * 2008-10-15 2011-11-16 因文森斯公司 Mobile devices with motion gesture recognition
CN106267774A (en) * 2015-05-25 2017-01-04 腾讯科技(深圳)有限公司 Moving state identification method and apparatus
CN108520248A (en) * 2018-04-17 2018-09-11 成都乐动信息技术有限公司 Recognizing model of movement method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant