WO2018135692A1 - Wearable device for motion recognition and control, and method for motion recognition control using same - Google Patents

Wearable device for motion recognition and control, and method for motion recognition control using same

Info

Publication number
WO2018135692A1
Authority
WO
WIPO (PCT)
Prior art keywords
control
signal
emg
gesture
motion
Prior art date
Application number
PCT/KR2017/001568
Other languages
French (fr)
Korean (ko)
Inventor
김범준
이분진
Original Assignee
계명대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 계명대학교 산학협력단 filed Critical 계명대학교 산학협력단
Publication of WO2018135692A1 publication Critical patent/WO2018135692A1/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/015 - Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 - Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/24 - Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B 5/316 - Modalities, i.e. specific diagnostic methods
    • A61B 5/389 - Electromyography [EMG]
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/68 - Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6801 - Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B 5/6813 - Specially adapted to be attached to a specific body part
    • A61B 5/6824 - Arm or wrist
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/72 - Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7203 - Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal
    • A61B 5/7207 - Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal of noise induced by motion artifacts
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/72 - Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7225 - Details of analog processing, e.g. isolation amplifier, gain or sensitivity adjustment, filtering, baseline or drift compensation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 - Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16 - Constructional details or arrangements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 - Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16 - Constructional details or arrangements
    • G06F 1/1613 - Constructional details or arrangements for portable computers
    • G06F 1/163 - Wearable computers, e.g. on a belt
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/70 - Multimodal biometrics, e.g. combining information from different biometric modalities
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 2562/00 - Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B 2562/02 - Details of sensors specially adapted for in-vivo measurements
    • A61B 2562/0219 - Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 - Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/01 - Indexing scheme relating to G06F 3/01
    • G06F 2203/011 - Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns

Definitions

  • The present invention relates to a wearable device for motion recognition and control and a motion recognition control method using the same, and more particularly, to a wearable, EMG-based control interface that uses a single-channel EMG sensor located in an arm band and an IMU sensor located on the forearm, and to a gesture recognition control method using the same.
  • HCI: human-computer interaction
  • Various research efforts are under way on interactions that can enhance users' cognitive, emotional, and everyday experiences.
  • HCI research is devoting great effort to developing user-friendly interfaces using voice, vision, gestures, and other innovative input/output channels.
  • One of the most challenging approaches in this area of research is connecting human nerve signals to computers by exploiting the electrical properties of the human nervous system.
  • A variety of biomedical signals can be used to connect nerves and computers; most can be obtained from cellular tissue such as specific body tissues, organs, or the nervous system.
  • A biosignal is an electric or magnetic signal generated in the human body, and typically includes signals such as the electromyogram (EMG), electrocardiogram (ECG), electroencephalogram (EEG), electrooculogram (EOG), and galvanic skin response (GSR, skin conductivity).
  • Biosignals have mainly been used for treatment or rehabilitation in the medical field, but they are now also used in the human-computer interface (HCI) field to infer the user's intention and control the operation of computers, machines, robots, and the like, and their range of application is widening.
  • Accordingly, a technology for accurately detecting a biosignal and a technology for accurately inferring the user's intended operation from the detected biosignal are both very important.
  • A typical biosignal detection apparatus determines that the user has started an operation when a specific parameter of the biosignal exceeds a set threshold, and then performs the control operation corresponding to that operation.
  • However, a biosignal contains complex noise caused by device-specific noise, electromagnetic radiation, motion, and interaction with other tissues.
  • The noise contained in a biosignal such as the EMG changes constantly with the surrounding environment, the user's physical condition, and the contact between the sensor and the body, so it is difficult to set an accurate threshold.
  • A method that can accurately determine whether an operation has started is therefore required. Korean Patent Application Publication No. 10-2016-0133306 and Korean Patent Publication No. 10-1000869 are disclosed as related prior art.
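The threshold-based onset detection described above can be sketched as follows. This is an illustrative example, not the patent's implementation: the window length and threshold value are made-up numbers, and a short-window RMS is used as the "specific parameter" of the biosignal.

```python
# Hypothetical sketch of threshold-based onset detection: activity is
# declared when the RMS over a short sliding window exceeds a fixed
# threshold. This is exactly the scheme whose fragility the text notes:
# if noise levels drift, a fixed threshold mis-fires or misses onsets.

def window_rms(signal, window=4):
    """RMS of every length-`window` sliding window of the signal."""
    out = []
    for i in range(len(signal) - window + 1):
        chunk = signal[i:i + window]
        out.append((sum(x * x for x in chunk) / window) ** 0.5)
    return out

def detect_onset(signal, threshold, window=4):
    """Return the first window index whose RMS crosses the threshold,
    or None if the muscle never appears active."""
    for i, rms in enumerate(window_rms(signal, window)):
        if rms > threshold:
            return i
    return None

# Quiet baseline followed by a burst of muscle activity.
emg = [0.1, -0.1, 0.1, -0.1, 0.1, 2.0, -2.1, 1.9, -2.0, 2.2]
onset = detect_onset(emg, threshold=1.0)
```

With these toy values the first window that overlaps the burst already crosses the threshold; in practice the threshold would have to track the changing noise floor, which is the difficulty the passage points out.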
  • The present invention has been proposed to solve the above problems of the conventional methods. By providing an EMG sensor module for measuring EMG signals, an IMU sensor module for measuring motion signals, a control module that performs noise-removal preprocessing, and a computer device unit that recognizes gestures based on the EMG signal and performs the corresponding control, the noise contained in the EMG signal used for motion recognition and control is removed, enabling more accurate recognition and control.
  • An object of the present invention is to provide such a wearable device and a motion recognition control method using the same.
  • The device comprises: an electromyogram (EMG) sensor module for measuring an EMG signal according to the user's muscle movement and converting the measured EMG signal to digital for output;
  • an IMU (Inertial Measurement Unit) sensor module for measuring the user's motion signal;
  • a control (MCU) module configured to receive the EMG signal from the EMG sensor module and the motion signal from the IMU sensor module, and to perform filtering that removes the dynamic noise contained in the EMG signal based on the motion signal;
  • and a computer device unit.
  • Preferably, the EMG sensor module may include a single-channel EMG sensor located in an arm band worn by the user.
  • More preferably, the EMG sensor module may use a differential amplifier as the amplifier for amplifying the rectified signal.
  • Preferably, the IMU sensor module may be configured with any one of an accelerometer, a gyroscope, and a magnetometer.
  • More preferably, the motion signal may be used as a reference noise signal for correcting motion artifacts contained in the EMG signal measured by the EMG sensor module.
  • Preferably, the control module may further include a Bluetooth module for transmitting, to the computer device unit, the EMG signal and the motion signal after the preprocessing that removes unnecessary noise, including the dynamic noise contained in the EMG signal.
  • More preferably, the control module may be implemented as a LilyPad Simblee BLE board with integrated low-power Bluetooth 4.0 for low-power communication with the computer device unit.
  • Preferably, the EMG sensor module, the IMU sensor module, and the control (MCU) module may be housed in a cylindrical case fastened to the user's forearm so that the device is wearable.
  • More preferably, the cylindrical case may be produced with a 3D printer.
  • More preferably, the cylindrical case may further include a fastening member that attaches to and detaches from the user's forearm.
  • More preferably, the computer device unit may perform data selection and mapping by referring to a preset datasheet.
  • Even more preferably, the computer device unit may determine and recognize the gesture underlying the EMG signal on the basis of the data classified by gesture feature extraction and the result of the data selection and mapping performed with reference to the preset datasheet, and may execute the operation corresponding to the recognized gesture on the control application.
  • Even more preferably, the computer device unit may combine the EMG and IMU signals to detect isometric muscle activity that does not appear as actual motion, classify subtle gestures without movement, and thereby enable control without disturbing the surrounding environment.
  • Even more preferably, the computer device unit may apply the wavelet transform, a time-frequency analysis, for extracting features of the gesture made by the user's muscle movement from the EMG signal.
  • The gesture classification may consist of three classes: the hand close gesture, the hand open gesture, and the hand extension gesture.
  • Even more preferably, the computer device unit may perform the classification using the K-nearest neighbor (KNN) algorithm.
  • In a motion recognition control method using a wearable device that includes an EMG sensor module, an IMU sensor module, a control (MCU) module, and a computer device unit,
  • the control module performs preprocessing to remove unnecessary noise, including the dynamic noise contained in the EMG signal, based on the motion signal, and transmits the preprocessed EMG signal and motion signal to the computer device unit;
  • the computer device unit receives, via short-range wireless communication through step (3), the EMG signal and the motion signal from which the dynamic noise has been removed, and recognizes the operation underlying the EMG signal and performs the corresponding control through feature extraction, classification, and data selection and mapping.
  • Preferably, in step (4), the computer device unit extracts and classifies gesture features according to the user's muscle movement using the EMG signal received from the control module over Bluetooth, performs data selection and mapping with reference to a preset datasheet using the motion (IMU) signal, determines and recognizes the gesture underlying the EMG signal on the basis of the classified data and the result of that data selection and mapping, and executes the operation corresponding to the recognized gesture on the control application.
  • More preferably, the computer device unit applies the wavelet transform, a time-frequency analysis, for extracting features of the gesture according to the user's muscle movement from the EMG signal,
  • and the classification may be performed using the K-nearest neighbor (KNN) algorithm.
  • According to the present invention, with an EMG sensor module measuring the EMG signal, an IMU sensor module measuring the motion signal, a control module performing noise-removal preprocessing, and a computer device unit recognizing the gesture from the EMG signal and performing the corresponding control, the noise contained in the EMG signal used for motion recognition and control can be removed, enabling more accurate motion recognition.
  • In addition, with the single-channel EMG sensor located in the user's arm band and the IMU sensor placed on the forearm, the EMG-based control interface is wearable and detachable for convenient use, and the combination of the EMG and IMU signals makes it possible to accurately recognize subtle gestures that involve no visible movement.
  • FIG. 1 is a block diagram illustrating a configuration of a wearable device for gesture recognition and control according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating a system structure of a wearable device for gesture recognition and control according to an embodiment of the present invention.
  • FIG. 3 is a view illustrating a state of a hand gesture recognized by a wearable device for gesture recognition and control according to an embodiment of the present invention.
  • FIG. 4 is a view showing an operating state of the arm in a wearing state of the wearable device for gesture recognition and control according to an embodiment of the present invention.
  • FIG. 5 is a view showing another operating state of the arm in a wearing state of the wearable device for gesture recognition and control according to an embodiment of the present invention.
  • FIG. 6 is a view showing another operating state of the arm in a wearing state of the wearable device for gesture recognition and control according to an embodiment of the present invention.
  • FIG. 7 illustrates training data of a wearable device for gesture recognition and control according to an embodiment of the present invention.
  • FIG. 8 is a graph illustrating a recognition score for each gesture of a wearable device for gesture recognition and control according to an embodiment of the present invention.
  • FIG. 9 is a flowchart illustrating an operation flow of a method for controlling motion recognition using a wearable device according to an embodiment of the present invention.
  • S140: the computer device unit recognizes the operation underlying the EMG signal and performs the corresponding control through feature extraction, classification, and data selection and mapping, using the EMG signal and the motion signal from which the dynamic noise has been removed
  • Hereinafter, FIG. 1 is a block diagram showing the configuration of a wearable device for gesture recognition and control according to an embodiment of the present invention, FIG. 2 is a diagram illustrating its system structure, FIG. 3 is a view illustrating the hand gestures it recognizes, and FIGS. 4 to 6 are views showing various operating states of the arm while the device is worn.
  • As shown in FIGS. 1 and 2, the wearable device 100 for motion recognition and control according to an embodiment of the present invention may be configured with the EMG sensor module 110, the IMU sensor module 120, the control (MCU) module 130, and the computer device unit 140.
  • The EMG sensor module 110 measures an EMG signal according to the user's muscle movement, converts the measured EMG signal into a digital signal, and outputs it to the control module 130.
  • This electromyogram (EMG) sensor module 110 may include a single-channel EMG sensor located in an arm band worn by the user, as shown in FIGS. 4 to 6.
  • Attached to the user's arm, the EMG sensor module 110 rectifies the raw EMG signal measured according to the user's muscle movement, amplifies the rectified signal, and converts the filtered EMG signal into a digital signal using an A/D converter.
  • Here, the EMG sensor module 110 may use a differential amplifier as the amplifier for the rectified signal.
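The front-end chain just described (rectification, amplification/smoothing, A/D conversion) can be illustrated with the following software sketch. It is not the patent's analog circuit: the moving-average envelope stands in for analog filtering, and the full-scale voltage and bit depth are assumed values.

```python
# Illustrative software model of the EMG front end: full-wave
# rectification, moving-average smoothing (a stand-in for the analog
# amplifier/filter stage), and uniform quantization mimicking an
# n-bit A/D converter. All parameter values are assumptions.

def rectify(samples):
    """Full-wave rectification: keep only the signal magnitude."""
    return [abs(x) for x in samples]

def envelope(samples, window=3):
    """Moving-average smoothing of the rectified signal."""
    out = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1):i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def quantize(samples, full_scale=5.0, bits=10):
    """Uniform quantization, mimicking an n-bit A/D converter."""
    levels = (1 << bits) - 1
    return [round(min(max(x, 0.0), full_scale) / full_scale * levels)
            for x in samples]

raw = [0.5, -1.0, 2.0, -2.0, 1.0]          # raw bipolar EMG samples
digital = quantize(envelope(rectify(raw)))  # digitized envelope codes
```

The output is a monotonically rising sequence of A/D codes as the simulated muscle activity grows, which is the form the control module receives.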
  • The IMU sensor module 120 measures the user's motion signal, converts it into a digital signal, and outputs it to the control module 130.
  • The IMU (Inertial Measurement Unit) sensor module 120 is an IMU sensor positioned on the user's forearm to measure the user's movement, as shown in FIGS. 4 to 6, and may be configured with any one of an accelerometer, a gyroscope, and a magnetometer.
  • Here, the motion signal may be used as a reference noise signal for correcting the motion artifacts contained in the EMG signal measured by the EMG sensor module 110.
  • The control (MCU) module 130 is a processing module that receives the EMG signal from the EMG sensor module 110 and the motion signal from the IMU sensor module 120, and performs filtering that removes the dynamic motion artifacts contained in the EMG signal based on the motion signal.
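The patent does not name the exact filter used to remove motion artifacts with the IMU reference; one conventional choice for "filter the primary signal using a reference noise signal" is an LMS adaptive filter, sketched below with made-up tap count and step size. The filter learns how the reference leaks into the EMG channel and subtracts its estimate.

```python
# Hypothetical LMS adaptive noise canceller: `primary` is the EMG
# contaminated by motion artifact, `reference` is the IMU motion
# signal. The filter weights adapt per sample to minimize the output
# power, leaving the artifact-reduced EMG as the "error" signal.

def lms_cancel(primary, reference, taps=4, mu=0.05):
    """Return primary minus the adaptively estimated artifact."""
    w = [0.0] * taps  # filter weights, adapted each sample
    cleaned = []
    for n in range(len(primary)):
        # Most recent `taps` reference samples (zero-padded at start).
        x = [reference[n - k] if n - k >= 0 else 0.0 for k in range(taps)]
        estimate = sum(wk * xk for wk, xk in zip(w, x))
        error = primary[n] - estimate  # artifact-reduced output sample
        cleaned.append(error)
        w = [wk + 2 * mu * error * xk for wk, xk in zip(w, x)]
    return cleaned

# Synthetic example: the artifact is the motion reference scaled by 0.8,
# superimposed on a small constant "true" EMG level of 0.1.
motion = [1.0, -1.0] * 50
contaminated = [0.1 + 0.8 * m for m in motion]
cleaned = lms_cancel(contaminated, motion)
```

Early output samples still carry the artifact, but as the weights converge the output settles near the true 0.1 level, which is the behavior the claimed preprocessing step relies on.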
  • The control module 130 may further include a Bluetooth module 131 for transmitting, to the computer device unit 140, the EMG signal and the motion signal after the preprocessing that removes unnecessary noise, including the dynamic noise contained in the EMG signal.
  • Here, the control module 130 may be implemented as a LilyPad Simblee BLE board with integrated low-power Bluetooth 4.0 for low-power communication with the computer device unit 140.
  • The EMG sensor module 110, the IMU sensor module 120, and the control (MCU) module 130 are housed in a cylindrical case 101, as shown in FIGS. 4 to 6, so that the device is wearable when fastened to the user's forearm.
  • The cylindrical case 101 may be produced with a 3D printer, and may further include a fastening member 102 that attaches to and detaches from the user's forearm.
  • The computer device unit 140 receives, via short-range wireless communication, the EMG signal and the motion signal from which the dynamic noise has been removed from the control module 130, and recognizes the operation underlying the EMG signal and performs the corresponding control through feature extraction, classification, and data selection and mapping.
  • The computer device unit 140 extracts and classifies gesture features according to the user's muscle movement using the EMG signal received from the control module 130 over Bluetooth, and may perform data selection and mapping with reference to a preset datasheet using the motion (IMU) signal in order to distinguish the gestures made by the user.
  • The computer device unit 140 determines and recognizes the gesture underlying the EMG signal on the basis of the data classified by gesture feature extraction and the result of the data selection and mapping performed with reference to the preset datasheet, and may execute the operation corresponding to the recognized gesture on the control application.
  • The computer device unit 140 combines the EMG and IMU signals to detect isometric muscle activity that does not appear as actual movement, classifies subtle gestures without motion, and enables control without disturbing the surrounding environment.
  • The computer device unit 140 may apply the wavelet transform, a time-frequency analysis, for extracting features of the gesture according to the user's muscle movement from the EMG signal.
  • As shown in FIG. 3, the gesture classification may consist of three classes: the hand close gesture, the hand open gesture, and the hand extension gesture.
  • The wavelet transform (WT) is a time-frequency analysis method well suited to analyzing non-stationary signals such as the EMG.
  • An advantage of the wavelet transform is that the dimension of the feature vector is easy to set, and a low-dimensional feature vector gives almost the same result as a higher-dimensional one.
  • The wavelet transform developed from Fourier analysis and is applied over short time intervals; the size of the analysis window can be adjusted during frequency analysis.
  • As a result, analysis of a high-frequency signal can produce results comparable to analysis of a low-frequency band.
  • Since distinct features must be taken from each matched pattern in order to distinguish the gestures made by the user, several features can be extracted; among them, general statistical features such as the maximum, minimum, and root mean square may be included.
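The feature-extraction stage described above can be sketched as a one-level Haar wavelet decomposition followed by the statistical features the specification names (MAX, MIN, MAV, RMS, STD). The Haar wavelet is chosen here only because it is the simplest mother wavelet; the patent does not fix which wavelet is used, so this is an illustrative assumption.

```python
# Sketch of wavelet-based feature extraction: one level of the Haar
# DWT splits the EMG window into approximation (low-frequency) and
# detail (high-frequency) coefficients; statistics of the coefficients
# then form the feature vector fed to the classifier.

def haar_dwt(signal):
    """One level of the orthonormal Haar wavelet transform.
    Returns (approximation, detail) coefficient lists."""
    s = 2 ** 0.5
    approx = [(signal[i] + signal[i + 1]) / s
              for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / s
              for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def stat_features(x):
    """MAX, MIN, MAV, RMS and STD of a coefficient sequence."""
    n = len(x)
    mean = sum(x) / n
    return {
        "MAX": max(x),
        "MIN": min(x),
        "MAV": sum(abs(v) for v in x) / n,           # mean absolute value
        "RMS": (sum(v * v for v in x) / n) ** 0.5,   # root mean square
        "STD": (sum((v - mean) ** 2 for v in x) / n) ** 0.5,
    }

approx, detail = haar_dwt([1.0, 3.0, 2.0, 2.0])
features = stat_features(detail)
```

In a real pipeline the statistics would be computed over each sub-band of a multi-level decomposition; one level is shown to keep the sketch short.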
  • The computer device unit 140 may classify the gesture features extracted from the EMG signal according to the user's muscle movement using the K-nearest Neighbor (KNN) algorithm.
  • The support vector machine (SVM) classification algorithm can operate in a linear or nonlinear manner using a kernel, but it requires a large amount of training data for accurate results.
  • When the results of the KNN algorithm are compared with those of SVM, experiments show that KNN yields better results. The reason is that KNN is itself nonlinear, so it can easily handle data that is distributed linearly or nonlinearly, and it requires fewer data points to achieve good results.
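The KNN classification described above is conceptually simple enough to write from scratch, as below; in practice a library implementation would be used. The feature vectors and the decision boundary here are invented for illustration, with labels 0, 1, 2 mirroring the three gesture classes of the embodiment.

```python
# Minimal K-nearest-neighbor classifier: classify a query feature
# vector by majority vote among its k nearest training vectors
# (squared Euclidean distance). Training data below is made up.

from collections import Counter

def knn_predict(train_x, train_y, query, k=3):
    """Majority-vote label of the k training points nearest to query."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(x, query)), y)
        for x, y in zip(train_x, train_y)
    )
    votes = Counter(y for _, y in dists[:k])
    return votes.most_common(1)[0][0]

# Labels 0, 1, 2 stand for hand close, hand open and hand extension.
train_x = [(0.1, 0.2), (0.2, 0.1), (0.9, 0.8),
           (0.8, 0.9), (0.5, 0.9), (0.4, 0.8)]
train_y = [0, 0, 1, 1, 2, 2]
label = knn_predict(train_x, train_y, (0.15, 0.15), k=3)
```

Note that nothing is "trained" beyond storing the examples, which is why KNN needs fewer data points than SVM to behave reasonably, as the passage argues.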
  • FIG. 7 is a diagram illustrating training data of a wearable device for gesture recognition and control according to an embodiment of the present invention, and FIG. 8 is a graph showing the recognition score for each gesture.
  • FIG. 7 is a table showing each group of gestures, with the hand close, hand open, and hand extension gestures corresponding to 0, 1, and 2 after preprocessing and feature extraction. After learning with the data (MAX, MIN, MAV, RMS, STD), the feature that yields the best results is chosen. FIG. 8 confirms the score of the model to verify the accuracy of gesture recognition, and shows the variance together with the score.
  • As shown in FIG. 9, the motion recognition control method may be implemented with the following steps: the control module 130 receives the EMG signal (S110); the control module 130 receives the motion (IMU) signal (S120); the control module 130 transmits the preprocessed EMG signal and motion signal to the computer device unit 140 (S130); and the computer device unit 140 recognizes the operation underlying the EMG signal and performs the corresponding control through feature extraction, classification, and data selection and mapping, using the EMG signal and the motion signal from which the dynamic noise has been removed (S140).
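The S110-S140 flow can be summarized as a short pipeline sketch. Every function body here is a trivial stand-in (the real preprocessing, feature extraction, and classification are described elsewhere in the specification); only the ordering of the steps reflects the method.

```python
# High-level sketch of the S110-S140 flow: acquire EMG and IMU,
# preprocess, extract features, classify. Function bodies are toy
# stand-ins, not the patent's actual algorithms.

def preprocess(emg, imu):
    """S130: subtract a crude motion-artifact estimate (illustrative)."""
    return [e - 0.5 * m for e, m in zip(emg, imu)]

def extract_features(emg):
    """S140 feature step: mean absolute value as a single toy feature."""
    return [sum(abs(x) for x in emg) / len(emg)]

def classify(features):
    """S140 classification step: toy threshold in place of KNN."""
    return "hand close" if features[0] > 0.5 else "rest"

def run_pipeline(emg, imu):
    # S110/S120: EMG and IMU samples arrive at the MCU.
    cleaned = preprocess(emg, imu)     # S130: noise-reduction preprocessing
    feats = extract_features(cleaned)  # S140: feature extraction
    return classify(feats)             # S140: recognition and control

gesture = run_pipeline([1.0, -1.2, 1.1, -0.9], [0.2, -0.2, 0.2, -0.2])
```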
  • As shown in FIGS. 1 to 6, the wearable device 100 used in the motion recognition control method of the present invention may include the EMG sensor module 110 and the IMU sensor module 120 for recognizing hand gestures, together with the control (MCU) module 130 and the computer device unit 140.
  • Hereinafter, the motion recognition control method using the wearable device 100 is described in detail.
  • In step S110, the control module 130 receives the EMG signal measured by the EMG sensor module 110 from the muscle movement of the user's forearm and converted into a digital signal.
  • Here, the electromyogram sensor module 110 may include a single-channel EMG sensor located in an arm band worn by the user, as shown in FIGS. 4 to 6.
  • Attached to the user's arm, the EMG sensor module 110 rectifies the raw EMG signal measured according to the user's muscle movement, amplifies the rectified signal, and converts the filtered EMG signal into a digital signal using an A/D converter.
  • The EMG sensor module 110 may use a differential amplifier as the amplifier for the rectified signal.
  • In step S120, the control module 130 receives the motion (IMU) signal measured from the user's movement by the IMU sensor module 120 and converted into a digital signal.
  • The IMU (Inertial Measurement Unit) sensor module 120 is an IMU sensor located on the user's forearm to measure the degree of the user's movement, as shown in FIGS. 4 to 6, and may be configured with any one of an accelerometer, a gyroscope, and a magnetometer.
  • In step S130, the control module 130 performs preprocessing that removes unnecessary noise, including the dynamic noise contained in the EMG signal, based on the motion signal, and transmits the preprocessed EMG signal and motion signal to the computer device unit 140.
  • The control module 130 may further include a Bluetooth module 131 for transmitting the preprocessed EMG signal and motion signal to the computer device unit 140.
  • The control module 130 may be implemented as a LilyPad Simblee BLE board with integrated low-power Bluetooth 4.0 for low-power communication with the computer device unit 140.
  • In step S140, the computer device unit 140 receives, via short-range wireless communication through step S130, the EMG signal and the motion signal from which the dynamic noise has been removed from the control module 130, and recognizes the operation underlying the EMG signal and performs the corresponding control through feature extraction, classification, and data selection and mapping.
  • In step S140, the computer device unit 140 extracts and classifies gesture features according to the user's muscle movement using the EMG signal received from the control module 130 over Bluetooth, performs data selection and mapping with reference to a preset datasheet using the motion signal to distinguish the gestures made by the user, determines and recognizes the gesture underlying the EMG signal on the basis of the classified data and the mapping result, and executes the operation corresponding to the recognized gesture on the control application.
  • Here, the computer device unit 140 applies the wavelet transform, a time-frequency analysis, for extracting the gesture features from the EMG signal according to the user's muscle movement, and the classification may be performed using the K-nearest neighbor (KNN) algorithm.
  • a wearable device for motion recognition and control and a motion recognition control method using the same include an EMG sensor module measuring an EMG signal, an IMU sensor module measuring a motion signal, and noise And a control module for performing preprocessing of the removal, and a computer device unit for recognizing and controlling the operation of the gesture based on the EMG signal, thereby removing the noise contained in the EMG signal used for the motion recognition and control to more accurately recognize the motion.
  • this can be achieved by combining the EMG sensor module, the IMU sensor module, and the control module in one case, with the single-channel EMG sensor located in the user's arm band and the IMU sensor on the forearm.
  • the EMG-based control interface is thus wearable and easy to attach and detach, and because the EMG and IMU signals are used in combination, subtle gestures without visible motion can be accurately recognized.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Pathology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Signal Processing (AREA)
  • Physiology (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Psychiatry (AREA)
  • Computer Hardware Design (AREA)
  • Power Engineering (AREA)
  • Dermatology (AREA)
  • Neurology (AREA)
  • Neurosurgery (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

A wearable device for motion recognition and control, and a method for motion recognition control using the device, proposed by the present invention include: an electromyogram (EMG) sensor module for measuring an EMG signal; an IMU sensor module for measuring a motion signal; a control module that performs noise-removal pre-processing; and a computer device unit that recognizes and controls the motion of a gesture based on the EMG signal. Accordingly, the noise included in the EMG signal used for motion recognition and control is removed, enabling more accurate motion recognition.

Description

Wearable device for motion recognition and control, and motion recognition control method using the same
The present invention relates to a wearable device for motion recognition and control and a motion recognition control method using the same, and more particularly, to a wearable device that implements a wearable EMG-based control interface using a single-channel EMG sensor located in an arm band and an IMU sensor located on the forearm, and to a motion recognition control method using the same.
With the development of electronic technology, methods of interaction between machines and users are becoming more diverse. In particular, as interest in human-computer interaction (HCI) technology has grown, a variety of research is being conducted on interactions that can enhance users' cognitive, emotional, and everyday experiences. In HCI, much effort is devoted to developing user-friendly interfaces that use voice, vision, gestures, or other innovative input/output channels. One of the most challenging approaches in this field is to connect human neural signals to a computer by exploiting the electrical properties of the human nervous system. A variety of biomedical signals can be used for such a nerve-computer connection, most of which are obtained from cellular tissue such as particular body tissues, organs, or the nervous system.
A biosignal is an electrical or magnetic signal generated in the human body and typically includes electromyography (EMG), electrocardiography (ECG), electroencephalography (EEG), electrooculography (EOG), and galvanic skin response (GSR) signals. Such biosignals have traditionally been used in the medical field for treatment or rehabilitation, but recently their range of application has been widening in the human-computer interface (HCI) field, where a user's movement intention is inferred to control computers, machines, robots, and the like. To control a computer, machine, or robot with such biosignals, it is essential not only to detect the biosignal accurately but also to infer the user's movement intention accurately from the detected signal.
The simplest way to infer the user's movement intention is for the biosignal detection device to determine that the user has initiated a movement when a specific parameter of the biosignal exceeds a set threshold, and then to perform the control operation corresponding to that movement. However, a biosignal (EMG signal) contains complex forms of noise caused by device-inherent noise, electromagnetic radiation, motion artifacts, and interaction with other tissues. Because the noise contained in a biosignal such as EMG changes continuously with the surrounding environment, the user's physical condition, and the contact between the sensor and the body, it is difficult to set an accurate threshold, and a method that can accurately determine whether the user has moved is therefore required. Korean Patent Publication No. 10-2016-0133306 and Korean Patent Registration No. 10-1000869 are disclosed as prior art documents.
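The threshold rule described above is the entire decision logic of the simplest approach; a minimal sketch follows (the feature and threshold values are illustrative assumptions, not taken from the prior art documents):

```python
def movement_started(feature_value, threshold):
    """Simplest intent-detection rule: the user is deemed to have started
    a movement when a biosignal feature (e.g., the EMG envelope amplitude)
    exceeds a preset threshold.  Both arguments are hypothetical values."""
    return feature_value > threshold
```

As the passage notes, the weakness of this rule is that EMG noise varies with the environment, the user's condition, and electrode contact, so no single threshold stays accurate.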
The present invention has been proposed to solve the above problems of previously proposed methods. Its object is to provide a wearable device for motion recognition and control, and a motion recognition control method using the same, comprising an EMG sensor module for measuring an electromyogram signal, an IMU sensor module for measuring a motion signal, a control module for performing noise-removal preprocessing, and a computer device unit for recognizing and controlling gesture motions based on the EMG signal, thereby removing the noise contained in the EMG signal used for motion recognition and control and enabling more accurate motion recognition.
A further object of the present invention is to provide a wearable device for motion recognition and control, and a motion recognition control method using the same, in which the EMG sensor module, the IMU sensor module, and the control module are integrated in a single case, with a single-channel EMG sensor located in the user's arm band and an IMU sensor located on the forearm, so that the EMG-based control interface is wearable and convenient to attach and detach, and so that subtle, motionless gestures can be accurately recognized through the combined use of the EMG signal and the IMU signal.
A wearable device for motion recognition and control according to a feature of the present invention for achieving the above objects is characterized in that it is,
as a wearable device for motion recognition and control, comprised of:
an electromyogram (EMG) sensor module that measures an EMG signal according to the user's muscle movements, converts the measured EMG signal to digital, and outputs it;
an inertial motion unit (IMU) sensor module that measures the user's motion signal, converts the measured motion signal into a digital signal, and outputs it;
a control (MCU) module that receives the EMG signal from the EMG sensor module and the motion signal from the IMU sensor module and performs filtering to remove motion artifacts contained in the EMG signal based on the motion signal; and
a computer device unit that receives the motion signal and the EMG signal, from which motion artifacts have been removed, from the control module via short-range wireless communication, recognizes the gesture represented by the EMG signal through feature extraction, classification, and data-selection mapping, and performs the corresponding control.
Preferably, the EMG sensor module may comprise a single-channel EMG sensor located in the user's arm band.
Preferably, the EMG sensor module may be attached to the user's arm, rectify the raw EMG signal measured according to the user's muscle movements, then amplify the rectified signal and convert the filtered EMG signal into a digital signal using an A/D converter.
More preferably, the EMG sensor module may use a differential amplifier as the amplifier for amplifying the rectified signal.
Preferably, the IMU sensor module is an IMU sensor located on the user's forearm for measuring the degree of the user's movement, and may consist of any one of an accelerometer, a gyroscope, and a magnetometer.
More preferably, the motion signal may be used as a reference noise signal for correcting the motion artifacts contained in the EMG signal measured by the EMG sensor module.
Preferably, the control module may further include a Bluetooth module for transmitting, to the computer device unit, the EMG signal and the motion signal on which preprocessing has been performed to remove unnecessary noise, including motion artifacts, contained in the EMG signal.
More preferably, the control module may be implemented as a LilyPad Simblee BLE board with integrated Bluetooth Low Energy 4.0 for low-power wireless communication with the computer device unit.
Preferably, the EMG sensor module, the IMU sensor module, and the control (MCU) module may be housed in a cylindrical case and configured as a wearable unit fastened to the user's forearm.
More preferably, the cylindrical case may be produced with a 3D printer.
More preferably, the cylindrical case may further comprise a fastening member that can be attached to and detached from the user's forearm.
More preferably, the computer device unit may extract and classify gesture features according to the user's muscle movements using the EMG signal received from the control module via Bluetooth communication, and may perform data selection and mapping with reference to a preset data sheet to distinguish the gesture made by the user using the motion (IMU) signal.
Even more preferably, the computer device unit may determine and recognize the gesture of the EMG signal based on the data classified through gesture feature extraction and the result obtained through data selection and mapping with reference to the preset data sheet, and may execute the operation corresponding to the recognized gesture in the control application.
Even more preferably, the computer device unit may combine the EMG signal and the motion (IMU) signal to detect isometric muscle activity that does not appear as actual movement, so that subtle, motionless gestures can be classified and control can be performed without disturbing the surrounding environment.
Even more preferably, the computer device unit may apply the wavelet transform, a time-frequency analysis, for extracting the features of gestures according to the user's muscle movements using the EMG signal.
Still more preferably, the gesture classification may consist of three classes: a hand-close gesture, a hand-open gesture, and a hand-extension gesture.
Still more preferably, the computer device unit may classify the gesture features extracted from the EMG signal according to the user's muscle movements using the K-nearest neighbor (KNN) algorithm.
A motion recognition control method using a wearable device according to a feature of the present invention for achieving the above objects is
a motion recognition control method using a wearable device comprising an EMG sensor module, an IMU sensor module, a control (MCU) module, and a computer device unit, and is characterized in that it comprises the steps of:
(1) the control module receiving an EMG signal measured by the EMG sensor module from the muscle movements of the user's forearm and converted into a digital signal;
(2) the control module receiving a motion (IMU) signal in which the degree of the user's movement has been measured by the IMU sensor module and converted into a digital signal;
(3) the control module performing a preprocessing process that removes unnecessary noise, including motion artifacts, contained in the EMG signal based on the motion signal, and transmitting the preprocessed EMG signal and motion signal to the computer device unit; and
(4) the computer device unit receiving, via short-range wireless communication, the motion signal and the EMG signal from which motion artifacts have been removed from the control module in step (3), recognizing the gesture represented by the EMG signal through feature extraction, classification, and data-selection mapping, and performing the corresponding control.
Preferably, in step (4), the computer device unit extracts and classifies gesture features according to the user's muscle movements using the EMG signal received from the control module via Bluetooth communication; performs data selection and mapping with reference to a preset data sheet to distinguish the gesture made by the user using the motion (IMU) signal; determines and recognizes the gesture of the EMG signal based on the classified feature data and the result of the data selection and mapping; and executes the operation corresponding to the recognized gesture in the control application.
More preferably, the computer device unit may apply the wavelet transform, a time-frequency analysis, for extracting gesture features from the EMG signal according to the user's muscle movements, and may classify the extracted features using the K-nearest neighbor (KNN) algorithm.
According to the wearable device for motion recognition and control and the motion recognition control method using the same proposed by the present invention, the device comprises an EMG sensor module for measuring an EMG signal, an IMU sensor module for measuring a motion signal, a control module for performing noise-removal preprocessing, and a computer device unit for recognizing and controlling gesture motions based on the EMG signal, so that the noise contained in the EMG signal used for motion recognition and control is removed and more accurate motion recognition becomes possible.
In addition, according to the present invention, the EMG sensor module, the IMU sensor module, and the control module are integrated in a single case, with a single-channel EMG sensor located in the user's arm band and an IMU sensor located on the forearm, so that the EMG-based control interface is wearable and convenient to attach and detach, and subtle, motionless gestures can be accurately recognized through the combined use of the EMG signal and the IMU signal.
FIG. 1 is a functional block diagram of a wearable device for motion recognition and control according to an embodiment of the present invention.
FIG. 2 shows the system architecture of a wearable device for motion recognition and control according to an embodiment of the present invention.
FIG. 3 shows the hand gestures recognized by a wearable device for motion recognition and control according to an embodiment of the present invention.
FIG. 4 shows an arm motion state while wearing a wearable device for motion recognition and control according to an embodiment of the present invention.
FIG. 5 shows another arm motion state while wearing a wearable device for motion recognition and control according to an embodiment of the present invention.
FIG. 6 shows yet another arm motion state while wearing a wearable device for motion recognition and control according to an embodiment of the present invention.
FIG. 7 shows training data of a wearable device for motion recognition and control according to an embodiment of the present invention.
FIG. 8 is a graph of the recognition score for each gesture of a wearable device for motion recognition and control according to an embodiment of the present invention.
FIG. 9 is a flowchart of a motion recognition control method using a wearable device according to an embodiment of the present invention.
<Description of Reference Numerals>
100: wearable device according to an embodiment of the present invention
101: cylindrical case
102: fastening member
110: EMG sensor module
120: IMU sensor module
130: control (MCU) module
131: Bluetooth module
140: computer device unit
S110: step in which the control module receives an EMG signal
S120: step in which the control module receives a motion (IMU) signal
S130: step of transmitting the preprocessed EMG signal and motion signal to the computer device unit
S140: step in which the computer device unit recognizes the gesture of the EMG signal through feature extraction, classification, and data-selection mapping using the artifact-free EMG signal and the motion signal, and performs the corresponding control
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those of ordinary skill in the art can easily practice the invention. In describing the preferred embodiments in detail, however, detailed descriptions of well-known functions or configurations are omitted when they might unnecessarily obscure the subject matter of the invention. In addition, the same reference numerals are used throughout the drawings for parts with similar functions and operations.
Furthermore, throughout the specification, when a part is said to be 'connected' to another part, this includes not only the case where it is 'directly connected' but also the case where it is 'indirectly connected' with another element in between. Also, 'comprising' a component means that other components are not excluded and may be further included, unless specifically stated otherwise.
FIG. 1 is a functional block diagram of a wearable device for motion recognition and control according to an embodiment of the present invention; FIG. 2 shows its system architecture; FIG. 3 shows the hand gestures it recognizes; and FIGS. 4 to 6 show various arm motion states while the device is worn. As shown in FIGS. 1 and 2, the wearable device 100 for motion recognition and control according to an embodiment of the present invention may comprise an EMG sensor module 110, an IMU sensor module 120, a control (MCU) module 130, and a computer device unit 140.
The EMG sensor module 110 measures an electromyogram (EMG) signal according to the user's muscle movements, converts the measured EMG signal to digital, and outputs it to the control module 130. As shown in FIGS. 4 to 6, the EMG sensor module 110 may comprise a single-channel EMG sensor located in the user's arm band. The EMG sensor module 110 is attached to the user's arm, rectifies the raw EMG signal measured according to the user's muscle movements, then amplifies the rectified signal and converts the filtered EMG signal into a digital signal using an A/D converter. Here, the EMG sensor module 110 may use a differential amplifier as the amplifier for amplifying the rectified signal.
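The analog chain above (rectification, differential amplification, filtering, A/D conversion) is realized in sensor hardware; purely as an illustration, the equivalent digital processing of a rectified-and-smoothed EMG envelope can be sketched as follows. The window length and sample values are assumptions, not taken from the patent.

```python
def rectify(samples):
    """Full-wave rectification: take the absolute value of each raw EMG sample."""
    return [abs(s) for s in samples]

def moving_average(samples, window=5):
    """Simple low-pass smoothing of the rectified signal (EMG envelope).
    The window length is an illustrative assumption."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)
        out.append(sum(samples[lo:i + 1]) / (i + 1 - lo))
    return out

raw_emg = [0.2, -0.5, 0.7, -0.3, 0.9, -0.8]   # hypothetical raw EMG samples
envelope = moving_average(rectify(raw_emg))
```

The envelope, rather than the oscillating raw signal, is what a later threshold or feature-extraction stage would operate on.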
The IMU sensor module 120 measures the user's motion signal, converts the measured motion signal into a digital signal, and outputs it to the control module 130. As shown in FIGS. 4 to 6, the inertial motion unit (IMU) sensor module 120 is an IMU sensor located on the user's forearm for measuring the degree of the user's movement, and may consist of any one of an accelerometer, a gyroscope, and a magnetometer. Here, the motion signal can be used as a reference noise signal for correcting the motion artifacts contained in the EMG signal measured by the EMG sensor module 110.
The control (MCU) module 130 is a processing module that receives the EMG signal from the EMG sensor module 110 and the motion signal from the IMU sensor module 120 and performs filtering that removes the motion artifacts contained in the EMG signal based on the motion signal. The control module 130 may further include a Bluetooth module 131 for transmitting, to the computer device unit 140, the EMG signal and the motion signal on which preprocessing has been performed to remove unnecessary noise, including motion artifacts, from the EMG signal. As shown in FIG. 2, the control module 130 may be implemented as a LilyPad Simblee BLE board with integrated Bluetooth Low Energy 4.0 for low-power wireless communication with the computer device unit 140.
In addition, as shown in FIGS. 4 to 6, the EMG sensor module 110, the IMU sensor module 120, and the control (MCU) module 130 may be housed in a cylindrical case 101 and configured as a wearable unit fastened to the user's forearm. The cylindrical case 101 may be produced with a 3D printer and may further comprise a fastening member 102 that can be attached to and detached from the user's forearm.
The computer device unit 140 receives the motion signal and the EMG signal, from which motion artifacts have been removed, from the control module 130 via short-range wireless communication, recognizes the gesture represented by the EMG signal through feature extraction, classification, and data-selection mapping, and performs the corresponding control. The computer device unit 140 extracts and classifies gesture features according to the user's muscle movements using the EMG signal received from the control module 130 via Bluetooth communication, and performs data selection and mapping with reference to a preset data sheet to distinguish the gesture made by the user using the motion (IMU) signal.
The computer device unit 140 also determines and recognizes the gesture of the EMG signal on the basis of the data classified through gesture feature extraction and the result obtained from datasheet-referenced data selection and mapping, and executes the operation corresponding to the recognized gesture in a control application. By combining the EMG signal with the IMU signal, the computer device unit 140 can detect isometric muscle activity that produces no visible movement, classify subtle motionless gestures accordingly, and enable control without disturbing the surrounding environment.
For gesture feature extraction from the EMG signal, the computer device unit 140 can apply the wavelet transformation, a time-frequency analysis. As shown in FIG. 3, the gestures are classified into three classes: the hand-close gesture, the hand-open gesture, and the hand-extension gesture. The wavelet transform (WT) is a time-frequency analysis method well suited to non-stationary signals such as EMG. Its advantages are that the dimension of the feature vector is easy to set, and that results nearly identical to those obtained with higher-dimensional feature vectors can be achieved. The wavelet transform was developed from Fourier analysis and is applied over short time intervals, and the window size can be adjusted through real-time frequency analysis. As a result, the analysis of high-frequency signals yields the same quality of results as analysis in low-frequency bands. Because distinct features must be taken from each matched pattern in order to distinguish the gestures the user makes, several features can be extracted; among them are common statistical features such as the maximum, the minimum, and the root mean square (RMS).
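The wavelet feature-extraction step above can be made concrete with a small sketch. The patent does not say which mother wavelet is used, so the example below uses the simplest one (Haar) for a single decomposition level, and computes on a sub-band the statistical features named in the text (MAX, MIN, MAV, RMS, STD); the function names are illustrative only:

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar wavelet transform: returns the
    (approximation, detail) coefficient arrays."""
    x = x[: len(x) // 2 * 2]                  # truncate to even length
    a = (x[0::2] + x[1::2]) / np.sqrt(2)      # low-pass (approximation)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)      # high-pass (detail)
    return a, d

def band_features(c):
    """Statistical features of one wavelet sub-band, matching the
    feature names used in the description (MAX, MIN, MAV, RMS, STD)."""
    return {
        "MAX": np.max(c),
        "MIN": np.min(c),
        "MAV": np.mean(np.abs(c)),            # mean absolute value
        "RMS": np.sqrt(np.mean(c ** 2)),      # root mean square
        "STD": np.std(c),
    }
```

In practice one would decompose each windowed EMG segment over several levels and concatenate the per-band features into the feature vector fed to the classifier.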
For classifying the extracted gesture features, the computer device unit 140 can use the k-nearest-neighbor (KNN) algorithm. A support vector machine (SVM) can be used in a linear or nonlinear fashion through its kernel, but it has the drawback of requiring a large amount of training data for accurate results. In the present invention the KNN results were compared with the SVM results, and experiments showed that KNN produced better results than SVM. The reason is that KNN is itself nonlinear, so it can readily handle data that is distributed either linearly or nonlinearly, and it needs fewer data points to obtain good results.
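A minimal, self-contained version of the KNN decision rule described above can be written in a few lines. The Euclidean distance metric and the value of `k` are assumptions here; the patent does not fix either:

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """k-nearest-neighbour majority vote for one feature vector x."""
    d = np.linalg.norm(X_train - x, axis=1)   # Euclidean distances
    nearest = y_train[np.argsort(d)[:k]]      # labels of the k closest
    labels, counts = np.unique(nearest, return_counts=True)
    return labels[np.argmax(counts)]          # majority vote
```

With the three gesture classes of FIG. 3 encoded as 0 (hand close), 1 (hand open), and 2 (hand extension), each incoming feature vector would be assigned the label of the majority among its k nearest training samples.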
FIG. 7 shows training data of the wearable device for motion recognition and control according to an embodiment of the present invention, and FIG. 8 is a graph of the recognition score for each gesture. FIG. 7 tabulates the gesture groups, labeled 0, 1, and 2 for the hand-close, hand-open, and hand-extension gestures, after preprocessing and feature extraction. After the data is learned, the feature among MAX, MIN, MAV, RMS, and STD that yields the best result is selected. FIG. 8 shows the model scores, and their variance, used to confirm the accuracy of gesture recognition.
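One way to realise the feature-selection step mentioned above (picking which of MAX, MIN, MAV, RMS, STD separates the gesture classes best) is to score each feature column with leave-one-out nearest-neighbour accuracy. This particular scoring method is an assumption for illustration, not something stated in the patent:

```python
import numpy as np

def loo_accuracy(F, y):
    """Leave-one-out accuracy of a 1-NN classifier on a single
    feature column F, used to rank candidate features."""
    correct = 0
    for i in range(len(F)):
        d = np.abs(F - F[i])
        d[i] = np.inf                   # exclude the sample itself
        correct += y[np.argmin(d)] == y[i]
    return correct / len(F)
```

Running this for each candidate statistic and keeping the highest-scoring one mirrors the "choose which feature yields the best result" step after training.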
FIG. 9 shows the operational flow of the motion recognition control method using the wearable device according to an embodiment of the present invention. As shown in FIG. 9, the method comprises: a step in which the control module 130 receives the EMG signal (S110); a step in which the control module 130 receives the motion (IMU) signal (S120); a step in which the control module 130 transmits the preprocessed EMG signal and motion signal to the computer device unit 140 (S130); and a step in which the computer device unit 140 uses the artifact-filtered EMG signal and motion signal to recognize the gesture through feature extraction, classification, and data-selection mapping and to perform the corresponding control (S140).
As shown in FIGS. 1 to 6, the wearable device 100 used in the motion recognition control method of the present invention comprises the EMG sensor module 110 and the IMU sensor module 120 for recognizing hand gestures, the control (MCU) module 130, and the computer device unit 140. The motion recognition control method using the wearable device 100 described above is explained in detail below.
In step S110, the control module 130 receives from the EMG sensor module 110 an EMG signal that has been measured from the muscle movement of the user's forearm and converted into a digital signal. As shown in FIGS. 4 to 6, the electromyogram (EMG) sensor module 110 can comprise a single-channel EMG sensor located on the user's arm band. Attached to the user's arm, the EMG sensor module 110 rectifies the raw EMG signal measured from the user's muscle movement, amplifies the rectified signal, and converts the filtered EMG signal into a digital signal using an A/D converter. A differential amplifier can be used to amplify the rectified signal.
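The analogue chain of step S110 (rectify, amplify, A/D convert) can be mimicked in software for simulation or testing. The gain, reference voltage, and converter resolution below are hypothetical values chosen for illustration, not figures taken from the patent:

```python
import numpy as np

def digitize_emg(raw, gain=1000.0, vref=3.3, bits=10):
    """Software sketch of the analogue front end: full-wave
    rectification, amplification with clipping at the supply
    rail, then A/D quantisation to integer codes."""
    rectified = np.abs(raw)                        # full-wave rectify
    amplified = np.clip(rectified * gain, 0, vref) # amplify and clip
    levels = 2 ** bits - 1                         # e.g. 1023 for 10 bit
    return np.round(amplified / vref * levels).astype(int)
```

A millivolt-scale raw EMG sample thus becomes an integer ADC code that the MCU can buffer and forward over Bluetooth.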
In step S120, the control module 130 receives from the IMU sensor module 120 a motion (IMU) signal in which the degree of the user's movement has been measured and converted into a digital signal. As shown in FIGS. 4 to 6, the inertial motion unit (IMU) sensor module 120 is located on the user's forearm to measure the degree of the user's movement, and can consist of any one of an accelerometer, a gyroscope, and a magnetometer.
In step S130, the control module 130 performs a preprocessing step that removes unnecessary noise, including the motion artifact, from the EMG signal on the basis of the motion signal, and transmits the preprocessed EMG signal and motion signal to the computer device unit 140. The control module 130 may further include a Bluetooth module 131 for this transmission, and may be implemented as a LilyPad Simblee BLE board with integrated Bluetooth Low Energy 4.0 for low-power communication with the computer device unit 140.
In step S140, the computer device unit 140 receives the artifact-filtered EMG signal and motion signal transmitted by the control module 130 in step S130 over short-range wireless communication, recognizes the gesture of the EMG signal through feature extraction, classification, and data-selection mapping, and performs the corresponding control. In this step the computer device unit 140 extracts and classifies gesture features from the EMG signal received over Bluetooth, performs data selection and mapping against the preset datasheet using the IMU signal to distinguish the gesture the user has made, determines and recognizes the gesture of the EMG signal from the classified data and the mapping result, and executes the operation corresponding to the recognized gesture in the control application.
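The final step, mapping a recognised class to an action in the control application, can be as small as a lookup table. The class-to-action assignments below are purely illustrative; the patent only specifies that a preset datasheet is consulted:

```python
# Hypothetical datasheet: gesture class -> control-application action.
GESTURE_ACTIONS = {
    0: "select",      # hand close
    1: "release",     # hand open
    2: "next_item",   # hand extension
}

def dispatch(gesture_class, actions=GESTURE_ACTIONS):
    """Look up the recognised class and return the mapped action,
    ignoring classes not present in the table."""
    return actions.get(gesture_class, "ignore")
```

Keeping the mapping in data rather than code lets the same recognition pipeline drive different target applications by swapping the table.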
In addition, the computer device unit 140 can apply the wavelet transformation of time-frequency analysis for gesture feature extraction from the EMG signal, and can use the k-nearest-neighbor (KNN) algorithm for classifying the extracted features.
As described above, the wearable device for motion recognition and control and the motion recognition control method using it according to an embodiment of the present invention comprise an EMG sensor module that measures the EMG signal, an IMU sensor module that measures the motion signal, a control module that performs noise-removal preprocessing, and a computer device unit that recognizes and controls gestures based on the EMG signal. Removing the noise contained in the EMG signal used for motion recognition and control makes more accurate recognition possible. Furthermore, integrating the EMG sensor module, the IMU sensor module, and the control module in a single case, with the single-channel EMG sensor located on the user's arm band and the IMU sensor on the forearm, makes the EMG-based control interface a conveniently attachable and detachable wearable; and because the EMG signal and the IMU signal are used in combination, subtle motionless gestures can be recognized accurately.
The present invention described above can be variously modified or applied by those of ordinary skill in the art, and the scope of the technical idea according to the present invention shall be defined by the claims below.

Claims (20)

  1. A wearable device (100) for motion recognition and control, comprising:
    an electromyogram (EMG) sensor module (110) that measures an EMG signal according to the muscle movement of a user and converts the measured EMG signal into digital form for output;
    an inertial motion unit (IMU) sensor module (120) that measures a motion signal of the user and converts the measured motion signal into a digital signal for output;
    a control (MCU) module (130) that receives the EMG signal of the EMG sensor module (110) and the motion signal of the IMU sensor module (120), and performs filtering that removes the motion artifact contained in the EMG signal on the basis of the motion signal; and
    a computer device unit (140) that receives the artifact-filtered EMG signal and motion signal from the control module (130) over short-range wireless communication, recognizes the gesture of the EMG signal through feature extraction, classification, and data-selection mapping, and performs the corresponding control.
  2. The wearable device for motion recognition and control according to claim 1, wherein the EMG sensor module (110) comprises a single-channel EMG sensor located on an arm band of the user.
  3. The wearable device for motion recognition and control according to claim 1, wherein the EMG sensor module (110) is attached to the user's arm, rectifies the raw EMG signal measured according to the user's muscle movement, amplifies the rectified signal, and converts the filtered EMG signal into a digital signal using an A/D converter.
  4. The wearable device for motion recognition and control according to claim 3, wherein the EMG sensor module (110) uses a differential amplifier to amplify the rectified signal.
  5. The wearable device for motion recognition and control according to claim 1, wherein the IMU sensor module (120) is located on the user's forearm to measure the degree of the user's movement and consists of any one of an accelerometer, a gyroscope, and a magnetometer.
  6. The wearable device for motion recognition and control according to claim 5, wherein the motion signal is used as a reference noise signal for correcting the motion artifact contained in the EMG signal measured by the EMG sensor module (110).
  7. The wearable device for motion recognition and control according to claim 1, wherein the control module (130) further comprises a Bluetooth module (131) for transmitting, to the computer device unit (140), the EMG signal on which a preprocessing step removing unnecessary noise including the motion artifact has been performed, together with the motion signal.
  8. The wearable device for motion recognition and control according to claim 7, wherein the control module (130) is implemented as a LilyPad Simblee BLE board with integrated Bluetooth Low Energy 4.0 for low-power communication with the computer device unit (140).
  9. The wearable device for motion recognition and control according to any one of claims 1 to 8, wherein the EMG sensor module (110), the IMU sensor module (120), and the control (MCU) module (130) are housed in a cylindrical case (101), forming a wearable unit fastened to the user's forearm.
  10. The wearable device for motion recognition and control according to claim 9, wherein the cylindrical case (101) is produced on a 3D printer.
  11. The wearable device for motion recognition and control according to claim 9, wherein the cylindrical case (101) further comprises a fastening member (102) that can be attached to and detached from the user's forearm.
  12. The wearable device for motion recognition and control according to claim 9, wherein the computer device unit (140) extracts and classifies gesture features according to the user's muscle movement using the EMG signal received from the control module (130) over Bluetooth communication, and performs data selection and mapping against a preset datasheet using the motion (IMU) signal to distinguish the gesture made by the user.
  13. The wearable device for motion recognition and control according to claim 12, wherein the computer device unit (140) determines and recognizes the gesture of the EMG signal on the basis of the data classified through gesture feature extraction and the result obtained through datasheet-referenced data selection and mapping, and executes the operation corresponding to the recognized gesture in a control application.
  14. The wearable device for motion recognition and control according to claim 13, wherein the computer device unit (140) combines the EMG signal and the IMU signal to detect isometric muscle activity that produces no visible movement, classifies the resulting subtle motionless gestures, and enables control without disturbing the surrounding environment.
  15. The wearable device for motion recognition and control according to claim 13, wherein the computer device unit (140) applies the wavelet transformation of time-frequency analysis as the extraction of gesture features according to the user's muscle movement using the EMG signal.
  16. The wearable device for motion recognition and control according to claim 15, wherein the gesture classification consists of three classes: a hand-close gesture, a hand-open gesture, and a hand-extension gesture.
  17. The wearable device for motion recognition and control according to claim 15, wherein the computer device unit (140) performs the classification of the extracted gesture features using the k-nearest-neighbor (KNN) algorithm.
  18. A motion recognition control method using a wearable device (100) comprising an EMG sensor module (110), an IMU sensor module (120), a control (MCU) module (130), and a computer device unit (140), the method comprising:
    (1) a step in which the control module (130) receives, from the EMG sensor module (110), an EMG signal measured from the muscle movement of the user's forearm and converted into a digital signal;
    (2) a step in which the control module (130) receives, from the IMU sensor module (120), a motion (IMU) signal in which the degree of the user's movement has been measured and converted into a digital signal;
    (3) a step in which the control module (130) performs a preprocessing step that removes unnecessary noise, including the motion artifact, from the EMG signal on the basis of the motion signal, and transmits the preprocessed EMG signal and motion signal to the computer device unit (140); and
    (4) a step in which the computer device unit (140) receives, through step (3), the artifact-filtered EMG signal and motion signal from the control module (130) over short-range wireless communication, recognizes the gesture of the EMG signal through feature extraction, classification, and data-selection mapping, and performs the corresponding control.
  19. The motion recognition control method using a wearable device according to claim 18, wherein in step (4) the computer device unit (140) extracts and classifies gesture features according to the user's muscle movement using the EMG signal received from the control module (130) over Bluetooth communication, performs data selection and mapping against a preset datasheet using the motion (IMU) signal to distinguish the gesture made by the user, determines and recognizes the gesture of the EMG signal on the basis of the classified data and the mapping result, and executes the operation corresponding to the recognized gesture in a control application.
  20. The motion recognition control method using a wearable device according to claim 19, wherein the computer device unit (140) applies the wavelet transformation of time-frequency analysis as the extraction of gesture features using the EMG signal, and performs the classification of the extracted gesture features using the k-nearest-neighbor (KNN) algorithm.
PCT/KR2017/001568 2017-01-22 2017-02-13 Wearable device for motion recognition and control, and method for motion recognition control using same WO2018135692A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2017-0010103 2017-01-22
KR1020170010103A KR101963694B1 (en) 2017-01-22 2017-01-22 Wearable device for gesture recognition and control and gesture recognition control method using the same

Publications (1)

Publication Number Publication Date
WO2018135692A1 true WO2018135692A1 (en) 2018-07-26

Family

ID=62908520

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2017/001568 WO2018135692A1 (en) 2017-01-22 2017-02-13 Wearable device for motion recognition and control, and method for motion recognition control using same

Country Status (2)

Country Link
KR (1) KR101963694B1 (en)
WO (1) WO2018135692A1 (en)


Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102258168B1 (en) * 2019-11-14 2021-05-28 주식회사 마이뉴런 Paralysis patient motion analysis monitoring system and method
KR102327168B1 2020-03-02 2021-11-16 Agency for Defense Development System and Method for modeling human body which can estimate occupant motion
WO2022114294A1 (en) * 2020-11-27 2022-06-02 (주) 로임시스템 Biosignal dual processing apparatus
CN113534960B * 2021-07-29 2024-05-28 University of Science and Technology of China Upper arm artificial limb control method and system based on IMU and surface electromyographic signals
KR20230087297A * 2021-12-09 2023-06-16 Samsung Electronics Co., Ltd. Method for gesture recognition using wearable device and the device
EP4369153A1 (en) 2021-12-09 2024-05-15 Samsung Electronics Co., Ltd. Gesture recognition method using wearable device, and device therefor
CN114683292B (en) * 2022-06-01 2022-08-30 深圳市心流科技有限公司 Sampling frequency control method of electromyographic equipment, intelligent bionic hand and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090326406A1 (en) * 2008-06-26 2009-12-31 Microsoft Corporation Wearable electromyography-based controllers for human-computer interface
US20140368424A1 (en) * 2013-06-17 2014-12-18 Samsung Electronics Co., Ltd. Presentation device and method for operating the device
US20150057770A1 * 2013-08-23 2015-02-26 Thalmic Labs Inc. Systems, articles, and methods for human-electronics interfaces
KR20150112741A * 2014-03-27 2015-10-07 Korea Electronics Technology Institute Wearable device and information input method using the same
KR20150123254A * 2013-02-22 2015-11-03 Thalmic Labs Inc. Methods and devices that combine muscle activity sensor signals and inertial sensor signals for gesture-based control


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020206179A1 (en) * 2019-04-05 2020-10-08 Baylor College Of Medicine Method and system for detection and analysis of thoracic outlet syndrome (tos)
CN111046731A (en) * 2019-11-11 2020-04-21 中国科学院计算技术研究所 Transfer learning method and recognition method for gesture recognition based on surface electromyogram signals
CN111700718A (en) * 2020-07-13 2020-09-25 北京海益同展信息科技有限公司 Holding posture identifying method, holding posture identifying device, artificial limb and readable storage medium
CN114265498A (en) * 2021-12-16 2022-04-01 中国电子科技集团公司第二十八研究所 Method for combining multi-modal gesture recognition and visual feedback mechanism
CN114265498B (en) * 2021-12-16 2023-10-27 中国电子科技集团公司第二十八研究所 Method for combining multi-modal gesture recognition and visual feedback mechanism
CN113970968A (en) * 2021-12-22 2022-01-25 深圳市心流科技有限公司 Intelligent bionic hand action pre-judging method
CN113970968B (en) * 2021-12-22 2022-05-17 深圳市心流科技有限公司 Intelligent bionic hand action pre-judging method

Also Published As

Publication number Publication date
KR20180086547A (en) 2018-08-01
KR101963694B1 (en) 2019-03-29

Similar Documents

Publication Publication Date Title
WO2018135692A1 (en) Wearable device for motion recognition and control, and method for motion recognition control using same
WO2018205424A1 (en) Biometric identification method and terminal based on myoelectricity, and computer-readable storage medium
Usakli et al. Design of a novel efficient human–computer interface: An electrooculogram based virtual keyboard
EP1374765B1 (en) A mobile terminal capable of measuring a biological signal
WO2012153965A2 (en) Brain-computer interface device and classification method therefor
US7963931B2 (en) Methods and devices of multi-functional operating system for care-taking machine
KR101218200B1 (en) Wearable sensor-set and operating method of the same
WO2017192010A1 (en) Apparatus and method for extracting cardiovascular characteristic
EP1304072A3 (en) Method and apparatus for the serial comparison of electrocardiograms
WO2015053418A1 (en) Multifunctional sport headset
KR101238192B1 (en) Ear-attachable sensor-set and operating method of the same
EP3288447A1 (en) Apparatus and method for extracting cardiovascular characteristic
KR101218203B1 (en) Wearable sensor-set and operating method of the same
WO2011093557A1 (en) Device and method for feature extraction of biometric signals
WO2005002420A3 (en) Method and apparatus for an automated procedure to detect and monitor early-stage glaucoma
Yu et al. An inflatable and wearable wireless system for making 32-channel electroencephalogram measurements
WO2017099340A1 (en) Electronic device, signal processing method thereof, biological signal measurement system, and non-transitory computer readable recording medium
Lovelace et al. Modular, bluetooth enabled, wireless electroencephalograph (EEG) platform
WO2015178549A1 (en) Method and apparatus for providing security service by using stimulation threshold or less
Brahmaiah et al. Data Acquisition System of Electrooculogram
CN110051351B (en) Tooth biting signal acquisition method and control method and device of electronic equipment
WO2019156289A1 (en) Electronic device and control method therefor
CN110033772B (en) Non-acoustic voice information detection device based on PPG signal
WO2022260228A1 (en) Method and device for automatically detecting noise signal section
CN113951897A (en) Multi-mode resting electroencephalogram data interference elimination and marking method and device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17892127

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17892127

Country of ref document: EP

Kind code of ref document: A1