CN111694435B - Wearable touch detection method based on inertial sensing unit - Google Patents

Wearable touch detection method based on inertial sensing unit

Info

Publication number
CN111694435B
Authority
CN
China
Prior art keywords
data
touch
neural network
training
sensing unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010550315.9A
Other languages
Chinese (zh)
Other versions
CN111694435A (en)
Inventor
石亦磊
张海沫
苏兰加·纳纳亚卡拉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN202010550315.9A
Publication of CN111694435A
Application granted
Publication of CN111694435B
Legal status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/014 - Hand-worn input/output arrangements, e.g. data gloves
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346 - Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/04 - Architecture, e.g. interconnection topology
    • G06N 3/045 - Combinations of networks
    • G06N 3/08 - Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a wearable touch detection method based on an inertial sensing unit, relating to the technical field of touch detection and comprising the following steps: building and training a neural network model, and building a motion detection model; assembling an inertial sensor and collecting finger motion information; and transmitting the motion information characteristic signals to the motion detection model for checking, and outputting the result. Touch recognition is implemented on any physical surface without mounting a touch sensor on that surface. The necessary hardware is minimal, since a single inertial sensor mounted on a finger or fingernail suffices, so the device is easy to carry, can be designed into various wearable forms, and is low in cost. Compared with camera-based methods, the amount of data required for computation is smaller, the computation method is simpler, the computation speed is fast, and the reliability of touch detection is high.

Description

Wearable touch detection method based on inertial sensing unit
Technical Field
The invention relates to the technical field of touch detection, in particular to a wearable touch detection method based on an inertial sensing unit.
Background
Touch detection technology has been developing for many years. In general, a touch detection device determines the touch position by means of resistance, capacitance, infrared light, surface acoustic waves, force, bending waves, and the like. Devices that use sound waves to determine the touch location do so by measuring travel time, or by measuring phase differences and/or characterizing the screen.
The current touch detection technology is of several common types:
1. Touch-pad-style technologies, such as common computer touch pads and touch screens. Such touch detection techniques are widely applied and technically mature. Their limitation is equally obvious: the surface for touch interaction must be a purpose-built electronic panel, such as a touch screen, so these techniques cannot implement touch interaction on an ordinary physical surface.
2. Camera-based touch detection. Such techniques can detect finger touches on almost any object surface and place few requirements on the interaction surface. However, camera-based touch interaction has the following drawbacks: first, camera-based touch devices are strongly affected by lighting and obstacles, and if the camera does not have a direct view of the finger, the touch action cannot be recognized; second, cameras are not easy to carry and install; third, some cameras, such as depth cameras, are relatively expensive.
For the problems in the related art, no effective solution has been proposed at present.
Disclosure of Invention
Aiming at the problems in the related art, the invention provides a wearable touch detection method based on an inertial sensing unit, so as to overcome the technical problems existing in the prior related art.
The technical scheme of the invention is realized as follows:
a wearable touch detection method based on an inertial sensing unit comprises the following steps:
building and training a neural network model, and building a motion detection model;
training a neural network model, comprising the steps of:
selecting a number of subjects, who wear inertial sensing units at three positions on their fingers and then touch everyday objects while data are recorded, these being the touch data;
suspending the hands in the air and performing actions such as walking, sitting down and standing up while data are recorded, these being the non-touch data;
after the touch data and the non-touch data are labeled, sending them into the neural network for training, and stopping training once the spread in model accuracy over one hundred consecutive training runs is less than 0.1%;
assembling an inertial sensor, and collecting the motion information of fingers;
and transmitting the motion information characteristic signals to the motion detection model for checking, and outputting the result.
Further, the neural network model comprises an input data structure, convolution layer 1, an excitation function, pooling, convolution layer 2, convolution layer 3, fully connected layer 1, and fully connected layer 2.
Further, assembling the inertial sensor includes mounting it on a fingernail, or on a finger joint in the form of a ring.
The invention has the beneficial effects that:
According to the invention, touch recognition is realized on any physical surface without mounting a touch sensor on that surface. The necessary hardware is minimal, since a single inertial sensor mounted on a finger or fingernail suffices, so the device is easy to carry, can be designed into various wearable forms, and is low in cost. Compared with camera-based methods, the amount of data required for computation is smaller, the computation method is simpler, the computation speed is fast, and the reliability of touch detection is high.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the embodiments are briefly described below. The drawings in the following description are only some embodiments of the present invention; a person skilled in the art may derive other drawings from them without inventive effort.
FIG. 1 is a flow diagram of a wearable touch detection method based on an inertial sensing unit according to an embodiment of the invention;
FIG. 2 is a schematic diagram of a wearable touch detection method based on an inertial sensing unit according to an embodiment of the invention;
fig. 3 is a data schematic diagram of a wearable touch detection method based on an inertial sensing unit according to an embodiment of the invention.
Detailed Description
The embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. The embodiments described are only some, not all, of the embodiments of the invention. All other embodiments derived by a person skilled in the art from the embodiments of the invention without inventive effort fall within the scope of protection of the invention.
According to an embodiment of the invention, a wearable touch detection method based on an inertial sensing unit is provided.
As shown in fig. 1 to 3, the wearable touch detection method based on the inertial sensing unit according to the embodiment of the invention comprises the following steps:
Step S1, building and training a neural network model, and building a motion detection model;
Step S2, assembling an inertial sensor and collecting finger motion information;
Step S3, transmitting the motion information characteristic signals to the motion detection model for checking, and outputting the result.
By means of the above technical scheme, touch recognition is achieved on any physical surface without mounting a touch sensor on that surface. The necessary hardware is minimal, since a single inertial sensor mounted on a finger or fingernail suffices, so the device is easy to carry, can be designed into various wearable forms, and is low in cost. Compared with camera-based methods, the amount of data required for computation is smaller, the computation method is simpler, the computation speed is fast, and the reliability of touch detection is high.
The neural network model comprises an input data structure, convolution layer 1, an excitation function, pooling, convolution layer 2, convolution layer 3, fully connected layer 1, and fully connected layer 2.
Specifically, as shown in fig. 2, the neural network model is structured as follows:
I. the input data structure;
II. convolution layer 1: input 1 channel, output 16 channels, 3x3 kernel, stride 1;
III. excitation function + pooling: the excitation function is ReLU (Rectified Linear Unit), and the pooling kernel is 3x3;
IV. convolution layer 2: input 16 channels, output 32 channels, 5x5 kernel, stride 1;
V. excitation function + pooling: the excitation function is ReLU, and the pooling kernel is 3x3;
VI. convolution layer 3: input 32 channels, output 120 channels, 5x5 kernel, stride 1;
VII. excitation function: ReLU;
VIII. fully connected layer 1: input length 600, output length 512;
IX. excitation function: ReLU;
X. fully connected layer 2: input length 512, output length 128;
XI. excitation function: ReLU;
XII. fully connected layer 3: input length 128, output length 2 (indicating touch or no touch);
XIII. excitation function: softmax (softmax logistic regression).
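For illustration only, the layer stack above maps directly onto PyTorch, the framework the description names below. This is a minimal sketch, not the patented implementation: the patent does not disclose the input window dimensions, so the 95x59 window used here is merely one shape consistent with the flattened length of 600 (120 channels x 5 x 1) that fully connected layer 1 expects, and the class name TouchNet is invented for the example.

```python
import torch
import torch.nn as nn

class TouchNet(nn.Module):
    """Sketch of structures I-XIII above; name and input shape are illustrative."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, stride=1),    # II: conv layer 1, 1 -> 16 channels
            nn.ReLU(),                                    # III: ReLU ...
            nn.MaxPool2d(3),                              # ... with 3x3 pooling
            nn.Conv2d(16, 32, kernel_size=5, stride=1),   # IV: conv layer 2, 16 -> 32 channels
            nn.ReLU(),                                    # V: ReLU ...
            nn.MaxPool2d(3),                              # ... with 3x3 pooling
            nn.Conv2d(32, 120, kernel_size=5, stride=1),  # VI: conv layer 3, 32 -> 120 channels
            nn.ReLU(),                                    # VII
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(600, 512), nn.ReLU(),               # VIII-IX: fully connected layer 1
            nn.Linear(512, 128), nn.ReLU(),               # X-XI: fully connected layer 2
            nn.Linear(128, 2),                            # XII: touch / no touch
        )

    def forward(self, x):
        logits = self.classifier(self.features(x))
        # XIII: softmax (use the raw logits with CrossEntropyLoss when training)
        return torch.softmax(logits, dim=1)

# 95 x 59 is one window shape for which the stack flattens to exactly 600:
# 95x59 -conv3x3-> 93x57 -pool3-> 31x19 -conv5x5-> 27x15 -pool3-> 9x5 -conv5x5-> 5x1,
# and 120 channels * 5 * 1 = 600.
window = torch.randn(1, 1, 95, 59)   # (batch, channel, time steps, values) - hypothetical
print(TouchNet()(window))            # two probabilities: [no touch, touch]
```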
For the above data structure, as shown in fig. 3, a data sample is time-series data: the sensor reads one group of data at a time, i.e. one column of the figure above. Depending on the sampling frequency, a window of a given number of samples may represent a different duration. Once data of a certain length has been read, it is sent into the neural network as a whole for judgment. After each prediction, the earliest group of data is discarded, one new group is read and appended to the end of the window to form the new input, which is sent into the neural network again, and the process repeats.
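This windowing loop amounts to a fixed-length first-in, first-out buffer. A minimal sketch follows, assuming a hypothetical read_sample() driver call that returns one group of sensor readings; the window length of 95 is likewise only an example value.

```python
from collections import deque

import torch

WINDOW = 95  # hypothetical window length; the patent ties it to the sampling frequency

def detect_touches(read_sample, model):
    """Slide a fixed-length window over the sensor stream, one group of data at a time.

    read_sample() stands in for whatever driver call returns one "column" of
    readings (a list of floats); its width must match what the model expects.
    """
    buf = deque(maxlen=WINDOW)                 # appending past maxlen drops the oldest group
    while True:
        buf.append(read_sample())              # read one new group of data
        if len(buf) < WINDOW:                  # wait for the first full window
            continue
        x = torch.tensor(list(buf), dtype=torch.float32).unsqueeze(0).unsqueeze(0)
        with torch.no_grad():
            probs = model(x)                   # [no touch, touch]
        yield bool(probs[0, 1] > 0.5)
```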
Wherein, training the neural network model comprises the following steps:
selecting a number of subjects, who wear inertial sensing units at three positions on their fingers and then touch everyday objects while data are recorded, these being the touch data;
suspending the hands in the air and performing actions such as walking, sitting down and standing up while data are recorded, these being the non-touch data;
after the touch data and the non-touch data are labeled, they are sent into the neural network for training, and training stops once the spread in model accuracy over one hundred consecutive training runs is less than 0.1%.
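The stopping rule, i.e. the spread of model accuracy staying below 0.1% over one hundred consecutive training runs, can be checked with a short loop. The following sketch assumes that train_one_epoch and evaluate (returning accuracy as a fraction) are supplied by the surrounding training code; both names are placeholders, not part of the patent.

```python
from collections import deque

def train_until_stable(model, train_one_epoch, evaluate, window=100, tolerance=0.001):
    """Train until the accuracy spread over the last `window` evaluations is < tolerance.

    train_one_epoch(model) and evaluate(model) -> accuracy (a fraction in [0, 1])
    are placeholders for the usual training and validation loops.
    """
    history = deque(maxlen=window)
    while True:
        train_one_epoch(model)
        history.append(evaluate(model))
        if len(history) == window and max(history) - min(history) < tolerance:
            return model                      # accuracy varied by less than 0.1%
```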
Wherein, assembling the inertial sensor includes mounting it on a fingernail, or on a finger joint in the form of a ring.
In addition, the neural network is built with PyTorch, an open-source Python machine learning library based on Torch, used for applications such as natural language processing.
PyTorch was developed by Facebook's AI Research lab (FAIR) on the basis of Torch. It is a Python-based scientific computing package that provides two high-level capabilities: 1. tensor computation (like NumPy) with strong GPU acceleration; 2. deep neural networks built on an automatic differentiation system.
The predecessor of PyTorch is Torch. PyTorch shares the same underlying framework but rewrites much of it in Python, which makes it more flexible, adds support for dynamic graphs, and provides a Python interface. Developed by the Torch7 team, it is a Python-first deep learning framework that delivers strong GPU acceleration and supports dynamic neural networks, which many mainstream deep learning frameworks such as TensorFlow did not support. PyTorch can be seen as NumPy with GPU support added, or as a powerful deep learning platform with automatic differentiation.
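The two capabilities just listed can be illustrated in a few lines; this is a generic PyTorch example, not code from the patent:

```python
import torch

x = torch.tensor([2.0, 3.0], requires_grad=True)  # a tensor that records operations
y = (x ** 2).sum()                                # y = x1^2 + x2^2
y.backward()                                      # autograd computes dy/dx automatically
print(x.grad)                                     # tensor([4., 6.]), i.e. 2*x
```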
Specifically, the accelerometer data of the inertial sensing unit is differentiated discretely (adjacent samples are subtracted) over a certain time window (for example 50 milliseconds; experiments indicate that 50-150 milliseconds works best), and the resulting time-series data is sent into the neural network for model training or touch detection. Meanwhile, by analyzing the rotation angle of the inertial sensing unit, the specific motion track or position of the finger can be obtained: the accelerometer, gyroscope and magnetometer data are fused with the Madgwick AHRS algorithm to obtain the absolute rotation angle of the inertial sensing unit, and the rotation information is then mapped to two-dimensional position information. By processing both pieces of information simultaneously (whether a touch occurred, and the two-dimensional position of the finger), single-finger touch interaction, including touch positioning, can be realized on any object surface.
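A sketch of this preprocessing follows, assuming raw samples arrive as NumPy arrays. The Madgwick step uses the open-source ahrs package as one available implementation; the patent names the algorithm, not any particular library, so this choice is an assumption.

```python
import numpy as np
from ahrs.filters import Madgwick  # open-source Madgwick AHRS implementation (assumed here)

def accel_derivative(accel, sample_rate_hz, window_ms=100):
    """Discrete derivative of accelerometer data over one window.

    accel: (N, 3) array of raw accelerometer samples; window_ms: the patent
    prefers 50-150 ms, and 100 ms here is an arbitrary midpoint.
    """
    n = int(sample_rate_hz * window_ms / 1000)
    return np.diff(accel[-(n + 1):], axis=0)   # differences of adjacent samples

def orientations(gyro, accel, mag, sample_rate_hz):
    """Absolute rotation of the sensing unit, as one quaternion per sample."""
    madgwick = Madgwick(frequency=sample_rate_hz)
    q = np.array([1.0, 0.0, 0.0, 0.0])         # initial orientation
    quats = []
    for g, a, m in zip(gyro, accel, mag):      # units as required by the library
        q = madgwick.updateMARG(q, gyr=g, acc=a, mag=m)
        quats.append(q)
    return np.asarray(quats)                   # to be mapped to 2D position downstream
```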
In summary, by means of the above technical solution of the present invention, the following effects can be achieved:
1. touch recognition can be achieved on any physical surface without the need to mount any touch sensors on these surfaces.
2. Very little hardware is necessary: a single inertial sensor mounted on a finger or fingernail suffices, the device is easy to carry, and it can be designed into various wearable forms.
3. Low cost: because inertial sensing units are inexpensive, the hardware required to implement this technique is cheap.
4. Compared with camera-based (computer vision) methods, less data is required for computation, the computation method is simpler, and the computation speed is high.
5. The touch detection reliability is high.
The foregoing description of the preferred embodiments of the invention is not intended to be limiting, but rather is intended to cover all modifications, equivalents, alternatives, and improvements that fall within the spirit and scope of the invention.

Claims (3)

1. The wearable touch detection method based on the inertial sensing unit is characterized by comprising the following steps:
building and training a neural network model;
training a neural network model, comprising the steps of:
selecting a number of subjects, wearing inertial sensing units at three positions on the fingers, and then touching everyday objects and recording data;
suspending the hands in the air, performing walking, sitting and standing postures respectively, and recording data, the data being non-touch data;
after the touch data and the non-touch data are labeled, sending them into the neural network for training until the spread in model accuracy over one hundred consecutive training runs is less than 0.1%, whereupon training stops, wherein the specific manner of sending the labeled touch data and non-touch data into the neural network for training is as follows:
calculating a discrete derivative of the accelerometer data of the inertial sensing unit over a certain time, namely taking differences between adjacent samples, and then sending the obtained time-series data into the neural network for model training, the trained neural network model being used to detect whether a touch occurs, the certain time being selected as 50-150 milliseconds;
assembling the inertial sensor and collecting finger motion information, in the following specific manner:
obtaining the absolute rotation angle of the inertial sensing unit from the accelerometer, gyroscope and magnetometer data using the Madgwick AHRS algorithm, then mapping the rotation information to two-dimensional position information, and processing simultaneously the two-dimensional position information of the finger and whether the finger touches, so that single-finger touch interaction can be realized on the surface of any object.
2. The wearable touch detection method based on an inertial sensing unit according to claim 1, wherein the neural network model comprises an input data structure, convolution layer 1, an excitation function, pooling, convolution layer 2, convolution layer 3, fully connected layer 1, and fully connected layer 2.
3. The wearable touch detection method based on an inertial sensing unit according to claim 1, wherein assembling the inertial sensor includes mounting it on a fingernail, or on a finger joint in the form of a ring.
CN202010550315.9A 2020-06-16 2020-06-16 Wearable touch detection method based on inertial sensing unit Active CN111694435B (en)

Priority Applications (1)

Application Number: CN202010550315.9A · Priority Date: 2020-06-16 · Filing Date: 2020-06-16 · Title: Wearable touch detection method based on inertial sensing unit (granted as CN111694435B)

Applications Claiming Priority (1)

Application Number: CN202010550315.9A · Priority Date: 2020-06-16 · Filing Date: 2020-06-16 · Title: Wearable touch detection method based on inertial sensing unit (granted as CN111694435B)

Publications (2)

Publication Number Publication Date
CN111694435A CN111694435A (en) 2020-09-22
CN111694435B 2024-02-02

Family

ID=72481530

Family Applications (1)

Application Number: CN202010550315.9A (Active; granted as CN111694435B) · Priority Date: 2020-06-16 · Filing Date: 2020-06-16 · Title: Wearable touch detection method based on inertial sensing unit

Country Status (1)

Country Link
CN (1) CN111694435B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103105945A (en) * 2012-12-17 2013-05-15 中国科学院计算技术研究所 Man-machine interaction ring supporting multi-point touch gestures
CN106922185A (en) * 2014-09-30 2017-07-04 微软技术许可有限责任公司 Via the wearable and mobile device control based on proper motion
CN109919034A (en) * 2019-01-31 2019-06-21 厦门大学 A kind of identification of limb action with correct auxiliary training system and method
KR20190102924A (en) * 2018-02-27 2019-09-04 세종대학교산학협력단 Techniques of performing convolutional neural network-based gesture recognition using inertial measurement unit
CN110210547A (en) * 2019-05-27 2019-09-06 南京航空航天大学 Piano playing gesture identification method based on inertia gloves

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130135223A1 (en) * 2009-12-13 2013-05-30 Ringbow Ltd. Finger-worn input devices and methods of use
US10416802B2 (en) * 2015-09-14 2019-09-17 Stmicroelectronics Asia Pacific Pte Ltd Mutual hover protection for touchscreens
US10261685B2 (en) * 2016-12-29 2019-04-16 Google Llc Multi-task machine learning for predicted touch interpretations
WO2018151449A1 (en) * 2017-02-17 2018-08-23 Samsung Electronics Co., Ltd. Electronic device and methods for determining orientation of the device

Also Published As

Publication number Publication date
CN111694435A (en) 2020-09-22

Similar Documents

Publication Publication Date Title
US9613262B2 (en) Object detection and tracking for providing a virtual device experience
US20140192024A1 (en) Object detection and tracking with audio and optical signals
JP3630712B2 (en) Gesture input method and apparatus
US20130010071A1 (en) Methods and systems for mapping pointing device on depth map
US10108270B2 (en) Real-time 3D gesture recognition and tracking system for mobile devices
CN106104434A (en) Touch panel device is used to determine user's handedness and orientation
CN109344789A (en) Face tracking method and device
CN104516499B (en) Apparatus and method for event using user interface
CN110832433A (en) Sensor-based component activation
CN106681575A (en) Slider and gesture recognition using capacitive sensing
CN110083418A (en) The processing method, equipment and computer readable storage medium of picture in information flow
Ryumin et al. Automatic detection and recognition of 3D manual gestures for human-machine interaction
Fei et al. Flow-pose Net: An effective two-stream network for fall detection
CN111694435B (en) Wearable touch detection method based on inertial sensing unit
Liu et al. Ultrasonic positioning and IMU data fusion for pen-based 3D hand gesture recognition
CN103854026B (en) A kind of recognition methods and electronic equipment
US8884904B2 (en) Touch panel apparatus, system and operation method thereof
CN110199207A (en) Vibration analysis system and its method
WO2015064991A2 (en) Smart device enabling non-contact operation control and non-contact operation control method using same
Kefer et al. Comparing the placement of two arm-worn devices for recognizing dynamic hand gestures
KR20140046197A (en) An apparatus and method for providing gesture recognition and computer-readable medium having thereon program
Chen et al. An integrated sensor network method for safety management of construction workers
Zhou et al. Acoustic Sensing-based Hand Gesture Detection for Wearable Device Interaction
van Wyk et al. A multimodal gesture-based virtual interactive piano system using computer vision and a motion controller
Mali et al. Hand gestures recognition using inertial sensors through deep learning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant