CN110377175A - Method and system for recognizing tap events on a touch panel, and terminal touch product - Google Patents

Method and system for recognizing tap events on a touch panel, and terminal touch product

Info

Publication number
CN110377175A
Authority
CN
China
Prior art keywords
percussion
touch
network
sub-network
neural network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810330628.6A
Other languages
Chinese (zh)
Other versions
CN110377175B (en)
Inventor
蔡宗华
叶映志
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Silicon Integrated Systems Corp
Original Assignee
Silicon Integrated Systems Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Silicon Integrated Systems Corp filed Critical Silicon Integrated Systems Corp
Priority to CN201810330628.6A priority Critical patent/CN110377175B/en
Publication of CN110377175A publication Critical patent/CN110377175A/en
Application granted granted Critical
Publication of CN110377175B publication Critical patent/CN110377175B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412Digitisers structurally integrated in a display
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • G06F3/0418Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Computational Linguistics (AREA)
  • Artificial Intelligence (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)
  • Electrophonic Musical Instruments (AREA)

Abstract

A method for recognizing tap events on a touch panel comprises: measuring the vibration signals of tap events performed on a touch panel, collecting the tap events and recording the type of each tap event to form samples; generating a sample set comprising a number of such samples; training a deep neural network with the sample set to determine an optimized weight parameter group; deploying the deep neural network together with the optimized weight parameter group as a tap classifier in a terminal touch product; and obtaining a predicted tap type from the vibration signal sensed when a tap operation is performed on the terminal touch product and from an image formed by the sensed touch sensing values. The present disclosure also provides a system corresponding to the recognition method, and a terminal touch product.

Description

Method and system for recognizing tap events on a touch panel, and terminal touch product
Technical field
The present disclosure relates to detection technology, and more particularly to a method and system for recognizing tap events on a touch panel, and to a terminal touch product.
Background art
Existing large touch display devices are usually bundled with annotation (drawing) software that lets users mark up the displayed picture, making it easier to explain the on-screen content. Such annotation software typically shows a main menu at the edge of the picture, through which the user can change the brush color or adjust the brush thickness. However, because the screen is very large, the main menu may be far away from the user and inconvenient to reach, so adjusting brush attributes is quite cumbersome.
In view of this, a new approach is needed to solve the above problems.
Summary of the invention
The purpose of the present disclosure is to provide a method and system for recognizing tap events on a touch panel, and a terminal touch product, so as to improve the accuracy of tap-type prediction.
To achieve the above purpose, one aspect of the present disclosure provides a method for recognizing tap events on a touch panel, comprising: sensing various tap events performed on a touch panel and measuring a number of vibration signals, and recording the types of these tap events as classification labels; taking the vibration signal of a tap event together with its classification label as one sample, and generating a sample set comprising a number of such samples; feeding the samples of the sample set as input, with a freely chosen weight parameter group as adjustable parameters, into a deep neural network for training, and adjusting the weight parameter group using the backpropagation algorithm; reading the samples of the sample set in mini-batches and training the deep neural network to fine-tune the weight parameter group, so as to determine an optimized weight parameter group; deploying the deep neural network and the optimized weight parameter group as a model in a terminal touch product; sensing the vibration signal produced by a tap operation performed on the terminal touch product, and generating an image formed from the touch sensing values corresponding to the tap operation; and inputting the vibration signal of the tap operation into the model to obtain a first result, analyzing the image formed from the touch sensing values to obtain a second result, and obtaining a predicted tap type according to the first result and the second result.
Another aspect of the present disclosure provides a method for recognizing tap events on a touch panel, comprising: sensing various tap events performed on a touch panel and measuring a number of vibration signals, converting the touch sensing values obtained by the touch panel for each tap event into an image, and recording the types of these tap events as classification labels; taking the vibration signal, the image and the classification label of a tap event as one sample, and generating a sample set comprising a number of such samples; feeding the samples of the sample set as input, with a freely chosen weight parameter group as adjustable parameters, into a deep neural network for training, and adjusting the weight parameter group using the backpropagation algorithm; and reading the samples of the sample set in mini-batches and training the deep neural network to fine-tune the weight parameter group, so as to determine an optimized weight parameter group.
A further aspect of the present disclosure provides a system for recognizing tap events on a touch panel, comprising: a touch panel; a vibration sensor disposed together with the touch panel, wherein the vibration sensor senses various tap events performed on the touch panel and measures a number of vibration signals, the touch panel senses the touch sensing values produced by each tap event, and these touch sensing values are converted into an image; a processor coupled to the vibration sensor and the touch panel, the processor receiving the vibration signals transmitted by the vibration sensor and the images transmitted by the touch panel; and a memory connected to the processor and containing program instructions executable by the processor, the processor executing the instructions to perform a method comprising: recording the types of the tap events in the memory as classification labels; taking the vibration signal, the image and the classification label of a tap event as one sample, and generating a sample set comprising a number of such samples; feeding the samples of the sample set as input, with a freely chosen weight parameter group as adjustable parameters, into a deep neural network for training, and adjusting the weight parameter group using the backpropagation algorithm; and reading the samples of the sample set in mini-batches and training the deep neural network to fine-tune the weight parameter group, so as to determine an optimized weight parameter group.
Yet another aspect of the present disclosure provides a terminal touch product, comprising: a touch interface; a vibration sensor disposed together with the touch interface, the vibration sensor sensing the vibration signal produced by a tap operation performed on the touch interface, the touch interface sensing the tap operation to obtain touch sensing values, which are converted into an image; and a controller coupled to the vibration sensor and the touch interface, the controller being deployed with a deep neural network corresponding to the deep neural network in the above method, the controller taking the corresponding deep neural network and the optimized weight parameter group obtained by the above method as a model, and inputting the vibration signal corresponding to the tap operation and the image from the touch interface into the model, so as to obtain a predicted tap type.
The present disclosure adopts deep learning, using a deep neural network to learn the classification of the various tap events performed on a touch panel and thereby obtain a prediction model. This prediction model is deployed on a terminal touch product, so the terminal touch product can predict the tap actions made by the user, and software can map these tap types to different applications, which greatly improves applicability. Moreover, the present disclosure uses both the vibration signal produced by a tap and the image formed by the touch sensing values to predict the tap type, which effectively improves the prediction accuracy.
To make the above content of the present disclosure clearer and more comprehensible, preferred embodiments are described in detail below in conjunction with the accompanying drawings.
Brief description of the drawings
Fig. 1 shows a schematic diagram of a system for recognizing tap events on a touch panel according to the present disclosure.
Fig. 2 shows a flowchart of a method for recognizing tap events on a touch panel according to a first embodiment of the present disclosure.
Fig. 3 shows a schematic diagram of a vibration signal distributed over time in the present disclosure.
Fig. 4 shows a schematic diagram of a vibration signal in frequency space in the present disclosure.
Fig. 5 shows a schematic diagram of the deep neural network in the present disclosure.
Fig. 6 shows a flowchart of a method for recognizing tap events on a touch panel according to a second embodiment of the present disclosure.
Fig. 7 shows an architecture diagram of the deep neural network according to the present disclosure.
Fig. 8 shows a schematic diagram of a terminal touch product according to the present disclosure.
Fig. 9 shows a further flowchart of the method for recognizing tap events on a touch panel according to the second embodiment of the present disclosure.
Detailed description of the embodiments
To make the purpose, technical solution and effect of the present disclosure clearer, the present disclosure is further described below in conjunction with the drawings and embodiments. It should be appreciated that the specific embodiments described here only explain the present disclosure; the word "embodiment" used in this specification means serving as an example, instance or illustration, and is not intended to limit the present disclosure. In addition, the article "a" or "an" used in this specification and the appended claims can generally be interpreted as "one or more", unless otherwise specified or where the singular form can be determined from the context.
The present disclosure adopts deep learning to perform classification learning on the tap events made on a touch panel and thereby obtain a classification model. With this classification model, the tap actions a user makes on a touch product can be classified to obtain the tap type (for example, combinations of the number of taps and the number of tapping fingers), so that a predetermined operation corresponding to that type can be executed.
In particular, the present disclosure uses both the vibration signal measured by a vibration sensor and the image of the touch sensing values measured by the touch panel to predict the tap type, which can effectively improve the prediction accuracy.
The type of a tap event may be, for example, a tap performed with a different part of the finger (such as a finger-pad tap, a knuckle tap or a fingernail tap), the number of taps (such as a single tap, a double tap or a triple tap), the number of fingers tapping simultaneously (such as a one-finger tap, a two-finger tap or a three-finger tap), the tapping angle (such as a 90-degree tap or a 45-degree tap), or any combination of the above types.
The predetermined operation can be configured differently for different usage scenarios. In the scenario of a large touch panel, for example, a one-finger tap may be associated with opening/closing the main menu, a two-finger tap with changing the brush color, and a three-finger tap with changing the brush thickness. As described below, those skilled in the art will appreciate that the inventive concept of the present disclosure can also be used in other applications. Of course, the mapping between tap types and executed operations can also be defined by the user.
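In software, such a tap-type-to-operation mapping reduces to a small dispatch table. The sketch below is purely illustrative: the type names and the ui handler methods are hypothetical, not part of this disclosure.

```python
# Hypothetical mapping of predicted tap types to annotation-software actions.
# The type names and the `ui` handler methods are illustrative only.
TAP_ACTIONS = {
    "one_finger_tap": lambda ui: ui.toggle_main_menu(),
    "two_finger_tap": lambda ui: ui.cycle_brush_color(),
    "three_finger_tap": lambda ui: ui.cycle_brush_thickness(),
}

def dispatch_tap(predicted_type, ui):
    action = TAP_ACTIONS.get(predicted_type)
    if action is not None:
        action(ui)  # execute the predetermined operation for this tap type
```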
Fig. 1 shows a schematic diagram of a system for recognizing tap events on a touch panel according to the present disclosure. The system comprises a touch device 10 and a computer device 40 coupled to the touch device 10. The touch device 10 comprises a touch panel 20 and at least one vibration sensor 30. The touch device 10 can be a display device with a touch function, which displays images through a display panel (not shown) while receiving the user's touch operations. The computer device 40 can be any computer with a certain computing capability, such as a personal computer or a notebook computer. In the present disclosure, in order to classify tap events, the tap events must first be collected: here, the touch device 10 is tapped manually, the signals corresponding to the tap events are transmitted to the computer device 40, and the computer device 40 learns from them using a deep neural network.
The vibration sensor 30 is, for example, an accelerometer. The vibration sensor 30 can be arranged at any position in the touch device 10; preferably, it is arranged on the lower surface of the touch panel 20. The vibration sensor 30 senses the tap actions performed on the touch device 10 and generates corresponding vibration signals. When the vibration sensor 30 is arranged on the lower surface of the touch panel 20, it can produce better signals for taps on the touch panel 20.
The touch panel 20 comprises a signal transmitting (Tx) layer 21 and a signal receiving (Rx) layer 22. The touch panel 20 senses the user's touch operation and generates a number of touch sensing values (such as capacitance values), from which the position of the touch can be determined. These touch sensing values combined with their coordinates (such as X and Y coordinates) can be regarded as an image, with each touch sensing value corresponding to a pixel value in the image; the distribution of the touch sensing values in this image corresponds to the form of the tap.
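As an illustration of how such an image can be formed, a minimal sketch follows. The assumption that the sensing values arrive as a 2-D array indexed by electrode row/column, and the normalization to [0, 1], are ours, not specified by this disclosure.

```python
import numpy as np

def sensing_values_to_image(values: np.ndarray) -> np.ndarray:
    """Treat a grid of touch sensing values (e.g. capacitances) as a
    grayscale image: each sensing value becomes one pixel, and its
    (row, column) index plays the role of the (Y, X) coordinate."""
    img = values.astype(np.float32)
    # Normalize to [0, 1] so pixel intensity reflects touch strength.
    span = img.max() - img.min()
    if span > 0:
        img = (img - img.min()) / span
    return img
```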
In one embodiment, the computer device 40 receives, through an interface, the vibration signals generated by the vibration sensor 30 and feeds them into the deep neural network for classification learning. After tap events are generated manually, the type of each tap event can also be input into the computer device 40 as a classification label, so that supervised learning can be carried out.
In another embodiment, the computer device 40 receives, through an interface, the vibration signals generated by the vibration sensor 30 together with the touch sensing values from the touch panel 20; the computer device 40 can convert the touch sensing values from the touch panel 20 into an image data form, and then feed the vibration signals and the images into the deep neural network for classification learning.
That is, before the tap type is predicted, the deep neural network can be trained with the vibration signal generated by a tap event and the classification label of the type of that tap event as input, or with the vibration signal generated by a tap event, the image (from the touch panel 20) and the classification label of the type of that tap event as input.
As shown in Fig. 1, the computer device 40 comprises a processor 41 and a memory 42. The processor 41 is coupled to the vibration sensor 30 and receives the vibration signals it transmits; the memory 42 is connected to the processor 41 and contains program instructions executable by the processor 41, which the processor 41 executes to carry out the operations of the deep neural network. The computer device 40 can also use a GPU to execute the operations of the deep neural network, to increase computation speed.
Fig. 2 shows a flowchart of a method for recognizing tap events on a touch panel according to a first embodiment of the present disclosure. Referring to Fig. 2 in conjunction with Fig. 1, the method comprises the following steps.
Step S21: sense the various tap events performed on the touch panel 20 using the vibration sensor 30 and measure a number of vibration signals, and record the types of these tap events as classification labels. In this step, various tap events are generated manually on the touch panel 20, and the vibration sensor 30 arranged on the lower surface of the touch panel 20 senses the tap events and generates vibration signals. In addition, the classification label of the type of each tap event is recorded and stored in the memory 42.
In the present disclosure, the number of vibration sensors 30 is not limited to one; there can be several. The vibration sensor 30 can be placed at any position in the touch device 10, and can sense tap actions made at any position on the surface of the touch device 10.
The acceleration detected by the vibration sensor 30 is a function of time and has three direction components; Fig. 3 shows the distribution over time of the acceleration magnitude corresponding to a certain tap event. In one embodiment, each of the three direction components is transformed into frequency space using the Fourier transform, as shown in Fig. 4. Specifically, the method can further comprise the step of transforming each vibration signal from a time distribution into frequency space.
After the transform into frequency space, the low-frequency DC component and the high-frequency noise can further be filtered out, so that the classification result is not affected by gravitational acceleration and noise. Specifically, the method can further comprise the step of filtering each vibration signal to remove its high- and low-frequency parts.
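These two preprocessing steps (time-to-frequency transform, then removal of the DC component and high-frequency noise) can be sketched as follows. The sampling rate and cutoff frequencies are placeholder assumptions, since the disclosure does not specify them:

```python
import numpy as np

def preprocess_axis(signal: np.ndarray, fs: float = 1000.0,
                    f_lo: float = 5.0, f_hi: float = 400.0) -> np.ndarray:
    """Transform one acceleration component into frequency space and
    band-pass it: drop the DC/low-frequency part (gravity) and the
    high-frequency part (noise). fs, f_lo and f_hi are assumed values."""
    spectrum = np.fft.rfft(signal)                 # time -> frequency space
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    spectrum[(freqs < f_lo) | (freqs > f_hi)] = 0  # filter low/high bands
    return np.abs(spectrum)                        # magnitudes as features

# Each of the three direction components is transformed the same way, e.g.:
# features = np.concatenate([preprocess_axis(a) for a in (ax, ay, az)])
```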
Step S22: take the vibration signal of a tap event together with its classification label as one sample, and generate a sample set comprising a number of such samples. In this step, a vibration signal measured by the vibration sensor 30 and the classification label of the type of the corresponding tap event constitute one data item, i.e. one sample, and a number of samples constitute a sample set. Specifically, a sample contains the feature values of a vibration signal and the classification label of the type of the corresponding tap event.
The sample set can be divided into a training sample set and a test sample set; the training sample set is used to train the deep neural network, and the test sample set is used to test the classification accuracy of the trained deep neural network model.
Step S23: feed the samples of the sample set as input, with a freely chosen group of weighting parameters as adjustable parameters, into a deep neural network for training, and adjust the weight parameter group using the backpropagation algorithm.
The feature values of a sample obtained in step S22 are input at the input layer, and the deep neural network outputs a predicted classification label. Fig. 5 shows an example of a deep neural network. A deep neural network can generally be divided into an input layer, an output layer, and learning layers between the input layer and the output layer; each sample of the sample set is input at the input layer, and the predicted classification label is output at the output layer. In general, a deep neural network contains many learning layers (for example 50 to 100 layers), which is what makes deep learning possible. The deep neural network shown in Fig. 5 is only an illustration; the deep neural network of the present disclosure is not limited to it.
A deep neural network may include multiple convolutional layers, batch normalization layers, pooling layers, fully connected layers, rectified linear units (ReLU) and a softmax output layer. The present disclosure can learn with an appropriate number of layers to strike a balance between prediction accuracy and computational efficiency; note, however, that too many layers may actually degrade accuracy. The deep neural network may comprise multiple cascaded sub-networks, each sub-network connected to every sub-network behind it, as in DenseNet (Dense Convolutional Network), to improve prediction accuracy. The deep neural network may also comprise a residual network (ResNet), which addresses the degradation problem.
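A minimal PyTorch sketch of a network built from these layer types is given below, for the first embodiment where the input is the one-dimensional vibration feature vector. The layer counts, channel widths, input length and number of tap classes are all illustrative assumptions; a deployed network would be far deeper than this.

```python
import torch
import torch.nn as nn

class TapClassifier1D(nn.Module):
    """1-D convolutional stack with batch normalization, pooling,
    ReLU and a softmax output, as listed in the text above."""
    def __init__(self, n_classes: int = 4, in_len: int = 256):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2),
            nn.BatchNorm1d(16), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.BatchNorm1d(32), nn.ReLU(), nn.MaxPool1d(2),
        )
        self.classifier = nn.Linear(32 * (in_len // 4), n_classes)

    def forward(self, x):                  # x: (batch, 1, in_len)
        h = self.features(x).flatten(1)
        return self.classifier(h)          # logits; softmax is applied in the loss
```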
The goal of the deep neural network is to minimize the classification error (loss), and the optimization uses the backpropagation algorithm: the prediction obtained at the output layer is compared with the true value to obtain an error value, and this error value is then propagated backward layer by layer to correct the parameters of each layer.
Step S24: read the samples of the sample set in mini-batches, train the deep neural network, and fine-tune the weight parameter group to determine an optimized weight parameter group. Each time a batch of samples is used for training, the weight parameter group is fine-tuned once; this is iterated until the classification error tends to converge. Finally, the parameter group with the highest prediction accuracy on the test set is selected as the optimized model parameter group.
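A sketch of this mini-batch training loop follows, assuming a classifier like the one above. The optimizer choice, learning rate, batch size and epoch count are assumptions; keeping the weights that score highest on the test set mirrors the selection described in step S24.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

def train(model, X, y, X_test, y_test, epochs: int = 50):
    """Mini-batch training with backpropagation; returns the weight
    parameter group that scores best on the test sample set."""
    loader = DataLoader(TensorDataset(X, y), batch_size=32, shuffle=True)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    best_acc, best_state = 0.0, None
    for _ in range(epochs):
        model.train()
        for xb, yb in loader:               # read the samples in mini-batches
            opt.zero_grad()
            loss = loss_fn(model(xb), yb)   # classification error (loss)
            loss.backward()                 # backpropagate the error
            opt.step()                      # fine-tune the weight parameters
        model.eval()
        with torch.no_grad():               # accuracy on the test sample set
            acc = (model(X_test).argmax(dim=1) == y_test).float().mean().item()
        if acc > best_acc:
            best_acc = acc
            best_state = {k: v.clone() for k, v in model.state_dict().items()}
    return best_state                       # the optimized weight parameter group
```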
Step S25: deploy the deep neural network and the optimized weight parameter group as a model in a terminal touch product. In this step, the terminal touch product carries a prediction model comprising a deep neural network identical or corresponding to the deep neural network used in steps S21 to S24, together with the optimized weight parameter group obtained in step S24.
Step S26: sense the vibration signal produced by a tap operation performed on the terminal touch product, and generate an image formed from the touch sensing values corresponding to the tap operation. The terminal touch product comprises a touch interface and at least one vibration sensor. In this step, when the user taps the terminal touch product, the vibration sensor in the terminal touch product measures a vibration signal, and the touch interface senses the user's touch operation to generate a number of touch sensing values (such as capacitance values), which are converted into an image; the distribution of the touch sensing values in this image corresponds to the form of the tap.
Step S27: input the vibration signal of the tap operation into the model to obtain a first result, analyze the image formed by the touch sensing values to obtain a second result, and obtain a predicted tap type according to the first result and the second result. That is, the prediction of the tap type is obtained from both the vibration signal generated by the vibration sensor of the terminal touch product and the image formed by the touch sensing values generated by its touch interface. The vibration signal from the terminal touch product serves as the input of the trained deep neural network, yielding a first prediction result for the tap type; the image formed by the touch operation on the terminal touch product is analyzed to yield a second prediction result for the tap type. Both prediction results take part in the final prediction of the tap type.
In one embodiment, if one of the first prediction result and the second prediction result is unknown or cannot be determined, the final tap-type prediction can be decided by the other prediction result. In another embodiment, the finally predicted tap type can be one of the first prediction result and the second prediction result; for example, when the two results are inconsistent, one of them can be preset as the final prediction, or the choice can be made according to their respective scores. In yet another embodiment, the finally predicted tap type can be the combination of the tap types separately predicted by the first prediction result and the second prediction result.
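These fusion rules can be sketched as follows; representing each result as a (type, score) pair with None for an undetermined type is our assumption, not a convention of the disclosure.

```python
def fuse_predictions(first, second):
    """first/second: (tap_type or None, score) from the vibration model
    and from the image analysis. Returns the final predicted tap type."""
    t1, s1 = first
    t2, s2 = second
    if t1 is None:                 # vibration result undetermined:
        return t2                  # fall back to the image result
    if t2 is None:
        return t1
    if t1 == t2:
        return t1
    # Inconsistent results: decide by their respective scores.
    return t1 if s1 >= s2 else t2
```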
For example, the model may obtain the number of taps from the vibration signal (say, a double tap), while the image analysis shows how many fingers tapped simultaneously (say, a two-finger tap), so that the predicted tap type is a double two-finger tap. As another example, the model may obtain the number of taps from the vibration signal (say, a double tap), while the image analysis shows which part of the finger tapped (say, a knuckle tap), so that the predicted tap type is a double knuckle tap.
Step S28: according to the predicted tap type, execute a predetermined operation corresponding to the predicted tap type. In this step, the predicted tap type can be sent to a running piece of software, which then executes the operation corresponding to the prediction result.
Fig. 6 shows a flowchart of a method for recognizing tap events on a touch panel according to a second embodiment of the present disclosure. Referring to Fig. 6 in conjunction with Fig. 1, the method comprises the following steps.
Step S61: sense the various tap events performed on the touch panel 20 using the vibration sensor 30 and measure a number of vibration signals; convert the touch sensing values obtained by the touch panel 20 for each tap event into an image; and record the types of these tap events as classification labels. In this step, various tap events are generated manually on the touch panel 20, and the vibration sensor 30 arranged on the lower surface of the touch panel 20 senses the tap events and generates vibration signals. The touch panel 20 generates a number of touch sensing values for each tap event, and these touch sensing values are converted into an image data form, one image per tap event. In addition, the classification label of the type of each tap event is recorded and stored in the memory 42.
In the present disclosure, the image of the touch sensing values can be generated by the touch panel 20 and then input into the computer device 40, or the processor 41 of the computer device 40 can convert the touch sensing values from the touch panel 20 into the image.
In general, a tap event occurs only in a partial region of the touch panel 20, so it is possible to capture only the image of the touch sensing values in that region; that is, only the image of this local region may be used to train the deep neural network. Specifically, the method can further comprise the step of converting the touch sensing values in a local region containing the position where the tap event occurred into the image.
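A sketch of extracting such a local region is given below, under the assumptions that the tap position can be taken as the location of the peak sensing value and that the window half-width is a free parameter; neither is specified by the disclosure.

```python
import numpy as np

def crop_tap_region(image: np.ndarray, half: int = 8) -> np.ndarray:
    """Cut out the local area around the tap so that only this
    sub-image is fed to the network. The half-width is an assumption,
    and the window is simply clipped at the panel edges."""
    r, c = np.unravel_index(np.argmax(image), image.shape)  # tap position
    r0, c0 = max(r - half, 0), max(c - half, 0)
    return image[r0:r + half, c0:c + half]
```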
Step S62: take the vibration signal, the image and the classification label of a tap event as one sample, and generate a sample set comprising a number of such samples. In this step, a vibration signal measured by the vibration sensor 30, the image of the touch sensing values from the touch panel 20 and the classification label of the type of the corresponding tap event constitute one data item, i.e. one sample, and a number of samples constitute a sample set. Specifically, a sample contains the feature values of a vibration signal, the feature values of an image, and the classification label of the type of the corresponding tap event.
The sample set can be divided into a training sample set and a test sample set; the training sample set is used to train the deep neural network, and the test sample set is used to test the classification accuracy of the trained deep neural network model.
Step S63: feed the samples of the sample set as input, with a freely chosen weight parameter group as adjustable parameters, into a deep neural network for training, and adjust the weight parameter group using the backpropagation algorithm.
Fig. 7 shows an architecture diagram of the deep neural network according to the present disclosure. In this step, each sample in the sample set is input into the deep neural network shown in Fig. 7 for training.
As shown in Fig. 7, the deep neural network comprises a first sub-network 201, a second sub-network 202, a flatten layer 210 and a multilayer perceptron 220. The vibration signal from the vibration sensor 30 is input into the first sub-network 201, and the image of the touch sensing values from the touch panel 20 is input into the second sub-network 202. The first sub-network 201 and the second sub-network 202 are parallel branches of the deep neural network. The vibration signal is one-dimensional data, so the first sub-network 201 can use a one-dimensional architecture, such as a one-dimensional convolutional neural network (CNN). The image of the touch sensing values is two-dimensional data, so the second sub-network 202 can use a two-dimensional architecture, such as a two-dimensional convolutional neural network. The outputs of the first sub-network 201 and the second sub-network 202 are input into the flatten layer 210 to be flattened: since the first sub-network 201 has a one-dimensional architecture and the second sub-network 202 a two-dimensional one, the flatten layer 210 is needed to convert their outputs into one-dimensional data. The output of the flatten layer 210 is then input into the multilayer perceptron 220, which performs the classification prediction of the tap event to obtain a predicted classification label. The multilayer perceptron 220 can be regarded as another sub-network of the deep neural network; it is composed of multiple layers of nodes and maps a set of input vectors to a set of output vectors, and apart from the input nodes, every node is a neuron with a nonlinear activation function. The multilayer perceptron 220 may include convolutional layers, ReLU layers, pooling layers and fully connected layers. Each neuron of the first sub-network 201, the second sub-network 202 and the multilayer perceptron 220 corresponds to a weight value, and all the weight values of the deep neural network constitute the weight parameter group.
For example, the multilayer perceptron 220 is composed of an input layer, one or more hidden layers and an output layer. Each value input from the flatten layer 210 into the multilayer perceptron 220 enters at the input layer; each input value is multiplied by a weight and the products are summed (a bias may also be added after the summation) to give the value of a neuron in the first hidden layer. This value then undergoes a nonlinear transformation to give a final value, which serves as input to the next hidden layer; this proceeds layer by layer until the prediction result, i.e. the predicted classification label, is output at the output layer.
Specifically, step S63 can further comprise the following steps (a minimal sketch of this architecture is given after the list):
inputting the features of the vibration signal of a tap event into the one-dimensional first sub-network 201;
inputting the features of the image of the touch sensing values of the tap event into the two-dimensional second sub-network 202;
inputting the output of the first sub-network and the output of the second sub-network into the flatten layer 210 for flattening, which converts the outputs of the first sub-network 201 and the second sub-network 202 into one-dimensional data; and
inputting the output of the flatten layer 210 into the multilayer perceptron 220 to perform the classification prediction of the tap event and obtain the predicted classification label.
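Under the architecture just described — a one-dimensional first sub-network for the vibration signal, a two-dimensional second sub-network for the touch-sensing image, a flatten layer, and a multilayer perceptron — a minimal PyTorch sketch might look as follows. All channel counts, kernel sizes and input dimensions are illustrative assumptions.

```python
import torch
import torch.nn as nn

class DualBranchTapNet(nn.Module):
    def __init__(self, n_classes: int = 4, sig_len: int = 256, img_hw: int = 16):
        super().__init__()
        # First sub-network: 1-D CNN over the vibration signal.
        self.sub1 = nn.Sequential(
            nn.Conv1d(1, 16, 5, padding=2), nn.ReLU(), nn.MaxPool1d(2))
        # Second sub-network: 2-D CNN over the touch-sensing image.
        self.sub2 = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
        flat = 16 * (sig_len // 2) + 16 * (img_hw // 2) ** 2
        # Multilayer perceptron acting on the flattened, concatenated outputs.
        self.mlp = nn.Sequential(
            nn.Linear(flat, 128), nn.ReLU(), nn.Linear(128, n_classes))

    def forward(self, signal, image):
        a = self.sub1(signal).flatten(1)   # flatten layer: 1-D branch output
        b = self.sub2(image).flatten(1)    # flatten layer: 2-D branch output
        return self.mlp(torch.cat([a, b], dim=1))  # classification prediction

# Usage sketch:
# logits = DualBranchTapNet()(torch.randn(8, 1, 256), torch.randn(8, 1, 16, 16))
```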
The goal of the deep neural network is to minimize the classification error (loss), and the optimization uses the backpropagation algorithm: the prediction obtained at the output layer is compared with the true value to obtain an error value, and this error value is propagated backward layer by layer to correct the parameters of each layer. Specifically, the method can further comprise the step of adjusting the weight parameter group with the backpropagation algorithm according to the error between the predicted classification label and the classification label of the sample. The backpropagation algorithm adjusts not only the weight of each neuron in the multilayer perceptron 220, but also the weight of each neuron in the first sub-network 201 and the second sub-network 202.
Step S64: read the samples of the sample set in mini-batches, train the deep neural network, and fine-tune the weight parameter group to determine an optimized weight parameter group. Each time a batch of samples is used for training, the weight parameter group is fine-tuned once; this is iterated until the classification error tends to converge. Finally, the parameter group with the highest prediction accuracy on the test set is selected as the optimized model parameter group.
Fig. 8 shows a schematic diagram of a terminal touch product according to the present disclosure. As shown in Fig. 8, the terminal touch product comprises a touch interface 20', one or more vibration sensors 30' and a controller 60. The vibration sensor 30' can be arranged on the lower surface of the touch interface 20', or at any other position in the terminal touch product. The vibration sensor 30' senses the tap operation performed on the touch interface 20' and generates a vibration signal. The touch interface 20' senses the tap operation to obtain a number of touch sensing values, which are converted into an image. The controller 60 is coupled to the touch interface 20' and the vibration sensor 30', and receives the image formed by the touch sensing values generated by the touch interface 20' and the vibration signal generated by the vibration sensor 30'. The controller 60 can also convert the touch sensing values from the touch interface 20' into an image itself.
The controller 60 performs classification prediction on the tap events the user performs on the touch interface 20', to obtain a predicted tap type. For example, the controller 60 is deployed with a deep neural network identical or corresponding to the deep neural network used in steps S61 to S64, and stores the optimized weight parameter group obtained in step S64; the corresponding deep neural network and the optimized weight parameter group constitute a model. The controller 60 inputs the vibration signal from the vibration sensor 30' and the image formed by the touch sensing values from the touch interface 20' into the model to obtain the predicted tap type. In this way, the terminal touch product realizes classification prediction of tap events.
In one embodiment, the controller 60 can be any controller in the terminal touch product. In another embodiment, the controller 60 is integrated into a touch chip; that is, the touch chip of the terminal touch product not only senses the user's touch operations but also predicts the user's tap type. Specifically, the program code corresponding to the deep neural network and the optimized weight parameter group can be written into the firmware of the touch chip, so that the touch chip can predict the type of a tap event at the driver stage.
Referring to Fig. 8 in conjunction with Fig. 6, the method further comprises the following steps.
Step S65: deploy the deep neural network and the optimized weight parameter group as a model in a terminal touch product. In this step, the terminal touch product carries a prediction model comprising a deep neural network identical or corresponding to the deep neural network used in steps S61 to S64, together with the optimized weight parameter group obtained in step S64.
Step S66: receive the vibration signal produced by a tap operation performed on the terminal touch product and the image of the touch sensing values from the terminal touch product, and input the vibration signal of the tap operation and the image from the terminal touch product into the model to obtain a predicted tap type. In this step, when the user taps the terminal touch product, the vibration sensor 30' in the terminal touch product measures a vibration signal, the touch interface 20' generates touch sensing values which are converted into an image, and the vibration signal and the image are input into the model, so that the tap type can be predicted.
Step S67: according to the predicted tap type, execute a predetermined operation corresponding to the predicted tap type. In this step, the controller 60 can send the predicted tap type to software running on the operating system, and the software executes the operation corresponding to the prediction result.
In one illustrative scenario, a large touch display product is equipped with annotation software. For example, when the user performs a one-finger tap on the product surface, the annotation software opens or closes the main menu accordingly; on a two-finger tap, the software changes the brush color; on a three-finger tap, it changes the pen-tip thickness. In another illustrative scenario, a 90-degree tap opens or closes the main menu, while a 45-degree tap highlights menu items so that the user can select multiple items or perform text selection. In yet another example, while a movie or music is playing, tapping once on the side of the touchpad pauses playback, and tapping twice resumes it.
The present disclosure adopts deep learning, using a deep neural network to learn the classification of the various tap events performed on a touch panel and thereby obtain a prediction model. This prediction model is deployed on a terminal touch product, so the terminal touch product can predict the tap actions made by the user, and software can map these tap types to different applications, which greatly improves applicability. Moreover, the present disclosure uses both the vibration signal produced by a tap and the image formed by the touch sensing values to predict the tap type, which effectively improves the prediction accuracy.
The present disclosure has been disclosed above with preferred embodiments, but they do not limit the present disclosure. Those with ordinary skill in the art to which the present disclosure belongs may make various changes and modifications without departing from the spirit and scope of the present disclosure; the scope of protection of the present disclosure is therefore defined by the appended claims.

Claims (20)

1. A method for recognizing tap events on a touch panel, wherein the method comprises:
sensing various tap events performed on a touch panel and measuring a number of vibration signals, and recording the types of these tap events as classification labels;
taking the vibration signal of a tap event together with its classification label as one sample, and generating a sample set comprising a number of such samples;
feeding the samples of the sample set as input, with a freely chosen weight parameter group as adjustable parameters, into a deep neural network for training, and adjusting the weight parameter group using the backpropagation algorithm;
reading the samples of the sample set in mini-batches and training the deep neural network to fine-tune the weight parameter group, so as to determine an optimized weight parameter group;
deploying the deep neural network and the optimized weight parameter group as a model in a terminal touch product;
sensing the vibration signal produced by a tap operation performed on the terminal touch product, and generating an image formed from a number of touch sensing values corresponding to the tap operation; and
inputting the vibration signal of the tap operation into the model to obtain a first result, analyzing the image formed from the touch sensing values to obtain a second result, and obtaining a predicted tap type according to the first result and the second result.
2. The method according to claim 1, wherein after the step of obtaining the predicted tap type, the method further comprises:
according to the predicted tap type, executing a predetermined operation corresponding to the predicted tap type.
3. The method according to claim 2, wherein the predetermined operation is selected from the group consisting of switching a menu, changing a brush color and changing a brush thickness.
4. The method according to claim 1, wherein the tap type is distinguished according to at least one of the number of taps, the number of fingers tapping simultaneously, the part of the finger used for tapping, and the tapping angle.
5. The method according to claim 1, wherein the deep neural network comprises a convolutional neural network.
6. A method for recognizing tap events on a touch panel, wherein the method comprises:
sensing various tap events performed on a touch panel and measuring a number of vibration signals, converting the touch sensing values obtained by the touch panel for each tap event into an image, and recording the types of these tap events as classification labels;
taking the vibration signal, the image and the classification label of a tap event as one sample, and generating a sample set comprising a number of such samples;
feeding the samples of the sample set as input, with a freely chosen weight parameter group as adjustable parameters, into a deep neural network for training, and adjusting the weight parameter group using the backpropagation algorithm; and
reading the samples of the sample set in mini-batches and training the deep neural network to fine-tune the weight parameter group, so as to determine an optimized weight parameter group.
7. The method according to claim 6, wherein the step of training the deep neural network comprises:
inputting the features of the vibration signal of a tap event into a first sub-network, the first sub-network being a one-dimensional sub-network;
inputting the features of the image of the touch sensing values of the tap event into a second sub-network, the second sub-network being a two-dimensional sub-network;
inputting the output of the first sub-network and the output of the second sub-network into a flatten layer for flattening, which converts the outputs of the first sub-network and the second sub-network into one-dimensional data; and
inputting the output of the flatten layer into a multilayer perceptron to perform the classification prediction of the tap event.
8. The method according to claim 7, wherein the first sub-network and the second sub-network comprise a convolutional neural network.
9. The method according to claim 6, wherein the step of converting the touch sensing values into an image comprises:
converting the touch sensing values in a local region containing the position where the tap event occurred into the image.
10. The method according to claim 6, wherein the method further comprises:
deploying the deep neural network and the optimized weight parameter group as a model in a terminal touch product; and
receiving the vibration signal produced by a tap operation performed on the terminal touch product and the image of the touch sensing values from the terminal touch product, and inputting the vibration signal of the tap operation and the image from the terminal touch product into the model, so as to obtain a predicted tap type.
11. The method according to claim 10, wherein after the step of obtaining the predicted tap type, the method further comprises:
according to the predicted tap type, executing a predetermined operation corresponding to the predicted tap type.
12. The method according to claim 11, wherein the predetermined operation is selected from the group consisting of switching a menu, changing a brush color and changing a brush thickness.
13. The method according to claim 6, wherein the tap type is distinguished according to at least one of the number of taps, the number of fingers tapping simultaneously, the part of the finger used for tapping, and the tapping angle.
14. A system for recognizing tap events on a touch panel, wherein the system comprises:
a touch panel;
a vibration sensor disposed together with the touch panel, wherein the vibration sensor senses various tap events performed on the touch panel and measures a number of vibration signals, the touch panel senses a number of touch sensing values produced by each tap event, and these touch sensing values are converted into an image;
a processor coupled to the vibration sensor and the touch panel, the processor receiving the vibration signals transmitted by the vibration sensor and the images transmitted by the touch panel; and
a memory connected to the processor, the memory containing program instructions executable by the processor, the processor executing the program instructions so as to perform a method, the method comprising:
recording the types of the tap events in the memory as classification labels;
taking the vibration signal, the image and the classification label of a tap event as one sample, and generating a sample set comprising a number of such samples;
feeding the samples of the sample set as input, with a freely chosen weight parameter group as adjustable parameters, into a deep neural network for training, and adjusting the weight parameter group using the backpropagation algorithm; and
reading the samples of the sample set in mini-batches and training the deep neural network to fine-tune the weight parameter group, so as to determine an optimized weight parameter group.
15. The system according to claim 14, wherein the deep neural network comprises:
a first sub-network receiving the features of the vibration signal of a tap event, the first sub-network being a one-dimensional sub-network;
a second sub-network receiving the features of the image of the touch sensing values of the tap event, the second sub-network being a two-dimensional sub-network;
a flatten layer for flattening the output of the first sub-network and the output of the second sub-network, converting the outputs of the first sub-network and the second sub-network into one-dimensional data; and
a multilayer perceptron receiving the output of the flatten layer and performing the classification prediction of the tap event, to obtain a predicted classification label.
16. The system according to claim 15, wherein the first sub-network and the second sub-network comprise a convolutional neural network.
17. The system according to claim 14, wherein the method further comprises:
converting the touch sensing values in a local region containing the position where the tap event occurred into the image.
18. The system according to claim 14, wherein the tap type is distinguished according to at least one of the number of taps, the number of fingers tapping simultaneously, the part of the finger used for tapping, and the tapping angle.
19. A terminal touch product, wherein the product comprises:
a touch interface;
a vibration sensor disposed together with the touch interface, the vibration sensor sensing the vibration signal produced by a tap operation performed on the touch interface, the touch interface sensing the tap operation to obtain a number of touch sensing values, these touch sensing values being converted into an image; and
a controller coupled to the vibration sensor and the touch interface, the controller being deployed with a deep neural network corresponding to the deep neural network in claim 6, the controller taking the corresponding deep neural network and the optimized weight parameter group obtained according to claim 6 as a model, and inputting the vibration signal corresponding to the tap operation and the image from the touch interface into the model, so as to obtain a predicted tap type.
20. The terminal touch product according to claim 19, wherein the controller is further configured to execute, according to the predicted tap type, a predetermined operation corresponding to the predicted tap type.
CN201810330628.6A 2018-04-13 2018-04-13 Method and system for recognizing tap events on a touch panel, and terminal touch product Active CN110377175B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201810330628.6A | 2018-04-13 | 2018-04-13 | Method and system for recognizing tap events on a touch panel, and terminal touch product (granted as CN110377175B)

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201810330628.6A | 2018-04-13 | 2018-04-13 | Method and system for recognizing tap events on a touch panel, and terminal touch product (granted as CN110377175B)

Publications (2)

Publication Number | Publication Date
CN110377175A (application) | 2019-10-25
CN110377175B (granted) | 2023-02-03

Family

ID=68243332

Family Applications (1)

Application Number | Status | Priority Date | Filing Date | Title
CN201810330628.6A | Active | 2018-04-13 | 2018-04-13 | Method and system for recognizing tap events on a touch panel, and terminal touch product (CN110377175B)

Country Status (1)

Country Link
CN (1) CN110377175B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2804014A1 * 2010-06-28 2012-01-12 Cleankeys Inc. Method for detecting and locating keypress-events on touch- and vibration-sensitive flat surfaces
CN105431799A * 2013-08-02 2016-03-23 Qeexo Co. (齐科斯欧公司) Capture of vibro-acoustic data used to determine touch types
US20170330054A1 * 2016-05-10 2017-11-16 Baidu Online Network Technology (Beijing) Co., Ltd. Method And Apparatus Of Establishing Image Search Relevance Prediction Model, And Image Search Method And Apparatus
CN106779064A * 2016-11-25 2017-05-31 University of Electronic Science and Technology of China (电子科技大学) Deep neural network self-training method based on data characteristics

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
肖圣龙 (Xiao Shenglong) et al.: "Distributed neural-network classification method for attack behaviors in social security events", 《计算机应用》 (Journal of Computer Applications) *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210117025A1 (en) * 2019-10-18 2021-04-22 Acer Incorporated Electronic apparatus and object information recognition method by using touch data thereof
US11494045B2 (en) * 2019-10-18 2022-11-08 Acer Incorporated Electronic apparatus and object information recognition method by using touch data thereof
CN116027953A (en) * 2022-08-15 2023-04-28 荣耀终端有限公司 Finger joint touch operation identification method, electronic equipment and readable storage medium
CN116450026A (en) * 2023-06-16 2023-07-18 荣耀终端有限公司 Method and system for identifying touch operation
CN116450026B (en) * 2023-06-16 2023-10-20 荣耀终端有限公司 Method and system for identifying touch operation

Also Published As

Publication number Publication date
CN110377175B (en) 2023-02-03

Similar Documents

Publication Publication Date Title
TWI654541B (en) Method and system for identifying tapping events on a touch panel, and terminal touch products
US8527908B2 (en) Computer user interface system and methods
Zhang et al. Data augmentation and dense-LSTM for human activity recognition using WiFi signal
Ling et al. Ultragesture: Fine-grained gesture sensing and recognition
Qifan et al. Dolphin: Ultrasonic-based gesture recognition on smartphone platform
US9304593B2 (en) Behavior recognition system
CN110377175A Method and system for recognizing tap events on a touch panel, and terminal touch product
CN110163082A (en) A kind of image recognition network model training method, image-recognizing method and device
CN102687161B (en) Selective motor control classification
KR20190085890A (en) Method and apparatus for gesture recognition
US20110043475A1 (en) Method and system of identifying a user of a handheld device
CN108629170A (en) Personal identification method and corresponding device, mobile terminal
CN110414306B (en) Baby abnormal behavior detection method based on meanshift algorithm and SVM
CN109993229A (en) A kind of serious unbalanced data classification method
TW201918866A (en) Method and system for classifying tap events on touch panel, and touch panel product
CN108960430A (en) The method and apparatus for generating personalized classifier for human body motor activity
CN115376518B (en) Voiceprint recognition method, system, equipment and medium for real-time noise big data
Lin et al. WiWrite: An accurate device-free handwriting recognition system with COTS WiFi
CN107609501A (en) The close action identification method of human body and device, storage medium, electronic equipment
Cai et al. Mobile authentication through touch-behavior features
KR102231511B1 (en) Method and apparatus for controlling virtual keyboard
CN109753172A Method and system for classifying tap events on a touch panel, and touch panel product
Zhou et al. Enabling non-intrusive occupant activity modeling using WiFi signals and a generative adversarial network
CN114091611A (en) Equipment load weight obtaining method and device, storage medium and electronic equipment
CN109308133A (en) Intelligent interaction projects interaction technique

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant