WO2024001501A1 - Method for identifying knuckle operations and electronic device - Google Patents

Method for identifying knuckle operations and electronic device (指关节操作的识别方法及电子设备)

Info

Publication number
WO2024001501A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch
feature
signal
features
knuckle
Prior art date
Application number
PCT/CN2023/091715
Other languages
English (en)
French (fr)
Inventor
王小晨
张�成
万努梁
Original Assignee
荣耀终端有限公司 (Honor Device Co., Ltd.)
Application filed by Honor Device Co., Ltd. (荣耀终端有限公司)
Publication of WO2024001501A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/21 - Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/213 - Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G06F18/24 - Classification techniques
    • G06F18/243 - Classification techniques relating to the number of classes
    • G06F18/2433 - Single-class perspective, e.g. one-against-all classification; Novelty detection; Outlier detection
    • G06F18/25 - Fusion techniques
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 - Control or interface arrangements specially adapted for digitisers
    • G06F3/0418 - Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 - Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Definitions

  • This application relates to the technical field of artificial intelligence (AI), and in particular to a method for identifying finger joint operations and an electronic device.
  • As a human-computer interaction device for electronic equipment, touch panels (TP) are increasingly widely used.
  • knuckles are a common way for users to interact with touch screens.
  • When the user touches the touch screen with a knuckle, the electronic device will detect a change in the signal at the touch point. The electronic device can then recognize the knuckle interaction based on the changing signal, and perform shortcut functions such as screenshots, screen recording and window switching based on the contact time of the knuckle with the touch screen, the sliding distance of the knuckle on the touch screen, and so on.
  • However, the traditional knuckle touch detection algorithm has insufficient generalization performance, resulting in a low recognition rate for knuckle interactions.
  • In addition, the traditional knuckle touch detection algorithm has insufficient anti-spoofing capability, so non-knuckle interactions cannot be filtered out effectively, resulting in a high false-touch rate. Therefore, how to optimize the knuckle touch detection algorithm has become an urgent technical problem to be solved.
  • This application provides a method for identifying knuckle operations and an electronic device, which solve the problems of the low recognition rate and high false-touch rate of traditional knuckle touch detection algorithms and improve the recognition of knuckle operations.
  • embodiments of the present application provide a method for identifying finger joint operations.
  • the method includes:
  • In response to the touch operation, the acceleration (ACC) signal, the capacitance (CAP) signal and the touch signal are obtained.
  • the ACC signal is the original signal collected by the ACC sensor
  • the CAP signal is the original signal collected by the CAP sensor
  • the touch signal is the signal obtained after processing the CAP signal;
  • the ACC feature is the feature associated with the finger joint operation in the ACC signal.
  • the score feature is the feature associated with the finger joint operation in the CAP signal.
  • the touch feature is the feature associated with the knuckle operation in the touch signal;
  • the fused features are input into the finger joint classification model to obtain a target classification result.
  • the target classification result indicates that the touch operation is a finger joint operation or a non-finger joint operation.
  • the embodiment of the present application proposes to extract touch features from the touch signal to determine the contact area and contact position, and to extract ACC features from the ACC signal to determine the strength of the force acting on the touch screen.
  • the score feature, used to characterize the correlation with knuckles, is extracted from the CAP signal. Then the ACC features, score feature and touch features are fused and input into the knuckle classification model, so as to obtain better classification results, improve the recognition rate of knuckle operations, and reduce the false touch rate of non-knuckle operations, as sketched below.
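  • As a concrete illustration of this fusion step, the minimal sketch below concatenates the three feature groups into one vector before classification. All names and dimensions are assumptions for illustration; the patent does not specify the exact feature layout.

```python
import numpy as np

# Hypothetical feature values; the dimensions are illustrative only.
acc_features = np.array([0.7, 0.2, 0.9])   # e.g. maxgradient, amplitude, zerocrosscnt
score_feature = np.array([0.92])           # CAP two-classification model output
touch_features = np.array([21.0, 0.13])    # e.g. coordinate grid number, contact area

# Feature fusion: concatenate into a single vector for the knuckle
# classification model (a fully connected network in this application).
fused = np.concatenate([acc_features, score_feature, touch_features])
# target_result = knuckle_classifier(fused)  # -> knuckle / non-knuckle
```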
  • In a possible implementation, the confidence feature is a feature extracted from the touch signal when the acceleration feature indicates that the touch operation is a knuckle operation. It should be understood that when the acceleration feature indicates that the touch operation is a non-knuckle operation, the electronic device does not need to extract the confidence feature from the touch signal, nor does it need to perform subsequent feature fusion and other operations.
  • ACC features, score features and touch features are extracted, including:
  • the CAP signal is input into the CAP two-classification model to obtain the score feature and extract the touch feature from the touch signal.
  • ACC features, score features and touch features are extracted, including:
  • the CAP signal is input into the CAP two-classification model to obtain the score feature.
  • A machine screening method is used instead of the manual screening method, thereby improving the efficiency of ACC feature screening and the filtering effect on knuckle signals.
  • the touch characteristics include at least one of contact position characteristics and contact area characteristics.
  • the contact position feature is used to represent the interactive position of the body part on the touch screen
  • the contact area feature is used to represent the area of contact between the body part and the touch screen.
  • the contact position feature is used to represent the coordinate grid number of the grid where the touch point is located.
  • the grid where the touch point is located is at least one grid in the grid list obtained by dividing the touch screen according to the resolution of the touch screen.
  • the method of using coordinates to represent the touch location has problems such as a large amount of calculation and easy leakage of user privacy.
  • By using grid features, the approximate position of the touch point on the touch screen can be determined while reducing the amount of calculation and protecting user privacy, and it can then be determined whether the operation is a knuckle operation.
  • the grid list includes p-row and q-column grids.
  • the length of each grid in the grid list is equal to the number of pixels on the vertical axis of the touch screen divided by p.
  • the width of each grid in the grid list is equal to the number of pixels on the horizontal axis of the touch screen divided by q.
  • p and q are positive integers.
  • extracting the contact location features includes:
  • the touch signal determine the X-axis coordinate and Y-axis coordinate of the touch point
  • the X-axis is the horizontal direction of the plane where the touch screen is located
  • the Y-axis is the vertical direction of the plane where the touch screen is located.
  • the ACC feature includes at least one of the following: a maximum first-order difference feature, a signal amplitude feature, a front-end zero-crossing number feature, a maximum high-pass value feature, a mean-domain sum of absolute values feature, a front-end normalized value variance feature, a front-end normalized value amplitude feature, a fast Fourier transform mean feature, and a partial fast Fourier transform mean feature.
  • the score feature is score.
  • the score score is used to indicate the degree of correlation between the CAP signal and the knuckle operation.
  • the CAP binary classification model is a convolutional neural network model.
  • the ACC two-classification model is a fully connected neural network model.
  • the finger joint classification model is a fully connected neural network model.
  • the method also includes:
  • the target classification result indicates that the touch operation is a knuckle operation
  • determine the knuckle gesture to which the touch operation belongs and execute a response function corresponding to the knuckle gesture.
  • different knuckle gestures correspond to different response functions.
  • the knuckle gesture includes at least one of the following: a knuckle double-click gesture, a gesture of tapping a knuckle and drawing a circle, a gesture of tapping a knuckle and drawing the letter S, a gesture of sliding three knuckles from top to bottom along the screen, a double-click gesture with two knuckles, and a gesture of tapping a knuckle and drawing a straight line across the middle of the screen.
  • the response function corresponding to the knuckle double-click gesture is a full-screen capture function.
  • the response function corresponding to the gesture of tapping the knuckles and drawing the letter S is a scrolling screenshot function.
  • the response function corresponding to the gesture of sliding three knuckles from top to bottom along the screen is a sliding screenshot function.
  • the response function corresponding to the double-click gesture with two knuckles is a start/stop screen recording function.
  • the response function corresponding to the gesture of tapping with knuckles and drawing a straight line in the middle of the screen is a split-screen function.
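  • A minimal sketch of how such a gesture-to-function mapping could be dispatched is shown below; the gesture keys and handler names are hypothetical placeholders, not identifiers from the patent.

```python
# Placeholder response functions standing in for the system services.
def full_screen_capture(): print("full-screen capture")
def scrolling_screenshot(): print("scrolling screenshot")
def sliding_screenshot(): print("sliding screenshot")
def toggle_screen_recording(): print("start/stop screen recording")
def split_screen(): print("split screen")

# Different knuckle gestures correspond to different response functions.
RESPONSES = {
    "knuckle_double_click": full_screen_capture,
    "knuckle_tap_draw_letter_s": scrolling_screenshot,
    "three_knuckles_swipe_down": sliding_screenshot,
    "two_knuckles_double_click": toggle_screen_recording,
    "knuckle_tap_draw_line_mid_screen": split_screen,
}

def respond(gesture: str) -> None:
    handler = RESPONSES.get(gesture)
    if handler is not None:
        handler()

respond("knuckle_double_click")  # -> full-screen capture
```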
  • the present application provides an identification device, which includes a unit/module for executing the method of the first aspect.
  • the device may correspond to performing the method described in the first aspect.
  • the units/modules in the device please refer to the description in the first aspect. For the sake of brevity, they will not be described again here.
  • A third aspect provides an electronic device, including a processor coupled to a memory, where the processor is configured to execute the computer program or instructions stored in the memory, so that the electronic device implements the method for identifying knuckle operations in any one of the first aspect.
  • a fourth aspect provides a chip, which is coupled to a memory, and is used to read and execute a computer program stored in the memory to implement the method for identifying finger joint operations as in any one of the first aspect.
  • a computer-readable storage medium stores a computer program.
  • When the computer program is run on an electronic device, the electronic device is caused to execute the method for identifying knuckle operations in any one of the first aspect.
  • a computer program product is provided.
  • the computer program product When the computer program product is run on a computer, it causes the computer to perform the method for identifying finger joint operations as in any one of the first aspect.
  • Figure 1 is a schematic diagram of a touch screen provided with an accelerometer according to an embodiment of the present application
  • Figure 2 is a schematic diagram of the IC electrode arrangement of a mutual capacitive touch screen provided by an embodiment of the present application
  • Figure 3 is a schematic diagram of the working principle of a mutual capacitive touch screen provided by an embodiment of the present application.
  • Figure 4 is a schematic diagram of a node matrix corresponding to a CAP signal provided by an embodiment of the present application
  • Figure 5 is a schematic flowchart of a finger joint touch detection algorithm provided by an embodiment of the present application.
  • Figure 6 is a schematic flowchart of another finger joint touch detection algorithm provided by an embodiment of the present application.
  • Figure 7 is a schematic diagram of manual screening of ACC features using threshold 1 provided by the embodiment of the present application.
  • Figure 8 is a schematic diagram of manual screening of ACC features using threshold 2 provided by the embodiment of the present application.
  • Figure 9 is a schematic diagram of manual screening of ACC features using threshold 3 provided by the embodiment of the present application.
  • Figure 10 is a schematic diagram of manual screening of ACC features using threshold 4 provided by the embodiment of the present application.
  • Figure 11 is a front view of the knuckle of the index finger tapping the upper left part of the screen according to an embodiment of the present application
  • Figure 12 is a perspective view of the knuckle of the index finger tapping the upper left part of the screen according to an embodiment of the present application
  • Figure 13 is a front view of the knuckle of the index finger tapping the upper right part of the screen according to an embodiment of the present application;
  • Figure 14 is a perspective view of the knuckle of the index finger tapping the upper right part of the screen according to an embodiment of the present application;
  • Figure 15 is a front view of the knuckle of the index finger tapping the lower left part of the screen according to an embodiment of the present application
  • Figure 16 is a perspective view of the knuckle of the index finger tapping the lower left part of the screen according to an embodiment of the present application
  • Figure 17 is a front view of the knuckle of the index finger tapping the lower right part of the screen according to an embodiment of the present application
  • Figure 18 is a perspective view of the knuckle of the index finger tapping the lower right part of the screen according to an embodiment of the present application
  • Figure 19 is a schematic diagram of the changing trend of an ACC signal provided by an embodiment of the present application.
  • Figure 20 is a schematic diagram of another change trend of ACC signals provided by an embodiment of the present application.
  • Figure 21 is a schematic flowchart of an improved method for identifying finger joint operations provided by an embodiment of the present application.
  • Figure 22 is a schematic diagram of a set of 7*7 node matrices corresponding to finger joints, fingertips and finger pulps provided by an embodiment of the present application;
  • Figure 23 is a schematic diagram of a grid list provided by an embodiment of the present application;
  • Figure 24 is a schematic diagram of feature fusion provided by the embodiment of the present application.
  • Figure 25 is a schematic diagram of the CAP two-classification model provided by the embodiment of the present application.
  • Figure 26 is a schematic diagram of the ACC two-classification model provided by the embodiment of the present application.
  • Figure 27 is a schematic diagram of a finger joint classification model provided by an embodiment of the present application.
  • Figure 28 is a schematic diagram of the software structure of the electronic device provided by the embodiment of the present application.
  • Figure 29 is a schematic diagram of the identification device provided by the embodiment of the present application.
  • Figure 30 is a schematic diagram of the hardware structure of an electronic device provided by an embodiment of the present application.
  • The terms first and second in the description and claims of this application are used to distinguish different objects, or to distinguish different processes on the same object, rather than to describe a specific order of objects.
  • a first threshold, a second threshold, etc. are used to distinguish different thresholds, rather than to describe a specific order of thresholds.
  • “multiple” refers to two or more than two.
  • A touch screen, also known as a touch panel, is an inductive liquid crystal display device that can receive input signals from a stylus or fingers.
  • a touch screen consists of a touch detection component and a touch screen controller.
  • the touch detection component is installed on the upper layer of the liquid crystal display device and is used to detect touch information such as the user's touch position, and transmit the detected touch information to the touch screen controller.
  • the touch screen controller is used to process the touch information and send corresponding signals to the processor. Then, after processing the signal, the processor performs corresponding response actions, such as switching on and off, opening applications, taking images, taking screenshots, or switching windows, etc.
  • the first type of original signal is the motion signal collected by the ACC sensor, impact sensor, vibration sensor, etc. when the surface of the touch screen generates mechanical vibration under the action of mechanical force exerted by the user.
  • This motion signal can be used to measure the amount of mechanical force exerted by the user on the touch screen.
  • Under the action of a mechanical force, the electronic device may switch from a stationary state to a moving state, or from a moving state to a stationary state. Therefore, the vibration signal can also be used to characterize the moving/stationary state of the electronic device under the action of mechanical force.
  • the touch screen of the electronic device is provided with an accelerometer, such as a linear variable differential transformer (LVDT).
  • the accelerometer can be composed of an ACC sensor, a support, a potentiometer, a spring and a housing.
  • the ACC signal collected by the accelerometer can carry acceleration data and/or angular velocity data, etc.
  • the acceleration data can be used to characterize the magnitude of linear acceleration
  • the angular velocity data can be used to characterize the magnitude of angular acceleration.
  • the acceleration data can be the magnitude of the acceleration of the electronic device on the X-axis, Y-axis and Z-axis respectively.
  • the X-axis is the horizontal direction of the plane where the touch screen is located
  • the Y-axis is the vertical direction of the plane where the touch screen is located
  • the Z-axis is the vertical direction running through the touch screen.
  • If the change in acceleration in each direction is zero, or approaches zero, the electronic device can be determined to be in a stationary state. If the change in acceleration in any direction is not zero, or does not approach zero, for example, the change in acceleration on the X-axis, Y-axis or Z-axis is greater than 0.1g, it can be determined that the electronic device is in motion.
  • Angular velocity data is the magnitude of the angular velocity of the electronic device around the X, Y, and Z axes. If the change in angular velocity in each direction is zero, or approaches zero, for example, the change in angular velocity around the X-axis, Y-axis, and Z-axis is less than 3 rad/s, the electronic device is determined to be in a stationary state. If the change in angular velocity in any direction is not zero, or does not approach zero, such as the change in angular velocity of the X-axis, Y-axis, and Z-axis is greater than 3 rad/s, it can be determined that the electronic device is in motion.
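  • The sketch below applies the example thresholds above (a 0.1g change in acceleration, a 3 rad/s change in angular velocity) to decide between the moving and stationary states; it is a minimal sketch under those example values, with illustrative function and parameter names.

```python
G = 9.81  # standard gravity, m/s^2

def is_moving(d_acc: tuple[float, float, float],
              d_gyro: tuple[float, float, float]) -> bool:
    """Moving if the change on any axis exceeds either example threshold."""
    acc_moving = any(abs(a) > 0.1 * G for a in d_acc)   # > 0.1 g on X, Y or Z
    gyro_moving = any(abs(w) > 3.0 for w in d_gyro)     # > 3 rad/s around X, Y or Z
    return acc_moving or gyro_moving

print(is_moving((0.0, 0.05 * G, 0.0), (0.1, 0.0, 0.2)))  # False: stationary
```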
  • the ACC signal can be used to measure the magnitude of the mechanical force.
  • the second type of original signal is the TP signal collected by CAP sensors, piezoelectric sensors, and piezoresistive sensors.
  • the TP signal is associated with the touch characteristics of the user's body part that contacts the touch screen, and is used to determine the touch event type.
  • the CAP sensor can be an integrated circuit (IC) chip that collects CAP signals.
  • IC chips may be composed of IC electrodes.
  • FIG. 2 shows a schematic diagram of the IC electrode arrangement of a mutual capacitive touch screen.
  • the mutual capacitive touch screen etches different ITO conductive circuit modules on two layers of indium-tin-oxide (ITO) conductive glass.
  • Two layers of conductive circuit modules are perpendicular to each other on the display of the touch screen, forming horizontal electrodes and vertical electrodes.
  • the horizontal electrodes and vertical electrodes can be regarded as sliders that continuously change in the X-axis and Y-axis directions. Since the transverse electrodes and the longitudinal electrodes are located on different surfaces, a capacitive node will be formed where the two sets of electrodes intersect.
  • One slider can be considered a drive line and the other slider can be considered a detection line.
  • FIG. 3 shows a schematic diagram of the working principle of a mutual capacitive touch screen.
  • the controller of the mutual capacitive touch screen periodically sends excitation signals, also called drive signals, to the lateral electrodes in sequence.
  • the longitudinal electrodes obtain response signals, so that the capacitance values at the intersection points of all horizontal electrodes and vertical electrodes can be obtained, that is, the capacitance size of the two-dimensional plane of the entire mutual capacitive touch screen can be obtained.
  • When a finger touches a mutual capacitive touch screen, it causes coupling between the two electrodes near the touch point, which is equivalent to introducing a new capacitance between the two electrodes, thereby changing the charge measured by the longitudinal electrode and causing the capacitance value between the two electrodes to change.
  • the CAP signal is the response signal, that is, the capacitance values of all capacitance nodes can be determined based on the CAP signal.
  • the IC chip can extract the CAP signal from the response signal based on the touch point.
  • The CAP signal is used to indicate the capacitance value of each capacitance node in a node matrix.
  • the node matrix may be composed of m rows and n columns of capacitor nodes.
  • the capacitance value of the center node of the node matrix is the largest among the capacitance values of all capacitance nodes indicated by the response signal. That is, the capacitance node corresponding to the maximum capacitance value is the center node of the node matrix.
  • That is, the node matrix is drawn with the capacitance node corresponding to the maximum capacitance value as its center.
  • m and n can be set according to the identification method, the number of capacitive nodes of the touch screen, etc., and are not limited in the embodiments of this application. m and n are positive integers.
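  • A minimal sketch of extracting such a node matrix is given below, assuming `cap_frame` is a 2-D array of capacitance values for the whole screen and m = n = 7 as in Figure 4; the edge-clamping behaviour is an assumption, since the patent does not describe touches near the screen border.

```python
import numpy as np

def node_matrix(cap_frame: np.ndarray, m: int = 7, n: int = 7) -> np.ndarray:
    """Return the m x n matrix centred on the node with maximum capacitance."""
    r, c = np.unravel_index(np.argmax(cap_frame), cap_frame.shape)
    # Clamp the window so it stays inside the frame near the edges.
    top = min(max(r - m // 2, 0), cap_frame.shape[0] - m)
    left = min(max(c - n // 2, 0), cap_frame.shape[1] - n)
    return cap_frame[top:top + m, left:left + n]
```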
  • FIG. 4 shows a schematic diagram of a node matrix corresponding to a CAP signal.
  • the IC chip collects the CAP signal.
  • the CAP signal contains a set of CAP data.
  • CAP data contains the capacitance value of each capacitance node in the 7*7 node matrix.
  • the capacitance value of the center node of the 7*7 node matrix is 1771.
  • the capacitance value of the center node 1771 is the largest among all capacitance nodes on the touch screen.
  • the CAP signal is associated with the touch characteristics of the user's body part that contacts the touch screen.
  • Although the collected CAP signal is associated with the touch characteristics of the user's body part that touches the touch screen, the CAP signal is an original signal that only carries the capacitance information of the capacitance nodes, so it cannot intuitively reflect the touch characteristics when the user's body part comes into contact with the touch screen. Therefore, the electronic device cannot directly determine whether the user's body part (such as a knuckle) touches the touch screen based on the CAP signal. In this case, the electronic device needs to process the CAP signal to obtain the touch signal.
  • the touch signal can intuitively represent the touch characteristics when the user's body part comes into contact with the touch screen.
  • the touch signal can carry touch data, such as: orientation data, used to describe the arc direction of the touch area and the tool area, clockwise from vertical; pressure data, used to describe the pressure exerted by a finger or other tool on the device; size data, used to describe the size of the pointer touch area relative to the maximum detectable size; toolMajor data, used to describe the length of the major axis of an ellipse approximating the tool size; toolMinor data, used to describe the length of the minor axis of an ellipse approximating the tool size; touchMajor data, used to describe the length of the major axis of the ellipse of the touch area at the contact point; touchMinor data, used to describe the length of the minor axis of the ellipse of the touch area at the contact point; x data, used to describe the X-axis coordinate of the pointer; and y data, used to describe the Y-axis coordinate of the pointer.
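  • For reference, these touch data fields can be grouped as in the sketch below (they mirror the pointer properties of an Android MotionEvent); the class itself and the types are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class TouchData:
    orientation: float  # arc direction of the touch/tool area, clockwise from vertical
    pressure: float     # pressure exerted by the finger or other tool
    size: float         # touch area size relative to the maximum detectable size
    toolMajor: float    # major axis of the ellipse approximating the tool size
    toolMinor: float    # minor axis of the ellipse approximating the tool size
    touchMajor: float   # major axis of the touch-area ellipse at the contact point
    touchMinor: float   # minor axis of the touch-area ellipse at the contact point
    x: float            # X-axis coordinate of the pointer
    y: float            # Y-axis coordinate of the pointer
```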
  • the embodiments of this application have introduced three signals: ACC signal, CAP signal and touch signal.
  • the ACC signal is the original signal collected by the ACC sensor, which is used to represent the movement/stationary state of the electronic device under the action of mechanical force.
  • the CAP signal is the original signal collected by the CAP sensor, and the original signal carries the capacitance information of the capacitance node.
  • the touch signal is a signal obtained after processing the CAP signal, and can be used to characterize the touch characteristics when the user's body part comes into contact with the touch screen.
  • the above three signals are all associated with body parts.
  • the signal characteristics of these three signals are different.
  • Electronic devices can use the signal characteristics of these three signals to identify different body parts.
  • The embodiments of this application aim to use these three signals to identify knuckle operations.
  • Figure 5 shows a schematic flow chart of a finger joint touch detection algorithm.
  • the algorithm consists of steps:
  • Step 51 In response to the user's touch operation, obtain the ACC signal, CAP signal and touch signal.
  • Step 52 Perform feature extraction on the ACC signal to obtain ACC features.
  • the ACC feature is a feature in the ACC signal associated with the touch operation of the finger joint, which can be used to determine the strength of the body part acting on the touch screen. It should be understood that the intensity corresponding to different gestures may be different, and the ACC features extracted from the ACC signal may also be different.
  • Step 53 Use preset thresholds to initially screen ACC features and touch signals.
  • the threshold corresponding to the touch signal may be a preset area threshold, and the threshold corresponding to the ACC feature may be a preset feature threshold. It should be noted that the number of ACC features extracted from the ACC signal is one or more. For multiple ACC features, different ACC features may correspond to different preset thresholds.
  • the touch operation received by the electronic device may be a knuckle operation or a non-knuckle operation. Since the ACC features corresponding to knuckle operations and those corresponding to non-knuckle operations may show different distribution patterns, a preset feature threshold can be used to initially screen out some non-knuckle signals. When the ACC feature does not meet the preset feature threshold, it can be directly determined that the touch operation is a non-knuckle operation, so the subsequent steps 54 to 56 need not be performed. When the ACC feature meets the preset feature threshold, since this is only a rough screening of the signal, a non-knuckle operation may still be misjudged as a knuckle operation; therefore, steps 54 to 56 need to be performed to further accurately detect whether the operation is a knuckle operation.
  • the touch signals corresponding to finger joint operations may be different from the touch signals corresponding to non-finger joint operations.
  • For example, the contact area between the finger pad and the touch screen is larger, while the contact area between the knuckle and the touch screen is smaller.
  • the preset area threshold can be used to initially screen some non-finger joint signals.
  • When the contact area of the touch signal is greater than the preset area threshold, it can be directly determined that the touch operation is a non-knuckle operation, so the subsequent steps 54 to 56 need not be performed.
  • For different ACC features, the set thresholds and filtering methods may be different. For example, for some ACC features, a first threshold is set to filter out features smaller than the first threshold; for other ACC features, a second threshold is set to filter out features larger than the second threshold; and for still other ACC features, a third threshold and a fourth threshold are set to filter out features smaller than the third threshold or larger than the fourth threshold, where the third threshold is smaller than the fourth threshold.
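  • These three screening rules can be expressed as in the sketch below; the feature names and threshold values are illustrative placeholders, since the patent does not bind specific rules to specific features.

```python
T1, T2, T3, T4 = 0.2, 0.8, 0.1, 0.9  # placeholder thresholds, with T3 < T4

RULES = {
    "feature_a": lambda v: v >= T1,        # filter out values smaller than T1
    "feature_b": lambda v: v <= T2,        # filter out values larger than T2
    "feature_c": lambda v: T3 <= v <= T4,  # filter out values outside [T3, T4]
}

def passes_preliminary_screening(features: dict) -> bool:
    """Rough screening: any failed rule means the touch is judged non-knuckle."""
    return all(rule(features[name]) for name, rule in RULES.items())
```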
  • Step 54 Input the ACC signal into the ACC two-classification model.
  • the ACC two-classification model can be used to classify the ACC signal for accurate detection. This results in any of the following classification results:
  • One classification result is: the touch operation performed on the touch screen is a knuckle touch operation.
  • Another classification result is: the touch operation performed on the touch screen is a non-knuckle touch operation.
  • If the output result of the ACC two-classification model is: the touch operation performed on the touch screen is a knuckle touch operation, the following step 55 is performed.
  • If the output result is a non-knuckle touch operation, the electronic device does not perform any processing, or continues to determine the specific type of the non-knuckle touch operation, such as a fingertip, finger-pad, fingernail or nail-side operation, so as to perform the response function corresponding to that non-knuckle touch operation.
  • Step 55 Input the CAP signal into the CAP five-classification model to obtain the classification result of the CAP five-classification model.
  • the CAP five-classification model performs five classifications based on the original CAP signal and obtains any of the following classification results:
  • the first classification result is: the touch operation on the touch screen is a knuckle touch operation;
  • the second classification result is: the touch operation on the touch screen is a fingertip touch operation;
  • the third classification result is: the touch operation on the touch screen is a finger-pad touch operation;
  • the fourth classification result is: the touch operation on the touch screen is a fingernail touch operation;
  • the fifth classification result is: the touch operation on the touch screen is a nail-side touch operation.
  • the classification result of the CAP five-classification model is: the touch operation on the touch screen is a touch operation on the knuckles, then the following step 56 is performed.
  • If the classification result of the CAP five-classification model is: the touch operation on the touch screen is a fingertip touch operation, then the response function corresponding to the fingertip touch operation is executed.
  • If the classification result is a finger-pad touch operation, then the response function corresponding to the finger-pad touch operation is executed.
  • If the classification result is a fingernail touch operation, then the response function corresponding to the fingernail touch operation is executed.
  • If the classification result is a nail-side touch operation, then the response function corresponding to the nail-side touch operation is executed.
  • Step 56 Input the CAP signal into the CAP two-classification model to obtain the classification result of the CAP two-classification model.
  • Considering that step 55 above may misjudge the gesture, the electronic device can input the CAP signal into the CAP two-classification model again to obtain the classification result of the CAP two-classification model, thereby improving the recognition accuracy of knuckles and reducing the false touch rate of user operations.
  • the CAP two-classification model performs two classifications based on the original CAP signal and obtains any of the following classification results:
  • If the output result of the CAP two-classification model is: the touch operation on the touch screen is a knuckle touch operation, the electronic device can perform the response function corresponding to the knuckle touch operation.
  • If the output result of the CAP two-classification model is: the touch operation on the touch screen is a non-knuckle touch operation, no processing is performed, or a response function corresponding to the non-knuckle touch operation is performed.
  • Figure 6 shows a schematic flow chart of another finger joint touch detection algorithm.
  • the algorithm consists of steps:
  • Step 61 In response to the user's touch operation, obtain the ACC signal, CAP signal and touch signal.
  • Step 62 Perform feature extraction on the ACC signal to obtain ACC features.
  • the ACC feature is a feature in the ACC signal associated with the touch operation of the finger joint.
  • Step 63 Use preset thresholds to initially screen ACC features and touch signals.
  • the threshold corresponding to the touch signal may be a preset area threshold, and the threshold corresponding to the ACC feature may be a preset feature threshold.
  • step 63 For the specific implementation of step 63, reference may be made to the description of step 53 above, which will not be described again here.
  • Step 64 When the ACC feature meets the preset feature threshold and the touch signal meets the contact area threshold, the original CAP signal is input into the CAP five-classification model to obtain the classification result of the CAP five-classification model.
  • the CAP five-class model classifies signals into five categories and obtains any of the following classification results:
  • the first classification result is: the touch operation on the touch screen is a knuckle touch operation;
  • the second classification result is: the touch operation on the touch screen is a fingertip touch operation;
  • the third classification result is: the touch operation on the touch screen is a finger-pad touch operation;
  • the fourth classification result is: the touch operation on the touch screen is a fingernail touch operation;
  • the fifth classification result is: the touch operation on the touch screen is a nail-side touch operation.
  • If the classification result of the CAP five-classification model is: the touch operation on the touch screen is a knuckle touch operation, then the response function corresponding to the knuckle touch operation is executed.
  • If the classification result is a fingertip touch operation, then the response function corresponding to the fingertip touch operation is executed.
  • If the classification result is a finger-pad touch operation, then the response function corresponding to the finger-pad touch operation is executed.
  • If the classification result is a fingernail touch operation, then the response function corresponding to the fingernail touch operation is executed.
  • If the classification result is a nail-side touch operation, then the response function corresponding to the nail-side touch operation is executed.
  • the ACC two-classification model, CAP two-classification model, and CAP five-classification model in Figures 5 and 6 above are traditional machine learning models or neural network models.
  • Compared with the algorithm provided in Figure 5, the algorithm provided in Figure 6 omits the ACC two-classification model and the CAP two-classification model. Since the algorithm provided in Figure 6 is simpler, it effectively relaxes the detection conditions for knuckles, which improves the recognition rate of knuckles but also increases the false touch rate of non-knuckle operations.
  • the preset thresholds in the above two algorithms are thresholds finally obtained after developers manually set and repeatedly adjusted them during the development stage of the electronic device.
  • this manual screening method relies too much on subjective debugging.
  • the screening effect of manual screening is not ideal.
  • Figures 7 and 8 show schematic diagrams of manual screening of an ACC feature.
  • the dotted-line box a surrounds multiple knuckle feature points (i.e., ACC feature points), and the dotted-line box b surrounds multiple non-knuckle feature points. Among them, the value distributions of the knuckle feature points and the non-knuckle feature points are relatively close.
  • the electronic device will filter out a small number of non-knuckle feature points smaller than threshold 1.
  • the filtering rule is to filter out feature points smaller than threshold 1.
  • a large number of non-finger joint feature points will remain, causing a large number of invalid non-finger joint feature points to be input into the classification model, resulting in inaccurate classification results.
  • threshold 2 is greater than threshold 1
  • the filtering rule is to filter out feature points smaller than threshold 2
  • the electronic device will filter out a large number of non-knuckle feature points smaller than threshold 2, but at the same time the vast majority of knuckle feature points, which are also smaller than threshold 2, will be filtered out. Most of the effective knuckle features are thus lost, so the classification model cannot classify accurately based on enough valid data.
  • Figures 9 and 10 show schematic diagrams of manual screening of another ACC feature.
  • the horizontal axis is used to represent the number of the data feature
  • the vertical axis is used to represent the value of the data feature
  • the dotted box c contains multiple finger joint feature points (i.e., ACC feature points)
  • the dotted line box d surrounds multiple non-knuckle feature points. Among them, the value distributions of finger joint feature points and non-finger joint feature points partially overlap.
  • the electronic device will filter out a small number of non-knuckle feature points smaller than threshold 3.
  • the filtering rule is to filter out feature points smaller than threshold 3.
  • a large number of non-finger joint feature points will remain, causing a large number of invalid non-finger joint feature points to be input into the classification model, resulting in inaccurate classification results.
  • threshold 4 is greater than threshold 3
  • the filtering rule is to filter out feature points smaller than threshold 4
  • the electronic device will filter out a large number of non-knuckle feature points smaller than threshold 4, but at the same time some of the knuckle feature points smaller than threshold 4 will be filtered out. Part of the effective knuckle feature points are thus lost, so the classification model cannot classify accurately based on enough valid data.
  • the manual screening method relies too much on subjective debugging and requires a lot of time to perform threshold testing, which increases the complexity of the work.
  • Because the value distributions of knuckle feature points and non-knuckle feature points are relatively close or partially overlap, even if the threshold is adjusted it is difficult to filter out most of the non-knuckle feature points while retaining most of the knuckle feature points, which makes the final classification result inaccurate.
  • embodiments of the present application improve the two algorithms and provide a new method for identifying finger joint operations.
  • This method cancels the manually set preset thresholds in the above two algorithms and uses the ACC two-classification model to classify ACC features, that is, machine screening is used instead of manual screening.
  • this method adds a CAP two-class model to extract score features from the CAP signal.
  • this method also performs feature fusion on score features, ACC features and touch features to predict the final classification result.
  • The knuckle operation identification method provided by the embodiments of the present application can be applied to various electronic devices.
  • The electronic device can be a mobile phone, a tablet, a wearable device, a vehicle-mounted device, an augmented reality (AR) or virtual reality (VR) device, a laptop, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), a smart screen, or another device equipped with a touch screen.
  • the first part describes the principles of the new finger joint operation identification method provided by the embodiments of the present application.
  • the accelerometer is set in the upper left corner of the touch screen.
  • When the knuckle acts on different touch areas, the accelerometer is stressed in different ways and will collect different ACC data.
  • Figures 11 to 18 show four scenarios in which finger joints act on different touch areas.
  • Figure 11 shows a front view of the knuckle of the index finger tapping the upper left portion of the screen.
  • FIG. 12 shows a perspective view of the knuckle of the index finger tapping the upper left part of the screen.
  • the X-axis is the horizontal direction of the plane where the touch screen is located
  • the Y-axis is the vertical direction of the plane where the touch screen is located
  • the Z-axis is the vertical direction running through the touch screen.
  • the accelerometer located in the upper left part of the touch screen will receive a mechanical force F1 in the negative direction of the Z-axis.
  • FIG. 13 shows a front view of the knuckle of the index finger tapping the upper right part of the screen.
  • FIG. 14 shows a perspective view of the knuckle of the index finger tapping the upper right part of the screen.
  • the X-axis is the horizontal direction of the plane where the touch screen is located
  • the Y-axis is the vertical direction of the plane where the touch screen is located
  • the Z-axis is the vertical direction running through the touch screen.
  • the accelerometer located at the upper left part of the touch screen will obtain a mechanical force F2 in the negative direction of the Z-axis.
  • FIG. 15 shows a front view of the knuckle of the index finger tapping the lower left part of the screen.
  • FIG. 16 shows a perspective view of the knuckle of the index finger tapping the lower left part of the screen.
  • the X-axis is the horizontal direction of the plane where the touch screen is located
  • the Y-axis is the vertical direction of the plane where the touch screen is located
  • the Z-axis is the vertical direction running through the touch screen.
  • the accelerometer located at the upper left part of the touch screen will obtain a mechanical force F3 in the negative direction of the Z-axis.
  • FIG. 17 shows a front view of the knuckle of the index finger tapping the lower right part of the screen.
  • FIG. 18 shows a perspective view of the knuckle of the index finger tapping the lower right part of the screen.
  • the X-axis is the horizontal direction of the plane where the touch screen is located
  • the Y-axis is the vertical direction of the plane where the touch screen is located
  • the Z-axis is the vertical direction running through the touch screen.
  • the accelerometer located at the upper left part of the touch screen will receive a mechanical force F4 in the positive direction of the Z-axis.
  • the accelerometer is mainly affected by mechanical force in the Z-axis direction.
  • Figures 19 and 20 show schematic diagrams of the change trends of the two ACC signals. Among them, the horizontal axis is used to represent time, and the vertical axis is used to represent the value of ACC data.
  • As shown in Figure 19, the value of the ACC data is basically maintained at 2500 before the touch. After the knuckle acts on the upper or left area of the touch screen, such as the upper left area, the upper right area or the lower left area, from the time corresponding to the 108th frame of data to the time corresponding to the 128th frame of data, the value of the ACC data undergoes a significant decline.
  • the value of the ACC data is basically maintained at 3000. After the knuckles act on the lower right area of the touch screen, from the moment corresponding to the 114th frame of data to the moment corresponding to the 128th frame of data, the value of the ACC data will rise significantly.
  • the embodiment of the present application proposes to extract touch features from the touch signal to determine the contact area and contact position, and to extract ACC features from the ACC signal to determine the strength of the force acting on the touch screen.
  • a CAP binary classification model is pre-installed to extract the score feature used to characterize the correlation with the knuckles from the CAP signal.
  • ACC features, score features and touch features are fused to obtain better classification results.
  • Figure 21 is a schematic flowchart of an improved method for identifying finger joint operations provided by an embodiment of the present application. This method can be applied to the scene of identifying the interaction mode of finger joints.
  • the execution subject of this method may be an electronic device or a functional module in the electronic device.
  • the screen of the electronic device is a touch screen, and an ACC sensor and a CAP sensor are provided on the touch screen.
  • the method may include S1 to S9 described below.
  • the sensor of the electronic device will detect the touch operation. Since the touch operation may be a finger joint operation or a non-finger joint operation, the electronic device can use the following S2 to S9 to determine whether it is a finger joint operation to perform a corresponding response operation.
  • Sensors in electronic devices can collect data periodically.
  • the electronic device can, in response to the touch operation, acquire the ACC signal collected by the ACC sensor, collect the CAP signal through the CAP sensor, and process the CAP signal to obtain the touch signal.
  • the ACC signal is the original signal collected by the ACC sensor, which can be used to characterize the movement/stationary state of the electronic device under the action of mechanical force.
  • the CAP signal is the original signal collected by the CAP sensor, and the original signal carries the capacitance information of the capacitance node.
  • the touch signal is a signal obtained after processing the CAP signal, and can be used to characterize the touch characteristics when the user's body part comes into contact with the touch screen.
  • ACC signal, CAP signal and touch signal can be time domain signals or frequency domain signals.
  • the ACC feature is a feature in the ACC signal associated with the touch operation of the finger joint, which can be used to determine the strength of the body part acting on the touch screen.
  • the electronic device can use a preset feature extraction algorithm to extract ACC features from the ACC signal.
  • the feature extraction algorithm can be a method based on mutual information, a method based on maximum correlation-minimum redundancy, and a feature selection method based on wrapper method (Wrapper), etc.
  • the ACC feature may include at least one of the following: a maximum first-order difference (maxgradient) feature, a signal amplitude (amplitude) feature, a front-end zero-crossing number (zerocrosscnt) feature, a maximum high-pass value (maxhighpass) feature, a mean-domain sum of absolute values (meanaddmax) feature, a front-end normalized value variance (accnormsquare) feature, a front-end normalized value amplitude (accnormsquare) feature, a fast Fourier transform mean (fftmean) feature, and a partial fast Fourier transform mean (partfftmean) feature. It should be understood that the ACC feature may also include other features, which are not limited by the embodiments of this application.
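  • A minimal sketch of a few of these features, computed from a 1-D window of ACC samples, is shown below; the formulas are plausible readings of the feature names, not the patent's exact definitions.

```python
import numpy as np

def acc_features(sig: np.ndarray) -> dict:
    diff = np.diff(sig)                  # first-order difference of the signal
    centred = sig - sig.mean()
    spectrum = np.abs(np.fft.rfft(sig))  # magnitude spectrum of the window
    return {
        "maxgradient": float(np.max(np.abs(diff))),   # maximum first-order difference
        "amplitude": float(sig.max() - sig.min()),    # signal amplitude
        "zerocrosscnt": int(np.sum(np.signbit(centred[:-1]) != np.signbit(centred[1:]))),
        "fftmean": float(spectrum.mean()),            # fast Fourier transform mean
        "partfftmean": float(spectrum[: len(spectrum) // 4].mean()),  # partial FFT mean
    }
```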
  • the ACC two-classification model performs two classifications based on the extracted ACC features and obtains two classification results:
  • One classification result is: the touch operation performed on the touch screen is a knuckle touch operation.
  • Another classification result is: the touch operation performed on the touch screen is a non-knuckle touch operation.
  • If the output result of the ACC two-classification model is: the touch operation applied to the touch screen is a knuckle touch operation, the following S5 is executed.
  • If the output result of the ACC two-classification model is: the touch operation on the touch screen is a non-knuckle touch operation, no processing is performed, or a response function corresponding to the non-knuckle touch operation is executed.
  • the score feature can be used to represent the degree of correlation between the CAP signal and the finger joint operation. That is, the score feature is the feature in the CAP signal that is associated with the finger joint operation.
  • the CAP signal carries the capacitance information of the capacitance node. Based on the description of the above embodiments, when a body part comes into contact with the touch screen, the capacitance value of the capacitance node will change. It should be understood that when the user uses different body parts to contact the touch screen, different gestures will contact different capacitance nodes, and the capacitance value changes caused will also be different.
  • FIG. 22 shows a schematic diagram of a set of 7*7 node matrices corresponding to finger joints, finger tips and finger pads respectively.
  • the capacitance values of the maximum capacitance nodes located in each 7*7 node matrix are different, and the capacitance values of the capacitance nodes distributed around each maximum capacitance node are also different.
  • After the CAP signal is input into the CAP two-classification model, the CAP two-classification model outputs the score, that is, the score feature, based on the capacitance information of the capacitance nodes carried by the CAP signal. It should be understood that for a 7*7 node matrix of a knuckle the score will be higher, and for a 7*7 node matrix of a fingertip or finger pad the score will be lower.
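  • The sketch below shows one possible shape for such a CAP two-classification model as a small convolutional network over the 7*7 node matrix; the layer sizes are assumptions, since the patent only states that the model is a convolutional neural network.

```python
import torch
import torch.nn as nn

class CapBinaryModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(16 * 7 * 7, 2),
        )

    def forward(self, cap: torch.Tensor) -> torch.Tensor:
        # cap: (batch, 1, 7, 7) node matrix; softmax over knuckle / non-knuckle.
        return torch.softmax(self.net(cap), dim=-1)

scores = CapBinaryModel()(torch.randn(1, 1, 7, 7))
knuckle_score = scores[0, 0].item()  # the score feature, e.g. 0.92 for a knuckle
```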
  • Table 1 provides a correspondence table between interaction methods and scores.
  • For example, the output result of the CAP two-classification model may be: the touch operation on the touch screen is a knuckle touch operation with a score of 0.92, and the touch operation on the touch screen is a non-knuckle touch operation with a score of 0.08. It should be understood that the higher the score for the touch operation on the touch screen being a knuckle touch operation, the more likely it is that the final classification result obtained after feature fusion based on the score is a knuckle touch operation.
  • touch features can be used to characterize the contact area and contact position when the body part contacts the touch screen. It should be understood that the touch characteristics corresponding to different gestures will be different.
  • the electronic device can use a preset feature extraction algorithm to extract touch features from the touch signal.
  • the touch feature can be used to represent the degree of correlation between the touch signal and the finger joint operation, that is, the touch feature is the feature in the touch signal that is associated with the finger joint operation.
  • the touch features may include a contact position (location) feature and/or a contact area (pressure) feature.
  • The contact position feature can be used to represent the interaction position of the body part on the touch screen.
  • The contact area feature can be used to represent the area of contact between the body part and the touch screen.
  • It should be noted that, unlike the first two algorithms, which apply threshold filtering directly to the touch signal, this application proposes, in the improved knuckle operation recognition method, the concept of extracting touch features from the touch signal.
  • the above position feature may be a grid feature, which is used to represent the coordinate grid number of the touch point.
  • the electronic device can divide the touch screen into a grid list with p rows and q columns according to the resolution of the touch screen, and each grid in the grid list is represented by a coordinate grid number.
  • the length of each grid is equal to the number of pixels on the vertical axis of the touch screen divided by p
  • the width of each grid is equal to the number of pixels on the horizontal axis of the touch screen divided by q.
  • For example, if the screen resolution of a mobile phone is 1600×1200 pixels, the screen is divided into a 4×3 grid list.
  • As another example, if the screen resolution of the mobile phone is 1920×1080 pixels, the screen is divided into a 6×4 grid list.
  • p and q are positive integers.
  • the grid indicated by each coordinate grid number will cover several electrode intersection points.
  • When a knuckle causes coupling between the two electrodes near certain touch points, the capacitance values of the capacitance nodes change, so that the X-axis coordinate and the Y-axis coordinate can be obtained from the touch signal; the coordinate grid number of the touch point can then be determined, that is, the approximate location of the touch point on the touch screen is determined (a minimal sketch of this mapping follows below).
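  • A minimal sketch of the grid-number mapping just described, assuming the 00, 01, ... numbering of FIG. 23 (row index followed by column index):

```python
def grid_number(x: float, y: float, width_px: int, height_px: int,
                p: int, q: int) -> str:
    """Map a touch point (x, y) in pixels to its coordinate grid number.

    The screen is divided into p rows and q columns; each grid is
    (height_px / p) tall and (width_px / q) wide, as described above.
    """
    row = min(int(y / (height_px / p)), p - 1)   # clamp to the last row/column
    col = min(int(x / (width_px / q)), q - 1)
    return f"{row}{col}"                         # e.g. row 2, col 1 -> "21"

# Example: a 1920x1080-pixel screen divided into 7 rows and 4 columns,
# tapped in the area numbered 21 in FIG. 23.
print(grid_number(x=400, y=700, width_px=1080, height_px=1920, p=7, q=4))  # "21"
```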
  • FIG. 23 is a schematic diagram of a grid list provided by the embodiment of the present application.
  • the touch screen is divided into a grid list with 7 rows and 4 columns, and each grid is represented by a coordinate grid number.
  • Starting from the top of the touch screen, the coordinate grid numbers of the first row of grids from left to right are 00, 01, 02 and 03; those of the second row are 10, 11, 12 and 13; those of the third row are 20, 21, 22 and 23; and so on, until those of the seventh row are 60, 61, 62 and 63.
  • When a knuckle taps the area corresponding to coordinate grid number 21, the electronic device first obtains the CAP signal, which contains the capacitance value of each capacitance node in the 7*7 node matrix shown in the partial enlargement; it then processes the CAP signal to obtain the touch signal; finally, according to the X-axis coordinate and Y-axis coordinate of the touch signal, it extracts the grid feature from the touch signal.
  • This grid feature represents coordinate grid number 21 of the touch point; that is, it is determined that the approximate position of the touch point on the touch screen is the area indicated by coordinate grid number 21.
  • It should be noted that FIG. 23 takes the case where the approximate position of the touch point on the touch screen is a single grid as an example, which does not limit the embodiments of the present application.
  • In actual implementation, the touch point may span multiple grids on the touch screen.
  • In this case, the grid feature can represent the coordinate grid numbers of the multiple grids, or the coordinate grid number of the single grid among them that contains more of the touch points than any other grid.
  • It should be noted that the embodiments of the present application do not limit the execution order of S3, S4, S5 and S6.
  • In a first implementation, the electronic device can extract the ACC features from the ACC signal while extracting the touch features from the touch signal, and then decide, based on the output of the ACC binary classification model, whether to input the CAP signal into the CAP binary classification model and perform feature fusion on the extracted features; that is, S3 and S6 are executed first, and then S4 and S5.
  • In a second implementation, the electronic device can first extract the ACC features from the ACC signal; then, if the output of the ACC binary classification model indicates that the touch operation applied to the touch screen is a knuckle touch operation, the electronic device inputs the CAP signal into the CAP binary classification model and extracts the touch features from the touch signal for feature fusion; that is, S3 and S4 are executed first, and then S5 and S6.
  • Compared with the first implementation, the second implementation does not need to extract touch features when the ACC classification result indicates a non-knuckle touch operation, which reduces the computation load of the electronic device to some extent.
  • After extracting the ACC features, the score feature and the touch features, the electronic device can use a preset feature fusion algorithm to splice these features.
  • For example, the preset feature fusion algorithm can be an early fusion (early fusion) algorithm or a late fusion (late fusion) algorithm.
  • Illustratively, as shown in Figure 24, the electronic device may include a feature fusion module. Assume that the electronic device extracts the contact position feature and the contact area feature from the touch signal; extracts the maximum first-order difference (maxgradient), signal amplitude (amplitude), leading-segment zero-crossing count (zerocrosscnt), maximum high-pass value (maxhighpass), sum of absolute values in the mean domain (meanaddmax), leading-segment normalized-value variance (accnormsquare), leading-segment normalized-value amplitude (accnormsquare), fast Fourier transform mean (fftmean) and partial fast Fourier transform mean (partfftmean) features from the ACC signal; and extracts the score feature from the CAP signal.
  • The feature fusion module can then perform feature fusion on these 12 features to obtain the fused features (a minimal sketch of this step is given below).
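  • A minimal sketch of the fusion step, assuming simple early-fusion concatenation of the 9 ACC features, the 1 score feature and the 2 touch features into a 12-dimensional vector; the splicing order is an assumption, since the patent does not specify it.

```python
import numpy as np

def fuse_features(acc_feats: np.ndarray, score: float,
                  touch_feats: np.ndarray) -> np.ndarray:
    """Early fusion: splice 9 ACC features, 1 score and 2 touch features."""
    assert acc_feats.shape == (9,) and touch_feats.shape == (2,)
    return np.concatenate([acc_feats, [score], touch_feats])  # shape (12,)
```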
  • the knuckle classification model predicts, based on the fused features, whether the interaction mode is the knuckle interaction mode.
  • the improved knuckle operation recognition method involves three classification models: the ACC binary classification model, the CAP binary classification model and the knuckle classification model.
  • These three models can be traditional machine learning models or neural network models.
  • In some embodiments, because the score feature extracted from the CAP signal resembles a single-channel image, the CAP binary classification model can be a convolutional neural network (CNN) model.
  • A CNN is a type of artificial neural network.
  • Illustratively, the CAP binary classification model can be as shown in Figure 25.
  • The structure of the CNN is divided into three layers: a convolutional layer, whose main function is to extract CAP features; a max pooling layer, whose main function is down-sampling without damaging the recognition result; and a fully connected layer, whose main function is classification, that is, determining whether the touch operation is a knuckle operation or a non-knuckle operation (a minimal sketch follows below).
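  • A minimal PyTorch sketch of a CNN with this three-layer shape for a 7*7 single-channel capacitance matrix; the channel count, kernel size and use of softmax scores are illustrative assumptions, not the patent's architecture.

```python
import torch
import torch.nn as nn

class CapBinaryCNN(nn.Module):
    """Convolution -> max pooling -> fully connected, as outlined above."""

    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(1, 8, kernel_size=3, padding=1)  # extract CAP features
        self.pool = nn.MaxPool2d(2)                            # down-sampling
        self.fc = nn.Linear(8 * 3 * 3, 2)                      # knuckle / non-knuckle

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.pool(torch.relu(self.conv(x)))   # (N, 8, 3, 3)
        x = x.flatten(1)
        return torch.softmax(self.fc(x), dim=1)   # scores, e.g. [0.92, 0.08]

model = CapBinaryCNN()
scores = model(torch.randn(1, 1, 7, 7))  # scores[0, 0]: knuckle score feature
```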
  • Because the ACC binary classification model only needs to process 9 features and the knuckle classification model only needs to process 12 features, the computation load is low, so both models can use a fully connected neural network (DNN) model.
  • The DNN model is a multi-layer perceptron. The principle of a perceptron is to find the most reasonable and most robust hyperplane between categories; the most representative perceptron is the support vector machine (SVM) algorithm.
  • Illustratively, the ACC binary classification model can be as shown in Figure 26, and the knuckle classification model as shown in Figure 27.
  • Both DNN models include an input layer, hidden layers and an output layer. The number of hidden layers can be more than one; it should be understood that adding hidden layers can better separate the features of the data, but too many hidden layers will also increase the training time and cause overfitting.
  • As shown in Figures 26 and 27, 9 ACC features are input into the input layer of the ACC binary classification model, while 9 ACC features, 1 score feature and 2 touch features (the grid feature and the contact area feature) are input into the input layer of the knuckle classification model (a minimal DNN sketch follows below).
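  • A minimal PyTorch sketch of the two DNN models with these input widths; the hidden-layer widths and counts are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Knuckle classification model: input layer (12 features) -> hidden layers
# -> output layer (knuckle / non-knuckle).
knuckle_classifier = nn.Sequential(
    nn.Linear(12, 32), nn.ReLU(),   # hidden layer 1
    nn.Linear(32, 16), nn.ReLU(),   # hidden layer 2
    nn.Linear(16, 2),               # output logits
)

# ACC binary classification model: same shape with a 9-feature input.
acc_classifier = nn.Sequential(
    nn.Linear(9, 32), nn.ReLU(),
    nn.Linear(32, 2),
)

fused = torch.randn(1, 12)  # fused 12-dimensional feature vector
probs = torch.softmax(knuckle_classifier(fused), dim=1)
```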
  • For the training processes of the ACC binary classification model, the CAP binary classification model and the knuckle classification model, reference may be made to existing techniques, which will not be described again here.
  • the knuckle classification model performs binary classification based on the fused features and obtains one of two classification results:
  • One classification result is that the touch operation applied to the touch screen is a knuckle touch operation, that is, the gesture interaction mode recognized by the knuckle classification model is the knuckle interaction mode.
  • The other classification result is that the touch operation applied to the touch screen is a non-knuckle touch operation, that is, the gesture interaction mode recognized by the knuckle classification model is a non-knuckle interaction mode.
  • If the output of the knuckle classification model indicates a knuckle touch operation, the electronic device can execute the response function corresponding to the knuckle touch operation.
  • If the output indicates a non-knuckle touch operation, the electronic device may perform no processing, or execute a response function corresponding to the non-knuckle touch operation.
  • the electronic device can be configured with multiple types of knuckle gestures. After recognizing the knuckle interaction mode, the electronic device can determine, based on parameters such as the position where the knuckle touches the touch screen, the duration of the touch, and the sliding distance of the knuckle on the touch screen, which type of knuckle gesture the interaction belongs to, and then execute the response function corresponding to that knuckle gesture.
  • Illustratively, the knuckle gestures include at least one of the following: a single-knuckle double-tap gesture, a knuckle tap-and-draw-a-circle gesture, a knuckle tap-and-draw-the-letter-S gesture, a three-knuckle top-to-bottom swipe gesture along the screen, a two-knuckle double-tap gesture, and a knuckle tap-and-draw-a-straight-line-across-the-middle-of-the-screen gesture.
  • Correspondingly, the single-knuckle double-tap gesture corresponds to the full-screen screenshot function; the knuckle tap-and-draw-a-circle gesture corresponds to the partial screenshot function; the knuckle tap-and-draw-the-letter-S gesture corresponds to the scrolling screenshot function; the three-knuckle top-to-bottom swipe gesture corresponds to the swipe screenshot function; the two-knuckle double-tap gesture corresponds to the start/stop screen recording function; and the knuckle tap-and-draw-a-straight-line gesture corresponds to the split-screen function (a dispatch-table sketch follows below).
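  • A minimal sketch of how such a gesture-to-function mapping could be dispatched; all names are hypothetical placeholders, not an actual OS API.

```python
# Hypothetical handler names; the real response functions are provided
# by the operating system.
GESTURE_ACTIONS = {
    "knuckle_double_tap": "take_fullscreen_screenshot",
    "knuckle_tap_and_circle": "take_partial_screenshot",
    "knuckle_tap_and_draw_s": "take_scrolling_screenshot",
    "three_knuckle_swipe_down": "take_swipe_screenshot",
    "two_knuckle_double_tap": "toggle_screen_recording",
    "knuckle_line_across_middle": "enter_split_screen",
}

def dispatch(gesture: str) -> str:
    """Return the response function registered for a recognized gesture."""
    return GESTURE_ACTIONS.get(gesture, "no_op")
```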
  • the knuckle operation recognition method provided by the embodiments of the present application removes the manually set thresholds and uses the ACC binary classification model to classify the ACC features; that is, machine screening replaces manual screening, which improves both the efficiency and the effectiveness of ACC feature screening.
  • In addition, by adding a CAP binary classification model, the score feature can be extracted from the CAP signal; the score feature, the ACC features and the touch features are then fused, and the fused features are finally used for classification, which achieves a better classification result.
  • In different experimental scenarios, both the original recognition method and the improved recognition method were used to recognize knuckle operations and non-knuckle operations.
  • A large number of experimental results show that the improved recognition method has a higher recognition rate for knuckle operations than the original method, and a lower false-touch rate for non-knuckle operations than the original method. That is, the improved method improves the recognition rate of knuckle operations while reducing the false-touch rate of non-knuckle operations, achieving a better classification result.
  • Figure 28 is a schematic diagram of the software structure of the electronic device according to the embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has clear roles and division of labor.
  • the layers communicate through software interfaces.
  • the software layers of the Android system are divided from top to bottom into: the application layer (application), the framework layer (framework, FWK), the input layer (Input), the hardware abstraction layer (HAL), the driver layer, and so on.
  • Figure 28 also shows the hardware layer connected to the software layer.
  • the hardware layer may include a microcontroller unit (MCU), ACC sensor, CAP sensor, etc.
  • the application layer can include a series of application packages, such as operating system (OS) applications.
  • the OS application can trigger the related functions corresponding to the knuckle touch operation by calling the system application programming interface (API).
  • the functions corresponding to the knuckle touch operation can be customized through the OS application.
  • the OS application can provide a user interface to the user so that the user can define functions corresponding to the knuckle touch operation on the user interface.
  • the framework layer provides APIs and programming frameworks for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the framework layer can include an input management service, which is used to receive and distribute input events, map input events, judge and process the collected input events, and distribute input events to the upper layer.
  • the input management service can manage gestures such as finger pulps, fingertips, nails, side nails, and knuckles to perform corresponding shortcut processing actions.
  • the input layer is used to determine the type of input event.
  • the gesture processing module of the input layer can call the gesture recognition module of the hardware abstraction layer to determine the touch type of the input event.
  • the hardware abstraction layer is the interface layer between the operating system kernel and the hardware circuit. Its purpose is to abstract the hardware. It hides the hardware interface details of a specific platform and provides a virtual hardware platform for the operating system, making it hardware-independent and portable on a variety of platforms. From the perspective of software and hardware testing, both software and hardware testing work can be completed based on the hardware abstraction layer, making it possible to conduct software and hardware testing work in parallel.
  • the ACC sensor collects the ACC signal and sends the ACC signal to the MCU.
  • the CAP sensor collects the CAP signal and sends the CAP signal to the MCU.
  • the MCU processes the CAP signal to obtain the touch signal, and then the MCU sends the ACC signal, CAP signal and touch signal to the gesture recognition module of the hardware abstraction layer.
  • the gesture recognition module processes these signals according to the methods in the above-mentioned embodiments S1 to S9 to obtain the recognition results. If the gesture recognition result is a knuckle interaction mode, the gesture recognition module reports the knuckle touch event to the OS application through the gesture processing module and the input management service. Afterwards, the OS application can trigger related functions corresponding to the knuckle touch operation by calling the API interface of the OS.
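  • The reporting flow just described (sensors to MCU, MCU to the HAL gesture recognition module, then up through the gesture processing module and input management service to the OS application) could be sketched as follows; all class and method names are hypothetical, not the actual HAL interface.

```python
class GestureRecognitionModule:
    """Hypothetical HAL-side module mirroring the flow described above."""

    def __init__(self, recognizer):
        self.recognizer = recognizer  # implements S1-S9 from the embodiments

    def on_signals(self, acc_signal, cap_signal, touch_signal, report):
        """Called with the three signals forwarded by the MCU."""
        result = self.recognizer.classify(acc_signal, cap_signal, touch_signal)
        if result == "knuckle":
            # gesture processing module -> input management service -> OS app
            report("knuckle_touch_event")
```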
  • the above touch signal can also be obtained by processing the CAP signal by the functional module of the software layer, which is not limited in the embodiment of the present application.
  • the electronic device includes hardware structures or software modules corresponding to each function, or a combination of both.
  • the present application can be implemented in the form of hardware or a combination of hardware and computer software. Whether a function is performed by hardware or computer software driving the hardware depends on the specific application and design constraints of the technical solution. Skilled artisans may implement the described functionality using different methods for each specific application, but such implementations should not be considered beyond the scope of this application.
  • Embodiments of the present application can divide the electronic device into functional modules according to the above method examples.
  • each functional module can be divided corresponding to each function, or two or more functions can be integrated into one processing module.
  • the above integrated modules can be implemented in the form of hardware or in the form of software functional modules. It should be noted that the division of modules in the embodiments of the present application is schematic and is merely a logical function division; in actual implementation, there may be other division methods. The following description takes the case where each functional module is divided corresponding to each function as an example.
  • Figure 29 is a schematic structural diagram of an identification device provided by an embodiment of the present application.
  • the recognition device 90 may include an acquisition module 91 , a feature extraction module 92 , a feature fusion module 93 and a classification module 94 .
  • the acquisition module 91 is used to acquire the ACC signal, the CAP signal and the touch signal in response to the touch operation on the touch screen.
  • the ACC signal is the original signal collected by the ACC sensor
  • the CAP signal is the original signal collected by the CAP sensor.
  • the touch signal is a signal obtained after processing the CAP signal.
  • the feature extraction module 92 is used to extract ACC features, score features and touch features.
  • the ACC features are the features in the ACC signal associated with the knuckle operation
  • the score feature is the feature in the CAP signal associated with the knuckle operation
  • the touch features are the features in the touch signal associated with the knuckle operation.
  • the feature fusion module 93 is used to perform feature fusion on ACC features, score features and touch features.
  • the classification module 94 is used to input the fused features into the finger joint classification model to obtain the finger joint classification results.
  • the finger joint classification result indicates that the touch operation is a finger joint operation or a non-finger joint operation.
  • In one possible implementation, the feature extraction module 92 is specifically configured to: extract the ACC features from the ACC signal.
  • the classification module 94 is further configured to: input the ACC features into the ACC binary classification model to obtain an ACC classification result; and, when the ACC classification result indicates that the touch operation is a knuckle operation, input the CAP signal into the CAP binary classification model to obtain the score feature.
  • the feature extraction module 92 is specifically configured to: extract the touch features from the touch signal when the ACC classification result indicates that the touch operation is a knuckle operation.
  • In another possible implementation, the feature extraction module 92 is specifically configured to: extract the ACC features from the ACC signal, and extract the touch features from the touch signal.
  • the classification module 94 is further configured to: input the ACC features into the ACC binary classification model to obtain the ACC classification result.
  • the feature extraction module 92 is specifically configured to: input the CAP signal into the CAP binary classification model to obtain the score feature when the ACC classification result indicates that the touch operation is a knuckle operation.
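  • Putting modules 91 to 94 together, the recognition device 90 could be wired up as in the following sketch, which follows the second implementation (early exit on a non-knuckle ACC result); the module interfaces are assumptions for illustration.

```python
class RecognitionDevice:
    """Hypothetical wiring of modules 91-94 described above."""

    def __init__(self, acquirer, extractor, fuser, classifier):
        self.acquirer = acquirer      # acquisition module 91
        self.extractor = extractor    # feature extraction module 92
        self.fuser = fuser            # feature fusion module 93
        self.classifier = classifier  # classification module 94

    def recognize(self, touch_event):
        acc, cap, touch = self.acquirer.get_signals(touch_event)
        acc_feats = self.extractor.acc_features(acc)
        if self.classifier.acc_result(acc_feats) != "knuckle":
            return "non-knuckle"      # early exit, per the ACC classification
        score = self.classifier.cap_score(cap)
        touch_feats = self.extractor.touch_features(touch)
        fused = self.fuser.fuse(acc_feats, score, touch_feats)
        return self.classifier.knuckle_result(fused)
```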
  • Figure 30 shows a schematic diagram of the hardware structure of an electronic device provided by an embodiment of the present application.
  • the electronic device may include: a processor 110, an external memory interface 120, an internal memory 121, a USB interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, and a bone conduction sensor 180M.
  • the processor 110 may include one or more processing units.
  • the processor 110 may include a central processing unit (CPU), an image signal processor (ISP), a digital signal processor (DSP), a video codec, a neural-network processing unit (NPU), a graphics processing unit (GPU), an application processor (AP), and/or a modem processor, etc.
  • different processing units can be independent devices or integrated into one or more processors.
  • the CPU is the final execution unit for information processing and program running. Its main tasks include processing instructions, executing operations, controlling time, and processing data.
  • the CPU can include a controller, arithmetic unit, and cache memory, and the bus used to connect these components.
  • a controller can be the nerve center and command center of an electronic device. The controller can generate operation control signals based on the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • the processor 110 may also be provided with a memory for storing instructions and data.
  • the memory in processor 110 is cache memory. This memory may hold instructions or data that have been recently used or recycled by processor 110 . If the processor 110 needs to use the instructions or data again, it can be called directly from the memory. Repeated access is avoided and the waiting time of the processor 110 is reduced, thus improving the efficiency of the system.
  • NPU is a neural network (NN) computing processor.
  • Intelligent cognitive applications of the electronic device 100 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, etc.
  • the NPU can be used to train the ACC two-classification model, the CAP two-classification model and the knuckle classification model based on the ACC signal and the CAP signal.
  • the acceleration sensor 180E is used to detect the acceleration of the electronic device 100 in various directions. When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. The acceleration sensor 180E can also be used to identify the posture of electronic devices and be used in horizontal and vertical screen switching, pedometer and other applications.
  • the acceleration sensor 180E may be disposed in the upper left corner of the touch screen. When the knuckles tap the touch screen, the acceleration sensor 180E can collect the ACC signal.
  • the touch sensor 180K is provided on the display screen 194 and is used to detect touch operations on or near the display screen 194 .
  • the touch sensor can pass the detected touch operation to the application processor to determine the touch event type.
  • Visual output related to the touch operation may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a location different from that of the display screen 194 .
  • the touch sensor 180K may be a CAP sensor.
  • the CAP sensor can be a mutual-capacitance touch sensor or a self-capacitance touch sensor. Utilizing the characteristic that the sensing capacitance of the touch sensor 180K changes when it is touched, the change in the sensing capacitance at each point is detected from the raw data; when the change at one or more points exceeds a certain threshold, the point is determined to have been touched, thereby detecting the location of the touched point (a minimal sketch of this follows below).
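  • A minimal sketch of the threshold-based detection just described, assuming a stored baseline capacitance map and an arbitrary example threshold:

```python
import numpy as np

def detect_touched_points(raw: np.ndarray, baseline: np.ndarray,
                          threshold: float = 100.0) -> np.ndarray:
    """Return (row, col) indices whose capacitance change exceeds the threshold.

    `raw` and `baseline` are capacitance maps of the same shape; the
    threshold value here is an illustrative assumption.
    """
    delta = np.abs(raw - baseline)       # change in sensing capacitance
    return np.argwhere(delta > threshold)
```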
  • An embodiment of the present application also provides an electronic device, including a processor.
  • the processor is coupled to a memory.
  • the processor is configured to execute computer programs or instructions stored in the memory, so that the electronic device implements the methods in the above embodiments.
  • Embodiments of the present application also provide a computer-readable storage medium that stores computer instructions; when the computer instructions are run on an electronic device, the electronic device is caused to execute the method shown above.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server or data center to another website, computer, server or data center via wired means (e.g., coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless means (e.g., infrared, radio, or microwave).
  • the computer-readable storage medium can be any available medium that a computer can access, or a data storage device, such as a server or data center, that integrates one or more available media.
  • Available media may be magnetic media (e.g., floppy disk, hard disk, or magnetic tape), optical media, or semiconductor media (e.g., solid state disk (SSD)), etc.
  • Embodiments of the present application also provide a computer program product.
  • the computer program product includes computer program code.
  • the computer program code When the computer program code is run on a computer, it causes the computer to execute the methods in the above embodiments.
  • Embodiments of the present application also provide a chip, which is coupled to a memory, and is used to read and execute computer programs or instructions stored in the memory to execute the methods in the above embodiments.
  • the chip can be a general-purpose processor or a special-purpose processor.
  • the chip can be implemented using the following circuits or devices: one or more field programmable gate arrays (FPGA), programmable logic devices (PLD), controllers, state machines, gate logic, discrete hardware components, any other suitable circuit, or any combination of circuits capable of performing the various functions described throughout this application.
  • the electronic devices, identification apparatuses, computer-readable storage media, computer program products and chips provided by the above embodiments of the present application are all used to execute the methods provided above; therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects corresponding to the methods provided above, which will not be repeated here.
  • "preset" and "predefined" can be realized by pre-saving, in a device (for example, including an electronic device), corresponding code, tables or other means that can be used to indicate relevant information; this application does not limit the specific implementation thereof.


Abstract

Embodiments of the present application provide a knuckle operation recognition method and an electronic device, relating to the technical field of artificial intelligence. In the solution of this application, when a knuckle acts on different touch regions of the touch screen, the ACC signal may exhibit different variation trends; it is therefore proposed to extract touch features from the touch signal to judge the contact area and contact position, and to extract ACC features from the ACC signal to judge the force applied to the screen. In addition, a CAP binary classification model is placed upstream, and a score feature characterizing the correlation with the knuckle is extracted from the CAP signal. The ACC features, the score feature and the touch features are then fused and input into a knuckle classification model, so that a better classification result can be obtained.

Description

指关节操作的识别方法及电子设备
本申请要求于2022年07月01日提交国家知识产权局、申请号为202210768211.4、申请名称为“指关节操作的识别方法及电子设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及人工智能(artificial intelligence,AI)技术领域,尤其涉及一种指关节操作的识别方法及电子设备。
背景技术
触控屏(touch panel,TP)作为电子设备的人机交互装置,得到越来越广泛的应用。
目前,指关节是一种常用的用户与触控屏的交互方式。当用户将指关节在触控屏上进行触摸时,电子设备将检测到触摸点的信号发生变化。然后,电子设备可以根据变化的信号,识别出指关节交互方式,之后根据指关节与触控屏的触碰时间,以及指关节在触控屏的滑动距离等,执行截屏、录屏和切换窗口等快捷功能。
然而,传统指关节的触摸检测算法存在多种问题。比如,传统指关节的触摸检测算法泛化性能不足,导致对指关节交互方式的识别率低。再比如,传统指关节的触摸检测算法防伪能力不足,导致未能有效过滤非指关节交互方式,使得误触率高。因此,如何优化指关节的触摸检测算法成为亟待解决的技术问题。
发明内容
本申请提供一种指关节操作的识别方法及电子设备,解决了传统指关节的触摸检测算法存在的识别率低和误触率高等问题,提升了指关节的识别效果。
为达到上述目的,本申请采用如下技术方案:
第一方面,本申请实施例提供一种指关节操作的识别方法。该方法包括:
接收作用于触控屏的触控操作;
响应于触控操作,获取加速度(acceleration,ACC)信号、电容(capacitor,CAP)信号和触碰信号,ACC信号为通过ACC传感器采集的原始信号,CAP信号为通过CAP传感器采集的原始信号,触碰信号为对CAP信号进行处理后得到的信号;
提取ACC特征、置信度(score)特征和触摸特征,ACC特征为ACC信号中与指关节操作关联的特征,score特征为CAP信号中与指关节操作关联的特征,触摸特征为触碰信号中与指关节操作关联的特征;
对ACC特征、score特征和触摸特征进行特征融合;
将融合后的特征输入指关节分类模型,得到目标分类结果,目标分类结果指示触控操作为指关节操作或非指关节操作。
应理解,当指关节作用于触控屏的不同触控区域时,ACC信号可能会呈现不同的 变化趋势。因此,本申请实施例提出了从触碰信号提取触摸特征,用来判断接触面积和接触位置,ACC信号中提取的ACC特征,用来判断触摸屏幕的力度。另外,从CAP信号中提取用于表征与指关节之间的相关性的score特征。然后,将ACC特征、score特征和触摸特征进行特征融合输入指关节分类模型,从而可以获取更好的分类效果,提高指关节操作的识别率,并降低非指关节操作的误触率。
在一种可能的实现方式中,置信度特征为在加速度特征指示触控操作为指关节操作的情况下,从触碰信号中提取的特征。应理解,在加速度特征指示触控操作为非指关节操作的情况下,电子设备无需从触碰信号中提取的置信度特征,也无需进行后续的特征融合等操作。
在一种可能的实现方式中,提取ACC特征、score特征和触摸特征,包括:
从ACC信号中提取ACC特征;
将ACC特征输入ACC二分类模型,得到初步分类结果;
在初步分类结果指示触控操作为指关节操作的情况下,将CAP信号输入CAP二分类模型,得到score特征,并从触碰信号中提取触摸特征。
在一种可能的实现方式中,提取ACC特征、score特征和触摸特征,包括:
从ACC信号中提取ACC特征,并从触碰信号中提取触摸特征;
将ACC特征输入ACC二分类模型,得到初步分类结果;
在初步分类结果指示触控操作为指关节操作的情况下,将CAP信号输入CAP二分类模型,得到score特征。
应理解,通过取消手动设置的预设阈值,采用ACC二分类模型对ACC特征进行分类,实现了采用机器筛选方式代替人工筛选方式,从而提高了ACC特征筛选效率,并且,提升了指关节信号的筛选效果。
在一种可能的实现方式中,触摸特征包括接触位置特征和接触面积特征中的至少一项。其中,接触位置特征用于表示身体部位在触控屏上的交互位置,接触面积特征用于表示身体部位与触控屏接触的面积。
在一种可能的实现方式中,接触位置特征用于表示触碰点所在网格的坐标网格编号。其中,触碰点所在网格为根据触控屏的分辨率,对触控屏划分得到的网格列表中的至少一个网格。
应理解,采用坐标表示触碰位置的方式存在计算量大和易泄漏用户隐私的问题,通过设置网格特征,可以实现在降低计算量和保护隐私的情况下,确定触碰点在触控屏中的大致位置,进而确定是否为指关节操作。
在一种可能的实现方式中,网格列表包括p行和q列网格。网格列表中的每个网格的长度等于触控屏的纵轴的像素点数除以p。网格列表中的每个网格的宽度等于触控屏的横轴的像素点数除以q。其中,p和q为正整数。
在一种可能的实现方式中,当触摸特征包括接触位置特征时,提取接触位置特征,包括:
根据触碰信号,确定触碰点的X轴坐标和Y轴坐标;
根据X轴坐标和Y轴坐标,确定表示触碰点所在网格的坐标网格编号的接触位置特征。
其中,X轴为触控屏所在平面的水平方向,Y轴为触控屏所在平面的竖直方向。
在一种可能的实现方式中,ACC特征包括以下至少一项:最大一阶差分特征,信号振幅特征,前段过零数特征,最大高通值特征,均值域绝对值之和特征,前段归一化值方差特征,前段归一化值振幅特征,快速傅立叶变换均值特征,以及部分快速傅立叶变换均值特征。
在一种可能的实现方式中,score特征为score得分。score得分用于表示CAP信号与指关节操作的关联程度。
在一种可能的实现方式中,CAP二分类模型为卷积神经网络模型。
在一种可能的实现方式中,ACC二分类模型为全连接神经网络模型。
在一种可能的实现方式中,指关节分类模型为全连接神经网络模型。
在一种可能的实现方式中,该方法还包括:
在目标分类结果指示触控操作为指关节操作的情况下,确定触控操作属于的指关节手势,并执行与指关节手势对应的响应功能。其中,不同的指关节手势对应不同的响应功能。
在一种可能的实现方式中,指关节手势包括以下至少一项:指关节双击手势,指关节敲击并画圈手势,指关节敲击并画字母S手势,三个指关节沿着屏幕由上向下滑动手势,双指关节双击手势,指关节敲击并在屏幕中间画直线手势。
在一种可能的实现方式中,与指关节双击手势对应的响应功能为截取全屏功能。
在一种可能的实现方式中,与指关节敲击并画圈手势对应的响应功能为局部截屏功能。
在一种可能的实现方式中,与指关节敲击并画字母S手势对应的响应功能为滚动截屏功能。
在一种可能的实现方式中,与三个指关节沿着屏幕由上向下滑动手势对应的响应功能为滑动截屏功能。
在一种可能的实现方式中,与双指关节双击手势对应的响应功能为启动/停止录屏功能。
在一种可能的实现方式中,与指关节敲击并在屏幕中间画直线手势对应的响应功能为分屏功能。
第二方面,本申请提供一种识别装置,该装置包括用于执行上述第一方面的方法的单元/模块。该装置可对应于执行上述第一方面描述的方法,该装置中的单元/模块的相关描述请参照上述第一方面的描述,为了简洁,在此不再赘述。
第三方面,提供一种电子设备,包括处理器,该处理器与存储器耦合,该处理器用于执行该存储器中存储的计算机程序或指令,以使得电子设备实现如第一方面中任一项的指关节操作的识别方法。
第四方面,提供一种芯片,该芯片与存储器耦合,该芯片用于读取并执行该存储器中存储的计算机程序,以实现如第一方面中任一项的指关节操作的识别方法。
第五方面,提供一种计算机可读存储介质,该计算机可读存储介质存储有计算机程序,当该计算机程序在电子设备上运行时,使得电子设备执行如第一方面中任一项的指关节操作的识别方法。
第六方面,提供一种计算机程序产品,当计算机程序产品在计算机上运行时,使得计算机执行如第一方面中任一项的指关节操作的识别方法。
可以理解的是,上述第二方面至第六方面的有益效果可以参见上述第一方面中的相关描述,在此不再赘述。
附图说明
图1为本申请实施例提供的一种设置有加速度计的触控屏的示意图;
图2为本申请实施例提供的一种互电容式触控屏的IC电极排布示意图;
图3为本申请实施例提供的一种互电容式触控屏的工作原理的示意图;
图4为本申请实施例提供的一种CAP信号对应的节点矩阵的示意图;
图5为本申请实施例提供的一种指关节的触摸检测算法的流程示意图;
图6为本申请实施例提供的另一种指关节的触摸检测算法的流程示意图;
图7为本申请实施例提供的采用阈值1对ACC特征进行手动筛选的示意图;
图8为本申请实施例提供的采用阈值2对ACC特征进行手动筛选的示意图;
图9为本申请实施例提供的采用阈值3对ACC特征进行手动筛选的示意图;
图10为本申请实施例提供的采用阈值4对ACC特征进行手动筛选的示意图;
图11为本申请实施例提供的食指的指关节敲击屏幕的左上部的正视图;
图12为本申请实施例提供的食指的指关节敲击屏幕的左上部的斜视图;
图13为本申请实施例提供的食指的指关节敲击屏幕的右上部的正视图;
图14为本申请实施例提供的食指的指关节敲击屏幕的右上部的斜视图;
图15为本申请实施例提供的食指的指关节敲击屏幕的左下部的正视图;
图16为本申请实施例提供的食指的指关节敲击屏幕的左下部的斜视图;
图17为本申请实施例提供的食指的指关节敲击屏幕的右下部的正视图;
图18为本申请实施例提供的食指的指关节敲击屏幕的右下部的斜视图;
图19为本申请实施例提供的一种ACC信号的变化趋势的示意图;
图20为本申请实施例提供的另一种ACC信号的变化趋势的示意图;
图21为本申请实施例提供的改进后的指关节操作的识别方法的流程示意图;
图22为本申请实施例提供的一组指关节、指尖和指腹分别对应的7*7节点矩阵的示意图;
图23为本申请实施例提供的一种网格列表的示意图;
图24为本申请实施例提供的特征融合的示意图;
图25为本申请实施例提供的CAP二分类模型的示意图;
图26为本申请实施例提供的ACC二分类模型的示意图;
图27为本申请实施例提供的指关节分类模型的示意图;
图28为本申请实施例提供的电子设备的软件结构示意图;
图29为本申请实施例提供的识别装置的示意图;
图30为本申请实施例提供的电子设备的硬件结构示意图。
具体实施方式
为了使本申请实施例的目的、技术方案和优点更加清楚,下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述,显然,所描述的 实施例是本申请一部分实施例,而不是全部的实施例。
在本申请的描述中,除非另有说明,“/”表示或的意思,例如,A/B可以表示A或B。在本申请的描述中,“和/或”仅仅是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。
在本申请的说明书和权利要求书中的术语“第一”和“第二”等是用于区别不同的对象,或者用于区别对同一对象的不同处理,而不是用于描述对象的特定顺序。例如,第一阈值和第二阈值等是用于区别不同的阈值,而不是用于描述阈值的特定顺序。在本申请实施例中,“多个”是指两个或两个以上。
在本申请说明书中描述的参考“一个实施例”或“一些实施例”等意味着在本申请的一个或多个实施例中包括结合该实施例描述的特定特征、结构或特点。由此,在本说明书中的不同之处出现的语句“在一个实施例中”、“在一些实施例中”、“在其他一些实施例中”、“在另外一些实施例中”等不是必然都参考相同的实施例,而是意味着“一个或多个但不是所有的实施例”,除非是以其他方式另外特别强调。术语“包括”、“包含”、“具有”及它们的变形都意味着“包括但不限于”,除非是以其他方式另外特别强调。
首先对本申请中涉及的一些名词或者术语进行解释说明。
触控屏,又称为触摸屏、触控面板,是一种可接收触控笔和手指等输入的信号的感应式液晶显示装置。通常,触控屏由触摸检测部件和触控屏控制器组成。触摸检测部件安装在液晶显示装置上层,用于检测用户的触摸位置等触摸信息,并将检测到的触摸信息传送至触控屏控制器。触控屏控制器用于对触摸信息进行处理后,将相应信号发送到处理器。然后,处理器对信号进行处理后,执行对应的响应动作,例如开关机、打开应用程序、拍摄图像、截屏或切换窗口等。
在一些实施例中,触摸检测部件由多种类型的传感器组成,例如,ACC传感器、CAP传感器、压电弯曲元件、压电薄膜、电位计、可变磁阻传感器、压电式传感器、压阻式传感器、伺服传感器、位移传感器、速度传感器、振动传感器、微机电系统(micro-electro-mechanical system,MEMS)传感器、陀螺仪、接近传感器、电传声器、水听器、电容传声器、驻极体电容传声器、动态传声器、带式传声器、碳粒传声器、压电传声器、光纤传声器、激光传声器及液体传声器等。
当用户的手掌、指腹、指尖、指甲、侧甲和指关节等身体部位与触控屏接触时,发生触摸事件。在触摸事件的影响下,传感器将采集到两类原始信号:
第1类原始信号为,当触控屏的表面在用户施加的机械力的作用下产生机械振动时,由ACC传感器、冲击传感器和振动传感器等采集到的运动信号。该运动信号可以用于衡量用户施加在触控屏上的机械力的大小。在机械力的作用下,电子设备可能从静止状态切换为运动状态,也可能从运动状态切换为静止状态,因此,该振动信号也可以用于表征在机械力的作用下,电子设备的运动/静止状态。
以通过ACC传感器采集的ACC信号为例进行说明。
示例性地,如图1所示,电子设备的触控屏设置有加速度计,比如,线性可变差动变压器(linear variable differential transformer,LVDT)。其中,加速度计可以由 ACC传感器、支承、电位器、弹簧和外壳等部件组成。加速度计采集的ACC信号可以携带加速度数据和/或角速度数据等。其中,加速度数据可以用于表征线加速度的大小,角速度数据可以用于表征角加速度的大小。
加速度数据可以是电子设备分别在X轴,Y轴和Z轴的加速度的大小。其中,X轴为触控屏所在平面的水平方向,Y轴为触控屏所在平面的竖直方向,Z轴为贯穿触控屏的垂直方向。当电子设备被放置在桌面上时,电子设备处于静止状态,此时加速度的变化量为零,或趋近于零,比如X轴,Y轴和Z轴的加速度的变化量均小于或等于0.1g,则可以确定电子设备处于静止状态。当用户手指在触控屏的某个敲击点进行敲击时,在机械力的作用下,加速度的变化量不为零,或不趋近于零,比如X轴,Y轴和Z轴的加速度的变化量大于0.1g,则可以确定电子设备处于运动状态。
角速度数据是电子设备围绕X轴,Y轴和Z轴的角速度的大小。如果各个方向上的角速度的变化量为零,或趋近于零,比如围绕X轴,Y轴和Z轴的角速度的变化量均小于3弧度/秒,则确定电子设备处于静止状态。如果任一方向上的角速度的变化量不为零,或不趋近于零,比如X轴,Y轴和Z轴的角速度的变化量大于3弧度/秒,则可以确定电子设备处于运动状态。
应理解,作用于触控屏的机械力越大,加速度的变化量越大,因此ACC信号可以用于衡量机械力的大小。
第2类原始信号为,由CAP传感器、压电式传感器和压阻式传感器等采集到的TP信号。该TP信号与接触触控屏的用户身体部位的触摸特征关联,用于确定触摸事件类型。
以通过CAP传感器采集的CAP信号为例进行说明。CAP传感器可以为一款采集CAP信号的集成电路(integrated circuit,IC)芯片。IC芯片可以由IC电极组成。
示例性地,图2示出了一种互电容式触控屏的IC电极排布示意图。互电容式触控屏在两层氧化铟锡(indium-tin-oxide,ITO)导电玻璃图层上,蚀刻出不同的ITO导电线路模块。两层导电线路模块相互垂直于触控屏的显示器上,形成横向电极和纵向电极。横向电极和纵向电极可视为在X轴和Y轴方向连续变化的滑条。由于横向电极和纵向电极位于不同表面,两组电极交叉的地方将会形成一个电容节点。一个滑条可视为驱动线,另一个滑条可视为侦测线。当电流经过驱动线中的一条导线时,如果外界有电容变化,那么将会引起另一层导线上电容节点的变化。
示例性地,图3示出了一种互电容式触控屏的工作原理的示意图。互电容式触控屏的控制器周期性地在横向电极依次发出激励信号,也称为驱动信号。然后,纵向电极获取响应信号,从而可以得到所有横向电极和纵向电极交汇点的电容值,即,得到整个互电容式触控屏的二维平面的电容大小。当手指触摸到互电容式触控屏时,引起触摸点附近两个电极之间的耦合,相当于为两个电极引入新电容,从而改变了纵向电极测量的电荷大小,使得两个电极之间的电容值发生变化。
在一种可选的实现方式中,CAP信号即为响应信号,也就是说,根据CAP信号可以确定所有电容节点的电容值。在另一种可选的实现方式中,由于互电容式触控屏通常包括大量的电容节点,当手指触摸到互电容式触控屏时,仅会改变触摸点附近的电容节点的电容值,因此IC芯片可以根据触摸点,从响应信号中提取CAP信号。CAP 信号用于指示一个节点矩阵中各个电容节点的电容值。其中,该节点矩阵可以是由m行、n列电容节点组成。例如,节点矩阵的中心节点的电容值在响应信号指示的所有电容节点的电容值中最大,即,最大电容值对应的电容节点是节点矩阵的中心节点,节点矩阵相当于以中心节点为中心划定的一个包含多个电容节点的触控区域。应理解,由于响应信号和CAP信号都可以用于指示电容节点的电容值,因此两者均可以称为原始信号。
需要说明的是,m和n的具体取值可以根据识别方式、触控屏的电容节点数量等进行设置,本申请实施例不作限定。m和n为正整数。
在一些实施例中,m和n的取值可以相等,比如,m=n=7。
在另一些实施例中,m和n的取值也可以不相等,比如m=7,n=9。
示例性地,图4示出了一种CAP信号对应的节点矩阵的示意图。当用户使用食指的指关节触碰触控屏时,IC芯片采集到CAP信号。CAP信号包含一组CAP数据。CAP数据包含了7*7节点矩阵中各个电容节点的电容值。7*7节点矩阵的中心节点的电容值为1771。中心节点的电容值1771在触摸屏所有电容节点的电容值最大。应理解,由于7*7节点矩阵中的各个电容节点的电容值变化是由用户身体部位与触控屏接触引起的,因此,CAP信号与接触触控屏的用户身体部位的触摸特征关联。
当用户身体部位与触控屏接触时,尽管采集到的CAP信号与接触触控屏的用户身体部位的触摸特征关联,但是由于CAP信号是仅携带了电容节点的电容信息的原始信号,CAP信号不能直观地反映出用户身体部位与触控屏接触时的触摸特征,因此电子设备无法直接根据CAP信号判断出是否为用户的身体部位(比如指关节)触碰触摸屏。在这种情况下,电子设备需要对CAP信号进行处理,得到触碰信号。该触碰信号可以直观地表现出用户身体部位与触控屏接触时的触摸特征。
示例性地,触碰信号可以携带触摸数据,例如:orientation数据,用于描述触摸区域和工具区域以垂直方向顺时针方向的弧度方向;pressure数据,用于描述手指或其他工具施加到设备的压力;size数据,用于描述与设置的最大可检测尺寸相关的指针触摸区域的大小;toolMajor数据,用于描述接近工具大小的椭圆长轴的尺寸;toolMinor数据,用于描述接近工具大小的椭圆短轴的尺寸;touchMajor数据,用于描述接触点触摸区域的椭圆长轴的尺寸;touchMinor数据,用于描述接触点触摸区域的椭圆短轴的尺寸;x数据,用于描述指针移动的X轴坐标;y数据,用于描述指针移动的Y轴坐标。这些触摸数据可以用于表示触摸特征,比如x数据和y数据用于表示触摸点在触控屏中的触摸位置。
至此,本申请实施例已经介绍了三种信号:ACC信号、CAP信号和触碰信号。其中,ACC信号为通过ACC传感器采集的原始信号,用于表征在机械力的作用下,电子设备的运动/静止状态。CAP信号为通过CAP传感器采集的原始信号,该原始信号携带了电容节点的电容信息。触碰信号为对CAP信号进行处理后得到的信号,可以用于表征用户身体部位与触控屏接触时的触摸特征。
结合上述实施例的分析,上述三种信号均与身体部位关联。当用户采用不同的身体部位触碰触控屏时,这三种信号的信号特征有所不同,电子设备利用这三种信号的信号特征,可以对不同身体部位进行识别。本申请实施例旨在采用这三种信号对指关 节进行识别。
下面将介绍两种采用这三种信号进行指关节识别的算法。
图5示出了一种指关节的触摸检测算法的流程示意图。该算法包括步骤:
步骤51.响应于用户的触控操作,获取ACC信号、CAP信号和触碰信号。
对于这三种信号,可以参照上述实施例的具体描述,此处不再赘述。
步骤52.对ACC信号进行特征提取,以得到ACC特征。
其中,ACC特征为ACC信号中与指关节的触控操作关联的特征,可以用来判断身体部位作用于触控屏的力度。应理解,不同手势对应的力度可能有所不同,从ACC信号中提取的ACC特征可能也有所差异。
步骤53.采用预设阈值对ACC特征和触碰信号进行初步筛选。
其中,与触碰信号对应的阈值可以为预设面积阈值,与ACC特征对应的阈值可以为预设特征阈值。需要说明的是,从ACC特征提取的ACC特征的数量为一个或多个。对于多个ACC特征,不同ACC特征可能对应不同的预设阈值。
电子设备接收到的触控操作可能是指关节操作,也可能是非指关节操作。由于指关节操作对应的ACC特征,与非指关节操作对应的ACC特征可能会呈现出不同的分布规律,因此可以采用预设特征阈值对某些非指关节信号进行初步筛选。当ACC特征不符合预设特征阈值时,可以直接确定触控操作是非指关节操作,从而无需进行后续的步骤54-步骤56。当ACC特征符合预设特征阈值时,由于仅是对信号的粗略筛选,存在将非指关节操作误判为指关节操作的可能性,因此需要执行步骤54-步骤56,以进一步精确检测是否为指关节操作。
此外,指关节操作对应的触碰信号,与非指关节操作对应的触碰信号可能也有所不同,比如指腹与触控屏的接触面积较大,指关节与触控屏的接触面积较小,因此可以采用预设面积阈值对某些非指关节信号进行初步筛选。当触碰信号的接触面积大于预设面积阈值时,可以直接确定触控操作是非指关节操作,从而无需进行后续的步骤54-步骤56。当触碰信号的接触面积小于预设面积阈值时,由于仅是对信号的粗略筛选,存在将非指关节操作误判为指关节操作的可能性,因此需要执行步骤54-步骤56,以进一步精确检测是否为指关节操作。
对于不同的ACC特征,设置的阈值和筛选方式可能也有所不同。例如,对于一些ACC特征,设置第一阈值,以筛选掉小于第一阈值的特征;对于另一些ACC特征,设置第二阈值,以筛选掉大于第二阈值的特征;对于再一些ACC特征,设置第三阈值和第四阈值,以筛选掉小于第三阈值的特征,且大于第四阈值的特征,第三阈值小于第四阈值。
步骤54.将ACC信号输入ACC二分类模型。
在ACC特征符合预设特征阈值,触碰信号符合接触面积阈值时,触碰操作为指关节操作的可能性较高,因此可以采用ACC二分类模型对ACC信号进行二分类,以进行精确检测,从而得到下述任意一种分类结果:
一种分类结果是:作用于触控屏的触控操作为指关节的触控操作;
另一种分类结果是:作用于触控屏的触控操作为非指关节的触控操作。
若ACC二分类模型的输出结果是:作用于触控屏的触控操作为指关节的触控操作, 则执行下述步骤55。
若ACC二分类模型的输出结果是:作用于触控屏的触控操作为非指关节的触控操作,则电子设备不作任何处理,或者继续判断具体的非指关节的触控操作类型,比如指腹、指尖、指甲、侧甲操作,以执行与非指关节的触控操作对应的响应功能。
步骤55.将CAP信号输入CAP五分类模型,得到CAP五分类模型的分类结果。
CAP五分类模型根据原始的CAP信号进行五分类,得到下述任意一种分类结果:
第一种分类结果是:作用于触控屏的触控操作为指关节的触控操作;
第二种分类结果是:作用于触控屏的触控操作为指腹的触控操作;
第三种分类结果是:作用于触控屏的触控操作为指尖的触控操作;
第四种分类结果是:作用于触控屏的触控操作为指甲的触控操作;
第五种分类结果是:作用于触控屏的触控操作为侧甲的触控操作。
若CAP五分类模型的分类结果是:作用于触控屏的触控操作为指关节的触控操作,则执行下述步骤56。
若CAP五分类模型的分类结果是:作用于触控屏的触控操作为指腹的触控操作,则执行与指腹的触控操作对应的响应功能。
若CAP五分类模型的分类结果是:作用于触控屏的触控操作为指尖的触控操作,则执行与指尖的触控操作对应的响应功能。
若CAP五分类模型的分类结果是:作用于触控屏的触控操作为指甲的触控操作,则执行与指甲的触控操作对应的响应功能。
若CAP五分类模型的分类结果是:作用于触控屏的触控操作为侧甲的触控操作,则执行与侧甲的触控操作对应的响应功能。
步骤56.将CAP信号输入CAP二分类模型,得到CAP二分类模型的分类结果。
需要说明的是,上述步骤55可能存在手势误判的可能,因此电子设备可以再次将CAP信号输入CAP二分类模型,得到CAP二分类模型的分类结果,从而提高了指关节的识别精度,降低了用户操作的误触率。
CAP二分类模型根据原始的CAP信号进行二分类,得到下述任意一种分类结果:
若CAP二分类模型的输出结果是:作用于触控屏的触控操作为指关节的触控操作,则电子设备可以执行与指关节的触控操作对应的响应功能。
若CAP二分类模型的输出结果是:作用于触控屏的触控操作为非指关节的触控操作,则不作任何处理,或执行与非指关节的触控操作对应的响应功能。
图6示出了另一种指关节的触摸检测算法的流程示意图。该算法包括步骤:
步骤61.响应于用户的触控操作,获取ACC信号、CAP信号和触碰信号。
对于这三种信号,可以参照上述实施例的具体描述,此处不再赘述。
步骤62.对ACC信号进行特征提取,以得到ACC特征。
其中,ACC特征为ACC信号中与指关节的触控操作关联的特征。
步骤63.采用预设阈值对ACC特征和触碰信号进行初步筛选。其中,与触碰信号对应的阈值可以为预设面积阈值,与ACC特征对应的阈值可以为预设特征阈值。
对于步骤63的具体实现方式,可以参照上述步骤53的描述,此处不再赘述。
步骤64.在ACC特征符合预设特征阈值,触碰信号符合接触面积阈值时,将原 始的CAP信号输入CAP五分类模型,得到CAP五分类模型的分类结果。
CAP五分类模型对信号进行五分类,得到下述任意一种分类结果:
第一种分类结果是:作用于触控屏的触控操作为指关节的触控操作;
第二种分类结果是:作用于触控屏的触控操作为指腹的触控操作;
第三种分类结果是:作用于触控屏的触控操作为指尖的触控操作;
第四种分类结果是:作用于触控屏的触控操作为指甲的触控操作;
第五种分类结果是:作用于触控屏的触控操作为侧甲的触控操作。
若CAP五分类模型的分类结果是:作用于触控屏的触控操作为指关节的触控操作,则执行与指关节的触控操作对应的响应功能。
若CAP五分类模型的分类结果是:作用于触控屏的触控操作为指腹的触控操作,则执行与指腹的触控操作对应的响应功能。
若CAP五分类模型的分类结果是:作用于触控屏的触控操作为指尖的触控操作,则执行与指尖的触控操作对应的响应功能。
若CAP五分类模型的分类结果是:作用于触控屏的触控操作为指甲的触控操作,则执行与指甲的触控操作对应的响应功能。
若CAP五分类模型的分类结果是:作用于触控屏的触控操作为侧甲的触控操作,则执行与侧甲的触控操作对应的响应功能。
上述图5和图6中的ACC二分类模型、CAP二分类模型、CAP五分类模型为传统机器学习模型或神经网络模型。
应理解,与图5提供的算法相比,图6提供的算法省略了ACC二分类模型以及CAP二分类模型。由于图6提供的算法更为简单,相当于放宽了指关节的检测条件,因此提高了指关节的识别率,但也会导致非指关节的误触率升高。
需要说明的是,在上述两种算法中的预设阈值为在电子设备的研发阶段,由工作人员采用手动设置阈值的方式,通过多次调整后最终得到的阈值。但是,这种人工筛选方式过于依赖主观调试。另外,人工筛选方式的筛选效果也不理想。
下面结合图7至图10,对人工筛选方式的筛选效果进行示例说明。
示例性地,图7和图8示出了对一种ACC特征进行手动筛选的示意图。
假设横轴用于表示数据特征的编号,纵轴用于表示数据特征的取值,虚线框a包围多个指关节特征点(即ACC特征点),虚线框b包围多个非指关节特征点。其中,指关节特征点和非指关节特征点的取值分布较为接近。
如图7所示,如果将预设阈值设置为阈值1,且筛选规则为筛选掉小于阈值1的特征点,那么电子设备将筛选掉小于阈值1的少量非指关节特征点,但是,此时会剩余大量的非指关节特征点,使得大量无效的非指关节特征点被输入分类模型,导致最终获得的分类结果不准确。
如图8所示,如果将预设阈值设置为阈值2,阈值2大于阈值1,且筛选规则为筛选掉小于阈值2的特征点,那么电子设备将筛选掉小于阈值2的大量非指关节特征点,但是同时会筛选掉小于阈值2的绝大多数指关节特征点,使得大部分有效的指关节特征被滤除,导致分类模型无法根据足够的有效数据进行准确分类。
示例性地,图9和图10示出了对另一种ACC特征进行手动筛选的示意图。
假设横轴用于表示数据特征的编号,纵轴用于表示数据特征的取值,虚线框c多个指关节特征点(即ACC特征点),虚线框d包围多个非指关节特征点。其中,指关节特征点和非指关节特征点的取值分布存在部分重叠。
如图9所示,如果将预设阈值设置为阈值3,且筛选规则为筛选掉小于阈值3的特征点,那么电子设备将筛选掉小于阈值3的少量非指关节特征点,但是,此时会剩余大量的非指关节特征点,使得大量无效的非指关节特征点被输入分类模型,导致最终获得的分类结果不准确。
如图10所示,如果将预设阈值设置为阈值4,阈值4大于阈值3,且筛选规则为筛选掉小于阈值4的特征点,那么电子设备将筛选掉小于阈值4的大量非指关节特征点,但是同时会筛选掉小于阈值4的一部分指关节特征点,使得一部分有效的指关节特征点被滤除,导致分类模型无法根据足够的有效数据进行准确分类。
根据上述实施例的描述可以看出,一方面,人工筛选方式过于依赖主观调试,需要花费大量时间进行阈值的测试,增加了工作复杂度。另一方面,当指关节特征点和非指关节特征点的取值分布较为接近或存在部分重叠时,即便调整阈值,也难以滤除大部分非指关节特征点,剩余大部分指关节特征点,使得最终获得的分类结果不准确。
为了解决上述两种算法存在的问题,本申请实施例对这两种算法进行了改进,提供了一种新的指关节操作的识别方法。该方法取消了上述两种算法中手动设置的预设阈值,采用ACC二分类模型对ACC特征进行分类,即,采用机器筛选方式代替人工筛选方式。另外,该方法增加了一个CAP二分类模型,用于从CAP信号中提取score特征。此外,该方法还对score特征、ACC特征和触摸特征进行了特征融合,用于预测最终的分类结果。
需要说明的是,本申请实施例提供的指关节操作的识别方法可以适用于各种电子设备。
其中,电子设备可以为手机、平板电脑、可穿戴设备、车载设备、增强现实(augmented reality,AR)、虚拟现实(virtual reality,VR)设备、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本、个人数字助理(personal digital assistant,PDA)或智慧屏等,或者可以为其他设置有触控屏的设备或装置。对于电子设备的具体类型,本申请实施例不作任何限制。
为了更清楚地示意改进后的指关节操作的识别方法,下面将从两个部分展开说明。
第一部分,对本申请实施例提供的新的指关节操作的识别方法的原理进行说明。
通常,根据大多数用户的操作习惯,会将加速度计设置在触控屏的左上角。当用户指关节触碰或敲击触控屏的不同触控区域时,加速度计的受力方式不同,加速度计将采集到不同的ACC数据。示例性地,图11至图18示出了指关节作用于不同的触控区域的四种场景。
场景1:
图11示出了食指的指关节敲击屏幕的左上部的正视图。图12示出了食指的指关节敲击屏幕的左上部的斜视图。假设X轴为触控屏所在平面的水平方向,Y轴为触控屏所在平面的竖直方向,Z轴为贯穿触控屏的垂直方向。如图11和图12所示,当食指的指关节敲击屏幕的左上部时,位于触控屏左上部的加速度计将获得一个向Z轴的 负方向的机械力F1。
场景2:
图13示出了食指的指关节敲击屏幕的右上部的正视图。图14示出了食指的指关节敲击屏幕的右上部的斜视图。假设X轴为触控屏所在平面的水平方向,Y轴为触控屏所在平面的竖直方向,Z轴为贯穿触控屏的垂直方向。如图13和图14所示,当食指的指关节敲击屏幕的右上部时,位于触控屏左上部的加速度计将获得一个向Z轴的负方向的机械力F2。
场景3:
图15示出了食指的指关节敲击屏幕的左下部的正视图。图16示出了食指的指关节敲击屏幕的左下部的斜视图。假设X轴为触控屏所在平面的水平方向,Y轴为触控屏所在平面的竖直方向,Z轴为贯穿触控屏的垂直方向。如图15和图16所示,当食指的指关节敲击屏幕的左下部时,位于触控屏左上部的加速度计将获得一个向Z轴的负方向的机械力F3。
场景4:
图17示出了食指的指关节敲击屏幕的右下部的正视图。图18示出了食指的指关节敲击屏幕的右下部的斜视图。假设X轴为触控屏所在平面的水平方向,Y轴为触控屏所在平面的竖直方向,Z轴为贯穿触控屏的垂直方向。如图17和图18所示,当食指的指关节敲击屏幕的右下部时,位于触控屏左上部的加速度计将获得一个向Z轴的正方向的机械力F4。
根据上述四种场景的描述可以看出,无论指关节作用于触控屏的哪个触控区域,加速度计主要在Z轴方向受到机械力的影响。
以ACC数据用于表征ACC信号的在Z轴方向的加速度为例,图19和图20示出了两种ACC信号的变化趋势的示意图。其中,横轴用于表示时间,纵轴用于表示ACC数据的取值。
如图19所示,在指关节作用于触控屏之前,从与第0帧数据对应的时刻到与第108帧数据对应的时刻之间,ACC数据的取值基本保持在2500。在指关节作用于触控屏的上部区域或左部区域之后,如左上部区域,右上部区域或左下部区域,从与第108帧数据对应的时刻到与第128帧数据对应的时刻,ACC数据的取值将出现一次明显下降的过程。
如图20所示,在指关节作用于触控屏之前,从与第0帧数据对应的时刻到与第114帧数据对应的时刻之间,ACC数据的取值基本保持在3000。在指关节作用于触控屏的右下部区域之后,从与第114帧数据对应的时刻到与第128帧数据对应的时刻,ACC数据的取值将出现一次明显上升的过程。
结合上述实施例的描述,当指关节作用于触控屏的不同触控区域时,ACC数据可能会呈现出不同的变化趋势。因此,本申请实施例提出了从触碰信号中提取触摸特征用来判断接触面积和接触位置,从ACC信号中提取ACC特征用来判断触摸屏幕的力度。另外,前置一个CAP二分类模型,从CAP信号中提取用于表征与指关节之间的相关性的score特征。最终将ACC特征、score特征和触摸特征进行特征融合,以获取更好的分类效果。
第二部分,对本申请实施例提供的改进后的指关节操作的识别方法的流程进行详细说明。
图21为本申请实施例提供的一种改进后的指关节操作的识别方法的流程示意图。该方法可以应用于识别指关节交互方式的场景中。该方法的执行主体可以为电子设备,或电子设备中的功能模块。该电子设备的屏幕为触控屏,触控屏上设置有ACC传感器和CAP传感器。
如图21所示,该方法可以包括下述的S1至S9。
S1.接收作用于触控屏的触控操作。
当用户的手掌、指腹、指尖、指甲、侧甲和指关节等身体部位作用于触控屏时,电子设备的传感器将检测到触控操作。由于该触控操作可能为指关节操作,也可能为非指关节操作,因此电子设备可以采用下述S2至S9判断是否为指关节操作,以执行对应的响应操作。
S2.响应于触控操作,获取ACC信号、CAP信号和触碰信号。
电子设备的传感器可以周期性采集数据。当电子设备接收到作用于触控屏的触控操作时,电子设备可以响应于触控操作,获取到通过ACC传感器采集的ACC信号,通过CAP传感器采集CAP信号,以及对CAP信号进行处理后得到的触碰信号。
其中,ACC信号为通过ACC传感器采集的原始信号,可以用于表征在机械力的作用下,电子设备的运动/静止状态。CAP信号为通过CAP传感器采集的原始信号,该原始信号携带了电容节点的电容信息。触碰信号为对CAP信号进行处理后得到的信号,可以用于表征用户身体部位与触控屏接触时的触摸特征。
对于这三种信号,可以参照上述实施例的具体描述,此处不再赘述。
需要说明的是,ACC信号、CAP信号和触碰信号可以为时域信号,也可以为频域信号。
S3.从ACC信号中提取ACC特征。
其中,ACC特征为ACC信号中与指关节的触控操作关联的特征,可以用来判断身体部位作用于触控屏的力度。
在本申请实施例中,电子设备可以采用预设的特征提取算法,从ACC信号中提取ACC特征。比如,特征提取算法可以为基于互信息的方法、基于最大相关-最小冗余的方法以及基于包装法(Wrapper)的特征选择方法等。
示例性地,ACC特征可以包括以下至少一项:最大一阶差分(maxgradient)特征,信号振幅(amplitude)特征,前段过零数(zerocrosscnt)特征,最大高通值(maxhighpass)特征,均值域绝对值之和(meanaddmax)特征,前段归一化值方差(accnormsquare)特征,前段归一化值振幅(accnormsquare)特征,快速傅立叶变换均值(fast fourier transformfft mean,fftmean)特征,部分快速傅立叶变换均值(part fftmean,partfftmean)特征。应理解,ACC特征还可能包括其他特征,本申请实施例不作限定。
S4.将ACC特征输入ACC二分类模型。
ACC二分类模型根据提取的ACC特征进行二分类,获取两种分类结果:
一种分类结果是:作用于触控屏的触控操作为指关节的触控操作。
另一种分类结果是:作用于触控屏的触控操作为非指关节的触控操作。
若ACC二分类模型的输出结果是:作用于触控屏的触控操作为指关节的触控操作,则执行下述S5。
若ACC二分类模型的输出结果是:作用于触控屏的触控操作为非指关节的触控操作,则不作任何处理,或执行与非指关节的触控操作对应的响应功能。
S5.将CAP信号输入CAP二分类模型,得到score特征。
其中,score特征可以用于表示CAP信号与指关节操作的关联程度,即,score特征为CAP信号中与指关节操作关联的特征。
作为通过CAP传感器采集的原始信号,CAP信号携带了电容节点的电容信息。结合上述实施例的描述,当身体部位与触控屏接触时,将引起电容节点的电容值变化。应理解,当用户采用不同的身体部位与触控屏接触时,不同手势将接触到不同的电容节点,引起的电容值变化也将有所差异。
示例性地,图22示出了一组指关节、指尖和指腹分别对应的7*7节点矩阵的示意图。如图22所示,位于每个7*7节点矩阵的最大电容节点的电容值均不相同,在每个最大电容节点周围分布的电容节点的电容值也有所差异。在将CAP信号输入CAP二分类模型后,CAP二分类模型根据CAP信号携带的电容节点的电容信息,将输出获取score得分,即score特征。应理解,如果是指关节的7*7节点矩阵,那么score得分将较高;如果是指尖或指腹的7*7节点矩阵,那么score得分将较低。
表1提供了一种交互方式与score得分的对应关系表。如表1所示,在将CAP信号输入CAP二分类模型后,CAP二分类模型输出结果为:作用于触控屏的触控操作为指关节的触控操作的score得分是0.92,作用于触控屏的触控操作为非指关节的触控操作的score得分是0.08。应理解,作用于触控屏的触控操作为指关节的触控操作的score得分越高,根据score得分等进行特征融合后,最终得到的分类结果为指关节的触控操作的可能性越高。相反,作用于触控屏的触控操作为指关节的触控操作的score得分越低,根据score得分等进行特征融合后,最终得到的分类结果为指关节的触控操作的可能性越低。
表1
S6.从触碰信号中提取触摸特征。其中,触摸特征可以用来表征身体部位接触触控屏时的接触面积和接触位置。应理解,不同手势对应的触摸特征将有所差异。
电子设备可以采用预设的特征提取算法,从触碰信号中提取触摸特征。触摸特征可以用于表示触碰信号与指关节操作的关联程度,即,触摸特征为触碰信号中与指关节操作关联的特征。比如,触摸特征可以包括接触位置(location)特征和/或接触面积(pressure)特征。其中,接触位置特征可以用于表示身体部位在触控屏上的交互位置,接触面积特征可以用于表示身体部位与触控屏接触的面积。
需要说明的是,与前两种算法直接对触碰信号进行阈值过滤有所不同,本申请在改进后的指关节操作的识别方法中提出了从触碰信号中提取触控特征的概念。
在传统方式中,电子设备通常会采用X轴坐标和Y轴坐标表示触碰点在触控屏中 的详细位置,但是,这种采用坐标表示触碰位置的方式存在计算量大和易泄漏用户隐私的问题。为了解决此类问题,本申请实施例新提出了一个新的概念-坐标网格编号,用于表示触碰点在触控屏中的大致位置。
相应地,上述位置特征可以为网格特征,该网格特征用于表示触碰点的坐标网格编号。
具体地,电子设备可以根据触控屏的分辨率,将触控屏划分为一个p行、q列的网格列表,网格列表中的每个网格采用一个坐标网格编号表示。每个网格的长度等于触控屏的纵轴的像素点数除以p,每个网格的宽度等于触控屏的横轴的像素点数除以q。比如,若手机的屏幕分辨率为1600×1200像素,则屏幕被划分为一个4×3的网格列表。再比如,若手机的屏幕分辨率为1920×1080像素,则屏幕被划分为一个6×4的网格列表。p和q为正整数。
结合上述实施例对图2的描述,由于横向电极和纵向电极相互垂直于触控屏的显示器上,因此,每个坐标网格编号指示的网格将覆盖若干电极交汇点。当指关节引起某些触摸点附近两个电极之间的耦合时,电容节点的电容值发生变化,从而从触碰信号中可以获取X轴坐标和Y轴坐标,进而确定触碰点的坐标网格编号,即确定了触碰点在触控屏中的大致位置。
示例性地,图23为本申请实施例提供的一种网格列表的示意图。如图23所示,触控屏被划分为一个7行、4列的网格列表,且每个网格采用一个坐标网格编号表示。比如,从触控屏顶部开始,第一行网格从左到右依次的坐标网格编号为00、01、02、03,第二行网格从左到右依次的坐标网格编号为10、11、12、13,第三行网格从左到右依次的坐标网格编号为20、21、22、23……第七行网格从左到右依次的坐标网格编号为60、61、62、63。当指关节敲击坐标网格编号21对应的区域时,电子设备先获取CAP信号,CAP信号包含如局部放大图所示的7*7节点矩阵中各个电容节点的电容值;然后对CAP信号进行处理,得到触碰信号;之后根据触碰信号的X轴坐标和Y轴坐标,从触碰信号中提取网格特征,该网格特征用于表示触碰点的坐标网格编号21,即,确定了触碰点在触控屏中的大致位置是坐标网格编号21所指示的区域。
需要说明的是,上述图23是以触碰点在触控屏中的大致位置是一个网格为例进行说明的,其并不对本申请实施例形成限定。在实际实现时,触碰点在触控屏中可能会跨越多个网格。此时,网格特征可以用于表示多个网格的坐标网格编号,或多个网格中的一个网格的坐标网格编号,该一个网格包含的触碰点的数量多于其他网格包含的触碰点的数量。
另外,本申请实施例对S3、S4、S5和S6的执行顺序不作限定。
在第一种实现方式中,电子设备可以一边从ACC信号中提取ACC特征,一边从触碰信号中提取触摸特征,之后根据ACC二分类模型的输出结果,确定是否将CAP信号输入CAP二分类模型,以对提取的多种特征进行特征融合,即先执行S3和S6,再执行S4和S5。
在第二种实现方式中,电子设备可以先从ACC信号中提取ACC特征,之后若ACC二分类模型的输出结果指示作用于触控屏的触控操作为指关节的触控操作,则将CAP信号输入CAP二分类模型,并从触碰信号中提取触摸特征,以对提取的多种特征进行 特征融合,即先执行S3和S4,再执行S5和S6。
应理解,与第一种实现方式相比,在第二种实现方式中,在ACC二分类模型的输出结果指示作用于触控屏的触控操作为非指关节的触控操作的情况下,无需从触碰信号中提取触摸特征,针对这种情形,可以在一定程度上降低电子设备的计算量。
S7.对提取到的ACC特征、score特征和触摸特征进行特征融合。
在提取到ACC特征、score特征和触摸特征后,电子设备可以采用预设特征融合算法,对这些特征进行拼接。比如,预设特征融合算法可以是早融合算法(early fusion)和晚融合(late fusion)算法等。
示例性地,如图24所示,电子设备可以包含一个特征融合模块。假设电子设备从触碰信号中提取了接触位置特征和接触面积特征,从ACC信号中提取了最大一阶差分(maxgradient)特征,信号振幅(amplitude)特征,前段过零数(zerocrosscnt)特征,最大高通值(maxhighpass)特征,均值域绝对值之和(meanaddmax)特征,前段归一化值方差(accnormsquare)特征,前段归一化值振幅(accnormsquare)特征,快速傅立叶变换均值(fast fourier transformfft mean,fftmean)特征和部分快速傅立叶变换均值(part fftmean,partfftmean)特征,从CAP信号中提取了score特征。特征融合模块可以对这12个特征进行特征融合,得到融合后的特征。
S8.将融合后的特征输入指关节分类模型。
指关节分类模型根据融合后的特征,预测是否为指关节交互方式。
本申请实施例提供的改进后的指关节操作的识别方法涉及到了三种二分类模型:ACC二分类模型、CAP二分类模型和指关节分类模型。这三种模型二分类可以为传统机器学习模型或神经网络模型。
在一些实施例中,由于从CAP信号中提取score特征类似单通道图像,因此CAP二分类模型可以为卷积神经网络(convolutional neural networks,CNN)模型。CNN是一种人工神经网络。
示例性地,CAP二分类模型可以如图25所示,CNN的结构分为3层:卷积层(convolutional layer),主要作用是提取CAP特征;池化层(max pooling layer),主要作用是下采样(down sampling),却不会损坏识别结果;全连接层(fully connected layer),主要作用是分类,以确定触控操作为指关节操作还是非指关节操作。
在一些实施例中,以从ACC信号中提取了9个ACC特征,从触碰信号中提取了2个触摸特征为例,由于ACC二分类模型仅需要对9个特征进行处理,指关节分类模型仅需要对12个特征进行处理,计算量较低,因此,ACC二分类模型和指关节分类模型可以采用全连接神经网络(fully connected neural network,DNN)模型。DNN网络模型是一种多层感知机。感知机的原理是寻找类别间最合理、最具有鲁棒性的超平面,最具代表性的感知机是支持向量机(support vector machine,SVM)算法。
示例性地,ACC二分类模型可以如图26所示,指关节分类模型可以如图27所示。这两个DNN模型均包括输入层、隐藏层和输出层。其中,隐藏层的数量可以为多个,应理解,增加隐藏层的数量可以更好地分离数据的特征,但是过多的隐藏层也会增加训练时间以及产生过拟合。如图26和图27所示,在ACC二分类模型的输入层中输入了9个ACC特征,在指关节分类模型的输入层中输入了9个ACC特征、1个score特 征、2个触摸特征(网格特征和接触面积特征)。
需要说明的是,对于ACC二分类模型、CAP二分类模型和指关节分类模型的训练过程可以参照现有技术,此处不予赘述。
S9.获取指关节分类模型的分类结果。
指关节分类模型根据融合后的特征进行二分类,获取两种分类结果:
一种分类结果是作用于触控屏的触控操作为指关节的触控操作,即,指关节分类模型识别到的手势交互方式具体为指关节交互方式。
另一种分类结果是作用于触控屏的触控操作为非指关节的触控操作,即,指关节分类模型识别到的手势交互方式具体为非指关节交互方式。
若指关节分类模型的输出结果指示作用于触控屏的触控操作为指关节的触控操作,则电子设备可以执行与指关节的触控操作对应的响应功能。
若指关节分类模型的输出结果指示作用于触控屏的触控操作为非指关节的触控操作,则电子设备可以不作任何处理,或执行与非指关节的触控操作对应的响应功能。
在一些实施例中,电子设备可以设置多种类型的指关节手势。在识别出指关节交互方式之后,电子设备可以根据指关节与触控屏的触碰位置,指关节与触控屏的触碰时间,以及指关节在触控屏的滑动距离等参数,确定指关节交互方式具体属于哪种类型的指关节手势,并执行与指关节手势对应的响应功能。
示例性地,指关节手势包括以下至少一种:指关节双击手势,指关节敲击并画圈手势,指关节敲击并画字母S手势,三个指关节沿着屏幕由上向下滑动手势,双指关节双击手势,指关节敲击并在屏幕中间画直线手势。
相应地,指关节双击手势对应截取全屏功能,指关节敲击并画圈手势对应局部截屏功能,指关节敲击并画字母S手势对应滚动截屏功能,三个指关节沿着屏幕由上向下滑动手势对应滑动截屏功能,双指关节双击手势对应启动/停止录屏功能,指关节敲击并在屏幕中间画直线手势对应分屏功能。
本申请实施例提供的指关节操作的识别方法,取消了手动设置的阈值,采用ACC二分类模型对ACC特征进行分类,即,采用机器筛选方式代替人工筛选方式,从而提高了ACC特征筛选效率,并提升了筛选效果。另外,通过增加一个CAP二分类模型,可以从CAP信号中提取score特征,然后对score特征、ACC特征和触摸特征进行了特征融合,最终利用融合后的特征进行分类,可以获得更好的分类效果。
需要说明的是,本申请实施例在不同的实验场景下,分别采用原识别方法与改进后的识别方法对指关节操作和非指关节操作进行了识别。大量实验结果表明,改进后的识别方法对指关节操作的识别率高于原识别方法对指关节操作的识别率,改进后的识别方法对非指关节操作的误触率低于原识别方法对非指指关节的误触率。即,改进后的识别方法提高了指关节操作的识别率,并且降低了非指关节操作的误触率,获得了更好的分类效果。
图28是本申请实施例的电子设备的软件结构示意图。分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将Android系统的软件层从上至下依次划分为:应用层(application)、框架层(framework,FWK)、输入层(Input)、硬件抽象层(HAL)、驱动层等。
另外,图28还示出了与软件层连接的硬件层,硬件层可以包括微控制单元(microcontroller unit,MCU)、ACC传感器和CAP传感器等。
应用程序层可以包括一系列应用程序包,比如包括操作系统(operating system,OS)应用。OS应用可以通过调用系统应用程序接口(application programming interface,API)接口来触发指关节触摸操作对应相关的功能,指关节触摸操作对应的功能可通过OS应用进行自定义。另外,OS应用可以提供用户界面给用户,以便于用户在用户界面上自行定义指关节触摸操作对应的功能。
框架层为应用程序层的应用程序提供API和编程框架。应用程序框架层包括一些预先定义的函数。比如,框架层可以包括输入管理服务,用于接收和分发输入事件,对输入事件进行映射,对收集到的输入事件进行判断处理,将输入事件分发到上层。在本申请实施例中,输入管理服务可以对指腹、指尖、指甲、侧甲和指关节等手势进行管理,以执行对应的快捷处理动作。
输入层用于判断输入事件的类型,比如,输入层的手势处理模块可以调用硬件抽象层的手势识别模块,以判断输入事件的触控类型。
硬件抽象层是位于操作系统内核与硬件电路之间的接口层,其目的在于将硬件抽象化。它隐藏了特定平台的硬件接口细节,为操作系统提供虚拟硬件平台,使其具有硬件无关性,可在多种平台上进行移植。从软硬件测试的角度来看,软硬件的测试工作都可分别基于硬件抽象层来完成,使得软硬件测试工作的并行进行成为可能。
示例性地,ACC传感器采集ACC信号,并将ACC信号发送至MCU。CAP传感器采集CAP信号,并将CAP信号发送至MCU。MCU对CAP信号进行处理得到触碰信号,然后MCU将ACC信号、CAP信号和触碰信号,发送至硬件抽象层的手势识别模块。手势识别模块按照上述实施例S1至S9中的方法,对这些信号进行处理,以得到识别结果。如果手势识别结果为指关节交互方式,则手势识别模块依次通过手势处理模块、输入管理服务向OS应用上报指关节触控事件。之后,OS应用可以通过调用OS的API接口,触发指关节触摸操作对应相关的功能。
需要说明的是,上述触碰信号也可以是由软件层的功能模块,对CAP信号进行处理得到的,本申请实施例不作限定。
上述主要从电子设备的角度对本申请实施例提供的方案进行了介绍。可以理解的是,电子设备为了实现上述功能,其包含了执行每一个功能相应的硬件结构或软件模块,或两者结合。本领域技术人员应该很容易意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,本申请能够以硬件或硬件和计算机软件的结合形式来实现。某个功能究竟以硬件还是计算机软件驱动硬件的方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
本申请实施例可以根据上述方法示例对电子设备进行功能模块的划分,例如,可以对应每一个功能划分每一个功能模块,也可以将两个或两个以上的功能集成在一个处理模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。需要说明的是,本申请实施例中对模块的划分是示意性的,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式。下面以采用对应每一个功能划分 每一个功能模块为例进行说明。
图29为本申请实施例提供的一种识别装置的结构示意图。如图29所示,该识别装置90可以包括获取模块91、特征提取模块92、特征融合模块93和分类模块94。
获取模块91,用于响应作用于触控屏的触控操作,获取ACC信号、CAP信号和触碰信号,ACC信号为通过ACC传感器采集的原始信号,CAP信号为通过CAP传感器采集的原始信号,触碰信号为对CAP信号进行处理后得到的信号。
特征提取模块92,用于提取ACC特征、score特征和触摸特征,ACC特征为ACC信号中与指关节操作关联的特征,score特征为CAP信号中与指关节操作关联的特征,触摸特征为触碰信号中与指关节操作关联的特征。
特征融合模块93,用于对ACC特征、score特征和触摸特征进行特征融合。
分类模块94,用于将融合后的特征输入指关节分类模型,得到指关节分类结果。指关节分类结果指示触控操作为指关节操作或非指关节操作。
在一些实施例中,特征提取模块92具体用于:从ACC信号中提取ACC特征。分类模块94还用于:将ACC特征输入ACC二分类模型,得到ACC分类结果;并在ACC分类结果指示触控操作为指关节操作的情况下,将CAP信号输入CAP二分类模型,得到score特征。特征提取模块92具体用于:在ACC分类结果指示触控操作为指关节操作的情况下,从触碰信号中提取触摸特征。
在一种可能的实现方式中,特征提取模块92具体用于:从ACC信号中提取ACC特征,并从触碰信号中提取触摸特征。分类模块94还用于:将ACC特征输入ACC二分类模型,得到ACC分类结果。特征提取模块92具体用于:在ACC分类结果指示触控操作为指关节操作的情况下,将CAP信号输入CAP二分类模型,得到score特征。
图30示出了本申请实施例提供的电子设备的硬件结构示意图。
如图30所示,电子设备可以包括:处理器110,外部存储器接口120,内部存储器121,USB接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。其中,传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,以及骨传导传感器180M等。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括中央处理器(central processing unit,CPU),图像信号处理器(image signal processor,ISP),数字信号处理器(digital signal processor,DSP),视频编解码器,神经网络处理器(neural-network processing unit,NPU),图形处理器(graphics processing unit,GPU),应用处理器(application processor,AP),和/或调制解调处理器等。在一些实施例中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
其中,CPU是信息处理、程序运行的最终执行单元,其主要工作包括处理指令、执行操作、控制时间和处理数据等。CPU可以包括控制器、运算器、高速缓冲存储器, 以及用于连接这些部件的总线。控制器可以是电子设备的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备100的智能认知等应用,例如:图像识别,人脸识别,语音识别,文本理解等。在本申请实施例中,NPU可以用于根据ACC信号和CAP信号,训练ACC二分类模型、CAP二分类模型和指关节分类模型。
加速度传感器180E用于检测电子设备100在各个方向上加速度的大小。当电子设备100静止时可检测出重力的大小及方向。加速度传感器180E还可以用于识别电子设备姿态,应用于横竖屏切换,计步器等应用。
在本申请实施例中,加速度传感器180E可以设置在触控屏的左上角。在指关节敲击触控屏时,加速度传感器180E可以采集ACC信号。
触摸传感器180K,设置于显示屏194,用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于电子设备100的表面,与显示屏194所处的位置不同。
在本申请实施例中,触摸传感器180K可以为CAP传感器。其中,CAP传感器可以互电容式触摸传感器或自电容式触摸传感器。利用触摸传感器180K被触摸时感应电容会发生变化的特性,通过原始数据检测各点的感应电容的变化,当某一点或多点的感应电容的变化量超过一定阈值时确定该点被触摸,从而实现对被触摸点的位置的检测。
需要说明的是,电子设备与导电性能好的物体或人体接触时,例如被用户手持时,电子设备对地电容较小;电子设备与导电性能不好的物体接触时,例如被放置在桌面等绝缘体上时,或者,电子设备通过可折叠保护皮套放置在桌面上时,电子设备对地电容较大。也就是说,当电子设备被用户手持时,电子设备处于接地场景;当电子设备被放置在桌面时,电子设备处于浮地状态。
本申请实施例还提供了一种电子设备,包括处理器,处理器与存储器耦合,处理器用于执行存储器中存储的计算机程序或指令,以使得电子设备实现上述各实施例中的方法。
本申请实施例还提供了一种计算机可读存储介质,该计算机可读存储介质中存储有计算机指令;当该计算机可读存储介质在电子设备上运行时,使得该电子设备执行如上所示的方法。该计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,该计算机指令可以从一个网站站点、计算机、服务器或者数据中心通过有线(例如同轴电缆、光纤、数字用户 线(digital subscriber line,DSL))或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。该计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可以用介质集成的服务器、数据中心等数据存储设备。可用介质可以是磁性介质(例如,软盘、硬盘或磁带),光介质或者半导体介质(例如固态硬盘(solid state disk,SSD))等。
本申请实施例还提供了一种计算机程序产品,该计算机程序产品包括计算机程序代码,当计算机程序代码在计算机上运行时,使得计算机执行上述各实施例中的方法。
本申请实施例还提供了一种芯片,该芯片与存储器耦合,该芯片用于读取并执行存储器中存储的计算机程序或指令,以执行上述各实施例中的方法。该芯片可以为通用处理器,也可以为专用处理器。
需要说明的是,该芯片可以使用下述电路或者器件来实现:一个或多个现场可编程门阵列(field programmable gate array,FPGA)、可编程逻辑器件(programmable logic device,PLD)、控制器、状态机、门逻辑、分立硬件部件、任何其他适合的电路、或者能够执行本申请通篇所描述的各种功能的电路的任意组合。
上述本申请实施例提供的电子设备、识别装置、计算机可读存储介质、计算机程序产品以及芯片均用于执行上文所提供的方法,因此,其所能达到的有益效果可参考上文所提供的方法对应的有益效果,在此不再赘述。
应理解,上述只是为了帮助本领域技术人员更好地理解本申请实施例,而非要限制本申请实施例的范围。本领域技术人员根据所给出的上述示例,显然可以进行各种等价的修改或变化,例如,上述检测方法的各个实施例中某些步骤可以是不必须的,或者可以新加入某些步骤等。或者上述任意两种或者任意多种实施例的组合。这样的修改、变化或者组合后的方案也落入本申请实施例的范围内。
还应理解,上文对本申请实施例的描述着重于强调各个实施例之间的不同之处,未提到的相同或相似之处可以互相参考,为了简洁,这里不再赘述。
还应理解,上述各过程的序号的大小并不意味着执行顺序的先后,各过程的执行顺序应以其功能和内在逻辑确定,而不应对本申请实施例的实施过程构成任何限定。
还应理解,本申请实施例中,“预先设定”、“预先定义”可以通过在设备(例如,包括电子设备)中预先保存相应的代码、表格或其他可用于指示相关信息的方式来实现,本申请对于其具体的实现方式不做限定。
还应理解,本申请实施例中的方式、情况、类别以及实施例的划分仅是为了描述的方便,不应构成特别的限定,各种方式、类别、情况以及实施例中的特征在不矛盾的情况下可以相结合。
还应理解,在本申请的各个实施例中,如果没有特殊说明以及逻辑冲突,不同的实施例之间的术语和/或描述具有一致性、且可以相互引用,不同的实施例中的技术特征根据其内在的逻辑关系可以组合形成新的实施例。
最后应说明的是:以上描述内容,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何在本申请揭露的技术范围内的变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以权利要求的保护范围为准。

Claims (17)

  1. A recognition method for a knuckle operation, characterized in that the method comprises:
    in response to a touch operation acting on a touchscreen, acquiring an acceleration signal, a capacitance signal and a touch signal, wherein the acceleration signal is a raw signal collected by an acceleration sensor, the capacitance signal is a raw signal collected by a capacitance sensor, and the touch signal is a signal obtained by processing the capacitance signal;
    extracting an acceleration feature, a confidence feature and a touch feature, wherein the acceleration feature is a feature in the acceleration signal that is associated with a knuckle operation, the confidence feature is a feature in the capacitance signal that is associated with a knuckle operation, and the touch feature is a feature in the touch signal that is associated with a knuckle operation;
    performing feature fusion on the acceleration feature, the confidence feature and the touch feature; and
    inputting the fused feature into a knuckle classification model to obtain a target classification result, wherein the target classification result indicates that the touch operation is a knuckle operation or a non-knuckle operation.
  2. The method according to claim 1, characterized in that the confidence feature is a feature extracted from the capacitance signal in a case where the acceleration feature indicates that the touch operation is a knuckle operation.
  3. The method according to claim 2, characterized in that the extracting an acceleration feature, a confidence feature and a touch feature comprises:
    extracting the acceleration feature from the acceleration signal;
    inputting the acceleration feature into an acceleration binary classification model to obtain a preliminary classification result; and
    in a case where the preliminary classification result indicates that the touch operation is a knuckle operation, inputting the capacitance signal into a capacitance binary classification model to obtain the confidence feature, and extracting the touch feature from the touch signal.
  4. The method according to claim 2, characterized in that the extracting an acceleration feature, a confidence feature and a touch feature comprises:
    extracting the acceleration feature from the acceleration signal, and extracting the touch feature from the touch signal;
    inputting the acceleration feature into an acceleration binary classification model to obtain a preliminary classification result; and
    in a case where the preliminary classification result indicates that the touch operation is a knuckle operation, inputting the capacitance signal into a capacitance binary classification model to obtain the confidence feature.
  5. The method according to any one of claims 1 to 4, characterized in that the touch feature comprises at least one of a contact position feature and a contact area feature;
    wherein the contact position feature is used to represent an interaction position of a body part on the touchscreen, and the contact area feature is used to represent an area of contact between a body part and the touchscreen.
  6. The method according to claim 5, characterized in that the contact position feature is used to represent a coordinate grid number of a grid in which a touch point is located;
    wherein the grid in which the touch point is located is at least one grid in a grid list obtained by dividing the touchscreen according to a resolution of the touchscreen.
  7. The method according to claim 6, characterized in that the grid list comprises p rows and q columns of grids, a length of each grid in the grid list is equal to the number of pixels along the vertical axis of the touchscreen divided by p, a width of each grid in the grid list is equal to the number of pixels along the horizontal axis of the touchscreen divided by q, and p and q are positive integers.
  8. The method according to claim 6 or 7, characterized in that the touch feature comprises the contact position feature;
    extracting the contact position feature comprises:
    determining an X-axis coordinate and a Y-axis coordinate of the touch point according to the touch signal; and
    determining, according to the X-axis coordinate and the Y-axis coordinate, the contact position feature representing the coordinate grid number of the grid in which the touch point is located;
    wherein the X-axis is the horizontal direction of the plane in which the touchscreen lies, and the Y-axis is the vertical direction of the plane in which the touchscreen lies.
  9. The method according to any one of claims 1 to 4, characterized in that the acceleration feature comprises at least one of the following: a maximum first-order difference feature, a signal amplitude feature, a front-segment zero-crossing count feature, a maximum high-pass value feature, a sum-of-absolute-values-in-the-mean-domain feature, a front-segment normalized-value variance feature, a front-segment normalized-value amplitude feature, a fast Fourier transform mean feature, and a partial fast Fourier transform mean feature.
  10. The method according to any one of claims 1 to 4, characterized in that the confidence feature is a confidence score, and the confidence score is used to represent a degree of association between the capacitance signal and a knuckle operation.
  11. The method according to claim 3 or 4, characterized in that the capacitance binary classification model is a convolutional neural network model, and the acceleration binary classification model is a fully-connected neural network model.
  12. The method according to any one of claims 1 to 4, characterized in that the knuckle classification model is a fully-connected neural network model.
  13. The method according to any one of claims 1 to 12, characterized in that the method further comprises:
    in a case where the target classification result indicates that the touch operation is a knuckle operation, determining a knuckle gesture to which the touch operation belongs, and executing a response function corresponding to the knuckle gesture;
    wherein different knuckle gestures correspond to different response functions.
  14. The method according to claim 13, characterized in that the knuckle gesture comprises at least one of the following: a knuckle double-tap gesture, a knuckle tap-and-draw-a-circle gesture, a knuckle tap-and-draw-the-letter-S gesture, a gesture of sliding three knuckles down the screen from top to bottom, a two-knuckle double-tap gesture, and a knuckle tap-and-draw-a-straight-line-across-the-middle-of-the-screen gesture; wherein,
    a response function corresponding to the knuckle double-tap gesture is a full-screen screenshot function;
    a response function corresponding to the knuckle tap-and-draw-a-circle gesture is a partial screenshot function;
    a response function corresponding to the knuckle tap-and-draw-the-letter-S gesture is a scrolling screenshot function;
    a response function corresponding to the gesture of sliding three knuckles down the screen from top to bottom is a sliding screenshot function;
    a response function corresponding to the two-knuckle double-tap gesture is a start/stop screen recording function; and
    a response function corresponding to the knuckle tap-and-draw-a-straight-line-across-the-middle-of-the-screen gesture is a split-screen function.
  15. An electronic device, characterized by comprising a processor, wherein the processor is coupled to a memory, and the processor is configured to execute a computer program or instructions stored in the memory, so that the electronic device implements the recognition method for a knuckle operation according to any one of claims 1 to 14.
  16. A chip, characterized in that the chip is coupled to a memory, and the chip is configured to read and execute a computer program stored in the memory, to implement the recognition method for a knuckle operation according to any one of claims 1 to 14.
  17. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program, and when the computer program runs on an electronic device, the electronic device is caused to execute the recognition method for a knuckle operation according to any one of claims 1 to 14.
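
By way of illustration only, and not as part of the claims: a minimal Python sketch of the grid-number computation described in claims 6 to 8, assuming row-major numbering of the p-by-q grid list; the function name and the numbering scheme are assumptions.

def grid_number(x: int, y: int, width_px: int, height_px: int, p: int, q: int) -> int:
    """Map a touch point (x, y), in screen pixels, to the coordinate grid number
    of the grid cell it falls in, for a grid list of p rows and q columns."""
    cell_length = height_px / p  # vertical-axis pixels divided by p (claim 7)
    cell_width = width_px / q    # horizontal-axis pixels divided by q (claim 7)
    row = min(int(y // cell_length), p - 1)  # clamp points on the bottom edge
    col = min(int(x // cell_width), q - 1)   # clamp points on the right edge
    return row * q + col                     # row-major numbering (assumed)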
PCT/CN2023/091715 2022-07-01 2023-04-28 Recognition method for knuckle operation and electronic device WO2024001501A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210768211.4 2022-07-01
CN202210768211.4A CN117389454A (zh) Recognition method for knuckle operation and electronic device

Publications (1)

Publication Number Publication Date
WO2024001501A1 true WO2024001501A1 (zh) 2024-01-04

Family

ID=89383195

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/091715 WO2024001501A1 (zh) 2022-07-01 2023-04-28 指关节操作的识别方法及电子设备

Country Status (2)

Country Link
CN (1) CN117389454A (zh)
WO (1) WO2024001501A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106445120A (zh) * 2016-09-05 2017-02-22 华为技术有限公司 触控操作的识别方法及装置
CN107077242A (zh) * 2014-09-24 2017-08-18 齐科斯欧公司 通过使用时空触摸图案来提高触摸屏事件分析的准确性的方法
US20190317633A1 (en) * 2018-04-13 2019-10-17 Silicon Integrated Systems Corp Method and system for identifying tap events on touch panel, and touch-controlled end project
CN112445410A (zh) * 2020-12-07 2021-03-05 北京小米移动软件有限公司 触控事件识别方法、装置及计算机可读存储介质
CN113449725A (zh) * 2021-06-30 2021-09-28 平安科技(深圳)有限公司 对象分类方法、装置、设备及存储介质
CN113919390A (zh) * 2021-09-29 2022-01-11 华为技术有限公司 一种识别触摸操作的方法及电子设备

Also Published As

Publication number Publication date
CN117389454A (zh) 2024-01-12

Similar Documents

Publication Publication Date Title
US11009989B2 (en) Recognizing and rejecting unintentional touch events associated with a touch sensitive device
Sharma et al. Human computer interaction using hand gesture
Gu et al. Accurate and low-latency sensing of touch contact on any surface with finger-worn IMU sensor
US8270670B2 (en) Method for recognizing and tracing gesture
Lee et al. Finger identification and hand gesture recognition techniques for natural user interface
US9778789B2 (en) Touch rejection
US20130155026A1 (en) New kind of multi-touch input device
WO2012144970A1 (en) Obstructing user content based on location
CN104516499B (zh) 利用用户接口的事件的设备和方法
CN104885051A (zh) 锚拖动触摸符号辨识
Chua et al. Hand gesture control for human–computer interaction with Deep Learning
Zahra et al. Camera-based interactive wall display using hand gesture recognition
CN116198435B (zh) 车辆的控制方法、装置、车辆以及存储介质
WO2024001501A1 (zh) Recognition method for knuckle operation and electronic device
Liu et al. Ultrasonic positioning and IMU data fusion for pen-based 3D hand gesture recognition
Dhamanskar et al. Human computer interaction using hand gestures and voice
Lakshmi et al. A novel air writing recognition system using Raspberry Pi for the control and interaction of digital systems
US20220050528A1 (en) Electronic device for simulating a mouse
TWI478017B (zh) 觸控裝置及其觸控方法
CN109542229B (zh) 手势识别方法、用户设备、存储介质及装置
Raees et al. THE-3DI: Tracing head and eyes for 3D interactions: An interaction technique for virtual environments
Feng et al. FM: Flexible mapping from one gesture to multiple semantics
Babu et al. Controlling Computer Features Through Hand Gesture
Agarwal et al. Evaluation of microgesture recognition using a smartwatch
NL2031789B1 (en) Aggregated likelihood of unintentional touch input

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23829687

Country of ref document: EP

Kind code of ref document: A1