CN106155642B - Information processing method and electronic equipment - Google Patents

Information processing method and electronic equipment

Info

Publication number
CN106155642B
CN106155642B
Authority
CN
China
Prior art keywords
area
sensing
touch
unit
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510134587.XA
Other languages
Chinese (zh)
Other versions
CN106155642A (en)
Inventor
卢睿
谢晓辉
马骞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201510134587.XA priority Critical patent/CN106155642B/en
Publication of CN106155642A publication Critical patent/CN106155642A/en
Application granted granted Critical
Publication of CN106155642B publication Critical patent/CN106155642B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses an information processing method, which comprises: acquiring original data of a touch event, the touch event being a touch operation performed by an operating body on a touch detection unit; processing the original data of the touch event to obtain a sensing area, the sensing area describing the area range in which the touch detection unit senses the operating body; calculating characteristic parameters of the sensing area; and determining the validity of the sensing area according to the characteristic parameters of the sensing area. The invention also discloses an electronic device.

Description

Information processing method and electronic equipment
Technical Field
The present invention relates to electronic technologies, and in particular, to an information processing method and an electronic device.
Background
Display devices of electronic devices are increasingly implemented with touch screens. When using such an electronic device, as shown in fig. 1-1, a user who needs to input text usually calls up a virtual keyboard on the touch screen 11 of the electronic device: for example, when the user opens a browser and wants to enter a keyword into the input box 13 of a search engine (at the position of the input cursor 12), the electronic device calls up a virtual touch screen keyboard 14. In fig. 1-1, when the user types with fingers on the virtual touch screen keyboard, the user expects to rest the hands on a support just as with a physical keyboard; for example, if a physical keyboard is placed on a desktop, the user's palms and wrists rest on the desktop. When a virtual touch screen keyboard is used, the user therefore naturally rests the hands on the touch screen while tapping, and the contact of the palms and wrists with the touch screen generates input responses, causing erroneous operations. In other scenarios, when the user inputs on the virtual touch screen keyboard with a stylus, the palm and wrist are likewise supported on the touch screen and may also generate false touches.
Disclosure of Invention
In view of this, embodiments of the present invention provide an information processing method and an electronic device to solve at least one problem in the prior art; they can avoid erroneous operations caused by the palm and wrist, thereby improving user experience.
The technical scheme of the embodiment of the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides an information processing method, where the method includes:
acquiring original data of a touch event, wherein the touch event is a touch operation of an operating body on a touch detection unit;
processing the original data of the touch event to obtain a sensing area; the sensing area is used for describing an area range of the touch detection unit sensing the operation body;
calculating characteristic parameters of the sensing area;
determining the effectiveness of the sensing region according to the characteristic parameters of the sensing region.
In a second aspect, an embodiment of the present invention provides an electronic device, including a first obtaining unit, a processing unit, a calculating unit, and a first determining unit, where:
the first acquisition unit is used for acquiring original data of a touch event, wherein the touch event is the touch operation of an operation body on the touch detection unit;
the processing unit is used for processing the original data of the touch event to obtain a sensing area; the sensing area is used for describing an area range of the touch detection unit sensing the operation body;
the calculation unit is used for calculating characteristic parameters of the sensing area;
the first determining unit is used for determining the effectiveness of the sensing area according to the characteristic parameters of the sensing area.
According to the information processing method and the electronic device provided by the embodiment of the invention, the original data of the touch event is acquired, wherein the touch event is the touch operation of an operating body on a touch detection unit; processing the original data of the touch event to obtain a sensing area; the sensing area is used for describing an area range of the touch detection unit sensing the operation body; calculating characteristic parameters of the sensing area; determining the effectiveness of the sensing region according to the characteristic parameters of the sensing region; therefore, misoperation caused by the palm and the wrist can be avoided, and user experience is improved.
Drawings
FIG. 1-1 is a schematic diagram of a virtual keyboard in the related art;
fig. 1-2 are schematic diagrams illustrating a flow chart of an information processing method according to an embodiment of the present invention;
FIGS. 1-3 are schematic diagrams of a touch unit according to an embodiment of the invention;
FIGS. 1-4 are schematic views of a sensing region according to an embodiment of the invention;
FIGS. 1-5 are schematic diagrams illustrating the results of determining a sensing region according to one embodiment of the present invention;
FIG. 2-1 is a schematic flow chart of an implementation of a second information processing method according to an embodiment of the present invention;
FIG. 2-2 is a diagram illustrating the value types of the sensing nodes according to a second embodiment of the present invention;
FIG. 3 is a schematic diagram of an implementation flow of a third information processing method according to an embodiment of the present invention;
FIG. 4-1 is a schematic flow chart of an implementation of a fourth information processing method according to an embodiment of the present invention;
FIG. 4-2 is a schematic diagram of a touch model according to a fourth embodiment of the invention;
FIG. 5-1 is a schematic flow chart of an implementation of a fifth information processing method according to an embodiment of the present invention;
FIG. 5-2 is a schematic diagram of a fifth embodiment of the present invention;
FIG. 6 is a schematic diagram of an implementation flow of a sixth information processing method according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of a seventh electronic device according to an embodiment of the invention;
fig. 8 is a schematic structural diagram of an eighth electronic device according to an embodiment of the invention;
fig. 9 is a schematic structural diagram of a ninth electronic device according to an embodiment of the present invention;
fig. 10 is a schematic structural diagram of a tenth electronic device according to an embodiment of the invention;
fig. 11 is a schematic structural diagram of an eleventh electronic device according to an embodiment of the invention.
Detailed Description
The technical solution of the present invention is further elaborated below with reference to the drawings and the specific embodiments.
Example one
The embodiment of the invention provides an information processing method applied to an electronic device. In particular, the electronic device in this embodiment includes a touch unit, where the touch unit includes a touch display unit and a touch detection unit: the touch detection unit is typically a touch pad, and the touch display unit is typically a touch screen, i.e. a device combining a touch pad with a display unit such as a display screen. Specifically, the electronic device in this embodiment may be a smart phone including a touch screen, a personal computer, a notebook computer, a netbook computer (netbook), a tablet computer, a desktop computer, a smart television, and the like.
The functions implemented by the information processing method provided in this embodiment may be implemented by a processor in the electronic device calling a program code, and certainly, the program code may be stored in a computer storage medium.
Fig. 1-2 are schematic diagrams illustrating a flow of implementing an information processing method according to an embodiment of the present invention, as shown in fig. 1-2, the information processing method includes:
step 101, acquiring original data of a touch event, wherein the touch event is a touch operation of an operating body on a touch detection unit;
here, since the touch unit has various forms, the operation body corresponding to the touch unit is also different, but it should be noted that although there is a difference in the principle that the touch unit senses the operation body, the implementation of the present embodiment is not affected. Among the various detecting units, a capacitive touch screen (capacitive touch screen) is the most widely used touch display unit due to its advantages of good definition, sensitivity, and support for multi-touch technology, and in this embodiment and the following embodiments, the capacitive touch screen will be taken as an example to describe the embodiments of the present invention.
Here, the operating body refers to a human body part or an object that can be sensed by the touch unit; for example, in the case of a capacitive touch screen, the operating body may be a body part with biological characteristics, such as a palm, a wrist, a finger or skin, and may also be a stylus that can be sensed by the capacitive touch screen.
Here, the touch operation includes a contact touch operation and a proximity (non-contact) touch operation. For an ordinary touch unit (the first type of touch unit), the operating body must be in contact with the touch unit for its operation to be recognized; for some touch units (the second type of touch unit), however, the operating body does not need to contact the touch unit directly. As shown in fig. 1-3, where diagram b is the left side view of diagram a, when a user uses such a touch screen 32, the touch screen 32 can locate the finger 31 as long as the distance 33 between the user's finger 31 and the touch screen 32 is within the range the touch screen 32 can sense, thereby obtaining the user's touch operation. Because the second type of touch unit can sense a finger without direct contact, cross infection can be well avoided and the touch screen stays clean and hygienic.
Here, an electronic device having a touch unit generally further includes a touch chip (IC). The touch chip contains a micro controller unit (MCU) that receives the digital signal output by an analog-to-digital (A/D) converter and processes it into a touch signal according to a predetermined program; the microprocessor in the touch chip then sends the touch signal to the main processor of the electronic device. The function of the microprocessor on the touch chip is thus to process raw data into touch signals; generally this function is implemented by the hardware manufacturer of the touch screen, and each manufacturer processes the raw data differently. Taking a capacitive touch screen as an example: the microprocessor on the touch chip obtains a continuous area whose capacitance has changed; if the continuous area is small, the center of the area is directly reported as the position of a touch point, yielding a touch signal carrying position information; if the continuous area is large, it is resolved into one touch point or a plurality of touch points (signals), the single touch point carrying one corresponding piece of position information and the plurality of touch points carrying a plurality of corresponding pieces of position information.
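To make the prior-art reporting behaviour just described more concrete, the following Python sketch shows how a small changed region could be reduced to a single reported touch point at its centre; the function name and the area threshold SMALL_AREA are illustrative assumptions, not values taken from the patent, and the handling of large regions is only indicated.

```python
# Illustrative sketch only: the name and SMALL_AREA are assumptions.
SMALL_AREA = 8  # assumed cut-off between a "small" and a "large" changed region


def report_touch_point(region):
    """region: list of (row, col) sensing nodes whose capacitance changed.

    A small region is reported directly as one touch point at its centre;
    a large region would be resolved into several touch points by the
    touch chip (not shown here)."""
    if len(region) <= SMALL_AREA:
        rows = [r for r, _ in region]
        cols = [c for _, c in region]
        return (sum(rows) / len(rows), sum(cols) / len(cols))
    return None  # large region: resolved into multiple touch points instead
```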
In this embodiment, the original data refers to data that is not processed by the touch chip, and may be data output by an analog-to-digital converter, for example.
102, processing the original data of the touch event to obtain a sensing area, wherein the sensing area is used for describing an area range of the touch detection unit sensing the operation body;
here, for the first type of touch unit described above, the sensing regions are as regions 41 to 47 (white) in fig. 1 to 4. FIGS. 1-4 are an example of a scenario where a user, using a virtual keyboard on a touch screen, rests their palm on the screen, where the fingers, palm portion connecting the thumb and little finger, of the larger area, part of the wrist, that may be used for normal tapping, are in contact with the touch screen; in other words, the regions 41 to 47 may be generated by the contact of the fingers, the palm portion of the hand connecting the thumb and the little finger, and a portion of the wrist with the touch screen, which are used in normal tapping. As can be seen from fig. 1-4, when a hand is applied to the touch screen, the capacitance values of the touch screen contact areas 41-47 change; what is different is that the capacitance value change on the areas 41 to 47 where the hand is in contact with the touch screen is of one type (i.e. sensing area), and the capacitance value change on the areas where the hand is not in contact with the touch screen is of another type, corresponding to the sensing area, the areas where the hand is not in contact can be called non-sensing areas, and the change value of the capacitance (which can be understood as raw data in the present embodiment) on the whole screen is output to the processor after analog-to-digital conversion, and in the present embodiment, the processor processes the change value of the capacitance into the sensing areas as shown in fig. 1 to 4. The prior art comprises the following treatment processes: the change value of the capacitance is output to a microprocessor on the touch chip after analog-to-digital conversion, and is processed into a touch signal by the microprocessor.
Here, for the second type of touch unit, the sensing area is no different from that shown in fig. 1-4: as long as the distance between the hand and the touch unit is within the range the second type of touch unit can sense, the capacitance in the areas within the sensing range (sensing areas) changes in one way and the capacitance in the areas outside the sensing range (non-sensing areas) changes in another way, so the processing of step 102 is the same as for the first type of touch unit, which requires contact rather than mere proximity for sensing.
Step 103, calculating characteristic parameters of the sensing area;
here, the characteristic parameters may include parameters such as an area, a positional relationship, a circumscribed rectangle, and a main axis direction.
Step 104, determining the validity of the sensing area according to the characteristic parameters of the sensing area.
Here, step 104 of determining the validity of the sensing area according to the characteristic parameters of the sensing area includes: judging whether the sensing area is valid according to the characteristic parameters of the sensing area; if it is valid, recording the input object corresponding to the valid sensing area; if it is not, ignoring the input object corresponding to the invalid sensing area.
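The overall flow of steps 101 to 104 could then be tied together roughly as follows; the helper functions are passed in as parameters because the patent does not prescribe concrete implementations (a connected-domain scan as in step 202 and an area-range check as in embodiment three would be natural choices).

```python
def handle_touch_event(raw_frame, extract_regions, compute_features, is_valid):
    """Sketch of steps 101-104 combined.

    raw_frame:        2D grid of capacitance change values (the original data)
    extract_regions:  callable implementing step 102 (e.g. a connected-domain scan)
    compute_features: callable implementing step 103
    is_valid:         callable implementing step 104 (e.g. an area-range check)
    """
    accepted = []
    for region in extract_regions(raw_frame):      # step 102: sensing areas
        features = compute_features(region)        # step 103: characteristic parameters
        if is_valid(features):                     # step 104: validity judgement
            accepted.append((region, features))    # record the corresponding input object
        # input objects of invalid sensing areas are ignored
    return accepted
```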
Here, the input object may include characters on the virtual keyboard and may also include other objects, such as the navigation entries and news headlines on a web portal, or the video windows on a video website. Navigation entries and news headlines on a portal are links, as are video windows; when a user clicks a news headline, the electronic device displays the news corresponding to that headline. The target of such a click is likewise regarded as an input object.
The technical solution provided by the embodiment of the present invention can be used in the following scenario. As described in the foregoing background, when the virtual keyboard is called up, only the sensing areas corresponding to the fingers should be regarded as valid, while the sensing areas corresponding to the palms and wrists are invalid. Referring to fig. 1-4 and 1-5, the characteristic parameters of the sensing regions are calculated, and according to these parameters the sensing regions 42 to 46 are judged to be valid and the sensing regions 41 and 47 to be invalid; as can be seen from fig. 1-4, the sensing regions 42 to 46 correspond to finger regions and the sensing regions 41 and 47 correspond to the palms.
In the embodiment of the invention, the original data of a touch event is obtained, the touch event being a touch operation performed by an operating body on a touch detection unit; the original data of the touch event is processed to obtain a sensing area, which describes the area range in which the touch detection unit senses the operating body; characteristic parameters of the sensing area are calculated; and the validity of the sensing area is determined according to these characteristic parameters. The technical scheme provided by the embodiment of the invention is therefore a false-touch judgment method based on the raw data of the touch screen: it first takes the raw data of the touch screen rather than individual touch point signals, and it identifies valid and invalid sensing areas from the geometric characteristics (characteristic parameters) of the raw data, so as to distinguish false-touch signals from normal finger signals and reject the false touches caused by parts other than a normally tapping finger touching the screen during user operation.
In the related art, two schemes exist for distinguishing a false-touch signal from a normal finger signal. In the first scheme, touch screen signals are accepted only within the keyboard key area, and signals in the area where the hand rests are not accepted. The disadvantage of the first scheme is poor flexibility: it does not solve the problem fundamentally; for example, a false touch inside the keyboard area is still unavoidable when the user's hand touches it inadvertently. Compared with the first scheme, the solution provided by this embodiment has no fixed area limitation, so a false touch at any position on the whole touch screen can be rejected.
The second scheme in the related art distinguishes a false-touch signal from a normal finger signal by the difference between the signal of a stylus and the signal generated by a hand touching the touch screen. Its disadvantage is that a special stylus is required, and the scheme fails if the stylus produces the same signal as a hand touching the touch screen, or if the user inputs directly with a finger instead of the stylus. Compared with the second scheme, the solution of this embodiment does not depend on the type of touch: the user needs no special stylus and can operate directly on the touch screen with a finger, and false touches can still be rejected.
Example two
Based on the first embodiment, an embodiment of the present invention provides an information processing method, which is applied to an electronic device. In particular, the electronic device in the present embodiment includes one touch unit. Specifically, the electronic device in this embodiment may be a smart phone, a personal computer, a notebook computer, a netbook computer (netbook), a tablet computer, a desktop computer, a smart television, and the like, which include the touch unit.
The functions implemented by the information processing method provided in this embodiment may be implemented by a processor in the electronic device calling a program code, and certainly, the program code may be stored in a computer storage medium.
Fig. 2-1 is a schematic flow chart of an implementation of a second information processing method according to an embodiment of the present invention, as shown in fig. 2-1, the information processing method includes:
step 201, obtaining original data of a touch event, wherein the touch event is a touch operation performed by an operator on a touch detection unit, and the original data at least includes a value and a position of each sensing node on the touch detection unit;
here, when the sensing node is a capacitive touch screen, the value derived by the sensing node is the capacitance value and the coordinate of the capacitance.
Step 202, performing connected domain scanning on the original data of the touch event according to the type of the value of the sensing node to obtain the sensing area;
here, in the implementation process of step 202, it is determined whether the values of two adjacent sensing nodes are of the same type, if the values of the two sensing nodes are of the same type, the two sensing nodes are connected to form a region, and if the values of the two sensing nodes are different, the two sensing nodes are not connected together. After step 202, an area is formed on the whole touch unit, that is, adjacent areas with the same numerical type are formed into independent areas, and when the numerical type includes two types, the area type formed on the touch unit includes two types; when the numerical type includes N types, where N is an integer of 2 or more, the type of the area formed on the touch unit also includes N types. In the following, a capacitive touch screen is taken as an example, when an operating body touches the capacitive touch screen, capacitance values of sensing nodes of the touch screen change, capacitance values of some sensing nodes do not change (assumed to be 0), capacitance values of some sensing nodes become negative numbers, capacitance values of some sensing nodes become positive numbers, wherein the sensing node whose capacitance value changes to negative or positive is called the node whose capacitance value changes, FIG. 2-2 is a schematic diagram of the original data in the second embodiment of the present invention, the graph b in fig. 2-2 is an enlarged schematic diagram of the area 50 in the graph a in fig. 2-2, and it can be seen from the graph b in fig. 2-2 that the capacitance value of the capacitive sensing node in the area 52 is unchanged, the capacitance value of the capacitive sensing node in the area 53 is a positive number, and the capacitance values of the capacitive sensing nodes in the areas 51 and 54 are negative numbers. After processing fig. 2-2 through step 202, the resulting sensing regions can be referred to as regions 41 through 47 shown in fig. 1-4.
Step 203, calculating characteristic parameters of the sensing area;
step 204, determining the effectiveness of the sensing region according to the characteristic parameters of the sensing region.
Here, the steps 203 to 204 correspond to the steps 103 to 104 in the first embodiment, respectively, and therefore, those skilled in the art can understand the steps 203 to 204 with reference to the first embodiment, and details are not repeated herein for brevity.
EXAMPLE III
Based on the foregoing embodiments, an embodiment of the present invention provides an information processing method, which is applied to an electronic device. In particular, the electronic device in the present embodiment includes one touch unit. Specifically, the electronic device in this embodiment may be a smart phone, a personal computer, a notebook computer, a netbook computer (netbook), a tablet computer, a desktop computer, a smart television, and the like, which include the touch unit.
The functions implemented by the information processing method provided in this embodiment may be implemented by a processor in the electronic device calling a program code, and certainly, the program code may be stored in a computer storage medium.
Fig. 3 is a schematic flow chart of an implementation of a third information processing method according to an embodiment of the present invention, and as shown in fig. 3, the information processing method includes:
step 301, obtaining original data of a touch event, wherein the touch event is a touch operation of an operating body on a touch detection unit;
step 302, processing the raw data of the touch event to obtain a sensing area; the sensing area is used for describing an area range of the touch detection unit sensing the operation body;
here, the steps 301 to 302 correspond to the steps 101 to 102 in the first embodiment, respectively, and therefore, a person skilled in the art can understand the steps 101 to 102 with reference to the first embodiment, and details are not described herein for brevity.
Step 303, calculating the area of the sensing region;
here, the area of the sensing region may be represented by the number of occupied sensing nodes, for example, in the b diagram of fig. 2-2, the sensing region 53 occupies 20 sensing nodes, the capacitance value of each sensing node is an integer and is expressed in a 16-ary manner, and the capacitance value of a certain node is, for example, a numerical value such as 3b, a2, and the like. By analogy, the areas of sensing regions 41 to 47 in fig. 1-4 can be calculated, for example, the area of sensing region 41 is 29 sensing nodes, the area of sensing region 42 is 4 sensing nodes, the area of sensing region 43 is 4 sensing nodes, the area of sensing region 44 is 4 sensing nodes, the area of sensing region 45 is 4 sensing nodes, the area of sensing region 46 is 9 sensing nodes, and the area of sensing region 47 is 17 sensing nodes.
Step 304, judging whether the area of the sensing region meets a preset first condition or not to obtain a first judgment result;
here, the first condition may generally refer to any condition related to the area of the sensing region, for example, a threshold value related to the area may be set, or a range related to the area may be set.
Step 305, determining the validity of the sensing region according to the first judgment result.
Here, continuing the example in step 303: as can be seen from step 303, the area of a sensing region corresponding to the palm (29 or 17) is much larger than the area of a sensing region corresponding to a finger (4 to 9). If the first condition is an area threshold, it may for example be set between 10 and 16; taking the threshold 10 as an example, a sensing region whose area is smaller than 10 is determined to be valid, and a sensing region whose area is greater than 10 is determined to be invalid.
If the first condition is instead set as an area range, it may for example be set to 3 to 10: a sensing region whose area falls within the range 3 to 10 is determined to be valid (e.g. the sensing regions 42 to 46 are valid), whereas a sensing region whose area is not within that range is determined to be invalid. Of course, the first condition may also set the area range to 15 to 35: a sensing region whose area falls within the range 15 to 35 is determined to be invalid (e.g. the sensing regions 41 and 47 are invalid), whereas a sensing region whose area is not within that range is determined to be valid (e.g. the sensing regions 42 to 46 are valid).
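A minimal sketch of this first-condition check, assuming the valid area range 3 to 10 used in the example above, might look as follows.

```python
def is_valid_by_area(area, valid_range=(3, 10)):
    """First condition as an area range: valid when the number of sensing
    nodes lies within the assumed range (3 to 10 in the example above)."""
    low, high = valid_range
    return low <= area <= high


# Applying it to the example areas of regions 41-47 from fig. 1-4:
areas = {41: 29, 42: 4, 43: 4, 44: 4, 45: 4, 46: 9, 47: 17}
validity = {region: is_valid_by_area(a) for region, a in areas.items()}
# -> regions 42-46 are valid (finger-sized), regions 41 and 47 invalid (palm-sized)
```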
In the embodiment of the present invention, when the characteristic parameter is an area, the above-mentioned steps 304 to 305 actually provide a method for implementing the step 104 "determining the validity of the sensing region according to the characteristic parameter of the sensing region".
Example four
Based on the foregoing embodiments, an embodiment of the present invention provides an information processing method, which is applied to an electronic device. In particular, the electronic device in the present embodiment includes one touch unit. Specifically, the electronic device in this embodiment may be a smart phone, a personal computer, a notebook computer, a netbook computer (netbook), a tablet computer, a desktop computer, a smart television, and the like, which include the touch unit.
The functions implemented by the information processing method provided in this embodiment may be implemented by a processor in the electronic device calling a program code, and certainly, the program code may be stored in a computer storage medium.
Fig. 4-1 is a schematic flow chart of an implementation of a four-information processing method according to an embodiment of the present invention, as shown in fig. 4-1, the information processing method includes:
step 401, acquiring original data of a touch event, wherein the touch event is a touch operation performed by an operating body on a touch detection unit;
here, the operation body refers to a human body or an object that can be sensed by the touch unit, for example, in the case of a capacitive touch screen, the operation body may be a human body having a biological characteristic such as a palm, a wrist, a finger, a skin, and the like, and may also be a stylus that can be sensed by the capacitive touch screen.
Here, the raw data refers to data that is not processed by the touch chip, and may be data output through an analog-to-digital converter, for example.
Step 402, processing the raw data of the touch event to obtain a sensing area; the sensing area is used for describing an area range of the touch detection unit sensing the operation body;
step 403, calculating the area and position of the sensing region;
here, the steps 401 to 403 correspond to the steps 101 to 103 in the first embodiment, respectively, and therefore, a person skilled in the art can understand the steps 401 to 403 with reference to the first embodiment, and details are not described herein for brevity.
Step 404, establishing a first touch model according to the area of the sensing region and the position of the sensing region;
here, the touch model is an integral structure formed by the sensing units when the operating body is in contact with the touch units, such as a hand shape (touch model) formed by the sensing units 41 to 47 shown in fig. 1 to 4, and in the process of specific application, the hand shape shown in fig. 1 to 4 is actually an ideal contact model, and actually, the contact model may have other types due to the problem of the operation habit of people, such as the c diagram of fig. 4 to 2; the contact model may also have only the parts of fig. 1-4, such as the a-and b-diagrams of fig. 4-2, where the dotted line part represents the non-contact, in other words, the dotted line part represents the non-sensing region.
Step 405, determining the validity of the sensing region according to the first touch model.
Here, step 405 of determining the validity of the sensing region from the first touch model includes: determining the sensing regions representing fingers in the first touch model to be valid, and determining the sensing regions other than those representing fingers in the first touch model to be invalid.
In this embodiment, the raw data of the touch screen is taken first, rather than individual touch point signals; from the geometric features of the raw data and the relative spatial positions (characteristic parameters), a user contact model is analyzed and extracted, so that the difference between a false-touch signal and a normal finger signal can be identified and false touches caused by non-finger contact with the screen during operation can be rejected. The validity of a sensing region is thus determined according to the overall structure (the first touch model) formed by the sensing regions: in the determination process, the sensing regions representing fingers in the first touch model can be determined to be valid, and the remaining sensing regions can be determined to be invalid. As can be seen from the above description, the technical solution provided by this embodiment is a judgment method based on this overall structure and therefore has higher accuracy.
In this embodiment of the present invention, step 404, establishing a first touch model according to the area of the sensing region and the position of the sensing region includes:
step 4041, acquiring an area and a position of a first sensing region, where the first sensing region is a sensing region with a largest area;
here, there are many methods for calculating the area of the sensing region, and it is preferable that the number of sensing nodes represents the area of the sensing region, and the sensing nodes can be understood as one pixel point and thus can be used to represent the area of the sensing region. When a plurality of sensing regions having equal areas exist, one sensing region may be acquired as a first sensing region according to a preset rule; a person skilled in the art may set a preset rule according to an actual situation, for example, obtain the positions of a plurality of sensing regions with equal areas, and use the sensing region located at the lower left or lower right as a first sensing region; the reason why the lower left or lower right sensing region is used as the first sensing region is that the lower left or lower right sensing region is generally a palm region connected to the thumb. Note that in step 4041, a sensing area is actually determined, which corresponds to the palm area connected to the thumb; only in step 4041 is the palm region connected to the thumb the sensing region with the largest area.
Here, the sensing region with the largest area is taken as the palm region connected to the thumb because the palm area is generally larger than a finger area, and once the palm is determined the hand shape is substantially determined. The determination may therefore start by picking the largest sensing region (hereinafter referred to as the main block) from among the sensing regions whose area exceeds a certain threshold T1.
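The selection of the main block could be sketched as below; the value used for T1, the tie-breaking key and the assumption that row indices grow downwards on the screen are all illustrative choices, not specified by the patent.

```python
def pick_main_block(regions, t1=12):
    """Pick the 'main block': the largest sensing region whose area exceeds
    the threshold T1 (value assumed here). Ties between equal-area regions
    are broken by preferring the lower-left region, per the preset rule
    above, assuming row indices grow downwards on the screen."""
    candidates = [r for r in regions if len(r) > t1]
    if not candidates:
        return None

    def key(region):
        rows = [p[0] for p in region]
        cols = [p[1] for p in region]
        # larger area first; then lower (larger mean row), then further left (smaller mean col)
        return (len(region), sum(rows) / len(rows), -sum(cols) / len(cols))

    return max(candidates, key=key)
```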
Step 4042, determining the main axis direction of the first sensing region according to the area and position of the first sensing region;
here, step 4042 determines the primary axis direction of the first sensing region from the area and position of the first sensing region, including: and calculating a circumscribed rectangle of the first sensing region according to the area and the position of the first sensing region, and then determining the main axis direction of the first sensing region according to the circumscribed rectangle of the first sensing region.
In step 4042, the aspect ratio of the main block may also be considered, as shown in fig. 4-2. If the length and width are close, the model may be a palm-front contact model, i.e. a hand type formed by laying the plane of the palm flat on the touch screen (abbreviated type 1). If the aspect ratio differs greatly, the area of the main block and the area of its circumscribed rectangle are further considered: if the block area is small and the ratio of the circumscribed rectangle area to the actual block area is greater than a threshold T2, the contact can be regarded as the side of the palm touching the screen (type 2), i.e. the plane of the palm is perpendicular to the touch screen; if the block area is large and the ratio of the circumscribed rectangle area to the actual block area is smaller than a threshold T3, the contact can be regarded as the palm front together with the lower half of the palm touching the screen (type 3).
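The classification into the three hand types might be sketched as follows; the patent only names the thresholds T2 and T3, so every numeric value and the aspect-ratio and area cut-offs used here are assumptions.

```python
def classify_main_block(region, t2=2.0, t3=1.3, aspect_close=1.5,
                        small_area=40, large_area=120):
    """Rough classification of the main block into the three hand types
    described above; only T2 and T3 are named in the text, so all numbers
    here are illustrative assumptions."""
    rows = [r for r, _ in region]
    cols = [c for _, c in region]
    height = max(rows) - min(rows) + 1
    width = max(cols) - min(cols) + 1
    rect_area = height * width            # area of the circumscribed rectangle
    block_area = len(region)              # actual area of the main block
    aspect = max(height, width) / min(height, width)

    if aspect <= aspect_close:
        return "type1"  # palm laid flat on the touch screen
    if block_area <= small_area and rect_area / block_area > t2:
        return "type2"  # side of the palm touching (palm plane perpendicular to the screen)
    if block_area >= large_area and rect_area / block_area < t3:
        return "type3"  # palm front plus the lower half of the palm touching
    return "unclassified"
```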
Step 4043, determining a palm region of the first touch model according to a main axis direction of the first sensing region;
step 4044, acquiring a second sensing region excluding the sensing regions constituting the palm region;
step 4045, determining a distance between a center of the second sensing region and a center of the palm region;
step 4046, determining a sensing area farthest from the palm area as a finger area;
step 4047, a first touch model is established according to the palm area and the finger area.
In the above steps 4042 to 4047, how the first touch model is established is described below according to the three types type 1, type 2 and type 3 distinguished in step 4042.
1) For type1
After the main axis direction of the main block is determined, other sensing areas within an angular range of threshold T4 on either side of the main axis direction are searched for, such as the sensing areas marked with dotted lines in fig. 1-4 above, to form a palm model together with the main block; then, along the extension of the main axis direction of the main block, independent small blocks (contact areas formed by fingers) are searched for, and these together with the main block form a complete first touch model (hand model).
2) For type2
Along the extension of the main axis direction of the main block, the contact surface of the little finger (or in some cases other fingers) is searched for, and together they form a palm-side model. It should be noted that, owing to the physiological characteristics of the human hand, the angle between the main axis direction of a finger block and the main axis direction of the palm main block is limited; finger points outside this constraint threshold are determined not to belong to the palm model. As shown for type 2, if there is a contact point on the right side of the main block, i.e. on the back of the hand, it is not part of the palm model.
3) For type3
The main block of type 3 is more particular: its own area differs greatly from the area of its circumscribed rectangle, and to further determine whether a block belongs to type 3, the concave-convex features of the main block may also be calculated. After type 3 is determined, the contact surfaces formed by finger points are searched for on the perpendicular extension of the main axis of the hand model (i.e. the direction opposite the concave side), and together they form the hand model.
After the model is built, which contact points are generated by normal operation and which are false touches can be judged according to the model. For example, after the palm model is built, the area between the palm contact region and the fingertip contact points is covered by the palm; even if the palm is not completely in contact with the touch screen, a contact surface arising between the palm contact region and the fingertip contact points is marked as a false touch, because it is generated by accidental contact rather than by a normal operation. Small-area contact points outside the palm contact region and the fingertip contact regions, on the other hand, can be regarded as normal contact points.
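A rough sketch of this model-based rejection, assuming the area covered by the palm is approximated by a bounding box and the finger regions have already been identified, is given below; the representation and the labels are illustrative assumptions only.

```python
def mark_contacts(regions, palm_bbox, finger_regions):
    """After the hand model is built: regions under the palm (here approximated
    by a palm bounding box) that are not the identified finger regions are
    marked as false touches; everything else stays a normal contact."""
    min_r, min_c, max_r, max_c = palm_bbox
    labelled = []
    for region in regions:
        if region in finger_regions:
            labelled.append((region, "finger"))
            continue
        rows = [r for r, _ in region]
        cols = [c for _, c in region]
        cr, cc = sum(rows) / len(rows), sum(cols) / len(cols)
        inside_palm = min_r <= cr <= max_r and min_c <= cc <= max_c
        labelled.append((region, "false_touch" if inside_palm else "normal"))
    return labelled
```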
In the embodiment of the invention, when determining the palm region connected to the thumb, the sensing regions of the fingers can also be matched in order to determine the sensing region corresponding to the palm region. For example, when the differences between the areas of 4 sensing regions are within a predetermined range and the distance between adjacent ones of these 4 sensing regions is within a predetermined threshold range, in other words when the 4 sensing regions have almost equal areas and almost equal spacing, they can be determined to be 4 fingers. As shown in fig. 1-4, the areas of the sensing regions 42 to 45 are all 4, i.e. almost equal, and the distances between these 4 regions are almost equal, so the sensing regions 42 to 45 can be determined to be 4 fingers; it can then be determined that the sensing region 46 corresponds to the thumb and that the sensing regions 41 and 47 correspond to the palm, thereby establishing the first touch model.
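The finger-matching heuristic described in this paragraph could be sketched as follows; the tolerances for "almost equal" areas and spacings are assumptions.

```python
from itertools import combinations


def match_four_fingers(regions, area_tol=2, gap_tol=3):
    """Look for 4 sensing regions with nearly equal areas and nearly equal
    spacing between horizontal neighbours (like regions 42-45 in fig. 1-4);
    the tolerances are illustrative assumptions."""
    def center(region):
        rows = [r for r, _ in region]
        cols = [c for _, c in region]
        return sum(rows) / len(rows), sum(cols) / len(cols)

    for combo in combinations(regions, 4):
        areas = [len(r) for r in combo]
        if max(areas) - min(areas) > area_tol:
            continue
        ordered = sorted(combo, key=lambda r: center(r)[1])  # left to right
        gaps = [center(ordered[i + 1])[1] - center(ordered[i])[1] for i in range(3)]
        if max(gaps) - min(gaps) <= gap_tol:
            return list(ordered)  # treated as the four non-thumb fingers
    return None
```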
In the embodiment of the present invention, the raw data at least includes a value and a position of each sensing node on the touch detection unit; correspondingly, in step 402, the processing raw data of the touch event to obtain a sensing area includes: step 4020, performing connected domain scanning on the original data of the touch event according to the type of the value of the sensing node to obtain the sensing area.
Here, the above-mentioned step 4020 corresponds to the step 202 in the second embodiment, and therefore, those skilled in the art can understand the above-mentioned step 4020 by referring to the second embodiment, and for brevity, will not be described again here.
In this embodiment of the present invention, step 405, the determining the validity of the sensing area according to the first touch model includes:
step 4051, determining whether the area of the sensing region in the first touch model meets a preset condition, and obtaining a first determination result;
step 4052, determining the validity of the sensing region according to the first determination result.
Here, the steps 4051 to 4052 correspond to the steps 304 to 305 in the third embodiment, respectively, so that those skilled in the art can refer to the third embodiment to understand the steps 4051 to 4052, and the description is omitted here for brevity.
EXAMPLE five
Based on the foregoing embodiments, an embodiment of the present invention provides an information processing method, which is applied to an electronic device. In particular, the electronic device in the present embodiment includes one touch unit. Specifically, the electronic device in this embodiment may be a smart phone, a personal computer, a notebook computer, a netbook computer (netbook), a tablet computer, a desktop computer, a smart television, and the like, which include the touch unit.
The functions implemented by the information processing method provided in this embodiment may be implemented by a processor in the electronic device calling a program code, and certainly, the program code may be stored in a computer storage medium.
Fig. 5-1 is a schematic flow chart of an implementation of a five-information processing method according to an embodiment of the present invention, and as shown in fig. 5-1, the information processing method includes:
step 501, acquiring original data of a touch event, wherein the touch event is a touch operation of an operating body on a touch detection unit;
step 502, processing the raw data of the touch event to obtain a sensing area; the sensing area is used for describing an area range of the touch detection unit sensing the operation body;
step 503, calculating characteristic parameters of the sensing area;
step 504, determining the effectiveness of the sensing area according to the characteristic parameters of the sensing area;
here, the steps 501 to 504 correspond to the steps 101 to 104 in the first embodiment, respectively, and therefore, a person skilled in the art can understand the steps 501 to 504 with reference to the first embodiment, and details are not repeated herein for brevity.
Step 505, acquiring a first time interval, where the first time interval is a time interval of two consecutive and adjacent touch events on an effective sensing area;
step 506, judging whether the first time interval meets a preset second condition to obtain a second judgment result;
here, the second condition refers to the condition related to the first time interval, and the second condition may be a time threshold or a time range.
Step 507, determining an input object based on the second judgment result.
Here, step 507 of determining an input object based on the second judgment result includes: when the second judgment result indicates that the first time interval meets the preset second condition, taking the input object corresponding to the later touch event as a valid input object and judging the earlier touch event to be invalid; when the second judgment result indicates that the first time interval does not meet the preset second condition, judging both of the two adjacent touch events to be invalid.
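One plausible reading of this second-condition check, where the condition is a minimum time interval between two consecutive touch events on a valid sensing area, is sketched below; the interval value and the data layout are assumptions.

```python
def filter_by_interval(events, min_interval=0.15):
    """'events' is a time-ordered list of (timestamp_seconds, input_object)
    pairs of touch events on valid sensing areas. When two consecutive events
    are far enough apart (the first time interval meets the second condition),
    the later one is accepted and the earlier one (typically the hand resting
    on the keys) is dropped; otherwise both adjacent events are invalid.
    The interval value is an assumption."""
    accepted = []
    for (t_prev, _obj_prev), (t_next, obj_next) in zip(events, events[1:]):
        if t_next - t_prev >= min_interval:
            accepted.append(obj_next)   # later touch event valid, earlier one not
        # otherwise neither of the two adjacent touch events is accepted
    return accepted
```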
The technical scheme provided by the embodiment of the invention can be used in the following scenario. When a user uses the virtual keyboard, the user expects the same experience as with a mechanical keyboard and actually taps the virtual keyboard out of the habit of using a mechanical keyboard; specifically, between key taps the user generally rests the hands on the keyboard, as shown in fig. 5-2. At this moment the user does not really want to tap a character, but habitually rests the hands on the keyboard in preparation for the next tap. For example, if the user intends to tap the string of characters "dasr", the input process would typically be as follows:
when the user starts to tap, the user puts the little finger, the ring finger, the middle finger and the index finger of the left hand on the characters a, s, d and f respectively, then the user lifts the left hand and falls the middle finger to input the character d, then lifts the left hand and falls the little finger to input the character a, then the user lifts the left hand and falls the ring finger to input the character s, then the user lifts the left hand and falls the index finger to input the character r, and therefore the input of the string of characters "dasr" is completed.
During this input process, when the user starts to tap and places the little finger, ring finger, middle finger and index finger of the left hand on the characters "a, s, d and f" respectively, the user does not really want to tap these characters; this is merely a habit carried over from the mechanical keyboard. In the related art, this habit causes the electronic device to recognize and input the characters "a, s, d and f", which the user did not intend, i.e. an erroneous operation. With the technical scheme provided by the embodiment of the invention this situation can be well avoided: when the user places the little finger, ring finger, middle finger and index finger of the left hand on the characters a, s, d and f respectively, these are not yet accepted as input; when the user then lifts the hand to input the character d, the electronic device obtains a time interval (the first time interval) and judges whether it meets the preset second condition, and only when the second condition is met is that touch event (the user lifting the hand and inputting the character d) taken as a valid touch event, so that the character d is input.
EXAMPLE six
Based on the foregoing embodiments, an embodiment of the present invention provides an information processing method, which is applied to an electronic device. In particular, the electronic device in the present embodiment includes one touch unit. Specifically, the electronic device in this embodiment may be a smart phone, a personal computer, a notebook computer, a netbook computer (netbook), a tablet computer, a desktop computer, a smart television, and the like, which include the touch unit.
The functions implemented by the information processing method provided in this embodiment may be implemented by a processor in the electronic device calling a program code, and certainly, the program code may be stored in a computer storage medium.
Fig. 6 is a schematic flow chart of an implementation of a sixth information processing method according to an embodiment of the present invention, and as shown in fig. 6, the information processing method includes:
601, acquiring original data of a touch event, wherein the touch event is a touch operation of an operating body on a touch detection unit;
step 602, processing the raw data of the touch event to obtain a sensing area; the sensing area is used for describing an area range of the touch detection unit sensing the operation body;
step 603, calculating characteristic parameters of the sensing area;
step 604, determining the validity of the sensing region according to the characteristic parameters of the sensing region;
here, the steps 601 to 604 correspond to the steps 101 to 104 in the first embodiment, respectively, and therefore, a person skilled in the art can understand the steps 601 to 604 with reference to the first embodiment, and details are not described herein for brevity.
Step 605, acquiring the duration of the touch event on the effective sensing area;
step 606, judging whether the duration time meets a preset third condition to obtain a third judgment result;
here, the third condition refers to a condition regarding duration, and the third condition may be a time threshold or a time range.
Step 607, based on the third determination result, determining the input object.
Here, step 607 of determining an input object based on the third judgment result includes: when the third judgment result indicates that the duration meets the preset third condition, taking the input object corresponding to the touch event as a valid input object; when the third judgment result indicates that the duration does not meet the preset third condition, treating the input object corresponding to the touch event as an invalid input object.
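One plausible reading of this third-condition check, where the condition is a maximum duration for an intended tap, is sketched below; the cut-off value is an assumption.

```python
def is_intended_tap(duration_seconds, max_duration=0.3):
    """Third condition as a maximum duration: a touch event on a valid sensing
    area counts as an intended tap only if it is short enough; resting a
    finger on a key lasts far longer than a normal tap. Cut-off is assumed."""
    return duration_seconds <= max_duration


# Resting the fingers on "a s d f" for ~2 s  -> rejected as an invalid input object;
# a normal tap on "d" lasting ~0.1 s        -> accepted as a valid input object.
```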
The technical scheme provided by the embodiment of the invention can be used in the following scenario. When a user uses the virtual keyboard, the user expects the same experience as with a mechanical keyboard and actually taps the virtual keyboard out of the habit of using a mechanical keyboard; specifically, between key taps the user generally rests the hands on the keyboard, as shown in fig. 5-2. At this moment the user does not really want to tap a character, but habitually rests the hands on the keyboard in preparation for the next tap. For example, if the user intends to tap the string of characters "dasr", the input process would typically be as follows:
when the user starts to tap, the user puts the little finger, the ring finger, the middle finger and the index finger of the left hand on the characters a, s, d and f respectively, then the user lifts the left hand and falls the middle finger to input the character d, then lifts the left hand and falls the little finger to input the character a, then the user lifts the left hand and falls the ring finger to input the character s, then the user lifts the left hand and falls the index finger to input the character r, and therefore the input of the string of characters "dasr" is completed.
During this input process, when the user starts to tap and places the little finger, ring finger, middle finger and index finger of the left hand on the characters "a, s, d and f" respectively, the user does not really want to tap these characters; this is merely a habit carried over from the mechanical keyboard. In the related art, this habit causes the electronic device to recognize and input the characters "a, s, d and f", which the user did not intend, i.e. an erroneous operation. With the technical scheme provided by the embodiment of the invention this situation can be well avoided: placing the fingers on the characters is not immediately accepted as a valid touch event, because the electronic device judges the duration of the touch event. Since the duration of resting the fingers on the characters is far longer than the duration of the touch events produced when the user normally taps the characters d, a, s and r, a reasonably set third condition will judge the resting of the little finger, ring finger, middle finger and index finger on the characters a, s, d and f to be invalid touch events, while the subsequent operations, such as the user lifting the left hand and dropping the middle finger to input the character d, are determined to be valid touch events. By adopting the technical scheme provided by the embodiment of the invention, such erroneous operations by the user can therefore be well avoided.
EXAMPLE seven
Based on the foregoing information processing method, an embodiment of the present invention provides an electronic device, in which the first obtaining unit, the processing unit, the calculating unit and the first determining unit may all be implemented by a processor in the electronic device; of course, they may also be implemented by specific logic circuits. In specific embodiments, the processor may be a central processing unit (CPU), a microprocessor (MPU), a digital signal processor (DSP), a field programmable gate array (FPGA), or the like.
Fig. 7 is a schematic structural diagram of a seventh electronic device according to an embodiment of the present invention, and as shown in fig. 7, the electronic device 700 includes a first obtaining unit 701, a processing unit 702, a calculating unit 703 and a first determining unit 704, where:
the first obtaining unit 701 is configured to obtain original data of a touch event, where the touch event is a touch operation performed by an operator on a touch detection unit;
the processing unit 702 is configured to process the raw data of the touch event to obtain a sensing area; the sensing area is used for describing an area range of the touch detection unit sensing the operation body;
the calculating unit 703 is configured to calculate a characteristic parameter of the sensing region;
the first determining unit 704 is configured to determine the validity of the sensing region according to the characteristic parameter of the sensing region.
In the embodiment of the present invention, the raw data at least includes a value and a position of each sensing node on the touch detection unit;
and the processing unit is configured to perform connected domain scanning on the original data of the touch event according to the type of the value of each sensing node, to obtain the sensing region.
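As an illustration of the connected domain scanning performed by the processing unit, the following Python sketch groups adjacent sensing nodes whose values exceed a touch threshold into regions. The grid representation, the threshold value and the use of 4-connectivity are assumptions of this sketch; the embodiment only states that the scan is performed according to the type of the value of each sensing node.

```python
# A minimal sketch of connected-domain scanning over the raw sensing-node grid.
from collections import deque

def find_sensing_regions(grid, threshold=30):
    """Group adjacent nodes whose value exceeds `threshold` into sensing regions.

    Returns a list of regions, each a list of (row, col) node positions.
    """
    rows, cols = len(grid), len(grid[0])
    visited = [[False] * cols for _ in range(rows)]
    regions = []
    for r in range(rows):
        for c in range(cols):
            if visited[r][c] or grid[r][c] < threshold:
                continue
            # Breadth-first flood fill of one connected domain.
            region, queue = [], deque([(r, c)])
            visited[r][c] = True
            while queue:
                y, x = queue.popleft()
                region.append((y, x))
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < rows and 0 <= nx < cols \
                            and not visited[ny][nx] and grid[ny][nx] >= threshold:
                        visited[ny][nx] = True
                        queue.append((ny, nx))
            regions.append(region)
    return regions
```

For example, when a palm and a fingertip rest on the touch detection unit at the same time, such a scan yields two separate regions whose areas and positions can then be passed to the calculating unit.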
In the embodiment of the present invention, the first obtaining unit 701 obtains original data of a touch event, where the touch event is a touch operation performed by an operating body on a touch detecting unit; the processing unit 702 processes the raw data of the touch event to obtain a sensing area; the sensing area is used for describing an area range of the touch detection unit sensing the operation body; the calculation unit 703 calculates a characteristic parameter of the sensing region; the first determination unit 704 determines the validity of the sensing region according to the characteristic parameter of the sensing region; therefore, misoperation caused by the palm and the wrist can be avoided, and user experience is improved.
Example eight
Based on the foregoing information processing method, an embodiment of the present invention provides an electronic device, in which the first obtaining unit, the processing unit, the calculating unit, the first determining unit, the modules included in the first determining unit and the sub-modules included in those modules may all be implemented by a processor in the electronic device; of course, they may also be implemented by a dedicated logic circuit. In specific implementations, the processor may be a central processing unit, a microprocessor, a digital signal processor, a field programmable gate array, or the like.
Fig. 8 is a schematic structural diagram of an eighth electronic device according to an embodiment of the present invention, and as shown in fig. 8, the electronic device 800 includes a first obtaining unit 801, a processing unit 802, a calculating unit 803, and a first determining unit 804, where the first determining unit 804 includes an establishing module 8041 and a second determining module 8042, where:
the first obtaining unit 801 is configured to obtain original data of a touch event, where the touch event is a touch operation performed by an operating body on a touch detection unit;
the processing unit 802 is configured to process the raw data of the touch event to obtain a sensing area; the sensing area is used for describing an area range of the touch detection unit sensing the operation body;
the calculating unit 803 is configured to calculate an area of the sensing region and a position of the sensing region;
the establishing module 8041 is configured to establish a first touch model according to the area of the sensing region and the position of the sensing region;
the second determining module 8042 is configured to determine validity of the sensing area according to the first touch model.
In this embodiment of the present invention, the second determining module includes a judging sub-module and a first determining sub-module, where:
the judging submodule is used for judging whether the area of the sensing area in the first touch model meets a preset condition, to obtain a first judgment result;
the first determining submodule is used for determining the effectiveness of the sensing area according to the first judgment result.
In the embodiment of the present invention, the establishing module includes a first obtaining sub-module, a second determining sub-module, a third determining sub-module, a second obtaining sub-module, a fourth determining sub-module, a fifth determining sub-module, and an establishing sub-module, wherein:
the first acquisition submodule is used for acquiring the area and the position of a first sensing area, and the first sensing area is the sensing area with the largest area;
the second determining submodule is used for determining the main axis direction of the first sensing area according to the area and the position of the first sensing area;
the third determining submodule is used for determining a palm area of the first touch model according to the main axis direction of the first sensing area;
the second acquisition submodule is used for acquiring a second sensing area except for the sensing area forming the palm area;
the fourth determination submodule is used for determining the distance between the center of the second sensing area and the center of the palm area;
the fifth determination submodule is configured to determine a sensing region farthest from the palm region as a finger region;
and the establishing submodule is used for establishing a first touch model according to the palm area and the finger area.
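The following Python sketch illustrates, under stated assumptions, how the sub-modules above could cooperate to build the first touch model: the largest sensing region is taken as the first sensing region, its main axis direction is estimated from the covariance of its nodes, a palm area is assembled around it, and the remaining region farthest from the palm center is taken as the finger region. The rule used here for grouping regions into the palm area along the main axis, as well as the helper names and the band width, are assumptions for illustration only.

```python
# A minimal sketch of building the first touch model from sensing regions.
import math

def center(region):
    """Centroid (row, col) of a region given as a list of node positions."""
    ys = [p[0] for p in region]
    xs = [p[1] for p in region]
    return (sum(ys) / len(ys), sum(xs) / len(xs))

def principal_axis(region):
    """Unit vector of the region's main axis, using the standard image-moment
    orientation formula applied in (row, col) coordinates."""
    cy, cx = center(region)
    syy = sum((y - cy) ** 2 for y, _ in region)
    sxx = sum((x - cx) ** 2 for _, x in region)
    sxy = sum((y - cy) * (x - cx) for y, x in region)
    angle = 0.5 * math.atan2(2 * sxy, syy - sxx)
    return (math.cos(angle), math.sin(angle))

def build_first_touch_model(regions, band_halfwidth=15.0):
    """Assemble an assumed first touch model {palm, finger} from sensing regions."""
    # First sensing region: the sensing region with the largest area.
    first = max(regions, key=len)
    axis = principal_axis(first)
    fy, fx = center(first)
    palm = []
    for r in regions:
        cy, cx = center(r)
        dy, dx = cy - fy, cx - fx
        proj = dy * axis[0] + dx * axis[1]
        # Assumed grouping rule: regions whose centers lie close to the line
        # through the first region's center along its main axis form the palm.
        perp = math.hypot(dy - proj * axis[0], dx - proj * axis[1])
        if r is first or perp <= band_halfwidth:
            palm.append(r)
    palm_center = center([p for r in palm for p in r])
    others = [r for r in regions if all(r is not p for p in palm)]
    if not others:
        return {"palm": palm, "finger": None}
    # Finger region: the remaining sensing region farthest from the palm center.
    finger = max(others, key=lambda r: math.dist(center(r), palm_center))
    return {"palm": palm, "finger": finger}
```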
Example nine
Based on the foregoing information processing method, an embodiment of the present invention provides an electronic device, in which the first obtaining unit, the processing unit, the calculating unit and the first determining unit, as well as the judging module and the first determining module included in the first determining unit, may all be implemented by a processor in the electronic device; of course, they may also be implemented by a dedicated logic circuit. In specific implementations, the processor may be a central processing unit, a microprocessor, a digital signal processor, a field programmable gate array, or the like.
Fig. 9 is a schematic structural diagram of a ninth electronic device according to an embodiment of the present invention, and as shown in fig. 9, the electronic device 900 includes a first obtaining unit 901, a processing unit 902, a calculating unit 903, and a first determining unit 904, where the first determining unit 904 includes a judging module 9041 and a first determining module 9042, where:
the first obtaining unit 901 is configured to obtain original data of a touch event, where the touch event is a touch operation performed by an operating body on a touch detection unit;
the processing unit 902 is configured to process the raw data of the touch event to obtain a sensing area; the sensing area is used for describing an area range of the touch detection unit sensing the operation body;
the calculating unit 903 is configured to calculate a characteristic parameter of the sensing region;
the judging module 9041 is configured to judge whether the area of the sensing region meets a preset first condition, so as to obtain a first judgment result;
the first determining module 9042 is configured to determine the validity of the sensing region according to the first judgment result.
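A minimal sketch of the area check performed by the judging module 9041 follows; the threshold value is hypothetical, since the embodiment only requires that the area of the sensing region meet a preset first condition.

```python
# A minimal sketch of the first-condition area check, assuming an area threshold
# expressed as a number of sensing nodes.
MAX_FINGER_AREA_NODES = 40  # assumed upper bound for a fingertip-sized contact

def is_region_valid(region) -> bool:
    """A region no larger than a fingertip is treated as a valid touch;
    larger regions (palm, wrist) are treated as invalid."""
    return len(region) <= MAX_FINGER_AREA_NODES
```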
Example ten
Based on the foregoing information processing method, an embodiment of the present invention provides an electronic device, in which the first obtaining unit, the processing unit, the calculating unit, the first determining unit, the second obtaining unit, the first judging unit and the second determining unit may all be implemented by a processor in the electronic device; of course, they may also be implemented by a dedicated logic circuit. In specific implementations, the processor may be a central processing unit, a microprocessor, a digital signal processor, a field programmable gate array, or the like.
Fig. 10 is a schematic structural diagram of a tenth electronic device according to an embodiment of the present invention, and as shown in fig. 10, the electronic device 1000 includes a first obtaining unit 1001, a processing unit 1002, a calculating unit 1003, a first determining unit 1004, a second obtaining unit 1005, a first judging unit 1006, and a second determining unit 1007, where:
the first obtaining unit 1001 is configured to obtain original data of a touch event, where the touch event is a touch operation performed by an operating body on a touch detection unit;
the processing unit 1002 is configured to process raw data of the touch event to obtain a sensing area; the sensing area is used for describing an area range of the touch detection unit sensing the operation body;
the calculating unit 1003 is configured to calculate a characteristic parameter of the sensing region;
the first determining unit 1004 is configured to determine validity of the sensing region according to the characteristic parameter of the sensing region;
the second obtaining unit 1005 is configured to obtain a first time interval, where the first time interval is a time interval between two consecutive touch events that occur in an effective sensing area;
the first judging unit 1006 is configured to judge whether the first time interval meets a preset second condition, so as to obtain a second judgment result;
the second determining unit 1007 is configured to determine an input object based on the second judgment result.
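The following sketch illustrates the interval check; the threshold and the mapping from the judgment result to a particular input object (finger versus stylus) are assumptions for illustration, as this part of the description does not specify them.

```python
# A minimal sketch of determining the input object from the first time interval,
# under assumed labels and an assumed second-condition threshold.
SECOND_CONDITION_MAX_INTERVAL_S = 0.4  # hypothetical threshold

def determine_input_object(event_times):
    """event_times: timestamps (seconds) of consecutive touch events on an
    effective sensing area. Returns an assumed label for the input object."""
    if len(event_times) < 2:
        return "unknown"
    first_interval = event_times[1] - event_times[0]
    return "finger" if first_interval <= SECOND_CONDITION_MAX_INTERVAL_S else "stylus"
```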
Example eleven
Based on the foregoing information processing method, an embodiment of the present invention provides an electronic device, in which the first obtaining unit, the processing unit, the calculating unit, the first determining unit, the third obtaining unit, the second judging unit and the third determining unit may all be implemented by a processor in the electronic device; of course, they may also be implemented by a dedicated logic circuit. In specific implementations, the processor may be a central processing unit, a microprocessor, a digital signal processor, a field programmable gate array, or the like.
Fig. 11 is a schematic structural diagram of an eleventh electronic device according to an embodiment of the present invention, and as shown in fig. 11, the electronic device 1100 includes a first obtaining unit 1101, a processing unit 1102, a calculating unit 1103, a first determining unit 1104, a third obtaining unit 1105, a second judging unit 1106, and a third determining unit 1107, where:
the first obtaining unit 1101 is configured to obtain original data of a touch event, where the touch event is a touch operation performed by an operating body on a touch detection unit;
the processing unit 1102 is configured to process the raw data of the touch event to obtain a sensing area; the sensing area is used for describing an area range of the touch detection unit sensing the operation body;
the calculating unit 1103 is configured to calculate a characteristic parameter of the sensing region;
the first determining unit 1104 is configured to determine validity of the sensing region according to the characteristic parameter of the sensing region;
the third obtaining unit 1105 is configured to obtain a duration of a touch event on an effective sensing area;
the second judging unit 1106 is configured to judge whether the duration meets a preset third condition, so as to obtain a third judgment result;
the third determining unit 1107 is configured to determine an input object based on the third judgment result.
Here, it should be noted that the above description of the electronic device embodiments is similar to the description of the method embodiments, and the electronic device embodiments have the same beneficial effects as the method embodiments, which are therefore not repeated. For technical details not disclosed in the electronic device embodiments of the present invention, reference may be made to the description of the method embodiments of the present invention; for brevity, the details are not repeated here.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. It should be understood that, in various embodiments of the present invention, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation on the implementation process of the embodiments of the present invention.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The device embodiments described above are merely illustrative; for example, the division of the units is only a logical functional division, and there may be other division manners in actual implementation, for example: multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, the functional units in the embodiments of the present invention may all be integrated into one processing unit, or each unit may serve as a single unit separately, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that all or part of the steps for implementing the method embodiments may be completed by hardware under the instruction of a program, which may be stored in a computer-readable storage medium; when executed, the program performs the steps of the method embodiments. The aforementioned storage medium includes various media that can store program code, such as a removable storage device, a Read-Only Memory (ROM), a magnetic disk, or an optical disk.
Alternatively, if the integrated unit of the present invention is implemented in the form of a software functional module and sold or used as a separate product, it may be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of the embodiments of the present invention, or the part that contributes to the prior art, may essentially be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the methods described in the embodiments of the present invention. The aforementioned storage medium includes a removable storage device, a ROM, a magnetic disk, an optical disk, or other various media that can store program code.
The above descriptions are merely specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any person skilled in the art can easily conceive of changes or substitutions within the technical scope disclosed by the present invention, and all such changes or substitutions shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (12)

1. An information processing method, characterized in that the method comprises:
acquiring original data of a touch event, wherein the touch event is a touch operation of an operating body on a touch detection unit, and the original data at least comprises the numerical value and the position of each sensing node on the touch detection unit;
processing the original data of the touch event to obtain a sensing area, wherein the sensing area is used for describing an area range of the touch detection unit sensing the operation body;
calculating characteristic parameters of the sensing region, wherein the characteristic parameters comprise area, position and main axis direction;
establishing a first touch model according to the area of the sensing area, the position of the sensing area and the main axis direction;
determining the validity of the sensing region from the first touch model, comprising: determining a sensing region representative of a finger from the first touch model, the sensing region representative of a finger being determined to be valid.
2. The method of claim 1, wherein the raw data includes at least a value and a location of each sensing node on the touch detection unit;
the processing the raw data of the touch event to obtain a sensing area includes:
and performing connected domain scanning on the original data of the touch event according to the type of the value of the sensing node to obtain the sensing area.
3. The method of claim 1, wherein determining the validity of the sensing region from the first touch model comprises:
judging whether the area of a sensing area in the first touch model meets a preset condition or not to obtain a first judgment result;
and determining the effectiveness of the sensing area according to the first judgment result.
4. The method of claim 1, wherein the establishing a first touch model according to the area of the sensing region, the position of the sensing region, and the main axis direction comprises:
acquiring the area and the position of a first sensing area, wherein the first sensing area is the sensing area with the largest area;
determining the main axis direction of the first sensing area according to the area and the position of the first sensing area;
determining a palm area of the first touch model according to the main axis direction of the first sensing area;
acquiring a second sensing region except for a sensing region constituting the palm region;
determining a distance between a center of the second sensing region and a center of the palm region;
determining a sensing region farthest from the palm region as a finger region;
and establishing a first touch model according to the palm area and the finger area.
5. The method according to any one of claims 1 to 4, further comprising:
acquiring a first time interval, wherein the first time interval is a time interval between two touch events that occur consecutively on an effective sensing area;
judging whether the first time interval meets a preset second condition or not to obtain a second judgment result;
determining an input object based on the second judgment result.
6. The method according to any one of claims 1 to 4, further comprising:
acquiring a duration of a touch event on an effective sensing area;
judging whether the duration time meets a preset third condition or not to obtain a third judgment result;
determining an input object based on the third judgment result.
7. An electronic device, comprising a first acquisition unit, a processing unit, a calculation unit, and a first determination unit, wherein:
the first acquisition unit is used for acquiring original data of a touch event, wherein the touch event is a touch operation of an operating body on a touch detection unit, and the original data at least comprises a numerical value and a position of each sensing node on the touch detection unit;
the processing unit is used for processing the original data of the touch event to obtain a sensing area; the sensing area is used for describing an area range of the touch detection unit sensing the operation body;
the calculation unit is used for calculating characteristic parameters of the sensing region, wherein the characteristic parameters comprise an area, a position and a main axis direction;
the first determination unit is used for determining the effectiveness of the sensing area according to the characteristic parameters of the sensing area, and comprises an establishing module and a second determining module, wherein:
the establishing module is used for establishing a first touch model according to the area of the sensing area, the position of the sensing area and the main axis direction;
the second determining module is used for determining the validity of the sensing region according to the first touch model, including: determining a sensing region representative of a finger from the first touch model, the sensing region representative of a finger being determined to be valid.
8. The electronic device of claim 7, wherein the raw data includes at least a value and a location of each sensing node on the touch detection unit;
and the processing unit is used for performing connected domain scanning on the original data of the touch event according to the type of the value of the sensing node to obtain the sensing region.
9. The electronic device of claim 7, wherein the second determining module comprises a judging sub-module and a first determining sub-module, wherein:
the judging submodule is used for judging whether the area of the sensing area in the first touch model meets a preset condition, to obtain a first judgment result;
the first determining submodule is used for determining the effectiveness of the sensing area according to the first judgment result.
10. The electronic device of claim 7, wherein the establishing module includes a first acquisition sub-module, a second determining sub-module, a third determining sub-module, a second acquisition sub-module, a fourth determination sub-module, a fifth determination sub-module, and an establishing sub-module, wherein:
the first acquisition submodule is used for acquiring the area and the position of a first sensing area, and the first sensing area is the sensing area with the largest area;
the second determining submodule is used for determining the main axis direction of the first sensing area according to the area and the position of the first sensing area;
the third determining submodule is used for determining a palm area of the first touch model according to the main axis direction of the first sensing area;
the second acquisition submodule is used for acquiring a second sensing area except for the sensing area forming the palm area;
the fourth determination submodule is used for determining the distance between the center of the second sensing area and the center of the palm area;
the fifth determination submodule is configured to determine a sensing region farthest from the palm region as a finger region;
and the establishing submodule is used for establishing a first touch model according to the palm area and the finger area.
11. The electronic device according to any one of claims 7 to 10, further comprising a second acquisition unit, a first judging unit, and a second determination unit, wherein:
the second acquisition unit is used for acquiring a first time interval, wherein the first time interval is a time interval between two touch events that occur consecutively on an effective sensing area;
the first judging unit is configured to judge whether the first time interval meets a preset second condition, so as to obtain a second judgment result;
the second determination unit is configured to determine an input object based on the second judgment result.
12. The electronic device according to any one of claims 7 to 10, further comprising a third acquisition unit, a second judging unit, and a third determination unit, wherein:
the third acquisition unit is used for acquiring the duration of the touch event on the effective sensing area;
the second judging unit is configured to judge whether the duration time meets a preset third condition, so as to obtain a third judgment result;
the third determination unit is configured to determine an input object based on the third judgment result.
CN201510134587.XA 2015-03-25 2015-03-25 Information processing method and electronic equipment Active CN106155642B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510134587.XA CN106155642B (en) 2015-03-25 2015-03-25 Information processing method and electronic equipment

Publications (2)

Publication Number Publication Date
CN106155642A CN106155642A (en) 2016-11-23
CN106155642B (en) 2020-07-24

Family

ID=57340465

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510134587.XA Active CN106155642B (en) 2015-03-25 2015-03-25 Information processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN106155642B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108984045A (en) * 2017-06-01 2018-12-11 深圳市鸿合创新信息技术有限责任公司 Invalid touch-control determination method, device, electronic equipment and storage medium
CN107422912B (en) * 2017-07-24 2021-04-30 神思电子技术股份有限公司 Method for preventing electromagnetic handwriting screen from being touched by mistake
CN107450840B (en) * 2017-08-04 2020-12-01 歌尔科技有限公司 Method and device for determining finger touch connected domain and electronic equipment
CN107562363B (en) * 2017-09-08 2020-07-24 Oppo广东移动通信有限公司 Touch operation method and device and terminal
TWI662460B (en) * 2018-07-18 2019-06-11 義隆電子股份有限公司 Method of changing identified type of touch object
WO2020172879A1 (en) * 2019-02-28 2020-09-03 深圳市汇顶科技股份有限公司 Method and apparatus for identifying false touch of palm, chip, device, and storage medium

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101882042A (en) * 2010-06-08 2010-11-10 苏州瀚瑞微电子有限公司 Palm judgment method of capacitive touch screen

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102385469B (en) * 2010-08-30 2015-12-02 联想(北京)有限公司 Terminal and control method thereof
CN102346596B (en) * 2011-11-14 2013-11-13 宇龙计算机通信科技(深圳)有限公司 Touch operation processing method and terminal
KR101907463B1 (en) * 2012-02-24 2018-10-12 삼성전자주식회사 Composite touch screen and operating method thereof
CN104020878A (en) * 2014-05-22 2014-09-03 小米科技有限责任公司 Touch input control method and device

Similar Documents

Publication Publication Date Title
CN106155642B (en) Information processing method and electronic equipment
US10969903B2 (en) Method, device and mobile terminal for preventing false-touch on touch screen
RU2662408C2 (en) Method, apparatus and data processing device
US9274652B2 (en) Apparatus, method, and medium for sensing movement of fingers using multi-touch sensor array
CN106598335B (en) A kind of touch screen control method, device and mobile terminal of mobile terminal
US9245166B2 (en) Operating method based on fingerprint and gesture recognition and electronic device
US9778742B2 (en) Glove touch detection for touch devices
CN104679362B (en) Touch device and control method thereof
TWI478041B (en) Method of identifying palm area of a touch panel and a updating method thereof
US9317130B2 (en) Visual feedback by identifying anatomical features of a hand
CN106681554B (en) A kind of control method of mobile terminal touch screen, device and mobile terminal
CN106855782A (en) A kind of method for preventing false touch, device and terminal
US20090066659A1 (en) Computer system with touch screen and separate display screen
CN106855783A (en) A kind of method of false-touch prevention, device and mobile terminal
KR20090060888A (en) Apparatus and method for providing an adaptive on-screen keyboard
JP2011197782A (en) Candidate display device and candidate display method
CN103164067A (en) Method for judging touch input and electronic device
CN109101127B (en) Palm touch detection in a touch screen device with a floating ground or thin touch panel
US20160342275A1 (en) Method and device for processing touch signal
US10564844B2 (en) Touch-control devices and methods for determining keys of a virtual keyboard
US20150091836A1 (en) Touch control input method and system, computer storage medium
CN108268163B (en) Determining occurrence of elongated contact of a single finger with slot analysis in a touch screen device
CN115033170A (en) Input control system and method based on virtual keyboard and related device
JP2015169948A (en) Information processing device, information processing method, and information processing program
JP2012238128A (en) Information device having back-face input function, back-face input method, and program

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant