WO2015176484A1 - Touch input control method and apparatus - Google Patents


Info

Publication number
WO2015176484A1
WO2015176484A1 (PCT/CN2014/089246; CN 2014089246 W)
Authority
WO
WIPO (PCT)
Prior art keywords
touch
edge
palm
touch information
sub
Prior art date
Application number
PCT/CN2014/089246
Other languages
English (en)
French (fr)
Inventor
杨坤
张博
朱凌
Original Assignee
Xiaomi Inc. (小米科技有限责任公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xiaomi Inc. (小米科技有限责任公司)
Priority to JP2016520274A (JP6033502B2)
Priority to RU2014153895A (RU2618921C2)
Priority to KR1020147035869A (KR101714857B1)
Priority to BR112015000003A (BR112015000003A2)
Priority to MX2014015064A (MX349777B)
Priority to US14/578,715 (US9671911B2)
Publication of WO2015176484A1

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F3/0418 Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • G06F3/04186 Touch location disambiguation
    • G06F3/044 Digitisers, e.g. for touch screens or touch pads, characterised by capacitive transducing means
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F3/04886 Interaction techniques using a touch-screen or digitiser by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • the present disclosure relates to the field of smart terminal technologies, and in particular, to a touch input control method and apparatus.
  • A terminal usually uses a TP (Touch Panel), also commonly referred to as a touch screen, to provide a touch input function to the user; a capacitive touch sensor is integrated in the TP so that the terminal can realize multi-touch human-computer interaction.
  • Terminal touch screen bezels are becoming increasingly narrow in order to provide a wider touch range to the user.
  • As a result, when the user holds the terminal, a non-finger part (for example, the portion of the palm where the thumb joins the palm) easily touches the edge of the touch screen by mistake, and the integrated circuit chip obtains and reports the touch information generated by this erroneous operation.
  • In the related art, areas of the touch screen that are prone to erroneous operation can be set in advance, and when the user touches these areas, the touch IC shields the touch information obtained from them. Although setting such anti-misoperation areas in advance avoids the touch information generated by erroneous operations, the touch information generated by the user's normal finger operations in these areas is shielded as well, resulting in low touch input accuracy.
  • The present disclosure provides a touch input control method and apparatus to solve the problem in the related art that, while touch information generated by erroneous operations is avoided, touch information of normal operations is easily blocked as well, resulting in low touch input accuracy.
  • A touch input control method is provided, including: identifying edge touch information of an edge area of a touch screen of a terminal; determining, according to the touch shape corresponding to the edge touch information, whether the edge touch information includes edge palm touch information generated by a palm touch operation; and, if the edge touch information includes the edge palm touch information, masking the edge palm touch information.
  • Optionally, identifying the edge touch information of the edge area of the touch screen of the terminal includes: obtaining touch information generated by a touch operation when the user performs the touch operation on the touch screen; determining, according to the position coordinates corresponding to the touch information, whether the touch operation is located in the edge area of the touch screen; and, if the touch operation is located in the edge area of the touch screen, determining that the touch information is the edge touch information.
  • Optionally, determining whether the edge touch information includes the edge palm touch information generated by the palm touch operation includes: obtaining the touch shape corresponding to the edge touch information from the contact distribution of the touch operation on the edge region; determining whether the touch shape matches a preset palm touch shape; and, if it matches, determining that the edge touch information includes the edge palm touch information.
  • Optionally, shielding the edge palm touch information includes: determining the corresponding sub-region of the edge palm touch information on the edge region, and turning off the touch input function of the corresponding sub-region.
  • Optionally, determining the corresponding sub-region of the edge palm touch information on the edge region includes: comparing the position coordinates in the edge palm touch information with the coordinate range of each sub-region of the edge region, and determining the first sub-region corresponding to the coordinate range to which the position coordinates belong as the corresponding sub-region.
  • Optionally, determining the corresponding sub-region of the edge palm touch information on the edge region further includes: determining the second sub-region associated with the first sub-region as a corresponding sub-region, where the second sub-region is the sub-region in which the finger touch operation corresponding to the palm touch operation is located when the terminal is held.
  • Optionally, determining the corresponding sub-region of the edge palm touch information on the edge region includes: determining whether the duration of the palm touch operation on the touch screen exceeds a preset time, and, when the duration exceeds the preset time, determining the corresponding sub-region of the edge palm touch information on the edge region.
  • Optionally, the method further includes: clearing other palm touch information generated by the palm touch operation during the palm touch operation.
  • Optionally, the method further includes: detecting the finger touch operation area on the touch screen when the terminal is held, and outputting a virtual touch control interface in the finger touch operation area.
  • a touch input control apparatus including:
  • An identification unit configured to identify edge touch information of an edge area of the touch screen of the terminal
  • a determining unit configured to determine, according to the touch shape corresponding to the edge touch information, whether the edge touch information includes palm touch information generated by a palm touch operation;
  • a shielding unit configured to block the edge palm touch information when the edge touch information includes the edge palm touch information.
  • the identifying unit includes:
  • An information obtaining subunit configured to obtain touch information generated by the touch operation when a user performs a touch operation on the touch screen of the terminal;
  • a position determining subunit configured to determine, according to location coordinates corresponding to the touch information, whether the touch operation is located in an edge region of the touch screen
  • an information determining subunit configured to determine that the touch information is the edge touch information when the touch operation is located in an edge region of the touch screen.
  • the determining unit includes:
  • a shape obtaining subunit configured to obtain a touch shape corresponding to the edge touch information by a contact distribution of a touch operation on the edge region
  • a shape matching subunit configured to determine whether a touch shape corresponding to the edge touch information matches a preset palm touch shape
  • a matching determining subunit configured to determine, when the touch shape corresponding to the edge touch information matches the palm touch shape, the edge touch information includes the edge palm touch information.
  • the shielding unit includes:
  • Determining a subunit configured to determine a corresponding sub-region of the edge palm touch information on the edge region
  • a closing subunit configured to turn off the touch input function of the corresponding sub-area.
  • the determining subunit includes:
  • a position comparison subunit configured to compare position coordinates in the edge palm touch information with a coordinate range of each subregion of the edge region
  • the first sub-area determining sub-unit is configured to determine the first sub-area corresponding to the coordinate range to which the position coordinate belongs as the corresponding sub-area.
  • the determining subunit further includes:
  • a second sub-region determining sub-unit configured to determine a second sub-region associated with the first sub-region as the corresponding sub-region, where the second sub-region is the sub-region in which the finger touch operation corresponding to the palm touch operation is located when the terminal is held.
  • the determining subunit includes:
  • a time judging subunit configured to determine whether a duration of the palm touch operation on the touch screen exceeds a preset time
  • the sub-area determining sub-unit is configured to determine a corresponding sub-area of the edge palm touch information on the edge area when the duration exceeds a preset time.
  • Optionally, the apparatus further includes: a clearing unit configured to clear other palm touch information generated by the palm touch operation during the palm touch operation.
  • Optionally, the apparatus further includes: a detecting unit configured to detect the finger touch operation area on the touch screen when the terminal is held; and an output unit configured to output a virtual touch control interface in the finger touch operation area.
  • A touch input control apparatus is provided, including: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to: identify edge touch information of an edge area of a touch screen of a terminal; determine, according to the touch shape corresponding to the edge touch information, whether the edge touch information includes edge palm touch information generated by a palm touch operation; and, if the edge touch information includes the edge palm touch information, mask the edge palm touch information.
  • The edge palm touch information included in the edge touch information may be determined according to the touch shape corresponding to the edge touch information, so that only the erroneous input touch information generated by the palm touch operation is blocked while the correct touch information generated by finger touch operations in the edge area is retained; therefore, the accuracy of the touch input can be improved.
  • After the edge palm touch information is shielded, the present disclosure can obtain the corresponding sub-region of the edge palm touch information on the edge region and specifically close the touch input function of that sub-region, further improving the accuracy of the touch input. The corresponding sub-region may include the first sub-region corresponding to the coordinate range of the position coordinates of the edge palm touch information, and may also include the second sub-region, associated with the first sub-region, in which the finger touch operation corresponding to the palm touch operation is located; the corresponding sub-region can thus be set flexibly according to the user's habit of holding the terminal, enhancing the user experience while improving the accuracy of the touch input.
  • When the duration of the edge palm touch information on the touch screen exceeds the preset time, the present disclosure may determine that the edge palm touch information is touch information generated by the user holding the terminal, and then determine the corresponding sub-region whose touch input function needs to be turned off. In this way, the corresponding sub-region is identified only when the terminal is actually held, improving the accuracy of turning off the touch input function of the sub-region.
  • FIG. 1 is a flowchart of a touch input control method according to an exemplary embodiment.
  • FIG. 2A is a flowchart of another touch input control method according to an exemplary embodiment.
  • FIG. 2B is a schematic diagram of a terminal touch screen area division according to an exemplary embodiment.
  • 2C is a schematic diagram of a user holding a terminal, according to an exemplary embodiment.
  • FIG. 2D is a schematic diagram of an area where touch information is located, according to an exemplary embodiment.
  • FIG. 2E is a schematic diagram of another area where touch information is located according to an exemplary embodiment.
  • FIG. 2F is a schematic diagram of another area where touch information is located according to an exemplary embodiment.
  • FIG. 2G is a schematic diagram of a virtual touch control interface on a touch screen according to an exemplary embodiment.
  • FIG. 3 is a block diagram of a touch input control apparatus according to an exemplary embodiment.
  • FIG. 4 is a block diagram of another touch input control apparatus according to an exemplary embodiment.
  • FIG. 5 is a block diagram of another touch input control apparatus according to an exemplary embodiment.
  • FIG. 6 is a block diagram of another touch input control apparatus according to an exemplary embodiment.
  • FIG. 7 is a block diagram of another touch input control apparatus according to an exemplary embodiment.
  • FIG. 8 is a block diagram of another touch input control apparatus according to an exemplary embodiment.
  • FIG. 9 is a block diagram of another touch input control apparatus according to an exemplary embodiment.
  • FIG. 10 is a block diagram of another touch input control apparatus according to an exemplary embodiment.
  • FIG. 11 is a schematic structural diagram of a touch input control device according to an exemplary embodiment.
  • Although the terms first, second, third, etc. may be used in the present disclosure to describe various information, such information should not be limited by these terms; these terms are only used to distinguish information of the same type from each other. For example, without departing from the scope of the present disclosure, first information may also be referred to as second information, and similarly, second information may also be referred to as first information. The word "if" as used herein may be interpreted as "when", "upon", or "in response to determining".
  • FIG. 1 is a flowchart of a touch input control method according to an exemplary embodiment. The method may be used in a terminal and includes the following steps.
  • In step 101, edge touch information of the edge area of the touch screen of the terminal is identified.
  • the terminal in the present disclosure may refer to various intelligent terminals that support touch input.
  • The terminal may support multi-touch input by means of a capacitive touch screen (also referred to as a TP), in which a special transparent metal conductive layer is attached to the surface of a glass panel. When the user performs a touch operation on the metal layer, the capacitance at the contact point on the touch screen changes, so that the frequency of the oscillator connected to the capacitor changes; the touch IC chip in the terminal can determine the touch position by measuring the frequency change, thereby obtaining the touch information.
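The sensing principle described above can be illustrated with a minimal sketch: the touch IC scans a grid of capacitance measurements and reports cells whose value deviates from the baseline beyond a threshold. The grid size, baseline value, and threshold below are illustrative assumptions, not values from the disclosure.

```python
# Toy sketch of capacitive touch localization: compare each cell of a
# sensor grid against its baseline and report cells whose capacitance
# change exceeds a threshold. Grid size and threshold are illustrative.

def locate_touches(baseline, measured, threshold=5):
    """Return (row, col) cells whose capacitance delta exceeds threshold."""
    contacts = []
    for r, (base_row, meas_row) in enumerate(zip(baseline, measured)):
        for c, (b, m) in enumerate(zip(base_row, meas_row)):
            if abs(m - b) > threshold:
                contacts.append((r, c))
    return contacts

baseline = [[100] * 4 for _ in range(4)]
measured = [row[:] for row in baseline]
measured[1][2] = 112   # a contact raises the capacitance at one cell
print(locate_touches(baseline, measured))  # [(1, 2)]
```

A real touch IC works on raw oscillator frequencies and runs blob segmentation in firmware; the thresholded delta here only captures the "measure change against baseline" idea.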
  • In general, the user holds the terminal with one hand and performs touch input with the other hand. When holding the terminal, a palm portion (for example, the root portion where the thumb joins the palm) easily touches the edge region of the touch screen. The edge region may be divided in advance on the touch screen; it generally refers to a narrow region along the four sides of the touch screen, and the part of the touch screen other than the edge region is called the central region.
  • When the user performs a touch operation on the touch screen of the terminal, the touch information generated by the touch operation may be obtained. The touch information may generally include the position coordinates of the touch input (including the abscissa and ordinate), the touch area size, the touch state (touch-down, lift, slide), and so on. The terminal determines, according to the position coordinates corresponding to the touch information, whether the touch operation is located within the range of the touch screen edge region; if the touch operation is located in the edge region, the touch information is determined to be edge touch information.
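The coordinate test in step 101 can be sketched as follows; the screen resolution and the width of the edge band are illustrative assumptions, since the disclosure does not give numeric values.

```python
# Sketch of step 101: classify a touch as "edge" when its position
# coordinates fall inside a narrow band along any of the four sides.
# Screen size and band width are assumed values for illustration.

SCREEN_W, SCREEN_H = 1080, 1920
EDGE_BAND = 60  # width of the edge region in pixels (assumed)

def is_edge_touch(x, y, w=SCREEN_W, h=SCREEN_H, band=EDGE_BAND):
    """True if (x, y) lies in the edge region along any of the four sides."""
    return x < band or x > w - band or y < band or y > h - band

print(is_edge_touch(30, 960))    # True: near the left side
print(is_edge_touch(540, 960))   # False: central area
```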
  • In step 102, it is determined, according to the touch shape corresponding to the edge touch information, whether the edge touch information includes edge palm touch information generated by a palm touch operation.
  • A touch operation in the edge region of the touch screen forms corresponding contacts on the touch screen, and the touch shape corresponding to the edge touch information can be obtained from the contact distribution. It is then determined whether that touch shape matches a preset palm touch shape; if it matches, it may be determined that the edge touch information includes the edge palm touch information.
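The disclosure matches the touch shape against a preset palm shape but does not specify the matching algorithm. One plausible stand-in, shown below, is a bounding-box heuristic: a palm-root contact is large and elongated (the ellipse of FIG. 2C), while a fingertip contact is small and round. The size and elongation thresholds are illustrative assumptions.

```python
# Assumed shape-matching heuristic (not the disclosure's algorithm):
# decide whether a blob of contact cells looks like a palm by checking
# the extent and aspect ratio of its bounding box.

def looks_like_palm(points, min_extent=40, min_aspect=1.8):
    """points: list of (x, y) contact cells forming one touch blob."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    w = max(xs) - min(xs) + 1
    h = max(ys) - min(ys) + 1
    long_side, short_side = max(w, h), min(w, h)
    # Palm root: large, elongated ellipse.  Fingertip: small, round circle.
    return long_side >= min_extent and long_side / short_side >= min_aspect

palm_blob = [(x, y) for x in range(10) for y in range(80)]    # tall ellipse
finger_blob = [(x, y) for x in range(12) for y in range(12)]  # small circle
print(looks_like_palm(palm_blob))    # True
print(looks_like_palm(finger_blob))  # False
```

Production implementations would also use per-contact touch area and pressure, but the elliptical-versus-circular distinction of FIG. 2C is what this sketch encodes.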
  • In step 103, if the edge touch information includes the edge palm touch information, the edge palm touch information is masked.
  • When masking, the corresponding sub-region of the edge palm touch information on the edge region may be determined, and the touch input function of that sub-region may then be turned off.
  • The above embodiment can determine the edge palm touch information included in the edge touch information according to the corresponding touch shape, thereby shielding only the erroneous input touch information generated by the palm touch operation while retaining the correct touch information generated by finger touch operations in the edge region; therefore, the accuracy of touch input can be improved.
  • FIG. 2A is a flowchart of another touch input control method according to an exemplary embodiment. The method may be used in a terminal and includes the following steps.
  • In step 201, edge touch information of the edge area of the touch screen of the terminal is identified.
  • In general, the user holds the terminal with one hand and performs touch input with the other hand. When holding the terminal, a palm portion (for example, the root portion where the thumb joins the palm, hereinafter referred to as the palm root portion) easily touches the edge region of the touch screen. Therefore, in the present disclosure, the edge region can be divided in advance on the touch screen; the edge region generally refers to a narrow region along the four sides of the touch screen, and the remaining part of the touch screen is called the central area.
  • FIG. 2B is a schematic diagram of a touch screen area division of a terminal according to an exemplary embodiment of the present disclosure.
  • As shown in FIG. 2B, the terminal touch screen may be divided into an edge area and a central area, wherein, according to the positions where the palm is likely to contact the edge area when holding the terminal, the edge region is further divided into six sub-regions: sub-region A, sub-region B, sub-region C, sub-region D, sub-region E, and sub-region F.
  • When the user performs a touch operation on the touch screen, the touch information generated by the touch operation may be obtained. The touch information may generally include the position coordinates of the touch input (including the abscissa and ordinate), the touch area size, the touch state (touch-down, lift, slide), and so on. The terminal determines, according to the position coordinates corresponding to the touch information, whether the touch operation is located in the edge area of the touch screen; if so, the touch information may be determined to be edge touch information. With reference to FIG. 2B, if the terminal determines according to the position coordinates that the touch operation is located in any of sub-regions A to F, the touch information may be determined to be edge touch information.
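The mapping from position coordinates to sub-regions A–F can be sketched as below. The disclosure does not give the numeric layout of FIG. 2B, so the split used here (left edge divided top-to-bottom into A/B/C, right edge into D/E/F, each one third of the screen height, with assumed screen and band sizes) is an illustrative assumption.

```python
# Assumed FIG. 2B layout: left edge band holds sub-regions A/B/C and the
# right edge band holds D/E/F, each spanning a third of the screen height.

SCREEN_W, SCREEN_H = 1080, 1920
EDGE_BAND = 60  # width of the edge bands in pixels (assumed)

def edge_subregion(x, y):
    """Return 'A'..'F' for a point inside an edge band, else None."""
    third = SCREEN_H // 3
    row = min(y // third, 2)         # 0, 1, 2 from top to bottom
    if x < EDGE_BAND:
        return "ABC"[row]            # left edge band
    if x > SCREEN_W - EDGE_BAND:
        return "DEF"[row]            # right edge band
    return None                      # central area

print(edge_subregion(30, 900))    # 'B'  (left edge, middle third)
print(edge_subregion(1060, 900))  # 'E'  (right edge, middle third)
print(edge_subregion(540, 900))   # None (central area)
```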
  • In step 202, it is determined, according to the touch shape corresponding to the edge touch information, whether the edge touch information includes edge palm touch information generated by a palm touch operation. If the edge touch information includes edge palm touch information, step 203 is performed; otherwise, the flow returns to step 201.
  • A touch operation in the edge region of the touch screen forms corresponding contacts on the touch screen, and the touch shape corresponding to the edge touch information can be obtained from the contact distribution. It is then determined whether that touch shape matches a preset palm touch shape; if it matches, it may be determined that the edge touch information includes the edge palm touch information.
  • FIG. 2C is a schematic diagram of a user holding a terminal according to an exemplary embodiment of the present disclosure.
  • As shown in FIG. 2C, the user holds the terminal with the left hand; the palm root portion contacts the edge region of the touch screen and forms contacts there, the outline of whose distribution is the ellipse shown by the broken line at the palm root portion. The fingertip portion of the thumb forms a contact in the central area, the outline of whose distribution is the circle shown by the dotted line at the fingertip portion. It can thus be seen that at least the palm touch operation and the finger touch operation can be distinguished by the shape corresponding to the touch information.
  • FIG. 2D is a schematic diagram of the area where the touch information is located according to an exemplary embodiment. Corresponding to the holding manner of FIG. 2C, the terminal obtains the elliptical touch shape of the palm root portion in the edge area of the touch screen, and may compare the touch shape with the preset palm touch shape to determine the type of the edge touch information; if they match, the obtained edge touch information may be determined to be edge palm touch information.
  • In some holding postures the fingertip portion also touches the edge region. FIG. 2E, according to another exemplary embodiment, shows another schematic diagram of the area in which the edge touch information is located; it differs from FIG. 2D in that the circular touch shape corresponding to the thumb in FIG. 2E is located in the edge region.
  • In step 203, the edge palm touch information is masked.
  • When the edge touch information includes the edge palm touch information, the edge palm touch information is directly shielded, that is, the edge palm touch information is not reported.
  • With reference to FIG. 2D and FIG. 2E, the terminal shields the edge palm touch information corresponding to the ellipse on the edge region of the touch screen. The difference is that in FIG. 2D the touch information of the thumb operation is located in the central area, while in FIG. 2E it is located in the edge area; in either holding posture, the embodiment of the present disclosure shields only the edge palm touch information and does not shield normal touch input information located in any area of the touch screen.
  • In some scenarios the palm root portion directly grips the edge area of the terminal, in which case the recognized edge palm touch information is simply not reported, thereby shielding it. In other scenarios the palm root portion slides a certain distance across the touch screen before settling on the edge of the terminal; during the sliding process it may generate palm touch information in the central area of the touch screen. Such palm touch information is useless for the touch operation, so by updating the state information of the palm touch information, the other palm touch information generated by the palm touch operation during the sliding may be cleared, where the state information may include an indication that the contact has left the touch screen, or the position coordinates in the palm touch information.
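The shielding-and-clearing behaviour can be sketched as an event filter: every event belonging to a contact classified as a palm is withheld, including the trail the palm produced while sliding through the central area. The event dictionary format and its `id`/`kind` fields are illustrative assumptions, not the disclosure's data structures.

```python
# Sketch of shielding edge palm touch information and clearing the trail
# a sliding palm left in the central area. Event format is assumed.

def filter_events(events):
    """events: list of dicts with 'id', 'kind' ('palm'/'finger'), 'pos'.
    Returns only the events that should be reported to applications."""
    palm_ids = {e["id"] for e in events if e["kind"] == "palm"}
    # Drop every event belonging to a palm contact, including the trail
    # it generated while sliding through the central area.
    return [e for e in events if e["id"] not in palm_ids]

stream = [
    {"id": 1, "kind": "palm", "pos": (400, 900)},    # palm sliding inwards
    {"id": 1, "kind": "palm", "pos": (40, 900)},     # palm settled on edge
    {"id": 2, "kind": "finger", "pos": (540, 500)},  # normal finger tap
]
print(filter_events(stream))  # only the finger event remains
```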
  • In step 204, it is determined whether the duration of the palm touch operation on the touch screen exceeds a preset time. If the duration exceeds the preset time, step 205 is performed; otherwise, the flow returns to step 201.
  • When the user holds the terminal, the palm touches the edge area for a long time, whereas when the user performs other touch operations on the terminal, the palm only occasionally touches the edge area. This step may therefore further determine whether the duration of the palm touch information on the touch screen exceeds a preset time, which defines the minimum time for which the palm is likely to be holding the terminal. In this way it can be determined whether the palm touch information was generated by holding the terminal, which is then used to decide whether to turn off the touch function of the corresponding sub-area on the edge region.
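The duration gate of step 204 reduces to a single comparison against the preset time; the 0.5 s threshold below is an illustrative assumption, since the disclosure does not specify a value.

```python
# Sketch of step 204: treat a palm contact as "holding the terminal"
# only when it has persisted longer than a preset time (assumed 0.5 s).

PRESET_TIME = 0.5  # seconds (assumed value)

def is_holding(touch_down_ts, now_ts, preset=PRESET_TIME):
    """True if the palm contact has lasted longer than the preset time."""
    return (now_ts - touch_down_ts) > preset

print(is_holding(10.0, 10.8))  # True: palm has rested for 0.8 s
print(is_holding(10.0, 10.1))  # False: brief accidental brush
```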
  • In step 205, the corresponding sub-region of the edge palm touch information on the edge region is determined.
  • The position coordinates of the edge palm touch information may be compared with the coordinate range of each sub-region of the edge region, and the first sub-region corresponding to the coordinate range to which the position coordinates belong is determined as the corresponding sub-region. In addition, the second sub-area associated with the first sub-area may be determined as the corresponding sub-area; the second sub-area is the sub-area in which the finger touch operation corresponding to the palm touch operation is located when the terminal is gripped.
  • Referring to FIG. 2B through FIG. 2E, if the duration of the elliptical edge palm touch information on the edge region exceeds the preset time, then, according to the sub-regions into which the edge region is divided in FIG. 2B, comparison of the position coordinates shows that the corresponding sub-area of the edge palm touch information is the sub-area C.
  • For another optional example, FIG. 2F is a schematic diagram of another region in which touch information is located according to an exemplary embodiment: referring to FIG. 2C, since the left hand grips the terminal, besides the palm portion holding the left edge region, the other four fingers of the left hand may also hold the right edge region, as shown in FIG. 2C. Therefore, after the sub-region C corresponding to the edge palm touch information is identified, the sub-region E opposite sub-region C can be obtained directly as the associated sub-area; sub-area E is the sub-area in which the finger touch operation corresponding to the palm touch operation is located.
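The pairing between the palm's sub-region and the associated finger sub-region on the opposite edge can be sketched as a lookup table. Only the C/E pairing is stated in the text (FIG. 2B, FIG. 2F); the other pairings, and all names, are illustrative assumptions about how the six sub-regions face each other.

```python
# Assumed opposite-edge pairing of the six sub-regions A-F from FIG. 2B;
# the patent only states that sub-region E is opposite sub-region C.
OPPOSITE_SUBREGION = {"A": "D", "D": "A",
                      "B": "F", "F": "B",
                      "C": "E", "E": "C"}

def subregions_to_disable(palm_subregion):
    """First sub-region (under the palm) plus the associated second sub-region."""
    return {palm_subregion, OPPOSITE_SUBREGION[palm_subregion]}
```

With the left-hand grip of FIG. 2C, `subregions_to_disable("C")` yields both C (under the palm root) and E (under the wrapping fingers).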
  • In step 206, the touch input function of the corresponding sub-area is turned off, and the current flow ends.
  • As can be seen from the above embodiment, after the edge touch information of the touch screen edge region is identified, the edge palm touch information it contains can be determined from the touch shape corresponding to the edge touch information, so that only the erroneous input touch information generated by the palm touch operation is shielded while the correct touch information generated by finger touch operations in the edge area is retained, improving the accuracy of the touch input. Moreover, after the edge palm touch information is shielded, the corresponding sub-area of the edge palm information on the edge area can be obtained, so the touch input function of that sub-area can be turned off in a targeted manner, further improving the accuracy of the touch input; the corresponding sub-area can be set flexibly according to the user's habit of gripping the terminal, enhancing the user experience while improving touch input accuracy. In addition, the corresponding sub-area can be identified only while the terminal is gripped, thereby improving the accuracy of turning off the touch input function of the sub-area.
  • FIG. 2G is a schematic diagram of a virtual touch control interface on a touch screen according to an exemplary embodiment. In FIG. 2G, a virtual touch control icon, here exemplified by a sliding control button icon, is displayed in the touch operation area near the thumb fingertip. FIG. 2G shows only an example of outputting the virtual touch control interface in the thumb touch operation area; in practical applications, any finger touch operation area on the touch screen while the terminal is gripped may also be detected from information such as the touch area and touch shape, and the touch control interface may be output in the detected area, which is not limited in the embodiments of the present disclosure.
  • Corresponding to the foregoing embodiments of the touch input control method, the present disclosure also provides embodiments of a touch input control device and a terminal.
  • FIG. 3 is a block diagram of a touch input control device according to an exemplary embodiment of the present disclosure.
  • the device includes an identification unit 31, a determination unit 32, and a shielding unit 33.
  • the identification unit 31 is configured to identify edge touch information of an edge region of the touch screen of the terminal;
  • the determining unit 32 is configured to determine, according to the touch shape corresponding to the edge touch information, whether the edge touch information includes palm touch information generated by a palm touch operation;
  • the shielding unit 33 is configured to shield the edge palm touch information when the edge touch information includes the edge palm touch information.
  • In the above embodiment, after the edge touch information of the touch screen edge region is identified, the edge palm touch information it contains can be determined from the corresponding touch shape, so that only the erroneous input touch information generated by the palm touch operation is shielded while the correct touch information generated by finger touch operations in the edge area is preserved, thereby improving the accuracy of the touch input.
  • FIG. 4 is a block diagram of another touch input control device according to an exemplary embodiment of the present disclosure.
  • the identification unit 31 may include: an information acquisition sub-unit 311, a position determination sub-unit 312, and an information determination sub-unit 313.
  • the information obtaining sub-unit 311 is configured to obtain touch information generated by the touch operation when the user performs a touch operation on the touch screen of the terminal;
  • the location determining sub-unit 312 is configured to determine, according to location coordinates corresponding to the touch information, whether the touch operation is located in an edge region of the touch screen;
  • the information determining sub-unit 313 is configured to determine that the touch information is the edge touch information when the touch operation is located in the touch screen edge region.
  • FIG. 5 is a block diagram of another touch input control device according to an exemplary embodiment of the present disclosure.
  • the determining unit 32 may include: a shape obtaining sub-unit 321, a shape matching sub-unit 322, and a matching determining sub-unit 323.
  • the shape obtaining sub-unit 321 is configured to obtain the touch shape corresponding to the edge touch information from the contact distribution of the touch operation on the edge region;
  • the shape matching sub-unit 322 is configured to determine whether the touch shape corresponding to the edge touch information matches a preset palm touch shape;
  • the matching determination sub-unit 323 is configured to determine that the edge touch information includes the edge palm touch information when the touch shape corresponding to the edge touch information matches the palm touch shape.
  • the configuration of the determining unit 32 in the device embodiment shown in FIG. 5 may also be included in the foregoing device embodiment shown in FIG. 4, and the disclosure is not limited thereto.
  • In the above embodiment, the contact distribution may be used to determine the touch shape corresponding to the edge touch information, and thereby the edge palm touch information contained in it, so that only the erroneous input touch information generated by the palm touch operation is shielded while the correct touch information generated by finger touch operations in the edge area is retained, improving the accuracy of the touch input.
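The shape-matching idea can be sketched by reducing a touch's contact points to a bounding box and comparing it against the preset shapes: a large, elongated patch matches the elliptical palm shape of FIG. 2D, while a small, roundish patch matches the circular finger shape. The function name and both thresholds are invented for illustration, not taken from the patent.

```python
def classify_touch(points, min_palm_area=400.0, min_palm_elongation=1.5):
    """Classify one touch's contact points as 'palm' or 'finger'."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    w = max(xs) - min(xs)
    h = max(ys) - min(ys)
    area = w * h
    # Elongation of the bounding box; guard against zero extent.
    elongation = max(w, h) / max(min(w, h), 1e-9)
    if area >= min_palm_area and elongation >= min_palm_elongation:
        return "palm"    # large elongated patch: matches the preset ellipse
    return "finger"      # small round patch: matches the preset circle
```

A production implementation would match against calibrated shape templates rather than these two heuristics; this only illustrates how contact distribution can separate the two touch types.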
  • FIG. 6 is a block diagram of another touch input control device according to an exemplary embodiment of the present disclosure.
  • the shielding unit 33 may include: a determining sub-unit 331 and a closing sub-unit 332.
  • the determining sub-unit 331 is configured to determine a corresponding sub-area of the edge palm touch information on the edge area;
  • the closing sub-unit 332 is configured to close a touch input function of the corresponding sub-area.
  • the determining sub-unit 331 and the closing sub-unit 332 in the device embodiment shown in FIG. 6 may also be included in the foregoing device embodiments shown in FIG. 4 or FIG. 5, and the disclosure is not limited thereto.
  • the corresponding sub-region of the edge palm information on the edge region can be obtained, thereby specifically closing the touch input function of the corresponding sub-region, thereby further improving the accuracy of the touch input.
  • FIG. 7 is a block diagram of another touch input control device according to an exemplary embodiment of the present disclosure.
  • the determining sub-unit 331 may include: a position comparison sub-unit 3311, a first sub-area determination sub-unit 3312, and a second sub-area determination sub-unit 3313.
  • the position comparison sub-unit 3311 is configured to compare the position coordinates in the edge palm touch information with the coordinate range of each sub-area of the edge region;
  • the first sub-area determining sub-unit 3312 is configured to determine the first sub-area corresponding to the coordinate range to which the position coordinate belongs as the corresponding sub-area.
  • the second sub-area determining sub-unit 3313 is configured to determine a second sub-area associated with the first sub-area as the corresponding sub-area, where the second sub-area is the sub-area in which the finger touch operation corresponding to the palm touch operation is located when the terminal is gripped.
  • It should be noted that, for convenience of illustration, the determining sub-unit 331 in the embodiment shown in FIG. 7 includes both the first sub-area determining sub-unit 3312 and the second sub-area determining sub-unit 3313; in practical applications, only either of these sub-units may be included as needed, which is not limited in this embodiment.
  • In the above embodiment, the corresponding sub-area may include the first sub-area corresponding to the coordinate range to which the position coordinates of the edge palm touch information belong, and may also include the second sub-area, associated with the first sub-area, in which the finger touch operation corresponding to the palm touch operation is located, so that the corresponding sub-area can be set flexibly according to the user's habit of gripping the terminal, enhancing the user experience while improving touch input accuracy.
  • FIG. 8 is a block diagram of another touch input control device according to an exemplary embodiment of the present disclosure.
  • This embodiment is based on the foregoing embodiment shown in FIG. 6 or FIG. 7; the determining sub-unit 331 may include a time judging sub-unit 3314 and a sub-area determining sub-unit 3315.
  • the time judging sub-unit 3314 is configured to determine whether the duration of the palm touch operation on the touch screen exceeds a preset time;
  • the sub-area determining sub-unit 3315 is configured to determine a corresponding sub-area of the edge palm touch information on the edge area when the duration exceeds a preset time.
  • In the above embodiment, when the duration of the edge palm touch information on the touch screen exceeds the preset time, the edge palm touch information can be determined to be touch information generated by the user gripping the terminal, and only then is the corresponding sub-area whose touch input function needs to be turned off determined; the corresponding sub-area is thus identified only while the terminal is gripped, thereby improving the accuracy of turning off the touch input function of the sub-area.
  • FIG. 9 is a block diagram of another touch input control device according to an exemplary embodiment of the present disclosure.
  • Based on the foregoing embodiment shown in FIG. 3, the device may further include:
  • the clearing unit 34 is configured to clear other palm touch information generated by the palm touch operation during the palm touch operation.
  • the clearing unit 34 in the device embodiment shown in FIG. 9 may also be included in any of the foregoing device embodiments shown in FIG. 4 to FIG. 8 , and the disclosure is not limited thereto.
  • FIG. 10 is a block diagram of another touch input control device according to an exemplary embodiment of the present disclosure.
  • Based on the foregoing embodiment shown in FIG. 3, the device may further include: a detecting unit 35 and an output unit 36.
  • the detecting unit 35 is configured to detect a finger touch operation area on the touch screen when the terminal is held;
  • the output unit 36 is configured to output a virtual touch control interface in the finger touch operation area.
  • the detecting unit 35 and the output unit 36 in the device embodiment shown in FIG. 10 may also be included in any of the foregoing device embodiments shown in FIG. 4 to FIG. 9, and the disclosure is not limited thereto.
  • Since the device embodiments basically correspond to the method embodiments, reference may be made to the relevant description of the method embodiments.
  • The device embodiments described above are merely illustrative. The units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the modules may be selected according to actual needs to achieve the objectives of the present disclosure, which those of ordinary skill in the art can understand and implement without creative effort.
  • Accordingly, the present disclosure also provides a touch input control device, including: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to: identify edge touch information of an edge region of a touch screen of a terminal; determine, according to the touch shape corresponding to the edge touch information, whether the edge touch information includes edge palm touch information generated by a palm touch operation; and, if the edge touch information includes the edge palm touch information, shield the edge palm touch information.
  • Accordingly, the present disclosure also provides a terminal including a memory and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by one or more processors, and the one or more programs include instructions for: identifying edge touch information of a touch screen edge region of the terminal; determining, according to the touch shape corresponding to the edge touch information, whether the edge touch information includes edge palm touch information generated by a palm touch operation; and, if the edge touch information includes the edge palm touch information, shielding the edge palm touch information.
  • FIG. 11 is another schematic structural diagram of a touch input control device 1100 according to an exemplary embodiment of the present disclosure.
  • device 1100 can be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a gaming console, a tablet device, a medical device, a fitness device, a personal digital assistant, and the like.
  • apparatus 1100 can include one or more of the following components: processing component 1102, memory 1104, power component 1106, multimedia component 1108, audio component 1110, input/output (I/O) interface 1112, sensor component 1114, and Communication component 1116.
  • Processing component 1102 typically controls the overall operation of device 1100, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations.
  • Processing component 1102 can include one or more processors 1120 to execute instructions to perform all or part of the steps of the above described methods.
  • processing component 1102 can include one or more modules to facilitate interaction between component 1102 and other components.
  • the processing component 1102 can include a multimedia module to facilitate interaction between the multimedia component 1108 and the processing component 1102.
  • Memory 1104 is configured to store various types of data to support operation at device 1100. Examples of such data include instructions for any application or method operating on device 1100, contact data, phone book data, messages, pictures, videos, and the like.
  • The memory 1104 can be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
  • Power component 1106 provides power to various components of device 1100.
  • Power component 1106 can include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for device 1100.
  • The multimedia component 1108 includes a screen that provides an output interface between the device 1100 and the user.
  • the screen can include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen can be implemented as a touch screen to receive input signals from the user.
  • the touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor may sense not only the boundary of the touch or sliding action, but also the duration and pressure associated with the touch or slide operation.
  • the multimedia component 1108 includes a front camera and/or a rear camera. When the device 1100 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front and rear camera can be a fixed optical lens system or have focal length and optical zoom capabilities.
  • the audio component 1110 is configured to output and/or input an audio signal.
  • the audio component 1110 includes a microphone (MIC) that is configured to receive an external audio signal when the device 1100 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode.
  • the received audio signal may be further stored in memory 1104 or transmitted via communication component 1116.
  • the audio component 1110 also includes a speaker for outputting an audio signal.
  • the input/output interface 1112 provides an interface between the processing component 1102 and the peripheral interface module, and the peripheral interface module may be a keyboard, a click wheel, a button, or the like. These buttons may include, but are not limited to, a home button, a volume button, a start button, and a lock button.
  • Sensor assembly 1114 includes one or more sensors for providing a status assessment of various aspects to device 1100.
  • For example, the sensor assembly 1114 can detect the open/closed state of the device 1100 and the relative positioning of components, such as the display and keypad of the device 1100; the sensor assembly 1114 can also detect a change in position of the device 1100 or of one of its components, the presence or absence of user contact with the device 1100, the orientation or acceleration/deceleration of the device 1100, and changes in its temperature.
  • Sensor assembly 1114 can include a proximity sensor configured to detect the presence of nearby objects without any physical contact.
  • Sensor assembly 1114 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications.
  • the sensor assembly 1114 can also include an acceleration sensor, a gyro sensor, a magnetic sensor, a pressure sensor, a microwave sensor, or a temperature sensor.
  • Communication component 1116 is configured to facilitate wired or wireless communication between device 1100 and other devices.
  • the device 1100 can access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof.
  • communication component 1116 receives broadcast signals or broadcast associated information from an external broadcast management system via a broadcast channel.
  • In one exemplary embodiment, the communication component 1116 also includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module can be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
  • In an exemplary embodiment, the apparatus 1100 may be implemented by one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the above methods.
  • In an exemplary embodiment, there is also provided a non-transitory computer readable storage medium comprising instructions, such as the memory 1104 comprising instructions executable by the processor 1120 of the apparatus 1100 to perform the above method.
  • the non-transitory computer readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device.
  • A non-transitory computer readable storage medium, the instructions in which, when executed by a processor of a terminal, enable the terminal to perform a touch input control method, the method comprising: identifying edge touch information of an edge region of the touch screen of the terminal; determining, according to the touch shape corresponding to the edge touch information, whether the edge touch information includes edge palm touch information generated by a palm touch operation; and, if the edge touch information includes the edge palm touch information, shielding the edge palm touch information.

Abstract

The present disclosure relates to a touch input control method and device. The method includes: identifying edge touch information of an edge region of a touch screen of a terminal; determining, according to a touch shape corresponding to the edge touch information, whether the edge touch information includes edge palm touch information generated by a palm touch operation; and, if the edge touch information includes the edge palm touch information, shielding the edge palm touch information. According to the present disclosure, the edge palm touch information contained in the edge touch information can be determined from the corresponding touch shape, so that only erroneous input touch information generated by the palm touch operation is shielded while the correct touch information generated by finger touch operations in the edge region is retained, thereby improving the accuracy of touch input.

Description

Touch input control method and device
This application is based on and claims priority to Chinese Patent Application No. 201410219188.9, filed on May 22, 2014, the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates to the technical field of smart terminals, and in particular to a touch input control method and device.
Background
With the development and popularization of smart terminals, terminals usually provide touch input to users through a TP (Touch Panel, the touch screen assembly). A capacitive touch sensor is integrated in the TP so that the terminal can realize multi-point touch human-machine interaction; the TP is what is commonly called a touch screen. To enhance the user's touch experience, the bezels of terminal touch screens have gradually become narrower in order to provide a wider touch range. However, as the bezel narrows, when the user grips the terminal, non-finger parts (for example, the palm-root portion where the thumb joins the palm) easily touch the screen, so that the terminal obtains and reports touch information produced by this misoperation through the Touch IC (Touch Integrated Circuit) chip.
In the related art, the regions of the touch screen prone to user misoperation can be configured in advance; when the user touches these regions, the Touch IC shields the touch information obtained from them. However, although presetting such anti-misoperation regions avoids the touch information produced by misoperation, it also shields the touch information produced by the user's fingers operating normally in those regions, resulting in low touch input accuracy.
Summary
The present disclosure provides a touch input control method and device to solve the problem in the related art that, when touch information produced by misoperation is avoided, touch information from normal operations is easily shielded as well, resulting in low touch input accuracy.
According to a first aspect of the embodiments of the present disclosure, a touch input control method is provided, including:
identifying edge touch information of an edge region of a touch screen of a terminal;
determining, according to a touch shape corresponding to the edge touch information, whether the edge touch information includes edge palm touch information generated by a palm touch operation; and
if the edge touch information includes the edge palm touch information, shielding the edge palm touch information.
Optionally, identifying the edge touch information of the edge region of the touch screen of the terminal includes:
when a user performs a touch operation on the touch screen of the terminal, obtaining touch information generated by the touch operation;
determining, according to position coordinates corresponding to the touch information, whether the touch operation is located in the edge region of the touch screen; and
if the touch operation is located in the edge region of the touch screen, determining that the touch information is the edge touch information.
Optionally, determining, according to the touch shape corresponding to the edge touch information, whether the edge touch information includes the edge palm touch information generated by a palm touch operation includes:
obtaining the touch shape corresponding to the edge touch information from the contact distribution of the touch operation on the edge region;
determining whether the touch shape corresponding to the edge touch information matches a preset palm touch shape; and
if the touch shape corresponding to the edge touch information matches the palm touch shape, determining that the edge touch information includes the edge palm touch information.
Optionally, shielding the edge palm touch information includes:
determining a corresponding sub-region of the edge palm touch information on the edge region; and
turning off the touch input function of the corresponding sub-region.
Optionally, determining the corresponding sub-region of the edge palm touch information on the edge region includes:
comparing the position coordinates in the edge palm touch information with the coordinate range of each sub-region of the edge region; and
determining a first sub-region corresponding to the coordinate range to which the position coordinates belong as the corresponding sub-region.
Optionally, determining the corresponding sub-region of the edge palm touch information on the edge region further includes:
determining a second sub-region associated with the first sub-region as the corresponding sub-region, the second sub-region being the sub-region in which the finger touch operation corresponding to the palm touch operation is located when the terminal is gripped.
Optionally, determining the corresponding sub-region of the edge palm touch information on the edge region includes:
determining whether the duration of the palm touch operation on the touch screen exceeds a preset time; and
if the duration exceeds the preset time, determining the corresponding sub-region of the edge palm touch information on the edge region.
Optionally, the method further includes:
clearing other palm touch information, apart from the edge palm touch information, generated during the palm touch operation.
Optionally, the method further includes:
detecting a finger touch operation area on the touch screen when the terminal is gripped; and
outputting a virtual touch control interface in the finger touch operation area.
According to a second aspect of the embodiments of the present disclosure, a touch input control device is provided, including:
an identification unit configured to identify edge touch information of an edge region of a touch screen of a terminal;
a determining unit configured to determine, according to a touch shape corresponding to the edge touch information, whether the edge touch information includes palm touch information generated by a palm touch operation; and
a shielding unit configured to shield the edge palm touch information when the edge touch information includes the edge palm touch information.
Optionally, the identification unit includes:
an information obtaining sub-unit configured to obtain, when a user performs a touch operation on the touch screen of the terminal, touch information generated by the touch operation;
a position determining sub-unit configured to determine, according to position coordinates corresponding to the touch information, whether the touch operation is located in the edge region of the touch screen; and
an information determining sub-unit configured to determine, when the touch operation is located in the edge region of the touch screen, that the touch information is the edge touch information.
Optionally, the determining unit includes:
a shape obtaining sub-unit configured to obtain the touch shape corresponding to the edge touch information from the contact distribution of the touch operation on the edge region;
a shape matching sub-unit configured to determine whether the touch shape corresponding to the edge touch information matches a preset palm touch shape; and
a matching determining sub-unit configured to determine, when the touch shape corresponding to the edge touch information matches the palm touch shape, that the edge touch information includes the edge palm touch information.
Optionally, the shielding unit includes:
a determining sub-unit configured to determine a corresponding sub-region of the edge palm touch information on the edge region; and
a closing sub-unit configured to turn off the touch input function of the corresponding sub-region.
Optionally, the determining sub-unit includes:
a position comparison sub-unit configured to compare the position coordinates in the edge palm touch information with the coordinate range of each sub-region of the edge region; and
a first sub-region determining sub-unit configured to determine a first sub-region corresponding to the coordinate range to which the position coordinates belong as the corresponding sub-region.
Optionally, the determining sub-unit further includes:
a second sub-region determining sub-unit configured to determine a second sub-region associated with the first sub-region as the corresponding sub-region, the second sub-region being the sub-region in which the finger touch operation corresponding to the palm touch operation is located when the terminal is gripped.
Optionally, the determining sub-unit includes:
a time judging sub-unit configured to determine whether the duration of the palm touch operation on the touch screen exceeds a preset time; and
a sub-region determining sub-unit configured to determine, when the duration exceeds the preset time, the corresponding sub-region of the edge palm touch information on the edge region.
Optionally, the device further includes:
a clearing unit configured to clear other palm touch information, apart from the edge palm touch information, generated during the palm touch operation.
Optionally, the device further includes:
a detecting unit configured to detect a finger touch operation area on the touch screen when the terminal is gripped; and
an output unit configured to output a virtual touch control interface in the finger touch operation area.
According to a third aspect of the embodiments of the present disclosure, a touch input control device is provided, including:
a processor; and
a memory for storing processor-executable instructions;
wherein the processor is configured to:
identify edge touch information of an edge region of a touch screen of a terminal;
determine, according to a touch shape corresponding to the edge touch information, whether the edge touch information includes edge palm touch information generated by a palm touch operation; and
if the edge touch information includes the edge palm touch information, shield the edge palm touch information.
The technical solutions provided by the embodiments of the present disclosure may have the following beneficial effects:
After the edge touch information of the touch screen edge region is identified, the edge palm touch information contained in it can be determined from the corresponding touch shape, so that only erroneous input touch information generated by the palm touch operation is shielded while the correct touch information generated by finger touch operations in the edge region is retained, thereby improving the accuracy of touch input.
After the edge palm touch information is shielded, its corresponding sub-region on the edge region can be obtained, so the touch input function of the corresponding sub-region can be turned off in a targeted manner, further improving the accuracy of touch input. The corresponding sub-region may include not only the first sub-region corresponding to the coordinate range to which the position coordinates of the edge palm touch information belong, but also the second sub-region, associated with the first sub-region, in which the finger touch operation corresponding to the palm touch operation is located, so that the corresponding sub-region can be set flexibly according to the user's habit of gripping the terminal, enhancing the user experience while improving touch input accuracy.
When the duration of the edge palm touch information on the touch screen exceeds the preset time, the edge palm touch information can be determined to be touch information generated by the user gripping the terminal, and only then is the corresponding sub-region whose touch input function needs to be turned off determined; the corresponding sub-region is thus identified only while the terminal is gripped, improving the accuracy of turning off the sub-region's touch input function.
It should be understood that the foregoing general description and the following detailed description are merely exemplary and explanatory, and do not limit the present disclosure.
Brief Description of the Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the present disclosure.
FIG. 1 is a flowchart of a touch input control method according to an exemplary embodiment.
FIG. 2A is a flowchart of another touch input control method according to an exemplary embodiment.
FIG. 2B is a schematic diagram of region division of a terminal touch screen according to an exemplary embodiment.
FIG. 2C is a schematic diagram of a user gripping a terminal according to an exemplary embodiment.
FIG. 2D is a schematic diagram of a region in which touch information is located according to an exemplary embodiment.
FIG. 2E is a schematic diagram of another region in which touch information is located according to an exemplary embodiment.
FIG. 2F is a schematic diagram of another region in which touch information is located according to an exemplary embodiment.
FIG. 2G is a schematic diagram of a virtual touch control interface on a touch screen according to an exemplary embodiment.
FIG. 3 is a block diagram of a touch input control device according to an exemplary embodiment.
FIG. 4 is a block diagram of another touch input control device according to an exemplary embodiment.
FIG. 5 is a block diagram of another touch input control device according to an exemplary embodiment.
FIG. 6 is a block diagram of another touch input control device according to an exemplary embodiment.
FIG. 7 is a block diagram of another touch input control device according to an exemplary embodiment.
FIG. 8 is a block diagram of another touch input control device according to an exemplary embodiment.
FIG. 9 is a block diagram of another touch input control device according to an exemplary embodiment.
FIG. 10 is a block diagram of another touch input control device according to an exemplary embodiment.
FIG. 11 is a schematic structural diagram of a device for touch input control according to an exemplary embodiment.
Detailed Description
Exemplary embodiments will now be described in detail, examples of which are illustrated in the accompanying drawings. Where the following description refers to the drawings, the same numerals in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of devices and methods consistent with some aspects of the present disclosure as detailed in the appended claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to limit the present disclosure. The singular forms "a/an", "said", and "the" used in the present disclosure and the appended claims are also intended to include the plural forms unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used in the present disclosure to describe various information, such information should not be limited to these terms. These terms are only used to distinguish information of the same type from one another. For example, without departing from the scope of the present disclosure, first information may also be called second information and, similarly, second information may also be called first information. Depending on the context, the word "if" as used herein may be interpreted as "when", "upon", or "in response to determining".
As shown in FIG. 1, FIG. 1 is a flowchart of a touch input control method according to an exemplary embodiment. The touch input control method is used in a terminal and may include the following steps.
In step 101, edge touch information of an edge region of a touch screen of the terminal is identified.
The terminal in the present disclosure may be any smart terminal supporting touch input. The terminal may support multi-point touch input by providing a capacitive touch screen (also called a TP). A capacitive touch screen is made by coating the surface of a glass panel with a layer of transparent, special metallic conductive material. When the user performs a touch operation on the metal layer, the capacitance of the contact at the operated position on the touch screen changes, which changes the frequency of the oscillator connected to that capacitance; the Touch IC chip in the terminal can determine the touch position by measuring the frequency change, thereby obtaining the touch information.
Usually, when using a terminal, the user grips it with one hand and performs touch input with the other. As touch screen bezels become narrower, when the user grips the terminal, part of the palm (for example, the palm-root portion where the thumb joins the palm) easily touches the edge region of the touch screen. In the present disclosure, an edge region may be divided on the touch screen in advance; the edge region usually refers to the narrow strips along the four sides of the touch screen, and the part other than the edge region may be called the central region.
In this embodiment, when the user performs a touch operation on the touch screen of the terminal, the touch information generated by the touch operation can be obtained. The touch information may usually include the position coordinates of the touch input (including the horizontal and vertical coordinates), the touch area size, the touch state (touch down, lift up, slide), and the like. The terminal determines, according to the position coordinates corresponding to the touch information, whether the touch operation falls within the edge region of the touch screen; if the touch operation is located in the edge region, the touch information is determined to be edge touch information.
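The position test in step 101 can be sketched by modeling the edge region as a narrow band along all four sides of the screen, so a touch whose coordinates fall inside that band is treated as edge touch information. The screen dimensions and the band width `margin` are assumed values for illustration, not taken from the patent.

```python
def in_edge_region(x, y, width=1080, height=1920, margin=60):
    """Return True when (x, y) lies in the narrow band along the four screen edges."""
    return (x < margin or x > width - margin or
            y < margin or y > height - margin)
```

A real implementation would read the edge-region boundaries (such as the sub-regions A through F of FIG. 2B) from configuration rather than hard-coding a uniform margin.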
In step 102, it is determined, according to the touch shape corresponding to the edge touch information, whether the edge touch information includes edge palm touch information generated by a palm touch operation.
In this embodiment, a touch operation in the edge region of the touch screen forms corresponding contact points on the touch screen. The touch shape corresponding to the edge touch information can be obtained from the distribution of these contacts, and it is then determined whether this touch shape matches a preset palm touch shape; if it does, it can be determined that the edge touch information includes the edge palm touch information.
In step 103, if the edge touch information includes the edge palm touch information, the edge palm touch information is shielded.
In this embodiment, when the edge palm touch information is shielded, the corresponding sub-region of the edge palm touch information on the edge region may be determined, and the touch input function of that sub-region may then be turned off.
In the above embodiment, the edge palm touch information contained in the edge touch information can be determined from the touch shape corresponding to the edge touch information, so that only erroneous input touch information generated by the palm touch operation is shielded while the correct touch information generated by finger touch operations in the edge region is retained, thereby improving the accuracy of touch input.
As shown in FIG. 2A, FIG. 2A is a flowchart of another touch input control method according to an exemplary embodiment. The touch input control method may be used in a terminal and includes the following steps.
In step 201, edge touch information of an edge region of the touch screen of the terminal is identified.
Usually, when using a terminal, the user grips it with one hand and performs touch input with the other. As touch screen bezels become narrower, when the user grips the terminal, part of the palm (for example, the palm-root portion where the thumb joins the palm, hereinafter simply the palm-root portion) easily touches the edge region of the touch screen. Therefore, in the present disclosure, an edge region may be divided on the touch screen in advance; the edge region usually refers to the narrow strips along the four sides of the touch screen, and the part other than the edge region may be called the central region.
Referring to FIG. 2B, a schematic diagram of region division of a terminal touch screen according to an exemplary embodiment: in FIG. 2B, the terminal touch screen may be divided into an edge region and a central region, where, according to the possible positions at which the palm contacts the edge region while gripping the terminal, the edge region is further divided into six sub-regions, namely sub-region A, sub-region B, sub-region C, sub-region D, sub-region E, and sub-region F.
In this embodiment, when the user performs a touch operation on the touch screen of the terminal, the touch information generated by the touch operation can be obtained; it may usually include the position coordinates of the touch input (including the horizontal and vertical coordinates), the touch area size, the touch state (touch down, lift up, slide), and so on. The terminal determines, according to the position coordinates corresponding to the touch information, whether the touch operation is located within the edge region of the touch screen; if it is, the touch information can be determined to be edge touch information. Referring to FIG. 2B, if the terminal determines from the position coordinates that the touch operation is located in any of sub-regions A through F, the touch information can be determined to be edge touch information.
In step 202, it is determined, according to the touch shape corresponding to the edge touch information, whether the edge touch information includes edge palm touch information generated by a palm touch operation; if it does, step 203 is performed; if it does not, the flow returns to step 201.
In this embodiment, a touch operation in the edge region of the touch screen forms corresponding contact points on the touch screen. The touch shape corresponding to the edge touch information can be obtained from the distribution of these contacts, and it is then determined whether this touch shape matches a preset palm touch shape; if it does, it can be determined that the edge touch information includes the edge palm touch information.
Referring to FIG. 2C, a schematic diagram of a user gripping a terminal according to an exemplary embodiment: in FIG. 2C, the user grips the terminal with the left hand, and the palm-root portion contacts the edge region of the touch screen, forming contact points there; the outline of this contact distribution is the ellipse shown by the dashed line over the palm-root portion in FIG. 2C. The thumb fingertip forms contact points in the central region, and the outline of that contact distribution is the circle shown by the dashed line over the fingertip in FIG. 2C. It follows that palm touch operations and finger touch operations can at least be distinguished by the shape corresponding to the touch information.
Referring to FIG. 2D, a schematic diagram of a region in which touch information is located according to an exemplary embodiment: with the gripping manner of FIG. 2C mapped onto FIG. 2D, the terminal obtains the elliptical touch shape corresponding to the palm-root touch in the edge region of the touch screen, and the circular touch shape corresponding to the thumb-fingertip touch in the central region. Therefore, in the present disclosure, according to the different touch shapes produced by different parts of the hand operating on the touch screen, the palm touch shape may be preset as the ellipse shown in FIG. 2D, and the finger touch shape as the circle shown in FIG. 2D.
It follows that when the user grips the terminal as shown in FIG. 2C, combining the examples of FIG. 2C and FIG. 2D, after obtaining the touch shape corresponding to the edge touch information, the terminal can compare that shape with the preset touch shapes to determine the type of the edge touch information. Referring to FIG. 2D, if the touch shape corresponding to the obtained edge touch information matches the preset ellipse in both size and shape, the obtained edge touch information can be determined to be edge palm touch information.
Unlike FIG. 2D, in another possible application scenario, depending on the user's gripping habit, besides the palm-root portion touching the edge region, the thumb fingertip may also touch the edge region. As shown in FIG. 2E, a schematic diagram of another region in which edge touch information is located according to an exemplary embodiment, the difference from FIG. 2D is that in FIG. 2E the circular touch shape corresponding to the thumb-fingertip touch lies within the edge region.
In step 203, the edge palm touch information is shielded.
When the determination in step 202 is that the edge touch information includes edge palm touch information, the edge palm touch information is shielded directly, i.e., it is not reported. Referring to FIG. 2D and FIG. 2E, the terminal can shield the edge palm touch information represented by the ellipse in the edge region of the touch screen. The difference is that in FIG. 2D the touch information of the thumb-fingertip operation lies in the central region, while in FIG. 2E it lies in the edge region; but in either gripping situation, applying the embodiments of the present disclosure shields only the edge palm touch information and does not shield normal touch input information located in any region of the touch screen.
It should be noted that, while the user grips the terminal, the palm-root portion may come to rest directly on the edge region of the terminal, in which case the recognized edge palm touch information is simply not reported, thereby shielding it. However, when the palm-root portion slides across the touch screen for some distance before coming to rest on the terminal edge, it may generate palm touch information in the central region of the touch screen during the slide. Such palm touch information is useless for the touch operation, so by reporting the state information of these palm contacts, the other palm touch information generated during the palm touch operation, apart from the edge palm touch information, can be cleared; the state information may include lift-up information for the position coordinates in the palm touch information, or information that the position coordinates have slid off the touch screen.
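The clean-up described above can be sketched as follows: palm contacts recorded in the central region while the palm slid toward the edge are removed once the contact is recognized as an edge palm touch, by reporting them as lifted. The event structure is a plain dict invented for illustration; none of these names come from the patent.

```python
def clear_other_palm_touches(events):
    """Keep all events except central-region palm touches; report those as lifted."""
    kept, cleared = [], []
    for e in events:
        if e["kind"] == "palm" and e["region"] == "center":
            cleared.append({**e, "state": "up"})  # report a lift-up to clear it
        else:
            kept.append(e)
    return kept, cleared
```

Edge palm touches stay in the stream (they are shielded separately by simply not reporting them), and finger touches anywhere on the screen are untouched.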
In step 204, it is determined whether the duration of the palm touch operation on the touch screen exceeds a preset time; if it does, step 205 is performed; if it does not, the flow returns to step 201.
This embodiment considers that when the user grips the terminal, the palm may touch the edge region for a long time, whereas when the user performs other touch operations on the terminal, the palm may only occasionally touch the edge region. This step can therefore further determine whether the duration of the palm touch information on the touch screen exceeds a preset time, which defines the minimum plausible duration of a palm gripping the terminal, so as to determine whether the palm touch information was generated by gripping the terminal, for the subsequent decision on whether to turn off the touch function of the corresponding sub-region on the edge region.
In step 205, the corresponding sub-region of the edge palm touch information on the edge region is determined.
In the present disclosure, the position coordinates of the edge palm touch information may be compared with the coordinate range of each sub-region of the edge region; the first sub-region corresponding to the coordinate range to which the position coordinates belong is determined as the corresponding sub-region, and the second sub-region associated with the first sub-region is determined as the corresponding sub-region, the second sub-region being the sub-region in which the finger touch operation corresponding to the palm touch operation is located when the terminal is gripped. It should be noted that, besides the manners shown above, other manners of determining the corresponding sub-region may be adopted as needed in practical applications, which is not limited by the present disclosure.
Referring to FIG. 2B through FIG. 2E, if the duration of the elliptical edge palm touch information on the edge region exceeds the preset time, then, according to the sub-regions into which the edge region is divided in FIG. 2B, comparison of the position coordinates shows that the corresponding sub-region of the edge palm touch information is sub-region C.
For another optional example, see FIG. 2F, a schematic diagram of another region in which touch information is located according to an exemplary embodiment: referring to FIG. 2C, when the left hand grips the terminal, besides the palm portion holding the left edge region, the other four fingers of the left hand may also hold the right edge region, as shown in FIG. 2C. Therefore, after the sub-region C corresponding to the edge palm touch information is identified, the sub-region E opposite sub-region C can be obtained directly as the associated sub-region; sub-region E is the sub-region in which the finger touch operation corresponding to the palm touch operation is located.
In step 206, the touch input function of the corresponding sub-region is turned off, and the current flow ends.
As can be seen from the above embodiment, after the edge touch information of the touch screen edge region is identified, the edge palm touch information it contains can be determined from the corresponding touch shape, so that only the erroneous input touch information generated by the palm touch operation is shielded while the correct touch information generated by finger touch operations in the edge region is retained, improving the accuracy of touch input. Moreover, after the edge palm touch information is shielded, the corresponding sub-region of the edge palm information on the edge region can be obtained, so the touch input function of that sub-region can be turned off in a targeted manner, further improving the accuracy of touch input; flexible configuration according to the user's habit of gripping the terminal enhances the user experience while improving accuracy. In addition, the corresponding sub-region can be identified only while the terminal is gripped, improving the accuracy of turning off the sub-region's touch input function.
On the basis of the embodiments shown in FIG. 1 or FIG. 2 above, in another optional embodiment, referring to FIG. 2D through FIG. 2F, after the touch operation area of the thumb on the touch screen is detected, such as the circular area in FIG. 2D, a virtual touch control interface may be displayed near that area so that the user can perform various input operations with the thumb, realizing touch input control of the whole touch screen. Referring to FIG. 2G, a schematic diagram of a virtual touch control interface on a touch screen according to an exemplary embodiment: in FIG. 2G, a virtual touch control icon, exemplified by a sliding control button icon, is displayed within the touch operation area near the thumb fingertip. It should be noted that FIG. 2G shows only an example of outputting the virtual touch control interface in the thumb touch operation area; in practical applications, any finger touch operation area on the touch screen while the user grips the terminal may also be detected from information such as the touch area and touch shape, and the touch control interface may be output in the detected area, which is not limited in the embodiments of the present disclosure.
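Placing the virtual control icon near the detected thumb area, as in FIG. 2G, can be sketched as centering the icon on the thumb contact and clamping it so it stays fully on screen. The screen size, the 96-pixel icon size, and the function name are assumptions for illustration only.

```python
def place_virtual_control(thumb_x, thumb_y, width=1080, height=1920, icon=96):
    """Center of the virtual control icon, clamped to remain inside the screen."""
    half = icon // 2
    x = min(max(thumb_x, half), width - half)
    y = min(max(thumb_y, half), height - half)
    return (x, y)
```

A thumb detected near a screen corner thus still yields an icon position whose whole extent is visible.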
Corresponding to the foregoing embodiments of the touch input control method, the present disclosure also provides embodiments of a touch input control device and a terminal.
As shown in FIG. 3, FIG. 3 is a block diagram of a touch input control device according to an exemplary embodiment of the present disclosure. The device includes an identification unit 31, a determining unit 32, and a shielding unit 33.
The identification unit 31 is configured to identify edge touch information of an edge region of a touch screen of a terminal;
the determining unit 32 is configured to determine, according to a touch shape corresponding to the edge touch information, whether the edge touch information includes palm touch information generated by a palm touch operation;
the shielding unit 33 is configured to shield the edge palm touch information when the edge touch information includes the edge palm touch information.
In the above embodiment, after the edge touch information of the touch screen edge region is identified, the edge palm touch information it contains can be determined from the corresponding touch shape, so that only the erroneous input touch information generated by the palm touch operation is shielded while the correct touch information generated by finger touch operations in the edge region is retained, improving the accuracy of touch input.
如图4所示,图4是本公开根据一示例性实施例示出的另一种触摸输入控制装置框图,该实施例在前述图3所示实施例的基础上,所述识别单元31可以包括:信息获取子单元311、位置判断子单元312和信息确定子单元313。
其中,所述信息获得子单元311,被配置为当用户在所述终端触摸屏上执行触摸操作时,获得所述触摸操作产生的触摸信息;
所述位置判断子单元312,被配置为根据所述触摸信息对应的位置坐标判断所述触摸操作是否位于所述触摸屏边缘区域;
所述信息确定子单元313,被配置为在所述触摸操作位于所述触摸屏边缘区域时,确定所述触摸信息为所述边缘触摸信息。
如图5所示,图5是本公开根据一示例性实施例示出的另一种触摸输入控制装置框图,该实施例在前述图3所示实施例的基础上,所述判断单元32可以包括:形状获得子单元321、形状匹配子单元322和匹配确定子单元323。
其中,所述形状获得子单元321,被配置为通过所述边缘区域上触摸操作的触点分布获得所述边缘触摸信息对应的触摸形状;
所述形状匹配子单元322,被配置为判断所述边缘触摸信息对应的触摸形状是否与预先设置的手掌触摸形状匹配;
所述匹配确定子单元323,被配置为在所述边缘触摸信息对应的触摸形状与所述手掌触摸形状匹配时,确定所述边缘触摸信息包含所述边缘手掌触摸信息。
It should be noted that the structure of the judgment unit 32 in the device embodiment shown in FIG. 5 may also be included in the device embodiment shown in FIG. 4, which is not limited by the present disclosure.
In the above embodiment, the touch shape corresponding to the edge touch information can be determined from the distribution of touch points, so that the edge palm touch information contained in the edge touch information can be identified; only the erroneous input touch information generated by the palm touch operation is then masked, while the correct touch information generated by finger touch operations in the edge region is retained, thereby improving the accuracy of touch input.
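One way to realize the touch-point-distribution matching performed by sub-units 321 through 323 is to summarize the contact points by their bounding box and compare its size and elongation against thresholds characteristic of a palm. The 100x40 size thresholds and the elongation test below are illustrative assumptions, not values from the disclosure.

```python
# Derive a coarse "touch shape" from a set of (x, y) touch points and
# compare it with a preset palm profile. The size thresholds and the
# taller-than-wide test are illustrative assumptions.
def matches_palm_shape(points, min_h=100, min_w=40):
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    width, height = max(xs) - min(xs), max(ys) - min(ys)
    # A palm pressed against a side edge leaves a tall, wide contact
    # patch, whereas a fingertip leaves a small, roughly round one.
    return width >= min_w and height >= min_h and height > width
```

A tall strip of contact points along the edge matches this palm profile; a tight cluster from a fingertip does not.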
As shown in FIG. 6, FIG. 6 is a block diagram of another touch input control device according to an exemplary embodiment of the present disclosure. On the basis of the embodiment shown in FIG. 3, the masking unit 33 may include: a determination sub-unit 331 and a disabling sub-unit 332.
The determination sub-unit 331 is configured to determine the corresponding sub-region of the edge palm touch information on the edge region;
the disabling sub-unit 332 is configured to disable the touch input function of the corresponding sub-region.
It should be noted that the determination sub-unit 331 and the disabling sub-unit 332 in the device embodiment shown in FIG. 6 may also be included in the device embodiments shown in FIG. 4 or FIG. 5, which is not limited by the present disclosure.
In the above embodiment, when the edge palm touch information is masked, the corresponding sub-region of the edge palm touch information on the edge region can be obtained, so that the touch input function of the corresponding sub-region can be disabled in a targeted manner, further improving the accuracy of touch input.
As shown in FIG. 7, FIG. 7 is a block diagram of another touch input control device according to an exemplary embodiment of the present disclosure. On the basis of the embodiment shown in FIG. 6, the determination sub-unit 331 may include: a position comparison sub-unit 3311, a first sub-region determination sub-unit 3312, and a second sub-region determination sub-unit 3313.
The position comparison sub-unit 3311 is configured to compare the position coordinates in the edge palm touch information with the coordinate range of each sub-region of the edge region;
the first sub-region determination sub-unit 3312 is configured to determine the first sub-region corresponding to the coordinate range to which the position coordinates belong as the corresponding sub-region;
the second sub-region determination sub-unit 3313 is configured to determine a second sub-region associated with the first sub-region as the corresponding sub-region, the second sub-region being the sub-region where the finger touch operation corresponding to the palm touch operation is located when the terminal is held.
It should be noted that, for convenience of illustration, the determination sub-unit 331 in the embodiment shown in FIG. 7 includes both the first sub-region determination sub-unit 3312 and the second sub-region determination sub-unit 3313; in practical applications, only either of these sub-units may be included as needed, which is not limited by this embodiment.
In the above embodiment, in addition to the first sub-region corresponding to the coordinate range to which the position coordinates of the edge palm touch information belong, the corresponding sub-region may also include the second sub-region associated with the first sub-region, where the finger touch operation corresponding to the palm touch operation is located, so that the corresponding sub-region can be configured flexibly according to the user's habit of holding the terminal, which enhances the user experience while improving touch input accuracy.
As shown in FIG. 8, FIG. 8 is a block diagram of another touch input control device according to an exemplary embodiment of the present disclosure. On the basis of the embodiments shown in FIG. 6 or FIG. 7, the determination sub-unit 331 may include: a time judgment sub-unit 3314 and a sub-region determination sub-unit 3315.
The time judgment sub-unit 3314 is configured to judge whether the duration of the palm touch operation on the touch screen exceeds a preset time;
the sub-region determination sub-unit 3315 is configured to determine the corresponding sub-region of the edge palm touch information on the edge region when the duration exceeds the preset time.
In the above embodiment, when the duration of the edge palm touch information on the touch screen exceeds the preset time, it can be determined that the edge palm touch information is touch information generated by the user holding the terminal, and only then is the corresponding sub-region whose touch input function needs to be disabled determined. In this way, the corresponding sub-region is identified only while the terminal is being held, thereby improving the accuracy of disabling the touch input function of the sub-region.
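The duration test performed by sub-units 3314 and 3315 can be sketched as a small helper that only reports a sub-region to disable once the palm contact has persisted past a threshold. The 500 ms preset below is an assumed value; the disclosure leaves the preset time unspecified.

```python
# Decide whether a palm contact has lasted long enough to be treated as
# a grip. Timestamps are in milliseconds; the 500 ms preset is assumed.
PRESET_DURATION_MS = 500

def subregion_if_held(touch_start_ms, now_ms, palm_subregion):
    """Return the sub-region to disable, or None if the contact is too brief."""
    if now_ms - touch_start_ms > PRESET_DURATION_MS:
        return palm_subregion
    return None
```

A brief brush of the palm against the edge thus never disables a sub-region; only a sustained grip does.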
As shown in FIG. 9, FIG. 9 is a block diagram of another touch input control device according to an exemplary embodiment of the present disclosure. On the basis of the embodiment shown in FIG. 3, the device may further include:
a clearing unit 34, configured to clear palm touch information other than the edge palm touch information generated during the palm touch operation.
It should be noted that the clearing unit 34 in the device embodiment shown in FIG. 9 may also be included in any of the device embodiments shown in FIG. 4 to FIG. 8, which is not limited by the present disclosure.
As shown in FIG. 10, FIG. 10 is a block diagram of another touch input control device according to an exemplary embodiment of the present disclosure. On the basis of the embodiment shown in FIG. 3, the device may further include: a detection unit 35 and an output unit 36.
The detection unit 35 is configured to detect a finger touch operation region on the touch screen when the terminal is held;
the output unit 36 is configured to output a virtual touch control interface in the finger touch operation region.
It should be noted that the detection unit 35 and the output unit 36 in the device embodiment shown in FIG. 10 may also be included in any of the device embodiments shown in FIG. 4 to FIG. 9, which is not limited by the present disclosure.
For the implementation processes of the functions and roles of the units in the above devices, reference may be made to the implementation processes of the corresponding steps in the above methods, which are not repeated here.
Since the device embodiments substantially correspond to the method embodiments, reference may be made to the relevant descriptions of the method embodiments. The device embodiments described above are merely illustrative; the units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units, i.e., they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present disclosure. Those of ordinary skill in the art can understand and implement them without creative effort.
Correspondingly, the present disclosure further provides a touch input control device, including: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to: identify edge touch information on an edge region of a touch screen of a terminal; judge, according to a touch shape corresponding to the edge touch information, whether the edge touch information contains edge palm touch information generated by a palm touch operation; and mask the edge palm touch information if the edge touch information contains the edge palm touch information.
Correspondingly, the present disclosure further provides a terminal, including a memory and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by one or more processors, the one or more programs containing instructions for performing the following operations: identifying edge touch information on an edge region of a touch screen of the terminal; judging, according to a touch shape corresponding to the edge touch information, whether the edge touch information contains edge palm touch information generated by a palm touch operation; and masking the edge palm touch information if the edge touch information contains the edge palm touch information.
As shown in FIG. 11, FIG. 11 is another schematic structural diagram of a device 1100 for touch input control according to an exemplary embodiment of the present disclosure. For example, the device 1100 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, fitness equipment, a personal digital assistant, or the like.
Referring to FIG. 11, the device 1100 may include one or more of the following components: a processing component 1102, a memory 1104, a power component 1106, a multimedia component 1108, an audio component 1110, an input/output (I/O) interface 1112, a sensor component 1114, and a communication component 1116.
The processing component 1102 generally controls the overall operations of the device 1100, such as operations associated with display, telephone calls, data communication, camera operations, and recording operations. The processing component 1102 may include one or more processors 1120 to execute instructions so as to complete all or part of the steps of the methods described above. In addition, the processing component 1102 may include one or more modules to facilitate interaction between the processing component 1102 and other components. For example, the processing component 1102 may include a multimedia module to facilitate interaction between the multimedia component 1108 and the processing component 1102.
The memory 1104 is configured to store various types of data to support operations on the device 1100. Examples of such data include instructions for any application or method operating on the device 1100, contact data, phonebook data, messages, pictures, videos, and so on. The memory 1104 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
The power component 1106 provides power to the various components of the device 1100. The power component 1106 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 1100.
The multimedia component 1108 includes a screen providing an output interface between the device 1100 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or swipe action but also detect the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 1108 includes a front camera and/or a rear camera. When the device 1100 is in an operating mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front or rear camera may be a fixed optical lens system or have focusing and optical zoom capability.
The audio component 1110 is configured to output and/or input audio signals. For example, the audio component 1110 includes a microphone (MIC) configured to receive external audio signals when the device 1100 is in an operating mode, such as a call mode, a recording mode, or a speech recognition mode. The received audio signals may be further stored in the memory 1104 or transmitted via the communication component 1116. In some embodiments, the audio component 1110 further includes a speaker for outputting audio signals.
The I/O interface 1112 provides an interface between the processing component 1102 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, or the like. These buttons may include, but are not limited to, a home button, volume buttons, a start button, and a lock button.
The sensor component 1114 includes one or more sensors for providing status assessments of various aspects of the device 1100. For example, the sensor component 1114 may detect the open/closed state of the device 1100 and the relative positioning of components, such as the display and keypad of the device 1100; the sensor component 1114 may also detect a change in position of the device 1100 or of a component of the device 1100, the presence or absence of user contact with the device 1100, the orientation or acceleration/deceleration of the device 1100, and temperature changes of the device 1100. The sensor component 1114 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 1114 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 1114 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, a microwave sensor, or a temperature sensor.
The communication component 1116 is configured to facilitate wired or wireless communication between the device 1100 and other devices. The device 1100 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 1116 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 1116 further includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the device 1100 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements for performing the above methods.
In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium including instructions, such as the memory 1104 including instructions, executable by the processor 1120 of the device 1100 to complete the above methods. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
A non-transitory computer-readable storage medium, wherein when instructions in the storage medium are executed by a processor of a terminal, the terminal is enabled to perform a touch input control method, the method including: identifying edge touch information on an edge region of a touch screen of the terminal; judging, according to a touch shape corresponding to the edge touch information, whether the edge touch information contains edge palm touch information generated by a palm touch operation; and masking the edge palm touch information if the edge touch information contains the edge palm touch information.
Other embodiments of the present disclosure will readily occur to those skilled in the art upon consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the present disclosure that follow its general principles and include common knowledge or customary technical means in the art not disclosed herein. The specification and embodiments are to be considered exemplary only, with the true scope and spirit of the present disclosure indicated by the following claims.
It should be understood that the present disclosure is not limited to the precise structures described above and shown in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the present disclosure is limited only by the appended claims.

Claims (19)

  1. A touch input control method, comprising:
    identifying edge touch information on an edge region of a touch screen of a terminal;
    judging, according to a touch shape corresponding to the edge touch information, whether the edge touch information contains edge palm touch information generated by a palm touch operation; and
    masking the edge palm touch information if the edge touch information contains the edge palm touch information.
  2. The method according to claim 1, wherein identifying the edge touch information on the edge region of the touch screen of the terminal comprises:
    obtaining, when a user performs a touch operation on the touch screen of the terminal, touch information generated by the touch operation;
    judging, according to position coordinates corresponding to the touch information, whether the touch operation is located in the edge region of the touch screen; and
    determining the touch information as the edge touch information if the touch operation is located in the edge region of the touch screen.
  3. The method according to claim 1, wherein judging, according to the touch shape corresponding to the edge touch information, whether the edge touch information contains the edge palm touch information generated by the palm touch operation comprises:
    obtaining the touch shape corresponding to the edge touch information from a distribution of touch points of touch operations on the edge region;
    judging whether the touch shape corresponding to the edge touch information matches a preset palm touch shape; and
    determining that the edge touch information contains the edge palm touch information if the touch shape corresponding to the edge touch information matches the palm touch shape.
  4. The method according to claim 1, wherein masking the edge palm touch information comprises:
    determining a corresponding sub-region of the edge palm touch information on the edge region; and
    disabling a touch input function of the corresponding sub-region.
  5. The method according to claim 4, wherein determining the corresponding sub-region of the edge palm touch information on the edge region comprises:
    comparing position coordinates in the edge palm touch information with a coordinate range of each sub-region of the edge region; and
    determining a first sub-region corresponding to the coordinate range to which the position coordinates belong as the corresponding sub-region.
  6. The method according to claim 5, wherein determining the corresponding sub-region of the edge palm touch information on the edge region further comprises:
    determining a second sub-region associated with the first sub-region as the corresponding sub-region, the second sub-region being the sub-region where a finger touch operation corresponding to the palm touch operation is located when the terminal is held.
  7. The method according to claim 4, wherein determining the corresponding sub-region of the edge palm touch information on the edge region comprises:
    judging whether a duration of the palm touch operation on the touch screen exceeds a preset time; and
    determining the corresponding sub-region of the edge palm touch information on the edge region if the duration exceeds the preset time.
  8. The method according to claim 1, further comprising:
    clearing palm touch information other than the edge palm touch information generated during the palm touch operation.
  9. The method according to any one of claims 1 to 8, further comprising:
    detecting a finger touch operation region on the touch screen when the terminal is held; and
    outputting a virtual touch control interface in the finger touch operation region.
  10. A touch input control device, comprising:
    an identification unit configured to identify edge touch information on an edge region of a touch screen of a terminal;
    a judgment unit configured to judge, according to a touch shape corresponding to the edge touch information, whether the edge touch information contains edge palm touch information generated by a palm touch operation; and
    a masking unit configured to mask the edge palm touch information when the edge touch information contains the edge palm touch information.
  11. The device according to claim 10, wherein the identification unit comprises:
    an information obtaining sub-unit configured to obtain, when a user performs a touch operation on the touch screen of the terminal, touch information generated by the touch operation;
    a position judgment sub-unit configured to judge, according to position coordinates corresponding to the touch information, whether the touch operation is located in the edge region of the touch screen; and
    an information determination sub-unit configured to determine the touch information as the edge touch information when the touch operation is located in the edge region of the touch screen.
  12. The device according to claim 10, wherein the judgment unit comprises:
    a shape obtaining sub-unit configured to obtain the touch shape corresponding to the edge touch information from a distribution of touch points of touch operations on the edge region;
    a shape matching sub-unit configured to judge whether the touch shape corresponding to the edge touch information matches a preset palm touch shape; and
    a match determination sub-unit configured to determine that the edge touch information contains the edge palm touch information when the touch shape corresponding to the edge touch information matches the palm touch shape.
  13. The device according to claim 10, wherein the masking unit comprises:
    a determination sub-unit configured to determine a corresponding sub-region of the edge palm touch information on the edge region; and
    a disabling sub-unit configured to disable a touch input function of the corresponding sub-region.
  14. The device according to claim 13, wherein the determination sub-unit comprises:
    a position comparison sub-unit configured to compare position coordinates in the edge palm touch information with a coordinate range of each sub-region of the edge region; and
    a first sub-region determination sub-unit configured to determine a first sub-region corresponding to the coordinate range to which the position coordinates belong as the corresponding sub-region.
  15. The device according to claim 14, wherein the determination sub-unit further comprises:
    a second sub-region determination sub-unit configured to determine a second sub-region associated with the first sub-region as the corresponding sub-region, the second sub-region being the sub-region where a finger touch operation corresponding to the palm touch operation is located when the terminal is held.
  16. The device according to claim 13, wherein the determination sub-unit comprises:
    a time judgment sub-unit configured to judge whether a duration of the palm touch operation on the touch screen exceeds a preset time; and
    a sub-region determination sub-unit configured to determine the corresponding sub-region of the edge palm touch information on the edge region when the duration exceeds the preset time.
  17. The device according to claim 10, further comprising:
    a clearing unit configured to clear palm touch information other than the edge palm touch information generated during the palm touch operation.
  18. The device according to any one of claims 10 to 17, further comprising:
    a detection unit configured to detect a finger touch operation region on the touch screen when the terminal is held; and
    an output unit configured to output a virtual touch control interface in the finger touch operation region.
  19. A touch input control device, comprising:
    a processor; and
    a memory for storing processor-executable instructions;
    wherein the processor is configured to:
    identify edge touch information on an edge region of a touch screen of a terminal;
    judge, according to a touch shape corresponding to the edge touch information, whether the edge touch information contains edge palm touch information generated by a palm touch operation; and
    mask the edge palm touch information if the edge touch information contains the edge palm touch information.
PCT/CN2014/089246 2014-05-22 2014-10-23 触摸输入控制方法及装置 WO2015176484A1 (zh)

Priority Applications (6)

Application Number Priority Date Filing Date Title
JP2016520274A JP6033502B2 (ja) 2014-05-22 2014-10-23 タッチ入力制御方法、タッチ入力制御装置、プログラム及び記録媒体
RU2014153895A RU2618921C2 (ru) 2014-05-22 2014-10-23 Способ и устройство управления сенсорным вводом
KR1020147035869A KR101714857B1 (ko) 2014-05-22 2014-10-23 터치 입력 제어방법, 장치, 프로그램 및 기록매체
BR112015000003A BR112015000003A2 (pt) 2014-05-22 2014-10-23 método e dispositivo de controle de entrada de toque
MX2014015064A MX349777B (es) 2014-05-22 2014-10-23 Metodo y dispositivo de control de entrada tactil.
US14/578,715 US9671911B2 (en) 2014-05-22 2014-12-22 Touch input control method and device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201410219188.9A CN104020878A (zh) 2014-05-22 2014-05-22 触摸输入控制方法及装置
CN201410219188.9 2014-05-22

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/578,715 Continuation US9671911B2 (en) 2014-05-22 2014-12-22 Touch input control method and device

Publications (1)

Publication Number Publication Date
WO2015176484A1 true WO2015176484A1 (zh) 2015-11-26

Family

ID=51437668

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/089246 WO2015176484A1 (zh) 2014-05-22 2014-10-23 触摸输入控制方法及装置

Country Status (9)

Country Link
US (1) US9671911B2 (zh)
EP (1) EP2947553A1 (zh)
JP (1) JP6033502B2 (zh)
KR (1) KR101714857B1 (zh)
CN (1) CN104020878A (zh)
BR (1) BR112015000003A2 (zh)
MX (1) MX349777B (zh)
RU (1) RU2618921C2 (zh)
WO (1) WO2015176484A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017143823A1 (zh) * 2016-02-25 2017-08-31 上海斐讯数据通信技术有限公司 一种电容屏抗电磁干扰的方法和装置

Families Citing this family (50)

Publication number Priority date Publication date Assignee Title
CN108762577A (zh) 2011-10-18 2018-11-06 卡内基梅隆大学 用于分类触敏表面上的触摸事件的方法和设备
KR20140114766A (ko) 2013-03-19 2014-09-29 퀵소 코 터치 입력을 감지하기 위한 방법 및 장치
US9612689B2 (en) 2015-02-02 2017-04-04 Qeexo, Co. Method and apparatus for classifying a touch event on a touchscreen as related to one of multiple function generating interaction layers and activating a function in the selected interaction layer
US9013452B2 (en) 2013-03-25 2015-04-21 Qeexo, Co. Method and system for activating different interactive functions using different types of finger contacts
CN104020878A (zh) * 2014-05-22 2014-09-03 小米科技有限责任公司 触摸输入控制方法及装置
US9329715B2 (en) 2014-09-11 2016-05-03 Qeexo, Co. Method and apparatus for differentiating touch screen users based on touch event analysis
US11619983B2 (en) 2014-09-15 2023-04-04 Qeexo, Co. Method and apparatus for resolving touch screen ambiguities
US9864453B2 (en) * 2014-09-22 2018-01-09 Qeexo, Co. Method and apparatus for improving accuracy of touch screen event analysis by use of edge classification
US10606417B2 (en) 2014-09-24 2020-03-31 Qeexo, Co. Method for improving accuracy of touch screen event analysis by use of spatiotemporal touch patterns
CN104407793B (zh) * 2014-11-26 2018-03-13 深圳市华星光电技术有限公司 触摸信号处理方法及设备
CN105718069B (zh) * 2014-12-02 2020-01-31 联想(北京)有限公司 信息处理方法及电子设备
CN104461366A (zh) * 2014-12-16 2015-03-25 小米科技有限责任公司 激活移动终端的操作状态的方法及装置
CN104571693B (zh) * 2014-12-22 2018-08-10 联想(北京)有限公司 信息处理方法及电子设备
US9489097B2 (en) * 2015-01-23 2016-11-08 Sony Corporation Dynamic touch sensor scanning for false border touch input detection
CN104714692B (zh) 2015-01-30 2016-08-24 努比亚技术有限公司 移动终端防误触控方法及装置
KR102332015B1 (ko) * 2015-02-26 2021-11-29 삼성전자주식회사 터치 처리 방법 및 이를 지원하는 전자 장치
CN104635990B (zh) * 2015-02-27 2017-11-14 上海卓易科技股份有限公司 一种识别用户触屏的方法及装置
CN106155642B (zh) * 2015-03-25 2020-07-24 联想(北京)有限公司 一种信息处理方法及电子设备
CN104750418B (zh) * 2015-03-30 2017-03-15 努比亚技术有限公司 触摸操作区域长按操作的识别方法及装置
CN104932620A (zh) * 2015-06-23 2015-09-23 上海华豚科技有限公司 一种具有收窄框屏的电子设备的控制方法及其电子设备
US10642404B2 (en) 2015-08-24 2020-05-05 Qeexo, Co. Touch sensitive device with multi-sensor stream synchronized data
CN105159496A (zh) * 2015-08-31 2015-12-16 小米科技有限责任公司 触控事件响应方法及移动终端
CN105245650B (zh) * 2015-10-19 2018-09-18 昆山龙腾光电有限公司 触摸屏边缘防误触的便携式电子装置及方法
CN106648190B (zh) * 2015-10-30 2019-07-02 深圳市汇顶科技股份有限公司 防止触摸屏边缘误操作的装置和方法
CN106095317B (zh) * 2016-06-15 2020-07-10 海信视像科技股份有限公司 一种触摸屏的响应方法及终端
TWI606376B (zh) * 2016-08-08 2017-11-21 意象無限股份有限公司 觸控感測裝置及濾除誤觸的觸控方法
CN108021259B (zh) 2016-11-03 2021-03-30 华为技术有限公司 一种防误触方法及电子设备
CN106775086A (zh) * 2016-12-16 2017-05-31 广东欧珀移动通信有限公司 一种移动终端的触摸屏控制方法、装置及移动终端
CN106775084B (zh) 2016-12-16 2019-04-16 Oppo广东移动通信有限公司 一种触摸屏的防误触方法、装置及移动终端
CN106681636B (zh) * 2016-12-16 2020-01-14 Oppo广东移动通信有限公司 一种防误触的方法、装置及移动终端
CN106855783A (zh) * 2016-12-16 2017-06-16 广东欧珀移动通信有限公司 一种防误触的方法、装置及移动终端
CN106775405A (zh) * 2016-12-16 2017-05-31 广东欧珀移动通信有限公司 一种移动终端的触摸屏防误触方法、装置及移动终端
CN106775406A (zh) * 2016-12-16 2017-05-31 广东欧珀移动通信有限公司 一种移动终端触摸屏的防误触控制方法、装置及移动终端
CN107562346A (zh) * 2017-09-06 2018-01-09 广东欧珀移动通信有限公司 终端控制方法、装置、终端及计算机可读存储介质
CN107577372A (zh) * 2017-09-06 2018-01-12 广东欧珀移动通信有限公司 边缘触控方法、装置及移动终端
US10585536B2 (en) * 2017-09-29 2020-03-10 Apple Inc. Method for transitioning power states of an electronic device
CN109782937B (zh) * 2017-11-13 2021-03-05 京东方科技集团股份有限公司 触摸驱动方法、装置和显示终端
CN109062443A (zh) 2018-08-17 2018-12-21 武汉华星光电技术有限公司 触控感应方法及其设备
US11009989B2 (en) 2018-08-21 2021-05-18 Qeexo, Co. Recognizing and rejecting unintentional touch events associated with a touch sensitive device
KR102645332B1 (ko) 2018-12-19 2024-03-11 삼성전자주식회사 디스플레이의 복수의 영역들 간 인터랙팅을 위한 방법 및 전자 장치
CN111610874B (zh) * 2019-02-22 2022-05-10 华为技术有限公司 一种触摸屏的响应方法及电子设备
CN111752465A (zh) * 2019-03-29 2020-10-09 北京小米移动软件有限公司 防止边缘误触控的方法、装置及存储介质
US10942603B2 (en) 2019-05-06 2021-03-09 Qeexo, Co. Managing activity states of an application processor in relation to touch or hover interactions with a touch sensitive device
US11231815B2 (en) 2019-06-28 2022-01-25 Qeexo, Co. Detecting object proximity using touch sensitive surface sensing and ultrasonic sensing
CN113031802A (zh) * 2019-12-09 2021-06-25 华为终端有限公司 一种触控区域调整方法及装置
US11907526B2 (en) 2019-12-09 2024-02-20 Huawei Technologies Co., Ltd. Touch region adjustment method and apparatus for determining a grasping gesture of a user on an electronic device
CN113093930A (zh) * 2020-01-08 2021-07-09 北京小米移动软件有限公司 一种触摸信号处理方法、装置及介质
US11592423B2 (en) 2020-01-29 2023-02-28 Qeexo, Co. Adaptive ultrasonic sensing techniques and systems to mitigate interference
CN113282220A (zh) * 2020-02-19 2021-08-20 北京小米移动软件有限公司 防误触方法、确定防误触区域的方法及移动终端
CN115808991B (zh) * 2023-01-31 2023-07-07 荣耀终端有限公司 显示屏的触控操作方法及电子设备

Citations (3)

Publication number Priority date Publication date Assignee Title
CN103176653A (zh) * 2013-03-13 2013-06-26 向运明 手持式装置触控显示屏防误触方法
CN103235695A (zh) * 2013-04-11 2013-08-07 广东欧珀移动通信有限公司 触控设备中防止误操作的方法及其装置
CN104020878A (zh) * 2014-05-22 2014-09-03 小米科技有限责任公司 触摸输入控制方法及装置

Family Cites Families (20)

Publication number Priority date Publication date Assignee Title
US6459424B1 (en) * 1999-08-10 2002-10-01 Hewlett-Packard Company Touch-sensitive input screen having regional sensitivity and resolution properties
US7847789B2 (en) * 2004-11-23 2010-12-07 Microsoft Corporation Reducing accidental touch-sensitive device activation
US20090174679A1 (en) * 2008-01-04 2009-07-09 Wayne Carl Westerman Selective Rejection of Touch Contacts in an Edge Region of a Touch Surface
JP4672756B2 (ja) * 2008-06-30 2011-04-20 株式会社東芝 電子機器
JP5611763B2 (ja) * 2010-10-27 2014-10-22 京セラ株式会社 携帯端末装置及び処理方法
US9244545B2 (en) 2010-12-17 2016-01-26 Microsoft Technology Licensing, Llc Touch and stylus discrimination and rejection for contact sensitive computing devices
RU2455676C2 (ru) * 2011-07-04 2012-07-10 Общество с ограниченной ответственностью "ТРИДИВИ" Способ управления устройством с помощью жестов и 3d-сенсор для его осуществления
JP5799628B2 (ja) * 2011-07-15 2015-10-28 ソニー株式会社 情報処理装置、情報処理方法、及びプログラム
CN103064548A (zh) * 2011-10-24 2013-04-24 联咏科技股份有限公司 可滤除误触面板的手势判断方法
JP5189197B1 (ja) * 2011-10-27 2013-04-24 シャープ株式会社 携帯情報端末
CN102768595B (zh) 2011-11-23 2015-08-26 联想(北京)有限公司 一种识别触摸屏上触控操作指令的方法及装置
US20130207913A1 (en) * 2012-02-09 2013-08-15 Sony Mobile Communications Inc. Touch panel device, portable terminal, position detecting method, and recording medium
JP6292673B2 (ja) * 2012-03-02 2018-03-14 日本電気株式会社 携帯端末装置、誤操作防止方法、及びプログラム
JP5922480B2 (ja) * 2012-04-25 2016-05-24 京セラ株式会社 表示機能を備える携帯機器、プログラムおよび表示機能を備える携帯機器の制御方法
SE537730C2 (sv) * 2012-05-14 2015-10-06 Scania Cv Ab Projicerat virtuellt inmatningssystem för fordon
JP2014006654A (ja) * 2012-06-22 2014-01-16 Fujifilm Corp 情報表示装置、ユーザーインターフェースの提供方法並びにプログラム
JP2014052950A (ja) * 2012-09-10 2014-03-20 Sharp Corp 情報端末
CN103513865A (zh) 2013-04-27 2014-01-15 展讯通信(上海)有限公司 一种触控设备及控制其配置操作模式的方法、装置
CN104423656B (zh) * 2013-08-20 2018-08-17 南京中兴新软件有限责任公司 误触摸识别方法和装置
US9851853B2 (en) * 2014-05-30 2017-12-26 Apple Inc. Low power scan for device wake up and unlock

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
CN103176653A (zh) * 2013-03-13 2013-06-26 向运明 手持式装置触控显示屏防误触方法
CN103235695A (zh) * 2013-04-11 2013-08-07 广东欧珀移动通信有限公司 触控设备中防止误操作的方法及其装置
CN104020878A (zh) * 2014-05-22 2014-09-03 小米科技有限责任公司 触摸输入控制方法及装置

Cited By (1)

Publication number Priority date Publication date Assignee Title
WO2017143823A1 (zh) * 2016-02-25 2017-08-31 上海斐讯数据通信技术有限公司 一种电容屏抗电磁干扰的方法和装置

Also Published As

Publication number Publication date
RU2618921C2 (ru) 2017-05-12
RU2014153895A (ru) 2016-07-20
BR112015000003A2 (pt) 2017-06-27
US20150338954A1 (en) 2015-11-26
US9671911B2 (en) 2017-06-06
CN104020878A (zh) 2014-09-03
KR101714857B1 (ko) 2017-03-09
KR20160000396A (ko) 2016-01-04
MX349777B (es) 2017-08-10
JP6033502B2 (ja) 2016-11-30
EP2947553A1 (en) 2015-11-25
MX2014015064A (es) 2016-04-26
JP2016524764A (ja) 2016-08-18

Similar Documents

Publication Publication Date Title
WO2015176484A1 (zh) 触摸输入控制方法及装置
US11635810B2 (en) Managing and mapping multi-sided touch
WO2016112697A1 (zh) 解锁方法、装置及终端
WO2018098865A1 (zh) 消息阅读方法及装置
WO2017036019A1 (zh) 移动终端控制的方法及移动终端
WO2017071050A1 (zh) 具有触摸屏的终端的防误触方法及装置
US11080503B2 (en) Method and apparatus for identifying fingerprint, electronic device, and computer readable storage medium
KR101843447B1 (ko) 가상 버튼을 프로세싱 하기 위한 방법, 및 모바일 단말
WO2018133387A1 (zh) 指纹识别方法及装置
WO2015123971A1 (zh) 输入方法和装置
US10061497B2 (en) Method, device and storage medium for interchanging icon positions
WO2016206295A1 (zh) 字符确定方法及装置
WO2017092500A1 (zh) 功能键的触控方法及装置
RU2628484C2 (ru) Способ и устройство для активации рабочего состояния мобильного терминала
US20160195992A1 (en) Mobile terminal and method for processing signals generated from touching virtual keys
CN106843691B (zh) 移动终端的操作控制方法及装置
CN106814903B (zh) 功能键的触控方法和装置
CN112987958A (zh) 触控信号处理方法和装置

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: MX/A/2014/015064

Country of ref document: MX

ENP Entry into the national phase

Ref document number: 2016520274

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 20147035869

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2014153895

Country of ref document: RU

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14892782

Country of ref document: EP

Kind code of ref document: A1

REG Reference to national code

Ref country code: BR

Ref legal event code: B01A

Ref document number: 112015000003

Country of ref document: BR

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14892782

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 112015000003

Country of ref document: BR

Kind code of ref document: A2

Effective date: 20150102