US9007314B2 - Method for touch processing and mobile terminal


Publication number
US9007314B2
Authority
US
United States
Prior art keywords
touching point
area
processor
touch
touching
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US13/520,221
Other languages
English (en)
Other versions
US20130201118A1 (en)
Inventor
Xiangtao Liu
Dayong Gan
Current Assignee
Lenovo Beijing Ltd
Beijing Lenovo Software Ltd
Original Assignee
Lenovo Beijing Ltd
Beijing Lenovo Software Ltd
Priority date
Filing date
Publication date
Application filed by Lenovo Beijing Ltd, Beijing Lenovo Software Ltd filed Critical Lenovo Beijing Ltd
Assigned to BEIJING LENOVO SOFTWARE LTD. and LENOVO (BEIJING) CO., LTD. Assignors: GAN, DAYONG; LIU, XIANGTAO
Publication of US20130201118A1
Application granted
Publication of US9007314B2

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F 3/0418 Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • G06F 3/04186 Touch location disambiguation

Definitions

  • the present application relates to the field of communication technology, and in particular, to a touch processing method and a mobile terminal.
  • Existing terminals with a touch screen, such as touch-type mobile phones and PDAs (Personal Digital Assistants), often have no physical keys such as a keyboard, so that all operations are performed by one or more fingers touching the touch screen.
  • The inventors of the present application have found that a misoperation may easily occur when a user holds the mobile phone closely while performing a touch operation.
  • For example, when the user holds the mobile phone in one hand, his/her thumb may move on the touch screen to perform the touch operation, but the part connecting the thumb with the palm may touch the touch screen at the same time. The touch screen then senses a plurality of touch operations and in turn generates a plurality of touch commands, so a misoperation may occur.
  • To avoid this, the user may carefully hold only the edges of the mobile phone, but this makes operation inconvenient and accordingly degrades the user's touch operation experience.
  • An object of the embodiments of the present application is to provide a touch processing method and a mobile terminal, in order to solve the problems in the prior art that a misoperation easily occurs when the mobile terminal receives a touch input and that the user's touch operation experience is degraded.
  • There is provided a touch processing method for a mobile terminal with a touch sensing device, wherein the touch sensing device has a first area and a second area, a touching point in the first area has a first response level, a touching point in the second area has a second response level, and the first response level is higher than the second response level. The method comprises: acquiring a first touching point and a second touching point; and responding to the first touching point when the first touching point is in the first area while the second touching point is in the second area.
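The mixed-area rule of the method above can be sketched as follows. The geometry (a central first area surrounded by an edge margin serving as the second area) and all names here are illustrative assumptions, not taken from the patent:

```python
# Sketch of the claimed method: two screen areas with different response
# levels; when one touching point falls in each area, only the point in
# the higher-level (first) area is responded to.
from dataclasses import dataclass


@dataclass(frozen=True)
class Point:
    x: int
    y: int


def in_first_area(p: Point, width: int = 480, height: int = 800,
                  margin: int = 40) -> bool:
    # First area: central touch-input region; second area: the edge margin.
    # The screen geometry is an illustrative assumption.
    return margin <= p.x < width - margin and margin <= p.y < height - margin


def respond(p1: Point, p2: Point):
    """Return the point that should be responded to, or None if the
    mixed-area condition of the claim does not hold."""
    if in_first_area(p1) and not in_first_area(p2):
        return p1  # respond to the first touching point only
    if in_first_area(p2) and not in_first_area(p1):
        return p2
    return None    # the claim covers only the one-point-per-area case
```

When both points land in the same area the sketch deliberately returns nothing, since the claim only addresses the case of one point per area.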
  • The touching point in the first area is responded to in a first response mode and the touching point in the second area is responded to in a second response mode, and the first response mode is different from the second response mode.
  • The mobile terminal further includes a first processor and a second processor. The first touching point is responded to in the first response mode when the second processor determines that the first touching point is in the first area, and the first response mode is to transmit the first touching point to the first processor; the second touching point is responded to in the second response mode when the second processor determines that the second touching point is in the second area, and the second response mode is to obtain a touch command corresponding to the second touching point and transmit the touch command to the first processor.
  • Acquiring the first touching point and the second touching point comprises acquiring the second touching point in the second area during a process of acquiring the first touching point in the first area.
  • the method further comprises: not responding to the second touching point.
  • Acquiring the first touching point and the second touching point comprises acquiring the first touching point in the first area during a process of acquiring the second touching point in the second area.
  • the method further comprises: terminating the acquiring of the second touching point.
  • Acquiring the first touching point and the second touching point comprises acquiring the first touching point in the first area and the second touching point in the second area at the same time.
  • the method further comprises: not responding to the second touching point.
  • Acquiring the first touching point and the second touching point comprises acquiring the first touching point and the second touching point by the second processor.
  • the method further comprises: the second processor determines a touch command corresponding to the second touching point according to the second touching point acquired from the second area.
  • the first touching point in the first area is acquired during a process of acquiring the second touching point in the second area.
  • the method further comprises: the second processor transmitting the first touching point to the first processor; the second processor not performing the step of determining a touch command corresponding to the second touching point according to the second touching point acquired from the second area.
  • Responding to the first touching point comprises the first processor responding to the first touching point after receiving the first touching point.
  • Acquiring the first touching point in the first area during the process of determining the touch command corresponding to the second touching point according to the second touching point acquired from the second area further comprises: the second processor transmitting the first touching point to the first processor; the second processor terminating the step of determining the touch command corresponding to the second touching point according to the second touching point acquired from the second area.
  • Responding to the first touching point comprises the first processor responding to the first touching point after receiving the first touching point.
  • There is also provided a mobile terminal with a touch sensing device, wherein the touch sensing device has a first area and a second area, a touching point in the first area has a first response level, a touching point in the second area has a second response level, and the first response level is higher than the second response level. The mobile terminal further comprises a first processor and a second processor, wherein the second processor comprises an acquiring unit for acquiring a first touching point and a second touching point, and the first processor comprises a response unit for responding to the first touching point when the first touching point is in the first area while the second touching point is in the second area.
  • The touching point in the first area is responded to in a first response mode and the touching point in the second area is responded to in a second response mode, and the first response mode is different from the second response mode.
  • The first touching point is responded to in the first response mode when the second processor determines that the first touching point is in the first area, and the first response mode is to transmit the first touching point to the first processor; the second touching point is responded to in the second response mode when the second processor determines that the second touching point is in the second area, and the second response mode is to obtain a touch command corresponding to the second touching point and transmit the touch command to the first processor.
  • the second processor further comprises: a determination unit for determining a touch command corresponding to the second touching point according to the second touching point acquired from the second area.
  • the acquiring unit acquires the first touching point in the first area during a process of acquiring the second touching point in the second area.
  • the second processor further comprises: a transmitting unit for transmitting the first touching point to the first processor; and a control unit for controlling the determination unit not to perform the process of determining the touch command corresponding to the second touching point according to the second touching point acquired from the second area.
  • the response unit responds to the first touching point after receiving the first touching point.
  • the acquiring unit acquires the first touching point in the first area during the process of determining the touch command corresponding to the second touching point according to the second touching point acquired from the second area.
  • the second processor further comprises: a transmitting unit for transmitting the first touching point to the first processor; and a control unit for controlling the determination unit to terminate the process of determining the touch command corresponding to the second touching point according to the second touching point acquired from the second area.
  • the response unit responds to the first touching point after receiving the first touching point.
  • The touch-type mobile terminal in the embodiments of the present application is equipped with a touch sensing device that has a first area and a second area, and a first touching point will be responded to when the first touching point and a second touching point are acquired and the first touching point is in the first area while the second touching point is in the second area.
  • When the mobile terminal of the embodiments of the present application is used to perform a touch input, no misoperation occurs, because only the touch command corresponding to one touching point is responded to when touch operations are input in different areas.
  • Thus a user does not have to hold only the edges of the mobile terminal in order to avoid a misoperation while operating it. Instead, the mobile terminal responds to useful touch commands automatically and shields the touch commands generated by a misoperation, so the user experience is enhanced.
  • FIG. 1 is a flowchart illustrating a first embodiment of the touch processing method of the present application
  • FIG. 2 is a flowchart illustrating a second embodiment of the touch processing method of the present application
  • FIG. 3 is a flowchart illustrating a third embodiment of the touch processing method of the present application.
  • FIG. 4 is a schematic diagram illustrating a structure of a mobile terminal to which the embodiments of the touch processing method of the present application are applied;
  • FIG. 5 is a block diagram illustrating a first embodiment of a mobile terminal of the present application.
  • FIG. 6 is a block diagram illustrating a second embodiment of a mobile terminal of the present application.
  • FIG. 7 is a block diagram illustrating a third embodiment of a mobile terminal of the present application.
  • The mobile terminal in the embodiments of the present application is equipped with a touch sensing device; the touch sensing device has a first area and a second area, a touching point in the first area has a first response level, a touching point in the second area has a second response level, and the first response level is higher than the second response level. The mobile terminal further comprises a first processor for responding to touching points and a second processor for acquiring the touching points.
  • FIG. 1 is a flowchart illustrating a first embodiment of the touch processing method of the present application.
  • Step 101: acquiring a first touching point and a second touching point.
  • The second touching point in the second area may be acquired during the process of acquiring the first touching point in the first area; the first touching point in the first area may be acquired during the process of acquiring the second touching point in the second area; or the first touching point in the first area and the second touching point in the second area may be acquired at the same time.
  • Step 102: judging that the first touching point is in the first area and the second touching point is in the second area.
  • Step 103: responding to the first touching point and terminating the current flow.
  • The first area refers to a touch input area corresponding to a display screen, and the second area may be a special touch input area other than the touch input area corresponding to the display screen, wherein the special touch input area is a touch input area dedicated to touch gestures (that is, one touch gesture is determined by collecting a plurality of second touching points).
  • The first area may also be a predetermined area for touch input operations on the touch display screen, and the second area may be the edge region of the touch display screen.
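As an illustration of a gesture-only second area, the sketch below derives a command from a plurality of second touching points, as the description above suggests. The command names and the swipe threshold are hypothetical, chosen only to make the example concrete:

```python
# Illustrative sketch: a touch gesture in the second area is recognized by
# collecting several second touching points and examining their net motion.
# Command names and the min_travel threshold are assumptions.
def gesture_command(points, min_travel=30):
    """Map a list of (x, y) samples from the gesture area to a command
    string, or None if no gesture can be recognized."""
    if len(points) < 2:
        return None  # a gesture needs a plurality of touching points
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    if abs(dx) >= max(abs(dy), min_travel):
        # predominantly horizontal swipe
        return "NEXT_PAGE" if dx > 0 else "PREV_PAGE"
    if abs(dy) >= min_travel:
        # predominantly vertical swipe
        return "HOME" if dy > 0 else "BACK"
    return None
```

A single sample yields no command, matching the note that one gesture is determined from a plurality of second touching points.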
  • The first processor may be a CPU (Central Processing Unit) of the mobile terminal,
  • and the second processor may be an MCU (Microcontroller) connected with the touch sensing device.
  • The MCU may be connected with the CPU; the MCU transmits the touching point in the first area to the CPU, and transmits the touch command corresponding to the touching points it acquires in the second area to the CPU.
  • The CPU is used to respond to the received touching point and touch command.
  • When the CPU responds to a received touching point, it determines an object on the mobile terminal according to the coordinates of the touching point, or it determines a gesture input, for example a slide, according to a plurality of touching points.
  • When the CPU responds to a received touch command, it executes the touch command directly, for example an instruction for returning to a home page or an instruction for returning to the previous menu.
  • The functions of the MCU can also be integrated into and implemented by the CPU; that is to say, the functions of the first processor and the second processor may be integrated into one processor, and the embodiments of the present application are not limited thereto.
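The division of labor between the two processors described above can be sketched roughly as follows; `MCU`, `CPU`, and the callbacks are assumed names for illustration, not the patent's interfaces:

```python
# Sketch of the two response modes: the MCU forwards raw first-area points
# to the CPU, but converts second-area points into touch commands before
# forwarding them. All class and method names are illustrative.
class CPU:
    def __init__(self):
        self.log = []

    def on_point(self, point):
        # respond to a raw touching point (e.g. hit-test an object)
        self.log.append(("point", point))

    def on_command(self, command):
        # execute a ready-made touch command (e.g. "return to home page")
        self.log.append(("command", command))


class MCU:
    def __init__(self, cpu, in_first_area, to_command):
        self.cpu = cpu
        self.in_first_area = in_first_area  # area classifier callback
        self.to_command = to_command        # gesture-to-command callback

    def on_touch(self, point):
        if self.in_first_area(point):
            self.cpu.on_point(point)        # first response mode
        else:
            cmd = self.to_command(point)    # second response mode
            if cmd is not None:
                self.cpu.on_command(cmd)
```

Integrating both roles into one processor, as the text allows, would simply mean calling `on_touch` and the `CPU` handlers on the same object.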
  • FIG. 2 is a flowchart illustrating a second embodiment of the touch processing method of the present application.
  • Step 201: the second processor acquires a first touching point from a first area and a second touching point from a second area, respectively.
  • The second processor is connected to the touch sensing device in the mobile terminal. The touch sensing device is generally a touch screen, the first area generally refers to the touch input area corresponding to the display screen, and the second area generally refers to a touch input area only for receiving touch gestures.
  • The first area may also be predetermined as a central region involved in the touch input operations on the touch screen, and the second area as an edge region of the touch screen, that is, an area which is easily touched by the hand of the user holding the mobile terminal but is not used to generate touch commands.
  • The first area and the second area described above may be defined flexibly depending on the actual application, as long as the useful touch inputs can be responded to when a misoperation occurs.
  • Step 202: the second processor determines the touch command corresponding to the second touching point according to the second touching point acquired from the second area.
  • Step 203: the second processor transmits the first touching point and the touch command corresponding to the second touching point to the first processor.
  • Step 204: the first processor responds to a touch command corresponding to the first touching point according to the first touching point, and responds to the touch command corresponding to the second touching point.
  • Step 205: it is judged whether the second processor acquires the first touching point in the first area during the process of acquiring the second touching point in the second area; if yes, the process proceeds to Step 206; otherwise, the process returns to Step 201.
  • Step 206: the second processor transmits the first touching point to the first processor and does not perform the step of determining the touch command corresponding to the second touching point according to the second touching point acquired from the second area.
  • A corresponding misoperation scene is as follows: the palm of the user touches the second area continuously and never leaves it while the user holds the mobile terminal in one hand, so the second processor acquires the second touching point continuously; at this time, the second processor also acquires the first touching point if the user performs a touch operation in the first area.
  • Although the second processor acquires the first touching point and the second touching point at the same time, it does not perform the process of determining the touch command corresponding to the second touching point according to the second touching point acquired from the second area, but only transmits the first touching point to the first processor.
  • Step 207: the first processor responds to the first touching point after receiving the first touching point, and the current flow is terminated.
  • Thus, when touch inputs exist both in the first area and in the second area and the touch input in the second area is a misoperation, the mobile terminal responds only to the first touching point and shields the second touching point, because the first processor receives only the first touching point and never receives a touch command corresponding to the second touching point from the second processor.
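Steps 201 through 207 can be condensed into one sensing cycle, sketched below under assumed names; the shielding decision is the essence of this second embodiment:

```python
# Sketch of one sensing cycle in the second embodiment: if any first-area
# point is present, second-area points are shielded and no command is
# determined for them. Function and command names are assumptions.
def process_frame(first_points, second_points):
    """Return (points_for_cpu, commands_for_cpu) for one cycle."""
    if first_points:
        # Misoperation case: the palm rests on the edge (second area)
        # while the thumb operates the first area -> shield the edge input
        # and forward only the first-area touching points.
        return list(first_points), []
    # Only second-area input: determine its touch command normally.
    return [], [f"CMD({x},{y})" for (x, y) in second_points]
```

In the continuous-palm scene described above, every cycle containing a first-area point therefore sends the CPU only that point, so the CPU never sees a command derived from the palm contact.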
  • FIG. 3 is a flowchart illustrating a third embodiment of the touch processing method of the present application.
  • Step 301: the second processor acquires a first touching point from a first area and a second touching point from a second area, respectively.
  • The second processor is connected to the touch sensing device in the mobile terminal. The touch sensing device is generally a touch screen, the first area generally refers to the touch input area corresponding to the display screen, and the second area generally refers to a touch input area only for receiving touch gestures.
  • The first area may also be predetermined as a central region involved in the touch input operations on the touch screen, and the second area as an edge region of the touch screen, that is, an area which is easily touched by the hand of the user while holding the mobile terminal but is not used to generate touch commands.
  • The first area and the second area described above may be defined flexibly depending on the actual application, as long as the useful touch inputs can be responded to when a misoperation occurs.
  • Step 302: the second processor determines the touch command corresponding to the second touching point according to the second touching point acquired from the second area.
  • Step 303: the second processor transmits the first touching point and the touch command corresponding to the second touching point to the first processor.
  • Step 304: the first processor responds to a touch command corresponding to the first touching point according to the first touching point, and responds to the touch command corresponding to the second touching point.
  • Step 305: it is judged whether the second processor acquires the first touching point in the first area during the process of determining the corresponding command according to the second touching point; if yes, the process proceeds to Step 306; otherwise, the process returns to Step 301.
  • Step 306: the second processor transmits the first touching point to the first processor.
  • Step 307: the second processor terminates the process of determining the touch command corresponding to the second touching point according to the second touching point acquired from the second area.
  • A corresponding misoperation scene is as follows: the palm of the user touches the second area transiently while the user holds the mobile terminal in one hand, so the second processor acquires the second touching point and begins to determine the touch command corresponding to it; at this time, the second processor may also acquire the first touching point if the user performs a touch operation in the first area.
  • The mobile terminal judges the second touching point to be a misoperation because the first touching point has been input in the first area; therefore, although the second processor has already started determining the touch command according to the second touching point, that determination is terminated and only the first touching point is transmitted to the first processor.
  • Step 308: the first processor responds to the first touching point after receiving the first touching point, and the current flow is terminated.
  • Thus, when touch inputs exist both in the first area and in the second area and the touch input in the second area is a misoperation, the mobile terminal responds only to the first touching point and shields the second touching point, because the first processor receives only the first touching point and never receives a touch command corresponding to the second touching point from the second processor.
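A minimal sketch of this third embodiment's cancellation behavior, with hypothetical names: an in-progress command determination for the second touching point is terminated as soon as a first touching point arrives:

```python
# Sketch of the third embodiment: command determination for a transient
# second-area touch is already under way and is cancelled when a
# first-area point arrives. All names are illustrative assumptions.
class CommandJob:
    """In-progress determination of a command for a second touching point."""

    def __init__(self, second_point):
        self.second_point = second_point
        self.cancelled = False

    def cancel(self):
        self.cancelled = True  # terminate the determination step

    def result(self):
        return None if self.cancelled else f"CMD{self.second_point}"


def handle(job, first_point=None):
    """Return what is sent to the first processor for this cycle."""
    if first_point is not None:
        job.cancel()                     # shield the second-area input
        return ("point", first_point)    # forward only the first point
    cmd = job.result()
    return ("command", cmd) if cmd else None
```

The difference from the second embodiment is only timing: here the determination has already started and is cancelled mid-flight rather than skipped outright.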
  • FIG. 4 is a schematic diagram illustrating a structure of a mobile terminal to which the embodiments of the touch processing method of the present application are applied.
  • the mobile terminal comprises: a display screen 410 and a touch screen 420 .
  • the touch screen 420 comprises a first touch area 421 corresponding to the display screen 410 and a second touch area 422 only for receiving touch gestures.
  • the first touch area 421 and the second touch area 422 are separated from each other by a dotted line schematically.
  • The touch screen 420 is connected with an MCU directly, and the MCU is connected with a CPU (neither the MCU nor the CPU is shown in FIG. 4).
  • In the scenario of the second embodiment, the MCU transmits the first touching point to the CPU and does not perform the step of determining a touch command corresponding to the second touching point according to the second touching point acquired from the second touch area 422; meanwhile, the CPU responds to the first touching point after receiving it.
  • In the scenario of the third embodiment, the MCU transmits the first touching point to the CPU and terminates the step of determining the touch command corresponding to the second touching point according to the second touching point acquired from the second touch area 422; meanwhile, the CPU responds to the first touching point after receiving it.
  • the present application also provides embodiments of a mobile terminal.
  • the mobile terminal comprises: a touch sensing device 510 , a first processor 520 and a second processor 530 .
  • the touch sensing device 510 has a first area and a second area, a touching point in the first area has a first response level and a touching point in the second area has a second response level, the first response level is higher than the second response level;
  • the second processor 530 comprises:
  • the first processor 520 comprises:
  • the mobile terminal comprises: a touch sensing device 610 , a first processor 620 and a second processor 630 .
  • the touch sensing device 610 has a first area and a second area, a touching point in the first area has a first response level and a touching point in the second area has a second response level, the first response level is higher than the second response level;
  • the second processor 630 comprises:
  • a first acquiring unit 631 for acquiring the first touching point in the first area during a process of acquiring the second touching point in the second area;
  • a first determination unit 632 for determining a touch command corresponding to the second touching point according to the second touching point acquired from the second area;
  • a first transmitting unit 633 for transmitting the first touching point to the first processor 620 ;
  • a first control unit 634 for controlling the first determination unit 632 not to perform the process of determining the touch command corresponding to the second touching point according to the second touching point acquired from the second area;
  • the first processor 620 comprises:
  • the mobile terminal comprises: a touch sensing device 710 , a first processor 720 and a second processor 730 .
  • the touch sensing device 710 has a first area and a second area, a touching point in the first area has a first response level and a touching point in the second area has a second response level, the first response level is higher than the second response level;
  • the second processor 730 comprises:
  • a second acquiring unit 731 for acquiring the first touching point in the first area during a process of determining a touch command corresponding to the second touching point according to the second touching point acquired from the second area;
  • a second determination unit 732 for determining a touch command corresponding to the second touching point according to the second touching point acquired from the second area;
  • a second transmitting unit 733 for transmitting the first touching point to the first processor 720 ;
  • a second control unit 734 for controlling the second determination unit 732 to terminate the process of determining the touch command corresponding to the second touching point according to the second touching point acquired from the second area;
  • the first processor 720 comprises:
  • In summary, the touch-type mobile terminal in the embodiments of the present application is equipped with a touch sensing device that has a first area and a second area, and a first touching point will be responded to when the first touching point and a second touching point are acquired and the first touching point is in the first area while the second touching point is in the second area.
  • When the mobile terminal of the embodiments of the present application is used to perform a touch input, no misoperation occurs, because only the touch command corresponding to one touching point is responded to when touch operations are input in different areas.
  • Thus a user does not have to hold only the edges of the mobile terminal in order to avoid a misoperation while operating it. Instead, the mobile terminal responds to useful touch commands automatically and shields the touch commands generated by a misoperation, so the user experience is enhanced.
  • The computer software product may be stored in a storage medium, for example a ROM/RAM, a magnetic disk, or an optical disk, and comprises several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform the methods in the respective embodiments, or parts thereof, of the present application.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN2009102445615A CN102117140A (zh) 2009-12-30 2009-12-30 Touch processing method and mobile terminal
CN200910244561.5 2009-12-30
CN200910244561 2009-12-30
PCT/CN2010/080229 WO2011079749A1 (zh) 2009-12-30 2010-12-24 Touch processing method and mobile terminal

Publications (2)

Publication Number Publication Date
US20130201118A1 US20130201118A1 (en) 2013-08-08
US9007314B2 true US9007314B2 (en) 2015-04-14

Family

ID=44215936

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/520,221 Active 2031-05-31 US9007314B2 (en) 2009-12-30 2010-12-24 Method for touch processing and mobile terminal

Country Status (3)

Country Link
US (1) US9007314B2 (zh)
CN (1) CN102117140A (zh)
WO (1) WO2011079749A1 (zh)


Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9110580B2 (en) * 2011-08-05 2015-08-18 Nokia Technologies Oy Apparatus comprising a display and a method and computer program
CN103294232A (zh) 2012-02-22 2013-09-11 Method for processing touch operation and terminal
TWI476651B (zh) * 2012-05-10 2015-03-11 Innocom Tech Shenzhen Co Ltd Touch panel, touch device, and touch point detection method
TWI464647B (zh) * 2012-09-10 2014-12-11 Elan Microelectronics Corp Touch device and gesture determination method thereof
CN104238793B (zh) * 2013-06-21 2019-01-22 ZTE Corporation Method and apparatus for preventing misoperation of a touch-screen mobile device
CN103616970B (zh) * 2013-11-07 2017-01-04 Huawei Device Co., Ltd. Touch response method and apparatus
CN105022518B (zh) * 2014-04-29 2018-04-24 Winbond Electronics Corp. Portable electronic device and touch detection method thereof
CN104007932B (zh) * 2014-06-17 2017-12-29 Huawei Technologies Co., Ltd. Touch point recognition method and apparatus
CN104822091B (zh) * 2015-04-29 2016-08-24 Nubia Technology Co., Ltd. Video playback progress control method and apparatus, and mobile terminal
CN106325621B (zh) * 2015-06-24 2019-05-14 Xiaomi Inc. Mobile terminal and touch response method
WO2017161497A1 (zh) * 2016-03-22 2017-09-28 Shenzhen Royole Technologies Co., Ltd. Head-mounted display device and control method thereof
WO2017166209A1 (zh) * 2016-03-31 2017-10-05 Huawei Technologies Co., Ltd. Method and apparatus for setting a touch-forbidden area, electronic device, display interface, and storage medium
CN107850974A (zh) * 2016-05-20 2018-03-27 Huawei Technologies Co., Ltd. Method for recognizing mistaken touch operations and electronic device
CN106095322B (zh) * 2016-06-27 2023-03-24 Lenovo (Beijing) Co., Ltd. Control method and electronic device
CN106534556A (zh) * 2016-11-22 2017-03-22 Nubia Technology Co., Ltd. Mobile terminal and data processing method thereof
EP3640782B1 (en) * 2017-08-03 2024-07-31 Huawei Technologies Co., Ltd. Anti-misoperation method and terminal
CN108897247A (zh) * 2018-05-02 2018-11-27 Foshan Shunde Midea Washing Appliances Manufacturing Co., Ltd. Control method and washing appliance
KR102700035B1 (ko) * 2019-02-19 2024-08-29 Samsung Electronics Co., Ltd. Electronic device for identifying coordinates of an external object touching a touch sensor
CN112162689B (zh) * 2020-09-18 2022-03-04 Vivo Mobile Communication Co., Ltd. Input method and apparatus, and electronic device
CN113157197B (zh) * 2021-04-28 2024-05-10 Beijing ByteDance Network Technology Co., Ltd. Touch panel control method and apparatus

Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060044259A1 (en) * 2004-08-25 2006-03-02 Hotelling Steven P Wide touchpad on a portable computer
US20070152976A1 (en) * 2005-12-30 2007-07-05 Microsoft Corporation Unintentional touch rejection
US20080012835A1 (en) * 2006-07-12 2008-01-17 N-Trig Ltd. Hover and touch detection for digitizer
US7561146B1 (en) * 2004-08-25 2009-07-14 Apple Inc. Method and apparatus to reject accidental contact on a touchpad
US20090179865A1 (en) * 2008-01-15 2009-07-16 Avi Kumar Interface system and method for mobile devices
US20100085323A1 (en) * 2009-12-04 2010-04-08 Adam Bogue Segmenting a Multi-Touch Input Region by User
US20100134437A1 (en) * 2008-11-28 2010-06-03 Htc Corporation Portable electronic device and method for waking up the same from sleep mode through touch screen
US20100139990A1 (en) * 2008-12-08 2010-06-10 Wayne Carl Westerman Selective Input Signal Rejection and Modification
US20100156795A1 (en) * 2008-12-23 2010-06-24 Samsung Electronics Co., Ltd. Large size capacitive touch screen panel
US20100182247A1 (en) * 2009-01-21 2010-07-22 Microsoft Corporation Bi-modal multiscreen interactivity
US20100194703A1 (en) * 2007-09-19 2010-08-05 Adam Fedor Multimedia, multiuser system and associated methods
US20100235793A1 (en) * 2009-03-16 2010-09-16 Bas Ording Methods and Graphical User Interfaces for Editing on a Multifunction Device with a Touch Screen Display
US20110069015A1 (en) * 2009-09-23 2011-03-24 Nokia Corporation Touch detection
US20110095988A1 (en) * 2009-10-24 2011-04-28 Tara Chand Singhal Integrated control mechanism for handheld electronic devices
US20120154294A1 (en) * 2010-12-17 2012-06-21 Microsoft Corporation Using movement of a computing device to enhance interpretation of input events produced when interacting with the computing device
US20120158629A1 (en) * 2010-12-17 2012-06-21 Microsoft Corporation Detecting and responding to unintentional contact with a computing device
US8239784B2 (en) * 2004-07-30 2012-08-07 Apple Inc. Mode-based graphical user interfaces for touch sensitive input devices
US20120262407A1 (en) * 2010-12-17 2012-10-18 Microsoft Corporation Touch and stylus discrimination and rejection for contact sensitive computing devices
US20130154983A1 (en) * 2010-07-01 2013-06-20 Flatfrog Laboratories Ab Data processing in relation to a multi-touch sensing apparatus

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8253695B2 (en) * 2006-09-06 2012-08-28 Apple Inc. Email client for a portable multifunction device
US20090174679A1 (en) * 2008-01-04 2009-07-09 Wayne Carl Westerman Selective Rejection of Touch Contacts in an Edge Region of a Touch Surface
CN101539818A (zh) * 2008-03-21 2009-09-23 Video device, and audio device and navigation device having the same

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
PCT/CN2010/080229 International Preliminary Report on Patentability dated Jul. 4, 2012 (5 pages).

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130285960A1 (en) * 2012-04-27 2013-10-31 Samsung Electronics Co. Ltd. Method for improving touch response and an electronic device thereof
US9612676B2 (en) * 2012-04-27 2017-04-04 Samsung Electronics Co., Ltd. Method for improving touch response and an electronic device thereof
US20140139463A1 (en) * 2012-11-21 2014-05-22 Bokil SEO Multimedia device for having touch sensor and method for controlling the same
US9703412B2 (en) * 2012-11-21 2017-07-11 Lg Electronics Inc. Multimedia device for having touch sensor and method for controlling the same
US20160077618A1 (en) * 2014-09-16 2016-03-17 Samsung Display Co., Ltd. Touch display device including visual accelerator
US9720589B2 (en) * 2014-09-16 2017-08-01 Samsung Display Co., Ltd. Touch display device including visual accelerator

Also Published As

Publication number Publication date
CN102117140A (zh) 2011-07-06
WO2011079749A1 (zh) 2011-07-07
US20130201118A1 (en) 2013-08-08

Similar Documents

Publication Publication Date Title
US9007314B2 (en) Method for touch processing and mobile terminal
AU2020201096B2 (en) Quick screen splitting method, apparatus, and electronic device, display UI, and storage medium
EP1980937B1 (en) Object search method and terminal having object search function
US8395584B2 (en) Mobile terminals including multiple user interfaces on different faces thereof configured to be used in tandem and related methods of operation
KR101478595B1 (ko) Touch-based method and apparatus for transmitting information
US20170123590A1 (en) Touch Point Recognition Method and Apparatus
US9891816B2 (en) Method and mobile terminal for processing touch input in two different states
WO2019014859A1 (zh) Multi-task operation method and electronic device
US10282073B2 (en) Method and wireless handheld device for automatically switching handheld mode
TWI626591B (zh) 應用程式切換系統及方法
US8456433B2 (en) Signal processing apparatus, signal processing method and selection method of user interface icon for multi-touch panel
KR20140071282A (ko) 표시된 대상의 줌을 제어하는 전자 기기 및 방법
US10705649B2 (en) Pressure touch control method and electronic device
US20130332888A1 (en) Method and device for operating list in handheld device
CN109074124B (zh) Data processing method and mobile device
CN107728898B (zh) Information processing method and mobile terminal
US20170371479A1 (en) Method for Processing Operation and Terminal
CN103150083B (zh) Method and apparatus for displaying custom desktop icons
CN104750401A (zh) Touch control method, related apparatus, and terminal device
WO2022089480A1 (zh) Information processing method, apparatus, and electronic device
US20150234546A1 (en) Method for Quickly Displaying a Skype Contacts List and Computer Program Thereof and Portable Electronic Device for Using the Same
CN112162689A (zh) Input method and apparatus, and electronic device
CN105159572A (zh) Interface switching method and mobile terminal
CN107390918A (zh) Operation method of mobile device, mobile device, and apparatus having storage function
TW201520946A (zh) Method for quickly displaying a Skype contacts list, and computer program product and portable electronic device using the same

Legal Events

Date Code Title Description
AS Assignment

Owner name: LENOVO (BEIJING) CO., LTD., CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LIU, XIANGTAO;GAN, DAYONG;REEL/FRAME:028476/0469

Effective date: 20120613

Owner name: BEIJING LENOVO SOFTWARE LTD., CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LIU, XIANGTAO;GAN, DAYONG;REEL/FRAME:028476/0469

Effective date: 20120613

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8