CN103677591A - Terminal self-defined gesture method and terminal thereof - Google Patents

Terminal self-defined gesture method and terminal thereof

Info

Publication number
CN103677591A
CN103677591A (application CN201210315343.8A)
Authority
CN
China
Prior art keywords
self-defined gesture
display screen
action
touch display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201210315343.8A
Other languages
Chinese (zh)
Inventor
王西圣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ZTE Corp
Original Assignee
ZTE Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ZTE Corp filed Critical ZTE Corp
Priority to CN201210315343.8A priority Critical patent/CN103677591A/en
Priority to PCT/CN2013/081025 priority patent/WO2014032504A1/en
Publication of CN103677591A publication Critical patent/CN103677591A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/02: Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F 3/023: Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F 3/0238: Programmable keyboards
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/0489: Interaction techniques based on graphical user interfaces [GUI] using dedicated keyboard keys or combinations thereof
    • G06F 3/04897: Special input arrangements or commands for improving display capability

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

The invention discloses a method for self-defined gestures on a terminal, and a terminal using the method. The method comprises: detecting a plurality of continuous actions performed on a touch display screen and various sensors within a predetermined time, and storing the detected continuous actions as an operation path in a predetermined data format; requesting input of a self-defined gesture, and storing the detected contact with the touch display screen as a self-defined gesture, associated with the operation path, in the predetermined data format. The terminal comprises at least the touch display screen, a CPU and a memory. The method and terminal can simplify some complex user operations, making operation more convenient in special situations or for special groups of users.

Description

Method for self-defined gestures on a terminal, and terminal thereof
Technical field
The present invention relates to the field of communication technologies, and in particular to a method for self-defined gestures on a terminal and a terminal using the method.
Background art
Smart phones have developed by leaps and bounds in recent years and are gradually replacing traditional non-smart phones. Touch screens are widely used on smart phones and have become an important component of them.
With the development of smart phones, display screens and touch screens have tended to grow larger, making one-handed operation of the phone increasingly difficult. At the same time, the extensive use of sensors and the great increase in CPU speed mean that large software applications are becoming more widely used on mobile phones; the complexity of the software rises accordingly, and with it the complexity of the user's operations. In some special situations, when two-handed operation is impossible, this greatly limits the user; people whose hands are inflexible or disabled face the same problem. A more simplified mode of operation is therefore needed.
Summary of the invention
In view of the above analysis, the present invention aims to provide a method for self-defined gestures on a terminal and a terminal thereof, in order to solve the problem of complex terminal operation in the prior art.
The object of the present invention is mainly achieved through the following technical solutions:
The present invention provides a method for self-defined gestures on a terminal, comprising:
detecting a plurality of continuous actions performed on a touch display screen and various sensors within a predetermined time, taking the detected plurality of continuous actions as an operation path, and storing them in a predetermined data format;
requesting input of a self-defined gesture, and storing the detected contact with the touch display screen as a self-defined gesture, associated with the operation path, in the predetermined data format.
Further, the method also comprises:
when a plurality of continuous actions need to be triggered by a self-defined gesture, comparing the currently detected contact with the previously stored self-defined gestures, and, when the comparison succeeds, triggering the plurality of continuous actions of the corresponding operation path.
Further, the step of detecting and storing the plurality of continuous actions specifically comprises:
detecting in turn whether the touch display screen and the various sensors register an action; if so, storing the detected action, in the predetermined data format, as a node of the operation path; otherwise, returning to detect again whether the touch display screen and the various sensors register an action, until neither the touch display screen nor the sensors register any action, at which point the self-defined gesture flow is triggered.
Further, when each action is stored in the predetermined format, the time point of its execution is stored together with it.
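The "predetermined data format" is not specified in the text. As a purely illustrative assumption, each operation-path node could record its source, action type, parameters, and the time point required above; the sketch below (all names hypothetical) serializes such a path to JSON:

```python
# Hypothetical node layout for an operation path; the patent does not
# define the format, so this is an illustrative assumption only.
import json
from dataclasses import dataclass, asdict

@dataclass
class ActionNode:
    source: str       # "touchscreen" or a sensor name, e.g. "gravity"
    action: str       # e.g. "tap", "slide_left", "rotate_landscape"
    params: dict      # action-specific data, e.g. tap coordinates
    timestamp: float  # time point of execution, stored with the action

def serialize_path(path):
    """Store an operation path (list of ActionNode) as JSON text."""
    return json.dumps([asdict(node) for node in path])

def deserialize_path(text):
    """Restore an operation path from its stored form."""
    return [ActionNode(**d) for d in json.loads(text)]

# Two consecutive recorded actions and a storage round trip.
path = [
    ActionNode("touchscreen", "tap", {"x": 120, "y": 380}, 0.0),
    ActionNode("gravity", "rotate_landscape", {}, 1.5),
]
stored = serialize_path(path)
restored = deserialize_path(stored)
```

Any serialization that preserves the ordering and the per-node time points would serve equally well; JSON is used here only because it round-trips simply.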
The present invention also provides a terminal, comprising at least: a touch display screen, a CPU and a memory, wherein:
the CPU is configured to detect a plurality of continuous actions performed on the touch display screen and various sensors as an operation path, and, after requesting input of a self-defined gesture, to take the detected contact with the touch display screen as a self-defined gesture associated with the operation path;
the memory is configured to store, in a predetermined data format, the plurality of continuous actions detected by the CPU as an operation path, and to store, in the predetermined data format, the contact with the touch display screen detected by the CPU as a self-defined gesture, the operation path corresponding to the self-defined gesture.
Further, the CPU is also configured to: when a plurality of continuous actions need to be triggered by a self-defined gesture, compare the currently detected contact with the previously stored self-defined gestures, and, when the comparison succeeds, trigger the plurality of continuous actions of the corresponding operation path.
Further, the CPU is specifically configured to: detect in turn whether the touch display screen and the various sensors register an action; if so, store the detected action, in the predetermined data format, as a node of the operation path; otherwise, return to detect again whether the touch display screen and the various sensors register an action, until neither the touch display screen nor the sensors register any action; and, after requesting input of a self-defined gesture, detect contact with the touch display screen.
Further, the various sensors comprise at least one of the following:
a gravity sensor, an acceleration sensor, an optical sensor, a distance sensor, and a gyroscope.
Further, the terminal also comprises:
a gesture trigger, where the gesture trigger is the touch display screen carried by the terminal, an external touch display screen, or a device with a Bluetooth function and a touch display screen.
The beneficial effects of the present invention are as follows:
The present invention can simplify some complex user operations, making operation more convenient in special situations or for special groups of users.
Other features and advantages of the present invention will be set forth in the following description and will in part become apparent from the description, or be understood by implementing the present invention. The objects and other advantages of the present invention can be realized and obtained by the structures particularly pointed out in the written description, the claims and the accompanying drawings.
Brief description of the drawings
Fig. 1 is a schematic flowchart of the first method embodiment of the present invention;
Fig. 2 is a schematic flowchart of defining a gesture in the second method embodiment of the present invention;
Fig. 3 is a schematic flowchart of triggering a gesture in the second method embodiment of the present invention;
Fig. 4 is a schematic structural diagram of the terminal embodiment of the present invention.
Embodiments
The preferred embodiments of the present invention are described in detail below with reference to the accompanying drawings, which form a part of this application and, together with the embodiments of the present invention, serve to explain the principle of the invention.
First, the method embodiments of the present invention are described in detail with reference to Figs. 1 to 3.
First method embodiment:
As shown in Fig. 1, which is a schematic flowchart of the first method embodiment of the present invention, the method can specifically comprise:
Step 101: detecting a plurality of continuous actions performed on the touch screen and each sensor within a predetermined time, taking the detected plurality of continuous actions as an operation path, and storing them in a predetermined data format;
Step 102: requesting input of a self-defined gesture, and storing the detected contact with the touch screen as a self-defined gesture, associated with the operation path, in the predetermined data format.
The above describes the storage of the plurality of continuous actions and of the self-defined gesture. As a preferred embodiment of the present invention, the method can also comprise:
Step 103: when a plurality of continuous actions need to be triggered by a self-defined gesture, comparing the currently detected contact with the previously stored self-defined gestures, and, when the comparison succeeds, triggering the plurality of continuous actions of the corresponding operation path.
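Step 103 compares the drawn contact with the stored self-defined gestures, but the patent does not say how the comparison is done. One plausible way, shown here purely as an assumed sketch (the resample-by-index shortcut and the distance threshold are inventions of this example, not the patented method), is to resample both strokes to the same number of points and compare them by mean point-to-point distance:

```python
# Illustrative gesture comparison: not the patented method, just one
# common-style template match over resampled point sequences.
import math

def _resample(points, n=16):
    """Pick n points spread evenly (by index) along a non-empty stroke."""
    if len(points) == 1:
        return list(points) * n
    return [points[round(i * (len(points) - 1) / (n - 1))] for i in range(n)]

def match_gesture(contact, stored, threshold=40.0):
    """Return the name of the stored gesture closest to the drawn contact,
    or None when nothing is close enough (the comparison fails)."""
    sample = _resample(contact)
    best_name, best_dist = None, float("inf")
    for name, template in stored.items():
        t = _resample(template)
        d = sum(math.dist(p, q) for p, q in zip(sample, t)) / len(sample)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None

# Hypothetical stored gestures ("Z" and "L") and a freshly drawn contact.
templates = {
    "Z": [(0, 0), (100, 0), (0, 100), (100, 100)],
    "L": [(0, 0), (0, 100), (100, 100)],
}
drawn = [(2, 1), (98, 3), (3, 97), (99, 101)]
result = match_gesture(drawn, templates)  # comparison succeeds with "Z"
```

A production recognizer would also normalize scale, rotation and position before comparing; the threshold here is an arbitrary example value.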
Second method embodiment:
As shown in Fig. 2, which is a schematic flowchart of defining a gesture in the second method embodiment of the present invention, the flow can specifically comprise:
Step 200: after the user starts the gesture-definition function, the start time point is recorded;
Step 201: the CPU detects whether the touch screen registers an action; if so, the time point of the action's execution is recorded, the action and its time point are stored in the predetermined format as a node of the operation path, and detection of touch screen actions continues; otherwise, step 202 is executed;
Step 202: the CPU detects whether the various sensors register an action; if so, the time point of the action's execution is recorded, the action and its time point are stored in the predetermined format as the next node of the operation path, and the flow returns to step 201 to continue detecting touch screen actions; otherwise, the flow goes to step 203;
Step 203: it is judged whether all actions are complete; if so, the completion time point is recorded, the user is requested to input a self-defined gesture, the detected contact with the touch screen is stored, in the predetermined data format, as a self-defined gesture associated with this operation path, and the gesture-definition process is complete; otherwise, the flow returns to step 201.
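The polling flow of steps 201-203 can be sketched as follows, with the touch screen and the various sensors abstracted behind two hypothetical poll functions that return an action or None. This is an illustrative model, not the patent's implementation:

```python
# Illustrative model of steps 201-203: poll the touch screen first, then
# the sensors; stop when neither reports an action.
import time
from collections import deque

def record_operation_path(poll_touch, poll_sensors, clock=time.time):
    """Each detected action becomes an (action, time point) node of the
    operation path; recording ends when both sources are silent."""
    path = []
    while True:
        action = poll_touch()          # step 201: touch screen first
        if action is None:
            action = poll_sensors()    # step 202: then the sensors
        if action is None:
            break                      # step 203: all actions complete
        path.append((action, clock())) # store action with its time point
    return path

# Simulated input: two touch actions, then one sensor action, then silence.
touch = deque(["slide_left", "tap_point"])
sensors = deque(["rotate_landscape"])
path = record_operation_path(
    lambda: touch.popleft() if touch else None,
    lambda: sensors.popleft() if sensors else None,
)
```

Injecting the poll functions and the clock keeps the flow testable without real hardware; a terminal would wire these to its touch and sensor event queues.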
The above is the execution flow of defining a gesture. For example, suppose a user often needs to repeat a certain group of operations with the following content: "return to the main interface - slide left - click a specified point - slide right - slide - click a specified point - wait 10 s - turn the phone to landscape (gravity sensing) - slide down - zoom in at a specified point (multi-touch) - click - shake the phone (acceleration sensor) - zoom out - click a specified point - return to the main interface". The user can then start the gesture-definition function, perform the above operations, and then input and save a self-defined gesture such as "Z".
As shown in Fig. 3, which is a schematic flowchart of triggering a gesture in the second method embodiment of the present invention, the flow can specifically comprise:
Step 301: the self-defined gesture trigger function is started, and the user's input is awaited;
Step 302: the user inputs a self-defined gesture through the touch screen;
Step 303: the previously stored self-defined gestures are searched to determine whether the input belongs to the user-defined gestures; if the self-defined gesture is found, the plurality of continuous actions corresponding to it are called;
Step 304: the plurality of continuous actions corresponding to the self-defined gesture are executed. If the continuous actions are executed according to their time points, the operations recorded when the gesture was defined are reproduced exactly, including the intervals between operations; if they are executed without regard to the time points, the operations need only be executed in order, which can greatly save time.
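The two replay modes distinguished in step 304 can be sketched like this; the (action, time point) node layout and the `execute` callback are assumptions of the example, not names from the patent:

```python
# Illustrative replay of an operation path, with and without the recorded
# time points (step 304).
import time

def replay(path, execute, honor_timing=False, sleep=time.sleep):
    """Execute each recorded action in order; when honor_timing is True,
    wait out the recorded interval before each subsequent action."""
    prev_t = None
    for action, t in path:
        if honor_timing and prev_t is not None:
            sleep(max(0.0, t - prev_t))  # reproduce the recorded interval
        execute(action)
        prev_t = t

# Timed replay, with sleep intercepted so the example runs instantly.
executed, waits = [], []
recorded = [("slide_left", 0.0), ("tap", 1.5), ("slide_right", 1.5)]
replay(recorded, executed.append, honor_timing=True, sleep=waits.append)
```

With `honor_timing=False` the same call runs the actions back-to-back, which is the time-saving mode the step describes.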
For example, continuing the gesture-definition example above: when the user inputs the "Z" gesture on the gesture trigger, the CPU confirms that the user has defined this gesture, finds the plurality of continuous actions corresponding to it, and then repeats the user-defined content of operations: "return to the main interface - slide left - click a specified point - slide right - slide - click a specified point - wait 10 s - turn the phone to landscape (gravity sensing) - slide down - zoom in at a specified point (multi-touch) - click - shake the phone (acceleration sensor) - zoom out - click a specified point - return to the main interface".
It should be noted that, both when defining a self-defined gesture and when subsequently triggering it, the touch display screen carried by the terminal can serve as the gesture trigger; an auxiliary device with a touch display screen can also be connected as the gesture trigger; or a standalone device with a Bluetooth function and a touch display screen can be adopted as the gesture trigger. In addition, the various sensors can include, but are not limited to, one or more of the following:
a gravity sensor, an acceleration sensor, an optical sensor, a distance sensor, and a gyroscope.
Next, the terminal of the embodiment of the present invention is described in detail with reference to Fig. 4.
As shown in Fig. 4, which is a schematic structural diagram of the terminal embodiment of the present invention, the terminal should comprise at least: a touch display screen, a CPU and a memory, wherein:
the CPU is responsible for detecting a plurality of continuous actions performed on the touch display screen and various sensors as an operation path, and, after requesting input of a self-defined gesture, for taking the detected contact with the touch display screen as a self-defined gesture associated with the operation path;
wherein the process by which the CPU detects the plurality of continuous actions performed on the touch display screen and the various sensors specifically comprises: detecting in turn whether the touch display screen and the various sensors register an action; if so, storing the detected action, in the predetermined data format, as a node of the operation path; otherwise, returning to detect again whether the touch display screen and the various sensors register an action, until neither the touch display screen nor the sensors register any action;
furthermore, when a plurality of continuous actions need to be triggered by a self-defined gesture, the CPU compares the currently detected contact with the previously stored self-defined gestures and, when the comparison succeeds, triggers the plurality of continuous actions of the corresponding operation path;
the memory is responsible for storing, in the predetermined data format, the plurality of continuous actions detected by the CPU as an operation path, and for storing, in the predetermined data format, the contact with the touch display screen detected by the CPU as a self-defined gesture, the operation path corresponding to the self-defined gesture;
the gesture trigger: when a plurality of continuous actions need to be triggered by a self-defined gesture, the user inputs the gesture through the gesture trigger; the gesture trigger can be the touch display screen carried by the terminal (the touch display screen used when inputting the self-defined gesture), an external touch display screen, or a device with a Bluetooth function and a touch display screen.
The various sensors comprise at least one of the following:
a gravity sensor, an acceleration sensor, an optical sensor, a distance sensor, and a gyroscope.
In summary, the embodiments of the present invention provide a method for self-defined gestures on a terminal and a terminal thereof. By collecting the operation path of the touch screen and the various sensors over a period of time and storing it in the memory, the user can set a gesture of his own to define this path; when the self-defined gesture is used later, the group of data in the memory is called and the defined operation path is repeated.
The beneficial effects of the embodiments of the present invention are as follows: for operations with a long operation path, or operations that combine touch and various sensors, the user needs only one gesture to complete them; when using certain software and games, a fixed group of gesture and sensor operations can be reduced to a single gesture defined by the user; in situations where operation is inconvenient, or for people whose hands are inflexible or disabled, self-defined gestures make operation easier. For certain specific needs, the operation process within a certain period of time can be recorded and stored, which also makes the user's operation more personalized.
The above are only preferred embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any variation or replacement that those skilled in the art can easily conceive within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
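The record-and-replay scheme described above can be modeled end to end by a small in-memory mapping from each self-defined gesture to its operation path; the class and method names below are illustrative assumptions, not the patent's structure:

```python
# Minimal in-memory model of the memory described above: each stored
# self-defined gesture maps to its operation path.
class GestureStore:
    def __init__(self):
        self._paths = {}  # gesture -> operation path (list of actions)

    def define(self, gesture, operation_path):
        """Store an operation path under a self-defined gesture."""
        self._paths[gesture] = list(operation_path)

    def trigger(self, gesture):
        """Return the operation path for a recognized gesture, else None."""
        return self._paths.get(gesture)

store = GestureStore()
store.define("Z", ["return_home", "slide_left", "tap", "slide_right"])
actions = store.trigger("Z")  # the path to replay for gesture "Z"
```

A real terminal would persist this mapping in the memory in the predetermined data format and route the returned path to whatever executes touch and sensor actions.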

Claims (9)

1. A method for self-defined gestures on a terminal, characterized by comprising:
detecting a plurality of continuous actions performed on a touch display screen and various sensors within a predetermined time, taking the detected plurality of continuous actions as an operation path, and storing them in a predetermined data format;
requesting input of a self-defined gesture, and storing the detected contact with the touch display screen as a self-defined gesture, associated with the operation path, in the predetermined data format.
2. The method according to claim 1, characterized by further comprising:
when a plurality of continuous actions need to be triggered by a self-defined gesture, comparing the currently detected contact with the previously stored self-defined gestures, and, when the comparison succeeds, triggering the plurality of continuous actions of the corresponding operation path.
3. The method according to claim 1 or 2, characterized in that the step of detecting and storing the plurality of continuous actions specifically comprises:
detecting in turn whether the touch display screen and the various sensors register an action; if so, storing the detected action, in the predetermined data format, as a node of the operation path; otherwise, returning to detect again whether the touch display screen and the various sensors register an action, until neither the touch display screen nor the sensors register any action, at which point the self-defined gesture flow is triggered.
4. The method according to claim 3, characterized in that, when each action is stored in the predetermined format, the time point of its execution is stored together with it.
5. A terminal, characterized by comprising at least: a touch display screen, a CPU and a memory, wherein:
the CPU is configured to detect a plurality of continuous actions performed on the touch display screen and various sensors as an operation path, and, after requesting input of a self-defined gesture, to take the detected contact with the touch display screen as a self-defined gesture associated with the operation path;
the memory is configured to store, in a predetermined data format, the plurality of continuous actions detected by the CPU as an operation path, and to store, in the predetermined data format, the contact with the touch display screen detected by the CPU as a self-defined gesture, the operation path corresponding to the self-defined gesture.
6. The terminal according to claim 5, characterized in that the CPU is further configured to: when a plurality of continuous actions need to be triggered by a self-defined gesture, compare the currently detected contact with the previously stored self-defined gestures, and, when the comparison succeeds, trigger the plurality of continuous actions of the corresponding operation path.
7. The terminal according to claim 5 or 6, characterized in that the CPU is specifically configured to: detect in turn whether the touch display screen and the various sensors register an action; if so, store the detected action, in the predetermined data format, as a node of the operation path; otherwise, return to detect again whether the touch display screen and the various sensors register an action, until neither the touch display screen nor the sensors register any action; and, after requesting input of a self-defined gesture, detect contact with the touch display screen.
8. The terminal according to claim 5, characterized in that the various sensors comprise at least one of the following:
a gravity sensor, an acceleration sensor, an optical sensor, a distance sensor, and a gyroscope.
9. The terminal according to claim 5 or 6, characterized by further comprising:
a gesture trigger, where the gesture trigger is the touch display screen carried by the terminal, an external touch display screen, or a device with a Bluetooth function and a touch display screen.
CN201210315343.8A 2012-08-30 2012-08-30 Terminal self-defined gesture method and terminal thereof Pending CN103677591A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201210315343.8A CN103677591A (en) 2012-08-30 2012-08-30 Terminal self-defined gesture method and terminal thereof
PCT/CN2013/081025 WO2014032504A1 (en) 2012-08-30 2013-08-07 Method for terminal to customize hand gesture and terminal thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210315343.8A CN103677591A (en) 2012-08-30 2012-08-30 Terminal self-defined gesture method and terminal thereof

Publications (1)

Publication Number Publication Date
CN103677591A (en) 2014-03-26

Family

ID=50182468

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210315343.8A Pending CN103677591A (en) 2012-08-30 2012-08-30 Terminal self-defined gesture method and terminal thereof

Country Status (2)

Country Link
CN (1) CN103677591A (en)
WO (1) WO2014032504A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106255977A * 2014-05-06 2016-12-21 讯宝科技有限责任公司 Apparatus and method for performing a variable data capture process
US10198083B2 2014-02-25 2019-02-05 Xi'an Zhongxing New Software Co. Ltd. Hand gesture recognition method, device, system, and computer storage medium
CN110287734A * 2019-07-01 2019-09-27 Oppo广东移动通信有限公司 Method, device, terminal and storage medium for setting a secure communication node
US10591999B2 2014-02-25 2020-03-17 Zte Corporation Hand gesture recognition method, device, system, and computer storage medium
CN115604384A * 2022-11-14 2023-01-13 广东石油化工学院 (CN) An electronic device for assisting the use of smartphones

Citations (3)

Publication number Priority date Publication date Assignee Title
CN101546233A (en) * 2009-05-05 2009-09-30 上海华勤通讯技术有限公司 Identification and operation method of touch screen interface gestures
US20100031203A1 (en) * 2008-08-04 2010-02-04 Microsoft Corporation User-defined gesture set for surface computing
CN102622225A (en) * 2012-02-24 2012-08-01 合肥工业大学 Multipoint touch application program development method supporting user defined gestures

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US20070177804A1 (en) * 2006-01-30 2007-08-02 Apple Computer, Inc. Multi-touch gesture dictionary

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
US20100031203A1 (en) * 2008-08-04 2010-02-04 Microsoft Corporation User-defined gesture set for surface computing
CN101546233A (en) * 2009-05-05 2009-09-30 上海华勤通讯技术有限公司 Identification and operation method of touch screen interface gestures
CN102622225A (en) * 2012-02-24 2012-08-01 合肥工业大学 Multipoint touch application program development method supporting user defined gestures

Cited By (7)

Publication number Priority date Publication date Assignee Title
US10198083B2 (en) 2014-02-25 2019-02-05 Xi'an Zhongxing New Software Co. Ltd. Hand gesture recognition method, device, system, and computer storage medium
US10591999B2 (en) 2014-02-25 2020-03-17 Zte Corporation Hand gesture recognition method, device, system, and computer storage medium
CN106255977A (en) * 2014-05-06 2016-12-21 讯宝科技有限责任公司 For performing the apparatus and method of variable data acquisition procedure
CN106255977B (en) * 2014-05-06 2019-07-09 讯宝科技有限责任公司 Device and method for executing variable data acquisition procedure
US10365721B2 (en) 2014-05-06 2019-07-30 Symbol Technologies, Llc Apparatus and method for performing a variable data capture process
CN110287734A * 2019-07-01 2019-09-27 Oppo广东移动通信有限公司 Method, device, terminal and storage medium for setting a secure communication node
CN115604384A * 2022-11-14 2023-01-13 广东石油化工学院 (CN) An electronic device for assisting the use of smartphones

Also Published As

Publication number Publication date
WO2014032504A1 (en) 2014-03-06

Similar Documents

Publication Publication Date Title
CN102647504B Method for controlling applications in mobile phone
US10739967B2 Window moving method of mobile device and apparatus thereof
CN102929706B Method for merging folders
JP6100287B2 Terminal multiple selection operation method and terminal
CN102741799B Touch screen operation method and terminal
CN104965641B Information display method and device
CN102929514B Application icon sorting method for a mobile terminal
WO2012068932A1 Method and device for controlling application icons on touch screen
CN103513912B Interface switching method and device
CN103677591A (en) Terminal self-defined gesture method and terminal thereof
KR20150047451A Method, apparatus and terminal device for displaying messages
US20150212693A1 Interaction method and apparatus for listing data on mobile terminal
CN105635828B Playback control method and device, electronic equipment and storage medium
US20130332888A1 Method and device for operating a list in a handheld device
CN103543988A Method for processing array information, and method and device for controlling information entering arrays
CN105760094A Frame rate control method, device and terminal
WO2015131590A1 Method for controlling blank screen gesture processing and terminal
CN106055237B Sliding boost response method and device
CN111459303B Method, device, mouse and storage medium for controlling a terminal screen with a mouse
KR101242481B1 Multimodal interface support user device using user touch and breath, and control method thereof
CN103473014A Multitask switching method and terminal
CN109978482A Workflow processing method, device, equipment and storage medium
CN103034317A Method and device for handling standby
US10908868B2 Data processing method and mobile device
CN103677417B Gesture detection method, device and terminal device

Legal Events

Date Code Title Description
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20140326