CN110874142A - User-defined method of full-screen mobile phone operation gesture, computing terminal and storage medium - Google Patents

Info

Publication number
CN110874142A
CN110874142A
Authority
CN
China
Prior art keywords
user
visual control
visual
operation instruction
touch screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN201911092325.6A
Other languages
Chinese (zh)
Inventor
诸葛嘉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN201911092325.6A
Publication of CN110874142A
Current legal status: Withdrawn

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847 - Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F3/0485 - Scrolling or panning

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)

Abstract

The invention discloses a method for customizing full-screen mobile phone operation gestures, which comprises the following steps: outputting a number of definable visual controls to a touch-screen user interface; moving the corresponding visual control to the user's first designated position according to a first operation instruction of the user; judging whether the first designated position is located at an edge of the touch-screen user interface; if so, entering the next step, and if not, moving the visual control to the edge of the touch-screen user interface along the shortest straight-line path; locking the position of the visual control according to a second operation instruction of the user; recording the effective triggerable area of the visual control according to a third operation instruction of the user; and recording the implementation gesture and implementation function of the visual control according to a fourth operation instruction of the user. The invention enables the user to define the operation gestures of a full-screen mobile phone entirely independently; the user has wide latitude to adjust, can define operation gestures freely, and enjoys a better use experience.

Description

User-defined method of full-screen mobile phone operation gesture, computing terminal and storage medium
Technical Field
The invention relates to the technical field of mobile phones, and in particular to a method for customizing full-screen mobile phone operation gestures, a computing terminal and a storage medium.
Background
Full screens are the current trend in mobile phones. With their extremely high front screen-to-body ratio, full-screen phones are well loved by users: they deliver a stronger visual impact, and they maximize the front screen without enlarging the body, satisfying users' dual demand for a large screen and a small body. For a full-screen phone, gesture operation is an important factor in how friendly the experience feels. At present, each mobile phone manufacturer has its own operation logic for full-screen phones, so a user who switches between phones from different manufacturers faces a considerable learning cost to adapt to the new phone's full-screen gestures, which is unfriendly to the user; and because manufacturers design their interaction logic differently, in terms of experience the new phone may even feel like a step backwards compared with the old one. In addition, the operation gestures set by each manufacturer are fixed, and it is difficult for users to customize the phone's operation gestures according to their own habits or to the gestures they consider best.
Disclosure of Invention
Purpose of the invention: in order to overcome the defects of the prior art, the invention provides a method for customizing full-screen mobile phone operation gestures, a computing terminal and a storage medium, which aim to let users customize operation gestures on their phones and solve the problems that the operation gestures of existing full-screen phones are fixed and cannot be customized by the user.
Technical scheme: in order to achieve this purpose, the invention discloses a method for customizing full-screen mobile phone operation gestures, which comprises the following steps:
outputting a number of definable visual controls to a touch-screen user interface;
moving the corresponding visual control to the user's first designated position according to a first operation instruction of the user;
judging whether the first designated position is located at an edge of the touch-screen user interface; if so, entering the next step, and if not, moving the visual control to the edge of the touch-screen user interface along the shortest straight-line path;
locking the position of the visual control according to a second operation instruction of the user;
recording the effective triggerable area of the visual control according to a third operation instruction of the user;
and recording the implementation gesture and implementation function of the visual control according to a fourth operation instruction of the user.
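For concreteness, the six steps can be read as a single pipeline. The following is a minimal Kotlin sketch of the control flow only; every type and function name here is an illustrative assumption, not terminology from the patent:

```kotlin
// Hedged sketch of the six-step customization flow; all names are assumptions.
interface TouchScreenUi {
    fun showDefinableControls()                    // step 1: output definable visual controls
    fun awaitDragToPosition(): Pair<Float, Float>  // step 2: first operation instruction, returns the drop point
    fun isOnEdge(p: Pair<Float, Float>): Boolean   // step 3: edge test
    fun snapToEdge(p: Pair<Float, Float>)          // step 3: shortest straight-line move
    fun lockPositions()                            // step 4: second operation instruction
    fun recordTriggerableArea()                    // step 5: third operation instruction
    fun recordGestureAndFunction()                 // step 6: fourth operation instruction
}

fun customizeGestures(ui: TouchScreenUi) {
    ui.showDefinableControls()
    val dropPoint = ui.awaitDragToPosition()
    if (!ui.isOnEdge(dropPoint)) ui.snapToEdge(dropPoint)
    ui.lockPositions()
    ui.recordTriggerableArea()
    ui.recordGestureAndFunction()
}
```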
Further, recording the effective triggerable area of the visual control according to the third operation instruction of the user includes:
displaying a number of movable controls on the periphery of the visual control on the touch-screen user interface;
and moving the corresponding movable control according to the third operation instruction of the user, so that the length to which the visual control extends along the edge of the touch-screen user interface is changed.
Further, recording the implementation gesture and implementation function of the visual control according to the fourth operation instruction of the user includes:
detecting a click operation by the user on the visual control;
when the click operation on the visual control is detected, displaying an angle control around the touch point of the click operation;
determining, according to a fifth operation instruction of the user, the angle range the user selects on the angle control as the touch angle, and popping up a number of function item labels for the user to select from;
and taking the function corresponding to the function item label selected by the user as the implementation function of the implementation gesture corresponding to the visual control, where the implementation gesture is: sliding inward from the edge of the touch-screen user interface within the touch angle.
Further, the visual control is elongated, and moving the corresponding visual control to the user's first designated position according to the first operation instruction of the user includes:
moving the visual control along with the user's touch position according to the user's sliding operation, while continuously calculating the distances between the center of the visual control and the nearest horizontal edge and the nearest vertical edge of the touch-screen user interface;
determining from these distances whether the edge closest to the center of the visual control is a horizontal edge or a vertical edge;
and adjusting the visual control to lie parallel to its nearest edge according to the result.
Further, after recording the implementation gesture and implementation function of the visual control according to the fourth operation instruction of the user, the method further includes:
hiding the visual control.
Further, before or after outputting the number of definable visual controls to the touch-screen user interface, the method further comprises:
outputting to the touch-screen user interface a number of recommended positions for fixing the visual controls, where the recommended positions are highlighted and a quick-setting control is arranged near each recommended position;
and moving the corresponding visual control to the user's first designated position according to the first operation instruction of the user then includes:
when a click by the user on a quick-setting control is detected, moving a visual control to the recommended position corresponding to that quick-setting control;
and when a press-and-slide operation by the user on a visual control is detected, moving the visual control along with the user's touch position.
In order to achieve the object of the invention, the invention also provides a computing terminal, which comprises a processor and a memory;
the memory is used to store an executable program;
and the processor is used to execute the executable program so as to implement the above method for customizing full-screen mobile phone operation gestures.
In order to achieve the object of the invention, the invention further provides a storage medium on which an executable program is stored; when executed, the executable program implements the above method for customizing full-screen mobile phone operation gestures.
Beneficial effects: with the method for customizing full-screen mobile phone operation gestures, the computing terminal and the storage medium of the invention, visual controls are placed on the touch-screen user interface, and their position, triggerable area, implementation gesture and implementation function are all determined by the user's operations on them; the user can therefore define the operation gestures of a full-screen phone entirely independently, has wide latitude to adjust, can define operation gestures freely, and enjoys a better use experience.
Drawings
FIG. 1 is a flow diagram of the method for customizing full-screen mobile phone operation gestures;
FIG. 2 is a first display state diagram of a touch screen user interface;
FIG. 3 is a second display state diagram of the touch screen user interface;
FIG. 4 is a third display state diagram of a touch screen user interface;
FIG. 5 is a fourth display state diagram of a touch screen user interface;
FIG. 6 is a fifth display state diagram of a touch screen user interface.
Detailed Description
The present invention will be further described with reference to the accompanying drawings.
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that references to "first", "second", etc. in the present invention are for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features concerned; a feature qualified as "first" or "second" may thus explicitly or implicitly include at least one such feature. In addition, the technical solutions of different embodiments may be combined with each other, provided such a combination can be realized by a person skilled in the art; where technical solutions contradict each other or a combination cannot be realized, the combination should be considered not to exist, and it falls outside the protection scope of the present invention.
In addition, in the following description, suffixes such as "module", "part" or "unit" used to denote elements serve only to facilitate the description of the present invention and have no specific meaning in themselves; "module", "part" and "unit" may therefore be used interchangeably.
Embodiment One
The method for customizing full-screen mobile phone operation gestures shown in fig. 1 is applied to the controller of a full-screen mobile phone, which comprises at least a touch screen and the controller. The method comprises the following steps S101 to S106:
step S101, outputting a plurality of definable visual controls to a touch screen user interface;
In this step, the visual controls are marker images of specific shapes displayed on the touch screen for the user to define; the controller defines the attributes of each visual control by detecting the user's operation instructions on it (i.e., the instructions generated by touching, sliding and so on over the touch screen). As shown in fig. 2, three visual controls a1, a2 and a3 are displayed on the touch-screen user interface.
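As a concrete illustration, the per-control state that the controller accumulates over the steps below might be modelled as follows. This is a minimal Kotlin sketch; the class and field names are assumptions for illustration, not taken from the patent:

```kotlin
// Minimal sketch of the state one definable visual control accumulates
// across steps S101-S106; all names are illustrative assumptions.
data class VisualControl(
    val id: String,                 // e.g. "a1", "a2", "a3" in fig. 2
    var centerX: Float = 0f,        // position while being dragged (step S102)
    var centerY: Float = 0f,
    var horizontal: Boolean = true, // lies parallel to its nearest edge (steps S401-S403)
    var locked: Boolean = false,    // fixed by the second operation instruction (step S104)
    var hidden: Boolean = false     // hidden once definition is complete (step S701)
)
```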
Step S102, moving the corresponding visual control to the user's first designated position according to a first operation instruction of the user;
In this step, the first operation instruction generally means pressing on the visual control to be defined and, while holding the press, dragging the control so that it slides along with the finger;
Step S103, judging whether the first designated position is located at an edge of the touch-screen user interface; if so, entering the next step, and if not, moving the visual control to the edge of the touch-screen user interface along the shortest straight-line path;
In this step, the positions of the visual controls a1, a2 and a3 after moving are shown in fig. 3;
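The shortest straight-line move of step S103 amounts to projecting the released control onto the closest of the four screen edges. A hedged Kotlin sketch, assuming a coordinate origin at the top-left corner of the screen (the names are illustrative):

```kotlin
data class Point(val x: Float, val y: Float)

// Projects a released control centre onto the nearest screen edge along the
// shortest straight-line path (step S103); assumes the origin is top-left.
fun snapToNearestEdge(p: Point, width: Float, height: Float): Point {
    val dLeft = p.x              // perpendicular distance to the left edge
    val dRight = width - p.x     // ... to the right edge
    val dTop = p.y               // ... to the top edge
    val dBottom = height - p.y   // ... to the bottom edge
    return when (minOf(minOf(dLeft, dRight), minOf(dTop, dBottom))) {
        dLeft  -> Point(0f, p.y)
        dRight -> Point(width, p.y)
        dTop   -> Point(p.x, 0f)
        else   -> Point(p.x, height)
    }
}
```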
step S104, locking the position of the visual control according to a second operation instruction of the user;
In this step, the controller may display a function button on the touch screen, for example a button labeled "position confirmation" (as shown in fig. 3); when the user clicks this button, it indicates that the user has finished customizing the positions of all visual controls on the interface, and the positions of the visual controls are then fixed.
Step S105, recording an effective triggerable area of the visual control according to a third operation instruction of a user;
and step S106, recording the implementation gesture and the implementation function of the visual control according to a fourth operation instruction of the user.
In the above steps S105 to S106, after the effective triggerable area, implementation gesture and implementation function have been defined, whenever the user later wants to trigger the implementation function of a certain visual control, the implementation gesture must be executed inside that control's effective triggerable area. The implementation gesture may be an operation such as a slide or a double tap, and the implementation function may be any function a full-screen mobile phone can perform, such as back, return to desktop, or pulling down the notification list.
Preferably, step S105 of recording the effective triggerable area of the visual control according to the third operation instruction of the user includes the following steps S201 to S202:
step S201, displaying a plurality of movable controls on the periphery of the visual control on the touch screen user interface;
In this step, the movable controls c are displayed at the edges of the visual control, generally at both ends along its length direction, as shown in fig. 4. Because full-screen gestures are generally performed at the edge of the screen, the effective triggerable area is a line, so its range can be determined simply by defining the length of the visual control.
Step S202, moving the corresponding movable control according to the third operation instruction of the user, so that the length to which the visual control extends along the edge of the touch-screen user interface is changed.
In this step, the effective triggerable area of a single visual control need not extend along only one straight edge of the touch screen; it may span two adjacent edges, e.g. the visual control may turn around a corner of the touch screen from the bottom edge and extend onto a side edge, as visual control a3 does in fig. 4.
While the user defines the effective triggerable area of each visual control, the controller keeps a function button for ending the current definition mode displayed on the touch screen, for example a button labeled "area confirmation", as shown in fig. 4; when the user clicks this button, the definition of the effective triggerable areas of all visual controls is finished and the next definition mode can be entered.
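Since the effective triggerable area is a line along the edge that may turn a corner (like a3), a touch can be tested against it with a point-to-segment distance and a small tolerance. The following Kotlin sketch rests on that reading of the text above; the `slop` tolerance and all names are assumptions:

```kotlin
import kotlin.math.hypot

data class Segment(val x0: Float, val y0: Float, val x1: Float, val y1: Float)

// Distance from point (px, py) to the closest point on segment s.
fun distanceToSegment(px: Float, py: Float, s: Segment): Float {
    val dx = s.x1 - s.x0
    val dy = s.y1 - s.y0
    val len2 = dx * dx + dy * dy
    val t = if (len2 == 0f) 0f
            else (((px - s.x0) * dx + (py - s.y0) * dy) / len2).coerceIn(0f, 1f)
    return hypot(px - (s.x0 + t * dx), py - (s.y0 + t * dy))
}

// A touch hits the region when it lands within `slop` pixels of any edge
// segment, including a region that wraps a corner like control a3.
fun hitsRegion(px: Float, py: Float, region: List<Segment>, slop: Float = 24f): Boolean =
    region.any { distanceToSegment(px, py, it) <= slop }
```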
Preferably, step S106 of recording the implementation gesture and implementation function of the visual control according to the fourth operation instruction of the user includes the following steps S301 to S304:
Step S301, detecting a click operation by the user on a visual control;
Step S302, when the click operation on the visual control is detected, displaying an angle control around the touch point of the click operation;
In this step, detecting that the user has clicked a certain visual control indicates that the user wants to define that specific control, so an angle control d is displayed around the touch point; its effect is shown in fig. 5;
Step S303, determining, according to a fifth operation instruction of the user, the angle range the user selects on the angle control as the touch angle, and popping up a number of function item labels e for the user to select from;
In this step, the display effect is as shown in fig. 6;
Step S304, taking the function corresponding to the function item label e selected by the user as the implementation function of the implementation gesture corresponding to the visual control, where the implementation gesture is: sliding inward from the edge of the touch-screen user interface within the touch angle.
In the above steps S303 to S304, the touch angle is the angular range within which a sliding gesture starting from the control's effective triggerable area is accepted. Suppose the implementation function for a certain control's 0-90° touch angle is set to "back": in actual use, the user slides from a point in that control's effective triggerable area toward the inside of the screen, and the function triggers only if the slide angle relative to the screen edge falls between 0° and 90°, so a 45° slide triggers "back" while a 100° slide does not. With this setting, several implementation functions can be bound to the same visual control; for example, the 0-90° range of a control can be defined as "back" and its 90-180° range as "return to desktop". A user can thus conveniently define multiple implementation functions at the same position to ease one-handed operation, for instance placing at the left and right edges of the touch screen two visual controls with the same implementation gestures, each realizing several implementation functions, so that the phone can be operated one-handed with either the left or the right hand.
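To make the angle logic concrete, the sketch below binds angle ranges to functions and resolves an inward swipe, assuming a control on the bottom edge with the angle measured counter-clockwise from the edge, so an inward swipe falls in 0-180°. All names and the example coordinates are assumptions:

```kotlin
import kotlin.math.atan2

class AngleBinding(val fromDeg: Double, val toDeg: Double, val action: () -> Unit)

// Resolves which bound function an inward swipe triggers (steps S303-S304).
// Bottom-edge assumption: y decreases toward the inside of the screen.
fun resolveSwipe(startX: Double, startY: Double, endX: Double, endY: Double,
                 bindings: List<AngleBinding>): (() -> Unit)? {
    val deg = Math.toDegrees(atan2(startY - endY, endX - startX))
    return bindings.firstOrNull { deg >= it.fromDeg && deg <= it.toDeg }?.action
}

fun main() {
    val bindings = listOf(
        AngleBinding(0.0, 90.0) { println("back") },               // 0-90 degrees
        AngleBinding(90.0, 180.0) { println("return to desktop") } // 90-180 degrees
    )
    // A 45-degree inward swipe from the bottom edge triggers "back";
    // a 100-degree swipe would fall in the second binding instead.
    resolveSwipe(100.0, 800.0, 150.0, 750.0, bindings)?.invoke()
}
```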
Throughout steps S301 to S304, the controller keeps a function button for ending the current definition mode displayed on the touch screen, for example a button labeled "gesture confirmation", as shown in fig. 6; when the user clicks this button, the gesture and function definitions of all visual controls are complete.
Preferably, the visual control is elongated, and step S102 of moving the corresponding visual control to the user's first designated position according to the first operation instruction of the user includes the following steps S401 to S403:
Step S401, moving the visual control along with the user's touch position according to the user's sliding operation, while continuously calculating the distances between the center of the visual control and the nearest horizontal edge and the nearest vertical edge of the touch-screen user interface;
Step S402, determining from these distances whether the edge closest to the center of the visual control is a horizontal edge or a vertical edge;
Step S403, adjusting the visual control to lie parallel to its nearest edge according to the result.
Through steps S401 to S403, the state of the visual control is conveniently fed back to the user in real time: when the user releases the control, the controller (in step S103) snaps it to the edge of the touch screen along the shortest straight line, so while adjusting the control's position the user can anticipate where it will attach before releasing it. This control scheme matches human intuition and gives a good experience.
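Steps S401 to S403 reduce to comparing the control centre's distance to the nearest horizontal edge with its distance to the nearest vertical edge. A minimal Kotlin sketch under the same top-left-origin assumption as above; the tie-break is an assumption, since the patent does not specify one:

```kotlin
// True when the dragged control should lie horizontally, i.e. its nearest
// edge is the top or bottom one (steps S401-S403); ties favour horizontal.
fun nearestEdgeIsHorizontal(cx: Float, cy: Float, width: Float, height: Float): Boolean {
    val dHorizontalEdge = minOf(cy, height - cy) // distance to top/bottom edge
    val dVerticalEdge = minOf(cx, width - cx)    // distance to left/right edge
    return dHorizontalEdge <= dVerticalEdge
}
```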
Preferably, after recording the effective triggerable area of the visual control according to the third operation instruction of the user, the method further includes the following step S501:
Step S501, establishing an association between two or more visual controls according to a sixth operation instruction of the user, and defining the operation instruction related to the association.
In this step, when the user later actually uses the mobile phone, sliding from a point in one visual control's effective triggerable area across the screen to a point in another visual control's effective triggerable area can trigger certain special operation instructions, such as locking the screen, turning on the flashlight or switching to landscape; the sketch below pictures such links, which are defined in steps S601 to S603 that follow.
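A hedged Kotlin sketch of such cross-control links: a registry keyed by ordered pairs of control identifiers, consulted when a swipe starts in one control's effective triggerable area and ends in another's. The names and example actions are assumptions, not from the patent:

```kotlin
// Registry of cross-control gestures (steps S601-S603); names are illustrative.
data class ControlLink(val fromId: String, val toId: String)

class LinkRegistry {
    private val links = mutableMapOf<ControlLink, () -> Unit>()

    fun define(fromId: String, toId: String, action: () -> Unit) {
        links[ControlLink(fromId, toId)] = action
    }

    // Fires the bound action for a swipe that starts in one control's
    // effective triggerable area and ends in another's.
    fun dispatch(fromId: String, toId: String) = links[ControlLink(fromId, toId)]?.invoke()
}

fun main() {
    val registry = LinkRegistry()
    registry.define("a1", "a3") { println("lock screen") }     // example special instruction
    registry.define("a3", "a1") { println("open flashlight") }
    registry.dispatch("a1", "a3") // prints "lock screen"
}
```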
Specifically, establishing the association between two or more visual controls according to the sixth operation instruction of the user and defining the operation instruction associated with it includes the following steps S601 to S603:
Step S601, establishing the association between two or more visual controls according to the user's sliding connection instruction;
Step S602, outputting a number of function item labels to the touch-screen user interface for the user to select from;
Step S603, taking the function corresponding to the function item label selected by the user as the implementation function of the specific gesture operation performed between the two or more associated visual controls.
Preferably, after recording the implementation gesture and implementation function of the visual control according to the fourth operation instruction of the user, the method further includes the following step S701:
step S701, hiding the visual control.
Through this step, the visual controls are hidden when the mobile phone is actually used, so they do not block content on the touch screen, and the user operates from the remembered approximate positions of the controls to trigger the desired implementation functions.
Further, before or after outputting the definable visual controls to the touch-screen user interface in step S101, the method further includes the following step S801:
Step S801, outputting to the touch-screen user interface a number of recommended positions b for fixing the visual controls, where the recommended positions b are highlighted and a quick-setting control f is arranged near each recommended position;
in this step, the effect of setting the recommended position b is shown in fig. 2.
Based on this, step S102 of moving the corresponding visual control to the user's first designated position according to the first operation instruction of the user includes the following steps S802 to S803:
Step S802, when a click by the user on a quick-setting control f is detected, moving a visual control to the recommended position corresponding to that quick-setting control f;
Step S803, when a press-and-slide operation by the user on a visual control is detected, moving the visual control along with the user's touch position.
Through these steps, the user can choose to define the positions of the visual controls quickly.
In addition, the user can increase or decrease the number of visual controls as needed. Specifically, the controller displays two function buttons on the touch screen, one named "add" and one named "delete"; when the user clicks the "add" button, a visual control is added to the currently displayed content, and when the user clicks a visual control and then clicks the "delete" button, the clicked visual control is deleted.
Embodiment Two
This embodiment relates to a computing terminal comprising a processor and a memory; the memory is used to store an executable program, and the processor executes the executable program to implement the method for customizing full-screen mobile phone operation gestures of Embodiment One.
Embodiment Three
This embodiment relates to a storage medium, such as a flash memory, a hard disk, a multimedia card, a card-type memory (e.g. SD or DX memory), a random-access memory (RAM), a static random-access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, a server, or an app store, on which an executable program is stored; when executed, the program implements the method for customizing full-screen mobile phone operation gestures of Embodiment One.
The above describes only preferred embodiments of the present invention. It should be noted that those skilled in the art can make various improvements and modifications without departing from the principle of the invention, and such improvements and modifications shall also fall within the protection scope of the invention.

Claims (8)

1. A method for customizing full-screen mobile phone operation gestures, characterized by comprising the following steps:
outputting a number of definable visual controls to a touch-screen user interface;
moving the corresponding visual control to the user's first designated position according to a first operation instruction of the user;
judging whether the first designated position is located at an edge of the touch-screen user interface; if so, entering the next step, and if not, moving the visual control to the edge of the touch-screen user interface along the shortest straight-line path;
locking the position of the visual control according to a second operation instruction of the user;
recording the effective triggerable area of the visual control according to a third operation instruction of the user;
and recording the implementation gesture and implementation function of the visual control according to a fourth operation instruction of the user.
2. The method for customizing full-screen mobile phone operation gestures according to claim 1, characterized in that recording the effective triggerable area of the visual control according to the third operation instruction of the user comprises:
displaying a number of movable controls on the periphery of the visual control on the touch-screen user interface;
and moving the corresponding movable control according to the third operation instruction of the user, so that the length to which the visual control extends along the edge of the touch-screen user interface is changed.
3. The method for customizing full-screen mobile phone operation gestures according to claim 1, characterized in that recording the implementation gesture and implementation function of the visual control according to the fourth operation instruction of the user comprises:
detecting a click operation by the user on the visual control;
when the click operation on the visual control is detected, displaying an angle control around the touch point of the click operation;
determining, according to a fifth operation instruction of the user, the angle range the user selects on the angle control as the touch angle, and popping up a number of function item labels for the user to select from;
and taking the function corresponding to the function item label selected by the user as the implementation function of the implementation gesture corresponding to the visual control, where the implementation gesture is: sliding inward from the edge of the touch-screen user interface within the touch angle.
4. The method for customizing full-screen mobile phone operation gestures according to claim 1, characterized in that the visual control is elongated, and moving the corresponding visual control to the user's first designated position according to the first operation instruction of the user comprises:
moving the visual control along with the user's touch position according to the user's sliding operation, while continuously calculating the distances between the center of the visual control and the nearest horizontal edge and the nearest vertical edge of the touch-screen user interface;
determining from these distances whether the edge closest to the center of the visual control is a horizontal edge or a vertical edge;
and adjusting the visual control to lie parallel to its nearest edge according to the result.
5. The method for customizing full-screen mobile phone operation gestures according to claim 1, characterized in that, after recording the implementation gesture and implementation function of the visual control according to the fourth operation instruction of the user, the method further comprises:
hiding the visual control.
6. The method for customizing full-screen mobile phone operation gestures according to claim 1, characterized in that, before or after outputting the number of definable visual controls to the touch-screen user interface, the method further comprises:
outputting to the touch-screen user interface a number of recommended positions for fixing the visual controls, where the recommended positions are highlighted and a quick-setting control is arranged near each recommended position;
and moving the corresponding visual control to the user's first designated position according to the first operation instruction of the user comprises:
when a click by the user on a quick-setting control is detected, moving a visual control to the recommended position corresponding to that quick-setting control;
and when a press-and-slide operation by the user on a visual control is detected, moving the visual control along with the user's touch position.
7. A computing terminal, characterized in that the computing terminal comprises a processor and a memory;
the memory is used to store an executable program;
and the processor is configured to execute the executable program to implement the method for customizing full-screen mobile phone operation gestures according to any one of claims 1 to 6.
8. A storage medium, characterized in that an executable program is stored on the storage medium; when executed, the executable program implements the method for customizing full-screen mobile phone operation gestures according to any one of claims 1 to 6.
CN201911092325.6A 2019-11-11 2019-11-11 User-defined method of full-screen mobile phone operation gesture, computing terminal and storage medium Withdrawn CN110874142A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911092325.6A CN110874142A (en) 2019-11-11 2019-11-11 User-defined method of full-screen mobile phone operation gesture, computing terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911092325.6A CN110874142A (en) 2019-11-11 2019-11-11 User-defined method of full-screen mobile phone operation gesture, computing terminal and storage medium

Publications (1)

Publication Number Publication Date
CN110874142A (en) 2020-03-10

Family

ID=69717950

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911092325.6A Withdrawn CN110874142A (en) 2019-11-11 2019-11-11 User-defined method of full-screen mobile phone operation gesture, computing terminal and storage medium

Country Status (1)

Country Link
CN (1) CN110874142A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022041606A1 * 2020-08-31 2022-03-03 珠海格力电器股份有限公司 Method and apparatus for adjusting display position of control
CN112148193A * 2020-09-30 2020-12-29 维沃移动通信有限公司 Navigation gesture setting method and device and electronic equipment
WO2022068725A1 * 2020-09-30 2022-04-07 维沃移动通信有限公司 Navigation gesture setting method and apparatus, and electronic device


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication (application publication date: 20200310)