CN102520860B - Method and mobile terminal for desktop display control - Google Patents
Method and mobile terminal for desktop display control
- Publication number
- CN102520860B CN102520860B CN201110409344.4A CN201110409344A CN102520860B CN 102520860 B CN102520860 B CN 102520860B CN 201110409344 A CN201110409344 A CN 201110409344A CN 102520860 B CN102520860 B CN 102520860B
- Authority
- CN
- China
- Prior art keywords
- display
- target operation
- operation area
- control
- area
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention discloses a method and a mobile terminal for desktop display control. The method includes: after the mobile terminal detects a touch screen event of a user on a touch display screen, a target operation area is determined according to the touch screen event and is displayed enlarged, reduced, or translated, the target operation area being the whole desktop or a local desktop. With this solution, the whole desktop or a local desktop can be dragged, enlarged, or reduced so that the region the user needs to touch is placed at a position convenient to click, misoperation is avoided, and the user experience is improved.
Description
Technical Field
The invention relates to the field of mobile terminals, in particular to a method for controlling desktop display and a mobile terminal.
Background
With advances in technology, touch screen mobile terminals, which replace conventional key operation with touch screen operation, have come into wide use. Existing touch screen mobile phones support operations such as clicking and sliding. When the screen of a touch screen terminal is large (for example, 4.3 inches) and the user operates it with one hand, some icons or virtual keys are inconvenient to reach and click, which can lead to misoperation.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a method for controlling desktop display and a mobile terminal that make it convenient for a user to touch the touch screen accurately.
In order to solve the technical problem, the invention provides a method for controlling desktop display, wherein after a mobile terminal detects a touch screen event of a user on a touch display screen, a target operation area is determined according to the touch screen event and is subjected to amplification display, reduction display or translation display, and the target operation area is a whole desktop or a local desktop.
Further, the method can also have the following characteristics:
after the mobile terminal detects a long-time pressing operation in a touch screen event, when the starting point of the long-time pressing operation is judged to correspond to a non-control position on a display screen, the whole desktop is used as a target operation area selected by a user;
after the mobile terminal detects a long press operation in a touch screen event, when the starting point of the long press operation is judged to correspond to the control position on the display screen, the control area corresponding to the control position is used as a target operation area selected by a user;
after the mobile terminal detects that historical points in a touch screen event form a closed area, the corresponding closed area on the desktop is taken as the target operation area selected by the user.
Further, the method can also have the following characteristics:
and after the mobile terminal determines a target operation area, displaying the target operation area as a selected state.
Further, the method can also have the following characteristics:
after the mobile terminal determines a target operation area, displaying a reduction control and an amplification control on a touch display screen, carrying out reduction display on the target operation area after detecting a short-press operation at a position corresponding to the reduction control, and carrying out amplification display on the target operation area after detecting the short-press operation at a position corresponding to the amplification control;
or,
after the mobile terminal determines a target operation area and detects a multi-point movement operation in which the contacts move toward each other, the target operation area is displayed in a reduced mode corresponding to the stroke length; after a multi-point movement operation in which the contacts move apart is detected, the target operation area is displayed in an enlarged mode corresponding to the stroke length.
Further, the method can also have the following characteristics:
after the mobile terminal determines a target operation area and detects single-point movement operation or multi-point equidirectional movement operation, the mobile terminal performs translation display on the target operation area, wherein the translation display is in the same direction as the movement operation direction and corresponds to the stroke length.
In order to solve the above technical problems, the present invention provides a mobile terminal for performing desktop display control, comprising a central processing module, a user interface management module, and a human-machine interface module for detecting a user's operation on a touch display screen and displaying a desktop on the touch display screen, wherein,
the central processing module is used for acquiring a touch screen event of a user on a touch display screen through the human-computer interface module, determining a target operation area according to the touch screen event and controlling the user interface management module to perform amplification display, reduction display or translation display on the target operation area, wherein the target operation area is a whole desktop or a local desktop;
the user interface management module is used for controlling the human-computer interface module to display the target operation area according to the instruction of the central processing module, and is also used for supporting the amplification display, reduction display or translation display of the whole desktop and the amplification display, reduction display or translation display of a local desktop.
Further, the mobile terminal may further have the following characteristics:
the central processing module is further used for taking the whole desktop as a target operation area selected by a user when judging that the starting point of the long press operation corresponds to the position of the non-control on the display screen after the long press operation in the touch screen event is detected by the man-machine interface module; or, the control device is further configured to, after detecting a long press operation in a touch screen event through the human-computer interface module, when judging that a starting point of the long press operation corresponds to a control position on the display screen, take a control area corresponding to the control position as a target operation area selected by a user; and the method is also used for taking the corresponding closed area on the desktop as a target operation area selected by the user after detecting that the historical points in the touch screen event form the closed area through the human-computer interface module.
Further, the mobile terminal may further have the following characteristics:
the central processing module is further used for displaying the target operation area as a selected state after the target operation area is determined.
Further, the mobile terminal may further have the following characteristics:
the central processing module is further configured to display a zoom-out control and a zoom-in control on the touch display screen through the user interface management module after a target operation area is determined, perform zoom-out display on the target operation area through the user interface management module after a short-press operation is detected at the position corresponding to the zoom-out control through the human-computer interface module, and perform zoom-in display on the target operation area through the user interface management module after a short-press operation is detected at the position corresponding to the zoom-in control through the human-computer interface module; or, after the target operation area is determined, when the human-computer interface module detects a multi-point movement operation in which the contacts move toward each other, perform, through the user interface management module, a reduced display of the target operation area corresponding to the stroke length, and when the human-computer interface module detects a multi-point movement operation in which the contacts move apart, perform, through the user interface management module, an enlarged display of the target operation area corresponding to the stroke length.
Further, the mobile terminal may further have the following characteristics:
the central processing module is further configured to, after determining a target operation area and detecting a single-point movement operation or a multi-point equidirectional movement operation through the human-computer interface module, perform, through the user interface management module, translation display on the target operation area in the same direction as a movement operation direction and corresponding to a stroke length.
Through the scheme of the invention, the whole desktop or a local desktop can be dragged, enlarged or reduced, the area that the user needs to touch is placed at a position convenient for the user to click, misoperation is avoided, and the user experience is improved.
Drawings
FIG. 1 is a block diagram showing the constituent modules of a mobile terminal according to an embodiment;
FIG. 2 is a diagram illustrating an embodiment of a mobile terminal with additional function options;
FIG. 3 is a diagram illustrating a display of an original screen of a mobile terminal according to an exemplary embodiment;
FIG. 4 is a diagram illustrating an example of a desktop after zooming;
FIG. 5 is a diagram illustrating an example of a desktop after translating;
FIG. 6 is a diagram illustrating a position of an original virtual keyboard of a mobile terminal on a desktop in example two;
FIG. 7 is a schematic diagram illustrating a position of a virtual keyboard of a mobile terminal in an enlarged scale on a desktop in example two;
fig. 8 is a schematic diagram illustrating a position of a virtual keyboard of a mobile terminal after the virtual keyboard is translated on a desktop in example two.
Detailed Description
As shown in FIG. 1, the mobile terminal comprises a human-machine interface module 101, a user interface management module 102, a central processing module 103, and a program storage module 104.
The human-computer interface module 101 is used for detecting the user's operation on the touch display screen and displaying the desktop on the touch display screen; it is also used for calling the pictures and interfaces of the program storage module, displaying the corresponding interface on the screen, and waiting for user operation. The function of this module is the same as that of the human-machine interface module in the prior art.
The user interface management module 102 is configured to control the human-machine interface module to display the target operation area according to an instruction of the central processing module, and is further configured to support enlargement display, reduction display, or translation display of the whole desktop, and enlargement display, reduction display, or translation display of a local desktop.
The central processing module 103 is configured to acquire a touch screen event of a user on the touch display screen through the human-computer interface module, determine a target operation area according to the touch screen event, and control the user interface management module to perform enlargement display, reduction display, or translation display on the target operation area, where the target operation area is a whole desktop or a local desktop.
The program storage module 104 is used for storing pictures, data, menus and display interfaces required by the mobile phone; in addition, an operating system, application functions, data files and the like of the mobile phone are also stored. The function of this module is the same as that of the program storage module of the prior art.
By operating the touch screen, the user lets the mobile terminal know the intended target operation area.
When the user long-presses a non-control position on the display screen, this indicates that the target operation area is the whole desktop. After detecting a long-press operation in a touch screen event through the human-computer interface module 101, the central processing module 103 takes the whole desktop as the target operation area selected by the user when it judges that the starting point of the long-press operation corresponds to a non-control position on the display screen.
When the user long-presses a control position on the display screen, this indicates that the target operation area is that control. After detecting a long-press operation in a touch screen event through the human-computer interface module 101, the central processing module 103 takes the control area corresponding to the control position as the target operation area selected by the user when it judges that the starting point of the long-press operation corresponds to that control position on the display screen.
When the user traces a closed area on the display screen, for example an area enclosing several controls, the target operation area is that closed area. The central processing module 103 is further configured to, after detecting through the human-computer interface module that the historical points in a touch screen event form a closed area, take the corresponding closed area on the desktop as the target operation area selected by the user.
After determining the target operation area, the central processing module 103 displays the target operation area in a selected state. The selected state may be indicated in various ways, such as displaying a border at the edge of the target operation area, displaying the target operation area in a certain color or with a transparency effect, or other ways of marking the target operation area.
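The selection logic described above can be sketched in code. The following is an illustrative sketch only, since the patent publishes no code: the names (Point, Control, TouchGesture, TargetArea, determineTargetArea) and the closure tolerance are assumptions made for the example.

```kotlin
import kotlin.math.hypot

// Hypothetical model of the three selection rules; names and the 20 px closure
// tolerance are illustrative assumptions, not taken from the patent.
data class Point(val x: Float, val y: Float)

data class Control(val id: String, val left: Float, val top: Float,
                   val right: Float, val bottom: Float) {
    fun contains(p: Point) = p.x in left..right && p.y in top..bottom
}

sealed class TouchGesture {
    data class LongPress(val start: Point) : TouchGesture()
    data class Trace(val historyPoints: List<Point>) : TouchGesture() // finger path
}

sealed class TargetArea {
    object WholeDesktop : TargetArea()                                // long press off any control
    data class ControlArea(val control: Control) : TargetArea()       // long press on a control
    data class ClosedRegion(val outline: List<Point>) : TargetArea()  // traced closed area
}

fun determineTargetArea(gesture: TouchGesture, controls: List<Control>): TargetArea? =
    when (gesture) {
        is TouchGesture.LongPress ->
            controls.firstOrNull { it.contains(gesture.start) }
                ?.let { TargetArea.ControlArea(it) }
                ?: TargetArea.WholeDesktop
        is TouchGesture.Trace -> {
            val pts = gesture.historyPoints
            // The history points form a closed area when the path returns close to its start.
            val closed = pts.size > 2 &&
                hypot(pts.first().x - pts.last().x, pts.first().y - pts.last().y) < 20f
            if (closed) TargetArea.ClosedRegion(pts) else null
        }
    }
```

The returned area would then be shown in a selected state, for example with a border or tint, as described above.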
There are two ways for the user to zoom the target operation area by operating the touch screen.
In a first mode, after determining a target operation area, the central processing module 103 displays a zoom-out control and a zoom-in control on the touch display screen through the user interface management module, and after detecting a short-press operation at a position corresponding to the zoom-out control through the human-computer interface module, performs zoom-out display on the target operation area through the user interface management module, and after detecting a short-press operation at a position corresponding to the zoom-in control through the human-computer interface module, performs zoom-in display on the target operation area through the user interface management module.
In the second mode, after determining a target operation area, when the central processing module 103 detects through the human-computer interface module a multi-point movement operation in which the contacts move toward each other, it performs, through the user interface management module, a reduced display of the target operation area corresponding to the stroke length; when it detects through the human-computer interface module a multi-point movement operation in which the contacts move apart, it performs, through the user interface management module, an enlarged display of the target operation area corresponding to the stroke length.
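Both zoom modes can be sketched in the same spirit; the step size, the clamping bounds, and names such as ZoomState and onPinch are assumptions for illustration, not values taken from the patent.

```kotlin
import kotlin.math.hypot

// Illustrative sketch only; STEP and the 0.25..4.0 scale bounds are assumed values.
data class Point(val x: Float, val y: Float) // same shape as in the previous sketch

data class ZoomState(var scale: Float = 1f) {
    fun clamp() { scale = scale.coerceIn(0.25f, 4f) }
}

const val STEP = 0.25f

// Mode 1: short press on the displayed zoom-out / zoom-in controls changes the scale by a fixed step.
fun onZoomOutControl(state: ZoomState) { state.scale -= STEP; state.clamp() }
fun onZoomInControl(state: ZoomState) { state.scale += STEP; state.clamp() }

// Mode 2: contacts moving toward each other reduce the area, contacts moving apart enlarge it;
// the change corresponds to the stroke length, i.e. how much the distance between contacts changed.
fun onPinch(state: ZoomState, startA: Point, startB: Point, curA: Point, curB: Point) {
    val d0 = hypot(startA.x - startB.x, startA.y - startB.y)
    val d1 = hypot(curA.x - curB.x, curA.y - curB.y)
    if (d0 > 0f) { state.scale *= d1 / d0; state.clamp() }
}
```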
The way for the user to translate the target operation area by operating the touch screen is as follows: after the central processing module 103 determines a target operation area and detects a single-point movement operation or a multi-point equidirectional movement operation through the human-computer interface module, the user interface management module performs a translation display of the target operation area in the same direction as the movement operation and corresponding to the stroke length.
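A corresponding sketch for the translation rule, again purely illustrative, with PanState and onDrag as assumed names:

```kotlin
// Illustrative only: the displayed translation follows the drag in the same
// direction and by the stroke length (current point minus start point).
data class PanState(var offsetX: Float = 0f, var offsetY: Float = 0f)

fun onDrag(state: PanState, startX: Float, startY: Float, currentX: Float, currentY: Float) {
    state.offsetX = currentX - startX
    state.offsetY = currentY - startY
}

fun main() {
    val pan = PanState()
    // Single-point drag from (100, 400) to (260, 300): the target area is drawn
    // shifted 160 px to the right and 100 px up, matching the stroke.
    onDrag(pan, 100f, 400f, 260f, 300f)
    println("offset = (${pan.offsetX}, ${pan.offsetY})") // offset = (160.0, -100.0)
}
```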
The method for controlling desktop display comprises the following steps: after detecting a touch screen event of a user on a touch display screen, the mobile terminal determines a target operation area according to the touch screen event and performs amplification display, reduction display or translation display on the target operation area, wherein the target operation area is a whole desktop or a local desktop.
When a user presses a non-control position on the display screen for a long time, the whole desktop is taken as a target operation area selected by the user. After detecting a long-time pressing operation in a touch screen event, the mobile terminal takes the whole desktop as a target operation area selected by a user when judging that the starting point of the long-time pressing operation corresponds to a non-control position on a display screen;
when the user presses the position of the control on the display screen for a long time, the control is taken as the target operation area selected by the user. After detecting a long-time pressing operation in a touch screen event, the mobile terminal judges that the starting point of the long-time pressing operation corresponds to a control position on a display screen, and takes a control area corresponding to the control position as a target operation area selected by a user;
the user uses a finger to stroke the closed area on the display screen, and the closed area is used as a target operation area selected by the user. After the mobile terminal detects that history points in a touch screen event form a closed area, taking the corresponding closed area on the desktop as a target operation area selected by a user.
And after the mobile terminal determines a target operation area, displaying the target operation area as a selected state. The display of the selected state may be performed in various ways, such as displaying a border at the edge of the target operation area, displaying the target operation area in a certain color or transparent effect, and indicating other ways of the target operation area.
There are two ways for the user to zoom the target operation area by operating the touch screen.
In the first mode, after the mobile terminal determines a target operation area, a zoom-out control and a zoom-in control are displayed on a touch display screen, the target operation area is displayed in a zoom-out mode after a short-press operation is detected at a position corresponding to the zoom-out control, and the target operation area is displayed in a zoom-in mode after the short-press operation is detected at a position corresponding to the zoom-in control;
in the second mode, the user moves two fingers toward each other on the display screen to indicate that the target operation area should be reduced, or moves two fingers apart to indicate that it should be enlarged. After detecting a multi-point movement operation in which the contacts move toward each other, the mobile terminal performs a reduced display of the target operation area corresponding to the stroke length; after detecting a multi-point movement operation in which the contacts move apart, it performs an enlarged display of the target operation area corresponding to the stroke length.
The user may indicate the direction in which the user wishes to pan by swiping in the target direction with one finger, or by swiping in the same direction with multiple fingers. And after detecting the single-point movement operation or the multi-point equidirectional movement operation, the mobile terminal performs translation display on the target operation area in the same direction as the movement operation direction and corresponding to the stroke length.
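Taken together, the gestures of this section map onto two display actions; the following dispatcher is a hypothetical illustration of that mapping, not part of the patent text.

```kotlin
// Hypothetical mapping of recognised gestures to display actions; gesture
// recognition itself (long press, contact tracking) is assumed to happen elsewhere.
sealed class Gesture {
    data class SinglePointMove(val dx: Float, val dy: Float) : Gesture()
    data class MultiPointSameDirection(val dx: Float, val dy: Float) : Gesture()
    data class ContactsTowardEachOther(val distanceRatio: Float) : Gesture() // ratio < 1
    data class ContactsApart(val distanceRatio: Float) : Gesture()           // ratio > 1
}

sealed class DisplayAction {
    data class Translate(val dx: Float, val dy: Float) : DisplayAction()
    data class Scale(val factor: Float) : DisplayAction()
}

fun dispatch(gesture: Gesture): DisplayAction = when (gesture) {
    // Single-point or multi-point equidirectional movement translates the target area.
    is Gesture.SinglePointMove -> DisplayAction.Translate(gesture.dx, gesture.dy)
    is Gesture.MultiPointSameDirection -> DisplayAction.Translate(gesture.dx, gesture.dy)
    // Contacts moving toward each other reduce, contacts moving apart enlarge.
    is Gesture.ContactsTowardEachOther -> DisplayAction.Scale(gesture.distanceRatio)
    is Gesture.ContactsApart -> DisplayAction.Scale(gesture.distanceRatio)
}
```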
According to this scheme, the whole desktop or a part of it can be dragged or zoomed through touch screen operations, so that the user can use the touch terminal more conveniently and quickly, providing a new user experience.
As shown in fig. 2, the central processing module converts the user's operation on the interface into an operation on the application corresponding to the graphic management system, calls the corresponding interface management module, refreshes the display result to the screen buffer, and displays it on the display screen. The function options added by the invention are translation of the whole desktop, translation of a local desktop, zooming of the whole desktop, and zooming of a local desktop.
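The "whether the operation is allowed" checks that appear in the flows below can be modelled as a simple capability set whose entries mirror the four function options just listed; the code itself is an assumption for illustration.

```kotlin
// Illustrative sketch; DesktopOption and DesktopSettings are assumed names.
enum class DesktopOption {
    TRANSLATE_WHOLE_DESKTOP,
    TRANSLATE_LOCAL_DESKTOP,
    ZOOM_WHOLE_DESKTOP,
    ZOOM_LOCAL_DESKTOP
}

class DesktopSettings(private val enabled: Set<DesktopOption>) {
    // "Operation on the whole desktop is allowed" when either whole-desktop option is enabled.
    fun wholeDesktopAllowed() =
        DesktopOption.TRANSLATE_WHOLE_DESKTOP in enabled ||
        DesktopOption.ZOOM_WHOLE_DESKTOP in enabled

    // "Operation on the local desktop is allowed" when either local-desktop option is enabled.
    fun localDesktopAllowed() =
        DesktopOption.TRANSLATE_LOCAL_DESKTOP in enabled ||
        DesktopOption.ZOOM_LOCAL_DESKTOP in enabled

    fun allows(option: DesktopOption) = option in enabled
}
```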
The following is a detailed description of a specific flow.
In the following examples, the terminal detects a touch screen event in real time in the process.
Example 1, the process of the user translating the whole desktop includes:
step 1, a user presses a non-control position of a desktop for a long time;
step 2, detecting a long press operation in a touch screen event, wherein the position of the long press operation is a non-control position, judging whether to allow the operation of the whole desktop (namely judging whether to include options of translating the whole desktop or zooming the whole desktop), if so, executing the next step, otherwise, processing according to a normal flow;
step 3, displaying the whole desktop in a selected state;
step 4, after detecting the translation operation of one contact or the translation operations of a plurality of contacts along the same direction, judging whether the translation operation of the desktop is allowed (namely judging whether the translation of the whole desktop is included), if so, executing the next step, otherwise, processing according to the normal flow;
step 5, recording a translation starting point and a translation current end point, and drawing the visual effect of desktop translation.
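A compact, purely illustrative sketch of this flow follows; the class and method names are assumptions, and the drawing step is reduced to tracking an offset.

```kotlin
// Illustrative state machine for Example 1: long press on a non-control position
// selects the whole desktop, a subsequent drag translates it by the stroke.
data class Pt(val x: Float, val y: Float)

class WholeDesktopPanFlow(private val wholeDesktopTranslationAllowed: Boolean) {
    private var desktopSelected = false      // step 3: desktop shown in a selected state
    private var panStart: Pt? = null
    var offsetX = 0f                         // step 5: current translation being drawn
        private set
    var offsetY = 0f
        private set

    // Steps 1-3: long press detected at a non-control position.
    fun onLongPressAtNonControl() {
        if (wholeDesktopTranslationAllowed) desktopSelected = true
    }

    // Steps 4-5: one contact (or several moving the same way) drags the desktop.
    fun onMove(current: Pt) {
        if (!desktopSelected) return
        val start = panStart ?: current.also { panStart = it }
        offsetX = current.x - start.x
        offsetY = current.y - start.y
    }

    // Touch release ends the drawing and keeps the last translated desktop.
    fun onRelease() {
        panStart = null
        desktopSelected = false
    }
}
```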
Example 2, the process of the user scaling the whole desktop includes:
step 1, a user presses a non-control position of a desktop for a long time;
step 2, detecting a long press operation in a touch screen event, wherein the position of the long press operation is a non-control position, judging whether to allow the operation of the whole desktop (namely judging whether to include options of translating the whole desktop or zooming the whole desktop), if so, executing the next step, otherwise, processing according to a normal flow;
step 3, displaying the whole desktop in a selected state;
step 4, detecting a movement operation of the two contacts moving apart, judging whether zooming of the desktop is allowed (namely judging whether the option of zooming the whole desktop is included), if so, executing the next step, otherwise, processing according to a normal flow;
step 5, recording the starting point and the current end point of the contact movement, and drawing the enlarged visual effect of the desktop.
Example 3, the process of the user translating the local desktop includes:
step 1, a user defines a closed area or presses a control for a long time;
step 2, detecting a moving operation in a touch event and forming a closed area by history points, or detecting a long-time pressing operation in the touch event and the long-time pressing position is a control position, judging whether to allow the operation on the local desktop (namely judging whether to include an option of translating the local desktop or zooming the local desktop), if so, executing the next step, otherwise, processing according to a normal flow;
step 3, displaying the control or the closed area as a selected state;
step 4, detecting the translation operation of one contact or the translation operations of a plurality of contacts along the same direction, judging whether the translation operation of the local desktop is allowed (namely judging whether the translation operation of the local desktop is included), if so, executing the next step, otherwise, processing according to a normal flow;
step 5, recording a translation starting point and a translation current end point, and drawing the translation visual effect of the local desktop (namely the control or the closed area).
Example 4, the process of the user enlarging the partial desktop includes:
step 1, a user defines a closed area or presses a control for a long time;
step 2, detecting a moving operation in a touch event and forming a closed area by history points, or detecting a long-time pressing operation in the touch event and the long-time pressing position is a control position, judging whether to allow the operation on the local desktop (namely judging whether to include an option of translating the local desktop or zooming the local desktop), if so, executing the next step, otherwise, processing according to a normal flow;
step 3, displaying the control or the closed area as a selected state;
step 4, detecting a movement operation of the two contacts moving apart, judging whether zooming of the local desktop is allowed (namely judging whether the option of zooming the local desktop is included), if so, executing the next step, otherwise, processing according to a normal flow;
step 5, recording the starting point and the current end point of the contact movement, and drawing the enlarged visual effect of the local desktop (namely the control or the closed area).
Example 5, the process of the user enlarging or reducing the local desktop includes:
step 1, a user defines a closed area or presses a control for a long time;
step 2, detecting a moving operation in a touch event and forming a closed area by history points, or detecting a long-time pressing operation in the touch event and the long-time pressing position is a control position, judging whether to allow the operation on the local desktop (namely judging whether to include an option of translating the local desktop or zooming the local desktop), if so, executing the next step, otherwise, processing according to a normal flow;
step 3, displaying the control or the closed area as a selected state;
step 4, detecting a movement operation of the two contacts moving apart, judging whether zooming of the local desktop is allowed (namely judging whether the option of zooming the local desktop is included), if so, executing the next step, otherwise, processing according to a normal flow;
step 5, recording the starting point and the current end point of the contact movement, and drawing the enlarged visual effect of the local desktop (namely the control or the closed area);
step 6, detecting a movement operation of the two contacts moving toward each other, judging whether zooming of the local desktop is allowed (namely judging whether the option of zooming the local desktop is included), if so, executing the next step, otherwise, processing according to a normal flow;
step 7, drawing the reduced visual effect of the local desktop (namely the control or the closed area) on top of the enlarged state, taking the end point of the contact movement in the zoom-in process as the starting point and the current point of the zoom-out movement as the end point.
In the above example, after the terminal detects the touch release event, the terminal ends the drawing of the translation or zoom, keeps the last drawing of the desktop formed by the translation or zoom operation, and waits for the user operation.
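The enlarge-then-reduce flow of example 5 and the release handling just described can be combined in one small sketch; the names and structure below are assumptions, not taken from the patent.

```kotlin
import kotlin.math.hypot

// Illustrative only: each two-contact gesture scales the local area relative to the
// scale already committed by earlier gestures; releasing the touch keeps the result.
class LocalAreaZoom {
    var committedScale = 1f        // scale kept from finished gestures
        private set
    var displayedScale = 1f        // what is currently drawn on screen
        private set
    private var gestureStartDistance: Float? = null

    fun onTwoContactMove(ax: Float, ay: Float, bx: Float, by: Float) {
        val d = hypot(ax - bx, ay - by)
        val d0 = gestureStartDistance ?: d.also { gestureStartDistance = it }
        // Contacts moving apart (d > d0) enlarge; contacts moving toward each
        // other (d < d0) reduce, drawn on top of the already committed scale.
        if (d0 > 0f) displayedScale = committedScale * (d / d0)
    }

    // Touch release: stop drawing and keep the last drawn state of the area.
    fun onRelease() {
        committedScale = displayedScale
        gestureStartDistance = null
    }
}
```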
The invention is explained in detail below with reference to the drawings.
Example one, operate on a desktop example.
As shown in fig. 3, the original display of the desktop is shown: the thick line represents the edge of the screen, and the shaded part represents the desktop. As shown in fig. 4, the user slides two fingers apart on the screen, and the terminal displays the desktop enlarged. As shown in fig. 5, the user presses the non-control area at the upper left corner of the screen and moves toward the lower right to indicate that the original desktop should be translated; the double-line arrow indicates the direction and distance of the movement, so that the target the user needs to click, originally located at the upper left corner of the screen, moves to the middle of the screen where it is convenient to click. The blank area left after dragging the desktop can be displayed in a solid color or with a picture or animation preset by the terminal. The user can move the whole screen in any direction according to need. During the movement of the desktop, all icons and controls on the desktop move synchronously with the interface.
Example two, operate example on control.
When only one control needs to be selected, the user can directly long-press the control; when several controls need to be selected, the user can trace a circle on the screen to define the area to be selected.
As shown in fig. 6, the desktop is an input interface and the virtual keyboard serves as a control. As shown in fig. 7, after the user long-presses the control area and then moves two fingers apart, the mobile terminal displays the virtual keyboard enlarged. As shown in fig. 8, after long-pressing the control area, the user can swipe on the screen to translate the virtual keyboard to another position on the display screen; the blank area left after dragging can be displayed as an extension of the adjacent area (e.g., the input interface), or can be replaced with a special background or animation.
It should be noted that the embodiments and features of the embodiments in the present application may be arbitrarily combined with each other without conflict.
The present invention is capable of other embodiments, and various changes and modifications may be made by one skilled in the art without departing from the spirit and scope of the invention.
It will be understood by those skilled in the art that all or part of the steps of the above methods may be implemented by instructing the relevant hardware through a program, and the program may be stored in a computer readable storage medium, such as a read-only memory, a magnetic or optical disk, and the like. Alternatively, all or part of the steps of the above embodiments may be implemented using one or more integrated circuits. Accordingly, each module/unit in the above embodiments may be implemented in the form of hardware, and may also be implemented in the form of a software functional module. The present invention is not limited to any specific form of combination of hardware and software.
Claims (8)
1. A method for performing desktop display control, wherein,
after detecting a touch screen event of a user on a touch display screen, a mobile terminal determines a target operation area according to the touch screen event and performs amplification display, reduction display or translation display on the target operation area, wherein the target operation area is a whole desktop or a local desktop;
the local desktop is the control area corresponding to a control position when, after the mobile terminal detects a long press operation in a touch screen event, the starting point of the long press operation is judged to correspond to that control position on the display screen;
the whole desktop is the entire desktop, taken as the target operation area when, after the mobile terminal detects a long press operation in a touch screen event, the starting point of the long press operation is judged to correspond to a non-control position on the display screen;
specifically, after the mobile terminal determines a target operation area and detects a single-point movement operation or a multi-point equidirectional movement operation, the mobile terminal performs translation display on the target operation area in the same direction as the movement operation direction and corresponding to the stroke length.
2. The method of claim 1,
after the mobile terminal detects a long-time pressing operation in a touch screen event, when the starting point of the long-time pressing operation is judged to correspond to a non-control position on a display screen, the whole desktop is used as a target operation area selected by a user;
after the mobile terminal detects a long press operation in a touch screen event, when the starting point of the long press operation is judged to correspond to the control position on the display screen, the control area corresponding to the control position is used as a target operation area selected by a user;
after the mobile terminal detects that historical points in a touch screen event form a closed area, the corresponding closed area on the desktop is taken as the target operation area selected by the user.
3. The method of claim 2,
and after the mobile terminal determines a target operation area, displaying the target operation area as a selected state.
4. The method of claim 1, 2 or 3,
after the mobile terminal determines a target operation area, displaying a reduction control and an amplification control on a touch display screen, carrying out reduction display on the target operation area after detecting a short-press operation at a position corresponding to the reduction control, and carrying out amplification display on the target operation area after detecting the short-press operation at a position corresponding to the amplification control;
or,
after the mobile terminal determines a target operation area and detects a multi-point movement operation in which the contacts move toward each other, the target operation area is displayed in a reduced mode corresponding to the stroke length; after a multi-point movement operation in which the contacts move apart is detected, the target operation area is displayed in an enlarged mode corresponding to the stroke length.
5. A mobile terminal for controlling desktop display comprises a central processing module, a user interface management module and a man-machine interface module for detecting user operation on a touch display screen and displaying desktop on the touch display screen,
the central processing module is used for acquiring a touch screen event of a user on a touch display screen through the human-computer interface module, determining a target operation area according to the touch screen event and controlling the user interface management module to perform amplification display, reduction display or translation display on the target operation area, wherein the target operation area is a whole desktop or a local desktop; the system is also used for carrying out translation display on the target operation area in the same direction as the moving operation direction and corresponding to the stroke length through the user interface management module after a target operation area is determined and single-point moving operation or multi-point equidirectional moving operation is detected through the human-computer interface module;
the user interface management module is used for controlling the human-computer interface module to display the target operation area according to the instruction of the central processing module, and is also used for supporting the amplification display, reduction display or translation display of the whole desktop and the amplification display, reduction display or translation display of a local desktop;
the local desktop is the control area corresponding to a control position when the central processing module, after detecting a long press operation in a touch screen event through the human-computer interface module, judges that the starting point of the long press operation corresponds to that control position on the display screen;
and the whole desktop is the entire desktop, taken as the target operation area when the central processing module, after detecting a long press operation in a touch screen event through the human-computer interface module, judges that the starting point of the long press operation corresponds to a non-control position on the display screen.
6. The mobile terminal of claim 5,
the central processing module is further used for taking the whole desktop as a target operation area selected by a user when judging that the starting point of the long press operation corresponds to the position of the non-control on the display screen after the long press operation in the touch screen event is detected by the man-machine interface module; or, the control device is further configured to, after detecting a long press operation in a touch screen event through the human-computer interface module, when judging that a starting point of the long press operation corresponds to a control position on the display screen, take a control area corresponding to the control position as a target operation area selected by a user; and the method is also used for taking the corresponding closed area on the desktop as a target operation area selected by the user after detecting that the historical points in the touch screen event form the closed area through the human-computer interface module.
7. The mobile terminal of claim 6,
the central processing module is further used for displaying the target operation area as a selected state after the target operation area is determined.
8. The mobile terminal of claim 5, 6 or 7,
the central processing module is further configured to display a zoom-out control and a zoom-in control on the touch display screen through the user interface management module after a target operation area is determined, perform zoom-out display on the target operation area through the user interface management module after a short-press operation is detected at the position corresponding to the zoom-out control through the human-computer interface module, and perform zoom-in display on the target operation area through the user interface management module after a short-press operation is detected at the position corresponding to the zoom-in control through the human-computer interface module; or, after the target operation area is determined, when the human-computer interface module detects a multi-point movement operation in which the contacts move toward each other, perform, through the user interface management module, a reduced display of the target operation area corresponding to the stroke length, and when the human-computer interface module detects a multi-point movement operation in which the contacts move apart, perform, through the user interface management module, an enlarged display of the target operation area corresponding to the stroke length.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201110409344.4A CN102520860B (en) | 2011-12-09 | 2011-12-09 | A kind of method and mobile terminal for carrying out desktop display control |
PCT/CN2012/070927 WO2013082881A1 (en) | 2011-12-09 | 2012-02-07 | Desktop display control method and mobile terminal |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201110409344.4A CN102520860B (en) | 2011-12-09 | 2011-12-09 | A kind of method and mobile terminal for carrying out desktop display control |
Publications (2)
Publication Number | Publication Date |
---|---|
CN102520860A CN102520860A (en) | 2012-06-27 |
CN102520860B true CN102520860B (en) | 2018-01-19 |
Family
ID=46291807
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201110409344.4A Active CN102520860B (en) | 2011-12-09 | 2011-12-09 | A kind of method and mobile terminal for carrying out desktop display control |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN102520860B (en) |
WO (1) | WO2013082881A1 (en) |
Families Citing this family (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8866770B2 (en) * | 2012-03-19 | 2014-10-21 | Mediatek Inc. | Method, device, and computer-readable medium for changing size of touch permissible region of touch screen |
CN109101165A (en) * | 2012-06-28 | 2018-12-28 | 汉阳大学校产学协力团 | User interface adjusting method |
CN102830914B (en) * | 2012-07-31 | 2018-06-05 | 北京三星通信技术研究有限公司 | The method and its equipment of operating terminal equipment |
CN103593132A (en) * | 2012-08-16 | 2014-02-19 | 腾讯科技(深圳)有限公司 | Touch device and gesture recognition method |
CN102880411B (en) * | 2012-08-20 | 2016-09-21 | 东莞宇龙通信科技有限公司 | Mobile terminal and touch operation method thereof |
CN103677543A (en) * | 2012-09-03 | 2014-03-26 | 中兴通讯股份有限公司 | Method for adjusting screen display area of mobile terminal and mobile terminal |
CN107247538B (en) * | 2012-09-17 | 2020-03-20 | 华为终端有限公司 | Touch operation processing method and terminal device |
CN102902481B (en) * | 2012-09-24 | 2016-12-21 | 东莞宇龙通信科技有限公司 | Terminal and terminal operation method |
CN102855066B (en) * | 2012-09-26 | 2017-05-17 | 东莞宇龙通信科技有限公司 | Terminal and terminal control method |
CN103309604A (en) * | 2012-11-16 | 2013-09-18 | 中兴通讯股份有限公司 | Terminal and method for controlling information display on terminal screen |
CN103902206B (en) * | 2012-12-25 | 2017-11-28 | 广州三星通信技术研究有限公司 | The method and apparatus and mobile terminal of mobile terminal of the operation with touch-screen |
CN103294346B (en) * | 2013-06-20 | 2018-03-06 | 锤子科技(北京)有限公司 | The window moving method and its device of a kind of mobile device |
CN103324347B (en) * | 2013-06-27 | 2017-09-22 | 广东欧珀移动通信有限公司 | A kind of operating method and system of the mobile terminal based on many contact panels |
CN103414829A (en) * | 2013-08-27 | 2013-11-27 | 深圳市金立通信设备有限公司 | Method, device and terminal device for controlling screen contents |
CN103472996A (en) * | 2013-09-17 | 2013-12-25 | 深圳市佳创软件有限公司 | Method and device for receiving touch in mobile device |
US9733806B2 (en) | 2013-10-09 | 2017-08-15 | Htc Corporation | Electronic device and user interface operating method thereof |
CN103530035A (en) * | 2013-10-09 | 2014-01-22 | 深圳市中兴移动通信有限公司 | Touch control terminal and area operating method of touch control terminal |
CN104571777A (en) * | 2013-10-09 | 2015-04-29 | 宏达国际电子股份有限公司 | Electronic device and user interface operation method thereof |
CN104571799B (en) * | 2013-10-28 | 2019-02-05 | 联想(北京)有限公司 | Information processing method and electronic equipment |
CN103902218A (en) * | 2013-12-27 | 2014-07-02 | 深圳市同洲电子股份有限公司 | Mobile terminal screen displaying method and mobile terminal |
CN103888840B (en) * | 2014-03-27 | 2017-03-29 | 电子科技大学 | A kind of video mobile terminal Real Time Dragging and the method and device for scaling |
CN104049843B (en) * | 2014-06-03 | 2018-01-23 | 联想(北京)有限公司 | A kind of information processing method and electronic equipment |
CN105700763A (en) * | 2014-11-25 | 2016-06-22 | 中兴通讯股份有限公司 | Terminal interface window moving method and terminal interface window moving device |
CN104915111B (en) * | 2015-05-28 | 2018-08-14 | 努比亚技术有限公司 | terminal operation control method and device |
CN104932776A (en) * | 2015-06-29 | 2015-09-23 | 联想(北京)有限公司 | Information processing method and electronic equipment |
CN105117100A (en) * | 2015-08-19 | 2015-12-02 | 小米科技有限责任公司 | Target object display method and apparatus |
CN105224169B (en) * | 2015-09-09 | 2019-02-05 | 魅族科技(中国)有限公司 | A kind of Interface Moving method and terminal |
CN105404456B (en) * | 2015-12-22 | 2019-01-22 | 厦门美图移动科技有限公司 | A kind of mobile terminal dialing keyboard management method and device |
CN107015749A (en) * | 2016-01-28 | 2017-08-04 | 中兴通讯股份有限公司 | A kind of method for showing interface and mobile terminal for mobile terminal |
CN105930252A (en) * | 2016-04-29 | 2016-09-07 | 杨夫春 | Mobile terminal file memory display method |
CN106354396A (en) * | 2016-08-26 | 2017-01-25 | 乐视控股(北京)有限公司 | Interface adjustment method and device |
CN106686232B (en) * | 2016-12-27 | 2020-03-31 | 努比亚技术有限公司 | Control interface optimization method and mobile terminal |
CN108279840A (en) * | 2017-12-22 | 2018-07-13 | 石化盈科信息技术有限责任公司 | A kind of the one-handed performance method and single-hand operation device of touch screen |
CN112256169B (en) * | 2020-10-14 | 2021-08-10 | 北京达佳互联信息技术有限公司 | Content display method and device, electronic equipment and storage medium |
CN113434079A (en) * | 2021-05-28 | 2021-09-24 | 北京信和时代科技有限公司 | Control method, device and equipment for event recording application and computer storage medium |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101650633A (en) * | 2009-07-03 | 2010-02-17 | 苏州佳世达电通有限公司 | Manipulating method of electronic device |
CN102163126A (en) * | 2010-02-24 | 2011-08-24 | 宏达国际电子股份有限公司 | Display method and electronic device for using the same |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5045559B2 (en) * | 2008-06-02 | 2012-10-10 | 富士通モバイルコミュニケーションズ株式会社 | Mobile device |
US8963849B2 (en) * | 2008-12-04 | 2015-02-24 | Mitsubishi Electric Corporation | Display input device |
US9182854B2 (en) * | 2009-07-08 | 2015-11-10 | Microsoft Technology Licensing, Llc | System and method for multi-touch interactions with a touch sensitive screen |
CN102023788A (en) * | 2009-09-15 | 2011-04-20 | 宏碁股份有限公司 | Control method for touch screen display frames |
- 2011-12-09: CN CN201110409344.4A patent/CN102520860B/en active Active
- 2012-02-07: WO PCT/CN2012/070927 patent/WO2013082881A1/en active Application Filing
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101650633A (en) * | 2009-07-03 | 2010-02-17 | 苏州佳世达电通有限公司 | Manipulating method of electronic device |
CN102163126A (en) * | 2010-02-24 | 2011-08-24 | 宏达国际电子股份有限公司 | Display method and electronic device for using the same |
Also Published As
Publication number | Publication date |
---|---|
CN102520860A (en) | 2012-06-27 |
WO2013082881A1 (en) | 2013-06-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN102520860B (en) | A kind of method and mobile terminal for carrying out desktop display control | |
EP2372516B1 (en) | Methods, systems and computer program products for arranging a plurality of icons on a touch sensitive display | |
AU2008100003A4 (en) | Method, system and graphical user interface for viewing multiple application windows | |
EP2778878B1 (en) | Automatically expanding panes | |
EP2657811B1 (en) | Touch input processing device, information processing device, and touch input control method | |
US10684751B2 (en) | Display apparatus, display method, and program | |
EP3002664B1 (en) | Text processing method and touchscreen device | |
EP2613247B1 (en) | Method and apparatus for displaying a keypad on a terminal having a touch screen | |
EP2474896A2 (en) | Information processing apparatus, information processing method, and computer program | |
EP2738654B1 (en) | Touch screen operation method and terminal | |
CN106104450B (en) | Method for selecting a part of a graphical user interface | |
CN104238927B (en) | The control method and device of intelligent terminal application program | |
TWI490771B (en) | Programmable display unit and screen operating and processing program thereof | |
CN105426080A (en) | Image switching method and terminal | |
CN106415471A (en) | Processing method for user interface of terminal, user interface and terminal | |
JP2012141869A (en) | Information processing apparatus, information processing method, and computer program | |
US9069391B2 (en) | Method and medium for inputting Korean characters using a touch screen | |
EP3278203B1 (en) | Enhancement to text selection controls | |
US10540086B2 (en) | Apparatus, method and computer program product for information processing and input determination | |
CN109804342B (en) | Method for adjusting display and operation of graphical user interface | |
CN112558844A (en) | Tablet computer-based medical image reading method and system | |
CN102789358A (en) | Image output and display method, device and display equipment | |
CN117289849A (en) | Gesture auxiliary writing method and device | |
EP2804079A1 (en) | User equipment and method | |
KR20160027063A (en) | Method of selection of a portion of a graphical user interface |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant | ||
TR01 | Transfer of patent right | ||
TR01 | Transfer of patent right |
Effective date of registration: 2023-11-07
Patentee after: Beijing Wutao Technology Co., Ltd.; address after: Unit 11, 46th Floor, Building 16, Yard 1, Jianguomenwai Street, Chaoyang District, Beijing, 100004
Patentee before: ZTE Corp.; address before: Legal Affairs Department, Zhongxing Building, South Science and Technology Road, Nanshan District Hi-tech Industrial Park, Shenzhen, Guangdong, 518057