CN102520860B - Method and mobile terminal for desktop display control - Google Patents

Method and mobile terminal for desktop display control

Info

Publication number
CN102520860B
CN102520860B CN201110409344.4A
Authority
CN
China
Prior art keywords
target operation
display
touch
operation region
desktop
Prior art date
Application number
CN201110409344.4A
Other languages
Chinese (zh)
Other versions
CN102520860A (en)
Inventor
柳鲲鹏
黄连芳
Original Assignee
中兴通讯股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 中兴通讯股份有限公司
Priority to CN201110409344.4A
Publication of CN102520860A
Application granted
Publication of CN102520860B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for entering handwritten data, e.g. gestures, text
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the screen or tablet into independently controllable areas, e.g. virtual keyboards, menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04806Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Abstract

The invention discloses a method and a mobile terminal for desktop display control. The method includes: after the mobile terminal detects a user's touch-screen event on the touch display screen, it determines a target operation region according to the touch-screen event and enlarges, reduces, or translates the display of that region; the target operation region is the whole desktop or a local desktop region. With the solution of the present invention, the whole desktop or a local desktop region can be dragged, enlarged, or reduced, so that the region the user needs to tap is placed at a position convenient for the user to click, which avoids erroneous operation and improves the user experience.

Description

Method and mobile terminal for desktop display control

Technical field

The present invention relates to the field of mobile terminals, and in particular to a method and a mobile terminal for desktop display control.

Background technology

With advances in technology, traditional button operation has been replaced by touch-screen operation, and touch-screen mobile terminals are now widely used. Existing touch-screen mobile phones support operations such as tapping and sliding. When the screen of a touch-screen terminal is relatively large (for example, 4.3 inches), some icons or virtual keys are not easy to tap during one-handed operation, which may result in erroneous operation.

Summary of the invention

The technical problem to be solved by the present invention is to provide a method and a mobile terminal for desktop display control, so that the user can touch the touch screen accurately and conveniently.

In order to solve the above technical problem, the present invention provides a method for desktop display control, wherein, after the mobile terminal detects a user's touch-screen event on the touch display screen, it determines a target operation region according to the touch-screen event and enlarges, reduces, or translates the display of the target operation region; the target operation region is the whole desktop or a local desktop region.

Further, the above method may also have the following features:

after the mobile terminal detects a long-press operation in the touch-screen event and judges that the starting point of the long-press operation corresponds to a non-control location on the display screen, the whole desktop is taken as the target operation region chosen by the user;

after the mobile terminal detects a long-press operation in the touch-screen event and judges that the starting point of the long-press operation corresponds to a control location on the display screen, the control region corresponding to that control location is taken as the target operation region chosen by the user;

after the mobile terminal detects that the trajectory points in the touch-screen event form an enclosed region, the corresponding enclosed region on the desktop is taken as the target operation region chosen by the user.

Further, the above method may also have the following feature:

after the mobile terminal determines the target operation region, the target operation region is displayed in a selected state.

Further, the above method may also have the following features:

after the mobile terminal determines the target operation region, a shrink control and an enlarge control are displayed on the touch display screen; after a short-press operation is detected at the position corresponding to the shrink control, the target operation region is displayed reduced, and after a short-press operation is detected at the position corresponding to the enlarge control, the target operation region is displayed enlarged;

or

after the mobile terminal determines the target operation region and detects a multi-point operation in which the contacts move toward each other, the target operation region is reduced in proportion to the sliding distance; after a multi-point operation in which the contacts move apart is detected, the target operation region is enlarged in proportion to the sliding distance.

Further, the above method may also have the following feature:

after the mobile terminal determines the target operation region and detects a single-point move operation or a multi-point move operation in the same direction, the target operation region is translated in the same direction as the move operation and by a distance corresponding to the sliding distance.

In order to solve the above technical problem, the present invention also provides a mobile terminal for desktop display control, including a central processing module, a user interface management module, and a human-machine interface module for detecting the user's operations on the touch display screen and for displaying the desktop on the touch display screen, wherein:

the central processing module is configured to learn of the user's touch-screen event on the touch display screen through the human-machine interface module, determine the target operation region according to the touch-screen event, and control the user interface management module to enlarge, reduce, or translate the display of the target operation region, the target operation region being the whole desktop or a local desktop region;

the user interface management module is configured to control the human-machine interface module to display the target operation region according to instructions from the central processing module, and is further configured to support enlarging, reducing, or translating the display of the whole desktop as well as of a local desktop region.

Further, the above mobile terminal may also have the following feature:

the central processing module is further configured to, after detecting a long-press operation in the touch-screen event through the human-machine interface module and judging that the starting point of the long-press operation corresponds to a non-control location on the display screen, take the whole desktop as the target operation region chosen by the user; or, after detecting a long-press operation in the touch-screen event through the human-machine interface module and judging that the starting point of the long-press operation corresponds to a control location on the display screen, take the control region corresponding to that control location as the target operation region chosen by the user; and is further configured to, after detecting through the human-machine interface module that the trajectory points in the touch-screen event form an enclosed region, take the corresponding enclosed region on the desktop as the target operation region chosen by the user.

Further, the above mobile terminal may also have the following feature:

the central processing module is further configured to, after determining the target operation region, display the target operation region in a selected state.

Further, the above mobile terminal may also have the following feature:

the central processing module is further configured to, after determining the target operation region, display a shrink control and an enlarge control on the touch display screen through the user interface management module; after a short-press operation is detected at the position corresponding to the shrink control through the human-machine interface module, the target operation region is displayed reduced through the user interface management module, and after a short-press operation is detected at the position corresponding to the enlarge control through the human-machine interface module, the target operation region is displayed enlarged through the user interface management module; or the central processing module is further configured to, after determining the target operation region and detecting through the human-machine interface module a multi-point operation in which the contacts move toward each other, reduce the target operation region in proportion to the sliding distance through the user interface management module, and after detecting through the human-machine interface module a multi-point operation in which the contacts move apart, enlarge the target operation region in proportion to the sliding distance through the user interface management module.

Further, the above mobile terminal may also have the following feature:

the central processing module is further configured to, after determining the target operation region and detecting through the human-machine interface module a single-point move operation or a multi-point move operation in the same direction, translate the target operation region through the user interface management module in the same direction as the move operation and by a distance corresponding to the sliding distance.

With the solution of the present invention, the whole desktop or a local desktop region can be dragged, enlarged, or reduced, so that the region the user needs to tap is placed at a position convenient for the user to click, which avoids erroneous operation and improves the user experience.

Brief description of the drawings

Fig. 1 is a block diagram of the modules of the mobile terminal in the embodiment;

Fig. 2 is a schematic diagram of the newly added function options of the mobile terminal in the embodiment;

Fig. 3 is a schematic diagram of the original screen display of the mobile terminal in example one;

Fig. 4 is a schematic diagram of the desktop after it is scaled in example one;

Fig. 5 is a schematic diagram of the desktop after it is translated in example one;

Fig. 6 is a schematic diagram of the original position of the virtual keyboard on the desktop of the mobile terminal in example two;

Fig. 7 is a schematic diagram of the position of the virtual keyboard on the desktop of the mobile terminal after it is enlarged in example two;

Fig. 8 is a schematic diagram of the position of the virtual keyboard on the desktop of the mobile terminal after it is translated in example two.

Detailed description of the embodiments

The mobile terminal in the present invention includes a human-machine interface module 101, a user interface management module 102, a central processing module 103, and a program storage module 104.

The human-machine interface module 101 is configured to detect the user's operations on the touch display screen and to display the desktop on the touch display screen; it is further configured to call the pictures and interfaces of the program storage module, display the corresponding interface on the screen, and wait for the user's operation. The function of this module is the same as that of a human-machine interface module in the prior art.

The user interface management module 102 is configured to control the human-machine interface module to display the target operation region according to instructions from the central processing module, and is further configured to support enlarging, reducing, or translating the display of the whole desktop as well as of a local desktop region.

The central processing module 103 is configured to learn of the user's touch-screen event on the touch display screen through the human-machine interface module, determine the target operation region according to the touch-screen event, and control the user interface management module to enlarge, reduce, or translate the display of the target operation region, the target operation region being the whole desktop or a local desktop region.

The program storage module 104 stores the pictures, data, menus, and display interfaces required by the mobile phone; in addition, it stores the mobile phone operating system, application functions, data files, and the like. The function of this module is the same as that of a program storage module in the prior art.
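
The patent describes these modules only at a functional level. Below is a minimal Java sketch of how the three cooperating modules could be wired together; all interface, class, and method names (HumanMachineInterfaceModule, UserInterfaceManager, CentralProcessingModule, onTouchEvent, and so on) are illustrative assumptions rather than the patent's actual implementation.

```java
/** Illustrative stand-ins for a touch event and a target operation region. */
record TouchEvent(int action, float x, float y, long downTimeMs) { }
record TargetRegion(float left, float top, float right, float bottom, boolean wholeDesktop) { }

interface TouchListener {
    void onTouchEvent(TouchEvent event);
}

/** Detects touch operations and draws the desktop (module 101 in Fig. 1). */
interface HumanMachineInterfaceModule {
    void setTouchListener(TouchListener listener);
    void drawDesktop();
}

/** Enlarges, reduces, or translates the display of a target region (module 102). */
interface UserInterfaceManager {
    void translate(TargetRegion region, float dx, float dy);
    void scale(TargetRegion region, float factor);
}

/** Receives events, decides the target operation region, issues drawing commands (module 103). */
final class CentralProcessingModule implements TouchListener {
    private final UserInterfaceManager uiManager;
    private TargetRegion target;   // whole desktop or a local region, once determined

    CentralProcessingModule(HumanMachineInterfaceModule hmi, UserInterfaceManager uiManager) {
        this.uiManager = uiManager;
        hmi.setTouchListener(this);          // register to be told about touch-screen events
    }

    @Override
    public void onTouchEvent(TouchEvent event) {
        // 1. determine the target operation region (long-press or enclosed trajectory);
        // 2. control the UI manager to enlarge, reduce, or translate that region.
    }
}
```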

The user makes the mobile terminal aware of the target operation region by operating on the touch screen.

When the user long-presses a non-control location on the display screen, this indicates that the user's target operation region is the whole desktop. After the central processing module 103 detects, through the human-machine interface module 101, a long-press operation in the touch-screen event and judges that the starting point of the long-press operation corresponds to a non-control location on the display screen, it takes the whole desktop as the target operation region chosen by the user.

When the user long-presses a control location on the display screen, this indicates that the user's target operation region is that control. After the central processing module 103 detects, through the human-machine interface module 101, a long-press operation in the touch-screen event and judges that the starting point of the long-press operation corresponds to a control location on the display screen, it takes the control region corresponding to that control location as the target operation region chosen by the user.

When the user traces out an enclosed region by moving on the display screen, for example a region enclosing several controls, this indicates that the user's target operation region is this enclosed region. The central processing module 103 is further configured to, after detecting through the human-machine interface module that the trajectory points in the touch-screen event form an enclosed region, take the corresponding enclosed region on the desktop as the target operation region chosen by the user.
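
A hedged sketch of this region-determination logic follows. It hit-tests the long-press starting point against the known control regions and, for a traced gesture, checks whether the trajectory closes on itself; the Point, Rect, and Control types, the bounding-box simplification, and the closing-distance threshold are all assumptions made for illustration.

```java
import java.util.List;

final class TargetRegionResolver {
    record Point(float x, float y) { }
    record Rect(float left, float top, float right, float bottom) {
        boolean contains(Point p) {
            return p.x() >= left && p.x() <= right && p.y() >= top && p.y() <= bottom;
        }
    }
    record Control(String name, Rect bounds) { }

    private final Rect wholeDesktop;
    private final List<Control> controls;

    TargetRegionResolver(Rect wholeDesktop, List<Control> controls) {
        this.wholeDesktop = wholeDesktop;
        this.controls = controls;
    }

    /** Long-press: a control region if the start point hits a control, otherwise the whole desktop. */
    Rect resolveLongPress(Point start) {
        for (Control c : controls) {
            if (c.bounds().contains(start)) {
                return c.bounds();          // target = region of the pressed control
            }
        }
        return wholeDesktop;                // non-control location => whole desktop
    }

    /** Trajectory that closes on itself: the bounding box of the enclosed region. */
    Rect resolveEnclosedRegion(List<Point> trajectory, float closeThresholdPx) {
        if (trajectory.size() < 3) return null;
        Point first = trajectory.get(0);
        Point last = trajectory.get(trajectory.size() - 1);
        double gap = Math.hypot(first.x() - last.x(), first.y() - last.y());
        if (gap > closeThresholdPx) {
            return null;                    // trajectory does not enclose a region
        }
        float l = Float.MAX_VALUE, t = Float.MAX_VALUE, r = -Float.MAX_VALUE, b = -Float.MAX_VALUE;
        for (Point p : trajectory) {
            l = Math.min(l, p.x()); t = Math.min(t, p.y());
            r = Math.max(r, p.x()); b = Math.max(b, p.y());
        }
        return new Rect(l, t, r, b);
    }
}
```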

After determining the target operation region, the central processing module 103 displays the target operation region in a selected state. The selected state can be shown in various ways, for example by displaying a border around the edge of the target operation region, by displaying the target operation region in a certain color or with a transparency effect, or by any other means that indicates the target operation region.

The user can scale the target operation region by operating on the touch screen in the following two ways.

Mode one: after determining the target operation region, the central processing module 103 displays a shrink control and an enlarge control on the touch display screen through the user interface management module; after a short-press operation is detected at the position corresponding to the shrink control through the human-machine interface module, the target operation region is displayed reduced through the user interface management module, and after a short-press operation is detected at the position corresponding to the enlarge control through the human-machine interface module, the target operation region is displayed enlarged through the user interface management module.

Mode two: after the central processing module 103 detects, through the human-machine interface module, a multi-point operation in which the contacts move toward each other, it reduces the target operation region in proportion to the sliding distance through the user interface management module; after it detects, through the human-machine interface module, a multi-point operation in which the contacts move apart, it enlarges the target operation region in proportion to the sliding distance through the user interface management module.
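
The sketch below illustrates both zoom modes under assumed names: a fixed scale step per short press on the shrink/enlarge controls, and a pinch factor derived from how far the two contacts move toward or apart (the ratio of the current to the initial contact distance).

```java
final class ZoomGestures {
    /** Mode one: fixed step per short press on the shrink (-) or enlarge (+) control. */
    static float stepScale(float currentScale, boolean enlarge, float step) {
        return enlarge ? currentScale * (1f + step) : currentScale / (1f + step);
    }

    /** Mode two: scale proportional to how far the two contacts moved toward or apart. */
    static float pinchScale(float startDistance, float currentDistance) {
        if (startDistance <= 0f) return 1f;        // guard against a degenerate gesture
        return currentDistance / startDistance;    // <1 => reduce, >1 => enlarge
    }

    static float distance(float x1, float y1, float x2, float y2) {
        return (float) Math.hypot(x2 - x1, y2 - y1);
    }
}
```

Applying pinchScale to the current scale of the target operation region gives the behaviour of mode two; the exact step size and the mapping from sliding distance to scale are design choices the patent leaves open.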

The way in which the user translates the target operation region by operating on the touch screen is as follows: after determining the target operation region, the central processing module 103 detects, through the human-machine interface module, a single-point move operation or a multi-point move operation in the same direction, and then translates the target operation region, through the user interface management module, in the same direction as the move operation and by a distance corresponding to the sliding distance.
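
A minimal sketch of this translation behaviour, assuming the region keeps an offset that accumulates across drags; the field and method names are illustrative.

```java
final class PanGesture {
    private float startX, startY;              // where the current drag began
    private float regionOffsetX, regionOffsetY; // committed offset of the target region

    void onDragStart(float x, float y) {
        startX = x;
        startY = y;
    }

    /** New offset of the region: same direction and distance as the drag so far. */
    float[] onDragMove(float x, float y) {
        float dx = x - startX;
        float dy = y - startY;
        return new float[] { regionOffsetX + dx, regionOffsetY + dy };
    }

    /** Commit the offset when the finger is lifted. */
    void onDragEnd(float x, float y) {
        regionOffsetX += x - startX;
        regionOffsetY += y - startY;
    }
}
```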

The method for desktop display control includes: after the mobile terminal detects a user's touch-screen event on the touch display screen, it determines a target operation region according to the touch-screen event and enlarges, reduces, or translates the display of the target operation region; the target operation region is the whole desktop or a local desktop region.

When the user long-presses a non-control location on the display screen, this indicates that the whole desktop is to be chosen as the user's target operation region. After the mobile terminal detects a long-press operation in the touch-screen event and judges that the starting point of the long-press operation corresponds to a non-control location on the display screen, it takes the whole desktop as the target operation region chosen by the user.

When the user long-presses a control location on the display screen, this indicates that this control is to be chosen as the user's target operation region. After the mobile terminal detects a long-press operation in the touch-screen event and judges that the starting point of the long-press operation corresponds to a control location on the display screen, it takes the control region corresponding to that control location as the target operation region chosen by the user.

When the user traces out an enclosed region on the display screen with a finger, this indicates that this enclosed region is to be chosen as the user's target operation region. After the mobile terminal detects that the trajectory points in the touch-screen event form an enclosed region, it takes the corresponding enclosed region on the desktop as the target operation region chosen by the user.

After the mobile terminal determines the target operation region, the target operation region is displayed in a selected state. The selected state can be shown in various ways, for example by displaying a border around the edge of the target operation region, by displaying the target operation region in a certain color or with a transparency effect, or by any other means that indicates the target operation region.

The user can scale the target operation region by operating on the touch screen in the following two ways.

Mode one: after the mobile terminal determines the target operation region, a shrink control and an enlarge control are displayed on the touch display screen; after a short-press operation is detected at the position corresponding to the shrink control, the target operation region is displayed reduced, and after a short-press operation is detected at the position corresponding to the enlarge control, the target operation region is displayed enlarged.

Mode two: the user can move two fingers toward each other on the display screen to indicate that the target operation region should be reduced, or move two fingers apart on the display screen to indicate that the target operation region should be enlarged. After the mobile terminal detects a multi-point operation in which the contacts move toward each other, it reduces the target operation region in proportion to the sliding distance; after it detects a multi-point operation in which the contacts move apart, it enlarges the target operation region in proportion to the sliding distance.

The user can indicate the desired direction of translation by sliding one finger in the target direction, or by sliding several fingers in the same direction. After the mobile terminal detects a single-point move operation or a multi-point move operation in the same direction, it translates the target operation region in the same direction as the move operation and by a distance corresponding to the sliding distance.

With this solution, a touch-screen operation is used to drag or scale the whole desktop or a local part of it, which makes touch-screen terminals more convenient to use and provides a new user experience.

As shown in Fig. 2, the central processing unit converts the user's operation on the interface into an operation on the corresponding application of the mode management system, calls the corresponding interface management module, refreshes the display result to the screen buffer, and shows it on the display screen. The function options newly added in the present invention are: translate the whole desktop, translate a local desktop region, scale the whole desktop, and scale a local desktop region.
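
These four options can be modeled as a simple set of enabled flags that the flows below consult before acting; the enum and class names in this sketch are assumptions, not names taken from the patent.

```java
import java.util.EnumSet;
import java.util.Set;

enum DesktopOption {
    TRANSLATE_WHOLE_DESKTOP,
    TRANSLATE_LOCAL_DESKTOP,
    SCALE_WHOLE_DESKTOP,
    SCALE_LOCAL_DESKTOP
}

final class DesktopOptions {
    private final Set<DesktopOption> enabled = EnumSet.noneOf(DesktopOption.class);

    void enable(DesktopOption option)  { enabled.add(option); }
    void disable(DesktopOption option) { enabled.remove(option); }

    /** "Is operating on the whole desktop allowed?" check used in step 2 of the example flows. */
    boolean wholeDesktopOperationsAllowed() {
        return enabled.contains(DesktopOption.TRANSLATE_WHOLE_DESKTOP)
            || enabled.contains(DesktopOption.SCALE_WHOLE_DESKTOP);
    }

    boolean localDesktopOperationsAllowed() {
        return enabled.contains(DesktopOption.TRANSLATE_LOCAL_DESKTOP)
            || enabled.contains(DesktopOption.SCALE_LOCAL_DESKTOP);
    }
}
```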

The scheme is described in detail below through specific flows.

In each of the examples below, the terminal detects touch-screen events in real time during the flow.

Example 1: the flow in which the user translates the whole desktop (a sketch of this flow is given after step 5) includes:

Step 1: the user long-presses a non-control location on the desktop;

Step 2: a long-press operation is detected in the touch-screen event and the position of this long-press operation is a non-control location; it is judged whether operating on the whole desktop is allowed (i.e., whether the option to translate or scale the whole desktop is enabled); if so, the next step is performed, otherwise the event is handled by the normal flow;

Step 3: the whole desktop is displayed in the selected state;

Step 4: a translation of a single contact, or of multiple contacts in the same direction, is detected; it is judged whether translating the desktop is allowed (i.e., whether the option to translate the whole desktop is enabled); if so, the next step is performed, otherwise the event is handled by the normal flow;

Step 5: the starting point and the current end point of the translation are recorded, and the visual effect of the desktop translation is drawn.
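
A minimal, event-driven sketch of this Example 1 flow, assuming a small state machine; the option flag, the Renderer callback, and the method names are illustrative assumptions.

```java
final class WholeDesktopPanFlow {
    enum State { IDLE, DESKTOP_SELECTED, PANNING }

    interface Renderer {
        void showDesktopSelected();                     // step 3: selected-state visual
        void drawDesktopTranslated(float dx, float dy); // step 5: translation effect
    }

    private final boolean allowWholeDesktopPan;         // "translate whole desktop" option
    private final Renderer renderer;
    private State state = State.IDLE;
    private float startX, startY;

    WholeDesktopPanFlow(boolean allowWholeDesktopPan, Renderer renderer) {
        this.allowWholeDesktopPan = allowWholeDesktopPan;
        this.renderer = renderer;
    }

    /** Steps 1-3: a long-press on a non-control location selects the whole desktop. */
    void onLongPress(float x, float y, boolean onControl) {
        if (onControl || !allowWholeDesktopPan) return;  // fall back to normal handling
        state = State.DESKTOP_SELECTED;
        renderer.showDesktopSelected();
    }

    /** Steps 4-5: a single contact (or several in the same direction) drags the desktop. */
    void onMove(float x, float y) {
        if (state == State.DESKTOP_SELECTED) {
            state = State.PANNING;
            startX = x;
            startY = y;
        }
        if (state == State.PANNING) {
            renderer.drawDesktopTranslated(x - startX, y - startY);
        }
    }

    /** Touch release ends the gesture and keeps the last drawn desktop. */
    void onRelease() {
        state = State.IDLE;
    }
}
```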

Example 2: the flow in which the user scales the whole desktop includes:

Step 1: the user long-presses a non-control location on the desktop;

Step 2: a long-press operation is detected in the touch-screen event and the position of this long-press operation is a non-control location; it is judged whether operating on the whole desktop is allowed (i.e., whether the option to translate or scale the whole desktop is enabled); if so, the next step is performed, otherwise the event is handled by the normal flow;

Step 3: the whole desktop is displayed in the selected state;

Step 4: a movement of two contacts away from each other is detected; it is judged whether scaling the desktop is allowed (i.e., whether the option to scale the whole desktop is enabled); if so, the next step is performed, otherwise the event is handled by the normal flow;

Step 5: the starting points and the current end points of the movement are recorded, and the visual effect of enlarging the desktop is drawn.

Example 3: the flow in which the user translates a local desktop region includes:

Step 1: the user circles an enclosed region or long-presses a control;

Step 2: a move operation is detected in the touch-screen event and the trajectory points form an enclosed region, or a long-press operation is detected in the touch-screen event and the long-press position is a control location; it is judged whether operating on the local desktop region is allowed (i.e., whether the option to translate or scale a local desktop region is enabled); if so, the next step is performed, otherwise the event is handled by the normal flow;

Step 3: this control or this enclosed region is displayed in the selected state;

Step 4: a translation of a single contact, or of multiple contacts in the same direction, is detected; it is judged whether translating the local desktop region is allowed (i.e., whether the option to translate a local desktop region is enabled); if so, the next step is performed, otherwise the event is handled by the normal flow;

Step 5: the starting point and the current end point of the translation are recorded, and the visual effect of translating the local desktop region (i.e., this control or this enclosed region) is drawn.

Example 4: the flow in which the user enlarges a local desktop region includes:

Step 1: the user circles an enclosed region or long-presses a control;

Step 2: a move operation is detected in the touch-screen event and the trajectory points form an enclosed region, or a long-press operation is detected in the touch-screen event and the long-press position is a control location; it is judged whether operating on the local desktop region is allowed (i.e., whether the option to translate or scale a local desktop region is enabled); if so, the next step is performed, otherwise the event is handled by the normal flow;

Step 3: this control or this enclosed region is displayed in the selected state;

Step 4: a movement of two contacts away from each other is detected; it is judged whether scaling the local desktop region is allowed (i.e., whether the option to scale a local desktop region is enabled); if so, the next step is performed, otherwise the event is handled by the normal flow;

Step 5: the starting points and the current end points of the movement are recorded, and the visual effect of enlarging the local desktop region (i.e., this control or this enclosed region) is drawn.

Example 5: the flow in which the user enlarges and then reduces a local desktop region (a sketch of the scale bookkeeping is given after step 7) includes:

Step 1: the user circles an enclosed region or long-presses a control;

Step 2: a move operation is detected in the touch-screen event and the trajectory points form an enclosed region, or a long-press operation is detected in the touch-screen event and the long-press position is a control location; it is judged whether operating on the local desktop region is allowed (i.e., whether the option to translate or scale a local desktop region is enabled); if so, the next step is performed, otherwise the event is handled by the normal flow;

Step 3: this control or this enclosed region is displayed in the selected state;

Step 4: a movement of two contacts away from each other is detected; it is judged whether scaling the local desktop region is allowed (i.e., whether the option to scale a local desktop region is enabled); if so, the next step is performed, otherwise the event is handled by the normal flow;

Step 5: the starting points and the current end points of the movement are recorded, and the visual effect of enlarging the local desktop region (i.e., this control or this enclosed region) is drawn;

Step 6: a movement of the two contacts toward each other is detected; it is judged whether scaling the local desktop region is allowed (i.e., whether the option to scale a local desktop region is enabled); if so, the next step is performed, otherwise the event is handled by the normal flow;

Step 7: the end point of the movement during enlargement is taken as the starting point and, together with the current point during the reduction, the visual effect of reducing the local desktop region (i.e., this control or this enclosed region) is drawn on the basis of the enlarged state.
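
The enlarge-then-reduce behaviour in steps 5 to 7 amounts to committing the scale drawn so far and using it as the basis for the next gesture. The sketch below assumes this bookkeeping; the class and method names are illustrative.

```java
final class LocalRegionScaleSession {
    private float committedScale = 1f;   // scale of the local desktop region as last drawn
    private float gestureStartDistance;  // distance between the two contacts at gesture start

    void onPinchStart(float distanceBetweenContacts) {
        gestureStartDistance = distanceBetweenContacts;
    }

    /** Scale to draw while the pinch is in progress, relative to the committed state. */
    float currentScale(float distanceBetweenContacts) {
        if (gestureStartDistance <= 0f) return committedScale;
        return committedScale * (distanceBetweenContacts / gestureStartDistance);
    }

    /** Touch release: keep the last drawn scale as the new basis for the next gesture. */
    void onPinchEnd(float distanceBetweenContacts) {
        committedScale = currentScale(distanceBetweenContacts);
    }
}
```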

In the above examples, after the terminal detects a touch release event, it ends the drawing of the translation or scaling, keeps the desktop as drawn by the last translation or scaling operation, and waits for the user's next operation.

The present invention is described in detail below with reference to the accompanying drawings.

Example one: an example of operating on the desktop.

As shown in Fig. 3, which represents the original desktop display, the thick line represents the edge of the screen and the shaded area represents the desktop. As shown in Fig. 4, the user slides two fingers apart on the screen, and the terminal enlarges the display of the desktop. As shown in Fig. 5, the user presses and holds the non-control region in the upper-left corner of the screen and slides on the display screen to indicate that the original desktop should be translated to the right; the double-line arrow represents the direction and distance of the user's slide, and the tap target that was originally in the upper-left corner of the screen is moved to the middle of the screen, making it convenient for the user to tap it. The blank space left after the desktop is dragged can be displayed as a solid color, or according to a picture or animation preset in the terminal. The user can move the whole screen in any direction according to his or her own needs. While the desktop is moving, all icons and controls on the desktop move synchronously with the interface.
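
A sketch of this behaviour, assuming the desktop is panned by applying the same offset to every icon and control and the exposed strip is filled according to a configurable style; the Icon record, the fill enum, and the class name are illustrative assumptions.

```java
import java.util.List;

final class DesktopPanRenderer {
    enum ExposedAreaFill { SOLID_COLOR, PRESET_PICTURE, PRESET_ANIMATION }

    record Icon(String name, float x, float y) { }

    private final ExposedAreaFill fill;

    DesktopPanRenderer(ExposedAreaFill fill) {
        this.fill = fill;
    }

    /** All icons follow the desktop: apply the same translation offset to each of them. */
    List<Icon> translateIcons(List<Icon> icons, float dx, float dy) {
        return icons.stream()
                .map(i -> new Icon(i.name(), i.x() + dx, i.y() + dy))
                .toList();
    }

    /** Decide how the area exposed by the drag is drawn (solid color, picture, or animation). */
    ExposedAreaFill exposedAreaFill() {
        return fill;
    }
}
```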

Example two: an example of operating on a control.

When only one control needs to be selected, the user can simply long-press that control; when several controls need to be selected, the user can circle the region to be chosen on the touch screen.

As shown in Fig. 6, the desktop is an input interface and the virtual keyboard is a control. As shown in Fig. 7, after the user long-presses the region of this control and moves two fingers apart, the mobile terminal enlarges the display of the virtual keyboard. As shown in Fig. 8, the user can also long-press the region of this control and then slide on the screen to move the virtual keyboard to the upper part of the display screen; the blank space left after the drag can be displayed as an expansion of the adjacent area (for example, the input interface), or replaced with a special background or animation.
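
A minimal sketch of treating the virtual keyboard as a single control whose bounds are enlarged about its center by a pinch factor or translated by a drag offset; the Bounds record and method names are assumptions made for illustration.

```java
final class ControlTransformer {
    record Bounds(float left, float top, float right, float bottom) {
        float centerX() { return (left + right) / 2f; }
        float centerY() { return (top + bottom) / 2f; }
    }

    /** Enlarge (or reduce) the control about its own center by the pinch scale factor. */
    static Bounds scaleAboutCenter(Bounds b, float factor) {
        float halfW = (b.right() - b.left()) / 2f * factor;
        float halfH = (b.bottom() - b.top()) / 2f * factor;
        return new Bounds(b.centerX() - halfW, b.centerY() - halfH,
                          b.centerX() + halfW, b.centerY() + halfH);
    }

    /** Move the control by the drag offset, e.g. to the upper part of the display screen. */
    static Bounds translate(Bounds b, float dx, float dy) {
        return new Bounds(b.left() + dx, b.top() + dy, b.right() + dx, b.bottom() + dy);
    }
}
```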

It should be noted that, provided there is no conflict, the embodiments in this application and the features in the embodiments may be combined with one another in any way.

Of course, the present invention may also have various other embodiments. Those skilled in the art can make various corresponding changes and modifications according to the present invention without departing from the spirit and essence of the invention, and all such corresponding changes and modifications shall fall within the protection scope of the appended claims of the present invention.

Those of ordinary skill in the art will appreciate that all or part of the steps of the above method may be implemented by a program instructing the relevant hardware, and the program may be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk, or an optical disc. Optionally, all or part of the steps of the above embodiments may also be implemented using one or more integrated circuits. Correspondingly, each module/unit in the above embodiments may be implemented in the form of hardware, or in the form of a software functional module. The present invention is not limited to any particular form of combination of hardware and software.

Claims (8)

1. A method for desktop display control, wherein,
after a mobile terminal detects a user's touch-screen event on a touch display screen, a target operation region is determined according to the touch-screen event and the display of the target operation region is enlarged, reduced, or translated, the target operation region being the whole desktop or a local desktop region;
wherein the local desktop region is the control region corresponding to a control location when, after the mobile terminal detects a long-press operation in the touch-screen event, it judges that the starting point of the long-press operation corresponds to that control location on the display screen;
the whole desktop is the target operation region when, after the mobile terminal detects a long-press operation in the touch-screen event, it judges that the starting point of the long-press operation corresponds to a non-control location on the display screen;
specifically, after the mobile terminal determines the target operation region and detects a single-point move operation or a multi-point move operation in the same direction, the target operation region is translated in the same direction as the move operation and by a distance corresponding to the sliding distance.
2. The method according to claim 1, characterized in that:
after the mobile terminal detects a long-press operation in the touch-screen event and judges that the starting point of the long-press operation corresponds to a non-control location on the display screen, the whole desktop is taken as the target operation region chosen by the user;
after the mobile terminal detects a long-press operation in the touch-screen event and judges that the starting point of the long-press operation corresponds to a control location on the display screen, the control region corresponding to that control location is taken as the target operation region chosen by the user;
after the mobile terminal detects that the trajectory points in the touch-screen event form an enclosed region, the corresponding enclosed region on the desktop is taken as the target operation region chosen by the user.
3. The method according to claim 2, characterized in that:
after the mobile terminal determines the target operation region, the target operation region is displayed in a selected state.
4. The method according to claim 1, 2 or 3, characterized in that:
after the mobile terminal determines the target operation region, a shrink control and an enlarge control are displayed on the touch display screen; after a short-press operation is detected at the position corresponding to the shrink control, the target operation region is displayed reduced, and after a short-press operation is detected at the position corresponding to the enlarge control, the target operation region is displayed enlarged;
or
after the mobile terminal determines the target operation region and detects a multi-point operation in which the contacts move toward each other, the target operation region is reduced in proportion to the sliding distance; after a multi-point operation in which the contacts move apart is detected, the target operation region is enlarged in proportion to the sliding distance.
5. A mobile terminal for desktop display control, including a central processing module, a user interface management module, and a human-machine interface module for detecting the user's operations on the touch display screen and for displaying the desktop on the touch display screen, wherein,
the central processing module is configured to learn of the user's touch-screen event on the touch display screen through the human-machine interface module, determine a target operation region according to the touch-screen event, and control the user interface management module to enlarge, reduce, or translate the display of the target operation region, the target operation region being the whole desktop or a local desktop region; the central processing module is further configured to, after determining the target operation region and detecting through the human-machine interface module a single-point move operation or a multi-point move operation in the same direction, translate the target operation region through the user interface management module in the same direction as the move operation and by a distance corresponding to the sliding distance;
the user interface management module is configured to control the human-machine interface module to display the target operation region according to instructions from the central processing module, and is further configured to support enlarging, reducing, or translating the display of the whole desktop as well as of a local desktop region;
wherein the local desktop region is the control region corresponding to a control location when, after the central processing module detects a long-press operation in the touch-screen event through the human-machine interface module, it judges that the starting point of the long-press operation corresponds to that control location on the display screen;
the whole desktop is the target operation region when, after the central processing module detects a long-press operation in the touch-screen event through the human-machine interface module, it judges that the starting point of the long-press operation corresponds to a non-control location on the display screen.
6. The mobile terminal according to claim 5, characterized in that:
the central processing module is further configured to, after detecting a long-press operation in the touch-screen event through the human-machine interface module and judging that the starting point of the long-press operation corresponds to a non-control location on the display screen, take the whole desktop as the target operation region chosen by the user; or, after detecting a long-press operation in the touch-screen event through the human-machine interface module and judging that the starting point of the long-press operation corresponds to a control location on the display screen, take the control region corresponding to that control location as the target operation region chosen by the user; and is further configured to, after detecting through the human-machine interface module that the trajectory points in the touch-screen event form an enclosed region, take the corresponding enclosed region on the desktop as the target operation region chosen by the user.
7. The mobile terminal according to claim 6, characterized in that:
the central processing module is further configured to, after determining the target operation region, display the target operation region in a selected state.
8. The mobile terminal according to claim 5, 6 or 7, characterized in that:
the central processing module is further configured to, after determining the target operation region, display a shrink control and an enlarge control on the touch display screen through the user interface management module; after a short-press operation is detected at the position corresponding to the shrink control through the human-machine interface module, the target operation region is displayed reduced through the user interface management module, and after a short-press operation is detected at the position corresponding to the enlarge control through the human-machine interface module, the target operation region is displayed enlarged through the user interface management module; or the central processing module is further configured to, after determining the target operation region and detecting through the human-machine interface module a multi-point operation in which the contacts move toward each other, reduce the target operation region in proportion to the sliding distance through the user interface management module, and after detecting through the human-machine interface module a multi-point operation in which the contacts move apart, enlarge the target operation region in proportion to the sliding distance through the user interface management module.
CN201110409344.4A 2011-12-09 2011-12-09 Method and mobile terminal for desktop display control CN102520860B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201110409344.4A CN102520860B (en) 2011-12-09 2011-12-09 Method and mobile terminal for desktop display control

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201110409344.4A CN102520860B (en) 2011-12-09 2011-12-09 Method and mobile terminal for desktop display control
PCT/CN2012/070927 WO2013082881A1 (en) 2011-12-09 2012-02-07 Desktop display control method and mobile terminal

Publications (2)

Publication Number Publication Date
CN102520860A CN102520860A (en) 2012-06-27
CN102520860B 2018-01-19

Family

ID=46291807

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201110409344.4A CN102520860B (en) 2011-12-09 2011-12-09 Method and mobile terminal for desktop display control

Country Status (2)

Country Link
CN (1) CN102520860B (en)
WO (1) WO2013082881A1 (en)

Families Citing this family (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8866770B2 (en) * 2012-03-19 2014-10-21 Mediatek Inc. Method, device, and computer-readable medium for changing size of touch permissible region of touch screen
WO2014003337A1 (en) * 2012-06-28 2014-01-03 한양대학교 산학협력단 Method for adjusting ui and user terminal using same
CN102830914B (en) * 2012-07-31 2018-06-05 北京三星通信技术研究有限公司 The method and its equipment of operating terminal equipment
CN103593132A (en) * 2012-08-16 2014-02-19 腾讯科技(深圳)有限公司 Touch device and gesture recognition method
CN102880411B (en) * 2012-08-20 2016-09-21 东莞宇龙通信科技有限公司 Mobile terminal and touch operation method thereof
CN103677543A (en) * 2012-09-03 2014-03-26 中兴通讯股份有限公司 Method for adjusting screen display area of mobile terminal and mobile terminal
EP2730999A4 (en) * 2012-09-17 2014-07-23 Huawei Device Co Ltd Touch operation processing method and terminal device
CN102902481B (en) * 2012-09-24 2016-12-21 东莞宇龙通信科技有限公司 Terminal and terminal operation method
CN102855066B (en) * 2012-09-26 2017-05-17 东莞宇龙通信科技有限公司 Terminal and terminal control method
CN103309604A (en) * 2012-11-16 2013-09-18 中兴通讯股份有限公司 Terminal and method for controlling information display on terminal screen
CN103902206B (en) * 2012-12-25 2017-11-28 广州三星通信技术研究有限公司 The method and apparatus and mobile terminal of mobile terminal of the operation with touch-screen
CN103294346B (en) * 2013-06-20 2018-03-06 锤子科技(北京)有限公司 The window moving method and its device of a kind of mobile device
CN103324347B (en) * 2013-06-27 2017-09-22 广东欧珀移动通信有限公司 A kind of operating method and system of the mobile terminal based on many contact panels
CN103414829A (en) * 2013-08-27 2013-11-27 深圳市金立通信设备有限公司 Method, device and terminal device for controlling screen contents
CN103472996A (en) * 2013-09-17 2013-12-25 深圳市佳创软件有限公司 Method and device for receiving touch in mobile device
US9733806B2 (en) 2013-10-09 2017-08-15 Htc Corporation Electronic device and user interface operating method thereof
CN103530035A (en) * 2013-10-09 2014-01-22 深圳市中兴移动通信有限公司 Touch control terminal and area operating method of touch control terminal
CN104571777A (en) * 2013-10-09 2015-04-29 宏达国际电子股份有限公司 Electronic device and user interface operation method thereof
CN104571799B (en) * 2013-10-28 2019-02-05 联想(北京)有限公司 Information processing method and electronic equipment
CN103902218A (en) * 2013-12-27 2014-07-02 深圳市同洲电子股份有限公司 Mobile terminal screen displaying method and mobile terminal
CN103888840B (en) * 2014-03-27 2017-03-29 电子科技大学 A kind of video mobile terminal Real Time Dragging and the method and device for scaling
CN108132741A (en) * 2014-06-03 2018-06-08 联想(北京)有限公司 A kind of information processing method and electronic equipment
CN105700763A (en) * 2014-11-25 2016-06-22 中兴通讯股份有限公司 Terminal interface window moving method and terminal interface window moving device
CN104915111B (en) * 2015-05-28 2018-08-14 努比亚技术有限公司 terminal operation control method and device
CN104932776A (en) * 2015-06-29 2015-09-23 联想(北京)有限公司 Information processing method and electronic equipment
CN105117100A (en) * 2015-08-19 2015-12-02 小米科技有限责任公司 Target object display method and apparatus
CN105224169B (en) * 2015-09-09 2019-02-05 魅族科技(中国)有限公司 A kind of Interface Moving method and terminal
CN105404456B (en) * 2015-12-22 2019-01-22 厦门美图移动科技有限公司 A kind of mobile terminal dialing keyboard management method and device
CN107015749A (en) * 2016-01-28 2017-08-04 中兴通讯股份有限公司 A kind of method for showing interface and mobile terminal for mobile terminal
CN105930252A (en) * 2016-04-29 2016-09-07 杨夫春 Mobile terminal file memory display method
CN106354396A (en) * 2016-08-26 2017-01-25 乐视控股(北京)有限公司 Interface adjustment method and device
CN106686232A (en) * 2016-12-27 2017-05-17 努比亚技术有限公司 Method for optimizing control interfaces and mobile terminal

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101650633A (en) * 2009-07-03 2010-02-17 苏州佳世达电通有限公司;佳世达科技股份有限公司 Manipulating method of electronic device
CN102163126A (en) * 2010-02-24 2011-08-24 宏达国际电子股份有限公司 Display method and electronic device for using the same

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5045559B2 (en) * 2008-06-02 2012-10-10 富士通モバイルコミュニケーションズ株式会社 Mobile device
CN104298398A (en) * 2008-12-04 2015-01-21 三菱电机株式会社 Display input device
US9182854B2 (en) * 2009-07-08 2015-11-10 Microsoft Technology Licensing, Llc System and method for multi-touch interactions with a touch sensitive screen
CN102023788A (en) * 2009-09-15 2011-04-20 宏碁股份有限公司 Control method for touch screen display frames

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101650633A (en) * 2009-07-03 2010-02-17 苏州佳世达电通有限公司;佳世达科技股份有限公司 Manipulating method of electronic device
CN102163126A (en) * 2010-02-24 2011-08-24 宏达国际电子股份有限公司 Display method and electronic device for using the same

Also Published As

Publication number Publication date
CN102520860A (en) 2012-06-27
WO2013082881A1 (en) 2013-06-13

Similar Documents

Publication Publication Date Title
TWI511027B (en) Method, device and apparatus of splitting screen
EP2354929B1 (en) Automatic keyboard layout determination
US8421762B2 (en) Device, method, and graphical user interface for manipulation of user interface objects with activation regions
US9477404B2 (en) Device, method, and graphical user interface for managing concurrently open software applications
EP2106652B1 (en) Portable electronic device, method, and graphical user interface for displaying electronic lists and documents
US9436381B2 (en) Device, method, and graphical user interface for navigating and annotating an electronic document
US9535600B2 (en) Touch-sensitive device and touch-based folder control method thereof
JP6040269B2 (en) Method and graphical user interface for editing on a multifunction device having a touch screen display
US8438500B2 (en) Device, method, and graphical user interface for manipulation of user interface objects with activation regions
US10394441B2 (en) Device, method, and graphical user interface for controlling display of application windows
KR101911088B1 (en) Haptic feedback assisted text manipulation
EP2975512B1 (en) Device and method for displaying a virtual loupe in response to a user contact
US10175879B2 (en) Device, method, and graphical user interface for zooming a user interface while performing a drag operation
JP6138866B2 (en) Device, method and graphical user interface for document manipulation
US8766928B2 (en) Device, method, and graphical user interface for manipulating user interface objects
US10168868B2 (en) Method and apparatus for multitasking
EP2941687B1 (en) User interface for a computing device
AU2008100003A4 (en) Method, system and graphical user interface for viewing multiple application windows
US10102010B2 (en) Layer-based user interface
EP2469398B1 (en) Selecting of text using gestures
US8347232B1 (en) Interactive user interface
US20190012353A1 (en) Multifunction device with integrated search and application selection
JP2013529339A (en) Portable electronic device and method for controlling the same
US20140165006A1 (en) Device, Method, and Graphical User Interface for Managing Folders with Multiple Pages
US20110310026A1 (en) Easy word selection and selection ahead of finger

Legal Events

Date Code Title Description
PB01 Publication
C06 Publication
SE01 Entry into force of request for substantive examination
C10 Entry into substantive examination
GR01 Patent grant
GR01 Patent grant