WO2018040502A1 - Screen control method and apparatus, terminal and computer storage medium - Google Patents

Screen control method and apparatus, terminal and computer storage medium

Info

Publication number
WO2018040502A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch screen
area
hand
terminal
inoperable
Prior art date
Application number
PCT/CN2017/072294
Other languages
English (en)
Chinese (zh)
Inventor
石磊
Original Assignee
中兴通讯股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 中兴通讯股份有限公司
Publication of WO2018040502A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04812: Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects

Definitions

  • the present invention relates to the field of communications technologies, and in particular, to a screen control method, apparatus, terminal, and computer storage medium.
  • an embodiment of the present invention provides a screen control method, device, terminal, and computer storage medium, which at least solve the problem in the prior art that one-handed operation is impracticable because the terminal screen offers only a single effective operation mode.
  • an embodiment of the present invention provides a screen manipulation method, where the method includes:
  • when the user operates the terminal with one hand, according to the active areas on a first touch screen and a second touch screen of the terminal, the first touch screen and the second touch screen are respectively divided into a one-hand operable area and a one-hand inoperable area;
  • At least the interactive interface of the one-hand inoperable area on the first touch screen is mapped to the one-hand operable area of the second touch screen.
  • in some embodiments, dividing the first touch screen and the second touch screen respectively into a one-hand operable area and a one-hand inoperable area according to the active areas on the first touch screen and the second touch screen of the terminal includes:
  • dividing the touchable area in the lower part of the first touch screen into a one-hand operable area, and dividing the other areas of the first touch screen into one-hand inoperable areas;
  • dividing the touchable area in the upper part of the second touch screen into a one-hand operable area, and dividing the other areas of the second touch screen into one-hand inoperable areas.
  • the method further includes:
  • the original one-hand operable area and the one-hand inoperable area on the first touch screen are respectively reset to a one-hand inoperable area and a one-hand operable area;
  • in other embodiments, dividing the first touch screen and the second touch screen respectively into a one-hand operable area and a one-hand inoperable area according to the active areas on the first touch screen and the second touch screen of the terminal includes:
  • dividing the touchable area in the middle of the first touch screen into a one-hand operable area, and dividing the areas above and below that one-hand operable area on the first touch screen into one-hand inoperable areas;
  • dividing the touchable area in the upper part of the second touch screen into a one-hand operable area, dividing that one-hand operable area into two one-hand operable sub-areas, and dividing the other areas of the second touch screen into one-hand inoperable areas;
  • mapping the interaction interface of the one-hand inoperable area on the first touch screen to the one-hand operable area of the second touch screen includes:
  • setting an appropriate ratio according to the area of the one-hand inoperable area on the first touch screen and the area of the one-hand operable area on the second touch screen, and scaling down the interaction interface of the one-hand inoperable area on the first touch screen according to the set ratio before mapping it to the one-hand operable area of the second touch screen, as sketched below.
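  • As an illustrative sketch only, and not part of the original disclosure, the ratio-based scaling described above can be modelled as follows; the rectangular areas, the uniform-scale choice, and all names and values are assumptions made for illustration.

```kotlin
// Sketch: map a point of the one-hand inoperable area on the first touch screen
// into the one-hand operable area of the second touch screen, shrinking by a
// ratio derived from the two areas. All names and values are illustrative.

data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    val width get() = right - left
    val height get() = bottom - top
}

data class Point(val x: Float, val y: Float)

// A single uniform ratio so the whole inoperable interface fits into the target area.
fun scaleRatio(source: Rect, target: Rect): Float =
    minOf(target.width / source.width, target.height / source.height)

// Map a position in the source (inoperable) area to the target (operable) area.
fun mapToTarget(p: Point, source: Rect, target: Rect): Point {
    val s = scaleRatio(source, target)
    return Point(target.left + (p.x - source.left) * s,
                 target.top + (p.y - source.top) * s)
}

fun main() {
    val a1 = Rect(0f, 0f, 1080f, 1200f)   // inoperable area on the first screen
    val b1 = Rect(0f, 0f, 1080f, 600f)    // operable area on the second screen
    println(mapToTarget(Point(540f, 600f), a1, b1))   // Point(x=270.0, y=300.0)
}
```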
  • the method further includes:
  • after an operation of the user is received on the second touch screen, it is determined whether the operation is a preset effective operation; if so, the operation is mapped to the corresponding position of the one-hand inoperable area on the first touch screen; otherwise, the operation is ignored (see the sketch below).
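  • A minimal sketch of this "preset effective operation" gate is given below, purely for illustration and not as part of the original disclosure: a touch on the second touch screen is forwarded only if it is a double tap or a press whose reported pressure exceeds a threshold. The class name, the thresholds, and the simplification of ignoring the tap position are all assumptions.

```kotlin
// Sketch: decide whether a touch on the second screen counts as a preset effective
// operation (double tap, or pressure above a threshold); only effective touches
// are forwarded to the first screen. Values and names are illustrative.

data class Touch(val x: Float, val y: Float, val pressure: Float, val timeMs: Long)

class EffectiveOperationFilter(
    private val pressureThreshold: Float = 0.6f,
    private val doubleTapWindowMs: Long = 300
) {
    private var lastTapMs: Long = -1

    fun isEffective(t: Touch): Boolean {
        val doubleTap = lastTapMs >= 0 && (t.timeMs - lastTapMs) <= doubleTapWindowMs
        lastTapMs = t.timeMs
        return doubleTap || t.pressure >= pressureThreshold
    }
}

fun main() {
    val filter = EffectiveOperationFilter()
    val touches = listOf(
        Touch(100f, 200f, 0.2f, 0),     // light single tap -> ignored
        Touch(102f, 198f, 0.2f, 150),   // second tap within the window -> forwarded
        Touch(300f, 400f, 0.9f, 2000)   // hard press -> forwarded
    )
    touches.forEach { println(if (filter.isEffective(it)) "forward to first screen" else "ignore") }
}
```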
  • after the interaction interface of the one-hand inoperable area on the first touch screen is mapped to the one-hand operable area of the second touch screen, if the user's operation is received on the second touch screen, an operation trajectory of the operation is displayed at the corresponding position of the one-hand inoperable area on the first touch screen.
  • the first touch screen and the second touch screen are located on different faces of the terminal.
  • the terminal further includes other touch screens than the first touch screen and the second touch screen.
  • an embodiment of the present invention further provides a screen manipulation device, including:
  • a dividing module configured to, when the user operates the terminal with one hand, divide the first touch screen and the second touch screen respectively into a one-hand operable area and a one-hand inoperable area according to the active areas on the first touch screen and the second touch screen of the terminal;
  • the mapping module is configured to map at least the interaction interface of the one-hand inoperable area on the first touch screen to the one-hand operable area of the second touch screen.
  • in some embodiments, the dividing module is configured to, when the terminal is in the vertically held state and the user operates the terminal with one hand, divide the touchable area in the lower part of the first touch screen into a one-hand operable area and divide the other areas of the first touch screen into one-hand inoperable areas, and to divide the touchable area in the upper part of the second touch screen into a one-hand operable area and divide the other areas of the second touch screen into one-hand inoperable areas;
  • the mapping module is configured to map an interaction interface of the one-hand inoperable area on the first touch screen to a one-hand operable area of the second touch screen;
  • in other embodiments, the dividing module is configured to: when the terminal is in the vertically held state and the user operates the terminal with one hand, divide the touchable area in the middle of the first touch screen into a one-hand operable area and divide the areas above and below that one-hand operable area on the first touch screen into one-hand inoperable areas; and divide the touchable area in the upper part of the second touch screen into a one-hand operable area, divide that one-hand operable area into two one-hand operable sub-areas, and divide the other areas of the second touch screen into one-hand inoperable areas;
  • the mapping module is configured to: after the dividing module divides the second touch screen into the two one-hand operable sub-areas, map the interaction interfaces of the two one-hand inoperable areas on the first touch screen respectively to the two one-hand operable sub-areas of the second touch screen.
  • the first touch screen and the second touch screen are located on different faces of the terminal.
  • an embodiment of the present invention further provides a terminal, including at least two touch screens and a controller; the at least two touch screens include a first touch screen and a second touch screen;
  • the controller is configured to: when the user operates the terminal with one hand, divide the first touch screen and the second touch screen respectively into a one-hand operable area and a one-hand inoperable area according to the active areas on the first touch screen and the second touch screen of the terminal; and map at least the interactive interface of the one-hand inoperable area on the first touch screen to the one-hand operable area of the second touch screen.
  • the first touch screen and the second touch screen are located on different faces of the terminal.
  • the embodiment of the present invention further provides a computer storage medium, wherein the computer storage medium stores computer executable instructions, and the computer executable instructions are used to execute the screen manipulation method according to the embodiment of the present invention.
  • a screen manipulation method, device, terminal and computer storage medium disclosed in the embodiments of the present invention can map a partial area of one touch screen to another touch screen.
  • by mapping the one-hand inoperable area on the first touch screen to the one-hand operable area on the second touch screen, all the interactive elements on the first touch screen can be operated with one hand.
  • this not only improves the practicability of one-hand operation of the terminal, especially a large-screen terminal, but also improves the experience of a user operating a large-screen terminal with one hand, so that the user gets a better experience in a special environment or when the user can only operate with one hand because of a physical limitation.
  • the touch screen of the existing terminal with two or more screens can be fully utilized to improve the usage rate of the touch screen.
  • FIG. 1 is a flowchart of a screen manipulation method based on two or more touch screens according to Embodiment 1 of the present invention
  • FIG. 2 is a schematic diagram of a front touch screen according to Embodiment 1 of the present invention, showing the one-hand operable area A2 and the one-hand inoperable area A1 into which the front touch screen is divided when the terminal is held vertically;
  • FIG. 3 is a schematic diagram of a back touch screen according to Embodiment 1 of the present invention, showing the one-hand operable area B1 and the one-hand inoperable area B2 into which the back touch screen is divided when the terminal is held vertically;
  • FIG. 4 is a schematic diagram of another front touch screen according to Embodiment 1 of the present invention, showing the one-hand operable area A4 and the one-hand inoperable area A3 into which the front touch screen is divided when the terminal is held upside down;
  • FIG. 5 is a schematic diagram of another back touch screen according to Embodiment 1 of the present invention, showing the one-hand operable area B3 and the one-hand inoperable area B4 into which the back touch screen is divided when the terminal is held upside down;
  • FIG. 6 is a schematic diagram of another front touch screen according to Embodiment 1 of the present invention, showing the one-hand operable area A6 and the one-hand inoperable areas A5 and A7 into which the front touch screen is divided when the terminal is held vertically;
  • FIG. 7 is a schematic diagram of another back touch screen according to Embodiment 1 of the present invention, showing the one-hand operable area B5 and the one-hand inoperable area B6 into which the back touch screen is divided when the terminal is held vertically;
  • FIG. 8 is a schematic diagram of another front touch screen according to Embodiment 1 of the present invention, showing the one-hand operable area A9 and the one-hand inoperable area A8 into which the front touch screen is divided when the terminal is held horizontally;
  • FIG. 9 is a schematic diagram of another back touch screen according to Embodiment 1 of the present invention, showing the one-hand operable area B9 and the one-hand inoperable area B8 into which the back touch screen is divided when the terminal is held horizontally;
  • FIG. 10 is a flowchart of another method for controlling a screen based on two or more touch screens according to Embodiment 2 of the present invention.
  • FIG. 11 is a schematic block diagram of a screen manipulation device based on two or more touch screens according to Embodiment 3 of the present invention.
  • Embodiment 1:
  • the embodiment provides a screen manipulation method based on two or more touch screens.
  • the method of this embodiment is particularly suitable for one-hand operation of a large-screen terminal, and can provide a better one-hand operation experience for the user.
  • the screen manipulation method based on two or more touch screens of this embodiment includes:
  • S101: when the user operates the terminal with one hand, the plurality of touch screens on the terminal are respectively divided into a one-hand operable area and a one-hand inoperable area according to the active areas on the plurality of touch screens;
  • the plurality of touch screens include at least a first touch screen and a second touch screen located on a face different from that of the first touch screen; the active area can be understood as the area of the terminal's screens that the fingers can touch when the user operates with one hand.
  • S102 includes two situations: one is mapping the interactive interface of the one-hand inoperable area on the first touch screen to the one-hand operable area on the second touch screen; the other is mapping the interactive interface of the one-hand inoperable area on the first touch screen to the one-hand operable area on the second touch screen and, in addition, mapping the interactive interface of the one-hand inoperable area on the second touch screen to the one-hand operable area on the first touch screen.
  • the terminal includes, but is not limited to, a mobile phone, a tablet computer, and the like that can be operated by hand.
  • the interface of the terminal operated by a general user is a touch interface located on the front side of the terminal.
  • in this embodiment, either the first touch screen or the second touch screen may be the touch interface on the front side of the terminal.
  • the front touch screen is regarded as the first touch screen, and the other touch screen is regarded as the second touch screen.
  • the front touch screen is generally formed by superimposing the display module and the touch panel in layers, so that the display screen area of the existing terminal can be regarded as the area where the first touch screen is located.
  • the first touch screen may be a single screen, or may be a composite touch screen composed of multiple touch screens, as in the foldable multi-screen mobile phones that have already been developed.
  • such multi-screen mobile phones are also within the scope of application of the method of this embodiment.
  • the division of the one-hand operable area and the one-hand inoperable area on the first touch screen depends on how the terminal is held when the user uses it: if the terminal is held vertically by the user, the one-hand operable area and the one-hand inoperable area of the terminal are generally separated by a horizontal dividing line; if the terminal is held horizontally by the user, they are generally separated by a vertical dividing line. A minimal sketch of this rule follows.
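  • The sketch below only illustrates the rule just described; the enum names are assumptions, and a real terminal would read the grip state from its sensors.

```kotlin
// Sketch: the orientation of the dividing line follows the holding state described
// above (vertical grip -> horizontal dividing line, horizontal grip -> vertical
// dividing line). Names are illustrative.

enum class GripState { VERTICAL, HORIZONTAL }
enum class DividerOrientation { HORIZONTAL_LINE, VERTICAL_LINE }

fun dividerFor(grip: GripState): DividerOrientation = when (grip) {
    GripState.VERTICAL -> DividerOrientation.HORIZONTAL_LINE
    GripState.HORIZONTAL -> DividerOrientation.VERTICAL_LINE
}

fun main() {
    println(dividerFor(GripState.VERTICAL))    // HORIZONTAL_LINE
    println(dividerFor(GripState.HORIZONTAL))  // VERTICAL_LINE
}
```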
  • the second touch screen can be disposed on the back side of the terminal, the side of the terminal, and the like. This embodiment does not limit this.
  • the second touch screen is disposed on the right side of the terminal.
  • the touch screen on the right side is generally strip-shaped.
  • the area touched by the user's finger on the right side touch screen may be divided into a one-hand operable area, and the other areas are divided into one-hand inoperable areas.
  • because the thickness of the terminal is limited, in order to control the first touch screen of the terminal from a right side touch screen of limited width, when the interactive interface of the one-hand inoperable area on the first touch screen is mapped to the right side touch screen, the interactive elements can be mapped onto the strip-shaped side touch screen from top to bottom in a certain order.
  • since most users prefer to operate the terminal with the right hand, disposing the second touch screen on the right side of the terminal suits the holding habit of most users.
  • alternatively, the second touch screen may be disposed on the left side of the terminal, and its design can refer to that of the right side touch screen.
  • the touch screens on the left and right sides can be set on the terminal at the same time.
  • the second touch screen may also be disposed on the upper side or the lower side of the terminal, so that when the user holds the terminal horizontally, the area of the terminal that cannot be operated with one hand can still be controlled; the setting manner can refer to that of the right side touch screen described above.
  • the plurality of touch screens may further include a third touch screen, a fourth touch screen, a fifth touch screen, and so on, each located on a different side; the setting manner of the third touch screen, the fourth touch screen, and the like can be obtained by extending the setting manner of the second touch screen.
  • however, when a touch screen is set on a side of the terminal, its shape is very limited because of the limited area of the side frame, which limits the operation of the interactive interface of the one-hand inoperable area on the first touch screen.
  • in the following description, the touch screen located on the back side of the terminal is taken as the second touch screen.
  • front-and-back dual-screen terminals already exist, such as the YOTA dual-screen smart phone. For a dual-screen terminal of this type, in this embodiment one of the touch screens is set as the first touch screen and the other is set as the second touch screen.
  • the second touch screen may be a screen having only a touch function.
  • depending on the holding state, the division of the one-hand operable area and the one-hand inoperable area of the first touch screen may change, and so may the division of the one-hand operable area and the one-hand inoperable area of the second touch screen; the following two cases are distinguished: the terminal being in the vertically held state and in the horizontally held state.
  • when the terminal is in the vertically held state, the user holds the terminal in one hand, generally at the lower part of the terminal. At this time, the user's operations on the first touch screen are generally performed with the thumb, the other fingers generally cannot operate the first touch screen, and the area of the first touch screen that the thumb can touch is generally in the lower part of the terminal.
  • accordingly, the division of the first touch screen includes: when the user operates the terminal with one hand, dividing the area in the lower part of the first touch screen that the finger can touch into a one-hand operable area, and dividing the other areas of the first touch screen into one-hand inoperable areas.
  • that is, the area that the user's finger (generally the thumb) can touch is divided into the one-hand operable area, and the upper part of the terminal that the user's finger cannot touch is divided into the one-hand inoperable area.
  • meanwhile, the user can operate the second touch screen with other fingers, such as the index finger or the middle finger.
  • since the active range of the index finger and the middle finger is in the upper half of the second touch screen, the division of the second touch screen includes: dividing the area in the upper part of the second touch screen that the user's finger can touch into a one-hand operable area, and dividing the other areas of the second touch screen into one-hand inoperable areas.
  • in other words, the area in the upper part of the second touch screen that the index finger or middle finger can touch is divided into the one-hand operable area, and the area that the index finger or middle finger cannot touch is divided into the one-hand inoperable area.
  • the specific mapping process of S102 includes: mapping an interaction interface of the one-hand inoperable area on the first touch screen to a one-hand operable area on the second touch screen.
  • S102 may further include mapping the interactive interface of the one-hand inoperable area on the second touch screen to the one-hand operable area on the first touch screen; however, it should be considered that when the user holds the terminal with one hand, the palm generally rests against the second touch screen, so accidental touches on it are inevitable.
  • in that case, the specific mapping process in S102 includes: mapping the interface of the one-hand inoperable area on the second touch screen to the one-hand operable area of the first touch screen.
  • specifically, S102 may be implemented as follows: an appropriate ratio is set according to the area of the one-hand inoperable area on the first touch screen of the terminal and the area of the one-hand operable area on the second touch screen, and the interactive interface of the one-hand inoperable area on the first touch screen is scaled according to the set ratio and mapped to the one-hand operable area of the second touch screen.
  • a non-proportional scaling of the interactive interface of the one-hand inoperable area on the first touch screen may also be adopted, as long as all the UI (User Interface) elements on the first touch screen are fully mapped onto the second touch screen.
  • UI elements include, but are not limited to, menu items, application icons of the UI interface, and the like.
  • if the operation interface of the one-hand inoperable area on the second touch screen is also mapped to the one-hand operable area on the first touch screen, an appropriate ratio may likewise be set according to the area of the one-hand inoperable area on the second touch screen of the terminal and the area of the one-hand operable area on the first touch screen, and the interactive interface of the one-hand inoperable area on the second touch screen is scaled according to the set ratio and mapped to the one-hand operable area of the first touch screen.
  • when the user holds the terminal vertically, two states can be distinguished: a positive grip and a reverse grip. It can be understood that in these two states the upper and lower regions of the first touch screen and the second touch screen are reversed.
  • when the user holds the terminal in a positive grip, for both the first touch screen and the second touch screen, the area near the front camera of the terminal is the upper area and the area near the touch keys of the terminal is the lower area; when the user holds the terminal in a reverse grip, the area near the front camera is the lower area and the area near the touch keys is the upper area (see the sketch below).
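  • The sketch below only illustrates the effect of a grip reversal on the divided regions; the region labels follow FIG. 2 and FIG. 3, and everything else is an assumption made for illustration.

```kotlin
// Sketch: when the grip direction is reversed, the roles of the previously divided
// regions swap on each screen, i.e. the former operable area becomes inoperable
// and vice versa. Labels follow FIG. 2 and FIG. 3; names are illustrative.

data class ScreenSplit(val operable: String, val inoperable: String)

fun onGripReversed(s: ScreenSplit) = ScreenSplit(operable = s.inoperable, inoperable = s.operable)

fun main() {
    val front = ScreenSplit(operable = "A2", inoperable = "A1")   // positive grip, FIG. 2
    val back = ScreenSplit(operable = "B1", inoperable = "B2")    // positive grip, FIG. 3
    println(onGripReversed(front))   // ScreenSplit(operable=A1, inoperable=A2)
    println(onGripReversed(back))    // ScreenSplit(operable=B2, inoperable=B1)
}
```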
  • the terminal in this embodiment may set a mapping touch function key (virtual or physical), and the step of S101 is performed after the mapping touch function key is turned on by the user.
  • after the mapping setup is completed, that is, after S102, the settings of the first touch screen and the second touch screen may be stored.
  • the division of the first touch screen and the second touch screen may be implemented by the terminal alone, or may be implemented by the terminal according to the real-time setting of the user.
  • for the former, the terminal may pre-store the division rules of the first touch screen and the second touch screen for the vertically held and horizontally held states; taking the second touch screen being the back touch screen as an example, the terminal may also obtain the effective area over the network, and the effective area can be determined according to the terminal.
  • the setting of the one-hand inoperable area and the one-hand operable area on the second touch screen is similar to the above method.
  • alternatively, the user can set the areas according to the actual size of his or her own hand (left or right hand) and the actual size of the terminal.
  • when the terminal is held vertically, the user can draw an arc with a finger (generally the thumb) on the first touch screen from the left side of the screen to the right side or from the right side to the left side (when the terminal screen is large, the arc can be drawn from the right side of the screen down to the bottom of the screen); the terminal then sets the area above the arc as the one-hand inoperable area and the area below the arc as the one-hand operable area.
  • on the second touch screen, the user draws an arc with the index finger from the left side of the screen to the right side or from the right side to the left side; the terminal sets the area above the arc as the one-hand operable area and the area below the arc as the one-hand inoperable area. A sketch of this arc-based division follows.
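  • The arc-based division can be sketched as follows, purely as an illustration and not as part of the original disclosure: the arc drawn by the user is stored as sampled points, and a pixel is classified by comparing its vertical coordinate with the interpolated height of the arc. All names and coordinates are assumptions.

```kotlin
// Sketch: divide a screen by the arc the user traces with the thumb or index
// finger. The arc is kept as sampled points; a pixel is "above" the arc when its
// y coordinate is smaller than the interpolated arc height (screen y grows
// downward). Names and sample values are illustrative.

data class P(val x: Float, val y: Float)

class ArcDivider(samples: List<P>) {
    private val pts = samples.sortedBy { it.x }

    // Linearly interpolate the arc's y at a given x (clamped to the arc's ends).
    private fun arcY(x: Float): Float {
        if (x <= pts.first().x) return pts.first().y
        if (x >= pts.last().x) return pts.last().y
        val i = pts.indexOfFirst { it.x >= x }
        val (a, b) = pts[i - 1] to pts[i]
        val t = (x - a.x) / (b.x - a.x)
        return a.y + t * (b.y - a.y)
    }

    fun isAboveArc(p: P): Boolean = p.y < arcY(p.x)
}

fun main() {
    // Arc drawn by the thumb from the left edge toward the lower right of the front screen.
    val divider = ArcDivider(listOf(P(0f, 1400f), P(400f, 1300f), P(800f, 1500f), P(1080f, 1700f)))
    println(divider.isAboveArc(P(500f, 600f)))   // true  -> one-hand inoperable area (e.g. A1)
    println(divider.isAboveArc(P(500f, 1600f)))  // false -> one-hand operable area (e.g. A2)
}
```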
  • FIG. 2 and FIG. 3 are schematic diagrams of, respectively, the first touch screen and the second touch screen divided into a one-hand operable area and a one-hand inoperable area when the terminal is held vertically; in FIG. 2 and FIG. 3, the terminal is in the positive grip state.
  • when the user holds the terminal vertically, after the mapping touch function is turned on, the user may be prompted by an animation to trace with the thumb, on the first touch screen shown in FIG. 2, the area the thumb can cover.
  • typically, an arc is drawn from the right side of the screen to the bottom end of the screen, as shown by the arc in FIG. 2; the area above the arc is A1, which is the user's one-hand inoperable area, and the area below the arc is A2, which is the user's one-hand operable area.
  • the user may then be prompted to trace with the index finger, on the second touch screen shown in FIG. 3, the area the index finger can cover; typically, an arc is drawn from the right side of the screen to the left side of the screen.
  • the area above the arc is B1, which is the user's one-hand operable area, and the area below the arc is B2, which is the user's one-hand inoperable area.
  • the interactive interface of the A1 area (application icons or menu items) is mapped to the B1 area at a certain ratio, and when a click operation is triggered in the B1 area, it is mapped back to the corresponding position in the A1 area, as sketched below.
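  • The reverse mapping of a tap can be sketched as follows, purely for illustration: a tap in the B1 area is translated back into the A1 area with the inverse of the ratio that was used to shrink A1 into B1. The rectangles, the ratio value, and all names are assumptions.

```kotlin
// Sketch: translate a tap received in the B1 area of the back screen into the
// corresponding position of the A1 area on the front screen, using the inverse of
// the shrinking ratio. Names and values are illustrative.

data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)
data class Point(val x: Float, val y: Float)

fun mapTapBack(tap: Point, b1: Rect, a1: Rect, ratio: Float): Point =
    Point(a1.left + (tap.x - b1.left) / ratio,
          a1.top + (tap.y - b1.top) / ratio)

fun main() {
    val a1 = Rect(0f, 0f, 1080f, 1200f)   // inoperable area on the front screen
    val b1 = Rect(0f, 0f, 540f, 600f)     // operable area on the back screen
    val ratio = 0.5f                      // the ratio A1 was scaled down by
    println(mapTapBack(Point(270f, 300f), b1, a1, ratio))   // Point(x=540.0, y=600.0) in A1
}
```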
  • when the user holds the mobile phone upside down, a smart phone with an automatic rotation function automatically adjusts the UI interface as the user's grip direction changes.
  • in this case, the first touch screen is again divided into a one-hand operable area and a one-hand inoperable area, as described below.
  • when the user holds the terminal upside down, after the mapping touch function is turned on, the user may be prompted by an animation to trace with the thumb, on the first touch screen shown in FIG. 4, the area the thumb can cover.
  • typically, an arc is drawn from the right side of the screen to the bottom end of the screen, as shown by the arc in FIG. 4; the area above the arc is A3, which is the user's one-hand inoperable area, and the area below the arc is A4, which is the user's one-hand operable area.
  • the user may then be prompted to trace with the index finger, on the second touch screen shown in FIG. 5, the area the index finger can cover; typically, an arc is drawn from the right side of the screen to the left side of the screen.
  • the area above the arc is B3, which is the user's one-hand operable area, and the area below the arc is B4, which is the user's one-hand inoperable area.
  • the interactive interface of the A3 area (application icons or menu items) is proportionally mapped to the B3 area, and when a click operation is triggered in the B3 area, it is mapped back to the corresponding position in the A3 area.
  • alternatively, when the grip direction is reversed, the original one-hand operable area on the first touch screen (the A2 area in FIG. 2) is set as a one-hand inoperable area, and the original one-hand inoperable area (the A1 area in FIG. 2) is set as a one-hand operable area; similarly, the original one-hand operable area on the second touch screen (the B1 area in FIG. 3) is set as a one-hand inoperable area.
  • some users are accustomed to holding the middle portion of the terminal with one hand. In this case, if the terminal screen is large, the user's thumb may not be able to reach the upper and lower parts of the screen and can only cover a section of the screen in the middle.
  • in this case, the division of the first touch screen includes: when the user operates the terminal with one hand, dividing the area in the middle of the first touch screen that the finger can touch into a one-hand operable area, and dividing the areas above and below that one-hand operable area into one-hand inoperable areas. The division of the second touch screen includes: dividing the area in the upper part of the second touch screen that the user's finger can touch into a one-hand operable area, dividing the other areas of the second touch screen into one-hand inoperable areas, and dividing the one-hand operable area of the second touch screen into two one-hand operable sub-areas.
  • correspondingly, step S102 includes: mapping the interaction interfaces of the two one-hand inoperable areas on the first touch screen respectively to the two one-hand operable sub-areas of the second touch screen. It can be understood that the two one-hand inoperable areas on the first touch screen and the two one-hand operable sub-areas of the second touch screen are in one-to-one correspondence (see the sketch below).
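  • The one-to-one correspondence can be sketched as a simple pairing table, for illustration only; whether B51 pairs with A5 or with A7 is a design choice, and both options appear in the description.

```kotlin
// Sketch: pair the two inoperable areas on the front screen (A5, A7) with the two
// operable sub-areas on the back screen (B51, B52). The pairing chosen here is one
// of the options mentioned in the text; names are illustrative.

fun main() {
    val pairing = mapOf(
        "B51" to "A7",
        "B52" to "A5"
    )
    // A touch landing in a sub-area is forwarded into its paired front-screen area.
    pairing.forEach { (subArea, frontArea) -> println("touch in $subArea -> dispatch into $frontArea") }
}
```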
  • the above division may be implemented by the terminal alone or by the terminal according to the real-time setting of the user.
  • FIG. 6 and FIG. 7 are schematic diagrams of, respectively, the first touch screen and the second touch screen divided into one-hand operable and one-hand inoperable areas in this case.
  • in FIG. 6, the arc is the curve along which the thumb slides in the middle of the first touch screen; the A5 and A7 areas are areas that the user's thumb cannot cover, and the A6 area is the middle section of the screen that the user can cover. To simplify the area division, the rectangle containing the thumb's sliding arc is used as the A6 area.
  • the first touch screen area is divided into three areas: A5, A6, and A7.
  • the one-hand operable area B5 in FIG. 7 is divided into two one-hand operable sub-areas B51 and B52, which correspond to A5 and A7, respectively.
  • the division into the one-hand operable sub-areas B51 and B52 can follow the manner shown in FIG. 7, or the one-hand operable area B5 can be divided horizontally.
  • the interaction interfaces of the A5 and A7 areas are mapped to the B51 and B52 areas, and touch signals in the B51 and B52 areas correspond back to the A5 and A7 areas, thereby solving the difficulty of reaching the interactive elements of the A5 and A7 areas during one-handed operation.
  • alternatively, B51 may correspond to A7 and B52 to A5 in this embodiment.
  • when the terminal is in the horizontally held state and the user operates the terminal with one hand, the first touch screen is divided as follows: the area on the one-hand operation side of the first touch screen that the finger (generally the thumb) can touch is divided into a one-hand operable area, and the other areas of the first touch screen are divided into one-hand inoperable areas; the second touch screen is divided as follows: the area on the one-hand operation side of the second touch screen that the finger can touch is divided into a one-hand operable area, and the other areas of the second touch screen are divided into one-hand inoperable areas.
  • the above setting takes the user's usage habits into consideration.
  • for example, the right areas of the first touch screen and the second touch screen may be set as the user's one-hand operable areas, and the left areas as the one-hand inoperable areas.
  • the interaction interface of the one-hand inoperable area of the first touch screen may be mapped to the one-hand operable area of the second touch screen.
  • referring to FIG. 8 and FIG. 9, the user holds the terminal horizontally and operates it with one hand.
  • the user may be prompted by an animation to trace with the thumb, on the first touch screen shown in FIG. 8, the area the thumb can cover.
  • typically, an arc is drawn with the thumb from the upper side of the screen to the lower side (or from the lower side to the upper side); as shown by the arc in FIG. 8, the area to the left of the arc is A8, which is the user's one-hand inoperable area, and the area to the right of the arc is A9, which is the user's one-hand operable area.
  • the user may then be prompted to trace, on the second touch screen shown in FIG. 9, the area that can be covered with a finger that can move on the second touch screen, such as the index finger or the middle finger.
  • typically, an arc is drawn from the upper side of the second touch screen to the lower side; as shown in FIG. 9, the area to the left of the arc is B8, which is the user's one-hand inoperable area, and the area to the right of the arc is B9, which is the user's one-hand operable area.
  • the user's index finger or middle finger can move on the back of the mobile phone, and the area best covered by the index finger is the B9 area. Since the areas of the A8 area and the B9 area are not necessarily the same, the interactive interface of the A8 area (application icons or menu items) is proportionally mapped to the B9 area, and when a click operation is triggered in the B9 area, it is mapped back to the corresponding position in the A8 area.
  • the foregoing dividing process may also be implemented independently by the terminal, and may be divided according to a preset dividing rule.
  • the specific content and acquisition method of the preset division rule can refer to the previous related description.
  • the above describes the division of the one-hand operable area and the one-hand inoperable area on the terminal when the user holds the terminal vertically and horizontally. After the division is completed, the user can control all the interactive elements on the first touch screen through touch operations on the first touch screen and the second touch screen.
  • the second touch screen monitors whether an operation occurs on the screen and whether that operation is a preset effective operation; if so, the operation is mapped to the corresponding position of the one-hand inoperable area on the first touch screen; otherwise, the operation is ignored.
  • the preset effective operations include, but are not limited to, a double-tap operation, or a tap whose pressure, as calculated by a pressure sensor, exceeds a certain threshold, so that the effective operation is distinguished from an ordinary tap.
  • in addition, the operation trajectory of the operation is displayed at the corresponding position of the one-hand inoperable area on the first touch screen, which makes it convenient for the user to move the finger on the second touch screen up, down, left, and right toward the interactive element, such as an application icon, that he or she wants to open (see the sketch below).
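  • A minimal sketch of echoing the finger trajectory is given below, for illustration only: each move event on the back screen is mapped into the front-screen area and appended to a polyline that the front screen would render as a cursor trail. Rendering is platform specific and only hinted at in comments; all names and the ratio are assumptions.

```kotlin
// Sketch: collect the trajectory of the finger moving on the back screen, mapped
// into the inoperable area of the front screen, so the front screen can draw it
// as a trail for the user to follow. Names and values are illustrative.

data class Pt(val x: Float, val y: Float)

class TrajectoryEcho(private val ratio: Float, private val frontOrigin: Pt, private val backOrigin: Pt) {
    private val trail = mutableListOf<Pt>()

    fun onBackScreenMove(p: Pt): List<Pt> {
        val mapped = Pt(frontOrigin.x + (p.x - backOrigin.x) / ratio,
                        frontOrigin.y + (p.y - backOrigin.y) / ratio)
        trail += mapped          // the front screen would draw this growing polyline
        return trail
    }
}

fun main() {
    val echo = TrajectoryEcho(ratio = 0.5f, frontOrigin = Pt(0f, 0f), backOrigin = Pt(0f, 0f))
    listOf(Pt(10f, 10f), Pt(20f, 15f), Pt(30f, 25f)).forEach { echo.onBackScreenMove(it) }
    println(echo.onBackScreenMove(Pt(40f, 40f)))   // the trail shown over the front screen
}
```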
  • in summary, according to the active area of the user's one-hand operation on each touch screen, the first touch screen and the second touch screen are respectively divided into a one-hand operable area and a one-hand inoperable area, and the interactive interface of the one-hand inoperable area on the first touch screen is mapped to the one-hand operable area on the second touch screen.
  • this achieves one-handed operation of all the interactive elements on the front touch screen, which not only improves the feasibility and practicability of one-hand operation of the terminal, especially a large-screen terminal, but also improves the user experience, so that the user gets a better experience in a special environment or when the user can only operate with one hand because of a physical limitation.
  • furthermore, the user's operation trace on the second touch screen may be correspondingly displayed on the first touch screen, so that the user can move the finger toward the UI element that needs to be operated, which reduces the difficulty of one-hand operation and improves the accuracy of the user's operation on the second touch screen.
  • the user's erroneous operation can be avoided, and the user experience is improved.
  • Embodiment 2:
  • the embodiment provides a screen manipulation method based on two or more touch screens, including:
  • the terminal prompts the user, in an animated manner, to slide on the first touch screen and the second touch screen, and determines the respective finger-reachable areas;
  • the first touch screen of the terminal generally overlaps with the display screen.
  • the manner in which the user holds the terminal is a vertical grip.
  • the user can draw a curve with the thumb on the display screen from the left side of the screen to the right side, and the area on the lower side of the curve is the area the finger can cover; the user then draws a curve with the index finger on the second touch screen, and the area on the upper side of that curve is the area the finger can cover.
  • the first touch screen and the second touch screen may be a front touch screen and a back touch screen of the terminal, respectively.
  • the user can also hold the terminal horizontally; the division of the corresponding finger-reachable areas can refer to the related description in Embodiment 1.
  • S1003: the terminal maps the area that the thumb cannot cover on the first touch screen to the finger-reachable area on the second touch screen according to an appropriate ratio;
  • S1004: the second touch screen receives the user's touch operation, and the terminal determines whether the touch operation is a valid operation; if it is, the flow proceeds to S1005; otherwise, the operation is ignored;
  • S1005: the touch operation on the second touch screen is mapped to the first touch screen of the terminal, and the corresponding UI element is triggered.
  • the UI elements here include, but are not limited to, menu items, application icons of the UI interface, and the like.
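  • An end-to-end sketch of the flow S1001 to S1005 is given below, for illustration only and not as part of the original disclosure. It assumes both screens are split by straight lines, uses per-axis (non-proportional) scaling, which the description also allows, and hard-codes the reachable areas instead of learning them from the user's strokes; every name is an assumption.

```kotlin
// Sketch of the whole flow: divide, map by a ratio, validate a back-screen tap,
// and trigger the UI element at the mapped front-screen position. Illustrative only.

data class Box(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    val w get() = right - left
    val h get() = bottom - top
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
}

class OneHandMapper(private val frontInoperable: Box, private val backReachable: Box) {
    // Per-axis scaling so every point of the reachable back area lands inside the front area.
    private val sx = backReachable.w / frontInoperable.w
    private val sy = backReachable.h / frontInoperable.h

    // S1004/S1005: validate the back-screen tap and trigger the front-screen element.
    fun onBackTap(x: Float, y: Float, effective: Boolean, trigger: (Float, Float) -> Unit) {
        if (!effective || !backReachable.contains(x, y)) return   // ignore invalid touches
        val fx = frontInoperable.left + (x - backReachable.left) / sx
        val fy = frontInoperable.top + (y - backReachable.top) / sy
        trigger(fx, fy)                                           // e.g. open the icon under (fx, fy)
    }
}

fun main() {
    // S1001/S1002: reachable areas, hard-coded here instead of derived from the user's strokes.
    val mapper = OneHandMapper(
        frontInoperable = Box(0f, 0f, 1080f, 1200f),
        backReachable = Box(0f, 200f, 1080f, 800f)
    )
    // S1003 is implicit in the scale factors; a valid tap is then mapped and triggered.
    mapper.onBackTap(540f, 500f, effective = true) { x, y -> println("trigger UI element at ($x, $y)") }
}
```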
  • in this way, areas of different touch screens can be mapped to one another; when two or more touch screens are provided on the terminal, mapping the one-hand inoperable area of the first touch screen to the one-hand operable area of the second touch screen realizes one-handed operation of all the interactive elements on the first touch screen, which not only improves the feasibility and practicality of one-hand operation of the terminal, especially a large-screen terminal, but also improves the user experience, so that users get a better experience in special environments or when they can only operate with one hand because of physical limitations.
  • the touch screen of the existing terminal with two or more touch screens can be fully utilized to improve the usage rate of the touch screen.
  • the method of this embodiment is particularly applicable to a terminal having two or more touch screens of a large screen.
  • Embodiment 3:
  • the embodiment provides a screen manipulation device, and the screen manipulation device includes:
  • the dividing module 111 is configured to divide the first touch screen and the second touch screen on the terminal into a one-hand operable area and a one-hand inoperable area, respectively;
  • the mapping module 112 is configured to map at least the interaction interface of the one-hand inoperable area on the first touch screen to the one-hand operable area of the second touch screen.
  • the dividing module 111 divides the plurality of touch screens of the terminal into one-hand operable areas and one-hand inoperable areas differently according to the state in which the user holds the terminal.
  • the dividing module 111 may be configured to, when the terminal is in the vertically held state, divide the area in the lower part of the first touch screen that the finger can touch into a one-hand operable area and divide the other areas of the first touch screen into one-hand inoperable areas, and to divide the area in the upper part of the second touch screen that the user's finger can touch into a one-hand operable area and divide the other areas of the second touch screen into one-hand inoperable areas.
  • correspondingly, the mapping module 112 is configured to map the interactive interface of the one-hand inoperable area on the first touch screen to the one-hand operable area of the second touch screen.
  • the dividing module 111 can also be configured to: when the terminal is in the vertically held state and the user operates the terminal with one hand, divide the area in the middle of the first touch screen that the finger (usually the thumb) can touch into a one-hand operable area and divide the areas above and below that one-hand operable area on the first touch screen into one-hand inoperable areas; and divide the area in the upper part of the second touch screen that the user's finger can touch into a one-hand operable area, divide the other areas of the second touch screen into one-hand inoperable areas, and divide that one-hand operable area into two one-hand operable sub-areas.
  • correspondingly, the mapping module 112 is configured to: after the dividing module 111 divides the second touch screen into the two one-hand operable sub-areas, map the interaction interfaces of the two one-hand inoperable areas on the first touch screen respectively to the two one-hand operable sub-areas of the second touch screen.
  • the dividing module 111 can also be configured to: when the terminal is in the horizontally held state and the user operates the terminal with one hand, divide the area on the one-hand operation side of the first touch screen that the finger can touch into a one-hand operable area and divide the other areas of the first touch screen into one-hand inoperable areas; and divide the area on the one-hand operation side of the second touch screen that the finger can touch into a one-hand operable area and divide the other areas of the second touch screen into one-hand inoperable areas.
  • correspondingly, the mapping module 112 is configured to map the interactive interface of the one-hand inoperable area on the first touch screen to the one-hand operable area of the second touch screen.
  • the apparatus of this embodiment further includes an operation control module 113 configured to: after the mapping module 112 maps at least the interactive interface of the one-hand inoperable area on the first touch screen of the terminal to the one-hand operable area of the second touch screen, if the user's operation is received on the second touch screen, determine whether the operation is a preset effective operation; if so, map the operation to the corresponding position of the one-hand inoperable area on the first touch screen; otherwise, ignore the operation.
  • the preset effective operation includes: a double-tap operation, or a tap whose pressure, as calculated by a pressure sensor, exceeds a certain threshold, so that the effective operation is distinguished from an ordinary tap.
  • the mapping module 112 may set an appropriate ratio according to the area of the one-hand inoperable area on the first touch screen and the area of the one-hand operable area on the second touch screen, and scale down the interactive interface of the one-hand inoperable area on the first touch screen according to the set ratio before mapping it to the one-hand operable area of the second touch screen.
  • the front touch screen of this embodiment has a display function.
  • this embodiment further includes a display control module 114 configured to: after the mapping module 112 maps the interactive interface of the one-hand inoperable area on the first touch screen to the one-hand operable area of the second touch screen, if the user's operation is received on the second touch screen, control the display of the operation trajectory of the operation at the corresponding position of the one-hand inoperable area on the first touch screen.
  • the screen manipulation device is applied to a terminal having at least two touch screens of a first touch screen and a second touch screen.
  • in practical applications, the dividing module 111, the mapping module 112, the operation control module 113, and the display control module 114 in the screen manipulation device may be implemented by a central processing unit (CPU), a digital signal processor (DSP), a microcontroller unit (MCU), or a field-programmable gate array (FPGA) of the screen manipulation device or of the terminal to which the screen manipulation device belongs.
  • a terminal is further provided; the terminal can be operated with one hand and includes at least two touch screens and a controller 10, the at least two touch screens including a first touch screen and a second touch screen.
  • the controller 10 is configured to: when the user operates the terminal with one hand, divide the plurality of touch screens into one-hand operable areas and one-hand inoperable areas according to the active areas on the plurality of touch screens of the terminal, and map at least the interactive interface of the one-hand inoperable area on the first touch screen to the one-hand operable area of the second touch screen.
  • the first touch screen and the second touch screen are located on different faces of the terminal.
  • in summary, when the user operates the terminal with one hand, the dividing module 111 can divide the first touch screen and the second touch screen respectively into a one-hand operable area and a one-hand inoperable area according to the active areas on the first touch screen and the second touch screen, and the mapping module completes the mapping of the one-hand inoperable area on the first touch screen to the one-hand operable area on the second touch screen, so that when the user operates the terminal with one hand, operations in the one-hand operable area of the second touch screen can control the one-hand inoperable area of the first touch screen.
  • this realizes complete control of the interactive elements of the interactive interface during one-handed operation, simplifies the actions of the user operating the terminal with one hand, increases the possibility of one-handed operation, improves the user's experience of one-handed operation of the terminal, and also improves the utilization of the touch screens of a terminal having two or more screens.
  • the modules or steps of the present invention described above can be implemented by a general-purpose computing device; they can be concentrated on a single computing device or distributed over a network composed of multiple computing devices.
  • they may be implemented by program code executable by a computing device, so that they may be stored in a storage medium (ROM/RAM, magnetic disk, optical disk) and executed by a computing device; in some cases, the steps shown or described may be performed in an order different from that described herein. They may also be fabricated separately as individual integrated circuit modules, or a plurality of the modules or steps may be implemented as a single integrated circuit module. Therefore, the invention is not limited to any particular combination of hardware and software.
  • in the technical solution of the embodiments of the present invention, the first touch screen and the second touch screen are respectively divided into a one-hand operable area and a one-hand inoperable area, and the interaction interface of the one-hand inoperable area on the first touch screen is mapped to the one-hand operable area on the second touch screen, which is different from the first touch screen, so that all the interactive elements on the first touch screen can be operated with one hand.
  • this improves the feasibility and practicality of one-hand operation and also enhances the experience of a user operating a large-screen terminal with one hand, so that the user gets a better one-hand operation experience in a special environment or when the user can only operate with one hand because of a physical limitation.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

The present invention relates to a screen control method and apparatus, and a computer storage medium. The screen control method comprises the following steps: when a user operates a terminal with one hand, dividing, according to the active areas on a first touch screen and a second touch screen of the terminal, the first touch screen and the second touch screen respectively into a one-hand operable area and a one-hand inoperable area; and mapping at least an interactive interface of the one-hand inoperable area on the first touch screen to the one-hand operable area of the second touch screen.
PCT/CN2017/072294 2016-08-31 2017-01-23 Screen control method and apparatus, terminal and computer storage medium WO2018040502A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610797696.4A CN107797747A (zh) 2016-08-31 2016-08-31 一种基于多屏的屏幕操控方法、装置和终端
CN201610797696.4 2016-08-31

Publications (1)

Publication Number Publication Date
WO2018040502A1 true WO2018040502A1 (fr) 2018-03-08

Family

ID=61299943

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/072294 WO2018040502A1 (fr) 2016-08-31 2017-01-23 Screen control method and apparatus, terminal and computer storage medium

Country Status (2)

Country Link
CN (1) CN107797747A (fr)
WO (1) WO2018040502A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112328163A (zh) * 2020-11-10 2021-02-05 四川长虹电器股份有限公司 一种双屏显示设备的触摸控制方法
CN113138663A (zh) * 2021-03-29 2021-07-20 北京小米移动软件有限公司 设备调节方法、设备调节装置、电子设备及存储介质

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108920075A (zh) * 2018-06-26 2018-11-30 努比亚技术有限公司 双屏移动终端控制方法、移动终端及计算机可读存储介质
CN109309752B (zh) * 2018-07-19 2021-10-22 奇酷互联网络科技(深圳)有限公司 移动终端和单手操作屏幕的方法及装置
CN109144386B (zh) * 2018-08-01 2021-03-05 Oppo(重庆)智能科技有限公司 触摸屏控制方法、装置、存储介质、移动终端及终端配件
CN109343788B (zh) * 2018-09-30 2021-08-17 维沃移动通信有限公司 一种移动终端的操作控制方法及移动终端
CN109639890A (zh) * 2018-11-30 2019-04-16 努比亚技术有限公司 双面屏操作方法、移动终端及计算机可读存储介质
CN109683785B (zh) * 2018-12-24 2021-03-12 维沃移动通信有限公司 一种信息处理方法及移动终端
CN109743446B (zh) * 2018-12-25 2020-12-18 南京车链科技有限公司 双屏来电显示方法、终端及计算机可读存储介质
CN109885242B (zh) * 2019-01-18 2021-07-27 维沃移动通信有限公司 一种执行操作的方法和电子设备
CN109901760B (zh) * 2019-01-21 2020-07-28 维沃移动通信有限公司 一种对象控制方法及终端设备
CN118368357A (zh) * 2023-01-17 2024-07-19 腾讯科技(深圳)有限公司 界面控制方法、装置、终端及存储介质

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103370924A (zh) * 2010-12-10 2013-10-23 尤塔设备Ipr有限公司 具有用户界面的移动装置
CN105867746A (zh) * 2016-06-24 2016-08-17 努比亚技术有限公司 一种触控操作方法及移动终端

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101676843A (zh) * 2008-09-18 2010-03-24 联想(北京)有限公司 触摸输入方法及触摸输入装置
CN102681779A (zh) * 2012-04-25 2012-09-19 中兴通讯股份有限公司南京分公司 触摸屏操作方法及装置

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103370924A (zh) * 2010-12-10 2013-10-23 尤塔设备Ipr有限公司 具有用户界面的移动装置
CN105867746A (zh) * 2016-06-24 2016-08-17 努比亚技术有限公司 一种触控操作方法及移动终端

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112328163A (zh) * 2020-11-10 2021-02-05 四川长虹电器股份有限公司 一种双屏显示设备的触摸控制方法
CN113138663A (zh) * 2021-03-29 2021-07-20 北京小米移动软件有限公司 设备调节方法、设备调节装置、电子设备及存储介质

Also Published As

Publication number Publication date
CN107797747A (zh) 2018-03-13

Similar Documents

Publication Publication Date Title
WO2018040502A1 (fr) Procédé et appareil de controle d'ecran, terminal et support de stockage informatique
US8259083B2 (en) Mobile device having backpanel touchpad
CN103064629B (zh) 能动态调整图形控件的便携电子设备及方法
US10108331B2 (en) Method, apparatus and computer readable medium for window management on extending screens
TWI469038B (zh) 具有觸摸屏的電子設備及其螢幕解鎖方法
US20150143285A1 (en) Method for Controlling Position of Floating Window and Terminal
US20160320891A1 (en) Electronic Display with a Virtual Bezel
US20150185953A1 (en) Optimization operation method and apparatus for terminal interface
TWI616803B (zh) 螢幕畫面的縮放及操作方法、裝置與電腦程式產品
WO2014032431A1 (fr) Dispositif terminal et procédé de lancement rapide d'un programme
WO2013181881A1 (fr) Procédé et dispositif de commande d'un écran tactile
WO2016138661A1 (fr) Procédé de traitement pour une interface utilisateur d'un terminal, interface utilisateur et terminal
WO2018001261A1 (fr) Procédé de configuration de fonctions de boutons et terminal mobile
WO2015117341A1 (fr) Terminal mobile et son procédé d'utilisation à une main, et support de stockage informatique
WO2022007934A1 (fr) Procédé et appareil de commande d'icône d'application, et dispositif électronique
WO2017185459A1 (fr) Procédé et appareil pour déplacer des icônes
US20170045977A1 (en) Mobile Terminal
WO2014079289A1 (fr) Procédé, dispositif et terminal de positionnement tactile
JP2014179877A (ja) 携帯端末装置の表示制御方法
WO2013178192A2 (fr) Procédé de commande d'écran tactile en un seul point, dispositif et terminal mobile
WO2015024375A1 (fr) Procédé et dispositif d'ajustement de zone de composant d'interface graphique
CN104360813A (zh) 一种显示设备及其信息处理方法
WO2016183912A1 (fr) Procédé et appareil d'agencement de disposition de menus
CN104035716A (zh) 触摸屏操作方法和装置以及终端
KR20160019762A (ko) 터치 스크린 한손 제어 방법

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17844813

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17844813

Country of ref document: EP

Kind code of ref document: A1