CN111316200A - Full-screen single-hand operation method, terminal equipment and computer readable medium - Google Patents

Full-screen single-hand operation method, terminal equipment and computer readable medium

Info

Publication number: CN111316200A
Application number: CN201780095019.0A
Authority: CN (China)
Prior art keywords: interface, terminal device, main display, virtual, display interface
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventors: 徐叶辉, 黄成钟, 郑雪瑞
Current assignee: Shenzhen Transsion Communication Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: Shenzhen Transsion Communication Co Ltd
Application filed by Shenzhen Transsion Communication Co Ltd
Priority date (assumed, not a legal conclusion): 2017-09-19
Filing date: 2017-09-19
Publication date: 2020-06-19

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A full-screen one-handed operation method, a terminal device, and a computer-readable medium are provided. The method comprises the following steps: after the terminal device enters a one-handed operation mode, the terminal device displays a virtual operation interface on a main display interface of the terminal device, the area of the virtual operation interface being smaller than that of the main display interface (101); when a touch operation on a target area in the virtual operation interface is detected, the terminal device performs the touch operation in the area of the main display interface corresponding to the target area (102). By performing small-range operations within the virtual operation window of the terminal device, the method enables control of the entire main display interface and improves the screen display effect when the terminal device is operated with one hand.

Description

Full-screen single-hand operation method, terminal equipment and computer readable medium
Technical Field
The invention relates to the technical field of terminals, in particular to a full-screen one-hand operation method, terminal equipment and a computer readable medium.
Background
With the continuous development and popularization of terminal device technology, the form of terminal devices has gradually changed to provide more functions and better effects; for example, screens have become larger and larger, which also brings some inconvenience. Taking a mobile phone as an example, as the screen grows it becomes increasingly difficult to operate the phone with one hand, which is especially inconvenient when space is cramped or two-handed operation is impractical (such as when standing on a bus).
In an existing one-handed operation method for a terminal device, the content displayed on the full screen is shrunk and displayed in a predetermined fixed area at one corner of the screen; because this fixed area is smaller than the screen area, the user can perform one-handed operation within it. However, since the predetermined fixed area is small, the shrunken content displayed in it is not clear enough (for example, fonts become blurred) and the display effect is poor.
Disclosure of Invention
The embodiments of the present invention provide a full-screen one-handed operation method in which a small-range operation performed within a virtual operation window of a terminal device is applied to the entire main display interface, improving the screen display effect when the terminal device is operated with one hand.
In a first aspect, an embodiment of the present invention provides a full-screen one-handed operation method, where the method includes:
after a terminal device enters a one-hand operation mode, displaying a virtual operation interface on a main display interface of the terminal device, wherein the area of the virtual operation interface is smaller than that of the main display interface;
when a touch operation on a target area in the virtual operation interface is detected, executing the touch operation in the area of the main display interface corresponding to the target area.
In a second aspect, an embodiment of the present invention provides a terminal device, where the terminal device includes a unit configured to execute the method of the first aspect.
In a third aspect, an embodiment of the present invention provides another terminal device, which includes a processor, an input device, an output device, and a memory that are connected to one another. The memory is used to store a computer program that supports the terminal device in executing the foregoing method; the computer program includes program instructions, and the processor is configured to call the program instructions to execute the method of the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, in which a computer program is stored, the computer program comprising program instructions, which, when executed by a processor, cause the processor to perform the method of the first aspect.
According to the embodiments of the present invention, after the terminal device enters the one-handed operation mode, a virtual operation interface is displayed on the main display interface of the terminal device; an operation performed within the small virtual operation window is applied to the entire main display interface, which improves the screen display effect when the terminal device is operated with one hand.
Drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings needed for the description of the embodiments are briefly introduced below. The drawings described below show some embodiments of the present invention, and those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic flow chart of a full-screen one-handed operation method according to an embodiment of the present invention;
FIG. 2a is a schematic diagram of a display position of a one-handed operation interface according to an embodiment of the present invention;
FIG. 2b is a schematic diagram of a display position of another one-handed operation interface according to an embodiment of the present invention;
FIG. 2c is a schematic diagram illustrating a touch operation performed in a full-screen one-handed operation method according to an embodiment of the present invention;
FIG. 2d is a schematic diagram of an interface for performing a touch operation in another full-screen one-handed operation method according to an embodiment of the present invention;
FIG. 2e is a schematic diagram of an interface including a virtual trigger button according to an embodiment of the present invention;
FIG. 2f is a schematic diagram of an interface including a direct operation area according to an embodiment of the present invention;
FIG. 2g is a schematic diagram of a method for entering a one-handed operation mode according to an embodiment of the present invention;
FIG. 2h is a schematic diagram of a method for dragging a virtual operation interface according to an embodiment of the present invention;
FIG. 2i is a schematic diagram of a method for enlarging or reducing a virtual operation interface according to an embodiment of the present invention;
FIG. 2j is a schematic diagram of a method for exiting a one-handed operation mode according to an embodiment of the present invention;
FIG. 3 is a schematic flow chart of a full-screen one-handed operation method according to another embodiment of the present invention;
FIG. 4 is a schematic block diagram of a terminal device according to an embodiment of the present invention;
FIG. 5 is a schematic block diagram of a terminal device according to another embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention; the described embodiments are some, but not all, of the embodiments of the present invention. Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the invention. The appearances of this phrase in various places in the specification do not necessarily all refer to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those skilled in the art will understand, explicitly and implicitly, that the embodiments described herein can be combined with other embodiments.
All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first," "second," and the like in the description and claims of the present invention and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the specification of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon" or "in response to a determination" or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [ described condition or event ] is detected" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon detecting [ described condition or event ]" or "in response to detecting [ described condition or event ]".
In particular implementations, the terminal devices described in the embodiments of the present invention include, but are not limited to, portable devices such as mobile phones, laptop computers, or tablet computers having touch-sensitive surfaces (e.g., touch-screen displays and/or touch pads). It should also be understood that, in some embodiments, the device is not a portable communication device but a desktop computer having a touch-sensitive surface (e.g., a touch-screen display and/or a touchpad).
In the discussion that follows, a terminal device that includes a display and a touch-sensitive surface is described. However, it should be understood that the terminal device may include one or more other physical user interface devices such as a physical keyboard, mouse, and/or joystick.
The terminal device supports various applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disc burning application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an email application, an instant messaging application, an exercise support application, a photo management application, a digital camera application, a web browsing application, a digital music player application, and/or a digital video player application.
Various applications that may be executed on the terminal device may use at least one common physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the terminal device may be adjusted and/or changed between applications and/or within respective applications. In this way, a common physical architecture (e.g., touch-sensitive surface) of the terminal device may support various applications with user interfaces that are intuitive and transparent to the user.
Referring to fig. 1, which is a schematic flowchart of a full-screen one-handed operation method according to an embodiment of the present invention, as shown in fig. 1, the method may include:
101. After the terminal device enters the one-handed operation mode, the terminal device displays a virtual operation interface on its main display interface, and the area of the virtual operation interface is smaller than that of the main display interface.
The one-handed operation mode provided by the embodiments of the present invention is intended mainly for large-screen terminal devices. Because the screen is large, some keys are difficult to reach when the user operates with only one hand, and mishaps (such as dropping the device) occur easily. In the one-handed operation mode, the effective operating area of the screen is therefore reduced so that the user can operate smoothly with one hand, which makes the terminal device more convenient to use and less likely to be dropped.
The main display interface of the terminal device is the content currently displayed on the whole screen of the terminal device; it may be the desktop display interface of the terminal device or an interface of an application (APP) installed on the terminal device.
Specifically, after the terminal device enters the one-handed operation mode, a virtual operation interface may be displayed on the main display interface of the terminal device. The virtual operation interface may include display content, which is obtained by reducing the content displayed in the main display interface according to a preset ratio.
As shown in FIG. 2a, the virtual operation interface may be displayed in a border region of the main display interface (e.g., the lower-right region of the main display interface), or, as shown in FIG. 2b, an edge of the virtual operation interface may coincide with an edge of the main display interface (e.g., the lower-right corner of the virtual operation interface coincides with the lower-right corner of the main display interface).
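As an illustration of how such a window might be laid out, the following Kotlin sketch derives the bounds of a bottom-right-anchored virtual operation interface from a preset scale factor. This is not the patented implementation; the names (Rect, computeVirtualWindow), the 0.4f default scale, and the margin are hypothetical stand-ins for the "preset ratio" and "lower-right region" described above.

```kotlin
// Minimal sketch, assuming the virtual window is the main display scaled by a preset ratio
// and anchored at the lower-right corner.
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    val width get() = right - left
    val height get() = bottom - top
}

fun computeVirtualWindow(
    mainDisplay: Rect,
    scale: Float = 0.4f,   // hypothetical preset ratio (< 1.0)
    margin: Float = 16f    // hypothetical gap from the screen border; 0f reproduces FIG. 2b
): Rect {
    val w = mainDisplay.width * scale
    val h = mainDisplay.height * scale
    // Anchor the reduced window at the lower-right corner of the main display.
    return Rect(
        left = mainDisplay.right - w - margin,
        top = mainDisplay.bottom - h - margin,
        right = mainDisplay.right - margin,
        bottom = mainDisplay.bottom - margin
    )
}
```

With margin = 0f the lower-right corner of the window coincides with the lower-right corner of the screen, matching the overlapping-edge case of FIG. 2b.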
Optionally, the virtual operation interface may be a region with adjustable transparency. It can be displayed at a preset transparency, and the transparency can be adjusted as needed, so that the content of the main display interface in that region remains visible through the virtual operation interface; this prevents the virtual operation interface from blocking the display content in the region and improves the screen display effect.
102. When a touch operation on the target area in the virtual operation interface is detected, the terminal device performs the touch operation in the area of the main display interface corresponding to the target area.
The target area mentioned in the embodiments of the present invention is any area in the virtual operation interface, and the target area and its corresponding area can each be represented as a set of coordinates. Specifically, when a touch operation in the target area is detected, the touch operation does not directly trigger an event on the main display interface; instead, the touch operation is performed in the area of the main display interface corresponding to the target area.
Specifically, the terminal device may determine, according to a correspondence between coordinates in the virtual operation interface and coordinates in the main display interface, the second target coordinates of the main display interface that correspond to the first target coordinates of the virtual operation interface; the first target coordinates and the second target coordinates may each be a single coordinate or a set of coordinates. A first coordinate system for the virtual operation interface and a second coordinate system for the main display interface may be pre-established in the terminal device, with a one-to-one correspondence between them: once a first target coordinate in the first coordinate system is determined, the corresponding second target coordinate can be uniquely determined in the second coordinate system according to the correspondence, and the touch operation can then be performed at the second target coordinate of the main display interface. Optionally, the correspondence between coordinates in the virtual operation interface and coordinates in the main display interface is stored in the terminal device as data. Step 102 may be repeated as needed.
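The one-to-one correspondence can be pictured as a simple affine mapping. The sketch below reuses the Rect helper from the earlier sketch, defines a Point type, and maps a first target coordinate in the virtual operation interface to the second target coordinate on the main display interface; the same mapping applied point by point scales up a sliding track, as discussed next. All names are hypothetical, and the sketch assumes the virtual window shows the entire main display scaled down.

```kotlin
// Minimal sketch of the coordinate correspondence: a point in the virtual window
// is translated to window-relative coordinates and scaled up to the main display.
data class Point(val x: Float, val y: Float)

fun mapToMainDisplay(p: Point, virtualWindow: Rect, mainDisplay: Rect): Point {
    val scaleX = mainDisplay.width / virtualWindow.width
    val scaleY = mainDisplay.height / virtualWindow.height
    return Point(
        x = mainDisplay.left + (p.x - virtualWindow.left) * scaleX,
        y = mainDisplay.top + (p.y - virtualWindow.top) * scaleY
    )
}

// A sliding track (FIG. 2c) is the same mapping applied to every sampled point, which is
// why the main-display track keeps the original shape but is enlarged by the
// window-to-display ratio.
fun mapTrack(track: List<Point>, virtualWindow: Rect, mainDisplay: Rect): List<Point> =
    track.map { mapToMainDisplay(it, virtualWindow, mainDisplay) }
```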
As shown in FIG. 2c, if the terminal device detects that a user's finger performs a first sliding operation on the virtual operation interface, the touch operation is that first sliding operation, which is represented as a first sliding track on the virtual operation interface; the sliding track can be regarded as a set of multiple coordinates, i.e., the first target coordinates. According to the correspondence, the second target coordinates corresponding to the first target coordinates can be determined, and a second sliding operation can be triggered at the position of the second target coordinates in the main display interface. The track of the second sliding operation has the same shape as that of the first sliding operation; it is obtained by scaling up the track of the first sliding operation by a ratio determined by the correspondence.
If the terminal device detects that a user's finger performs a first pressing operation on the virtual operation interface, the touch operation is that pressing operation, detected at a third target coordinate of the virtual operation interface. According to the correspondence, the fourth target coordinate corresponding to the third target coordinate can be determined, and a second pressing operation can be triggered at the fourth target coordinate in the main display interface; the second pressing operation can trigger the event associated with that position of the main display interface. For example, as shown in FIG. 2d, if the user performs a click operation in the circled area of the virtual operation interface, the event triggered by clicking the corresponding position of the main display screen is simulated.
With this method, a small-range touch operation on the virtual operation interface is sufficient to execute the corresponding touch operation on the main display interface, which improves the screen display effect when the terminal device is operated with one hand.
In an optional embodiment, the virtual operation interface may include a virtual trigger button, which is used to trigger the main display interface to execute the trigger event corresponding to the virtual trigger button. Optionally, after the second target coordinate of the main display interface corresponding to the first target coordinate of the virtual operation interface is determined, a first indication element may be displayed at the second target coordinate. The first indication element indicates the position of the touch operation and may be displayed as a circular cursor or an arrow on the main display interface. It helps the user identify the most recent touch position on the main display interface, so that when the user presses the virtual trigger button, the corresponding trigger operation is performed accurately at the position indicated by the first indication element. Optionally, a second indication element may also be displayed at the first target coordinate to help the user identify the most recent touch position on the virtual operation interface. When the virtual operation interface has not been touched for a predetermined time (e.g., 5 seconds, 10 seconds, or 30 seconds), the first indication element may be hidden, so that it does not block the display content of the main display interface, improving the display effect of the main display interface.
As shown in FIG. 2e, the virtual trigger button may be located in a side area of the main display interface. When the terminal device detects that the user clicks the virtual trigger button, the trigger event corresponding to that button is triggered at the coordinate indicated by the first indication element on the main display interface. For example, suppose the main display interface is the desktop display interface of the terminal device and shows the icons of application A, application B, and application C, and the virtual trigger buttons in the virtual operation interface include a determination button used to trigger the main display interface to execute the corresponding trigger event. If the first indication element on the main display interface is an arrow pointing at the icon area of application B and the user clicks the determination button of the virtual operation interface, the main display interface switches to the display page of application B, achieving the same effect as a direct touch operation outside the one-handed operation mode. Because the display content of the virtual operation interface is a scaled-down copy of the content displayed on the main display interface, simulating the touch operation through the virtual trigger button is more accurate than touching the small virtual operation interface directly.
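One way to realize the arrow indicator plus determination button described above is to keep the last mapped main-display coordinate as a small piece of state and dispatch the trigger event there when the button is pressed. The sketch below is a hypothetical illustration only (PointerIndicator, onConfirmKey, dispatchTapAt, and the timeout are invented names and values, reusing the Point type from the previous sketch); it also covers the optional auto-hide after a period of inactivity.

```kotlin
// Hypothetical sketch of the indication element + determination button behaviour.
class PointerIndicator(private val hideAfterMs: Long = 5_000) {
    var position: Point? = null          // last mapped coordinate on the main display
        private set
    var visible: Boolean = false
        private set
    private var lastTouchMs: Long = 0

    // Called whenever a virtual-interface touch has been mapped to the main display.
    fun onVirtualTouch(mapped: Point, nowMs: Long) {
        position = mapped
        visible = true
        lastTouchMs = nowMs
    }

    // Called when the user taps the determination button in the virtual interface:
    // the trigger event fires at the position the arrow currently indicates.
    fun onConfirmKey(dispatchTapAt: (Point) -> Unit) {
        position?.let(dispatchTapAt)
    }

    // Hide the indicator when the virtual interface has not been touched for a while,
    // so it does not block the main display content.
    fun tick(nowMs: Long) {
        if (visible && nowMs - lastTouchMs > hideAfterMs) visible = false
    }
}
```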
In an optional embodiment, a direct operation area may be located in the part of the main display interface that does not overlap the virtual operation interface; the direct operation area is an area around the virtual operation interface in which touch operations can be performed directly on the main display interface. The direct operation area may be the lower half of the main display interface, or an arc-shaped area on the lower-left or lower-right side of the main display interface, and it may optionally be determined according to the position of the virtual operation interface. As shown in FIG. 2f, the direct operation area is an arc-shaped area on the lower-right side of the main display interface, within the reach of the user's thumb during one-handed operation. When a touch falls in the direct operation area, the terminal device responds to the touch operation on the main display interface directly, without going through the virtual operation interface, which makes it easier for the user to operate that area.
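The arc-shaped direct operation area of FIG. 2f can be modelled as the set of points within some radius of the lower-right corner that do not fall inside the virtual window itself. The check below is a sketch under that assumption, reusing the Point and Rect helpers from the earlier sketches; the radius value and the function name are hypothetical.

```kotlin
import kotlin.math.hypot

// Sketch: decide whether a touch should bypass the virtual window and act on the
// main display directly (FIG. 2f: an arc around the lower-right corner).
fun isInDirectOperationArea(
    p: Point,
    mainDisplay: Rect,
    virtualWindow: Rect,
    reachRadius: Float = 600f   // hypothetical one-hand reach, in pixels
): Boolean {
    val insideVirtual = p.x in virtualWindow.left..virtualWindow.right &&
            p.y in virtualWindow.top..virtualWindow.bottom
    if (insideVirtual) return false
    // Distance from the touch point to the lower-right corner of the screen.
    val distToCorner = hypot(p.x - mainDisplay.right, p.y - mainDisplay.bottom)
    return distToCorner <= reachRadius
}
```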
Referring to FIG. 3, FIG. 3 is a schematic flow chart of a full-screen one-handed operation method according to a second embodiment of the present invention; it further optimizes the method of FIG. 1. As shown in FIG. 3, the full-screen one-handed operation method may include the following steps:
301. The terminal device detects whether it meets the activation condition; if so, the terminal device enters the one-handed operation mode.
The activation condition is the condition for activating the one-handed operation mode of the terminal device, and it includes: the screen of the terminal device being touched according to a first predetermined operation, or a first predetermined key of the terminal device being touched. If the terminal device meets the activation condition, it enters the one-handed operation mode.
Specifically, when the screen of the terminal device is touched according to the first predetermined operation, the terminal device meets the activation condition and enters the one-handed operation mode. Optionally, the first predetermined operation may be a sliding operation in the lower border area of the screen. For example, as shown in FIG. 2g, when the terminal device detects an upward sliding operation in the preset border area a of the screen and the sliding track formed in the preset border area a is longer than a first preset track length (for example, 3 centimeters), the terminal device meets the activation condition and enters the one-handed operation mode; if the sliding track is shorter than or equal to the first preset track length, the process ends. Optionally, after detecting an upward sliding operation in the preset border area of the screen, the terminal device may further check whether the sliding track lies in a preset side area of the screen; if so, the terminal device meets the activation condition and enters the one-handed operation mode, and if not, the process ends.
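A sketch of the activation check just described: an upward swipe that starts in the preset bottom border area and whose track exceeds the first preset track length triggers the one-handed mode. The border height and minimum track length are illustrative pixel values standing in for the dimensions given above (e.g., 3 centimeters), and the function name is hypothetical; the Point and Rect helpers come from the earlier sketches.

```kotlin
import kotlin.math.hypot

// Hypothetical sketch of the swipe-up activation condition (FIG. 2g).
fun shouldEnterOneHandedMode(
    track: List<Point>,
    mainDisplay: Rect,
    borderHeight: Float = 80f,        // hypothetical height of preset border area a
    minTrackLength: Float = 300f      // stands in for the "first preset track length"
): Boolean {
    if (track.size < 2) return false
    val start = track.first()
    val end = track.last()
    val startsInBorder = start.y >= mainDisplay.bottom - borderHeight
    val swipesUp = end.y < start.y
    // Approximate the track length as the summed distance between consecutive samples.
    val length = track.zipWithNext { a, b -> hypot(b.x - a.x, b.y - a.y) }.sum()
    return startsInBorder && swipesUp && length > minTrackLength
}
```

The exit condition described later (a swipe from a preset side area toward the center of the screen longer than a second preset track length) could be checked symmetrically with the same kind of test.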
Optionally, when a first predetermined key of the terminal device is touched, the terminal device meets the activation condition and enters the one-handed operation mode. For example, when the terminal device detects that a preset key combination is pressed, it meets the activation condition and enters the one-handed operation mode. Specifically, the preset key combination may be the volume-down key plus the Home key: when the volume-down key and the Home key of the terminal device are pressed at the same time, the terminal device meets the activation condition and enters the one-handed operation mode. The Home key is the key that returns to the home screen on terminal devices running the Windows, iOS, or Android operating systems.
302. After the terminal device enters the one-handed operation mode, the terminal device displays a virtual operation interface on its main display interface, and the area of the virtual operation interface is smaller than that of the main display interface.
In an optional embodiment, after the virtual operation interface is displayed on the main display interface of the terminal device, the method further includes:
when a first dragging operation on a preset border area of the virtual operation interface is detected, moving the virtual operation interface on the main display interface; and when a second dragging operation on a preset corner area of the virtual operation interface is detected, reducing or enlarging the virtual operation interface.
In one specific implementation, when the terminal device detects a press in the preset border area or the preset corner area, the border of the virtual operation interface is highlighted (for example, displayed in red), indicating that a dragging operation can be performed.
Specifically, the virtual operation interface has a preset border area. As shown in FIG. 2h, when a first dragging operation is performed on the preset border area, the virtual operation interface moves on the main display interface in the direction of the drag. In this way, the virtual operation interface can be dragged to a suitable display position on the main display interface, making one-handed operation more convenient (for example, dragging the virtual operation interface to the left side of the screen to suit a left-handed user) and avoiding blocking the page content that needs to be displayed on the main display interface.
Specifically, the virtual operation interface may have a preset corner area, which is a region extending outward over a fixed range around any corner of the virtual operation interface. As shown in FIG. 2i, when a second dragging operation is performed on the preset corner area, the virtual operation interface is enlarged or reduced on the main display interface according to the drag. In this way, the display size of the virtual operation interface on the main display interface can be adjusted by dragging, so that the user can tune the display to a size that is personally convenient to operate and avoid the situation where the content of an overly small virtual operation interface is unclear and affects operation.
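The drag-to-move and drag-corner-to-resize behaviour can be expressed as simple rectangle arithmetic. The sketch below assumes the bottom-right window of FIG. 2h/2i (resizing by dragging the top-left corner while the bottom-right corner stays fixed) and clamps the result so the window stays on screen and above a minimum size; the names and limits are hypothetical, and the Rect helper is reused from the earlier sketch.

```kotlin
// Sketch: move the virtual window by the drag delta (border drag, FIG. 2h),
// keeping it fully inside the main display.
fun moveWindow(window: Rect, dx: Float, dy: Float, mainDisplay: Rect): Rect {
    val newLeft = (window.left + dx).coerceIn(mainDisplay.left, mainDisplay.right - window.width)
    val newTop = (window.top + dy).coerceIn(mainDisplay.top, mainDisplay.bottom - window.height)
    return Rect(newLeft, newTop, newLeft + window.width, newTop + window.height)
}

// Sketch: resize by dragging the top-left corner while the bottom-right corner stays
// fixed (corner drag, FIG. 2i); enforce a hypothetical minimum size so the content
// does not become too small to read.
fun resizeWindow(window: Rect, dx: Float, dy: Float, minSize: Float = 200f): Rect {
    val newLeft = (window.left + dx).coerceAtMost(window.right - minSize)
    val newTop = (window.top + dy).coerceAtMost(window.bottom - minSize)
    return Rect(newLeft, newTop, window.right, window.bottom)
}
```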
303. When a touch operation on the target area in the virtual operation interface is detected, the terminal device performs the touch operation in the area of the main display interface corresponding to the target area.
304. The terminal device detects whether it meets the exit condition for exiting the one-handed operation mode; if so, the terminal device hides the virtual operation interface and exits the one-handed operation mode.
The exit condition includes: the screen of the terminal device being touched according to a second predetermined operation, or a second predetermined key of the terminal device being touched.
Specifically, when the screen of the terminal device is touched according to the second predetermined operation, the terminal device meets the exit condition, hides the virtual operation interface, and exits the one-handed operation mode. Optionally, the second predetermined operation may be a sliding operation in a side area of the screen. For example, as shown in FIG. 2j, when the terminal device detects a sliding operation toward the center of the screen in the preset side area b and the sliding track formed by the sliding operation is longer than a second preset track length (for example, 2 centimeters), the terminal device meets the exit condition and exits the one-handed operation mode; if the sliding track is shorter than or equal to the second preset track length, the process ends.
Optionally, when a second predetermined key of the terminal device is touched, the terminal device meets the exit condition and exits the one-handed operation mode. For example, the second predetermined key may be a key combination: when the volume-down key and the Home key of the terminal device are pressed simultaneously, the terminal device meets the exit condition and exits the one-handed operation mode.
For steps 302 and 303, reference may be made to the detailed description of steps 101 and 102 shown in FIG. 1, which is not repeated here.
Compared with the embodiment shown in FIG. 1, this embodiment adds implementations for entering and exiting the one-handed operation mode. In particular, before entering the one-handed operation mode, the terminal device can detect whether the user's touch position is near a corner of the screen, so that it can more accurately determine whether the device is being operated with one hand; this allows the mobile terminal to enter the one-handed operation mode quickly and makes it convenient for the user.
An embodiment of the present invention further provides a terminal device, which includes units configured to execute any of the methods described above. Specifically, referring to FIG. 4, which is a schematic block diagram of a terminal device according to an embodiment of the present invention, the terminal device of this embodiment includes a display unit, a detection unit, and an execution unit.
The display unit 410 is configured to display a virtual operation interface on a main display interface of the terminal device after the terminal device enters the one-handed operation mode.
Specifically, the area of the virtual operation interface displayed by the display unit 410 is smaller than the area of the main display interface, and the virtual operation interface may include display content, where the display content is obtained by reducing the content displayed in the main display interface according to a preset ratio.
Optionally, the virtual operation interface displayed by the display unit 410 may be a region with adjustable transparency. It can be displayed at a preset transparency, and the transparency can be adjusted as needed, so that the content of the main display interface in that region remains visible through the virtual operation interface; this prevents the virtual operation interface from blocking the display content in the region and improves the screen display effect.
Optionally, the display unit 410 may display a virtual operation interface that includes a virtual trigger button, where the virtual trigger button is used to trigger the main display interface to execute the trigger event corresponding to the virtual trigger button.
Optionally, the display unit 410 is further configured to display an indication element within the main display interface, where the indication element indicates the position of the touch operation and may be displayed as a circular cursor or an arrow on the main display interface.
The first detecting unit 420 is configured to detect a touch operation on a target area within the virtual operation interface.
Specifically, when the first detection unit 420 detects a touch operation in the target area, the touch operation does not directly trigger a trigger event of the main display interface, but triggers the execution unit 430 to execute the touch operation in an area corresponding to the target area in the main display interface.
The executing unit 430 is configured to execute the touch operation in an area corresponding to the target area in the main display interface.
In an optional embodiment, the terminal device may further include a second detection unit 440, wherein:
the second detecting unit 440 is configured to detect whether the terminal device meets an activation condition, and if the terminal device meets the activation condition, the terminal device enters a one-handed operation mode.
Specifically, the activation condition refers to an activation condition of a one-handed operation mode of the terminal device, and the activation condition includes: the screen of the terminal device is touched according to a first preset operation, or a first preset key of the terminal device is touched. If the second detecting unit 440 detects that the terminal device meets the activation condition, the terminal device enters a one-handed operation mode.
The second detecting unit 440 may be further configured to detect whether the terminal device meets the exit condition for exiting the one-handed operation mode; if so, the terminal device hides the virtual operation interface and exits the one-handed operation mode.
Specifically, the exit condition includes: the screen of the terminal device being touched according to a second predetermined operation, or a second predetermined key of the terminal device being touched. When the second detecting unit 440 detects that the terminal device meets the exit condition, the terminal device hides the virtual operation interface and exits the one-handed operation mode.
In an optional embodiment, the second detecting unit 440 is further configured to detect a first dragging operation for a preset border region of the virtual operation interface and a second dragging operation for a preset corner region of the virtual operation interface.
Specifically, when the second detection unit 440 detects a first dragging operation on the preset border area of the virtual operation interface, the display unit 410 moves the virtual operation interface on the main display interface; when the second detection unit 440 detects a second dragging operation on the preset corner area of the virtual operation interface, the display unit 410 enlarges or reduces the virtual operation interface on the main display interface.
Referring to fig. 5, a schematic block diagram of a terminal device according to another embodiment of the present invention is shown. As shown, the terminal device in this embodiment may include: one or more processors 501; one or more input devices 502, one or more output devices 503, and memory 504. The processor 501, the input device 502, the output device 503, and the memory 504 are connected by a bus 505. The memory 504 is used to store a computer program comprising program instructions and the processor 501 is used to execute the program instructions stored by the memory 504.
The output device 503 is configured to display a virtual operation interface on the main display interface of the terminal device after the terminal device enters the one-handed operation mode.
The output device 503 is further configured to display the indication element in the main display interface.
The processor 501 is configured to execute the touch operation in an area corresponding to the target area in the main display interface.
Specifically, the processor 501 is configured to determine a second target coordinate corresponding to the first target coordinate according to a corresponding relationship between a coordinate in the virtual operation interface and a coordinate in the main display interface; the processor 501 is further configured to perform the touch operation at the second target coordinate of the main display interface.
In an optional embodiment, the processor 501 is further configured to detect whether the terminal device meets an activation condition, and if the activation condition is met, the terminal device enters the above-mentioned one-handed operation mode. The activation conditions include: the screen of the terminal device is touched according to a first preset operation, or a first preset key of the terminal device is touched.
Optionally, the processor 501 is configured to detect a first dragging operation on the preset border area of the virtual operation interface and, when it is detected, move the virtual operation interface on the main display interface through the output device 503; the processor 501 is further configured to detect a second dragging operation on the preset corner area of the virtual operation interface and, when it is detected, reduce or enlarge the virtual operation interface through the output device 503.
Optionally, the memory 504 is configured to store coordinates and corresponding relationships thereof in the virtual operating interface and the main display interface.
It should be understood that, in this embodiment of the present invention, the processor 501 may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The input device 502 may include a touch pad, a fingerprint sensor (for collecting fingerprint information of a user and direction information of the fingerprint), a microphone, etc., and the output device 503 may include a display (LCD, etc.), a speaker, etc.
The memory 504 may include a read-only memory and a random access memory, and provides instructions and data to the processor 501. A portion of the memory 504 may also include non-volatile random access memory. For example, the memory 504 may also store device type information.
In a specific implementation, the processor 501, the input device 502, and the output device 503 described in this embodiment of the present invention may execute the implementation manners described in the first embodiment and the second embodiment of the full-screen one-handed operation method provided in this embodiment of the present invention, and may also execute the implementation manner of the terminal device described in this embodiment of the present invention, which is not described herein again.
In another embodiment of the present invention, a computer-readable storage medium is provided, which stores a computer program comprising program instructions, which when executed by a processor, implement the method embodiments illustrated in fig. 1 and 3 described above.
The computer-readable storage medium may be an internal storage unit of the terminal device according to any of the foregoing embodiments, for example, a hard disk or a memory of the terminal device. The computer readable storage medium may also be an external storage device of the terminal device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like provided on the terminal device. Further, the computer-readable storage medium may also include both an internal storage unit and an external storage device of the terminal device. The computer-readable storage medium is used for storing the computer program and other programs and data required by the terminal device. The computer readable storage medium may also be used to temporarily store data that has been output or is to be output.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented in electronic hardware, computer software, or a combination of both. To clearly illustrate the interchangeability of hardware and software, the components and steps of the examples have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends on the particular application and the design constraints of the implementation. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the terminal device and the unit described above may refer to corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed terminal device and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may also be an electric, mechanical or other form of connection.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment of the present invention.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention essentially or partially contributes to the prior art, or all or part of the technical solution can be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
While the invention has been described with reference to specific embodiments, the invention is not limited thereto, and various equivalent modifications and substitutions can be easily made by those skilled in the art within the technical scope of the invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

  1. A full-screen one-handed operation method is characterized by comprising the following steps:
    after a terminal device enters a one-hand operation mode, displaying a virtual operation interface on a main display interface of the terminal device, wherein the area of the virtual operation interface is smaller than that of the main display interface;
    when touch operation aiming at a target area in the virtual operation interface is detected, the touch operation is executed in an area corresponding to the target area in the main display interface.
  2. The method of claim 1, wherein the target area comprises a first target coordinate, and wherein performing the touch operation in an area corresponding to the target area within the primary display interface comprises:
    determining a second target coordinate corresponding to the first target coordinate according to the corresponding relation between the coordinate in the virtual operation interface and the coordinate in the main display interface;
    executing the touch operation at the second target coordinate of the main display interface.
  3. The method according to claim 2, wherein before displaying the virtual operation interface on the main display interface of the terminal device, the method further comprises:
    detecting whether the terminal equipment meets an activation condition, if so, entering the single-hand operation mode, wherein the activation condition comprises: the screen of the terminal device is touched according to a first preset operation, or a first preset key of the terminal device is touched.
  4. The method according to claim 3, wherein after the virtual operation interface is displayed on the main display interface of the terminal device, the method further comprises:
    when a first dragging operation aiming at a preset frame area of the virtual operation interface is detected, moving the virtual operation interface on the main display interface;
    and when a second dragging operation aiming at a preset corner area of the virtual operation interface is detected, reducing or amplifying the virtual operation interface.
  5. The method according to claim 4, wherein the virtual operation interface includes a virtual trigger button, and the virtual trigger button is used for triggering the main display interface to execute a trigger event corresponding to the virtual trigger button.
  6. The method according to claim 5, wherein the virtual operation interface includes display content, and the display content is obtained by zooming out the content displayed in the main display interface according to a preset scale.
  7. The method according to claim 6, wherein when the touch operation for the target area in the virtual operation interface is detected, after the touch operation is performed on the area corresponding to the target area in the main display interface, the method further comprises:
    detecting whether the terminal equipment meets an exit condition for exiting the single-hand operation mode, if so, hiding the virtual operation interface and exiting the single-hand operation mode, wherein the exit condition comprises: and the screen of the terminal equipment is touched according to a second preset operation, or a second preset key of the terminal equipment is touched.
  8. A terminal device, characterized in that it comprises means for performing the method of any of claims 1-7.
  9. A terminal device comprising a processor, an input device, an output device and a memory, the processor, the input device, the output device and the memory being interconnected, wherein the memory is configured to store a computer program comprising program instructions, the processor being configured to invoke the program instructions to perform the method according to any one of claims 1 to 7.
  10. A computer-readable storage medium, characterized in that the computer storage medium stores a computer program comprising program instructions that, when executed by a processor, cause the processor to perform the method according to any of claims 1-7.
CN201780095019.0A 2017-09-19 2017-09-19 Full-screen single-hand operation method, terminal equipment and computer readable medium Pending CN111316200A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/102230 WO2019056167A1 (en) 2017-09-19 2017-09-19 Full-screen one-hand operation method, terminal device, and computer-readable medium

Publications (1)

Publication Number Publication Date
CN111316200A true CN111316200A (en) 2020-06-19

Family

ID=65809486

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780095019.0A Pending CN111316200A (en) 2017-09-19 2017-09-19 Full-screen single-hand operation method, terminal equipment and computer readable medium

Country Status (2)

Country Link
CN (1) CN111316200A (en)
WO (1) WO2019056167A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114625300A (en) * 2022-01-26 2022-06-14 北京讯通安添通讯科技有限公司 Intelligent terminal operation method and device, terminal and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103914258A (en) * 2014-03-26 2014-07-09 深圳市中兴移动通信有限公司 Mobile terminal and method for operating same
CN103955339A (en) * 2014-04-25 2014-07-30 华为技术有限公司 Terminal operation method and terminal equipment
CN104238745A (en) * 2014-07-31 2014-12-24 天津三星通信技术研究有限公司 Method for operating mobile terminal by one hand and mobile terminal
CN106371688A (en) * 2015-07-22 2017-02-01 小米科技有限责任公司 Full-screen single-hand operation method and apparatus

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102937873B (en) * 2012-10-12 2015-09-16 天津三星通信技术研究有限公司 The method and apparatus of input through keyboard is carried out in portable terminal
CN104007930B (en) * 2014-06-09 2015-11-25 努比亚技术有限公司 A kind of mobile terminal and realize the method and apparatus of one-handed performance


Also Published As

Publication number Publication date
WO2019056167A1 (en) 2019-03-28


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination