WO2019056167A1 - Full-screen one-hand operation method, terminal device and computer-readable medium - Google Patents

Full-screen one-hand operation method, terminal device and computer-readable medium

Info

Publication number
WO2019056167A1
WO2019056167A1 (PCT/CN2017/102230)
Authority
WO
WIPO (PCT)
Prior art keywords
terminal device
interface
main display
virtual
area
Prior art date
Application number
PCT/CN2017/102230
Other languages
English (en)
French (fr)
Inventor
徐叶辉
黄成钟
郑雪瑞
Original Assignee
深圳传音通讯有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳传音通讯有限公司 filed Critical 深圳传音通讯有限公司
Priority to CN201780095019.0A priority Critical patent/CN111316200A/zh
Priority to PCT/CN2017/102230 priority patent/WO2019056167A1/zh
Publication of WO2019056167A1 publication Critical patent/WO2019056167A1/zh

Links

Images

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • the present invention relates to the field of terminal technologies, and in particular, to a full-screen one-hand operation method, a terminal device, and a computer readable medium.
  • With the continuous development and popularization of terminal device technology, the appearance of terminal devices has gradually changed in pursuit of more functions and better effects. For example, the screens of terminal devices have become larger and larger, which also brings some trouble. Taking mobile phones as an example, as the screen grows, operating the phone with one hand becomes increasingly difficult, which is very inconvenient in cramped spaces or when both hands are not free (such as when standing on a bus).
  • The one-hand operation method of existing terminal devices shrinks the content of the device's full-screen display into a predetermined fixed area at a corner of the screen; this predetermined fixed area is smaller than the screen area of the terminal device, and the user performs one-handed operation within it.
  • However, because the predetermined fixed area is small, the reduced display content shown in it is not clear enough (for example, fonts appear blurred), and the display effect is poor.
  • The embodiments of the present invention provide a full-screen one-hand operation method in which a small-range operation performed in a virtual operation window of the terminal device is applied to the entire main display interface, which can improve the screen display effect when the terminal device is operated with one hand.
  • In a first aspect, an embodiment of the present invention provides a full-screen one-hand operation method, where the method includes:
  • after the terminal device enters a one-hand operation mode, displaying a virtual operation interface on the main display interface of the terminal device, where the area of the virtual operation interface is smaller than the area of the main display interface; and
  • when a touch operation on a target area within the virtual operation interface is detected, performing the touch operation in the area of the main display interface corresponding to the target area.
  • In a second aspect, an embodiment of the present invention provides a terminal device that includes units for performing the method of the first aspect above.
  • In a third aspect, an embodiment of the present invention provides another terminal device, including a processor, an input device, an output device, and a memory, where the processor, the input device, the output device, and the memory are connected to each other; the memory is used to store a computer program that supports the terminal device in performing the above method, the computer program includes program instructions, and the processor is configured to invoke the program instructions to perform the method of the first aspect above.
  • In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium storing a computer program, where the computer program includes program instructions that, when executed by a processor, cause the processor to perform the method of the first aspect above.
  • In the embodiments of the present invention, after the terminal device enters the one-hand operation mode, a virtual operation interface is displayed on the main display interface of the terminal device, so that a small-range operation in the virtual operation window is applied to the entire main display interface, which can improve the screen display effect when the terminal device is operated with one hand.
  • FIG. 1 is a schematic flowchart of a full-screen one-hand operation method according to an embodiment of the present invention;
  • FIG. 2a is a schematic diagram of a display position of a one-hand operation interface according to an embodiment of the present invention;
  • FIG. 2b is a schematic diagram of another display position of a one-hand operation interface according to an embodiment of the present invention;
  • FIG. 2c is a schematic diagram of performing a touch operation in a full-screen one-hand operation method according to an embodiment of the present invention;
  • FIG. 2d is a schematic diagram of an interface for performing a touch operation in another full-screen one-hand operation method according to an embodiment of the present invention;
  • FIG. 2e is a schematic diagram of an interface including a virtual trigger button according to an embodiment of the present invention;
  • FIG. 2f is a schematic diagram of an interface including a direct operation area according to an embodiment of the present invention;
  • FIG. 2g is a schematic diagram of a method for entering a one-hand operation mode according to an embodiment of the present invention;
  • FIG. 2h is a schematic diagram of a method for dragging a virtual operation interface according to an embodiment of the present invention;
  • FIG. 2i is a schematic diagram of a method for enlarging or reducing a virtual operation interface according to an embodiment of the present invention;
  • FIG. 2j is a schematic diagram of a method for exiting a one-hand operation mode according to an embodiment of the present invention;
  • FIG. 3 is a schematic flowchart of a full-screen one-hand operation method according to another embodiment of the present invention;
  • FIG. 4 is a schematic block diagram of a terminal device according to an embodiment of the present invention;
  • FIG. 5 is a schematic block diagram of a terminal device according to another embodiment of the present invention.
  • As used in this specification and the appended claims, the term “if” may be interpreted, depending on the context, as “when”, “once”, “in response to determining”, or “in response to detecting”.
  • Similarly, the phrase “if it is determined” or “if [the described condition or event] is detected” may be interpreted, depending on the context, as meaning “once it is determined”, “in response to determining”, “once [the described condition or event] is detected”, or “in response to detecting [the described condition or event]”.
  • In a specific implementation, the terminal device described in the embodiments of the present invention includes, but is not limited to, a mobile phone, a laptop computer, a tablet computer, or another portable device having a touch-sensitive surface (for example, a touch-screen display and/or a touch pad). It should also be understood that, in some embodiments, the device is not a portable communication device but a desktop computer having a touch-sensitive surface (for example, a touch-screen display and/or a touch pad).
  • In the following discussion, a terminal device including a display and a touch-sensitive surface is described. It should be understood, however, that the terminal device may include one or more other physical user interface devices such as a physical keyboard, a mouse, and/or a joystick.
  • The terminal device supports various applications, such as one or more of the following: a drawing application, a presentation application, a word-processing application, a website creation application, a disc-burning application, a spreadsheet application, a game application, a telephone application, a video-conferencing application, an e-mail application, an instant-messaging application, a workout-support application, a photo-management application, a digital camera application, a digital video camera application, a web-browsing application, a digital music player application, and/or a digital video player application.
  • The various applications that can be executed on the terminal device may use at least one common physical user interface device, such as the touch-sensitive surface. One or more functions of the touch-sensitive surface and the corresponding information displayed on the terminal device can be adjusted and/or changed between applications and/or within a given application. In this way, the common physical architecture of the terminal device (for example, the touch-sensitive surface) can support various applications with user interfaces that are intuitive and transparent to the user.
  • FIG. 1 is a schematic flowchart of a full-screen one-hand operation method according to an embodiment of the present invention. As shown in FIG. 1, the method may include:
  • 101. After the terminal device enters the one-hand operation mode, the terminal device displays a virtual operation interface on the main display interface of the terminal device, where the area of the virtual operation interface is smaller than the area of the main display interface.
  • The one-hand operation mode mentioned in the embodiments of the present invention is aimed mainly at large-screen terminal devices. Because the screen is large, some buttons are hard to reach when the user operates with only one hand, and accidents are more likely. The one-hand operation mode therefore shrinks the effective operating area of the screen so that the user can operate smoothly with one hand, which is convenient and helps avoid dropping the terminal device.
  • The main display interface of the terminal device includes the content that the terminal device currently displays on the entire screen. The main display interface may be the desktop display interface of the terminal device, or an interface inside an application (APP) installed on the terminal device.
  • Specifically, after the terminal device enters the one-hand operation mode, a virtual operation interface may be displayed on the main display interface of the terminal device. The virtual operation interface may contain display content obtained by scaling down the content displayed in the main display interface according to a preset ratio.
  • The area of the virtual operation interface is smaller than the area of the main display interface. Therefore, as shown in FIG. 2a, the virtual operation interface may be displayed within the border region of the main display interface (for example, in the lower-right area of the main display interface), or, as shown in FIG. 2b, the edge of the virtual operation interface may overlap the edge of the main display interface (for example, the lower-right corner of the virtual operation interface overlaps the lower-right corner of the main display interface).
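  • A minimal Kotlin sketch of the window placement just described, assuming the lower-right anchoring of FIG. 2b; the 0.4 scale ratio, the `Rect` type, and the function name are illustrative assumptions, since the patent only speaks of a "preset ratio":

```kotlin
// Place a scaled-down virtual operation window in the lower-right corner of the main display.
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    val width get() = right - left
    val height get() = bottom - top
}

fun virtualWindowRect(mainDisplay: Rect, presetRatio: Float = 0.4f): Rect {
    val w = mainDisplay.width * presetRatio
    val h = mainDisplay.height * presetRatio
    // Anchor the window so its lower-right corner overlaps the display's lower-right corner (FIG. 2b).
    return Rect(mainDisplay.right - w, mainDisplay.bottom - h, mainDisplay.right, mainDisplay.bottom)
}

fun main() {
    val main = Rect(0f, 0f, 1080f, 1920f)
    println(virtualWindowRect(main))   // Rect(left=648.0, top=1152.0, right=1080.0, bottom=1920.0)
}
```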
  • Optionally, the virtual operation interface may be a region with adjustable transparency. The virtual operation interface can be displayed with a preset transparency, and the transparency can be set as required. The display content of the main display interface underneath the region can then still be seen through the virtual operation interface, which avoids the virtual operation interface occluding the content displayed in that region and improves the screen display effect.
  • 102. When a touch operation on a target area within the virtual operation interface is detected, the terminal device performs the touch operation in the area of the main display interface corresponding to the target area.
  • The target area mentioned in the embodiments of the present invention is an arbitrary area within the virtual operation interface, and both the target area and the area corresponding to it may be represented as a set of multiple coordinates. Specifically, when a touch operation in the target area is detected, the touch operation does not directly fire a trigger event of the main display interface; instead, the touch operation is performed on the area of the main display interface corresponding to the target area.
  • Specifically, the terminal device may determine, according to the correspondence between coordinates in the virtual operation interface and coordinates in the main display interface, the second target coordinate of the main display interface corresponding to the first target coordinate of the virtual operation interface, where the first target coordinate and the second target coordinate may each be a single coordinate or a set of coordinates.
  • In the terminal device, a first coordinate system for the virtual operation interface and a second coordinate system for the main display interface may be established in advance, with a one-to-one correspondence between coordinates in the two systems. That is, once the first target coordinate is determined in the first coordinate system, the second target coordinate corresponding to it can be uniquely determined in the second coordinate system according to the correspondence, and the touch operation is then performed at the second target coordinate of the main display interface.
  • Optionally, the correspondence between the coordinates in the virtual operation interface and the coordinates in the main display interface is stored in the terminal device in the form of data. In this method, step 102 can be performed repeatedly.
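  • A minimal sketch of such a coordinate correspondence, assuming a simple linear mapping between the two coordinate systems; the type and function names are illustrative, not terms used by the patent:

```kotlin
data class Point(val x: Float, val y: Float)
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    val width get() = right - left
    val height get() = bottom - top
}

fun mapToMainDisplay(p: Point, virtualRect: Rect, mainRect: Rect): Point {
    // Normalise the touch point within the virtual window, then scale it up to the main display;
    // this linear mapping plays the role of the stored coordinate correspondence.
    val nx = (p.x - virtualRect.left) / virtualRect.width
    val ny = (p.y - virtualRect.top) / virtualRect.height
    return Point(mainRect.left + nx * mainRect.width, mainRect.top + ny * mainRect.height)
}

fun main() {
    val virtual = Rect(648f, 1152f, 1080f, 1920f)   // window in the lower-right corner
    val mainDisplay = Rect(0f, 0f, 1080f, 1920f)
    println(mapToMainDisplay(Point(864f, 1536f), virtual, mainDisplay))  // Point(x=540.0, y=960.0)
}
```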
  • As shown in FIG. 2c, if the terminal device detects that the user's finger performs a first sliding operation on the virtual operation interface, the touch operation is that first sliding operation, which is represented on the virtual operation interface as a first sliding trajectory. The sliding trajectory can be regarded as a set of multiple coordinates, that is, the first target coordinate. According to the correspondence described above, the second target coordinate corresponding to the first target coordinate can be determined, and a second sliding operation is then triggered at the second target coordinate position of the main display interface. The trajectory formed by the second sliding operation has the same shape as the trajectory of the first sliding operation and is the trajectory of the first sliding operation enlarged proportionally, the ratio being determined by the correspondence.
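  • Applying the mapping sketched above to every sampled point of the gesture yields the enlarged trajectory; this short extension reuses the hypothetical `mapToMainDisplay`, `Point`, and `Rect` from the previous sketch and is likewise only illustrative:

```kotlin
// Map each point of the first sliding trajectory into the main display coordinate system.
// The per-axis enlargement ratio is fixed by the two rectangles (e.g. mainRect.width / virtualRect.width),
// so the second trajectory keeps the shape of the first and is simply scaled up.
fun mapTrajectory(firstTrajectory: List<Point>, virtualRect: Rect, mainRect: Rect): List<Point> =
    firstTrajectory.map { mapToMainDisplay(it, virtualRect, mainRect) }
```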
  • If the terminal device detects that the user's finger performs a first pressing operation on the virtual operation interface, the touch operation is that pressing operation, and the terminal device detects that the pressing operation occurs at a third target coordinate position of the virtual operation interface. According to the correspondence, the fourth target coordinate corresponding to the third target coordinate can be determined, and a second pressing operation is then triggered at the fourth target coordinate position of the main display interface. The second pressing operation can fire the trigger event associated with the fourth target coordinate position of the main display interface. For example, as shown in FIG. 2d, if the user performs a click operation in the area circled on the virtual operation interface, the event triggered by clicking the corresponding position of the main display screen is simulated.
  • With the above method, a touch operation on the main display interface can be achieved by performing only a small touch operation on the virtual operation interface, which improves the screen display effect when the terminal device is operated with one hand.
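  • A sketch of the forwarding step under the assumption that the platform exposes some way of injecting a click on the main UI; the `TouchInjector` interface and all names below are assumptions added to keep the example self-contained, not an API named by the patent:

```kotlin
data class Point(val x: Float, val y: Float)

// Hypothetical platform hook for raising a click on the main display interface.
fun interface TouchInjector {
    fun tapAt(p: Point)
}

class OneHandForwarder(
    private val mapToMain: (Point) -> Point,   // the coordinate correspondence sketched earlier
    private val injector: TouchInjector
) {
    // A press at the "third target coordinate" in the virtual window is replayed as a tap
    // at the corresponding "fourth target coordinate" on the main display interface.
    fun onVirtualPress(thirdTargetCoordinate: Point) {
        val fourthTargetCoordinate = mapToMain(thirdTargetCoordinate)
        injector.tapAt(fourthTargetCoordinate)
    }
}
```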
  • In an optional embodiment, the virtual operation interface may include a virtual trigger button, and the virtual trigger button is used to make the main display interface execute the trigger event corresponding to that button. Optionally, after the second target coordinate of the main display interface corresponding to the first target coordinate of the virtual operation interface has been determined, a first indication element may be displayed at the second target coordinate. The first indication element indicates the position of the touch operation and may be displayed on the main display interface as a circular cursor or an arrow.
  • The first indication element makes it easy for the user to see the last touch position on the main display interface, so that when the user presses the virtual trigger button the corresponding triggering operation can be applied accurately to the position indicated by the first indication element. Optionally, a second indication element may also be displayed at the first target coordinate, making it easy for the user to see the last touch position on the virtual operation interface. If no touch operation occurs on the virtual operation interface for a preset duration (for example, 5, 10, or 30 seconds), the first indication element may be hidden to prevent it from blocking the display content of the main display interface, improving the display effect of the main display interface.
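  • A minimal sketch of the hide-on-idle behaviour of the first indication element; the 5-second default comes from the example durations in the text, while the class shape and names are assumptions:

```kotlin
data class Point(val x: Float, val y: Float)

class TouchIndicator(private val hideAfterMillis: Long = 5_000) {
    private var position: Point? = null
    private var lastTouchAt: Long = 0L

    // Show the cursor/arrow at the mapped (second target) coordinate on each forwarded touch.
    fun onMappedTouch(p: Point, now: Long = System.currentTimeMillis()) {
        position = p
        lastTouchAt = now
    }

    // Returns the position to draw, or null once no touch has occurred for the preset duration,
    // so the cursor no longer blocks the content of the main display interface.
    fun visiblePosition(now: Long = System.currentTimeMillis()): Point? =
        if (position == null || now - lastTouchAt > hideAfterMillis) null else position
}
```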
  • As shown in FIG. 2e, the virtual trigger button may be located in a side area of the main display interface. When the terminal device detects that the user clicks the virtual trigger button, the trigger event corresponding to that button is triggered at the coordinate position indicated by the first indication element on the main display interface. For example, the main display interface is the desktop display interface of the terminal device and displays the icons of application A, application B, and application C, and the virtual trigger buttons in the virtual operation interface include a confirm button used to make the main display interface execute the corresponding trigger event. If the first indication element on the main display interface is an arrow and the arrow points to the icon area of application B, then when the user clicks the confirm button on the virtual operation interface, the main display page shows the display page inside application B; that is, the same effect is achieved as directly performing the touch operation outside the one-hand operation mode.
  • Because the display content of the virtual operation interface is obtained by scaling down the content displayed on the main display interface, when the virtual operation interface is small, simulating the touch operation through the virtual trigger button is more accurate than performing the touch operation directly on the virtual operation interface.
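  • Tying the two previous sketches together, a confirm-button press could simply fire the click at the position the indicator currently shows; `TouchIndicator` and `TouchInjector` are the hypothetical types introduced above, so this is only a sketch of the flow, not the patent's implementation:

```kotlin
class ConfirmButton(
    private val indicator: TouchIndicator,
    private val injector: TouchInjector
) {
    // Fires the click at the position the arrow/cursor currently points to,
    // e.g. opening application B when the arrow rests on its icon.
    fun onPressed() {
        indicator.visiblePosition()?.let { injector.tapAt(it) }
    }
}
```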
  • In an optional embodiment, the region of the main display interface that does not coincide with the virtual operation interface may contain a direct operation area, that is, an area outside the virtual operation interface in which the main display interface can be touched directly. The direct operation area may be the lower half of the main display interface, or an area in the lower-left or lower-right part of the main display interface bounded by an arc. Optionally, this area may be determined according to the position of the virtual operation interface. As shown in FIG. 2f, the direct operation area is an arc-bounded area in the lower-right part of the main display interface that lies within the reach of the user's one-handed operation. When a touch operation is needed in the direct operation area, the virtual operation interface does not have to be used: the terminal device responds directly to the touch operation on the main display interface within the direct operation area, making touch operations in that area more convenient for the user.
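  • A sketch of the arc-shaped reachable region of FIG. 2f as a simple distance test from the lower-right corner; the 600-pixel reach radius is an assumed value, and touches inside the virtual window itself would still be handled by the virtual interface:

```kotlin
import kotlin.math.hypot

data class Point(val x: Float, val y: Float)

// Touches landing within thumb reach of the lower-right corner (and outside the virtual window)
// are handled by the main display interface directly, without going through the virtual window.
fun inDirectOperationArea(touch: Point, lowerRightCorner: Point, reachRadiusPx: Float = 600f): Boolean =
    hypot(touch.x - lowerRightCorner.x, touch.y - lowerRightCorner.y) <= reachRadiusPx
```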
  • Referring to FIG. 3, which is a schematic flowchart of a full-screen one-hand operation method according to a second embodiment of the present invention, FIG. 3 is a further refinement of FIG. 1. As shown in FIG. 3, the full-screen one-hand operation method may include the following steps:
  • 301. The terminal device detects whether it meets an activation condition, and if the activation condition is met, the terminal device enters the one-hand operation mode.
  • The activation condition is the activation condition of the one-hand operation mode of the terminal device and includes: the screen of the terminal device is touched according to a first predetermined operation, or a first predetermined button of the terminal device is touched. If the terminal device meets the activation condition, it enters the one-hand operation mode.
  • Specifically, when the screen of the terminal device is touched according to the first predetermined operation, the terminal device meets the activation condition and enters the one-hand operation mode. Optionally, the first predetermined operation may be a sliding operation in the lower border area of the screen. For example, as shown in FIG. 2g, when the terminal device detects an upward sliding operation in the preset border area a of the screen and the sliding track formed by that operation within the preset border area a is longer than a first preset track length (for example, 3 cm), the terminal device meets the activation condition and enters the one-hand operation mode; if the sliding track is shorter than or equal to the first preset track length, this procedure ends. Optionally, after detecting the upward sliding operation in the preset border area of the screen, the terminal device may check whether the sliding track lies within a preset side area of the screen; if so, the terminal device meets the activation condition and enters the one-hand operation mode, and if not, this procedure ends.
  • Optionally, when the first predetermined button of the terminal device is touched, the terminal device meets the activation condition and enters the one-hand operation mode. For example, when the terminal device detects that a preset combination of buttons is pressed, it meets the activation condition and enters the one-hand operation mode. Specifically, the preset combination may be the volume-down button plus the Home button: pressing the volume-down button and the Home button of the terminal device at the same time satisfies the activation condition and enters the one-hand operation mode. The Home button is the key that returns to the home screen on terminal devices running the Windows, iOS, or Android operating system.
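  • A sketch of the activation check in step 301 under stated assumptions: the 3 cm track length must be converted to pixels with the device density, the bottom border area is modelled by a single y threshold, and all names are illustrative:

```kotlin
data class Point(val x: Float, val y: Float)

class ActivationDetector(
    private val bottomBorderTop: Float,   // y coordinate where the preset border area "a" begins
    private val minTrackPx: Float         // first preset track length (e.g. 3 cm converted to pixels)
) {
    // First predetermined operation: an upward swipe inside the bottom border area
    // whose track is longer than the first preset track length.
    fun swipeActivates(start: Point, end: Point): Boolean {
        val startsInBorderArea = start.y >= bottomBorderTop
        val upwardLength = start.y - end.y          // positive when the finger moves up the screen
        return startsInBorderArea && upwardLength > minTrackPx
    }

    // First predetermined button: the volume-down + Home combination.
    fun buttonsActivate(volumeDownPressed: Boolean, homePressed: Boolean): Boolean =
        volumeDownPressed && homePressed
}
```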
  • 302. After the terminal device enters the one-hand operation mode, the terminal device displays a virtual operation interface on the main display interface of the terminal device, where the area of the virtual operation interface is smaller than the area of the main display interface.
  • In an optional embodiment, after the virtual operation interface is displayed on the main display interface of the terminal device, the method further includes:
  • when a first drag operation on a preset border area of the virtual operation interface is detected, moving the virtual operation interface on the main display interface; and when a second drag operation on a preset corner area of the virtual operation interface is detected, reducing or enlarging the virtual operation interface.
  • A specific implementation may be that when the terminal device detects a pressing operation in the preset border area or in the preset corner area, the border of the virtual operation interface is selected (for example, displayed in red), indicating that a drag operation can be performed.
  • Specifically, the virtual operation interface has a preset border area. As shown in FIG. 2h, when the first drag operation is performed on the preset border area, the virtual operation interface can move on the main display interface following the direction of the first drag operation. In this way the virtual operation interface can be dragged across the main display interface to a suitable display position that makes one-handed operation easier (for example, dragging the virtual operation interface to the left side of the screen for left-handed users) and avoids obscuring the page content that the main display interface needs to show.
  • Specifically, the virtual operation interface may have a preset corner area, that is, a region extending a fixed range outward from any corner of the virtual operation interface. As shown in FIG. 2i, when the second drag operation is performed on the preset corner area, the virtual operation interface is reduced or enlarged on the main display interface according to the second drag operation. In this way the display size of the virtual operation interface on the main display interface can be adjusted by dragging, so that the user can tune it to a size that is convenient to operate, avoiding the situation where the virtual operation interface is so small that its content is unclear and operation is impaired.
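  • A sketch of the two drag behaviours, assuming the lower-right corner is the preset corner area being dragged; the minimum-size clamp is an assumption added so the window cannot collapse to an unusable size:

```kotlin
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)

// First drag operation (preset border area): translate the window by the drag delta.
fun moveWindow(window: Rect, dx: Float, dy: Float): Rect =
    Rect(window.left + dx, window.top + dy, window.right + dx, window.bottom + dy)

// Second drag operation (preset corner area, assumed here to be the lower-right corner):
// grow or shrink the window, clamped so it cannot collapse below a usable minimum size.
fun resizeWindow(window: Rect, dx: Float, dy: Float, minSizePx: Float = 200f): Rect {
    val newRight = maxOf(window.right + dx, window.left + minSizePx)
    val newBottom = maxOf(window.bottom + dy, window.top + minSizePx)
    return Rect(window.left, window.top, newRight, newBottom)
}
```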
  • 303. When a touch operation on a target area within the virtual operation interface is detected, the terminal device performs the touch operation in the area of the main display interface corresponding to the target area.
  • 304. The terminal device detects whether it meets an exit condition for leaving the one-hand operation mode, and if the exit condition is met, the terminal device hides the virtual operation interface and exits the one-hand operation mode.
  • The exit condition includes: the screen of the terminal device is touched according to a second predetermined operation, or a second predetermined button of the terminal device is touched.
  • Specifically, when the screen of the terminal device is touched according to the second predetermined operation, the terminal device meets the exit condition, hides the virtual operation interface, and exits the one-hand operation mode. Optionally, the second predetermined operation may be a sliding operation in a side area of the screen. For example, as shown in FIG. 2j, when the terminal device detects a sliding operation from the preset side area b of the screen toward the center of the screen and the sliding track formed by that operation is longer than a second preset track length (for example, 2 cm), the terminal device meets the exit condition and exits the one-hand operation mode; if the sliding track is shorter than or equal to the second preset track length, this procedure ends.
  • Optionally, when the second predetermined button of the terminal device is touched, the terminal device meets the exit condition and exits the one-hand operation mode. For example, the second predetermined button may be a button combination consisting of the volume-down button and the Home button: when the volume-down button and the Home button of the terminal device are pressed at the same time, the terminal device meets the exit condition and exits the one-hand operation mode.
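  • A compact sketch of the swipe part of the exit check in step 304, mirroring the activation detector above; it assumes the preset side area b runs along the left edge and that the 2 cm threshold has already been converted to pixels:

```kotlin
data class Point(val x: Float, val y: Float)

// Second predetermined operation: a swipe starting in the preset side area "b" toward the
// screen centre whose track is longer than the second preset track length.
fun exitBySwipe(start: Point, end: Point, sideAreaRightEdge: Float, minTrackPx: Float): Boolean {
    val startsInSideArea = start.x <= sideAreaRightEdge
    val towardCentreLength = end.x - start.x        // positive when moving toward the centre
    return startsInSideArea && towardCentreLength > minTrackPx
}
```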
  • For steps 302 and 303, refer to the detailed description of step 101 and step 102 shown in FIG. 1 above; details are not repeated here.
  • In this embodiment, on the basis of the method of the embodiment shown in FIG. 1, implementations for entering and exiting the one-hand operation mode are added. In particular, before the terminal device enters the one-hand operation mode, it can detect whether the user's touch position is close to a corner of the screen and thus judge more accurately whether one-handed operation is intended, so that the mobile terminal can enter the one-hand operation mode quickly and conveniently.
  • An embodiment of the present invention further provides a terminal device that includes units for performing the method described in any of the preceding embodiments. Specifically, referring to FIG. 4, it is a schematic block diagram of a terminal device according to an embodiment of the present invention.
  • the terminal device of this embodiment includes: a display unit, a detecting unit, and an executing unit.
  • the display unit 410 is configured to display a virtual operation interface on the main display interface of the terminal device after the terminal device enters the one-hand operation mode.
  • Specifically, the area of the virtual operation interface displayed by the display unit 410 is smaller than the area of the main display interface, and the virtual operation interface may contain display content obtained by scaling down the content displayed in the main display interface according to a preset ratio.
  • the virtual operation interface displayed by the display unit 410 may be an area with adjustable transparency.
  • the virtual operation interface can be displayed according to the preset transparency.
  • the transparency of the virtual operation interface can be set according to requirements.
  • The display content of the main display interface underneath the region can then still be seen through the virtual operation interface, which avoids the virtual operation interface occluding the content displayed in that region and improves the screen display effect.
  • Optionally, the display unit 410 may display a virtual operation interface containing a virtual trigger button, where the virtual trigger button is used to make the main display interface execute the trigger event corresponding to that button.
  • the display unit 410 is further configured to display an indication element in the main display page, where the indication element is used to indicate a location of the touch operation, and the indication element may be displayed as a circular cursor or an arrow on the main display interface.
  • the first detecting unit 420 is configured to detect a touch operation for the target area in the virtual operation interface.
  • Specifically, when the first detecting unit 420 detects a touch operation in the target area, the touch operation does not directly fire a trigger event of the main display interface; instead, it causes the executing unit 430 to perform the touch operation on the area of the main display interface corresponding to the target area.
  • the executing unit 430 is configured to perform the touch operation in the area corresponding to the target area in the main display interface.
  • the terminal device may further include a second detecting unit 440, where:
  • the second detecting unit 440 is configured to detect whether the terminal device meets the activation condition, and if the activation condition is met, the terminal device enters the one-hand operation mode.
  • the activation condition refers to an activation condition of the one-hand operation mode of the terminal device, where the activation condition includes: the screen of the terminal device is touched according to a first predetermined operation, or the first predetermined button of the terminal device is Touch. If the second detecting unit 440 detects that the terminal device meets the above activation condition, the terminal device enters the one-hand operation mode.
  • the second detecting unit 440 is further configured to detect whether the terminal device meets the exit condition for exiting the one-hand operation mode, and if the exit condition is met, the terminal device hides the virtual operation interface and exits the one-hand operation mode.
  • the exit condition includes: the screen of the terminal device is touched according to a second predetermined operation, or the second predetermined button of the terminal device is touched.
  • the second detecting unit 440 detects that the terminal device meets the foregoing exit condition, the terminal device hides the virtual operation interface and exits the one-hand operation mode.
  • In an optional embodiment, the second detecting unit 440 is further configured to detect a first drag operation on the preset border area of the virtual operation interface and a second drag operation on the preset corner area of the virtual operation interface.
  • Specifically, when the second detecting unit 440 detects the first drag operation on the preset border area of the virtual operation interface, the display unit 410 displays the movement of the virtual operation interface on the main display interface; when the second detecting unit 440 detects the second drag operation on the preset corner area of the virtual operation interface, the display unit 410 displays the reduction or enlargement of the virtual operation interface on the main display interface.
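  • The patent describes these as functional units; the following Kotlin interfaces are only an illustrative way of expressing the decomposition of FIG. 4 in code, with all names assumed:

```kotlin
data class Point(val x: Float, val y: Float)
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)

interface DisplayUnit {                    // corresponds to display unit 410
    fun showVirtualInterface(bounds: Rect, transparency: Float)
    fun showIndicator(position: Point)
}

interface TouchDetectingUnit {             // corresponds to first detecting unit 420
    fun onTargetAreaTouch(handler: (Point) -> Unit)
}

interface ExecutingUnit {                  // corresponds to executing unit 430
    fun performTouchAt(mainDisplayPoint: Point)
}

interface ModeDetectingUnit {              // corresponds to second detecting unit 440
    fun activationConditionMet(): Boolean
    fun exitConditionMet(): Boolean
}
```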
  • FIG. 5 is a schematic block diagram of a terminal device according to another embodiment of the present invention.
  • the terminal device in this embodiment as shown may include one or more processors 501; one or more input devices 502, one or more output devices 503, and a memory 504.
  • the above processor 501, input device 502, output device 503, and memory 504 are connected by a bus 505.
  • the memory 504 is used to store a computer program, the computer program including program instructions, and the processor 501 is configured to execute program instructions stored in the memory 504.
  • the output device 503 is configured to display a virtual operation interface on the main display interface of the terminal device after the terminal device enters the one-hand operation mode.
  • the output device 503 is further configured to display the indication element in the main display interface.
  • the processor 501 is configured to perform the touch operation in an area corresponding to the target area in the main display interface.
  • the processor 501 is configured to determine, according to a correspondence between coordinates in the virtual operation interface and coordinates in the main display interface, a second target coordinate corresponding to the first target coordinate; the processor 501 is further configured to be in the main The touch operation is performed at the second target coordinate of the display interface.
  • the processor 501 is further configured to detect whether the terminal device meets the activation condition, and if the activation condition is met, the terminal device enters the one-hand operation mode.
  • the activation condition includes: the screen of the terminal device is touched according to the first predetermined operation, or the first predetermined button of the terminal device is touched.
  • Optionally, the processor 501 is configured to detect a first drag operation on the preset border area of the virtual operation interface and, when that operation is detected, move the virtual operation interface on the main display interface through the output device 503. The processor 501 is further configured to detect a second drag operation on the preset corner area of the virtual operation interface and, when that operation is detected, reduce or enlarge the virtual operation interface through the output device 503.
  • Optionally, the memory 504 is configured to store the coordinates in the virtual operation interface and in the main display interface and the correspondence between them.
  • the processor 501 may be a central processing unit (CPU), and the processor may also be another general-purpose processor, a digital signal processor (DSP). , Application Specific Integrated Circuit (ASIC), Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic device, discrete hardware component, etc.
  • the general purpose processor may be a microprocessor or the processor or any conventional processor or the like.
  • the input device 502 may include a touchpad, a fingerprint sensor (for collecting fingerprint information of the user and direction information of the fingerprint), a microphone, and the like
  • the output device 503 may include a display (LCD or the like), a speaker, and the like.
  • the memory 504 can include read only memory and random access memory and provides instructions and data to the processor 501.
  • a portion of the memory 504 can also include a non-volatile random access memory.
  • the memory 504 can also store information of the device type.
  • In a specific implementation, the processor 501, the input device 502, and the output device 503 described in the embodiments of the present invention may carry out the implementations described in the first and second embodiments of the full-screen one-hand operation method provided by the embodiments of the present invention, and may also carry out the implementation of the terminal device described in the embodiments of the present invention; details are not repeated here.
  • In another embodiment of the present invention, a computer-readable storage medium is provided. The computer-readable storage medium stores a computer program, and the computer program includes program instructions that, when executed by a processor, implement the method embodiments illustrated in FIG. 1 and FIG. 3 above.
  • the computer readable storage medium may be an internal storage unit of the terminal device described in any of the foregoing embodiments, such as a hard disk or a memory of the terminal device.
  • The computer-readable storage medium may also be an external storage device of the terminal device, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card equipped on the terminal device.
  • the computer readable storage medium may also include both an internal storage unit of the terminal device and an external storage device.
  • the computer readable storage medium is for storing the computer program and other programs and data required by the terminal device.
  • the computer readable storage medium can also be used to temporarily store data that has been output or is about to be output.
  • the disclosed terminal device and method may be implemented in other manners.
  • The device embodiments described above are merely illustrative. For example, the division into units is only a division by logical function; in an actual implementation there may be other ways of dividing them, for example multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, device or unit, or an electrical, mechanical or other form of connection.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the embodiments of the present invention.
  • each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
  • the integrated unit if implemented in the form of a software functional unit and sold or used as a standalone product, may be stored in a computer readable storage medium.
  • the medium includes instructions for causing a computer device (which may be a personal computer, server, or network device, etc.) to perform all or part of the steps of the methods described in various embodiments of the present invention.
  • The foregoing storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A full-screen one-hand operation method, a terminal device and a computer-readable medium, the method comprising: after the terminal device enters a one-hand operation mode, the terminal device displays a virtual operation interface on its main display interface, the area of the virtual operation interface being smaller than the area of the main display interface (101); when a touch operation on a target area within the virtual operation interface is detected, the terminal device performs the touch operation in the area of the main display interface corresponding to the target area (102). With this method, a small-range operation performed in the virtual operation window of the terminal device achieves operation control over the entire main display interface, and the screen display effect is improved when the terminal device is operated with one hand.

Description

一种全屏幕单手操作方法、终端设备及计算机可读介质 技术领域
本发明涉及终端技术领域,尤其涉及一种全屏幕单手操作方法、终端设备及计算机可读介质。
背景技术
随着终端设备技术的不断发展和普及,为实现更多的功能和更好的效果,终端设备的外观也在逐渐变化,例如,终端设备的屏幕变得越来越大,同时也给人们带来了一些烦恼。以手机为例,随着屏幕变大,单手操作手机就变得越来越困难,特别在一些空间狭窄或不便于双手操作(如乘公交站立)的时候,十分不便。
现有的终端设备的单手操作方法,是将终端设备的全屏幕显示的内容缩小显示在屏幕一角的预定固定区域内,该预定固定区域小于终端设备的屏幕区域,用户可以在预定固定区域内进行单手操作。然而,由于预定固定区域较小,在预定固定区域内显示缩小后的显示内容不够清晰(例如,字体模糊),显示效果较差。
发明内容
本发明实施例提供一种全屏幕单手操作方法,可以在终端设备的虚拟操作窗口中进行小范围的操作就能实现应用于整个主显示界面的操作,能够在终端设备上单手操作时提高屏幕显示效果。
第一方面,本发明实施例提供了一种全屏幕单手操作方法,该方法包括:
在终端设备进入单手操作模式后,在所述终端设备的主显示界面上显示虚拟操作界面,所述虚拟操作界面的面积小于所述主显示界面的面积;
当检测到针对所述虚拟操作界面内的目标区域的触控操作时,在所述主显示界面内与所述目标区域对应的区域执行所述触控操作。
第二方面,本发明实施例提供了一种终端设备,该终端设备包括用于执行 上述第一方面的方法的单元。
第三方面,本发明实施例提供了另一种终端设备,包括处理器、输入设备、输出设备和存储器,所述处理器、输入设备、输出设备和存储器相互连接,其中,所述存储器用于存储支持终端设备执行上述方法的计算机程序,所述计算机程序包括程序指令,所述处理器被配置用于调用所述程序指令,执行上述第一方面的方法。
第四方面,本发明实施例提供了一种计算机可读存储介质,所述计算机存储介质存储有计算机程序,所述计算机程序包括程序指令,所述程序指令当被处理器执行时使所述处理器执行上述第一方面的方法。
本发明实施例通过在终端设备进入单手操作模式后,在终端设备的主显示界面上显示虚拟操作界面,在所述虚拟操作窗口中进行小范围的操作就能实现应用于整个主显示界面的操作,可以在终端设备上进行单手操作时提高屏幕显示效果。
附图说明
为了更清楚地说明本发明实施例技术方案,下面将对实施例描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图是本发明的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。
图1是本发明实施例提供的一种全屏幕单手操作方法的示意流程图;
图2a是本发明实施例提供的一种单手操作界面显示位置的示意图;
图2b是本发明实施例提供的另一种单手操作界面显示位置的示意图;
图2c是本发明实施例提供的一种全屏幕单手操作方法中进行触控操作的示意图;
图2d是本发明实施例提供的另一种全屏幕单手操作方法中进行触控操作的界面示意图;
图2e是本发明实施例提供的一种包含虚拟触发按键的界面示意图;
图2f是本发明实施例提供的一种包含直接操作区域的界面示意图;
图2g是本发明实施例提供的一种进入单手操作模式的方法示意图;
图2h是本发明实施例提供的一种拖动虚拟操作界面的方法示意图;
图2i是本发明实施例提供的一种放大或缩小虚拟操作界面的方法示意图;
图2j是本发明实施例提供的一种退出单手操作模式的方法示意图;
图3是本发明另一实施例提供的一种全屏幕单手操作方法的示意流程图;
图4是本发明实施例提供的一种终端设备的示意性框图;
图5是本发明另一实施例提供的一种终端设备示意性框图。
具体实施方式
下面将结合本发明实施例中的附图,对本发明实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例是本发明一部分实施例,而不是全部的实施例。在本文中提及“实施例”意味着,结合实施例描述的特定特征、结构或特性可以包含在本发明的至少一个实施例中。在说明书中的各个位置出现该短语并不一定均是指相同的实施例,也不是与其它实施例互斥的独立的或备选的实施例。本领域技术人员显式地和隐式地理解的是,本文所描述的实施例可以与其它实施例相结合。
基于本发明中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都属于本发明保护的范围。
本发明的说明书和权利要求书及上述附图中的术语“第一”、“第二”等是用于区别不同对象,而不是用于描述特定顺序。此外,术语“包括”和“具有”以及它们任何变形,意图在于覆盖不排他的包含。例如包含了一系列步骤或单元的过程、方法、系统、产品或设备没有限定于已列出的步骤或单元,而是可选地还包括没有列出的步骤或单元,或可选地还包括对于这些过程、方法、产品或设备固有的其他步骤或单元。
还应当理解,在此本发明说明书中所使用的术语仅仅是出于描述特定实施例的目的而并不意在限制本发明。如在本发明说明书和所附权利要求书中所使用的那样,除非上下文清楚地指明其它情况,否则单数形式的“一”、“一个”及“该”意在包括复数形式。
还应当进一步理解,在本发明说明书和所附权利要求书中使用的术语“和/或”是指相关联列出的项中的一个或多个的任何组合以及所有可能组合,并且 包括这些组合。
如在本说明书和所附权利要求书中所使用的那样,术语“如果”可以依据上下文被解释为“当...时”或“一旦”或“响应于确定”或“响应于检测到”。类似地,短语“如果确定”或“如果检测到[所描述条件或事件]”可以依据上下文被解释为意指“一旦确定”或“响应于确定”或“一旦检测到[所描述条件或事件]”或“响应于检测到[所描述条件或事件]”。
具体实现中,本发明实施例中描述的终端设备包括但不限于诸如具有触摸敏感表面(例如,触摸屏显示器和/或触摸板)的移动电话、膝上型计算机或平板计算机之类的其它便携式设备。还应当理解的是,在某些实施例中,所述设备并非便携式通信设备,而是具有触摸敏感表面(例如,触摸屏显示器和/或触摸板)的台式计算机。
在接下来的讨论中,描述了包括显示器和触摸敏感表面的终端设备。然而,应当理解的是,终端设备可以包括诸如物理键盘、鼠标和/或控制杆的一个或多个其它物理用户接口设备。
终端设备支持各种应用程序,例如以下中的一个或多个:绘图应用程序、演示应用程序、文字处理应用程序、网站创建应用程序、盘刻录应用程序、电子表格应用程序、游戏应用程序、电话应用程序、视频会议应用程序、电子邮件应用程序、即时消息收发应用程序、锻炼支持应用程序、照片管理应用程序、数码相机应用程序、数字摄影机应用程序、web浏览应用程序、数字音乐播放器应用程序和/或数字视频播放器应用程序。
可以在终端设备上执行的各种应用程序可以使用诸如触摸敏感表面的至少一个公共物理用户接口设备。可以在应用程序之间和/或相应应用程序内调整和/或改变触摸敏感表面的一个或多个功能以及终端设备上显示的相应信息。这样,终端设备的公共物理架构(例如,触摸敏感表面)可以支持具有对用户而言直观且透明的用户界面的各种应用程序。
参见图1,是本发明实施例提供的一种全屏幕单手操作方法的示意流程图,如图1所示该方法可包括:
101、在终端设备进入单手操作模式后,该终端设备在该终端设备的主显示界面上显示虚拟操作界面,该虚拟操作界面的面积小于该主显示界面的面 积。
本发明实施例中提到的单手操作模式,主要是针对大屏幕终端设备而言的,由于屏幕较大,当用户只用一只手操作时,就会有一些按键很难触碰得到,而且也容易发生意外,所以,单手操作模式就是让屏幕显示面积缩小,用户只用一只手就可以流畅地操作,便于使用的同时避免发生终端设备掉落的事故。
上述终端设备的主显示界面,包括该终端设备当前在整个屏幕中显示的内容,上述主显示界面可以包括该终端设备的桌面显示界面,也可以包括该终端设备安装的应用程序(Application,APP)内的界面。
具体地,在终端设备接入单手操作模式后,在该终端设备的主显示界面上可以显示虚拟操作界面,上述虚拟操作界面内可以包含显示内容,上述显示内容是该主显示界面中显示的内容按照预设比例进行缩小后得到的。
上述虚拟操作界面的面积小于该主显示界面的面积,因此,如图2a所示,上述虚拟操作界面可以显示在该主显示界面边框区域的位置(如主显示界面的右下方区域),或者如图2b所示,上述虚拟操作界面的边缘与该主显示界面的边缘重叠(如虚拟操作界面的右下角与主显示界面的右下角重叠)。
可选地,上述虚拟操作界面可以为透明度可调的区域。上述虚拟操作界面可以根据预设透明度进行显示,上述虚拟操作界面的透明度可以根据需要进行设置,此时可以在虚拟操作界面的区域看到该区域主显示界面的显示内容,避免了上述虚拟操作界面对该区域显示内容的遮挡,提高了屏幕显示效果。
102、当检测到针对该虚拟操作界面内的目标区域的触控操作时,该终端设备在该主显示界面内与上述目标区域对应的区域执行该触控操作。
本发明实施例提到的目标区域为虚拟操作界面内的任意区域,该目标区域和该目标区域对应的区域都可以表示为多个坐标的集合。具体地,当检测到在该目标区域的触控操作时,该触控操作不会直接触发主显示界面的触发事件,而是在该主显示界面内与该目标区域对应的区域执行该触控操作。
具体地,终端设备可以根据该虚拟操作界面内的坐标与该主显示界面内的坐标的对应关系,确定虚拟操作界面的第一目标坐标对应的主显示界面的第二目标坐标,上述第一目标坐标和第二目标坐标可以是一个坐标和多个坐标的集合。在该终端设备中,可以预先建立虚拟操作界面的第一坐标系和主显示界面 的第二坐标系,上述第一坐标系和上述第二坐标系中存在多个坐标一一对应,即确定第一坐标系中的第一目标坐标时,就可以在第二坐标系中根据上述对应关系唯一确定该第一目标坐标对应的第二目标坐标,进一步地,在该主显示界面的该第二目标坐标处执行上述触控操作。可选地,上述虚拟操作界面内的坐标与上述主显示界面内的坐标的对应关系以数据形式储存在终端设备中。该方法中,步骤102可以重复执行。
如图2c所示,若终端设备检测到用户的手指在该虚拟操作界面发生第一滑动操作,上述触控操作即为该第一滑动操作,该第一滑动操作在该虚拟操作界面表示为第一滑动轨迹,该滑动轨迹可以看作是多个坐标的集合,即上述第一目标坐标;根据上述对应关系,可以确定该第一目标坐标对应的第二目标坐标,即可以在上述主显示界面中第二目标坐标位置触发第二滑动操作,该第二滑动操作形成的轨迹与上述第一滑动操作的轨迹形状相同,该第二滑动操作的轨迹是上述第一滑动操作的轨迹按比例放大得到的,该比例由上述对应关系决定。
其中,若终端设备检测到用户的手指在该虚拟操作界面发生第一按压操作,上述触控操作即为该按压操作,终端设备检测到该按压操作发生在该虚拟操作界面第三目标坐标位置;根据上述对应关系,可以确定该第三目标坐标对应的第四目标坐标,即可以在上述主显示界面中第四目标坐标位置触发第二按压操作,该第二按压操作可以触发主显示界面该第四目标坐标位置对应的触发事件,例如,如图2d所示,用户若在虚拟操作界面圈出的区域进行点击操作,即模拟了点击主显示屏幕对应位置触发的事件。
通过上述方法,仅在虚拟操作界面小幅度进行触控操作就可以在主显示界面执行该触控操作,能够在终端设备上进行单手操作时提高屏幕显示效果。
在可选实施例中,上述虚拟操作界面中可以包含虚拟触发按键,上述虚拟触发按键用于触发该主显示界面执行与上述虚拟触发按键对应的触发事件。可选的,上述确定该虚拟操作界面的第一目标坐标对应的主显示界面的第二目标坐标之后,可以在该第二目标坐标处显示第一指示元素,该第一指示元素用于指示触控操作的位置,该第一指示元素在主显示界面可以显示为圆形光标或者箭头。该第一指示元素可以便于用户确定上次在主显示界面上的触控位置,从 而便于在用户使用虚拟触发按键时准确地对该第一指示元素指示的位置进行对应的触发操作。可选的,还可以在第一目标坐标处显示第二指示元素,该第二指示元素可以便于用户确定上次在虚拟操作界面上的触控位置。其中,当虚拟触控界面上预设时长(例如,5秒、或10秒、或30秒等)内未发生触控操作,则可以隐藏上述第一指示元素,避免第一指示元素遮挡主显示界面的显示内容,提升主显示界面的显示效果。
如图2e所示,上述虚拟触发按键可以位于该主显示界面的侧边区域,终端设备检测到用户在点击虚拟触发按键时,在主显示界面的第一指示元素指示的坐标位置触发该虚拟触发按键对应的触发事件,例如主显示界面为终端设备的桌面显示界面,该界面显示的内容包含应用A、应用B和应用C的图标,在该虚拟操作界面内的虚拟触发按键包括确定按键,该确定按键用于触发该主显示界面执行对应的触发事件,若在主显示界面的第一指示元素为一个箭头,当该箭头指示位置为应用B的图标区域时,用户点击虚拟操作界面的该确定按键,则主显示页面显示的内容为进入应用B内的显示页面内容,即达到了在非单手操作模式下直接进行触控操作的效果。因为虚拟操作界面的显示内容是主显示界面显示的内容按比例缩小后得到的,在虚拟操作界面较小的情况下,通过虚拟触发按键模拟触控操作,比在虚拟操作界面直接进行触控操作的准确性更高。
在可选实施例中,终端设备的主显示界面与虚拟操作界面未重合区域内可以有直接操作区域,上述直接操作区域是在该虚拟操作界面外围可以直接对主显示界面进行触控操作的区域。该直接操作区域可以为该主显示界面的下半部区域,或者在该主显示界面内左下方或右下方的一边为弧形的区域,可选的,该区域可以根据该虚拟操作区域的位置决定,如图2f所示,直接操作区域为在该主显示界面内右下方的一边为弧形的区域,用户单手操作的可达范围在该区域内,即在需要对该直接操作区域进行触控操作时,不需要再通过上述虚拟操作界面,该终端设备也能响应该直接操作区域内对上述主显示界面的上述触控操作,使用户对该区域内的触控操作更方便。
请参见图3,图3是本发明第二实施例提供的一种全屏幕单手操作方法的示意流程图,图3是在图1的基础上进一步优化得到的。如图3所示,该全屏 幕单手操作方法可包括如下步骤:
301、终端设备检测该终端设备是否符合激活条件,若符合上述激活条件时,该终端设备进入单手操作模式。
上述激活条件指该终端设备的单手操作模式的激活条件,上述激活条件包括:该终端设备的屏幕被按照第一预定操作进行触控,或者,该终端设备的第一预定按键被触控。若该终端设备符合上述激活条件时,该终端设备进入单手操作模式。
具体地,当该终端设备的屏幕被按照第一预定操作进行触控,该终端设备符合激活条件,该终端设备进入单手操作模式。可选地,第一预定操作可以为该终端设备的屏幕下边框区域的滑动操作,例如,如图2g所示,当该终端设备检测到屏幕的预设边框区域a发生向上的滑动操作,在预设边框区域a内该滑动操作形成的滑动轨迹大于第一预设轨迹长度(例如:3厘米)时,该终端设备符合上述激活条件,进入上述单手操作模式,若该滑动操作形成的滑动轨迹小于或等于第一预设轨迹长度时,结束本流程。可选地,当该终端设备检测到屏幕预设边框区域发生向上的滑动操作之后,该终端设备可以检测该滑动操作形成的滑动轨迹是否在屏幕的预设侧边区域内,若是,该终端设备符合上述激活条件,进入上述单手操作模式,若不是,结束本流程。
可选地,当该终端设备的第一预定按键被触控,该终端设备符合上述激活条件,该终端设备进入单手操作模式。例如,当该终端设备检测到屏幕预设的预设组合按键被触控,该终端设备符合上述激活条件,进入上述单手操作模式。具体地,终端设备的预设组合按键可以为音量下键+Home键,同时按下该终端设备的音量下键和Home键,该终端设备符合上述激活条件,进入上述单手操作模式。其中,Home键是Windows操作系统、iOS操作系统、安卓操作系统的终端设备中具有回到主屏幕功能的键。
302、在该终端设备进入单手操作模式后,该终端设备在该终端设备的主显示界面上显示虚拟操作界面,该虚拟操作界面的面积小于该主显示界面的面积。
在可选实施例中,在该终端设备的主显示界面上显示虚拟操作界面之后,该方法还包括:
当检测到针对该虚拟操作界面的预设边框区域的第一拖动操作时,在该主显示界面上移动该虚拟操作界面;当检测到针对该虚拟操作界面的预设角落区域的第二拖动操作时,将该虚拟操作界面进行缩小或放大。
具体的实现方式可以是,终端设备检测到在预设边框区域或在预设角落区域的按压操作时,该虚拟操作区域的边框被选中(例如显示为红色),表示可以进行拖动操作。
具体地,该虚拟操作界面有预设边框区域,向如图2h所示,当针对该预设边框区域进行第一拖动操作时,该虚拟操作界面可以按照该第一拖动操作的移动方向在该主显示界面上移动。通过该方法可以拖动该虚拟操作界面在该主显示界面移动,选择该虚拟操作界面合适的显示位置,便于用户实现单手操作,(例如将该虚拟显示界面拖到屏幕左侧显示,便于惯用左手的用户进行操作),也可以避免遮挡主显示界面需要显示的页面内容。
具体地,该虚拟操作界面可以有预设角落区域,该预设角落区域指以该虚拟操作界面的任一角为中心向外延伸固定范围的区域,如图2i所示,当针对该虚拟操作界面的预设角落区域进行第二拖动操作时,该虚拟操作界面可以按照该第二拖动操作在主显示界面进行缩小或放大。通过该方法可以拖动调节该虚拟操作界面在该主显示界面的显示大小,以便用户调节该虚拟操作界面的显示效果,将该虚拟显示界面调整至便于用户个人操作的大小,可以也避免该虚拟操作界面太小内容不清晰影响操作的情况。
303、当检测到针对该虚拟操作界面内的目标区域的触控操作时,该终端设备在该主显示界面内与上述目标区域对应的区域执行该触控操作。
304、该终端设备检测该终端设备是否符合退出上述单手操作模式的退出条件,若符合上述退出条件时,该终端设备隐藏该虚拟操作界面,退出上述单手操作模式。
上述退出条件包括:该终端设备的屏幕被按照第二预定操作进行触控,或者,该终端设备的第二预定按键被触控。
具体地,当该终端设备的屏幕被按照第二预定操作进行触控,该终端设备符合上述退出条件,该终端设备隐藏该虚拟操作界面,退出上述单手操作模式。可选地,第二预定操作可以为该终端设备的屏幕侧边区域的滑动操作,例如, 如图2j所示,当该终端设备检测到屏幕预设侧边区域b发生向屏幕中央的滑动操作,该滑动操作形成的滑动轨迹大于第二预设轨迹长度(例如:2厘米)时,该终端设备符合上述退出条件,退出上述单手操作模式,若该滑动操作形成的滑动轨迹小于或等于第二预设轨迹长度时,结束本流程。
可选地,当该终端设备的第二预定按键被触控,该终端设备符合上述退出条件,该终端设备退出单手操作模式。例如,终端设备的第二预定按键可以为一组组合按键:音量下键+Home键,同时按下该终端设备的音量下键和Home键的情况下,该终端设备符合上述退出条件,退出上述单手操作模式。
其中,步骤302和步骤303可以参见上述图1所示的步骤101和步骤102的具体描述,此处不再赘述。
本实施例中,在图一所示的实施例的方法的基础上,增加了终端设备进入单手操作模式和退出单手操作模式的实现方式,特别是在终端设备进入单手操作模式前,可以检测用户触控位置是否靠近屏幕角落,更准确地判断是否为单手操作的情况,便于移动终端快速进入单手操作模式,方便用户使用。
本发明实施例还提供一种终端设备,该终端设备包括用于执行前述任一项所述的方法的单元。具体地,参见图4,是本发明实施例提供的一种终端设备的示意框图。本实施例的终端设备包括:显示单元、检测单元以及执行单元。
显示单元410,用于在终端设备进入单手操作模式后,在该终端设备的主显示界面上显示虚拟操作界面。
具体地,显示单元410显示的该虚拟操作界面的面积小于该主显示界面的面积,该虚拟操作界面内可以包含显示内容,上述显示内容是该主显示界面中显示的内容按照预设比例进行缩小后得到的。
可选地,显示单元410显示的上述虚拟操作界面可以为透明度可调的区域。该虚拟操作界面可以根据预设透明度进行显示,上述虚拟操作界面的透明度可以根据需要进行设置,此时可以在该虚拟操作界面的区域看到该区域主显示界面的显示内容,避免了上述虚拟操作界面对该区域显示内容的遮挡,提高了屏幕显示效果。
可选地,显示单元410可以显示包含虚拟触发按键的虚拟显示界面,上述虚拟触发按键用于触发该主显示界面执行与上述虚拟触发按键对应的触发事 件。
可选地,显示单元410还用于在主显示页面内显示指示元素,该指示元素用于指示触控操作的位置,该指示元素在主显示界面可以显示为圆形光标或者箭头。
第一检测单元420,用于检测针对上述虚拟操作界面内的目标区域的触控操作。
具体地,当第一检测单元420检测到在该目标区域的触控操作时,该触控操作不会直接触发主显示界面的触发事件,而是触发执行单元430在该主显示界面内与该目标区域对应的区域执行该触控操作。
执行单元430,用于在上述主显示界面内与该目标区域对应的区域执行上述触控操作。
在可选实施例中,终端设备还可以包括第二检测单元440,其中:
第二检测单元440,用于检测终端设备是否符合激活条件,若符合上述激活条件时,该终端设备进入单手操作模式。
具体地,上述激活条件指该终端设备的单手操作模式的激活条件,上述激活条件包括:该终端设备的屏幕被按照第一预定操作进行触控,或者,该终端设备的第一预定按键被触控。若第二检测单元440检测到该终端设备符合上述激活条件时,该终端设备进入单手操作模式。
第二检测单元440,还可以用于检测该终端设备是否符合退出上述单手操作模式的退出条件,若符合上述退出条件时,该终端设备隐藏该虚拟操作界面,退出上述单手操作模式。
具体地,上述退出条件包括:该终端设备的屏幕被按照第二预定操作进行触控,或者,该终端设备的第二预定按键被触控。当第二检测单元440检测到该终端设备符合上述退出条件,该终端设备隐藏该虚拟操作界面,退出上述单手操作模式。
在可选实施例中,第二检测单元440,还用于检测针对该虚拟操作界面的预设边框区域的第一拖动操作,和针对该虚拟操作界面的预设角落区域的第二拖动操作。
具体地,当第二检测单元440检测到针对该虚拟操作界面的预设边框区域 的第一拖动操作时,显示单元410在该主显示界面上显示该虚拟操作界面的移动;当第二检测单元440检测到针对该虚拟操作界面的预设角落区域的第二拖动操作时,显示单元410在该主显示界面上显示该虚拟操作界面的缩小或放大。
参见图5,是本发明另一实施例提供的一种终端设备示意框图。如图所示的本实施例中的终端设备可以包括:一个或多个处理器501;一个或多个输入设备502,一个或多个输出设备503和存储器504。上述处理器501、输入设备502、输出设备503和存储器504通过总线505连接。存储器504用于存储计算机程序,所述计算机程序包括程序指令,处理器501用于执行存储器504存储的程序指令。
输出设备503,用于在终端设备进入单手操作模式后,在该终端设备的主显示界面上显示虚拟操作界面。
输出设备503,还用于在主显示界面内显示指示元素。
处理器501,用于在上述主显示界面内与该目标区域对应的区域执行上述触控操作。
具体地,处理器501用于根据该虚拟操作界面内的坐标与该主显示界面内的坐标的对应关系,确定该第一目标坐标对应的第二目标坐标;处理器501还用于在该主显示界面的该第二目标坐标处执行上述触控操作。
在可选实施例中,处理器501还用于检测终端设备是否符合激活条件,若符合上述激活条件时,该终端设备进入上述单手操作模式。上述激活条件包括:该终端设备的屏幕被按照第一预定操作进行触控,或者,该终端设备的第一预定按键被触控。
可选的,处理器501用于检测针对该虚拟操作界面的预设边框区域的第一拖动操作,当检测到针对该虚拟操作界面的预设边框区域的第一拖动操作时,通过输出设备503在所述主显示界面上移动所述虚拟操作界面;处理器501还用于检测针对该虚拟操作界面的预设角落区域的第二拖动操作,当检测到针对该虚拟操作界面的预设角落区域的第二拖动操作时,通过输出设备503将所述虚拟操作界面进行缩小或放大。
可选的,存储器504用于存储虚拟操作界面内和主显示界面内的坐标及其 对应关系。
应当理解,在本发明实施例中,所称处理器501可以是中央处理单元(Central Processing Unit,CPU),该处理器还可以是其他通用处理器、数字信号处理器(Digital Signal Processor,DSP)、专用集成电路(Application Specific Integrated Circuit,ASIC)、现成可编程门阵列(Field-Programmable Gate Array,FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件等。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。
输入设备502可以包括触控板、指纹采传感器(用于采集用户的指纹信息和指纹的方向信息)、麦克风等,输出设备503可以包括显示器(LCD等)、扬声器等。
该存储器504可以包括只读存储器和随机存取存储器,并向处理器501提供指令和数据。存储器504的一部分还可以包括非易失性随机存取存储器。例如,存储器504还可以存储设备类型的信息。
具体实现中,本发明实施例中所描述的处理器501、输入设备502、输出设备503可执行本发明实施例提供的全屏幕单手操作方法的第一实施例和第二实施例中所描述的实现方式,也可执行本发明实施例所描述的终端设备的实现方式,在此不再赘述。
在本发明的另一实施例中提供一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序,所述计算机程序包括程序指令,被处理器执行时实现上述图1和图3所示的方法实施例。
所述计算机可读存储介质可以是前述任一实施例所述的终端设备的内部存储单元,例如终端设备的硬盘或内存。所述计算机可读存储介质也可以是所述终端设备的外部存储设备,例如所述终端设备上配备的插接式硬盘,智能存储卡(Smart Media Card,SMC),安全数字(Secure Digital,SD)卡,闪存卡(Flash Card)等。进一步地,所述计算机可读存储介质还可以既包括所述终端设备的内部存储单元也包括外部存储设备。所述计算机可读存储介质用于存储所述计算机程序以及所述终端设备所需的其他程序和数据。所述计算机可读存储介质还可以用于暂时地存储已经输出或者将要输出的数据。
本领域普通技术人员可以意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,能够以电子硬件、计算机软件或者二者的结合来实现,为了清楚地说明硬件和软件的可互换性,在上述说明中已经按照功能一般性地描述了各示例的组成及步骤。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本发明的范围。
所属领域的技术人员可以清楚地了解到,为了描述的方便和简洁,上述描述的终端设备和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请所提供的几个实施例中,应该理解到,所揭露的终端设备和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另外,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口、装置或单元的间接耦合或通信连接,也可以是电的,机械的或其它的形式连接。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本发明实施例方案的目的。
另外,在本发明各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以是两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本发明的技术方案本质上或者说对现有技术做出贡献的部分,或者该技术方案的全部或部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储 介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行本发明各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(ROM,Read-Only Memory)、随机存取存储器(RAM,Random Access Memory)、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本发明的具体实施方式,但本发明的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本发明揭露的技术范围内,可轻易想到各种等效的修改或替换,这些修改或替换都应涵盖在本发明的保护范围之内。因此,本发明的保护范围应以权利要求的保护范围为准。

Claims (10)

  1. A full-screen one-hand operation method, characterized by comprising:
    after a terminal device enters a one-hand operation mode, displaying a virtual operation interface on a main display interface of the terminal device, an area of the virtual operation interface being smaller than an area of the main display interface;
    when a touch operation on a target area within the virtual operation interface is detected, performing the touch operation in an area of the main display interface corresponding to the target area.
  2. The method according to claim 1, characterized in that the target area comprises a first target coordinate, and performing the touch operation in the area of the main display interface corresponding to the target area comprises:
    determining, according to a correspondence between coordinates in the virtual operation interface and coordinates in the main display interface, a second target coordinate corresponding to the first target coordinate;
    performing the touch operation at the second target coordinate of the main display interface.
  3. The method according to claim 2, characterized in that, before displaying the virtual operation interface on the main display interface of the terminal device, the method further comprises:
    detecting whether the terminal device meets an activation condition, and entering the one-hand operation mode if the activation condition is met, the activation condition comprising: a screen of the terminal device being touched according to a first predetermined operation, or a first predetermined button of the terminal device being touched.
  4. The method according to claim 3, characterized in that, after displaying the virtual operation interface on the main display interface of the terminal device, the method further comprises:
    when a first drag operation on a preset border area of the virtual operation interface is detected, moving the virtual operation interface on the main display interface;
    when a second drag operation on a preset corner area of the virtual operation interface is detected, reducing or enlarging the virtual operation interface.
  5. The method according to claim 4, characterized in that the virtual operation interface contains a virtual trigger button, and the virtual trigger button is used to trigger the main display interface to execute a trigger event corresponding to the virtual trigger button.
  6. The method according to claim 5, characterized in that the virtual operation interface contains display content, and the display content is obtained by scaling down the content displayed in the main display interface according to a preset ratio.
  7. The method according to claim 6, characterized in that, after performing the touch operation in the area of the main display interface corresponding to the target area when the touch operation on the target area within the virtual operation interface is detected, the method further comprises:
    detecting whether the terminal device meets an exit condition for exiting the one-hand operation mode, and if the exit condition is met, hiding the virtual operation interface and exiting the one-hand operation mode, the exit condition comprising: the screen of the terminal device being touched according to a second predetermined operation, or a second predetermined button of the terminal device being touched.
  8. A terminal device, characterized by comprising units for performing the method according to any one of claims 1-7.
  9. A terminal device, characterized by comprising a processor, an input device, an output device and a memory, the processor, the input device, the output device and the memory being connected to one another, wherein the memory is used to store a computer program, the computer program comprises program instructions, and the processor is configured to invoke the program instructions to perform the method according to any one of claims 1-7.
  10. A computer-readable storage medium, characterized in that the computer storage medium stores a computer program, the computer program comprises program instructions, and the program instructions, when executed by a processor, cause the processor to perform the method according to any one of claims 1-7.
PCT/CN2017/102230 2017-09-19 2017-09-19 一种全屏幕单手操作方法、终端设备及计算机可读介质 WO2019056167A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201780095019.0A CN111316200A (zh) 2017-09-19 2017-09-19 一种全屏幕单手操作方法、终端设备及计算机可读介质
PCT/CN2017/102230 WO2019056167A1 (zh) 2017-09-19 2017-09-19 一种全屏幕单手操作方法、终端设备及计算机可读介质

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/102230 WO2019056167A1 (zh) 2017-09-19 2017-09-19 一种全屏幕单手操作方法、终端设备及计算机可读介质

Publications (1)

Publication Number Publication Date
WO2019056167A1 true WO2019056167A1 (zh) 2019-03-28

Family

ID=65809486

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/102230 WO2019056167A1 (zh) 2017-09-19 2017-09-19 一种全屏幕单手操作方法、终端设备及计算机可读介质

Country Status (2)

Country Link
CN (1) CN111316200A (zh)
WO (1) WO2019056167A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114625300A (zh) * 2022-01-26 2022-06-14 北京讯通安添通讯科技有限公司 一种智能终端的操作方法、装置、终端和存储介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102937873A (zh) * 2012-10-12 2013-02-20 天津三星通信技术研究有限公司 在便携式终端中进行键盘输入的方法和设备
CN103914258A (zh) * 2014-03-26 2014-07-09 深圳市中兴移动通信有限公司 移动终端及其操作方法
CN104007930A (zh) * 2014-06-09 2014-08-27 深圳市中兴移动通信有限公司 一种移动终端及其实现单手操作的方法和装置
CN104238745A (zh) * 2014-07-31 2014-12-24 天津三星通信技术研究有限公司 一种移动终端单手操作方法及移动终端

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103955339A (zh) * 2014-04-25 2014-07-30 华为技术有限公司 一种终端操作方法及终端设备
CN106371688B (zh) * 2015-07-22 2019-10-01 小米科技有限责任公司 全屏幕单手操作方法及装置

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102937873A (zh) * 2012-10-12 2013-02-20 天津三星通信技术研究有限公司 在便携式终端中进行键盘输入的方法和设备
CN103914258A (zh) * 2014-03-26 2014-07-09 深圳市中兴移动通信有限公司 移动终端及其操作方法
CN104007930A (zh) * 2014-06-09 2014-08-27 深圳市中兴移动通信有限公司 一种移动终端及其实现单手操作的方法和装置
CN104238745A (zh) * 2014-07-31 2014-12-24 天津三星通信技术研究有限公司 一种移动终端单手操作方法及移动终端

Also Published As

Publication number Publication date
CN111316200A (zh) 2020-06-19

Similar Documents

Publication Publication Date Title
US8654076B2 (en) Touch screen hover input handling
JP5893060B2 (ja) 連続的なズーム機能を提供するユーザーインターフェイスの方法
US8842084B2 (en) Gesture-based object manipulation methods and devices
US10042546B2 (en) Systems and methods to present multiple frames on a touch screen
EP2825944B1 (en) Touch screen hover input handling
US20120030624A1 (en) Device, Method, and Graphical User Interface for Displaying Menus
US20120026100A1 (en) Device, Method, and Graphical User Interface for Aligning and Distributing Objects
WO2018068328A1 (zh) 一种界面显示的方法及终端
WO2018119674A1 (zh) 一种柔性显示屏的控制方法及装置
US20110163967A1 (en) Device, Method, and Graphical User Interface for Changing Pages in an Electronic Document
WO2018233399A1 (zh) 显示方法、移动终端及计算机可读存储介质
WO2019119799A1 (zh) 一种显示应用图标的方法及终端设备
JP2014529138A (ja) タッチ入力を用いたマルチセル選択
WO2021203815A1 (zh) 页面操作方法、装置、终端及存储介质
WO2017059734A1 (zh) 一种图片缩放方法及电子设备
WO2014044133A1 (zh) 应用界面及控制应用界面操作的方法和装置
WO2019062431A1 (zh) 拍摄方法及移动终端
WO2019015581A1 (zh) 文字删除方法及移动终端
WO2016184052A1 (zh) 一种卡片的添加方法、装置、设备及计算机存储介质
US9304650B2 (en) Automatic cursor rotation
TW202145206A (zh) 單手操作模式開啟方法、終端及電腦儲存媒介
CN108491152B (zh) 基于虚拟光标的触屏终端操控方法、终端及介质
WO2014075540A1 (zh) 触摸屏滚屏控制系统及方法
WO2019056167A1 (zh) 一种全屏幕单手操作方法、终端设备及计算机可读介质
WO2023093661A1 (zh) 界面控制方法、装置、电子设备及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17926273

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17926273

Country of ref document: EP

Kind code of ref document: A1