CN118113181A - Single-hand operation method and electronic equipment

Single-hand operation method and electronic equipment

Info

Publication number
CN118113181A
CN118113181A (application CN202211521350.3A)
Authority
CN
China
Prior art keywords
area
interface
control
display
touch
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202211521350.3A
Other languages
Chinese (zh)
Inventor
高超 (Gao Chao)
赵增智 (Zhao Zengzhi)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202211521350.3A
Priority to PCT/CN2023/127972 (published as WO2024114234A1)
Publication of CN118113181A
Legal status: Withdrawn

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0487: using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: for inputting data by handwriting, e.g. gesture or text
    • G06F 3/04886: by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a single-hand operation method and an electronic device, relating to the technical field of electronic devices. In this scheme, when the electronic device displays a first interface and detects a trigger instruction for the one-handed operation mode, it determines a first area and a second area of the display screen, then reduces the original display area of the display screen, together with the first interface shown in it, and displays them in the second area. In response to a touch operation by the user on the first area, the electronic device then performs the corresponding function on the reduced original display area shown in the second area and the first interface displayed in it.

Description

Single-hand operation method and electronic equipment
Technical Field
Embodiments of the application relate to the technical field of electronic devices, and in particular to a one-handed operation method and an electronic device.
Background
With the continuous development of electronic devices, more and more devices with display screens, such as mobile phones and tablet computers, are widely used in people's daily life and work. With the development of screen technology, the display screens of electronic devices are becoming larger and larger, providing users with richer information and a better use experience.
However, as display screens grow larger, the inconvenience of one-handed operation becomes more and more apparent. For example, when a user holds the electronic device with one hand, it is difficult to reach the full range of the display screen with the holding hand, and a large portion of the screen cannot be operated. The user then has to use both hands, holding the electronic device with one hand and operating its display screen with the other, which degrades the one-handed operation experience.
Disclosure of Invention
The application provides a one-handed operation method and an electronic device, which can improve the user's one-handed operation experience.
In order to achieve the above purpose, the embodiment of the application adopts the following technical scheme:
In a first aspect, a one-handed operation method is provided, which may be applied to an electronic device that includes a display screen. The method comprises: displaying a first interface; in response to a trigger instruction for the one-handed operation mode, determining a first area and a second area of the display screen, where the first area is the area of the display screen that a finger of the user's holding hand can touch while the user holds the electronic device with one hand, and the second area is the remaining area of the display screen outside the first area; displaying the reduced first interface in the second area; and, in response to a touch operation acting on the first area, executing the function corresponding to that touch operation on the reduced first interface.
In the foregoing aspect, when the electronic device displays the first interface in the screen display area of the display screen and detects a trigger instruction for entering the one-handed operation mode, the electronic device may take the area of the display screen that the user can touch with one hand (i.e., a portion of the screen display area) as the touch area of the one-handed operation mode, to receive the user's one-handed touch operations, and take the remaining area of the display screen as the display area of the one-handed operation mode, in which the original screen display area and the first interface it displays are shown reduced. The electronic device may then sense the user's touch operations within the touch area and, according to a sensed operation, correspondingly control the reduced original screen display area shown in the display area and the first interface displayed in it. In this way, the user can operate the full range of the reduced original interface by performing touch operations in the touch area. Without requiring applications to be newly developed or adapted, this solves the problem that part of the content displayed on the screen cannot be reached by a finger when the user operates the electronic device with one hand.
In one possible implementation, determining the first area of the display screen may include: prompting the user to slide a finger of the holding hand along a specified track on the display screen while holding the electronic device with one hand; determining, from that sliding, the maximum area of the display screen that the finger can touch; and determining the first area of the display screen from that maximum area. In this way, the configured touch area matches each user's one-handed operation habits, and the user can reach every position of the touch area while holding the electronic device with one hand.
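As a rough illustration (not taken from the patent itself), the following Kotlin sketch shows one way the maximum reachable area could be derived from the points sampled during such a calibration slide; the type names and the bounding-box approximation are assumptions.

```kotlin
import kotlin.math.max
import kotlin.math.min

// Illustrative types; none of these names come from the patent.
data class Point(val x: Float, val y: Float)
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)

// Approximates the maximum reachable area as the bounding box of the points
// sampled while the user's thumb follows the prompted track, clipped to the
// screen bounds. A production implementation might fit an arc instead.
fun deriveTouchArea(samples: List<Point>, screen: Rect): Rect {
    require(samples.isNotEmpty()) { "calibration produced no touch samples" }
    var l = Float.POSITIVE_INFINITY; var t = Float.POSITIVE_INFINITY
    var r = Float.NEGATIVE_INFINITY; var b = Float.NEGATIVE_INFINITY
    for (p in samples) {
        l = min(l, p.x); t = min(t, p.y)
        r = max(r, p.x); b = max(b, p.y)
    }
    return Rect(max(l, screen.left), max(t, screen.top),
                min(r, screen.right), min(b, screen.bottom))
}
```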
In one possible implementation, the first interface may include a target control, and executing, in response to a touch operation acting on the first area, the function corresponding to the touch operation on the reduced first interface may include: in response to a touch operation acting on a target position in the first area, mapping that touch operation to a touch operation acting on the target control in the reduced first interface, where a coordinate mapping relationship is established in advance between the target position in the first area and the position of the target control in the first interface; and, in response to the touch operation on the target control in the reduced first interface, executing the function corresponding to the target control on the reduced first interface.
In this way, by establishing a coordinate mapping relationship between the touch area and the display area of the one-handed operation mode, the electronic device can map the user's touch at a position in the touch area to a touch at the corresponding position in the display area, and then execute the function that a touch at that display-area position would trigger. The user therefore does not need to operate in a display area that one hand cannot reach, and can instead touch directly within the reachable touch area to operate the full range of the original interface displayed reduced in the display area.
Optionally, taking the first interface as a video playing interface and the target control as a play control located at the center of the video playing interface as an example, a coordinate mapping relationship may be pre-established between the center of the video playing interface and the center of the first area. Executing, in response to the touch operation acting on the first area, the function corresponding to the touch operation on the reduced first interface may then include: in response to a touch operation acting on the center of the first area, mapping it to a touch operation acting on the play control in the reduced video playing interface; and, in response to that touch operation on the play control, playing the video in the reduced video playing interface.
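The coordinate mapping itself can be as simple as a linear normalization between the two rectangles. The sketch below is an illustration under that assumption, not the patent's implementation; with this mapping, a touch at the center of the touch area lands exactly on the center of the reduced interface, where the play control of the video example sits.

```kotlin
// Reuses the Point and Rect types from the previous sketch.
// Linearly maps a touch inside the first area (touch area) to the matching
// position inside the reduced first interface.
fun mapTouchToInterface(touch: Point, touchArea: Rect, reducedUi: Rect): Point {
    val u = (touch.x - touchArea.left) / (touchArea.right - touchArea.left)
    val v = (touch.y - touchArea.top) / (touchArea.bottom - touchArea.top)
    return Point(reducedUi.left + u * (reducedUi.right - reducedUi.left),
                 reducedUi.top + v * (reducedUi.bottom - reducedUi.top))
}
```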
In this way, the electronic device may establish the mapping relationship between the layout orientation of the target control in the first interface (for example, the center, upper-left, upper-right, lower-left, or lower-right of the display area) and the corresponding positions in the touch area of the one-handed operation mode (for example, the center, upper-left, upper-right, lower-left, or lower-right of the touch area). This ensures that the user's touch position in the touch area corresponds to the position of the target control in the first interface, so the user can quickly and accurately touch the control to be operated from within the touch area.
In one possible implementation, the first interface may include a target control, and after the reduced first interface is displayed, the one-handed operation method further includes: displaying the target control in the first area. In this way, the electronic device can map a control in the first interface into the touch area for display, so that the user can intuitively and accurately operate the control from within the touch area.
In this implementation, executing, in response to the touch operation acting on the first area, the function corresponding to the touch operation on the reduced first interface may include: in response to a touch operation acting on the target control in the first area, executing the function corresponding to the target control on the reduced first interface. Because the target control displayed in the touch area is mapped from the target control in the first interface, triggering the displayed copy can be treated as triggering the target control in the first interface, and the electronic device can directly execute, on the reduced first interface, the function that the target control triggers.
Optionally, taking the first interface as a video playing interface and the target control as a play control as an example, after the reduced first interface is displayed, the one-handed operation method further includes: displaying the play control in the first area. Executing, in response to the touch operation acting on the first area, the function corresponding to the touch operation on the reduced first interface may then include: in response to a touch operation acting on the play control in the first area, playing the video in the reduced video playing interface. Thus, when the electronic device displays the reduced video playing interface in the one-handed operation mode, it can display the interface's play control in the touch area, so that the user can intuitively and accurately operate it there with one hand.
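A minimal sketch of such a mirrored control follows; the control id and callback are hypothetical names, not taken from the patent.

```kotlin
// A mirrored copy of a control shown in the touch area: tapping the copy is
// treated as triggering the original control in the reduced interface.
class MirroredControl(
    private val originalId: String,
    private val performOriginal: (String) -> Unit
) {
    fun onTap() = performOriginal(originalId)
}

fun main() {
    val play = MirroredControl("video_play_button") { id ->
        println("trigger $id in the reduced video playing interface")
    }
    play.onTap() // would start playback in the reduced interface
}
```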
In one possible implementation, the target control is a first control, and after the reduced first interface is displayed, the one-handed operation method may further include: obtaining, from the first interface and according to the display style of each control in the first interface, the control fixedly displayed at a first position in the first interface as the first control, where the first position includes at least one of a top position and a bottom position. It will be appreciated that controls located at the top or bottom of a user interface are typically fixed-position operational controls that do not move as the page is browsed, so the electronic device can map them into the touch area for display, letting the user operate these fixed controls there at any time. Optionally, when mapped into the touch area, such a control may be displayed at the corresponding top or bottom of the touch area.
Optionally, the first control may include at least one of a title bar, a navigation bar, and a menu bar.
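One plausible way to pick out such fixed controls (a title bar, navigation bar, or menu bar) is to test whether a control is pinned inside a band at the top or bottom edge of the interface. The field names and the 12% band in this sketch are assumptions.

```kotlin
// Hypothetical control metadata; a real system would read this from the
// view hierarchy.
data class ControlInfo(
    val id: String,              // e.g. "title_bar", "navigation_bar"
    val top: Float,              // y coordinate of the control's top edge
    val bottom: Float,           // y coordinate of the control's bottom edge
    val scrollsWithContent: Boolean
)

// A control counts as "fixed" if it does not scroll with content and sits
// inside a band at the top or bottom edge of the interface.
fun selectFixedControls(
    controls: List<ControlInfo>,
    interfaceHeight: Float,
    edgeBand: Float = 0.12f
): List<ControlInfo> = controls.filter { c ->
    !c.scrollsWithContent &&
        (c.bottom <= interfaceHeight * edgeBand ||        // top band
         c.top >= interfaceHeight * (1f - edgeBand))      // bottom band
}
```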
In one possible implementation, the target control is a second control, and after the reduced first interface is displayed, the one-handed operation method further includes: obtaining, from the first interface and according to the display style of each control in the first interface, a control that is not fixedly displayed at the first position as a second control, where the first position includes at least one of a top position and a bottom position; and displaying a selection cursor on the reduced first interface, where the selection cursor includes a boundary that delimits a selection area of the selection cursor in the reduced first interface, the selection area being used to select at least one second control. Displaying the target control in the first area then comprises: displaying, in the first area, the second control within the selection area of the selection cursor.
Because second controls in non-fixed positions move with page browsing and may scroll into or out of view, the electronic device need not map a second control at all times, but can map and display it when the user browses to it. Optionally, the electronic device may determine the second control currently being browsed by displaying the selection cursor, and map the second control selected by the selection cursor into the touch area for display.
Optionally, the selection cursor may move with the user's sliding operation in the touch area; as the cursor moves, the electronic device can map the newly selected second control into the touch area for display. In this way, the control selected by the selection cursor is mapped into the touch area in real time.
Optionally, taking the first interface as a browsing interface that includes a plurality of second controls, among them a first option control, as an example, the one-handed operation method may further include: when the selection area of the selection cursor includes the first option control, displaying the first option control in the first area. Thus, when the electronic device displays the reduced browsing interface in the one-handed operation mode, it can display a selection cursor on the browsing interface; the cursor can select any option control in the interface, and the electronic device displays the selected option control in the touch area, so that the user can intuitively and accurately operate it there with one hand. A sketch of this cursor model follows.
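The selection-cursor behaviour can be modelled as a rectangle that moves with slide gestures and reports the controls its selection area overlaps. This is a sketch under assumed types, not the patent's implementation.

```kotlin
// Illustrative control geometry; names are not from the patent.
data class Ctl(val id: String, val x: Float, val y: Float, val w: Float, val h: Float)

// The cursor moves with slide gestures in the touch area; the second
// controls overlapped by its selection area are the ones mirrored into the
// touch area for display.
class SelectionCursor(var x: Float, var y: Float, val w: Float, val h: Float) {
    fun moveBy(dx: Float, dy: Float) { x += dx; y += dy }

    fun selected(controls: List<Ctl>): List<Ctl> = controls.filter { c ->
        c.x < x + w && c.x + c.w > x &&   // horizontal overlap
        c.y < y + h && c.y + c.h > y      // vertical overlap
    }
}
```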
In one possible implementation, after the controls that are not fixedly displayed at the first position in the first interface are obtained from the first interface as second controls, the one-handed operation method may further include: combining a plurality of second controls with matched display styles to obtain a combined control; and determining the region occupied by the combined control as the selection area of the selection cursor. Displaying the second control in the selection area of the selection cursor in the first area then includes: displaying the combined control within the selection area of the selection cursor in the first area. For a plurality of second controls in non-fixed positions that move with page browsing, mapping them into the touch area one control at a time would force the user to perform frequent sliding operations before reaching the control to operate; the electronic device can therefore recombine such controls into a combined control that is mapped into the touch area together, so the user can directly operate the desired control within the combined control displayed in the touch area.
In one possible implementation, the one-handed operation method may further include: according to the display style of each second control, taking a plurality of second controls arranged along the same direction as the plurality of second controls with matched display styles. It will be appreciated that, because controls arranged along the same direction are typically the same size, the electronic device can recombine them to obtain a combined control of regular shape.
In one possible implementation, the first interface is an application interface, and the one-handed operation method may further include: when the first interface is an application interface of a first application, taking a plurality of second controls arranged along the horizontal direction as the plurality of second controls with matched display styles; and when the first interface is an application interface of a second application, taking a plurality of second controls arranged along the vertical direction as the plurality of second controls with matched display styles, where the first application is different from the second application. In this way, the electronic device can adaptively generate matched combined controls for different application types.
Optionally, taking the first interface as a browsing interface that includes a plurality of second controls, among them a first option control, a second option control, and a third option control arranged along the horizontal direction, as an example, the one-handed operation method may further include: combining the first option control, the second option control, and the third option control arranged along the horizontal direction to obtain an option combination control; and displaying the option combination control in the first area when the selection area of the selection cursor includes it, where the size of the selection area of the selection cursor matches the size of the region occupied by the option combination control. Thus, when the electronic device displays the reduced browsing interface in the one-handed operation mode, it can combine several controls arranged in a horizontal row into one combined control whose size can be that of the selection area of the selection cursor; when the selection cursor moves to the position of the combined control, the electronic device can map the combined control directly into the touch area for display, so that the user can intuitively and accurately operate each control in it with one hand. One way to form such row-wise combined controls is sketched below.
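The sketch groups controls whose top edges lie within a tolerance of one another into horizontal rows; each row then becomes one combined control. The tolerance value is an assumption, and the Ctl type is reused from the cursor sketch above.

```kotlin
// Groups same-direction (here: same-row) controls into combined controls.
fun combineRows(controls: List<Ctl>, yTolerance: Float = 8f): List<List<Ctl>> {
    val rows = mutableListOf<MutableList<Ctl>>()
    for (c in controls.sortedBy { it.y }) {
        val row = rows.lastOrNull()
        if (row != null && kotlin.math.abs(row.first().y - c.y) <= yTolerance) {
            row.add(c)            // same row: extend the current combined control
        } else {
            rows.add(mutableListOf(c))  // new row: start a new combined control
        }
    }
    return rows.map { row -> row.sortedBy { it.x } }  // left-to-right order
}
```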
In one possible implementation, the one-handed operation method may further include: when the size of the region occupied by the combined control does not match the size of the first area, adjusting the display style of each second control in the combined control to obtain an adjusted combined control whose occupied region matches the size of the first area; and displaying the adjusted combined control in the first area. In this way, the electronic device can adaptively adjust the size and position of each control in the combined control according to the size of the touch area; when the combined control displayed in the display area is too small, it can be enlarged when mapped into the touch area, improving the user's operating experience there.
In one possible implementation, adjusting the display style of each second control in the combined control includes: adjusting the display size of each second control in the combined control according to the size of the first area; or adjusting the display spacing between two adjacent second controls in the combined control according to the size of the first area; or adjusting the display position of each second control in the combined control according to the size of the first area. In this way, the electronic device can adaptively adjust at least one of the size, position, or spacing of each control in the combined control according to the size of the touch area.
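A uniform scale is one simple way to make a combined control fill the touch area; the sketch below adjusts size, spacing, and position together and assumes the row is laid out horizontally. It reuses the Ctl type from the cursor sketch.

```kotlin
// Rescales a horizontally laid-out combined control so it spans the width of
// the first area, enlarging controls that would otherwise be too small to hit.
fun fitToTouchArea(row: List<Ctl>, areaLeft: Float, areaTop: Float,
                   areaWidth: Float): List<Ctl> {
    require(row.isNotEmpty())
    val rowLeft = row.minOf { it.x }
    val rowRight = row.maxOf { it.x + it.w }
    val scale = areaWidth / (rowRight - rowLeft)  // > 1 enlarges, < 1 shrinks
    return row.map { c ->
        Ctl(c.id,
            areaLeft + (c.x - rowLeft) * scale,   // reposition inside the area
            areaTop,
            c.w * scale,                          // scale size and spacing alike
            c.h * scale)
    }
}
```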
In one possible implementation, taking the first interface as the home page of an application as an example, executing, in response to the touch operation acting on the first area, the function corresponding to the touch operation on the reduced first interface includes: in response to a left-swipe operation acting on the first area, exiting the application and displaying the reduced desktop interface in the second area. In this way, the user can perform a specific gesture in the touch area to trigger a specific operation.
In one possible implementation, when the first interface is a non-home page of the application, executing, in response to the touch operation acting on the first area, the function corresponding to the touch operation on the reduced first interface includes: in response to a left-swipe operation acting on the first area, displaying a reduced second interface, where the second interface is the interface one level above the first interface. Thus the same gesture may perform different functions in different interfaces.
In one possible implementation, executing, in response to the touch operation acting on the first area, the function corresponding to the touch operation on the reduced first interface includes: detecting a preset gesture operation acting on the first area, where the preset gesture operation is used to trigger a preset function in a third interface; and, in response to the preset gesture operation, switching the reduced first interface displayed in the second area to the reduced third interface and executing the preset function. In this way, the electronic device can bind a specific gesture operation to a function in a page, so that performing that gesture in the touch area opens the page and executes the function in one step.
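Such per-page gesture bindings can be modelled as a lookup table keyed by (current interface, gesture), so the same gesture triggers different functions on different pages. The interface ids and actions below are illustrative only.

```kotlin
enum class Gesture { SWIPE_LEFT, Z_SHAPE }

class GestureRouter {
    private val bindings = mutableMapOf<Pair<String, Gesture>, () -> Unit>()

    fun bind(interfaceId: String, g: Gesture, action: () -> Unit) {
        bindings[interfaceId to g] = action
    }

    // Returns true if a binding existed and was executed.
    fun dispatch(currentInterface: String, g: Gesture): Boolean {
        val action = bindings[currentInterface to g] ?: return false
        action()
        return true
    }
}

fun main() {
    val router = GestureRouter()
    router.bind("app_home", Gesture.SWIPE_LEFT) { println("exit app, show reduced desktop") }
    router.bind("app_detail", Gesture.SWIPE_LEFT) { println("return to reduced parent interface") }
    router.dispatch("app_home", Gesture.SWIPE_LEFT)  // prints the exit action
}
```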
In one possible implementation, the second area includes a target display area and a third area, and displaying the reduced first interface in the second area includes: displaying the reduced first interface in the target display area; and displaying a plurality of icon controls in the third area, where the icon controls include at least one of icon controls of applications and icon controls of shortcut functions. This avoids leaving a region of the display screen that shows nothing (a black area), makes full use of the entire display area of the electronic device's screen, avoids wasting the large screen, and lets the user quickly open other applications while browsing the original user interface.
In one possible implementation, displaying the plurality of icon controls in the third area includes: rearranging the icon controls of a plurality of applications in the desktop interface; and displaying the rearranged icon controls in the third area. In this way, when the electronic device enters the one-handed operation mode, in addition to reducing the first interface from the original screen display area and showing it in the display area of the one-handed operation mode, it can automatically rearrange the application icons of the desktop and display them in the remaining third area.
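Rearranging the icons can be as simple as a grid layout sized to the third area. In this sketch the cell size and padding are assumed values.

```kotlin
data class Icon(val id: String)
data class PlacedIcon(val icon: Icon, val x: Float, val y: Float)

// Lays rearranged icon controls out as a grid that fits the third area width.
fun layoutIcons(icons: List<Icon>, areaWidth: Float,
                cell: Float = 96f, pad: Float = 8f): List<PlacedIcon> {
    val cols = maxOf(1, ((areaWidth - pad) / (cell + pad)).toInt())
    return icons.mapIndexed { i, icon ->
        PlacedIcon(icon,
                   pad + (i % cols) * (cell + pad),   // column position
                   pad + (i / cols) * (cell + pad))   // row position
    }
}
```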
In one possible implementation, the one-handed operation method further includes: displaying a switching control in the first area, where the action area of the first area is the target display area; in response to a touch operation acting on the switching control in the first area, determining that the action area of the first area is switched from the target display area to the third area; and, in response to a touch operation acting on the first area, executing the function corresponding to that touch operation on the plurality of icon controls in the third area. In this way, the electronic device can provide a switching control in the touch area so that the user can choose whether the touch area currently controls the first interface displayed from the original screen display area or the icon controls additionally displayed in the third area.
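The switching control then only needs to toggle which region the touch area's operations are routed to. A minimal sketch, with all names assumed:

```kotlin
// Which region the first area currently controls: the reduced interface in
// the target display area, or the icon grid in the third area.
enum class ActionTarget { TARGET_DISPLAY_AREA, THIRD_AREA }

class TouchAreaRouter(var target: ActionTarget = ActionTarget.TARGET_DISPLAY_AREA) {
    // Called when the switching control in the first area is tapped.
    fun onSwitchControlTapped() {
        target = if (target == ActionTarget.TARGET_DISPLAY_AREA)
            ActionTarget.THIRD_AREA
        else
            ActionTarget.TARGET_DISPLAY_AREA
    }
}
```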
In a second aspect, an electronic device is provided, comprising a display unit, a determining unit, and an execution unit. The display unit is configured to display the first interface. The determining unit is configured to determine, in response to a trigger instruction for the one-handed operation mode, a first area and a second area of the display screen, where the first area is the area of the display screen that a finger of the user's holding hand can touch while the user holds the electronic device with one hand, and the second area is the remaining area of the display screen outside the first area. The display unit is further configured to display the reduced first interface in the second area. The execution unit is configured to execute, in response to a touch operation acting on the first area, the function corresponding to the touch operation on the reduced first interface.
In one possible implementation, the determining unit may be configured to: prompting a user to slide on the display screen according to a specified track by using fingers of one hand when holding the electronic equipment by one hand; according to the sliding of the fingers of the single hand, determining the maximum area which can be touched by the fingers of the single hand on the display screen; and determining a first area of the display screen according to the maximum area.
In one possible implementation manner, the first interface may include a target control, and the executing unit may be configured to: responding to the touch operation acting on the target position in the first area, and mapping the touch operation acting on the target position in the first area into the touch operation acting on the target control in the reduced first interface; a coordinate mapping relation is established in advance between a target position in the first area and a position of a target control in the first interface; responding to the touch operation of the target control in the reduced first interface, and executing the function corresponding to the target control on the reduced first interface.
Optionally, taking the first interface as a video playing interface, taking a target control as an example of a playing control located at a central position of the video playing interface, a coordinate mapping relationship may be pre-established between the central position of the video playing interface and the central position of the first area, and the executing unit may be configured to: responding to the touch operation acting on the central position of the first area, and mapping the touch operation acting on the central position of the first area into the touch operation acting on the play control in the reduced video play interface; and responding to the touch operation of the playing control in the reduced video playing interface, and playing the video in the reduced video playing interface.
In one possible implementation, the first interface may include a target control, and the display unit may be further configured to: and displaying the target control in the first area. The execution unit may be configured to: responding to touch operation acted on the target control in the first area, and executing a function corresponding to the target control on the reduced first interface.
Optionally, taking the first interface as a video playing interface, and taking the target control as a playing control as an example, the display unit may be further configured to: and displaying the play control in the first area. The execution unit may be configured to: and responding to the touch operation acted on the playing control in the first area, and playing the video in the reduced video playing interface.
In one possible implementation manner, the target control is a first control, and the electronic device may further include: the first acquisition unit is used for acquiring the control fixedly displayed at the first position in the first interface from the first interface according to the display style of each control in the first interface, and taking the control as the first control; wherein the first position includes at least one of a top position and a bottom position.
Optionally, the first control may include at least one of a title bar, a navigation bar, and a menu bar.
In one possible implementation manner, the target control is a second control, and the electronic device may further include: the second acquisition unit is used for acquiring the control which is not fixedly displayed at the first position in the first interface from the first interface according to the display style of each control in the first interface, and taking the control as a second control; wherein the first position includes at least one of a top position and a bottom position. The display unit described above may also be used for: displaying a selection cursor on the reduced first interface, wherein the selection cursor comprises a boundary which defines a selection area of the selection cursor in the reduced first interface, and the selection area is used for selecting at least one second control; and displaying a second control in the selection area of the selection cursor in the first area.
Optionally, taking the first interface as a browsing interface, the browsing interface includes a plurality of second controls, where the plurality of second controls includes a first option control, and the display unit may be further configured to: when the first option control is included in the selection area of the selection cursor, the first option control is displayed in the first area.
In one possible implementation, the electronic device may further include a combining unit and a region generating unit. The combining unit is configured to combine a plurality of second controls with matched display styles to obtain a combined control; the region generating unit is configured to determine the region occupied by the combined control as the selection area of the selection cursor. The display unit may be further configured to: display, in the first area, the combined control within the selection area of the selection cursor.
In one possible implementation, the electronic device may further include a matching unit, configured to take, according to the display style of each second control, a plurality of second controls arranged along the same direction as the plurality of second controls with matched display styles.
In one possible implementation, the first interface is an application interface, and the matching unit may be further configured to: when the first interface is an application interface of a first application, take a plurality of second controls arranged along the horizontal direction as the plurality of second controls with matched display styles; and when the first interface is an application interface of a second application, take a plurality of second controls arranged along the vertical direction as the plurality of second controls with matched display styles, where the first application is different from the second application.
Optionally, taking the first interface as a browsing interface that includes a plurality of second controls, among them a first option control, a second option control, and a third option control arranged along the horizontal direction, as an example, the combining unit may be configured to: combine the first option control, the second option control, and the third option control arranged along the horizontal direction to obtain an option combination control. The display unit may be further configured to: display the option combination control in the first area when the selection area of the selection cursor includes the option combination control, where the size of the selection area of the selection cursor matches the size of the region occupied by the option combination control.
In one possible implementation, the electronic device may further include: the adjusting unit is used for adjusting the display style of each second control in the combined control when the size of the area occupied by the combined control is not matched with the size of the first area, so as to obtain an adjusted combined control; the size of the area occupied by the adjusted combined control is matched with the size of the first area. The display unit described above may also be used for: and displaying the adjusted combined control in the first area.
In a possible implementation, the above adjustment unit may be further configured to: according to the size of the first area, the display size of each second control in the combined control is adjusted; or adjusting the display interval between two adjacent second controls in the combined control according to the size of the first area; or adjusting the display position of each second control in the combined control according to the size of the first area.
In one possible implementation, taking the first interface as the home page of an application as an example, the execution unit may be configured to: in response to a left-swipe operation acting on the first area, exit the application and display the reduced desktop interface in the second area.
In one possible implementation, when the first interface is a non-home page of the application, the execution unit may be configured to: in response to a left-swipe operation acting on the first area, display a reduced second interface, where the second interface is the interface one level above the first interface.
In one possible implementation manner, the execution unit may be configured to: detecting a preset gesture operation acting on the first area, wherein the preset gesture operation is used for triggering a preset function in the third interface; responding to the preset gesture operation, switching the reduced first interface displayed in the second area to the reduced third interface, and executing the preset function.
In a possible implementation manner, the second area includes a target display area and a third area, and the display unit may be configured to: displaying the reduced first interface in the target display area; displaying a plurality of icon controls in a third area; the icon control comprises at least one of an icon control of the application program and an icon control of the shortcut function.
In one possible implementation, the display unit may be configured to: rearranging icon controls of a plurality of application programs in a desktop interface; displaying the rearranged icon controls of the plurality of application programs in the third area.
In one possible implementation, the display unit may be configured to: and displaying the switching control in the first area, wherein the action area of the first area is a target display area. The electronic device may further include: and the switching unit is used for responding to the touch operation acted on the switching control in the first area and determining that the action area of the first area is switched from the target display area to the third area. The execution unit may further be configured to: in response to a touch operation acting on the first region, a function corresponding to the touch operation of the first region is performed on the plurality of icon controls in the third region.
In a third aspect, the present application provides an electronic device comprising a display screen, one or more processors, and one or more memories. The display screen and the one or more memories are coupled to the one or more processors. The one or more memories are used to store computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the one-handed operation method in any of the possible implementations of the first aspect.
In a fourth aspect, the present application provides a one-handed operation apparatus included in an electronic device, the apparatus having the functionality to implement the behaviour of the electronic device in the first aspect and any of its possible implementations. The functionality may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules or units corresponding to the functions described above.
In a fifth aspect, the present application provides a chip system, which is applied to an electronic device. The system-on-chip includes one or more interface circuits and one or more processors. The interface circuit and the processor are interconnected by a wire. The interface circuit is for receiving a signal from a memory of the electronic device and transmitting the signal to the processor, the signal including computer instructions stored in the memory. When the processor executes the computer instructions, the electronic device performs the one-handed operation method in any of the possible implementations of the first aspect.
In a sixth aspect, the present application provides a computer storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the one-handed operation method in any of the possible implementations of the first aspect.
In a seventh aspect, the present application provides a computer program product for, when run on an electronic device, causing the electronic device to perform the one-handed operation method in any of the possible implementations of the first aspect.
It will be appreciated that, for the advantageous effects achieved by the electronic device of the second aspect and any of its possible implementations, the electronic device of the third aspect, the one-handed operation apparatus of the fourth aspect, the chip system of the fifth aspect, the computer storage medium of the sixth aspect, and the computer program product of the seventh aspect, reference may be made to the advantageous effects of the first aspect and any of its possible implementations; details are not repeated here.
Drawings
Fig. 1A is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application;
Fig. 1B is a schematic diagram of a software architecture of an electronic device according to an embodiment of the present application;
Fig. 2 is a schematic diagram of a scenario in which a user operates an electronic device with the right hand according to an embodiment of the present application;
Fig. 3 is a first interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 4 is a second interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 5 is a third interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 6 is a fourth interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 7 is a fifth interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 8 is a sixth interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 9 is a seventh interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 10 is an eighth interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 11 is a ninth interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 12 is a tenth interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 13 is an eleventh interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 14 is a twelfth interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 15 is a thirteenth interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 16 is a fourteenth interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 17 is a fifteenth interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 18 is a sixteenth interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 19 is a seventeenth interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 20 is an eighteenth interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 21 is a nineteenth interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 22 is a twentieth interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 23 is a twenty-first interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 24 is a flowchart of a one-handed operation method according to an embodiment of the present application;
Fig. 25 is a flowchart of another one-handed operation method according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings in the embodiments of the present application. The terms "first" and "second" are used below for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present embodiment, unless otherwise specified, the meaning of "plurality" is two or more.
The embodiment of the application provides a one-handed operation method which can be applied to an electronic device. The electronic device may be a tablet computer, a mobile phone, a vehicle-mounted device, an augmented reality (AR)/virtual reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), or another device having a display screen; the embodiment of the present application does not limit the specific type of the electronic device.
By way of example, fig. 1A shows a schematic diagram of the structure of an electronic device 100. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, a gravity sensor, and the like.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller may be a neural hub and a command center of the electronic device 100, among others. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and thereby improves system efficiency.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, among others.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. It may also be used to connect a headset and play audio through the headset. The interface may further be used to connect other electronic devices, such as AR devices.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also employ different interfacing manners in the above embodiments, or a combination of multiple interfacing manners.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
In an embodiment of the present application, the display 194 may be used to display touch windows and application windows. The touch window is displayed in an area of the display screen 194 that can be touched by a user with one hand, and the application window is displayed in the remaining area of the display screen 194 except the area where the touch window is located.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing, so that the electrical signal is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform on the frequency bin energy, and so on.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, etc.
In an embodiment of the present application, the camera 193 may be used to collect hand information of the user. For example, when the user first activates the "one-handed" function, the electronic device 100 prompts the user to enter hand information. After the user inputs the instruction of "turn on the camera 193", the processor 110 turns on the camera 193 to take a picture, so as to acquire the hand information of the user. The hand information may include information such as a palm size, a length of each finger, a fingerprint of each finger, and the like. In the embodiment of the present application, the electronic device 100 obtains the hand information of the user, mainly to obtain the thumb length of the user.
In the embodiment of the present application, the length of the thumb is the distance between the farthest touch point and the holding point that can be touched when the thumb of the user touches the display screen 194. The holding point may be a point where the palm of the user contacts the edge of the display screen 194. Illustratively, as shown in fig. 2, when the user takes the gesture of holding the electronic device with one hand as shown in fig. 2, the distance between the farthest touch point 201 of the thumb and the holding point 202 is the length of the thumb.
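For ease of understanding, the following is a minimal sketch, in Java, of the distance computation implied by this definition; the class name, method name, and parameter names are illustrative assumptions, not part of the original disclosure.

    // Hypothetical sketch: thumb length as the Euclidean distance between the
    // farthest touch point (tx, ty) and the holding point (hx, hy), both in
    // display-screen pixel coordinates.
    final class HandMetrics {
        static float thumbLength(float tx, float ty, float hx, float hy) {
            return (float) Math.hypot(tx - hx, ty - hy);
        }
    }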
In the embodiment of the present application, the electronic device 100 obtains the length of the thumb by obtaining the hand information of the user, so as to determine the position and the size of the display of the touch window when the "one-hand operation" function is subsequently turned on, so as to ensure that the user can control the display content in the application window through the touch operation acting in the touch window when holding the mobile phone.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer-executable program code that includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensor 180A, such as resistive, inductive, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates of conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation through the pressure sensor 180A. The electronic device 100 may also calculate the touch location from the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch location but with different touch operation intensities may correspond to different operation instructions. For example: when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
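For illustration only, the pressure-dependent dispatch just described might be sketched as follows; the class and method names are assumptions of this description.

    // Hypothetical sketch: the same touch location triggers different operation
    // instructions depending on the detected touch operation intensity.
    final class ShortMessageIconHandler {
        private final float firstPressureThreshold;

        ShortMessageIconHandler(float firstPressureThreshold) {
            this.firstPressureThreshold = firstPressureThreshold;
        }

        void onIconTouched(float pressure) {
            if (pressure < firstPressureThreshold) {
                viewShortMessage();   // light press: view the short message
            } else {
                createShortMessage(); // firm press: create a new short message
            }
        }

        private void viewShortMessage() { /* placeholder action */ }
        private void createShortMessage() { /* placeholder action */ }
    }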
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to unlock the fingerprint, access the application lock, photograph the fingerprint, answer the incoming call, etc.
The temperature sensor 180J is for detecting temperature. In some embodiments, the electronic device 100 performs a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by temperature sensor 180J exceeds a threshold, electronic device 100 performs a reduction in the performance of a processor located in the vicinity of temperature sensor 180J in order to reduce power consumption to implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 142 to avoid the low temperature causing the electronic device 100 to be abnormally shut down. In other embodiments, when the temperature is below a further threshold, the electronic device 100 performs boosting of the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperatures.
The touch sensor 180K, also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation (e.g., long press, up slide, left slide, single click, double click, etc.) acting on or near it. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a different location than the display 194.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also correspond to different vibration feedback effects by touching different areas of the display screen 194. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195, or removed from the SIM card interface 195, to make contact with or separate from the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support Nano SIM cards, Micro SIM cards, and the like. The same SIM card interface 195 may be used to insert multiple cards simultaneously. The types of the multiple cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards. The SIM card interface 195 may also be compatible with external memory cards. The electronic device 100 interacts with the network through the SIM card to realize functions such as calling and data communication. In some embodiments, the electronic device 100 employs an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
The electronic device 100 may be a device equipped with Android, Microsoft, or another operating system; the embodiment of the application does not limit the operating system carried by the electronic device.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. Embodiments of the application take the Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100.
Fig. 1B is a software block diagram of an electronic device 100 according to an embodiment of the application. The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, an application layer, an application framework layer, Android runtime and system libraries, and a kernel layer. The application layer may include a series of application packages.
As shown in fig. 1B, the application package may include Applications (APP) for cameras, gallery, calendar, phone calls, map, navigation, WLAN, bluetooth, music, video, short messages, etc. For convenience of description, an application program will be hereinafter simply referred to as an application. The application on the electronic device may be a native application or a third party application, and the embodiment of the present application is not limited.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for the application of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 1B, the application framework layer may include a window management server (window manager service, WMS), an activity management server (activity manager service, AMS), an input event management server (input manager service, IMS), a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window management server is used for managing window programs. The window management server can acquire the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, and the like.
The activity management server (activity manager service, AMS) is responsible for managing activities, and for starting, switching, and scheduling each component in the system, as well as managing and scheduling applications.
The input event management server (input manager service, IMS) may be used to translate, package, and otherwise process an original input event to obtain an input event containing more information, and send it to the window management server. The window management server stores the clickable areas (such as controls) of each application program, the location information of the focus window, and the like. Thus, the window management server can correctly distribute the input event to the designated control or focus window.
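The dispatch path described above can be sketched schematically as follows; all class and method names here are illustrative placeholders and are not actual Android framework APIs.

    // Hypothetical sketch: the input event management server enriches a raw
    // input event and hands it to the window management server, which stores
    // clickable areas and focus-window positions and routes the event.
    final class RawEvent {
        final float x, y;
        final long timestamp;
        RawEvent(float x, float y, long timestamp) {
            this.x = x; this.y = y; this.timestamp = timestamp;
        }
    }

    final class RichEvent {
        final RawEvent raw;
        final int source; // example of the extra information added by translation/packaging
        RichEvent(RawEvent raw, int source) { this.raw = raw; this.source = source; }
    }

    interface WindowServer {
        void dispatchToFocusWindow(RichEvent event);
    }

    final class InputEventServer {
        private final WindowServer windowServer;

        InputEventServer(WindowServer windowServer) { this.windowServer = windowServer; }

        void onRawInputEvent(RawEvent raw) {
            RichEvent enriched = new RichEvent(raw, 0); // translate and package the raw event
            windowServer.dispatchToFocusWindow(enriched);
        }
    }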
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
View systems include visual controls, such as controls that display text, controls that display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100. Such as the management of call status (including on, hung-up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar. It can be used to convey notification-type messages that automatically disappear after a short stay without user interaction; for example, the notification manager is used to notify that a download is complete, to give message alerts, and so on. The notification manager may also present notifications in the form of a chart or scroll-bar text in the status bar at the top of the system, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone sounds, the electronic device vibrates, an indicator light blinks, and so on.
Android runtime includes a core library and a virtual machine. Android runtime is responsible for the scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life-cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media libraries may support a variety of audio and video encoding formats, such as MPEG-4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer may contain display drivers, input/output device drivers (e.g., keyboard, touch screen, headphones, speakers, microphones, etc.), device nodes, camera drivers, audio drivers, and sensor drivers, among others. The user performs input operation through the input device, and the kernel layer can generate corresponding original input events according to the input operation and store the corresponding original input events in the device node.
Based on the above description of the software and hardware of the electronic device according to the present application, the flow of the single-hand operation method provided by the present application will be described in detail with reference to the accompanying drawings.
Currently, when a user uses an electronic device, if the user's palm is not large enough, the user often needs both hands to operate the electronic device to obtain a good operating experience. This mode of operation is, however, rather limited. In actual use, the user sometimes needs to free one hand to do other things, such as carrying a bag or holding a subway handrail, and must then operate the electronic device with one hand. However, as the display screens of electronic devices grow larger, some content in the displayed interface cannot be operated by the user with one hand, so operating the electronic device with one hand becomes more and more difficult, and the user often needs both hands at the same time to complete the corresponding functions.
As a solution, a developer may redesign the layout of the application program so as to set the display position of the function key in the application interface at the lower right corner or the lower left corner of the display screen, so that the user can touch the function key with one hand when using the electronic device. However, this solution requires redesigning all applications, which is labor intensive.
As another solution, the developer may convert a pull-down window into a pull-up window, so that when using the electronic device with one hand, the user can conveniently perform the touch operations that originally had to act on the top area of the display screen. However, this solution not only easily causes accidental touches, but also fails to solve the problem that the user may be unable to operate part of the content displayed on the screen with one hand.
In order to solve the above problems, an embodiment of the present application provides a one-handed operation method. When the electronic device displays a first interface through the screen display area of the display screen, if an instruction to enter the one-handed operation mode is detected, the electronic device can use the area of the display screen that the user can touch with one hand (that is, a partial area of the display screen) as the touch area of the one-handed operation mode, to receive the user's one-handed touch operations. At the same time, the electronic device uses the remaining area of the display screen other than the touch area as the display area of the one-handed operation mode, and displays there, in reduced form, the original screen display area of the display screen together with the first interface displayed in it. The electronic device may then sense the user's touch operations within the touch area of the one-handed operation mode, and correspondingly control, according to the touch operations sensed in the touch area, the reduced original screen display area displayed in the display area of the one-handed operation mode and the first interface displayed in it. In this way, by performing touch operations in the touch area, the user can perform full-range operations on the original interface that the electronic device displays in reduced form. Without newly developing or adapting applications, this solves the problem that part of the content displayed on the screen cannot be touched by the user's finger when the user operates the electronic device with one hand.
That is, in the process that the user uses the electronic device with one hand, if there is a part of content that cannot be operated by the user with one hand in the interface displayed by the electronic device, the user can turn on the one-hand operation mode of the electronic device. Once the one-handed operation mode of the electronic device is triggered, the electronic device may divide the display screen into two areas, one being a touch area of the one-handed operation mode and the other being a display area of the one-handed operation mode. The touch area in the single-hand operation mode may be an area that a finger of a user can touch or easily touch in the display screen when the user operates the electronic device with one hand, and the display area in the single-hand operation mode may be a remaining area of the display screen except for the touch area.
The electronic device can then correspondingly reduce the content originally displayed on the display screen so that it is displayed compactly in the display area of the one-handed operation mode, and establish a mapping relationship between the touch area of the one-handed operation mode and the display area of the one-handed operation mode. In this way, after the electronic device detects a touch operation in the touch area of the one-handed operation mode, it can execute, according to the mapping relationship, the function that would be executed if the display area of the one-handed operation mode were touched directly.
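A minimal sketch of such a mapping relationship follows, assuming that the touch area and the display area are both axis-aligned rectangles in screen coordinates; all names are illustrative assumptions rather than the actual implementation.

    // Hypothetical sketch: map a touch point inside the touch area of the
    // one-handed operation mode to the corresponding point inside the display
    // area, by normalizing within the touch area and rescaling.
    final class OneHandMapper {
        // Rectangles given as left, top, width, height in screen pixels.
        private final float tLeft, tTop, tWidth, tHeight; // touch area
        private final float dLeft, dTop, dWidth, dHeight; // display area

        OneHandMapper(float tLeft, float tTop, float tWidth, float tHeight,
                      float dLeft, float dTop, float dWidth, float dHeight) {
            this.tLeft = tLeft; this.tTop = tTop; this.tWidth = tWidth; this.tHeight = tHeight;
            this.dLeft = dLeft; this.dTop = dTop; this.dWidth = dWidth; this.dHeight = dHeight;
        }

        // Returns {x, y} in display-area coordinates for a touch at (x, y)
        // inside the touch area.
        float[] mapToDisplayArea(float x, float y) {
            float nx = (x - tLeft) / tWidth;  // 0..1 across the touch area
            float ny = (y - tTop) / tHeight;  // 0..1 down the touch area
            return new float[] { dLeft + nx * dWidth, dTop + ny * dHeight };
        }
    }

A touch operation (click, slide, and so on) detected in the touch area can then be re-dispatched at the mapped coordinates, so that the reduced interface responds as if it had been touched directly.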
It will be appreciated that when the electronic device is in the normal mode, typically the entire screen display area of the display screen is used to display content, and when the electronic device switches from the normal mode to the one-handed operation mode, a display area of the one-handed operation mode appears on the display screen. Since the size of the display area of the one-handed operation mode is smaller than that of the entire screen display area, the original screen display area and the content displayed in it need to be reduced, to ensure that everything displayed in the normal mode is included in the display area of the one-handed operation mode and that no information that would be displayed in the normal mode is lost.
In some embodiments, after the electronic device enters the single-hand operation mode, the electronic device may display the touch area in the single-hand operation mode, so that the user knows the position of the area that can be touched by a single hand on the display screen, and the user can accurately implement the touch operation on the display area in the single-hand operation mode in the touch area in the single-hand operation mode. The electronic device may also display the display area in the one-handed operation mode, so that the user knows the position of the user interface on the display screen when the user operates in one hand.
In the embodiment of the application, the electronic device can display the operation area in the single-hand operation mode in the form of a window, and the displayed window can be called a touch window. The position and size of the touch window are consistent with those of the operation area of the one-hand operation mode. That is, once the one-hand operation mode of the electronic device is triggered, the electronic device will display the touch window in the operation area of the one-hand operation mode of the display screen, and will display the reduced original display area and the display content therein in the display area of the one-hand operation mode of the display screen.
Optionally, the electronic device may also implement highlighting of the touch area in the single-hand operation mode by highlighting or by displaying with a preset color. Alternatively, the electronic device may also implement highlighting of the display area in the one-handed operation mode by highlighting, or by displaying with a preset color, or the like. The embodiment of the present application is not limited thereto.
The technical solution provided by the embodiment of the application will be specifically described below, taking a mobile phone as an example of the electronic device.
In the embodiment of the application, the size and position of the touch area of the one-handed operation mode can be preset by the operating system of the electronic device; for example, the operating system presets at least one of the coordinates and the size of the touch area and generates a corresponding configuration file for the touch area, so that the touch area is displayed on the display screen when the electronic device calls the configuration file. For example, according to the configuration file, the electronic device may display the touch area in at least one of the upper left, lower left, upper right, and lower right corners of the display screen, with a size smaller than the display size of the display screen.
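For illustration, such a preset might record at least the coordinates and size of the touch area; the field names and values below are assumptions, not the actual configuration format.

    // Hypothetical sketch: preset touch-area parameters that the operating
    // system could serialize into the configuration file described above.
    final class TouchAreaConfig {
        static final String CORNER = "bottom_right"; // which display corner hosts the area
        static final int LEFT = 540, TOP = 1600;     // top-left coordinate, in pixels
        static final int WIDTH = 540, HEIGHT = 740;  // smaller than the display size
    }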
In the embodiment of the application, the user can also customize the size and the position of the touch control area in the single-hand operation mode.
It will be appreciated that different mobile phones differ in size and different users have different palm sizes, so the one-handed thumb operating range differs from user to user. Therefore, the embodiment of the application provides an initialization setting function for the touch window of the one-handed operation mode. The user holds the mobile phone normally with one hand, and the mobile phone analyzes the user's one-handed holding gesture, so as to determine, according to the range of positions that the user's thumb can touch on the display screen, the area of the display screen suitable for the touch window of this user's one-handed operation mode. In this way, the mobile phone can ensure that the touch area it sets better matches each user's one-handed operation habits, and that the user can operate every position of the touch area while holding the mobile phone with one hand.
Optionally, when the mobile phone starts the "one-handed operation mode" function for the first time, the mobile phone may perform an initialization setting of the touch area of the one-handed operation mode. At this time, the mobile phone can analyze the holding gesture of the single hand of the user to determine the area which can be touched by the single hand of the user on the display screen of the mobile phone, so that the mobile phone can adaptively adjust the position and the size of the touch area of the single-hand operation mode according to the area of the display screen which can be touched by the single hand of the user. Optionally, when the user needs to readjust the position and the size of the touch area, the user may also operate the mobile phone again to perform the initialization setting of the touch area in the single-hand operation mode.
In some embodiments, the handset may display a reminder animation to remind the user that the handset has currently entered or exited the one-handed mode of operation.
For example, referring to fig. 3, when the user first turns on the "one-handed operation mode" function, the display of the mobile phone may display a prompt for indicating the user to draw an arc on the display. It will be appreciated that fig. 3 shows the arc indication when the right hand is holding the phone, and the display of the phone may also display the arc indication shown in fig. 4 when the user is used to holding the phone with the left hand.
Optionally, when the user first turns on the "one-handed operation" function, the mobile phone may automatically recognize the user's one-handed holding gesture, so as to display the arc prompt shown in fig. 4 for a recognized left-hand holding gesture, and the arc prompt shown in fig. 3 for a recognized right-hand holding gesture. The following description takes the right-hand holding gesture as an example to describe the technical solution of the embodiment of the present application.
Alternatively, the mobile phone may determine the user's one-handed holding gesture in a manner such as detecting the hand temperature with the temperature sensor or detecting the holding pressure with the pressure sensor. The manner in which the mobile phone recognizes the one-handed holding gesture is not limited in the embodiment of the present application.
For example, referring to fig. 5, when the user holds the mobile phone in the right hand holding gesture shown in fig. 2 and uses the thumb to draw an arc on the display screen (as shown by a curve 501 in fig. 5) according to the instruction, the mobile phone may receive the touch position corresponding to the arc operation of the user. And then the mobile phone determines the maximum area 502 which can be touched by the user with one hand according to the touch position.
Alternatively, the mobile phone may determine the position and size of the touch area 503 adapted to the single-handed operation mode of the user according to the maximum area 502 that the user can touch with one hand. The shape of the touch area 503 is not limited in the embodiment of the present application, and may be square, circular, oval, etc. As one way, taking the shape of the touch area as a square as an example, the mobile phone can determine the maximum value of the length and the width of the touch area according to the maximum area 502 that can be touched by a user with one hand, so as to determine the position and the size of the touch area. Alternatively, the mobile phone may display the maximum area 502 that can be touched by a single hand of the user, so that the user can customize the shape and size of the touch area.
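One possible realization of this determination is sketched below, under the assumption that the arc is sampled as (x, y) touch coordinates, that the grip is a right-hand grip anchored at the bottom-right corner of the screen, and that the touch area is square; all names are illustrative.

    // Hypothetical sketch: derive a square touch area from the sampled arc
    // points for a right-hand grip anchored at the bottom-right corner.
    final class TouchAreaCalibrator {
        // arcPoints holds sampled (x, y) touch coordinates of the arc; returns
        // the touch area as {left, top, width, height} in screen pixels.
        static int[] touchAreaFromArc(float[][] arcPoints, int screenW, int screenH) {
            float reach = 0f;
            for (float[] p : arcPoints) {
                // Distance of the sampled point from the bottom-right corner.
                reach = Math.max(reach, (float) Math.hypot(screenW - p[0], screenH - p[1]));
            }
            // Inscribe a square in the quarter disc of radius `reach`, so that
            // every point of the touch area stays within thumb reach.
            int side = (int) (reach / Math.sqrt(2.0));
            return new int[] { screenW - side, screenH - side, side, side };
        }
    }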
In some embodiments, the mobile phone may also obtain hand information of the user, so as to calculate the thumb length of the user according to the obtained hand information. And then the mobile phone can calculate the maximum area which can be touched by the user by one hand according to the length of the thumb of the user and the holding gesture of the user by one hand. Therefore, the mobile phone can determine the position and the size of the touch area according to the maximum area which can be touched by a user with one hand. Optionally, when the user turns on the "one-hand operation mode" function, the mobile phone may remind the user to enter hand information. After the hand information of the user is input into the mobile phone, the mobile phone can calculate the thumb length of the user according to the acquired hand information.
In the embodiment of the application, after the position and the size of the touch control area of the single-hand operation mode are determined, the mobile phone can display the interface content displayed in the whole screen display area of the display screen to the remaining area of the display screen in a reduced manner. The remaining area is other areas except the touch area in the display screen. For example, referring to fig. 6, after determining the position and the size of the touch area 601 in the single-hand operation mode according to the maximum area touchable by the user's single hand, the mobile phone may display the remaining area 602 except for the touch area 601 in the display screen as the display area in the single-hand operation mode, so as to display the interface content originally displayed in the display area of the whole screen. Thus, the user can perform a full-range operation on the interface contents displayed in the remaining area 602 by performing an operation in the touch area 601.
Illustratively, fig. 7 (a) shows an interface diagram when the mobile phone is in the normal mode and a desktop interface is displayed through the display screen. The normal mode generally refers to the normal display mode of the user interface designed according to the size of the mobile phone's display screen. When the mobile phone is in the normal mode, the user interface is typically displayed over the entire screen. As shown in (a) of fig. 7, when the mobile phone is in the normal mode, the desktop interface 701 displayed by the mobile phone through the display screen covers the entire screen display area of the display screen.
When the mobile phone normally displays the desktop interface 701, if the mobile phone detects a trigger operation of the one-handed operation mode, as shown in (b) of fig. 7, the mobile phone may, in response to the trigger operation, display a touch area 702 of the one-handed operation mode on the display screen, and reduce the desktop interface 701 originally displayed in the entire screen display area into the remaining area 703 of the display screen. The remaining area 703 is the area of the display screen other than the touch area 702. As shown in (b) of fig. 7, an equally scaled-down desktop interface 704 is displayed in the remaining area 703 of the display screen. Therefore, by operating in the touch area 702, the user can perform full-range operations on the equally scaled-down desktop interface 704 displayed in the remaining area 703.
Optionally, the setting page of the mobile phone may be added with a function switch control in a single-hand operation mode, and the triggering operation in the single-hand operation mode may be a click operation of the function switch control in the single-hand operation mode by a user. Optionally, the triggering operation of the single-hand operation mode may be a pressing operation of a specific physical key or a key combination on the mobile phone by the user, or may be a voice indication of the user, a shortcut gesture operation of the user on the screen of the mobile phone (such as a gesture of drawing a circle track or a long-press gesture acting on the lower right corner of the screen), or a space gesture operation of the user. The embodiment of the application does not limit the triggering operation of the single-hand operation mode.
Optionally, when the triggering operation of the single-hand operation mode is detected by the mobile phone for the first time, the mobile phone may enter an initialization setting mode of the touch area of the single-hand operation mode to determine the position and the size of the touch area of the single-hand operation mode. And then the mobile phone records the position and the size of the touch control area in the single-hand operation mode, so that when the subsequent mobile phone detects the triggering operation of the single-hand operation mode again, the touch control area in the single-hand operation mode can be directly displayed on the display screen.
The mobile phone can determine the position and the size of the residual area in the display screen according to the position and the size of the touch area in the display screen in the single-hand operation mode. Alternatively, the entire remaining area may be used as a display area for the one-handed operation mode.
Illustratively, fig. 8 (a) shows an interface diagram when the mobile phone is in the normal mode and a video application is displayed through the display screen. As shown in (a) of fig. 8, when the mobile phone is in the normal mode, the application interface 801 of the video application displayed by the mobile phone through the display screen fills the entire screen display area of the display screen.
When the mobile phone normally displays the application interface 801 of the video application, if the mobile phone detects a trigger operation of the one-handed operation mode, as shown in (b) of fig. 8, the mobile phone may, in response to the trigger operation, display a touch area 802 of the one-handed operation mode on the display screen, and reduce the application interface 801 of the video application originally displayed in the entire screen display area into the remaining area 803 of the display screen. The remaining area 803 is the area of the display screen other than the touch area 802. As shown in (b) of fig. 8, an equally scaled-down application interface 804 of the video application is displayed in the remaining area 803 of the display screen. Therefore, by operating in the touch area 802, the user can perform full-range operations on the scaled-down application interface 804 of the video application displayed in the remaining area 803.
Optionally, if the user interface displayed in the whole screen display area of the display screen originally includes application windows of a plurality of applications, when the mobile phone enters the single-hand operation mode, the mobile phone may also reduce the user interface including the application windows of the plurality of applications displayed in the whole screen display area of the display screen originally to be displayed in the display area of the single-hand operation mode.
Illustratively, fig. 9 (a) shows an interface schematic for displaying a split window through a display when the mobile phone is in a normal mode. As shown in fig. 9 (a), when the mobile phone is in the normal mode and the user interface 901 displayed by the mobile phone through the display screen includes upper and lower split screen windows of the video playing application and the music playing application, the application windows of the video playing application and the application windows of the music playing application occupy half of the display screen of the mobile phone, and the two application windows are not overlapped.
When the mobile phone normally displays the split-screen windows of the video playing application and the music playing application, if the mobile phone detects a trigger operation of the one-handed operation mode, as shown in (b) of fig. 9, the mobile phone may, in response to the trigger operation, display a touch area 902 of the one-handed operation mode on the display screen, and display, reduced into the remaining area 903 of the display screen, the user interface 901 originally displayed in the entire screen display area. The remaining area 903 is the area of the display screen other than the touch area 902. As shown in (b) of fig. 9, an equally scaled-down user interface 904 is displayed in the remaining area 903, and the split-screen windows of the video playing application and the music playing application in the user interface 904 are also equally scaled down. Thus, by operating in the touch area 902, the user can perform full-range operations on the scaled-down user interface 904, including the split-screen windows of the video playing application and the music playing application, displayed in the remaining area 903.
It will be appreciated that the application window of the user interface including a plurality of applications may not be limited to a split screen scenario, and other scenarios such as a floating screen, a mini floating window, a multi-tasking window, etc. are also applicable, and the embodiment of the present application is not limited thereto.
Optionally, the mobile phone may rearrange and combine the interface contents displayed in the whole original screen display area of the mobile phone display screen according to the size of the remaining area, and display the rearranged and combined interface contents in the remaining area. It will be appreciated that the interface content after rearrangement may be displayed across the entire remaining area.
Taking a desktop interface as an example, as shown in (a) of fig. 10, when the mobile phone is in a normal mode, the desktop interface 1001 displayed by the mobile phone through the display screen is distributed over the entire screen display area of the display screen. Wherein, the desktop interface 1001 includes application icons of a plurality of applications.
When the mobile phone normally displays the desktop interface 1001, if the mobile phone detects a trigger operation in the one-hand operation mode, as shown in (b) in fig. 10, the mobile phone may display a touch area 1002 in the one-hand operation mode on the display screen in response to the trigger operation, and rearrange and combine application icons of a plurality of applications in the desktop interface 1001 that is originally displayed in the whole screen display area of the display screen, so as to display the rearranged and combined application icons of the plurality of applications in a remaining area 1003 of the display screen. The remaining area 1003 is other area of the display screen than the touch area 1002. As shown in fig. 10 (b), a new desktop interface 1004 in which application icons are rearranged and combined is displayed in the remaining area 1003 of the display screen, the content arrangement in the new desktop interface 1004 is different from that of the original desktop interface 1001, and the desktop interface 1004 is distributed over the entire remaining area of the display screen. Similarly, the user can perform a full-range operation on the new desktop interface 1004 displayed in the remaining area 1003 by performing an operation in the touch area 1002.
Alternatively, since the user interface displayed on the display screen generally conforms to the aspect ratio of the display screen, the mobile phone can determine, according to the position and size of the remaining area, the largest area that satisfies the screen aspect ratio of the display screen, as the target display area for displaying the user interface originally displayed in the entire screen display area. In this way, the mobile phone can equally scale down the user interface originally displayed in the entire screen display area of the display screen into the target display area, and the reduced user interface can fill the target display area.
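A sketch of this target-display-area computation follows, assuming the remaining area and the screen are axis-aligned rectangles; the names are illustrative.

    // Hypothetical sketch: the largest rectangle inside the remaining area that
    // keeps the screen's aspect ratio, returned as {left, top, width, height}.
    final class TargetAreaCalculator {
        static float[] targetDisplayArea(float rLeft, float rTop,
                                         float rWidth, float rHeight,
                                         float screenW, float screenH) {
            // Equal-ratio reduction factor: the largest scale at which the
            // whole screen display area still fits in the remaining area.
            float scale = Math.min(rWidth / screenW, rHeight / screenH);
            float w = screenW * scale;
            float h = screenH * scale;
            // Anchored at the top-left of the remaining area; any leftover
            // strip can host the rearranged application icons described below.
            return new float[] { rLeft, rTop, w, h };
        }
    }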
Optionally, after the mobile phone displays the user interface with reduced scaling by using the target display area in the remaining area, if there are other areas in the remaining area, except for the target display area, that are not displaying the content, the mobile phone may rearrange the application icons of the plurality of applications displayed on the desktop interface and display the rearranged application icons in the other areas. In this way, when the user operates the original user interface displayed in the target display area through the touch area, the user can also operate the plurality of application icons displayed in other areas through the touch area. Therefore, the phenomenon that the display screen has a black area due to the existence of an area which does not display any content is avoided, the full utilization of the whole display area of the mobile phone display screen is realized, the waste of large screen area is avoided, and a user can quickly open other applications while browsing the original user interface.
For example, referring to fig. 11, after determining the position and the size of the touch area 1101 in the single-handed operation mode, the mobile phone may use the remaining area 1102 of the display screen except for the touch area 1101 as the display area in the single-handed operation mode. Then, the mobile phone can determine, according to the position and the size of the remaining area 1102, the largest area satisfying the aspect ratio of the screen of the display screen in the remaining area 1102, as the target display area 1103 for displaying the user interface originally displayed in the whole screen display area. At this time, as shown in fig. 11, the remaining area 1102 has other areas 1104 where no content is displayed in addition to the target display area 1103. Therefore, in the embodiment of the present application, the mobile phone may rearrange the application icons of the applications displayed on the desktop interface for displaying in the other area 1104. The full utilization of the whole display area of the mobile phone display screen is realized.
As an example, fig. 12 (a) shows an interface diagram when a video application is displayed through a display screen while a mobile phone is in a one-handed operation mode. As shown in fig. 12 (a), after the mobile phone enters the one-hand operation mode, the mobile phone may display a touch area 1201 of the one-hand operation mode on the display screen, and display an application interface 1202 of the video application with reduced geometric proportion in a remaining area of the display screen, where the remaining area is other areas of the display screen except for the touch area 1201. Since the application interface 1202 of the video application does not cover the entire remaining area of the display screen, the remaining area still has the left area 1203 where no content is displayed, as shown in (a) of fig. 12, after the mobile phone enters the one-hand operation mode, the mobile phone may display a plurality of application icons in the left area 1203 of the display screen at the same time.
Optionally, the mobile phone may display more application icons in a left area of the display screen in a manner of switching a list, and the user may perform up-down sliding control on the switching list through touch control on the touch control area, so that the application icons in hidden display are displayed in the left area of the display screen.
In some embodiments, when the remaining area includes the target display area and the other areas, the mobile phone may determine, according to the area switching instruction, which area of the target display area and the other areas can be currently controlled by the touch area.
Optionally, the area switching instruction may be a preset gesture operation of the user on the touch area, such as a lateral sliding operation at the bottom of the touch area, which is not limited in the embodiment of the present application. As an example, when a user currently uses the touch area to control the user interface displayed in the target display area after the scaling down, if the user performs a lateral sliding gesture from right to left at the bottom of the touch area, the mobile phone may switch the action range of the touch area from the target display area to other areas, and at this time, the mobile phone may execute a function corresponding to the touch operation on the content displayed in the other areas according to the touch operation performed by the user on the touch area. If the user performs a lateral sliding gesture from left to right at the bottom of the touch area, the mobile phone can switch the action range of the touch area from other areas back to the target display area, and at this time, the mobile phone can execute a function corresponding to the touch operation on the user interface displayed in the target display area after the scaling down according to the touch operation of the user acting on the touch area.
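The gesture-driven switching just described can be sketched as a simple toggle; the names below are assumptions for illustration.

    // Hypothetical sketch: lateral swipes at the bottom of the touch area
    // switch which region the touch area currently controls.
    enum ControlledRegion { TARGET_DISPLAY_AREA, OTHER_AREA }

    final class RegionSwitcher {
        private ControlledRegion current = ControlledRegion.TARGET_DISPLAY_AREA;

        // Called when a lateral slide is detected at the bottom of the touch
        // area; right-to-left switches to the other areas, left-to-right back.
        ControlledRegion onBottomSwipe(boolean rightToLeft) {
            current = rightToLeft ? ControlledRegion.OTHER_AREA
                                  : ControlledRegion.TARGET_DISPLAY_AREA;
            return current; // subsequent touch operations act on this region
        }
    }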
Optionally, the mobile phone may display a control by using the touch area, so as to implement operation switching between the target display area and other areas, and at this time, the area switching instruction may also be a click operation of the control by the user. As an example, as shown in (b) of fig. 12, the handset may display a control 1204 in the touch area 1201. When the user clicks the control 1204, the scope of the touch area may be switched from the application interface 1202 of the video application to the plurality of application icons displayed in the left area 1203, and at this time, the mobile phone may execute a function corresponding to the touch operation on the plurality of application icons displayed in the left area 1203 according to the touch operation performed by the user on the touch area 1201. When the user clicks the control 1204 again, the scope of the touch area may switch from the plurality of application icons displayed in the left area 1203 back to the application interface 1202 of the video application, and at this time, the mobile phone may execute a function corresponding to the touch operation on the application interface 1202 of the video application according to the touch operation of the user on the touch area 1201.
Optionally, if there are other areas of the remaining areas that have no content displayed in addition to the target display area, the mobile phone may display application icons of applications frequently used by the user in the other areas, or the user may customize application icons of a plurality of applications displayed in the other areas. The embodiments of the present application are not limited.
In some embodiments, when the touch area of the one-handed operation mode is located relatively upward in the display screen, the remaining area may also include a bottom area of the display screen. Since the bottom area also does not display content, the cell phone may also display application icons of a plurality of applications in the bottom area in order to avoid screen waste. Thus, when the user operates the original user interface displayed in the target display area through the touch area, the user can also operate the plurality of application icons displayed in the bottom area through the touch area. Therefore, the phenomenon that the display screen has a black area due to the existence of an area which does not display any content is avoided, the full utilization of the whole display area of the mobile phone display screen is realized, the waste of large screen area is avoided, and a user can quickly open other applications while browsing the original user interface.
Referring to fig. 13, fig. 13 is a schematic diagram illustrating an interface when a mobile phone is in a one-hand operation mode and a video application is displayed on a display screen. As shown in fig. 13, after the mobile phone enters the single-hand operation mode, the mobile phone may display a touch area 1301 in the single-hand operation mode on the display screen, and display an application interface 1302 of the video application with reduced equal ratio in a remaining area of the display screen, where the remaining area is other areas of the display screen except the touch area 1301. Since the application interface 1302 of the video application does not cover the whole remaining area of the display screen, the remaining area still has the left area 1303 and the bottom area 1304 where no content is displayed, as shown in fig. 13, after the mobile phone enters the one-hand operation mode, the mobile phone may display a plurality of application icons in the left area 1303 and the bottom area 1304 of the display screen at the same time.
Alternatively, since some users can touch the bottom area of the display screen with one hand, the users can directly perform touch operation on the plurality of application icons in the bottom area instead of performing touch operation on the plurality of application icons displayed in the bottom area through the touch area.
Optionally, the mobile phone may display more application icons in a manner of switching a list in a bottom area of the display screen, and the user may perform left-right sliding control on the switching list through touch control on the touch control area, so that the application icons in hidden display are displayed in the bottom area of the display screen.
Alternatively, the bottom area or the left area of the remaining area where no content is displayed may display, instead of application icons, a plurality of shortcut functions, such as screenshot, sharing, scan, quick payment, health code, and the like. The embodiments of the present application are not limited in this regard.
In the embodiment of the application, when the mobile phone detects the user's touch operation on the touch area, the mobile phone can respond to the touch operation and execute the function corresponding to the touch operation on the reduced entire screen display area displayed in the remaining area and on the user interface in it. The touch operation may be a common touch operation such as sliding up, sliding down, sliding left, sliding right, clicking, double-clicking, or long-pressing, or a sliding gesture such as drawing a circle "○", a check "√", or a cross "×". The embodiment of the present application is not limited thereto.
Illustratively, taking a left-slide operation as an example, fig. 14 (a) shows an interface diagram when the mobile phone is in the one-handed operation mode and a desktop interface is displayed through the display screen. As shown in (a) of fig. 14, the mobile phone displays a touch area 1401 of the one-handed operation mode on the display screen, and displays, in the remaining area other than the touch area 1401, the reduced original entire screen display area and the desktop interface 1402 displayed in it. If the mobile phone detects a left-slide operation of the user acting on the touch area, the mobile phone responds to the left-slide operation and maps it to a left-slide operation acting on the reduced original entire screen display area displayed in the remaining area and on the desktop interface 1402 displayed in it, so that the mobile phone can execute the function corresponding to the left-slide operation on the reduced original entire screen display area and on the desktop interface 1402 displayed in it. As shown in (b) of fig. 14, the mobile phone can execute the desktop-interface switching function corresponding to the left-slide operation, thereby displaying a new desktop interface 1403 (the next page of the desktop) in the reduced original entire screen display area.
Illustratively, again taking a left-slide operation as an example, fig. 15 (a) shows an interface diagram when the mobile phone is in the one-hand operation mode and a video application is displayed on the display screen. As shown in fig. 15 (a), the mobile phone displays a touch area 1501 of the one-hand operation mode on the display screen, and displays the reduced original whole-screen display area and the application interface 1502 of the video application within it in the remaining area outside the touch area 1501. If the mobile phone detects a left-slide operation of the user acting on the touch area, the mobile phone responds to the left-slide operation and maps it to a left-slide operation acting on the reduced original whole-screen display area in the remaining area and on the application interface 1502 displayed there, so that the mobile phone can execute the function corresponding to the left-slide operation. As shown in fig. 15 (b), the mobile phone may execute the exit function of the video application corresponding to the left-slide operation, so that the reduced whole-screen display area displays the desktop main interface 1503 reached after the video application exits.
In some embodiments, after the proportionally reduced user interface is displayed in the remaining area, if the mobile phone detects a confirmation instruction for the user interface, the mobile phone may consider that the user needs to further control specific content in the user interface. At this time, according to the touch operation of the user acting on the touch area, the mobile phone may execute the function corresponding to the touch operation on a certain piece or part of the content in the user interface.
Optionally, the mobile phone may display a switching control in the touch area; in this case, the confirmation instruction for the user interface may be a click operation of the user on the switching control. Thus, by clicking the switching control, the user can select the currently displayed user interface and further control specific content in it. Optionally, the user may click the switching control again to deselect the currently displayed user interface and cancel further manipulation of its specific content.
Optionally, the confirmation instruction for the user interface may also be a preset gesture operation of the user on the touch area, where the preset gesture operation may be a double-click gesture, drawing a check mark, etc.; the embodiment of the present application is not limited thereto. Taking a double-click gesture as an example, after the mobile phone displays the proportionally reduced user interface in the remaining area, the user can select the currently displayed user interface and further control specific content in it by performing a double-click gesture operation on the touch area. Optionally, the user may perform the double-click gesture operation on the touch area again to deselect the currently displayed user interface and cancel further manipulation of its specific content.

In some embodiments, to ensure that the user can accurately locate and control a certain piece of content displayed in the remaining area through touch operations on the touch area, the mobile phone may also indicate, within the remaining area, the position on which the touch area currently acts.
As one way, the mobile phone may display a cursor in the remaining area, where the position of the cursor indicates the content currently located by the mobile phone. The user can control the movement of the cursor in the remaining area through touch operations on the touch area (such as sliding up, sliding down, sliding left and sliding right), so as to move the cursor to the position of the content that the user wants to locate and control; the user can then precisely manipulate that content through further touch operations on the touch area.
Illustratively, fig. 16 (a) shows an interface diagram when the mobile phone is in the one-hand operation mode and a desktop interface is displayed on the display screen. As shown in fig. 16 (a), the mobile phone displays a touch area 1601 of the one-hand operation mode on the display screen, and displays the reduced original whole-screen display area and the desktop interface 1602 within it in the remaining area outside the touch area 1601. If the user performs a double-click gesture operation on the touch area 1601, the mobile phone may consider that the user needs to further control specific content in the desktop interface 1602; in response to the double-click gesture operation, the mobile phone may display a cursor 1603 on the desktop interface 1602, where the cursor may be used to indicate the content currently selected by the mobile phone for the user. Optionally, the mobile phone may take the interface content at the upper left corner of the whole-screen display area as the initial stay position of the cursor. As shown in fig. 16 (a), the initial stay position of the cursor 1603 may be the position of the first application icon in the upper left corner of the desktop interface 1602.
When the mobile phone detects a right-slide operation of the user on the touch area, the mobile phone responds to the right-slide operation and controls the cursor 1603 to move rightward in the remaining area to select the next piece of content. As shown in fig. 16 (b), the cursor 1603 may move to the position of the second application icon in the upper left corner of the desktop interface 1602. In this way, the user can control the movement of the cursor 1603 in the remaining area through touch operations such as sliding up, sliding down, sliding left and sliding right in the touch area 1601, so as to move the cursor 1603 to the position of the application icon that the user wants to locate and control; the user can then perform a click operation in the touch area to trigger the mobile phone to open the application corresponding to the icon currently selected by the cursor 1603.
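A minimal sketch of this cursor logic (assuming a simple grid of icons and hypothetical names; the actual behavior may differ):

```java
/**
 * Illustrative model (hypothetical, not the patent's code): a selection cursor over
 * a grid of icons in the remaining area, moved by directional swipes in the touch area.
 */
public class CursorController {
    public enum Swipe { LEFT, RIGHT, UP, DOWN }

    private final int columns;    // icons per row in the reduced desktop interface
    private final int itemCount;  // total icons
    private int index = 0;        // initial stay position: the top-left icon

    public CursorController(int columns, int itemCount) {
        this.columns = columns;
        this.itemCount = itemCount;
    }

    /** Moves the cursor one step in the swiped direction; returns the selected index. */
    public int onSwipe(Swipe s) {
        int next = switch (s) {
            case RIGHT -> index + 1;
            case LEFT  -> index - 1;
            case DOWN  -> index + columns;
            case UP    -> index - columns;
        };
        if (next >= 0 && next < itemCount) index = next;
        return index;
    }

    /** A click in the touch area opens the icon the cursor currently selects. */
    public int onClick() {
        return index;
    }
}
```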
It will be appreciated that when more content is displayed in the remaining area, the user may need to perform many touch operations in the touch area to gradually move the cursor to the position of a certain piece of content before performing the corresponding manipulation. This is especially true for a large-screen mobile phone: its display screen originally shows more content, and when that content is reduced in the one-hand operation mode and displayed compactly in the remaining area, the displayed content becomes correspondingly smaller and denser. This makes it harder for the user to find content and increases the difficulty of accurately positioning the cursor at a specific location in the remaining area through touch operations in the touch area.
Thus, in some embodiments, the mobile phone may display a content selection frame in the remaining area, where the area outlined by the content selection frame may be the range in which the touch area acts within the remaining area. The area defined by the content selection frame can include a plurality of pieces of content displayed in the remaining area. The user can control the movement of the content selection frame in the remaining area through touch operations on the touch area (such as sliding up, sliding down, sliding left and sliding right), so as to move the content selection frame to the area containing the content that the user wants to locate and control. The user can then accurately locate the specific position of that content within the content selection frame and precisely manipulate it through touch operations on the touch area.
As shown in fig. 17 (a), the mobile phone may display a touch area 1701 of the one-hand operation mode on the display screen and, in the remaining area outside the touch area 1701, display the reduced original whole-screen display area and the desktop interface 1702 shown in it. The mobile phone may also display a content selection box 1703 on the desktop interface 1702, where the content selection box may be used to prompt the user which area is currently selected by the mobile phone, and the selected area may include a plurality of pieces of interface content. Alternatively, the mobile phone may take the upper left corner of the original whole-screen display area as the initial stay position of the content selection frame. As shown in fig. 17 (a), the initial stay position of the content selection box 1703 may be the upper left corner of the desktop interface 1702, and the area outlined by the content selection box 1703 includes a plurality of application icons.
When the mobile phone detects a right-slide operation of the user on the touch area, the mobile phone controls the content selection box 1703 to move rightward in the remaining area in response to the right-slide operation, so as to select the next area. As shown in fig. 17 (b), the content selection box 1703 may move to the upper right corner of the desktop interface 1702. In this way, the user can control the movement of the content selection box 1703 in the remaining area through touch operations such as sliding up, sliding down, sliding left and sliding right in the touch area 1701, so as to move the content selection box 1703 to the area that the user wants to locate and manipulate. The user can then precisely locate the specific position of a certain piece of content within the content selection box 1703 and precisely manipulate it through touch operations in the touch area 1701.
Optionally, the area outlined by the content selection frame may be the range in which the touch area acts within the remaining area. When the content selection frame stays in a certain area of the remaining area, the mobile phone can establish a mapping relation between the touch area and the area currently selected by the content selection frame. Thus, when the mobile phone detects a touch operation of the user (such as a single-click, double-click or long-press operation) at a certain position in the touch area, the mobile phone can map it, according to the mapping relation, to a touch operation at the corresponding position in the content selection frame, and then execute the function corresponding to the touch operation at that position.
Illustratively, when the user moves the content selection box 1703 to the upper right corner of the desktop interface 1702 as shown in (b) of fig. 17, the mobile phone may establish a mapping relationship between the touch area 1701 and the area selected by the current content selection box 1703. For example, as shown in (c) of fig. 17, touch operations in the four touch sub-areas (upper left, upper right, lower left, lower right) of the touch area may be mapped one-to-one to touch operations on the 4 application icons in the current content selection box 1703. In this way, the user can respectively control the 4 application icons in the current content selection box 1703 by performing touch operations in these four touch sub-areas.
For example, as shown in (d) of fig. 17, when the mobile phone detects a click operation by the user on the lower left side in the touch area, the mobile phone may map the click operation on the lower left side in the touch area to a click operation on the application icon 1704 of the music application in the content selection frame 1703 according to the mapping relation, so that the mobile phone may open the music application in response to the click operation by the user on the lower left side in the touch area.
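A sketch of how such a quadrant-to-icon mapping might look (hypothetical names; the four icons are listed row-major as in fig. 17 (c)):

```java
import java.awt.Point;
import java.awt.Rectangle;

/**
 * Sketch of the quadrant mapping of fig. 17 (c): a tap in one of the four touch
 * sub-areas of the touch area is mapped to the matching icon inside the content
 * selection frame. Names and layout are assumptions for illustration.
 */
public class QuadrantMapper {
    private final Rectangle touchArea;
    private final String[] icons; // row-major: upper-left, upper-right, lower-left, lower-right

    public QuadrantMapper(Rectangle touchArea, String[] fourIcons) {
        this.touchArea = touchArea;
        this.icons = fourIcons;
    }

    /** Returns the icon whose touch sub-area contains the tap position. */
    public String iconAt(Point tap) {
        boolean right = tap.x >= touchArea.x + touchArea.width / 2;
        boolean lower = tap.y >= touchArea.y + touchArea.height / 2;
        return icons[(lower ? 2 : 0) + (right ? 1 : 0)];
    }
}
```

With the music application mapped to the lower-left sub-area, a tap in the lower left of the touch area would return the music icon, matching the example above.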
Optionally, the mobile phone may establish the mapping relationship between the touch area and the area selected by the current content selection frame in real time according to the area outlined by the content selection frame, or may establish it only when a confirmation operation of the user on the content selection frame is detected. The confirmation operation on the content selection frame may be a double-click operation of the user acting on the touch area, a specific gesture operation (such as drawing a check mark), or the like, which is not limited in the embodiment of the present application.
Illustratively, taking a double-click operation as an example, when the user has moved the content selection box 1703 to the upper right corner of the desktop interface 1702 as shown in (b) of fig. 17, if the user decides that specific manipulation of a certain piece of content in the current content selection box 1703 is required, the user may perform a double-click operation in the touch area. When the mobile phone detects the double-click operation of the user on the touch area, it determines that the confirmation operation on the content selection frame has been detected; at this time, in response to the double-click operation, the mobile phone can enter the area outlined by the content selection box 1703 and establish the mapping relationship between the touch area and the area outlined by the current content selection frame.
Optionally, the mobile phone may delete the mapping relationship between the touch area and the area selected by the current content selection frame when detecting an exit operation of the user on the content selection frame. The exit operation on the content selection frame may be a double-click operation of the user acting on the touch area, a specific gesture operation (such as drawing a cross x), or the like, which is not limited in the embodiment of the present application.
For example, taking a double-click operation as an example, after the user has performed a double-click operation in the touch area to make the mobile phone enter the area outlined by the content selection frame and establish the mapping relationship between the touch area and that area, the user may perform a double-click operation in the touch area again to make the mobile phone exit the area outlined by the content selection frame and delete the mapping relationship. The mobile phone then returns to the original operation logic, that is, the user can continue to control the movement of the content selection frame through sliding operations in the touch area.
In some embodiments, the mobile phone may adaptively display content selection boxes of different sizes according to the different content displayed in the remaining area. Optionally, since different applications display different content and the layouts of their application interfaces also differ, the mobile phone can generate content selection frames of different sizes according to the application interface currently displayed in the whole-screen display area.
For example, as shown in fig. 18, when the mobile phone displays the reduced original whole-screen display area and the application interface 1801 of a short-video application in the remaining area, the mobile phone may also display the content selection box 1802 shown in fig. 18 on the application interface 1801, where the content selection box may be used to indicate the area currently selected by the mobile phone for the user, and the selected area may include a plurality of controls such as a follow control, a favorites control, a comment control and a share control. Because these controls are arranged vertically in the application interface 1801 of the short-video application, the mobile phone can adaptively adjust the size of the content selection frame to match the vertically distributed interface content.
Optionally, the mobile phone may acquire all the operation-control views displayed in the remaining area, determine whether the views are arranged horizontally or vertically, and calculate the size and position of each view so as to select some of the views and recombine them into a combined view. The range of the combined view is then the size of the content selection frame. In this way, the mobile phone can adaptively determine the size of the content selection frame according to whether the views displayed in the remaining area are arranged horizontally or vertically.
For example, as shown in fig. 19 (a), when the mobile phone displays the reduced original whole-screen display area and the application interface 1901 of a search application in the remaining area, the mobile phone may obtain all views displayed in the application interface 1901 and then determine which views are arranged horizontally and which vertically. The mobile phone then calculates the size and position of each view and adaptively selects suitable views to recombine into a combined view according to their horizontal or vertical arrangement, thereby determining the size of the content selection frame. As shown in fig. 19 (a), the mobile phone may recombine a plurality of views arranged horizontally in a row in the application interface 1901 into a combined view 1902, where the range of the combined view 1902 is the size of the content selection frame. As shown in fig. 19 (b), the mobile phone may also recombine a plurality of views arranged vertically in a column into a combined view 1903, where the range of the combined view 1903 is the size of the content selection frame. As shown in fig. 19 (c), the mobile phone may also recombine a plurality of vertically arranged views and a plurality of horizontally arranged views into a combined view 1904, where the range of the combined view 1904 is the size of the content selection frame.
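A sketch of how the size of the content selection frame could be derived as the bounding box of a recombined group of views (the View record and the row test are illustrative assumptions, not the patent's algorithm):

```java
import java.awt.Rectangle;
import java.util.List;

/**
 * Sketch of sizing the content selection frame: a group of views arranged in one
 * row (or column) is recombined, and the bounding box of the group is taken as
 * the frame size. The View record and the row test are illustrative assumptions.
 */
public class SelectionFrameSizer {
    public record View(String name, Rectangle bounds) {}

    /** Bounding rectangle of the combined view; its size is the content selection frame size. */
    public static Rectangle combine(List<View> group) {
        Rectangle union = new Rectangle(group.get(0).bounds());
        for (View v : group) {
            union = union.union(v.bounds());
        }
        return union;
    }

    /** Two views are treated as horizontally arranged (one row) when their vertical extents overlap. */
    public static boolean sameRow(View a, View b) {
        return a.bounds().y < b.bounds().y + b.bounds().height
            && b.bounds().y < a.bounds().y + a.bounds().height;
    }
}
```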
Optionally, the mobile phone may highlight the area outlined by the content selection frame to prompt the user for the area currently selected by the content selection frame.
Optionally, the mobile phone may map the combined view in the content selection frame to the touch area and set it to be visible, so that the combined view outlined by the content selection frame is displayed in the touch area. The user can then accurately control the corresponding view in the content selection frame displayed in the remaining area by operating the corresponding view displayed in the touch area. As shown in fig. 20, the mobile phone displays, in the touch area 2001, the icon controls of the 4 application icons that form the combined view outlined by the content selection frame 2002.
Optionally, because the views in the reduced user interface are small, directly mapping them to the even smaller touch area can be inconvenient for the user to operate; and when the layout of the views in the reduced user interface does not match the touch area, the mobile phone cannot map them to the touch area as-is. Therefore, when mapping the combined view in the content selection frame to the touch area, the mobile phone can readjust the size and position of each view according to the size of the touch area, so that the adjusted combined view better matches the touch area and the user can operate it more conveniently with one hand.
For example, as shown in fig. 21, when the mobile phone maps the combined view in the content selection frame 2101 displayed in the remaining area to the touch area 2102, the mobile phone can readjust the size and position of each view in the content selection frame 2101 so that the adjusted combined view better matches the touch area 2102. The mobile phone can then display the adjusted combined view shown in fig. 21 in the touch area 2102, where the views in the adjusted combined view are larger, making user operation more convenient.
For example, as shown in fig. 22, when the mobile phone maps the combined view in the content selection frame 2201 displayed in the remaining area to the touch area 2202, the follow control, favorites control, comment control and share control in the combined view are arranged vertically, so the mobile phone can determine that the combined view does not match the touch area: if it were mapped directly, each control would be small in the touch area and inconvenient to operate. Therefore, the mobile phone can readjust the size and position of each of these controls so that the adjusted combined view better matches the touch area 2202. The mobile phone may then display the adjusted combined view shown in fig. 22 in the touch area 2202, where the follow control, favorites control, comment control and share control are arranged horizontally and displayed at a larger size, so that the user can operate them in the touch area more conveniently with one hand.
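A sketch of one possible re-layout policy (an assumption, not the patent's algorithm): a vertically arranged combined view is re-laid as a horizontal row of equal slots that fills the touch area:

```java
import java.awt.Rectangle;
import java.util.ArrayList;
import java.util.List;

/**
 * Sketch of one possible re-layout policy (an assumption, not the patent's
 * algorithm): a vertically arranged combined view is re-laid as a horizontal
 * row of equal slots that fills the touch area, so each control is larger.
 */
public class TouchAreaLayout {
    /** One rectangle per control, dividing the touch area into equal columns. */
    public static List<Rectangle> layoutHorizontally(Rectangle touchArea, int controlCount) {
        List<Rectangle> slots = new ArrayList<>();
        int slotWidth = touchArea.width / controlCount;
        for (int i = 0; i < controlCount; i++) {
            slots.add(new Rectangle(touchArea.x + i * slotWidth, touchArea.y,
                                    slotWidth, touchArea.height));
        }
        return slots;
    }
}
```

With the four controls of fig. 22, this yields four side-by-side slots at full touch-area height, consistent with the enlarged horizontal arrangement described above.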
Alternatively, the mobile phone may also map the fixed-position operation controls in the user interface displayed in the remaining area that do not move with page browsing, such as a title bar, a bottom bar and a top bar, directly to the touch area without framing them with the content selection frame, so that the user can operate these controls directly within the touch area. Optionally, the mapped positions of these controls within the touch area may correspond to their positions in the user interface. For example, the top bar is typically displayed at the top of the user interface, so the mobile phone may map it to the top of the touch area; the bottom bar is typically displayed at the bottom of the user interface, so the mobile phone may map it to the bottom of the touch area.
Illustratively, as shown in (a) of fig. 23, since the application interface of a shopping application displayed in the remaining area includes fixed-position operation controls such as a top bar 2301 and a bottom bar 2302, the mobile phone can map the top bar 2301 and the bottom bar 2302 to the touch area, so that the mobile phone displays the top bar 2303 and the bottom bar 2304 shown in (a) of fig. 23 in the touch area. In this way, the user can directly perform touch operations on the top bar 2303 and the bottom bar 2304 in the touch area with one hand.
Optionally, after directly mapping the fixed-position operation controls such as the title bar, the bottom bar and the top bar to the touch area, the mobile phone may also map the non-fixed-position operation controls in the user interface displayed in the remaining area, which move with page browsing, to the touch area through the content selection frame. That is, the mobile phone may map the combined view framed by the content selection box to the touch area. It will be appreciated that, when displayed in the touch area, the fixed-position controls such as the title bar, bottom bar and top bar do not overlap with the combined view framed by the content selection box.
As shown in fig. 23 (b), the mobile phone can map the fixed-position operation controls such as the top bar 2301 and the bottom bar 2302 in the application interface of the shopping application displayed in the remaining area into the touch area, so that the user can directly touch the top bar 2303 and the bottom bar 2304 in the touch area to operate the top bar 2301 and the bottom bar 2302 in the remaining area. The mobile phone may also map the combined view within the content selection box 2305 into the touch area, so that the user can directly touch the combined view 2306 in the touch area to operate the combined view within the content selection box 2305 in the remaining area.
Optionally, when the mobile phone detects a down-slide operation of the user on the touch area, the mobile phone controls the content selection frame 2305 to move downward in the remaining area in response to the operation, so as to select the next combined view. For example, the content selection frame 2305 may move from the position shown in (b) of fig. 23 to the position shown in (c) of fig. 23.
Optionally, when a newly generated user interface is displayed in the remaining area following a touch operation of the user on a designated control in the touch area, the mobile phone may highlight the content related to the designated control in the newly generated user interface, or display the related content controls of that user interface in the touch area, so that the user can quickly notice or operate content of likely interest. Alternatively, if the newly generated user interface contains no content related to the designated control, the mobile phone may display it without highlighting.
As an example, as shown in (c) of fig. 23, when the mobile phone maps the combined view in the content selection box 2305 into the touch area, if the user clicks the cap control 2307 in the combined view in the touch area, the mobile phone may enter the cap-related page corresponding to the cap control 2307 in response to the click operation; that is, the mobile phone switches the page shown in (c) of fig. 23 displayed in the remaining area to the cap-related page shown in (d) of fig. 23. At this time, as shown in (d) of fig. 23, the mobile phone may highlight the controls 2308 and 2309 that are similar to the cap control 2307 in the cap-related page, and may also map the controls 2308 and 2309 to the touch area, so that the user can quickly operate content of likely interest directly through the touch area.
Optionally, when the combined view in the touch area corresponds position-for-position to the combined view in the content selection frame 2305 in the remaining area, the mobile phone may also hide the views of the combined view in the touch area, i.e. set them to be invisible, to avoid displaying excessive repeated content on the display screen and affecting the user's viewing experience. In this case, the user can still perform touch operations at the corresponding touch positions in the touch area to operate the combined view in the content selection frame 2305 in the remaining area.
Optionally, when the mobile phone detects a sliding operation of the user on the touch area, the mobile phone may synchronously scroll the page content of the user interface displayed in the remaining area. The mobile phone can then re-acquire all the views at non-fixed positions in the user interface displayed in the remaining area and recombine them into one or more combined views. Through sliding operations on the touch area, the user can synchronously control the content selection frame in the user interface displayed in the remaining area and switch among the plurality of combined views.
Optionally, when the remaining area displays, in addition to the target display area for the user interface of the original whole-screen display area, a plurality of application icons or shortcut functions in the left area or the bottom area, and the user needs to manipulate the display content there, the user may also move the content selection frame to the left area or the bottom area through the touch area. When the content selection frame moves to the left area or the bottom area, the mobile phone may adaptively adjust the content selection frame to a suitable size according to the vertical or horizontal arrangement of the application icons or shortcut functions in that area. The mobile phone may then map the application icons or shortcut functions in the content selection box to the touch area.
Optionally, the user may also customize gesture operations, and a customized gesture may be bound to a specific function of a specific application interface. The user can then perform the customized gesture operation directly on the touch area to quickly start the bound function of the corresponding application. For example, when the user performs a circle-drawing gesture on the touch area, the mobile phone may be triggered to directly open the call page of the phone application and dial a specified contact. For another example, when the user performs a box-drawing gesture on the touch area, the mobile phone may be triggered to directly open the code-scanning page of the payment application and perform the code-scanning function.
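A minimal sketch of such gesture bindings (gesture names and the bound actions are illustrative assumptions):

```java
import java.util.HashMap;
import java.util.Map;

/**
 * Minimal sketch of user-defined gesture bindings: each customized gesture drawn
 * in the touch area is bound to a function of an application. Gesture names and
 * the bound actions are illustrative assumptions.
 */
public class GestureBindings {
    private final Map<String, Runnable> bindings = new HashMap<>();

    /** Binds a customized gesture to an application function. */
    public void bind(String gesture, Runnable action) {
        bindings.put(gesture, action);
    }

    /** Invoked when the recognizer classifies a gesture drawn in the touch area. */
    public void onGesture(String gesture) {
        Runnable action = bindings.get(gesture);
        if (action != null) action.run();
    }

    public static void main(String[] args) {
        GestureBindings g = new GestureBindings();
        g.bind("circle", () -> System.out.println("open call page, dial specified contact"));
        g.bind("box",    () -> System.out.println("open payment app, start code scanning"));
        g.onGesture("circle"); // prints: open call page, dial specified contact
    }
}
```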
The following describes a one-hand operation method according to an embodiment of the present application with reference to the accompanying drawings; the method may be applied to the scenarios shown in fig. 2 to 23 described above. The one-hand operation method is applied to an electronic device, which may be the mobile phone described above. As shown in fig. 24, the method may include S2410-S2480.
S2410, the electronic device controls a screen display area of a display screen to display a first interface.
The first interface may be understood as the user interface presented by the electronic device through the entire screen display area of the display screen. A user interface is a medium for interaction and information exchange between an application program or the operating system and the user; it converts between an internal form of information and a form acceptable to the user. The user interface of an application program is source code written in a specific computer language such as Java or extensible markup language (XML); the interface source code is parsed and rendered on the terminal device and finally presented as interface elements in the user interface.
Optionally, the user interface may include interface elements such as icons, windows and controls. Controls are also called widgets; typical controls include a toolbar, a menu bar, a text box, a button, a scroll bar, a picture and text. A commonly used presentation form of a user interface is the graphical user interface (GUI), which refers to a graphically displayed user interface related to computer operations. It may consist of interface elements such as icons, windows and controls displayed in the display screen of the electronic device, where the controls may include visual interface elements such as pictures, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars and widgets.
In the embodiment of the application, the first interface may include at least one of a desktop interface and an application interface of an application program. Alternatively, the first interface may be a full screen displayed desktop interface or a full screen displayed application interface. Alternatively, the first interface may be a combined interface composed of a desktop interface displayed in full screen and a floating application interface displayed in floating on the desktop interface. Alternatively, the first interface may also be a combined interface composed of at least one application interface, for example, the first interface may be an application interface of two application programs displayed in a split screen state.
Alternatively, the interface content of the first interface may span the entire screen display area. Alternatively, the interface content of the first interface may not cover the whole screen display area, that is, there is a black area where no content is displayed between the interface content boundary in the first interface and the whole screen display area boundary of the display screen. The first interface is not limited in the embodiment of the present application.
S2420, the electronic device detects a trigger instruction of the one-hand operation mode.
Alternatively, the trigger instruction of the one-hand operation mode may be triggered by the user. Optionally, the trigger instruction may be a click operation of the user on a function option of the one-hand operation mode, a press operation of the user on a specific physical key or key combination of the electronic device, a voice instruction of the user, a shortcut gesture operation of the user on the screen (such as drawing a circular track), or an air gesture operation of the user, which is not limited in the embodiment of the present application.
Alternatively, the trigger instruction of the one-hand operation mode may be triggered by the electronic device. As one embodiment, the electronic device may automatically trigger the one-hand operation mode upon detecting that the user is currently holding the electronic device in a one-hand holding posture. Alternatively, the electronic device may automatically trigger the one-hand operation mode when detecting that the user's grip temperature is lower than a preset temperature value. The embodiment of the application does not limit the manner in which the electronic device automatically triggers the one-hand operation mode.
The one-hand operation mode is an operation mode of the electronic device that divides the display screen into a first area and a second area. The first area is the part of the display screen that the user's finger can touch, or can easily touch, when operating the electronic device with one hand; the second area is the remaining part of the display screen outside the first area. In this operation mode, the original display area of the display screen (i.e. the display area before entering the one-hand operation mode) and the display content therein (such as a desktop interface, an application program interface, application icons, characters, patterns, display windows, controls, etc.) are reduced, and the reduced display area and content are placed in the second area for display. A mapping relation can be established between the first area and the second area, that is, a touch operation performed by the user in the first area can be mapped to the second area, so that the electronic device can, through a touch in the first area, execute the function that would be executed when the second area is touched. In this way, the user can operate the electronic device normally with one hand in the first area.
For example, after the electronic device enters the one-hand operation mode, the reduced original display area and the reduced display content are displayed in the second area of the display screen, where the display content includes the application icon of a short message application. When the user performs a single-click operation at a certain position of the first area, the electronic device can respond to the single-click operation and determine that it maps to the application icon of the short message application displayed in the second area, so that the electronic device executes the function required when that application icon is touched, namely opening the short message application program corresponding to the icon.
Optionally, after the electronic device enters the one-hand operation mode, the electronic device may implement all functions of the display area before zooming out in the second area, including displaying the zoomed out display content in the second area and performing touch operation in the second area. That is, the user may implement the touch operation on the second area in the first area, or may directly implement the touch operation on the second area in the second area.
Optionally, the electronic device may also display the first area, so that the user knows which part of the display screen can be touched with one hand. As an implementation manner, the electronic device may add a touch window and place it in the first area of the display screen for display. That is, once the one-hand operation mode of the electronic device is triggered, the electronic device displays the touch window in the first area of the display screen, and displays the reduced original display area and the reduced display content in the second area of the display screen.
S2430, the electronic device determines a first area and a second area of the display screen in response to the trigger instruction of the one-hand operation mode.
The first area is an area which can be touched by a finger of a user with one hand on the display screen when the user holds the electronic equipment with one hand, and the second area is a remaining area except the first area on the display screen. The electronic device can receive touch operation input by a user with one hand through the first area, and can display content to be displayed by the electronic device through the second area.
Optionally, when the electronic device responds to a trigger instruction of the one-hand operation mode for the first time, the electronic device may enter an initialization setting mode for the first area to determine a first area adapted to the user's current one-hand operation. In one mode, while the user holds the electronic device normally with one hand, the electronic device can prompt the user to slide the thumb on the display screen along a specified track such as an arc; the electronic device can then analyze the sliding track of the user's thumb to obtain the maximum sliding distance of the thumb on the screen, and thereby determine the maximum area that the thumb can touch on the screen. The electronic device can then determine the area range of the first area according to that maximum touchable area.
Optionally, the electronic device may also display a maximum area range that the thumb can touch on the screen, so that the user can customize the area range of the first area.
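A sketch of how the first area might be derived from the calibration trace, assuming a right-hand grip pivoting near the bottom-right corner of the screen (the square shape, the size cap and the reserved bottom area follow the description below; all names are illustrative):

```java
import java.awt.Point;
import java.awt.Rectangle;
import java.util.List;

/**
 * Sketch of deriving the first area from the calibration trace, assuming a
 * right-hand grip pivoting near the bottom-right corner of the screen.
 * All names and the geometry are assumptions for illustration.
 */
public class FirstAreaCalibrator {
    /** Largest square at the pivot corner covered by the thumb's sliding track. */
    public static Rectangle firstArea(List<Point> thumbTrace, int screenW, int screenH,
                                      int maxSide, int bottomReserve) {
        double reach = 0;
        for (Point p : thumbTrace) {
            reach = Math.max(reach, p.distance(screenW, screenH)); // distance to the pivot corner
        }
        int side = (int) Math.min(reach, maxSide); // upper limit keeps the second area usable
        return new Rectangle(screenW - side, screenH - bottomReserve - side, side, side);
    }
}
```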
After the electronic device determines the first area, the remaining area except the first area in the display screen can be used as the second area. In the embodiment of the present application, the first area may be understood as a touch area in a single-hand operation mode, and the second area may be understood as a display area in a single-hand operation mode.
Optionally, after determining the first area, the electronic device may record the position and the size of the first area, so that when the subsequent electronic device enters the one-hand operation mode again, the first area may be directly determined.
It will be appreciated that the shape of the first region is not limited in this embodiment, and may be square, circular, oval, etc. Taking the shape of the first area as a square as an example, the electronic device can determine the maximum value of the length and the width of the first area according to the maximum area range which can be touched by the thumb on the screen, so that the electronic device can determine the position and the size of the first area.
Optionally, an upper limit may be set on the size of the first area, to avoid that an excessively large first area leaves the second area too small and makes the content displayed there too small.
Optionally, when determining the area range of the first area, the electronic device may also reserve a bottom area of a preset size at the bottom of the display screen, so that the first area is located above the bottom area. In this way, once the first area is determined, the bottom area can serve as part of the second area for displaying content. The preset size can be reasonably set according to the actual situation, and the embodiment of the application is not limited.
S2440, the electronic device reduces the screen display area of the display screen and the first interface displayed by the screen display area.
It can be understood that, in the normal mode, the whole screen display area of the display screen is generally used for displaying the first interface. When the electronic device switches from the normal mode to the one-hand operation mode, the electronic device divides the display screen into the first area and the second area, where the first area is the touch area of the one-hand operation mode and the second area is the display area of the one-hand operation mode. Since the size of the second area is smaller than the size of the whole screen display area, when switching from the normal mode to the one-hand operation mode, the whole screen display area and its display content need to be reduced to ensure that everything displayed in the normal mode is included in the second area and no information that should be displayed is missing. In this way, after switching to the one-hand operation mode, the user can view the display of the original normal mode through the second area of the display screen.
Optionally, the first interface displayed by the display screen conforms to the aspect ratio of the display screen. The electronic device can therefore determine, within the second area and according to the position and size of the second area, the largest area that matches the screen aspect ratio as the target display area for the first interface originally displayed in the whole screen display area. The electronic device can then determine, according to the size of the target display area, the target ratio by which the original screen display area and the first interface displayed in it are to be reduced. The electronic device can then reduce the original screen display area and the first interface displayed in it proportionally according to the target ratio, ensuring that the reduced first interface exactly fills the target display area.
Optionally, after determining the target ratio, the electronic device may record it, so that when the electronic device enters the one-hand operation mode again, it can directly reduce the first interface displayed in the whole screen display area proportionally according to the recorded target ratio.
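A sketch of the target-ratio computation described above (names are illustrative; the equal-ratio scale is the smaller of the width and height ratios, so the screen aspect ratio is preserved):

```java
import java.awt.Dimension;
import java.awt.Rectangle;

/**
 * Sketch of the target-ratio computation: the equal-ratio scale factor is the
 * smaller of the width and height ratios between the second area and the
 * screen, so the screen aspect ratio is preserved. Names are illustrative.
 */
public class TargetScale {
    /** The target proportion by which the screen display area and first interface are reduced. */
    public static double targetRatio(Rectangle secondArea, Dimension screen) {
        return Math.min(secondArea.width / (double) screen.width,
                        secondArea.height / (double) screen.height);
    }

    /** Largest region inside the second area that matches the screen aspect ratio. */
    public static Rectangle targetDisplayArea(Rectangle secondArea, Dimension screen) {
        double scale = targetRatio(secondArea, screen);
        // Anchored here at the second area's top-left corner; in practice the area
        // may lean toward the left or right edge depending on the grip hand.
        return new Rectangle(secondArea.x, secondArea.y,
                             (int) (screen.width * scale), (int) (screen.height * scale));
    }
}
```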
S2450, the electronic device controls the second area of the display screen to display the reduced display area and the reduced first interface.
Optionally, the electronic device may determine, in the second area, the target display area for the first interface displayed in the original whole screen display area, so as to ensure that the first interface can be proportionally reduced into the target display area for display. Alternatively, whether the target display area leans toward the left or right edge of the display screen within the second area may be determined according to whether the user's one-hand grip is left-handed or right-handed. For example, when the user holds the device with the right hand, the target display area may lean toward the right edge of the display screen within the second area.
Optionally, when the second area contains, in addition to the target display area, a third area in which no content is displayed, the electronic device may rearrange the applications on the desktop and control the third area of the display screen to display the rearranged desktop applications.
Optionally, the electronic device may also control the third area of the display screen to display an application program or a shortcut function commonly used by the user. The present application is not limited to the content displayed in the third region.
S2460, the electronic device establishes a mapping relation between the first area and the second area.
Alternatively, the mapping relationship between the first area and the second area may be a coordinate mapping relationship. Therefore, after the electronic device detects that the user performs a touch operation at a certain position in the first area, it can map that touch operation to a touch operation at the corresponding position in the second area according to the coordinate mapping relation, so that the electronic device can perform the function required when the corresponding position in the second area is touched.
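A minimal sketch of such a coordinate mapping (illustrative types, not the patent's code):

```java
import java.awt.Point;
import java.awt.Rectangle;

/**
 * Minimal sketch of the coordinate mapping of S2460 (illustrative types, not the
 * patent's code): a touch position in the first area is translated proportionally
 * to the corresponding position in the second area.
 */
public class RegionMapper {
    private final Rectangle firstArea;
    private final Rectangle secondArea;

    public RegionMapper(Rectangle firstArea, Rectangle secondArea) {
        this.firstArea = firstArea;
        this.secondArea = secondArea;
    }

    /** Maps a point in the first area to the proportionally equivalent point in the second area. */
    public Point toSecondArea(Point inFirst) {
        double rx = (inFirst.x - firstArea.x) / (double) firstArea.width;
        double ry = (inFirst.y - firstArea.y) / (double) firstArea.height;
        return new Point(secondArea.x + (int) Math.round(rx * secondArea.width),
                         secondArea.y + (int) Math.round(ry * secondArea.height));
    }
}
```

The electronic device would then hit-test the mapped point against the reduced interface and execute the function bound to whatever control lies at that position.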
In some embodiments, the electronic device may establish a mapping relationship between the first area and the reduced first interface displayed in the second area. Optionally, the electronic device may establish a mapping relationship between the first area and the operation controls in the first interface displayed in the second area. The operation controls in the first interface may include fixed-position operation controls that do not move with page browsing, such as a title bar, a navigation bar, a bottom bar and a top bar, and may also include non-fixed-position operation controls that move with page browsing, such as pictures, buttons and text boxes.
Optionally, the electronic device may establish a coordinate mapping relationship between the position of the target control in the first interface displayed in the second area and a target position in the first area, so as to map the position of the target control in the first interface to the first area. The target control may be any operation control in the first interface, and the target position may be any position in the first area. Therefore, after the electronic device detects that the user performs a touch operation at the target position in the first area, it can map that touch operation to a touch operation at the position of the target control in the second area according to the coordinate mapping relation, so that the electronic device can perform the function required when the target control in the second area is touched. For example, the electronic device can display, in the reduced screen display area within the second area, the new user interface generated when the target control is touched.
Optionally, the electronic device may map the target control in the first interface displayed in the second area to the first area for display, that is, the first area may display the target control. The electronic device can then establish a function mapping relation between the target control displayed in the first area and the function that the electronic device needs to execute when the target control displayed in the second area is touched. Thus, when the electronic device detects a touch operation of the user on the target control displayed in the first area, it can directly execute the function required when the target control in the second area is touched.
Alternatively, the electronic device may always map the fixed-position operation controls that do not move with page browsing, such as the title bar, bottom bar and top bar, to the first area, while the non-fixed-position operation controls that move with page browsing, such as pictures, buttons and text boxes, may be mapped to the first area selectively.
As an embodiment, referring to fig. 25, S2460 may include:
S2461, the electronic device acquires the first control and the second control in the first interface.
The first control may be a fixed-position operation control that does not move with page browsing, such as a title bar, a bottom bar, a top bar, etc., in the first interface, and the second control may be a non-fixed-position operation control that moves with page browsing in the first interface.
The electronic device may obtain the display style of each control in the first interface to determine its display position and display size. The electronic device may then determine whether a control is a first control by judging whether it is fixedly displayed at a certain position in the first interface. Alternatively, because fixed-position operation controls that do not move with page browsing, such as title bars, navigation bars and menu bars, are typically displayed at the top or bottom of the user interface, while non-fixed-position operation controls that move with page browsing are typically displayed in the long preview page in the middle of the user interface, the electronic device can take a control fixedly displayed at a first position in the first interface as the first control, and take a control not fixedly displayed at the first position as the second control. The first position includes at least one of a top position and a bottom position.
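A sketch of this classification (the Control record, the scrolling flag and the edge-band thresholds are assumptions for illustration):

```java
import java.awt.Rectangle;
import java.util.ArrayList;
import java.util.List;

/**
 * Sketch of S2461: splitting the controls of the first interface into first
 * controls (fixed position, pinned near the top or bottom) and second controls
 * (moving with page browsing). The Control record, the scrolling flag and the
 * edge-band thresholds are assumptions for illustration.
 */
public class ControlClassifier {
    public record Control(String name, Rectangle bounds, boolean movesWithScrolling) {}

    /** First controls: fixed-position controls displayed near the top or bottom edge. */
    public static List<Control> firstControls(List<Control> all, int interfaceHeight) {
        List<Control> fixed = new ArrayList<>();
        int topBand = interfaceHeight / 10;
        int bottomBand = interfaceHeight - interfaceHeight / 10;
        for (Control c : all) {
            boolean atEdge = c.bounds().y < topBand
                          || c.bounds().y + c.bounds().height > bottomBand;
            if (!c.movesWithScrolling() && atEdge) fixed.add(c);
        }
        return fixed;
    }

    /** Second controls: everything that is not a first control. */
    public static List<Control> secondControls(List<Control> all, int interfaceHeight) {
        List<Control> scrolling = new ArrayList<>(all);
        scrolling.removeAll(firstControls(all, interfaceHeight));
        return scrolling;
    }
}
```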
S2462, the electronic device generates a combined control according to the second control.
Optionally, the electronic device may determine, according to the display style of each second control, a plurality of second controls whose display styles match, and then combine them to obtain a combined control. As one way, the electronic device may treat a plurality of second controls arranged along the same direction, either horizontal or vertical, as second controls whose display styles match.
Optionally, the electronic device may determine whether the arrangement of the second controls in the first interface is horizontal or vertical, so as to select some of the second controls from the first interface according to their arrangement, position and size, and recombine them into combined controls that the electronic device can map to the first area together. Optionally, the electronic device can recombine all of the second controls in the first interface to obtain a plurality of combined controls.
Optionally, the electronic device may combine the plurality of second controls horizontally arranged in one or more rows to obtain a combined control. Alternatively, the electronic device may also combine multiple second controls vertically arranged in one or more columns to obtain a combined control. Optionally, the electronic device may combine the plurality of second controls arranged horizontally with the plurality of second controls arranged vertically to obtain a combined control.
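A sketch of grouping second controls into combined controls by shared horizontal rows (the overlap test is an assumption; a column-wise variant would be symmetric):

```java
import java.awt.Rectangle;
import java.util.ArrayList;
import java.util.List;

/**
 * Sketch of S2462: second controls whose display styles match (here: aligned in
 * the same horizontal row) are combined into one combined control. The overlap
 * test is an assumption chosen for the sketch.
 */
public class ControlCombiner {
    public record Control(String name, Rectangle bounds) {}

    /** Groups second controls by shared row; each group forms one combined control. */
    public static List<List<Control>> combineRows(List<Control> controls) {
        List<List<Control>> groups = new ArrayList<>();
        for (Control c : controls) {
            List<Control> home = null;
            for (List<Control> g : groups) {
                Rectangle r = g.get(0).bounds();
                boolean sameRow = c.bounds().y < r.y + r.height
                               && r.y < c.bounds().y + c.bounds().height;
                if (sameRow) { home = g; break; }
            }
            if (home == null) {
                home = new ArrayList<>();
                groups.add(home);
            }
            home.add(c);
        }
        return groups;
    }
}
```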
Alternatively, the electronic device may display a selection cursor in the second area, such as the content selection box 2101 shown in fig. 21, where the selection cursor may prompt the user as to which portion of the content in the second area can currently be controlled through the first area.
Optionally, the selection cursor includes a boundary defining a selection area of the selection cursor in the zoomed-out first interface, the selection area being for selecting the at least one second control. Wherein the transparency of the selected area defined by the boundary may be 0 to 100%.
The content of the second area framed by the selection area is the content that the user can control through the first area. The electronic device may move the selection cursor within the second area according to the user's sliding operations on the first area, so that the selection area of the selection cursor frames different content.
Alternatively, the size of the selection area of the selection cursor may be the size of the area occupied by the combination control. The electronic device can determine the size of the selection area of the selection cursor according to the size of the area occupied by the combination control, so that the selection area of the selection cursor can be ensured to just frame the combination control. In this way, the electronic device can move the selection cursor among the plurality of combination controls according to the sliding operation of the user on the first area. Therefore, the user can determine the currently selected combination control of the electronic device through the selection cursor displayed by the electronic device.
Optionally, a fixed-position first control that does not move with page browsing may be displayed in the first interface at all times, so the electronic device may keep the first control mapped to the first area throughout the display of the first interface, allowing the user to operate it at any time. A non-fixed-position second control that moves with page browsing may or may not come into view as the page is browsed, so the electronic device may map the second control only when the user browses to it, rather than keeping it mapped at all times.
Optionally, the electronic device can map the combination control framed by the selection area of the selection cursor to the first area. When the user performs a sliding operation in the first area and thereby moves the selection cursor in the second area from a first combination control to a second combination control, the electronic device first maps the first combination control to the first area, i.e. displays the combination control selected by the selection cursor in the first area; then, following the movement of the selection cursor, it cancels the mapping of the first combination control and maps the second combination control to the first area instead, i.e. the electronic device stops displaying the first combination control in the first area and displays the newly selected second combination control. In this way, the electronic device can map the combination control framed by the selection area of the selection cursor to the first area in real time.
When the first interface is a browsing interface that includes a plurality of second controls, among which a first option control, a second option control, and a third option control are arranged along the horizontal direction, the electronic device may combine these three option controls to obtain an option combination control. Thus, when the selection cursor selects the option combination control, the electronic device displays the option combination control in the first area. Here, the selection area of the selection cursor may frame the option combination control exactly.
Optionally, the size of the selection area of the selection cursor may instead equal the size of the area occupied by an individual second control. In this way, when the user performs a sliding operation in the first area to move the selection cursor between different second controls in the second area, the electronic device maps each second control to the first area in turn, following the movement of the selection cursor. The electronic device thus maps the second control framed by the selection area to the first area for display in real time.
When the first interface is a browsing interface that includes a plurality of second controls, and the plurality of second controls include a first option control, a second option control, and a third option control arranged along the horizontal direction, the selection cursor may follow the user's sliding operation in the first area and select the first, second, and third option controls one by one. Optionally, when the selection cursor selects the first option control, the electronic device may map the first option control to the first area for display.
Optionally, when the selection cursor selects a single object, the electronic device may also let the user perform a click or double-click operation directly in the first area, without mapping the object to the first area for display, to implement touch control of that single object.
Alternatively, the electronic device may adaptively display selection cursors of different sizes according to different applications.
In one manner, when the first interface is an application interface of a first application program, the electronic device may treat the plurality of second controls arranged along the horizontal direction as the plurality of second controls with matching display styles and generate the combined control from them. As shown in fig. 19 (a), when the first interface is the home page of a search application, the focus control, the news control, and the map control are usually arranged horizontally; the electronic device may combine these horizontally arranged controls into a combined control and display the selection cursor 1902 according to the area the combined control occupies.
When the first interface is an application interface of a second application program, the electronic device may treat the plurality of second controls arranged along the vertical direction as the plurality of second controls with matching display styles, where the first application is different from the second application. As shown in fig. 18, when the first interface is the video playing interface of a short-video application, the focus control, the collection control, the comment control, and the sharing control are usually arranged vertically; the electronic device may combine these vertically arranged controls into a combined control and display the selection cursor 1802 according to the area the combined control occupies.
Alternatively, the electronic device may adaptively display selection cursors of different sizes according to different contents in the same application program. The size of the selection cursor is not limited in the embodiment of the application.
S2463, the electronic device determines whether the first control and the combined control fit the first area. If not, the electronic device executes S2464 and then returns to S2463. If yes, the electronic device executes S2465.
S2464, the electronic device adjusts the positions and the sizes of the first control and the combined control to obtain the adjusted first control and the adjusted combined control.
S2465, the electronic device maps the first control and the combined control to the first area.
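Steps S2463 to S2465 can be pictured as a check-adjust-map sequence. The sketch below assumes that a single uniform rescale is enough to make the controls fit; fitsIn, adjustToFit, mapToFirstArea, and the minimum size value are illustrative names and assumptions, not from the patent.

```kotlin
data class Rect(val w: Int, val h: Int)

// S2463: the occupied area matches the first area if it is neither larger
// than the first area nor smaller than a preset minimum size.
fun fitsIn(occupied: Rect, firstArea: Rect, minSize: Int = 24): Boolean =
    occupied.w <= firstArea.w && occupied.h <= firstArea.h &&
    minOf(occupied.w, occupied.h) >= minSize

// S2464: uniformly rescale the occupied area so it just fills the first
// area in its limiting dimension.
fun adjustToFit(occupied: Rect, firstArea: Rect): Rect {
    val s = minOf(firstArea.w.toDouble() / occupied.w,
                  firstArea.h.toDouble() / occupied.h)
    return Rect((occupied.w * s).toInt(), (occupied.h * s).toInt())
}

// S2465: map (here: simply return) the possibly adjusted area for display.
fun mapToFirstArea(occupied: Rect, firstArea: Rect): Rect =
    if (fitsIn(occupied, firstArea)) occupied else adjustToFit(occupied, firstArea)
```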
When few second controls exist in the first interface (for example, fewer than 10), the electronic device may directly determine whether the size of the area occupied by the second controls matches the size of the first area; if not, the electronic device may adjust the display style of the second controls to match the first area and then map the adjusted second controls to the first area.
When many second controls exist in the first interface (for example, no fewer than 10), the electronic device may divide the second controls into a plurality of areas and recombine the second controls of each area to obtain a plurality of combined controls. The electronic device can then map the combined control selected by the selection cursor to the first area as the cursor moves.
Optionally, the electronic device may determine whether the size of the area occupied by the combined control matches the size of the first area; if not, the electronic device may adjust the display style of the combined control so that the area occupied by the adjusted combined control matches the size of the first area.
In one manner, the electronic device may adjust the display style of each second control in the combined control, recombine the adjusted second controls to obtain an adjusted new combined control, and then map the adjusted new combined control to the first area.
Optionally, the electronic device may adjust, according to the size of the first area, the display size of each second control in the combined control, the display interval between two adjacent second controls in the combined control, or the display position of each second control in the combined control. The embodiment of the application does not limit how the display style of the combined control is adjusted.
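A sketch of these three adjustments, applied to the members of a combined control laid out as one horizontal row inside the first area, reusing the Control and Rect types from the sketches above; the fixed gap value and the layout policy are illustrative assumptions.

```kotlin
// Lay out the members of a combined control in one row inside the first area,
// adjusting size, interval, and position as described above.
fun layoutRowInFirstArea(members: List<Control>, area: Rect): List<Control> {
    val gap = 8                                                // adjusted display interval
    val w = (area.w - gap * (members.size - 1)) / members.size // adjusted display size
    return members.mapIndexed { i, c ->
        // Adjusted display position: evenly spaced along the row, clamped in height.
        c.copy(x = i * (w + gap), y = 0, w = w, h = minOf(c.h, area.h))
    }
}
```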
When a plurality of first controls exist in the first interface, the electronic device may likewise directly determine whether the size of the area occupied by the first controls matches the size of the first area; if not, the electronic device may adjust the display styles of the first controls to match the first area and then map the adjusted first controls to the first area. Optionally, the electronic device may also adjust the display styles of the first controls by displaying some of them in the form of a switch list and hiding the rest; when the user slides the switch list in the first area, the electronic device displays the previously hidden first controls.
When a first control and many second controls exist in the first interface, the electronic device may recombine the second controls to obtain a plurality of combined controls. The electronic device can then determine whether the size of the area occupied by the first control and the combined control selected by the selection cursor matches the size of the first area; if not, the electronic device may adjust the display styles of the first control and the selected combined control to match the first area.
It can be appreciated that when the area occupied by the first control and the combined control, displayed at their original layout and size, is much smaller than the first area, the electronic device may determine that their layout and size do not match the first area. Optionally, when that occupied area is larger than the first area, the electronic device may likewise determine that the layout and size do not match the first area. Optionally, when the controls are too small, i.e., the sizes of the first control and the combined control are smaller than a preset value, the electronic device may determine that the layout and size do not match the first area. The preset value may be set reasonably according to the implementation and application; the embodiment of the application does not limit it.
When the area occupied by the first control and the combined control, displayed at their original layout and size, is approximately consistent with the first area, the electronic device may determine that the layout and size match the first area and may map the first control and the combined control to the first area directly at their original layout and size.
When the layout and size of the first control and the combined control do not match the first area, the electronic device may adjust their layout and size so that, after adjustment, they match the first area, and then map the adjusted first control and combined control to the first area. Optionally, instead of adjusting the layout and size of the first control, the electronic device may display the first control at its original layout and size in the form of a switch list.
Taking the first control in the first interface as a bottom bar as an example, when the bottom bar contains many sub-controls, the electronic device may display a switch list of the sub-controls in the first area, as shown in fig. 23. For example, the bottom bar 2302 contains 5 sub-controls: "home, VIP member, message, shopping cart, my". The switch list for the bottom bar may initially display the first 3 sub-controls "home, VIP member, message" while the remaining 2 stay hidden; when the user slides the switch list, the display switches to "message, shopping cart, my", hiding the 2 sub-controls that were displayed before.
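The switch-list behaviour in this example can be sketched as a sliding window over the sub-control labels; SwitchList and its window arithmetic are illustrative assumptions chosen so that the output matches fig. 23 as described.

```kotlin
// Sliding window of `visible` sub-controls over the full list; a slide shifts
// the window, hiding the previously shown items.
class SwitchList(private val items: List<String>, private val visible: Int = 3) {
    private var start = 0

    fun shown(): List<String> = items.subList(start, start + visible)

    fun slide(forward: Boolean) {
        start = if (forward) minOf(start + visible - 1, items.size - visible)
                else maxOf(start - (visible - 1), 0)
    }
}

// Usage matching fig. 23: shown() is [home, VIP member, message];
// after slide(true) it becomes [message, shopping cart, my].
val bar = SwitchList(listOf("home", "VIP member", "message", "shopping cart", "my"))
```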
Optionally, according to the position of a target control in the first interface, the electronic device may also establish a coordinate mapping relationship between the position of the target control in the first interface and a target position in the first area, thereby mapping positions in the first area to the reduced first interface displayed in the second area. The electronic device can respond to a touch operation on the target position in the first area by mapping it to a touch operation on the target control in the reduced first interface, and then, in response to that touch operation on the target control, execute the function corresponding to the target control on the reduced first interface. The target control may be the first control, the second control, or the third control.
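A plausible form of such a coordinate mapping is a linear (proportional) mapping from the first area into the reduced first interface, as sketched below; the Point and Area types and mapTouch are illustrative, and the patent does not prescribe this particular formula.

```kotlin
data class Point(val x: Float, val y: Float)
data class Area(val left: Float, val top: Float, val w: Float, val h: Float)

// Map a touch point in the first area to the corresponding point in the
// reduced first interface, so a tap at the target position becomes a tap
// on the target control.
fun mapTouch(p: Point, firstArea: Area, reducedInterface: Area): Point {
    val nx = (p.x - firstArea.left) / firstArea.w   // normalise within first area
    val ny = (p.y - firstArea.top) / firstArea.h
    return Point(reducedInterface.left + nx * reducedInterface.w,
                 reducedInterface.top + ny * reducedInterface.h)
}
```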
S2470, the electronic device detects a touch operation performed by the user on the first area.
The touch operation may include the user touching the display screen of the electronic device, or touching and then moving across it. The touch may be performed with a finger or another part of the body, or with a touch device such as a stylus. The touch may be a direct contact with the touch screen, or a hover control performed within a small vertical distance of the touch screen surface; that is, the finger may touch the screen directly, or may control it from within a small distance of the screen surface without direct contact.
Optionally, the touch operation may be a common touch operation such as sliding up, sliding down, sliding left, sliding right, clicking, double-clicking, or long-pressing, or a specific touch gesture such as drawing a circle, a check mark, or a cross. The embodiment of the present application is not limited thereto.
Optionally, the electronic device may monitor in real time, using a gesture algorithm, the touch gestures the user performs on the display screen, and determine whether a touch gesture falls within the first area of the display screen. When the electronic device detects that a touch gesture acting on the display screen is located in the first area, the electronic device may map the touch gesture to the second area.
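The detection step can be sketched as a hit test followed by the mapping above, reusing the Point, Area, and mapTouch names from the previous sketch; this is an assumption about one possible structure, not the patent's gesture algorithm.

```kotlin
// Hit-test each touch point against the first area; only touches inside it
// are mapped into the second area, all others are ignored here.
fun onTouch(p: Point, firstArea: Area, reducedInterface: Area): Point? =
    if (p.x in firstArea.left..(firstArea.left + firstArea.w) &&
        p.y in firstArea.top..(firstArea.top + firstArea.h))
        mapTouch(p, firstArea, reducedInterface)   // forward into the second area
    else null                                      // outside the first area
```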
S2480, in response to the touch operation performed by the user on the first area, the electronic device executes, according to the mapping relationship, the function corresponding to the touch operation on the reduced screen display area displayed in the second area and on the reduced first interface.
Optionally, when the first interface is the home page of an application program and the electronic device detects that the touch operation acting on the first area is a left-slide operation, the electronic device may execute the exit function of the application. In response to the left-slide operation, the electronic device controls the reduced screen display area to switch from displaying the first interface to displaying the desktop interface returned to after the application exits.
Optionally, when the first interface is a non-home page of an application program and the electronic device detects that the touch operation acting on the first area is a left-slide operation, the electronic device may execute the function of returning to the previous page of the application. In response to the left-slide operation, the electronic device controls the reduced screen display area to switch from displaying the first interface to displaying the application interface of the previous level.
Optionally, when the first interface is a desktop interface and the electronic device detects that the touch operation acting on the first area is a left-slide operation, the electronic device may execute the desktop-interface switching function. In response to the left-slide operation, the electronic device controls the reduced screen display area to switch from displaying the first interface to displaying a new desktop interface.
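The three left-slide cases above amount to a dispatch on the kind of interface currently displayed; the following sketch uses an illustrative InterfaceKind enum and placeholder action strings, neither of which comes from the patent.

```kotlin
enum class InterfaceKind { APP_HOME_PAGE, APP_SUB_PAGE, DESKTOP }

// Dispatch a left-slide in the first area according to what the reduced
// first interface currently shows.
fun onLeftSlide(kind: InterfaceKind): String = when (kind) {
    InterfaceKind.APP_HOME_PAGE -> "exit the application, show the reduced desktop interface"
    InterfaceKind.APP_SUB_PAGE  -> "return to the previous-level application interface"
    InterfaceKind.DESKTOP       -> "switch to a new desktop interface"
}
```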
Optionally, when the first interface is an application interface of an application program and a plurality of operation controls framed by the content selection box in the application interface are mapped to the first area, the electronic device may detect the user's click operation on a target operation control in the first area and execute the function performed when the target operation control in the application interface is touched.
For example, when the video playing control in the reduced video-playing application interface displayed in the second area is mapped to the first area, and the electronic device detects the user's click operation on the video playing control in the first area, the electronic device may execute the function performed when the video playing control in the reduced interface is touched, namely the video playing function, and control the reduced screen display area to display the video playing picture.
For another example, when the video playing control in a video-playing application interface is located at the center of the video playing interface and the electronic device has pre-established a coordinate mapping relationship between the center position of the video playing interface and the center position of the first area, then, upon detecting the user's click operation on the center position of the first area, the electronic device may map that click to a click on the playing control in the reduced video playing interface. The electronic device can then respond to the click on the playing control by playing the video in the reduced video playing interface, controlling the reduced screen display area to display the video playing picture.
It should be understood that the above touch operations and corresponding functions are merely examples, and embodiments of the present application are not limited thereto.
According to the single-hand operation method provided by the embodiment of the application, the electronic device can adaptively adjust the touch area of the one-handed operation mode according to the user's one-handed holding gesture, ensuring that the touch area suits one-handed operation. The remaining area of the display screen, outside the touch area of the one-handed operation mode, serves as the display area of the one-handed operation mode and can display the user interface of the original display area in reduced form. In addition, if a display area remains after the reduced user interface is displayed, the electronic device may rearrange desktop applications in the remaining display area and display them there. Meanwhile, the electronic device establishes a mapping relationship between the touch area and the display area of the one-handed operation mode, so that after detecting a touch operation on the touch area, it can map the operation, according to the mapping relationship, to a touch operation on the display area and execute the function that would be executed when the display area is touched. The user can therefore operate the full range of the reduced original interface by performing touch operations within the touch area. Without requiring applications to be newly developed or adapted, this solves the problem that, when operating the electronic device with one hand, the user's finger cannot reach part of the content displayed on the screen.
It will be appreciated that the electronic device, in order to achieve the above-described functions, includes corresponding hardware and/or software modules that perform the respective functions. The present application can be implemented in hardware or a combination of hardware and computer software, in conjunction with the example algorithm steps described in connection with the embodiments disclosed herein. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Those skilled in the art may implement the described functionality using different approaches for each particular application in conjunction with the embodiments, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The present embodiment may divide the electronic device into functional modules according to the above method examples; for example, each functional module may correspond to one function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in hardware. It should be noted that the division of modules in this embodiment is schematic and merely a logical function division; other division manners may be used in actual implementation.
Embodiments of the present application also provide a computer storage medium having stored therein computer instructions which, when executed on an electronic device, cause the electronic device to perform the above-described related method steps to implement the interface display method in the above-described embodiments.
Embodiments of the present application also provide a computer program product, which when run on a computer, causes the computer to perform the above-mentioned related steps to implement the interface display method performed by the electronic device in the above-mentioned embodiments.
In addition, embodiments of the present application also provide an apparatus, which may be embodied as a chip, component or module, which may include a processor and a memory coupled to each other; the memory is configured to store computer-executable instructions, and when the apparatus is running, the processor may execute the computer-executable instructions stored in the memory, so that the chip executes the interface display method executed by the electronic device in the above method embodiments.
The electronic device, the computer storage medium, the computer program product, or the chip provided in this embodiment are used to execute the corresponding methods provided above, so that the beneficial effects thereof can be referred to the beneficial effects in the corresponding methods provided above, and will not be described herein.
It will be apparent to those skilled in the art from this description that, for convenience and brevity of description, only the above-described division of the functional modules is illustrated, and in practical application, the above-described functional allocation may be performed by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a readable storage medium. Based on such understanding, the technical solution of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the method described in the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or other media capable of storing program code.
The foregoing is merely illustrative of specific embodiments of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present application should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (26)

1. A one-handed operation method, applied to an electronic device, the electronic device including a display screen, the method comprising:
Displaying a first interface;
responding to a trigger instruction of a single-hand operation mode, and determining a first area and a second area of the display screen; the first area is an area which can be touched by a finger of a user when the user holds the electronic equipment with one hand, and the second area is the remaining area of the display screen other than the first area;
Displaying the reduced first interface in the second area;
And responding to the touch operation acted on the first area, and executing a function corresponding to the touch operation on the reduced first interface.
2. The method of claim 1, wherein the determining the first region of the display screen comprises:
Prompting a user to slide on the display screen according to a specified track by using the fingers of the single hand when the user holds the electronic equipment by the single hand;
determining the maximum area which can be touched by the finger of the single hand on the display screen according to the sliding of the finger of the single hand;
and determining a first area of the display screen according to the maximum area.
3. The method of claim 1, wherein the first interface includes a target control, the responding to the touch operation applied to the first area, and performing a function corresponding to the touch operation on the reduced first interface includes:
Responding to the touch operation acting on the target position in the first area, and mapping the touch operation acting on the target position in the first area into the touch operation acting on the target control in the reduced first interface; a coordinate mapping relation is pre-established between a target position in the first area and a position of the target control in the first interface;
Responding to the touch operation of the target control in the reduced first interface, and executing the function corresponding to the target control on the reduced first interface.
4. The method of claim 3, wherein the first interface is a video playing interface, the target control is a playing control located at a central position of the video playing interface, and a coordinate mapping relationship is pre-established between the central position of the video playing interface and the central position of the first area;
The responding to the touch operation acted on the first area, executing the function corresponding to the touch operation on the reduced first interface, including:
Responding to the touch operation acting on the central position of the first area, and mapping the touch operation acting on the central position of the first area into the touch operation acting on the play control in the reduced video play interface;
and responding to the touch operation of the playing control in the reduced video playing interface, and playing the video in the reduced video playing interface.
5. The method of claim 1, wherein the first interface comprises a target control, the method further comprising, after displaying the zoomed-out first interface:
displaying the target control in the first area;
The responding to the touch operation acted on the first area, executing the function corresponding to the touch operation on the reduced first interface, including:
And responding to the touch operation acted on the target control in the first area, and executing the function corresponding to the target control on the reduced first interface.
6. The method of claim 5, wherein the first interface is a video playback interface, the target control is a playback control, and after displaying the zoomed-out first interface, the method further comprises:
displaying the play control in the first area;
The responding to the touch operation acted on the first area, executing the function corresponding to the touch operation on the reduced first interface, including:
And responding to the touch operation acted on the playing control in the first area, and playing the video in the video playing interface after shrinking.
7. The method of claim 5, wherein the target control is a first control, and wherein after displaying the zoomed-out first interface, the method further comprises:
according to the display style of each control in the first interface, acquiring a control fixedly displayed at a first position in the first interface from the first interface as the first control; wherein the first position includes at least one of a top position and a bottom position.
8. The method of claim 7, wherein the first control comprises at least one of a title bar, a navigation bar, a menu bar.
9. The method of claim 5, wherein the target control is a second control, and wherein after displaying the zoomed-out first interface, the method further comprises:
According to the display style of each control in the first interface, acquiring a control which is not fixedly displayed at a first position in the first interface from the first interface as a second control; wherein the first position comprises at least one of a top position and a bottom position;
Displaying a selection cursor on the reduced first interface, wherein the selection cursor comprises a boundary which defines a selection area of the selection cursor in the reduced first interface, and the selection area is used for selecting at least one second control;
the displaying the target control in the first area includes:
And displaying the second control in the selection area of the selection cursor in the first area.
10. The method of claim 9, wherein the first interface is a browsing interface, the browsing interface including a plurality of the second controls including a first option control, the method further comprising:
And when the first option control is included in the selection area of the selection cursor, displaying the first option control in the first area.
11. The method of claim 9, wherein after retrieving, from the first interface, a control that is not fixedly displayed at a first location within the first interface as a second control, the method further comprises:
Combining the plurality of second controls matched with the display style to obtain a combined control;
Determining the area occupied by the combined control as a selection area of the selection cursor;
The displaying the second control in the selection area of the selection cursor in the first area includes:
and displaying the combined control in the selection area of the selection cursor in the first area.
12. The method of claim 11, wherein the method further comprises:
and according to the display style of each second control, using a plurality of second controls arranged along the same direction as a plurality of second controls matched with the display style.
13. The method of claim 12, wherein the first interface is an application interface, the method further comprising:
When the first interface is an application interface of the first application program, a plurality of second controls which are arranged along the horizontal direction are used as a plurality of second controls with the matched display patterns;
When the first interface is an application interface of a second application program, a plurality of second controls which are arranged along the vertical direction are used as a plurality of second controls with the matched display style; wherein the first application is different from the second application.
14. The method of claim 12, wherein the first interface is a browsing interface, the browsing interface including a plurality of the second controls, the plurality of the second controls including a first option control, a second option control, and a third option control arranged in a horizontal direction, the method further comprising:
Combining the first option control, the second option control and the third option control which are arranged along the horizontal direction to obtain an option combination control;
When the selection area of the selection cursor comprises the option combination control, displaying the option combination control in the first area; the size of the selection area of the selection cursor is matched with the size of the area occupied by the option combination control.
15. The method according to any one of claims 11-14, further comprising:
When the size of the area occupied by the combined control is not matched with the size of the first area, adjusting the display style of each second control in the combined control to obtain an adjusted combined control; the size of the area occupied by the adjusted combined control is matched with the size of the first area;
And displaying the adjusted combination control in the first area.
16. The method of claim 15, wherein the adjusting the display style of each of the second controls in the combined control comprises:
adjusting the display size of each second control in the combined control according to the size of the first area; or
adjusting the display interval between two adjacent second controls in the combined control according to the size of the first area; or
adjusting the display position of each second control in the combined control according to the size of the first area.
17. The method according to any one of claims 1 to 16, wherein the first interface is a home page of an application program, the responding to the touch operation applied to the first area, and executing a function corresponding to the touch operation on the reduced first interface includes:
and responding to the left sliding operation acted on the first area, exiting the application program, and displaying the zoomed-out desktop interface in the second area.
18. The method according to any one of claims 1 to 16, wherein when the first interface is a non-home page of an application, the responding to the touch operation applied to the first area, and executing the function corresponding to the touch operation on the reduced first interface, includes:
And responding to the left sliding operation acted on the first area, displaying a reduced second interface, wherein the second interface is the interface of the upper level of the first interface.
19. The method of any one of claims 1-16, wherein the performing a function corresponding to the touch operation on the reduced first interface in response to the touch operation applied to the first area comprises:
Detecting a preset gesture operation acting on the first area, wherein the preset gesture operation is used for triggering a preset function in a third interface;
And responding to the preset gesture operation, switching the reduced first interface displayed in the second area to the reduced third interface, and executing the preset function.
20. The method of any of claims 1-19, wherein the second region includes a target display region and a third region, the displaying the reduced first interface in the second region comprising:
displaying the reduced first interface in the target display area;
displaying a plurality of icon controls in the third area; the icon control comprises at least one of an icon control of an application program and an icon control of a shortcut function.
21. The method of claim 20, wherein displaying a plurality of icon controls in the third region comprises:
rearranging icon controls of a plurality of application programs in a desktop interface;
Displaying the rearranged icon controls of the plurality of application programs in the third area.
22. The method of claim 20, wherein the method further comprises:
Displaying a switching control in the first area, wherein an action area of the first area is the target display area;
determining that an active area of the first area is switched from the target display area to the third area in response to a touch operation acting on the switching control in the first area;
and responding to the touch operation acted on the first area, and executing functions corresponding to the touch operation of the first area on the plurality of icon controls in the third area.
23. An electronic device comprising a display screen, a memory, and one or more processors; the display screen, the memory and the processor are coupled; the memory is for storing computer program code comprising computer instructions which, when executed by the processor, cause the electronic device to perform the method of any of claims 1-22.
24. A chip system, wherein the chip system is applied to an electronic device; the system-on-chip includes one or more interface circuits and one or more processors; the interface circuit and the processor are interconnected through a circuit; the interface circuit is configured to receive a signal from a memory of the electronic device and to send the signal to the processor, the signal including computer instructions stored in the memory; the electronic device, when executing the computer instructions, performs the method of any of claims 1-22.
25. A computer storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the method of any of claims 1-22.
26. A computer program product, characterized in that the computer program product, when run on an electronic device, causes the electronic device to perform the method of any of claims 1-22.