CN113641275A - Interface control method and electronic equipment - Google Patents

Interface control method and electronic equipment

Info

Publication number
CN113641275A
CN113641275A (application CN202110973350.6A)
Authority
CN
China
Prior art keywords
control
input
target
interface
electronic equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110973350.6A
Other languages
Chinese (zh)
Inventor
宣伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202110973350.6A priority Critical patent/CN113641275A/en
Publication of CN113641275A publication Critical patent/CN113641275A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817: Interaction techniques using icons
    • G06F 3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F 3/0485: Scrolling or panning
    • G06F 3/0486: Drag-and-drop
    • G06F 3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an interface control method and belongs to the field of electronic equipment. The method comprises the following steps: displaying a first control in a first area of a target interface in a case that it is detected that a one-hand mode is turned on; receiving a first input for the first control; and generating, in response to the first input, a second control associated with the first control, where the second control is used for executing a target operation on the target interface according to a second input for the first control.

Description

Interface control method and electronic equipment
Technical Field
The application belongs to the field of electronic equipment, and particularly relates to an interface control method and electronic equipment.
Background
With the progress of hardware technology and growing user demand, the screens of portable electronic devices have become larger and larger. While a large screen gives the user a better viewing experience, it also brings inconvenience.
At present, when holding an electronic device, a user's one-handed operation area can hardly cover the entire screen, so the user can often operate the current interface only with both hands. The operation mode of the electronic device is therefore not very convenient for the user.
Disclosure of Invention
The embodiments of the application aim to provide an interface control method and an electronic device, which can solve the problem that the operation mode of the electronic device is currently inconvenient for the user.
In a first aspect, an embodiment of the present application provides an interface control method, where the method includes:
displaying a first control in a first area of a target interface in a case that it is detected that a one-hand mode is turned on;
receiving a first input for the first control;
generating, in response to the first input, a second control associated with the first control; the second control is used for executing a target operation on the target interface according to a second input for the first control.
In a second aspect, an embodiment of the present application provides an interface control apparatus, including:
a display module, used for displaying a first control in a first area of a target interface in a case that it is detected that a one-hand mode is turned on;
a first receiving module, used for receiving a first input for the first control;
a generating module, used for generating, in response to the first input, a second control associated with the first control; the second control is used for executing a target operation on the target interface according to a second input for the first control.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, and when executed by the processor, the program or instructions implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In the embodiment of the application, when detecting that the one-hand mode is turned on, the electronic device can display a first control in a first area of a target interface and, through a user's first input to the first control, generate a second control associated with the first control. The user can then perform a second input to the first control with one hand, and the target operation is executed on the target interface through the second control. This solves the problem of inconvenient one-handed operation and improves the convenience of the user's operation mode for the electronic device.
Drawings
FIG. 1 is a flowchart illustrating steps of an interface control method according to an embodiment of the present disclosure;
FIG. 2 is one of the schematic diagrams of a target interface provided by an embodiment of the present application;
FIG. 3 is a second schematic diagram of a target interface provided by an embodiment of the present application;
FIG. 4 is a third schematic diagram of a target interface provided by an embodiment of the present application;
FIG. 5 is a fourth schematic view of a target interface provided by embodiments of the present application;
FIG. 6 is a fifth schematic view of a target interface provided by embodiments of the present application;
FIG. 7 is a sixth schematic view of a target interface provided by embodiments of the present application;
FIG. 8 is a seventh schematic view of a target interface provided by embodiments of the present application;
FIG. 9 is a schematic structural diagram of an interface control apparatus according to an embodiment of the present disclosure;
fig. 10 is a schematic structural diagram of an electronic device provided in an embodiment of the present application;
fig. 11 is a second schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described clearly below with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are some, but not all, of the embodiments of the present application. All other embodiments that can be derived by one of ordinary skill in the art from the embodiments given herein fall within the scope of the present disclosure.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar objects, and are not necessarily used to describe a particular order or sequence. It should be understood that the terms so used are interchangeable under appropriate circumstances, so that the embodiments of the application can be implemented in orders other than those illustrated or described herein. Objects distinguished by "first", "second", and the like are generally of one type, and their number is not limited; for example, the first object may be one object or more than one object. In addition, "and/or" in the description and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the objects before and after it.
The interface control method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
Referring to fig. 1, the present application provides an interface control method applied to an electronic device, including:
step 101, displaying a first control in a first area of a target interface under the condition that the single-hand mode is detected to be started.
Step 102, receiving a first input for the first control.
Step 103, generating, in response to the first input, a second control associated with the first control; the second control is used for executing a target operation on the target interface according to a second input for the first control.
In step 101, the one-hand mode may be triggered by a user input or by a change in a parameter of the electronic device. For example, in some embodiments, the user may turn on the one-hand mode through a gesture operation or a touch operation, for example, long-pressing a corner region of the target interface. In some embodiments, the user may also turn on the one-hand mode through a voice input. Of course, in some embodiments, the electronic device may determine, from a parameter detected by a sensor such as a gyroscope, that the user is currently operating with one hand, and turn on the one-hand mode accordingly; the triggers are not listed one by one here.
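As a concrete illustration of the long-press trigger, the following Kotlin sketch detects a long press inside a corner hot zone of the interface and invokes a callback that turns on the one-hand mode. It is not part of the patent disclosure; the class name, the size of the hot zone, and the onEnable callback are assumptions made for the example.

```kotlin
import android.view.GestureDetector
import android.view.MotionEvent
import android.view.View

// Hypothetical helper: enables a one-hand mode when the user long-presses
// a bottom corner of the root view (one of the triggers described above).
class OneHandModeDetector(
    private val rootView: View,
    private val onEnable: () -> Unit          // assumed callback
) {
    private val cornerSizePx = 120            // assumed hot-zone size

    private val detector = GestureDetector(rootView.context,
        object : GestureDetector.SimpleOnGestureListener() {
            override fun onLongPress(e: MotionEvent) {
                if (isInBottomCorner(e.x, e.y)) onEnable()
            }
        })

    // Forward the root view's touch events here.
    fun onTouchEvent(event: MotionEvent): Boolean = detector.onTouchEvent(event)

    private fun isInBottomCorner(x: Float, y: Float): Boolean {
        val nearBottom = y >= rootView.height - cornerSizePx
        val nearSide = x <= cornerSizePx || x >= rootView.width - cornerSizePx
        return nearBottom && nearSide
    }
}
```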
The target interface may be the user's current operation interface, that is, the current display interface of the electronic device. The first control can be displayed in the first area of the target interface, and the specific position can be determined according to actual needs. It can be understood that the first area can be an area close to the user's operating finger, so that the user can operate the first control within it. Through a touch operation or a gesture operation on the first control, the user can access the further functions of the one-hand mode.
The first control can be displayed in the form of a key or a key combination, and the user can operate the target interface through one or more keys. Of course, to improve the user's visual experience, the first control may also be displayed in the form of a virtual finger, which is not limited herein.
In steps 102 and 103, the first input may be a touch operation or a gesture operation performed by the user on the first control; it confirms that the user currently needs one-handed operation and, at the same time, avoids false touches. For example, the first input may be a single-click operation on the first control. In this case, the electronic device can generate a second control associated with the first control in response to the first input.
Specifically, in some embodiments, different first inputs may generate different second controls. For example, the user may generate second controls of different sizes depending on the duration of a long press on the first control, or generate a second control at a different position or with a different orientation depending on where the first control is clicked; the variants are not listed one by one here.
It should be noted that the first control may be displayed in a first area that the user can reach with a touch operation while holding the device with one hand, while at least part of the second control may be displayed in an area that the user currently cannot reach. In this way, when the target interaction position at which the user needs to perform the target operation lies in the unreachable area, the second control can be controlled, through an input to the first control, to perform the target operation there.
The second control is used for enabling full-screen operation of the target interface through the user's second input to the first control. It can be understood that the second control can be displayed in a floating manner on the target interface, and parameters such as its display position or display size can be changed, so that the user can intuitively adjust the second control to make it interact with the object at the target interaction position.
The target operation can be determined according to the actual application scenario. For example, the second control may interact with an application icon, as in FIG. 2, to open or delete the application or to move the application icon. The second control may also interact with other controls, as shown in FIG. 5, so as to close or switch the interface, and so on, which is not described herein again. Of course, in some embodiments, different second inputs may be mapped to different target operations, for example, clicking to open an application, long-pressing to uninstall an application, or sliding to switch interfaces.
Optionally, to improve the user's visual experience and the fun of interaction, referring to FIGS. 2 to 5, the second control may be displayed on the target interface as a virtual finger, and the virtual finger may stretch or change its angle according to the second input, so as to interact with objects at different interaction positions of the target interface. Of course, referring to FIG. 6, to avoid occlusion, the second control may instead be displayed in the form of a small circular key or a cursor; the possible forms are not illustrated one by one here.
Of course, in some embodiments, the second control may not be displayed, and the user may identify the current interaction object of the second control through the interaction effect between the second control and that object. For example, when the interaction position of the second control is located on an application icon, the icon may show a pressed-in effect; when the interaction position is located on another control, the function of that control may be announced in the form of a text box or a voice broadcast, so as to inform the user of the object the second control is about to interact with.
The second input is similar to the first input and includes, but is not limited to, a touch or gesture input by the user on the first control. It should be understood that the second input may be an input by which the user changes the interaction position or the interaction object of the second control. For example, the user may change the display size or display position of the second control by dragging the first control within the first area, or may, by double-clicking or single-clicking the first control within the first area, make the second control perform an interactive operation such as opening an application or closing an interface. In this way, during one-handed use the user can operate the entire target interface solely through second inputs to the first control within the first area.
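A minimal Kotlin sketch of this routing, again not part of the disclosure: dragging the first control (ACTION_MOVE) adjusts the second control's display position in the FIG. 6 style, and lifting the finger (ACTION_UP) ends the second input and triggers the target operation, as described further below. The SecondControl interface and all names are hypothetical.

```kotlin
import android.annotation.SuppressLint
import android.view.MotionEvent
import android.view.View

// Hypothetical abstraction for the second control.
interface SecondControl {
    fun moveBy(dx: Float, dy: Float)   // adjust the display position
    fun performTargetOperation()       // act at the current interaction position
}

@SuppressLint("ClickableViewAccessibility")
fun bindFirstControl(firstControl: View, secondControl: SecondControl) {
    var lastX = 0f
    var lastY = 0f
    firstControl.setOnTouchListener { _, event ->
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> {
                lastX = event.rawX; lastY = event.rawY
                true
            }
            MotionEvent.ACTION_MOVE -> {
                // Dragging the first control moves the second control.
                secondControl.moveBy(event.rawX - lastX, event.rawY - lastY)
                lastX = event.rawX; lastY = event.rawY
                true
            }
            MotionEvent.ACTION_UP -> {
                // The finger leaving the screen ends the second input and
                // triggers the target operation at the adjusted position.
                secondControl.performTargetOperation()
                true
            }
            else -> false
        }
    }
}
```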
In the embodiment of the application, when detecting that the one-hand mode is turned on, the electronic device can display the first control in the first area of the target interface and, through the user's first input to the first control, generate a second control associated with the first control. The user can then perform the second input to the first control with one hand, and the operation corresponding to the second input is executed on the target interface through the second control. This solves the problem of inconvenient one-handed operation and improves the convenience of the user's operation mode for the electronic device.
Optionally, the second input includes a first sub-input and a second sub-input, and after step 103, the method further includes:
receiving the second input;
adjusting a display parameter of the second control in response to the second input;
and executing the target operation through the adjusted second control.
As can be seen from the above, the user can change the interaction position or the interaction object of the second control through the second input, and can further perform an interaction operation on the interaction object at the interaction position corresponding to the adjusted second control.
The step of executing the target operation through the adjusted second control may be triggered when the electronic device detects that the second input has ended. For example, after the user performs a pressing operation, when the electronic device detects that the user's finger has left the target interface, it can determine that the adjustment of the second control is complete, and the target operation can then be triggered and executed automatically.
Of course, in other optional embodiments, the step of executing the target operation through the adjusted second control may also be triggered by another touch operation of the user, that is, the second input may include a plurality of sub-inputs, and the step of adjusting the display parameter and executing the target operation is completed through different sub-inputs.
Accordingly, the second input may include a first sub-input and a second sub-input. The first sub-input and the second sub-input can be touch operations of a user, and the user can change the display parameters of the second control through the first sub-input.
After the target interaction position is determined, the user can also perform the corresponding interactive operation through the second control by means of the second sub-input. It can be understood that the user can, through different touch operations on the first control, trigger the second control to perform different target operations on the target interface.
For example, when the target interaction position of the second control is located on an application icon, the user can start the application by clicking the first control, or drag or uninstall the application by long-pressing the first control; these may be the same touch operations the user would perform when interacting with the application icon directly, and are not described herein again.
Referring to FIGS. 2-5, in these embodiments the first control and the second control are each displayed in the form of a virtual finger. As shown in FIG. 2, the virtual finger displayed outside the first area may be the second control before adjustment. The user may increase the length of the second control by long-pressing the fingertip portion of the first control; the virtual finger displayed outside the first area in FIG. 3 may be the adjusted second control corresponding to FIG. 2. Accordingly, the user may shorten the second control by long-pressing the finger-root portion of the first control. Of course, the user may also control the orientation angle of the second control by dragging the fingertip of the first control; for example, the virtual finger displayed outside the first area in FIG. 4 may be the adjusted second control corresponding to FIG. 3.
Referring to FIG. 6, in the embodiment of FIG. 6 the first control and the second control may both be displayed in the form of circular keys. The user can change the display position of the second control by dragging the first control within the first area: the dashed boxes in FIG. 6 show the first control and the second control before adjustment, and the solid boxes show them after adjustment. The specific design can be set according to actual needs or customized by the user.
It should be appreciated that the display parameters may be used to determine the target interaction position of the second control, and the user may change the target interaction position by adjusting the display parameters. Taking FIGS. 2 to 5 as an example, when the second control is displayed as a virtual finger, the target interaction position between the second control and the target interface may be the fingertip position of the virtual finger, and the user may change the fingertip position, and hence the target interaction position, by changing the length and orientation angle of the virtual finger.
Taking FIG. 6 as an example, when the second control is displayed as a circular key, the target interaction position between the second control and the target interface may be the position of the circular key, and the user may change the target interaction position by changing the display position of the circular key, and so on.
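For the virtual-finger case, the fingertip position follows from elementary geometry: the tip is the base point offset by the display length along the orientation angle. The following Kotlin sketch of that computation is illustrative only; the type and function names are assumptions, not part of the disclosure.

```kotlin
import kotlin.math.cos
import kotlin.math.sin

// Minimal point type so the sketch stays self-contained.
data class PointF(val x: Float, val y: Float)

/**
 * Computes the fingertip (target interaction position) of a virtual finger.
 *
 * @param base     anchor point of the virtual finger in screen coordinates
 * @param lengthPx display length of the virtual finger, in pixels
 * @param angleRad orientation angle, measured from the positive x-axis
 */
fun fingertipPosition(base: PointF, lengthPx: Float, angleRad: Float): PointF =
    PointF(
        base.x + lengthPx * cos(angleRad),
        base.y - lengthPx * sin(angleRad)  // screen y grows downward
    )

// Example: a 400 px finger at 60 degrees reaches up and to the right.
val tip = fingertipPosition(PointF(900f, 2000f), 400f, Math.toRadians(60.0).toFloat())
```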
In the embodiment of the application, the electronic device can change the display parameters of the second control through the second input, so that the user can change the target interaction position of the second control in an intuitive, visual way. This improves the user's visual experience and the fun of interaction, enriches the ways of controlling the second control, and improves the user experience.
Further, the display parameters include at least one of the following: a display size; a display angle; a display position.
It is understood that the display size includes, but is not limited to, the display length and display width of the second control; the display angle may be understood as the angle between the second control and any boundary of the target interface; and the display position may be determined by the position coordinates of the second control on the target interface.
Which display parameters apply can be determined according to the specific display form of the second control. For FIGS. 2 to 5, the display parameters may include the display length and the orientation angle of the virtual finger; for FIG. 6, they may include the display coordinates of the circular key. The user can adjust a display parameter through the first sub-input, thereby adjusting the target interaction position of the second control.
Optionally, step 103 includes:
in response to the first input, determining a target finger associated with the first input;
generating the second control corresponding to the target finger;
after the step of generating a second control associated with the first control, the method further comprises:
and under the condition that the second input is received, executing target operation corresponding to the target finger on the target interface.
In the embodiment of the application, to simplify operation, the user may perform the above first input with different fingers, so as to generate a second control corresponding to the target finger currently in use and perform, through that second control, the interactive operation corresponding to the target finger.
The step of determining the target finger associated with the first input may be performed by obtaining biometric information, such as a fingerprint, when the user touches the first control, so as to determine which finger is currently performing the first input. The electronic device may then generate the second control corresponding to that target finger.
It should be understood that the display form of the second control can be determined according to the target finger. In a specific embodiment, the electronic device may display a virtual finger having the same shape as the target finger, to improve the user's visual experience and remind the user which finger is currently in use; in other embodiments, the electronic device may display the second control in different forms for different fingers, which is not described herein again.
Further, the electronic device may store in advance a matching relationship between fingers and interactive operations of the second control. For example, when the target finger is a thumb, the user may start an application through the second input, and when the target finger is an index finger, the user may delete an application through the second input; the combinations are not listed one by one here.
Generally, when a user holds an electronic device with one hand, it is inconvenient to use fingers other than the thumb for touch operation. In a specific embodiment, the electronic device may therefore match the left thumb with the interactive operation of starting an application and the right thumb with the interactive operation of deleting an application, so that different interactive operations are performed by the left and right thumbs; a sketch of such a pre-stored mapping follows below.
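Such a pre-stored matching relationship could be as simple as a lookup table. The Kotlin sketch below is an illustration under assumed names; the enum values and map contents are invented for the example and are not the patent's actual mapping.

```kotlin
// Hypothetical finger identifiers and target operations.
enum class Finger { LEFT_THUMB, RIGHT_THUMB, LEFT_INDEX, RIGHT_INDEX }

enum class TargetOperation { OPEN_APP, DELETE_APP }

// Pre-stored matching relationship between fingers and operations
// (assumed contents, matching the left/right thumb example above).
val fingerOperationMap: Map<Finger, TargetOperation> = mapOf(
    Finger.LEFT_THUMB to TargetOperation.OPEN_APP,
    Finger.RIGHT_THUMB to TargetOperation.DELETE_APP
)

// Returns null when no operation is registered for the identified finger.
fun operationFor(finger: Finger): TargetOperation? = fingerOperationMap[finger]
```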
In the embodiment of the application, the electronic device determines which second control to generate by identifying the target finger associated with the first input, so that the user can generate second controls that execute different interactive operations simply by using different fingers. This simplifies the user's operation process and improves the user experience.
It should be noted that, after the user generates the corresponding second control with the target finger, the second control may still be triggered, through different touch operations on the first control, to execute different interactive operations on the target interface, further enriching the ways of controlling the second control.
Optionally, step 101 includes:
acquiring position parameters of the electronic equipment, wherein the position parameters comprise an included angle between the electronic equipment and a reference coordinate axis and the orientation of the electronic equipment;
under the condition that the electronic equipment faces a target preset direction and an included angle between the electronic equipment and a reference coordinate axis is larger than or equal to a preset threshold value, displaying a first control in a first area of the target interface corresponding to the target preset direction; wherein the boundary of the first region partially coincides with the boundary of the target interface.
To facilitate the user's touch operation on the first control during one-handed use, in the embodiment of the application the electronic device may determine, according to its position parameters, the area in which the user's target finger is currently located, and then display the first control in an area close to that finger.
Specifically, the position parameters may include the included angle between the electronic device and a reference coordinate axis, and the orientation of the electronic device. In a case that the electronic device faces a target preset direction and the included angle between the electronic device and the reference coordinate axis is greater than or equal to a preset threshold, the electronic device can display the first control in the first area corresponding to that target preset direction.
Referring to FIGS. 7 to 8, when the electronic device faces the right side as in FIG. 7 and the angle α between the electronic device and the y-axis is greater than or equal to the preset threshold, the user is usually operating with the right hand. The first area may therefore be located where the target interface meets its right boundary, for example in the lower right corner of the target interface, so that the user can touch the first control with the right hand.
When the electronic device faces the left side as in FIG. 8 and the angle β between the electronic device and the y-axis is greater than or equal to the preset threshold, the user is usually operating with the left hand. The first area may therefore be located where the target interface meets its left boundary, for example in the lower left corner of the target interface, so that the user can touch the first control with the left hand.
The preset threshold can be set according to the actual application scenario, and can generally be set to 20-30 degrees.
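The placement rule can be summarized in a few lines of Kotlin. This is a sketch under stated assumptions (a 25-degree threshold chosen from the 20-30 degree range above, and a boolean for the facing direction); the names are hypothetical and it is not the patent's implementation.

```kotlin
// Hypothetical corners for the first area.
enum class Corner { BOTTOM_LEFT, BOTTOM_RIGHT }

const val PRESET_THRESHOLD_DEG = 25.0  // assumed value within 20-30 degrees

/**
 * Picks the corner for the first area from the position parameters.
 *
 * @param tiltAngleDeg included angle between the device and the y-axis
 * @param facingRight  true when the device faces the right side (FIG. 7),
 *                     false when it faces the left side (FIG. 8)
 * @return the corner for the first area, or null when one-handed placement
 *         cannot be inferred from the position parameters
 */
fun firstAreaCorner(tiltAngleDeg: Double, facingRight: Boolean): Corner? =
    when {
        tiltAngleDeg < PRESET_THRESHOLD_DEG -> null
        facingRight -> Corner.BOTTOM_RIGHT  // right-hand grip
        else -> Corner.BOTTOM_LEFT          // left-hand grip
    }
```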
In the embodiment of the application, the electronic device can determine the position of the first area according to its position parameters, so that the first control is displayed in the area close to the target finger, further improving the convenience of the user's operation.
It should be noted that, in the interface control method provided in the embodiment of the present application, the execution subject may be an interface control device, or a control module in the interface control device for executing the interface control method. In the embodiment of the present application, an interface control device executing the interface control method is taken as an example to describe the interface control device provided in the embodiment of the present application.
Referring to fig. 9, an embodiment of the present application provides an interface control apparatus 900, including:
a display module 901, configured to display a first control in a first area of a target interface when the one-hand mode is detected to be turned on;
a first receiving module 902, configured to receive a first input for the first control;
a generating module 903, configured to generate, in response to the first input, a second control associated with the first control; the second control is used for executing target operation on the target interface according to second input aiming at the first control.
In the embodiment of the application, when detecting that the one-hand mode is turned on, the electronic device may display the first control in the first area of the target interface through the display module 901, receive the user's first input to the first control through the first receiving module 902, and generate the second control associated with the first control through the generating module 903. The user can then perform the second input to the first control with one hand, and the operation corresponding to the second input is executed on the target interface through the second control. This solves the problem of inconvenient one-handed operation and improves the convenience of the user's operation mode for the electronic device.
Optionally, the apparatus further comprises:
a second receiving module for receiving the second input;
an adjustment module for adjusting a display parameter of the second control in response to the second input;
and the first execution module is used for executing the target operation through the adjusted second control.
Optionally, the display parameters include at least one of the following: a display size; a display angle; a display position.
Optionally, the generating module 903 includes:
a determination unit configured to determine, in response to the first input, a target finger associated with the first input;
the generating unit is used for generating the second control corresponding to the target finger;
the device further comprises:
and the second execution module is used for executing target operation corresponding to the target finger on the target interface under the condition of receiving the second input.
Optionally, the display module 901 includes:
the device comprises an acquisition unit, a processing unit and a display unit, wherein the acquisition unit is used for acquiring position parameters of the electronic equipment, and the position parameters comprise an included angle between the electronic equipment and a reference coordinate axis and the orientation of the electronic equipment;
the display unit is used for displaying a first control in the first area of the target interface corresponding to the target preset direction under the condition that the electronic equipment faces the target preset direction and an included angle between the electronic equipment and a reference coordinate axis is larger than or equal to a preset threshold value; wherein the boundary of the first region partially coincides with the boundary of the target interface.
The interface control device in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device may be a mobile electronic device or a non-mobile electronic device. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), and the non-mobile electronic device may be a server, a network attached storage (NAS), a personal computer (PC), a television (TV), a teller machine, or a self-service machine; the embodiments of the present application are not specifically limited in this respect.
The interface control device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
The interface control device provided in the embodiment of the present application can implement each process implemented by the method embodiments of fig. 1 to 8, and is not described here again to avoid repetition.
Optionally, as shown in fig. 10, an electronic device 1000 is further provided in this embodiment of the present application, and includes a processor 1001, a memory 1002, and a program or an instruction stored in the memory 1002 and executable on the processor 1001, where the program or the instruction is executed by the processor 1001 to implement each process of the interface control method embodiment, and can achieve the same technical effect, and no further description is provided here to avoid repetition.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 11 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 1100 includes, but is not limited to: a radio frequency unit 1101, a network module 1102, an audio output unit 1103, an input unit 1104, a sensor 1105, a display unit 1106, a user input unit 1107, an interface unit 1108, a memory 1109, a processor 1110, and the like.
Those skilled in the art will appreciate that the electronic device 1100 may further include a power source (e.g., a battery) for supplying power to the various components; the power source may be logically connected to the processor 1110 via a power management system, so that charging, discharging, and power-consumption management are handled by the power management system. The structure shown in FIG. 11 does not constitute a limitation on the electronic device; the electronic device may include more or fewer components than shown, combine some components, or arrange the components differently, which is not described again here.
The display unit 1106 is configured to display a first control in a first area of the target interface when the one-hand mode is detected to be turned on;
a user input unit 1107, configured to receive a first input for the first control;
a processor 1110 for generating a second control associated with the first control in response to the first input; the second control is used for executing target operation on the target interface according to second input aiming at the first control.
Optionally, a user input unit 1107, further configured to receive the second input;
processor 1110 is further configured to adjust a display parameter of the second control in response to the second input;
the processor 1110 is further configured to execute the target operation through the adjusted second control.
Optionally, the display parameters include at least one of the following: a display size; a display angle; a display position.
Optionally, processor 1110 is further configured to determine, in response to the first input, a target finger associated with the first input;
a processor 1110 further configured to generate the second control corresponding to the target finger;
optionally, the processor 1110 is further configured to, if the second input is received, perform a target operation corresponding to the target finger on the target interface.
Optionally, the sensor 1105 is configured to obtain location parameters of the electronic device, where the location parameters include an included angle between the electronic device and a reference coordinate axis and an orientation of the electronic device;
the display unit 1106 is further configured to display a first control in the first area of the target interface corresponding to the target preset direction when the electronic device faces the target preset direction and an included angle between the electronic device and a reference coordinate axis is greater than or equal to a preset threshold; wherein the boundary of the first region partially coincides with the boundary of the target interface.
It should be understood that, in the embodiment of the present application, the input unit 1104 may include a Graphics Processing Unit (GPU) 11041 and a microphone 11042; the graphics processor 11041 processes image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The display unit 1106 may include a display panel 11061, which may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 1107 includes a touch panel 11071, also called a touch screen, and other input devices 11072. The touch panel 11071 may include two parts: a touch detection device and a touch controller. The other input devices 11072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described in detail herein. The memory 1109 may be used to store software programs and various data, including but not limited to application programs and an operating system. The processor 1110 may integrate an application processor, which mainly handles the operating system, user interfaces, applications, and the like, and a modem processor, which mainly handles wireless communication. It can be appreciated that the modem processor may also not be integrated into the processor 1110.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the interface control method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the interface control method embodiment, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatuses in the embodiments of the present application is not limited to performing the functions in the order shown or discussed; the functions may also be performed in a substantially simultaneous manner or in a reverse order depending on the functions involved. For example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (12)

1. An interface control method, applied to electronic equipment, wherein the method comprises the following steps:
displaying a first control in a first area of a target interface in a case that it is detected that a one-hand mode is turned on;
receiving a first input for the first control;
generating a second control associated with the first control in response to the first input;
the second control is used for executing target operation on the target interface according to second input aiming at the first control.
2. The method of claim 1, wherein after the step of generating a second control associated with the first control, the method further comprises:
receiving the second input;
adjusting a display parameter of the second control in response to the second input;
and executing the target operation through the adjusted second control.
3. The method of claim 2, wherein the display parameters include at least one of the following: a display size; a display angle; a display position.
4. The method of claim 1, wherein generating, in response to the first input, a second control associated with the first control comprises:
in response to the first input, determining a target finger associated with the first input;
generating the second control corresponding to the target finger;
after the step of generating a second control associated with the first control, the method further comprises:
and under the condition that the second input is received, executing target operation corresponding to the target finger on the target interface.
5. The method of any of claims 1-4, wherein displaying a first control in a first region of a target interface comprises:
acquiring position parameters of the electronic equipment, wherein the position parameters comprise an included angle between the electronic equipment and a reference coordinate axis and the orientation of the electronic equipment;
under the condition that the electronic equipment faces a target preset direction and an included angle between the electronic equipment and a reference coordinate axis is larger than or equal to a preset threshold value, displaying a first control in a first area of the target interface corresponding to the target preset direction; wherein the boundary of the first region partially coincides with the boundary of the target interface.
6. An interface control device, applied to electronic equipment, wherein the device comprises:
a display module, used for displaying a first control in a first area of a target interface in a case that it is detected that a one-hand mode is turned on;
a first receiving module for receiving a first input for the first control;
a generating module to generate a second control associated with the first control in response to the first input; the second control is used for executing target operation on the target interface according to second input aiming at the first control.
7. The apparatus of claim 6, wherein the second input comprises a first sub-input and a second sub-input, the apparatus further comprising:
a second receiving module, configured to receive the first sub-input;
an adjustment module, configured to adjust a display parameter of the second control in response to the first sub-input;
and the first execution module is used for executing the target operation through the adjusted second control.
8. The apparatus of claim 7, wherein the display parameters include at least one of the following: a display size; a display angle; a display position.
9. The apparatus of claim 7, wherein the generating module comprises:
a determination unit configured to determine, in response to the first input, a target finger associated with the first input;
the generating unit is used for generating the second control corresponding to the target finger;
the device further comprises:
and the second execution module is used for executing target operation corresponding to the target finger on the target interface under the condition of receiving the second input.
10. The apparatus according to any one of claims 6-9, wherein the display module comprises:
the device comprises an acquisition unit, a processing unit and a display unit, wherein the acquisition unit is used for acquiring position parameters of the electronic equipment, and the position parameters comprise an included angle between the electronic equipment and a reference coordinate axis and the orientation of the electronic equipment;
the display unit is used for displaying a first control in a first area of the target interface corresponding to a target preset direction under the condition that the electronic equipment faces the target preset direction and an included angle between the electronic equipment and a reference coordinate axis is larger than or equal to a preset threshold value; wherein the boundary of the first region partially coincides with the boundary of the target interface.
11. An electronic device comprising a processor, a memory, and a program or instructions stored on the memory and executable on the processor, the program or instructions when executed by the processor implementing the steps of the interface control method according to any one of claims 1-5.
12. A readable storage medium, on which a program or instructions are stored, wherein the program or instructions, when executed by a processor, implement the steps of the interface control method according to any one of claims 1-5.
CN202110973350.6A 2021-08-24 2021-08-24 Interface control method and electronic equipment Pending CN113641275A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110973350.6A CN113641275A (en) 2021-08-24 2021-08-24 Interface control method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110973350.6A CN113641275A (en) 2021-08-24 2021-08-24 Interface control method and electronic equipment

Publications (1)

Publication Number Publication Date
CN113641275A true CN113641275A (en) 2021-11-12

Family

ID=78423543

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110973350.6A Pending CN113641275A (en) 2021-08-24 2021-08-24 Interface control method and electronic equipment

Country Status (1)

Country Link
CN (1) CN113641275A (en)


Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110069178A (en) * 2019-03-14 2019-07-30 维沃移动通信有限公司 Interface control method and terminal device
CN110221761A (en) * 2019-04-24 2019-09-10 维沃移动通信有限公司 Display methods and terminal device

Similar Documents

Publication Publication Date Title
CN112162665B (en) Operation method and device
WO2023025060A1 (en) Interface display adaptation processing method and apparatus, and electronic device
CN112433693B (en) Split screen display method and device and electronic equipment
CN113209601A (en) Interface display method and device, electronic equipment and storage medium
WO2022111458A1 (en) Image capture method and apparatus, electronic device, and storage medium
CN113703630A (en) Interface display method and device
CN112764561A (en) Electronic equipment control method and device and electronic equipment
CN112269481A (en) Method and device for controlling friction force adjustment and electronic equipment
CN114995713B (en) Display control method, display control device, electronic equipment and readable storage medium
WO2023030304A1 (en) Prompt information display method and apparatus, and electronic device
WO2023030238A1 (en) Secure input method and apparatus
CN115357172A (en) Content display method and device
CN113641275A (en) Interface control method and electronic equipment
CN112162689B (en) Input method and device and electronic equipment
CN114115639A (en) Interface control method and device, electronic equipment and storage medium
CN114356113A (en) Input method and input device
CN113821138A (en) Prompting method and device and electronic equipment
WO2017016333A1 (en) Screen adjustment method and device
CN112698745A (en) Control display method and electronic equipment
CN112162810A (en) Message display method and device and electronic equipment
CN112596645A (en) Application identifier hiding method and device and electronic equipment
CN112732392A (en) Operation control method and device for application program
CN111857496A (en) Operation execution method and device and electronic equipment
CN112214297A (en) Application switching method and electronic equipment
CN113434080B (en) Information input method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination