CN114115639A - Interface control method and device, electronic equipment and storage medium

Info

Publication number: CN114115639A
Application number: CN202111422090.XA
Authority: CN (China)
Prior art keywords: target, interface, area, edge, function
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Other languages: Chinese (zh)
Inventor: 肖嘉里
Current assignee: Vivo Mobile Communication Co Ltd (listed assignees may be inaccurate)
Original assignee: Vivo Mobile Communication Co Ltd
Filing date: 2021-11-26
Publication date: 2022-03-01
Application filed by Vivo Mobile Communication Co Ltd
Priority: CN202111422090.XA (published as CN114115639A); PCT/CN2022/133138 (published as WO2023093661A1)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques using icons
    • G06F 3/0487 Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Abstract

The present application discloses an interface control method and apparatus, an electronic device, and a storage medium, and belongs to the field of communication technology. The method includes: displaying a target interface; when the target interface includes a function control, acquiring the position of the function control and determining a target area according to the position of the function control; when the target interface does not include a function control, taking a preset area as the target area; and, when a target gesture acts on the target area, executing a function corresponding to the target gesture and the target area.

Description

Interface control method and device, electronic equipment and storage medium
Technical Field
The present application belongs to the field of communication technology, and in particular relates to an interface control method and apparatus, an electronic device, and a storage medium.
Background
With the continuous development of electronic technology, physical keys and fixed virtual keys are increasingly omitted from electronic devices in order to enlarge the usable display area. Instead, a target area is set within the display area of the electronic device to respond to a target operation of the user, so as to implement the function corresponding to the target operation in the target area.
However, as the screens of electronic devices grow larger, the target area may occupy only a small portion of the display area, and the user may easily miss it when performing the target operation, so that the target operation receives no response.
Disclosure of Invention
An object of the embodiments of the present application is to provide an interface control method, an interface control apparatus, an electronic device, and a storage medium, which can solve the problem that a small target area easily fails to respond to a target operation directed at it.
In a first aspect, an embodiment of the present application provides an interface control method, the method comprising:
displaying a target interface;
when the target interface includes a function control, acquiring the position of the function control and determining a target area according to the position of the function control;
when the target interface does not include a function control, taking a preset area as the target area;
and, when a target gesture acts on the target area, executing a function corresponding to the target gesture and the target area.
In a second aspect, an embodiment of the present application provides an interface control apparatus, comprising:
a first display module, configured to display a target interface;
a first obtaining module, configured to, when the target interface includes a function control, acquire the position of the function control and determine a target area according to the position of the function control;
a first determining module, configured to, when the target interface does not include a function control, determine the target area to be a preset area;
wherein, when a target gesture acts on the target area, a function corresponding to the target gesture and the target area is executed.
In a third aspect, an embodiment of the present application provides an electronic device comprising a processor and a memory, the memory storing a program or instructions executable on the processor, the program or instructions, when executed by the processor, implementing the steps of the method according to the first aspect.
In a fourth aspect, an embodiment of the present application provides a readable storage medium storing a program or instructions which, when executed by a processor, implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip comprising a processor and a communication interface coupled to the processor, the processor being configured to execute a program or instructions to implement the method according to the first aspect.
In a sixth aspect, an embodiment of the present application provides a computer program product stored on a storage medium, the program product being executed by at least one processor to implement the method according to the first aspect.
In the embodiments of the present application, the target interface is analyzed to determine whether it contains a function control. When it does, a target area that is as large as possible without interfering with the function of the function control is determined based on the position of the function control; when it does not, the preset area is used as the target area. This enlarges the target area where possible, making it more likely that a target gesture is responded to successfully.
Drawings
Fig. 1 is a schematic flowchart of an interface control method according to an embodiment of the present application;
Fig. 2 is a schematic diagram of side areas according to an embodiment of the present application;
Fig. 3 is a schematic diagram of determining a target area in an embodiment of the present application;
Fig. 4 is a schematic structural diagram of an interface control apparatus according to an embodiment of the present application;
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 6 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly below with reference to the drawings in the embodiments of the present application. It is evident that the described embodiments are some, but not all, of the embodiments of the present application. All other embodiments derived by a person of ordinary skill in the art from the embodiments given herein fall within the scope of protection of the present application.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that terms so used are interchangeable under appropriate circumstances, so that the embodiments of the application can be practiced in orders other than those illustrated or described herein. Objects distinguished by "first", "second", and the like are usually of one class, and their number is not limited; for example, there may be one or more first objects. In addition, "and/or" in the description and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects.
The interface control method, the interface control apparatus, the electronic device, and the storage medium provided in the embodiments of the present application are described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
Fig. 1 is a schematic flowchart of an interface control method provided in an embodiment of the present application. As shown in Fig. 1, the method includes:
Step 110: displaying a target interface.
specifically, the electronic device displays the target interface through the display screen, and the target interface described in this embodiment may specifically be an interface displayed in a screen of the electronic device at the current time, and specifically may be a display interface of a system application program, such as a system setting interface, or a calendar interface, or may also be a use interface of an application program installed in the electronic device, such as a payment interface of a shopping application program.
The electronic device described in the embodiments of the present application may specifically be the other portable communication device including, but not limited to, a mobile phone or a tablet computer having a touch sensitive surface (e.g., a touch screen display and/or a touch pad). It should also be understood that in some embodiments, the terminal may not be a portable communication device, but rather a desktop computer having a touch-sensitive surface (e.g., a touch screen display and/or touchpad).
In the embodiment of the present application, the edge of the target interface may specifically be an edge of a screen of the electronic device.
Step 120: when the target interface includes a function control, acquiring the position of the function control and determining a target area according to the position of the function control.
specifically, the function control described in the embodiment of the present application may specifically be a function control displayed in the current display interface, and the function control may specifically refer to a control that can interact with a user, that is, the control may respond to an input of the user and present a corresponding function.
The function control described in the embodiment of the present application may specifically refer to an icon or a button in an interface, for example, a shooting button control in a shooting interface of a camera, or a flash button control.
According to the attribute information of each control in the current display interface, the method and the device for distinguishing the control in the current display interface are used for distinguishing whether each control belongs to a functional control or a non-functional control, and can also distinguish the functional control and the non-functional control in the current display interface according to parameters such as the size, the position or the layer sequence of the control in the current display interface.
In the embodiments of the present application, if a function control is detected in the target interface based on the attribute information of each control or the parameters of the controls, the target interface includes a function control.
Correspondingly, if no function control can be detected in the target interface based on the attribute information of each control or the parameters of the controls, the target interface does not include a function control.
It can be understood that, when the target interface includes a function control, a control that needs to respond to a specific user input exists in the target interface, and in the embodiments of the present application the position of the function control can be further determined according to its attribute information or parameter information.
The position of the function control described in the embodiments of the present application may specifically be its position relative to the target interface, for example its position in the pixel coordinate system corresponding to the target interface. When the function control is an icon or a button, the position of the function control is the position of the whole icon or button within the target interface.
More specifically, each function control has a corresponding response area, and in the embodiments of the present application the target area may be further determined according to the edge position of the response area of the function control.
In the embodiments of the present application, to address the problem that a small target area is easily missed when the user performs a target operation on it, directly enlarging the target area is considered, so that the target area can respond to the user's target operation more reliably.
However, after the target area is enlarged, it may cover the function control, so that the function control can no longer respond normally to user input directed at it; that is, the function of the function control cannot be realized normally.
Step 130: when the target interface does not include a function control, taking a preset area as the target area;
and, when a target gesture acts on the target area, executing a function corresponding to the target gesture and the target area.
Specifically, the target area described in the embodiments of the present application is an area used to respond to a target gesture: when the target area receives the target gesture, the function corresponding to the target gesture acting on the target area is executed, while other gestures received in the target area are executed normally.
For example, suppose the target gesture corresponds to function A in other regions, while the target area corresponds to function B. When the target gesture acts on the target area, only function B is executed; when it acts on another region, function A is executed.
For example, when the target area is a side area, the side area may refer to one of the upper, lower, left, and right sides of the display screen of the electronic device, or to several sides at once.
Fig. 2 is a schematic diagram of side areas provided in an embodiment of the present application. As shown in Fig. 2, the side areas may include an upper side area 21, a left side area 22, and a lower side area 23 in the current display interface of the electronic device.
The side area in the embodiments of the present application is used to respond to a target gesture of the user. For example, the target gesture may be a rightward swipe on the left side of the current display interface whose starting point lies in the left side area; in that case the current display interface is switched back to the previous-level interface.
For example, the target gesture may be a swipe on the lower side of the current display interface whose starting point lies in the lower side area; in that case the current display interface is switched to the multitask management interface.
For example, the target gesture may be a downward swipe on the upper side of the current display interface whose starting point lies in the upper side area; in that case the status bar is called up in the current display interface.
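As an illustration only, the mapping from side areas to the example actions above could look like the following Kotlin sketch. The names (SideArea, NavAction, dispatch) are assumptions made for this example and do not come from the patent.

    // Hypothetical mapping of side areas to the navigation actions described above.
    enum class SideArea { LEFT, LOWER, UPPER }
    enum class NavAction { PREVIOUS_INTERFACE, MULTITASK_MANAGER, STATUS_BAR }

    fun dispatch(startArea: SideArea): NavAction = when (startArea) {
        SideArea.LEFT -> NavAction.PREVIOUS_INTERFACE // rightward swipe starting in the left side area
        SideArea.LOWER -> NavAction.MULTITASK_MANAGER // swipe starting in the lower side area
        SideArea.UPPER -> NavAction.STATUS_BAR        // downward swipe starting in the upper side area
    }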
The preset area described in the embodiments of the present application is a preconfigured target area, that is, a default target area.
When the target interface does not include a function control, the target area cannot be determined from the position of a function control; and if the entire target interface were set as the target area, normal responses to other inputs in the target interface might be affected.
Therefore, in the embodiments of the present application, the target area is set to the preset area when the target interface does not include a function control.
In the embodiments of the present application, the target interface is analyzed to determine whether it contains a function control. When it does, a target area that is as large as possible without interfering with the function of the function control is determined based on the position of the function control; when it does not, the preset area is used as the target area. This enlarges the target area where possible, making it more likely that a target gesture is responded to successfully.
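A minimal Kotlin sketch of this branch, assuming hypothetical Bounds and Control types (none of these names are from the patent): if the interface contains any function control, the target area is derived from the controls' positions; otherwise the preset area is used.

    // Illustrative types; coordinates are screen pixels.
    data class Bounds(val left: Int, val top: Int, val right: Int, val bottom: Int)
    data class Control(val isFunctional: Boolean, val bounds: Bounds)

    // Branch of steps 120/130: derive the target area from the function controls
    // when present, otherwise fall back to the preset (default) area.
    fun determineTargetArea(
        controls: List<Control>,
        presetArea: Bounds,
        fromControls: (List<Control>) -> Bounds, // e.g. the edge-based rule sketched below
    ): Bounds {
        val functional = controls.filter { it.isFunctional }
        return if (functional.isNotEmpty()) fromControls(functional) else presetArea
    }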
Optionally, acquiring the position of the function control and determining the target area according to the position of the function control includes:
acquiring the edge position of the function control;
determining, according to the edge position of the function control, a second edge of the function control that is closest to a first edge of the target interface;
determining the interface area formed by the edges where the first edge and the second edge are located as the target area;
wherein the target interface includes the interface area.
Specifically, a function control usually has a corresponding area, so the edge position of the function control described in the embodiments of the present application may refer to the edge position of that corresponding area. For example, when the function control is a virtual key icon, its edge position is the edge of that icon.
More specifically, each function control may have multiple edges; correspondingly, the edge position of the function control is the position formed by those edges.
The edge position of the function control may be determined according to the attributes of the function control, and may specifically be the position of the control's edges relative to the target interface.
In the embodiments of the present application, after the position of the function control is determined, the target area may be further determined according to the edge position of the function control; specifically, the second edge closest to the first edge of the target interface is determined according to the edge position of the function control.
The first edge described in the embodiments of the present application may be any edge of the target interface; once the first edge is determined, the second edge closest to it can be determined.
The second edge described in the embodiments of the present application is the edge, among the edges of the function controls, that is closest to the first edge of the target interface. Correspondingly, when there are multiple function controls, the second edge is the closest such edge across all of them; that is, even with multiple function controls, only one second edge is determined.
It can be understood that, when different edges of the target interface are selected as the first edge, each has its own corresponding second edge.
In the embodiments of the present application, after the first edge and the second edge are determined, the edges where they are located can serve as edges of the target area. Since the relative position of the first edge and the second edge is fixed once both are determined, the interface area formed by the two edges is then determined as the target area.
Fig. 3 is a schematic diagram of determining a target area in an embodiment of the present application. As shown in Fig. 3, the interface includes a function control 31, a first edge 32 of the target interface, and a second edge 311 of the function control 31 that is closest to the first edge 32; the interface area 33 formed by the edges where the first edge 32 and the second edge 311 are located is determined as the target area.
In the embodiments of the present application, after the edge position of the function control is obtained, the second edge of the function control closest to the first edge of the target interface is determined, and the target area is derived from it. In this way, the target area can be enlarged as much as possible without affecting the normal use of the function control, which effectively helps ensure that the target gesture is responded to successfully.
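The geometry just described can be sketched as follows, assuming the first edge is chosen to be the bottom edge of the interface and that y grows downward in screen pixels; the Bounds type and the function name are illustrative assumptions, not the patent's implementation. Choosing a different first edge (top, left, or right) follows the same pattern with the corresponding coordinate.

    data class Bounds(val left: Int, val top: Int, val right: Int, val bottom: Int)

    // The second edge is the function-control edge closest to the interface's
    // bottom (first) edge, i.e. the largest control `bottom` coordinate. The
    // target area is the full-width strip between that edge and the first edge.
    // Assumes at least one function control is present (the step 120 branch).
    fun targetAreaFromBottomEdge(interfaceBounds: Bounds, controls: List<Bounds>): Bounds {
        val secondEdgeY = controls.maxOf { it.bottom }
        return Bounds(interfaceBounds.left, secondEdgeY, interfaceBounds.right, interfaceBounds.bottom)
    }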
Optionally, after determining, according to the edge position of the function control, the second edge of the function control that is closest to the first edge of the target interface, the method further includes:
acquiring the distance between the first edge and the second edge;
and determining the interface area formed by the edges where the first edge and the second edge are located as the target area when the distance is greater than or equal to a preset value.
Specifically, the preset value described in the embodiments of the present application may be a preconfigured threshold.
The distance between the first edge and the second edge may specifically be the shortest straight-line distance between them, expressed in pixels, and is determined from the pixel coordinates of the first edge and of the second edge in the pixel coordinate system corresponding to the target interface.
In the embodiments of the present application, when the distance is greater than or equal to the preset value, a sufficient margin still exists between the first edge and the function control, and the interface area formed by the edges where the first edge and the second edge are located can be determined as the target area.
Conversely, when the distance is smaller than the preset value, the first edge is considered too close to the function control; if the target area were formed from the edges where the first edge and the second edge are located, it might interfere with the normal function of the function control, so the target area is directly set to the preset area.
In the embodiments of the present application, the distance between the first edge and the second edge is fully taken into account, ensuring that, in determining the target area, the area is enlarged as much as possible without affecting the function control.
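The threshold check can be sketched in one line, under the same assumptions; presetPx is a hypothetical name for the preset value. Only when the two edges are at least presetPx apart is the edge-formed strip used as the target area; otherwise the preset area is kept.

    import kotlin.math.abs

    // True if the strip between the two edges may serve as the target area.
    fun stripIsUsable(firstEdgeY: Int, secondEdgeY: Int, presetPx: Int): Boolean =
        abs(firstEdgeY - secondEdgeY) >= presetPx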
Optionally, the target interface includes a first area, the first area includes the function control, and, when a first gesture acts on the first area, a first function corresponding to the first gesture and the first area is executed. After the target area is determined according to the position of the function control, the method further includes:
receiving a first gesture input of a user when the target area at least partially overlaps the first area, the first gesture acting on the overlapping area of the target area and the first area;
when the first gesture has no target function corresponding to the target area, forgoing responding to the first gesture input and not executing the first function;
and, when the first gesture has a target function corresponding to the target area, responding to the first gesture input by executing the target function without executing the first function.
Specifically, the first area described in the embodiments of the present application may be a specific area in the target interface: when it receives the corresponding first gesture, the corresponding first function is executed, and other gestures received in the first area are executed normally.
For example, suppose the first gesture corresponds to function C in other areas, while the first area corresponds to function D. When the first gesture acts on the first area, only function D is executed; when it acts on another area, function C is executed.
The target area at least partially overlapping the first area may mean that the two partially overlap, that the target area contains the first area, or that the first area contains the target area.
In the embodiments of the present application, when the target area at least partially overlaps the first area, a first gesture acting on the overlapping area cannot by itself indicate whether the user wishes to trigger the first function. Therefore, when the first gesture has no target function corresponding to the target area, the response to the first gesture input is abandoned and the first function is not executed.
When the first gesture does have a target function corresponding to the target area, that is, when the first gesture acting on the target area can trigger a corresponding target function, only that target function is executed: the function of the first gesture acting on the target area is performed, while the function of the first gesture acting on the first area, namely the first function, is not.
In the embodiments of the present application, the case where the target area at least partially overlaps the first area is fully considered: when a first gesture acting on the overlapping area is received, only the target function is executed and the first function is not, which effectively avoids instruction conflicts and ensures that functions are executed normally.
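The dispatch rule for the overlapping area can be sketched as below. The nullable targetFunction stands for "the first gesture has a target function corresponding to the target area"; all names are illustrative assumptions.

    // A gesture lands in the overlap of the target area and the first area:
    // run only the target function if one exists; otherwise respond to neither.
    fun onGestureInOverlap(targetFunction: (() -> Unit)?, firstFunction: () -> Unit) {
        if (targetFunction != null) {
            targetFunction() // execute only the target function
        }
        // The first function is deliberately never executed here, which avoids
        // the instruction conflict described above.
    }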
The execution body of the interface control method provided in the embodiments of the present application may be an interface control apparatus. In the embodiments of the present application, an interface control apparatus executing the interface control method is taken as an example to describe the interface control apparatus provided in the embodiments of the present application.
Fig. 4 is a schematic structural diagram of an interface control apparatus according to an embodiment of the present application. As shown in Fig. 4, the apparatus includes: a first display module 410, configured to display a target interface; a first obtaining module 420, configured to, when the target interface includes a function control, obtain the position of the function control and determine a target area according to the position of the function control; and a first determining module 430, configured to, when the target interface does not include a function control, determine the target area to be a preset area. When a target gesture acts on the target area, a function corresponding to the target gesture and the target area is executed.
Optionally, the first obtaining module is specifically configured to:
acquire the edge position of the function control;
determine, according to the edge position of the function control, a second edge of the function control that is closest to a first edge of the target interface;
and determine the interface area formed by the edges where the first edge and the second edge are located as the target area;
wherein the target interface includes the interface area.
Optionally, the apparatus further includes:
a second obtaining module, configured to obtain the distance between the first edge and the second edge;
and a second determining module, configured to determine the interface area formed by the edges where the first edge and the second edge are located as the target area when the distance is greater than or equal to a preset value.
Optionally, the target interface includes a first area, the first area includes the function control, and, when a first gesture acts on the first area, a first function corresponding to the first gesture and the first area is executed; the apparatus further includes:
a first receiving module, configured to receive a first gesture input of a user when the target area at least partially overlaps the first area, the first gesture acting on the overlapping area of the target area and the first area;
and a first executing module, configured to, when the first gesture has no target function corresponding to the target area, forgo responding to the first gesture input and not execute the first function,
and, when the first gesture has a target function corresponding to the target area, respond to the first gesture input by executing the target function without executing the first function.
In the embodiments of the present application, the target interface is analyzed to determine whether it contains a function control. When it does, a target area that is as large as possible without interfering with the function of the function control is determined based on the position of the function control; when it does not, the preset area is used as the target area. This enlarges the target area where possible, making it more likely that a target gesture is responded to successfully.
The interface control apparatus in the embodiments of the present application may be an electronic device, or a component in an electronic device, such as an integrated circuit or a chip. The electronic device may be a terminal or a device other than a terminal: for example, a mobile phone, tablet computer, notebook computer, palmtop computer, vehicle-mounted electronic device, mobile internet device (MID), augmented reality (AR)/virtual reality (VR) device, robot, wearable device, ultra-mobile personal computer (UMPC), netbook, or personal digital assistant (PDA); or a server, network attached storage (NAS), personal computer (PC), television (TV), teller machine, self-service machine, or the like. The embodiments of the present application are not specifically limited in this respect.
The interface control apparatus in the embodiments of the present application may be a device with an operating system. The operating system may be Android, iOS, or another possible operating system, which is not specifically limited in the embodiments of the present application.
The interface control apparatus provided in the embodiments of the present application can implement each process implemented by the method embodiments of Figs. 1 to 3; to avoid repetition, details are not described here again.
Optionally, Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in Fig. 5, an embodiment of the present application further provides an electronic device 500 comprising a processor 501 and a memory 502, the memory 502 storing a program or instructions executable on the processor 501. When executed by the processor 501, the program or instructions implement the steps of the interface control method embodiments above and achieve the same technical effects; to avoid repetition, details are not repeated here.
It should be noted that the electronic device in the embodiments of the present application includes both the mobile and the non-mobile electronic devices described above.
Fig. 6 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 600 includes, but is not limited to: a radio frequency unit 601, a network module 602, an audio output unit 603, an input unit 604, a sensor 605, a display unit 606, a user input unit 607, an interface unit 608, a memory 609, a processor 610, and the like.
Those skilled in the art will appreciate that the electronic device 600 may further include a power supply (e.g., a battery) for supplying power to the various components; the power supply may be logically connected to the processor 610 through a power management system, so that charging, discharging, and power-consumption management are implemented through the power management system. The structure shown in Fig. 6 does not limit the electronic device, which may include more or fewer components than shown, combine certain components, or arrange components differently; details are omitted here.
The display unit 606 is configured to display a target interface;
the processor 610 is configured to, when the target interface includes a function control, acquire the position of the function control and determine a target area according to the position of the function control;
when the target interface does not include a function control, the target area is a preset area;
and, when a target gesture acts on the target area, a function corresponding to the target gesture and the target area is executed.
Optionally, the processor 610 is configured to acquire the edge position of the function control;
the processor 610 is configured to determine, according to the edge position of the function control, a second edge of the function control that is closest to a first edge of the target interface;
the processor 610 is configured to determine the interface area formed by the edges where the first edge and the second edge are located as the target area;
wherein the target interface includes the interface area.
Optionally, the sensor 605 is configured to obtain the distance between the first edge and the second edge;
the processor 610 is configured to determine the interface area formed by the edges where the first edge and the second edge are located as the target area when the distance is greater than or equal to a preset value.
Optionally, when the target area at least partially overlaps the first area, the user input unit 607 is configured to receive a first gesture input of the user, the first gesture acting on the overlapping area of the target area and the first area;
when the first gesture has no target function corresponding to the target area, the processor 610 is configured to forgo responding to the first gesture input and not execute the first function;
and, when the first gesture has a target function corresponding to the target area, the processor 610 is configured to respond to the first gesture input by executing the target function without executing the first function.
In the embodiments of the present application, the target interface is analyzed to determine whether it contains a function control. When it does, a target area that is as large as possible without interfering with the function of the function control is determined based on the position of the function control; when it does not, the preset area is used as the target area. This enlarges the target area where possible, making it more likely that a target gesture is responded to successfully.
It is to be understood that, in the embodiment of the present application, the input Unit 604 may include a Graphics Processing Unit (GPU) 6041 and a microphone 6042, and the Graphics Processing Unit 6041 processes image data of a still picture or a video obtained by an image capturing apparatus (such as a camera) in a video capturing mode or an image capturing mode. The display unit 606 may include a display panel 6061, and the display panel 6061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 607 includes at least one of a touch panel 6071 and other input devices 6072. A touch panel 6071, also referred to as a touch screen. The touch panel 6071 may include two parts of a touch detection device and a touch controller. Other input devices 6072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
The memory 609 may be used to store software programs as well as various data. The memory 609 may mainly include a first storage area storing programs or instructions and a second storage area storing data; the first storage area may store an operating system and the application programs or instructions required for at least one function (such as a sound playing function or an image playing function). Further, the memory 609 may include volatile memory or non-volatile memory, or both. The non-volatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), a static RAM (SRAM), a dynamic RAM (DRAM), a synchronous DRAM (SDRAM), a double data rate SDRAM (DDR SDRAM), an enhanced SDRAM (ESDRAM), a synchronous link DRAM (SLDRAM), or a direct rambus RAM (DRRAM). The memory 609 in the embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
Processor 610 may include one or more processing units; optionally, the processor 610 integrates an application processor, which mainly handles operations related to the operating system, user interface, application programs, etc., and a modem processor, which mainly handles wireless communication signals, such as a baseband processor. It will be appreciated that the modem processor described above may not be integrated into the processor 610.
The embodiments of the present application further provide a readable storage medium storing a program or instructions which, when executed by a processor, implement each process of the interface control method embodiments above and achieve the same technical effects; to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiments. The readable storage medium includes a computer-readable storage medium, such as a computer read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The embodiments of the present application further provide a chip comprising a processor and a communication interface coupled to the processor, the processor being configured to execute a program or instructions to implement each process of the interface control method embodiments above and achieve the same technical effects; to avoid repetition, details are not repeated here.
It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a chip system, or a system-on-chip.
The embodiments of the present application provide a computer program product stored on a storage medium, the program product being executed by at least one processor to implement each process of the interface control method embodiments above and achieve the same technical effects; to avoid repetition, details are not described here again.
It should be noted that, in this document, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by "comprising a ..." does not preclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus in the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, and may include performing the functions in a substantially simultaneous manner or in a reverse order depending on the functions involved; for example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to some examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and certainly also by hardware alone, although in many cases the former is the better implementation. Based on this understanding, the technical solutions of the present application may be embodied in the form of a computer software product that is stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the methods according to the embodiments of the present application.
While the embodiments of the present application have been described above with reference to the accompanying drawings, the present application is not limited to the specific embodiments described, which are illustrative rather than restrictive. In light of the present application, those skilled in the art may make various changes without departing from the spirit and scope of the present application as defined by the appended claims, all of which fall within the scope of protection of the present application.

Claims (10)

1. An interface control method, comprising:
displaying a target interface;
when the target interface includes a function control, acquiring the position of the function control and determining a target area according to the position of the function control;
when the target interface does not include a function control, taking a preset area as the target area;
and, when a target gesture acts on the target area, executing a function corresponding to the target gesture and the target area.
2. The interface control method according to claim 1, wherein acquiring the position of the function control and determining the target area according to the position of the function control comprises:
acquiring the edge position of the function control;
determining, according to the edge position of the function control, a second edge of the function control that is closest to a first edge of the target interface;
and determining the interface area formed by the edges where the first edge and the second edge are located as the target area;
wherein the target interface includes the interface area.
3. The interface control method according to claim 2, wherein, after determining, according to the edge position of the function control, the second edge of the function control that is closest to the first edge of the target interface, the method further comprises:
acquiring the distance between the first edge and the second edge;
and determining the interface area formed by the edges where the first edge and the second edge are located as the target area when the distance is greater than or equal to a preset value.
4. The interface control method according to claim 1, wherein the target interface includes a first area, the first area includes the function control, and, when a first gesture acts on the first area, a first function corresponding to the first gesture and the first area is executed; after the target area is determined according to the position of the function control, the method further comprises:
receiving a first gesture input of a user when the target area at least partially overlaps the first area, the first gesture acting on the overlapping area of the target area and the first area;
when the first gesture has no target function corresponding to the target area, forgoing responding to the first gesture input and not executing the first function;
and, when the first gesture has a target function corresponding to the target area, responding to the first gesture input by executing the target function without executing the first function.
5. An interface control apparatus, comprising:
a first display module, configured to display a target interface;
a first obtaining module, configured to, when the target interface includes a function control, acquire the position of the function control and determine a target area according to the position of the function control;
a first determining module, configured to, when the target interface does not include a function control, determine the target area to be a preset area;
wherein, when a target gesture acts on the target area, a function corresponding to the target gesture and the target area is executed.
6. The interface control apparatus according to claim 5, wherein the first obtaining module is specifically configured to:
acquire the edge position of the function control;
determine, according to the edge position of the function control, a second edge of the function control that is closest to a first edge of the target interface;
and determine the interface area formed by the edges where the first edge and the second edge are located as the target area;
wherein the target interface includes the interface area.
7. The interface control apparatus according to claim 6, further comprising:
a second obtaining module, configured to obtain the distance between the first edge and the second edge;
and a second determining module, configured to determine the interface area formed by the edges where the first edge and the second edge are located as the target area when the distance is greater than or equal to a preset value.
8. The interface control apparatus according to claim 5, wherein the target interface includes a first area, the first area includes the function control, and, when a first gesture acts on the first area, a first function corresponding to the first gesture and the first area is executed; the apparatus further comprises:
a first receiving module, configured to receive a first gesture input of a user when the target area at least partially overlaps the first area, the first gesture acting on the overlapping area of the target area and the first area;
and a first executing module, configured to, when the first gesture has no target function corresponding to the target area, forgo responding to the first gesture input and not execute the first function,
and, when the first gesture has a target function corresponding to the target area, respond to the first gesture input by executing the target function without executing the first function.
9. An electronic device comprising a processor and a memory, the memory storing a program or instructions executable on the processor, wherein the program or instructions, when executed by the processor, implement the steps of the interface control method according to any one of claims 1 to 4.
10. A readable storage medium storing a program or instructions which, when executed by a processor, implement the steps of the interface control method according to any one of claims 1 to 4.
CN202111422090.XA (priority date 2021-11-26, filing date 2021-11-26): Interface control method and device, electronic equipment and storage medium. Status: Pending. Publication: CN114115639A.

Priority Applications (2)

CN202111422090.XA (priority date 2021-11-26, filing date 2021-11-26): Interface control method and device, electronic equipment and storage medium (CN114115639A)
PCT/CN2022/133138 (priority date 2021-11-26, filing date 2022-11-21): Interface control method and apparatus, and electronic device and storage medium (WO2023093661A1)

Publications (1)

CN114115639A, published 2022-03-01

Family ID: 80370115

Country Status (2)

CN: CN114115639A
WO: WO2023093661A1

Cited By (1)

WO2023093661A1 (2023-06-01), 维沃移动通信有限公司 (Vivo Mobile Communication Co., Ltd.): Interface control method and apparatus, and electronic device and storage medium

Family Cites Families (5)

WO2012091704A1 (2012-07-05), Empire Technology Development Llc: Environment-dependent dynamic range control for gesture recognition
KR20140138424A (2014-12-04), 삼성전자주식회사 (Samsung Electronics Co., Ltd.): Method and apparatus for user interface based on gesture
EP3680765A4 (2020-09-23), Guangdong Oppo Mobile Telecommunications Corp., Ltd.: Navigation bar control method and device
CN110639203B (2023-07-25), 网易(杭州)网络有限公司 (NetEase (Hangzhou) Network Co., Ltd.): Control response method and device in game
CN114115639A (2022-03-01), 维沃移动通信有限公司 (Vivo Mobile Communication Co., Ltd.): Interface control method and device, electronic equipment and storage medium

Also Published As

WO2023093661A1 (2023-06-01)

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination