WO2023093661A1 - Interface control method and apparatus, electronic device and storage medium - Google Patents

Interface control method and apparatus, electronic device and storage medium

Info

Publication number
WO2023093661A1
WO2023093661A1 · PCT/CN2022/133138 · CN2022133138W
Authority
WO
WIPO (PCT)
Prior art keywords
target
area
interface
edge
gesture
Prior art date
Application number
PCT/CN2022/133138
Other languages
English (en)
French (fr)
Inventor
肖嘉里 (XIAO Jiali)
Original Assignee
Vivo Mobile Communication Co., Ltd. (维沃移动通信有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co., Ltd. (维沃移动通信有限公司)
Publication of WO2023093661A1 publication Critical patent/WO2023093661A1/zh

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Definitions

  • the present application belongs to the technical field of communication, and in particular relates to an interface control method, device, electronic equipment and storage medium.
  • the area of the target area in the display area of the screen may be small, and the user may not touch the target area during the target operation, resulting in the target operation failing to receive a normal response.
  • the purpose of the embodiments of the present application is to provide an interface control method, device, electronic device, and storage medium, which can solve the problem that a target operation acting on the target area may fail to receive a response because the target area is small.
  • the embodiment of the present application provides an interface control method, the method includes:
  • the target interface includes a functional control, acquiring the position of the functional control and determining a target area according to the position of the functional control;
  • if the target interface does not include a functional control, determining that the target area is a preset area;
  • when a target gesture acts on the target area, the function corresponding to the target gesture and the target area is executed.
  • an interface control device including:
  • a first display module configured to display a target interface
  • a first acquiring module configured to acquire the position of the functional control and determine the target area according to the position of the functional control when the target interface includes a functional control;
  • a first determining module configured to determine that the target area is a preset area when the target interface does not include functional controls
  • wherein, when the target gesture acts on the target area, the function corresponding to the target gesture and the target area is executed.
  • the embodiment of the present application provides an electronic device; the electronic device includes a processor and a memory, the memory stores programs or instructions that can run on the processor, and when the programs or instructions are executed by the processor, the steps of the method described in the first aspect are implemented.
  • an embodiment of the present application provides a readable storage medium on which a program or an instruction is stored; when the program or instruction is executed by a processor, the steps of the method described in the first aspect are implemented.
  • the embodiment of the present application provides a chip; the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is used to run programs or instructions so as to implement the method described in the first aspect.
  • an embodiment of the present application provides a computer program product, the program product is stored in a storage medium, and the program product is executed by at least one processor to implement the method described in the first aspect.
  • in the embodiments of the present application, it is first determined whether the target interface contains functional controls. If the target interface contains functional controls, a target area that is as large as possible is determined based on the positions of the functional controls, without affecting the functions of those controls. If the target interface does not contain functional controls, a preset area is used as the target area. In this way, the scope of the target area is expanded as much as possible without impairing the functional controls, effectively alleviating the problem that a target operation acting on the target area may fail to receive a normal response because the target area is too small.
  • FIG. 1 is a schematic flow chart of an interface control method provided in an embodiment of the present application
  • Figure 2 is a schematic diagram of the side area provided by the embodiment of the present application.
  • FIG. 3 is a schematic diagram of determining a target area in an embodiment of the present application.
  • FIG. 4 is a schematic structural diagram of an interface control device provided in an embodiment of the present application.
  • FIG. 5 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
  • FIG. 6 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
  • Figure 1 is a schematic flow chart of the interface control method provided by the embodiment of the present application; as shown in Figure 1, the method includes:
  • Step 110: display the target interface;
  • the electronic device displays the target interface through the display screen.
  • the target interface described in the embodiment of the present application may specifically be the interface being displayed on the screen of the electronic device at the current moment. It may be a display interface of a system application program, such as a system settings interface or a calendar interface, or a user interface of an application program installed in the electronic device, such as a payment interface of a shopping application program.
  • the electronic devices described in the embodiments of the present application may include, but are not limited to, portable communication devices such as mobile phones or tablet computers with touch-sensitive surfaces (e.g., touch-screen displays and/or touch pads). It should also be appreciated that, in some embodiments, the terminal may not be a portable communication device but rather a desktop computer with a touch-sensitive surface (e.g., a touch-screen display and/or a touchpad).
  • the edge of the target interface in the embodiment of the present application may specifically be the edge of the screen of the electronic device.
  • Step 120: if the target interface includes a functional control, acquire the position of the functional control and determine the target area according to the position of the functional control;
  • the functional control described in the embodiments of the present application may specifically be a functional control displayed on the current display interface; a functional control specifically refers to a control that can interact with the user, that is, a control that can respond to user input and present the corresponding function.
  • the functional controls described in the embodiments of the present application may specifically refer to icons or buttons in the interface, such as a shooting button control in a shooting interface of a camera, or a flash button control.
  • whether each control in the current display interface is a functional control or a non-functional control can be distinguished, for example, based on parameters such as the size, position, or layer order of the control in the current display interface.
  • if such a functional control is identified, the target interface includes the functional control; otherwise, the target interface does not include the functional control. The following first describes the case where the target interface includes functional controls.
  • the position of the functional control described in the embodiment of the present application may specifically be the position of the functional control relative to the target interface, that is, its position in the pixel coordinate system corresponding to the target interface. For example, when the functional control is an icon or a button, the position of the functional control is the overall position of the icon or button in the target interface.
  • the functional control described in the embodiment of the present application corresponds to a response area; in the embodiment of the present application, the target area can be further determined according to the edge position of the response area of the functional control.
  • if the target area were determined without regard to the functional control, the functional control might be covered; in that case, the functional control cannot normally respond to the user's input, that is, the function of the functional control cannot be realized normally. Determining the target area based on the position of the functional control avoids affecting the normal realization of the control's function while maximizing the size of the target area.
  • Step 130: if the target interface does not include functional controls, the target area is a preset area.
  • the target area described in the embodiments of the present application is specifically used as an area that responds to target gestures: when the target area receives a target gesture, the function corresponding to the target gesture acting on the target area is executed; when the target area receives other gestures, those gestures are handled normally.
  • the side area may specifically refer to an area adjacent to one or more sides of the display screen of the electronic device.
  • Figure 2 is a schematic diagram of the side area provided by the embodiment of the present application.
  • the side area may specifically include the upper side area 21, the left side area 22, and the lower side area 23 in the current display interface of the electronic device.
  • the side area in the embodiment of the present application is specifically used to respond to the user's target gesture.
  • for example, the target gesture may be the user's rightward swipe starting in the left side area of the current display interface; in this case, the current display interface can be switched to the upper-level interface.
  • the target gesture may be the user's upward swipe starting in the lower side area of the current display interface; in this case, the current display interface can be switched to a multi-task management interface.
  • the target gesture may be the user's swipe starting in the upper side area of the current display interface; in this case, the operation of calling out the status bar in the current display interface can be realized.
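The three edge gestures above can be sketched as a simple dispatcher. This is only an illustrative sketch, not part of the disclosure: the function name, the action names, and the 40-pixel side-area width are invented for demonstration.

```python
def classify_edge_gesture(start, direction, screen_h, margin=40):
    """Map a swipe's start point and direction to an edge action.

    start:     (x, y) pixel where the gesture began
    direction: "left", "right", "up", or "down"
    margin:    assumed width in pixels of each side area
    """
    x, y = start
    if x <= margin and direction == "right":
        return "back_to_previous_interface"   # rightward swipe from the left side area
    if y >= screen_h - margin and direction == "up":
        return "open_multitask_manager"       # upward swipe from the lower side area
    if y <= margin and direction == "down":
        return "call_out_status_bar"          # swipe starting in the upper side area
    return None  # not an edge gesture; let the interface handle it normally
```

For instance, a rightward swipe beginning at (10, 500) on a 2400-pixel-tall screen starts inside the hypothetical 40-pixel left side area and would map to the back action; note that the right side is deliberately unhandled, matching Figure 2, which shows only the upper, left, and lower side areas.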
  • the preset area described in the embodiment of the present application may specifically be a preset target area, that is, a default target area.
  • if the target interface does not include functional controls, the target area cannot be determined according to the position of functional controls; if the entire target interface were set as the target area, the normal response of other inputs in the target interface might be affected.
  • the target area will be set as the preset area.
  • in the embodiments of the present application, it is first determined whether the target interface contains functional controls. If the target interface contains functional controls, a target area that is as large as possible is determined based on the positions of the functional controls, without affecting the functions of those controls. If the target interface does not contain functional controls, a preset area is used as the target area. In this way, the scope of the target area is expanded as much as possible without impairing the functional controls, effectively alleviating the problem that a target operation acting on the target area may fail to receive a normal response because the target area is too small.
  • the acquiring the position of the functional control and determining the target area according to the position of the functional control includes:
  • wherein the target interface includes the interface area.
  • the functional control described in the embodiment of the present application often corresponds to an area, so the edge position of the functional control may specifically refer to the edge position of that corresponding area. For example, where the functional control is a virtual-key identifier, the edge position of the functional control is the edge of that identifier.
  • each functional control may have multiple edges, and correspondingly, the edge position of the functional control is an edge position formed by multiple edges.
  • the edge position of the functional control may specifically be obtained according to attributes of the functional control, and may specifically be the position of the control's edges relative to the target interface.
  • the target area can be further determined according to the edge position of the functional control; specifically, the second edge of the functional control that is closest to the first edge of the target interface can be determined according to the edge position of the functional control.
  • the first edge described in the embodiment of the present application may specifically be any edge in the target interface. After the first edge is determined, a second edge closest to the first edge may be further determined.
  • the second edge described in the embodiment of the present application may specifically be the edge closest to the first edge of the target interface among the edges of the functional control.
  • where there are multiple functional controls, the second edge is the edge, among all the functional controls, that is closest to the first edge of the target interface; that is, even in the case of multiple functional controls, only one second edge is determined.
  • the sides on which the first edge and the second edge lie can serve as sides of the target area, and since the distance between the first edge and the second edge is fixed once the two edges are determined, the interface area formed between them can then be determined as the target area.
  • FIG. 3 is a schematic diagram of determining the target area in the embodiment of the present application. As shown in FIG. 3, the target interface includes a functional control 31, a first edge 32 of the target interface, and a second edge 311 of the functional control 31 that is closest to the first edge 32; the interface area 33 formed between the side where the first edge 32 lies and the side where the second edge 311 lies is determined as the target area.
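The construction in Figure 3 can be sketched with axis-aligned rectangles. Everything below is an illustrative assumption: rectangles are (left, top, right, bottom) tuples in the interface's pixel coordinate system, and the function name is invented.

```python
def target_area_from_control(interface, control, first_edge="left"):
    """Return the interface strip between the chosen first edge of the
    interface and the control edge closest to it (the second edge).
    Both arguments are (left, top, right, bottom) rectangles."""
    il, it, ir, ib = interface
    cl, ct, cr, cb = control
    if first_edge == "left":
        return (il, it, cl, ib)   # second edge: the control's left edge
    if first_edge == "right":
        return (cr, it, ir, ib)   # second edge: the control's right edge
    if first_edge == "top":
        return (il, it, ir, ct)   # second edge: the control's top edge
    if first_edge == "bottom":
        return (il, cb, ir, ib)   # second edge: the control's bottom edge
    raise ValueError(f"unknown edge: {first_edge}")
```

On a hypothetical 1080x2400 interface with a control at (400, 1000, 680, 1200), choosing the left edge as the first edge yields the strip (0, 0, 400, 2400): the widest left-side target area that still leaves the control uncovered.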
  • the method further includes:
  • in the case that the distance between the first edge and the second edge is greater than or equal to a preset value, the interface area formed by the first edge and the side where the second edge is located is determined as the target area.
  • the preset value described in the embodiment of the present application may specifically be a preconfigured threshold distance.
  • the distance between the first edge and the second edge described in the embodiment of the present application may specifically be the shortest straight-line distance between them, expressed in pixel units; it may be determined from the pixel coordinates of the first edge and of the second edge in the pixel coordinate system corresponding to the target interface.
  • when the distance is greater than or equal to the preset value, the interface area formed between the first edge and the second edge can be determined as the target area.
  • when the distance is less than the preset value, the distance between the first edge and the functional control is relatively small.
  • in that case, if the interface area formed between the sides where the first edge and the second edge lie were still determined as the target area, the target area might affect the normal functions of the functional controls; therefore, the target area is directly determined as the preset area.
  • in the process of determining the target area, the distance between the first edge and the second edge is fully considered, ensuring that the area of the target area is increased as much as possible without affecting the functional controls.
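The threshold rule above can be combined with the edge construction in a short sketch. The (left, top, right, bottom) rectangle representation, the function name, and the 80-pixel preset value are assumptions for illustration only; the disclosure fixes only the comparison against a preset value.

```python
def choose_target_area(interface, control, preset_area, min_distance=80):
    """Use the strip between the interface's left edge (first edge) and the
    control's left edge (second edge) as the target area only when the
    distance between the two edges reaches the preset value; otherwise
    fall back to the preset area."""
    il, it, ir, ib = interface
    cl, _, _, _ = control
    distance = cl - il  # shortest straight-line distance between the edges, in pixels
    if distance >= min_distance:
        return (il, it, cl, ib)  # wide enough: strip between the two edges
    return preset_area           # too narrow: risks impairing the control
```

With these invented numbers, a control hugging the left edge (say, starting at x = 30) would make the strip only 30 pixels wide, below the 80-pixel preset value, so the preset area is returned instead.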
  • the target interface includes a first area
  • the first area includes the functional control;
  • when the first gesture acts on the first area, the first function corresponding to the first gesture and the first area is executed;
  • after the target area is determined according to the position of the functional control, the method further includes:
  • the first gesture acts on the overlapping area of the target area and the first area;
  • the target function is executed, and the first function is not executed.
  • the first area described in the embodiments of the present application may specifically be a specific area in the target interface.
  • when the first area receives its corresponding first gesture, it executes the corresponding first function;
  • when the first area receives other gestures, it performs the functions of those gestures normally.
  • the target area described in the embodiments of the present application may at least partially overlap with the first area; specifically, the two may partially overlap, the target area may include the first area, or the first area may include the target area.
  • when the first gesture acts on the overlapping area, it cannot be determined whether the user wishes to trigger the execution of the first function. Therefore, if the first gesture has no target function corresponding to the target area, the response to the first gesture input is given up, and the first function is not executed.
  • if the first gesture has a target function corresponding to the target area, that is, if the first gesture acting on the target area can trigger a corresponding target function, only the target function is executed: the function of the first gesture acting on the target area is executed, while the first function of the first gesture acting on the first area is not executed.
  • executing the target function instead of the first function effectively avoids instruction conflicts and ensures the normal execution of functions.
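The conflict rule above can be sketched as a small dispatcher over the overlapping area. The binding dictionaries and gesture names below are hypothetical; the disclosure only fixes the rule itself (the target function wins, and otherwise the input is dropped).

```python
def handle_overlap_gesture(gesture, target_bindings, first_area_bindings):
    """Resolve a gesture that lands in the overlap of the target area and
    the first area. Returns the single function name to execute, or None
    when the response is given up.

    first_area_bindings is intentionally never consulted in the overlap:
    the first function must not run there."""
    if gesture in target_bindings:
        # A target function exists: execute it, never the first function.
        return target_bindings[gesture]
    # No target function bound: give up the response instead of ambiguously
    # triggering the first area's function.
    return None
```

For example, with hypothetical bindings target = {"swipe_right": "go_back"} and first_area = {"swipe_right": "scroll_carousel", "tap": "open_item"}, a "swipe_right" in the overlap resolves to "go_back", and a "tap" is dropped even though the first area could handle it, matching the give-up branch described above.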
  • the interface control method provided in the embodiment of the present application may be executed by an interface control device.
  • in the embodiments of the present application, the interface control device performing the interface control method is taken as an example to describe the interface control device provided by the embodiments of the present application.
  • Fig. 4 is a schematic structural diagram of the interface control device provided by the embodiment of the present application. As shown in Fig. 4, the device includes: a first display module 410 used to display the target interface; a first acquiring module used to, if the target interface includes functional controls, acquire the position of the functional controls and determine the target area according to the position of the functional controls; and a first determining module 430 used to, if the target interface does not include functional controls, determine that the target area is a preset area; wherein, when the target gesture acts on the target area, the function corresponding to the target gesture and the target area is executed.
  • the first acquisition module is specifically used for:
  • wherein the target interface includes the interface area.
  • the device also includes:
  • a second acquiring module configured to acquire the distance between the first edge and the second edge
  • the second determination module is configured to determine the interface area formed by the first edge and the side where the second edge is located as the target area when the distance is greater than or equal to a preset value.
  • the target interface includes a first area
  • the first area includes the functional control
  • when the first gesture acts on the first area, the device further includes:
  • a first receiving module configured to receive a user's first gesture input when the target area at least partially overlaps the first area, and the first gesture acts on the target area and the first area the overlapping area;
  • a first execution module configured to give up responding to the first gesture input and not execute the first function if the first gesture does not have a target function corresponding to the target area;
  • in a case where the first gesture has a target function corresponding to the target area, in response to the first gesture input, the target function is executed and the first function is not executed.
  • in the embodiments of the present application, it is first determined whether the target interface contains functional controls. If the target interface contains functional controls, a target area that is as large as possible is determined based on the positions of the functional controls, without affecting the functions of those controls. If the target interface does not contain functional controls, a preset area is used as the target area. In this way, the scope of the target area is expanded as much as possible without impairing the functional controls, effectively alleviating the problem that a target operation acting on the target area may fail to receive a normal response because the target area is too small.
  • the interface control device in the embodiment of the present application may be an electronic device, or may be a component in the electronic device, such as an integrated circuit or a chip.
  • the electronic device may be a terminal, or other devices other than the terminal.
  • the electronic device can be a mobile phone, a tablet computer, a notebook computer, a handheld computer, a vehicle electronic device, a mobile Internet device (Mobile Internet Device, MID), an augmented reality (augmented reality, AR)/virtual reality (virtual reality, VR) ) equipment, robots, wearable devices, ultra-mobile personal computer (ultra-mobile personal computer, UMPC), netbook or personal digital assistant (personal digital assistant, PDA), etc.
  • the interface control device in the embodiment of the present application may be a device with an operating system.
  • the operating system may be an Android operating system, an iOS operating system, or other possible operating systems, which are not specifically limited in this embodiment of the present application.
  • the interface control device provided in the embodiment of the present application can realize various processes realized by the method embodiments in FIG. 1 to FIG. 3 , and details are not repeated here to avoid repetition.
  • FIG. 5 is a schematic structural diagram of an electronic device provided by the embodiment of the present application.
  • when the program or instruction running on the processor 501 is executed by the processor 501, the steps of the above-mentioned interface control method embodiment are implemented, and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
  • the electronic devices in the embodiments of the present application include the above-mentioned mobile electronic devices and non-mobile electronic devices.
  • FIG. 6 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
  • the electronic device 600 includes but is not limited to: a radio frequency unit 601, a network module 602, an audio output unit 603, an input unit 604, a sensor 605, a display unit 606, a user input unit 607, an interface unit 608, a memory 609, and a processor 610, etc. part.
  • the electronic device 600 may also include a power supply (such as a battery) for supplying power to the various components; the power supply may be logically connected to the processor 610 through a power management system, so that functions such as charging, discharging, and power consumption management are realized through the power management system.
  • the structure of the electronic device shown in FIG. 6 does not constitute a limitation on the electronic device; the electronic device may include more or fewer components than shown in the figure, combine some components, or arrange components differently, which will not be repeated here.
  • the display unit 606 is used to display the target interface
  • the processor 610 is configured to acquire a position of the functional control when the target interface includes a functional control, and determine a target area according to the position of the functional control;
  • and, if the target interface does not include functional controls, determine that the target area is a preset area;
  • wherein, when the target gesture acts on the target area, the function corresponding to the target gesture and the target area is executed.
  • the processor 610 is configured to acquire an edge position of the functional control
  • the processor 610 is configured to determine a second edge of the functional control that is closest to the first edge of the target interface according to the edge position of the functional control;
  • the processor 610 is configured to determine an interface area formed by the first edge and the side where the second edge is located as a target area;
  • the target interface includes the interface area.
  • the sensor 605 is used to obtain the distance between the first edge and the second edge;
  • the processor 610 is configured to determine the interface area formed by the first edge and the side where the second edge is located as the target area when the distance is greater than or equal to a preset value.
  • the user input unit 607 is configured to receive a user's first gesture input, the first gesture acting on the overlapping area of the target area and the first area;
  • the processor 610 is configured to, in a case where the first gesture has no target function corresponding to the target area, give up responding to the first gesture input and not execute the first function
  • the processor 610 is configured to, in a case where the first gesture has a target function corresponding to the target area, execute the target function and not execute the first function in response to the first gesture input.
  • in the embodiments of the present application, it is first determined whether the target interface contains functional controls. If the target interface contains functional controls, a target area that is as large as possible is determined based on the positions of the functional controls, without affecting the functions of those controls. If the target interface does not contain functional controls, a preset area is used as the target area. In this way, the scope of the target area is expanded as much as possible without impairing the functional controls, effectively alleviating the problem that a target operation acting on the target area may fail to receive a normal response because the target area is too small.
  • the input unit 604 may include a graphics processing unit (GPU) 6041 and a microphone 6042; the graphics processor 6041 is used to process image data of still pictures or video obtained by an image capture device (such as a camera).
  • the display unit 606 may include a display panel 6061, and the display panel 6061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like.
  • the user input unit 607 includes at least one of a touch panel 6071 and other input devices 6072 .
  • the touch panel 6071 is also called a touch screen.
  • the touch panel 6071 may include two parts, a touch detection device and a touch controller.
  • Other input devices 6072 may include, but are not limited to, physical keyboards, function keys (such as volume control buttons, switch buttons, etc.), trackballs, mice, and joysticks, which will not be repeated here.
  • the memory 609 can be used to store software programs as well as various data.
  • the memory 609 may mainly include a first storage area for storing programs or instructions and a second storage area for storing data, wherein the first storage area may store an operating system, an application program or instructions required by at least one function (such as a sound playing function, image playback function, etc.), etc.
  • the memory 609 may include volatile memory or nonvolatile memory, or the memory 609 may include both volatile and nonvolatile memory.
  • the non-volatile memory may be a read-only memory (Read-Only Memory, ROM), a programmable read-only memory (Programmable ROM, PROM), an erasable programmable read-only memory (Erasable PROM, EPROM), an electrically erasable programmable read-only memory (Electrically EPROM, EEPROM), or a flash memory.
  • the volatile memory may be a random access memory (Random Access Memory, RAM), a static random access memory (Static RAM, SRAM), a dynamic random access memory (Dynamic RAM, DRAM), a synchronous dynamic random access memory (Synchronous DRAM, SDRAM), a double data rate synchronous dynamic random access memory (Double Data Rate SDRAM, DDRSDRAM), an enhanced synchronous dynamic random access memory (Enhanced SDRAM, ESDRAM), a synchlink dynamic random access memory (Synch link DRAM, SLDRAM), or a direct rambus random access memory (Direct Rambus RAM, DRRAM).
  • the processor 610 may include one or more processing units; optionally, the processor 610 integrates an application processor and a modem processor, where the application processor mainly handles operations involving the operating system, user interface, and application programs, and the modem processor mainly handles wireless communication signals, such as a baseband processor. It can be understood that the modem processor may alternatively not be integrated into the processor 610.
  • the embodiment of the present application also provides a readable storage medium, on which a program or instructions are stored; when the program or instructions are executed by a processor, each process of the above interface control method embodiment is implemented and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
  • the processor is the processor in the electronic device described in the above embodiments.
  • the readable storage medium includes a computer readable storage medium, such as a computer read only memory ROM, a random access memory RAM, a magnetic disk or an optical disk, and the like.
  • the embodiment of the present application further provides a chip, the chip including a processor and a communication interface coupled to the processor, where the processor is used to run programs or instructions to implement each process of the above interface control method embodiment.
  • the chips mentioned in the embodiments of the present application may also be referred to as system-level chips, system chips, chip systems, or system-on-chip chips.
  • the embodiment of the present application provides a computer program product, the program product being stored in a storage medium; the program product is executed by at least one processor to implement each process in the above interface control method embodiment and can achieve the same technical effect; to avoid repetition, details are not repeated here.
  • the terms "comprising", "including", or any other variants thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus comprising a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent in the process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not preclude the presence of additional identical elements in the process, method, article, or apparatus comprising that element.
  • the scope of the methods and devices in the embodiments of the present application is not limited to performing functions in the order shown or discussed; it may also include performing functions in a substantially simultaneous manner or in the reverse order according to the functions involved. For example, the described methods may be performed in an order different from that described, and various steps may also be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.

Abstract

An interface control method and apparatus, an electronic device, and a storage medium, belonging to the field of communications technology. The method includes: displaying a target interface (110); when the target interface includes a functional control, acquiring the position of the functional control and determining a target area according to the position of the functional control (120); when the target interface does not include a functional control, the target area being a preset area (130); where, when a target gesture acts on the target area, a function corresponding to the target gesture and the target area is executed.

Description

Interface Control Method and Apparatus, Electronic Device, and Storage Medium
CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims priority to Chinese Patent Application No. 202111422090.X, filed on November 26, 2021 and entitled "Interface Control Method and Apparatus, Electronic Device, and Storage Medium", which is incorporated herein by reference in its entirety.
TECHNICAL FIELD
This application belongs to the field of communications technology, and specifically relates to an interface control method and apparatus, an electronic device, and a storage medium.
BACKGROUND
With the continuous development of electronic technology, physical keys and fixed virtual keys are often no longer used in electronic devices in order to expand the usable display area. A target area is therefore designated within the display area of the electronic device to respond to a user's target operation, thereby implementing the function corresponding to the target operation in that target area.
However, as the screens of electronic devices keep growing, the target area within the screen's display area may be small. While performing a target operation, the user may well fail to touch the target area, so the target operation cannot receive a normal response.
SUMMARY
The purpose of the embodiments of this application is to provide an interface control method and apparatus, an electronic device, and a storage medium that can solve the problem that the target area may be small, which easily causes target operations on the target area to go unanswered.
In a first aspect, an embodiment of this application provides an interface control method, including:
displaying a target interface;
when the target interface includes a functional control, acquiring the position of the functional control and determining a target area according to the position of the functional control;
when the target interface does not include a functional control, the target area being a preset area;
where, when a target gesture acts on the target area, a function corresponding to the target gesture and the target area is executed.
In a second aspect, an embodiment of this application provides an interface control apparatus, including:
a first display module, configured to display a target interface;
a first acquisition module, configured to, when the target interface includes a functional control, acquire the position of the functional control and determine a target area according to the position of the functional control;
a first determination module, configured to, when the target interface does not include a functional control, use a preset area as the target area;
where, when a target gesture acts on the target area, a function corresponding to the target gesture and the target area is executed.
In a third aspect, an embodiment of this application provides an electronic device, including a processor and a memory, where the memory stores a program or instructions executable on the processor, and the program or instructions, when executed by the processor, implement the steps of the method according to the first aspect.
In a fourth aspect, an embodiment of this application provides a readable storage medium storing a program or instructions that, when executed by a processor, implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of this application provides a chip, including a processor and a communication interface coupled to the processor, where the processor is configured to run a program or instructions to implement the method according to the first aspect.
In a sixth aspect, an embodiment of this application provides a computer program product stored in a storage medium, where the program product is executed by at least one processor to implement the method according to the first aspect.
In the embodiments of this application, whether the target interface contains a functional control is analyzed. When the target interface contains a functional control, a target area that is as large as possible without affecting the function of the control is determined based on the position of the functional control; when the target interface contains no functional control, the preset area is used as the target area. The embodiments of this application expand the scope of the target area as much as possible without impairing the functions of the functional controls, thereby effectively alleviating the problem that a target operation acting on the target area cannot receive a normal response because the target area may be small.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic flowchart of an interface control method provided by an embodiment of this application;
FIG. 2 is a schematic diagram of edge areas provided by an embodiment of this application;
FIG. 3 is a schematic diagram of determining a target area in an embodiment of this application;
FIG. 4 is a schematic structural diagram of an interface control apparatus provided by an embodiment of this application;
FIG. 5 is a schematic structural diagram of an electronic device provided by an embodiment of this application;
FIG. 6 is a schematic diagram of the hardware structure of an electronic device implementing an embodiment of this application.
DETAILED DESCRIPTION
The technical solutions in the embodiments of this application are described clearly below with reference to the accompanying drawings in the embodiments of this application. Obviously, the described embodiments are only some, rather than all, of the embodiments of this application. Based on the embodiments in this application, all other embodiments obtained by a person of ordinary skill in the art fall within the protection scope of this application.
The terms "first", "second", and the like in the specification and claims of this application are used to distinguish similar objects, not to describe a specific order or sequence. It should be understood that data used in this way are interchangeable where appropriate, so that the embodiments of this application can be implemented in orders other than those illustrated or described here. Objects distinguished by "first", "second", and the like are usually of one class, and the number of such objects is not limited; for example, there may be one first object or more than one. In addition, "and/or" in the specification and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects.
The interface control method and apparatus, electronic device, and storage medium provided by the embodiments of this application are described in detail below through specific embodiments and their application scenarios with reference to the accompanying drawings.
FIG. 1 is a schematic flowchart of an interface control method provided by an embodiment of this application. As shown in FIG. 1, the method includes:
Step 110: display a target interface.
Specifically, the electronic device displays the target interface on its display screen. The target interface described in the embodiments of this application may specifically be the interface currently being displayed on the screen of the electronic device. It may be the display interface of a system application, such as a system settings interface or a calendar interface, or the interface of an application installed on the electronic device, such as the payment interface of a shopping application.
The electronic device described in the embodiments of this application may include, but is not limited to, a mobile phone, a tablet computer, or another portable communication device with a touch-sensitive surface (for example, a touchscreen display and/or a touchpad). It should also be understood that in some embodiments, the terminal may not be a portable communication device but a desktop computer with a touch-sensitive surface (for example, a touchscreen display and/or a touchpad).
In the embodiments of this application, the edge of the target interface may specifically be the edge of the screen of the electronic device.
Step 120: when the target interface includes a functional control, acquire the position of the functional control and determine a target area according to the position of the functional control.
Specifically, the functional control described in the embodiments of this application may specifically be a functional control displayed in the current display interface; it may refer to a control that can interact with the user, that is, a control that can respond to the user's input and present a corresponding function.
The functional control described in the embodiments of this application may specifically refer to an icon or a button in an interface, for example, the shutter button control or the flash button control in the shooting interface of a camera.
In the embodiments of this application, the attribute information of each control in the current display interface may be used to distinguish whether each control is a functional control or a non-functional control; alternatively, functional and non-functional controls in the current display interface may be distinguished according to parameters such as a control's size, position, or layer order in the current display interface.
In the embodiments of this application, if a functional control is detected in the target interface based on the attribute information or parameters of each control in the target interface, the target interface includes a functional control.
Correspondingly, if no functional control in the target interface can be detected based on the attribute information or parameters of each control in the target interface, the target interface does not include a functional control.
It can be understood that when the target interface includes a functional control, a control that needs to respond to a specific user input exists in the target interface; in the embodiments of this application, its position may be further determined according to the attribute information or parameter information of the functional control.
The position of the functional control described in the embodiments of this application may specifically be the position of the functional control relative to the target interface, specifically its position in the pixel coordinate system corresponding to the target interface. For example, when the functional control is an icon or a button, the position of the functional control is the position of the icon or button as a whole in the target interface.
More specifically, the functional control described in the embodiments of this application has a corresponding response area, and in the embodiments of this application the target area may be further determined according to the edge position of the response area of the functional control.
In the embodiments of this application, to solve the problem that the target area may be small, so that the user may well fail to touch the target area while performing a target operation on it, the embodiments of this application consider directly enlarging the area of the target area, so that the target area can better respond to the user's target operation.
However, when the area of the target area is enlarged, the target area may cover a functional control; the functional control then cannot normally respond to the user's input to it, that is, the function of the functional control cannot be implemented normally. In the embodiments of this application, the target area may be determined based on the position of the functional control, so that the size of the target area is enlarged as much as possible without affecting the normal implementation of the functional control's function.
Step 130: when the target interface does not include a functional control, the target area is a preset area.
Where, when a target gesture acts on the target area, a function corresponding to the target gesture and the target area is executed.
Specifically, the target area described in the embodiments of this application may be an area dedicated to responding to the target gesture: when the target area receives the target gesture, the function corresponding to the target gesture acting on the target area is executed, and when the target area receives other gestures, those gestures are executed normally.
For example, if the target gesture corresponds to function A in other areas but to function B in the target area, only function B is executed when the target gesture acts on the target area, whereas function A is executed when the target gesture acts on other areas.
For example, when the target area is an edge area, the edge area may specifically refer to one or more of the top, bottom, left, and right edge areas of the display screen of the electronic device.
FIG. 2 is a schematic diagram of edge areas provided by an embodiment of this application. As shown in FIG. 2, the edge areas may include, for example, a top edge area 21, a left edge area 22, and a bottom edge area 23 of the current display interface of the electronic device.
The edge areas in the embodiments of this application are specifically used to respond to the user's target gestures. For example, the target gesture may be a rightward swipe on the left side of the current display interface whose starting point lies in the left edge area of the current display interface; in this case, the current display interface can be switched to the previous-level interface.
For example, the target gesture may be an upward swipe on the lower side of the current display interface whose starting point lies in the bottom edge area; in this case, the current display interface can be switched to the multitask management interface.
For example, the target gesture may be a downward swipe on the upper side of the current display interface whose starting point lies in the top edge area; in this case, the bottom status bar can be called up in the current display interface.
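The edge-gesture mapping described above can be illustrated with a short sketch. The area names, the margin width, and the gesture-to-function table below are illustrative assumptions for demonstration only; the embodiments do not prescribe any concrete implementation:

```python
# Hedged sketch of dispatching a swipe by the edge area containing
# its starting point. Area names, margin, and the table of functions
# are assumptions, not part of the disclosure.

EDGE_GESTURES = {
    ("left", "swipe_right"): "go_to_previous_interface",
    ("bottom", "swipe_up"): "open_multitask_manager",
    ("top", "swipe_down"): "show_status_bar",
}

def classify_start_point(x, y, width, height, margin=40):
    """Return the edge area containing pixel (x, y), or None."""
    if x < margin:
        return "left"
    if y < margin:
        return "top"
    if y > height - margin:
        return "bottom"
    return None

def dispatch_edge_gesture(x, y, direction, width, height):
    """Look up the function bound to this edge area and direction."""
    area = classify_start_point(x, y, width, height)
    return EDGE_GESTURES.get((area, direction))
```

A gesture starting well inside the interface falls in no edge area, so `dispatch_edge_gesture` returns `None` and the input is handled by the normal gesture path.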
The preset area described in the embodiments of this application may specifically be a target area set in advance, that is, a default target area.
When the target interface does not include a functional control, the target area cannot be determined according to the position of a functional control. If the entire target interface were set as the target area, the normal response of other inputs in the target interface might be affected.
Therefore, when the target interface does not include a functional control, the embodiments of this application set the target area to the preset area.
In the embodiments of this application, whether the target interface contains a functional control is analyzed. When the target interface contains a functional control, a target area that is as large as possible without affecting the function of the control is determined based on the position of the functional control; when the target interface contains no functional control, the preset area is used as the target area. The embodiments of this application expand the scope of the target area as much as possible without impairing the functions of the functional controls, thereby effectively alleviating the problem that a target operation acting on the target area cannot receive a normal response because the target area may be small.
Optionally, acquiring the position of the functional control and determining the target area according to the position of the functional control includes:
acquiring an edge position of the functional control;
determining, according to the edge position of the functional control, a second edge of the functional control closest to a first edge of the target interface; and
determining the interface region formed between the first edge and the side on which the second edge lies as the target area;
where the target interface includes the interface region.
Specifically, a functional control described in the embodiments of this application usually corresponds to a region; therefore, the edge position of a functional control described in the embodiments of this application may specifically refer to the edge position of the region corresponding to the control. For example, when the functional control is a virtual-key icon, the edge position of the functional control is the edge of that icon.
More specifically, each functional control may have multiple edges; correspondingly, the edge position of the control is the position formed by those multiple edges.
In the embodiments of this application, the edge position of the functional control may specifically be determined according to the attributes of the functional control; the edge position may specifically be the position of the control's edge relative to the target interface.
In the embodiments of this application, after the position of the functional control is determined, the target area may be further determined according to the edge position of the functional control; specifically, the second edge of the functional control closest to the first edge of the target interface may be determined according to the edge position of the functional control.
The first edge described in the embodiments of this application may specifically be any edge of the target interface; after the first edge is determined, the second edge closest to the first edge can be further determined.
The second edge described in the embodiments of this application may specifically be the edge, among the edges of the functional control, closest to the first edge of the target interface. Correspondingly, when there are multiple functional controls, the second edge is the edge, among all the functional controls, closest to the first edge of the target interface; that is, even when there are multiple functional controls, only one second edge is determined.
It can be understood that in the embodiments of this application, whichever edge of the target interface is selected as the first edge, a corresponding second edge exists.
In the embodiments of this application, after the first edge and the second edge are determined, the sides on which the first edge and the second edge lie may serve as sides of the target area. Since the distance between the first edge and the second edge is also relatively fixed, once the two sides are determined, the interface region formed by these two sides is determined as the target area.
FIG. 3 is a schematic diagram of determining the target area in an embodiment of this application. As shown in FIG. 3, it includes a functional control 31, a first edge 32 of the target interface, and a second edge 311 of the functional control 31 closest to the first edge 32; the interface region 33 formed between the first edge 32 and the side on which the second edge 311 lies is determined as the target area.
In the embodiments of this application, after the edge position of the functional control is acquired, the second edge of the functional control closest to the first edge of the target interface is determined, and the target area is determined accordingly. In this way, the area of the target area is enlarged as much as possible without affecting the normal use of the functional control, effectively ensuring that the target gesture can receive a successful response.
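The edge-based determination described above (choose a first edge, take the closest control edge as the second edge, and use the region between them) can be sketched as follows. The rectangle representation (left, top, right, bottom) in interface pixel coordinates and the function names are assumptions for illustration only:

```python
# Hedged sketch of determining the target area from the first edge of
# the interface and the nearest (second) edge among all functional
# controls. Rectangles are (left, top, right, bottom) in pixels.

def nearest_control_edge(first_edge, controls, width, height):
    """Distance from the interface's first edge to the closest control
    edge, taken over all functional controls (only one second edge)."""
    distances = []
    for left, top, right, bottom in controls:
        if first_edge == "left":
            distances.append(left)
        elif first_edge == "right":
            distances.append(width - right)
        elif first_edge == "top":
            distances.append(top)
        else:  # "bottom"
            distances.append(height - bottom)
    return min(distances)

def target_area(first_edge, controls, width, height):
    """Interface region between the first edge and the side on which
    the second edge lies."""
    d = nearest_control_edge(first_edge, controls, width, height)
    if first_edge == "left":
        return (0, 0, d, height)
    if first_edge == "right":
        return (width - d, 0, width, height)
    if first_edge == "top":
        return (0, 0, width, d)
    return (0, height - d, width, height)
```

With several controls present, the strip extends only up to the control nearest the chosen edge, so no control's response area is covered.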
Optionally, after determining, according to the edge position of the functional control, the second edge of the functional control closest to the first edge of the target interface, the method further includes:
acquiring the distance between the first edge and the second edge; and
when the distance is greater than or equal to a preset value, determining the interface region formed between the first edge and the side on which the second edge lies as the target area.
Specifically, the preset value described in the embodiments of this application may be a value set in advance.
The distance between the first edge and the second edge described in the embodiments of this application may specifically be the shortest straight-line distance between the first edge and the second edge, which may be expressed in pixels; the distance is specifically determined from the pixel coordinates of the first edge and the pixel coordinates of the second edge in the pixel coordinate system corresponding to the target interface.
In the embodiments of this application, when the distance is greater than or equal to the preset value, some distance remains between the first edge and the functional control; in this case, the interface region formed between the first edge and the side on which the second edge lies may be determined as the target area.
Optionally, in the embodiments of this application, when the distance is less than the preset value, the distance between the first edge and the functional control is small. If the target area were determined from the interface region formed between the first edge and the side on which the second edge lies, the target area might affect the normal function of the functional control; therefore, in this case, the target area is directly determined to be the preset area.
In the embodiments of this application, by fully considering the distance between the first edge and the second edge, the area of the target area is enlarged as much as possible without affecting the functional control during the determination of the target area.
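The preset-value check described above can be sketched as follows; the concrete preset area and threshold below are illustrative assumptions, not values disclosed by the embodiments:

```python
# Hedged sketch of the threshold check: the enlarged region is used
# only when the first-to-second-edge distance reaches the preset
# value; otherwise the preset (default) area is kept.
# PRESET_AREA and PRESET_VALUE are assumed example values.

PRESET_AREA = (0, 0, 30, 2400)   # assumed default left-edge strip
PRESET_VALUE = 60                # assumed minimum distance in pixels

def choose_target_area(edge_distance, enlarged_area,
                       preset_area=PRESET_AREA,
                       preset_value=PRESET_VALUE):
    """Fall back to the preset area when the enlarged region would
    crowd the functional control."""
    if edge_distance >= preset_value:
        return enlarged_area
    return preset_area
```
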
Optionally, the target interface includes a first area, the first area includes the functional control, and when a first gesture acts on the first area, a first function corresponding to the first gesture and the first area is executed. After determining the target area according to the position of the functional control, the method further includes:
when the target area at least partially overlaps the first area, receiving a first gesture input from the user, the first gesture acting on the overlapping region of the target area and the first area;
when the first gesture has no target function corresponding to the target area, abandoning the response to the first gesture input and not executing the first function; and
when the first gesture has a target function corresponding to the target area, executing the target function in response to the first gesture input, without executing the first function.
Specifically, the first area described in the embodiments of this application may be a specific area in the target interface: when the first area receives its corresponding first gesture, the corresponding first function is executed, and when the first area receives other gestures, the functions of those gestures are executed normally.
For example, if the first gesture corresponds to function C in other areas but to function D in the first area, only function D is executed when the first gesture acts on the first area, whereas function C is executed when the first gesture acts on other areas.
The target area at least partially overlapping the first area, as described in the embodiments of this application, may specifically mean that the two partially overlap, that the target area contains the first area, or that the first area contains the target area.
In the embodiments of this application, when the target area at least partially overlaps the first area, it cannot be determined from a first gesture acting on the overlapping region whether the user wishes to trigger the first function. Therefore, when the first gesture has no target function corresponding to the target area, the response to the first gesture input is abandoned and the first function is not executed.
That the first gesture has a target function corresponding to the target area, as described in the embodiments of this application, means that when the first gesture acting on the target area can trigger a corresponding target function, only that target function is executed: the function of the first gesture acting on the target area is executed, and the function of the first gesture acting on the first area is not responded to, that is, the first function is not executed.
In the embodiments of this application, the case where the target area at least partially overlaps the first area is fully considered: when a first input acting on the overlapping region of the target area and the first area is received, only the target function is executed and the first function is not, which effectively avoids command conflicts and ensures that functions execute normally.
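The conflict rule described above (in the overlapping region, the target function, if any, wins and the first function is suppressed) can be sketched as a small dispatcher; the function placeholders are illustrative assumptions:

```python
# Hedged sketch of resolving a first gesture landing in the overlap of
# the target area and the first area. Placeholders stand in for the
# actual bound functions; this is not the disclosed implementation.

def resolve_overlap_gesture(target_function, first_function):
    """target_function: function bound to the gesture in the target
    area, or None if the gesture has no target function there.
    first_function: function bound to the gesture in the first area.
    Returns the function to execute, or None to drop the input."""
    if target_function is not None:
        return target_function  # execute only the target function
    return None                 # abandon the response; skip first_function
```
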
The interface control method provided by the embodiments of this application may be executed by an interface control apparatus. In the embodiments of this application, the interface control apparatus executing the interface control method is taken as an example to describe the interface control apparatus provided by the embodiments of this application.
FIG. 4 is a schematic structural diagram of an interface control apparatus provided by an embodiment of this application. As shown in FIG. 4, the apparatus includes: a first display module 410, configured to display a target interface; a first acquisition module 420, configured to, when the target interface includes a functional control, acquire the position of the functional control and determine a target area according to the position of the functional control; and a first determination module 430, configured to, when the target interface does not include a functional control, use a preset area as the target area; where, when a target gesture acts on the target area, a function corresponding to the target gesture and the target area is executed.
Optionally, the first acquisition module is specifically configured to:
acquire an edge position of the functional control;
determine, according to the edge position of the functional control, a second edge of the functional control closest to a first edge of the target interface; and
determine the interface region formed between the first edge and the side on which the second edge lies as the target area;
where the target interface includes the interface region.
Optionally, the apparatus further includes:
a second acquisition module, configured to acquire the distance between the first edge and the second edge; and
a second determination module, configured to, when the distance is greater than or equal to a preset value, determine the interface region formed between the first edge and the side on which the second edge lies as the target area.
Optionally, the target interface includes a first area, the first area includes the functional control, and when a first gesture acts on the first area, the apparatus further includes:
a first receiving module, configured to, when the target area at least partially overlaps the first area, receive a first gesture input from the user, the first gesture acting on the overlapping region of the target area and the first area; and
a first execution module, configured to, when the first gesture has no target function corresponding to the target area, abandon the response to the first gesture input and not execute the first function;
and, when the first gesture has a target function corresponding to the target area, execute the target function in response to the first gesture input, without executing the first function.
In the embodiments of this application, whether the target interface contains a functional control is analyzed. When the target interface contains a functional control, a target area that is as large as possible without affecting the function of the control is determined based on the position of the functional control; when the target interface contains no functional control, the preset area is used as the target area. The embodiments of this application expand the scope of the target area as much as possible without impairing the functions of the functional controls, thereby effectively alleviating the problem that a target operation acting on the target area cannot receive a normal response because the target area may be small.
The interface control apparatus in the embodiments of this application may be an electronic device or a component of an electronic device, such as an integrated circuit or a chip. The electronic device may be a terminal or a device other than a terminal. For example, the electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, an in-vehicle electronic device, a mobile internet device (Mobile Internet Device, MID), an augmented reality (augmented reality, AR)/virtual reality (virtual reality, VR) device, a robot, a wearable device, an ultra-mobile personal computer (ultra-mobile personal computer, UMPC), a netbook, or a personal digital assistant (personal digital assistant, PDA), or may be a server, a network attached storage (Network Attached Storage, NAS), a personal computer (personal computer, PC), a television (television, TV), an automated teller machine, or a self-service kiosk, which is not specifically limited in the embodiments of this application.
The interface control apparatus in the embodiments of this application may be an apparatus with an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of this application.
The interface control apparatus provided by the embodiments of this application can implement each process implemented by the method embodiments of FIG. 1 to FIG. 3; to avoid repetition, details are not repeated here.
Optionally, FIG. 5 is a schematic structural diagram of an electronic device provided by an embodiment of this application. As shown in FIG. 5, an embodiment of this application further provides an electronic device 500, including a processor 501 and a memory 502, where the memory 502 stores a program or instructions executable on the processor 501; when the program or instructions are executed by the processor 501, each step of the interface control method embodiments above is implemented and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
It should be noted that the electronic devices in the embodiments of this application include the mobile electronic devices and non-mobile electronic devices described above.
FIG. 6 is a schematic diagram of the hardware structure of an electronic device implementing an embodiment of this application.
The electronic device 600 includes, but is not limited to, components such as a radio frequency unit 601, a network module 602, an audio output unit 603, an input unit 604, a sensor 605, a display unit 606, a user input unit 607, an interface unit 608, a memory 609, and a processor 610.
A person skilled in the art can understand that the electronic device 600 may further include a power supply (such as a battery) that supplies power to each component. The power supply may be logically connected to the processor 610 through a power management system, so that functions such as charging management, discharging management, and power consumption management are implemented through the power management system. The structure of the electronic device shown in FIG. 6 does not constitute a limitation on the electronic device; the electronic device may include more or fewer components than shown, combine certain components, or arrange the components differently, which is not repeated here.
The display unit 606 is configured to display a target interface;
the processor 610 is configured to, when the target interface includes a functional control, acquire the position of the functional control and determine a target area according to the position of the functional control;
when the target interface does not include a functional control, the target area is a preset area;
where, when a target gesture acts on the target area, a function corresponding to the target gesture and the target area is executed.
Optionally, the processor 610 is configured to acquire an edge position of the functional control;
the processor 610 is configured to determine, according to the edge position of the functional control, a second edge of the functional control closest to a first edge of the target interface;
the processor 610 is configured to determine the interface region formed between the first edge and the side on which the second edge lies as the target area;
where the target interface includes the interface region.
Optionally, the sensor 605 is configured to acquire the distance between the first edge and the second edge;
the processor 610 is configured to, when the distance is greater than or equal to a preset value, determine the interface region formed between the first edge and the side on which the second edge lies as the target area.
Optionally, when the target area at least partially overlaps the first area, the user input unit 607 is configured to receive a first gesture input from the user, the first gesture acting on the overlapping region of the target area and the first area;
when the first gesture has no target function corresponding to the target area, the processor 610 is configured to abandon the response to the first gesture input and not execute the first function;
when the first gesture has a target function corresponding to the target area, the processor 610 is configured to execute the target function in response to the first gesture input, without executing the first function.
In the embodiments of this application, whether the target interface contains a functional control is analyzed. When the target interface contains a functional control, a target area that is as large as possible without affecting the function of the control is determined based on the position of the functional control; when the target interface contains no functional control, the preset area is used as the target area. The embodiments of this application expand the scope of the target area as much as possible without impairing the functions of the functional controls, thereby effectively alleviating the problem that a target operation acting on the target area cannot receive a normal response because the target area may be small.
It should be understood that in the embodiments of this application, the input unit 604 may include a graphics processing unit (Graphics Processing Unit, GPU) 6041 and a microphone 6042; the graphics processor 6041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The display unit 606 may include a display panel 6061, and the display panel 6061 may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 607 includes at least one of a touch panel 6071 and other input devices 6072. The touch panel 6071 is also called a touchscreen. The touch panel 6071 may include two parts: a touch detection device and a touch controller. The other input devices 6072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not repeated here.
The memory 609 may be used to store software programs and various data. The memory 609 may mainly include a first storage area for storing programs or instructions and a second storage area for storing data, where the first storage area may store an operating system and an application program or instructions required by at least one function (such as a sound playing function or an image playing function). In addition, the memory 609 may include volatile memory or non-volatile memory, or the memory 609 may include both volatile and non-volatile memory. The non-volatile memory may be a read-only memory (Read-Only Memory, ROM), a programmable read-only memory (Programmable ROM, PROM), an erasable programmable read-only memory (Erasable PROM, EPROM), an electrically erasable programmable read-only memory (Electrically EPROM, EEPROM), or a flash memory. The volatile memory may be a random access memory (Random Access Memory, RAM), a static random access memory (Static RAM, SRAM), a dynamic random access memory (Dynamic RAM, DRAM), a synchronous dynamic random access memory (Synchronous DRAM, SDRAM), a double data rate synchronous dynamic random access memory (Double Data Rate SDRAM, DDRSDRAM), an enhanced synchronous dynamic random access memory (Enhanced SDRAM, ESDRAM), a synchlink dynamic random access memory (Synch link DRAM, SLDRAM), or a direct rambus random access memory (Direct Rambus RAM, DRRAM). The memory 609 in the embodiments of this application includes, but is not limited to, these and any other suitable types of memory.
The processor 610 may include one or more processing units; optionally, the processor 610 integrates an application processor and a modem processor, where the application processor mainly handles operations involving the operating system, user interface, and application programs, and the modem processor mainly handles wireless communication signals, such as a baseband processor. It can be understood that the modem processor may alternatively not be integrated into the processor 610.
An embodiment of this application further provides a readable storage medium, on which a program or instructions are stored; when the program or instructions are executed by a processor, each process of the interface control method embodiments above is implemented and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the embodiments above. The readable storage medium includes a computer-readable storage medium, such as a computer read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
An embodiment of this application further provides a chip, including a processor and a communication interface coupled to the processor; the processor is configured to run a program or instructions to implement each process of the interface control method embodiments above, and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
It should be understood that the chip mentioned in the embodiments of this application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-chip.
An embodiment of this application provides a computer program product stored in a storage medium; the program product is executed by at least one processor to implement each process of the interface control method embodiments above, and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
It should be noted that, herein, the terms "comprising", "including", or any other variants thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus including a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "including a ..." does not preclude the presence of additional identical elements in the process, method, article, or apparatus including that element. Furthermore, it should be pointed out that the scope of the methods and apparatuses in the embodiments of this application is not limited to performing functions in the order shown or discussed; it may also include performing functions in a substantially simultaneous manner or in the reverse order according to the functions involved. For example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the description of the embodiments above, a person skilled in the art can clearly understand that the methods of the embodiments above can be implemented by means of software plus a necessary general-purpose hardware platform, or, of course, by hardware, but in many cases the former is the better implementation. Based on such an understanding, the technical solution of this application, in essence or in the part contributing to the prior art, can be embodied in the form of a computer software product stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disk), including several instructions to cause a terminal (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the methods described in the embodiments of this application.
The embodiments of this application have been described above with reference to the accompanying drawings, but this application is not limited to the specific implementations above, which are merely illustrative rather than restrictive. Under the inspiration of this application, a person of ordinary skill in the art can also make many other forms without departing from the purpose of this application and the scope protected by the claims, all of which fall within the protection of this application.

Claims (13)

  1. An interface control method, comprising:
    displaying a target interface;
    when the target interface includes a functional control, acquiring a position of the functional control and determining a target area according to the position of the functional control; and
    when the target interface does not include a functional control, the target area being a preset area;
    wherein, when a target gesture acts on the target area, a function corresponding to the target gesture and the target area is executed.
  2. The interface control method according to claim 1, wherein the acquiring the position of the functional control and determining the target area according to the position of the functional control comprises:
    acquiring an edge position of the functional control;
    determining, according to the edge position of the functional control, a second edge of the functional control closest to a first edge of the target interface; and
    determining an interface region formed between the first edge and the side on which the second edge lies as the target area;
    wherein the target interface includes the interface region.
  3. The interface control method according to claim 2, wherein after the determining, according to the edge position of the functional control, the second edge of the functional control closest to the first edge of the target interface, the method further comprises:
    acquiring a distance between the first edge and the second edge; and
    when the distance is greater than or equal to a preset value, determining the interface region formed between the first edge and the side on which the second edge lies as the target area.
  4. The interface control method according to claim 1, wherein the target interface includes a first area, the first area includes the functional control, and when a first gesture acts on the first area, a first function corresponding to the first gesture and the first area is executed; after the determining the target area according to the position of the functional control, the method further comprises:
    when the target area at least partially overlaps the first area, receiving a first gesture input from a user, the first gesture acting on an overlapping region of the target area and the first area;
    when the first gesture has no target function corresponding to the target area, abandoning a response to the first gesture input and not executing the first function; and
    when the first gesture has a target function corresponding to the target area, executing the target function in response to the first gesture input, without executing the first function.
  5. An interface control apparatus, comprising:
    a first display module, configured to display a target interface;
    a first acquisition module, configured to, when the target interface includes a functional control, acquire a position of the functional control and determine a target area according to the position of the functional control; and
    a first determination module, configured to, when the target interface does not include a functional control, use a preset area as the target area;
    wherein, when a target gesture acts on the target area, a function corresponding to the target gesture and the target area is executed.
  6. The interface control apparatus according to claim 5, wherein the first acquisition module is specifically configured to:
    acquire an edge position of the functional control;
    determine, according to the edge position of the functional control, a second edge of the functional control closest to a first edge of the target interface; and
    determine an interface region formed between the first edge and the side on which the second edge lies as the target area;
    wherein the target interface includes the interface region.
  7. The interface control apparatus according to claim 6, wherein the apparatus further comprises:
    a second acquisition module, configured to acquire a distance between the first edge and the second edge; and
    a second determination module, configured to, when the distance is greater than or equal to a preset value, determine the interface region formed between the first edge and the side on which the second edge lies as the target area.
  8. The interface control apparatus according to claim 5, wherein the target interface includes a first area, the first area includes the functional control, and when a first gesture acts on the first area, the apparatus further comprises:
    a first receiving module, configured to, when the target area at least partially overlaps the first area, receive a first gesture input from a user, the first gesture acting on an overlapping region of the target area and the first area; and
    a first execution module, configured to, when the first gesture has no target function corresponding to the target area, abandon a response to the first gesture input and not execute the first function;
    and, when the first gesture has a target function corresponding to the target area, execute the target function in response to the first gesture input, without executing the first function.
  9. An electronic device, comprising a processor and a memory, wherein the memory stores a program or instructions executable on the processor, and the program or instructions, when executed by the processor, implement the steps of the interface control method according to any one of claims 1 to 4.
  10. A readable storage medium, storing a program or instructions that, when executed by a processor, implement the steps of the interface control method according to any one of claims 1 to 4.
  11. A chip, comprising a processor and a communication interface coupled to the processor, wherein the processor is configured to run a program or instructions to implement the steps of the interface control method according to any one of claims 1 to 4.
  12. A computer program product, stored in a storage medium, wherein the program product is executed by at least one processor to implement the steps of the interface control method according to any one of claims 1 to 4.
  13. An electronic device, configured to perform the interface control method according to any one of claims 1 to 4.
PCT/CN2022/133138 2021-11-26 2022-11-21 Interface control method and apparatus, electronic device, and storage medium WO2023093661A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111422090.XA CN114115639A (zh) 2021-11-26 2021-11-26 Interface control method and apparatus, electronic device, and storage medium
CN202111422090.X 2021-11-26

Publications (1)

Publication Number Publication Date
WO2023093661A1 true WO2023093661A1 (zh) 2023-06-01

Family

ID=80370115

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/133138 WO2023093661A1 (zh) 2021-11-26 2022-11-21 Interface control method and apparatus, electronic device, and storage medium

Country Status (2)

Country Link
CN (1) CN114115639A (zh)
WO (1) WO2023093661A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114115639A (zh) * 2021-11-26 2022-03-01 维沃移动通信有限公司 界面控制方法、装置、电子设备及存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012091704A1 (en) * 2010-12-29 2012-07-05 Empire Technology Development Llc Environment-dependent dynamic range control for gesture recognition
WO2019047187A1 (zh) * 2017-09-08 2019-03-14 广东欧珀移动通信有限公司 Navigation bar control method and apparatus
CN110209337A (zh) * 2013-05-23 2019-09-06 三星电子株式会社 Method and device for gesture-based user interface
CN110639203A (zh) * 2019-09-29 2020-01-03 网易(杭州)网络有限公司 Control response method and apparatus in a game
CN114115639A (zh) * 2021-11-26 2022-03-01 维沃移动通信有限公司 Interface control method and apparatus, electronic device, and storage medium


Also Published As

Publication number Publication date
CN114115639A (zh) 2022-03-01

Similar Documents

Publication Publication Date Title
KR102519800B1 (ko) 전자 장치
US9459704B2 (en) Method and apparatus for providing one-handed user interface in mobile device having touch screen
US20130159925A1 (en) Device and Method for Resizing User Interface Content
US20120032891A1 (en) Device, Method, and Graphical User Interface with Enhanced Touch Targeting
WO2011084857A1 (en) Apparatus and method having multiple application display modes including mode with display resolution of another apparatus
US11567725B2 (en) Data processing method and mobile device
WO2023061280A1 (zh) 应用程序显示方法、装置及电子设备
WO2023125425A1 (zh) 一种显示方法、装置和电子设备
WO2023155877A1 (zh) 应用图标管理方法、装置和电子设备
WO2023016463A1 (zh) 显示控制方法、装置、电子设备和介质
WO2022242515A1 (zh) 界面显示方法及装置
CN112433693A (zh) 分屏显示方法、装置及电子设备
WO2023093661A1 (zh) Interface control method and apparatus, electronic device, and storage medium
WO2024046203A1 (zh) 内容显示方法及装置
WO2024037419A1 (zh) 显示控制方法、装置、电子设备及可读存储介质
WO2023241563A1 (zh) 数据处理方法和电子设备
WO2024012416A1 (zh) 显示方法和装置
CN112148167A (zh) 控件设置方法、装置和电子设备
WO2023155874A1 (zh) 应用图标管理方法、装置和电子设备
WO2023155858A1 (zh) 文档编辑方法及其装置
WO2023030307A1 (zh) 截图方法、装置及电子设备
CN111638828A (zh) 界面显示方法及装置
CN114779977A (zh) 界面显示方法、装置、电子设备及存储介质
CN114416269A (zh) 界面显示方法和显示设备
CN114327726A (zh) 显示控制方法、装置、电子设备及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22897744

Country of ref document: EP

Kind code of ref document: A1