WO2022057881A1 - Operation method and apparatus - Google Patents

Operation method and apparatus

Info

Publication number
WO2022057881A1
WO2022057881A1, PCT/CN2021/118982, CN2021118982W
Authority
WO
WIPO (PCT)
Prior art keywords
interface
control
input
target
user
Prior art date
Application number
PCT/CN2021/118982
Other languages
English (en)
French (fr)
Inventor
李少赓
Original Assignee
维沃移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 维沃移动通信有限公司
Priority to EP21868703.6A (EP4216045A4)
Priority to KR1020237012327A (KR20230065337A)
Priority to JP2023517295A (JP2023542666A)
Publication of WO2022057881A1
Priority to US18/187,057 (US20230236852A1)

Links

Images

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817: Interaction techniques using icons
    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F 3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842: Selection of displayed objects or displayed text elements
    • G06F 3/0486: Drag-and-drop
    • G06F 3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Inputting data by handwriting, e.g. gesture or text
    • G06F 3/04886: Partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/445: Program loading or initiating
    • G06F 9/44521: Dynamic linking or loading; link editing at or after load time, e.g. Java class loading
    • G06F 9/44526: Plug-ins; add-ons
    • G06F 9/451: Execution arrangements for user interfaces
    • G06F 2203/00: Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/048: Indexing scheme relating to G06F 3/048
    • G06F 2203/04804: Transparency, e.g. transparent or translucent windows

Definitions

  • the embodiments of the present application relate to the field of communication technologies, and in particular, to an operation method and apparatus.
  • electronic devices can display different contents of different applications on one screen.
  • the electronic device can play a video in the form of a floating window when displaying a chat interface of a social application, so that the user can chat with others while playing the video.
  • the electronic device can only display video content, and the user cannot directly operate the video function controls, such as pausing the video playback or switching the playing music.
  • the purpose of the embodiments of the present application is to provide an operation method, apparatus and electronic device, which can solve the problem that the user cannot operate the application when the electronic device displays the interface display content of the application in the form of a floating window.
  • According to a first aspect, an embodiment of the present application provides an operation method. The method includes: receiving a first input while a first interface is displayed, where the first interface includes N first controls and N is a positive integer; displaying, in response to the first input, a target window and at least one second control, where the interface display content of the first interface is displayed in the target window, the at least one second control has a mapping relationship with the N first controls, and each second control is used to trigger a functional operation that can be triggered by the corresponding first control; receiving a second input from the user on a target control among the at least one second control; and performing, in response to the second input, a target operation corresponding to the target control.
  • According to a second aspect, an embodiment of the present application further provides an operating device. The device includes a receiving module, a display module, and an execution module. The receiving module is configured to receive a first input while a first interface is displayed, where the first interface includes N first controls and N is a positive integer. The display module is configured to display, in response to the first input received by the receiving module, a target window and at least one second control, where the interface display content of the first interface is displayed in the target window, the at least one second control has a mapping relationship with the N first controls, and each second control is used to trigger the functional operation that can be triggered by the corresponding first control. The receiving module is further configured to receive the user's first input on a target control among the at least one second control displayed by the display module. The execution module is configured to perform, in response to the first input received by the receiving module, a target operation corresponding to the target control.
  • According to a third aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, and a program or instruction stored in the memory and executable on the processor, where the program or instruction, when executed by the processor, implements the steps of the operation method according to the first aspect.
  • According to a fourth aspect, an embodiment of the present application provides a readable storage medium storing a program or instruction, where the program or instruction, when executed by a processor, implements the steps of the method according to the first aspect.
  • According to a fifth aspect, an embodiment of the present application provides a chip, including a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to run a program or instruction to implement the method according to the first aspect.
  • In the embodiments of the present application, after a first input by which the user triggers display of a target window is received, in response to the first input, not only the target window but also at least one second control that has a mapping relationship with the N first controls in the first interface is displayed. Furthermore, after a second input from the user on a target control among the at least one second control is received, a target operation corresponding to the target control is performed in response to the second input. As a result, when the electronic device displays the interface display content of an application in the form of a floating window, the user can control the application to perform certain functional operations through the function controls attached to the floating window.
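  • The mapping relationship at the heart of the method can be modeled very simply. The Kotlin sketch below is illustrative only and is not taken from the patent: the names FirstControl, SecondControl, and KeyMappingStore are assumptions, and it merely shows one plausible way a second control could resolve to the operations of the first controls mapped to it, including a combined mapping of several first controls to one second control.

```kotlin
// Illustrative data model for the control mapping described above.
// All names are hypothetical; the patent does not prescribe a data structure.

/** A control in the original ("first") interface and the operation it triggers. */
data class FirstControl(val id: String, val operation: () -> Unit)

/** A control shown next to the floating window, mapped to one or more first controls. */
data class SecondControl(val id: String, val mappedFirstControlIds: List<String>)

/** Stores the key mapping and resolves a tapped second control to concrete operations. */
class KeyMappingStore(firstControls: List<FirstControl>) {
    private val firstById = firstControls.associateBy { it.id }
    private val secondControls = mutableListOf<SecondControl>()

    fun addMapping(secondControlId: String, firstControlIds: List<String>) {
        secondControls += SecondControl(secondControlId, firstControlIds)
    }

    /** Returns the operations to run when the given second control receives input. */
    fun resolve(secondControlId: String): List<() -> Unit> =
        secondControls.firstOrNull { it.id == secondControlId }
            ?.mappedFirstControlIds
            ?.mapNotNull { firstById[it]?.operation }
            ?: emptyList()
}

fun main() {
    val store = KeyMappingStore(
        listOf(
            FirstControl("pause") { println("pause playback") },
            FirstControl("close") { println("close application") }
        )
    )
    // One second control mapped to a combination of two first controls.
    store.addMapping("pauseAndClose", listOf("pause", "close"))
    store.resolve("pauseAndClose").forEach { it() }  // runs both mapped operations
}
```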
  • FIG. 1 is a schematic diagram of a display interface of a floating window provided by an embodiment of the present application.
  • FIG. 2 is a schematic flowchart of an operation method provided by an embodiment of the present application.
  • FIG. 3 is one of the schematic diagrams of the interface to which an operation method provided by an embodiment of the present application is applied;
  • FIG. 4 is a second schematic diagram of an interface to which an operation method provided by an embodiment of the present application is applied;
  • FIG. 5 is a schematic structural diagram of an operating device provided by an embodiment of the present application.
  • FIG. 6 is one of the schematic structural diagrams of an electronic device provided by an embodiment of the present application.
  • FIG. 7 is a second schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar objects and are not used to describe a specific order or sequence. It should be understood that data used in this way are interchangeable under appropriate circumstances, so that the embodiments of the present application can be practiced in sequences other than those illustrated or described herein. Objects distinguished by "first", "second", and the like are usually of one type, and the number of such objects is not limited; for example, there may be one first object or more than one. In addition, "and/or" in the description and claims indicates at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects.
  • The operation method provided in the embodiments of the present application can be applied to a scenario in which an electronic device displays the interface display content of an application in the form of a floating window.
  • As an example of such a scenario, with the continuous enrichment of the functions of electronic devices in the related art, and in order to make the electronic device convenient to use and to make full use of its functions, a user may, as shown in FIG. 1, chat while watching a match video on the electronic device; the electronic device displays the video content currently being played by the video application in the form of a floating window or small window (the floating window 10 in FIG. 1).
  • Unlike split-screen techniques, the user cannot operate the application through the floating window or small window, for example, to pause the video content being played. As a result, when the user wants to pause the video playing in the floating window, the user needs to tap the floating window and switch to the original interface of the video application before the video can be paused, which is a cumbersome operation.
  • To address this problem, in the technical solutions provided in the embodiments of the present application, while the electronic device displays the first interface, a settings function can be used to map the function keys for which the user wants a quick operation, or a combined function of multiple function keys.
  • The electronic device then displays these mapped keys alongside the floating window, so that when the user taps a mapped key, the electronic device is triggered to perform the corresponding function; the user does not need to tap the floating window to open the original interface of the application in order to trigger these functions, which simplifies the operation.
  • an operation method provided by an embodiment of the present application may include the following steps 201 to 204:
  • Step 201 when the first interface is displayed, the operating device receives a first input.
  • the above-mentioned first interface includes N first controls, and N is a positive integer.
  • For example, the first interface may be an interface of a target application installed in the electronic device; after receiving the user's input on the N first controls of the first interface, the electronic device triggers the target application to perform the corresponding operations.
  • For example, the first input may be an input by the user on the screen of the electronic device, a voice instruction input by the user, or a specific gesture input by the user, which can be determined according to actual use requirements and is not limited in this embodiment of the present invention.
  • For example, the first input may be an input by which the user triggers the electronic device to display the first interface in the form of a floating window.
  • Step 202 The operating device displays a target window and at least one second control in response to the first input.
  • The interface display content of the first interface is displayed in the target window; the at least one second control has a mapping relationship with the N first controls, and each second control is used to trigger a functional operation that can be triggered by the corresponding first control.
  • Unlike the related art, in which the electronic device displays only the target window (that is, the floating window or small window), in the embodiments of the present application the electronic device displays not only the target window but also at least one second control at a preset position of the target window.
  • For example, the preset position may be below the target window.
  • It should be noted that the target window displays the interface display content of the first interface; the user can view the interface display content of the first interface in real time through the target window, but cannot control the target application through the target window.
  • For example, when the target application is a video application, the user can view the video content currently being played by the video application through the target window, but cannot control the video application to pause the video being played.
  • For example, the at least one second control is a control that has a mapping relationship with the N first controls in the first interface, where one second control has a mapping relationship with at least one first control.
  • The mapping relationship is a key mapping; that is, when the user taps a second control, the electronic device is triggered to perform the functional operation(s) that the at least one mapped first control can trigger.
  • After a second control establishes a mapping relationship with at least one first control, the second control is the mapping control of that at least one first control.
  • For example, with reference to FIG. 1 and as shown in FIG. 3, after the electronic device receives the user's first input, in response to the first input the electronic device displays, in addition to the floating window 20 (that is, the target window), three mapping controls in a region 21 that have a mapping relationship with controls in the first interface.
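  • As an illustration of step 202, the sketch below shows one way an Android implementation could present the target window together with a row of mapped controls beneath it. This is a hedged sketch, not the patent's implementation: the class name FloatingWindowPresenter is invented, the mirrored content view is assumed to come from elsewhere, and a real overlay additionally needs the SYSTEM_ALERT_WINDOW permission (TYPE_APPLICATION_OVERLAY requires API 26 or later).

```kotlin
// Hypothetical presentation of the "target window" plus its mapped "second controls".
import android.content.Context
import android.graphics.PixelFormat
import android.view.Gravity
import android.view.View
import android.view.WindowManager
import android.widget.Button
import android.widget.LinearLayout

class FloatingWindowPresenter(private val context: Context) {

    private val windowManager =
        context.getSystemService(Context.WINDOW_SERVICE) as WindowManager

    /** Shows one container holding the mirrored content (top) and mapped controls (bottom). */
    fun show(contentView: View, mappedControls: List<Pair<String, () -> Unit>>) {
        val container = LinearLayout(context).apply {
            orientation = LinearLayout.VERTICAL
            addView(contentView)                      // the "target window" content
            addView(buildControlBar(mappedControls))  // the "second controls" below it
        }

        val params = WindowManager.LayoutParams(
            WindowManager.LayoutParams.WRAP_CONTENT,
            WindowManager.LayoutParams.WRAP_CONTENT,
            WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY,
            WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,
            PixelFormat.TRANSLUCENT
        ).apply { gravity = Gravity.TOP or Gravity.START }

        windowManager.addView(container, params)
    }

    /** One button per mapped control; tapping it runs the mapped operation(s). */
    private fun buildControlBar(mappedControls: List<Pair<String, () -> Unit>>): View =
        LinearLayout(context).apply {
            orientation = LinearLayout.HORIZONTAL
            mappedControls.forEach { (label, operation) ->
                addView(Button(context).apply {
                    text = label
                    setOnClickListener { operation() }
                })
            }
        }
}
```

  • Keeping the mirrored content and the control bar in a single container is a design choice that also pays off when the window is dragged (see the drag sketch further below), because the controls then move with the window automatically.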
  • Step 203 The operating device receives a second input from the user to the target control in the at least one second control.
  • For example, the second input may be a tap, drag, long-press, or short-press input by the user on the target control; the appropriate input action depends on the control type of the target control.
  • For example, when the target control is a button control, the second input is a tap input by the user on the target control; when the target control is a progress-bar control, the second input is a drag input by the user on the target control.
  • Step 204 In response to the second input, the operating device executes the target operation corresponding to the above-mentioned target control.
  • For example, after the electronic device receives the user's second input on the target control, the electronic device performs the target operation in response to the second input.
  • The target operation is an operation that can be triggered by the first control in the first interface that has a mapping relationship with the target control.
  • As an example, after the electronic device detects that the user has tapped the pause control among the three mapping controls in the region 21, the electronic device sends a control instruction to the target application to control the target application to pause the video content being played.
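  • The following sketch illustrates how a second input on a mapped control could be turned into a control instruction for the target application. The TargetAppController interface and all other names are assumptions; on Android the pause call might in practice go through something like a media session's transport controls, but nothing here is prescribed by the patent.

```kotlin
// Hypothetical dispatch of a "second input" on a mapped control. The control
// type decides which kind of input is meaningful (tap vs. drag), and the
// resolved operation is forwarded to the target application.

/** Abstraction over the application whose content is mirrored in the floating window. */
interface TargetAppController {
    fun pausePlayback()
    fun seekTo(positionMs: Long)
}

sealed class SecondInput {
    object Tap : SecondInput()
    data class Drag(val progressFraction: Float) : SecondInput()
}

enum class MappedControlType { BUTTON, PROGRESS_BAR }

class MappedControlDispatcher(
    private val target: TargetAppController,
    private val durationMs: Long
) {
    /** Performs the target operation corresponding to the tapped or dragged mapped control. */
    fun dispatch(type: MappedControlType, input: SecondInput) {
        when {
            // A button-type mapping (e.g. "pause") reacts to a tap.
            type == MappedControlType.BUTTON && input is SecondInput.Tap ->
                target.pausePlayback()
            // A progress-bar mapping reacts to a drag and is translated into a seek.
            type == MappedControlType.PROGRESS_BAR && input is SecondInput.Drag ->
                target.seekTo((input.progressFraction * durationMs).toLong())
            else -> Unit  // other combinations are ignored in this sketch
        }
    }
}

fun main() {
    val controller = object : TargetAppController {
        override fun pausePlayback() = println("control instruction: pause")
        override fun seekTo(positionMs: Long) = println("control instruction: seek to $positionMs ms")
    }
    val dispatcher = MappedControlDispatcher(controller, durationMs = 90_000L)
    dispatcher.dispatch(MappedControlType.BUTTON, SecondInput.Tap)
    dispatcher.dispatch(MappedControlType.PROGRESS_BAR, SecondInput.Drag(0.5f))
}
```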
  • the above target application may be a video application, or may be a social application, a game application, or other applications that can be operated through touch controls.
  • the electronic device can not only perform key mapping on the controls in the first interface of the target application, but also perform function mapping on the functions of the target application.
  • When the electronic device maps the functions of the target application, the first interface includes N functions (for example, functions implemented through various gesture operations; corresponding controls for these functions may or may not exist in the first interface).
  • In that case, the at least one second control has a mapping relationship with the N functions, each second control is used to trigger a corresponding function among the N functions, and the target operation is: triggering, in the first interface, the target function corresponding to the target control.
  • It should be noted that, in the description that follows, the controls in the first interface may be equivalently replaced with functions.
  • In this way, after the first input by which the user triggers display of the target window is received, in response to the first input, not only the target window but also at least one second control that has a mapping relationship with the N first controls in the first interface is displayed. Furthermore, after the second input from the user on the target control among the at least one second control is received, the target operation corresponding to the target control is performed in response to the second input, so that when the electronic device displays the interface display content of an application in the form of a floating window, the user can control the application to perform certain functional operations through the function controls attached to the floating window.
  • Optionally, in the embodiments of the present application, in order for the electronic device to display the mapped keys (that is, the second controls) when displaying the floating window (that is, the target window), the electronic device needs to perform key mapping on the controls in the first interface.
  • For example, the first interface contains M controls, and the M controls include the N first controls, where M ≥ N and M is a positive integer. That is, the N first controls are the controls, among the M controls in the first interface, for which a mapping relationship with the second controls has been established.
  • In other words, the first interface contains M controls, of which the electronic device key-maps only N controls (that is, the N first controls).
  • the operation method provided by this embodiment of the present application may further include the following steps 202a1 and 202a2:
  • Step 202a1 In the case where the first interface is displayed, the operating device receives a third input from the user to at least one of the M controls.
  • Step 202a2 In response to the third input, the operating device determines that the at least one control is the first control.
  • the operations that can be triggered by the at least one control above include: functions that can be used when the first interface is in a floating window display state.
  • the user may enter the setting interface by clicking on the function entry provided by the electronic device.
  • the user can click one or more controls to be key mapped to perform key mapping.
  • the user can trigger the electronic device to display the target window through a specific gesture operation on the first interface.
  • It should be noted that when multiple controls among the M controls are key-mapped to one second control, after the user taps that second control, the electronic device performs the operations corresponding to those multiple controls. For example, the user may map the pause and close controls of a video application to a single mapping control; when the user taps that mapping control, the electronic device can pause the video content being played by the video application and close the video application.
  • As shown in (A) of FIG. 4, the electronic device displays a video playback interface of a video application (that is, the first interface), and the first interface includes three controls (rewind, pause, and fast-forward). When the electronic device receives a specific input from the user and displays a key-mapping settings interface (as shown in (B) of FIG. 4), the user can select at least one of the three controls and key-map it.
  • When the user wants to deselect a control, the user can do so by double-tapping the control.
  • After the settings are completed, the user can tap the "Finish" control, and the electronic device establishes the key mapping relationship.
  • If there is no control in the first interface on which a key-mapping operation can be performed, the electronic device displays the prompt message: "There are no operable items in the interface".
  • In this way, the user can, according to personal habits and preferences, key-map the function keys in the first interface of the application as single functions or combined functions, so that when the electronic device displays the first interface in the form of a floating window, the user can control the application by tapping the mapped keys.
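  • The selection flow just described (tick one or more controls, tap "Finish", or show a prompt when nothing is mappable) can be sketched as plain state handling. The Kotlin below is an assumption about one possible shape of that flow, not the patent's implementation; the names MappableControl, MappingEntry, and KeyMappingSetup are invented for illustration, and toggling is a simplification of the tap-to-select / double-tap-to-deselect interaction.

```kotlin
// Illustrative key-mapping setup: the user toggles controls of the first
// interface, then confirms with "Finish" to produce one mapping entry.

data class MappableControl(val id: String, val label: String)

/** One mapping entry: a generated second control bound to the selected first controls. */
data class MappingEntry(val secondControlId: String, val firstControlIds: List<String>)

class KeyMappingSetup(private val controlsInFirstInterface: List<MappableControl>) {
    private val selected = linkedSetOf<String>()

    /** Toggles selection of a control (simplified stand-in for select / deselect taps). */
    fun toggle(controlId: String) {
        if (!selected.add(controlId)) selected.remove(controlId)
    }

    /**
     * Mirrors tapping "Finish": returns the mapping entry, or null when there is
     * nothing to map (the "There are no operable items in the interface" case).
     */
    fun finish(): MappingEntry? {
        if (controlsInFirstInterface.isEmpty()) return null  // no operable items at all
        if (selected.isEmpty()) return null                  // nothing was selected
        return MappingEntry(
            secondControlId = "mapped:" + selected.joinToString("+"),
            firstControlIds = selected.toList()
        )
    }
}

fun main() {
    val setup = KeyMappingSetup(
        listOf(
            MappableControl("rewind", "Rewind"),
            MappableControl("pause", "Pause"),
            MappableControl("fastForward", "Fast-forward")
        )
    )
    setup.toggle("pause")
    setup.toggle("fastForward")
    setup.toggle("fastForward")  // toggled off again, i.e. deselected
    println(setup.finish())      // MappingEntry for the pause control only
}
```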
  • Further optionally, in the embodiments of the present application, in order to prevent the interface content from changing while the user is setting up the key mapping, the first interface may be locked.
  • the operation method provided by this embodiment of the present application may further include the following steps 202b1 and 202b2:
  • Step 202b1 when the first interface is displayed, the operating device receives a fourth input.
  • Step 202b2 In response to the fourth input, the operating device displays the second interface according to the preset transparency.
  • The second interface includes an interface image of the first interface, and the third input is the user's input on a target area in the second interface.
  • The target area is an area in the interface image of the first interface that corresponds to the at least one control.
  • For example, the electronic device displays the second interface at a preset transparency, and the second interface displays a static image of the first interface, which prevents the key-mapping setup from failing because the interface content of the first interface changes during the setup.
  • the electronic device can lock the first interface when the user performs the key mapping setting, so as to prevent the user from misoperation caused by the change of the interface content.
  • the electronic device may cancel the locking of the first interface.
  • the operation method provided by this embodiment of the present application may further include the following step 202b3:
  • Step 202b3 the operating device cancels the display of the above-mentioned second interface.
  • For example, after the key-mapping setup is completed, the electronic device cancels the display of the second interface and updates the interface display content of the first interface.
  • the electronic device can cancel the locking of the first interface after the user completes the key mapping setting, so that the first interface returns to normal.
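  • A straightforward way to realize such a locked, translucent "second interface" on Android is to capture a static bitmap of the first interface and lay it over the screen at a preset alpha while the mapping is being configured, then remove it afterwards. The sketch below is an assumption about one possible implementation and is not taken from the patent; it also ignores hardware-accelerated surfaces (for example video views), which a snapshot drawn this way may not capture.

```kotlin
// Hypothetical "second interface": a translucent static snapshot of the first
// interface shown on top of it while the key mapping is being set up.
import android.app.Activity
import android.graphics.Bitmap
import android.graphics.Canvas
import android.view.View
import android.view.ViewGroup
import android.widget.ImageView

class InterfaceLock(private val activity: Activity) {

    private var overlay: ImageView? = null

    /** Shows the snapshot overlay at the given transparency. */
    fun lock(firstInterfaceRoot: View, presetAlpha: Float = 0.85f) {
        // Draw the current first interface into a bitmap (the static interface image).
        val snapshot = Bitmap.createBitmap(
            firstInterfaceRoot.width, firstInterfaceRoot.height, Bitmap.Config.ARGB_8888
        )
        firstInterfaceRoot.draw(Canvas(snapshot))

        val imageView = ImageView(activity).apply {
            setImageBitmap(snapshot)
            alpha = presetAlpha
        }
        overlay = imageView
        activity.findViewById<ViewGroup>(android.R.id.content).addView(
            imageView,
            ViewGroup.LayoutParams.MATCH_PARENT,
            ViewGroup.LayoutParams.MATCH_PARENT
        )
    }

    /** Removes the snapshot overlay once the mapping setup is finished. */
    fun unlock() {
        overlay?.let {
            activity.findViewById<ViewGroup>(android.R.id.content).removeView(it)
        }
        overlay = null
    }
}
```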
  • Optionally, in the embodiments of the present application, after the electronic device displays the target window and the at least one second control, the user can change the display position of the target window by dragging it.
  • To keep the interface layout tidy, the second control moves along with the target window while the user drags the target window.
  • the operation method provided by this embodiment of the present application may further include the following steps 202c1 and 202c2:
  • Step 202c1 The operating device receives a fifth input from the user for dragging the target window.
  • Step 202c2 In response to the fifth input, the operating device updates the display positions of the target window and the at least one second control according to the movement track corresponding to the fifth input.
  • the display position of the above at least one second control moves with the movement of the target window.
  • the relative positional relationship between the target window and the at least one second control remains unchanged. That is, the at least one second control is always located at the preset position of the target window.
  • In this way, when the user drags the target window, the second control moves along with the target window, which keeps the interface layout tidy and intact.
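  • Because the target window and the mapped control bar can live in one container view (as in the earlier overlay sketch), dragging can be implemented by moving that single container, which automatically preserves the relative position of the controls. The Kotlin below is a hedged sketch of such a touch handler, assuming an Android overlay positioned with WindowManager layout parameters; it is not code from the patent.

```kotlin
// Hypothetical drag handling: moving the overlay container moves the target
// window and its attached mapped controls together, so their relative
// positions never change.
import android.annotation.SuppressLint
import android.view.MotionEvent
import android.view.View
import android.view.WindowManager

@SuppressLint("ClickableViewAccessibility")
fun attachDragHandler(
    container: View,
    windowManager: WindowManager,
    params: WindowManager.LayoutParams
) {
    var startX = 0
    var startY = 0
    var touchDownX = 0f
    var touchDownY = 0f

    container.setOnTouchListener { view, event ->
        when (event.action) {
            MotionEvent.ACTION_DOWN -> {
                // Remember where the window and the finger were when the drag began.
                startX = params.x
                startY = params.y
                touchDownX = event.rawX
                touchDownY = event.rawY
                true
            }
            MotionEvent.ACTION_MOVE -> {
                // Follow the movement track of the fifth input.
                params.x = startX + (event.rawX - touchDownX).toInt()
                params.y = startY + (event.rawY - touchDownY).toInt()
                windowManager.updateViewLayout(view, params)
                true
            }
            else -> false
        }
    }
}
```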
  • With the operation method provided in the embodiments of the present application, after receiving the user's third input on at least one control among the M controls in the first interface, the electronic device can key-map that at least one control; and after receiving the first input by which the user triggers display of the target window, in response to the first input, the electronic device displays not only the target window but also at least one second control that has a mapping relationship with the N first controls (that is, the at least one control) in the first interface. Furthermore, after receiving the second input from the user on the target control among the at least one second control, the electronic device performs the target operation corresponding to the target control in response to the second input, so that when the electronic device displays the interface display content of an application in the form of a floating window, the user can control the application to perform certain operations through the function controls attached to the floating window.
  • the execution body may be an operation device, or a control module in the operation device for executing the operation method.
  • an operation method performed by an operating device is used as an example to describe the operating device provided by the embodiment of the present application.
  • FIG. 5 is a schematic diagram of a possible structure for implementing an operating device provided by an embodiment of the present application.
  • As shown in FIG. 5, the operating device 600 includes a receiving module 601, a display module 602, and an execution module 603. The receiving module 601 is configured to receive a first input while a first interface is displayed, where the first interface includes N first controls and N is a positive integer. The display module 602 is configured to display, in response to the first input received by the receiving module 601, a target window and at least one second control, where the interface display content of the first interface is displayed in the target window, the at least one second control has a mapping relationship with the N first controls, and each second control is used to trigger the functional operation that can be triggered by the corresponding first control.
  • The receiving module 601 is further configured to receive the user's first input on a target control among the at least one second control displayed by the display module 602, and the execution module 603 is configured to perform, in response to the first input received by the receiving module 601, the target operation corresponding to the target control.
  • Optionally, as shown in FIG. 5, the electronic device 600 further includes a determination module 604. The first interface contains M controls, and the M controls include the N first controls, where M ≥ N and M is a positive integer.
  • The receiving module 601 is further configured to receive, while the first interface is displayed, the user's third input on at least one control among the M controls; the determination module 604 is configured to determine, in response to the third input received by the receiving module 601, the at least one control to be a first control, where the operations that can be triggered by the at least one control include functions that can be used while the first interface is in the floating-window display state.
  • Optionally, the receiving module 601 is further configured to receive a fourth input while the first interface is displayed; the display module 602 is further configured to display, in response to the fourth input received by the receiving module 601, a second interface at a preset transparency, where the second interface includes an interface image of the first interface; the third input is the user's input on a target area in the second interface, and the target area is an area in the interface image of the first interface that corresponds to the at least one control.
  • Optionally, the display module 602 is further configured to cancel the display of the second interface.
  • Optionally, the receiving module 601 is further configured to receive a fifth input by the user for dragging the target window; the display module 602 updates, in response to the fifth input received by the receiving module 601, the display positions of the target window and the at least one second control according to the movement track corresponding to the fifth input, where the display position of the at least one second control moves as the target window moves.
  • It should be noted that, as shown in FIG. 5, the modules that must be included in the operating device 600 are indicated by solid-line boxes, namely the receiving module 601, the display module 602, and the execution module 603; the modules that may be included in the operating device 600 are indicated by dashed-line boxes, namely the determination module 604.
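  • The module decomposition of FIG. 5, with its required (solid-box) and optional (dashed-box) modules, can be expressed compactly in code. The Kotlin below is only an illustrative reading of that structure; the patent does not define classes or interfaces, and every name here is an assumption.

```kotlin
// Illustrative composition mirroring FIG. 5: three required modules (solid
// boxes) and one optional module (dashed box).

class ReceivingModule { fun describe() = "receives the first, second, and third inputs" }
class DisplayModule { fun describe() = "shows the target window and mapped controls" }
class ExecutionModule { fun describe() = "performs the target operation" }
class DeterminationModule { fun describe() = "marks selected controls as first controls" }

/** The operating device must have the first three modules; the fourth is optional. */
class OperatingDevice(
    val receiving: ReceivingModule,
    val display: DisplayModule,
    val execution: ExecutionModule,
    val determination: DeterminationModule? = null  // dashed box: may be absent
)

fun main() {
    val device = OperatingDevice(ReceivingModule(), DisplayModule(), ExecutionModule())
    println(device.receiving.describe())
    println(device.determination?.describe() ?: "no determination module configured")
}
```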
  • the operating device in this embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal.
  • the apparatus may be a mobile electronic device or a non-mobile electronic device.
  • For example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, an in-vehicle electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA).
  • The non-mobile electronic device may be a server, a network attached storage (NAS), a personal computer (PC), a television (TV), a teller machine, or a self-service machine, which is not specifically limited in the embodiments of the present application.
  • the operating device in this embodiment of the present application may be a device having an operating system.
  • the operating system may be an Android (Android) operating system, an iOS operating system, or other possible operating systems, which are not specifically limited in the embodiments of the present application.
  • the operating device provided in this embodiment of the present application can implement each process implemented by the method embodiments in FIG. 2 to FIG. 4 , and to avoid repetition, details are not described here.
  • With the operating device provided in the embodiments of the present application, after receiving the user's third input on at least one control among the M controls in the first interface, the electronic device can key-map the at least one control; and after receiving the first input by which the user triggers display of the target window, in response to the first input, the electronic device displays not only the target window but also at least one second control that has a mapping relationship with the N first controls (that is, the at least one control) in the first interface.
  • Furthermore, after receiving the second input from the user on the target control among the at least one second control, the electronic device performs the target operation corresponding to the target control in response to the second input, so that when the electronic device displays the interface display content of an application in the form of a floating window, the user can control the application to perform certain functional operations through the function controls attached to the floating window.
  • Optionally, an embodiment of the present application further provides an electronic device M00, including a processor M01, a memory M02, and a program or instruction stored in the memory M02 and executable on the processor M01, where the program or instruction, when executed by the processor M01, implements the processes of the foregoing operation method embodiments and can achieve the same technical effect; to avoid repetition, details are not repeated here.
  • the electronic devices in the embodiments of the present application include the aforementioned mobile electronic devices and non-mobile electronic devices.
  • FIG. 7 is a schematic diagram of a hardware structure of an electronic device implementing various embodiments of the present application.
  • the electronic device 100 includes but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, and a processor 110, etc. part.
  • The electronic device 100 may further include a power supply (such as a battery) for supplying power to the components; the power supply may be logically connected to the processor 110 through a power management system, thereby implementing functions such as charging management, discharging management, and power consumption management through the power management system.
  • The structure of the electronic device shown in FIG. 7 does not constitute a limitation on the electronic device; the electronic device may include more or fewer components than shown, or combine certain components, or use a different arrangement of components, and details are not repeated here.
  • The user input unit 107 is configured to receive a first input while a first interface is displayed, where the first interface includes N first controls and N is a positive integer. The display unit 106 is configured to display, in response to the first input received by the user input unit 107, a target window and at least one second control, where the interface display content of the first interface is displayed in the target window and the at least one second control has a mapping relationship with the N first controls.
  • The user input unit 107 is further configured to receive the user's first input on a target control among the at least one second control displayed by the display unit 106, and the processor 110 is configured to perform, in response to the first input received by the user input unit 107, a target operation corresponding to the target control.
  • Thus, after the first input by which the user triggers display of the target window is received, in response to the first input, not only the target window but also at least one second control that has a mapping relationship with the N first controls in the first interface is displayed. Furthermore, after the second input from the user on the target control among the at least one second control is received, the target operation corresponding to the target control is performed in response to the second input, so that when the electronic device displays the interface display content of an application in the form of a floating window, the user can control the application to perform certain functional operations through the function controls attached to the floating window.
  • Optionally, the first interface includes M controls, and the M controls include the N first controls, where M ≥ N and M is a positive integer. The user input unit 107 is further configured to receive, while the first interface is displayed, the user's third input on at least one control among the M controls; the processor 110 is configured to determine, in response to the third input received by the user input unit 107, the at least one control to be a first control, where the operations that can be triggered by the at least one control include functions that can be used while the first interface is in the floating-window display state.
  • In this way, the user can, according to personal habits and preferences, key-map the function keys in the first interface of the application as single functions or combined functions, so that when the electronic device displays the first interface in the form of a floating window, the user can control the application by tapping the mapped keys.
  • Optionally, the user input unit 107 is further configured to receive a fourth input while the first interface is displayed; the display unit 106 is further configured to display, in response to the fourth input received by the user input unit 107, the second interface at a preset transparency, where the second interface includes an interface image of the first interface; the third input is the user's input on a target area in the second interface, and the target area is an area in the interface image of the first interface that corresponds to the at least one control.
  • the electronic device can lock the first interface when the user performs the key mapping setting, so as to prevent the user from misoperation caused by the change of the interface content.
  • the display unit 106 is further configured to cancel the display of the second interface.
  • the electronic device can cancel the locking of the first interface after the user completes the key mapping setting, so that the first interface returns to normal.
  • Optionally, the user input unit 107 is further configured to receive a fifth input by the user for dragging the target window; the display unit 106 updates, in response to the fifth input received by the user input unit 107, the display positions of the target window and the at least one second control according to the movement track corresponding to the fifth input, where the display position of the at least one second control moves as the target window moves.
  • In this way, the second control moves along with the target window, which keeps the interface layout tidy and intact.
  • With the electronic device provided in the embodiments of the present application, after receiving the user's third input on at least one control among the M controls in the first interface, the electronic device can key-map the at least one control; and after receiving the first input by which the user triggers display of the target window, in response to the first input, the electronic device displays not only the target window but also at least one second control that has a mapping relationship with the N first controls (that is, the at least one control) in the first interface. Furthermore, after receiving the second input from the user on the target control among the at least one second control, the electronic device performs the target operation corresponding to the target control in response to the second input, so that when the electronic device displays the interface display content of an application in the form of a floating window, the user can control the application to perform certain operations through the function controls attached to the floating window.
  • It should be understood that the input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042, where the graphics processing unit 1041 processes image data of still pictures or videos obtained by an image capture apparatus (such as a camera).
  • the display unit 106 may include a display panel 1061, which may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like.
  • the user input unit 107 includes a touch panel 1071 and other input devices 1072 .
  • the touch panel 1071 is also called a touch screen.
  • the touch panel 1071 may include two parts, a touch detection device and a touch controller.
  • Other input devices 1072 may include, but are not limited to, physical keyboards, function keys (such as volume control keys, switch keys, etc.), trackballs, mice, and joysticks, which will not be described herein again.
  • Memory 109 may be used to store software programs as well as various data including, but not limited to, application programs and operating systems.
  • the processor 110 may integrate an application processor and a modem processor, wherein the application processor mainly processes an operating system, a user interface, and an application program, and the like, and the modem processor mainly processes wireless communication. It can be understood that, the above-mentioned modulation and demodulation processor may not be integrated into the processor 110 .
  • Embodiments of the present application further provide a readable storage medium storing a program or instruction, where the program or instruction, when executed by a processor, implements the processes of the foregoing operation method embodiments and can achieve the same technical effect; to avoid repetition, details are not repeated here.
  • the processor is the processor in the electronic device described in the foregoing embodiments.
  • the readable storage medium includes a computer-readable storage medium, such as a computer read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk or an optical disk, and the like.
  • An embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each of the foregoing operation method embodiments process, and can achieve the same technical effect, in order to avoid repetition, it will not be repeated here.
  • It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-chip.
  • The methods of the above embodiments can be implemented by means of software plus a necessary general-purpose hardware platform, and certainly can also be implemented by hardware, but in many cases the former is the better implementation.
  • Accordingly, the technical solution of the present application, in essence or in the part contributing to the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or a CD-ROM) and includes several instructions for causing an electronic device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in the embodiments of the present application.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An operation method and apparatus. The method includes: receiving a first input while a first interface is displayed, where the first interface includes N first controls and N is a positive integer; in response to the first input, displaying a target window and at least one second control, where the interface display content of the first interface is displayed in the target window, the at least one second control has a mapping relationship with the N first controls, and each second control is used to trigger a functional operation that can be triggered by the corresponding first control; receiving a second input from a user on a target control among the at least one second control; and, in response to the second input, performing a target operation corresponding to the target control.

Description

Operation method and apparatus
Cross-Reference to Related Application
This application claims priority to Chinese Patent Application No. 202010994746.4, filed in China on September 21, 2020, which is incorporated herein by reference in its entirety.
Technical Field
The embodiments of the present application relate to the field of communication technologies, and in particular to an operation method and apparatus.
Background
As the screen sizes of electronic devices grow and system functions are continuously enriched, an electronic device can present different content of different applications on a single screen.
In the related art, while displaying a chat interface of a social application, an electronic device can play a video in the form of a floating window, so that the user can chat with others while the video is playing.
However, because the display area of the floating window is small, the electronic device can only display the video content, and the user cannot directly operate the video function controls, for example, to pause the video playback or to switch the music being played.
Summary
The purpose of the embodiments of the present application is to provide an operation method, apparatus, and electronic device, which can solve the problem that a user cannot operate an application when an electronic device displays the interface display content of the application in the form of a floating window.
According to a first aspect, an embodiment of the present application provides an operation method. The method includes: receiving a first input while a first interface is displayed, where the first interface includes N first controls and N is a positive integer; displaying, in response to the first input, a target window and at least one second control, where the interface display content of the first interface is displayed in the target window, the at least one second control has a mapping relationship with the N first controls, and each second control is used to trigger a functional operation that can be triggered by the corresponding first control; receiving a second input from the user on a target control among the at least one second control; and performing, in response to the second input, a target operation corresponding to the target control.
According to a second aspect, an embodiment of the present application further provides an operating device. The device includes a receiving module, a display module, and an execution module. The receiving module is configured to receive a first input while a first interface is displayed, where the first interface includes N first controls and N is a positive integer. The display module is configured to display, in response to the first input received by the receiving module, a target window and at least one second control, where the interface display content of the first interface is displayed in the target window, the at least one second control has a mapping relationship with the N first controls, and each second control is used to trigger the functional operation that can be triggered by the corresponding first control. The receiving module is further configured to receive the user's first input on a target control among the at least one second control displayed by the display module. The execution module is configured to perform, in response to the first input received by the receiving module, a target operation corresponding to the target control.
According to a third aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, and a program or instruction stored in the memory and executable on the processor, where the program or instruction, when executed by the processor, implements the steps of the operation method according to the first aspect.
According to a fourth aspect, an embodiment of the present application provides a readable storage medium storing a program or instruction, where the program or instruction, when executed by a processor, implements the steps of the method according to the first aspect.
According to a fifth aspect, an embodiment of the present application provides a chip, including a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to run a program or instruction to implement the method according to the first aspect.
In the embodiments of the present application, after a first input by which the user triggers display of a target window is received, in response to the first input, not only the target window but also at least one second control that has a mapping relationship with the N first controls in the first interface is displayed. Furthermore, after a second input from the user on a target control among the at least one second control is received, a target operation corresponding to the target control is performed in response to the second input, so that when the electronic device displays the interface display content of an application in the form of a floating window, the user can control the application to perform certain functional operations through the function controls attached to the floating window.
Brief Description of the Drawings
FIG. 1 is a schematic diagram of a floating-window display interface according to an embodiment of the present application;
FIG. 2 is a schematic flowchart of an operation method according to an embodiment of the present application;
FIG. 3 is a first schematic diagram of an interface to which an operation method according to an embodiment of the present application is applied;
FIG. 4 is a second schematic diagram of an interface to which an operation method according to an embodiment of the present application is applied;
FIG. 5 is a schematic structural diagram of an operating device according to an embodiment of the present application;
FIG. 6 is a first schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 7 is a second schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description of Embodiments
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings of the embodiments of the present application. Obviously, the described embodiments are only some rather than all of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative effort shall fall within the protection scope of the present application.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar objects and are not used to describe a specific order or sequence. It should be understood that data used in this way are interchangeable under appropriate circumstances, so that the embodiments of the present application can be practiced in sequences other than those illustrated or described herein. Objects distinguished by "first", "second", and the like are usually of one type, and the number of such objects is not limited; for example, there may be one first object or more than one. In addition, "and/or" in the description and claims indicates at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects.
The operation method provided in the embodiments of the present application can be applied to a scenario in which an electronic device displays the interface display content of an application in the form of a floating window.
As an example of such a scenario, with the continuous enrichment of the functions of electronic devices in the related art, and in order to make the electronic device convenient to use and to make full use of its functions, a user may, as shown in FIG. 1, chat while watching a match video on the electronic device; the electronic device displays the video content currently being played by the video application in the form of a floating window or small window (the floating window 10 in FIG. 1). Unlike split-screen techniques, the user cannot operate the application through the floating window or small window, for example, to pause the video content being played. As a result, when the user wants to pause the video playing in the floating window, the user needs to tap the floating window and switch to the original interface of the video application before the video can be paused, which is a cumbersome operation.
To address this problem, in the technical solutions provided in the embodiments of the present application, while the electronic device displays the first interface, a settings function can be used to map the function keys for which the user wants a quick operation, or a combined function of multiple function keys. The electronic device displays the mapped keys while displaying the floating window, so that when the user taps these mapped keys, the electronic device is triggered to perform the corresponding functions; the user does not need to tap the floating window to open the original interface of the application in order to trigger these functions, which simplifies the operation.
The operation method provided in the embodiments of the present application is described in detail below with reference to the accompanying drawings, by means of specific embodiments and their application scenarios.
As shown in FIG. 2, an operation method provided in an embodiment of the present application may include the following step 201 to step 204.
Step 201: While a first interface is displayed, the operating device receives a first input.
The first interface includes N first controls, and N is a positive integer.
For example, the first interface may be an interface of a target application installed in the electronic device; after receiving the user's input on the N first controls of the first interface, the electronic device triggers the target application to perform the corresponding operations.
For example, the first input may be an input by the user on the screen of the electronic device, a voice instruction input by the user, or a specific gesture input by the user, which can be determined according to actual use requirements and is not limited in this embodiment of the present invention. For example, the first input may be an input by which the user triggers the electronic device to display the first interface in the form of a floating window.
Step 202: In response to the first input, the operating device displays a target window and at least one second control.
The interface display content of the first interface is displayed in the target window, the at least one second control has a mapping relationship with the N first controls, and each second control is used to trigger a functional operation that can be triggered by the corresponding first control.
For example, unlike the related art in which the electronic device displays only the target window (that is, the floating window or small window), in this embodiment of the present application the electronic device displays not only the target window but also at least one second control at a preset position of the target window. For example, the preset position may be below the target window.
It should be noted that the target window displays the interface display content of the first interface; the user can view the interface display content of the first interface in real time through the target window, but cannot control the target application through the target window. For example, when the target application is a video application, the user can view the video content currently being played by the video application through the target window, but cannot control the video application to pause the video being played.
For example, the at least one second control is a control that has a mapping relationship with the N first controls in the first interface, where one second control has a mapping relationship with at least one first control. The mapping relationship is a key mapping; that is, when the user taps a second control, the electronic device is triggered to perform the functional operation(s) that the at least one mapped first control can trigger. After a second control establishes a mapping relationship with at least one first control, the second control is the mapping control of that at least one first control.
For example, with reference to FIG. 1 and as shown in FIG. 3, after the electronic device receives the user's first input, in response to the first input the electronic device displays, in addition to the floating window 20 (that is, the target window), three mapping controls in a region 21 that have a mapping relationship with controls in the first interface.
Step 203: The operating device receives a second input from the user on a target control among the at least one second control.
For example, the second input may be a tap, drag, long-press, or short-press input by the user on the target control; the appropriate input action depends on the control type of the target control. For example, when the target control is a button control, the second input is a tap input by the user on the target control; when the target control is a progress-bar control, the second input is a drag input by the user on the target control.
Step 204: In response to the second input, the operating device performs the target operation corresponding to the target control.
For example, after the electronic device receives the user's second input on the target control, the electronic device performs the target operation in response to the second input. The target operation is an operation that can be triggered by the first control in the first interface that has a mapping relationship with the target control.
As an example, as shown in FIG. 3, after the electronic device detects that the user has tapped the pause control among the three mapping controls in the region 21, the electronic device sends a control instruction to the target application to control the target application to pause the video content being played.
It should be noted that the target application may be a video application, or may be a social application, a game application, or another application that can be operated through touch controls. The electronic device can not only perform key mapping on the controls in the first interface of the target application, but also perform function mapping on the functions of the target application.
For example, when the electronic device maps the functions of the target application, the first interface includes N functions (for example, functions implemented through various gesture operations; corresponding controls for these functions may or may not exist in the first interface), the at least one second control has a mapping relationship with the N functions, each second control is used to trigger a corresponding function among the N functions, and the target operation is: triggering, in the first interface, the target function corresponding to the target control. It should be noted that, in the description that follows in the embodiments of the present application, the controls in the first interface may be equivalently replaced with functions.
In this way, after the first input by which the user triggers display of the target window is received, in response to the first input, not only the target window but also at least one second control that has a mapping relationship with the N first controls in the first interface is displayed. Furthermore, after the second input from the user on the target control among the at least one second control is received, the target operation corresponding to the target control is performed in response to the second input, so that when the electronic device displays the interface display content of an application in the form of a floating window, the user can control the application to perform certain functional operations through the function controls attached to the floating window.
Optionally, in this embodiment of the present application, in order for the electronic device to display the mapped keys (that is, the second controls) when displaying the floating window (that is, the target window), the electronic device needs to perform key mapping on the controls in the first interface.
For example, the first interface contains M controls, and the M controls include the N first controls, where M ≥ N and M is a positive integer. That is, the N first controls are the controls, among the M controls in the first interface, for which a mapping relationship with the second controls has been established. The first interface contains M controls, of which the electronic device key-maps only N controls (that is, the N first controls).
For example, before the target window and the at least one second control are displayed in step 202, the operation method provided in this embodiment of the present application may further include the following step 202a1 and step 202a2.
Step 202a1: While the first interface is displayed, the operating device receives a third input from the user on at least one of the M controls.
Step 202a2: In response to the third input, the operating device determines the at least one control to be a first control.
The operations that can be triggered by the at least one control include functions that can be used while the first interface is in the floating-window display state.
For example, before step 202a1, the user may enter a settings interface by tapping a function entry provided by the electronic device. In the settings interface, the user can tap one or more controls to be key-mapped in order to perform the key mapping.
For example, after completing the mapped-key settings, the user can trigger the electronic device to display the target window through a specific gesture operation on the first interface.
It should be noted that when multiple controls among the M controls are key-mapped to one second control, after the user taps that second control, the electronic device performs the operations corresponding to those multiple controls. For example, the user may map the pause and close controls of a video application to a single mapping control; when the user taps that mapping control, the electronic device can pause the video content being played by the video application and close the video application.
As an example, as shown in (A) of FIG. 4, the electronic device displays a video playback interface of a video application (that is, the first interface), and the first interface includes three controls (rewind, pause, and fast-forward). When the electronic device receives a specific input from the user and displays a key-mapping settings interface (as shown in (B) of FIG. 4), the user can select at least one of the three controls and key-map it. When the user wants to deselect a control, the user can do so by double-tapping the control. After the settings are completed, the user can tap the "Finish" control, and the electronic device establishes the key mapping relationship. If there is no control in the first interface on which a key-mapping operation can be performed, the electronic device displays the prompt message: "There are no operable items in the interface".
In this way, the user can, according to personal habits and preferences, key-map the function keys in the first interface of the application as single functions or combined functions, so that when the electronic device displays the first interface in the form of a floating window, the user can control the application by tapping the mapped keys.
Further optionally, in this embodiment of this application, to prevent the interface content from changing while the user is setting up the key mapping, the first interface may be locked.
For example, before step 202a1, the operation method provided in this embodiment may further include the following steps 202b1 and 202b2.
Step 202b1: While the first interface is displayed, the operation apparatus receives a fourth input.
Step 202b2: In response to the fourth input, the operation apparatus displays a second interface at a preset transparency.
The second interface includes an interface image of the first interface. The third input is an input by the user on a target region of the second interface, where the target region is the region of the interface image of the first interface that corresponds to the at least one control.
For example, the electronic device displays the second interface at the preset transparency, and the second interface shows a static image of the first interface, preventing the setup from failing because the content of the first interface changes while the user is configuring the key mapping.
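Purely as an assumed sketch of this "lock by snapshot" idea (the Snapshot type, the MappingSetupOverlay class, and the 0.6f alpha value are all hypothetical), the second interface can be thought of as a frozen copy of the first interface shown at a preset transparency, so that later changes to the live interface cannot affect the regions the user is selecting.

```kotlin
data class Snapshot(val pixels: IntArray, val width: Int, val height: Int)

class MappingSetupOverlay(private val presetAlpha: Float = 0.6f) {
    private var frozen: Snapshot? = null

    fun show(liveCapture: Snapshot) {
        // Keep a defensive copy so the overlay stays static even if the
        // source buffer keeps being updated by the playing application.
        frozen = liveCapture.copy(pixels = liveCapture.pixels.copyOf())
        println("showing frozen second interface at alpha=$presetAlpha")
    }

    fun dismiss() {
        frozen = null
        println("second interface dismissed; first interface resumes normally")
    }
}

fun main() {
    val overlay = MappingSetupOverlay()
    overlay.show(Snapshot(IntArray(4), width = 2, height = 2))
    // ... the user selects the target regions and taps "Done" ...
    overlay.dismiss()
}
```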
In this way, the electronic device can lock the first interface while the user is setting up the key mapping, preventing erroneous user operations caused by changes in the interface content.
Further optionally, in this embodiment of this application, after the user completes the key-mapping setup, the electronic device can unlock the first interface.
For example, after step 202b2, the operation method provided in this embodiment may further include the following step 202b3.
Step 202b3: The operation apparatus cancels display of the second interface.
For example, after the key-mapping setup is completed, the electronic device cancels display of the second interface and updates the interface content of the first interface.
In this way, after the user completes the key-mapping setup, the electronic device can unlock the first interface so that the first interface returns to normal.
Optionally, in this embodiment of this application, after the electronic device displays the target window and the at least one second control, the user can change the display position of the target window by dragging it. To keep the interface layout tidy, the second controls can move along with the target window as the user drags it.
For example, after step 202, the operation method provided in this embodiment may further include the following steps 202c1 and 202c2.
Step 202c1: The operation apparatus receives a fifth input by the user for dragging the target window.
Step 202c2: In response to the fifth input, the operation apparatus updates the display positions of the target window and the at least one second control according to the movement trajectory corresponding to the fifth input.
The display position of the at least one second control moves as the target window moves.
For example, while the target window and the at least one second control move, their relative positions remain unchanged; that is, the at least one second control always stays at the preset position of the target window.
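The fixed relative position can be kept simply by translating the window and the controls by the same drag delta. The following Kotlin sketch shows the idea; Point, FloatingLayout, and onDrag are illustrative names, not from the patent.

```kotlin
data class Point(val x: Int, val y: Int) {
    operator fun plus(d: Point) = Point(x + d.x, y + d.y)
    operator fun minus(d: Point) = Point(x - d.x, y - d.y)
}

class FloatingLayout(var window: Point, var controls: Point) {
    fun onDrag(delta: Point) {
        window += delta
        controls += delta   // same delta, so the preset offset is preserved
    }
}

fun main() {
    val layout = FloatingLayout(window = Point(100, 200), controls = Point(100, 420))
    val offsetBefore = layout.controls - layout.window
    layout.onDrag(Point(30, -50))
    check(layout.controls - layout.window == offsetBefore)   // relative position unchanged
    println("window=${layout.window}, controls=${layout.controls}")
}
```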
In this way, when the user drags the target window, the second controls can move along with it, keeping the interface layout tidy and complete.
With the operation method provided in this embodiment of this application, after receiving the user's third input on at least one of the M controls of the first interface, the electronic device can key-map the at least one control; after receiving the first input by which the user triggers display of the target window, it responds by displaying, in addition to the target window, at least one second control that has a mapping relationship with the N first controls (that is, the at least one control) of the first interface; and after receiving the user's second input on a target control among the at least one second control, it responds by performing the target operation corresponding to that target control. Thus, when the electronic device displays an application's interface content in a floating window, the user can control that application to perform certain operations through the function controls attached to the floating window.
It should be noted that the operation method provided in the embodiments of this application may be executed by an operation apparatus, or by a control module of the operation apparatus for executing the operation method. In the embodiments of this application, the operation apparatus provided in the embodiments of this application is described by taking the case in which the operation apparatus executes the operation method as an example.
It should be noted that, in the embodiments of this application, each operation method shown in the method drawings is described by way of example with reference to one drawing of the embodiments of this application. In specific implementation, each operation method shown in the method drawings may also be implemented in combination with any other combinable drawing illustrated in the foregoing embodiments, which is not repeated here.
Figure 5 is a schematic diagram of a possible structure of an operation apparatus for implementing an embodiment of this application. As shown in Figure 5, the operation apparatus 600 includes a receiving module 601, a display module 602, and an execution module 603. The receiving module 601 is configured to receive a first input while a first interface is displayed, where the first interface includes N first controls and N is a positive integer. The display module 602 is configured to display, in response to the first input received by the receiving module 601, a target window and at least one second control, where the target window displays the interface content of the first interface, the at least one second control has a mapping relationship with the N first controls, and each second control is used to trigger the function operation that the corresponding first control can trigger. The receiving module 601 is further configured to receive a second input by the user on a target control among the at least one second control displayed by the display module 602. The execution module 603 is configured to perform, in response to the second input received by the receiving module 601, the target operation corresponding to the target control.
Optionally, as shown in Figure 5, the operation apparatus 600 further includes a determining module 604. The first interface contains M controls, which include the N first controls, where M ≥ N and M is a positive integer. The receiving module 601 is further configured to receive, while the first interface is displayed, a third input by the user on at least one of the M controls. The determining module 604 is configured to determine, in response to the third input received by the receiving module 601, that the at least one control is a first control, where the operations that the at least one control can trigger include functions that can be used while the first interface is in the floating-window display state.
Optionally, the receiving module 601 is further configured to receive a fourth input while the first interface is displayed. The display module 602 is further configured to display, in response to the fourth input received by the receiving module 601, a second interface at a preset transparency, where the second interface includes an interface image of the first interface. The third input is an input by the user on a target region of the second interface, the target region being the region of the interface image of the first interface that corresponds to the at least one control.
Optionally, the display module 602 is further configured to cancel display of the second interface.
Optionally, the receiving module 601 is further configured to receive a fifth input by the user for dragging the target window. The display module 602 is configured to update, in response to the fifth input received by the receiving module 601, the display positions of the target window and the at least one second control according to the movement trajectory corresponding to the fifth input, where the display position of the at least one second control moves as the target window moves.
It should be noted that, as shown in Figure 5, modules that the operation apparatus 600 necessarily includes are indicated by solid-line boxes, such as the receiving module 601, the display module 602, and the execution module 603, while modules that the operation apparatus 600 may include are indicated by dashed-line boxes, such as the determining module 604.
The operation apparatus in the embodiments of this application may be an apparatus, or a component, an integrated circuit, or a chip in a terminal. The apparatus may be a mobile electronic device or a non-mobile electronic device. For example, the mobile electronic device may be a mobile phone, a tablet computer, a laptop computer, a palmtop computer, an in-vehicle electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA); the non-mobile electronic device may be a server, network attached storage (NAS), a personal computer (PC), a television (TV), a teller machine, or a self-service machine. This is not specifically limited in the embodiments of this application.
The operation apparatus in the embodiments of this application may be an apparatus with an operating system, which may be the Android operating system, the iOS operating system, or another possible operating system; this is not specifically limited in the embodiments of this application.
The operation apparatus provided in this embodiment of this application can implement each process implemented by the method embodiments of Figures 2 to 4; to avoid repetition, details are not repeated here.
With the operation apparatus provided in this embodiment of this application, after receiving the user's third input on at least one of the M controls of the first interface, the electronic device can key-map the at least one control; after receiving the first input by which the user triggers display of the target window, it responds by displaying, in addition to the target window, at least one second control that has a mapping relationship with the N first controls (that is, the at least one control) of the first interface; and after receiving the user's second input on a target control among the at least one second control, it responds by performing the target operation corresponding to that target control. Thus, when the electronic device displays an application's interface content in a floating window, the user can control that application to perform certain function operations through the function controls attached to the floating window.
Optionally, as shown in Figure 6, an embodiment of this application further provides an electronic device M00, including a processor M01, a memory M02, and a program or instructions stored in the memory M02 and runnable on the processor M01. When the program or instructions are executed by the processor M01, each process of the foregoing operation-method embodiments is implemented with the same technical effects; to avoid repetition, details are not repeated here.
It should be noted that the electronic device in the embodiments of this application includes the mobile and non-mobile electronic devices mentioned above.
Figure 7 is a schematic diagram of the hardware structure of an electronic device for implementing the embodiments of this application.
The electronic device 100 includes, but is not limited to, a radio-frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and other components.
Those skilled in the art will understand that the electronic device 100 may further include a power supply (such as a battery) for supplying power to the components. The power supply may be logically connected to the processor 110 through a power management system, so that functions such as charge management, discharge management, and power-consumption management are implemented through the power management system. The electronic device structure shown in Figure 7 does not constitute a limitation on the electronic device; the electronic device may include more or fewer components than shown, combine certain components, or use a different component arrangement, which is not repeated here.
The user input unit 107 is configured to receive a first input while a first interface is displayed, where the first interface includes N first controls and N is a positive integer. The display unit 106 is configured to display, in response to the first input received by the user input unit 107, a target window and at least one second control, where the target window displays the interface content of the first interface, the at least one second control has a mapping relationship with the N first controls, and each second control is used to trigger the function operation that the corresponding first control can trigger. The user input unit 107 is further configured to receive a second input by the user on a target control among the at least one second control displayed by the display unit 106. The processor 110 is configured to perform, in response to the second input received by the user input unit 107, the target operation corresponding to the target control.
In this way, after receiving the first input by which the user triggers display of the target window, the device responds by displaying, in addition to the target window, at least one second control that has a mapping relationship with the N first controls of the first interface; and after receiving the user's second input on a target control among the at least one second control, it responds by performing the target operation corresponding to that target control. Thus, when the electronic device displays an application's interface content in a floating window, the user can control that application to perform certain function operations through the function controls attached to the floating window.
Optionally, the first interface contains M controls, which include the N first controls, where M ≥ N and M is a positive integer. The user input unit 107 is further configured to receive, while the first interface is displayed, a third input by the user on at least one of the M controls. The processor 110 is configured to determine, in response to the third input received by the user input unit 107, that the at least one control is a first control, where the operations that the at least one control can trigger include functions that can be used while the first interface is in the floating-window display state.
In this way, the user can key-map the function keys of the application's first interface as single functions or combined functions according to the user's own habits and preferences, so that the user can control the application by tapping the mapped keys while the electronic device displays the first interface in the form of a floating window.
Optionally, the user input unit 107 is further configured to receive a fourth input while the first interface is displayed. The display unit 106 is further configured to display, in response to the fourth input received by the user input unit 107, a second interface at a preset transparency, where the second interface includes an interface image of the first interface. The third input is an input by the user on a target region of the second interface, the target region being the region of the interface image of the first interface that corresponds to the at least one control.
In this way, the electronic device can lock the first interface while the user is setting up the key mapping, preventing erroneous user operations caused by changes in the interface content.
Optionally, the display unit 106 is further configured to cancel display of the second interface.
In this way, after the user completes the key-mapping setup, the electronic device can unlock the first interface so that the first interface returns to normal.
Optionally, the user input unit 107 is further configured to receive a fifth input by the user for dragging the target window. The display unit 106 is configured to update, in response to the fifth input received by the user input unit 107, the display positions of the target window and the at least one second control according to the movement trajectory corresponding to the fifth input, where the display position of the at least one second control moves as the target window moves.
In this way, when the user drags the target window, the second controls can move along with it, keeping the interface layout tidy and complete.
With the electronic device provided in this embodiment of this application, after receiving the user's third input on at least one of the M controls of the first interface, the electronic device can key-map the at least one control; after receiving the first input by which the user triggers display of the target window, it responds by displaying, in addition to the target window, at least one second control that has a mapping relationship with the N first controls (that is, the at least one control) of the first interface; and after receiving the user's second input on a target control among the at least one second control, it responds by performing the target operation corresponding to that target control. Thus, when the electronic device displays an application's interface content in a floating window, the user can control that application to perform certain operations through the function controls attached to the floating window.
It should be understood that, in the embodiments of this application, the input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042; the graphics processor 1041 processes image data of still pictures or video obtained by an image capture apparatus (such as a camera) in video capture mode or image capture mode. The display unit 106 may include a display panel 1061, which may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 107 includes a touch panel 1071, also called a touch screen, and other input devices 1072; the touch panel 1071 may include a touch detection apparatus and a touch controller. The other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not repeated here. The memory 109 may be used to store software programs and various data, including but not limited to application programs and an operating system. The processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 110.
An embodiment of this application further provides a readable storage medium storing a program or instructions. When the program or instructions are executed by a processor, each process of the foregoing operation-method embodiments is implemented with the same technical effects; to avoid repetition, details are not repeated here.
The processor is the processor of the electronic device described in the foregoing embodiments. The readable storage medium includes a computer-readable storage medium, such as a computer read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
An embodiment of this application further provides a chip, including a processor and a communication interface, where the communication interface is coupled to the processor and the processor is configured to run a program or instructions to implement each process of the foregoing operation-method embodiments with the same technical effects; to avoid repetition, details are not repeated here.
It should be understood that the chip mentioned in the embodiments of this application may also be called a system-on-chip, a system chip, a chip system, or the like.
It should be noted that, in this document, the terms "include", "comprise", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or apparatus that includes a list of elements includes not only those elements but also other elements not expressly listed, or further includes elements inherent to such a process, method, article, or apparatus. Without further limitation, an element preceded by "includes a ..." does not preclude the existence of additional identical elements in the process, method, article, or apparatus that includes that element. In addition, it should be pointed out that the scope of the methods and apparatuses in the embodiments of this application is not limited to performing functions in the order shown or discussed; it may also include performing functions in a substantially simultaneous manner or in a reverse order according to the functions involved. For example, the described methods may be performed in an order different from that described, and various steps may also be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
From the description of the foregoing implementations, a person skilled in the art can clearly understand that the methods of the foregoing embodiments can be implemented by means of software plus a necessary general-purpose hardware platform, and of course also by hardware, but in many cases the former is the better implementation. Based on such an understanding, the technical solution of this application, in essence or the part contributing to the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions for causing an electronic device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in the embodiments of this application.
The embodiments of this application have been described above with reference to the accompanying drawings, but this application is not limited to the specific implementations described above. The specific implementations described above are merely illustrative rather than restrictive. Inspired by this application, a person of ordinary skill in the art can make many other forms without departing from the spirit of this application and the scope protected by the claims, all of which fall within the protection of this application.

Claims (15)

  1. An operation method, comprising:
    while a first interface is displayed, receiving a first input, wherein the first interface comprises N first controls, and N is a positive integer;
    in response to the first input, displaying a target window and at least one second control, wherein the target window displays interface content of the first interface, the at least one second control has a mapping relationship with the N first controls, and each second control is used to trigger a function operation that the corresponding first control can trigger;
    receiving a second input by a user on a target control among the at least one second control; and
    in response to the second input, performing a target operation corresponding to the target control.
  2. The method according to claim 1, wherein the first interface contains M controls, the M controls comprise the N first controls, M ≥ N, and M is a positive integer;
    before the receiving a first input, the method further comprises:
    while the first interface is displayed, receiving a third input by the user on at least one of the M controls; and
    in response to the third input, determining the at least one control to be the first control;
    wherein operations that the at least one control can trigger comprise functions that can be used while the first interface is in a floating-window display state.
  3. The method according to claim 2, wherein before the receiving a third input by the user on at least one of the M controls, the method further comprises:
    while the first interface is displayed, receiving a fourth input; and
    in response to the fourth input, displaying a second interface at a preset transparency, wherein the second interface comprises an interface image of the first interface;
    wherein the third input is an input by the user on a target region of the second interface, and the target region is a region of the interface image of the first interface that corresponds to the at least one control.
  4. The method according to claim 3, wherein after the displaying a second interface at a preset transparency in response to the fourth input, the method further comprises:
    canceling display of the second interface.
  5. The method according to any one of claims 1 to 4, wherein after the displaying a target window and at least one second control, the method further comprises:
    receiving a fifth input by the user for dragging the target window; and
    in response to the fifth input, updating display positions of the target window and the at least one second control according to a movement trajectory corresponding to the fifth input;
    wherein the display position of the at least one second control moves as the target window moves.
  6. An operation apparatus, comprising a receiving module, a display module, and an execution module, wherein:
    the receiving module is configured to receive a first input while a first interface is displayed, wherein the first interface comprises N first controls, and N is a positive integer;
    the display module is configured to display, in response to the first input received by the receiving module, a target window and at least one second control, wherein the target window displays interface content of the first interface, the at least one second control has a mapping relationship with the N first controls, and each second control is used to trigger a function operation that the corresponding first control can trigger;
    the receiving module is further configured to receive a second input by a user on a target control among the at least one second control displayed by the display module; and
    the execution module is configured to perform, in response to the second input received by the receiving module, a target operation corresponding to the target control.
  7. The apparatus according to claim 6, further comprising a determining module, wherein the first interface contains M controls, the M controls comprise the N first controls, M ≥ N, and M is a positive integer;
    the receiving module is further configured to receive, while the first interface is displayed, a third input by the user on at least one of the M controls; and
    the determining module is configured to determine, in response to the third input received by the receiving module, the at least one control to be the first control;
    wherein operations that the at least one control can trigger comprise functions that can be used while the first interface is in a floating-window display state.
  8. The apparatus according to claim 7, wherein:
    the receiving module is further configured to receive a fourth input while the first interface is displayed; and
    the display module is further configured to display, in response to the fourth input received by the receiving module, a second interface at a preset transparency, wherein the second interface comprises an interface image of the first interface;
    wherein the third input is an input by the user on a target region of the second interface, and the target region is a region of the interface image of the first interface that corresponds to the at least one control.
  9. The apparatus according to claim 8, wherein the display module is further configured to cancel display of the second interface.
  10. The apparatus according to any one of claims 6 to 9, wherein:
    the receiving module is further configured to receive a fifth input by the user for dragging the target window; and
    the display module is configured to update, in response to the fifth input received by the receiving module, display positions of the target window and the at least one second control according to a movement trajectory corresponding to the fifth input;
    wherein the display position of the at least one second control moves as the target window moves.
  11. A readable storage medium, storing a program or instructions which, when executed by a processor, implement the steps of the operation method according to any one of claims 1 to 5.
  12. An electronic device, comprising a processor, a memory, and a program or instructions stored in the memory and runnable on the processor, wherein the program or instructions, when executed by the processor, implement the steps of the operation method according to any one of claims 1 to 5.
  13. A computer program product, executed by at least one processor to implement the operation method according to any one of claims 1 to 5.
  14. A chip, comprising a processor and a communication interface, wherein the communication interface is coupled to the processor, and the processor is configured to run a program or instructions to implement the operation method according to any one of claims 1 to 5.
  15. An electronic device, wherein the electronic device is configured to perform the operation method according to any one of claims 1 to 5.
PCT/CN2021/118982 2020-09-21 2021-09-17 操作方法及装置 WO2022057881A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP21868703.6A EP4216045A4 (en) 2020-09-21 2021-09-17 OPERATION METHOD AND APPARATUS
KR1020237012327A KR20230065337A (ko) 2020-09-21 2021-09-17 조작 방법 및 장치
JP2023517295A JP2023542666A (ja) 2020-09-21 2021-09-17 操作方法及び装置
US18/187,057 US20230236852A1 (en) 2020-09-21 2023-03-21 Operation method and apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010994746.4A CN112162665B (zh) 2020-09-21 2020-09-21 操作方法及装置
CN202010994746.4 2020-09-21

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/187,057 Continuation US20230236852A1 (en) 2020-09-21 2023-03-21 Operation method and apparatus

Publications (1)

Publication Number Publication Date
WO2022057881A1 true WO2022057881A1 (zh) 2022-03-24

Family

ID=73863180

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/118982 WO2022057881A1 (zh) 2020-09-21 2021-09-17 操作方法及装置

Country Status (6)

Country Link
US (1) US20230236852A1 (zh)
EP (1) EP4216045A4 (zh)
JP (1) JP2023542666A (zh)
KR (1) KR20230065337A (zh)
CN (1) CN112162665B (zh)
WO (1) WO2022057881A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023241013A1 (zh) * 2022-06-14 2023-12-21 Oppo广东移动通信有限公司 界面操作方法、装置、电子设备及存储介质

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112162665B (zh) * 2020-09-21 2021-11-09 维沃移动通信有限公司 操作方法及装置
CN115480629A (zh) * 2021-05-28 2022-12-16 华为技术有限公司 一种多界面显示的方法和电子设备
CN113970971B (zh) * 2021-09-10 2022-10-04 荣耀终端有限公司 基于触控笔的数据处理方法和装置
CN113900553A (zh) * 2021-09-14 2022-01-07 维沃移动通信有限公司 显示方法、装置及电子设备
CN113821294A (zh) * 2021-09-24 2021-12-21 维沃移动通信(杭州)有限公司 二维码扫描方法、装置、电子设备及存储介质
CN115658203A (zh) * 2022-10-28 2023-01-31 维沃移动通信有限公司 信息显示方法、装置、电子设备及存储介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102830969A (zh) * 2012-08-10 2012-12-19 中国电子科技集团公司第四十一研究所 基于窗口与菜单的仪器交互界面及其生成方法
CN106873869A (zh) * 2017-01-06 2017-06-20 珠海格力电器股份有限公司 一种音乐播放的控制方法及装置
US20170195613A1 (en) * 2015-12-31 2017-07-06 Le Holdings (Beijing) Co., Ltd. Method and electronic device for displaying video being played
CN112162665A (zh) * 2020-09-21 2021-01-01 维沃移动通信有限公司 操作方法及装置

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8499254B2 (en) * 2008-10-27 2013-07-30 Microsoft Corporation Surfacing and management of window-specific controls
KR102210999B1 (ko) * 2013-08-22 2021-02-03 삼성전자주식회사 디스플레이 장치의 애플리케이션 실행 방법 및 그 디스플레이 장치
CN105554553B (zh) * 2015-12-15 2019-02-15 腾讯科技(深圳)有限公司 通过悬浮窗口播放视频的方法及装置
CN105979339B (zh) * 2016-05-25 2020-07-14 腾讯科技(深圳)有限公司 一种窗口显示方法及客户端
CN108228040A (zh) * 2017-12-28 2018-06-29 北京安云世纪科技有限公司 移动终端及浮屏操作控制方法、装置
CN109814794A (zh) * 2018-12-13 2019-05-28 维沃移动通信有限公司 一种界面显示方法及终端设备
CN116723266A (zh) * 2019-07-31 2023-09-08 华为技术有限公司 一种悬浮窗口的管理方法及相关装置
CN110737374B (zh) * 2019-09-27 2021-11-02 维沃移动通信有限公司 操作方法及电子设备
CN111026302B (zh) * 2019-11-26 2021-02-23 维沃移动通信有限公司 一种显示方法及电子设备

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102830969A (zh) * 2012-08-10 2012-12-19 中国电子科技集团公司第四十一研究所 基于窗口与菜单的仪器交互界面及其生成方法
US20170195613A1 (en) * 2015-12-31 2017-07-06 Le Holdings (Beijing) Co., Ltd. Method and electronic device for displaying video being played
CN106873869A (zh) * 2017-01-06 2017-06-20 珠海格力电器股份有限公司 一种音乐播放的控制方法及装置
CN112162665A (zh) * 2020-09-21 2021-01-01 维沃移动通信有限公司 操作方法及装置

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4216045A4 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023241013A1 (zh) * 2022-06-14 2023-12-21 Oppo广东移动通信有限公司 界面操作方法、装置、电子设备及存储介质

Also Published As

Publication number Publication date
KR20230065337A (ko) 2023-05-11
EP4216045A4 (en) 2024-02-28
EP4216045A1 (en) 2023-07-26
US20230236852A1 (en) 2023-07-27
JP2023542666A (ja) 2023-10-11
CN112162665B (zh) 2021-11-09
CN112162665A (zh) 2021-01-01

Similar Documents

Publication Publication Date Title
WO2022057881A1 (zh) 操作方法及装置
WO2022063022A1 (zh) 视频预览方法、装置及电子设备
WO2022012656A1 (zh) 分屏显示方法、装置及电子设备
US11934648B2 (en) Permission setting method and apparatus and electronic device
CN112148176B (zh) 挂件控制方法、装置、电子设备及可读存储介质
WO2022121790A1 (zh) 分屏显示方法、装置、电子设备和可读存储介质
US20170168628A1 (en) Method and electronic device for split-screen display
WO2022057407A1 (zh) 挂件显示方法和电子设备
WO2022135409A1 (zh) 显示处理方法、显示处理装置和可穿戴设备
WO2020192297A1 (zh) 屏幕界面切换方法及终端设备
WO2022228378A1 (zh) 可穿戴式设备的交互方法、装置、电子设备及可读存储介质
WO2022161241A1 (zh) 息屏显示方法及装置
WO2022121877A1 (zh) 消息处理方法、装置和电子设备
WO2022206697A1 (zh) 图像分享方法、装置及电子设备
WO2022135290A1 (zh) 截屏方法、装置及电子设备
WO2022068864A1 (zh) 后台任务的显示方法、装置、电子设备和可读存储介质
CN112269505B (zh) 音视频控制方法、装置及电子设备
WO2023025060A1 (zh) 界面显示的适配处理方法、装置和电子设备
WO2022253182A1 (zh) 通信方法、装置、电子设备以及可读存储介质
WO2022143869A1 (zh) 程序运行方法、装置及电子设备
WO2022089481A1 (zh) 信息处理方法、装置及电子设备
WO2022161406A1 (zh) 加密方法、装置、电子设备及介质
WO2022063164A1 (zh) 界面显示方法、装置和电子设备
WO2021254377A1 (zh) 应用图标显示方法、装置和电子设备
WO2023056978A1 (zh) 图标显示方法, 装置及电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21868703

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2023517295

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 20237012327

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2021868703

Country of ref document: EP

Effective date: 20230421