WO2024036998A1 - Display method, storage medium, and electronic device - Google Patents

Display method, storage medium, and electronic device

Info

Publication number
WO2024036998A1
Authority
WO
WIPO (PCT)
Prior art keywords: window, application, display, user, area
Prior art date
Application number
PCT/CN2023/087891
Other languages
English (en)
Chinese (zh)
Inventor
谭泳发
秦涛
Original Assignee
荣耀终端有限公司
Priority date
Filing date
Publication date
Application filed by 荣耀终端有限公司
Publication of WO2024036998A1


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces

Definitions

  • The present invention relates to the field of mobile terminal technology, and in particular to a display method, a storage medium, and an electronic device.
  • Existing terminal devices usually support a multi-tasking window mode. A user can add an application as a window of any size and float it above other application interfaces or the device's home screen as needed. This type of window is called a floating window. When the user opens other applications, the floating window can remain on top, allowing the user to perform multiple tasks on the device at the same time.
  • Embodiments of the present invention provide a display method, a storage medium, and an electronic device that can customize the content displayed in a floating window according to the user's individual needs, thereby improving the user experience.
  • In a first aspect, the present invention provides a display method applied to an electronic device. The method includes: detecting a first operation in which a user requests a floating display of a first application interface of a first application; in response to the first operation, generating a selection box on the first application interface of the first application; and displaying, in a floating manner in a first window, a first area of the first application interface selected by the user through the selection box.
  • For example, the first operation may be the operation of the user clicking button 23A shown in Figure 2B, that is, the user clicks button 23A to display the interface of the video application (i.e., the first application interface) in a floating manner. Since the interface of the video application is already displayed in a floating window at this time, as shown in Figure 2E, after the user clicks button 23A the interface of window 25 (i.e., the first window) is also a floating interface, and the interface of window 25 is a part of the interface of the video application. In other words, clicking button 23A floats part of the video application interface again, and which part of the video application interface is displayed is determined by the area selected through the selection box.
  • Corresponding prompt information, such as "Specify APP display range", can be added to button 23A to prompt the user to select the first area of the first application interface by clicking the "Specify APP display range" button 23A.
  • In this way, a selection box (i.e., selection box 24) is generated on the first application interface of the first application, and the electronic device can present the display content of the area selected by the selection box (i.e., the first area) separately in the floating window (i.e., the first window), so that the display content of the floating window meets the user's personalized needs.
  • In a possible implementation, the method further includes: in response to the first operation, generating a focus area according to the application type of the first application; and obtaining the first area after the user adjusts the focus area.
  • For example, focus areas can be configured in advance for different application types: the focus area of a video application is the video playback area, the focus area of a navigation application is the map navigation area, and so on. The user can also preconfigure a personalized focus area for each application type. In this way, in response to the first operation, the electronic device can obtain the application type of the first application and the preconfigured focus area corresponding to that type, so as to control the selection box to automatically frame the focus area.
  • When the focus area meets the user's display needs, the user can use the focus area directly as the first area; when it does not, the user can adjust the selection box to obtain a first area that meets the user's display needs. A minimal sketch of such a preconfigured mapping is shown below.
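The patent does not specify how the per-type focus area is stored; the following Java sketch only shows one plausible way an implementation could map an application category to a default focus region. The class name, enum values, and default rectangles are illustrative assumptions, not part of the disclosed method.

```java
import android.graphics.Rect;
import java.util.HashMap;
import java.util.Map;

/** Illustrative registry of default focus areas keyed by application type (assumption). */
public class FocusAreaRegistry {

    public enum AppType { VIDEO, NAVIGATION, CHAT, OTHER }

    // Default focus rectangles in window coordinates; real values would come from
    // the application's layout (e.g., the bounds of its video-playback view).
    private final Map<AppType, Rect> defaults = new HashMap<>();

    public FocusAreaRegistry() {
        defaults.put(AppType.VIDEO, new Rect(0, 200, 1080, 810));      // video playback area
        defaults.put(AppType.NAVIGATION, new Rect(0, 0, 1080, 1400));  // map navigation area
    }

    /** Lets the user override the preconfigured focus area for an application type. */
    public void setUserFocusArea(AppType type, Rect area) {
        defaults.put(type, new Rect(area));
    }

    /** Returns a copy of the focus area for the given type, or null if none is configured. */
    public Rect getFocusArea(AppType type) {
        Rect r = defaults.get(type);
        return r == null ? null : new Rect(r);
    }
}
```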
  • In a possible implementation, the first operation is a screenshot operation; in response to the screenshot operation, a screenshot area is generated on the first application interface of the first application, and the first area is obtained after the user adjusts the screenshot area.
  • For example, the first operation may be the operation of the user taking a screenshot of the interface of the video application (i.e., the first application interface) shown in Figure 3B; that is, the user takes the screenshot in order to display part of the video application interface in a floating manner. As shown in Figure 3E, after the user takes the screenshot of the video application interface, the interface of window 33 (i.e., the first window) is a floating interface, and the interface of window 33 is a part of the interface of the video application. Which part of the video application interface is displayed in a floating manner is determined by the area selected through the screenshot box.
  • In this way, a screenshot box (i.e., screenshot box 32) acts as the selection box, the selection area of the screenshot operation is framed through it to obtain the first area, and the electronic device can present the display content of the area selected by the box (i.e., the first area) separately in the floating window (i.e., the first window), so that the display content of the floating window meets the user's personalized needs.
  • In a possible implementation, the method further includes: detecting the user's adjustment operation on the selection box, and selecting the first area in response to the adjustment operation. The user can adjust the size and position of the selection box, and thereby the first area it selects, so that the first area meets the user's personalized needs.
  • For example, when the user does not want to display the content of the above focus area or screenshot area in the first window, the user can adjust the focus area or screenshot area through the selection box to obtain a first area that meets the user's display requirements; conversely, when the content of the focus area or screenshot area already meets those requirements, the user can use it directly as the first area.
  • The adjustment operation on the selection box includes at least one of the following: an operation of enlarging the selection box, an operation of shrinking the selection box, and an operation of moving the selection box. For example, as shown in Figures 2B to 2C, the user can shrink the selection box by dragging its border; as shown in Figures 3C to 3D, the user can move the selection box by pressing and dragging it. By adjusting the size and position of the selection box, the user makes the first area it selects meet the user's display requirements. A touch-handling sketch for such adjustments follows.
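The enlarge/shrink/move operations could be handled by an ordinary touch listener on the overlay that draws the selection box. The sketch below is a minimal illustration under that assumption; the SelectionOverlay interface and the corner-based resize heuristic are not described in the patent.

```java
import android.graphics.Rect;
import android.view.MotionEvent;
import android.view.View;

/**
 * Illustrative touch handler that lets the user move or resize a selection
 * rectangle drawn by some overlay view (assumption: the overlay exposes
 * getSelection()/setSelection() and redraws itself when the Rect changes).
 */
public class SelectionBoxTouchHandler implements View.OnTouchListener {

    private static final int EDGE_SLOP_PX = 48;  // touches this close to the corner resize instead of move
    private float lastX, lastY;
    private boolean resizing;

    private final SelectionOverlay overlay;      // hypothetical overlay view holding the Rect

    public SelectionBoxTouchHandler(SelectionOverlay overlay) {
        this.overlay = overlay;
    }

    @Override
    public boolean onTouch(View v, MotionEvent e) {
        Rect box = overlay.getSelection();
        switch (e.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                lastX = e.getX();
                lastY = e.getY();
                // Start a resize if the finger lands near the bottom-right corner of the box.
                resizing = Math.abs(e.getX() - box.right) < EDGE_SLOP_PX
                        && Math.abs(e.getY() - box.bottom) < EDGE_SLOP_PX;
                return true;
            case MotionEvent.ACTION_MOVE:
                int dx = Math.round(e.getX() - lastX);
                int dy = Math.round(e.getY() - lastY);
                if (resizing) {
                    box.right += dx;    // enlarge or shrink the box
                    box.bottom += dy;
                } else {
                    box.offset(dx, dy); // move the box
                }
                overlay.setSelection(box);
                lastX = e.getX();
                lastY = e.getY();
                return true;
            default:
                return false;
        }
    }

    /** Minimal interface the overlay is assumed to implement (not part of the patent). */
    public interface SelectionOverlay {
        Rect getSelection();
        void setSelection(Rect r);
    }
}
```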
  • In a possible implementation, the method further includes: prompting the user to confirm the first area of the first application interface selected through the selection box; and, in response to the user's confirmation operation, displaying in a floating manner in the first window the first area of the first application interface selected by the user through the selection box.
  • For example, a prompt button can be loaded onto the first application interface to prompt the user to click the button to confirm the first area of the first application interface selected by the selection box. As shown in Figures 2D and 2E, in response to the user's confirmation operation, the area selected by selection box 24 (i.e., the first area) can be displayed in a floating manner through window 25 (i.e., the first window); as shown in Figures 3D and 3E, after the user clicks button 31A (i.e., the confirmation operation), the area selected by screenshot box 32 can be displayed in a floating manner through window 33 (i.e., the first window).
  • In a possible implementation, the method further includes: displaying a second application interface of a second application on the screen of the electronic device, and displaying the first window floating above the second application interface.
  • For example, as shown in Figure 2E, the display screen of mobile phone 20 shows window 21, which displays the interface of a chat application (i.e., the second application), and window 25 (i.e., the first window), which displays the interface of a video application (i.e., the first application). Similarly, as shown in Figure 3F, the display screen of mobile phone 30 shows the interface of the chat application (i.e., the second application) and window 33 (i.e., the first window), which displays the interface of the video application (i.e., the first application).
  • In this way, the first application and the second application are displayed on the screen of the electronic device at the same time, and the user can view both interfaces simultaneously. Moreover, the content shown in the first window is the display content of the first area of the first application interface that the user customized through the selection box, so the display content of the first window meets the user's personalized needs and improves the user experience.
  • In a possible implementation, the first area includes at least one of the following: a partial area of the first application interface, or the entire area of the first application interface. That is, the user can use the selection box to frame either part of the first application interface or all of it.
  • In a possible implementation, the size and shape of the first window are the same as the size and shape of the selection box. When the first area is a partial area, the floating interface of the first window is the part of the first application interface corresponding to that partial area, which makes the first window smaller and less likely to block the display interfaces of other application windows; a sketch of creating such a window from the selected area follows the list of shapes below.
  • In a possible implementation, the selection box may have at least one of the following shapes: a straight-sided rectangle, a rounded rectangle, an oval, a heart, a circle, a star, a straight-sided triangle, or a rounded triangle.
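The disclosure does not name specific framework calls. Purely as a hedged illustration, on Android an overlay window whose position and size match the selected rectangle could be created roughly as follows; the helper class and parameter names are assumptions, and permission and view handling are simplified.

```java
import android.content.Context;
import android.graphics.PixelFormat;
import android.graphics.Rect;
import android.os.Build;
import android.view.Gravity;
import android.view.View;
import android.view.WindowManager;

/** Illustrative helper that floats a view in a window matching the selected area. */
public class FloatingWindowHelper {

    /**
     * Shows {@code content} in a small overlay window whose position and size
     * equal the user-selected rectangle (screen coordinates). Requires the
     * "display over other apps" (SYSTEM_ALERT_WINDOW) permission.
     */
    public static void showSelectedArea(Context context, View content, Rect selected) {
        WindowManager wm = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);

        int type = Build.VERSION.SDK_INT >= Build.VERSION_CODES.O
                ? WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY
                : WindowManager.LayoutParams.TYPE_PHONE;

        WindowManager.LayoutParams lp = new WindowManager.LayoutParams(
                selected.width(),                     // window width  = selection width
                selected.height(),                    // window height = selection height
                type,
                WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,
                PixelFormat.TRANSLUCENT);
        lp.gravity = Gravity.TOP | Gravity.START;
        lp.x = selected.left;                         // window position = selection position
        lp.y = selected.top;

        wm.addView(content, lp);
    }
}
```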
  • In a possible implementation, the method further includes: detecting a second operation in which the user requests to restore the display of the first application interface of the first application; and, in response to the second operation, displaying the first application interface of the first application on the screen of the electronic device.
  • For example, the second operation may be the operation of the user clicking button 25A shown in Figure 2E, or the operation of the user clicking button 33A shown in Figure 3F.
  • In another aspect, the present invention provides a display apparatus applied to an electronic device. The apparatus includes: a detection module for detecting a first operation in which a user requests a floating display of a first application interface of a first application; a generation module for generating, in response to the first operation, a selection box on the first application interface of the first application; and a display module for displaying, in a floating manner in a first window, a first area of the first application interface selected by the user through the selection box.
  • In another aspect, embodiments of the present invention provide a readable storage medium storing instructions which, when executed on an electronic device, enable the electronic device to implement the display method of the first aspect or of any of its possible implementations.
  • In another aspect, embodiments of the present invention provide an electronic device, including: a memory for storing instructions to be executed by one or more processors of the electronic device; and a processor, being one of the processors of the electronic device, for executing the instructions stored in the memory to implement the display method of the first aspect or of any of its possible implementations.
  • In another aspect, embodiments of the present invention provide a program product including instructions which, when executed, enable an electronic device to implement the display method of the first aspect or of any of its possible implementations.
  • Figure 1A shows a schematic diagram of an application scenario of the display method according to some embodiments of the present invention
  • Figure 1B shows a schematic diagram of another application scenario of the display method according to some embodiments of the present invention.
  • Figure 2A shows a schematic diagram of a user opening an application through a sidebar according to some embodiments of the present invention
  • Figure 2B shows a schematic diagram of a user clicking a custom display button according to some embodiments of the present invention
  • Figure 2C shows a schematic diagram of a user adjusting a frame selection area according to some embodiments of the present invention
  • Figure 2D shows a schematic diagram of an electronic device automatically selecting a focus area according to some embodiments of the present invention
  • Figure 2E shows a schematic diagram of separately presenting the display content of a frame selection area or a focus area through a floating window according to some embodiments of the present invention
  • Figure 2F shows a schematic diagram of restoring the floating window after the user clicks the restore button according to some embodiments of the present invention;
  • Figure 3A shows a schematic diagram of a full-screen display of a video application according to some embodiments of the present invention;
  • Figure 3B shows a schematic diagram of a user performing a screenshot operation with a knuckle according to some embodiments of the present invention;
  • Figure 3C shows a schematic diagram of a user adjusting the screenshot area according to some embodiments of the present invention;
  • Figure 3D shows another schematic diagram of a user clicking a custom display button according to some embodiments of the present invention;
  • Figure 3E shows a schematic diagram of separately presenting the display content of a screenshot area through a floating window according to some embodiments of the present invention;
  • Figure 3F shows a schematic diagram of a floating window displayed on top according to some embodiments of the present invention;
  • Figure 3G shows another schematic diagram of restoring a floating window after the user clicks the restore button according to some embodiments of the present invention;
  • FIG. 4A shows a schematic diagram of a software structure of the electronic device 100 according to some embodiments of the present invention
  • Figure 4B shows a schematic diagram of another software structure of the electronic device 100 according to some embodiments of the present invention.
  • Figure 5A shows a schematic flowchart of interactive steps of display method Embodiment 1 according to some embodiments of the present invention
  • FIG. 5B shows a schematic flowchart of interactive steps in Embodiment 2 of the display method according to some embodiments of the present invention
  • FIG. 5C shows a schematic flowchart of interactive steps in Embodiment 3 of the display method according to some embodiments of the present invention.
  • Figure 6A shows a schematic flowchart of the method steps of display method Embodiment 1 according to some embodiments of the present invention;
  • Figure 6B shows a schematic flowchart of the method steps of display method Embodiment 2 according to some embodiments of the present invention;
  • Figure 6C shows a schematic flowchart of the method steps of display method Embodiment 3 according to some embodiments of the present invention;
  • FIG. 7 shows a schematic diagram of the hardware structure of the electronic device 100 according to some embodiments of the present invention.
  • Illustrative embodiments of the present invention include, but are not limited to, a display method, a storage medium, and an electronic device.
  • The multi-tasking window mode means that different interfaces of multiple applications, or different interfaces of one application, can be displayed in separate windows on the display screen of the same electronic device, so that the user can operate multiple applications on the screen at the same time.
  • For example, when a user is watching a video in a video application on an electronic device and receives a chat message from a friend through a chat application, the user can display the video application in the form of a floating window. When the user then opens the chat application, the chat application window can be displayed full screen while the floating window of the video application stays on top of it. That is, the video application window and the chat application window are displayed on the screen at the same time, so the user can continue watching the video while replying to the friend's messages.
  • However, when the electronic device presents the display interface of an application in the form of a floating window, it can usually only present the complete interface of that application in the floating window. To avoid blocking other application interfaces underneath, the user can only shrink the floating window as a whole through its zoom function. Shrinking the window reduces its display resolution, and the amount by which it can be shrunk is limited: once a preset minimum size is reached, the window is automatically closed, which degrades the user experience.
  • For example, as shown in Figure 1A, window 12A presents the interface of a chat application and is displayed full screen, while window 12B presents the interface of a video application and is a floating window suspended above window 12A. Because window 12B presents the complete interface of the video application, it is relatively large and easily blocks the interfaces of other windows (such as window 12A). If window 12B in Figure 1A is shrunk as a whole, its display resolution decreases, and the amount by which it can be shrunk is limited: once the preset minimum size is reached, the window is automatically closed, which degrades the user experience.
  • In addition, some applications on the electronic device support a "picture-in-picture" function and provide their own trigger buttons, through which the user can make the electronic device present the display content of a focus area of the application interface separately in a floating window. Here, the "picture-in-picture" function means that after the electronic device detects the user's trigger operation (such as clicking the trigger button), it responds by automatically obtaining the focus area of the triggered application interface and presenting the display content of that focus area separately in a floating window.
  • However, the focus area of the application interface is usually preconfigured according to the needs of most users or according to the application type. When the user's needs for the focus area change, the "picture-in-picture" function cannot capture those changed needs, and the resulting focus area is not conducive to a good user experience.
  • For example, as shown in Figure 1B, two windows are displayed on the display screen of mobile phone 11. Window 11A presents the interface of the chat application and is displayed full screen; window 11B presents the interface of the video application and is a floating window suspended above window 11A. After mobile phone 11 detects that the user's finger clicks the trigger button 11B1 in the video application (left side of Figure 1B), it can automatically obtain the current video playback area of the video application and present the display content of that area separately in window 11B (right side of Figure 1B), allowing the user to chat with friends while watching the video.
  • However, because the focus area of the application interface is usually preconfigured according to the needs of most users or according to the application type, when the user's needs for the focus area change, the "picture-in-picture" function cannot reflect those changed needs, and the resulting focus area degrades the user experience. The conventional picture-in-picture behavior this refers to is sketched below.
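The patent does not tie the described "picture-in-picture" function to any particular API. For orientation only, the following sketch shows how a conventional Android picture-in-picture transition with a fixed, application-chosen source region is typically requested; it illustrates why the shown region is determined by the application rather than by the user, which is the limitation discussed above.

```java
import android.app.Activity;
import android.app.PictureInPictureParams;
import android.graphics.Rect;
import android.os.Build;
import android.util.Rational;
import android.view.View;

/** Illustrative use of the stock Android picture-in-picture API (API 26+). */
public class PipLauncher {

    /**
     * Enters picture-in-picture using the bounds of a fixed, application-chosen
     * view (e.g., the video player) as the source region. The user has no way
     * to pick a different region with this mechanism.
     */
    public static void enterPip(Activity activity, View videoPlayerView) {
        if (Build.VERSION.SDK_INT < Build.VERSION_CODES.O) {
            return; // PictureInPictureParams is only available on API 26 and later
        }
        Rect source = new Rect();
        videoPlayerView.getGlobalVisibleRect(source);  // fixed focus area chosen by the app

        // Android restricts the aspect ratio to roughly between 1:2.39 and 2.39:1.
        PictureInPictureParams params = new PictureInPictureParams.Builder()
                .setAspectRatio(new Rational(source.width(), Math.max(1, source.height())))
                .setSourceRectHint(source)
                .build();
        activity.enterPictureInPictureMode(params);
    }
}
```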
  • In view of this, the present invention provides a display method that adds a custom display function to application windows. When the electronic device detects a relevant user operation on an application window, for example when it detects that the user has added the application as a floating window or has taken a screenshot of the application window, the electronic device can display a selection box on the floating window or on the screenshot interface. In response to the user's frame-selection of an area of that interface through the selection box, the electronic device presents the display content of the selected area in a floating window of the same size as the selection box. In this way the display content of the floating window meets the user's personalized needs, and because the floating window only presents part of the application's complete interface (the area the user selected through the selection box), the floating window is small and does not easily block the display interfaces of other application windows.
  • In some embodiments, when the electronic device detects that the user has added an application as a floating window presenting the application's complete interface, it enables the custom display function of that floating window and displays a custom display button on it. The electronic device can then generate a selection box in the floating window in response to the user clicking the custom display button and, in response to the user adjusting the size and position of the selection box, obtain the display content of the selected area and present it separately in a floating window of the same size as the selection box, so that the display content of the floating window meets the user's personalized needs. Alternatively, in response to the click on the custom display button, the electronic device may obtain the focus area corresponding to the application type shown in the floating window, generate a selection box framing that focus area on the floating window's interface, and present the display content of the focus area separately in a floating window of the same size as the selection box.
  • In some embodiments, when the electronic device detects that the user takes a screenshot of an application window presenting the application's complete interface, the electronic device can display a screenshot box (acting as the selection box) corresponding to the screenshot operation on the application window, enable the custom display function of the window, and display a custom display button on it. In response to the user clicking the custom display button, the electronic device can present the display content of the area captured by the screenshot box separately in a floating window of the same size as the screenshot box; or, in response to the click on the custom display button together with the user's adjustment of the size and position of the screenshot box, present the display content of the captured area separately in a floating window of the same size as the screenshot box, so that the display content of the floating window meets the user's personalized needs.
  • In some embodiments, when the electronic device detects that the user adds an application as a floating window presenting the application's complete interface, it enables the custom display function for that floating window and generates a custom display button on it. When the electronic device detects that the user clicks the custom display button, it generates a selection box on the interface of the floating window in response. The user can change the size and position of the selection box by dragging its border, thereby framing the area to be displayed separately. When the electronic device detects that the user clicks the custom display button again, this indicates that the user has finished selecting the area; in response, the electronic device presents the display content of the selected area separately in the floating window, whose window size now equals the size of the selected area.
  • In some embodiments, when the electronic device presents the display content of the user-selected area alone in the floating window, it enables the restore function of the floating window and generates a restore button on it. When the electronic device detects that the user clicks the restore button, it responds by obtaining the display window the application had before the custom display function was triggered (that is, a floating window presenting the application's complete interface) and restores the floating window that separately presented the selected area to that full-interface floating window. The two-click flow is sketched below.
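The two-click interaction described above can be summarized in a small sketch. The following Java code only illustrates the control flow; SelectionOverlay and FloatingWindowController are hypothetical helpers standing in for whatever mechanism an implementation uses to draw the selection box and resize the floating window.

```java
import android.graphics.Rect;
import android.widget.Button;

/**
 * Illustrative wiring of the two-click custom-display flow described above.
 * SelectionOverlay and FloatingWindowController are hypothetical helpers,
 * not components named by the patent.
 */
public class CustomDisplayFlow {

    private boolean selecting;   // false: first click shows the box; true: second click confirms it

    public void bind(Button customDisplayButton,
                     SelectionOverlay overlay,
                     FloatingWindowController controller) {
        customDisplayButton.setOnClickListener(v -> {
            if (!selecting) {
                overlay.show(controller.defaultFocusArea()); // first click: show an adjustable selection box
                customDisplayButton.setText("OK");
                selecting = true;
            } else {
                Rect area = overlay.currentSelection();      // second click: user finished adjusting
                overlay.hide();
                controller.shrinkTo(area);                   // resize the floating window to the selected area
                selecting = false;
            }
        });
    }

    public interface SelectionOverlay {
        void show(Rect initial);
        Rect currentSelection();
        void hide();
    }

    public interface FloatingWindowController {
        Rect defaultFocusArea();
        void shrinkTo(Rect area);
    }
}
```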
  • For example, as shown in Figure 2A, two windows are displayed on the display screen of mobile phone 20. Window 21 presents the interface of the chat application and is displayed full screen. Window 22 presents the interface of a sidebar; it includes the clock, video, music, browser, and camera applications and is suspended above window 21. After detecting that the user clicks the video application in the sidebar, mobile phone 20 can open the video application in the form of a floating window and present the complete interface of the video application (Figure 2B).
  • When mobile phone 20 detects that the user has added the video application as floating window 23 and presents the complete interface of the video application through floating window 23, it can automatically enable the custom display function of floating window 23 and generate a custom display button 23A at the bottom of floating window 23. Corresponding prompt information, such as "Specify APP display range", can be added to the custom display button 23A to prompt the user that clicking the "Specify APP display range" button 23A customizes the display range of the video application.
  • After detecting that the user clicks the "Specify APP display range" button 23A (Figure 2B), mobile phone 20 generates selection box 24 on the display interface of floating window 23. As shown in Figure 2C, the area selected by selection box 24 is displayed fully transparent and the unselected area is displayed semi-transparent; the difference in transparency helps the user distinguish the area selected by selection box 24. The user can control the size and position of the selected area by dragging the border of selection box 24; for example, the user can drag the lower-right border of selection box 24 upward in the direction of the arrow to frame the video playback area of the video application. At the same time, the prompt information of the custom display button 23A can change, for example to "OK", to prompt the user to confirm the area selected by selection box 24 by clicking button 23A.
  • As shown in Figure 2E, the video playback area presented by floating window 25 has the same interface size as it had in floating window 23, that is, the two present the video playback area at the same resolution. However, since floating window 25 presents only the interface of the video playback area, its overall size is smaller than that of floating window 23, so it hardly blocks the interface of window 21. Moreover, because the interface presented by floating window 25 is the area the user framed, the display content of floating window 25 meets the user's personalized needs and improves user satisfaction. A restore button 25A is also generated on floating window 25.
  • Corresponding prompt information, such as "one-click restore", can be added to restore button 25A to prompt the user that clicking the "one-click restore" button 25A restores floating window 25 to floating window 23. After mobile phone 20 detects that the user clicks the "one-click restore" button 25A, it responds by obtaining the display window the video application had before the custom display function was triggered (floating window 23) and restores floating window 25 to floating window 23 (Figure 2B).
  • In other embodiments, focus areas can be configured in advance for different application types: the focus area of a video application is the video playback area, the focus area of a navigation application is the map navigation area, and so on. The user can also preconfigure a personalized focus area for each application type. When the electronic device detects that the user clicks the custom display button for the first time, it can obtain the type of the application and the preconfigured focus area corresponding to that type, and control the selection box to automatically frame the focus area.
  • For example, when the user has set the focus area of video-type applications to the video playback area, after mobile phone 20 detects that the user clicks the "Specify APP display range" button in Figure 2B, it can automatically obtain the video playback area and control selection box 24 to frame it automatically (as shown in Figure 2D), so the user does not need to frame the focus area manually (as in Figure 2C). Of course, the user can still control the size and position of the area selected by selection box 24 by dragging its border, that is, re-select the focus area, so that the content finally framed for the floating window meets the user's personalized needs.
  • In this way, the user can simultaneously view the display interfaces of multiple application windows on mobile phone 20 (as shown in Figure 2E) in order to operate the applications loaded in those windows. Moreover, the display interface of one of the application windows (such as floating window 25) is the interface of an area (such as the area framed by selection box 24) that the user selected from the display interface of an application (such as the video application), so the display interface of that application window meets the user's display requirements for the selected application. The application window is thereby personalized for the current user of the electronic device, improving the user experience.
  • In other embodiments, when the electronic device detects that the user takes a screenshot of the interface of an application window (the window at this time displaying the complete interface of the application), the electronic device enables the custom display function of that application window and adds a custom display button to the corresponding screenshot interface. The electronic device can then, in response to a click on the custom display button, present the display content of the screenshot area corresponding to the user's screenshot operation separately in a floating window whose window size equals the size of the screenshot area.
  • In some embodiments, when the electronic device presents the display content of the user-selected area separately in the floating window, it enables the restore function of the floating window and generates a restore button on it. When the electronic device detects that the user clicks the restore button, it responds by obtaining the display window the application had before the custom display function was triggered (that is, the window displaying the application's complete interface) and restores the floating window to that full-interface window.
  • For example, as shown in Figure 3A, a window 31 is displayed full screen on the display screen of mobile phone 30 and presents the complete interface of the video application. As shown in Figure 3B, the user takes a screenshot of the display interface of window 31 with a knuckle, for example by drawing screenshot box 32 on the interface of window 31 with the knuckle. Mobile phone 30 can capture and save the interface inside screenshot box 32 in response to the user's screenshot operation; a region-capture sketch is given below.
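The patent does not state which capture mechanism is used. Purely as an illustration, cropping the capture to the drawn box could be done on Android with PixelCopy, for instance as follows (error handling and storage omitted; the Callback interface is an assumption):

```java
import android.graphics.Bitmap;
import android.graphics.Rect;
import android.os.Handler;
import android.os.Looper;
import android.view.PixelCopy;
import android.view.Window;

/** Illustrative capture of only the region inside the user-drawn screenshot box (API 26+). */
public class RegionScreenshot {

    public interface Callback {
        void onCaptured(Bitmap regionBitmap);
    }

    /**
     * Copies the pixels of {@code box} (window coordinates) from the given window
     * into a bitmap of the same size, i.e. the content framed by screenshot box 32.
     */
    public static void capture(Window window, Rect box, Callback callback) {
        Bitmap bitmap = Bitmap.createBitmap(box.width(), box.height(), Bitmap.Config.ARGB_8888);
        PixelCopy.request(window, box, bitmap, copyResult -> {
            if (copyResult == PixelCopy.SUCCESS) {
                callback.onCaptured(bitmap);   // e.g. save it or hand it to the floating window
            }
        }, new Handler(Looper.getMainLooper()));
    }
}
```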
  • After detecting that the user takes a screenshot of the interface of window 31, mobile phone 30 enables the custom display function of window 31 and adds a custom display button 31A to the screenshot interface. As shown in Figure 3B, the area captured by screenshot box 32 is displayed fully transparent and the uncaptured area is displayed semi-transparent; the difference in transparency helps the user distinguish the area captured by screenshot box 32. Corresponding prompt information, such as "small window display", can be added to the custom display button 31A to prompt the user that clicking button 31A presents the display content of the captured area separately in a floating window.
  • In addition, the user can change the size, position, and shape of screenshot box 32 on the screenshot interface so that the display interface framed by screenshot box 32 meets the user's personalized needs. For example, as shown in Figure 3C, the user can press screenshot box 32 and drag it upward along the arrow to the video playback area of the video application, or the user can adjust the shape of screenshot box 32 through a shape button provided by the screenshot function; the present invention is not limited in this respect.
  • As shown in Figure 3D, after detecting that the user clicks the "small window display" button 31A, mobile phone 30 can respond to the click by presenting the display content of the area captured by screenshot box 32 in floating window 33 (Figure 3E), where the window size of floating window 33 equals the size of the captured area. As shown in Figure 3E, mobile phone 30 presents the display content of the area framed by screenshot box 32 separately in floating window 33 and floats it above the main interface of mobile phone 30. At the same time, the restore function of floating window 33 is enabled and a restore button 33A is generated on floating window 33. Corresponding prompt information, such as "one-click restore", can be added to restore button 33A to prompt the user that clicking the "one-click restore" button 33A restores floating window 33 to window 31.
  • As shown in Figure 3F, when the user opens another application window (such as the chat application) and displays it full screen, floating window 33 stays on top, that is, floating window 33 is located above the chat application window, so the user can chat with friends through the chat application while watching videos through the video application on mobile phone 30. Because floating window 33 presents only the display content of the captured area, it is small and hardly blocks the chat application interface or the main interface of mobile phone 30. Moreover, because the interface presented by floating window 33 is one the user customized and selected, the display content of floating window 33 meets the user's personalized needs and improves user satisfaction.
  • As shown in Figure 3G, after mobile phone 30 detects that the user clicks the "one-click restore" button 33A, it responds by obtaining the display window the video application had before the custom display function was triggered (window 31) and restores floating window 33 to window 31.
  • In this way, the user can simultaneously view the display interfaces of multiple application windows on mobile phone 30 (as shown in Figure 3F) in order to operate the applications loaded in those windows. Moreover, the display interface of one of the application windows (such as floating window 33) is the interface of an area (such as the area captured by screenshot box 32) that the user selected from the display interface of an application (such as the video application), so the display interface of that application window meets the user's display requirements for the selected application. The application window is thereby personalized for the current user of the electronic device, improving the user experience.
  • In embodiments of the present invention, the electronic device may include, but is not limited to, a mobile phone, a tablet computer, a wearable device, a vehicle-mounted device, an augmented reality (AR)/virtual reality (VR) device, a laptop, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), or a dedicated camera (such as a single-lens reflex camera or a compact camera). The embodiments of the present invention place no restriction on the specific type of the electronic device.
  • In addition, the video application above is only used as an example of an application that triggers the custom display function. The application that triggers the custom display function can be any application on the electronic device that meets the triggering conditions, and the invention is not limited in this respect. For example, when the electronic device detects that the user adds an application as a floating window, it enables the custom display function of the floating window in which that application is displayed; as another example, when the electronic device detects that the user takes a screenshot of an application window interface, it enables the custom display function of the window in which that application is displayed.
  • Figure 4A shows a schematic diagram of the software structure of an electronic device 100 according to the present invention. The layered architecture divides the software into several layers, each with a clear role and division of labor, and the layers communicate through software interfaces. For example, the Android system applied to the electronic device can be divided into four layers: from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
  • The application layer can include a series of application packages. The application packages can include applications (APPs) such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, and short message.
  • The application framework layer provides an application programming interface (API) and a programming framework for the applications in the application layer; it includes some predefined functions. For example, the application framework layer can include a window manager, an activity manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and so on.
  • The window manager may be a window manager service (WMS). The WMS can be used to manage the startup, addition, and deletion of windows. As shown in Figures 2A and 2B, when the WMS detects that the user clicks the video application icon and wants to open the window interface of the video application, the WMS can add floating window 23 and present the complete interface of the video application through floating window 23.
  • The WMS can also be used for window size and hierarchy management. For example, as shown in Figure 2B, the level of floating window 23 is above window 21. As shown in Figures 2D and 2E, the WMS can respond to the user's click on the custom display button by changing the size of floating window 23 to the size of floating window 25 according to the size of the area framed by selection box 24, and presenting the display content of that area separately in floating window 25. As shown in Figures 2E and 2F, the WMS can respond to the user's click on the restore button by restoring the size of floating window 25 to the size of floating window 23 and restoring the display interface of floating window 23. The resize and restore operations involved are sketched below.
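The patent attributes the resize and restore to the WMS inside the framework, whose internals are not disclosed. The following Java sketch only approximates the same effect at application level using the public WindowManager API; the class and method names are assumptions.

```java
import android.graphics.Rect;
import android.view.View;
import android.view.WindowManager;

/**
 * Illustrative app-level approximation of the resize/restore behavior described
 * for the WMS; a real implementation inside the framework would operate on the
 * window hierarchy directly, which is not shown in the patent.
 */
public class FloatingWindowResizer {

    private final WindowManager windowManager;
    private final View floatingView;                   // the view added as the floating window
    private WindowManager.LayoutParams fullParams;     // parameters of the full-interface window

    public FloatingWindowResizer(WindowManager wm, View floatingView,
                                 WindowManager.LayoutParams initialParams) {
        this.windowManager = wm;
        this.floatingView = floatingView;
        this.fullParams = initialParams;
    }

    /** Shrinks the floating window to the framed area (custom display confirmed). */
    public void shrinkTo(Rect framed, WindowManager.LayoutParams current) {
        fullParams = copy(current);                    // remember the full-interface geometry
        WindowManager.LayoutParams small = copy(current);
        small.width = framed.width();
        small.height = framed.height();
        small.x = framed.left;
        small.y = framed.top;
        windowManager.updateViewLayout(floatingView, small);
    }

    /** Restores the floating window to the full-interface geometry (restore button clicked). */
    public void restore() {
        windowManager.updateViewLayout(floatingView, fullParams);
    }

    private static WindowManager.LayoutParams copy(WindowManager.LayoutParams src) {
        WindowManager.LayoutParams dst = new WindowManager.LayoutParams();
        dst.copyFrom(src);
        return dst;
    }
}
```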
  • The activity manager may be an activity manager service (AMS), which manages the activities (Activity 401) running on the device. Each activity 401 includes a window 402 (PhoneWindow); PhoneWindow represents an instance of a window, so it can be understood that each Activity includes one window instance (PhoneWindow). The window 402 includes multiple views, such as the root view 403 (DecorView) and content views (Content Views); the root view 403 represents the entire window interface of window 402. Figure 4B shows the containment relationship between activity 401, window 402, root view 403, and the content views, where the size and position of each root view 403 and content view are set according to the actual application.
  • For example, as shown in Figure 3A, a window 31 is displayed full screen on the display screen of mobile phone 30 and presents the complete interface of the video application. It can be understood that when the video application is opened by the user and runs on mobile phone 30, the video application at this time is an activity of mobile phone 30 and the window corresponding to that activity is window 31. Since window 31 is displayed full screen, the interface corresponding to its root view is the display interface of the entire screen of mobile phone 30, while the video playback area 31D, the barrage (bullet-comment) switch area 31B, and the selection area 31C in window 31 each correspond to different content views.
  • Different content views correspond to different types: the content view of the video playback area 31D corresponds to the video playback class, the content view of the barrage switch area 31B corresponds to the barrage class, and the content view of the selection area 31C corresponds to the selection class. The type of a view can be set according to the type of the application. In this way, a target content view can be configured for each application according to its type, and that target content view can be used as the focus area of the application: for example, the focus area of a video application is the content view of the video playback class, and the focus area of a navigation application is the content view of the map navigation class. Alternatively, the user can customize this configuration according to different needs, and no limitation is placed here. A sketch of locating such a target content view is given below.
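As a hedged illustration of how an implementation might locate the focus area from the view hierarchy, the following sketch walks the DecorView looking for a view marked as the target content class and returns its on-screen bounds. The tag-based lookup and the class name are assumptions; the patent only states that content views carry a type.

```java
import android.app.Activity;
import android.graphics.Rect;
import android.view.View;
import android.view.ViewGroup;

/** Illustrative lookup of a "target content view" (e.g. the video playback view). */
public class FocusAreaFinder {

    /**
     * Searches the window's view tree for a view whose tag equals {@code targetType}
     * (assumption: content views are tagged with their class, e.g. "video_playback")
     * and returns its visible bounds on screen, or null if no such view exists.
     */
    public static Rect findFocusArea(Activity activity, String targetType) {
        View root = activity.getWindow().getDecorView();   // root view (DecorView) of the window
        View target = findByTag(root, targetType);
        if (target == null) {
            return null;
        }
        Rect bounds = new Rect();
        target.getGlobalVisibleRect(bounds);               // screen-space bounds of the content view
        return bounds;
    }

    private static View findByTag(View view, String tag) {
        if (tag.equals(view.getTag())) {
            return view;
        }
        if (view instanceof ViewGroup) {
            ViewGroup group = (ViewGroup) view;
            for (int i = 0; i < group.getChildCount(); i++) {
                View found = findByTag(group.getChildAt(i), tag);
                if (found != null) {
                    return found;
                }
            }
        }
        return null;
    }
}
```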
  • As shown in Figure 2B, the chat application and the video application correspond to two different activities in mobile phone 20. The difference from Figure 3A is that the interface of the video application in Figure 2B is presented by floating window 23, whereas in Figure 3A it is presented by window 31. Accordingly, the window of the activity corresponding to the video application in Figure 2B is floating window 23, and the interface corresponding to the content views of that window is the entire display interface of floating window 23.
  • Content providers are used to store and retrieve data and make this data accessible to applications. Such data can include videos, images, audio, calls made and received, browsing history and bookmarks, the phone book, and so on.
  • The view system may be the display system of the electronic device, capable of managing and modifying the display style of the applications to be displayed on the electronic device. For example, the view system can obtain the display function corresponding to the dark mode according to the display style parameters contained in the display parameters stored on the electronic device.
  • The telephony manager is used to provide the communication capabilities of the electronic device, for example management of call status (including connected, hung up, etc.).
  • the resource manager provides various resources to applications, such as localized strings, icons, pictures, layout files, video files, etc.
  • The notification manager allows applications to display notification information in the status bar; it can be used to convey notification-type messages, and the notification can disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify of download completion, message reminders, and the like. The notification manager may also present notifications in the status bar at the top of the system in the form of a chart or scroll-bar text, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window; for example, text information is prompted in the status bar, a beep sounds, the electronic device vibrates, or an indicator light flashes.
  • Android Runtime includes core libraries and virtual machines. Android runtime is responsible for the scheduling and management of the Android system.
  • The core library contains two parts: one is the function modules that the Java language needs to call, and the other is the core library of Android.
  • the application layer and application framework layer run in virtual machines.
  • The virtual machine executes the Java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform object life cycle management, stack management, thread management, security and exception management, and garbage collection and other functions.
  • System libraries can include multiple functional modules. For example: surface manager (surface manager), media libraries (Media Libraries), three-dimensional graphics processing libraries (for example: OpenGL ES), two-dimensional graphics engines (for example: SGL), etc.
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as static image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, composition, and layer processing.
  • 2D Graphics Engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display driver, camera driver, audio driver, and sensor driver.
  • It can be understood that the software structure of the electronic device 100 shown in Figure 4A is only an example. The software structure of the electronic device 100 may include more or fewer layers, each layer may include more or fewer modules, and other operating system structures such as Android™, iOS™, or Windows™ may also be used; no limitation is placed here.
  • The following takes as an example the case where the electronic device detects that the user has added an application as a floating window presenting the complete interface of the application, thereby triggering the custom display function of that floating window, and where the user customizes the frame-selected area whose display content is then presented separately in a floating window of the same size as the framed area, to introduce the technical solution of the present application in detail. Figure 5A shows a flowchart of the interactive steps of the method, which include the following:
  • S411A: The application 401 generates a trigger event in response to a user operation. For example, as shown in Figure 2A, in response to the user's click on the video application, the application 401 generates a trigger event for calling the window service 402 to create a first floating window, so that the complete interface of the video application is presented through the created first floating window.
  • S411B: The application 401 sends the trigger event to the window service 402. That is, the application 401 sends the generated trigger event for creating the first floating window through the window service 402 to the window service 402, so that the window service 402 creates the first floating window after receiving it.
  • S412A: After receiving the trigger event, the window service 402 creates the first floating window. For example, after receiving the trigger event of Figure 2A, the window service 402 creates floating window 23 and presents the complete interface of the video application through floating window 23.
  • S412B: The window service 402 sends notification information that the first floating window has been created to the root view service 403, so that the root view service 403 can perform the next step (such as creating the corresponding buttons) after receiving the notification.
  • S413A: The root view service 403 creates a custom display button and a restore button based on the received notification information. For example, when the window service 402 creates floating window 23 and presents the complete interface of the video application through it, the custom display function of floating window 23 is enabled, the root view service 403 creates a custom display button 23A and a restore button 25A (Figure 2E), a first response event is configured for the custom display button 23A, and a second response event is configured for the restore button 25A.
  • S413B: The root view service 403 sends the custom display button and the restore button to the application 401. The custom display button 23A is loaded to the bottom of the first floating window after step S413B; as shown in Figure 2B, it is loaded to the bottom of floating window 23. The restore button 25A is loaded to the bottom of the second floating window after step S416; as shown in Figure 2E, it is loaded to the bottom of floating window 25.
  • S414A In response to the user's click on the custom display button and the user's operation of selecting a custom area on the first floating window, the application 401 obtains the first coordinate information of the selected area.
  • Specifically, when the user clicks the custom display button, the first response event is triggered, that is, the custom display function of the first floating window is turned on, so that the user can select a custom area on the first floating window and the display content of the framed area can be presented separately in a second floating window. For example, a selection box 24 is generated in the display interface of the floating window 23, and the user can control the area selected by the selection box 24 by dragging the border of the selection box 24.
  • S414B The application program 401 sends the first coordinate information of the selected area to the root view service 403.
  • Specifically, the size of the area selected by the selection box 24 is the first coordinate information of the above-mentioned selected area, wherein the area selected by the selection box 24 in Figure 2D is the Content View of the video playback class in the video application.
  • S415A The root view service 403 generates a first adjustment request according to the first coordinate information.
  • S415B The root view service 403 sends the first adjustment request to the window server 404.
  • Specifically, the root view service 403 sends the size of the user-defined frame selection area to the window server 404, so that the window server 404 adjusts the first floating window according to the size of the frame selection area to obtain a second floating window of the same size as the frame selection area.
  • S416 The window server 404 adjusts the first floating window to a second floating window adapted to the first coordinate information according to the first adjustment request, and presents the display content in the frame selection area separately in the second floating window.
  • For example, the window server 404 adjusts the floating window 23 (the first floating window) to the floating window 25 (the second floating window), whose size is the same as that of the Content View of the video playback class in the video application, and presents the display content in the Content View of the video playback class separately in the floating window 25.
  • the restore button 25A is loaded to the bottom of the floating window 25 .
  • S417A In response to the user's click on the restore button, the second response event is triggered, that is, the second floating window is to be restored to the first floating window. For example, after detecting that the user clicks the "one-click restore" button 25A, the floating window 25 generates a response event to restore to the floating window 23.
  • S417B The application 401 sends the second coordinate information to the root view service 403.
  • the size of the floating window 23 is the coordinate information of the floating window 23 , that is, the second coordinate information of the first floating window.
  • S418A The root view service 403 generates a second adjustment request according to the second coordinate information.
  • S418B The root view service 403 sends the second adjustment request to the window server 404.
  • the root view service 403 sends the size of the first floating window to the window server 404, so that the window server 404 adjusts the second floating window according to the size of the first floating window.
  • S419 The window server 404 restores the second floating window to the first floating window adapted to the second coordinate information according to the second adjustment request, and restores the display interface of the first floating window.
  • For example, the window server 404 restores the floating window 25 (the second floating window) to the floating window 23 (the first floating window), and restores the display interface of the floating window 23, that is, the complete interface of the video application. A simplified code sketch of this selection-and-restore flow is given below.
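  • The interaction above can be pictured with a short code sketch. The following Kotlin snippet is a minimal, self-contained model of the selection-and-restore flow in FIG. 5A under stated assumptions: the class and method names (FloatingWindowController, RootViewServiceStub, WindowServerStub, and so on) are hypothetical stand-ins for application 401, root view service 403 and window server 404, and do not correspond to any real platform API.

    // Minimal model of the FIG. 5A flow. All class and method names are hypothetical;
    // they only mirror the roles of application 401, root view service 403 and window server 404.
    data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int) {
        val width get() = right - left
        val height get() = bottom - top
    }

    // Window server 404: owns the floating-window geometry.
    class WindowServerStub {
        var windowBounds: Rect? = null
            private set

        // S416: adjust the first floating window to the frame-selected area (second floating window).
        fun adjustTo(selection: Rect) {
            windowBounds = selection
            println("Resized to ${selection.width}x${selection.height}: only the selected region is shown")
        }

        // S419: restore the first floating window and its complete interface.
        fun restore(original: Rect) {
            windowBounds = original
            println("Restored to ${original.width}x${original.height}: the complete interface is shown")
        }
    }

    // Root view service 403: turns coordinate information into adjustment requests (S415A/S415B, S418A/S418B).
    class RootViewServiceStub(private val server: WindowServerStub) {
        fun requestAdjustment(selection: Rect) = server.adjustTo(selection)
        fun requestRestore(original: Rect) = server.restore(original)
    }

    // Application 401 side: reacts to the custom display button and the restore button.
    class FloatingWindowController(private val rootView: RootViewServiceStub) {
        private val firstWindowBounds = Rect(0, 0, 540, 960) // complete interface of the first floating window

        // S414A/S414B: the user confirmed the selection box drawn on the first floating window.
        fun onCustomAreaSelected(selection: Rect) = rootView.requestAdjustment(selection)

        // S417A/S417B: the user tapped the "one-click restore" button.
        fun onRestoreClicked() = rootView.requestRestore(firstWindowBounds)
    }

    fun main() {
        val controller = FloatingWindowController(RootViewServiceStub(WindowServerStub()))
        controller.onCustomAreaSelected(Rect(0, 200, 540, 504)) // e.g. the video playback Content View
        controller.onRestoreClicked()
    }

  • Running main() first resizes the window to the selected region and then restores the complete interface, mirroring steps S414A through S419.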
  • FIG. 6A shows a flowchart of the method steps of the method, which includes the following steps:
  • S611 The window service 402 creates a first floating window to present the complete interface of the application 401.
  • the window service 402 creates a floating window 23 in response to the user's finger clicking operation on the video application, so as to present the complete interface of the video application through the floating window 23 .
  • S612 The root view service 403 creates a custom display button and a restore button.
  • Specifically, when the window service 402 creates the floating window 23, the custom display function of the floating window 23 is triggered.
  • The custom display button 23A is loaded to the bottom of the first floating window after step S612, as shown in FIG. 2B; and the restore button 25A is loaded to the bottom of the second floating window 25 after step S614B, as shown in FIG. 2E.
  • S613 The application 401 detects whether the button clicked by the user is the custom display button or the restore button.
  • Specifically, when the detected button click operation is on the custom display button, the flow jumps to step S614A; when the detected button click operation is on the restore button, the flow jumps to step S615A.
  • S614A The application 401 turns on the customized display function of the first floating window, and performs custom area selection on the first floating window.
  • Specifically, in response to the user clicking the custom display button, the application 401 triggers the first response event, that is, turns on the custom display function of the first floating window, so that the user can select a custom area on the first floating window and the display content of the frame selection area can be presented separately in the second floating window. For example, after it is detected that the user clicks the "Specify APP display range" button 23A, a selection box 24 is generated in the display interface of the floating window 23, and the user can control the area selected by the selection box 24 by dragging the border of the selection box 24.
  • S614B The window server 404 creates a second floating window based on the frame selection area, and presents the display content of the frame selection area separately in the second floating window.
  • For example, the window server 404 adjusts the floating window 23 (the first floating window) to the floating window 25 (the second floating window), whose size is the same as that of the area selected by the selection box 24, and presents the display content of the frame selection area separately in the floating window 25.
  • the restore button 25A is loaded to the bottom of the floating window 25 .
  • S615A The application 401 obtains the size of the first floating window.
  • Specifically, in response to the user clicking the restore button, the application 401 triggers the second response event, that is, the second floating window is to be restored to the first floating window. For example, as shown in FIGS. 2E and 2F, after it is detected that the user clicks the "one-click restore" button 25A, the size of the floating window 23 is obtained.
  • S615B The window server 404 restores the second floating window to the first floating window, and restores the display interface of the first floating window.
  • For example, the window server 404 restores the floating window 25 (the second floating window) to the floating window 23 (the first floating window) according to the size of the floating window 23, and restores the display interface of the floating window 23, that is, the complete interface of the application 401. A sketch of the button dispatch at step S613 is given below.
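  • As a rough illustration of the branch at step S613 described above, the Kotlin fragment below sketches how the application might dispatch between the custom display path (S614A/S614B) and the restore path (S615A/S615B). The ButtonType enum and the handler lambdas are assumptions introduced for illustration only, not part of the disclosed implementation.

    // Hypothetical dispatch corresponding to step S613 in FIG. 6A; the enum and
    // handlers stand in for the response events configured on the two buttons.
    enum class ButtonType { CUSTOM_DISPLAY, RESTORE }

    class ButtonDispatcher(
        private val onCustomDisplay: () -> Unit, // first response event: open custom area selection (S614A/S614B)
        private val onRestore: () -> Unit        // second response event: restore the first floating window (S615A/S615B)
    ) {
        fun onButtonClicked(type: ButtonType) = when (type) {
            ButtonType.CUSTOM_DISPLAY -> onCustomDisplay()
            ButtonType.RESTORE -> onRestore()
        }
    }

    fun main() {
        val dispatcher = ButtonDispatcher(
            onCustomDisplay = { println("S614A: enable custom display and show the selection box") },
            onRestore = { println("S615A: obtain the first-window size and restore it") }
        )
        dispatcher.onButtonClicked(ButtonType.CUSTOM_DISPLAY)
        dispatcher.onButtonClicked(ButtonType.RESTORE)
    }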
  • In this way, the user can customize the framed area in the first floating window, and the display content of the framed area is presented separately in the second floating window, so that the display content of the second floating window meets the user's personalized needs. In addition, because the second floating window only presents part of the complete interface in the first floating window (the interface of the user-defined frame selection area), the second floating window is smaller and does not easily block other application window interfaces on the electronic device.
  • the user can view the display interfaces of multiple application windows at the same time in the electronic device, so as to perform task operations on the corresponding loaded application programs in the multiple application windows.
  • In addition, the display interface of one of the application windows (such as the second floating window) displayed by the electronic device is the display interface of any area, selected by the user, of the display interface of any application, so that the display interface of the application window meets the interface display requirements of the application selected by the user, and the application window can be personalized for the current user of the electronic device to improve the user experience.
  • When the electronic device detects that the user has added an application as a floating window to display the complete interface of the application, it triggers the custom display function of the floating window corresponding to the complete interface of the application. The electronic device can automatically frame-select the focus area. Taking the case where the display content of the focus area is presented separately in a floating window as an example, the technical solution of this application is described in detail below.
  • Figure 5B shows a flowchart of the interaction steps of another implementation of the method.
  • the flowchart shown in Figure 5B is similar to the flowchart shown in Figure 5A of the first embodiment.
  • The only differences lie in S424A, S424B, S425A, S425B and S426 in the flowchart shown in Figure 5B, so only S424A, S424B, S425A, S425B and S426 are described in detail below:
  • S424A The application 401 generates a focus area request notification in response to the user's click on the custom display button.
  • Specifically, when the user clicks the custom display button, a third response event is triggered, that is, the custom display function of the first floating window is turned on, and the focus area of the first floating window is queried so that the display content of the focus area can be presented separately in a third floating window. For example, as shown in FIGS. 2B and 2D, after it is detected that the user clicks the "Specify APP display range" button 23A, a selection box 24 is generated in the display interface of the floating window 23 and the focus area is selected through the selection box 24.
  • S424B The application 401 sends the focus area request notification to the root view service 403.
  • the root view service 403 may query and obtain the focus area in the first floating window according to the received focus area request notification.
  • S425A The root view service 403 queries the third coordinate information of the focus area according to the focus area request notification, and generates a third adjustment request.
  • the focus area can be preset according to the types of different applications.
  • the focus area of a video-type application is the video playback area
  • the focus area of a navigation-type application is the map navigation area.
  • the focus area corresponding to the floating window 23 is the video playback area, that is, the area selected by the selection box 24 in FIG. 2D .
  • It can be understood that the area selected by the selection box 24 in Figure 2D is the Content View of the video playback class in the video application, and the size of the Content View is the third coordinate information of the above-mentioned focus area (a sketch of such a preset focus-area table is given after this flow).
  • S425B The root view service 403 sends the third adjustment request to the window server 404.
  • Specifically, the root view service 403 sends the size of the focus area to the window server 404, so that the window server 404 adjusts the first floating window according to the size of the focus area to obtain a third floating window with the same size as the focus area.
  • S426 The window server 404 adjusts the first floating window to a third floating window adapted to the third coordinate information according to the third adjustment request, and presents the display content in the focus area separately in the third floating window.
  • For example, the window server 404 adjusts the floating window 23 (the first floating window) to the floating window 25 (the third floating window), whose size is the same as that of the Content View of the video playback class in the video application, and presents the display content in the Content View of the video playback class separately in the floating window 25.
  • the restore button 25A is loaded to the bottom of the floating window 25 .
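  • The preset mapping from application type to focus area described in the flow above can be expressed as a simple lookup table. The Kotlin sketch below is only an assumption about how such a preset table might look; the AppCategory names and the concrete rectangles are illustrative placeholders rather than values taken from the disclosure.

    // Hypothetical preset table: application category -> focus area within the first
    // floating window. The category names and rectangles are placeholders for illustration.
    enum class AppCategory { VIDEO, NAVIGATION, OTHER }

    data class FocusArea(val left: Int, val top: Int, val right: Int, val bottom: Int)

    fun presetFocusArea(category: AppCategory, windowWidth: Int, windowHeight: Int): FocusArea? =
        when (category) {
            // Video apps: the video playback area (e.g. the playback Content View, a 16:9 strip).
            AppCategory.VIDEO -> FocusArea(0, windowHeight / 5, windowWidth, windowHeight / 5 + windowWidth * 9 / 16)
            // Navigation apps: the map navigation area.
            AppCategory.NAVIGATION -> FocusArea(0, 0, windowWidth, windowHeight * 3 / 4)
            // No preset: fall back to manual frame selection by the user.
            AppCategory.OTHER -> null
        }

    fun main() {
        // The returned rectangle plays the role of the "third coordinate information".
        println(presetFocusArea(AppCategory.VIDEO, 540, 960))
        println(presetFocusArea(AppCategory.NAVIGATION, 540, 960))
    }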
  • FIG. 6B shows a flowchart of the method steps of another implementation of the method.
  • the flowchart shown in Figure 6B is similar to the flowchart shown in Figure 6A of the first embodiment.
  • The only differences lie in S624A and S624B in the flowchart shown in Figure 6B, so only S624A and S624B are described in detail below:
  • S624A The application 401 turns on the customized display function of the first floating window and queries the focus area of the first floating window.
  • Specifically, in response to the click of the custom display button, the application 401 triggers the third response event, that is, turns on the custom display function of the first floating window and queries the focus area of the first floating window so that the display content of the focus area can be presented separately in the third floating window. For example, as shown in FIGS. 2B and 2D, after it is detected that the user clicks the "Specify APP display range" button 23A, a selection box 24 is generated in the display interface of the floating window 23 and the focus area is selected through the selection box 24.
  • the focus area can be preset according to the types of different applications.
  • the focus area of a video-type application is the video playback area
  • the focus area of a navigation-type application is the map navigation area.
  • the focus area corresponding to the floating window 23 is the video playback area, that is, the area selected by the selection box 24 in FIG. 2D . It can be understood that the area selected by selection box 24 in Figure 2D is the Content View of the video playback class in the video application.
  • S624B The window server 404 creates a third floating window based on the focus area, and presents the display content of the focus area separately in the third floating window.
  • For example, the window server 404 adjusts the floating window 23 (the first floating window) to the floating window 25 (the third floating window), whose size is the same as that of the frame selection area, and presents the display content of the frame selection area separately in the floating window 25.
  • the restore button 25A is loaded to the bottom of the floating window 25 .
  • In this way, the root view service 403 can automatically select the focus area according to the application type in the first floating window and present the display content of the focus area separately in the third floating window; that is, the focus area can be selected automatically without requiring the user to frame it manually, which improves the user experience and allows the display content of the third floating window to meet the user's personalized needs. In addition, because the third floating window only presents part of the complete interface in the first floating window (the interface of the focus area), the third floating window is smaller and does not easily block other application window interfaces on the electronic device.
  • the user can view the display interfaces of multiple application windows at the same time in the electronic device, so as to perform task operations on the corresponding loaded application programs in the multiple application windows.
  • In addition, the display interface of one of the application windows (such as the third floating window) displayed by the electronic device is the display interface of any area, selected by the user, of the display interface of any application, so that the display interface of the application window meets the interface display requirements of the application selected by the user, and the application window can be personalized for the current user of the electronic device to improve the user experience.
  • When the electronic device detects that the user takes a screenshot of an application window interface (at this time the window displays the complete interface of the application), it triggers the custom display function of the application window. The user can have the display content of the area corresponding to the screenshot operation presented separately in a floating window, or the user can customize the clipping area. Taking the case where the display content of the clipping area is presented separately in a floating window as an example, the technical solution of this application is described in detail below.
  • Figure 5C shows a flowchart of the interaction steps of another implementation of the method, which includes the following steps:
  • S431A The application program 401 generates a trigger event in response to the user operation.
  • Specifically, a window 31 is displayed on the display screen of the mobile phone 30; the window 31 is displayed in full screen and presents the complete interface of the video application.
  • The application 401 responds to the user's operation of taking a screenshot of the display interface of the window 31 with a knuckle, such as the operation of the user drawing the clipping box 32 on the display interface of the window 31 with a knuckle, so that the application 401 generates a trigger event for taking a screenshot of the area selected by the clipping box 32.
  • S431B The application 401 sends the trigger event to the root view service 403.
  • the application program 401 sends the trigger event to the root view service 403, so that the root view service 403 can perform the next step of processing (such as creating a corresponding button) after receiving the trigger event.
  • S432A The root view service 403 creates a custom display button and a restore button according to the received trigger event.
  • For example, the custom display button 31A and the restore button 33A are created through the root view service 403 (FIG. 3E), a fourth response event is configured for the custom display button 31A, and a fifth response event is configured for the restore button 33A.
  • S432B The root view service 403 sends the custom display button and restore button to the application 401.
  • The custom display button 31A is loaded to the top of the window 31 after step S432B, and the restore button 33A is loaded to the bottom of the floating window 33 after step S435.
  • S433A In response to the click of the custom display button, the application 401 obtains the clipping area of the original window and the fourth coordinate information of the clipping area.
  • a fourth response event is triggered, that is, the clipped area of the original window is obtained to separately present the display content of the clipped area in the fourth floating window.
  • the original window is used to represent the window (window 31) used to present the display interface of the application program 401 before the screenshot operation.
  • Specifically, the user can adjust the clipping area. As shown in FIG. 3C, the user can click the clipping box 32 and drag the clipping box 32 upward so that the clipping box 32 moves to the video playback area of the video application. As further shown in Figure 3D, when it is detected that the user clicks the "Small Window Display" button 31A, the application 401 can respond to the click operation of the "Small Window Display" button 31A and obtain the fourth coordinate information of the clipping area corresponding to the clipping box 32.
  • S433B The application program 401 sends the fourth coordinate information of the clipping area to the root view service 403.
  • the size of the clipping area corresponding to the clipping box 32 is the fourth coordinate information of the clipping area.
  • S434A The root view service 403 generates a fourth adjustment request according to the fourth coordinate information.
  • S434B The root view service 403 sends the fourth adjustment request to the window server 404.
  • the root view service 403 sends the size of the user's clipping area to the window server 404, so that the window server 404 creates a fourth floating window according to the size of the clipping area.
  • S435 The window server 404 creates a fourth floating window adapted to the fourth coordinate information according to the fourth adjustment request, and presents the display content in the clipped area separately in the fourth floating window.
  • For example, the window server 404 creates a floating window 33 (the fourth floating window) with the same size as the clipping box 32, and presents the display content of the clipping box 32 in Figure 3D separately in the floating window 33.
  • the restore button 33A is loaded to the bottom of the floating window 33 .
  • S436A The application 401 obtains the fifth coordinate information of the original window in response to the user's click on the restore button.
  • Specifically, when the user clicks the restore button, the fifth response event is triggered, that is, the fourth floating window is to be restored to the original window. For example, after detecting that the user clicks the "one-click restore" button 33A, the floating window 33 generates a response event to restore to the window 31.
  • S436B The application 401 sends the fifth coordinate information to the root view service 403.
  • Specifically, the size of the window 31 is the coordinate information of the window 31, that is, the fifth coordinate information of the original window.
  • S437A The root view service 403 generates a fifth adjustment request according to the fifth coordinate information.
  • S437B The root view service 403 sends the fifth adjustment request to the window server 404.
  • the root view service 403 sends the size of the original window to the window server 404, so that the window server 404 adjusts the fourth floating window according to the size of the original window.
  • S438 The window server 404 restores the fourth floating window to the original window adapted to the fifth coordinate information according to the fifth adjustment request, and restores the display interface of the original window.
  • For example, the window server 404 adjusts the floating window 33 (the fourth floating window) back to the window 31 (the original window), and restores the display interface of the window 31. A simplified code sketch of this screenshot-triggered flow is given below.
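  • To summarize the screenshot-triggered path in FIG. 5C, the Kotlin sketch below models how the clipping box drawn by the knuckle screenshot could be turned into a fourth floating window and later restored. As before, every class and method name here is a hypothetical stand-in chosen for illustration, not a real platform API or the disclosed implementation.

    // Hypothetical model of the FIG. 5C flow: knuckle screenshot -> clipping box ->
    // fourth floating window sized to the clip area -> one-click restore.
    data class ClipBox(val left: Int, val top: Int, val right: Int, val bottom: Int)

    class ScreenshotWindowServer {
        // S435: create the fourth floating window matching the clipping box.
        fun createClippedWindow(clip: ClipBox): String =
            "floating window 33: ${clip.right - clip.left}x${clip.bottom - clip.top}, showing only the clipped region"

        // S438: restore the original full-screen window.
        fun restoreOriginal(width: Int, height: Int): String =
            "window 31 restored: ${width}x${height}, showing the complete interface"
    }

    class ScreenshotController(private val server: ScreenshotWindowServer) {
        private val originalSize = 1080 to 2340 // "fifth coordinate information" of the original window 31

        // S433A/S433B: the user adjusted the clip box and tapped "Small Window Display".
        fun onSmallWindowDisplay(clip: ClipBox) = println(server.createClippedWindow(clip))

        // S436A/S436B: the user tapped "one-click restore".
        fun onOneClickRestore() = println(server.restoreOriginal(originalSize.first, originalSize.second))
    }

    fun main() {
        val controller = ScreenshotController(ScreenshotWindowServer())
        controller.onSmallWindowDisplay(ClipBox(0, 400, 1080, 1008)) // clip box dragged over the video playback area
        controller.onOneClickRestore()
    }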
  • FIG. 6C shows a flowchart of the method steps of another implementation of the method, which includes the following steps:
  • S631 The application 401 responds to the user operation and obtains the clipping area in the original window.
  • For example, the application 401 responds to the user's operation of taking a screenshot of the display interface of the window 31 with a knuckle, such as the operation of the user drawing the clipping box 32 on the display interface of the window 31 with a knuckle, to obtain the clipping area of the clipping box 32.
  • S632 The root view service 403 creates a custom display button and a restore button.
  • For example, the custom display button 31A and the restore button 33A are created through the root view service 403 (FIG. 3E); a fourth response event is configured for the custom display button 31A, and a fifth response event is configured for the restore button 33A.
  • S633 The application 401 detects whether the button clicked by the user is the custom display button or the restore button.
  • Specifically, when the detected button click operation is on the custom display button, the flow jumps to step S634A; when the detected button click operation is on the restore button, the flow jumps to step S635A.
  • S634A Application 401 turns on the customized display function of the original window and obtains the clipped area of the original window.
  • Specifically, in response to a click of the custom display button, the application 401 triggers the fourth response event, that is, obtains the clipped area of the original window, so as to present the display content of the clipped area separately in the fourth floating window.
  • the application 401 can obtain the size of the clipping area corresponding to the clipping box 32 in response to the click operation of the "Small Window Display” button 31A.
  • S634B The window server 404 creates a fourth floating window based on the clipping area, and presents the display content of the clipping area separately in the fourth floating window.
  • For example, the window server 404 creates a floating window 33 (the fourth floating window) with the same size as the clipping box 32, and presents the display content of the clipping box 32 in Figure 3D separately in the floating window 33.
  • the restore button 33A is loaded to the bottom of the floating window 33 .
  • S635A In response to the user clicking the restore button, the application 401 triggers the fifth response event, that is, restores the display of the fourth floating window to the original window.
  • For example, after detecting that the user clicks the "one-click restore" button 33A, the floating window 33 generates a response event to restore to the window 31.
  • S635B The window server 404 restores the fourth floating window to the original window and restores the display interface of the original window.
  • the window server 404 adjusts the floating window 33 (the fourth floating window) to the window 31 (the original window), and restores the display interface of the window 31.
  • In this way, the user can present the display content of the area captured by the screenshot operation separately in the fourth floating window, or customize the clipping area and then present the display content of the clipping area separately in the fourth floating window, so that the display content of the fourth floating window meets the user's personalized needs. In addition, because the fourth floating window only presents part of the complete interface in the original window (the interface of the clipped area), the fourth floating window is smaller and does not easily block other application window interfaces on the electronic device.
  • the user can view the display interfaces of multiple application windows at the same time in the electronic device, so as to perform task operations on the corresponding loaded application programs in the multiple application windows.
  • In addition, the display interface of one of the application windows (such as the fourth floating window) displayed by the electronic device is the display interface of any area, selected by the user, of the display interface of any application, so that the display interface of the application window meets the interface display requirements of the application selected by the user, and the application window can be personalized for the current user of the electronic device to improve the user experience.
  • FIG. 7 shows a schematic diagram of the hardware structure of an electronic device 100 according to some embodiments of the present invention.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown in the figures, or combine some components, or separate some components, or arrange different components.
  • the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, and a battery 142 , Antenna 1, Antenna 2, mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headphone interface 170D, sensor module 180, button 190, motor 191, indicator 192, camera 193 , display screen 194, and subscriber identification module (subscriber identification module, SIM) card interface 195, etc.
  • The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the processor 110 may include one or more processing units.
  • The processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • different processing units can be independent devices or integrated in one or more processors.
  • the controller can generate operation control signals based on the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • the processor 110 can execute instructions corresponding to the display methods provided in the foregoing embodiments.
  • the processor 110 may also be provided with a memory for storing instructions and data.
  • the memory in processor 110 is cache memory. This memory may hold instructions or data that have been recently used or recycled by processor 110 . If the processor 110 needs to use the instruction or data again, it can be directly called from the above-mentioned memory. Repeated access is avoided and the waiting time of the processor 110 is reduced, thus improving the efficiency of the system.
  • processor 110 may include one or more interfaces.
  • The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • The I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • processor 110 may include multiple sets of I2C buses.
  • the processor 110 can separately couple the touch sensor 180K, charger, flash, camera 193, etc. through different I2C bus interfaces.
  • the processor 110 can be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to implement the touch function of the electronic device 100 .
  • the I2S interface can be used for audio communication.
  • processor 110 may include multiple sets of I2S buses.
  • the processor 110 can be coupled with the audio module 170 through the I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface to implement the function of answering calls through a Bluetooth headset.
  • the PCM interface can also be used for audio communications to sample, quantize and encode analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface to implement the function of answering calls through a Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is generally used to connect the processor 110 and the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface to implement the function of playing music through a Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interfaces include camera serial interface (CSI), display serial interface (DSI), etc.
  • the processor 110 and the camera 193 communicate through the CSI interface to implement the shooting function of the electronic device 100 .
  • the processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100 .
  • the GPIO interface can be configured through software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 110 with the camera 193, display screen 194, wireless communication module 160, audio module 170, sensor module 180, etc.
  • The GPIO interface can also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, etc.
  • the USB interface 130 is an interface that complies with USB standard specifications, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, etc.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones to play audio through them. This interface can also be used to connect other electronic devices, such as AR devices, etc.
  • the interface connection relationship between the modules illustrated in the embodiment of the present invention is only a schematic illustration and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection methods in the above embodiments, or a combination of multiple interface connection methods.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through the wireless charging coil of the electronic device 100 . While the charging management module 140 charges the battery 142, it can also provide power to the electronic device 100 through the power management module 141.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, the wireless communication module 160, and the like.
  • the power management module 141 can also be used to monitor battery capacity, battery cycle times, battery health status (leakage, impedance) and other parameters.
  • the power management module 141 may also be provided in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 100 can be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • antenna 1 can be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, antennas may be used in conjunction with tuning switches.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G/6G applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, perform filtering, amplification and other processing on the received electromagnetic waves, and transmit them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor and convert it into electromagnetic waves through the antenna 1 for radiation.
  • at least part of the functional modules of the mobile communication module 150 may be disposed in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low-frequency baseband signal to be sent into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the application processor outputs sound signals through audio devices (not limited to speaker 170A, receiver 170B, etc.), or displays images or videos through display screen 194.
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 110 and may be provided in the same device as the mobile communication module 150 or other functional modules.
  • The wireless communication module 160 can provide wireless communication solutions applied on the electronic device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, etc.
  • The wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110, frequency modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), 5G and subsequent evolution standards, BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS) and/or satellite-based augmentation systems (SBAS).
  • the electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is an image processing microprocessor and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • the display screen 194 is used to display images, videos, etc.
  • Display 194 includes a display panel.
  • the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the electronic device 100 can implement the shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 193. For example, when taking a photo, the shutter is opened and the light is transmitted to the camera sensor through the lens. The light signal is converted into an electrical signal. The camera sensor passes the electrical signal to the ISP for processing and converts it into an image visible to the naked eye. ISP can also perform algorithm optimization on image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene. In some embodiments, the ISP may be provided in the camera 193.
  • Camera 193 is used to capture still images or video.
  • the object passes through the lens to produce an optical image that is projected onto the photosensitive element.
  • the photosensitive element can be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then passes the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other format image signals.
  • the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy.
  • Video codecs are used to compress or decompress digital video.
  • Electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
  • NPU is a neural network (NN) computing processor.
  • the NPU can realize intelligent cognitive applications of the electronic device 100, such as image recognition, face recognition, speech recognition, text understanding, etc.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100 .
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function. For example, music, video and other files are saved in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the internal memory 121 may include a program storage area and a data storage area.
  • the stored program area can store an operating system, at least one application program required for a function (such as a sound playback function, an image playback function, etc.).
  • the storage data area may store data created during use of the electronic device 100 (such as audio data, phone book, etc.).
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one disk storage device, flash memory device, universal flash storage (UFS), etc.
  • the processor 110 executes instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor to execute various functional applications and data processing of the electronic device 100 .
  • the internal memory 121 may be used to temporarily store instructions for the display methods provided by the foregoing embodiments.
  • the electronic device 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playback, recording, etc.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be connected to or separated from the electronic device 100 by inserting it into the SIM card interface 195 or pulling it out from the SIM card interface 195 .
  • the electronic device 100 can support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card, etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the multiple cards may be the same or different.
  • the SIM card interface 195 is also compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communications.
  • the electronic device 100 employs an eSIM, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100 .
  • It should be noted that each unit/module mentioned in each device embodiment of the present invention is a logical unit/module. Physically, a logical unit/module may be a physical unit/module, may be part of a physical unit/module, or may be implemented as a combination of multiple physical units/modules; the physical implementation of these logical units/modules is not the most important, and the combination of functions implemented by these logical units/modules is what solves the key technical problems raised by the present invention. In addition, the above device embodiments of the present invention do not introduce units/modules that are not closely related to solving the technical problems raised by the present invention, but this does not mean that other units/modules do not exist in the above device embodiments.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Digital Computer Display Output (AREA)
  • Telephone Function (AREA)

Abstract

The present invention relates to the technical field of mobile terminals. Disclosed are a display method, a storage medium, and an electronic device. In the method, a custom display function is added to an application window. When an electronic device detects a user operation on the application window, if it detects an operation of the user adding a certain application as a floating window or an operation of the user taking a screenshot of the application window, the electronic device can display a selection box in the floating window or screenshot window and, in response to a frame-selection operation performed by the user on a certain area of the display interface of the floating window or screenshot window by means of the selection box, present the display content of the frame-selected area separately in a floating window of the same size as the selection box, so that the display content of the floating window meets the individual needs of the user. Moreover, the floating window presents only part of the complete interface of the application, so that the floating window is relatively small and does not easily block the display interface of another application window.
PCT/CN2023/087891 2022-08-19 2023-04-12 Procédé d'affichage, support de stockage et dispositif électronique WO2024036998A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211001156.2 2022-08-19
CN202211001156.2A CN117632329A (zh) 2022-08-19 2022-08-19 显示方法、存储介质及电子设备

Publications (1)

Publication Number Publication Date
WO2024036998A1 true WO2024036998A1 (fr) 2024-02-22

Family

ID=89940535

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/087891 WO2024036998A1 (fr) 2022-08-19 2023-04-12 Procédé d'affichage, support de stockage et dispositif électronique

Country Status (2)

Country Link
CN (1) CN117632329A (fr)
WO (1) WO2024036998A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130007577A1 (en) * 2011-06-28 2013-01-03 International Business Machines Corporation Drill-through lens
CN104123078A (zh) * 2014-08-12 2014-10-29 广州三星通信技术研究有限公司 输入信息的方法和设备
CN104778037A (zh) * 2015-03-19 2015-07-15 小米科技有限责任公司 应用程序的窗口小部件显示方法及装置
CN114779977A (zh) * 2022-04-26 2022-07-22 维沃移动通信有限公司 界面显示方法、装置、电子设备及存储介质
CN114879880A (zh) * 2021-02-05 2022-08-09 华为技术有限公司 电子设备及其应用的显示方法和介质


Also Published As

Publication number Publication date
CN117632329A (zh) 2024-03-01

Similar Documents

Publication Publication Date Title
WO2020233553A1 (fr) Procédé de photographie et terminal
US11450322B2 (en) Speech control method and electronic device
WO2021013158A1 (fr) Procédé d'affichage et appareil associé
WO2021129326A1 (fr) Procédé d'affichage d'écran et dispositif électronique
WO2021103981A1 (fr) Procédé et appareil de traitement d'affichage à écran divisé, et dispositif électronique
US11669242B2 (en) Screenshot method and electronic device
WO2021017889A1 (fr) Procédé d'affichage d'appel vidéo appliqué à un dispositif électronique et appareil associé
US11385857B2 (en) Method for displaying UI component and electronic device
WO2021036571A1 (fr) Procédé d'édition de bureau et dispositif électronique
WO2021213164A1 (fr) Procédé d'interaction entre des interfaces d'application, dispositif électronique et support de stockage lisible par ordinateur
WO2021036585A1 (fr) Procédé d'affichage sur écran souple, et dispositif électronique
WO2020221063A1 (fr) Procédé de commutation entre une page parent et une sous-page, et dispositif associé
WO2021082835A1 (fr) Procédé d'activation de fonction et dispositif électronique
WO2021013132A1 (fr) Procédé d'entrée et dispositif électronique
WO2022017393A1 (fr) Système d'interaction d'affichage, procédé d'affichage, et dispositif
US20220358089A1 (en) Learning-Based Keyword Search Method and Electronic Device
WO2020155875A1 (fr) Procédé d'affichage destiné à un dispositif électronique, interface graphique personnalisée et dispositif électronique
US20230168802A1 (en) Application Window Management Method, Terminal Device, and Computer-Readable Storage Medium
US20230276125A1 (en) Photographing method and electronic device
WO2023056795A1 (fr) Procédé de photographie rapide, dispositif électronique, et support de stockage lisible par ordinateur
US20230353862A1 (en) Image capture method, graphic user interface, and electronic device
WO2023130921A1 (fr) Procédé de disposition de page adapté à de multiples dispositifs, et dispositif électronique
WO2020233593A1 (fr) Procédé d'affichage d'élément de premier plan et dispositif électronique
WO2023207667A1 (fr) Procédé d'affichage, véhicule et dispositif électronique
WO2021204103A1 (fr) Procédé de prévisualisation d'images, dispositif électronique et support de stockage

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23853909

Country of ref document: EP

Kind code of ref document: A1