WO2022166713A1 - Electronic device and display method therefor, and medium


Info

Publication number
WO2022166713A1
WO2022166713A1 (PCT/CN2022/074024; CN2022074024W)
Authority
WO
WIPO (PCT)
Prior art keywords
display, application, area, user, mobile phone
Prior art date
Application number
PCT/CN2022/074024
Other languages
English (en)
Chinese (zh)
Inventor
胡颖峰
王红军
刘诗聪
崔擎誉
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司
Publication of WO2022166713A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F 3/0484 Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0487 Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Definitions

  • The present application relates to graphic-interface display technology in the field of electronic equipment, and in particular to an electronic device and a display method and medium for its applications.
  • In some scenarios, the user needs to pay attention to content changes in one or more other applications. For example, after starting a navigation application, the user wants to chat in an instant chat application while continuously following changes to the navigation route in the navigation application. In this case, the user has to switch back and forth between the navigation application and the instant chat application, which is cumbersome.
  • The purpose of this application is to provide an electronic device and a display method and medium for its applications. The electronic device displays the content of the display interface of a first application, via a floating window, in the display interface of a second application or on the desktop, so that the user can follow changes in the first application while using the second application, without switching between the two applications.
  • A first aspect of the present application provides an application display method, including: displaying a first display interface of a first application on a screen of an electronic device; and displaying, on the screen, a second display interface of a second application and a first display area of the first application, wherein a part of the second display interface is blocked by the first display area, and the first display area is a part of the first display interface.
  • That is, the electronic device opens the first application and the second application, selects the first display area from the first display interface of the first application, and displays the first display area in the second display interface of the second application; the first display area is smaller than the second display interface.
  • the electronic device may be a mobile phone
  • the first application may be a navigation application
  • the second application may be an instant chat application.
  • the first display interface may be a display interface of a navigation application
  • the second display interface may be a display interface of an instant chat application
  • the first display area may be an area of navigation information displayed in the navigation application.
  • The mobile phone displays the area of navigation information within the display interface of the instant chat application, while the parts of the navigation interface outside that area are not visible.
  • In a possible implementation, before the second display interface of the second application and the first display area of the first application are displayed on the screen, the electronic device may prompt the user whether to enable multi-application display on the second display interface of the currently running second application.
  • the user's selection of multi-application display includes the user clicking on an icon used to display the first application in a floating manner.
  • That is, the electronic device may prompt the user, in the form of an icon on the second display interface of the currently running second application, to confirm the multi-application display.
  • the first display area is selected from the first display interface.
  • the first display area is determined according to the user's selection result.
  • determining the first display area according to the user's selection result includes:
  • the first display area is determined according to the trajectory formed by the user's gesture operation on the first display interface.
  • If the trajectory does not enclose a closed area, the first display area is determined after the unclosed area is completed into a closed one.
  • the first application may be a navigation application
  • the first display interface may be the display interface of the navigation application
  • The first display area may be the area of navigation information displayed in the navigation application, that is, a local area of the display interface.
  • the user can click the "Partial Window" button in the control center of the mobile phone to enter the selection interface.
  • the user can perform a selection gesture on the display interface of the navigation application in the selection interface, that is, a gesture operation, and the mobile phone determines the local area according to the trajectory of the selection gesture.
  • If the trajectory formed by the user's selection gesture is an open arc, the mobile phone can complete it by directly connecting the endpoints of the gap.
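  • The gap-completion step described above can be illustrated with a minimal sketch (Python is used purely for illustration; the point representation, the direct endpoint connection, and the bounding-box selection rule are assumptions, not the patent's actual implementation):

```python
def complete_trajectory(points):
    """Close an open gesture stroke by connecting its endpoints.

    points: list of (x, y) touch samples along the user's stroke.
    If the stroke does not end where it began, the start point is
    appended, which is the "direct connection" of the gap that the
    text describes.
    """
    if not points:
        return []
    if points[0] != points[-1]:  # unclosed trajectory: bridge the gap
        return points + [points[0]]
    return points


def bounding_area(polygon):
    """Axis-aligned bounding box of the closed region, usable as the
    rectangular display area for the floating window."""
    xs = [p[0] for p in polygon]
    ys = [p[1] for p in polygon]
    return (min(xs), min(ys), max(xs), max(ys))
```

For an arc-shaped stroke such as `[(0, 0), (100, 0), (100, 80)]`, the completed polygon ends back at `(0, 0)` and its bounding box `(0, 0, 100, 80)` could serve as the selected display area.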
  • In a possible implementation, the size and position of the first display area can be adjusted in response to the user's adjustment operation, which includes at least one of the following: a gesture to expand the displayed content, a gesture to shrink the displayed content, or a gesture to move the first display area.
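  • The expand/shrink/move adjustments can be sketched as simple rectangle transforms (an illustrative Python sketch only; the rectangle encoding and the centre-anchored scaling rule are assumptions, not the patent's implementation):

```python
def scale_area(area, factor):
    """Expand (factor > 1) or shrink (factor < 1) the display area
    about its centre, as a pinch gesture would."""
    left, top, right, bottom = area
    cx, cy = (left + right) / 2, (top + bottom) / 2
    half_w = (right - left) / 2 * factor
    half_h = (bottom - top) / 2 * factor
    return (cx - half_w, cy - half_h, cx + half_w, cy + half_h)


def move_area(area, dx, dy):
    """Translate the display area by a drag gesture's displacement."""
    left, top, right, bottom = area
    return (left + dx, top + dy, right + dx, bottom + dy)
```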
  • the first application may be a navigation application
  • the mobile phone may detect the area of the displayed navigation information contained in the navigation application through the view system, and determine the area as the first display area.
  • The number of picture changes in a detection area is counted, and when the number of changes exceeds a preset change threshold, the area is determined to be an area whose display content changes.
  • One way for the mobile phone to detect, through the view system, the area of displayed navigation information contained in the navigation application is to detect the refresh frequency of the displayed content in the navigation application, that is, to determine whether the number of picture changes in an area exceeds a preset refresh threshold.
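  • The change-counting idea can be sketched as follows (an illustrative Python sketch, not the patent's implementation; representing each frame as a dict of per-region pixel digests and the default threshold value are assumptions):

```python
def find_dynamic_areas(frames, regions, change_threshold=3):
    """Pick the regions whose content changes most often.

    frames: successive screen captures, each a dict mapping a region
    name to that region's pixel digest (any hashable summary).
    regions: candidate region names from the view hierarchy.
    A region whose digest changes more than change_threshold times
    across the frames is treated as dynamically updating content,
    e.g. the navigation-information area.
    """
    counts = {r: 0 for r in regions}
    for prev, cur in zip(frames, frames[1:]):  # compare adjacent frames
        for r in regions:
            if prev[r] != cur[r]:
                counts[r] += 1
    return [r for r, c in counts.items() if c > change_threshold]
```

Here a region such as a static menu would keep the same digest frame after frame and be skipped, while a live map area would exceed the threshold and be chosen as the first display area.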
  • the first display area is displayed on the screen by means of a floating window.
  • the first display area is displayed in a floating manner on the second display interface of the second application.
  • the first display area is displayed on the screen by means of a floating window in the following manner:
  • the view system of the mobile phone sets the navigation application on the first layer, sets the instant chat application on the second layer, and the first layer covers the second layer.
  • The areas of the first layer outside the navigation-information area of the navigation application are set to be transparent. This displays both the navigation application's navigation information and the instant chat application.
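  • The two-layer arrangement with transparency outside the selected area can be sketched as a toy compositor (an illustrative Python sketch; the pixel-grid representation and half-open rectangle convention are assumptions, not how an actual view system composites layers):

```python
def composite(first_layer, second_layer, visible_area):
    """Composite two same-sized layers given as nested lists of pixels.

    first_layer sits on top; every pixel of it outside visible_area is
    treated as transparent, so the second layer shows through there,
    while visible_area shows the first layer's content.
    visible_area is (left, top, right, bottom), half-open.
    """
    left, top, right, bottom = visible_area
    out = []
    for y, row in enumerate(second_layer):
        out_row = []
        for x, pixel in enumerate(row):
            if left <= x < right and top <= y < bottom:
                out_row.append(first_layer[y][x])  # floated content
            else:
                out_row.append(pixel)              # underlying app
        out.append(out_row)
    return out
```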
  • In response to a user's first change operation in the first display area, the first display content in the first display area is changed to the second display content.
  • When a second change operation performed by the user in an area outside the first display area is received, an instruction corresponding to the second change operation is transmitted to the second layer.
  • In response to the instruction corresponding to the second change operation, the third display content in the second display interface is changed to the fourth display content.
  • For example, in response to the first change operation, the display content of the navigation application is expanded, that is, the first display content is changed to the second display content.
  • In response to the second change operation, the display content of the display interface of the mobile phone's instant chat application moves upward, that is, the third display content is changed to the fourth display content.
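  • The routing of change operations to the correct layer amounts to a hit test against the floating window's area, which can be sketched as follows (an illustrative Python sketch; the `Layer` class, its `on_touch` callback, and the local-coordinate conversion are assumptions introduced for the example):

```python
class Layer:
    """Stand-in for a layer hosting one application's window."""

    def __init__(self, name):
        self.name = name

    def on_touch(self, x, y):
        # A real layer would forward the event to its application's
        # view hierarchy; here we just report who received it.
        return (self.name, x, y)


def dispatch_touch(event_xy, floating_area, first_layer, second_layer):
    """Route a touch event to the layer that should handle it.

    Events inside the floating window's display area go to the first
    layer (the floated application), converted to that window's local
    coordinates; events outside it pass through to the second layer.
    floating_area is (left, top, right, bottom), half-open.
    """
    x, y = event_xy
    left, top, right, bottom = floating_area
    if left <= x < right and top <= y < bottom:
        return first_layer.on_touch(x - left, y - top)
    return second_layer.on_touch(x, y)
```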
  • the first display area is displayed on the screen in a picture-in-picture manner.
  • a second aspect of the present application provides an electronic device, characterized in that it includes:
  • The processor is coupled to the memory, and when the program instructions stored in the memory are executed by the processor, the electronic device executes the application display method provided by the aforementioned first aspect.
  • A third aspect of the present application provides a readable medium in which instructions are stored, characterized in that, when the instructions are executed on an electronic device, the electronic device is caused to execute the application display method provided by the aforementioned first aspect.
  • FIG. 1( a ) shows an example of a display interface of a navigation application of an electronic device according to an embodiment of the present application
  • Fig. 1(b) shows an example in which a display interface of a navigation application of an electronic device is suspended in a display interface of an instant chat application according to an embodiment of the present application;
  • FIG. 2 shows a block diagram of a hardware structure of an electronic device according to an embodiment of the present application
  • FIG. 3 shows a block diagram of a software structure of an electronic device according to an embodiment of the present application
  • FIG. 4 shows a method flowchart of a method for displaying an application of a mobile phone according to an embodiment of the present application
  • Fig. 5(a) shows an example of a guidance interface supporting the display of a floating window of a navigation application according to an embodiment of the present application
  • Fig. 5(b) shows an example of a prompt operation of a guidance interface of a navigation application according to an embodiment of the present application
  • Fig. 5(c) shows an example of a display result of a guidance interface of a navigation application according to an embodiment of the present application
  • FIG. 6( a ) shows an example of a mobile phone acquiring a selection gesture of a display interface of a navigation application according to an embodiment of the present application
  • Fig. 6(b) shows an example of a preview interface of a partial area of a display interface of a navigation application according to an embodiment of the present application
  • Fig. 6(c) shows an example of an instant chat application in which a local area of a navigation application is suspended and displayed according to an embodiment of the present application
  • FIG. 7( a ) shows an example of display content in a floating window of a navigation application according to an embodiment of the present application
  • FIG. 7(b) shows an example of changes in the displayed content in the floating window of the navigation application according to an embodiment of the present application
  • FIG. 8( a ) shows an example of a user performing an operation gesture in a floating window of a navigation application according to an embodiment of the present application
  • Figure 8(b) shows another example of a user performing an operation gesture in a floating window of a navigation application according to an embodiment of the present application
  • FIG. 8( c ) shows an example of changes in the displayed content in the floating window of the navigation application according to an embodiment of the present application
  • Figure 9(a) shows an example of a user performing an operation gesture in the display interface of the instant chat application according to an embodiment of the present application
  • FIG. 9(b) shows an example of changes in the displayed content in the display interface of the instant chat application according to the embodiment of the present application.
  • Figure 10(a) shows an example of a user performing a downward swipe gesture on a display interface of a video playback application according to an embodiment of the present application
  • Fig. 10(b) shows an example of the user clicking the "partial window" button on the display interface of the control center of the mobile phone to enter a selection interface for partial area selection according to an embodiment of the present application
  • Fig. 10(c) shows an example of adjusting the preview frame on the selection interface selected by the user in the local area according to an embodiment of the present application
  • FIG. 10(d) shows an example of a user determining a partial area of a display interface of a video playback application according to an embodiment of the present application
  • FIG. 10(e) shows an example of an instant chat application in which a local area of a video playback application is suspended and displayed according to an embodiment of the present application
  • Fig. 11(a) shows an example of a dynamically changing area in a display interface of a mobile phone identification navigation application according to an embodiment of the present application
  • FIG. 11( b ) shows an example of an instant chat application in which a local area of a navigation application is suspended and displayed according to an embodiment of the present application
  • Fig. 12(a) shows an example in which a user clicks a button displayed in a floating window on a display interface of a video playback application to start a floating display according to an embodiment of the present application
  • Figure 12 (b) shows an example in which the user clicks a button to exit the floating window display in the floating window of the video playback application according to an embodiment of the present application to close the floating display;
  • Fig. 13(a) shows an example of a mobile phone prompting a user that the display content of a navigation application running in the background has changed according to an embodiment of the present application, and the user confirms that the changed display content is displayed;
  • Fig. 13(b) shows an example of the display content of the navigation application displayed by the mobile phone prompting the user through a floating window according to an embodiment of the present application
  • Fig. 13(c) shows an example of an instant chat application in which a local area of a navigation application is suspended and displayed according to an embodiment of the present application
  • Figure 14(a) shows an example of a user performing a downward swipe gesture on a display interface of an instant chat application according to an embodiment of the present application
  • Figure 14(b) shows an example of a user clicking the "Application Floating Display" button on the display interface of the control center of the mobile phone to enter a list interface of applications running in the background according to an embodiment of the present application;
  • Figure 14(c) shows an example of a list interface of applications running in the background according to an embodiment of the present application
  • Fig. 14(d) shows an example of an application displayed by a user selecting a hover display according to an embodiment of the present application
  • FIG. 14(e) shows an example of a video playback application and an instant chat application in which a partial area of a navigation application is suspended and displayed according to an embodiment of the present application
  • FIG. 15 shows an example in which a navigation application, an instant chat application, a reading application and a desktop are displayed through multiple windows according to an embodiment of the present application
  • FIG. 16 shows an example in which a navigation application is displayed in an equal-scale scaling manner according to an embodiment of the present application
  • Fig. 17(a) shows an example of a display interface of a video playback application according to an embodiment of the present application
  • FIG. 17(b) shows an example of a video playback application in which a local area of a video playback application is suspended and displayed according to an embodiment of the present application
  • FIG. 18 shows a block diagram of the hardware structure of the electronic device in the embodiment of the present application.
  • Embodiments of the present application include, but are not limited to, an electronic device and a display method and medium for its application.
  • FIGS. 1(a)-(b) provide a display scenario of an application of an electronic device according to an embodiment of the present application.
  • In FIGS. 1(a)-(b), the first application is an instant chat application 102 and the second application is a navigation application 101.
  • FIG. 1( a ) shows the display interface of the navigation application 101 .
  • the electronic device 100 will display the navigation application 101 in a suspended manner on other interfaces.
  • When the user switches to the instant chat application 102, as shown in FIG. 1(b), the view system of the electronic device 100 suspends the entire display interface of the navigation application 101, as a small-sized floating window, over the display interface of the instant chat application 102, and the display interface of the navigation application 101 is displayed in the floating window.
  • The floating window can also be dragged on the screen of the electronic device 100, based on the user's gesture operation, to change its position.
  • FIG. 2 shows another scene in which two applications are simultaneously displayed on the screen of the electronic device 100 in an embodiment of the present application.
  • the view system of the electronic device 100 suspends a part of the display area 1011 of the navigation application 101 on the display of the instant chat application 102 as a floating window in the form of a small-sized window.
  • the part of the display area 1011 of the navigation application 101 displayed in the floating window is the part of the navigation application 101 where the display content changes with time.
  • the electronic device 100 may also receive a gesture operation performed by the user on the display interface of the navigation application 101 in the floating window, so as to change the display content of the display interface of the navigation application 101 in the floating window.
  • the user can also click the “close” button 1018 or the “full screen display” button 1019 in the partial display area 1011 , so that the navigation application 101 is closed or displayed in full screen on the screen of the electronic device 100 .
  • The electronic device 100 can display, in real time and by means of a floating window, the content of the first application's display interface that the user needs to follow within the display interface of the second application or the desktop, so that the user can conveniently follow changes in the first application while using the second application, without switching between the two applications.
  • Moreover, because only the area the user pays attention to is displayed, that area can be shown to the user more clearly, avoiding the problem of the floating window's content becoming unclear when the entire display interface of the first application is shrunk to the floating window's reduced size.
  • The first application and the second application here may be third-party applications installed on the electronic device 100, or may be system applications of the electronic device 100; for example, they may be applications of the electronic device 100 such as the desktop and settings.
  • The electronic device 100 may be any of various electronic devices, including, but not limited to, a laptop computer, a desktop computer, a tablet computer, a mobile phone, a server, a wearable device, a head-mounted display, a mobile email device, a portable game console, a portable music player, a reader device, or other electronic devices capable of accessing a network.
  • the embodiments of the present application may also be applied to wearable devices worn by users. For example, smart watches, bracelets, jewelry (eg, devices made into decorative items such as earrings, bracelets, etc.) or glasses, or the like, or as part of a watch, bracelet, jewelry, or glasses, or the like.
  • various applications may be installed on the electronic device 100 , such as video conference applications, instant chat applications, video playback applications, and navigation applications.
  • FIG. 3 is a block diagram of a software structure of an electronic device 100 according to an embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, an Android runtime (Android runtime) and a system library, and a kernel layer.
  • the application layer can include a series of application packages.
  • the application package can include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message and so on.
  • the application package may include a startup navigation application 101 , an instant chat application 102 , a reading application 103 , a desktop 104 and a video playing application 105 .
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, an application service management module, a gesture recognition module, and the like.
  • the gesture recognition module is used to recognize the gesture operation performed by the user on the screen of the electronic device 100 .
  • a window manager is used to manage window programs.
  • the window manager can get the display size, determine whether there is a status bar, lock the screen, take screenshots, etc.
  • Content providers are used to store and retrieve data and make these data accessible to applications.
  • the data may include video, images, audio, calls made and received, browsing history and bookmarks, phone book, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on. View systems can be used to build applications.
  • The display interface can be composed of one or more views, where each view displays the visual controls in its area and handles events that occur in that area. For example, a view can handle events corresponding to gestures generated in the area where the view is located.
  • For example, the display interface of a navigation application may include a view of navigation content and a view of navigation information, and the display interface of an instant chat application may include a view of chat content and a view of a menu.
  • In the embodiment of the present application, the view system can calculate, from the views of the application's display interface and according to the recognized trajectory of the user's gesture operation, the local area corresponding to the trajectory, use that local area as the display area, and display it by means of a floating window.
  • the view system may also automatically determine the display area in the application's display interface.
  • The view system can also place the two applications in separate layers, one above the other, with each layer hosting the window corresponding to one application.
  • the phone manager is used to provide the communication function of the electronic device 100 .
  • It manages call status (including connecting, hanging up, etc.).
  • the resource manager provides various resources for the application, such as localization strings, icons, pictures, layout files, video files and so on.
  • the notification manager enables applications to display notification information in the status bar, which can be used to convey notification-type messages, and can disappear automatically after a brief pause without user interaction. For example, the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also display notifications in the status bar at the top of the system in the form of graphs or scroll bar text, such as notifications of applications running in the background, and notifications on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a prompt sound is issued, the electronic device vibrates, and the indicator light flashes.
  • the application business management module can manage the connection between the application and the server.
  • Android Runtime includes core libraries and a virtual machine. Android runtime is responsible for scheduling and management of the Android system.
  • The core library consists of two parts: one is the function interfaces that the Java language needs to call, and the other is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, safety and exception management, and garbage collection.
  • a system library can include multiple functional modules. For example: surface manager (surface manager), media library (Media Libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the Surface Manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display drivers, camera drivers, audio drivers, and sensor drivers.
  • the technical solution of the display method of the application of the mobile phone 100 includes:
  • the mobile phone 100 starts the navigation application 101, and prompts the user that the navigation application 101 supports displaying the display interface of the navigation application 101 by means of a floating window.
  • After the user turns on the mobile phone 100, the user can click the "Navigation" icon in the user interface (UI) of the mobile phone 100; after receiving the user's click operation, the mobile phone 100 starts the navigation application 101.
• the display interface of the navigation application 101 displays a guide interface 1013 indicating that the application supports being displayed by means of a floating window.
• for example, the guide interface 1013 may display the prompt message "Select an area in the display interface to display it in the floating window" together with the "Continue" and "Cancel" buttons.
  • the guidance interface 1013 of the navigation application 101 exits.
  • the guidance interface 1013 may prompt the user through animation to display the operation method of displaying the display interface of the navigation application 101 through a floating window.
• the guidance interface 1013 prompts the user to perform a gesture operation on the display interface of the navigation application 101 to select a display area and to click the "small window display" button 1016, as shown in Fig. 5(c), so that the display area 1011 is displayed by means of a floating window. It can be understood that the display area 1011 in Fig. 5(c) also includes two buttons, "close" and "full screen display". After the user clicks the "close" button 1018 in the display area 1011, the display area 1011 is closed. After the user clicks the "full screen display" button 1019 in the display area 1011, the navigation application 101 is displayed full screen on the screen of the mobile phone 100. After the user clicks the "got it" button 1020 in the guidance interface 1013 of Fig. 5(c), the guidance interface 1013 of the navigation application 101 is closed.
  • the mobile phone 100 may also start the instant chat application 102 in advance. After the mobile phone 100 determines the display area 1011 displayed in the form of a floating window in the navigation application 101 , the display area 1011 can be displayed on the display interface of the instant chat application 102 .
• S402: the mobile phone 100 determines the display area 1011 in the display interface of the navigation application 101.
  • the display area 1011 in the display interface of the navigation application 101 may be a partial area in the display interface of the navigation application 101 , and the partial area may be determined by the mobile phone 100 according to the selection gesture performed by the user on the display interface of the navigation application 101 .
• the gesture recognition module of the mobile phone 100 can recognize a selection gesture performed by the user on the display interface of the navigation application 101; the gesture recognition module obtains the trajectory of the selection gesture and sends the area selected by the selection gesture to the view system of the mobile phone 100. As shown in FIG. 6(c), the view system of the mobile phone 100 determines that area as the display area 1011 in the display interface of the navigation application 101.
  • the view system of the mobile phone 100 can also determine the size and position of the display area 1011 in the display interface of the navigation application 101 .
• for example, the size of the display interface of the navigation application 101 may be 90 mm × 120 mm (length × width), and the view system of the mobile phone 100 may determine that the size of the display area 1011 corresponding to the trajectory of the user's selection gesture is 20 mm × 19 mm (length × width).
  • the position of the display area 1011 may be 10 mm from the upper border of the display interface of the navigation application 101 and 10 mm from the left border. The process of determining the display area 1011 in the display interface of the navigation application 101 by the mobile phone 100 will be described in detail below.
  • the mobile phone 100 may also automatically determine the display area 1011 in the display interface of the navigation application 101, and the implementation process of the method will also be described in detail below.
• S403: the mobile phone 100 displays the display area 1011 of the navigation application 101 in the form of a floating window.
• the view system of the mobile phone 100 enters the preview interface 120, and the preview interface 120 is provided with a "small window display" button 1202, which is used to set the display area 1011 to be displayed by means of a floating window.
• the view system of the mobile phone 100 displays the display area 1011 of the navigation application 101 at a display position on the display interface of the instant chat application 102 of the mobile phone 100 by means of a floating window. For example, the size of the display area 1011 determined by the user may be 20 mm × 19 mm (length × width), and the default display position may be 10 mm from the upper boundary and 10 mm from the right boundary of the display interface of the instant chat application 102 of the mobile phone 100. That is, the display area 1011 of the navigation application 101 is displayed at the upper right of the display interface of the instant chat application 102.
• the view system of the mobile phone 100 sets the navigation application 101 on the first layer and the instant chat application 102 of the mobile phone 100 on the second layer, with the first layer overlaying the second layer. That is, the mobile phone 100 runs the navigation application 101 in the first layer, and the mobile phone 100 runs the display interface of the instant chat application 102 in the second layer.
• the mobile phone 100 sets the display area 1011 of the navigation application 101 to be visible. For example, the area that is 20 mm from the upper border and 19 mm from the left border of the display interface of the navigation application 101 and whose size is 20 mm × 19 mm (length × width) is set as visible, and the area outside this display area is made transparent.
• the mobile phone 100 moves the position of the navigation application 101 in the first layer so that the display area 1011 is moved to the display position on the display interface of the instant chat application 102 of the mobile phone 100 in the second layer.
• the user can browse the display content of the display interface of the instant chat application 102 of the mobile phone 100 in the second layer through the area other than the display area 1011 of the navigation application 101. At the same time, the display area 1011 of the navigation application 101 is displayed on the display interface of the instant chat application 102 of the mobile phone 100 through the floating window, so the user can also browse the display content of the display area of the navigation application 101 in real time.
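• The layer arrangement described above reduces to simple rectangle arithmetic: the first layer is translated so that the visible display area lands at the target position over the second layer. The following Python sketch illustrates this under the dimensions used in the examples; the class and function names are hypothetical, not from the patent:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float  # mm from the left edge
    y: float  # mm from the top edge
    w: float  # width in mm
    h: float  # height in mm

def layer_offset(area_in_app: Rect, target_on_screen: Rect):
    """Translation applied to the first (top) layer so that the visible
    display area lands exactly at the target position over the second layer."""
    return (target_on_screen.x - area_in_app.x, target_on_screen.y - area_in_app.y)

# Display area: 10 mm from the top and left borders of the navigation
# interface, 20 mm x 19 mm in size.
area = Rect(x=10, y=10, w=20, h=19)
# Target: 10 mm from the top and 10 mm from the right border of an
# (assumed) 90 mm-wide chat interface, so x = 90 - 10 - 20 = 60.
target = Rect(x=60, y=10, w=20, h=19)
dx, dy = layer_offset(area, target)
print(dx, dy)  # 50 0
```

Any touch point or drawing coordinate in the first layer would be shifted by the same (dx, dy) offset.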
  • the display position of the display interface of the instant chat application 102 of the mobile phone 100 may be a parameter preset in the storage area of the mobile phone 100 .
  • the display content of the navigation application 101 in the display area 1011 will change in real time. That is, after the mobile phone 100 sets the navigation application 101 on the first layer, the navigation application 101 is still in the running state. Therefore, the display content in the display interface of the navigation application 101 also changes in real time.
• since the display content of the display area 1011 belongs to the display content of the display interface of the navigation application 101, the display content of the navigation application 101 in the display area 1011 on the display interface of the instant chat application 102 of the mobile phone 100 will also change in real time. For example, as shown in the figures, when the navigation information in the display interface of the navigation application 101 changes (for instance, the route of the navigation information changes after the vehicle has been driving for a period of time), the changes are updated in real time in the display area 1011 on the display interface of the instant chat application 102 of the mobile phone 100.
• S404: the mobile phone 100 determines whether a gesture operation performed by the user in the display area 1011 of the navigation application 101 is detected.
• After the mobile phone 100 receives a gesture operation performed by the user in the display area of the navigation application 101, S405 is performed: the mobile phone 100 changes the display content in the display area in response to the gesture operation.
• S405: the mobile phone 100 changes the display content in the display area 1011.
• the navigation application 101 of the mobile phone 100 runs in the first layer, and the first layer is overlaid on the second layer. Therefore, the first layer is the current view on the screen of the mobile phone 100; that is to say, gesture operations performed by the user on the screen of the mobile phone 100 all act on the first layer. If the user's gesture operation acts on the display area 1011 of the navigation application 101 of the mobile phone 100, it is equivalent to acting on the display interface of the navigation application 101. After the gesture recognition module of the mobile phone 100 recognizes the gesture operation, the navigation application 101 can respond to the instruction corresponding to the gesture operation, so that the display content of the navigation application 101 in the display area 1011 of the navigation application 101 of the mobile phone 100 is changed.
• for example, the user performs a gesture to enlarge the display content of the navigation application 101. After the gesture recognition module of the mobile phone 100 recognizes the gesture operation, the navigation application 101 responds to the gesture, the display content of the navigation application 101 in the display area 1011 of the navigation application 101 of the mobile phone 100 is enlarged, and the gesture does not affect the display interface of the instant chat application 102.
• as another example, the user performs a pull-down gesture on the display content of the navigation application 101 in the display area 1011 of the navigation application 101 of the mobile phone 100. After the gesture recognition module of the mobile phone 100 recognizes the gesture, the navigation application 101 responds to it, and the display content of the navigation application 101 in the display area 1011 of the navigation application 101 moves downward.
  • the gesture will not have any effect on the display interface of the instant chat application 102 .
• S406: the mobile phone 100 determines whether a gesture operation performed by the user in an area other than the display area 1011 of the navigation application 101 is detected.
• If such a gesture operation is detected, S407 is executed: the mobile phone 100 changes the display content of the instant chat application 102 in response to the gesture operation.
• S407: the mobile phone 100 changes the display content of the instant chat application 102.
• When the mobile phone 100 determines that the user performs a gesture operation in an area other than the display area 1011 of the navigation application 101 of the mobile phone 100, the mobile phone 100, after acquiring the instruction corresponding to the gesture operation and determining that it is not executed in the first layer, transmits the instruction corresponding to the gesture operation to the display interface of the instant chat application 102 of the mobile phone 100 on the second layer. The display interface of the instant chat application 102 of the mobile phone 100 can respond to the instruction corresponding to the gesture operation, so that the display content of the display interface of the instant chat application 102 of the mobile phone 100 changes.
• For example, the user executes a bottom-up swipe gesture. After acquiring the instruction corresponding to the gesture operation, the mobile phone 100 determines that it is executed in an area other than the display area 1011 of the navigation application 101 of the mobile phone 100, and transfers the instruction corresponding to the bottom-up swipe gesture to the display interface of the instant chat application 102 of the mobile phone 100 on the second layer. The display interface of the instant chat application 102 of the mobile phone 100 responds to the instruction corresponding to the bottom-up swipe gesture, and the display content of the display interface of the instant chat application 102 of the mobile phone 100 moves upward accordingly.
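• The dispatch logic of S404 to S407 amounts to a simple hit test: a touch that falls inside the floating display area is routed to the application shown in the floating window, and a touch outside it is forwarded to the application on the layer underneath. The following Python sketch only illustrates this idea; the function names and coordinates are hypothetical:

```python
def contains(area, x, y):
    """area = (left, top, width, height) of the floating display area on screen."""
    ax, ay, aw, ah = area
    return ax <= x < ax + aw and ay <= y < ay + ah

def dispatch_gesture(area, x, y):
    """Return which application should handle a touch at (x, y)."""
    # S404/S405: inside the display area, the gesture acts on the
    # application shown in the floating window (the navigation app).
    if contains(area, x, y):
        return "navigation"
    # S406/S407: outside the display area, the instruction is forwarded
    # to the application on the second layer (the instant chat app).
    return "chat"

area = (60, 10, 20, 19)                # 20 x 19 area at (60, 10)
print(dispatch_gesture(area, 65, 15))  # navigation
print(dispatch_gesture(area, 30, 80))  # chat
```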
  • a process of determining the display area 1011 in the display interface of the navigation application 101 by the mobile phone 100 described in S402 will be specifically described below.
• S402a: the mobile phone 100 receives a selection gesture performed by the user on the display interface of the navigation application 101.
• the user can manually select a partial area of the display interface of the navigation application 101 through a selection gesture, and the gesture recognition module of the mobile phone 100 can recognize the trajectory of the selection gesture.
  • the display interface of the navigation application 101 here may be a display interface that displays changes in navigation information, such as a navigation route.
  • the user can select a local area on the display interface of the navigation application 101 on the screen of the mobile phone 100 through a selection gesture of sliding with a knuckle. In this way, conflicts with default gestures supported by the navigation application 101 can be avoided.
  • the gesture recognition module may be any existing module for realizing gesture recognition or finger joint recognition, which is not limited herein.
• S402b: the mobile phone 100 selects a local area in the display interface of the navigation application 101 according to the trajectory of the selection gesture.
• After recognizing the trajectory of the user's selection gesture, the gesture recognition module of the mobile phone 100 sends it to the view system of the mobile phone 100. The view system of the mobile phone 100 determines the local area corresponding to the trajectory and uses that local area as the display area.
  • the partial region of the display interface of the navigation application 101 selected by the user through the selection gesture may be the display content included in the track formed by the selection gesture performed by the user in the display interface of the navigation application 101 .
  • the view system of the mobile phone 100 sets a method for monitoring gesture events on the view where the display interface of the navigation application 101 is located, for example, the onTouchEvent method, which recognizes the trajectory of the user's selection gesture by starting the gesture recognition module of the mobile phone 100. That is, when the user presses or contacts the screen of the mobile phone 100 with a conductor, the method can detect the position of pressing or contacting the screen of the mobile phone 100 , and the position of pressing or contacting the screen of the mobile phone 100 may be referred to as a contact point for short. When the gesture recognition module detects the touch point, it can be determined that the user's selection gesture is detected.
• the selection gesture is a gesture that can be slid within the screen of the mobile phone 100. As the gesture slides, the method for monitoring gesture events can obtain the x-coordinate value and the y-coordinate value of each contact point. After the user completes the selection gesture, the method for monitoring gesture events can calculate the trajectory formed by the selection gesture according to the x-coordinate values and y-coordinate values, and determine the local area corresponding to the trajectory.
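• The event-monitoring method described above can be pictured as accumulating the contact-point coordinates reported while the user slides; in the Android framework this would happen inside onTouchEvent, but the following Python sketch is only a structural illustration with hypothetical names:

```python
class SelectionGestureTracker:
    """Accumulates (x, y) contact points reported while the user slides."""

    def __init__(self):
        self.points = []

    def on_touch_event(self, action: str, x: float, y: float):
        # "down"/"move" events record a contact point; "up" ends the
        # gesture and returns the completed trajectory.
        if action in ("down", "move"):
            self.points.append((x, y))
        elif action == "up":
            return self.trajectory()
        return None

    def trajectory(self):
        return list(self.points)

t = SelectionGestureTracker()
t.on_touch_event("down", 0, 0)
t.on_touch_event("move", 5, 8)
result = t.on_touch_event("up", 5, 8)
print(result)  # [(0, 0), (5, 8)]
```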
  • the trajectory formed by the user's selection gesture may be a closed area or a non-closed area.
• when the trajectory is a closed area, the selected partial area of the display interface of the navigation application 101 is the display content in the closed area.
• when the trajectory is not closed, the selected local area of the display interface of the navigation application 101 may be the display content within the area formed between the trajectory of the user's selection gesture and the left and right or upper and lower boundaries of the screen of the mobile phone 100. For example, when the trajectory is a straight line, curve, or arc, the selected local area of the display interface of the navigation application 101 is the display content within the area formed by that straight line, curve, or arc and the left and right boundaries or upper and lower boundaries of the screen.
  • the selected partial area of the display interface of the navigation application 101 may also be the display content in the closed area obtained by completing the non-closed area.
• for example, when the trajectory is V-shaped or arc-shaped, the display content of the selected current display page is the display content in the closed area obtained after the V shape or arc is completed. Completion may directly connect the parts with gaps, complete the non-closed area according to the principle of symmetry, or complete the shape of the non-closed area into a regular figure, and so on.
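• One simple realization of the completion described above is to close a non-closed trajectory by directly connecting its two endpoints and then take the bounding region of the resulting closed shape. A minimal Python sketch, with hypothetical function names:

```python
def close_trajectory(points):
    """Close a non-closed trajectory by connecting its endpoints directly."""
    if points[0] != points[-1]:
        points = points + [points[0]]
    return points

def bounding_box(points):
    """Axis-aligned bounding box (left, top, width, height) of a trajectory."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs) - min(xs), max(ys) - min(ys))

# A V-shaped (non-closed) trajectory:
v_shape = [(0, 0), (10, 20), (20, 0)]
closed = close_trajectory(v_shape)
print(closed[0] == closed[-1])  # True: the gap is now connected
print(bounding_box(closed))     # (0, 0, 20, 20)
```

Completion by symmetry or by snapping to a regular figure, as the text also mentions, would replace the direct endpoint connection with a different closing rule.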
• S402c: the mobile phone 100 adjusts the partial area of the display interface of the navigation application 101 to form the display area of the navigation application 101.
• the mobile phone 100 can enter the preview interface 120 of the partial area of the display interface of the navigation application 101. In the preview interface 120, the view system of the mobile phone 100 may expand the local area to form a rectangle, or another shape such as a circle or a diamond-shaped frame, as the preview area 1201. For example, as shown in the figure, the view system of the mobile phone 100 can form a preview area 1201 by circumscribing a rectangle around the circular closed area; the preview area 1201 can be, for example, an area with a size of 20 mm × 19 mm (length × width).
• the user can also adjust the size of the preview area 1201 again through gestures. For example, the user can adjust the size of the display area by dragging the four sides of the rectangle formed by the preview area 1201, and the user can also adjust the position of the preview area 1201 in the display interface of the navigation application 101 by pressing and dragging the rectangle formed by the preview area 1201.
• the view system of the mobile phone 100 may also store the size of the preview area 1201 and the default position of the preview area 1201 in its own storage area. For example, the size of the preview area 1201 may be 20 mm × 19 mm (length × width), and the default position of the preview area 1201 may be defined by the distances from the center of the preview area 1201 to the four sides of the display interface of the navigation application 101.
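• The stored parameters described above (the size of the preview area and its default position expressed as distances from its center to the four sides of the display interface) can be represented as a small record. A hedged Python sketch with illustrative field names, not from the patent:

```python
from dataclasses import dataclass

@dataclass
class PreviewArea:
    width_mm: float
    height_mm: float
    # Default position: distances (mm) from the center of the preview
    # area to the four sides of the application's display interface.
    to_left: float
    to_top: float
    to_right: float
    to_bottom: float

# A 20 mm x 19 mm preview area centered in a 90 mm x 120 mm interface:
p = PreviewArea(20, 19, to_left=45, to_top=60, to_right=45, to_bottom=60)
print(p.to_left + p.to_right)   # 90  (interface width recovered)
print(p.to_top + p.to_bottom)   # 120 (interface height recovered)
```

Storing center-to-edge distances rather than a corner offset lets the position survive an interface resize by re-scaling each distance.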
• The above describes a method in which the mobile phone 100 determines the display area 1011 of the navigation application 101 from a selection gesture performed by the user on the display interface of the navigation application 101. Several other implementations of step S402 will be described below.
  • the mobile phone 100 determines the display area 1011 of the navigation application 101 .
  • the mobile phone 100 can automatically select the display area 1011 in the display interface of the navigation application 101 through the selection function of the display area of its own control center. After receiving an instruction from the user to enable the selection function set in the control center of the mobile phone 100, the mobile phone 100 enters the local area selection mode of the application. In this selection mode, the view system of the mobile phone 100 can implement the selection of a display area in the display interface of the navigation application 101 as described in S402 above.
  • the user opens the video playback application 105 and the instant chat application 102 on the mobile phone 100 and switches to the video playback application 105 .
  • the mobile phone 100 enters the display interface of the control center 201 .
• a button 2011 named "partial window" is provided in the display interface of the control center 201.
  • the mobile phone 100 enters the selection interface 202 for selecting a local area on the display interface of the video playback application 105 .
  • a rectangular preview frame 2021 for selecting a partial area and a "small window display” button 2022 will be displayed.
• the user can drag the four sides of the preview frame 2021 to adjust the size of the selected local area of the display interface of the video playback application 105, and the user can also press and drag the preview frame 2021 to adjust the position of the preview frame in the display interface of the video playback application 105, so as to select the partial area of the display interface of the video playback application 105 as the display area.
• As shown in Fig. 10(d), after the user has determined the display area in the display interface of the video playback application 105, the user can click the "small window display" button 2022 to enter S403. As shown in Fig. 10(e), the view system of the mobile phone 100 displays the video display area 1051 of the display interface of the video playback application 105 on the display interface of the instant chat application 102 of the mobile phone 100 through a floating window.
• In another implementation, the mobile phone 100 can implement the determination of the display area 1011 in the display interface of the navigation application 101 described in S402 above through a floating window display mode of the display area.
  • the mobile phone 100 can activate the floating window display mode. Different from the method described above, in this floating window display mode, the view system of the mobile phone 100 can detect and determine the area in the display interface of the navigation application 101 where the displayed content will change dynamically, And the display area 1012 of the navigation application 101 is determined from the area in which the display content will change dynamically.
• the area in which the display content will change dynamically, as described here, refers to an area of an application's display interface whose display content changes within a period of time.
  • the dynamically changing area contained therein may be the area where the map is located, and the type may be the map view MapView.
  • the display content in the area where the map view is located will change in real time.
• the view system of the mobile phone 100 can detect each area in the display interface of the navigation application 101 and determine whether the change in the display content of an area reaches a change threshold, for example, a change of 10 frames per second, that is, a display-content refresh rate of 10 FPS. The FPS here refers to the number of frames transmitted per second (FPS, Frames Per Second), that is, the number of pictures refreshed in the display content per second; 10 FPS means that 10 pictures of the display content are refreshed in one second.
• When the change in an area reaches the change threshold, the view system of the mobile phone 100 confirms that the area belongs to the area where the display content changes dynamically. As shown in the figure, the mobile phone 100 can radiate outward from the center of the dynamically changing area to its surroundings to form a rectangular display area 1011; the view system of the mobile phone 100 then enters S403 and displays the display area 1011 on the display interface of the instant chat application 102 through a floating window.
  • the view system of the mobile phone 100 may determine which views belong to the dynamically changing region according to the list of view types pre-stored in its own storage region.
  • the dynamically changing area contained in it may be a video playback view, and its type may be VideoView.
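• The two detection strategies described above, a refresh-rate change threshold and a pre-stored list of view types, could be combined roughly as follows. This is a speculative Python sketch rather than the patent's implementation, and all names are illustrative:

```python
# View types assumed to contain dynamically changing content (per the
# examples in the text: a map view and a video playback view).
DYNAMIC_VIEW_TYPES = {"MapView", "VideoView"}
REFRESH_THRESHOLD_FPS = 10  # change threshold: 10 frames per second

def is_dynamic_area(view_type: str, frames_refreshed: int, seconds: float) -> bool:
    """An area is considered dynamic if its view type is in the pre-stored
    list, or if its measured refresh rate reaches the change threshold."""
    if view_type in DYNAMIC_VIEW_TYPES:
        return True
    fps = frames_refreshed / seconds
    return fps >= REFRESH_THRESHOLD_FPS

print(is_dynamic_area("MapView", 0, 1.0))    # True (known dynamic type)
print(is_dynamic_area("TextView", 12, 1.0))  # True (12 FPS >= 10 FPS)
print(is_dynamic_area("TextView", 3, 1.0))   # False (below threshold)
```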
• the mobile phone 100 may also set a button for entering/exiting the floating window display mode for an already opened application. For example, as shown in FIGS. 12(a) to 12(b), the user can click the "floating window display" button 1052 on the display interface of the video playing application 105 so that the mobile phone 100 determines the video display area 1051 of the display interface of the video playing application 105 and displays it in a floating window. After the mobile phone 100 displays the video playing area of the video playing application 105 in the floating window, the user can also click the "exit floating window display" button 1053 in the floating window, so that the mobile phone 100 cancels the floating window display of the video display area 1051 of the video playing application 105.
• the floating window display mode of the mobile phone 100 can also be used for an application that pops up a prompt on the mobile phone 100, to realize the determination of the display area 1011 in the display interface of the navigation application 101 by the mobile phone 100 described in S402 above.
• After the mobile phone 100 starts the floating window display mode, when the mobile phone 100 detects in this mode that the display content of an application running in the background has changed, it prompts the user whether to display the application in the form of a floating window, so that the user can keep track of the displayed content of that application. For example, as shown in Fig. 13(a), the display interface of the instant chat application 102 is displayed on the screen of the mobile phone 100. At this time, the upper part of the screen of the mobile phone 100 prompts the user that the display content of the display interface of the navigation application 101 running in the background has changed, and the user may click to view it.
• If the user dismisses the prompt, the mobile phone 100 will no longer prompt the user about changes in the display content of the display interface of the navigation application 101.
  • the mobile phone 100 prompts the user again whether to display the display interface of the navigation application 101 in the display interface of the instant chat application 102 by means of a floating window. If the user chooses to click "No", the mobile phone 100 will not display the display interface of the navigation application 101 in the form of a floating window. If the user chooses to click “Yes”, as shown in FIG. 13( c ), the view system of the mobile phone 100 can display the display interface of the navigation application 101 in the display interface of the instant chat application 102 by means of a floating window.
  • the mobile phone 100 may also display multiple applications by means of a floating window.
• the mobile phone 100 can enable the application floating display function. After this function is enabled, the view system of the mobile phone 100 can display the applications already running in the background in a list on the screen of the mobile phone 100. The user can select one or more of these applications, so that they are displayed in the display interface of the current application by means of a floating window.
  • the following takes the mobile phone 100 running the navigation application 101 , the instant chat application 102 , the reading application 103 and the video playback application 105 as an example for description, wherein the current application displayed on the screen of the mobile phone 100 is the instant chat application 102 .
  • the mobile phone 100 enters the display interface 401 .
• a button 4011 named "Application Floating Display" is provided in the display interface of the control center 401. After the user presses the "Application Floating Display" button 4011, the mobile phone 100 enters the list interface 501 of the applications currently running in the background.
• the identifiers of all the applications currently running in the background will be displayed, for example, the navigation application 101, the reading application 103 and the video playing application 105, and the user will be prompted to select the applications to be displayed in the form of a floating window.
  • the user can determine whether to display the application in the display interface of the current application by means of a floating window by clicking on the identifier of each application.
• after the selection, the user can click the "OK" button. As shown in the figure, the view system of the mobile phone 100 then displays the display interface of the video playback application 105 and the display interface of the navigation application 101 on the display interface of the instant chat application 102 of the mobile phone 100 through floating windows.
  • the user can also click the "Cancel" button to exit the application floating display function.
  • the above-mentioned step 403 describes a method for displaying the display area 1011 of the navigation application 101 on the display interface of the instant chat application 102 by means of a floating window. Another implementation method of the above-mentioned step 403 is described below.
• the view system of the mobile phone 100 may set the navigation application 101 on the second layer and the instant chat application 102 of the mobile phone 100 on the first layer, with the first layer overlaying the second layer. That is, the mobile phone 100 runs the navigation application 101 in the second layer, and the mobile phone 100 runs the display interface of the instant chat application 102 in the first layer.
  • the mobile phone 100 digs out a local area in the display interface of the instant chat application 102.
  • the mobile phone 100 moves the position of the navigation application 101 in the second layer so that the display area 1011 is aligned with the local area of the instant chat application 102 of the mobile phone 100 in the first layer.
• the user can browse the display content of the display area 1011 of the display interface of the navigation application 101 of the mobile phone 100 in the second layer through the partial area of the instant chat application 102. At the same time, the display area of the navigation application 101 is displayed on the display interface of the instant chat application 102 of the mobile phone 100 in a manner similar to a floating window, and the user can also browse the displayed content of the display area of the navigation application 101 in real time.
• the size of the above-mentioned display area may be the same as the size of the display area in step S402, or may be pre-stored in the storage area of the mobile phone 100, for example, 20 mm × 10 mm (length × width).
• the size of the above-mentioned display area may also be the size of the view in which the dynamically changing area in the display interface of the navigation application 101 is located; the mobile phone 100 may obtain the type of the view corresponding to the dynamically changing area and, at the same time, obtain the size of the view and set it as the size of the display area.
  • the mobile phone 100 displays the display interface of the navigation application 101 by means of a floating window in the display interface of the instant chat application 102 .
  • the mobile phone 100 starts the navigation application 101 , the instant chat application 102 , the reading application 103 and the desktop 104 .
• the window manager of the mobile phone 100 divides the screen of the mobile phone 100 into multiple windows so that the navigation application 101, the instant chat application 102, the reading application 103 and the desktop 104 are displayed in split screens; a main window is combined with multiple auxiliary windows to display the navigation application 101, the instant chat application 102, the reading application 103 and the desktop 104, respectively. The user can view the application in every window.
  • FIG. 16 shows a scene diagram of two applications simultaneously displayed on the screen of the mobile phone 100 in other embodiments of the present application.
  • An instant chat application 102 and a navigation application 101 run on the screen of the mobile phone 100.
  • the view system of the mobile phone 100 can set the window of the navigation application 101 as a resizable (adjustable) type of application.
  • the display style of the display content of the navigation application 101 may remain the same as the display style of the current display interface of the electronic device 100 .
  • the font size and font style of the navigation application 101 in the floating window are the same as the font size and font style of the instant chat application 102 .
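One consequence of marking the window as resizable, sketched below, is that the displayed content must be fitted into whatever bounds the window takes while keeping its aspect ratio. The `fit` helper is hypothetical; the patent does not describe the scaling rule.

```java
// Sketch (assumption): fit content of size contentW x contentH into a
// resizable window of size boxW x boxH, preserving the aspect ratio so the
// navigation content is not distorted when the window is resized.
public class ResizableFit {
    // returns {w, h} of the content scaled to fit inside the box
    public static int[] fit(int contentW, int contentH, int boxW, int boxH) {
        double s = Math.min((double) boxW / contentW, (double) boxH / contentH);
        return new int[]{(int) Math.round(contentW * s), (int) Math.round(contentH * s)};
    }
}
```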
  • FIGS. 17( a ) to ( b ) illustrate scenarios in which the display content of the application is displayed in the screen of the mobile phone 100 by means of a floating window in other embodiments of the present application.
  • FIG. 17( a ) shows the display interface of the video playback application 105 .
  • the view system of the mobile phone 100 displays the display area 1051 of the video being played in the video playback application 105 in the display interface of the video playback application 105 by means of a floating window.
  • the user can perform gesture operations on the floating window to control the video being played in the floating window. For example, the user can tap the floating window to pause the playing video.
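The tap-to-pause behaviour can be sketched as a small state toggle. The `onTap` callback, the bounds representation, and the toggle semantics are illustrative assumptions rather than the patent's actual gesture handling.

```java
// Sketch (assumptions): a tap inside the floating window's bounds toggles
// play/pause of the video shown in it; taps outside the window are ignored.
public class FloatingVideoControl {
    private boolean playing = true;

    public boolean isPlaying() { return playing; }

    // hypothetical callback invoked when the touch sensor reports a tap at (x, y);
    // bounds is {x, y, w, h} of the floating window
    public void onTap(int x, int y, int[] bounds) {
        boolean inside = x >= bounds[0] && x < bounds[0] + bounds[2]
                      && y >= bounds[1] && y < bounds[1] + bounds[3];
        if (inside) {
            playing = !playing; // a tap pauses (or resumes) playback
        }
    }
}
```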
  • FIG. 18 shows a schematic structural diagram of an electronic device 100 according to an embodiment of the present application.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2 , mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headphone jack 170D, sensor module 180, buttons 190, motor 191, indicator 192, camera 193, display screen 194, and Subscriber identification module (subscriber identification module, SIM) card interface 195 and so on.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown, or combine some components, or split some components, or have a different arrangement of components.
  • the illustrated components may be implemented in hardware, software or a combination of software and hardware.
  • the processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in the processor 110 is a cache memory. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory. Repeated accesses are thereby avoided and the waiting time of the processor 110 is reduced, which improves the efficiency of the system.
  • the processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the interface connection relationship between the modules illustrated in the embodiments of the present application is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is used to receive charging input from the charger.
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, and the wireless communication module 160.
  • the wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the electronic device 100 may be used to cover a single communication frequency band or multiple communication frequency bands. Different antennas can also be multiplexed to improve antenna utilization.
  • the mobile communication module 150 may provide wireless communication solutions including 2G/3G/4G/5G etc. applied on the electronic device 100 .
  • the modem processor may include a modulator and a demodulator. Wherein, the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the wireless communication module 160 can provide wireless communication solutions applied on the electronic device 100 , including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • the electronic device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • Display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), or the like.
  • the electronic device 100 may include one or N display screens 194 , where N is a positive integer greater than one.
  • the electronic device 100 may implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the camera 193 may be the camera 102 in the embodiments of the present application, and is used to collect scene images of the current environment.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100 .
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function, for example, to save files such as music and videos in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area may store data (such as audio data, phone book, etc.) created during the use of the electronic device 100 and the like.
  • the internal memory 121 stores a list of view types, and the list of view types is used to determine the views belonging to the dynamically changing area.
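The stored list of view types can be sketched as a simple membership check: a view whose type appears in the list is treated as belonging to a dynamically changing area. The concrete type names below are assumptions for illustration, since the patent does not enumerate them.

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

// Sketch (assumptions): internal memory holds a list of view types; a view
// whose type is in the list is considered a dynamically changing area.
// The type names here are hypothetical examples, not taken from the patent.
public class DynamicViewTypes {
    private static final Set<String> TYPES =
        new HashSet<>(Arrays.asList("MapView", "VideoView", "SurfaceView"));

    public static boolean isDynamic(String viewType) {
        return TYPES.contains(viewType);
    }
}
```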
  • the electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playback, recording, etc.
  • the audio module 170 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 .
  • The speaker 170A, also referred to as a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • The receiver 170B, also referred to as an "earpiece", is used to convert audio electrical signals into sound signals.
  • the voice can be answered by placing the receiver 170B close to the human ear.
  • The microphone 170C, also called a "mic", is used to convert sound signals into electrical signals.
  • the earphone jack 170D is used to connect wired earphones.
  • the earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, accessing application locks, taking photos with fingerprints, answering incoming calls with fingerprints, and the like.
  • The touch sensor 180K is also called a "touch device".
  • the touch sensor 180K may be disposed on the display screen 194 , and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect gesture operations acting on or near it.
  • the touch sensor can pass the detected gesture operation to the application processor to determine the type of touch event.
  • Visual output related to the gesture operation may be provided via display screen 194 .
  • the touch sensor 180K is used to receive gesture operations performed by the user on the touch screen. For example, the touch sensor 180K can determine that the user has performed a gesture operation of selecting a local area on the display interface of the application and a gesture operation of changing the display content of the display interface of the application by the user.
  • the keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys, or may be touch keys.
  • the electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100 .
  • Motor 191 can generate vibrating cues.
  • the motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
  • gesture operations acting on different applications can correspond to different vibration feedback effects.
  • gesture operations acting on different areas of the display screen 194 can also correspond to different vibration feedback effects.
  • Gesture operations in different application scenarios (for example, time reminders, receiving messages, alarm clocks, and games) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 can be an indicator light, which can be used to indicate a charging state, a change in power, and can also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 195 is used for connecting a SIM card.
  • the SIM card can be brought into contact with or separated from the electronic device 100 by being inserted into or pulled out of the SIM card interface 195 .
  • References in the specification to "one embodiment", "an embodiment", "an illustrative embodiment", etc. indicate that the described embodiment may include a particular feature, structure, or property, but each embodiment may or may not include that particular feature, structure, or property. Moreover, these phrases do not necessarily refer to the same embodiment. Furthermore, when particular features are described in conjunction with specific embodiments, those skilled in the art have the knowledge to combine those features with other embodiments, whether or not those embodiments are explicitly described.
  • "Module" may refer to, be part of, or include: memory (shared, dedicated, or grouped) that runs one or more software or firmware programs, an application-specific integrated circuit (ASIC), an electronic circuit and/or processor (shared, dedicated, or grouped), a combinational logic circuit, and/or other suitable components that provide the described functionality.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to an electronic device, a display method for an application thereof, and a medium. The display method for an application comprises: displaying a first display interface of a first application on a screen of an electronic device; and displaying, on the screen, a second display interface of a second application and a first display area of the first application, a partial area of the second display interface being covered by the first display area, and the first display area being part of the first display interface. By means of the method of the present invention, an electronic device displays the display content in a display interface of a first application within the display interface of a second application or a desktop by means of a floating window, so that a user can pay attention to changes in the first application while using the second application, without switching between the two applications.
PCT/CN2022/074024 2021-02-05 2022-01-26 Dispositif électronique et son procédé d'affichage, et support WO2022166713A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110162248.8A CN114879880A (zh) 2021-02-05 2021-02-05 电子设备及其应用的显示方法和介质
CN202110162248.8 2021-02-05

Publications (1)

Publication Number Publication Date
WO2022166713A1 true WO2022166713A1 (fr) 2022-08-11

Family

ID=82667956

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/074024 WO2022166713A1 (fr) 2021-02-05 2022-01-26 Dispositif électronique et son procédé d'affichage, et support

Country Status (2)

Country Link
CN (1) CN114879880A (fr)
WO (1) WO2022166713A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117632329A (zh) * 2022-08-19 2024-03-01 荣耀终端有限公司 显示方法、存储介质及电子设备
CN115658203A (zh) * 2022-10-28 2023-01-31 维沃移动通信有限公司 信息显示方法、装置、电子设备及存储介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106126236A (zh) * 2016-06-24 2016-11-16 北京奇虎科技有限公司 终端屏幕的分屏显示方法、装置及终端
CN106909268A (zh) * 2015-12-23 2017-06-30 北京奇虎科技有限公司 一种在设备桌面上设置app悬浮窗的方法及装置
US20190272086A1 (en) * 2018-03-01 2019-09-05 Samsung Electronics Co., Ltd Devices, methods, and computer program for displaying user interfaces
CN110243386A (zh) * 2019-07-15 2019-09-17 腾讯科技(深圳)有限公司 导航信息显示方法、装置、终端及存储介质

Also Published As

Publication number Publication date
CN114879880A (zh) 2022-08-09

Similar Documents

Publication Publication Date Title
US11893219B2 (en) Method for quickly invoking small window when video is displayed in full screen, graphic user interface, and terminal
CN110471639B (zh) 显示方法及相关装置
US20220269405A1 (en) Floating Window Management Method and Related Apparatus
WO2021129326A1 (fr) Procédé d'affichage d'écran et dispositif électronique
US11797145B2 (en) Split-screen display method, electronic device, and computer-readable storage medium
WO2020221063A1 (fr) Procédé de commutation entre une page parent et une sous-page, et dispositif associé
US11687235B2 (en) Split-screen method and electronic device
CN113553014A (zh) 多窗口投屏场景下的应用界面显示方法及电子设备
US11972106B2 (en) Split screen method and apparatus, and electronic device
WO2022166713A1 (fr) Dispositif électronique et son procédé d'affichage, et support
CN113132526B (zh) 一种页面绘制方法及相关装置
WO2022068819A1 (fr) Procédé d'affichage d'interface et appareil associé
US20220214891A1 (en) Interface display method and electronic device
WO2021175272A1 (fr) Procédé d'affichage d'informations d'application et dispositif associé
WO2022161119A1 (fr) Procédé d'affichage et dispositif électronique
CN114077365A (zh) 分屏显示方法和电子设备
JP2024513773A (ja) 表示方法、電子機器、記憶媒体、及びプログラムプロダクト
US20220291832A1 (en) Screen Display Method and Electronic Device
WO2022222688A1 (fr) Procédé et dispositif de commande de fenêtre
CN115567666B (zh) 屏幕录制方法、电子设备及可读存储介质
EP4210308A1 (fr) Méthode de soufflage de message et dispositif électronique
CN116680019B (en) Screen icon moving method, electronic equipment and storage medium
CN117707561A (zh) 卡片数据的更新方法、电子设备及计算机可读存储介质
CN116680019A (zh) 一种屏幕图标移动方法、电子设备、存储介质及程序产品
CN117093290A (zh) 窗口尺寸调整方法、相关装置及通信系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22748992

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22748992

Country of ref document: EP

Kind code of ref document: A1