WO2024109286A1 - Multi-window switching method and apparatus, electronic device, and computer-readable storage medium


Info

Publication number: WO2024109286A1
Application number: PCT/CN2023/119188
Authority: WO (WIPO PCT)
Prior art keywords: target application, window, mode, support, window mode
Priority date: 2022-11-23
Filing date: 2023-09-15
Other languages: English (en), Chinese (zh)
Inventor: 韩国辉
Original Assignee: Oppo广东移动通信有限公司
Application filed by Oppo广东移动通信有限公司
Publication of WO2024109286A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present disclosure relates to the field of computer technology, and in particular to a multi-window switching method and device, an electronic device, and a computer-readable storage medium.
  • multiple applications in the terminal can be displayed in a multi-window mode.
  • The purpose of the present disclosure is to provide a multi-window switching method and device, an electronic device, and a storage medium, thereby overcoming, at least to a certain extent, the problem that switching to a multi-window mode requires many steps due to the limitations and defects of the related art.
  • a multi-window switching method including: detecting a gesture touch operation acting on a target application in a terminal, and determining the support status of the target application for the multi-window mode; if the support status of the target application is to support the multi-window mode, displaying the target application through the multi-window mode.
  • a multi-window switching device including: a touch operation detection module, used to detect a gesture touch operation acting on a target application in a terminal, and determine the support status of the target application for the multi-window mode; a multi-window mode activation module, used to display the target application through the multi-window mode if the support status of the target application is to support the multi-window mode.
  • an electronic device, comprising: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to execute, via the executable instructions, the multi-window switching method of the first aspect and possible implementations thereof.
  • a computer-readable storage medium on which a computer program is stored.
  • when the computer program is executed by a processor, the multi-window switching method of the first aspect and possible implementations thereof are implemented.
  • the technical solution provided in the embodiments of the present disclosure can display the target application using the multi-window mode according to the support status of the target application when a gesture touch operation acting on the target application is detected, thereby switching to the multi-window mode.
  • only one gesture touch operation is required, which reduces the operation steps of switching to the multi-window mode, reduces the difficulty of operation, and improves the operation efficiency of opening the multi-window mode for the application.
  • the multi-window mode can be switched directly by performing a gesture touch operation on the target application displayed in the terminal, compared with the related art, the object to be executed can be quickly determined, which improves convenience and operability.
  • any terminal can be switched to the multi-window mode by a gesture touch operation acting on the target application, the scope of application is increased and the versatility is improved.
  • FIG. 1 is a schematic diagram showing a method of entering a multi-window mode in the related art.
  • FIG. 2 is a schematic diagram showing a multi-window switching method and an application scenario of the multi-window switching method to which an embodiment of the present disclosure can be applied.
  • FIG3 schematically shows a schematic diagram of a multi-window switching method according to an embodiment of the present disclosure.
  • FIG. 4 schematically shows a schematic diagram of enabling a multi-window mode for a target application in an embodiment of the present disclosure.
  • FIG5 schematically shows a schematic diagram of a system architecture in an embodiment of the present disclosure.
  • FIG. 6 schematically shows a first schematic diagram of enabling a multi-window mode for multiple target applications in an embodiment of the present disclosure.
  • FIG. 7 schematically shows a second schematic diagram of enabling a multi-window mode for multiple target applications in an embodiment of the present disclosure.
  • FIG8 schematically shows a flowchart of opening a multi-window mode according to an embodiment of the present disclosure.
  • FIG. 9 schematically shows a block diagram of a multi-window switching device in an embodiment of the present disclosure.
  • FIG. 10 schematically shows a block diagram of an electronic device in an embodiment of the present disclosure.
  • multiple windows can be switched in the following two ways: Referring to the first way shown in FIG1 A, the steps of entering multiple windows from the sidebar and displaying multiple windows include: swiping out the sidebar; clicking on the application to be displayed in the sidebar; and displaying the application in a floating frame. Referring to the second way shown in FIG1 B, the steps of entering multiple windows from a multi-tasking card and displaying multiple windows include: entering multi-tasking; clicking on the floating button in the upper right corner of a card; and displaying the application in a floating frame.
  • In the related art, the user needs to perform multiple operation steps to switch to the multi-window mode, and the switching operation is inconvenient.
  • different layouts need to be switched in different ways, so the versatility is poor.
  • FIG2 shows a schematic diagram of the system architecture of the multi-window switching method and device of the embodiment of the present disclosure.
  • the system architecture may include: a user 210 and a terminal 220.
  • the terminal 220 may be any type of device, such as a computer, a smart phone, a smart TV, a tablet computer, a smart wearable device (such as AR glasses), a robot, a drone, and the like.
  • the user 210 may use the terminal 220 to perform various operations.
  • Various types of applications may be installed in the terminal, so that the user may perform corresponding operations on the application to enter the multi-window mode.
  • the multi-window switching method provided in the embodiment of the present disclosure may be executed by the terminal 220 .
  • FIG3 schematically shows a multi-window switching method in an embodiment of the present disclosure, which specifically includes the following steps:
  • Step S310: detecting a gesture touch operation on a target application in the terminal, and determining a support status of the target application for a multi-window mode.
  • Step S320: if the support status of the target application is to support the multi-window mode, the target application is displayed in the multi-window mode.
  • the target application may be any application in the terminal.
  • the gesture touch operation may be a swipe up operation. If a gesture touch operation acting on the target application in the terminal is detected, the support status of the target application may be determined to determine whether the target application supports the multi-window mode. The support status is used to indicate whether the target application supports the multi-window mode. Further, based on the support status of the target application, the target application may be opened in the multi-window mode to display the target application, for example, the multi-window mode may be opened in a floating control mode or a split-screen mode to display the target application.
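  • As an illustration of this two-step flow (detect the gesture, check the support status, then display), the following Kotlin sketch uses placeholder types and functions that are assumptions for this example rather than Android framework APIs.

```kotlin
// Minimal sketch of steps S310/S320; all names are illustrative placeholders.
data class AppTarget(val packageName: String, val supportsMultiWindow: Boolean)

fun onSwipeUp(target: AppTarget) {
    // S310: a swipe-up gesture on the target application triggers the support check.
    if (target.supportsMultiWindow) {
        // S320: the application supports multi-window mode, so display it that way
        // (e.g. as a floating window or in split screen).
        println("Opening ${target.packageName} in multi-window mode")
    } else {
        // Otherwise keep the current display mode and show a hint instead.
        println("${target.packageName} does not support multi-window mode")
    }
}
```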
  • In the technical solution provided in the embodiments of the present disclosure, since the target application can be displayed using the multi-window mode according to the support status of the target application when the gesture touch operation acting on the target application is detected, the target application is switched to the multi-window mode.
  • only one gesture touch operation is required, which reduces the operation steps of switching to the multi-window mode, reduces the difficulty of operation, and improves the operation efficiency of opening the multi-window mode for the application.
  • the multi-window mode can be switched directly by performing a gesture touch operation on the target application displayed in the terminal, compared with the related art, the object to be executed can be quickly determined, which improves convenience and operability.
  • any terminal can be switched to the multi-window mode through a gesture touch operation acting on the target application, the scope of application is increased and the versatility is improved.
  • In step S310, if a gesture touch operation acting on a target application in the terminal is detected, a support status of the target application for the multi-window mode is determined.
  • the terminal may be any type of device capable of installing applications and performing corresponding operations on the applications.
  • the terminal may be a smart device such as a smartphone, a tablet computer, a smart TV, a wearable device, an access control device, etc.
  • it may also include smart vehicle-mounted devices, such as a smart cockpit entertainment system of a car, etc.
  • the terminal is described as a smartphone as an example.
  • the application can be an application of the terminal itself or an application provided by a third party, such as a game application, a video application, and a social application, which are not specifically limited here.
  • the target application can be any one or more of the applications, that is, the target application can be one application or multiple applications.
  • the application can be displayed in the display interface of the terminal, and the display interface can include icons corresponding to multiple applications arranged in sequence. For example, as shown in reference FIG4, application 401, application 402, and application 403 can be displayed in sequence on the display interface 400 of the terminal. As shown in reference FIG6, application 601, application 602, and application 603 can be displayed in sequence on the display interface 600 of the terminal.
  • the gesture touch operation may be a touch operation performed by the user and acting on a certain application, and is used to control the application selected by the user to switch to the multi-window mode or to start the multi-window mode.
  • the gesture touch operation may be any type of touch operation, such as a sliding operation or a sliding-and-pressing operation, or an air gesture or any other suitable type of operation, which is not specifically limited here.
  • Here, the gesture touch operation of sliding upward is taken as an example for illustration.
  • FIG5 schematically shows a schematic diagram of a system architecture of a terminal.
  • the terminal may include a kernel layer 501 , a system layer 502 , an application framework layer 503 and an application layer 504 .
  • the kernel layer 501 may include display driver, camera driver, audio driver and sensor driver.
  • the system layer 502 may include surface manager, 3D graphics processing library, 2D graphics engine, media library and Android runtime.
  • the application framework layer 503 may include window manager, content manager, phone manager, gesture manager, activity manager, view manager and resource manager, etc.
  • the application layer 504 may include camera, calendar, wmShell in SystemUI, settings, desktop, gallery, notification, navigation, Bluetooth, video, etc.
  • the window manager is responsible for managing the window hierarchy and window positions in multi-window mode.
  • the activity manager is responsible for managing the activity stack, as well as the size and configuration of windows.
  • the gesture manager is responsible for triggering actions such as swiping up on a window.
  • the wmShell module in SystemUI is responsible for displaying multiple windows.
  • the settings module is responsible for dynamically setting the gesture switch.
  • the gesture manager of the application framework layer 503 can be used to trigger gesture actions such as sliding up the window. Based on this, the operation acting on the target application can be detected based on the gesture manager to determine whether the target application has received the gesture touch operation. Specifically, the gesture manager can first detect whether the operation acting on the target application is received, and further, according to the position of the starting touch point and the ending touch point of the operation and the positional relationship between different touch points, it can be determined whether the received gesture touch operation meets the touch condition to determine whether it is a gesture touch operation to open the multi-window mode.
  • the touch condition can be set according to actual needs, for example, the position of the starting touch point and the ending touch point can be different, and the ordinate of the starting touch point is less than the ordinate of the ending touch point.
  • the position of the starting touch point is at the position of the icon of the target application, and the starting touch point is different from the ending touch point, it can be considered that the sliding operation is detected, and the direction of the sliding operation can be further determined according to the positional relationship between the starting touch point and the ending touch point.
  • For example, if the gesture touch operation is an upward sliding operation, the gesture touch operation can be considered successful when the direction of the sliding operation is determined to be from bottom to top.
  • there is no specific limitation on the track length of the gesture touch operation as long as it is a swipe up operation.
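  • A minimal sketch of the touch condition described above is given below; the coordinate convention and the icon-bounds check are assumptions (on Android, MotionEvent y-coordinates grow downward, so a swipe up means the end point has the smaller y value).

```kotlin
// Sketch of the swipe-up touch condition: the gesture starts on the target
// application's icon, the start and end points differ, and the motion is upward.
data class TouchPoint(val x: Float, val y: Float)
data class IconBounds(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(p: TouchPoint) = p.x in left..right && p.y in top..bottom
}

fun isSwipeUpOnIcon(start: TouchPoint, end: TouchPoint, icon: IconBounds): Boolean {
    val startsOnIcon = icon.contains(start)   // starting touch point is on the icon
    val moved = start != end                  // start and end points must differ
    val upward = end.y < start.y              // screen coordinates: smaller y is higher
    return startsOnIcon && moved && upward
}
```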
  • the number of target applications may be one or more, which is determined according to actual needs. For example, it is possible to detect whether a gesture touch operation acting on one target application is received, or it is possible to detect whether a gesture touch operation acting on multiple target applications is received. Furthermore, the gesture touch operations acting on different target applications may be the same or different.
  • the gesture touch operation acting on the target application can be determined according to the display state of the terminal.
  • the display state can be the number of applications included in the display interface, or it can be other information. If the display state of the terminal meets the display condition, the gesture touch operation is an upward sliding operation; if the display state does not meet the display condition, the gesture touch operation can be other touch operations, such as pressing and sliding operations, etc.
  • If the number of applications on the display interface is less than the quantity threshold, it can be considered that the display state meets the display condition.
  • the quantity threshold can be six or ten, etc.
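  • The mapping from display state to trigger gesture could be expressed as in the sketch below; the threshold value and the alternative gesture are example assumptions taken from the description above.

```kotlin
// Choose the gesture that opens multi-window mode based on the display state:
// with fewer icons on screen a plain swipe-up suffices, otherwise another
// gesture (e.g. press-and-slide) is used. The threshold of 6 is an example value.
enum class TriggerGesture { SWIPE_UP, PRESS_AND_SLIDE }

fun gestureForDisplayState(appCountOnScreen: Int, threshold: Int = 6): TriggerGesture =
    if (appCountOnScreen < threshold) TriggerGesture.SWIPE_UP else TriggerGesture.PRESS_AND_SLIDE
```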
  • a gesture touch operation acting on target application 401 is detected;
  • a gesture touch operation acting on target application 601 and target application 602 is detected.
  • the support status of the target application for the multi-window mode can be further determined, that is, the support status is used to indicate whether the target application supports the multi-window mode.
  • For different applications, the support status may be the same or different due to their own application properties.
  • the support status of each application is set in advance according to its own application properties.
  • Multi-window mode refers to a mode in which the target application is displayed in multiple scenes. Multi-window mode may include but is not limited to floating window mode, split-screen mode, and picture-in-picture mode, etc., and the floating window mode and split-screen mode are used as examples for explanation.
  • the floating window mode refers to displaying the target application that receives the gesture touch operation in a floating layer on the display interface.
  • the split-screen mode refers to dividing the display interface into multiple screen areas to display the target application that receives the gesture touch operation through multiple screen areas.
  • the activity manager of the application framework layer 503 can be used to determine the support status of the target application through the attribute information of the target application.
  • whether the support status of the target application is to support the multi-window mode can be determined by whether the attribute information is the first identifier or the second identifier.
  • the attribute information can be represented by android:resizeableActivity.
  • the first identifier can be true for representing a correct value, and the second identifier can be false for representing an incorrect value.
  • When the attribute information of the target application is determined to be the first identifier by the activity manager, it can be considered that the target application supports the multi-window mode, that is, it supports display using a floating window or the split-screen mode.
  • When the attribute information of the target application is determined to be the second identifier by the activity manager, it can be considered that the target application does not support the multi-window mode, that is, it does not support display using a floating window or the split-screen mode.
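  • The support-status check can be pictured as in the sketch below; the Boolean parameter stands in for the parsed value of android:resizeableActivity (true for the first identifier, false for the second), since reading that attribute for an arbitrary package requires framework-internal access and is not shown here.

```kotlin
// Illustrative mapping from the android:resizeableActivity value to a support status.
enum class SupportStatus { SUPPORTS_MULTI_WINDOW, DOES_NOT_SUPPORT_MULTI_WINDOW }

fun supportStatusOf(resizeableActivity: Boolean): SupportStatus =
    if (resizeableActivity) SupportStatus.SUPPORTS_MULTI_WINDOW   // first identifier: floating window or split screen allowed
    else SupportStatus.DOES_NOT_SUPPORT_MULTI_WINDOW              // second identifier: keep display mode unchanged
```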
  • In step S320, if the support status of the target application is to support the multi-window mode, the target application is displayed in the multi-window mode.
  • different methods can be selected to display the target application according to the support status of the target application. For example, if the support status of the target application is to support multi-window mode, the target application can be opened in multi-window mode to display the target application, that is, the target application is switched to multi-window mode for display. If the support status of the target application is not to support multi-window mode, the display mode of the target application can be kept unchanged, and prompt information for displaying the support status of the target application for multi-window mode is provided.
  • the number of target applications can be one or more, and the number of target applications can be determined according to actual needs.
  • the number of target applications is different, and the display method of displaying the target applications through the multi-window mode is also different.
  • the display method of displaying the target application can include a floating control method (floating window) and a split screen method, in addition to which it can also include a picture-in-picture mode or other modes, etc.
  • When the number of target applications is one, if a gesture touch operation acting on the target application is detected and it is determined that the support status of the target application is to support the multi-window mode, the target application can be switched to the multi-window mode using a floating control, that is, the floating control is used to open the multi-window mode of the target application.
  • the floating control can be located at the top layer of the display interface, that is, the layer above the display interface.
  • the floating control can be a floating window, which can be located at any suitable position on the display interface, and the floating control can display the detailed content of the target application that receives the gesture touch operation.
  • the size of the floating control can be determined according to the default parameters, or it can be customized and adjusted, which is not limited here.
  • For example, multiple applications are displayed on the display interface. The gesture manager detects a gesture touch operation acting on the target application 401, and the activity manager determines that the support status of the target application 401 is to support the multi-window mode because its attribute information android:resizeableActivity is true. On this basis, as shown in FIG. 4, the target application is started in the floating window mode through the activity manager; a floating control 404 can be provided at any position on the display interface, and the detailed content of the target application 401 is displayed in the floating control 404, so that the detailed content of the target application is displayed through the floating window.
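  • From an application developer's point of view, a comparable effect can be requested through the public ActivityOptions.setLaunchBounds API (API 24+); whether the system actually shows the target in a floating or freeform window depends on the device, and the bounds below are arbitrary example values rather than the defaults described here.

```kotlin
import android.app.ActivityOptions
import android.content.Context
import android.content.Intent
import android.graphics.Rect

// Hedged sketch: ask the system to launch an activity with explicit bounds,
// which on devices that support freeform windowing yields a floating window.
fun launchAsFloatingWindow(context: Context, launchIntent: Intent) {
    val options = ActivityOptions.makeBasic()
        .setLaunchBounds(Rect(100, 300, 700, 1100))   // left, top, right, bottom in pixels
    launchIntent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK)
    context.startActivity(launchIntent, options.toBundle())
}
```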
  • the display mode of each target application can be determined according to the support status of each target application, thereby determining the display mode of the multiple target applications, and then switching the multiple target applications to multi-window mode for display through the display mode.
  • the display mode can be determined according to the number of target applications whose support status is to support the multi-window mode.
  • the display mode can be a split-screen mode; wherein the target applications that support the multi-window mode can be all the applications in the target applications that receive the gesture touch operation, or can be part of them.
  • For example, the number of target applications that receive gesture touch operations can be 4, and the number of target applications that support the multi-window mode can be 2, 3 or 4, which can be determined by the activity manager based on the value of the attribute information of each application.
  • each target application can be displayed in a split-screen manner.
  • the display interface of the terminal can be divided into multiple screen areas according to the number of target applications that support multi-window mode, so that each target application can be displayed independently through each screen area.
  • the sizes of the multiple screen areas can be the same or different, and can be determined according to actual needs.
  • the length and width of each screen area can be determined according to the size of the display interface.
  • the size of the screen area can be determined according to the application information of each target application.
  • the application information can be the memory occupied by the target application, the click frequency, etc.
  • The size of the screen area is positively correlated with the size of the application information of the target application. For example, the target application with the highest click frequency has the largest screen area, and the target application with the lowest click frequency has the smallest screen area.
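  • A sketch of sizing the screen areas in proportion to such per-application weights (for example click frequency) is shown below; the vertical split and the example numbers are assumptions.

```kotlin
// Divide the total display height among target applications in proportion to
// their weights, so the highest-weight application gets the largest screen area.
fun splitHeights(totalHeight: Int, weights: List<Double>): List<Int> {
    val sum = weights.sum()
    require(sum > 0.0) { "weights must contain at least one positive value" }
    return weights.map { w -> (totalHeight * w / sum).toInt() }
}

fun main() {
    // Example: a 2400 px display shared by two applications whose click
    // frequencies are 30 and 10 -> screen areas of 1800 px and 600 px.
    println(splitHeights(2400, listOf(30.0, 10.0)))   // [1800, 600]
}
```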
  • the arrangement of the multiple screen areas may be the same as or different from the arrangement order of the multiple target applications displayed on the display interface, which is not specifically limited here.
  • the display mode of the target application can be a floating control mode.
  • the target application that supports multi-window mode can be displayed using a floating control, while the display mode of other target applications that receive gesture touch operations but do not support multi-window mode remains unchanged, that is, for target applications that do not support multi-window mode, they are still only displayed as small icons on the display interface of the terminal.
  • a prompt message can be provided at any position for the target application that does not support multi-window mode to indicate that the support status of the target application is that multi-window mode is not supported.
  • the prompt message can be a text mark, an image mark, etc., which is not specifically limited here.
  • In order to display the target application in the multi-window mode, the size of the floating control and the size of the screen area must be determined, so a stack needs to be created and the stack boundary needs to be set.
  • the size of each Task in it is controlled by setting the stack boundary, and the size of the Task ultimately determines the size of the window.
  • the stack boundary is represented by Rect(left, top, right, bottom), which stores four values, representing the positions of the four sides of the rectangle from the coordinate axis.
  • the size of the window displayed on the screen is ultimately determined by the size of the Stack boundary.
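  • The stack boundary can be pictured as in the sketch below: each Task's bounds are a Rect(left, top, right, bottom), and the window drawn on screen takes its size from that Rect. The even two-way split is an assumption for illustration.

```kotlin
import android.graphics.Rect

// Compute stack boundaries for two vertically stacked screen areas; the size of
// each Task (and therefore each window) follows from its Rect.
fun twoWaySplitBounds(displayWidth: Int, displayHeight: Int): Pair<Rect, Rect> {
    val mid = displayHeight / 2
    val upper = Rect(0, 0, displayWidth, mid)               // upper screen area
    val lower = Rect(0, mid, displayWidth, displayHeight)   // lower screen area
    return upper to lower
}
```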
  • a plurality of applications are displayed on the display interface, for example, application 601, application 602, and application 603.
  • the gesture manager detects a gesture touch operation acting on application 601 and application 602, the target applications can be considered to be application 601 and application 602.
  • If the attribute information of both application 601 and application 602 is the first identifier, it can be considered that the support status of both is to support the multi-window mode, and application 601 and application 602 can open the multi-window mode in a split-screen manner for display.
  • the detailed content of application 601 is displayed in screen area 604, and the detailed content of application 602 is displayed in screen area 605.
  • the sizes of screen area 604 and screen area 605 can be the same or different, and are not specifically limited here.
  • a plurality of applications are displayed on the display interface, for example, application 701, application 702, and application 703.
  • If the gesture manager detects a gesture touch operation acting on application 701 and application 702, the target applications can be considered to be application 701 and application 702.
  • If the attribute information of application 701 is the first identifier and the attribute information of application 702 is the second identifier, it can be considered that the support status of application 701 is to support the multi-window mode and the support status of application 702 is not to support the multi-window mode. Therefore, the number of target applications that support the multi-window mode among the multiple target applications is one, and based on this, the display method of the target application can be a floating control method.
  • a floating control 704 can be provided, and the target application 701 that supports multi-window mode can be displayed using the floating control 704, while the display mode of other applications 702 that receive gesture touch operations but do not support multi-window mode remains unchanged, that is, for the target application 702 that does not support multi-window mode, it is still only displayed as a small icon on the display interface of the terminal.
  • a prompt message 705 may be provided for the application 702 that does not support the multi-window mode to remind the user of the support status corresponding to the target application that receives the gesture touch operation but does not support the multi-window mode.
  • In the above technical solution, after a gesture touch operation on a target application is received, when the target application supports the multi-window mode, the target application can be switched to the multi-window mode for display, thereby avoiding the cumbersome operations in the related art of opening a window through a sidebar or from a multi-task card, reducing the operation steps, and improving the efficiency of switching to the multi-window mode.
  • In addition, since this is achieved through a gesture touch operation on the target application, the problem of requiring multiple steps to display multiple windows is avoided, thereby improving convenience.
  • FIG8 schematically shows a flow chart of switching to the multi-window mode.
  • the flow chart mainly includes the following steps:
  • step S810 enter the desktop application.
  • step S820 the application is controlled through a gesture touch operation.
  • the touch gesture operation may include but is not limited to a swipe up operation, or may be other types of touch operations, which are determined according to actual needs.
  • step S830 it is determined whether the gesture touch operation is successful. If so, the process goes to step S840; if not, the process goes to step S860.
  • the gesture manager can be used to determine whether a gesture touch operation is successful based on the type and direction of the received gesture touch operation. For example, when the position of the starting touch point is at the position of the application icon, and the starting touch point is different from the ending touch point, it can be considered that a sliding operation is detected, and the direction of the sliding operation can be further determined based on the positional relationship between the starting touch point and the ending touch point. For example, if the gesture touch operation is an upward sliding operation, when the direction of the sliding operation is determined to be from bottom to top, the gesture touch operation can be considered successful. If the positional relationship between the starting touch point and the ending touch point does not meet the touch condition or the position of the starting touch point and the ending touch point itself does not meet the touch condition, the gesture touch operation can be considered to have failed.
  • step S840 it is determined whether the target application supports the multi-window mode. If yes, the process goes to step S850; if no, the process goes to step S860.
  • The activity manager can determine whether the application supports the multi-window mode by using the attribute information android:resizeableActivity. If the attribute information is the first identifier true, the application is considered to support the multi-window mode; if the attribute information is the second identifier false, the application is considered not to support the multi-window mode.
  • step S850 enter the multi-window mode.
  • For example, when a user swipes up on an application icon and the gesture manager successfully detects the gesture, if the target application supports the multi-window mode, the target application is started as a floating control.
  • When the gesture touch operation acts on two target applications, the activity manager determines the support status of the two applications through their attribute information, and can determine the display mode according to the support status of each application, thereby displaying each target application based on the display mode, specifically including the following situations (a small decision sketch follows this list):
  • 1. If both target applications support the multi-window mode, the two target applications are started in split-screen mode through the activity manager; 2. If only one target application supports the multi-window mode, that target application is displayed in a floating window, and a prompt message is provided on the display interface to prompt the user that the other target application does not support the multi-window mode; 3. If neither target application supports the multi-window mode, a prompt message is provided to remind the user that the target applications do not support the multi-window mode, and the display mode of the target applications is kept unchanged without performing any other action.
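  • The three cases above can be summarised in a small decision function; the enum and function names below are illustrative, not part of the described system.

```kotlin
// Decision for two target applications that both received the gesture touch operation.
enum class DisplayDecision { SPLIT_SCREEN_BOTH, FLOATING_ONE_WITH_PROMPT, PROMPT_ONLY }

fun decideDisplay(firstSupports: Boolean, secondSupports: Boolean): DisplayDecision = when {
    firstSupports && secondSupports -> DisplayDecision.SPLIT_SCREEN_BOTH        // case 1: split screen
    firstSupports || secondSupports -> DisplayDecision.FLOATING_ONE_WITH_PROMPT // case 2: floating window + prompt
    else -> DisplayDecision.PROMPT_ONLY                                         // case 3: keep display mode, prompt only
}
```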
  • step S860 the display mode is kept unchanged.
  • a prompt message is provided to the target application to remind the user that the target application does not support multi-window mode display such as floating windows.
  • the original display mode of the target application can be kept unchanged without opening the target application in multi-window mode.
  • the technical solution provided in the embodiment of the present disclosure determines whether to open the target application through the multi-window mode through the gesture touch operation acting on the target application and the support status of the target application.
  • the target application can be displayed using the multi-window mode according to the support status of the target application, thereby switching to the multi-window mode.
  • only one gesture touch operation is required, which reduces the operation steps of switching to the multi-window mode, reduces the difficulty of operation, and improves the operation efficiency.
  • Since the multi-window mode can be switched to directly by performing a gesture touch operation on the target application displayed in the terminal, compared with the related art, the object to be executed can be quickly determined, which improves the convenience and operability of opening the multi-window mode.
  • Since any terminal can switch to the multi-window mode through a gesture touch operation acting on the target application to open the target application, the scope of application is broadened and the versatility is improved.
  • the multi-window switching device 900 may include:
  • a touch operation detection module 901 is used to detect a gesture touch operation acting on a target application in the terminal, and determine a support status of the target application for the multi-window mode;
  • the multi-window mode enabling module 902 is configured to display the target application in the multi-window mode if the target application is in a support state of supporting the multi-window mode.
  • In some embodiments, the touch operation detection module includes: a detection control module, configured to determine, based on the gesture manager, that the position information of the starting touch point and the ending touch point of the gesture touch operation meets the touch condition, and to determine that the gesture touch operation acting on the target application is detected.
  • the touch operation detection module includes: a first state determination module, which is used to determine, based on the activity manager, that the attribute information of the target application is a first identifier, and determine that the support state of the target application is to support the multi-window mode; and a second state determination module, which is used to determine, based on the activity manager, that the attribute information of the target application is a second identifier, and determine that the support state of the target application is to not support the multi-window mode.
  • the number of target applications is one;
  • the multi-window mode activation module includes: a floating control display module, which is used to switch the target application to multi-window mode for display using a floating control when the support state is to support multi-window mode.
  • the number of the target applications is multiple;
  • the multi-window mode activation module includes: a display method determination module, which is used to determine the display method of multiple target applications based on the support status of each target application, and switch the multiple target applications to multi-window mode for display through the display method.
  • The display method determination module includes: a first display module, which is used to display the multiple target applications in a split-screen manner if there are multiple target applications whose support status is to support the multi-window mode; and a second display module, which is used to display the target application in a floating control manner if there is one target application whose support status is to support the multi-window mode.
  • the device also includes: a display mode maintaining module, which is used to keep the display mode of the target application unchanged if the support status of the target application is not to support the multi-window mode, and provide prompt information for the target application to display the support status.
  • the exemplary embodiments of the present disclosure also provide an electronic device.
  • the electronic device may be the above-mentioned terminal 220.
  • the electronic device may include a processor and a memory, the memory is used to store executable instructions of the processor, and the processor is configured to execute the above-mentioned method by executing the executable instructions.
  • the mobile terminal 1000 may specifically include: a processor 1001, a memory 1002, a bus 1003, a mobile communication module 1004, an antenna 1, a wireless communication module 1005, an antenna 2, a display screen 1006, a camera module 1007, an audio module 1008, a power module 1009 and a sensor module 1010.
  • the processor 1001 may include one or more processing units.
  • the processor 1001 may include an AP (Application Processor), a modem processor, a GPU (Graphics Processing Unit), ISP (Image Signal Processor), controller, encoder, decoder, DSP (Digital Signal Processor), baseband processor and/or NPU (Neural-Network Processing Unit), etc.
  • the method in this exemplary embodiment can be executed by AP, GPU or DSP.
  • NPU can load neural network parameters and execute neural network-related algorithm instructions.
  • the encoder can encode (i.e. compress) an image or video to reduce the data size for easy storage or transmission.
  • the decoder can decode (i.e. decompress) the encoded data of the image or video to restore the image or video data.
  • the mobile terminal 1000 can support one or more encoders and decoders, for example: image formats such as JPEG (Joint Photographic Experts Group), PNG (Portable Network Graphics), BMP (Bitmap), and video formats such as MPEG (Moving Picture Experts Group) 1, MPEG2, H.263, H.264, and HEVC (High Efficiency Video Coding).
  • the processor 1001 may be connected to the memory 1002 or other components via a bus 1003 .
  • the memory 1002 may be used to store computer executable program codes, which may include instructions.
  • the processor 1001 executes various functional applications and data processing of the mobile terminal 1000 by running the instructions stored in the memory 1002.
  • the memory 1002 may also store application data, such as images, videos, and other files.
  • the communication function of the mobile terminal 1000 can be implemented by the mobile communication module 1004, antenna 1, wireless communication module 1005, antenna 2, modulation and demodulation processor and baseband processor. Antenna 1 and antenna 2 are used to transmit and receive electromagnetic wave signals.
  • the mobile communication module 1004 can provide 3G, 4G, 5G and other mobile communication solutions applied to the mobile terminal 1000.
  • the wireless communication module 1005 can provide wireless communication solutions such as wireless LAN, Bluetooth, near field communication, etc. applied to the mobile terminal 1000.
  • the display screen 1006 is used to implement display functions, such as displaying user interfaces, images, videos, etc.
  • the camera module 1007 is used to implement shooting functions, such as shooting images, videos, etc., and the camera module may include a color temperature sensor array.
  • the audio module 1008 is used to implement audio functions, such as playing audio, collecting voice, etc.
  • the power module 1009 is used to implement power management functions, such as charging the battery, powering the device, monitoring the battery status, etc.
  • the sensor module 1010 may include one or more sensors for implementing corresponding sensing detection functions.
  • the sensor module 1010 may include an inertial sensor, which is used to detect the motion posture of the mobile terminal 1000 and output inertial sensing data.
  • a computer-readable storage medium is also provided in an embodiment of the present disclosure.
  • the computer-readable storage medium may be included in the electronic device described in the above embodiment; or may exist independently without being assembled into the electronic device.
  • The computer-readable storage medium may be, for example, but is not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection with one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
  • computer readable storage medium can be any tangible medium that contains or stores a program that can be used by or in conjunction with an instruction execution system, apparatus, or device.
  • Computer-readable storage media can send, propagate or transmit programs for use by or in conjunction with an instruction execution system, apparatus or device.
  • the program code contained on the computer-readable storage medium can be transmitted using any appropriate medium, including but not limited to: wireless, wire, optical cable, RF, etc., or any suitable combination of the above.
  • The computer-readable storage medium carries one or more programs; when the one or more programs are executed by the electronic device, the electronic device implements the above-mentioned method.
  • the technical solution according to the implementation of the present disclosure can be embodied in the form of a software product, which can be stored in a non-volatile storage medium (which can be a CD-ROM, a USB flash drive, a mobile hard disk, etc.) or on a network, including several instructions to enable a computing device (which can be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the implementation of the present disclosure.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present disclosure relate to the technical field of computers, and provide a multi-window switching method and apparatus, an electronic device, and a storage medium. The multi-window switching method comprises: detecting a gesture touch operation acting on a target application in a terminal, and determining a support status of the target application for a multi-window mode; and, if the support status of the target application is that the multi-window mode is supported, displaying the target application by means of the multi-window mode. According to the technical solution of the embodiments of the present disclosure, the operation efficiency and convenience of opening a target application by means of a multi-window mode can be improved.
PCT/CN2023/119188 2022-11-23 2023-09-15 Multi-window switching method and apparatus, electronic device and computer-readable storage medium WO2024109286A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211477846.5A 2022-11-23 2022-11-23 Multi-window switching method and apparatus, electronic device, and storage medium
CN202211477846.5 2022-11-23

Publications (1)

Publication Number Publication Date
WO2024109286A1 (fr)

Family

ID=91095933

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/119188 WO2024109286A1 (fr) 2022-11-23 2023-09-15 Multi-window switching method and apparatus, electronic device and computer-readable storage medium

Country Status (2)

Country Link
CN (1) CN118068993A (fr)
WO (1) WO2024109286A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104571844A (zh) * 2013-10-28 2015-04-29 联想(北京)有限公司 Information processing method and electronic device
CN106919302A (zh) * 2017-02-16 2017-07-04 北京小米移动软件有限公司 Operation control method and apparatus for mobile terminal
CN111625154A (zh) * 2019-12-23 2020-09-04 蘑菇车联信息科技有限公司 Application display method and apparatus
US20220050582A1 (en) * 2018-09-10 2022-02-17 Huawei Technologies Co., Ltd. Method for quickly invoking small window when video is displayed in full screen, graphic user interface, and terminal
CN114168029A (zh) * 2021-11-30 2022-03-11 深圳市鸿合创新信息技术有限责任公司 Display method, apparatus, device and medium
CN114416227A (zh) * 2021-11-16 2022-04-29 华为技术有限公司 Window switching method, electronic device and readable storage medium

Also Published As

Publication number Publication date
CN118068993A (zh) 2024-05-24

Similar Documents

Publication Publication Date Title
  • WO2022022495A1 Cross-device object dragging method and device, and device
  • AU2013204564B2 Method and apparatus for processing multiple inputs
  • CN115097981B Content processing method and electronic device therefor
  • CN109089138B Image display apparatus and method of operating the same
  • KR102221034B1 Method for controlling content display and electronic device therefor
  • US11853543B2 Method and apparatus for controlling display of video call interface, storage medium and device
  • KR102148001B1 Display apparatus and control method thereof
  • WO2019233280A1 User interface display method and apparatus, terminal, and storage medium
  • EP2667629B1 Method and apparatus for multiple video playback
  • US20150363091A1 Electronic device and method of controlling same
  • CN113014987A Screen recording method and apparatus, electronic device, and storage medium
  • CN114157889A Display device and touch-assisted interaction method
  • CN110088719B Display method of mobile device and mobile device
  • US9787746B2 Method and apparatus for processing multimedia content on a graphic cloud
  • WO2024037563A1 Content display method and apparatus, device, and storage medium
  • WO2024109286A1 Multi-window switching method and apparatus, electronic device and computer-readable storage medium
  • CN112926420B Display device and menu text recognition method
  • CN116980554A Display device and video conference interface display method
  • CN114827708A Video playback method and apparatus, and electronic device
  • CN115550717A Display device and multi-finger touch display method
  • CN109710359B Animated image display method and apparatus, computer-readable storage medium, and terminal
  • CN115390702A Display device, and touch point positioning method and apparatus
  • CN115525182A Electronic device and method for adjusting finger activity area of virtual keyboard thereof
  • CN115712340A Electronic device and human-computer interaction method
  • CN115185392A Display device, and image processing method and apparatus