CN117724783A - Dynamic effect display method and electronic equipment

Publication number: CN117724783A
Application number: CN202310854682.1A
Applicant: Honor Device Co Ltd
Inventor: 任士彦
Original language: Chinese (zh)
Legal status: Pending
Classification: User Interface Of Digital Computer

Abstract

This application provides a dynamic effect display method and an electronic device, and relates to the field of terminals. The electronic device displays a first application window, which plays video content, in a first area of the screen, and displays a second application window in a second area of the screen. In response to a drag event on the first application window, the electronic device obtains interface layer information of the first application window and, using that information, controls the first application window to move with the drag event; the first application window continues to play the video content while it moves. Because the display content of the first application window is updated in real time from the actual content of the interface layer, the display effect is more flexible; and because no large amount of screenshot processing is needed while the application window displays a dynamic image, heavy consumption of image processor resources is avoided.

Description

Dynamic effect display method and electronic equipment
Technical Field
The application relates to the field of terminals, in particular to a dynamic effect display method and electronic equipment.
Background
As electronic devices continue to evolve, so too do the applications (apps) installed on them. To meet users' needs, more and more electronic devices support running and displaying multiple application windows simultaneously.
When a plurality of application windows are displayed, a user can switch the application windows through gestures, for example by exchanging the display positions of the application windows, or by switching the window mode of an application window (e.g., from floating window mode to split screen mode, or from split screen mode to floating window mode).
To improve the user experience, a gesture dynamic effect is usually displayed in the application windows while they are being switched. At present, the form of the gesture dynamic effect is relatively rigid and the user experience is poor. How to improve the display effect of the gesture dynamic effect is a problem to be solved.
Disclosure of Invention
The embodiments of this application provide a dynamic effect display method and an electronic device. While responding to a gesture that a user makes in an application window in order to move that application window, the electronic device can display a more flexible gesture dynamic effect, improving the user experience.
In order to achieve the above purpose, the embodiments of the present application adopt the following technical solutions:
in a first aspect, a dynamic effect display method is provided. The method includes: the electronic device displays a first application window in split screen mode in a first area of the screen and displays a second application window in split screen mode in a second area of the screen, wherein the first application window plays video content. In response to a drag event on the first application window, the electronic device obtains interface layer information of the first application window and, using that information, controls the first application window to move with the drag event. In response to a lift event on the first application window that falls within a third area of the screen, the first application window is displayed in the second area and the second application window is displayed in the first area; that is, the first application window and the second application window exchange display positions. The first application window continues to play the video content while it moves.
In one embodiment, the electronic device further uses the interface layer information of the first application window to adjust the size of the first application window according to the position of the drag event.
In this method, while the positions of the two split-screen application windows are exchanged in response to the user's gesture, the gesture dynamic effect displayed in the application window is not a static image but a dynamically playing interface. No screenshot of the application window's interface is needed; instead, the gesture dynamic effect of the application window is generated from the interface layer of the application window, so it can be updated with the actual content of the interface layer, and its display effect is more flexible. Moreover, generating the dynamic gesture dynamic effect does not require a large amount of screenshot processing, which avoids heavy consumption of image processor resources.
With reference to the first aspect, in one implementation, the interface layer information needs to be acquired only once in response to the drag event on the first application window; during the subsequent movement of the application window it can be used directly without being acquired again, which improves efficiency.
With reference to the first aspect, in one implementation, the interface layer information of the first application window includes a handle of the interface layer corresponding to the task of the first application window. A handle of an interface layer points to exactly one interface layer, and all the interface layer information of an application window can be obtained from the handle of the interface layer corresponding to the application window's task. The interface layer of the application window can then be operated on, for example to designate its display position or to modify its display size. By acquiring the handle of the interface layer corresponding to the task of the first application window, the interface layer of the first application window can be displayed as the gesture dynamic effect of the first application window. The gesture dynamic effect is updated in real time along with the interface layer, so the display effect is more flexible.
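As an illustration of how such a layer handle might be used during the drag, the sketch below assumes the handle has already been obtained as a SurfaceControl object when the drag event started; the patent does not name a concrete API for obtaining it, so that step is left abstract, and the class and constructor are hypothetical. setGeometry is the public platform call that maps a layer into new on-screen bounds.

```java
import android.graphics.Rect;
import android.view.Surface;
import android.view.SurfaceControl;

/** Illustrative sketch only: controls an application window during a drag by
 *  operating on the handle (SurfaceControl) of its interface layer. Obtaining
 *  the handle is left abstract because the patent does not name a concrete call. */
public final class LayerDragHelper {

    private final SurfaceControl taskLayer; // handle of the window's interface layer
    private final Rect sourceCrop;          // original bounds of the layer content

    public LayerDragHelper(SurfaceControl taskLayer, Rect sourceCrop) {
        // The handle is acquired once, when the drag event starts, and reused afterwards.
        this.taskLayer = taskLayer;
        this.sourceCrop = sourceCrop;
    }

    /** Follow the drag: map the live layer into new on-screen bounds in one transaction. */
    public void onDragMoved(Rect newBoundsOnScreen) {
        new SurfaceControl.Transaction()
                .setGeometry(taskLayer, sourceCrop, newBoundsOnScreen, Surface.ROTATION_0)
                .apply(); // the compositor keeps showing the live content, so playback continues
    }
}
```

Because the transaction only re-maps the existing live layer, the compositor keeps compositing whatever the application draws into it, so the video keeps playing while the window follows the finger, matching the behaviour described above.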
In one embodiment, obtaining the handle of the interface layer of the first application window includes acquiring the handles of all interface layers that are visible in split screen mode; these visible interface layers include the interface layer of the first application window.
With reference to the first aspect, in one embodiment, the second area includes the third area (for example, the third area is hot zone 5), and the lift operation on the first application window is detected in the third area.
That is, if an operation of releasing the first application window is detected in the third area, the first application window and the second application window exchange display positions.
With reference to the first aspect, in one implementation, in response to the drag event on the first application window falling within a fourth area of the screen, the aspect ratio of the first application window changes and a blurred image is displayed in it. In one embodiment, in response to the drag event on the first application window falling within the fourth area, the aspect ratio of the second application window also changes and a blurred image is displayed in it.
For example, in response to the user's drag operation on the first application window, the display position of the first application window enters hot zone 4 or hot zone 6 and the shape of the application window changes greatly; the electronic device then displays a blurred image in the first application window instead of the playing interface. This prevents a deformed picture from being displayed and improves the user's viewing experience.
The blurred image may be either of the following: a preset blurred image; or an image obtained by blurring a first image, where the first image is the last frame of video played by the first application window before the drag event fell within the fourth area.
With reference to the first aspect, in one implementation, displaying the blurred image in the first application window by the electronic device includes: the electronic device acquires a wallpaper layer, which is used for drawing the blurred image, and generates the blurred image from the wallpaper layer.
That is, the blurred image is also generated from the interface layer.
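One possible way to produce such a blurred placeholder, sketched under the assumption that the last decoded video frame (or the window content) is shown in an ordinary View, is the platform RenderEffect blur available since Android 12; the patent itself draws the blur on a wallpaper layer, so this is only an analogous illustration.

```java
import android.graphics.RenderEffect;
import android.graphics.Shader;
import android.view.View;

/** Illustrative only: blur the content of a view (for example a view holding the
 *  last decoded video frame) while the window shape changes drastically. API 31+. */
public final class BlurHelper {

    /** Show the window content blurred instead of the normal playing interface. */
    public static void showBlurred(View windowContentView, float radiusPx) {
        windowContentView.setRenderEffect(
                RenderEffect.createBlurEffect(radiusPx, radiusPx, Shader.TileMode.CLAMP));
    }

    /** Restore the normal content once the window leaves the deformation hot zone. */
    public static void clearBlur(View windowContentView) {
        windowContentView.setRenderEffect(null);
    }
}
```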
With reference to the first aspect, in one implementation, in response to a lift event acting on the first application window falling within the fourth area, the first application window is displayed as a floating window and continues to play video content, and the second application window is displayed full screen.
In a second aspect, an electronic device is provided, which has the functionality to implement the method of the first aspect. The functions can be realized by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the functions described above.
In a third aspect, an electronic device is provided, comprising: a processor and a memory; the memory is configured to store computer-executable instructions that, when executed by the electronic device, cause the electronic device to perform the method of any of the first aspects.
In a fourth aspect, an electronic device is provided, including a processor; the processor is configured, after being coupled to a memory and reading the instructions in the memory, to perform the method of any of the first aspects according to those instructions.
In a fifth aspect, there is provided a computer readable storage medium having instructions stored therein which, when run on a computer, cause the computer to perform the method of any of the first aspects above.
In a sixth aspect, there is provided a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method of any of the first aspects above.
In a seventh aspect, an apparatus is provided (for example, the apparatus may be a chip system), including a processor configured to support an electronic device in implementing the functions referred to in the first aspect above. In one possible design, the apparatus further includes a memory for storing the program instructions and data necessary for the electronic device. When the apparatus is a chip system, it may consist of a chip, or it may include the chip and other discrete devices.
For the technical effects of any of the designs of the second to seventh aspects, reference may be made to the technical effects of the corresponding designs of the first aspect; they are not repeated here.
Drawings
Fig. 1 is a schematic view of a scenario to which the dynamic display method provided in the embodiment of the present application is applicable;
fig. 2A is a schematic diagram of another scenario to which the dynamic display method provided in the embodiment of the present application is applicable;
fig. 2B is a schematic view of another scenario to which the dynamic display method provided in the embodiment of the present application is applicable;
fig. 2C is a schematic view of another scenario to which the dynamic display method provided in the embodiment of the present application is applicable;
fig. 3 is a schematic diagram of an example of a scenario of a dynamic display method according to an embodiment of the present application;
fig. 4 is a schematic hardware structure of an electronic device according to an embodiment of the present application;
fig. 5 is a schematic software architecture of an electronic device according to an embodiment of the present application;
fig. 6 is an example schematic diagram of a screen hot zone division manner in the dynamic display method according to the embodiment of the present application;
fig. 7 is a schematic flowchart of the dynamic effect display method according to an embodiment of the present application;
fig. 8 is a schematic diagram of a scenario example of a dynamic display method according to an embodiment of the present application;
Fig. 9 is a schematic diagram of a scenario example of a dynamic display method according to an embodiment of the present application;
fig. 10 is a schematic diagram of an example of a scenario of a dynamic display method according to an embodiment of the present application;
fig. 11 is a schematic diagram of a scenario example of a dynamic display method according to an embodiment of the present application;
fig. 12 is a schematic diagram of a scenario example of a dynamic display method according to an embodiment of the present application;
fig. 13 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In the description of the embodiments of this application, the terminology used in the embodiments below is for the purpose of describing particular embodiments only and is not intended to limit this application. As used in the specification of this application and the appended claims, the singular forms "a," "an," and "the" are intended to include expressions such as "one or more," unless the context clearly indicates otherwise. It should also be understood that in the embodiments below, "at least one" and "one or more" mean one, two, or more than two. The term "and/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise. The term "coupled" includes both direct and indirect connections, unless stated otherwise. The terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated.
In the embodiments of the present application, words such as "exemplary" or "such as" are used to mean serving as examples, illustrations, or descriptions. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
When the electronic device displays application windows, the user can switch the display mode of an application window by dragging it. In one example, two application windows are displayed in split screen on the electronic device screen. The user can make the two application windows exchange their display positions on the screen through a gesture. For example, application window one is displayed on the left side of the screen and application window two on the right side. The user long-presses the top of application window one so that application window one is picked up; after it is picked up, its display position can be moved. The user then drags application window one to the right so that application window one and application window two exchange display positions. The user may also long-press the top of application window two so that application window two is picked up, and then drag application window two to the left so that application window one and application window two exchange display positions.
Illustratively, as shown in fig. 1 (a), the window 10 of the video application and the window 20 of the calculator application are displayed on the mobile phone screen in split screen, with the window 10 on the left side of the screen and the window 20 on the right side. The user long-presses the Bar 11 at the top of the window 10. As shown in fig. 1 (b), the mobile phone picks up the window 10 in response to the user's long-press gesture on the Bar 11 at the top of the window 10; the user keeps pressing the top Bar 11 and moves it to the right. As shown in fig. 1 (c), a rightward drag gesture of the user is received, and the window 10 moves along the drag trajectory. When the window 10 has moved to the right side of the screen, the user's lift gesture is detected. As shown in fig. 1 (d), the window 10 and the window 20 exchange display positions: the window 20 is displayed on the left side of the screen and the window 10 on the right side. For another example, application window one may be on the upper side of the screen and application window two on the lower side. The user long-presses the top of application window one so that it is picked up, and then drags it downward so that application window one and application window two exchange display positions. The user may also long-press the top of application window two so that it is picked up, and then drag it upward so that application window one and application window two exchange display positions.
In another example, application window one is displayed full screen on the electronic device screen and application window two is displayed in the form of a floating window. The user can make application window one and application window two be displayed in split screen through gestures. Illustratively, as shown in fig. 2A, the window 10 of the video application is displayed full screen on the mobile phone screen, and the window 20 of the calculator application is displayed as a floating window. The mobile phone picks up the window 20 in response to the user's long-press gesture on the Bar 21 at the top of the window 20; when the gesture of the user pressing the top Bar 21 and dragging toward the left of the screen is detected, the window 20 moves along the drag trajectory. When the window 20 has moved to the left side of the screen and the user's lift gesture is detected, the window 10 and the window 20 are displayed in split screen, with the window 20 on the left side of the screen and the window 10 on the right side.
In yet another example, the electronic device displays application window one in the form of a floating window and further displays application window two and application window three in split screen; in response to a user gesture, it switches to displaying application window one and application window two in split screen, or, in response to the user gesture, switches to displaying application window one and application window three in split screen.
In yet another example, application window one is displayed full screen on the electronic device screen. The user can switch the display mode of application window one through a drag gesture. In one example, referring to fig. 2B, the mobile phone displays the window 10 of the video application full screen, and the window 10 moves along the drag trajectory in response to a drag gesture applied to the window 10; when the user's lift operation is detected, the window 10 is displayed in mini-window mode. In another example, the mobile phone displays the window 10 of the video application full screen, and the window 10 moves along the drag trajectory in response to a drag gesture acting on the window 10; in response to the user's lift operation, the mobile phone displays the split-screen launcher. In another example, referring to fig. 2C, the mobile phone displays the window 10 of the video application full screen, and the window 10 moves along the drag trajectory in response to a drag gesture acting on the window 10; in response to the user's lift operation, the window 10 is displayed in split screen mode.
It should be noted that the dynamic effect display method provided in the embodiments of this application may be applied to any scenario in which an application window is dragged so that the display mode of the application window is switched. In the following, the scenario shown in fig. 1 is taken as an example to describe this application.
Take the scenario shown in fig. 1 as an example. The electronic device displays application window one and application window two in split screen, with application window one displayed in a first area of the screen and application window two displayed in a second area of the screen. In response to receiving a first gesture of the user, application window one is picked up. Illustratively, the first gesture is a long press on the top of the application window (such as the window top Bar).
After application window one is picked up, the user may move its display position through a second gesture, for example a gesture that drags the application window. In response to receiving the second gesture of the user, application window one moves its display position on the screen along the movement track of the second gesture.
In response to detecting a third gesture of the user in a preset area of the screen, application window one is displayed in the second area of the screen and application window two is displayed in the first area of the screen; that is, application window one and application window two exchange display positions. The third gesture is, for example, a gesture in which the user releases the application window (a lift gesture). In one implementation, when the electronic device detects the user's lift gesture, if it determines that the execution position of the lift gesture is within the preset area of the screen, application window one is displayed in the second area and application window two is displayed in the first area (i.e., the display positions of application window one and application window two are exchanged). If the execution position of the lift gesture is not within the preset area of the screen, the exchange of the display positions of application window one and application window two is canceled. Illustratively, as shown in fig. 1 (c), the window 10 moves its display position along the movement track of the user's second gesture (the gesture of dragging the window 10). When the top Bar 11 of the window 10 has moved into the top area of the window 20 and a lift gesture is detected, that is, the execution position of the lift gesture is within the top area of the window 20, the display positions of the window 10 and the window 20 are exchanged: the window 20 is displayed on the left side of the screen and the window 10 on the right side. If the execution position of the lift gesture is not within the top area of the window 20, the exchange of the display positions of the window 10 and the window 20 is canceled, and the window 10 is still displayed on the left side of the screen and the window 20 on the right side.
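A minimal sketch of the lift-gesture decision just described follows; the class and parameter names are illustrative assumptions, with the preset area passed in as a Rect (for example the top area of the passive window).

```java
import android.graphics.Rect;

/** Illustrative only: decide, when the lift gesture is detected, whether the two
 *  split-screen windows exchange display positions or the drag is cancelled. */
public final class SwapDecision {

    /**
     * @param liftX      x position where the lift gesture was performed
     * @param liftY      y position where the lift gesture was performed
     * @param presetArea preset screen area (e.g. the top area of the passive window)
     * @return true if the windows should exchange display positions, false to cancel
     */
    public static boolean shouldSwap(float liftX, float liftY, Rect presetArea) {
        return presetArea.contains((int) liftX, (int) liftY);
    }
}
```

If this returns false, the dragged window simply returns to its original area, as in the example above.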
When the electronic device displays a plurality of application windows, the application windows are switched in response to a user gesture in one of them, and during the switching a dynamic effect is displayed in each application window. In the embodiments of this application, this dynamic effect is referred to as the gesture dynamic effect. For example, when the user's first gesture (e.g., a long press on the top of the application window) is detected, display of the gesture dynamic effect starts; when the third gesture (e.g., a lift gesture) is detected, display of the gesture dynamic effect stops.
In one embodiment, the native Android mechanism is used: it operates on a view (View) and generates the gesture dynamic effect from a screenshot of the application interface.
In one implementation, after the first gesture is received, the interface of the application window is captured once, and the gesture dynamic effect of the application is generated from this single screenshot image. Taking the scenario shown in fig. 1 as an example, an interface 101 is displayed in the window 10 of the video application, and the mobile phone detects the user's first gesture (a long press on the top of the window 10). The mobile phone performs screenshot processing on the interface 101 and generates the gesture dynamic effect of the window 10 from the screenshot image of the interface 101. While the window 10 moves from the left side of the screen to the right side, the gesture dynamic effect of the window 10 is the screenshot image of the interface 101. That is, during the movement of the window 10 from the left side of the screen to the right side, the playing interface of the video application is not presented to the user, because it is covered by the screenshot image, which is a static image. In this implementation the gesture dynamic effect is static, and its display effect is not flexible. Moreover, screenshot processing takes a certain amount of time: after the user makes the first gesture, the electronic device may pause for a period of time and move the application window only after the screenshot image has been obtained and the gesture dynamic effect has been generated from it. The electronic device thus responds slowly to the user's gesture, and the user experience is poor.
In another implementation, after the first gesture is received, the interface of the application window can be captured multiple times, and a dynamic gesture dynamic effect is generated from the multiple screenshot images. In this implementation, if the screenshot frequency is high, for example a screenshot is taken of every frame of the video application's interface, the generated gesture dynamic effect is the same as the playing interface of the video application, so the user can watch a dynamic playing interface while the application window of the video application moves. However, frequent screenshot processing brings high power consumption; executing a large number of screenshots also places higher demands on the performance of the image processor and puts a heavy burden on it, to the point where it may be unable to process the images in time. If the screenshot frequency is low, the display effect of the gesture dynamic effect suffers: for example, while the application window of the video application is moving, the video playing interface watched by the user drops frames and stutters, so the real-time performance is poor and the user experience is poor.
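For contrast with the layer-based approach described next, a single capture in the screenshot-based scheme could look roughly like the sketch below, which uses the platform PixelCopy API; the surrounding class and callback are illustrative assumptions. Repeating such a capture for every frame is what produces the power consumption and image-processor load described above.

```java
import android.app.Activity;
import android.graphics.Bitmap;
import android.os.Handler;
import android.os.Looper;
import android.view.PixelCopy;
import android.view.View;

/** Illustrative only: the screenshot-based scheme captures the window content into a
 *  Bitmap and animates that static image; the gesture effect then shows this bitmap. */
public final class WindowSnapshot {

    public interface Callback {
        void onSnapshot(Bitmap bitmap);
    }

    public static void capture(Activity activity, Callback callback) {
        View decor = activity.getWindow().getDecorView();
        Bitmap bitmap = Bitmap.createBitmap(
                decor.getWidth(), decor.getHeight(), Bitmap.Config.ARGB_8888);
        PixelCopy.request(activity.getWindow(), bitmap, copyResult -> {
            if (copyResult == PixelCopy.SUCCESS) {
                callback.onSnapshot(bitmap); // this static image is what the old scheme displays
            }
        }, new Handler(Looper.getMainLooper()));
    }
}
```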
The embodiments of this application provide a dynamic effect display method that requires no screenshot of the application window's interface while the application windows are being switched; instead, the gesture dynamic effect of the application window is generated from the interface layer of the application window. The interface layer is the interface image generated for display in the application window by the image compositor (SurfaceFlinger) of the electronic device. Because the gesture dynamic effect is generated from the interface layer, it can be updated with the actual content of the interface layer, and the display effect is more flexible. Take again the scenario in which application one is a video application, application two is a calculator application, and application window one and application window two are displayed in split screen. Illustratively, as shown in fig. 3, the window 10 of the video application and the window 20 of the calculator application are displayed on the mobile phone screen in split screen, with the window 10 on the left side of the screen and the window 20 on the right side. The mobile phone receives the user's long-press gesture on the Bar 11 at the top of the window 10 and picks up the window 10; upon receiving the user's operation of holding the top Bar 11 and dragging to the right, the window 10 moves along the user's drag trajectory. During the movement of the window 10, the gesture dynamic effect of the window 10 is the dynamic interface of the video application rather than a static screenshot image. When the window 10 has moved to the right side of the screen and the user's lift gesture is detected, the window 10 is displayed on the right side of the screen, the window 20 is displayed on the left side, the gesture dynamic effect stops, and the window 10 continues to play the interface of the video application. While the display positions of the window 10 and the window 20 are being exchanged, the window 10 displays a dynamic, continuous video application interface, so the display effect of the gesture dynamic effect is more flexible. Moreover, generating the dynamic gesture dynamic effect does not require a large amount of screenshot processing, which avoids heavy consumption of image processor resources.
In a scenario example given in the embodiment of the present application, after detecting a gesture of a user pressing the top of a window for a long time, the electronic device picks up an application window. After the application window is picked up, a gesture for dragging the application window is detected, and the display position of the application window moves along with the dragging track.
It should be noted that, in other embodiments, the electronic device detects a gesture of dragging the application window without picking up the application window by pressing the top of the window for a long time, and the display position of the application window moves along with the dragging track.
According to the dynamic effect display method, the gesture dynamic effect is displayed in the process that the display position of the application window moves along with the dragging track, and the gesture dynamic effect is generated according to the interface layer of the application window, so that the dynamic playing interface of the application can be displayed.
The dynamic effect display method provided by the embodiment of the application can be applied to the electronic equipment supporting multi-window display. The electronic device may include a mobile phone, a tablet computer, a notebook computer, a personal computer (personal computer, PC), an ultra-mobile personal computer (ultra-mobile personal computer, UMPC), a handheld computer, a netbook, an intelligent home device (such as an intelligent television, a smart screen, a large screen, an intelligent sound box, an intelligent air conditioner, etc.), a personal digital assistant (personal digital assistant, PDA), a wearable device (such as an intelligent watch, an intelligent bracelet, etc.), a vehicle-mounted device, a virtual reality device, etc., which is not limited in this embodiment.
Fig. 4 is a schematic structural diagram of the electronic device. Wherein the electronic device 100 may include: processor 110, external memory interface 120, internal memory 121, universal serial bus (universal serial bus, USB) interface 130, charge management module 140, power management module 141, battery 142, antenna 1, antenna 2, mobile communication module 150, wireless communication module 160, audio module 170, sensor module 180, keys 190, motor 191, indicator 192, camera 193, display 194, and subscriber identity module (subscriber identification module, SIM) card interface 195, among others. Wherein the sensor module 180 may include a pressure sensor, a fingerprint sensor, a temperature sensor, a touch sensor, etc.
It is to be understood that the structure illustrated in the present embodiment does not constitute a specific limitation on the electronic apparatus 100. In other embodiments, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and command center of the electronic device 100. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
It should be understood that the connection relationship between the modules illustrated in this embodiment is only illustrative, and does not limit the structure of the electronic device. In other embodiments, the electronic device may also use different interfacing manners in the foregoing embodiments, or a combination of multiple interfacing manners.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (OLED), an active-matrix organic light emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, the electronic device may include 1 or N cameras 193, N being a positive integer greater than 1. In the present embodiment, the camera 193 may be used to capture video images.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, audio, video, etc. files are stored in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 performs the various functional applications and data processing of the electronic device by executing the instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store the operating system and the application programs required for at least one function (such as a sound playing function and an image playing function). The data storage area may store data created during use of the electronic device (such as video files), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS).
The pressure sensor is used to sense a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor may be provided on the display screen 194. There are many kinds of pressure sensors, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates of conductive material. When a force is applied to the pressure sensor, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation through the pressure sensor. The electronic device 100 may also calculate the position of the touch based on the detection signal of the pressure sensor.
Touch sensors, also known as "touch panels". The touch sensor may be disposed on the display screen 194, and the touch sensor and the display screen 194 form a touch screen, which is also referred to as a "touch screen". The touch sensor is used to detect a touch operation acting on or near it. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194. In other embodiments, the touch sensor may also be disposed on a surface of the electronic device 100 at a different location than the display 194.
The electronic apparatus 100 may detect a pressing operation, a dragging operation, etc. of the user on the display screen 194 through a pressure sensor or a touch sensor, and thus may detect whether a multi-window switching gesture of the user is received.
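A minimal sketch of how the press, drag, and lift operations could be classified from raw touch events into the first, second, and third gestures used later in this description; the class, listener, and timing choice are illustrative assumptions (a production detector would schedule the long-press timeout with a Handler rather than folding it into the move handling as done here for brevity).

```java
import android.view.MotionEvent;
import android.view.ViewConfiguration;

/** Illustrative only: classify touch events on the window top bar into the first
 *  gesture (long press), the second gesture (drag), and the third gesture (lift). */
public final class WindowGestureDetector {

    public interface Listener {
        void onWindowPickedUp();                  // first gesture: long press on the top bar
        void onWindowDragged(float x, float y);   // second gesture: window follows the finger
        void onWindowReleased(float x, float y);  // third gesture: lift, triggers hot-zone handling
    }

    private final Listener listener;
    private long downTime;
    private boolean pickedUp;

    public WindowGestureDetector(Listener listener) {
        this.listener = listener;
    }

    public boolean onTouchEvent(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                downTime = event.getEventTime();
                pickedUp = false;
                return true;
            case MotionEvent.ACTION_MOVE:
                if (!pickedUp && event.getEventTime() - downTime
                        >= ViewConfiguration.getLongPressTimeout()) {
                    pickedUp = true;              // window is "picked up" after the long press
                    listener.onWindowPickedUp();
                }
                if (pickedUp) {
                    listener.onWindowDragged(event.getRawX(), event.getRawY());
                }
                return true;
            case MotionEvent.ACTION_UP:
                if (pickedUp) {
                    listener.onWindowReleased(event.getRawX(), event.getRawY());
                }
                return true;
            default:
                return false;
        }
    }
}
```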
In the embodiments of this application, the electronic device 100 is an electronic device that can run an operating system and install application programs. The operating system run by the electronic device 100 may be, for example, the Android system or another operating system. In some embodiments, the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, or a cloud architecture. The embodiments of this application take the Android system with a layered architecture as an example to illustrate the software architecture of the electronic device 100.
Fig. 5 is a software structure diagram of the electronic device 100 according to the embodiment of the present application.
It will be appreciated that the layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system may include an application (App) layer, an application framework (FWK) layer, an Android runtime and system library layer, and a kernel layer.
The application layer may include a series of application packages. Such as cameras, gallery, calendar, talk, map, navigation, bluetooth, music, video, short message, etc.
A third-party application includes a multi-window gesture entry, which processes the received user gesture and triggers generation of the gesture dynamic effect.
The system user interface (System UI) application includes a gesture management module, a gesture hot zone module, a gesture dynamic effect module, and the like. The gesture management module manages the processes involved in generating the gesture dynamic effect; the gesture hot zone module generates and manages the screen hot zones in the different processes. The screen hot zones are preset areas in the screen; the specific division of the screen hot zones and the corresponding processing flows are described in detail in the following embodiments. The gesture dynamic effect module provides the function of generating the gesture dynamic effect.
As shown in fig. 5, the application framework layer may include a window manager (window manager service, WMS), a content provider, a view system, a telephony manager, a resource manager, a notification manager, an activity manager (activity manager service, AMS), a gesture map management module, a multi-window management module, and the like.
The window manager (WMS) is used to manage window programs. It can acquire the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, and so on. It may also manage information about each application window interface, such as the interface layer information of the application window.
The activity manager (AMS) is responsible for managing activities, for the starting, switching, and scheduling of the components in the system, and for the management and scheduling of applications.
The event distribution module is used for distributing events to each module.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is for providing communication functions of the electronic device. Such as the management of call status (including on, hung-up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows the application to display notification information in a status bar, can be used to communicate notification type messages, can automatically disappear after a short dwell, and does not require user interaction. Such as notification manager is used to inform that the download is complete, message alerts, etc. The notification manager may also be a notification in the form of a chart or scroll bar text that appears on the system top status bar, such as a notification of a background running application, or a notification that appears on the screen in the form of a dialog window. For example, a text message is prompted in a status bar, a prompt tone is emitted, the electronic device vibrates, and an indicator light blinks, etc.
The gesture layer management module is used for acquiring and managing the interface layer information used to generate the gesture dynamic effect.
The multi-window management module is used for providing a function of displaying a plurality of application windows on a screen and managing the plurality of application windows. For example, a window mode (split screen mode, floating window mode, etc.) of a plurality of application windows is managed, and a display size, a position, etc. of the plurality of application windows are managed.
In the embodiments of this application, the gesture management module (gesture management class), the gesture hot zone module (gesture hot zone class), and the gesture dynamic effect module (gesture dynamic effect class) are newly added in the System UI, and the gesture layer management module (gesture layer management class) is newly added in the application framework layer, so that the interface layer of the application can be obtained and the gesture dynamic effect of the application window can be generated from it.
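The patent names these classes but not their interfaces; the skeleton below is only an illustrative guess at how the responsibilities described above could be divided, with every method signature assumed for the sketch.

```java
import android.graphics.Rect;
import android.view.SurfaceControl;

/** Illustrative skeletons of the newly added classes; the names follow the patent,
 *  but every method signature here is an assumption made for this sketch. */
final class GestureModules {

    /** System UI: coordinates the overall gesture dynamic effect flow. */
    static class GestureManager {
        void onDragStart(int taskId) { /* ask GestureLayerManager for the layer handle */ }
        void onDragMove(float x, float y) { /* drive GestureEffect with the new position */ }
        void onDragEnd(float x, float y) { /* ask GestureHotZones which action to perform */ }
    }

    /** System UI: generates and manages the screen hot zones for the current window mode. */
    static class GestureHotZones {
        Rect[] computeHotZones(Rect multiWindowArea) { return new Rect[0]; /* per Fig. 6 */ }
    }

    /** System UI: produces the gesture dynamic effect from the interface layer, without screenshots. */
    static class GestureEffect {
        void play(SurfaceControl layerHandle, Rect bounds) { /* apply layer transactions */ }
    }

    /** Application framework layer: obtains and manages the interface layer information. */
    static class GestureLayerManager {
        SurfaceControl layerHandleForTask(int taskId) { return null; /* framework internal */ }
    }
}
```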
The Android runtime includes a core library and a virtual machine. The Android runtime is responsible for the scheduling and management of the Android system.
The core library consists of two parts: one part contains the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional (3D) graphics processing Libraries (e.g., openGL ES), two-dimensional (2D) graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
The kernel layer is a layer between hardware and software. The kernel layer may contain display drivers, camera drivers, audio drivers, sensor drivers, etc.
The following describes in detail the dynamic effect display method provided in the embodiments of this application with reference to the accompanying drawings. It should be noted that the embodiments of this application are described by taking as an example a scenario in which the electronic device displays two application windows and exchanges their display positions in response to a user gesture in one of them. The dynamic effect display method provided in the embodiments of this application is also applicable to other multi-application-window switching scenarios, for example a scenario in which an application window is switched from floating window mode to split screen mode, or from split screen mode to floating window mode. The method is likewise applicable to other scenarios in which the display mode of an application window is switched, for example switching an application window from full screen mode to floating window mode, or from full screen mode to split screen mode, and so on.
The electronic device displays application window one and application window two. Application window one and application window two may be application windows of two different applications; for example, application window one is an application window of application one, and application window two is an application window of application two. Alternatively, application window one and application window two may be two different application windows of the same application.
Application window one is displayed in a first area of the screen, and application window two is displayed in a second area of the screen. In some embodiments, the electronic device screen displays application window one and application window two in split screen (split screen mode). In some examples, application window one is displayed on the left side of the screen and application window two on the right side. In other examples, application window one is displayed on the upper side of the screen and application window two on the lower side. Taking the scene shown in fig. 3 as an example, the mobile phone displays application window one and application window two in split screen, where application window one is the window 10 of a video application and application window two is the window 20 of a calculator application; the window 10 is displayed on the left side of the screen and the window 20 on the right side.
In some embodiments, an area on the screen of the electronic device for displaying a plurality of application windows may be divided into a plurality of preset areas. In the embodiment of the present application, a preset area is referred to as a screen hot zone. In different window modes, the screen hot zones may be divided differently. For example, in the split screen mode, the screen area used for displaying the application windows is divided into 6 screen hot zones; in the floating window mode, the screen area occupied by the application window displayed full screen is divided into 4 screen hot zones. The mobile phone can execute a corresponding flow when it detects a user gesture in a different screen hot zone.
Illustratively, still taking the scenario shown in FIG. 3 as an example, the mobile phone displays the window 10 of the video application and the window 20 of the calculator application on a split screen, with the window 10 on the left side of the screen and the window 20 on the right side of the screen. As shown in fig. 6, the screen area occupied by the window 10 and the window 20 is denoted as screen area 200. The user may drag the window 10 or the window 20 to swap the display positions of the window 10 and the window 20. The application window dragged by the user is referred to as the active window, and the application window whose position moves as the active window moves is referred to as the passive window. Taking the window 10 as the active window as an example, the screen area 200 is divided into 6 screen hot zones, namely hot zone 1, hot zone 2, hot zone 3, hot zone 4, hot zone 5, and hot zone 6.
The user performs a first gesture on application window one (window 10), illustratively a long-press gesture on the top of the application window (such as the window's top Bar). The mobile phone detects the user's long-press gesture in hot zone 1 and determines that the first gesture of the user is detected. In response to receiving the first gesture on application window one (window 10), application window one (window 10) is picked up. Illustratively, referring to FIG. 3, the mobile phone receives the user's gesture of long-pressing Bar 11 at the top of the window 10, and picks up the window 10.
Once application window one (window 10) is picked up, the user may drag application window one (window 10) with a second gesture. The second gesture is, for example, a gesture of dragging the application window. In response to receiving the second gesture of the user, application window one (window 10) moves its display position on the screen along with the movement track of the second gesture. Illustratively, referring to FIG. 3, in response to the user's gesture of pressing the top Bar 11 of the window 10 and dragging, the window 10 moves with the drag trajectory. The movement track of the second gesture may pass through any of the screen hot zones.
The user may perform a third gesture (e.g., a lift gesture) anywhere within the screen. The mobile phone detects the third gesture in different screen hot zones and executes the corresponding action for each. In one example, if the third gesture is detected in hot zone 5, the positions of the window 10 and the window 20 are swapped: the window 10 is displayed on the right side of the screen and the window 20 on the left side. In one example, if the third gesture is detected in hot zone 2, the positions of the window 10 and the window 20 are not swapped: the window 10 is still displayed on the left side of the screen and the window 20 on the right side. In one example, if the third gesture is detected in hot zone 3, the window 10 is closed and a split screen management application is launched. In one example, if the third gesture is detected in hot zone 4, the window 10 is displayed as a floating window and the window 20 is displayed full screen. In one example, if the third gesture is detected in hot zone 6, the positions of the window 10 and the window 20 are swapped: the window 10 is displayed on the right side of the screen and the window 20 on the left side.
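This zone-to-action mapping can be illustrated with a minimal sketch, assuming the hot zones are modeled as a simple enum and the window operations are placeholder methods; the class, method and type names here are hypothetical and are not the actual module interfaces of the embodiment.

    interface Window { }

    enum HotZone { ZONE_1, ZONE_2, ZONE_3, ZONE_4, ZONE_5, ZONE_6 }

    final class LiftGestureDispatcher {
        // Dispatches the third gesture (lift) according to the hot zone it lands in,
        // following the behaviors listed for the left-right split-screen scenario.
        void onLiftGesture(HotZone zone, Window active, Window passive) {
            switch (zone) {
                case ZONE_5:
                case ZONE_6:
                    swapPositions(active, passive);            // window 10 moves to the right, window 20 to the left
                    break;
                case ZONE_2:
                    restoreOriginalPositions(active, passive); // positions are kept unchanged
                    break;
                case ZONE_3:
                    close(active);                             // window 10 is closed
                    launchSplitScreenManager();
                    break;
                case ZONE_4:
                    showAsFloatingWindow(active);              // window 10 becomes a floating window
                    showFullScreen(passive);                   // window 20 is displayed full screen
                    break;
                default:
                    restoreOriginalPositions(active, passive); // behavior for other zones is not detailed above
            }
        }
        // Placeholder operations; in the embodiment these are carried out by the
        // system UI together with the window manager service.
        void swapPositions(Window a, Window p) { }
        void restoreOriginalPositions(Window a, Window p) { }
        void close(Window a) { }
        void launchSplitScreenManager() { }
        void showAsFloatingWindow(Window a) { }
        void showFullScreen(Window p) { }
    }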
In some embodiments, the application window displays a gesture dynamic effect as its position moves with the second gesture. The electronic device generates the gesture dynamic effect of each application window according to the interface layer of that application window.
Illustratively, with reference to the electronic device software architecture shown in fig. 5, fig. 7 shows a schematic flowchart of the dynamic effect display method provided in the embodiment of the present application.
As shown in fig. 7, the method for displaying dynamic effects provided in the embodiment of the present application may include:
S701, a multi-window gesture entry module of the third-party application receives a gesture event.
A sensor of the electronic device, such as a touch sensor, detects the user's operations on the display screen (touch screen) at a preset frequency and generates corresponding gesture events according to the detected operations. For example, when a long-press operation is detected on the top Bar of the application window, a first event (long-press event) is generated; the first event includes the position where the long-press operation is detected. When an operation of pressing the application window (such as the top Bar of the application window) and moving it several times with a moving distance greater than a threshold value is detected, that is, when a drag operation on the application window is detected, a second event (drag event) is generated; the second event includes the action position of the drag operation. When a lift operation is detected on the application window, a third event (lift event) is generated; the third event includes the position where the lift operation is detected.
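A minimal sketch of this classification logic is given below; the class name, the thresholds, and the string-based event labels are illustrative assumptions rather than the device's actual sensor interface.

    final class GestureEventClassifier {
        private static final long LONG_PRESS_MS = 500;        // assumed long-press duration
        private static final float DRAG_THRESHOLD_PX = 24f;   // assumed movement threshold

        private float downX, downY;
        private long downTime;
        private boolean dragging;

        // Called for every sample reported by the touch sensor at the preset frequency.
        String onTouchSample(String action, float x, float y, long timeMs) {
            switch (action) {
                case "DOWN":
                    downX = x; downY = y; downTime = timeMs; dragging = false;
                    return "NONE";
                case "MOVE":
                    float dist = (float) Math.hypot(x - downX, y - downY);
                    if (dragging || dist > DRAG_THRESHOLD_PX) {
                        dragging = true;
                        return "DRAG_EVENT";          // second event: carries the action position (x, y)
                    }
                    if (timeMs - downTime >= LONG_PRESS_MS) {
                        return "LONG_PRESS_EVENT";    // first event: carries the long-press position
                    }
                    return "NONE";
                case "UP":
                    return "LIFT_EVENT";              // third event: carries the lift position
                default:
                    return "NONE";
            }
        }
    }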
The multi-window gesture entry module of the third-party application receives a gesture event (a long-press event, a drag event, a lift event, etc.) and thereby determines that a user gesture has been received.
Illustratively, still taking the scenario shown in FIG. 3 as an example, the third-party application is a video application, and the video application includes a multi-window gesture entry module that receives gesture events. For example, a long-press event is received (the position of the long-press operation is the screen area corresponding to the top Bar of the application window); or a drag event of the application window is received; or a lift event on the application window is received.
S702, in response to a drag event of the application window, the multi-window gesture entry module of the third-party application sends a first message to a gesture management module of the system UI (System UI).
The electronic device detects the user's gestures on the display screen (touch screen) at a preset frequency; it repeatedly detects the user's operation of dragging the application window, and the application window moves its display position along with the movement track of the drag operation.
In one implementation, the multi-window gesture entry module determines, based on the gesture event, whether the operation of dragging the application window is detected for the first time. Illustratively, still taking the scenario shown in FIG. 3 as an example, the window 10 of the video application and the window 20 of the calculator application are displayed on the mobile phone screen in a split screen mode, with the window 10 on the left side of the screen and the window 20 on the right side; for example, the window 10 is displayed in the first area and the window 20 in the second area. The user can move the display position of the window 10 through a drag operation. If it is determined from the drag event that the start position of the drag operation is in the first area, it is determined that the gesture of dragging the window 10 is detected for the first time.
In one implementation, after the event of dragging the application window is detected for the first time, the multi-window gesture entry module of the third-party application sends a first message to the gesture management module of the system UI, where the first message is used to trigger preparation of the gesture dynamic effect. In one implementation, the first message includes a task identification (TaskId), gesture information, a window mode, mode information, and the like.
The TaskId is used to uniquely identify the application window (such as the window 10) that received the user gesture.
The gesture information includes the gesture type of the currently received gesture event, the position (coordinates on the screen) at which the user operation is detected, and the like. For example, if the gesture type is a long-press operation, the position where the long-press operation is detected is coordinate 1 on the screen; if the gesture type is a drag operation, the action position of the drag operation is coordinate 2 on the screen.
The window mode indicates the display mode of the application window that received the gesture event, and includes the split screen mode, the floating window mode, and the like.
Different window modes may correspond to different mode information. In one example, the mode information corresponding to the split screen mode includes up-down split screen, left-right split screen, and the like; the mode information corresponding to the floating window mode is used to indicate the display mode of the background application, such as full screen display, desktop display, up-down split screen display, or left-right split screen display.
Illustratively, the structure of the first message is exemplified as follows:
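A minimal sketch of such a message, with the field names taken from the description below and the class name and field types assumed for illustration:

    final class GesturePreparationMessage {   // the "first message"
        Object ev;               // gesture information: gesture type and detected screen coordinates
        int taskId;              // uniquely identifies the application window that received the gesture
        int windowMode;          // e.g. split screen mode or floating window mode
        int primaryScreenType;   // mode information, e.g. left-right split or up-down split
    }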
where ev represents the gesture information, taskId represents the task identification, windowMode represents the window mode, and primaryScreenType represents the mode information.
S703, the gesture management module of the system UI sends a second message to the gesture hot zone module of the system UI.
In one implementation, the gesture management module sends a second message to the gesture hot zone module, where the second message is used to trigger creation of the screen hot zones. In one implementation, the second message includes the TaskId, the gesture information, the window mode, the mode information, and the like. The gesture management module acquires the TaskId, gesture information, window mode, mode information, and the like from the first message and sends them to the gesture hot zone module in the second message.
S704, the gesture hot zone module of the system UI creates the screen hot zones according to the window mode and the mode information.
In different window modes, the screen hot zones may be divided differently.
In one example, the window mode is the split screen mode and the mode information is left-right split screen, so the screen hot zones shown in fig. 6 are created. The screen hot zones include hot zone 1, hot zone 2, hot zone 3, hot zone 4, hot zone 5, and hot zone 6. It can be appreciated that other window modes or mode information have their own corresponding division of screen hot zones.
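A sketch of how the gesture hot zone module might create the zone division for the left-right split-screen case is shown below; the rectangle proportions are assumptions made for illustration, since the actual division of fig. 6 is not reproduced here, and the class and mode names are likewise hypothetical.

    import java.util.ArrayList;
    import java.util.List;

    final class HotZoneFactory {
        static final class Rect {
            final int left, top, right, bottom;
            Rect(int l, int t, int r, int b) { left = l; top = t; right = r; bottom = b; }
        }

        // Creates the screen hot zones according to the window mode and the mode information.
        List<Rect> createHotZones(String windowMode, String modeInfo, int screenW, int screenH) {
            List<Rect> zones = new ArrayList<>();
            if ("SPLIT_SCREEN".equals(windowMode) && "LEFT_RIGHT".equals(modeInfo)) {
                // Six zones for the left-right split-screen case (proportions assumed).
                int half = screenW / 2;
                int topBand = screenH / 8;
                zones.add(new Rect(0, 0, half, topBand));                 // hot zone 1: top of the left window
                zones.add(new Rect(0, topBand, half, screenH / 2));       // hot zone 2
                zones.add(new Rect(0, screenH / 2, half, screenH));       // hot zone 3
                zones.add(new Rect(half, 0, screenW, topBand));           // hot zone 4
                zones.add(new Rect(half, topBand, screenW, screenH / 2)); // hot zone 5
                zones.add(new Rect(half, screenH / 2, screenW, screenH)); // hot zone 6
            }
            // Other window modes / mode information have their own divisions
            // (e.g. four zones in the floating window mode), omitted here.
            return zones;
        }
    }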
S705, the gesture hot zone module of the system UI sends a third message to the gesture dynamic effect module of the system UI.
In one implementation, the gesture hot zone module sends a third message to the gesture dynamic effect module, where the third message is used to trigger creation of the gesture dynamic effect. In one implementation, the third message includes the TaskId, the gesture information, the window mode, the mode information, and the like. The gesture hot zone module acquires the TaskId, gesture information, window mode, mode information, and the like from the second message and sends them to the gesture dynamic effect module in the third message.
S706, the gesture dynamic effect module of the system UI requests the gesture layer management module of the framework layer to acquire the interface layer information.
For example, the interface layer information is a handle of the interface layer (Surface) corresponding to the Task of the application window. The handle uniquely points to one interface layer. From the handle of the interface layer (Surface) corresponding to the Task of the application window, the handle of the uppermost layer of the application window's interface layer tree can be obtained, and thereby all information of the interface layer tree, such as the interface layer levels included in the tree and the display range, display size, and display position of the interface layer at each level. The display position, display size, and the like of all interface layers of the application window can be adjusted through the handle of the interface layer (Surface) corresponding to the Task of the application window.
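The relationship between the handle and the interface layer tree can be sketched as follows; the node and handle structures are assumptions used only to illustrate that, given the top-level handle of the task's Surface, every level of the tree (and its display position and size) becomes reachable and adjustable.

    import java.util.ArrayList;
    import java.util.List;

    final class LayerNode {                  // one level of the interface layer tree
        final String name;
        float x, y;                          // display position
        float width, height;                 // display size
        final List<LayerNode> children = new ArrayList<>();
        LayerNode(String name) { this.name = name; }
    }

    final class LayerHandle {                // handle of the Surface corresponding to the window's Task
        private final LayerNode root;        // uppermost layer of the interface layer tree
        LayerHandle(LayerNode root) { this.root = root; }

        // With the handle, every level of the tree can be visited ...
        void forEach(java.util.function.Consumer<LayerNode> visitor) {
            visit(root, visitor);
        }
        private void visit(LayerNode n, java.util.function.Consumer<LayerNode> visitor) {
            visitor.accept(n);
            for (LayerNode c : n.children) visit(c, visitor);
        }

        // ... and the display position of all layers can be adjusted through the root.
        void moveBy(float dx, float dy) {
            root.x += dx;
            root.y += dy;   // children follow because they are positioned relative to the root
        }
    }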
S707, the gesture layer management module of the framework layer collects the interface layer information of each application window.
In one implementation, the WMS manages an interface layer for each application window on the screen of the electronic device. The gesture layer management module acquires the interface layer information of each application window from the WMS.
In one example, the gesture layer management module traverses the interface layer trees of the application windows and collects all visible full-screen-mode interface layer trees. It also traverses the interface layer trees and collects all visible split-screen-mode interface layer trees; for example, if the screen is split left and right, it collects the interface layer (Surface) corresponding to the Task of the left application window and the interface layer (Surface) corresponding to the Task of the right application window; if the screen is split up and down, it collects the interface layer corresponding to the Task of the upper application window and the interface layer corresponding to the Task of the lower application window. The gesture layer management module likewise traverses the interface layer trees and collects all visible floating-window-mode interface layer trees.
Optionally, the gesture layer management module further traverses the interface layer trees of the application windows, collects all visible picture-in-picture-mode interface layer trees, and hides all picture-in-picture-mode interface layers. In this way, the picture-in-picture interface does not appear while the gesture dynamic effect is played, which preserves the integrity of the gesture dynamic effect interface.
Optionally, the gesture layer management module further creates a wallpaper layer on which a frame of image is drawn. In one example, the image may be presented as the background of the application window after the application window is reduced in size. For example, as shown in fig. 3, after the window 10 is displayed at a reduced size, an image 30 is displayed on the screen as the background of the window 10. In one example, the image may be blurred to generate a blurred image for presentation in the gesture dynamic effect.
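A sketch of this collection step is given below, under the assumption of a simple result structure whose fields may be left null when the corresponding window mode is not present; the WMS query methods and the wallpaper creation call are hypothetical placeholders, not the actual framework interface.

    import java.util.List;

    final class InterfaceLayerInfo {
        Object fullScreenLayer;          // handle of the full-screen-mode interface layer, or null
        Object leftOrTopSplitLayer;      // handle of the left / upper split-screen window's layer, or null
        Object rightOrBottomSplitLayer;  // handle of the right / lower split-screen window's layer, or null
        List<Object> floatingLayers;     // handles of visible floating-window layers
        List<Object> pipLayers;          // handles of visible picture-in-picture layers (hidden during the effect)
        Object wallpaperLayer;           // optional wallpaper layer used as background / blurred image
    }

    interface WindowManagerFacade {      // hypothetical facade over the WMS
        Object findVisibleFullScreenLayer();
        Object findVisibleSplitLayer(boolean primary);
        List<Object> findVisibleFloatingLayers();
        List<Object> findVisiblePipLayers();
        void hideLayer(Object layer);
        Object createWallpaperLayer();
    }

    final class GestureLayerCollector {
        // Traverses the visible interface layer trees managed by the window manager
        // service and fills in the info structure; absent modes remain null.
        InterfaceLayerInfo collect(WindowManagerFacade wms) {
            InterfaceLayerInfo info = new InterfaceLayerInfo();
            info.fullScreenLayer = wms.findVisibleFullScreenLayer();
            info.leftOrTopSplitLayer = wms.findVisibleSplitLayer(true);
            info.rightOrBottomSplitLayer = wms.findVisibleSplitLayer(false);
            info.floatingLayers = wms.findVisibleFloatingLayers();
            info.pipLayers = wms.findVisiblePipLayers();
            for (Object pip : info.pipLayers) {
                wms.hideLayer(pip);       // PiP layers are hidden so they do not appear in the effect
            }
            info.wallpaperLayer = wms.createWallpaperLayer(); // one frame used as background / blurred image
            return info;
        }
    }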
S708, the gesture layer management module of the framework layer returns the collected interface layer information of each application window to the gesture dynamic effect module of the system UI.
In one implementation, the gesture layer management module sends the interface layer information to the gesture dynamic effect module. The interface layer information includes one or more of a handle of the full-screen-mode interface layer, handles of the interface layers in the split screen mode, a handle of the floating-window-mode interface layer, a handle of the picture-in-picture-mode interface layer, and a handle of the wallpaper layer.
It is to be appreciated that one or more items of the interface layer information can be null. For example, when the current window mode is the split screen mode, no full-screen-mode interface layer exists, and the handle of the full-screen-mode interface layer is null. Taking the scene shown in fig. 3 as an example, the window mode is the split screen mode, with the window 10 displayed on the left side of the screen and the window 20 on the right side. The gesture layer management module obtains from the WMS the handle of the interface layer corresponding to the window 10 and the handle of the interface layer corresponding to the window 20; the handle of the full-screen-mode interface layer is null, the handle of the floating-window-mode interface layer is null, and the handle of the picture-in-picture-mode interface layer is null.
S709, the gesture dynamic effect module of the system UI generates gesture dynamic effects according to the interface layer information.
In one implementation, the gesture dynamic effect module obtains, based on the window mode and the mode information, the interface layer information of the corresponding application window from the interface layer information of each application window returned by the gesture layer management module, and generates the gesture dynamic effect of the application window according to that interface layer information (the handle of the interface layer corresponding to the task of the application window). The gesture dynamic effect module adjusts the display position and display size of the interface layer of the application window through the handle of the interface layer corresponding to the task of the application window, that is, it adjusts the display position and display size of the gesture dynamic effect of the application window.
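A minimal sketch of how the gesture dynamic effect module might drive the effect from the handle during a drag is shown below; the transaction-style interface is a hypothetical stand-in for the platform's layer transaction mechanism, not its actual API, and the class names are assumptions.

    interface LayerTransaction {             // assumed interface, for illustration only
        void setPosition(Object layer, float x, float y);
        void setScale(Object layer, float sx, float sy);
        void apply();
    }

    final class GestureEffectController {
        private final LayerTransaction tx;   // hypothetical wrapper over the platform's layer transactions
        private final Object windowLayer;    // handle of the layer corresponding to the window's task

        GestureEffectController(LayerTransaction tx, Object windowLayer) {
            this.tx = tx;
            this.windowLayer = windowLayer;
        }

        // Called for every drag sample: the layer follows the finger and is scaled.
        // Because the live layer itself is moved (rather than a screenshot of it),
        // the window content keeps updating and playing while it moves.
        void onDragMoved(float touchX, float touchY, float scale) {
            tx.setPosition(windowLayer, touchX, touchY);
            tx.setScale(windowLayer, scale, scale);
            tx.apply();
        }
    }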
In this manner, obtaining the handle of the interface layer corresponding to the task of the application window amounts to obtaining control over the interface layer of the application window, so the gesture dynamic effect can be generated from the interface layer itself; because the interface layer is updated in real time, the gesture dynamic effect is updated in real time as well. A dynamic interface can thus be played without frequent screenshot processing, avoiding the large power consumption that frequent screenshot processing would cause.
In one implementation, the display position and display size of the gesture dynamic effect (interface layer) of the application window are adjusted while the application window moves along the movement track of the drag event. During this process, the window mode and mode information of the multi-application window remain unchanged; that is, they are the window mode and mode information at the time the drag event was first received. After the lift event is received, the window mode may be switched or the application window positions may be exchanged, and the window mode and mode information are updated accordingly.
In one example, the window mode is the split screen mode and the mode information is left-right split screen; the handle of the interface layer corresponding to the task of the left application window and the handle of the interface layer corresponding to the task of the right application window, among the split-screen-mode interface layers, are then obtained from the interface layer information of each application window returned by the gesture layer management module. The display position and display size of the interface layer corresponding to the left application window are adjusted through the handle of the interface layer corresponding to the task of the left application window; the display position and display size of the interface layer corresponding to the right application window are adjusted through the handle of the interface layer corresponding to the task of the right application window.
In one example, the window mode is the split screen mode and the mode information is up-down split screen; the handle of the interface layer corresponding to the task of the upper application window and the handle of the interface layer corresponding to the task of the lower application window, among the split-screen-mode interface layers, are then obtained from the interface layer information of each application window returned by the gesture layer management module. The display position and display size of the interface layer corresponding to the upper application window are adjusted through the handle of the interface layer corresponding to the task of the upper application window; the display position and display size of the interface layer corresponding to the lower application window are adjusted through the handle of the interface layer corresponding to the task of the lower application window.
Thus, because the gesture dynamic effect is generated from the interface layer corresponding to the application window, and the interface layer is updated in real time, the gesture dynamic effect is updated in real time as well. For an application with a dynamically changing interface, such as a video application, the gesture dynamic effect is dynamically updated while the application window moves, so the display effect is more flexible. Compared with taking screenshots of the application window interface and generating the gesture dynamic effect from the screenshot images, no screenshot processing is required, which improves the speed of generating the gesture dynamic effect and the response speed to the user's gesture.
In the method provided by the embodiment of the present application, when the application window is pressed and moved repeatedly during the drag operation such that the moving distance exceeds the threshold, the handle of the interface layer corresponding to the task of the application window is obtained, so that the interface layer of the application window can be adjusted through the handle, for example its display position and display size. Even after the interface layer moves or its display size changes, the gesture dynamic effect of the application window is updated in real time along with the interface layer.
During the subsequent movement of the drag operation, the display parameters of the interface layer can be modified through the handle of the interface layer corresponding to the task of the application window according to the action position of the drag operation on the screen, so that the display effect of the gesture dynamic effect (interface layer) can be adjusted according to that position. For example, according to the detected action position of the drag operation, the display size of the interface layer of the application window is adjusted, the display position of the interface layer is adjusted, the transparency of the interface layer is adjusted, the sharpness of the interface layer is adjusted, and so on.
In one implementation, the action position of a received gesture operation (drag operation) on the screen may be determined from the gesture information (gesture type, position where the gesture is detected, etc.). When the user's drag operation acts in different screen hot zones, the gesture dynamic effect (interface layer) can be displayed with the corresponding display effect.
Still taking the scenario shown in fig. 3 as an example, when the operation of pressing Bar 11 at the top of the window 10 to drag the window 10 is received, the gesture dynamic effect (interface layer) of the window 10 moves along with the user's drag track.
For example, as shown in fig. 8, if the user's drag operation moves into hot zone 2, the active window (window 10) is scaled down according to the distance the drag operation moves downward and scaled up according to the distance it moves upward while in hot zone 2; the passive window (window 20) is scaled down to a preset value (e.g., 95% of the original window size).
For example, as shown in fig. 9, if the user's drag operation moves into hot zone 3, the active window (window 10) is scaled down according to the distance the drag operation moves downward and scaled up according to the distance it moves upward while in hot zone 3; the passive window (window 20) is scaled down to a preset value (e.g., 95% of the original window size).
For example, as shown in fig. 10, if the user's drag operation moves into hot zone 5, the active window (window 10) is scaled down according to the distance the drag operation moves downward and scaled up according to the distance it moves upward while in hot zone 5; the passive window (window 20) is scaled down to a preset value (e.g., 95% of the original window size) and moved to the left side of the screen.
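The scaling behavior for hot zones 2, 3 and 5 can be sketched as follows; the 95% passive-window factor follows the description above, while the lower bound on the active window scale and the mapping from vertical distance to scale are assumptions made for illustration.

    final class DragScaleEffect {
        private static final float PASSIVE_SCALE = 0.95f;    // passive window shrinks to 95% of its size
        private static final float MIN_ACTIVE_SCALE = 0.6f;  // assumed lower bound for the active window

        // Returns the scale of the active window from the vertical distance the drag has
        // travelled downward since pickup: moving down shrinks it, moving back up enlarges it.
        float activeScale(float downwardDistancePx, float screenHeightPx) {
            float t = Math.max(0f, Math.min(1f, downwardDistancePx / screenHeightPx));
            return 1f - t * (1f - MIN_ACTIVE_SCALE);
        }

        float passiveScale() {
            return PASSIVE_SCALE;
        }
    }

The resulting scale values would then be applied to the corresponding interface layers through their handles, in the manner sketched after step S709 above.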
For example, as shown in fig. 11, if the user's drag operation moves into hot zone 4, the active window (window 10) does not continue to shrink once in hot zone 4 and is displayed in the floating window mode, while the passive window (window 20) is displayed in full screen mode. The gesture dynamic effects of the active window and the passive window are each generated from the wallpaper layer; that is, they are displayed as blurred images. After the active window is reduced and displayed as a floating window, its shape changes greatly, and a dynamic effect picture generated from the actual split-screen-mode interface layer would be severely deformed; after the passive window is enlarged and displayed full screen, its shape also changes greatly, and a dynamic effect picture generated from the actual split-screen-mode interface layer would likewise be severely deformed, giving a poor appearance. Displaying the gesture dynamic effects of the active window and the passive window uniformly as blurred images therefore improves the user's viewing experience. In one implementation, the blurred image is a preset blurred image; in one implementation, the blurred image is obtained by blurring the last frame of image played by the application window before the drag operation entered the preset hot zone (for example, hot zone 4); in one implementation, the blurred image is an image of the application window's display interface occluded by a preset image with a preset transparency.
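The three blurred-image options can be sketched as a simple strategy selection; the image type, the blur routine and the composition routine are placeholders, since the actual drawing of the wallpaper layer is not detailed here.

    final class BlurredBackgroundProvider {
        enum Strategy { PRESET_BLUR, BLUR_LAST_FRAME, OVERLAY_TRANSLUCENT_IMAGE }

        // Produces the image shown for the gesture dynamic effect once the drag enters
        // a hot zone (such as hot zone 4) where the real split-screen layer would deform badly.
        Object blurredImage(Strategy strategy, Object presetBlur, Object lastFrame,
                            Object windowInterface, Object presetOverlay, float overlayAlpha) {
            switch (strategy) {
                case PRESET_BLUR:
                    return presetBlur;                          // a preset blurred image
                case BLUR_LAST_FRAME:
                    return blur(lastFrame);                     // blur of the last frame played before entering the zone
                case OVERLAY_TRANSLUCENT_IMAGE:
                    return overlay(windowInterface, presetOverlay, overlayAlpha); // interface occluded by a translucent preset image
                default:
                    return presetBlur;
            }
        }
        private Object blur(Object image) { return image; }                          // placeholder blur routine
        private Object overlay(Object base, Object top, float alpha) { return base; } // placeholder composition
    }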
For example, as shown in fig. 12, if the user's drag operation moves into hot zone 6, the active window (window 10) does not continue to shrink once in hot zone 6 and is displayed in the split screen mode; the passive window (window 20) is scaled down to a preset value (e.g., 95% of the original window size) and moved to the left side of the screen. The gesture dynamic effects of the active window and the passive window are each generated from the wallpaper layer; that is, they are displayed as blurred images.
In some embodiments, during the application window switching process, the audio of the application continues to play without interruption and remains synchronized with the moving picture of the gesture dynamic effect, so that the user's viewing experience is coherent and improved.
It may be understood that, in order to implement the above-mentioned functions, the electronic device provided in the embodiments of the present application includes corresponding hardware structures and/or software modules that perform each function. Those of skill in the art will readily appreciate that the elements and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the present application.
The embodiment of the application may divide the functional modules of the electronic device according to the method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated modules may be implemented in hardware or in software functional modules. It should be noted that, in the embodiment of the present application, the division of the modules is schematic, which is merely a logic function division, and other division manners may be implemented in actual implementation.
In one example, please refer to fig. 13, which shows a possible structural schematic diagram of the electronic device involved in the above embodiment. The electronic device 1300 includes: a processing unit 1310, a storage unit 1320, and a display unit 1330.
The processing unit 1310 is configured to control and manage an operation of the electronic device 1300.
The memory unit 1320 is used to store program codes and data of the electronic device 1300.
The display unit 1330 is used to display an interface of the electronic device 1300.
Of course, the unit modules in the above-described electronic apparatus 1300 include, but are not limited to, the above-described processing unit 1310, storage unit 1320, and display unit 1330.
Optionally, an audio unit, a communication unit, etc. may also be included in the electronic device 1300. The audio unit is used for collecting audio, playing audio and the like. The communication unit is used to support the electronic device 1300 to communicate with other devices.
The processing unit 1310 may be a processor or controller, such as a central processing unit (central processing unit, CPU), a digital signal processor (digital signal processor, DSP), an application-specific integrated circuit (application-specific integrated circuit, ASIC), a field programmable gate array (field programmable gate array, FPGA) or other programmable logic device, transistor logic device, hardware components, or any combination thereof. The storage unit 1320 may be a memory. The display unit 1330 may be a display screen or the like. The audio unit may include a microphone, a speaker, etc. The communication unit may comprise a mobile communication unit and/or a wireless communication unit.
For example, processing unit 1310 is a processor (e.g., processor 110 shown in fig. 4), storage unit 1320 may be a memory (e.g., internal memory 121 shown in fig. 4), and display unit 1330 may be a display screen (e.g., display screen 194 shown in fig. 4). The audio unit may be an audio module (such as audio module 170 shown in fig. 4). The communication units may include a mobile communication unit (such as the mobile communication module 150 shown in fig. 4) and a wireless communication unit (such as the wireless communication module 160 shown in fig. 4). The electronic device 1300 provided in the embodiment of the present application may be the electronic device 100 shown in fig. 4. Wherein the processors, memory, display screen, etc. may be coupled together, for example, via a bus.
Embodiments of the present application also provide a chip system including at least one processor and at least one interface circuit. The processors and interface circuits may be interconnected by wires. For example, the interface circuit may be used to receive signals from other devices (e.g., a memory of an electronic apparatus). For another example, the interface circuit may be used to send signals to other devices (e.g., processors). The interface circuit may, for example, read instructions stored in the memory and send the instructions to the processor. The instructions, when executed by a processor, may cause an electronic device to perform the various steps of the embodiments described above. Of course, the chip system may also include other discrete devices, which are not specifically limited in this embodiment of the present application.
The embodiment of the application also provides a computer readable storage medium, which comprises computer instructions, when the computer instructions run on the electronic device, the electronic device is caused to execute the functions or steps executed by the mobile phone in the embodiment of the method.
The present application also provides a computer program product, which when run on a computer, causes the computer to perform the functions or steps performed by the mobile phone in the above-mentioned method embodiments.
It will be apparent to those skilled in the art from this description that, for convenience and brevity of description, only the above-described division of the functional modules is illustrated, and in practical application, the above-described functional allocation may be performed by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a readable storage medium. Based on such understanding, the technical solution of the embodiments of the present application may be essentially or a part contributing to the prior art or all or part of the technical solution may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a device (may be a single-chip microcomputer, a chip or the like) or a processor (processor) to perform all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read Only Memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely a specific embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered in the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A dynamic effect display method, characterized in that the method comprises:
the method comprises the steps that the electronic equipment displays a first application window and a second application window in a split mode, wherein the first application window plays video content, the first application window is displayed in a first area on a screen, and the second application window is displayed in a second area on the screen;
responding to a dragging event of the first application window, the electronic equipment acquires interface layer information of the first application window, and controls the first application window to move along with the dragging event through the interface layer information of the first application window; wherein, in the moving process of the first application window, the first application window continuously plays the video content;
and responding to that the lifting event acted on the first application window falls in a third area in the screen of the electronic device, wherein the first application window is displayed in the second area, and the second application window is displayed in the first area.
2. The method according to claim 1, wherein the method further comprises:
and the electronic equipment controls the first application window to adjust the size according to the position of the dragging event through the interface layer information of the first application window.
3. The method of claim 1 or 2, wherein the interface layer information of the first application window includes a handle of an interface layer corresponding to a task of the first application window.
4. A method according to any one of claims 1-3, wherein the second region comprises the third region.
5. The method according to any one of claims 1-4, further comprising:
and responding to the dragging event acted on the first application window falling in a fourth area in the screen of the electronic device, changing the aspect ratio of the first application window and displaying a blurred image.
6. The method of claim 5, wherein the method further comprises:
and responding to the dragging event acted on the first application window falling in a fourth area in the screen of the electronic device, changing the aspect ratio of the second application window and displaying a blurred image.
7. The method according to claim 5 or 6, wherein the blurred image is:
a preset blurred image; or,
and blurring the first image, wherein the first image is the last frame of video image played by the first application window before the drag event falls in the fourth area.
8. The method according to any one of claims 5-7, further comprising:
and responding to the lifting event acted on the first application window falling in the fourth area, displaying the first application window as a floating window and continuing to play video content, and displaying the second application window in a full screen mode.
9. An electronic device, comprising:
one or more processors;
a display screen;
a memory;
wherein the memory has stored therein one or more computer programs, the one or more computer programs comprising instructions, which when executed by the electronic device, cause the electronic device to perform the method of any of claims 1-8.
10. A computer-readable storage medium comprising computer instructions; the computer instructions, when run on an electronic device, cause the electronic device to perform the method of any one of claims 1-8.