WO2022222830A1 - Graphical interface display method, electronic device, medium, and program product - Google Patents

Graphical interface display method, electronic device, medium, and program product

Info

Publication number
WO2022222830A1
Authority
WO
WIPO (PCT)
Prior art keywords
movement
elements
electronic device
distance
present disclosure
Prior art date
Application number
PCT/CN2022/086706
Other languages
English (en)
French (fr)
Inventor
卞超
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2022222830A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00Arrangements for software engineering
    • G06F8/30Creation or generation of source code
    • G06F8/38Creation or generation of source code for implementing user interfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00Animation
    • G06T13/802D [Two Dimensional] animation, e.g. using sprites

Definitions

  • the present disclosure generally relates to the field of information technology, and more particularly, to a graphical interface display method, an electronic device, a computer-readable storage medium, and a computer program product.
  • Embodiments of the present disclosure relate to a technical solution for realizing an animation effect of "attractive force" or "repulsive force" between UI elements, and specifically provide a graphical interface display method, an electronic device, a computer-readable storage medium, and a computer program product.
  • In a first aspect of the present disclosure, a graphical interface display method is provided. In this method, the electronic device displays M user interface (UI) elements on a screen, where M is a positive integer greater than 1.
  • the electronic device detects an operation acting on the first UI element among the M UI elements.
  • the electronic device animates each of the N UI elements on the screen, where N is a positive integer between 1 and M-1.
  • The electronic device determines a target distance that a second UI element among the N UI elements will move in a first direction, where the first direction is the direction from the second UI element to the first UI element, or the direction from the first UI element to the second UI element.
  • The electronic device causes the second UI element to perform a first movement from its starting position along the first direction by the target distance. After the first movement, the electronic device causes the second UI element to perform a second movement in a second direction opposite to the first direction, so as to return to the starting position.
  • In this way, the embodiments of the present disclosure achieve an animation effect of "attraction" between UI elements, exhibit a dynamic effect that conforms to natural laws, are more consistent with the user's life experience, and enhance the vitality and humanization of the electronic device.
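To make the two-phase movement above concrete, the following sketch (in Java, since the disclosure mentions UIs written in languages such as Java, and using standard Android animation classes as an assumed runtime) translates an affected view by the target distance along the first direction and then reverses it back to its starting position. The duration, the Bezier control points, and the helper class itself are illustrative assumptions rather than an implementation defined by the disclosure.

```java
import android.animation.ValueAnimator;
import android.view.View;
import android.view.animation.PathInterpolator;

/**
 * Minimal sketch of the two-phase "gravity" movement: the affected view is
 * translated by targetDistance along the unit direction (dirX, dirY), then
 * played back in reverse to return to the starting position.
 */
public final class GravityMovementSketch {

    public static void playFirstAndSecondMovement(View secondUiElement,
                                                  float dirX, float dirY,
                                                  float targetDistance) {
        final float startX = secondUiElement.getTranslationX();
        final float startY = secondUiElement.getTranslationY();

        ValueAnimator animator = ValueAnimator.ofFloat(0f, 1f);
        animator.setDuration(150);                     // first movement: 150 ms (assumed)
        animator.setRepeatCount(1);                    // play one more time ...
        animator.setRepeatMode(ValueAnimator.REVERSE); // ... in reverse = second movement
        animator.setInterpolator(new PathInterpolator(0.33f, 0f, 0.67f, 1f)); // assumed Bezier

        animator.addUpdateListener(animation -> {
            float fraction = (float) animation.getAnimatedValue();
            secondUiElement.setTranslationX(startX + dirX * targetDistance * fraction);
            secondUiElement.setTranslationY(startY + dirY * targetDistance * fraction);
        });
        animator.start();
    }

    private GravityMovementSketch() {
    }
}
```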
  • The second UI element may perform multiple first and second displacements, depending on system settings or user settings, or depending on how long the operation on the first UI element lasts.
  • For example, the second UI element can perform the first movement in the first direction, perform the second movement in the second direction, then perform the first movement in the first direction again, then perform the second movement in the second direction again, and so on in a loop.
  • the target distance of the second UI element in the first movement in the first direction in each loop may remain constant or gradually decrease.
  • the electronic device may determine the size of the second UI element, determine the distance between the second UI element and the first UI element, and determine the target distance based on the size and the distance.
  • the size of the "attractive" or “repulsive" effect of UI elements on other UI elements can depend on the size of the UI element itself and the distance between two UI elements, so as to conform to the law of gravity in nature , thereby further improving the user experience.
  • the electronic device may cause the target distance to increase with increasing size and decrease with increasing distance.
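One way to read the rule that the target distance grows with the second UI element's size and shrinks with its distance from the first UI element is a simple bounded scaling, as in the sketch below. The formula, the cap, and the class name are assumptions made for illustration; the disclosure does not fix a specific expression.

```java
/**
 * Hypothetical target-distance rule: monotonically increasing in the affected
 * element's size and decreasing in its distance from the operated element,
 * bounded by an assumed maximum displacement.
 */
public final class TargetDistanceSketch {

    private static final float MAX_TARGET_PX = 24f; // assumed upper bound, in pixels

    public static float targetDistance(float secondElementSize, float distanceToFirst) {
        if (secondElementSize <= 0f || distanceToFirst <= 0f) {
            return 0f; // degenerate inputs: no movement
        }
        // Approaches MAX_TARGET_PX as size dominates, shrinks as distance grows.
        return MAX_TARGET_PX * secondElementSize / (secondElementSize + distanceToFirst);
    }

    private TargetDistanceSketch() {
    }
}
```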
  • The electronic device may determine a first center point of the first UI element, determine a second center point of the second UI element, and use the straight-line distance between the first center point and the second center point as the distance between the second UI element and the first UI element.
  • In this way, the distance between two UI elements can be determined in a direct and unambiguous manner as the distance between their center points, which improves the consistency of the way in which the electronic device determines the distance between UI elements and simplifies the computing process of the electronic device.
  • Alternatively, the electronic device may determine the first center point of the first UI element, determine a plurality of circles having respective radii and centered on the first center point, determine that the second UI element intersects at least one of the plurality of circles, and determine the radius of the circle with the smallest radius among the at least one circle as the distance between the second UI element and the first UI element. In this way, the electronic device can determine the distance between UI elements more simply and conveniently, and the determined distances between UI elements have higher consistency, thereby simplifying the subsequent processing and calculation based on the distance.
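The radius-based determination can be pictured as testing concentric circles around the first element's centre from the smallest radius up, and taking the radius of the first circle that the second element's bounds intersect. The sketch below assumes axis-aligned rectangular bounds and an evenly spaced set of radii; both are illustrative choices, not requirements of the disclosure.

```java
import android.graphics.PointF;
import android.graphics.RectF;

/**
 * Sketch of the radius-based distance: returns the radius of the smallest of
 * several concentric circles (centred on the first element's centre point)
 * that the second element's bounds intersect.
 */
public final class RadiusDistanceSketch {

    public static float distanceByRadius(PointF firstCenter, RectF secondBounds,
                                         float radiusStep, int circleCount) {
        for (int i = 1; i <= circleCount; i++) {
            float radius = i * radiusStep;
            if (circleIntersectsRect(firstCenter, radius, secondBounds)) {
                return radius; // smallest intersecting circle found
            }
        }
        return Float.MAX_VALUE; // outside every circle: no intersection found
    }

    private static boolean circleIntersectsRect(PointF center, float radius, RectF rect) {
        // Squared distance from the circle centre to the nearest point of the rectangle.
        float nearestX = Math.max(rect.left, Math.min(center.x, rect.right));
        float nearestY = Math.max(rect.top, Math.min(center.y, rect.bottom));
        float dx = center.x - nearestX;
        float dy = center.y - nearestY;
        return dx * dx + dy * dy <= radius * radius;
    }

    private RadiusDistanceSketch() {
    }
}
```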
  • As another alternative, the electronic device may determine the horizontal spacing between the first UI element and the second UI element, determine the vertical spacing between the first UI element and the second UI element, and determine the distance between the second UI element and the first UI element based on at least one of the horizontal spacing and the vertical spacing and on the first direction. In this way, the electronic device can determine the distance between UI elements based on the spacing between them, thereby improving the flexibility and rationality of the distance determination, especially in scenes where the spacing between UI elements is essentially uniform.
  • The electronic device may further determine an influence area of the first UI element based on the size of the first UI element, and determine the UI elements within the influence area among the M UI elements as the N UI elements. In this way, the electronic device can set the "gravity" influence range of a UI element to an appropriate size, thereby reducing the computation required to implement the "gravity" animation effect while keeping the effect consistent with natural laws, so as to save computing resources.
  • the electronic device may further determine M-1 UI elements other than the first UI element among the M UI elements as N UI elements. In this way, the electronic device does not need to set the "gravity” influence range of the UI element, so that the related setting of the "gravity” animation effect can be simplified while keeping the “gravity” animation effect conforming to the natural law.
  • At least one of a first duration of the first movement, a second duration of the second movement, and a total duration of the first movement and the second movement may be configurable. In this way, the user of the electronic device can set the duration of the "gravity" animation effect according to preference, thereby further improving the user experience.
  • the animation effect of the movement of the second UI element during at least one of the first movement and the second movement may be determined based on a predefined curve of displacement over time.
  • the electronic device can conveniently control the movement of the UI elements based on the predefined curve of the displacement over time, so that the "gravity" animation effect is more in line with the user's usage habits, thereby further improving the user experience.
  • the predefined curve may be a Bezier curve or an elastic force curve.
  • In this way, the electronic device can conveniently control the movement of UI elements based on a Bezier curve or an elastic force curve, so that the "gravity" animation effect is more in line with the user's habitual perception of "attraction" and "repulsion" in everyday life, thereby further improving the user experience.
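As an example of a predefined displacement-over-time curve, the sketch below evaluates a critically damped spring, one possible form of the "elastic force curve" mentioned above; a cubic Bezier curve could equally be supplied through Android's PathInterpolator. The natural frequency is an assumed tuning parameter.

```java
/**
 * Sketch of an "elastic force" displacement curve: a critically damped spring
 * starting at rest and settling at the target distance. Feeding the returned
 * displacement into setTranslationX/Y on each frame yields the movement.
 */
public final class SpringCurveSketch {

    private static final double OMEGA = 18.0; // assumed natural frequency, 1/s

    /** Displacement after t seconds, for a movement of the given target distance. */
    public static double displacementAt(double targetDistance, double t) {
        return targetDistance * (1.0 - (1.0 + OMEGA * t) * Math.exp(-OMEGA * t));
    }

    private SpringCurveSketch() {
    }
}
```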
  • At least one of the first movement and the second movement may comprise variable acceleration linear motion.
  • In this way, the electronic device can base the first movement and the second movement of UI elements on the accelerated motion of objects in nature under the action of gravity, so that the "gravity" animation effect better matches natural laws and the user's everyday perception, thereby further improving the user experience.
  • The electronic device may determine a first point in time at which the operation on the first UI element is performed; determine, based on a predetermined speed and the distance between the second UI element and the first UI element, the delay between the first point in time and a second point in time at which the first movement starts; determine the second point in time based on the first point in time and the delay; and start the first movement of the second UI element at the second point in time.
  • In this way, the UI of the electronic device can visually present the linkage of the "gravitational force", that is, the movement caused by the "attractive force" or "repulsive force" propagates with distance, so that the UI animation effect better matches the user's usage habits, thereby further improving the user experience.
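Read literally, the propagation behaviour amounts to starting each affected element's first movement after a delay equal to its distance from the operated element divided by the predetermined speed. The sketch below shows that arithmetic; the speed constant and the use of ValueAnimator.setStartDelay are assumptions about one possible implementation.

```java
import android.animation.ValueAnimator;

/**
 * Sketch of the "gravity" propagation delay: elements further away start
 * their first movement later, after distance / speed milliseconds.
 */
public final class PropagationDelaySketch {

    private static final float SPEED_PX_PER_MS = 2f; // assumed propagation speed

    public static void startWithDelay(ValueAnimator firstMovement, float distancePx) {
        long delayMs = (long) (distancePx / SPEED_PX_PER_MS);
        firstMovement.setStartDelay(delayMs); // second point in time = first point in time + delay
        firstMovement.start();
    }

    private PropagationDelaySketch() {
    }
}
```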
  • In some embodiments, the operation on the first UI element includes exchanging the positions of the first UI element and the second UI element, and the above target distance is a first target distance. In this case, the electronic device may also move the second UI element from its initial position to the starting position, the starting position being the initial position of the first UI element; after the second UI element reaches the starting position and before the first movement, determine a second target distance that the second UI element will move in a third direction, where the third direction is the direction from the second UI element to a third UI element or the direction from the third UI element to the second UI element; cause the second UI element to perform a third movement from the starting position along the third direction by the second target distance; and, after the third movement and before the first movement, cause the second UI element to perform a fourth movement in a fourth direction opposite to the third direction, so as to return to the starting position.
  • the electronic device can more fully and comprehensively display the animation effect of "gravity” between UI elements, thereby further improving the user experience.
  • the electronic device may also reduce or enlarge the size of the second UI element during at least one of the first movement and the second movement in order to generate the animation effect. In this way, the electronic device can display the animation effect of "gravity" between UI elements in a more diverse manner, thereby further improving the user experience.
  • the first direction may point from the second center point of the second UI element to the first center point of the first UI element, or may point from the first center point to the second center point.
  • the electronic device can accurately and consistently determine the direction of the "attractive force” or “repulsive force” between two UI elements, thereby improving the accuracy and efficiency of implementing the "attractive force” animation effect.
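For the movement sketches above, the first direction is most conveniently represented as a unit vector between the two centre points, as in the following sketch; this representation is a convenience assumed here, not a requirement of the disclosure.

```java
import android.graphics.PointF;

/**
 * Sketch: unit vector pointing from the second element's centre to the first
 * element's centre ("attraction"); negate both components for "repulsion".
 */
public final class DirectionSketch {

    public static PointF attractionDirection(PointF secondCenter, PointF firstCenter) {
        float dx = firstCenter.x - secondCenter.x;
        float dy = firstCenter.y - secondCenter.y;
        float length = (float) Math.hypot(dx, dy);
        if (length == 0f) {
            return new PointF(0f, 0f); // coincident centres: no defined direction
        }
        return new PointF(dx / length, dy / length);
    }

    private DirectionSketch() {
    }
}
```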
  • the functions of the graphical interface display method of the first aspect may be implemented by at least one of an AAR format file, a JAR format file, and a system interface of an electronic device.
  • In a second aspect of the present disclosure, an electronic device is provided, which includes a processor and a memory storing instructions.
  • The instructions, when executed by the processor, cause the electronic device to perform any of the methods according to the first aspect and its implementations.
  • In a third aspect of the present disclosure, a computer-readable storage medium is provided, which stores instructions that, when executed by an electronic device, cause the electronic device to perform any of the methods of the first aspect and its implementations.
  • In a fourth aspect of the present disclosure, a computer program product is provided, which comprises instructions that, when executed by an electronic device, cause the electronic device to perform any of the methods of the first aspect and its implementations.
  • FIG. 1 shows a schematic diagram of a hardware structure of an electronic device that can implement an embodiment of the present disclosure.
  • FIG. 2 shows a flowchart of an example processing procedure of a graphical interface display method according to an embodiment of the present disclosure.
  • FIGS. 3A to 3J illustrate schematic diagrams of a "gravity" animation effect generated in a scene where a UI element is clicked, according to an embodiment of the present disclosure.
  • FIGS. 4A and 4B illustrate schematic diagrams of a first direction of a first movement and a second direction of a second movement of a UI element in a "gravity" animation effect according to an embodiment of the present disclosure.
  • FIG. 5 is a schematic diagram showing the positions of UI elements affected by "attraction” in the “attraction” animation effect at different moments during the first movement and the second movement according to an embodiment of the present disclosure.
  • FIG. 6 shows a schematic diagram of the animation process and related control logic of the "gravity” animation effect according to an embodiment of the present disclosure.
  • FIG. 7A is a schematic diagram illustrating that the predefined curve of the displacement of the UI element changing with time is a Bezier curve according to an embodiment of the present disclosure.
  • FIG. 7B shows a schematic diagram in which the predefined curve of the displacement of the UI element as a function of time is an inverse proportional curve according to an embodiment of the present disclosure.
  • FIG. 7C shows a schematic diagram of a predefined curve of the displacement of a UI element over time as a critically damped elastic force curve according to an embodiment of the present disclosure.
  • FIG. 7D shows a schematic diagram of a predefined curve of the displacement of a UI element over time as an under-damped elastic force curve according to an embodiment of the present disclosure.
  • FIGS. 7E to 7H are schematic diagrams showing a comparison of different displacement-over-time curves of three UI elements affected by "gravity" according to an embodiment of the present disclosure.
  • FIG. 8 shows a flowchart of an example process for determining the target distance of the first movement of a second UI element affected by the "attractive force" or "repulsive force" of the first UI element according to an embodiment of the present disclosure.
  • FIG. 9 illustrates a schematic diagram of determining the size of a second UI element that is affected by the "attractive” or “repulsive” force of the first UI element according to an embodiment of the present disclosure.
  • FIGS. 10A and 10B respectively illustrate schematic diagrams of two exemplary ways of determining distances between UI elements according to embodiments of the present disclosure.
  • FIG. 11 illustrates a flowchart of an example process for determining a distance between a first UI element and a second UI element based on a center point, according to an embodiment of the present disclosure.
  • FIG. 12 shows a schematic diagram of determining a distance between a first UI element and a second UI element based on a center point according to an embodiment of the present disclosure.
  • FIG. 13 illustrates a flowchart of an example process for determining a distance between a first UI element and a second UI element based on a radius, according to an embodiment of the present disclosure.
  • FIG. 14 shows a schematic diagram of determining a distance between a first UI element and a second UI element based on a radius according to an embodiment of the present disclosure.
  • FIGS. 15A and 15B show schematic diagrams of the overall conduction between UI elements in the case where the distance between UI elements is determined based on the radius, according to an embodiment of the present disclosure.
  • FIG. 16 illustrates a flowchart of an example process for determining the distance between a first UI element and a second UI element based on the spacing, according to an embodiment of the present disclosure.
  • FIGS. 17A to 17F illustrate schematic diagrams of determining the distance between a first UI element and a second UI element based on the spacing according to an embodiment of the present disclosure.
  • FIGS. 18A-18C illustrate schematic diagrams of "gravity” animation effects generated in a scene where UI elements have a limited “gravity” range, according to embodiments of the present disclosure.
  • FIG. 19A shows a flowchart of an example process for determining the point in time at which a "gravity" animation effect of a UI element begins, based on the "gravity" propagation speed, according to an embodiment of the present disclosure.
  • FIGS. 19B-19E are schematic diagrams showing a comparison of different displacement time curves of three UI elements affected by “gravitational force” taking into account the propagation delay of “gravitational force” according to an embodiment of the present disclosure.
  • FIGS. 20A-20D illustrate schematic diagrams of a "gravity" animation effect produced in a scene where a UI element is moved and swaps positions with another UI element, according to embodiments of the present disclosure.
  • FIG. 21 shows a flowchart of an example process in which, in a scene where UI elements exchange positions, the UI element that reaches its new position first is subjected to the "gravitational force" of other UI elements and thereby produces a "gravity" animation effect, according to an embodiment of the present disclosure.
  • FIGS. 22A to 22D are schematic diagrams illustrating a "gravity" animation effect generated by the "gravitational force" of other UI elements on the UI element that first reaches its new position in a scene where UI elements exchange positions, according to an embodiment of the present disclosure.
  • FIGS. 23A-23D illustrate schematic diagrams of a "gravity" animation effect produced in a scene where a UI element is moved and merged with another UI element, according to embodiments of the present disclosure.
  • FIGS. 24A to 24D illustrate schematic diagrams of a "gravity" animation effect produced in a scene where a UI element is deleted, according to an embodiment of the present disclosure.
  • FIGS. 25A-25D illustrate schematic diagrams of a "gravity" animation effect produced in a scene in which a UI element is expanded, according to an embodiment of the present disclosure.
  • FIG. 26 shows a schematic diagram of the relationship between the UI framework animation effect associated with the "gravity" animation effect and the system desktop according to an embodiment of the present disclosure.
  • FIG. 27 shows a schematic diagram of other application scenarios to which the "gravity" animation effect capability or function of an embodiment of the present disclosure may be applied.
  • FIG. 28 shows a schematic diagram of a system framework for implementing a "gravity” animation effect capability or function according to an embodiment of the present disclosure.
  • FIG. 29 shows a schematic diagram of the relationship between the application side and the UI framework side involved in the "attraction" animation effect capability or function according to an embodiment of the present disclosure.
  • FIG. 30 shows a schematic diagram describing in detail three ways of implementing the "gravity" animation effect capability or function according to an embodiment of the present disclosure.
  • FIG. 32 shows an operation sequence diagram of the application side and the dynamic effect capability side for realizing the "gravity” animation effect according to an embodiment of the present disclosure.
  • FIG. 33 shows a schematic diagram of an interface for adjusting parameters of a "gravity" animation effect according to an embodiment of the present disclosure.
  • the term “including” and the like should be construed as inclusive, ie, “including but not limited to”.
  • the term “based on” should be understood as “based at least in part on”.
  • the terms “one embodiment” or “the embodiment” should be understood to mean “at least one embodiment”.
  • The terms "first", "second", etc. may refer to different or the same objects, and are used only to distinguish the referents without implying a particular spatial order, temporal order, order of importance, or the like of the referents.
  • Values, processes, selected items, determined items, equipment, apparatus, means, parts, assemblies, etc. may be referred to herein as "best", "lowest", "highest", "minimum", "maximum", and the like.
  • The term "determining" can encompass a wide variety of actions. For example, "determining" may include computing, calculating, processing, deriving, investigating, looking up (e.g., in a table, database, or another data structure), ascertaining, and the like. Further, "determining" may include receiving (e.g., receiving information), accessing (e.g., accessing data in memory), and the like. Furthermore, "determining" may include parsing, selecting, choosing, establishing, and the like.
  • the term "UI” refers to the interface through which the user interacts with the application or operating system and exchanges information, which enables the conversion between the internal form of the information and the form acceptable to the user.
  • the UI of an application is source code written in a specific computer language such as java, extensible markup language (XML), etc.
  • The UI source code is parsed and rendered on the electronic device, and is finally presented as content that the user can recognize, such as images, text, buttons, and other UI elements.
  • the attributes and content of UI elements in the UI are defined by tags or nodes.
  • The UI elements contained in the UI are specified by nodes such as <Textview>, <ImgView>, and <VideoView>.
  • a node corresponds to a UI element or attribute in the UI. After parsing and rendering, the node is presented as user-visible content.
  • Many applications, such as hybrid applications, often contain web pages in their UI.
  • a web page can be understood as a special UI element embedded in the UI of an application.
  • A web page is source code written in a specific computer language, such as hypertext markup language (HTML), cascading style sheets (CSS), JavaScript (JS), etc.
  • the web page source code can be loaded and displayed as user-identifiable content by a browser or a web page display component similar in function to a browser.
  • The specific content contained in a web page is also defined by tags or nodes in the source code of the web page. For example, HTML defines the elements and attributes of web pages through <p>, <img>, <video>, and <canvas>.
  • A UI element includes, but is not limited to, visual UI elements such as a window, a scrollbar, a table view, a button, a menu bar, a text box, a navigation bar, a toolbar, an image, static text, and a widget.
  • UI elements may also include controls.
  • A control can encapsulate data and methods, and a control can have its own properties and methods. Properties are simple accessors to the control's data, and methods are simple, visible functions of the control.
  • Controls are the basic elements of a user interface.
  • The types of controls may include, but are not limited to: user interface controls (controls for building user interfaces, such as controls for interface elements like windows, text boxes, buttons, and drop-down menus), chart controls (controls for developing charts, which can realize data visualization and the like), and report controls (controls for developing reports, which realize functions such as browsing, viewing, designing, editing, and printing reports).
  • The types of controls in this embodiment of the present application may also include: composite controls (combining various existing controls to form a new control that concentrates the capabilities of the constituent controls), extended controls (deriving a new control from an existing control by adding new properties to it or changing its properties), custom controls, and the like.
  • UI elements may also include page modules.
  • the page can be divided into multiple consecutive page modules.
  • a page module can carry one or more types of information such as pictures, texts, operation buttons, links, animations, sounds, videos, etc.
  • a page module can be presented as a collection of one or more controls, as a card, or as a collection of cards and other controls.
  • a page module can appear as an icon on the main interface, a picture in the gallery, a card in the negative screen, and so on.
  • different page modules may or may not overlap.
  • the page module may also be referred to as a module for short.
  • A card can provide a more fine-grained service capability than an application (APP), directly presenting the service or content that the user cares about most to the user in the form of an interactive card, and a card can be embedded in various APPs or interactive scenarios, so as to better meet user needs.
  • With the card-style layout, different contents can be displayed separately, which makes the presentation of contents on the display interface more intuitive, and also enables users to operate on different contents more easily and accurately.
  • An animation is essentially the real-time display of the user interface (UI) or UI elements based on the refresh rate; due to the persistence of human vision, the user perceives the picture as moving. The animation transitions from its initial state to its final state after the animation time has elapsed.
  • the animation can be controlled by the animation type and animation transformation form.
  • animation types may include displacement animations, rotation animations, scale animations, and transparency animations, among others.
  • The animation transformation form can be controlled by controllers such as interpolators and evaluators, which can be used to control the speed at which the animation is transformed over the animation duration.
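As a concrete (and assumed) illustration of an interpolator controlling how fast the animation transforms over its duration: Android's TimeInterpolator maps the elapsed fraction of the animation time to a progress fraction, and an evaluator then maps that progress onto the animated value. The sample instants below are arbitrary.

```java
import android.view.animation.AccelerateDecelerateInterpolator;
import android.view.animation.TimeInterpolator;

/**
 * Sketch: an interpolator maps elapsed animation time (as a 0..1 fraction) to
 * a progress fraction, which makes the animated value speed up or slow down
 * over the animation duration.
 */
public final class InterpolatorSketch {

    public static void printProgress() {
        TimeInterpolator interpolator = new AccelerateDecelerateInterpolator();
        for (float elapsed = 0f; elapsed <= 1.001f; elapsed += 0.25f) {
            System.out.printf("elapsed %.2f -> progress %.3f%n",
                    elapsed, interpolator.getInterpolation(elapsed));
        }
    }

    private InterpolatorSketch() {
    }
}
```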
  • an embodiment of the present disclosure proposes a new solution for displaying a graphical interface.
  • the embodiments of the present disclosure relate to a novel dynamic effect implementation scheme, and propose the design and implementation of a gravitational dynamic effect. It is mainly based on human factors research, simulating the gravitational effect of nature and realizing gravitational dynamic effect.
  • The embodiments of the present disclosure apply the theory of the gravitational field to the dynamic effect domain of the UI framework for the first time, constructing a characteristic gravitational dynamic effect.
  • gravitational animation includes sub-features such as space, balance, capture, diffusion, and convergence.
  • The embodiments of the present disclosure are mainly aimed at the effect of the gravitational field and the capability to construct a gravitational dynamic effect. Between different controls, icons, and pages, the connection between them is strengthened and the relationship between individual elements is highlighted, thereby enhancing the user experience.
  • The presentation of nature's gravitational field theory in the field of dynamic effects further demonstrates the importance of human factors research, and also enables terminal devices with screens to display dynamic effects that conform to the laws of nature. The experience of using such a device is thus more consistent with the user's life experience, which strengthens the vitality and humanization of the device.
  • FIG. 1 shows a schematic diagram of a hardware structure of an electronic device 100 that can implement embodiments of the present disclosure.
  • the electronic device 100 may include a processor 110 , an external memory interface 120 , an internal memory 121 , a universal serial bus (USB) interface 130 , a charging management module 140 , a power management module 141 , and a battery 142 , Antenna 1, Antenna 2, Mobile Communication Module 150, Wireless Communication Module 160, Audio Module 170, Speaker 170A, Receiver 170B, Microphone 170C, Headphone Interface 170D, Sensor Module 180, Key 190, Motor 191, Indicator 192, Camera 193 , a display screen 194, and a subscriber identification module (subscriber identification module, SIM) card interface 195 and the like.
  • the sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, and an ambient light sensor 180L, bone conduction sensor 180M, etc.
  • the structures shown in the embodiments of the present disclosure do not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown, or combine some components, or separate some components, or different component arrangements.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units, for example, the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), controller, video codec, digital signal processor (digital signal processor, DSP), baseband processor, and/or neural-network processing unit (neural-network processing unit, NPU), etc. Wherein, different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is cache memory. This memory may hold instructions or data that have just been used or recycled by the processor 110 . If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby increasing the efficiency of the system.
  • the processor 110 may include one or more interfaces.
  • The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may contain multiple sets of I2C buses.
  • the processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flash, the camera 193 and the like through different I2C bus interfaces.
  • the processor 110 can couple the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate with each other through the I2C bus interface, so as to realize the touch function of the electronic device 100 .
  • the I2S interface can be used for audio communication.
  • the processor 110 may contain multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through a Bluetooth headset.
  • The PCM interface can also be used for audio communication, to sample, quantize, and encode an analog signal.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is typically used to connect the processor 110 with the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interfaces include camera serial interface (CSI), display serial interface (DSI), etc.
  • the processor 110 communicates with the camera 193 through a CSI interface, so as to realize the photographing function of the electronic device 100 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to implement the display function of the electronic device 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface may be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like.
  • the GPIO interface can also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
  • the USB interface 130 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones to play audio through the headphones.
  • the interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiments of the present disclosure is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100 . While the charging management module 140 charges the battery 142 , the electronic device 100 can also be powered by the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, the wireless communication module 160, and the like.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 may provide a wireless communication solution including 2G/3G/4G/5G/6G etc. applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays an image or video through the display screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 110, and may be provided in the same device as the mobile communication module 150 or other functional modules.
  • The wireless communication module 160 may provide wireless communication solutions applied on the electronic device 100, including wireless local area network (WLAN) (such as wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) technology.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2 .
  • the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), 5G and subsequent evolution standards, BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc.
  • GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite-based augmentation system (SBAS).
  • the electronic device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
  • the electronic device 100 may include one or N display screens 194 , where N is a positive integer greater than one.
  • the electronic device 100 may implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 193 .
  • When the shutter is opened, light is transmitted to the camera photosensitive element through the lens, and the optical signal is converted into an electrical signal; the camera photosensitive element transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193 .
  • the camera 193 is used to capture still images or video.
  • the object is projected through the lens to generate an optical image onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • The digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency point energy, and so on.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos of various encoding formats, for example, moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100 .
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save files such as music, video, etc. in an external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area may store data (such as audio data, phone book, etc.) created during the use of the electronic device 100 and the like.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
  • the electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playback, recording, etc.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be contacted and separated from the electronic device 100 by inserting into the SIM card interface 195 or pulling out from the SIM card interface 195 .
  • the electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card and so on. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the plurality of cards may be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as call and data communication.
  • the electronic device 100 employs an eSIM, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100 .
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • The embodiments of the present disclosure take a mobile operating system with a layered architecture as an example to illustrate the software structure of the electronic device 100.
  • FIG. 2 shows a flowchart of an example process 200 of a graphical interface display method according to an embodiment of the present disclosure.
  • the process 200 may be implemented by the electronic device 100 , for example, by the processor 110 or the processing unit of the electronic device 100 in cooperation with other components (eg, the display screen 194 ).
  • process 200 may also be implemented by other devices having screens to display UI elements.
  • Hereinafter, the electronic device 100 will be used as an example to perform the process 200, and the process 200 will be discussed with reference to FIGS. 3A to 3J, 4A to 4B, and 5, wherein FIGS. 3A to 3J show schematic diagrams of the "gravity" animation effect produced in a scene where a UI element is clicked.
  • the electronic device 100 displays M user interface UI elements on its screen 300, where M is a positive integer greater than one.
  • screen 300 may be an example of display screen 194 depicted in FIG. 1 .
  • The third row includes UI elements 331-334, the fourth row includes UI elements 341-344, the fifth row includes UI elements 351-354, and the sixth row includes UI elements 361-364. It should be noted that although the example of FIG. 3A shows a specific number of UI elements arranged in a regular manner, embodiments of the present disclosure are not limited thereto, but are equally applicable to any number of UI elements arranged in any regular or irregular manner. Similarly, although the M UI elements are shown in the example of FIG. 3A as being substantially the same size, embodiments of the present disclosure are not limited thereto, but are equally applicable to scenes in which one or more of the M UI elements have different sizes.
  • the electronic device 100 detects an operation acting on a first UI element of the M UI elements.
  • the electronic device 100 may detect an operation acting on the UI element 343 among the 24 UI elements 311 to 364 .
  • the UI element that is operated may also be referred to as a "first UI element”. Therefore, in the example of FIG. 3B , the UI element 343 being manipulated may also be referred to as the first UI element 343 .
  • the user of the electronic device 100 may use the hand 370 to click on the UI element 343 , eg, to launch the application corresponding to the UI element 343 .
  • the "attraction" animation effect of embodiments of the present disclosure will be described with a click operation as an example of an operation on a UI element. It should be understood, however, that embodiments of the present disclosure are not limited to click operations, but may be equally or similarly applicable to any other operations related to UI elements, such as operations to move UI elements, operations to merge UI elements with other UI elements Actions, actions to expand UI elements, actions to delete UI elements, and so on. "Attraction" animation effects according to embodiments of the present disclosure related to these operations will be described later with reference to Figures 20A-20D, 22A-22D, 23A-23D, 24A-24D, 25A- 25D is further described.
  • The electronic device 100 causes each of the N UI elements on the screen 300 to produce a "gravity" animation effect, that is, to be subject to the "attractive force" or "repulsive force" of the UI element 343, where N is a positive integer between 1 and M-1. That is, at least one UI element may be "attracted" or "repelled" by the UI element 343, and at most M-1 UI elements may be "attracted" or "repelled" by the UI element 343.
  • all other UI elements on the screen 300 may be affected by the UI element 343 to produce a "gravitational" animation effect.
  • the electronic device 100 may determine the M-1 UI elements except the UI element 343 among the M UI elements as the N UI elements that will generate the "attraction” animation effect. In this way, the electronic device 100 does not need to specifically set the "gravity” influence range of the UI element 343, so that the related settings of the "gravity” animation effect can be simplified while keeping the "gravity” animation effect conforming to natural laws.
  • the electronic device 100 may also determine the N UI elements that need to generate an animation effect based on the "gravity" influence area of the UI element 343 being manipulated. Such an embodiment will be described later with reference to FIGS. 18A to 18C .
  • In other words, a UI element manipulated by the user may be considered to have its "gravitational" equilibrium state "broken" by the user's operation, thereby exerting an "attractive force" or "repulsive force" on other UI elements, or being "attracted" or "repelled" by other UI elements.
  • Whether the "gravity" exerted by the manipulated UI element on other UI elements is expressed as an "attractive force" or a "repulsive force" may be preset or settable.
  • Likewise, whether the "gravity" exerted on the manipulated UI element by other UI elements is expressed as an "attractive force" or a "repulsive force" may also be preset or settable.
  • Herein, the displacement of a UI element affected by the "gravity" toward or away from the UI element being manipulated will be referred to as the "first displacement" or "first movement".
  • The subsequent displacement of the affected UI element back to its starting position will be referred to as the "second displacement" or "second movement".
  • The direction of the "first displacement" or "first movement" will be referred to as the "first direction".
  • The direction of the "second displacement" or "second movement" will be referred to as the "second direction".
  • An example of the "attraction" animation effect of an embodiment of the present disclosure will first be described below with reference to the accompanying figures. Then, with reference to FIG. 5, the details of how a UI element affected by the "attraction" of the UI element being manipulated produces an attractive animation effect will be described in detail.
  • As indicated by the dashed arrows, UI element 311 can move in direction 311-d1 pointing to UI element 343, UI element 312 can move in direction 312-d1 pointing to UI element 343, UI element 313 can move in direction 313-d1 pointing to UI element 343, UI element 314 can move in direction 314-d1 pointing to UI element 343, UI element 321 can move in direction 321-d1 pointing to UI element 343, UI element 322 can move in direction 322-d1 pointing to UI element 343, UI element 323 can move in direction 323-d1 pointing to UI element 343, and UI element 324 can move in direction 324-d1 pointing to UI element 343.
  • Similarly, UI element 331 can move in direction 331-d1 pointing to UI element 343, UI element 332 can move in direction 332-d1 pointing to UI element 343, UI element 333 can move in direction 333-d1 pointing to UI element 343, UI element 334 can move in direction 334-d1 pointing to UI element 343, UI element 341 can move in direction 341-d1 pointing to UI element 343, UI element 342 can move in direction 342-d1 pointing to UI element 343, and UI element 344 can move in direction 344-d1 pointing to UI element 343.
  • Likewise, UI element 351 can move in direction 351-d1 pointing to UI element 343, UI element 352 can move in direction 352-d1 pointing to UI element 343, UI element 353 can move in direction 353-d1 pointing to UI element 343, UI element 354 can move in direction 354-d1 pointing to UI element 343, UI element 361 can move in direction 361-d1 pointing to UI element 343, UI element 362 can move in direction 362-d1 pointing to UI element 343, UI element 363 can move in direction 363-d1 pointing to UI element 343, and UI element 364 can move in direction 364-d1 pointing to UI element 343.
  • the direction in which a UI element points to UI element 343 may refer to a direction in which any point on the UI element points to any point on UI element 343 .
  • For example, the direction 344-d1 in which UI element 344 points to UI element 343 may refer to the direction in which any point on UI element 344 points to any point on UI element 343.
  • the electronic device 100 may only need to determine the approximate direction between the two UI elements, so that the operation of the electronic device 100 in determining the direction of "gravity" may be simplified.
  • Alternatively, the direction in which a certain UI element points to the UI element 343 may refer to the direction in which the center point of that UI element points to the center point of the UI element 343. That is, in the case where the UI element is "attracted" or "repelled" by the UI element being manipulated, the first direction of the resulting first movement is from the center point of the affected UI element to the center point of the UI element being manipulated, or from the center point of the UI element being manipulated to the center point of the affected UI element.
  • For example, the direction 344-d1 in which UI element 344 points to UI element 343 may refer to the direction in which the center point of UI element 344 points to the center point of UI element 343.
  • the electronic device 100 can accurately and consistently determine the direction of the "attractive force” or “repulsive force” between two UI elements, thereby improving the accuracy and efficiency of implementing the "attractive force” animation effect.
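  • As a rough illustration of the center-point approach described above (this is not code from the original filing, and the class and field names are hypothetical), the first direction can be computed as a normalized vector between the two center points, as sketched below in Java.
    // Illustrative sketch: direction of the first movement when "gravity" acts as attraction.
    // UiElement, centerX and centerY are hypothetical names, not identifiers from the disclosure.
    final class Vec2 {
        final float x, y;
        Vec2(float x, float y) { this.x = x; this.y = y; }
    }

    final class UiElement {
        float centerX, centerY;   // center point of the element on the screen
        UiElement(float cx, float cy) { centerX = cx; centerY = cy; }
    }

    final class GravityDirection {
        // Unit vector pointing from the affected element (e.g. 344) to the manipulated element (e.g. 343).
        static Vec2 firstDirection(UiElement affected, UiElement manipulated) {
            float dx = manipulated.centerX - affected.centerX;
            float dy = manipulated.centerY - affected.centerY;
            float len = (float) Math.hypot(dx, dy);
            if (len == 0f) {
                return new Vec2(0f, 0f);   // center points coincide; no movement direction
            }
            return new Vec2(dx / len, dy / len);
        }

        // The second movement simply uses the opposite direction to return to the start position.
        static Vec2 secondDirection(UiElement affected, UiElement manipulated) {
            Vec2 d1 = firstDirection(affected, manipulated);
            return new Vec2(-d1.x, -d1.y);
        }
    }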
  • Such embodiments are further described below with reference to Figures 4A and 4B.
  • FIGS. 4A and 4B illustrate schematic diagrams of a first direction of a first movement and a second direction of a second movement of a UI element in a "gravity” animation effect according to an embodiment of the present disclosure.
  • Figure 4A shows an example scenario where the "gravity" of the UI element being manipulated acts as “attraction”.
  • the cross graphic in the lower right schematically represents the magnified center point of the UI element being manipulated, which may also be referred to herein as the occurrence center point 410 .
  • the center point of the UI element is the center point of the attraction or repulsion.
  • the cross graphic in the upper left schematically represents the magnified center point of another UI element affected by the gravitational force of the UI element being manipulated, which may also be referred to herein as element center point 420 .
  • FIG. 4B shows an example scenario where the "gravity" of the UI element being manipulated acts as a "repulsive force”.
  • the cross on the lower right schematically represents the occurrence center point 410
  • the cross on the upper left schematically represents the element center point 420 .
  • UI element 311 may have center point 311-o
  • UI element 312 may have center point 312-o
  • UI element 313 may have center point 313-o
  • UI element 314 may have center point 314-o
  • UI element 321 may have center point 321-o
  • UI element 322 may have center point 322-o
  • UI element 323 may have center point 323-o
  • UI element 324 may have center point 324-o.
  • UI element 331 may have center point 331-o
  • UI element 332 may have center point 332-o
  • UI element 333 may have center point 333-o
  • UI element 334 may have center point 334-o
  • UI element 341 may have center point 341-o
  • UI element 342 may have center point 342-o
  • UI element 343 may have center point 343-o
  • UI element 344 may have center point 344-o.
  • UI element 351 may have center point 351-o
  • UI element 352 may have center point 352-o
  • UI element 353 may have center point 353-o
  • UI element 354 may have center point 354-o
  • UI element 361 may have center point 361-o
  • UI element 362 may have center point 362-o
  • UI element 363 may have center point 363-o
  • UI element 364 may have center point 364-o.
  • For example, the direction 344-d1 from the UI element 344 affected by the "gravity" to the UI element 343 being manipulated may refer to the direction from the center point 344-o of the UI element 344 to the center point 343-o of the UI element 343; that is, the first direction of the first movement of the UI element 344 will be the direction 344-d1.
  • Accordingly, the UI element 344 can first perform a first movement toward the UI element 343 in the first direction 344-d1, and then perform a second movement in the opposite direction to return to the starting position.
  • The first direction of the first movement and the second direction of the second movement of the other UI elements on the screen 300 that are "attracted" by the UI element 343 can be determined similarly.
  • In the figure, a cross symbol is used to indicate the current position of the center point of each UI element, that is, the position of the center point after the affected UI element produces the gravity animation effect, and a small black dot is used to indicate the starting position of the center point of each UI element before the gravity animation effect is produced.
  • For illustrative clarity, only the current position 344-o of the center point of the UI element 344 and the starting position 344-1 of the center point are marked in FIG. 3E.
  • each UI element affected by the "gravity" will return to the starting position in a second direction opposite to the first direction.
  • For example, UI element 344, after completing the first movement in the first direction 344-d1, will return to the starting position in a second direction opposite to the first direction 344-d1.
  • At the moment shown in FIG. 3F, UI element 344 has completed the second movement and returned to the starting position, whereby the cross symbol representing the current position of the center point of UI element 344 coincides with the small black dot at the starting position of the center point.
  • each of the other UI elements affected by the "gravity” of the UI element 343 also completes their respective second displacements and returns to their respective initial positions.
  • FIG. 3E and FIG. 3F depict that the UI element affected by "gravity” performs a first displacement and a second displacement, the embodiments of the present disclosure are not limited thereto. In other embodiments, a UI element affected by "gravity” may perform multiple first and second displacements, depending on system settings or user settings, or depending on the length of time that the manipulation of the UI element being manipulated lasts.
  • That is, a UI element affected by "gravity" can perform a first movement in the first direction, then a second movement in the second direction, then another first movement in the first direction, then another second movement in the second direction, and so on.
  • the target distance of the UI element affected by the "gravity" in the first movement in the first direction in each loop may remain constant or gradually decrease.
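  • As a minimal sketch of how such repeated cycles might be sequenced (the decay factor, distance and cycle count below are illustrative assumptions, not values from the disclosure):
    // Illustrative sketch: the target distance of the first movement in each cycle can stay
    // constant (decay = 1.0) or gradually decrease (decay < 1.0). All values are hypothetical.
    public final class GravityCycles {
        public static void main(String[] args) {
            float targetDistance = 40f;   // D0 of the first cycle, in pixels (assumed)
            float decay = 0.6f;           // per-cycle attenuation factor (assumed)
            int cycles = 4;               // number of first+second movement pairs

            for (int i = 0; i < cycles; i++) {
                System.out.printf("cycle %d: move %.1f px in the first direction, "
                        + "then %.1f px back in the second direction%n",
                        i + 1, targetDistance, targetDistance);
                targetDistance *= decay;  // keep decay = 1.0 for a constant target distance
            }
        }
    }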
  • a plurality of UI elements of the same size are displayed on the screen 300 in a regular arrangement.
  • the animation effects of "attractive force” or “repulsive force” proposed by the embodiments of the present disclosure are not limited to regularly arranged multiple UI elements of the same size, but are equally or similarly applicable to arrangements in any manner of multiple UI elements with different sizes. Such examples are described below with reference to Figures 3G-3J.
  • As shown in FIG. 3G, the electronic device 100 displays M UI elements on the screen 300, for example, various UI elements displayed in a negative screen.
  • In this example, M = 13; that is, there are 13 UI elements from UI element 381 to UI element 393, and they have different sizes, wherein UI element 385 is the largest, UI element 381 is the second largest, and UI element 384 is the next largest.
  • UI elements 382, 383, 386, 387, 388, 389, 390, 391, 392, and 393 are minimal.
  • the user of the electronic device 100 may use the hand 370 to click on the UI element 385 , for example, to start an operation or service corresponding to the UI element 385 .
  • the "attraction" animation effect of embodiments of the present disclosure will be described with a click operation as an example of an operation on a UI element. It should be understood, however, that embodiments of the present disclosure are not limited to click operations, but may be equally or similarly applicable to any other operations related to UI elements, such as operations to move UI elements, operations to merge UI elements with other UI elements Actions, actions to expand UI elements, actions to delete UI elements, and so on.
  • The electronic device 100 causes each of the N UI elements on the screen 300 to produce an "attraction" animation effect, that is, to be subjected to the "attractive force" or "repulsive force" of the UI element 385 and to produce a moving animation effect, where N is a positive integer between 1 and M-1.
  • To this end, the electronic device 100 can first make the other 12 UI elements on the screen 300 move in the direction pointing to UI element 385 (that is, the first direction). For example:
  • UI element 381 can move in direction 381-d1 pointing to UI element 385
  • UI element 382 can move in direction 382-d1 pointing to UI element 385, as indicated by the dashed arrow
  • UI element 383 can move in direction 383-d1 pointing to UI element 385
  • UI element 384 can move in direction 384-d1 pointing to UI element 385
  • UI element 386 can move in direction 386-d1 pointing to UI element 385
  • UI element 387 can move in direction 387-d1 pointing to UI element 385
  • UI element 388 can move in direction 388-d1 pointing to UI element 385
  • UI element 389 can move in direction 389-d1 pointing to UI element 385.
  • UI element 390 can move in direction 390-d1 pointing to UI element 385
  • UI element 391 can move in direction 391-d1 pointing to UI element 385
  • UI element 392 can move in direction 392-d1 pointing to UI element 385
  • UI element 393 can move in direction 393-d1 pointing to UI element 385.
  • the direction in which a certain UI element points to the UI element 385 may refer to the direction in which the center point on the UI element points to the center point of the UI element 385 .
  • the direction in which a certain UI element points to the UI element 385 may refer to the direction in which any point on the UI element points to any point on the UI element 385 .
  • Again, a cross symbol is used to indicate the current position of the center point of each UI element, that is, the position of the center point after the affected UI element produces the gravity animation effect, and a small black dot is used to indicate the starting position of the center point of each UI element before the gravity animation effect is produced.
  • For example, the center point of the UI element 381 has made a first movement along the first direction toward the center point of the UI element 385, that is, it has moved from the starting position of the center point to the current position of the center point.
  • each UI element affected by the "gravity" will return to the starting position in a second direction opposite to the first direction.
  • For example, UI element 381, after completing the first movement in the first direction, will return to the starting position in a second direction opposite to the first direction.
  • At the moment shown in FIG. 3J, UI element 381 has completed the second movement and returned to the starting position, whereby the cross symbol representing the current position of the center point of UI element 381 coincides with the small black dot at the starting position of the center point.
  • each of the other UI elements affected by the "gravity" of UI element 385 also complete their respective second displacements and return to their respective initial positions. It should be noted that, although the examples of FIG. 3I and FIG. 3J depict that the UI element affected by “gravity” performs a first displacement and a second displacement, the embodiments of the present disclosure are not limited thereto.
  • In other embodiments, a UI element affected by "gravity" may perform multiple first and second displacements, depending on system settings or user settings, or depending on the length of time that the operation on the manipulated UI element lasts. That is, a UI element affected by "gravity" can perform a first movement in the first direction, then a second movement in the second direction, then another first movement in the first direction, then another second movement in the second direction, and so on.
  • the target distance of the UI element affected by the "gravity” in the first movement in the first direction in each loop may remain constant or gradually decrease. The following will describe in detail with reference to FIG. 5 that a UI element is affected by the "attraction" of the manipulated UI element to generate an attractive animation effect.
  • FIG. 5 is a schematic diagram showing the positions of UI elements affected by "attraction” in the "attraction” animation effect at different moments during the first movement and the second movement according to an embodiment of the present disclosure.
  • UI element 343 is the first UI element to be manipulated
  • UI element 344 is the second UI element that is affected by the "attraction" of UI element 343. Referring to FIGS. 2 and 5 simultaneously, at block 232 of FIG. 2, the electronic device 100 determines the target distance D0 by which the second UI element 344 of the N UI elements affected by the manipulated first UI element 343 is to be moved in the first direction.
  • the first direction is the direction from the second UI element 344 to the first UI element 343 .
  • the "attractive force" of the first UI element 343 is set to "repulsive force”
  • the first direction of the first displacement of the second UI element 344 may be directed from the first UI element 343 The orientation of the second UI element 344 .
  • the electronic device 100 may use any appropriate method to determine the target distance D0 that the UI element 344 affected by the "gravitational force" needs to move during the first movement.
  • the electronic device 100 may set the target distance of the first movement of all UI elements affected by the "gravitational force" of the first UI element 343 to be the same.
  • In other embodiments, the electronic device 100 may determine the target distance of the first movement of an affected UI element based on the size of the UI element that exerts the "gravity", the size of the UI element that is subjected to the "gravity", and/or the distance between the two UI elements.
  • In still other embodiments, since a particular UI element that produces the "gravity" effect is the same for all other affected UI elements, when an overall "gravity" animation effect is generated for multiple affected UI elements, the electronic device 100 may disregard the size of the UI element that produces the "gravity" effect when determining the target distance of the first movement of each affected UI element.
  • Accordingly, in some embodiments, the electronic device 100 may determine the target distance D0 of the first movement of the second UI element 344 in the first direction based on the size of the second UI element 344 and/or the distance between the second UI element 344 and the first UI element 343.
  • Next, the electronic device 100 makes the second UI element 344 perform a first movement from the starting position p1 in the first direction by the target distance D0. That is to say, in the example of FIG. 5, the first movement of the second UI element 344 means that the UI element 344 moves in the first direction from the starting position p1 until reaching the target position p2, whose distance from the starting position p1 is the target distance D0. More specifically, as shown in FIG. 5:
  • The second UI element 344 is located at the starting position p1 at time t1 and begins to make the first movement; at time t2, the second UI element 344 has moved a distance D1 in the first direction; at time t3, the second UI element 344 has moved a distance D2 in the first direction; at time t4, the second UI element 344 has moved the target distance D0 in the first direction to reach the target position p2.
  • After the first movement is completed, the electronic device 100 causes the second UI element 344 to perform a second movement in a second direction opposite to the first direction, so as to reset to the starting position p1.
  • That is, the second movement of the second UI element 344 means that the second UI element 344 moves in the second direction from the target position p2 until returning to the starting position p1. More specifically, as shown in FIG. 5, during the second movement after the first movement, at time t5, the second UI element 344 has moved a distance D3 in the second direction from position p2; at time t6, the second UI element 344 has moved a distance D4 in the second direction; at time t7, the second UI element 344 has moved the target distance D0 in the second direction to return to the starting position p1.
  • the first duration of the first movement of the second UI element 344, the second duration of the second movement, and/or the total duration of the first movement and the second movement is configurable. In this way, the user of the electronic device 100 can set the duration of the "gravity" animation effect according to preference, thereby further improving the user experience. In some embodiments, the electronic device 100 may reduce or enlarge the size of the second UI element 344 during the first movement and/or the second movement when generating the "gravity" animation effect of the second UI element 344 . In this way, the electronic device 100 can display the animation effect with "attraction" between UI elements in a more diverse manner, thereby further improving the user experience.
  • In some embodiments, embodiments of the present disclosure may mimic the "gravitational" effect that exists between objects in nature, where an object subjected to the gravitational attraction of another object moves under that force with variable acceleration.
  • Accordingly, the first movement and/or the second movement may comprise variable-acceleration linear motion. That is, the relationship between each of the moving distances D1 to D4 described above and each of the times t1 to t7 can be determined according to the displacement-time curve of variable-acceleration linear motion.
  • In this way, the electronic device 100 can realize the first movement and the second movement of the UI elements based on the accelerated motion of objects in nature under the action of gravity, so that the "gravity" animation effect is more in line with the laws of nature and the user's habitual awareness from daily life, thereby further improving the user experience.
  • In some embodiments, the electronic device 100 may determine the animation effect of the movement of the second UI element 344 during the first movement and/or the second movement based on a predefined curve of displacement over time.
  • That is, the electronic device 100 may determine, based on a predefined curve of displacement over time, the details of the movement of the second UI element 344 during the first movement and/or the second movement, such as which specific position it reaches at a specific moment, that is, the above-described relationship between the respective moving distances D1 to D4 and the respective times t1 to t7, and so on. In this way, the electronic device 100 can conveniently control the movement of the UI element based on the predefined curve of displacement over time, so that the "gravity" animation effect is more in line with the user's usage habits, thereby further improving the user experience.
  • In summary, the embodiments of the present disclosure realize an animation effect with "gravity" between UI elements, exhibit a dynamic effect that conforms to the laws of nature, are more consistent with the user's life experience, and enhance the liveliness and degree of personification of the electronic device 100.
  • Without the "gravity" animation effect, after the UI elements (for example, icons) are arranged, the display effect of the UI elements is relatively simple, and each icon is presented independently without mutual connection, which does not conform to the laws of nature.
  • With the "gravity" animation effect, the effect of a single icon can affect the entire page, and there is a potential connection between each icon, as if a "universal gravitational force" between the UI elements ties them together.
  • the animation effects of UI elements related to moving, merging, deleting, expanding and other operations will be more in line with natural laws, more humanized, and improve communication with users.
  • To this end, the embodiments of the present disclosure propose a new type of animation effect implementation scheme, which mainly provides an implementation model of the gravitational animation effect and realizes an animation effect based on the theory of gravitation, so that the user can better experience the functions of the UI elements.
  • the embodiments of the present disclosure can implement a gravitational animation effect model based on a gravitational formula; and can implement dynamic effects of gravitational scenes such as attraction, repulsion, and black hole adsorption for different operation scenarios of UI elements (eg, icons);
  • a gravitational field can be established to build the basis of the entire feature animation effect; and the basic animation effect can also be opened to third-party applications, thereby establishing an ecology.
  • FIG. 6 shows a schematic diagram of the animation process and related control logic of the "gravity” animation effect according to an embodiment of the present disclosure.
  • The principle of animation is to display the current interface or control in real time according to the refresh rate, using the persistence of human vision to make the user feel that the displayed picture is moving. Therefore, as shown in FIG. 6, the electronic device 100 may first determine the initial state 610 of the "gravity" animation and the final state 620 of the "gravity" animation. Additionally, the electronic device 100 may determine the animation time 605 for which the process of transitioning from the initial state 610 of the "gravity" animation to the final state 620 of the "gravity" animation lasts.
  • the electronic device 100 can also determine the "attraction" animation type 630 and the "attraction” animation transformation form 640 .
  • "gravity" animation types 630 may include displacement animations 632, scale animations 634, rotation animations 636, transparency animations 638, etc. of UI elements
  • "gravity” animation transforms 640 may be controlled by interpolators 642 and 644, such as in The relative transformation speed is controlled in a fixed animation time 605, and so on.
  • the displacement animation 632 in the "gravity” animation type 630 is mainly involved, but it should be understood that other “gravity” animation types are also possible.
  • the displacement animation effect generated by the "gravity” animation effect in the embodiment of the present disclosure may be that the UI element moves toward a certain direction first, and then resets in the opposite direction.
  • the two animations can define the duration and interpolator respectively, and the application side can adjust as needed.
  • the electronic device 100 may determine the animation effect of the movement of the second UI element 344 during the first movement and/or the second movement based on a predefined curve of displacement over time.
  • the electronic device 100 may employ any suitable displacement time curve known or discovered in the future to control the details of the movement of the second UI element 344 during the first movement and/or the second movement.
  • the electronic device 100 may choose to use a Bezier curve or an elastic force curve as the predefined curve for the first displacement and/or the second displacement of the second UI element 344 .
  • the electronic device 100 may use a second-order Bezier curve to control the first displacement of the second UI element 344 and a spring force curve to control the second displacement of the second UI element 344, or vice versa.
  • Alternatively, the electronic device 100 may use one of the Bezier curve or the elastic force curve to control both the first displacement and the second displacement. In this way, the electronic device 100 can conveniently control the movement of UI elements based on the Bezier curve or the elastic force curve, so that the "gravity" animation effect is more in line with the user's intuitive perception of "attraction" and "repulsion" in daily life, thereby further improving the user experience.
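  • A minimal Java sketch of this combination is shown below, assuming the standard Android PathInterpolator (a two-control-point Bezier timing curve) for the first displacement and the androidx SpringAnimation API for the second displacement; the control points, stiffness and damping ratio are illustrative assumptions rather than values prescribed by the disclosure.
    import android.animation.Animator;
    import android.animation.AnimatorListenerAdapter;
    import android.animation.ObjectAnimator;
    import android.view.View;
    import android.view.animation.PathInterpolator;
    import androidx.dynamicanimation.animation.DynamicAnimation;
    import androidx.dynamicanimation.animation.SpringAnimation;
    import androidx.dynamicanimation.animation.SpringForce;

    final class FirstAndSecondMovement {
        // First movement: move toward the manipulated element along a Bezier displacement-time curve.
        // Second movement: reset to the start position along a spring (elastic force) curve.
        static void play(final View view, float targetDistancePx, long firstDurationMs) {
            ObjectAnimator first = ObjectAnimator.ofFloat(view, View.TRANSLATION_X, 0f, targetDistancePx);
            first.setDuration(firstDurationMs);
            first.setInterpolator(new PathInterpolator(0.33f, 0f, 0.67f, 1f)); // Bezier timing curve (assumed points)

            first.addListener(new AnimatorListenerAdapter() {
                @Override public void onAnimationEnd(Animator animation) {
                    SpringForce spring = new SpringForce(0f)                    // final position = start position p1
                            .setStiffness(SpringForce.STIFFNESS_LOW)
                            .setDampingRatio(SpringForce.DAMPING_RATIO_NO_BOUNCY); // critically damped reset
                    new SpringAnimation(view, DynamicAnimation.TRANSLATION_X)
                            .setSpring(spring)
                            .start();                                            // second movement back to p1
                }
            });
            first.start();
        }
    }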
  • FIG. 7A is a schematic diagram illustrating that the predefined curve of the displacement of the UI element changing with time is a Bezier curve according to an embodiment of the present disclosure.
  • the abscissa represents time
  • the ordinate represents displacement (or distance).
  • The interpolator used to control the movement of the UI element may use a common curve interpolator. For example, in the example of FIG. 7A, the displacement-time curve 710 of the first movement of the second UI element 344 previously depicted in FIG. 5 may be a second-order Bezier curve.
  • the electronic device 100 can achieve different movement effects of the UI element 344 by selecting two second-order points of the second-order Bezier curve.
  • the electronic device 100 adjusts the displacement time curve so that the UI elements can be accelerated and decelerated instead of moving at a constant rate.
  • In a fixed scene, the Bezier curve is mainly used for motion matching between a click operation and the page switching it triggers.
  • the following are related parameters of 9 different rhythms of Bezier curves in a specific construction platform, and the curve 710 shown in FIG. 7A may be one of the following 9 Bezier curves.
  • curvilinear forms include, but are not limited to, first-order Bezier curves, third- or higher-order Bezier curves, other curvilinear forms known or discovered in the future, or even straight lines.
  • For example, a 40-60 Bezier curve can be tried for animations that follow the user's hand sliding; a 33-33 Bezier curve can be used to follow the hand speed; and a 70-80 Bezier curve has a stronger rhythm and can be used to highlight interesting scenes.
  • the interpolator of the first movement of the second UI element 344 can select a Bezier curve, and the specific coordinates can be analyzed and obtained according to various parameters set in the "gravity" animation effect.
  • In actual use, the coordinates of the two points of the Bezier curve in the embodiment of the present disclosure can be determined arbitrarily, not limited to the above 9 kinds of curves; the coordinates of the two points can be (x1, y1) and (x2, y2), where x1, y1, x2, and y2 can be values between 0 and 1, generally with one decimal place.
  • the displacement time curve 710 of the embodiment of the present disclosure is exemplarily depicted as a second-order Bezier curve in FIG. 7A, embodiments of the present disclosure are not limited thereto, but are equally applicable to other orders of Bezier curves and any other curve.
  • the electronic device 100 may determine, based on the displacement time curve 710, that the moving distance of the UI element 344 at time t1 is 0,
  • the moving distance at time t2 is D1
  • the moving distance at time t3 is D2
  • the moving distance at time t4 is the target distance D0.
  • the electronic device 100 can determine the position of the UI element 344 at each moment on the displacement time curve 710 according to the time interval corresponding to the refresh frequency of the screen 300, and then at different moments The UI element 344 is displayed at the corresponding position on the screen 300, so that the animation effect of the UI element 344 performing the first movement can be realized.
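  • A minimal, platform-independent Java sketch of this per-frame sampling is shown below, assuming a 60 Hz refresh rate and a generic two-control-point Bezier timing function standing in for curve 710; the duration, target distance and control points are illustrative assumptions.
    // Illustrative sketch: sampling a normalized displacement-time curve at the interval that
    // corresponds to the screen refresh rate, then converting each sample into a moved distance.
    public final class CurveSampling {
        // Evaluates a Bezier timing curve with control points (x1, y1) and (x2, y2):
        // for a normalized time t in [0, 1], returns the normalized displacement in [0, 1].
        static float bezierProgress(float t, float x1, float y1, float x2, float y2) {
            // Solve the x(u) component for the parameter u by bisection, then evaluate y(u).
            float lo = 0f, hi = 1f, u = t;
            for (int i = 0; i < 30; i++) {
                u = 0.5f * (lo + hi);
                float x = 3 * (1 - u) * (1 - u) * u * x1 + 3 * (1 - u) * u * u * x2 + u * u * u;
                if (x < t) lo = u; else hi = u;
            }
            return 3 * (1 - u) * (1 - u) * u * y1 + 3 * (1 - u) * u * u * y2 + u * u * u;
        }

        public static void main(String[] args) {
            float targetDistance = 40f;   // D0 in pixels (assumed)
            long durationMs = 300;        // duration of the first movement (assumed)
            float frameMs = 1000f / 60f;  // time interval for a 60 Hz refresh rate (assumed)

            for (float elapsed = 0f; elapsed <= durationMs; elapsed += frameMs) {
                float t = elapsed / durationMs;
                float moved = targetDistance * bezierProgress(t, 0.33f, 0f, 0.67f, 1f);
                System.out.printf("t=%.1f ms moved=%.2f px%n", elapsed, moved);
            }
        }
    }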
  • FIG. 7B shows a schematic diagram in which the predefined curve of the displacement of the UI element as a function of time is an inverse proportional curve according to an embodiment of the present disclosure.
  • the abscissa represents time
  • the ordinate represents displacement (or distance).
  • In the example of FIG. 7B, the displacement-time curve 720 of the first movement of the second UI element 344 previously depicted in FIG. 5 may be an inversely proportional curve; that is, over time, the distance moved by the second UI element 344 per unit of time becomes smaller and smaller.
  • the electronic device 100 may determine, based on the displacement time curve 720, that the moving distance of the UI element 344 at time t1 is 0,
  • the moving distance at time t2 is D1
  • the moving distance at time t3 is D2
  • the moving distance at time t4 is the target distance D0.
  • the electronic device 100 can determine the position of the UI element 344 at each moment on the displacement time curve 720 according to the time interval corresponding to the refresh frequency of the screen 300, and then at different moments The UI element 344 is displayed at the corresponding position on the screen 300, so that the animation effect of the UI element 344 performing the first movement can be realized.
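  • One possible inverse-proportional (hyperbolic) displacement-time function with this property is sketched below in Java; the constant k is an illustrative assumption, not a parameter from the disclosure.
    // Illustrative sketch of an inverse-proportional (hyperbolic) displacement-time function:
    // the distance moved per unit of time keeps shrinking, and the curve still reaches the
    // target distance D0 exactly at the end of the duration.
    public final class InverseProportionalCurve {
        // Normalized progress in [0, 1] for normalized time t in [0, 1]:
        // f(t) = (1 + k) * t / (t + k); f(0) = 0, f(1) = 1, and f'(t) decreases as t grows.
        static float progress(float t, float k) {
            return (1f + k) * t / (t + k);
        }

        public static void main(String[] args) {
            float targetDistance = 40f;  // D0 in pixels (assumed)
            float k = 0.3f;              // smaller k = faster start, stronger deceleration (assumed)
            for (float t = 0f; t <= 1.0001f; t += 0.25f) {
                System.out.printf("t=%.2f moved=%.2f px%n", t, targetDistance * progress(t, k));
            }
        }
    }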
  • the displacement time curve 730 of the second movement of the UI element 344 depicted in FIG. 5 is a spring force curve, eg, a critically damped spring force curve.
  • In practice, the elastic force curve can use different states in different operating scenarios, namely critical damping, under-damping and over-damping, and the displacement-time elastic force curve can differ under the different damping states. Specifically, the three cases are as follows: if the square of the damping is equal to 4 times the mass times the stiffness, the motion is critically damped; if the square of the damping is less than 4 times the mass times the stiffness, the motion is under-damped; and if the square of the damping is greater than 4 times the mass times the stiffness, the motion is over-damped.
  • the electronic device 100 may determine, based on the displacement time curve 730, that the moving distance of the UI element 344 at time t4 is 0 , the moving distance at time t5 is D3, the moving distance at time t6 is D4, and the moving distance at time t7 is the target distance D0.
  • the electronic device 100 can determine the position of the UI element 344 at each moment on the displacement time curve 730 according to the time interval corresponding to the refresh frequency of the screen 300, and then at different moments The UI element 344 is displayed at the corresponding position on the screen 300, so that the animation effect of the UI element 344 performing the second movement can be realized.
  • where f is the force during vibration,
  • m is the mass,
  • a is the acceleration,
  • k is the elastic coefficient (stiffness),
  • x is the spring deformation,
  • g is the drag coefficient (damping), and
  • t is the time.
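  • The spring force formula itself is not reproduced in this translation; with the variables listed above, the standard damped spring-mass relation it appears to describe can be sketched in LaTeX as follows (a sketch only, not the literal formula used by the elastic engine):
    f = m a = -k x - g \frac{dx}{dt},
    \qquad
    m \frac{d^{2}x}{dt^{2}} + g \frac{dx}{dt} + k x = 0,
    \qquad
    \text{critical damping: } g^{2} = 4 m k .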
  • the user of the electronic device 100 only needs to determine the spring deformation amount x to be generated (that is, the distance of the second movement), and the remaining parameters may be adjustable parameters.
  • relevant recommended values of these tunable parameters can be given through human factors research to be used by the application. Of course, the application can also customize these tunable parameters as needed.
  • the relevant settings of the elastic engine interpolator may be as follows.
  • PhysicalInterpolatorBase interpolator = new SpringInterpolator(400F, 40F, 200F, 2600F, 1F); // construct the elastic engine interpolator
  • ObjectAnimator animator = ObjectAnimator.ofFloat(listView, "translationY", 0, 346); // displacement animation on the translationY property
  • SpringAnimation animation = new SpringAnimation(listView, DynamicAnimation.TRANSLATION_Y, 400F, 40F, 0, 1000F); // spring animation on the translationY property
  • FIG. 7D shows a schematic diagram of a predefined curve of the displacement of a UI element over time as an under-damped elastic force curve according to an embodiment of the present disclosure.
  • the displacement time curve 740 of the second movement of the UI element 344 depicted in FIG. 5 is a spring force curve, eg, an underdamped spring force curve.
  • the abscissa represents time
  • the ordinate represents displacement (or distance). It should be understood that although the displacement time curve 740 of an embodiment of the present disclosure is exemplarily depicted in FIG. 7D as an underdamped elastic force curve, embodiments of the present disclosure are not so limited, but are equally applicable to any other curve.
  • the electronic device 100 may determine, based on the displacement time curve 740, that the moving distance of the UI element 344 at time t4 is 0 , the moving distance at time t5 is D3, the moving distance at time t6 is D4, and the moving distance at time t7 is the target distance D0.
  • the underdamped spring force curve 740 in Figure 7D may have a "reciprocating" effect.
  • the UI element 344 has reached the target distance D0 some time before time t45, and continues to move in the second direction beyond the target distance D0 before moving in the first direction. For example, at time t45 in FIG. 7D, the distance the UI element 344 has moved is D45, which is greater than the target distance D0. Similarly, at times t55 and t65, the moving distances D55 and D65 of the UI element 344 in the second direction are both greater than the target distance D0.
  • In other words, under the underdamped elastic force curve, the UI element 344 will return from the target position p2 to the starting position p1 in the second direction, then move beyond the starting position p1 in the second direction, and then carry out a back-and-forth "reciprocating" motion centered on the starting position p1 until it finally stops at the starting position p1.
  • the electronic device 100 can determine the position of the UI element 344 at each moment on the displacement time curve 740 according to the time interval corresponding to the refresh frequency of the screen 300, and then at different moments The UI element 344 is displayed at the corresponding position on the screen 300, so that the animation effect of the UI element 344 performing the second movement can be realized.
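  • For reference, the closed-form displacement of a critically damped and an under-damped spring can be sketched in Java as follows; the mass, stiffness and damping values are illustrative assumptions, and x(t) denotes the remaining offset from the starting position p1.
    // Illustrative sketch: the element starts at the target distance D0 from p1 and relaxes back to 0.
    public final class SpringCurves {
        // Critically damped: g^2 = 4 m k, so omega = sqrt(k / m) and x(t) = D0 (1 + omega t) e^(-omega t).
        static double criticallyDamped(double d0, double m, double k, double t) {
            double omega = Math.sqrt(k / m);
            return d0 * (1 + omega * t) * Math.exp(-omega * t);
        }

        // Under-damped: g^2 < 4 m k; the element overshoots p1 and "reciprocates" around it.
        static double underDamped(double d0, double m, double k, double g, double t) {
            double gamma = g / (2 * m);
            double omegaD = Math.sqrt(k / m - gamma * gamma);
            return d0 * Math.exp(-gamma * t)
                    * (Math.cos(omegaD * t) + (gamma / omegaD) * Math.sin(omegaD * t));
        }

        public static void main(String[] args) {
            double d0 = 40.0, m = 1.0, k = 200.0;   // assumed values
            double gUnder = 10.0;                   // g^2 = 100 < 4 m k = 800, so under-damped
            for (double t = 0.0; t <= 0.5; t += 0.1) {
                System.out.printf("t=%.1f s  critical x=%.2f px  under-damped x=%.2f px%n",
                        t, criticallyDamped(d0, m, k, t), underDamped(d0, m, k, gUnder, t));
            }
        }
    }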
  • FIGS. 7E to 7H are schematic diagrams showing a comparison of different displacement time-varying curves of three UI elements affected by "gravity” according to an embodiment of the present disclosure.
  • FIG. 7E is a schematic diagram in which the displacement-time curves of the first movement of the three UI elements 344, 324 and 311 in the example described above with reference to FIGS. 3C to 3F, under the influence of the "gravity" of UI element 343, are all Bezier curves.
  • FIG. 7F is a schematic diagram in which the displacement-time curves of the first movement of the three UI elements 344, 324 and 311 in the example described above with reference to FIGS. 3C to 3F are all inversely proportional curves.
  • FIG. 7G is a schematic diagram in which the displacement-time curves of the second movement of the three UI elements 344, 324 and 311 in the example described above with reference to FIGS. 3C to 3F are all critically damped elastic force curves.
  • FIG. 7H is a schematic diagram in which the displacement-time curves of the second movement of the three UI elements 344, 324 and 311 in the example described above with reference to FIGS. 3C to 3F are all underdamped elastic force curves.
  • It should be understood that FIGS. 7E to 7H depict the displacement-time curves of three UI elements in an exemplary manner to illustrate that the first and second displacements of different UI elements under the influence of the "gravity" of the same UI element may each have different displacement-time curves.
  • the first and second displacements of other UI elements that are affected by the "gravity" of UI element 343 depicted in FIGS. 3C-3F may have similar displacement time profiles.
  • the abscissa represents time and the ordinate represents displacement (or distance)
  • In the example of FIG. 7E, the displacement-time curve 710 of the first movement of the second UI element 344 previously depicted in FIG. 5 may be a second-order Bezier curve, the displacement-time curve 712 of the first movement of the UI element 324 may be a second-order Bezier curve, and the displacement-time curve 714 of the first movement of the UI element 311 may also be a second-order Bezier curve.
  • Bezier curves 710, 712, and 714 may have different parameters.
  • UI element 344 may have the largest target distance D0-344 because UI element 344 is closest to the UI element 343 being manipulated. Because UI element 324 is farther from the UI element 343 being manipulated than UI element 344, UI element 324 may have a target distance D0-324 that is less than the target distance D0-344 of UI element 344. Because UI element 311 is farther from the UI element 343 being manipulated than UI element 324, UI element 311 may have a target distance D0-311 that is smaller than the target distance D0-324 of UI element 324.
  • Referring to FIGS. 3C to 3E and FIG. 7E simultaneously, at time t1, UI elements 344, 324 and 311 begin to prepare for their respective first movements under the "gravity" of UI element 343.
  • UI elements 344, 324, and 311 move distances D1-344, D1-324, and D1-311 in their respective first directions.
  • UI elements 344, 324 and 311 move distances D2-344, D2-324 and D2-311 in their respective first directions.
  • UI elements 344, 324 and 311 move target distances D0-344, D0-324 and D0-311 in their respective first directions.
  • the electronic device 100 can determine where the UI elements 344 , 324 and 311 are located at each moment on the displacement time curves 710 , 712 , and 714 according to the time interval corresponding to the refresh frequency of the screen 300 . and then display UI elements 344, 324 and 311 at corresponding positions on the screen 300 at different moments, so that the animation effect of UI elements 344, 324 and 311 performing their respective first movements can be realized.
  • Although in the example of FIG. 7E the respective first movements of UI elements 344, 324 and 311 are shown as starting and ending at the same time, this is exemplary only and is not intended to limit the scope of the present disclosure in any way.
  • the respective first movements of UI elements 344, 324, and 311 may begin and/or end at different times. This may be, for example, taking into account the speed at which the "gravitational force" of the UI element 343 propagates, such an embodiment will be described further below with reference to FIG. 19 .
  • In the example of FIG. 7F, the displacement-time curve 720 of the first movement of the second UI element 344 previously depicted in FIG. 5 may be an inversely proportional curve, the displacement-time curve 722 of the first movement of the UI element 324 may be an inversely proportional curve, and the displacement-time curve 724 of the first movement of the UI element 311 may also be an inversely proportional curve.
  • inverse proportional curves 720, 722, and 724 may have different parameters. For example, at the same time t4, UI element 344 may have the largest target distance D0-344 because UI element 344 has the closest distance to UI element 343 being manipulated.
  • UI element 324 may have a target distance D0-324 that is less than the target distance D0-344 of UI element 344. Because UI element 311 is farther from UI element 343 being manipulated than UI element 324, UI element 311 may have a smaller target distance D0-311 than UI element 324's target distance D0-324. Referring to FIGS. 3C to 3E and 7F simultaneously, at time t1, UI elements 344, 324 and 311 begin to prepare for their respective first movements under the "gravitational force" of UI element 343.
  • UI elements 344, 324, and 311 move distances D1-344, D1-324, and D1-311 in their respective first directions.
  • UI elements 344, 324 and 311 move distances D2-344, D2-324 and D2-311 in their respective first directions.
  • UI elements 344, 324 and 311 move target distances D0-344, D0-324 and D0-311 in their respective first directions. It should be noted that, in a specific implementation, the electronic device 100 can determine where the UI elements 344 , 324 and 311 are located at each moment on the displacement time curves 720 , 722 , and 724 according to the time interval corresponding to the refresh frequency of the screen 300 .
  • In the example of FIG. 7G, the displacement-time curve 730 of the second movement of the second UI element 344 previously depicted in FIG. 5 may be a critically damped elastic force curve, the displacement-time curve 732 of the second movement of the UI element 324 may be a critically damped elastic force curve, and the displacement-time curve 734 of the second movement of the UI element 311 may also be a critically damped elastic force curve. Note that the critically damped elastic force curves 730, 732 and 734 may have different parameters.
  • UI element 344 may have the largest target distance D0-344 because UI element 344 is closest to the UI element 343 being manipulated. Because UI element 324 is farther from the UI element 343 being manipulated than UI element 344, UI element 324 may have a target distance D0-324 that is less than the target distance D0-344 of UI element 344. Because UI element 311 is farther from the UI element 343 being manipulated than UI element 324, UI element 311 may have a target distance D0-311 that is smaller than the target distance D0-324 of UI element 324.
  • Referring to FIG. 7G, at time t4, UI elements 344, 324 and 311 have completed their respective first movements under the "gravity" of UI element 343, and begin to prepare for their respective second movements.
  • UI elements 344, 324, and 311 move distances D3-344, D3-324, and D3-311 in their respective second directions.
  • UI elements 344, 324, and 311 move distances D4-344, D4-324, and D4-311 in their respective second directions.
  • UI elements 344, 324 and 311 move target distances D0-344, D0-324 and D0-311 in respective second directions.
  • It should be noted that, in a specific implementation, the electronic device 100 can determine where the UI elements 344, 324 and 311 are located at each moment on the displacement-time curves 730, 732 and 734 according to the time interval corresponding to the refresh rate of the screen 300, and then display UI elements 344, 324 and 311 at the corresponding positions on the screen 300 at different moments, so that the animation effect of UI elements 344, 324 and 311 performing their respective second movements can be realized.
  • Although the second movement of each of UI elements 344, 324 and 311 is shown in the example of FIG. 7G as starting and ending at the same time, this is exemplary only and is not intended to limit the scope of the present disclosure in any way. In other embodiments, the respective second movements of UI elements 344, 324 and 311 may begin and/or end at different times.
  • In the example of FIG. 7H, the displacement-time curve 740 of the second movement of the second UI element 344 previously depicted in FIG. 5 may be an underdamped elastic force curve, the displacement-time curve 742 of the second movement of the UI element 324 may be an underdamped elastic force curve, and the displacement-time curve 744 of the second movement of the UI element 311 may also be an underdamped elastic force curve. Note that the underdamped elastic force curves 740, 742 and 744 may have different parameters.
  • UI element 344 may have the largest target distance D0-344 because UI element 344 is closest to the UI element 343 being manipulated. Because UI element 324 is farther from the UI element 343 being manipulated than UI element 344, UI element 324 may have a target distance D0-324 that is less than the target distance D0-344 of UI element 344. Because UI element 311 is farther from the UI element 343 being manipulated than UI element 324, UI element 311 may have a target distance D0-311 that is smaller than the target distance D0-324 of UI element 324.
  • Referring to FIG. 7H, at time t4, UI elements 344, 324 and 311 have completed their respective first movements under the "gravity" of UI element 343, and begin to prepare for their respective second movements.
  • UI elements 344, 324, and 311 move distances D3-344, D3-324, and D3-311 in their respective second directions.
  • UI elements 344, 324, and 311 move distances D4-344, D4-324, and D4-311 in their respective second directions.
  • UI elements 344, 324 and 311 move target distances D0-344, D0-324 and D0-311 in respective second directions.
  • After that, UI elements 344, 324 and 311 will "reciprocate" back and forth around their respective starting positions based on their respective underdamped displacement-time curves. It should be noted that, in a specific implementation, the electronic device 100 can determine where the UI elements 344, 324 and 311 are located at each moment on the displacement-time curves 740, 742 and 744 according to the time interval corresponding to the refresh rate of the screen 300, and then display UI elements 344, 324 and 311 at the corresponding positions on the screen 300 at different moments, so that the animation effect of UI elements 344, 324 and 311 performing their respective second movements can be realized.
  • Although the second movement of each of UI elements 344, 324 and 311 is shown in the example of FIG. 7H as starting and ending at the same time, this is exemplary only and is not intended to limit the scope of the present disclosure in any way. In other embodiments, the respective second movements of UI elements 344, 324 and 311 may begin and/or end at different times.
  • In other embodiments, the electronic device 100 may determine the target distance D0 of the first movement of the second UI element 344 in the first direction based on two factors: the size of the second UI element 344 and the distance of the second UI element 344 from the first UI element 343.
  • FIG. 8 illustrates a flowchart of an example process 800 for determining the target distance of the first movement of a second UI element affected by the "attractive force" or "repulsive force" of a first UI element according to an embodiment of the present disclosure.
  • the process 800 may be implemented by the electronic device 100 , for example, by the processor 110 or the processing unit of the electronic device 100 in cooperation with other components (eg, the display screen 194 ).
  • process 800 may also be implemented by other devices having screens to display UI elements.
  • Hereinafter, the electronic device 100 performing the process 800 will be taken as an example, and the process 800 will be discussed with reference to FIG. 9, FIG. 10A and FIG. 10B.
  • FIG. 9 illustrates a schematic diagram of determining the size of a second UI element that is affected by the "attractive force” or “repulsive force” of the first UI element, according to an embodiment of the present disclosure.
  • 10A and 10B respectively illustrate schematic diagrams of two exemplary ways of determining distances between UI elements according to embodiments of the present disclosure.
  • the electronic device 100 may determine the size of the second UI element 344 .
  • For example, in the example of FIG. 9, the electronic device 100 can determine the lengths of the two sides 910 and 920 of the second UI element 344, and then obtain the size of the second UI element 344.
  • For example, the lengths of the two sides 910 and 920 of the second UI element 344 may be expressed as a number of pixels, so the size of the second UI element 344 may be expressed as a number of pixels.
  • In other embodiments, the electronic device 100 may measure the lengths of the two sides 910 and 920 of the second UI element 344 in any suitable unit in order to measure the size of the second UI element 344.
  • For example, the size of the second UI element 344 may be measured in square millimeters. It should be noted that although FIG. 9 schematically illustrates how the electronic device 100 determines the size of a second UI element 344 having a common regular rectangular shape, the embodiments of the present disclosure are not limited thereto, but can be similarly applied to UI elements of any regular or irregular shape.
  • the electronic device 100 may determine the distance between the second UI element 344 and the first UI element 343 . It should be noted that, in the embodiments of the present disclosure, the electronic device 100 may determine the distance between two UI elements in various ways. In some embodiments, the electronic device 100 may first determine the respective reference points of the two UI elements, and then determine the distance between the two reference points as the distance between the two UI elements. For example, in the example of FIG. 10A , the electronic device 100 may determine the location of the reference point 1010 of the first UI element 343 and may determine the location of the reference point 1020 of the second UI element 344 .
  • the electronic device 100 may determine the distance 1015 between the reference point 1010 and the reference point 1020 as the distance between the first UI element 343 and the second UI element 344 .
  • the selection of the reference point of the UI element may be based on a predetermined rule. For example, in the example of FIG. 10A , the reference point of the UI element is determined as the corner point of the lower left corner of the UI element. It should be understood that the reference point of a UI element may be chosen according to any suitable rule, as long as the distance between two UI elements can be reasonably determined.
  • For example, the electronic device 100 may use the center point of the UI element as the reference point; such embodiments will be described in detail later with reference to FIGS. 11 and 12. However, in actual use, the selection of the reference point may not be limited, but may be freely set by the application.
  • In other embodiments, the distance between the two closest points of the two UI elements may be determined as the distance between the two UI elements. For example, in the example of FIG. 10B, since the first UI element 343 and the second UI element 344 are basically regular rectangular shapes with parallel sides between them, the distance between their two closest points is the distance 1025 between their two adjacent edges.
  • Although FIG. 10B depicts in a schematic manner the distance between the two closest points of two regularly shaped UI elements, embodiments of the present disclosure are not so limited, but are equally applicable to two UI elements having any same or different shapes.
  • In other embodiments, the distance between two UI elements may also be determined in various other ways, such as determining the distance between UI elements based on the radius of a reference circle, or based on the spacing between the UI elements, and so on. These embodiments will be described later with reference to FIGS. 11 to 17A-17E.
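  • A minimal Java sketch of the two distance conventions described above (reference-point distance as in FIG. 10A and nearest-edge distance as in FIG. 10B) for axis-aligned rectangular UI elements is shown below; the class and field names, and the example coordinates, are hypothetical.
    public final class ElementDistance {
        static final class Box {
            final float left, top, right, bottom;   // in pixels, screen coordinates
            Box(float l, float t, float r, float b) { left = l; top = t; right = r; bottom = b; }
        }

        // Distance between the lower-left corner reference points of the two elements (FIG. 10A style).
        static double referencePointDistance(Box a, Box b) {
            return Math.hypot(a.left - b.left, a.bottom - b.bottom);
        }

        // Distance between the two closest points of the two rectangles (0 if they overlap, FIG. 10B style).
        static double nearestEdgeDistance(Box a, Box b) {
            double dx = Math.max(0.0, Math.max(b.left - a.right, a.left - b.right));
            double dy = Math.max(0.0, Math.max(b.top - a.bottom, a.top - b.bottom));
            return Math.hypot(dx, dy);
        }

        public static void main(String[] args) {
            Box first = new Box(300, 300, 400, 400);   // e.g. the manipulated element (assumed coordinates)
            Box second = new Box(450, 300, 550, 400);  // e.g. an affected element (assumed coordinates)
            System.out.println("reference-point distance: " + referencePointDistance(first, second));
            System.out.println("nearest-edge distance:    " + nearestEdgeDistance(first, second));
        }
    }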
  • Then, the electronic device 100 may determine the target distance that the second UI element 344 needs to move in the first movement. Generally, the electronic device 100 can make the target distance have any appropriate relationship with the size of the second UI element 344 and the distance between the two UI elements, as long as the "attractive force" or "repulsive force" of the first UI element 343 on the second UI element 344 can be reflected. In some embodiments, the electronic device 100 may cause the target distance to increase as the size of the second UI element 344 increases.
  • the electronic device 100 may decrease the target distance as the distance between the two UI elements increases.
  • This is also consistent with the law of gravitation in the natural world, because the "gravitational" effect in nature increases as the distance between objects decreases. In this way, the larger the size of a UI element itself and the smaller the distance between two UI elements, the greater the "attraction" or "repulsion" effect that the UI element experiences from the other UI element, thus conforming to the law of gravitation in nature and further improving the user experience.
  • the magnitude of the animation effect of the first displacement and the second displacement, that is, the distance moved is inversely proportional to the distance between the UI element and the point where the attractive or repulsive force occurs. More specifically, the embodiments of the present disclosure can borrow the model of gravitation, namely:
  • the magnitude of the gravitational force between two objects is related to their respective masses and the distance between them. Since the embodiments of the present disclosure are mainly aimed at UI elements, graphics, icons or controls used on a user experience (UX) interface, it can generally be considered that the mass and size of a UI element are proportional. Assuming that the size of a UI element is R and the distance is r, its "mass" can be considered as:
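  • for reference, the standard Newtonian model of gravitation referred to here, together with the stated assumption that a UI element's "mass" is proportional to its size R, can be written as follows (the adapted formulas of the present disclosure, including the constants mentioned below, are not reproduced here):

```latex
F = G\,\frac{m_1 m_2}{r^2}, \qquad m \propto R
```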
  • the relationship between the attractive force or repulsive force between two UI elements and the distance between the two UI elements and the size of the affected UI element can be derived as follows:
  • the magnitude of the displacement of the affected UI element can be calculated by the following formula:
  • the formula is derived from human factors research, where 0.1 and 0.8 can be used as fixed constants, and this formula is the closest to the gravitational effect.
  • a is a constant whose default value can be 10; of course, the user can adjust this setting.
  • the displacement time curve derived based on the above formula 4 and formula 7 will be similar to the inverse proportional curve described above with reference to FIGS. 7B and 7F .
  • the electronic device 100 may use this formula to calculate the final position of the UI element's displacement animation in the "gravity" animation effect.
  • alternatively, the electronic device 100 can also make the target distance of the first movement of the second UI element 344 affected by "gravity" decrease as the size of the second UI element 344 increases, increase as the distance between the two UI elements increases, or have any other functional relationship.
  • although such a functional relationship may be inconsistent with the law of gravity in nature, it can also bring a brand-new user experience to the user.
  • the amount of "attraction" or "repulsion" to which a UI element is subjected from other UI elements may depend on the size of the UI element itself and the distance between the two UI elements, thus conforming to the law governing the magnitude of gravitational force in nature, thereby further improving the user experience.
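  • as a minimal illustrative sketch only (not formulas 4 and 7 of the present disclosure), a target distance that increases with the size of the affected UI element and decreases as the distance between the two UI elements increases could be written as:

```kotlin
// Illustrative only: the target distance grows with the affected element's size R
// and shrinks as the distance r to the operated element grows. The constant `a`
// (default 10, as mentioned above) and the simple R / r form are assumptions.
fun targetDistance(sizeR: Float, distanceR: Float, a: Float = 10f): Float {
    require(distanceR > 0f) { "distance between UI elements must be positive" }
    return a * sizeR / distanceR
}
```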
  • the distance between two UI elements may also be determined in various other manners.
  • three other ways of determining the distance between UI elements will be described below with reference to FIGS. 11 to 17F.
  • process 1100 may be implemented by the electronic device 100 , for example, by the processor 110 or the processing unit of the electronic device 100 in cooperation with other components (eg, the display screen 194 ). In other embodiments, process 1100 may also be implemented by other devices having screens to display UI elements.
  • in the following, the electronic device 100 performing the process 1100 is taken as an example, and the process 1100 will be discussed with reference to FIG. 12.
  • FIG. 12 shows a schematic diagram of determining a distance between a first UI element and a second UI element based on a center point, according to an embodiment of the present disclosure.
  • the electronic device 100 may determine the first center point 343 - o of the first UI element 343 .
  • the center point of the UI element may refer to the center in the geometric sense, or refer to the center of gravity in the physical sense when the UI element is considered as an object with uniform density.
  • the center point of a UI element may also refer to a center point defined in any other way that represents the "center" of the UI element.
  • the electronic device 100 may determine the coordinate position or pixel position of the first center point 343-o on the screen 300 (not shown in FIG. 12) based on the geometric shape of the first UI element 343, and so on.
  • the electronic device 100 may determine the second center point 344 - o of the second UI element 344 . For example, in a similar manner, the electronic device 100 may determine the coordinate position or pixel position of the second center point 344-o on the screen 300 (not shown in FIG. 12) based on the geometric shape of the second UI element 344, etc. .
  • the electronic device 100 may determine the linear distance 1200 between the first center point 343 - o and the second center point 344 - o as the distance between the first UI element 343 and the second UI element 344 distance.
  • the electronic device 100 may determine the straight-line distance between the two center points.
  • the distance between the two UI elements can be determined as the distance between the center points of the two UI elements in a direct and clear manner, thereby improving the consistency of the manner in which the electronic device 100 determines the distance between the UI elements.
  • the calculation process of the electronic device 100 is simplified.
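  • as a minimal illustrative sketch (the data class and helper names are hypothetical, not from the present disclosure), the center-point distance described above could be computed as follows:

```kotlin
import kotlin.math.hypot

// Hypothetical representation of a UI element by its center point on the screen.
data class UiElement(val centerX: Float, val centerY: Float)

// Straight-line distance between the center points of two UI elements,
// used as the distance between the elements themselves.
fun centerPointDistance(first: UiElement, second: UiElement): Float =
    hypot(first.centerX - second.centerX, first.centerY - second.centerY)
```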
  • FIG. 13 illustrates a flowchart of an example process 1300 for determining a distance between a first UI element and a second UI element based on a radius, according to an embodiment of the present disclosure.
  • the process 1300 may be implemented by the electronic device 100 , for example, by the processor 110 or the processing unit of the electronic device 100 in cooperation with other components (eg, the display screen 194 ).
  • process 1300 may also be implemented by other devices having screens to display UI elements.
  • in the following, the electronic device 100 performing the process 1300 is taken as an example, and the process 1300 will be discussed with reference to FIG. 14.
  • FIG. 14 shows a schematic diagram of determining a distance between a first UI element and a second UI element based on a radius, according to an embodiment of the present disclosure.
  • the electronic device 100 may determine the first center point 343 - o of the first UI element 343 .
  • the first center point 343-o of the first UI element 343 may refer to the center of the first UI element 343 in a geometric sense, or may refer to the center of gravity in the physical sense when the first UI element 343 is considered as an object with uniform density.
  • the first center point 343-o of the first UI element 343 may also refer to a center point defined in any other way that represents the "center" of the first UI element 343.
  • the electronic device 100 may determine the coordinate position or pixel position of the first center point 343 - o on the screen 300 (not shown in FIG. 14 ) based on the geometric shape of the first UI element 343 , and so on .
  • the electronic device 100 may determine a plurality of circles having respective radii centered on the first center point 343-o. For example, in the example depicted in FIG. 14, the electronic device 100 may determine a first circle 1410 with radius r1, a second circle 1420 with radius r2, a third circle 1430 with radius r3, and a fourth circle 1440 with radius r4 , and a fifth circle 1450 with radius r5. It should be noted that, in some embodiments, the difference between the radii of each circle (eg, circle 1410 to circle 1450 ) may be equal, that is, r1 to r5 may form an arithmetic progression.
  • the electronic device 100 may also set the difference between the radii of the circles (for example, the circle 1410 to the circle 1450 ) to be unequal according to the user's setting or depending on the different arrangement of UI elements, That is, r1 to r5 do not form an arithmetic progression. In this way, the flexibility of generating each circle and the adaptability of each circle to the scene can be improved.
  • the electronic device 100 may determine that the second UI element 344 intersects at least one of a plurality of circles (eg, circle 1410 to circle 1450). For example, in the example depicted in FIG. 14, the electronic device 100 may determine that the second UI element 344 intersects the first circle 1410. It should be noted that, in some embodiments, a UI element does not always intersect only one circle. For example, in the example of FIG. 14, UI element 352 intersects both the first circle 1410 and the second circle 1420, and UI element 354 also intersects both the first circle 1410 and the second circle 1420.
  • at block 1340 of FIG. 13, the electronic device 100 may determine the radius of the circle with the smallest radius, among the at least one circle that intersects the second UI element 344, as the distance between the second UI element 344 and the first UI element 343.
  • the electronic device 100 may determine the radius r1 of the first circle 1410 as the distance between the second UI element 344 and the first UI element 343.
  • for UI element 352 and UI element 354, which each intersect both the first circle 1410 and the second circle 1420, the electronic device 100 may determine the circle with the smaller radius, that is, the first circle 1410.
  • the electronic device 100 may determine the distance between the UI element 352 (or the UI element 354 ) and the first UI element 343 as the radius r1 of the first circle 1410 .
  • the electronic device 100 can determine the distance between two UI elements more simply and conveniently, and make the distance between the UI elements have higher consistency, thereby simplifying the subsequent processing and calculation process based on the distance.
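  • a minimal sketch of this radius-based rule (assuming, for illustration only, that the other element is approximated by a bounding circle; the names are hypothetical) might look like:

```kotlin
import kotlin.math.hypot

// Smallest radius, among the concentric circles centred on the operated element,
// whose circle the other element intersects; returns null if none intersects.
fun radiusBasedDistance(
    operatedCenterX: Float, operatedCenterY: Float,
    radii: List<Float>,                      // r1, r2, ... (need not be an arithmetic progression)
    otherCenterX: Float, otherCenterY: Float,
    otherBoundingRadius: Float               // rough "radius" of the other element
): Float? {
    val d = hypot(otherCenterX - operatedCenterX, otherCenterY - operatedCenterY)
    // A circle of radius r crosses the other element's bounding circle when
    // r lies between d - otherBoundingRadius and d + otherBoundingRadius.
    return radii.sorted().firstOrNull { r ->
        r >= d - otherBoundingRadius && r <= d + otherBoundingRadius
    }
}
```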
  • FIGS. 15A and 15B illustrate schematic diagrams of an overall conduction manner between UI elements in a case where the distance between UI elements is determined based on a radius, according to an embodiment of the present disclosure.
  • UI elements are represented as circles with fill patterns, eg, UI element 1510.
  • a wireframe 1505 around the UI elements is used to schematically illustrate how the UI elements are arranged.
  • in FIGS. 15A and 15B, assuming that the UI element in row 3 and column 4 is operated, the electronic device 100 may determine five circles centered on that UI element, which are represented by indices 1 to 5, respectively.
  • FIG. 15A shows the linkage mode of the radius-based "gravity" animation effect of the embodiment of the present disclosure.
  • the linkage mode of the radius mode is unfolded in a circular manner.
  • the radius can be imagined as expanding outward in a wave pattern, and the center point determines the wave-like conduction relationship.
  • if a related UI element (eg, an icon) intersects a circle of a certain radius, the UI element moves according to the conduction index of that radius. If a UI element (eg, an icon) does not intersect any of the circles, then the distance between UI elements can be used to find the smallest radius that satisfies that distance.
  • the determination of the overall conduction mode is shown in Figure 15B, and the transmission of physical parameters can be expressed by the following equation:
  • “stiffness” represents the stiffness of the elastic force curve when the displacement time change curve of the UI element is an elastic force curve
  • “damping” represents the damping of the elastic force curve when the displacement time change curve of the UI element is an elastic force curve
  • the animation callback can be expressed as: onUpdate(x, y, index), which calculates the x, y displacement of the index number according to the movement of the 0 node.
  • the delta time difference of the linkage transmission of the "gravity" animation effect between UI elements with different indices can be determined based on the speed of "gravity" propagation, and an embodiment of the "gravity" propagation speed will be further described later with reference to FIG. 19.
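  • the transmission equation itself is not reproduced above; as a hedged sketch of one plausible reading (the attenuation factor is an illustrative assumption, not a value from the present disclosure), the conduction of the movement of node 0 to a ring with a given index, reported through the onUpdate(x, y, index) callback mentioned above, could look like:

```kotlin
// Spring parameters of the elastic displacement-time curve ("stiffness" and
// "damping" as described above); how they are transmitted per ring is not
// reproduced here.
data class SpringParams(val stiffness: Float, val damping: Float)

// Propagate the displacement (x0, y0) of node 0 to the element with the given
// ring index, attenuating it per ring and reporting it through onUpdate.
fun conduct(
    x0: Float, y0: Float,
    index: Int,
    attenuationPerRing: Float = 0.9f,      // assumed attenuation factor
    onUpdate: (x: Float, y: Float, index: Int) -> Unit
) {
    var factor = 1f
    repeat(index) { factor *= attenuationPerRing }
    // A full implementation would animate toward (x0 * factor, y0 * factor)
    // along an elastic curve defined by the stiffness and damping parameters.
    onUpdate(x0 * factor, y0 * factor, index)
}
```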
  • FIG. 16 illustrates a flow diagram of an example process 1600 for determining a distance between a first UI element and a second UI element based on spacing, according to an embodiment of the present disclosure.
  • the process 1600 may be implemented by the electronic device 100 , for example, by the processor 110 or the processing unit of the electronic device 100 in cooperation with other components (eg, the display screen 194 ).
  • process 1600 may also be implemented by other devices having screens to display UI elements.
  • in the following, the electronic device 100 performing the process 1600 is taken as an example, and the process 1600 will be discussed with reference to FIGS. 17A to 17F.
  • FIGS. 17A to 17F illustrate schematic diagrams of determining a distance between a first UI element and a second UI element based on spacing, according to an embodiment of the present disclosure.
  • the electronic device 100 may determine the lateral spacing between the first UI element and the second UI element.
  • the spacing between UI elements may refer to the distance between two adjacent borders of two UI elements. Therefore, the horizontal spacing may refer to the border distance of two UI elements in the horizontal direction relative to the screen 300 , and the vertical spacing may refer to the border distance of the two UI elements in the vertical direction relative to the screen 300 .
  • for example, in the example of FIG. 17A, the electronic device 100 may determine that the horizontal spacing between the UI element 343 and the UI element 344 is 1710.
  • in the example of FIG. 17B, the electronic device 100 may determine that the horizontal spacing between the UI element 343 and the UI element 353 is 0.
  • in the example of FIG. 17C, the electronic device 100 may determine that the horizontal spacing between the UI element 343 and the UI element 354 is 1710.
  • the electronic device 100 may determine the longitudinal spacing between the first UI element and the second UI element. For example, in the example of FIG. 17A, since the UI element 343 and the UI element 344 are arranged horizontally with respect to the screen 300, the electronic device 100 may determine that the longitudinal spacing between the UI element 343 and the UI element 344 is 0. In the example of FIG. 17B, since the UI element 343 and the UI element 353 are arranged vertically with respect to the screen 300, the electronic device 100 may determine that the longitudinal spacing between the UI element 343 and the UI element 353 is 1720. In the example of FIG. 17C, since the UI element 343 and the UI element 354 are arranged obliquely with respect to the screen 300, the electronic device 100 may determine that the longitudinal spacing between the UI element 343 and the UI element 354 is 1720.
  • the electronic device 100 may determine the distance between the second UI element and the first UI element based on at least one of the horizontal spacing 1710 and the longitudinal spacing 1720 and on the first direction of the first movement of the second UI element. For example, in the example of FIG. 17A, since the horizontal spacing between UI element 343 and UI element 344 is 1710, the longitudinal spacing is 0, and the first direction 344-d1 of the first movement of UI element 344 toward or away from UI element 343 (the direction away from UI element 343 in FIG. 17A) is parallel to the horizontal direction of screen 300, the electronic device 100 can determine that the distance between UI element 343 and UI element 344 is the horizontal spacing 1710 between them.
  • similarly, in the example of FIG. 17B, since the first direction of the first movement of UI element 353 toward (not shown) or away from UI element 343 is parallel to the longitudinal direction of the screen 300, the electronic device 100 can determine that the distance between the UI element 343 and the UI element 353 is the longitudinal spacing 1720 between them. It should be noted that, in the examples of FIG. 17A and FIG. 17B, the electronic device 100 may also determine the projection of the horizontal spacing 1710 (FIG. 17A) or the longitudinal spacing 1720 (FIG. 17B) in the first direction as the distance between the two UI elements.
  • the distance between UI element 343 and UI element 354 may be determined by the projection of the horizontal spacing 1710 and the longitudinal spacing 1720 based on the first direction 354-d1. As an example, as shown in FIG. 17D, the electronic device 100 may determine a right triangle with the horizontal spacing 1710 and the longitudinal spacing 1720 as the two legs, the hypotenuse of which is 1725. Then, based on the first direction 354-d1 of the first movement of the UI element 354, the electronic device 100 may determine the distance 1730 between the UI element 343 and the UI element 354 within the right triangle.
  • for example, the electronic device 100 can calculate the angle between the first direction 354-d1 and the horizontal direction, or the angle between the first direction 354-d1 and the vertical direction, and use trigonometric functions to calculate the distance 1730.
  • both the horizontal spacing 1710 and the longitudinal spacing 1720 are utilized during the projection calculation process based on the first direction 354-d1.
  • electronic device 100 may use only one of horizontal distance 1710 and vertical distance 1720 to determine the distance between UI element 343 and UI element 354 according to the specific orientation of first direction 354-d1. For example, as shown in FIG. 17E , the electronic device 100 may determine whether the first direction 354 - d1 is closer to the horizontal direction or the vertical direction of the screen 300 . If the first direction 354-d1 is closer to the horizontal direction, the electronic device 100 may only use the lateral distance 1710 to determine the distance between the UI element 343 and the UI element 354.
  • conversely, if the first direction 354-d1 is closer to the vertical direction, the electronic device 100 may only use the longitudinal spacing 1720 to determine the distance between the UI element 343 and the UI element 354.
  • the electronic device 100 can determine that the distance between the UI element 343 and the UI element 354 is 1740 based on the auxiliary line 1712 perpendicular to the horizontal distance 1710 .
  • the electronic device 100 may determine that the distance between the UI element 343 and the UI element 354 is 1750 based on the auxiliary line 1722 perpendicular to the longitudinal distance 1720.
  • such a calculation method may also be referred to as a segmented calculation method in this document, that is, different segmented calculations are performed according to the horizontal spacing and the longitudinal spacing and according to different directions.
  • that is, the electronic device 100 can determine the included angles between the first direction and the horizontal and vertical directions, and if the first direction is more inclined toward one of the horizontal direction and the vertical direction, the distance can be calculated along that direction.
  • in each of these cases, the length of the segment along the first direction, that is, the distance (for example, the distance 1730, 1740 or 1750), is taken as the distance between the two UI elements.
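  • a minimal sketch of this segmented choice (the 45-degree threshold and the exact trigonometric form are assumptions for illustration) could be:

```kotlin
import kotlin.math.PI
import kotlin.math.abs
import kotlin.math.atan2
import kotlin.math.cos
import kotlin.math.sin

// Pick the lateral or the longitudinal spacing depending on whether the first
// direction (dirX, dirY) is closer to the horizontal or the vertical of the
// screen, then extend along that direction up to the corresponding auxiliary line.
fun segmentedDistance(lateralSpacing: Float, longitudinalSpacing: Float, dirX: Float, dirY: Float): Float {
    val angleFromHorizontal = atan2(abs(dirY), abs(dirX))     // in [0, PI / 2]
    return if (angleFromHorizontal <= (PI / 4).toFloat()) {
        lateralSpacing / cos(angleFromHorizontal)             // closer to horizontal: use the lateral spacing
    } else {
        longitudinalSpacing / sin(angleFromHorizontal)        // closer to vertical: use the longitudinal spacing
    }
}
```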
  • in the embodiments described above, the first direction of the first movement of the UI element affected by the "gravitational force" is used as the reference direction, and the distance between two UI elements is then determined based on one or both of the horizontal spacing and the longitudinal spacing between the UI elements.
  • however, the embodiments of the present disclosure are not limited thereto; an arbitrary direction may equally be used as the reference direction, and the distance between two UI elements may then be determined based on one or both of the horizontal spacing and the longitudinal spacing between the UI elements.
  • the reference direction used in place of the first direction of the UI element described above may include, but is not limited to, a horizontal direction (eg, relative to screen 300), a vertical direction (eg, relative to screen 300), or some fixed direction (eg, relative to screen 300), and so on.
  • the spacing-based UI element distance calculation method proposed by the embodiments of the present disclosure can be more widely used in scenarios where UI elements of different sizes are arranged at the same spacing.
  • as shown in FIG. 17F, UI elements of various sizes may be displayed on the screen of the electronic device 100, for example, UI element 1760, UI element 1762, UI element 1764, UI element 1766, etc., wherein UI element 1764 is the largest, UI element 1760 is next, UI element 1762 is next, and UI element 1766 is the smallest.
  • although UI elements 1760-1766 have different sizes, the horizontal spacing 1775 and the vertical spacing 1765 between them may be the same.
  • the so-called spacing may be the distance of the border between two UI elements (eg, cards) or other controls.
  • the spacing between UI elements may be different.
  • the distance between all UI elements can be directly calculated through the projection calculation method described above.
  • the distance of the lateral movement is the lateral distance 1775
  • the distance of the longitudinal movement is the vertical distance 1765 .
  • the horizontal and vertical spacing of each UI element may be different.
  • This spacing value can be determined when the UI elements are laid out, and can follow the properties of the current UI element (eg, a control). After the horizontal spacing and the longitudinal spacing are determined, the distance in each first direction can be calculated from these two spacings. In addition, after the distance is determined, based on the principle of elastic motion, the electronic device 100 can perform chained linkage of UI element animation effects as needed. In the chained linkage process, all parameters can be adjusted, and the entire conduction formula can be used to perform the gravity-related movement of the UI elements according to the values calculated from the various relevant parameters. Through the example process 1600, the electronic device 100 can determine the distance between UI elements based on the spacing between them, thereby improving the flexibility and rationality of the distance determination method, especially in scenarios where the spacing between UI elements is basically consistent.
  • it should be noted that the first UI element operated by the user of the electronic device 100 may not exert a "gravitational" effect, that is, an "attractive force" or "repulsive force", on all UI elements on the screen 300; rather, there is a certain "gravitational" influence range.
  • the electronic device 100 can set the "gravity" influence range of a UI element to an appropriate size, so as to keep the "gravity" animation effect in line with the laws of nature while reducing the amount of calculation required by the electronic device 100 to realize the "gravity" animation effect, thereby saving computing resources.
  • the "attractive" or “repulsive" sphere of influence (or area of influence) of a UI element may also be referred to as gravitational sphere, gravitational range, gravitational range of influence, and the like. It should be understood that the gravitational reach of a UI element can be an area having any shape. In some embodiments, the gravitational range of the UI element may be a circular area centered on the UI element. This setting conforms to the laws of nature, because in nature, the gravitational range of an object is usually considered to be a sphere centered on the object.
  • the gravitational range of UI elements can also be set to other regular shapes (for example, square) or irregular shapes, so as to improve the setting of the gravitational range. flexibility.
  • the electronic device 100 may set the gravitational range of each UI element to be the same, which may simplify the calculation process of the electronic device 100 regarding the gravitational range of the UI element.
  • the electronic device 100 may set the gravitational range of the UI element according to the size of the UI element.
  • the electronic device 100 may determine the area of influence of the first UI element based on the size of the operated first UI element. For example, in the example of FIG. 18A , assuming that the UI element 343 is the first UI element to be operated, the electronic device 100 may determine that the UI element 343 has an area of influence (ie, a gravitational range) 1800 according to the size of the UI element 343 . That is to say, taking the center of attraction or repulsion as the center of the circle, the UI elements within the radius R will be affected by the "gravitational force" of the UI element 343, and the electronic device 100 can implement displacement animation for these UI elements to simulate Attractive or repulsive effects.
  • the radius R may be related to the size of the UI element itself, and a larger UI element may have a larger R.
  • the gravitational influence range of a UI element may be represented as (min, max). That is to say, the size of the UI element can be considered as proportional to the size of the "gravity” range, that is, it can be deduced that the "mass” of the UI element is proportional to its “gravitational” range.
  • the specific values of the upper and lower limits of the gravitational influence range can be set by the application side, and the distance from the center point of the UI element being operated needs to be within this range to produce the gravitational animation effect. In the example depicted in FIG. 18A, the area of influence 1800 of the UI element 343 is depicted as a circular area of radius R with the center point 343-o of the UI element 343 as the center. Then, the electronic device 100 may determine the UI elements within the area of influence 1800, among the M (24 in this example) UI elements on the screen 300, as the N UI elements that will be affected by the "gravitational force" of the UI element 343.
  • UI elements within area of influence 1800 include UI element 332 , UI element 333 , UI element 334 , UI element 342 , UI element 344 , UI element 352 , UI element 353 , and UI element 354 .
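  • a minimal sketch of this filtering step (the data class and the use of the center-to-center distance are illustrative assumptions) might look like:

```kotlin
import kotlin.math.hypot

// Hypothetical element representation: id, center point and size.
data class Element(val id: Int, val centerX: Float, val centerY: Float, val size: Float)

// Return the N elements, out of all M elements on screen, whose center-to-center
// distance from the operated element falls within the (min, max) gravitational
// influence range set by the application.
fun affectedElements(
    operated: Element,
    allElements: List<Element>,
    minDistance: Float,
    maxDistance: Float
): List<Element> = allElements.filter { other ->
    if (other.id == operated.id) return@filter false
    val d = hypot(other.centerX - operated.centerX, other.centerY - operated.centerY)
    d in minDistance..maxDistance
}
```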
  • the small black dots represent the starting positions, before the "gravity" animation effect begins, of UI element 332, UI element 333, UI element 334, UI element 342, UI element 344, UI element 352, UI element 353, and UI element 354 within UI element 343's area of influence 1800, and the cross symbols indicate the current positions of the various UI elements. That is, at the moment shown, UI element 332, UI element 333, UI element 334, UI element 342, UI element 344, UI element 352, UI element 353, and UI element 354 surrounding UI element 343 have already moved their respective target distances in the first direction toward the UI element 343 and will then begin to return to their respective starting positions in the second direction away from the UI element 343.
  • similarly, the small black dots indicate the starting positions, before the "gravity" animation effect starts, of UI element 332, UI element 333, UI element 334, UI element 342, UI element 344, UI element 352, UI element 353, and UI element 354 around UI element 343, and the cross symbols indicate the current positions of each UI element. At the moment shown, UI element 332, UI element 333, UI element 334, UI element 342, UI element 344, UI element 352, UI element 353, and UI element 354 surrounding UI element 343 within the area of influence 1800 have completed the second movement away from UI element 343 and returned to their respective starting positions.
  • UI elements outside UI element 343's area of influence 1800, including UI elements 311-314, UI elements 321-324, UI element 331, UI element 341, UI element 351, and UI elements 361-364, will be unaffected by the "gravity" of UI element 343 and can thus remain motionless during the "gravity" animation effect.
  • process 1900 may be implemented by the electronic device 100 , for example, by the processor 110 or the processing unit of the electronic device 100 in cooperation with other components (eg, the display screen 194 ). In other embodiments, process 1900 may also be implemented by other devices having screens to display UI elements.
  • in the following, the electronic device 100 performing the process 1900 is taken as an example, and the process 1900 will be discussed with reference to FIGS. 3B to 3D.
  • the electronic device 100 may determine a first point in time T1 at which the operation on the UI element 343 is performed. For example, the electronic device 100 may record the point in time when the user operates the UI element 343 .
  • the electronic device 100 may determine the delay Delay-344 between the second time point T2 and the first time point T1 associated with the UI element 344 as the distance between the center point 344-o and the center point 343-o divided by the predetermined speed s.
  • similarly, the electronic device 100 may determine the delay Delay-311 associated with the UI element 311 as the distance between the center point 311-o and the center point 343-o divided by the predetermined speed s. It should be understood that since the distance between the center point 311-o and the center point 343-o is greater than the distance between the center point 344-o and the center point 343-o, the delay Delay-311 will be greater than the delay Delay-344.
  • the electronic device 100 may determine a second time point T2 at which the second UI element begins the first movement based on the first time point T1 and the delay Delay. For example, in the example of FIG. 3D , the electronic device 100 may add a delay Delay-344 to the first time point T1 to obtain a second time point T2-344 at which the UI element 344 starts to make the first movement. Similarly, in the example of FIG. 3D , the electronic device 100 may add a delay Delay-311 to the first time point T1 to obtain a second time point T2-311 at which the UI element 311 starts to perform the first movement.
  • the electronic device 100 may cause the second UI element to begin the first movement at the second point in time T2.
  • the electronic device 100 may cause the UI element 344 to start the first movement at the second time point T2-344.
  • the electronic device 100 may cause the UI element 311 to start the first movement at the second time point T2-311.
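  • the relationship described above (Delay equals the center-point distance divided by the predetermined speed s, and T2 = T1 + Delay) can be sketched as follows; the millisecond and pixel units are illustrative assumptions:

```kotlin
// Second time point T2 at which an affected element starts its first movement,
// given the first time point T1 of the operation, the distance between the two
// center points, and the predetermined propagation speed s.
fun startTime(t1Millis: Long, centerDistancePx: Float, speedPxPerMillis: Float): Long =
    t1Millis + (centerDistancePx / speedPxPerMillis).toLong()

// Example: with s = 2 px/ms, an element whose center is 400 px away starts
// 200 ms after T1, while an element 900 px away starts 450 ms after T1,
// i.e. the farther element starts later.
```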
  • therefore, the UI element 311 will start the "gravity" animation effect later than the UI element 344. That is to say, the delay before the "gravity" animation effect of the embodiment of the present disclosure starts may be proportional to the distance r between the affected UI element and the operated UI element, and the transmission speed of the wave is defined as s, which can be adjusted by the application side.
  • the UI element of the first wave of motion (that is, the UI element closest to the center point within the influence range of the UI element being operated, assuming that its distance from the center point is r0) can have no delay; r0 is also an adjustable parameter, which is determined by the application side.
  • the delay for other affected UI elements (eg distance r from the center point of the UI element being manipulated) can be:
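  • the formula is not reproduced in the text above; one reconstruction consistent with the surrounding description (no delay for the innermost wave at distance r0 and a propagation speed s; this is an assumption, not the original equation) is:

```latex
\mathrm{Delay}(r) \;=\; \frac{r - r_0}{s}, \qquad r \ge r_0
```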
  • in this way, the UI of the electronic device 100 can visually present the linkage of the "gravitational force", that is, the movement caused by the "attractive force" or "repulsive force" propagates with distance, so that the animation effect of the UI better conforms to the user's usage habits, thereby further improving the user experience.
  • FIGS. 19B-19E are schematic diagrams showing a comparison of different displacement time curves of three UI elements affected by “gravitational force” taking into account the propagation delay of “gravitational force” according to an embodiment of the present disclosure.
  • FIG. 19B shows that the displacement-time curves of the first movement of three UI elements, UI element 344, UI element 324 and UI element 311 in the example described above with reference to FIGS. 3C to 3F, under the influence of the "gravity" of UI element 343 and taking into account the propagation delay of "gravity", are all Bezier curves.
  • FIG. 19C shows that the displacement-time curves of the first movement of UI element 344, UI element 324, and UI element 311 in the example described above with reference to FIGS. 3C to 3F, under the "gravitational force" of UI element 343 and taking into account the propagation delay of "gravity", are all inverse proportional curves.
  • FIG. 19D shows that the displacement-time curves of the second movement of UI element 344, UI element 324, and UI element 311 in the example described above with reference to FIGS. 3C to 3F, under the "gravitational force" of UI element 343 and taking into account the propagation delay of "gravity", are all critically damped elastic force curves.
  • FIG. 19E shows that the displacement-time curves of the second movement of UI element 344, UI element 324, and UI element 311 in the example described above with reference to FIGS. 3C to 3F, under the "gravitational force" of UI element 343 and taking into account the propagation delay of "gravity", are all underdamped elastic force curves.
  • It should be noted that FIGS. 19B to 19E depict the displacement-time curves of three UI elements in an exemplary manner, to illustrate that the first and second movements of different UI elements under the influence of the "gravitational force" of the same UI element can have different displacement-time curves, respectively, and that there is a time difference or delay between when they start the first movement or the second movement.
  • the first and second displacements of other UI elements that are affected by the "gravity" of UI element 343 depicted in FIGS. 3C-3F may have similar displacement time profiles and delays.
  • the abscissa represents time and the ordinate represents displacement (or distance)
  • the displacement-time curve 1910 of the first movement of the second UI element 344 previously depicted in FIG. 5 may be a second-order Bezier curve, the displacement-time curve 1912 of the first movement of the UI element 324 may be a second-order Bezier curve, and the displacement-time curve 1914 of the first movement of the UI element 311 may also be a second-order Bezier curve.
  • Bezier curves 1910, 1912, and 1914 may have different parameters.
  • UI element 344 may have the largest target distance D0-344 because UI element 344 is closest to the UI element 343 being manipulated, and the first movement has the earliest start time t19-1. Because UI element 324 is farther from UI element 343 being manipulated than UI element 344, UI element 324 may have a target distance D0-324 that is less than the target distance D0-344 of UI element 344, and the start time of the first move t19-2 is later than the start time t19-1 of the first movement of UI element 344.
  • because UI element 311 is farther from the UI element 343 being manipulated than UI element 324, UI element 311 may have a target distance D0-311 that is smaller than UI element 324's target distance D0-324, and the start time t19-3 of its first movement is later than the start time t19-2 of the first movement of UI element 324.
  • the UI element 344 begins to prepare for the first movement under the "gravitational force" of the UI element 343.
  • the UI element 324 begins to prepare for the first movement under the "gravitational force" of the UI element 343.
  • UI elements 344, 324, and 311 have moved distances D1-344, D1-324, and D1-311 in their respective first directions (D1-311 is 0 in the example of FIG. 19B because UI element 311 has not yet started the first movement).
  • the UI element 311 begins to prepare for the first movement under the action of the "gravity" of the UI element 343.
  • UI elements 344, 324, and 311 move distances D2-344, D2-324, and D2-311 in their respective first directions.
  • the UI element 344 moves the target distance D0-344 in the first direction.
  • the UI element 324 moves the target distance D0-324 in the first direction.
  • the UI element 311 moves the target distance D0-311 in the first direction.
  • the electronic device 100 can determine where the UI elements 344 , 324 and 311 are located at each moment on the displacement time curves 1910 , 1912 and 1914 according to the time interval corresponding to the refresh frequency of the screen 300 and then display UI elements 344, 324 and 311 at corresponding positions on the screen 300 at different moments, so that the animation effect of UI elements 344, 324 and 311 performing their respective first movements can be realized.
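  • the per-frame sampling described above can be sketched as follows; the curve function, refresh interval, and delay handling are illustrative assumptions rather than the exact implementation of the present disclosure:

```kotlin
// Displacement of one UI element at a given frame: the element's displacement-time
// curve is sampled at the screen refresh interval, shifted by the element's start delay.
class GravityAnimator(
    private val refreshIntervalMs: Long,                   // eg, about 16 ms at a 60 Hz refresh rate
    private val curve: (timeSinceStartMs: Long) -> Float   // displacement-time curve of this element
) {
    fun displacementAtFrame(frameIndex: Int, startDelayMs: Long): Float {
        val t = frameIndex * refreshIntervalMs - startDelayMs
        return if (t < 0) 0f else curve(t)                 // the element has not started moving yet
    }
}
```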
  • the displacement-time curve 1920 of the first movement of the second UI element 344 previously depicted in FIG. 5 may be an inverse proportional curve, the displacement-time curve 1922 of the first movement of the UI element 324 may be an inverse proportional curve, and the displacement-time curve 1924 of the first movement of the UI element 311 may also be an inverse proportional curve. Note that inverse proportional curves 1920, 1922, and 1924 may have different parameters.
  • UI element 344 may have the largest target distance D0-344 because UI element 344 is closest to the UI element 343 being manipulated, and the first movement has the earliest start time t19-1. Because UI element 324 is farther from UI element 343 being manipulated than UI element 344, UI element 324 may have a target distance D0-324 that is less than the target distance D0-344 of UI element 344, and the start time of the first move t19-2 is later than the start time t19-1 of the first movement of UI element 344.
  • because UI element 311 is farther from the UI element 343 being manipulated than UI element 324, UI element 311 may have a target distance D0-311 that is smaller than UI element 324's target distance D0-324, and the start time t19-3 of its first movement is later than the start time t19-2 of the first movement of UI element 324.
  • the UI element 344 begins to prepare for the first movement under the action of the "gravity" of the UI element 343.
  • the UI element 324 begins to prepare for the first movement under the "gravitational force" of the UI element 343.
  • the UI element 311 begins to prepare for the first movement under the action of the "gravity" of the UI element 343.
  • UI elements 344, 324, and 311 move distances D1-344, D1-324, and D1-311 in their respective first directions.
  • UI elements 344, 324, and 311 move distances D2-344, D2-324, and D2-311 in their respective first directions.
  • the UI element 344 moves the target distance D0-344 in the first direction.
  • the UI element 324 moves the target distance D0-324 in the first direction.
  • the UI element 311 moves the target distance D0-311 in the first direction.
  • the electronic device 100 can determine where the UI elements 344 , 324 and 311 are located at each moment on the displacement time curves 1920 , 1922 and 1924 according to the time interval corresponding to the refresh frequency of the screen 300 . and then display UI elements 344, 324 and 311 at corresponding positions on the screen 300 at different moments, so that the animation effect of UI elements 344, 324 and 311 performing their respective first movements can be realized.
  • the displacement-time curve 1930 of the second movement of the second UI element 344 previously depicted in FIG. 5 may be a critically damped spring force curve, the displacement-time curve 1932 of the second movement of the UI element 324 may be a critically damped spring force curve, and the displacement-time curve 1934 of the second movement of the UI element 311 may also be a critically damped spring force curve.
  • the respective first movements of UI elements 344, 324 and 311 have the same duration, so the delay between the start times of their respective second movements will be the same as the delay between the start times of their respective first movements.
  • UI element 344 may have the largest target distance D0-344 because UI element 344 is closest to the UI element 343 being manipulated, and its second movement has the earliest start time t19-9. Because UI element 324 is farther from the UI element 343 being manipulated than UI element 344, UI element 324 may have a target distance D0-324 that is smaller than the target distance D0-344 of UI element 344, and the start time t19-10 of its second movement is later than the start time t19-9 of the second movement of UI element 344.
  • similarly, UI element 311 may have a target distance D0-311 that is smaller than the target distance D0-324 of the UI element 324, and the start time t19-11 of its second movement is later than the start time t19-10 of the second movement of UI element 324.
  • the UI element 344 has completed the first movement under the "gravitational force" of the UI element 343, and begins to prepare for the second movement.
  • the UI element 324 has completed the first movement under the "gravitational force" of the UI element 343, and begins to prepare for the second movement.
  • the UI element 311 has completed the first movement under the action of the "gravity” of the UI element 343, and begins to prepare for the second movement.
  • UI elements 344, 324 and 311 move distances D1-344, D1-324 and D1-311 in the respective second directions.
  • UI elements 344, 324 and 311 move distances D2-344, D2-324 and D2-311 in the respective second directions.
  • the UI element 344 moves the target distance D0-344 in the second direction.
  • the electronic device 100 can determine where the UI elements 344 , 324 and 311 are located at each moment on the displacement time curves 1930 , 1932 and 1934 according to the time interval corresponding to the refresh rate of the screen 300 and then display UI elements 344, 324 and 311 at corresponding positions on the screen 300 at different moments, so that the animation effect of UI elements 344, 324 and 311 performing their respective second movements can be realized.
  • in the example of FIG. 19E, the displacement-time curve 1940 of the second movement of the second UI element 344 may be an underdamped spring force curve, the displacement-time curve 1942 of the second movement of the UI element 324 may be an underdamped spring force curve, and the displacement-time curve 1944 of the second movement of the UI element 311 may also be an underdamped spring force curve.
  • the respective first movements of UI elements 344, 324 and 311 have the same duration, so the delay between the start times of their respective second movements will be the same as the delay between the start times of their respective first movements.
  • underdamped elastic force curves 1940, 1942, and 1944 may have different parameters.
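  • for reference, a standard underdamped spring response (released from the moved position D0 with zero initial velocity; this is the textbook model, not necessarily the exact parameterisation used for curves 1940, 1942 and 1944) is:

```latex
x(t) = D_0\, e^{-\zeta \omega_n t}\left(\cos(\omega_d t) + \frac{\zeta \omega_n}{\omega_d}\sin(\omega_d t)\right),
\qquad \omega_d = \omega_n\sqrt{1-\zeta^{2}},\quad \zeta < 1
```

  • here the natural frequency is ω_n = √(stiffness/mass) and the damping ratio is ζ = damping / (2·√(stiffness·mass)), so the "stiffness" and "damping" parameters mentioned earlier determine how quickly the oscillation decays.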
  • UI element 344 may have the largest target distance D0-344 because UI element 344 is closest to the UI element 343 being manipulated, and its second movement has the earliest start time t19-9. Because UI element 324 is farther from the UI element 343 being manipulated than UI element 344, UI element 324 may have a target distance D0-324 that is smaller than the target distance D0-344 of UI element 344, and the start time t19-10 of its second movement is later than the start time t19-9 of the second movement of UI element 344.
  • similarly, UI element 311 may have a target distance D0-311 that is smaller than the target distance D0-324 of the UI element 324, and the start time t19-11 of its second movement is later than the start time t19-10 of the second movement of UI element 324.
  • the UI element 344 has completed the first movement under the "gravitational force" of the UI element 343, and begins to prepare for the second movement.
  • the UI element 324 has completed the first movement under the "gravitational force" of the UI element 343, and begins to prepare for the second movement.
  • the UI element 311 has completed the first movement under the action of the "gravity” of the UI element 343, and begins to prepare for the second movement.
  • UI elements 344, 324 and 311 move distances D3-344, D3-324 and D3-311 in the respective second directions.
  • UI elements 344, 324 and 311 move distances D4-344, D4-324 and D4-311 in their respective second directions.
  • the UI element 344 moves the target distance D0-344 in the second direction.
  • the UI element 324 moves the target distance D0-324 in the second direction.
  • the UI element 311 moves the target distance D0-311 in the second direction. Note that in the example shown in Figure 19E, UI elements 344, 324, and 311 will "reciprocate" back and forth at their respective starting positions based on the displacement time profiles of their respective underdamped spring force profiles.
  • the electronic device 100 can determine where the UI elements 344, 324 and 311 are located at each moment on the displacement-time curves 1940, 1942 and 1944 according to the time interval corresponding to the refresh frequency of the screen 300, and then display UI elements 344, 324 and 311 at corresponding positions on the screen 300 at different moments, so that the animation effect of UI elements 344, 324 and 311 performing their respective second movements can be realized.
  • It should be noted that, in the example of FIG. 19E, UI elements such as UI elements 344, 324, and 311, as well as other UI elements affected by the "gravitational force" of UI element 343, may perform the second movement according to underdamped elastic force curves with different parameters (for example, different start times, different target distances, etc.) (in some embodiments, the first movement can also be performed according to an underdamped elastic force curve). Therefore, while these UI elements perform the "gravity" animation effect, especially during multiple back-and-forth "reciprocating" motions, these UI elements may "overlap", that is, one UI element may cover one or more other UI elements.
  • the electronic device 100 may select a displacement-time curve like the ones depicted in FIGS. 19B-19D to control the "gravity" animation effect of the movement of the UI elements.
  • depending on the target distance (ie, the movement magnitude), overlapping of UI elements may occur during the "gravity" animation of multiple UI elements.
  • Embodiments of the present disclosure do not preclude such overlapping of UI elements. In other words, whether or not UI elements overlap during the "gravity" animation effect should be considered within the scope of embodiments of the present disclosure.
  • the "gravity" animation effect proposed by the embodiments of the present disclosure is not limited to the example operation scenario described above in which a UI element is clicked, but can be applied to scenarios of various other operations on UI elements.
  • operations on the first UI element may include click operations, move operations, merge operations with other UI elements, expand operations, delete operations, and the like.
  • the electronic device can implement a "gravity" animation effect in almost all operations related to UI elements, thereby enhancing the user experience in more operation scenarios. The following describes, with reference to FIGS. 20A-20D, 21, and 22A-22D, the "gravity" animation effect in an example scene where a UI element is moved and exchanges positions with another UI element.
  • FIGS. 20A-20D illustrate schematic diagrams of a "gravity" animation effect produced in a scene where a UI element is moved and swaps positions with another UI element, according to embodiments of the present disclosure.
  • the hand 370 of the user of the electronic device 100 presses the UI element 343 , and then drags the UI element 343 to the vicinity of the UI element 333 located above the UI element 343 .
  • as shown in FIG. 20B, in response to the manipulation of UI element 343 by the user's hand 370, UI element 343 and UI element 333 exchange positions.
  • that is, UI element 343 will move to the previous position of UI element 333, and UI element 333 will move to the previous position of UI element 343.
  • UI element 333 is initially at the initial position of row 3, column 3, while UI element 343 is initially at the initial position of row 4, column 3.
  • an "initial position" may refer to the position at which a UI element is initially located prior to a user's action on the UI element, which is different from the "starting position" at which the UI element is located when the "gravity" animation effect described above is triggered.
  • UI element 343 has completed the position exchange with UI element 333, so UI element 343 is currently located at row 3, column 3, and UI element 333 is located at row 4, column 3.
  • since the operated UI element 343 has come to a new position, it can be imagined that the previous "gravitational" equilibrium state is "broken", which will have a "gravitational" effect on the surrounding UI elements.
  • the "gravitational force” that the UI element 343 exerts on surrounding UI elements after coming to a new position may be set to "repulsive force".
  • UI elements surrounding UI element 343 will first have a first displacement in a first direction away from UI element 343, and will then have a second displacement in a second direction toward UI element 343, returning to their respective starting point. More specifically, UI element 311 will perform a first movement in a first direction 311-d1 away from UI element 343, UI element 312 will perform a first movement in a first direction 312-d1 away from UI element 343, and UI element 313 will make a first movement in a first direction 313-d1 away from UI element 343, and UI element 314 will make a first movement in a first direction 314-d1 away from UI element 343.
  • UI element 321 will make a first movement in a first direction 321-d1 away from UI element 343
  • UI element 322 will make a first movement in a first direction 322-d1 away from UI element 343
  • UI element 323 will make a first movement in a first direction 323-d1 away from UI element 343, and UI element 324 will make a first movement in a first direction 324-d1 away from UI element 343.
  • UI element 331 will make a first movement in a first direction 331-d1 away from UI element 343
  • UI element 332 will make a first movement in a first direction 332-d1 away from UI element 343
  • UI element 334 will make a first movement along a first direction 334-d1 away from UI element 343.
  • UI element 341 will make a first movement in a first direction 341-d1 away from UI element 343
  • UI element 342 will make a first movement in a first direction 342-d1 away from UI element 343
  • UI element 333 will make a first movement in a first direction 333-d1 away from UI element 343, and UI element 344 will make a first movement in a first direction 344-d1 away from UI element 343.
  • UI element 351 will make a first movement in a first direction 351-d1 away from UI element 343
  • UI element 352 will make a first movement in a first direction 352-d1 away from UI element 343
  • UI element 353 will make a first movement in a first direction 353-d1 away from UI element 343, and UI element 354 will make a first movement in a first direction 354-d1 away from UI element 343.
  • UI element 361 will make a first movement in a first direction 361-d1 away from UI element 343
  • UI element 362 will make a first movement in a first direction 362-d1 away from UI element 343
  • UI element 363 will make a first movement along a first direction 363-d1 away from UI element 343, and UI element 364 will make a first movement along a first direction 364-d1 away from UI element 343.
  • the magnitude of the first movement (ie, the target distance or magnitude of the first movement) of each UI element affected by the "gravitational force" of the UI element 343 may decrease as that UI element's distance from UI element 343 increases.
  • the UI elements 323, 332, and 334 have the closest distances to the UI element 343, so the target distance of the first movement is the largest.
  • UI elements 322, 324, 342, and 344 are the next closest to UI element 343 (ie, at a greater distance), so their target distances for the first movement are the next largest and equal to one another.
  • UI elements 313, 331, and 353 are at a still greater distance from UI element 343, so their target distances for the first movement are again equal to one another and smaller.
  • UI elements 312, 314, 321, 341, 352, and 354 are at a still greater distance from UI element 343, so their target distances for the first movement are again equal to one another and smaller.
  • UI elements 311 and 351 are at a still greater distance from UI element 343, so their target distances for the first movement are again equal and smaller.
  • UI element 363 is at a still greater distance from UI element 343, so its target distance for the first movement is smaller still.
  • UI elements 362 and 364 are at a still greater distance from UI element 343, so their target distances for the first movement are again equal and smaller.
  • UI element 361 is the farthest from UI element 343, so its target distance for the first movement is the smallest.
  • the size of the target distance of each UI element in the first movement can be determined based on the distance between the UI element and the UI element that produces the "gravitational" influence, and the distance between the two UI elements The distance of can be determined according to any of the distance calculation methods described above with reference to FIGS. 8 to 17F .
  • the small black dots represent the starting positions of each UI element except UI element 343 before the "gravity" animation effect starts, and the cross symbols represent the current positions of the UI elements. That is, at the moment shown in FIG. 20C, the respective UI elements except UI element 343 have moved their respective target distances in the first direction away from UI element 343, and will then return to their respective starting positions in the second direction toward UI element 343.
  • in the example of FIG. 20C, the magnitude of the "repulsive force" (ie, the size of the target distance) to which a UI element is subjected from UI element 343 may depend on the distance between that UI element and UI element 343. Therefore, as schematically shown in FIG. 20C, in their respective first movements, the UI elements around UI element 343 will move different distances depending on their distance from UI element 343. For example, UI element 323 is closer to UI element 343 than UI element 313, so UI element 323 can move a greater target distance than UI element 313.
  • as shown in FIG. 20D, the small black dots represent the starting positions of each UI element except UI element 343 before the "gravity" animation effect starts, and the cross symbols represent the current positions of the UI elements. That is, at the moment shown in FIG. 20D, the respective UI elements except UI element 343 have completed the second movement toward UI element 343 and returned to their respective starting positions.
  • UI element 333 may reach the new position earlier than UI element 343, that is, when UI element 333 reaches the new position at row 4, column 3, UI element 343 may not yet have reached the new position at row 3, column 3.
  • the UI element 333 reaching the new position can be considered as the UI element whose "gravitational" balance is broken, and thus will be subjected to the "gravitational" effect of other surrounding UI elements.
  • UI element 333 may be "attracted” by a surrounding UI element to produce an "attractive" animation effect.
  • FIG. 21 shows a flowchart of an example process 2100 in which, in a scene where UI elements exchange positions according to an embodiment of the present disclosure, the UI element that first reaches the new position is subjected to the "gravitational force" of other UI elements to generate a "gravity" animation effect.
  • the process 2100 may be implemented by the electronic device 100 , for example, by the processor 110 or the processing unit of the electronic device 100 in cooperation with other components (eg, the display screen 194 ).
  • process 2100 may also be implemented by other devices having screens to display UI elements.
  • In the following, the electronic device 100 performing the process 2100 is taken as an example, and the process 2100 is discussed with reference to FIGS. 22A to 22D.
  • FIGS. 22A to 22D are schematic diagrams illustrating a "gravity” animation effect generated by the "gravitational force" of other UI elements on the UI element that first reaches the new position in a scene where UI elements exchange positions, according to an embodiment of the present disclosure.
  • The scenes depicted in FIGS. 22A to 22D fall temporally between FIGS. 20A and 20B described above. That is, the scenarios of FIGS. 22A to 22D occur during the time when UI element 333 has already reached its new position (i.e., the previous position of UI element 343), but UI element 343 has not yet reached its new position (i.e., the previous position of UI element 333).
  • It should be noted that the target distance of the "gravity" animation effect for the second UI element 333 in the example process 200 refers to the moving distance depicted in FIG. 20C, which will hereinafter be referred to as the first target distance.
  • The "attraction" animation effect for UI element 333 will also include the "attraction" animation effect depicted in FIGS. 22A to 22D in addition to the "attraction" animation effect depicted in FIGS. 20A to 20D.
  • the electronic device 100 may move the second UI element 333 from an initial position to a starting position, which may be the initial position of the first UI element 343 .
  • the initial position of the first UI element 343 is row 4, column 3
  • the initial position of the second UI element 333 is row 3, column 3.
  • Since the second UI element 333 has come to a new position, it can be imagined that the previous "gravitational" equilibrium state of the second UI element 333 is "broken", so that it will be affected by the "gravitational force" generated by the surrounding UI elements. As an example, as shown in FIG. 22B, the second UI element 333 will be "attracted" by the UI element 353 below to produce an "attraction" animation effect.
  • The UI element 353 that exerts a "gravitational force" on the second UI element 333 may be referred to as the third UI element.
  • It should be noted that although FIG. 22B depicts the force as coming from a single UI element 353, the "attractive force" or "repulsive force" on the second UI element 333 may come from one or more other UI elements, and the second UI element 333 may likewise exert an "attractive force" or "repulsive force" on one or more other UI elements.
  • the electronic device 100 may determine a second target distance that the second UI element 333 will move in the third direction 333-d3.
  • The third direction 333-d3 is the direction from the second UI element 333 to the third UI element 353, that is, the second UI element 333 is "attracted" by the third UI element 353.
  • The third direction 333-d3 may also be a direction from the third UI element 353 to the second UI element 333, that is, the second UI element 333 is subjected to the "repulsive force" of the third UI element 353. It should be understood that the electronic device 100 may determine the second target distance in the same or similar manner as described above for determining the first target distance, which will not be repeated here.
  • The electronic device 100 may cause the second UI element 333 to make a third movement from the starting position (e.g., row 4, column 3) by the second target distance along the third direction 333-d3.
  • Since the third direction 333-d3 points from the second UI element 333 to the third UI element 353, the second UI element 333 makes the third movement toward the third UI element 353.
  • the small black dot represents the starting position of the second UI element 333 before the “gravity” animation effect starts, and the cross symbol represents the current position of the second UI element 333 .
  • The electronic device 100 may cause the second UI element 333 to make a fourth movement in a fourth direction opposite to the third direction 333-d3 (e.g., away from the third UI element 353) so as to reset to the starting position (e.g., row 4, column 3).
  • The first UI element 343 may still not have reached its new position (e.g., row 3, column 3). For example, this may be because the user's hand 370 maintains the drag operation on the first UI element 343 without releasing it.
  • In this case, the electronic device 100 may cause the second UI element 333 to repeat the third and fourth movements multiple times until the first UI element 343 reaches its new position (e.g., row 3, column 3).
  • the electronic device 100 can more fully and comprehensively display the "attractive" animation effect between UI elements, thereby further improving the user experience.
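  • As a rough illustration of this repeat-until-settled behavior, the following Java sketch keeps a view oscillating between its starting position and its displaced position until it is cancelled (for example, when the dragged element finally lands). The use of ValueAnimator with a reversing, infinite repeat is an assumption; the disclosure only requires that the third and fourth movements be repeated until the first UI element reaches its new position. All names are hypothetical.

    import android.animation.ValueAnimator;
    import android.view.View;

    /**
     * Illustrative sketch: keep oscillating the second UI element (third movement
     * toward the third element, fourth movement back) until the dragged first UI
     * element has settled. Class and method names are hypothetical.
     */
    public final class RepeatUntilSettled {

        private RepeatUntilSettled() {
        }

        /** Starts the oscillation; (dx, dy) is the second target distance along the third direction. */
        public static ValueAnimator start(final View secondElement,
                                          final float dx, final float dy,
                                          long singleMoveDurationMs) {
            ValueAnimator oscillation = ValueAnimator.ofFloat(0f, 1f);
            oscillation.setDuration(singleMoveDurationMs);
            oscillation.setRepeatMode(ValueAnimator.REVERSE);   // third movement, then fourth movement back
            oscillation.setRepeatCount(ValueAnimator.INFINITE); // repeat until cancelled
            oscillation.addUpdateListener(a -> {
                float fraction = (float) a.getAnimatedValue();
                secondElement.setTranslationX(dx * fraction);
                secondElement.setTranslationY(dy * fraction);
            });
            oscillation.start();
            return oscillation;
        }

        /** Call when the first UI element reaches its new position (e.g. the drag is released). */
        public static void stop(ValueAnimator oscillation, View secondElement) {
            oscillation.cancel();
            secondElement.setTranslationX(0f);
            secondElement.setTranslationY(0f);
        }
    }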
  • FIGS. 23A to 23D illustrate schematic diagrams of a "gravity" animation effect produced in a scene where a UI element is moved and merged with another UI element, according to embodiments of the present disclosure.
  • the hand 370 of the user of the electronic device 100 presses the UI element 343 , and then drags the UI element 343 to overlap with the UI element 333 above the UI element 343 .
  • As shown in FIG. 23B, in response to the manipulation of UI element 343 by the user's hand 370, UI element 343 and UI element 333 start an animation effect of UI element merging (e.g., creating a new folder).
  • The "gravitational" effect on surrounding UI elements when UI element 343 begins to merge with UI element 333 may be set to "repulsive force". That is, UI elements surrounding UI element 343 will first have a first displacement in a first direction away from UI element 343, and will then have a second displacement in a second direction toward UI element 343, returning to their respective starting points.
  • More specifically, in their respective first movements, UI elements 311, 312, 313, and 314 will each make a first movement away from UI element 343 along their respective first directions 311-d1, 312-d1, 313-d1, and 314-d1; UI elements 321, 322, 323, and 324 will each make a first movement away from UI element 343 along their respective first directions 321-d1, 322-d1, 323-d1, and 324-d1; UI elements 331, 332, and 334 will each make a first movement away from UI element 343 along their respective first directions 331-d1, 332-d1, and 334-d1; UI elements 341, 342, and 344 will each make a first movement away from UI element 343 along their respective first directions 341-d1, 342-d1, and 344-d1; UI elements 351, 352, 353, and 354 will each make a first movement away from UI element 343 along their respective first directions 351-d1, 352-d1, 353-d1, and 354-d1; and UI elements 361, 362, 363, and 364 will each make a first movement away from UI element 343 along their respective first directions 361-d1, 362-d1, 363-d1, and 364-d1.
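  • The per-element directions listed above all point along the line between each element's center and the operated element's center. The following Java sketch, which reuses the hypothetical TargetDistanceCalculator shown earlier, illustrates how such first directions and displacements could be computed for every child of a grid container; the class, method, and field names are assumptions, not part of the disclosure.

    import android.view.View;
    import android.view.ViewGroup;
    import java.util.ArrayList;
    import java.util.List;

    /**
     * Illustrative sketch: for every child of a grid-like container, compute the
     * "first direction" pointing away from the operated element (repulsion) or
     * toward it (attraction) and the resulting displacement. All names are hypothetical.
     */
    public final class FirstDirections {

        private FirstDirections() {
        }

        /** Simple holder for one element's planned displacement. */
        public static final class Displacement {
            public final View view;
            public final float dx;
            public final float dy;

            Displacement(View view, float dx, float dy) {
                this.view = view;
                this.dx = dx;
                this.dy = dy;
            }
        }

        public static List<Displacement> compute(ViewGroup container, View operated,
                                                 float amplitudeCoeff, boolean repulsion) {
            float ocx = operated.getLeft() + operated.getWidth() / 2f;
            float ocy = operated.getTop() + operated.getHeight() / 2f;
            List<Displacement> result = new ArrayList<>();
            for (int i = 0; i < container.getChildCount(); i++) {
                View child = container.getChildAt(i);
                if (child == operated) {
                    continue; // the operated element itself is not displaced
                }
                float ccx = child.getLeft() + child.getWidth() / 2f;
                float ccy = child.getTop() + child.getHeight() / 2f;
                float vx = ccx - ocx; // vector from the operated element to this child
                float vy = ccy - ocy;
                float distance = (float) Math.hypot(vx, vy);
                if (distance == 0f) {
                    continue;
                }
                // Hypothetical target distance; see TargetDistanceCalculator above.
                float target = TargetDistanceCalculator.computeTargetDistance(
                        child.getWidth(), distance, amplitudeCoeff);
                float sign = repulsion ? 1f : -1f; // away from vs. toward the operated element
                result.add(new Displacement(child,
                        sign * target * vx / distance,
                        sign * target * vy / distance));
            }
            return result;
        }
    }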
  • The magnitude of the effect of UI element 343's "gravitational force" on each affected UI element (i.e., the target distance or magnitude of its first movement) may decrease as that UI element's distance from UI element 343 increases.
  • the UI elements 323, 332, and 334 have the closest distances to the UI element 343, so the target distance of the first movement is the largest.
  • UI elements 322, 324, 342, and 344 are the next closest to UI element 343 (i.e., at a greater distance), so their target distances for the first movement are correspondingly smaller.
  • UI elements 313, 331, and 353 are the next farthest from UI element 343 (i.e., at a still greater distance), so their target distances for the first movement are smaller again.
  • UI elements 312, 314, 321, 341, 352, and 354 are farther still from UI element 343, so their target distances for the first movement are smaller again.
  • UI elements 311 and 351 are farther still from UI element 343, so their target distances for the first movement are smaller again.
  • UI element 363 is farther still from UI element 343, so its target distance for the first movement is smaller again.
  • UI elements 362 and 364 are farther still from UI element 343, so their target distances for the first movement are smaller again.
  • UI element 361 is the farthest away from UI element 343, so the target distance of the first move is also the smallest.
  • The target distance of each UI element's first movement can be determined based on the distance between that UI element and the UI element that produces the "gravitational" influence, and the distance between the two UI elements can be determined according to any of the distance calculation methods described above with reference to FIGS. 8 to 17F.
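  • One of the distance-calculation approaches referred to here is the straight-line distance between the two elements' center points (the center-point method described earlier). A minimal Java sketch of that calculation, with hypothetical names, is shown below; the other approaches (concentric-circle radius, row/column spacing) could be substituted without changing the rest of the flow.

    import android.view.View;

    /** Illustrative sketch of the center-point distance between two UI elements. */
    public final class CenterDistance {

        private CenterDistance() {
        }

        public static float between(View first, View second) {
            // Center point of the first UI element.
            float x1 = first.getLeft() + first.getWidth() / 2f;
            float y1 = first.getTop() + first.getHeight() / 2f;
            // Center point of the second UI element.
            float x2 = second.getLeft() + second.getWidth() / 2f;
            float y2 = second.getTop() + second.getHeight() / 2f;
            // Straight-line distance between the two center points.
            return (float) Math.hypot(x2 - x1, y2 - y1);
        }
    }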
  • In FIG. 23C, the small black dots represent the starting positions of each UI element except UI elements 343 and 333 before the "gravity" animation effect started, and the cross symbols represent the current positions of those UI elements. That is, at the moment shown in FIG. 23C, the respective UI elements except UI element 343 and UI element 333 have moved their respective target distances in the first direction away from UI element 343, and will then return to their respective starting positions in the second direction toward UI element 343.
  • The magnitude of the "repulsive force" that a UI element receives from UI element 343 (i.e., the size of its target distance) may depend on the distance between that UI element and UI element 343. Therefore, as schematically shown in FIG. 23C, UI elements around UI element 343 will have different moving distances depending on their distance from UI element 343. For example, UI element 323 is closer to UI element 343 than UI element 313, so UI element 323 can move a greater target distance than UI element 313.
  • The small black dots represent the starting positions of each UI element except UI element 343 and UI element 333 before the "gravity" animation effect starts, and the cross symbols represent the current positions of those UI elements. That is, at the moment shown in FIG. 23D, the respective UI elements other than UI element 343 and UI element 333 have completed the second movement toward UI element 343 and returned to their respective starting positions. Additionally, as further shown in FIG. 23D, UI element 343 and UI element 333 have completed the merge animation, forming a new UI element 335.
  • new UI element 335 may be a folder that includes both UI element 343 and UI element 333 .
  • FIGS. 24A to 24D illustrate schematic diagrams of a "gravity" animation effect produced in a scene where a UI element is deleted, according to an embodiment of the present disclosure.
  • the user of the electronic device 100 may perform an operation to delete the UI element 343, so the UI element 343 starts to perform a deletion animation effect of gradually decreasing in size in a circular shape until disappearing.
  • the deletion animation effect when the UI element 343 is deleted depicted in FIGS. 24A to 24D is only illustrative, and is not intended to limit the scope of the present disclosure in any way.
  • Embodiments of the present disclosure are also equally applicable to any deletion animation effect used when a UI element is deleted.
  • In response to the user's delete operation on UI element 343, UI element 343 starts to become a smaller circular UI element and keeps shrinking.
  • Since the operated UI element 343 is gradually becoming smaller and disappearing, it can be imagined that the previous "gravitational" equilibrium state of UI element 343 is "broken", which will have a "gravitational" effect on the surrounding UI elements.
  • the "gravitational force" effect on surrounding UI elements when the UI element 343 begins to become smaller and disappears may be set to "attractive force".
  • UI elements around UI element 343 will first make a first movement in a first direction toward UI element 343, and will then make a second movement in a second direction away from UI element 343, returning to their respective starting points. More specifically, in their respective first movements, UI elements 311, 312, 313, and 314 will each make a first movement toward UI element 343 along their respective first directions 311-d1, 312-d1, 313-d1, and 314-d1; UI elements 321, 322, 323, and 324 will each make a first movement toward UI element 343 along their respective first directions 321-d1, 322-d1, 323-d1, and 324-d1; UI elements 331, 332, 333, and 334 will each make a first movement toward UI element 343 along their respective first directions 331-d1, 332-d1, 333-d1, and 334-d1; UI elements 341, 342, and 344 will each make a first movement toward UI element 343 along their respective first directions 341-d1, 342-d1, and 344-d1; UI elements 351, 352, 353, and 354 will each make a first movement toward UI element 343 along their respective first directions 351-d1, 352-d1, 353-d1, and 354-d1; and UI elements 361, 362, 363, and 364 will each make a first movement toward UI element 343 along their respective first directions 361-d1, 362-d1, 363-d1, and 364-d1.
  • The magnitude of the effect of UI element 343's "gravitational force" on each affected UI element (i.e., the target distance or magnitude of its first movement) may decrease as that UI element's distance from UI element 343 increases.
  • UI elements 333, 342, 344, and 353 have the closest distances to UI element 343, so the target distance of the first move is the largest.
  • UI elements 332, 334, 352, and 354 are the next closest to UI element 343 (i.e., at a greater distance), so their target distances for the first movement are correspondingly smaller.
  • UI elements 323, 341, and 363 are the next farthest from UI element 343 (i.e., at a still greater distance), so their target distances for the first movement are smaller again.
  • UI elements 322, 324, 331, 351, 362, and 364 are farther still from UI element 343, so their target distances for the first movement are smaller again.
  • UI elements 321 and 361 are farther still from UI element 343, so their target distances for the first movement are smaller again.
  • UI element 313 is farther still from UI element 343, so its target distance for the first movement is smaller again.
  • UI elements 312 and 314 are farther still from UI element 343, so their target distances for the first movement are smaller again.
  • UI element 311 is the farthest away from UI element 343, so the target distance of the first move is also the smallest.
  • The target distance of each UI element's first movement can be determined based on the distance between that UI element and the UI element that produces the "gravitational" influence, and the distance between the two UI elements can be determined according to any of the distance calculation methods described above with reference to FIGS. 8 to 17F.
  • In FIG. 24C, the small black dots represent the starting positions of each UI element except UI element 343 before the "gravity" animation effect begins, and the cross symbols represent the current positions of those UI elements. That is to say, at the moment shown in FIG. 24C, each UI element except UI element 343 has completed its first movement, having moved its target distance in the first direction toward UI element 343, and will then return to its starting position in the second direction away from UI element 343.
  • The magnitude by which a UI element is "attracted" by UI element 343 may depend on the distance between that UI element and UI element 343. Therefore, as schematically shown in FIG. 24C, UI elements around UI element 343 will have different moving distances depending on their distance from UI element 343. For example, UI element 323 is closer to UI element 343 than UI element 313, so UI element 323 can move a greater target distance than UI element 313. Additionally, as further shown in FIG. 24C, the UI element 343, which has become circular, has shrunk further compared to the moment depicted in FIG. 24B.
  • the small black dots represent the starting positions of each UI element except UI element 343 before the "gravity" animation effect starts, and the cross symbol represents the current position of each UI element. That is, at the moment shown in FIG. 24D , the respective UI elements except UI element 343 have completed their respective second movements, that is, moved away from UI element 343 and returned to their respective starting positions. Additionally, as further shown in Figure 24D, UI element 343 has completely disappeared to indicate that it has been deleted.
  • FIGS. 25A to 25D illustrate schematic diagrams of a "gravity" animation effect produced in a scene in which a UI element is expanded, according to an embodiment of the present disclosure.
  • the hand 370 of the user of the electronic device 100 may perform an operation to expand the UI element 343 .
  • expanding UI element 343 may include long-pressing UI element 343 to open a menu related to UI element 343 for the user to select or view, and then selecting or viewing the expanded menu in the open menu. Therefore, UI element 343 begins to animate the expanded menu.
  • The expansion animation effect when the UI element 343 is expanded depicted in FIGS. 25A to 25D is only illustrative, and is not intended to limit the scope of the present disclosure in any way.
  • The "gravitational" effect on surrounding UI elements when UI element 345 begins to appear at UI element 343 may be set to "repulsive force". That is, UI elements around UI element 343 will first make a first movement in a first direction away from UI element 343, and will then make a second movement in a second direction toward UI element 343, returning to their respective starting points.
  • More specifically, in their respective first movements, UI elements 311, 312, 313, and 314 will each make a first movement away from UI element 343 along their respective first directions 311-d1, 312-d1, 313-d1, and 314-d1; UI elements 321, 322, 323, and 324 will each make a first movement away from UI element 343 along their respective first directions 321-d1, 322-d1, 323-d1, and 324-d1; UI elements 331, 332, 333, and 334 will each make a first movement away from UI element 343 along their respective first directions 331-d1, 332-d1, 333-d1, and 334-d1; UI elements 341, 342, and 344 will each make a first movement away from UI element 343 along their respective first directions 341-d1, 342-d1, and 344-d1; UI elements 351, 352, 353, and 354 will each make a first movement away from UI element 343 along their respective first directions 351-d1, 352-d1, 353-d1, and 354-d1; and UI elements 361, 362, 363, and 364 will each make a first movement away from UI element 343 along their respective first directions 361-d1, 362-d1, 363-d1, and 364-d1.
  • The magnitude of the effect of UI element 343's "gravitational force" on each affected UI element (i.e., the target distance or magnitude of its first movement) may decrease as that UI element's distance from UI element 343 increases.
  • UI elements 333, 342, 344, and 353 have the closest distances to UI element 343, so the target distance of the first move is the largest.
  • UI elements 332, 334, 352, and 354 are the next closest to UI element 343 (i.e., at a greater distance), so their target distances for the first movement are correspondingly smaller.
  • UI elements 323, 341, and 363 are the next farthest from UI element 343 (i.e., at a still greater distance), so their target distances for the first movement are smaller again.
  • UI elements 322, 324, 331, 351, 362, and 364 are farther still from UI element 343, so their target distances for the first movement are smaller again.
  • UI elements 321 and 361 are farther still from UI element 343, so their target distances for the first movement are smaller again.
  • UI element 313 is farther still from UI element 343, so its target distance for the first movement is smaller again.
  • UI elements 312 and 314 are farther still from UI element 343, so their target distances for the first movement are smaller again.
  • UI element 311 is the farthest away from UI element 343, so the target distance of the first move is also the smallest.
  • The target distance of each UI element's first movement can be determined based on the distance between that UI element and the UI element that produces the "gravitational" influence, and the distance between the two UI elements can be determined according to any of the distance calculation methods described above with reference to FIGS. 8 to 17F.
  • In FIG. 25C, the small black dots represent the starting positions of each UI element except UI element 343 before the "gravity" animation effect started, and the cross symbols represent the current positions of those UI elements. That is to say, at the moment shown in FIG. 25C, each UI element except UI element 343 has completed its first movement, having moved its target distance in the first direction away from UI element 343, and will then return to its starting position in the second direction toward UI element 343.
  • The magnitude of the "repulsive force" that a UI element receives from UI element 343 (i.e., the size of its target distance) may depend on the distance between that UI element and UI element 343.
  • UI elements around UI element 343 will have different moving distances depending on the distance from UI element 343 .
  • UI element 323 is closer to UI element 343 than UI element 313, so UI element 323 can move a greater target distance than UI element 313.
  • UI element 345 that has been expanded may cover UI element 343 and surrounding UI elements 344, UI element 353, and UI element 354, rendering these UI elements invisible.
  • the small black dots represent the starting positions of each UI element except UI element 343 before the "gravity" animation effect starts, and the cross symbol represents the current position of each UI element. That is, at the moment shown in FIG. 25D , the respective UI elements except UI element 343 have completed their respective second movements, that is, moved toward UI element 343 and returned to their respective starting positions.
  • UI element 345 that has been fully expanded may cover UI element 343 and surrounding UI elements 344, UI element 353, and UI element 354, rendering these UI elements invisible.
  • As shown in FIG. 26, UI framework animation 2602 may provide a gravity animation capability 2604.
  • The gravity animation capability 2604 may be provided in the form of an AAR 2606, a JAR 2608, and a system interface 2610.
  • the desktop 2614 may implement various operations on UI elements, such as move operations 2616 , merge operations 2618 , expand operations 2620 , delete operations 2622 , and other operations 2624 .
  • the desktop 2614 can use the gravitational animation capability 2604 provided by the UI framework animation 2602 by integrating 2612 .
  • The desktop 2614 can also use the gravity animation capability 2604 provided by the UI framework animation 2602 by invoking it (e.g., through the system interface 2610). That is to say, the UI framework can provide the "gravity" animation effect capability in the form of an AAR, a JAR, and a system interface; after the desktop 2614 integrates it, the capability can be applied to the various scenarios required in the field. It should be noted that although the embodiments of the present disclosure mainly take the desktop scene as an example, the UI framework mainly provides the "gravity" animation effect capability, so the "gravity" animation effect can be implemented in any other appropriate scene besides the desktop.
  • The usage scenarios of the present disclosure may include any scenario in which arranged UI elements (e.g., icons) are associated with one another; the gravity animation can be supported in any scenario in which multiple UI elements respond to an operation on a certain UI element.
  • the more common scenarios can include operations of various icons on the desktop, such as moving, merging, deleting, expanding, etc.
  • The possible operations are not limited to the items listed above. If the desktop provides other functions or operations for UI elements in the future, they may likewise use the "gravity" animation effect capability provided by embodiments of the present disclosure.
  • the system desktop of the electronic device generally belongs to the application layer, which can integrate or invoke the capabilities of the UI framework.
  • the external capabilities of the UI framework are generally divided into three types.
  • The platform capabilities generally include the AAR method and the JAR package method. These two methods encapsulate code and provide it for application integration; they do not belong to a specific layer and are generally integrated and used within the application, residing alongside the application layer.
  • the system capabilities generally include system interfaces, which belong to the application framework layer and can be various services or capabilities provided to the above applications.
  • Figure 27 shows a schematic diagram of other application scenarios to which the "gravity" animation effect capability or function of an embodiment of the present disclosure may be applied.
  • Embodiments of the present disclosure provide a capability whose specific usage scenario is not limited; it can be used in various types of scenarios.
  • such scenarios may include, but are not limited to, a list 2710 of pictures in a gallery, a sliding list 2720 in an application market, a card moving and expanding operation 2730 on a negative screen, and a multi-tasking card linkage scenario 2740, and so on.
  • Figure 28 shows a schematic diagram of a system framework 2800 for implementing a "gravity" animation effect capability or functionality, according to an embodiment of the present disclosure.
  • The animation capabilities of the UI framework are based on the electronic device's operating system (e.g., Android or Hongmeng), which can include the mainstream four-layer logical processing, and the data processing flow is presented to users from the bottom up. Users use and experience the animation functions mainly at the application layer.
  • The capability interaction relationship between the desktop and the UI framework is depicted in FIG. 28.
  • Specifically, as shown in FIG. 28, the system framework 2800 may include an application layer 2810, an application framework layer 2830, a hardware abstraction layer 2850, and a kernel layer 2870.
  • Application layer 2810 may include desktop 2812.
  • Icon operations 2814 may be implemented on the desktop 2812. Icon operations 2814 may include, for example, move operations, merge operations, expand operations, delete operations, and other operations.
  • the application framework layer 2830 may include system services 2832 and extension services 2834.
  • System services 2832 may include various system services, such as Service 2833.
  • Extension services 2834 may include various extension services, such as HwSDK 2835.
  • Hardware Abstraction Layer (HAL) 2850 may include HAL 3.0 2852 and Algorithm Algo 2854.
  • Kernel layer 2870 may include drivers 2872 and physical devices 2874.
  • the physical device 2874 may provide the raw parameter stream to the driver 2872, and the driver 2872 may provide the functional processing parameter stream to the physical device 2874.
  • the UI framework 2820 for implementing the gravitational animation 2825 may be implemented between the application layer 2810 and the application framework layer 2830 .
  • UI framework 2820 may include platform capabilities 2822 and system capabilities 2824, both of which may be used to provide gravitational animation 2825.
  • the gravity animation 2825 may in turn be provided to the icon operation 2814 of the application layer 2810.
  • FIG. 29 shows a schematic diagram of the relationship between the application side and the UI framework side involved in the "attraction" animation effect capability or function according to an embodiment of the present disclosure.
  • the application side 2910 may include a desktop 2915, and UI elements on the desktop 2915 may implement operations such as move 2912, merge 2914, expand 2916, delete 2918, other 2920, and so on.
  • the UI frame side 2950 may include UI frame motion effects 2952.
  • the UI frame motion effects 2952 may implement the gravitational motion effect capability 2954, and the gravitational motion effect capability 2954 may be implemented by means of AAR format 2951, JAR format 2953, and system interface 2955.
  • the application side 2910 can call the "gravity” animation effect capability or function provided by the UI framework side 2950 by integrating 2930 and calling 2940.
  • Embodiments of the present disclosure implement a new type of "attraction" animation effect that links otherwise separate UI elements (e.g., icons).
  • FIG. 30 shows a schematic diagram of a specific description of three ways of implementing the "gravity" animation effect capability or function according to an embodiment of the present disclosure.
  • The relationship 3001 between the AAR format 2951 and the system of the electronic device 100 is as follows: the AAR format 2951 packages the capability in a binary format and provides it for integration on the application side in the system; its version cadence can be controlled freely without following the system.
  • The relationship 3003 between the JAR format 2953 and the system of the electronic device 100 is as follows: the JAR format 2953 packages the capability in a binary format and provides it to all components in the system; its version cadence can be controlled freely without following the system.
  • The relationship 3005 between the system interface 2955 and the system of the electronic device 100 is as follows: the system interface 2955 is an interface of the framework layer in the system version and provides the capability to all components in the system, following system upgrades. More specifically, the integration mode may refer to the AAR and JAR package modes, and the invocation mode may refer to the system interface mode. Therefore, the scene to which the embodiments of the present disclosure are applied is not limited to any specific scene, although the way of exposing the "gravity" animation effect capability may differ. That is to say, the functions of the various methods described above in the present disclosure may be implemented through an AAR format file, a JAR format file, and/or a system interface of the electronic device 100. In this way, the "gravity" animation effect capability or function can be simply and conveniently implemented and provided to an application of an electronic device, such as a desktop.
  • the interface design and solution implementation include the design and implementation of the ability to realize the gravity model.
  • the following is an example of the design and implementation of gravity model capabilities.
  • FIG. 31 shows a schematic diagram of a class diagram relationship on the animation capability side for realizing the “gravity” animation effect according to an embodiment of the present disclosure.
  • the dynamic effect capability side may include GravityAnimator class 3110
  • GravityAnimator class 3110 may include GravityField class 3120
  • GravityField class 3120 may include GravityAsteroid class 3122, GravityAsteroid class 3124, and GravityAsteroid class 3126.
  • the layout design on the application side can be combined arbitrarily and freely.
  • The GravityAnimator class 3110 may be the animation class for the entire gravity animation.
  • The GravityField class 3120 may correspond to the area of the entire gravity scene.
  • The GravityAsteroid classes 3122 to 3126 may correspond to the individual UI elements in the gravitational field.
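  • To make the containment relationship just described concrete, the following Java skeleton sketches the three classes. Only the class names GravityAnimator, GravityField, and GravityAsteroid come from the disclosure; every field and method shown here is a hypothetical placeholder.

    import java.util.ArrayList;
    import java.util.List;

    /**
     * Illustrative skeleton of the class relationship described above. Only the class
     * names come from the disclosure; the fields and methods are placeholders.
     */
    public class GravityAnimator {

        /** The whole gravity scene that the animation runs over. */
        private final GravityField field = new GravityField();

        /** Starts the gravity animation for every asteroid in the field. */
        public void start() {
            for (GravityAsteroid asteroid : field.getAsteroids()) {
                asteroid.onGravityStarted();
            }
        }

        /** The area of the entire gravity scene, holding all affected UI elements. */
        public static class GravityField {
            private final List<GravityAsteroid> asteroids = new ArrayList<>();

            public void add(GravityAsteroid asteroid) {
                asteroids.add(asteroid);
            }

            public List<GravityAsteroid> getAsteroids() {
                return asteroids;
            }
        }

        /** One UI element inside the gravitational field. */
        public static class GravityAsteroid {
            public void onGravityStarted() {
                // Hypothetical hook: compute this element's target distance and
                // schedule its first and second movements.
            }
        }
    }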
  • FIG. 32 shows an operation sequence diagram of the application side and the dynamic effect capability side for realizing the "gravity" animation effect according to an embodiment of the present disclosure.
  • the application side 3210 may include a GravityDemo class 3212 and a View class 3214
  • the dynamic effect capability side 3250 may include a GravityAnimator class 3110 , a GravityField class 3120 and a GravityAsteroid class 3122 .
  • the application side can organize the graphical representation, and the functional capability side can provide specific capabilities.
  • the timing diagram for each operation is depicted in Figure 32.
  • The operation flow may include the following. First, in the first step, the parent layout is passed in during initialization, and a listener callback is set on all affected UI elements (also called child elements). Then, in the second step, a callback is registered with android.view.Choreographer to update the position of each affected element every frame. Then, in the third step, each frame, the value of the interpolator is calculated according to the elapsed time, the position of each affected element at the current moment is calculated, and the result is passed to the child element through the callback set in the first step. Finally, in the fourth step, the child element updates its position in the callback.
  • mGravityAnimator = new GravityAnimator(pos, mViewContainer, GRAVITATION);
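  • The constructor call above is the only code fragment given. The following Java sketch fills in, purely as an assumption, what the four steps listed above could look like around it: the parent layout and operated position are passed in at initialization, a frame callback is registered with android.view.Choreographer, each frame an interpolated value is computed from the elapsed time, and that value is pushed to each affected child through the listener set in the first step. The class name, signature, and constants are hypothetical and deliberately differ from the GravityAnimator call above, so as not to imply that this is the actual implementation.

    import android.graphics.PointF;
    import android.view.Choreographer;
    import android.view.View;
    import android.view.ViewGroup;
    import android.view.animation.Interpolator;
    import android.view.animation.PathInterpolator;

    /** Hypothetical frame-driven driver; not the actual GravityAnimator implementation. */
    public class GravityAnimatorSketch implements Choreographer.FrameCallback {

        /** Step 1: listener callback set on every affected child element. */
        public interface ChildUpdateListener {
            void onChildOffset(View child, float fraction);
        }

        private final PointF operatedPos;      // position of the operated UI element
        private final ViewGroup viewContainer; // parent layout passed in at initialization
        private final ChildUpdateListener listener;
        private final Interpolator interpolator = new PathInterpolator(0.2f, 0.8f, 0.2f, 1f);
        private final long durationMs = 350L;  // hypothetical duration of the first movement
        private long startTimeNanos = -1L;

        public GravityAnimatorSketch(PointF operatedPos, ViewGroup viewContainer,
                                     ChildUpdateListener listener) {
            this.operatedPos = operatedPos;
            this.viewContainer = viewContainer;
            this.listener = listener;
        }

        /** Step 2: register a per-frame callback with the Choreographer. */
        public void start() {
            Choreographer.getInstance().postFrameCallback(this);
        }

        /** Steps 3 and 4: every frame, evaluate the interpolator and push the value to the children. */
        @Override
        public void doFrame(long frameTimeNanos) {
            if (startTimeNanos < 0) {
                startTimeNanos = frameTimeNanos;
            }
            float elapsedMs = (frameTimeNanos - startTimeNanos) / 1_000_000f;
            float fraction = Math.min(elapsedMs / durationMs, 1f);
            float value = interpolator.getInterpolation(fraction);

            for (int i = 0; i < viewContainer.getChildCount(); i++) {
                // A real implementation would skip the operated element and apply the
                // per-element target distance and direction; the child then updates its
                // own position inside the callback (step 4).
                listener.onChildOffset(viewContainer.getChildAt(i), value);
            }

            if (fraction < 1f) {
                Choreographer.getInstance().postFrameCallback(this); // keep driving frames
            }
        }
    }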
  • the electronic device 100 may display a setting area 3310 on the screen 300 for adjusting the “attraction” animation effect of the electronic device 100 .
  • In the setting area 3310, the user can set whether the "attraction" animation effect is "forward", that is, whether the operated UI element "attracts" other UI elements. If the "forward" option of the gravity animation is turned on, a UI element subjected to the gravity animation will first be attracted toward another UI element and then return to its starting position.
  • In the setting area 3310, the user can also set whether the "gravity" animation effect applies to the "delete" operation, and can set the gravity speed (i.e., the propagation speed of the "gravity"), the gravity range, the gravity duration (i.e., the duration of the first movement), the recovery duration (i.e., the duration of the second movement), the amplitude coefficient used to determine the target distance, the positions of the relevant control points, the recovery stiffness (i.e., a parameter used when the displacement-time curve of the second movement is an elastic force curve), the recovery damping (i.e., a parameter used when the displacement-time curve of the second movement is an elastic force curve), and so on.
  • Any other parameters related to the "attraction" animation effect may also be set in the setting area for the "attraction" animation effect provided by the electronic device 100 to the user. That is to say, since various parameters of the "gravity" animation effect can be adjusted, the embodiments of the present disclosure provide a self-adjustment and verification function, allowing the user to view the resulting animation effect and make adjustments.
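  • As a compact illustration of the adjustable parameters listed above, the following Java sketch gathers them into a single hypothetical configuration object. All field names and default values are assumptions; the start-delay helper merely reflects the idea, described elsewhere in the disclosure, that the effect propagates outward from the operated element at the configured gravity speed.

    /**
     * Illustrative holder for the adjustable "gravity" animation parameters described
     * above. Field names and default values are assumptions; the disclosure only states
     * that such parameters can be adjusted in the settings area.
     */
    public class GravityAnimationParams {
        public boolean forward = true;              // whether the operated element attracts the others
        public boolean applyToDelete = true;        // whether the delete operation triggers the effect
        public float propagationSpeedPxPerMs = 3f;  // "gravity" propagation speed
        public float rangePx = 600f;                // gravity range (influence radius)
        public long gravityDurationMs = 350L;       // duration of the first movement
        public long recoveryDurationMs = 500L;      // duration of the second movement
        public float amplitudeCoeff = 1.0f;         // amplitude coefficient used for the target distance
        public float recoveryStiffness = 228f;      // stiffness, when the recovery uses an elastic force curve
        public float recoveryDamping = 30f;         // damping, when the recovery uses an elastic force curve

        /**
         * Hypothetical start delay of an element's first movement, so that the effect
         * visually propagates outward from the operated element at the gravity speed.
         */
        public long startDelayMs(float distancePx) {
            return (long) (distancePx / propagationSpeedPxPerMs);
        }
    }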
  • the object editing method of the embodiment of the present disclosure can be applied to various electronic devices.
  • the electronic device may be, for example, a mobile phone, a tablet computer (Tablet Personal Computer), a digital camera, a personal digital assistant (personal digital assistant, PDA for short), a navigation device, and a mobile Internet Device (MID) , Wearable Devices, and other devices capable of object editing.
  • the object editing solution of the embodiments of the present disclosure can be implemented not only as a function of an input method, but also as a function of an operating system of an electronic device.
  • In the above-mentioned embodiments, the functions may be implemented in whole or in part by software, hardware, firmware, or any combination thereof.
  • When implemented by a software program, they may take the form of a computer program product, in whole or in part.
  • the computer program product includes one or more computer instructions.
  • When the computer program instructions are loaded and executed on a computer, the procedures or functions described in accordance with the embodiments of the present disclosure are produced in whole or in part.
  • the computer may be a general purpose computer, special purpose computer, computer network, or other programmable device.
  • the computer instructions may be stored in or transmitted from one computer readable storage medium to another computer readable storage medium, for example, the computer instructions may be downloaded from a website site, computer, server or data center Transmission to another website site, computer, server, or data center by wire (eg, coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (eg, infrared, wireless, microwave, etc.).
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server, data center, etc. that includes an integration of one or more available media.
  • the available media may be magnetic media (eg, floppy disks, hard disks, magnetic tapes), optical media (eg, DVD), or semiconductor media (eg, Solid State Disk (SSD)), and the like.
  • the various example embodiments of the present disclosure may be implemented in hardware or special purpose circuits, software, logic, or any combination thereof. Certain aspects may be implemented in hardware, while other aspects may be implemented in firmware or software that may be executed by a controller, microprocessor or other computing device. For example, in some embodiments, various examples of the present disclosure (eg, methods, apparatus, or devices) may be implemented in part or in whole on a computer-readable medium.
  • the present disclosure also provides at least one computer program product stored on a non-transitory computer-readable storage medium.
  • The computer program product includes computer-executable instructions, such as program modules, executed in a device on a target physical or virtual processor to perform the example methods or example processes described above, for example the example processes 400, 1400, and 1500 described with respect to FIGS. 4, 14, and 15.
  • program modules may include routines, programs, libraries, objects, classes, components, data structures, etc. that perform particular tasks or implement particular abstract data structures.
  • the functionality of the program modules may be combined or divided among the described program modules.
  • Computer-executable instructions for program modules may be executed in local or distributed devices. In a distributed facility, program modules may be located in both local and remote storage media.
  • Program code for implementing the methods of the present disclosure may be written in one or more programming languages. Such computer program code may be provided to a processor of a general-purpose computer, special-purpose computer, or other programmable data processing apparatus, such that the program code, when executed by the computer or other programmable data processing apparatus, causes the functions/operations specified in the flowcharts and/or block diagrams to be implemented.
  • the program code may execute entirely on the computer, partly on the computer, as a stand-alone software package, partly on the computer and partly on a remote computer or entirely on the remote computer or server.
  • computer program code or related data may be carried by any suitable carrier to enable a device, apparatus or processor to perform the various processes and operations described above. Examples of carriers include signals, computer-readable media, and the like.
  • In the above-mentioned embodiments, the functions may be implemented in whole or in part by software, hardware, firmware, or any combination thereof.
  • When implemented by software, they may be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, all or part of the processes or functions described in the embodiments of the present application are generated.
  • the computer may be a general purpose computer, special purpose computer, computer network, or other programmable device.
  • the computer instructions may be stored in or transmitted from one computer readable storage medium to another computer readable storage medium, for example, the computer instructions may be downloaded from a website site, computer, server or data center Transmission to another website site, computer, server, or data center by wire (eg, coaxial cable, optical fiber, digital subscriber line) or wireless (eg, infrared, wireless, microwave, etc.).
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server, data center, etc. that includes an integration of one or more available media.
  • the usable media may be magnetic media (eg, floppy disks, hard disks, magnetic tapes), optical media (eg, DVDs), or semiconductor media (eg, solid state drives), and the like.
  • The processes may be completed by a computer program instructing the relevant hardware, and the program may be stored in a computer-readable storage medium.
  • The aforementioned storage medium includes media that can store program code, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
  • the computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium.
  • Computer-readable media may include, but are not limited to, electronic, magnetic, optical, electromagnetic, infrared, or semiconductor systems, apparatuses, or devices, or any suitable combination thereof.
  • Machine-readable storage media include an electrical connection with one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination thereof.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present disclosure provide a graphical interface display method, an electronic device, a storage medium, and a program product. In the method, an electronic device displays M user interface (UI) elements on a screen. The electronic device detects an operation acting on a first UI element. In response to the operation, the electronic device causes each of N UI elements on the screen to produce an animation effect of being subjected to an "attractive force" or a "repulsive force". When producing the animation effect, the electronic device determines a target distance by which a second UI element will move in a first direction. The electronic device causes the second UI element to make a first movement from a starting position along the first direction by the target distance. After the first movement, the electronic device causes the second UI element to make a second movement along a second direction opposite to the first direction, so as to reset to the starting position. In this way, the embodiments of the present disclosure present dynamic effects that conform to the laws of nature, are more consistent with the user's life experience, and enhance the vitality and human-centered character of the electronic device.

Description

图形界面显示方法、电子设备、介质以及程序产品 技术领域
本公开总体上涉及信息技术领域,并且更特别地涉及一种图形界面显示方法、电子设备、计算机可读存储介质、以及计算机程序产品。
背景技术
随着信息技术的发展,越来越多的电子设备被配备有各种类型的屏幕。因此,电子设备的屏幕上的用户界面(user interface,UI)或图形界面(graphic interface,GUI)的整体显示效果及风格成为影响用户体验的重要因素。在UI框架的构建中,动画效果已经成为不可分割的一部分。随着智能电话等电子设备的性能提高,电子设备的UI动画效果也随之发展。高刷新率、高渲染度、高复杂度的动画效果逐渐出现。然而,电子设备的屏幕上的UI动画效果还存在进一步改进的空间,以提供更良好的用户体验。
发明内容
本公开的实施例涉及一种用于实现UI元素之间存在“吸引力”或“排斥力”的动画效果的技术方案,并且具体提供了一种图形界面显示方法、电子设备、计算机可读存储介质、以及计算机程序产品。
在本公开的第一方面,提供了一种图形界面显示方法。在该方法中,电子设备在屏幕上显示M个用户界面UI元素,M为大于1的正整数。电子设备检测到作用于M个UI元素中的第一UI元素的操作。响应于对第一UI元素的操作,电子设备使屏幕上的N个UI元素中的每个UI元素产生动画效果,N为1与M-1之间的正整数。在产生动画效果时,电子设备确定N个UI元素中的第二UI元素将在第一方向上移动的目标距离,第一方向是从第二UI元素指向第一UI元素的方向或是从第一UI元素指向第二UI元素的方向。电子设备使第二UI元素从起始位置沿第一方向以目标距离进行第一移动。电子设备在第一移动之后,使第二UI元素沿与第一方向相反的第二方向进行第二移动,以复位到起始位置。以此方式,本公开的实施例实现了UI元素之间具有“引力”的动画效果,展现出符合自然规律的动态效果,与用户生活体验更加一致,加强了电子设备的生命力和人性化程度。在一些实施例中,取决于系统设置或用户设置,或者取决于被操作的第一UI元素的操作持续的时间长度,第二UI元素可以进行多次第一位移和第二位移。也就是说,第二UI元素可以按照循环的方式在第一方向上执行第一移动,在第二方向上执行第二移动,然后再在第一方向上执行第一移动,再在第二方向上执行第二移动,如此循环往复。在一些实施例中,第二UI元素在每次循环中在第一方向上的第一移动中的目标距离可以保持不变或逐渐减小。
在一些实现方式中,为了确定目标距离,电子设备可以确定第二UI元素的尺寸,确定第二UI元素与第一UI元素之间的距离,以及基于尺寸和距离来确定目标距离。以此方式,UI元素受到其他UI元素的“吸引力”或“排斥力”作用的大小可以取决于UI元素本身的大小和两个UI元素之间的距离,从而符合于自然界中的引力大小规律,由此进一步提升用户体验。
在一些实现方式中,为了基于尺寸和距离来确定目标距离,电子设备可以使目标距离随着尺寸增大而增大并且随着距离增大而减小。以此方式,UI元素本身的大小越大,两个UI 元素之间的距离越小,UI元素受到其他UI元素的“吸引力”或“排斥力”作用的大小也就越大,从而符合于自然界中的引力大小规律,由此进一步提升用户体验。
在一些实现方式中,为了确定第二UI元素与第一UI元素之间的距离,电子设备可以确定第一UI元素的第一中心点,确定第二UI元素的第二中心点,以及确定第一中心点与第二中心点之间的直线距离,作为第二UI元素与第一UI元素之间的距离。以此方式,两个UI元素之间的距离可以按照直接明了的方式确定为两个UI元素中心点之间的距离,从而提高电子设备确定UI元素之间距离的确定方式的一致性,简化电子设备的计算过程。
在一些实现方式中,为了确定第二UI元素与第一UI元素之间的距离,电子设备可以确定第一UI元素的第一中心点,确定以第一中心点为圆心的具有各自半径的多个圆,确定第二UI元素与多个圆中的至少一个圆相交,以及将至少一个圆中半径最小的圆的半径确定为第二UI元素与第一UI元素之间的距离。以此方式,电子设备可以更简单和方便地确定UI元素之间的距离,并且使得UI元素之间的距离具有更高的一致性,从而简化基于距离的后续处理和计算过程。
在一些实现方式中,为了确定第二UI元素与第一UI元素之间的距离,电子设备可以确定第一UI元素与第二UI元素之间的横向间距,确定第一UI元素与第二UI元素之间的纵向间距,以及基于横向间距和纵向间距中的至少一者和第一方向来确定第二UI元素与第一UI元素之间的距离。以此方式,电子设备可以基于UI元素之间的间距来确定UI元素之间的距离,从而提高距离确定方式的灵活性和合理性,特别是在UI元素之间的间距基本保持一致的场景中。
在一些实现方式中,电子设备还可以基于第一UI元素的尺寸来确定第一UI元素的影响区域,以及将M个UI元素中在影响区域内的UI元素确定为N个UI元素。以此方式,电子设备可以将UI元素的“引力”影响范围设置为适当的大小,从而可以在保持“引力”动画效果符合自然规律的同时,减少电子设备在实现“引力”动画效果时的计算量,节省计算资源。
在一些实现方式中,电子设备还可以将M个UI元素中除了第一UI元素以外的M-1个UI元素确定为N个UI元素。以此方式,电子设备可以无需设置UI元素的“引力”影响范围,从而可以在保持“引力”动画效果符合自然规律的同时,简化“引力”动画效果的相关设置。
在一些实现方式中,第一移动持续的第一时长、第二移动持续的第二时长、以及第一移动和第二移动持续的总时长中的至少一个可以是可配置的。以此方式,电子设备的用户可以按照偏好来设置“引力”动画效果的时间长短,从而进一步改进用户体验。
在一些实现方式中,第二UI元素在第一移动和第二移动中的至少一者期间的移动的动画效果可以基于位移随时间变化的预定义曲线来确定。以此方式,电子设备可以基于位移随时间变化的预定义曲线来方便地控制UI元素的移动,使得“引力”动画效果更符合用户的使用习惯,从而进一步改进用户体验。
在一些实现方式中,预定义曲线可以为贝塞尔曲线或弹性力曲线。以此方式,电子设备可以基于贝塞尔曲线或弹性力曲线来方便地控制UI元素的移动,使得“引力”动画效果更加符合用户在生活中对于“吸引力”和“排斥力”的习惯认知,从而进一步改进用户体验。
在一些实现方式中,第一移动和第二移动中的至少一个可以包括变加速直线运动。以此方式,电子设备可以基于自然界的物体在引力作用下的加速运动规律,来实现UI元素的第一移动和第二移动,使得“引力”动画效果更加符合自然规律和用户在生活中的习惯认知,从 而进一步改进用户体验。
在一些实现方式中,为了使第二UI元素进行第一移动,电子设备可以确定对第一UI元素的操作被执行的第一时间点,基于预定速度和第二UI元素与第一UI元素之间的距离,确定开始第一移动的第二时间点与第一时间点之间的延迟,基于第一时间点和延迟来确定第二时间点,以及在第二时间点使第二UI元素开始第一移动。以此方式,电子设备的UI可以在视觉上呈现“引力”作用的联动,也即,“吸引力”或“排斥力”造成的移动随着距离进行传播,使得UI的动画效果更符合用户的使用习惯,从而进一步改进用户体验。
在一些实现方式中,对第一UI元素的操作包括使第一UI元素与第二UI元素交换位置,上述的目标距离是第一目标距离,为了产生动画效果,电子设备还可以将第二UI元素从初始位置移动到起始位置,起始位置是第一UI元素的初始位置;在第二UI元素到达起始位置之后并且在第一移动之前,确定第二UI元素将在第三方向上移动的第二目标距离,第三方向是从第二UI元素指向第三UI元素的方向或是从第三UI元素指向第二UI元素的方向;在第一移动之前,使第二UI元素从起始位置沿第三方向以第二目标距离进行第三移动;以及在第三移动之后并且在第一移动之前,使第二UI元素沿与第三方向相反的第四方向进行第四移动,以复位到起始位置。以此方式,尽管第二UI元素没有被直接操作,但是第二UI元素由于需要与第一UI元素交换位置而来到新位置,从而受到其他UI元素的“引力”作用。因此,电子设备可以更加充分和全面地展现出UI元素之间具有“引力”的动画效果,从而进一步提升用户体验。
在一些实现方式中,为了产生动画效果,电子设备还可以在第一移动和第二移动中的至少一者期间缩小或放大第二UI元素的尺寸。以此方式,电子设备可以更加多样化地展现出UI元素之间具有“引力”的动画效果,从而进一步提升用户体验。
在一些实现方式中,第一方向可以从第二UI元素的第二中心点指向第一UI元素的第一中心点、或者可以从第一中心点指向第二中心点。以此方式,电子设备可以准确且一致地确定出两个UI元素之间的“吸引力”或“排斥力”的方向,从而提高实现“引力”动画效果的准确度和效率。
在一些实现方式中,对第一UI元素的操作可以包括以下至少一项:点击、移动、与其他UI元素合并、展开、以及删除。以此方式,电子设备可以在与UI元素相关的几乎所有的操作中实现“引力”动画效果,从而在更多的操作场景中提升用户体验。
在一些实现方式中,第一方面的图形界面显示方法的功能可以通过AAR格式文件、JAR格式文件和电子设备的系统接口中的至少一者来实现。以此方式,“引力”动画效果的能力或功能可以简单和方便地被实现并提供给电子设备的应用程序,例如桌面。
在本公开的第二方面,提供了一种电子设备。电子设备包括处理器以及存储有指令的存储器。指令在被处理器执行时使得电子设备执行根据第一方面及其实现方式的任一方法。
在本公开的第三方面,提供了一种计算机可读存储介质。计算机可读存储介质存储有指令,指令在被电子设备执行时使得电子设备执行第一方面及其实现方式的任一方法。
在本公开的第四方面,提供了一种计算机程序产品。计算机程序产品包括指令,指令在被电子设备执行时使得电子设备执行第一方面及其实现方式的任一方法。
应当理解,发明内容部分中所描述的内容并非旨在限定本公开的关键或重要特征,亦非用于限制本公开的范围。本公开的其他特征通过以下的描述将变得容易理解。
附图说明
通过参考附图阅读下文的详细描述,本公开的实施例的上述以及其他目的、特征和优点将变得容易理解。在附图中,以示例性而非限制性的方式示出了本公开的若干实施例。
图1示出了可以实现本公开的实施例的一种电子设备的硬件结构的示意图。
图2示出了根据本公开的实施例的图形界面显示方法的示例处理过程的流程图。
图3A至图3J示出了根据本公开的实施例的在UI元素被点击的场景中所产生的“引力”动画效果的示意图。
图4A和图4B示出了根据本公开的实施例的“引力”动画效果中的UI元素的第一移动的第一方向和第二移动的第二方向的示意图。
图5示出了根据本公开的实施例的“引力”动画效果中的受到“吸引力”影响的UI元素在进行第一移动和第二移动的过程中在不同时刻的位置的示意图。
图6示出了根据本公开的实施例的“引力”动画效果的动画过程和相关控制逻辑的示意图。
图7A示出了根据本公开的实施例的UI元素的位移随时间变化的预定义曲线为贝塞尔曲线的示意图。
图7B示出了根据本公开的实施例的UI元素的位移随时间变化的预定义曲线为反比例曲线的示意图。
图7C示出了根据本公开的实施例的UI元素的位移随时间变化的预定义曲线为临界阻尼弹性力曲线的示意图。
图7D示出了根据本公开的实施例的UI元素的位移随时间变化的预定义曲线为欠阻尼弹性力曲线的示意图。
图7E至图7H示出了根据本公开的实施例的受到“引力”影响的三个UI元素的不同位移时间变化曲线的比较的示意图。
图8示出了根据本公开的实施例的用于确定受到第一UI元素的“吸引力”或“排斥力”影响的第二UI元素进行第一移动的目标距离的示例处理过程的流程图。
图9示出了根据本公开的实施例的确定受到第一UI元素的“吸引力”或“排斥力”影响的第二UI元素的尺寸的示意图。
图10A和图10B分别示出了根据本公开的实施例的确定UI元素之间的距离的两种示例方式的示意图。
图11示出了根据本公开的实施例的基于中心点来确定第一UI元素与第二UI元素之间的距离的示例处理过程的流程图。
图12示出了根据本公开的实施例的基于中心点来确定第一UI元素与第二UI元素之间的距离的示意图。
图13示出了根据本公开的实施例的基于半径来确定第一UI元素与第二UI元素之间的距离的示例处理过程的流程图。
图14示出了根据本公开的实施例的基于半径来确定第一UI元素与第二UI元素之间的距离的示意图。
图15A和图15B示出了根据本公开的实施例的在基于半径确定UI元素之间的距离的情 况下UI元素之间的整体传导方式的示意图。
图16示出了根据本公开的实施例的基于间距来确定第一UI元素与第二UI元素之间的距离的示例处理过程的流程图。
图17A至图17F示出了根据本公开的实施例的基于间距来确定第一UI元素与第二UI元素之间的距离的示意图。
图18A至图18C示出了根据本公开的实施例的在UI元素具有有限的“引力”范围的场景中所产生的“引力”动画效果的示意图。
图19A示出了根据本公开的实施例的基于“引力”传播速度来确定UI元素的“引力”动画效果开始的时间点的示例处理过程的流程图。
图19B至图19E示出了根据本公开的实施例的在考虑到“引力”传播延迟的情况下受到“引力”影响的三个UI元素的不同位移时间变化曲线的比较的示意图。
图20A至图20D示出了根据本公开的实施例在UI元素被移动并且与另一UI元素交换位置的场景中所产生的“引力”动画效果的示意图。
图21示出了根据本公开的实施例的在UI元素交换位置的场景中,先到达新位置的UI元素受到其他UI元素的“引力”作用而产生“引力”动画效果的示例处理过程的流程图。
图22A至图22D示出了根据本公开的实施例的在UI元素交换位置的场景中,先到达新位置的UI元素受到其他UI元素的“引力”作用而产生“引力”动画效果的示意图。
图23A至图23D示出了根据本公开的实施例在UI元素被移动并且与另一UI元素合并的场景中所产生的“引力”动画效果的示意图。
图24A至图24D示出了根据本公开的实施例在UI元素被删除的场景中所产生的“引力”动画效果的示意图。
图25A至图25D示出了根据本公开的实施例在UI元素被展开的场景中所产生的“引力”动画效果的示意图。
图26示出了根据本公开的实施例的“引力”动画效果相关联的UI框架动效与系统桌面之间的关系的示意图。
图27示出了本公开的实施例的“引力”动画效果能力或功能可以被应用到的其他应用场景的示意图。
图28示出了根据本公开的实施例的用于实现“引力”动画效果能力或功能的系统框架的示意图。
图29示出了根据本公开的实施例的“引力”动画效果能力或功能所涉及到的应用侧和UI框架侧之间的关系的示意图。
图30示出了根据本公开的实施例的“引力”动画效果能力或功能实现的三种方式的具体说明的示意图。
图31示出了根据本公开的实施例的用于实现“引力”动画效果的动效能力侧的类图关系的示意图。
图32示出了根据本公开的实施例的应用侧和动效能力侧用于实现“引力”动画效果的操作时序图。
图33示出了根据本公开的实施例的用于调整“引力”动画效果的参数的界面的示意图。
贯穿所有附图,相同或者相似的参考标号被用来表示相同或者相似的组件。
具体实施方式
下文将参考附图中示出的若干示例性实施例来描述本公开的原理和精神。应当理解,描述这些具体的实施例仅是为了使本领域的技术人员能够更好地理解并实现本公开,而并非以任何方式限制本公开的范围。在以下描述和权利要求中,除非另有定义,否则本文中使用的所有技术和科学术语具有与所属领域的普通技术人员通常所理解的含义。
如本文所使用的,术语“包括”及其类似用语应当理解为开放性包含,即“包括但不限于”。术语“基于”应当理解为“至少部分地基于”。术语“一个实施例”或“该实施例”应当理解为“至少一个实施例”。术语“第一”、“第二”等等可以指代不同的或相同的对象,并且仅用于区分所指代的对象,而不暗示所指代的对象的特定空间顺序、时间顺序、重要性顺序,等等。在一些实施例中,取值、过程、所选择的项目、所确定的项目、设备、装置、手段、部件、组件等被称为“最佳”、“最低”、“最高”、“最小”、“最大”,等等。应当理解,这样的描述旨在指示可以在许多可使用的功能选择中进行选择,并且这样的选择不需要在另外的方面或所有方面比其他选择更好、更低、更高、更小、更大或者以其他方式优选。如本文所使用的,术语“确定”可以涵盖各种各样的动作。例如,“确定”可以包括运算、计算、处理、导出、调查、查找(例如,在表格、数据库或另一数据结构中查找)、查明等。此外,“确定”可以包括接收(例如,接收信息)、访问(例如,访问存储器中的数据)等。再者,“确定”可以包括解析、选择、选取、建立等。
在本文中使用的术语“UI”表示用户与应用程序或操作系统进行交互和信息交换的接口,它实现信息的内部形式与用户可以接受形式之间的转换。例如,应用程序的UI是通过java、可扩展标记语言(extensible markup language,XML)等特定计算机语言编写的源代码,UI源代码在电子设备上经过解析,渲染,最终呈现为用户可以识别的内容,比如图片、文字、按钮等UI元素。
在某些实施例中,UI中的UI元素的属性和内容是通过标签或者节点来定义的,比如XML通过<Textview>、<ImgView>、<VideoView>等节点来规定UI所包含的UI元素。一个节点对应UI中一个UI元素或属性,节点经过解析和渲染之后呈现为用户可视的内容。此外,很多应用程序,比如混合应用(hybrid application)的UI中通常还包含有网页。网页可以理解为内嵌在应用程序UI中的一个特殊的UI元素,网页是通过特定计算机语言编写的源代码,例如超文本标记语言(hyper text markup language,HTML),层叠样式表(cascading style sheets,CSS),java脚本(JavaScript,JS)等,网页源代码可以由浏览器或与浏览器功能类似的网页显示组件加载和显示为用户可识别的内容。网页所包含的具体内容也是通过网页源代码中的标签或者节点来定义的,比如HTML通过<p>、<img>、<video>、<canvas>来定义网页的元素和属性。在本文中使用的术语“UI元素”包括但不限于:窗口(window)、滚动条(scrollbar)、表格视图(tableview)、按钮(button)、菜单栏(menu bar)、文本框(text box)、导航栏、工具栏(toolbar)、图像(image)、静态文本(tatictext)、部件(Widget)等可视的UI元素。
在一些实施例中,UI元素还可以包括控件(control)。控件可以是对数据和方法的封装,控件可以有自己的属性和方法,属性是控件数据的简单访问者,方法则是控件的一些简单可见的功能。控件是用户界面的基本元素。例如,控件的类型可以包括但不限于:用户界面控件(用于开发构建用户界面的控件,如针对视窗、文本框、按钮、下拉式菜单等界面元素的控件)、图表控件(用于开发图表的控件,可以实现数据可视化等)、报表控件(用于开发报表的控件,实现报表的浏览查看、设计、编辑、打印等功能)、表格控件(用于开发表格(CELL)的控件,实现网格中数据处理和操作的功能)等。本申请实施例中控件的类型还可以包括:复合控件(将现有的各种控件组合起来,形成一个新的控件,集中多种控件的性能)、扩展控件(根据现有控件派生出一个新的控件,为现有控件增加新的性能或者更改现有控件的性能)、自定义控件等。
在一些实施例中,UI元素还可以包括页面模块。根据页面中控件的布局和属性,可以将页面划分为多个连续的页面模块。一个页面模块可以承载图片、文本、操作按钮、链接、动画、声音、视频等中的一或多种信息类型。一个页面模块可以呈现为一或多个控件的集合,也可以呈现为一张卡片,也可以呈现为卡片以及其他控件的集合。例如,页面模块可以呈现为主界面上的一个图标、图库中一张图片、负一屏中的一张卡片等等。本申请实施例中,不同页面模块可以有重叠,也可以没有重叠。本申请实施例中,页面模块也可以简称为模块。其中,卡片可以提供一种比应用程序(application,APP)更细粒度的服务能力,以可交互的卡片形式直接将用户最关心的服务或内容展示给用户,卡片可以嵌入各种APP或交互场景中,更好的满足用户需求。将一个应用的图片、文本、操作按钮、链接等多种元素整合到一张卡片,该卡片可以关联该应用的一个或者多个用户界面,用户通过在卡片上执行操作(例如点击操作),可以实现显示界面跳转至对应应用的用户界面。采用卡片式的布局,可以对不同内容区分显示,使得显示界面内容的呈现更加直观,也使得用户可以更容易更准确地针对不同内容进行操作。
在本公开的实施例中描述的一些流程中,包含了按照特定顺序出现的多个操作或步骤,但是应该理解,这些操作或步骤可以不按照其在本公开的实施例中出现的顺序来执行或并行执行,操作的序号仅用于区分开各个不同的操作,序号本身不代表任何的执行顺序。另外,这些流程可以包括更多或更少的操作,并且这些操作或步骤可以按顺序执行或并行执行,并且这些操作或步骤可以进行组合。
在诸如安卓(Android)和iOS的移动操作系统中,动画本质上是基于刷新率实时显示用户界面UI或UI元素。由于人类的视觉暂留原理,使得用户感觉画面是运动的。动画从动画的初态在经过动画时间之后变换为动画的终态。在这个变换过程中,动画可以由动画类型和动画变换形式进行控制。例如,动画类型可以包括位移动画、旋转动画、缩放动画和透明动画等。而动画变换形式可以由插值器和估值器等控制器进行控制。这样的控制器可以用于在动画时间期间控制对动画进行变换的速度。
然而,传统上,动画仅仅是简单的动画效果的组合,使得动画效果单一,不符合物理规律,并且没有考虑真实使用场景和用户使用习惯等。为此,本公开的实施例提出了一种图形界面显示的新方案。本公开的实施例涉及新型的动效实现方案,提出了引力动效的设计与实现。主要是基于人因研究,仿真自然界的引力效果,实现引力动效。本公开的实施例是引力场的理论在UI框架的动效领域的首次使用,构建了引力的特征动效。引力动效作为新颖的特征特效,包含了空间、平衡、捕获、扩散、汇聚等子特征。本公开的实施例主要就是针对引力场的效果,构建引力动效的能力。在不同的控件、图标、页面之间,加强了彼此之间的联系,突出了各个独立个体之间的关系,加强用户体验。自然界的引力场理论在动效领域的完美呈现,进一步证明了人因理论研究的重要性,也使得有屏幕的终端设备的展现出符合自然 规律的动态效果。用户在使用设备的过程中,也更加符合生活体验,加强了设备的生命力和人性化。下文将参考附图来描述本公开的一些说明性实施例。
图1示出了可以实施本公开的实施例的一种电子设备100的硬件结构的示意图。如图1所示,电子设备100可以包括处理器110、外部存储器接口120、内部存储器121、通用串行总线(universal serial bus,USB)接口130、充电管理模块140、电源管理模块141、电池142、天线1、天线2、移动通信模块150、无线通信模块160、音频模块170、扬声器170A、受话器170B、麦克风170C、耳机接口170D、传感器模块180、按键190、马达191、指示器192、摄像头193、显示屏194、以及用户标识模块(subscriber identification module,SIM)卡接口195等。传感器模块180可以包括压力传感器180A、陀螺仪传感器180B、气压传感器180C、磁传感器180D、加速度传感器180E、距离传感器180F、接近光传感器180G、指纹传感器180H、温度传感器180J、触摸传感器180K、环境光传感器180L、骨传导传感器180M等。
应当理解,本公开的实施例所示意的结构并不构成对电子设备100的具体限定。在本公开的另一些实施例中,电子设备100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP)、调制解调处理器、图形处理器(graphics processing unit,GPU)、图像信号处理器(image signal processor,ISP)、控制器、视频编解码器、数字信号处理器(digital signal processor,DSP)、基带处理器、和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口、集成电路内置音频(inter-integrated circuit sound,I2S)接口、脉冲编码调制(pulse code modulation,PCM)接口、通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口、移动产业处理器接口(mobile industry processor interface,MIPI)、通用输入输出(general-purpose input/output,GPIO)接口、用户标识模块(subscriber identity module,SIM)接口、和/或通用串行总线(universal serial bus,USB)接口等。
I2C接口是一种双向同步串行总线,包括一根串行数据线(serial data line,SDA)和一根串行时钟线(derail clock line,SCL)。在一些实施例中,处理器110可以包含多组I2C总线。处理器110可以通过不同的I2C总线接口分别耦合触摸传感器180K、充电器、闪光灯、摄像头193等。例如,处理器110可以通过I2C接口耦合触摸传感器180K,使处理器110与触摸传感器180K通过I2C总线接口通信,实现电子设备100的触摸功能。
I2S接口可以用于音频通信。在一些实施例中,处理器110可以包含多组I2S总线。处理器110可以通过I2S总线与音频模块170耦合,实现处理器110与音频模块170之间的通信。 在一些实施例中,音频模块170可以通过I2S接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。
PCM接口也可以用于音频通信,将模拟信号抽样、量化和编码。在一些实施例中,音频模块170与无线通信模块160可以通过PCM总线接口耦合。在一些实施例中,音频模块170也可以通过PCM接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。所述I2S接口和所述PCM接口都可以用于音频通信。
UART接口是一种通用串行数据总线,用于异步通信。该总线可以为双向通信总线。它将要传输的数据在串行通信与并行通信之间转换。在一些实施例中,UART接口通常被用于连接处理器110与无线通信模块160。例如,处理器110通过UART接口与无线通信模块160中的蓝牙模块通信,实现蓝牙功能。在一些实施例中,音频模块170可以通过UART接口向无线通信模块160传递音频信号,实现通过蓝牙耳机播放音乐的功能。
MIPI接口可以被用于连接处理器110与显示屏194、摄像头193等外围器件。MIPI接口包括摄像头串行接口(camera serial interface,CSI)、显示屏串行接口(display serial interface,DSI)等。在一些实施例中,处理器110和摄像头193通过CSI接口通信,实现电子设备100的拍摄功能。处理器110和显示屏194通过DSI接口通信,实现电子设备100的显示功能。
GPIO接口可以通过软件配置。GPIO接口可以被配置为控制信号,也可被配置为数据信号。在一些实施例中,GPIO接口可以用于连接处理器110与摄像头193、显示屏194、无线通信模块160、音频模块170、传感器模块180等。GPIO接口还可以被配置为I2C接口、I2S接口、UART接口、MIPI接口等。
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口、Micro USB接口、USB Type C接口等。USB接口130可以用于连接充电器为电子设备100充电,也可以用于电子设备100与外围设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接其他电子设备,例如AR设备等。
可以理解的是,本公开的实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备100的结构限定。在本公开的另一些实施例中,电子设备100也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
充电管理模块140用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。在一些有线充电的实施例中,充电管理模块140可以通过USB接口130接收有线充电器的充电输入。在一些无线充电的实施例中,充电管理模块140可以通过电子设备100的无线充电线圈接收无线充电输入。充电管理模块140为电池142充电的同时,还可以通过电源管理模块141为电子设备100供电。
电源管理模块141用于连接电池142、充电管理模块140与处理器110。电源管理模块141接收电池142和/或充电管理模块140的输入,为处理器110、内部存储器121、显示屏194、摄像头193、和无线通信模块160等供电。电源管理模块141还可以用于监测电池容量,电池循环次数,电池健康状态(漏电,阻抗)等参数。在其他一些实施例中,电源管理模块141也可以设置于处理器110中。在另一些实施例中,电源管理模块141和充电管理模块140也可以设置于同一个器件中。
电子设备100的无线通信功能可以通过天线1、天线2、移动通信模块150、无线通信模块160、调制解调处理器以及基带处理器等实现。天线1和天线2用于发射和接收电磁波信 号。电子设备100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如,可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在电子设备100上的包括2G/3G/4G/5G/6G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器、开关、功率放大器、低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器170A、受话器170B等)输出声音信号,或通过显示屏194显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器110,与移动通信模块150或其他功能模块设置在同一个器件中。
无线通信模块160可以提供应用在电子设备100上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络)、蓝牙(Bluetooth,BT)、全球导航卫星系统(global navigation satellite system,GNSS)、调频(frequency modulation,FM)、近距离无线通信技术(near field communication,NFC)、红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,电子设备100的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得电子设备100可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM)、通用分组无线服务(general packet radio service,GPRS)、码分多址接入(code division multiple access,CDMA)、宽带码分多址(wideband code division multiple access,WCDMA)、时分码分多址(time-division codedivision multiple access,TD-SCDMA)、长期演进(long term evolution,LTE)、5G以及后续演进标准、BT、GNSS、WLAN、NFC、FM、和/或IR技术等。其中GNSS可以包括全球卫星定位系统(global positioning system,GPS)、全球导航卫星系统(global navigation satellite system,GLONASS)、北斗卫星导航系统(beidou navigation satellite system,BDS)、准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
电子设备100通过GPU、显示屏194、以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可以包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。在一些实施例中,电子设备100可以包括1个或N个显示屏194,N为大于1的正整数。
电子设备100可以通过ISP、摄像头193、视频编解码器、GPU、显示屏194以及应用处理器等实现拍摄功能。ISP用于处理摄像头193反馈的数据。例如,在拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,电子设备100可以包括1个或N个摄像头193,N为大于1的正整数。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当电子设备100在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。电子设备100可以支持一种或多种视频编解码器。这样,电子设备100可以播放或录制多种编码格式的视频,例如,动态图像专家组(moving picture experts group,MPEG)1、MPEG2、MPEG3、MPEG4等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备100的智能认知等应用,例如图像识别、人脸识别、语音识别、文本理解等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备100的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐、视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储电子设备100使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。处理器110通过运行存储在内部存储器121的指令、和/或存储在设置于处理器中的存储器的指令,执行电子设备100的各种功能应用以及数据处理。
电子设备100可以通过音频模块170、扬声器170A、受话器170B、麦克风170C、耳机接口170D、以及应用处理器等实现音频功能。例如音乐播放,录音等。
SIM卡接口195用于连接SIM卡。SIM卡可以通过插入SIM卡接口195,或从SIM卡接口195拔出,实现和电子设备100的接触和分离。电子设备100可以支持1个或N个SIM卡接口,N为大于1的正整数。SIM卡接口195可以支持Nano SIM卡、Micro SIM卡、SIM卡 等。同一个SIM卡接口195可以同时插入多张卡。所述多张卡的类型可以相同,也可以不同。SIM卡接口195也可以兼容不同类型的SIM卡。SIM卡接口195也可以兼容外部存储卡。电子设备100通过SIM卡和网络交互,实现通话以及数据通信等功能。在一些实施例中,电子设备100采用eSIM,即嵌入式SIM卡。eSIM卡可以嵌在电子设备100中,不能和电子设备100分离。
电子设备100的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本公开的实施例以分层架构的一种移动操作系统为例,示例性说明电子设备100的软件结构。
图2示出了根据本公开的实施例的图形界面显示方法的示例处理过程200的流程图。在一些实施例中,处理过程200可以由电子设备100来实现,例如可以由电子设备100的处理器110或处理单元配合其他组件(例如,显示屏194)来实现。在其他实施例中,处理过程200也可以由具有屏幕以显示UI元素的其他设备来实现。为了便于说明,将以电子设备100执行处理过程200为例,并且参考图3A至图3J、图4A至图4B和图5来论述处理过程200,其中图3A至图3J示出了根据本公开的实施例的在UI元素被点击的场景中所产生的“引力”动画效果的示意图。
同时参考图2和图3A,在图2的框210处,电子设备100在其屏幕300上显示M个用户界面UI元素,M为大于1的正整数。需要说明的是,屏幕300可以是图1中描绘的显示屏194的示例。在图3A的示例中,屏幕300上显示了以6行4列的方式排列的M=24个UI元素,其中第一行包括UI元素311至314,第二行包括UI元素321至324,第三行包括UI元素331至334,第四行包括UI元素341至344,第五行包括UI元素351至354,并且第六行包括UI元素361至364。需要说明的是,尽管图3A的示例中示出了以规律方式排列的特定数目的UI元素,但是本公开的实施例不限于此,而是等同地适用于以任何规律的或非规律的方式布置的任何数目的UI元素的场景。此外,需要注意的是,尽管图3A的示例中将M个UI元素示出为尺寸基本相同,但是本公开的实施例不限于此,而是等同地适用于M个UI元素中的一个或多个UI元素具有不同尺寸的场景。
同时参考图2和图3B,在图2的框220处,电子设备100检测到作用于M个UI元素中的第一UI元素的操作。例如,在图3B的示例中,电子设备100可以检测到作用于24个UI元素311至364中的UI元素343的操作。在本公开的实施例中,为了描述的便利,被操作的UI元素也可以称为“第一UI元素”。因此,在图3B的示例中,被操作的UI元素343也可以称为第一UI元素343。具体地,如图3B所描绘的,电子设备100的用户可以使用手部370来点击UI元素343,例如以启动UI元素343所对应的应用程序。在下文中,将以点击操作作为针对UI元素的操作的示例来描述本公开的实施例的“引力”动画效果。然而,应当理解,本公开的实施例不限于点击操作,而是可以等同地或类似地适用于与UI元素有关的任何其他操作,诸如移动UI元素的操作、将UI元素与其他UI元素合并的操作、展开UI元素的操作、以及删除UI元素的操作,等等。与这些操作有关的根据本公开的实施例的“引力”动画效果将在后文参考图20A至图20D、图22A至图22D、图23A至图23D、图24A至图24D、图25A至图25D进一步被描述。
在图2的框230处,响应于对UI元素343的操作,电子设备100使屏幕300上的N个UI元素中的每个UI元素产生“引力”动画效果,也即受到UI元素343的“吸引力”或“排 斥力”而产生移动的动画效果,其中N为1与M-1之间的正整数。也就是说,至少有一个UI元素可以受到UI元素343的“吸引力”或“排斥力”,而至多有M-1个UI元素可以受到UI元素343的“吸引力”或“排斥力”。换言之,在一些实施例中,除了被操作的UI元素343本身之外,屏幕300上的所有其他的UI元素都可以受到UI元素343的影响而产生“引力”动画效果。在这种情况下,电子设备100可以将M个UI元素中除了UI元素343以外的M-1个UI元素确定为将会产生“引力”动画效果的N个UI元素。以此方式,电子设备100可以无需专门设置UI元素343的“引力”影响范围,从而可以在保持“引力”动画效果符合自然规律的同时,简化“引力”动画效果的相关设置。在其他实施例中,电子设备100还可以基于被操作的UI元素343的“引力”影响区域来确定需要产生动画效果的N个UI元素。后文将参考图18A至图18C来描述这样的实施例。
在本公开的实施例中,被用户操作的UI元素可以被认为其“引力”平衡状态被用户的操作“打破”,从而将会对其他UI元素产生“吸引力”或“排斥力”,或者将会受到其他UI元素的“吸引力”或“排斥力”。在一些实施例中,被操作的UI元素对其他UI元素的“引力”作用表现为“吸引力”还是“排斥力”可以是预先设定的,也可以是可设置的。类似地,被操作的UI元素受到其他UI元素的“引力”作用表现为“吸引力”还是“排斥力”也可以是预先设定的,或者可以是可设置的。在被操作的UI元素的“引力”作用表现为“吸引力”的情况下,受到“引力”作用影响的其他UI元素将会首先从起始位置朝向被操作的UI元素进行移动,然后再在相反的方向上远离被操作的UI元素进行移动,从而回到起始位置。在被操作的UI元素的“引力”作用表现为“排斥力”的情况下,受到“引力”作用影响的其他UI元素将会首先从起始位置远离被操作的UI元素进行移动,然后再在相反的方向上朝向被操作的UI元素进行移动,从而回到起始位置。在本公开的上下文中,受到“引力”作用影响的UI元素首先进行的朝向或远离被操作的UI元素的位移将称为“第一位移”或“第一移动”,而受到“引力”作用影响的UI元素随后进行的返回到起始位置的位移将被称为“第二位移”或“第二移动”。此外,“第一位移”或“第一移动”的方向将被称为“第一方向”,而“第二位移”或“第二移动”的方向将被称为“第二方向”。下文将首先参考图3B至图3J,以UI元素343被操作并且对其他UI元素产生“吸引力”为例,来描述本公开的实施例的“引力”动画效果的一种示例。然后,将参考图5来详细描述一个UI元素受到被操作的UI元素的“吸引力”的影响而产生引力动画效果的细节。
如图3C所示,为了实现本公开的实施例的“引力”动画效果,在电子设备100检测到用户对UI元素343的点击操作之后,假定UI元素343对其他UI元素的“引力”作用被设置为“吸引力”,那么电子设备100可以首先使屏幕300上的N个UI元素(在图3C的示例中是24-1=23个)沿着指向UI元素343的方向(也即第一方向)进行移动。例如,在图3C的示例中,如虚线箭头所指示的,UI元素311可以沿着指向UI元素343的方向311-d1移动,UI元素312可以沿着指向UI元素343的方向312-d1移动,UI元素313可以沿着指向UI元素343的方向313-d1移动,UI元素314可以沿着指向UI元素343的方向314-d1移动,UI元素321可以沿着指向UI元素343的方向321-d1移动,UI元素322可以沿着指向UI元素343的方向322-d1移动,UI元素323可以沿着指向UI元素343的方向323-d1移动,UI元素324可以沿着指向UI元素343的方向324-d1移动。
类似地,如虚线箭头所指示的,UI元素331可以沿着指向UI元素343的方向331-d1移 动,UI元素332可以沿着指向UI元素343的方向332-d1移动,UI元素333可以沿着指向UI元素343的方向333-d1移动,UI元素334可以沿着指向UI元素343的方向334-d1移动,UI元素341可以沿着指向UI元素343的方向341-d1移动,UI元素342可以沿着指向UI元素343的方向342-d1移动,UI元素344可以沿着指向UI元素343的方向344-d1移动。类似地,如虚线箭头所指示的,UI元素351可以沿着指向UI元素343的方向351-d1移动,UI元素352可以沿着指向UI元素343的方向352-d1移动,UI元素353可以沿着指向UI元素343的方向353-d1移动,UI元素354可以沿着指向UI元素343的方向354-d1移动,UI元素361可以沿着指向UI元素343的方向361-d1移动,UI元素362可以沿着指向UI元素343的方向362-d1移动,UI元素363可以沿着指向UI元素343的方向363-d1移动,并且UI元素364可以沿着指向UI元素343的方向364-d1移动。
在一些实施例中,某个UI元素指向UI元素343的方向可以是指该UI元素上的任何一点指向UI元素343上的任何一点的方向。例如,在图3C的示例中,UI元素344指向UI元素343的方向344-d1可以是指UI元素344上的任何一点指向UI元素343的方向。以此方式,电子设备100可以仅需要确定两个UI元素之间的大概方向,从而可以简化电子设备100在确定“引力”作用方向时的操作。在其他实施例中,某个UI元素指向UI元素343的方向可以是指该UI元素上的中心点指向UI元素343的中心点的方向。也就是说,在UI元素受到被操作的UI元素的“吸引力”或“排斥力”的情况下,产生的第一移动的第一方向从受影响的UI元素的中心点指向被操作的UI元素的中心点,或者从被操作的UI元素的中心点指向受影响的UI元素的中心点。例如,在图3C的示例中,UI元素344指向UI元素343的方向344-d1可以是指UI元素344的中心点指向UI元素343的中心点的方向。以此方式,电子设备100可以准确且一致地确定出两个UI元素之间的“吸引力”或“排斥力”的方向,从而提高实现“引力”动画效果的准确度和效率。下文参考图4A和图4B来进一步描述这样的实施例。
图4A和图4B示出了根据本公开的实施例的“引力”动画效果中的UI元素的第一移动的第一方向和第二移动的第二方向的示意图。图4A示出了被操作的UI元素的“引力”作用为“吸引力”的示例场景。如图4A所示,右下方的十字图形示意性地表示被操作的UI元素的放大的中心点,其在本文中也可以称为发生中心点410。在UI元素是应用图标的场景中,当针对应用图标的删除、拖拽释放、合并文件夹、卡片展开等事件发生时,该UI元素的中心点就是吸引力或者排斥力的发生中心点。另外,在图4A中,左上方的十字图形示意性地表示受到被操作的UI元素的引力影响的另一UI元素的放大的中心点,其在本文中也可以称为元素中心点420。在“引力”动画效果被设置为“吸引力”的情况下,UI元素被“吸引”的方向,也即产生第一位移的第一方向将从元素中心点420指向发生中心点410。也就是说,在“吸引力”的作用下,UI元素的第一移动的第一方向为每个受到影响的UI元素的中心点指向被操作的UI元素的中心点的矢量方向。类似地,图4B示出了被操作的UI元素的“引力”作用为“排斥力”的示例场景。如图4B所示,右下方的十字图形示意性地表示发生中心点410,而左上方的十字图形示意性地表示元素中心点420。在“引力”动画效果被设置为“排斥力”的情况下,UI元素被“排斥”的方向,也即产生第一位移的第一方向将从发生中心点410指向元素中心点420。也就是说,在“排斥力”的作用下,UI元素的第一移动的第一方向为被操作的UI元素的中心点指向每个受到影响的UI元素的中心点的矢量方向。
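需要说明的是,以下给出的代码片段仅是对上述第一方向确定方式的一种示意性草图,其中的类名与方法名(例如Point、firstMoveDirection)均为便于说明而假设的名称,并非对本公开实现方式的限定:
class Point {
    final float x;
    final float y;
    Point(float x, float y) { this.x = x; this.y = y; }
}
// 计算受影响UI元素的第一移动的第一方向(单位向量)。
// attract 为 true 表示"吸引力":方向从受影响元素的中心点指向被操作元素的中心点;
// attract 为 false 表示"排斥力":方向相反。
static float[] firstMoveDirection(Point operatedCenter, Point affectedCenter, boolean attract) {
    float dx = operatedCenter.x - affectedCenter.x;
    float dy = operatedCenter.y - affectedCenter.y;
    float len = (float) Math.hypot(dx, dy);
    if (len == 0f) {
        return new float[] {0f, 0f};   // 两个中心点重合时不产生位移
    }
    float sign = attract ? 1f : -1f;
    return new float[] {sign * dx / len, sign * dy / len};
}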
如图3D所示,UI元素311可以具有中心点311-o,UI元素312可以具有中心点312-o, UI元素313可以具有中心点313-o,UI元素314可以具有中心点314-o,UI元素321可以具有中心点321-o,UI元素322可以具有中心点322-o,UI元素323可以具有中心点323-o,并且UI元素324可以具有中心点324-o。类似地,UI元素331可以具有中心点331-o,UI元素332可以具有中心点332-o,UI元素333可以具有中心点333-o,UI元素334可以具有中心点334-o,UI元素341可以具有中心点341-o,UI元素342可以具有中心点342-o,UI元素343可以具有中心点343-o,并且UI元素344可以具有中心点344-o。类似地,UI元素351可以具有中心点351-o,UI元素352可以具有中心点352-o,UI元素353可以具有中心点353-o,UI元素354可以具有中心点354-o,UI元素361可以具有中心点361-o,UI元素362可以具有中心点362-o,UI元素363可以具有中心点363-o,并且UI元素364可以具有中心点364-o。因此,在图3D的示例中,受“引力”作用影响的UI元素344指向被操作的UI元素343的方向344-d1可以是指从UI元素344的中心点344-o指向UI元素343的中心点343-o的方向344-d1,也即UI元素344的第一移动的第一方向将是方向344-d1。也就是说,假设UI元素343的“引力”作用表现为“吸引力”,那么在UI元素343被操作之后,UI元素344可以首先在朝向UI元素343的第一方向344-d1上进行第一移动,然后将在相反的方向上进行第二移动以返回起始位置。同样地,屏幕300上的其他UI元素被UI元素343“吸引”所进行的第一移动的第一方向和第二移动的第二方向也可以类似地确定。
在图3E中,使用十字符号表示每个UI元素的中心点的当前位置,也即受影响的UI元素产生引力动画效果之后的中心点位置,并且使用小黑点来表示每个UI元素在产生引力动画效果之前的中心点的起始位置。例如,为了示意的清楚性,图3E中仅标记出了UI元素344的中心点的当前位置344-o和中心点的起始位置344-1。如图3E所示,由于受到被操作的UI元素343的“吸引力”影响,UI元素344的中心点已经沿着指向UI元素343的中心点343-o的第一方向344-d1进行第一移动,也即从中心点的起始位置344-1移动到中心点的当前位置344-o。类似地,如图3E所示,每个其他受到UI元素343影响的UI元素的中心点均沿着指向UI元素343的中心点343-o的各自的第一方向进行了各自的第一移动。需要说明的是,在完成了各自的第一移动之后,受到“引力”影响的各个UI元素将会在与第一方向相反的第二方向上返回到起始位置。例如,在图3E的示例中,UI元素344在完成了第一方向344-d1上的第一移动之后,将在第一方向344-d1相反的第二方向上回到起始位置。如图3F进一步示出的,在图3F中示出的时刻,UI元素344已经完成了第二移动并且回到起始位置,从而表示UI元素344中心点当前位置的十字符号与表示中心点初始位置的小黑点重合。类似地,每个其他受到UI元素343的“引力”影响的UI元素也均完成了各自的第二位移而返回到各自的初始位置。需要说明的是,尽管图3E和图3F的示例中描绘了受到“引力”影响的UI元素进行一次第一位移和一次第二位移,但是本公开的实施例不限于此。在其他实施例中,取决于系统设置或用户设置,或者取决于被操作的UI元素的操作持续的时间长度,受到“引力”影响的UI元素可以进行多次第一位移和第二位移。也就是说,受到“引力”影响的UI元素可以按照循环的方式在第一方向上执行第一移动,在第二方向上执行第二移动,然后再在第一方向上执行第一移动,再在第二方向上执行第二移动,如此循环往复。在一些实施例中,受到“引力”影响的UI元素在每次循环中在第一方向上的第一移动中的目标距离可以保持不变或逐渐减小。
在上文参考图3A至图3F描述的示例中,屏幕300上显示的是规则排列的尺寸相同的多 个UI元素。应当理解的是,本公开的实施例提出的“吸引力”或“排斥力”的动画效果不限于规则排列的大小相同的多个UI元素,而是等同地或类似地适用于以任何方式排列的具有不同大小的多个UI元素。下文参考图3G至图3J来描述这样的示例。如图3G所示,电子设备100在屏幕300上显示M个UI元素,例如,在负一屏中显示的各种UI元素。在图3G的示例中,M=13,也即UI元素381至UI元素393共13个UI元素,它们具有不同的大小,其中UI元素385最大、UI元素381次之,UI元素384再次之,UI元素382、383、386、387、388、389、390、391、392和393最小。电子设备100检测到作用于M个UI元素中的第一UI元素的操作。例如,在图3B的示例中,如图3G进一步示出的,电子设备100可以检测到作用于M=13个UI元素381至393中的UI元素385的操作。具体地,电子设备100的用户可以使用手部370来点击UI元素385,例如以启动UI元素385所对应的操作或服务。在下文中,将以点击操作作为针对UI元素的操作的示例来描述本公开的实施例的“引力”动画效果。然而,应当理解,本公开的实施例不限于点击操作,而是可以等同地或类似地适用于与UI元素有关的任何其他操作,诸如移动UI元素的操作、将UI元素与其他UI元素合并的操作、展开UI元素的操作、以及删除UI元素的操作,等等。
响应于对UI元素385的操作,电子设备100使屏幕300上的N个UI元素中的每个UI元素产生“引力”动画效果,也即受到UI元素385的“吸引力”或“排斥力”而产生移动的动画效果,其中N为1与M-1之间的正整数。在图3G至图3J的示例中,假定N=M-1=12并且“引力”被设置为“吸引力”,也即,除了UI元素385本身以外,其他的全部UI元素都受到UI元素385的“吸引力”。如图3H所示,为了实现本公开的实施例的“引力”动画效果,在电子设备100检测到用户对UI元素385的点击操作之后,电子设备100可以首先使屏幕300上的其他12个UI元素沿着指向UI元素385的方向(也即第一方向)进行移动。例如,在图3H的示例中,如虚线箭头所指示的,UI元素381可以沿着指向UI元素385的方向381-d1移动,UI元素382可以沿着指向UI元素385的方向382-d1移动,UI元素383可以沿着指向UI元素385的方向383-d1移动,UI元素384可以沿着指向UI元素385的方向384-d1移动,UI元素386可以沿着指向UI元素385的方向386-d1移动,UI元素387可以沿着指向UI元素385的方向387-d1移动,UI元素388可以沿着指向UI元素385的方向388-d1移动,UI元素389可以沿着指向UI元素385的方向389-d1移动。UI元素390可以沿着指向UI元素385的方向390-d1移动,UI元素391可以沿着指向UI元素385的方向391-d1移动,UI元素392可以沿着指向UI元素385的方向392-d1移动,并且UI元素393可以沿着指向UI元素385的方向393-d1移动。在图3H的示例中,某个UI元素指向UI元素385的方向可以是指该UI元素上的中心点指向UI元素385的中心点的方向。但是,需要说明的是,在其他实施例中,某个UI元素指向UI元素385的方向可以是指该UI元素上的任何一点指向UI元素385上的任何一点的方向。
在图3I中,使用十字符号表示每个UI元素的中心点的当前位置,也即受影响的UI元素产生引力动画效果之后的中心点位置,并且使用小黑点来表示每个UI元素在产生引力动画效果之前的中心点的起始位置。如图3I所示,由于受到被操作的UI元素385的“吸引力”影响,UI元素381的中心点已经沿着指向UI元素385的中心点的第一方向进行第一移动,也即从中心点的起始位置移动到中心点的当前位置。类似地,如图3I所示,每个其他受到UI元素385影响的UI元素的中心点均沿着指向UI元素385的中心点的各自的第一方向进行了 各自的第一移动。需要说明的是,在完成了各自的第一移动之后,受到“引力”影响的各个UI元素将会在与第一方向相反的第二方向上返回到起始位置。例如,在图3I的示例中,UI元素381在完成了第一方向上的第一移动之后,将在第一方向相反的第二方向上回到起始位置。如图3J进一步示出的,在图3J中示出的时刻,UI元素381已经完成了第二移动并且回到起始位置,从而表示UI元素381中心点当前位置的十字符号与表示中心点初始位置的小黑点重合。类似地,每个其他受到UI元素385的“引力”影响的UI元素也均完成了各自的第二位移而返回到各自的初始位置。需要说明的是,尽管图3I和图3J的示例中描绘了受到“引力”影响的UI元素进行一次第一位移和一次第二位移,但是本公开的实施例不限于此。在其他实施例中,取决于系统设置或用户设置,或者取决于被操作的UI元素的操作持续的时间长度,受到“引力”影响的UI元素可以进行多次第一位移和第二位移。也就是说,受到“引力”影响的UI元素可以按照循环的方式在第一方向上执行第一移动,在第二方向上执行第二移动,然后再在第一方向上执行第一移动,再在第二方向上执行第二移动,如此循环往复。在一些实施例中,受到“引力”影响的UI元素在每次循环中在第一方向上的第一移动中的目标距离可以保持不变或逐渐减小。下文将参考图5来详细描述一个UI元素受到被操作的UI元素的“吸引力”的影响而产生引力动画效果的细节。
图5示出了根据本公开的实施例的“引力”动画效果中的受到“吸引力”影响的UI元素在进行第一移动和第二移动的过程中在不同时刻的位置的示意图。在图5的示例中,假设UI元素343为被操作的第一UI元素,并且UI元素344是受到UI元素343的“吸引力”作用影响的第二UI元素。同时参考图2和图5,在图2的框232处,电子设备100确定受到被操作的第一UI元素343影响的N个UI元素中的第二UI元素344将在第一方向上移动的目标距离D0。在图5的示例中,第一方向是从第二UI元素344指向第一UI元素343的方向。当然,在其他实施例中,如果第一UI元素343的“引力”作用被设置为“排斥力”,那么第二UI元素344的第一位移的第一方向可以是从第一UI元素343指向第二UI元素344的方向。需要说明的是,电子设备100可以采用任何适当的方式来确定受“引力”作用影响的UI元素344在第一移动中需要移动的目标距离D0。在一些实施例中,电子设备100可以将所有受到第一UI元素343的“引力”作用影响的UI元素的第一移动的目标距离设置为相同。以此方式,电子设备100用于实现“引力”动画效果的处理可以被简化。在其他实施例中,电子设备100可以基于产生“引力”作用的UI元素的尺寸、受到“引力”作用的UI元素的尺寸、和/或两个UI元素之间的距离来确定受影响的UI元素在第一移动中的目标距离。在另外的实施例中,由于产生“引力”作用的某个特定UI元素对于其他受影响的UI元素而言是相同的,所以在产生多个受影响的UI元素的整体“引力”动画效果时,在电子设备100确定每个受影响的UI元素的第一移动的目标距离的大小时,可以不考虑产生“引力”作用的UI元素的尺寸。例如,在图5的示例中,电子设备100可以基于第二UI元素344的尺寸和第二UI元素344到第一UI元素343的距离这两个因素,来确定第二UI元素344在第一方向上的第一移动的目标距离D0。后文将参考图8、图9以及图10A和图10B来进一步描述这样的实施例。
在图2的框234处,电子设备100使第二UI元素344从起始位置p1沿第一方向以目标距离D0进行第一移动。也就是说,在图5的示例中,第二UI元素344的第一移动是指UI元素344从起始位置p1开始沿第一方向进行移动,直至到达与起始位置p1的距离为目标距离D0的目标位置p2。更具体地,如图5所示,在进行第一移动期间,第二UI元素344在时 间t1处位于起始位置p1并且开始进行第一移动;在时间t2处,第二UI元素344沿着第一方向移动了距离D1;在时间t3处,第二UI元素344沿着第一方向移动了距离D2;在时间t4处,第二UI元素344沿着第一方向移动了目标距离D0而到达目标位置p2。在图2的框236处,在第二UI元素344完成从起始位置p1到目标位置p2的第一移动之后,电子设备100使第二UI元素344沿与第一方向相反的第二方向进行第二移动,以复位到起始位置p1。也就是说,在图5的示例中,第二UI元素344的第二移动是指第二UI元素344从目标位置p2开始沿第二方向进行移动,直至回到起始位置p1。更具体地,如图5所示,在第一移动之后的第二移动期间,在时间t5处,第二UI元素344从位置p2沿着第二方向移动了距离D3;在时间t6处,第二UI元素344沿着第二方向移动了距离D4;在时间t7处,第二UI元素344沿着第二方向移动了目标距离D0而回到起始位置p1。
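作为一种示意性的实现草图,以下代码基于安卓动画框架的常见用法,将第一移动与第二移动依次串联播放;其中第一时长、第二时长取示例值,dx、dy为沿第一方向分解得到的目标距离分量,该草图并非本公开限定的实现:
import android.animation.AnimatorSet;
import android.animation.ObjectAnimator;
import android.view.View;

void playGravityAnimation(View view, float dx, float dy) {
    // 第一移动:从起始位置沿第一方向移动目标距离
    ObjectAnimator firstX = ObjectAnimator.ofFloat(view, "translationX", 0f, dx);
    ObjectAnimator firstY = ObjectAnimator.ofFloat(view, "translationY", 0f, dy);
    AnimatorSet first = new AnimatorSet();
    first.playTogether(firstX, firstY);
    first.setDuration(150L);                 // 第一移动持续的第一时长(示例值)

    // 第二移动:沿相反的第二方向复位到起始位置
    ObjectAnimator secondX = ObjectAnimator.ofFloat(view, "translationX", dx, 0f);
    ObjectAnimator secondY = ObjectAnimator.ofFloat(view, "translationY", dy, 0f);
    AnimatorSet second = new AnimatorSet();
    second.playTogether(secondX, secondY);
    second.setDuration(250L);                // 第二移动持续的第二时长(示例值)

    AnimatorSet gravity = new AnimatorSet();
    gravity.playSequentially(first, second); // 先执行第一移动,再执行第二移动
    gravity.start();
}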
在一些实施例中,第二UI元素344的第一移动持续的第一时长、第二移动持续的第二时长、和/或第一移动和第二移动持续的总时长是可配置的。以此方式,电子设备100的用户可以按照偏好来设置“引力”动画效果的时间长短,从而进一步改进用户体验。在一些实施例中,在产生第二UI元素344的“引力”动画效果时,电子设备100可以在第一移动和/或第二移动期间缩小或放大第二UI元素344的尺寸。以此方式,电子设备100可以更加多样化地展现出UI元素之间具有“引力”的动画效果,从而进一步提升用户体验。另外,如前文描述的,本公开的实施例可以模仿自然界中的物体之间存在的“引力”作用,而在自然界中,受到一个物体的引力作用的另一物体将会在引力的作用下进行变加速直线运动。因此,在一些实施例中,第二UI元素344的第一移动和/或第二移动可以包括变加速直线运动。也就是说,上文描述的各个移动距离D1至D4与各个时刻t1至t7之间的关系可以根据变加速直线运动的位移时间曲线来确定。以此方式,电子设备100可以基于自然界的物体在引力作用下的加速运动规律,来实现UI元素的第一移动和第二移动,使得“引力”动画效果更加符合自然规律和用户在生活中的习惯认知,从而进一步改进用户体验。在其他实施例中,为了使得“引力”动画效果更符合用户日常的使用习惯,电子设备100可以基于位移随时间变化的预定义曲线,来确定第二UI元素344在第一移动和/或第二移动期间的移动的动画效果。也就是说,电子设备100可以基于位移随时间变化的预定义曲线来确定第二UI元素344在第一移动和/或第二移动中的运动细节,诸如在某个具体时刻运动到哪个具体位置,也即,上文描述的各个移动距离D1至D4与各个时刻t1至t7之间的关系,等等。以此方式,电子设备100可以基于位移随时间变化的预定义曲线来方便地控制UI元素的移动,使得“引力”动画效果更符合用户的使用习惯,从而进一步改进用户体验。后文将参考图6、图7A和图7B来详细描述这样的实施例。
通过示例处理过程200,本公开的实施例实现了UI元素之间具有“引力”的动画效果,展现出符合自然规律的动态效果,与用户生活体验更加一致,加强了电子设备100的生命力和人性化程度。例如,如果不存在“引力”动画效果,当UI元素(例如,图标)排列完成之后,UI元素的显示效果就比较单一,每个图标都是独立呈现的,没有相互的联系,不符合自然规律。相比之下,在实现了本公开的实施例提供的引力动画效果之后,单个图标的效果可以影响整个页面,并且每个图标之间是有潜在联系的,就像UI元素之间存在“万有引力”一样,将它们联系在一起。例如,UI元素的相关的移动、合并、删除、展开等操作的动画效果将更加符合自然规律,更加的人性化,提升了与用户的交流。更一般地说,本公开的实施例 提出了一种新型动画效果的实现方案,主要提供了引力动画效果的实现模型,实现了引力理论的动画效果实现,使得用户可以更好地体验UI元素的功能。更特别地,本公开的实施例可以基于引力公式,实现引力动画效果模型;可以针对UI元素(例如,图标)的不同操作场景,实现吸引力、排斥力以及黑洞吸附等引力场景的动效;可以建立引力场,构建整个特征动画效果的基础;并且还可以将基础动画效果开放给三方应用,从而建立生态。
图6示出了根据本公开的实施例的“引力”动画效果的动画过程和相关控制逻辑的示意图。在通常的电子设备的操作系统中,例如在当前主流的安卓系统和IOS系统中,动画本质上就是根据刷新率实时显示当前的界面或者控件,利用人类的视觉暂留原理,使得用户感觉所显示的画面就是运动的。因此,如图6所示,电子设备100可以首先确定“引力”动画的初态610和“引力”动画的终态620。另外,电子设备100可以确定从“引力”动画的初态610变换到“引力”动画的终态620的过程持续的动画时间605。再者,电子设备100还可以确定“引力”动画类型630和“引力”动画变换形式640。例如,“引力”动画类型630可以包括UI元素的位移动画632、缩放动画634、旋转动画636、透明动画638等,而“引力”动画变换形式640可以通过插值器642和644来控制,例如在固定的动画时间605里进行相关变换速度的控制,等等。
在本公开的实施例中,为了实现“引力”的动画效果,主要是涉及到“引力”动画类型630中的位移动画632,但是应当理解,其他的“引力”动画类型也可以可能的。如上文描述的,本公开的实施例中的“引力”动画效果产生的位移动画效果可以是UI元素先朝向某一个方向移动,然后再以相反的方向复位。两段动画可以分别定义时长和插值器,应用侧可以按需进行调节。如上文提到的,在一些实施例中,电子设备100可以基于位移随时间变化的预定义曲线来确定第二UI元素344在第一移动和/或第二移动期间的移动的动画效果。关于此,需要说明的是,根据人因研究,针对不同UI元素的不同移动阶段可以使用不同的插值器和时间,从而达成不一样的动画效果。应理解,电子设备100可以采用已知的或未来发现的任何适当的位移时间曲线来控制第二UI元素344在第一移动和/或第二移动期间的移动细节。在一些实施例中,电子设备100可以选择使用贝塞尔曲线或弹性力曲线作为第二UI元素344的第一位移和/或第二位移的预定义曲线。例如,电子设备100可以使用二阶贝塞尔曲线来控制第二UI元素344的第一位移,并且使用弹性力曲线来控制第二UI元素344的第二位移,或者反之亦然。当然,在其他实施例中,电子设备100也可以使用贝塞尔曲线或弹性力曲线之一来控制第一位移和第二位移两者。以此方式,电子设备100可以基于贝塞尔曲线或弹性力曲线来方便地控制UI元素的移动,使得“引力”动画效果更加符合用户在生活中对于“吸引力”和“排斥力”的习惯认知,从而进一步改进用户体验。下文将参考图7A来描述电子设备100基于二阶贝塞尔曲线来控制第二UI元素344的第一位移的示例,并且参考图7B来描述电子设备100基于弹性力曲线来控制第二UI元素344的第二位移的示例。
图7A示出了根据本公开的实施例的UI元素的位移随时间变化的预定义曲线为贝塞尔曲线的示意图。在图7A示出的位移时间曲线图中,横坐标表示时间,纵坐标表示位移(或距离)。在一些实施例中,用于控制UI元素移动的插值器可以使用常见的曲线插值器,例如在图7A的示例中,此前在图5中描绘的第二UI元素344的第一移动的位移时间曲线710可以是二阶贝塞尔曲线。具体地,电子设备100可以通过选择二阶贝塞尔曲线的两个二阶的点,从而达到UI元素344的不同移动效果。以此方式,位移时间曲线与时间的相互配合将会产生 运动的韵律感。电子设备100调整位移时间曲线可以使UI元素实现加速和减速,而不是以恒定的速率移动。一般而言,贝塞尔曲线主要应用于固定的场景下点击操作页面切换间的运动匹配。以下为在特定的构建平台中的贝塞尔曲线的9种不同节奏的相关参数,图7A示出的曲线710可以是下列9种贝塞尔曲线之一。需要说明的是,尽管在本公开的上下文中以二阶贝塞尔曲线作为位移时间曲线描述了一些示例,但是本公开的实施例不限于此,而是可以等同地将任何曲线形式作为位移时间曲线来实现UI元素的移动(例如,第一移动和第二移动之一或两者)。例如,这样的曲线形式包括但不限于一阶贝塞尔曲线、三阶或更高阶贝塞尔曲线、其他已知的或未来发现的其他曲线形式、或者甚至是直线。
(贝塞尔曲线的9种不同节奏的相关参数表)
在上述9种不同节奏中,跟随用户的手部滑动的贝塞尔曲线可以适当尝试40-60、33-33可以是跟随手速的贝塞尔曲线,而70-80是节奏较强的曲线,可用于凸显趣味性场景。根据上述分析,第二UI元素344的第一移动的插值器可以选择贝塞尔曲线,具体的坐标可以根据的“引力”动画效果的所设置的各种参数来分析得出。此外,需要说明的是,本公开的实施例的贝塞尔曲线的两个点的坐标可以任意确定,不限于以上9种曲线,两个点坐标可以是(x1,y1)、(x2,y2),其中x1、y1、x2和y2可以是0到1之间的数值,一般可以取一位小数。应当理解,尽管图7A中将本公开的实施例的位移时间曲线710示例性地描绘为二阶贝塞尔曲线,但是本公开的实施例不限于此,而是等同地适用于其他阶数的贝塞尔曲线和任何其他的曲线。同时参考图5和图7A,在UI元素344从起始位置p1到目标位置p2的第一移动中,电子设备100可以基于位移时间曲线710,确定UI元素344在t1时刻的移动距离为0,在t2时刻的移动距离为D1,在t3时刻的移动距离为D2,并且在t4时刻的移动距离为目标距离D0。需要说明的是,在具体的实现中,电子设备100可以根据屏幕300的刷新频率所对应的时间间隔在位移时间曲线710上确定出UI元素344在每个时刻所在的位置,然后在不同的时刻在屏幕300上的对应位置处显示UI元素344,从而可以实现UI元素344进行第一移动的动画效果。
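作为一种示意,以下代码草图根据二阶贝塞尔位移时间曲线,在各刷新时刻计算UI元素已移动的距离;其中通过二分法由横坐标(归一化时间)反解曲线参数,函数名与控制点取值均为便于说明的假设,并非本公开限定的实现:
// 二阶贝塞尔位移时间曲线:横坐标为归一化时间,纵坐标为位移完成比例。
// P0=(0,0),P2=(1,1),P1=(x1,y1)为控制点,x1、y1取0到1之间的值。
static float bezierFraction(float t, float x1, float y1) {
    // 先用二分法由横坐标t反解曲线参数u,使得Bx(u)约等于t
    float lo = 0f, hi = 1f, u = t;
    for (int i = 0; i < 30; i++) {
        u = (lo + hi) / 2f;
        float bx = 2f * u * (1f - u) * x1 + u * u;
        if (bx < t) lo = u; else hi = u;
    }
    // 再计算对应的纵坐标,即位移完成比例
    return 2f * u * (1f - u) * y1 + u * u;
}

// elapsed为已经过的动画时间,duration为第一移动总时长,d0为目标距离
static float displacementAt(float elapsed, float duration, float x1, float y1, float d0) {
    float t = Math.min(1f, Math.max(0f, elapsed / duration));
    return d0 * bezierFraction(t, x1, y1);
}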
图7B示出了根据本公开的实施例的UI元素的位移随时间变化的预定义曲线为反比例曲线的示意图。在图7B示出的位移时间曲线图中,横坐标表示时间,纵坐标表示位移(或距离)。在图7B的示例中,此前在图5中描绘的第二UI元素344的第一移动的位移时间曲线720可以是反比例曲线,也即随着时间的推移,第二UI元素344在单位时间内移动的距离越来越小。同时参考图5和图7B,在UI元素344从起始位置p1到目标位置p2的第一移动中,电子设备100可以基于位移时间曲线720,确定UI元素344在t1时刻的移动距离为0,在t2时刻的移动距离为D1,在t3时刻的移动距离为D2,并且在t4时刻的移动距离为目标距离 D0。需要说明的是,在具体的实现中,电子设备100可以根据屏幕300的刷新频率所对应的时间间隔在位移时间曲线720上确定出UI元素344在每个时刻所在的位置,然后在不同的时刻在屏幕300上的对应位置处显示UI元素344,从而可以实现UI元素344进行第一移动的动画效果。
图7C示出了根据本公开的实施例的UI元素的位移随时间变化的预定义曲线为临界阻尼弹性力曲线的示意图。在图7C的示例中,图5中描绘的UI元素344的第二移动的位移时间曲线730为弹性力曲线,例如,临界阻尼的弹性力曲线。一般地,弹性力曲线在不同的操作场景可以使用不同的状态,也即,临界阻尼、欠阻尼和过阻尼。在不同的阻尼状态下,位移时间的弹性力曲线可以是不一样的。具体地,三种情况如下:阻尼的平方等于4倍的质量乘以刚性,这是临界阻尼。进一步地,如果阻尼大就是过阻尼,刚性大就是欠阻尼。特别地,阻尼的平方小于4倍质量乘以刚性为欠阻尼,而阻尼的平方大于4倍的质量乘以刚性为过阻尼。在图7C示出的位移时间曲线图中,横坐标表示时间,纵坐标表示位移(或距离)。应当理解,尽管图7C中将本公开的实施例的位移时间曲线730示例性地描绘为临界阻尼弹性力曲线,但是本公开的实施例不限于此,而是等同地适用于任何其他的曲线。同时参考图5和图7C,在UI元素344从目标位置p2回到起始位置p1的第二移动中,电子设备100可以基于位移时间曲线730,确定UI元素344在t4时刻的移动距离为0,在t5时刻的移动距离为D3,在t6时刻的移动距离为D4,并且在t7时刻的移动距离为目标距离D0。需要说明的是,在具体的实现中,电子设备100可以根据屏幕300的刷新频率所对应的时间间隔在位移时间曲线730上确定出UI元素344在每个时刻所在的位置,然后在不同的时刻在屏幕300上的对应位置处显示UI元素344,从而可以实现UI元素344进行第二移动的动画效果。
在弹性力模型的具体实现中,弹性引擎基于胡克定律下的阻尼振动公式如下:
f=ma  (公式1),
m*(d^2x/dt^2)+g*(dx/dt)+k*x=0  (公式2),
其中f表示振动过程中的受力,m表示质量,a表示加速度,k表示弹性系统(刚性),x表示弹簧形变量,g表示阻力系数(阻尼),并且t表示时间。在具体的设置中,电子设备100的用户只需要确定需要产生的弹簧形变量x(也即第二移动的距离),其余的参数可以是可调参数。在一些实施例中,通过人因研究可以给出这些可调参数的相关推荐值,以供应用进行使用,当然应用也可以按需自定义地设置这些可调参数。
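作为对上述阻尼振动模型的一种示意性数值实现,以下代码草图按固定时间步长对公式1和公式2进行积分,得到第二移动期间弹簧形变量(即与起始位置的偏差)随时间的变化;其中的步长、初始条件等均为假设的示例取值,并非本公开限定的实现:
// m:质量,k:刚性,g:阻尼,x0:初始形变量(即第二移动开始时与起始位置的距离),dtMs:时间步长(毫秒)
static float[] simulateSpring(float m, float k, float g, float x0, float dtMs, int steps) {
    float[] xs = new float[steps];
    float x = x0;      // 弹簧形变量
    float v = 0f;      // 速度
    float dt = dtMs / 1000f;
    for (int i = 0; i < steps; i++) {
        float a = (-k * x - g * v) / m;   // f = ma = -k*x - g*v
        v += a * dt;
        x += v * dt;
        xs[i] = x;                        // 记录每个刷新时刻相对起始位置的偏差
    }
    return xs;
}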
在一些实施例中,弹性引擎插值器的相关设置可以如下。
代码类的实现
1.SpringInterpolator(float stiffness,float damping)
2.SpringInterpolator(float stiffness,float damping,float endPos)
3.SpringInterpolator(float stiffness,float damping,float endPos,float velocity)
4.SpringInterpolator(float stiffness,float damping,float endPos,float velocity,float valueThreshold)
调用实例:
1.PhysicalInterpolatorBase interpolator=new SpringInterpolator(400F,40F,200F,2600F,1F);
2.ObjectAnimator animator=ObjectAnimator.ofFloat(listView,"translationY",0,346);
3.animator.setDuration(interpolator.getDuration());
4.animator.setInterpolator(interpolator);
5.animator.start();
弹性力引擎动画类
动画类实例:
1.SpringAnimation(K object,FloatPropertyCompat<K>property,float stiffness,float damping,float startValue,float endValue,float velocity)
2.SpringAnimation(K object,FloatPropertyCompat<K>property,float stiffness,float damping,float endValue,float velocity)
动画类调用实例:
1.SpringAnimation animation=new SpringAnimation(listView,DynamicAnimation.TRANSLATION_Y,400F,40F,0,1000F);
2.animation.start();
图7D示出了根据本公开的实施例的UI元素的位移随时间变化的预定义曲线为欠阻尼弹性力曲线的示意图。在图7D的示例中,图5中描绘的UI元素344的第二移动的位移时间曲线740为弹性力曲线,例如,欠阻尼的弹性力曲线。在图7D示出的位移时间曲线图中,横坐标表示时间,纵坐标表示位移(或距离)。应当理解,尽管图7D中将本公开的实施例的位移时间曲线740示例性地描绘为欠阻尼弹性力曲线,但是本公开的实施例不限于此,而是等同地适用于任何其他的曲线。同时参考图5和图7D,在UI元素344从目标位置p2回到起始位置p1的第二移动中,电子设备100可以基于位移时间曲线740,确定UI元素344在t4时刻的移动距离为0,在t5时刻的移动距离为D3,在t6时刻的移动距离为D4,并且在t7时刻的移动距离为目标距离D0。需要特别之处的是,与图7C中示出的临界阻尼弹性力曲线不同,图7D中的欠阻尼弹性力曲线740可以具有“往复”的效果。例如,根据时间位移曲线740,UI元素344在t45时刻之前的某个时刻就已经到达了目标距离D0,并且继续沿着第二方向移动超过目标距离D0,然后再向第一方向移动。例如,在图7D中的时刻t45,UI元素344移动的距离是D45,其大于目标距离D0。类似地,在时刻t55和t65,UI元素344在第二方向上的移动距离D55和D65均大于目标距离D0。换言之,在位移时间曲线为欠阻尼弹性力曲线740的情况下,UI元素344将沿着第二方向从目标位置p2回到起始位置p1,然后在第二方向上移动超过起始位置p1,然后再以起始位置p1为中心进行来回的“往复”运动,直到最后停在起始位置p1。需要说明的是,在具体的实现中,电子设备100可以根据屏幕300的刷新频率所对应的时间间隔在位移时间曲线740上确定出UI元素344在每个时刻所在的位置,然后在不同的时刻在屏幕300上的对应位置处显示UI元素344,从而可以实现UI元素344进行第二移动的动画效果。
图7E至图7H示出了根据本公开的实施例的受到“引力”影响的三个UI元素的不同位移时间变化曲线的比较的示意图。具体地,图7E示出了在上文参考图3C至图3F描述的示例中的UI元素344、UI元素324和UI元素311三个UI元素在受到UI元素343的“引力”影响下的第一移动的位移时间曲线均为贝塞尔曲线的示意图。图7F示出了在上文参考图3C至图3F描述的示例中的UI元素344、UI元素324和UI元素311三个UI元素在受到UI元 素343的“引力”影响下的第一移动的位移时间曲线均为反比例曲线的示意图。图7G示出了在上文参考图3C至图3F描述的示例中的UI元素344、UI元素324和UI元素311三个UI元素在受到UI元素343的“引力”影响下的第二移动的位移时间曲线均为临界阻尼弹性力曲线的示意图。图7H示出了在上文参考图3C至图3F描述的示例中的UI元素344、UI元素324和UI元素311三个UI元素在受到UI元素343的“引力”影响下的第二移动的位移时间曲线均为欠阻尼弹性力曲线的示意图。需要说明的是,图7E至图7H以示例性的方式描绘了三个UI元素的位移时间曲线,以说明不同UI元素在相同UI元素的“引力”影响下的第一位移和第二位移可以分别具有不同的位移时间曲线。图3C至图3F中描绘的受到UI元素343的“引力”影响的其他UI元素的第一位移和第二位移可以具有类似的位移时间曲线。
在图7E示出的位移时间曲线图中,横坐标表示时间,纵坐标表示位移(或距离),此前在图5中描绘的第二UI元素344的第一移动的位移时间曲线710可以是二阶贝塞尔曲线,UI元素324的第一移动的位移时间曲线712可以是二阶贝塞尔曲线,并且UI元素311的第一移动的位移时间曲线714也可以是二阶贝塞尔曲线。注意,贝塞尔曲线710、712和714可以具有不同的参数。例如,在相同的时间t4处,因为UI元素344与被操作的UI元素343的距离最近,所以UI元素344可以具有最大的目标距离D0-344。因为UI元素324与被操作的UI元素343的距离比UI元素344远,所以UI元素324可以具有比UI元素344的目标距离D0-344小的目标距离D0-324。因为UI元素311与被操作的UI元素343的距离比UI元素324远,所以UI元素311可以具有比UI元素324的目标距离D0-324小的目标距离D0-311。同时参考图3C至图3E和图7E,在t1时刻,UI元素344、324和311在UI元素343的“引力”作用下,开始准备进行各自的第一移动。在t2时刻,UI元素344、324和311在各自的第一方向上移动距离D1-344、D1-324和D1-311。在t3时刻,UI元素344、324和311在各自的第一方向上移动距离D2-344、D2-324和D2-311。在t4时刻,UI元素344、324和311在各自的第一方向上移动目标距离D0-344、D0-324和D0-311。需要说明的是,在具体的实现中,电子设备100可以根据屏幕300的刷新频率所对应的时间间隔在位移时间曲线710、712、714上确定出UI元素344、324和311在每个时刻所在的位置,然后在不同的时刻在屏幕300上的对应位置处显示UI元素344、324和311,从而可以实现UI元素344、324和311进行各自的第一移动的动画效果。还需要说明的是,尽管在图7E的示例中将UI元素344、324和311各自的第一移动示出为同时开始且同时结束,但是这仅是示例性的,无意以任何方式限制本公开的范围。在其他实施例中,UI元素344、324和311各自的第一移动可以在不同的时间开始和/或在不同的时间结束。例如,这可能是在考虑UI元素343的“引力”传播的速度的情况下,后文将参考图19来进一步描述这样的实施例。
在图7F示出的位移时间曲线图中,横坐标表示时间,纵坐标表示位移(或距离),此前在图5中描绘的第二UI元素344的第一移动的位移时间曲线720可以是反比例曲线,UI元素324的第一移动的位移时间曲线722可以是反比例曲线,并且UI元素311的第一移动的位移时间曲线724也可以是反比例曲线。注意,反比例曲线720、722和724可以具有不同的参数。例如,在相同的时间t4处,因为UI元素344与被操作的UI元素343的距离最近,所以UI元素344可以具有最大的目标距离D0-344。因为UI元素324与被操作的UI元素343的距离比UI元素344远,所以UI元素324可以具有比UI元素344的目标距离D0-344小的目标距离D0-324。因为UI元素311与被操作的UI元素343的距离比UI元素324远,所以UI 元素311可以具有比UI元素324的目标距离D0-324小的目标距离D0-311。同时参考图3C至图3E和图7F,在t1时刻,UI元素344、324和311在UI元素343的“引力”作用下,开始准备进行各自的第一移动。在t2时刻,UI元素344、324和311在各自的第一方向上移动距离D1-344、D1-324和D1-311。在t3时刻,UI元素344、324和311在各自的第一方向上移动距离D2-344、D2-324和D2-311。在t4时刻,UI元素344、324和311在各自的第一方向上移动目标距离D0-344、D0-324和D0-311。需要说明的是,在具体的实现中,电子设备100可以根据屏幕300的刷新频率所对应的时间间隔在位移时间曲线720、722、724上确定出UI元素344、324和311在每个时刻所在的位置,然后在不同的时刻在屏幕300上的对应位置处显示UI元素344、324和311,从而可以实现UI元素344、324和311进行各自的第一移动的动画效果。还需要说明的是,尽管在图7F的示例中将UI元素344、324和311各自的第一移动示出为同时开始且同时结束,但是这仅是示例性的,无意以任何方式限制本公开的范围。在其他实施例中,UI元素344、324和311各自的第一移动可以在不同的时间开始和/或在不同的时间结束。
在图7G示出的位移时间曲线图中,横坐标表示时间,纵坐标表示位移(或距离),此前在图5中描绘的第二UI元素344的第二移动的位移时间曲线730可以是临界阻尼弹性力曲线,UI元素324的第二移动的位移时间曲线732可以是临界阻尼弹性力曲线,并且UI元素311的第二移动的位移时间曲线734也可以是临界阻尼弹性力曲线。注意,临界阻尼弹性力曲线730、732和734可以具有不同的参数。例如,在相同的时间t7处,因为UI元素344与被操作的UI元素343的距离最近,所以UI元素344可以具有最大的目标距离D0-344。因为UI元素324与被操作的UI元素343的距离比UI元素344远,所以UI元素324可以具有比UI元素344的目标距离D0-344小的目标距离D0-324。因为UI元素311与被操作的UI元素343的距离比UI元素324远,所以UI元素311可以具有比UI元素324的目标距离D0-324小的目标距离D0-311。同时参考图3E至图3F和图7G,在t4时刻,UI元素344、324和311在UI元素343的“引力”作用下,已经完成了各自的第一移动,开始准备进行各自的第二移动。在t5时刻,UI元素344、324和311在各自的第二方向上移动距离D3-344、D3-324和D3-311。在t6时刻,UI元素344、324和311在各自的第二方向上移动距离D4-344、D4-324和D4-311。在t7时刻,UI元素344、324和311在各自的第二方向上移动目标距离D0-344、D0-324和D0-311。需要说明的是,在具体的实现中,电子设备100可以根据屏幕300的刷新频率所对应的时间间隔在位移时间曲线730、732、734上确定出UI元素344、324和311在每个时刻所在的位置,然后在不同的时刻在屏幕300上的对应位置处显示UI元素344、324和311,从而可以实现UI元素344、324和311进行各自的第一移动的动画效果。还需要说明的是,尽管在图7G的示例中将UI元素344、324和311各自的第二移动示出为同时开始且同时结束,但是这仅是示例性的,无意以任何方式限制本公开的范围。在其他实施例中,UI元素344、324和311各自的第二移动可以在不同的时间开始和/或在不同的时间结束。
在图7H示出的位移时间曲线图中,横坐标表示时间,纵坐标表示位移(或距离),此前在图5中描绘的第二UI元素344的第二移动的位移时间曲线740可以是欠阻尼弹性力曲线,UI元素324的第二移动的位移时间曲线742可以是欠阻尼弹性力曲线,并且UI元素311的第二移动的位移时间曲线744也可以是欠阻尼弹性力曲线。注意,欠阻尼弹性力曲线740、742和744可以具有不同的参数。例如,在相同的时间t7处,因为UI元素344与被操作的 UI元素343的距离最近,所以UI元素344可以具有最大的目标距离D0-344。因为UI元素324与被操作的UI元素343的距离比UI元素344远,所以UI元素324可以具有比UI元素344的目标距离D0-344小的目标距离D0-324。因为UI元素311与被操作的UI元素343的距离比UI元素324远,所以UI元素311可以具有比UI元素324的目标距离D0-324小的目标距离D0-311。同时参考图3E至图3F和图7H,在t4时刻,UI元素344、324和311在UI元素343的“引力”作用下,已经完成了各自的第一移动,开始准备进行各自的第二移动。在t5时刻,UI元素344、324和311在各自的第二方向上移动距离D3-344、D3-324和D3-311。在t6时刻,UI元素344、324和311在各自的第二方向上移动距离D4-344、D4-324和D4-311。在t7时刻,UI元素344、324和311在各自的第二方向上移动目标距离D0-344、D0-324和D0-311。注意,在图7H示出的示例中,UI元素344、324和311将会基于各自的欠阻尼弹性力曲线的位移时间曲线,而在各自的起始位置进行来回的“往复”运动。需要说明的是,在具体的实现中,电子设备100可以根据屏幕300的刷新频率所对应的时间间隔在位移时间曲线730、732、734上确定出UI元素344、324和311在每个时刻所在的位置,然后在不同的时刻在屏幕300上的对应位置处显示UI元素344、324和311,从而可以实现UI元素344、324和311进行各自的第一移动的动画效果。还需要说明的是,尽管在图7G的示例中将UI元素344、324和311各自的第二移动示出为同时开始且同时结束,但是这仅是示例性的,无意以任何方式限制本公开的范围。在其他实施例中,UI元素344、324和311各自的第二移动可以在不同的时间开始和/或在不同的时间结束。
如上文提到的,在一些实施例中,电子设备100可以基于第二UI元素344的尺寸和第二UI元素344到第一UI元素343的距离这两个因素,来确定第二UI元素344在第一方向上的第一移动的目标距离D0。下文将参考图8、图9以及图10A和图10B来描述这样的实施例。
图8示出了根据本公开的实施例的用于确定受到第一UI元素的“吸引力”或“排斥力”影响的第二UI元素进行第一移动的目标距离的示例处理过程800的流程图。在一些实施例中,处理过程800可以由电子设备100来实现,例如可以由电子设备100的处理器110或处理单元配合其他组件(例如,显示屏194)来实现。在其他实施例中,处理过程800也可以由具有屏幕以显示UI元素的其他设备来实现。为了便于说明,将以电子设备100执行处理过程800为例,参考图9、图10A和图10B来论述处理过程800。图9示出了根据本公开的实施例的确定受到第一UI元素的“吸引力”或“排斥力”影响的第二UI元素的尺寸的示意图。图10A和图10B分别示出了根据本公开的实施例的确定UI元素之间的距离的两种示例方式的示意图。
在图8的框810处,电子设备100可以确定第二UI元素344的尺寸。例如,在图9的示例中,由于第二UI元素344大致是矩形的形状,所以电子设备100可以确定第二UI元素344两条边910和920的长度,然后再基于面积计算公式得出第二UI元素344的尺寸或大小。在一些实施例中,第二UI元素344的两条边910和920的长度可以采用像素数目为单位,因此第二UI元素344的尺寸或大小可以采用像素的数目来表示。在其他实施例中,电子设备100可以使用任何适合的单位来度量第二UI元素344的两条边910和920的长度,进而度量第二UI元素344的尺寸或大小。例如,第二UI元素344的尺寸或大小可以采用平方毫米为单位来度量。需要说明的是,尽管图9中示意性地说明了电子设备100如何确定常见的规则矩形形状的第二UI元素344的尺寸,但是本公开的实施例不限于此,而是可以类似地适用于任何 规则或不规则形状的UI元素。
在图8的框820处,电子设备100可以确定第二UI元素344与第一UI元素343之间的距离。需要说明的是,在本公开的实施例中,电子设备100可以采用各种不同的方式来确定两个UI元素之间的距离。在一些实施例中,电子设备100可以先确定两个UI元素各自的参考点,然后确定两个参考点之间的距离,作为两个UI元素之间的距离。例如,在图10A的示例中,电子设备100可以确定第一UI元素343的参考点1010的位置,并且可以确定第二UI元素344的参考点1020的位置。然后,电子设备100可以确定参考点1010与参考点1020之间的距离1015,作为第一UI元素343与第二UI元素344之间的距离。需要说明的是,UI元素的参考点的选择可以基于预定的规则。例如,在图10A的示例中,UI元素的参考点被确定为UI元素左下角的角点。应理解,UI元素的参考点可以按照任何适当的规则来选取,只要两个UI元素之间的距离可以合理地被确定。例如,由于每个UI元素(例如,图标)的大小可能存在不一致的情况,在一些实施例中,电子设备100可以使用UI元素的中心点作为参考点,后文中将参考图11和图12来详细描述这样的实施例。然而,在实际的使用时,参考点的选择可以不作限定,而是可以由应用自由设定。在另一些实施例中,两个UI元素之间的最接近的两个点之间的距离可以被确定为两个UI元素之间的距离。例如,在图10B的示例中,由于第一UI元素343和第二UI元素344基本上是规则的矩形形状,并且两者存在平行的边,所以它们之间的最接近的两个点之间距离即为两条相邻边之间的距离1025。应理解,尽管图10B的示例以示意性地方式描绘了两个规则形状的UI元素的最接近的两个点之间的距离,但是本公开的实施例不限于此,而是等同地适用于具有任何相同形状或不同形状的两个UI元素。此外,在其他实施例中,两个UI元素之间的距离还可以采用各种其他的方式来确定,例如基于参考圆的半径来确定UI元素之间的距离,或者基于UI元素之间的间距来确定UI元素之间的距离,等等。后文将参考图11至图17A-图17E来描述这些实施例。
在图8的框830处,基于第二UI元素344的尺寸和第二UI元素344与第一UI元素343之间的距离,电子设备100可以确定第二UI元素344在第一移动中需要移动的目标距离。一般地,电子设备100可以使目标距离与第二UI元素344的尺寸和两个UI元素的距离具有任何适当的关系,只要可以体现出第一UI元素343对第二UI元素344的“吸引力”或“排斥力”的作用。在一些实施例中,电子设备100可以使目标距离随着第二UI元素344的尺寸增大而增大。也就是说,第二UI元素344越大,则第二UI元素344受到第一UI元素343的“吸引力”或“排斥力”越大。这与自然世界中的引力规律是相符的,因为第二UI元素344越大,可以认为第二UI元素344的“质量”越大,因此将受到更大的“引力”作用。另一方面,电子设备100可以使目标距离随着两个UI元素之间的距离增大而减小。换句话说,第二UI元素344离第一UI元素343越近,则第二UI元素344受到第一UI元素343的“吸引力”或“排斥力”越大。这也与自然世界中的引力规律相符,因为自然界的“引力”作用随着物体之间的距离减小而增大。以此方式,UI元素本身的大小越大,两个UI元素之间的距离越小,UI元素受到其他UI元素的“吸引力”或“排斥力”作用的大小也就越大,从而符合于自然界中的引力大小规律,由此进一步提升用户体验。
在一些实施例中,第一位移和第二位移的动画效果的幅度,也就是移动的距离与该UI元素与吸引力或者排斥力发生点的距离成反比。更具体地,本公开的实施例可以借用万有引力的模型,也即:
F=G*(M1*M2)/(r*r)  (公式3)。
两个物体之间的万有引力大小与它们各自的质量和距离有关。因为本公开的实施例主要针对的是用户体验UX界面上使用的UI元素、图形、图标或者控件,所以可以认为一般情况下UI元素的质量和大小是成正比的。假设某一UI元素的大小为R,距离为r,则其“质量”可以认为是:
M=K*R=K*r  (公式4)。
这样,基于上述的引力模型以及本公开的实施例的推导公式,可以得出两个UI元素之间的吸引力或排斥力与两个UI元素的距离和受影响的UI元素大小之间的关系如下:
F=K*(R*R)/(r*r)  (公式5)。
通过简化,即可得出吸引力或排斥力与受影响的UI元素的大小成正比,与两个UI元素之间的距离成反比,例如:
F=K*R/r  (公式6)。
进一步研究相关的参数K的细致意义,受影响的UI元素的位移的幅度可以通过以下的公式进行计算得到:
(公式7:受影响的UI元素的位移幅度的计算公式)
该公式是通过人因研究而得出,其中0.1和0.8可以作为固定常量,该公式是最接近引力效果的。此外,a是常量,其默认值可以是10,当然用户可以进行调节设定。需要说明的是,基于上文的公式4和公式7得出的位移时间曲线将类似于前文参考图7B和图7F描述的反比例曲线。在一些实施例中,电子设备100可以使用该公式来计算UI元素在“引力”动画效果中的位移动画的最终位置。当然,在其他实施例中,电子设备100也可以使受“引力”影响的第二UI元素344的移动的目标距离随着第二UI元素344的尺寸增大而减小,随着两个UI元素之间的距离增大而增大,或者具有任何其他的函数变化关系。尽管这样的函数变化关系可能与自然界中的引力规律不符,但是也可以给用户带来全新的用户体验。通过示例处理过程800,UI元素受到其他UI元素的“吸引力”或“排斥力”作用的大小可以取决于UI元素本身的大小和两个UI元素之间的距离,从而符合于自然界中的引力大小规律,由此进一步提升用户体验。
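作为对上述关系(公式6)的一种示意性实现,以下代码草图按照“与受影响UI元素的尺寸成正比、与两个UI元素之间的距离成反比”的思路计算第一移动的目标距离;其中比例系数k与上限maxDistance均为假设的可调参数,并非本公开限定的实现:
// size:受影响的第二UI元素的尺寸(例如以像素数目度量)
// distance:第二UI元素与被操作的第一UI元素之间的距离
// k:比例系数(可调参数),maxDistance:目标距离的上限,避免距离过近时位移过大
static float targetDistance(float size, float distance, float k, float maxDistance) {
    if (distance <= 0f) {
        return maxDistance;               // 距离为0时按上限处理
    }
    float d0 = k * size / distance;       // 与尺寸成正比,与距离成反比(对应公式6的思路)
    return Math.min(d0, maxDistance);
}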
如上文提到的,在本公开的实施例中,除了上文描述的距离确定方式之外,两个UI元素之间的距离还可以采用各种其他的方式来确定。下文将参考图11至图17A至图17F来描述另外三种UI元素之间的距离确定方式。
图11示出了根据本公开的实施例的基于中心点来确定第一UI元素与第二UI元素之间的距离的示例处理过程1100的流程图。在一些实施例中,处理过程1100可以由电子设备100来实现,例如可以由电子设备100的处理器110或处理单元配合其他组件(例如,显示屏194)来实现。在其他实施例中,处理过程1100也可以由具有屏幕以显示UI元素的其他设备来实现。为了便于说明,将以电子设备100执行处理过程1100为例,参考图12来论述处理过程1100。图12示出了根据本公开的实施例的基于中心点来确定第一UI元素与第二UI元素之间的距离的示意图。
同时参考图11和图12,在图11的框1110处,电子设备100可以确定第一UI元素343的第一中心点343-o。需要说明的是,在一些实施例中,UI元素的中心点可以是指几何学意 义上的中心,或者是指将UI元素考虑成密度均匀的物体情况下物理学意义上的重心。在其他实施例中,UI元素的中心点也可以是指以任何其他方式被定义的代表UI元素“中心”的中心点。在图12的示例中,电子设备100可以基于第一UI元素343的几何形状来确定出第一中心点343-o在屏幕300(图12未示出)上的坐标位置或像素位置,等等。在图11的框1120处,电子设备100可以确定第二UI元素344的第二中心点344-o。例如,以类似的方式,电子设备100可以基于第二UI元素344的几何形状来确定出第二中心点344-o在屏幕300(图12未示出)上的坐标位置或像素位置,等等。在图11的框1130处,电子设备100可以确定第一中心点343-o与第二中心点344-o之间的直线距离1200,作为第一UI元素343与第二UI元素344之间的距离。例如,基于第一中心点343-o和第二中心点344-o各自的坐标位置或像素位置,电子设备100可以确定出两个中心点之间的直线距离。通过处理过程1100,两个UI元素之间的距离可以按照直接明了的方式确定为两个UI元素中心点之间的距离,从而提高电子设备100确定UI元素之间距离的确定方式的一致性,简化电子设备100的计算过程。
图13示出了根据本公开的实施例的基于半径来确定第一UI元素与第二UI元素之间的距离的示例处理过程1300的流程图。在一些实施例中,处理过程1300可以由电子设备100来实现,例如可以由电子设备100的处理器110或处理单元配合其他组件(例如,显示屏194)来实现。在其他实施例中,处理过程1300也可以由具有屏幕以显示UI元素的其他设备来实现。为了便于说明,将以电子设备100执行处理过程1300为例,参考图14来论述处理过程1300。图14示出了根据本公开的实施例的基于半径来确定第一UI元素与第二UI元素之间的距离的示意图。
同时参考图13和图14,在图13的框1310处,电子设备100可以确定第一UI元素343的第一中心点343-o。如上文提到的,在一些实施例中,第一UI元素343的第一中心点343-o可以是指几何学意义上的第一UI元素343的中心,或者是指将第一UI元素343考虑成密度均匀的物体情况下在物理学意义上的重心。在其他实施例中,第一UI元素343的第一中心点343-o也可以是指以任何其他方式被定义的代表第一UI元素343“中心”的中心点。在图14的示例中,电子设备100可以基于第一UI元素343的几何形状来确定出第一中心点343-o在屏幕300(图14未示出)上的坐标位置或像素位置,等等。
在图13的框1320处,电子设备100可以确定以第一中心点343-o为圆心的具有各自半径的多个圆。例如,在图14描绘的示例中,电子设备100可以确定具有半径r1的第一圆1410、具有半径r2的第二圆1420、具有半径r3的第三圆1430、具有半径r4的第四圆1440、以及具有半径r5的第五圆1450。需要说明的是,在一些实施例中,各个圆(例如,圆1410至圆1450)的半径之间的差可以是相等的,也即r1至r5可以形成等差数列。以此方式,电子设备100生成各个圆的处理过程可以被简化。当然,在其他实施例中,电子设备100也可以根据用户的设置或者取决于UI元素不同排布方式而将各个圆(例如,圆1410至圆1450)的半径之间的差设置为不相等,也即r1至r5不形成等差数列。如此,生成各个圆的灵活性和各个圆对场景的适应性可以得到提高。
在图13的框1330处,电子设备100可以确定第二UI元素344与多个圆(例如,圆1410至圆1450)中的至少一个圆相交。例如,在图14描绘的示例中,电子设备100可以确定第二UI元素344与第一圆1410相交。需要说明的是,在一些实施例中,某个UI元素并不总是仅与一个圆相交。例如,在图14的示例中,UI元素352与第一圆1410和第二圆1420两者 相交,并且UI元素354也与第一圆1410和第二圆1420两者相交。在图13的框1340处,电子设备100可以将与第二UI元素344相交的至少一个圆中半径最小的圆的半径确定为第二UI元素344与第一UI元素343之间的距离。例如,在图14的示例中,由于第二UI元素344仅与第一圆1410相交,所以电子设备100可以将第一圆1410的半径r1确定为第二UI元素344与第一UI元素343之间的距离。又例如,对于UI元素352和UI元素354,由于它们与第一圆1410和第二圆1420两者相交,因此电子设备100可以确定这两个圆中半径较小的圆为第一圆1410。然后,电子设备100可以确定UI元素352(或UI元素354)与第一UI元素343之间的距离为第一圆1410的半径r1。通过处理过程1300,电子设备100可以更简单和方便地确定两个UI元素之间的距离,并且使得UI元素之间的距离具有更高的一致性,从而简化基于距离的后续处理和计算过程。
图15A和图15B示出了根据本公开的实施例的在基于半径确定UI元素之间的距离的情况下UI元素之间的整体传导方式的示意图。在图15A和图15B的示例中,UI元素以带有填充图案的圆形表示,例如,UI元素1510。UI元素周围的线框1505用于示意性地示出UI元素的布置方式。如图15A和图15B所示,假设第3行第4列的UI元素被操作,电子设备100可以确定以该UI元素为中心的五个圆,分别以索引1至5来表示。在本公开的实施例的基于半径的“引力”动画效果的联动方式中,如图15A和图15B所示,半径方式的联动是以圆形方式进行展开的。例如,半径可以想象成以波纹形式运动,中心点可以按照波纹的方式确定传导之间的关系。相关的UI元素(例如,图标)只要与某一个圆相交,那么该UI元素就按照该半径的传导编号进行运动。如果UI元素(例如,图标)和任何一个圆都不相交,那么可以通过UI元素之间的距离来找出满足该距离的最小半径。整体的传导方式的确定如图15B所示,物理参数的传递可以通过以下等式表示:
stiffness=stiffness*(n+1)^(-0.18),damping=damping*(n+1)^(-0.18),n=index-0  (公式8),
其中“stiffness”表示UI元素的位移时间变化曲线为弹性力曲线的情况下的弹性力曲线的刚性,“damping”表示UI元素的位移时间变化曲线为弹性力曲线的情况下的弹性力曲线的阻尼。动画回调可以表示为:onUpdate(x,y,index),根据0节点的运动计算出编号index的x、y位移。此外,具有不同index的UI元素之间的“引力”动画效果联动传递的Delta时间差可以基于“引力”传播的速度来确定,后文将参考图19A来进一步描述有关“引力”传播速度的实施例。
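作为对公式8的一种示意性实现,以下代码草图根据传导编号index对弹性力曲线的刚性与阻尼进行衰减,并给出一个形如onUpdate(x, y, index)的回调接口;其中的接口名与方法名仅为说明用途的假设,并非本公开限定的实现:
// 根据公式8对第index圈UI元素的刚性与阻尼进行衰减,index为0表示最先运动的一圈
static float[] conductedParams(float baseStiffness, float baseDamping, int index) {
    int n = index;                                   // n = index - 0
    float factor = (float) Math.pow(n + 1, -0.18);
    return new float[] {baseStiffness * factor, baseDamping * factor};
}

// 动画回调:根据0节点的运动计算出编号index的x、y位移后进行回调
interface GravityUpdateListener {
    void onUpdate(float x, float y, int index);
}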
图16示出了根据本公开的实施例的基于间距来确定第一UI元素与第二UI元素之间的距离的示例处理过程1600的流程图。在一些实施例中,处理过程1600可以由电子设备100来实现,例如可以由电子设备100的处理器110或处理单元配合其他组件(例如,显示屏194)来实现。在其他实施例中,处理过程1600也可以由具有屏幕以显示UI元素的其他设备来实现。为了便于说明,将以电子设备100执行处理过程1600为例,参考图17A至图17F来论述处理过程1600。图17A至图17F示出了根据本公开的实施例的基于间距来确定第一UI元素与第二UI元素之间的距离的示意图。
在图16的框1610处,电子设备100可以确定第一UI元素与第二UI元素之间的横向间距。在本公开的上下文中,UI元素之间的间距可以是指两个UI元素的相邻的两个边框之间的距离。因此,横向间距可以是指两个UI元素在相对于屏幕300的横向方向上的边框距离,而纵向间距可以是指两个UI元素在相对于屏幕300的纵向方向上的边框距离。在图17A的 示例中,由于UI元素343与UI元素344是相对于屏幕300水平排列的,所以电子设备100可以确定UI元素343与UI元素344之间的横向间距为1710。在图17B的示例中,由于UI元素343与UI元素353是相对于屏幕300竖直排列的,所以电子设备100可以确定UI元素343与UI元素353之间的横向间距为0。在图17C的示例中,由于UI元素343与UI元素354是相对于屏幕300斜向排列的,所以电子设备100可以确定UI元素343与UI元素344之间的横向间距为1710。
在图16的框1620处,电子设备100可以确定第一UI元素与第二UI元素之间的纵向间距。例如,在图17A的示例中,由于UI元素343与UI元素344是相对于屏幕300水平排列的,所以电子设备100可以确定UI元素343与UI元素344之间的纵向间距为0。在图17B的示例中,由于UI元素343与UI元素353是相对于屏幕300竖直排列的,所以电子设备100可以确定UI元素343与UI元素353之间的纵向间距为1720。在图17C的示例中,由于UI元素343与UI元素354是相对于屏幕300斜向排列的,所以电子设备100可以确定UI元素343与UI元素344之间的纵向间距为1720。
在图16的框1630处,电子设备100可以基于横向间距1710和纵向间距1720中的至少一者和第二UI元素的第一移动的第一方向,来确定第二UI元素与第一UI元素之间的距离。例如,在图17A的示例中,由于UI元素343与UI元素344之间的横向距离为1710,纵向距离为0,而UI元素344朝向或远离UI元素343的第一移动的第一方向344-d1(在图17A中是远离UI元素343的方向)与屏幕300的横向方向是平行的,所以电子设备100可以确定UI元素343与UI元素344之间距离即为两者的横向距离1710。类似地,在图17B的示例中,由于UI元素343与UI元素353之间的横向距离为0,纵向距离为1720,而UI元素353朝向(未示出)或远离UI元素343的第一移动的第一方向353-d1(在图17B中是远离UI元素343的方向)与屏幕300的纵向方向是平行的,所以电子设备100可以确定UI元素343与UI元素353之间距离即为两者的纵向距离1720。需要说明的是,在图17A和图17B的示例中,如果UI元素的第一移动的第一方向并不与屏幕300的横向方向或纵向方向平行,那么电子设备100可以将横向距离1710(图17A)或纵向距离1720(图17B)在第一方向上的投影确定为两个UI元素之间的距离。
在图17C的示例中,由于UI元素343与UI元素354之间的横向距离1710和纵向距离1720均不为0,并且UI元素354朝向或远离UI元素343的第一移动的第一方向354-d1(在图17C中是远离UI元素343的方向)不与屏幕300的横向方向或纵向方向平行。在这种情况下,UI元素343与UI元素354之间的距离可以通过横向距离1710和纵向距离1720基于第一方向354-d1的投影来确定。作为一种示例,如图17D所示,基于横向距离1710和纵向距离1720的大小,电子设备100可以确定出以横向距离1710和纵向距离1720为两条直角边的直角三角形,该直角三角形具有斜边1725。然后,基于UI元素354的第一位移的第一方向354-d1,电子设备100可以在直角三角形内确定出UI元素343与UI元素354之间的距离1730。在具体的投影计算方式中,电子设备100可以根据第一方向354-d1与水平方向的夹角或者与竖直方向的夹角,利用三角函数的原理,计算出在第一方向354-d1上的距离。
在图17D的示例中,在基于第一方向354-d1的投影计算过程中,同时利用了横向距离1710和纵向距离1720。在其他实施例中,电子设备100可以根据第一方向354-d1的具体指向而仅使用横向距离1710和纵向距离1720之一,来确定UI元素343与UI元素354之间的 距离。例如,如图17E所示,电子设备100可以确定第一方向354-d1是更接近于屏幕300的水平方向还是竖直方向。如果第一方向354-d1更接近于水平方向,则电子设备100可以仅使用横向距离1710来确定UI元素343与UI元素354之间的距离。另一方面,如果第一方向354-d1更接近于竖直方向,则电子设备100可以仅使用纵向距离1720来确定UI元素343与UI元素354之间的距离。在图17E的示例中,假设第一方向354-d1更接近于水平方向,则基于与横向距离1710垂直的辅助线1712,电子设备100可以确定UI元素343与UI元素354之间的距离为1740。另外,假设第一方向354-d1更接近于竖直方向,则基于与纵向距离1720垂直的辅助线1722,电子设备100可以确定UI元素343与UI元素354之间的距离为1750。这样的计算方式在本文中也可以称为分段计算方式,也即按照横向间距和纵向间距和不同的方向进行不同的分段计算。更一般地说,电子设备100可以确定第一方向与水平方向和竖直方向之间的夹角,如果第一方向更偏向水平方向和竖直方向中的某一方向,就可以按照该方向来计算距离。例如,在第一方向更接近竖直方向时,可以按照与竖直方向有关的三角函数来计算弦边长度,也即距离。相反,在第一方向更接近水平方向时,可以按照与水平方向有关的三角函数来计算弦边长度,也即距离。
在上文参考图17A至图17E描述的示例中,受到“引力”作用影响的UI元素的第一移动的第一方向被用作参考方向,然后基于UI元素之间的横向间距和纵向间距之一或二者来确定两个UI元素之间的距离。然而,本公开的实施例不限于此,而是等同地适用于以任意方向作为参考方向,然后基于UI元素之间的横向间距和纵向间距之一或二者来确定两个UI元素之间的距离。例如,在一些实施例中,在基于横向间距和/或纵向间距来确定UI元素之间的距离的过程中,用于替代上文描述的UI元素的第一方向的参考方向可以包括但不限于,横向方向(例如,相对于屏幕300)、竖直方向(例如,相对于屏幕300)、或者某个固定的方向(例如,相对于屏幕300),等等。
本公开的实施例提出的基于间距的UI元素距离的计算方式可以更广泛地使用在不同大小的UI元素以相同间距布置的场景中。例如,在图17F的示例中,电子设备100的屏幕上可以显示各种尺寸的UI元素,例如,UI元素1760、UI元素1762、UI元素1764、UI元素1766等,其中UI元素1764最大,UI元素1760次之,UI元素1762再次之,UI元素1766最小。尽管UI元素1760至1766具有不同的大小,但是它们之间的横向间距1775和纵向间距1765可以是相同的。在图17F的示例中,所谓的间距可以是两个UI元素(例如,卡片)或者其他控件之间的边框的距离。但是,在考虑不同的第一方向的情况下,在同一种UI元素布局中,UI元素之间的间距可能不一样。以第一方向为投影方向,通过上文介绍的投影计算方式可以直接计算出所有UI元素(例如,控件)之间的间距。在通常的场景中,横向运动的距离即为横向间距1775,纵向运动的距离即为纵向间距1765。但是,在横向间距和纵向间距不规则的场景,每一个UI元素(例如,控件)的横向间距和纵向间距可能是不一样的,这个间距值可以是UI元素被布局时就确定的,并且可以跟随当前UI元素(例如,控件)的属性。当横向间距和纵向间距确定了之后,每个第一方向上的距离都可以根据这两个间距来计算。此外,在间距确定了之后,基于弹性运动的原理,电子设备100可以按需进行UI元素动画效果的链式联动。在链式联动的过程中,所有的参数都是可以调节的。整个传导公式可以按照各种相关的参数计算而得出的值进行UI元素的引力的相关的运动。通过示例处理过程1600,电子设备100可以基于UI元素之间的间距来确定UI元素之间的距离,从而提高距离确定方式的 灵活性和合理性,特别是在UI元素之间的间距基本保持一致的场景中。
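作为对上述基于间距的分段计算方式的一种示意性实现,以下代码草图根据第一方向与屏幕水平方向的夹角,选择使用横向间距或纵向间距,并利用三角函数计算两个UI元素之间的距离;其中的角度判断阈值等均为便于说明的假设,并非本公开限定的实现:
// hGap:横向间距,vGap:纵向间距
// angleRad:第一方向与屏幕水平方向之间的夹角(弧度,取0到二分之π)
static float spacingBasedDistance(float hGap, float vGap, double angleRad) {
    if (angleRad < Math.PI / 4) {
        // 第一方向更接近水平方向:以横向间距为邻边,按余弦计算弦边长度
        return (float) (hGap / Math.cos(angleRad));
    } else {
        // 第一方向更接近竖直方向:以纵向间距为邻边,按正弦计算弦边长度
        return (float) (vGap / Math.sin(angleRad));
    }
}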
如上文提到的,在一些实施例中,电子设备100的用户所操作的第一UI元素可以并不是对屏幕300上的所有UI元素都存在“引力”作用,也即存在“吸引力”或“排斥力”,而是存在一定的“引力”影响范围。以此方式,电子设备100可以将UI元素的“引力”影响范围设置为适当的大小,从而可以在保持“引力”动画效果符合自然规律的同时,减少电子设备100在实现“引力”动画效果时的计算量,节省计算资源。如本文中使用的,UI元素的“吸引力”或“排斥力”的影响范围(或影响区域)也可以称为引力范围、引力作用范围、引力影响范围,等等。应理解,UI元素的引力范围可以是具有任何形状的区域。在一些实施例中,UI元素的引力范围可以是一个以UI元素为中心的圆形区域。该设置符合于自然界的规律,因为在自然界中,物体的引力范围通常认为是以物体为中心的球体。当然,在一些实施例中,取决于用户的偏好或者具体的应用场景,UI元素的引力范围也可以被设置为其他规则的形状(例如,正方形)或不规则的形状,从而提高引力范围设置的灵活性。在一些实施例中,电子设备100可以将每个UI元素的引力范围设置为相同的,这样可以简化电子设备100关于UI元素的引力范围的计算过程。在其他实施例中,电子设备100可以根据UI元素的尺寸大小来设置UI元素的引力范围。以此方式,UI元素的“引力”作用范围的大小更加符合自然界的规律,因为在自然界中,在假定物体具有均匀密度的情况下,体积越大的物体对周围的物体将具有更大的引力。下文将参考图18A至图18C来进一步描述这样的实施例。图18A至图18C示出了根据本公开的实施例的在UI元素具有有限的“引力”范围的场景中所产生的“引力”动画效果的示意图。
具体地,电子设备100可以基于被操作的第一UI元素的尺寸来确定第一UI元素的影响区域。例如,在图18A的示例中,假设UI元素343是被操作的第一UI元素,则电子设备100可以根据UI元素343的尺寸来确定UI元素343具有影响区域(也即,引力范围)1800。也就是说,以吸引力或者排斥力的发生中心点为圆心,半径R范围内的UI元素将受到UI元素343的“引力”影响,电子设备100可以针对这些UI元素来实现位移动画,以模拟吸引力或者排斥力效果。例如,半径R可以与UI元素自身的尺寸大小有关,越大的UI元素R可以越大。在一些实施例中,UI元素的引力影响范围可以表示为(min,max)。也就是说,UI元素的大小可以认为是与“引力”范围的大小成正比,也即,可以推导出UI元素的“质量”与其“引力”范围成正比。引力影响范围的上下限的具体数值可以由应用侧进行设定,与被操作的UI元素的中心点的距离需要在这个区间内才会产生引力动画效果。在图18A描绘的示例中,UI元素343的影响区域1800被描绘为以UI元素343的中心点343-o为圆心半径为R的圆形区域。然后,电子设备100可以将屏幕300上的M个(在该示例中为24个)UI元素中在影响区域1800内的UI元素确定为将会受到UI元素343的“引力”作用影响的N个UI元素。例如,在图18A的示例中,在影响区域1800内的UI元素包括UI元素332、UI元素333、UI元素334、UI元素342、UI元素344、UI元素352、UI元素353和UI元素354。
如图18B所示,小黑点表示在UI元素343的影响区域1800内的UI元素332、UI元素333、UI元素334、UI元素342、UI元素344、UI元素352、UI元素353和UI元素354在“引力”动画效果开始之前所在的起始位置,而十字符号表示各个UI元素的当前位置。也就是说,在图18B示出的时刻,UI元素343周围的UI元素332、UI元素333、UI元素334、UI元素342、UI元素344、UI元素352、UI元素353和UI元素354已经在朝向UI元素343的第一 方向上移动了各自的目标距离,之后将开始在远离UI元素343的第二方向返回各自的起始位置。如图18C所示,小黑点表示UI元素343周围的UI元素332、UI元素333、UI元素334、UI元素342、UI元素344、UI元素352、UI元素353和UI元素354在“引力”动画效果开始之前所在的起始位置,而十字符号表示各个UI元素的当前位置。也就是说,在图18C示出的时刻,在影响区域1800内的UI元素343周围的UI元素332、UI元素333、UI元素334、UI元素342、UI元素344、UI元素352、UI元素353和UI元素354已经完成了远离UI元素343的第二移动而返回到各自的起始位置。相比之下,在UI元素343的影响区域1800之外的各个UI元素,包括UI元素311至314、UI元素321至324、UI元素331、UI元素341、UI元素351、以及UI元素361至364将不会受到UI元素343的“引力”影响,从而在“引力”动画效果期间可以保持不动。
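作为对上述“引力”影响范围的一种示意性实现,以下代码草图以被操作UI元素的中心点为参考,按照影响范围的下限与上限筛选出需要产生“引力”动画效果的N个UI元素;其中的UiElement类与字段名均为便于说明的假设,并非本公开限定的实现:
import java.util.ArrayList;
import java.util.List;

class UiElement {
    float centerX;
    float centerY;
    UiElement(float centerX, float centerY) { this.centerX = centerX; this.centerY = centerY; }
}

// operated:被操作的第一UI元素,all:屏幕上的全部M个UI元素
// minRange/maxRange:引力影响范围的下限与上限,与中心点的距离落在该区间内才产生引力动画效果
static List<UiElement> affectedElements(UiElement operated, List<UiElement> all,
                                        float minRange, float maxRange) {
    List<UiElement> affected = new ArrayList<>();
    for (UiElement e : all) {
        if (e == operated) {
            continue;                                // 被操作的UI元素本身不参与
        }
        float d = (float) Math.hypot(e.centerX - operated.centerX, e.centerY - operated.centerY);
        if (d >= minRange && d <= maxRange) {
            affected.add(e);
        }
    }
    return affected;
}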
图19A示出了根据本公开的实施例的基于“引力”传播速度来确定UI元素的“引力”动画效果开始的时间点的示例处理过程1900的流程图。在一些实施例中,处理过程1900可以由电子设备100来实现,例如可以由电子设备100的处理器110或处理单元配合其他组件(例如,显示屏194)来实现。在其他实施例中,处理过程1900也可以由具有屏幕以显示UI元素的其他设备来实现。为了便于说明,将以电子设备100执行处理过程1900为例,参考图3B至图3D来论述处理过程1900。
同时参考图3B和图19,在图19的框1902处,为了使第二UI元素344进行第一移动,电子设备100可以确定针对UI元素343的操作被执行的第一时间点T1。例如,电子设备100可以记录用户操作UI元素343的时间点。在图19的框1904处,基于预定速度s(例如,UI元素343的“引力”作用的传播速度)和第二UI元素与被操作的第一UI元素343之间的距离D,电子设备100可以确定第二UI元素开始第一移动的第二时间点T2与第一时间点T1之间的延迟T2-T1=Delay,例如该延迟可以计算如下:
Delay=D/s  (公式9)。
在图3D的示例中,假设第二UI元素是UI元素344,那么电子设备100可以确定与UI元素344相关联的第二时间点T2与第一时间点T1之间的延迟Delay-344为中心点344-o与中心点343-o之间的距离除以预定速度s。又例如,在图3D的示例中,假设第二UI元素是UI元素311,那么电子设备100可以确定与UI元素311相关联的第二时间点T2与第一时间点T1之间的延迟Delay-311为中心点311-o与中心点343-o之间的距离除以预定速度s。应明白,由于中心点311-o与中心点343-o之间的距离大于中心点344-o与中心点343-o之间的距离,所以延迟Delay-311将会大于延迟Delay-344。
在框1906处,电子设备100可以基于第一时间点T1和延迟Delay来确定第二UI元素开始第一移动的第二时间点T2。例如,在图3D的示例中,电子设备100可以将第一时间点T1加上延迟Delay-344,从而得出UI元素344开始进行第一移动的第二时间点T2-344。类似地,在图3D的示例中,电子设备100可以将第一时间点T1加上延迟Delay-311,从而得出UI元素311开始进行第一移动的第二时间点T2-311。应明白,由于延迟Delay-311大于延迟Delay-344,所以UI元素311的第二时间点T2-311将会晚于UI元素344的第二时间点T2-344。在框1908处,电子设备100可以在第二时间点T2使第二UI元素开始第一移动。例如,在图3D的示例中,电子设备100可以使UI元素344在第二时间点T2-344开始进行第一移动。类似地,在图3D的示例中,电子设备100可以使UI元素311在第二时间点T2-311开始进行 第一移动。应理解,由于UI元素311的第二时间点T2-311晚于UI元素344的第二时间点T2-344,所以UI元素311将会比UI元素344较晚地开始进行“引力”动画效果。也就是说,本公开的实施例的“引力”动画效果开始的时间点可以是与受影响的UI元素与被操作的UI元素之间的距离r成反比的,另外定义波的传输速度为s,应用侧可以自行调节。在这种情况下,第一波运动的UI元素(也即,在被操作的UI元素的影响范围内距离中心点最近的UI元素,假设距中心点的距离为r0)可以没有延时,r0也是可调参数,由应用侧确定。其他受影响的UI元素(例如,距被操作的UI元素的中心点的距离r)的延时可以是:
delay=(r-r0)/s  (公式10)。
通过示例处理过程1900,电子设备100的UI可以在视觉上呈现“引力”作用的联动,也即,“吸引力”或“排斥力”造成的移动随着距离进行传播,使得UI的动画效果更符合用户的使用习惯,从而进一步改进用户体验。
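作为对公式9与公式10的一种示意性实现,以下代码草图根据“引力”传播速度计算每个受影响UI元素开始第一移动的启动延迟;其中传播速度speed、最近距离r0等均为可调的假设参数,使用方式也仅为一种可能的示例:
// r:受影响UI元素与被操作UI元素中心点之间的距离
// r0:影响范围内距中心点最近的UI元素的距离(第一波运动,无延时)
// speed:"引力"传播速度(例如以像素/毫秒为单位),返回值为启动延迟(毫秒)
static long startDelayMs(float r, float r0, float speed) {
    float delay = (r - r0) / speed;       // delay = (r - r0) / s
    return Math.max(0L, (long) delay);
}
// 使用示例(基于安卓动画框架的一种可能用法):animator.setStartDelay(startDelayMs(r, r0, speed));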
图19B至图19E示出了根据本公开的实施例的在考虑到“引力”传播延迟的情况下受到“引力”影响的三个UI元素的不同位移时间变化曲线的比较的示意图。具体地,图19B示出了在上文参考图3C至图3F描述的示例中的UI元素344、UI元素324和UI元素311三个UI元素在受到UI元素343的“引力”影响下,在考虑到“引力”传播延迟的情况下的第一移动的位移时间曲线均为贝塞尔曲线的示意图。图19C示出了在上文参考图3C至图3F描述的示例中的UI元素344、UI元素324和UI元素311三个UI元素在受到UI元素343的“引力”影响下,在考虑到“引力”传播延迟的情况下的的第一移动的位移时间曲线均为反比例曲线的示意图。图19D示出了在上文参考图3C至图3F描述的示例中的UI元素344、UI元素324和UI元素311三个UI元素在受到UI元素343的“引力”影响下,在考虑到“引力”传播延迟的情况下的第二移动的位移时间曲线均为临界阻尼弹性力曲线的示意图。图19E示出了在上文参考图3C至图3F描述的示例中的UI元素344、UI元素324和UI元素311三个UI元素在受到UI元素343的“引力”影响下,在考虑到“引力”传播延迟的情况下的第二移动的位移时间曲线均为欠阻尼弹性力曲线的示意图。需要说明的是,图19B至图19E以示例性的方式描绘了三个UI元素的位移时间曲线,以说明不同UI元素在相同UI元素的“引力”影响下的第一位移和第二位移可以分别具有不同的位移时间曲线,并且开始第一移动或第二移动的时间之间具有时间差或延迟。图3C至图3F中描绘的受到UI元素343的“引力”影响的其他UI元素的第一位移和第二位移可以具有类似的位移时间曲线和延迟。
在图19B示出的位移时间曲线图中,横坐标表示时间,纵坐标表示位移(或距离),此前在图5中描绘的第二UI元素344的第一移动的位移时间曲线1910可以是二阶贝塞尔曲线,UI元素324的第一移动的位移时间曲线1912可以是二阶贝塞尔曲线,并且UI元素311的第一移动的位移时间曲线1914也可以是二阶贝塞尔曲线。注意,贝塞尔曲线1910、1912和1914可以具有不同的参数。例如,因为UI元素344与被操作的UI元素343的距离最近,所以UI元素344可以具有最大的目标距离D0-344,并且第一移动具有最早的开始时间t19-1。因为UI元素324与被操作的UI元素343的距离比UI元素344远,所以UI元素324可以具有比UI元素344的目标距离D0-344小的目标距离D0-324,并且第一移动的开始时间t19-2晚于UI元素344的第一移动的开始时间t19-1。因为UI元素311与被操作的UI元素343的距离比UI元素324远,所以UI元素311可以具有比UI元素324的目标距离D0-324小的目标距离D0-311,并且第一移动的开始时间t19-3晚于UI元素324的第一移动的开始时间t19-2。 同时参考图3C至图3E和图19B,在t19-1时刻,UI元素344在UI元素343的“引力”作用下,开始准备进行第一移动。在t19-2时刻,UI元素324在UI元素343的“引力”作用下,开始准备进行第一移动。在t19-3时刻,UI元素344、324和311在各自的第一方向上移动距离D1-344、D1-324和D1-311(在图19B的示例中为0,因为此时UI元素311尚未开始第一移动)。在t19-4时刻,UI元素311在UI元素343的“引力”作用下,开始准备进行第一移动。在t19-5时刻,UI元素344、324和311在各自的第一方向上移动距离D2-344、D2-324和D2-311。在t19-6时刻,UI元素344在第一方向上移动目标距离D0-344。在t19-7时刻,UI元素324在第一方向上移动目标距离D0-324。在t19-8时刻,UI元素311在第一方向上移动目标距离D0-311。需要说明的是,在具体的实现中,电子设备100可以根据屏幕300的刷新频率所对应的时间间隔在位移时间曲线1910、1912、1914上确定出UI元素344、324和311在每个时刻所在的位置,然后在不同的时刻在屏幕300上的对应位置处显示UI元素344、324和311,从而可以实现UI元素344、324和311进行各自的第一移动的动画效果。
在图19C示出的位移时间曲线图中,横坐标表示时间,纵坐标表示位移(或距离),此前在图5中描绘的第二UI元素344的第一移动的位移时间曲线1920可以是反比例曲线,UI元素324的第一移动的位移时间曲线1922可以是反比例曲线,并且UI元素311的第一移动的位移时间曲线1924也可以是反比例曲线。注意,反比例曲线1920、1922和1924可以具有不同的参数。例如,因为UI元素344与被操作的UI元素343的距离最近,所以UI元素344可以具有最大的目标距离D0-344,并且第一移动具有最早的开始时间t19-1。因为UI元素324与被操作的UI元素343的距离比UI元素344远,所以UI元素324可以具有比UI元素344的目标距离D0-344小的目标距离D0-324,并且第一移动的开始时间t19-2晚于UI元素344的第一移动的开始时间t19-1。因为UI元素311与被操作的UI元素343的距离比UI元素324远,所以UI元素311可以具有比UI元素324的目标距离D0-324小的目标距离D0-311,并且第一移动的开始时间t19-3晚于UI元素324的第一移动的开始时间t19-2。同时参考图3C至图3E和图19C,在t19-1时刻,UI元素344在UI元素343的“引力”作用下,开始准备进行第一移动。在t19-2时刻,UI元素324在UI元素343的“引力”作用下,开始准备进行第一移动。在t19-3时刻,UI元素311在UI元素343的“引力”作用下,开始准备进行第一移动。在t19-4时刻,UI元素344、324和311在各自的第一方向上移动距离D1-344、D1-324和D1-311。在t19-5时刻,UI元素344、324和311在各自的第一方向上移动距离D2-344、D2-324和D2-311。在t19-6时刻,UI元素344在第一方向上移动目标距离D0-344。在t19-7时刻,UI元素324在第一方向上移动目标距离D0-324。在t19-8时刻,UI元素311在第一方向上移动目标距离D0-311。需要说明的是,在具体的实现中,电子设备100可以根据屏幕300的刷新频率所对应的时间间隔在位移时间曲线1920、1922、1924上确定出UI元素344、324和311在每个时刻所在的位置,然后在不同的时刻在屏幕300上的对应位置处显示UI元素344、324和311,从而可以实现UI元素344、324和311进行各自的第一移动的动画效果。
在图19D示出的位移时间曲线图中,横坐标表示时间,纵坐标表示位移(或距离),此前在图5中描绘的第二UI元素344的第二移动的位移时间曲线1930可以是临界阻尼弹性力曲线,UI元素324的第二移动的位移时间曲线1932可以是临界阻尼弹性力曲线,并且UI元素311的第二移动的位移时间曲线1934也可以是临界阻尼弹性力曲线。在图19D的示例中,假设UI元素344、324和311各自的第一移动具有相同的持续时间,因此各自的第二移动的 开始时间之间的延迟将与各自的第一移动的开始时间之间的延迟相同。注意,临界阻尼弹性力曲线1930、1932和1934可以具有不同的参数。例如,因为UI元素344与被操作的UI元素343的距离最近,所以UI元素344可以具有最大的目标距离D0-344,并且第二移动具有最早的开始时间t19-9。因为UI元素324与被操作的UI元素343的距离比UI元素344远,所以UI元素324可以具有比UI元素344的目标距离D0-344小的目标距离D0-324,并且第二移动的开始时间t19-10晚于UI元素344的第一移动的开始时间t19-9。因为UI元素311与被操作的UI元素343的距离比UI元素324远,所以UI元素311可以具有比UI元素324的目标距离D0-324小的目标距离D0-311,并且第二移动的开始时间t19-11晚于UI元素324的第一移动的开始时间t19-10。同时参考图3E至图3F和图19D,在t19-9时刻,UI元素344在UI元素343的“引力”作用下,已经完成第一移动,开始准备进行第二移动。在t19-10时刻,UI元素324在UI元素343的“引力”作用下,已经完成第一移动,开始准备进行第二移动。在t19-11时刻,UI元素311在UI元素343的“引力”作用下,已经完成第一移动,开始准备进行第二移动。在t19-12时刻,UI元素344、324和311在各自的第二方向上移动距离D1-344、D1-324和D1-311。在t19-13时刻,UI元素344、324和311在各自的第二方向上移动距离D2-344、D2-324和D2-311。在t19-14时刻,UI元素344在第二方向上移动目标距离D0-344。在t19-15时刻,UI元素324在第二方向上移动目标距离D0-324。在t19-16时刻,UI元素311在第二方向上移动目标距离D0-311。需要说明的是,在具体的实现中,电子设备100可以根据屏幕300的刷新频率所对应的时间间隔在位移时间曲线1930、1932、1934上确定出UI元素344、324和311在每个时刻所在的位置,然后在不同的时刻在屏幕300上的对应位置处显示UI元素344、324和311,从而可以实现UI元素344、324和311进行各自的第二移动的动画效果。
在图19E示出的位移时间曲线图中,横坐标表示时间,纵坐标表示位移(或距离),此前在图5中描绘的第二UI元素344的第二移动的位移时间曲线1940可以是欠阻尼弹性力曲线,UI元素324的第二移动的位移时间曲线1942可以是欠阻尼弹性力曲线,并且UI元素311的第二移动的位移时间曲线1944也可以是欠阻尼弹性力曲线。在图19E的示例中,假设UI元素344、324和311各自的第一移动具有相同的持续时间,因此各自的第二移动的开始时间之间的延迟将与各自的第一移动的开始时间之间的延迟相同。注意,欠阻尼弹性力曲线1940、1942和1944可以具有不同的参数。例如,因为UI元素344与被操作的UI元素343的距离最近,所以UI元素344可以具有最大的目标距离D0-344,并且第二移动具有最早的开始时间t19-9。因为UI元素324与被操作的UI元素343的距离比UI元素344远,所以UI元素324可以具有比UI元素344的目标距离D0-344小的目标距离D0-324,并且第二移动的开始时间t19-10晚于UI元素344的第一移动的开始时间t19-9。因为UI元素311与被操作的UI元素343的距离比UI元素324远,所以UI元素311可以具有比UI元素324的目标距离D0-324小的目标距离D0-311,并且第二移动的开始时间t19-11晚于UI元素324的第一移动的开始时间t19-10。同时参考图3E至图3F和图19E,在t19-9时刻,UI元素344在UI元素343的“引力”作用下,已经完成第一移动,开始准备进行第二移动。在t19-10时刻,UI元素324在UI元素343的“引力”作用下,已经完成第一移动,开始准备进行第二移动。在t19-11时刻,UI元素311在UI元素343的“引力”作用下,已经完成第一移动,开始准备进行第二移动。在t19-12时刻,UI元素344、324和311在各自的第二方向上移动距离D3-344、D3-324 和D3-311。在t19-13时刻,UI元素344、324和311在各自的第二方向上移动距离D4-344、D4-324和D4-311。在t19-14时刻,UI元素344在第二方向上移动目标距离D0-344。在t19-15时刻,UI元素324在第二方向上移动目标距离D0-324。在t19-16时刻,UI元素311在第二方向上移动目标距离D0-311。注意,在图19E示出的示例中,UI元素344、324和311将会基于各自的欠阻尼弹性力曲线的位移时间曲线,而在各自的起始位置进行来回的“往复”运动。需要说明的是,在具体的实现中,电子设备100可以根据屏幕300的刷新频率所对应的时间间隔在位移时间曲线1940、1942、1944上确定出UI元素344、324和311在每个时刻所在的位置,然后在不同的时刻在屏幕300上的对应位置处显示UI元素344、324和311,从而可以实现UI元素344、324和311进行各自的第二移动的动画效果。需要说明的是,在图19E的示例中,由于受到UI元素343的“引力”作用影响的其他UI元素(例如UI元素344、324和311等)可以根据具有不同参数(例如,不同的开始时间,不同的目标距离等)的欠阻尼弹性力曲线来进行第二移动(在一些实施例中,也可以根据欠阻尼弹性力曲线进行第一移动),所以在这些UI元素进行“引力”动画效果期间,特别是在进行多次来回“往复”运动期间,这些UI元素可能发生“重叠”,也即一个UI元素可能会覆盖另外的一个或多个UI元素。在一些实施例中,如果不期望在UI元素的“引力”动画效果期间发生UI元素的“重叠”,则电子设备100可以选择类似图19B至图19D中描绘的位移时间曲线来控制UI元素的“引力”动画效果。还需要说明的是,更一般地,不限于图19E的示例,在本公开的一些实施例中,受到“引力”影响的各个UI元素的第一移动的目标距离(也即,移动幅度)是可设置的,所以在某些设置的情况下,在多个UI元素进行“引力”动画效果期间可能会发生UI元素的重叠。本公开的实施例并不排除这样的UI元素重叠。换句话说,不论UI元素在进行“引力”动画效果期间是否发生重叠,都应当认为落在本公开的实施例的范围之内。
如上文指出的,本公开的实施例提出的“引力”动画效果不限于在上文描述的UI元素被点击的示例操作场景,而是可以适用于对UI元素的各种其他操作的场景。例如,在一些实施例中,对第一UI元素的操作可以包括点击操作、移动操作、与其他UI元素合并操作、展开操作、删除操作,等等。以此方式,电子设备可以在与UI元素相关的几乎所有的操作中实现“引力”动画效果,从而在更多的操作场景中提升用户体验。下文将参考图20A至图20D、图21、图22A至图22D来描述UI元素被移动,并且与其他UI元素交换位置的示例场景中的“引力”动画效果。然后,将参考图23A至图23D来描述UI元素与其他UI元素合并的示例场景中的“引力”动画效果。接着,将参考图24A至图24D来描述UI元素被删除的示例场景中的“引力”动画效果。最后,将参考图25A至图25D来描述UI元素被展开的示例场景中的“引力”动画效果。
图20A至图20D示出了根据本公开的实施例在UI元素被移动并且与另一UI元素交换位置的场景中所产生的“引力”动画效果的示意图。如图20A所示,电子设备100的用户的手部370按住UI元素343,然后将UI元素343拖动到位于UI元素343的上方的UI元素333的附近。如图20B所示,响应于用户的手部370对UI元素343的操作,UI元素343与UI元素333交换位置。也就是说,在用户对UI元素343的上述操作之后,UI元素343将移动到UI元素333之前的位置,而UI元素333将移动到UI元素343之前的位置。更具体地,在图20A至图20D描绘的示例中,UI元素333最初位于第3行第3列的初始位置,而UI元素343最初位于第4行第3列的初始位置。如本文中使用的,“初始位置”可以是指在用户对 UI元素的操作之前,UI元素最初所位于的位置,其有别于上文描述的“引力”动画效果被触发时UI元素所位于的“起始位置”。
在图20B中,UI元素343已经完成与UI元素333的位置交换,因此UI元素343目前位于第3行第3列,而UI元素333位于第4行第3列。此时,由于被操作的UI元素343来到了新的位置,所以可以想象为此前的“引力”平衡状态被“打破”,从而将对周围的UI元素产生“引力”作用。具体地,在图20A至图20D描绘的示例中,UI元素343来到新位置之后对周围的UI元素产生的“引力”作用可以被设置为“排斥力”。也就是说,UI元素343周围的UI元素将首先在远离UI元素343的第一方向上进行第一位移,然后将在朝向UI元素343的第二方向上进行第二位移,从而返回到各自的起始位置。更具体地,UI元素311将沿着远离UI元素343的第一方向311-d1进行第一移动,UI元素312将沿着远离UI元素343的第一方向312-d1进行第一移动,UI元素313将沿着远离UI元素343的第一方向313-d1进行第一移动,UI元素314将沿着远离UI元素343的第一方向314-d1进行第一移动。类似地,UI元素321将沿着远离UI元素343的第一方向321-d1进行第一移动,UI元素322将沿着远离UI元素343的第一方向322-d1进行第一移动,UI元素323将沿着远离UI元素343的第一方向323-d1进行第一移动,UI元素324将沿着远离UI元素343的第一方向324-d1进行第一移动。
类似地,UI元素331将沿着远离UI元素343的第一方向331-d1进行第一移动,UI元素332将沿着远离UI元素343的第一方向332-d1进行第一移动,UI元素334将沿着远离UI元素343的第一方向334-d1进行第一移动。类似地,UI元素341将沿着远离UI元素343的第一方向341-d1进行第一移动,UI元素342将沿着远离UI元素343的第一方向342-d1进行第一移动,UI元素333将沿着远离UI元素343的第一方向333-d1进行第一移动,UI元素344将沿着远离UI元素343的第一方向344-d1进行第一移动。类似地,UI元素351将沿着远离UI元素343的第一方向351-d1进行第一移动,UI元素352将沿着远离UI元素343的第一方向352-d1进行第一移动,UI元素353将沿着远离UI元素343的第一方向353-d1进行第一移动,UI元素354将沿着远离UI元素343的第一方向354-d1进行第一移动。类似地,UI元素361将沿着远离UI元素343的第一方向361-d1进行第一移动,UI元素362将沿着远离UI元素343的第一方向362-d1进行第一移动,UI元素363将沿着远离UI元素343的第一方向363-d1进行第一移动,UI元素364将沿着远离UI元素343的第一方向364-d1进行第一移动。
在图20B的示例中,由于各个UI元素的大小是相同的,因此每个UI元素受到UI元素343的“引力”影响的大小(也即,第一移动的目标距离或幅度)可以随着UI元素与UI元素343的距离增大而减小。具体地,在图20B的示例中,假设UI元素之间的横向间距和纵向间距是相等的。因此,UI元素323、332、334与UI元素343的距离最近,因此第一移动的目标距离最大。UI元素322、324、342、344与UI元素343的接近程度次之(也即,距离更大),因此第一移动的目标距离也次之。UI元素313、331、353与UI元素343的接近程度再次之(也即,距离更大),因此第一移动的目标距离也再次之。UI元素312、314、321、341、352和354与UI元素343的接近程度再次之(也即,距离更大),因此第一移动的目标距离也再次之。UI元素311和351与UI元素343的接近程度再次之(也即,距离更大),因此第一移动的目标距离也再次之。UI元素363与UI元素343的接近程度再次之(也即,距离更 大),因此第一移动的目标距离也再次之。UI元素362和364与UI元素343的接近程度再次之(也即,距离更大),因此第一移动的目标距离也再次之。UI元素361距离UI元素343的距离最远,因此第一移动的目标距离也最小。需要说明的是,具体到每个UI元素在第一移动中的目标距离的大小,可以基于该UI元素与产生“引力”影响的UI元素之间的距离来确定,而两个UI元素之间的距离可以按照前文参考图8至图17F中描述的任何一种距离计算方式来确定。
例如,如图20C所示,小黑点表示除了UI元素343之外的各个UI元素在“引力”动画效果开始之前所在的起始位置,而十字符号表示各个UI元素的当前位置。也就是说,在图20C示出的时刻,除了UI元素343之外的各个UI元素已经在远离UI元素343的第一方向上移动了各自的目标距离,之后将在朝向UI元素343的第二方向上返回各自的起始位置。在图20C的示例中,由于各个UI元素的尺寸是相同的,所以某个UI元素受到UI元素343的“排斥力”的大小(也即目标距离的大小)可以取决于与该UI元素与UI元素343之间的距离。因此,如图20C中示意性地示出的,在各自的第一移动中,UI元素343周围的UI元素将根据与UI元素343的距离远近而具有不同的移动距离。例如,UI元素323比UI元素313更接近于UI元素343,所以UI元素323可以比UI元素313移动更大的目标距离。如图20D所示,小黑点表示除了UI元素343之外的各个UI元素在“引力”动画效果开始之前所在的起始位置,而十字符号表示各个UI元素的当前位置。也就是说,在图20D示出的时刻,除了UI元素343之外的各个UI元素已经完成了朝向UI元素343的第二移动而返回到各自的起始位置。
在一些实施例中,UI元素333可能将早于UI元素343达到新位置,也即当UI元素333到达第4行第3列的新位置时,UI元素343可能还未到达第3行第3列的新位置。在这样的情况下,到达新位置的UI元素333可以认为是“引力”平衡被打破的UI元素,因此将会受到周围其他UI元素的“引力”作用。例如,在UI元素333到达新位置之后,并且在UI元素343尚未到达新位置之前,UI元素333可以受到周围某个UI元素的“吸引力”而产生“引力”动画效果。下文将参考图21以及图22A至图22D来描述这样的实施例。当然,在UI元素333到达新位置之后,并且在UI元素343尚未到达新位置之前,本公开的实施例的“引力”动画效果不限于UI元素333受到周围某个UI元素的“吸引力”,而也可能是受到周围某个UI元素的“排斥力”、受到周围多个UI元素的“吸引力”或“排斥力”、或者可以是对周围的UI元素产生“吸引力”或“排斥力”,等等。
图21示出了根据本公开的实施例的在UI元素交换位置的场景中,先到达新位置的UI元素受到其他UI元素的“引力”作用而产生“引力”动画效果的示例处理过程2100的流程图。在一些实施例中,处理过程2100可以由电子设备100来实现,例如可以由电子设备100的处理器110或处理单元配合其他组件(例如,显示屏194)来实现。在其他实施例中,处理过程2100也可以由具有屏幕以显示UI元素的其他设备来实现。为了便于说明,将以电子设备100执行处理过程2100为例,参考图22A至图22D来论述处理过程2100。
图22A至图22D示出了根据本公开的实施例的在UI元素交换位置的场景中,先到达新位置的UI元素受到其他UI元素的“引力”作用而产生“引力”动画效果的示意图。需要说明的是,图22A至图22D描绘的场景在时间上处于上文描述的图20A与图20B之间。也就是说,图22A至图22D的场景发生在UI元素333已经到达新位置(也即UI元素343之前的位置),并且UI元素343尚未到达新位置(也即UI元素333之前的位置)的这段时间期间。 因此,如果将UI元素333考虑为图2的示例过程200中的第二UI元素,那么示例过程200中针对第二UI元素333的“引力”动画效果的目标距离指的是图20C中描绘的移动距离,下文中将称为第一目标距离。在图20A至20D描绘的“引力”动画效果之外,针对UI元素333的“引力”动画效果还将包括图22A至图22D描绘的“引力”动画效果。
同时参考图21和图22A,在图21的框2110处,电子设备100可以将第二UI元素333从初始位置移动到起始位置,起始位置可以是第一UI元素343的初始位置。例如,在图22A的示例中,第一UI元素343的初始位置是第4行第3列,而第二UI元素333的初始位置是第3行第3列。在电子设备100的用户使用手部370发起第一UI元素343与第二UI元素333的位置交换过程之后,第二UI元素333已经达到新位置第4行第3列,但是第一UI元素343还没有到达新位置第3行第3列。此时,由于第二UI元素333来到了新的位置,所以可以想象为第二UI元素333此前的“引力”平衡状态被“打破”,从而将受到周围的UI元素产生的“引力”作用。作为示例,如图22B所示,第二UI元素333将会受到下方的UI元素353的“吸引力”而产生“引力”动画效果。为了描述的便利,对第二UI元素333产生“引力”作用的UI元素353可以以称为第三UI元素。需要说明的是,尽管图22B中描绘的是下方的UI元素353对第二UI元素333产生“吸引力”,但是这仅是示意性的,无意以任何方式限制本公开的范围。在其他实施例中,对第二UI元素333的“吸引力”或“排斥力”可以来自于其他一个或多个UI元素,或者也可以是第二UI元素333对其他一个或多个UI元素产生“吸引力”或“排斥力”。
同时参考图21和图22B,在图21的框2120处,在第二UI元素333到达图22B中描绘的第一移动的起始位置(在该示例中为第4行第3列)之后,并且在开始远离第一UI元素343的第一移动之前,电子设备100可以确定第二UI元素333将在第三方向333-d3上移动的第二目标距离。在该示例中,第三方向333-d3是从第二UI元素333指向第三UI元素353的方向,也即第二UI元素333受到第三UI元素353的“吸引力”。在其他实施例中,第三方向333-d3也可以是从第三UI元素353指向第二UI元素333的方向,也即第二UI元素333受到第三UI元素353的“排斥力”。应理解,电子设备100可以按照前文描述的用于确定第一目标距离的相同或类似方式来确定第二目标距离,此处将不再赘述。
同时参考图21和图22C,在图21的框2130处,在第二UI元素333远离第一UI元素343的第一移动(例如,在图20C中描绘)之前,电子设备100可以使第二UI元素333从起始位置(例如,第4行第3列)沿第三方向333-d3以第二目标距离进行第三移动。例如,在图22C的示例中,由于第三方向333-d3是从第二UI元素333指向第三UI元素353的,所以第二UI元素333可以朝向第三UI元素353进行第三移动。如图22C所示,小黑点表示第二UI元素333在“引力”动画效果开始之前所在的起始位置,而十字符号表示第二UI元素333的当前位置。
同时参考图21和图22D,在图21的框2140处,在第二UI元素333的第三移动(例如,朝向第三UI元素353的移动)之后,并且在第二UI元素333的第一移动(例如,图20C中描绘的远离第一UI元素343的移动)之前,电子设备100可以使第二UI元素333沿与第三方向333-d3相反的第四方向(例如,远离第三UI元素353的方向)进行第四移动,以复位到起始位置(例如,第4行第3列)。如图22D所示,在第二UI元素333完成朝向第三UI元素353的第三移动和远离第三UI元素353的第四移动之后,第一UI元素343可能仍然没有到达新位置(例如,第3行第3列)。例如,这可能是因为用户的手部370保持对第一UI元素343的拖动操作而未放开。在这样的情况下,在一些实施例中,电子设备100可以使第二UI元素333重复地进行多次第三移动和第四移动,直到第一UI元素343到达新位置(例如,第3行第3列)。通过示例过程2100,尽管第二UI元素333没有被直接操作,但是第二UI元素333由于需要与第一UI元素343交换位置而来到新位置,从而受到其他UI元素(例如,第三UI元素353)的“引力”作用。因此,电子设备100可以更加充分和全面地展现出UI元素之间具有“引力”的动画效果,从而进一步提升用户体验。
图23A至图23D示出了根据本公开的实施例在UI元素被移动并且与另一UI元素合并的场景中所产生的“引力”动画效果的示意图。如图23A所示,电子设备100的用户的手部370按住UI元素343,然后将UI元素343拖动到与UI元素343的上方的UI元素333重叠。如图23B所示,响应于用户的手部370对UI元素343的操作,UI元素343与UI元素333开始进行UI元素合并(例如,产生一个新的文件夹)的动画效果。此时,由于被操作的UI元素343与UI元素333开始合并,所以可以想象为此前的“引力”平衡状态被“打破”,从而将对周围的UI元素产生“引力”作用。具体地,在图23A至图23D描绘的示例中,UI元素343开始与UI元素333合并时对周围的UI元素产生的“引力”作用可以被设置为“排斥力”。也就是说,UI元素343周围的UI元素将首先在远离UI元素343的第一方向上进行第一位移,然后将在朝向UI元素343的第二方向上进行第二位移,从而返回到各自的起始位置。更具体地,UI元素311将沿着远离UI元素343的第一方向311-d1进行第一移动,UI元素312将沿着远离UI元素343的第一方向312-d1进行第一移动,UI元素313将沿着远离UI元素343的第一方向313-d1进行第一移动,UI元素314将沿着远离UI元素343的第一方向314-d1进行第一移动。类似地,UI元素321将沿着远离UI元素343的第一方向321-d1进行第一移动,UI元素322将沿着远离UI元素343的第一方向322-d1进行第一移动,UI元素323将沿着远离UI元素343的第一方向323-d1进行第一移动,UI元素324将沿着远离UI元素343的第一方向324-d1进行第一移动。
类似地,UI元素331将沿着远离UI元素343的第一方向331-d1进行第一移动,UI元素332将沿着远离UI元素343的第一方向332-d1进行第一移动,UI元素334将沿着远离UI元素343的第一方向334-d1进行第一移动。类似地,UI元素341将沿着远离UI元素343的第一方向341-d1进行第一移动,UI元素342将沿着远离UI元素343的第一方向342-d1进行第一移动,UI元素344将沿着远离UI元素343的第一方向344-d1进行第一移动。类似地,UI元素351将沿着远离UI元素343的第一方向351-d1进行第一移动,UI元素352将沿着远离UI元素343的第一方向352-d1进行第一移动,UI元素353将沿着远离UI元素343的第一方向353-d1进行第一移动,UI元素354将沿着远离UI元素343的第一方向354-d1进行第一移动。类似地,UI元素361将沿着远离UI元素343的第一方向361-d1进行第一移动,UI元素362将沿着远离UI元素343的第一方向362-d1进行第一移动,UI元素363将沿着远离UI元素343的第一方向363-d1进行第一移动,UI元素364将沿着远离UI元素343的第一方向364-d1进行第一移动。
在图23B的示例中,由于各个UI元素的大小是相同的,因此每个UI元素受到UI元素343的"引力"影响的大小(也即,第一移动的目标距离或幅度)可以随着UI元素与UI元素343的距离增大而减小。具体地,在图23B的示例中,假设UI元素之间的横向间距和纵向间距是相等的。因此,UI元素323、332、334与UI元素343的距离最近,因此第一移动的目标距离最大。UI元素322、324、342、344与UI元素343的接近程度次之(也即,距离更大),因此第一移动的目标距离也次之。UI元素313、331、353与UI元素343的接近程度再次之(也即,距离更大),因此第一移动的目标距离也再次之。UI元素312、314、321、341、352和354与UI元素343的接近程度再次之(也即,距离更大),因此第一移动的目标距离也再次之。UI元素311和351与UI元素343的接近程度再次之(也即,距离更大),因此第一移动的目标距离也再次之。UI元素363与UI元素343的接近程度再次之(也即,距离更大),因此第一移动的目标距离也再次之。UI元素362和364与UI元素343的接近程度再次之(也即,距离更大),因此第一移动的目标距离也再次之。UI元素361距离UI元素343的距离最远,因此第一移动的目标距离也最小。需要说明的是,具体到每个UI元素在第一移动中的目标距离的大小,可以基于该UI元素与产生"引力"影响的UI元素之间的距离来确定,而两个UI元素之间的距离可以按照前文参考图8至图17F中描述的任何一种距离计算方式来确定。
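作为一个仅作示意的假设性计算草图(其中amplitude、impactRadius等参数名为本文假设,并非本公开限定的公式),目标距离随元素尺寸增大而增大、随与被操作元素的距离增大而减小的关系例如可以表达为:
// 假设性示意:基于元素尺寸与距离计算第一移动的目标距离。
static float computeTargetDistance(float elementSize, float distance,
                                   float amplitude, float impactRadius) {
    if (distance >= impactRadius) {
        return 0f; // 超出影响区域的元素不参与移动
    }
    float attenuation = 1f - distance / impactRadius; // 距离越大,该系数越小
    return amplitude * elementSize * attenuation;     // 尺寸越大,目标距离越大
}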
例如,如图23C所示,小黑点表示除了UI元素343和333之外的各个UI元素在"引力"动画效果开始之前所在的起始位置,而十字符号表示各个UI元素的当前位置。也就是说,在图23C示出的时刻,除了UI元素343和UI元素333之外的各个UI元素已经在远离UI元素343的第一方向上移动了各自的目标距离,之后将在朝向UI元素343的第二方向返回各自的起始位置。在图23C的示例中,由于各个UI元素的尺寸是相同的,所以某个UI元素受到UI元素343的"排斥力"的大小(也即目标距离的大小)可以取决于该UI元素与UI元素343之间的距离。因此,如图23C中示意性地示出的,UI元素343周围的UI元素将根据与UI元素343的距离远近而具有不同的移动距离。例如,UI元素323比UI元素313更接近于UI元素343,所以UI元素323可以比UI元素313移动更大的目标距离。如图23D所示,小黑点表示除了UI元素343和UI元素333之外的各个UI元素在"引力"动画效果开始之前所在的起始位置,而十字符号表示各个UI元素的当前位置。也就是说,在图23D示出的时刻,除了UI元素343和UI元素333之外的各个UI元素已经完成了朝向UI元素343的第二移动而返回到各自的起始位置。此外,如图23D进一步示出的,UI元素343和UI元素333已经完成了合并动画,形成了新的UI元素335。例如,新UI元素335可以是包括UI元素343和UI元素333两者的文件夹。
图24A至图24D示出了根据本公开的实施例在UI元素被删除的场景中所产生的"引力"动画效果的示意图。如图24A所示,电子设备100的用户可以执行操作以删除UI元素343,因此UI元素343开始进行以圆形形状逐渐变小直到消失的删除动画效果。需要说明的是,图24A至图24D中描绘的UI元素343被删除时的删除动画效果仅是示意性的,无意以任何方式限制本公开的范围。本公开的实施例也等同地适用于UI元素被删除时的任何删除动画效果。如图24B所示,响应于用户对UI元素343的删除操作,UI元素343开始变为较小的圆形形状并且不断地缩小。此时,由于被操作的UI元素343正在逐渐变小并且消失,所以可以想象为UI元素343此前的"引力"平衡状态被"打破",从而将对周围的UI元素产生"引力"作用。具体地,在图24A至图24D描绘的示例中,UI元素343开始变小和消失时对周围的UI元素产生的"引力"作用可以被设置为"吸引力"。也就是说,UI元素343周围的UI元素将首先在朝向UI元素343的第一方向上进行第一移动,然后将在远离UI元素343的第二方向上进行第二移动,从而返回到各自的起始位置。更具体地,在各个UI元素的第一移动中,UI元素311将沿着朝向UI元素343的第一方向311-d1进行第一移动,UI元素312将沿着朝向UI元素343的第一方向312-d1进行第一移动,UI元素313将沿着朝向UI元素343的第一方向313-d1进行第一移动,UI元素314将沿着朝向UI元素343的第一方向314-d1进行第一移动。类似地,UI元素321将沿着朝向UI元素343的第一方向321-d1进行第一移动,UI元素322将沿着朝向UI元素343的第一方向322-d1进行第一移动,UI元素323将沿着朝向UI元素343的第一方向323-d1进行第一移动,UI元素324将沿着朝向UI元素343的第一方向324-d1进行第一移动。
类似地,UI元素331将沿着朝向UI元素343的第一方向331-d1进行第一移动,UI元素332将沿着朝向UI元素343的第一方向332-d1进行第一移动,UI元素333将沿着朝向UI元素343的第一方向333-d1进行第一移动,UI元素334将沿着朝向UI元素343的第一方向334-d1进行第一移动。类似地,UI元素341将沿着朝向UI元素343的第一方向341-d1进行第一移动,UI元素342将沿着朝向UI元素343的第一方向342-d1进行第一移动,UI元素344将沿着朝向UI元素343的第一方向344-d1进行第一移动。类似地,UI元素351将沿着朝向UI元素343的第一方向351-d1进行第一移动,UI元素352将沿着朝向UI元素343的第一方向352-d1进行第一移动,UI元素353将沿着朝向UI元素343的第一方向353-d1进行第一移动,UI元素354将沿着朝向UI元素343的第一方向354-d1进行第一移动。类似地,UI元素361将沿着朝向UI元素343的第一方向361-d1进行第一移动,UI元素362将沿着朝向UI元素343的第一方向362-d1进行第一移动,UI元素363将沿着朝向UI元素343的第一方向363-d1进行第一移动,UI元素364将沿着朝向UI元素343的第一方向364-d1进行第一移动。
在图24B的示例中,由于各个UI元素的大小是相同的,因此每个UI元素受到UI元素343的“引力”影响的大小(也即,第一移动的目标距离或幅度)可以随着UI元素与UI元素343的距离增大而减小。具体地,在图24B的示例中,假设UI元素之间的横向间距和纵向间距是相等的。因此,UI元素333、342、344和353与UI元素343的距离最近,因此第一移动的目标距离最大。UI元素332、334、352和354与UI元素343的接近程度次之(也即,距离更大),因此第一移动的目标距离也次之。UI元素323、341、363与UI元素343的接近程度再次之(也即,距离更大),因此第一移动的目标距离也再次之。UI元素322、324、331、351、362和364与UI元素343的接近程度再次之(也即,距离更大),因此第一移动的目标距离也再次之。UI元素321和361与UI元素343的接近程度再次之(也即,距离更大),因此第一移动的目标距离也再次之。UI元素313与UI元素343的接近程度再次之(也即,距离更大),因此第一移动的目标距离也再次之。UI元素312和314与UI元素343的接近程度再次之(也即,距离更大),因此第一移动的目标距离也再次之。UI元素311距离UI元素343的距离最远,因此第一移动的目标距离也最小。需要说明的是,具体到每个UI元素在第一移动中的目标距离的大小,可以基于该UI元素与产生“引力”影响的UI元素之间的距离来确定,而两个UI元素之间的距离可以按照前文参考图8至图17F中描述的任何一种距离计算方式来确定。
例如,如图24C所示,小黑点表示除了UI元素343之外的各个UI元素在"引力"动画效果开始之前所在的起始位置,而十字符号表示各个UI元素的当前位置。也就是说,在图24C示出的时刻,除了UI元素343之外的各个UI元素已经完成了各自的第一移动,而在朝向UI元素343的第一方向上移动了各自的目标距离,之后将在远离UI元素343的第二方向上返回各自的起始位置。在图24C的示例中,由于各个UI元素的尺寸是相同的,所以某个UI元素受到UI元素343的"吸引力"的大小(也即目标距离的大小)可以取决于该UI元素与UI元素343之间的距离。因此,如图24C中示意性地示出的,UI元素343周围的UI元素将根据与UI元素343的距离远近而具有不同的移动距离。例如,UI元素323比UI元素313更接近于UI元素343,所以UI元素323可以比UI元素313移动更大的目标距离。此外,如图24C进一步示出的,已经变为圆形的UI元素343相比于图24B描绘的时刻进一步缩小。如图24D所示,小黑点表示除了UI元素343之外的各个UI元素在"引力"动画效果开始之前所在的起始位置,而十字符号表示各个UI元素的当前位置。也就是说,在图24D示出的时刻,除了UI元素343之外的各个UI元素已经完成了各自的第二移动,也即,远离UI元素343移动而返回到各自的起始位置。此外,如图24D进一步示出的,UI元素343已经完全消失,以表示其已经被删除。
图25A至图25D示出了根据本公开的实施例在UI元素被展开的场景中所产生的"引力"动画效果的示意图。如图25A所示,电子设备100的用户的手部370可以执行操作以展开UI元素343。例如,展开UI元素343的操作可能包括长按UI元素343,以打开与UI元素343有关的可供用户选择或查看的菜单,然后在打开的菜单中选择或查看展开的菜单。因此,UI元素343开始进行展开菜单的动画效果。需要说明的是,图25A至图25D中描绘的UI元素343被展开时的展开动画效果仅是示意性的,无意以任何方式限制本公开的范围。本公开的实施例也等同地适用于UI元素被展开时的任何展开动画效果。如图25B所示,响应于用户对UI元素343的展开操作,UI元素343的位置处开始出现展开的UI元素345,并且UI元素345将会逐渐变大,最终覆盖UI元素343且可能覆盖附近的其他UI元素。此时,由于被操作的UI元素343处正在出现新的UI元素345,所以可以想象为此前的"引力"平衡状态被"打破",从而将对周围的UI元素产生"引力"作用。具体地,在图25A至图25D描绘的示例中,UI元素343处开始出现UI元素345时对周围的UI元素产生的"引力"作用可以被设置为"排斥力"。也就是说,UI元素343周围的UI元素将首先在远离UI元素343的第一方向上进行第一移动,然后将在朝向UI元素343的第二方向上进行第二移动,从而返回到各自的起始位置。更具体地,在各个UI元素的第一移动中,UI元素311将沿着远离UI元素343的第一方向311-d1进行第一移动,UI元素312将沿着远离UI元素343的第一方向312-d1进行第一移动,UI元素313将沿着远离UI元素343的第一方向313-d1进行第一移动,UI元素314将沿着远离UI元素343的第一方向314-d1进行第一移动。类似地,UI元素321将沿着远离UI元素343的第一方向321-d1进行第一移动,UI元素322将沿着远离UI元素343的第一方向322-d1进行第一移动,UI元素323将沿着远离UI元素343的第一方向323-d1进行第一移动,UI元素324将沿着远离UI元素343的第一方向324-d1进行第一移动。
类似地,UI元素331将沿着远离UI元素343的第一方向331-d1进行第一移动,UI元素332将沿着远离UI元素343的第一方向332-d1进行第一移动,UI元素333将沿着远离UI元素343的第一方向333-d1进行第一移动,UI元素334将沿着远离UI元素343的第一方向334-d1进行第一移动。类似地,UI元素341将沿着远离UI元素343的第一方向341-d1进行第一移动,UI元素342将沿着远离UI元素343的第一方向342-d1进行第一移动,UI元素344将沿着远离UI元素343的第一方向344-d1进行第一移动。类似地,UI元素351将沿着远离UI元素343的第一方向351-d1进行第一移动,UI元素352将沿着远离UI元素343的第一方向352-d1进行第一移动,UI元素353将沿着远离UI元素343的第一方向353-d1进行第一移动,UI元素354将沿着远离UI元素343的第一方向354-d1进行第一移动。类似地,UI元素361将沿着远离UI元素343的第一方向361-d1进行第一移动,UI元素362将沿着远离UI元素343的第一方向362-d1进行第一移动,UI元素363将沿着远离UI元素343的第一方向363-d1进行第一移动,UI元素364将沿着远离UI元素343的第一方向364-d1进行第一移动。
在图25B的示例中,由于各个UI元素的大小是相同的,因此每个UI元素受到UI元素343的“引力”影响的大小(也即,第一移动的目标距离或幅度)可以随着UI元素与UI元素343的距离增大而减小。具体地,在图25B的示例中,假设UI元素之间的横向间距和纵向间距是相等的。因此,UI元素333、342、344和353与UI元素343的距离最近,因此第一移动的目标距离最大。UI元素332、334、352和354与UI元素343的接近程度次之(也即,距离更大),因此第一移动的目标距离也次之。UI元素323、341、363与UI元素343的接近程度再次之(也即,距离更大),因此第一移动的目标距离也再次之。UI元素322、324、331、351、362和364与UI元素343的接近程度再次之(也即,距离更大),因此第一移动的目标距离也再次之。UI元素321和361与UI元素343的接近程度再次之(也即,距离更大),因此第一移动的目标距离也再次之。UI元素313与UI元素343的接近程度再次之(也即,距离更大),因此第一移动的目标距离也再次之。UI元素312和314与UI元素343的接近程度再次之(也即,距离更大),因此第一移动的目标距离也再次之。UI元素311距离UI元素343的距离最远,因此第一移动的目标距离也最小。需要说明的是,具体到每个UI元素在第一移动中的目标距离的大小,可以基于该UI元素与产生“引力”影响的UI元素之间的距离来确定,而两个UI元素之间的距离可以按照前文参考图8至图17F中描述的任何一种距离计算方式来确定。
例如,如图25C所示,小黑点表示除了UI元素343之外的各个UI元素在"引力"动画效果开始之前所在的起始位置,而十字符号表示各个UI元素的当前位置。也就是说,在图25C示出的时刻,除了UI元素343之外的各个UI元素已经完成了各自的第一移动,而在远离UI元素343的第一方向上移动了各自的目标距离,之后将在朝向UI元素343的第二方向上返回各自的起始位置。在图25C的示例中,由于各个UI元素的尺寸是相同的,所以某个UI元素受到UI元素343的"排斥力"的大小(也即目标距离的大小)可以取决于该UI元素与UI元素343之间的距离。因此,如图25C中示意性地示出的,UI元素343周围的UI元素将根据与UI元素343的距离远近而具有不同的移动距离。例如,UI元素323比UI元素313更接近于UI元素343,所以UI元素323可以比UI元素313移动更大的目标距离。此外,如图25C进一步示出的,已经展开的UI元素345可能覆盖UI元素343以及周边的UI元素344、UI元素353和UI元素354,而使得这些UI元素不可见。如图25D所示,小黑点表示除了UI元素343之外的各个UI元素在"引力"动画效果开始之前所在的起始位置,而十字符号表示各个UI元素的当前位置。也就是说,在图25D示出的时刻,除了UI元素343之外的各个UI元素已经完成了各自的第二移动,也即,朝向UI元素343移动而返回到各自的起始位置。此外,如图25D进一步示出的,已经完全展开的UI元素345可能覆盖UI元素343以及周边的UI元素344、UI元素353和UI元素354,而使得这些UI元素不可见。
图26示出了根据本公开的实施例的"引力"动画效果相关联的UI框架动效与系统桌面之间的关系的示意图。如图26所示,UI框架动效2602可以提供引力动效能力2604。引力动效能力2604可以采用AAR形式2606、JAR形式2608和系统接口2610来提供。桌面2614可以对UI元素实现各种操作,例如移动操作2616、合并操作2618、展开操作2620、删除操作2622和其他操作2624。桌面2614可以通过集成2612的方式来使用UI框架动效2602提供的引力动效能力2604。尽管图26中未示出,但是桌面2614也可以通过调用(例如,系统接口2610)的方式来使用UI框架动效2602提供的引力动效能力2604。也就是说,UI框架可以采用AAR、JAR、系统接口的形式来提供"引力"动画效果的能力,桌面2614集成之后可以应用在领域内需要的各种场景。需要说明的是,尽管本公开的实施例主要以桌面的场景为例,但是UI框架主要提供"引力"动画效果的能力,所以"引力"动画效果可以实现在除了桌面以外的任何其他适当的场景中。
具体地,本公开的使用场景可以包括将排列好的UI元素(例如,图标)相互联系起来的任何场景,只要是多个UI元素针对某个UI元素被操作进行响应的场景都可以支持引力动效。比较常见的场景可以包括桌面中各种图标的操作,比如移动、合并、删除、展开等,可能的操作不限于上述的列举项,如果将来桌面提供针对UI元素的其他的功能或操作,也同样可以使用本公开的实施例提供的"引力"动画效果的能力。在这方面,需要说明的是,电子设备的系统桌面一般属于应用层,其可以集成或者调用UI框架的能力。UI框架对外的能力体现一般分为3种,其中平台能力一般包含AAR方式和JAR包的方式,这两种方式都是将代码封装好,提供给应用集成,它们可以不属于某一个层次,一般是在应用中集成使用,跟随应用层一起发布。而系统能力一般包含系统接口,它属于应用程序框架层,可以是提供给上层应用的各种服务或者能力。
图27示出了本公开的实施例的“引力”动画效果能力或功能可以被应用到的其他应用场景的示意图。如图27所示,本公开的实施例提供的是一种能力,不限定具体的使用场景,各种类型的场景都可以使用。例如,这样的场景可以包括但不限于图库中的图片的列表2710,应用市场中的滑动列表2720,负一屏的卡片移动展开操作2730,以及多任务的卡片联动场景2740,等等。
图28示出了根据本公开的实施例的用于实现"引力"动画效果能力或功能的系统框架2800的示意图。在一些实施例中,UI框架的动效能力是基于电子设备的操作系统(例如,安卓®或者鸿蒙®)的整体架构来实现的,可以包含主流的4层的逻辑处理,数据处理的流程从底层往上呈现给用户。用户可以主要在应用层使用和体验动效的功能。在本公开的实施例中,桌面和UI框架的能力交互关系如图28所描绘。具体地,如图28所示,系统框架2800可以包括应用程序层2810、应用程序框架层2830、硬件抽象层2850、以及内核层2870。应用程序层2810可以包括桌面2812。桌面2812上可以实现图标操作2814。图标操作2814例如可以包括移动操作、合并操作、展开操作、删除操作和其他操作。应用程序框架层2830可以包括系统服务2832和扩展服务2834。系统服务2832可以包括各种系统服务,例如Service 2833。扩展服务2834可以包括各种扩展服务,例如HwSDK 2835。硬件抽象层(HAL)2850可以包括HAL 3.0 2852和算法Algo 2854。内核层2870可以包括驱动2872和物理设备2874。物理设备2874可以向驱动2872提供原始参数流,而驱动2872可以向物理设备2874提供功能处理参数流。如图28进一步示出的,用于实现引力动效2825的UI框架2820可以实现在应用程序层2810与应用程序框架层2830之间。UI框架2820可以包括平台能力2822和系统能力2824,两者可以用于提供引力动效2825。引力动效2825进而可以提供给应用程序层2810的图标操作2814。
图29示出了根据本公开的实施例的"引力"动画效果能力或功能所涉及到的应用侧和UI框架侧之间的关系的示意图。如图29所示,应用侧2910可以包括桌面2915,桌面2915上的UI元素可以实现移动2912、合并2914、展开2916、删除2918、其他2920等等操作。UI框架侧2950可以包括UI框架动效2952,UI框架动效2952可以实现引力动效能力2954,引力动效能力2954可以通过AAR格式2951、JAR格式2953和系统接口2955等方式来实现。应用侧2910可以通过集成2930和调用2940等方式来调用UI框架侧2950提供的"引力"动画效果能力或功能。通过应用侧2910和UI框架侧2950之间的交互,本公开的实施例实现了新型的"引力"动画效果,使得原本独立的UI元素(例如,图标)联系起来。
图30示出了根据本公开的实施例的"引力"动画效果能力或功能实现的三种方式的具体说明的示意图。如图30所示,AAR格式2951与电子设备100的系统之间的关系3001为:AAR格式2951是以二进制方式打包的能力,提供给系统中应用侧集成,可以自由控制版本节奏,不跟随系统。JAR格式2953与电子设备100的系统之间的关系3003为:JAR格式2953是以二进制方式打包的能力,提供给系统中所有部件,可以自由控制版本节奏,不跟随系统。系统接口2955与电子设备100的系统之间的关系3005为:系统接口2955是系统版本中的框架层的接口,提供给系统中所有部件,跟随系统升级。更具体地,集成方式可以是指AAR和JAR包的方式,调用方式可以是指系统接口的方式。因此,本公开的实施例应用的场景是不限于任何特定场景的,只是"引力"动画效果的能力的展现方式可能不一致。也就是说,本公开在前文描述的各种方法的功能可以通过AAR格式文件、JAR格式文件和/或电子设备100的系统接口来实现。以此方式,"引力"动画效果的能力或功能可以简单和方便地被实现并提供给电子设备的应用程序,例如桌面。
在本公开的实施例中,接口设计与方案实现包括引力模型能力的设计与实现。以下是引力模型能力的设计与实现的一种示例。
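以下提供一个仅作说明的假设性草图,示意引力模型能力的一种可能组织方式;类名沿用下文图31中的GravityField与GravityAsteroid,其余字段与方法均为本文为说明而假设,并非本公开原文给出的实现:
import java.util.ArrayList;
import java.util.List;

// 假设性草图:引力场及场中各UI元素的最简模型。
interface GravityAsteroid {
    float getX();                       // 元素起始位置的中心横坐标
    float getY();                       // 元素起始位置的中心纵坐标
    void setOffset(float dx, float dy); // 由动效能力侧回调,设置相对起始位置的偏移
}

class GravityField {
    private final float centerX;        // 引力中心(被操作UI元素)的坐标
    private final float centerY;
    private final float impactRadius;   // 引力范围
    private final List<GravityAsteroid> asteroids = new ArrayList<>();

    GravityField(float centerX, float centerY, float impactRadius) {
        this.centerX = centerX;
        this.centerY = centerY;
        this.impactRadius = impactRadius;
    }

    void addAsteroid(GravityAsteroid asteroid) {
        asteroids.add(asteroid);
    }

    // 按当前动画进度progress(0到1)计算每个受影响元素的偏移量,
    // attract为true表示吸引(朝向引力中心),为false表示排斥(远离引力中心)。
    void applyGravity(float progress, float amplitude, boolean attract) {
        for (GravityAsteroid asteroid : asteroids) {
            float dx = centerX - asteroid.getX();
            float dy = centerY - asteroid.getY();
            float distance = (float) Math.hypot(dx, dy);
            if (distance == 0f || distance >= impactRadius) {
                continue; // 引力中心自身或范围之外的元素不移动
            }
            float target = amplitude * (1f - distance / impactRadius); // 距离越远,目标距离越小
            float sign = attract ? 1f : -1f;
            asteroid.setOffset(sign * progress * target * dx / distance,
                               sign * progress * target * dy / distance);
        }
    }
}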
相关参数的意义可以参考下文的使用说明以及结合图33描述的各项设置,例如引力速度、引力范围、引力时长、恢复时长、振幅系数、恢复刚性和恢复阻尼等。
图31示出了根据本公开的实施例的用于实现"引力"动画效果的动效能力侧的类图关系的示意图。如图31所示,动效能力侧可以包括GravityAnimator类3110,GravityAnimator类3110可以包括GravityField类3120,并且GravityField类3120可以包括GravityAsteroid类3122、GravityAsteroid类3124、……、GravityAsteroid类3126。更一般地,应用侧的布局设计可以任意自由地组合。在图31示出的动效能力侧的类图关系中,GravityAnimator类3110可以是整个引力的动画类,GravityField类3120可以相当于是整个引力场景的区域,而GravityAsteroid类3122至3126可以相当于是每个引力场中的所有UI元素。
图32示出了根据本公开的实施例的应用侧和动效能力侧用于实现"引力"动画效果的操作时序图。如图32所示,应用侧3210可以包括GravityDemo类3212和View类3214,而动效能力侧3250可以包括GravityAnimator类3110、GravityField类3120和GravityAsteroid类3122。具体地,应用侧可以组织图形的示意,动效能力侧可以提供具体能力。各个操作的时序图如图32所描绘。操作流程可以包括:首先,在第一步骤中,在初始化时传入父布局,向所有被影响的UI元素(也称为子元素)设置监听回调。然后,在第二步骤中,向android.view.Choreographer中注册回调,每一帧更新每个受影响元素位置。接着,在第三步骤中,每一帧根据时间计算插值器的值,计算当前时刻当前元素的位置,通过第一步骤中的回调传给子元素。此后,在第四步骤中,子元素在回调中更新位置。
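针对上述第二步骤和第三步骤,以下给出一个仅作示意的假设性片段,说明如何借助android.view.Choreographer按帧计算插值并更新受影响元素的位置,其中startTimeNanos、duration、interpolator、amplitude和field等成员均为本文假设:
import android.view.Choreographer;

// 假设性示意:每一帧根据已经过的时间计算插值器的值,并据此更新各子元素的位置。
Choreographer.FrameCallback frameCallback = new Choreographer.FrameCallback() {
    @Override
    public void doFrame(long frameTimeNanos) {
        float elapsedMs = (frameTimeNanos - startTimeNanos) / 1_000_000f;
        float fraction = Math.min(elapsedMs / duration, 1f);
        float progress = interpolator.getInterpolation(fraction); // 插值器的值
        field.applyGravity(progress, amplitude, true); // 通过回调把当前位置传给子元素
        if (fraction < 1f) {
            Choreographer.getInstance().postFrameCallback(this); // 继续注册下一帧
        }
    }
};
Choreographer.getInstance().postFrameCallback(frameCallback);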
对接接口的示例设计可以如下所示:
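以下是根据下文第1至4步的使用说明推测的一种假设性接口草图,方法名取自下文的调用示例,参数类型与返回值均为本文假设,并非本公开限定的接口定义:
import android.animation.TimeInterpolator;
import android.view.ViewGroup;

// 假设性草图:对接接口的可能形态,签名仅作示意。
public class GravityAnimator {
    // 构造引力动画:pos为引力中心位置,viewContainer为父布局,type为引力类型(例如GRAVITATION)
    public GravityAnimator(int[] pos, ViewGroup viewContainer, int type) { }

    public void setImpactRadius(float radius) { }                        // 设置引力范围
    public void setDuration(long durationMs) { }                         // 设置引力动画(第一移动)时长
    public void setResetDuration(long durationMs) { }                    // 设置恢复动画(第二移动)时长
    public void setInterpolator(TimeInterpolator interpolator) { }       // 设置引力动画的插值器
    public void setResetInterpolator(TimeInterpolator interpolator) { }  // 设置恢复动画的插值器
    public void start() { }                                              // 启动动画
    // 此外还可以包括动画监听器的设置接口,其具体签名未在文本中给出,参见下文第3步的示意。
}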
具体的使用说明如下:
1、通过构造函数创建引力动画GravityAnimator
mGravityAnimator=new GravityAnimator(pos,mViewContainer,GRAVITATION);
2、设置引力动画、恢复动画的插值器、时长等参数
mGravityAnimator.setImpactRadius(800);
mGravityAnimator.setDuration(150);
mGravityAnimator.setResetDuration(300);
mGravityAnimator.setInterpolator(PathInterpolatorCompat.create(0.4,0,1,));
mGravityAnimator.setResetInterpolator(new SpringInterpolator(mStiffness,mDamping));
3、设置动画监听器等参数
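作为第3步的一种可能写法,以下给出仅作示意的假设性片段,其中setAnimatorListener与GravityAnimatorListener等名称均为本文假设,并非本公开限定的接口:
// 假设性示意:为引力动画设置开始/结束监听,以便在动画期间协调其他交互。
mGravityAnimator.setAnimatorListener(new GravityAnimatorListener() {
    @Override
    public void onAnimationStart() {
        // 动画开始,例如可以暂时屏蔽图标的点击响应
    }

    @Override
    public void onAnimationEnd() {
        // 动画结束,恢复正常交互
    }
});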
4、调用start启动动画
mGravityAnimator.start();
图33示出了根据本公开的实施例的用于调整"引力"动画效果的参数的界面的示意图。如图33所示,在用户触发了电子设备100的"引力"动画效果的参数设置功能之后,电子设备100可以在屏幕300上显示用于调整电子设备100的"引力"动画效果的设置区域3310。在设置区域3310中,用户可以设置"引力"动画效果是否为"正向",也即表现为被操作的UI元素对其他UI元素的"吸引力"。如果"引力"动画效果的"正向"设置被打开,那么进行"引力"动画效果的UI元素将首先被另一UI元素吸引,然后再返回到起始位置。相比之下,如果"引力"动画效果的"正向"设置被关闭,那么进行"引力"动画效果的UI元素将首先被另一UI元素排斥,然后再返回到起始位置。另外,用户还可以在设置区域3310中设置"引力"动画效果是否包括"删除"操作,设置引力速度(也即引力传播速度)、引力范围、引力时长(也即第一移动的时长)、恢复时长(也即第二移动的时长)、用于确定目标距离的振幅系数、相关的控制点的位置、恢复刚性(也即第二移动的位移时间曲线为使用弹性力曲线时的参数)、恢复阻尼(也即第二移动的位移时间曲线为使用弹性力曲线时的参数),等等。应当理解,图33中描绘的设置区域3310中的具体内容仅是示意性的,无意以任何方式限制本公开的范围。在其他实施例中,电子设备100向用户提供的"引力"动画效果的设置区域中可以设置与"引力"动画效果有关的任何其他参数。也就是说,由于"引力"动画效果的各种参数都是可以调整的,所以本公开的实施例提供了自调节验证的功能,将所有的参数都由用户或者应用自己来进行设定,查看效果,并进行调节。
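作为对引力速度(引力传播速度)这一参数的补充示意,各个UI元素开始第一移动的延迟例如可以按该元素与被操作元素的距离除以传播速度来估算。以下为本文假设的草图,并非本公开限定的计算方式:
// 假设性示意:距离越远的元素,其"引力"动画开始得越晚,从而形成由近及远传播的效果。
static long computeStartDelayMs(float distanceToOperatedElement, float propagationSpeedPxPerMs) {
    return (long) (distanceToOperatedElement / propagationSpeedPxPerMs);
}
// 例如:距离为400像素、传播速度为2像素/毫秒时,延迟约为200毫秒。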
本公开的实施例的图形界面显示方法可以应用于多种电子设备。示例性的,该电子设备例如可以为:移动手机、平板电脑(Tablet Personal Computer)、数码相机、个人数字助理(personal digital assistant,简称PDA)、导航装置、移动上网装置(Mobile Internet Device,MID)、可穿戴式设备(Wearable Device)、以及其他能够显示图形界面的设备等。此外,本公开的实施例的图形界面显示方案不仅可以作为应用程序(例如,桌面)的一个功能,也可以作为电子设备的操作系统的一个功能而实施。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件程序实现时,可以全部或部分地以计算机程序产品的形式实现。所述计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行所述计算机程序指令时,全部或部分地产生按照本公开的实施例所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,所述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线(DSL))或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心传输。所述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。该可用介质可以是磁性介质(例如,软盘、硬盘、磁带)、光介质(例如,DVD)或者半导体介质(例如固态硬盘Solid State Disk(SSD))等。
一般而言,本公开的各种示例实施例可以在硬件或专用电路、软件、逻辑,或其任何组合中实施。某些方面可以在硬件中实施,而其他方面可以在可以由控制器、微处理器或其他计算设备执行的固件或软件中实施。例如,在一些实施例中,本公开的各种示例(例如方法、装置或设备)可以部分或者全部被实现在计算机可读介质上。当本公开的实施例的各方面被图示或描述为框图、流程图或使用某些其他图形表示时,应理解此处描述的方框、装置、系统、技术或方法可以作为非限制性的示例在硬件、软件、固件、专用电路或逻辑、通用硬件或控制器或其他计算设备,或其某些组合中实施。
本公开还提供了存储在非瞬态计算机可读存储介质上的至少一种计算机程序产品。计算机程序产品包括计算机可执行指令,计算机可执行指令诸如包括在目标的物理或者虚拟处理器上的器件中执行的程序模块中,用以执行上文关于图4、图14和图15描述的示例方法或示例过程400、1400和1500。一般而言,程序模块可以包括例程、程序、库、对象、类、组件、数据结构等,其执行特定的任务或者实现特定的抽象数据结构。在各实施例中,程序模块的功能可以在所描述的程序模块之间合并或者分割。用于程序模块的计算机可执行指令可以在本地或者分布式设备内执行。在分布式设备中,程序模块可以位于本地和远程存储介质二者中。
用于实现本公开的方法的程序代码可以用一种或多种编程语言编写。这些计算机程序代码可以提供给通用计算机、专用计算机或其他可编程的数据处理装置的处理器,使得程序代码在被计算机或其他可编程的数据处理装置执行的时候,引起在流程图和/或框图中规定的功能/操作被实施。程序代码可以完全在计算机上、部分在计算机上、作为独立的软件包、部分在计算机上且部分在远程计算机上或完全在远程计算机或服务器上执行。在本公开的上下文中,计算机程序代码或相关数据可以由任何适当的载体来承载,以使设备、装置或处理器能够执行上文描述的各种过程和操作。载体的示例包括信号、计算机可读介质,等等。
本领域普通技术人员可以理解实现上述实施例方法中的全部或部分流程,该流程可以由计算机程序来指令相关的硬件完成,该程序可存储于计算机可读取存储介质中,该程序在执行时,可包括如上述各方法实施例的流程。而前述的存储介质包括:ROM或随机存储记忆体RAM、磁碟或者光盘等各种可存储程序代码的介质。计算机可读介质可以是计算机可读信号介质或计算机可读存储介质。计算机可读介质可以包括但不限于电子的、磁的、光学的、电磁的、红外的或半导体系统、装置或设备,或其任何合适的组合。机器可读存储介质的更详细示例包括带有一根或多根导线的电气连接、便携式计算机磁盘、硬盘、随机存取存储器(RAM)、只读存储器(ROM)、可擦除可编程只读存储器(EPROM或闪存)、光纤、便携式压缩盘只读存储器(CD-ROM)、光存储设备、磁存储设备,或其任何合适的组合。
另外,尽管操作以特定顺序被描绘,但这并不应该理解为要求此类操作以示出的特定顺序或以相继顺序完成,或者执行所有图示的操作以获取期望结果。在某些情况下,多任务或并行处理会是有益的。同样地,尽管上述论述包含了某些特定的实施细节,但这并不应解释为限制任何发明或权利要求的范围,而应解释为对可以针对特定发明的特定实施例的描述。本说明书中在分离的实施例的上下文中描述的某些特征也可以整合实施在单个实施例中。反之,在单个实施例的上下文中描述的各种特征也可以分离地在多个实施例或在任何合适的子组合中实施。
尽管已经以特定于结构特征和/或方法动作的语言描述了主题,但是应当理解,所附权利要求中限定的主题并不限于上文描述的特定特征或动作。相反,上文描述的特定特征和动作是作为实现权利要求的示例形式而被公开的。以上描述的各种示例和过程可以独立于彼此使用,或者可以按各种方式来组合。不同的组合和子组合旨在落入本公开的范围内,并且某些步骤或过程可以在一些实现中被省略。以上所述,仅为本公开的实施例的具体实施方式,但本公开的实施例的保护范围并不局限于此,任何在本公开的实施例揭露的技术范围内的变化或替换,都应涵盖在本公开的实施例的保护范围之内。因此,本公开的实施例的保护范围应以所述权利要求的保护范围为准。

Claims (21)

  1. 一种图形界面显示方法,包括:
    在电子设备的屏幕上显示M个用户界面UI元素,M为大于1的正整数;
    检测到作用于所述M个UI元素中的第一UI元素的操作;
    响应于所述操作,使所述屏幕上的N个UI元素中的每个UI元素产生动画效果,N为1与M-1之间的正整数,其中产生所述动画效果包括:
    确定所述N个UI元素中的第二UI元素将在第一方向上移动的目标距离,所述第一方向是从所述第二UI元素指向所述第一UI元素的方向或是从所述第一UI元素指向所述第二UI元素的方向;
    使所述第二UI元素从起始位置沿所述第一方向以所述目标距离进行第一移动;以及
    在所述第一移动之后,使所述第二UI元素沿与所述第一方向相反的第二方向进行第二移动,以复位到所述起始位置。
  2. 根据权利要求1所述的方法,其中确定所述目标距离包括:
    确定所述第二UI元素的尺寸;
    确定所述第二UI元素与所述第一UI元素之间的距离;以及
    基于所述尺寸和所述距离来确定所述目标距离。
  3. 根据权利要求2所述的方法,其中基于所述尺寸和所述距离来确定所述目标距离包括:
    使所述目标距离随着所述尺寸增大而增大并且随着所述距离增大而减小。
  4. 根据权利要求2所述的方法,其中确定所述距离包括:
    确定所述第一UI元素的第一中心点;
    确定所述第二UI元素的第二中心点;以及
    确定所述第一中心点与所述第二中心点之间的直线距离,作为所述距离。
  5. 根据权利要求2所述的方法,其中确定所述距离包括:
    确定所述第一UI元素的第一中心点;
    确定以所述第一中心点为圆心的具有各自半径的多个圆;
    确定所述第二UI元素与所述多个圆中的至少一个圆相交;以及
    将所述至少一个圆中半径最小的圆的半径确定为所述距离。
  6. 根据权利要求2所述的方法,其中确定所述距离包括:
    确定所述第一UI元素与所述第二UI元素之间的横向间距;
    确定所述第一UI元素与所述第二UI元素之间的纵向间距;以及
    基于所述横向间距和所述纵向间距中的至少一者和所述第一方向来确定所述距离。
  7. 根据权利要求1至6中任一项所述的方法,还包括:
    基于所述第一UI元素的尺寸来确定所述第一UI元素的影响区域;以及
    将所述M个UI元素中在所述影响区域内的UI元素确定为所述N个UI元素。
  8. 根据权利要求1至6中任一项所述的方法,还包括:
    将所述M个UI元素中除所述第一UI元素以外的M-1个UI元素确定为所述N个UI元素。
  9. 根据权利要求1至8中任一项所述的方法,其中所述第一移动持续的第一时长、所述第二移动持续的第二时长、以及所述第一移动和所述第二移动持续的总时长中的至少一个是可配置的。
  10. 根据权利要求1至9中任一项所述的方法,其中所述第二UI元素在所述第一移动和所述第二移动中的至少一者期间的移动的动画效果基于位移随时间变化的预定义曲线来确定。
  11. 根据权利要求10所述的方法,其中所述预定义曲线为贝塞尔曲线或弹性力曲线。
  12. 根据权利要求1至11中任一项所述的方法,其中所述第一移动和所述第二移动中的至少一个包括变加速直线运动。
  13. 根据权利要求1至12中任一项所述的方法,其中使所述第二UI元素进行所述第一移动包括:
    确定所述操作被执行的第一时间点;
    基于预定速度和所述第二UI元素与所述第一UI元素之间的距离,确定开始所述第一移动的第二时间点与所述第一时间点之间的延迟;
    基于所述第一时间点和所述延迟来确定所述第二时间点;以及
    在所述第二时间点使所述第二UI元素开始所述第一移动。
  14. 根据权利要求1至13中任一项所述的方法,其中所述操作包括使所述第一UI元素与所述第二UI元素交换位置,所述目标距离是第一目标距离,产生所述动画效果还包括:
    将所述第二UI元素从初始位置移动到所述起始位置,所述起始位置是所述第一UI元素的初始位置;
    在所述第二UI元素到达所述起始位置之后并且在所述第一移动之前,确定所述第二UI元素将在第三方向上移动的第二目标距离,所述第三方向是从所述第二UI元素指向第三UI元素的方向或是从所述第三UI元素指向所述第二UI元素的方向;
    在所述第一移动之前,使所述第二UI元素从所述起始位置沿所述第三方向以所述第二目标距离进行第三移动;以及
    在所述第三移动之后并且在所述第一移动之前,使所述第二UI元素沿与所述第三方向相反的第四方向进行第四移动,以复位到所述起始位置。
  15. 根据权利要求1至14中任一项所述的方法,其中产生所述动画效果还包括:
    在所述第一移动和所述第二移动中的至少一者期间缩小或放大所述第二UI元素的尺寸。
  16. 根据权利要求1至15中任一项所述的方法,其中所述第一方向从所述第二UI元素的第二中心点指向所述第一UI元素的第一中心点、或者从所述第一中心点指向所述第二中心点。
  17. 根据权利要求1至16中任一项所述的方法,其中所述操作包括以下至少一项:点击、移动、与其他UI元素合并、展开、以及删除。
  18. 根据权利要求1至17中任一项所述的方法,其中所述方法的功能通过AAR格式文件、JAR格式文件和所述电子设备的系统接口中的至少一者来实现。
  19. 一种电子设备,包括:处理器、以及存储有指令的存储器,所述指令在被所述处理器执行时使得所述电子设备执行根据权利要求1至18中任一项所述的方法。
  20. 一种计算机可读存储介质,所述计算机可读存储介质存储有指令,所述指令在被电子设备执行时使得所述电子设备执行根据权利要求1至18中任一项所述的方法。
  21. 一种计算机程序产品,所述计算机程序产品包括指令,所述指令在被电子设备执行时使得所述电子设备执行根据权利要求1至18中任一项所述的方法。
PCT/CN2022/086706 2021-04-20 2022-04-13 图形界面显示方法、电子设备、介质以及程序产品 WO2022222830A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110425565.4A CN113568549A (zh) 2021-04-20 2021-04-20 图形界面显示方法、电子设备、介质以及程序产品
CN202110425565.4 2021-04-20

Publications (1)

Publication Number Publication Date
WO2022222830A1 true WO2022222830A1 (zh) 2022-10-27

Family

ID=78161309

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/086706 WO2022222830A1 (zh) 2021-04-20 2022-04-13 图形界面显示方法、电子设备、介质以及程序产品

Country Status (2)

Country Link
CN (2) CN115469781B (zh)
WO (1) WO2022222830A1 (zh)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115469781B (zh) * 2021-04-20 2023-09-01 华为技术有限公司 图形界面显示方法、电子设备、介质以及程序产品
CN115220621A (zh) * 2021-04-20 2022-10-21 华为技术有限公司 图形界面显示方法、电子设备、介质以及程序产品
CN114995929B (zh) * 2021-11-17 2023-04-21 荣耀终端有限公司 一种弹窗的显示方法及装置
CN114115624A (zh) * 2021-11-25 2022-03-01 京东方科技集团股份有限公司 用户界面显示方法及装置
CN116431046A (zh) * 2022-01-04 2023-07-14 华为技术有限公司 用户界面显示方法、电子设备、介质以及程序产品
CN114428923A (zh) * 2022-01-26 2022-05-03 北京有竹居网络技术有限公司 弹窗效果的呈现方法、装置、电子设备及存储介质
CN115098207A (zh) * 2022-06-23 2022-09-23 北京字跳网络技术有限公司 图像显示方法、装置、电子设备及存储介质
CN116048361B (zh) * 2022-06-24 2024-04-12 荣耀终端有限公司 交互方法、可读存储介质和电子设备
CN117472485A (zh) * 2022-07-22 2024-01-30 华为技术有限公司 一种界面显示的方法以及电子设备

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102375588A (zh) * 2010-08-19 2012-03-14 上海博泰悦臻电子设备制造有限公司 通过电子设备屏幕的手势控制设备操作的方法和装置
CN104704494A (zh) * 2013-06-09 2015-06-10 苹果公司 管理具有多页面的文件夹的设备、方法和图形用户界面
CN104731458A (zh) * 2015-03-31 2015-06-24 努比亚技术有限公司 自动整理桌面图标的方法、装置及移动终端
US20150370447A1 (en) * 2014-06-24 2015-12-24 Google Inc. Computerized systems and methods for cascading user interface element animations
CN106325652A (zh) * 2015-06-19 2017-01-11 深圳创锐思科技有限公司 图形用户界面交互方法及触摸终端
CN112256165A (zh) * 2019-12-13 2021-01-22 华为技术有限公司 一种应用图标的显示方法及电子设备
CN113552987A (zh) * 2021-04-20 2021-10-26 华为技术有限公司 图形界面显示方法、电子设备、介质以及程序产品
CN113568549A (zh) * 2021-04-20 2021-10-29 华为技术有限公司 图形界面显示方法、电子设备、介质以及程序产品

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2060970A1 (en) * 2007-11-12 2009-05-20 Research In Motion Limited User interface for touchscreen device
KR20110032596A (ko) * 2009-09-23 2011-03-30 삼성전자주식회사 중력장 맵을 생성하여 포인터 이동에 이용하는 gui 제공방법 및 이를 적용한 디스플레이 장치
KR20140068410A (ko) * 2012-11-28 2014-06-09 삼성전자주식회사 물리 엔진 기반의 사용자 인터페이스를 제공하는 방법 및 그 전자 장치
CN105528166A (zh) * 2014-09-28 2016-04-27 联想(北京)有限公司 一种控制方法及装置
CN108694006B (zh) * 2017-04-11 2021-03-30 北京京东尚科信息技术有限公司 一种实现图标仿车轮滚动效果的方法和装置
CN112148168B (zh) * 2020-09-29 2022-07-08 维沃移动通信有限公司 图标的移动方法、装置和电子设备

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102375588A (zh) * 2010-08-19 2012-03-14 上海博泰悦臻电子设备制造有限公司 通过电子设备屏幕的手势控制设备操作的方法和装置
CN104704494A (zh) * 2013-06-09 2015-06-10 苹果公司 管理具有多页面的文件夹的设备、方法和图形用户界面
US20150370447A1 (en) * 2014-06-24 2015-12-24 Google Inc. Computerized systems and methods for cascading user interface element animations
CN104731458A (zh) * 2015-03-31 2015-06-24 努比亚技术有限公司 自动整理桌面图标的方法、装置及移动终端
CN106325652A (zh) * 2015-06-19 2017-01-11 深圳创锐思科技有限公司 图形用户界面交互方法及触摸终端
CN112256165A (zh) * 2019-12-13 2021-01-22 华为技术有限公司 一种应用图标的显示方法及电子设备
CN112987987A (zh) * 2019-12-13 2021-06-18 华为技术有限公司 一种应用图标的显示方法及电子设备
CN113552987A (zh) * 2021-04-20 2021-10-26 华为技术有限公司 图形界面显示方法、电子设备、介质以及程序产品
CN113568549A (zh) * 2021-04-20 2021-10-29 华为技术有限公司 图形界面显示方法、电子设备、介质以及程序产品

Also Published As

Publication number Publication date
CN113568549A (zh) 2021-10-29
CN115469781A (zh) 2022-12-13
CN115469781B (zh) 2023-09-01

Similar Documents

Publication Publication Date Title
WO2022222830A1 (zh) 图形界面显示方法、电子设备、介质以及程序产品
CN113552987B (zh) 图形界面显示方法、电子设备、介质以及程序产品
WO2021027725A1 (zh) 显示页面元素的方法和电子设备
WO2021115194A1 (zh) 一种应用图标的显示方法及电子设备
WO2021000841A1 (zh) 一种生成用户头像的方法及电子设备
WO2021135838A1 (zh) 一种页面绘制方法及相关装置
CN112053370A (zh) 基于增强现实的显示方法、设备及存储介质
WO2021180046A1 (zh) 图像留色方法及设备
WO2022247541A1 (zh) 一种应用程序动效衔接的方法及装置
CN110892371B (zh) 显示控制方法和终端
US20230351665A1 (en) Animation Processing Method and Related Apparatus
WO2022222931A1 (zh) 图形界面显示方法、电子设备、介质以及程序产品
WO2022247542A1 (zh) 一种动效计算方法及装置
US20240111403A1 (en) Page sliding processing method and related apparatus
WO2023130977A1 (zh) 用户界面显示方法、电子设备、介质以及程序产品
CN111722896B (zh) 动画播放方法、装置、终端以及计算机可读存储介质
CN117290004A (zh) 组件预览的方法和电子设备
US20240353972A1 (en) User interface display method, electronic device, medium, and program product
WO2022222831A1 (zh) 图形界面显示方法、电子设备、介质以及程序产品
WO2022042734A1 (zh) 一种页面滑动的处理方法及相关装置
WO2024099206A1 (zh) 一种图形界面处理方法以及装置
WO2023040613A1 (zh) 人机交互方法、计算机可读介质和电子设备
CN117472485A (zh) 一种界面显示的方法以及电子设备
CN118767432A (zh) 虚拟对象的调整方法、装置、设备、介质及程序产品
CN116700555A (zh) 动效处理方法及电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22790938

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22790938

Country of ref document: EP

Kind code of ref document: A1