WO2022222931A1 - Graphical interface display method, electronic device, medium, and program product - Google Patents

Graphical interface display method, electronic device, medium, and program product

Info

Publication number
WO2022222931A1
WO2022222931A1 (PCT/CN2022/087751)
Authority
WO
WIPO (PCT)
Prior art keywords
elements
distance
animation
present disclosure
electronic device
Prior art date
Application number
PCT/CN2022/087751
Other languages
English (en)
Chinese (zh)
Inventor
卞超
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司
Publication of WO2022222931A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0483Interaction with page-structured environments, e.g. book metaphor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00Animation
    • G06T13/802D [Two Dimensional] animation, e.g. using sprites

Definitions

  • the present disclosure generally relates to the field of information technology, and more particularly, to a graphical interface display method, an electronic device, a computer-readable storage medium, and a computer program product.
  • a graphical interface display method, electronic device, medium, and program product are provided, which can strengthen the connection between the animation effects of different UI elements, highlight the relationship between individual UI elements, and make the UI animation effects more consistent with the user's usage habits, thereby significantly improving the user experience.
  • a graphical interface display method is provided.
  • M user interface UI elements are displayed on the screen of the electronic device, where M is a positive integer greater than 1.
  • a press on a first UI element of the M UI elements is detected.
  • each of the N UI elements on the screen is changed with a corresponding animation effect, where N is a positive integer between 1 and M-1.
  • Making the N UI elements change with corresponding animation effects includes: determining a distance between the first UI element and a second UI element among the N UI elements; determining, based on the distance and the position of the press, an animation effect with which the second UI element changes; and changing the second UI element with the animation effect to visually indicate the press.
  • the connection between the animation effects of different UI elements can be strengthened, and the relationship between the individual UI elements can be highlighted, so that the UI animation effects are more in line with the user's usage habits, thereby significantly improving the user experience.
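  • To make the flow above concrete, here is a minimal sketch in Java of the press-linkage loop: detect a press on one element, compute each linked element's distance to it, and start a distance-dependent animation. All names (UiElement, onPress, animate) and the center-point distance are illustrative assumptions, not the patent's API.

```java
import java.util.List;

final class PressLinkage {
    /** Minimal stand-in for a UI element: a center point and a size. */
    static final class UiElement {
        final double cx, cy, width, height;
        UiElement(double cx, double cy, double width, double height) {
            this.cx = cx; this.cy = cy; this.width = width; this.height = height;
        }
    }

    /** Press on 'first': change each of the N linked elements with an
     *  animation effect parameterized by its distance to 'first'. */
    static void onPress(UiElement first, List<UiElement> linked,
                        double pressX, double pressY) {
        for (UiElement second : linked) {
            // Distance between the two elements' reference points (centers here).
            double distance = Math.hypot(second.cx - first.cx, second.cy - first.cy);
            animate(second, distance, pressX, pressY);
        }
    }

    static void animate(UiElement second, double distance,
                        double pressX, double pressY) {
        // Placeholder: derive magnitude and delay from 'distance' and the press
        // position, then start the animation (see the sketches further below).
        System.out.printf("element at (%.0f, %.0f): distance %.1f%n",
                          second.cx, second.cy, distance);
    }
}
```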
  • a first reference point of the first UI element and a second reference point of the second UI element may be determined; and a distance between the first reference point and the second reference point may be determined as the distance.
  • the distance can be determined based on the determined reference point of the UI element, thereby improving the accuracy of the determined distance and the flexibility of the way of determining the distance, thereby improving the user experience.
  • a first reference point of the first UI element may be determined; from a plurality of circles with respective radii centered on the first reference point, a target circle that intersects the second UI element and has the smallest radius is determined; and the radius of the target circle is determined as the distance.
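  • Geometrically, the smallest circle centered at the first reference point that intersects the second UI element has a radius equal to the shortest distance from that point to the element's bounds. A minimal sketch of that computation, assuming axis-aligned rectangular bounds (the method name is illustrative):

```java
final class CircleDistance {
    /** Radius of the smallest circle centered at (px, py) that intersects the
     *  axis-aligned rectangle [left, right] x [top, bottom]; 0 if the point
     *  already lies inside the rectangle. */
    static double minIntersectingRadius(double px, double py, double left,
                                        double top, double right, double bottom) {
        double dx = Math.max(Math.max(left - px, px - right), 0);
        double dy = Math.max(Math.max(top - py, py - bottom), 0);
        return Math.hypot(dx, dy);
    }
}
```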
  • the horizontal spacing between the first UI element and the second UI element can be determined; the vertical spacing between the first UI element and the second UI element can be determined; and the distance can be determined based on either of the following: at least one of the horizontal spacing and the vertical spacing; or at least one of the horizontal spacing and the vertical spacing, together with the direction from the second reference point of the second UI element to the first reference point of the first UI element.
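  • A sketch of the spacing-based variant: compute the horizontal and vertical gaps between the two elements' bounds and combine them into a single distance. Combining the two gaps with Math.hypot is one plausible choice; the text leaves the exact combination open.

```java
final class SpacingDistance {
    /** Gap between 1-D intervals [aLo, aHi] and [bLo, bHi]; 0 if they overlap. */
    static double gap(double aLo, double aHi, double bLo, double bHi) {
        return Math.max(0, Math.max(bLo - aHi, aLo - bHi));
    }

    /** Distance derived from the horizontal and vertical spacing of two rectangles. */
    static double distance(double l1, double t1, double r1, double b1,
                           double l2, double t2, double r2, double b2) {
        double horizontal = gap(l1, r1, l2, r2);
        double vertical = gap(t1, b1, t2, b2);
        return Math.hypot(horizontal, vertical); // one possible combination
    }
}
```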
  • the method may further include: determining an area of influence of the first UI element based on the size of the first UI element; and determining the UI elements within the area of influence among the M UI elements as the N UI elements. In this way, the UI elements that change in linkage with the pressed UI element can be determined based on its size, so that the UI animation effect is more consistent with the user's usage habits, thereby significantly improving the user experience.
  • the method may further include: determining the M-1 UI elements other than the first UI element among the M UI elements as the N UI elements. In this way, all UI elements on the screen except the pressed UI element change in linkage with it, so that the linked UI elements can be determined more simply and conveniently, and the UI animation effect is more consistent with the user's usage habits, thereby significantly improving the user experience.
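  • A sketch of both selection strategies for the N linked elements: all other elements, or only those inside an influence area derived from the pressed element's size. The influence-radius formula (a fixed multiple of the larger dimension) is an assumption for illustration.

```java
import java.util.ArrayList;
import java.util.List;

final class LinkedElementSelection {
    static final class Element { double cx, cy, width, height; }

    /** Strategy 1: every element except the pressed one. */
    static List<Element> allOthers(Element pressed, List<Element> all) {
        List<Element> linked = new ArrayList<>(all);
        linked.remove(pressed);
        return linked;
    }

    /** Strategy 2: elements whose centers lie within an influence area
     *  whose radius grows with the pressed element's size. */
    static List<Element> withinInfluence(Element pressed, List<Element> all) {
        double radius = 2.0 * Math.max(pressed.width, pressed.height); // assumed factor
        List<Element> linked = new ArrayList<>();
        for (Element e : all) {
            if (e != pressed
                    && Math.hypot(e.cx - pressed.cx, e.cy - pressed.cy) <= radius) {
                linked.add(e);
            }
        }
        return linked;
    }
}
```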
  • the animation effect may include a visually rocker-like movement relative to the position of the press, or a visual depression or protrusion relative to the position of the press. In this way, the animation effect of the pressed UI element can be presented intuitively, so that the UI animation effect is more consistent with the user's usage habits, thereby significantly improving the user experience.
  • the first magnitude of the change of the first UI element in response to the press may be determined; and the magnitude of the change of the second UI element in response to the press may be determined based on either of the following: the first magnitude and the distance; or at least one of the size of the second UI element and the size of the first UI element, together with the first magnitude and the distance.
  • In this way, the animation effect of the first UI element can be conducted to the second UI element, and the animation effect of the second UI element can be based on the distance between the two elements and the size of the second UI element, so that the UI animation effect is more consistent with the user's usage habits, thereby significantly improving the user experience.
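  • A sketch of such conduction: the second element's magnitude is the first magnitude attenuated with distance and weighted by the two elements' sizes. Both the decay function and the size weighting are illustrative assumptions; the text only requires that the magnitude depend on these inputs.

```java
final class MagnitudeConduction {
    /** Magnitude of the second element's change, derived from the first
     *  element's magnitude, the distance between them, and their sizes. */
    static double conductedMagnitude(double firstMagnitude, double distance,
                                     double firstSize, double secondSize) {
        double decay = 1.0 / (1.0 + distance / 100.0);             // assumed distance falloff
        double sizeFactor = Math.min(1.0, firstSize / secondSize); // larger elements react less
        return firstMagnitude * decay * sizeFactor;
    }
}
```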
  • the first magnitude of the change of the first UI element may be determined based on at least one of the following, associated with the first UI element: the size of the first UI element, the position of the first reference point of the first UI element, the range of amplitudes over which the first UI element can change, the position of the press, the duration of the press, and a predetermined pressing force.
  • In this way, the first magnitude of the change of the first UI element can be determined intuitively, reasonably, and flexibly based on various factors associated with the first UI element, thereby making the UI animation effect more consistent with the user's usage habits and significantly improving the user experience.
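  • A sketch of deriving the first magnitude from the listed factors; the specific weighting and the clamp to the allowed amplitude range are assumptions:

```java
final class FirstMagnitude {
    /** First element's change magnitude from press force, press duration, and
     *  element size, clamped to the element's allowed amplitude range. */
    static double compute(double elementSize, double pressDurationMs,
                          double pressForce, double minAmplitude, double maxAmplitude) {
        double raw = pressForce * Math.min(pressDurationMs, 300.0) / elementSize;
        return Math.max(minAmplitude, Math.min(maxAmplitude, raw));
    }
}
```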
  • changing the second UI element may include: determining a delay time based on the distance; and changing the second UI element in response to the delay time elapsing after the press occurs. In this way, the linkage change can be shown visually propagating with distance, so that the UI animation effect is more consistent with the user's usage habits, thereby significantly improving the user experience.
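  • A sketch of the distance-based delay: treat the press as a wave propagating at a fixed speed, so the second element starts changing distance/speed milliseconds after the press. The speed constant is an assumption; on Android, for example, the delayed start could then be scheduled with View.postDelayed.

```java
final class LinkageDelay {
    static final double PROPAGATION_SPEED_PX_PER_MS = 1.0; // assumed

    /** Milliseconds to wait after the press before changing the second element. */
    static long delayMillis(double distance) {
        return Math.round(distance / PROPAGATION_SPEED_PX_PER_MS);
    }
}
```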
  • causing the second UI element to change may include determining the speed at which the second UI element changes in response to the press based on a predefined curve of magnitude versus time. In this way, the change of the second UI element can be conveniently controlled based on a predefined amplitude-time curve, so that the UI animation effect is more consistent with the user's usage habits, thereby significantly improving the user experience.
  • the predefined curve may be a Bezier curve or an elastic force curve.
  • In this way, the change of the second UI element can be conveniently controlled based on the Bezier curve or the elastic force curve, so that the UI animation effect is more consistent with the user's usage habits, thereby significantly improving the user experience.
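  • A sketch of a cubic Bezier magnitude-versus-time curve with fixed endpoints (0,0) and (1,1), as used by common easing functions; the solver below inverts x(t) by bisection. The control points are caller-supplied; for instance, new CubicBezierCurve(0.25, 0.1, 0.25, 1.0) approximates the standard CSS "ease" timing.

```java
final class CubicBezierCurve {
    private final double x1, y1, x2, y2; // control points; endpoints are (0,0) and (1,1)

    CubicBezierCurve(double x1, double y1, double x2, double y2) {
        this.x1 = x1; this.y1 = y1; this.x2 = x2; this.y2 = y2;
    }

    /** Cubic Bezier coordinate with endpoints fixed at 0 and 1. */
    private static double bezier(double t, double p1, double p2) {
        double u = 1 - t;
        return 3 * u * u * t * p1 + 3 * u * t * t * p2 + t * t * t;
    }

    /** Maps an elapsed-time fraction x in [0, 1] to a magnitude fraction. */
    double value(double x) {
        double lo = 0, hi = 1, t = x;
        for (int i = 0; i < 30; i++) {         // bisection: find t with bezierX(t) = x
            if (bezier(t, x1, x2) < x) lo = t; else hi = t;
            t = 0.5 * (lo + hi);
        }
        return bezier(t, y1, y2);
    }
}
```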
  • the method may further include: restoring the changed second UI element to its original state.
  • In this way, the UI element can be restored after the press is released, so that the UI animation effect is more consistent with the user's usage habits, thereby significantly improving the user experience.
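  • A sketch of the restoration using an elastic (damped spring) curve: a critically damped spring returns the element to its original state without overshoot, while an under-damped spring reproduces the multi-rebound springback with decreasing amplitudes shown later in FIGS. 33A-33D. The parameter values are assumptions.

```java
final class SpringRestore {
    /** Critically damped return toward 0 from 'initial' (zero initial velocity):
     *  x(t) = initial * (1 + w*t) * e^(-w*t). No overshoot. */
    static double criticallyDamped(double initial, double omega, double t) {
        return initial * (1 + omega * t) * Math.exp(-omega * t);
    }

    /** Under-damped return (damping ratio 0 < zeta < 1): oscillates through 0
     *  with decaying amplitude, i.e., multiple rebounds that shrink over time. */
    static double underDamped(double initial, double omega, double zeta, double t) {
        double wd = omega * Math.sqrt(1 - zeta * zeta); // damped frequency
        return initial * Math.exp(-zeta * omega * t)
                * (Math.cos(wd * t) + (zeta * omega / wd) * Math.sin(wd * t));
    }
}
```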
  • the method may be implemented by at least one of an AAR format file, a JAR format file, and a system interface. In this way, a graphical interface display with linked changes can be realized simply and conveniently.
  • In a second aspect of the present disclosure, an electronic device is provided, which includes a processor and a memory storing instructions, wherein the instructions, when executed by the processor, cause the electronic device to perform any of the methods according to the first aspect and implementations thereof.
  • In a third aspect of the present disclosure, a computer-readable storage medium stores instructions, wherein the instructions, when executed by a processor, cause an electronic device to perform any of the methods according to the first aspect and implementations thereof.
  • In a fourth aspect of the present disclosure, a computer program product comprises instructions, wherein the instructions, when executed by a processor, cause an electronic device to perform any of the methods according to the first aspect and implementations thereof.
  • FIGS. 1A and 1B show schematic diagrams of the hardware structure and the software structure, respectively, of an electronic device that can implement embodiments of the present disclosure.
  • FIG. 2 shows a block diagram of another electronic device that may implement embodiments of the present disclosure.
  • FIGS. 3A-3C respectively illustrate schematic diagrams of example UIs according to some embodiments of the present disclosure.
  • FIG. 4 shows a schematic diagram of an example drag linkage in accordance with some embodiments of the present disclosure.
  • FIGS. 5A and 5B show schematic diagrams, respectively, of an example velocity-time curve and an example displacement-time curve of a friction model according to some embodiments of the present disclosure.
  • FIG. 6 shows a schematic diagram of examples of restricted and non-restricted mobile locations in accordance with some embodiments of the present disclosure.
  • FIGS. 7A-7C show schematic diagrams of examples of curves of spring deflection x versus time t for critically damped, under-damped, and over-damped states, respectively, according to some embodiments of the present disclosure.
  • FIG. 8 shows a flowchart of a graphical interface display method according to an embodiment of the present disclosure.
  • FIG. 9 shows a schematic diagram of an example of an area of influence of a UI element according to an embodiment of the present disclosure.
  • FIG. 10 shows a schematic diagram of an example of the determination of a distance according to an embodiment of the present disclosure.
  • FIGS. 11A-11C illustrate schematic diagrams of examples of distance determinations according to embodiments of the present disclosure.
  • FIG. 12 shows a schematic diagram of an example of distance determination according to an embodiment of the present disclosure.
  • FIG. 13 shows a schematic diagram of an example of the determination of distance according to an embodiment of the present disclosure.
  • FIGS. 14A and 14B illustrate schematic diagrams of examples of distance determination according to embodiments of the present disclosure.
  • FIG. 15 shows a schematic diagram of an example of a delay time according to an embodiment of the present disclosure.
  • FIG. 16A shows a schematic diagram of an example of a scene in which a UI element moves completely with the hand, according to an embodiment of the present disclosure.
  • FIG. 16B shows a schematic diagram of an example of a displacement-time curve of a scene in which a UI element moves completely with the hand, according to an embodiment of the present disclosure.
  • FIG. 17A shows a schematic diagram of an example of a scenario in which UI elements do not completely follow the hand, according to an embodiment of the present disclosure.
  • FIG. 17B shows a schematic diagram of an example of a displacement-time curve of a scene in which a UI element does not completely move with the hand, according to an embodiment of the present disclosure.
  • FIG. 18A shows a schematic diagram of an example of a scene in which a UI element moves completely with the hand, according to an embodiment of the present disclosure.
  • FIG. 18B shows a schematic diagram of an example of a displacement-time curve of a scene in which a UI element moves completely with the hand, according to an embodiment of the present disclosure.
  • FIG. 19A shows a schematic diagram of an example of a scene in which a UI element moves completely with the hand, according to an embodiment of the present disclosure.
  • FIG. 19B shows a schematic diagram of an example of a displacement-time curve of a scene in which a UI element moves completely with the hand, according to an embodiment of the present disclosure.
  • FIG. 20 shows a schematic diagram of an example of a UI element changing when pressed, according to some embodiments of the present disclosure.
  • FIG. 21 shows a schematic diagram of an example of a UI element changing when pressed at different locations, according to some embodiments of the present disclosure.
  • FIG. 22 shows a schematic diagram of an example of UI element changes at different pressing forces according to some embodiments of the present disclosure.
  • FIG. 23 shows a schematic diagram of an example of UI element changes at different press durations, according to some embodiments of the present disclosure.
  • FIG. 24 shows a schematic diagram of an example of UI elements changing at different sizes, according to some embodiments of the present disclosure.
  • FIG. 25 shows a flowchart of a graphical interface display method according to an embodiment of the present disclosure.
  • FIG. 26 shows a schematic diagram of an example of deep linkage of N UI elements according to an embodiment of the present disclosure.
  • FIG. 27 shows a schematic diagram of an example of an area of influence of a UI element according to an embodiment of the present disclosure.
  • FIG. 28 shows a schematic diagram of an example of distance-based scaling of UI elements, according to an embodiment of the present disclosure.
  • FIG. 29 shows a schematic diagram of an example of a delay time according to an embodiment of the present disclosure.
  • FIG. 30 shows a schematic diagram of an example of scaling of UI elements with a delay time in accordance with the present disclosure.
  • FIG. 31 shows a schematic diagram of an example of the displacement of movement of UI elements according to an embodiment of the present disclosure.
  • FIGS. 32A-32B illustrate schematic diagrams of examples of the restoration of UI elements without displacement and with displacement, respectively, according to embodiments of the present disclosure.
  • FIGS. 33A-33B respectively show schematic diagrams of an example of the restored size-time curve and displacement-time curve of a UI element with a springback effect, according to an embodiment of the present disclosure.
  • FIGS. 33C-33D respectively show schematic diagrams of an example of the restored size-time curve and displacement-time curve of a UI element with a springback effect of multiple rebounds with decreasing rebound amplitudes, according to an embodiment of the present disclosure.
  • FIG. 34 shows a schematic diagram of an example of a UI element that is a rigid body changing when pressed, according to some embodiments of the present disclosure.
  • FIG. 35 shows a schematic diagram of an example of the pressing and stretching of a spring that simulates the pressing of a UI element, according to some embodiments of the present disclosure.
  • FIG. 36 shows a schematic diagram of an example of a UI element that is a non-rigid body changing when pressed, according to some embodiments of the present disclosure.
  • FIG. 37 shows a schematic diagram of an example of UI element changes at different pressing forces, according to some embodiments of the present disclosure.
  • FIG. 38 shows a schematic diagram of an example of UI element changes at different press durations, according to some embodiments of the present disclosure.
  • FIG. 39 shows a schematic diagram of an example of UI elements changing at different sizes, according to some embodiments of the present disclosure.
  • FIG. 40 shows a flowchart of a graphical interface display method according to an embodiment of the present disclosure.
  • FIG. 41 shows a schematic diagram of an example of pressure linkage of N UI elements according to an embodiment of the present disclosure.
  • FIG. 43 shows a schematic diagram of an example of an area of influence of a UI element according to an embodiment of the present disclosure.
  • FIG. 44 shows a schematic diagram of an example of a distance-based UI element change according to an embodiment of the present disclosure.
  • FIG. 45 shows a schematic diagram of an example of a delay time according to an embodiment of the present disclosure.
  • FIG. 46A shows a schematic diagram of an example of restoration of UI elements according to an embodiment of the present disclosure.
  • FIG. 46B shows a schematic diagram of an example of an angle-time curve of recovery of a UI element with a bouncing effect, according to an embodiment of the present disclosure.
  • FIG. 46C shows a schematic diagram of an example of an angle-time curve of recovery of a UI element with a rebound effect of multiple rebounds with reduced rebound amplitudes, according to an embodiment of the present disclosure.
  • FIG. 47 shows a schematic diagram of animation implementation according to an embodiment of the present disclosure.
  • FIG. 48 shows a schematic diagram of a system framework for implementing the "linkage" animation effect capability or function according to an embodiment of the present disclosure.
  • FIG. 49 shows a schematic diagram of the relationship between the application side and the UI framework side involved in the "linkage" animation effect capability or function according to an embodiment of the present disclosure.
  • FIG. 50 shows a schematic diagram of a specific description of three ways of implementing the “linkage” animation effect capability or function according to an embodiment of the present disclosure.
  • the term "including" and variations thereof mean open-ended inclusion, i.e., "including but not limited to".
  • the term “or” means “and/or” unless specifically stated otherwise.
  • the term “based on” means “based at least in part on”.
  • the terms “embodiments” and “some embodiments” mean “at least some embodiments.”
  • the terms "first", "second", etc. are used to distinguish different objects and do not indicate an order of precedence, nor do they require that the "first" and "second" objects be of different types.
  • the term "UI” refers to the interface through which the user interacts with the application or operating system and exchanges information, which enables the conversion between the internal form of the information and the form acceptable to the user.
  • the UI of an application is source code written in a specific computer language, such as Java, extensible markup language (XML), etc.
  • the UI source code is parsed and rendered on the electronic device, and is finally presented as content that the user can recognize, such as images, text, buttons, and other UI elements.
  • the attributes and content of UI elements in the UI are defined by tags or nodes.
  • the UI elements contained in the UI are specified by nodes such as <Textview>, <ImgView>, and <VideoView>.
  • a node corresponds to a UI element or attribute in the UI. After parsing and rendering, the node is presented as user-visible content.
  • many applications, such as hybrid applications, often contain web pages in their UI.
  • a web page can be understood as a special UI element embedded in the UI of an application.
  • a web page is source code written in a specific computer language, such as hypertext markup language (HTML), cascading style sheets (CSS), JavaScript (JS), etc.
  • the web page source code can be loaded and displayed as user-identifiable content by a browser or a web page display component similar in function to a browser.
  • the specific content contained in a web page is also defined by tags or nodes in the source code of the web page. For example, HTML defines the elements and attributes of web pages through <p>, <img>, <video>, and <canvas>.
  • a UI element includes, but is not limited to: window, scrollbar, table view, button, menu bar, text box, navigation bar, toolbar, image, static text, widget, and other visual UI elements.
  • animations are essentially the real-time, frame-by-frame display of the UI or UI elements at the screen refresh rate; due to the persistence of human vision, the user perceives the picture as moving.
  • the animation transitions from the initial state of the animation to the final state of the animation after the animation time has elapsed.
  • the animation can be controlled by the animation type and animation transformation form.
  • animation types may include displacement animations, rotation animations, scale animations, and transparency animations, among others.
  • the animation transformation form can be controlled by controllers such as interpolators and estimators. Such a controller can be used to control the speed at which the animation transforms over the animation time.
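  • A minimal frame-loop sketch of how an interpolator controls the transformation speed: each frame maps the normalized elapsed time through the interpolator and applies the resulting progress. A real UI framework drives this from its display-refresh (vsync) callback rather than Thread.sleep; the 16 ms step approximates a 60 Hz refresh rate, and the CubicBezierCurve sketch shown earlier can serve as the interpolator.

```java
import java.util.function.DoubleConsumer;

final class AnimationLoop {
    /** Maps a time fraction in [0, 1] to a progress fraction in [0, 1]. */
    interface Interpolator { double interpolate(double input); }

    static void run(long durationMs, Interpolator interpolator,
                    DoubleConsumer applyProgress) throws InterruptedException {
        long start = System.currentTimeMillis();
        while (true) {
            double fraction = Math.min(1.0,
                    (System.currentTimeMillis() - start) / (double) durationMs);
            applyProgress.accept(interpolator.interpolate(fraction));
            if (fraction >= 1.0) break;
            Thread.sleep(16); // stand-in for the refresh-rate callback
        }
    }
}
```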
  • such an animation is only a combination of simple animation effects, which makes the animation effect monotonous, does not conform to the laws of physics, and does not take real usage scenarios and user usage habits into account.
  • an embodiment of the present disclosure proposes a new solution for displaying a graphical interface.
  • the embodiments of the present disclosure relate to the linkage of UI elements in the UI on animation effects, including drag linkage, depth linkage, and pressure linkage.
  • the manipulated target UI element can affect other UI elements that are not manipulated.
  • triggering the animation effect of the target UI element may jointly trigger the animation effect of one or more other UI elements, or even other UI elements in the entire UI.
  • the embodiments of the present disclosure can make the animation effect more consistent with the laws of physics and take real usage scenarios and user usage habits into account, thereby significantly improving the user experience.
  • FIG. 1A shows a schematic diagram of a hardware structure of an electronic device 100 that can implement embodiments of the present disclosure.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like.
  • the sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, and an ambient light sensor 180L, bone conduction sensor 180M, etc.
  • the structures shown in the embodiments of the present disclosure do not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown, or combine some components, or separate some components, or different component arrangements.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units; for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, thereby controlling instruction fetching and execution.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in the processor 110 is a cache memory. This memory may hold instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to use the instructions or data again, they can be called directly from the memory, which avoids repeated accesses and reduces the waiting time of the processor 110, thereby increasing the efficiency of the system.
  • the processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may contain multiple sets of I2C buses.
  • the processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flash, the camera 193 and the like through different I2C bus interfaces.
  • the processor 110 can couple the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate with each other through the I2C bus interface, so as to realize the touch function of the electronic device 100 .
  • the I2S interface can be used for audio communication.
  • the processor 110 may contain multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through a Bluetooth headset.
  • the PCM interface can also be used for audio communication, to sample, quantize, and encode analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus may be a bidirectional communication bus that converts the data to be transmitted between serial and parallel forms.
  • a UART interface is typically used to connect the processor 110 with the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interfaces include camera serial interface (CSI), display serial interface (DSI), etc.
  • the processor 110 communicates with the camera 193 through a CSI interface, so as to realize the photographing function of the electronic device 100 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to implement the display function of the electronic device 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface may be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like.
  • the GPIO interface can also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
  • the USB interface 130 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones to play audio through the headphones.
  • the interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiments of the present disclosure is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100 . While the charging management module 140 charges the battery 142 , the electronic device 100 can also be powered by the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, the wireless communication module 160, and the like.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 may provide a wireless communication solution including 2G/3G/4G/5G/6G etc. applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays an image or video through the display screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 110, and may be provided in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 may provide wireless communication solutions applied on the electronic device 100, including wireless local area network (WLAN) (such as wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) technologies.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2 .
  • the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), 5G and subsequent evolution standards, BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc.
  • the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a Beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
  • the electronic device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
  • the electronic device 100 may include one or N display screens 194 , where N is a positive integer greater than one.
  • the electronic device 100 may implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 193 .
  • When the shutter is opened, light is transmitted to the camera photosensitive element through the lens, where the optical signal is converted into an electrical signal; the photosensitive element transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye.
  • The ISP can also perform algorithm optimization on image noise, brightness, and skin tone, and can optimize parameters such as the exposure and color temperature of the shooting scene.
  • the ISP may be provided in the camera 193 .
  • the camera 193 is used to capture still images or video.
  • an optical image of the object is generated through the lens and projected onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • the digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the energy of the frequency point, and the like.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos of various encoding formats, for example, moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100 .
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save files such as music, video, etc. in an external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area may store data (such as audio data, phone book, etc.) created during the use of the electronic device 100 and the like.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
  • the electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playback, recording, etc.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be contacted and separated from the electronic device 100 by inserting into the SIM card interface 195 or pulling out from the SIM card interface 195 .
  • the electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card and so on. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the plurality of cards may be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as call and data communication.
  • the electronic device 100 employs an eSIM, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100 .
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiments of the present disclosure take a mobile operating system with a layered architecture as an example to illustrate the software structure of the electronic device 100.
  • FIG. 1B is a schematic diagram of a software structure of the electronic device 100 according to an embodiment of the present disclosure.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the operating system may be divided into four layers, which are, from top to bottom, an application layer, an application framework layer, an operating system runtime and system library, and a kernel layer.
  • the application layer can include a series of application packages. As shown in FIG. 1B , the application package may include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, and short message.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include window managers, content providers, view systems, telephony managers, resource managers, notification managers, and the like.
  • a window manager is used to manage window programs. The window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, etc.
  • Content providers are used to store and retrieve data and make these data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, and the like.
  • the view system includes visual controls, such as controls that display text, controls that display images, and so on. View systems can be used to build applications.
  • a display interface can consist of one or more views.
  • a display interface that includes an icon for an SMS notification may include a view for displaying text and a view for displaying images.
  • the phone manager is used to provide the communication function of the electronic device 100 . For example, the management of call status (including connecting, hanging up, etc.).
  • the resource manager provides various resources for the application, such as localized strings, icons, images, layout files, video files, etc.
  • the notification manager enables applications to display notification information in the status bar, which can be used to convey notification-type messages, and can disappear automatically after a brief pause without user interaction. For example, the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also display notifications in the status bar at the top of the system in the form of graphs or scroll bar text, such as notifications of applications running in the background, and notifications on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a prompt sound is issued, the electronic device vibrates, and the indicator light flashes.
  • the operating system runtime includes core libraries and virtual machines.
  • the operating system runtime is responsible for the scheduling and management of the operating system.
  • the core library consists of two parts: one is the functions that the Java language needs to call, and the other is the core library of the operating system.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the Java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, safety and exception management, and garbage collection.
  • a system library can include multiple functional modules, for example, a surface manager, media libraries, a 3D graphics processing library (e.g., OpenGL ES), a 2D graphics engine (e.g., SGL), and so on.
  • the Surface Manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support multiple audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer includes at least display drivers, camera drivers, audio drivers, and sensor drivers.
  • FIG. 2 shows a block diagram of another electronic device 200 in which embodiments of the present disclosure may be implemented.
  • electronic device 200 may be in the form of a general-purpose computing device.
  • Components of electronic device 200 may include, but are not limited to, one or more processors or processing units 210, memory 220, storage devices 230, one or more communication units 240, one or more input devices 250, and one or more output devices 260.
  • the processing unit 210 may be an actual or virtual processor and can perform various processes according to programs stored in the memory 220 . In a multiprocessor system, multiple processing units execute computer-executable instructions in parallel to increase the parallel processing capability of the electronic device 200 .
  • Electronic device 200 typically includes multiple computer storage media. Such media can be any available media that can be accessed by electronic device 200, including but not limited to volatile and nonvolatile media, removable and non-removable media.
  • Memory 220 may be volatile memory (e.g., registers, cache, random access memory (RAM)), non-volatile memory (e.g., read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory), or some combination thereof.
  • Storage device 230 may be removable or non-removable media, and may include machine-readable media, such as flash drives, magnetic disks, or any other media that can store information and/or data (e.g., training data for training) and can be accessed within the electronic device 200.
  • the electronic device 200 may further include additional removable/non-removable, volatile/non-volatile storage media.
  • disk drives may be provided for reading from or writing to removable, non-volatile magnetic disks (e.g., "floppy disks"), and optical disk drives may be provided for reading from or writing to removable, non-volatile optical disks.
  • each drive may be connected to a bus (not shown) by one or more data media interfaces.
  • the memory 220 may include a computer program product 225 having one or more program modules configured to perform the object editing methods or processes of embodiments of the present disclosure.
  • the communication unit 240 enables communication with other computing devices through a communication medium. Additionally, the functions of the components of the electronic device 200 may be implemented in a single computing cluster or multiple computing machines capable of communicating through a communication link. Accordingly, electronic device 200 may operate in a networked environment using logical connections to one or more other servers, network personal computers (PCs), or another network node.
  • Input device 250 may be one or more input devices, such as a mouse, keyboard, trackball, and the like.
  • Output device 260 may be one or more output devices, such as a display, speakers, printer, and the like.
  • the output device 260 may include a touch screen with a touch sensor, which may receive a user's touch input.
  • the electronic device 200 may also communicate, as needed via the communication unit 240, with one or more external devices (not shown) such as storage devices and display devices, with one or more devices that allow a user to interact with the electronic device 200, or with any device (e.g., a network card, a modem, etc.) that enables the electronic device 200 to communicate with one or more other computing devices. Such communication may be performed via an input/output (I/O) interface (not shown).
  • the electronic device 100 shown in FIG. 1A and the electronic device 200 shown in FIG. 2 above are merely two example electronic devices capable of implementing one or more embodiments of the present disclosure, and should not constitute any limitation on the functionality and scope of the described embodiments.
  • the embodiments of the present disclosure relate to the linkage of animation effects of UI elements in the UI when dragging, which is also referred to as drag linkage.
  • the dragged target UI element can affect other UI elements that are not dragged.
  • triggering the animation effect of the target UI element may jointly trigger the animation effect of one or more other UI elements, or even other UI elements in the entire UI.
  • other UI elements can also move with a corresponding animation effect, thereby visually presenting the linkage drag.
• the embodiments of the present disclosure can make the animation effect more consistent with the laws of physics and take real usage scenarios and user habits into account, thereby significantly improving the expressiveness of the animation and the user experience.
  • drag linkage will be described below with reference to FIGS. 3A-19B .
  • UI elements may have irregular sizes and shapes.
  • UI 300A may include multiple UI elements, such as UI elements 1-13, where UI elements 1, 2, 4, and 5 have different sizes and shapes.
  • the UI may also have irregular parts.
  • FIG. 3B the right side of UI elements 16 and 18 in UI 300B is vacant, that is, there is no UI element.
  • the embodiments of the present disclosure are equally applicable to regular layouts, sizes and shapes.
  • UI 300C has a regular layout, and UI elements 25-40 in UI 300C all have the same size and shape. It should be understood that the embodiments of the present disclosure are applicable to any suitable regular or irregular layout, size and shape.
  • UI elements in the UI can be dragged.
  • the user may drag the UI element when the user desires to move the UI element.
  • a user may drag a UI element when the user desires to change the position of the UI element in the UI, merge the UI element with another UI element, put the UI element into a toolbar or trash, or the like.
  • the UI element can be moved with an animation effect to visually present the drag action.
  • the dragged target UI element can affect other UI elements that are not dragged.
  • other UI elements can also move with a corresponding animation effect, thereby visually presenting a linked drag.
  • FIG. 4 shows a schematic diagram of an example drag linkage 400 in accordance with some embodiments of the present disclosure.
  • UI element 3 may move with an animation effect to visually present the drag action.
  • other UI elements 2 and 4 can also move with a corresponding animation effect, thereby visually presenting a linked drag.
• Figure 4 only shows the coordinated movement of UI elements 2-3. It should be understood that linked movement can occur between any two or more UI elements in any UI, for example, between any two or more UI elements in UIs 300A-300C.
  • a drag at UI element 3 is detected at time T01, causing UI element 3 and other UI elements 2 and 4 to move.
  • the distance between UI element 3 and UI element 4 located in the dragging direction becomes smaller.
  • Spacing may represent the distance between corresponding reference points of two UI elements.
  • the center point of the UI element may be determined as the fiducial point of the UI element.
  • the spacing may represent the distance between adjacent boundaries of two UI elements.
  • UI element 3 may even cover at least a portion of UI element 4 .
  • the distance between UI element 3 and UI element 2 located in the opposite direction of the dragging direction becomes larger.
  • UI element 3 is moving faster than UI elements 2 and 4.
  • the distance between UI element 3 and UI element 4 located in the drag direction becomes larger, and the distance from UI element 2 located in the opposite direction to the drag direction becomes smaller.
  • UI element 3 moves at a lower speed than UI elements 2 and 4.
• UI element 3 and the other UI elements 2 and 4 move by a predetermined distance and then stop moving.
  • the predetermined distance may be determined based on a friction force model.
• the manner of determining the distance based on the friction force model will be described in detail below, and thus is not repeated here.
• Movement of UI elements may be controlled by friction factors, linkage factors, follow-hand factors, follow-hand ratio factors, release rebound factors, and/or inertial rebound factors, as illustrated by the configuration sketch after this list.
• the friction factor can control when UI elements stop moving.
  • Linkage factors can control the animation of other UI elements.
• the follow-hand factor can control the follow-hand movement of UI elements, for example, how a UI element follows the finger when dragged without crossing a boundary.
• the follow-hand ratio factor can control the ratio of the follow-hand movement of a UI element, for example, the ratio of the UI element's displacement to the hand's displacement when dragging continues after crossing a boundary.
  • the release rebound factor can control the reset of UI elements after letting go, for example, the reset of UI elements after dragging and letting go.
• the inertial rebound factor can control the rebound of UI elements after crossing the boundary. For example, the friction factor may fail to stop a UI element before it moves out of bounds; in this case, the inertial rebound factor can control the springback of the UI element after it crosses the bounds.
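• As a non-limiting illustration of how these factors might be grouped for an animation engine, a minimal sketch follows; all names and default values are hypothetical and are not taken from the present disclosure:

```python
from dataclasses import dataclass

# Hypothetical configuration grouping the factors described above.
# Every field name and default value is illustrative only.
@dataclass
class DragAnimationFactors:
    friction: float = 4.2           # friction factor: controls when a moving UI element stops
    linkage: float = 0.3            # linkage factor: controls the animation of linked UI elements
    follow_hand: bool = True        # follow-hand factor: element follows the finger before crossing bounds
    follow_hand_ratio: float = 0.5  # ratio of element displacement to hand displacement after crossing bounds
    release_rebound: bool = True    # whether the element resets after letting go
    inertial_rebound: bool = True   # whether the element springs back after moving out of bounds
```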
  • the friction force model associated with the friction force factor and the elastic force model associated with the linkage factor will be described in detail.
  • the friction model can be used to determine the distance that the UI element will move, thereby determining the source and destination locations of the UI element's movement.
• the spring parameters of other UI elements that are moved in linkage can be determined based on the spring parameters (e.g., elastic coefficient, damping coefficient) of the dragged UI element, using a conduction method that will be described in detail below, so that during the movement of the dragged UI element and the UI elements moved in linkage, each UI element is controlled to move according to the elastic force model based on its own spring parameters.
  • the friction model can be used to determine how far a UI element will move, such as how far a UI element will move after letting go or slipping. This distance can be determined by equations (1) and (2) as follows:
• where f represents the friction force, which is configurable by the electronic device or the user;
• t represents the moving time;
• V0 represents the initial velocity, which is configurable by the electronic device or the user, or may be obtained by detecting the speed of the user's drag;
• V(t) represents the final velocity, which is 0 because the movement of the UI element eventually stops;
• e represents the natural constant; and
• S(t) represents the distance that the UI element will move.
  • the time t of the movement can be determined by equation (1).
  • the distance S(t) can be further determined by equation (2).
• various parameters in the equations (e.g., the friction force, the initial velocity, etc.) are configurable by the electronic device or the user.
• FIGS. 5A and 5B show schematic diagrams of an example velocity-time curve 500A and an example displacement-time curve 500B, respectively, of a friction force model according to some embodiments of the present disclosure.
  • the moving speed of the UI element decreases to 0 with time, and the moving distance of the UI element increases with time until the movement stops.
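• Since the text of equations (1) and (2) is not reproduced above, the following is a minimal sketch that assumes a common exponential-decay friction model, V(t) = V0·e^(−f·t) and S(t) = (V0/f)·(1 − e^(−f·t)), which matches the shapes of the curves in FIGS. 5A and 5B; the threshold velocity and all constants are illustrative assumptions:

```python
import math

def friction_stop(v0: float, friction: float, v_min: float = 0.01):
    """Assumed exponential-decay friction model (a sketch, not the
    disclosure's exact equations (1) and (2)):
        V(t) = v0 * exp(-friction * t)
        S(t) = (v0 / friction) * (1 - exp(-friction * t))
    Movement is treated as stopped once velocity falls below v_min.
    """
    # Time at which the velocity decays to v_min (analogue of equation (1)).
    t_stop = math.log(v0 / v_min) / friction
    # Distance covered by that time (analogue of equation (2)).
    s_stop = (v0 / friction) * (1.0 - math.exp(-friction * t_stop))
    return t_stop, s_stop

t, s = friction_stop(v0=2000.0, friction=4.2)  # e.g., px/s and 1/s
print(f"movement stops after {t:.2f}s, travelling {s:.0f}px")
```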
  • the positions to which UI elements can be moved are not limited.
  • the distance determined based on the friction model is the distance the UI element will move.
  • the positions to which UI elements can be moved are limited.
  • UI elements can only be moved to predetermined positions.
• although the distance that the UI element will move can be determined based on the friction force model, if the UI element would not be located at a predetermined position after moving that distance, the distance needs to be adjusted so that the UI element can be moved to a predetermined position.
  • the UI element can be moved to a predetermined position closest to the stop position determined based on the friction model.
  • the distance that the UI element is to be moved can be jointly determined based on both the friction model and the predetermined position.
  • FIG. 6 shows a schematic diagram of an example 600 of restricted and unrestricted mobile locations in accordance with some embodiments of the present disclosure.
  • the UI element can only be moved from the source position 620 on the screen 610 to the predetermined position 630 .
  • UI elements can be moved from source position 620 on screen 610 to any position 640 without limitation to the position to which the UI element can be moved.
  • the moving range of the UI element may also be limited. UI elements that move beyond this range will be considered out of bounds.
  • the range can be any suitable range.
• for example, the range may be defined by the distance to the screen boundary being less than a predetermined proportion of the screen size or a predetermined number of pixels (such as 10% or 1000 pixels), or by the distance from the source position of the UI element to the destination position being less than a predetermined proportion of the screen size or a predetermined number of pixels (such as 50% or 10000 pixels).
  • the distance that the UI element is to be moved can also be determined based on the range.
  • the friction force model is described in detail above, and the elastic force model is further described below.
  • the displacement versus time curve of the movement of the UI element may be an elastic force curve conforming to an elastic force model. Since displacement and time can determine velocity, the velocity of UI elements' movement also follows the elastic force model. For this reason, it can be considered that the movement of UI elements can simulate the motion law of springs. It should be understood that, in order to describe the elastic force model here, a curve of the displacement of the UI element moving with time is described as an elastic force curve. However, the displacement versus time curve of the movement of the UI element may also be any suitable predefined curve, such as a Bezier curve.
  • the elastic force model can be based on the damped vibration equations (3) and (4) under Hooke's law:
• where f represents the force on the spring during vibration (i.e., movement), which is configurable by the electronic device or the user;
• a represents the acceleration of the movement;
• t represents the time of the movement;
• k represents the elastic coefficient of the spring;
• x represents the deformation of the spring;
• g represents the damping coefficient of the spring; and
• m represents the mass of the UI element, where the size of the UI element can be treated as equivalent to its mass.
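• The equations themselves are not reproduced in this text; with the parameters defined above, the standard damped-vibration form under Hooke's law (offered here as an assumption) would read:

```latex
% Assumed standard damped-vibration form; the disclosure's equations (3)
% and (4) are not reproduced in the text above.
f = -kx - g\frac{\mathrm{d}x}{\mathrm{d}t}, \qquad
f = ma = m\frac{\mathrm{d}^2x}{\mathrm{d}t^2}
\quad\Longrightarrow\quad
m\ddot{x} + g\dot{x} + kx = 0
```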
  • the coefficient of elasticity is the amount of force required for a unit of deformation of the spring.
• the elastic coefficient k is configurable by the electronic device or the user.
  • the value range of the elastic coefficient k may be 1-999, and the suggested value range of the elastic coefficient k may be 150-400.
  • the damping coefficient is a quantitative representation of the damping force (such as fluid resistance, friction, etc.) of the spring during the vibration process.
  • the above damping force can gradually reduce the amplitude of the spring until it stops at the equilibrium position.
• the damping coefficient g is configurable by the electronic device or the user. In some embodiments, the value range of the damping coefficient g may be 1-99.
  • the distance S(t) that the UI element is to be moved can be determined based on the friction force model, as described above.
  • S(t) can be regarded as the amount of deformation of the spring. Therefore, S(t) is equal to x.
  • the elastic force model has three damping states, namely critical damping state, under damping state and over damping state.
• the critical damping state conforms to the following equation (5): g = 2√(mk), where:
  • g is the damping coefficient of the spring
  • m is the size of the UI element
  • k is the spring's elastic coefficient
• FIGS. 7A-7C show schematic diagrams of example curves 700A-700C of the spring deformation amount x versus time t for the critically damped state, the under-damped state, and the over-damped state, respectively, according to some embodiments of the present disclosure.
• As shown in FIG. 7A, in the critical damping state, the spring returns to the equilibrium position with the smoothest speed and in the shortest time, stops, and no longer oscillates.
• As shown in FIG. 7B, in the under-damped state, the spring gradually reduces its amplitude through multiple oscillations and finally returns to the equilibrium position.
• As shown in FIG. 7C, in the over-damped state, the spring hardly oscillates, and its amplitude gradually decreases until it reaches the equilibrium position.
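• The three damping states can be reproduced numerically. The sketch below integrates the assumed damped-spring equation m·x″ + g·x′ + k·x = 0 with semi-implicit Euler; all constants are illustrative:

```python
import math

def simulate_spring(x0: float, m: float, k: float, g: float,
                    dt: float = 0.001, t_end: float = 2.0):
    """Integrate m*x'' + g*x' + k*x = 0 (assumed standard form) starting
    from deformation x0 at rest; returns samples of (t, x)."""
    x, v = x0, 0.0
    samples = []
    for i in range(int(t_end / dt)):
        a = (-k * x - g * v) / m  # acceleration from spring and damping forces
        v += a * dt               # semi-implicit Euler update
        x += v * dt
        samples.append((i * dt, x))
    return samples

m, k = 1.0, 200.0
for label, g in [("critically damped", 2 * math.sqrt(m * k)),  # equation (5)
                 ("under-damped", 5.0),                        # oscillates, as in FIG. 7B
                 ("over-damped", 60.0)]:                       # no oscillation, as in FIG. 7C
    print(label, f"x(2.0s) = {simulate_spring(1.0, m, k, g)[-1][1]:+.4f}")
```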
  • the displacement versus time curve of the movement of the UI element may be an elastic force curve conforming to an elastic force model. Therefore, it can be considered that the movement of the UI element can simulate the motion law of the spring, that is, the change law of the displacement of the UI element can simulate the change law of the spring deformation.
• by adjusting the damping coefficient and/or the elastic coefficient, the change law of the displacement of the UI element can be adjusted, so that the UI element simulates the motion law of a spring in the critically damped, over-damped, or under-damped state.
  • the spring parameters of other UI elements that are moved in linkage can be determined based on the spring parameters (such as elasticity coefficient, damping coefficient) of the dragged UI element, using a conduction method that will be described in detail below, so as to During the movement of the dragged UI element and the UI element that is moved in linkage, each UI element is controlled to move according to the elastic force model based on its respective spring parameters.
• the dragged UI element and the UI elements that are moved in linkage can simulate the motion laws of springs with different spring parameters, so that the interval between UI elements in the drag direction first decreases and then recovers (similar to a spring being compressed and then restored), and the interval between UI elements in the opposite direction of the drag first increases and then recovers (similar to a spring being stretched and then restored), which increases the dynamic feedback to the user's drag action.
  • the animation effect of a UI element that is moved in linkage is determined based on the animation effect of the dragged UI element moving and the distance between the dragged UI element and the UI element that is moved in linkage. Since the animation effect of the UI element moving in linkage changes with the distance, it can also be considered that the animation effect of its movement is conducted with the distance.
  • the conduction may be non-linear conduction. Alternatively, the conduction can also be linear conduction.
  • the animation effect of the movement of UI elements that move in conjunction can be determined by the following equation (6):
• where x_n represents the animation effect of the UI element that is moved in linkage;
• x represents the animation effect of the UI element being dragged;
• n represents the distance between the UI element being dragged and the UI element that is moved in linkage; and
• g represents the conduction coefficient; when the conduction coefficient is 0, the animation effect of the UI element moved in linkage is the same as that of the UI element being dragged.
• it should be noted that the constant in equation (6) is only an example and is configurable by the electronic device or the user.
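• Equation (6) itself is not reproduced in this text; the only property stated above is that a conduction coefficient of 0 leaves the animation effect unchanged. A minimal sketch under the assumption of geometric attenuation with distance:

```python
def conduct(x: float, n: int, g: float) -> float:
    """Hypothetical conduction of an animation parameter x (e.g., the ratio
    of the elastic coefficient to the damping coefficient of the dragged UI
    element) to a UI element at distance n. Assumes x_n = x / (1 + g)**n,
    which satisfies the stated property that g = 0 gives x_n == x; the
    disclosure's actual equation (6) is not reproduced in the text."""
    return x / (1.0 + g) ** n

x = 400 / 30  # e.g., elastic coefficient 400, damping coefficient 30
for n in range(4):
    print(f"distance {n}: x_n = {conduct(x, n, g=0.3):.3f}")
```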
  • the animation effect of the movement may be controlled by a damping coefficient and/or an elastic coefficient.
  • x may be determined based on at least one of a damping coefficient and a spring coefficient.
  • x can be the ratio of the elastic coefficient to the damping coefficient of the UI element being dragged.
  • the ratio of the elastic coefficient to the damping coefficient of the dragged UI element is transmitted to the UI element that moves in conjunction with the distance n, thereby obtaining the ratio x n of the elastic coefficient to the damping coefficient of the UI element that moves in conjunction.
  • the animation effect of the dragged UI element moving can be transferred to the UI element that is moved in conjunction based on the distance.
• the larger the ratio of the elastic coefficient to the damping coefficient, the weaker the correlation between the movements of the UI elements and the greater the difference in spring characteristics and movement between UI elements, which can be thought of as a "softer" spring.
• conversely, the smaller the ratio of the elastic coefficient to the damping coefficient, the stronger the correlation between the movements of the UI elements and the smaller the difference in spring characteristics and movement between UI elements, which can be thought of as a "harder" spring.
• determining x as the ratio of the elastic coefficient to the damping coefficient of the UI element being dragged is merely an example.
  • x can be any suitable factor.
  • x may be the elasticity coefficient of the UI element being dragged.
  • x may also be the damping coefficient of the UI element being dragged.
  • the animation effect of UI element movement may follow any suitable predetermined curve, such as a Bezier curve.
• the Bezier curve may have a number of control points corresponding to its order.
• for example, a second-order Bezier curve can have one control point, a third-order Bezier curve can have two control points, a fourth-order Bezier curve can have three control points, and so on.
  • the animation effect of the movement can be controlled by the coordinates of at least one of at least one control point of the Bezier curve.
• the animation effect of the movement can be controlled by one or both of the two control points of a third-order Bezier curve.
  • x may be determined based on the coordinates of at least one of the at least one control point.
• in this case, the animation effect is conducted in the same manner, where x_n represents the animation effect of the UI element that is moved in linkage; x represents the animation effect of the UI element being dragged; n represents the distance between the UI element being dragged and the UI element that is moved in linkage; and g represents the conduction coefficient; when the conduction coefficient is 0, the animation effect of the UI element moved in linkage is the same as that of the UI element being dragged.
• in summary, the animation effect of the movement can be controlled by a damping coefficient and/or an elastic coefficient, in which case x may be determined based on at least one of them; or by a Bezier curve, in which case x may be determined based on the coordinates of at least one of its control points.
  • FIG. 8 shows a flowchart of a graphical interface display method 800 according to an embodiment of the present disclosure. It should be understood that the method 800 may be performed by the electronic device 100 described above with reference to FIG. 1 or the electronic device 200 described with reference to FIG. 2 . Method 800 is described herein with reference to UI 300A of Figure 3A. It should be understood, however, that UI 300A is merely an example, and method 800 may be applied to any suitable interface, including but not limited to UIs 300B-300C.
  • the M user interface UI elements are displayed on the screen of the electronic device.
  • M is a positive integer greater than 1.
  • the M UI elements may be UI elements 1 to 13 .
  • a drag at a first UI element of the M UI elements is detected.
  • the first UI element may be UI element 5 .
  • the drag on the first UI element will cause the first UI element to move with an animation effect to present the drag effect.
  • each of the N UI elements on the screen is moved with a corresponding animation effect.
  • N is a positive integer between 1 and M-1.
  • the drag linkage can act on all UI elements on the screen.
  • M-1 UI elements other than the first UI element among the M UI elements may be determined as N UI elements.
  • the drag linkage may only act on some UI elements on the screen.
• the influence area of the first UI element may be determined based on the size of the first UI element, and the UI elements within the influence area among the M UI elements may be determined as the N UI elements. For example, the larger the size of the first UI element, the larger its area of influence may be. Alternatively, the area of influence may instead decrease as the size increases; the present disclosure is not limited in this regard.
  • the area of influence may be a circle with a predetermined radius centered on the reference point of the first UI element. It should be understood that the area of influence may be any suitable area having any shape, such as a rectangle, diamond, etc. having a predetermined size. The area of influence may be configurable by the electronic device and the user, and the present disclosure is not limited herein.
  • UI elements that intersect the area of influence may be considered to be within the area of influence.
• where the area of influence is a circle with a predetermined radius, a UI element may be considered to be within the area of influence if its distance from the first UI element is less than the predetermined radius, as in the sketch below.
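• As a sketch of the circular influence-area test just described (parameter names are illustrative), a UI element given by its bounding box can be tested against the circle centred on the dragged element's reference point:

```python
def in_influence_area(cx: float, cy: float, radius: float,
                      left: float, top: float, right: float, bottom: float) -> bool:
    """True if the bounding box (left, top, right, bottom) intersects the
    circle of the given radius centred on (cx, cy)."""
    # Closest point of the box to the circle centre.
    nearest_x = min(max(cx, left), right)
    nearest_y = min(max(cy, top), bottom)
    dx, dy = cx - nearest_x, cy - nearest_y
    return dx * dx + dy * dy <= radius * radius
```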
  • FIG. 9 shows a schematic diagram of an example of an area of influence of a UI element according to an embodiment of the present disclosure.
• since UI elements 3, 4, 7, and 8 are within the influence area 910 of UI element 5, they will move in conjunction with UI element 5.
• since UI elements 1, 2, 6, and 9-13 are not within the influence area 910 of UI element 5, they will not move in conjunction with UI element 5.
  • the distance between the first UI element and each of the N UI elements may be determined.
  • how to determine the distance between the first UI element and the second UI element among the N UI elements will be described.
  • the distances may be divided into a plurality of distance classes according to the range in which the distances lie.
  • the operated UI element itself may be at distance level 0
  • the linked UI elements may be at distance level 1, 2, 3 according to their corresponding distance from the operated UI element...
• UI elements at the same distance level may be considered to be at the same distance. Therefore, by using distance levels, the linkage of UI elements can be simplified, so that UI elements at the same distance level are linked in the same way, thereby improving the unity and coordination of the linkage.
  • the distance itself may also be used, thereby allowing UI elements to be more precisely linked.
• hereinafter, distance levels are referred to interchangeably as distances.
  • FIG. 10 shows a schematic diagram of an example 1000 of distance determination according to an embodiment of the present disclosure.
• as shown in FIG. 10, a first reference point of the first UI element (e.g., UI element 5) and a second reference point of a second UI element (e.g., UI element 2) may be determined.
• the reference point of each UI element is indicated by "+".
  • the center point of the UI element may be determined as the fiducial point of the UI element.
• the reference point of the UI element may be configurable by the electronic device or the user, so that the location of the reference point may be any suitable location, and the present disclosure is not limited herein.
  • the distance between the first reference point and the second reference point can be determined as the distance between the first UI element and the second UI element.
• the distance can be determined by the following equation (8):
• where n represents the distance; x0 and y0 represent the abscissa and ordinate of the first reference point; and x1 and y1 represent the abscissa and ordinate of the second reference point.
• the distances between UI element 5 and the other UI elements determined in the above manner are as follows: the distance between UI element 5 and itself is 0; the distance between UI element 5 and UI elements 3, 7, and 8 is 1; the distance between UI element 5 and UI elements 2, 4, 6, and 9 is 2; and the distance between UI element 5 and UI elements 1 and 10-13 is 3.
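• Equation (8) is not reproduced in the text above; given the coordinate definitions, the Euclidean distance between the two reference points is a natural reading, and the example levels 0-3 can then be obtained by quantizing that raw distance. A sketch under these assumptions:

```python
import math

def reference_point_distance(x0: float, y0: float, x1: float, y1: float) -> float:
    """Assumed Euclidean form of equation (8), which is not reproduced in
    the text: the distance between the two reference points."""
    return math.hypot(x1 - x0, y1 - y0)

def distance_level(d: float, cell: float) -> int:
    """Illustrative quantization of a raw distance into the levels
    0, 1, 2, 3, ... of the example; cell is a hypothetical grid pitch."""
    return round(d / cell)

print(distance_level(reference_point_distance(0, 0, 96, 96), cell=136))  # -> 1
```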
• FIGS. 11A-11C show schematic diagrams of examples 1100A-1100C of distance determination according to embodiments of the present disclosure.
• as shown in FIG. 11A, a first reference point of the first UI element may be determined.
• a plurality of circles having respective radii, such as circles 1110A-1130A, may be determined with the first reference point as the center. It should be understood that, in addition to circles, any other suitable shapes with respective sizes can also be determined with the first reference point as the center, such as rectangles, diamonds, etc., and the present disclosure is not limited herein.
• as shown in FIG. 11B, a first reference point of the first UI element may be determined.
• a plurality of rectangles having respective dimensions, e.g., rectangles 1110B-1120B, may be determined with the first reference point as the center.
• as shown in FIG. 11C, a first reference point of the first UI element may be determined.
• a plurality of diamonds having respective dimensions, e.g., diamonds 1110C-1140C, may be determined with the first reference point as the center.
  • the radii of the plurality of circles may increase by a predetermined size or ratio.
• the radii of the plurality of circles may be configurable by the electronic device or the user, and the present disclosure is not limited herein.
  • the circle intersecting the second UI element can be determined. From this, the radius of the intersecting circles can be determined as the distance. In some embodiments, if there is more than one circle that intersects the second UI element, from these circles, a target circle that intersects the second UI element and has the smallest radius can be determined. Further, in some embodiments, if no circle intersects the second UI element, the circle closest to the second UI element may be used as the target circle, whereby the radius of the target circle may be determined as the distance.
• the distances between UI element 5 and the other UI elements determined in the above manner are as follows: the distance between UI element 5 and itself is 0. Since the circle with the smallest radius that intersects UI elements 3, 4, 7, and 8 is circle 1110, the distance between UI elements 3, 4, 7, 8 and UI element 5 is 1. Since circle 1120 intersects UI elements 2, 6, and 9, the distance between UI elements 2, 6, 9 and UI element 5 is 2. Also, since circle 1130 intersects UI elements 1 and 10-13, the distance between UI elements 1, 10-13 and UI element 5 is 3.
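• The same box/circle intersection idea gives a sketch of the FIG. 11A scheme (all names are illustrative):

```python
def circle_distance_level(cx: float, cy: float, radii, box) -> int:
    """Distance level of a UI element under the FIG. 11A scheme: the
    1-based index of the smallest circle (centred on the first element's
    reference point, radii sorted in increasing order) that intersects the
    element's bounding box; if none intersects, the largest (closest)
    circle serves as the target circle."""
    left, top, right, bottom = box
    for level, r in enumerate(sorted(radii), start=1):
        # Closest point of the box to the circle centre.
        nx = min(max(cx, left), right)
        ny = min(max(cy, top), bottom)
        if (cx - nx) ** 2 + (cy - ny) ** 2 <= r * r:
            return level
    return len(radii)  # fall back to the largest circle as the target
```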
  • FIG. 12 shows a schematic diagram of an example 1200 of distance determination according to an embodiment of the present disclosure.
  • the horizontal spacing between the first UI element and the second UI element may be determined, and/or the vertical spacing between the first UI element and the second UI element may be determined.
  • the lateral spacing may represent the sum of the lengths of one or more lateral spacings between the first UI element and the second UI element.
  • Horizontal spacing may represent the spacing between the vertical boundaries of two UI elements on the screen.
  • vertical spacing may represent the sum of the lengths of one or more vertical spacings between the first UI element and the second UI element.
  • Vertical spacing may represent the spacing between the horizontal boundaries of two UI elements on the screen.
  • the lengths of the horizontal and vertical spaces between UI elements can be irregular.
  • the lengths of the horizontal and vertical spaces between UI elements may be electronically device or user configurable.
  • the distance may be determined based on the lateral spacing and/or the longitudinal spacing. For example, there are two vertical spaces between UI element 5 and UI element 13. Thus, the distance between UI element 5 and UI element 13 may be the sum of the lengths of these two longitudinal intervals. As another example, there is a lateral space between UI element 12 and UI element 13 . Thus, the distance between UI element 12 and UI element 13 may be the length of this lateral separation.
• as shown in FIG. 12, the distance between UI element 5 and itself is 0, the distance between UI element 5 and UI elements 2-4 and 6-9 is 1, and the distance between UI element 5 and UI elements 1 and 10-13 is 2.
  • FIG. 13 shows a schematic diagram of an example 1300 of distance determination according to an embodiment of the present disclosure.
  • the horizontal spacing may represent the sum of the lengths of one or more horizontal spacings between the first UI element and the second UI element and the width of one or more intermediate UI elements.
  • the vertical spacing may represent the sum of the lengths of one or more vertical gaps between the first UI element and the second UI element and the height of one or more intermediate UI elements.
  • the distance may be determined based on the lateral spacing and/or the longitudinal spacing. For example, there are two vertical spaces and an intermediate UI element 9 between UI element 5 and UI element 13 . Thus, the distance of UI element 5 from UI element 13 may be the sum of the lengths of these two longitudinal intervals and the height of UI element 9 . As another example, there is one lateral space and one intermediate UI element 12 between UI element 11 and UI element 13 . Thus, the distance of UI element 11 from UI element 13 may be the sum of the length of this lateral separation and the width of UI element 12 . Additionally, there is a vertical space between UI element 3 and UI element 5. Thus the distance between UI element 3 and UI element 5 is the length of this longitudinal interval.
• similarly, the distance between UI element 3 and UI element 11 is the sum of the lengths of the three longitudinal intervals and the heights of the two intermediate UI elements 5 and 7.
• the distances between UI element 5 and the other UI elements determined in the above manner are as follows: the distance between UI element 5 and itself is 0, the distance between UI element 5 and UI elements 2-4 and 6-9 is 1, and the distance between UI element 5 and UI elements 1 and 10-13 is 2.
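• A sketch of the FIG. 12/FIG. 13 spacing schemes (inputs are plain lengths; values are illustrative):

```python
def spacing_distance(gaps, intermediates=()):
    """Distance between two UI elements as the sum of the lengths of the
    gaps between them, plus (in the FIG. 13 variant) the widths or heights
    of any intermediate UI elements lying between them."""
    return sum(gaps) + sum(intermediates)

# FIG. 13 example: two vertical gaps and intermediate UI element 9
# between UI element 5 and UI element 13 (pixel values are illustrative).
print(spacing_distance(gaps=[24, 24], intermediates=[96]))  # -> 144
```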
  • the operation direction may also be taken into account in the horizontal and vertical spacing.
  • the operation direction may be the direction of dragging the first UI element.
  • the drag linkage scenario is described here, as will be described below, there are also deep linkage scenarios and pressure linkage scenarios. In the depth linkage scene and the pressure linkage scene, the distance determination method considering the operation direction can also be used.
  • UI elements can be pressed.
  • the direction from the second UI element to the first UI element (such as the direction from the second reference point to the first reference point) can be regarded as an operation direction, so that the direction of operation is taken into account in the horizontal and vertical spacing.
  • the lateral spacing and/or the longitudinal spacing may first be determined using the manner of determining the distance described with reference to FIGS. 12 and 13 . Then, the angle between the operation direction and the horizontal direction and/or the vertical direction can be determined. Thus, the distance in the operation direction can be determined using the principle of trigonometric functions.
• as shown in FIG. 14A, based on the lateral spacing and the angle between the operation direction 1410A and the horizontal direction, the distance in the operation direction 1410A can be determined using the principle of trigonometric functions.
  • the distance in the operation direction may be determined by selecting one of the horizontal direction and the vertical direction that is closer to the operation direction as the reference direction according to the included angle between the operation direction and the horizontal direction and the vertical direction. For example, as shown in FIG. 14B , since the operation direction 1430B is closer to the vertical direction, the vertical direction can be selected as the reference direction, and based on the longitudinal distance and the angle between the operation direction 1430B and the vertical direction, the principle of trigonometric function is used to determine Distance in operating direction 1430B.
• similarly, the distance in the operation direction 1420B can be determined based on the lateral spacing and the angle between the operation direction 1420B and the horizontal direction, using the principle of trigonometric functions.
• the reference direction may be configurable by the electronic device or the user, and the present disclosure is not limited herein.
  • the reference direction may be set to a horizontal direction, a vertical direction or any other suitable direction.
• in some embodiments, the distance in the operation direction may be determined by using both the horizontal distance and the vertical distance.
• the horizontal and vertical spacing may be determined by the sizes of the intermediate intervals and intermediate UI elements. Therefore, the distance in the operation direction can also be determined piecewise for each intermediate interval and intermediate UI element. Specifically, for each intermediate interval and intermediate UI element, the size of the intermediate interval or intermediate UI element and the included angle between the operation direction and the horizontal or vertical direction can be determined. Thus, the distance in the operation direction can be determined using the principle of trigonometric functions. The distances in the operation direction determined for each intermediate interval and intermediate UI element may then be summed to obtain the total distance in the operation direction, as in the sketch below.
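• A sketch of the trigonometric projection just described; the formulas are assumptions consistent with the description, and all names are illustrative:

```python
import math

def distance_along_direction(lateral: float, longitudinal: float,
                             angle_deg: float) -> float:
    """Distance in the operation direction, where angle_deg is the angle of
    the operation direction measured from the horizontal axis. Whichever of
    horizontal/vertical is closer to the operation direction is taken as the
    reference direction, and the spacing measured along that axis is
    projected onto the operation direction."""
    a = math.radians(angle_deg % 180.0)
    if a <= math.pi / 4 or a >= 3 * math.pi / 4:
        # Closer to horizontal: project the lateral spacing.
        return lateral / abs(math.cos(a))
    # Closer to vertical: project the longitudinal spacing.
    return longitudinal / abs(math.sin(a))

print(distance_along_direction(lateral=120.0, longitudinal=80.0, angle_deg=30.0))
```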
  • an animation effect of moving the second UI element may be determined based on the distance.
  • a first animation effect of the movement of the first UI element in response to the drag may be determined.
  • the first animation effect of the movement of the first UI element may be controlled by a predefined curve of displacement over time.
  • the predefined curve can be a Bezier curve or an elastic force curve.
  • the animation effect of the movement of the second UI element in response to the drag may be determined based on the first animation effect and the distance between the first UI element and the second UI element.
• in the case where the first animation effect of the movement of the first UI element is controlled by a predefined curve of the displacement of the first UI element over time, a curve of the displacement of the second UI element over time may be determined based on that predefined curve and the distance. For example, in the case of an elastic force curve, the damping coefficient and/or the elastic coefficient of the spring can be conducted based on the distance.
• as another example, in the case of a Bezier curve, the coordinates of at least one of the at least one control point of the Bezier curve may be conducted based on the distance. Transferring the animation effect of the first UI element to the second UI element, so as to obtain the animation effect of the second UI element, can be realized using the conduction method described in detail above; its description is therefore omitted here.
  • the size of the second UI element can also affect the animation effect of its movement.
  • the size of the second UI element may also be considered to determine the animation effect of the second UI element.
• the animation effect of moving the second UI element may be determined based on the first animation effect, the distance, and the size of the second UI element.
  • the size of the first UI element may also affect the animation effect of the movement of the second UI element.
  • the size of the first UI element may also be considered to determine the animation effect of the second UI element.
• the animation effect of moving the second UI element may be determined based on the first animation effect, the distance, and the size of the first UI element.
• both the size of the first UI element and the size of the second UI element may affect the animation effect of moving the second UI element. Therefore, in some embodiments, the animation effect of moving the second UI element may be determined based on the first animation effect, the distance, the size of the first UI element, and the size of the second UI element.
  • the second UI element may be moved with the animation effect to visually indicate that the second UI element moves with the first UI element.
• as for the N UI elements, they can all be moved with respective animation effects to visually indicate dragging on the entire screen or a partial area of the screen, thereby presenting the drag linkage.
  • the movement direction of a UI element that moves in conjunction can be associated with the drag direction, thereby visually indicating the drag action.
  • the direction of the drag may be determined, and the second UI element may be animated to move in association with the determined direction.
  • the first UI element and the second UI element do not start to move at the same time.
  • a first UI element may start moving when the drag occurs, while a second UI element may start moving some time after the drag occurs.
  • a delay time may be determined based on the distance between the first UI element and the second UI element, and the second UI element is moved in response to the delay time elapsed after the drag occurs.
  • a delay factor may be determined, and the delay time is determined based on the distance and the delay factor.
  • the delay time may be the quotient of the distance divided by the delay factor.
• the delay factor is configurable by the electronic device and the user, as in the sketch below.
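• A sketch of the delay computation just described (units are illustrative, e.g., distance levels and levels per millisecond):

```python
def delay_time(distance: float, delay_factor: float) -> float:
    """Delay before a linked UI element starts to move: the quotient of its
    distance from the dragged element divided by the delay factor."""
    return distance / delay_factor

for n in (1, 2, 3):
    print(f"distance {n}: starts after {delay_time(n, delay_factor=0.02):.0f} ms")
```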
  • FIG. 15 shows a schematic diagram of an example of a delay time 1500 according to an embodiment of the present disclosure.
• as shown in FIG. 15, the first UI element (e.g., UI element 5) starts to move first; UI elements with a distance of 1 (e.g., UI elements 3, 4, 7, and 8) start to move after a corresponding delay time; UI elements with a distance of 2 (e.g., UI elements 2, 6, and 9) start to move after a longer delay; and UI elements with a distance of 3 (e.g., UI elements 1 and 10-13) start to move last.
• in this way, the embodiments of the present disclosure can make the animation effect more consistent with the laws of physics and take real usage scenarios and user habits into account, thereby significantly improving the expressiveness of the animation and the user experience.
  • FIG. 16A shows a schematic diagram of an example of a scene 1600A in which UI elements move completely with the hand, according to an embodiment of the present disclosure.
• FIG. 16B shows a schematic diagram of an example of a displacement-time curve 1600B for a scene in which a UI element completely moves with the hand, according to an embodiment of the present disclosure.
• at time T11, UI element 5 is dragged.
• at time T11a, UI element 5 starts to move following the drag of the finger.
  • T11a may be equal to T11 if UI element 5 begins to move when the drag occurs.
  • T11a may be greater than T11 if UI element 5 begins to move after the drag occurs.
• other UI elements (e.g., UI elements 2-4 and 6-9) also move in conjunction. It should be understood that other UI elements are shown as starting to move at the same time as UI element 5 for clarity. However, as mentioned above, other UI elements may start moving after their respective delay times.
• at time T12, the user releases or slips, at which point the dragging of UI element 5 ends.
• at time T12a, UI element 5 stops moving.
  • T12a may be equal to T12 if UI element 5 stops moving when released or slipped.
  • T12a may be greater than T12 if UI element 5 stops moving after being released or slipped.
  • the displacement of the UI element 5 in the dragging direction is S10.
  • the displacement of UI elements 3, 4, 7, and 8 with a distance of 1 in the drag direction is S11.
  • the displacement of UI elements 2, 6, and 9 with a distance of 2 in the drag direction is S12.
  • Displacement S10 is greater than displacement S11, and displacement S11 is greater than displacement S12.
• thereafter, UI elements 3, 4, 7, and 8 with a distance of 1 and UI elements 2, 6, and 9 with a distance of 2 continue to move with an animation effect controlled by a predefined curve (e.g., an elastic force curve).
  • the spacing in the drag direction between UI element 5 and UI elements 3, 4, 7, and 8 with a distance of 1 decreases. Furthermore, the spacing in the drag direction between UI elements 3, 4, 7, 8 with a distance of 1 and UI element 9 with a distance of 2 in the drag direction decreases. In addition, the distance in the drag direction increases between UI elements 3, 4, 7, and 8 with a distance of 1 and UI elements 2, 6 with a distance of 2 in the opposite direction to the drag direction.
  • the displacement of the UI element 5 in the drag direction remains at S10.
  • the UI elements 3, 4, 7, and 8 with a distance of 1 move by displacement S10 in the drag direction, and stop moving.
  • the displacement of UI elements 2, 6, and 9 with a distance of 2 in the drag direction has not yet reached S10, and continues to move with an animation effect controlled by a predefined curve.
  • the spacing in the drag direction between UI elements 3, 4, 7, 8 with a distance of 1 and UI element 9 with a distance of 2 in the drag direction increases.
  • the spacing in the drag direction between UI elements 3, 4, 7, 8 with a distance of 1 and UI elements 2, 6 with a distance of 2 in the opposite direction to the drag direction is reduced.
• FIG. 17A shows a schematic diagram of an example of a scene 1700A in which UI elements do not completely follow the hand, according to an embodiment of the present disclosure.
• FIG. 17B shows a schematic diagram of an example of a displacement-time curve 1700B for a scene in which a UI element does not completely follow the hand, according to an embodiment of the present disclosure.
• at time T21, UI element 5 is dragged.
• at time T21a, UI element 5 begins to move following the drag of the finger.
  • T21a may be equal to T21 if UI element 5 begins to move when the drag occurs.
  • T21a may be greater than T21 if UI element 5 starts to move after the dragging occurs.
• other UI elements (e.g., UI elements 2-4 and 6-9) also move in conjunction. It should be understood that other UI elements are shown as starting to move at the same time as UI element 5 for clarity. However, as mentioned above, other UI elements may start moving after their respective delay times.
• at time T22, the user releases or slips, at which point the dragging of UI element 5 ends.
• at time T22a, UI element 5 stops moving.
  • T22a may be equal to T22 if UI element 5 stops moving when released or slipped.
  • T22a may be greater than T22 if the UI element 5 stops moving after being released or slipped.
  • the displacement of the finger in the dragging direction is SF2.
  • the displacement of the UI element 5 in the dragging direction is S20.
  • the displacement of UI elements 3, 4, 7, and 8 with a distance of 1 in the dragging direction is S21.
  • the displacement of UI elements 2, 6, and 9 with a distance of 2 in the drag direction is S22.
  • Displacement SF2 is greater than displacement S20, displacement S20 is greater than displacement S21, and displacement S21 is greater than displacement S22.
• UI element 5 stops moving, while UI elements 3, 4, 7, and 8 with a distance of 1 and UI elements 2, 6, and 9 with a distance of 2 continue to move with an animation effect controlled by a predefined curve (e.g., an elastic force curve).
  • the distance between UI element 5 and UI elements 3, 4, 7, and 8 with a distance of 1 in the drag direction is increased.
  • the spacing in the drag direction between UI elements 3, 4, 7, 8 with a distance of 1 and UI element 9 with a distance of 2 in the drag direction decreases.
  • the spacing in the drag direction between UI elements 3, 4, 7, 8 with a distance of 1 and UI elements 2, 6 with a distance of 2 in the opposite direction to the drag direction increases.
  • the spacing in the drag direction between UI element 5 and UI elements 3, 4, 7, and 8 with a distance of 1 decreases. Furthermore, the spacing in the drag direction between UI elements 3, 4, 7, 8 with a distance of 1 and UI element 9 with a distance of 2 in the drag direction decreases. In addition, the distance in the drag direction increases between UI elements 3, 4, 7, and 8 with a distance of 1 and UI elements 2, 6 with a distance of 2 in the opposite direction to the drag direction.
  • the displacement of the UI element 5 in the drag direction remains at S20.
  • the UI elements 3, 4, 7, and 8 with a distance of 1 move the displacement S20 in the drag direction, and stop moving.
  • the displacement of UI elements 2, 6, and 9 with a distance of 2 in the dragging direction has not yet reached S20, and continues to move with an animation effect controlled by a predefined curve.
  • the spacing in the drag direction between UI elements 3, 4, 7, 8 with a distance of 1 and UI element 9 with a distance of 2 in the drag direction increases.
  • the spacing in the drag direction between UI elements 3, 4, 7, 8 with a distance of 1 and UI elements 2, 6 with a distance of 2 in the opposite direction to the drag direction is reduced.
• in the scenarios described above, UI element 5 stops moving after the dragging stops. However, UI element 5 may also continue to move a certain distance after the dragging stops. In some embodiments, that distance may be determined based on the friction force model as described above. Whether UI element 5 continues to move after the dragging stops is configurable by the electronic device and the user. For example, if the electronic device is configured to allow continued movement after release or slipping, UI element 5 may continue to move. Conversely, UI element 5 will stop moving as the dragging stops.
  • FIG. 18A shows a schematic diagram of an example of a scene 1800A in which UI elements move completely with the hand, according to an embodiment of the present disclosure.
• FIG. 18B shows a schematic diagram of an example of a displacement-time curve 1800B for a scene in which a UI element completely moves with the hand, according to an embodiment of the present disclosure.
• at time T31, UI element 5 is dragged.
• at time T31a, UI element 5 starts to move following the drag of the finger.
  • T31a may be equal to T31 if UI element 5 begins to move when the drag occurs.
  • T31a may be greater than T31 if UI element 5 begins to move after the drag occurs.
• other UI elements (e.g., UI elements 2-4 and 6-9) also move in conjunction. It should be understood that other UI elements are shown as starting to move at the same time as UI element 5 for clarity. However, as mentioned above, other UI elements may start moving after their respective delay times.
• at time T32, the user releases or slips, at which point the dragging of UI element 5 ends.
• at time T32a, UI element 5 continues to move with an animation effect controlled by a predefined curve (e.g., an elastic force curve).
  • T32a may be equal to T32 if the UI element 5 moves with an animation controlled by a predefined curve when the drag is ended.
  • T32a may be greater than T32 if the UI element 5 is moved with an animation effect controlled by a predefined curve after the end of the drag.
  • the displacement of the UI element 5 in the drag direction is SF3.
  • the displacement of UI elements 3, 4, 7, and 8 with a distance of 1 in the dragging direction is S31.
  • the displacement of UI elements 2, 6, and 9 with a distance of 2 in the drag direction is S32.
  • the displacement SF3 is greater than the displacement S31, and the displacement S31 is greater than the displacement S32.
  • UI elements 3, 4, 7, 8 with a distance of 1 and UI elements 2, 6, and 9 with a distance of 2 also continue to move with the animation effect of the predefined curve.
• from time T31, the spacing in the drag direction between UI element 5 and UI elements 3, 4, 7, and 8 with a distance of 1 increases.
  • the spacing in the drag direction between UI elements 3, 4, 7, 8 with a distance of 1 and UI element 9 with a distance of 2 in the drag direction decreases.
  • the spacing in the drag direction between UI elements 3, 4, 7, 8 with a distance of 1 and UI elements 2, 6 with a distance of 2 in the opposite direction to the drag direction increases.
  • the spacing in the drag direction between UI element 5 and UI elements 3, 4, 7, and 8 with a distance of 1 increases.
  • the spacing in the drag direction between UI elements 3, 4, 7, 8 with a distance of 1 and UI element 9 with a distance of 2 in the drag direction decreases.
  • the spacing in the drag direction between UI elements 3, 4, 7, 8 with a distance of 1 and UI elements 2, 6 with a distance of 2 in the opposite direction to the drag direction increases.
  • the UI element can bounce back a certain distance.
• in the under-damped state, the displacement of the spring oscillates between positive and negative values over time. Therefore, the elastic force curve of the under-damped state can be used to control the springback of UI elements.
• in FIG. 18A, UI elements are shown as being allowed to overlap each other, e.g., UI element 8 overlaps UI element 9 at times T32-T34.
• however, UI elements may also not be allowed to overlap each other. Whether overlapping is allowed is configurable by the electronic device or the user. Where overlap is allowed, the movement of UI elements may follow the elastic force curve of the under-damped state; where overlap is not allowed, the movement may follow the elastic force curve of the over-damped state. Further, whether any two UI elements overlap may also depend on the relative movement magnitudes of the two UI elements. For example, where the relative movement of the two UI elements is small, the UI elements generally do not overlap; where the relative movement is large, the UI elements may overlap.
  • FIG. 19A shows a schematic diagram of an example of a scene 1900A in which UI elements move completely with the hand, according to an embodiment of the present disclosure.
• FIG. 19B shows a schematic diagram of an example of a displacement-time curve 1900B for a scene in which a UI element completely moves with the hand, according to an embodiment of the present disclosure.
• at time T41, UI element 5 is dragged.
• at time T41a, UI element 5 begins to move following the drag of the finger.
  • T41a may be equal to T41 if UI element 5 begins to move when the drag occurs.
  • T41a may be greater than T41 if UI element 5 begins to move after the drag occurs.
• other UI elements (e.g., UI elements 2-4 and 6-9) also move in conjunction. It should be understood that other UI elements are shown as starting to move at the same time as UI element 5 for clarity. However, as mentioned above, other UI elements may start moving after their respective delay times.
• at time T42, the user releases or slips, at which point the dragging of UI element 5 ends.
• at time T42a, UI element 5 continues to move with an animation effect controlled by a predefined curve (e.g., an elastic force curve).
  • T42a may be equal to T42 if the UI element 5 moves with an animation controlled by a predefined curve when the drag is ended.
  • T42a may be greater than T42 if the UI element 5 is moved with an animation effect controlled by a predefined curve after ending the drag.
  • the displacement of the UI element 5 in the drag direction is SF4.
  • the displacement of UI elements 3, 4, 7, and 8 with a distance of 1 in the dragging direction is S41.
  • the displacement of UI elements 2, 6, and 9 with a distance of 2 in the dragging direction is S42.
  • the displacement SF4 is greater than the displacement S41, and the displacement S41 is greater than the displacement S42.
  • UI elements 3, 4, 7, 8 with a distance of 1 and UI elements 2, 6, and 9 with a distance of 2 also continue to move with the animation effect of the predefined curve.
• from time T41, the spacing in the drag direction between UI element 5 and UI elements 3, 4, 7, and 8 with a distance of 1 increases.
  • the spacing in the drag direction between UI elements 3, 4, 7, 8 with a distance of 1 and UI element 9 with a distance of 2 in the drag direction decreases.
  • the spacing in the drag direction between UI elements 3, 4, 7, 8 with a distance of 1 and UI elements 2, 6 with a distance of 2 in the opposite direction to the drag direction increases.
• UI element 5 moves by displacement S40 in the drag direction and then starts to spring back.
• the distance between the rebound position (displacement S40 in the drag direction) and the release or slip position (displacement SF4 in the drag direction) may be determined based on the friction force model as described above.
  • the UI element 9 with a distance of 2 moves a displacement S40 in the drag direction, and also starts to spring back.
• although UI element 5 is shown as starting to bounce back before UI elements 3, 4, 7, and 8 with a distance of 1, and UI elements 3, 4, 7, and 8 with a distance of 1 are shown as starting to bounce back before UI elements 2, 6, and 9 with a distance of 2, all UI elements may instead bounce back together.
  • UI element 5 may stop moving to wait for other UI elements to move by displacement S40, and then start bouncing back together.
• although all UI elements are shown as rebounding to the release or slip position, all UI elements may rebound more or less than this, and the embodiments of the present disclosure are not limited thereto.
  • the embodiments of the present disclosure relate to the linkage of UI elements in the UI in the depth direction, which is also referred to as depth linkage.
  • the depth direction refers to a direction perpendicular to the screen of the electronic device.
  • the pressed target UI element can affect other UI elements that are not pressed.
  • triggering the animation effect of the target UI element can jointly trigger the animation effect of one or more other UI elements, or even other UI elements in the entire UI, so that all other UI elements are affected by the target UI element .
  • other UI elements can also be scaled by a corresponding magnitude, thereby visually showing a linked scaling.
• in this way, the embodiments of the present disclosure can make the animation effect more consistent with the laws of physics and take real usage scenarios and user habits into account, thereby significantly improving the expressiveness of the animation and the user experience.
  • Deep linkage can occur in a UI with any suitable regular or irregular layout, and UI elements in the UI can have any suitable size and shape.
  • deep linkage can occur in UIs 300A-300C as shown in Figures 3A-3C.
  • UI elements in the UI can be pressed.
  • a user may press a UI element when the user desires to perform an operation associated with the UI element.
  • a user may press a UI element when the user desires to enter an application represented by the UI element, open a menu associated with the UI element, and the like.
• the UI element may change, e.g., the UI element may be scaled, to present the press action in the depth direction.
• UI elements can be scaled down to appear farther away in the depth direction.
• UI elements can be enlarged to appear closer in the depth direction.
• in the following, scaling is described taking the zooming out of UI elements as an example. However, it should be understood that the scaling may also enlarge UI elements.
  • FIG. 20 shows a schematic diagram of an example of a UI element changing 2000 when pressed, according to some embodiments of the present disclosure. As shown in FIG. 20, upon detection of a press at the UI element, the UI element can be zoomed out to appear away in the depth direction.
  • Changes in UI elements can conform to the surface pressure model.
  • the pressure is the same at every part of the UI element (eg, every pixel or every part divided in any other suitable way). That is, the pressure is the same for all parts of the UI element, regardless of which part of the UI element is pressed (eg, whether the center of the UI element is pressed or the edge of the UI element). So no matter which part of the UI element is pressed, the UI element changes will be the same.
  • FIG. 21 shows a schematic diagram of an example of a change 2100 of a UI element as it is pressed at different locations, according to some embodiments of the present disclosure. As shown in Figure 21, whether a press is detected at the center of the UI element or the edge of the UI element, the UI element can be shrunk by the same amount to appear away in the depth direction.
  • the pressed position may no longer be within the range of the shrunk UI element.
  • the press can continue to be detected as a press on the UI element.
  • the press will not be detected as a press on the UI element. In this case, the press can be considered to have ended.
  • the magnitude of the changes may depend on the magnitude of the pressing force.
  • the magnitude of the force usually refers to the actual magnitude of the force.
  • the greater the pressing force, the greater the change in the depth direction.
  • the force of the pressing may be the force of the pressing applied by the user detected by the electronic device.
  • the pressing force may also be a predetermined pressing force set by the electronic device or the user.
  • FIG. 22 shows a schematic diagram of an example of a UI element changing 2200 at different pressing forces in accordance with some embodiments of the present disclosure.
  • in the case of a large pressing force, the UI elements can be shrunk by a larger amount to present a greater degree of distance in the depth direction.
  • UI elements may even shrink until they disappear from the UI, i.e., to a size of 0, so as to appear as far away as possible in the depth direction.
  • in the case of a small pressing force, the UI elements can be shrunk by a smaller amount to present a smaller degree of distance in the depth direction.
  • the manner in which UI elements scale in response to different pressing forces is configurable by the electronic device or the user. For example, UI elements can shrink by a smaller amount when the pressing force is large, and by a larger amount when the pressing force is small.
  • basing the change in the depth direction entirely on the real pressing force may place high demands on the user, and may require the electronic device to be equipped with relevant hardware.
  • the duration of the press may be used to simulate or substitute for the force of the press. For example, if the pressing time is longer, the pressing force can be considered to be larger, and thus the change in the depth direction is larger.
  • FIG. 23 shows a schematic diagram of an example of a change 2300 of UI elements at different press durations, according to some embodiments of the present disclosure.
  • in the case where the duration of the press is long, the UI element can be shrunk by a larger amount to present a greater degree of distance in the depth direction.
  • UI elements may even shrink until they disappear from the UI, i.e., to a size of 0, so as to appear as far away as possible in the depth direction.
  • in the case where the duration of the press is short, the UI element may shrink by a smaller amount to present a smaller degree of distance in the depth direction.
  • the manner in which UI elements scale in response to different press durations is configurable by the electronic device or the user. For example, where the duration of the press is long, the UI element may shrink by a smaller amount, and where the duration of the press is short, the UI element may shrink by a larger amount.
  • the magnitude of changes may depend on the sizes of UI elements.
  • the same press may have a weaker pressing effect on a larger UI element, while having a stronger pressing effect on a smaller UI element.
  • larger UI elements may be less affected by presses, while smaller UI elements may be more affected by presses.
  • FIG. 24 shows a schematic diagram of an example of a UI element changing 2400 at different sizes, according to some embodiments of the present disclosure.
  • in the case where the size of the UI element is large, the UI element can be shrunk by a larger magnitude to present a greater degree of distance in the depth direction.
  • in the case where the size of the UI element is small, the UI element can be shrunk by a smaller amount to present a smaller degree of distance in the depth direction.
  • the manner in which UI elements scale at different sizes is configurable by the electronic device or the user. For example, in order to make the sizes of the scaled UI elements more balanced, larger UI elements may be more affected by presses, while smaller UI elements may be less affected by presses. To this end, in the case where the size of the UI element is large, the UI element can be scaled down by a larger amount, and in the case where the size of the UI element is small, the UI element can be scaled down by a smaller amount, so as to present a smaller degree of distance in the depth direction.
  • the range within which the UI element can be zoomed may be limited, so that the UI element can only be zoomed within the allowed range.
  • the magnitude range may be any suitable range, such as 10%-90% of the size of the UI element, 100 pixels-10,000 pixels, or 2%-50% of the screen-to-body ratio, or the like.
  • the magnitude range is 10%-90% of the size of the UI element. In this case, no matter how large the pressing force is or how long the pressing duration is, the pressed UI element can only be reduced to 10% of its original size at most, and cannot disappear from the screen.
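  • As an illustrative sketch only (the helper name scaleForPress, the 2000 ms saturation time, and the linear mapping are assumptions, not the disclosed algorithm), the duration-based zoom with the 10%-90% magnitude range described above could look like:

    // Java sketch: a longer press is treated as a larger pressing force,
    // so the element shrinks more; the result is clamped so the element
    // can never be reduced below 10% of its original size.
    static float scaleForPress(long pressDurationMs) {
        final long maxDurationMs = 2000L;          // duration treated as maximum force (assumed)
        float t = Math.min(1f, pressDurationMs / (float) maxDurationMs);
        float scale = 1f - 0.9f * t;               // 1.0 (untouched) down to 0.1
        return Math.max(0.1f, scale);              // clamp to the allowed range
    }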
  • the pressed target UI element can affect other UI elements that are not pressed.
  • triggering the animation effect of the target UI element can jointly trigger the animation effect of one or more other UI elements, or even all other UI elements in the entire UI, so that all other UI elements are affected by the target UI element.
  • other UI elements can also be scaled by a corresponding magnitude, thereby visually showing a linked scaling. Therefore, hereinafter, the depth linkage will be described in detail with reference to FIGS. 25 to 33 .
  • FIG. 25 shows a flowchart of a graphical interface display method 2500 according to an embodiment of the present disclosure. It should be understood that the method 2500 may be performed by the electronic device 100 described above with reference to FIG. 1 or the electronic device 200 described with reference to FIG. 2 . Method 2500 is described herein with reference to UI 300A of Figure 3A. However, it should be understood that UI 300A is merely an example, and method 2500 may be applied to any suitable interface, including but not limited to UIs 300B-300C.
  • the M user interface UI elements are displayed on the screen of the electronic device.
  • M is a positive integer greater than 1.
  • the M UI elements may be UI elements 1-13.
  • a press is detected for a duration at a first UI element of the M UI elements.
  • the first UI element may be UI element 5.
  • a press at the first UI element for a duration will cause the first UI element to scale over time to present the effect of the press in the depth direction.
  • each of the N UI elements on the screen is scaled.
  • N is a positive integer between 1 and M-1. Thereby, interlocking pressing is visually indicated.
  • FIG. 26 shows a schematic diagram of an example of a depth linkage 2600 of N UI elements according to an embodiment of the present disclosure.
  • as shown in FIG. 26, UI element 5 is pressed for a duration, so that UI element 5 is scaled over time to present a pressing effect in the depth direction.
  • in addition, the other UI elements on the screen (e.g., UI elements 1 to 4 and 6 to 13) are also scaled in response to the press, thereby visually presenting an interlocking press.
  • FIG. 26 only shows the depth linkage of UI elements 1-13 in UI 300A. It should be understood that depth linkage can occur at any two or more UI elements in any UI, for example, at any two or more UI elements in UIs 300A-300C.
  • deep linkage can work on all UI elements on the screen.
  • M-1 UI elements other than the first UI element among the M UI elements may be determined as N UI elements.
  • deep linkage can only act on some UI elements on the screen.
  • the influence area of the first UI element may be determined based on the size of the first UI element, and the UI elements within the influence area among the M UI elements may be determined as the N UI elements. For example, the larger the size of the first UI element, the larger its area of influence may be. Alternatively, the area of influence may instead decrease as the size increases, and the present disclosure is not limited herein.
  • the area of influence may be a circle with a predetermined radius centered on the reference point of the first UI element. It should be understood that the area of influence may be any suitable area having any shape, such as a rectangle, diamond, etc. having a predetermined size. The area of influence may be configurable by the electronic device and the user, and the present disclosure is not limited herein.
  • UI elements that intersect the area of influence may be considered to be within the area of influence.
  • the area of influence is a circle with a predetermined radius
  • the UI element may be considered to be within the area of influence if the distance from the first UI element is less than the predetermined radius of the area of influence.
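  • A minimal sketch of this selection, assuming Android views and a circular influence area (the method name selectLinkedElements and the use of view centers as reference points are illustrative):

    import android.view.View;
    import java.util.ArrayList;
    import java.util.List;

    static List<View> selectLinkedElements(View pressed, List<View> all, float radius) {
        List<View> linked = new ArrayList<>();
        float cx = pressed.getX() + pressed.getWidth() / 2f;   // reference point of the pressed element
        float cy = pressed.getY() + pressed.getHeight() / 2f;
        for (View v : all) {
            if (v == pressed) continue;
            float dx = v.getX() + v.getWidth() / 2f - cx;
            float dy = v.getY() + v.getHeight() / 2f - cy;
            if (Math.hypot(dx, dy) < radius) {
                linked.add(v);                                 // inside the influence area
            }
        }
        return linked;
    }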
  • FIG. 27 shows a schematic diagram of an example of a UI element's area of influence 2700 according to an embodiment of the present disclosure.
  • as shown in FIG. 27, since UI elements 3, 4, 7, and 8 are within the influence area 2710 of UI element 5, UI elements 3, 4, 7, and 8 will be scaled in conjunction with UI element 5.
  • in addition, since UI elements 1, 2, 6, and 9-13 are not within the influence area 2710 of UI element 5, UI elements 1, 2, 6, and 9-13 will not be scaled in conjunction with UI element 5.
  • the distance between the first UI element and each of the N UI elements may be determined.
  • distances may be divided into a plurality of distance levels according to the range in which the distances lie.
  • the operated UI element itself may be at distance level 0.
  • the linked UI elements may be at distance level 1, 2, 3, and so on, according to their corresponding distances from the operated UI element.
  • UI elements at the same distance level may be considered to be at the same distance. Therefore, by using distance levels, the linkage of UI elements can be simplified, so that UI elements at the same distance level are linked in the same way, thereby improving the unity and coordination of the linkage.
  • the distance itself may also be used, thereby allowing UI elements to be more precisely linked.
  • hereinafter, distance levels and distances are referred to interchangeably as distances.
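  • For example, a minimal sketch of bucketing raw distances into distance levels (the bucket width is an assumed tuning parameter):

    // Java sketch: elements whose distances fall into the same bucket share a
    // distance level and are therefore linked in the same way.
    static int distanceLevel(float distancePx, float bucketWidthPx) {
        if (distancePx <= 0f) return 0;                        // the operated element itself
        return (int) Math.ceil(distancePx / bucketWidthPx);    // level 1, 2, 3, ...
    }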
  • the magnitude of scaling the second UI element may be determined based on the distance. For example, if the distance between the first UI element and the second UI element is greater, the second UI element may be scaled less, thereby visually indicating that the press has less impact on the distant UI element. Alternatively, if the distance between the first UI element and the second UI element is larger, the magnitude of the zooming of the second UI element may also be larger, thereby visually indicating that the impact of the press on the distant UI element becomes larger.
  • a first magnitude by which the first UI element is scaled in response to the press may be determined.
  • the first magnitude of scaling of the first UI element may be determined based on various factors associated with the first UI element. These factors may include, but are not limited to, the size of the first UI element, the range of magnitudes within which the first UI element can vary, the duration of the pressing, and the predetermined pressing force. In the above, the effects of these factors on the zooming amplitude of the UI elements are described in detail, so the descriptions thereof are omitted here.
  • the magnitude by which the second UI element scales in response to the press can then be determined based on the first magnitude and the distance between the first UI element and the second UI element.
  • the manner in which the scaling magnitude of the first UI element is transmitted to the second UI element, so as to obtain the scaling magnitude of the second UI element, can be implemented using the conduction manner described in detail above.
  • the difference is that in depth linkage, x_n in conduction equations (7) and (8) represents the zoom magnitude of the UI element that is zoomed in linkage (e.g., the second UI element), while x represents the zoom magnitude of the pressed UI element (e.g., the first UI element). Therefore, its description is omitted here.
  • since the zooming amplitude of the second UI element is determined by the zooming amplitude of the first UI element and the distance between the second UI element and the first UI element, the depth linkage can be intuitive, natural, and in line with the user's usage habits.
  • the size of the second UI element may also affect the magnitude of the scaling of the second UI element.
  • the size of the second UI element may also be considered to determine the magnitude by which to scale the second UI element. For example, if the second UI element is larger, the range of scaling the second UI element may be larger, so that the sizes of the scaled UI elements on the screen are more similar and thus more visually coordinated. Alternatively, if the second UI element is larger, the magnitude of scaling the second UI element may be smaller, so that the size difference of the scaled UI element on the screen is larger.
  • the magnitude by which the second UI element scales in response to the press may be determined based on the first magnitude, the distance, and the size of the second UI element.
  • the size of the first UI element may also affect the magnitude of the scaling of the second UI element.
  • the size of the first UI element may also be considered to determine the magnitude by which the second UI element is scaled. For example, the larger the size of the first UI element, the greater the possible linkage effect, so the animation effect of scaling the second UI element can be proportional to the size of the first UI element.
  • the magnitude by which the second UI element is scaled may be determined based on the first magnitude, the distance, and the size of the first UI element.
  • both the size of the first UI element and the size of the second UI element may affect the magnitude of the scaling of the second UI element.
  • the magnitude of scaling the second UI element may be determined based on the first magnitude, the distance, the size of the first UI element, and the size of the second UI element.
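  • A hedged sketch of combining these factors (the exponential attenuation and the size ratio are assumptions for illustration; the disclosure's own conduction equations (7) and (8) are not reproduced here):

    // Java sketch: the linked element's zoom magnitude decays with distance
    // level and is weighted by the relative sizes of the two elements.
    static float linkedMagnitude(float firstMagnitude, int distanceLevel,
                                 float firstSize, float secondSize) {
        float attenuation = (float) Math.pow(0.6, distanceLevel); // smaller effect farther away (assumed factor)
        float sizeFactor = firstSize / (firstSize + secondSize);  // larger pressed element -> larger effect
        return firstMagnitude * attenuation * sizeFactor;
    }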
  • the second UI element can be scaled by the magnitude to visually indicate that the second UI element scales as the first UI element is pressed.
  • for the N UI elements, all of them can be scaled by their respective amplitudes to visually indicate pressing on the entire screen or a partial area of the screen, thereby presenting a pressing linkage.
  • FIG. 28 shows a schematic diagram of an example of distance-based scaling of UI elements 2800 in accordance with an embodiment of the present disclosure.
  • as shown in FIG. 28, UI elements with a distance of 0 (e.g., UI element 5 itself) are zoomed more than UI elements with a distance of 1 (e.g., UI elements 3, 4, 7, 8); UI elements with a distance of 1 are zoomed more than UI elements with a distance of 2 (e.g., UI elements 2, 6, 9); and UI elements with a distance of 2 are zoomed more than UI elements with a distance of 3 (e.g., UI elements 1, 10-13).
  • the first UI element and the second UI element do not start scaling at the same time.
  • a first UI element may start scaling when a press occurs, while a second UI element may start scaling some time after the press occurs.
  • a delay time may be determined based on the distance between the first UI element and the second UI element, and the second UI element is scaled in response to the delay time elapsed after the press occurs.
  • a delay factor may be determined, and the delay time is determined based on the distance and the delay factor.
  • the delay time may be the quotient of the distance divided by the delay factor.
  • the delay factor may be configurable by the electronic device or the user.
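  • A one-line sketch of this rule (scheduling via View.postDelayed and the helper startScaling are assumptions about how the delay might be applied):

    // Java sketch: the delay time is the quotient of the distance divided by the delay factor.
    static long delayMillis(float distance, float delayFactor) {
        return (long) (distance / delayFactor);
    }
    // e.g., view.postDelayed(() -> startScaling(view), delayMillis(dist, factor));
    // where startScaling is a hypothetical helper that begins the linked zoom.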
  • FIG. 29 shows a schematic diagram of an example of a delay time 2900 according to an embodiment of the present disclosure.
  • as shown in FIG. 29, the first UI element with a distance of 0 starts scaling when the press occurs; UI elements with a distance of 1 start scaling later than the first UI element; UI elements with a distance of 2 start scaling later than UI elements with a distance of 1; and UI elements with a distance of 3 start scaling later than UI elements with a distance of 2.
  • FIG. 30 shows a schematic diagram of an example of scaling 3000 of UI elements with a delay time in accordance with the present disclosure.
  • as shown in FIG. 30, UI element 5 with a distance of 0 starts scaling at time T51 when the press occurs; UI elements 3, 4, 7, and 8 with a distance of 1 start scaling at a later time T52; UI elements 2, 6, and 9 with a distance of 2 start scaling at a later time T53; and UI elements 1, 10-13 with a distance of 3 start scaling at the latest time T54.
  • the speed of zooming of UI elements may be controlled by a predefined curve of amplitude versus time.
  • the predefined curve can be a Bezier curve or an elastic force curve.
  • the speed of scaling can be controlled by controlling the damping and stiffness coefficients of the spring.
  • the speed of scaling can be controlled by controlling the coordinates of at least one of the at least one control point of the Bezier curve.
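  • A minimal sketch of both options, assuming the stock Android PathInterpolator and the AndroidX dynamic-animation spring APIs (the numeric values are illustrative):

    import android.view.View;
    import android.view.animation.PathInterpolator;
    import androidx.dynamicanimation.animation.DynamicAnimation;
    import androidx.dynamicanimation.animation.SpringAnimation;
    import androidx.dynamicanimation.animation.SpringForce;

    class ZoomSpeed {
        // Bezier-controlled scaling: the two control points shape the pacing.
        static void animateWithBezier(View view, float targetScale) {
            view.animate()
                .scaleX(targetScale).scaleY(targetScale)
                .setInterpolator(new PathInterpolator(0.4f, 0f, 0.2f, 1f))
                .setDuration(300)
                .start();
        }

        // Spring-controlled scaling: stiffness and damping set speed and bounce.
        static void animateWithSpring(View view, float targetScale) {
            SpringForce spring = new SpringForce(targetScale)
                    .setStiffness(SpringForce.STIFFNESS_MEDIUM)
                    .setDampingRatio(SpringForce.DAMPING_RATIO_MEDIUM_BOUNCY);
            new SpringAnimation(view, DynamicAnimation.SCALE_X).setSpring(spring).start();
            new SpringAnimation(view, DynamicAnimation.SCALE_Y).setSpring(spring).start();
        }
    }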
  • the UI element that is zoomed in linkage can also be moved toward the pressed UI element.
  • the N UI elements may be moved toward the first UI element to further visually highlight the press.
  • the magnitude of the displacement may depend on at least one of the distance between the co-zoomed UI element and the pressed UI element, the duration of the pressing, the size of the second UI element, and the size of the first UI element.
  • the displacement of the second UI element may be determined based on the distance between the first UI element and the second UI element, the duration of the press, the size of the first UI element, and/or the size of the second UI element.
  • the second UI element can then be moved by the displacement in a direction from the second UI element to the first UI element.
  • the second UI element may be moved by the displacement in a direction from the second reference point of the second UI element to the first reference point of the first UI element.
  • the visual effect of this is that the second UI element is attracted to the first UI element.
  • the displacement can also move the second UI element in the opposite direction (e.g., the direction from the first reference point of the first UI element to the second reference point of the second UI element). The visual effect of this is that the second UI element is repelled by the first UI element.
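  • A hedged sketch of this attraction/repulsion movement (using view centers as reference points and ViewPropertyAnimator are assumptions; a positive displacement attracts, a negative one repels):

    // Java sketch: move the second element along the line between the two
    // reference points by the given displacement.
    static void moveToward(View second, float firstCx, float firstCy, float displacement) {
        float cx = second.getX() + second.getWidth() / 2f;
        float cy = second.getY() + second.getHeight() / 2f;
        float dx = firstCx - cx, dy = firstCy - cy;
        float len = (float) Math.hypot(dx, dy);
        if (len == 0f) return;                         // same point: no defined direction
        second.animate()
              .translationX(displacement * dx / len)   // unit direction toward the pressed element
              .translationY(displacement * dy / len)
              .start();
    }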
  • FIG. 31 shows a schematic diagram of an example of a displacement 3100 of movement of a UI element according to an embodiment of the present disclosure.
  • as shown in FIG. 31, UI elements with a distance of 1 (e.g., UI elements 3, 4, 7, 8) have a greater displacement magnitude than UI elements with a distance of 2 (e.g., UI elements 2, 6, 9), and UI elements with a distance of 2 have a greater displacement magnitude than UI elements with a distance of 3 (e.g., UI elements 1, 10-13).
  • the scaled UI element may be restored after the press ends (e.g., after the user lifts the finger off the screen). Specifically, both the pressed UI element and the N UI elements that are zoomed in linkage can be restored. To this end, in some embodiments, the scaled second UI element may be restored to the pre-scaled second UI element.
  • the restoration process may be an inverse process of scaling, and thus a detailed description thereof is omitted here.
  • FIG. 32A shows a schematic diagram of an example of a restoration 3200A of UI elements in accordance with an embodiment of the present disclosure.
  • as shown in FIG. 32A, the scaled UI elements (e.g., UI elements 1-13) are all restored to their original sizes before scaling.
  • UI elements may also move in response to presses.
  • the moved UI element can be reset after the press ends.
  • all N UI elements moved toward the pressed UI element can be reset.
  • the second UI element may be restored from the moved position to the pre-move position.
  • FIG. 32B shows a schematic diagram of an example of restoration 3200B of UI elements with a displacement, according to an embodiment of the present disclosure.
  • as shown in FIG. 32B, the moved and scaled UI elements (e.g., UI elements 1-13) are all restored to their pre-move positions and pre-scaling sizes.
  • the restoration of zooming or the restoration of movement may have a bouncing effect.
  • the size of the UI element can be first increased to be larger than the initial size, and then reduced to the initial size.
  • the UI element that is moved in linkage can first move away from the pressed UI element to a position farther than its initial position before the movement, and then return to the initial position.
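  • A minimal sketch of such a springback, assuming the AndroidX SpringAnimation API and a variable view holding the element being restored: an underdamped spring (damping ratio below 1) overshoots the rest value one or more times before settling, which is the bounce described above (the numeric values are illustrative):

    SpringForce restore = new SpringForce(1f)              // rest value: the original scale
            .setStiffness(SpringForce.STIFFNESS_LOW)
            .setDampingRatio(0.3f);                        // < 1 -> visible rebounds
    new SpringAnimation(view, DynamicAnimation.SCALE_X)
            .setSpring(restore)
            .setStartValue(0.5f)                           // e.g., released while at 0.5x
            .start();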
  • FIGS. 33A-33B show schematic diagrams of an example of a restored size-time curve 3300A and a displacement-time curve 3300B, respectively, of a UI element with a springback effect, according to an embodiment of the present disclosure.
  • UI element 5 is pressed to zoom out.
  • other UI elements (e.g., UI elements 1-4, 6-13) are also reduced in linkage.
  • UI element 5 is reduced to 0.5 times its original size.
  • UI elements with a distance of 1 (e.g., UI elements 3, 4, 7, and 8) are reduced in linkage, but the reduction magnitude is smaller than that of UI element 5.
  • UI elements with a distance of 2 (e.g., UI elements 2, 6, and 9) are also reduced in linkage, but the reduction magnitude is smaller than that of UI elements with a distance of 1.
  • UI elements with a distance of 3 (e.g., UI elements 1, 10-13) are also reduced in linkage, but the reduction magnitude is smaller than that of UI elements with a distance of 2.
  • UI elements start scaling and bouncing back.
  • the size of UI element 5 is increased to 1.2 times the original size.
  • UI elements with a distance of 1 (e.g., UI elements 3, 4, 7, 8) also increase in linkage, but the increase is smaller than that of UI element 5.
  • UI elements with a distance of 2 (e.g., UI elements 2, 6, and 9) also increase in linkage, but the increase is smaller than that of UI elements with a distance of 1.
  • UI elements with a distance of 3 also increase in linkage, but the increase is smaller than that of UI elements with a distance of 2.
  • the user lets go.
  • the displacement of UI elements with a distance of 1 (e.g., UI elements 3, 4, 7, 8) moving toward UI element 5 is -1.
  • UI elements with a distance of 2 (e.g., UI elements 2, 6, 9) also move toward UI element 5, but the displacement magnitude is smaller than that of the UI elements with a distance of 1.
  • UI elements with a distance of 3 (e.g., UI elements 1, 10-13) also move toward UI element 5, but the displacement magnitude is smaller than that of the UI elements with a distance of 2.
  • the displacement of UI elements with a distance of 1 exceeds the initial position and reaches +0.7.
  • UI elements with a distance of 2 (e.g., UI elements 2, 6, 9) are also displaced beyond the initial position, but the displacement magnitude is smaller than that of the UI elements with a distance of 1.
  • the displacement of UI elements with a distance of 3 (e.g., UI elements 1, 10-13) also exceeds the initial position, but the displacement magnitude is smaller than that of the UI elements with a distance of 2.
  • the scaling sizes (e.g., 0.5x, 1.2x) and movement displacements (e.g., displacement -1, displacement +0.7) in FIGS. 33A-33B are only examples, and UI elements may be scaled by any suitable magnitude or moved by any appropriate displacement.
  • although the rebound effect is shown as only one rebound in FIGS. 33A-33B, a rebound effect with multiple rebounds can also be achieved.
  • the number of rebounds may be any suitable number of rebounds, and the present disclosure is not limited herein.
  • the rebound magnitude of the multiple rebounds may decrease over time.
  • FIGS. 33C-33D show schematic diagrams of an example of a restored size-time curve 3300C and a displacement-time curve 3300D, respectively, for a UI element with a rebound effect of multiple rebounds with decreasing rebound amplitudes, according to embodiments of the present disclosure.
  • the UI element is restored to its original size after multiple rebounds, wherein the UI element with a distance of 0 (e.g., UI element 5) bounces back at a larger scale than UI elements with a distance of 1 (e.g., UI elements 3, 4, 7, 8).
  • a UI element with a distance of 1 will have a larger rebound scale than a UI element with a distance of 2 (e.g., UI elements 2, 6, 9).
  • a UI element with a distance of 2 will have a larger rebound scale than a UI element with a distance of 3 (e.g., UI elements 1, 10-13).
  • the UI element recovers to the initial position after multiple rebounds, wherein the UI element with a distance of 0 (e.g., UI element 5) rebounds with a greater displacement magnitude than UI elements with a distance of 1 (e.g., UI elements 3, 4, 7, 8).
  • a UI element with a distance of 1 rebounds with a larger displacement magnitude than a UI element with a distance of 2 (e.g., UI elements 2, 6, 9).
  • a UI element with a distance of 2 rebounds with a larger displacement magnitude than a UI element with a distance of 3 (e.g., UI elements 1, 10-13).
  • the springback effect may also be controlled by predefined curves (e.g., elastic force curves, Bezier curves, etc.).
  • these UI elements can bounce in scale or in position with an animation effect controlled by a predefined curve.
  • the embodiments of the present disclosure relate to the linkage of UI elements in the UI in terms of the pressing animation effect, which is also referred to as pressure linkage.
  • the pressed target UI element can affect other UI elements that are not pressed.
  • triggering the animation effect of the target UI element can jointly trigger the animation effect of one or more other UI elements, or even all other UI elements in the entire UI, so that all other UI elements are affected by the target UI element.
  • other UI elements can also show the pressing effect with corresponding animation effects, so as to visually present the linked press.
  • the embodiments of the present disclosure can make the animation effect more in line with the laws of physics, and take into account real usage scenarios and user usage habits, so that the animation performance and the user experience can be significantly improved.
  • Pressure linkage can occur in a UI with any suitable regular or irregular layout, and UI elements in the UI can have any suitable size and shape.
  • pressure linkage can occur in UIs 300A-300C as shown in Figures 3A-3C.
  • UI elements in the UI can be pressed.
  • a user may press a UI element when the user desires to perform an operation associated with the UI element.
  • a user may press a UI element when the user desires to enter an application represented by the UI element, open a menu associated with the UI element, and the like.
  • the UI element may change with an animation effect; for example, the UI element may visually move in a rocker fashion relative to the position of the press (alternatively referred to hereinafter as a rotation), or visually be recessed or protrude relative to the pressed position, so as to present a pressing action.
  • changes in UI elements can conform to the point pressure model. In the point pressure model, the pressure of the UI element at the pressed position is greater than the pressure of other parts.
  • UI elements can be treated as rigid bodies.
  • upon detection of a press at the UI element, the UI element can be visually moved in a rocker fashion relative to the position of the press to present the press effect.
  • FIG. 34 shows a schematic diagram of an example of a UI element that is a rigid body changing 3400 when pressed, according to some embodiments of the present disclosure.
  • upon detection of a press at the UI element, the UI element can be animated from the initial shape 3410 to visually move in a rocker fashion relative to the position of the press. For example, when the pressed position is to the left of the UI element, the UI element is visually rotated to the left about its reference point (indicated by a "+"), changing to shape 3420.
  • the altered shape 3420 resembles a rocker with the left side depressed and the right side up.
  • the UI element when the pressed position is on the right side of the UI element, the UI element is visually rotated to the right around its reference point, changing to shape 3430 .
  • the altered shape 3430 resembles a rocker with the left side lifted and the right side depressed.
  • the UI element can be seen as a rocker connected to a spring on each side, and the pressing of the UI element can be seen as pressing the spring on one side and stretching the spring on the other side, thereby implementing an animation effect in which the UI element as a whole rotates around its reference point.
  • FIG. 35 shows a schematic diagram of an example of a pressing and stretching of a spring 3500 that simulates pressing of a UI element, according to some embodiments of the present disclosure.
  • Figure 3510 shows the two-sided spring in its initial state.
  • Graph 3520 shows that when the pressed position is on the left side of the UI element, the left spring is pressed and the right spring is stretched.
  • Diagram 3530 shows that when the pressed position is on the right side of the UI element, the right spring is pressed and the left spring is stretched.
  • the model of the spring can be represented by the following equation (9), where:
  • L represents the horizontal distance between the pressed position and the reference point of the UI element;
  • c represents the linear distance between the pressed position and the reference point;
  • k' represents the elastic (spring) coefficient of the spring;
  • x' represents the deformation amount of the spring;
  • g' represents the damping coefficient of the spring;
  • T represents the time of deformation;
  • m' represents the size of the UI element.
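  • The equation itself is given above only by its parameter list; a plausible damped-spring form consistent with those parameters (this reconstruction is an assumption, not the published equation (9)) is:

    m' \ddot{x}'(T) + g' \dot{x}'(T) + k' x'(T) = 0

  where the effective deformation driving the rotation may additionally be scaled by the lever ratio L/c between the pressed position and the reference point.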
  • UI elements may be considered non-rigid bodies.
  • upon detection of a press at the UI element, the UI element may be visually recessed or protrude relative to the location of the press to present the press effect.
  • UI elements can be viewed as grid diagrams.
  • the initial UI element 3610 may change with an animation effect to visually recess or protrude relative to the location of the press.
  • the coordinates of the grid within the initial UI element 3610 can be changed to change to UI element 3620 recessed relative to the location of the press.
  • the color (e.g., hue, lightness, saturation, etc.) of the UI element may also vary to accentuate the press.
  • the initial UI element 3610 may also change to a UI element 3630 that is recessed and darkened relative to the location of the press. It should be understood that color changes may also apply to UI elements that are rigid bodies.
  • the pressed position may no longer be within the range of the changed UI element.
  • the press can continue to be detected as a press on the UI element.
  • the press will not be detected as a press on the UI element. In this case, the press can be considered to have ended.
  • UI elements may also vary in other ways, such as being recessed or protruding visually relative to the location of the press.
  • the magnitude of the changes may depend on the magnitude of the pressing force.
  • the magnitude of the force usually refers to the actual magnitude of the force.
  • the greater the pressing force, the greater the change in the UI element.
  • the force of the pressing may be the force of the pressing applied by the user detected by the electronic device.
  • the pressing force may also be a predetermined pressing force set by the electronic device or the user.
  • FIG. 37 shows a schematic diagram of an example of a UI element changing 3700 at different pressing forces, according to some embodiments of the present disclosure.
  • in the case where the pressing force is large, the UI element can be changed (e.g., rotated) by a larger magnitude.
  • UI elements can change by smaller magnitudes when the pressing force is small.
  • the manner in which UI elements change in response to different pressing forces is configurable by the electronic device or the user. For example, UI elements can change by smaller magnitudes when the pressing force is large, and by larger magnitudes when the pressing force is small.
  • the duration of the press may be used to simulate or substitute for the force of the press. For example, if the pressing time is longer, the pressing force can be considered to be larger, and thus the change is larger.
  • the UI element may change (e.g., rotate) by a larger magnitude when the duration of the press is long.
  • in contrast, when the duration of the press is short, UI elements may change by smaller magnitudes.
  • the manner in which UI elements change in response to different press durations is configurable by the electronic device or the user. For example, where the duration of the press is long, the UI elements may change by smaller magnitudes, and where the duration of the press is short, the UI elements may change by larger magnitudes.
  • the magnitude of changes may depend on the sizes of UI elements.
  • the same press may have a weaker pressing effect on a larger UI element, while having a stronger pressing effect on a smaller UI element.
  • larger UI elements may be less affected by presses, while smaller UI elements may be more affected by presses.
  • FIG. 39 shows a schematic diagram of an example of a UI element changing 3900 at different sizes, according to some embodiments of the present disclosure.
  • in the case where the size of the UI element is large, the UI element can be changed by a larger magnitude. In contrast, where the size of the UI element is small, the UI element can change by smaller magnitudes.
  • the manner in which UI elements change at different sizes is configurable by the electronic device or the user. For example, where the size of the UI element is large, the UI element may change by smaller magnitudes, and where the size of the UI element is small, the UI element may change by larger magnitudes.
  • the range within which the UI element can change may be limited, so that the UI element can only change within the allowable range.
  • the magnitude range may be any suitable range, such as a rotation angle of a UI element between 0-60 degrees, a color change of a UI element between 10%-50% grayscale, or the coordinates of the grid within a UI element varying between 100-10,000 pixels, etc.
  • the magnitude range is that the UI element's rotation angle is between 0-60 degrees. In this case, no matter how large the predetermined pressing force is or how long the pressing duration is, the pressed UI element can only be rotated at most 60 degrees around the reference point, and cannot rotate by a larger amount.
  • the change of the pressed UI element is described in detail.
  • the pressed target UI element can affect other UI elements that are not pressed.
  • triggering the animation effect of the target UI element can jointly trigger the animation effect of one or more other UI elements, or even all other UI elements in the entire UI, so that all other UI elements are affected by the target UI element.
  • other UI elements can also change with corresponding animation effects, thereby visually presenting the linked press. Therefore, hereinafter, the pressure linkage will be described in detail with reference to FIGS. 40 to 46.
  • FIG. 40 shows a flowchart of a graphical interface display method 4000 according to an embodiment of the present disclosure. It should be understood that the method 4000 may be performed by the electronic device 100 described above with reference to FIG. 1 or the electronic device 200 described with reference to FIG. 2. Method 4000 is described herein with reference to UI 300A of Figure 3A. However, it should be understood that UI 300A is merely an example, and method 4000 may be applied to any suitable interface, including but not limited to UIs 300B-300C.
  • the M user interface UI elements are displayed on the screen of the electronic device.
  • M is a positive integer greater than 1.
  • the M UI elements may be UI elements 1-13.
  • a press at a first UI element of the M UI elements is detected.
  • the first UI element may be UI element 5.
  • a press at the first UI element will cause the first UI element to rotate to render the press effect.
  • N is a positive integer between 1 and M-1. Thereby, interlocking pressing is visually indicated.
  • the direction in which the N UI elements change relative to the pressed position may be a direction from each of the N UI elements to the pressed position. In some embodiments, the direction may be a direction from the corresponding reference point of each of the N UI elements to the reference point of the pressed UI element.
  • the pressed position is the change reference point for the change of the N elements; that is, the pressed position is visually presented as the center of the press.
  • FIG. 41 shows a schematic diagram of an example of a pressure linkage 4100 of N UI elements according to an embodiment of the present disclosure. As shown in FIG. 41, UI element 5 is pressed, so that UI element 5 is rotated to present the pressing effect. In addition, other UI elements on the screen (e.g., UI elements 1 to 4 and 6 to 13) also rotate at different magnitudes relative to the position of the press in response to the press to present the press effect. Thereby, interlocking pressing is visually presented.
  • the direction in which the N UI elements change relative to the pressed position may be the same as the direction in which the pressed UI element changes.
  • FIG. 42 shows a schematic diagram of another example of a pressure linkage 4200 of N UI elements according to an embodiment of the present disclosure. As shown in FIG. 42, UI element 5 is pressed, so that UI element 5 is rotated to present the pressing effect.
  • in addition, other UI elements on the screen (e.g., UI elements 1 to 4 and 6 to 13) also change in the same direction as the pressed UI element.
  • the change reference point for the change of each of the N elements is its own reference point. Thereby, interlocking pressing is visually presented.
  • FIGS. 41-42 only show the pressure linkages of UI elements 1-13 in UI 300A. It should be understood that pressure linkage can occur at any two or more UI elements in any UI, e.g., at any two or more UI elements in UIs 300A-300C.
  • pressure linkage can act on all UI elements on the screen.
  • M-1 UI elements other than the first UI element among the M UI elements may be determined as N UI elements.
  • the pressure linkage may only act on some UI elements on the screen.
  • the influence area of the first UI element may be determined based on the size of the first UI element, and the UI elements within the influence area among the M UI elements may be determined as the N UI elements. For example, the larger the size of the first UI element, the larger its area of influence may be. Alternatively, the area of influence may instead decrease as the size increases, and the present disclosure is not limited herein.
  • the area of influence may be a circle with a predetermined radius centered on the reference point of the first UI element. It should be understood that the area of influence may be any suitable area having any shape, such as a rectangle, diamond, etc. having a predetermined size. The area of influence may be configurable by the electronic device and the user, and the present disclosure is not limited herein.
  • UI elements that intersect the area of influence may be considered to be within the area of influence.
  • the area of influence is a circle with a predetermined radius
  • the UI element may be considered to be within the area of influence if the distance from the first UI element is less than the predetermined radius of the area of influence.
  • FIG. 43 shows a schematic diagram of an example of a UI element's area of influence 4300 according to an embodiment of the present disclosure. As shown in FIG. 43, since UI elements 3, 4, 7, and 8 are within the influence area 4310 of UI element 5, UI elements 3, 4, 7, and 8 will change in conjunction with UI element 5. In addition, since UI elements 1, 2, 6, and 9-13 are not within the influence area 4310 of UI element 5, UI elements 1, 2, 6, and 9-13 will not change in conjunction with UI element 5.
  • the distance between the first UI element and each of the N UI elements may be determined.
  • distances may be divided into a plurality of distance levels according to the range in which the distances lie.
  • the operated UI element itself may be at distance level 0.
  • the linked UI elements may be at distance level 1, 2, 3, and so on, according to their corresponding distances from the operated UI element.
  • UI elements at the same distance level may be considered to be at the same distance. Therefore, by using distance levels, the linkage of UI elements can be simplified, so that UI elements at the same distance level are linked in the same way, thereby improving the unity and coordination of the linkage.
  • the distance itself may also be used, thereby allowing UI elements to be more precisely linked.
  • hereinafter, distance levels and distances are referred to interchangeably as distances.
  • the animation effect of the change of the second UI element may be determined based on the distance. For example, if the distance between the first UI element and the second UI element is larger, the magnitude of the change in the second UI element may be smaller, thereby visually indicating that the impact of the press on the distant UI element is smaller. Alternatively, if the distance between the first UI element and the second UI element is larger, the magnitude of the change in the second UI element may also be larger, thereby visually indicating that the impact of the press on the distant UI element becomes larger.
  • a first magnitude of the change in the first UI element in response to the press may be determined.
  • the first magnitude of change in the first UI element may be determined based on various factors associated with the first UI element. These factors may include, but are not limited to, the size of the first UI element, the location of the first reference point of the first UI element, the range of magnitudes that the first UI element can vary, the location of the press, the duration of the press, and the predetermined press force. In the above, the influences of these factors on the magnitudes of changes of UI elements are described in detail, so the descriptions thereof are omitted here.
  • the magnitude by which the second UI element changes in response to the press can then be determined based on the first magnitude and the distance between the first UI element and the second UI element. The manner in which the magnitude of the change of the first UI element is transmitted to the second UI element, so as to obtain the magnitude of the change of the second UI element, can be implemented using the conduction manner described in detail above. The difference is that in pressure linkage, x_n in conduction equations (7) and (8) represents the magnitude of change of the UI element that changes in linkage (e.g., the second UI element), while x represents the magnitude of change of the pressed UI element (e.g., the first UI element). Therefore, its description is omitted here.
  • since the magnitude of the change of the second UI element is determined by the magnitude of the change of the first UI element and the distance between the second UI element and the first UI element, the pressure linkage can be intuitive, natural, and in line with the user's usage habits.
  • the size of the second UI element may also affect the animation effect of the change of the second UI element.
  • the size of the second UI element may also be considered to determine the animation effect of the change of the second UI element. For example, if the second UI element is larger, the magnitude of the change in the second UI element may be larger. Alternatively, if the second UI element is larger, the magnitude of the change in the second UI element may be smaller. To this end, in some embodiments, the magnitude of the change in the second UI element in response to the press may be determined based on the first magnitude, the distance, and the size of the second UI element.
  • the size of the first UI element may also affect the animation effect of the change of the second UI element.
  • the size of the first UI element may also be considered to determine the animation effect of the change of the second UI element.
  • the magnitude of the change of the second UI element may be proportional to the size of the first UI element.
  • the magnitude of the change of the second UI element may be determined based on the first magnitude, the distance, and the size of the first UI element.
  • both the size of the first UI element and the size of the second UI element may affect the animation effect of the change of the second UI element. Therefore, in some embodiments, the magnitude of the change in the second UI element may be determined based on the first magnitude, the distance, the size of the first UI element, and the size of the second UI element.
  • the second UI element may be changed with the animation effect to visually indicate that the second UI element changes as the first UI element is pressed.
  • for the N UI elements, all of them can be changed with their respective animation effects to visually indicate pressing on the entire screen or a partial area of the screen, thereby presenting a pressing linkage.
  • FIG. 44 shows a schematic diagram of an example of a distance-based UI element change 4400 in accordance with an embodiment of the present disclosure.
  • as shown in FIG. 44, UI elements with a distance of 0 (e.g., UI element 5 itself) change more than UI elements with a distance of 1 (e.g., UI elements 3, 4, 7, 8); UI elements with a distance of 1 change more than UI elements with a distance of 2 (e.g., UI elements 2, 6, 9); and UI elements with a distance of 2 change more than UI elements with a distance of 3 (e.g., UI elements 1, 10-13).
  • the first UI element and the second UI element do not start to change at the same time.
  • a first UI element may begin to change when a press occurs, while a second UI element may begin to change some time after the press occurs.
  • a delay time may be determined based on the distance between the first UI element and the second UI element, and the second UI element is changed in response to the elapse of the delay time after the pressing occurs.
  • a delay factor may be determined, and the delay time is determined based on the distance and the delay factor.
  • the delay time may be the quotient of the distance divided by the delay factor.
  • the delay factor may be configurable by the electronic device or the user.
  • the first UI element with a distance of 0 (e.g., UI element 5) starts to change at time T81 when the press occurs; UI elements with a distance of 1 (e.g., UI elements 3, 4, 7, 8) begin to change at a later time T82; UI elements with a distance of 2 (e.g., UI elements 2, 6, 9) begin to change at a later time T83; and UI elements with a distance of 3 (e.g., UI elements 1, 10-13) begin to change at the latest time T84.
  • the speed at which UI elements change may be controlled by a predefined curve of amplitude versus time.
  • the predefined curve can be a Bezier curve or an elastic force curve.
  • the speed at which the change occurs can be controlled by controlling the damping and stiffness coefficients of the spring.
  • the speed at which the change occurs can be controlled by controlling the coordinates of at least one of the at least one control point of the Bezier curve.
  • UI elements that have changed may be restored after the press ends (eg, after the user lifts the finger off the screen).
  • the pressed UI element and the N UI elements that are changed in linkage can be restored.
  • the changed second UI element may be restored to the second UI element before the change.
  • the restoration process may be an inverse process of the change, so a detailed description thereof is omitted here.
  • FIG. 46 shows a schematic diagram of an example of a restoration 4600 of UI elements in accordance with an embodiment of the present disclosure. As shown in FIG. 46, the changed UI elements (e.g., UI elements 1-13) are all restored to their pre-change forms.
  • the UI element can change from being pressed down on the left side while the right side is lifted, to being lifted on the left side while the right side is pressed down, and then back to the original shape. That is to say, after the user lets go, the UI element visually presents the effect of being flipped and then restored.
  • FIG. 46B shows a schematic diagram of an example of an angle-time curve 4600B of the recovery of a UI element with a bouncing effect, according to an embodiment of the present disclosure.
  • UI element 5 is pressed and changed.
  • UI element 5 is pressed on the left, thus rotating around its reference point.
  • UI element 5 is depressed on the left side, while the right side is lifted.
  • other UI elements (e.g., UI elements 1-4, 6-13) also change in linkage.
  • the user lets go.
  • the angle by which the UI element 5 is rotated is -60°.
  • UI elements with a distance of 1 (e.g., UI elements 3, 4, 7, and 8) rotate in linkage, but the rotation magnitude is smaller than that of UI element 5.
  • UI elements with a distance of 2 (e.g., UI elements 2, 6, and 9) also rotate in linkage, but the rotation magnitude is smaller than that of UI elements with a distance of 1.
  • UI elements with a distance of 3 (e.g., UI elements 1, 10-13) also rotate in linkage, but the rotation magnitude is smaller than that of UI elements with a distance of 2.
  • UI element 5 rotates back at an angle of 45°.
  • UI elements with a distance of 1 (e.g., UI elements 3, 4, 7, and 8) rotate and rebound in linkage, but the magnitude of the rotation rebound is smaller than that of UI element 5.
  • UI elements with a distance of 2 (e.g., UI elements 2, 6, and 9) also rotate and rebound in linkage, but the magnitude of the rotation rebound is smaller than that of UI elements with a distance of 1.
  • UI elements with a distance of 3 (e.g., UI elements 1, 10-13) also rotate and rebound in linkage, but the magnitude of the rotation rebound is smaller than that of UI elements with a distance of 2.
  • the rotation angles in Figure 46B are examples only, and the UI elements may vary in any suitable pattern.
  • the rebound effect is shown as only one rebound in FIG. 46B, a rebound effect with multiple rebounds can be achieved.
  • the number of rebounds may be any suitable number of rebounds, and the present disclosure is not limited herein.
  • the rebound magnitude of the multiple rebounds may decrease over time.
  • FIG. 46C shows a schematic diagram of an example of a recovered angle-time curve 4600C of a UI element with a rebound effect of multiple rebounds with decreasing rebound amplitudes, according to an embodiment of the present disclosure.
  • the UI element recovers to its original shape after multiple rebounds, wherein the UI element with a distance of 0 (e.g., UI element 5) rebounds with a greater rotation amplitude (e.g., rotation angle) than UI elements with a distance of 1 (e.g., UI elements 3, 4, 7, 8).
  • a UI element with a distance of 1 rebounds with a greater rotation magnitude than a UI element with a distance of 2 (e.g., UI elements 2, 6, 9).
  • a UI element with a distance of 2 rebounds with a greater rotation magnitude than a UI element with a distance of 3 (e.g., UI elements 1, 10-13).
  • the springback effect may also be controlled by predefined curves (e.g., elastic force curves, Bezier curves, etc.).
  • these UI elements can bounce back and forth with animation effects controlled by predefined curves.
  • Figure 47 shows an animation implementation schematic 4700 in accordance with an embodiment of the present disclosure.
  • animation is essentially the real-time display of a UI or UI elements frame by frame at the screen refresh rate. Due to the persistence-of-vision principle of human sight, the user perceives the picture as moving.
  • the animation transitions from the initial state of the animation to the final state of the animation after the animation time has elapsed.
  • the animation can be controlled by the animation type and animation transformation form.
  • animation types may include displacement animations, rotation animations, scale animations, and transparency animations, among others.
  • the animation transformation form can be controlled by controllers such as interpolators and estimators. Such a controller can be used to control the speed at which the animation is transformed during animation time.
  • the interpolator is used to set the change logic of the animation property value transitioning from the initial state to the final state, thereby controlling the rate of animation change, so that the animation can change at one or more of a uniform, accelerating, decelerating, or parabolic rate.
  • the electronic device 100 may set the change logic of the animation property value according to a system interpolator or a custom interpolator (e.g., an elastic force interpolator, a friction force interpolator).
  • the invalidate() function is called based on the animation property value to refresh the view, that is, the onDraw() function is called to redraw and display the view.
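  • A minimal sketch of this refresh loop, assuming a stock ValueAnimator, a system interpolator, and a field named view holding the animated view (all illustrative):

    import android.animation.ValueAnimator;
    import android.view.animation.AccelerateDecelerateInterpolator;

    ValueAnimator animator = ValueAnimator.ofFloat(0f, 1f);
    animator.setDuration(300);
    animator.setInterpolator(new AccelerateDecelerateInterpolator()); // change logic of the property value
    animator.addUpdateListener(a -> {
        float value = (float) a.getAnimatedValue();
        view.setScaleX(1f - 0.5f * value);   // illustrative animated property
        view.setScaleY(1f - 0.5f * value);
        view.invalidate();                   // schedules onDraw() to redraw the view
    });
    animator.start();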
  • the electronic device 100 customizes an elastic force interpolator.
  • the parameters of the function of the elastic force interpolator include at least stiffness and damping.
  • the function code of the elastic force interpolator can be expressed as one of the following: "SpringInterpolator(float stiffness, float damping)", "SpringInterpolator(float stiffness, float damping, float endPos)", "SpringInterpolator(float stiffness, float damping, float endPos, float velocity)", "SpringInterpolator(float stiffness, float damping, float endPos, float velocity, float valueThreshold)".
  • the parameter endPos represents the relative displacement, that is, the difference between the initial position of the spring and the rest position.
  • endPos may represent a relative displacement of UI elements.
  • the parameter velocity represents the initial velocity.
  • velocity may represent the initial velocity of UI elements.
  • the parameter valueThreshold indicates the threshold for judging animation stop.
  • when the change in the animation property value is less than the threshold, the animation stops running.
  • the larger the threshold, the easier it is for the animation to stop and the shorter the run time; conversely, the smaller the threshold, the longer the animation runs.
  • the value of the threshold can be set according to specific animation properties.
  • the default value of the valueThreshold parameter of the elastic force interpolator is 1/1000, and the threshold value is 1 in the other construction methods.
  • the suggested values shown in Table 1 can be used according to the animation properties.
  • the threshold can directly use the following constants provided by the DynamicAnimation class: MIN_VISIBLE_CHANGE_PIXELS, MIN_VISIBLE_CHANGE_ROTATION_DEGREES, MIN_VISIBLE_CHANGE_ALPHA, MIN_VISIBLE_CHANGE_SCALE.
  • the specific code of the animation class of the custom elastic force interpolator can be expressed as follows:
  • PhysicalInterpolatorBase interpolator = new SpringInterpolator(400F, 40F, 200F, 2600F, 1F);
  • ObjectAnimator animator = ObjectAnimator.ofFloat(listView, "translationY", 0, 346);
  • electronic device 100 customizes a friction interpolator.
  • the function code of the friction interpolator can be expressed as "FlingInterpolator(float initVelocity, float friction)", where initVelocity represents the initial velocity and friction represents the frictional force.
  • the concrete code for an animation class using a friction interpolator can be represented as follows:
  • ObjectAnimator animator = ObjectAnimator.ofFloat(listView, "translationY", 0, interpolator.getEndOffset());
  • animator.setInterpolator(interpolator); // set the custom interpolator on the animation class
  • animator.start(); // run the animation
  • the electronic device 100 can set the animation time (Duration) and start position by itself; it can also call the engine model to obtain the animation time (Duration) and end position, and then set them to the animation class (Animator class).
  • the code for the electronic device 100 to call the engine model to obtain the animation time may be expressed as "com.xxx.dynamicanimation.interpolator.PhysicalInterpolatorBase#getDuration".
  • the code that calls the engine model to get the end position of the spring can be expressed as "com.xxx.dynamicanimation.interpolator.PhysicalInterpolatorBase#getEndOffset".
  • the code to set the parameter valueThreshold may be expressed as "com.xxx.dynamicanimation.interpolator.PhysicalInterpolatorBase#setValueThreshold".
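  • Putting these calls together, a hedged sketch of driving an ObjectAnimator entirely from the engine model might read as follows (the exact signatures are assumed from the method names above; parameter values are illustrative):
    FlingInterpolator interpolator = new FlingInterpolator(600F, 0.5F); // initVelocity, friction
    ObjectAnimator animator = ObjectAnimator.ofFloat(listView, "translationY", 0, interpolator.getEndOffset());
    animator.setDuration(interpolator.getDuration()); // animation time obtained from the engine model
    animator.setInterpolator(interpolator);
    animator.start();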
  • the code using the elastic engine animation class can be expressed as one of the following: "SpringAnimation(K object, FloatPropertyCompat<K> property, float stiffness, float damping, float startValue, float endValue, float velocity)", "SpringAnimation(K object, FloatPropertyCompat<K> property, float stiffness, float damping, float endValue, float velocity)".
  • the parameter object represents the animation object.
  • the parameter property represents the property object of the animation class or the interpolator. As shown in Table 1, this parameter can be used to indirectly set valueThreshold. In the interpolator version this parameter is optional: when valueThreshold has been set in another way, it can be omitted, that is, the constructor without the property parameter can be used directly. In the animation version this parameter is required.
  • the DynamicAnimation class already provides constants that can be used directly, for example the MIN_VISIBLE_CHANGE_* constants listed above.
  • the electronic device 100 may also define its own implementation of the ViewProperty interface, as sketched below.
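  • As a minimal sketch of that alternative (assuming the FloatPropertyCompat pattern of the androidx DynamicAnimation library; the property name is hypothetical), a custom animatable property could be defined as follows:
    // Custom property object usable as the 'property' parameter described above.
    FloatPropertyCompat<View> customScaleX = new FloatPropertyCompat<View>("customScaleX") {
        @Override
        public float getValue(View view) {
            return view.getScaleX();
        }

        @Override
        public void setValue(View view, float value) {
            view.setScaleX(value);
        }
    };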
  • concrete code using the Spring Engine animation class can be expressed as follows:
  • SpringAnimation animation = new SpringAnimation(listView, DynamicAnimation.TRANSLATION_Y, 400F, 40F, 0, 1000F);
  • the code using the friction engine animation class can be expressed as: "FlingAnimation(K object, FloatPropertyCompat<K> property, float initVelocity, float friction)".
  • concrete code using the friction animation class can be expressed as follows:
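  • For instance (a plausible sketch following the FlingAnimation constructor signature given above; parameter values are illustrative only):
    FlingAnimation animation = new FlingAnimation(listView, DynamicAnimation.TRANSLATION_Y, 2000F, 0.8F); // initVelocity, friction
    animation.start(); // run the friction (fling) animation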
  • Figure 48 shows a schematic diagram of a system framework 4800 for implementing "linkage" animation effects capabilities or functions according to an embodiment of the present disclosure.
  • the animation capabilities of the UI framework are implemented on top of the overall architecture of the Android or HarmonyOS (Hongmeng) system, including the mainstream four-layer logic processing; data is processed from the bottom layer upward and presented to the user. Users mainly use and experience animation functions at the application layer.
  • the capability interaction relationship between the desktop and the UI framework is shown in FIG. 48 .
  • the system framework 4800 may include an application layer 4810 , an application framework layer 4830 , a hardware abstraction layer 4850 , and a kernel layer 4870 .
  • Application layer 4810 may include desktop 4812.
  • UI element operations 4814 may be implemented on the desktop 4812.
  • the UI element operation 4814 may include, for example, a drag operation, a press operation, a deep press operation, and the like.
  • the application framework layer 4830 may include system services 4832 and extension services 4834.
  • System services 4832 may include various system services, such as Service 4833.
  • Extension services 4834 may include various extension services, such as SDK 4835.
  • a hardware abstraction layer (HAL) 4850 may include HAL 3.0 4852 and algorithm Algo 4854.
  • Kernel layer 4870 may include drivers 4872 and physical devices 4874. Physical device 4874 may provide raw parameter streams to driver 4872, and driver 4872 may provide functional processing parameter streams to physical device 4874.
  • As further shown in FIG. 48, the UI framework 4820 for implementing the linkage motion effect 4825 may be implemented between the application layer 4810 and the application framework layer 4830.
  • UI framework 4820 may include platform capabilities 4822 and system capabilities 4824, both of which may be used to provide linkage animation 4825.
  • the linkage animation 4825 may in turn be provided to the UI element operations 4814 of the application layer 4810.
  • FIG. 49 shows a schematic diagram of the relationship between the application side and the UI framework side involved in the "linkage" animation effect capability or function according to an embodiment of the present disclosure.
  • the application side 4910 may include a desktop 4915, and UI elements on the desktop 4915 may implement operations such as dragging, pressing, and deep pressing.
  • the UI frame side 4950 may include UI frame motion effects 4952.
  • the UI frame motion effects 4952 may implement the linkage motion effect capability 4954.
  • the linkage motion effect capability 4954 may be implemented by means of AAR format 4951, JAR format 4953, and system interface 4955.
  • the application side 4910 can call the "linkage" animation effect capability or function provided by the UI framework side 4950 through integration 4930 and invocation 4940.
  • embodiments of the present disclosure implement a new type of "linkage" animation effect that links originally independent UI elements (e.g., icons, cards, controls, etc.).
  • FIG. 50 shows a schematic diagram of a specific description of three ways of implementing the “linkage” animation effect capability or function according to an embodiment of the present disclosure.
  • the relationship 5001 between the AAR format 4951 and the system of the electronic device 100 is as follows: the AAR format 4951 packages the capability in a binary format, provides the capability for integration on the application side in the system, and can freely control the version rhythm without following the system.
  • the relationship 5003 between the JAR format 4953 and the system of the electronic device 100 is as follows: the JAR format 4953 packages the capability in a binary format, provides the capability for all components in the system, and can freely control the version rhythm without following the system.
  • the relationship 5005 between the system interface 4955 and the system of the electronic device 100 is as follows: the system interface 4955 is the interface of the framework layer in the system version, provides the capability for all components in the system, and follows system upgrades.
  • the focus of the present disclosure is the realization of the linkage animation effect capability: integration refers to the AAR and JAR approaches, while invocation refers to the system interface approach.
  • the applicable scenarios are not limited, but the way the capability is exposed differs among these approaches. That is to say, the functions of the various methods described above in the present disclosure may be implemented through an AAR format file, a JAR format file, and/or a system interface of the electronic device 100. In this way, the "linkage" animation effect capability or function can be simply and conveniently implemented and provided to an application of an electronic device, such as a desktop.

Abstract

The present disclosure relates to a graphical interface display method, an electronic device, a computer-readable storage medium, and a computer program product. According to the graphical interface display method described herein, M user interface (UI) elements are displayed on a screen of an electronic device, M being a positive integer greater than 1. A press acting on a first UI element of the M UI elements is detected. In response to the press, each of N UI elements on the screen is caused to change with a corresponding animation effect, N being a positive integer between 1 and (M-1). Causing the N UI elements to change with the corresponding animation effects includes: determining a distance between the first UI element and a second UI element of the N UI elements; determining, based on the distance and the position of the press in a UI, an animation effect of change of the second UI element; and causing the second UI element to change with the animation effect so as to visually indicate the press. In this way, the correlation between the animation effects of different UI elements can be increased and the relationship between independent UI elements is highlighted, so that the animation effect of the UI better conforms to users' usage habits, significantly improving the user experience.
PCT/CN2022/087751 2021-04-20 2022-04-19 Procédé d'affichage d'interface graphique, dispositif électronique, support et produit programme WO2022222931A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110426824.5A CN115220621A (zh) 2021-04-20 2021-04-20 图形界面显示方法、电子设备、介质以及程序产品
CN202110426824.5 2021-04-20

Publications (1)

Publication Number Publication Date
WO2022222931A1 (fr)

Family

ID=83604135

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/087751 WO2022222931A1 (fr) 2021-04-20 2022-04-19 Procédé d'affichage d'interface graphique, dispositif électronique, support et produit programme

Country Status (2)

Country Link
CN (1) CN115220621A (fr)
WO (1) WO2022222931A1 (fr)


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7173623B2 (en) * 2003-05-09 2007-02-06 Microsoft Corporation System supporting animation of graphical display elements through animation object instances
CN107767431B (zh) * 2017-09-28 2021-06-01 北京知道创宇信息技术股份有限公司 一种Web动画制作方法及计算设备
AU2017439979B2 (en) * 2017-11-20 2021-10-28 Huawei Technologies Co., Ltd. Method and device for dynamically displaying icon according to background image

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150370447A1 (en) * 2014-06-24 2015-12-24 Google Inc. Computerized systems and methods for cascading user interface element animations
CN109643218A (zh) * 2016-04-26 2019-04-16 谷歌有限责任公司 用户界面元素的动画
CN112256165A (zh) * 2019-12-13 2021-01-22 华为技术有限公司 一种应用图标的显示方法及电子设备
CN112987987A (zh) * 2019-12-13 2021-06-18 华为技术有限公司 一种应用图标的显示方法及电子设备
CN113552987A (zh) * 2021-04-20 2021-10-26 华为技术有限公司 图形界面显示方法、电子设备、介质以及程序产品
CN113568549A (zh) * 2021-04-20 2021-10-29 华为技术有限公司 图形界面显示方法、电子设备、介质以及程序产品

Also Published As

Publication number Publication date
CN115220621A (zh) 2022-10-21

Similar Documents

Publication Publication Date Title
WO2022222738A1 (fr) Procédé d'affichage d'interface graphique, dispositif électronique, support et produit programme
WO2022222830A1 (fr) Procédé d'affichage d'interface graphique, dispositif électronique, support et produit programme
WO2021036571A1 (fr) Procédé d'édition de bureau et dispositif électronique
EP4224831A1 (fr) Procédé de traitement d'images et dispositif électronique
WO2021115194A1 (fr) Procédé d'affichage d'icône d'application et dispositif électronique
CN113805745B (zh) 一种悬浮窗的控制方法及电子设备
WO2022199509A1 (fr) Procédé d'application d'opération de dessin et dispositif électronique
CN113132526B (zh) 一种页面绘制方法及相关装置
CN114579075A (zh) 数据处理方法和相关装置
WO2022222931A1 (fr) Procédé d'affichage d'interface graphique, dispositif électronique, support et produit programme
WO2022222831A1 (fr) Procédé d'affichage d'interface graphique, dispositif électronique, support et produit programme
WO2023130977A1 (fr) Procédé d'affichage d'interface utilisateur, dispositif électronique, support et produit programme
WO2022247541A1 (fr) Procédé et appareil de liaison d'animation d'application
WO2022247542A1 (fr) Procédé et appareil de calcul d'effet dynamique
US20230351665A1 (en) Animation Processing Method and Related Apparatus
CN116700555B (zh) 动效处理方法及电子设备
WO2024017183A1 (fr) Procédé d'affichage pour une commutation d'interface, et dispositif électronique
WO2024099206A1 (fr) Procédé et appareil de traitement d'interface graphique
WO2023236649A1 (fr) Procédé d'affichage de commande de commutateur et dispositif électronique
WO2024017185A1 (fr) Procédé d'affichage d'interface et dispositif électronique
WO2023066177A1 (fr) Procédé d'affichage d'effet d'animation et dispositif électronique
CN114691002A (zh) 一种页面滑动的处理方法及相关装置
CN117290004A (zh) 组件预览的方法和电子设备
CN115700444A (zh) 光标显示方法及电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22791038

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22791038

Country of ref document: EP

Kind code of ref document: A1