CN115220621A - Graphical interface display method, electronic device, medium, and program product


Info

Publication number: CN115220621A
Application number: CN202110426824.5A
Authority: CN (China)
Prior art keywords: elements, distance, animation, electronic device, press
Legal status: Pending
Language: Chinese (zh)
Inventor: 卞超 (Bian Chao)
Current Assignee: Huawei Technologies Co Ltd
Original Assignee: Huawei Technologies Co Ltd
Application filed by: Huawei Technologies Co Ltd
Priority application: CN202110426824.5A
Related international application: PCT/CN2022/087751 (published as WO2022222931A1)

Classifications

    • G06F3/04845: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations (e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range) for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/0483: Interaction with page-structured environments, e.g. book metaphor
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06T13/80: 2D [Two Dimensional] animation, e.g. using sprites

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure relates to a graphical interface display method, an electronic device, a computer-readable storage medium, and a computer program product. According to the graphical interface display method described herein, M user interface (UI) elements are displayed on a screen of an electronic device, M being a positive integer greater than 1. A press acting on a first UI element of the M UI elements is detected. In response to the press, each of N UI elements on the screen is caused to change with a respective animation effect, N being a positive integer between 1 and M-1. Causing the N UI elements to change with respective animation effects comprises: determining a distance between the first UI element and a second UI element of the N UI elements; determining, based on the distance and the position of the press in the UI, the animation effect with which the second UI element changes; and causing the second UI element to change with that animation effect to visually indicate the press. In this way, the connection between the animation effects of different UI elements is strengthened and the relationship between otherwise independent UI elements is highlighted, so that the animation effects of the UI better match users' usage habits and the user experience is significantly improved.

Description

Graphical interface display method, electronic device, medium, and program product
Technical Field
The present disclosure relates generally to the field of information technology, and more particularly, to a graphical interface display method, an electronic device, a computer-readable storage medium, and a computer program product.
Background
With the development of information technology, various types of electronic devices are equipped with various types of screens. As a result, the display effect and style of the on-screen user interface (UI) have become key factors affecting the user experience. Animation has become a vital part of the UI. As the performance of electronic devices such as smartphones improves, animation has advanced as well: animations with high refresh rates, high rendering quality, and high complexity are increasingly common. There is therefore a need to further improve the display of animations to enhance the user experience.
Disclosure of Invention
According to some embodiments of the present disclosure, a graphical interface display method, an electronic device, a medium, and a program product are provided, which can strengthen the connection between the animation effects of different UI elements and highlight the relationship between otherwise independent UI elements, so that the animation effects of the UI better match users' usage habits, thereby significantly improving the user experience.
In a first aspect of the disclosure, a graphical interface display method is provided. According to the graphical interface display method of the first aspect, M user interface (UI) elements are displayed on a screen of an electronic device, M being a positive integer greater than 1. A press acting on a first UI element of the M UI elements is detected. In response to the press, each of N UI elements on the screen is caused to change with a respective animation effect, N being a positive integer between 1 and M-1. Causing the N UI elements to change with respective animation effects comprises: determining a distance between the first UI element and a second UI element of the N UI elements; determining, based on the distance and the position of the press in the UI, the animation effect with which the second UI element changes; and causing the second UI element to change with that animation effect to visually indicate the press. In this way, the connection between the animation effects of different UI elements is strengthened and the relationship between otherwise independent UI elements is highlighted, so that the animation effects of the UI better match users' usage habits and the user experience is significantly improved.
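As a concrete illustration of this flow, the following is a minimal Android-flavored sketch, assuming each UI element is a View; the decay constant, propagation speed, scale range, and class name are illustrative assumptions rather than details from the patent:

```java
// High-level sketch of the claimed flow: on a press, every linked element
// changes with an effect derived from its distance to the pressed element.
import android.view.View;
import java.util.List;

final class PressLinkage {
    static void onPress(View pressed, List<View> others) {
        for (View other : others) {
            float dx = other.getX() - pressed.getX();
            float dy = other.getY() - pressed.getY();
            float distance = (float) Math.hypot(dx, dy);
            // Farther elements shrink less and start later, so the change
            // visually propagates outward from the press like a wave.
            float scale = 1f - 0.1f * (float) Math.exp(-distance / 300f);
            long delayMs = (long) (distance / 2f); // assumed 2 px/ms speed
            other.animate()
                    .scaleX(scale).scaleY(scale)
                    .setStartDelay(delayMs)
                    .setDuration(150)
                    .start();
        }
    }
}
```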
In some implementations, to determine the distance, a first reference point of the first UI element and a second reference point of the second UI element may be determined, and the distance between the first reference point and the second reference point may be determined as the distance. In this way, the distance may be determined based on the determined reference points of the UI elements, which improves the accuracy of the determined distance and the flexibility of the manner of distance determination, thereby improving the user experience.
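For example, if each reference point is taken to be the element's center (one plausible choice), this reduces to a Euclidean distance between centers. A small sketch under that assumption, with an illustrative helper name:

```java
// Sketch: distance between two UI elements via their reference points,
// assumed here to be the view centers.
import android.view.View;

final class ReferencePointDistance {
    static float betweenCenters(View first, View second) {
        float x1 = first.getX() + first.getWidth() / 2f;
        float y1 = first.getY() + first.getHeight() / 2f;
        float x2 = second.getX() + second.getWidth() / 2f;
        float y2 = second.getY() + second.getHeight() / 2f;
        return (float) Math.hypot(x2 - x1, y2 - y1);
    }
}
```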
In some implementations, to determine the distance, a first reference point of the first UI element may be determined; from among a plurality of circles of respective radii centered on the first reference point, the target circle that intersects the second UI element and has the smallest radius may be determined; and the radius of the target circle may be determined as the distance. In this way, the distance between UI elements may be determined more simply and conveniently, which increases the flexibility of the manner of distance determination and thus improves the user experience.
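The radius of that smallest intersecting circle equals the distance from the first reference point to the nearest point of the second element's bounds, so it can be computed directly by clamping rather than by testing circles one by one. A sketch of this reading, with illustrative names:

```java
// Sketch: the minimal radius of a circle centered at (cx, cy) that still
// touches the second element's bounding box, found by clamping the center
// onto the box and measuring the remaining distance.
import android.graphics.RectF;

final class SmallestCircleDistance {
    static float toNearestEdge(float cx, float cy, RectF secondBounds) {
        float nx = Math.max(secondBounds.left, Math.min(cx, secondBounds.right));
        float ny = Math.max(secondBounds.top, Math.min(cy, secondBounds.bottom));
        return (float) Math.hypot(nx - cx, ny - cy); // 0 if the center is inside
    }
}
```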
In some implementations, to determine the distance, a lateral spacing between the first UI element and the second UI element may be determined; a longitudinal spacing between the first UI element and the second UI element may be determined; and the distance may be determined based on either of the following: at least one of the lateral spacing and the longitudinal spacing; or at least one of the lateral spacing and the longitudinal spacing, together with a direction pointing from the second reference point of the second UI element to the first reference point of the first UI element. In this way, the distance between UI elements may be determined based on the spacing between them, which increases the flexibility of the manner of distance determination and thus improves the user experience.
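How the two spacings combine into a single distance is left open here; two plausible rules, both assumptions rather than patent details, are sketched below:

```java
// Sketch: combining separately measured horizontal and vertical gaps into
// a single distance value. Both rules are illustrative design choices.
final class SpacingDistance {
    // Euclidean combination: diagonal neighbors count as farther away.
    static float euclidean(float lateral, float longitudinal) {
        return (float) Math.hypot(lateral, longitudinal);
    }

    // Chebyshev combination: distance counted in "rings" around the
    // pressed element, natural for a grid of icons.
    static float chebyshev(float lateral, float longitudinal) {
        return Math.max(Math.abs(lateral), Math.abs(longitudinal));
    }
}
```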
In some implementations, the method may further include: determining an area of influence of the first UI element based on the size of the first UI element; and determining the UI elements within the area of influence among the M UI elements as the N UI elements. In this way, the UI elements that change in linkage with the pressed UI element can be determined based on its size, so that the animation effect of the UI better matches users' usage habits, thereby significantly improving the user experience.
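One way such a region could be realized is sketched below, assuming a square influence region whose extent scales with the pressed element's larger dimension; the scale factor and names are illustrative:

```java
// Sketch: an influence region proportional to the pressed element's size;
// only elements intersecting it take part in the linkage.
import android.graphics.RectF;
import android.view.View;
import java.util.ArrayList;
import java.util.List;

final class InfluenceArea {
    static List<View> affected(View pressed, List<View> all, float scale) {
        float cx = pressed.getX() + pressed.getWidth() / 2f;
        float cy = pressed.getY() + pressed.getHeight() / 2f;
        float radius = scale * Math.max(pressed.getWidth(), pressed.getHeight());
        RectF region = new RectF(cx - radius, cy - radius, cx + radius, cy + radius);
        List<View> result = new ArrayList<>();
        for (View v : all) {
            if (v == pressed) continue;
            RectF bounds = new RectF(v.getX(), v.getY(),
                    v.getX() + v.getWidth(), v.getY() + v.getHeight());
            if (RectF.intersects(region, bounds)) {
                result.add(v);
            }
        }
        return result;
    }
}
```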
In some implementations, the method may further include: determining the M-1 UI elements other than the first UI element among the M UI elements as the N UI elements. In this way, all UI elements on the screen other than the pressed UI element change in linkage with it, so that the UI elements participating in the linkage can be determined more simply and conveniently, the animation effect of the UI better matches users' usage habits, and the user experience is significantly improved.
In some implementations, the animation effect may include: visually tilting in a seesaw manner about the pressed position, or visually recessing or protruding relative to the pressed position. In this way, the effect of the press on a UI element can be presented intuitively, so that the animation effect of the UI better matches users' usage habits and the user experience is significantly improved.
In some implementations, to determine the animation effect, a first magnitude by which the first UI element changes in response to the press may be determined; and the magnitude by which the second UI element changes in response to the press may be determined based on either of the following: the first magnitude and the distance; or the first magnitude, the distance, and at least one of the size of the second UI element and the size of the first UI element. In this way, the animation effect of the first UI element may be conducted to the second UI element, with the animation effect of the second UI element determined from the distance between the two elements and their sizes, so that the animation effect of the UI better matches users' usage habits, thereby significantly improving the user experience.
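A sketch of such magnitude conduction, assuming an exponential decay with distance and a simple size ratio; the decay form and constants are assumptions, not taken from the patent:

```java
// Sketch: the second element's change magnitude decays with distance from
// the pressed element and is damped further for larger neighbors.
final class MagnitudeConduction {
    static float secondMagnitude(float firstMagnitude, float distance,
                                 float firstSize, float secondSize) {
        float decay = (float) Math.exp(-distance / 300f); // 300 px: tuning value
        float sizeFactor = Math.min(1f, firstSize / secondSize);
        return firstMagnitude * decay * sizeFactor;
    }
}
```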
In some implementations, the first magnitude of change of the first UI element can be determined based on at least one of the following associated with the first UI element: a size of the first UI element, a position of the first reference point of the first UI element, a range of magnitudes over which the first UI element can be changed, a position of the press, a duration of the press, and a predetermined pressing force. In this way, the first magnitude of change of the first UI element can be intuitively, reasonably, and flexibly determined based on various factors associated with the first UI element, whereby the animation effect of the UI can be made to more conform to the usage habit of the user, thereby significantly improving the user experience.
In some implementations, causing the second UI element to change may include: determining a delay time based on the distance; and causing the second UI element to change once the delay time has elapsed after the press occurs. In this way, the linkage change can be visually presented as propagating with distance, so that the animation effect of the UI better matches users' usage habits, thereby significantly improving the user experience.
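For instance, a delay proportional to the distance makes the change spread outward at a constant visual speed. A sketch with an assumed propagation speed and illustrative names:

```java
// Sketch: start the second element's change after a distance-proportional
// delay, so the linkage spreads like a wave from the pressed element.
import android.view.View;

final class LinkageDelay {
    static long delayMillis(float distance, float pixelsPerMs) {
        return (long) (distance / pixelsPerMs);
    }

    static void startDelayed(View second, float distance, Runnable animate) {
        second.postDelayed(animate, delayMillis(distance, 2f)); // 2 px/ms assumed
    }
}
```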
In some implementations, causing the second UI element to change may include: determining the speed at which the second UI element changes in response to the press based on a predefined curve of magnitude over time. In this way, the change of the second UI element can be conveniently controlled based on the predefined curve, so that the animation effect of the UI better matches users' usage habits, thereby significantly improving the user experience.
In some implementations, the predefined curve may be a Bézier curve or an elastic force curve. In this way, the change of the second UI element can be conveniently controlled based on the Bézier curve or the elastic force curve, so that the animation effect of the UI better matches users' usage habits, thereby significantly improving the user experience.
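On Android, both kinds of curve have stock implementations: PathInterpolator takes cubic Bézier control points, and SpringAnimation from androidx.dynamicanimation produces an elastic curve. A sketch with assumed control points and spring parameters, not values from the patent:

```java
// Sketch: two ways to drive a press change with a predefined
// magnitude-time curve on Android.
import android.view.View;
import android.view.animation.PathInterpolator;
import androidx.dynamicanimation.animation.DynamicAnimation;
import androidx.dynamicanimation.animation.SpringAnimation;
import androidx.dynamicanimation.animation.SpringForce;

final class CurveDrivenChange {
    // Cubic Bezier pacing: ease-out-like control points (assumed values).
    static void shrinkWithBezier(View v, float targetScale) {
        v.animate()
                .scaleX(targetScale).scaleY(targetScale)
                .setInterpolator(new PathInterpolator(0.25f, 0.1f, 0.25f, 1f))
                .setDuration(150)
                .start();
    }

    // Elastic pacing: a spring settles the scale at the target value.
    static void shrinkWithSpring(View v, float targetScale) {
        SpringForce spring = new SpringForce(targetScale)
                .setDampingRatio(SpringForce.DAMPING_RATIO_MEDIUM_BOUNCY)
                .setStiffness(SpringForce.STIFFNESS_LOW);
        new SpringAnimation(v, DynamicAnimation.SCALE_X).setSpring(spring).start();
        new SpringAnimation(v, DynamicAnimation.SCALE_Y).setSpring(spring).start();
    }
}
```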
In some implementations, the method may further include: restoring the changed second UI element to its original state. In this way, the UI element can be restored after the press is released, so that the animation effect of the UI better matches users' usage habits, thereby significantly improving the user experience.
In some implementations, the method can be implemented by at least one of an AAR-format file, a JAR-format file, and a system interface. In this way, a graphical interface display with linkage changes can be achieved simply and conveniently.
In a second aspect of the disclosure, an electronic device is provided. The electronic device comprises a processor and a memory storing instructions that, when executed by the processor, cause the electronic device to perform any of the methods according to the first aspect and implementations thereof.
In a third aspect of the disclosure, a computer-readable storage medium is provided. A computer readable storage medium has instructions stored thereon, which when executed by a processor cause an electronic device to perform any of the methods according to the first aspect and implementations thereof.
In a fourth aspect of the disclosure, a computer program product is provided. The computer program product comprises instructions which, when executed by a processor, cause the electronic device to perform any of the methods according to the first aspect and its implementations.
Drawings
The features, advantages and other aspects of various implementations of the disclosure will become more apparent with reference to the following detailed description when taken in conjunction with the accompanying drawings. Several implementations of the present disclosure are illustrated herein by way of example, and not by way of limitation, in the figures of the accompanying drawings:
Figs. 1A and 1B show schematic diagrams of the hardware structure and software structure of an electronic device in which embodiments of the present disclosure can be implemented.
FIG. 2 illustrates a block diagram of another electronic device in which embodiments of the present disclosure may be implemented.
Fig. 3A-3C respectively illustrate schematic diagrams of example UIs, according to some embodiments of the present disclosure.
FIG. 4 illustrates a schematic diagram of an example drag linkage, according to some embodiments of the present disclosure.
Fig. 5A and 5B show schematic diagrams of an example velocity time curve and an example displacement time curve, respectively, of a friction force model, according to some embodiments of the present disclosure.
Fig. 6 illustrates a schematic diagram of a limited and non-limited example of a mobile location, according to some embodiments of the present disclosure.
Figs. 7A-7C illustrate schematic diagrams of examples of curves of the amount of spring deflection x over time t in a critically damped state, an under-damped state, and an over-damped state, respectively, according to some embodiments of the present disclosure.
Fig. 8 shows a flowchart of a graphical interface display method according to an embodiment of the present disclosure.
Fig. 9 shows a schematic diagram of an example of an area of influence of a UI element according to an embodiment of the disclosure.
Fig. 10 shows a schematic diagram of an example of a determination of a distance according to an embodiment of the present disclosure.
Figs. 11A-11C show schematic diagrams of examples of determination of distances according to embodiments of the present disclosure.
Fig. 12 shows a schematic diagram of an example of a determination of a distance according to an embodiment of the present disclosure.
Fig. 13 shows a schematic diagram of an example of determination of a distance according to an embodiment of the present disclosure.
Fig. 14A and 14B show schematic diagrams of examples of determination of distances according to embodiments of the present disclosure.
Fig. 15 shows a schematic diagram of an example of a delay time according to an embodiment of the present disclosure.
Fig. 16A illustrates a schematic diagram of an example of a scenario in which a UI element moves fully with the hand, according to an embodiment of the disclosure.
Fig. 16B illustrates a schematic diagram of an example of a displacement time curve for a scene in which a UI element moves fully with a hand, according to an embodiment of the disclosure.
Fig. 17A illustrates a schematic diagram of an example of a scenario in which a UI element does not completely follow hand movement, according to an embodiment of the disclosure.
Fig. 17B illustrates a schematic diagram of an example of a displacement time curve for a scene in which UI elements do not completely follow hand movement, according to an embodiment of the disclosure.
Fig. 18A illustrates a schematic diagram of an example of a scenario in which a UI element moves fully with the hand, according to an embodiment of the disclosure.
Fig. 18B shows a schematic diagram of an example of a displacement time curve for a scene in which a UI element moves fully with a hand, according to an embodiment of the disclosure.
Fig. 19A illustrates a schematic diagram of an example of a scene in which a UI element moves fully with the hand according to an embodiment of the disclosure.
Fig. 19B illustrates a schematic diagram of an example of a displacement time curve for a scene in which a UI element is moving fully with a hand, according to an embodiment of the disclosure.
Fig. 20 shows a schematic diagram of an example of a change in a UI element when pressed, according to some embodiments of the disclosure.
Fig. 21 illustrates a schematic diagram of an example of a change in a UI element when pressed at different locations according to some embodiments of the disclosure.
Fig. 22 illustrates an exemplary view of an example of a change of a UI element at different pressing forces according to some embodiments of the present disclosure.
Fig. 23 illustrates a schematic diagram of an example of a change in UI elements at different press durations, according to some embodiments of the present disclosure.
Fig. 24 illustrates a schematic diagram of an example of a change in UI elements at different sizes, according to some embodiments of the disclosure.
Fig. 25 shows a flowchart of a graphical interface display method according to an embodiment of the present disclosure.
Fig. 26 shows a schematic diagram of an example of depth linkage of N UI elements according to an embodiment of the disclosure.
Fig. 27 shows a schematic diagram of an example of an area of influence of a UI element according to an embodiment of the disclosure.
Fig. 28 illustrates a schematic diagram of an example of scaling of a distance-based UI element according to an embodiment of the disclosure.
Fig. 29 shows a schematic diagram of an example of a delay time according to an embodiment of the present disclosure.
Fig. 30 shows a schematic diagram of an example of scaling of UI elements with delay times according to the present disclosure.
Fig. 31 shows a schematic diagram of an example of displacement of movement of a UI element according to an embodiment of the present disclosure.
Fig. 32A-32B illustrate schematic diagrams of examples of restoration of a UI element without displacement and restoration of a UI element with displacement, respectively, according to an embodiment of the disclosure.
Fig. 33A-33B show schematic diagrams of examples of size-time curves and displacement-time curves, respectively, of a restoration of a UI element with a rebound effect according to an embodiment of the present disclosure, and fig. 33C-33D show schematic diagrams of examples of size-time curves and displacement-time curves, respectively, of a restoration of a UI element with a rebound effect of a plurality of rebounds with reduced rebound amplitudes according to an embodiment of the present disclosure.
Fig. 34 illustrates a schematic diagram of an example of a change in a UI element that is a rigid body when pressed, according to some embodiments of the present disclosure.
Fig. 35 illustrates a schematic diagram of an example of compression and stretching of a spring simulating compression of a UI element, according to some embodiments of the disclosure.
Fig. 36 illustrates a schematic diagram of an example of a change in a UI element that is a non-rigid body when pressed, according to some embodiments of the disclosure.
Fig. 37 illustrates an illustrative view of an example of a change in UI elements at different pressing forces according to some embodiments of the disclosure.
Fig. 38 illustrates a schematic diagram of an example of a change in UI elements at different press durations, according to some embodiments of the disclosure.
Fig. 39 illustrates a schematic diagram of an example of a change in UI elements at different sizes, according to some embodiments of the disclosure.
Fig. 40 shows a flowchart of a graphical interface display method according to an embodiment of the present disclosure.
Fig. 41 shows a schematic diagram of an example of pressure linkage of N UI elements, according to an embodiment of the disclosure.
Fig. 42 shows a schematic diagram of another example of pressure linkage of N UI elements, according to an embodiment of the disclosure.
Fig. 43 shows a schematic diagram of an example of an area of influence of a UI element according to an embodiment of the present disclosure.
Fig. 44 illustrates a schematic diagram of an example of a change in distance-based UI elements according to an embodiment of the disclosure.
Fig. 45 shows a schematic diagram of an example of a delay time according to an embodiment of the present disclosure.
Fig. 46A illustrates a schematic diagram of an example of a restoration of a UI element according to an embodiment of the disclosure.
Fig. 46B shows a schematic diagram of an example of an angular time curve of a recovery of a UI element with a bounce effect according to an embodiment of the disclosure.
Fig. 46C shows a schematic diagram of an example of an angular time curve of recovery of a UI element having a bounce effect of a number of bounces with decreasing bounce amplitude, according to an embodiment of the disclosure.
FIG. 47 illustrates an animation implementation schematic in accordance with an embodiment of the disclosure.
FIG. 48 illustrates a schematic diagram of a system framework for implementing a "linkage" animation effect capability or function, according to an embodiment of the present disclosure.
FIG. 49 illustrates a schematic diagram of the relationship between the application side and the UI framework side involved in the "linkage" animation effect capability or function, according to an embodiment of the disclosure.
FIG. 50 shows a schematic diagram specifically illustrating three ways of implementing the "linkage" animation effect capability or function, according to an embodiment of the disclosure.
Detailed Description
Some example implementations of the present disclosure will be described in more detail below with reference to the accompanying drawings. While some example implementations of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the example implementations set forth herein. Rather, these implementations are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
The term "including" and variations thereof as used herein is intended to be open-ended, i.e., "including but not limited to". Unless stated otherwise, the term "or" means "and/or". The term "based on" means "based at least in part on". The terms "embodiment" and "some embodiments" mean "at least some embodiments". The terms "first", "second", etc. are used to describe, to distinguish between different objects, etc., and do not denote any order or importance, nor do they limit "first" and "second" to different types.
The term "UI" as used herein refers to an interface for user interaction and information exchange with an application or operating system that enables conversion between an internal form of information and a form that is acceptable to the user. For example, the UI of the application is source code written in a specific computer language such as java (java), extensible markup language (XML), and the like, and the UI source code is parsed, rendered, and finally presented as content that can be recognized by a user, such as a picture, a text, a button, and the like on the electronic device.
In some embodiments, the properties and content of UI elements in the UI are defined by tags or nodes; for example, XML specifies the UI elements that the UI contains through nodes such as <TextView>, <ImgView>, and <VideoView>. A node corresponds to a UI element or attribute in the UI, and after parsing and rendering the node is presented as user-viewable content. In addition, the UIs of many applications, such as hybrid applications, typically include web pages. A web page may be understood as a special UI element embedded in the UI of an application. A web page is source code written in a specific computer language, such as Hypertext Markup Language (HTML), Cascading Style Sheets (CSS), or JavaScript (JS), and the source code of a web page can be loaded and displayed as user-recognizable content by a browser or by a web page display component with browser-like functionality. The specific content contained in a web page is likewise defined by tags or nodes in its source code; for example, HTML defines the elements and attributes of the web page through <p>, <img>, <video>, and <canvas>.
The term "UI element" as used herein includes, but is not limited to: a window (window), a scroll bar (scrollbar), a table view (tablevew), a button (button), a menu bar (menu bar), a text box (text box), a navigation bar, a tool bar (toolbar), an image (image), a static text (tabtext), a component (Widget), and the like.
Some of the flows described in the embodiments of the present disclosure include a plurality of operations or steps occurring in a specific order, but it should be understood that these operations or steps may be executed out of the order in which they appear or in parallel. The order numbers of the operations merely distinguish the various operations and do not by themselves represent any execution order. In addition, the flows may include more or fewer operations, the operations or steps may be performed sequentially or in parallel, and the operations or steps may be combined.
In mobile operating systems such as Android and iOS, animations are essentially the real-time display of the user interface (UI) or UI elements at the screen refresh rate. Because of the persistence of human vision, the user perceives the picture as moving. An animation changes from its initial state to its final state over an animation time. During this transformation, the animation can be controlled by its animation type and animation transformation form. For example, animation types may include displacement animation, rotation animation, zoom animation, and transparency animation, among others. The animation transformation form can be controlled by controllers such as interpolators and evaluators, which control the speed at which the animation transitions over the animation time.
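As a minimal illustration of this mechanism using plain Android APIs (the concrete duration and distance are illustrative values):

```java
// Sketch: a displacement animation whose pacing over the animation time
// is set by an interpolator and whose per-frame values come from the
// animator's evaluated output.
import android.animation.ValueAnimator;
import android.view.View;
import android.view.animation.AccelerateDecelerateInterpolator;

final class SimpleDisplacementAnimation {
    static void slideRight(final View target, float distancePx) {
        ValueAnimator animator = ValueAnimator.ofFloat(0f, distancePx);
        animator.setDuration(300); // the animation time
        animator.setInterpolator(new AccelerateDecelerateInterpolator());
        animator.addUpdateListener(a ->
                target.setTranslationX((float) a.getAnimatedValue()));
        animator.start();
    }
}
```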
Conventionally, however, an animation is merely a combination of simple animation effects, so the animation effects are monotonous, do not conform to physical laws, and take no account of real usage scenarios, users' usage habits, and the like.
Therefore, the embodiments of the disclosure provide a new scheme for displaying a graphical interface. Embodiments of the present disclosure relate to the linkage of UI elements in a UI with respect to animation effects, including drag linkage, depth linkage, and pressure linkage. In a linkage, the target UI element that is operated on may affect other UI elements that are not operated on. In particular, triggering the animation effect of the target UI element may jointly trigger the animation effects of one or more other UI elements, or even of all other UI elements in the UI.
Thereby, the connection between the animation effects of different UI elements may be strengthened and the relationship between individual UI elements may be highlighted. Compared with traditional animations, which have monotonous effects and treat UI elements as independent and unrelated, the embodiments of the disclosure make animation effects conform better to physical laws and take real usage scenarios and users' habits into account, thereby significantly improving the user experience.
Some example embodiments of the disclosure will be described below with reference to fig. 1A to 46.
Fig. 1A shows a schematic diagram of a hardware structure of an electronic device 100 in which an embodiment of the present disclosure may be implemented. As shown in fig. 1A, the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the illustrated structure of the embodiments of the present disclosure does not constitute a specific limitation on the electronic device 100. In other embodiments of the present disclosure, electronic device 100 may include more or fewer components than shown, or combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), and the like. The different processing units may be separate devices or may be integrated into one or more processors. The controller can generate an operation control signal according to the instruction operation code and the time sequence signal to finish the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bidirectional synchronous serial bus comprising a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, the processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example, the processor 110 may be coupled to the touch sensor 180K through an I2C interface, such that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement the touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 through an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, so as to implement a function of answering a call through a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit the audio signal to the wireless communication module 160 through the PCM interface, so as to implement the function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example, the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. Processor 110 and display screen 194 communicate via a DSI interface to implement display functions of electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. It can also be used to connect earphones and play audio through them. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the interface connection relationship between the modules illustrated in the embodiment of the disclosure is only an illustrative example, and does not constitute a limitation to the structure of the electronic device 100. In other embodiments of the present disclosure, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device 100 via the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like. The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example, the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including wireless communication of 2G/3G/4G/5G/6G, etc. applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then passed to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs), such as wireless fidelity (Wi-Fi) networks, bluetooth (BT), global Navigation Satellite System (GNSS), frequency Modulation (FM), near Field Communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160 so that electronic device 100 can communicate with networks and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), general Packet Radio Service (GPRS), code division multiple access (code division multiple access, CDMA), wideband Code Division Multiple Access (WCDMA), time division code division multiple access (TD-SCDMA), long Term Evolution (LTE), 5G and subsequent evolution standards, BT, GNSS, WLAN, NFC, FM, and/or IR technologies, and the like. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The electronic device 100 implements display functions through the GPU, the display screen 194, and the application processor, etc. The GPU is a microprocessor for image processing, connected to the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement a photographing function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like. The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera's photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera's photosensitive element transmits the electrical signal to the ISP for processing and conversion into an image visible to the naked eye. The ISP can also perform algorithmic optimization of the image's noise, brightness, and skin tone, and can optimize parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP to be converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as Moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
The NPU is a neural-network (NN) computing processor, which processes input information quickly by referring to a biological neural network structure, for example, by referring to a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100, for example, image recognition, face recognition, voice recognition, text understanding, and the like, may be implemented by the NPU.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. Such as saving files of music, video, etc. in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The data storage area may store data (e.g., audio data, a phone book, etc.) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one disk memory device, a flash memory device, a universal flash memory (UFS), and the like. The processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor, etc. Such as music playing, recording, etc.
The SIM card interface 195 is used to connect a SIM card. A SIM card can be attached to or detached from the electronic device 100 by being inserted into or pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time; the types of the multiple cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards, and with an external memory card. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs eSIMs, i.e., embedded SIM cards. An eSIM card can be embedded in the electronic device 100 and cannot be separated from it.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present disclosure takes a mobile operating system of a layered architecture as an example, and illustrates a software structure of the electronic device 100.
Fig. 1B is a schematic diagram of a software structure of the electronic device 100 of the embodiment of the present disclosure. The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the operating system may be divided into four layers, an application layer, an application framework layer, an operating system runtime (runtime) and system libraries, and a kernel layer, from top to bottom, respectively.
The application layer may include a series of application packages. As shown in fig. 1B, the application packages may include camera, gallery, calendar, phone, map, navigation, WLAN, bluetooth, music, video, short message, etc. applications.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 1B, the application framework layers may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, and the like. The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like. The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, dialed and received telephone calls, browsing history and bookmarks, phone books, etc. The view system includes visual controls, such as controls to display text, controls to display images, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying an image. The phone manager is used to provide communication functions of the electronic device 100. Such as management of call status (including on, off, etc.). The resource manager provides various resources for the application, such as localized strings, icons, images, layout files, video files, and the like. The notification manager enables the application program to display notification information in the status bar, can be used for conveying notification type messages, can disappear automatically after a short pause, and does not need user interaction. Such as a notification manager used to notify download completion, message alerts, etc. The notification manager may also be a notification that appears in the form of a chart or scroll bar text at the top status bar of the system, such as a notification of a background running application, or a notification that appears on the screen in the form of a dialog window. For example, prompting text information in the status bar, sounding a prompt tone, vibrating the electronic device, flashing an indicator light, etc.
With continued reference to FIG. 1B, the operating system runtime includes a core library and a virtual machine, and is responsible for scheduling and management of the operating system. The core library comprises two parts: one part contains the functions that the Java language needs to call, and the other part is the core library of the operating system. The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files, and performs functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection. The system library may include a plurality of functional modules, such as a surface manager, media libraries, a three-dimensional graphics processing library (e.g., OpenGL ES), and a 2D graphics engine (e.g., SGL).
The surface manager is used to manage the display subsystem and provides fusion of 2D and 3D layers for multiple applications. The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files, etc. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, and the like. The three-dimensional graphics processing library is used to implement three-dimensional graphics drawing, image rendering, composition, layer processing, and the like. The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, and a sensor driver.
Fig. 2 illustrates a block diagram of another electronic device 200 in which embodiments of the present disclosure may be implemented. As shown in fig. 2, electronic device 200 may be in the form of a general purpose computing device. The components of electronic device 200 may include, but are not limited to, one or more processors or processing units 210, memory 220, storage 230, one or more communication units 240, one or more input devices 250, and one or more output devices 260. The processing unit 210 may be a real or virtual processor and can perform various processes according to programs stored in the memory 220. In a multiprocessor system, multiple processing units execute computer-executable instructions in parallel to improve the parallel processing capability of the electronic device 200.
Electronic device 200 typically includes a number of computer storage media. Such media may be any available media that is accessible by electronic device 200 and includes, but is not limited to, volatile and non-volatile media, removable and non-removable media. The memory 220 may be volatile memory (e.g., registers, cache, random access memory (RAM)), non-volatile memory (e.g., read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory), or some combination thereof. Storage 230 may be a removable or non-removable medium and may include a machine-readable medium, such as a flash drive, a magnetic disk, or any other medium that can be used to store information and/or data (e.g., training data for training) and that can be accessed within electronic device 200.
The electronic device 200 may further include additional removable/non-removable, volatile/non-volatile storage media. Although not shown in FIG. 2, a magnetic disk drive for reading from and writing to a removable, non-volatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, non-volatile optical disk may be provided. In these cases, each drive may be connected to a bus (not shown) by one or more data media interfaces. Memory 220 may include a computer program product 225 having one or more program modules configured to perform the methods or processes of embodiments of the present disclosure.
Communications unit 240 enables communications with other computing devices over a communications medium. Additionally, the functionality of the components of the electronic device 200 may be implemented in a single computing cluster or multiple computing machines, which are capable of communicating over a communications connection. Thus, the electronic device 200 may operate in a networked environment using logical connections to one or more other servers, network Personal Computers (PCs), or another network node.
The input device 250 may be one or more input devices such as a mouse, keyboard, trackball, or the like. Output device 260 may be one or more output devices such as a display, speakers, printer, or the like. In an embodiment of the present disclosure, the output device 260 may include a touch screen having a touch sensor, which may receive a touch input of a user. Electronic device 200 may also communicate with one or more external devices (not shown), such as storage devices, display devices, etc., with one or more devices that enable a user to interact with electronic device 200, or with any devices (e.g., network cards, modems, etc.) that enable electronic device 200 to communicate with one or more other computing devices, as desired, via communication unit 240. Such communication may be performed via input/output (I/O) interfaces (not shown).
It should be understood that the electronic device 100 illustrated in fig. 1 and the electronic device 200 illustrated in fig. 2 above are merely two example electronic devices capable of implementing one or more embodiments of the present disclosure and should not constitute any limitation as to the scope and functionality of the embodiments described herein.
Drag linkage
Currently, to display more and richer information, screens of electronic devices are becoming larger, UI layouts on screens are becoming more complicated, and UI elements differ increasingly in size and shape. In this case, a UI may contain various irregular UI elements arranged in various irregular layouts. For example, there are various irregular controls, cards, pictures, and covers of UI elements in the minus one screen, control center, application market, gallery, etc. of a smartphone.
Traditionally, there is no linkage of animation effects between the various irregular UI elements. That is, triggering the animation effect of a target UI element does not jointly trigger the animation effects of one or more other UI elements; the animation effect of each UI element is independent and unrelated. Traditional animation effects are therefore monotonous and rigid, and cannot reflect the relationships among UI elements.
Embodiments of the present disclosure relate to the linkage of animation effects of UI elements in a UI when dragged, also referred to as drag linkage. In drag linkage, the dragged target UI element may affect other UI elements that are not dragged. In particular, in drag linkage, triggering the animation effect of the target UI element may jointly trigger the animation effects of one or more other UI elements, or even of all other UI elements in the UI. For example, in drag linkage, while a target UI element is being dragged, in addition to the target UI element moving with an animation effect as it is dragged, other UI elements may also move with corresponding animation effects, thereby visually presenting a linked drag.
Thus, the connection between the animation effects of different UI elements can be strengthened, and the relationships among individual UI elements can be highlighted. Compared with traditional animations, whose effects are monotonous and whose UI elements are independent and unrelated, embodiments of the present disclosure make animation effects better conform to physical laws and take real usage scenarios and user habits into account, thereby significantly improving user experience.
Some example embodiments of drag linkages will be described below with reference to fig. 3A-19B.
Fig. 3A-3C illustrate schematic diagrams of example UIs 300A-300C, according to some embodiments of the present disclosure. In some embodiments, the UI elements may have irregular sizes and shapes. For example, as shown in FIG. 3A, UI 300A may include multiple UI elements, such as UI elements 1-13, where UI elements 1, 2, 4, and 5 have different sizes and shapes. Further, the UI may also have irregular parts. For example, as shown in FIG. 3B, the positions of UI elements 16 and 18 in UI 300B are left blank, i.e., no UI elements are present there. However, embodiments of the present disclosure are equally applicable to regular layouts, sizes, and shapes. For example, as shown in FIG. 3C, UI 300C has a regular layout, and UI elements 25-40 in UI 300C all have the same size and shape. It should be understood that embodiments of the present disclosure are applicable to any suitable regular or irregular layout, size, and shape.
The UI elements in the UI may be dragged. For example, a user may drag a UI element when the user desires to move it. As an example, a user may drag a UI element when the user desires to change the position of the UI element in the UI, merge the UI element with another UI element, or place the UI element in a toolbar or trash, etc. Upon detecting a drag at the UI element, the UI element may move with an animation effect to visually present the drag action. As described above, in drag linkage, a dragged target UI element may affect other UI elements that are not dragged. Specifically, when a target UI element is dragged, in addition to the target UI element moving with an animation effect along with the drag, other UI elements may also move with corresponding animation effects, thereby visually presenting the linked drag.
FIG. 4 illustrates a schematic diagram of an example drag linkage 400 according to some embodiments of the present disclosure. As shown in fig. 4, in the case where a drag at the UI element 3 is detected, the UI element 3 may move with an animation effect to visually present the drag action. In addition to the UI element 3 moving with an animation effect as it is dragged, the other UI elements 2 and 4 may also move with corresponding animation effects, thereby visually presenting the coordinated dragging. For clarity, FIG. 4 only shows the coordinated movement of UI elements 2-4. It should be understood that the coordinated movement may occur for any two or more UI elements in any UI, for example, any two or more UI elements in UIs 300A-300C.
Specifically, a drag at UI element 3 is detected at time T01, causing UI element 3 and the other UI elements 2 and 4 to move. At time T02, the spacing between UI element 3 and UI element 4, which is located in the drag direction, decreases. The spacing may represent the distance between the respective reference points of the two UI elements. In some embodiments, the center point of a UI element may be determined as its reference point. Alternatively, the spacing may represent the distance between adjacent boundaries of two UI elements. As shown in fig. 4, in some embodiments, UI element 3 may even cover at least a portion of UI element 4. Meanwhile, the spacing between UI element 3 and UI element 2, which is located in the direction opposite to the drag direction, increases. This means that the speed of movement of UI element 3 is greater than the speeds of UI elements 2 and 4. At time T03, the spacing between UI element 3 and UI element 4 in the drag direction becomes larger, and the spacing between UI element 3 and UI element 2 in the opposite direction becomes smaller. This means that the speed of movement of UI element 3 is less than the speeds of UI elements 2 and 4. At time T04, UI element 3 and the other elements 2 and 4 have moved the predetermined distance and thus stop moving. In some embodiments, the predetermined distance may be determined based on a friction model. The manner of determining the distance based on the friction model will be described in detail below, and is therefore omitted here.
Examples of the UI element movement linkage are described above, and the principle of the UI element movement will be described below.
Movement of the UI elements may be controlled by one or more of the following factors: a friction factor, a linkage factor, a follow-hand factor, a follow-hand ratio factor, a release rebound factor, and/or an inertial rebound factor. For example, the friction factor may control where the UI element stops moving. The linkage factor may control the animation effects of the other UI elements. The follow-hand factor may control the follow-hand movement of the UI element, such as its movement while being dragged without crossing a boundary. The follow-hand ratio factor may control the ratio of the UI element's follow-hand movement, such as the ratio of the displacement of the UI element to the displacement of the hand when dragging continues after a boundary is crossed. The release rebound factor may control the reset of the UI element after the hand is released, such as the reset after a drag that crossed a boundary is released. The inertial rebound factor may control the rebound of the UI element after it crosses a boundary. For example, when a UI element moves out of bounds, the friction factor alone may not cause it to stop moving; in this case, the inertial rebound factor may control its rebound after the boundary is crossed.
Hereinafter, the friction model associated with the friction factor and the elastic force model associated with the linkage factor will be described in detail. In general, the friction model may be utilized to determine the distance that a UI element is to be moved, and thus the source and destination positions of the movement of the UI element. Further, the spring parameters of the other UI elements moving in linkage may be determined, in a conduction manner that will be described in detail below, based on the spring parameters (e.g., elastic coefficient, damping coefficient) of the dragged UI element, so that during the movement of the dragged UI element and the UI elements moving in linkage, each UI element is controlled, based on its respective spring parameters, to move in accordance with the elastic force model.
The friction model may be used to determine the distance the UI element is to be moved, e.g., after the hand is released or a fling occurs. The distance can be determined by the following equations (1) and (2):
V(t) = V0 × e^(−4.2 × f_friction × t)  (1),

S(t) = (V0 / (4.2 × f_friction)) × (1 − e^(−4.2 × f_friction × t))  (2),
where f_friction represents the friction force, which is configurable by the electronic device or the user; t represents the time of movement; V0 represents the initial speed, which is configurable by the electronic device or the user, or is obtained by detecting the speed of the user's drag; V(t) represents the final velocity, which is 0 since the movement of the UI element will eventually stop; e represents the natural constant; and S(t) represents the distance the UI element is to be moved. It should be understood that the constants in the above equations are merely examples, which may be set by the electronic device or the user.
As can be seen from the above equations, the time t of movement can be determined by equation (1). Thus, the distance S(t) can be further determined by equation (2). In this way, the distance that the UI element is to be moved can be easily determined. Furthermore, because various parameters in the equations (e.g., friction force, initial velocity, etc.) are configurable, the distance that a UI element will move can be influenced by configuring these parameters, thereby improving the flexibility of the animation effect and the user experience.
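As an illustration, the following Kotlin sketch shows one way equations (1) and (2) could be evaluated, assuming the exponential-decay form reconstructed above with the example constant 4.2 and a hypothetical stop-velocity threshold; all names and values are illustrative and not part of the disclosed method.

```kotlin
import kotlin.math.exp
import kotlin.math.ln

// Example constants; both are assumptions, configurable in practice.
const val DECAY = 4.2
const val STOP_VELOCITY = 1.0 // px/ms; below this the element is treated as stopped

// Equation (1): V(t) = V0 * e^(-4.2 * f * t); solving V(t) = STOP_VELOCITY
// for t gives the duration of the movement.
fun movementTime(v0: Double, friction: Double): Double =
    ln(v0 / STOP_VELOCITY) / (DECAY * friction)

// Equation (2): S(t) = V0 / (4.2 * f) * (1 - e^(-4.2 * f * t)).
fun movementDistance(v0: Double, friction: Double, t: Double): Double =
    v0 / (DECAY * friction) * (1 - exp(-DECAY * friction * t))

fun main() {
    val v0 = 3.0          // initial velocity in px/ms, e.g. from drag detection
    val friction = 0.002  // configurable friction force
    val t = movementTime(v0, friction)
    println("stops after %.0f ms, travels %.0f px".format(t, movementDistance(v0, friction, t)))
}
```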
Movement of the UI element that conforms to the friction model will satisfy the speed-time curve and the displacement-time curve of the friction model. Fig. 5A and 5B illustrate schematic diagrams of an example velocity time curve 500A and an example displacement time curve 500B, respectively, of a friction force model, according to some embodiments of the present disclosure. As shown in fig. 5A and 5B, in the case of receiving only the frictional force, the movement speed of the UI element decreases to 0 with time, and the movement distance of the UI element increases with time until the movement stops.
In the above, it is described how the friction force factor controls the UI element to stop moving. Further, whether the location to which a UI element can be moved is limited may also control where the UI element stops moving.
In particular, in some embodiments, the location to which the UI element can be moved is not limited. In this case, the distance determined based on the friction model is the distance that the UI element will move. However, in some embodiments, the locations to which the UI element can be moved are limited. In other words, the UI element can only be moved to a predetermined location. In this case, although the distance to be moved by the UI element may be determined based on the friction model, if the UI element is not located at the predetermined position after being moved by the distance, the distance to be moved by the UI element needs to be adjusted so that the UI element can be moved to the predetermined position. For example, the UI element may be moved to a predetermined position closest to a stop position determined based on the friction model. Thus, the distance that the UI element is to be moved may be determined based on both the friction model and the predetermined location.
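A minimal sketch of such an adjustment, assuming a hypothetical list of predetermined positions:

```kotlin
import kotlin.math.abs

// Hypothetical helper: adjust the friction-model stop position to the nearest
// predetermined position, as described above; `positions` is illustrative.
fun snapToPredetermined(stop: Double, positions: List<Double>): Double =
    positions.minByOrNull { abs(it - stop) } ?: stop

// e.g. snapToPredetermined(238.0, listOf(0.0, 200.0, 400.0)) returns 200.0
```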
Fig. 6 illustrates a schematic diagram of an example 600 of limited and non-limited locations of movement, according to some embodiments of the present disclosure. As shown in fig. 6, in the case where the position to which the UI element can be moved is limited, the UI element can be moved only to a predetermined position 630 from a source position 620 on a screen 610. In contrast, the UI element can be moved to an arbitrary position 640 from the source position 620 on the screen 610 without limitation on the position to which the UI element can be moved.
Further, in some embodiments, the range in which the UI element can move may also be limited. A UI element that moves beyond this range will be considered to be out of bounds. The range may be any suitable range. For example, the range may be one in which the distance from the screen boundary is smaller than a predetermined proportion of the screen size or a predetermined number of pixels (such as 10% or 1000 pixels), or one in which the distance from the source position to the destination position of the UI element is smaller than a predetermined proportion of the screen size or a predetermined number of pixels (such as 50% or 10000 pixels). Thus, the distance that the UI element is to be moved may also be determined based on this range.
The friction force model is described in detail above, and the elastic force model is further described below.
The curve of displacement over time of the movement of the UI element may be an elastic force curve conforming to an elastic force model. Since the displacement and time may determine the velocity, the velocity of the movement of the UI element also follows the elastic force model. To this end, it may be considered that the movement of the UI element may mimic the laws of motion of a spring. It should be understood that for purposes of describing the elastic force model herein, the displacement versus time curve of movement of the UI element is described as an elastic force curve. However, the curve of the displacement of the movement of the UI element over time may also be any suitable predefined curve, such as a bezier curve.
The elastic force model may be based on damping vibration equations (3) and (4) under hooke's law:
f=ma (3),
m × (d²x/dt²) = −k × x − g × (dx/dt)  (4),
where f represents the force that the spring experiences during vibration (i.e., during motion), which may be configurable by the electronic device or the user; a represents the acceleration of the movement; t represents the time of movement; k represents the elastic coefficient of the spring; x represents the deformation amount of the spring; g represents the damping coefficient of the spring; and m represents the mass of the UI element, where the size of the UI element may be treated as equivalent to its mass.
The elastic coefficient k is the amount of force required per unit deformation of the spring. The greater the elastic coefficient k, the shorter the time for the spring to return from the maximum amplitude to the equilibrium position, and vice versa. The elastic coefficient k is configurable by the electronic device or the user. In some embodiments, the elastic coefficient k may range from 1 to 999, with a suggested range of 150 to 400.
The damping coefficient is a quantitative representation of the damping force (e.g., fluid resistance, friction, etc.) of the spring during vibration, which may cause the spring to gradually decrease in amplitude until it comes to rest at an equilibrium position. The greater the damping coefficient, the easier it is for the spring to stop in the equilibrium position, and vice versa. The damping coefficient g is configurable by the electronic device or the user. In some embodiments, the damping coefficient g may range from 1 to 99.
Further, it should be appreciated that the distance S (t) over which the UI element is to be moved may be determined based on a friction model, as described above. In the case where the UI element moves by the distance S (t), S (t) may be regarded as the amount of deformation of the spring. Thus, S (t) is equal to x.
The elastic force model has three damping states, namely a critical damping state, an under-damping state and an over-damping state. The critical damping is in accordance with equation (5) below:
g² = 4 × m × k  (5),
wherein g represents the damping coefficient of the spring; m represents the size of the UI element; k represents the spring constant of the spring.
Taking critical damping as the reference, if the damping is greater than the critical damping, the state is an over-damped state; if the damping is less than the critical damping, the state is an under-damped state.
The displacement-time curves of the UI elements differ under different damping states. FIGS. 7A-7C illustrate schematic diagrams of example curves 700A-700C of the spring deformation amount x over time t in a critically damped state, an under-damped state, and an over-damped state, respectively, according to some embodiments of the present disclosure. As shown in fig. 7A, in the critically damped state, the spring returns to the equilibrium position in the shortest time, stops moving most smoothly, and no longer oscillates. As shown in fig. 7B, in the under-damped state, the spring slowly decreases in amplitude through multiple oscillations, eventually returning to the equilibrium position. As shown in fig. 7C, in the over-damped state, the spring barely vibrates, its amplitude gradually decreasing until it reaches the equilibrium position.
As described above, the curve of the displacement of the UI element's movement over time may be an elastic force curve conforming to the elastic force model. The movement of the UI element can thus simulate the motion of a spring; that is, the variation of the UI element's displacement can mimic the variation of the spring's deformation. By adjusting the damping coefficient and/or the elastic coefficient, the variation of the UI element's displacement can be adjusted, so that the UI element simulates the motion of a spring in the critically damped, over-damped, or under-damped state.
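The following sketch numerically integrates the damped-vibration equation (4) and uses equation (5) to pick damping values in the three states; the integration scheme and all numeric values are illustrative assumptions, not part of the disclosure.

```kotlin
import kotlin.math.sqrt

// Integrate m*x'' = -k*x - g*x' (equation (4)) with semi-implicit Euler;
// x0 can be the friction-model distance S(t) treated as the deformation.
fun simulateSpring(m: Double, k: Double, g: Double, x0: Double,
                   steps: Int = 500, dt: Double = 0.002): List<Double> {
    var x = x0
    var v = 0.0
    val trace = mutableListOf(x)
    repeat(steps) {
        val a = (-k * x - g * v) / m   // acceleration from spring and damping
        v += a * dt
        x += v * dt
        trace += x
    }
    return trace
}

fun main() {
    val m = 1.0
    val k = 228.0                       // within the suggested 150-400 range
    val gCritical = sqrt(4 * m * k)     // equation (5): g^2 = 4*m*k
    simulateSpring(m, k, gCritical, 100.0)       // critically damped: no oscillation
    simulateSpring(m, k, gCritical / 4, 100.0)   // under-damped: oscillates
    simulateSpring(m, k, gCritical * 4, 100.0)   // over-damped: slow return
}
```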
Further, as described above, the spring parameters of the other UI elements that move in linkage can be determined, in the conduction manner described in detail below, based on the spring parameters (e.g., elastic coefficient, damping coefficient) of the dragged UI element, so that during the movement each UI element is controlled, based on its respective spring parameters, to move in accordance with the elastic force model. Thus, the dragged UI element and the UI elements moving in linkage can simulate the motion of springs with different spring parameters, presenting an animation effect in which the spacing between UI elements in the drag direction first decreases and then recovers (like a spring first compressed and then released), while the spacing between UI elements in the direction opposite to the drag first increases and then recovers (like a spring first stretched and then released), thereby enriching the dynamic feedback of the user's drag action.
The animation effect of a given linkage-moving UI element is determined based on the animation effect of the dragged UI element's movement and the distance between the dragged UI element and the linkage-moving UI element. Since the animation effect of the linkage-moving UI element changes with distance, its animation effect can also be considered to be conducted over distance. In some embodiments, the conduction may be non-linear. Alternatively, the conduction may be linear.
For example, in the case of non-linear conduction, the animation effect of the UI element movement of the coordinated movement may be determined by the following equation (6):
x_n = x × (n + 1)^(−0.18 × g)  (6),
where x_n represents the animation effect of the movement of the linkage-moving UI element; x represents the animation effect of the movement of the dragged UI element; n represents the distance between the dragged UI element and the linkage-moving UI element; and g represents the conduction coefficient, where a conduction coefficient of 0 makes the animation effect of the movement of the linkage-moving UI element the same as that of the dragged UI element. The constants in equation (6) are merely examples, which are configurable by the electronic device or the user.
In the case where the curve of the displacement of the UI element's movement over time is an elastic force curve, the animation effect of the movement may be controlled by the damping coefficient and/or the elastic coefficient. x may thus be determined based on at least one of the damping coefficient and the elastic coefficient.
For example, x may be the ratio of the elastic coefficient to the damping coefficient of the dragged UI element. In this case, the ratio of the elastic coefficient to the damping coefficient of the dragged UI element is conducted to the linkage-moving UI element based on the distance n, yielding the ratio x_n of the elastic coefficient to the damping coefficient of the linkage-moving UI element. Thus, the animation effect of the dragged UI element's movement may be transferred to the linkage-moving UI element based on distance. The greater the ratio of elastic coefficient to damping coefficient, the weaker the correlation between the movements of the UI elements, and the greater the differences in spring characteristics and movement between them; the spring can be considered "softer". Conversely, the smaller the ratio, the stronger the correlation and the smaller the differences; the spring can be considered "stiffer".
It should be understood that taking x as the ratio of the elastic coefficient to the damping coefficient of the dragged UI element is merely an example; x may be any suitable factor. As another example, x may be the elastic coefficient of the dragged UI element. As yet another example, x may be the damping coefficient of the dragged UI element.
Furthermore, although the elastic force curve is described in detail above, the animation effect of the movement of the UI element may follow any suitable predetermined curve, such as a Bezier curve. Depending on the order of the bezier curve, the bezier curve may have control points corresponding to the order. For example, in the case of a second order bezier curve, the bezier curve may have two control points. Similarly, in the case of a first order bezier curve, the bezier curve may have one control point, while in the case of a third order bezier curve, the bezier curve may have three control points, and so on. In a case where a curve of displacement of the movement of the UI element over time is a bezier curve, an animation effect of the movement may be controlled by a coordinate of at least one of the at least one control point of the bezier curve. For example, in the case of a second order bezier curve, the animation effect of the movement may be controlled by one or both of the two control points of the second order bezier curve. Thus, x may be determined based on the coordinates of at least one of the at least one control point.
Non-linear conduction is described in detail above. In the case of linear conduction, the animation effect of the movement of the linkage-moving UI element may be determined by the following equation (7):
x_n = x − g × n  (7),
where x_n represents the animation effect of the movement of the linkage-moving UI element; x represents the animation effect of the movement of the dragged UI element; n represents the distance between the dragged UI element and the linkage-moving UI element; and g represents the conduction coefficient, where a conduction coefficient of 0 makes the animation effect of the movement of the linkage-moving UI element the same as that of the dragged UI element.
Similar to non-linear conduction, where the curve of the displacement of the UI element's movement over time is an elastic force curve, the animation effect of the movement may be controlled by the damping coefficient and/or the elastic coefficient, and x may thus be determined based on at least one of them. Where the curve of the displacement over time is a Bezier curve, the animation effect of the movement may be controlled by the coordinates of at least one control point of the Bezier curve, and x may thus be determined based on the coordinates of at least one of the at least one control point.
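The two conduction forms can be illustrated together. The sketch below assumes, as in the example above, that x is the elastic-to-damping ratio of the dragged element; all numeric values are illustrative.

```kotlin
import kotlin.math.pow

// Conduct the dragged element's animation parameter x to an element at
// distance level n, using the non-linear form (6) and the linear form (7);
// g is the conduction coefficient.
fun conductNonLinear(x: Double, n: Int, g: Double): Double =
    x * (n + 1.0).pow(-0.18 * g)   // equation (6)

fun conductLinear(x: Double, n: Int, g: Double): Double =
    x - g * n                      // equation (7)

fun main() {
    val x = 228.0 / 30.0   // assumed elastic coefficient / damping coefficient
    for (n in 0..3) {
        println("level $n: non-linear=%.2f linear=%.2f"
            .format(conductNonLinear(x, n, 2.0), conductLinear(x, n, 0.5)))
    }
}
```

Note that with a conduction coefficient g of 0, both forms return x unchanged for every n, matching the statement above that the linkage-moving element then shares the dragged element's animation effect.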
In the above, the principle of drag linkage is described in detail. Hereinafter, the process of controlling the linkage movement of UI elements using the drag linkage principle will be further described.
Fig. 8 shows a flow diagram of a graphical interface display method 800 according to an embodiment of the disclosure. It should be understood that method 800 may be performed by electronic device 100 described above with reference to fig. 1 or electronic device 200 described with reference to fig. 2. The method 800 is described herein with reference to the UI 300A of fig. 3A. However, it should be understood that UI 300A is merely an example and method 800 may be applicable to any suitable interface, including but not limited to UIs 300B-300C.
At block 810, M user interface (UI) elements are displayed on a screen of the electronic device. M is a positive integer greater than 1. For example, referring to fig. 9, the M UI elements may be UI elements 1 to 13.
At block 820, a drag is detected acting at a first UI element of the M UI elements. For example, the first UI element may be UI element 5. A drag on the first UI element will cause the first UI element to move with an animation effect to present the drag effect.
At block 830, each of the N UI elements on the screen is moved with a respective animation effect in response to the drag acting on the first UI element. N is a positive integer between 1 and M-1. This visually indicates the interlocked dragging.
In some embodiments, the drag linkage may act on all UI elements on the screen. In this case, M-1 UI elements other than the first UI element among the M UI elements may be determined as the N UI elements. Alternatively, the drag linkage may only act on a portion of the UI elements on the screen. In this case, the influence area of the first UI element may be determined based on the size of the first UI element, and a UI element within the influence area among the M UI elements may be determined as the N UI elements. For example, the larger the size of the first UI element, the larger its area of influence may be. Alternatively, the area of influence may also shrink with decreasing size, as the disclosure is not limited herein. For example, the area of influence may be a circle having a predetermined radius centered on the reference point of the first UI element. It should be appreciated that the area of influence may be any suitable area having any shape, such as a rectangle, a diamond, etc. having a predetermined size. The area of influence may be configurable by the electronic device and the user, and the disclosure is not limited thereto.
Further, in some embodiments, UI elements that intersect the area of influence may be considered to be within the area of influence. Alternatively, in the case where the area of influence is a circle having a predetermined radius, the UI element may be considered to be within the area of influence if the distance of the UI element from the first UI element is less than the predetermined radius of the area of influence.
Fig. 9 shows a schematic diagram of an example of an area of influence of a UI element according to an embodiment of the disclosure. As shown in fig. 9, since the UI elements 3, 4, 7, 8 are within the area of influence 910 of the UI element 5, the UI elements 3, 4, 7, 8 will move in conjunction with the UI element 5. Furthermore, because UI elements 1, 2, 6, 9-13 are not within the area of influence 910 of the UI element 5, the UI elements 1, 2, 6, 9-13 do not move in conjunction with the UI element 5.
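A sketch of how the N linked elements might be selected under these embodiments; the Element type, the radius scale factor, and the distance test are assumptions for illustration only.

```kotlin
import kotlin.math.hypot

data class Element(val id: Int, val cx: Double, val cy: Double,
                   val w: Double, val h: Double)

// Hypothetical selection of the N linked elements: a circular influence area
// centered on the dragged element's reference point, whose radius grows with
// the dragged element's size (the scale factor 2.0 is an assumption).
fun linkedElements(dragged: Element, all: List<Element>): List<Element> {
    val radius = 2.0 * maxOf(dragged.w, dragged.h)
    return all.filter {
        it.id != dragged.id &&
            hypot(it.cx - dragged.cx, it.cy - dragged.cy) < radius
    }
}
```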
Referring back to fig. 8, in order to move the N UI elements with their respective animation effects, a distance between the first UI element and each of the N UI elements may be determined. In the following, it will be described how to determine the distance between the first UI element and a second UI element of the N UI elements.
In some embodiments, the distance may be divided into a plurality of distance levels according to the range in which the distance falls. For example, the manipulated UI element itself may be at distance level 0, and the linked UI elements may be at distance levels 1, 2, 3, …, depending on their respective distances from the manipulated UI element. UI elements at the same distance level may be considered to be at the same distance. Thus, by using distance levels, the linkage of the UI elements can be simplified, so that UI elements at the same distance level are linked in the same manner, improving the uniformity and harmony of the linkage. However, it should be understood that the distance itself may also be used in the linkage, making the UI elements linked more precisely. Hereinafter, distance level and distance are used interchangeably.
Fig. 10 shows a schematic diagram of an example 1000 of determination of a distance according to an embodiment of the present disclosure. As shown in fig. 10, in some embodiments, a first reference point of a first UI element (e.g., UI element 5) and a second reference point of a second UI element (e.g., UI element 2) may be determined. In fig. 10, the reference point of each UI element is indicated by "+". In some embodiments, the center point of the UI element may be determined as a reference point for the UI element. Alternatively, the reference point of the UI element may be electronic device or user configurable, such that the location of the reference point may be any suitable location, as the disclosure is not limited herein. Thereby, a distance between the first reference point and the second reference point may be determined as a distance between the first UI element and the second UI element.
For example, assume that the position coordinates of the first reference point on the screen are (x0, y0), and the position coordinates of the second reference point on the screen are (x1, y1). In this case, the distance may be determined by the following equation (8):

n = √((x1 − x0)² + (y1 − y0)²)  (8),
where n represents a distance, x0 represents an abscissa of the first reference point, y0 represents an ordinate of the first reference point, x1 represents an abscissa of the second reference point, and y1 represents an ordinate of the second reference point.
As shown in fig. 10, the distances between the UI element 5 and other UI elements determined in the above manner are as follows: the UI element 5 is at a distance of 0 from itself, 1 from the UI elements 3, 7, 8, 2 from the UI elements 2, 4, 6, 9, and 3 from the UI elements 1, 10-13.
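A sketch of equation (8), with an optional quantization into distance levels; the bucket size is an assumed parameter, not specified by the disclosure.

```kotlin
import kotlin.math.ceil
import kotlin.math.hypot

// Equation (8): Euclidean distance between the two reference points.
fun distance(x0: Double, y0: Double, x1: Double, y1: Double): Double =
    hypot(x1 - x0, y1 - y0)

// Quantize a raw distance into a distance level; `bucket` is illustrative.
fun distanceLevel(n: Double, bucket: Double): Int = ceil(n / bucket).toInt()

// e.g. with bucket = 300.0, an element 450 px away is at distance level 2
```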
FIGS. 11A-11C show schematic diagrams of examples 1100A-1100C of the determination of distance according to embodiments of the present disclosure. As shown in FIG. 11A, in some embodiments, a first reference point of the first UI element may be determined. A plurality of circles with respective radii, such as circles 1110A-1130A, may be determined centered on the first reference point. It should be understood that, in addition to circles, any other suitable shapes having respective sizes, such as rectangles, diamonds, etc., may be determined centered on the first reference point, and the present disclosure is not limited thereto.
For example, as shown in fig. 11B, in some embodiments, a first reference point of the first UI element may be determined, and a plurality of rectangles having respective sizes, e.g., rectangles 1110B-1120B, may be determined centered on the first reference point. Further, as shown in fig. 11C, in some embodiments, a first reference point of the first UI element may be determined, and a plurality of diamonds having respective sizes, such as diamonds 1110C-1140C, may be determined centered on the first reference point.
In some embodiments, the radii of the plurality of circles may increase by a predetermined size or ratio. Alternatively, the radii of the plurality of circles may be configurable by the electronic device or the user, as the disclosure is not limited thereto.
Thus, a circle intersecting the second UI element may be determined from the plurality of circles, and the radius of the intersecting circle may be determined as the distance. In some embodiments, if there is more than one circle intersecting the second UI element, the target circle intersecting the second UI element and having the smallest radius may be determined from these circles. Further, in some embodiments, if no circle intersects the second UI element, the circle closest to the second UI element may be taken as the target circle. In either case, the radius of the target circle may be determined as the distance.
As shown in FIG. 11A, the distances between the UI element 5 and other UI elements determined in the above manner are as follows: the UI element 5 is at a distance of 0 from itself. Since the smallest-radius circle intersecting UI elements 3, 4, 7, 8 is circle 1110A, the distance between UI elements 3, 4, 7, 8 and UI element 5 is 1. Since circle 1120A intersects UI elements 2, 6, 9, the distance between UI elements 2, 6, 9 and UI element 5 is 2. Further, since circle 1130A intersects UI elements 1, 10-13, the distance between UI elements 1, 10-13 and UI element 5 is 3.
Fig. 12 shows a schematic diagram of an example 1200 of the determination of distance according to an embodiment of the present disclosure. As shown in fig. 12, in some embodiments, a lateral spacing between the first UI element and the second UI element may be determined, and/or a longitudinal spacing between the first UI element and the second UI element may be determined. In some embodiments, the lateral spacing may represent the sum of the lengths of one or more lateral intervals between the first UI element and the second UI element, where a lateral interval represents a gap between the longitudinal boundaries of two UI elements on the screen. Similarly, the longitudinal spacing may represent the sum of the lengths of one or more longitudinal intervals between the first UI element and the second UI element, where a longitudinal interval represents a gap between the lateral boundaries of two UI elements on the screen. In the case of an irregular layout, the lengths of the lateral and longitudinal intervals between UI elements may be irregular. Further, the lengths of the lateral and longitudinal intervals between UI elements may be configurable by the electronic device or the user.
Thus, the distance may be determined based on the lateral spacing and/or the longitudinal spacing. For example, there are two longitudinal intervals between UI element 5 and UI element 13. Thus, the distance between UI element 5 and UI element 13 may be the sum of the lengths of the two longitudinal intervals. As another example, there is one lateral interval between UI element 12 and UI element 13. Thus, the distance between UI element 12 and UI element 13 may be the length of that lateral interval.
As shown in fig. 12, the distances between the UI element 5 and other UI elements determined in the above manner are as follows: the UI element 5 is at a distance of 0 from itself, 1 from UI elements 2-4 and 6-9, and 2 from UI elements 1, 10-13.
Fig. 13 shows a schematic diagram of an example 1300 of the determination of distance according to an embodiment of the present disclosure. In some embodiments, rather than considering only the intermediate intervals (e.g., lateral intervals, longitudinal intervals) between UI elements as in fig. 12, the sizes of the intermediate UI elements between them may also be taken into account in the lateral and longitudinal spacing. As shown in fig. 13, the lateral spacing may represent the sum of the lengths of one or more lateral intervals between the first UI element and the second UI element and the widths of one or more intermediate UI elements. The longitudinal spacing may represent the sum of the lengths of one or more longitudinal intervals between the first UI element and the second UI element and the heights of one or more intermediate UI elements.
Thus, the distance may be determined based on the lateral spacing and/or the longitudinal spacing. For example, there are two longitudinal intervals and one intermediate UI element 9 between UI elements 5 and 13. Thus, the distance from UI element 5 to UI element 13 may be the sum of the lengths of the two longitudinal intervals and the height of UI element 9. As another example, there is one lateral interval and one intermediate UI element 12 between UI elements 11 and 13. Thus, the distance from UI element 11 to UI element 13 may be the sum of the length of the lateral interval and the width of UI element 12. Furthermore, there is one longitudinal interval between UI elements 3 and 5, so the distance between UI element 3 and UI element 5 is the length of this longitudinal interval. There are three longitudinal intervals and two intermediate UI elements 5 and 7 between UI element 3 and UI element 11, so the distance between UI element 3 and UI element 11 is the sum of the lengths of the three longitudinal intervals and the heights of the two intermediate UI elements 5 and 7. There is one lateral interval between UI element 3 and UI element 2, so the distance between UI element 3 and UI element 2 is the length of this lateral interval.
As shown in fig. 13, the distances between the UI element 5 and other UI elements determined in the above manner are as follows: the UI element 5 is at a distance of 0 from itself, at a distance of 1 from UI elements 2-4 and 6-9, and at a distance of 2 from UI elements 1 and 10-13.
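A sketch of this spacing-based distance; the interval lengths and element sizes below are purely illustrative placeholders.

```kotlin
// Distance in the sense of Fig. 13: the sum of the interval lengths between
// the two elements plus the sizes of the intermediate elements along the
// relevant axis.
fun spacingDistance(intervals: List<Double>, intermediateSizes: List<Double>): Double =
    intervals.sum() + intermediateSizes.sum()

// e.g. UI element 5 to UI element 13: two longitudinal intervals plus the
// height of intermediate UI element 9:
// spacingDistance(listOf(24.0, 24.0), listOf(180.0)) == 228.0
```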
Fig. 14A and 14B show schematic diagrams of examples 1400A and 1400B of the determination of distance according to embodiments of the present disclosure. In some embodiments, in addition to the lengths of the lateral and longitudinal intervals and the widths and heights of the intermediate UI elements, the operation direction may also be taken into account in the lateral spacing and the longitudinal spacing. For example, in the case of dragging the first UI element, the operation direction may be the direction in which the first UI element is dragged. Further, while a drag linkage scenario is described here, there are also depth linkage and pressure linkage scenarios, as will be described below. In the depth linkage and pressure linkage scenarios, the distance determination method that considers the operation direction may also be used. Specifically, in these scenarios, the UI element may be pressed. In the case of pressing the first UI element, although there is no operation direction as such, the direction pointing from the second UI element to the first UI element (such as the direction from the second reference point to the first reference point) may be regarded as the operation direction when taking the operation direction into account in the lateral spacing and the longitudinal spacing.
In this case, the lateral spacing and/or the longitudinal spacing can first be determined using the manner of distance determination described with reference to fig. 12 and 13. The angle between the operation direction and the horizontal and/or vertical direction may then be determined. Thereby, the distance in the operation direction can be determined using trigonometric principles.
In some embodiments, as shown in fig. 14A, since the lateral spacing, the longitudinal spacing, and the angle of the operating direction 1410A to the horizontal or vertical direction are known, the distance in the operating direction 1410A can be determined using trigonometric principles.
Alternatively, the distance in the operation direction may also be determined by selecting one of the horizontal direction and the vertical direction that is closer to the operation direction as the reference direction, according to the angle between the operation direction and the horizontal direction and the vertical direction. For example, as shown in fig. 14B, since the operating direction 1430B is closer to the vertical direction, the vertical direction may be selected as the reference direction, and the distance in the operating direction 1430B is determined using trigonometric principles based on the longitudinal spacing and the angle of the operating direction 1430B from the vertical direction. As another example, since the operating direction 1420B is closer to the horizontal, the distance in the operating direction 1420B may be determined using trigonometric principles based on the lateral spacing and the angle of the operating direction 1420B from the horizontal. Further, the reference direction may be electronic device or user configurable, as the disclosure is not limited herein. For example, the reference direction may be set to a horizontal direction, a vertical direction, or any other suitable direction.
The distance in the operation direction is determined above using the lateral spacing and the longitudinal spacing. However, as described above, the lateral spacing and the longitudinal spacing may be composed of intermediate intervals and the sizes of intermediate UI elements. Thus, the distance in the operation direction may also be determined piecewise for each intermediate interval and intermediate UI element. Specifically, for each intermediate interval and intermediate UI element, the size of the intermediate interval or intermediate UI element and the angle between the operation direction and the horizontal or vertical direction may be determined. Thus, the distance in the operation direction can be determined using trigonometric principles. The distances in the operation direction determined for each intermediate interval and intermediate UI element may then be summed to determine the total distance in the operation direction.
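A sketch of the projection described above; the 45° axis-selection threshold and the angle convention are assumptions for illustration.

```kotlin
import kotlin.math.PI
import kotlin.math.abs
import kotlin.math.cos
import kotlin.math.sin

// Project the axis-aligned spacing onto the operation direction (Fig. 14B
// variant): pick the reference axis closer to the operation direction.
// angleRad is the angle between the operation direction and the horizontal
// axis, assumed here to lie in [0, PI/2].
fun directionalDistance(lateral: Double, longitudinal: Double, angleRad: Double): Double =
    if (abs(angleRad) <= PI / 4) lateral / cos(angleRad)   // closer to horizontal
    else longitudinal / sin(abs(angleRad))                 // closer to vertical
```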
Referring back to fig. 8, after determining the distance between the first UI element and the second UI element, an animation effect of moving the second UI element may be determined based on the distance. To this end, in some embodiments, a first animation effect may be determined in which the first UI element moves in response to the drag. As described above, in some embodiments, the first animation effect of the first UI element movement may be controlled by a predefined curve of displacement over time. For example, the predefined curve may be a bezier curve or an elastic force curve.
Thus, an animation effect of the second UI element moving in response to the drag may be determined based on the first animation effect and the distance between the first UI element and the second UI element. In some embodiments, where the first animation effect of the movement of the first UI element is controlled by a predefined curve of displacement over time, the curve of displacement over time of the second UI element may be determined based on the predefined curve of the first UI element. For example, in the case of an elastic force curve, the damping coefficient and/or the elastic coefficient of the spring may be conducted based on the distance. In the case of a Bezier curve, the coordinates of at least one of the at least one control point of the Bezier curve may be conducted based on the distance. Conducting the animation effect of the first UI element to the second UI element, thereby obtaining the animation effect of the second UI element, can be achieved using the conduction manner described in detail above, and is therefore not repeated here.
In this way, since the animation effect of moving the second UI element is determined by the animation effect of moving the first UI element and the distance between the second UI element and the first UI element, it is possible to implement a drag linkage that is intuitive, natural, and conforms to the use habit of the user.
Furthermore, in some embodiments, the size of the second UI element may also affect the animation effect of its movement. In this case, the size of the second UI element may also be considered in determining its animation effect. For example, the larger the size of the second UI element, the less it may be affected by the linkage, and thus the animation effect of the second UI element's movement may be inversely proportional to its size. To this end, in some embodiments, the animation effect of moving the second UI element may be determined based on the first animation effect, the distance, and the size of the second UI element.

Additionally, in some embodiments, the size of the first UI element may also affect the animation effect of the second UI element's movement. In this case, the size of the first UI element may also be considered in determining the animation effect of the second UI element. For example, the larger the size of the first UI element, the greater its linkage effect may be, and thus the animation effect of the second UI element's movement may be proportional to the size of the first UI element. To this end, in some embodiments, the animation effect of moving the second UI element may be determined based on the first animation effect, the distance, and the size of the first UI element.

Further, as described above, both the size of the first UI element and the size of the second UI element may affect the animation effect of moving the second UI element. Thus, in some embodiments, the animation effect of moving the second UI element may be determined based on the first animation effect, the distance, the size of the first UI element, and the size of the second UI element.
Referring back to fig. 8, after determining the animation effect of moving the second UI element, the second UI element may be moved with the animation effect to visually indicate that the second UI element moves with the first UI element. The N UI elements may each be caused to move with respective animation effects to visually indicate a drag on the entire screen or on a partial area of the screen, thereby presenting a drag linkage.
In some embodiments, the direction of movement of the UI element of the coordinated movement may be associated with a drag direction, thereby visually indicating the drag action. To this end, in some embodiments, a direction of the drag may be determined, and the second UI element may be animated in association with the determined direction.
Further, in some embodiments, to better present the conduction of the animation effect and improve the user experience, the first UI element and the second UI element do not begin to move at the same time. For example, the first UI element may begin to move when the drag occurs, while the second UI element may begin to move a period of time after the drag occurs. To this end, in some embodiments, a delay time may be determined based on the distance between the first UI element and the second UI element, and the second UI element may be moved in response to the delay time elapsing after the drag occurs. Further, in some embodiments, a delay factor may be determined, and the delay time may be determined based on the distance and the delay factor. For example, the delay time may be the quotient of the distance divided by the delay factor. The delay factor may be configurable by the electronic device and the user.
Fig. 15 shows a schematic diagram of an example of a delay time 1500 according to an embodiment of the disclosure. As shown in fig. 15, the first UI element (e.g., UI element 5) starts to move when the drag occurs, the UI elements (e.g., UI elements 3, 4, 7, 8) at distance 1 move later than the first UI element, the UI elements (e.g., UI elements 2, 6, 9) at distance 2 move later than the UI elements at distance 1, and the UI elements (e.g., UI elements 1, 10-13) at distance 3 move later than the UI elements at distance 2.
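A sketch of the delay computation, assuming the quotient form mentioned above; the delay factor value is illustrative.

```kotlin
// Staggered start: each linked element starts moving after a delay equal to
// the quotient of its distance (level) and the configurable delay factor.
fun startDelayMs(distanceLevel: Int, delayFactor: Double): Double =
    distanceLevel / delayFactor

// e.g. with delayFactor = 0.05 (levels per ms): level 1 starts after 20 ms,
// level 2 after 40 ms, level 3 after 60 ms, matching the order in Fig. 15.
```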
In this way, the connection between the animation effects of different UI elements can be strengthened, and the relationships among individual UI elements can be highlighted. Compared with traditional animations, whose effects are monotonous and whose UI elements are independent and unrelated, embodiments of the present disclosure make animation effects better conform to physical laws and take real usage scenarios and user habits into account, thereby significantly improving user experience.
In the above, the process of implementing the drag linkage is described in detail; drag linkage in different scenarios will be further described below. These scenarios include: a scenario in which a UI element moves completely with the hand, a scenario in which a UI element does not move completely with the hand, a scenario in which a UI element continues to move after release or a fling, and a scenario in which a UI element continues to move and rebounds after release or a fling. It should be appreciated that these scenarios may be combined with each other to synthesize richer animation effects. For example, a UI element may not move completely with the hand and may continue to move after release or a fling. As another example, a UI element may not move completely with the hand and may continue to move and rebound after release or a fling. Hereinafter, a description will be given taking as an example the UI element 5 being dragged and the UI elements 2-4 and 6-9 moving in linkage.
Fig. 16A shows a schematic diagram of an example of a scene 1600A in which a UI element is moving fully with the hand, according to an embodiment of the disclosure. Fig. 16B shows a schematic diagram of an example of a displacement time curve 1600B for a scene in which a UI element is fully moving with a hand, according to an embodiment of the disclosure.
As shown in fig. 16A and 16B, at T11, the UI element 5 is dragged. At T11a, the UI element 5 starts to follow the drag movement of the finger. In some embodiments, T11a may be equal to T11 if the UI element 5 starts to move when the drag occurs. Alternatively, T11a may be greater than T11 if UI element 5 begins to move after the drag occurs. Furthermore, in addition to the dragged UI element 5, other UI elements (e.g., UI elements 2-4 and 6-9) move in unison. It should be understood that for clarity, other UI elements are shown as beginning to move simultaneously with UI element 5. However, as described above, other UI elements may begin to move after respective delay times.
At T12, the user releases the drag or performs a fling, in which case the dragging of the UI element 5 ends. At T12a, the UI element 5 stops moving. In some embodiments, T12a may be equal to T12 if UI element 5 stops moving upon the release or fling. Alternatively, T12a may be greater than T12 if UI element 5 stops moving after the release or fling. At this time, the displacement of the UI element 5 in the drag direction is S10. The displacement of the UI elements 3, 4, 7, 8 at distance 1 in the drag direction is S11. The displacement of the UI elements 2, 6, 9 at distance 2 in the drag direction is S12. The displacement S10 is greater than the displacement S11, and the displacement S11 is greater than the displacement S12. At this point, the UI element 5 stops moving, while the UI elements 3, 4, 7, 8 at distance 1 and the UI elements 2, 6, 9 at distance 2 continue to move with the animation effect controlled by the predefined curve (e.g., the elastic force curve).
At T13, the spacing in the drag direction between UI element 5 and UI elements 3, 4, 7, 8 at distance 1 decreases. Further, the spacing in the drag direction between UI elements 3, 4, 7, 8 at distance 1 and UI element 9, which is at distance 2 in the drag direction, decreases. In addition, the spacing in the drag direction between UI elements 3, 4, 7, 8 at distance 1 and UI elements 2, 6, which are at distance 2 in the direction opposite to the drag direction, increases.
At T14, the displacement of UI element 5 in the drag direction remains S10. UI elements 3, 4, 7, 8 at distance 1 have moved by the displacement S10 in the drag direction and stop moving. The displacement of UI elements 2, 6, 9 at distance 2 in the drag direction has not yet reached S10, and these elements continue to move with the animation effect controlled by the predefined curve. Compared with T13, the spacing in the drag direction between UI elements 3, 4, 7, 8 at distance 1 and UI element 9, which is at distance 2 in the drag direction, increases. Furthermore, the spacing in the drag direction between UI elements 3, 4, 7, 8 at distance 1 and UI elements 2, 6, which are at distance 2 in the direction opposite to the drag direction, decreases.
At T15, the displacements of UI element 5 and of UI elements 3, 4, 7, 8 at distance 1 in the drag direction remain S10. UI elements 2, 6, 9 at distance 2 have moved by the displacement S10 in the drag direction and stop moving. Thereby, the drag linkage is completed.
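The displacement-time behavior of this scene can be sketched in code. Below is a minimal, illustrative Kotlin sketch (not from this disclosure) in which, after the release at T12, the linked elements chase the dragged element's final displacement S10 under a critically damped spring; the spring form, the stiffness value omega, and the starting displacements are all assumptions.

```kotlin
import kotlin.math.exp

// Illustrative sketch: after release, linked elements chase the dragged
// element's final displacement S10 with a critically damped spring response.
// Spring form and all constants are assumptions, not values from the disclosure.

// Critically damped spring: position at time t when starting at `from` with
// zero velocity and settling toward `to`. x(t) = x0 * (1 + w*t) * e^(-w*t).
fun springPosition(from: Double, to: Double, t: Double, omega: Double = 12.0): Double {
    val x0 = from - to
    return to + x0 * (1 + omega * t) * exp(-omega * t)
}

fun main() {
    val s10 = 300.0            // final displacement of dragged UI element 5 (px)
    val startByLevel = mapOf(  // displacements at release, i.e. S11 and S12 (illustrative)
        1 to 225.0,            // distance-1 elements 3, 4, 7, 8
        2 to 150.0,            // distance-2 elements 2, 6, 9
    )
    for (t in listOf(0.05, 0.15, 0.40)) {
        for ((level, start) in startByLevel) {
            val pos = springPosition(start, s10, t)
            println("t=${t}s distance=$level displacement=%.1f px".format(pos))
        }
    }
}
```

Because the distance-2 elements start from a smaller displacement, they trail the distance-1 elements at every sampled instant, matching the ordering S10 > S11 > S12 above.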
Fig. 17A shows an illustrative diagram of an example of a scene 1700A in which UI elements do not completely follow hand movements, according to an embodiment of the disclosure. Fig. 17B shows a schematic diagram of an example of a displacement time curve 1700B for a scene in which a UI element does not completely follow hand movement, according to an embodiment of the disclosure.
As shown in fig. 17A and 17B, at T21, the UI element 5 is dragged. At T21a, the UI element 5 starts to follow the drag movement of the finger. In some embodiments, T21a may be equal to T21 if UI element 5 starts to move when the drag occurs. Alternatively, if the UI element 5 starts to move after the drag occurs, T21a may be greater than T21. Furthermore, in addition to the dragged UI element 5, other UI elements (e.g., UI elements 2-4 and 6-9) move in unison. It should be understood that for clarity, other UI elements are shown as beginning to move simultaneously with UI element 5. However, as described above, other UI elements may begin to move after respective delay times.
At T22, the user releases or flings, ending the drag of UI element 5. At T22a, UI element 5 stops moving. In some embodiments, T22a may be equal to T22 if UI element 5 stops moving at the moment of the release or fling. Alternatively, T22a may be greater than T22 if UI element 5 stops moving after the release or fling. At this time, the displacement of the finger in the drag direction is SF2, the displacement of UI element 5 is S20, the displacement of UI elements 3, 4, 7, 8 at distance 1 is S21, and the displacement of UI elements 2, 6, 9 at distance 2 is S22, where SF2 is greater than S20, S20 is greater than S21, and S21 is greater than S22. UI element 5 now stops moving, while UI elements 3, 4, 7, 8 at distance 1 and UI elements 2, 6, 9 at distance 2 continue to move with the animation effect controlled by the predefined curve (e.g., the elastic force curve). Compared with T21, the spacing in the drag direction between UI element 5 and UI elements 3, 4, 7, 8 at distance 1 increases. The spacing in the drag direction between UI elements 3, 4, 7, 8 at distance 1 and UI element 9, which is at distance 2 in the drag direction, decreases. Furthermore, the spacing in the drag direction between UI elements 3, 4, 7, 8 at distance 1 and UI elements 2, 6, which are at distance 2 in the direction opposite to the drag direction, increases.
At T23, the spacing in the drag direction between UI element 5 and UI elements 3, 4, 7, 8 at distance 1 decreases. Further, the spacing in the drag direction between UI elements 3, 4, 7, 8 at distance 1 and UI element 9, which is at distance 2 in the drag direction, decreases. In addition, the spacing in the drag direction between UI elements 3, 4, 7, 8 at distance 1 and UI elements 2, 6, which are at distance 2 in the direction opposite to the drag direction, increases.
At T24, the displacement of UI element 5 in the drag direction remains S20. UI elements 3, 4, 7, 8 at distance 1 have moved by the displacement S20 in the drag direction and stop moving. The displacement of UI elements 2, 6, 9 at distance 2 in the drag direction has not yet reached S20, and these elements continue to move with the animation effect controlled by the predefined curve. Compared with T23, the spacing in the drag direction between UI elements 3, 4, 7, 8 at distance 1 and UI element 9, which is at distance 2 in the drag direction, increases. Furthermore, the spacing in the drag direction between UI elements 3, 4, 7, 8 at distance 1 and UI elements 2, 6, which are at distance 2 in the direction opposite to the drag direction, decreases.
At T25, the displacements of UI element 5 and of UI elements 3, 4, 7, 8 at distance 1 in the drag direction remain S20. UI elements 2, 6, 9 at distance 2 have moved by the displacement S20 in the drag direction and stop moving. Thereby, the drag linkage is completed.
In the scenes described with reference to figs. 16A-17B, UI element 5 stops moving after the drag ends. However, UI element 5 may instead continue to move for a distance after the drag ends. In some embodiments, the distance may be determined based on a friction model as described above. Whether UI element 5 continues to move after the drag ends is configurable by the electronic device or the user. For example, if the electronic device is configured to allow continued movement after a release or a fling, UI element 5 may continue to move. Otherwise, UI element 5 stops moving as soon as the drag ends.
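As a sketch of the configurable behavior just described, the following Kotlin snippet decides whether the element keeps moving after release and, if so, how far. It assumes a constant-deceleration friction model (distance = v^2 / (2f)); the actual friction model of this disclosure is defined earlier in the document and may differ, and all names and constants here are illustrative.

```kotlin
// Illustrative sketch: decide whether the released element keeps moving, and
// how far, under an assumed constant-deceleration friction model.
data class FlingConfig(val allowContinueAfterRelease: Boolean, val friction: Double)

// Distance travelled by a body released at `velocity` (px/s) that decelerates
// uniformly at `friction` (px/s^2): d = v^2 / (2 * friction).
fun flingDistance(velocity: Double, friction: Double): Double =
    (velocity * velocity) / (2.0 * friction)

fun displacementAfterRelease(config: FlingConfig, releaseVelocity: Double): Double =
    if (config.allowContinueAfterRelease) flingDistance(releaseVelocity, config.friction)
    else 0.0  // element stops exactly where the drag ends

fun main() {
    val config = FlingConfig(allowContinueAfterRelease = true, friction = 4000.0)
    println(displacementAfterRelease(config, releaseVelocity = 1200.0))  // 180.0 px
}
```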
Fig. 18A shows a schematic diagram of an example of a scene 1800A in which a UI element continues to move after a release or a fling, according to an embodiment of the disclosure. Fig. 18B shows a schematic diagram of an example of a displacement time curve 1800B for the scene in which a UI element continues to move after a release or a fling, according to an embodiment of the disclosure.
As shown in fig. 18A and 18B, at T31, the UI element 5 is dragged. At T31a, the UI element 5 starts to follow the drag movement of the finger. In some embodiments, T31a may be equal to T31 if UI element 5 starts to move when the drag occurs. Alternatively, T31a may be greater than T31 if UI element 5 starts to move after the drag occurs. In addition to the dragged UI element 5, other UI elements (e.g., UI elements 2-4 and 6-9) move in unison. It should be understood that for clarity, other UI elements are shown as beginning to move simultaneously with UI element 5. However, as described above, other UI elements may begin to move after respective delay times.
At T32, the user releases or flings, ending the drag of UI element 5. At T32a, UI element 5 continues to move with the animation effect controlled by the predefined curve (e.g., the elastic force curve). In some embodiments, T32a may be equal to T32 if UI element 5 begins moving under the predefined curve at the end of the drag. Alternatively, T32a may be greater than T32 if it does so after the end of the drag. At this time, the displacement of UI element 5 in the drag direction is SF3, the displacement of UI elements 3, 4, 7, 8 at distance 1 is S31, and the displacement of UI elements 2, 6, 9 at distance 2 is S32, where SF3 is greater than S31 and S31 is greater than S32. Furthermore, UI elements 3, 4, 7, 8 at distance 1 and UI elements 2, 6, 9 at distance 2 also continue to move with the animation effect controlled by the predefined curve. Compared with T31, the spacing in the drag direction between UI element 5 and UI elements 3, 4, 7, 8 at distance 1 increases. The spacing in the drag direction between UI elements 3, 4, 7, 8 at distance 1 and UI element 9, which is at distance 2 in the drag direction, decreases. Furthermore, the spacing in the drag direction between UI elements 3, 4, 7, 8 at distance 1 and UI elements 2, 6, which are at distance 2 in the direction opposite to the drag direction, increases.
At T33, the spacing in the drag direction between UI element 5 and UI elements 3, 4, 7, 8 at distance 1 increases. The spacing in the drag direction between UI elements 3, 4, 7, 8 at distance 1 and UI element 9, which is at distance 2 in the drag direction, decreases. Furthermore, the spacing in the drag direction between UI elements 3, 4, 7, 8 at distance 1 and UI elements 2, 6, which are at distance 2 in the direction opposite to the drag direction, increases.
At T34, all UI elements continue to move with the animation effect controlled by the predefined curve. Compared with T33, the spacing in the drag direction between UI element 5 and UI elements 3, 4, 7, 8 at distance 1 decreases. The spacing in the drag direction between UI elements 3, 4, 7, 8 at distance 1 and UI element 9, which is at distance 2 in the drag direction, increases. Furthermore, the spacing in the drag direction between UI elements 3, 4, 7, 8 at distance 1 and UI elements 2, 6, which are at distance 2 in the direction opposite to the drag direction, decreases.
At T35, all UI elements have moved by the displacement S30 in the drag direction and stop moving. The distance in the drag direction between the stop position (displacement S30) and the release or fling position (displacement SF3) may be determined based on the friction model as described above. Thus, the drag linkage is completed.
Further, in some embodiments, the UI element may bounce a distance in the event that the UI element continues to move after the drag is stopped. As described above, in an underdamped state, the displacement of the spring oscillates between positive and negative values over time. Thus, the elastic force curve of the under-damped state may be utilized to control the rebound of the UI element.
It should be understood that in FIGS. 18A-18B, the UI elements are shown as allowing overlap with each other, e.g., UI element 8 overlaps with UI element 9 at times T32-T34. However, UI elements may not be allowed to overlap each other. Whether overlap is allowed is configurable by the electronic device or by the user. Where overlap is allowed, the movement of the UI element follows the elastic force curve of the underdamped state. In case no overlap is allowed, the movement of the UI element follows the elastic force curve of the over-damped state. Further, whether any two UI elements overlap may also depend on the relative movement magnitude of the two UI elements. For example, UI elements typically do not overlap where the relative movement of two UI elements is small in magnitude. Whereas in case the relative movement amplitude of two UI elements is large, the UI elements may overlap.
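The overlap policy just described maps naturally onto the spring's damping ratio. The sketch below, with illustrative constants not taken from this disclosure, selects an underdamped spring (ratio below 1, which oscillates, so elements may cross and overlap) when overlap is allowed, and an overdamped spring (ratio above 1, which approaches the target monotonically) when it is not.

```kotlin
import kotlin.math.sqrt

// Illustrative sketch: pick the spring's damping regime from the overlap policy.
// Underdamped (ratio < 1) oscillates, so elements can cross and overlap;
// overdamped (ratio > 1) approaches the target monotonically, so they cannot.
data class Spring(val stiffness: Double, val damping: Double, val mass: Double = 1.0) {
    val dampingRatio: Double get() = damping / (2.0 * sqrt(stiffness * mass))
}

fun springFor(overlapAllowed: Boolean): Spring =
    if (overlapAllowed) Spring(stiffness = 200.0, damping = 14.0)  // ratio ~0.49
    else Spring(stiffness = 200.0, damping = 40.0)                 // ratio ~1.41

fun main() {
    println(springFor(overlapAllowed = true).dampingRatio)   // underdamped
    println(springFor(overlapAllowed = false).dampingRatio)  // overdamped
}
```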
Fig. 19A shows a schematic diagram of an example of a scene 1900A in which a UI element continues to move and rebounds after a release or a fling, according to an embodiment of the disclosure. Fig. 19B shows a schematic diagram of an example of a displacement time curve 1900B for the scene in which a UI element continues to move and rebounds after a release or a fling, according to an embodiment of the disclosure.
As shown in fig. 19A and 19B, at T41, the UI element 5 is dragged. At T41a, the UI element 5 starts to follow the drag movement of the finger. In some embodiments, T41a may be equal to T41 if UI element 5 starts to move when the drag occurs. Alternatively, if the UI element 5 starts to move after the drag occurs, T41a may be larger than T41. Furthermore, in addition to the dragged UI element 5, other UI elements (e.g., UI elements 2-4 and 6-9) move in unison. It should be understood that for clarity, other UI elements are shown as beginning to move simultaneously with UI element 5. However, as described above, other UI elements may begin to move after respective delay times.
At T42, the user releases or flings, ending the drag of UI element 5. At T42a, UI element 5 continues to move with the animation effect controlled by the predefined curve (e.g., the elastic force curve). In some embodiments, T42a may be equal to T42 if UI element 5 begins moving under the predefined curve at the end of the drag. Alternatively, T42a may be greater than T42 if it does so after the end of the drag. At this time, the displacement of UI element 5 in the drag direction is SF4, the displacement of UI elements 3, 4, 7, 8 at distance 1 is S41, and the displacement of UI elements 2, 6, 9 at distance 2 is S42, where SF4 is greater than S41 and S41 is greater than S42. Furthermore, UI elements 3, 4, 7, 8 at distance 1 and UI elements 2, 6, 9 at distance 2 also continue to move with the animation effect controlled by the predefined curve. Compared with T41, the spacing in the drag direction between UI element 5 and UI elements 3, 4, 7, 8 at distance 1 increases. The spacing in the drag direction between UI elements 3, 4, 7, 8 at distance 1 and UI element 9, which is at distance 2 in the drag direction, decreases. Furthermore, the spacing in the drag direction between UI elements 3, 4, 7, 8 at distance 1 and UI elements 2, 6, which are at distance 2 in the direction opposite to the drag direction, increases.
At T43, UI element 5 has moved by the displacement S40 in the drag direction, and its rebound starts. In some embodiments, the distance between the rebound-start position (displacement S40 in the drag direction) and the release or fling position (displacement SF4 in the drag direction) may be determined based on a friction model as described above.
At T44, UI element 9 at distance 2 has moved by the displacement S40 in the drag direction, and its rebound also starts.
At T45, all UI elements have bounced back to the displacement SF4 in the drag direction and stop moving. Thus, the drag linkage is completed.
It should be understood that although UI element 5 is shown to rebound before UI elements 3, 4, 7, 8 at distance 1, and UI elements 3, 4, 7, 8 at distance 1 are shown to rebound before UI elements 2, 6, 9 at distance 2, all UI elements may instead rebound together. For example, UI element 5 may stop moving to wait for the other UI elements to reach the displacement S40, and then all may begin to rebound together. Further, while all UI elements are shown to rebound to the release or fling position, they may rebound by a greater or lesser magnitude; embodiments of the disclosure are not limited in this regard.
Depth linkage
Embodiments of the present disclosure relate to the linkage of UI elements in a UI in the depth direction, also referred to as depth linkage. The depth direction refers to the direction perpendicular to the screen of the electronic device. In depth linkage, a target UI element that is pressed may affect other UI elements that are not pressed. In particular, triggering the animation effect of the target UI element may jointly trigger the animation effect of one or more other UI elements, or even of the other UI elements in the entire UI, such that the other UI elements are affected by the target UI element. For example, in depth linkage, when a target UI element is pressed during a duration, in addition to the target UI element scaling over time, other UI elements may also scale at corresponding magnitudes, thereby visually presenting the linkage scaling.
Thus, the connection between the animation effects of different UI elements can be strengthened, and the relationship between individual UI elements can be highlighted. Compared with traditional animations, in which each UI element has a single animation effect and the elements are independent and unrelated, embodiments of the disclosure make the animation effect better conform to physical laws and take into account real usage scenes and user habits, thereby significantly improving the user experience.
Some example embodiments of depth linkages will be described below with reference to fig. 20-33.
Depth linkage may occur in a UI having any suitable regular or irregular layout, and the UI elements in the UI may have any suitable size and shape. For example, depth linkage may occur in UIs 300A-300C as shown in FIGS. 3A-3C.
A UI element in the UI may be pressed. For example, a user may press a UI element when the user desires to perform an operation associated with the UI element. As an example, a user may press a UI element when the user desires to enter an application represented by the UI element, open a menu associated with the UI element, or the like. In the event that a press at a UI element is detected, the UI element may change; for example, the UI element may zoom to present the press action in the depth direction. For example, the UI element may zoom out to appear farther away in the depth direction. Alternatively, the UI element may zoom in to appear closer in the depth direction. Hereinafter, scaling is described taking UI element reduction as an example. However, it should be understood that the scaling may instead enlarge the UI element.
Fig. 20 shows a schematic diagram of an example of a change 2000 in a UI element when pressed, according to some embodiments of the present disclosure. As shown in fig. 20, in the event that a press at the UI element is detected, the UI element may zoom out to appear farther away in the depth direction.
The changes to the UI elements may conform to the face pressure model. In the face pressure model, the pressure at each portion (e.g., each pixel point or each portion divided in any other suitable manner) in the UI element is the same. That is, the pressure is the same for all portions of the UI element regardless of which portion of the UI element is pressed (e.g., whether the center of the UI element or the edge of the UI element is pressed). Thus, the change of the UI element will be the same no matter what part of the UI element is pressed.
Fig. 21 shows a schematic diagram of an example of a change 2100 when a UI element is pressed at different locations according to some embodiments of the disclosure. As shown in fig. 21, whether a press is detected at the center of the UI element or at the edge of the UI element, the UI element may zoom out at the same magnitude to appear farther away in the depth direction.
Further, as shown in fig. 21, after the UI element is reduced, the position of the press may no longer be within the range of the reduced UI element. In this case, the press may continue to be detected as a press against the UI element because the location of the press is still within the scope of the UI element before zooming out, or any other suitable range. Alternatively, the press will not be detected as a press against the UI element since the location of the press is no longer within the scope of the scaled-down UI element. In this case, the pressing may be considered to be finished.
Further, in some embodiments, to make the change of the UI element conform to natural laws and user usage habits, the magnitude of the change may depend on the magnitude of the force of the press. In the real world, the magnitude of the force generally refers to the magnitude of the real force. In this case, the greater the pressing force, the greater the variation in the depth direction. In some embodiments, the force of the press may be a user-applied press force detected by the electronic device. Alternatively, the pressing force may also be a predetermined pressing force set by the electronic device or the user.
Fig. 22 shows a schematic diagram of an example of a change 2200 in UI elements at different pressing forces according to some embodiments of the disclosure. As shown in fig. 22, in the case where the pressing force is large, the UI element may be reduced by a larger magnitude to present a greater degree of distancing in the depth direction. In some embodiments, the UI element may even shrink until it disappears from the UI, i.e., to a size of 0, thereby presenting the greatest degree of distancing in the depth direction. In contrast, in the case where the pressing force is small, the UI element may be reduced by a smaller magnitude to present a smaller degree of distancing in the depth direction.
However, embodiments of the present disclosure are not limited thereto. The manner in which the UI elements are scaled in response to different pressing forces is configurable by the electronic device or user. For example, in the case where the pressing force is large, the UI element may be reduced in a smaller magnitude, and in the case where the pressing force is small, the UI element may be reduced in a larger magnitude.
Further, in an electronic device, making the change in the depth direction based solely on the force of a real press may be demanding for the user, and may require the electronic device to be equipped with corresponding hardware. Thus, in some embodiments, the duration of the press may be used to simulate or replace the force of the press. For example, a longer press may be treated as a greater pressing force, and thus produce a larger change in the depth direction.
Fig. 23 shows a schematic diagram of an example of a change 2300 of UI elements at different press durations in accordance with some embodiments of the disclosure. As shown in fig. 23, in the case where the duration of the press is long, the UI element may be reduced by a larger magnitude to present a greater degree of distancing in the depth direction. In some embodiments, the UI element may even shrink until it disappears from the UI, i.e., to a size of 0, thereby presenting the greatest degree of distancing in the depth direction. In contrast, in the case where the duration of the press is short, the UI element may be reduced by a smaller magnitude to present a smaller degree of distancing in the depth direction.
However, embodiments of the present disclosure are not limited thereto. The manner in which the UI elements are scaled in response to different press durations is configurable by the electronic device or user. For example, in the case where the duration of the press is long, the UI element may be reduced in a smaller magnitude, and in the case where the duration of the press is short, the UI element may be reduced in a larger magnitude.
Furthermore, in some embodiments, to further make the variation of the UI elements conform to natural laws and user usage habits, the magnitude of the variation may depend on the size of the UI elements. For example, intuitively, the same press may have difficulty pressing a larger UI element, while a smaller UI element may be pressed with less effort. In this case, larger UI elements may be less affected by the press, while smaller UI elements may be more affected by the press.
Fig. 24 illustrates an exemplary diagram of an example of a variation 2400 in UI elements at different sizes, according to some embodiments of the disclosure. As shown in fig. 24, in the case where the size of the UI element is large, the UI element may be reduced by a larger magnitude to present a greater degree of distancing in the depth direction. In contrast, in the case where the size of the UI element is small, the UI element may be reduced by a smaller magnitude to present a smaller degree of distancing in the depth direction.
However, embodiments of the present disclosure are not limited thereto. The manner in which UI elements of different sizes are scaled is configurable by the electronic device or the user. For example, to make the scaled UI elements more uniform in size, larger UI elements may be more affected by the pressing, while smaller UI elements may be less affected by the pressing. To this end, in the case where the size of the UI element is large, the UI element may be reduced to a smaller scale, and in the case where the size of the UI element is small, the UI element may be reduced to a larger scale, thereby presenting a smaller degree of distancing in the depth direction.
Further, in some embodiments, to improve the user experience, the magnitude to which the UI element can be scaled may be limited such that the UI element can only be scaled within the allowed range of magnitudes. For example, the amplitude range may be any suitable range, such as 10% -90% of the size of the UI element, 100-10,000 pixels, or 2% -50% of the screen, etc. By way of example, assume that the amplitude range is 10% -90% of the size of the UI element. In this case, regardless of the pressing force and the duration of the pressing, the pressed UI element can be reduced to 10% of the original size at most, and cannot disappear from the screen.
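Combining the duration-based press force with the allowed amplitude range, a minimal Kotlin sketch might look as follows. The linear duration-to-shrink mapping and the rate constant are assumptions; the disclosure only requires that a longer press produce a larger change and that the result stay within the configured range (here the 10% floor of the example above).

```kotlin
// Illustrative sketch: map press duration to a zoom-out factor and clamp it to
// the allowed amplitude range. The linear mapping and shrink rate are assumptions.
fun pressedScale(
    pressDurationMs: Int,
    minScale: Double = 0.10,       // element may shrink to at most 10% of its size
    shrinkPerSecond: Double = 0.6, // illustrative shrink rate
): Double {
    val shrink = shrinkPerSecond * (pressDurationMs / 1000.0)
    return (1.0 - shrink).coerceIn(minScale, 1.0)
}

fun main() {
    println(pressedScale(200))   // short press   -> 0.88
    println(pressedScale(1000))  // longer press  -> 0.4
    println(pressedScale(5000))  // clamped floor -> 0.1
}
```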
In the above, the scaling of the pressed UI element is described in detail. As described above, in depth linkage, a target UI element that is pressed may affect other UI elements that are not pressed. In particular, triggering the animation effect of the target UI element may jointly trigger the animation effect of one or more other UI elements, or even of the other UI elements in the entire UI, such that the other UI elements are affected by the target UI element. For example, in depth linkage, when a target UI element is pressed during a duration, in addition to the target UI element scaling over time, other UI elements may also scale at corresponding magnitudes, thereby visually presenting the linkage scaling. Hereinafter, depth linkage is described in detail with reference to figs. 25 to 33.
Fig. 25 shows a flowchart of a graphical interface display method 2500, according to an embodiment of the present disclosure. It should be understood that method 2500 may be performed by electronic device 100 described above with reference to fig. 1 or electronic device 200 described with reference to fig. 2. Method 2500 is described herein with reference to UI 300A of fig. 3A. However, it should be understood that UI 300A is merely an example and that method 2500 may be applied to any suitable interface, including, but not limited to UIs 300B-300C.
At block 2510, M user interface UI elements are displayed on the screen of the electronic device. M is a positive integer greater than 1. For example, the M UI elements may be UI elements 1 to 13.
At block 2520, a press at a first UI element of the M UI elements for a duration of time is detected. For example, the first UI element may be UI element 5. As described above, a press at the first UI element that is held for a duration of time will cause the first UI element to scale over time to present a press effect in the depth direction.
At block 2530, each of the N UI elements on the screen is scaled in response to detecting the press at the first UI element that is held for a duration of time. N is a positive integer between 1 and M-1. This visually indicates the coordinated pressing.
Fig. 26 shows a schematic diagram of an example of depth linkage 2600 of N UI elements according to an embodiment of the disclosure. As shown in fig. 26, the UI element 5 is pressed during a duration, so that the UI element 5 scales with time to present a pressing effect in the depth direction. In addition, other UI elements on the screen (e.g., UI elements 1 through 4 and 6 through 13) also scale at different magnitudes over time in response to the press to present a pressing effect in the depth direction. Thereby, the linkage pressing is visually presented. For clarity, fig. 26 only shows the depth linkage of UI elements 1-13 in UI 300A. It should be understood that depth linkage may occur between any two or more UI elements in any UI, for example, between any two or more UI elements in UIs 300A-300C.
In some embodiments, depth linkage may act on all UI elements on the screen. In this case, M-1 UI elements other than the first UI element among the M UI elements may be determined as the N UI elements. Alternatively, the depth linkage may only act on a portion of the UI elements on the screen. In this case, the influence area of the first UI element may be determined based on the size of the first UI element, and a UI element within the influence area among the M UI elements may be determined as the N UI elements. For example, the larger the size of the first UI element, the larger its area of influence may be. Alternatively, the area of influence may also shrink with size, and the disclosure is not limited thereto. For example, the area of influence may be a circle having a predetermined radius centered on the reference point of the first UI element. It should be appreciated that the area of influence may be any suitable area having any shape, such as a rectangle, a diamond, etc. having a predetermined size. The area of influence may be configurable by the electronic device and the user, and the disclosure is not limited thereto.
Further, in some embodiments, UI elements that intersect the area of influence may be considered to be within the area of influence. Alternatively, in the case where the area of influence is a circle having a predetermined radius, a UI element may be considered to be within the area of influence if the distance of the UI element from the first UI element is less than the predetermined radius of the area of influence.
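A minimal sketch of this membership test follows, assuming a circular influence area whose radius grows with the pressed element's size; the radius rule (2.5 times the size), the element coordinates, and the use of reference-point distance are illustrative, not values from this disclosure.

```kotlin
import kotlin.math.hypot

// Illustrative sketch: select the linked elements by a circular influence area
// whose radius grows with the pressed element's size.
data class UiElement(val id: Int, val cx: Double, val cy: Double, val size: Double)

fun elementsInInfluenceArea(pressed: UiElement, all: List<UiElement>): List<UiElement> {
    val radius = pressed.size * 2.5   // assumed: radius grows with pressed element's size
    return all.filter {
        it.id != pressed.id && hypot(it.cx - pressed.cx, it.cy - pressed.cy) < radius
    }
}

fun main() {
    val pressed = UiElement(id = 5, cx = 100.0, cy = 100.0, size = 40.0)  // radius 100
    val others = listOf(
        UiElement(id = 4, cx = 40.0, cy = 100.0, size = 40.0),    // 60 px away: inside
        UiElement(id = 13, cx = 300.0, cy = 300.0, size = 40.0),  // ~283 px away: outside
    )
    println(elementsInInfluenceArea(pressed, others).map { it.id })  // [4]
}
```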
Fig. 27 shows a schematic diagram of an example of an area of influence 2700 of UI elements according to an embodiment of the disclosure. As shown in fig. 27, since UI elements 3, 4, 7, 8 are within the area of influence 2710 of UI element 5, UI elements 3, 4, 7, 8 scale in linkage with UI element 5. Furthermore, because UI elements 1, 2, 6, 9-13 are not within the area of influence 2710 of UI element 5, UI elements 1, 2, 6, 9-13 do not scale in linkage with UI element 5.
Referring back to fig. 25, to scale the N UI elements by respective magnitudes, a distance between the first UI element and each of the N UI elements may be determined. As described above, in some embodiments, distances may be divided into a plurality of distance levels based on the range in which the distance is located. For example, the manipulated UI element itself may be at distance level 0, and UI elements in a linkage may be at distance levels 1, 2, 3, and so on, depending on their respective distances from the manipulated UI element; UI elements at the same distance level may be considered to be at the same distance. Thus, by using the distance levels, the linkage of the UI elements can be simplified, so that UI elements at the same distance level are linked in the same manner, thereby improving the uniformity and coordination of the linkage. However, it should be understood that in the linkage, the distance itself may also be used, thereby making the UI elements linked more precisely. Hereinafter, the distance level is interchangeably referred to as distance.
In the above, the manner of determining the distance between the first UI element and the second UI element of the N UI elements has been described with reference to fig. 10 to 14B, and thus the description thereof is omitted here.
Referring back to fig. 25, after determining the distance between the first UI element and the second UI element, the magnitude of scaling the second UI element may be determined based on the distance. For example, if the distance between the first UI element and the second UI element is larger, the magnitude of zooming the second UI element may be smaller, thereby visually indicating that the impact of the press on the distant UI element is smaller. Alternatively, if the distance between the first UI element and the second UI element is larger, the magnitude of zooming the second UI element may also be larger, visually indicating that the impact of the press on the distant UI element is greater.
In some embodiments, to determine the magnitude of the scaling of the second UI element, a first magnitude at which the first UI element scales in response to the press may be determined. In some embodiments, the first magnitude of the scaling of the first UI element may be determined based on various factors associated with the first UI element. These factors may include, but are not limited to, the size of the first UI element, the range of magnitudes that the first UI element can be varied, the duration of the compression, and the predetermined compression force. In the above, the influence of these factors on the zoom magnitudes of UI elements, respectively, is described in detail, and thus the description thereof is omitted here.
Then, a magnitude at which the second UI element zooms in response to the press may be determined based on the first magnitude and the distance between the first UI element and the second UI element. How the scaling magnitude of the first UI element is conducted to the second UI element, and thereby the scaling magnitude of the second UI element, may be achieved using the conduction means described in detail above. The difference is that, in depth linkage, x_n in equations (7) and (8) represents the zoom magnitude of the UI element zoomed in linkage (e.g., the second UI element), and x represents the zoom magnitude of the pressed UI element (e.g., the first UI element). Therefore, the description thereof is omitted here.
Therefore, the zooming amplitude of the second UI element is determined by the zooming amplitude of the first UI element and the distance between the second UI element and the first UI element, so that the depth linkage which is intuitive and natural and accords with the use habit of the user can be realized.
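Since equations (7) and (8) are defined earlier in the document and not reproduced in this section, the sketch below substitutes a simple power-law decay with the same qualitative behavior: the greater the distance level, the smaller the conducted zoom magnitude. The decay factor is an assumption.

```kotlin
import kotlin.math.pow

// Illustrative sketch: conduct the pressed element's zoom magnitude to linked
// elements with a power-law decay over distance level. This is only a stand-in
// for equations (7) and (8), with the same qualitative behavior.
fun conductedScale(pressedScale: Double, distanceLevel: Int, decay: Double = 0.5): Double {
    val shrink = 1.0 - pressedScale                  // how much the pressed element shrank
    return 1.0 - shrink * decay.pow(distanceLevel)   // attenuate the shrink with distance
}

fun main() {
    // The pressed element shrinks to 0.8x; linked elements shrink progressively less.
    println(conductedScale(0.8, 1))  // 0.9
    println(conductedScale(0.8, 2))  // 0.95
    println(conductedScale(0.8, 3))  // 0.975
}
```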
Further, in some embodiments, the size of the second UI element may also affect the magnitude of the scaling of the second UI element. In this case, the size of the second UI element may also be considered to determine the magnitude of the scaling of the second UI element. For example, if the second UI element is larger, the magnitude of the zooming of the second UI element may be larger, making the scaled UI elements on the screen more similar in size and thus more visually coordinated. Alternatively, if the second UI element is larger, the magnitude of zooming the second UI element may be smaller, so that the difference in size of the zoomed UI element on the screen is larger. To this end, in some embodiments, a magnitude at which the second UI element zooms in response to the press may be determined based on the first magnitude, the distance, and a size of the second UI element.
Additionally, in some embodiments, the size of the first UI element may also affect the magnitude of the scaling of the second UI element. In this case, the size of the first UI element may also be considered to determine the magnitude of the zooming of the second UI element. For example, the larger the size of a first UI element, the greater the linkage effect it may have, and thus the animation effect of scaling a second UI element may be proportional to the size of the first UI element. To this end, in some embodiments, the magnitude to scale the second UI element may be determined based on the first magnitude, the distance, and the size of the first UI element.
Further, as described above, both the size of the first UI element and the size of the second UI element may affect the magnitude of scaling the second UI element. Thus, in some embodiments, the magnitude to scale the second UI element may be determined based on the first amplitude, the distance, the size of the first UI element, and the size of the second UI element.
Referring back to fig. 25, after determining the magnitude to zoom the second UI element, the second UI element may be zoomed at that magnitude to visually indicate that the second UI element is zoomed as the first UI element is pressed. For the N UI elements, they may all be scaled by respective magnitudes to visually indicate a press on the entire screen or on a partial area of the screen, thereby presenting a press linkage.
Fig. 28 shows a schematic diagram of an example of a scaling 2800 of a distance-based UI element, according to an embodiment of the disclosure. As shown in fig. 28, a UI element at distance 0 (e.g., UI element 5 itself) has a larger zoom magnitude than UI elements at distance 1 (e.g., UI elements 3, 4, 7, 8); UI elements at distance 1 have a larger zoom magnitude than UI elements at distance 2 (e.g., UI elements 2, 6, 9); and UI elements at distance 2 have a larger zoom magnitude than UI elements at distance 3 (e.g., UI elements 1, 10-13).
Further, in some embodiments, to better present the conduction of the animation effect and improve the user experience, the first UI element and the second UI element do not begin zooming at the same time. For example, a first UI element may begin zooming when a press occurs, while a second UI element may begin zooming after a time period after the press occurs. To this end, in some embodiments, a delay time may be determined based on a distance between the first UI element and the second UI element, and the second UI element may be scaled in response to the delay time elapsing after the press occurs. Further, in some embodiments, a delay factor may be determined, and the delay time determined based on the distance and the delay factor. For example, the delay time may be a quotient of the distance divided by the delay coefficient. The delay factor may be configurable by the electronic device and the user.
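The stated rule, delay time equal to the distance divided by the delay coefficient, is straightforward; the following is a sketch with an illustrative coefficient value (per the text, the coefficient would be configurable by the device or the user):

```kotlin
// Illustrative sketch of the stated rule: delay = distance / delay coefficient.
fun linkageDelayMs(distanceLevel: Int, delayCoefficient: Double = 0.02): Double =
    distanceLevel / delayCoefficient

fun main() {
    for (level in 0..3) {
        println("distance $level -> delay ${linkageDelayMs(level)} ms")  // 0, 50, 100, 150
    }
}
```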
Fig. 29 shows a schematic diagram of an example of a delay time 2900 according to an embodiment of the present disclosure. As shown in fig. 29, the first UI element with distance 0 begins to zoom when the press occurs, the UI element with distance 1 zooms later than the first UI element, the UI element with distance 2 zooms later than the UI element with distance 1, and the UI element with distance 3 zooms later than the UI element with distance 2.
Fig. 30 shows a schematic diagram of an example of a zoom 3000 of a UI element with a delay time according to the present disclosure. As shown in fig. 30, UI element 5 at distance 0 starts to zoom at time T51 when the press occurs, UI elements 3, 4, 7, 8 at distance 1 start to zoom at a later time T52, UI elements 2, 6, 9 at distance 2 start to zoom at a still later time T53, and UI elements 1, 10-13 at distance 3 start to zoom at the latest time T54.
Further, in some embodiments, the speed of the scaling of the UI elements may be controlled by a predefined curve of amplitude versus time. For example, the predefined curve may be a bezier curve or an elastic force curve. In the case of the spring force curve, the speed of scaling can be controlled by controlling the damping coefficient and the stiffness coefficient of the spring. In the case of a bezier curve, the speed of scaling may be controlled by controlling the coordinates of at least one of the at least one control point of the bezier curve.
Further, in some embodiments, the UI elements scaled in linkage may also be moved toward the pressed UI element in order to improve the user experience. In particular, the N UI elements may be moved towards the first UI element to further visually highlight the press. For example, the magnitude of the displacement may depend on at least one of: the distance between the linked UI element and the pressed UI element, the duration of the press, the size of the second UI element, and the size of the first UI element. To this end, in some embodiments, the displacement by which to move the second UI element may be determined based on the distance between the first UI element and the second UI element, the duration of the press, the size of the first UI element, and/or the size of the second UI element.
The second UI element may then be moved in a direction from the second UI element to the first UI element by the displacement. For example, the second UI element may be moved by the displacement in a direction pointing from the second reference point of the second UI element to the first reference point of the first UI element. This has the visual effect that the second UI element is attracted to the first UI element. It should be understood that embodiments of the present disclosure are not limited thereto. For example, the second UI element may also be moved by the displacement in an opposite direction (e.g., a direction from the first reference point of the first UI element to the second reference point of the second UI element). This has the visual effect that the second UI element is repelled by the first UI element.
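A minimal sketch of this movement follows: the linked element is moved by a given displacement along the unit vector from its reference point toward the pressed element's reference point. The displacement value itself would come from the factors listed above; here it is simply a parameter, and all names are illustrative.

```kotlin
import kotlin.math.hypot

// Illustrative sketch: move a linked element by `displacement` along the unit
// vector from its reference point toward the pressed element's reference point.
data class Point(val x: Double, val y: Double)

fun moveToward(from: Point, to: Point, displacement: Double): Point {
    val dx = to.x - from.x
    val dy = to.y - from.y
    val len = hypot(dx, dy)
    if (len == 0.0) return from                  // same reference point: no movement
    return Point(from.x + displacement * dx / len, from.y + displacement * dy / len)
}

fun main() {
    val pressed = Point(0.0, 0.0)
    val linked = Point(30.0, 40.0)               // 50 px from the pressed element
    println(moveToward(linked, pressed, 5.0))    // Point(x=27.0, y=36.0)
}
```

Negating the displacement reproduces the opposite, repelled-by-the-first-element effect mentioned above.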
Fig. 31 shows a schematic diagram of an example of a displacement 3100 of a movement of a UI element according to an embodiment of the disclosure. As shown in fig. 31, UI elements at distance 1 (e.g., UI elements 3, 4, 7, 8) have a displacement magnitude greater than UI elements at distance 2 (e.g., UI elements 2, 6, 9), and UI elements at distance 2 have a displacement magnitude greater than UI elements at distance 3 (e.g., UI elements 1, 10-13).
Further, in some embodiments, after pressing ends (e.g., after the user lifts the finger off the screen), the scaled UI element may resume. In particular, both the pressed UI element and the co-scaled N UI elements may be restored. To this end, in some embodiments, the scaled second UI element may be restored to the pre-scaled second UI element. The recovery process may be an inverse process of scaling, and thus a detailed description thereof is omitted herein.
Fig. 32A illustrates a schematic diagram of an example of a restoration 3200A of UI elements according to an embodiment of the disclosure. As shown in FIG. 32A, the scaled UI elements (e.g., UI elements 1-13) all return to the original size before scaling.
Further, as described above, in some embodiments, the UI element may also move in response to a press. In this case, the moved UI element may be reset after the pressing is finished. Specifically, N UI elements moving to the pressed UI element may all be reset. To this end, in some embodiments, the second UI element may be restored from the post-movement position to the pre-movement position.
Fig. 32B illustrates a schematic diagram of an example of a restoration 3200B with displaced UI elements, according to an embodiment of the disclosure. As shown in FIG. 32B, the UI elements that were moved and scaled (e.g., UI elements 1-13) all return to the pre-movement position and the pre-scaled initial size.
In some embodiments, the restoration of zoom or the restoration of movement may have a rebound effect. For example, with respect to restoration of zoom, after the user releases his or her hand, the size of the UI element may first be increased to greater than the original size and then reduced to the original size. Further, with respect to the return of movement, after the user releases his or her hand, the UI element that moves in a coordinated manner may move farther away from the pressed UI element than the initial position before the movement and then return to the initial position.
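Both rebound behaviors can be produced by restoring the animated property with an underdamped spring, which overshoots the rest value before settling back. The following Kotlin sketch samples such a curve for the size restore; the frequency and damping ratio are illustrative assumptions, not values from this disclosure.

```kotlin
import kotlin.math.cos
import kotlin.math.exp
import kotlin.math.sqrt

// Illustrative sketch: restore a property (size here) with an underdamped
// spring so that it overshoots its rest value and settles back.
fun underdampedValue(start: Double, rest: Double, t: Double,
                     omega: Double = 20.0, zeta: Double = 0.3): Double {
    val wd = omega * sqrt(1 - zeta * zeta)       // damped natural frequency
    return rest + (start - rest) * exp(-zeta * omega * t) * cos(wd * t)
}

fun main() {
    // A size released at 0.5x recovers toward 1.0x, overshoots above 1.0x
    // (cf. the overshoot described below), then settles back to 1.0x.
    for (t in listOf(0.0, 0.1, 0.165, 0.3, 0.6)) {
        println("t=${t}s size=%.3f".format(underdampedValue(0.5, 1.0, t)))
    }
}
```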
Fig. 33A-33B show schematic diagrams of examples of a size-time curve 3300A and a displacement-time curve 3300B, respectively, of a restoration of a UI element with a bounce effect, according to an embodiment of the present disclosure.
Regarding zoom rebound, as shown in fig. 33A, at T61, the UI element 5 is pressed and contracted. In addition, other UI elements (e.g., UI elements 1-4, 6-13) also zoom out in conjunction.
At T62, the user releases his hand. At this time, the UI element 5 is reduced to 0.5 times the original size. Further, UI elements with distance 1 (e.g., UI elements 3, 4, 7, 8) zoom out in unison, but with a smaller zoom-out than UI element 5. In addition, UI elements at distance 2 (e.g., UI elements 2, 6, 9) also zoom out in unison, but at a smaller zoom-out magnitude than UI elements at distance 1. Further, UI elements at distance 3 (e.g., UI elements 1, 10-13) also zoom out in tandem, but at a smaller zoom size than UI elements at distance 2.
During T62 to T63, the UI elements begin the zoom rebound.
At T63, the size of the UI element 5 increases to 1.2 times the original size. Further, UI elements having a distance of 1 (e.g., UI elements 3, 4, 7, 8) increase in unison, but by a lesser extent than UI element 5. In addition, UI elements with distance 2 (e.g., UI elements 2, 6, 9) also increase in unison, but the magnitude of the increase is less than for UI elements with distance 1. Further, UI elements at distance 3 (e.g., UI elements 1, 10-13) also increase in unison, but by a lesser degree than UI elements at distance 2.
At T64, the size of all UI elements is restored to the original size.
Further, regarding the movement rebound, as shown in fig. 33B, at T71, the UI element 5 is pressed. The other UI elements move towards the UI element 5.
At T72, the user releases his hand. At this time, the displacement of UI elements at distance 1 (e.g., UI elements 3, 4, 7, 8) moving toward UI element 5 is -1. In addition, UI elements at distance 2 (e.g., UI elements 2, 6, 9) also move toward UI element 5, but with a displacement magnitude smaller than that of the UI elements at distance 1. Further, UI elements at distance 3 (e.g., UI elements 1, 10-13) also move toward UI element 5, but with a displacement magnitude smaller than that of the UI elements at distance 2.
During T72 to T73, the UI elements begin the movement rebound.
At T73, UI elements at distance 1 (e.g., UI elements 3, 4, 7, 8) overshoot the initial position, reaching a displacement of +0.7. In addition, UI elements at distance 2 (e.g., UI elements 2, 6, 9) also overshoot the initial position, but with a displacement magnitude smaller than that of the UI elements at distance 1. Further, UI elements at distance 3 (e.g., UI elements 1, 10-13) also overshoot the initial position, but with a displacement magnitude smaller than that of the UI elements at distance 2.
At T74, the positions of all UI elements are restored to the original positions.
It should be appreciated that the zoom sizes (e.g., 0.5 times, 1.2 times) and movement displacements (e.g., -1, +0.7) in figs. 33A-33B are merely examples; UI elements may be scaled by any suitable magnitude or moved by any suitable displacement. Further, although the rebound effect is shown in figs. 33A-33B as rebounding only once, a rebound effect with multiple rebounds may be achieved. The number of rebounds may be any suitable number, and the disclosure is not limited herein. In certain embodiments, the magnitude of the multiple rebounds may decrease over time.
Fig. 33C-33D show schematic diagrams of examples of a recovered size-time curve 3300C and displacement-time curve 3300D, respectively, for a UI element having a rebound effect of multiple rebounds with a reduced rebound amplitude, according to an embodiment of the present disclosure.
As shown in fig. 33C, the UI element returns to the original size after a number of rebounds, wherein the rebounds of the UI element having a distance of 0 (e.g., UI element 5) are scaled by a larger magnitude than the UI elements having a distance of 1 (e.g., UI elements 3, 4, 7, 8). The rebound of the UI element of distance 1 scales by a larger magnitude than the UI element of distance 2 (e.g., UI elements 2, 6, 9). The rebound of the UI element of distance 2 scales more than the UI element of distance 3 (e.g., UI elements 1, 10-13).
Similarly, as shown in fig. 33D, the UI element returns to the original position after a plurality of rebounds, wherein the magnitude of the displacement of the rebound of the UI element having a distance of 0 (e.g., UI element 5) is larger than that of the UI element having a distance of 1 (e.g., UI elements 3, 4, 7, 8). The magnitude of the displacement of the bounce of the UI element of distance 1 is greater than the UI element of distance 2 (e.g., UI elements 2, 6, 9). The magnitude of the rebound displacement for UI elements at distance 2 is greater than for UI elements at distance 3 (e.g., UI elements 1, 10-13).
Further, in certain embodiments, the rebound effect may also be controlled by a predefined curve (e.g., an elastic force curve, a Bezier curve, etc.). For example, these UI elements may rebound or move back in a predefined curvilinear controlled animation effect scale.
Pressure linkage
Embodiments of the present disclosure relate to the linkage of UI elements in a UI on the animation effect of a press, also referred to as pressure linkage. In pressure linkage, the target UI element that is pressed may affect other UI elements that are not pressed. In particular, triggering the animation effect of a target UI element may jointly trigger the animation effect of one or more other UI elements, or even of the other UI elements in the entire UI, such that the other UI elements are affected by the target UI element. For example, in pressure linkage, when a target UI element is pressed, in addition to the target UI element presenting a pressing effect with an animation effect, other UI elements may also present pressing effects with corresponding animation effects, thereby visually presenting the linkage pressing.
Thereby, the connection between the animation effects of different UI elements can be strengthened, and the relationship between individual UI elements can be highlighted. Compared with traditional animations, in which each UI element has a single animation effect and the elements are independent and unrelated, embodiments of the disclosure make the animation effect better conform to physical laws and take into account real usage scenes and user habits, thereby significantly improving the user experience.
Some example embodiments of the pressure linkage will be described below with reference to fig. 34-46.
Pressure linkages may occur in a UI having any suitable regular or irregular layout, and the UI elements in the UI may have any suitable size and shape. For example, the pressure linkage may occur in UIs 300A-300C as shown in FIGS. 3A-3C.
A UI element in the UI may be pressed. For example, a user may press a UI element when the user desires to perform an operation associated with the UI element. As an example, a user may press a UI element when the user desires to enter an application represented by the UI element, open a menu associated with the UI element, or the like. In the event that a press at a UI element is detected, the UI element may change in an animation effect, e.g., the UI element may visually move in position (hereinafter alternatively referred to as rotate) in a seesaw manner with respect to the position of the press, or visually sag or protrude with respect to the position of the press, to present the pressing action. In this case, the changes of the UI elements may conform to the point pressure model. In the point pressure model, the pressure of the UI element at the position of the press is greater than that of the other portions.
In some embodiments, UI elements may be considered rigid bodies. In this case, upon detecting a press at the UI element, the UI element may visually move in position in a seesaw manner with respect to the position of the press to present the press effect.
Fig. 34 shows a schematic diagram of an example of a change 3400 in a UI element that is a rigid body when pressed, according to some embodiments of the present disclosure. As shown in fig. 34, in the event that a press at a UI element is detected, the UI element may change from the initial shape 3410 with an animation effect to visually move its position in a seesaw manner with respect to the position of the press. For example, when the pressed position is located on the left side of the UI element, the UI element visually rotates to the left about its reference point (shown with "+"), thereby changing to the shape 3420. The changed shape 3420 is similar to a seesaw in which the left side is depressed and the right side is tilted up. Further, when the pressed position is located on the right side of the UI element, the UI element visually rotates to the right about its reference point, thereby changing into the shape 3430. The changed shape 3430 resembles a seesaw with the left side tilted up and the right side depressed.
In this case, the UI element may be considered as a seesaw whose two sides are each connected to a spring, and pressing the UI element may be considered as compressing the spring on one side and stretching the spring on the other side, thereby achieving an animation effect in which the UI element as a whole rotates about its reference point.
Fig. 35 illustrates a schematic diagram of an example of a compression and tension 3500 of a spring simulating a compression of a UI element, according to some embodiments of the present disclosure. 3510 shows the two-sided spring in an initial state. 3520 shows that when the pressed position is to the left of the UI element, the spring to the left is pressed and the spring to the right is stretched. 3530 shows that when the location of the press is to the right of the UI element, the right spring is pressed and the left spring is stretched.
In this case, the model of the spring can be represented by the following equation (9):
(Equation (9) is rendered as an image in the original publication and is not reproduced here.)
where L denotes a horizontal distance between the pressed position and a reference point of the UI element, c denotes a straight-line distance between the pressed position and the reference point, and k' denotes an elastic coefficient of the spring.
Further, the above equation (9) can be transformed into the form of the following equation (10):
m'·d²x'/dT² = -k'·x' - g'·dx'/dT (10)
where k' represents the spring constant of the spring, x' represents the deformation amount of the spring, g' represents the damping constant of the spring, T represents the time during which the deformation takes place, and m' represents the size of the UI element.
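As a numerical illustration of equation (10), as reconstructed above from its variable definitions as a standard damped-spring relation, the sketch below integrates it with semi-implicit Euler, treating the UI element's size m' as the mass term. The step size and coefficient values are illustrative assumptions.

```kotlin
// Illustrative sketch: integrate the damped-spring relation of equation (10)
// with semi-implicit Euler, treating the UI element's size m' as the mass.
fun simulateSpring(
    k: Double,          // spring constant k'
    g: Double,          // damping constant g'
    m: Double,          // UI element size m', acting as the mass term
    x0: Double,         // initial deformation x'
    dt: Double = 0.001, // integration step (s)
    steps: Int = 500,
): Double {
    var x = x0
    var v = 0.0
    repeat(steps) {
        val a = (-k * x - g * v) / m  // m' * d2x'/dT2 = -k' * x' - g' * dx'/dT
        v += a * dt
        x += v * dt
    }
    return x  // deformation after steps * dt seconds
}

fun main() {
    // A larger element (bigger m') recovers from the same deformation more slowly.
    println(simulateSpring(k = 200.0, g = 20.0, m = 1.0, x0 = 10.0))
    println(simulateSpring(k = 200.0, g = 20.0, m = 4.0, x0 = 10.0))
}
```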
Further, in some embodiments, UI elements may be considered non-rigid bodies. In this case, where a press at the UI element is detected, the UI element may be visually recessed or protruded relative to the location of the press to present the press effect.
Fig. 36 shows a schematic diagram of an example of a change 3600 of a UI element that is a non-rigid body when pressed, according to some embodiments of the present disclosure. As shown in fig. 36, the UI element may be regarded as a grid map. In the event that a press at a UI element is detected, the initial UI element 3610 may change in an animation effect to be visually recessed or protruding relative to the location of the press. For example, the coordinates of the grid within the initial UI element 3610 may change, thereby changing to a UI element 3620 that is concave with respect to the location of the press. Further, in some embodiments, the color (e.g., hue, brightness, saturation, etc.) of the UI element may also be varied to highlight the press. For example, the initial UI element 3610 may also be changed to a UI element 3630 that is concave and darkened with respect to the location of the press. It should be understood that color changes may also be applied to UI elements that are rigid bodies.
In some embodiments, after a change in a UI element, the location of the press may no longer be within the scope of the changed UI element. In this case, the press may continue to be detected as a press against the UI element because the location of the press is still within the range of the UI element before the change, or any other suitable range. Alternatively, the press will not be detected as a press against the UI element since the position of the press is no longer within the range of the changed UI element. In this case, the pressing may be considered to be ended.
Hereinafter, embodiments of the present disclosure will be described taking as an example that a UI element visually moves in position in a seesaw manner with respect to a pressed position. However, it should be understood that the UI elements may also vary in other ways, such as depressions or protrusions visually relative to the location of the press.
Further, in some embodiments, to make the changes to the UI elements conform to natural laws and user usage habits, the magnitude of the changes may depend on the amount of force of the press. In the real world, the magnitude of the force generally refers to the magnitude of the real force. In this case, the greater the force of the press, the greater the change in the UI element. In some embodiments, the force of the press may be a user-applied press force detected by the electronic device. Alternatively, the pressing force may also be a predetermined pressing force set by the electronic device or the user.
Fig. 37 shows a schematic diagram of an example of a change 3700 of UI elements at different pressing forces according to some embodiments of the present disclosure. As shown in fig. 37, in the case where the pressing force is large, the UI element may be changed (e.g., rotated) in a larger magnitude. In contrast, in the case where the pressing force is small, the UI element may change in a smaller magnitude. However, embodiments of the present disclosure are not limited thereto. The manner in which the UI elements change in response to different pressing forces is configurable by the electronic device or the user. For example, in the case where the pressing force is large, the UI element may change in a smaller magnitude, and in the case where the pressing force is small, the UI element may change in a larger magnitude.
Further, in electronic devices, basing the change solely on a real pressing force may be demanding for the user and may require the electronic device to be equipped with related hardware. Thus, in some embodiments, the duration of the press may be used to simulate or replace the force of the press. For example, a longer press may be treated as a greater pressing force, and thus produce a greater change.
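As an illustrative sketch of this substitution, the press duration can be mapped to a simulated force with a simple capped linear rule; the gain and cap constants below are assumptions chosen only for illustration:

// Sketch: simulating the press force from the press duration.
// GAIN and MAX_FORCE are illustrative assumptions, not values from the source.
public final class PressForce {
    private static final double GAIN = 0.01;      // force units per millisecond (assumed)
    private static final double MAX_FORCE = 10.0; // cap so very long presses saturate

    static double simulatedForce(long pressDurationMs) {
        return Math.min(MAX_FORCE, GAIN * pressDurationMs);
    }

    public static void main(String[] args) {
        System.out.println(simulatedForce(150));  // short press -> small force
        System.out.println(simulatedForce(2000)); // long press -> capped force
    }
}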
Fig. 38 illustrates a schematic diagram of an example of a change 3800 of a UI element at different press durations, according to some embodiments of the disclosure. As shown in fig. 38, in the case where the duration of the press is long, the UI element may be changed (e.g., rotated) by a larger margin. In contrast, in the case where the duration of the press is short, the UI element may change in a smaller magnitude.
However, embodiments of the present disclosure are not limited thereto. The manner in which the UI elements change in response to different press durations is configurable by the electronic device or user. For example, in the case where the duration of the press is long, the UI element may change in a smaller magnitude, and in the case where the duration of the press is short, the UI element may change in a larger magnitude.
Furthermore, in some embodiments, to further conform the changes to the UI elements to natural laws and user usage habits, the magnitude of the changes may depend on the size of the UI elements. For example, intuitively, the same press may have difficulty pressing a larger UI element, while a smaller UI element may be pressed with less effort. In this case, larger UI elements may be less affected by the press, while smaller UI elements may be more affected by the press.
FIG. 39 shows an illustrative diagram of an example of a change 3900 in UI elements of different sizes, in accordance with some embodiments of the disclosure. As shown in fig. 39, in the case where the size of the UI element is large, the UI element may be changed in a larger magnitude. In contrast, in the case where the size of the UI element is small, the UI element may be changed in a smaller magnitude.
However, embodiments of the present disclosure are not limited thereto. The manner in which the UI elements change at different sizes is configurable by the electronic device or user. For example, in the case where the size of the UI element is large, the UI element may be changed in a smaller magnitude, and in the case where the size of the UI element is small, the UI element may be changed in a larger magnitude.
Furthermore, in some embodiments, to improve the user experience, the magnitude to which the UI element can be varied may be limited such that the UI element can only be varied within a range of permitted magnitudes. For example, the amplitude range may be any suitable range, such as a rotation angle of the UI element between 0-60 degrees, a grayscale of a color change of the UI element between 10% -50%, or a coordinate change of a grid within the UI element between 100-10000 pixels, etc. By way of example, assume a range of amplitudes for which the UI element is rotated between 0-60 degrees. In this case, regardless of the predetermined pressing force and the duration of the pressing, the pressed UI element can be rotated only by 60 degrees around the reference point at most, and cannot be rotated by a larger magnitude.
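A minimal sketch of such amplitude limiting, using the example range of 0 to 60 degrees of rotation from the text (the clamping rule itself is an illustrative assumption):

// Sketch: limiting the change of a UI element to a permitted amplitude range.
public final class AmplitudeLimit {
    private static final double MIN_DEGREES = 0.0;
    private static final double MAX_DEGREES = 60.0; // example upper bound from the text

    static double clampRotation(double requestedDegrees) {
        return Math.max(MIN_DEGREES, Math.min(MAX_DEGREES, requestedDegrees));
    }

    public static void main(String[] args) {
        System.out.println(clampRotation(75.0)); // clamped to 60.0
        System.out.println(clampRotation(30.0)); // within range, unchanged
    }
}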
In the above, the change of the pressed UI element has been described in detail. As described above, in pressure linkage, a pressed target UI element may affect other UI elements that are not pressed. In particular, in pressure linkage, triggering the animation effect of a target UI element may jointly trigger the animation effects of one or more other UI elements, and even of other UI elements in the entire UI, such that the other UI elements are affected by the target UI element. For example, in pressure linkage, when a target UI element is pressed for a duration, in addition to the target UI element changing with an animation effect, other UI elements may also change with corresponding animation effects, thereby visually presenting the linked pressing. Therefore, hereinafter, pressure linkage will be described in detail with reference to fig. 40 to 46.
FIG. 40 shows a flow diagram of a graphical interface display method 4000 according to an embodiment of the disclosure. It should be appreciated that method 4000 may be performed by electronic device 100 described above with reference to fig. 1 or electronic device 200 described above with reference to fig. 2. Method 4000 is described herein with reference to UI 300A of fig. 3A. However, it should be understood that UI 300A is merely an example and method 4000 may be applied to any suitable interface, including but not limited to UIs 300B-300C.
At block 4010, M user interface UI elements are displayed on a screen of the electronic device. M is a positive integer greater than 1. For example, the M UI elements may be UI elements 1 to 13.
At block 4020, a press at a first UI element of the M UI elements is detected. For example, the first UI element may be UI element 5. As described above, a press at the first UI element will cause the first UI element to rotate to present a press effect.
At block 4030, in response to detecting the press at the first UI element, each of N UI elements on the screen is caused to change with a respective animation effect, where N is a positive integer between 1 and M-1. Thereby, the linked pressing is visually indicated.
In some embodiments, the direction in which the N UI elements change relative to the location of the press may be a direction pointing from each of the N UI elements to the location of the press. In some embodiments, the direction may be a direction pointing from the respective reference point of each of the N UI elements to the reference point of the pressed UI element. In this case, the position of the press serves as the change reference point for the changes of the N elements, that is, the position of the press is visually presented as the center of the press. Fig. 41 shows a schematic diagram of an example of a pressure linkage 4100 of N UI elements according to an embodiment of the disclosure. As shown in fig. 41, the UI element 5 is pressed, so that the UI element 5 rotates to present the pressing effect. In addition, other UI elements on the screen (e.g., UI elements 1 to 4 and 6 to 13) also rotate at different amplitudes relative to the position of the press in response to the press, to present the pressing effect. Thereby, the linked pressing is visually presented.
Alternatively, the direction in which the N UI elements change relative to the position of the press may be the same as the direction in which the pressed UI element changes. FIG. 42 illustrates a schematic diagram of another example of a pressure linkage 4200 of N UI elements according to an embodiment of the disclosure. As shown in fig. 42, the UI element 5 is pressed, so that the UI element 5 rotates to present the pressing effect. In addition, other UI elements on the screen (e.g., UI elements 1 to 4 and 6 to 13) also rotate in the same direction as UI element 5, at different magnitudes, in response to the press, to present the pressing effect. In this case, the change reference point of each of the N elements is its own reference point. Thereby, the linked pressing is visually presented.
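For illustration, the press-pointing variant of the direction described above can be computed as a unit vector from a linked element's reference point to the press position; the sketch and its names are illustrative assumptions:

// Sketch: computing the direction in which a linked element changes, as a unit
// vector pointing from the linked element's reference point to the press position.
public final class LinkageDirection {
    static double[] directionToPress(double elementX, double elementY,
                                     double pressX, double pressY) {
        double dx = pressX - elementX;
        double dy = pressY - elementY;
        double length = Math.hypot(dx, dy);
        if (length == 0) {
            return new double[] {0, 0}; // the element itself was pressed
        }
        return new double[] {dx / length, dy / length};
    }

    public static void main(String[] args) {
        double[] dir = directionToPress(0, 0, 3, 4);
        System.out.printf("(%.2f, %.2f)%n", dir[0], dir[1]); // prints (0.60, 0.80)
    }
}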
For clarity, FIGS. 41-42 only show the pressure linkage of UI elements 1-13 in UI 300A. It should be understood that pressure linkage may occur between any two or more UI elements in any UI, for example, between any two or more UI elements in UIs 300A-300C.
In some embodiments, the pressure linkage may act on all UI elements on the screen. In this case, M-1 UI elements other than the first UI element among the M UI elements may be determined as the N UI elements. Alternatively, the pressure linkage may only act on a portion of the UI elements on the screen. In this case, the influence area of the first UI element may be determined based on the size of the first UI element, and a UI element within the influence area among the M UI elements may be determined as the N UI elements. For example, the larger the size of the first UI element, the larger its area of influence may be. Alternatively, the area of influence may also shrink with decreasing size, as the disclosure is not limited herein. For example, the area of influence may be a circle having a predetermined radius centered on the reference point of the first UI element. It should be appreciated that the area of influence may be any suitable area having any shape, such as a rectangle, a diamond, etc. having a predetermined size. The area of influence may be configurable by the electronic device and the user, as the disclosure is not limited herein.
Further, in some embodiments, UI elements that intersect the area of influence may be considered to be within the area of influence. Alternatively, in the case where the area of influence is a circle having a predetermined radius, the UI element may be considered to be within the area of influence if the distance of the UI element from the first UI element is less than the predetermined radius of the area of influence.
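The following is a minimal sketch of such a circular area-of-influence test; the rule that the radius grows in proportion to the pressed element's size is an illustrative assumption:

// Sketch: deciding which UI elements fall inside a circular area of influence
// centred on the pressed element's reference point.
public final class InfluenceArea {
    static boolean isInfluenced(double pressedX, double pressedY, double pressedSize,
                                double otherX, double otherY) {
        double radius = pressedSize * 2.0; // assumed: radius proportional to size
        return Math.hypot(otherX - pressedX, otherY - pressedY) < radius;
    }

    public static void main(String[] args) {
        System.out.println(isInfluenced(0, 0, 100, 150, 0)); // true: within radius 200
        System.out.println(isInfluenced(0, 0, 100, 300, 0)); // false: outside
    }
}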
Fig. 43 shows a schematic diagram of an example of an area of influence 4300 of a UI element according to an embodiment of the present disclosure. As shown in fig. 43, since the UI elements 3, 4, 7, 8 are within the area of influence 4310 of the UI element 5, the UI elements 3, 4, 7, 8 will change in linkage with the UI element 5. Further, because the UI elements 1, 2, 6, 9-13 are not within the area of influence 4310 of the UI element 5, the UI elements 1, 2, 6, 9-13 do not change in linkage with the UI element 5.
Referring back to fig. 40, to cause the N UI elements to change with respective animation effects, a distance between the first UI element and each of the N UI elements may be determined. As described above, in some embodiments, distances may be divided into a plurality of distance levels based on the range in which the distance lies. For example, the manipulated UI element itself may be at distance level 0, the linked UI elements may be at distance levels 1, 2, 3, and so on, depending on their respective distances from the manipulated UI element, and UI elements at the same distance level may be considered to be at the same distance. Thus, by using distance levels, the linkage of the UI elements can be simplified, so that UI elements at the same distance level are linked in the same manner, thereby improving the uniformity and harmony of the linkage. However, it should be understood that the distance itself may also be used in the linkage, thereby making the UI elements linked more precisely. Hereinafter, the distance level is interchangeably referred to as distance.
In the above, the manner of determining the distance between the first UI element and the second UI element of the N UI elements has been described with reference to fig. 10 to 14B, and thus the description thereof is omitted here.
Referring back to fig. 40, after determining the distance between the first UI element and the second UI element, an animation effect in which the second UI element changes may be determined based on the distance. For example, if the distance between the first UI element and the second UI element is larger, the magnitude of the change of the second UI element may be smaller, thereby visually indicating that the impact of the press on the distant UI element becomes smaller. Alternatively, if the distance between the first UI element and the second UI element is larger, the magnitude of the change of the second UI element may also be larger, visually indicating that the impact of the press on the distant UI element is larger.
In some embodiments, to determine the magnitude of the change in the second UI element, a first magnitude of the change in the first UI element in response to the press may be determined. In some embodiments, the first magnitude at which the first UI element changes may be determined based on various factors associated with the first UI element. These factors may include, but are not limited to, the size of the first UI element, the location of the first reference point of the first UI element, the range of magnitudes that the first UI element can be varied, the location of the press, the duration of the press, and the predetermined pressing force. In the above, the influence of these factors on the magnitude of change in the UI element, respectively, is described in detail, and therefore the description thereof is omitted here.
Then, the magnitude by which the second UI element changes in response to the press may be determined based on the first magnitude and the distance between the first UI element and the second UI element. Conducting the magnitude of change of the first UI element to the second UI element, and thus obtaining the magnitude of change of the second UI element, can be achieved using the conduction manner described in detail above. The difference is that in pressure linkage, x_n in equations (7) and (8) represents the magnitude of change of a UI element changing in linkage (e.g., the second UI element), and x represents the magnitude of change of the pressed UI element (e.g., the first UI element). Therefore, the description thereof is omitted here.
Therefore, the change amplitude of the second UI element is determined by the change amplitude of the first UI element and the distance between the second UI element and the first UI element, so that intuitive and natural pressure linkage according with the use habit of the user can be realized.
Further, in some embodiments, the size of the second UI element may also affect the animation effect of the second UI element changing. In this case, the size of the second UI element may also be considered to determine the animation effect of the second UI element changing. For example, if the second UI element is larger, the magnitude of the change in the second UI element may be larger. Alternatively, if the second UI element is larger, the magnitude of the change of the second UI element may be smaller. To this end, in some embodiments, the magnitude at which the second UI element changes in response to the press may be determined based on the first magnitude, the distance, and the size of the second UI element.
Additionally, in some embodiments, the size of the first UI element may also affect the animation effect of the second UI element changing. In this case, the size of the first UI element may also be considered to determine the animation effect of the second UI element changing. For example, the larger the size of the first UI element, the greater the linkage effect it may have, and thus the magnitude of the change in the second UI element may be proportional to the size of the first UI element. To this end, in some embodiments, the magnitude of the second UI element may be determined based on the first magnitude, the distance, and the size of the first UI element.
Further, as described above, both the size of the first UI element and the size of the second UI element may affect the animation effect of the second UI element changing. Thus, in some embodiments, the magnitude of the change in the second UI element may be determined based on the first magnitude, the distance, the size of the first UI element, and the size of the second UI element.
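For illustration only, the following sketch derives the linked element's amplitude from the first amplitude, the distance level, and the two sizes; the exponential decay and the size weighting are assumptions, since the patent's own transfer rule is given by its equations (7) and (8), which are not reproduced here:

// Sketch: deriving a linked element's change amplitude. The decay factor and
// size weighting below are illustrative assumptions.
public final class LinkedAmplitude {
    static double linkedAmplitude(double firstAmplitude, int distanceLevel,
                                  double firstSize, double secondSize) {
        double decayPerLevel = 0.6; // assumed attenuation per distance level
        double decay = Math.pow(decayPerLevel, distanceLevel);
        double sizeWeight = firstSize / (firstSize + secondSize); // assumed weighting
        return firstAmplitude * decay * sizeWeight;
    }

    public static void main(String[] args) {
        // Pressed element rotates 60 degrees; farther neighbours rotate less.
        for (int level = 1; level <= 3; level++) {
            System.out.printf("level %d -> %.1f degrees%n",
                    level, linkedAmplitude(60.0, level, 100.0, 100.0));
        }
    }
}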
Referring back to fig. 40, after determining the animation effect with which the second UI element changes, the second UI element may be caused to change with the animation effect to visually indicate that the second UI element changes as the first UI element is pressed. The N UI elements may be changed in their respective animation effects to visually indicate a press on the entire screen or a partial area of the screen, thereby presenting a press linkage.
Fig. 44 illustrates a schematic diagram of an example of a variation 4400 of distance-based UI elements according to an embodiment of the present disclosure. As shown in fig. 44, a UI element at distance 0 (e.g., UI element 5 itself) changes by a greater extent than a UI element at distance 1 (e.g., UI elements 3, 4, 7, 8), a UI element at distance 1 changes by a greater extent than a UI element at distance 2 (e.g., UI elements 2, 6, 9), and a UI element at distance 2 changes by a greater extent than a UI element at distance 3 (e.g., UI elements 1, 10-13).
Further, in some embodiments, to better present the conduction of the animation effect and improve the user experience, the first UI element and the second UI element do not begin to change at the same time. For example, a first UI element may begin to change when a press occurs, while a second UI element may begin to change after a period of time after the press occurs. To this end, in some embodiments, a delay time may be determined based on a distance between the first UI element and the second UI element, and the second UI element may be changed in response to the delay time elapsing after the press occurs. Further, in some embodiments, a delay factor may be determined, and the delay time determined based on the distance and the delay factor. For example, the delay time may be a quotient of the distance divided by the delay coefficient. The delay factor may be configurable by the electronic device and the user.
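A minimal sketch of such a distance-based delay on Android, with the delay computed as the quotient of the distance level and a delay factor as described above; the factor value and the concrete animation started after the delay are illustrative assumptions:

import android.view.View;

// Sketch: starting a linked element's change only after a distance-based delay.
public final class LinkedDelay {
    static void startLinkedChange(View linkedView, int distanceLevel) {
        float delayFactor = 0.05f; // assumed: distance levels per millisecond
        long delayMs = (long) (distanceLevel / delayFactor);
        linkedView.postDelayed(() -> {
            // The linked element begins its own change animation after the delay.
            linkedView.animate().rotation(30f).setDuration(200).start();
        }, delayMs);
    }
}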
Fig. 45 shows a schematic diagram of an example of a delay time 4500 according to an embodiment of the present disclosure. As shown in fig. 45, the first UI element (e.g., UI element 5) at distance 0 starts to change at time T81 when the press occurs, the UI elements at distance 1 (e.g., UI elements 3, 4, 7, 8) start to change at a later time T82, the UI elements at distance 2 (e.g., UI elements 2, 6, 9) start to change at a still later time T83, and the UI elements at distance 3 (e.g., UI elements 1, 10-13) start to change at the latest time T84.
Further, in some embodiments, the speed at which the UI element changes may be controlled by a predefined curve of amplitude change over time. For example, the predefined curve may be a Bezier curve or an elastic force curve. In the case of the elastic force curve, the speed of the change can be controlled by controlling the damping coefficient and the stiffness coefficient of the spring. In the case of a Bezier curve, the speed of the change may be controlled by controlling the coordinates of at least one control point of the Bezier curve.
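For illustration, the Bezier variant can be realized on Android with the platform PathInterpolator, whose constructor takes the two control points of a cubic Bezier curve; the control-point and duration values below are illustrative assumptions:

import android.animation.ValueAnimator;
import android.view.View;
import android.view.animation.PathInterpolator;

// Sketch: controlling the speed of the change with a cubic Bezier curve.
public final class CurveControlledChange {
    static ValueAnimator rotationAnimator(View view) {
        ValueAnimator animator = ValueAnimator.ofFloat(0f, 30f);
        // Control points (0.25, 0.1) and (0.25, 1.0): fast start, slow settle.
        animator.setInterpolator(new PathInterpolator(0.25f, 0.1f, 0.25f, 1f));
        animator.setDuration(300);
        animator.addUpdateListener(a -> view.setRotation((float) a.getAnimatedValue()));
        return animator;
    }
}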
Further, in some embodiments, the changed UI element may be restored after the pressing is finished (e.g., after the user lifts the finger off the screen). Specifically, both the pressed UI element and the N UI elements that change in linkage can be restored. To this end, in some embodiments, the changed second UI element may be restored to the second UI element before the change. The recovery process may be the reverse process of the change, and thus a detailed description thereof is omitted herein.
Fig. 46 shows a schematic diagram of an example of a restoration 4600 of UI elements according to an embodiment of the present disclosure. As shown in FIG. 46, the UI elements that have changed (e.g., UI elements 1-13) all revert to their pre-change form.
In certain embodiments, the recovery of the change may have a rebound effect. For example, after the user releases his or her hand, the change in the UI element may go from the left side being pressed down while the right side tilts up, to the left side tilting up while the right side is pressed down, and then back to the original shape. That is, after the user releases his hand, the UI element is visually presented as flipping over before being restored.
Fig. 46B shows a schematic diagram of an example of an angular time curve 4600B of a restoration of a UI element with a bounce effect according to an embodiment of the present disclosure.
As shown in fig. 46B, at T91, the UI element 5 is pressed to change. For example, the UI element 5 is pressed on the left side, thereby rotating around its reference point. Specifically, the UI element 5 is depressed on the left side while the right side is tilted up. In addition, other UI elements (e.g., UI elements 1-4, 6-13) also change in linkage.
At T92, the user releases his hand. At this point, the UI element 5 has rotated by an angle of -60°. Further, UI elements at distance 1 (e.g., UI elements 3, 4, 7, 8) rotate in linkage, but with a smaller rotation magnitude than UI element 5. In addition, UI elements at distance 2 (e.g., UI elements 2, 6, 9) also rotate in linkage, but with a smaller rotation magnitude than the UI elements at distance 1. Further, UI elements at distance 3 (e.g., UI elements 1, 10-13) also rotate in linkage, but with a smaller rotation magnitude than the UI elements at distance 2.
Between T92 and T93, the UI elements begin to rebound rotationally.
At T93, the UI element 5 rebounds to a rotation angle of 45°. Further, UI elements at distance 1 (e.g., UI elements 3, 4, 7, 8) rebound in linkage, but with a smaller rebound magnitude than UI element 5. In addition, UI elements at distance 2 (e.g., UI elements 2, 6, 9) also rebound in linkage, but with a smaller rebound magnitude than the UI elements at distance 1. Further, UI elements at distance 3 (e.g., UI elements 1, 10-13) also rebound in linkage, but with a smaller rebound magnitude than the UI elements at distance 2.
At T94, all UI elements are restored to the original shape. In other words, the rotation angle of all UI elements is restored to 0 °.
It should be understood that the rotation angles in fig. 46B are merely examples, and the UI elements may be changed in any suitable pattern. Further, although the spring back effect is shown as being sprung back only once in fig. 46B, a spring back effect having a plurality of spring backs may be achieved. The number of rebounds may be any suitable number of rebounds, and the disclosure is not limited herein. In some embodiments, the rebound amplitude of the multiple rebounds may decrease over time.
Fig. 46C shows a schematic diagram of an example of an angular time curve 4600C of a recovery of a UI element with a bounce effect of multiple bounces with decreasing bounce amplitudes according to an embodiment of the disclosure.
As shown in fig. 46C, the UI element returns to the original shape after a plurality of rebounds, wherein the magnitude of rotation (e.g., the angle of rotation) of the rebounds of the UI element having the distance of 0 (e.g., UI element 5) is greater than that of the UI elements having the distance of 1 (e.g., UI elements 3, 4, 7, 8). The magnitude of the rotation of the bounce of a UI element at distance 1 is greater than a UI element at distance 2 (e.g., UI elements 2, 6, 9). The magnitude of the rotation of the bounce of the UI element at distance 2 is greater than the UI elements at distance 3 (e.g., UI elements 1, 10-13).
Further, in certain embodiments, the rebound effect may also be controlled by a predefined curve (e.g., an elastic force curve, a bezier curve, etc.). For example, these UI elements may spring back with animation effect changes that are controlled by a predefined curve.
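As an illustrative sketch, a rebound whose amplitude decreases over time can be modelled as an exponentially damped cosine; the decay rate and oscillation frequency below are assumptions, not values from the source:

// Sketch: angle(t) = A * e^(-lambda*t) * cos(omega*t), a decaying rebound.
public final class ReboundCurve {
    private static final double LAMBDA = 4.0; // decay rate (assumed)
    private static final double OMEGA = 12.0; // oscillation frequency (assumed)

    static double angleAt(double t, double amplitude) {
        return amplitude * Math.exp(-LAMBDA * t) * Math.cos(OMEGA * t);
    }

    public static void main(String[] args) {
        // Elements farther from the press start from a smaller amplitude,
        // so their rebounds are smaller at every instant.
        for (int i = 0; i <= 5; i++) {
            double t = i / 10.0;
            System.out.printf("t=%.1fs distance0=%.1f distance1=%.1f%n",
                    t, angleAt(t, 60.0), angleAt(t, 40.0));
        }
    }
}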
The following describes the animation implementation principle and system architecture of linkage proposed by the embodiments of the present disclosure.
FIG. 47 illustrates an animation implementation conceptual diagram 4700 according to an embodiment of the disclosure. As described above, animation is essentially the real-time display of a UI or UI element based on a refresh rate. Due to the principle of human persistence of vision, the user feels the picture is moving. As shown in fig. 47, the animation changes from the initial state of the animation to the final state of the animation after the animation time elapses. During this transformation, the animation may be controlled by the animation type and animation transformation form. For example, animation types may include displacement animation, rotation animation, zoom animation, and transparency animation, among others. And the animation transformation form can be controlled by a controller such as an interpolator and an estimator. Such a controller may be used to control the speed at which the animation is transformed during the animation time.
In particular, the interpolator is used to set the change logic for the animation property values to transition from an initial state to a final state, thereby controlling the rate at which the animation changes, such that the animation can change at one or more of a uniform rate, an acceleration rate, a deceleration rate, a parabolic rate, etc.
In some embodiments, the electronic device 100 may set the animation property value change logic according to a system interpolator or a custom interpolator (e.g., an elastic force interpolator or a friction force interpolator). When the animation runs, the electronic device 100 determines the animation attribute value according to the above change logic, draws an image frame based on that attribute value, and refreshes the view.
In some embodiments, when the electronic device 100 determines that the animation property value has changed according to the change logic of the interpolator, the invalidate() function is called based on the animation property value to refresh the view, that is, the onDraw() function is called to redraw and display the view.
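A minimal sketch of this refresh pattern, assuming a hypothetical custom view whose rotation is the animated property (the class and property names are illustrative):

import android.content.Context;
import android.graphics.Canvas;
import android.view.View;

// Sketch: when the animation property value changes, invalidate() schedules a
// redraw, and onDraw() renders the frame with the new value.
public class PressedElementView extends View {
    private float rotationDegrees;

    public PressedElementView(Context context) {
        super(context);
    }

    public void setRotationDegrees(float degrees) {
        this.rotationDegrees = degrees; // new animation property value
        invalidate();                   // request that the view be redrawn
    }

    @Override
    protected void onDraw(Canvas canvas) {
        canvas.save();
        canvas.rotate(rotationDegrees, getWidth() / 2f, getHeight() / 2f);
        // Draw the element's content here using the rotated canvas.
        canvas.restore();
    }
}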
In some embodiments, the electronic device 100 customizes the elastic force interpolator. For example, the parameters of the function of the elastic force interpolator include at least a stiffness coefficient (stiffness) and a damping coefficient (damping). As an example, the function code of the elastic force interpolator may be expressed as one of the following: "SpringInterpolator(float stiffness, float damping)", "SpringInterpolator(float stiffness, float damping, float endPos)", "SpringInterpolator(float stiffness, float damping, float endPos, float velocity)", "SpringInterpolator(float stiffness, float damping, float endPos, float velocity, float valueThreshold)".
Wherein the parameter endPos represents the relative displacement, i.e. the difference between the initial position and the rest position of the spring. In some embodiments, endPos may represent a relative displacement of a UI element.
The parameter velocity represents the initial velocity. In some embodiments, velocity may represent the initial velocity of a UI element.
The parameter valueThreshold represents the threshold for judging when the animation stops. When the difference in displacement (or another attribute) between two adjacent frames is smaller than this threshold, the animation stops running. The larger the threshold, the more easily the animation stops and the shorter it runs; conversely, the smaller the threshold, the longer the animation runs. The value of the threshold can be set according to the specific animation attribute. In some embodiments, the elastic interpolator's FloatValueHold parameter defaults to 1/1000, and the threshold takes the value 1 in the other construction methods. In some embodiments, when customizing the threshold, the suggested values shown in Table 1 may be used according to the animation property.
TABLE 1
Animation Properties valueThreshold
ROTATION/ROTATION_X/ROTATION_Y 1/10
ALPHA 1/256
SCALE_X/SCALE_Y 1/500
TRANSLATION_Y/TRANSLATION_X 1
In addition, the threshold value can also directly use the following constants provided by the DynamicAnimation class: MIN_VISIBLE_CHANGE_PIXELS, MIN_VISIBLE_CHANGE_ROTATION_DEGREES, MIN_VISIBLE_CHANGE_ALPHA, MIN_VISIBLE_CHANGE_SCALE.
By way of example, the specific code for an animation class of a custom elasticity interpolator may be expressed as follows:
“PhysicalInterpolatorBase interpolator=new SpringInterpolator(400F,40F,200F,2600F, 1F);
ObjectAnimator animator=ObjectAnimator.ofFloat(listView,“translationY”,0,346);
animator.setDuration(interpolator.getDuration()); // obtain the animation time
animator.setInterpolator(interpolator); // set the custom interpolator on the animation class
animator.start();” // run the animation
In some embodiments, the electronic device 100 customizes the friction force interpolator. As an example, the function code of the friction force interpolator may be expressed as "FlingInterpolator(float initVelocity, float friction)", where initVelocity represents the initial velocity and friction represents the friction.
By way of example, the specific code for an animation class using a friction interpolator may be expressed as follows:
“PhysicalInterpolatorBase interpolator=new FlingInterpolator(600F,0.5F);
ObjectAnimator animator=ObjectAnimator.ofFloat(listView,“translationY”,0, interpolator.getEndOffset());
animator.setDuration(interpolator.getDuration()); // obtain the animation time
animator.setInterpolator(interpolator); // set the custom interpolator on the animation class
animator.start();” // run the animation
In some embodiments, the electronic device 100 may set the animation time (Duration) and the start position by itself; the engine model may also be called to acquire the animation time (Duration) and the end position, and then set to the animation class (animation class).
As an example, the code of the electronic device 100 calling the engine model to acquire the animation time may be expressed as "com.
As an example, a code calling the engine model to acquire the end position of the spring may be expressed as "com.
As an example, the code setting the parameter valuelthreshold may be expressed as "com.
In some embodiments, code that uses the elasticity engine animation classes may be expressed as one of the following: "SpringAnimation(K object, FloatPropertyCompat<K> property, float stiffness, float damping, float startValue, float endValue, float velocity)", "SpringAnimation(K object, FloatPropertyCompat<K> property, float stiffness, float damping, float endValue, float velocity)".
Wherein the parameter object represents the animated object, and property represents the animation class or property object acted on by the interpolator. Referring to Table 1, this parameter can be used to indirectly set the valueThreshold. The parameter is optional in the interpolator version; when the valueThreshold has already been set by other means, it may be omitted, i.e., the construction method without the property parameter is used directly. In the animation-class version it is a required parameter. The DynamicAnimation class already provides the following constants that can be used directly:
"transition _ X, transition _ Y, transition _ Z, SCALE _ X, SCALE _ Y, roll _ X, roll _ Y, X, Y, Z, ALPHA, SCALE _ X, SCALE _ Y", electronic device 100 may also customize the ViewProperty interface.
By way of example, the specific code for animation classes using a spring engine may be expressed as follows:
“SpringAnimation animation = new SpringAnimation(listView, DynamicAnimation.TRANSLATION_Y, 400F, 40F, 0, 1000F);
animation.start();”
In some embodiments, the code for animation using the friction engine may be expressed as: "FlingAnimation(K object, FloatPropertyCompat<K> property, float initVelocity, float friction)".
By way of example, the specific code for using the friction animation class may be expressed as follows:
“FlingAnimation animation = new FlingAnimation(listView, DynamicAnimation.TRANSLATION_Y, 2000F, 0.5F);
animation.start();”
FIG. 48 illustrates a schematic diagram of a system framework 4800 for implementing a "linkage" animation effect capability or function, according to an embodiment of the disclosure. The dynamic capability of the UI framework is implemented based on the overall architecture of the Android or HarmonyOS (Hongmeng) system, comprising the four mainstream layers of logic processing, with the flow of data processing presented to the user from the bottom layer to the top layer. Users mainly use and experience the dynamic functions at the application layer. In an embodiment of the present disclosure, the capability interaction relationship between the desktop and the UI framework is as shown in FIG. 48. Specifically, as shown in fig. 48, the system framework 4800 may include an application layer 4810, an application framework layer 4830, a hardware abstraction layer 4850, and a kernel layer 4870. The application layer 4810 may include a desktop 4812. UI element operations 4814 may be implemented on the desktop 4812. UI element operations 4814 may include, for example, drag operations, press operations, and deep-press operations. The application framework layer 4830 may include system services 4832 and extended services 4834. System services 4832 may include various system services, such as Service 4833. The extended services 4834 may include various extension services, such as SDK 4835. The hardware abstraction layer (HAL) 4850 may include HAL 3.0 4852 and Algo 4854. The kernel layer 4870 may include a driver 4872 and a physical device 4874. The physical device 4874 may provide a raw parameter stream to the driver 4872, and the driver 4872 may provide a functional processing parameter stream to the physical device 4874. As further shown in fig. 48, a UI framework 4820 for implementing a linkage effect 4825 may be implemented between the application layer 4810 and the application framework layer 4830. The UI framework 4820 may include platform capabilities 4822 and system capabilities 4824, both of which may be used to provide the linkage effect 4825. The linkage effect 4825, in turn, may be provided to the UI element operations 4814 of the application layer 4810.
FIG. 49 illustrates a schematic diagram of the relationship between the application side and the UI framework side involved in the "linkage" animation effect capability or function according to an embodiment of the present disclosure. As shown in fig. 49, the application side 4910 may include a desktop 4915, and UI elements on the desktop 4915 may implement drag, press, deep-press, and other operations. The UI framework side 4950 may include a UI framework dynamic effect 4952, which may implement the linkage dynamic effect 4954; the linkage dynamic effect 4954 may be provided via an AAR format 4951, a JAR format 4953, and a system interface 4955, among other ways. The application side 4910 may invoke the "linkage" animation effect capability or function provided by the UI framework side 4950 by integrating 4930 and invoking 4940, among other ways. Through the interaction between the application side 4910 and the UI framework side 4950, embodiments of the present disclosure implement a novel coordinated "linkage" animation effect that associates otherwise independent UI elements (e.g., icons, cards, controls, etc.).
FIG. 50 shows a schematic diagram specifically illustrating three ways of achieving the "linkage" animation effect capability or functionality according to an embodiment of the disclosure. As shown in fig. 50, the relationship 5001 between the AAR format 4951 and the system of the electronic device 100 is: the AAR format 4951 packages the capability in binary form, provides the capability for integration on the application side in the system, and can freely control its version cadence without following the system. The relationship 5003 between the JAR format 4953 and the system of the electronic device 100 is: the JAR format 4953 packages the capability in binary form, provides the capability for all components in the system, and can freely control its version cadence without following the system. The relationship 5005 between the system interface 4955 and the system of the electronic device 100 is: the system interface 4955 is a framework-layer interface in the system version that provides the capability for all components in the system and follows system upgrades. A key point of the disclosure is the realization of the linkage animation capability: integration is done via the AAR and JAR formats, and invocation via the system interface. The applicable scenes are not limited, but the ways in which the capability is exposed differ. That is, the functions of the various methods described in the foregoing of the present disclosure may be implemented by an AAR format file, a JAR format file, and/or a system interface of the electronic device 100. In this manner, the "linkage" animation effect capability or function may be easily and conveniently implemented and provided to an application of the electronic device, such as a desktop.

Claims (17)

1. A graphical interface display method, comprising:
displaying M user interface UI elements on a screen of the electronic equipment, wherein M is a positive integer larger than 1;
detecting a press acting at a first UI element of the M UI elements;
in response to the pressing, causing each of N UI elements on the screen to change with a respective animation effect, N being a positive integer between 1 and M-1, wherein causing the N UI elements to change with respective animation effects comprises:
determining a distance between the first UI element and a second UI element of the N UI elements;
determining an animation effect of the second UI element changing based on the distance and the location of the press in the UI; and
causing the second UI element to change in the animation effect to visually indicate the press.
2. The method of claim 1, wherein determining the distance comprises:
determining a first reference point of the first UI element and a second reference point of the second UI element; and
determining a distance between the first reference point and the second reference point as the distance.
3. The method of claim 1, wherein determining the distance comprises:
determining a first reference point of the first UI element;
determining a target circle that intersects the second UI element and has a smallest radius, from among a plurality of circles having respective radii centered on the first reference point; and
determining a radius of the target circle as the distance.
4. The method of claim 1, wherein determining the distance comprises:
determining a lateral spacing between the first UI element and the second UI element;
determining a vertical spacing between the first UI element and the second UI element; and
determining the distance based on any one of:
at least one of the transverse pitch and the longitudinal pitch, or
At least one of the lateral spacing and the longitudinal spacing, and a direction pointing from a second reference point of the second UI element to a first reference point of the first UI element.
5. The method of any of claims 1 to 4, further comprising:
determining an area of influence of the first UI element based on a size of the first UI element; and
determining a UI element of the M UI elements within the area of influence as the N UI elements.
6. The method of any of claims 1 to 4, further comprising:
determining M-1 UI elements of the M UI elements except the first UI element as the N UI elements.
7. The method of any of claims 1-6, wherein the animation effect comprises:
visually shifting position in a seesaw manner relative to said position of said press, or
visually recessing or protruding relative to said position of said press.
8. The method of any of claims 1-7, wherein determining the animation effect comprises:
determining a first amplitude at which the first UI element changes in response to the press; and
determining a magnitude by which the second UI element changes in response to the press based on any of:
said first amplitude and said distance, or
At least one of a size of the second UI element and a size of the first UI element, the first amplitude, and the distance.
9. The method of claim 8, wherein a first magnitude of change of the first UI element is determined based on at least one of the following associated with the first UI element:
the size of the first UI element is such that,
a position of a first reference point of the first UI element,
a range of magnitudes over which the first UI element can change,
the position of the press,
the duration of the press, and
a predetermined pressing force.
10. The method of any of claims 1-9, wherein causing the second UI element to change comprises:
determining a delay time based on the distance; and
causing the second UI element to change in response to the delay time having elapsed after the press occurred.
11. The method of any of claims 1-10, wherein causing the second UI element to change comprises:
determining a speed at which the second UI element changes in response to the press based on a predefined curve of amplitude over time.
12. The method of claim 11, wherein the predefined curve is a bezier curve or an elastic force curve.
13. The method of any of claims 1 to 12, further comprising:
and restoring the changed second UI element to the second UI element before the change.
14. The method of any of claims 1-13, wherein the method is implemented by at least one of an AAR format file, a JAR format file, and a system interface.
15. An electronic device, comprising: a processor, and a memory storing instructions that, when executed by the processor, cause the electronic device to perform the method of any of claims 1-14.
16. A computer-readable storage medium having stored thereon instructions that, when executed by an electronic device, cause the electronic device to perform the method of any one of claims 1-14.
17. A computer program product, characterized in that it comprises instructions which, when executed by an electronic device, cause the electronic device to carry out the method according to any one of claims 1 to 14.
CN202110426824.5A 2021-04-20 2021-04-20 Graphical interface display method, electronic device, medium, and program product Pending CN115220621A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110426824.5A CN115220621A (en) 2021-04-20 2021-04-20 Graphical interface display method, electronic device, medium, and program product
PCT/CN2022/087751 WO2022222931A1 (en) 2021-04-20 2022-04-19 Graphical interface display method, electronic device, medium, and program product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110426824.5A CN115220621A (en) 2021-04-20 2021-04-20 Graphical interface display method, electronic device, medium, and program product

Publications (1)

Publication Number Publication Date
CN115220621A true CN115220621A (en) 2022-10-21

Family

ID=83604135

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110426824.5A Pending CN115220621A (en) 2021-04-20 2021-04-20 Graphical interface display method, electronic device, medium, and program product

Country Status (2)

Country Link
CN (1) CN115220621A (en)
WO (1) WO2022222931A1 (en)


Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2960763A1 (en) * 2014-06-24 2015-12-30 Google, Inc. Computerized systems and methods for cascading user interface element animations
US10222960B2 (en) * 2016-04-26 2019-03-05 Google Llc Animation of user interface elements
CN112256165B (en) * 2019-12-13 2022-05-10 华为技术有限公司 Application icon display method and electronic equipment
CN115469781B (en) * 2021-04-20 2023-09-01 华为技术有限公司 Graphic interface display method, electronic device, medium and program product
CN115964106B (en) * 2021-04-20 2024-02-13 华为技术有限公司 Graphic interface display method, electronic device, medium and program product

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1689046A (en) * 2003-05-09 2005-10-26 微软公司 System supporting animation of graphical display elements through animation object instances
CN107767431A (en) * 2017-09-28 2018-03-06 北京知道创宇信息技术有限公司 A kind of Web animation methods and computing device
CN110140106A (en) * 2017-11-20 2019-08-16 华为技术有限公司 According to the method and device of background image Dynamically Announce icon

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024098713A1 (en) * 2022-11-11 2024-05-16 中兴通讯股份有限公司 Terminal desktop display method, terminal and computer-readable medium

Also Published As

Publication number Publication date
WO2022222931A1 (en) 2022-10-27

Similar Documents

Publication Publication Date Title
CN113552987B (en) Graphical interface display method, electronic device, medium, and program product
WO2022222830A1 (en) Graphic interface display method, electronic device, medium and program product
EP4224831A1 (en) Image processing method and electronic device
CN114579075B (en) Data processing method and related device
WO2021115194A1 (en) Application icon display method and electronic device
CN113805745B (en) Control method of suspension window and electronic equipment
WO2022247541A1 (en) Method and apparatus for application animation linking
CN113132526A (en) Page drawing method and related device
CN115048012A (en) Data processing method and related device
WO2022222931A1 (en) Graphical interface display method, electronic device, medium, and program product
WO2021254113A1 (en) Control method for three-dimensional interface and terminal
CN116048361B (en) Interaction method, readable storage medium and electronic device
WO2022222831A1 (en) Graphical interface display method, electronic device, medium, and program product
CN114691002B (en) Page sliding processing method and related device
CN111722896B (en) Animation playing method, device, terminal and computer readable storage medium
WO2023130977A1 (en) User interface display method, electronic device, medium and program product
CN114690975B (en) Dynamic effect processing method and related device
WO2024099206A1 (en) Graphical interface processing method and apparatus
WO2022247542A1 (en) Dynamic effect calculating method and apparatus
CN117472482A (en) Interface switching display method and electronic equipment
CN117472485A (en) Interface display method and electronic equipment
CN117667276A (en) Page refreshing method and electronic device
CN117290004A (en) Component preview method and electronic equipment
CN115700444A (en) Cursor display method and electronic equipment
CN115904184A (en) Data processing method and related device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination