CN115237304A - Interface display method, electronic device and computer readable medium


Info

Publication number
CN115237304A
Authority
CN
China
Prior art keywords
display
picture
physical
interface
user operation
Prior art date
Legal status
Pending
Application number
CN202110438190.5A
Other languages
Chinese (zh)
Inventor
高璋
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202110438190.5A
Priority to PCT/CN2022/085864 (WO2022222771A1)
Publication of CN115237304A



Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F3/0485 Scrolling or panning
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces

Abstract

The application relates to an interface display method, an electronic device and a computer-readable medium. In the method, the electronic device acquires user operation data generated by a user operation on a screen of the electronic device; calculates display data corresponding to the user operation data by applying a physical law corresponding to the type of the user operation; and, according to the display data, displays on the electronic device a physical phenomenon that simulates the user operation data following the physical law. With this method, the electronic device can establish in advance a correspondence between the display effect of a display element in a display interface of an application and at least one physical phenomenon. The display effect of the display element is then calculated adaptively from the display style loaded after the electronic device starts the application and from the operation the user applies to the electronic device, so that multiple display effects do not have to be configured for every display interface of an application.

Description

Interface display method, electronic device and computer readable medium
Technical Field
The application relates to graphical interface display technology in the field of electronic devices, and more particularly, to an interface display method, an electronic device, and a computer-readable medium.
Background
In order to adapt the display interface of an application to display areas of different sizes, different layouts need to be configured for the display elements in the display interface of the application; a layout may be the positions of the display elements in the display interface. For example, as shown in fig. 1 (a), the display interface of the application includes six display elements displayed in three rows and two columns, whereas in fig. 1 (b) the six display elements are displayed in two rows and three columns. Meanwhile, an application may also support multiple display styles, and the display elements in its display interface have different display effects under different display styles. For example, for the metallic style and the water drop style supported by the same application, the appearance of a display element may have a metallic luster effect and a water drop effect, respectively. Therefore, for the same application, in order to adapt to display areas of different sizes and to the different display styles of the application, a plurality of display interfaces with different layouts and display effects generally need to be developed in advance, so that the cost of developing and maintaining the display interfaces of the application is high.
Disclosure of Invention
The application aims to provide an interface display method, an electronic device and a computer-readable medium. With this method, the display effect produced by a user operation can be calculated adaptively: the physical formula of the physical law corresponding to the physical phenomenon associated with the user operation applied to the application's display interface is evaluated using the physical coefficients corresponding to the application's display style parameters and the physical data corresponding to the user operation data of the user operation applied by the user.
A first aspect of the present application provides an interface display method applied to an electronic device, including:
acquiring user operation data generated by user operation on a screen of the electronic equipment;
calculating the user operation data by using a physical rule corresponding to the type of the user operation to obtain display data corresponding to the user operation data;
and according to the display data, displaying a physical phenomenon which simulates the user operation data to follow a physical law on the electronic equipment.
That is, in the embodiment of the present application, the user operation data may be the magnitude of the acting force, the direction of the acting force, and the like of the user operation, acquired by the electronic device through the touch sensor; the types of user operations may include pressing, clicking, sliding, and other operations applied by the user to display elements of an application interface on the screen of the electronic device; the physical law may be a physical formula corresponding to operations such as pressing, clicking, and sliding; and the display data may be the display effect of a display element of the application interface calculated from the user operation data through the physical formula, where the display effect may correspond to the physical phenomenon associated with the physical law.
For example, taking a mobile phone as an example, when a user presses a display element of an application interface on the screen of the mobile phone, the mobile phone can obtain the magnitude of the acting force of the pressing operation through the touch sensor, i.e., the user operation data. Using the elastic force formula corresponding to the pressing operation, i.e., the physical law, the mobile phone then calculates from the magnitude of the acting force a display effect in which the display element sinks into the interface, i.e., the display data; the pressing operation applied to the mobile phone is thereby simulated as the physical phenomenon of compression deformation of a spring governed by the elastic force formula.
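The press-to-sink calculation just described can be pictured with the following minimal Java sketch. It is not taken from the patent: the class and method names are assumptions, and the force value is simply passed in as the number the touch sensor would report.

// Illustrative sketch only: Hooke's law, displacement = force / elastic coefficient.
public final class PressEffectModel {
    private final double elasticCoefficient; // N/m, from the display style parameters

    public PressEffectModel(double elasticCoefficient) {
        this.elasticCoefficient = elasticCoefficient;
    }

    // Distance (in meters of the simulated spring) by which the display element sinks.
    public double sinkDistanceFor(double pressForceNewtons) {
        return pressForceNewtons / elasticCoefficient;
    }

    public static void main(String[] args) {
        PressEffectModel metallic = new PressEffectModel(100.0); // assumed "metallic style" coefficient
        System.out.println(metallic.sinkDistanceFor(2.0));       // 0.02 for a 2 N press
    }
}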
In one possible implementation of the first aspect, the user operation includes at least one of pressing, clicking, and sliding.
That is, in the embodiment of the present application, pressing, clicking, and sliding may be user operations applied by the user to the screen of the electronic device.
In one possible implementation of the first aspect, the physical phenomenon corresponding to the pressing includes at least one of pressing, deformation, and rebound;
the physical phenomenon corresponding to the click comprises at least one of jumping and bouncing;
the physical phenomena corresponding to the sliding include at least one of sliding, rolling, rotating in place, and turning pages.
That is, in the embodiment of the present application, the pressing user operation can be simulated as the physical phenomenon of compression deformation of a spring being pressed, corresponding to the elastic force formula; the sliding user operation can be simulated as the physical phenomenon of an object being displaced, corresponding to the friction force formula and the velocity/acceleration formula.
In one possible implementation of the first aspect, the electronic device calculates the display data by:
converting user operation data into physical data corresponding to a physical law, and acquiring a physical coefficient corresponding to a display style parameter of the electronic equipment;
the display data is calculated based on the physical data and the physical coefficients.
That is, in the embodiment of the present application, the physical data of a pressing operation may be, for example, the magnitude of the acting force of the pressing operation, which the electronic device can acquire through the touch sensor; the physical coefficients corresponding to the display style parameters include at least one of the weight, sliding coefficient, and elastic coefficient of the display elements in the interface under a given display style. The electronic device evaluates the elastic force formula corresponding to the pressing operation, for example force = elastic coefficient × distance, to calculate the distance of the sinking display effect shown by the display element.
In one possible implementation of the first aspect, the physical data includes at least one of the magnitude of the acting force and the displacement of the acting force during the acting time, where the forces of the pressing, clicking, and sliding operations applied by the user to the screen of the electronic device correspond to the physical data.
In one possible implementation of the first aspect, the physical coefficient includes at least one of weight, friction coefficient, and elastic coefficient; when the display style parameters differ, the weight, friction coefficient, or elastic coefficient of the object operated by the user differs accordingly.
That is, in the embodiment of the present application, in the case where the display style is "metallic style", the elastic coefficient contained in the display style parameters may be 100 N/m; in the case where the display style is "water drop style", the elastic coefficient contained in the display style parameters may be 200 N/m.
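Purely as an illustration of how such per-style coefficients might be organized (this is not the patent's data structure), a small lookup table could pair each display style with its physical coefficients. The 100 N/m and 200 N/m values follow the text; the weight and sliding-coefficient values for the "metallic style" follow an example given later in the description, and those for the "water drop style" are placeholders.

import java.util.Map;

// Illustrative container for the physical coefficients attached to a display style.
public final class DisplayStyleParameters {
    final double weight;              // "specific gravity" of a display element
    final double slidingCoefficient;  // friction-like coefficient
    final double elasticCoefficient;  // N/m

    DisplayStyleParameters(double weight, double slidingCoefficient, double elasticCoefficient) {
        this.weight = weight;
        this.slidingCoefficient = slidingCoefficient;
        this.elasticCoefficient = elasticCoefficient;
    }

    // "metallic style" values follow the description; "water drop style" weight/friction are placeholders.
    static final Map<String, DisplayStyleParameters> STYLES = Map.of(
            "metallic style", new DisplayStyleParameters(2.0, 0.15, 100.0),
            "water drop style", new DisplayStyleParameters(1.0, 0.05, 200.0));
}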
In one possible implementation of the first aspect described above, the physical laws include at least one of a friction equation, an acceleration equation, and a spring force equation.
A second aspect of the present application provides an electronic device, comprising:
a memory storing instructions;
a processor coupled to the memory, wherein the instructions stored by the memory, when executed by the processor, cause the electronic device to:
acquiring user operation data generated by user operation on a screen of the electronic equipment;
calculating the user operation data by using a physical rule corresponding to the type of the user operation to obtain display data corresponding to the user operation data;
and according to the display data, displaying a physical phenomenon generated by simulating the user operation data to follow a physical law on the electronic equipment.
In one possible implementation of the second aspect, the user operation includes at least one of pressing, clicking, and sliding.
In one possible implementation of the second aspect, the physical phenomenon corresponding to the pressing includes at least one of pressing, deformation, and rebound;
the physical phenomenon corresponding to the click comprises at least one of jumping and bouncing;
the physical phenomena corresponding to the sliding include at least one of sliding, rolling, rotating in place, and turning pages.
In one possible implementation of the second aspect, the electronic device calculates the display data by:
converting user operation data into physical data corresponding to a physical law, and acquiring a physical coefficient corresponding to a display style parameter of the electronic equipment;
the display data is calculated based on the physical data and the physical coefficients.
In one possible implementation of the second aspect, the physical data includes at least one of the magnitude of the acting force and the displacement of the acting force over the duration of the action, where the forces of the pressing, clicking, and sliding operations applied by the user to the screen of the electronic device correspond to the physical data.
In one possible implementation of the second aspect, the physical coefficient includes at least one of weight, friction coefficient, and elastic coefficient; when the display style parameters differ, the weight, friction coefficient, or elastic coefficient of the object operated by the user differs accordingly.
In one possible implementation of the second aspect described above, the physical laws include at least one of a friction equation, an acceleration equation, and a spring force equation.
A third aspect of the present application provides a computer-readable medium, wherein the computer-readable medium has stored thereon instructions, which, when executed on an electronic device, cause the electronic device to execute the interface display method of the first aspect.
In a fourth aspect, an embodiment of the present application provides a chip system applied to an electronic device including a touch screen. The chip system includes one or more interface circuits and one or more processors, which are interconnected by lines. The interface circuit is configured to receive a signal from a memory of the electronic device and to transmit the signal to the processor, the signal including computer instructions stored in the memory. When the processor executes the computer instructions, the electronic device performs the method of any of the aspects described above and any possible implementation thereof.
In a fifth aspect, the present application provides a computer program product, which when run on a computer causes the computer to perform the method of any of the above aspects and any possible implementation thereof.
Drawings
Fig. 1 (a) and 1 (b) illustrate display interfaces of an application displayed on screens of electronic devices having different screen sizes in the related art;
Fig. 2 (a) and 2 (b) illustrate display interfaces of an application displayed on screens of electronic devices with different screen sizes according to an embodiment of the present application;
Fig. 3 (a), 3 (b) and 3 (c) illustrate a process of setting a default display interface for a picture detail interface of a gallery application according to an embodiment of the present application;
Figs. 4 (a) and 4 (b) illustrate a process of a user performing a zoom-in region operation on a display region of a picture detail interface of a gallery application according to an embodiment of the present application;
FIG. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 6 shows a block diagram of a software architecture of an electronic device according to an embodiment of the present application;
FIG. 7 illustrates a process of setting a default display interface for a picture list interface of a gallery application, according to an embodiment of the application;
FIGS. 8 (a) and 8 (b) illustrate display interfaces of a picture summary interface of a gallery application according to embodiments of the present application;
FIGS. 9 (a) and 9 (b) illustrate a flow chart of an electronic device displaying a display interface of a gallery application and generating display effects in response to user operations, according to an embodiment of the present application;
Figs. 10 (a), 10 (b) and 10 (c) show a process of a user performing a pressing user operation in a picture list interface of a gallery application according to an embodiment of the application;
Figs. 11 (a), 11 (b) and 11 (c) illustrate a process in which a user performs a switching operation of a display style on a gallery application according to an embodiment of the present application;
Figs. 12 (a) and 12 (b) illustrate a process in which a user performs a sliding user operation in a picture list interface of a gallery application according to an embodiment of the application;
Figs. 13 (a) and 13 (b) illustrate a user performing a downward-slide user operation in a picture list interface of a gallery application according to embodiments of the application;
Figs. 14 (a), 14 (b) and 14 (c) illustrate a process of an expand user operation performed by a user on a picture list interface of a gallery application according to an embodiment of the present application;
Figs. 15 (a) and 15 (b) illustrate a process of switching from a picture summary interface to a picture detail interface of a gallery application according to an embodiment of the present application;
Figs. 16 (a) and 16 (b) illustrate another process of switching from a picture summary interface to a picture detail interface of a gallery application according to an embodiment of the present application.
Detailed Description
Embodiments of the present application include, but are not limited to, an interface display method, a computer readable medium, and an electronic device. To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
In order to solve the problem that a plurality of display interfaces with different layouts and different display effects need to be developed for the same application, the application provides an interface display method. In an embodiment of the application, after the application is installed, the electronic device generates a corresponding default display interface for each display interface of the application in advance, where in the default display interface, one center display element is selected from multiple display elements included in the default display interface, and positions of other display elements in the default display interface are configured with respect to a position of the center display element, that is, each display element in the default display interface has a relative position.
After the electronic device starts the application, the electronic device determines the absolute position of the central display element and the absolute positions of the other display elements relative to the central display element according to the size of the display area of the display interface of the application. Meanwhile, when the user adjusts the size of the display area of the display interface or adjusts the size of the display elements in the display interface, the electronic device can dynamically adjust the positions of the display elements.
In addition, for various user operations performed by the user on the application display interface, the electronic device can also generate a display effect responding to the user operation by simulating physical laws of the objective world. For example, when a user presses a display element of an application interface, such as an icon, the icon may deform, and deform to different degrees, according to the pressing force of the user, and the relationship between the deformation and the pressing force may be determined by a physical formula similar to an elastic force formula. For another example, when the user slides an icon of the application interface, the icon may scroll different distances according to the sliding force of the user's finger, and the relationship between the scrolling distance and the finger force may be determined by a physical formula similar to a friction force formula. In addition, the same user operation on the icon may produce different display effects, that is, different physical phenomena, under different display styles of the application. Specific embodiments will be described in detail below.
It is understood that, in the embodiment of the present application, the display element in the application display interface may be a key, a text box, a picture, a drop-down box, or the like in the display interface of the application.
Furthermore, it is understood that in the embodiments of the present application, a physical law refers to a physical law such as a friction force formula, a velocity/acceleration formula, or an elastic force formula, or to a data processing model built based on such a physical law.
In addition, it is understood that, in the embodiment of the present application, the physical phenomenon of the display element displayed on the electronic device refers to a display effect that the display element shows in response to a user operation, such as a display effect of pressing, deforming, rebounding, jumping, sliding, and scrolling.
The following description takes the electronic device being a mobile phone 100 and the application being a gallery application 101 as an example: a default display interface is set for the picture detail interface 1012 of the gallery application 101, in which each display element of the picture detail interface 1012 has a default position; when the mobile phone 100 opens and displays the picture detail interface 1012, the mobile phone 100 dynamically adjusts the position of each display element in the picture detail interface 1012 according to the display area of the picture detail interface 1012 on the screen of the mobile phone 100.
Fig. 2 (a) to 2 (b) show a process in which the cell phone 100 opens and displays the picture detail interface 101, and as shown in fig. 2 (a), after responding to an operation of clicking an icon of the gallery application 101 on the desktop by the user, the cell phone 100 starts the gallery application 101. As shown in fig. 2 (b), the mobile phone 100 displays a picture list interface 1011 of the gallery application 101 in the screen, and the picture list interface 1011 may include thumbnails of the pictures 10111 to 10116. If the user of the mobile phone 100 wants to browse the details of the picture 10112, after the user clicks the thumbnail of the picture 10112, the mobile phone 100 may jump to and display the picture detail interface 1012 corresponding to the picture 10112.
Before the mobile phone 100 opens and displays the picture detail interface 1012, as shown in fig. 3 (a), the mobile phone 100 may generate a default display interface for the picture detail interface 1012, where the default display interface includes: picture 10121, picture details 10122, a "details" button 10123, and a "next" button 10124; in the default display interface, the cell phone 100 can configure the picture 10121 as a center display element and configure the position of the picture 10121, the picture details 10122, the "details" button 10123, and the "next" button 10124 in order from left to right based on the position of the picture 10121.
After the mobile phone 100 opens and displays the picture detail interface 1012, when the display area of the picture detail interface 1012 is the whole screen of the mobile phone 100, the mobile phone 100 dynamically adjusts the positions of the display elements in the picture detail interface 1012 according to the size of the screen, so that the positions of the display elements in the picture detail interface 1012 differ from those in the default display interface. For example, as shown in fig. 3 (b), in the picture detail interface 1012 of the gallery application 101, the picture 10121, the picture details 10122, the "details" button 10123, and the "next" button 10124 may be arranged vertically in order.
It is understood that when the display area of the picture detail interface 1012 is a partial area of the screen of the mobile phone 100, as shown in fig. 3 (c), the mobile phone 100 opens and displays the picture detail interface 1012 of the gallery application 101 in a floating window on the screen, and the positions of the display elements in the picture detail interface 1012 may differ from those in the picture detail interface 1012 shown in fig. 3 (b); for example, in the picture detail interface 1012 of fig. 3 (c), the picture details 10122, the "details" button 10123, the "next" button 10124, and the picture 10121 may be arranged left and right.
It will be appreciated that in embodiments of the present application, the type and number of display elements in the display interface for an application are exemplary, that is, any other number and type of display elements may be included. For example, in the default display interfaces of fig. 3 (a) to 3 (c), in addition to the display elements 10121 to 10124, in some embodiments, other numbers and types of display elements may be included, which is not limited thereto.
Furthermore, after the mobile phone 100 responds to an operation performed by the user to reduce or enlarge the display area of the picture detail interface 1012, or an operation performed by the user to increase or decrease the size of a display element, the mobile phone 100 may also dynamically adjust the position of each display element in the picture detail interface 1012 of the gallery application 101. For example, as shown in fig. 4 (a), in response to an operation performed by the user to enlarge the display area of the picture detail interface 1012, the mobile phone 100 rearranges the picture 10121, the picture details 10122, the "details" button 10123, and the "next" button 10124 in the picture detail interface 1012 vertically, as shown in fig. 4 (b).
In addition, as described above, in some embodiments, after the gallery application 101 is installed, the mobile phone 100 further generates a display style model, that is, a data processing model, for each display interface of the gallery application 101. The display style model corresponds to the physical phenomenon that at least one physical law produces for a user operation applied to the display interface, and it can dynamically calculate the display effect of each display interface of the gallery application 101 from the physical coefficients corresponding to the default display style parameters loaded after the gallery application 101 is opened, or to the display style parameters selected by the user, and from the physical data corresponding to the user operation data of the user operation applied by the user.
For example, the mobile phone 100 may configure a display style model for the display effect of a pressing operation on the picture 10121 of the picture detail interface 1012, so that the picture 10121 can display press and rebound effects in response to a user pressing the picture 10121. The press-rebound display style model may simulate the principle of a spring using the elastic force formula, e.g., force = elastic coefficient × distance. The acting force here may be the force of the user's pressing operation acquired by the mobile phone 100 through the touch sensor, for example force = 2 N, where N is the mechanical unit newton; the elastic coefficient may be the physical coefficient corresponding to the display style parameters of the display style of the gallery application 101, for example, in the case where the display style is "metallic style", the display style parameters may include an elastic coefficient of 100 N/m, where N/m is the unit newton per meter. Through this display style model, the mobile phone 100 can calculate that, in response to the pressing operation, the picture 10121 displays a pressing effect in which it is displaced 0.02 m into the screen; after the user's pressing operation ends, the picture 10121 may spring back to its original position.
By the method, after the electronic equipment installs the application, the electronic equipment can generate a default display interface for each display interface of the application in advance, and after the electronic equipment starts the application, the electronic equipment can dynamically adjust the layout of the display elements in the display interface according to the size of the display area of each display interface. In addition, the electronic device may also configure a display style model for the display interface of the application in advance, and after the electronic device starts the application, the electronic device may dynamically calculate the display effect of the display element according to a user operation applied by the user and the display style parameters of the loaded application or the display style parameters selected by the user. Therefore, the situation that a plurality of layouts and display effects need to be configured on each display interface of one application is avoided.
The electronic device 100 in embodiments of the present application may be a variety of electronic devices, for example, the electronic device 100 includes, but is not limited to, a laptop computer, a desktop computer, a tablet computer, a cell phone, a server, a wearable device, a head-mounted display, a mobile email device, a portable game console, a portable music player, a reader device, or other electronic device capable of accessing a network. In some embodiments, embodiments of the present application may also be applied to wearable devices worn by a user. For example, a smart watch, bracelet, piece of jewelry (e.g., a device made into a decorative item such as an earring, bracelet, or the like), or glasses, or the like, or as part of a watch, bracelet, piece of jewelry, or glasses, or the like. The following description will be given taking an example in which the electronic device 100 is a cellular phone 100.
It is to be understood that the display area of the application may be a screen of the electronic device 100, and may also be a local area in the screen of the electronic device 100, and the display style parameters may include: the specific gravity of the display element, the sliding coefficient, etc.
It can be understood that various applications, for example, a gallery application, a video conference application, an instant chat application, a video playing application, a navigation application, and the like, may be installed on the mobile phone 100, and these applications may adopt the technical solution of the present application to layout the display interface of the application.
Fig. 5 shows a schematic structural diagram of a handset 100 according to an embodiment of the application.
The mobile phone 100 may include a processor 110, a screen 111, an internal memory 120, an interface module 130, a power module 140, a wireless communication module 150, a mobile communication module 160, an audio module 170, a camera 180, and a touch sensor 190.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the mobile phone 100. In other embodiments of the present application, the handset 100 may include more or fewer components than shown, or some components may be combined, some components may be separated, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processor (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), among others. The different processing units may be separate devices or may be integrated into one or more processors.
A memory may also be provided in the processor 110 for storing instructions and data. In an embodiment of the present application, the processor 110 may perform an interface display method for an application.
The screen 111 is used to display images, videos, and the like. In the embodiment of the present application, the mobile phone 100 may dynamically adjust the position of each display element in the display interface of the application according to the size of the display area of the display interface of the application in the screen 111.
Internal memory 120 may be used to store computer-executable program code, which includes instructions. The internal memory 120 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The data storage area may store data (e.g., audio data, a phonebook, etc.) created during use of the handset 100, and the like. In an embodiment of the present application, the internal memory 120 may store therein display style parameters of the application, and layout rules of display elements and display style models of the display elements in a display interface of the application. The layout rules herein are used to configure the default display interface for the application.
The interface module 130 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the mobile phone 100. The external memory card communicates with the processor 110 through the interface module 130 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The power module 140 receives battery input and supplies power to the processor 110, the internal memory 120, the display 111, and the like.
The wireless communication module 150 may provide a solution for wireless communication applied to the mobile phone 100, including a Wireless Local Area Network (WLAN) (e.g., a wireless fidelity (Wi-Fi) network), bluetooth (BT), a Global Navigation Satellite System (GNSS), frequency Modulation (FM), near Field Communication (NFC), infrared (IR), and so on.
The mobile communication module 160 may provide a solution for wireless communication including 2G/3G/4G/5G, etc. applied to the handset 100.
The mobile phone 100 implements a display function through the GPU, the display screen 111, and the application processor.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The mobile phone 100 can implement a shooting function through the camera 180 and the application processor.
The touch sensor 190 is also referred to as a "touch device". The touch sensor 190 may be disposed on the screen 111, and the touch sensor 190 and the screen 111 form a touch screen, which is also called a "touch screen". In the embodiment of the present application, the touch sensor 190 is used for recognizing a user operation performed by a user on the screen 111 of the mobile phone 100 and acquiring physical data of the user operation, for example, the physical data is an acting force, an acting direction, and the like included in the user operation.
Fig. 6 is a block diagram of the software configuration of the mobile phone 100 according to an embodiment of the present application.
As shown in fig. 6, the mobile phone 100 may be divided into an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer.
Wherein the application layer may include a series of application packages.
As shown in fig. 6, the application package may include applications such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc. In an embodiment of the present application, the application package may include a gallery application 101 or the like.
The application framework layer may include a view system, a gesture recognition system, and the like.
In an embodiment of the present application, the gesture recognition system is used to recognize a user operation performed on the gallery application 101 by the user on the screen of the cell phone 100.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build a display interface for an application. The display interface may be composed of one or more display elements, where a display element refers to an element in the display interface of an application in the screen of the electronic device. For example, the display elements may include buttons, text, pictures, pop-ups, menus, title bars, lists, or search boxes, among others. The display interface of the application may include at least one display element. In the embodiment of the present application, the view system may be configured to implement a layout scheme of a display interface of an application of the present application, for example, when the application is started, the view system may dynamically adjust a position of a display element in the display interface based on a size of a display area of the display interface of the application in the screen 111 of the mobile phone 100; meanwhile, the view system can also configure a display style model for the display interface of the application, and when the application is started, the view system uses the display style parameters of the application to calculate the display effect of the display elements in the display interface through the display style model.
The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application layer and the application framework layer as binary files. The virtual machine is used for performing the functions of object life cycle management, stack management, thread management, safety and exception management, garbage collection and the like.
The system library may include a plurality of functional modules, for example: a surface manager (Surface Manager), media libraries (Media Libraries), three-dimensional graphics processing libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide a fusion of the 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media library may support a variety of audio and video encoding formats, such as MPEG-4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
In an embodiment of the present application, an application may include multiple display interfaces; for example, the gallery application 101 may include a picture list interface 1011 (also referred to as a picture summary interface) and a picture detail interface 1012. After the gallery application 101 is installed on the mobile phone 100, the mobile phone 100 configures a default display interface for each display interface of the gallery application 101 and a display style model for the display effect of each display interface; the following description takes the picture list interface 1011 as an example. The scheme described below may be implemented by the processor 110 of the mobile phone 100 invoking the associated programs.
Specifically, fig. 7 shows the process of setting a default display interface for the picture list interface 1011. As shown in fig. 7, the process includes:
S701: Display elements included in the picture list interface 1011 are acquired.
For example, as shown in fig. 8 (a), the picture list interface 1011 of the gallery application 101 on the mobile phone 100 contains six display elements, namely the pictures 10111 to 10116.
S702: a central display element is determined.
The mobile phone 100 takes the picture 10111 as the center display element of the picture list interface 1011, and the mobile phone 100 can configure the relative position of the picture 10111 in the picture list interface 1011 as follows: left: 1/6 of the display area width; right: 5/6 of the display area width; top: 3/10 of the display area height; bottom: 7/10 of the display area height. The relative position here is the position of the center of the picture 10111 in the display area of the picture list interface 1011. It is understood that the display area may be a partial area of the screen of the mobile phone 100, or the entire screen of the mobile phone 100.
S703: the positions of the other display elements relative to the central display element are configured.
For the pictures 10112 to 10116, taking the picture 10112 as an example, the mobile phone 100 can configure the relative position of the picture 10112 as "sequence number: 2; spacing: 30", where "sequence number: 2" indicates that the picture 10112 may be the second display element in the picture list interface 1011, adjacent to the picture 10111 serving as the center display element, and "spacing: 30" indicates that the picture 10112 is separated from the picture 10111 by 30 pixels.
In addition to the above relative position, the mobile phone 100 can also configure the size of the display element, for example: 160 × 250.
It is understood that by setting one center display element in the picture list interface 1011 first and then configuring the relative positions of the other display elements with respect to the center display element, the display elements in the picture list interface 1011 can present a layout adapted to a display area of different sizes without configuring a plurality of layouts for the picture list interface 1011.
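The relative-position bookkeeping of S702 and S703 could be recorded roughly as in the sketch below. The class and field names are assumptions made for illustration; only the numeric values come from the example above.

// Illustrative description of a default display interface layout (S702-S703).
public final class DefaultLayout {

    // Center element located by fractions of the display area, e.g. picture 10111.
    static final class CenterElement {
        final double leftFraction, rightFraction, topFraction, bottomFraction;
        CenterElement(double left, double right, double top, double bottom) {
            this.leftFraction = left; this.rightFraction = right;
            this.topFraction = top; this.bottomFraction = bottom;
        }
    }

    // Any other element positioned relative to its neighbour, e.g. picture 10112.
    static final class RelativeElement {
        final int sequenceNumber;  // 2 = the element adjacent to the center element
        final int spacingPixels;   // 30 = gap to the neighbouring element
        final int width, height;   // e.g. 160 x 250
        RelativeElement(int sequenceNumber, int spacingPixels, int width, int height) {
            this.sequenceNumber = sequenceNumber;
            this.spacingPixels = spacingPixels;
            this.width = width; this.height = height;
        }
    }

    static final CenterElement PICTURE_10111 = new CenterElement(1.0 / 6, 5.0 / 6, 3.0 / 10, 7.0 / 10);
    static final RelativeElement PICTURE_10112 = new RelativeElement(2, 30, 160, 250);
}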
S704: and configuring a display style model of the display effect of the display elements in the display interface.
In the embodiment of the present application, the display effect of a display element may be the display effect the element shows in response to a user operation; for example, the picture 10112 may display a press-and-rebound display effect in response to a press-and-hold user operation, that is, the picture 10112 may behave like a spring. The mobile phone 100 may configure a display style model for the press-and-rebound display effect of the picture 10112, for example: force = elastic coefficient × distance, where the force may be the force of the user's pressing operation acquired by the mobile phone 100 through the touch sensor, and the elastic coefficient may be the physical coefficient corresponding to the display style parameters of the display style of the gallery application 101. The mobile phone 100 can calculate the pressing display effect of the picture 10112 through this display style model.
In embodiments of the present application, the display effect of a display element may also be a display effect of one display element relative to another display element. For example, if the picture 10112 is slid toward the picture 10113 in response to a press-and-slide user operation, the picture 10112 can slide to the position of the picture 10113 in the picture list interface 1011 with a sliding display effect, and at the same time the picture 10113 can slide to the position of the picture 10112 in the opposite direction with the same sliding display effect. The display style model for the display effect corresponding to the sliding user operation may be, for example: acceleration = (acting force - specific gravity × sliding coefficient) / specific gravity; velocity = acceleration × sliding duration, where the acting force may be the force with which the user slides the picture 10112, and the specific gravity and the sliding coefficient may be the physical coefficients corresponding to the display style parameters of the display style of the gallery application 101. The mobile phone 100 calculates the sliding velocity of the picture 10112 from the acceleration and displays the picture 10112 sliding at that velocity.
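As a minimal sketch (illustrative names only), the sliding model above amounts to the following calculation:

// Illustrative sketch of the sliding display style model:
// acceleration = (force - weight * slidingCoefficient) / weight
// velocity     = acceleration * slideDuration
public final class SlideEffectModel {
    private final double weight;             // "specific gravity" from the display style parameters
    private final double slidingCoefficient; // friction-like coefficient from the display style parameters

    public SlideEffectModel(double weight, double slidingCoefficient) {
        this.weight = weight;
        this.slidingCoefficient = slidingCoefficient;
    }

    // Velocity at which the element is made to slide, given the finger force and slide duration.
    public double slideVelocity(double forceNewtons, double slideDurationSeconds) {
        double acceleration = (forceNewtons - weight * slidingCoefficient) / weight;
        return acceleration * slideDurationSeconds;
    }

    public static void main(String[] args) {
        SlideEffectModel metallic = new SlideEffectModel(2.0, 0.15); // assumed "metallic style" values
        System.out.println(metallic.slideVelocity(1.0, 0.2));        // illustrative force and duration
    }
}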
After the mobile phone 100 configures the display style models of the relative positions and the display effects for the display elements of the picture list interface 1011 of the gallery application 101 through the above-described S701 to S704, the mobile phone 100 may start the gallery application 101 in response to the start operation of the user, and display the display interface of the gallery application 101 in the display area of the screen. The following describes, by using fig. 9 (a) and 9 (b), an example in which the mobile phone 100 displays the display interface of the gallery application 101 in the display area of the screen, and responds to a user performing a user operation on a display element in the display interface of the gallery application 101 so that the display element generates a display effect, so as to describe in detail the interface display method provided by the present application. Wherein the schemes shown in fig. 9 (a) and 9 (b) can be implemented by the processor 110 of the handset 100 invoking the relevant programs, e.g. the view system. As shown in fig. 9 (a), the interface display method of the gallery application 101 in the embodiment of the present application includes:
S901: An operation of a user to open an application is detected.
The mobile phone 100 may respond to the user clicking the "gallery" icon on the desktop of the mobile phone 100, i.e., in the user interface (UI) of the mobile phone 100; after receiving the instruction of the user's click operation, the mobile phone 100 starts the gallery application 101.
S902: the size of a display area of a first display interface of an application is obtained.
The display area here may be the screen of the mobile phone 100, or a partial area of the screen of the mobile phone 100. Taking the display area as the screen of the mobile phone 100 as an example, the size of the display area may be the resolution of the display area, and the resolution may be measured in pixels. The value of the resolution may be the numbers of horizontal and vertical pixels in the display area. For example, for a display area with a resolution of 600 × 800, the display length in the horizontal direction is 600 pixels and the display length in the vertical direction is 800 pixels. Obtaining the size of the display area makes it convenient to determine the positions of the display elements in the subsequent steps.
S903: and adjusting the display elements of the first display interface according to the size of the display area and displaying the first display interface.
For example, the first display interface may be a picture list interface 1011, and after the gallery application 101 is started by the mobile phone 100, the gallery application 101 may open and display the picture list interface 1011. The mobile phone 100 can adjust the layout of the center display element in the picture list interface 1011 of the gallery application 101 according to the size of the display area in S902.
Here, the center display element preset in the picture list interface 1011 may be the picture 10111, and the layout of the picture 10111 may be as shown in fig. 8 (b). In the case where the screen of the mobile phone 100 has a resolution of 600 × 800, the position of the picture 10111 may be: left: 100; right: 500; top: 250; bottom: 550, that is, the distances between the center of the picture 10111 and the left, right, upper, and lower boundaries of the screen of the mobile phone 100 are 100 pixels, 500 pixels, 250 pixels, and 550 pixels, respectively. It is to be understood that, in the case where the resolution of the screen of the mobile phone 100 takes other values, the position of the center display element preset in the picture list interface 1011 may differ from the above position.
After the mobile phone 100 has adjusted the layout of the picture 10111, the mobile phone 100 may also adjust the size of the picture 10111, for example to 160 × 250, so that the center of the picture 10111 in the picture list interface 1011 is 100 pixels, 500 pixels, 250 pixels, and 550 pixels from the left, right, upper, and lower boundaries of the display area of the picture list interface 1011, respectively, and the picture occupies an area of 160 × 250.
After the mobile phone 100 has adjusted the layout of the center display element in the picture list interface 1011 of the gallery application 101, the mobile phone 100 adjusts the other display elements in the picture list interface 1011 according to the center display element. Here, the picture 10111 and the picture 10112 are taken as examples. The mobile phone 100 may determine the position of the picture 10112 in the picture list interface 1011 according to its "sequence number: 2; spacing: 30" attribute.
For example, if the distances between the center of the picture 10111 and the left, right, upper, and lower boundaries are 100 pixels, 500 pixels, 250 pixels, and 550 pixels, respectively, then for the picture 10112 adjacent to the picture 10111, the distances from the center of the picture 10112 to the left, upper, and lower boundaries may be 300 pixels, 250 pixels, and 550 pixels, respectively.
It is to be understood that, for the pictures 10113 to 10116 in the picture list interface 1011, the mobile phone 100 can sequentially determine the positions of the pictures 10113 to 10116 in this order.
By first determining the position of the center display element, the picture 10111, and then determining the positions of the pictures 10112 to 10116 relative to the picture 10111, the pictures 10111 to 10116 in the picture list interface 1011 are laid out in two rows and three columns in the 600 × 800 display area.
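As a rough illustration of the scaling step in S903 (not the patent's actual code), the center element's fractional position can be turned into pixel distances for a given display area as sketched below. Note that the pixel values quoted in the example above do not all follow from a plain scaling of the stated fractions, so additional adjustments are presumably applied in practice.

// Illustrative helper that scales fractional positions by the display-area resolution (S903).
public final class LayoutResolver {

    static String resolveCenter(int areaWidth, int areaHeight,
                                double leftFrac, double rightFrac, double topFrac, double bottomFrac) {
        long left = Math.round(areaWidth * leftFrac);
        long right = Math.round(areaWidth * rightFrac);
        long top = Math.round(areaHeight * topFrac);
        long bottom = Math.round(areaHeight * bottomFrac);
        return "left=" + left + " right=" + right + " top=" + top + " bottom=" + bottom;
    }

    public static void main(String[] args) {
        // 600 x 800 display area and the fractions given for picture 10111 (1/6, 5/6, 3/10, 7/10).
        System.out.println(resolveCenter(600, 800, 1.0 / 6, 5.0 / 6, 3.0 / 10, 7.0 / 10));
        // Prints left=100 right=500 top=240 bottom=560; the example above lists 250 and 550
        // for the vertical distances.
    }
}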
S904: loading the display style parameters of the application.
In the embodiment of the application, when the mobile phone 100 starts the gallery application 101, the mobile phone 100 may further obtain a display style corresponding to the gallery application 101. After the mobile phone 100 determines the positions of the display elements in the picture list interface 1011 of the gallery application 101, the mobile phone 100 can arrange the display elements in the picture list interface 1011 using the display style parameters corresponding to the display style of the gallery application 101.
For example, in a case where the display style of the gallery application 101 is "metallic style", the mobile phone 100 may acquire the display style parameters corresponding to the "metallic style", and these display style parameters may be used to configure the display effect of the display elements of the gallery application 101. For example, the display style parameters may include: a specific gravity of 2, a sliding coefficient of 0.15, an elastic coefficient of 100 N/m, and the like. The display style parameters here may be stored in the internal memory 120 of the handset 100.
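As an illustration, a display style's parameters could be grouped into a small value type like the sketch below; the class and field names are assumptions, while the "metallic style" values are the ones given above.

```kotlin
// Hypothetical container for the display style parameters described above.
data class DisplayStyleParams(
    val specificGravity: Float,    // mass-like weight assigned to a display element
    val slidingCoefficient: Float, // friction-like coefficient used for sliding effects
    val elasticCoefficient: Float  // spring constant (N/m) used for pressing effects
)

// "Metallic style" example values from the description.
val metallicStyle = DisplayStyleParams(
    specificGravity = 2f,
    slidingCoefficient = 0.15f,
    elasticCoefficient = 100f
)
```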
S905: displaying the adjusted first display interface.
After the mobile phone 100 has adjusted the positions of the display elements of the picture list interface 1011 of the gallery application 101 and arranged the display elements on the picture list interface 1011 with the display style parameters through S901 to S904 above, the picture list interface 1011 is displayed on the screen 111.
S906: whether user operation on the first display interface exists is detected.
The mobile phone 100 may detect whether the user has performed a user operation on the picture list interface 1011. If so, the mobile phone 100 performs S907, changing the position of a display element or causing the display element to show at least one display effect in response to the user operation on the first display interface; if not, the mobile phone 100 returns to S906 and keeps detecting.
S907: the position of the display element or the display effect of the display element is changed in response to a user operation.
In an embodiment of the present application, the user operation on the picture list interface 1011 may be a user operation performed by the user on a display element of the picture list interface 1011. The mobile phone 100 may change the position of one or more display elements in the picture list interface 1011 in response to the user operation, and the display element may also show at least one display effect while its position changes.
For example, as shown in fig. 10 (a) to 10 (c) and fig. 12 (a) to 12 (b), the mobile phone 100 may exchange the positions of the picture 10112 and the picture 10113 in the picture list interface 1011 in response to a user operation in which the user presses and slides the picture 10112 to the position of the picture 10113. In this process, when the mobile phone 100 responds to the user operation of pressing the picture 10112, the picture 10112 calculates and displays a display effect corresponding to the pressing operation according to the display style parameters loaded in step S904 and the display style model corresponding to the pressing operation. When the user slides the picture 10112, the picture 10112 generates a display effect in response to the sliding operation according to the display style parameters loaded in step S904, and when the picture 10112 and the picture 10113 exchange positions, the picture 10112 and the picture 10113 generate display effects at the exchanged positions according to the same parameters. The process of changing the position of a display element or displaying its display effect may be realized by S907a to S907d below.
S908: loading the display style of the application in response to an operation of switching the display style of the application.
In the embodiment of the present application, after the mobile phone 100 has finished adjusting the positions of the display elements of the picture list interface 1011 of the gallery application 101 and arranging the display elements with the display style parameters through S901 to S904 above, and the picture list interface 1011 is displayed on the screen 111, the mobile phone 100 can switch the display style of the gallery application 101 in response to a display style switching operation performed by the user. For example, as shown in fig. 11 (a) to 11 (c), the user touches the upper right side of the picture list interface 1011 of the gallery application 101 to display the toolbar 1111 of the gallery application 101, clicks the option for selecting the display style in the toolbar 1111, and selects the "water-drop style", which is different from the current "metallic style". The mobile phone 100 then resets the display elements of the picture list interface 1011 of the gallery application 101 using the display style parameters corresponding to the "water-drop style". In addition to switching the appearance of the display elements to an appearance corresponding to the "water-drop style", the cell phone 100 may also obtain the display style parameters corresponding to the "water-drop style", which may be used to configure the display effect of the display elements of the gallery application 101. The display style parameters of the "water-drop style" may differ from those of the "metallic style"; for example, they may include a specific gravity of 3, a sliding coefficient of 0.30, and the like.
After the mobile phone 100 switches the display style of the gallery application 101 through step S908, the mobile phone 100 returns to step S906 to continue detecting whether there is a user operation on the first display interface. If so, the mobile phone 100 executes S907 and, in response to the user operation on a display element in the first display interface, calculates the display effect of the display element through the display style model using the display style parameters of the "water-drop style".
Here, through steps S904 to S908, the mobile phone 100 only needs to set the display style model for the display elements in the display interface of the gallery application 101 once, which avoids implementing a separate display effect for each display style. After each switch of the display style in the gallery application 101, the different display effects of the display elements are calculated by the same display style model according to the display style parameters corresponding to the newly selected display style.
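A minimal sketch of this "one model, many parameter sets" idea is shown below, reusing the hypothetical DisplayStyleParams type from the earlier sketch; the style names and the catalog structure are assumptions, and the water-drop values are those given in this description.

```kotlin
// Switching the display style (S908) only swaps the parameter set; the display
// style models used in S907 stay the same and are re-evaluated with new values.
val styleCatalog = mapOf(
    "metallic style" to DisplayStyleParams(2f, 0.15f, 100f),
    "water-drop style" to DisplayStyleParams(3f, 0.30f, 200f)
)

var currentStyleParams = styleCatalog.getValue("metallic style")

fun onStyleSwitched(styleName: String) {
    currentStyleParams = styleCatalog.getValue(styleName)
    // Subsequent user operations detected in S906 are fed into the unchanged
    // display style models together with currentStyleParams.
}
```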
The content of step S907 is described in detail below. As shown in fig. 9 (b), the process of step S907 performed by the mobile phone 100 includes:
S907a: acquiring physical data of the user operation.
The mobile phone 100 obtains physical data of the user operation through the touch sensor 180K, where the physical data of the user operation may include the type of the user operation and/or the force of the user operation. For example, the user presses the picture 10112 as shown in fig. 10 (a) and slides the picture 10112 as shown in fig. 12 (a); the mobile phone 100 may obtain the forces of these two user operations through the touch sensor 180K. The present application may use the mechanics of a physical field to simulate the force generated by a user's touch operation; for example, the force of the user pressing the picture 10112 and the force of the user sliding the picture 10112 may be force 1 = 1 N and force 2 = 2 N, respectively. It is understood that N here is the mechanical unit newton (N); in some embodiments of the present application, other units may be used to express the force, which is not limited thereto.
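The physical data of a user operation could be represented as in the sketch below; the enum and field names are assumptions, while the 1 N and 2 N forces and the 0.3 sliding duration are the example values used in this description.

```kotlin
// Hypothetical representation of the physical data reported for a user operation.
enum class OperationType { PRESS, CLICK, SLIDE, EXPAND }

data class UserOperationData(
    val type: OperationType,
    val forceNewtons: Float,          // simulated force of the touch operation
    val durationSeconds: Float = 0f   // e.g. the sliding duration, when relevant
)

val pressOn10112 = UserOperationData(OperationType.PRESS, forceNewtons = 1f)
val slideOn10112 = UserOperationData(OperationType.SLIDE, forceNewtons = 2f, durationSeconds = 0.3f)
```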
S907b: and determining the display effect of the display element through the display style model according to the physical data operated by the user and the display style parameters.
Here, for the user operation of pressing the picture 10112 as shown in fig. 10 (a), after the mobile phone 100 acquires the force 1 corresponding to the pressing operation, the mobile phone 100 may calculate the display effect of the pressing operation through the display style model according to the display style parameters configured in step S904. For example, as shown in fig. 10 (b), the display effect of the pressing operation may be that the picture 10112 appears pressed down. For this pressing display effect, the mobile phone 100 may use the force 1 = 1 N corresponding to the user operation and the "metallic style" display style parameters described in step S904 (specific gravity: 2; sliding coefficient: 0.15; elastic coefficient: 100 N/m) to calculate the distance of the press-down of the picture 10112. The distance can be calculated by the display style model (1) of the display effect configured for the picture 10112 in step S704, where the display style model (1) can be an elastic formula.
Force 1 = elastic coefficient × distance
Display style model (1)
After the force 1 = 1 N and the elastic coefficient = 100 N/m are substituted into the display style model (1), a distance of 0.01 m is obtained. That is, when the user presses the picture 10112, as shown in fig. 10 (c), the picture 10112 shows a press-down effect of being displaced 0.01 m into the screen in response to the pressing operation.
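The press-down calculation amounts to inverting the elastic formula: distance = force / elastic coefficient. A minimal sketch, with an assumed function name:

```kotlin
// Display style model (1): force = elasticCoefficient * distance, solved for distance.
fun pressDownDistance(forceNewtons: Float, elasticCoefficientNPerM: Float): Float =
    forceNewtons / elasticCoefficientNPerM

fun main() {
    // 1 N / 100 N/m = 0.01 m, the press-down depth of the picture 10112 above.
    println(pressDownDistance(1f, 100f))
}
```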
Similarly, for the user operation of sliding the picture 10112 as shown in fig. 12 (a), after the mobile phone 100 obtains the force 2 corresponding to the sliding operation, the mobile phone 100 may calculate the display effect of the sliding of the picture 10112 through the display style model corresponding to the sliding operation according to the "metallic style" display style parameters. For example, as shown in fig. 12 (a), while the user slides the picture 10112 to the position of the picture 10113, the picture 10112 shows a sliding display effect. The mobile phone 100 may use the force 2 = 2 N corresponding to the sliding operation, the display style parameters (specific gravity: 2; sliding coefficient: 0.15) and the sliding duration of the user sliding the picture 10112 to the position of the picture 10113 to calculate the speed of the sliding display effect of the picture 10112. The acceleration here can be calculated by the following display style model (2), which may include a speed/acceleration equation and a friction equation.
Acceleration = (force 2 − specific gravity × sliding coefficient) / specific gravity
Speed = acceleration × sliding duration
Display style model (2)
Substituting force 2 = 2 N, specific gravity = 2, sliding coefficient = 0.15 and sliding duration = 0.3 into the display style model (2) gives an acceleration of 0.85 and a speed of 0.255. That is, while the user slides the picture 10112 to the picture 10113, the picture 10112 shows a sliding display effect with a speed of 0.255.
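A sketch of display style model (2) with the "metallic style" values follows; the function and type names are assumptions.

```kotlin
// Display style model (2): subtract a friction-like term from the applied force,
// divide by the specific gravity to get an acceleration, then speed = acceleration * duration.
data class SlideEffect(val acceleration: Float, val speed: Float)

fun slideEffect(force: Float, specificGravity: Float,
                slidingCoefficient: Float, slideDuration: Float): SlideEffect {
    val acceleration = (force - specificGravity * slidingCoefficient) / specificGravity
    return SlideEffect(acceleration, speed = acceleration * slideDuration)
}

fun main() {
    // (2 - 2 * 0.15) / 2 = 0.85 and 0.85 * 0.3 = 0.255, matching the example above.
    println(slideEffect(force = 2f, specificGravity = 2f, slidingCoefficient = 0.15f, slideDuration = 0.3f))
}
```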
It is understood that there is a connection relationship between the picture 10112 and the picture 10113; that is, as shown in fig. 12 (b), while the user slides the picture 10112 to the position of the picture 10113, the picture 10113 can slide to the position of the picture 10112 with a sliding display effect in the direction opposite to that of the picture 10112. After the picture 10112 and the picture 10113 have exchanged their positions, i.e., the user is no longer pressing the picture 10112, the picture 10112 may show a rebound display effect.
Here, the display effect of the picture 10113 sliding to the position of the picture 10112 can be calculated by the display style model (2) described above, based on the force 2 corresponding to the user operation of sliding the picture 10112 and the display style parameters.
For example, substituting force 2 = 2 N, specific gravity = 2, sliding coefficient = 0.15 and sliding duration = 0.3 into the display style model (2) gives an acceleration of 0.85 and a speed of 0.255. That is, while the user slides the picture 10112 to the picture 10113, the picture 10113 shows a sliding display effect with a speed of 0.255 as it slides to the position of the picture 10112.
It is understood that, for different display styles of the gallery application 101 loaded by the cell phone 100, the display effects of the interface elements in the gallery application 101 may be different. For example, for the "metallic style", the sliding effect between the picture 10112 and the picture 10113 may be a rigid slide; for the "water-drop style", the corresponding display style parameters may include a specific gravity of 3, a sliding coefficient of 0.30 and an elastic coefficient of 200 N/m. When the mobile phone 100 uses the "water-drop style" display style parameters, the display effects calculated by the display style models (1) and (2) may differ from those of the "metallic style"; the sliding effect between the picture 10112 and the picture 10113 in the "water-drop style" may be a fluid slide.
S907c: changing the position of the display element or displaying the display effect of the display element.
After the cell phone 100 determines the display effects of the picture 10112 and the picture 10113, the cell phone 100 responds to the user operation of sliding the picture 10112 to the picture 10113. When the user presses the picture 10112, as shown in fig. 10 (c), the picture 10112 shows a pressing display effect. While the user slides the picture 10112 to the picture 10113, the picture 10112 shows a sliding display effect with a speed of 0.255; meanwhile, the picture 10113 shows a sliding display effect with a speed of 0.255 as it slides to the position of the picture 10112.
In another embodiment of the present application, the cell phone 100 may also change the position of a display element in a display interface of an application in response to a user operation of an area outside the display element.
As shown in fig. 13 (a), the user performs a slide-down user operation on the picture list interface 1011 of the gallery application 101. The cellular phone 100 changes the display elements in the picture list interface 1011 of the gallery application 101 to the pictures 10114 to 10119 in response to the slide-down user operation. At this time, as shown in fig. 13 (b), since the previously arranged center display element, i.e., the picture 10111, is already outside the picture list interface 1011 of the gallery application 101, the mobile phone 100 needs to change the center display element of the picture list interface 1011 of the gallery application 101. For example, after the display elements in the picture list interface 1011 of the gallery application 101 are changed to the pictures 10114 to 10119, the cell phone 100 can determine that the picture 10114 is the center display element.
For example, the mobile phone 100 may arrange the center display element in the picture list interface 1011 according to the size of the display area acquired in S902, the attribute of the center display element in the picture list interface 1011 of the gallery application 101, and the position of the center display element preset in the picture list interface 1011.
Here, the position of the center display element preset in the picture list interface 1011 may be the same as that described in S903, and the position of the center display element may be: left: 100; right: 500; top: 250; bottom: 550, that is, the distances between the center of the center display element and the left boundary, the right boundary, the upper boundary and the lower boundary of the screen of the mobile phone 100 are 100 pixels, 500 pixels, 250 pixels and 550 pixels, respectively.
In another embodiment of the present application, as shown in fig. 14 (a), the mobile phone 100 may further change the display elements in the picture list interface 1011 of the gallery application 101 to the pictures 10111 to 10114 in response to an expand user operation performed by the user on the picture list interface 1011 of the gallery application 101. In this case, as shown in fig. 14 (b), the mobile phone 100 needs to change the layout of the picture 10111 on the picture list interface 1011 of the gallery application 101, and the positions of the pictures 10112 to 10114 also change along with the picture 10111.
The mobile phone 100 obtains physical data of the user operation through the touch sensor 180K, where the physical data of the user operation may include the expansion multiple of the expand user operation. The mobile phone 100 adjusts the size of the display elements in the picture list interface 1011 of the gallery application 101 according to the expansion multiple.
For example, when the expansion multiple is 1.5, the mobile phone 100 resizes the display elements in the picture list interface 1011 of the gallery application 101 to 240 × 375.
Next, the mobile phone 100 may rearrange the center display element in the picture list interface 1011 of the gallery application 101 by adjusting the size of the center display element and the position of the center display element in the picture list interface 1011 according to the size of the display area acquired in S902, for example 600 × 800.
Here, the position of the center display element preset in the picture list interface 1011 may also be adjusted according to the expansion multiple of the user operation. For example, in the case where the expansion multiple is 1.5, the position of the center display element may be: left: 160; right: 440; top: 205; bottom: 595, that is, the distances between the center of the center display element and the left boundary, the right boundary, the upper boundary and the lower boundary of the screen of the mobile phone 100 are 160 pixels, 440 pixels, 205 pixels and 595 pixels, respectively. As shown in fig. 14 (c), the mobile phone 100 can place the picture 10111 at the position whose distances from the left boundary, the right boundary, the upper boundary and the lower boundary of the screen of the mobile phone 100 are 160 pixels, 440 pixels, 205 pixels and 595 pixels, respectively, with the picture 10111 occupying a 240 × 375 area on the screen of the mobile phone 100. Meanwhile, the positions and sizes of the pictures 10112 to 10114 are adjusted accordingly.
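A small sketch of the expand handling follows: the element size is scaled by the expansion multiple and the element is then re-anchored. The re-anchoring rule in this description is style-specific, so the sketch only scales the size and takes the new anchor as an input; all names are assumptions.

```kotlin
// Hypothetical resize-and-reanchor step for an expand user operation.
data class ElementLayout(val left: Int, val top: Int, val width: Int, val height: Int)

fun expandElement(base: ElementLayout, expansionMultiple: Float,
                  newLeft: Int = base.left, newTop: Int = base.top): ElementLayout =
    ElementLayout(
        left = newLeft,
        top = newTop,
        width = (base.width * expansionMultiple).toInt(),   // 160 * 1.5 = 240
        height = (base.height * expansionMultiple).toInt()  // 250 * 1.5 = 375
    )

fun main() {
    val picture10111 = ElementLayout(left = 100, top = 250, width = 160, height = 250)
    // Expansion multiple 1.5 with the adjusted anchor values from the description.
    println(expandElement(picture10111, 1.5f, newLeft = 160, newTop = 205))
    // ElementLayout(left=160, top=205, width=240, height=375)
}
```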
Steps S901 to S907 above describe the process in which the mobile phone 100 opens and displays the picture list interface 1011 of the gallery application 101 and changes the positions and display effects of the display elements of the picture list interface 1011 in response to a user operation performed by the user on the picture list interface 1011. In another embodiment of the present application, the mobile phone 100 may further switch the application from the first display interface to a second display interface in response to a user operation, and display the display effects of the display elements of the second display interface when the second display interface is opened and displayed.
As shown in fig. 15 (a), the cellular phone 100 can switch from the picture list interface 1011 to the picture details interface 1012 in response to a user operation of one click performed by the user on the picture 10112 in the picture list interface 1011. As shown in fig. 15 (b), the picture detail interface 1012 includes four display elements: picture 10121, "details" button 10122, "next" button 10123, and picture details 10124.
The mobile phone 100 may configure the positions of the display elements and the display style model of the display effect in the picture detail interface 1012 before opening the picture detail interface 1012 in the same manner as in S702 to S703 described above.
For example, before opening the picture detail interface 1012, the cell phone 100 can configure the picture 10121 as the center display element, and the cell phone 100 configures the position of the picture 10121 in the picture detail interface 1012, where the position can be: left: 1/2 of the screen width; right: 1/2 of the screen width; top: 1/4 of the screen height; bottom: 3/4 of the screen height. The layout of the picture details 10124 may be: "sequence number: 2; spacing: 30". Similarly, the handset 100 also configures the "details" button 10122 and the "next" button 10123, for example: "sequence number: 3; spacing: 30" and "sequence number: 4; spacing: 30".
While the cell phone 100 configures the display elements in the picture detail interface 1012, the cell phone 100 may configure a display style model of the display effect for the display elements in the picture detail interface 1012 of the gallery application 101. Taking the picture 10121 in the picture detail interface 1012 as an example, the cell phone 100 may configure the display effect of the picture 10121 such that, when the cell phone 100 opens the picture detail interface 1012, the picture 10121 falls from outside the upper boundary of the picture detail interface 1012 to the position where the picture 10121 is located in the picture detail interface 1012. Similarly, the mobile phone 100 may configure the same display effect as the picture 10121 for the picture details 10124, the "details" button 10122 and the "next" button 10123.
After the mobile phone 100 has finished configuring the positions of the display elements in the picture detail interface 1012 and the display style model of their display effects, the mobile phone 100 may display the second display interface and trigger the display effects of the display elements of the second display interface in a manner similar to that of fig. 9 (a) and fig. 9 (b), including:
S1101: adjusting the layout of the second display interface according to the size of the display area and displaying the second display interface.
After the mobile phone 100 displays the picture detail interface 1012, if the resolution of the screen of the mobile phone 100 is 600 × 800, as shown in fig. 15 (b), the position of the picture 10121 in the picture detail interface 1012 may be: left: 300; right: 300; top: 200; bottom: 600, that is, the distances between the center of the picture 10121 and the left boundary, the right boundary, the upper boundary and the lower boundary of the picture detail interface 1012 are 300 pixels, 300 pixels, 200 pixels and 600 pixels, respectively. The position of the picture details 10124 may be 30 pixels below the picture 10121.
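The fractional positions configured before opening the interface resolve to these pixel values when applied to a 600 × 800 display area; a minimal sketch (names assumed):

```kotlin
// Resolving the fractional layout of the picture 10121 against the display area.
data class FractionalPosition(val left: Float, val right: Float, val top: Float, val bottom: Float)
data class PixelPosition(val left: Int, val right: Int, val top: Int, val bottom: Int)

fun resolve(pos: FractionalPosition, widthPx: Int, heightPx: Int) = PixelPosition(
    left = (pos.left * widthPx).toInt(),     // 1/2 * 600 = 300
    right = (pos.right * widthPx).toInt(),   // 1/2 * 600 = 300
    top = (pos.top * heightPx).toInt(),      // 1/4 * 800 = 200
    bottom = (pos.bottom * heightPx).toInt() // 3/4 * 800 = 600
)

fun main() {
    println(resolve(FractionalPosition(0.5f, 0.5f, 0.25f, 0.75f), 600, 800))
    // PixelPosition(left=300, right=300, top=200, bottom=600)
}
```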
After configuring the picture details 10124, the mobile phone 100 sequentially configures the "details" button 10122 and the "next" button 10123; as shown in fig. 15 (b), the "details" button 10122 and the "next" button 10123 are at the same horizontal position, directly below the picture details 10124.
S1102: the display style of the application is loaded.
The display style here may be the same as that in S904, except that the value of the specific gravity of the picture 10121 is different. For example, the display style parameters of the picture 10121 include: specific gravity: 6; sliding coefficient: 0.15.
S1103: triggering the display effect of the display elements of the second display interface.
For example, the position of the picture 10121 is: left: 300; right: 300; top: 200; bottom: 600, and the display style parameters of the picture 10121 include: specific gravity: 6; sliding coefficient: 0.15. The display effect of the picture 10121 is that the picture 10121 falls from outside the upper boundary of the picture detail interface 1012 to the position where the picture 10121 is located in the picture detail interface 1012. That is, the position where the picture 10121 is located in the picture detail interface 1012 is the end point, and the position of the picture 10121 outside the upper boundary of the picture detail interface 1012 is the start point. The mobile phone 100 can calculate the falling display effect of the picture 10121 through the display style model (3) according to the distance between the end point and the start point and the display style parameters of the picture 10121. In the display style model (3), the force 3 of the falling of the picture 10121 can be set to 3 N. The display style model (3) here may also include a speed/acceleration equation and a friction equation.
Distance = pixel value of end point − pixel value of start point
Acceleration = (force 3 − specific gravity × sliding coefficient) / specific gravity
Speed = initial speed + acceleration × duration
Distance = speed × duration
Display style model (3)
Substituting force 3 = 3 N, specific gravity = 6, sliding coefficient = 0.15 and distance = 500 into the display style model (3), the acceleration is 1.05 and the speed is 22. That is, after the user clicks the picture 10112 and the picture detail interface 1012 is opened, the picture 10121 enters the picture detail interface 1012 with a falling display effect at a speed of 22 from outside the upper boundary of the picture detail interface 1012.
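The equations of display style model (3) can be read as a simple uniformly-accelerated fall; the sketch below expresses them directly, with assumed function names and without normalizing the units or internal scaling used in the example above.

```kotlin
// Display style model (3), written out as plain functions.
fun travelDistance(endPointPx: Float, startPointPx: Float): Float =
    endPointPx - startPointPx

fun fallAcceleration(force: Float, specificGravity: Float, slidingCoefficient: Float): Float =
    (force - specificGravity * slidingCoefficient) / specificGravity

fun fallSpeed(initialSpeed: Float, acceleration: Float, elapsed: Float): Float =
    initialSpeed + acceleration * elapsed

fun main() {
    // Illustrative only: the concrete acceleration and speed figures reported in
    // the description also depend on how force, gravity and duration are scaled
    // internally, which is not specified here.
    val distance = travelDistance(endPointPx = 0f, startPointPx = -500f)
    val a = fallAcceleration(force = 3f, specificGravity = 6f, slidingCoefficient = 0.15f)
    println("distance=$distance, acceleration=$a, speed after 1s=${fallSpeed(0f, a, 1f)}")
}
```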
Similarly, the cellular phone 100 can calculate the display effect of the "detail" button 10122 and the "next" button 10123.
In another embodiment of the present application, the display effect of switching from the picture list interface 1011 to the picture detail interface 1012 may be as shown in fig. 16 (a): the picture detail interface 1012 as a whole pops up from below the picture list interface 1011 onto the screen of the mobile phone 100, and at the same time the display elements in the picture detail interface 1012 fall from outside the upper boundary of the picture detail interface 1012 into the picture detail interface 1012 with the display effect described for fig. 15 (b).
It will be appreciated that the cell phone 100 may also configure display effects for switching between the display interfaces of the gallery application 101 and other applications. For example, as shown in fig. 16 (b), when the user modifies the picture details 10124 in the picture detail interface 1012 of the gallery application 101, the input method application 102 may be displayed with a floating-up display effect from below the screen of the cell phone 100. When the handset 100 receives a message, the instant messaging application 103 of the handset 100 may be displayed with a falling display effect from above the screen of the handset 100.
It is understood that, in the embodiments of the present application, the parameters and values in the display style models (1) to (3) are exemplary, that is, any number and types of other parameters and values may be included, and are not limited thereto.
It will be understood that, although the terms "first", "second", etc. may be used herein to describe various features, these features should not be limited by these terms. These terms are used merely for distinguishing and are not intended to indicate or imply relative importance. For example, a first feature may be termed a second feature, and, similarly, a second feature may be termed a first feature, without departing from the scope of example embodiments.
Moreover, various operations are described as multiple separate operations in a manner that is most helpful for understanding the illustrative embodiments; however, the order of description should not be construed to imply that these operations are necessarily order dependent, and many of the operations can be performed in parallel, concurrently, or simultaneously. In addition, the order of the operations may be rearranged. A process may be terminated when its described operations are completed, but may also have additional operations not included in the figure. A process may correspond to a method, a function, a procedure, a subroutine, and the like.
References in the specification to "one embodiment," "an illustrative embodiment," etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but not every embodiment necessarily includes the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Furthermore, when a particular feature is described in connection with a particular embodiment, those of ordinary skill in the art will be able to effect such feature in connection with other embodiments, whether or not such embodiments are explicitly described.
The terms "comprising," "having," and "including" are synonymous, unless the context dictates otherwise. The phrase "A/B" means "A or B". The phrase "A and/or B" means "(A), (B) or (A and B)".
As used herein, the term "module" may refer to, be a part of, or include: an Application Specific Integrated Circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and memory (shared, dedicated, or group) that executes one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
In the drawings, some features of the structures or methods may be shown in a particular arrangement and/or order. However, it should be understood that such specific arrangement and/or ordering is not required. Rather, in some embodiments, these features may be described in a manner and/or order different from that shown in the illustrative figures. Additionally, the inclusion of a structural or methodical feature in a particular figure does not imply that all embodiments need to include such feature, and in some embodiments may not include such feature, or may be combined with other features.
While the embodiments of the present application have been described in detail with reference to the accompanying drawings, the application of the present application is not limited to the various applications mentioned in the embodiments of the present application, and various structures and modifications can be easily implemented with reference to the embodiments of the present application to achieve various beneficial effects mentioned herein. Variations that do not depart from the gist of the disclosure are intended to be within the scope of the disclosure.

Claims (15)

1. An interface display method is applied to electronic equipment, and is characterized by comprising the following steps:
acquiring user operation data generated by user operation on a screen of the electronic equipment;
calculating the user operation data by using a physical rule corresponding to the type of the user operation to obtain display data corresponding to the user operation data;
and according to the display data, displaying a physical phenomenon generated by simulating the user operation data to follow the physical law on the electronic equipment.
2. The method of claim 1, wherein the user action comprises at least one of a press, a click, and a slide.
3. The method of claim 2, wherein the physical phenomena corresponding to the pressing comprise at least one of a depression, a deformation and a rebound;
the physical phenomenon corresponding to the click comprises at least one of jump and jump;
the physical phenomena corresponding to the sliding include at least one of sliding, rolling, rotating in place, and turning pages.
4. The method of claim 2, wherein the electronic device calculates the display data by:
converting the user operation data into physical data corresponding to the physical law, and acquiring a physical coefficient corresponding to a display style parameter of the electronic equipment;
the display data is calculated based on the physical data and physical coefficients.
5. The method of claim 4, wherein the physical data comprises at least one of a magnitude of a force, a time of action of the force, and a displacement, wherein,
the forces of pressing, clicking and sliding applied by the user on the screen of the electronic device correspond to the physical data.
6. The method of claim 5, wherein the physical coefficient comprises at least one of a weight, a friction coefficient and an elastic coefficient; wherein, when the display style parameters are different, the weight, friction coefficient or elastic coefficient of the object operated by the user is different.
7. The method of claim 1, wherein the physical laws comprise at least one of a friction equation, an acceleration equation, and a spring equation.
8. An electronic device, comprising:
a memory storing instructions;
a processor coupled with a memory, the memory storing program instructions that, when executed by the processor, cause the electronic device to:
acquiring user operation data generated by user operation on a screen of the electronic equipment;
calculating the user operation data by using a physical rule corresponding to the type of the user operation to obtain display data corresponding to the user operation data;
and according to the display data, displaying a physical phenomenon which simulates the user operation data to follow the physical law on the electronic equipment.
9. The electronic device of claim 8, wherein the user action comprises at least one of a press, a click, and a slide.
10. The electronic device of claim 9, wherein the physical phenomena corresponding to the pressing comprises at least one of a pressing, a deformation, and a rebound;
the physical phenomenon corresponding to the click comprises at least one of jump and jump;
the physical phenomena corresponding to the sliding include at least one of sliding, rolling, rotating in place, and turning pages.
11. The electronic device of claim 9, wherein the electronic device calculates the display data by:
converting the user operation data into physical data corresponding to the physical law, and acquiring a physical coefficient corresponding to a display style parameter of the electronic equipment;
calculating the display data based on the physical data and physical coefficients.
12. The electronic device of claim 11, wherein the physical data comprises at least one of a magnitude of a force, a time of action of the force, and a displacement, wherein,
the forces of pressing, clicking and sliding applied by the user on the screen of the electronic device correspond to the physical data.
13. The electronic device of claim 12, wherein the physical coefficient comprises at least one of a weight, a friction coefficient and an elastic coefficient; wherein, when the display style parameters are different, the weight, friction coefficient or elastic coefficient of the object operated by the user is different.
14. The electronic device of claim 8, wherein the physical laws comprise at least one of a friction equation, an acceleration equation, and a spring equation.
15. A computer-readable medium having stored thereon instructions that, when executed on an electronic device, cause the electronic device to perform the interface display method of any one of claims 1 to 7.
CN202110438190.5A 2021-04-22 2021-04-22 Interface display method, electronic device and computer readable medium Pending CN115237304A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110438190.5A CN115237304A (en) 2021-04-22 2021-04-22 Interface display method, electronic device and computer readable medium
PCT/CN2022/085864 WO2022222771A1 (en) 2021-04-22 2022-04-08 Interface display method, electronic device and computer-readable medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110438190.5A CN115237304A (en) 2021-04-22 2021-04-22 Interface display method, electronic device and computer readable medium

Publications (1)

Publication Number Publication Date
CN115237304A true CN115237304A (en) 2022-10-25

Family

ID=83665743

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110438190.5A Pending CN115237304A (en) 2021-04-22 2021-04-22 Interface display method, electronic device and computer readable medium

Country Status (2)

Country Link
CN (1) CN115237304A (en)
WO (1) WO2022222771A1 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2750016A1 (en) * 2012-12-28 2014-07-02 Sony Mobile Communications AB Method of operating a graphical user interface and graphical user interface
CN105653099B (en) * 2016-03-01 2019-06-11 宇龙计算机通信科技(深圳)有限公司 A kind of method and device for feeding back touch screen pressing dynamics
US10901598B2 (en) * 2017-03-29 2021-01-26 Huawei Technologies Co., Ltd. Method for adjusting interface scrolling speed, related device, and computer program product
EP3783471A4 (en) * 2018-05-21 2021-05-05 Huawei Technologies Co., Ltd. Display control method and terminal
CN111309229A (en) * 2020-02-17 2020-06-19 Oppo广东移动通信有限公司 Parameter adjusting method, device, terminal and storage medium

Also Published As

Publication number Publication date
WO2022222771A1 (en) 2022-10-27


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination