CN109032710B - Interface adjusting method, device, equipment and storage medium
- Publication number
- CN109032710B (application number CN201710434670.8A)
- Authority
- CN
- China
- Prior art keywords
- view
- moving
- interface
- background
- backboard
- Legal status: Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Embodiments of the present application provide an interface adjustment method, apparatus, device, and storage medium, so as to improve the directivity of interface adjustment. The method includes: receiving an operation instruction; and moving an interface according to the operation instruction, where the interface includes a background view and at least one element view displayed superimposed on the background view, and the background view and the at least one element view have different movement states. The change of elements in the interface can thus be visually displayed while the interface moves, and the movement of the interface is clearly indicated, so that the directivity of interface adjustment is effectively improved.
Description
Technical Field
The present application relates to the field of computer technologies, and in particular, to an interface adjusting method, an interface adjusting apparatus, a terminal device, and a storage medium.
Background
With the development of intelligent terminal technology, more and more users use terminal devices such as smart phones and tablet computers.
When operating a terminal device, a user can switch between different interfaces. Taking slide switching as an example, when the user slides a finger, the interface translates with the finger until it switches to the next interface. Fig. 1 is a schematic diagram of interface switching in the background art: the left side shows the interface before switching, which contains icons; the right side shows one frame during the slide switch, in which the interface slides across the terminal screen and the icons it contains move synchronously, as a whole, with the interface. Icons 1, 5 and 9, drawn with dot-dash lines, have moved off the screen and are no longer displayed.
Therefore, in the conventional interface switching process, the information in the interface simply translates together with the interface as a whole, so the adjustment of the interface lacks directivity.
Disclosure of Invention
The technical problem to be solved by the embodiments of the present application is to provide an interface adjustment method to improve the directivity of interface adjustment.
Correspondingly, the embodiment of the application also provides an interface adjusting device, a terminal device, a storage medium and an operating system, which are used for ensuring the implementation and application of the method.
In order to solve the above problem, an embodiment of the present application discloses an interface adjustment method, including: receiving an operation instruction; moving an interface according to the operation instruction, wherein the interface comprises a background view and at least one element view which is displayed on the background view in an overlapped mode; the background view and the at least one element view are in different states of movement.
The embodiment of the present application further discloses an interface adjusting device, including: the receiving module is used for receiving an operation instruction; the adjusting module is used for moving an interface according to the operation instruction, wherein the interface comprises a background view and at least one element view which is displayed on the background view in an overlapped mode; the background view and the at least one element view are in different states of movement.
The embodiment of the application discloses terminal equipment, which is characterized by comprising: one or more processors; and one or more machine readable media having instructions stored thereon that, when executed by the one or more processors, cause the terminal device to perform the method of one or more of claims 1-19.
Embodiments of the present application disclose one or more machine-readable media having instructions stored thereon, which when executed by one or more processors, cause a terminal device to perform the method as recited in one or more of claims 1-19.
The embodiment of the application discloses an operating system for terminal equipment, which comprises: a receiving unit for receiving an operation instruction; the adjusting unit is used for moving an interface according to the operation instruction, wherein the interface comprises a background view and at least one element view which is displayed on the background view in an overlapped mode; the background view and the at least one element view are in different states of movement.
Compared with the prior art, the embodiment of the application has the following advantages:
In the embodiments of the present application, an operation instruction can be received, and the interface is then moved according to the operation instruction. The interface includes a background view and at least one element view displayed superimposed on the background view, and the movement states of the background view and the at least one element view are different. As a result, the change of elements in the interface can be visually displayed while the interface moves, the movement of the interface is clearly indicated, and the directivity of interface adjustment is effectively improved.
Drawings
FIG. 1 is a schematic diagram of interface adjustment in the prior art;
FIG. 2 is a schematic diagram of an interface adjustment according to an embodiment of the present application;
FIG. 3 is a schematic diagram of another interface adjustment according to an embodiment of the present application;
FIG. 4 is a flowchart illustrating the steps of one embodiment of an interface adjustment method of the present application;
FIG. 5 is a flowchart illustrating the steps of another embodiment of an interface adjustment method of the present application;
FIG. 6 is a schematic diagram of an element view adjustment according to an embodiment of the present application;
FIG. 7 is a schematic diagram of an exemplary interface adjustment in an embodiment of the present application;
FIG. 8 is a block diagram of an embodiment of an interface adjustment apparatus of the present application;
FIG. 9 is a schematic diagram of the hardware structure of a terminal device according to an embodiment of the present application;
FIG. 10 is a schematic diagram of the hardware structure of a terminal device according to another embodiment of the present application;
FIG. 11 is a schematic diagram of an operating system according to an embodiment of the present application.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, the present application is described in further detail with reference to the accompanying drawings and the detailed description.
In the embodiments of the present application, a terminal device refers to a device running an intelligent operating system that supports audio, video, data, and similar functions. Such devices include intelligent mobile terminals such as smartphones, tablet computers, and smart wearable devices, and may also be smart televisions, personal computers, and other devices. Intelligent operating systems include, for example, YunOS, iOS, and Android.
When using a terminal device, a user may switch between different interfaces, for example between different desktop interfaces or between different interfaces of an application. To support this, an operation instruction can be received; the operation instruction may be generated by various operations, such as a gesture operation, a biometric operation, a touch operation, or a press operation. After the operation instruction is received, the interface, that is, the interface currently displayed on the screen of the terminal device, can be moved according to the instruction. The interface includes a background view and at least one element view displayed superimposed on the background view. The background view is the view of the interface background, such as the wallpaper layer of the desktop, and is used to carry the content displayed on the interface. An element view is a view corresponding to a display element in the interface, such as a view of an icon, a widget, a picture, or a piece of text; an interface may include one or more element views. As the interface moves, the background view and the element views move correspondingly, but in the embodiments of the present application the movement states of the background view and the at least one element view differ. A movement state covers the state data during the movement, such as speed, direction, and size, and at least one of these differs between the background view and the element views: for example, they move at different speeds, or the background view only moves while the element views move and also change size.
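To make the layered structure concrete, the following Kotlin sketch models an interface as a background view, an optional backboard view, and element views, each advanced per frame according to its own movement state. It is only an illustrative sketch: the class names, fields, and the per-frame update loop are assumptions of this description, not the patent's implementation.

```kotlin
// Illustrative model of the layered interface described above; all names are hypothetical.
data class MovementState(
    val speed: Float,               // pixels per frame along the slide axis
    val direction: Int,             // +1 = same direction as the slide, -1 = opposite
    val scalePerFrame: Float = 0f   // per-frame size change; 0 keeps the size unchanged
)

data class LayerView(val name: String, var x: Float, var scale: Float = 1f)

class LayeredInterface(
    val background: LayerView,
    val backboard: LayerView?,              // optional view carrying the element views
    val elements: MutableList<LayerView>
) {
    // One animation frame: each layer moves according to its own MovementState,
    // which is how the background view and the element views end up in
    // different movement states (speed, direction, or size).
    fun step(bg: MovementState, element: MovementState, board: MovementState? = null) {
        background.x += bg.speed * bg.direction
        backboard?.let { val s = board ?: bg; it.x += s.speed * s.direction }
        for (e in elements) {
            e.x += element.speed * element.direction
            e.scale += element.scalePerFrame
        }
    }
}
```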
Fig. 2 is a schematic diagram of an interface adjustment process in an embodiment of the present application, in which the icon area and the interface area are separated by dotted lines; the terminal screen does not actually display these dotted lines. The interface on the left is the interface displayed before the adjustment. If it is a desktop interface of the terminal device, it includes element views, namely icons A-L, and also includes a background view. The interface on the right is one frame displayed during the adjustment: the middle part of the interface is displayed on the terminal screen, part of the interface has moved off the screen and cannot be displayed (marked with dotted lines in the right-hand diagram), and the background view and the element views in the interface also move correspondingly. Because the movement states of the background view and the element views are different, part of an element view moves out of its display area; the element views within the display area are displayed normally, while the parts of element views beyond the display area are cut and no longer displayed. In the right-hand diagram, the portion of the interface that has moved off the terminal screen is indicated by dotted lines, and the element views (icons A, E, I) are likewise indicated by dotted lines to make the interface movement clear. It can be seen that, because the background view and the element views are in different movement states during the interface movement, the change of elements in the interface can be visually displayed; for example, the element views in the right-hand interface are cut where they exceed the display area during the movement, so that only part of each icon is displayed. The movement of the interface is thereby clearly indicated, and the directivity of interface adjustment is effectively improved. Moreover, the interface and the content it contains present different effects, which gives the adjustment a more layered appearance.
During the interface movement, one interface may be moved toward another interface, and both interfaces, before and after the movement, may contain element views; of course, either interface alone may contain element views. For an interface that contains element views, the movement states of its background view and element views can be determined according to the movement of the interface and the operation instruction. For the case where two interfaces are displayed on the terminal screen simultaneously during the movement, an example is shown in fig. 3: the interfaces before and after the movement are both shown on the terminal screen, the element view of the preceding interface, in the left part of the screen, is cut during the movement, and the element view of the following interface, in the right part, is cut as well, so that each interface clearly indicates the movement of the interface.
Referring to fig. 4, a flowchart illustrating steps of an embodiment of an interface adjustment method according to the present application is shown, which may specifically include the following steps:
An operation instruction is generated according to the operation, and after the operation instruction is received, the interface is moved according to it. Because the background view and the at least one element view have different movement states, the movement of the background view and the movement of the element views are determined separately according to the operation instruction. Movement rules can be configured so that movement state data such as the moving speed of the background view, as well as the moving speed, direction, and size of the element views, can be determined from the operation instruction. The states of the background view and the element views during the interface movement can thus be determined.
The background view and the at least one element view having different movement states includes: the moving speed of the background view and the moving speed of the at least one element view are different. During the interface movement, the element views and the background view move at different speeds, so different movement effects are displayed rather than the interface moving as a single whole.
The background view and the at least one element view having different movement states also includes: the background view moves with the interface, while the at least one element view moves with the interface and changes size. That is, both the background view and the element views move with the interface, at the same or different speeds, and in addition the size of the element views is adjusted: an element view may be enlarged or reduced as it moves, or, for example, enlarged as the movement acceleration increases and reduced as the acceleration decreases.
In an optional embodiment, moving the interface according to the operation instruction includes: respectively determining the moving speed of the background view and the moving speed of the at least one element view according to the operation instruction; and moving the background view according to its moving speed, and moving the corresponding at least one element view according to its moving speed. The moving speed of the background view (referred to, for example, as a first moving speed) and the moving speed of the at least one element view (referred to, for example, as a second moving speed) can both be determined from the operation instruction. For example, the first moving speed of the background view is set equal to the interface moving speed, i.e., the moving speed indicated by the operation instruction, and the second moving speed of the element views is set smaller (or larger) than the first moving speed. The background view is then moved at the first moving speed and the corresponding at least one element view at the second moving speed; if an element view exceeds the display area during the movement, it is cut, and only the part within the display area is displayed.
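A minimal sketch of this two-speed variant is given below, reusing the LayerView class from the earlier sketch. The choice of 0.6 as the ratio between the second and first moving speeds, and every identifier, are assumptions made purely for illustration.

```kotlin
// Hypothetical two-speed update: the background follows the interface speed carried
// by the operation instruction, while the element views move slower.
data class OperationInstruction(val interfaceSpeed: Float)   // px per frame indicated by the operation

const val ELEMENT_SPEED_FACTOR = 0.6f                        // assumed: second speed < first speed

fun moveOneFrame(op: OperationInstruction, background: LayerView, elements: List<LayerView>) {
    val firstSpeed = op.interfaceSpeed                         // moving speed of the background view
    val secondSpeed = op.interfaceSpeed * ELEMENT_SPEED_FACTOR // moving speed of the element views
    background.x -= firstSpeed                                 // e.g. slide to the left
    elements.forEach { it.x -= secondSpeed }
    // Any part of an element view that now falls outside its display area is not drawn;
    // see the clipping sketch further below.
}
```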
Moving the interface according to the operation instruction may also include: determining the moving speed of the background view and the moving speed of the backboard view according to the operation instruction, and determining the moving speed and size information of the element views; moving the background view according to its moving speed; and moving the corresponding at least one element view according to its moving speed while adjusting the size of the at least one element view during the movement according to the size information. That is, the moving speed of the background view (for example, a first moving speed) and the moving speed (for example, a second moving speed) and size information of the at least one element view are respectively determined according to the operation instruction; the background view can then be moved at the first moving speed, and the corresponding at least one element view can be moved at the second moving speed while its size is adjusted according to the size information during the movement, for example by zooming in, zooming out, or stretching. If an element view moves beyond the display area during the movement, or is enlarged (stretched) beyond the display area, the excess part of the view is cut and only the part within the display area is displayed.
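The speed-plus-size variant can be sketched in the same way; the linear rule that ties the scale step to the movement acceleration is invented here for illustration and is not prescribed by the patent. LayerView is reused from the first sketch.

```kotlin
// Hypothetical frame update that both moves the element views and adjusts their size.
data class SizedInstruction(val interfaceSpeed: Float, val acceleration: Float)

fun moveAndResize(op: SizedInstruction, background: LayerView, elements: List<LayerView>) {
    val backgroundSpeed = op.interfaceSpeed         // first moving speed
    val elementSpeed = op.interfaceSpeed * 0.6f     // second moving speed (assumed slower)
    val scaleStep = 0.02f * op.acceleration         // assumed: grow as acceleration rises, shrink as it falls

    background.x -= backgroundSpeed
    elements.forEach {
        it.x -= elementSpeed
        it.scale = (it.scale + scaleStep).coerceIn(0.5f, 1.5f) // keep the size within a sane range
    }
}
```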
In summary, an operation instruction can be received and the interface then moved according to the operation instruction, where the interface includes a background view and at least one element view displayed superimposed on the background view, and the movement states of the background view and the at least one element view are different. The change of elements in the interface can therefore be visually displayed while the interface moves, the movement of the interface is explicitly indicated, and the directivity of interface adjustment is effectively improved.
In an embodiment of the present application, the interface further includes a backboard view carrying the element views. The backboard view is superimposed on the background view, that is, the backboard view lies between the element views and the background view. The backboard view corresponds to the display area of the element views and is used to carry them, for example the backboard view occupying the dotted-line area in which the elements are located in figs. 2 and 3, so that the part of an element view extending beyond the backboard view can be cut. The backboard view may have a color or may be transparent, which is not limited in the embodiments of the present application. The movement states of the backboard view and the at least one element view are different, while the movement states of the backboard view and the background view may be the same or different.
The backboard view and the at least one element view having different movement states includes: the moving speed of the backboard view and the moving speed of the at least one element view are different. During the interface movement, the element views and the backboard view move at different speeds, so different movement effects are displayed rather than the interface moving as a single whole.
The backboard view and the at least one element view having different movement states also includes: the backboard view moves with the interface, while the at least one element view moves with the interface and changes size. That is, both the backboard view and the element views move with the interface, at the same or different speeds, and in addition the size of the element views is adjusted: an element view may be enlarged or reduced as it moves, or, for example, enlarged as the movement acceleration increases and reduced as the acceleration decreases.
Referring to fig. 5, a flowchart illustrating steps of another embodiment of an interface adjustment method according to the present application is shown, which may specifically include the following steps:
The operation instruction is triggered according to at least one of the following operations: a gesture operation, a biometric operation, a touch operation, or a press operation. A gesture operation is an operation performed by a gesture, such as shaking the mobile phone or waving a hand over the screen. A biometric operation is an operation performed by recognizing biometric data of a living body, such as sensing a fingerprint, iris, or voiceprint, or sensing eyeball movement. A touch operation is an operation of touching the terminal screen, such as sliding or tapping. A press operation is an operation for which the terminal receives pressure data, such as a press on the screen (e.g., 3D Touch) or a squeeze of the mobile phone. An operation instruction is triggered and sent according to these operations, and the operating system of the terminal receives the operation instruction accordingly.
Step 504, respectively determining the moving speed of the background view, the moving speed of the at least one element view and the moving speed of the backboard view according to the operation instruction.
Step 506, moving the background view according to the moving speed of the background view, moving the corresponding at least one element view according to the moving speed of the at least one element view, and moving the corresponding backboard view according to the moving speed of the backboard view.
That is, a first moving speed of the background view, a second moving speed of the at least one element view, and a third moving speed of the backboard view are respectively determined according to the operation instruction; for the movement of the interface, the background view can then be moved at the first moving speed, the element views at the second moving speed, and the backboard view at the third moving speed. The first moving speed is not equal to the second moving speed, the third moving speed is not equal to the second moving speed, and the first and third moving speeds may be the same or different.
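The relationship among the three speeds can be captured as below; the concrete factors (equal speeds for the background and backboard views, 0.6 for the element views) are assumptions chosen only to satisfy the stated constraints.

```kotlin
// Hypothetical derivation of the three speeds named above from one operation instruction.
data class ThreeSpeeds(val first: Float, val second: Float, val third: Float)

fun deriveSpeeds(interfaceSpeed: Float): ThreeSpeeds {
    require(interfaceSpeed != 0f) { "no movement requested" }
    val first = interfaceSpeed          // background view
    val third = interfaceSpeed          // backboard view; may equal the first speed
    val second = interfaceSpeed * 0.6f  // element views; differs from both first and third
    return ThreeSpeeds(first, second, third)
}
```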
In another optional embodiment of the present application, size information of the element views may further be determined according to the operation instruction, and the size of the at least one element view is then adjusted according to the size information during the movement, for example by zooming in, zooming out, or stretching. If an element view moves beyond the display area during the movement, or is enlarged (stretched) beyond the display area, the excess part of the view is cut and only the part within the display area is displayed. In other words, moving the interface according to the operation instruction includes: determining the moving speed of the background view and determining the moving speed and size information of the element views according to the operation instruction; moving the background view according to its moving speed, and moving the corresponding backboard view according to the moving speed of the backboard view; and moving the corresponding at least one element view according to its moving speed while adjusting its size during the movement according to the size information.
A partial view of the at least one element view beyond the backboard view is cut. Because the element views and the backboard view are in different movement states, and the backboard view is used to carry the element views, an element view may exceed the backboard view during the movement due to at least one of a speed difference or a size adjustment. In that case, the part of the element view beyond the backboard view can be cut: the part beyond the backboard view is not displayed in the interface, and only the part that does not exceed the backboard view is displayed. Cutting the partial view of the at least one element view beyond the backboard view includes: drawing the partial view of the at least one element view located within the backboard view, and stopping drawing the partial view of the at least one element view beyond the backboard view. That is, for each frame of the interface image, the partial view of the at least one element view within the backboard view can be drawn, while drawing is stopped for the partial view beyond the backboard view. Of course, in some examples, the part of the view beyond the backboard view may also be cut by hiding it.
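The clipping rule can be illustrated with plain rectangle intersection: only the part of an element view that lies inside the backboard view is drawn, and drawing stops for the part beyond it. The Rect type and the draw callback below are assumptions of this sketch; an Android implementation might instead rely on canvas clipping, but the patent does not specify that.

```kotlin
// Hypothetical axis-aligned rectangles in screen coordinates.
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun intersect(other: Rect): Rect? {
        val l = maxOf(left, other.left)
        val t = maxOf(top, other.top)
        val r = minOf(right, other.right)
        val b = minOf(bottom, other.bottom)
        return if (l < r && t < b) Rect(l, t, r, b) else null
    }
}

// Draw only the part of each element view that lies within the backboard view;
// the part beyond the backboard view is simply not drawn (i.e. it is cut).
fun drawClippedElements(backboard: Rect, elementBounds: List<Rect>, draw: (Rect) -> Unit) {
    for (bounds in elementBounds) {
        val visible = bounds.intersect(backboard) ?: continue  // fully outside: skip drawing
        draw(visible)                                          // partially outside: draw the visible part only
    }
}
```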
In an optional embodiment of the present application, the touch operation includes a sliding operation, and moving the interface according to the operation instruction includes: sliding the interface according to a sliding operation instruction. The interface can be moved by a sliding indication: the user slides on the screen with a finger, a stylus, or the like, a sliding operation instruction is triggered based on the sliding operation, and the interface is then slid according to the sliding operation instruction, that is, the background view, the element views, the backboard view, and so on all slide. The sliding speeds of the background view, the backboard view, and the element views can be determined separately according to the sliding operation instruction, and they correspond to the sliding speed of the sliding operation. For example, the sliding speeds of the background view and the backboard view may be configured to equal the sliding speed of the sliding operation while the element views slide faster or slower than the sliding operation; or the background view and the backboard view may slide at the speed of the sliding operation while the element views are additionally resized; or the sliding speeds of the background view, the backboard view, and the element views may all differ from one another.
Taking the sliding operation as an example, the user's finger slides on the screen and issues a sliding operation instruction, and the interface then slides according to that instruction. The sliding speeds of the background view and the backboard view can be the same as the sliding speed of the finger, while the sliding speed of the element views is lower than that of the finger. During the movement the background view and the backboard view therefore slide faster and the element views slide more slowly, so that after a certain frame part of an element view exceeds the backboard view and is cut, and only part of the element view is displayed.
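This finger-driven example can be simulated numerically: with the backboard view at the finger speed and the icon view slower, the icon's overhang past the backboard edge grows frame by frame. The finger speed, the 0.75 ratio, and the widths below are made-up numbers used only to show the effect.

```kotlin
// Toy simulation of the sliding example: prints how much of the icon sticks out past
// the backboard after each frame. All positions are 1-D (x axis), all numbers assumed.
fun main() {
    val fingerSpeed = 20f               // px per frame, taken from the sliding operation
    val iconSpeed = fingerSpeed * 0.75f // icon view slides slower than the finger
    var backboardLeft = 0f              // backboard left edge; backboard width 100 px
    var iconLeft = 0f                   // icon left edge; icon area width 100 px

    repeat(5) { frame ->
        backboardLeft -= fingerSpeed    // backboard follows the finger (slide to the left)
        iconLeft -= iconSpeed           // icon lags behind
        val overhang = maxOf(0f, (iconLeft + 100f) - (backboardLeft + 100f))
        println("frame ${frame + 1}: $overhang px of the icon exceeds the backboard and is cut")
    }
}
```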
In an embodiment of the application, moving the interface according to the operation instruction further includes: determining a first direction of the background view and a second direction of the at least one element view according to the operation instruction, where the first direction and the second direction may be the same or different. Taking the case where the first direction and the second direction are different as an example, when the interface moves, the background view and the element views move in opposite directions; the movement states of the background view and the at least one element view then differ in direction, and the moving speed need not be limited. In other words, the movement states of the background view and the at least one element view may differ in any of moving speed, direction, size change, distance, and so on.
Moving the interface according to the operation instruction may further include: determining a first direction of the background view, a second direction of the at least one element view, and a third direction of the backboard view according to the operation instruction, where the first, second, and third directions may be the same or different. For example, if the first and third directions are the same and the second direction differs from them, then during the interface movement the background view and the backboard view move in the same direction, for example the direction of the sliding operation, while the element views move in the opposite direction, so that the change of the interface content can be intuitively displayed as the interface moves. Of course, the three views may also move in the same direction with the element views and the backboard view at different speeds, so that at a certain frame during the movement an element view exceeds the backboard view and is cut, and only part of the view is displayed.
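A small sketch of the direction variant: the first and third directions follow the slide while the second runs opposite, so the movement states differ in direction even when the speeds are equal. The sign convention is an assumption of this sketch.

```kotlin
// Hypothetical direction assignment: +1 means the same direction as the slide, -1 the opposite.
data class Directions(val first: Int, val second: Int, val third: Int)

fun directionsFor(slideSign: Int): Directions = Directions(
    first = slideSign,    // background view moves with the slide
    third = slideSign,    // backboard view moves with the slide
    second = -slideSign   // element views move in the opposite direction
)
```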
In an embodiment of the present application, the element view includes a view of an icon and/or a view of a widget. An icon is a shortcut icon, such as an application icon. A widget is a small component displayed on the desktop interface, such as a weather widget or a time widget.
In an alternative embodiment of the present application, the movement state may also be characterized by distance, that is, the movement distance of the background view differs from that of the element views, and the movement distance of the backboard view differs from that of the element views. Taking the movement of the backboard view and the element views as an example, moving the interface according to the operation instruction includes: determining a first displacement of the backboard view and the element views according to the operation instruction; and determining a second displacement of the element views according to the first displacement and an adjustment coefficient, so that during the movement the backboard view is moved by the first displacement and the element views are moved by the second displacement. As shown in fig. 6, the square is the backboard view and the circle is the element view. A is the initial state: before the interface is slid, the backboard view and the element view are relatively static, i.e., in the normal display state of the interface. After the interface has been slid a certain distance, the corresponding first displacement of the backboard view is delta, and the element view is also moved by the first displacement delta; at this point there is still no displacement between the element view and the backboard view. The element view is then adjusted again: a second displacement of the element view is determined from the first displacement and the adjustment coefficient, for example by shifting the element view back by 0.75 delta in C, so that the element view is displaced by 0.25 delta relative to its initial position. Comparing the state of the element view before sliding in A with its state after the movement in C in fig. 6, the part of the element view beyond the backboard view is cut and no longer displayed, because the translation distance of the element view is smaller than that of the backboard view.
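The fig. 6 example reduces to simple arithmetic, reproduced below with the 0.75 coefficient from the text. Treating the second displacement as a reverse correction applied on top of the first displacement is one reading of the passage and is flagged here as an assumption.

```kotlin
// Worked version of the fig. 6 example: the backboard moves by delta, the element view
// is then corrected backwards by coefficient * delta, leaving a net element displacement.
data class Displacements(val backboard: Float, val elementNet: Float, val overhang: Float)

fun displacementExample(delta: Float, coefficient: Float = 0.75f): Displacements {
    val backboardShift = delta                       // first displacement (intermediate state in fig. 6)
    val reverseCorrection = coefficient * delta      // adjustment applied to the element view
    val elementNet = delta - reverseCorrection       // 0.25 * delta for coefficient = 0.75 (state C)
    val overhang = backboardShift - elementNet       // part of the element view left behind, i.e. cut
    return Displacements(backboardShift, elementNet, overhang)
}

// displacementExample(40f) -> backboard moved 40 px, element net 10 px, and 30 px of the
// element view falls outside the backboard and is no longer displayed.
```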
Fig. 7 shows an exemplary interface, in which the element views are icon views. The first image from the left is the first interface before the movement, the second image from the left and the first image from the right are two screen images during the movement, and the second image from the right is the second interface after the movement. Before the movement, the right side of the icon view in the first interface already exceeds the backboard view and is cut; the first interface is then slid to the left to switch to the second interface. As the first interface slides, the sliding speed of the backboard view is the same as that of the first interface, while the sliding speed of the icon view is lower than that of the backboard view, so the cut part on the right side of the icon view grows. As the second interface slides in, its icon view is gradually displayed on the terminal screen, with its right side also cut; in the second interface the sliding speed of the icon view is greater than that of the backboard view, and the sliding speed of the backboard view is the same as that of the second interface, so the cut part on the right side of the icon view shrinks as the second interface slides and more of the icon view is displayed. This continues until the first interface moves off the screen and the screen switches to display the second interface, in which the right side of the icon view is cut. In this way, each interface clearly indicates the movement of the interface.
Therefore, the element view can be adjusted according to the adjustment of the interface, and the element view and the backboard view can move in a layered mode in the interface moving process, so that the movement of the interface is displayed more three-dimensionally.
In the embodiments of the present application, the displacement adjustment information of the element views can be obtained from the interface adjustment information by means of a coefficient calculation, and the adjustment directions of the element views and the interface may be the same or different. The element views and the interface can also be adjusted at a certain angle to each other, for example the icon displacement may form an angle of 0 to ±180 degrees with the interface sliding direction, and the element views may also be stretched or zoomed.
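Where the element-view displacement forms an angle with the interface sliding direction, the adjustment can be expressed as scaling the interface displacement by a coefficient and rotating it by that angle. The 2-D vector sketch below is a generic illustration of that idea, not the patent's formula.

```kotlin
import kotlin.math.cos
import kotlin.math.sin

// Hypothetical: derive the element-view displacement from the interface displacement
// using an adjustment coefficient and an angle between 0 and ±180 degrees.
fun elementDisplacement(
    interfaceDx: Float, interfaceDy: Float,
    coefficient: Float,        // scales the magnitude of the displacement
    angleDegrees: Float        // angle between element movement and interface sliding direction
): Pair<Float, Float> {
    val rad = Math.toRadians(angleDegrees.toDouble())
    val dx = coefficient * (interfaceDx * cos(rad) - interfaceDy * sin(rad))
    val dy = coefficient * (interfaceDx * sin(rad) + interfaceDy * cos(rad))
    return dx.toFloat() to dy.toFloat()
}
```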
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the embodiments are not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the embodiments. Further, those skilled in the art will also appreciate that the embodiments described in the specification are presently preferred and that no particular act is required of the embodiments of the application.
The embodiment of the application also provides an interface adjusting device.
Referring to fig. 8, a block diagram of an interface adjusting apparatus according to an embodiment of the present disclosure is shown, which may specifically include the following modules:
the receiving module 802 is configured to receive an operation instruction.
An adjusting module 804, configured to move an interface according to the operation instruction, where the interface includes a background view and at least one element view displayed in an overlapping manner on the background view; the background view and the at least one element view are in different states of movement.
In an optional embodiment, the background view and the at least one element view have different moving states, and the method includes: the movement speed of the background view and the movement speed of the at least one element view are different.
In another optional embodiment, the background view and the at least one element view have different moving states, including: the background view moves with the interface and the at least one element view moves and changes size with the interface.
The interface further includes: a backboard view carrying the element views, where the backboard view and the at least one element view are in different movement states. The backboard view and the at least one element view having different movement states includes: the moving speed of the backboard view and the moving speed of the at least one element view are different. It may also include: the backboard view moves with the interface, while the at least one element view moves with the interface and changes size.
The adjusting module 804 is configured to determine, according to the operation instruction, a moving speed of the background view and a moving speed of the at least one element view respectively; and moving the background view according to the moving speed of the background view, and moving the corresponding at least one element view according to the moving speed of the at least one element view.
The adjusting module 804 is configured to determine, according to the operation instruction, a moving speed of the background view, a moving speed of the at least one element view, and a moving speed of the backboard view, respectively; and moving the background view according to the moving speed of the background view, moving the corresponding at least one element view according to the moving speed of the at least one element view, and moving the corresponding backboard view according to the moving speed of the backboard view.
The adjusting module 804 is configured to determine, according to the operation instruction, a moving speed of the background view, a moving speed of the backboard view, and a moving speed and size information of the element view; moving the background view according to the moving speed of the background view; and moving the corresponding at least one element view according to the moving speed of the at least one element view, and adjusting the size of the at least one element view in the moving process according to the size information.
The adjusting module 804 is configured to determine, according to the operation instruction, a moving speed of the background view, and determine moving speed and size information of the element view; moving the background view according to the moving speed of the background view, and moving the corresponding backboard view according to the moving speed of the backboard view; and moving the corresponding at least one element view according to the moving speed of the at least one element view, and adjusting the size of the at least one element view in the moving process according to the size information.
The adjusting module 804 is further configured to cut a partial view of the at least one element view beyond the backboard view. Specifically, drawing a partial view of the at least one element view within the backboard view, and stopping drawing a partial view of the at least one element view beyond the backboard view.
Wherein the operation instruction is triggered according to at least one of the following operations: gesture operation, biometric operation, touch operation, press operation.
The touch operation includes a sliding operation, and the adjusting module 804 is configured to slide the interface according to a sliding operation instruction. The operation instruction is triggered according to the sliding operation, and the sliding speed of the background view, the backboard view and the element view corresponds to the sliding speed of the sliding operation.
The adjusting module 804 is further configured to determine a first direction of the background view and a second direction of the at least one element view according to the operation instruction.
The adjusting module 804 is further configured to determine, according to the operation instruction, a first direction of the background view, a second direction of the at least one element view, and a third direction of the backboard view.
Wherein the element view comprises: a view of an icon and/or a view of a widget.
In this way, the element views can be adjusted according to the adjustment of the interface, and the element views and the backboard view can move in a layered manner while the interface moves, so that the movement of the interface is displayed more three-dimensionally. In the embodiments of the present application, the displacement adjustment information of the element views can be obtained from the interface adjustment information by means of a coefficient calculation, and the adjustment directions of the element views and the interface may be the same or different. The element views and the interface can also be adjusted at a certain angle to each other, for example the icon displacement may form an angle of 0 to ±180 degrees with the interface sliding direction, and the element views may also be stretched or zoomed.
The present application further provides a non-volatile readable storage medium, where one or more modules (programs) are stored in the storage medium, and when the one or more modules are applied to a terminal device, the one or more modules may cause the terminal device to execute instructions (instructions) of method steps in the present application.
In an alternative embodiment, the method comprises: one or more machine-readable media having instructions stored thereon, which when executed by one or more processors, cause a terminal device to perform one or more of the interface display methods described in embodiments of the present application.
Fig. 9 is a schematic hardware structure diagram of an apparatus according to an embodiment of the present application. The device may include various devices such as a server, a terminal device, a temperature control device, and the like. As shown in fig. 9, the terminal device may include an input device 90, a processor 91, an output device 92, a memory 93, and at least one communication bus 94. The communication bus 94 is used to enable communication connections between the elements. The memory 93 may comprise a high speed RAM memory, and may also include a non-volatile storage NVM, such as at least one disk memory, in which various programs may be stored in the memory 93 for performing various processing functions and implementing the method steps of the present embodiment.
Alternatively, the processor 91 may be implemented by, for example, a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a controller, a microcontroller, a microprocessor, or other electronic components, and the processor 91 is coupled to the input device 90 and the output device 92 through a wired or wireless connection.
Alternatively, the input device 90 may include a variety of input devices, such as at least one of a user-oriented user interface, a device-oriented device interface, a software-programmable interface, a camera, and a sensor. Optionally, the device interface facing the device may be a wired interface for data transmission between devices, or may be a hardware plug-in interface (e.g., a USB interface, a serial port, etc.) for data transmission between devices; optionally, the user-facing user interface may be, for example, a user-facing control key, a voice input device for receiving voice input, and a touch sensing device (e.g., a touch screen with a touch sensing function, a touch pad, etc.) for receiving user touch input; optionally, the programmable interface of the software may be, for example, an entry for a user to edit or modify a program, such as an input pin interface or an input interface of a chip; optionally, the transceiver may be a radio frequency transceiver chip with a communication function, a baseband processing chip, a transceiver antenna, and the like. An audio input device such as a microphone may receive voice data. The output device 92 may include a display, a sound, or other output device.
In this embodiment, the processor of the terminal device includes a module for executing the functions of the modules of the data processing apparatus in each device, and specific functions and technical effects may refer to the foregoing embodiments, which are not described herein again.
Fig. 10 is a schematic hardware structure diagram of a terminal device according to another embodiment of the present application. FIG. 10 is a specific embodiment of FIG. 9 in an implementation. As shown in fig. 10, the terminal device of the present embodiment includes a processor 101 and a memory 102.
The processor 101 executes the computer program code stored in the memory 102 to implement the interface adjusting method of fig. 1 to 7 in the above embodiments.
The memory 102 is configured to store various types of data to support operations at the terminal device. Examples of such data include instructions for any application or method operating on the terminal device, such as messages, pictures, videos, and so forth. The memory 102 may include a Random Access Memory (RAM) and may also include a non-volatile memory (non-volatile memory), such as at least one disk memory.
Optionally, the processor 101 is provided in the processing assembly 100. The terminal device may further include: a communication component 103, a power component 104, a multimedia component 105, an audio component 106, an input/output interface 107 and/or a sensor component 108. The specific components included in the terminal device are set according to actual requirements, which is not limited in this embodiment.
The processing component 100 generally controls the overall operation of the terminal device. The processing component 100 may include one or more processors 101 to execute instructions to perform all or part of the steps of the methods of fig. 1-7 described above. Further, the processing component 100 can include one or more modules that facilitate interaction between the processing component 100 and other components. For example, the processing component 100 may include a multimedia module to facilitate interaction between the multimedia component 105 and the processing component 100.
The power supply component 104 provides power to the various components of the terminal device. The power components 104 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the terminal device.
The multimedia component 105 includes a display screen that provides an output interface between the terminal device and the user. In some embodiments, the display screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the display screen includes a touch panel, the display screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
The audio component 106 is configured to output and/or input audio signals. For example, the audio component 106 may include a Microphone (MIC) configured to receive external audio signals when the terminal device is in an operational mode, such as a voice recognition mode. The received audio signal may further be stored in the memory 102 or transmitted via the communication component 103. In some embodiments, the audio component 106 also includes a speaker for outputting audio signals.
The input/output interface 107 provides an interface between the processing component 100 and peripheral interface modules, which may be click wheels, buttons, etc. These buttons may include, but are not limited to: a volume button, a start button, and a lock button.
The sensor component 108 includes one or more sensors for providing various aspects of status assessment for the terminal device. For example, the sensor component 108 can detect the open/closed status of the terminal device, the relative positioning of the components, the presence or absence of user contact with the terminal device. The sensor assembly 108 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact, including detecting the distance between the user and the terminal device. In some embodiments, the sensor assembly 108 may also include a camera or the like.
The communication component 103 is configured to facilitate wired or wireless communication between the terminal device and other devices. The terminal device may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In one embodiment, the terminal device may include a SIM card slot therein for inserting a SIM card therein, so that the terminal device may log onto a GPRS network to establish communication with the server via the internet.
From the above, the communication component 103, the audio component 106, the input/output interface 107 and the sensor component 108 involved in the embodiment of fig. 10 can be implemented as the input device in the embodiment of fig. 9.
In an apparatus of this embodiment, comprising: one or more processors; and one or more machine readable media having instructions stored thereon, which when executed by the one or more processors, cause the terminal device to perform a method as described in one or more of the embodiments of the application.
An embodiment of the present application further provides an operating system for a terminal device. As shown in fig. 11, the operating system of the device includes: a receiving unit 1102 and an adjusting unit 1104.
The receiving unit 1102 is configured to receive an operation instruction.
An adjusting unit 1104, configured to move an interface according to the operation instruction, where the interface includes a background view and at least one element view displayed in an overlapping manner on the background view; the background view and the at least one element view are in different states of movement.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present application have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including the preferred embodiment and all such alterations and modifications as fall within the true scope of the embodiments of the application.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
The interface adjustment method, interface adjustment apparatus, terminal device, storage medium, and operating system provided by the present application have been described in detail above. Specific examples are used herein to explain the principles and implementations of the present application, and the description of the above embodiments is only intended to help understand the method and its core idea. Meanwhile, a person skilled in the art may, according to the idea of the present application, make changes to the specific implementations and the application scope. In summary, the content of this specification should not be construed as limiting the present application.
Claims (18)
1. An interface adjustment method, comprising:
receiving an operation instruction;
moving an interface according to the operation instruction, wherein the interface comprises a background view, at least one element view superposed and displayed on the background view and a backboard view carrying the element view;
the background view and the at least one element view are in different movement states;
the backboard view and the at least one element view are in different movement states, a partial view of the at least one element view located within the backboard view is drawn, and drawing of the partial view of the at least one element view beyond the backboard view is stopped, wherein the different movement states comprise different moving speeds.
2. The method of claim 1, wherein the background view and the at least one element view are in different movement states in that:
the background view and the at least one element view move at different speeds.
3. The method of claim 1, wherein the background view and the at least one element view are in different movement states in that:
the background view moves with the interface and the at least one element view moves and changes size with the interface.
4. The method of claim 1, wherein the backboard view and the at least one element view are in different movement states in that:
the moving speed of the backboard view and the moving speed of the at least one element view are different.
5. The method of claim 1, wherein the backboard view and the at least one element view are in different movement states in that:
the backboard view moves with the interface and the at least one element view moves and changes size with the interface.
6. The method of claim 2, wherein moving the interface according to the operation instruction comprises:
determining the moving speed of the background view and the moving speed of the at least one element view respectively according to the operation instruction; and
moving the background view according to the moving speed of the background view, and moving the corresponding at least one element view according to the moving speed of the at least one element view.
7. The method of claim 4, wherein moving the interface according to the operation instruction comprises:
determining the moving speed of the background view, the moving speed of the at least one element view and the moving speed of the backboard view respectively according to the operation instruction; and
moving the background view according to the moving speed of the background view, moving the corresponding at least one element view according to the moving speed of the at least one element view, and moving the corresponding backboard view according to the moving speed of the backboard view.
8. The method of claim 3, wherein moving the interface according to the operation instruction comprises:
determining the moving speed of the background view and the moving speed of the backboard view according to the operation instruction, and determining the moving speed and size information of the at least one element view;
moving the background view according to the moving speed of the background view; and
moving the corresponding at least one element view according to the moving speed of the at least one element view, and adjusting the size of the at least one element view during the movement according to the size information.
9. The method of claim 5, wherein moving the interface according to the operation instruction comprises:
determining the moving speed of the background view, and determining the moving speed and size information of the at least one element view, according to the operation instruction;
moving the background view according to the moving speed of the background view, and moving the corresponding backboard view according to the moving speed of the backboard view; and
moving the corresponding at least one element view according to the moving speed of the at least one element view, and adjusting the size of the at least one element view during the movement according to the size information.
10. The method according to any one of claims 1-9, wherein the operation instruction is triggered according to at least one of the following operations: a gesture operation, a biometric operation, a touch operation, or a press operation.
11. The method according to claim 10, wherein the touch operation comprises a slide operation, and moving the interface according to the operation instruction comprises: sliding the interface according to the slide operation.
12. The method according to claim 1, wherein the operation instruction is triggered according to a sliding operation, and the moving speeds of the background view, the backboard view and the element view correspond to the sliding speed of the sliding operation.
13. The method according to claim 6 or 8, wherein moving the interface according to the operation instruction further comprises: determining a first direction of the background view and a second direction of the at least one element view according to the operation instruction.
14. The method according to claim 7 or 9, wherein moving the interface according to the operation instruction further comprises: determining a first direction of the background view, a second direction of the at least one element view and a third direction of the backboard view according to the operation instruction.
15. The method of any of claims 1-9, wherein the element view comprises: a view of an icon and/or a view of a widget.
16. An interface adjustment device, comprising:
a receiving module configured to receive an operation instruction; and
an adjusting module configured to move an interface according to the operation instruction, wherein the interface comprises a background view, at least one element view displayed superimposed on the background view, and a backboard view carrying the at least one element view; the background view and the at least one element view are in different movement states; the backboard view and the at least one element view are in different movement states, a part of the at least one element view located within the backboard view is drawn, and drawing of a part of the at least one element view extending beyond the backboard view is stopped, wherein the different movement states comprise different moving speeds.
17. A terminal device, comprising:
one or more processors; and
one or more machine-readable media having instructions stored thereon that, when executed by the one or more processors, cause the terminal device to perform the method of one or more of claims 1-15.
18. One or more machine-readable media having instructions stored thereon which, when executed by one or more processors, cause a terminal device to perform the method recited in one or more of claims 1-15.
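
To make the claimed behavior concrete, the following is a minimal, self-contained Kotlin sketch (not the patented implementation) of the kind of layout logic recited in claims 1-5 and 16: for a given interface movement, the background view, the backboard view and the element views are shifted by different amounts, the element views may also be rescaled while moving, and only the portion of an element view that still lies inside the backboard view is drawn. All class names, speed factors and the scale value are illustrative assumptions rather than values taken from the patent.

```kotlin
// Illustrative sketch only: names and factor values are assumptions, not from the patent.
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun offset(dx: Float) = Rect(left + dx, top, right + dx, bottom)

    // Intersection with another rectangle, or null if they do not overlap.
    fun intersect(other: Rect): Rect? {
        val l = maxOf(left, other.left)
        val r = minOf(right, other.right)
        val t = maxOf(top, other.top)
        val b = minOf(bottom, other.bottom)
        return if (l < r && t < b) Rect(l, t, r, b) else null
    }
}

class ParallaxInterface(
    private val baseBackground: Rect,
    private val baseBackboard: Rect,
    private val baseElements: List<Rect>,
    // Per-layer factors (illustrative): how far each view moves per pixel of interface movement.
    private val backgroundFactor: Float = 0.3f,
    private val backboardFactor: Float = 1.0f,
    private val elementFactor: Float = 1.2f,
    // Scale applied to element views while the interface is moving (size change, as in claims 3, 5, 8, 9).
    private val movingScale: Float = 0.9f
) {
    data class Layout(val background: Rect, val backboard: Rect, val drawnElements: List<Rect>)

    /** Compute the layout for a total horizontal offset of the interface. */
    fun layoutFor(offsetX: Float): Layout {
        val background = baseBackground.offset(offsetX * backgroundFactor)
        val backboard = baseBackboard.offset(offsetX * backboardFactor)
        val scale = if (offsetX == 0f) 1f else movingScale
        val drawn = baseElements
            .map { scaleAboutCenter(it.offset(offsetX * elementFactor), scale) }
            // Only the part of an element view inside the backboard view is kept for drawing;
            // the part extending beyond the backboard view is dropped.
            .mapNotNull { it.intersect(backboard) }
        return Layout(background, backboard, drawn)
    }

    private fun scaleAboutCenter(r: Rect, s: Float): Rect {
        val cx = (r.left + r.right) / 2f
        val cy = (r.top + r.bottom) / 2f
        val hw = (r.right - r.left) * s / 2f
        val hh = (r.bottom - r.top) * s / 2f
        return Rect(cx - hw, cy - hh, cx + hw, cy + hh)
    }
}

fun main() {
    val ui = ParallaxInterface(
        baseBackground = Rect(0f, 0f, 1080f, 1920f),
        baseBackboard = Rect(40f, 200f, 1040f, 1700f),
        baseElements = listOf(Rect(100f, 300f, 300f, 500f), Rect(900f, 300f, 1100f, 500f))
    )
    // The interface slides 200 px to the left: each layer moves by a different amount,
    // element views shrink slightly, and only their portions inside the backboard are drawn.
    println(ui.layoutFor(-200f).drawnElements)
}
```

Computing each layout from the interface's absolute offset, rather than accumulating per-frame increments, keeps the rescaling from compounding across frames; this is an implementation choice, not something the claims require.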
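Claims 6-14 further recite deriving, from a single operation instruction, a separate moving speed (and, where applicable, a direction and size information) for each view, with the view speeds corresponding to the sliding speed of the gesture as in claim 12. A hypothetical mapping from a slide gesture to such per-view parameters might look as follows; the factor values, type names and the 0.9 scale are assumptions made only for illustration.

```kotlin
// Hypothetical mapping from one slide gesture to per-view movement parameters.
enum class Direction { LEFT, RIGHT }

data class MoveParams(
    val speedPxPerMs: Float,    // moving speed of the view
    val direction: Direction,   // moving direction of the view
    val targetScale: Float = 1f // size information (1f = size unchanged)
)

data class InterfaceMoveParams(
    val background: MoveParams,
    val backboard: MoveParams,
    val element: MoveParams
)

/** Derive per-view speeds, directions and size information from the slide gesture. */
fun paramsForSlide(slideSpeedPxPerMs: Float, direction: Direction): InterfaceMoveParams =
    InterfaceMoveParams(
        background = MoveParams(slideSpeedPxPerMs * 0.3f, direction),
        backboard = MoveParams(slideSpeedPxPerMs * 1.0f, direction),
        element = MoveParams(slideSpeedPxPerMs * 1.2f, direction, targetScale = 0.9f)
    )

fun main() {
    // A leftward slide at 1.5 px/ms yields three different per-view speeds,
    // so the layers visibly separate while the interface moves.
    println(paramsForSlide(1.5f, Direction.LEFT))
}
```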
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710434670.8A CN109032710B (en) | 2017-06-09 | 2017-06-09 | Interface adjusting method, device, equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710434670.8A CN109032710B (en) | 2017-06-09 | 2017-06-09 | Interface adjusting method, device, equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109032710A (en) | 2018-12-18 |
CN109032710B (en) | 2022-05-24 |
Family
ID=64628883
Family Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710434670.8A Active CN109032710B (en) | 2017-06-09 | 2017-06-09 | Interface adjusting method, device, equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109032710B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110308961B (en) * | 2019-07-02 | 2023-03-31 | 广州小鹏汽车科技有限公司 | Theme scene switching method and device of vehicle-mounted terminal |
CN114356196B (en) * | 2020-09-30 | 2022-12-20 | 荣耀终端有限公司 | Display method and electronic equipment |
CN113076495B (en) * | 2021-03-31 | 2024-06-21 | 维沃移动通信有限公司 | Content display method, device, electronic equipment and storage medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102713820A (en) * | 2011-11-16 | 2012-10-03 | 华为终端有限公司 | Operation interface management method, apparatus and mobile terminal |
CN103970397A (en) * | 2013-01-30 | 2014-08-06 | 腾讯科技(深圳)有限公司 | Rotary screen interface display method and rotary screen interface display device |
CN104360805A (en) * | 2014-11-28 | 2015-02-18 | 广东欧珀移动通信有限公司 | Application icon management method and application icon management device |
CN104793843A (en) * | 2015-03-26 | 2015-07-22 | 小米科技有限责任公司 | Desktop display method and device |
CN105843467A (en) * | 2016-03-17 | 2016-08-10 | 北京麒麟合盛网络技术有限公司 | Icon displaying method and device |
CN106502573A (en) * | 2016-11-22 | 2017-03-15 | 天脉聚源(北京)传媒科技有限公司 | A kind of method and device of view interface movement |
Family Cites Families (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6945663B2 (en) * | 2002-06-14 | 2005-09-20 | Tseng-Lu Chien | Tubular electro-luminescent light incorporated with device(s) |
US7102615B2 (en) * | 2002-07-27 | 2006-09-05 | Sony Computer Entertainment Inc. | Man-machine interface using a deformable device |
CN103294424A (en) * | 2012-02-23 | 2013-09-11 | 联想(北京)有限公司 | Mobile terminal and interface display method thereof |
CN103051782B (en) * | 2012-12-10 | 2016-03-23 | 广东欧珀移动通信有限公司 | A kind of mobile terminal projection method of adjustment and device |
US9251762B2 (en) * | 2012-12-19 | 2016-02-02 | Microsoft Technology Licensing, Llc. | Runtime transformation of images to match a user interface theme |
CN103218226A (en) * | 2013-04-08 | 2013-07-24 | 北京小米科技有限责任公司 | Method and device for processing application display interface |
US20150052468A1 (en) * | 2013-08-14 | 2015-02-19 | Peter Adany | High dynamic range parameter adjustment in a graphical user interface using graphical moving scales |
CN103530117A (en) * | 2013-09-30 | 2014-01-22 | 山西云途信息技术有限公司 | Method and device for adapting to screens, of different sizes, of mobile terminals |
CN104182047B (en) * | 2014-08-26 | 2017-09-22 | 联想(北京)有限公司 | A kind of information processing method and electronic equipment |
CN105468656B (en) * | 2014-09-12 | 2019-12-20 | 腾讯科技(深圳)有限公司 | Webpage background image generation method and system |
CN104850395B (en) * | 2015-04-17 | 2018-11-30 | 魅族科技(中国)有限公司 | interface display method and system |
CN105278947B (en) * | 2015-06-18 | 2019-07-26 | 维沃移动通信有限公司 | The method and device of interface element arrangement |
CN105138317B (en) * | 2015-07-24 | 2018-11-13 | 安一恒通(北京)科技有限公司 | Window display processing method and device for terminal device |
CN105245728A (en) * | 2015-10-30 | 2016-01-13 | 上海斐讯数据通信技术有限公司 | View layout method and device in display screen and mobile terminal |
CN106686200B (en) * | 2015-11-09 | 2020-01-31 | 五八同城信息技术有限公司 | Mobile application program updating method, mobile terminal and updating system |
CN105468242B (en) * | 2015-11-12 | 2019-01-04 | 广东维沃软件技术有限公司 | Mobile terminal interface display method and mobile terminal thereof |
CN105786353A (en) * | 2016-02-19 | 2016-07-20 | 努比亚技术有限公司 | Screen locking interface adjusting method and device |
CN105930077A (en) * | 2016-04-12 | 2016-09-07 | 广东欧珀移动通信有限公司 | Method and device for adjusting size of objects displayed by screens |
CN106126039B (en) * | 2016-06-30 | 2019-06-07 | 维沃移动通信有限公司 | Operation interface display method and mobile terminal |
CN106201269A (en) * | 2016-07-15 | 2016-12-07 | 青岛海信电器股份有限公司 | Interface control method and terminal unit |
CN106354520B (en) * | 2016-09-30 | 2020-03-20 | 维沃移动通信有限公司 | Interface background switching method and mobile terminal |
CN110427236B (en) * | 2016-10-18 | 2023-06-09 | 广州视睿电子科技有限公司 | Rendering method and device |
CN106775773A (en) * | 2017-01-20 | 2017-05-31 | 深圳市金立通信设备有限公司 | A kind of theme control method and terminal device |
Also Published As
Publication number | Publication date |
---|---|
CN109032710A (en) | 2018-12-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101995486B1 (en) | Mobile terminal and control method thereof | |
EP3661187A1 (en) | Photography method and mobile terminal | |
US9058095B2 (en) | Method for displaying data in mobile terminal having touch screen and mobile terminal thereof | |
US8289292B2 (en) | Electronic device with touch input function and touch input method thereof | |
KR101885655B1 (en) | Mobile terminal | |
KR102045458B1 (en) | Mobile terminal and method of controlling the same | |
EP2667293B1 (en) | Mobile terminal and control method thereof | |
WO2017177597A1 (en) | Entity button assembly, terminal, touch control response method and apparatus | |
US20140165013A1 (en) | Electronic device and page zooming method thereof | |
US20130212483A1 (en) | Apparatus and method for providing for remote user interaction | |
US20130326415A1 (en) | Mobile terminal and control method thereof | |
KR20140051719A (en) | Mobile terminal and control method thereof | |
WO2013153266A1 (en) | Apparatus and method for providing a digital bezel | |
CN107172347B (en) | Photographing method and terminal | |
CN106980445A (en) | Manipulate the awaking method and device, electronic equipment of menu | |
CN109032898A (en) | A kind of display methods of icon, device, equipment and storage medium | |
WO2017161824A1 (en) | Method and device for controlling terminal | |
CN109032710B (en) | Interface adjusting method, device, equipment and storage medium | |
KR102063767B1 (en) | Mobile terminal and control method thereof | |
US20210109699A1 (en) | Data Processing Method and Mobile Device | |
KR20140016107A (en) | Mobile terminal and control method thereof | |
CN107168566B (en) | Operation mode control method and device and terminal electronic equipment | |
CN111522498A (en) | Touch response method and device and storage medium | |
KR101818114B1 (en) | Mobile terminal and method for providing user interface thereof | |
JP2018503883A (en) | Touch control button, touch control panel and touch control terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
2020-12-17 | TA01 | Transfer of patent application right | Effective date of registration: 2020-12-17; Applicant before: Alibaba Group Holding Ltd., P.O. Box 847, Fourth Floor, Capital Building, Grand Cayman, Cayman Islands; Applicant after: Zebra smart travel network (Hong Kong) Limited, Room 603, 6/F, Roche Plaza, 788 Cheung Sha Wan Road, Kowloon, Hong Kong |
| GR01 | Patent grant | |