CN110928464B - User interface display method, device, equipment and medium - Google Patents


Info

Publication number
CN110928464B
CN110928464B
Authority
CN
China
Prior art keywords
layer
foreground
background
displacement
user interface
Prior art date
Legal status
Active
Application number
CN201911183743.6A
Other languages
Chinese (zh)
Other versions
CN110928464A (en)
Inventor
李烈强
谢天
林晓文
骆玘
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201911183743.6A
Publication of CN110928464A
Application granted
Publication of CN110928464B

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces


Abstract

The application discloses a display method, apparatus, device and medium for a user interface, and relates to the field of virtual worlds. The method comprises the following steps: displaying a first user interface, wherein the first user interface comprises image content jointly displayed, when the foreground layer and the background layer are at a first relative position, by the first background content of the background layer shown through the transparent area and the foreground content of the non-transparent area; controlling the foreground layer and the background layer to undergo relative displacement; and displaying a second user interface, wherein the second user interface comprises image content jointly displayed, when the foreground layer and the background layer are at a second relative position, by the second background content of the background layer shown through the transparent area and the foreground content of the non-transparent area. This solves the problems in the related art that the large data volume of dynamic images makes the user interface slow to load and wastes data traffic.

Description

User interface display method, device, equipment and medium
Technical Field
The embodiments of the application relate to the field of computers, and in particular to a display method, apparatus, device and medium for a user interface.
Background
A User Interface (UI) is a human-machine interaction portal provided by an application. A user interface may comprise images, text, video, audio, and the like. An image is a flat medium rich in presentation content, often used as an advertising vehicle in a user interface.
In the related art, one or more images are arranged in a user interface, each image occupies a region of fixed size, and text information appears between different images. The user may scroll up and down to view different content on the user interface.
When an image on the user interface is an advertising image, many users will habitually ignore it. To attract users to view them, many advertising images are dynamic images, but the large data volume of dynamic images makes the user interface slow to load and wastes data traffic.
Disclosure of Invention
The embodiments of the application provide a display method, apparatus, device and medium for a user interface, which can solve the technical problems in the related art that the large data volume of dynamic images makes the user interface slow to load and wastes data traffic. The technical solution is as follows:
in one aspect, a method for displaying a user interface is provided, where the user interface includes: a foreground layer and a background layer, the foreground layer including transparent regions and non-transparent regions thereon, the method comprising:
Displaying a first user interface, wherein the first user interface comprises image content jointly displayed, when the foreground layer and the background layer are at a first relative position, by the first background content of the background layer shown through the transparent area and the foreground content of the non-transparent area;
controlling the foreground image layer and the background image layer to generate relative displacement;
and displaying a second user interface, wherein the second user interface comprises image content jointly displayed, when the foreground layer and the background layer are at a second relative position, by the second background content of the background layer shown through the transparent area and the foreground content of the non-transparent area.
In another aspect, there is provided a display device of a user interface, the user interface comprising: a foreground layer and a background layer, the foreground layer including transparent regions and non-transparent regions thereon, the apparatus comprising:
the display module is used for displaying a first user interface, wherein the first user interface comprises image content jointly displayed, when the foreground layer and the background layer are at a first relative position, by the first background content of the background layer shown through the transparent area and the foreground content of the non-transparent area;
The control module is used for controlling the relative displacement of the foreground image layer and the background image layer;
the display module is further configured to display a second user interface, where the second user interface includes image content jointly displayed, when the foreground layer and the background layer are at a second relative position, by the second background content of the background layer shown through the transparent area and the foreground content of the non-transparent area.
In another aspect, a computer device is provided, the computer device including a processor and a memory having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, the at least one instruction, the at least one program, the set of codes, or the set of instructions being loaded and executed by the processor to implement the method of displaying a user interface as described in the above aspect.
In another aspect, there is provided a computer-readable storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, the at least one instruction, the at least one program, the set of codes, or the set of instructions being loaded and executed by a processor to implement the method of displaying a user interface as described in the above aspect.
The technical solution provided by the embodiments of the application brings at least the following beneficial effects:
the image is arranged to comprise a foreground layer and a background layer, where the foreground layer contains a transparent area and a non-transparent area. By controlling the relative displacement between the two layers, different relative positions produce different image content, formed by the background content of the background layer shown through the transparent area together with the foreground content of the non-transparent area. This not only attracts users to view the image, but also reduces the image data volume, speeds up image loading, and saves data traffic.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a block diagram of a computer system provided in accordance with an exemplary embodiment of the present application;
fig. 2 is a schematic structural view of a terminal according to an exemplary embodiment of the present application;
FIG. 3 is a method flow diagram of a method for displaying a user interface provided by an exemplary embodiment of the present application;
FIG. 4 is a schematic diagram of layers provided by another exemplary embodiment of the present application;
FIG. 5 is a method flow diagram of a method for displaying a user interface provided in another exemplary embodiment of the present application;
FIG. 6 is a schematic diagram of a foreground layer and a background layer provided by another exemplary embodiment of the present application;
FIG. 7 is an interface diagram of a display method of a user interface provided by another exemplary embodiment of the present application in one exemplary embodiment;
FIG. 8 is an interface diagram of a display method of a user interface provided by another exemplary embodiment of the present application in one exemplary embodiment;
FIG. 9 is an interface diagram of a display method of a user interface provided by another exemplary embodiment of the present application in one exemplary embodiment;
FIG. 10 is an interface diagram of a display method of a user interface provided by another exemplary embodiment of the present application in one exemplary embodiment;
FIG. 11 is a schematic illustration of a foreground layer and a background layer provided by another exemplary embodiment of the present application;
FIG. 12 is an interface diagram of a display method of a user interface provided by another exemplary embodiment of the present application in one exemplary embodiment;
FIG. 13 is an interface diagram of a display method of a user interface provided by another exemplary embodiment of the present application in one exemplary embodiment;
FIG. 14 is a method flow diagram of a method for displaying a user interface provided by another exemplary embodiment of the present application;
FIG. 15 is a schematic diagram of a foreground layer and a background layer provided by another exemplary embodiment of the present application;
FIG. 16 is a schematic illustration of a foreground layer and a background layer provided by another exemplary embodiment of the present application;
FIG. 17 is a schematic diagram of a foreground layer and a background layer provided by another exemplary embodiment of the present application;
FIG. 18 is a schematic diagram of a foreground layer and a background layer provided by another exemplary embodiment of the present application;
FIG. 19 is a method flow diagram of a method for displaying a user interface provided by another exemplary embodiment of the present application;
FIG. 20 is an interface diagram of a display method of a user interface provided by another exemplary embodiment of the present application in one exemplary embodiment;
FIG. 21 is a schematic diagram of a foreground layer and a background layer provided by another exemplary embodiment of the present application;
FIG. 22 is an interface diagram of a display method of a user interface provided by another exemplary embodiment of the present application in one exemplary embodiment;
FIG. 23 is a schematic diagram of a foreground layer and a background layer provided by another exemplary embodiment of the present application;
FIG. 24 is an interface diagram of a display method of a user interface provided by another exemplary embodiment of the present application in one exemplary embodiment;
FIG. 25 is a schematic illustration of a foreground layer and a background layer provided by another exemplary embodiment of the present application;
FIG. 26 is an interface diagram of a display method of a user interface provided by another exemplary embodiment of the present application in one exemplary embodiment;
FIG. 27 is a schematic illustration of a foreground layer and a background layer provided by another exemplary embodiment of the present application;
FIG. 28 is an interface diagram of a display method of a user interface provided by another exemplary embodiment of the present application in one exemplary embodiment;
FIG. 29 is a schematic diagram of a foreground layer and a background layer provided by another exemplary embodiment of the present application;
FIG. 30 is a block diagram of a display device of a user interface provided by another exemplary embodiment of the present application;
fig. 31 is a block diagram of a terminal provided in an exemplary embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail with reference to the accompanying drawings.
Referring to fig. 1, a schematic diagram of an implementation environment provided by an embodiment of the present application is shown. The implementation environment may include: a terminal 10 and a server 20.
The terminal 10 may be an electronic device such as a cell phone, desktop computer, tablet computer, game console, electronic book reader, multimedia playing device, wearable device, etc. The terminal 10 may be provided with a client of an application program capable of browsing pictures, such as a client of an application program capable of browsing merchandise pictures.
The server 20 is used to provide background services for clients of applications in the terminal 10, such as applications capable of picture browsing. For example, the server 20 may be a background server of the above-described application program (e.g., an application program capable of performing picture browsing). The server 20 may be a server, a server cluster comprising a plurality of servers, or a cloud computing service center.
The terminal 10 and the server 20 can communicate with each other via a network 30. The network 30 may be a wired network or a wireless network.
For example, when the terminal 10 receives a picture-browsing instruction from the user, the client of the picture-browsing application (such as a browser) installed on the terminal 10 may send a picture-acquisition message to the server 20 through the network 30. After receiving the acquisition message from the terminal 10, the server 20 selects a certain number of pictures from its stored pictures and sends them to the terminal 10 through the network 30; the terminal 10 then displays the pictures in its interface for the user to browse, completing the picture loading process.
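The request/selection/display exchange above can be sketched as follows. This is a hypothetical illustration of the flow, not the patent's implementation; the function names, message format, and picture names are all assumptions, and the network round trip is stood in for by a direct function call.

```python
# Hypothetical sketch of the picture-loading exchange between terminal 10
# and server 20; names and data are illustrative assumptions.

STORED_PICTURES = [f"pic_{i}.png" for i in range(10)]  # server-side picture store

def handle_fetch_request(count):
    """Server 20: on receiving an acquisition message, select a certain
    number of pictures from the stored pictures."""
    return STORED_PICTURES[:count]

def load_pictures(count):
    """Terminal 10: send the acquisition message and receive the pictures
    to display (the direct call stands in for the network 30)."""
    pictures = handle_fetch_request(count)
    return pictures

print(load_pictures(3))  # -> ['pic_0.png', 'pic_1.png', 'pic_2.png']
```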
In the embodiment of the method, the execution subject of each step can be a terminal. Referring to fig. 2, a schematic structural diagram of a terminal according to an embodiment of the present application is shown. The terminal 10 may include: motherboard 110, external input/output device 120, memory 130, external interface 140, touch system 150, and power supply 160.
Wherein, the motherboard 110 has integrated therein processing elements such as a processor and a controller.
The external input/output device 120 may include a display component (such as a display screen), a sound playing component (such as a speaker), a sound collecting component (such as a microphone), various types of keys, and the like.
The memory 130 has stored therein program codes and data.
The external interface 140 may include a headset interface, a charging interface, a data interface, and the like.
The touch system 150 may be integrated in a display component or key of the external input/output device 120, and the touch system 150 is used to detect a touch operation performed by a user on the display component or key.
The power supply 160 is used to power the other components of the terminal 10.
In an embodiment of the present application, the processor in the motherboard 110 may generate a user interface (e.g., a picture interface) by executing or invoking program codes and data stored in the memory, and present the generated user interface (e.g., the picture interface) through the external input/output device 120. In the process of displaying the user interface (e.g., the picture interface), the touch system 150 may detect a touch operation performed when the user interacts with the user interface (e.g., the picture interface) and respond to the touch operation.
In combination with the above description of the implementation environment, the method for displaying the user interface provided by the embodiments of the present application is described below, with the terminal shown in fig. 1 as the execution subject. The terminal runs an application program, and the application program is a program supporting picture browsing.
Fig. 3 is a method flowchart of a method for displaying a user interface according to an exemplary embodiment of the present application. The terminal 10 shown in fig. 1 is exemplified as the main execution body of the method, and an application program supporting picture browsing is run in the terminal. The user interface includes: the foreground image layer and the background image layer, wherein the foreground image layer comprises transparent areas and non-transparent areas, and the method at least comprises the following steps.
Step 101, displaying a first user interface, where the first user interface includes image content jointly displayed, when the foreground layer and the background layer are at a first relative position, by the first background content of the background layer shown through the transparent area and the foreground content of the non-transparent area.
The terminal displays a first user interface.
The user interface is a user interface corresponding to an application program, a web page, or an operating system in the terminal. Illustratively, an image is displayed on the user interface. The user may perform a displacement operation on the user interface; for example, the displacement operation includes at least one of sliding, scrolling, clicking, double-clicking, and pressing. Alternatively, the user interface may be an interface that does not respond to user displacement operations.
Taking an application capable of advertisement browsing as an example, the user interface may be an interface that displays advertisements in order to present their content to the user. For example, the user interface may include a plurality of advertisements as well as profiles of these advertisements, such as title, picture, source, time, and content summary.
Illustratively, the user interface includes a foreground layer and a background layer. A layer is a part of an image; a plurality of layers superimposed in order form an image. Illustratively, each layer has display elements, and the display elements on different layers can be edited separately without affecting one another. Illustratively, the layers are stacked in order from top to bottom, and a lower layer may show through a transparent region in the layer above it. For example, if the upper layer is fully transparent, the lower layer shows completely through it; if the left half of the upper layer is transparent and the right half opaque, half of the lower layer shows through the left transparent area while the other half cannot show through the upper layer; if the upper layer is opaque, the lower layer cannot show through it at all.
For example, as shown in fig. 4, there are two layers of the same size: a first layer 401 and a second layer 402. The first layer 401 includes a square transparent region 403 and is opaque everywhere else. The second layer 402 is an opaque layer. By placing the first layer 401 over the second layer 402 and overlapping them, the second layer 402 shows partially through the transparent region 403 of the first layer 401, finally forming a complete image 404.
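The two-layer compositing of fig. 4 can be sketched minimally as follows. This is an illustrative model only (an assumption, not the patent's implementation): each layer is a small grid of pixels, and `None` in the upper layer marks a fully transparent pixel through which the lower layer shows.

```python
# Minimal sketch of superimposing a layer with a transparent region over
# an opaque layer, as in Fig. 4. Hypothetical representation: layers are
# 2D lists; None marks a transparent pixel in the upper layer.

def composite(upper, lower):
    """Overlay `upper` on `lower` (same size); the lower layer shows
    only where the upper layer is transparent (None)."""
    return [
        [lo_px if up_px is None else up_px
         for up_px, lo_px in zip(up_row, lo_row)]
        for up_row, lo_row in zip(upper, lower)
    ]

# 3x3 example: the centre pixel of the upper layer is transparent.
FIRST_LAYER = [["F", "F", "F"],
               ["F", None, "F"],
               ["F", "F", "F"]]
SECOND_LAYER = [["B"] * 3 for _ in range(3)]

image = composite(FIRST_LAYER, SECOND_LAYER)
print(image[1][1])  # the lower layer shows through the transparent pixel
print(image[0][0])  # everywhere else the upper layer is displayed
```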
The foreground layer is a layer that is located above the background layer. That is, the background layer is superimposed below the foreground layer in the superimposed order. Illustratively, the foreground layer includes transparent regions and non-transparent regions thereon.
The transparent region is a region where the underlying layer can be displayed. The non-transparent region is a region where the underlying layer cannot be displayed.
Illustratively, the transparent region on the foreground layer is a region that is transparent to the background layer, and the non-transparent region is a region that is not transparent to the background layer. Illustratively, the transparency of the transparent region may be arbitrary, i.e. the transparent region may be translucent, fully transparent, or of any transparency. Illustratively, the size of the transparent region on the foreground layer is smaller than the size of the background layer. That is, the image formed by superimposing the foreground layer and the background layer necessarily includes display elements from both layers. In other words, when the foreground layer and the background layer are superimposed into an image, part of the background layer is necessarily covered by the non-transparent area of the foreground layer.
The background layer is a layer below the foreground layer. That is, the foreground layer is superimposed over the background layer in the superimposed order. Illustratively, the background layer is an opaque layer. Alternatively, the background layer is a layer having opaque regions.
Illustratively, the foreground layer and the background layer are used to present image content.
Illustratively, the foreground layer and the background layer are the same size, or the foreground layer and the background layer are different sizes. Illustratively, the foreground and background layers are arbitrary in shape and may be the same or different.
For example, if the foreground layer and the background layer are different in size, the region where the foreground layer and the background layer overlap constitutes image content, and the region where they do not overlap does not constitute image content. For example, the size of the foreground layer is 1cm×1cm, the size of the background layer is 2cm×2cm, and the size of the overlapping area of the foreground layer and the background layer is 1cm×1cm, so that the overlapping area is image content, and the user interface only displays the overlapping area.
For example, if the foreground layer and the background layer are different in size, a partial region of the overlapping region of the foreground layer and the background layer constitutes image content, and the other region does not constitute image content. For example, the size of the foreground layer is 1cm×1cm, the size of the background layer is 2cm×2cm, and the size of the overlapping area of the foreground layer and the background layer is 1cm×1cm, so that the image content is a partial area in the overlapping area, for example, an area of 0.5cm×0.5cm within the overlapping area, and the user interface only displays that area.
For example, if the foreground layer and the background layer are different in size, the image content may also be all the image contents obtained by overlapping the foreground layer and the background layer. For example, the size of the foreground layer is 1cm×1cm, the size of the background layer is 2cm×2cm, and the image content may be a 2cm×2cm image obtained by overlapping the foreground layer and the background layer, and the user interface displays the 2cm×2cm image content.
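The overlap geometry in the size examples above can be computed with a standard axis-aligned rectangle intersection. This sketch is an illustrative assumption (the patent does not specify coordinates); positions are top-left corners and sizes are (width, height), in the same units as the examples.

```python
# Sketch: overlapping region of a foreground layer and a background layer
# of different sizes, as in the 1cm x 1cm / 2cm x 2cm examples above.
# Coordinate convention (an assumption): pos = top-left corner, size = (w, h).

def overlap(fg_pos, fg_size, bg_pos, bg_size):
    """Return (x, y, w, h) of the overlapping rectangle, or None if the
    two layers do not overlap. Only the overlap can form image content."""
    x1 = max(fg_pos[0], bg_pos[0])
    y1 = max(fg_pos[1], bg_pos[1])
    x2 = min(fg_pos[0] + fg_size[0], bg_pos[0] + bg_size[0])
    y2 = min(fg_pos[1] + fg_size[1], bg_pos[1] + bg_size[1])
    if x2 <= x1 or y2 <= y1:
        return None
    return (x1, y1, x2 - x1, y2 - y1)

# 1x1 foreground fully inside a 2x2 background: the overlap is the whole
# foreground, matching the 1cm x 1cm overlapping area in the text.
print(overlap((0.5, 0.5), (1, 1), (0, 0), (2, 2)))  # -> (0.5, 0.5, 1, 1)
```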
The embodiment of the application takes the example that the sizes of the foreground image layer and the background image layer are different.
The relative position refers to the positions of the foreground layer and the background layer with respect to each other. For example, the relative position may refer to the position of the foreground layer relative to the background layer, or the position of the background layer relative to the foreground layer. For example, when the background layer is larger than the foreground layer, the relative position of the two layers may be: the foreground layer is located at the lower right corner of the background layer, or the foreground layer is located in the middle of the background layer. When the background layer is smaller than the foreground layer, the relative position may be: the background layer is located in the upper half of the foreground layer, or the background layer is located in the left half of the foreground layer.
The first relative position is one of the possible relative positions of the foreground layer and the background layer.
The first background content is the content in the background layer which is displayed through the transparent area of the foreground layer. Illustratively, when the relative positions of the foreground layer and the background layer are different, the background layer displays different content through the transparent region of the foreground layer.
The foreground content of the non-transparent region may be, for example, the content contained in all non-transparent regions in the foreground layer, or may be the content contained in part of the non-transparent regions in the foreground layer.
Illustratively, the image content includes content in both the foreground layer and the background layer.
Exemplary content or display elements include shapes, patterns, colors, and combinations thereof.
Illustratively, the size of the transparent region is smaller than the size of the background layer, or the size of the transparent region is larger than the size of the background layer, or the size of the transparent region is equal to the size of the background layer.
For example, when the background layer is smaller than the transparent region, a blank layer may be added behind the background layer to fill the portion of the transparent region not covered by the background layer. Alternatively, the transparent region is a translucent region through which the background layer is displayed.
And 102, controlling the foreground image layer and the background image layer to generate relative displacement.
The terminal controls the foreground image layer and the background image layer to generate relative displacement.
Illustratively, the terminal controls the relative displacement of the foreground layer and the background layer, i.e. the terminal changes the relative positions of the foreground layer and the background layer.
Illustratively, the terminal moves the relative positions of the foreground and background layers from a first relative position to a second relative position.
Relative displacement means that the two layers move relative to each other, so that their relative position changes. Illustratively, relative displacement refers to movement in the horizontal dimension, parallel to the plane of the foreground and background layers. The direction and manner of the relative displacement are, illustratively, arbitrary.
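The effect of such a displacement can be sketched in one dimension: sliding the transparent window of the foreground layer across a wider background layer changes which background content is visible. This is an illustrative assumption (a 1-D simplification, not the patent's implementation); the row content and window width are made up for the example.

```python
# Sketch: relative displacement between the layers changes the background
# content visible through the transparent area. 1-D simplification: one
# row of the (wider) background layer and a transparent window over it.

def visible_background(background_row, window_start, window_width):
    """Background content seen through the transparent window when the
    layers are at the relative position given by `window_start`."""
    return background_row[window_start:window_start + window_width]

ROW = list("ABCDEFGH")                   # one row of the background layer
first = visible_background(ROW, 0, 3)    # first relative position
second = visible_background(ROW, 3, 3)   # second position, after a displacement of 3
print(first, second)  # different relative positions -> different background content
```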
Step 103, displaying a second user interface, where the second user interface includes image content jointly displayed, when the foreground layer and the background layer are at a second relative position, by the second background content of the background layer shown through the transparent area and the foreground content of the non-transparent area.
The terminal displays a second user interface.
The second relative position is a different relative position than the first relative position.
The second background content is different from the first background content on the background layer.
Illustratively, the image content displayed on the first user interface is different from the image content displayed on the second user interface.
Illustratively, the foreground content is primary display content and the first background content and the second background content are auxiliary display content; or, the foreground content is auxiliary display content and the first background content and the second background content are primary display content.
Illustratively, the primary display content and the auxiliary display content together constitute the image content. For example, the primary content of the image content may be on the foreground layer or on the background layer.
In summary, in the method provided in this embodiment, the image is arranged to include two layers, a foreground layer and a background layer, where the foreground layer contains a transparent area and a non-transparent area. By controlling the relative displacement between the two layers, different relative positions produce different image content, formed by the background content shown through the transparent area together with the foreground content of the non-transparent area. This not only attracts users to view the image, but also reduces the image data volume, speeds up image loading, and saves data traffic.
The application also provides an exemplary embodiment for controlling the relative displacement of the foreground image layer and the background image layer.
Fig. 5 is a method flowchart of a method for displaying a user interface according to an exemplary embodiment of the present application. The terminal 10 shown in fig. 1 is exemplified as the subject of execution of the method, which includes the following steps.
Step 101, displaying a first user interface, where the first user interface includes image content that is jointly displayed by the first background content of the transparent area on the background layer and the foreground content of the non-transparent area when the foreground layer and the background layer are located at a first relative position.
Illustratively, as shown in FIG. 6, there is a foreground layer 501 and a background layer 502, where the foreground layer 501 includes a transparent area 503 and a non-transparent area 504. Illustratively, the cola bottle on the foreground layer 501 in fig. 6 is the transparent area 503, and the shaded portion is the non-transparent area 504. The background layer 502 is an opaque layer, in which the upper half is white and the lower half is a shaded area. Superimposing the foreground layer 501 over the background layer 502 forms an image as shown in fig. 7. Illustratively, fig. 7 shows the image content jointly displayed by the first background content 505 of the transparent area 503 on the background layer 502 and the foreground content of the non-transparent area 504 when the foreground layer 501 and the background layer 502 are in the first relative position. The first background content 505 is the portion of the white area and the shadow area of the background layer 502 that shows through the transparent area 503.
Illustratively, FIG. 7 is a first user interface on which the foreground layer and the background layer are displayed in full. By way of example, fig. 8 shows another first user interface, on which only the image content of the overlapping portion of the foreground layer and the background layer is displayed, and the other portions are not displayed.
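The superposition logic described above, where the non-transparent area shows the foreground content and the transparent area lets the background layer show through, can be sketched as follows. This is a minimal illustration: the grid representation, function name, and colour labels are assumptions, not taken from the patent.

```python
# Layers are modelled as 2D grids of colour labels; the alpha mask marks
# the non-transparent (1) and transparent (0) areas of the foreground layer.

def composite(foreground, alpha, background, offset_y=0):
    """Overlay `foreground` on `background`.

    foreground/background: 2D lists of colour values (same width).
    alpha: 2D list, 1 = non-transparent area, 0 = transparent area.
    offset_y: vertical position of the foreground layer over the background
              (the "relative position" of the two layers).
    """
    height, width = len(foreground), len(foreground[0])
    out = []
    for y in range(height):
        row = []
        for x in range(width):
            if alpha[y][x]:                  # non-transparent: foreground shows
                row.append(foreground[y][x])
            else:                            # transparent: background shows through
                row.append(background[y + offset_y][x])
        out.append(row)
    return out


# A 2x2 foreground whose right column is transparent, over a 4x2 background.
fg = [["F", "F"], ["F", "F"]]
mask = [[1, 0], [1, 0]]
bg = [["A", "A"], ["B", "B"], ["C", "C"], ["D", "D"]]

first = composite(fg, mask, bg, offset_y=0)   # first relative position
second = composite(fg, mask, bg, offset_y=2)  # second relative position
```

With the two offsets, the same foreground shows different background content through its transparent column, which is the core effect of the method.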
In step 1021, a displacement instruction is obtained, the displacement instruction being user-triggered or generated by program code.
And the terminal acquires the displacement instruction.
The displacement instruction is an instruction for controlling the relative displacement of the foreground layer and the background layer.
Illustratively, the displacement instruction is triggered by the user. When the terminal receives the user's displacement operation, a displacement instruction is generated, and the foreground layer and the background layer are controlled to displace relatively according to the displacement instruction. For example, the user's displacement operation may be at least one of sliding, scrolling, clicking, double-clicking, and pressing.
The displacement instructions may also be generated by program code, for example. For example, program code is used to control the foreground layer to automatically move upwards from below the background layer.
Step 1022, controlling at least one of the foreground layer and the background layer to displace according to the displacement distance of the displacement instruction.
The terminal controls at least one of the foreground layer and the background layer to displace according to the displacement distance of the displacement instruction.
The displacement command includes a displacement direction and a displacement distance, and the terminal controls the foreground image layer and the background image layer to generate relative displacement according to the displacement direction and the displacement distance.
The terminal can control the foreground layer to be motionless and the background layer to move according to the displacement instruction.
Or the terminal can control the background layer to be motionless and the foreground layer to move according to the displacement instruction.
Or, the terminal can control the background layer and the foreground layer to move according to the displacement instruction. Illustratively, the directions of movement of the background and foreground layers may be the same, opposite, or at any angle. For example, when the moving directions of the background layer and the foreground layer are the same, the moving speeds or moving distances of the background layer and the foreground layer are different, i.e. the relative positions of the background layer and the foreground layer are changed after moving.
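The three movement options above (foreground only, background only, or both layers) can be sketched directly. The function name, the tuple representation of positions, and the "foreground moves twice as fast" choice for the both-move case are illustrative assumptions; the patent only requires that the relative position changes.

```python
def apply_displacement(fg_pos, bg_pos, distance, mode):
    """Return new (fg_pos, bg_pos) after a displacement of `distance`.

    mode: "foreground" - background stays still, foreground moves;
          "background" - foreground stays still, background moves;
          "both"       - both move in the same direction at different speeds
                         (foreground twice as fast here), so the relative
                         position still changes after the movement.
    """
    if mode == "foreground":
        return fg_pos + distance, bg_pos
    if mode == "background":
        return fg_pos, bg_pos + distance
    if mode == "both":
        return fg_pos + 2 * distance, bg_pos + distance
    raise ValueError(mode)


fg, bg = apply_displacement(0, 0, 10, "both")
assert fg - bg != 0   # relative position changed even though both layers moved
```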
For example, the terminal controls only the foreground layer to move while the background layer is stationary. As shown in fig. 9, the displacement instruction corresponds to the user's upward sliding operation; the terminal controls the foreground layer 501 to move upward according to the displacement instruction while the background layer 502 is stationary, so that the foreground layer 501 and the background layer 502 change from the first relative position shown in image (1) of fig. 9 to the second relative position shown in image (2), and then to the third relative position shown in image (3). By way of example, fig. 10 shows the layer moving process of fig. 9 applied to an application program with an advertisement browsing function. As shown in fig. 10, the image content formed by superimposing the foreground layer and the background layer is advertisement content. On the user interface, the advertisement content slides upward along with the user's upward sliding operation: the foreground layer is controlled to slide upward with the operation while the background layer stays fixed, simulating the effect of images (1) to (2) to (3) in fig. 10, in which the cola bottle changes from full of cola, to a low cola level, to no cola at all.
For example, the terminal controls only the background layer to move while the foreground layer is stationary. As shown in fig. 11, there is a foreground layer 506 and a background layer 507, where the foreground layer has a transparent region 508 and a non-transparent region 509. Illustratively, the white areas of the foreground layer in fig. 11 are transparent areas, and the shaded areas corresponding to the three shoes are non-transparent areas. The background layer 507 is an opaque layer, and the foreground layer 506 is located over the background layer 507. After the foreground layer 506 and the background layer 507 are superimposed, the image content shown in fig. 12 can be obtained. Fig. 12 shows a user interface in which the foreground layer 506 and the background layer 507 of fig. 11 are applied to an application program with an advertisement browsing function. The image content formed by superimposing the two layers is displayed on the user interface; the displacement instruction is generated by program code, controlling the foreground layer to stay still while the background layer moves upward, which produces the effect of images (1) to (2) to (3) to (4) in fig. 12.
For example, the terminal controls both the background layer and the foreground layer to move. As shown in fig. 13, the terminal controls the background layer 507 to move rightward and the foreground layer 506 to move downward according to the displacement instruction, so that the foreground layer 506 and the background layer 507 change from the first relative position shown in fig. 13 (1) to the second relative position shown in fig. 13 (2).
Illustratively, the distance traveled by the foreground layer or the background layer is determined from the displacement distance in the displacement instruction.
For example, when only one of the foreground layer and the background layer moves and the other is stationary, the terminal controls the moving layer to move by a distance equal to the displacement distance.
Illustratively, the size of the stationary layer is the same as the size of the user interface. For example, in an application program with an advertisement browsing function, when the user scrolls the mouse to move the page upward by a distance x, the terminal controls one of the foreground layer and the background layer to move by the distance x.
Illustratively, the size of the stationary layer is different from the size of the user interface, and the terminal matches the stationary layer to the user interface by proportional scaling. Illustratively, matching means that the width of the scaled layer is equal to the width of the user interface, or that its length is equal to the length of the user interface. Illustratively, the terminal matches the size of the stationary layer to the user interface according to the movable direction of the moving layer. For example, if the moving layer moves vertically, the terminal scales the stationary layer so that its width equals the width of the user interface, while its length may be slightly longer or shorter than the length of the user interface.
Illustratively, since the length of the stationary layer may be slightly longer or shorter than the user interface, in order to allow the moving layer to reach any position on the user interface while the stationary layer remains displayed beneath the transparent area (that is, so that the stationary layer always fully covers the area behind the moving layer), the terminal needs to move the "stationary" layer as well when the moving layer moves. Its moving direction differs from that of the moving layer, or its direction is the same but its speed is different. For example, if the stationary layer is longer than the user interface, the terminal controls it to move slowly downward when the moving layer moves up; if the stationary layer is shorter than the user interface, the terminal controls it to move slowly upward when the moving layer moves up.
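The scaling-and-compensation behaviour above can be sketched numerically. The function names, the linear interpolation, and the offset sign convention are assumptions for illustration; the patent only requires that the stationary layer's edge is never exposed.

```python
def fit_width(layer_w, layer_h, ui_w):
    """Proportionally scale a layer so its width equals the interface width."""
    scale = ui_w / layer_w
    return ui_w, layer_h * scale


def still_layer_offset(progress, still_h, ui_h):
    """Offset of the 'stationary' layer when the moving layer has scrolled
    a fraction `progress` (0..1) of its range.

    Linearly interpolates between the two extreme offsets (0 and
    ui_h - still_h) at which the stationary layer still covers the whole
    interface, so its edge is never exposed.
    """
    return progress * (ui_h - still_h)


# A 500x1200 layer scaled to a 1000-wide interface becomes 1000x2400; being
# taller than a 2000-tall interface, it drifts slowly as the user scrolls.
w, h = fit_width(500, 1200, 1000)
drift = still_layer_offset(1.0, h, 2000)
```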
Step 103, displaying a second user interface, where the second user interface includes image content that is jointly displayed by the second background content of the transparent area on the background layer and the foreground content of the non-transparent area when the foreground layer and the background layer are located at a second relative position.
In summary, in the method provided in this embodiment, the terminal acquires a displacement instruction and controls at least one of the foreground layer and the background layer to displace according to the displacement distance in the instruction, so that the relative position of the foreground layer and the background layer changes from the first relative position to the second relative position. By changing the relative positions of the foreground layer and the background layer, the image content presents different effects, which not only attracts users to view, but also reduces the amount of image data, speeds up image loading, and saves data traffic.
Fig. 14 is a method flowchart of a method for displaying a user interface according to an exemplary embodiment of the present application. The method is described with the terminal shown in fig. 1 as the execution subject. Compared with the exemplary embodiment shown in fig. 5, "controlling at least one of the foreground layer and the background layer to displace" in step 1022 may be replaced with steps 201 to 203.
Step 201, determining whether the size of the foreground layer is larger than that of the background layer.
The terminal determines whether the size of the foreground layer is larger than that of the background layer; if so, step 202 is performed, otherwise step 203 is performed.
Illustratively, only one of the foreground layer and the background layer moves while the other is stationary: the terminal controls the smaller layer to move and the larger layer to stay still.
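The rule of steps 201 to 203 can be sketched as a single comparison. Comparing sizes by area is an assumption here; the patent only states that one layer is "larger than" the other.

```python
def choose_moving_layer(fg_size, bg_size):
    """fg_size/bg_size are (width, height) tuples; return which layer moves."""
    fg_area = fg_size[0] * fg_size[1]
    bg_area = bg_size[0] * bg_size[1]
    # Step 202: foreground larger -> displace the background layer.
    # Step 203: otherwise         -> displace the foreground layer.
    return "background" if fg_area > bg_area else "foreground"


assert choose_moving_layer((100, 400), (100, 200)) == "background"
assert choose_moving_layer((100, 100), (100, 200)) == "foreground"
```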
Step 202, controlling the displacement of the background layer relative to the foreground layer.
When the size of the foreground layer is larger than that of the background layer, the terminal controls the background layer to displace relative to the foreground layer.
The terminal controls the background layer to move, and the foreground layer is static.
For example, as shown in fig. 15, a foreground layer 601 is superimposed over a background layer 602, with square, triangular and diamond-shaped transparent areas 503 on the foreground layer 601, the rest being non-transparent areas 504. Illustratively, the white portion of the foreground layer in fig. 15 is the transparent area 503 and the shaded portion is the non-transparent area 504. The background layer 602 is an opaque layer. The terminal controls the background layer 602 to displace relative to the foreground layer 601 according to the displacement instruction.
Illustratively, the foreground layer may be mostly non-transparent with a small transparent area, or mostly transparent with a small non-transparent area. For example, as shown in fig. 16, there are square, triangular and diamond-shaped non-transparent areas 504 on the foreground layer 601, the rest being transparent areas 503, and the foreground layer 601 is superimposed on the background layer 602. The terminal controls the background layer 602 to displace relative to the foreground layer 601 according to the displacement instruction.
Illustratively, the background layer may move in any direction. For example, the background layers in figs. 15 and 16 slide vertically. The background layer may also be controlled to slide laterally: as shown in fig. 17, the foreground layer 601 is superimposed on the background layer 602, with square, triangular and diamond-shaped transparent areas 503 on the foreground layer 601, the rest being non-transparent areas 504. The white portion of the foreground layer is the transparent area 503 and the shaded portion is the non-transparent area 504. The background layer 602 is an opaque layer. The terminal controls the background layer 602 to slide laterally relative to the foreground layer 601 according to the displacement instruction.
Step 203, controlling the displacement of the foreground layer relative to the background layer.
When the size of the foreground layer is smaller than that of the background layer, the terminal controls the foreground layer to displace relative to the background layer.
Illustratively, the terminal controls the foreground layer to move and the background layer to be stationary.
For example, as shown in fig. 18, a foreground layer 601 is superimposed over a background layer 602, with a square transparent area 503 on the foreground layer 601, the rest being a non-transparent area 504. The white portion of the foreground layer is the transparent area 503 and the shaded portion is the non-transparent area 504. The background layer 602 is an opaque layer. The terminal controls the foreground layer 601 to displace relative to the background layer 602 according to the displacement instruction.
Illustratively, the foreground layer may be mostly non-transparent with a small transparent area, or mostly transparent with a small non-transparent area.
Illustratively, the foreground layer may be moved in any direction.
In summary, in the method provided by this embodiment, the sizes of the foreground layer and the background layer are compared, and the terminal controls the smaller layer to move while the larger layer stays still. By changing the relative positions of the foreground layer and the background layer, the image content presents different effects, which not only attracts users to view, but also reduces the amount of image data, speeds up image loading, and saves data traffic.
The displacement instruction also comprises a displacement direction, and the foreground image layer or the background image layer can move in any direction according to the displacement instruction.
Fig. 19 is a method flowchart of a method for displaying a user interface according to an exemplary embodiment of the present application. The method is described with the terminal 10 shown in fig. 1 as the execution subject. Unlike the exemplary embodiment shown in fig. 14, step 202 is replaced with steps 2021 to 2026, and step 203 is replaced with steps 2031 to 2036.
Step 2021, when the size of the foreground layer is larger than the background layer.
For example, when the size of the foreground layer is larger than the background layer, the terminal controls the background layer to be displaced relative to the foreground layer, and the displacement direction of the background layer may be any one of steps 2022 to 2026.
In step 2022, the background layer is controlled to be displaced according to a first direction, and the first direction is the same as the displacement direction of the displacement command.
The terminal controls the background layer to displace according to the first direction.
Illustratively, a displacement direction is also included in the displacement instruction. The displacement direction is the direction of the displacement operation by the user, or the displacement direction set in the program code.
Illustratively, as shown in FIG. 20, there is a foreground layer 601 and a background layer 602, where the foreground layer 601 has square, triangular and diamond-shaped non-transparent areas, the rest being transparent areas. The background layer 602 is an opaque layer. The displacement direction 603 in the displacement instruction is upward, and the terminal may control the background layer to move in the first direction 604, where the first direction 604 is the same as the displacement direction 603.
In step 2023, the background layer is controlled to be displaced according to the second direction, and the second direction is opposite to the displacement direction of the displacement command.
And the terminal controls the background layer to displace according to the second direction.
For example, as shown in fig. 20, if the displacement direction 603 in the displacement command is upward, the terminal may control the background layer to move in the second direction 605, where the second direction 605 is opposite to the displacement direction 603.
In step 2024, the background layer is controlled to be displaced according to a third direction, and the third direction is perpendicular to the displacement direction of the displacement command.
And the terminal controls the background layer to displace according to the third direction.
For example, as shown in fig. 20, if the displacement direction 603 in the displacement command is upward, the terminal may control the background layer to move in the third direction 606, and the third direction 606 is a direction perpendicular to the displacement direction 603. For example, when the foreground layer and the background layer are laterally slid layers as shown in fig. 17, the user can control the background layer to move left and right by sliding up and down.
In step 2025, the background layer is controlled to displace according to the fourth direction, and the fourth direction has an included angle with the displacement direction of the displacement command, where the included angle is an acute angle.
The terminal controls the background layer to displace according to the fourth direction.
For example, as shown in fig. 20, if the displacement direction 603 in the displacement instruction is upward, the terminal may control the background layer to move in the fourth direction 607, where the fourth direction 607 forms an acute angle with the displacement direction 603. For example, if the user's sliding direction is forty-five degrees obliquely upward, the terminal may control the background layer to move straight upward.
The fourth direction may also have an included angle with the displacement direction of the displacement command, which is an obtuse angle.
In step 2026, the background layer is controlled to displace according to the preset curve track.
And the terminal controls the background layer to displace according to a preset curve track.
For example, as shown in fig. 20, the terminal may control the background layer to displace along a preset curve track 608. By way of example, the curve track may be of any shape, e.g., sawtooth, wave, spiral, etc. For example, the terminal may determine the overall moving direction of the curve track based on the displacement direction; e.g., when the displacement direction is upward, the terminal controls the background layer to move upward along a zigzag track.
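The direction mapping of steps 2022 to 2026 can be sketched with 2D vectors. The 45-degree rotation chosen for the "acute angle" case and the triangular-wave zigzag are illustrative assumptions; the patent allows any acute angle and any curve shape.

```python
import math

def movement_direction(dx, dy, mode):
    """Map the displacement direction (dx, dy) to the layer's movement
    direction for the cases of steps 2022-2025."""
    if mode == "same":            # step 2022: first direction
        return (dx, dy)
    if mode == "opposite":        # step 2023: second direction
        return (-dx, -dy)
    if mode == "perpendicular":   # step 2024: third direction
        return (-dy, dx)
    if mode == "acute":           # step 2025: rotate 45 degrees (one acute choice)
        c, s = math.cos(math.pi / 4), math.sin(math.pi / 4)
        return (c * dx - s * dy, s * dx + c * dy)
    raise ValueError(mode)


def zigzag_position(t, dx, dy, amplitude=1.0):
    """Step 2026: overall motion along (dx, dy) with a triangular sideways
    wobble - one possible 'preset curve track'."""
    side = amplitude * (0.5 - abs(t % 1 - 0.5))   # triangle wave, 0 at whole t
    return (t * dx - side * dy, t * dy + side * dx)


up = (0, -1)   # screen coordinates: negative y is up
```

With an upward displacement, "opposite" yields downward movement and "perpendicular" yields lateral movement, matching the lateral-sliding example of fig. 17.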
In step 2031, when the size of the foreground layer is smaller than the background layer.
For example, when the size of the foreground layer is smaller than that of the background layer, the terminal controls the foreground layer to be displaced relative to the background layer, and the displacement direction of the foreground layer may be any one of steps 2032 to 2036.
In step 2032, the foreground layer is controlled to displace according to a first direction, and the first direction is the same as the displacement direction of the displacement command.
And the terminal controls the foreground layer to displace according to the first direction.
Step 2033, controlling the foreground layer to displace according to a second direction, where the second direction is opposite to the displacement direction of the displacement command.
And the terminal controls the foreground layer to displace according to the second direction.
Step 2034, controlling the foreground layer to displace according to a third direction, wherein the third direction is perpendicular to the displacement direction of the displacement instruction.
And the terminal controls the foreground layer to displace according to the third direction.
Step 2035, controlling the foreground layer to displace according to the fourth direction, where the fourth direction has an included angle with the displacement direction of the displacement instruction, and the included angle is an acute angle.
The terminal controls the foreground layer to displace according to the fourth direction.
Step 2036, controlling the foreground layer to displace according to the preset curve track.
And the terminal controls the foreground layer to displace according to a preset curve track.
Illustratively, based on the same principle as steps 2022 to 2026, the terminal may control the foreground layer to move in the direction of steps 2032 to 2036.
In summary, in the method provided in this embodiment, the displacement instruction further includes a displacement direction, and the foreground layer or the background layer determines its moving direction according to the displacement direction. By changing the relative positions of the foreground layer and the background layer, the image content presents different effects, which not only attracts users to view, but also reduces the amount of image data, speeds up image loading, and saves data traffic.
By way of example, four types of user interfaces are presented that are displayed using the display method of the user interface provided by the present application.
First, as shown in fig. 21, there is a foreground layer 701 and a background layer 702. The foreground layer 701 has a bottle-shaped transparent area 503 and a non-transparent area 504 outside the bottle. The background layer 702 is an opaque layer with two shaded areas. The user interface shown in fig. 22 can be obtained by superimposing the foreground layer 701 over the background layer 702. As shown in fig. 22, the terminal controls the foreground layer 701 to move from the left end of the background layer 702 to the right end, i.e., the effect of fig. 22 (1) to (2) to (3): the filling color in the bottle changes. This increases the interest of the picture, attracts users to view, reduces the amount of image data, speeds up image loading, and saves data traffic.
Second, as shown in fig. 23, there is a foreground layer 801 and a background layer 802. The foreground layer 801 has square, triangular and diamond-shaped transparent areas 503 and non-transparent areas 504 outside the transparent areas 503. The background layer 802 consists of an opaque shaded area 803 and a white area 804. Illustratively, the background layer may also be a layer with transparent areas, i.e., the background layer 802 comprises the opaque shaded area 803 and a transparent area 804. The user interface shown in fig. 24 can be obtained by superimposing the foreground layer 801 over the background layer 802. As shown in fig. 24, to better show the effect, the boundaries between the areas are hidden; the terminal controls the foreground layer 801 to gradually move from below the background layer 802 to above it, i.e., the effect of fig. 24 (1) to (6). This increases the interest of the picture, attracts users to view, reduces the amount of image data, speeds up image loading, and saves data traffic.
Third, as shown in fig. 25, there is a foreground layer 901 and a background layer 902. The foreground layer 901 has square and triangular non-transparent areas 504 and transparent areas 503 outside the non-transparent areas 504. The background layer 902 is an opaque layer. The user interface shown in fig. 26 can be obtained by superimposing the foreground layer 901 over the background layer 902. As shown in fig. 26, the terminal controls the foreground layer 901 to gradually move from below the background layer 902 to above it, i.e., the effect of fig. 26 (1) to (3). This increases the interest of the picture, attracts users to view, reduces the amount of image data, speeds up image loading, and saves data traffic.
Fourth, as shown in fig. 27, there is a foreground layer 1001 and a background layer 1002. The foreground layer 1001 has a transparent area 503 at its central portion, the transparent area 503 being a translucent area (with a transparency of 50%), and a non-transparent area 504 outside the transparent area 503. The background layer 1002 is a circular opaque layer. The user interface shown in fig. 28 can be obtained by superimposing the foreground layer 1001 over the background layer 1002. As shown in fig. 28, the terminal controls the background layer 1002 to move arbitrarily within the foreground layer 1001, for example, fig. 28 (1) to (3). Illustratively, a white layer is further disposed behind the background layer 1002 to fill the part of the transparent area 503 not covered by the background layer 1002, ensuring that the color of the translucent area is accurate. This increases the interest of the picture, attracts users to view, reduces the amount of image data, speeds up image loading, and saves data traffic.
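The fourth example's 50%-translucent area and white filler layer amount to a standard source-over blend. A minimal per-pixel sketch follows, assuming single grey values in [0, 255]; the function names and the single-channel model are illustrative, not from the patent.

```python
def blend(fg_color, fg_alpha, back_color):
    """Source-over blend of one pixel: fg over whatever lies behind it."""
    return fg_alpha * fg_color + (1 - fg_alpha) * back_color


def translucent_pixel(fg_color, bg_color_or_none):
    """Pixel of the translucent area. Behind it lies the circular background
    layer where present, otherwise the white filler layer (255), which is why
    the filler is needed for the colour to stay accurate."""
    back = bg_color_or_none if bg_color_or_none is not None else 255
    return blend(fg_color, 0.5, back)   # transparency of 50%


over_background = translucent_pixel(0, 100)   # background layer underneath
over_filler = translucent_pixel(0, None)      # white filler underneath
```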
Illustratively, the user interface may include two or more one-to-one corresponding sets of foreground and background layers, in which case a background layer may, because of its size, occupy the entire user interface. At this time, each foreground layer obtains the identifier of its corresponding background layer and, according to that identifier, reveals only the background content of that background layer through its transparent area. For example, there are a first foreground layer, a first background layer, a second foreground layer, and a second background layer in one-to-one correspondence on the user interface. The first foreground layer obtains the identifier of the first background layer, and its transparent area reveals the background content of the first background layer; the second foreground layer obtains the identifier of the second background layer, and its transparent area reveals the background content of the second background layer. That is, the first foreground layer does not reveal the background content of the second background layer, nor does the second foreground layer reveal the background content of the first background layer.
Illustratively, as shown in fig. 29, there is a one-to-one correspondence between a first foreground layer 901 and a first background layer 903, and between a second foreground layer 902 and a second background layer 904. The transparent area of the first foreground layer 901 reveals the background content of the first background layer 903, and the transparent area of the second foreground layer 902 reveals the background content of the second background layer 904.
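The identifier-based pairing described above can be sketched as a lookup: each foreground layer stores the identifier of its own background layer, so its transparent area can only ever reveal that layer's content. The class, the dict registry, and all names here are implementation assumptions.

```python
class ForegroundLayer:
    def __init__(self, name, background_id):
        self.name = name
        self.background_id = background_id   # identifier of the paired background layer


# Registry of background layers keyed by identifier (illustrative content).
backgrounds = {"bg1": "first background content",
               "bg2": "second background content"}


def content_through(foreground, background_registry):
    """Background content shown through this foreground layer's transparent
    area: only its paired layer is ever consulted."""
    return background_registry[foreground.background_id]


fg1 = ForegroundLayer("fg1", "bg1")
fg2 = ForegroundLayer("fg2", "bg2")
```

Because `fg1` holds only `"bg1"`, it cannot reveal the second background layer's content, matching the one-to-one correspondence of fig. 29.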
The following are device embodiments of the application; for details not described in the device embodiments, reference is made to the method embodiments above.
Fig. 30 is a block diagram of a display device of a user interface provided in an exemplary embodiment of the present application. The device is applied to a terminal in which an application program supporting picture browsing runs. The user interface comprises a foreground layer and a background layer, the foreground layer including a transparent area and a non-transparent area. The apparatus comprises:
the display module 301 is configured to display a first user interface, where the first user interface includes image content that is jointly displayed by the first background content of the transparent area and the foreground content of the non-transparent area on the background layer when the foreground layer and the background layer are located at a first relative position;
The control module 302 is configured to control the relative displacement between the foreground layer and the background layer;
the display module 301 is further configured to display a second user interface, where the second user interface includes image content that is jointly displayed by the second background content of the transparent area and the foreground content of the non-transparent area on the background layer when the foreground layer and the background layer are located at a second relative position.
In an alternative embodiment, the apparatus further comprises: an acquisition module 303;
the obtaining module 303 is configured to obtain a displacement instruction, where the displacement instruction is triggered by a user or generated by a program code;
the control module 302 is further configured to control displacement of at least one of the foreground layer and the background layer in response to a displacement distance of the displacement command.
In an alternative embodiment, the foreground layer is larger in size than the background layer;
the control module 302 is further configured to control displacement of the background layer relative to the foreground layer.
In an alternative embodiment, the control module 302 is further configured to control the background layer to perform displacement according to a first direction, where the first direction is the same as the displacement direction of the displacement instruction;
or,
the control module 302 is further configured to control the background layer to perform displacement according to a second direction, where the second direction is opposite to the displacement direction of the displacement instruction;
or,
the control module 302 is further configured to control the background layer to perform displacement according to a third direction, where the third direction is perpendicular to the displacement direction of the displacement instruction;
or,
the control module 302 is further configured to control the background layer to perform displacement according to a fourth direction, where the fourth direction has an included angle with a displacement direction of the displacement instruction, and the included angle is an acute angle;
or,
the control module 302 is further configured to control the background layer to displace according to a preset curve track.
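The four direction alternatives above can be sketched as a single mapping from the displacement instruction's direction to the layer's displacement (a hypothetical helper, not part of the patent; the mode names and the default acute angle are assumptions):

```python
import math

def layer_displacement(instruction, mode, angle_deg=30.0):
    """Map a displacement instruction (dx, dy) to the layer's displacement.
    mode: 'same' | 'opposite' | 'perpendicular' | 'acute'."""
    dx, dy = instruction
    if mode == 'same':           # first direction: same as the instruction
        return (dx, dy)
    if mode == 'opposite':       # second direction: opposite to the instruction
        return (-dx, -dy)
    if mode == 'perpendicular':  # third direction: rotated 90 degrees
        return (-dy, dx)
    if mode == 'acute':          # fourth direction: rotated by an acute angle
        a = math.radians(angle_deg)
        return (dx * math.cos(a) - dy * math.sin(a),
                dx * math.sin(a) + dy * math.cos(a))
    raise ValueError(f"unknown mode: {mode}")
```

A preset curve track, the fifth alternative, would instead sample positions from a parametric curve rather than derive them from the instruction's direction.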
In an alternative embodiment, the foreground layer is smaller in size than the background layer;
the control module 302 is further configured to control displacement of the foreground layer relative to the background layer.
In an alternative embodiment, the control module 302 is further configured to control the foreground layer to perform displacement according to a first direction, where the first direction is the same as the displacement direction of the displacement instruction;
or,
the control module 302 is further configured to control the foreground layer to perform displacement according to a second direction, where the second direction is opposite to the displacement direction of the displacement instruction;
or,
the control module 302 is further configured to control the foreground layer to perform displacement according to a third direction, where the third direction is perpendicular to the displacement direction of the displacement instruction;
or,
the control module 302 is further configured to control the foreground layer to perform displacement according to a fourth direction, where the fourth direction has an included angle with a displacement direction of the displacement instruction, and the included angle is an acute angle;
or,
the control module 302 is further configured to control the foreground layer to displace according to a preset curve track.
In an alternative embodiment, the foreground content is primary display content, and the first background content and the second background content are secondary display content;
or,
the foreground content is auxiliary display content, and the first background content and the second background content are main display content.
In an alternative embodiment, the display elements at at least two positions in the background layer are different, the display elements comprising: shape, pattern, color, and combinations thereof.
It should be noted that, when the display apparatus of the user interface provided in the above embodiment displays a user interface, the division of the above functional modules is merely illustrative; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the apparatus may be divided into different functional modules to complete all or part of the functions described above. In addition, the display apparatus of the user interface provided in the above embodiment and the embodiments of the display method of the user interface belong to the same concept; for the specific implementation process, refer to the method embodiments, and details are not repeated here.
Fig. 31 shows a block diagram of a terminal 3900 according to an exemplary embodiment of the present application. The terminal 3900 may be: a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The terminal 3900 may also be referred to by other names such as user device, portable terminal, laptop terminal, or desktop terminal.
In general, the terminal 3900 includes: a processor 3901 and a memory 3902.
Processor 3901 may include one or more processing cores, for example, a 4-core processor or an 8-core processor. The processor 3901 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 3901 may also include a main processor and a coprocessor: the main processor is a processor for processing data in an awake state, also referred to as a CPU (Central Processing Unit); the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 3901 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that needs to be displayed on the display screen. In some embodiments, the processor 3901 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 3902 may include one or more computer-readable storage media, which may be non-transitory. Memory 3902 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In some embodiments, the non-transitory computer-readable storage medium in memory 3902 is used to store at least one instruction, the at least one instruction being executed by the processor 3901 to implement the display method of a user interface provided by the method embodiments of the present application.
In some embodiments, the terminal 3900 may optionally further include: a peripheral interface 3903 and at least one peripheral. The processor 3901, the memory 3902, and the peripheral device interface 3903 may be connected by a bus or signal line. The individual peripheral devices may be connected to the peripheral device interface 3903 via buses, signal lines, or a circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 3904, a touch display screen 3905, a camera 3906, an audio circuit 3907, a positioning assembly 3908, and a power source 3909.
The peripheral interface 3903 may be used to connect at least one Input/Output (I/O) related peripheral to the processor 3901 and the memory 3902. In some embodiments, the processor 3901, the memory 3902, and the peripheral interface 3903 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 3901, the memory 3902, and the peripheral interface 3903 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The Radio Frequency circuit 3904 is configured to receive and transmit RF (Radio Frequency) signals, also referred to as electromagnetic signals. The radio frequency circuit 3904 communicates with a communication network and other communication devices via electromagnetic signals. The radio frequency circuit 3904 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 3904 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 3904 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocol includes, but is not limited to: the world wide web, metropolitan area networks, intranets, mobile communication networks of various generations (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 3904 may also include NFC (Near Field Communication) related circuits, which is not limited in the present application.
The display 3905 is used for displaying a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display 3905 is a touch display, the display 3905 also has the ability to collect touch signals at or above the surface of the display 3905. The touch signal may be input as a control signal to the processor 3901 for processing. At this point, the display 3905 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display 3905, disposed on the front panel of the terminal 3900; in other embodiments, there may be at least two displays 3905, each disposed on a different surface of the terminal 3900 or in a folded design; in still other embodiments, the display 3905 may be a flexible display disposed on a curved surface or a folded surface of the terminal 3900. The display 3905 may even be arranged in a non-rectangular irregular pattern, i.e., an irregularly-shaped screen. The display 3905 may be made of materials such as an LCD (Liquid Crystal Display) or an OLED (Organic Light-Emitting Diode).
Camera assembly 3906 is used to capture images or video. Optionally, camera assembly 3906 includes a front camera and a rear camera. Typically, the front camera is disposed on the front panel of the terminal and the rear camera is disposed on the rear surface of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so as to realize a background blurring function by fusing the main camera with the depth-of-field camera, and panoramic shooting and Virtual Reality (VR) shooting functions or other fused shooting functions by fusing the main camera with the wide-angle camera. In some embodiments, camera assembly 3906 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash refers to a combination of a warm-light flash and a cold-light flash, and can be used for light compensation under different color temperatures.
The audio circuit 3907 may include a microphone and a speaker. The microphone is used for collecting sound waves of users and the environment, converting the sound waves into electric signals, inputting the electric signals to the processor 3901 for processing, or inputting the electric signals to the radio frequency circuit 3904 for voice communication. For stereo acquisition or noise reduction purposes, the microphone may be multiple, each disposed at a different location of the terminal 3900. The microphone may also be an array microphone or an omni-directional pickup microphone. The speaker is then used to convert electrical signals from the processor 3901 or the radio frequency circuit 3904 into sound waves. The speaker may be a conventional thin film speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, not only the electric signal can be converted into a sound wave audible to humans, but also the electric signal can be converted into a sound wave inaudible to humans for ranging and other purposes. In some embodiments, audio circuit 3907 may also include a headphone jack.
Positioning component 3908 is used to locate the current geographic location of terminal 3900 for navigation or LBS (Location Based Service). The positioning component 3908 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, or the Galileo system of the European Union.
Power supply 3909 is used to power the various components in terminal 3900. The power source 3909 may be an alternating current, a direct current, a disposable battery, or a rechargeable battery. When power supply 3909 comprises a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, the terminal 3900 further includes one or more sensors 3910. The one or more sensors 3910 include, but are not limited to: acceleration sensor 3911, gyroscope sensor 3912, pressure sensor 3913, fingerprint sensor 3914, optical sensor 3915, and proximity sensor 3916.
The acceleration sensor 3911 can detect the magnitudes of accelerations on three coordinate axes of the coordinate system established with the terminal 3900. For example, the acceleration sensor 3911 may be used to detect components of gravitational acceleration on three coordinate axes. The processor 3901 may control the touch display 3905 to display a user interface in a landscape view or a portrait view according to the gravitational acceleration signal acquired by the acceleration sensor 3911. The acceleration sensor 3911 may also be used for the acquisition of motion data of a game or a user.
The gyro sensor 3912 may detect a body direction and a rotation angle of the terminal 3900, and the gyro sensor 3912 may cooperate with the acceleration sensor 3911 to collect 3D actions of the user on the terminal 3900. The processor 3901 may implement the following functions based on the data collected by the gyro sensor 3912: motion sensing (e.g., changing UI according to a tilting operation by a user), image stabilization at shooting, game control, and inertial navigation.
The pressure sensor 3913 may be disposed at a side frame of the terminal 3900 and/or at a lower layer of the touch display 3905. When the pressure sensor 3913 is disposed on a side frame of the terminal 3900, a grip signal of the terminal 3900 by a user may be detected, and the processor 3901 may perform a left-right hand recognition or a quick operation according to the grip signal collected by the pressure sensor 3913. When the pressure sensor 3913 is disposed at the lower layer of the touch display screen 3905, the processor 3901 controls the operability control on the UI interface according to the pressure operation of the user on the touch display screen 3905. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 3914 is used for collecting the fingerprint of the user; the processor 3901 identifies the identity of the user according to the fingerprint collected by the fingerprint sensor 3914, or the fingerprint sensor 3914 identifies the identity of the user according to the collected fingerprint. Upon recognizing that the user's identity is a trusted identity, the processor 3901 authorizes the user to perform related sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and the like. The fingerprint sensor 3914 may be provided on the front, back, or side of terminal 3900. When a physical key or vendor Logo is provided on the terminal 3900, the fingerprint sensor 3914 may be integrated with the physical key or vendor Logo.
The optical sensor 3915 is used to collect ambient light intensity. In one embodiment, the processor 3901 may control the display brightness of the touch display 3905 based on the ambient light intensity collected by the optical sensor 3915. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 3905 is turned up; when the ambient light intensity is low, the display brightness of the touch display screen 3905 is turned down. In another embodiment, the processor 3901 may also dynamically adjust the shooting parameters of the camera assembly 3906 based on the ambient light intensity collected by the optical sensor 3915.
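The brightness adjustment described here can be sketched as a simple mapping from measured ambient light to a brightness level (a hedged illustration only; the lux range and brightness levels are assumed values, not given in the text):

```python
def adjust_brightness(ambient_lux, min_level=10, max_level=100):
    """Map ambient light intensity (lux) to a display brightness level:
    brighter surroundings -> brighter screen, clamped to [min_level, max_level].
    Assumes 0 lux maps to min_level and >= 1000 lux to max_level, linear between."""
    ratio = min(ambient_lux, 1000) / 1000
    return round(min_level + ratio * (max_level - min_level))
```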
A proximity sensor 3916, also referred to as a distance sensor, is typically provided on the front panel of the terminal 3900. Proximity sensor 3916 is used to collect the distance between the user and the front of terminal 3900. In one embodiment, when the proximity sensor 3916 detects a gradual decrease in the distance between the user and the front face of the terminal 3900, the processor 3901 controls the touch display 3905 to switch from the on-screen state to the off-screen state; when the proximity sensor 3916 detects that the distance between the user and the front face of the terminal 3900 gradually increases, the processor 3901 controls the touch display 3905 to switch from the off-screen state to the on-screen state.
Those skilled in the art will appreciate that the structure shown in fig. 31 is not limiting of the terminal 3900 and may include more or fewer components than shown, or may combine certain components, or may employ a different arrangement of components.
The present application also provides a computer device, including a processor and a memory, where the memory stores at least one instruction, at least one program, a code set, or an instruction set, where the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement a display method of a user interface provided in any of the above exemplary embodiments.
The present application also provides a computer readable storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, the at least one instruction, the at least one program, the set of codes, or the set of instructions being loaded and executed by a processor to implement the method for displaying a user interface provided by any of the above-described exemplary embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program for instructing relevant hardware, where the program may be stored in a computer readable storage medium, and the storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The foregoing is merely a description of preferred embodiments of the present application and is not intended to limit the present application; any modification, equivalent replacement, improvement, or the like made within the spirit and principles of the present application shall fall within the protection scope of the present application.

Claims (15)

1. A method of displaying a user interface, the user interface comprising: a foreground layer and a background layer, the foreground layer including transparent regions and non-transparent regions thereon, the method comprising:
displaying a first user interface, wherein the first user interface comprises image contents which are jointly displayed by a first background content of the transparent area and a foreground content of the non-transparent area on the background layer when the foreground layer and the background layer are positioned at a first relative position, and the first user interface does not comprise image contents except for overlapping parts of the foreground layer and the background layer, the foreground layer and the background layer are in one-to-one correspondence, and the background layer is determined based on the identification of the background layer corresponding to the foreground layer;
controlling the foreground image layer and the background image layer to generate relative displacement;
and displaying a second user interface, wherein the second user interface comprises image contents which are jointly displayed by the second background contents of the transparent area and the foreground contents of the non-transparent area on the background image layer when the foreground image layer and the background image layer are positioned at a second relative position, and the second user interface does not comprise the image contents except for the overlapping part of the foreground image layer and the background image layer.
2. The method of claim 1, wherein said controlling the relative displacement of the foreground layer and the background layer comprises:
obtaining a displacement instruction, wherein the displacement instruction is triggered by a user or generated by program codes;
and controlling at least one of the foreground image layer and the background image layer to displace in response to the displacement distance of the displacement instruction.
3. The method of claim 2, wherein the foreground layer is larger in size than the background layer; the controlling the displacement of at least one of the foreground layer and the background layer includes:
and controlling the background image layer to shift relative to the foreground image layer.
4. A method according to claim 3, wherein said controlling the displacement of the background layer relative to the foreground layer comprises:
controlling the background layer to displace according to a first direction, wherein the first direction is the same as the displacement direction of the displacement instruction;
or,
controlling the background layer to displace according to a second direction, wherein the second direction is opposite to the displacement direction of the displacement instruction;
or,
Controlling the background layer to displace according to a third direction, wherein the third direction is perpendicular to the displacement direction of the displacement instruction;
or,
controlling the background layer to displace according to a fourth direction, wherein the fourth direction and the displacement direction of the displacement instruction have an included angle, and the included angle is an acute angle;
or,
and controlling the background layer to displace according to a preset curve track.
5. The method of claim 2, wherein the foreground layer is smaller in size than the background layer; the controlling the displacement of at least one of the foreground layer and the background layer includes:
and controlling the foreground image layer to shift relative to the background image layer.
6. The method of claim 5, wherein said controlling the displacement of the foreground layer relative to the background layer comprises:
controlling the foreground layer to displace according to a first direction, wherein the first direction is the same as the displacement direction of the displacement instruction;
or,
controlling the foreground layer to displace according to a second direction, wherein the second direction is opposite to the displacement direction of the displacement instruction;
or,
controlling the foreground layer to displace according to a third direction, wherein the third direction is perpendicular to the displacement direction of the displacement instruction;
or,
controlling the foreground image layer to displace according to a fourth direction, wherein the fourth direction and the displacement direction of the displacement instruction have an included angle, and the included angle is an acute angle;
or,
and controlling the foreground layer to displace according to a preset curve track.
7. The method according to any one of claims 1 to 6, wherein,
the foreground content is main display content, and the first background content and the second background content are auxiliary display content;
or,
the foreground content is auxiliary display content, and the first background content and the second background content are main display content.
8. The method according to any one of claims 1 to 6, wherein,
the display elements in at least two positions in the background layer are different, and the display elements comprise: shape, pattern, color, and combinations thereof.
9. A display device of a user interface, the user interface comprising: a foreground layer and a background layer, the foreground layer including transparent regions and non-transparent regions thereon, the apparatus comprising:
the display module is used for displaying a first user interface, wherein the first user interface comprises image contents which are jointly displayed by a first background content of the transparent area and a foreground content of the non-transparent area on the background layer when the foreground layer and the background layer are positioned at a first relative position, and the first user interface does not comprise the image contents except for the overlapping part of the foreground layer and the background layer, wherein the foreground layer and the background layer are in one-to-one correspondence, and the background layer is determined based on the identification of the background layer corresponding to the foreground layer;
The control module is used for controlling the relative displacement of the foreground image layer and the background image layer;
the display module is further configured to display a second user interface, where the second user interface includes image content that is jointly displayed by the second background content of the transparent area and the foreground content of the non-transparent area on the background layer when the foreground layer and the background layer are located at a second relative position, and the second user interface does not include image content outside an overlapping portion of the foreground layer and the background layer.
10. The apparatus of claim 9, wherein the apparatus further comprises: an acquisition module;
the acquisition module is used for acquiring a displacement instruction, wherein the displacement instruction is triggered by a user or generated by a program code;
the control module is further used for responding to the displacement distance of the displacement instruction and controlling at least one layer of the foreground layer and the background layer to displace.
11. The apparatus of claim 10, wherein the foreground layer is larger in size than the background layer;
the control module is also used for controlling the displacement of the background image layer relative to the foreground image layer.
12. The apparatus of claim 10, wherein the foreground layer is smaller in size than the background layer;
the control module is also used for controlling the displacement of the foreground image layer relative to the background image layer.
13. The apparatus according to claim 12, wherein
The control module is further used for controlling the foreground image layer to displace according to a first direction, and the first direction is the same as the displacement direction of the displacement instruction;
or,
the control module is further used for controlling the foreground image layer to displace according to a second direction, and the second direction is opposite to the displacement direction of the displacement instruction;
or,
the control module is further used for controlling the foreground image layer to displace according to a third direction, and the third direction is perpendicular to the displacement direction of the displacement instruction;
or,
the control module is further used for controlling the foreground image layer to displace according to a fourth direction, an included angle is formed between the fourth direction and the displacement direction of the displacement instruction, and the included angle is an acute angle;
or,
the control module is also used for controlling the foreground layer to displace according to a preset curve track.
14. A computer device comprising a processor and a memory, wherein the memory stores at least one instruction, at least one program, a set of codes, or a set of instructions, the at least one instruction, the at least one program, the set of codes, or the set of instructions being loaded and executed by the processor to implement a method of displaying a user interface as claimed in any one of claims 1 to 8.
15. A computer readable storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, the at least one instruction, the at least one program, the set of codes, or the set of instructions being loaded and executed by a processor to implement the method of displaying a user interface as claimed in any one of claims 1 to 8.
CN201911183743.6A 2019-11-27 2019-11-27 User interface display method, device, equipment and medium Active CN110928464B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911183743.6A CN110928464B (en) 2019-11-27 2019-11-27 User interface display method, device, equipment and medium


Publications (2)

Publication Number Publication Date
CN110928464A CN110928464A (en) 2020-03-27
CN110928464B true CN110928464B (en) 2023-08-25

Family

ID=69847553

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911183743.6A Active CN110928464B (en) 2019-11-27 2019-11-27 User interface display method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN110928464B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111782114B (en) * 2020-06-15 2022-03-01 广州视源电子科技股份有限公司 Element display method, device, equipment and medium in manuscript editing application
CN112579083B (en) * 2020-12-09 2024-05-17 京东科技控股股份有限公司 Image display method, device, electronic equipment and storage medium
CN113282258B (en) * 2021-05-28 2023-08-15 武汉悦学帮网络技术有限公司 Information display method and device
CN114322409B (en) * 2021-06-23 2023-09-19 海信视像科技股份有限公司 Refrigerator and method for displaying indoor scenery pictures

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000032249A (en) * 1998-07-14 2000-01-28 Tatsumi Denshi Kogyo Kk Video printing device, and printing paper for the video printing device
WO2000010157A1 (en) * 1998-08-11 2000-02-24 Play, Inc. System and method for refracting a background image through a foreground object on a computer display
WO2001033450A1 (en) * 1999-10-30 2001-05-10 Son Young Cherl Method and system for advertisement using animation-character
CN102385477A (en) * 2010-09-03 2012-03-21 Lg电子株式会社 Method for providing user interface based on multiple displays and mobile terminal using the same
EP2525580A2 (en) * 2011-05-20 2012-11-21 EchoStar Technologies L.L.C. Dynamically configurable 3D display
CN105243268A (en) * 2015-09-18 2016-01-13 网易(杭州)网络有限公司 Game map positioning method and apparatus as well as user terminal
CN106814886A (en) * 2015-11-30 2017-06-09 阿里巴巴集团控股有限公司 The methods of exhibiting and device of banner banner pictures
CN110175065A (en) * 2019-05-29 2019-08-27 广州视源电子科技股份有限公司 A kind of display methods of user interface, device, equipment and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1199868B1 (en) * 2000-10-16 2005-12-07 Sony Deutschland Gmbh Automatic selection of a background image for a display on a mobile telephone
US20090027418A1 (en) * 2007-07-24 2009-01-29 Maru Nimit H Map-based interfaces for storing and locating information about geographical areas
US9727226B2 (en) * 2010-04-02 2017-08-08 Nokia Technologies Oy Methods and apparatuses for providing an enhanced user interface
KR101314679B1 (en) * 2011-09-14 2013-10-07 LG Electronics Inc. Mobile terminal and method for operation control
GB2534847A (en) * 2015-01-28 2016-08-10 Sony Computer Entertainment Europe Ltd Display

Also Published As

Publication number Publication date
CN110928464A (en) 2020-03-27

Similar Documents

Publication Publication Date Title
US11538501B2 (en) Method for generating video, and electronic device and readable storage medium thereof
CN110928464B (en) User interface display method, device, equipment and medium
US10712938B2 (en) Portable device and screen display method of portable device
CN109977333B (en) Webpage display method and device, computer equipment and storage medium
CN110083282B (en) Man-machine interaction method, device, terminal and medium based on information display page
CN108415705B (en) Webpage generation method and device, storage medium and equipment
JP2021516818A (en) Application program display adaptation method and its devices, terminals, storage media, and computer programs
CN110047152B (en) Object construction method and device based on virtual environment and readable storage medium
CN109948581B (en) Image-text rendering method, device, equipment and readable storage medium
CN112230914B (en) Method, device, terminal and storage medium for producing small program
US20150063785A1 (en) Method of overlappingly displaying visual object on video, storage medium, and electronic device
CN104133632B (en) Portable terminal and method for protecting displayed object
CN113127130B (en) Page jump method, device and storage medium
CN111459363B (en) Information display method, device, equipment and storage medium
CN111694478A (en) Content display method, device, terminal and storage medium
CN114546545B (en) Image-text display method, device, terminal and storage medium
CN112257006A (en) Page information configuration method, device, equipment and computer readable storage medium
CN113609358B (en) Content sharing method, device, electronic equipment and storage medium
CN113032590B (en) Special effect display method, device, computer equipment and computer readable storage medium
CN113377270B (en) Information display method, device, equipment and storage medium
CN112612405B (en) Window display method, device, equipment and computer readable storage medium
CN109032492B (en) Song cutting method and device
CN110889060A (en) Webpage display method and device, computer equipment and storage medium
CN116828207A (en) Image processing method, device, computer equipment and storage medium
CN115379274B (en) Picture-based interaction method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40022653

Country of ref document: HK

SE01 Entry into force of request for substantive examination
GR01 Patent grant