CN110928464A - User interface display method, device, equipment and medium - Google Patents

User interface display method, device, equipment and medium

Info

Publication number
CN110928464A
Authority
CN
China
Prior art keywords
layer
foreground
background
displacement
content
Prior art date
Legal status
Granted
Application number
CN201911183743.6A
Other languages
Chinese (zh)
Other versions
CN110928464B (en)
Inventor
李烈强
谢天
林晓文
骆玘
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN201911183743.6A
Publication of CN110928464A
Application granted
Publication of CN110928464B
Legal status: Active
Anticipated expiration


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 - Arrangements for program control, e.g. control units
    • G06F 9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 - Arrangements for executing specific programs
    • G06F 9/451 - Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application discloses a user interface display method, apparatus, device, and medium, relating to the field of virtual worlds. The method includes the following steps: displaying a first user interface, where the first user interface includes image content that is presented, when the foreground layer and the background layer are located at a first relative position, by the first background content of the background layer shown through the transparent area together with the foreground content of the non-transparent area; controlling relative displacement of the foreground layer and the background layer; and displaying a second user interface, where the second user interface includes image content that is presented, when the foreground layer and the background layer are located at a second relative position, by the second background content of the background layer shown through the transparent area together with the foreground content of the non-transparent area. The method and apparatus solve the problems in the related art that the large data volume of dynamic images makes the user interface slow to load and wastes traffic.

Description

User interface display method, device, equipment and medium
Technical Field
The embodiment of the application relates to the field of computers, in particular to a display method, a display device, display equipment and a display medium of a user interface.
Background
A user interface (UI) is a human-computer interaction portal provided by an application. A user interface may include images, text, video, audio, and so on. Images are a flat medium with rich expressive content and are often used as advertisement carriers in a user interface.
In the related art, one or more images are arranged in a user interface, each image occupies an area with a fixed size, and text information exists among different images. The user may scroll up and down the user interface to view different content on the user interface.
When the images on a user interface are advertisement images, many users habitually ignore them. To attract the user's attention, many advertisements use dynamic images, but the large data volume of dynamic images makes the user interface slow to load and wastes traffic.
Disclosure of Invention
The embodiments of the present application provide a user interface display method, apparatus, device, and medium, which can solve the technical problems in the related art that the large data volume of dynamic images makes the user interface slow to load and wastes traffic. The technical solution is as follows:
in one aspect, a method for displaying a user interface is provided, where the user interface includes a foreground layer and a background layer, and the foreground layer includes a transparent area and a non-transparent area. The method includes:
displaying a first user interface, where the first user interface includes image content that is presented, when the foreground layer and the background layer are located at a first relative position, by the first background content of the background layer shown through the transparent area together with the foreground content of the non-transparent area;
controlling relative displacement of the foreground layer and the background layer; and
displaying a second user interface, where the second user interface includes image content that is presented, when the foreground layer and the background layer are located at a second relative position, by the second background content of the background layer shown through the transparent area together with the foreground content of the non-transparent area.
In another aspect, a display apparatus for a user interface is provided, where the user interface includes a foreground layer and a background layer, and the foreground layer includes a transparent area and a non-transparent area. The apparatus includes:
a display module, configured to display a first user interface, where the first user interface includes image content that is presented, when the foreground layer and the background layer are located at a first relative position, by the first background content of the background layer shown through the transparent area together with the foreground content of the non-transparent area;
a control module, configured to control relative displacement of the foreground layer and the background layer;
the display module being further configured to display a second user interface, where the second user interface includes image content that is presented, when the foreground layer and the background layer are located at a second relative position, by the second background content of the background layer shown through the transparent area together with the foreground content of the non-transparent area.
In another aspect, a computer device is provided, including a processor and a memory, where the memory stores at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the user interface display method described above.
In another aspect, a computer-readable storage medium is provided, storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to implement the user interface display method described above.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
the image is set to comprise a foreground image layer and a background image layer, wherein the foreground image layer comprises a transparent area and a non-transparent area, the relative position of the two image layers is different by controlling the relative displacement between the two image layers, and the background image layer is different by penetrating through the background content of the transparent area and the image content formed by the foreground content of the non-transparent area. The method can attract users to check, reduce the image data volume, accelerate the image loading speed and save the flow.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below are only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a block diagram of a computer system provided in an exemplary embodiment of the present application;
fig. 2 is a schematic structural diagram of a terminal provided in an exemplary embodiment of the present application;
FIG. 3 is a flowchart of a method for displaying a user interface provided by an exemplary embodiment of the present application;
FIG. 4 is a schematic diagram of layers provided by another exemplary embodiment of the present application;
FIG. 5 is a flowchart of a method for displaying a user interface provided by another exemplary embodiment of the present application;
FIG. 6 is a schematic diagram of a foreground layer and a background layer provided by another exemplary embodiment of the present application;
FIG. 7 is an interface diagram in one exemplary embodiment of a method for displaying a user interface provided by another exemplary embodiment of the present application;
FIG. 8 is an interface diagram in one exemplary embodiment of a method for displaying a user interface provided by another exemplary embodiment of the present application;
FIG. 9 is an interface diagram in one exemplary embodiment of a method for displaying a user interface provided by another exemplary embodiment of the present application;
FIG. 10 is an interface diagram in one exemplary embodiment of a method for displaying a user interface provided by another exemplary embodiment of the present application;
FIG. 11 is a schematic diagram of a foreground layer and a background layer provided by another exemplary embodiment of the present application;
FIG. 12 is an interface diagram in one exemplary embodiment of a method of displaying a user interface provided by another exemplary embodiment of the present application;
FIG. 13 is an interface diagram in one exemplary embodiment of a method for displaying a user interface provided by another exemplary embodiment of the present application;
FIG. 14 is a method flow diagram of a method of displaying a user interface provided by another exemplary embodiment of the present application;
FIG. 15 is a schematic diagram of a foreground layer and a background layer provided by another exemplary embodiment of the present application;
FIG. 16 is a schematic diagram of a foreground layer and a background layer provided by another exemplary embodiment of the present application;
FIG. 17 is a schematic diagram of a foreground layer and a background layer provided by another exemplary embodiment of the present application;
FIG. 18 is a schematic diagram of a foreground layer and a background layer provided by another exemplary embodiment of the present application;
FIG. 19 is a method flow diagram of a method of displaying a user interface provided by another exemplary embodiment of the present application;
FIG. 20 is an interface diagram in one exemplary embodiment of a method for displaying a user interface provided by another exemplary embodiment of the present application;
FIG. 21 is a schematic diagram of a foreground layer and a background layer provided by another exemplary embodiment of the present application;
FIG. 22 is an interface diagram in one exemplary embodiment of a method of displaying a user interface provided by another exemplary embodiment of the present application;
FIG. 23 is a schematic diagram of a foreground layer and a background layer provided by another exemplary embodiment of the present application;
FIG. 24 is an interface diagram in one exemplary embodiment of a method of displaying a user interface provided by another exemplary embodiment of the present application;
FIG. 25 is a schematic diagram of a foreground layer and a background layer provided by another exemplary embodiment of the present application;
FIG. 26 is an interface diagram in one exemplary embodiment of a method of displaying a user interface provided by another exemplary embodiment of the present application;
FIG. 27 is a schematic illustration of a foreground layer and a background layer provided by another exemplary embodiment of the present application;
FIG. 28 is an interface diagram in one exemplary embodiment of a method of displaying a user interface provided by another exemplary embodiment of the present application;
FIG. 29 is a schematic diagram of a foreground layer and a background layer provided by another exemplary embodiment of the present application;
FIG. 30 is a block diagram of a display device of a user interface provided in another exemplary embodiment of the present application;
fig. 31 is a block diagram of a terminal provided in an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions, and advantages of the present application clearer, embodiments of the present application are described in further detail below with reference to the accompanying drawings.
Referring to fig. 1, a schematic diagram of an implementation environment provided by an embodiment of the present application is shown. The implementation environment may include: a terminal 10 and a server 20.
The terminal 10 may be an electronic device such as a mobile phone, a desktop computer, a tablet computer, a game console, an electronic book reader, a multimedia playing device, a wearable device, and the like. The terminal 10 may be installed with a client of an application program capable of browsing pictures, such as a client of an application program capable of browsing pictures of commodities.
The server 20 provides background services for clients of applications (e.g., applications capable of picture browsing) in the terminal 10. For example, the server 20 may be a background server for the above-described applications. The server 20 may be a single server, a server cluster composed of multiple servers, or a cloud computing service center.
The terminal 10 and the server 20 can communicate with each other through the network 30. The network 30 may be a wired network or a wireless network.
Illustratively, a client of an application program capable of browsing pictures (e.g., a browser) is installed in the terminal 10. When the terminal 10 receives a user instruction to browse pictures, it may send a picture-acquisition request to the server 20 via the network 30. After receiving the request, the server 20 selects a certain number of pictures from those it stores and sends them to the terminal 10 via the network 30. The terminal 10 then displays the pictures in its interface for the user to browse, completing the picture loading process.
In the embodiment of the method, the execution subject of each step may be a terminal. Please refer to fig. 2, which illustrates a schematic structural diagram of a terminal according to an embodiment of the present application. The terminal 10 may include: a main board 110, an external input/output device 120, a memory 130, an external interface 140, a touch system 150, and a power supply 160.
The main board 110 has integrated therein processing elements such as a processor and a controller.
The external input/output device 120 may include a display component (e.g., a display screen), a sound playing component (e.g., a speaker), a sound collecting component (e.g., a microphone), various keys, and the like.
The memory 130 has program codes and data stored therein.
The external interface 140 may include a headset interface, a charging interface, a data interface, and the like.
The touch system 150 may be integrated into a display component or a key of the external input/output device 120, and the touch system 150 is used to detect a touch operation performed by a user on the display component or the key.
The power supply 160 is used to power the various other components in the terminal 10.
In this embodiment, the processor in the main board 110 may generate a user interface (e.g., a picture interface) by executing or calling the program codes and data stored in the memory, and display the generated user interface (e.g., the picture interface) through the external input/output device 120. In the process of displaying the user interface (e.g., the picture interface), the touch system 150 may detect a touch operation performed when the user interacts with the user interface (e.g., the picture interface), and respond to the touch operation.
With reference to the above description of the implementation environment, the method for displaying a user interface provided in the embodiments of the present application is described below; the execution subject of the method is illustrated as the terminal shown in fig. 1. The terminal runs an application program that supports picture browsing.
Fig. 3 is a flowchart of a method for displaying a user interface according to an exemplary embodiment of the present application. The execution subject of the method is illustrated as the terminal 10 shown in fig. 1, in which an application program supporting picture browsing runs. The user interface includes a foreground layer and a background layer, where the foreground layer includes a transparent area and a non-transparent area. The method includes the following steps.
Step 101: display a first user interface, where the first user interface includes image content that is presented, when the foreground layer and the background layer are located at a first relative position, by the first background content of the background layer shown through the transparent area together with the foreground content of the non-transparent area.
The terminal displays a first user interface.
The user interface corresponds to an application program, a web page, or an operating system in the terminal. Illustratively, images are displayed on the user interface. For example, the user may perform a displacement operation on the user interface; the displacement operation includes at least one of sliding, scrolling, clicking, double-clicking, and pressing. Illustratively, the user interface may also be an interface that cannot receive user displacement operations.
Taking an application program capable of advertisement browsing as an example, the user interface may be an interface for displaying advertisements, and the user interface is used for presenting the content of the advertisements to the user, for example, the user interface may include a plurality of advertisements, and the brief descriptions of the advertisements, such as the title, picture, source, time, brief description of the content, and the like.
Illustratively, the user interface includes a foreground layer and a background layer. Layers are the parts that make up an image; layers are superimposed in sequence to form the image. For example, each layer has display elements, and the display elements on different layers can be edited separately without affecting one another. For example, when a plurality of layers are stacked from top to bottom, the layer below can be shown through the transparent area of the layer above. For example, if the layer above is fully transparent, the layer below is shown completely through it; if the layer above is transparent on its left half and opaque on its right half, the half of the layer below under the transparent left half is shown and the other half is not; if the layer above is opaque, the layer below is not shown at all.
For example, as shown in FIG. 4, there are two layers of the same size: a first image layer 401 and a second image layer 402. The first image layer includes a square transparent area 403, and all areas except the transparent area 403 in the first image layer 401 are opaque areas. The second layer 402 is an opaque layer. The first layer 401 is placed above the second layer 402 and superimposed together, so that the second layer 402 can be partially displayed through the transparent area 403 on the first layer 401, and finally a complete image 404 is formed.
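The superimposition described above can be sketched in a few lines. This is a minimal illustration, not the patent's implementation: layers are modeled as grids of placeholder pixels, and `None` in the upper layer stands for a transparent pixel through which the lower layer shows.

```python
# Minimal sketch of layer superimposition: the lower layer shows
# through wherever the upper layer is transparent (None).
# Layer contents are illustrative placeholders, not from the patent.

def composite(upper, lower):
    """Overlay `upper` on `lower`; None in `upper` means transparent."""
    return [
        [u if u is not None else l for u, l in zip(u_row, l_row)]
        for u_row, l_row in zip(upper, lower)
    ]

# First layer: opaque 'F' pixels with a transparent (None) column,
# analogous to the transparent area 403 in fig. 4.
first_layer = [['F', None, 'F'],
               ['F', None, 'F']]
# Second layer: fully opaque, analogous to layer 402.
second_layer = [['a', 'b', 'c'],
                ['d', 'e', 'f']]

image = composite(first_layer, second_layer)
print(image)  # [['F', 'b', 'F'], ['F', 'e', 'F']]
```

The lower layer's `'b'` and `'e'` pixels are visible only through the transparent column, mirroring how layer 402 shows partially through area 403 to form the complete image 404.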
The foreground layer is a layer located above the background layer. That is, the background layer is superimposed below the foreground layer in the superimposing order. Illustratively, the foreground image layer includes a transparent area and a non-transparent area.
The transparent area is an area where the layer located below can be displayed. The non-transparent area is an area where the layer below cannot be displayed.
For example, the transparent area on the foreground layer is an area through which the background layer can show, and the non-transparent area is an area through which it cannot. Illustratively, the transparency of the transparent area may be arbitrary; that is, the transparent area may be translucent, fully transparent, or of any transparency in between. Illustratively, the size of the transparent area on the foreground layer is smaller than the size of the background layer. That is, the image formed by superimposing the foreground layer and the background layer necessarily includes display elements from both layers; in other words, in the superimposed image, part of the background layer is necessarily occluded by the non-transparent area of the foreground layer.
The background layer is a layer located below the foreground layer. I.e. the foreground layer is superimposed over the background layer in the order of superimposition. Illustratively, the background layer is an opaque layer. Alternatively, the background layer is a layer having opaque regions.
Illustratively, the foreground layer and the background layer are used to present image content.
Illustratively, the foreground layer and the background layer have the same size, or the foreground layer and the background layer have different sizes. Illustratively, the shapes of the foreground layer and the background layer are arbitrary, and may be the same or different.
For example, if the foreground layer and the background layer have different sizes, the overlapping area of the foreground layer and the background layer forms image content, and the non-overlapping area does not form image content. For example, if the size of the foreground layer is 1cm × 1cm, the size of the background layer is 2cm × 2cm, and the size of the overlapping area between the foreground layer and the background layer is 1cm × 1cm, the overlapping area is the image content, and the user interface displays only the overlapping area.
For example, if the foreground layer and the background layer have different sizes, a partial area of an overlapping area of the foreground layer and the background layer constitutes image content, and other areas do not constitute image content. For example, if the size of the foreground layer is 1cm × 1cm, the size of the background layer is 2cm × 2cm, and the size of the overlapping area between the foreground layer and the background layer is 1cm × 1cm, the image content is a partial area in the overlapping area, for example, the image content is an area in the overlapping area with a size of 0.5cm × 0.5cm, and the user interface displays only the area.
For example, if the foreground layer and the background layer have different sizes, the image content may also be all image contents obtained by superimposing the foreground layer and the background layer. For example, the size of the foreground layer is 1cm × 1cm, the size of the background layer is 2cm × 2cm, the image content may be a 2cm × 2cm image obtained by superimposing the foreground layer and the background layer, and the user interface displays the 2cm × 2cm image content.
For example, in the embodiment of the present application, the foreground layer and the background layer are different in size.
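The size arithmetic in the examples above can be made concrete. The following is a hedged sketch under the assumption that layers are axis-aligned rectangles; the function name and the (x, y) offset convention are illustrative, not from the patent.

```python
# Sketch of the overlap computation: given the two layer sizes and the
# foreground layer's offset within the background layer, compute the
# overlapping area that can form image content. Units (cm in the text)
# are arbitrary here.

def overlap_size(fg_size, bg_size, offset=(0, 0)):
    """Return (width, height) of the overlap of a foreground layer
    placed at `offset` relative to the background layer's top-left."""
    fw, fh = fg_size
    bw, bh = bg_size
    ox, oy = offset
    w = max(0, min(ox + fw, bw) - max(ox, 0))
    h = max(0, min(oy + fh, bh) - max(oy, 0))
    return (w, h)

# A 1x1 foreground layer inside a 2x2 background layer overlaps it
# over a 1x1 area, as in the examples above.
print(overlap_size((1, 1), (2, 2)))  # (1, 1)
```

Depending on the embodiment, the displayed image content may be this overlap, a sub-region of it, or the full union of both layers.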
The relative position refers to the position of the foreground layer and the background layer with respect to each other. For example, the relative position may refer to the position of the foreground layer relative to the background layer, or the position of the background layer relative to the foreground layer. For example, when the background layer is larger than the foreground layer, the relative position of the two layers may be: the foreground layer is located in the lower right corner of the background layer, or the foreground layer is located in the middle of the background layer. When the background layer is smaller than the foreground layer, the relative position may be: the background layer is located in the upper half of the foreground layer, or the background layer is located in the left half of the foreground layer.
The first relative position is one relative position of the foreground layer and the background layer.
The first background content is content displayed in the background layer through the transparent area of the foreground layer. For example, when the relative positions of the foreground layer and the background layer are different, the content displayed by the background layer through the transparent area of the foreground layer is different.
For example, the foreground content in the non-transparent area may be content included in all the non-transparent areas in the foreground image layer, or may be content included in part of the non-transparent areas in the foreground image layer.
Illustratively, the image content includes both the content in the foreground layer and the content in the background layer.
Illustratively, the content or display elements include shapes, patterns, colors, and combinations thereof.
Illustratively, the size of the transparent area is smaller than the size of the background layer, or the size of the transparent area is larger than the size of the background layer, or the size of the transparent area is equal to the size of the background layer.
For example, when the size of the background layer is smaller than that of the transparent area, a blank layer may be added behind the background layer, and the blank layer fills a blank portion of the transparent area except for the background layer. Or, the transparent area is a semi-transparent area, and the background layer is displayed through the semi-transparent area.
Step 102: control relative displacement of the foreground layer and the background layer.
And the terminal controls the relative displacement of the foreground image layer and the background image layer.
Illustratively, the terminal controls the relative displacement between the foreground layer and the background layer, that is, the terminal changes the relative position between the foreground layer and the background layer.
Illustratively, the terminal moves the relative position of the foreground image layer and the background image layer from a first relative position to a second relative position.
Relative displacement is relative movement between the two layers that changes their relative position. Illustratively, relative displacement refers to movement in the horizontal dimension, parallel to the plane of the foreground layer and the background layer. Illustratively, the direction and manner of the relative displacement are arbitrary.
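Step 102 can be sketched as simple offset arithmetic. This is a minimal illustration under the assumption that the relative position is modeled as an (x, y) offset of the foreground layer over the background layer; the names are hypothetical, not from the patent.

```python
# Sketch of step 102: relative displacement moves the (x, y) offset of
# the foreground layer over the background layer in the layer plane.
# Names and units are illustrative, not from the patent.

def displace(position, delta):
    """Apply a relative displacement `delta` to a relative `position`."""
    return (position[0] + delta[0], position[1] + delta[1])

first_position = (0, 0)                             # first relative position
second_position = displace(first_position, (0, 5))  # move 5 units vertically
print(second_position)  # (0, 5)
```

Re-rendering the superimposed layers at the new offset is what turns the first user interface into the second one.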
Step 103: display a second user interface, where the second user interface includes image content that is presented, when the foreground layer and the background layer are located at a second relative position, by the second background content of the background layer shown through the transparent area together with the foreground content of the non-transparent area.
The terminal displays a second user interface.
The second relative position is a relative position different from the first relative position.
The second background content is content on the background layer that differs from the first background content.
Illustratively, the image content displayed on the first user interface and the second user interface is different.
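Why the two interfaces show different image content can be illustrated in one dimension. In this hedged sketch, the transparent area acts as a fixed-width window, and displacing the layers changes which slice of the background layer shows through; the contents are placeholders, not from the patent.

```python
# One-dimensional sketch: the transparent area is a fixed-width window
# over the background layer, and relative displacement shifts which
# background content is visible through it.

def background_through_window(background, window_offset, window_width):
    """Background content visible through the transparent window."""
    return background[window_offset:window_offset + window_width]

background = list("ABCDEFGH")
first_content = background_through_window(background, 0, 3)
second_content = background_through_window(background, 2, 3)
print(first_content)   # ['A', 'B', 'C']  (first relative position)
print(second_content)  # ['C', 'D', 'E']  (after relative displacement)
```

The foreground content stays fixed while the window's contents change, which is why the first and second user interfaces display different image content from the same two static layers.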
Illustratively, the foreground content is the main display content and the first and second background content are auxiliary display content; or the foreground content is auxiliary display content and the first and second background content are the main display content.
Illustratively, the main display content and the auxiliary display content together constitute the image content. For example, the main content of the image may be on the foreground layer or on the background layer.
In summary, in the method provided in this embodiment, the image is set to include two layers, a foreground layer and a background layer, where the foreground layer includes a transparent area and a non-transparent area. By controlling the relative displacement between the two layers, the relative positions of the two layers differ, and the image content formed by the background content of the background layer shown through the transparent area together with the foreground content of the non-transparent area differs accordingly. This can attract users to view the image, reduce the image data volume, speed up image loading, and save traffic.
For example, this application further provides an exemplary embodiment of controlling relative displacement between a foreground layer and a background layer.
Fig. 5 is a flowchart of a method for displaying a user interface according to an exemplary embodiment of the present application. The execution subject of the method is illustrated as the terminal 10 shown in fig. 1. The method includes the following steps.
Step 101: display a first user interface, where the first user interface includes the image content jointly presented, when the foreground layer and the background layer are located at a first relative position, by the first background content of the background layer showing through the transparent area and the foreground content of the non-transparent area.
Illustratively, as shown in fig. 6, there are a foreground layer 501 and a background layer 502. The foreground layer 501 includes a transparent area 503 and a non-transparent area 504. Schematically, in fig. 6, the cola bottle on the foreground layer 501 is the transparent area 503, and the shaded portion is the non-transparent area 504. The background layer 502 is an opaque layer whose upper half is white and whose lower half is a shaded area. Superimposing the foreground layer 501 and the background layer 502 forms the image shown in fig. 7. Fig. 7 shows the image content jointly presented, when the foreground layer 501 and the background layer 502 are located at the first relative position, by the first background content 505 of the background layer 502 showing through the transparent area 503 and the foreground content of the non-transparent area 504. The first background content 505 is the portion of the white area and the portion of the shaded area of the background layer 502 that show through the transparent area 503.
Illustratively, fig. 7 shows a first user interface on which the foreground layer and the background layer are displayed in full. As another example, fig. 8 shows a different first user interface, on which only the image content of the overlapping portion of the foreground layer and the background layer is displayed, and the remaining portions are not displayed.
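The way the background content shows through the transparent area can be sketched as simple compositing over pixel grids. This is an illustrative model only, not part of the patent: `None` marks a transparent foreground cell through which the background shows, and any other value is non-transparent foreground content that hides the background.

```python
def composite(foreground, background):
    """Overlay a foreground grid on a background grid of the same size.

    A foreground cell of None is a transparent-area pixel, so the
    background content shows through; any other value is non-transparent
    foreground content and covers the background.
    """
    return [
        [bg if fg is None else fg for fg, bg in zip(f_row, b_row)]
        for f_row, b_row in zip(foreground, background)
    ]

# The visible image depends on which background content lies under the
# transparent cells, which is exactly what changes when the layers shift.
fg = [["F", None],
      [None, "F"]]
bg = [["a", "b"],
      ["c", "d"]]
print(composite(fg, bg))  # [['F', 'b'], ['c', 'F']]
```

Shifting either grid before compositing would place different background cells under the `None` positions, yielding the different first and second image contents described above.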
Step 1021, a displacement instruction is obtained, wherein the displacement instruction is triggered by a user or generated by a program code.
And the terminal acquires a displacement instruction.
The displacement instruction is an instruction for controlling the relative displacement of the foreground image layer and the background image layer.
Illustratively, the displacement instructions are user-triggered. Illustratively, when the terminal receives a displacement operation of a user, a displacement instruction is generated, and the relative displacement between the foreground image layer and the background image layer is controlled according to the displacement instruction. Illustratively, the displacement operation of the user may be at least one of sliding, scrolling, clicking, double clicking, and pressing.
Illustratively, the displacement instructions may also be generated by program code. For example, program code is used to control the foreground layer to automatically move upward from below the background layer.
Step 1022: in response to the displacement instruction, control at least one of the foreground layer and the background layer to displace according to the displacement distance.
The terminal controls at least one of the foreground layer and the background layer to displace according to the displacement distance in the displacement instruction.
Illustratively, the displacement instruction includes a displacement direction and a displacement distance, and the terminal controls the foreground image layer and the background image layer to generate relative displacement according to the displacement direction and the displacement distance.
For example, the terminal may control the foreground layer to be stationary and the background layer to be moved according to the displacement instruction.
Or, the terminal can control the background layer to be stationary and the foreground layer to be moved according to the displacement instruction.
Or, the terminal may control both the background layer and the foreground layer to move according to the displacement instruction. Illustratively, the moving directions of the background layer and the foreground layer may be the same, opposite, or at any angle to each other. For example, when the moving directions of the two layers are the same, their moving speeds or moving distances differ, so that the relative position of the background layer and the foreground layer still changes after the movement.
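One way to realize the three movement modes above (only the foreground layer moves, only the background layer moves, or both move) is to keep a position offset per layer and apply the instruction's displacement to whichever layers are marked movable. The following sketch is illustrative only; the `Layer` class and `apply_displacement` function are assumed names, not part of the patent:

```python
class Layer:
    """A layer with a 2D offset and a flag saying whether it may move."""
    def __init__(self, x=0, y=0, movable=True):
        self.x, self.y = x, y
        self.movable = movable

def apply_displacement(foreground, background, dx, dy):
    """Shift every movable layer by the instruction's displacement.

    The relative position of the two layers changes as long as exactly
    one layer moves; if both move, they must move with different
    offsets or speeds for the relative position to change.
    """
    for layer in (foreground, background):
        if layer.movable:
            layer.x += dx
            layer.y += dy

# Mode 1: the foreground moves up with the user's slide; the background
# stays fixed, as in the cola-bottle example.
fg, bg = Layer(movable=True), Layer(movable=False)
apply_displacement(fg, bg, 0, -30)
print((fg.y, bg.y))  # (-30, 0)
```

Marking the other layer movable instead gives the background-only mode, and marking both movable (with different per-layer distances) gives the third mode.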
For example, the terminal controls only the foreground layer to move while the background layer stays still. As shown in fig. 9, the displacement instruction corresponds to an upward sliding operation of the user; the terminal controls the foreground layer 501 to move upward according to the displacement instruction while the background layer 502 is fixed, so that the foreground layer 501 and the background layer 502 change from the first relative position shown in part (1) of fig. 9 to the second relative position in part (2), and then to the third relative position in part (3). When the method is applied to an application program with an advertisement browsing function, fig. 10 corresponds to the layer moving process shown in fig. 9. As shown in fig. 10, the image content formed by overlapping the foreground layer and the background layer is the advertisement content. On the user interface, the advertisement content slides upward along with the user's upward sliding operation: the foreground layer slides upward with the operation while the background layer is fixed, achieving the effects of parts (1) to (2) to (3) in fig. 10 and simulating the cola level in the bottle falling from full, to partly filled, to empty.
For example, the terminal controls only the background layer to move while the foreground layer stays still. As shown in fig. 11, there are a foreground layer 506 and a background layer 507, where the foreground layer has a transparent area 508 and a non-transparent area 509. In fig. 11, the white area of the foreground layer is the transparent area, and the shaded areas corresponding to the three shoes are the non-transparent areas. The background layer 507 is an opaque layer, and the foreground layer 506 is located above it. After the foreground layer 506 and the background layer 507 are superimposed, the image content shown in fig. 12 is obtained. Fig. 12 shows a user interface obtained by applying the foreground layer 506 and the background layer 507 of fig. 11 to an application program with an advertisement browsing function. The image content formed by overlapping the two layers is displayed on the user interface; the displacement instruction is generated by program code and controls the foreground layer to stay fixed while the background layer moves upward, producing the effects of parts (1) to (2) to (3) to (4) in fig. 12.
For example, the terminal controls both the background layer and the foreground layer to move. As shown in fig. 13, the terminal controls the background layer 507 to move rightward and the foreground layer 506 to move downward according to the displacement instruction, so that the foreground layer 506 and the background layer 507 change from the first relative position shown in part (1) of fig. 13 to the second relative position shown in part (2).
For example, the moving distance of the foreground layer or the background layer is determined according to the displacement distance in the displacement instruction.
For example, when only one of the foreground image layer and the background image layer moves and the other image layer is stationary, the terminal controls the moving image layer to move by a distance equal to the displacement distance.
Illustratively, the size of the static layer is the same as the size of the user interface. For example, in an application program with an advertisement browsing function, the user can control the page to move upward by a distance x by scrolling the mouse wheel, and the terminal then controls one of the foreground layer and the background layer to move by the distance x.
Illustratively, when the size of the static layer is different from the size of the user interface, the terminal matches the static layer to the user interface by equal-ratio scaling. Here, matching means that the width of the static layer is made equal to the width of the user interface, or the length of the static layer is made equal to the length of the user interface. Illustratively, the terminal matches the static layer to the user interface according to the movable direction of the moving layer. For example, if the moving layer moves vertically, the terminal scales the static layer at an equal ratio so that its width equals the width of the user interface, in which case the length of the static layer may be slightly longer or shorter than the length of the user interface.
For example, since the length of the static layer may be slightly longer or shorter than the user interface, the static layer may not always be displayed under the transparent area when the moving layer moves to an arbitrary position on the user interface. To ensure that the static layer remains fully covered by the moving layer, the terminal also controls the static layer to move when the moving layer moves; the moving direction of the static layer is different from that of the moving layer, or is the same but at a different speed. For example, if the static layer is longer than the user interface, the terminal controls the static layer to move downward slowly when the moving layer moves upward; if the static layer is shorter than the user interface, the terminal controls the static layer to move upward slowly when the moving layer moves upward.
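The equal-ratio scaling described above can be sketched as follows. `fit_width` is a hypothetical helper, not named in the patent, that scales a layer uniformly so its width matches the user interface width; the height scales by the same ratio and may end up slightly longer or shorter than the interface, as the text notes:

```python
def fit_width(layer_w, layer_h, ui_w):
    """Scale a layer uniformly (equal ratio) so its width equals the
    user interface width. Returns the new (width, height).

    Because the scaling is uniform, the resulting height is generally
    not equal to the UI height, which is why the static layer may need
    a slow compensating movement while the moving layer moves.
    """
    scale = ui_w / layer_w
    return ui_w, layer_h * scale

# A 200x400 layer matched to a 100-wide UI becomes 100x200.
print(fit_width(200, 400, 100))  # (100, 200.0)
```

A vertically moving layer would use this width-matching variant; a horizontally moving layer would match the length instead, by the symmetric calculation.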
Step 103: display a second user interface, where the second user interface includes the image content jointly presented, when the foreground layer and the background layer are located at a second relative position, by the second background content of the background layer showing through the transparent area and the foreground content of the non-transparent area.
In summary, in the method provided in this embodiment, the terminal obtains a displacement instruction and controls at least one of the foreground layer and the background layer to displace according to the displacement distance in the instruction, so that the relative position of the two layers changes from the first relative position to the second relative position. By changing the relative positions of the foreground layer and the background layer, the image content shows different effects, which attracts users to view it, reduces the image data volume, accelerates image loading, and saves data traffic.
Fig. 14 is a flowchart of a method for displaying a user interface according to an exemplary embodiment of the present application. Taking the terminal shown in fig. 1 as the execution body of the method as an example, and compared with the exemplary embodiment shown in fig. 5, step 1022 of controlling at least one of the foreground layer and the background layer to displace may be replaced by steps 201 to 203.
Step 201: determine whether the size of the foreground layer is larger than that of the background layer.
The terminal determines whether the size of the foreground layer is larger than that of the background layer; if so, proceed to step 202; otherwise, proceed to step 203.
For example, only one of the foreground layer and the background layer moves, and the other layer is stationary. Illustratively, the terminal controls the layer with the smaller size in the foreground layer and the background layer to move, and the layer with the larger size is static.
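The size comparison of steps 201 to 203 can be sketched as below. The patent does not state how "size" is compared, so comparing areas here is an assumption for illustration; `pick_moving_layer` is likewise a hypothetical name:

```python
def pick_moving_layer(fg_size, bg_size):
    """Decide which layer moves: the smaller layer moves and the
    larger layer stays still (steps 201-203).

    fg_size and bg_size are (width, height) tuples; "size" is compared
    by area here, which is an illustrative assumption.
    """
    fg_area = fg_size[0] * fg_size[1]
    bg_area = bg_size[0] * bg_size[1]
    return "background" if fg_area > bg_area else "foreground"

# A large foreground over a small background: the background moves.
print(pick_moving_layer((400, 800), (200, 200)))  # background
# A small foreground over a large background: the foreground moves.
print(pick_moving_layer((100, 100), (300, 300)))  # foreground
```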
Step 202, controlling the displacement of the background layer relative to the foreground layer.
When the size of the foreground layer is larger than that of the background layer, the terminal controls the background layer to displace relative to the foreground layer.
Illustratively, the terminal controls the background layer to move, and the foreground layer is static.
For example, as shown in fig. 15, a foreground layer 601 is superimposed on a background layer 602, where the foreground layer 601 has transparent areas 503 of square, triangle, diamond shape, and non-transparent areas 504 except the transparent areas 503. Illustratively, in fig. 15, the white part of the foreground layer is a transparent area 503, and the shaded part is a non-transparent area 504. The background layer 602 is an opaque layer. And the terminal controls the background layer 602 to displace relative to the foreground layer 601 according to the displacement instruction.
For example, most of the foreground layer may be a non-transparent area with a small transparent area, or most of it may be a transparent area with a small non-transparent area. For example, as shown in fig. 16, there are non-transparent areas 504 of square, triangle and diamond shapes on the foreground layer 601, and the area other than the non-transparent areas 504 is the transparent area 503; the foreground layer 601 is superimposed on the background layer 602. The terminal controls the background layer 602 to displace relative to the foreground layer 601 according to the displacement instruction.
Illustratively, the background layer may move in any direction. For example, the background layers in fig. 15 and fig. 16 slide vertically; the background layer can also be controlled to slide laterally. As shown in fig. 17, a foreground layer 601 is superimposed on a background layer 602, where the foreground layer 601 has transparent areas 503 of square, triangle and diamond shapes and a non-transparent area 504 other than the transparent areas 503. Illustratively, in fig. 17, the white part of the foreground layer is the transparent area 503, and the shaded part is the non-transparent area 504. The background layer 602 is an opaque layer. The terminal controls the background layer 602 to slide laterally relative to the foreground layer 601 according to the displacement instruction.
And step 203, controlling the displacement of the foreground layer relative to the background layer.
When the size of the foreground layer is smaller than that of the background layer, the terminal controls the foreground layer to displace relative to the background layer.
Illustratively, the terminal controls the foreground layer to move and the background layer to be stationary.
For example, as shown in fig. 18, a foreground layer 601 is superimposed over a background layer 602, where the foreground layer 601 has a square transparent area 503 and a non-transparent area 504 other than the transparent area 503. Illustratively, in fig. 18, the white part of the foreground layer is the transparent area 503, and the shaded part is the non-transparent area 504. The background layer 602 is an opaque layer. The terminal controls the foreground layer 601 to displace relative to the background layer 602 according to the displacement instruction.
For example, most of the foreground layer may be a non-transparent area with a small transparent area, or most of it may be a transparent area with a small non-transparent area.
Illustratively, the foreground image layer may be moved in any direction.
In summary, in the method provided in this embodiment, the terminal compares the sizes of the foreground layer and the background layer, controls the smaller layer to move, and keeps the larger layer still. By changing the relative positions of the foreground layer and the background layer, the image content shows different effects, which attracts users to view it, reduces the image data volume, accelerates image loading, and saves data traffic.
Illustratively, the displacement instruction further includes a displacement direction, and the foreground image layer or the background image layer may move in any direction according to the displacement instruction.
Fig. 19 is a flowchart of a method for displaying a user interface according to an exemplary embodiment of the present application. Taking the terminal 10 shown in fig. 1 as the execution body of the method as an example, and unlike the exemplary embodiment shown in fig. 14, step 202 is replaced by steps 2021 to 2026, and step 203 is replaced by steps 2031 to 2036.
Step 2021: the size of the foreground layer is larger than that of the background layer.
Illustratively, when the size of the foreground layer is larger than that of the background layer, the terminal controls the background layer to displace relative to the foreground layer, and the displacement direction of the background layer may be any one of steps 2022 to 2026.
Step 2022, controlling the background layer to shift according to the first direction, where the first direction is the same as the shift direction of the shift instruction.
And the terminal controls the background layer to displace according to the first direction.
Illustratively, the displacement direction is also included in the displacement instruction. The displacement direction is a direction of a displacement operation by the user, or a displacement direction set in the program code.
Illustratively, as shown in fig. 20, there are a foreground layer 601 and a background layer 602, wherein the foreground layer 601 has non-transparent areas of squares, triangles, diamonds, and transparent areas except for the non-transparent areas. The background layer 602 is an opaque layer. If the direction 603 of the displacement in the displacement command is upward, the terminal may control the background layer to move toward the first direction 604, where the first direction 604 is the same direction as the direction 603 of the displacement.
Step 2023, controlling the background layer to displace according to a second direction, where the second direction is opposite to the displacement direction of the displacement instruction.
And the terminal controls the background layer to displace according to the second direction.
For example, as shown in fig. 20, if the displacement direction 603 in the displacement command is upward, the terminal may control the background layer to move to a second direction 605, where the second direction 605 is opposite to the displacement direction 603.
Step 2024, controlling the background image layer to displace according to a third direction, where the third direction is perpendicular to the displacement direction of the displacement instruction.
And the terminal controls the background layer to displace according to the third direction.
For example, as shown in fig. 20, if the displacement direction 603 in the displacement instruction is upward, the terminal may control the background layer to move to a third direction 606, where the third direction 606 is a direction perpendicular to the displacement direction 603. For example, when the foreground layer and the background layer are laterally slid layers as shown in fig. 17, the user slides up and down to control the background layer to move left and right.
Step 2025, controlling the background image layer to displace according to a fourth direction, where the fourth direction and a displacement direction of the displacement instruction form an included angle, and the included angle is an acute angle.
And the terminal controls the background layer to shift according to the fourth direction.
For example, as shown in fig. 20, if the displacement direction 603 in the displacement instruction is upward, the terminal may control the background layer to move in a fourth direction 607, where the fourth direction 607 forms an acute angle with the displacement direction 603. For example, if the user's sliding direction is upward, the terminal may control the background layer to move in a direction at forty-five degrees to the sliding direction.
For example, the fourth direction may also have an included angle with the displacement direction of the displacement instruction, and the included angle is an obtuse angle.
Step 2026, controlling the background map layer to shift according to the preset curve track.
And the terminal controls the background layer to shift according to a preset curve track.
For example, as shown in fig. 20, the terminal may control the background layer to shift along a preset curved track 608. Illustratively, the curved track may be of any shape, e.g., zigzag, undulating, helical, etc. For example, the terminal may determine the overall movement direction of the curved track according to the displacement direction; when the displacement direction is upward, the terminal controls the background layer to move upward along a zigzag track.
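The direction mapping of steps 2022 to 2025 can be sketched as a function from the instruction's displacement vector to the layer's movement vector. This is an illustrative sketch with assumed names; the curve-track case of step 2026 is omitted because it is a per-frame trajectory rather than a single vector:

```python
import math

def movement_vector(mode, dx, dy):
    """Map the instruction's displacement (dx, dy) to the layer's
    movement vector for steps 2022-2025.

    'same' follows the instruction, 'opposite' reverses it,
    'perpendicular' rotates it 90 degrees, and 'acute' rotates it
    45 degrees so the result forms an acute angle with the instruction.
    """
    if mode == "same":            # step 2022
        return dx, dy
    if mode == "opposite":        # step 2023
        return -dx, -dy
    if mode == "perpendicular":   # step 2024: rotate 90 degrees
        return -dy, dx
    if mode == "acute":           # step 2025: rotate 45 degrees
        c, s = math.cos(math.pi / 4), math.sin(math.pi / 4)
        return dx * c - dy * s, dx * s + dy * c
    raise ValueError(f"unknown mode: {mode}")

# An upward slide (0, -10) with the 'opposite' mode moves the layer down.
print(movement_vector("opposite", 0, -10))  # (0, 10)
```

The same mapping serves steps 2032 to 2035 for the foreground layer, since those steps mirror steps 2022 to 2025.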
Step 2031: the size of the foreground layer is smaller than that of the background layer.
Illustratively, when the size of the foreground layer is smaller than that of the background layer, the terminal controls the foreground layer to displace relative to the background layer, and the displacement direction of the foreground layer may be any one of step 2032 to step 2036.
Step 2032, controlling the foreground layer to shift according to a first direction, where the first direction is the same as the shift direction of the shift instruction.
And the terminal controls the foreground layer to displace according to the first direction.
Step 2033, controlling the foreground layer to shift according to a second direction, where the second direction is opposite to the shift direction of the shift instruction.
And the terminal controls the foreground image layer to displace according to the second direction.
Step 2034, controlling the foreground image layer to shift according to a third direction, where the third direction is perpendicular to the shift direction of the shift instruction.
And the terminal controls the foreground image layer to displace according to the third direction.
And 2035, controlling the foreground image layer to displace according to a fourth direction, wherein an included angle is formed between the fourth direction and the displacement direction of the displacement instruction, and the included angle is an acute angle.
And the terminal controls the foreground layer to shift according to the fourth direction.
Step 2036, controlling the foreground layer to shift according to a preset curve track.
The terminal controls the foreground layer to shift according to a preset curve track.
For example, the terminal may control the foreground layer to move in the directions of steps 2032 to 2036 based on the same principle as steps 2022 to 2026.
In summary, in the method provided in this embodiment, the displacement instruction further includes a displacement direction, and the moving direction of the foreground layer or the background layer is determined according to that displacement direction. By changing the relative positions of the foreground layer and the background layer, the image content shows different effects, which attracts users to view it, reduces the image data volume, accelerates image loading, and saves data traffic.
By way of example, four user interfaces displayed by the user interface display method provided in this application are described below.
First, as shown in fig. 21, there are a foreground layer 701 and a background layer 702, where the foreground layer 701 has a bottle-shaped transparent area 503 and a non-transparent area 504 other than the bottle. The background layer 702 is an opaque layer with two shaded areas. Superimposing the foreground layer 701 on the background layer 702 yields the user interface shown in fig. 22. As shown in fig. 22, the terminal controls the foreground layer 701 to move from the left end of the background layer 702 to its right end, i.e., the effects of parts (1) to (2) to (3) of fig. 22. The filling color in the bottle changes, which increases the interest of the picture, attracts the user to view it, reduces the image data volume, accelerates image loading, and saves data traffic.
Second, as shown in fig. 23, there are a foreground layer 801 and a background layer 802. The foreground layer 801 has transparent areas 503 of square, triangle and diamond shapes, and a non-transparent area 504 other than the transparent areas 503. The background layer 802 consists of an opaque shaded area 803 and a white area 804. For example, the background layer may also be a layer with a transparent area, that is, the background layer 802 includes an opaque shaded area 803 and a transparent area 804. Superimposing the foreground layer 801 on the background layer 802 yields the user interface shown in fig. 24. As shown in fig. 24, to better show the effect, the boundaries between the areas are hidden; the terminal controls the foreground layer 801 to move gradually from the lower portion of the background layer 802 to its upper portion, i.e., the effects of parts (1) to (6) of fig. 24. This increases the interest of the picture, attracts the user to view it, reduces the image data volume, accelerates image loading, and saves data traffic.
Third, as shown in fig. 25, there are a foreground layer 901 and a background layer 902. The foreground layer 901 has non-transparent areas 504 of square and triangle shapes, and a transparent area 503 other than the non-transparent areas 504. The background layer 902 is an opaque layer. Superimposing the foreground layer 901 on the background layer 902 yields the user interface shown in fig. 26. As shown in fig. 26, the terminal controls the foreground layer 901 to move gradually from the lower portion of the background layer 902 to its upper portion, i.e., the effects of parts (1) to (3) of fig. 26. This increases the interest of the picture, attracts the user to view it, reduces the image data volume, accelerates image loading, and saves data traffic.
Fourth, as shown in fig. 27, there are a foreground layer 1001 and a background layer 1002. The foreground layer 1001 has a transparent area 503 in its central portion, where the transparent area 503 is a semi-transparent area (with a transparency of 50%), and a non-transparent area 504 other than the transparent area 503. The background layer 1002 is a circular opaque layer. Superimposing the foreground layer 1001 on the background layer 1002 yields the user interface shown in fig. 28. As shown in fig. 28, the terminal controls the background layer 1002 to move arbitrarily within the foreground layer 1001, for example, the effects of parts (1) to (3) of fig. 28. Illustratively, a white layer is further arranged behind the background layer 1002 to fill the part of the transparent area 503 not covered by the background layer 1002, ensuring the color accuracy of the transparent area. The method increases the interest of the picture, attracts the user to view it, reduces the image data volume, accelerates image loading, and saves data traffic.
For example, when the user interface includes two or more corresponding sets of foreground layers and background layers, confusion may occur as to which background layer shows through which foreground layer. In this case, a foreground layer may obtain the identifier of its corresponding background layer and, according to that identifier, show the background content of the corresponding background layer through its transparent area. For example, there are a first foreground layer and a first background layer, and a second foreground layer and a second background layer, in one-to-one correspondence on the user interface. The first foreground layer obtains the identifier of the first background layer, and the background content of the first background layer shows through the transparent area of the first foreground layer; the second foreground layer obtains the identifier of the second background layer, and the background content of the second background layer shows through the transparent area of the second foreground layer. That is, the first foreground layer does not show the background content of the second background layer, and the second foreground layer does not show the background content of the first background layer.
Illustratively, as shown in fig. 29, there are a first foreground layer 901 and a first background layer 903, and a second foreground layer 902 and a second background layer 904, in one-to-one correspondence. The transparent area of the first foreground layer 901 reveals the background content of the first background layer 903, and the transparent area of the second foreground layer 902 reveals the background content of the second background layer 904.
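The identifier-based pairing described above can be sketched with plain dictionaries; the `background_id` field and `pair_layers` helper are assumed names for illustration, not part of the patent:

```python
def pair_layers(foreground_layers, background_layers):
    """Match each foreground layer to the background layer whose
    identifier it records, so one group's background content never
    shows through another group's transparent area."""
    by_id = {bg["id"]: bg for bg in background_layers}
    return [(fg, by_id[fg["background_id"]]) for fg in foreground_layers]

# Two independent foreground/background groups on one user interface.
fgs = [{"name": "fg1", "background_id": "bg1"},
       {"name": "fg2", "background_id": "bg2"}]
bgs = [{"id": "bg1"}, {"id": "bg2"}]
pairs = pair_layers(fgs, bgs)
print([(f["name"], b["id"]) for f, b in pairs])  # [('fg1', 'bg1'), ('fg2', 'bg2')]
```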
The following are apparatus embodiments of the present application; for details not described in the apparatus embodiments, reference may be made to the method embodiments described above.
Fig. 30 is a block diagram of a display apparatus of a user interface provided in an exemplary embodiment of the present application. The apparatus is applied to a terminal running an application program that supports picture browsing. The user interface includes a foreground layer and a background layer, where the foreground layer includes a transparent area and a non-transparent area. The apparatus includes:
a display module 301, configured to display a first user interface, where the first user interface includes the image content jointly presented, when the foreground layer and the background layer are located at a first relative position, by the first background content of the background layer showing through the transparent area and the foreground content of the non-transparent area;
a control module 302, configured to control relative displacement between the foreground layer and the background layer;
the display module 301 is further configured to display a second user interface, where the second user interface includes the image content jointly presented, when the foreground layer and the background layer are located at a second relative position, by the second background content of the background layer showing through the transparent area and the foreground content of the non-transparent area.
In an optional embodiment, the apparatus further comprises: an acquisition module 303;
the obtaining module 303 is configured to obtain a displacement instruction, where the displacement instruction is triggered by a user or generated by a program code;
the control module 302 is further configured to control at least one of the foreground layer and the background layer to displace according to the displacement distance in the displacement instruction.
In an optional embodiment, the size of the foreground image layer is larger than that of the background image layer;
the control module 302 is further configured to control the background layer to displace relative to the foreground layer.
In an optional embodiment, the control module 302 is further configured to control the background layer to be displaced in a first direction, where the first direction is the same as the displacement direction of the displacement instruction;
or,
the control module 302 is further configured to control the background layer to be displaced in a second direction, where the second direction is opposite to the displacement direction of the displacement instruction;
or,
the control module 302 is further configured to control the background layer to be displaced in a third direction, where the third direction is perpendicular to the displacement direction of the displacement instruction;
or,
the control module 302 is further configured to control the background layer to be displaced in a fourth direction, where the fourth direction forms an acute included angle with the displacement direction of the displacement instruction;
or,
the control module 302 is further configured to control the background layer to be displaced along a preset curved track.
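The five displacement alternatives amount to choosing a mapping from the instruction's drag vector to a layer offset. A hedged sketch follows (the mode names, the 45° choice for the acute angle, and the sample sine curve are our illustrative assumptions, not taken from the patent):

```python
# Sketch of the five displacement modes listed above, mapping the drag
# vector of a displacement instruction to a background-layer offset.
import math

def layer_offset(drag_dx, drag_dy, mode, t=0.0):
    if mode == "same":            # first direction: same as the drag
        return (drag_dx, drag_dy)
    if mode == "opposite":        # second direction: opposite to the drag
        return (-drag_dx, -drag_dy)
    if mode == "perpendicular":   # third direction: drag rotated by 90 degrees
        return (-drag_dy, drag_dx)
    if mode == "acute":           # fourth direction: rotated by an acute angle (45 degrees here)
        c, s = math.cos(math.pi / 4), math.sin(math.pi / 4)
        return (drag_dx * c - drag_dy * s, drag_dx * s + drag_dy * c)
    if mode == "curve":           # preset curved track, parameterised by t in [0, 1]
        return (100 * t, 40 * math.sin(2 * math.pi * t))
    raise ValueError(mode)

print(layer_offset(10, 0, "opposite"))       # (-10, 0)
print(layer_offset(10, 0, "perpendicular"))  # (0, 10)
```

For a rightward drag of 10 pixels, "opposite" yields a leftward background shift, and "perpendicular" a downward one; the "curve" mode ignores the drag direction entirely and follows the preset track as `t` advances.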
In an optional embodiment, the size of the foreground layer is smaller than that of the background layer;
the control module 302 is further configured to control the foreground layer to displace relative to the background layer.
In an optional embodiment, the control module 302 is further configured to control the foreground layer to be displaced in a first direction, where the first direction is the same as the displacement direction of the displacement instruction;
or,
the control module 302 is further configured to control the foreground layer to be displaced in a second direction, where the second direction is opposite to the displacement direction of the displacement instruction;
or,
the control module 302 is further configured to control the foreground layer to be displaced in a third direction, where the third direction is perpendicular to the displacement direction of the displacement instruction;
or,
the control module 302 is further configured to control the foreground layer to be displaced in a fourth direction, where the fourth direction forms an acute included angle with the displacement direction of the displacement instruction;
or,
the control module 302 is further configured to control the foreground layer to be displaced along a preset curved track.
In an optional embodiment, the foreground content is main display content, and the first background content and the second background content are auxiliary display content;
or,
the foreground content is auxiliary display content, and the first background content and the second background content are main display content.
In an optional embodiment, the display elements in at least two positions in the background layer are different, and the display elements include: shapes, patterns, colors, and combinations thereof.
It should be noted that the user interface display apparatus provided in the foregoing embodiment is illustrated only by the division into the above functional modules; in practical applications, the functions may be allocated to different functional modules as needed, that is, the internal structure of the apparatus may be divided into different functional modules to complete all or part of the functions described above. In addition, the apparatus provided by the above embodiment and the embodiments of the user interface display method belong to the same concept; its specific implementation process is detailed in the method embodiments and is not repeated here.
Fig. 31 shows a block diagram of a terminal 3900 provided in an exemplary embodiment of the present application. The terminal 3900 may be a smart phone, a tablet computer, an MP3 player (Moving Picture Experts Group Audio Layer III), an MP4 player (Moving Picture Experts Group Audio Layer IV), a notebook computer, or a desktop computer. The terminal 3900 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, or desktop terminal.
Generally, the terminal 3900 includes: a processor 3901 and a memory 3902.
The processor 3901 may include one or more processing cores, for example a 4-core or 8-core processor. The processor 3901 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), or a PLA (Programmable Logic Array). The processor 3901 may also include a main processor and a coprocessor: the main processor, also referred to as a CPU (Central Processing Unit), processes data in the awake state; the coprocessor is a low-power processor that processes data in the standby state. In some embodiments, the processor 3901 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content to be displayed on the display screen. In some embodiments, the processor 3901 may also include an AI (Artificial Intelligence) processor for handling computing operations related to machine learning.
The memory 3902 may include one or more computer-readable storage media, which may be non-transitory. The memory 3902 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in the memory 3902 is used to store at least one instruction for execution by the processor 3901 to implement a method of displaying a user interface provided by method embodiments herein.
In some embodiments, the terminal 3900 can also optionally include: a peripheral interface 3903 and at least one peripheral. Processor 3901, memory 3902, and peripheral interface 3903 may be connected by buses or signal lines. Various peripheral devices may be connected to peripheral interface 3903 via buses, signal lines, or circuit boards. Specifically, the peripheral device includes: at least one of radio frequency circuitry 3904, touch display screen 3905, camera 3906, audio circuitry 3907, positioning component 3908, and power source 3909.
The peripheral interface 3903 may be used to connect at least one I/O (Input/Output)-related peripheral to the processor 3901 and the memory 3902. In some embodiments, the processor 3901, the memory 3902, and the peripheral interface 3903 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 3901, the memory 3902, and the peripheral interface 3903 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The radio frequency circuit 3904 is used to receive and transmit RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 3904 communicates with communication networks and other communication devices through electromagnetic signals, converting electrical signals into electromagnetic signals for transmission, or converting received electromagnetic signals into electrical signals. Optionally, the radio frequency circuit 3904 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so on. The radio frequency circuit 3904 may communicate with other terminals through at least one wireless communication protocol, including but not limited to: the World Wide Web, metropolitan area networks, intranets, mobile communication networks of various generations (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 3904 may also include NFC (Near Field Communication) related circuits, which is not limited in the present application.
The display screen 3905 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 3905 is a touch display screen, it also has the ability to acquire touch signals on or above its surface. A touch signal may be input to the processor 3901 as a control signal for processing. In this case, the display screen 3905 may also be used to provide virtual buttons and/or a virtual keyboard, also called soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 3905, disposed on the front panel of the terminal 3900; in other embodiments, there may be at least two display screens 3905, each disposed on a different surface of the terminal 3900 or in a folded design; in still other embodiments, the display screen 3905 may be a flexible display disposed on a curved or folded surface of the terminal 3900. The display screen 3905 may even be arranged in a non-rectangular irregular pattern, that is, an irregularly-shaped screen. The display screen 3905 may be made of materials such as LCD (Liquid Crystal Display) or OLED (Organic Light-Emitting Diode).
The camera assembly 3906 is used to capture images or video. Optionally, the camera assembly 3906 includes a front camera and a rear camera. Generally, the front camera is disposed on the front panel of the terminal, and the rear camera is disposed on the back of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, or a telephoto camera, so that the main camera and the depth-of-field camera can be fused to realize a background blurring function, and the main camera and the wide-angle camera can be fused to realize panoramic shooting, VR (Virtual Reality) shooting, or other fused shooting functions. In some embodiments, the camera assembly 3906 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash combines a warm-light flash and a cold-light flash and can be used for light compensation at different color temperatures.
The audio circuit 3907 may include a microphone and a speaker. The microphone is used to collect sound waves from the user and the environment, convert them into electrical signals, and input them to the processor 3901 for processing, or to the radio frequency circuit 3904 for voice communication. For stereo capture or noise reduction, multiple microphones may be provided, each disposed at a different location of the terminal 3900. The microphone may also be an array microphone or an omnidirectional pickup microphone. The speaker is used to convert electrical signals from the processor 3901 or the radio frequency circuit 3904 into sound waves. The speaker may be a conventional diaphragm speaker or a piezoelectric ceramic speaker. A piezoelectric ceramic speaker can convert electrical signals not only into sound waves audible to humans, but also into sound waves inaudible to humans for purposes such as distance measurement. In some embodiments, the audio circuit 3907 may also include a headphone jack.
The positioning component 3908 is used to locate the current geographic position of the terminal 3900 to implement navigation or LBS (Location Based Service). The positioning component 3908 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 3909 is used to supply power to the various components in the terminal 3900. The power supply 3909 may use alternating current, direct current, disposable batteries, or rechargeable batteries. When the power supply 3909 includes a rechargeable battery, the battery may support wired charging (charged through a wired line) or wireless charging (charged through a wireless coil). The rechargeable battery may also support fast-charging technology.
In some embodiments, the terminal 3900 also includes one or more sensors 3910. The one or more sensors 3910 include, but are not limited to: an acceleration sensor 3911, a gyro sensor 3912, a pressure sensor 3913, a fingerprint sensor 3914, an optical sensor 3915, and a proximity sensor 3916.
The acceleration sensor 3911 may detect the magnitude of acceleration on three coordinate axes of a coordinate system established with the terminal 3900. For example, the acceleration sensor 3911 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 3901 may control the touch display screen 3905 to display a user interface in a landscape view or a portrait view based on the gravitational acceleration signal collected by the acceleration sensor 3911. The acceleration sensor 3911 may also be used for acquisition of motion data of a game or a user.
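The landscape/portrait decision described above can be illustrated by comparing gravity components (the axis convention and the simple threshold are our assumptions, not taken from the patent):

```python
# Illustrative sketch of the orientation logic above: choose landscape
# or portrait from the gravity components reported by an accelerometer.
# Convention assumed here: y is the device's long axis.

def orientation(gx, gy):
    # If gravity pulls mostly along the device's y axis, it is upright.
    return "portrait" if abs(gy) >= abs(gx) else "landscape"

print(orientation(0.5, 9.7))  # device held upright
print(orientation(9.7, 0.5))  # device lying on its side
```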
The gyroscope sensor 3912 may detect a body direction and a rotation angle of the terminal 3900, and the gyroscope sensor 3912 may cooperate with the acceleration sensor 3911 to acquire a 3D motion of the user on the terminal 3900. From the data collected by the gyro sensor 3912, the processor 3901 may implement the following functions: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensors 3913 may be disposed on side frames of the terminal 3900 and/or underlying layers of the touch display screen 3905. When the pressure sensor 3913 is disposed on the side frame of the terminal 3900, a user's holding signal of the terminal 3900 can be detected, and the processor 3901 performs left-right hand recognition or shortcut operation according to the holding signal acquired by the pressure sensor 3913. When the pressure sensor 3913 is disposed at a lower layer of the touch display screen 3905, the processor 3901 controls the operability controls on the UI interface according to the pressure operation of the user on the touch display screen 3905. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 3914 is used to collect a fingerprint of the user, and the processor 3901 identifies the identity of the user according to the fingerprint collected by the fingerprint sensor 3914, or the fingerprint sensor 3914 identifies the identity of the user according to the collected fingerprint. Upon recognizing that the user's identity is a trusted identity, processor 3901 authorizes the user to perform relevant sensitive operations including unlocking a screen, viewing encrypted information, downloading software, paying, and changing settings, etc. The fingerprint sensor 3914 may be disposed on the front, back, or side of the terminal 3900. When a physical key or vendor Logo is provided on the terminal 3900, the fingerprint sensor 3914 may be integrated with the physical key or vendor Logo.
The optical sensor 3915 is used to collect the ambient light intensity. In one embodiment, the processor 3901 may control the display brightness of the touch display screen 3905 based on the intensity of ambient light collected by the optical sensor 3915. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 3905 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 3905 is turned down. In another embodiment, the processor 3901 may also dynamically adjust the shooting parameters of the camera assembly 3906 based on the intensity of ambient light collected by the optical sensor 3915.
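The brightness adjustment described here is essentially a monotonic mapping from the ambient-light reading to a display level. A minimal sketch, with the lux range, level range, and linear ramp as illustrative assumptions:

```python
# Hedged sketch of the ambient-light behaviour above: raise display
# brightness in bright surroundings, lower it in dim ones. The
# thresholds and the linear mapping are illustrative, not from the patent.

def display_brightness(lux, min_level=10, max_level=255, max_lux=1000):
    lux = max(0, min(lux, max_lux))  # clamp the sensor reading
    return min_level + (max_level - min_level) * lux // max_lux

print(display_brightness(0))     # dim room -> minimum brightness
print(display_brightness(1000))  # direct light -> full brightness
```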
The proximity sensor 3916, also known as a distance sensor, is typically disposed on the front panel of the terminal 3900. The proximity sensor 3916 is used to capture the distance between the user and the front face of the terminal 3900. In one embodiment, when the proximity sensor 3916 detects that the distance between the user and the front face of the terminal 3900 gradually decreases, the processor 3901 controls the touch display screen 3905 to switch from the bright-screen state to the dark-screen state; when the proximity sensor 3916 detects that the distance gradually increases, the processor 3901 controls the touch display screen 3905 to switch from the dark-screen state back to the bright-screen state.
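The proximity behaviour can be modelled as a small state machine; adding a hysteresis gap between the "near" and "far" thresholds (our illustrative choice; the text above only describes the two transitions) keeps the screen from flickering around a single cutoff:

```python
# Sketch of the proximity behaviour above: darken the screen as the
# user's face approaches and light it again as it recedes. The
# thresholds and the hysteresis gap are illustrative assumptions.

def screen_state(distance_cm, current_state, near_cm=3.0, far_cm=6.0):
    if current_state == "bright" and distance_cm < near_cm:
        return "dark"    # approaching the front face: switch off
    if current_state == "dark" and distance_cm > far_cm:
        return "bright"  # moving away again: switch back on
    return current_state # within the hysteresis band: keep the state

state = "bright"
for d in [10.0, 2.0, 4.0, 8.0]:  # approach, linger, then move away
    state = screen_state(d, state)
print(state)  # bright again once the user has moved away
```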
Those skilled in the art will appreciate that the architecture shown in fig. 31 does not constitute a limitation of terminal 3900, and may include more or fewer components than those shown, or some of the components may be combined, or a different arrangement of components may be employed.
The present application further provides a computer device comprising a processor and a memory, wherein the memory stores at least one instruction, at least one program, a set of codes, or a set of instructions, and the at least one instruction, the at least one program, the set of codes, or the set of instructions is loaded and executed by the processor to implement the method for displaying a user interface provided by any of the above exemplary embodiments.
The present application further provides a computer-readable storage medium having at least one instruction, at least one program, a set of codes, or a set of instructions stored therein, which is loaded and executed by a processor to implement the method for displaying a user interface provided in any of the above exemplary embodiments.
It will be understood by those skilled in the art that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing relevant hardware; the program may be stored in a computer-readable storage medium, which may be a read-only memory, a magnetic disk, an optical disk, or the like.
The above description is only exemplary of the present application and should not be taken as limiting, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (15)

1. A method for displaying a user interface, the user interface comprising a foreground layer and a background layer, the foreground layer including a transparent area and a non-transparent area, the method comprising:
displaying a first user interface, where the first user interface includes image content formed, when the foreground layer and the background layer are located at a first relative position, by first background content of the background layer showing through the transparent area and foreground content in the non-transparent area;
controlling the relative displacement of the foreground image layer and the background image layer;
and displaying a second user interface, where the second user interface includes image content formed, when the foreground layer and the background layer are located at a second relative position, by second background content of the background layer showing through the transparent area and the foreground content in the non-transparent area.
2. The method according to claim 1, wherein the controlling the relative displacement of the foreground layer and the background layer comprises:
acquiring a displacement instruction, wherein the displacement instruction is triggered by a user or generated by program code;
and controlling at least one of the foreground image layer and the background image layer to be displaced in response to the displacement distance of the displacement instruction.
3. The method according to claim 2, wherein the size of the foreground layer is larger than that of the background layer, and the controlling at least one of the foreground layer and the background layer to be displaced comprises:
and controlling the displacement of the background layer relative to the foreground layer.
4. The method according to claim 3, wherein the controlling the background layer to be displaced relative to the foreground layer comprises:
controlling the background layer to be displaced in a first direction, wherein the first direction is the same as the displacement direction of the displacement instruction;
or,
controlling the background layer to be displaced in a second direction, wherein the second direction is opposite to the displacement direction of the displacement instruction;
or,
controlling the background layer to be displaced in a third direction, wherein the third direction is perpendicular to the displacement direction of the displacement instruction;
or,
controlling the background layer to be displaced in a fourth direction, wherein the fourth direction forms an acute included angle with the displacement direction of the displacement instruction;
or,
controlling the background layer to be displaced along a preset curved track.
5. The method according to claim 2, wherein the size of the foreground layer is smaller than that of the background layer, and the controlling at least one of the foreground layer and the background layer to be displaced comprises:
and controlling the displacement of the foreground image layer relative to the background image layer.
6. The method according to claim 5, wherein the controlling the foreground layer to be displaced relative to the background layer comprises:
controlling the foreground layer to be displaced in a first direction, wherein the first direction is the same as the displacement direction of the displacement instruction;
or,
controlling the foreground layer to be displaced in a second direction, wherein the second direction is opposite to the displacement direction of the displacement instruction;
or,
controlling the foreground layer to be displaced in a third direction, wherein the third direction is perpendicular to the displacement direction of the displacement instruction;
or,
controlling the foreground layer to be displaced in a fourth direction, wherein the fourth direction forms an acute included angle with the displacement direction of the displacement instruction;
or,
controlling the foreground layer to be displaced along a preset curved track.
7. The method according to any one of claims 1 to 6, wherein
the foreground content is main display content, and the first background content and the second background content are auxiliary display content;
or,
the foreground content is auxiliary display content, and the first background content and the second background content are main display content.
8. The method according to any one of claims 1 to 6,
the display elements of at least two positions in the background layer are different, and the display elements comprise: shapes, patterns, colors, and combinations thereof.
9. A display apparatus of a user interface, the user interface comprising a foreground layer and a background layer, the foreground layer including a transparent area and a non-transparent area, the apparatus comprising:
a display module, configured to display a first user interface, wherein the first user interface includes image content formed, when the foreground layer and the background layer are located at a first relative position, by first background content of the background layer showing through the transparent area and foreground content in the non-transparent area;
the control module is used for controlling the relative displacement of the foreground layer and the background layer;
the display module is further configured to display a second user interface, wherein the second user interface includes image content formed, when the foreground layer and the background layer are located at a second relative position, by second background content of the background layer showing through the transparent area and the foreground content in the non-transparent area.
10. The apparatus of claim 9, further comprising: an obtaining module;
the obtaining module is configured to obtain a displacement instruction, wherein the displacement instruction is triggered by a user or generated by program code;
the control module is further configured to control at least one of the foreground layer and the background layer to be displaced in response to a displacement distance of the displacement instruction.
11. The apparatus according to claim 10, wherein the foreground layer has a larger size than the background layer;
the control module is further configured to control the background layer to displace relative to the foreground layer.
12. The apparatus according to claim 10, wherein the foreground layer is smaller in size than the background layer;
the control module is further configured to control the foreground layer to displace relative to the background layer.
13. The apparatus of claim 12, wherein
the control module is further configured to control the foreground layer to be displaced in a first direction, wherein the first direction is the same as the displacement direction of the displacement instruction;
or,
the control module is further configured to control the foreground layer to be displaced in a second direction, wherein the second direction is opposite to the displacement direction of the displacement instruction;
or,
the control module is further configured to control the foreground layer to be displaced in a third direction, wherein the third direction is perpendicular to the displacement direction of the displacement instruction;
or,
the control module is further configured to control the foreground layer to be displaced in a fourth direction, wherein the fourth direction forms an acute included angle with the displacement direction of the displacement instruction;
or,
the control module is further configured to control the foreground layer to be displaced along a preset curved track.
14. A computer device comprising a processor and a memory, the memory having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, the at least one instruction, the at least one program, the set of codes, or the set of instructions being loaded and executed by the processor to implement a display method of a user interface according to any one of claims 1 to 8.
15. A computer readable storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement a display method of a user interface according to any one of claims 1 to 8.
CN201911183743.6A 2019-11-27 2019-11-27 User interface display method, device, equipment and medium Active CN110928464B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911183743.6A CN110928464B (en) 2019-11-27 2019-11-27 User interface display method, device, equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911183743.6A CN110928464B (en) 2019-11-27 2019-11-27 User interface display method, device, equipment and medium

Publications (2)

Publication Number Publication Date
CN110928464A true CN110928464A (en) 2020-03-27
CN110928464B CN110928464B (en) 2023-08-25

Family

ID=69847553

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911183743.6A Active CN110928464B (en) 2019-11-27 2019-11-27 User interface display method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN110928464B (en)


Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000032249A (en) * 1998-07-14 2000-01-28 Tatsumi Denshi Kogyo Kk Video printing device, and printing paper for the video printing device
WO2000010157A1 (en) * 1998-08-11 2000-02-24 Play, Inc. System and method for refracting a background image through a foreground object on a computer display
WO2001033450A1 (en) * 1999-10-30 2001-05-10 Son Young Cherl Method and system for advertisement using animation-character
US20020045465A1 (en) * 2000-10-16 2002-04-18 Mitsuya Kishida Automatic selection of a background image for a display on a mobile telephone
US20090027418A1 (en) * 2007-07-24 2009-01-29 Maru Nimit H Map-based interfaces for storing and locating information about geographical areas
US20110246916A1 (en) * 2010-04-02 2011-10-06 Nokia Corporation Methods and apparatuses for providing an enhanced user interface
CN102385477A (en) * 2010-09-03 2012-03-21 Lg电子株式会社 Method for providing user interface based on multiple displays and mobile terminal using the same
EP2525580A2 (en) * 2011-05-20 2012-11-21 EchoStar Technologies L.L.C. Dynamically configurable 3D display
US20130065614A1 (en) * 2011-09-14 2013-03-14 Hotaek JUNG Mobile terminal and method for controlling operation thereof
CN105243268A (en) * 2015-09-18 2016-01-13 网易(杭州)网络有限公司 Game map positioning method and apparatus as well as user terminal
US20160216518A1 (en) * 2015-01-28 2016-07-28 Sony Computer Entertainment Europe Limited Display
CN106814886A (en) * 2015-11-30 2017-06-09 阿里巴巴集团控股有限公司 The methods of exhibiting and device of banner banner pictures
CN110175065A (en) * 2019-05-29 2019-08-27 广州视源电子科技股份有限公司 User interface display method, device, equipment and storage medium

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000032249A (en) * 1998-07-14 2000-01-28 Tatsumi Denshi Kogyo Kk Video printing device, and printing paper for the video printing device
WO2000010157A1 (en) * 1998-08-11 2000-02-24 Play, Inc. System and method for refracting a background image through a foreground object on a computer display
WO2001033450A1 (en) * 1999-10-30 2001-05-10 Son Young Cherl Method and system for advertisement using animation-character
US20020045465A1 (en) * 2000-10-16 2002-04-18 Mitsuya Kishida Automatic selection of a background image for a display on a mobile telephone
US20090027418A1 (en) * 2007-07-24 2009-01-29 Maru Nimit H Map-based interfaces for storing and locating information about geographical areas
US20110246916A1 (en) * 2010-04-02 2011-10-06 Nokia Corporation Methods and apparatuses for providing an enhanced user interface
CN102385477A (en) * 2010-09-03 2012-03-21 LG Electronics Inc. Method for providing user interface based on multiple displays and mobile terminal using the same
EP2525580A2 (en) * 2011-05-20 2012-11-21 EchoStar Technologies L.L.C. Dynamically configurable 3D display
US20130065614A1 (en) * 2011-09-14 2013-03-14 Hotaek JUNG Mobile terminal and method for controlling operation thereof
US20160216518A1 (en) * 2015-01-28 2016-07-28 Sony Computer Entertainment Europe Limited Display
CN105243268A (en) * 2015-09-18 2016-01-13 NetEase (Hangzhou) Network Co., Ltd. Game map positioning method and apparatus as well as user terminal
CN106814886A (en) * 2015-11-30 2017-06-09 Alibaba Group Holding Ltd. Method and device for displaying banner pictures
CN110175065A (en) * 2019-05-29 2019-08-27 Guangzhou Shiyuan Electronics Technology Co., Ltd. User interface display method, device, equipment and storage medium

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111782114A (en) * 2020-06-15 2020-10-16 Guangzhou Shiyuan Electronics Technology Co., Ltd. Element display method, device, equipment and medium in manuscript editing application
CN112579083A (en) * 2020-12-09 2021-03-30 JD Digital Technology Holdings Co., Ltd. Image display method and device, electronic equipment and storage medium
CN112579083B (en) * 2020-12-09 2024-05-17 JD Technology Holding Co., Ltd. Image display method and device, electronic equipment and storage medium
CN113282258A (en) * 2021-05-28 2021-08-20 Wuhan Yuexuebang Network Technology Co., Ltd. Information display method and device
CN113282258B (en) * 2021-05-28 2023-08-15 Wuhan Yuexuebang Network Technology Co., Ltd. Information display method and device
CN114322409A (en) * 2021-06-23 2022-04-12 Hisense Visual Technology Co., Ltd. Refrigerator and method for displaying storage compartment interior pictures
CN114322409B (en) * 2021-06-23 2023-09-19 Hisense Visual Technology Co., Ltd. Refrigerator and method for displaying storage compartment interior pictures
TWI839111B (en) * 2023-02-15 2024-04-11 Delta Electronics, Inc. Moving object detecting apparatus and method thereof

Also Published As

Publication number Publication date
CN110928464B (en) 2023-08-25

Similar Documents

Publication Publication Date Title
US11538501B2 (en) Method for generating video, and electronic device and readable storage medium thereof
CN109977333B (en) Webpage display method and device, computer equipment and storage medium
CN110928464B (en) User interface display method, device, equipment and medium
CN108449641B (en) Method, device, computer equipment and storage medium for playing media stream
CN111701238A (en) Virtual picture volume display method, device, equipment and storage medium
CN111880712B (en) Page display method and device, electronic equipment and storage medium
CN110047152B (en) Object construction method and device based on virtual environment and readable storage medium
CN112230914B (en) Method, device, terminal and storage medium for producing small program
CN109948581B (en) Image-text rendering method, device, equipment and readable storage medium
CN110321126B (en) Method and device for generating page code
CN110288689B (en) Method and device for rendering electronic map
CN111694478A (en) Content display method, device, terminal and storage medium
CN111399736A (en) Progress bar control method, device and equipment and readable storage medium
CN114546545B (en) Image-text display method, device, terminal and storage medium
CN112257006A (en) Page information configuration method, device, equipment and computer readable storage medium
CN110889060A (en) Webpage display method and device, computer equipment and storage medium
CN113377270B (en) Information display method, device, equipment and storage medium
CN113032590B (en) Special effect display method, device, computer equipment and computer readable storage medium
CN112116681A (en) Image generation method and device, computer equipment and storage medium
CN112612405B (en) Window display method, device, equipment and computer readable storage medium
CN110992268B (en) Background setting method, device, terminal and storage medium
CN109833623B (en) Object construction method and device based on virtual environment and readable storage medium
CN109032492B (en) Song cutting method and device
CN113407141B (en) Interface updating method, device, terminal and storage medium
CN111583375B (en) Virtual picture display method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
REG Reference to a national code
Ref country code: HK
Ref legal event code: DE
Ref document number: 40022653
Country of ref document: HK

SE01 Entry into force of request for substantive examination
GR01 Patent grant