CN113806003A - Wallpaper display method and device - Google Patents

Wallpaper display method and device

Info

Publication number
CN113806003A
Authority
CN
China
Prior art keywords
input
wallpaper
image
display information
dynamic display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111124517.8A
Other languages
Chinese (zh)
Inventor
丁鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202111124517.8A
Publication of CN113806003A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a wallpaper display method and device, and belongs to the technical field of electronic equipment. The wallpaper display method comprises the following steps: displaying a first image; receiving a first input to the first image; in response to the first input, configuring dynamic display information corresponding to a first object, wherein the first object is an object included in the first image, or the first object is a user-defined object; and in a case where the first image is set as wallpaper and the wallpaper is displayed, dynamically displaying the first object according to the dynamic display information.

Description

Wallpaper display method and device
Technical Field
The application belongs to the technical field of electronic equipment, and particularly relates to a wallpaper display method and device.
Background
With the continuous development of the mobile internet and computer technology, electronic devices such as mobile phones and computers have become an indispensable part of people's lives, and the wallpaper functions on electronic devices are widely used. Wallpaper of an electronic device includes desktop wallpaper and screen locking wallpaper. The screen locking wallpaper is the background picture used when the electronic device is in a screen locking state, and the desktop wallpaper is the background picture used on the desktop of the electronic device.
However, in the related art, an electronic device can only display wallpaper with the display parameters inherent to the wallpaper.
Disclosure of Invention
An object of the embodiments of the present application is to provide a wallpaper display method and apparatus, which can solve the problem that an electronic device can only display wallpaper with display parameters inherent to the wallpaper.
In a first aspect, an embodiment of the present application provides a wallpaper display method, including:
displaying a first image;
receiving a first input to the first image;
in response to the first input, configuring dynamic display information corresponding to a first object, wherein the first object is an object included in the first image, or the first object is a user-defined object;
dynamically displaying the first object according to the dynamic display information in a case where the first image is set as wallpaper and the wallpaper is displayed.
In a second aspect, an embodiment of the present application provides a wallpaper display device, including:
a first display module, configured to display a first image;
a first receiving module, configured to receive a first input to the first image;
a configuration module, configured to, in response to the first input, configure dynamic display information corresponding to a first object, wherein the first object is an object included in the first image, or the first object is a user-defined object;
and a second display module, configured to dynamically display the first object according to the dynamic display information in a case where the first image is set as wallpaper and the wallpaper is displayed.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, where the program or instructions, when executed by the processor, implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium on which a program or instructions are stored, which when executed by a processor, implement the steps of the method according to the first aspect.
In a fifth aspect, embodiments of the present application provide a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the steps of the method according to the first aspect.
In the embodiments of the present application, a first image is first displayed; then, in response to an input to the first image, dynamic display information of a first object is configured; and in a case where the first image is set as wallpaper and the wallpaper is displayed, the first object is dynamically displayed according to the configured dynamic display information. In this way, the first object can be dynamically displayed, that is, dynamically changed, while the wallpaper is displayed, so that the wallpaper can be presented in more diversified forms.
Drawings
Fig. 1 is a schematic flowchart of a wallpaper display method provided in an embodiment of the present application;
Fig. 2 is a first schematic diagram of dynamically displaying a first object according to an embodiment of the present application;
Fig. 3 is a second schematic diagram of dynamically displaying a first object according to an embodiment of the present application;
Fig. 4 is a third schematic diagram of dynamically displaying a first object according to an embodiment of the present application;
Fig. 5 is a fourth schematic diagram of dynamically displaying a first object according to an embodiment of the present application;
Fig. 6 is a schematic structural diagram of a wallpaper display device according to an embodiment of the present application;
Fig. 7 is a schematic structural diagram of an electronic device provided in an embodiment of the present application;
Fig. 8 is a hardware configuration diagram of an electronic device implementing an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments that can be derived by one of ordinary skill in the art from the embodiments given herein are intended to be within the scope of the present disclosure.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that the data used in this way are interchangeable under appropriate circumstances, so that the embodiments of the present application can be implemented in orders other than those illustrated or described herein. In addition, the objects distinguished by "first", "second", and the like are generally of one type, and the number of such objects is not limited; for example, the first object may be one or more than one. Moreover, "and/or" in the description and claims denotes at least one of the connected objects, and the character "/" generally indicates that the objects before and after it are in an "or" relationship.
The wallpaper display method and device provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
Fig. 1 is a schematic flow chart of a wallpaper display method according to an embodiment of the present application. The wallpaper display method may include:
step 101: displaying a first image;
step 102: receiving a first input to a first image;
step 103: in response to the first input, configuring dynamic display information corresponding to a first object, wherein the first object is an object included in the first image, or the first object is a user-defined object;
step 104: in a case where the first image is set as the wallpaper and the wallpaper is displayed, the first object is dynamically displayed according to the dynamic display information.
Specific implementations of the above steps will be described in detail below.
In the embodiments of the present application, a first image is first displayed; then, in response to an input to the first image, dynamic display information of a first object is configured; and in a case where the first image is set as wallpaper and the wallpaper is displayed, the first object is dynamically displayed according to the configured dynamic display information. In this way, the first object can be dynamically displayed, that is, dynamically changed, while the wallpaper is displayed, so that the wallpaper can be presented in more diversified forms.
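As an illustration only, the configuration step can be thought of as storing a record that links the first object to its dynamic display information, which the renderer later looks up once the image is shown as wallpaper. The following Kotlin sketch is a minimal model of that idea; all names in it are assumptions for illustration and do not come from the patent.

    // Hypothetical data model: what triggers the animation and how the object animates.
    data class DynamicDisplayInfo(val trigger: String, val animation: String)

    class WallpaperObjectConfig {
        private val infoByObject = mutableMapOf<String, DynamicDisplayInfo>()

        // Step 103: in response to the first input, bind dynamic display information
        // to the first object (an object in the image, or a user-defined object).
        fun configure(objectId: String, info: DynamicDisplayInfo) {
            infoByObject[objectId] = info
        }

        // Step 104: when the first image is set as wallpaper and displayed, the
        // renderer retrieves the configured information and plays the animation.
        fun lookup(objectId: String): DynamicDisplayInfo? = infoByObject[objectId]
    }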
The dynamic display of the first object in the embodiments of the present application includes passively dynamically displaying the first object and actively dynamically displaying the first object. Passively dynamically displaying the first object means dynamically displaying the first object after a user input is received. Actively dynamically displaying the first object means that the first object changes dynamically on its own, without user input.
In some possible implementations of the embodiments of the present application, the user-defined object refers to an object that the user wants to change dynamically when the first image is set as wallpaper and the wallpaper is displayed. It can be understood that the user-defined object is associated with the first image when the first image is set as wallpaper and the wallpaper is displayed.
In some possible implementations of embodiments of the present application, step 104 includes: receiving a second input to the first object; and in response to the second input, dynamically displaying the first object according to the dynamic display information corresponding to the second input.
In some possible implementations of embodiments of the present application, one or more pieces of dynamic display information may be configured for the first object, each piece of dynamic display information corresponding to a particular input. The input corresponding to the dynamic display information may include: a drag input on an object, a rotation input on an object, a text editing input on an object, an object position change input, an object size change input, and the like. For example, a two-finger zoom input may be configured to dynamically zoom the first object; as another example, a drag input may be configured so that the first object changes dynamically as the position of the drag input changes; and so on. In practical applications, the correspondence between inputs and dynamic display information can be configured according to actual needs.
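A minimal Kotlin sketch of such a correspondence table follows; the input categories mirror the examples above, while the dispatch hook and the commented usage are assumptions for illustration rather than part of the patent.

    // Illustrative input categories for inputs acting on the first object.
    enum class ObjectInput { DRAG, ROTATE, TWO_FINGER_ZOOM, TEXT_EDIT, POSITION_CHANGE, SIZE_CHANGE }

    class ObjectAnimationTable {
        // Each entry maps an input to the action that animates the first object.
        private val actions = mutableMapOf<ObjectInput, (Float) -> Unit>()

        fun bind(input: ObjectInput, action: (Float) -> Unit) {
            actions[input] = action
        }

        // Invoked when the displayed wallpaper receives an input on the first object.
        fun dispatch(input: ObjectInput, value: Float) {
            actions[input]?.invoke(value)
        }
    }

    // Example bindings (objectLayer is a hypothetical handle to the rendered object):
    // table.bind(ObjectInput.TWO_FINGER_ZOOM) { scale -> objectLayer.setScale(scale) }
    // table.bind(ObjectInput.DRAG) { x -> objectLayer.setX(x) }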
For example, Fig. 2 is a first schematic diagram of dynamically displaying a first object according to an embodiment of the present application. In Fig. 2, the first object is a circular object comprising areas of different sizes, and the second input is an input for rotating the first object.
In the embodiment of the application, when the wallpaper is displayed, the object can be dynamically displayed according to the input of the user, the object can interact with the user, and the user experience effect can be improved.
In some possible implementations of embodiments of the present application, step 104 includes: receiving a first posture control input to the electronic device; and in response to the first posture control input, dynamically displaying the first object according to the dynamic display information corresponding to the first posture control input.
In some possible implementations of embodiments of the present application, the posture control input includes, but is not limited to, swinging the electronic device, rotating the electronic device, and the like. It can be understood that when a posture control input is performed on the electronic device, the posture of the electronic device changes according to the posture control input.
In some possible implementations of embodiments of the present application, one or more pieces of dynamic display configuration information may be configured for the electronic device, each piece corresponding to a particular posture control input. For example, swinging the electronic device may be configured so that the first object changes dynamically as the electronic device is swung; as another example, moving the electronic device in a circular motion in the horizontal direction may be configured so that the first object moves in a circular motion over the wallpaper along with the electronic device; and so on.
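For illustration, a deliberate swing of the device can be detected with the standard Android accelerometer and forwarded to whatever dynamic display information was configured for that posture control input; the threshold and the onSwing callback below are assumptions rather than values from the patent.

    import android.hardware.Sensor
    import android.hardware.SensorEvent
    import android.hardware.SensorEventListener
    import android.hardware.SensorManager
    import kotlin.math.sqrt

    // Reports a "swing" posture control input when acceleration clearly exceeds gravity.
    class SwingListener(private val onSwing: () -> Unit) : SensorEventListener {
        override fun onSensorChanged(event: SensorEvent) {
            if (event.sensor.type != Sensor.TYPE_ACCELEROMETER) return
            val (x, y, z) = event.values
            val gForce = sqrt(x * x + y * y + z * z) / SensorManager.GRAVITY_EARTH
            if (gForce > 2.5f) onSwing()   // rough threshold for a deliberate swing
        }
        override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
    }

The listener would be registered with SensorManager.registerListener while the wallpaper is visible and unregistered when it is hidden.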
For example, Fig. 3 is a second schematic diagram of dynamically displaying a first object according to an embodiment of the present application. In Fig. 3, the first object is a circular object, and the first posture control input is an input of swinging the electronic device.
In the embodiment of the application, when the wallpaper is displayed, the object can be dynamically displayed according to the posture control input of the user to the electronic equipment, the object can interact with the user, and the user experience effect can be improved.
In some possible implementations of the embodiments of the present application, the first object may be configured in advance to change dynamically with the direction of gravity, so that when the posture of the electronic device is changed by rotating the electronic device, the first object changes dynamically with the direction of gravity. Accordingly, the first posture control input comprises a rotation input, and the dynamic display information corresponding to the first posture control input is used for dynamically rotating and displaying the first object according to the direction of gravity.
Specifically, while the electronic device is being rotated, the direction of gravity may be detected, for example by using a gyroscope, and the first object is then dynamically rotated and displayed according to the direction of gravity. For example, Fig. 4 is a third schematic diagram of dynamically displaying a first object according to an embodiment of the present application. In Fig. 4, the first object is an hourglass, the first posture control input is an input of rotating the electronic device, and the hourglass rotates with the direction of gravity.
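A minimal Kotlin sketch of this behaviour is given below; it assumes the Android gravity sensor as the source of the gravity vector (the paragraph above mentions a gyroscope, but any sensor exposing the gravity direction could feed the same logic), and rotateObjectTo() is a hypothetical hook into the wallpaper renderer.

    import android.hardware.Sensor
    import android.hardware.SensorEvent
    import android.hardware.SensorEventListener
    import kotlin.math.atan2

    // Keeps the first object (e.g. the hourglass) aligned with the direction of gravity.
    class GravityRotationListener(private val rotateObjectTo: (Float) -> Unit) : SensorEventListener {
        override fun onSensorChanged(event: SensorEvent) {
            if (event.sensor.type != Sensor.TYPE_GRAVITY) return
            // Angle of the gravity vector in the screen plane, in degrees.
            val degrees = Math.toDegrees(
                atan2(event.values[0].toDouble(), event.values[1].toDouble())
            ).toFloat()
            rotateObjectTo(-degrees)   // counter-rotate so the object keeps pointing "down"
        }
        override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
    }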
In the embodiment of the application, when the wallpaper is displayed, the object can be dynamically displayed according to the gravity direction, and the user experience effect can be improved.
In some possible implementations of the embodiments of the present application, the dynamic display configuration information used for the first object when no input is received may be preset. For example, a first object "eye" may be set to blink without input; as another example, the first object may be set to move, without input, from its initial position in the wallpaper to a set first position, then from the set first position to a set second position, then from the set second position back to the initial position, and to repeat this process.
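A minimal sketch of such preset, no-input behaviour follows; the positions, timing, and the moveObjectTo() hook are illustrative assumptions. The object is simply stepped through its configured positions on a repeating schedule.

    import android.graphics.PointF
    import android.os.Handler
    import android.os.Looper

    // Cycles the first object through preset positions while no input is received.
    // Assumes a non-empty list: initial position, first set position, second set position, ...
    class AutonomousPathAnimator(
        private val positions: List<PointF>,
        private val moveObjectTo: (PointF) -> Unit,
        private val stepMillis: Long = 1000L
    ) {
        private val handler = Handler(Looper.getMainLooper())
        private var index = 0

        private val stepper = object : Runnable {
            override fun run() {
                moveObjectTo(positions[index])
                index = (index + 1) % positions.size   // loop back to the initial position
                handler.postDelayed(this, stepMillis)
            }
        }

        fun start() { handler.post(stepper) }

        fun stop() { handler.removeCallbacks(stepper) }
    }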
For example, Fig. 5 is a fourth schematic diagram of dynamically displaying a first object according to an embodiment of the present application. In Fig. 5, the first object is a fawn, which jumps back and forth in a series of steps.
In some possible implementations of embodiments of the present application, step 104 includes: determining a first scene corresponding to the first image; and dynamically displaying the first object according to the dynamic display information corresponding to the first scene.
In some possible implementations of embodiments of the present application, one or more pieces of dynamic display configuration information may be configured for the first object, each piece corresponding to a particular scene. In different scenes, the first object performs different actions. For example, for a first object "cat", in a scene with a cat litter box, the cat's action is to go into the litter box and sleep; in a scene with cat food, the cat's action is to eat the cat food; and so on.
The method for determining the scene corresponding to the first image is not limited in the embodiments of the present application, and any available method may be applied. For example, objects may be detected in the first image, and the scene corresponding to the first image may be determined according to the detected objects. For example, if a cat litter box is detected in the first image, it may be determined that the scene corresponding to the first image is a scene with a cat litter box; as another example, if cat food is detected in the first image, it may be determined that the scene corresponding to the first image is a scene with cat food; as a further example, if furniture is detected in the first image, it may be determined that the scene corresponding to the first image is an indoor scene; as a further example, if a tree is detected in the first image, it may be determined that the scene corresponding to the first image is an outdoor scene; and so on.
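For illustration only, the correspondence can be expressed as two simple mappings in Kotlin: detected labels to a scene, and the scene to the action of the first object. The label strings and actions mirror the examples above; everything else is an assumption.

    // Scenes derived from what is detected in the first image.
    enum class Scene { WITH_CAT_LITTER_BOX, WITH_CAT_FOOD, INDOOR, OUTDOOR, DEFAULT }

    // Maps labels produced by any object-detection step to a scene.
    fun sceneFor(detectedLabels: Set<String>): Scene = when {
        "cat litter box" in detectedLabels -> Scene.WITH_CAT_LITTER_BOX
        "cat food" in detectedLabels -> Scene.WITH_CAT_FOOD
        "furniture" in detectedLabels -> Scene.INDOOR
        "tree" in detectedLabels -> Scene.OUTDOOR
        else -> Scene.DEFAULT
    }

    // Maps the scene to the action performed by a first object such as "cat".
    fun catActionFor(scene: Scene): String = when (scene) {
        Scene.WITH_CAT_LITTER_BOX -> "go into the litter box and sleep"
        Scene.WITH_CAT_FOOD -> "eat the cat food"
        else -> "wander idly"
    }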
It should be noted that, in the embodiment of the present application, a dividing manner of a scene is not limited, and any available scene dividing manner may be applied to the embodiment of the present application. In some possible implementations of the embodiments of the present application, a user may perform scene division according to their actual needs.
In the embodiment of the application, when the wallpaper is displayed, the object can be dynamically displayed in the scene corresponding to the first image, and the user experience effect can be improved.
It should be noted that, for the wallpaper display method provided in the embodiments of the present application, the execution subject may be a wallpaper display apparatus, or a control module in the wallpaper display apparatus for executing the wallpaper display method. In the embodiments of the present application, a wallpaper display apparatus executing the wallpaper display method is taken as an example to describe the wallpaper display apparatus provided in the embodiments of the present application.
Fig. 6 is a schematic structural diagram of a wallpaper display device according to an embodiment of the application. The wallpaper display apparatus 600 may include:
a first display module 601, configured to display a first image;
a first receiving module 602, configured to receive a first input for a first image;
a configuration module 603, configured to, in response to a first input, configure dynamic display information corresponding to a first object, where the first object is an object included in the first image, or the first object is a user-defined object;
a second display module 604, configured to dynamically display the first object according to the dynamic display information when the first image is set as wallpaper and the wallpaper is displayed.
In the embodiments of the present application, a first image is first displayed; then, in response to an input to the first image, dynamic display information of a first object is configured; and in a case where the first image is set as wallpaper and the wallpaper is displayed, the first object is dynamically displayed according to the configured dynamic display information. In this way, the first object can be dynamically displayed, that is, dynamically changed, while the wallpaper is displayed, so that the wallpaper can be presented in more diversified forms.
In some possible implementations of embodiments of the present application, the second display module 604 includes:
a first receiving submodule for receiving a second input to the first object;
and the first display submodule is used for responding to the second input and dynamically displaying the first object according to the dynamic display information corresponding to the second input.
In the embodiment of the application, when the wallpaper is displayed, the object can be dynamically displayed according to the input of the user, the object can interact with the user, and the user experience effect can be improved.
In some possible implementations of embodiments of the present application, the second display module 604 includes:
a second receiving submodule for receiving a first posture control input to the electronic device;
and the second display submodule is used for responding to the first posture control input and dynamically displaying the first object according to the dynamic display information corresponding to the first posture control input.
In the embodiment of the application, when the wallpaper is displayed, the object can be dynamically displayed according to the posture control input of the user to the electronic equipment, the object can interact with the user, and the user experience effect can be improved.
In some possible implementations of embodiments of the present application, the first posture control input comprises a rotation input;
and the dynamic display information corresponding to the first posture control input is used for dynamically rotating and displaying the first object according to the direction of gravity.
In some possible implementations of embodiments of the present application, the second display module 604 includes:
the determining submodule is used for determining a first scene corresponding to the first image;
and the third display module is used for dynamically displaying the first object according to the dynamic display information corresponding to the first scene.
In the embodiment of the application, when the wallpaper is displayed, the object can be dynamically displayed in the scene corresponding to the first image, and the user experience effect can be improved.
The wallpaper display device in the embodiments of the present application may be a device, and may also be a component, an integrated circuit, or a chip in a terminal. The device may be a mobile electronic device or a non-mobile electronic device. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, which is not specifically limited in the embodiments of the present application.
The wallpaper display device in the embodiment of the application can be a device with an operating system. The operating system may be an Android operating system (Android), an iOS operating system, or other possible operating systems, which is not specifically limited in the embodiments of the present application.
The wallpaper display device provided in the embodiment of the application can implement each process in the wallpaper display method embodiments of fig. 1 to 5, and is not described herein again to avoid repetition.
Optionally, as shown in fig. 7, an electronic device 700 is further provided in an embodiment of the present application, and includes a processor 701, a memory 702, and a program or an instruction stored in the memory 702 and executable on the processor 701, where the program or the instruction is executed by the processor 701 to implement each process of the wallpaper display method embodiment, and can achieve the same technical effect, and no further description is provided here to avoid repetition.
It should be noted that the electronic devices in the embodiments of the present application include the mobile electronic devices and the non-mobile electronic devices described above.
In some possible implementations of embodiments of the present application, the processor 701 may include a Central Processing Unit (CPU) or an Application-Specific Integrated Circuit (ASIC), or may be configured as one or more integrated circuits implementing the embodiments of the present application.
In some possible implementations of embodiments of the present application, the memory 702 may include Read-Only Memory (ROM), Random Access Memory (RAM), magnetic disk storage media devices, optical storage media devices, flash memory devices, or other electrical, optical, or physical/tangible memory storage devices. Thus, in general, the memory 702 comprises one or more tangible (non-transitory) computer-readable storage media (e.g., a memory device) encoded with software comprising computer-executable instructions, and when the software is executed (e.g., by one or more processors), it is operable to perform the operations described with reference to the wallpaper display methods according to the embodiments of the present application.
Fig. 8 is a hardware configuration diagram of an electronic device implementing an embodiment of the present application.
The electronic device 800 includes, but is not limited to: a radio frequency unit 801, a network module 802, an audio output unit 803, an input unit 804, a sensor 805, a display unit 806, a user input unit 807, an interface unit 808, a memory 809, and a processor 810.
Those skilled in the art will appreciate that the electronic device 800 may further comprise a power source (e.g., a battery) for supplying power to the various components, and the power source may be logically connected to the processor 810 via a power management system, so that charging, discharging, and power consumption are managed via the power management system. The electronic device structure shown in Fig. 8 does not constitute a limitation of the electronic device, and the electronic device may include more or fewer components than those shown, or combine some components, or arrange the components differently, which is not described in detail here.
The display unit 806 is configured to display a first image;
the user input unit 807 is configured to receive a first input to the first image;
the processor 810 is configured to, in response to the first input, configure dynamic display information corresponding to a first object, wherein the first object is an object included in the first image, or the first object is a user-defined object;
the display unit 806 is further configured to dynamically display the first object according to the dynamic display information in a case where the first image is set as wallpaper and the wallpaper is displayed.
In the embodiments of the present application, a first image is first displayed; then, in response to an input to the first image, dynamic display information of a first object is configured; and in a case where the first image is set as wallpaper and the wallpaper is displayed, the first object is dynamically displayed according to the configured dynamic display information. In this way, the first object can be dynamically displayed, that is, dynamically changed, while the wallpaper is displayed, so that the wallpaper can be presented in more diversified forms.
In some possible implementations of embodiments of the present application, the user input unit 807 is further configured to receive a second input to the first object;
and the display unit 806 is specifically configured to, in response to the second input, dynamically display the first object according to the dynamic display information corresponding to the second input.
In the embodiment of the application, when the wallpaper is displayed, the object can be dynamically displayed according to the input of the user, the object can interact with the user, and the user experience effect can be improved.
In some possible implementations of embodiments of the present application, the user input unit 807 is further configured to receive a first posture control input to the electronic device;
and the display unit 806 is specifically configured to, in response to the first posture control input, dynamically display the first object according to the dynamic display information corresponding to the first posture control input.
In the embodiment of the application, when the wallpaper is displayed, the object can be dynamically displayed according to the posture control input of the user to the electronic equipment, the object can interact with the user, and the user experience effect can be improved.
In some possible implementations of embodiments of the present application, the first posture control input comprises a rotation input;
and the dynamic display information corresponding to the first posture control input is used for dynamically rotating and displaying the first object according to the direction of gravity.
In some possible implementations of embodiments of the present application, the processor 810 is further configured to determine a first scene corresponding to the first image;
and the display unit 806 is specifically configured to dynamically display the first object according to the dynamic display information corresponding to the first scene.
In the embodiment of the application, when the wallpaper is displayed, the object can be dynamically displayed in the scene corresponding to the first image, and the user experience effect can be improved.
It should be understood that, in the embodiment of the present application, the input unit 804 may include a graphics processing unit (GPU) 8041 and a microphone 8042, and the graphics processing unit 8041 processes image data of a still picture or a video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 806 may include a display panel 8061, and the display panel 8061 may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 807 includes a touch panel 8071 and other input devices 8072. The touch panel 8071 is also referred to as a touch screen, and may include two parts: a touch detection device and a touch controller. Other input devices 8072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail here. The memory 809 may be used to store software programs as well as various data, including but not limited to application programs and an operating system. The processor 810 may integrate an application processor, which mainly handles the operating system, user interfaces, application programs, and the like, and a modem processor, which mainly handles wireless communication. It can be appreciated that the modem processor may not be integrated into the processor 810.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the wallpaper display method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device in the above embodiment. The readable storage medium includes a computer readable storage medium, and examples of the computer readable storage medium include non-transitory computer readable storage media such as ROM, RAM, magnetic or optical disks, and the like.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the wallpaper display method embodiment, and the same technical effect can be achieved.
It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-a-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved, e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. A wallpaper display method, characterized in that the method comprises:
displaying a first image;
receiving a first input to the first image;
in response to the first input, configuring dynamic display information corresponding to a first object, wherein the first object is an object included in the first image, or the first object is a user-defined object;
and dynamically displaying the first object according to the dynamic display information under the condition that the first image is set as wallpaper and the wallpaper is displayed.
2. The method of claim 1, wherein dynamically displaying the first object according to the dynamic display information comprises:
receiving a second input to the first object;
and in response to the second input, dynamically displaying the first object according to the dynamic display information corresponding to the second input.
3. The method of claim 1, wherein dynamically displaying the first object according to the dynamic display information comprises:
receiving a first posture control input to an electronic device displaying the wallpaper;
and in response to the first posture control input, dynamically displaying the first object according to the dynamic display information corresponding to the first posture control input.
4. The method of claim 3, wherein the first posture control input comprises a rotation input;
and the dynamic display information corresponding to the first posture control input is used for dynamically rotating and displaying the first object according to the direction of gravity.
5. The method of claim 1, wherein dynamically displaying the first object according to the dynamic display information comprises:
determining a first scene corresponding to the first image;
and dynamically displaying the first object according to the dynamic display information corresponding to the first scene.
6. A wallpaper display apparatus, characterized in that the apparatus comprises:
the first display module is used for displaying a first image;
a first receiving module for receiving a first input to the first image;
the configuration module is used for responding to the first input and configuring dynamic display information corresponding to a first object, wherein the first object is an object included in the first image, or the first object is a user-defined object;
and the second display module is used for dynamically displaying the first object according to the dynamic display information under the condition that the first image is set as wallpaper and the wallpaper is displayed.
7. The apparatus of claim 6, wherein the second display module comprises:
a first receiving submodule for receiving a second input to the first object;
and the first display sub-module is used for responding to the second input and dynamically displaying the first object according to the dynamic display information corresponding to the second input.
8. The apparatus of claim 6, wherein the second display module comprises:
a second receiving submodule for receiving a first posture control input to the electronic device;
and the second display submodule is used for responding to the first posture control input and dynamically displaying the first object according to the dynamic display information corresponding to the first posture control input.
9. The apparatus of claim 8, wherein the first posture control input comprises a rotation input;
and the dynamic display information corresponding to the first posture control input is used for dynamically rotating and displaying the first object according to the direction of gravity.
10. The apparatus of claim 6, wherein the second display module comprises:
the determining submodule is used for determining a first scene corresponding to the first image;
and the third display module is used for dynamically displaying the first object according to the dynamic display information corresponding to the first scene.
CN202111124517.8A 2021-09-24 2021-09-24 Wallpaper display method and device Pending CN113806003A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111124517.8A CN113806003A (en) 2021-09-24 2021-09-24 Wallpaper display method and device


Publications (1)

Publication Number: CN113806003A
Publication Date: 2021-12-17

Family

ID=78896739

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111124517.8A Pending CN113806003A (en) 2021-09-24 2021-09-24 Wallpaper display method and device

Country Status (1)

Country Link
CN (1) CN113806003A (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104427087A (en) * 2013-08-21 2015-03-18 Tencent Technology (Shenzhen) Co., Ltd. Method for realizing dynamic wallpaper of mobile terminal, and mobile terminal
CN104318596A (en) * 2014-10-08 2015-01-28 Beijing Sogou Technology Development Co., Ltd. Dynamic picture generation method and generation device
CN106648650A (en) * 2016-12-14 2017-05-10 Beijing Xiaomi Mobile Software Co., Ltd. Method and device for adjusting terminal display status
CN113114841A (en) * 2021-03-26 2021-07-13 Vivo Mobile Communication Co., Ltd. Dynamic wallpaper acquisition method and device


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination