CN117668289A - Display processing method and device, electronic equipment and storage medium - Google Patents

Display processing method and device, electronic equipment and storage medium

Info

Publication number
CN117668289A
CN117668289A
Authority
CN
China
Prior art keywords
animation
behavior data
played
user behavior
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211043102.2A
Other languages
Chinese (zh)
Inventor
李雨昂
陈榆
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202211043102.2A priority Critical patent/CN117668289A/en
Priority to PCT/CN2023/108182 priority patent/WO2024045931A1/en
Publication of CN117668289A publication Critical patent/CN117668289A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/70 Information retrieval of video data
    • G06F16/73 Querying
    • G06F16/735 Filtering based on additional data, e.g. user or group profiles
    • G06F16/738 Presentation of query results
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Multimedia (AREA)
  • Computational Linguistics (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the application provides a display processing method, including the following steps: an electronic device acquires, in response to a display instruction, user behavior data in a first time period; the electronic device determines, based on the user behavior data, an animation to be played, where the time length of the animation to be played is related to the user behavior data; and the electronic device displays the animation to be played on a display interface. Embodiments of the application also provide a display processing apparatus, an electronic device, and a computer-readable storage medium.

Description

Display processing method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of wireless communications technologies, and in particular, to a display processing method and apparatus, an electronic device, and a computer readable storage medium.
Background
With the wide adoption and rapid development of intelligent devices, electronic devices can collect and display behavior data such as how long a user has used the device and the user's amount of exercise. At present, such user behavior data is displayed only as raw numbers, a simple and monotonous form of presentation.
Disclosure of Invention
The embodiment of the application provides a display processing method and device, electronic equipment and a computer readable storage medium.
The technical scheme of the application is realized as follows:
in a first aspect, an embodiment of the present application provides a display processing method, including:
the electronic equipment responds to the display instruction and acquires user behavior data in a first time period;
the electronic equipment determines an animation to be played based on the user behavior data, and the time length of the animation to be played is related to the user behavior data;
and the electronic equipment displays the animation to be played on a display interface.
In a second aspect, an embodiment of the present application provides a display processing method, including:
the electronic equipment responds to the display instruction and acquires user behavior data in a first time period;
the electronic equipment determines a target image based on the user behavior data, wherein the target image is a partial image in a target image set;
the electronic device sets the target image as wallpaper.
In a third aspect, an embodiment of the present application provides a display processing apparatus, including:
the acquisition unit is configured to respond to the display instruction and acquire user behavior data in a first time period;
a processing unit configured to determine an animation to be played based on the user behavior data, the time length of the animation to be played being related to the user behavior data;
and a display unit configured to display the animation to be played on a display interface.
In a fourth aspect, an embodiment of the present application provides a display processing apparatus, including:
the acquisition unit is configured to respond to the display instruction and acquire user behavior data in a first time period;
a processing unit configured to determine a target image based on the user behavior data, the target image being a partial image in a target image set; and setting the target image as wallpaper.
In a fifth aspect, embodiments of the present application provide an electronic device including a memory and a processor; wherein,
the memory is used for storing a computer program capable of running on the processor;
the processor is configured to execute the display processing method according to the first aspect or the second aspect when the computer program is executed.
In a sixth aspect, embodiments of the present application provide a computer storage medium storing a computer program that when executed by at least one processor implements a method of display processing according to the first or second aspect.
According to the display processing method provided by the embodiments of the application, after receiving a display instruction, the electronic device can determine an animation to be played according to the user behavior data, where the time length of the animation to be played is related to the user behavior data, and the electronic device can then display the animation to be played on a display interface. In this way, the user can learn, through the animation to be played, about their behavior over a period of time. A connection is thus established between the animation and the data describing the user's use of the electronic device: the behavior data is visualized through the animation, the presentation of user behavior data is enriched, and the intelligence of the device is improved.
Drawings
Fig. 1A is a schematic flow chart of a display processing method according to an embodiment of the present application;
fig. 1B is a second schematic flow chart of a display processing method according to an embodiment of the present application;
Fig. 2 is a flowchart illustrating a display processing method according to an embodiment of the present application;
FIG. 3A is a first schematic diagram of a dynamic wallpaper according to an embodiment of the present application;
fig. 3B is a second schematic diagram of a dynamic wallpaper according to an embodiment of the present application;
fig. 3C is a third schematic diagram of a dynamic wallpaper according to an embodiment of the present application;
fig. 4 is a schematic diagram of a dynamic wallpaper according to an embodiment of the present application;
fig. 5 is a schematic flow chart of a display processing method according to an embodiment of the present application;
fig. 6 is a schematic diagram of an association relationship between display content of a dynamic wallpaper and accumulated usage time of an electronic device according to an embodiment of the present application;
fig. 7 is a schematic diagram of a composition structure of a display processing device according to an embodiment of the present application;
fig. 8 is a schematic diagram of a composition structure of an electronic device according to an embodiment of the present application.
Detailed Description
For a more complete understanding of the features and technical content of the embodiments of the present application, reference should be made to the following detailed description, taken in conjunction with the accompanying drawings, which are illustrative only and not limiting of the embodiments of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of the present application and the above figures are used for distinguishing between different objects and not for describing a particular sequential order. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed steps or elements but may include other steps or elements not listed or inherent to such process, method, article, or apparatus.
The embodiments of the application provide a display processing method that can establish an association between the behavior data of a user using an electronic device and an animation or wallpaper, visualize the user's behavior data through the animation or wallpaper, enrich the presentation of user behavior data, and improve the intelligence of the device. The display processing method provided by the embodiments of the application can be applied to an electronic device with a display function, such as a smartphone, notebook computer, tablet computer, desktop computer, wearable device, or vehicle-mounted device. It should be appreciated that the electronic device may include a display panel, and the animation or wallpaper may be displayed in a display interface on that panel.
In an embodiment of the present application, referring to a flowchart of a display processing method shown in fig. 1A, the display processing method provided in the embodiment of the present application may include the following steps 110 to 130.
Step 110, the electronic device responds to the display instruction to obtain user behavior data in a first time period.
It should be understood that the electronic device may receive a display instruction triggered by a user, and after receiving the display instruction, may acquire user behavior data in a first period of time, and determine, according to the acquired user behavior data, an animation to be played, so as to display, to the user, the user behavior in the first period of time through the animation to be played.
The display instruction may be an instruction to enter the main interface, an instruction to enter the lock screen interface, or an instruction to enter a page dedicated to displaying user behavior data, which is not limited in the embodiments of the present application. For example, when the electronic device, in a black screen state, receives a click operation from the user that is about to bring up the lock screen interface, it may acquire user behavior data in a first time period, so as to determine the animation displayed on the lock screen interface based on that data. The electronic device may also acquire user behavior data in the first time period when it successfully verifies the unlock information entered by the user and enters the main interface (or enters a foreground application and then returns to the main interface for the first time).
Alternatively, the first time period may be a time period from a specified time to a current time, where the specified time may be in units of days, in units of weeks, or in units of months, and the embodiment of the application is not limited thereto.
Optionally, the user behavior data may include, but is not limited to, at least one of: the accumulated use duration of the electronic device, the use duration of applications of a target type, the use duration of a target application, the user's amount of exercise, heart rate, and body temperature.
In the embodiments of the application, the electronic device may collect statistics on the user behavior data in the first time period. The electronic device may count the accumulated duration for which the user uses the device in the first time period, and may also count the usage duration of a certain type of application, or of a certain application, among the installed applications. The electronic device may also count the user's amount of exercise in the first time period, such as the number of steps taken and the height climbed, through a gyroscope, an acceleration sensor, and a gravity sensor; if the electronic device is provided with a heart rate sensor, a temperature sensor, or the like, the user's heart rate and/or body temperature in the first time period may also be recorded.
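The statistics described above can be thought of as a single record per time period. As a purely illustrative sketch (the patent does not prescribe any data structure; all field names here are hypothetical), such a record might look like:

```python
from dataclasses import dataclass, field
from typing import Dict, Optional


@dataclass
class UserBehaviorData:
    """Hypothetical container for the per-period statistics named above.

    All field names are illustrative assumptions, not part of the patent.
    """
    screen_time_s: int = 0                  # accumulated device usage
    app_type_time_s: Dict[str, int] = field(default_factory=dict)  # per-type usage
    steps: int = 0                          # from motion sensors
    climb_height_m: float = 0.0             # climbing height
    heart_rate_bpm: Optional[float] = None  # only if a heart-rate sensor exists
    body_temp_c: Optional[float] = None     # only if a temperature sensor exists
```

A record like `UserBehaviorData(screen_time_s=3600, steps=5000)` would then be the input that the later steps map to an animation or wallpaper.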
It should be noted that, the number of the target types may include one or more, and the number of the target applications may also include one or more, which is not limited in the embodiment of the present application.
Based on this, the electronic device may acquire the user behavior data in the first period of time using the statistical result after receiving the display instruction.
Step 120, the electronic device determines an animation to be played based on the user behavior data; the time length of the animation to be played is related to the user behavior data.
It should be appreciated that the electronic device may determine the animation to be played based on the collected user behavior data, and the playing time length of the animation may be related to the user behavior data. In this way, the user's behavior within the first time period is presented by playing the determined animation.
Optionally, the animation to be played may be at least part of the target animation, and the electronic device may determine, according to the user behavior data, a part of the target animation, to obtain the animation to be played.
In one possible implementation, the electronic device may determine the threshold interval in which the user behavior data falls, and select the portion of the target animation that matches that interval as the animation to be played.
For example, in the case where the user behavior data is the accumulated usage time of the electronic device: if the accumulated usage time is 0-2 hours, the electronic device may select all or part of the first third of the target animation as the animation to be played; if it is 2-4 hours, all or part of the middle third; and if it is more than 4 hours, all or part of the last third. The accumulated usage time of the electronic device may be replaced by the usage time of a certain type of application (e.g., learning, entertainment, etc.) installed in the electronic device, or of a certain application. In the case where the user behavior data is the user's amount of exercise (for example, the number of steps): if the number of steps in the first time period is less than or equal to 2000, the electronic device may select all or part of the first third of the target animation as the animation to be played; if it is greater than 2000 and less than or equal to 8000, all or part of the middle third; and if it is greater than 8000, all or part of the last third.
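The threshold-interval mapping in the example above can be sketched as follows. This is an illustrative assumption rather than the patent's implementation; the function name is hypothetical, and the 2-hour/4-hour thresholds and even three-way split are taken directly from the example:

```python
def select_segment(usage_hours: float, animation_len_s: float) -> tuple:
    """Return the (start_s, end_s) window of the target animation to play.

    Thresholds mirror the example above: 0-2 h maps to the first third,
    2-4 h to the middle third, and over 4 h to the last third.
    """
    third = animation_len_s / 3
    if usage_hours <= 2:
        return (0.0, third)
    if usage_hours <= 4:
        return (third, 2 * third)
    return (2 * third, animation_len_s)
```

For a 6-second target animation, three hours of accumulated usage would thus map to the middle 2-second segment.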
In one possible implementation, in the case where the user behavior data is the usage duration of target-type applications, the target type may include a first type and a second type, and the electronic device may compare their usage durations. If the usage duration of the first type of application is smaller (or larger) than that of the second type, the electronic device may select all or part of the first half of the target animation as the animation to be played; otherwise, all or part of the second half.
Alternatively, the first type of application and the second type of application may correspond to different target animations. The electronic device can compare the usage durations of the two types of applications: if the usage duration of the first type of application is longer than that of the second type, a partial animation segment is selected from the target animation corresponding to the first type of application as the animation to be played; if the usage duration of the second type of application is longer, a partial animation segment is selected from the target animation corresponding to the second type of application. The opposite convention may also be used: if the usage duration of the first type of application is shorter than that of the second type, a segment is selected from the target animation corresponding to the first type of application, and if the usage duration of the second type of application is shorter, a segment is selected from the target animation corresponding to the second type of application.
By way of example, the first type of application may be an entertainment type application, the second type of application may be a learning type application, or the first type of application may be a shopping type application, the second type of application may be an audio-visual type application, etc., which is not limited in this embodiment of the present application.
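The comparison between the two application types can be sketched as below. The direction of the comparison is a hypothetical convention (the text above allows either direction), and the function and return labels are assumptions for illustration:

```python
def choose_half(first_type_s: int, second_type_s: int) -> str:
    """Map the comparison of two usage durations (in seconds) to a half of
    the target animation. Convention assumed here: less time spent on the
    first type of application maps to the first half of the animation.
    """
    return "first_half" if first_type_s < second_type_s else "second_half"
```

For instance, 30 minutes of entertainment applications against 60 minutes of learning applications would select the first half of the target animation under this convention.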
In another possible implementation, the electronic device may compare the user behavior data with reference behavior data and determine the proportional relationship between them, so as to select the portion of the target animation that matches the proportional relationship as the animation to be played.
Alternatively, the reference behavior data may be understood as a target value that needs to be reached, or as an alert value. The reference behavior data and the user behavior data are of the same data type; for example, both are durations, or both are motion parameters.
For example, in the case where the user behavior data is an accumulated duration of time for which the user uses the electronic device, the reference behavior data may be an allowable maximum use duration; in the case that the user behavior data is a use duration of the application of the target type, the reference behavior data may be a maximum use duration allowed to be used, or may be a use duration of another type of application associated therewith, for example, the user behavior data may be a use duration of a learning-class application, the reference behavior data may be a use duration of an entertainment-class application, or the user behavior data may be a use duration of a shopping-class application, and the reference behavior data may be a use duration of an audio-visual-class application; in the case that the user behavior data is a motion quantity, the reference behavior data may be a configured motion quantity that the user needs to complete; in the case where the user behavior data is heart rate or body temperature, the reference behavior data may be an alert value of heart rate or body temperature.
Optionally, the reference behavior data may be a value preset by a user, or may be a default value configured by the electronic device, which is not limited in the embodiment of the present application.
In the embodiments of the application, after the electronic device obtains the user behavior data in the first time period, it may calculate the proportional relationship between the user behavior data and the reference behavior data; for example, the proportional relationship may be a percentage. It should be appreciated that the proportional relationship can accurately characterize the magnitude relationship between the user behavior data and the reference behavior data.
Further, after determining the proportional relationship, the electronic device may determine the animation to be played according to it. It should be appreciated that the target animation and the animation to be played may both be dynamic wallpapers. In the embodiments of the application, the animation to be played may be part or all of the content of the target animation; that is, the time length of the animation to be played is less than or equal to that of the target animation.
That is, the electronic device can determine which portion of the target animation to play based on the above proportional relationship. Different proportional relationships between the user behavior data and the reference behavior data may correspond to different portions of the target animation.
For example, if the ratio between the user behavior data and the reference behavior data is less than 100%, the animation to be played may be the first half of the target animation, and if the ratio between the user behavior data and the reference behavior data exceeds 100%, the animation to be played may be the second half of the target animation.
In addition, in the embodiments of the application, the time length of the animation to be played is related to the proportional relationship determined above. For example, the time length of the animation to be played may be positively correlated with the proportional relationship: the larger the ratio, the longer the animation to be played and the more content it contains. The time length of the animation to be played does not exceed that of the target animation.
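One way to combine the two ideas above (which half to play, and a play length positively correlated with the ratio) can be sketched as follows. This is an assumed formula for illustration, not the patent's prescribed calculation:

```python
def animation_window(user_value: float, reference_value: float,
                     target_len_s: float) -> tuple:
    """Return the (start_s, end_s) window of the target animation to play.

    Within budget (ratio <= 100%) the window lies in the first half and
    grows with the ratio; over budget it lies in the second half and grows
    with the excess, capped at the end of the target animation.
    """
    ratio = user_value / reference_value
    half = target_len_s / 2
    if ratio <= 1.0:
        return (0.0, ratio * half)
    extra = min((ratio - 1.0) * half, half)
    return (half, half + extra)
```

For a 6-second target animation and a reference of 6 hours, 3 hours of usage maps to the first 1.5 seconds, while 9 hours maps to a 1.5-second window starting at the midpoint.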
Alternatively, the target animation may include a process of morphological changes of the virtual object. The virtual object may be a virtual character, a virtual animal, a virtual plant, or the like displayed in the display interface, which is not limited in the embodiment of the present application.
For example, the target animation may be a change in the virtual plant from germination to wilting. The target animation may also be the course of a virtual animal from just birth to aging. Accordingly, the animation to be played may be a part of a process of changing the form of the virtual object, for example, the animation to be played may be a process of changing the virtual plant from sprouting to luxuriant or a process from luxuriant to withering.
And 130, the electronic equipment displays the animation to be played on a display interface.
It is understood that after determining the animation to be played, the electronic device may display the animation to be played in the display interface. Optionally, the display interface in the embodiment of the present application may be a lock screen interface, or may be a main interface of an electronic device, or may be an interface dedicated to displaying user behavior data, which is not limited in the embodiment of the present application.
Optionally, the electronic device may play the animation to be played in a circulation manner in the display interface, or may play the animation to be played once in the display interface.
In summary, according to the display processing method provided by the embodiments of the application, after a display instruction is received, the animation to be played may be determined according to the user behavior data, with its time length related to that data, and the electronic device may then display the animation to be played on the display interface. In this way, the user can learn about their recent behavior through the animation to be played. A connection is thus established between the displayed animation and the data describing the user's use of the electronic device: the behavior data is visualized in the form of an animation, the presentation of user behavior data is enriched, and the intelligence of the device is improved.
In another embodiment of the present application, referring to fig. 1B, the display processing method provided in the embodiment of the present application may also be implemented through the following steps 140 to 160.
Step 140, the electronic device responds to a display instruction to acquire user behavior data in a first time period;
Step 150, the electronic device determines a target image based on the user behavior data, wherein the target image is a partial image in a target image set;
Step 160, the electronic device sets the target image as wallpaper.
The target image set comprises a plurality of different images, and the images in the target image set can be used as wallpaper of the electronic equipment. It should be appreciated that in embodiments of the present application, the electronic device may select an image from the set of images that matches the user behavior data, and characterize the user's behavior over a first period of time by displaying the selected image.
Alternatively, the plurality of images included in the target image set may be images of different forms of the virtual object, for example, an image of virtual plant germination, an image of virtual plant flowering, and an image of virtual plant wilting may be included in the target image set.
Alternatively, the set of target images may include a plurality of image frames that make up the target animation in the previous embodiment.
In one possible implementation manner, the electronic device determines a threshold interval in which the user behavior data is located, and further determines an image corresponding to the threshold interval from the target image set, so as to obtain the target image.
For example, when the user behavior data is the accumulated usage time of the electronic device: if the accumulated usage time is 0-2 hours, the electronic device may select the first image in the target image set as the target image; if it is 2-4 hours, the second image; and if it is more than 4 hours, the third image. In the case where the user behavior data is the usage durations of entertainment applications and learning applications: if the usage duration of entertainment applications is less than that of learning applications, the electronic device may select the first image in the target image set as the target image; otherwise, the second image.
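The image selection in the example above can be sketched as follows. The file names and the function are hypothetical; only the 2-hour/4-hour thresholds come from the example:

```python
# Hypothetical image set: different forms of a virtual plant.
TARGET_IMAGES = ["sprouting.png", "flowering.png", "wilting.png"]


def select_wallpaper(usage_hours: float, image_set=TARGET_IMAGES) -> str:
    """Pick the wallpaper image matching the threshold interval that the
    accumulated usage time falls into (thresholds from the example above).
    """
    if usage_hours <= 2:
        return image_set[0]
    if usage_hours <= 4:
        return image_set[1]
    return image_set[2]
```

The selected image would then be set as the device wallpaper through whatever wallpaper API the platform provides.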
In another possible implementation manner, the electronic device determines a proportional relation between the user behavior data and the reference behavior data, and determines an image matched with the proportional relation from the target image set to obtain the target image.
It should be noted that, the manner in which the electronic device determines the proportional relationship between the user behavior data and the reference behavior data is the same as that in the above embodiment, and for brevity, a detailed description is omitted herein.
It should be further noted that, the description in the embodiments of the present application may be understood with reference to the description in the previous embodiment, and for brevity, will not be repeated herein.
In an embodiment of the present application, referring to fig. 2, the electronic device determines, based on a proportional relationship between user behavior data and reference behavior data, an animation to be played, which may be implemented by:
step 1201, determining that the animation to be played is at least part of the first portion of the target animation if the ratio between the user behavior data and the reference behavior data is less than or equal to the first parameter;
step 1202, determining that the animation to be played is at least part of the second part of the target animation under the condition that the ratio between the user behavior data and the reference behavior data is greater than the first parameter; wherein the first portion of the target animation is different from the second portion.
It should be appreciated that the target animation may be composed of two different portions; for example, the target animation may consist of a first portion and a second portion with different content. The electronic device may display different portions of the target animation in the display interface according to different proportional relationships. Thus, the user can determine whether the current behavior data exceeds the preset reference data from the content of the animation displayed in the display interface.
Alternatively, the target animation may include a process in which the virtual object gradually changes from a first form to a second form and then gradually changes from the second form to a third form. The first portion of the target animation may include the process in which the virtual object gradually changes from the first form to the second form, and the second portion of the target animation may include the process in which the virtual object gradually changes from the second form to the third form.
In some embodiments, time lengths may also be used to describe the different portions of the target animation. The time length of the target animation may be L, where L is a number greater than zero. The first portion may be the first half of the target animation, with time length L1, and the second portion may be the second half, with time length L2. For example, if the time length of the target animation is 6 seconds, the time length of the first portion, in which the virtual object gradually changes from the first form to the second form, may be 3 seconds, and the time length of the second portion may likewise be 3 seconds.
For example, if the virtual object is a virtual sunflower, the first form of the virtual object may be a sprouting form of the sunflower, as shown in fig. 3A, the second form may be a flowering form of the sunflower, as shown in fig. 3B, and the third form may be a withering form of the sunflower, as shown in fig. 3C. If the virtual object is an animated character, the first form may be a form in which the animated character smiles slightly, the second form may be a form in which the animated character laughs, and the third form may be a form in which the animated character cries.
In the embodiment of the application, the electronic device may display at least part of the content in the first portion of the target animation in the display interface under the condition that the ratio between the user behavior data and the reference behavior data is determined to be less than or equal to the first parameter. In addition, the electronic device may display at least a portion of the content in the second portion of the target animation in the display interface if it is determined that the ratio between the user behavior data and the reference behavior data is greater than the first parameter.
Alternatively, the first parameter may be a parameter set by the user; for example, the first parameter may be 100%, 115%, or 100% + 15 minutes, etc., which is not limited in the embodiment of the present application.
Optionally, in a case where the electronic device determines that the ratio between the user behavior data and the reference behavior data is less than or equal to the first parameter, the dynamic wallpaper (i.e., the animation to be played) may start playing from the first form of the virtual object. The time length of the played animation is positively correlated with the ratio between the user behavior data and the reference behavior data: the greater the ratio, the longer the animation played from the first form.
For example, in a scenario where the first parameter is 100%, if the electronic device determines that the ratio between the user behavior data and the reference behavior data is 0, the dynamic wallpaper may present the first form of the virtual object; if the electronic equipment determines that the proportion between the user behavior data and the reference behavior data is 50%, the dynamic wallpaper can present the animation of the first half part of the first part in the target animation, namely the animation of the virtual object which starts to change from the first form but is not completely changed into the second form; if the electronic device determines that the ratio between the user behavior data and the reference behavior data is 100%, the dynamic wallpaper may present all animations of the first portion of the target animation, i.e., animations in which the virtual object completely changes from the first form to the second form.
Optionally, in a case where the electronic device determines that the ratio between the user behavior data and the reference behavior data is greater than the first parameter, the dynamic wallpaper (i.e., the animation to be played) may start playing from the second form of the virtual object. The time length of the played animation is positively correlated with the ratio between the user behavior data and the reference behavior data: the greater the ratio, the longer the animation played from the second form.
For example, if the electronic device determines that the ratio between the user behavior data and the reference behavior data is 150%, the dynamic wallpaper may present an animation of the first half of the second portion of the target animation, that is, an animation in which the virtual object starts to transform from the second form, but is not completely transformed into the third form; if the electronic device determines that the ratio between the user behavior data and the reference behavior data is greater than 170%, the dynamic wallpaper may present all animations of the second portion of the target animation, i.e., animations in which the virtual object completely changes from the second form to the third form.
In some embodiments, to avoid the animation to be played being too short, its time length may be set to be at least a first duration. It can be understood that, in the embodiment of the present application, whatever the ratio between the user behavior data and the reference behavior data, the time length of the animation to be played is at least the first duration and at most the time length of the target animation.
Alternatively, the first duration may be 1 second, 0.5 second, etc., which is not limited by the embodiments of the present application.
For example, if the ratio between the user behavior data and the reference behavior data is 0, the dynamic wallpaper may play the first 1 second animation of the first portion of the target animation. If the electronic device determines that the ratio between the user behavior data and the reference behavior data is 50%, the dynamic wallpaper may present the first 2 seconds of animation of the first portion of the target animation.
In an embodiment of the present application, in the case where the ratio between the user behavior data and the reference behavior data is less than or equal to the first parameter in the step 1201, determining that the animation to be played is at least part of the first portion in the target animation may be specifically implemented by:
determining that the animation to be played is the content of the first N seconds in the first part under the condition that the ratio between the user behavior data and the reference behavior data is smaller than or equal to the second parameter, wherein N is a number larger than zero;
determining that the animation to be played is the content of the first part of the first n+k seconds under the condition that the ratio between the user behavior data and the reference behavior data is larger than the second parameter and smaller than or equal to the third parameter, wherein K is determined based on the ratio between the user behavior data and the reference behavior data;
And determining that the animation to be played is the whole content of the first part under the condition that the ratio between the user behavior data and the reference behavior data is larger than the third parameter and smaller than the first parameter.
Optionally, the value of the second parameter is smaller than the first parameter, and the second parameter may be 0%, 5%, 10%, or the like, which is not limited in the embodiment of the present application.
Specifically, when the ratio between the user behavior data and the reference behavior data is less than or equal to the second parameter, the value of the user behavior data may be considered small and far from the value of the reference behavior data; at this time, the electronic device may display the first N seconds of animation content of the first portion of the target animation in the display interface. N is the first duration in the above embodiment, and the value of N may be 0.5, 1, etc., which is not limited in this embodiment.
Optionally, the third parameter has a value greater than the second parameter and less than the first parameter. By way of example, the third parameter may be 95%, 100%, etc., which is not limited by the embodiments of the present application.
It can be understood that, when the ratio between the user behavior data and the reference behavior data is greater than the second parameter and less than or equal to the third parameter, the electronic device may determine the value of K according to the ratio between the current user behavior data and the reference behavior data, and display the animation content of the first portion of the target animation in the display interface for the first n+k seconds.
Alternatively, K may be greater than zero and less than L1−N, i.e., K ∈ (0, L1−N), where L1 is the time length of the first portion of the target animation.
Alternatively, referring to fig. 4, the ratio between the user behavior data and the reference behavior data linearly corresponds to the value of K. The electronic device may determine the value of K according to a preset correspondence and a ratio between current user behavior data and reference behavior data.
In the embodiment of the present application, when the ratio between the user behavior data and the reference behavior data is greater than the third parameter and less than the first parameter, the value of the user behavior data may be considered large and approaching the value of the reference behavior data; at this time, the electronic device may display the complete content of the first portion of the target animation in the display interface, that is, the process of the virtual object changing from the first form to the second form.
In an embodiment of the present application, in the case where the ratio between the user behavior data and the reference behavior data in the step 1202 is greater than the first parameter, determining that the animation to be played is at least part of the second portion of the target animation may be specifically implemented by:
and under the condition that the ratio between the user behavior data and the reference behavior data is greater than the first parameter, determining that the animation to be played is the content of the first N+M seconds of the second portion, where M is an integer greater than or equal to 1, and M is determined based on the ratio between the user behavior data and the reference behavior data.
It should be appreciated that when the ratio between the user behavior data and the reference behavior data is greater than the first parameter, the value of the user behavior data may be considered to have exceeded the value of the set reference behavior data. In this scenario, the electronic device may determine the value of M according to the ratio between the user behavior data and the reference behavior data, and display the animation content of the first n+m seconds of the second portion in the target animation in the display interface.
That is, the electronic device may begin playing from the second portion of the target animation (i.e., from the second form of the virtual object), specifically playing the first N+M seconds of the second portion of the target animation. N is the first duration in the above embodiment, and the value of N may be 0.5, 1, etc., which is not limited in this embodiment.
Alternatively, M may be greater than zero and less than L2−N, i.e., M ∈ (0, L2−N), where L2 is the time length of the second portion of the target animation.
Optionally, the ratio between the user behavior data and the reference behavior data corresponds linearly to the value of M. The electronic device may determine the value of M according to a preset correspondence and a ratio between current user behavior data and reference behavior data.
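The playback rules of steps 1201 and 1202, together with the N, N+K, and N+M refinements above, can be collected into a single sketch. All numeric defaults (first parameter 100%, second parameter 0%, third parameter 95%, N = 1 s, L1 = L2 = 3 s, and a 170% point at which the full second portion plays) are illustrative values consistent with the examples above, not fixed by the embodiments:

```python
def playback_seconds(ratio: float,
                     L1: float = 3.0, L2: float = 3.0, N: float = 1.0,
                     p1: float = 1.00, p2: float = 0.00, p3: float = 0.95,
                     p_full: float = 1.70):
    """Return (portion, seconds): which portion of the target animation to
    play and how many seconds of it, measured from that portion's start.

    The embodiments only require p2 < p3 < p1 and a playback length of at
    least N seconds; all numeric defaults here are illustrative.
    """
    if ratio <= p1:
        if ratio <= p2:
            seconds = N                                  # first N seconds
        elif ratio <= p3:
            # K grows linearly with the ratio, with K in (0, L1 - N)
            K = (ratio - p2) / (p3 - p2) * (L1 - N)
            seconds = N + K
        else:
            seconds = L1                                 # full first portion
        return ("first", min(seconds, L1))
    # ratio > p1: play the first N + M seconds of the second portion,
    # with M growing linearly until the full second portion at p_full
    M = min((ratio - p1) / (p_full - p1), 1.0) * (L2 - N)
    return ("second", N + M)
```

With these defaults, a ratio of 0 yields 1 second of the first portion, a ratio of 100% yields the complete first portion, and ratios of 170% or more yield the complete second portion, mirroring the worked examples above.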
In summary, according to the display processing method provided by the embodiment of the application, the electronic device can associate different proportional relations between the user behavior data and the reference behavior data with different animations to be played, and can determine the animations to be played in the dynamic wallpaper according to the proportional relations between the user behavior data and the reference behavior data after receiving the display instruction. In this way, the user can determine the relationship between the current behavior and the reference behavior by looking at the animation played in the dynamic wallpaper. Therefore, the connection relation between the dynamic wallpaper and the behavior data of the user using the electronic device can be established, the behavior data of the user is visualized through the dynamic wallpaper, the realization of the dynamic wallpaper is enriched, and the intelligence of the device is improved.
Alternatively, a plurality of animations to be played may be displayed simultaneously in the display interface, for example, referring to fig. 4, the lock screen interface may be uniformly divided into 4 regions, and each region may display one animation to be played.
Optionally, each animation to be played in the display interface may be associated with a different type of user behavior data. For example, the animation displayed in the upper left corner of the display interface may be associated with the cumulative duration for which the user has used the electronic device in the first time period, the animation displayed in the upper right corner may be associated with the user's amount of motion, the animation displayed in the lower left corner may be associated with the user's heart rate, and the animation displayed in the lower right corner may be associated with the user's body temperature. The determination manner of each animation to be played in the display interface is the same as that disclosed in the foregoing embodiments, and for brevity is not repeated here.
It will be appreciated that the user may determine the relationship between the corresponding user behavior data and the reference behavior data by viewing the different animations to be played. For example, referring to fig. 4, if the virtual sunflower displayed in the upper left corner of the lock screen interface changes from the sprouting form to a half-blooming form, it indicates that the current user's usage time of the electronic device is still short. If the virtual sunflower displayed in the upper right corner changes from the fully flowering form to the withered form, it indicates that the current user's amount of motion has exceeded the reference amount of motion. If the virtual sunflower displayed in the lower left corner changes from the sprouting form to the fully flowering form, it indicates that the current user's heart rate is normal. If the virtual sunflower displayed in the lower right corner changes from the sprouting form to the fully flowering form, it indicates that the current user's body temperature is normal. That is, the user can determine the current state of multiple items of user health data through the multiple different animations displayed in the display interface, thereby increasing the amount of information displayed.
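A minimal sketch of the four-region lock-screen layout described above (the region keys, data-type names, and the shape of the per-animation selection rule are assumptions for illustration):

```python
# Each region of the lock screen interface is bound to one type of user
# behavior data; each region then shows its own animation to be played.
REGION_DATA_SOURCES = {
    "top_left": "cumulative_usage_duration",  # time using the device
    "top_right": "motion_amount",             # user's amount of motion
    "bottom_left": "heart_rate",
    "bottom_right": "body_temperature",
}

def animations_for_interface(behavior: dict, reference: dict, select) -> dict:
    """Determine one animation per region; `select(user_value, ref_value)`
    is the per-animation rule from the earlier embodiments."""
    return {region: select(behavior[key], reference[key])
            for region, key in REGION_DATA_SOURCES.items()}
```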
In an embodiment of the present application, referring to fig. 5, in the display processing method provided in the embodiment of the present application, the following steps may be further performed before step 130:
step 510, the electronic device obtains the usage information of each application program in the second time period;
step 520, the electronic device determines display parameters of the animation to be played based on the usage information of each application program.
Correspondingly, step 130 may also be implemented by:
and the electronic equipment displays the animation to be played in the display interface based on the display parameters of the animation to be played.
In addition, in an embodiment of the present application, the following steps may be further performed after step 160:
step 530, the electronic device obtains the use information of each application program in the second time period;
step 540, the electronic device determines the display parameters of the target image based on the usage information of each application program.
Step 550, the electronic device displays the target image in the display interface based on the display parameter of the target image.
It should be noted that, the second period of time is different from the first period of time, and the second period of time may be in units of minutes, or in units of hours, or in units of days, which is not limited in the embodiment of the present application. Illustratively, the second time period may be twenty minutes, two hours, or the like, prior to the current time.
Specifically, the electronic device may obtain the usage information of each application program installed in the electronic device in the second period, where the usage information may include an accumulated duration, a number of times, a frequency, and the like of the usage, which is not limited in the embodiment of the present application.
The electronic device can determine the display parameters of the animation to be played or the target image in the display interface according to the usage information of each application program. The display parameters may include one or more of the display color, brightness, and display size of the virtual object in the animation to be played or the target image.
Optionally, the electronic device may sort the application programs according to the usage information of each application program, determine the application program that is most commonly used by the user in the second time period, or that has the longest usage time, and determine the display parameters of the animation to be played or the target image by using the display parameters of the application program.
For example, the color of the virtual object in the animation to be played or the target image may be changed according to the icon color of the application program with the longest use time in the past second period of time.
It should be noted that the process of adjusting the display parameters of the animation to be played or the target image may be independent of the process of determining the animation to be played or the target image, or may be performed simultaneously with it, which is not limited in the embodiment of the present application. In this embodiment, the process of adjusting the display parameters of the animation to be played or the target image may be triggered when the electronic device switches the display interface.
Alternatively, if the display parameter of the application program most frequently used by the user, or used for the longest time, in the second time period differs from the currently determined display parameter of the animation to be played or the target image, and no display-parameter adjustment has been performed within a preset period (for example, 10 minutes, 20 minutes, or the like), the display parameter is adjusted. The adjustment may be completed as a smooth transition within a short time (for example, 1 second).
Alternatively, the display parameters may be adjusted at most once per preset duration, for example, once every 10 or 20 minutes (measured in real time rather than usage time); if the adjustment has a relatively large influence on memory and power consumption, the preset duration may be increased, for example, to 30 minutes.
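The display-parameter adjustment described above (recoloring the virtual object after the icon of the most-used application, with a minimum interval between changes) might be sketched as follows; the function and parameter names are illustrative, and the 600-second default corresponds to the 10-minute throttle mentioned above:

```python
import time

def maybe_recolor(current_color: str,
                  usage_seconds_by_app: dict,
                  icon_color_by_app: dict,
                  last_change_ts: float,
                  min_interval_s: float = 600.0,
                  now: float = None):
    """Recolor the virtual object after the icon of the most-used app.

    Returns (new_color, new_last_change_ts). The 600 s (10 min) floor
    throttles changes; it may be raised (e.g. to 1200 or 1800 s) if
    memory or power consumption is a concern.
    """
    now = time.time() if now is None else now
    # App with the longest usage in the second time period.
    top_app = max(usage_seconds_by_app, key=usage_seconds_by_app.get)
    target = icon_color_by_app[top_app]
    # Change only if the color differs and the throttle interval elapsed;
    # in practice the change would be animated as a ~1 s smooth transition.
    if target != current_color and now - last_change_ts >= min_interval_s:
        return target, now
    return current_color, last_change_ts
```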
Therefore, according to the display processing method provided by the embodiment of the application, the display parameters of the animation or the wallpaper can be dynamically adjusted according to the use habit of the user, so that the user can determine the behavior in a period of time according to the animation or the wallpaper and the display parameters. Therefore, the connection relation between the animation or the wallpaper and the behavior data of the user using the electronic equipment can be established, the behavior data of the user is visualized, the display of the behavior data of the user is enriched, and the intelligence of the equipment is improved.
The display processing method provided by the embodiment of the application is described in detail below in connection with a specific application scenario.
In this example, the electronic device may associate the dynamic wallpaper with the cumulative duration for which the user has used the electronic device in the first time period. The target animation may be divided into two parts, a first portion and a second portion, each with a time length of 3 seconds. In the first portion, the virtual sunflower in the target animation changes from the sprouting form A to the fully flowering form B; in the second portion, the virtual sunflower changes from the fully flowering form B to the withered form B+.
Specifically, when the cumulative usage duration of the electronic device is 0, the virtual sunflower in the dynamic wallpaper may be in the sprouting form A; when the cumulative usage duration reaches the reference duration, the virtual sunflower is in the fully flowering form B; and when the cumulative usage duration exceeds the daily reference duration by 75 minutes or more, the virtual sunflower is in the withered form B+. The usage duration and the display form of the virtual sunflower are reset at zero o'clock each day.
The playing rule of the dynamic wallpaper is as follows: in any case, at least 1 second of the target animation is played, and beyond that first second the display progress of the dynamic wallpaper corresponds linearly to the cumulative usage duration of the electronic device. Referring to fig. 6, the first second of the target animation is played when the usage duration is 0; the first and second seconds are played when the cumulative usage duration reaches 50% of the daily reference duration; and the complete 3-second first portion is played when the cumulative usage duration reaches 100% to 100% + 15 minutes of the daily reference duration. The second stage starts 15 minutes past the daily reference duration, at which point the fourth second of the target animation is played; the fourth and fifth seconds are played 45 minutes past the daily reference duration; and the complete 3-second second portion (that is, the fourth, fifth, and sixth seconds) is played from 75 minutes past the daily reference duration onward, until the reset at zero o'clock.
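The concrete play rule of this example (a 1-second minimum, linear progress through the 3-second first portion up to the daily reference duration, then the second portion from 15 to 75 minutes of overtime) might be recoded as the following sketch; the function name and the linear interpolation between the stated anchor points are assumptions:

```python
def sunflower_segment(used_min: float, ref_min: float):
    """Map cumulative usage time (minutes) to the wallpaper segment:
    returns (portion, seconds to play from that portion's start).
    The target animation is 6 s total: first portion = seconds 0-3,
    second portion = seconds 3-6."""
    if used_min <= ref_min + 15:
        # First stage: at least 1 s, linearly up to the full 3 s
        # first portion at 100% of the daily reference duration.
        ratio = min(used_min / ref_min, 1.0)
        return ("first", 1.0 + 2.0 * ratio)
    # Second stage: starts 15 min past the daily reference duration;
    # the full 3 s second portion plays from 75 min past it onward.
    over = min(used_min - ref_min - 15.0, 60.0)
    return ("second", 1.0 + over / 30.0)
```

For a 120-minute daily reference duration, 0 minutes of use plays the first second, 60 minutes plays the first two seconds, 120 to 135 minutes plays the complete first portion, and 195 minutes or more plays the complete second portion.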
In addition, the color of the virtual sunflower in the dynamic wallpaper may be changed according to the icon color of the application program used for the longest time in the past hour. The electronic device may determine whether to change the color of the wallpaper upon detecting entry into the main interface: if the color corresponding to the icon of the application used for the longest time in the past hour differs from the current color, and no color change has been performed in the past 10 minutes, the color is changed. The color change is completed as a smooth transition within 1 second. The color may change at most once every 10 minutes (measured in real time rather than usage time); if this has a relatively large influence on memory and power consumption, the minimum interval between color changes may be adjusted to 20 minutes.
In an embodiment of the present application, a display processing device is further provided, where the display processing device may be integrated into the electronic device in the foregoing embodiment. Referring to fig. 7, a schematic diagram of the composition structure of a display processing device 70 according to an embodiment of the present application is shown. As shown in fig. 7, the display processing device 70 may include:
an acquisition unit 71 configured to acquire user behavior data in a first period in response to the display instruction;
a processing unit 72 configured to determine an animation to be played based on the user behavior data, the length of time of the animation to be played being related to the user behavior data;
And a display unit 73 configured to display the animation to be played on a display interface.
Optionally, the user behavior data includes any one of the following:
the accumulated use duration of the electronic equipment, the use duration of the application of the target type, the use duration of the target application, the quantity of exercise of the user, the heart rate and the body temperature.
Optionally, the processing unit 72 is further configured to determine a threshold interval in which the user behavior data is located, and select at least part of the animation corresponding to the threshold interval from the target animation as the animation to be played.
Optionally, the processing unit 72 is further configured to determine a proportional relation between the user behavior data and the reference behavior data, and select at least part of the animation matching the proportional relation from the target animations as the animation to be played.
Optionally, the processing unit 72 is further configured to determine that the animation to be played is at least part of the first portion of the target animation, if the ratio between the user behavior data and the reference behavior data is smaller than or equal to a first parameter; determining that the animation to be played is at least part of the second part of the target animation under the condition that the ratio between the user behavior data and the reference behavior data is larger than the first parameter; the first portion is different from the second portion.
Optionally, the length of time for which the animation is to be played is at least a first duration.
Optionally, the processing unit 72 is further configured to determine that the animation to be played is the first N seconds of content in the first portion, N being a number greater than zero, in the case that the ratio between the user behavior data and the reference behavior data is less than or equal to a second parameter; determining that the animation to be played is the content of the first part of the first n+k seconds under the condition that the ratio between the user behavior data and the reference behavior data is larger than the second parameter and smaller than or equal to a third parameter, wherein K is determined based on the ratio between the user behavior data and the reference behavior data; and determining that the animation to be played is the whole content of the first part under the condition that the ratio between the user behavior data and the reference behavior data is larger than the third parameter and smaller than the first parameter.
Optionally, the processing unit 72 is further configured to determine that the animation to be played is content of the first n+m seconds in the second portion, where M is an integer greater than or equal to 1, in a case where a ratio between the user behavior data and the reference behavior data is greater than the first parameter, where M is determined based on the ratio between the user behavior data and the reference behavior data.
Optionally, the target animation includes a process of the virtual object changing from a first form to a second form and from the second form to a third form;
the first portion of the target animation includes the process of the virtual object changing from the first form to the second form, and the second portion of the target animation includes the process of the virtual object changing from the second form to the third form.
Optionally, the obtaining unit 71 is further configured to obtain usage information of each application program in the second period;
the processing unit 72 is further configured to determine display parameters of the animation to be played based on the usage information of each application;
the display unit 73 is further configured to display the animation to be played in the display interface based on the display parameters of the animation to be played.
Optionally, the display parameter includes at least one of: and displaying color, brightness and display size of the virtual object in the animation to be played.
In an embodiment of the present application, a display processing device is further provided, where the display processing device may be integrated into the electronic device in the foregoing embodiment. As shown in fig. 7, the display processing device 70 may include:
An acquisition unit 71 configured to acquire user behavior data in a first period in response to the display instruction;
a processing unit 72 configured to determine a target image based on the user behavior data, the target image being a partial image in a set of target images; and setting the target image as wallpaper.
Optionally, the processing unit 72 is further configured to determine a threshold interval in which the user behavior data is located; and determining an image corresponding to the threshold interval from the target image set to obtain the target image.
Optionally, the processing unit 72 is further configured to determine a proportional relationship between the user behavior data and reference behavior data;
and select an image corresponding to the proportional relation from the target image set to obtain the target image.
Optionally, the obtaining unit 71 is further configured to obtain usage information of each application program in the second period;
the processing unit 72 is further configured to determine display parameters of the target image based on the usage information of the respective application programs;
the display unit 73 is further configured to display the target image in a display interface based on the display parameter.
It should be understood by those skilled in the art that the above description of the display processing apparatus according to the embodiments of the present application may be understood with reference to the description of the display processing method according to the embodiments of the present application.
Fig. 8 is a schematic structural diagram of an electronic device 800 provided in an embodiment of the present application. The electronic device 800 shown in fig. 8 comprises a processor 810, from which the processor 810 may call and run a computer program to implement the method in embodiments of the present application.
Optionally, as shown in fig. 8, the electronic device 800 may also include a memory 820. Wherein the processor 810 may call and run a computer program from the memory 820 to implement the methods in embodiments of the present application.
Wherein the memory 820 may be a separate device from the processor 810 or may be integrated into the processor 810.
Optionally, the electronic device 800 may specifically be the electronic device in the display processing method provided in the embodiments of the present application, and the electronic device 800 may implement the corresponding flows implemented by the electronic device in each method of the embodiments of the present application, which are not described here for brevity.
It will be appreciated that in this embodiment, a "unit" may be a part of a circuit, a part of a processor, a part of a program or software, or the like, and may of course be a module or be non-modular. Furthermore, the units in the present embodiment may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated units may be implemented in the form of hardware or in the form of software functional modules.
The integrated units, if implemented in the form of software functional modules and not sold or used as separate products, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present embodiment may be embodied, essentially or in the part contributing to the prior art, in the form of a software product. The software product is stored in a storage medium and includes several instructions to cause a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to perform all or part of the steps of the methods described in the present embodiment. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code.
It is appreciated that the memory 820 in the embodiments of the present application may be volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be a Random Access Memory (RAM), which is used as an external cache. By way of example, and not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and Direct Rambus RAM (DRRAM). The memory 820 of the systems and methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
The processor 810 may be an integrated circuit chip with signal processing capability. In implementation, the steps of the above methods may be completed by integrated logic circuits of hardware in the processor 810 or by instructions in the form of software. The processor 810 may be a general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like. The steps of the methods disclosed in connection with the embodiments of the present application may be embodied as being directly executed by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well known in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register. The storage medium is located in the memory 820, and the processor 810 reads the information in the memory 820 and completes the steps of the above methods in combination with its hardware.
It is to be understood that the embodiments described herein may be implemented in hardware, software, firmware, middleware, microcode, or a combination thereof. For a hardware implementation, the processing units may be implemented within one or more application specific integrated circuits (Application Specific Integrated Circuits, ASIC), digital signal processors (Digital Signal Processing, DSP), digital signal processing devices (DSP devices, DSPD), programmable logic devices (Programmable Logic Device, PLD), field programmable gate arrays (Field-Programmable Gate Array, FPGA), general purpose processors, controllers, microcontrollers, microprocessors, other electronic units configured to perform the functions described herein, or a combination thereof.
For a software implementation, the techniques described herein may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. The software codes may be stored in a memory and executed by a processor. The memory may be implemented within the processor or external to the processor.
Optionally, as another embodiment, the processor 810 is further configured to perform the steps of the method of any of the preceding embodiments when the computer program is run.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The device embodiments described above are only illustrative; for example, the division of the units is only a logical function division, and there may be other divisions in actual implementation, such as: multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the coupling, direct coupling, or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between devices or units may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may be separately used as one unit, or at least two units may be integrated in one unit; the integrated units may be implemented in hardware or in hardware plus software functional units.
Those of ordinary skill in the art will appreciate that all or part of the steps for implementing the above method embodiments may be completed by hardware related to program instructions. The foregoing program may be stored in a computer-readable storage medium, and when executed, the program performs the steps including the above method embodiments. The aforementioned storage medium includes: a removable storage device, a ROM, a RAM, a magnetic disk, an optical disk, or other media capable of storing program code.
Alternatively, the integrated units described above, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be embodied, essentially or in the part contributing to the prior art, in the form of a computer software product. The software product is stored in a storage medium and includes several instructions to cause a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the methods described in the embodiments of the present application. The aforementioned storage medium includes: a removable storage device, a ROM, a RAM, a magnetic disk, an optical disk, or other media capable of storing program code.
It should be noted that the technical solutions described in the embodiments of the present application may be combined arbitrarily where no conflict arises.
The foregoing is merely specific embodiments of the present application, but the scope of protection of the present application is not limited thereto. Any person skilled in the art could readily conceive of changes or substitutions within the technical scope disclosed in the present application, and such changes and substitutions are intended to be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (19)

1. A display processing method, the method comprising:
the electronic equipment responds to the display instruction and acquires user behavior data in a first time period;
the electronic equipment determines an animation to be played based on the user behavior data, and the time length of the animation to be played is related to the user behavior data;
and the electronic equipment displays the animation to be played on a display interface.
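By way of illustration only and not limitation, the flow of claim 1 can be sketched in Python; the data source, the linear duration rule, and the 10-60 second clamp are hypothetical assumptions, not features recited in the application:

```python
# Illustrative, non-limiting sketch of claim 1: on a display instruction,
# acquire user behavior data for a first time period, derive an animation
# whose duration depends on that data, and hand it to the display layer.
# The linear duration rule and the 10-60 s clamp are assumptions.

def on_display_instruction(get_behavior_data, show):
    behavior = get_behavior_data()               # e.g. minutes of device use
    duration = min(max(behavior * 0.5, 10), 60)  # duration tied to the data
    animation = {"name": "target_animation", "seconds": duration}
    show(animation)                              # display on the interface
    return animation
```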
2. The method of claim 1, wherein the user behavior data comprises any one of:
the accumulated usage duration of the electronic equipment, the usage duration of applications of a target type, the usage duration of a target application, the amount of exercise of the user, the heart rate, and the body temperature.
3. The method of claim 1, wherein the electronic device determining an animation to play based on the user behavior data comprises:
the electronic equipment determines a threshold interval in which the user behavior data are located;
and the electronic equipment selects at least part of the animation corresponding to the threshold interval from the target animation as the animation to be played.
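By way of illustration only, the threshold-interval selection of claim 3 might be implemented as follows; the interval boundaries and segment identifiers are hypothetical examples, not values from the application:

```python
# Illustrative sketch of claim 3: pick the animation segment whose
# threshold interval contains the user behavior data. The interval
# boundaries and segment names below are hypothetical examples.

def select_animation(behavior_value, intervals):
    """intervals: list of (lower, upper, segment) with lower <= value < upper."""
    for lower, upper, segment in intervals:
        if lower <= behavior_value < upper:
            return segment
    return None

# Example: cumulative usage duration (minutes) mapped to segments.
INTERVALS = [
    (0, 30, "segment_a"),
    (30, 120, "segment_b"),
    (120, float("inf"), "segment_c"),
]
```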
4. The method of claim 1, wherein the electronic device determining an animation to be played based on the user behavior data comprises:
the electronic equipment determines the proportional relation between the user behavior data and the reference behavior data;
and the electronic equipment selects at least part of the animation matched with the proportional relation from the target animation as the animation to be played.
5. The method of claim 4, wherein the electronic device selecting at least part of the animation matching the proportional relationship from the target animations as the animation to be played comprises:
determining that the animation to be played is at least part of the first part of the target animation under the condition that the ratio between the user behavior data and the reference behavior data is less than or equal to a first parameter;
determining that the animation to be played is at least part of the second part of the target animation under the condition that the ratio between the user behavior data and the reference behavior data is greater than the first parameter; the first portion is different from the second portion.
6. The method of any one of claims 1-5, wherein the time length of the animation to be played is at least a first time length.
7. The method according to claim 5, wherein the determining that the animation to be played is at least part of the first portion of the target animation if the ratio between the user behavior data and the reference behavior data is less than or equal to a first parameter comprises:
determining that the animation to be played is the content of the first N seconds in the first part under the condition that the ratio between the user behavior data and the reference behavior data is less than or equal to a second parameter, wherein N is a number greater than zero;
determining that the animation to be played is the content of the first N+K seconds in the first part under the condition that the ratio between the user behavior data and the reference behavior data is greater than the second parameter and less than or equal to a third parameter, wherein K is determined based on the ratio between the user behavior data and the reference behavior data;
and determining that the animation to be played is the whole content of the first part under the condition that the ratio between the user behavior data and the reference behavior data is greater than the third parameter and less than the first parameter.
8. The method of claim 5, wherein the determining that the animation to be played is at least part of the second portion of the target animation if the ratio between the user behavior data and the reference behavior data is greater than the first parameter comprises:
and under the condition that the ratio between the user behavior data and the reference behavior data is greater than the first parameter, determining that the animation to be played is the content of the first N+M seconds in the second part, wherein M is an integer greater than or equal to 1 and M is determined based on the ratio between the user behavior data and the reference behavior data.
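By way of illustration only, the piecewise selection of claims 5, 7, and 8 can be sketched as a single mapping from the ratio to a playback segment; the parameter values and the formulas for K and M are hypothetical assumptions (the application only requires that K and M be determined from the ratio):

```python
# Illustrative sketch of the piecewise selection in claims 5, 7 and 8.
# P1, P2, P3, N and the formulas for K and M are hypothetical; the
# application only states that K and M are derived from the ratio.

N = 5                          # base duration in seconds (assumed)
P2, P3, P1 = 0.25, 0.5, 1.0    # second < third < first parameter (assumed)

def select_playback(ratio):
    """Map ratio = behavior / reference to (part, seconds or 'all')."""
    if ratio <= P2:
        return ("first", N)                  # first N seconds of part 1
    if ratio <= P3:
        k = round((ratio - P2) * 20)         # assumed mapping for K
        return ("first", N + k)              # first N+K seconds of part 1
    if ratio < P1:
        return ("first", "all")              # whole first part
    m = max(1, round(ratio))                 # assumed mapping for M >= 1
    return ("second", N + m)                 # first N+M seconds of part 2
```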
9. The method of any one of claims 5, 7, or 8, wherein the target animation includes a process in which a virtual object changes from a first modality to a second modality, and from the second modality to a third modality;
the first portion of the target animation includes the process of the virtual object changing from the first modality to the second modality, and the second portion of the target animation includes the process of the virtual object changing from the second modality to the third modality.
10. The method of any one of claims 1-5, 7, 8, wherein before the electronic equipment displays the animation to be played on a display interface, the method further comprises:
acquiring the use information of each application program in a second time period;
determining display parameters of the animation to be played based on the use information of each application program;
the electronic equipment displaying the animation to be played on a display interface comprises:
and the electronic equipment displays the animation to be played in the display interface based on the display parameters.
11. The method of claim 10, wherein the display parameters include at least one of:
and displaying color, brightness and display size of the virtual object in the animation to be played.
12. A display processing method, the method comprising:
the electronic equipment responds to the display instruction and acquires user behavior data in a first time period;
the electronic equipment determines a target image based on the user behavior data, wherein the target image is a partial image in a target image set;
the electronic device sets the target image as wallpaper.
13. The method of claim 12, wherein determining the target image comprises:
the electronic equipment determines a threshold interval in which the user behavior data are located;
and the electronic equipment determines an image corresponding to the threshold interval from the target image set to obtain the target image.
14. The method of claim 12, wherein the electronic device determining a target image based on the user behavior data comprises:
the electronic equipment determines the proportional relation between the user behavior data and the reference behavior data;
and the electronic equipment selects an image corresponding to the proportion relation from the target image set to obtain the target image.
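By way of illustration only, the ratio-based image selection of claim 14 can be sketched as an index into the target image set; the clamping and bucketing rule is a hypothetical assumption:

```python
# Illustrative sketch of claim 14: index into the target image set by the
# ratio between user behavior data and reference data. The clamping and
# bucketing rule is a hypothetical example; the application does not fix one.

def select_image(ratio, image_set):
    """Clamp the ratio to [0, 1] and pick the matching image."""
    clamped = min(max(ratio, 0.0), 1.0)
    index = min(int(clamped * len(image_set)), len(image_set) - 1)
    return image_set[index]
```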
15. The method of any one of claims 12-14, wherein after the electronic equipment sets the target image as wallpaper, the method further comprises:
the electronic equipment acquires the use information of each application program in a second time period;
the electronic equipment determines display parameters of the target image based on the use information of each application program;
the electronic device displays the target image in a display interface based on the display parameters.
16. A display processing apparatus, the apparatus comprising:
an acquisition unit configured to acquire, in response to a display instruction, user behavior data in a first time period;
a processing unit configured to determine an animation to be played based on the user behavior data, the time length of the animation to be played being related to the user behavior data;
and a display unit configured to display the animation to be played on a display interface.
17. A display processing apparatus, the apparatus comprising:
an acquisition unit configured to acquire, in response to a display instruction, user behavior data in a first time period;
a processing unit configured to determine a target image based on the user behavior data, the target image being a partial image in a target image set, and to set the target image as wallpaper.
18. An electronic device comprising a memory and a processor; wherein,
the memory is used for storing a computer program capable of running on the processor;
the processor being adapted to perform the method of any one of claims 1 to 11 or the method of any one of claims 12 to 15 when the computer program is run.
19. A computer storage medium storing a computer program which, when executed by at least one processor, implements the method of any one of claims 1 to 11, or the method of any one of claims 12 to 15.
CN202211043102.2A 2022-08-29 2022-08-29 Display processing method and device, electronic equipment and storage medium Pending CN117668289A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202211043102.2A CN117668289A (en) 2022-08-29 2022-08-29 Display processing method and device, electronic equipment and storage medium
PCT/CN2023/108182 WO2024045931A1 (en) 2022-08-29 2023-07-19 Data processing method and apparatus, electronic device, chip, storage medium, program, and program product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211043102.2A CN117668289A (en) 2022-08-29 2022-08-29 Display processing method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117668289A true CN117668289A (en) 2024-03-08

Family

ID=90079476

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211043102.2A Pending CN117668289A (en) 2022-08-29 2022-08-29 Display processing method and device, electronic equipment and storage medium

Country Status (2)

Country Link
CN (1) CN117668289A (en)
WO (1) WO2024045931A1 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140111523A1 (en) * 2012-10-22 2014-04-24 Google Inc. Variable length animations based on user inputs
CN111857476B (en) * 2020-07-17 2022-07-08 维沃移动通信有限公司 Display method and device and electronic equipment
CN112148410A (en) * 2020-09-29 2020-12-29 维沃移动通信有限公司 Image display method and electronic equipment
CN114363683B (en) * 2020-10-14 2023-05-12 深圳市万普拉斯科技有限公司 Image display method, apparatus, computer device and storage medium
CN113656136A (en) * 2021-08-18 2021-11-16 维沃移动通信有限公司 Wallpaper display method and device and electronic equipment

Also Published As

Publication number Publication date
WO2024045931A1 (en) 2024-03-07


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination