CN114757815A - Dynamic fluid display method, device, electronic equipment and readable medium - Google Patents



Publication number
CN114757815A
CN114757815A (application number CN202011565590.4A)
Authority
CN
China
Prior art keywords
fluid
target object
target
image
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011565590.4A
Other languages
Chinese (zh)
Inventor
李奇 (Li Qi)
李小奇 (Li Xiaoqi)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zitiao Network Technology Co Ltd
Original Assignee
Beijing Zitiao Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zitiao Network Technology Co Ltd
Priority to CN202011565590.4A (CN114757815A)
Priority to PCT/CN2021/133136 (WO2022135017A1)
Publication of CN114757815A
Priority to US18/340,400 (US20230334730A1)
Legal status: Pending (current)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/04 Context-preserving transformations, e.g. by using an importance map
    • G06T 11/00 2D [Two Dimensional] image generation
    • G06T 11/001 Texturing; Colouring; Generation of texture or colour
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides a dynamic fluid display method and apparatus, an electronic device, and a readable medium, relating to the field of computer technologies. The method comprises the following steps: in response to an image input operation of a user, displaying an image corresponding to the image input operation on a user display interface, with a fluid displayed superimposed on the image; when a target object is detected in the user display interface, acquiring attribute information of the target object; determining, from the attribute information, a change in a parameter of the fluid at each target texel position related to the target object in the image; and displaying dynamic fluid in the image according to the change in the parameter of the fluid at each target texel position. With this technical solution, a background image input by the user can be received, the fluid can be displayed superimposed on the background image, and the parameter changes of the fluid in the image can be controlled according to the detected attribute information of the target object in the user display interface, so that the fluid drives the background image to flow with it. The interaction mode is novel and highly engaging.

Description

Dynamic fluid display method, device, electronic equipment and readable medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a dynamic fluid display method, apparatus, electronic device, and readable medium.
Background
With the rapid development of computer and communication technologies, applications running on terminal devices have become ubiquitous and have greatly enriched daily life. Users can entertain themselves and share their daily lives with other users through various applications. To enhance enjoyment, interactive features are often added to gaming or video-capture applications to improve the user experience.
However, in the prior art, the interaction modes available on mobile terminals offer only a single way of playing and are not very engaging.
Disclosure of Invention
The present disclosure provides a dynamic fluid display method, apparatus, electronic device and readable medium for solving the problems in the prior art.
In a first aspect, a dynamic fluid display method is provided, the method comprising:
responding to the image input operation of a user, displaying an image corresponding to the image input operation on a user display interface, and displaying the fluid and the image in a superposition manner;
when a target object is detected in a user display interface, acquiring attribute information of the target object;
determining, from the attribute information, a change in a parameter of the fluid at each target texel location relative to the target object in the image;
the dynamic fluid is displayed in the image according to changes in parameters of the fluid at each target texel location.
In a second aspect, there is provided a dynamic fluid display device, the device comprising:
the first display module is used for responding to the image input operation of a user, displaying an image corresponding to the image input operation on a user display interface, and overlapping and displaying the fluid and the image;
the acquisition module is used for acquiring the attribute information of the target object when the target object is detected in the user display interface;
a determination module for determining, in the image, a change in a parameter of the fluid at each target texel location relative to the target object in dependence on the attribute information;
and the second display module is used for displaying the dynamic fluid in the image according to the change of the parameter of the fluid at each target texture pixel position.
In a third aspect, the present disclosure provides an electronic device comprising:
one or more processors;
a memory storing one or more application programs, wherein the one or more application programs, when executed by the one or more processors, cause the electronic device to perform operations corresponding to the dynamic fluid display method as illustrated in the first aspect of the present disclosure.
In a fourth aspect, the present disclosure provides a computer readable medium for storing computer instructions which, when executed by a computer, cause the computer to perform the dynamic fluid display method as set forth in the first aspect of the present disclosure.
In a fifth aspect, the present disclosure provides a computer program product comprising computer instructions which, when executed by a computer, implement the dynamic fluid display method as illustrated in the first aspect of the present disclosure.
Beneficial effects of the technical solutions provided by the present disclosure may include the following:
According to the dynamic fluid display method and apparatus, electronic device, and readable medium provided by the present disclosure, a background image input by the user can be received and the fluid displayed superimposed on it. When a target object is detected in the user display interface, the parameter changes of the fluid in the image are controlled according to the detected attribute information of the target object, and dynamic fluid is displayed, so that the fluid drives the background image to flow with it. The interaction mode is novel and highly engaging.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present disclosure, the drawings used in the description of the embodiments of the present disclosure will be briefly described below.
Fig. 1 is a schematic flow chart diagram illustrating a dynamic fluid display method according to an embodiment of the disclosure;
fig. 2 is a schematic structural diagram of a dynamic fluid display device according to an embodiment of the disclosure;
fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order, and/or performed in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description.
It should be noted that the terms "first", "second", and the like in the present disclosure are used only to distinguish between devices, modules, or units; they neither imply that these are necessarily different devices, modules, or units, nor limit the order or interdependence of the functions they perform.
It is noted that the modifiers "a", "an", and "the" in this disclosure are intended to be illustrative rather than limiting; those skilled in the art will understand them to mean "one or more" unless the context clearly dictates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
The following describes the technical solutions of the present disclosure and how to solve the above technical problems in specific embodiments. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments. Embodiments of the present disclosure will be described below with reference to the accompanying drawings.
The technical scheme of the disclosure can be applied to application programs related to dynamic fluid effect making, application, use and the like. The technical scheme of the disclosure can be applied to terminal equipment, and the terminal equipment can include a mobile terminal or computer equipment, wherein the mobile terminal can include, for example, a smart phone, a palm computer, a tablet computer, a wearable device with a display screen, and the like; the computer device may include, for example, a desktop computer, a laptop computer, a kiosk, a smart home device, and the like.
The technical scheme of the disclosure can be realized by a Graphics Processing Unit (GPU) in the terminal device.
Fig. 1 is a schematic flow chart of a dynamic fluid display method provided in an embodiment of the present disclosure, and as shown in fig. 1, the method may include:
and step S101, responding to the image input operation of the user, displaying an image corresponding to the image input operation on a user display interface, and displaying the fluid and the image in a superposed manner.
The scheme provided by the embodiment of the disclosure can be implemented as an application program or a functional plug-in of the application program, and when the terminal device detects a starting instruction of a user for the application program, the application program is started, and the user display interface is displayed; or when the terminal device detects a trigger instruction of the user for the functional plug-in of the application program, the user display interface is displayed.
The terminal device may receive an image input by the user as a background image and divide the background image into a plurality of grid cells, partitioned along the horizontal and vertical directions, where the horizontal direction corresponds to the width of the image and the vertical direction corresponds to its height. Each grid cell corresponds one-to-one to a texture pixel (texel) of the GPU, and the texel positions correspond one-to-one, in the two-dimensional plane, to the screen display pixels of the terminal device. Thus, when the background image is displayed in the user display interface, each texel, and hence each grid cell, corresponds one-to-one to a pixel of the background image. The fluid is shown over the background image, and the texels store the fluid parameters for each grid cell, including an initial velocity parameter and the like.
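The grid-and-texel layout described above can be sketched as follows. This is a hedged illustration, not the patent's implementation: the function name, the dictionary-per-cell representation, and the one-cell-per-pixel mapping are assumptions for clarity (a GPU would store these parameters in texture memory).

```python
# Minimal sketch: divide a background image into a grid with one cell per
# background-image pixel; each cell stores the fluid's parameters (here, an
# initial velocity; a color or other parameters could be stored the same way).

def make_fluid_grid(width, height, initial_velocity=(0.0, 0.0)):
    """Return a height x width grid of per-cell fluid parameter records."""
    return [[{"velocity": initial_velocity} for _ in range(width)]
            for _ in range(height)]

grid = make_fluid_grid(4, 3)          # 4 cells wide, 3 cells high
assert len(grid) == 3 and len(grid[0]) == 4
assert grid[1][2]["velocity"] == (0.0, 0.0)
```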
The background image may be an image of any color, either a single color or multiple colors, the fluid may be a fluid of any color, the fluid may be in a static state or a moving state, and the fluid is displayed in superposition with the background image input by the user. Optionally, the color of the fluid is different from the color of the background image, so that the fluid and the background image are displayed in the user display interface with better display effect.
In one example, the fluid is in a colorless and transparent static state, the background image is an image formed by multiple colors with the display content being landscape, the fluid is displayed in the background image in an overlapping mode, and the background image with a layer of transparent fluid superimposed is displayed in the user display interface.
Step S102, when the target object is detected in the user display interface, acquiring the attribute information of the target object.
Wherein the target object may be a specific object in the video, including but not limited to: a face, head, gesture, finger, etc., of a person or other living being.
Among these, gestures may include, but are not limited to: raising a finger, making a fist, a finger-heart gesture, and a thumbs-up.
The terminal device may start a video capture device (e.g., a camera) of the terminal device to capture a video. The duration of the video capture may be a preset time period, or the duration of the video capture may be determined according to a video capture start instruction and a video capture end instruction, which is not limited in this disclosure. In the process of video acquisition, the terminal equipment detects a target object in a user display interface, namely a video picture.
Optionally, the target object may also be a touch operation object, for example, a user touches a display screen of the terminal device with a finger and slides on the display screen, and when the terminal device detects a touch operation instruction, an object inputting the touch operation instruction is a touch operation object (at this time, the finger is the touch operation object, that is, the target object).
The attribute information of the target object is information indicating a specific attribute of the target object. The specific attributes may include information related to the velocity and position of the target object.
Optionally, the position of the target object may be a position of the target object in the corresponding at least one grid in the user display interface.
Different types of target objects correspond to different specific implementations for acquiring the attribute information.
In one possible implementation manner, the obtaining of the attribute information of the target object includes:
acquiring the position of a target object in each frame of image of a video;
determining the moving distance of the target object from one frame image to the next frame image according to the position of the target object in each frame image of the video;
acquiring the moving time of a target object from one frame image to the next frame image;
determining the moving speed of the target object at each moment according to the moving distance and the moving time;
and determining the speed variation of the target object from the current time to the next time according to the moving speed of the target object at each time.
In practical applications, the target object may be a target object in a video, for example, a gesture in a video. After the gesture is detected in the video, the position of the gesture in each frame image of the video is obtained, the moving distance of the gesture in the adjacent frame image can be obtained according to the position of the gesture in each frame image, the moving time of the gesture in the adjacent frame image is obtained, the moving speed of the gesture is determined according to the moving distance and the moving time of the gesture, and the speed variation of the gesture is determined according to the moving speed of the gesture at each moment.
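The per-frame steps above can be sketched as follows. This is a hedged illustration under stated assumptions: the function name, a uniform frame interval `frame_dt`, and pixel-coordinate positions are all illustrative, not taken from the patent.

```python
# Sketch: from the target object's (x, y) position in successive video frames,
# derive the per-interval moving speed and the change in speed between
# consecutive instants (the "speed variation" used to drive the fluid).

def motion_stats(positions, frame_dt):
    """positions: (x, y) of the target object in successive frames.
    Returns per-interval velocities and the velocity changes between them."""
    velocities = []
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        # moving distance between adjacent frames divided by the moving time
        velocities.append(((x1 - x0) / frame_dt, (y1 - y0) / frame_dt))
    deltas = [(vx1 - vx0, vy1 - vy0)
              for (vx0, vy0), (vx1, vy1) in zip(velocities, velocities[1:])]
    return velocities, deltas

vels, dvs = motion_stats([(0, 0), (2, 0), (6, 0)], frame_dt=1.0)
assert vels == [(2.0, 0.0), (4.0, 0.0)]   # speeding up along x
assert dvs == [(2.0, 0.0)]                # velocity change between intervals
```

The same computation applies to a touch-operation object, with touch positions sampled at each instant in place of per-frame detections.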
In a possible implementation manner, the obtaining of the attribute information of the target object includes:
detecting the sliding operation speed of a touch operation object in a user display interface at each moment;
based on the sliding operation speed, the speed variation of the touch operation object from the current time to the next time is determined.
In practical applications, the target object may be a touch operation object, for example, a finger touches a touch screen to slide, the terminal device detects a position corresponding to each time of the sliding operation, determines a moving distance from a current time to a next time according to the position corresponding to each time, determines a time interval from the current time to the next time, calculates a sliding operation speed at each time according to the moving distance and the time interval, and obtains a speed variation of the finger sliding according to the sliding operation speed at each time.
Step S103 determines, in the image, a change in a parameter of the fluid at each target texel position relative to the target object, based on the attribute information.
From changes in the position, velocity, and other attributes of the target object, changes in the parameters of the fluid at the target texel positions related to the target object may be determined in the image, thereby enabling interaction between the user and the terminal device. A target texel is a texel related to the target object in the image, that is, a texel within the coverage of the target object; there may be at least one target texel. A parameter of the fluid may be any parameter reflecting the motion state, display state, and the like of the fluid.
Step S104, displaying dynamic fluid in the image according to the change of the parameters of the fluid at the position of each target texture pixel.
Specifically, according to the change of the relevant parameters such as the motion state and the display state of the fluid, the effect image after the fluid simulation is displayed in the image, so that the background image is driven by the flow of the fluid to move together, and the dynamic effect of the fluid can be displayed in the form of a video or a dynamic graph.
In one possible implementation, the parameter of the fluid comprises a velocity parameter, and for each target texel, the method further comprises:
acquiring the time step length of updating the speed parameter;
acquiring a speed parameter of fluid at the position of a target texture pixel at the current moment;
determining a first texture pixel position at the current moment according to the time step, the speed parameter of the fluid at the target texture pixel position at the current moment and the position of the target texture pixel at the current moment;
the color at the first texel position at the current time is determined as the color of the fluid at the target texel position at the next time.
In practical applications, the time step of the velocity parameter update may be preconfigured, the velocity parameter of the fluid at each target texel position is updated according to the time step, and the color of the fluid at each target texel position is updated simultaneously with the update of the velocity parameter.
In one example, the color of the fluid at each target texel location is updated by equation (1):

$$C^{t+1}(x) = C^{t}\left(x - \Delta t \cdot u^{t}(x)\right) \tag{1}$$

where $x$ is the position of the grid cell corresponding to the target texel; $C^{t+1}(x)$ is the color of the fluid at position $x$ at time $t+1$; $\Delta t$ is the time step; $u^{t}(x)$ is the velocity parameter of the fluid at position $x$ at time $t$; $x - \Delta t \cdot u^{t}(x)$ is the first texel position at time $t$; and $C^{t}\left(x - \Delta t \cdot u^{t}(x)\right)$ is the color of the fluid at the first texel position at time $t$.
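The color update of equation (1), a semi-Lagrangian backtrace, can be sketched as follows. This is a hedged illustration, not the patent's GPU implementation: nearest-cell sampling with clamped borders stands in for the bilinear texture filtering a GPU would typically apply, and the function name is illustrative.

```python
# Sketch of equation (1): the color at x for the next time step is sampled
# at the first texel position x - dt*u(x) in the current color field.

def advect_color(color, velocity, dt):
    h, w = len(color), len(color[0])
    out = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            ux, uy = velocity[y][x]
            # backtrace to the "first texel position", clamped to the grid
            sx = min(w - 1, max(0, round(x - dt * ux)))
            sy = min(h - 1, max(0, round(y - dt * uy)))
            out[y][x] = color[sy][sx]
    return out

color = [["red", "blue"]]
velocity = [[(1.0, 0.0), (1.0, 0.0)]]   # uniform rightward flow
assert advect_color(color, velocity, dt=1.0) == [["red", "red"]]
```

With a uniform rightward flow, the left cell's color is carried into the right cell, which is exactly how the fluid "drives the background image to flow".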
In one possible implementation, determining, in the image, a change in a parameter of the fluid at each target texel location relative to the target object based on the attribute information of the target object includes:
and determining the change of the speed parameter of the fluid at the position of each target texture pixel according to the speed change quantity of the target object.
In practical applications, the velocity parameter of the fluid at each position of the target texel may be adjusted according to the velocity variation of the target object, so that the fluid generates a velocity variation according to the velocity variation of the target object, and the velocity variation of the fluid may include at least one of a variation in velocity magnitude or a variation in velocity direction.
In a possible implementation manner, the attribute information of the target object further includes initial position information of the target object and an action radius of the target object, and the determining the change of the velocity parameter of the fluid at each position of the target texel according to the velocity change amount of the target object includes:
for each target texture pixel, acquiring position information corresponding to the target texture pixel;
and for each target texture pixel, determining the speed parameter of the fluid at the position of the target texture pixel at the next moment according to the speed variation of the target object, the initial position information of the target object, the action radius of the target object, the position information corresponding to the target texture pixel and the speed parameter of the fluid at the position of the target texture pixel at the current moment.
In practical application, the attribute information of the target object further includes initial position information of the target object and an action radius of the target object, where the initial position information of the target object is a position of the target object when the target object first appears in the user display interface, and may be a position of at least one corresponding grid when the target object first appears in the user display interface. The action radius of the target object represents the action range of the target object in the user display interface, the action radius can be configured in advance according to specific needs, and the larger the action radius is, the larger the action range of the target object in the user display interface is. The position of the target object in the user display interface corresponds to the position of each target texture pixel, and the position of each target texture pixel can be determined according to the position of the target object. And determining the speed of the fluid at the position of the target texture pixel at the next moment according to the speed variation of the target object, the initial position information of the target object, the action radius of the target object, the position information corresponding to the target texture pixel and the speed of the fluid at the position of the target texture pixel at the current moment.
In one example, the velocity of the fluid at the target texel position is updated according to equation (2):

$$u^{t+1}(x) = u^{t}(x) + \Delta v \cdot e^{-\|x - p\|^{2}/r} \tag{2}$$

where $u^{t+1}(x)$ is the velocity parameter of the fluid at the target texel position $x$ at time $t+1$; $u^{t}(x)$ is the velocity parameter of the fluid at the target texel position $x$ at time $t$; $x$ is the grid position corresponding to the target texel, i.e. the position corresponding to the target texel; $p$ is the initial position of the target object; $r$ is the radius of action; and $\Delta v$ is the change in velocity of the target object from time $t$ to $t+1$.
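The velocity "splat" described here can be sketched as follows. This is a hedged illustration: the original equation (2) survives only as an image placeholder, so the Gaussian falloff $e^{-\|x-p\|^{2}/r}$ used below is an assumption, a common choice in GPU fluid simulations that matches the quantities the text defines ($p$, $r$, $\Delta v$); the function name is illustrative.

```python
import math

# Sketch: add the target object's velocity change dv to the fluid, attenuated
# around the object's initial position p by its radius of action r.

def splat_velocity(velocity, p, r, dv):
    px, py = p
    dvx, dvy = dv
    for y in range(len(velocity)):
        for x in range(len(velocity[0])):
            # assumed Gaussian falloff: full effect at p, decaying with distance
            w = math.exp(-((x - px) ** 2 + (y - py) ** 2) / r)
            ux, uy = velocity[y][x]
            velocity[y][x] = (ux + w * dvx, uy + w * dvy)
    return velocity

v = [[(0.0, 0.0), (0.0, 0.0)]]
splat_velocity(v, p=(0, 0), r=1.0, dv=(2.0, 0.0))
assert v[0][0] == (2.0, 0.0)              # full effect at p
assert 0.0 < v[0][1][0] < 2.0             # attenuated one cell away
```

A larger `r` widens the region of the interface the target object stirs, matching the text's statement that a larger radius of action enlarges the action range.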
In one possible implementation, if the target object is not detected in the user display interface, for each target texel, the method further comprises:
determining a second texture pixel position at the current moment according to the time step, the speed parameter of the fluid at the target texture pixel position at the current moment and the position of the target texture pixel at the current moment;
the velocity parameter of the fluid at the second texel location at the current time is determined as the velocity parameter of the fluid at the target texel location at the next time.
In practical application, if the initial state of the fluid is static, then after a target object is detected, the parameters of the fluid at each target texel position in the image are changed according to the attribute information of the target object, so that the fluid drives the image to start flowing. If the user stops interacting at some moment, that is, the target object is no longer detected in the user display interface, the fluid continues to flow under inertia. Specifically, for each target texel in the image, a second texel position at the current time is determined from the preconfigured time step of the velocity update, the velocity parameter of the fluid at the target texel position at the current time, and the target texel position at the current time; the velocity parameter of the fluid at the second texel position at the current time is then taken as the velocity parameter of the fluid at the target texel position at the next time, and the color of the fluid is updated with this velocity parameter, for example according to the method shown in equation (1), so that the image flows with the inertia of the fluid.
In one example, the velocity of the fluid at each target texel position is updated by equation (3):

$$u^{t+1}(x) = u^{t}\left(x - \Delta t \cdot u^{t}(x)\right) \tag{3}$$

where $x$ is the position of the grid cell corresponding to the target texel; $u^{t+1}(x)$ is the velocity parameter of the fluid at position $x$ at time $t+1$; $\Delta t$ is the time step; $u^{t}(x)$ is the velocity parameter of the fluid at position $x$ at time $t$; $x - \Delta t \cdot u^{t}(x)$ is the second texel position corresponding to the target texel at position $x$ at time $t$; and $u^{t}\left(x - \Delta t \cdot u^{t}(x)\right)$ is the velocity parameter of the fluid at the second texel position at time $t$.
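The inertial update of equation (3), the velocity field advecting itself, can be sketched as follows. As with the color update, this is a hedged illustration: nearest-cell sampling with clamped borders stands in for GPU texture filtering, and the function name is illustrative.

```python
# Sketch of equation (3): when no target object is detected, each cell's next
# velocity is sampled at the second texel position x - dt*u(x), so the fluid
# keeps flowing on inertia.

def advect_velocity(velocity, dt):
    h, w = len(velocity), len(velocity[0])
    out = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            ux, uy = velocity[y][x]
            # backtrace to the "second texel position", clamped to the grid
            sx = min(w - 1, max(0, round(x - dt * ux)))
            sy = min(h - 1, max(0, round(y - dt * uy)))
            out[y][x] = velocity[sy][sx]
    return out

v = [[(2.0, 0.0), (1.0, 0.0)]]          # faster flow on the left
assert advect_velocity(v, dt=1.0) == [[(2.0, 0.0), (2.0, 0.0)]]
```

The faster upstream velocity is carried rightward, which is the inertia that keeps the image moving after the interaction stops.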
According to the dynamic fluid display method provided by the embodiments of the present disclosure, a background image input by the user can be received and the fluid displayed superimposed on it. When a target object is detected in the user display interface, the parameter changes of the fluid in the image are controlled according to the detected attribute information of the target object, and dynamic fluid is displayed, so that the fluid drives the background image to flow with it. The interaction mode is novel and highly engaging.
Based on the same principle as the method shown in fig. 1, an embodiment of the present disclosure also provides a dynamic fluid display device 20, as shown in fig. 2, the dynamic fluid display device 20 may include:
the first display module 21 is configured to, in response to an image input operation by a user, display an image corresponding to the image input operation on a user display interface, and display a fluid and the image in a superimposed manner;
the acquiring module 22 is configured to acquire attribute information of a target object when the target object is detected in the user display interface;
a determining module 23 for determining, in the image, a variation of a parameter of the fluid at each target texel position relative to the target object, based on the attribute information;
and a second display module 24 for displaying the dynamic fluid in the image according to the change of the parameter of the fluid at each target texel position.
In one possible implementation, the parameter of the fluid comprises a speed parameter, and the dynamic fluid display device 20 further comprises a color update module for:
acquiring the time step length of updating the speed parameter;
for each target texture pixel, acquiring a speed parameter of fluid at the position of the target texture pixel at the current moment;
for each target texture pixel, determining a first texture pixel position at the current moment according to the time step, the speed parameter of the fluid at the target texture pixel position at the current moment and the position of the target texture pixel at the current moment;
the color at the first texel position at the current time is determined as the color of the fluid at the target texel position at the next time.
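The color update performed by this module is the same back-tracing scheme applied to the image itself: the color at the first texel position $x - dt \cdot u^{t}(x)$ at the current time becomes the color of the fluid at $x$ at the next time. A hedged sketch (the function name and array layout are illustrative, not specified by the patent):

```python
import numpy as np

def advect_color(color, u, dt):
    """Semi-Lagrangian color update for the fluid.

    color : image of shape (H, W, C)
    u     : velocity field of shape (H, W, 2)
    dt    : time step of the velocity-parameter update
    The color found at the first texel position x - dt * u_t(x) at
    the current time becomes the color at x at the next time.
    """
    h, w = color.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    bx = np.clip(np.round(xs - dt * u[..., 0]).astype(int), 0, w - 1)
    by = np.clip(np.round(ys - dt * u[..., 1]).astype(int), 0, h - 1)
    return color[by, bx]
```

With a uniform rightward velocity and dt = 1, each texel copies the color of its left neighbour, so the image visibly drifts right frame by frame.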
In a possible implementation manner, the target object includes a target object in a video, the attribute information of the target object includes a speed variation of the target object, and the obtaining module 22 is configured to:
acquiring the position of a target object in each frame of image of a video;
determining the moving distance of the target object from one frame image to the next frame image according to the position of the target object in each frame image of the video;
acquiring the moving time of a target object from one frame image to the next frame image;
determining the moving speed of the target object at each moment according to the moving distance and the moving time;
and determining the speed variation of the target object from the current time to the next time according to the moving speed of the target object at each time.
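The steps above amount to finite differencing of the object's trajectory. A small sketch under the assumption that an upstream detector has already produced one object center per frame (the function name and data layout are hypothetical):

```python
import math

def object_speed_series(positions, frame_dt):
    """Moving speed and speed variation of a tracked target object.

    positions : (x, y) center of the target object in each video
                frame (assumed to come from an upstream detector)
    frame_dt  : moving time between consecutive frames, in seconds
    Returns (speeds, deltas): speeds[i] is the moving speed from
    frame i to frame i+1; deltas[i] is the speed variation from one
    moment to the next.
    """
    speeds = []
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        dist = math.hypot(x1 - x0, y1 - y0)  # moving distance
        speeds.append(dist / frame_dt)       # moving speed
    deltas = [b - a for a, b in zip(speeds, speeds[1:])]
    return speeds, deltas
```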
In a possible implementation manner, the target object includes a touch operation object, the attribute information of the target object includes a speed variation of the target object, and the obtaining module 22 is configured to:
detecting the sliding operation speed of a touch operation object in a user display interface at each moment;
based on the sliding operation speed, the speed variation of the touch operation object from the current time to the next time is determined.
In a possible implementation manner, when determining, in the image, the change of the parameter of the fluid at each target texel position according to the attribute information of the target object, the determining module 23 is configured to:
and determining the change of the speed parameter of the fluid at the position of each target texture pixel according to the speed change quantity of the target object.
In a possible implementation manner, the attribute information of the target object further includes initial position information of the target object and an action radius of the target object, and the determining module 23, when determining the change of the velocity parameter of the fluid at each target texel position according to the velocity change amount of the target object, is configured to:
for each target texture pixel, acquiring position information corresponding to the target texture pixel;
and for each target texture pixel, determining the speed parameter of the fluid at the position of the target texture pixel at the next moment according to the speed variation of the target object, the initial position information of the target object, the acting radius of the target object, the position information corresponding to the target texture pixel and the speed parameter of the fluid at the position of the target texture pixel at the current moment.
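The patent does not specify how these five quantities are combined. One common choice in interactive fluid effects, assumed here purely for illustration, is to add the object's speed variation to the texel's current velocity with a Gaussian falloff controlled by the action radius:

```python
import math

def apply_object_force(u_now, texel_pos, obj_pos, dv, radius):
    """Velocity parameter of the fluid at one target texel at the next time.

    u_now     : (vx, vy) velocity parameter at this texel at the current time
    texel_pos : (x, y) position information of the target texel
    obj_pos   : initial position of the target object
    dv        : (dvx, dvy) speed variation of the target object
    radius    : action radius of the target object
    The Gaussian falloff below is an assumption; the patent leaves the
    exact combination rule unspecified.
    """
    dx = texel_pos[0] - obj_pos[0]
    dy = texel_pos[1] - obj_pos[1]
    weight = math.exp(-(dx * dx + dy * dy) / (radius * radius))
    return (u_now[0] + weight * dv[0], u_now[1] + weight * dv[1])
```

Texels at the object's initial position receive the full speed variation, and the influence decays smoothly towards zero beyond roughly one action radius, which keeps the disturbance local to the interaction.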
In one possible implementation, if the target object is not detected in the user display interface, the dynamic fluid display device 20 further includes a speed update module for:
for each target texture pixel, determining a second texture pixel position at the current moment according to the time step, the speed parameter of the fluid at the target texture pixel position at the current moment and the position of the target texture pixel at the current moment;
for each target texel, the velocity parameter of the fluid at the current time instant at the second texel position is determined as the velocity parameter of the fluid at the next time instant at the target texel position.
The dynamic fluid display device of the embodiments of the present disclosure may execute the dynamic fluid display method provided by the embodiments of the present disclosure, and its implementation principle is similar. The actions executed by each module in the dynamic fluid display device correspond to the steps of the dynamic fluid display method; for a detailed functional description of each module, refer to the description of the corresponding dynamic fluid display method above, which is not repeated here.
The dynamic fluid display device provided by the embodiments of the present disclosure can receive a background image input by the user and display the fluid superimposed on it. When a target object is detected in the user display interface, the change of the fluid parameters in the image is controlled according to the detected attribute information of the target object, and the dynamic fluid is displayed so that the fluid drives the background image to flow along with it. The interaction mode is novel and highly engaging.
Referring now to FIG. 3, a block diagram of an electronic device 300 suitable for implementing embodiments of the present disclosure is shown. The execution subject of the technical solution of the present disclosure may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a vehicle-mounted terminal (e.g., a car navigation terminal), or a wearable electronic device, and a stationary terminal such as a digital TV, a desktop computer, or a smart home device. The electronic device shown in fig. 3 is only an example and imposes no limitation on the functions and scope of use of the embodiments of the present disclosure.
The electronic device includes: a memory and a processor, wherein the processor may be referred to as a processing device 301 described below, and the memory may include at least one of a Read Only Memory (ROM)302, a Random Access Memory (RAM)303, and a storage device 308, which are described below:
as shown in fig. 3, the electronic device 300 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 301 that may perform various appropriate actions and processes, such as performing the above-described functions defined in the methods of the disclosed embodiments, according to a program stored in a Read Only Memory (ROM)302 or a program loaded from a storage means 308 into a Random Access Memory (RAM) 303. In the RAM 303, various programs and data necessary for the operation of the electronic apparatus 300 are also stored. The processing device 301, the ROM 302, and the RAM 303 are connected to each other via a bus 304. An input/output (I/O) interface 305 is also connected to bus 304.
Generally, the following devices may be connected to the I/O interface 305: input devices 306 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 307 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage devices 308 including, for example, magnetic tape, hard disk, etc.; and a communication device 309. The communication means 309 may allow the electronic device 300 to communicate wirelessly or by wire with other devices to exchange data. While fig. 3 illustrates an electronic device 300 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may be alternatively implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program carried on a non-transitory computer readable medium, the computer program containing program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication means 309, or installed from the storage means 308, or installed from the ROM 302. The computer program, when executed by the processing device 301, performs the above-described functions defined in the methods of the embodiments of the present disclosure.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the electronic device; or may be separate and not incorporated into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to perform the functions defined in the methods of the embodiments of the present disclosure. For example, the one or more programs, when executed by the electronic device, cause the electronic device to perform operations comprising: responding to the image input operation of a user, displaying an image corresponding to the image input operation on a user display interface, and displaying the fluid and the image in a superposition manner; when a target object is detected in a user display interface, acquiring attribute information of the target object; determining, from the attribute information, a change in a parameter of the fluid at each target texel location relative to the target object in the image; the dynamic fluid is displayed in the image according to changes in parameters of the fluid at each target texel location.
Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including object oriented programming languages such as Java, Smalltalk, or C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules or units described in the embodiments of the present disclosure may be implemented by software or hardware. The name of a module or unit does not, in some cases, constitute a limitation on the unit itself.
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
In accordance with one or more embodiments of the present disclosure, there is provided a dynamic fluid display method, the method including:
responding to an image input operation of a user, displaying an image corresponding to the image input operation on a user display interface, and overlapping and displaying the fluid and the image;
when a target object is detected in the user display interface, acquiring attribute information of the target object;
determining, in the image, a change in a parameter of the fluid at each target texel location relative to the target object in accordance with the attribute information;
displaying dynamic fluid in the image according to changes in parameters of the fluid at the target texel locations.
In one possible implementation, the parameter of the fluid comprises a velocity parameter, and for each target texel, the method further comprises:
acquiring the time step length of updating the speed parameter;
obtaining a speed parameter of the fluid at the position of the target texture pixel at the current moment;
determining a first texture pixel position at the current moment according to the time step, the speed parameter of the fluid at the target texture pixel position at the current moment and the position of the target texture pixel at the current moment;
determining a color at the first texel location at a current time as a color of the fluid at the target texel location at a next time.
In one possible implementation manner, the obtaining the attribute information of the target object includes:
acquiring the position of the target object in each frame of image of the video;
determining the moving distance of the target object from one frame image to the next frame image according to the position of the target object in each frame image of the video;
acquiring the moving time of the target object from one frame image to the next frame image;
determining the moving speed of the target object at each moment according to the moving distance and the moving time;
and determining the speed variation of the target object from the current time to the next time according to the moving speed of the target object at each time.
In one possible implementation manner, the obtaining the attribute information of the target object includes:
detecting the sliding operation speed of the touch operation object in the user display interface at each moment;
and determining the speed variation of the touch operation object from the current moment to the next moment based on the sliding operation speed.
In one possible implementation, the determining, in the image, a change in a parameter of the fluid at each target texel position relative to the target object according to the attribute information of the target object includes:
and determining the change of the speed parameter of the fluid at each target texture pixel position according to the speed change of the target object.
In a possible implementation manner, the attribute information of the target object further includes initial position information of the target object and an action radius of the target object, and the determining a change of the velocity parameter of the fluid at each target texel position according to a velocity change amount of the target object includes:
for each target texture pixel, acquiring position information corresponding to the target texture pixel;
and for each target texture pixel, determining the speed parameter of the fluid at the position of the target texture pixel at the next moment according to the speed variation of the target object, the initial position information of the target object, the acting radius of the target object, the position information corresponding to the target texture pixel and the speed parameter of the fluid at the position of the target texture pixel at the current moment.
In a possible implementation manner, if no target object is detected in the user display interface, for each target texel, the method further includes:
determining a second texture pixel position at the current moment according to the time step, the speed parameter of the fluid at the target texture pixel position at the current moment and the position of the target texture pixel at the current moment;
determining a velocity parameter of the fluid at the second texel location at the current time as the velocity parameter of the fluid at the target texel location at the next time.
According to one or more embodiments of the present disclosure, there is provided a dynamic fluid display device including:
the first display module is used for responding to image input operation of a user, displaying an image corresponding to the image input operation on a user display interface, and overlapping and displaying the fluid and the image;
the acquisition module is used for acquiring the attribute information of the target object when the target object is detected in the user display interface;
a determination module for determining, in the image, a change in a parameter of the fluid at each target texel location relative to the target object in dependence on the attribute information;
a second display module for displaying dynamic fluid in the image according to the change of the parameter of the fluid at each target texel position.
In one possible implementation, the parameter of the fluid comprises a speed parameter, and the dynamic fluid display device further comprises a color update module configured to:
acquiring the time step of updating the speed parameter;
for each target texture pixel, acquiring a speed parameter of the fluid at the position of the target texture pixel at the current moment;
for each target texture pixel, determining a first texture pixel position at the current moment according to the time step, the speed parameter of the fluid at the position of the target texture pixel at the current moment and the position of the target texture pixel at the current moment;
determining a color at the first texel location at a current time as a color of the fluid at the target texel location at a next time.
In a possible implementation manner, the target object includes a target object in a video, the attribute information of the target object includes a speed variation of the target object, and the obtaining module is configured to:
acquiring the position of the target object in each frame of image of the video;
determining the moving distance of the target object from one frame image to the next frame image according to the position of the target object in each frame image of the video;
acquiring the moving time of the target object from one frame image to the next frame image;
determining the moving speed of the target object at each moment according to the moving distance and the moving time;
and determining the speed variation of the target object from the current time to the next time according to the moving speed of the target object at each time.
In a possible implementation manner, the target object includes a touch operation object, the attribute information of the target object includes a speed variation of the target object, and the obtaining module is configured to:
detecting the sliding operation speed of the touch operation object in the user display interface at each moment;
and determining the speed variation of the touch operation object from the current moment to the next moment based on the sliding operation speed.
In one possible implementation manner, the determining module is configured to:
and determining the change of the speed parameter of the fluid at each target texture pixel position according to the speed change of the target object.
In a possible implementation manner, the attribute information of the target object further includes initial position information of the target object and an acting radius of the target object, and the determining module, when determining the change of the velocity parameter of the fluid at each target texel position according to the velocity change amount of the target object, is configured to:
for each target texture pixel, acquiring position information corresponding to the target texture pixel;
and for each target texture pixel, determining the speed parameter of the fluid at the position of the target texture pixel at the next moment according to the speed variation of the target object, the initial position information of the target object, the acting radius of the target object, the position information corresponding to the target texture pixel and the speed parameter of the fluid at the position of the target texture pixel at the current moment.
In a possible implementation manner, if no target object is detected in the user display interface, for each target texel, the dynamic fluid display device further includes a speed updating module, configured to:
for each target texture pixel, determining a second texture pixel position at the current moment according to the time step, the speed parameter of the fluid at the target texture pixel position at the current moment and the position of the target texture pixel at the current moment;
for each target texel, determining a velocity parameter of the fluid at the second texel location at the current time as the velocity parameter of the fluid at the target texel location at the next time.
In accordance with one or more embodiments of the present disclosure, there is provided an electronic device including:
one or more processors;
a memory storing one or more application programs, wherein the one or more application programs, when executed by the one or more processors, cause the electronic device to perform the dynamic fluid display method.
According to one or more embodiments of the present disclosure, there is provided a computer-readable medium for storing computer instructions which, when executed by a computer, cause the computer to perform the above-described dynamic fluid display method.
According to one or more embodiments of the present disclosure, there is provided a computer program product comprising computer instructions which, when executed by a computer, implement the above dynamic fluid display method.
The foregoing description is merely a description of the preferred embodiments of the present disclosure and of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the disclosure is not limited to technical solutions formed by the particular combinations of the features described above, but also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the spirit of the disclosure, for example, technical solutions formed by interchanging the above features with (but not limited to) features disclosed herein that have similar functions.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (11)

1. A dynamic fluid display method, the method comprising:
responding to an image input operation of a user, displaying an image corresponding to the image input operation on a user display interface, and displaying the fluid and the image in a superposition mode;
when a target object is detected in the user display interface, acquiring attribute information of the target object;
determining, in the image, a change in a parameter of the fluid at each target texel location relative to the target object in accordance with the attribute information;
displaying dynamic fluid in the image according to changes in parameters of the fluid at the target texel locations.
2. The dynamic fluid display method of claim 1, wherein the parameters of the fluid comprise a velocity parameter, the method further comprising, for each target texel:
acquiring the time step length of updating the speed parameter;
obtaining a speed parameter of the fluid at the position of the target texture pixel at the current moment;
determining a first texture pixel position at the current moment according to the time step, the speed parameter of the fluid at the target texture pixel position at the current moment and the position of the target texture pixel at the current moment;
determining a color at the first texel location at a current time as a color of the fluid at the target texel location at a next time.
3. The dynamic fluid display method according to claim 1, wherein the target object comprises a target object in a video, the attribute information of the target object comprises a speed change amount of the target object, and the obtaining the attribute information of the target object comprises:
acquiring the position of the target object in each frame of image of the video;
determining the moving distance of the target object from one frame image to the next frame image according to the position of the target object in each frame image of the video;
acquiring the moving time of the target object from one frame image to the next frame image;
determining the moving speed of the target object at each moment according to the moving distance and the moving time;
and determining the speed variation of the target object from the current time to the next time according to the moving speed of the target object at each time.
4. The dynamic fluid display method according to claim 1, wherein the target object comprises a touch operation object, the attribute information of the target object comprises a velocity variation of the target object, and the acquiring the attribute information of the target object comprises:
detecting a sliding operation velocity of the touch operation object in the user display interface at each moment; and
determining the velocity variation of the touch operation object from the current moment to the next moment based on the sliding operation velocity.
5. The dynamic fluid display method according to claim 3 or 4, wherein the determining, in the image according to the attribute information of the target object, a change in a parameter of the fluid at each target texel position relative to the target object comprises:
determining the change in the velocity parameter of the fluid at each target texel position according to the velocity variation of the target object.
6. The dynamic fluid display method according to claim 5, wherein the attribute information of the target object further comprises initial position information of the target object and an action radius of the target object, and the determining the change in the velocity parameter of the fluid at each target texel position according to the velocity variation of the target object comprises:
for each target texel, acquiring position information corresponding to the target texel; and
for each target texel, determining the velocity parameter of the fluid at the target texel position at the next moment according to the velocity variation of the target object, the initial position information of the target object, the action radius of the target object, the position information corresponding to the target texel, and the velocity parameter of the fluid at the target texel position at the current moment.
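Claim 6 does not specify how the five inputs combine. A common choice in GPU fluid solvers is a "splat" that adds the object's velocity change to each texel, attenuated by a falloff based on the texel's distance from the object within the action radius. A sketch under that assumption (the Gaussian falloff and all names are illustrative, not from the patent):

```python
import math

def apply_impulse(vel, texel_pos, obj_pos, dv, radius):
    # vel:       fluid velocity (vx, vy) at this texel at the current moment
    # texel_pos: position of the texel; obj_pos: initial position of the object
    # dv:        velocity variation of the object; radius: action radius
    dx = texel_pos[0] - obj_pos[0]
    dy = texel_pos[1] - obj_pos[1]
    # Gaussian falloff: full effect at the object centre, negligible
    # beyond roughly one action radius
    weight = math.exp(-(dx * dx + dy * dy) / (radius * radius))
    return (vel[0] + weight * dv[0], vel[1] + weight * dv[1])
```

Texels at the object's position receive the full velocity change; texels near the edge of the action radius receive a fraction of it, producing a smooth disturbance rather than a hard-edged one.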
7. The dynamic fluid display method of claim 2, wherein, if no target object is detected in the user display interface, the method further comprises, for each target texel:
determining a second texel position at the current moment according to the time step, the velocity parameter of the fluid at the target texel position at the current moment, and the position of the target texel at the current moment; and
determining the velocity parameter of the fluid at the second texel position at the current moment as the velocity parameter of the fluid at the target texel position at the next moment.
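Claim 7 applies the same backtrace to the velocity field itself (self-advection), which is what keeps the fluid moving after the target object leaves the interface. A sketch mirroring the color-advection step of claim 2, with nearest-neighbour sampling again assumed for brevity:

```python
import numpy as np

def advect_velocity(velocity, dt):
    # Self-advection sketch: each texel backtraces along its own velocity
    # and takes the velocity found at the "second texel position".
    h, w = velocity.shape[:2]
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    src_x = np.clip(np.round(xs - dt * velocity[..., 0]).astype(int), 0, w - 1)
    src_y = np.clip(np.round(ys - dt * velocity[..., 1]).astype(int), 0, h - 1)
    return velocity[src_y, src_x]
```

A uniform velocity field is a fixed point of this step, as expected: with no obstacles or new impulses, uniform motion simply continues.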
8. A dynamic fluid display device, the device comprising:
a first display module configured to display, in response to an image input operation of a user, an image corresponding to the image input operation on a user display interface, with a fluid displayed superimposed on the image;
an acquisition module configured to acquire attribute information of a target object when the target object is detected in the user display interface;
a determination module configured to determine, in the image according to the attribute information, a change in a parameter of the fluid at each target texel position relative to the target object; and
a second display module configured to display a dynamic fluid in the image according to the changes in the parameters of the fluid at the target texel positions.
9. An electronic device, comprising:
one or more processors;
a memory storing one or more application programs, wherein the one or more application programs, when executed by the one or more processors, cause the electronic device to perform the dynamic fluid display method of any of claims 1-7.
10. A computer-readable medium storing computer instructions which, when executed by a computer, cause the computer to perform the dynamic fluid display method of any one of claims 1-7.
11. A computer program product comprising computer instructions which, when executed by a computer, implement the dynamic fluid display method of any one of claims 1-7.
CN202011565590.4A 2020-12-25 2020-12-25 Dynamic fluid display method, device, electronic equipment and readable medium Pending CN114757815A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202011565590.4A CN114757815A (en) 2020-12-25 2020-12-25 Dynamic fluid display method, device, electronic equipment and readable medium
PCT/CN2021/133136 WO2022135017A1 (en) 2020-12-25 2021-11-25 Dynamic fluid display method and apparatus, and electronic device and readable medium
US18/340,400 US20230334730A1 (en) 2020-12-25 2023-06-23 Dynamic fluid display method and apparatus, electronic device, and readable medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011565590.4A CN114757815A (en) 2020-12-25 2020-12-25 Dynamic fluid display method, device, electronic equipment and readable medium

Publications (1)

Publication Number Publication Date
CN114757815A true CN114757815A (en) 2022-07-15

Family

ID=82157357

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011565590.4A Pending CN114757815A (en) 2020-12-25 2020-12-25 Dynamic fluid display method, device, electronic equipment and readable medium

Country Status (2)

Country Link
CN (1) CN114757815A (en)
WO (1) WO2022135017A1 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2917820B1 (en) * 2012-11-06 2020-07-01 Nokia Technologies Oy Method and apparatus for creating motion effect for image
WO2017042985A1 (en) * 2015-09-09 2017-03-16 チームラボ株式会社 Information provision device
CN105303598A (en) * 2015-10-23 2016-02-03 浙江工业大学 Multi-style video artistic processing method based on texture transfer
CN110930487A (en) * 2019-11-29 2020-03-27 珠海豹趣科技有限公司 Animation implementation method and device
CN113114841B (en) * 2021-03-26 2023-05-12 维沃移动通信有限公司 Dynamic wallpaper acquisition method and device

Also Published As

Publication number Publication date
WO2022135017A1 (en) 2022-06-30

Similar Documents

Publication Publication Date Title
CN111242881B (en) Method, device, storage medium and electronic equipment for displaying special effects
CN112965780B (en) Image display method, device, equipment and medium
CN112051961A (en) Virtual interaction method and device, electronic equipment and computer readable storage medium
WO2023169305A1 (en) Special effect video generating method and apparatus, electronic device, and storage medium
CN111652675A (en) Display method and device and electronic equipment
CN114742856A (en) Video processing method, device, equipment and medium
CN113163135B (en) Animation adding method, device, equipment and medium for video
CN111833459B (en) Image processing method and device, electronic equipment and storage medium
CN114067030A (en) Dynamic fluid effect processing method and device, electronic equipment and readable medium
CN114116081B (en) Interactive dynamic fluid effect processing method and device and electronic equipment
CN113766293B (en) Information display method, device, terminal and storage medium
CN113891141B (en) Video processing method, device and equipment
CN117319725A (en) Subtitle display method, device, equipment and medium
CN114757815A (en) Dynamic fluid display method, device, electronic equipment and readable medium
CN110769129B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN111290692B (en) Picture display method and device, electronic equipment and computer readable medium
WO2022135018A1 (en) Dynamic fluid display method and apparatus, electronic device, and readable medium
CN111696214A (en) House display method and device and electronic equipment
US20230334730A1 (en) Dynamic fluid display method and apparatus, electronic device, and readable medium
CN114357348B (en) Display method and device and electronic equipment
CN111258475B (en) Information updating method and device and electronic equipment
US20240153211A1 (en) Methods, apparatuses, terminals and storage media for display control based on extended reality
CN114693847A (en) Dynamic fluid display method, device, electronic equipment and readable medium
CN118034551A (en) Display method and device and electronic equipment
CN113741749A (en) Cursor position updating method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination