CN113032590A - Special effect display method and device, computer equipment and computer readable storage medium


Info

Publication number
CN113032590A
Authority
CN
China
Prior art keywords
target
special effect
display
target image
display mode
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110336362.8A
Other languages
Chinese (zh)
Other versions
CN113032590B (en)
Inventor
王永杰 (Wang Yongjie)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Fanxing Huyu IT Co Ltd
Original Assignee
Guangzhou Fanxing Huyu IT Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Fanxing Huyu IT Co Ltd
Priority to CN202110336362.8A
Publication of CN113032590A
Application granted
Publication of CN113032590B
Legal status: Active

Classifications

    • G PHYSICS
        • G06 COMPUTING; CALCULATING OR COUNTING
            • G06F ELECTRIC DIGITAL DATA PROCESSING
                • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
                    • G06F16/40 Information retrieval of multimedia data, e.g. slideshows comprising image and additional audio data
                        • G06F16/43 Querying
                            • G06F16/432 Query formulation
                                • G06F16/434 Query formulation using image data, e.g. images, photos, pictures taken by a user
                            • G06F16/438 Presentation of query results
                        • G06F16/48 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
                            • G06F16/483 Retrieval using metadata automatically derived from the content
            • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T13/00 Animation
                    • G06T13/20 3D [Three Dimensional] animation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Library & Information Science (AREA)
  • Mathematical Physics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a special effect display method and device, computer equipment, and a computer-readable storage medium, belonging to the technical field of image processing. In the method, content recording the display mode of a target image is added to a target field of a target file, so that after the target image is acquired, the target file corresponding to the target image is automatically acquired and parsed, and the target image is then displayed according to the display mode recorded in the parsed target field, thereby achieving the display of the target special effect.

Description

Special effect display method and device, computer equipment and computer readable storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to a special effect display method and apparatus, a computer device, and a computer-readable storage medium.
Background
In scenes involving picture display, such as film animation, games, and multimedia data capture, special effects can bring great visual enjoyment to users and create a visually striking impact, thereby improving user experience. Therefore, a special effect display method is needed to realize special effect display and meet users' visual requirements.
Disclosure of Invention
The embodiment of the application provides a special effect display method, a special effect display device, computer equipment and a computer readable storage medium, which can realize the display of a target special effect based on a target image. The technical scheme provided by the application is as follows:
in one aspect, a special effect display method is provided, and the method includes:
in response to acquiring a target image, acquiring a target file corresponding to the target image, wherein the target file includes a target field in which the display mode of the target image is recorded;
analyzing the target file to obtain the target field;
and displaying the target image according to the display mode recorded in the target field so as to realize the target special effect.
In a possible implementation manner, the displaying the target image according to the display manner recorded in the target field includes:
and responding to the display mode indicating that the special effect type is the first special effect type, and displaying the target image according to at least one of the row number, the column number and the frame rate indicated by the display mode.
In one possible implementation, the displaying the target image according to at least one of the row number, the column number, and the frame rate indicated by the presentation manner in response to the presentation manner indicating that the special effect type is the first special effect type includes:
determining, in response to the display mode indicating that the special effect type is the first special effect type, the target display number of the target image according to the row number and the column number indicated by the display mode;
and displaying the target images of the target display quantity according to the display duration corresponding to the frame rate indicated by the display mode.
In a possible implementation manner, the displaying the target image according to the display manner recorded in the target field includes:
and in response to the display mode indicating that the special effect type is the second special effect type, scrolling and displaying the target image according to the scrolling speed indicated by the display mode.
In a possible implementation manner, the parsing the target file to obtain the target field includes:
and analyzing the material object in the target file to obtain the target field.
In one possible implementation, the target file is a graphics language transmission format (GLTF) file.
In a possible implementation manner, the presenting the target image according to the presentation manner recorded in the target field to achieve the target special effect includes:
and responding to that the target image comprises a plurality of images, and sequentially displaying the plurality of images according to the display mode recorded in the target field so as to realize the target special effect.
In one aspect, a special effects display apparatus is provided, the apparatus comprising:
the acquisition module is used for responding to the acquisition of a target image, acquiring a target file corresponding to the target image, wherein the target file comprises a target field, and the display mode of the target image is recorded in the target field;
the analysis module is used for analyzing the target file to obtain the target field;
and the display module is used for displaying the target image according to the display mode recorded in the target field so as to realize the target special effect.
In one possible implementation manner, the display module is configured to, in response to the display manner indicating that the special effect type is the first special effect type, display the target image according to at least one of the line number, the column number, and the frame rate indicated by the display manner.
In a possible implementation manner, the display module is configured to determine, in response to the display manner indicating that the special effect type is the first special effect type, a target display number of the target image according to the number of rows and columns indicated by the display manner; and display the target images of the target display number according to the display duration corresponding to the frame rate indicated by the display manner.
In a possible implementation manner, the display module is configured to, in response to the display manner indicating that the special effect type is the second special effect type, scroll and display the target image at the scroll speed indicated by the display manner.
In a possible implementation manner, the parsing module is configured to parse the material object in the target file to obtain the target field.
In one possible implementation, the target file is a graphics language transmission format (GLTF) file.
In a possible implementation manner, the display module is configured to, in response to that the target image includes a plurality of images, sequentially display the plurality of images according to the display manner recorded in the target field, so as to implement the target special effect.
In one aspect, a computer device is provided that includes one or more processors and one or more memories having at least one program code stored therein, the program code being loaded and executed by the one or more processors to implement operations performed by the special effects presentation method.
In one aspect, a computer-readable storage medium having at least one program code stored therein is provided, the program code being loaded and executed by a processor to implement the operations performed by the special effects presentation method.
In one aspect, a computer program product is provided that includes computer program code to be loaded and executed by a processor to perform the operations performed by the special effects presentation method.
According to the scheme provided by the application, content recording the display mode of the target image is added to the target field of the target file, so that after the target image is acquired, the target file corresponding to the target image is automatically acquired and parsed, and the target image is then displayed according to the display mode recorded in the parsed target field, thereby achieving the display of the target special effect.
Drawings
To describe the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those skilled in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a schematic diagram of an implementation environment of a special effect display method provided in an embodiment of the present application;
FIG. 2 is a flowchart of a special effect display method provided in an embodiment of the present application;
FIG. 3 is a flowchart of a special effect display method provided in an embodiment of the present application;
FIG. 4 is a diagram illustrating an effect of displaying a special effect provided by an embodiment of the present application;
FIG. 5 is a diagram illustrating an effect of displaying a special effect provided by an embodiment of the present application;
FIG. 6 is a flowchart of a special effect display method provided in an embodiment of the present application;
FIG. 7 is a schematic structural diagram of an effect display apparatus according to an embodiment of the present disclosure;
fig. 8 is a schematic structural diagram of a terminal according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Fig. 1 is a schematic diagram of an implementation environment of a special effect display method provided in an embodiment of the present application, and referring to fig. 1, the implementation environment includes: a terminal 101 and a server 102.
The terminal 101 is at least one of a smart phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, and a laptop computer. The terminal 101 and the server 102 are connected by wired or wireless communication, which is not limited in the present embodiment. The user can select the target image through the terminal 101; the terminal 101 acquires the target image selected by the user and then displays the target image according to a specific display mode to realize a target special effect. Alternatively, the terminal 101 sends the target image selected by the user to the server 102 and receives multimedia data (such as video data, dynamic images, and the like) corresponding to the target image sent by the server 102, and then displays the multimedia data, thereby realizing the display of the target special effect.
The terminal 101 may be generally referred to as one of a plurality of terminals, and the embodiment of the present application is illustrated by the terminal 101. Those skilled in the art will appreciate that the number of terminals described above may be greater or fewer. For example, the number of the terminals may be only a few, or the number of the terminals may be several tens or hundreds, or more, and the number of the terminals 101 and the type of the device are not limited in the embodiment of the present application.
The server 102 may be at least one of a server, a plurality of servers, a cloud computing platform, and a virtualization center. The server 102 and the terminal 101 are connected by wired or wireless communication, which is not limited in the embodiment of the present application. The server 102 receives the target image sent by the terminal 101, and further generates multimedia data (such as video data, dynamic images, and the like) corresponding to the target image according to a specific display mode, the multimedia data displays the target image according to the specific display mode to achieve a target special effect, and further sends the multimedia data to the terminal 101, so that the target special effect of the target image is displayed through the terminal 101. Optionally, the number of the servers may be more or less, and the embodiment of the present application does not limit this. Of course, the server 102 may also include other functional servers to provide more comprehensive and diverse services.
It should be noted that, the terminal 101 and the server 102 provided in fig. 1 can both generate multimedia data corresponding to the target image according to a specific presentation manner, and both the terminal 101 and the server 102 belong to a computer device.
Fig. 2 is a flowchart of a special effect displaying method provided in an embodiment of the present application, and referring to fig. 2, the method includes:
201. In response to acquiring a target image, the computer device acquires a target file corresponding to the target image, wherein the target file includes a target field in which the display mode of the target image is recorded.
It should be noted that the target file is a Graphics Language Transmission Format (GLTF) file. The target file includes a target field, such as an fxextra field, and the display mode of the target image recorded in the target field includes a special effect type and a special effect display parameter, so that the target image is displayed according to the special effect type and the special effect display parameter.
202. And the computer equipment analyzes the target file to obtain the target field.
203. And the computer equipment displays the target image according to the display mode recorded in the target field so as to realize the target special effect.
In a possible implementation manner, the computer device displays the target image in the mode corresponding to the special effect type included in the display mode recorded in the target field, according to the special effect display parameters included in that display mode, so as to implement the target special effect.
According to the scheme provided by this embodiment of the application, content recording the display mode of the target image is added to the target field of the target file, so that after the target image is acquired, the target file corresponding to the target image is automatically acquired and parsed, and the target image is then displayed according to the display mode recorded in the parsed target field, thereby achieving the display of the target special effect.
Fig. 3 is a flowchart of a special effect displaying method provided in an embodiment of the present application, and referring to fig. 3, the method includes:
301. In response to acquiring a target image, the computer device acquires a target file corresponding to the target image, wherein the target file includes a target field in which the display mode of the target image is recorded.
The target image is an image in any of multiple formats, such as Joint Photographic Experts Group (JPEG), Bitmap (BMP), or Portable Network Graphics (PNG) format; optionally, the target image is an image in another format, which is not limited in this embodiment of the present application. By dynamically changing the target image, the appearance style of an object can be changed, which also realizes the target special effect.
It should be noted that the target file is a GLTF file that includes code in which a plurality of objects are defined, such as a Material object, a Texture object, a Scene object, and an Image object. The Material object includes an additional (extras) field, and a technician can record the display mode of the target image through code in the target field. The display mode includes a special effect type and special effect display parameters; optionally, the display mode also includes other content, which is not limited in this embodiment of the present application.
For example, a Material object includes the following code:
[The example code of the Material object is reproduced as images in the original patent publication and is not available in this text.]
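Since the original code listing is only available as an image, the following is purely an illustrative sketch of what a Material object carrying such a custom field might look like, written as JSON embedded in a Python string. The `fxextra` structure, its nesting, and all of its values are assumptions pieced together from the surrounding description, not the patent's actual listing:

```python
import json

# Hypothetical glTF document fragment: a Material object whose standard
# "extras" field carries a custom "fxextra" field recording the display
# mode (special effect type plus display parameters). All names except
# the standard glTF keys "materials" and "extras" are assumptions.
gltf_material_json = """
{
  "materials": [
    {
      "name": "effect_material",
      "extras": {
        "fxextra": {
          "effectType": 1,
          "col": 4,
          "row": 4,
          "fps": 20
        }
      }
    }
  ]
}
"""

doc = json.loads(gltf_material_json)
fxextra = doc["materials"][0]["extras"]["fxextra"]
print(fxextra["fps"])  # 20
```

Because `extras` is the standard glTF extension point for application-specific data, placing the custom field there keeps the file valid for viewers that ignore the field.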
The description of the Material object in the GLTF file is extended by adding a new target field to the Extras field of the Material object, and a description of the display mode of the target image is then added to the newly added target field, so that special effect display is realized based on the GLTF file, expanding the ways in which special effect display can be implemented. In addition, because GLTF is a universal 3D content format standard in the Three-Dimensional (3D) graphics world, a special effect display mode based on the GLTF file can be applied to various systems or various kinds of software, which improves the universality of the special effect display method.
In one possible implementation manner, the computer device provides a plurality of images for a technician to select from; in response to the technician's triggering operation, the computer device acquires the selected target image and further acquires the target file corresponding to the target image.
In another possible implementation manner, a technician writes the identifier of the target image into the Image object in the code of the target file, so that the target image and the target file corresponding to it are acquired by executing the code of the target file.
The above description covers only two exemplary methods for acquiring a target image; in more possible implementation manners, other manners may also be adopted, and the manner of acquiring the target image is not limited in the embodiment of the present application.
302. The computer device parses the target file to obtain the target field, and then performs the following step 303 or step 304.
In one possible implementation, the computer device parses a Material object in the target file to parse a target field (e.g., an fxextra field) from the Material object.
For example, after the computer device acquires the target file, the computer device analyzes each object in the target file by running the code included in the target file, and when the Material object and the corresponding code are run, determines an extra field from the code corresponding to the Material object, and further acquires an fxextra field from the extra field.
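As an illustrative sketch of this parsing step — assuming the glTF document is plain JSON and the custom field is literally named `fxextra` inside each Material object's `extras` field, as in the example above — the extraction might look like:

```python
import json

def parse_fxextra(gltf_text: str):
    """Parse a glTF (JSON) document and pull the custom 'fxextra'
    field out of each Material object's 'extras' field.

    A sketch only: the field name 'fxextra' is given in the patent
    as an example, and the exact structure is an assumption.
    """
    doc = json.loads(gltf_text)
    fields = []
    for material in doc.get("materials", []):
        extras = material.get("extras", {})
        if "fxextra" in extras:
            fields.append(extras["fxextra"])
    return fields

sample = '{"materials": [{"extras": {"fxextra": {"effectType": 1}}}]}'
print(parse_fxextra(sample))  # [{'effectType': 1}]
```

Materials without an `extras` field, or without the custom field inside it, are simply skipped, so ordinary glTF files still parse cleanly.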
303. And the computer equipment responds to the display mode to indicate that the special effect type is the first special effect type, and displays the target image according to at least one of the line number, the column number and the frame rate indicated by the display mode so as to realize the target special effect.
The display mode recorded in the target field may include a first special effect type or a second special effect type, and each special effect type corresponds to different special effect display parameters. For example, the special effect showing parameters corresponding to the first special effect type include a row number, a column number, and a frame rate, the special effect showing parameters corresponding to the second special effect type are a scrolling speed (including a horizontal scrolling speed and a vertical scrolling speed), and optionally, the special effect showing parameters corresponding to the first special effect type and the second special effect type further include other contents, which is not limited in this embodiment of the present application.
It should be noted that, when the display mode is recorded through the target field, special effect (effect) information is defined in the target field, and a special effect type (effectType) variable is added to the effect information, so that the effectType variable indicates which mode the computer device adopts to display the target image. Based on the value of the effectType variable, corresponding special effect display variables are then added to the effect information; these variables indicate the special effect display parameters required when the target image is displayed. For example, when the effectType variable instructs the computer device to display the target image in the manner corresponding to the first special effect type, a column number (col) variable, a row number (row) variable, and a frame rate (fps) variable are added to the effect information, so that the computer device displays the target image in the manner corresponding to the first special effect type and determines the corresponding special effect display parameters based on the col, row, and fps variables. When the effectType variable instructs the computer device to display the target image in the manner corresponding to the second special effect type, a horizontal scrolling speed (xSpeed) variable and a vertical scrolling speed (ySpeed) variable are added to the effect information, so that the computer device displays the target image in the manner corresponding to the second special effect type and determines the corresponding special effect display parameters based on the xSpeed and ySpeed variables.
In a possible implementation manner, if the display manner indicates that the special effect type is the first special effect type, the computer device determines the target display number of the target images according to the number of rows and the number of columns indicated by the display manner, and displays the target images of the target display number according to the display duration corresponding to the frame rate indicated by the display manner, so as to implement the target special effect.
It should be noted that the special effect type indicated by the display manner is determined based on the effectType variable in the target field. If the special effect type is determined to be the first special effect type based on the effectType variable, the computer device continues to execute the code to read the col variable and the row variable, determines from the read results in which rows and columns the target image is to be displayed so as to determine the display positions of the target image, and then determines the target display number of the target image according to the number of display positions. The computer device continues executing the code to read the fps variable, determines the display duration of the target image at each position according to the read result, and then displays the target images of the target display number according to the determined display duration, so as to implement the target special effect.
For example, when the special effect type is the first special effect type, the codes of the additional field and the target field in the additional field are exemplified as follows:
[Code listing rendered as an image in the original patent publication.]
The display duration of the target image at each position is determined according to the value read for the fps variable: the reciprocal of that value is taken as the display duration at each position. Taking the above code as an example, if the value read for the fps variable is 20, the display duration of the target image at each position is 0.05 seconds (s).
Still taking 0.05 s as the display duration of the target image at each position, refer to fig. 4, which is a schematic diagram of a special effect display effect provided by an embodiment of the present application; the special effect type in fig. 4 is the first special effect type. The positions of the target images drawn with solid lines are their display positions at the current time, and the positions drawn with dotted lines are their display positions after 0.05 s; the target special effect is achieved by changing the display positions of the target images.
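The first-type display parameters described above can be derived with a small sketch, assuming the `fxextra` field holds integer `col`, `row`, and `fps` values as described:

```python
def first_effect_parameters(fxextra: dict):
    """Derive display parameters for the first special effect type:
    the target display number is row * col, and the per-position
    display duration is the reciprocal of the frame rate.

    Variable names (col, row, fps) follow the description above;
    the dict layout is an assumption.
    """
    count = fxextra["row"] * fxextra["col"]
    duration_s = 1.0 / fxextra["fps"]
    return count, duration_s

# With a 4x4 grid at 20 fps, 16 positions are shown for 0.05 s each.
count, duration = first_effect_parameters({"row": 4, "col": 4, "fps": 20})
print(count, duration)  # 16 0.05
```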
304. And the computer equipment responds to the display mode to indicate that the special effect type is the second special effect type, and the target image is displayed in a rolling mode according to the rolling speed indicated by the display mode so as to realize the target special effect.
In a possible implementation manner, if the display manner indicates that the special effect type is the second special effect type, the computer device performs scrolling display on the target image according to the horizontal scrolling speed and the vertical scrolling speed indicated by the display manner to implement the target special effect.
It should be noted that, if the special effect type is determined to be the second special effect type based on the effectType variable in the target field, the computer device continues to execute the code to read the xSpeed variable and the ySpeed variable, determines the horizontal and vertical scrolling speeds of the target image according to the read results, and then scrolls the target image at the determined horizontal and vertical scrolling speeds to achieve the target special effect.
For example, when the special effect type is the second special effect type, the codes of the additional field and the target field in the additional field are exemplified as follows:
[Code listing rendered as an image in the original patent publication.]
Referring to fig. 5, fig. 5 is a schematic diagram of a special effect display effect provided by an embodiment of the present application; the special effect type in fig. 5 is the second special effect type. The positions of the target images drawn with solid lines are their display positions at the current time, and the positions drawn with dotted lines are their display positions after scrolling based on the horizontal and vertical scrolling speeds; the target special effect is achieved by changing the display positions of the target images.
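A minimal sketch of the second special effect type described above, assuming `xSpeed` and `ySpeed` are constant speeds applied to the image's display position over elapsed time (units are not specified in the text and are illustrative only):

```python
def scroll_position(x0: float, y0: float,
                    x_speed: float, y_speed: float,
                    elapsed_s: float):
    """Second special effect type: the target image scrolls at a
    constant horizontal speed (xSpeed) and vertical speed (ySpeed).

    A sketch under the assumption of linear motion from a starting
    position (x0, y0); units (e.g. pixels per second) are assumed.
    """
    return x0 + x_speed * elapsed_s, y0 + y_speed * elapsed_s

# After 0.5 s, an image starting at (0, 0) with xSpeed=100, ySpeed=40
# has scrolled to (50.0, 20.0).
print(scroll_position(0, 0, 100, 40, 0.5))
```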
According to the scheme provided by this embodiment of the application, content recording the display mode of the target image is added to the target field of the target file, so that after the target image is acquired, the target file corresponding to the target image is automatically acquired and parsed, and the target image is then displayed according to the display mode recorded in the parsed target field, thereby achieving the display of the target special effect. The scheme can realize two special effect display modes based on one target image, meeting users' visual requirements and thereby improving user experience. In addition, because this embodiment of the application uses a GLTF file to realize the display of the target special effect, and the GLTF file, as a universal 3D content format standard, is supported by many systems and many kinds of software, the scheme can be applied to those systems and software, which improves the universality of the special effect display method.
It should be noted that the process shown in fig. 3 is described by taking as an example the case of obtaining one target image and displaying it to achieve a target special effect; in more possible implementation manners, a plurality of target images may also be obtained and displayed to achieve the target special effect. Fig. 6 is a flowchart of a special effect displaying method provided in an embodiment of the present application; referring to fig. 6, the method includes:
601. In response to acquiring a target image, the computer device acquires a target file corresponding to the target image, where the target file includes a target field, and the display mode of the target image is recorded in the target field.
602. The computer device parses the target file to obtain the target field.
Steps 601 to 602 are similar to steps 301 to 302 and are not described again here.
603. In response to the target image including a plurality of images, the computer device sequentially displays the plurality of images according to the display mode recorded in the target field, so as to achieve the target special effect.
The display mode recorded in the target field corresponding to the plurality of images includes the image identifiers of the plurality of images and the display mode of each image, where each display mode includes a special effect type and the special effect display parameters corresponding to that type. For the special effect types and their corresponding display parameters, reference may be made to step 303 above; details are not repeated here.
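One way to record the display modes of multiple images in a single target field is a list of entries, each pairing an image identifier with that image's special effect type and parameters. All field names in this sketch are illustrative assumptions, not the embodiment's actual names.

```python
# Hypothetical multi-image target field: one entry per image, pairing an
# image identifier with that image's special effect type and parameters.
target_field = [
    {"imageId": "img_0", "effectType": 1, "rows": 2, "columns": 2, "frameRate": 10},
    {"imageId": "img_1", "effectType": 2, "scrollX": 30.0, "scrollY": 0.0},
]

def display_sequence(field):
    """Return (identifier, effect type) pairs in the order the images
    should be sequentially displayed."""
    return [(entry["imageId"], entry["effectType"]) for entry in field]

print(display_sequence(target_field))  # [('img_0', 1), ('img_1', 2)]
```

Keeping the per-image parameters alongside each identifier lets the two special effect types be mixed freely within one target file.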
By adding the identifiers of the plurality of images to the target field, special effect display based on a plurality of images can be achieved, which improves the flexibility of the special effect display method; moreover, special effects realized from a plurality of images are more varied, which improves the special effect display effect.
In a possible implementation, if the display mode indicates that the special effect type is the first special effect type, the computer device determines, according to the number of rows and the number of columns indicated by the display mode, the display number corresponding to each of the plurality of images included in the target image, and sequentially displays the corresponding display number of each image for the display duration corresponding to the frame rate indicated by the display mode, so as to achieve the target special effect. The specific process is the same as step 303 above and is not repeated here.
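The row/column/frame-rate logic of the first special effect type can be sketched as follows, under the assumptions that the target image is a rows × columns grid of sub-frames played in row-major order and looped, and that the per-frame display duration is 1000 / frame rate milliseconds; the embodiment does not fix these details.

```python
def frame_at(t_ms, rows, columns, frame_rate):
    """Return the (row, column) of the grid sub-frame visible at time t_ms,
    where each of the rows * columns sub-frames is shown for
    1000 / frame_rate milliseconds and the sequence loops."""
    frame_duration_ms = 1000.0 / frame_rate
    total_frames = rows * columns
    index = int(t_ms // frame_duration_ms) % total_frames
    return divmod(index, columns)  # row-major order

# With a 4 x 4 grid at 20 frames per second (50 ms per sub-frame):
print(frame_at(0, 4, 4, 20))    # (0, 0)
print(frame_at(250, 4, 4, 20))  # (1, 1)
```

A renderer would map the returned (row, column) to texture coordinates within the target image to show only that sub-frame.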
In a possible implementation, if the display mode indicates that the special effect type is the second special effect type, the computer device scrolls the plurality of images according to the horizontal scrolling speed and the vertical scrolling speed indicated by the display mode, so as to achieve the target special effect. The specific process is the same as step 304 above and is not repeated here.
According to the scheme provided by the embodiment of the present application, when the target image includes a plurality of images, content recording the display mode of the target image is added to the target field of the target file, so that after the target image is acquired, the target file corresponding to the target image is automatically acquired and parsed, and the plurality of images included in the target image are then sequentially displayed according to the display mode recorded in the parsed target field, thereby achieving the display of the target special effect.
All the above optional technical solutions may be combined arbitrarily to form optional embodiments of the present application, and are not described herein again.
Fig. 7 is a schematic structural diagram of a special effect display apparatus provided in an embodiment of the present application; referring to fig. 7, the apparatus includes:
an obtaining module 701, configured to, in response to obtaining a target image, obtain a target file corresponding to the target image, where the target file includes a target field, and a display manner of the target image is recorded in the target field;
a parsing module 702, configured to parse the target file to obtain the target field;
the display module 703 is configured to display the target image according to the display mode recorded in the target field, so as to implement a target special effect.
According to the apparatus provided by the embodiment of the present application, content recording the display mode of the target image is added to the target field of the target file, so that after the target image is acquired, the target file corresponding to the target image is automatically acquired and parsed, and the target image is then displayed according to the display mode recorded in the parsed target field, thereby achieving the display of the target special effect.
In a possible implementation, the display module 703 is configured to, in response to the display mode indicating that the special effect type is the first special effect type, display the target image according to at least one of the number of rows, the number of columns, and the frame rate indicated by the display mode.
In a possible implementation, the display module 703 is configured to determine, in response to the display mode indicating that the special effect type is the first special effect type, the target display number of the target image according to the number of rows and the number of columns indicated by the display mode, and to display the target display number of target images for the display duration corresponding to the frame rate indicated by the display mode.
In a possible implementation, the display module 703 is configured to, in response to the display mode indicating that the special effect type is the second special effect type, scroll and display the target image at the scrolling speed indicated by the display mode.
In a possible implementation manner, the parsing module 702 is configured to parse the material object in the target file to obtain the target field.
In one possible implementation, the target file is a GL Transmission Format (GLTF) file.
In a possible implementation, the display module 703 is configured to, in response to the target image including a plurality of images, sequentially display the plurality of images according to the display mode recorded in the target field, so as to achieve the target special effect.
It should be noted that, in the special effect display apparatus provided by the above embodiment, the division into the functional modules described above is merely illustrative of how the target image is displayed to achieve the target special effect; in practical application, these functions may be allocated to different functional modules as needed, that is, the internal structure of the computer device may be divided into different functional modules to complete all or part of the functions described above. In addition, the special effect display apparatus and the special effect display method provided by the above embodiments belong to the same concept; their specific implementation processes are detailed in the method embodiments and are not described again here.
In an exemplary embodiment, a computer device is provided. Optionally, the computer device is provided as a terminal or as a server; the structures of the terminal and the server are as follows:
Fig. 8 is a schematic structural diagram of a terminal according to an embodiment of the present application. The terminal 800 may be a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The terminal 800 may also be referred to by other names, such as user equipment, a portable terminal, a laptop terminal, or a desktop terminal.
In general, the terminal 800 includes: one or more processors 801 and one or more memories 802.
The processor 801 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so forth. The processor 801 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 801 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 801 may be integrated with a GPU (Graphics Processing Unit) which is responsible for rendering and drawing the content required to be displayed by the display screen. In some embodiments, the processor 801 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 802 may include one or more computer-readable storage media, which may be non-transitory. Memory 802 may also include high speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in the memory 802 is used to store at least one program code for execution by the processor 801 to implement the special effects presentation method provided by the method embodiments of the present application.
In some embodiments, the terminal 800 may further include: a peripheral interface 803 and at least one peripheral. The processor 801, memory 802 and peripheral interface 803 may be connected by bus or signal lines. Various peripheral devices may be connected to peripheral interface 803 by a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of a radio frequency circuit 804, a display screen 805, a camera assembly 806, an audio circuit 807, a positioning assembly 808, and a power supply 809.
The peripheral interface 803 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 801 and the memory 802. In some embodiments, the processor 801, memory 802, and peripheral interface 803 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 801, the memory 802, and the peripheral interface 803 may be implemented on separate chips or circuit boards, which are not limited by this embodiment.
The Radio Frequency circuit 804 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 804 communicates with communication networks and other communication devices via electromagnetic signals. The rf circuit 804 converts an electrical signal into an electromagnetic signal to be transmitted, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 804 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 804 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: metropolitan area networks, various generation mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 804 may further include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 805 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display 805 is a touch display, the display 805 also has the ability to capture touch signals on or above the surface of the display 805. The touch signal may be input to the processor 801 as a control signal for processing. At this point, the display 805 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the display 805 may be one, disposed on a front panel of the terminal 800; in other embodiments, the display 805 may be at least two, respectively disposed on different surfaces of the terminal 800 or in a folded design; in other embodiments, the display 805 may be a flexible display disposed on a curved surface or a folded surface of the terminal 800. Even further, the display 805 may be arranged in a non-rectangular irregular pattern, i.e., a shaped screen. The Display 805 can be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), and other materials.
The camera assembly 806 is used to capture images or video. Optionally, camera assembly 806 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 806 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuit 807 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 801 for processing or inputting the electric signals to the radio frequency circuit 804 to realize voice communication. For the purpose of stereo sound collection or noise reduction, a plurality of microphones may be provided at different portions of the terminal 800. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 801 or the radio frequency circuit 804 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuitry 807 may also include a headphone jack.
The positioning component 808 is used to locate the current geographic position of the terminal 800 for navigation or LBS (Location Based Service). The positioning component 808 may be based on the GPS (Global Positioning System) of the United States, the Beidou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 809 is used to supply power to the components in the terminal 800. The power supply 809 may be an alternating current supply, a direct current supply, a disposable battery, or a rechargeable battery. When the power supply 809 includes a rechargeable battery, the rechargeable battery may support wired or wireless charging, and may also support fast charging technology.
In some embodiments, terminal 800 also includes one or more sensors 810. The one or more sensors 810 include, but are not limited to: acceleration sensor 811, gyro sensor 812, pressure sensor 813, fingerprint sensor 814, optical sensor 815 and proximity sensor 816.
The acceleration sensor 811 may detect the magnitude of acceleration in three coordinate axes of the coordinate system established with the terminal 800. For example, the acceleration sensor 811 may be used to detect the components of the gravitational acceleration in three coordinate axes. The processor 801 may control the display 805 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 811. The acceleration sensor 811 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 812 may detect a body direction and a rotation angle of the terminal 800, and the gyro sensor 812 may cooperate with the acceleration sensor 811 to acquire a 3D motion of the user with respect to the terminal 800. From the data collected by the gyro sensor 812, the processor 801 may implement the following functions: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensors 813 may be disposed on the side frames of terminal 800 and/or underneath display 805. When the pressure sensor 813 is disposed on the side frame of the terminal 800, the holding signal of the user to the terminal 800 can be detected, and the processor 801 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 813. When the pressure sensor 813 is disposed at a lower layer of the display screen 805, the processor 801 controls the operability control on the UI interface according to the pressure operation of the user on the display screen 805. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 814 is used to collect the user's fingerprint; the processor 801 identifies the user's identity according to the fingerprint collected by the fingerprint sensor 814, or the fingerprint sensor 814 itself identifies the user's identity from the collected fingerprint. Upon identifying the user's identity as a trusted identity, the processor 801 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, and changing settings. The fingerprint sensor 814 may be disposed on the front, back, or side of the terminal 800. When a physical button or a manufacturer logo is provided on the terminal 800, the fingerprint sensor 814 may be integrated with the physical button or the logo.
The optical sensor 815 is used to collect the ambient light intensity. In one embodiment, processor 801 may control the display brightness of display 805 based on the ambient light intensity collected by optical sensor 815. Specifically, when the ambient light intensity is high, the display brightness of the display screen 805 is increased; when the ambient light intensity is low, the display brightness of the display 805 is reduced. In another embodiment, the processor 801 may also dynamically adjust the shooting parameters of the camera assembly 806 based on the ambient light intensity collected by the optical sensor 815.
The proximity sensor 816, also known as a distance sensor, is typically disposed on the front panel of the terminal 800. The proximity sensor 816 is used to collect the distance between the user and the front surface of the terminal 800. In one embodiment, when the proximity sensor 816 detects that the distance between the user and the front surface of the terminal 800 gradually decreases, the processor 801 controls the display 805 to switch from the screen-on state to the screen-off state; when the proximity sensor 816 detects that the distance gradually increases, the processor 801 controls the display 805 to switch from the screen-off state to the screen-on state.
Those skilled in the art will appreciate that the configuration shown in fig. 8 is not intended to be limiting of terminal 800 and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be used.
Fig. 9 is a schematic structural diagram of a server according to an embodiment of the present application. The server 900 may vary considerably in configuration or performance, and may include one or more processors (CPUs) 901 and one or more memories 902, where the one or more memories 902 store at least one piece of program code that is loaded and executed by the one or more processors 901 to implement the methods provided by the foregoing method embodiments. Certainly, the server 900 may also have components such as a wired or wireless network interface, a keyboard, and an input/output interface for input and output, and may include other components for implementing device functions, which are not described again here.
In an exemplary embodiment, a computer-readable storage medium, such as a memory including program code, which is executable by a processor to perform the special effects presentation method in the above-described embodiments, is also provided. For example, the computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a Compact Disc Read-Only Memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, and the like.
In an exemplary embodiment, a computer program product is also provided, including computer program code that is loaded and executed by a processor of a terminal, or by a processor of a server, to perform the method steps of the special effect display method provided in the above embodiments.
Those skilled in the art will understand that all or part of the steps of the above embodiments may be implemented by hardware, or by program code instructing relevant hardware; the program may be stored in a computer-readable storage medium, and the storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disc, or the like.
The above description is only exemplary of the present application and should not be taken as limiting, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (10)

1. A special effect display method is characterized by comprising the following steps:
responding to the acquisition of a target image, acquiring a target file corresponding to the target image, wherein the target file comprises a target field, and the target field records the display mode of the target image;
analyzing the target file to obtain the target field;
and displaying the target image according to the display mode recorded in the target field so as to realize the target special effect.
2. The method according to claim 1, wherein said displaying the target image according to the display mode recorded in the target field comprises:
in response to the display mode indicating that the special effect type is a first special effect type, displaying the target image according to at least one of the number of rows, the number of columns, and the frame rate indicated by the display mode.
3. The method according to claim 2, wherein the displaying the target image according to at least one of the number of rows, the number of columns, and the frame rate indicated by the display mode in response to the display mode indicating that the special effect type is the first special effect type comprises:
in response to the display mode indicating that the special effect type is the first special effect type, determining a target display number of the target image according to the number of rows and the number of columns indicated by the display mode;
and displaying the target display number of target images for the display duration corresponding to the frame rate indicated by the display mode.
4. The method according to claim 1, wherein said displaying the target image according to the display mode recorded in the target field comprises:
in response to the display mode indicating that the special effect type is a second special effect type, scrolling and displaying the target image at the scrolling speed indicated by the display mode.
5. The method of claim 1, wherein parsing the target file to obtain the target field comprises:
parsing the material object in the target file to obtain the target field.
6. The method according to claim 1, wherein the target file is a GL Transmission Format (GLTF) file.
7. The method according to claim 1, wherein the displaying the target image according to the display mode recorded in the target field to achieve the target special effect comprises:
and responding to that the target image comprises a plurality of images, and sequentially displaying the plurality of images according to the display mode recorded in the target field so as to realize the target special effect.
8. A special effects display apparatus, the apparatus comprising:
the acquisition module is used for responding to the acquisition of a target image and acquiring a target file corresponding to the target image, wherein the target file comprises a target field, and the display mode of the target image is recorded in the target field;
the parsing module is used for parsing the target file to obtain the target field;
and the display module is used for displaying the target image according to the display mode recorded in the target field so as to realize the target special effect.
9. A computer device, characterized in that the computer device comprises one or more processors and one or more memories having at least one program code stored therein, which is loaded and executed by the one or more processors to implement the operations performed by the special effects presentation method of any one of claims 1 to 7.
10. A computer-readable storage medium having stored therein at least one program code, the program code being loaded and executed by a processor to implement operations performed by the special effects presentation method of any one of claims 1 to 7.
CN202110336362.8A 2021-03-29 2021-03-29 Special effect display method, device, computer equipment and computer readable storage medium Active CN113032590B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110336362.8A CN113032590B (en) 2021-03-29 2021-03-29 Special effect display method, device, computer equipment and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110336362.8A CN113032590B (en) 2021-03-29 2021-03-29 Special effect display method, device, computer equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN113032590A true CN113032590A (en) 2021-06-25
CN113032590B CN113032590B (en) 2024-05-03

Family

ID=76452776

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110336362.8A Active CN113032590B (en) 2021-03-29 2021-03-29 Special effect display method, device, computer equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN113032590B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024011733A1 (en) * 2022-07-11 2024-01-18 上海幻电信息科技有限公司 3d image implementation method and system
EP4328863A4 (en) * 2022-07-11 2024-05-01 Shanghai Hode Information Technology Co., Ltd. 3d image implementation method and system

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019219065A1 (en) * 2018-05-17 2019-11-21 杭州海康威视数字技术股份有限公司 Video analysis method and device
CN110675466A (en) * 2019-09-27 2020-01-10 广州华多网络科技有限公司 Rendering system, rendering method, rendering device, electronic equipment and storage medium
CN110688627A (en) * 2019-08-30 2020-01-14 华为技术有限公司 3D material protection method and device
CN111061896A (en) * 2019-10-21 2020-04-24 武汉神库小匠科技有限公司 Loading method, device, equipment and medium for 3D (three-dimensional) graph based on glTF (generalized likelihood TF)
CN111737506A (en) * 2020-06-24 2020-10-02 众趣(北京)科技有限公司 Three-dimensional data display method and device and electronic equipment
CN111858828A (en) * 2020-09-24 2020-10-30 北京数字政通科技股份有限公司 Three-dimensional geographic data oriented transmission and rendering method and system
CN111935534A (en) * 2020-07-30 2020-11-13 视伴科技(北京)有限公司 Method and device for playing back recorded video
CN111930816A (en) * 2020-07-16 2020-11-13 万翼科技有限公司 Data processing method and related device


Also Published As

Publication number Publication date
CN113032590B (en) 2024-05-03


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant