CN115695685A - Special effect processing method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN115695685A
CN115695685A
Authority
CN
China
Prior art keywords
picture
special effect
preset
augmented reality
caustic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211339065.XA
Other languages
Chinese (zh)
Inventor
王兢业
李小奇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zitiao Network Technology Co Ltd
Original Assignee
Beijing Zitiao Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zitiao Network Technology Co Ltd filed Critical Beijing Zitiao Network Technology Co Ltd
Priority to CN202211339065.XA priority Critical patent/CN115695685A/en
Publication of CN115695685A publication Critical patent/CN115695685A/en
Priority to PCT/CN2023/125288 priority patent/WO2024088141A1/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects


Abstract

Embodiments of the present disclosure provide a special effect processing method and apparatus, an electronic device, and a storage medium. The method includes: acquiring an augmented reality picture shot by a shooting device, and generating a target special effect picture based on the augmented reality picture; when the shooting device displays a first picture area of the augmented reality picture, displaying a first special effect area of the target special effect picture in the first picture area; and when a change in the shooting angle of the shooting device is detected, displaying a second picture area of the augmented reality picture and displaying a second special effect area of the target special effect picture in the second picture area. Because the target special effect picture is generated from the augmented reality picture and changes as the shooting angle of the shooting device changes, the realism and vividness of the special effect processing are improved.

Description

Special effect processing method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to water surface rendering technologies, and in particular, to a special effect processing method and apparatus, an electronic device, and a storage medium.
Background
Currently, entertainment scenarios in terminal applications often involve water surface rendering, in which a water surface special effect is obtained by shooting and rendering with a virtual camera.
In mobile phone special effect scenarios, video is generally shot with the phone's rear camera, and the rendered water surface special effect is displayed at a fixed position on the screen. The resulting display is rigid and lacks vividness, and compared with a real water surface, the special effect has poor realism.
Disclosure of Invention
The present disclosure provides a special effect processing method and apparatus, an electronic device, and a storage medium, so as to improve the vividness and realism of special effect processing.
In a first aspect, an embodiment of the present disclosure provides a special effect processing method, where the method includes:
acquiring an augmented reality picture shot by shooting equipment, and generating a target special effect picture based on the augmented reality picture;
displaying a first special effect area of the target special effect picture in a first picture area under the condition that the shooting equipment displays the first picture area of the augmented reality picture;
and displaying a second picture area of the augmented reality picture and displaying a second special effect area of the target special effect picture in the second picture area under the condition that the shooting angle of the shooting device is detected to be changed.
In a second aspect, an embodiment of the present disclosure further provides a special effect rendering apparatus, where the apparatus includes:
the special effect generation module is used for acquiring an augmented reality picture shot by shooting equipment and generating a target special effect picture based on the augmented reality picture;
the special effect display module is used for displaying a first special effect area of the target special effect picture in a first picture area under the condition that the shooting equipment displays the first picture area of the augmented reality picture;
and the display change module is used for displaying a second picture area of the augmented reality picture and displaying the second special effect area of the target special effect picture in the second picture area under the condition that the shooting angle of the shooting device is detected to be changed.
In a third aspect, an embodiment of the present disclosure further provides an electronic device, where the electronic device includes:
one or more processors;
a storage device configured to store one or more programs, where the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the special effect processing method according to any one of claims 1-9.
In a fourth aspect, embodiments of the present disclosure further provide a storage medium containing computer-executable instructions which, when executed by a computer processor, perform the special effect processing method according to any one of claims 1-9.
According to the technical solutions of the embodiments of the present disclosure, an augmented reality picture shot by the shooting device is acquired, and a target special effect picture is generated based on the augmented reality picture; since the target special effect picture is generated from the augmented reality picture, it has a sense of space and better fits the real scene. When the shooting device displays the first picture area of the augmented reality picture, the first special effect area of the target special effect picture is displayed in the first picture area, associating the shooting angle with the displayed area of the target special effect picture to ensure the visual presentation effect. When a change in the shooting angle of the shooting device is detected, the second picture area of the augmented reality picture is displayed, and the second special effect area of the target special effect picture is displayed in the second picture area. Because different special effect areas are displayed in the corresponding picture areas as the position and angle of the shooting device change, the target special effect picture changes with the shooting angle; the change of the shooting angle simulates the change of the user's field of view, and the different picture areas of the target special effect picture simulate the changes of the actual scene, so that the rendering of the target special effect picture is more real and vivid, improving user experience.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements and features are not necessarily drawn to scale.
Fig. 1 is a schematic flowchart of a special effect processing method provided by an embodiment of the present disclosure;
Fig. 2 is a schematic flowchart of another special effect processing method provided by an embodiment of the present disclosure;
Fig. 3 is a schematic flowchart of another special effect processing method provided by an embodiment of the present disclosure;
Fig. 4 is a caustic map provided by an embodiment of the present disclosure;
Fig. 5 is a normal texture map provided by an embodiment of the present disclosure;
Fig. 6 is a schematic flowchart of a further special effect processing method provided by an embodiment of the present disclosure;
Fig. 7 is a schematic flowchart of an alternative example of a special effect processing method provided by an embodiment of the present disclosure;
Fig. 8 is a schematic structural diagram of a special effect processing apparatus provided by an embodiment of the present disclosure;
Fig. 9 is a schematic structural diagram of an electronic device for special effect processing provided by an embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be understood that the various steps recited in method embodiments of the present disclosure may be performed in a different order, and/or performed in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that references to "a" or "an" in this disclosure are intended to be illustrative rather than limiting, and that those skilled in the art will appreciate that references to "one or more" are intended to be exemplary and not limiting unless the context clearly indicates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
It is understood that before the technical solutions disclosed in the embodiments of the present disclosure are used, the type, the use range, the use scene, etc. of the personal information related to the present disclosure should be informed to the user and obtain the authorization of the user through a proper manner according to the relevant laws and regulations.
For example, in response to receiving an active request from a user, a prompt message is sent to the user to explicitly prompt the user that the requested operation to be performed would require the acquisition and use of personal information to the user. Thus, the user can autonomously select whether to provide personal information to software or hardware such as an electronic device, an application program, a server, or a storage medium that performs the operations of the disclosed technical solution, according to the prompt information.
As an optional but non-limiting implementation manner, in response to receiving an active request from the user, the manner of sending the prompt information to the user may be, for example, a pop-up window, and the prompt information may be presented in a text manner in the pop-up window. In addition, a selection control for providing personal information to the electronic device by the user's selection of "agreeing" or "disagreeing" can be carried in the pop-up window.
It is understood that the above notification and user authorization process is only illustrative and not limiting, and other ways of satisfying relevant laws and regulations may be applied to the implementation of the present disclosure.
It will be appreciated that the data involved in the subject technology, including but not limited to the data itself, the acquisition or use of the data, should comply with the requirements of the corresponding laws and regulations and related regulations.
Fig. 1 is a schematic flowchart of a special effect processing method provided by an embodiment of the present disclosure. The embodiment is applicable to rendering special effect pictures with a stereoscopic impression. The method may be executed by a special effect processing apparatus, which may be implemented in software and/or hardware and, optionally, by an electronic device; the electronic device may be a mobile terminal, a PC terminal, a server, or the like. The embodiment is particularly applicable to a mobile terminal provided with an augmented reality (AR) shooting device.
As shown in fig. 1, the method of the embodiment may specifically include:
and S110, acquiring an augmented reality picture shot by the shooting equipment, and generating a target special effect picture based on the augmented reality picture.
Here, the shooting device can be understood as a device for shooting the augmented reality picture. Optionally, the shooting device may be a device that has a shooting function and is installed with an AR component. The shooting device can also be used to display the special effect picture obtained after special effect processing is performed on the augmented reality picture. In the embodiments of the present disclosure, the terminal may be preset according to scene requirements, and is not specifically limited here. Illustratively, the shooting device may be an AR camera configured in the terminal, an augmented reality (AR) device such as AR glasses, or the like. It can be understood that the specific content of the augmented reality picture can be determined according to the actual shooting scene, and is not limited in detail here. Optionally, the augmented reality picture may be a single frame in an augmented reality video, or may be an augmented reality image.
The target special effect picture can be understood as a special effect picture obtained by carrying out special effect processing on the augmented reality picture. In the embodiment of the present disclosure, the special effect of the target special effect picture may be preset according to a scene requirement, and is not specifically limited herein. Illustratively, the contour of the target special effect picture may be irregular. Optionally, the target special effect picture may have multiple presentation forms, for example, the target special effect picture may be a special effect picture obtained by performing water surface special effect rendering, snow mountain special effect rendering, flame special effect rendering, quicksand special effect rendering, or the like on the augmented reality picture. Illustratively, the target special effect picture may be a water surface special effect picture. Further, the edges of the water surface special effect picture can be nonlinear to simulate a scene of water surface fluctuation.
In an embodiment of the present disclosure, the augmented reality picture acquired by the shooting device may include depth information of the augmented reality picture, that is, a distance between each pixel point in the augmented reality picture and the shooting device. Optionally, generating a target special effect picture based on the augmented reality picture includes: and acquiring a depth estimation image of the augmented reality image, and generating a target special effect image based on the depth estimation image. The depth data of each pixel point in the depth estimation map can be used for indicating the distance from the scene information corresponding to the pixel point to the shooting device.
Illustratively, generating a target special effect picture based on the depth estimation map includes: determining a special effect rendering area corresponding to the augmented reality picture; for each pixel point to be rendered in the special effect rendering area, determining a pixel value of the pixel point to be rendered based on depth information corresponding to the pixel point to be rendered in the depth estimation map; and rendering based on the pixel value of each pixel point to be rendered in the special effect rendering area to obtain a target special effect picture.
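The per-pixel flow described above (mask out the special effect rendering area, look up each pixel's depth, derive its pixel value, then render) can be sketched as follows. This is a minimal illustration only: the normalized depth map, the darken-with-distance shading rule, and all function and parameter names are assumptions, not taken from the disclosure.

```python
def render_special_effect(depth_map, region_mask, base_color=(0.2, 0.5, 0.8)):
    """Render a target special effect picture: each pixel inside the special
    effect rendering region gets a color derived from its depth value."""
    height, width = len(depth_map), len(depth_map[0])
    picture = [[None] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            if not region_mask[y][x]:
                continue  # pixel lies outside the special effect rendering region
            depth = depth_map[y][x]    # assumed normalized: 0.0 = near, 1.0 = far
            shade = 1.0 - 0.6 * depth  # farther pixels are rendered darker
            picture[y][x] = tuple(round(c * shade, 3) for c in base_color)
    return picture
```

For example, a near pixel keeps the full base color while a masked-out pixel stays untouched.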
Optionally, determining a special effect rendering region corresponding to the augmented reality screen includes: determining a special effect rendering region corresponding to the augmented reality picture based on a preset region generation algorithm; or responding to a special effect application triggering operation aiming at the augmented reality picture, and acquiring a special effect rendering area based on the triggered special effect. The special effect application trigger operation may be understood as a trigger operation that is applied to the augmented reality screen to start a default special effect, or a special effect selection operation for at least one special effect, or the like.
And S120, displaying the first special effect region of the target special effect picture in the first picture region under the condition that the shooting equipment displays the first picture region of the augmented reality picture.
It can be understood that when the shooting angle of the shooting device changes, the screen region of the augmented reality screen displayed by the shooting device may change accordingly, in other words, for the change of the displayed screen region of the augmented reality screen, a corresponding special effect region may be displayed in a different screen region of the target special effect screen. In the embodiment of the present disclosure, the presentation effects of different screen regions of the target special effect screen may be the same or different.
The first picture area may be understood as a picture area corresponding to a current shooting angle of the shooting device in the augmented reality picture. The first special effect region may be understood as a special effect region displayed in the target special effect picture corresponding to the first picture region.
Specifically, when the shooting device displays a first picture area of the augmented reality picture, a special effect area corresponding to the first picture area in the target special effect picture is acquired as a first special effect area, and the first special effect area is displayed in the first picture area.
And S130, under the condition that the shooting angle of the shooting device is detected to be changed, displaying a second picture area of the augmented reality picture, and displaying a second special effect area of the target special effect picture in the second picture area.
Wherein, the shooting angle can be understood as the angle of the shooting device when shooting the augmented reality picture. Optionally, the shooting angle may be determined based on information such as a shooting position and/or a shooting orientation when the shooting device acquires the augmented reality screen.
The second picture area may be understood as a picture area of an augmented reality picture photographed by the terminal after a photographing angle of the photographing apparatus is changed. The second special effect region may be understood as a special effect region displayed in the target special effect screen corresponding to the second screen region. Optionally, the first screen area and the second screen area may be different, and the first special effect area and the second special effect area may be different.
Specifically, when it is detected that a shooting angle of the shooting device changes, a second picture area of the augmented reality picture shot by the shooting device is acquired, the second picture area is displayed, and a second special effect area of the target special effect picture is displayed in the second picture area.
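As an illustrative sketch of how the displayed special effect area can track the shooting angle in S120 and S130, one simple model treats the target special effect picture as wrapping horizontally around the shooting device and selects a pixel window from the camera yaw. The wrap-around assumption and the names below are hypothetical; the disclosure does not prescribe a particular mapping.

```python
def visible_region(full_width, view_width, yaw_deg):
    """Map the shooting angle (yaw, in degrees) to the horizontal pixel span of
    the target special effect picture to display, assuming the full picture
    wraps 360 degrees around the shooting device."""
    px_per_deg = full_width / 360.0          # picture pixels per degree of yaw
    center = (yaw_deg % 360.0) * px_per_deg  # pixel column the camera faces
    left = int(center - view_width / 2) % full_width
    return left, (left + view_width) % full_width
```

When the yaw changes, the returned span changes accordingly, so a different special effect area is shown in the new picture area.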
In order to enrich the display effect of the special effect, a preset special effect object can be additionally rendered in the target special effect picture. Optionally, the special effect processing method further includes: rendering a preset special effect object into the target special effect picture. The special effect object can be understood as an object rendered in the target special effect picture. In the embodiments of the present disclosure, information such as the specific form and display mode of the special effect object may be preset according to requirements, and is not specifically limited here. Optionally, the special effect object may be a preset special effect prop. Different target special effect pictures may use the same or different special effect objects. For example, taking the target special effect picture as a water surface, the special effect object may be a floating object on the water surface and/or an aquatic creature, for example, a ship, a fish, water plants, and the like.
Optionally, the rendering a preset special effect object to the target special effect picture includes: and determining target display information of a preset special effect object in the target special effect picture, and rendering the special effect object to the target special effect picture according to the target display information. The preset special effect object is rendered into the target special effect picture through the target display information, so that the rendering effect of the target special effect picture is richer.
The target display information may be understood as information indicating a manner in which the special effect object is displayed on the target special effect screen. In the embodiment of the present disclosure, the target display information may be preset according to a scene requirement, and is not specifically limited herein. Optionally, the target display information may include, but is not limited to, at least one of a display position, a motion state, a display color, and a display depth.
The motion state can be understood as indicating whether the special effect object is moving or not and in which way the special effect object is moving if it is moving. In the embodiment of the present disclosure, for example, the motion state of the special effect object in the target special effect screen may be a state simulating a pendulum motion or the like.
Optionally, the motion state includes dynamic or static. Illustratively, the special effect object may be displayed in a static or dynamic manner in the target special effect screen.
Optionally, when the special effect object moves, the motion state may further include, but is not limited to, at least one of motion information such as a motion angle, a motion amplitude, a motion time, a motion trajectory, and a motion speed of the special effect object in the target special effect picture.
The display color information may be understood as color information of the special effect object displayed in the target special effect picture. Optionally, the display color information is determined based on a display position of the special effect object in the target special effect picture. In particular, the determination may be based on color values and/or depth data of pixel points of the special effect object at a display position in the target special effect picture.
The display depth information may be understood as depth information of the special effect object displayed in the target special effect picture. It is to be understood that the display depth information of different object regions may be different for the special effect object. In this embodiment of the present disclosure, optionally, the special effect object is rendered based on display depth information of each object region of a preset special effect object in the target special effect screen. In the embodiment of the present disclosure, a special effect may be presented based on a display depth of the special effect object and the target special effect screen. By adopting the technical scheme, the special effect that the target special effect picture is shielded by the partial area of the special effect object or the partial area of the special effect object is shielded by the target special effect picture can be realized.
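The mutual occlusion effect described above reduces, per pixel, to a depth comparison between the target special effect picture and the special effect object. The following sketch assumes smaller depth values mean closer to the shooting device; it is an illustration, not the disclosed implementation.

```python
def composite_pixel(effect_depth, effect_color, object_depth, object_color):
    """Per-pixel depth test between the target special effect picture and a
    special effect object: whichever is nearer to the shooting device is shown,
    so part of the object can occlude the effect and vice versa."""
    return object_color if object_depth < effect_depth else effect_color
```

Applying this test over each object region yields the partial-occlusion special effect described in the text.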
According to the technical solutions of the embodiments of the present disclosure, an augmented reality picture shot by the shooting device is acquired, and a target special effect picture is generated based on the augmented reality picture; since the target special effect picture is generated from the augmented reality picture, it has a sense of space and better fits the real scene. When the shooting device displays the first picture area of the augmented reality picture, the first special effect area of the target special effect picture is displayed in the first picture area, associating the shooting angle with the displayed area of the target special effect picture to ensure the visual presentation effect. When a change in the shooting angle of the shooting device is detected, the second picture area of the augmented reality picture is displayed, and the second special effect area of the target special effect picture is displayed in the second picture area. Because the target special effect picture changes with the shooting angle of the shooting device, the change of the shooting angle simulates the change of the user's field of view, and the different picture areas of the target special effect picture simulate the changes of the actual scene, so that the rendering of the target special effect picture is more real and vivid, improving user experience.
Fig. 2 is a schematic flowchart of another special effect processing method provided by an embodiment of the present disclosure. This embodiment refines the step of generating the target special effect picture based on the augmented reality picture in the above embodiment; for details not covered here, reference may be made to the description of the above embodiment. Technical features that are the same as or similar to those of the previous embodiment are not repeated here.
As shown in fig. 2, the method of the embodiment may specifically include:
and S210, acquiring an augmented reality picture shot by the shooting equipment.
S220, generating an initial special effect picture based on the augmented reality picture, and carrying out optical processing on the initial special effect picture to obtain a target special effect picture.
Wherein the optical treatment comprises at least one of a scattering treatment, a reflection treatment, a refraction treatment, a caustic treatment, and a highlight treatment.
Wherein, the initial special effect picture can be understood as a special effect picture preliminarily generated based on the augmented reality picture. The optical processing may be understood as a process of processing each pixel point in the initial special effect picture based on optical information. Alternatively, the optical treatment may be at least one of a scattering treatment, a reflection treatment, a refraction treatment, a caustic treatment, and a highlight treatment.
The scattering processing may be understood as a processing mode of processing at least part of the pixel points in the initial special effect picture based on scattering of light, so that the initial special effect picture has an effect of scattering light.
In the embodiments of the present disclosure, by performing scattering processing on the initial special effect picture, pixels at different distances from the terminal can have different brightness, making the optical effect of the initial special effect picture more real. For example, after scattering processing is performed on an initial water surface special effect picture, in the resulting target special effect picture the water surface far from the shooting device appears darker while the water surface near the shooting device appears brighter, which improves the realism of the water surface special effect rendering.
The reflection processing may be understood as a processing manner of processing at least part of the pixel points in the initial special effect picture based on reflection of light, so that the initial special effect picture has an effect of reflecting light. Optionally, the reflection processing may process the optical information of the pixel points on the water surface in the initial water surface special effect picture. The refraction processing may be understood as a processing manner of processing at least part of the pixel points in the initial special effect picture based on refraction of light, so that the initial special effect picture has an effect of refracting light. Optionally, the refraction processing may process the optical information of the pixel points below the water surface in the initial water surface special effect picture. In the embodiments of the present disclosure, performing reflection processing and refraction processing on the initial special effect picture gives the pixel points on and below the water surface different optical effects, improving the realism of the water surface special effect rendering.
In the embodiment of the present disclosure, the initial special effect picture may be subjected to the caustic processing by sampling a preset caustic map. It can be understood that the sampling manner of the caustic map may also differ according to the shooting angle of the shooting device. In the processing of the water surface special effect picture, the caustic processing enables the water surface special effect picture to simulate the shimmering ripples of a real water surface under light.
The highlight processing may be understood as a processing procedure that makes a partial region of the initial special effect picture exhibit the optical effect of fully reflecting the light source. Optionally, the initial special effect picture is processed based on a preset highlight algorithm. Illustratively, the preset highlight algorithm may include, but is not limited to, the Blinn-Phong lighting algorithm.
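As a non-limiting sketch of how the named Blinn-Phong algorithm computes a specular term (the vector arguments and the shininess value below are illustrative assumptions; the patent gives no concrete formula):

```python
import math

def blinn_phong_highlight(normal, light_dir, view_dir, shininess=64.0):
    """Blinn-Phong specular intensity for one pixel.

    All vectors are unit-length 3-tuples; a higher shininess gives a
    tighter highlight. The value of 64.0 is an assumed default."""
    # Half-vector between the light and view directions, then normalized
    half = tuple(l + v for l, v in zip(light_dir, view_dir))
    length = math.sqrt(sum(c * c for c in half)) or 1.0
    half = tuple(c / length for c in half)
    # Specular intensity = max(0, N . H) ^ shininess
    n_dot_h = max(0.0, sum(n * h for n, h in zip(normal, half)))
    return n_dot_h ** shininess
```

For a pixel whose normal faces both the light and the viewer the term approaches 1, producing the bright spot; it falls off rapidly as the half-vector turns away from the normal.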
Illustratively, the optical processing may include reflection processing and refraction processing. Optionally, the optical processing is performed on the initial special effect picture, and includes: determining a refraction light direction according to preset incident light and the target special effect picture, and sampling the augmented reality picture according to the refraction light direction to obtain a refraction color value; determining a reflected light direction according to preset incident light and the target special effect picture, and sampling a preset environment map according to the reflected light direction to obtain a reflected color value; and determining the reflectivity corresponding to the reflection color value and the refractive index corresponding to the refraction color value, and processing the initial special effect picture according to the refraction color value, the refractive index, the reflection color value and the reflectivity.
Wherein the preset incident light is determined based on the illumination direction of a preset light source. The refracted light direction may be understood as the light ray direction determined from the preset incident light and the refractive index of the target special effect picture. The refraction color value may be understood as the color value obtained by sampling the augmented reality picture according to the refracted light direction. The refractive index may be understood as the percentage of the radiant energy of the refracted ray relative to the radiant energy of the preset incident ray. It is understood that the refractive index may be preset according to scene requirements, and is not specifically limited herein. Specifically, the larger the refractive index, the brighter the corresponding picture pixel point may be.
The reflected light direction may be understood as the light direction determined according to the preset incident light and the reflectivity of the water surface in the target special effect picture. The preset environment map may be understood as a map representing the color values corresponding to the reflected light direction. The reflection color value may be understood as the color value obtained by sampling the preset environment map according to the reflected light direction. The reflectivity may be understood as the percentage of the radiant energy of the reflected ray relative to the radiant energy of the preset incident ray. It is understood that the reflectivity may be preset according to scene requirements, and is not specifically limited herein. Specifically, the larger the reflectivity, the brighter the corresponding picture pixel point may be. Taking the rendering of a water surface special effect picture as an example, under normal conditions, the pixel points below the water surface are darker, while the pixel points on the water surface are brighter.
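The combination step described above can be sketched as follows. The patent does not give the exact formula, so the per-channel weighted sum and the function name below are assumptions:

```python
def combine_optical_samples(refract_color, refractive_index,
                            reflect_color, reflectivity):
    """Weight each sampled color by its energy percentage and sum per channel.

    refract_color / reflect_color are RGB tuples obtained by sampling the
    augmented reality picture and the preset environment map respectively;
    refractive_index and reflectivity are the preset energy percentages."""
    return tuple(rc * refractive_index + fc * reflectivity
                 for rc, fc in zip(refract_color, reflect_color))
```

With refractive_index + reflectivity kept near 1, a larger reflectivity brightens the pixel toward the environment-map sample, matching the description that surface pixels appear brighter.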
And S230, under the condition that the shooting equipment displays the first picture area of the augmented reality picture, displaying the first special effect area of the target special effect picture in the first picture area.
And S240, under the condition that the shooting angle of the shooting device is detected to be changed, displaying a second picture area of the augmented reality picture, and displaying a second special effect area of the target special effect picture in the second picture area.
According to the technical scheme of the embodiment of the disclosure, an initial special effect picture is generated based on the augmented reality picture, and the initial special effect picture is optically processed to obtain a target special effect picture. The rendered water surface special effect simulates the optical effects of scattering, reflection, refraction, caustic and highlight of a real water surface, and the vividness and the authenticity of a target special effect picture are improved.
Fig. 3 is a schematic flow chart of another special effect processing method provided in the embodiment of the present disclosure, and this embodiment refines the optical processing on the initial special effect picture in the above embodiment.
As shown in fig. 3, the method includes:
and S310, acquiring an augmented reality picture shot by the shooting equipment.
And S320, generating an initial special effect picture based on the augmented reality picture, and performing caustic treatment on the initial special effect picture based on a preset caustic light map.
Wherein, the caustic light map may be understood as a map for characterizing caustic characteristics. The caustic light map may be set according to actual requirements, and is not specifically limited herein, as long as it can represent the caustic characteristics. For example, a caustic light map may be as shown in fig. 4.
In the embodiment of the present disclosure, the caustic color value of each picture pixel point in the initial special effect picture may be determined based on a caustic light map, so as to perform caustic processing on the initial special effect picture.
Optionally, the caustic treatment of the initial special-effect picture based on the preset caustic mapping includes: determining a caustic sampling coordinate corresponding to each picture pixel point to be caustic processed in the initial special-effect picture; sampling a preset caustic light map based on the caustic sampling coordinates, and determining a caustic color value corresponding to the picture pixel point based on a sampling result; and carrying out caustic treatment on the picture pixel points based on caustic color values corresponding to the picture pixel points.
The picture pixel points may be understood as the pixel points in the initial special effect picture that are to be subjected to the caustic processing. The caustic sampling coordinate may be understood as the coordinate at which the caustic map is sampled to obtain the caustic color value corresponding to the picture pixel point. The sampling process may be understood as a process of sampling the preset caustic light map based on the caustic sampling coordinate to obtain the caustic color value corresponding to the picture pixel point. The caustic color value may be understood as the color value of the pixel point in the preset caustic light map corresponding to the picture pixel point, obtained through the sampling process.
Optionally, determining a caustic sampling coordinate corresponding to the picture pixel point includes: determining a normal sampling coordinate corresponding to a preset normal texture mapping according to the world coordinate of the picture pixel point and the illumination direction coordinate of a preset light source; and determining a caustic sampling coordinate corresponding to the picture pixel point according to the normal sampling coordinate.
The world coordinate may be understood as a coordinate in a coordinate system composed of three mutually perpendicular, intersecting axes. The preset light source may be understood as the object providing the incident light. In the embodiment of the present disclosure, the preset light source may be preset according to requirements, and is not specifically limited herein. The illumination direction coordinate may be understood as the coordinate representation of the illumination direction of the incident light. The normal texture map may be understood as a map characterizing the normal texture of the water surface ripples (for a specific example, refer to fig. 5). In the embodiment of the present disclosure, the caustic sampling coordinate corresponding to each picture pixel point may be determined based on the normal texture map. The normal sampling coordinate may be understood as the coordinate at which the normal texture map is sampled.
Specifically, the world coordinate of the picture pixel point and the illumination direction coordinate of the preset light source are determined, and the normal sampling coordinate at which the preset normal texture map can be sampled is obtained through calculation. Specifically, a first ratio of the vertical axis component (i.e., the y component) of the world coordinate of the picture pixel point to the vertical axis component (i.e., the y component) of the illumination direction coordinate is calculated; the sum of the product of the first ratio and the horizontal axis component (i.e., the x component) of the illumination direction coordinate and the horizontal axis component (i.e., the x component) of the world coordinate of the picture pixel point is taken as the abscissa of the normal sampling coordinate on the normal texture map, and the sum of the product of the first ratio and the depth axis component (i.e., the z component) of the illumination direction coordinate and the depth axis component (i.e., the z component) of the world coordinate of the picture pixel point is taken as the ordinate of the normal sampling coordinate on the normal texture map.
Further, a normal sampling point in the normal texture map is determined based on the normal sampling coordinate, and a caustic sampling coordinate corresponding to the picture pixel point is determined based on the coordinate of the normal sampling point. Specifically, the abscissa and ordinate components (i.e., the x-component and the y-component) of the normal sampling point may be taken as the caustic sampling coordinates corresponding to the picture pixel point. And then, sampling a preset caustic light map based on caustic sampling coordinates to obtain a caustic light map value corresponding to each picture pixel point.
Optionally, determining a caustic color value corresponding to the picture pixel point based on the sampling result includes: and determining a caustic color value corresponding to the picture pixel point based on the caustic map value corresponding to the picture pixel point. Specifically, a difference value between a longitudinal axis component (i.e., a y component) of a world coordinate of the picture pixel point and a first preset value is calculated, a second ratio of the difference value to a second preset value is calculated, the difference value between 1 and the second ratio is used as a first factor, a caustic map value corresponding to the picture pixel point is used as a second factor, and the first factor and the second factor are multiplied to obtain a caustic color value corresponding to the picture pixel point.
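The coordinate and attenuation computations described in the paragraphs above can be sketched as follows. The function names are invented for illustration, and `base` and `scale` stand in for the patent's unnamed first and second preset values:

```python
def caustic_sample_uv(world, light_dir):
    """Compute the normal-texture sampling coordinate for one pixel.

    Follows the described construction: ratio = y_world / y_light,
    u = ratio * x_light + x_world, v = ratio * z_light + z_world."""
    wx, wy, wz = world
    lx, ly, lz = light_dir
    ratio = wy / ly
    return ratio * lx + wx, ratio * lz + wz

def caustic_color(world_y, caustic_map_value, base=0.0, scale=1.0):
    """Attenuate the sampled caustic map value with depth:
    (1 - (y_world - base) / scale) * map_value."""
    return (1.0 - (world_y - base) / scale) * caustic_map_value
```

The u, v pair returned by `caustic_sample_uv` is used to fetch a normal sampling point, whose x and y components then index the preset caustic light map.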
On the basis, the initial special effect picture can be processed by combining one or more modes of scattering processing, reflection processing, refraction processing and highlight processing to obtain a target special effect picture.
And S330, displaying the first special effect area of the target special effect picture in the first picture area under the condition that the shooting equipment displays the first picture area of the augmented reality picture.
And S340, under the condition that the shooting angle of the shooting device is detected to be changed, displaying a second picture area of the augmented reality picture, and displaying the second special effect area of the target special effect picture in the second picture area.
According to the technical solution of the embodiment of the disclosure, the initial special effect picture is subjected to the caustic processing based on the preset caustic light map, so that the target special effect picture can exhibit the dynamic effect of shimmering ripples, which improves the vividness and realism of the special effect rendering.
Fig. 6 is a schematic flow chart of another special effect processing method provided in the embodiment of the present disclosure, and this embodiment refines the optical processing on the initial special effect picture in the above embodiment.
As shown in fig. 6, the method includes:
and S410, acquiring the augmented reality picture shot by the shooting equipment.
And S420, generating an initial special effect picture based on the augmented reality picture.
S430, respectively determining weighted values of a preset first color value and a preset second color value according to the components of the sight line vector corresponding to the augmented reality picture in the preset direction.
In this embodiment, the sight line vector may be understood as the vector from the shooting device to a picture pixel point to be processed. The preset direction may be the vertical axis direction of the sight line vector.
In the embodiment of the present disclosure, performing scattering processing on the initial special effect picture may enable the processed initial special effect picture to exhibit a color gradient effect. Illustratively, two different color values may be preset, and the color value of the picture pixel farthest from the terminal and the color value of the picture pixel closest to the terminal may be respectively represented. Specifically, a preset first color value is adopted to represent a color value of a picture pixel point farthest from the terminal. And the preset second color value represents the color value of the picture pixel point closest to the terminal. Optionally, the performing scattering processing on the initial special effect picture includes: and processing the initial special effect picture according to a preset first color value, a preset second color value and depth data of each pixel point corresponding to the initial special effect picture. Wherein the depth data for each pixel point may be determined based on the value of the component of the line of sight vector for that pixel point in the vertical axis (i.e., the y-axis).
Specifically, for each pixel point, a first weight of the preset first color and a second weight of the preset second color can be respectively determined according to a value of a y component of the normalized sight line vector, and then a scattered light value is determined based on the preset first color, the first weight, the preset second color and the second weight, and a color value of the pixel point is determined based on the scattered light value.
Further, a value of a y component of the normalized sight line vector may be used as a first weight of the preset first color, and a difference value between 1 and the first weight may be used as a second weight of the preset second color.
It can be understood that, in the embodiment of the present disclosure, the preset first color value and the preset second color value may be preset according to a scene requirement, and are not specifically limited herein. Optionally, when the initial special-effect picture is a water surface picture, the first color value may be a deeper water surface color, and the second color value may be a lighter water surface color.
S440, determining a scattering value based on a preset first color value, a preset second color value, a weight value of the preset first color value and a weight value of the preset second color value, and acting the scattering value on the initial special effect picture to obtain a target special effect picture.
Specifically, a first product obtained by multiplying a preset first color by a first weight and a second product obtained by multiplying a preset second color by a second weight are summed to obtain a scattered light value. And further applying the scattering value to the initial special effect picture.
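The weighted sum described above can be sketched as follows (the clamping of the y component to [0, 1] is an added assumption to keep the weights valid):

```python
def scatter_value(view_y, first_color, second_color):
    """Blend the two preset colors by the normalized sight line vector's
    y component: w1 = view_y, w2 = 1 - w1, result = w1*c1 + w2*c2."""
    w1 = max(0.0, min(1.0, view_y))   # assumed clamp; the patent only normalizes
    w2 = 1.0 - w1
    return tuple(w1 * c1 + w2 * c2
                 for c1, c2 in zip(first_color, second_color))
```

A small y component (a grazing view toward distant water) weights the result toward the first, darker color; looking straight down weights it toward the second, lighter color, producing the described color gradient with distance.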
On the basis, the initial special effect picture can be processed by combining one or more modes of reflection processing, refraction processing, caustic treatment and highlight processing to obtain a target special effect picture.
And S450, displaying the first special effect area of the target special effect picture in the first picture area under the condition that the shooting equipment displays the first picture area of the augmented reality picture.
And S460, under the condition that the shooting angle of the shooting device is detected to be changed, displaying a second picture area of the augmented reality picture, and displaying a second special effect area of the target special effect picture in the second picture area.
According to the technical scheme of the embodiment of the disclosure, weight values of a preset first color value and a preset second color value are respectively determined according to components of a sight line vector corresponding to the augmented reality picture in a preset direction; and determining a scattering value based on a preset first color value, a preset second color value, a weight value of the preset first color value and a weight value of the preset second color value, and applying the scattering value to the initial special-effect picture. The target special effect picture can have the visual effect of distance difference, and the reality of special effect rendering is improved.
Fig. 7 is a schematic flowchart of an alternative example of a special effect processing method according to an embodiment of the present disclosure. Taking the processing of the water surface special effect picture as an example, as shown in fig. 7, the overall flow of the special effect processing method may be:
1. Acquire an augmented reality picture. An augmented reality picture containing depth information is shot by a terminal provided with an AR camera, a depth estimation map of the scene is acquired based on the AR camera component, and the depth estimation map is written into the depth buffer of the AR camera.
2. Render the special effect object. Special effect objects floating on the water surface, such as a dragon boat, are rendered in the water surface special effect picture.
3. Calculate the display color of the special effect object and write it into the color buffer, and calculate the display depth of the special effect object and write it into the depth buffer.
4. Render the water surface. The water surface rendering may be the combined result of scattering, reflection, refraction, caustics and highlights. Wherein:
1) Scattering: and taking the value of the y component of the normalized sight line vector as a first weight of a preset first color, taking the difference value of 1 and the first weight as a second weight of a preset second color, and summing a first product obtained by multiplying the preset first color by the first weight and a second product obtained by multiplying the preset second color by the second weight to obtain a scattered light value.
2) Refraction and reflection: determine the refracted light direction, and sample the augmented reality picture according to the refracted light direction to obtain a refraction color value; determine the reflected light direction, and sample the preset environment map according to the reflected light direction to obtain a reflection color value. The refraction color value and the reflection color value are mixed according to a mixing coefficient calculated by Schlick's approximation of the Fresnel equations.
3) Caustics: take the product of the ratio of the vertical axis component (i.e., the y component) of the world coordinate of the picture pixel point to the vertical axis component (i.e., the y component) of the illumination direction coordinate and the horizontal axis component (i.e., the x component) of the illumination direction coordinate as a first parameter, take the horizontal axis component (i.e., the x component) of the world coordinate of the picture pixel point as a second parameter, and take the sum of the first parameter and the second parameter as the abscissa of the normal sampling coordinate on the normal texture map; take the product of the same ratio and the depth axis component (i.e., the z component) of the illumination direction coordinate as a third parameter, take the depth axis component (i.e., the z component) of the world coordinate of the picture pixel point as a fourth parameter, and take the sum of the third parameter and the fourth parameter as the ordinate of the normal sampling coordinate on the normal texture map. Further, the normal texture map is sampled based on the abscissa and the ordinate of the normal sampling coordinate to obtain a normal sampling point. The abscissa and ordinate components (i.e., the x component and the y component) of the normal sampling point are taken as the abscissa and ordinate components of the caustic sampling coordinate corresponding to the picture pixel point. Then, the preset caustic light map is sampled based on the caustic sampling coordinate to obtain the caustic light map value corresponding to each picture pixel point.
4) Highlight: highlight color values were determined using the Blinn-Phong method.
5. Write the water surface rendering result into the system buffer, so that the displayed picture area of the water surface special effect picture follows the shooting angle of the shooting device.
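The Schlick approximation of the Fresnel term used for mixing in step 2) of the flow above can be sketched as follows (a hedged illustration: the normal-incidence reflectance of about 0.02 for an air-water interface is an assumed value, as the patent does not specify it):

```python
def schlick_fresnel(cos_theta, f0=0.02):
    """Schlick's approximation of the Fresnel reflectance.

    cos_theta is the cosine of the angle between the view direction and
    the surface normal; f0 is the reflectance at normal incidence
    (roughly 0.02 for water -- an assumed value)."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def mix_refraction_reflection(refract_color, reflect_color, fresnel):
    """Blend the two sampled colors per channel by the Fresnel coefficient."""
    return tuple((1.0 - fresnel) * rr + fresnel * rl
                 for rr, rl in zip(refract_color, reflect_color))
```

At grazing angles (cos_theta near 0) the coefficient approaches 1 and the surface shows mostly the environment-map reflection, which matches the observation that the water surface appears brighter than the region below it.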
According to the above technical solution, an augmented reality picture is obtained by a shooting device provided with an AR component to render the water surface, various optical processes are carried out on the picture, and a three-dimensional (3D) water surface special effect is rendered from a conventional rectangular horizontal plane, which improves the realism of the special effect processing. In addition, different water surface special effect areas can be displayed in the corresponding picture areas as the shooting angle changes, which improves the vividness of the special effect processing effect and the user experience.
Fig. 8 is a schematic structural diagram of a special effect processing apparatus according to an embodiment of the disclosure, and as shown in fig. 8, the apparatus includes: an effect generation module 510, an effect display module 520, and a display change module 530.
The special effect generating module 510 is configured to obtain an augmented reality picture captured by a capturing device, and generate a target special effect picture based on the augmented reality picture; a special effect display module 520, configured to display a first special effect region of the target special effect picture in a first picture region of the augmented reality picture when the shooting device displays the first picture region; a display change module 530, configured to, when it is detected that a shooting angle of the shooting device changes, display a second screen region of the augmented reality screen, and display the second special effect region of the target special effect screen in the second screen region.
According to the technical solution of the embodiment of the disclosure, the augmented reality picture shot by the shooting device is obtained, and the target special effect picture is generated based on the augmented reality picture; a target special effect picture with a sense of space can thus be generated from the augmented reality picture, which fits the real scene better. When the shooting device displays the first picture area of the augmented reality picture, the first special effect area of the target special effect picture is displayed in the first picture area, and the shooting angle is associated with the displayed area of the target special effect picture to ensure the visual presentation effect. When a change in the shooting angle of the shooting device is detected, the second picture area of the augmented reality picture is displayed, and the second special effect area of the target special effect picture is displayed in the second picture area. The target special effect picture is generated from the augmented reality picture and changes as the shooting angle of the shooting device changes; the change of the shooting angle simulates the change of the user's field of view, and the change of the actual scene is in turn simulated by displaying different picture areas of the target special effect picture, so that the rendering effect of the target special effect picture is more realistic and vivid, and the user experience is improved.
Optionally, the special effect generating module 510 includes: and an optical processing submodule.
The optical processing submodule is configured to generate an initial special effect picture based on the augmented reality picture, and perform optical processing on the initial special effect picture to obtain a target special effect picture, where the optical processing includes at least one of scattering processing, reflection processing, refraction processing, caustic processing, and highlight processing.
Optionally, the optical processing sub-module includes: a caustic processing unit.
And the caustic processing unit is used for performing the caustic processing on the initial special effect picture based on a preset caustic light map.
Optionally, the caustic treatment unit includes: the device comprises a caustic sampling coordinate determination subunit, a caustic color value acquisition subunit and a caustic processing subunit.
The caustic sampling coordinate determining subunit is configured to determine caustic sampling coordinates corresponding to each picture pixel point to be caustic processed in the initial special-effect picture;
the caustic color value acquisition subunit is used for sampling a preset caustic light map based on the caustic sampling coordinates and determining a caustic color value corresponding to the picture pixel point based on a sampling result;
and the caustic treatment subunit is used for carrying out caustic treatment on the picture pixel points based on caustic color values corresponding to the picture pixel points.
Optionally, the caustic sampling coordinate determination subunit is configured to:
according to the world coordinates of the picture pixel points and the illumination direction coordinates of a preset light source, normal sampling coordinates corresponding to a preset normal texture mapping are determined;
and determining a caustic sampling coordinate corresponding to the picture pixel point according to the normal sampling coordinate.
Optionally, the optical processing sub-module includes a scattering processing unit, configured to:
respectively determining weighted values of a preset first color value and a preset second color value according to the components of the sight line vector corresponding to the augmented reality picture in a preset direction;
determining a scattering value based on a preset first color value, a preset second color value, a weight value of the preset first color value and a weight value of the preset second color value, and applying the scattering value to the initial special-effect picture.
Optionally, the optical processing sub-module includes a reflection processing unit and a refraction processing unit, and is configured to:
determining a refraction light direction according to preset incident light and the target special effect picture, and sampling the augmented reality picture according to the refraction light direction to obtain a refraction color value;
determining a reflected light direction according to preset incident light and the target special effect picture, and sampling a preset environment map according to the reflected light direction to obtain a reflected color value;
and determining the reflectivity corresponding to the reflection color value and the refractive index corresponding to the refraction color value, and processing the initial special effect picture according to the refraction color value, the refractive index, the reflection color value and the reflectivity.
Optionally, the special effect processing method further includes a special effect object rendering module, configured to:
and rendering a preset special effect object to the target special effect picture.
Optionally, the special effect object rendering module is configured to:
determining target display information of a preset special effect object in the target special effect picture, and rendering the special effect object into the target special effect picture according to the target display information, wherein the target display information comprises at least one of a display position, a motion state, a display color and a display depth.
The special effect processing device provided by the embodiment of the disclosure can execute the special effect processing method provided by any embodiment of the disclosure, and has corresponding functional modules and beneficial effects of the execution method.
It should be noted that, the units and modules included in the apparatus are merely divided according to functional logic, but are not limited to the above division as long as the corresponding functions can be implemented; in addition, specific names of the functional units are only used for distinguishing one functional unit from another, and are not used for limiting the protection scope of the embodiments of the present disclosure.
Fig. 9 is a schematic structural diagram of an electronic device for special effect processing according to an embodiment of the present disclosure. Referring now to fig. 9, a schematic diagram of an electronic device (e.g., the terminal device or the server in fig. 9) 500 suitable for implementing embodiments of the present disclosure is shown. The terminal device in the embodiments of the present disclosure may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a vehicle terminal (e.g., a car navigation terminal), and the like, and a stationary terminal such as a digital TV, a desktop computer, and the like. The electronic device shown in fig. 9 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 9, electronic device 500 may include a processing means (e.g., central processing unit, graphics processor, etc.) 501 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 502 or a program loaded from a storage means 508 into a Random Access Memory (RAM) 503. In the RAM 503, various programs and data necessary for the operation of the electronic apparatus 500 are also stored. The processing device 501, the ROM 502, and the RAM 503 are connected to each other through a bus 504. An input/output (I/O) interface 505 is also connected to bus 504.
Generally, the following devices may be connected to the I/O interface 505: input devices 506 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 507 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage devices 508 including, for example, magnetic tape, hard disk, etc.; and a communication device 509. The communication means 509 may allow the electronic device 500 to communicate with other devices wirelessly or by wire to exchange data. While fig. 9 illustrates an electronic device 500 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program carried on a non-transitory computer readable medium, the computer program containing program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 509, or installed from the storage means 508, or installed from the ROM 502. The computer program performs the above-described functions defined in the methods of the embodiments of the present disclosure when executed by the processing device 501.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
The electronic device provided by this embodiment of the present disclosure and the special effect processing method provided by the above embodiments belong to the same inventive concept; for technical details not described in detail in this embodiment, reference may be made to the above embodiments, and this embodiment has the same beneficial effects as the above embodiments.
The disclosed embodiments provide a computer storage medium on which a computer program is stored, which when executed by a processor implements the special effect processing method provided by the above embodiments.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with digital data communication in any form or medium (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquiring an augmented reality picture shot by shooting equipment, and generating a target special effect picture based on the augmented reality picture; displaying a first special effect area of the target special effect picture in a first picture area under the condition that the shooting equipment displays the first picture area of the augmented reality picture; and displaying a second picture area of the augmented reality picture and displaying a second special effect area of the target special effect picture in the second picture area under the condition that the change of the shooting angle of the shooting equipment is detected.
Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including but not limited to object-oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. Where the name of a unit does not in some cases constitute a limitation of the unit itself, for example, the first retrieving unit may also be described as a "unit for retrieving at least two internet protocol addresses".
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on a Chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
According to one or more embodiments of the present disclosure, [ example one ] there is provided a special effects processing method, including:
acquiring an augmented reality picture shot by shooting equipment, and generating a target special effect picture based on the augmented reality picture;
displaying a first special effect region of the target special effect picture in a first picture region under the condition that the shooting equipment displays the first picture region of the augmented reality picture;
and displaying a second picture area of the augmented reality picture and displaying a second special effect area of the target special effect picture in the second picture area under the condition that the change of the shooting angle of the shooting equipment is detected.
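By way of illustration only, the flow of example one can be sketched as follows. The mapping from shooting angle to picture region is a deliberately simplified stand-in (a single yaw threshold); the disclosure does not fix any particular mapping, and all names below are hypothetical.

```python
def visible_region(yaw_deg, region_split_deg=90.0):
    # Map the current shooting angle to a picture region of the AR frame.
    # A single yaw threshold is a simplified, hypothetical mapping.
    yaw = yaw_deg % 360.0
    return "first" if yaw < region_split_deg else "second"


def display(picture_regions, effect_regions, yaw_deg):
    # Show the picture region the device currently faces, with the matching
    # special effect region composited over it.
    region = visible_region(yaw_deg)
    return picture_regions[region] + "+" + effect_regions[region]
```

Rotating the device past the threshold switches the displayed picture region and the displayed special effect region together, which is the behavior the method requires.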
In accordance with one or more embodiments of the present disclosure, [ example two ] there is provided the method of example one, further comprising:
and generating an initial special effect picture based on the augmented reality picture, and carrying out optical processing on the initial special effect picture to obtain the target special effect picture, wherein the optical processing comprises at least one of scattering processing, reflection processing, refraction processing, caustic processing and highlight processing.
In accordance with one or more embodiments of the present disclosure, [ example three ] there is provided the method of example two, further comprising:
and carrying out caustic processing on the initial special effect picture based on a preset caustic light map.
According to one or more embodiments of the present disclosure, [ example four ] there is provided the method of example three, further comprising:
determining a caustic sampling coordinate corresponding to each picture pixel point to be caustic processed in the initial special effect picture;
sampling a preset caustic light map based on the caustic sampling coordinates, and determining a caustic color value corresponding to the picture pixel point based on a sampling result;
and carrying out caustic processing on the picture pixel points based on the caustic color values corresponding to the picture pixel points.
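As a minimal sketch of examples three and four, the caustic step can be read as a texture lookup followed by a blend. Nearest-neighbour sampling and an additive tint are assumptions, as the disclosure specifies neither the sampling filter nor the blend mode, and all identifiers below are illustrative.

```python
def sample_caustic_map(caustic_map, u, v):
    # Nearest-neighbour lookup into a preset caustic light map, given
    # caustic sampling coordinates (u, v) in [0, 1). Here the map is a
    # 2-D grid of intensities in [0, 1]; the filter is an assumption.
    h, w = len(caustic_map), len(caustic_map[0])
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return caustic_map[y][x]


def apply_caustics(pixel_rgb, intensity, tint=(1.0, 1.0, 0.9)):
    # Additively blend the caustic color value onto a picture pixel;
    # the additive blend and warm tint are illustrative choices.
    return tuple(min(1.0, c + intensity * t) for c, t in zip(pixel_rgb, tint))
```

A real implementation would run this per fragment in a shader; the structure (sample, then blend) is the same.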
In accordance with one or more embodiments of the present disclosure, [ example five ] there is provided the method of example four, further comprising:
according to the world coordinates of the picture pixel points and the illumination direction coordinates of a preset light source, determining normal sampling coordinates corresponding to a preset normal texture map;
and determining a caustic sampling coordinate corresponding to the picture pixel point according to the normal sampling coordinate.
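The coordinate derivation of example five can be sketched as projecting a pixel's world position onto a plane perpendicular to the preset light direction, then perturbing the result by the sampled normal. The particular basis construction and perturbation strength below are assumptions; the disclosure leaves the exact formula open.

```python
import math


def dot(a, b):
    return sum(x * y for x, y in zip(a, b))


def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])


def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)


def normal_sample_coords(world_pos, light_dir):
    # Build two axes spanning the plane perpendicular to the light and
    # project the pixel's world position onto them to get texture UVs.
    light = normalize(light_dir)
    helper = (0.0, 1.0, 0.0) if abs(light[1]) < 0.99 else (1.0, 0.0, 0.0)
    tangent = normalize(cross(light, helper))
    bitangent = cross(light, tangent)
    # Wrap into [0, 1) so the coordinates address a tiling texture.
    return dot(world_pos, tangent) % 1.0, dot(world_pos, bitangent) % 1.0


def caustic_sample_coords(base_uv, sampled_normal, strength=0.05):
    # Offset the caustic sampling coordinates by the sampled normal's x/y
    # components so surface detail distorts the caustic pattern.
    return ((base_uv[0] + strength * sampled_normal[0]) % 1.0,
            (base_uv[1] + strength * sampled_normal[1]) % 1.0)
```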
In accordance with one or more embodiments of the present disclosure, [ example six ] there is provided the method of example two, further comprising:
respectively determining weighted values of a preset first color value and a preset second color value according to the components of the sight line vector corresponding to the augmented reality picture in a preset direction;
determining a scattering value based on a preset first color value, a preset second color value, a weight value of the preset first color value and a weight value of the preset second color value, and applying the scattering value to the initial special-effect picture.
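Example six reduces to a two-color blend weighted by how closely the sight-line vector aligns with a preset direction. The choice of the vertical axis and the absolute-value weighting below are assumptions for illustration, not fixed by the disclosure.

```python
import math


def scatter_value(view_vec, color_a, color_b, axis=(0.0, 1.0, 0.0)):
    # Weight of the first preset color: the (absolute) component of the
    # normalized sight-line vector along the preset direction.
    norm = math.sqrt(sum(c * c for c in view_vec))
    w_a = abs(sum(v * a for v, a in zip(view_vec, axis))) / norm
    w_b = 1.0 - w_a  # weight of the second preset color
    return tuple(w_a * ca + w_b * cb for ca, cb in zip(color_a, color_b))
```

Looking straight along the preset direction yields the first preset color fully, and looking perpendicular to it yields the second; intermediate angles blend the two.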
In accordance with one or more embodiments of the present disclosure, [ example seven ] there is provided the method of example two, further comprising:
determining a refraction light direction according to preset incident light and the target special effect picture, and sampling the augmented reality picture according to the refraction light direction to obtain a refraction color value;
determining a reflected light direction according to preset incident light and the target special effect picture, and sampling a preset environment map according to the reflected light direction to obtain a reflected color value;
and determining the reflectivity corresponding to the reflection color value and the refractive index corresponding to the refraction color value, and processing the initial special effect picture according to the refraction color value, the refractive index, the reflection color value and the reflectivity.
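Example seven combines a refraction sample and a reflection sample using a reflectivity and a refractive index. One common way to obtain such weights (an assumption here, not stated in the disclosure) is Schlick's approximation of the Fresnel term, with the refraction weight taken as its complement:

```python
def reflect(incident, normal):
    # Mirror the incident direction about the unit surface normal.
    d = sum(i * n for i, n in zip(incident, normal))
    return tuple(i - 2.0 * d * n for i, n in zip(incident, normal))


def fresnel_schlick(cos_theta, f0=0.04):
    # Schlick's approximation of the Fresnel reflectivity; f0 = 0.04 is a
    # typical dielectric base reflectivity (an assumed constant).
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5


def shade(refraction_color, reflection_color, cos_theta):
    # Blend the two samples: reflectivity from Schlick, refraction weight
    # as its complement.
    r = fresnel_schlick(cos_theta)
    return tuple(r * rc + (1.0 - r) * tc
                 for rc, tc in zip(reflection_color, refraction_color))
```

In a shader, `reflection_color` would come from sampling the preset environment map along the reflected direction and `refraction_color` from sampling the augmented reality picture along the refracted direction.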
In accordance with one or more embodiments of the present disclosure, [ example eight ] there is provided the method of example one, further comprising:
and rendering a preset special effect object to the target special effect picture.
In accordance with one or more embodiments of the present disclosure, [ example nine ] there is provided the method of example eight, further comprising:
determining target display information of a preset special effect object in the target special effect picture, and rendering the special effect object into the target special effect picture according to the target display information, wherein the target display information comprises at least one of a display position, a motion state, a display color and a display depth.
According to one or more embodiments of the present disclosure, [ example ten ] there is provided a special effects processing apparatus including:
the special effect generation module is used for acquiring an augmented reality picture shot by shooting equipment and generating a target special effect picture based on the augmented reality picture;
the special effect display module is used for displaying a first special effect area of the target special effect picture in a first picture area under the condition that the shooting equipment displays the first picture area of the augmented reality picture;
and the display change module is used for displaying a second picture area of the augmented reality picture and displaying the second special effect area of the target special effect picture in the second picture area under the condition that the change of the shooting angle of the shooting device is detected.
According to one or more embodiments of the present disclosure, [ example eleven ] there is provided a special effects processing electronic device, comprising:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the special effects processing method of any one of examples one to nine.
According to one or more embodiments of the present disclosure, [ example twelve ] there is provided a storage medium containing computer-executable instructions which, when executed by a computer processor, are used to perform the special effects processing method of any one of examples one to nine.
The foregoing description is merely illustrative of the preferred embodiments of the present disclosure and of the technical principles employed. It will be appreciated by those skilled in the art that the scope of the disclosure is not limited to technical solutions formed by the particular combinations of the features described above, but also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the concept of the disclosure, for example, a technical solution formed by replacing the above features with (but not limited to) features having similar functions disclosed in the present disclosure.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (12)

1. A special effect processing method is characterized by comprising the following steps:
acquiring an augmented reality picture shot by shooting equipment, and generating a target special effect picture based on the augmented reality picture;
displaying a first special effect area of the target special effect picture in a first picture area under the condition that the shooting equipment displays the first picture area of the augmented reality picture;
and displaying a second picture area of the augmented reality picture and displaying a second special effect area of the target special effect picture in the second picture area under the condition that the change of the shooting angle of the shooting equipment is detected.
2. The special effect processing method according to claim 1, wherein the generating a target special effect picture based on the augmented reality picture includes:
and generating an initial special effect picture based on the augmented reality picture, and carrying out optical processing on the initial special effect picture to obtain a target special effect picture, wherein the optical processing comprises at least one of scattering processing, reflection processing, refraction processing, caustic processing and highlight processing.
3. The special effect processing method according to claim 2, wherein the performing of the caustic processing on the initial special effect picture includes:
and carrying out caustic processing on the initial special effect picture based on a preset caustic light map.
4. The special effect processing method according to claim 3, wherein the caustic processing on the initial special effect picture based on the preset caustic light map comprises:
determining a caustic sampling coordinate corresponding to each picture pixel point to be caustic processed in the initial special-effect picture;
sampling a preset caustic light map based on the caustic sampling coordinates, and determining a caustic color value corresponding to the picture pixel point based on a sampling result;
and carrying out caustic processing on the picture pixel points based on the caustic color values corresponding to the picture pixel points.
5. The special effects processing method of claim 4, wherein the determining the caustic sampling coordinates corresponding to the picture pixel points comprises:
according to the world coordinates of the picture pixel points and the illumination direction coordinates of a preset light source, determining normal sampling coordinates corresponding to a preset normal texture map;
and determining a caustic sampling coordinate corresponding to the picture pixel point according to the normal sampling coordinate.
6. The special effect processing method according to claim 2, wherein the performing scattering processing on the initial special effect picture includes:
respectively determining weighted values of a preset first color value and a preset second color value according to the components of the sight line vector corresponding to the augmented reality picture in a preset direction;
determining a scattering value based on a preset first color value, a preset second color value, a weight value of the preset first color value and a weight value of the preset second color value, and applying the scattering value to the initial special-effect picture.
7. The special effects processing method according to claim 2, wherein the optical processing includes reflection processing and refraction processing; the optically processing the initial special effect picture comprises:
determining a refraction light direction according to preset incident light and the target special effect picture, and sampling the augmented reality picture according to the refraction light direction to obtain a refraction color value;
determining a reflected light direction according to preset incident light and the target special effect picture, and sampling a preset environment map according to the reflected light direction to obtain a reflected color value;
and determining the reflectivity corresponding to the reflection color value and the refractive index corresponding to the refraction color value, and processing the initial special effect picture according to the refraction color value, the refractive index, the reflection color value and the reflectivity.
8. The special effects processing method according to claim 1, further comprising:
and rendering a preset special effect object to the target special effect picture.
9. The special effect processing method according to claim 8, wherein the rendering a preset special effect object into the target special effect screen includes:
determining target display information of a preset special effect object in the target special effect picture, and rendering the special effect object into the target special effect picture according to the target display information, wherein the target display information comprises at least one of a display position, a motion state, a display color and a display depth.
10. A special effect processing apparatus, comprising:
the special effect generation module is used for acquiring an augmented reality picture shot by shooting equipment and generating a target special effect picture based on the augmented reality picture;
a special effect display module, configured to display a first special effect region of the target special effect picture in a first picture region of the augmented reality picture when the shooting device displays the first picture region;
and the display change module is used for displaying a second picture area of the augmented reality picture and displaying the second special effect area of the target special effect picture in the second picture area under the condition that the change of the shooting angle of the shooting device is detected.
11. An electronic device, characterized in that the electronic device comprises:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the special effect processing method of any one of claims 1-9.
12. A storage medium containing computer-executable instructions, which when executed by a computer processor, are for performing the special effects processing method of any of claims 1-9.
CN202211339065.XA 2022-10-28 2022-10-28 Special effect processing method and device, electronic equipment and storage medium Pending CN115695685A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202211339065.XA CN115695685A (en) 2022-10-28 2022-10-28 Special effect processing method and device, electronic equipment and storage medium
PCT/CN2023/125288 WO2024088141A1 (en) 2022-10-28 2023-10-18 Special-effect processing method and apparatus, electronic device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211339065.XA CN115695685A (en) 2022-10-28 2022-10-28 Special effect processing method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115695685A true CN115695685A (en) 2023-02-03

Family

ID=85046679

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211339065.XA Pending CN115695685A (en) 2022-10-28 2022-10-28 Special effect processing method and device, electronic equipment and storage medium

Country Status (2)

Country Link
CN (1) CN115695685A (en)
WO (1) WO2024088141A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024088141A1 (en) * 2022-10-28 2024-05-02 北京字跳网络技术有限公司 Special-effect processing method and apparatus, electronic device, and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111356000A (en) * 2018-08-17 2020-06-30 北京达佳互联信息技术有限公司 Video synthesis method, device, equipment and storage medium
CN112348969B (en) * 2020-11-06 2023-04-25 北京市商汤科技开发有限公司 Display method and device in augmented reality scene, electronic equipment and storage medium
CN112866562B (en) * 2020-12-31 2023-04-18 上海米哈游天命科技有限公司 Picture processing method and device, electronic equipment and storage medium
CN112954193B (en) * 2021-01-27 2023-02-10 维沃移动通信有限公司 Shooting method, shooting device, electronic equipment and medium
CN115695685A (en) * 2022-10-28 2023-02-03 北京字跳网络技术有限公司 Special effect processing method and device, electronic equipment and storage medium


Also Published As

Publication number Publication date
WO2024088141A1 (en) 2024-05-02

Similar Documents

Publication Publication Date Title
CN113038264B (en) Live video processing method, device, equipment and storage medium
CN114677386A (en) Special effect image processing method and device, electronic equipment and storage medium
WO2024104248A1 (en) Rendering method and apparatus for virtual panorama, and device and storage medium
CN115690382A (en) Training method of deep learning model, and method and device for generating panorama
CN115965727A (en) Image rendering method, device, equipment and medium
WO2024088141A1 (en) Special-effect processing method and apparatus, electronic device, and storage medium
CN116758208A (en) Global illumination rendering method and device, storage medium and electronic equipment
CN114331823A (en) Image processing method, image processing device, electronic equipment and storage medium
CN112802206A (en) Roaming view generation method, device, equipment and storage medium
CN111862342A (en) Texture processing method and device for augmented reality, electronic equipment and storage medium
CN115002442B (en) Image display method and device, electronic equipment and storage medium
CN115760553A (en) Special effect processing method, device, equipment and storage medium
CN115358959A (en) Generation method, device and equipment of special effect graph and storage medium
CN114428573A (en) Special effect image processing method and device, electronic equipment and storage medium
US20170186218A1 (en) Method for loading 360 degree images, a loading module and mobile terminal
CN115690284A (en) Rendering method, device and storage medium
CN112070903A (en) Virtual object display method and device, electronic equipment and computer storage medium
CN117197319B (en) Image generation method, device, electronic equipment and storage medium
CN113223110B (en) Picture rendering method, device, equipment and medium
US20240153159A1 (en) Method, apparatus, electronic device and storage medium for controlling based on extended reality
CN117152385A (en) Image processing method, device, electronic equipment and storage medium
CN116612227A (en) Page rendering method and device, storage medium and electronic equipment
CN114419238A (en) Special effect image processing method and device, electronic equipment and storage medium
CN118096951A (en) Special effect processing method, device, electronic equipment, storage medium and program product
CN117319790A (en) Shooting method, device, equipment and medium based on virtual reality space

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination