CN112767518A - Virtual animation special effect making method and device and electronic equipment - Google Patents


Info

Publication number
CN112767518A
CN112767518A (application CN202011531842.1A; granted as CN112767518B)
Authority
CN
China
Prior art keywords
point
color
value
animation
color value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011531842.1A
Other languages
Chinese (zh)
Other versions
CN112767518B (en)
Inventor
赵刚 (Zhao Gang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Tricolor Technology Co ltd
Original Assignee
Beijing Tricolor Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Tricolor Technology Co., Ltd.
Priority to CN202011531842.1A
Publication of CN112767518A
Application granted
Publication of CN112767518B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T15/00 3D [Three Dimensional] image rendering
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 Computing systems specially adapted for manufacturing

Abstract

The application provides a method, an apparatus, and electronic equipment for producing a virtual animation special effect, in the technical field of image processing. The method comprises: acquiring object information for each point in a designated area of an object image, the object information comprising at least one of object depth information, an edge value, and an original color value; calculating a pixel change value for each point over the animation time, based on the object information of each point, a preset animation time, and an animation period; and rendering, based on the pixel change value of each point, a virtual animation special effect for display in the designated area of the object image. The method enhances the detail display of at least one of the object depth information, the edge value, and the original color value, thereby improving the fineness of the object image's augmented reality display when the pixel change values are rendered into the virtual animation special effect.

Description

Virtual animation special effect making method and device and electronic equipment
Technical Field
The application relates to the technical field of image processing, in particular to a method and a device for making a special effect of a virtual animation and electronic equipment.
Background
Augmented Reality (AR) is a technology that fuses virtual information with the real world. It broadly applies technical means such as multimedia, three-dimensional modeling, real-time tracking and registration, intelligent interaction, and sensing: computer-generated virtual information such as text, images, three-dimensional models, music, and video is simulated and then applied to the real world, where the two kinds of information complement each other and thereby enhance the real world. In the prior art, before virtual information is fused with the real world, it is usually necessary to determine where in the real world to place it, for example, placing a computer-generated three-dimensional model at a certain position in the real world. This is the approach commonly adopted by existing augmented reality techniques, namely achieving augmentation by computing a correspondence between the virtual world and the real world. However, when the prior art displays a three-dimensional model of an object, the object's specific details are not depicted, so the fineness of the object's augmented reality display is low.
Disclosure of Invention
The embodiment of the application aims to provide a method, an apparatus, and electronic equipment for producing a virtual animation special effect, so as to solve the problem of low fineness when existing methods display an object with augmented reality.
The embodiment of the application provides a method for making a special effect of a virtual animation, which comprises the following steps:
acquiring object information of each point in a designated area on an object image, wherein the object information comprises at least one of object depth information, an edge value and an original color value;
calculating to obtain a pixel change value of each point in animation time based on the object information of each point, preset animation time and an animation period;
and rendering based on the pixel change value of each point to obtain a virtual animation special effect for displaying in the specified area of the object image.
In this implementation, the object depth information, the edge value, and the original color value are mapped into pixel change values. A virtual animation special effect displayed on the basis of these pixel change values shows the color of each point in the designated area of the object image changing as at least one of the object depth information, the edge value, and the original color value changes. This enhances the detail display of that information and, when the special effect is rendered from the pixel change values, improves the fineness of the object image's augmented reality display.
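The three steps above can be sketched as a minimal pipeline. This is an illustrative sketch only: the function names (`make_virtual_effect`, `pixel_change_value`), the dict-based point and info representation, and the placeholder depth-to-value mapping are assumptions for illustration, not part of the patent.

```python
def make_virtual_effect(points, object_info, animation_time, period):
    """points: iterable of (x, y) positions in the designated area;
    object_info: dict mapping each point to its depth/edge/color info.
    Returns one per-point value map per time step (step 2); a renderer
    would then draw these values in the designated area (step 3)."""
    frames = []
    for t in range(animation_time):
        frame = {}
        for p in points:
            # Step 2: map the point's object information to a pixel change value.
            frame[p] = pixel_change_value(object_info[p], t, period)
        frames.append(frame)
    return frames

def pixel_change_value(info, t, period):
    # Placeholder mapping; the patent's formulas (given below) would
    # compute a target color here from depth, edge, or original color.
    depth = info.get("depth", 0.0)
    return (depth + t / period) % 1.0
```

A caller would pass the points of the designated area and their extracted object information, then hand each frame to the rendering stage.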
Optionally, the object information is object depth information, the pixel change value includes a first target color value of each point at each moment within the animation time, and the calculating of the pixel change value of each point within the animation time based on the object information of each point, a preset animation time, and an animation period includes:
calculating, based on the object depth information of each point, the animation time, and the animation period, the first target color value of each point at each moment within the animation time.
In this implementation, the object depth information of each point in the designated area is mapped into a first target color value for each point at each moment within the animation time, so the depth information is embodied in that color value. When the virtual animation special effect is rendered from the pixel change values, the depth information of each point in the designated area is reflected through color change, which adds detail to the augmented reality display of the object image and improves its fineness.
Optionally, the object information is an edge value, the pixel change value includes a second target color value of each point at each time in the animation time, and the calculating based on the object information of each point, a preset animation time, and an animation cycle to obtain the pixel change value of each point in the animation time includes:
setting an initial gradient color value and a target gradient color value of each point;
acquiring a distance value from each point to a central point of the designated area;
and calculating to obtain a second target color value of each point in the animation time based on the edge value of each point, the initial gradient color value of each point, the target gradient color value of each point, the distance value of each point, the animation time and the animation period.
In this implementation, the edge value of each point in the designated area is mapped into a second target color value for each point at each moment within the animation time, so the edge information of the area is embodied in that color value. Rendering the virtual animation special effect from these per-point color change values produces an edge color gradient effect in the designated area, that is, an animation in which the color gradient on different edges of the object changes over time. Because the edge value of each point is reflected through color change, the augmented reality display of the object image gains detail and the fineness of the special effect improves.
Optionally, the object information is an original color value, the pixel change value includes a new color value of each point at each time in the animation time, and the calculating based on the object information of each point, a preset animation time, and an animation cycle to obtain the pixel change value of each point in the animation time includes:
acquiring the original color value of each point on the designated area;
when the gray scale of any one of the points is smaller than a first color mixing ratio, obtaining a mixed color value for that point as its current color value, based on a second color mixing ratio, a first preset color value, and a second preset color value, wherein the second color mixing ratio varies with the animation time and the animation period; when the gray scale of the point is greater than or equal to the first color mixing ratio, its current color value is the original color value;
and obtaining the new color value of each point based on the original color value, the current color value and the first color mixing proportion of each point.
In this implementation, the original color value of each point in the designated area is mapped into a new color value for each point at each moment within the animation time, so the original color information is embodied in the new color value. When the virtual animation special effect is rendered from the new color values, the original color of each point in the designated area is reflected through color change, which adds detail to the augmented reality display of the object image and improves the fineness of the special effect.
Optionally, the step of calculating, based on the object depth information of each point, the animation time, and the animation period, the first target color value of each point at each moment within the animation time includes:
calculating to obtain a first target color value of each point at each moment in the animation time through a first calculation formula based on the object information of each point, the animation time and the animation period;
the first calculation formula includes:
Color(R,G,B)1=Color(E,125,125)+Color(F,125,125)×Fun(D,T,P)
where Color(R,G,B)₁ denotes the first target color value, E and F are constants, D denotes the object depth information, T denotes the animation time, and P denotes the animation period; the exact form of Fun(D,T,P) is given by a formula image not reproduced here.
In this implementation, the animation period and the animation time are combined to map the object depth information of each point in the designated area into a first target color value for each point at each moment within the animation time, so the depth information is embodied in that color value. When the animation special effect is rendered from the first target color values, the depth information of each point in the designated area is reflected through color change, which improves the fineness of the virtual animation special effect in the augmented reality display of the object.
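A minimal numeric sketch of the first calculation formula follows. The patent leaves the form of Fun(D,T,P) to a formula image, so a periodic sine is assumed here purely for illustration; the constants `e` and `f` stand in for E and F and are arbitrary.

```python
import math

def fun(d, t, p):
    # Assumed periodic form of Fun(D,T,P); the patent defines it in a figure.
    return math.sin(2 * math.pi * t / p + d)

def clamp(v):
    # Keep each channel in the displayable 0..255 range.
    return max(0.0, min(255.0, v))

def first_target_color(depth, t, period, e=100.0, f=25.0):
    # Color(R,G,B)1 = Color(E,125,125) + Color(F,125,125) * Fun(D,T,P)
    s = fun(depth, t, period)
    return (clamp(e + f * s), clamp(125 + 125 * s), clamp(125 + 125 * s))
```

Each point's depth thus shifts the phase of a periodic color oscillation, so points at different depths animate with visibly different colors at the same moment.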
Optionally, the calculating based on the object information of each point, a preset animation time and an animation period to obtain a pixel change value of each point in the animation time includes:
based on the object information of each point, preset animation time and animation period, calculating a second target color value of each point at each moment in the animation time by adopting a second calculation formula;
the second calculation formula includes:
Color(R,G,B)₂ = Color(R₀,G₀,B₀) × Fun(E₁,T,P) + Color(R₁,G₁,B₁) × D₁
where Color(R,G,B)₂ denotes the second target color value, D₁ denotes the distance value, T denotes the animation time, P denotes the animation period, Color(R₀,G₀,B₀) denotes the initial gradient color value, Color(R₁,G₁,B₁) denotes the target gradient color value, and E₁ denotes the edge value; the exact form of Fun is given by a formula image not reproduced here.
In this implementation, the animation period, the animation time, the initial gradient color, and the target gradient color are combined to map the edge value of each point in the designated area into a second target color value for each point at each moment within the animation time, so the edge information is embodied in that color value. When the animation special effect is rendered from these edge gradient colors, the edge value of each point in the designated area is reflected through color change, which adds detail to the augmented reality display of the object image and improves the fineness of the special effect.
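The edge gradient mapping can be sketched as below. As with the first formula, the exact Fun is only given in a figure, so a sine is assumed; the initial and target gradient colors are free parameters per the patent, and the specific values used here are illustrative.

```python
import math

def fun(e1, t, p):
    # Assumed periodic form of Fun(E1,T,P); the patent defines it in a figure.
    return math.sin(2 * math.pi * t / p + e1)

def second_target_color(edge, dist, t, period,
                        init_color=(255.0, 0.0, 0.0),
                        target_color=(0.0, 0.0, 255.0)):
    # Color(R,G,B)2 = Color(R0,G0,B0) * Fun(E1,T,P) + Color(R1,G1,B1) * D1
    # edge: the point's edge value E1; dist: its distance D1 to the area center.
    s = fun(edge, t, period)
    return tuple(c0 * s + c1 * dist for c0, c1 in zip(init_color, target_color))
```

Points with stronger edge values oscillate with a different phase, while the distance term blends in the target gradient color toward the area's interior, producing the time-varying edge gradient the text describes.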
Optionally, when the gray scale of a point is smaller than the first color mixing ratio, obtaining the mixed color value of that point as its current color value based on the second color mixing ratio, the first preset color value, and the second preset color value includes:
calculating the current color value by adopting a third calculation formula;
the third calculation formula includes:
Color(R5,G5,B5)=Color(R3,G3,B3)×α+Color(R4,G4,B4)×(1-α)
where Color(R₅,G₅,B₅) denotes the current color value, Color(R₃,G₃,B₃) denotes the first preset color value, Color(R₄,G₄,B₄) denotes the second preset color value, and α is the second color mixing ratio, defined as a function of the animation time and the animation period by a formula image not reproduced here.
in the implementation process, the second color mixing proportion is a function of the animation period and the animation time, the current color value is obtained based on the second color mixing proportion, and the dynamic change of the current color value is caused by the dynamic change of the second color mixing proportion, so that the dynamic property of the virtual animation special effect based on the original color value can be improved.
Optionally, the obtaining of the new color value of each point based on the original color value, the current color value, and the first color mixing ratio of each point includes:
calculating the new color value by adopting a fourth calculation formula;
the fourth calculation formula includes:
Color(R7,G7,B7)=Color(R6,G6,B6)×β+Color(R5,G5,B5)×(1-β)
where Color(R₇,G₇,B₇) denotes the new color value, Color(R₆,G₆,B₆) denotes the original color value, Color(R₅,G₅,B₅) denotes the current color value, and β is the first color mixing ratio, defined as a function of the animation time and the animation period by a formula image not reproduced here.
in the implementation process, the first color mixing proportion is a function of the animation period and the animation time, the new color value is obtained based on the first color mixing proportion, and the new color dynamically changes with the animation time and the animation period, so that the dynamic property of the virtual animation special effect based on the new color value is improved.
An embodiment of the present application provides a virtual animated special effect producing apparatus, including:
the system comprises an acquisition module, a processing module and a display module, wherein the acquisition module is used for acquiring object information of each point in a designated area on an object image, and the object information comprises at least one of object depth information, an edge value and an original color value;
the calculation module is used for calculating to obtain a pixel change value of each point in the animation time based on the object information of each point, the preset animation time and the animation period;
and the generating module is used for rendering based on the pixel change value of each point to obtain a virtual animation special effect for displaying in the specified area of the object image.
In this implementation, the object depth information, the edge value, and the original color value are mapped into pixel change values. A virtual animation special effect displayed on the basis of these pixel change values shows the color of each point in the designated area of the object image changing as at least one of the object depth information, the edge value, and the original color value changes. This enhances the detail display of that information and, when the special effect is rendered from the pixel change values, improves the fineness of the object image's augmented reality display.
Optionally, the computing module is configured to:
calculating, based on the object depth information of each point, the animation time, and the animation period, the first target color value of each point at each moment within the animation time.
In this implementation, the object depth information of each point in the designated area is mapped into a first target color value for each point at each moment within the animation time, so the depth information is embodied in that color value. When the virtual animation special effect is rendered from the pixel change values, the depth information of each point in the designated area is reflected through color change, which adds detail to the augmented reality display of the object image and improves its fineness.
Optionally, the computing module is configured to:
setting an initial gradient color value and a target gradient color value of each point;
acquiring a distance value from each point to a central point of the designated area;
calculating, as the pixel change value, the second target color value of each point within the animation time, based on the edge value of each point, the initial gradient color value of each point, the target gradient color value of each point, the distance value of each point, the animation time, and the animation period.
In this implementation, the edge value of each point in the designated area is mapped into a second target color value for each point at each moment within the animation time, so the edge information of the area is embodied in that color value. Rendering the virtual animation special effect from these per-point color change values produces an edge color gradient effect in the designated area, that is, an animation in which the color gradient on different edges of the object changes over time. Because the edge value of each point is reflected through color change, the augmented reality display of the object image gains detail and the fineness of the special effect improves.
Optionally, the computing module is configured to:
acquiring the original color value of each point on the designated area;
when the gray scale of any one of the points is smaller than a first color mixing ratio, obtaining a mixed color value for that point as its current color value, based on a second color mixing ratio, a first preset color value, and a second preset color value, wherein the second color mixing ratio varies with the animation time and the animation period; when the gray scale of the point is greater than or equal to the first color mixing ratio, its current color value is the original color value;
and obtaining the new color value of each point based on the original color value, the current color value and the first color mixing proportion of each point.
In this implementation, the original color value of each point in the designated area is mapped into a new color value for each point at each moment within the animation time, so the original color information is embodied in the new color value. When the virtual animation special effect is rendered from the new color values, the original color of each point in the designated area is reflected through color change, which adds detail to the augmented reality display of the object image and improves the fineness of the special effect.
Optionally, the calculation module is specifically configured to:
calculating to obtain a first target color value of each point at each moment in the animation time through a first calculation formula based on the object information of each point, the animation time and the animation period;
the first calculation formula includes:
Color(R,G,B)1=Color(E,125,125)+Color(F,125,125)×Fun(D,T,P)
where Color(R,G,B)₁ denotes the first target color value, E and F are constants, D denotes the object depth information, T denotes the animation time, and P denotes the animation period; the exact form of Fun(D,T,P) is given by a formula image not reproduced here.
In this implementation, the animation period and the animation time are combined to map the object depth information of each point in the designated area into a first target color value for each point at each moment within the animation time, so the depth information is embodied in that color value. When the animation special effect is rendered from the first target color values, the depth information of each point in the designated area is reflected through color change, which improves the fineness of the virtual animation special effect in the augmented reality display of the object.
Optionally, the computing module is configured to:
based on the object information of each point, preset animation time and animation period, calculating a second target color value of each point at each moment in the animation time by adopting a second calculation formula;
the second calculation formula includes:
Color(R,G,B)₂ = Color(R₀,G₀,B₀) × Fun(E₁,T,P) + Color(R₁,G₁,B₁) × D₁
where Color(R,G,B)₂ denotes the second target color value, F is a constant, D₁ denotes the distance value, T denotes the animation time, P denotes the animation period, Color(R₀,G₀,B₀) denotes the initial gradient color value, Color(R₁,G₁,B₁) denotes the target gradient color value, and E₁ denotes the edge value; the exact form of Fun is given by a formula image not reproduced here.
In this implementation, the animation period, the animation time, the initial gradient color, and the target gradient color are combined to map the edge value of each point in the designated area into a second target color value for each point at each moment within the animation time, so the edge information is embodied in that color value. When the animation special effect is rendered from these edge gradient colors, the edge value of each point in the designated area is reflected through color change, which adds detail to the augmented reality display of the object image and improves the fineness of the special effect.
Optionally, the computing module is configured to:
calculating the current color value by adopting a third calculation formula;
the third calculation formula includes:
Color(R5,G5,B5)=Color(R3,G3,B3)×α+Color(R4,G4,B4)×(1-α)
where Color(R₅,G₅,B₅) denotes the current color value, Color(R₃,G₃,B₃) denotes the first preset color value, Color(R₄,G₄,B₄) denotes the second preset color value, and α is the second color mixing ratio, defined as a function of the animation time and the animation period by a formula image not reproduced here.
in the implementation process, the second color mixing proportion is a function of the animation period and the animation time, the current color value is obtained based on the second color mixing proportion, and the dynamic change of the current color value is caused by the dynamic change of the second color mixing proportion, so that the dynamic property of the virtual animation special effect based on the original color value can be improved.
Optionally, the computing module is configured to:
calculating the new color value by adopting a fourth calculation formula;
the fourth calculation formula includes:
Color(R7,G7,B7)=Color(R6,G6,B6)×β+Color(R5,G5,B5)×(1-β)
where Color(R₇,G₇,B₇) denotes the new color value, Color(R₆,G₆,B₆) denotes the original color value, Color(R₅,G₅,B₅) denotes the current color value, and β is the first color mixing ratio, defined as a function of the animation time and the animation period by a formula image not reproduced here.
in the implementation process, the first color mixing proportion is a function of the animation period and the animation time, the new color value is obtained based on the first color mixing proportion, and the new color dynamically changes with the animation time and the animation period, so that the dynamic property of the virtual animation special effect based on the new color value is improved.
The present embodiment also provides an electronic device, where the electronic device includes a memory and a processor, where the memory stores program instructions, and the processor executes the program instructions to perform the steps of any of the above methods.
The present embodiment also provides a storage medium having stored therein computer program instructions, which when executed by a processor, perform the steps of any of the above methods.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments of the present application will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and that those skilled in the art can also obtain other related drawings based on the drawings without inventive efforts.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
Fig. 1 is a flowchart of a method for making a special effect of a virtual animation according to an embodiment of the present application.
Fig. 2 is a flowchart of another method for producing a virtual animated special effect according to an embodiment of the present application.
Fig. 3 is a flowchart of another method for producing a virtual animated special effect according to an embodiment of the present application.
Fig. 4 is a schematic diagram of a virtual animation special effect producing apparatus according to an embodiment of the present application.
Legend: 40-virtual animation special effect making device; 401-an acquisition module; 402-a calculation module; 403-generation module.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
In the description of the present application, it is noted that the terms "first", "second", and the like are used merely for distinguishing between descriptions and are not intended to indicate or imply relative importance.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the embodiments of the application. The objectives and other advantages of the application may be realized and attained by the structure particularly pointed out in the written description and drawings.
In order to improve the fineness with which a virtual animation special effect enhances the reality of an object, an embodiment of the present application provides a virtual animation special effect production method. Please refer to fig. 1, which is a flowchart of the virtual animation special effect production method provided by an embodiment of the present application; the method includes the following steps:
step S1: and acquiring object information of each point in the designated area on the object image, wherein the object information comprises at least one of object depth information, an edge value and an original color value.
It is understood that each point on an object existing in the real world corresponds to a three-dimensional coordinate value under a certain coordinate system, and the three-dimensional coordinate value of any point on the object can be represented in any coordinate form, for example as (x, y, z). The designated area can be defined according to the actual situation and is not specifically limited here.
An object image can be obtained by photographing the object; the object image is presented as a two-dimensional color image, and a point (x, y, z) on the object corresponds to coordinates (x', y') in the object image. When the information of the object is compressed into a model in the image for storage, a depth map of the object is obtained. The object depth information in the depth map corresponding to the object image is (x', y', z'), where z' is the depth value of the object. The depth map thus establishes a one-to-one correspondence between the three-dimensional coordinates of the object and the object depth information.
It can be understood that extracting the edge information and wrinkle information of the object from the object image and saving them in a gray-scale map yields an edge map. The edge map filters out the points located at edge positions: the pixel value of each point on an edge, that is, its edge value, is greater than zero, while the pixel values of all points other than the edges in the edge map are zero.
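The source gives no code for this edge-map extraction; the following is a minimal sketch, assuming a central-difference gradient filter and an illustrative threshold of 10 (both are assumptions of this sketch, not the patent's choices):

```python
import numpy as np

def edge_map(gray: np.ndarray, threshold: float = 10.0) -> np.ndarray:
    """Build an edge map from a gray-scale image: points on an edge keep a
    pixel value (edge value) greater than zero, all other points are zero."""
    g = gray.astype(float)
    gx = np.zeros_like(g)
    gy = np.zeros_like(g)
    gx[:, 1:-1] = g[:, 2:] - g[:, :-2]    # horizontal central difference
    gy[1:-1, :] = g[2:, :] - g[:-2, :]    # vertical central difference
    magnitude = np.hypot(gx, gy)          # gradient magnitude at each point
    return np.where(magnitude > threshold, magnitude, 0.0)
```

A point away from any edge then reads back exactly zero, matching the property stated above.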
It will be appreciated that since the object image is a two-dimensional image in color, each point (x ', y ', z ') corresponds to a primary color value. Wherein, the color value can be R, G, B value.
Step S2: and calculating to obtain the pixel change value of each point in the animation time based on the object information of each point, the preset animation time and the animation period.
It is understood that only one object image is obtained, so in order to dynamically show the change of the object information, a time variation is applied to the object information. As one embodiment, the time variation can be realized by multiplying the object information by a time-varying function.
Optionally, step S2 includes the following substep: determining, based on the object depth information of each point, the animation time and the animation period, the first target color value of each point at each moment within the animation time.
It can be understood that giving the object depth information a time variation, that is, multiplying it by a time function, converts the object depth information into a change of pixel value, namely the first target color value at each moment. In this way the object depth information of each point in the designated area changes dynamically over time, and the mapping from object depth information to first target color value preserves the detail information carried by the object depth information of each point in the designated area of the object image. When the virtual animation special effect is rendered from the first target color values, the fineness of the reality enhancement of the object image, and hence of the object itself, is improved.
Optionally, step S2 includes: calculating to obtain a first target color value of each point at each moment in the animation time through a first calculation formula based on the object information, the animation time and the animation period of each point;
the first calculation formula includes:
Color(R,G,B)1=Color(E,125,125)+Color(F,125,125)×Fun(D,T,P)
wherein Color(R,G,B)1 represents the first target color value, E and F are constants, D represents the object depth information, T represents the animation time, and P represents the animation period.
It is understood that, in the first calculation formula, the object depth information D in Fun(D,T,P) for the same point remains unchanged, and the function Fun() may take the form of any function or combination of functions, such as a trigonometric, linear or exponential function. The animation time T expresses the duration of the dynamic change of the object depth information, and the animation period P expresses its period, where T = nP and n is the number of dynamic-change periods. The value ranges of E and F are both [0, 255]; E and F are generally set to 125 and can also be adjusted according to actual needs.
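As a hedged sketch of the first calculation formula, evaluated per point and per moment: the patent leaves Fun() open, so a sine of depth plus time phase is assumed here, as is clamping the result to [0, 255].

```python
import math

def first_target_color(depth, t, period, e=125.0, f=125.0):
    """Color(R,G,B)1 = Color(E,125,125) + Color(F,125,125) * Fun(D,T,P).

    Fun() is any function of the depth D and time T; a sine is one of the
    admissible forms named in the text (trigonometric, linear, exponential).
    """
    fun = math.sin(depth + 2.0 * math.pi * t / period)  # example Fun(D, T, P)
    clamp = lambda v: max(0.0, min(255.0, v))
    return (clamp(e + f * fun),
            clamp(125.0 + 125.0 * fun),
            clamp(125.0 + 125.0 * fun))
```

At t = 0 with zero depth, Fun is 0 and the point renders the mid-gray base color Color(E,125,125).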
Referring to fig. 2, fig. 2 is a flowchart illustrating a step of obtaining a second target color value according to an embodiment of the present application. Optionally, step S2 includes the following substeps:
step S21: setting the initial gradient color value and the target gradient color value of each point.
Step S22: and acquiring the distance value of each point to the central point of the designated area.
Step S23: and calculating based on the edge value of each point, the initial gradient color value of each point, the target gradient color value of each point, the distance value of each point, the animation time and the animation period to obtain a second target color value of each point in the animation time.
It is to be understood that the edge value of each point is obtained by performing edge feature extraction on the object based on the object image. In step S21, the initial gradient color value specifies the color at the start of the dynamic display of the edge value of each point in the edge map, and the target gradient color value specifies the color at its end. The initial gradient color value is given a time variation, that is, it is multiplied by a time function; the edge value of each point in the edge map is then mapped to a second target color value at each moment through step S23, and the object image can be dynamically and visually displayed through the change of the second target color value.
Optionally, step S2 includes: based on the object information of each point, preset animation time and animation period, calculating a second target color value of each point at each moment in the animation time by adopting a second calculation formula, wherein the second calculation formula comprises:
Color(R,G,B)2=Color(R0,G0,B0)×Fun(E1,T,P)+Color(R1,G1-D1,B1×D1)

wherein Color(R,G,B)2 represents the second target color value, D1 represents the distance value, T represents the animation time, P represents the animation period, Color(R0,G0,B0) represents the initial gradient color value, Color(R1,G1,B1) represents the target gradient color value, and E1 represents the edge value.
It is understood that, in the second calculation formula, the edge value E1 in Fun(E1,T,P) for the same point remains unchanged; the animation time T and the animation period P give the edge value E1 a time-varying characteristic. The second calculation formula maps the edge value to a second target color value at each moment, and the edge details of the object are displayed as the second target color value gradually changes over time. Fun(E1,T,P) may take the form of any function or combination of functions, such as a trigonometric, linear or exponential function.
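Under the same hedging (a sine chosen for Fun(E1,T,P), clamping to [0, 255] assumed, default gradient colors illustrative), the second calculation formula can be sketched as:

```python
import math

def second_target_color(edge_value, distance, t, period,
                        start=(255.0, 0.0, 0.0), target=(0.0, 0.0, 255.0)):
    """Color(R,G,B)2 = Color(R0,G0,B0)*Fun(E1,T,P) + Color(R1, G1-D1, B1*D1).

    start  is Color(R0,G0,B0), the initial gradient color value;
    target is Color(R1,G1,B1), the target gradient color value;
    distance is D1, the distance of the point to the area's center point.
    """
    fun = math.sin(edge_value + 2.0 * math.pi * t / period)  # example Fun(E1,T,P)
    r0, g0, b0 = start
    r1, g1, b1 = target
    clamp = lambda v: max(0.0, min(255.0, v))
    return (clamp(r0 * fun + r1),
            clamp(g0 * fun + (g1 - distance)),
            clamp(b0 * fun + b1 * distance))
```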
Referring to fig. 3, fig. 3 is a flowchart illustrating a step of obtaining a new color value according to an embodiment of the present application. Optionally, step S2 includes the following substeps:
step S24: and acquiring the original color value of each point on the designated area.
It is understood that the object image is a color two-dimensional image, so that each point on the object image has a color value, which can be represented by three primary colors.
Step S25: when the gray scale of any point in each point is smaller than the first color mixing proportion, obtaining a mixed color value of any point as a current color value of any point based on the second color mixing proportion, the first preset color value and the second preset color value, wherein the second color mixing proportion is changed based on the animation time and the animation period; and when the gray scale of any point is greater than or equal to a first color mixing proportion, the current color value of any point is the original color value.
It will be understood that grayscale refers to the color depth of a point in a black-and-white image, typically ranging from 0 to 255, where 255 represents white and 0 represents black. In step S25, the first preset color value and the second preset color value are preset and may be set according to actual conditions. A mixed color value, namely the current color value, is obtained by blending the first preset color value and the second preset color value according to the second color mixing proportion, and the first color mixing proportion is given a time-varying attribute through a time function, so that the current color value also has a time attribute. As an embodiment, the gray scale of each point may be calculated from the color value of that point in the object image: for a point with color value Color(R6,G6,B6), the gray value is computed from R6, G6 and B6.
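The exact grayscale formula survives only as an image in the source, so the sketch below assumes the plain channel average; a luminance-weighted sum would be an equally plausible reading.

```python
def gray_of(color):
    """Grayscale of a point with color value Color(R6,G6,B6). The average
    (R6+G6+B6)/3 is an assumption, not the patent's stated formula."""
    r6, g6, b6 = color
    return (r6 + g6 + b6) / 3.0
```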
Optionally, step S25 includes: calculating the current color value by adopting a third calculation formula, wherein the third calculation formula comprises the following steps:
Color(R5,G5,B5)=Color(R3,G3,B3)×α+Color(R4,G4,B4)×(1-α)

wherein Color(R5,G5,B5) represents the current color value, Color(R3,G3,B3) represents the first preset color value, Color(R4,G4,B4) represents the second preset color value, and α is the second color mixing proportion, α = 0.5·sin(2πT/P) + 0.5.
It will be appreciated that the expression for α shows that the second color mixing proportion has a time-varying property. The animation time T and the animation period P can be set according to the actual situation, where T = nP and n denotes the number of dynamic-change periods.
Step S26: the new color value of each point is obtained based on the original color value, the current color value and the first color blending ratio.
It can be understood from step S25 that the current color value has a time attribute owing to the second color mixing proportion, and the first color mixing proportion also has a time attribute, so the resulting new color value likewise changes with time. Step S26 maps the original color value of each point in the object image to a new color value at each moment, and the gradual change of the new color value over time visually reflects the original color value of each point in the object image.
Optionally, step S26 includes the following substeps: calculating the new color value by adopting a fourth calculation formula, wherein the fourth calculation formula comprises the following steps:
Color(R7,G7,B7)=Color(R6,G6,B6)×β+Color(R5,G5,B5)×(1-β)

wherein Color(R7,G7,B7) represents the new color value, Color(R6,G6,B6) represents the original color value, Color(R5,G5,B5) represents the current color value, and β is the first color mixing proportion, β = 0.5·sin(2πT/P) + 0.5.
It will be appreciated that β = 0.5·sin(2πT/P) + 0.5 indicates that the first color mixing proportion has a time-varying characteristic. The animation time T and the animation period P may be set according to the actual situation, where T = nP and n denotes the number of dynamic-change periods.
Continuing with fig. 1, step S3: and rendering based on the pixel change value of each point to obtain a virtual animation special effect for displaying in a specified area of the object image.
It is understood that in step S3, a device equipped with editing software may be used to edit and render the pixel change values, and the virtual animated special effect of the object image may then be presented through a projection device.
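The overall flow of steps S1 to S3 can then be sketched as a frame loop. The per-point data layout and the pixel_fn callback are assumptions of this sketch, not an interface disclosed by the patent:

```python
def render_animation(points, pixel_fn, period=2.0, n_cycles=1, frames=8):
    """Evaluate the pixel change value of every point at successive instants
    within the animation time T = n_cycles * period, producing one list of
    RGB tuples per frame; an editing/projection device would display these."""
    total_time = n_cycles * period            # T = nP
    rendered = []
    for i in range(frames):
        t = total_time * i / frames           # current instant within T
        rendered.append([pixel_fn(p, t, period) for p in points])
    return rendered
```

Any of the per-point mappings above (first or second target color value, or the blended new color value) can serve as pixel_fn.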
Referring to fig. 4, fig. 4 is a schematic view of a virtual animation special effect making apparatus according to an embodiment of the present application. The virtual animated special effect producing device 40 includes:
the obtaining module 401 is configured to obtain object information of each point in a designated area on an object image, where the object information includes at least one of object depth information, an edge value, and an original color value.
And the calculating module 402 is configured to calculate a pixel change value of each point in the animation time based on the object information of each point, a preset animation time, and an animation period.
And a generating module 403, configured to render based on the pixel change value of each point to obtain a virtual animation special effect for displaying in a specified area of the object image.
Optionally, the calculation module 402 is configured to:
determine, based on the object depth information of each point, the animation time and the animation period, the first target color value of each point at each moment within the animation time.
Optionally, the calculation module 402 is configured to:
setting an initial gradual change color value and a target gradual change color value of each point;
acquiring a distance value from each point to the central point of the designated area;
and calculating to obtain the pixel change value of each point in the animation time based on the edge value of each point, the initial gradient color value of each point, the target gradient color value of each point, the distance value of each point, the animation time and the animation period.
Optionally, the calculation module 402 is configured to:
acquiring the original color value of each point on the designated area;
when the gray scale of any point in each point is smaller than the first color mixing proportion, obtaining a mixed color value of any point as a current color value of any point based on the second color mixing proportion, the first preset color value and the second preset color value, wherein the second color mixing proportion is changed based on the animation time and the animation period; when the gray scale of any point is larger than or equal to the first color mixing proportion, the current color value of any point is the original color value;
the new color value of each point is obtained based on the original color value, the current color value and the first color blending ratio.
Optionally, the calculating module 402 is specifically configured to:
based on the object information, the animation time and the animation period of each point, calculating through a first calculation formula to obtain a first target color value of each point at each moment in the animation time;
the first calculation formula includes:
Color(R,G,B)1=Color(E,125,125)+Color(F,125,125)×Fun(D,T,P)
wherein Color(R,G,B)1 represents the first target color value, E and F are constants, D represents the object depth information, T represents the animation time, and P represents the animation period.
Optionally, the calculation module 402 is configured to:
based on the object information of each point, preset animation time and animation period, calculating a second target color value of each point at each moment in the animation time by adopting a second calculation formula;
the second calculation formula includes:
Color(R,G,B)2=Color(R0,G0,B0)×Fun(E1,T,P)+Color(R1,G1-D1,B1×D1)

wherein Color(R,G,B)2 represents the second target color value, D1 represents the distance value, T represents the animation time, P represents the animation period, Color(R0,G0,B0) represents the initial gradient color value, Color(R1,G1,B1) represents the target gradient color value, and E1 represents the edge value.
Optionally, the calculation module 402 is configured to:
calculating the current color value by adopting a third calculation formula;
the third calculation formula includes:
Color(R5,G5,B5)=Color(R3,G3,B3)×α+Color(R4,G4,B4)×(1-α)
wherein Color(R5,G5,B5) represents the current color value, Color(R3,G3,B3) represents the first preset color value, Color(R4,G4,B4) represents the second preset color value, and α is the second color mixing proportion, α = 0.5·sin(2πT/P) + 0.5.
optionally, the calculation module 402 is configured to:
calculating a new color value by adopting a fourth calculation formula;
the fourth calculation formula includes:
Color(R7,G7,B7)=Color(R6,G6,B6)×β+Color(R5,G5,B5)×(1-β)
wherein Color(R7,G7,B7) represents the new color value, Color(R6,G6,B6) represents the original color value, Color(R5,G5,B5) represents the current color value, and β is the first color mixing proportion, β = 0.5·sin(2πT/P) + 0.5.
The present embodiment also provides an electronic device, where the electronic device includes a memory and a processor, where the memory stores program instructions, and the processor executes the program instructions to perform the steps of any of the above methods.
The present embodiment also provides a storage medium having stored therein computer program instructions, which when executed by a processor, perform the steps of any of the above methods.
To sum up, the embodiments of the present application provide a virtual animation special effect production method and apparatus and an electronic device, relating to the technical field of image processing. The method includes: obtaining object information of each point in a designated area on an object image, the object information including at least one of object depth information, an edge value and an original color value; calculating, based on the object information of each point, the animation time and the animation period, the pixel change value of each point; and rendering based on the pixel change values to obtain a virtual animation special effect of the object image.
In the above implementation, the object depth information, the edge value and the original color value are mapped into pixel change values. A virtual animation special effect displayed on the basis of these pixel change values shows the color of each point in the designated area of the object image changing along with at least one of the object depth information, the edge value and the original color value, which strengthens the detail display of that information. When the virtual animation special effect is rendered from the pixel change values, the fineness of the reality-enhanced display of the object image is therefore improved.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. The apparatus embodiments described above are merely illustrative, and for example, the block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of devices according to various embodiments of the present application. In this regard, each block in the block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams, and combinations of blocks in the block diagrams, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Therefore, the present embodiment further provides a readable storage medium, in which computer program instructions are stored, and when the computer program instructions are read and executed by a processor, the computer program instructions perform the steps of any of the block data storage methods. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. A method for producing a virtual animated special effect, the method comprising:
acquiring object information of each point in a designated area on an object image, wherein the object information comprises at least one of object depth information, an edge value and an original color value;
calculating to obtain a pixel change value of each point in animation time based on the object information of each point, preset animation time and an animation period;
and rendering based on the pixel change value of each point to obtain a virtual animation special effect for displaying in the specified area of the object image.
2. The method according to claim 1, wherein the object information is object depth information, the pixel change value includes a first target color value of each point at each time within the animation time, and the calculating based on the object information of each point, a preset animation time and an animation period obtains the pixel change value of each point within the animation time includes:
determining, based on the object depth information of each point, the animation time and the animation period, the first target color value of each point at each moment within the animation time.
3. The method according to claim 1, wherein the object information is an edge value, the pixel change value includes a second target color value of each point at each time within the animation time, and the calculating based on the object information of each point, a preset animation time and an animation period to obtain the pixel change value of each point within the animation time includes:
setting an initial gradient color value and a target gradient color value of each point;
acquiring a distance value from each point to a central point of the designated area;
and calculating to obtain a second target color value of each point in the animation time based on the edge value of each point, the initial gradient color value of each point, the target gradient color value of each point, the distance value of each point, the animation time and the animation period.
4. The method of claim 1, wherein the object information is an original color value, the pixel change value includes a new color value of each point at each time in the animation time, and the calculating based on the object information of each point, a preset animation time and an animation period to obtain the pixel change value of each point in the animation time includes:
acquiring the original color value of each point on the designated area;
when the gray scale of any point in each point is smaller than a first color mixing proportion, obtaining a mixed color value of the any point as a current color value of the any point based on a second color mixing proportion, a first preset color value and a second preset color value, wherein the second color mixing proportion is changed based on the animation time and the animation period; when the gray scale of any point is larger than or equal to the first color mixing proportion, the current color value of any point is the original color value;
and obtaining the new color value of each point based on the original color value, the current color value and the first color mixing proportion of each point.
5. The method according to claim 2, wherein the determining, based on the object depth information of each point, the animation time and the animation period, the first target color value of each point at each moment within the animation time comprises:
calculating to obtain a first target color value of each point at each moment in the animation time through a first calculation formula based on the object information of each point, the animation time and the animation period;
the first calculation formula includes:
Color(R,G,B)1=Color(E,125,125)+Color(F,125,125)×Fun(D,T,P)
wherein Color(R,G,B)1 represents the first target color value, E and F are constants, D represents the object depth information, T represents the animation time, and P represents the animation period.
6. The method according to claim 3, wherein the calculating based on the object information of each point, a preset animation time and an animation period to obtain a pixel change value of each point in the animation time comprises:
based on the object information of each point, preset animation time and animation period, calculating a second target color value of each point at each moment in the animation time by adopting a second calculation formula;
the second calculation formula includes:
Color(R,G,B)2=Color(R0,G0,B0)×Fun(E1,T,P)+Color(R1,G1-D1,B1×D1)
wherein Color(R,G,B)2 represents the second target color value, D1 represents the distance value, T represents the animation time, P represents the animation period, Color(R0,G0,B0) represents the initial gradient color value, Color(R1,G1,B1) represents the target gradient color value, and E1 represents the edge value.
7. The method of claim 4, wherein obtaining the blended color value of the any point as the current color value of the any point based on the second color blending ratio, the first preset color value and the second preset color value when the gray scale of the any point is smaller than the first color blending ratio comprises:
when the gray scale of any point is smaller than the first color mixing proportion, calculating the mixed color value of any point by adopting a third calculation formula based on the second color mixing proportion, the first preset color value and the second preset color value to be used as the current color value of any point;
the third calculation formula includes:
Color(R5,G5,B5)=Color(R3,G3,B3)×α+Color(R4,G4,B4)×(1-α)
wherein Color(R5,G5,B5) represents the current color value, Color(R3,G3,B3) represents the first preset color value, Color(R4,G4,B4) represents the second preset color value, and α is the second color mixing proportion, α = 0.5·sin(2πT/P) + 0.5.
8. the method of claim 7, wherein the deriving the new color value for each point based on the original color value, the current color value, and the first blend ratio comprises:
calculating a new color value of each point by adopting a fourth calculation formula based on the original color value, the current color value and the first color mixing proportion of each point;
the fourth calculation formula includes:
Color(R7,G7,B7)=Color(R6,G6,B6)×β+Color(R5,G5,B5)×(1-β)
wherein Color(R7,G7,B7) represents the new color value, Color(R6,G6,B6) represents the original color value, Color(R5,G5,B5) represents the current color value, and β is the first color mixing proportion, β = 0.5·sin(2πT/P) + 0.5.
9. a virtual animated special effect producing apparatus, characterized in that the apparatus comprises:
the system comprises an acquisition module, a processing module and a display module, wherein the acquisition module is used for acquiring object information of each point in a designated area on an object image, and the object information comprises at least one of object depth information, an edge value and an original color value;
the calculation module is used for calculating to obtain a pixel change value of each point in the animation time based on the object information of each point, the preset animation time and the animation period;
and the generating module is used for rendering based on the pixel change value of each point to obtain a virtual animation special effect for displaying in the specified area of the object image.
10. An electronic device comprising a memory having stored therein program instructions and a processor that, when executed, performs the steps of the method of any of claims 1-8.
CN202011531842.1A 2020-12-22 2020-12-22 Virtual animation special effect manufacturing method and device and electronic equipment Active CN112767518B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011531842.1A CN112767518B (en) 2020-12-22 2020-12-22 Virtual animation special effect manufacturing method and device and electronic equipment


Publications (2)

Publication Number Publication Date
CN112767518A true CN112767518A (en) 2021-05-07
CN112767518B CN112767518B (en) 2023-06-06

Family

ID=75695203

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011531842.1A Active CN112767518B (en) 2020-12-22 2020-12-22 Virtual animation special effect manufacturing method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN112767518B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1992009947A1 (en) * 1990-11-30 1992-06-11 Vpl Research, Inc. Method and apparatus for rendering graphical images
CN102113303A (en) * 2008-08-04 2011-06-29 Microsoft Corporation Gpu scene composition and animation
CN102129708A (en) * 2010-12-10 2011-07-20 Beijing University of Posts and Telecommunications Fast multilevel imagination and reality occlusion method at actuality enhancement environment
US20180117466A1 (en) * 2016-10-31 2018-05-03 Zynga Inc. G.p.u.-assisted character animation
CN108604389A (en) * 2016-05-16 2018-09-28 Google LLC Continuous depth ordering image synthesis
CN108876931A (en) * 2017-05-12 2018-11-23 Tencent Technology (Shenzhen) Co., Ltd. Three-dimension object color adjustment method, device, computer equipment and computer readable storage medium
CN109242943A (en) * 2018-08-21 2019-01-18 Tencent Technology (Shenzhen) Co., Ltd. A kind of image rendering method, device and image processing equipment, storage medium
WO2019135979A1 (en) * 2018-01-05 2019-07-11 Microsoft Technology Licensing, Llc Fusing, texturing, and rendering views of dynamic three-dimensional models
US20190279424A1 (en) * 2018-03-07 2019-09-12 California Institute Of Technology Collaborative augmented reality system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Wang Shandong et al.: "Real-time enhanced rendering for image abstraction", vol. 25, no. 2, pages 189-199 *
Zhao Wei; Mao Ping; Shen Fanyu: "Research on development trends of next-generation three-dimensional graphics engines", no. 12, pages 2935-2944 *

Also Published As

Publication number Publication date
CN112767518B (en) 2023-06-06

Similar Documents

Publication Publication Date Title
US6285779B1 (en) Floating-point complementary depth buffer
KR100612890B1 (en) Multi-effect expression method and apparatus in 3-dimension graphic image
CN111508052B (en) Rendering method and device of three-dimensional grid body
US5592597A (en) Real-time image generation system for simulating physical paint, drawing media, and feature modeling with 3-D graphics
CN109448137B (en) Interaction method, interaction device, electronic equipment and storage medium
CN110163831B (en) Method and device for dynamically displaying object of three-dimensional virtual sand table and terminal equipment
CN108022285B (en) Map rendering method and device
US6184893B1 (en) Method and system for filtering texture map data for improved image quality in a graphics computer system
RU2422902C2 (en) Two-dimensional/three-dimensional combined display
US6791569B1 (en) Antialiasing method using barycentric coordinates applied to lines
JP2012190428A (en) Stereoscopic image visual effect processing method
US6791563B2 (en) System, method and computer program product for global rendering
US6774897B2 (en) Apparatus and method for drawing three dimensional graphics by converting two dimensional polygon data to three dimensional polygon data
US9064336B2 (en) Multiple texture compositing
CN112516595B (en) Magma rendering method, device, equipment and storage medium
CN111311720A (en) Texture image processing method and device
JP7460641B2 (en) Apparatus and method for generating a light intensity image - Patents.com
CN106846449B (en) Rendering method and device for visual angle material or map
CN112767518B (en) Virtual animation special effect manufacturing method and device and electronic equipment
KR100848687B1 (en) 3-dimension graphic processing apparatus and operating method thereof
US6646650B2 (en) Image generating apparatus and image generating program
US20050231533A1 (en) Apparatus and method for performing divide by w operations in a graphics system
KR100684558B1 (en) Texture mipmapping device and the same method
JPH11328427A (en) Device and method for polygon divisional plotting and storage medium
KR100818286B1 (en) Method and apparatus for rendering 3 dimensional graphics data considering fog effect

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant