CN111489429B - Image rendering control method, terminal equipment and storage medium - Google Patents


Info

Publication number: CN111489429B
Application number: CN202010301299.XA
Authority: CN (China)
Prior art keywords: pixel, transparent, processed, region, display interface
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN111489429A
Inventor: 李帅
Current assignee: ARCHERMIND TECHNOLOGY (NANJING) CO., LTD.
Original assignee: ARCHERMIND TECHNOLOGY (NANJING) CO., LTD.
Application filed by ARCHERMIND TECHNOLOGY (NANJING) CO., LTD.
Priority to CN202010301299.XA
Publication of CN111489429A
Application granted; publication of CN111489429B


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/50 Lighting effects
    • G06T 15/503 Blending, e.g. for anti-aliasing
    • G06T 15/005 General purpose rendering architectures

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)

Abstract

The invention provides an image rendering control method, a terminal device and a storage medium. The method comprises the following steps: superposing a transparent channel on a display interface to obtain a region to be processed; acquiring pixel data of the region to be processed, and calculating pixel adjustment parameters according to control pixel coefficients and the pixel data; and adjusting the pixel point states of the region to be processed according to the pixel adjustment parameters, so as to complete the pixel-state switching display of the display interface. The method renders the display interface by region, making the rendering display mode more flexible and efficient.

Description

Image rendering control method, terminal equipment and storage medium
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to an image rendering control method, a terminal device, and a storage medium.
Background
Currently, to achieve a three-dimensional dynamic rendering effect, the prior art relies mainly on three-dimensional transparent-channel mapping techniques. These techniques, however, focus on producing pixel-value change display effects on three-dimensional objects, such as trees, flowers and plants, clothing, and the hair of a model, by means of three-dimensional surface patches with superimposed transparent channels.
In the above technique, the pixel data of the three-dimensional object surface is controlled as a whole: pixel change effects of the surface, such as transparency and brightness, are controlled by pixel values, and the rendering effect is not divided into regions. Alternatively, the pixel data of each pixel point can be controlled and adjusted individually; although this approach can achieve regional rendering, it is inefficient and error-prone, so the regional rendering effect is poor.
Therefore, how to render and display separate regions independently while improving regional rendering efficiency is a technical problem to be solved in the art.
Disclosure of Invention
The invention aims to provide an image rendering control method, a terminal device and a storage medium that realize regional rendering of a display interface, making the rendering display mode more flexible and efficient.
The technical scheme provided by the invention is as follows:
the invention provides an image rendering control method, which comprises the following steps:
Superposing a transparent channel on a display interface to obtain a region to be processed;
acquiring pixel data of the region to be processed, and calculating pixel adjustment parameters according to the control pixel coefficients and the pixel data;
And adjusting the pixel point state of the region to be processed according to the pixel adjusting parameters so as to finish the pixel state switching display of the display interface.
Further, the step of acquiring the pixel data of the area to be processed includes the steps of:
If the interface image corresponding to the display interface does not include transparent pixels, performing grayscale processing on the interface image;
and obtaining a target pixel value corresponding to each pixel point according to the color channel value of each pixel in the grayscale-processed interface image, to obtain the pixel data.
Further, the step of superposing the transparent channel on the display interface to obtain the region to be processed includes the steps of:
Acquiring a grid object corresponding to the display interface, and generating a corresponding patch model according to the grid object;
And superposing a transparent channel corresponding to the identification information at the patch model, and extracting a region to be processed of the corresponding patch model through the transparent channel.
Further, the step of obtaining the pixel data of the to-be-processed area and calculating the pixel adjustment parameter according to the control pixel coefficient and the pixel data includes the steps of:
obtaining a target pixel value corresponding to each pixel point in the region to be processed, wherein the target pixel value comprises a transparent pixel and a brightness pixel;
and carrying out product operation on the control pixel coefficient and the transparent pixel and/or the brightness pixel to obtain the pixel adjusting parameter.
Further, the multiplying the control pixel coefficient with the transparent pixel and/or the brightness pixel includes the steps of:
acquiring, when the display interface slides, a distance value of each pixel point in the region to be processed relative to a preset origin in a preset moving direction;
And obtaining a corresponding control pixel coefficient according to each distance value, and carrying out product operation on the control pixel coefficient of each pixel point and the corresponding transparent pixel and/or brightness pixel.
The invention also provides a terminal device, comprising:
the image processing module is used for superposing the transparent channel on the display interface to obtain a region to be processed;
The extraction operation module is used for acquiring pixel data of the region to be processed and calculating pixel adjustment parameters according to the control pixel coefficients and the pixel data;
And the image display module is used for adjusting the pixel point state of the region to be processed according to the pixel adjusting parameters so as to finish the pixel state switching display of the display interface.
Further, the image processing module includes:
the surface patch generating unit is used for acquiring a grid object corresponding to the display interface and generating a corresponding surface patch model according to the grid object;
And the image processing unit is used for superposing transparent channels corresponding to the identification information at the patch model and extracting the area to be processed of the corresponding patch model through the transparent channels.
Further, the extraction operation module includes:
the pixel extraction unit is used for obtaining target pixel values corresponding to all pixel points in the region to be processed; the target pixel value includes a transparent pixel and a brightness pixel;
and the operation unit is used for carrying out product operation on the control pixel coefficient and the transparent pixel and/or the brightness pixel to obtain the pixel adjustment parameter.
Further, the operation unit includes:
The distance acquisition subunit is used for acquiring, when the display interface slides, a distance value of each pixel point in the region to be processed relative to a preset origin in a preset moving direction;
and the product operation subunit is used for acquiring a corresponding control pixel coefficient according to each distance value, and performing product operation on the control pixel coefficient of each pixel point and the corresponding transparent pixel and/or brightness pixel to obtain a pixel adjustment parameter.
The present invention also provides a storage medium having at least one instruction stored therein, the instruction being loaded and executed by a processor to implement operations performed by the image rendering control method.
The image rendering control method, the terminal equipment and the storage medium provided by the invention can render the display interface in the different areas, and the rendering display mode is more flexible and efficient.
Drawings
The above features, technical features, advantages and implementation manners of an image rendering control method, a terminal device and a storage medium will be further described in a clear and understandable manner with reference to the accompanying drawings.
FIG. 1 is a flow chart of one embodiment of an image rendering control method of the present invention;
FIG. 2 is a flow chart of another embodiment of an image rendering control method of the present invention;
FIG. 3 is a flow chart of another embodiment of an image rendering control method of the present invention;
FIG. 4 is a flow chart of another embodiment of an image rendering control method of the present invention;
FIG. 5 is a schematic diagram of the present invention for image transparency adjustment using the image rendering control method of FIG. 4;
FIG. 6 is a flow chart of another embodiment of an image rendering control method of the present invention;
FIG. 7 is a schematic diagram of the present invention for switching adjustment of image transparency in three-dimensional space using the image rendering control method of FIG. 6;
fig. 8 is a schematic structural view of an embodiment of a terminal device of the present invention.
Detailed Description
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the following description will explain the specific embodiments of the present invention with reference to the accompanying drawings. It is evident that the drawings in the following description are only examples of the invention, from which other drawings and other embodiments can be obtained by a person skilled in the art without inventive effort.
For simplicity, only the parts relevant to the present invention are shown schematically in the figures; they do not represent the actual structure of the product. In addition, to simplify the drawings for ease of understanding, where several components in a figure have the same structure or function, only one of them is drawn or labeled. Herein, "a" covers not only the case of exactly one but also the case of more than one.
In one embodiment of the present invention, as shown in fig. 1, an image rendering control method includes:
S110, overlapping transparent channels on a display interface to obtain a region to be processed;
Specifically, by editing the transparent channel, colors can be added or deleted, and the mask color and transparency can be set; a transparent pixel value generally lies between 0 and 1. The transparent channel is a grayscale channel that defines transparent, opaque and translucent areas in 256 gray levels, where a transparent pixel value of 0 indicates fully transparent and a transparent pixel value of 1 indicates opaque. The region to be processed, i.e. the mask region, is obtained by superimposing the transparent channel on the display interface to perform image matting.
Illustratively, a pixel point in the region to be processed is represented as (r, g, b, a, l), where r, g and b are the red, green and blue channel values of the pixel point, a is the transparent channel value (i.e., the transparent pixel), and l is the brightness value (i.e., the brightness pixel). Each of r, g, b, a and l ranges over 0 to 255 or 0 to 1, where the 0-to-1 range is obtained from the 0-to-255 range by dividing by 255. In the following embodiments, r, g, b, a and l all take values in 0 to 1, and the control pixel coefficient likewise takes values in 0 to 1. Illustratively, a transparent channel value of 1 is opaque, a transparent channel value of 0 is fully transparent, and a value strictly between 0 and 1 is translucent.
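As a minimal illustration (not taken from the patent; the function name and rounding are assumptions), the 0-to-255 to 0-to-1 conversion described above amounts to dividing each channel by 255:

```python
# Sketch: normalize a (r, g, b, a, l) pixel from the 0-255 range to the
# 0-1 range used in the embodiments, by dividing each channel by 255.
def normalize_pixel(r, g, b, a, l):
    return tuple(round(v / 255.0, 4) for v in (r, g, b, a, l))

# A fully opaque (a = 1.0) pixel:
print(normalize_pixel(255, 128, 0, 255, 191))  # -> (1.0, 0.502, 0.0, 1.0, 0.749)
```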
S120, acquiring pixel data of a region to be processed, and calculating pixel adjustment parameters according to the control pixel coefficients and the pixel data;
Specifically, the pixel data includes the RGB values together with the corresponding transparent pixels and brightness pixels; after the region to be processed is obtained, the terminal device can read the RGB value, transparency value and brightness value of each pixel point. Control pixel coefficients input by the user are then acquired, including a transparent pixel coefficient, a brightness pixel coefficient, and red, green and blue pixel coefficients. After the pixel data of the region to be processed is obtained, each type of pixel value (RGB value, transparency value and brightness value) is paired one-to-one with the control pixel coefficient of the corresponding type to obtain the corresponding pixel adjustment parameter. The pixel adjustment parameters include any one or more of a transparency adjustment parameter, a brightness adjustment parameter, and a chromaticity adjustment parameter (e.g., a red adjustment parameter corresponding to the red pixel coefficient).
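The one-to-one pairing of control pixel coefficients with pixel values of the same type can be sketched as follows (a simplified illustration; the dict-based data model and function name are assumptions, not the patent's implementation):

```python
# Sketch: each supplied control pixel coefficient multiplies the pixel value
# of the same channel type; channel types without a coefficient are unchanged.
def pixel_adjustment(pixel, coeffs):
    return {k: round(v * coeffs[k], 4) if k in coeffs else v
            for k, v in pixel.items()}

px = {"r": 0.8, "g": 0.4, "b": 0.2, "a": 1.0, "l": 0.6}
# Only a transparency control pixel coefficient is supplied:
print(pixel_adjustment(px, {"a": 0.5}))  # only 'a' is halved
```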
S130, adjusting the pixel point state of the area to be processed according to the pixel adjusting parameters so as to finish the pixel state switching display of the display interface.
Specifically, after the operation results are obtained in the above manner, the pixel points of the region to be processed are adjusted accordingly, so that the pixel data of each pixel point changes, completing the pixel-state switching display of the display interface.
In this embodiment, the region to be processed is extracted through the transparent channel, and each pixel point of the region to be processed is rendered according to the result of the operation between the control pixel coefficient and the pixel data, so that the display interface is rendered by region and the rendering display mode is more flexible and efficient.
In one embodiment of the present invention, as shown in fig. 2, an image rendering control method includes:
S210, overlapping transparent channels on a display interface to obtain a region to be processed;
S220, if the interface image corresponding to the display interface does not include transparent pixels, performing grayscale processing on the interface image;
Specifically, images in PNG, BMP, PSD or TGA format may each include transparent pixels, i.e., a color space comprising RGBA data (red color channel value, green color channel value, blue color channel value, and transparency information Alpha). For example, a PNG-format image is composed of a plurality of pixels, each pixel corresponding to one RGBA datum. Images in GIF and JPEG formats, by contrast, include only RGB data (red, green and blue color channel values) and contain no transparent pixels.
S230, obtaining a target pixel value corresponding to each pixel point according to the color channel value of each pixel in the grayscale-processed interface image, to obtain the pixel data;
Specifically, grayscale processing is applied to interface images in formats such as GIF and JPEG; grayscale conversion is prior art and is not described here. Then, according to the color channel value of each pixel in the grayscale-processed display interface, a preset conversion table is queried, using a pixel depth-of-field extraction algorithm and image processing software such as Photoshop, to obtain the transparent pixel corresponding to each pixel point. In addition, the brightness pixel corresponding to each pixel point can be calculated by substituting the color channel values into a preset formula.
The brightness pixel is calculated as: L = R × 0.30 + G × 0.59 + B × 0.11, where R is the red color channel value, G is the green color channel value, and B is the blue color channel value.
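The brightness formula above uses the familiar Rec. 601 luma weights; a direct transcription (the function name and rounding are assumptions for this sketch):

```python
# Sketch: brightness pixel L from normalized (0-1) color channel values,
# L = R*0.30 + G*0.59 + B*0.11, rounded to suppress floating-point noise.
def brightness_pixel(r, g, b):
    return round(r * 0.30 + g * 0.59 + b * 0.11, 6)

print(brightness_pixel(1.0, 1.0, 1.0))  # pure white -> 1.0
print(brightness_pixel(0.0, 0.0, 0.0))  # pure black -> 0.0
```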
S240, calculating pixel adjustment parameters according to the control pixel coefficients and the pixel data;
S250, adjusting the pixel point state of the area to be processed according to the pixel adjusting parameters so as to finish the pixel state switching display of the display interface.
In this embodiment, the region to be processed is extracted through the transparent channel, and each pixel point of the region to be processed is rendered according to the result of the operation between the control pixel coefficient and the pixel data, so that the display interface is rendered by region and the rendering display mode is more flexible and efficient. In addition, because grayscale processing is performed on the interface image, the method is applicable to regional rendering of interface images in any format, giving higher compatibility and practicality.
In one embodiment of the present invention, as shown in fig. 3, an image rendering control method includes:
s310, acquiring a grid object corresponding to the display interface, and generating a corresponding patch model according to the grid object;
Specifically, all independent interface contents of the display interface are obtained to generate corresponding grid objects; the grid vertices, their order, orientations and vertex normals corresponding to the display interface are screened and determined using image processing software such as Unity 3D or Photoshop, and a plurality of patch models corresponding to the display interface are thereby created. A grid object is constructed in three-dimensional space for each interface content according to the obtained three-dimensional coordinates of the center point of the display interface, and triangular patches are constructed by connecting three vertices of each interface content, thereby creating the patch models corresponding to the display interface. Illustratively, after grayscale processing of the interface image corresponding to the display interface, the processed image is loaded into image processing software (for example Blender, Unity 3D or Photoshop), which subdivides it into a plurality of grids to create a plurality of patch models corresponding to the display interface; each patch model has its own grid coordinate range, and a corresponding identity ID is generated for each patch model.
S320, overlapping transparent channels corresponding to the identification information at the patch model, and extracting a region to be processed of the corresponding patch model through the transparent channels;
Specifically, each transparent channel carries identification information. The target area coordinates at which the transparent channel is to be superimposed are obtained, and the matching target grid coordinate range is found by querying with those coordinates, thereby locating the target patch model; alternatively, the identity ID of the target patch model on which the transparent channel is to be superimposed is obtained directly. Once the target patch model is located in this way, the transparent channel corresponding to the identification information is superimposed at its position, so that transparent channels can be superimposed at the positions of the respective patch models as required, and the corresponding regions to be processed are obtained from the transparent channels superimposed on the patch models.
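The lookup described above can be sketched as follows (an assumed data model for illustration only; the class, field names and patch IDs are not from the patent): a patch model is found either by its identity ID or by matching the target-area coordinates against each patch model's grid coordinate range.

```python
# Sketch: locate the target patch model by ID or by coordinate containment.
from dataclasses import dataclass

@dataclass
class PatchModel:
    patch_id: str
    x_range: tuple  # (x_min, x_max) grid coordinate range
    y_range: tuple  # (y_min, y_max) grid coordinate range

def find_target_patch(patches, point=None, patch_id=None):
    for p in patches:
        if patch_id is not None and p.patch_id == patch_id:
            return p
        if point is not None:
            x, y = point
            if p.x_range[0] <= x <= p.x_range[1] and p.y_range[0] <= y <= p.y_range[1]:
                return p
    return None  # no patch model matches

patches = [PatchModel("menu", (0, 10), (0, 5)), PatchModel("icon", (10, 20), (0, 5))]
print(find_target_patch(patches, point=(12, 3)).patch_id)  # -> icon
```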
S330, obtaining target pixel values corresponding to all pixel points in the region to be processed, wherein the target pixel values comprise transparent pixels and brightness pixels;
s340, carrying out product operation on the control pixel coefficient and the transparent pixel and/or the brightness pixel to obtain a pixel adjustment parameter;
S350, adjusting the pixel point state of the area to be processed according to the pixel adjusting parameters so as to finish the pixel state switching display of the display interface.
In this embodiment, corresponding patch models are generated from the grid objects of the display interface, and transparent channels corresponding to identification information are superimposed on a plurality of patch models as required, so that different regions to be processed can be extracted, and different regions on the same display interface can be rendered separately according to the control pixel coefficients corresponding to the different patch models, realizing regional rendering. In addition, each pixel point within the same region to be processed is rendered according to the result of the operation between the control pixel coefficient and the pixel data, further rendering the display interface by region; the rendering display mode is flexible and efficient.
In one embodiment of the present invention, as shown in fig. 4, an image rendering control method includes:
S410, acquiring a grid object corresponding to the display interface, and generating a corresponding patch model according to the grid object;
s420, overlapping transparent channels corresponding to the identification information at the patch model, and extracting a to-be-processed area corresponding to the patch model through the transparent channels;
S430, if the interface image corresponding to the display interface does not include transparent pixels, performing grayscale processing on the interface image;
S440, obtaining a target pixel value corresponding to each pixel point according to the color channel value of each pixel in the grayscale-processed interface image, to obtain the pixel data;
s450, obtaining target pixel values corresponding to all pixel points in the region to be processed, wherein the target pixel values comprise transparent pixels and brightness pixels;
S460, carrying out product operation on the control pixel coefficient and the transparent pixel and/or the brightness pixel to obtain a pixel adjusting parameter;
s470, adjusting the pixel point state of the area to be processed according to the pixel adjusting parameters so as to finish the pixel state switching display of the display interface.
Specifically, when only the transparent pixel of each pixel point in the region to be processed has been obtained through the above embodiment, then even if both the transparent control pixel coefficient and the brightness control pixel coefficient are acquired, only the transparent pixel is available; the product operation is therefore performed only between the transparent pixel of each pixel point and the transparent control pixel coefficient, thereby adjusting the transparency of each pixel point in the region to be processed.
Similarly, when only the brightness pixel of each pixel point in the region to be processed has been obtained, then even if both the transparent control pixel coefficient and the brightness control pixel coefficient are acquired, the product operation is performed only between the brightness pixel of each pixel point and the brightness control pixel coefficient, thereby adjusting the brightness (also referred to as luminance) of each pixel point in the region to be processed.
In addition, when both the transparent pixel and the brightness pixel of each pixel point in the region to be processed have been obtained through the above embodiment, if only the transparent control pixel coefficient is acquired, the product operation is performed only between the transparent pixel of each pixel point and the transparent control pixel coefficient, adjusting the transparency alone. Similarly, when only the brightness control pixel coefficient is acquired, only the brightness of each pixel point in the region to be processed is adjusted.
In an exemplary case where the transparent pixels of a plurality of pixel points are input at the same time, the product is computed between each pixel point's transparent pixel and the acquired transparent control pixel coefficient. In this way, a surface that is already partially transparent does not have its original transparency information overwritten when transparency is readjusted, and transparency can be adjusted per region as required.
Referring to fig. 5 for a detailed description: an original image (i.e., a two-dimensional image of the interface image of the display interface of the present invention) containing transparent pixels and color pixels (RGB pixels) is input. The transparent pixels of the original image are extracted and a transparent channel is superimposed on the original image; unwanted parts of the original image are removed by combining with the transparent channel. As shown in fig. 5, the transparent channel map is the physical representation of the transparent channel: black represents removal, yielding the non-transparentized area, and white represents retention, yielding the region to be processed. A transparency adjustment value (i.e., the transparency control pixel coefficient of the present invention) is then superimposed on the region to be processed, since the transparent channel has already divided the original image into a portion to be displayed (the region to be processed) and a portion not to be displayed (the non-transparentized area). Assume the transparent pixels corresponding to the petals of the region to be processed are 0.5, the transparent pixels corresponding to the pistil are 1, and a transparency adjustment value of 0.5 is applied. Superimposing the adjustment value on the pixels of the non-transparentized area has no effect, since 0 times any value is 0, while the region to be processed is affected by the adjustment value: the petal pixels' transparent pixel 0.5 multiplied by 0.5 gives 0.25, and the pistil pixels' transparent pixel 1 multiplied by 0.5 gives 0.5. The originally opaque pistil thus becomes semitransparent, and the already semitransparent petals become more transparent, producing the per-region transparency adjustment effect shown at the far right of fig. 5.
Based on this multiplicative superposition idea, the transparency of different areas to be displayed can be adjusted via multiple transparent channels using the product results.
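The worked example of fig. 5 can be reproduced numerically (a minimal sketch; the function name and list layout are assumptions): the transparency adjustment value multiplies each alpha in the masked region, while the masked-out region (alpha 0) is unaffected because 0 times anything is 0.

```python
# Sketch: apply a transparency adjustment value (control pixel coefficient)
# to a list of transparent pixels by per-pixel multiplication.
def apply_transparency(alphas, adjust):
    return [round(a * adjust, 4) for a in alphas]

# background (masked out), petals (semitransparent), pistil (opaque):
print(apply_transparency([0.0, 0.5, 1.0], 0.5))  # -> [0.0, 0.25, 0.5]
```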
In this embodiment, the region to be processed is extracted through the transparent channel, and transparency or brightness rendering is performed on each pixel point of the region to be processed according to the result of the operation between the control pixel coefficient and the pixel data, so that the display interface is divided into regions and rendered synchronously; the rendering display mode is more flexible and efficient.
In one embodiment of the present invention, as shown in fig. 6, an image rendering control method includes:
S510, acquiring a grid object corresponding to the display interface, and generating a corresponding patch model according to the grid object;
s520, overlapping transparent channels corresponding to the identification information at the patch model, and extracting a region to be processed of the corresponding patch model through the transparent channels;
S530, if the interface image corresponding to the display interface does not include transparent pixels, performing grayscale processing on the interface image;
S540, obtaining a target pixel value corresponding to each pixel point according to the color channel value of each pixel in the grayscale-processed interface image, to obtain the pixel data;
s550, obtaining target pixel values corresponding to all pixel points in the region to be processed, wherein the target pixel values comprise transparent pixels and brightness pixels;
S560, acquiring, when the display interface slides, a distance value of each pixel point in the region to be processed relative to a preset origin in a preset moving direction;
S570, obtaining a corresponding control pixel coefficient according to each distance value, and performing a product operation between the control pixel coefficient of each pixel point and the corresponding transparent pixel and/or brightness pixel to obtain pixel adjustment parameters;
Specifically, when the user slides the display interface, a preset origin is selected in advance according to requirements and habitual preferences, and a plane coordinate system is established. As the display interface slides, the coordinate value of each pixel point in the plane coordinate system changes, so the distance value (an absolute value) of each pixel point relative to the preset origin changes. The distance value of each pixel point in the region to be processed relative to the preset origin along the preset moving direction is therefore obtained, and a preset control coefficient change table is set, containing the correspondence between distance values and transparent control pixel coefficients, and between distance values and brightness control pixel coefficients. The corresponding control pixel coefficient can thus be obtained from each distance value, and the product operation performed between the control pixel coefficient of each pixel point and the corresponding transparent pixel and/or brightness pixel.
Of course, a transparency change curve (or brightness change curve) may be preset instead, in which the transparency control pixel coefficient (or brightness control pixel coefficient) varies nonlinearly and monotonically increases (or monotonically decreases) with the distance value. The curve may also be chosen, according to demand, to increase (or decrease) nonlinearly with the distance value in some other way. In short, it suffices that the closer a pixel point moves to the vicinity of the preset origin, the lower its transparency level and the higher its brightness level.
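One possible realization of such a curve can be sketched as follows (the quadratic shape, function names and maximum distance are assumptions, not the patent's curve): the coefficient decreases monotonically and nonlinearly with distance, so a pixel point near the preset origin keeps a high alpha, i.e., low transparency.

```python
# Sketch: map a pixel point's distance from the preset origin to a
# transparency control pixel coefficient, then apply it to the alpha.
def transparency_coefficient(distance, max_distance):
    """Nonlinear, monotonically decreasing with distance, clamped to [0, 1];
    high near the origin, so the pixel there stays opaque (low transparency)."""
    t = min(abs(distance) / max_distance, 1.0)
    return 1.0 - t * t

def adjusted_alpha(alpha, distance, max_distance=100.0):
    return round(alpha * transparency_coefficient(distance, max_distance), 4)

print(adjusted_alpha(1.0, 0.0))    # at the origin -> 1.0 (opaque)
print(adjusted_alpha(1.0, 50.0))   # halfway out -> 0.75
print(adjusted_alpha(1.0, 100.0))  # at the edge -> 0.0 (fully transparent)
```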
S580 adjusts the pixel point state of the region to be processed according to the pixel adjustment parameters to complete the pixel state switching display of the display interface.
Specifically, with the popularization of Virtual Reality (VR), VR application interaction scenarios are increasingly common, which inevitably requires development and innovation of the human-machine interface (HMI) used in VR, and various graphic interaction schemes that are easier to understand have been developed to meet this demand. In the three-dimensional space of a virtual reality scene, various types of menus and interface displays have become basic functions that every virtual reality application must provide. Virtual menus perform various dynamic effects, such as hiding/showing, transitions from opaque to fully transparent, position shifting, and color changing, which are applied in many scenes.
In the VR menu display process, it is desirable to achieve the following technical effect: a menu in three-dimensional space has high translucency in some regions and low translucency in others. Referring to fig. 7, which illustrates a dynamic effect during switching of the VR menu interface, the user is prompted to slide left or right to fully display a partially visible object by keeping parts of the objects on both sides of the main picture or logo invisible.
In the VR menu switching process, if the region of the left interface body (a3) that is in the semi-transparent state moves in the left-to-right direction toward the preset origin, the middle interface body (a2) moves in the left-to-right direction away from the preset origin. At this time, the semi-transparent region of the left interface body (a3) gradually becomes opaque, and the non-transparent region of the middle interface body (a2) gradually becomes transparent.
Similarly, if the region of the right interface body (a1) that is in the semi-transparent state moves in the right-to-left direction toward the preset origin, the middle interface body (a2) moves in the right-to-left direction away from the preset origin. At this time, the semi-transparent region of the right interface body (a1) gradually becomes opaque, and the non-transparent region of the middle interface body (a2) gradually becomes transparent.
Conversely, if the semi-transparent region of the left interface body (a3) moves in the right-to-left direction away from the preset origin, that region gradually becomes highly transparent or even fully transparent. Likewise, when the semi-transparent region of the right interface body (a1) moves in the left-to-right direction away from the preset origin, that region gradually becomes highly transparent or even fully transparent. The above is only an example with left and right interface bodies; the manner of adjusting upper and lower interface bodies by sliding up and down is analogous and is not described here in detail.
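The interface-body behaviour described above can be sketched as follows. The body names, the origin at x = 0, the `fade_radius` value, and the linear opacity ramp are illustrative assumptions, not the claimed implementation.

```python
# Sketch of the menu-switch dynamic: each interface body's region opacity is
# recomputed from its distance to the preset origin after a slide, so a body
# moving toward the origin turns opaque while one moving away turns highly
# or even fully transparent. fade_radius and the linear ramp are assumptions.

def region_opacity(x, fade_radius=300.0):
    """Opacity in [0, 1]: fully opaque at the preset origin (x == 0),
    fully transparent at or beyond fade_radius."""
    return max(0.0, 1.0 - abs(x) / fade_radius)

def slide(bodies, dx):
    """Shift every interface body by dx and refresh its region opacity."""
    for body in bodies:
        body["x"] += dx
        body["opacity"] = region_opacity(body["x"])
    return bodies

# Left (a3), middle (a2) and right (a1) interface bodies around the origin:
menu = [{"name": "a3", "x": -300.0}, {"name": "a2", "x": 0.0}, {"name": "a1", "x": 300.0}]
slide(menu, 300.0)  # left-to-right slide: a3 reaches the origin and turns opaque
```

After the slide, a3 sits at the origin with full opacity while a2 has moved a full fade radius away and is fully transparent, matching the gradual swap described above.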
By this method, the surface of a virtual application menu, or any other surface carrying a transparent picture, is acted on by a plurality of control pixel coefficients simultaneously, the final rendered surface being determined by their product, so that locally transparent and semi-transparent surfaces can be controlled through repeated application of control pixel coefficients. A surface of a three-dimensional object that has been cut out using a transparent channel map remains fully transparent under the control of that transparent channel map. The method is mainly applied to simple semi-transparent animation of user interface menus in three-dimensional space in VR devices, and enables region-by-region rendering of the user interface menu, so that translucency is high in some regions and low in others.
In this embodiment, the transparency or brightness of each region to be processed is adjusted according to the sliding track with which the user operates the interface body. In particular, in edge regions there is usually a gradual or abrupt change of transparency, and in cooperation with the user's sliding operation a smooth and natural linking animation can be presented on the screen of the terminal device, so that the user can view the menu content more conveniently. Different transparency can be set for each region to be processed, achieving abrupt or even gradual transparency effects and thus a sense of spatial movement. This enhances the user's visual perception, improves the interactive experience of menu switching on devices with VR/AR/MR display functions, provides a rich and vivid interface switching effect, and improves both visual quality and the overall user experience.
An embodiment of the present invention, as shown in fig. 8, provides a terminal device, including:
The image processing module 10 is used for superposing the transparent channel on the display interface to obtain a region to be processed;
the extraction operation module 20 is configured to obtain pixel data of a region to be processed, and calculate a pixel adjustment parameter according to the control pixel coefficient and the pixel data;
The image display module 30 is configured to adjust the pixel status of the area to be processed according to the pixel adjustment parameter, so as to complete the pixel status switching display of the display interface.
Specifically, the embodiment is an embodiment of a device corresponding to the embodiment of the method, and specific effects refer to the embodiment of the method, which is not described herein in detail.
Based on the foregoing embodiment, the image processing module 10 includes:
the surface patch generating unit is used for acquiring a grid object corresponding to the display interface and generating a corresponding surface patch model according to the grid object;
And the image processing unit is used for superposing transparent channels corresponding to the identification information at the patch model and extracting the to-be-processed area of the corresponding patch model through the transparent channels.
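As an illustrative sketch (not the claimed implementation) of what the image processing unit does, a transparent (alpha) channel can be superimposed on an RGB interface image and the region to be processed extracted from it. The use of NumPy, the function names, and the fully-opaque threshold are assumptions.

```python
# Hypothetical sketch: superimpose a transparent (alpha) channel on an RGB
# interface image, then extract the region to be processed as the set of
# pixels the alpha channel marks as not fully opaque.
import numpy as np

def superimpose_alpha(rgb, alpha):
    """Stack an H x W alpha channel onto an H x W x 3 RGB image -> H x W x 4 RGBA."""
    return np.dstack([rgb, alpha])

def region_to_process(rgba, opaque=255):
    """Boolean mask of pixels whose alpha value is below fully opaque."""
    return rgba[..., 3] < opaque

rgb = np.zeros((2, 2, 3), dtype=np.uint8)
alpha = np.array([[255, 128], [0, 255]], dtype=np.uint8)
mask = region_to_process(superimpose_alpha(rgb, alpha))
```

The resulting mask selects exactly the semi-transparent and transparent pixels, which are the ones whose state the later product operation would adjust.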
Specifically, the embodiment is an embodiment of a device corresponding to the embodiment of the method, and specific effects refer to the embodiment of the method, which is not described herein in detail.
Based on the foregoing embodiment, the extraction operation module 20 includes:
The pixel extraction unit is used for obtaining target pixel values corresponding to all pixel points in the region to be processed; the target pixel value includes a transparent pixel and a brightness pixel;
and the operation unit is used for carrying out product operation on the control pixel coefficient and the transparent pixel and/or the brightness pixel.
Specifically, the embodiment is an embodiment of a device corresponding to the embodiment of the method, and specific effects refer to the embodiment of the method, which is not described herein in detail.
Based on the foregoing embodiments, the operation unit includes:
the distance acquisition subunit is used for acquiring a distance value of each pixel point at the region to be processed relative to a preset origin in a preset moving direction when the display interface slides;
And the product operation subunit is used for acquiring a corresponding control pixel coefficient according to each distance value, and carrying out product operation on the control pixel coefficient of each pixel point and the corresponding transparent pixel and/or brightness pixel.
Specifically, the embodiment is an embodiment of a device corresponding to the embodiment of the method, and specific effects refer to the embodiment of the method, which is not described herein in detail.
It will be apparent to those skilled in the art that the division into the program modules described above is merely illustrative and adopted for convenience and brevity; in practical applications, the above functions may be allocated to different program modules as needed, i.e., the internal structure of the apparatus may be divided into different program units or modules to perform all or part of the functions described above. The program modules in the embodiments may be integrated in one processing unit, each unit may exist alone physically, or two or more units may be integrated in one processing unit, and the integrated units may be implemented in the form of hardware or in the form of a software program unit. In addition, the specific names of the program modules are only for distinguishing them from each other and are not used to limit the protection scope of the present application.
An embodiment of the invention provides a terminal device including a processor and a memory, wherein the memory is used for storing a computer program, and the processor is used for executing the computer program stored in the memory to implement the image rendering control method in the above method embodiment.
The terminal device may be a desktop computer, notebook, palmtop computer, tablet computer, mobile phone, human-machine interaction screen, or similar device. The terminal device may include, but is not limited to, a processor and a memory. It will be appreciated by those skilled in the art that the foregoing is merely an example of a terminal device and does not limit it; the terminal device may include more or fewer components than shown, may combine certain components, or may use different components. For example, the terminal device may also include input/output interfaces, display devices, network access devices, communication buses, communication interfaces, and the like, wherein the processor, the memory, the input/output interface, and the communication interface communicate with each other through the communication bus. The memory stores a computer program, and the processor is configured to execute the computer program stored in the memory to implement the image rendering control method in the above method embodiment.
The processor may be a central processing unit (CPU), but may also be another general purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory may be an internal storage unit of the terminal device, for example, a hard disk or internal memory of the terminal device. The memory may also be an external storage device of the terminal device, for example, a plug-in hard disk, smart media card (SMC), secure digital (SD) card, or flash card provided on the terminal device. Further, the memory may include both an internal storage unit and an external storage device of the terminal device. The memory is used for storing the computer program and other programs and data required by the terminal device, and may also be used to temporarily store data that has been output or is to be output.
A communication bus is a circuit that connects the elements described and enables transmission between them. For example, the processor receives commands from other elements through the communication bus, decrypts the received commands, and performs calculations or data processing based on the decrypted commands. The memory may include program modules such as a kernel, middleware, application programming interfaces (APIs), and applications. The program modules may be composed of software, firmware, or hardware, or at least two of them. The input/output interface forwards commands or data entered by the user through an input/output device (e.g., a sensor, keyboard, or touch screen). The communication interface connects the terminal device with other network devices, user devices, and networks. For example, the communication interface may be connected to a network by wire or wirelessly to connect to other external network devices or user devices. The wireless communication may include at least one of: wireless fidelity (WiFi), Bluetooth (BT), near field communication (NFC), Global Positioning System (GPS), cellular communication, and the like. The wired communication may include at least one of: Universal Serial Bus (USB), High Definition Multimedia Interface (HDMI), the RS-232 asynchronous transfer standard interface, and the like. The network may be a telecommunications network or a communication network; the communication network may be a computer network, the Internet of Things, or a telephone network. The terminal device may be connected to the network through the communication interface, and the protocols used by the terminal device to communicate with other network devices may be supported by at least one of an application, an application programming interface (API), middleware, a kernel, and the communication interface.
In one embodiment of the present invention, a storage medium stores at least one instruction, where the instruction is loaded and executed by a processor to implement the operations performed in the corresponding embodiment of the image rendering control method. For example, the computer readable storage medium may be a read-only memory (ROM), random access memory (RAM), compact disc read-only memory (CD-ROM), magnetic tape, floppy disk, optical data storage device, or the like.
The modules or steps above may be implemented in program code executable by a computing device, so that they may be stored in a memory device and executed by the computing device; alternatively, they may be separately fabricated as individual integrated circuit modules, or a plurality of the modules or steps may be fabricated as a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
In the foregoing embodiments, the descriptions of the embodiments are focused on, and the parts of a certain embodiment that are not described or depicted in detail may be referred to in the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other manners. For example, the apparatus/terminal device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical function division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on this understanding, the present invention may implement all or part of the flow of the method of the above embodiments through a computer program instructing related hardware; the computer program may be stored in a computer readable storage medium, and when executed by a processor, implements the steps of each of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form. The computer readable storage medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. It should be noted that the content of the computer readable storage medium may be appropriately increased or decreased according to the requirements of legislation and patent practice in a jurisdiction; for example, in some jurisdictions, in accordance with legislation and patent practice, computer readable media do not include electrical carrier signals and telecommunications signals.
It should be noted that the above embodiments can be freely combined as needed. The foregoing is merely a preferred embodiment of the present invention, and it should be noted that those skilled in the art may make several modifications and improvements without departing from the principles of the present invention; such modifications and improvements are also to be regarded as falling within the protection scope of the present invention.

Claims (6)

1. An image rendering control method, characterized by comprising the steps of:
Superposing a transparent channel on a display interface to obtain a region to be processed;
Acquiring pixel data of the region to be processed, and calculating pixel adjustment parameters according to the control pixel coefficient and the pixel data, wherein the method specifically comprises the following steps: obtaining a target pixel value corresponding to each pixel point in the region to be processed, wherein the target pixel value comprises a transparent pixel and a brightness pixel; performing product operation on the control pixel coefficient and the transparent pixel and/or the brightness pixel to obtain a pixel adjusting parameter;
The step of performing product operation on the control pixel coefficient and the transparent pixel and/or the brightness pixel to obtain a pixel adjustment parameter specifically includes: acquiring a distance value of each pixel point at the to-be-processed area relative to a preset origin in a preset moving direction when a display interface slides; obtaining a corresponding control pixel coefficient according to each distance value, and carrying out product operation on the control pixel coefficient of each pixel point and a corresponding transparent pixel and/or brightness pixel to obtain a pixel adjustment parameter;
And adjusting the pixel point state of the region to be processed according to the pixel adjusting parameters so as to finish the pixel state switching display of the display interface.
2. The image rendering control method according to claim 1, wherein the acquiring the pixel data of the region to be processed includes the steps of:
If the interface image corresponding to the display interface does not comprise transparent pixels, carrying out gray processing on the interface image;
and obtaining a target pixel value corresponding to each pixel point according to the color channel value of each pixel in the interface image after gray processing to obtain the pixel data.
3. The image rendering control method according to claim 1 or 2, wherein the step of superimposing the transparent channel on the display interface to obtain the region to be processed includes the steps of:
Acquiring a grid object corresponding to the display interface, and generating a corresponding patch model according to the grid object;
And superposing a transparent channel corresponding to the identification information at the patch model, and extracting a region to be processed of the corresponding patch model through the transparent channel.
4. A terminal device, comprising:
the image processing module is used for superposing the transparent channel on the display interface to obtain a region to be processed;
the extraction operation module is used for acquiring pixel data of the area to be processed and calculating pixel adjustment parameters according to the control pixel coefficient and the pixel data, wherein the extraction operation module specifically comprises:
the pixel extraction unit is used for obtaining target pixel values corresponding to all pixel points in the region to be processed; the target pixel value includes a transparent pixel and a brightness pixel;
The operation unit is used for carrying out product operation on the control pixel coefficient and the transparent pixel and/or the brightness pixel to obtain a pixel adjustment parameter, wherein the operation unit comprises:
The distance acquisition subunit is used for acquiring a distance value of each pixel point at the to-be-processed area relative to a preset origin in a preset moving direction when the display interface slides;
The product operation subunit is used for acquiring a corresponding control pixel coefficient according to each distance value, and performing product operation on the control pixel coefficient of each pixel point and the corresponding transparent pixel and/or brightness pixel to obtain a pixel adjustment parameter;
And the image display module is used for adjusting the pixel point state of the region to be processed according to the pixel adjusting parameters so as to finish the pixel state switching display of the display interface.
5. The terminal device of claim 4, wherein the image processing module comprises:
the surface patch generating unit is used for acquiring a grid object corresponding to the display interface and generating a corresponding surface patch model according to the grid object;
And the image processing unit is used for superposing transparent channels corresponding to the identification information at the patch model and extracting the area to be processed of the corresponding patch model through the transparent channels.
6. A storage medium having stored therein at least one instruction loaded and executed by a processor to implement the operations performed by the image rendering control method of any one of claims 1 to 3.
CN202010301299.XA 2020-04-16 2020-04-16 Image rendering control method, terminal equipment and storage medium Active CN111489429B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010301299.XA CN111489429B (en) 2020-04-16 2020-04-16 Image rendering control method, terminal equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111489429A CN111489429A (en) 2020-08-04
CN111489429B true CN111489429B (en) 2024-06-07

Family

ID=71794831

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010301299.XA Active CN111489429B (en) 2020-04-16 2020-04-16 Image rendering control method, terminal equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111489429B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112017110A (en) * 2020-08-26 2020-12-01 广州拓想家科技有限公司 Image multi-face perspective adjustment method and system
CN112153408B (en) * 2020-09-28 2022-07-08 广州虎牙科技有限公司 Live broadcast rendering method and device, electronic equipment and storage medium
CN112767238A (en) * 2020-12-31 2021-05-07 北京字跳网络技术有限公司 Image processing method, image processing device, electronic equipment and storage medium
CN113377476B (en) * 2021-06-23 2023-07-25 北京百度网讯科技有限公司 Interface display method, related device and computer program product
CN114003163B (en) * 2021-10-27 2023-10-24 腾讯科技(深圳)有限公司 Image processing method and device, storage medium and electronic equipment

Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0962852A (en) * 1995-08-30 1997-03-07 Kubota Corp Graphic processor
JP2000339488A (en) * 1999-05-25 2000-12-08 Sega Enterp Ltd Picture processor
JP2001101441A (en) * 1999-09-28 2001-04-13 Square Co Ltd Method and device for rendering, game system and computer readable recording medium storing program for rendering three-dimensional model
CN101078982A (en) * 2006-05-24 2007-11-28 北京壁虎科技有限公司 Screen display method based on drawing engine
CN101529495A (en) * 2006-09-19 2009-09-09 奥多比公司 Image mask generation
CN102393970A (en) * 2011-12-13 2012-03-28 北京航空航天大学 Object three-dimensional modeling and rendering system as well as generation and rendering methods of three-dimensional model
CN103618886A (en) * 2013-12-13 2014-03-05 厦门美图网科技有限公司 Shooting method for intelligently decoloring according to main color tone
CN108205998A (en) * 2015-12-21 2018-06-26 联发科技股份有限公司 The controller and corresponding control methods of transparence display
CN108295467A (en) * 2018-02-06 2018-07-20 网易(杭州)网络有限公司 Rendering method, device and the storage medium of image, processor and terminal
CN108475330A (en) * 2015-11-09 2018-08-31 港大科桥有限公司 Auxiliary data for there is the View synthesis of pseudomorphism perception
CN109272565A (en) * 2017-07-18 2019-01-25 腾讯科技(深圳)有限公司 Animation playing method, device, storage medium and terminal
CN109859303A (en) * 2019-01-16 2019-06-07 网易(杭州)网络有限公司 Rendering method, device, terminal device and the readable storage medium storing program for executing of image
CN110163831A (en) * 2019-04-19 2019-08-23 深圳市思为软件技术有限公司 The object Dynamic Display method, apparatus and terminal device of three-dimensional sand table
CN110503704A (en) * 2019-08-27 2019-11-26 北京迈格威科技有限公司 Building method, device and the electronic equipment of three components
CN110503599A (en) * 2019-08-16 2019-11-26 珠海天燕科技有限公司 Image processing method and device
CN110796721A (en) * 2019-10-31 2020-02-14 北京字节跳动网络技术有限公司 Color rendering method and device of virtual image, terminal and storage medium
CN110796725A (en) * 2019-08-28 2020-02-14 腾讯科技(深圳)有限公司 Data rendering method, device, terminal and storage medium
CN111009026A (en) * 2019-12-24 2020-04-14 腾讯科技(深圳)有限公司 Object rendering method and device, storage medium and electronic device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7688319B2 (en) * 2005-11-09 2010-03-30 Adobe Systems, Incorporated Method and apparatus for rendering semi-transparent surfaces
JP6494249B2 (en) * 2014-11-12 2019-04-03 キヤノン株式会社 Image forming apparatus, image forming method, and program

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Gan Lan et al., Tumor Cell Image Recognition, Xi'an Jiaotong University Press, 2019, pp. 11-12. *

Also Published As

Publication number Publication date
CN111489429A (en) 2020-08-04


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant