CN114419235A - Real-time plane reflection rendering method and device for mobile terminal - Google Patents

Real-time plane reflection rendering method and device for mobile terminal

Info

Publication number
CN114419235A
Authority
CN
China
Prior art keywords
reflection
screen space
texture
target object
screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210041160.5A
Other languages
Chinese (zh)
Inventor
扈红柯
李建良
郭子文
何雨泉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Yunyou Interactive Network Technology Co ltd
Online Tuyoo Beijing Technology Co ltd
Original Assignee
Beijing Yunyou Interactive Network Technology Co ltd
Online Tuyoo Beijing Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Yunyou Interactive Network Technology Co ltd and Online Tuyoo Beijing Technology Co ltd
Priority to CN202210041160.5A
Publication of CN114419235A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/04 Texture mapping
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/50 Lighting effects

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)

Abstract

The method calculates, from the screen space coordinates of a target to be reflected, the reflected screen space coordinates, converts the reflected screen space coordinates into UV texture coordinates, and uses the two sets of coordinates to generate a reflection UV texture. In the screen post-processing stage, the reflection UV texture is used to sample the normally rendered texture, thereby obtaining the reflection effect. The reflection effect achieved by the method is realistic and the amount of computation is small, which makes the method particularly suitable for the rendering requirements of the mobile terminal and reduces the pressure of real-time rendering of three-dimensional scenes on the mobile terminal.

Description

Real-time plane reflection rendering method and device for mobile terminal
Technical Field
The present application relates to the field of computer graphics rendering technologies, and in particular, to a real-time plane reflection rendering method and apparatus for a mobile terminal, a computing device, and a computer-readable storage medium.
Background
In a three-dimensional scene, important objects such as characters and buildings require a more realistic lighting effect, and in particular a planar reflection effect. In rendering, the prior art generally adopts one of three methods: reflection cube maps, camera-rendered reflection maps, and screen space reflection. Each of the three schemes has its own defects and is not well suited to real-time plane reflection rendering in three-dimensional applications on a mobile terminal. To fit the hardware resources of the mobile terminal while still achieving a good planar reflection effect, a faster, more efficient, and more flexible real-time plane reflection scheme is needed.
Disclosure of Invention
In view of the above, embodiments of the present application provide a real-time plane reflection rendering method and apparatus for a mobile end, a computing device, and a computer-readable storage medium, so as to solve technical defects in the prior art.
According to a first aspect of embodiments of the present application, there is provided a real-time plane reflection rendering method for a mobile terminal, including:
generating a screen space reflection coordinate of the target object according to the reflection plane and the normal direction;
generating screen space reflection UV textures according to the screen space reflection coordinates of the target object;
and generating a reflection effect of the target object according to the screen space reflection UV texture.
According to a second aspect of embodiments of the present application, there is provided a real-time plane reflection rendering apparatus for a mobile terminal, including:
the reflection coordinate generation module is used for generating screen space reflection coordinates of the target object according to the reflection plane and the normal direction;
the reflection UV texture generation module is used for generating screen space reflection UV textures according to the screen space reflection coordinates of the target object;
and the reflection effect generation module is used for generating the reflection effect of the target object according to the screen space reflection UV texture.
According to a third aspect of embodiments of the present application, there is provided a computing device comprising a memory, a processor and computer instructions stored on the memory and executable on the processor, the processor implementing the steps of the real-time plane reflection rendering method for a mobile terminal when executing the instructions.
According to a fourth aspect of embodiments of the present application, there is provided a computer-readable storage medium storing computer instructions which, when executed by a processor, implement the steps of the real-time planar reflection rendering method for a mobile terminal.
According to the embodiments of the present application, the reflection UV texture is obtained by acquiring and converting the reflection coordinates of the target object, the generation of the reflection effect is deferred to the screen post-processing stage by means of MRT, and the reflection UV texture stored in the MRT is used in the post-processing stage to sample the normally rendered texture to obtain the reflection rendering result. The reflection effect achieved in this way is realistic, and it is composited directly in the screen post-processing stage without additional computing resources, which clearly distinguishes the approach from the existing SSPR technique; it is particularly suited to the rendering requirements of the mobile terminal and relieves the hardware resource pressure of real-time three-dimensional scene rendering on the mobile terminal.
Drawings
FIG. 1 is a block diagram of a computing device provided by an embodiment of the present application;
FIG. 2 is a diagram of a planar reflection rendering effect in the prior art;
fig. 3 is a schematic flowchart illustrating a real-time plane reflection rendering method for a mobile terminal according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of how the target object reflection coordinates are calculated according to the embodiment of the present application;
fig. 5 is a schematic diagram illustrating an effect of plane reflection implemented by a real-time plane reflection rendering method for a mobile terminal according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of a real-time plane reflection rendering apparatus for a mobile terminal according to an embodiment of the present disclosure.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. The application can, however, be implemented in many ways other than those described herein, and those skilled in the art can make similar variations without departing from the spirit of the application; the application is therefore not limited to the specific implementations disclosed below.
The terminology used in the one or more embodiments of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the one or more embodiments of the present application. As used in one or more embodiments of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used in one or more embodiments of the present application refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It will be understood that, although the terms first, second, etc. may be used herein in one or more embodiments of the present application to describe various information, these information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, a first aspect may be termed a second aspect, and, similarly, a second aspect may be termed a first aspect, without departing from the scope of one or more embodiments of the present application. The word "if," as used herein, may be interpreted as "responsive to a determination," depending on the context.
In the present application, a real-time plane reflection rendering method and apparatus, a computing device, and a computer-readable storage medium for a mobile terminal are provided, which are described in detail in the following embodiments one by one.
FIG. 1 shows a block diagram of a computing device 100 according to an embodiment of the present application. The components of the computing device 100 include, but are not limited to, a memory 110 and a processor 120. The processor 120 is coupled to the memory 110 via a bus 130, and a database 150 is used to store data.
Computing device 100 also includes access device 140, which enables computing device 100 to communicate via one or more networks 160. Examples of such networks include the Public Switched Telephone Network (PSTN), a Local Area Network (LAN), a Wide Area Network (WAN), a Personal Area Network (PAN), or a combination of communication networks such as the internet. Access device 140 may include one or more of any type of network interface, wired or wireless (e.g., a Network Interface Card (NIC)), such as an IEEE 802.11 Wireless Local Area Network (WLAN) interface, a Worldwide Interoperability for Microwave Access (WiMAX) interface, an Ethernet interface, a Universal Serial Bus (USB) interface, a cellular network interface, a Bluetooth interface, a Near Field Communication (NFC) interface, and so forth.
In one embodiment of the present application, the above-mentioned components of the computing device 100 and other components not shown in fig. 1 may also be connected to each other, for example, by a bus. It should be understood that the block diagram of the computing device architecture shown in FIG. 1 is for purposes of example only and is not limiting as to the scope of the present application. Those skilled in the art may add or replace other components as desired.
Computing device 100 may be any type of stationary or mobile computing device, including a mobile computer or mobile computing device (e.g., tablet, personal digital assistant, laptop, notebook, netbook, etc.), a mobile phone (e.g., smartphone), a wearable computing device (e.g., smartwatch, smartglasses, etc.), or other type of mobile device, or a stationary computing device such as a desktop computer or PC. Computing device 100 may also be a mobile or stationary server.
In the prior art, in the field of three-dimensional rendering, and particularly in three-dimensional game scenes, real-time planar reflection is a common requirement; as part of global illumination it helps keep the lighting environment accurate and improves the lighting effect, as shown in fig. 2. Mainstream game engines generally implement planar reflection with one of three techniques: reflection cube maps, camera-rendered reflection maps, or screen space reflection. A reflection cube map is a cube map generated for the scene; generating it is time-consuming, it is usually restricted to static reflection, and the reflection it produces on a planar area is only approximate. Camera-rendered reflection mapping needs an additional camera to re-render the objects to be reflected into a texture map; the extra camera itself incurs significant overhead, re-rendering the scene doubles the draw calls, and the reflection map usually simplifies the lighting calculation, which degrades the result. Screen space reflection (SSR) requires a ray tracing or ray marching algorithm, is computationally expensive, cannot be optimized by reflecting only selected objects, and is therefore very limited on the mobile end.
Recently, the screen space planar reflection (SSPR) technique has been proposed to address these defects. Compared with SSR, SSPR avoids the high cost of ray marching and performs much better; compared with planar reflection rendered by an extra camera, it does not double the draw calls. However, the idea of SSPR is to reconstruct world coordinates from a depth map, mirror those coordinates about a predetermined height, and then map the mirrored screen coordinates back to the screen colors before mirroring; implementing these steps depends on ComputeShader techniques, as in CN112233216A and "URP rendering pipeline - screen space plane reflection" (https://zhuanlan.zhihu.com/p/357714920). Not all mobile devices support ComputeShader, and computing the mirrored coordinates with a ComputeShader is expensive, which places high demands on mobile hardware; moreover, because ComputeShader execution is unordered, the reflected texture can flicker.
In the embodiments of the present application, in order to solve the above problems, a real-time plane reflection rendering method and apparatus for a mobile terminal, a computing device, and a computer-readable storage medium are provided. The processor 120 may perform the steps of the real-time plane reflection rendering method for a mobile terminal shown in fig. 3. A flowchart of the method is shown in fig. 3 and includes steps 302 to 306.
Step 302: and generating screen space reflection coordinates of the target object according to the reflection plane and the normal direction.
In a specific embodiment, for a target object that requires real-time plane reflection, the vertex shader calculates the coordinates of the target object after reflection projection from the reflection plane position and normal direction passed in beforehand. As shown in fig. 4, S is the reflection plane and P is the target object for which the reflection effect is to be generated. In this step, the screen space coordinates of the target object P are acquired, and the coordinates of the reflected object P' are generated from the position of the reflection plane S and the direction of its normal.
Further, a new Pass is created for the vertex shader, and the coordinate calculation of the reflected object P' is implemented in that Pass.
(The corresponding calculation is given in the original publication as formula image BDA0003470280920000071.)
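As an editorial illustration only (the published formula image is not reproduced above, and the variable names below are assumptions rather than the patent's notation), the standard point-about-plane mirror that such a calculation performs can be sketched in Python as:

    # Minimal sketch: mirror a world-space point p across a plane given by a point
    # on the plane and the plane's unit normal n, i.e. p' = p - 2 * dot(p - plane_point, n) * n
    def reflect_point(p, plane_point, n):
        d = sum((p[i] - plane_point[i]) * n[i] for i in range(3))  # signed distance to the plane
        return [p[i] - 2.0 * d * n[i] for i in range(3)]

    # Mirroring across the ground plane y = 0 simply flips the y component:
    print(reflect_point([1.0, 2.0, 3.0], [0.0, 0.0, 0.0], [0.0, 1.0, 0.0]))  # [1.0, -2.0, 3.0]

In the embodiment, this mirroring is performed per vertex in the additional Pass, using the reflection plane position and normal passed to the shader.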
In this step, because the screen space reflection coordinates are acquired per target object, whether each object needs to be reflected can be controlled individually, which is more flexible. Only one additional Pass is needed, with no new camera culling and no reflection calculation over the whole screen space, which reduces the computational pressure on the mobile device.
Step 304: and generating screen space reflection UV textures according to the screen space reflection coordinates of the target object.
In a specific embodiment, the vertex coordinates of the target object are transformed into screen space UV coordinates through projection transformation and perspective division, which gives the position of the target object's vertices within the screen texture.
(The corresponding calculation is given in the original publication as formula image BDA0003470280920000081.)
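Likewise as an editorial illustration (the published formula image is not reproduced above; the row-major matrix layout and function names are assumptions), the projection transformation, perspective division, and remapping to [0, 1] UV described in this step can be sketched as:

    # Sketch: project a world-space position with a combined view-projection matrix,
    # perform the perspective divide, and remap the NDC xy from [-1, 1] to UV in [0, 1].
    def world_to_screen_uv(p_world, view_proj):
        p_h = [p_world[0], p_world[1], p_world[2], 1.0]
        clip = [sum(view_proj[r][c] * p_h[c] for c in range(4)) for r in range(4)]
        ndc_x = clip[0] / clip[3]   # perspective division
        ndc_y = clip[1] / clip[3]
        return [ndc_x * 0.5 + 0.5, ndc_y * 0.5 + 0.5]

    # With an identity "view-projection" matrix (purely illustrative), NDC equals world xy:
    identity = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
    print(world_to_screen_uv([0.5, -0.5, 0.0], identity))  # [0.75, 0.25]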
In the field of computer graphics, coordinates such as vertex coordinates, UV coordinates, world coordinates, etc. are conventional in the art and will not be described herein.
Further, in the fragment shader, the screen space UV coordinates of the target object are written into a reflection space formed by the screen space reflection coordinates of the target object, and a reflection UV texture is generated.
Further, the reflection UV texture is written into one of the buffers of the MRT, while the other buffers of the MRT store the normally rendered screen texture maps.
Multiple Render Target (MRT) is a technique that lets a rendering program bind multiple render targets in one draw call and write different render data into different buffers.
In this step, the UV texture coordinates of the target object are obtained through coordinate transformation, those coordinates are combined with the reflection coordinates calculated in step 302 to generate the reflection UV texture, and the reflection UV texture is written into a buffer of the MRT, which completes the acquisition of the screen space reflection UV texture.
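A faithful implementation of this step runs in the fragment shader with the two textures bound as MRT attachments; the following CPU-side sketch (buffer sizes and names are illustrative assumptions) only shows the data flow: one buffer holds the normally rendered scene colors, while the other records, at each pixel covered by the reflected geometry, the screen UV of the original target-object fragment that should appear there.

    # CPU-side sketch of the MRT data flow (illustrative only).
    WIDTH, HEIGHT = 8, 8
    scene_color = [[(0.2, 0.2, 0.2)] * WIDTH for _ in range(HEIGHT)]   # MRT buffer 0: normal render
    reflection_uv = [[None] * WIDTH for _ in range(HEIGHT)]            # MRT buffer 1: reflection UV texture

    def write_reflection_uv(reflected_uv, source_uv):
        # The fragment rasterized at the reflected position stores the screen UV of the
        # original target-object fragment, so post-processing knows where to sample from.
        x = min(int(reflected_uv[0] * WIDTH), WIDTH - 1)
        y = min(int(reflected_uv[1] * HEIGHT), HEIGHT - 1)
        reflection_uv[y][x] = source_uv

    write_reflection_uv(reflected_uv=(0.5, 0.25), source_uv=(0.5, 0.75))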
Step 306: generating a reflection rendering effect according to the screen space reflection UV texture.
In a specific embodiment, according to the screen space reflection UV texture stored in one MRT buffer, the screen texture stored in another MRT buffer is sampled to obtain the reflected screen pixels;
further, the reflected pixels are blended with the current screen pixels to obtain the reflection effect of the target object.
Further, step 306 above is implemented in the screen post-processing stage. The screen post-processing stage is the series of processing passes applied to the rendered image after the screen image has been rendered; this term is well known to those skilled in the art and will not be described further herein.
In this step, what is sampled through the reflection UV texture is the screen buffer texture rendered from the original scene, so the generated reflection effect is realistic; the reflection effect shown in fig. 5 is achieved with the method of the embodiment of the present application. Meanwhile, the composite is produced directly in the screen post-processing stage, with no extra Pass and no extra computation, which further reduces the resource consumption of the mobile terminal.
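As a rough sketch of the post-processing composite described above (the nearest-neighbour sampling and the fixed blend factor are simplifying assumptions, not values taken from the patent), every pixel whose reflection-UV entry is valid samples the normally rendered scene texture at that UV and is blended with the current screen color:

    # Sketch: composite the reflection in the screen post-processing stage.
    def composite_reflection(scene_color, reflection_uv, strength=0.4):
        h, w = len(scene_color), len(scene_color[0])
        out = [list(row) for row in scene_color]
        for y in range(h):
            for x in range(w):
                uv = reflection_uv[y][x]
                if uv is None:
                    continue  # this pixel is not on a reflecting receiver
                sx = min(int(uv[0] * w), w - 1)   # nearest-neighbour sample of the scene texture
                sy = min(int(uv[1] * h), h - 1)
                reflected, current = scene_color[sy][sx], scene_color[y][x]
                out[y][x] = tuple(current[i] * (1.0 - strength) + reflected[i] * strength for i in range(3))
        return out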
In the above embodiment of the real-time plane reflection rendering method for a mobile terminal, the reflected screen space coordinates are calculated from the screen space coordinates of each target to be reflected, so whether each object needs to be reflected can be controlled individually; the method is therefore more flexible, the amount of computation is small, and the computational pressure on the mobile device is reduced. The reflected screen space coordinates are then converted into UV texture coordinates, the two sets of coordinates are used to generate the reflection UV texture, and the reflection UV texture is written into the MRT together with the normally rendered texture. In the screen post-processing stage, the normally rendered texture is sampled with the reflection UV texture from the MRT, yielding the reflection effect. The reflection effect achieved in this way is realistic, and it is composited directly in the screen post-processing stage without additional computing resources, which clearly distinguishes the approach from the existing SSPR technique; it is particularly suited to the rendering requirements of the mobile terminal and relieves the hardware resource pressure of real-time three-dimensional scene rendering on the mobile terminal.
Corresponding to the above method embodiment, the present application further provides an embodiment of a real-time plane reflection rendering apparatus for a mobile end, and fig. 6 shows a schematic structural diagram of a real-time plane reflection rendering apparatus for a mobile end according to an embodiment of the present application. As shown in fig. 6, the apparatus includes:
the reflection coordinate generation module is used for generating screen space reflection coordinates of the target object according to the reflection plane and the normal direction;
the reflection UV texture generation module is used for generating screen space reflection UV textures according to the screen space reflection coordinates of the target object;
and the reflection effect generation module is used for generating the reflection effect of the target object according to the screen space reflection UV texture.
The above is a schematic solution of the real-time plane reflection rendering apparatus for a mobile terminal according to the embodiment. It should be noted that the technical solution of the real-time plane reflection rendering apparatus for a mobile end and the technical solution of the real-time plane reflection rendering method for a mobile end belong to the same concept, and details of the technical solution of the real-time plane reflection rendering apparatus for a mobile end, which are not described in detail, can be referred to the description of the technical solution of the real-time plane reflection rendering method for a mobile end.
There is also provided in an embodiment of the present application a computing device, including a memory, a processor, and computer instructions stored on the memory and executable on the processor, wherein the processor executes the instructions to implement the steps of the real-time plane reflection rendering method for a mobile terminal.
The above is an illustrative scheme of a computing device of the present embodiment. It should be noted that the technical solution of the computing device and the technical solution of the real-time plane reflection rendering method for the mobile terminal belong to the same concept, and details that are not described in detail in the technical solution of the computing device can be referred to the description of the technical solution of the real-time plane reflection rendering method for the mobile terminal.
An embodiment of the present application also provides a computer readable storage medium storing computer instructions, which when executed by a processor, implement the steps of the real-time plane reflection rendering method for a mobile terminal as described above.
The above is an illustrative scheme of a computer-readable storage medium of the present embodiment. It should be noted that the technical solution of the storage medium belongs to the same concept as the technical solution of the real-time plane reflection rendering method for the mobile terminal, and details that are not described in detail in the technical solution of the storage medium can be referred to the description of the technical solution of the real-time plane reflection rendering method for the mobile terminal.
The foregoing description of specific embodiments of the present application has been presented. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
The computer instructions comprise computer program code which may be in the form of source code, object code, an executable file or some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, recording medium, usb disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), electrical carrier wave signals, telecommunications signals, software distribution medium, and the like. It should be noted that the computer readable medium may contain content that is subject to appropriate increase or decrease as required by legislation and patent practice in jurisdictions, for example, in some jurisdictions, computer readable media does not include electrical carrier signals and telecommunications signals as is required by legislation and patent practice.
It should be noted that, for the sake of simplicity, the above-mentioned method embodiments are described as a series of acts or combinations, but those skilled in the art should understand that the present application is not limited by the described order of acts, as some steps may be performed in other orders or simultaneously according to the present application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
The preferred embodiments of the present application disclosed above are intended only to aid in the explanation of the application. Alternative embodiments are not exhaustive and do not limit the invention to the precise embodiments described. Obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the application and its practical applications, to thereby enable others skilled in the art to best understand and utilize the application. The application is limited only by the claims and their full scope and equivalents.

Claims (9)

1. A real-time plane reflection rendering method for a mobile terminal is characterized by comprising the following steps:
generating a screen space reflection coordinate of the target object according to the reflection plane and the normal direction;
generating screen space reflection UV textures according to the screen space reflection coordinates of the target object;
and generating a reflection effect of the target object according to the screen space reflection UV texture.
2. The method of claim 1, wherein generating screen space reflection coordinates of the target object further comprises:
and in the vertex shader, calculating the screen space reflection coordinates of the target object in the newly-built Pass channel according to the position of the reflection plane and the normal direction.
3. The method of claim 1, wherein generating a screen space reflection UV texture according to the screen space reflection coordinates of the target object comprises:
transforming the vertex coordinates of the target object to screen space UV coordinates;
and writing the screen space UV coordinates of the target object into a reflection space formed by the screen space reflection coordinates of the target object, and generating a reflection UV texture.
4. The method of claim 3, further comprising:
writing the reflected UV texture into a buffer of a multi-render target MRT;
the normally rendered texture map in the original screen is written into another buffer of the multi-render target MRT.
5. The method of claim 1, wherein generating the reflection rendering effect according to the screen space reflection UV texture comprises:
sampling, according to the screen space reflection UV texture in one buffer of the MRT, the screen texture in another buffer of the MRT to obtain reflected screen pixels;
and mixing the reflected screen pixels with the current screen pixels to obtain a rendering result of the reflection effect.
6. The method of claim 5, wherein the reflection rendering effect is generated at a screen post-processing stage.
7. A real-time plane reflection rendering apparatus for a mobile terminal, comprising:
the reflection coordinate generation module is used for generating screen space reflection coordinates of the target object according to the reflection plane and the normal direction;
the reflection UV texture generation module is used for generating screen space reflection UV textures according to the screen space reflection coordinates of the target object;
and the reflection effect generation module is used for generating the reflection effect of the target object according to the screen space reflection UV texture.
8. A computing device comprising a memory, a processor, and computer instructions stored on the memory and executable on the processor, wherein the processor implements the steps of the method of any one of claims 1-6 when executing the instructions.
9. A computer-readable storage medium storing computer instructions, which when executed by a processor, perform the steps of the method of any one of claims 1 to 6.
CN202210041160.5A 2022-01-14 2022-01-14 Real-time plane reflection rendering method and device for mobile terminal Pending CN114419235A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210041160.5A CN114419235A (en) 2022-01-14 2022-01-14 Real-time plane reflection rendering method and device for mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210041160.5A CN114419235A (en) 2022-01-14 2022-01-14 Real-time plane reflection rendering method and device for mobile terminal

Publications (1)

Publication Number Publication Date
CN114419235A (en) 2022-04-29

Family

ID=81272922

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210041160.5A Pending CN114419235A (en) 2022-01-14 2022-01-14 Real-time plane reflection rendering method and device for mobile terminal

Country Status (1)

Country Link
CN (1) CN114419235A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination