CN115225873A - Color correction method and device for fusion projection system - Google Patents

Color correction method and device for fusion projection system

Info

Publication number
CN115225873A
CN115225873A (application CN202210674432.5A)
Authority
CN
China
Prior art keywords
value
video frame
pixel
color
rgb
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210674432.5A
Other languages
Chinese (zh)
Inventor
周慧敏 (Zhou Huimin)
范江 (Fan Jiang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Zhizai Yuntian Culture Technology Co., Ltd.
Original Assignee
Guangzhou Zhizai Yuntian Culture Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Zhizai Yuntian Culture Technology Co., Ltd.
Priority: CN202210674432.5A
Publication: CN115225873A
Legal status: Pending

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • H04N9/3182Colour adjustment, e.g. white balance, shading or gamut

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Processing Of Color Television Signals (AREA)
  • Video Image Reproduction Devices For Color Tv Systems (AREA)

Abstract

The invention discloses a color correction method and device for a fusion projection system. The method adjusts and corrects the RGB values of the video-frame pixels output by a projector, so that the projected picture adapts the color of the corresponding video frame to factors such as the background environment and the light intensity. A light-sensing camera collects the color changes of the output video frames in real time, so the RGB values of the video frames can be corrected in real time and viewers receive a consistent color experience. By contrast, color correction in the prior art requires a specific test picture and adjusts only specific colors; it cannot adjust colors continuously or according to real-time environmental changes, its adjustment rhythm is slow, and it cannot deliver a smooth experience.

Description

Color correction method and device for fusion projection system
Technical Field
The invention belongs to the field of projected image color correction, and particularly relates to a color correction method and device of a fusion projection system.
Background
When a projector is used to watch videos, each video output device fades colors to a different degree and the surrounding environments differ, so the visual experience is inconsistent. The surface projected onto is not necessarily white, and its apparent color changes for various reasons, such as illumination intensity and light color; the illumination intensity also affects the projection effect.
Disclosure of Invention
The present invention is directed to a color correction method and apparatus for a fusion projection system, which solves one or more problems in the prior art and provides at least one advantageous alternative.
A method of color correction for a fused projection system, the method comprising the steps of:
s100: collecting an output video through a camera;
s200: extracting a corresponding video frame from a storage unit for analysis;
s300: graying the output video to obtain a grayed image;
s400: dividing the projection area through a gray level image;
s500: the color of the projected area is corrected by the RGB values of the pixels.
Further, in step S100, a light-sensing camera collects the output video; it can rapidly acquire each frame of the output video and transmit the collected video to the storage unit.
Further, in step S200, the colors in the output video are analyzed to obtain the RGB value of each pixel point; the video frame that was output externally at the same moment as the captured video is extracted from the storage unit as the corresponding video frame, and the RGB values of its pixel points serve as the original data.
Further, in step S300, the output video is grayed to obtain a grayscale image: the RGB channels of its pixels are decomposed, and the R, G, and B channel values are grayed separately to obtain an R-value grayscale coefficient, a G-value grayscale coefficient, and a B-value grayscale coefficient, which are compared with the RGB values of the pixels of the corresponding video frame to obtain the ratios R1, G1, and B1.
Further, in step S400, to divide the projection area using the grayscale image, the pixel points of the output video are converted into pixel values in the interval [0, 255]; the pixel values are divided on the grayscale image by R, G, and B to obtain an R-value, a G-value, and a B-value grayscale image, each of which is subjected to edge detection to obtain an R-value, a G-value, and a B-value edge detection map.
Further, the R-value, G-value, and B-value edge detection maps are used to quickly retrieve, from the video data in the storage unit, the video frame whose edge points match. The R-value edge detection map is overlaid on that frame to obtain a pixel color difference, expressed as a pixel value and recorded as the first pixel color difference value S1; the G-value and B-value edge detection maps are overlaid on the corresponding video frame in the same way to obtain the second and third pixel color difference values S2 and S3. The pixel color differences S1, S2, and S3 are combined with the ratios R1, G1, and B1 to calculate a color correction value C, expressed as follows:
C = (formula given as an image in the original publication and not recoverable from this text)
wherein n is the number of pixel points and exp denotes the exponential function with the natural number e as its base; the mean square error of the color difference values (likewise defined by equations rendered as images) enters the formula, and the color correction value C is obtained through the above calculation;
the method has the advantages that in the projection process, the color of a video frame can change due to environmental factors, the RGB value of a pixel point can change greatly due to illumination in the environment with high illumination intensity, the mean square error can represent a reference value which is more accurate in color correction of the video frame under the condition that the color change numerical values of R, G and B channels are small, and the pixel point with the RGB value with overlarge difference can be screened through the absolute value of the ratio obtained by adding the mean square error and the sum of color difference values;
still further, in step S500, the color of the projection area is corrected according to the RGB values of the pixels, and the specific steps are as follows:
s501, comparing the color correction value C with the RGB value in the original data to obtain the RGB difference D of each pixel point, adding the RGB value of the pixel point in the original data with the difference D, combining the pixel points added with the difference to obtain a first correction video frame, retrieving to obtain the pixel RGB value of the next frame of the corresponding video frame in a storage unit, adjusting the adjustment range of the RGB value of the pixel of the first correction video frame to be consistent with the adjustment range of the pixel at the corresponding position of the RGB value of the pixel of the next frame of the corresponding video frame in the storage unit, and outputting the video frame of the next frame of the corresponding video frame in the storage unit after adjustment to be a second correction video frame;
s502: decomposing RGB channels of pixels of the second correction video frame, comparing RGB values corresponding to R, G and B channels with RGB values of the first correction video respectively, if the RGB values corresponding to the decomposed R, G and B channels are the same as the adjusted amplitude of the first correction video frame, outputting the second correction video frame by a projector, if the RGB values corresponding to the decomposed R, G and B channels are different from the adjusted amplitude of the first correction video frame, graying the R, G and B channels respectively to obtain R1 value gray coefficients, G1 value gray coefficients and B1 value gray coefficients, and comparing the R1 value gray coefficients, the G1 value gray coefficients and the B1 value gray coefficients with the RGB values of pixels of the corresponding video frames to obtain ratios R2, G2 and B2;
s503: comparing the ratios R2, G2 and B2 with the ratios R1, G1 and B1 to obtain difference values D, adding the difference values D with corresponding RGB values of R, G and B channels respectively to obtain a third corrected video frame, comparing the RGB values of pixels of the third corrected video frame with the RGB values of pixels of the second corrected video frame, and if the RGB values are the same, outputting the third corrected video frame by the projector.
Still further, an apparatus for color correction of a fusion projection system includes a projector processor, a projection lens, a light-sensing camera, and a storage unit, together with a computer program stored in the storage unit and executable by the projector processor; when the projector processor executes the computer program, the steps of any one of the above color correction methods for a fusion projection system are implemented.
The invention has the following beneficial effects: by decomposing and correcting the RGB channels of every video frame, the projected picture can adapt the color of the corresponding video frame to factors such as the background environment and the light intensity. The light-sensing camera collects the color changes of the output frames in real time, so the RGB values of the video frames can be corrected in real time and viewers receive a consistent color experience. The specific test picture required by traditional color correction is avoided, achieving automatic, uninterrupted, seamless picture correction during media playback.
Drawings
The above and other features of the present invention will become more apparent from the following detailed description of embodiments with reference to the accompanying drawings, in which like reference numerals designate the same or similar elements. The drawings described below are merely exemplary of the present invention; other drawings can be obtained by those skilled in the art without inventive effort.
FIG. 1 is a system flow diagram of a color correction method and apparatus for a fusion projection system;
FIG. 2 is a flow chart of a device structure of a color correction method and apparatus for a fusion projection system.
Detailed Description
The conception, the specific structure and the technical effects of the present invention will be clearly and completely described in conjunction with the embodiments and the accompanying drawings to fully understand the objects, the schemes and the effects of the present invention. It should be noted that, in the present application, the embodiments and features of the embodiments may be combined with each other without conflict.
In the description of the present invention, "several" means one or more and "a plurality of" means two or more; "greater than", "less than", "more than", and the like are understood as excluding the stated number, while "above", "below", "within", and the like are understood as including it. Where "first" and "second" are used only to distinguish technical features, they are not to be understood as indicating or implying relative importance, the number of features, or their precedence.
A method of color correction for a fused projection system as shown in fig. 1, said method comprising the steps of:
s100: collecting an output video through a camera;
s200: extracting a corresponding video frame from a storage unit for analysis;
s300: graying the output video to obtain a grayed image;
s400: dividing a projection area through a grayed image;
s500: the color of the projected area is corrected by the RGB values of the pixels.
Further, in step S100, a light-sensing camera collects the output video; it can rapidly acquire each frame of the output video and transmit the collected video to the storage unit.
Further, in step S200, the colors in the output video are analyzed to obtain the RGB value of each pixel point; the video frame that was output externally at the same moment as the captured video is extracted from the storage unit as the corresponding video frame, and the RGB values of its pixel points serve as the original data.
Further, in step S300, the output video is grayed to obtain a grayscale image: the RGB channels of its pixels are decomposed, and the R, G, and B channel values are grayed separately to obtain an R-value grayscale coefficient, a G-value grayscale coefficient, and a B-value grayscale coefficient, which are compared with the RGB values of the pixels of the corresponding video frame to obtain the ratios R1, G1, and B1.
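The per-channel graying coefficients and the ratios R1, G1, B1 are not given concrete formulas in the text; the sketch below is one hypothetical concretization. The normalization of each channel to [0, 1] and the division-by-zero guard are assumptions.

```python
def channel_gray_coeffs(pixel_rgb):
    """Normalize each channel of a pixel to [0, 1] as its 'grayscale coefficient'."""
    r, g, b = pixel_rgb
    return r / 255.0, g / 255.0, b / 255.0

def channel_ratios(captured_rgb, stored_rgb):
    """Ratios R1, G1, B1 of captured channel coefficients to the stored frame's."""
    cr, cg, cb = channel_gray_coeffs(captured_rgb)
    sr, sg, sb = channel_gray_coeffs(stored_rgb)
    def ratio(c, s):
        # Guard against division by zero on black channels.
        return c / s if s > 0 else 1.0
    return ratio(cr, sr), ratio(cg, sg), ratio(cb, sb)

# Captured pixel vs. the stored ("corresponding") frame's pixel.
r1, g1, b1 = channel_ratios((200, 120, 60), (180, 120, 80))
```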
Still further, in step S400, to divide the projection area using the grayscale image, the pixel points of the output video are converted into pixel values in the interval [0, 255]; the pixel values are divided on the grayscale image by R, G, and B to obtain an R-value, a G-value, and a B-value grayscale image, each of which is subjected to edge detection to obtain an R-value, a G-value, and a B-value edge detection map.
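The text does not name a particular edge detector, so a simple gradient-magnitude detector can stand in for it in this sketch; the threshold value is an arbitrary assumption.

```python
def edge_map(channel, threshold=32):
    """Binary edge map from horizontal/vertical gradient magnitude.

    A simplified stand-in for the unspecified edge detector of step S400,
    applied to one channel's grayscale image (values in [0, 255])."""
    h, w = len(channel), len(channel[0])
    edges = [[0] * w for _ in range(h)]
    for y in range(h - 1):
        for x in range(w - 1):
            gx = abs(channel[y][x + 1] - channel[y][x])  # horizontal gradient
            gy = abs(channel[y + 1][x] - channel[y][x])  # vertical gradient
            if gx + gy >= threshold:
                edges[y][x] = 1
    return edges

# A vertical brightness step at column 2 should produce an edge at column 1.
channel = [
    [0, 0, 200, 200],
    [0, 0, 200, 200],
    [0, 0, 200, 200],
]
em = edge_map(channel)
```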
Further, the R-value, G-value, and B-value edge detection maps are used to quickly retrieve, from the video data in the storage unit, the video frame whose edge points match. The R-value edge detection map is overlaid on that frame to obtain a pixel color difference, expressed as a pixel value and recorded as the first pixel color difference value S1; the G-value and B-value edge detection maps are overlaid on the corresponding video frame in the same way to obtain the second and third pixel color difference values S2 and S3. The pixel color differences S1, S2, and S3 are combined with the ratios R1, G1, and B1 to calculate a color correction value C, expressed as follows:
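The "coincidence comparison" that yields a per-channel pixel color difference could be realized, under assumptions, as a mean absolute difference taken only over edge pixels; the function name and the restriction to edge pixels are guesses, not the patent's exact definition.

```python
def pixel_color_difference(edge_map_ch, captured_ch, stored_ch):
    """Mean absolute channel difference over edge pixels.

    A hypothetical concretization of the S1/S2/S3 values: compare the captured
    and stored frames for one channel, but only where the edge map fires."""
    total, count = 0, 0
    for y, row in enumerate(edge_map_ch):
        for x, is_edge in enumerate(row):
            if is_edge:
                total += abs(captured_ch[y][x] - stored_ch[y][x])
                count += 1
    return total / count if count else 0.0

edges = [[0, 1], [1, 0]]
captured = [[10, 50], [80, 0]]
stored = [[10, 40], [100, 0]]
s1 = pixel_color_difference(edges, captured, stored)  # (|50-40| + |80-100|) / 2
```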
C = (formula given as an image in the original publication and not recoverable from this text)
wherein n is the number of pixel points and exp denotes the exponential function with the natural number e as its base; the mean square error of the color difference values (likewise defined by equations rendered as images) enters the formula, and the color correction value C is obtained through the above calculation;
this has the advantage that, during projection, environmental factors change the colors of video frames; under strong illumination, the RGB values of pixel points can change greatly. When the color changes of the R, G, and B channels are small, the mean square error provides a more accurate reference value for correcting the frame's colors, and pixel points whose RGB values differ excessively can be screened out using the absolute value of the ratio formed from the mean square error and the sum of the color difference values;
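The screening idea, using the spread of the color difference values to reject pixel points whose difference is excessive, might be sketched as follows. The cutoff of 1.5 standard deviations is an invented parameter; the patent's actual formula is in the unrecoverable image.

```python
def mean_square_error(diffs):
    """Population mean square error (variance) of per-pixel color differences."""
    n = len(diffs)
    mean = sum(diffs) / n
    return sum((d - mean) ** 2 for d in diffs) / n

def screen_outliers(diffs, k=1.5):
    """Keep differences within k standard deviations of the mean.

    A guessed realization of 'screening pixel points whose RGB values differ
    excessively'; k is an assumption, not from the patent."""
    mean = sum(diffs) / len(diffs)
    std = mean_square_error(diffs) ** 0.5
    return [d for d in diffs if abs(d - mean) <= k * std]

# One pixel's difference (40) is far from the others and gets screened out.
kept = screen_outliers([3, 4, 5, 4, 40])
```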
still further, in step S500, the color of the projection area is corrected according to the RGB values of the pixels, and the specific steps are as follows:
s501, comparing the color correction value C with the RGB value in the original data to obtain the RGB difference D of each pixel point, adding the RGB value of the pixel point in the original data with the difference D, combining the pixel points added with the difference to obtain a first correction video frame, retrieving to obtain the pixel RGB value of the next frame of the corresponding video frame in a storage unit, adjusting the adjustment range of the RGB value of the pixel of the first correction video frame to be consistent with the adjustment range of the pixel at the corresponding position of the RGB value of the pixel of the next frame of the corresponding video frame in the storage unit, and outputting the video frame of the next frame of the corresponding video frame in the storage unit after adjustment to be a second correction video frame;
s502: decomposing RGB channels of pixels of the second correction video frame, comparing RGB values corresponding to R, G and B channels with RGB values of the first correction video respectively, if the RGB values corresponding to the decomposed R, G and B channels are the same as the adjusted amplitude of the first correction video frame, outputting the second correction video frame by a projector, if the RGB values corresponding to the decomposed R, G and B channels are different from the adjusted amplitude of the first correction video frame, graying the R, G and B channels respectively to obtain R1 value gray coefficients, G1 value gray coefficients and B1 value gray coefficients, and comparing the R1 value gray coefficients, the G1 value gray coefficients and the B1 value gray coefficients with the RGB values of pixels of the corresponding video frames to obtain ratios R2, G2 and B2;
s503: comparing the ratios R2, G2 and B2 with the ratios R1, G1 and B1 to obtain difference values D, adding the difference values D with corresponding RGB values of R, G and B channels respectively to obtain a third corrected video frame, comparing the RGB values of pixels of the third corrected video frame with the RGB values of pixels of the second corrected video frame, and if the RGB values are the same, outputting the third corrected video frame by the projector.
Still further, the color correction apparatus for a fusion projection system includes a projector processor, a projection lens, a light-sensing camera, and a storage unit, together with a computer program stored in the storage unit and executable by the projector processor; when the projector processor executes the computer program, the steps of any one of the above color correction methods for a fusion projection system are implemented. Those skilled in the art will understand that this example is merely illustrative of the apparatus and does not limit it; the apparatus may include more or fewer components than shown, combine some components, or use different components, and may further include, for example, an input/output device, a network access device, a bus, and the like.
As shown in fig. 2, the apparatus includes a projector processor, a projection lens, a light-sensing camera, and a storage unit. The units implemented by the projector processor when running the computer program comprise:
a video comparison unit, responsible for comparing the pixel RGB values of the output video collected by the light-sensing camera with those of the corresponding video frame;
a color correction unit, responsible for adjusting the pixel RGB values of the corresponding video frame using the color correction value obtained by the video comparison unit, adjusting the next frame according to the adjustment amplitude, and correcting the video colors in real time so that users do not experience inconsistent visuals.
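The two units could be modeled, purely illustratively, as small classes; the class and method names are invented, and the comparison here is a simple per-channel difference rather than the patent's full correction-value calculation.

```python
class VideoComparisonUnit:
    """Compares a captured pixel with the stored frame's pixel (hypothetical)."""
    def compare(self, captured_px, stored_px):
        # Per-channel difference needed to move captured toward stored.
        return tuple(s - c for c, s in zip(captured_px, stored_px))

class ColorCorrectionUnit:
    """Applies the difference produced by the comparison unit, clamped to [0, 255]."""
    def correct(self, px, diff):
        return tuple(max(0, min(255, c + d)) for c, d in zip(px, diff))

cmp_unit = VideoComparisonUnit()
fix_unit = ColorCorrectionUnit()
diff = cmp_unit.compare((120, 100, 80), (140, 100, 60))
corrected = fix_unit.correct((120, 100, 80), diff)
```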
Although the description of the present invention has been presented in considerable detail and with reference to a few illustrated embodiments, it is not intended to be limited to any such detail or embodiment or any particular embodiment so as to effectively encompass the intended scope of the invention. Furthermore, the foregoing describes the invention in terms of embodiments foreseen by the inventor for which an enabling description was available, notwithstanding that insubstantial modifications of the invention, not presently foreseen, may nonetheless represent equivalent modifications thereto.

Claims (8)

1. A color correction method for a fusion projection system, the method comprising the steps of:
s100: collecting an output video through a camera;
s200: extracting a corresponding video frame from a storage unit for analysis;
s300: graying the output video to obtain a grayed image;
s400: dividing the projection area through a gray level image;
s500: the color of the projected area is corrected by the RGB values of the pixels.
2. The color correction method for a fusion projection system as claimed in claim 1, wherein in step S100 the output video is collected by a light-sensing camera, which can rapidly acquire each frame of the output video and transmit the collected video to a storage unit.
3. The color correction method for a fusion projection system according to claim 1, wherein in step S200, the color in the output video is analyzed to obtain the RGB value of each pixel in the output video, a video frame externally output in the storage unit at the same time as the output video is extracted in the storage unit as a corresponding video frame, and the RGB value of the pixel in the corresponding video frame is used as the original data.
4. The color correction method for a fusion projection system as claimed in claim 1, wherein in step S300, the output video is grayed to obtain a grayed image, the RGB channels of the pixels of the output video are decomposed, the RGB values of the pixels of the output video are grayed corresponding to the R, G and B channels respectively to obtain R-value grayscale coefficients, G-value grayscale coefficients and B-value grayscale coefficients, and the R-value grayscale coefficients, G-value grayscale coefficients and B-value grayscale coefficients are compared with the RGB values of the pixels of the corresponding video frames to obtain the ratios R1, G1 and B1.
5. The color correction method for a fusion projection system according to claim 1, wherein in step S400, in the dividing of the projection region by the grayed-out image, the pixel points of the output video are converted into pixel values between intervals [0,255], the pixel values are divided on the grayed-out image according to R, G and B respectively, so as to obtain an R-value grayed-out image, a G-value grayed-out image and a B-value grayed-out image, and the R-value grayed-out image, the G-value grayed-out image and the B-value grayed-out image are respectively subjected to edge detection, so as to obtain an R-value edge detection map, a G-value edge detection map and a B-value edge detection map.
6. The color correction method for a fusion projection system of claim 5, wherein the R-value, G-value, and B-value edge detection maps are used to quickly retrieve, from the video data in the storage unit, the video frame whose edge points match; the R-value edge detection map is overlaid on that frame to obtain a pixel color difference, expressed as a pixel value and recorded as the first pixel color difference value S1; the G-value and B-value edge detection maps are overlaid on the corresponding video frame in the same way to obtain the second and third pixel color difference values S2 and S3; and the pixel color differences S1, S2, and S3 are combined with the ratios R1, G1, and B1 to calculate the color correction value C, expressed as:
C = (formula given as an image in the original publication and not recoverable from this text)
wherein n is the number of pixel points and exp denotes the exponential function with the natural number e as its base; the mean square error of the color difference values (likewise defined by equations rendered as images) enters the formula, and the color correction value C is obtained through the above calculation.
7. The color correction method for a fusion projection system of claim 1, wherein in step S500 the color of the projection area is corrected using the RGB values of the pixels, specifically by the following steps:
s501, comparing the color correction value C with the RGB value in the original data to obtain the RGB difference D of each pixel point, adding the RGB value of the pixel point in the original data with the difference D, combining the pixel points added with the difference to obtain a first correction video frame, retrieving to obtain the pixel RGB value of the next frame of the corresponding video frame in a storage unit, adjusting the adjustment range of the RGB value of the pixel of the first correction video frame to be consistent with the adjustment range of the pixel at the corresponding position of the RGB value of the pixel of the next frame of the corresponding video frame in the storage unit, and outputting the video frame of the next frame of the corresponding video frame in the storage unit after adjustment to be a second correction video frame;
s502: decomposing RGB channels of pixels of the second correction video frame, comparing RGB values corresponding to R, G and B channels with RGB values of the first correction video respectively, if the RGB values corresponding to the decomposed R, G and B channels are the same as the adjusted amplitude of the first correction video frame, outputting the second correction video frame by a projector, if the RGB values corresponding to the decomposed R, G and B channels are different from the adjusted amplitude of the first correction video frame, graying the R, G and B channels respectively to obtain R1 value gray coefficients, G1 value gray coefficients and B1 value gray coefficients, and comparing the R1 value gray coefficients, the G1 value gray coefficients and the B1 value gray coefficients with the RGB values of pixels of the corresponding video frames to obtain ratios R2, G2 and B2;
s503: comparing the ratios R2, G2 and B2 with the ratios R1, G1 and B1 to obtain difference values D, adding the difference values D with corresponding RGB values of R, G and B channels respectively to obtain a third corrected video frame, comparing the RGB values of pixels of the third corrected video frame with the RGB values of pixels of the second corrected video frame, and if the RGB values are the same, outputting the third corrected video frame by the projector.
8. A color correction apparatus for a fusion projection system, the apparatus comprising a projector processor, a projection lens, a light-sensing camera, and a storage unit, together with a computer program stored in the storage unit and executable by the projector processor, wherein the processor, when executing the computer program, implements the steps of the color correction method for a fusion projection system of any one of claims 1 to 7.
CN202210674432.5A 2022-06-14 2022-06-14 Color correction method and device for fusion projection system Pending CN115225873A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210674432.5A CN115225873A (en) 2022-06-14 2022-06-14 Color correction method and device for fusion projection system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210674432.5A CN115225873A (en) 2022-06-14 2022-06-14 Color correction method and device for fusion projection system

Publications (1)

Publication Number Publication Date
CN115225873A true CN115225873A (en) 2022-10-21

Family

ID=83607684

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210674432.5A Pending CN115225873A (en) 2022-06-14 2022-06-14 Color correction method and device for fusion projection system

Country Status (1)

Country Link
CN (1) CN115225873A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination