CN113489961B - Projection correction method, projection correction device, storage medium and projection equipment - Google Patents



Publication number
CN113489961B
Authority
CN
China
Prior art keywords
projection
imaging medium
coordinates
target
determining
Prior art date
Legal status
Active
Application number
CN202111050693.1A
Other languages
Chinese (zh)
Other versions
CN113489961A (en)
Inventor
付啸天
张聪
胡震宇
Current Assignee
Shenzhen Huole Science and Technology Development Co Ltd
Original Assignee
Shenzhen Huole Science and Technology Development Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Huole Science and Technology Development Co Ltd filed Critical Shenzhen Huole Science and Technology Development Co Ltd
Priority to CN202111050693.1A
Publication of CN113489961A
Application granted
Publication of CN113489961B


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • H04N9/3185Geometric adjustment, e.g. keystone or convergence

Abstract

The present disclosure relates to a projection correction method, apparatus, storage medium, and projection device. The method comprises: acquiring a target image captured by an image acquisition device; determining a partial boundary of the imaging medium area according to the target image; determining the coordinates of each vertex of the imaging medium according to the partial boundary; determining a target transformation relation according to the projection picture display area; determining a target projection area according to the vertex coordinates and the target transformation relation; and correcting the projection picture display area according to the target projection area so that the projection picture display area coincides with the imaging medium area. The target projection area can thus be determined from only a partial boundary of the imaging medium, combined with the target transformation relation, and used to correct the projection picture display area so that the projection of the projection device is aligned with the imaging medium. The operation is simple, and a projection effect meeting the user's requirements is obtained without manual adjustment by the user.

Description

Projection correction method, projection correction device, storage medium and projection equipment
Technical Field
The present disclosure relates to the field of projection technology, and in particular to a projection correction method and apparatus, a storage medium, and a projection device.
Background
With the continuous development of projection technology, short-focus and ultra-short-focus projection devices are becoming increasingly popular and are expected to become the mainstream of household projection devices. To obtain a better projection effect, a short-focus or ultra-short-focus projection device is usually used together with a projection screen, onto which the device can project a clear image.
However, in actual use, because a short-focus or ultra-short-focus projection device operates at a short distance and a large elevation angle, it is difficult for the user to align the projection picture projected onto the projection screen with the frame of the screen, which degrades the projection effect. Moreover, aligning the projection picture with the frame of the screen by manual adjustment is a complicated process that often fails to produce a projection effect meeting the user's requirements.
Disclosure of Invention
In order to solve the problems in the prior art, the present disclosure provides a projection correction method and apparatus, a storage medium, and a projection device that can automatically align the projection picture with the screen frame.
In order to achieve the above object, according to a first aspect of embodiments of the present disclosure, there is provided a projection correction method, the method including:
acquiring a target image shot by an image acquisition device; the target image comprises a projection picture display area and an imaging medium area;
determining a partial boundary of the imaging medium according to the target image, wherein the partial boundary of the imaging medium comprises a plurality of line segments that intersect or whose extension lines intersect;
determining coordinates of each vertex of the imaging medium according to the partial boundary of the imaging medium;
determining a target transformation relation according to the projection picture display area;
determining a target projection area according to the coordinates of each vertex of the imaging medium and the target transformation relation;
and correcting the projection picture display area according to the target projection area, so that the projection picture display area coincides with the imaging medium area.
Optionally, the determining coordinates of each vertex of the imaging medium according to the partial boundary of the imaging medium includes:
acquiring coordinates of intersection points of a plurality of line segments or coordinates of intersection points of extension lines of the plurality of line segments;
taking the coordinates of the intersection points as the coordinates of the known vertices of the imaging medium;
and determining the coordinates of other vertexes of the imaging medium according to the coordinates of the intersection points, the intersection included angles of the line segments and the preset proportional relation of the boundaries of the imaging medium.
Optionally, the determining a target transformation relation according to the projection screen display area includes:
determining a target transformation relation according to the projection picture display area and a reflection area preset on a digital micromirror device (DMD) of the projection device, wherein the target transformation relation represents a transformation between the imaging plane in which the projection picture display area lies and the reflection plane of the DMD.
Optionally, the determining a target transformation relationship according to the projection image display area and a reflection area preset on a digital micromirror device DMD of the projection device includes:
determining the coordinates of a first corner point of the projection picture display area according to the target image;
acquiring coordinates of a second corner point corresponding to the first corner point in a reflection area preset on the DMD;
and determining the target transformation relation according to the coordinates of the first corner point and the coordinates of the second corner point.
Optionally, the determining a target projection area according to the coordinates of each vertex of the imaging medium and the target transformation relation includes:
mapping each vertex of the imaging medium to the reflection plane according to the target transformation relation to obtain a point on the reflection plane corresponding to each vertex of the imaging medium;
and determining the target projection area according to the point on the reflection plane.
Optionally, the determining coordinates of each vertex of the imaging medium according to the partial boundary of the imaging medium includes:
if the number of the intersection points of the plurality of line segments is a preset value, taking the coordinates of the intersection points of the plurality of line segments as the coordinates of each vertex of the imaging medium;
or if the number of the intersection points of the extension lines of the line segments is a preset value, taking the coordinates of the intersection points of the extension lines of the line segments as the coordinates of each vertex of the imaging medium.
Optionally, the determining a partial boundary of the imaging medium according to the target image includes:
performing edge recognition on the target image, and extracting image edge features;
and performing noise reduction on the image edge features, performing line extraction on the noise-reduced edge features, and taking the extracted line segments as the partial boundary of the imaging medium.
According to a second aspect of the embodiments of the present disclosure, there is provided a projection correction apparatus, the apparatus including:
the acquisition module is used for acquiring a target image shot by the image acquisition device; the target image comprises a projection picture display area and an imaging medium area;
the determining module is used for determining the partial boundary of the imaging medium according to the target image, wherein the partial boundary of the imaging medium comprises a plurality of line segments that intersect or whose extension lines intersect;
the determining module is further configured to determine coordinates of each vertex of the imaging medium according to a partial boundary of the imaging medium;
the determining module is further configured to determine a target transformation relationship according to the projection picture display area;
the determining module is further configured to determine a target projection area according to the coordinates of each vertex of the imaging medium and the target transformation relation;
and the correction module is used for correcting the projection picture display area according to the target projection area, so that the projection picture display area coincides with the imaging medium area.
Optionally, the determining module includes:
the first acquisition submodule is used for acquiring the coordinates of the intersection points of the line segments or the coordinates of the intersection points of the extension lines of the line segments;
a first determining submodule for taking the coordinates of the intersection points as the coordinates of the known vertices of the imaging medium;
and the first determining submodule is also used for determining the coordinates of other vertexes of the imaging medium according to the coordinates of the intersection points, the intersection included angles of the line segments and the preset proportional relation of the imaging medium boundary.
Optionally, the determining module is configured to:
determining a target transformation relation according to the projection picture display area and a reflection area preset on a digital micromirror device (DMD) of the projection device, wherein the target transformation relation represents a transformation between the imaging plane in which the projection picture display area lies and the reflection plane of the DMD.
Optionally, the determining module includes:
the second determining submodule is used for determining the coordinate of the first corner point of the projection picture display area according to the target image;
the second obtaining submodule is used for obtaining the coordinates of a second corner point corresponding to the first corner point in a preset reflection area on the DMD;
and the second determining submodule is further configured to determine the target transformation relationship according to the coordinates of the first corner point and the coordinates of the second corner point.
Optionally, the determining module is configured to:
mapping each vertex of the imaging medium to the reflection plane according to the target transformation relation to obtain a point on the reflection plane corresponding to each vertex of the imaging medium;
and determining the target projection area according to the point on the reflection plane.
Optionally, the determining module is configured to:
if the number of the intersection points of the plurality of line segments is a preset value, taking the coordinates of the intersection points of the plurality of line segments as the coordinates of each vertex of the imaging medium;
or if the number of the intersection points of the extension lines of the line segments is a preset value, taking the coordinates of the intersection points of the extension lines of the line segments as the coordinates of each vertex of the imaging medium.
Optionally, the determining module includes:
the recognition submodule is used for carrying out edge recognition on the target image and extracting image edge features;
and the processing submodule is used for performing noise reduction on the image edge features, performing line extraction on the noise-reduced edge features, and taking the extracted line segments as the partial boundary of the imaging medium.
According to a third aspect of embodiments of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method of any one of the above first aspects.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a projection apparatus including:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to implement the steps of the method of any one of the first aspect.
According to the above technical scheme, a target image captured by an image acquisition device is first obtained, the target image including a projection picture display area and an imaging medium area, and a partial boundary of the imaging medium is determined from the target image, the partial boundary being a plurality of line segments that intersect or whose extension lines intersect. The coordinates of each vertex of the imaging medium are then determined from the partial boundary, a target transformation relation is determined from the projection picture display area, a target projection area is determined from the vertex coordinates and the target transformation relation, and finally the projection picture display area is corrected according to the target projection area so that it coincides with the imaging medium area. In this way, a target projection area that brings the projection picture display area into coincidence with the imaging medium area can be determined from only a partial boundary of the imaging medium combined with the target transformation relation, and automatic projection correction is achieved by using that area to correct the projection picture display area. The projection of the projection device is thereby aligned with the imaging medium; the operation is simple, and a projection effect meeting the user's requirements is obtained without manual adjustment by the user.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description serve to explain the disclosure without limiting the disclosure. In the drawings:
FIG. 1 is a flow chart illustrating a method of projection correction according to an exemplary embodiment;
FIG. 2 is a flow chart of one step 103 shown in the embodiment of FIG. 1;
FIG. 3 is a schematic illustration of an imaging media area shown in accordance with an exemplary embodiment;
FIG. 4 is a schematic illustration of another imaging media area shown in accordance with an exemplary embodiment;
FIG. 5 is a flow chart illustrating one step 104 of the embodiment shown in FIG. 1;
FIG. 6 is a schematic illustration of a reflective region and a projected picture according to an exemplary embodiment;
FIG. 7 is a flow chart illustrating one step 102 of the embodiment shown in FIG. 1;
FIG. 8 is a block diagram illustrating a projection correction device in accordance with an exemplary embodiment;
FIG. 9 is a block diagram of a determination module shown in the embodiment of FIG. 8;
FIG. 10 is a block diagram of another determination module shown in the embodiment of FIG. 8;
FIG. 11 is a block diagram of yet another determination module shown in the embodiment shown in FIG. 8;
fig. 12 is a block diagram illustrating a projection device according to an example embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
Before describing the projection correction method, apparatus, storage medium, and projection device provided by the present disclosure, an application scenario involved in the various embodiments is first described. The scenario may include a projection device, an image acquisition apparatus, and an imaging medium. The projection device projects the image or video it plays onto the imaging medium, so that the corresponding projection picture is displayed on the imaging medium. The projection device may be a short-focus or ultra-short-focus projector. The image acquisition apparatus is disposed on the projection device and is used to photograph the projection picture and the area in which the imaging medium is located; it may be a camera or an image sensor. The imaging medium may be a projection screen, a wall surface, a hard screen, or any other surface capable of displaying the projection picture.
FIG. 1 is a flow chart illustrating a method of projection correction according to an exemplary embodiment. As shown in fig. 1, the method may include the steps of:
Step 101, acquiring a target image captured by an image acquisition device. The target image includes a projection picture display area and an imaging medium area.
For example, when the projection device projects onto the imaging medium and the user finds that the projection picture is not aligned with the frame of the imaging medium, the position of the projection picture may be corrected by sending a correction instruction to the projection device. For instance, a correction button may be provided on the projection device, and pressing it sends the correction instruction. Alternatively, when the automatic screen-alignment function is enabled, the projection device may trigger the correction instruction automatically upon detecting that the imaging medium area is completely contained in the projection picture display area, or that the projection picture display area partially overlaps the imaging medium area.
After receiving the correction instruction, the projection device may first project a preset image onto the imaging medium to obtain a projection picture corresponding to the preset image displayed on the imaging medium. The preset image is a calibration image (e.g., a checkerboard image) for calibration, and the calibration image can be used to obtain relevant information of the projection picture, such as the shape of the projection picture, whether the projection picture has distortion, and the like. Then, the image acquisition device can shoot the area where the projection picture and the imaging medium are located to obtain a target image, and the target image is further sent to the projection equipment. The target image comprises a projection image display area corresponding to a projection image shot by the image acquisition device and an imaging medium area corresponding to an imaging medium shot by the image acquisition device.
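As an illustration of the preset calibration image mentioned above, the sketch below generates a checkerboard pattern; the function name and default sizes are illustrative assumptions, not values specified by the disclosure.

```python
import numpy as np

def make_checkerboard(rows=6, cols=8, square=32):
    """Generate a checkerboard calibration image (uint8, values 0/255).

    rows, cols: number of checker squares; square: side length in pixels.
    """
    tile = np.indices((rows, cols)).sum(axis=0) % 2   # 0/1 checker pattern
    board = np.kron(tile, np.ones((square, square)))  # expand each square
    return (board * 255).astype(np.uint8)
```

The resulting image would be handed to the projector, and its strong, regular corners make the projected picture easy to locate in the camera's target image.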
Step 102, determining a partial boundary of the imaging medium according to the target image. The partial boundary of the imaging medium comprises a plurality of line segments that intersect or whose extension lines intersect.
For example, to ensure that the projection picture is aligned with the frame of the imaging medium, the boundary position of the imaging medium may be determined from the imaging medium area in the target image, and the position of the projection picture corrected based on that boundary position. If the position of the entire boundary of the imaging medium (i.e., the positions of all 4 edges) is to be determined, the target image must contain all 4 edges of the imaging medium. In practice, however, because of the pose relationship between the projection device and the imaging medium, the target image captured by the image acquisition apparatus may not contain all 4 edges, or the edge features of the imaging medium in the target image may be weak, so that an edge appears broken in the middle or a corner at the end of an edge is missing. In such cases the full boundary position of the imaging medium cannot be determined, and the position of the projection picture cannot be accurately corrected. In addition, guaranteeing that the target image contains all 4 edges places a high demand on the field of view of the image acquisition apparatus, which would then need a fisheye lens. A fisheye lens is expensive and bulky, and it must protrude beyond the housing of the projection device to keep the outer edge of its field of view unobstructed, which constrains the industrial design (ID) of the projection device.
To avoid the problem that the position of the projection picture cannot be accurately corrected when the entire boundary of the imaging medium cannot be determined, the position of the projection picture can instead be corrected from the position of only a partial boundary of the imaging medium; that is, the target image only needs to contain part of the imaging medium. Specifically, the projection device may first analyze the target image to determine a partial boundary of the imaging medium. The partial boundary may be a plurality of line segments that intersect, or whose extension lines intersect, and each line segment may be a complete edge of the imaging medium area or only part of a complete edge.
For example, where the imaging medium is a rectangular projection screen with an aspect ratio of 16:9, the partial boundary may include any 3 connected boundaries of the imaging medium area, such as the upper, left, and right boundaries, or the upper, left, and lower boundaries. Here the upper boundary is the boundary corresponding to the upper frame of the projection screen in the target image, the left boundary corresponds to the left frame, the right boundary to the right frame, and the lower boundary to the lower frame. Furthermore, since the position of the entire boundary need not be determined, the demand on the field of view of the image acquisition apparatus is lower. The image acquisition apparatus can therefore use a wide-angle or ordinary lens, which is cheaper and smaller, reducing cost and the structural space occupied while satisfying the ID design requirements of the projection device.
Step 103, determining the coordinates of each vertex of the imaging medium according to the partial boundary of the imaging medium.
For example, the projection device may first determine the coordinates of some vertices of the imaging medium using the partial boundary. Taking a rectangular projection screen with an aspect ratio of 16:9 as an example, where the partial boundary includes 3 boundaries such as the upper, left, and right boundaries, the projection device may first determine the coordinates of the two upper vertices of the imaging medium and the included angles between the upper boundary and the left and right boundaries. It may then determine the coordinates of the two lower vertices from the coordinates and included angles of the upper vertices together with the preset aspect ratio of the imaging medium. In this manner, two vertex positions are determined directly from the partial boundary, and the positions of the other two vertices are calculated.
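The vertex-completion idea of step 103 can be sketched under a strong simplifying assumption: if the imaging medium appears undistorted (a fronto-parallel view), the two lower vertices follow directly from the upper vertices, the detected side-boundary directions, and the preset 16:9 proportion. All names below are hypothetical, and a real implementation must also account for perspective, as the disclosure describes.

```python
import math

def complete_rectangle(a, b, dir_ac, dir_bd, aspect=16 / 9):
    """Estimate the two lower vertices C and D of the imaging medium.

    Fronto-parallel sketch of step 103: a and b are the known upper
    vertices, dir_ac/dir_bd are unit direction vectors of the detected
    left and right boundaries, and the preset width:height proportion
    fixes the side length |AC| = |AB| / aspect.
    """
    ab = math.dist(a, b)   # width of the imaging medium in the image
    side = ab / aspect     # height implied by the preset proportion
    c = (a[0] + dir_ac[0] * side, a[1] + dir_ac[1] * side)
    d = (b[0] + dir_bd[0] * side, b[1] + dir_bd[1] * side)
    return c, d
```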
Step 104, determining a target transformation relation according to the projection picture display area.
In this step, the projection device may determine the target transformation relation according to the projection picture display area and a reflection area preset on a DMD (digital micromirror device) of the projection device. The target transformation relation represents the transformation between the imaging plane in which the projection picture display area lies and the reflection plane of the DMD. The reflection area may be understood as an area preset on the reflection plane of the DMD to reflect the image to be projected, for example a 16:9 rectangular area.
The target transformation relation may be determined, for example, as follows: the projection device first selects a plurality of feature points from the preset image and determines the corresponding coordinates of each feature point in the reflection area. It then searches the projection picture display area for the projected feature point corresponding to each feature point and determines the coordinates of each projected feature point. Finally, the target transformation relation is determined from the coordinates of the feature points and the coordinates of the projected feature points. The number of feature points is at least 4.
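One standard way to realize such a target transformation from at least 4 feature-point correspondences is to estimate a homography with the direct linear transform (DLT); the sketch below illustrates that general technique, not the disclosure's specified algorithm, and the names are hypothetical.

```python
import numpy as np

def estimate_homography(src_pts, dst_pts):
    """Estimate the 3x3 homography H with dst ~ H @ src via the DLT.

    src_pts/dst_pts: sequences of at least 4 (x, y) correspondences,
    e.g. projected feature points in the camera image and their preset
    coordinates in the DMD reflection area.
    """
    rows = []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography vector is the null space of the stacked system,
    # i.e. the right singular vector for the smallest singular value.
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    h = vt[-1]
    return (h / h[-1]).reshape(3, 3)  # normalise so H[2, 2] == 1
```

With noisy detections one would use more than 4 correspondences (the SVD then gives a least-squares fit) or a robust estimator.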
Step 105, determining a target projection area according to the coordinates of each vertex of the imaging medium and the target transformation relation.
For example, the projection device may map each vertex of the imaging medium to the reflection plane according to the target transformation relation, obtaining a point on the reflection plane corresponding to each vertex. Because the projection picture is formed by light that is reflected by the reflection plane of the DMD and then projected onto the plane of the imaging medium, confining the reflecting area of the DMD to the region enclosed by the points corresponding to the vertices of the imaging medium ensures that the projection picture is aligned with the frame of the imaging medium. After the points on the reflection plane corresponding to the respective vertices are determined, the target projection area can be determined from those points. The target projection area may be understood as the actual projection area, on the plane of the imaging medium, that ensures the projection picture is aligned with the frame of the imaging medium.
Step 106, correcting the projection picture display area according to the target projection area, so that the projection picture display area coincides with the imaging medium area.
For example, the projection device corrects the projection picture display area according to the target projection area so that it coincides with the imaging medium area and the projection picture is aligned with the frame of the imaging medium, thereby achieving automatic screen alignment of the projection device.
In summary, the present disclosure first obtains a target image captured by an image acquisition device, the target image including a projection picture display area and an imaging medium area, and determines a partial boundary of the imaging medium from the target image, the partial boundary being a plurality of line segments that intersect or whose extension lines intersect. The coordinates of each vertex of the imaging medium are then determined from the partial boundary, a target transformation relation is determined from the projection picture display area, a target projection area is determined from the vertex coordinates and the target transformation relation, and finally the projection picture display area is corrected according to the target projection area so that it coincides with the imaging medium area. A target projection area that brings the projection picture display area into coincidence with the imaging medium area can thus be determined from only a partial boundary of the imaging medium combined with the target transformation relation, and automatic projection correction is achieved by using that area to correct the projection picture display area. The projection of the projection device is thereby aligned with the imaging medium; the operation is simple, and a projection effect meeting the user's requirements is obtained without manual adjustment by the user.
Fig. 2 is a flow chart illustrating step 103 of the embodiment shown in fig. 1. As shown in fig. 2, step 103 may include the following steps:
Step 1031: obtain the coordinates of the intersection points of the plurality of line segments, or the coordinates of the intersection points of their extension lines.
Step 1032: take the coordinates of the intersection points as the coordinates of the known vertices of the imaging medium.
Step 1033: determine the coordinates of the other vertices of the imaging medium according to the coordinates of the intersection points, the included angles at which the line segments intersect, and the preset proportional relationship of the imaging medium boundary.
In one scenario, a feature processing module may further be provided in the projection device. The feature processing module may first take the coordinates of the intersection points of the plurality of line segments in the partial boundary, or the coordinates of the intersection points of the extension lines of those line segments, as the coordinates of the known vertices of the imaging medium. The coordinates of the other vertices of the imaging medium may then be determined from the known vertex coordinates, the included angles at which the line segments intersect (for example, when the projection device captures 3 boundaries of the imaging medium in the target image, the line segments representing the 3 boundaries form two included angles), and the proportional relationship of the imaging medium boundary.
As shown in fig. 3, the area enclosed by the dotted line in fig. 3 is the edge of the target image, and the area formed by A, B, C, and D is the imaging medium area. The imaging medium is actually rectangular; because of the shooting angle of the projection device, distortion, and so on, its shape in the target image does not appear as a rectangle. When the partial boundary includes line segments AB, AC, and BD, the intersection points of the line segments are the two upper vertices of the imaging medium, i.e., the coordinates of the known vertices are the coordinates of the two upper vertices (the coordinates of A and B), and the included angles are the angle between segments AB and AC and the angle between segments AB and BD. The projection device may then determine the coordinates of the two lower vertices of the imaging medium (the coordinates of C and D) from the coordinates of the two upper vertices and the included angles, combined with the proportional relationship of the imaging medium boundary; the coordinates of the other vertices are thus the coordinates of the two lower vertices.
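Steps 1031 and 1032 reduce to classic line–line intersection. The sketch below is a minimal illustration, not the patented implementation (the function name and coordinate convention are assumptions): it treats each detected boundary segment as an infinite line, which also covers the case where only the extension lines of the segments intersect.

```python
def line_intersection(p1, p2, p3, p4):
    """Intersection of the infinite lines through (p1, p2) and (p3, p4).

    Returns None for (near-)parallel lines. Extending the segments to
    infinite lines is what lets a vertex lying outside the captured
    segments (an 'intersection of extension lines') be recovered.
    """
    (x1, y1), (x2, y2) = p1, p2
    (x3, y3), (x4, y4) = p3, p4
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if abs(denom) < 1e-9:
        return None  # parallel or coincident lines: no unique intersection
    det12 = x1 * y2 - y1 * x2  # cross product of the first line's endpoints
    det34 = x3 * y4 - y3 * x4  # cross product of the second line's endpoints
    x = (det12 * (x3 - x4) - (x1 - x2) * det34) / denom
    y = (det12 * (y3 - y4) - (y1 - y2) * det34) / denom
    return (x, y)
```

In the configuration of fig. 3, applying this to segments AB and AC, and to segments AB and BD, recovers the two known upper vertices A and B.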
Alternatively, step 103 may be implemented by:
1) If the number of intersection points of the plurality of line segments is a preset value, take the coordinates of those intersection points as the coordinates of the vertices of the imaging medium.
2) Otherwise, if the number of intersection points of the extension lines of the line segments is the preset value, take the coordinates of those intersection points as the coordinates of the vertices of the imaging medium.
In another scenario, the preset value is the actual number of vertices of the imaging medium. Taking a rectangular imaging medium as an example, if the number of intersection points of the plurality of line segments is 4 (that is, the preset value is 4), each boundary of the imaging medium (or a part of each boundary) is included in the partial boundary; the intersection points of the line segments are then the vertices of the imaging medium, and their coordinates can be taken as the vertex coordinates. Likewise, if the number of intersection points of the extension lines of the line segments is 4, each boundary (or a part of each boundary) is included in the partial boundary; the intersection points of the extension lines are then the vertices of the imaging medium, and their coordinates can be taken as the vertex coordinates.
For example, as shown in fig. 4, the area enclosed by the dotted line in fig. 4 is the edge of the target image, and the area formed by a, b, c, d, e, and f is the imaging medium area. As can be seen from fig. 4, the target image does not include the middle portion of the lower boundary of the imaging medium, yet the line segments produce 4 intersection points, denoted a, b, c, and d, which can be taken directly as the 4 vertices of the imaging medium. In this case, the lower boundary of the imaging medium area can also be obtained by directly connecting the two bottom intersection points c and d.
Fig. 5 is a flow chart illustrating step 104 of the embodiment shown in fig. 1. As shown in fig. 5, step 104 may include the following steps:
Step 1041: determine the coordinates of the first corner points of the projection picture display area according to the target image.
Step 1042: obtain the coordinates of the second corner points corresponding to the first corner points in the reflection area preset on the DMD.
Step 1043: determine the target transformation relation according to the coordinates of the first corner points and the coordinates of the second corner points.
For example, the projection device may obtain the boundary of the projection picture display area from the target image and thereby determine the coordinates of the first corner points of the projection picture display area, where the first corner points are the vertices of the projection picture display area. Specifically, the preset image may be a checkerboard image, as shown in fig. 6. The optical engine of the projection device emits light according to the preset image; after the light is reflected by the reflection area on the DMD, a projection picture is formed in the area where the imaging medium is located (that is, the checkerboard is actually projected onto the imaging medium). The target image captured by the image acquisition device therefore also contains the checkerboard, and the projection device may use a checkerboard detection algorithm to detect the checkerboard in the target image, obtain the checkerboard coordinates, and calculate the coordinates of the first corner points of the projection picture display area from the prior information of the preset image, i.e., the coordinates of the 4 corners of the projection picture in fig. 6.
Then, the projection device may obtain the coordinates of the second corner points corresponding to the first corner points in the reflection area on the DMD, that is, the coordinates of the 4 corner points of the reflection area in fig. 6, and determine the target transformation relation from each first corner point and its corresponding second corner point. The second corner points are points on the DMD plane; transforming a first corner point through the target transformation relation yields the corresponding second corner point. Further, to ensure that the projection of the projection device is aligned with the frame of the imaging medium, the aspect ratio of the reflection area must match that of the imaging medium boundary; for example, when the imaging medium boundary has a 16:9 aspect ratio, the reflection area must also be 16:9.
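Steps 1041 to 1043 amount to estimating a plane-to-plane homography, which four corner correspondences determine up to scale. The pure-Python sketch below is an illustrative reconstruction, not the disclosure's code (function names and the h33 = 1 normalization are assumptions): each correspondence contributes two rows of the standard direct-linear-transform system, which is then solved by Gaussian elimination.

```python
def gauss_solve(a, b):
    """Solve a . x = b by Gaussian elimination with partial pivoting
    (sufficient for the well-conditioned 8x8 system built below)."""
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]  # augmented matrix
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):  # back substitution
        s = sum(m[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (m[r][n] - s) / m[r][r]
    return x

def homography_from_points(src, dst):
    """3x3 homography H mapping each src (x, y) to dst (u, v), with h33 = 1.

    Each correspondence yields two linear equations from
        u = (h11*x + h12*y + h13) / (h31*x + h32*y + 1)
        v = (h21*x + h22*y + h23) / (h31*x + h32*y + 1)
    """
    rows, rhs = [], []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); rhs.append(u)
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y]); rhs.append(v)
    h = gauss_solve(rows, rhs)
    return [[h[0], h[1], h[2]],
            [h[3], h[4], h[5]],
            [h[6], h[7], 1.0]]

def apply_homography(h, pt):
    """Map one point through H, with the projective division by w."""
    x, y = pt
    w = h[2][0] * x + h[2][1] * y + h[2][2]
    return ((h[0][0] * x + h[0][1] * y + h[0][2]) / w,
            (h[1][0] * x + h[1][1] * y + h[1][2]) / w)
```

Mapping each imaging-medium vertex through `apply_homography` then gives the corresponding points on the reflection plane, which bound the target projection area.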
Fig. 7 is a flow chart illustrating step 102 of the embodiment shown in fig. 1. As shown in fig. 7, step 102 may include the following steps:
Step 1021: perform edge recognition on the target image and extract image edge features.
Step 1022: perform noise reduction on the image edge features, perform straight-line extraction on the noise-reduced edge features, and take the extracted line segments as the partial boundary of the imaging medium.
For example, the projection device may further include a feature extraction module. After the projection device acquires the target image, the feature extraction module may perform edge recognition on the target image to obtain the image edge features of the imaging medium area, and reduce noise in the edge features by binarization to eliminate part of the noise. The feature extraction module may then apply a preset straight-line extraction algorithm to the noise-reduced edge features to obtain a plurality of line segments (i.e., line segments that may be boundaries of the imaging medium area), and take the extracted line segments as the partial boundary of the imaging medium. In addition, to improve the accuracy of determining the partial boundary, the feature extraction module may classify the extracted line segments with a classification tree built from prior knowledge, so that the classification tree selects the partial boundary of the imaging medium from the line segments according to parameters such as the angle, position, and length of each segment.
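Step 1021's edge recognition would typically use an operator such as Canny in practice; the minimal pure-Python sketch below (function name and threshold are illustrative assumptions, not the disclosure's implementation) shows the core idea of thresholding the gradient magnitude to obtain a binary edge map, from which step 1022's straight-line extraction can then pick candidate boundary segments.

```python
def edge_map(img, thresh=128):
    """Binary edge map: central-difference gradient magnitude, thresholded.

    `img` is a row-major list of lists of grayscale values.  Border pixels
    are left at 0 because the central difference is undefined there.
    """
    h, w = len(img), len(img[0])
    edges = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = img[y][x + 1] - img[y][x - 1]  # horizontal gradient
            gy = img[y + 1][x] - img[y - 1][x]  # vertical gradient
            if gx * gx + gy * gy >= thresh * thresh:
                edges[y][x] = 1                 # mark as edge pixel
    return edges
```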
Further, the target image acquired by the image acquisition device may contain various kinds of random noise and distortion. To perform projection correction more accurately, the visual quality and definition of the target image should be improved so that the image becomes easier for the device to process and its features easier to analyze and extract. Specifically, the projection device may be provided with a preprocessing module that preprocesses the target image when it is received. For example, the preprocessing module may first acquire the image parameters of the target image and determine whether they satisfy a preset parameter condition. The image parameters may include exposure parameters and contrast, and the preset parameter condition may be that the exposure parameters and the contrast fall within preset parameter ranges. If the image parameters do not satisfy the preset parameter condition, the image quality of the target image does not meet expectations, and the preprocessing module performs a preprocessing operation on the target image to obtain a processed target image. The preprocessing operation includes at least one of contrast enhancement, random noise removal, edge feature enhancement, and pseudo-color processing.
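As a rough illustration of the parameter check and one of the listed preprocessing operations (contrast enhancement), the sketch below tests the dynamic range of a grayscale image and linearly stretches it to the full 0–255 range. The threshold and function names are assumptions for illustration, not values from the disclosure.

```python
def contrast_ok(img, min_range=100):
    """Crude contrast check: dynamic range of the grayscale image
    must reach `min_range` (an assumed preset parameter condition)."""
    flat = [p for row in img for p in row]
    return max(flat) - min(flat) >= min_range

def stretch_contrast(img):
    """Linearly rescale pixel values to the full 0-255 range."""
    flat = [p for row in img for p in row]
    lo, hi = min(flat), max(flat)
    if hi == lo:                  # flat image: nothing to stretch
        return [row[:] for row in img]
    scale = 255.0 / (hi - lo)
    return [[round((p - lo) * scale) for p in row] for row in img]
```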
In addition, if the image acquisition device uses a wide-angle lens, the captured image exhibits obvious distortion; the preprocessing module may therefore further perform distortion correction on the processed target image and take the corrected image as the updated target image. For the method of performing distortion correction on the processed target image, reference may be made to the description in the related art, which is not limited in this disclosure.
Fig. 8 is a block diagram illustrating a projection correction apparatus according to an exemplary embodiment. As shown in fig. 8, the apparatus 200 includes:
the acquiring module 201 is configured to acquire a target image captured by an image capturing device, where the target image includes a projection image display area and an imaging medium area.
The determining module 202 is configured to determine a partial boundary of the imaging medium according to the target image, where the partial boundary of the imaging medium is a plurality of line segments intersecting with each other or extending lines intersecting with each other.
The determining module 202 is further configured to determine coordinates of vertices of the imaging medium according to the partial boundary of the imaging medium.
The determining module 202 is further configured to determine a target transformation relation according to the projection screen display area.
The determining module 202 is further configured to determine a target projection area according to the coordinates of each vertex of the imaging medium and the target transformation relationship.
And the correction module 203 is used for correcting the projection picture display area according to the target projection area, so that the projection picture display area and the imaging medium area are overlapped.
FIG. 9 is a block diagram of the determining module in the embodiment of FIG. 8. As shown in FIG. 9, the determining module 202 includes:
the first obtaining submodule 2021 is configured to obtain coordinates of an intersection of the plurality of line segments or coordinates of an intersection of extension lines of the plurality of line segments.
A first determining submodule 2022 for taking the coordinates of the intersection as the coordinates of the known vertex of the imaging medium.
The first determining sub-module 2022 is further configured to determine coordinates of other vertices of the imaging medium according to the coordinates of the intersection point, an included angle formed by the intersection of the plurality of line segments, and a preset proportional relationship of the imaging medium boundary.
Optionally, the determining module 202 is configured to:
and determining a target transformation relation according to the display area of the projection picture and a preset reflection area on a Digital Micromirror Device (DMD) of the projection equipment. The object transformation relation represents a transformation relation between an imaging plane where the projection screen display area is located and the reflection plane of the DMD.
FIG. 10 is a block diagram of another form of the determining module in the embodiment of FIG. 8. As shown in FIG. 10, the determining module 202 includes:
the second determining sub-module 2023 is configured to determine, according to the target image, coordinates of a first corner point of the display area of the projection screen.
The second obtaining sub-module 2024 is configured to obtain coordinates of a second corner point corresponding to the first corner point in the reflection area preset on the DMD.
The second determining submodule 2023 is further configured to determine the target transformation relationship according to the coordinates of the first corner point and the coordinates of the second corner point.
Optionally, the determining module 202 is configured to:
and mapping each vertex of the imaging medium to the reflecting plane according to the target transformation relation to obtain a point on the reflecting plane corresponding to each vertex of the imaging medium.
The target projection area is determined from points on the reflection plane.
Optionally, the determining module 202 is configured to:
If the number of intersection points of the plurality of line segments is a preset value, take the coordinates of those intersection points as the coordinates of the vertices of the imaging medium.
Or, if the number of intersection points of the extension lines of the line segments is the preset value, take the coordinates of those intersection points as the coordinates of the vertices of the imaging medium.
FIG. 11 is a block diagram of yet another form of the determining module in the embodiment of FIG. 8. As shown in FIG. 11, the determining module 202 includes:
the identifying sub-module 2025 is configured to perform edge identification on the target image, and extract an image edge feature.
The processing sub-module 2026 is configured to perform noise reduction on the image edge feature, perform linear extraction on the image edge feature after the noise reduction, and use a line segment obtained by the linear extraction as a partial boundary of the imaging medium.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 12 is a block diagram illustrating a projection device 300 according to an example embodiment. As shown in fig. 12, the projection apparatus 300 may include: a processor 301 and a memory 302. The projection device 300 may also include one or more of a multimedia component 303, an input/output (I/O) interface 304, and a communication component 305.
The processor 301 is configured to control the overall operation of the projection device 300 so as to complete all or part of the steps of the projection correction method. The memory 302 is used to store various types of data to support operation of the projection device 300, such as instructions for any application or method operating on the projection device 300, as well as application-related data such as contact data, messages, pictures, audio, video, and the like. The memory 302 may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk. The multimedia component 303 may include a screen and an audio component, where the screen may be, for example, a touch screen, and the audio component is used to output and/or input audio signals. For example, the audio component may include a microphone for receiving external audio signals; a received audio signal may further be stored in the memory 302 or transmitted through the communication component 305. The audio component also includes at least one speaker for outputting audio signals. The I/O interface 304 provides an interface between the processor 301 and other interface modules, such as a keyboard, a mouse, or buttons, where the buttons may be virtual or physical. The communication component 305 is used for wired or wireless communication between the projection device 300 and other devices. The wireless communication may be, for example, Wi-Fi, Bluetooth, Near Field Communication (NFC), 2G, 3G, 4G, NB-IoT, eMTC, 5G, or a combination of one or more of them, which is not limited herein.
The corresponding communication component 305 may therefore include: Wi-Fi module, Bluetooth module, NFC module, etc.
In an exemplary embodiment, the projection device 300 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components for performing the projection correction method described above.
In another exemplary embodiment, a computer-readable storage medium is also provided, which comprises program instructions, which when executed by a processor, implement the steps of the projection correction method described above. For example, the computer readable storage medium may be the memory 302 described above including program instructions that are executable by the processor 301 of the projection device 300 to perform the projection correction method described above.
In another exemplary embodiment, a computer program product is also provided, which comprises a computer program executable by a programmable apparatus, the computer program having code portions for performing the projection correction method described above when executed by the programmable apparatus.
The preferred embodiments of the present disclosure are described in detail with reference to the accompanying drawings, however, the present disclosure is not limited to the specific details of the above embodiments, and various simple modifications may be made to the technical solution of the present disclosure within the technical idea of the present disclosure, and these simple modifications all belong to the protection scope of the present disclosure.
It should be noted that, in the foregoing embodiments, various features described in the above embodiments may be combined in any suitable manner, and in order to avoid unnecessary repetition, various combinations that are possible in the present disclosure are not described again.
In addition, any combination of various embodiments of the present disclosure may be made, and the same should be considered as the disclosure of the present disclosure, as long as it does not depart from the spirit of the present disclosure.

Claims (9)

1. A method of projection correction, the method comprising:
acquiring a target image shot by an image acquisition device; the target image comprises a part of an imaging medium, and the target image comprises a projection picture display area and an imaging medium area;
determining a partial boundary of the imaging medium according to the target image, wherein the partial boundary of the imaging medium is a plurality of line segments which are intersected or the extension lines of which are intersected;
determining coordinates of each vertex of the imaging medium according to the partial boundary of the imaging medium;
determining a target transformation relation according to the projection picture display area;
determining a target projection area according to the coordinates of each vertex of the imaging medium and the target transformation relation;
correcting the projection picture display area according to the target projection area, so that the projection picture display area and the imaging medium area are overlapped;
the determining coordinates of each vertex of the imaging media based on the partial boundary of the imaging media comprises:
acquiring coordinates of intersection points of a plurality of line segments or coordinates of intersection points of extension lines of the plurality of line segments;
taking the coordinates of the intersection point as the coordinates of the known vertex of the imaging medium;
and determining the coordinates of other vertices of the imaging medium according to the coordinates of the intersection points, the included angles, and a preset proportional relation of the imaging medium boundary, wherein the included angles are the angles formed by the intersection of the plurality of line segments or by the intersection of the extension lines of the plurality of line segments.
2. The method of claim 1, wherein determining a target transformation relationship based on the projection screen display area comprises:
determining a target transformation relation according to the projection picture display area and a reflection area preset on a Digital Micromirror Device (DMD) of the projection equipment; the target transformation relation represents a transformation relation between an imaging plane where the projection picture display area is located and a reflection plane of the DMD.
3. The method according to claim 2, wherein the determining the target transformation relation according to the display area of the projection picture and a preset reflection area on a Digital Micromirror Device (DMD) of the projection device comprises:
determining the coordinates of a first corner point of the projection picture display area according to the target image;
acquiring coordinates of a second corner point corresponding to the first corner point in a reflection area preset on the DMD;
and determining the target transformation relation according to the coordinates of the first corner point and the coordinates of the second corner point.
4. The method of claim 2, wherein said determining a target projection area based on coordinates of vertices of said imaging media and said target transformation relationship comprises:
mapping each vertex of the imaging medium to the reflection plane according to the target transformation relation to obtain a point on the reflection plane corresponding to each vertex of the imaging medium;
and determining the target projection area according to the point on the reflection plane.
5. The method of claim 1, wherein said determining coordinates of each vertex of said imaging media from a partial boundary of said imaging media comprises:
if the number of the intersection points of the plurality of line segments is a preset value, taking the coordinates of the intersection points of the plurality of line segments as the coordinates of each vertex of the imaging medium;
or if the number of the intersection points of the extension lines of the line segments is a preset value, taking the coordinates of the intersection points of the extension lines of the line segments as the coordinates of each vertex of the imaging medium.
6. The method of claim 1, wherein said determining a partial boundary of said imaging media from said target image comprises:
performing edge recognition on the target image, and extracting image edge features;
and performing noise reduction on the image edge features, performing linear extraction on the image edge features subjected to the noise reduction, and taking line segments obtained by the linear extraction as partial boundaries of the imaging medium.
7. A projection correction apparatus, characterized in that the apparatus comprises:
the acquisition module is used for acquiring a target image shot by the image acquisition device; the target image comprises a part of an imaging medium, and the target image comprises a projection picture display area and an imaging medium area;
the determining module is used for determining the partial boundary of the imaging medium according to the target image, wherein the partial boundary of the imaging medium is a plurality of line segments which are intersected or the extension lines of which are intersected;
the determining module is further configured to determine coordinates of each vertex of the imaging medium according to a partial boundary of the imaging medium;
the determining module is further configured to determine a target transformation relationship according to the projection picture display area;
the determining module is further configured to determine a target projection area according to the coordinates of each vertex of the imaging medium and the target transformation relation;
the correction module is used for correcting the projection picture display area according to the target projection area so that the projection picture display area is superposed with the imaging medium area;
the determining module comprises:
the first acquisition submodule is used for acquiring the coordinates of the intersection points of the line segments or the coordinates of the intersection points of the extension lines of the line segments;
a first determining submodule for taking the coordinates of the intersection point as the coordinates of the known vertex of the imaging medium;
the first determining submodule is further used for determining the coordinates of other vertices of the imaging medium according to the coordinates of the intersection points, the included angles, and the preset proportional relation of the imaging medium boundary, wherein the included angles are the angles formed by the intersection of the plurality of line segments or by the intersection of the extension lines of the plurality of line segments.
8. A non-transitory computer readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 6.
9. A projection device, comprising:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to carry out the steps of the method of any one of claims 1 to 6.
CN202111050693.1A 2021-09-08 2021-09-08 Projection correction method, projection correction device, storage medium and projection equipment Active CN113489961B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111050693.1A CN113489961B (en) 2021-09-08 2021-09-08 Projection correction method, projection correction device, storage medium and projection equipment

Publications (2)

Publication Number Publication Date
CN113489961A (en) 2021-10-08
CN113489961B (en) 2022-03-22

Family

ID=77946759

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111050693.1A Active CN113489961B (en) 2021-09-08 2021-09-08 Projection correction method, projection correction device, storage medium and projection equipment

Country Status (1)

Country Link
CN (1) CN113489961B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114466173A (en) * 2021-11-16 2022-05-10 海信视像科技股份有限公司 Projection equipment and projection display control method for automatically throwing screen area
CN114339179A (en) * 2021-12-23 2022-04-12 深圳市火乐科技发展有限公司 Projection correction method, projection correction device, storage medium and projection equipment
CN114286069B (en) * 2021-12-31 2024-04-02 深圳市火乐科技发展有限公司 Projection picture processing method and device, storage medium and projection equipment
CN114422763B (en) * 2022-01-14 2024-02-09 深圳市火乐科技发展有限公司 Screen function verification method and device, computer equipment and storage medium
CN114401388A (en) * 2022-01-26 2022-04-26 深圳市火乐科技发展有限公司 Projection method, projection device, storage medium and projection equipment
CN114449249B (en) * 2022-01-29 2024-02-09 深圳市火乐科技发展有限公司 Image projection method, image projection device, storage medium and projection apparatus
CN114827562A (en) * 2022-03-11 2022-07-29 深圳海翼智新科技有限公司 Projection method, projection device, projection equipment and computer storage medium
CN114615480A (en) * 2022-03-11 2022-06-10 峰米(重庆)创新科技有限公司 Projection picture adjusting method, projection picture adjusting device, projection picture adjusting apparatus, storage medium, and program product
CN117014589B (en) * 2023-09-27 2023-12-19 北京凯视达科技股份有限公司 Projection method, projection device, electronic equipment and storage medium

Citations (3)

Publication number Priority date Publication date Assignee Title
CN107547881A * 2016-06-24 2018-01-05 Shanghai Shunjiu Electronic Technology Co., Ltd. Automatic correction method and device for projection imaging, and laser television
CN109618141A * 2018-12-24 2019-04-12 Beijing Yichuangying Internet Technology Co., Ltd. Ultra-short-throw projection method and device for passenger-facing new media
CN110798670A * 2019-11-11 2020-02-14 Chengdu XGIMI Technology Co., Ltd. Ultra-short-throw picture-to-screen alignment method and device, ultra-short-throw projection device, and medium

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
JP3925521B2 * 2004-08-19 2007-06-06 Seiko Epson Corporation Keystone correction using a portion of the screen edge

Similar Documents

Publication Publication Date Title
CN113489961B (en) Projection correction method, projection correction device, storage medium and projection equipment
US11115565B2 (en) User feedback for real-time checking and improving quality of scanned image
CN113365041B (en) Projection correction method, projection correction device, storage medium and electronic equipment
EP3092790B1 (en) Adaptive camera control for reducing motion blur during real-time image capture
CN112272292B (en) Projection correction method, apparatus and storage medium
CN109479082B (en) Image processing method and apparatus
WO2016101524A1 (en) Method and apparatus for correcting inclined shooting of object being shot, mobile terminal, and storage medium
CN106464799A (en) Automatic zooming method and device
CN111340960B (en) Image modeling method and device, storage medium and electronic equipment
JP2017130794A (en) Information processing apparatus, evaluation chart, evaluation system, and performance evaluation method
CN110738604A (en) Canny operator and Hough transformation-based large-dip angle certificate image correction method and system
JP2019169914A (en) Projection control device, marker detection method, and program
CN115086631B (en) Image generating method and information processing apparatus
WO2018099128A1 (en) Method and device used in projection apparatus
CN114401388A (en) Projection method, projection device, storage medium and projection equipment
CN114339179A (en) Projection correction method, projection correction device, storage medium and projection equipment
JP2013153392A (en) Image processing apparatus and image processing method, and program
US20230040505A1 (en) Projector and control method therefor
CN115883794A (en) Projection method, projection device, storage medium and projection equipment
JP2013171490A (en) Touch position input device and touch position input method
CN114727074A (en) Projection correction method and projection correction device for projection device and projection device
JP2019169913A (en) Projection control device, projection device, correction image projection method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant