CN114339179A - Projection correction method, projection correction device, storage medium and projection equipment - Google Patents


Info

Publication number
CN114339179A
CN114339179A (application CN202111590617.XA)
Authority
CN
China
Prior art keywords
projection
image
area
target
picture
Prior art date
Legal status (assumption, not a legal conclusion)
Pending
Application number
CN202111590617.XA
Other languages
Chinese (zh)
Inventor
孙世攀
张聪
胡震宇
Current Assignee (the listed assignee may be inaccurate)
Shenzhen Huole Science and Technology Development Co Ltd
Original Assignee
Shenzhen Huole Science and Technology Development Co Ltd
Priority date (assumption, not a legal conclusion)
Filing date
Publication date
Application filed by Shenzhen Huole Science and Technology Development Co Ltd
Priority to CN202111590617.XA
Publication of CN114339179A

Abstract

The disclosure relates to a projection correction method, a projection correction apparatus, a storage medium, and a projection device, in the technical field of projection. The method comprises: projecting a feature image onto a target projection area and acquiring a captured image of that area, the feature image containing scale marks distributed in a preset image region; determining a target picture area in the captured image based on the scale marks of the projection picture in the captured image, the target picture area being the region of the captured image onto which the feature image in the modulation plane is mapped; and correcting the projection image on the modulation plane based on the target picture area. The method markedly improves the speed of projection correction and suits both single-plane target projection areas and complex multi-plane ones. At the same time, the corrected image matches the user's viewing angle, providing the best projection viewing experience.

Description

Projection correction method, projection correction device, storage medium and projection equipment
Technical Field
The present disclosure relates to the field of projection technologies, and in particular, to a projection correction method, an apparatus, a storage medium, and a projection device.
Background
With the popularization of mobile and rotatable projection devices, users of such devices often project a picture onto a complex three-dimensional surface — for example, the internal and external corners of a wall, the corners of a ceiling, and similar surfaces.
In the related art, when a projection picture is projected onto a complex three-dimensional surface, the picture is generally corrected by encoding the structured light projected by the device and then comparing the distorted image of the current projection picture with a pre-calibrated encoded picture. This correction method, however, requires a lengthy calibration process; the resulting slow projection correction degrades the user's viewing experience.
Disclosure of Invention
An object of the present disclosure is to provide a projection correction method, a projection correction apparatus, a storage medium, and a projection device that improve the correction speed of a projection device facing a complex projection surface.
In a first aspect, an embodiment of the present disclosure provides a projection correction method, including:
projecting a characteristic image to a target projection area, and acquiring a shot image of the target projection area, wherein the characteristic image comprises scale marks distributed in an image preset area;
determining a target picture area in the shot image based on the scale marks of a projection picture in the shot image, wherein the target picture area is an area of a feature image in a modulation plane mapped on the shot image;
correcting the projected image on the modulation plane based on the target picture area.
In a second aspect, an embodiment of the present disclosure provides a projection correction apparatus, including:
the system comprises an acquisition module, a display module and a display module, wherein the acquisition module is configured to project a characteristic image to a target projection area and acquire a shot image of the target projection area, and the characteristic image comprises scale marks distributed in an image preset area;
a picture positioning module configured to determine a target picture area in the captured image based on the scale mark of a projection picture in the captured image, wherein the target picture area is an area where a feature image in a modulation plane is mapped on the captured image;
a correction module configured to correct a projected image on the modulation plane based on the target picture area.
In a third aspect, the disclosed embodiments provide a non-transitory computer readable storage medium, on which a computer program is stored, which when executed by a processor, implements the steps of the method of the first aspect.
In a fourth aspect, an embodiment of the present disclosure provides a projection apparatus, including:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to implement the steps of the method of the first aspect.
With the above technical scheme, a feature image containing scale marks is projected onto the target projection area; from the scale marks of the projection picture in the captured image of that area, the region of the captured image onto which the feature image in the modulation plane is mapped — the target picture area — is determined; and the projection image in the modulation plane is corrected using the target picture area, so that the picture projected onto the target projection area coincides with it. The projection correction method of the disclosed embodiments requires neither pre-encoding the image to be projected nor computing the position of the target projection area, and therefore markedly speeds up projection correction. It applies to single-plane target projection areas as well as to complex multi-plane ones. Moreover, because the correction is driven by a captured image, the corrected picture matches the user's viewing angle, giving the user the best projection viewing experience.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description serve to explain the disclosure without limiting the disclosure. In the drawings:
FIG. 1 is a schematic flow chart diagram of a projection correction method provided in accordance with an exemplary embodiment;
FIG. 2 is a schematic illustration of a feature image provided in accordance with an exemplary embodiment;
FIG. 3 is a schematic illustration of a captured image provided in accordance with an exemplary embodiment;
FIG. 4 is a schematic illustration of a target picture region provided in accordance with an exemplary embodiment;
FIG. 5 is a schematic flow chart illustrating step 120 of FIG. 1;
FIG. 6 is a schematic illustration of a target projection area provided in accordance with an exemplary embodiment;
FIG. 7 is a schematic illustration of determining a second line of intersection provided in accordance with an exemplary embodiment;
FIG. 8 is a schematic illustration of determining a second line of intersection provided in accordance with another exemplary embodiment;
FIG. 9 is a flow diagram providing projection correction based on a target picture region in accordance with an exemplary embodiment;
FIG. 10 is a schematic illustration of projection correction provided in accordance with an exemplary embodiment;
FIG. 11 is a diagram of effects after projection correction provided in accordance with an exemplary embodiment;
FIG. 12 is a block diagram illustrating the connection of modules of a projection correction apparatus according to an exemplary embodiment;
fig. 13 is a block diagram illustrating a projection device according to an example embodiment.
Detailed Description
The following detailed description of specific embodiments of the present disclosure is provided in connection with the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating the present disclosure, are given by way of illustration and explanation only, not limitation.
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order, and/or performed in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that the modifiers "a", "an", and "the" in this disclosure are illustrative rather than limiting; those skilled in the art will understand them to mean "one or more" unless the context clearly indicates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
Fig. 1 is a flowchart illustrating a projection correction method according to an exemplary embodiment. The projection correction method disclosed in this embodiment may be executed by a projection device, and specifically may be executed by a projection correction apparatus, which may be implemented by software and/or hardware and configured in the projection device. As shown in fig. 1, the projection correction method may include the following steps.
In step 110, a characteristic image is projected to the target projection area, and a shot image of the target projection area is obtained, wherein the characteristic image includes scale marks distributed in a preset area of the image.
Here, the target projection area refers to the medium that carries the projection picture, such as a wall surface or a curtain. It may be a single plane — a curtain, a flat wall surface, and the like — or a solid composed of two or more non-coplanar planes, such as a wall corner with internal and external corners or a ceiling corner. Internal and external corners are building features: an internal corner is a recessed angle, such as that between the top surface and the surrounding walls; an external corner is a protruding angle, such as that formed by two walls at the turn of a corridor.
After projecting the feature image onto the target projection area, the projection device acquires a captured image of that area and performs projection correction based on it.
In some embodiments, after the projection device projects the feature image onto the target projection area, it may acquire a captured image of the area through an image capture device provided on the projection device — for example, through a wide-angle lens.
In other embodiments, after the projection device projects the feature image onto the target projection area, it receives a captured image of that area taken and sent by a mobile terminal. For example, the user photographs the target projection area with the mobile terminal and transmits the resulting image to the projection device.
It should be understood that a captured image acquired by the mobile terminal corresponds to the picture seen from the user's viewpoint. Performing projection correction against such an image therefore makes the corrected picture match the user's current viewing angle and provides the best viewing experience. In addition, for an ultra-short-throw projection device, an image captured by the mobile terminal works better than one from a camera mounted on the device: it can completely cover the target projection area and better matches the user's viewing angle.
The feature image may have scale marks distributed along its upper and lower portions, with the remaining regions transparent. It can therefore be superimposed on the projected image in the modulation plane and the superimposed image corrected, so that correction proceeds imperceptibly while the user watches the projection; once correction finishes, the feature image is simply removed, improving the user experience during the correction process. The scale marks in the feature image may be uniformly or non-uniformly arranged marker lines — vertical lines, dotted lines, and the like — that divide the feature image uniformly or non-uniformly.
FIG. 2 is a schematic illustration of a feature image provided in accordance with an exemplary embodiment. As shown in fig. 2, scale marks 21 are distributed on the upper and lower portions of the feature image 20, and its other regions are transparent. Used as a mask layer over the projection image, the feature image 20 enables imperceptible projection correction.
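As a concrete illustration — not code from the patent — a feature image of this kind, with tick marks confined to the top and bottom bands and a transparent interior, could be generated as below. The resolution, band height, tick count, and RGBA layout are all assumptions:

```python
import numpy as np

def make_feature_image(width=1920, height=1080, n_ticks=16, band=40):
    """Sketch of a feature image: white vertical tick marks in the top and
    bottom bands of an RGBA canvas, fully transparent everywhere else."""
    img = np.zeros((height, width, 4), dtype=np.uint8)   # alpha 0 = transparent
    xs = np.linspace(0, width - 1, n_ticks).astype(int)  # uniform tick columns
    for x in xs:
        img[:band, x] = (255, 255, 255, 255)             # opaque tick, top band
        img[height - band:, x] = (255, 255, 255, 255)    # opaque tick, bottom band
    return img, xs
```

Overlaying such an image on the projected content leaves the content visible through the transparent interior while the bands carry the marks.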
In step 120, a target picture area is determined in the captured image based on the graduation marks of the projection picture in the captured image, wherein the target picture area is an area on the captured image where the characteristic image in the modulation plane is mapped.
Here, the projection picture in the captured image is the picture formed when the feature image in the modulation plane is projected onto the target projection area. The modulation plane is the plane on which the light modulator (chip) of the projection device generates the image. The chip corresponding to the modulation plane may be reflective — a DMD (Digital Micromirror Device) or LCOS (Liquid Crystal on Silicon) chip — or transmissive, such as an LCD (Liquid Crystal Display) chip.
FIG. 3 is a schematic illustration of a captured image provided in accordance with an exemplary embodiment. As shown in fig. 3, the captured image 30 includes a target projection area 31 and a projection screen 32 (indicated by a dashed-line frame). The pixel area of the projection screen 32 includes graduation marks 34, and position information for all the scale marks can be determined from the captured image.
In some embodiments, all graduation marks in the captured image can be detected by an image detection model, and the position of each mark then determined from its coordinates in the reference coordinate system of the captured image. The image detection model can be obtained by training a neural network on historical captured images annotated with scale-mark positions.
It should be understood that the captured image actually corresponds to the user's visual plane, and therefore the target picture area determined in the captured image is the projected picture that is ultimately presented in the user's visual plane.
After the position information corresponding to each scale mark is obtained in the shot image, the target picture area can be determined according to the position information of the scale marks of the projection picture and the position information of the scale marks in the characteristic image. The position information of the scale marks of the projection screen is coordinate information of the scale marks in a reference coordinate system constructed with an arbitrary point in the captured image as an origin.
The target screen region may be a maximum rectangle determined in a pixel region belonging to the feature image in the captured image according to the aspect ratio of the feature image, the aspect ratio of the maximum rectangle being identical to the aspect ratio of the feature image.
It should be appreciated that by calculating the maximum rectangle, the area of the projected picture actually viewed by the user can be maximized to enhance the user's projected viewing experience.
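The patent does not specify how the maximum rectangle is computed. One straightforward, brute-force possibility — assuming the projection-picture region in the captured image is a convex quadrilateral — is to search over candidate rectangles and keep the widest one whose four corners all lie inside the region, since a rectangle lies inside a convex region exactly when its corners do. Function names and the step size are illustrative:

```python
def inside_convex(poly, p):
    """True if point p lies inside (or on) convex polygon poly,
    whose vertices are listed in consistent (CCW) order."""
    n = len(poly)
    for i in range(n):
        a, b = poly[i], poly[(i + 1) % n]
        cross = (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])
        if cross < 0:
            return False
    return True

def max_rect(poly, aspect, step=2):
    """Grid-search for the largest axis-aligned rectangle with the given
    width/height aspect ratio inside a convex quadrilateral."""
    xs = [p[0] for p in poly]
    ys = [p[1] for p in poly]
    best, best_w = None, 0
    for x in range(int(min(xs)), int(max(xs)), step):
        for y in range(int(min(ys)), int(max(ys)), step):
            w = best_w + step          # only try to beat the current best
            while True:
                h = w / aspect
                corners = [(x, y), (x + w, y), (x + w, y + h), (x, y + h)]
                if all(inside_convex(poly, c) for c in corners):
                    best, best_w = (x, y, w, h), w
                    w += step
                else:
                    break
    return best
```

A production implementation would use a smarter search (for example, binary search on the scale), but the corner-inclusion test is the essential idea.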
FIG. 4 is a schematic illustration of a target picture area provided in accordance with an exemplary embodiment. As shown in fig. 4, the captured image 40 includes a target projection area 41 and a projection screen 42. The target screen region 43 is determined in the region of the projection screen 42 by the above-described embodiment.
In step 130, the projected image on the modulation plane is corrected based on the target picture area.
Here, the fourth and sixth position information mentioned below are coordinates in a reference coordinate system constructed with an arbitrary point of the captured image as origin, while the fifth position information is expressed in the coordinate system of the modulation plane.
For example, a homography matrix relationship between the vertices of the projection picture in the captured image and the corresponding corners of the feature image may be constructed from the fourth position information (the vertices of the projection picture in the captured image) and the fifth position information (the corners of the feature image in the modulation plane). The feature image on the modulation plane is then corrected from this homography relationship together with the sixth position information (the vertices of the target picture area), and the projection image on the modulation plane is in turn corrected from the corrected feature image.
The homography matrix is in effect a perspective transformation matrix: it describes how pixel positions change when mapped from the feature image to the captured image. Correcting the feature image on the modulation plane through this homography relationship ensures that, when the corrected feature image is projected onto the target projection area, the picture presented in the user's viewing plane coincides with the target picture area.
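The patent treats the homography abstractly; for illustration only, a standard four-point direct linear transform (DLT) estimate — not necessarily the method used in the patent — can be written with plain NumPy:

```python
import numpy as np

def homography(src, dst):
    """Direct Linear Transform: estimate the 3x3 homography H with
    dst ~ H @ src from four point correspondences, e.g. feature-image
    corners in the modulation plane vs. projection-picture vertices
    in the captured image."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    H = Vt[-1].reshape(3, 3)        # null-space vector = solution up to scale
    return H / H[2, 2]

def apply_h(H, pts):
    """Map 2-D points through H with the homogeneous divide."""
    p = np.hstack([np.asarray(pts, float), np.ones((len(pts), 1))])
    q = p @ H.T
    return q[:, :2] / q[:, 2:3]
```

In practice the four projection-picture vertices in the captured image would be paired with the four feature-image corners in the modulation plane, and the resulting H (or its inverse) used to pre-warp the feature image.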
It should be understood that correcting the projection image based on the corrected feature image means adjusting the projection image to the size of the corrected feature image, so that the adjusted projection image, displayed on the target projection area, coincides with the target picture area. Here, the projection image is the content the user wants to project — a video, a picture, and so on.
It is worth mentioning that the projection correction method of the disclosed embodiments applies whether the target projection area is a single plane or a solid composed of several non-coplanar planes. In particular, when the projection picture falls on a solid formed by intersecting planes, correction makes the picture the user finally sees coincide with the target picture area. For example, when the projection device projects toward a wall corner without correction, the picture the user sees follows the shape of the corner and is distorted; after correction by the method of the disclosed embodiments, the picture appears to the user as a standard rectangle.
In this way, a feature image containing scale marks is projected onto the target projection area; the region of the captured image onto which the feature image in the modulation plane is mapped — the target picture area — is obtained from the scale marks of the projection picture in the captured image; and the projection image in the modulation plane is corrected through the target picture area so that the picture projected onto the target projection area coincides with it. The method requires neither pre-encoding the image to be projected nor computing the position of the target projection area, markedly speeding up projection correction; it suits single-plane and complex multi-plane target projection areas alike; and, because correction is driven by a captured image, the corrected picture matches the user's viewing angle, giving the best projection viewing experience.
Fig. 5 is a schematic flow chart of step 120 shown in fig. 1. In some implementations, when the target projection area includes at least two sub-projection areas located on intersecting planes, as shown in fig. 5, the step 120 of determining the target picture area in the captured image based on the graduation marks of the projection picture in the captured image may include the following steps.
In step 121, first position information of a graduation line in the projection picture and second position information of a first intersection line between intersecting planes are determined in the captured image.
Here, the target projection area including at least two sub-projection areas located on intersecting planes means that the projection picture spans sub-projection areas of at least two intersecting planes — for example, a picture falling across the internal and external corners of a wall.
The first intersection line is the line of intersection between two sub-projection areas. FIG. 6 is a schematic illustration of a target projection area provided in accordance with an exemplary embodiment. As shown in fig. 6, the target projection area 60 includes a first sub-projection area ABEF and a second sub-projection area BCDF, and the line BE where they meet serves as the first intersection line of the target projection area 60. When the projection device projects an image onto the target projection area 60, the first intersection line BE divides the projection screen 61 into two parts: a first sub-projection picture on the first sub-projection area ABEF and a second sub-projection picture on the second sub-projection area BCDF. The meaning of the first intersection line is the same for other, more complex target projection areas, which the embodiments of the disclosure do not enumerate exhaustively.
It should be noted that the first position information of the scale mark and the second position information of the first intersection line respectively refer to coordinate information of the scale mark of the projection screen and the coordinate information of the first intersection line in a reference coordinate system constructed by using any point in the captured image as an origin.
As an example, the second position information of the first intersection line may be obtained by taking the photographed image as an input of a trained machine learning model. The trained machine learning model can be obtained by training the machine learning model by taking the historical shooting image marked with the position of the first intersection line as a training sample. Wherein the machine learning model may be a convolutional neural network model.
As another example, all pixels belonging to the first intersection line may be determined in the captured image according to a difference between a gray value of the pixel of the first intersection line and a gray value of the pixel of the other region in the captured image, so that the second position information of the first intersection line may be determined according to the determined coordinate information of all the pixels belonging to the first intersection line in the captured image.
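As a toy sketch of the gray-value approach just described (the threshold and the use of the global median are assumptions, not the patent's procedure):

```python
import numpy as np

def intersection_pixels(gray, thresh=30):
    """Flag pixels whose gray value deviates from the image median by more
    than `thresh` — a crude stand-in for segmenting the darker (or brighter)
    wall-corner intersection line from the rest of the captured image."""
    med = np.median(gray)
    ys, xs = np.nonzero(np.abs(gray.astype(int) - med) > thresh)
    return list(zip(xs.tolist(), ys.tolist()))  # (x, y) pixel coordinates
```

Fitting a line (for example, by least squares) through the returned coordinates would then yield the second position information of the first intersection line.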
In step 122, third position information of a second intersecting line corresponding to the first intersecting line on the feature image is determined based on the first position information of the scale mark and the second position information of the first intersecting line.
When the projection device projects the feature image onto the target projection area, the number of scale marks in the projection picture does not change; after projection, however, the spacing between the scale marks in the projection picture becomes larger than their spacing in the feature image on the modulation plane.
Therefore, once the second position information of the first intersection line between the sub-projection areas is obtained, the relative position of the first intersection line among the scale marks in the captured image follows from the first and second position information. This relative position records where the two endpoints of the first intersection line fall among the scale marks — for example, the first endpoint on the fifth scale line from left to right in the upper part of the projection picture in the captured image, and the second endpoint on the eighth scale line from left to right in the lower part.
Since the number of the scale marks in the projection picture does not change compared with the number of the scale marks of the feature image in the modulation plane, the second intersection line can be determined in the feature image on the modulation plane according to the relative position relationship.
FIG. 7 is a schematic illustration of determining a second intersection line provided in accordance with an exemplary embodiment. As shown in fig. 7, (a) is a captured image and (b) is the feature image in the modulation plane. In the captured image 80, the relative position between the first intersection line 84 of the target projection area 81 and the scale marks 83 in the projection screen 82 can be determined: the first endpoint of the first intersection line 84 lies on the eighth scale line from left to right in the upper part of the image of the projection screen 82, and its second endpoint on the eighth scale line from left to right in the lower part.
From this relative position, the position of the second intersection line 87 in the feature image 85 can be determined by combining it with the position information of the scale marks 86 of the feature image 85 in the modulation plane.
It should be understood that the first intersection line divides the projection picture on the target projection area into at least two sub-projection pictures, and the second intersection line also divides the feature image in the modulation plane into at least two sub-feature images, which correspond one-to-one to the sub-projection pictures.
In some embodiments, a target tick mark closest to the first intersection line may be determined in the captured image based on the first position information of the tick mark and the second position information of the first intersection line; and determining third position information of a second intersecting line corresponding to the first intersecting line on the characteristic image based on the relative position relation between the target scale mark and the first intersecting line.
The first position information includes coordinate information of each scale mark of the projection picture in the captured image, and the second position information of the first intersection line may refer to coordinate information of two end points where the first intersection line intersects with the projection picture in the captured image. After the second position information of the first intersecting line is determined, the relative position relationship between the target scale mark and the first intersecting line can be determined according to the first position information and the second position information, and then the third position information of the second intersecting line is determined according to the relative position relationship.
It should be noted that the relative positional relationship between the target scale mark and the first intersection line reflects where the first intersection line lies among the scale marks of the projection picture. The third position information is coordinate information in a reference coordinate system constructed with any point in the modulation plane as the origin.
When the first intersection line coincides with a scale mark, the target scale mark closest to the first intersection line is the scale mark coinciding with it. For this situation, the detailed process of determining the third position information based on the relative positional relationship between the target scale mark and the first intersection line has been described in detail in the foregoing embodiments, and is not repeated here.
When the first intersection line is located between two scale marks, its position among the scale marks can be located from the target scale mark closest to it. After the target scale mark closest to the first intersection line is determined, the third position information of the second intersection line among the scale marks can be determined according to the ratio of the distances from the first intersection line to the target scale mark and to the other adjacent scale mark.
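As a minimal sketch of this nearest-mark lookup and interval interpolation (the function and variable names are illustrative, not from the disclosure), assuming the scale marks along one edge of the projection picture have been detected and sorted by their x-coordinates in the captured image:

```python
import numpy as np

def locate_between_ticks(tick_xs, line_x):
    """Locate an endpoint of the first intersection line among detected
    scale marks (hypothetical helper, not the patent's own code).

    tick_xs: sorted x-coordinates of the scale marks along one edge of
             the projection picture in the captured image.
    line_x:  x-coordinate of the intersection-line endpoint.
    Returns (index of the nearest mark, index of the mark to the left
    of the endpoint, fractional position in [0, 1] within that interval).
    """
    tick_xs = np.asarray(tick_xs, dtype=float)
    nearest = int(np.argmin(np.abs(tick_xs - line_x)))
    # Interval containing the endpoint, clamped to the tick range.
    left = int(np.clip(np.searchsorted(tick_xs, line_x) - 1, 0, len(tick_xs) - 2))
    frac = (line_x - tick_xs[left]) / (tick_xs[left + 1] - tick_xs[left])
    return nearest, left, float(np.clip(frac, 0.0, 1.0))
```

For example, with marks at pixel columns 0, 10, 20, 30 and an endpoint at column 14, the endpoint lies 40% of the way between the second and third marks; this index-plus-fraction pair is exactly the distance-ratio description in the paragraph above.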
FIG. 8 is a schematic illustration of determining a second intersection line provided in accordance with another exemplary embodiment. As shown in fig. 8 (a), in the captured image 90, the relative positional relationship is: a first end of the first intersection line 901 is located midway between the first target scale mark 902 and the second target scale mark 903, and a second end of the first intersection line 901 is located midway between the third target scale mark 904 and the fourth target scale mark 905.
Since the number of projected scale marks does not change, the third position information of the second intersection line 911, onto which the first intersection line 901 is mapped in the feature image 91 in the modulation plane, can be determined from the above relative positional relationship. As shown in fig. 8 (b), a first end point of the second intersection line 911 is located between the fifth target scale mark 912 and the sixth target scale mark 913, and a second end point of the second intersection line 911 is located between the seventh target scale mark 914 and the eighth target scale mark 915. The first target scale mark 902 corresponds to the fifth target scale mark 912, the second target scale mark 903 corresponds to the sixth target scale mark 913, the third target scale mark 904 corresponds to the seventh target scale mark 914, and the fourth target scale mark 905 corresponds to the eighth target scale mark 915.
Therefore, the third position information of the second intersection line, onto which the first intersection line is mapped in the feature image on the modulation plane, can be located quickly and accurately through the relative positional relationship between the target scale mark and the first intersection line, without needing to determine the spatial position information of the first intersection line.
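Because the number of scale marks is the same in the modulation plane and in the captured image, an endpoint of the second intersection line can be reconstructed by re-applying the same fractional tick position in the modulation plane's own tick coordinates. A minimal sketch (names are illustrative, not from the disclosure):

```python
def map_endpoint_to_modulation_plane(left_idx, frac, mod_tick_xs):
    """Reconstruct an endpoint of the second intersection line in the
    modulation plane from its relative tick position in the captured
    image. left_idx/frac describe which tick interval the endpoint of
    the first intersection line fell into and where inside it;
    mod_tick_xs are the known tick x-coordinates in the modulation
    plane (hypothetical helper, not the patent's own code)."""
    x0, x1 = mod_tick_xs[left_idx], mod_tick_xs[left_idx + 1]
    return x0 + frac * (x1 - x0)
```

For instance, if the endpoint lies 40% of the way between the second and third marks, and the modulation-plane marks sit at columns 0, 64, 128, 192, the endpoint of the second intersection line lands at column 89.6; no spatial (3-D) information about the first intersection line is needed.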
In step 123, the feature image is mapped to the captured image based on the second position information of the first intersection and the third position information of the second intersection to determine the target screen area.
Here, after obtaining second position information of the first intersection in the photographed image and third position information of the second intersection in the modulation plane, the feature image in the modulation plane may be mapped into the photographed image to determine the target picture area.
The related meaning of the target frame area has been described in detail in the above embodiments, and is not described herein again.
In some embodiments, the feature image may be moved into the captured image based on the second position information of the first intersection line and the third position information of the second intersection line, taking the coincidence of the second intersection line with the first intersection line as a constraint condition; the feature image moved into the captured image is then scaled proportionally, and the proportionally scaled feature image is taken as the target picture area, where the proportionally scaled feature image is the largest rectangle located within the pixel area of the captured image that belongs to the projection picture.
Here, the feature image in the modulation plane is moved into the captured image under the constraint that the second intersection line coincides with the first intersection line. After the feature image is moved into the captured image, it is scaled proportionally, and the proportionally scaled feature image is taken as the target picture area; the target picture area is the largest such image that fits within the pixel area of the projection picture in the captured image.
Specifically, the first intersection line and the second intersection line may be aligned, with the bottom edge of the feature image coinciding with the lower endpoint of the first intersection line (as shown in fig. 4). The feature image is then enlarged or reduced until it is the largest image that fits within the pixel region of the projection screen in the captured image, and that largest image is taken as the target screen region.
It should be understood that, since the target picture area is obtained by enlarging or reducing the feature image, the aspect ratio of the target picture area coincides with the aspect ratio of the feature image in the modulation plane. Moreover, taking the largest rectangle as the target picture area provides the user with the largest possible viewing picture, improving the user's projection viewing experience.
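Finding the largest fixed-aspect-ratio rectangle inside the projection-picture pixel region can be done by bisecting on a scale factor, since "all four corners inside the region" is monotone in the scale. The sketch below simplifies the patent's alignment constraint to a fixed anchor corner and uses a generic ray-casting point-in-polygon test; all names are hypothetical:

```python
def point_in_poly(p, poly):
    """Ray-casting point-in-polygon test for a simple polygon."""
    x, y = p
    inside = False
    n = len(poly)
    for i in range(n):
        x0, y0 = poly[i]
        x1, y1 = poly[(i + 1) % n]
        if (y0 > y) != (y1 > y):
            # x of the edge at height y; toggle when the ray crosses it.
            xi = x0 + (y - y0) * (x1 - x0) / (y1 - y0)
            if x < xi:
                inside = not inside
    return inside

def max_inscribed_scale(anchor, aspect, poly, s_max=1e4, iters=60):
    """Largest scale s such that the axis-aligned rectangle of width
    s*aspect and height s, grown from `anchor`, stays inside the
    projection-picture polygon `poly` (bisection on s)."""
    ax, ay = anchor
    lo, hi = 0.0, s_max
    for _ in range(iters):
        s = 0.5 * (lo + hi)
        w, h = s * aspect, s
        corners = [(ax, ay), (ax + w, ay), (ax + w, ay + h), (ax, ay + h)]
        if all(point_in_poly(c, poly) for c in corners):
            lo = s
        else:
            hi = s
    return lo
```

With a square projection-picture region of side 100 and the anchor one pixel inside its corner, the feasible scale converges to 99; in the patent's setting the anchor would instead be tied to the lower endpoint of the first intersection line.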
In addition, the captured image actually corresponds to the visual plane of the user, and therefore, the target screen region specified in the captured image is the projection screen finally presented in the visual plane of the user.
Therefore, the projection correction can be accurately finished through the characteristic image comprising the scale marks, the spatial information of the target projection area and the pose information of the projection equipment do not need to be obtained in advance, and the speed of the projection correction can be greatly improved.
Fig. 9 is a flowchart providing projection correction according to a target picture area according to an exemplary embodiment. As shown in fig. 9, in the case that the target projection area includes at least two sub-projection areas located on intersecting planes, the step 130 of correcting the projection image on the modulation plane based on the target picture area may include the following steps.
In step 131, second position information of a first intersection between the intersecting planes is determined in the captured image.
Here, the detailed execution process of step 131 has been described in detail in the above embodiments, and is not described herein again.
In step 132, first coordinate information of sub-projection screens in which the projection screen is located in the respective sub-projection areas and second coordinate information of sub-screen areas in which the target screen area is located in the respective sub-projection areas are determined in the captured image based on the second position information of the first intersection line.
Here, the first intersection line in the target projection area divides the projection screen and the target screen area in the captured image into a plurality of parts, and the first coordinate information of the sub-projection screen located in each sub-projection area and the second coordinate information of the sub-screen area located in each sub-projection area can be determined based on the second position information of the first intersection line. FIG. 10 is a schematic diagram of projection correction provided in accordance with an exemplary embodiment. As shown in fig. 10 (a), the first intersection BE in the captured image 100 divides the projection screen ABCDEF into a first sub-projection screen ABEF and a second sub-projection screen BCDE in the respective sub-projection areas (not shown in fig. 10). Since the second intersection line corresponding to the target screen region GHJI coincides with the first intersection line BE, the first intersection line BE also divides the target screen region GHJI into the first sub-screen region GKEI and the second sub-screen region KHJE located in the respective sub-projection regions (not shown in fig. 10).
Since the position information of each sub-projection picture and each sub-picture area in the captured image is determined, the first coordinate information of the sub-projection picture and the second coordinate information of the sub-picture area in which the target picture area is located in each sub-projection area can be located. It should be understood that the first coordinate information and the second coordinate information are both in a reference coordinate system constructed with an arbitrary point in the captured image as a coordinate origin.
In step 133, for each sub-projection picture, a homography matrix relationship between the sub-projection picture and the corresponding sub-feature image is established based on the first coordinate information of the sub-projection picture and the third coordinate information of the sub-feature image corresponding to the sub-projection picture, where the sub-feature image is obtained by dividing the feature image in the modulation plane according to the second position information of the first intersection line.
Here, the sub-feature image refers to an image area on the feature image on which the sub-projection picture is mapped in the modulation plane. In the shot image, the projection picture is divided into a plurality of sub projection pictures by first intersecting lines between the sub projection areas, and each sub projection picture corresponds to one sub feature image in the modulation plane. The sub-feature image is obtained by dividing the feature image in the modulation plane according to the second position information of the first intersection line. Specifically, a second intersection corresponding to the first intersection is determined in the feature image according to the second position information of the first intersection, and then each sub-feature image is obtained according to the second intersection.
It should be understood that the principle of obtaining the second intersection line has been described in detail in the above embodiments with reference to fig. 7 and 8, and is not repeated here.
Referring to fig. 10 (b), the line B1E1 is the second intersection line onto which the first intersection line BE is mapped in the modulation plane. The second intersection line B1E1 segments the feature image 101 in the modulation plane into a first sub-feature image A1B1E1F1 and a second sub-feature image B1C1D1E1.
For each sub-projection picture, the projection device projects the sub-feature image corresponding to that sub-projection picture in the corresponding sub-projection area. Therefore, a homography matrix relationship between the sub-projection picture and the corresponding sub-feature image can be established according to the first coordinate information of each vertex of the sub-projection picture and the third coordinate information of each corner point of the corresponding sub-feature image.
For example, for the first sub-projection picture ABEF, a first homography matrix relationship between the first sub-projection picture ABEF and the first sub-feature image A1B1E1F1 is established according to the first coordinate information of each vertex of the first sub-projection picture ABEF and the third coordinate information of each corner point of the first sub-feature image A1B1E1F1. For the second sub-projection picture BCDE, a second homography matrix relationship between the second sub-projection picture BCDE and the second sub-feature image B1C1D1E1 is established according to the first coordinate information of each vertex of the second sub-projection picture BCDE and the third coordinate information of each corner point of the second sub-feature image B1C1D1E1.
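A planar homography is fully determined by four point correspondences, such as the four vertices of a sub-projection picture paired with the four corner points of its sub-feature image. A minimal direct linear transform (DLT) sketch in Python; this is a generic estimator, not the patent's own implementation:

```python
import numpy as np

def homography_dlt(src, dst):
    """Estimate the 3x3 homography H with dst ~ H @ src from four
    point correspondences (direct linear transform via SVD)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography vector spans the null space of A.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]
```

Applied once per sub-projection picture, this yields the first and second homography matrix relationships of the example above (in practice a library routine such as OpenCV's getPerspectiveTransform serves the same purpose).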
It is worth noting that the homography matrix relationship acts as a perspective transformation matrix reflecting the change in position that pixels in the modulation plane undergo when projected into the target projection area.
In step 134, for each sub-picture region, coordinate information of an image region of the sub-picture region mapped on the modulation plane is obtained according to the homography matrix relationship corresponding to the sub-picture region and the second coordinate information of the sub-picture region.
Here, for each sub-picture area in the captured image, the coordinate information of the image area onto which the sub-picture area is mapped on the modulation plane is obtained according to the homography matrix relationship corresponding to the sub-picture area and the second coordinate information of the sub-picture area. Specifically, the second coordinate information of the sub-picture area is multiplied by the corresponding homography matrix to obtain the coordinate information of the image area onto which the sub-picture area is mapped on the modulation plane. When the image area mapped onto the modulation plane is projected onto the corresponding sub-projection area, the picture presented coincides with the corresponding sub-picture area.
As shown in fig. 10, for the first sub-picture area GKEI, coordinate information of a first image area G1K1E1I1, onto which the first sub-picture area GKEI is mapped on the modulation plane, can be obtained according to the second coordinate information of the first sub-picture area GKEI and the first homography matrix relationship. For the second sub-picture area KHJE, coordinate information of a second image area K1H1J1E1, onto which the second sub-picture area KHJE is mapped on the modulation plane, can be obtained according to the second coordinate information of the second sub-picture area KHJE and the second homography matrix relationship.
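"Multiplying coordinates by the homography matrix" means mapping each corner point in homogeneous coordinates and performing the perspective divide. A small sketch with illustrative names (the patent does not prescribe an implementation):

```python
import numpy as np

def apply_homography(H, pts):
    """Map 2-D points through a 3x3 homography H: lift to homogeneous
    coordinates, multiply, then divide by the third component."""
    pts = np.asarray(pts, dtype=float)
    ones = np.ones((len(pts), 1))
    mapped = (H @ np.hstack([pts, ones]).T).T
    return mapped[:, :2] / mapped[:, 2:3]
```

Feeding the vertices of a sub-picture area (e.g. GKEI) through its homography yields the corresponding image-area vertices on the modulation plane (e.g. G1K1E1I1).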
In step 135, the projected image is corrected based on the coordinate information of the image area on the modulation plane to which each sub-picture area is mapped.
Here, the image areas onto which the sub-picture areas are mapped on the modulation plane together constitute the image that, when projected onto the target projection area, presents a projection picture corresponding to the target picture area. As shown in fig. 10, coordinate information of a target image area G1K1H1J1E1I1, onto which the target picture area is mapped on the modulation plane, can be obtained from the first image area G1K1E1I1 and the second image area K1H1J1E1.
When the projection device projects an image based on the target image area G1K1H1J1E1I1, the projection picture presented at the user's viewing angle is consistent with the target picture area.
It should be noted that, since the projection correction here is performed for a projection picture spanning intersecting planes, the focus of the projection device can be determined, during projection, from the distance between the projection device and the intersection line between the sub-projection areas, so as to ensure a good focusing effect.
Fig. 11 is an effect diagram after projection correction provided according to an exemplary embodiment. As shown in fig. 11, the projection screen 1101 is obtained by projecting, at a corner, an image that has not been corrected by the projection correction method provided by the embodiments of the present disclosure; the image presented in the user's view plane is not rectangular. The projection picture 1102 is obtained by projecting, at the same corner, an original image corrected by the projection correction method provided by the embodiments of the present disclosure; the image presented in the user's view plane is rectangular.
It should be noted that, in the above embodiments, only a target projection area including two sub-projection areas is illustrated; the projection correction method provided by the embodiments of the present disclosure is not limited to application scenarios in which the target projection area includes two sub-projection areas. For a target projection area including one sub-projection area, or three or more sub-projection areas, projection correction can likewise be performed by the projection correction method provided by the embodiments of the present disclosure. When the target projection area includes three or more sub-projection areas, projection correction is performed according to each first intersection line between the sub-projection areas.
Fig. 12 is a schematic block diagram illustrating a connection of a projection correction apparatus according to an exemplary embodiment, and as shown in fig. 12, a projection correction apparatus 1300 provided in an embodiment of the present disclosure may include:
the obtaining module 1301 is configured to project a characteristic image to the target projection area and obtain a shot image of the target projection area, where the characteristic image includes scale marks distributed in a preset image area;
a picture positioning module 1302 configured to determine a target picture area in the captured image based on a scale mark of a projection picture in the captured image, wherein the target picture area is an area where the feature image in the modulation plane is mapped on the captured image;
a correction module 1303 configured to correct the projected image on the modulation plane based on the target picture area.
Optionally, the screen positioning module 1302 includes:
a first position determination unit configured to determine first position information of a scale mark in the projection picture and second position information of a first intersection line between the intersecting planes in the captured image when the target projection area includes at least two sub-projection areas located on the intersecting planes;
a second position determination unit configured to determine third position information of a second intersecting line corresponding to the first intersecting line on the feature image based on the first position information of the scale mark and the second position information of the first intersecting line;
and the picture determining unit is configured to map the characteristic image into the shot image based on the second position information of the first intersection line and the third position information of the second intersection line so as to determine the target picture area.
Optionally, the picture determination unit includes:
the mapping unit is configured to move the feature image into the shot image based on second position information of the first intersection line and third position information of the second intersection line, taking the coincidence of the second intersection line with the first intersection line as a constraint condition;
and the construction unit is configured to scale the feature image moved into the shot image proportionally, and take the proportionally scaled feature image as the target picture area, wherein the proportionally scaled feature image is the largest rectangle located in the pixel area of the shot image belonging to the projection picture.
Optionally, the second position determination unit comprises:
a scale mark determination unit configured to determine a target scale mark closest to the first intersection line in the captured image based on first position information of the scale mark and second position information of the first intersection line;
and the intersection line positioning unit is configured to determine third position information of a second intersection line corresponding to the first intersection line on the feature image based on the relative position relation between the target scale mark and the first intersection line.
Optionally, the correction module 1303 includes:
a third position determination unit configured to determine second position information of a first intersection line between the intersecting planes in the captured image when the target projection area includes at least two sub-projection areas located on the intersecting planes;
a fourth position determination unit configured to determine, in the captured image, first coordinate information of sub-projection screens whose projection screens are located in the respective sub-projection areas and second coordinate information of sub-screen areas whose target screen area is located in the respective sub-projection areas, based on the second position information of the first intersection line;
the homography determining unit is configured to establish a homography matrix relation between each sub-projection picture and the corresponding sub-feature image based on first coordinate information of the sub-projection picture and third coordinate information of the sub-feature image corresponding to the sub-projection picture, wherein the sub-feature image is obtained by dividing the feature image in the modulation plane according to second position information of the first intersection line;
a first correction subunit, configured to, for each sub-picture region, obtain, according to a homography matrix relationship corresponding to the sub-picture region and second coordinate information of the sub-picture region, coordinate information of an image region on a modulation plane to which the sub-picture region is mapped;
and a second correction subunit configured to correct the projection image according to the coordinate information of the image area on the modulation plane to which the respective sub-picture area is mapped.
Optionally, the feature image is an image with scale lines distributed on the upper part and the lower part of the image.
Optionally, the obtaining module 1301 includes:
a projection unit configured to project the feature image to a target projection area;
and the receiving unit is configured to receive a shot image which is sent by the mobile terminal and obtained by shooting the target projection area by the mobile terminal.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 13 is a block diagram illustrating a projection device according to an example embodiment. As shown in fig. 13, the projection apparatus 700 may include: a processor 701 and a memory 702. The projection device 700 may also include one or more of a multimedia component 703, an input/output (I/O) interface 704, and a communication component 705.
The processor 701 is configured to control the overall operation of the projection device 700, so as to complete all or part of the steps in the projection correction method described above. The memory 702 is used to store various types of data to support operation of the projection device 700, such as instructions for any application or method operating on the projection device 700 and application-related data, such as contact data, messages, pictures, audio, video, and the like. The memory 702 may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disk. The multimedia component 703 may include a screen and an audio component. The screen may be, for example, a touch screen, and the audio component is used for outputting and/or inputting audio signals. For example, the audio component may include a microphone for receiving external audio signals. The received audio signal may further be stored in the memory 702 or transmitted through the communication component 705. The audio component also includes at least one speaker for outputting audio signals. The I/O interface 704 provides an interface between the processor 701 and other interface modules, such as a keyboard, a mouse, or buttons. These buttons may be virtual buttons or physical buttons. The communication component 705 is used for wired or wireless communication between the projection device 700 and other devices. The wireless communication may be, for example, Wi-Fi, Bluetooth, Near Field Communication (NFC), 2G, 3G, 4G, NB-IoT, eMTC, 5G, or the like, or a combination of one or more of them, which is not limited herein.
The corresponding communication component 705 may thus include: Wi-Fi module, Bluetooth module, NFC module, etc.
In an exemplary embodiment, the projection Device 700 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components for performing the projection correction method described above.
In another exemplary embodiment, a computer-readable storage medium is also provided, which comprises program instructions, which when executed by a processor, implement the steps of the projection correction method described above. For example, the computer readable storage medium may be the memory 702 described above including program instructions that are executable by the processor 701 of the projection device 700 to perform the projection correction method described above.
In another exemplary embodiment, a computer program product is also provided, which comprises a computer program executable by a programmable apparatus, the computer program having code portions for performing the projection correction method described above when executed by the programmable apparatus.
The preferred embodiments of the present disclosure are described in detail with reference to the accompanying drawings, however, the present disclosure is not limited to the specific details of the above embodiments, and various simple modifications may be made to the technical solution of the present disclosure within the technical idea of the present disclosure, and these simple modifications all belong to the protection scope of the present disclosure.
It should be noted that, in the foregoing embodiments, various features described in the above embodiments may be combined in any suitable manner, and in order to avoid unnecessary repetition, various combinations that are possible in the present disclosure are not described again.
In addition, any combination of various embodiments of the present disclosure may be made, and the same should be considered as the disclosure of the present disclosure, as long as it does not depart from the spirit of the present disclosure.

Claims (10)

1. A projection correction method, comprising:
projecting a feature image to a target projection area, and acquiring a shot image of the target projection area, wherein the feature image comprises scale marks distributed in a preset area of the image;
determining a target picture area in the shot image based on the scale marks of a projection picture in the shot image, wherein the target picture area is an area of a feature image in a modulation plane mapped on the shot image;
correcting the projected image on the modulation plane based on the target picture area.
2. The projection correction method according to claim 1, wherein the determining a target picture area in the captured image based on the graduation mark of the projection picture in the captured image includes:
when the target projection area comprises at least two sub-projection areas positioned on intersecting planes, determining first position information of the scale marks in the projection picture and second position information of first intersecting lines between the intersecting planes in the shot image;
determining third position information of a second intersection line corresponding to the first intersection line on the feature image based on the first position information of the scale mark and the second position information of the first intersection line;
and mapping the characteristic image to the shot image based on second position information of the first intersection line and third position information of the second intersection line so as to determine the target picture area.
3. The projection correction method according to claim 2, wherein the mapping the feature image into the captured image based on the second position information of the first intersection and the third position information of the second intersection to determine the target picture area includes:
moving the feature image into the shot image based on second position information of the first intersection line and third position information of the second intersection line, taking the coincidence of the second intersection line with the first intersection line as a constraint condition;
and scaling the feature image moved into the shot image proportionally, and taking the proportionally scaled feature image as the target picture area, wherein the proportionally scaled feature image is the largest rectangle located in the pixel area of the shot image belonging to the projection picture.
4. The projection correction method according to claim 2, wherein the determining third position information of a second intersection corresponding to the first intersection on the feature image based on the first position information of the graduation mark and the second position information of the first intersection includes:
determining a target scale mark closest to the first intersection line in the shot image based on first position information of the scale mark and second position information of the first intersection line;
and determining third position information of a second intersecting line corresponding to the first intersecting line on the feature image based on the relative position relationship between the target scale mark and the first intersecting line.
5. The projection correction method according to claim 1, wherein the correcting the projection image on the modulation plane based on the target picture area comprises:
when the target projection area comprises at least two sub-projection areas located on intersecting planes, determining second position information of a first intersection line between the intersecting planes in the captured image;
determining, according to the second position information of the first intersection line, first coordinate information of sub-projection pictures of the projection picture located in the respective sub-projection areas in the captured image, and second coordinate information of sub-picture areas of the target picture area located in the respective sub-projection areas in the captured image;
for each sub-projection picture, establishing a homography matrix relationship between the sub-projection picture and a corresponding sub-feature image based on the first coordinate information of the sub-projection picture and third coordinate information of the sub-feature image corresponding to the sub-projection picture, wherein the sub-feature images are obtained by dividing the feature image in the modulation plane according to the second position information of the first intersection line;
for each sub-picture area, obtaining coordinate information of an image area to which the sub-picture area is mapped on the modulation plane according to the homography matrix relationship corresponding to the sub-picture area and the second coordinate information of the sub-picture area; and
correcting the projection image according to the coordinate information of the image areas to which the sub-picture areas are mapped on the modulation plane.
6. The projection correction method according to any one of claims 1 to 5, wherein the feature image is an image in which the scale marks are distributed on an upper portion and a lower portion of the image.
7. The projection correction method according to any one of claims 1 to 5, wherein the projecting a feature image to a target projection area and acquiring a captured image of the target projection area comprises:
projecting the feature image to the target projection area; and
receiving a captured image of the target projection area, the captured image being obtained by a mobile terminal photographing the target projection area and being sent by the mobile terminal.
8. A projection correction apparatus, comprising:
an acquisition module configured to project a feature image to a target projection area and acquire a captured image of the target projection area, wherein the feature image comprises scale marks distributed in a preset area of the image;
a picture positioning module configured to determine a target picture area in the captured image based on the scale marks of a projection picture in the captured image, wherein the target picture area is an area to which the feature image in a modulation plane is mapped on the captured image; and
a correction module configured to correct a projection image on the modulation plane based on the target picture area.
9. A non-transitory computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 7.
10. A projection device, comprising:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to implement the steps of the method according to any one of claims 1 to 7.
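The per-plane correction recited in claim 5 relies on a homography matrix relationship between each sub-projection picture and its sub-feature image. The sketch below, in NumPy, illustrates one standard way such a homography can be estimated from four point correspondences (direct linear transform) and then used to map sub-picture-area coordinates onto the modulation plane. It is an illustrative reconstruction, not the patent's own implementation; the function names, array shapes, and the DLT formulation are assumptions.

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate the 3x3 homography H mapping src points to dst points via DLT.

    src, dst: (N, 2) arrays of corresponding points, N >= 4 (e.g. the four
    corners of a sub-projection picture and of its sub-feature image).
    """
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear constraints on H.
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # H is the right singular vector of A with the smallest singular value.
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def map_points(H, pts):
    """Apply homography H to (N, 2) points, with perspective division."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]
```

In the claimed method, one such homography would be estimated per intersecting plane from the sub-projection picture's corner coordinates in the captured image, and `map_points` would then carry each sub-picture area's coordinates back onto the modulation plane before the final correction.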
CN202111590617.XA 2021-12-23 2021-12-23 Projection correction method, projection correction device, storage medium and projection equipment Pending CN114339179A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111590617.XA CN114339179A (en) 2021-12-23 2021-12-23 Projection correction method, projection correction device, storage medium and projection equipment

Publications (1)

Publication Number Publication Date
CN114339179A true CN114339179A (en) 2022-04-12

Family

ID=81054026

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111590617.XA Pending CN114339179A (en) 2021-12-23 2021-12-23 Projection correction method, projection correction device, storage medium and projection equipment

Country Status (1)

Country Link
CN (1) CN114339179A (en)

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010017687A1 (en) * 1999-03-03 2001-08-30 3M Innovative Properties Company Integrated front projection system with distortion correction and associated method
JP2005318355A (en) * 2004-04-30 2005-11-10 Nec Viewtechnology Ltd Trapezoidal distortion correcting device of projector and projector provided therewith
JP2009145061A (en) * 2007-12-11 2009-07-02 Mitsubishi Electric Corp Position measuring equipment
CN101271575A (en) * 2008-04-09 2008-09-24 东华大学 Orthogonal projection emendation method for image measurement in industry close range photography
CN107430321A (en) * 2015-03-30 2017-12-01 精工爱普生株式会社 The control method of projecting apparatus and projecting apparatus
CN105184800A (en) * 2015-09-28 2015-12-23 神画科技(深圳)有限公司 Automatic three-dimensional mapping projection system and method
CN105182662A (en) * 2015-09-28 2015-12-23 神画科技(深圳)有限公司 Projection method and system with augmented reality effect
CN105282535A (en) * 2015-10-22 2016-01-27 神画科技(深圳)有限公司 3D projection system and 3D projection method in 3D space environment
CN110381302A (en) * 2019-08-22 2019-10-25 歌尔科技有限公司 A kind of projection pattern bearing calibration of optical projection system, apparatus and system
CN112669388A (en) * 2019-09-30 2021-04-16 上海禾赛科技股份有限公司 Calibration method and device for laser radar and camera device and readable storage medium
CN112272292A (en) * 2020-11-06 2021-01-26 深圳市火乐科技发展有限公司 Projection correction method, apparatus and storage medium
CN112665530A (en) * 2021-01-05 2021-04-16 银昌龄 Light plane recognition device corresponding to projection line, three-dimensional measurement system and method
CN112584116A (en) * 2021-02-24 2021-03-30 深圳市火乐科技发展有限公司 Projection correction method, projection correction device, storage medium and electronic equipment
CN113365040A (en) * 2021-02-24 2021-09-07 深圳市火乐科技发展有限公司 Projection correction method, projection correction device, storage medium and electronic equipment
CN113489961A (en) * 2021-09-08 2021-10-08 深圳市火乐科技发展有限公司 Projection correction method, projection correction device, storage medium and projection equipment

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
JOSEPH LAM: "Eye-In-Hand Visual Servoing for Accurate Shooting in Pool Robotics", 2008 Canadian Conference on Computer and Robot Vision *
FU YANJUN: "Research on Key Methods for Three-Dimensional Measurement of Large Objects Based on Monocular Structured Light", Journal of Applied Optics, no. 2 *
LI DONG: "Research on Key Techniques of Geometric Optimization for Non-planar Projection Systems", China Master's Theses Full-text Database *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116156132A (en) * 2022-12-27 2023-05-23 深圳市宇通联发科技有限公司 Projection image correction method, projection image correction device, electronic equipment and readable storage medium
CN116156132B (en) * 2022-12-27 2023-11-14 深圳市宇通联发科技有限公司 Projection image correction method, projection image correction device, electronic equipment and readable storage medium

Similar Documents

Publication Publication Date Title
KR100796849B1 (en) Method for photographing panorama mosaics picture in mobile device
TWI242374B (en) Image processing system, projector, program, information storing medium, and image processing method
US20130222776A1 (en) Image projector, method of image projection, and computer-readable storage medium storing program for causing computer to execute image projection
CN112272292B (en) Projection correction method, apparatus and storage medium
JP6645687B2 (en) Display device and control method
JP2004274354A (en) System and method for processing image, projector, program and information storage medium
JP2010122273A (en) Method of measuring zoom ratio of projection optical system, method of correcting projection image using the method, and projector executing the correction method
JP5839785B2 (en) Projection system, projection apparatus, and imaging apparatus
WO2017169186A1 (en) Image projection system and correction method
WO2013124901A1 (en) Optical-projection-type display apparatus, portable terminal, and program
CN109644248B (en) Projection type image display device and method for adjusting projection image
CN114286068B (en) Focusing method, focusing device, storage medium and projection equipment
CN114286066A (en) Projection correction method, projection correction device, storage medium and projection equipment
CN114125411A (en) Projection equipment correction method and device, storage medium and projection equipment
JP4199641B2 (en) Projector device
CN114339179A (en) Projection correction method, projection correction device, storage medium and projection equipment
WO2017179111A1 (en) Display system and information processing method
JP2010266714A (en) Video processor, video display device and video processing method
EP4283986A1 (en) Electronic apparatus and control method thereof
CN115086631B (en) Image generating method and information processing apparatus
CN114401388A (en) Projection method, projection device, storage medium and projection equipment
JP2013083985A (en) Projection device, projection method, and program
WO2020162051A1 (en) Projection type video display system
TWM607874U (en) Projection system and projection display system
JP5630799B2 (en) Projection apparatus, projection method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination