CN115134570A - Projection equipment and correction method of projection image thereof - Google Patents


Info

Publication number
CN115134570A
Authority
CN
China
Prior art keywords
projection
dimensional
projection screen
feature points
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210729303.1A
Other languages
Chinese (zh)
Inventor
Jin Fei (金飞)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Hisense Laser Display Co Ltd
Original Assignee
Qingdao Hisense Laser Display Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Hisense Laser Display Co Ltd filed Critical Qingdao Hisense Laser Display Co Ltd
Priority to CN202210729303.1A priority Critical patent/CN115134570A/en
Publication of CN115134570A publication Critical patent/CN115134570A/en
Pending legal-status Critical Current


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141Constructional details thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • H04N9/3185Geometric adjustment, e.g. keystone or convergence

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Geometry (AREA)
  • Projection Apparatus (AREA)

Abstract

The application discloses a projection device and a method for correcting its projected image. A projection host in the projection device acquires an image of the projection screen captured by a depth camera and, based on that image, determines the three-dimensional offsets of a plurality of feature points on the screen. Because the projection host can correct the display effect of the projected image based on the three-dimensional offsets of the plurality of feature points, a good display effect of the image projected onto the projection screen can be ensured. Moreover, since the projection host determines the three-dimensional offsets of the projection screen from an image captured by the depth camera, correction based on these offsets achieves a better calibration result.

Description

Projection equipment and correction method of projection image thereof
Technical Field
The present disclosure relates to the field of electronic technologies, and in particular, to a projection device and a method for correcting a projected image thereof.
Background
The integrated projection device generally includes a projection screen, a projection host, and a connecting bracket for connecting the projection screen and the projection host.
The optical engine of the projection device is located in the projection host, and the light it emits travels obliquely upward to the projection screen, so that a projected image is formed on the screen. To ensure the display effect of the projected image, the light beam emitted by the optical engine must be strictly aligned with the projection screen.
However, during installation or use of the integrated projection device, the projection screen may shift relative to the projection host, so that the projected picture can no longer be fully projected onto the projection screen and the display effect of the projected image is poor.
Disclosure of Invention
The application provides a projection device and a method for correcting its projected image, which can solve the problem in the related art that the projected image has a poor display effect. The technical solution is as follows:
in one aspect, a method for correcting a projected image is provided, and is applied to a projection device, where the projection device includes a projection screen, a projection host, and a connection bracket for connecting the projection screen and the projection host; the method comprises the following steps:
responding to a correction instruction, and acquiring a shot image obtained by shooting the projection screen by a depth camera;
determining measured three-dimensional coordinates of a plurality of feature points of the projection screen based on the shot image;
determining a three-dimensional offset of a measured three-dimensional coordinate of each feature point in the plurality of feature points compared to a reference three-dimensional coordinate of the feature point;
and correcting the display effect of the projection image based on the three-dimensional offset of the plurality of characteristic points.
In another aspect, a projection apparatus is provided, which includes a projection screen, a projection host, and a connection bracket for connecting the projection screen and the projection host; the projection host is used for:
responding to a correction instruction, and acquiring a shot image obtained by shooting the projection screen by a depth camera;
determining measured three-dimensional coordinates of a plurality of feature points of the projection screen based on the shot image;
determining a three-dimensional offset of a measured three-dimensional coordinate of each of the plurality of feature points compared to a reference three-dimensional coordinate of the feature point;
and correcting the display effect of the projection image based on the three-dimensional offset of the plurality of feature points.
In another aspect, a projection apparatus is provided, in which a projection host includes: a memory, a processor, and a computer program stored in the memory; when executing the computer program, the processor implements the method of correcting a projected image as described above.
In yet another aspect, a computer-readable storage medium is provided, having stored therein instructions that are loaded and executed by a processor to implement a method of correcting a projected image as described in the above aspect.
In a further aspect, there is provided a computer program product containing instructions which, when run on a computer, cause the computer to perform the method of correcting a projected image as described in the preceding aspect.
The beneficial effects brought by the technical solution provided in this application include at least the following:
The application provides a projection device and a method for correcting its projected image. A projection host in the projection device can acquire an image of the projection screen captured by a depth camera and determine the three-dimensional offsets of a plurality of feature points on the projection screen based on that image. Since the projection host can correct the display effect of the projected image based on the three-dimensional offsets of the plurality of feature points, a good display effect of the image projected onto the projection screen can be ensured. Moreover, because the projection host determines the three-dimensional offsets of the projection screen from an image captured by the depth camera, correction based on these offsets achieves a better calibration result.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below are only some embodiments of the present application, and other drawings can be obtained by those of ordinary skill in the art from these drawings without creative effort.
Fig. 1 is a schematic structural diagram of a projection apparatus provided in an embodiment of the present application;
FIG. 2 is a schematic structural diagram of another projection apparatus provided in an embodiment of the present application;
FIG. 3 is a flowchart of a method for correcting a projected image according to an embodiment of the present disclosure;
FIG. 4 is a flowchart of another method for correcting a projected image according to an embodiment of the present application;
fig. 5 is a schematic diagram illustrating shifting of a projection screen according to an embodiment of the present disclosure.
Detailed Description
To make the objects, technical solutions, and advantages of the present application clearer, embodiments of the present application are described in further detail below with reference to the accompanying drawings.
Fig. 1 is a schematic structural diagram of a projection apparatus provided in an embodiment of the present application. As shown in fig. 1, the projection apparatus may include a projection screen 10, a projection host 20, and a connection bracket 30 connecting the projection screen 10 and the projection host 20. Since the projection screen 10 is integrally connected to the projection host 20 through the connection bracket 30, the projection apparatus may also be referred to as an integrated projection apparatus or a desktop projection apparatus.
Fig. 2 is a side view of the projection device shown in fig. 1. As shown in fig. 2, the optical engine 21 of the projection apparatus is located in the projection host 20, and the light beam emitted by the optical engine 21 travels obliquely upward to the projection screen 10, so that the projected image is projected onto the projection screen 10.
It is understood that the projection screen 10 may be movably coupled to the connection bracket 30. For example, the projection screen 10 may be deflectable relative to the connection bracket 30. The user can adjust the position of the projection screen 10 as desired. Therefore, the relative position between the projection screen 10 and the projection host 20 is not fixed.
When the position of the projection screen 10 changes, the light beam emitted by the optical engine 21 of the projection host 20 may no longer be aligned with the projection screen 10, so that the shape of the projection image projected onto the projection screen 10 is deformed (for example, the projection image displayed on the projection screen 10 becomes trapezoidal), or the projection image cannot be completely projected onto the projection screen 10.
In embodiments of the present application, the projection device may further include a depth camera (not shown in fig. 1) located on the projection host 20. The depth camera may capture the projection screen 10 after the projection device receives the correction instruction. When the depth camera photographs the projection screen 10, it may not only obtain a two-dimensional (2D) image of the projection screen 10, but also obtain a distance between each point in the projection screen 10 and the depth camera. The 2D image of the projection screen may also be referred to as a Red Green Blue (RGB) image of the projection screen. The depth camera may also be referred to as a three-dimensional (3D) camera or a 3D depth camera.
Alternatively, the depth camera may be a binocular stereo depth camera, a structured light camera, or a time of flight (TOF) depth camera.
Fig. 3 is a schematic flowchart of a method for correcting a projection image according to an embodiment of the present application. The method may be applied to a projection device, such as the projection device shown in fig. 1 or fig. 2. As shown in fig. 1, the projection apparatus includes a projection screen, a projection host, and a connection bracket for connecting the projection screen and the projection host.
Referring to fig. 3, the method includes:
and step 101, responding to the correction instruction, and acquiring a shot image obtained by shooting the projection screen by the depth camera.
In the embodiment of the application, after the projection host of the projection device detects the correction instruction, a shot image obtained by shooting the projection screen by the depth camera can be obtained. Wherein the captured image may include a 2D image of the projection screen and distance information between various points in the projection screen and the depth camera.
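As a concrete illustration, the capture can be thought of as an RGB image paired with a per-pixel depth map. The Python sketch below is only an assumed representation of that data; the class and field names are hypothetical and do not come from this application.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class ScreenCapture:
    """Hypothetical container for one depth-camera shot of the projection screen."""
    rgb: np.ndarray    # H x W x 3 uint8 array: the 2D (RGB) image of the screen
    depth: np.ndarray  # H x W float32 array: distance from the depth camera, e.g. in meters

    def distance_at(self, u: int, v: int) -> float:
        """Distance between the depth camera and the screen point imaged at pixel (u, v)."""
        return float(self.depth[v, u])
```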
Optionally, the correction instruction may be generated in response to a press of a correction button (or a power-on button); both the projection host and a remote controller for controlling the projection host may be provided with such a button. Alternatively, the projection host may periodically generate a correction instruction and, in response, acquire a captured image obtained by the depth camera photographing the projection screen. That is, the projection host may perform the correction process periodically.
Step 102: determine the measured three-dimensional coordinates of a plurality of feature points of the projection screen based on the captured image.
In this embodiment of the application, the projection host may determine the two-dimensional coordinates of each feature point of the projection screen based on the 2D image of the projection screen in the captured image, and then determine the measured three-dimensional coordinates of each feature point based on the distance between that feature point and the depth camera.
In the embodiment of the present application, the measured three-dimensional coordinates of each feature point may refer to coordinates in a reference three-dimensional coordinate system. The origin of coordinates of the reference three-dimensional coordinate system may be the location where the depth camera is located.
The plurality of feature points may be preset feature points in the projection screen. For example, the plurality of feature points may include respective vertices of the projection screen.
Step 103: determine the three-dimensional offset of the measured three-dimensional coordinates of each of the plurality of feature points relative to the reference three-dimensional coordinates of that feature point.
In this embodiment of the present application, the projection host stores the reference three-dimensional coordinates of the plurality of feature points in advance; these reference three-dimensional coordinates may be the three-dimensional coordinates determined when the optical engine in the projection host and the projection screen are strictly aligned. For each of the plurality of feature points, the projection host can calculate the three-dimensional offset between the measured three-dimensional coordinates and the reference three-dimensional coordinates of that feature point.
The three-dimensional offset may include the offset distances of the feature point along a first direction x, a second direction y, and a third direction z. Referring to fig. 2, the first direction x, the second direction y, and the third direction z are mutually perpendicular; the first direction x and the second direction y are parallel to the projection screen, and the third direction z is perpendicular to the projection screen.
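Assuming the measured and reference coordinates are already expressed in the same reference three-dimensional coordinate system, the per-axis offsets of step 103 reduce to a component-wise subtraction. The following is a minimal Python sketch with illustrative names and values, not a definitive implementation of this application.

```python
import numpy as np


def three_dimensional_offsets(measured: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Per-feature-point offset distances along x, y and z.

    measured, reference: N x 3 arrays of coordinates in the reference coordinate system.
    Returns an N x 3 array whose i-th row is (dx, dy, dz) for the i-th feature point.
    """
    return measured - reference


# Illustrative values: four screen vertices, each shifted by 0.02 along y and 0.01 along z.
reference = np.array([[0.0, 0.0, 0.0], [1.6, 0.0, 0.0], [1.6, 0.9, 0.0], [0.0, 0.9, 0.0]])
measured = reference + np.array([0.0, 0.02, 0.01])
print(three_dimensional_offsets(measured, reference))  # every row is [0.   0.02 0.01]
```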
Step 104: correct the display effect of the projected image based on the three-dimensional offsets of the plurality of feature points.
In this embodiment of the present application, the projection host can adjust the relevant parameters of the optical engine based on the three-dimensional offsets of the plurality of feature points, so that the light beam emitted by the optical engine is realigned with the projection screen, thereby correcting the display effect of the projected image.
Optionally, the display effect of the projection image may include a projection position of the projection image on the projection screen, and/or a projection shape of the projection image on the projection screen. The projection position of the projection image after correction is located in the projection screen, and the projection shape of the projection image is the same as the shape of the projection screen. The projection position of the projection image is located in the projection screen, that is, the projection positions of all pixels in the projection image are located in the projection screen, and the projection positions of the edge pixels of the projection image are aligned with the edge of the projection area of the projection screen. The edge pixels of the projection image refer to pixels located in the outermost peripheral area of the projection image.
In summary, the embodiment of the present application provides a method for correcting a projected image. The projection host in the projection device can acquire a captured image obtained by the depth camera photographing the projection screen, and determine the three-dimensional offsets of a plurality of feature points of the projection screen based on that image. Because the projection host can correct the display effect of the projected image based on the three-dimensional offsets of the plurality of feature points, a good display effect of the image projected onto the projection screen can be ensured. Moreover, since the projection host determines the three-dimensional offsets of the projection screen from an image captured by the depth camera, correction based on these offsets achieves a better calibration result.
Fig. 4 is a schematic flowchart of another method for correcting a projection image according to an embodiment of the present disclosure. The method may be applied to a projection device, such as the projection device shown in fig. 1 or fig. 2. Referring to fig. 4, the method includes:
Step 201: in response to a correction instruction, acquire a captured image obtained by the depth camera photographing the projection screen.
In this embodiment of the application, the projection device comprises a depth camera, and the depth camera is arranged on the projection host. After the projection host of the projection device detects the correction instruction, it can acquire a captured image obtained by the depth camera photographing the projection screen. The captured image comprises a 2D image of the projection screen and distance information between points on the projection screen and the depth camera.
Optionally, the correction instruction may be generated in response to a press of a correction button (or a power-on button); both the projection host and a remote controller for controlling the projection host may be provided with such a button. Alternatively, the projection host may periodically generate a correction instruction and, in response, acquire a captured image obtained by the depth camera photographing the projection screen. That is, the projection host may perform the correction process periodically.
The depth camera may photograph the projection screen in real time after the projection device is powered on; alternatively, it may photograph the projection screen periodically; or it may be started under the control of the projection host and then photograph the projection screen.
Step 202: identify a plurality of feature points of the projection screen from the captured image.
In this embodiment of the application, the projection host may process the captured image based on a preset feature point identification algorithm to identify a plurality of feature points in the captured image. For example, the projection host may identify a plurality of feature points in the captured image using a corner detection algorithm, an edge detection algorithm, or a speeded-up robust features (SURF) algorithm.
Alternatively, the projection screen may be polygonal, and the plurality of feature points of the projection screen may include a plurality of vertices of the polygon. For example, the projection screen may be rectangular, and the plurality of feature points of the projection screen may include four vertices of the rectangle.
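As one concrete (and purely illustrative) realization of step 202, the four vertices of a rectangular screen can be located in the RGB part of the captured image by finding the largest quadrilateral contour. The sketch below assumes OpenCV; it is just one of the corner/edge detection options mentioned above, and the thresholds are arbitrary.

```python
from typing import Optional

import cv2
import numpy as np


def find_screen_vertices(image: np.ndarray) -> Optional[np.ndarray]:
    """Return the pixel coordinates (4 x 2) of the largest quadrilateral contour, or None.

    image: color image of the projection screen (OpenCV images are typically BGR).
    """
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(cv2.GaussianBlur(gray, (5, 5), 0), 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    best, best_area = None, 0.0
    for contour in contours:
        # Approximate the contour by a polygon; keep it only if it is a quadrilateral.
        approx = cv2.approxPolyDP(contour, 0.02 * cv2.arcLength(contour, True), True)
        area = cv2.contourArea(approx)
        if len(approx) == 4 and area > best_area:
            best, best_area = approx.reshape(4, 2), area
    return best
```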
Step 203: for each of the plurality of feature points, determine the measured three-dimensional coordinates of the feature point according to its two-dimensional coordinates in the captured image and the distance between the feature point and the depth camera.
In an embodiment of the present application, the captured image may include an RGB image and a depth image of the projection screen. Based on the RGB image, the projection host may determine the two-dimensional coordinates of each feature point in the camera coordinate system of the depth camera. Based on the depth value of each feature point in the depth image, the projection host may determine the distance between that feature point and the depth camera. The projection host can thus obtain the three-dimensional coordinates of each feature point in the camera coordinate system. The projection host can then determine the conversion relationship between the camera coordinate system and the reference three-dimensional coordinate system from the camera parameters of the depth camera, and use this conversion relationship to determine the measured three-dimensional coordinates of each feature point in the reference three-dimensional coordinate system. The origin of the reference three-dimensional coordinate system may be the position where the depth camera is located.
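A minimal sketch of step 203 under a pinhole-camera assumption: a feature point's pixel coordinates (u, v) and its depth Z are back-projected into the depth camera's coordinate system using the intrinsics (fx, fy, cx, cy), and the result is then transformed into the reference coordinate system with a rotation R and translation t assumed to be known from the camera parameters. The names and the exact camera model are illustrative.

```python
import numpy as np


def backproject(u: float, v: float, depth: float,
                fx: float, fy: float, cx: float, cy: float) -> np.ndarray:
    """Pixel (u, v) with depth value `depth` -> 3D point in the depth-camera coordinate system."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])


def to_reference_frame(p_cam: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Apply the assumed camera-to-reference conversion: p_ref = R @ p_cam + t."""
    return R @ p_cam + t
```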
Step 204: determine the three-dimensional offset of the measured three-dimensional coordinates of each of the plurality of feature points relative to the reference three-dimensional coordinates of that feature point.
In this embodiment of the application, the projection host stores the reference three-dimensional coordinates of the plurality of feature points in advance; these reference three-dimensional coordinates may be the three-dimensional coordinates determined when the optical engine in the projection host and the projection screen are strictly aligned. For each of the plurality of feature points, the projection host can calculate the three-dimensional offset between the measured three-dimensional coordinates and the reference three-dimensional coordinates of that feature point.
The three-dimensional offset may include the offset distances of the feature point along a first direction x, a second direction y, and a third direction z. Referring to fig. 2, the first direction x, the second direction y, and the third direction z are mutually perpendicular; the first direction x and the second direction y are parallel to the projection screen, and the third direction z is perpendicular to the projection screen.
For example, referring to fig. 5, if the projection screen is rectangular, the feature points are the four vertices of the projection screen. The reference three-dimensional coordinates of feature point A may be (x1, y1, z1), those of feature point B may be (x2, y2, z2), those of feature point C may be (x3, y3, z3), and those of feature point D may be (x4, y4, z4).
As shown in fig. 5, if the user deflects the projection screen about the x-axis, then based on the captured image obtained by the depth camera photographing the projection screen, the projection host may determine the measured three-dimensional coordinates of the deflected feature points to be A'(x1, y1+a, z1+b), B'(x2, y2+a, z2+b), C'(x3, y3-a, z3-b), and D'(x4, y4-a, z4-b). Based on these measured three-dimensional coordinates, the projection device may determine that the offset distances of the four feature points along the first direction x are all 0, along the second direction y are all |a|, and along the third direction z are all |b|.
Step 205: if the three-dimensional offset is greater than an offset threshold, determine correction parameters of the projection device based on the three-dimensional offsets of the plurality of feature points.
In this embodiment of the present application, after the projection host determines the three-dimensional offset of each feature point, it may compare the three-dimensional offset with an offset threshold stored in the projection host. If the three-dimensional offsets of all of the plurality of feature points are less than or equal to the offset threshold, the projection host may determine that the displacement of the projection device is within the allowable error range, so the display effect of the projected image does not need to be corrected. The offset threshold may be pre-stored in the projection host.
As one possible example, the offset threshold may be a three-dimensional threshold, that is, it may include three sub-thresholds in one-to-one correspondence with the three offset distances of a feature point's three-dimensional offset. For the three-dimensional offset of each feature point, the projection host may compare the offset distance of each dimension with the corresponding sub-threshold. If the offset distance of any dimension of the three-dimensional offset of a feature point is greater than the corresponding sub-threshold, it can be determined that the three-dimensional offset is greater than the offset threshold.
As another possible example, the offset threshold may be a one-dimensional distance threshold. Accordingly, the three-dimensional offset of a feature point may be taken as the straight-line distance between its measured three-dimensional coordinates and its reference three-dimensional coordinates, which may be calculated from the offset distances along the three directions.
If the three-dimensional offset of any one of the plurality of feature points is greater than the offset threshold, the projection host may determine that the displacement of the projection device exceeds the allowable error range, and may therefore determine the correction parameters of the projection device based on the three-dimensional offsets of the plurality of feature points.
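The two threshold strategies in the examples above can be sketched as follows; this is a minimal illustration, and the function names and the shapes of the thresholds are assumptions rather than part of this application.

```python
import numpy as np


def exceeds_per_axis_threshold(offsets: np.ndarray, sub_thresholds: np.ndarray) -> bool:
    """offsets: N x 3 per-axis offsets; sub_thresholds: length-3 array, one sub-threshold per axis.
    True if any axis of any feature point's offset exceeds the corresponding sub-threshold."""
    return bool(np.any(np.abs(offsets) > sub_thresholds))


def exceeds_distance_threshold(measured: np.ndarray, reference: np.ndarray,
                               distance_threshold: float) -> bool:
    """True if the straight-line distance between the measured and reference coordinates
    of any feature point exceeds the one-dimensional distance threshold."""
    distances = np.linalg.norm(measured - reference, axis=1)
    return bool(np.any(distances > distance_threshold))
```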
In a first possible implementation, the correction parameters may include a rotation parameter and a translation parameter. The projection host can determine translation information and/or rotation information of the projection screen based on the three-dimensional offsets of the plurality of feature points, and determine the translation parameter and the rotation parameter of the optical engine in the projection host based on that translation information and rotation information. The translation information may include the translation distances of the projection screen along the first direction x, the second direction y, and the third direction z. The rotation information may include the rotation angle of the projection screen about the first direction x, the second direction y, or the third direction z.
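One standard way to recover the screen's rotation and translation information from the reference and measured feature-point coordinates is a least-squares rigid alignment (the Kabsch/Umeyama method). The application does not prescribe any particular algorithm for this step; the sketch below is only one possible, assumption-laden illustration.

```python
import numpy as np


def estimate_rigid_transform(reference: np.ndarray, measured: np.ndarray):
    """Estimate R (3x3 rotation) and t (length-3 translation) such that, in a least-squares
    sense, measured_i ~= R @ reference_i + t for the N corresponding feature points."""
    ref_centroid = reference.mean(axis=0)
    mea_centroid = measured.mean(axis=0)
    H = (reference - ref_centroid).T @ (measured - mea_centroid)  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against an improper (reflected) solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = mea_centroid - R @ ref_centroid
    return R, t
```

The rotation angles about the x, y, and z directions can then be extracted from R, and t gives the translation distances along the three directions.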
In a second possible implementation, the correction parameters may include the offsets between the target projection positions and the initial projection positions of the plurality of feature points. The initial projection positions are the positions of the plurality of feature points on the projection screen when the projection screen has not shifted, and the target projection positions are the positions of the plurality of feature points on the projection screen after the projection screen has shifted.
In this implementation, while the projection screen is unshifted, the projection host may determine in advance the initial projection positions of the plurality of feature points on the projection screen and their positions in the captured image, and from these determine the conversion relationship between the image coordinate system of the captured image and the screen coordinate system of the projection screen. After the projection screen has shifted, the projection host may determine the target projection positions of the plurality of feature points on the projection screen according to this conversion relationship and the positions of the plurality of feature points in the newly captured image. The projection host can then calculate the three-dimensional offsets between the target projection positions and the initial projection positions of the plurality of feature points on the projection screen, thereby obtaining the correction parameters.
For example, the projection host may calculate the three-dimensional offset of the target projection position from the initial projection position of each feature point, and may use the average of the three-dimensional offsets of the plurality of feature points as the correction parameter.
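A minimal sketch of this second implementation, assuming OpenCV is available and modeling the conversion relationship as a planar homography estimated while the screen is unshifted. It works with 2D positions in the screen plane, and every name below is illustrative rather than taken from this application.

```python
import cv2
import numpy as np


def calibrate_image_to_screen(image_pts_unshifted: np.ndarray,
                              screen_pts_initial: np.ndarray) -> np.ndarray:
    """Conversion from the captured-image coordinate system to the screen coordinate system,
    estimated from the feature points while the projection screen is known to be unshifted."""
    H, _ = cv2.findHomography(image_pts_unshifted.astype(np.float32),
                              screen_pts_initial.astype(np.float32))
    return H


def correction_parameter(H: np.ndarray, image_pts_shifted: np.ndarray,
                         screen_pts_initial: np.ndarray) -> np.ndarray:
    """Map the shifted feature points onto the screen (target projection positions) and
    average their offsets from the initial projection positions."""
    pts = image_pts_shifted.reshape(-1, 1, 2).astype(np.float32)
    target = cv2.perspectiveTransform(pts, H).reshape(-1, 2)
    return np.mean(target - screen_pts_initial, axis=0)
```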
Step 206: correct the display effect of the projected image according to the correction parameters.
In the embodiment of the present application, the projection host stores a correction algorithm for the projected image in advance. After the projection host determines the correction parameters of the projected image, the display effect of the projected image can be corrected through the correction algorithm.
It can be understood that correcting the display effect of the projected image means correcting the relevant parameters of the optical engine so that the light beam emitted by the optical engine is aligned with the projection screen.
In the first implementation, the projection host may adjust the focal length of the projection lens in the optical engine based on the translation parameter in the correction parameters, so that the projected image can be clearly and completely projected onto the projection screen. The projection host may also adjust the projection angle of the projection lens (for example, by rotating the lens and/or adjusting its elevation angle) based on the rotation parameter in the correction parameters, so that the shape and area of the image projected onto the projection screen by the projection lens fit the projection screen.
In the second implementation manner, the projection host may correct the projection position of each pixel in the projection image directly based on the correction parameter.
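As a hedged illustration of this per-pixel correction, one way to realize it in software is to pre-warp each frame before it is handed to the optical engine, so that every pixel lands on its corrected projection position. The sketch assumes OpenCV and that the corrected corner positions are already known; it is not an implementation prescribed by this application.

```python
import cv2
import numpy as np


def prewarp_frame(frame: np.ndarray, src_corners: np.ndarray, dst_corners: np.ndarray) -> np.ndarray:
    """Warp the frame so that its four corners land on the corrected (dst) positions
    instead of the uncorrected (src) ones; the output keeps the original frame size."""
    M = cv2.getPerspectiveTransform(src_corners.astype(np.float32),
                                    dst_corners.astype(np.float32))
    height, width = frame.shape[:2]
    return cv2.warpPerspective(frame, M, (width, height))
```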
Alternatively, the display effect of the projected image may include a projection position of the projected image on the projection screen, and/or a projection shape of the projected image on the projection screen. The projection position of the projection image after correction is located in the projection screen, and the projection shape of the projection image is the same as the shape of the projection screen. The projection position of the projection image is located in the projection screen, that is, the projection positions of all pixels in the projection image are located in the projection screen, and the projection positions of the edge pixels of the projection image are aligned with the edge of the projection area of the projection screen. The edge pixel of the projection image refers to a pixel located in the outermost peripheral area of the projection image.
In summary, the embodiment of the present application provides a method for correcting a projected image. The projection host in the projection equipment can acquire a shot image obtained by shooting the projection screen by the depth camera, and determines the three-dimensional offset of a plurality of feature points in the projection screen based on the shot image. Since the projection host can correct the display effect of the projected image based on the three-dimensional offset amounts of the plurality of feature points, the display effect of the projected image projected onto the projection screen can be ensured to be good. And the projection host determines the three-dimensional offset of the projection screen according to a shot image shot by the depth camera, so that the calibration effect of correcting based on the three-dimensional offset is better.
The embodiment of the present application provides a projection apparatus, as shown in fig. 1 and fig. 2, the projection apparatus includes a projection screen 10, a projection host 20, and a connection bracket 30 connecting the projection screen 10 and the projection host 20. The projection host 20 is configured to:
in response to the correction instruction, a captured image obtained by the depth camera capturing the projection screen 10 is acquired.
The measured three-dimensional coordinates of the plurality of feature points of the projection screen 10 are determined based on the captured image.
A three-dimensional offset of the measured three-dimensional coordinates of each of the plurality of feature points from the reference three-dimensional coordinates of the feature points is determined.
And correcting the display effect of the projection image based on the three-dimensional offset of the plurality of feature points.
Optionally, the projection host 20 is configured to:
and determining correction parameters of the projection equipment based on the three-dimensional offset of the plurality of feature points, wherein the correction parameters comprise a rotation parameter and a translation parameter.
And correcting the display effect of the projection image according to the correction parameters.
Optionally, the projection host 20 is configured to:
and if the three-dimensional offset is larger than the offset threshold, correcting the display effect of the projected image based on the three-dimensional offsets of the plurality of characteristic points.
Optionally, the projection screen 10 is polygonal, and the plurality of feature points of the projection screen 10 include a plurality of vertices of the polygon.
Optionally, the projection host 20 is configured to:
a plurality of feature points of the projection screen 10 are identified from the captured image.
And for each feature point in the plurality of feature points, determining the actually measured three-dimensional coordinates of the feature point according to the two-dimensional coordinates of the feature point in the shot image and the distance between the feature point and the depth camera.
Optionally, the projection device includes a depth camera, and the depth camera is disposed on the projection host 20.
To sum up, the embodiment of the present application provides a projection device. The projection host in the projection device can acquire a captured image obtained by the depth camera photographing the projection screen and determine the three-dimensional offsets of a plurality of feature points of the projection screen based on that image. Because the projection host can correct the display effect of the projected image based on the three-dimensional offsets of the plurality of feature points, a good display effect of the image projected onto the projection screen can be ensured. Moreover, since the projection host determines the three-dimensional offsets of the projection screen from an image captured by the depth camera, correction based on these offsets achieves a better calibration result.
It can be understood that the projection apparatus provided in the above embodiment and the embodiment of the method for correcting a projected image of the projection apparatus belong to the same concept, and specific implementation processes thereof are described in the method embodiment, and are not described herein again.
The embodiment of the application provides a projection device, and a projection host in the projection device comprises: a memory, a processor and a computer program stored on the memory, the processor implementing the method of correcting a projected image (e.g. the method shown in fig. 3 or fig. 4) as provided in the above method embodiments when executing the computer program.
The present application provides a computer-readable storage medium, which stores instructions that are loaded and executed by a processor to implement the method embodiments (e.g., the method shown in fig. 3 or fig. 4) as described above.
Embodiments of the present application provide a computer program product comprising instructions which, when run on a computer, cause the computer to perform a method of correcting a projected image (e.g. the method shown in fig. 3 or fig. 4) as provided in the above method embodiments.
It will be understood by those skilled in the art that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing the relevant hardware; the program may be stored in a computer-readable storage medium, and the storage medium mentioned above may be a read-only memory, a magnetic disk, or an optical disc.
The term "plurality" in this application means two or more. The term "and/or" in this application is only one kind of association relationship describing the associated object, and means that there may be three kinds of relationships, for example, a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone.
In this application, the terms "first," "second," and the like are used to distinguish identical or similar items having substantially the same function and purpose. It should be understood that "first," "second," and "nth" imply no logical or temporal dependency and do not limit the number of items or the order of execution.
The above description is only exemplary of the present application and should not be taken as limiting the present application, and any modifications, equivalents, improvements and the like that are made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (10)

1. A method for correcting a projected image, characterized in that the method is applied to a projection device, the projection device comprising a projection screen, a projection host, and a connection bracket for connecting the projection screen and the projection host; the method comprises the following steps:
responding to a correction instruction, and acquiring a shot image obtained by shooting the projection screen by a depth camera;
determining measured three-dimensional coordinates of a plurality of feature points of the projection screen based on the shot image;
determining a three-dimensional offset of a measured three-dimensional coordinate of each of the plurality of feature points compared to a reference three-dimensional coordinate of the feature point;
and correcting the display effect of the projection image based on the three-dimensional offset of the plurality of feature points.
2. The method of claim 1, wherein correcting the display effect of the projected image based on the three-dimensional offset of the plurality of feature points if the three-dimensional offset is greater than an offset threshold comprises:
determining correction parameters of the projection device based on the three-dimensional offsets of the plurality of feature points, wherein the correction parameters comprise a rotation parameter and a translation parameter;
and correcting the display effect of the projection image according to the correction parameter.
3. The method of claim 1, wherein correcting the display effect of the projected image based on the three-dimensional offsets of the plurality of feature points comprises:
and if the three-dimensional offset is larger than the offset threshold, correcting the display effect of the projected image based on the three-dimensional offsets of the plurality of feature points.
4. The method of claim 1, wherein the projection screen is a polygon, and wherein the plurality of feature points of the projection screen comprise a plurality of vertices of the polygon.
5. The method of any of claims 1 to 4, wherein said determining measured three-dimensional coordinates of a plurality of feature points of said projection screen based on said captured image comprises:
identifying a plurality of feature points of the projection screen from the captured image;
for each of the plurality of feature points, determining measured three-dimensional coordinates of the feature point according to two-dimensional coordinates of the feature point in the captured image and a distance between the feature point and the depth camera.
6. The method of any of claims 1 to 4, wherein the projection device comprises the depth camera, and the depth camera is disposed on the projection host.
7. A projection device, characterized by comprising a projection screen, a projection host, and a connection bracket for connecting the projection screen and the projection host, wherein the projection host is configured to:
responding to a correction instruction, and acquiring a shot image obtained by shooting the projection screen by a depth camera;
determining measured three-dimensional coordinates of a plurality of feature points of the projection screen based on the shot image;
determining a three-dimensional offset of a measured three-dimensional coordinate of each feature point in the plurality of feature points compared to a reference three-dimensional coordinate of the feature point;
and correcting the display effect of the projection image based on the three-dimensional offsets of the plurality of feature points.
8. The projection device of claim 7, wherein the projection host is configured to:
determining correction parameters of the projection device based on the three-dimensional offsets of the plurality of feature points, wherein the correction parameters comprise a rotation parameter and a translation parameter;
and correcting the display effect of the projection image according to the correction parameter.
9. The projection device of claim 7, wherein the projection host is configured to correct a display effect of the projected image based on the three-dimensional offset amount of the plurality of feature points if the three-dimensional offset amount is greater than an offset threshold.
10. The projection device of any of claims 7 to 9, wherein the projection screen is a polygon, and wherein the plurality of feature points of the projection screen comprise a plurality of vertices of the polygon.
CN202210729303.1A 2022-06-24 2022-06-24 Projection equipment and correction method of projection image thereof Pending CN115134570A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210729303.1A CN115134570A (en) 2022-06-24 2022-06-24 Projection equipment and correction method of projection image thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210729303.1A CN115134570A (en) 2022-06-24 2022-06-24 Projection equipment and correction method of projection image thereof

Publications (1)

Publication Number Publication Date
CN115134570A true CN115134570A (en) 2022-09-30

Family

ID=83379542

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210729303.1A Pending CN115134570A (en) 2022-06-24 2022-06-24 Projection equipment and correction method of projection image thereof

Country Status (1)

Country Link
CN (1) CN115134570A (en)


Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication