CN114401388A - Projection method, projection device, storage medium and projection equipment

Projection method, projection device, storage medium and projection equipment

Info

Publication number
CN114401388A
Authority
CN
China
Prior art keywords
projection
target
image
pixel
boundary
Prior art date
Legal status
Pending
Application number
CN202210094719.0A
Other languages
Chinese (zh)
Inventor
郑炯彬
张聪
胡震宇
Current Assignee
Shenzhen Huole Science and Technology Development Co Ltd
Original Assignee
Shenzhen Huole Science and Technology Development Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Huole Science and Technology Development Co Ltd
Priority to CN202210094719.0A
Publication of CN114401388A

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00: Details of colour television systems
    • H04N 9/12: Picture reproducers
    • H04N 9/31: Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N 9/3179: Video signal processing therefor
    • H04N 9/3188: Scale or resolution adjustment
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/10: Segmentation; Edge detection
    • G06T 7/13: Edge detection
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/60: Analysis of geometric attributes
    • G06T 7/64: Analysis of geometric attributes of convexity or concavity

Abstract

The present disclosure relates to a projection method, a projection apparatus, a storage medium, and a projection device, in the technical field of projection. The method comprises: determining boundary curvature information of a first boundary line of a target projection picture projected onto a projection area by the projection device in a forward projection state; and projecting a target image whose edge region contains opaque pixels set according to the boundary curvature information, so that the boundary of the picture projected from the target image in the projection area appears as a straight line to the user. The projection method provided by the present disclosure thus eliminates, in software, the "burrs" that the lens of the projection device introduces at the boundary of the projection picture; it can greatly reduce the production cost of the projection device while correcting distortion quickly and efficiently.

Description

Projection method, projection device, storage medium and projection equipment
Technical Field
The present disclosure relates to the field of projection technologies, and in particular to a projection method, a projection apparatus, a storage medium, and a projection device.
Background
During projection, defects in the lens of the projection device, in the image quality, or in the modulation plane of the projection device may distort the edges of the presented projection picture. Such distortion usually has to be corrected manually, which sharply increases the production cost of the projection device.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
In a first aspect, the present disclosure provides a projection method, including:
determining boundary curvature information of a first boundary line of a target projection picture projected onto a projection area by a projection device in a forward projection state; and
projecting a target image, wherein an edge region of the target image comprises opaque pixels set according to the boundary curvature information, so that the boundary of the projection picture produced by the target image in the projection area appears as a straight line to the user.
In a second aspect, the present disclosure provides a projection apparatus comprising:
a determination module configured to determine boundary curvature information of a first boundary line of a target projection picture projected onto a projection area by a projection device in a forward projection state; and
a projection module configured to project a target image, wherein an edge region of the target image comprises opaque pixels set according to the boundary curvature information, so that the boundary of the projection picture produced by the target image in the projection area appears as a straight line to the user.
In a third aspect, the present disclosure provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processing apparatus, performs the steps of the method of the first aspect.
In a fourth aspect, the present disclosure provides a projection device comprising:
a storage device having a computer program stored thereon;
processing means for executing the computer program in the storage means to carry out the steps of the method of the first aspect.
Based on the above technical solution, boundary curvature information of the first boundary line of the target projection picture projected onto the projection area by the projection device in the forward projection state is determined, and a target image is constructed from this information: the edge region of the target image contains opaque pixels set according to the boundary curvature information. When the target image is projected, the boundary of the resulting projection picture appears as a straight line to the user. The projection method provided by the present disclosure thus removes, in software, the "burrs" that the lens of the projection device introduces at the boundary of the projection picture, greatly reducing production cost while correcting distortion quickly and efficiently.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements and features are not necessarily drawn to scale. In the drawings:
fig. 1 is a flow chart of a proposed projection method according to an exemplary embodiment.
Fig. 2 is a schematic diagram of a projection screen in a forward projection state according to an exemplary embodiment.
Fig. 3 is a schematic illustration of a proposed first borderline according to an exemplary embodiment.
FIG. 4 is a schematic illustration of a proposed target image according to an exemplary embodiment.
FIG. 5 is a schematic illustration of a proposed target image according to another exemplary embodiment.
Fig. 6 is a detailed flowchart of step 110 shown in fig. 1.
Fig. 7 is a schematic illustration of different proposed projected picture distortions in accordance with an exemplary embodiment.
Fig. 8 is a detailed flowchart of step 113 shown in fig. 6.
Fig. 9 is a schematic diagram of a proposed pixel difference value according to an exemplary embodiment.
FIG. 10 is a flow diagram of constructing a target image, according to an example embodiment.
FIG. 11 is a schematic diagram of a proposed construction of a target image according to an exemplary embodiment.
Fig. 12 is a schematic diagram of a target image after transparency gradient processing according to an exemplary embodiment.
Fig. 13 is a schematic block diagram of a projection apparatus according to an exemplary embodiment.
Fig. 14 is a schematic structural diagram of a proposed projection device according to an exemplary embodiment.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order, and/or performed in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that the modifiers "a", "an", and "the" in this disclosure are illustrative rather than limiting, and those skilled in the art will understand them as "one or more" unless the context clearly indicates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
Fig. 1 is a flow chart of a proposed projection method according to an exemplary embodiment. The projection method proposed by the present disclosure may be executed by a projection device, specifically by a projection apparatus that may be implemented in software and/or hardware and configured in the projection device. As shown in fig. 1, the projection method may include the following steps.
In step 110, boundary curvature information of a first boundary line of a target projection screen projected by a projection apparatus to a projection area in a forward projection state is determined.
Here, the projection area is the area that carries the projection picture; it may be a wall surface, a curtain, or the like. Projecting in the forward (orthographic) projection state means that, with the projection area perpendicular to the ground, the projection device is placed horizontally so that its optical axis is perpendicular to the projection area. Fig. 2 is a schematic diagram of a projection picture in the forward projection state according to an exemplary embodiment. As shown in fig. 2, if the projection device 20 is free of distortion, the projection picture 22 projected by the projection device 20 onto the projection area 21 in the forward projection state appears as a standard rectangle.
The boundary curvature information may be the position information of each pixel point on the first boundary line of the target projection picture; it reflects the degree of curvature of the first boundary line. It should be understood that, in the embodiments of the present disclosure, the first boundary line may refer to any of the four boundary lines of the target projection picture, or only to a boundary line that is actually curved, as the situation requires. Fig. 3 is a schematic illustration of a proposed first boundary line according to an exemplary embodiment. As shown in fig. 3, the target projection picture 30 includes boundary line AB, boundary line BD, boundary line CD, and boundary line AC; if boundary line BD is a curved boundary line, the boundary curvature information of the first boundary line is the boundary curvature information of boundary line BD.
It should be understood that the boundary curvature information of the first boundary line obtained in the forward projection state can be regarded as image distortion of the target projection picture caused by discontinuity in the curvature of the lens of the projection device. When the curvature of the lens is discontinuous, uneven "burrs" appear at the edge of the target projection picture, so the projection picture no longer looks like a standard rectangle to the user.
In some embodiments, the boundary curvature information of the first boundary line may be acquired by the projection device itself. For example, before formal projection starts, the projection device projects a target projection picture onto the projection area in the forward projection state and determines the boundary curvature information of the first boundary line from that picture.
In other embodiments, the projection device may receive the boundary curvature information of the first boundary line from an image calibration apparatus. For example, before the projection device leaves the factory, the manufacturer may store the boundary curvature information in the projection device; this information is determined from the target projection picture that the projection device projects onto the projection area in the forward projection state, as captured by the image calibration apparatus. During projection, the projection device then performs projection correction directly from the stored boundary curvature information, without any additional operation.
In step 120, a target image is projected, wherein an edge region of the target image includes opaque pixels set according to the boundary curvature information, so that the boundary of the projection picture produced by the target image in the projection area appears as a straight line to the user.
Here, the edge region of the target image includes opaque pixels set according to the boundary curvature information of the first boundary line. The opaque pixels occlude some pixels of the target image so that the occluded pixels are not displayed in the projection picture; together with the "burrs" produced by the lens of the projection device, this makes the boundary of the projection picture appear as a straight line to the user.
It should be understood that the boundary of the projection picture appearing as a straight line to the user means that it is a straight line as imaged by the human eye; the pixels on the boundary need not form a straight line in the strict mathematical sense.
As an example, the target image may be obtained by setting opaque pixels in the edge region of a projection image according to the boundary curvature information of the first boundary line. The projection image is the original image the user wants to project, such as a picture or a video frame. During projection, opaque pixels may be set in the edge region of each projection image according to the boundary curvature information of the first boundary line to obtain the target image.
FIG. 4 is a schematic illustration of a proposed target image according to an exemplary embodiment. As shown in fig. 4, sub-diagram (a) in fig. 4 includes a projection image 40, sub-diagram (b) includes a first projection screen 41, sub-diagram (c) includes a first target image 42, and sub-diagram (d) includes a second projection screen 43. Wherein each small square in fig. 4 represents a pixel point.
As shown in sub-diagram (b), in the first projection picture 41 produced by projecting the projection image 40 onto the projection area, the first pixel 411, second pixel 412, third pixel 413, and fourth pixel 414 on the right boundary protrude, so the right boundary of the first projection picture 41 is no longer a straight line. Setting opaque pixels on the right boundary of the projection image 40 according to the boundary curvature information of the right boundary of the first projection picture 41 yields the first target image 42, in which the pixel values of the fifth pixel 421, sixth pixel 422, seventh pixel 423, and eighth pixel 424 on the right boundary are set to an opaque value such as black. The projection effect of the first target image 42 on the projection area is shown as the second projection picture 43, whose right boundary appears as a straight line.
It should be understood that the opaque pixels of the target image display as black on the projection area, so the resulting picture is actually a projection with a black edge. Because the black edge is very fine, it does not affect viewing, while the distortion-correction effect is extremely obvious.
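By way of illustration only, the following minimal Python sketch reproduces this first approach, blacking out boundary pixels of a projection image row by row. It is not the disclosed implementation; the function name and the assumption that the curvature information arrives as per-row pixel differences for the right boundary are illustrative.

```python
import numpy as np

def mask_right_boundary(projection_image: np.ndarray,
                        pixel_diffs: list) -> np.ndarray:
    """For each pixel row, paint as many pixels at the right edge
    black (opaque) as the row's pixel difference indicates, so the
    visible right boundary recedes to the innermost row."""
    target = projection_image.copy()
    width = target.shape[1]
    for row, diff in enumerate(pixel_diffs):
        if diff > 0:
            target[row, width - diff:width] = 0  # opaque black pixels
    return target

# Differences 0,1,2,1,0 reproduce the masking pattern of fig. 4.
image = np.full((5, 5, 3), 255, dtype=np.uint8)
first_target_image = mask_right_boundary(image, [0, 1, 2, 1, 0])
```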
As another example, the target image may be obtained by superimposing the projection image and a corrected image, where the edge region of the corrected image includes opaque pixels set according to the boundary curvature information of the first boundary line and all other pixel regions of the corrected image are transparent.
It should be understood that image attributes of the corrected image, such as size and resolution, may match the projection image. The corrected image is effectively a mask layer over the projection image. Because the target image is obtained from the corrected image, the pixels of each frame of projection image need not be processed individually during projection, which speeds up distortion correction and saves computing resources.
FIG. 5 is a schematic illustration of a proposed target image according to another exemplary embodiment. As shown in fig. 5, sub-graph (e) in fig. 5 includes a corrected image 50, sub-graph (f) includes a second target image 51, and sub-graph (g) includes a third projection screen 52. Wherein each small square in fig. 5 represents a pixel point.
The image attributes of the corrected image 50 in sub-image (e) of fig. 5 match the projection image 40 in fig. 4. Opaque pixels are set on the right boundary of the corrected image 50 according to the boundary curvature information of the right boundary of the first projection picture 41 in fig. 4: the pixel values of the ninth pixel 501, tenth pixel 502, eleventh pixel 503, and twelfth pixel 504 on the right boundary are set to opaque, and all other pixel points are set to transparent. Superimposing the corrected image 50 onto the projection image 40 of fig. 4 yields the second target image 51 shown in sub-image (f) of fig. 5. The picture produced by projecting the second target image 51 onto the projection area is the third projection picture 52 shown in sub-diagram (g) of fig. 5; its display effect matches that of the second projection picture 43 in fig. 4.
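The superimposition itself amounts to ordinary alpha compositing. The sketch below assumes, purely for illustration, that the corrected image is stored as an RGBA array; the disclosure does not fix a particular representation.

```python
import numpy as np

def superimpose(projection_image: np.ndarray,
                corrected_image: np.ndarray) -> np.ndarray:
    """Alpha-composite an RGBA correction mask over an RGB projection
    image to obtain the target image: opaque mask pixels replace the
    image pixels, fully transparent ones leave them unchanged."""
    alpha = corrected_image[..., 3:4].astype(np.float32) / 255.0
    mask_rgb = corrected_image[..., :3].astype(np.float32)
    base = projection_image.astype(np.float32)
    return (mask_rgb * alpha + base * (1.0 - alpha)).astype(np.uint8)
```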
It should be noted that the above examples illustrate the principle of lens distortion correction with only one first boundary line of the target projection picture; the principle is the same for the other first boundary lines and is not repeated here. In addition, to make the principle easy to follow, the pixel points of the projection pictures in fig. 4 and fig. 5 were drawn in one-to-one correspondence with the pixel points of the projection image; in actual projection the pixels affected by lens distortion do not necessarily correspond one-to-one to pixels of the projection image, but the correction principle is unchanged.
Thus, boundary curvature information of the first boundary line of the target projection picture projected onto the projection area by the projection device in the forward projection state is determined, a target image whose edge region includes opaque pixels set according to that information is constructed, and the target image is projected, so that the boundary of the resulting projection picture appears as a straight line to the user. The projection method provided by the present disclosure removes, in software, the "burrs" that the lens of the projection device introduces at the boundary of the projection picture; it greatly reduces the production cost of the projection device while correcting distortion quickly and efficiently.
Fig. 6 is a detailed flowchart of step 110 shown in fig. 1. As shown in fig. 6, step 110 may include the following steps.
In step 111, the projection apparatus is controlled to project the target projection screen to the projection area in the forward projection state.
Here, the specific principle by which the projection device projects the target projection picture onto the projection area in the forward projection state has been described in the above embodiments and is not repeated here.
In some cases, "barrel/pincushion distortion" of the projected picture can occur because the lens is ground closer to a spherical surface than to a free-form surface. Fig. 7 is a schematic illustration of different projected picture distortions according to an exemplary embodiment. As shown in fig. 7, "barrel/pincushion distortion" mainly manifests as large distortion of the entire picture, while "burr distortion" mainly manifests as discontinuous curvature at a number of pixels on a boundary line. Barrel/pincushion distortion whose degree of deformation is at most a preset deformation threshold can be corrected by the projection method provided by the embodiments of the present disclosure; barrel/pincushion distortion exceeding that threshold can be corrected by calibrating the lens, which is prior art and is not detailed here.
It should be understood that, in the present disclosure, to prevent barrel/pincushion distortion exceeding the preset deformation threshold from affecting the correction of burr distortion, calibration parameters may be obtained by calibrating the lens of the projection device before the target projection picture is projected, and the target projection picture may then be projected according to these calibration parameters. The target projection picture therefore contains no barrel/pincushion distortion exceeding the preset deformation threshold.
In step 112, a target captured image is acquired, where the target captured image is obtained by a shooting device photographing the projection area under preset shooting conditions.
Here, the target captured image is the image captured when the shooting device photographs the projection area under preset shooting conditions while the projection device projects the target projection picture onto the projection area in the forward projection state. The preset shooting conditions ensure that the captured image retains only the edge defects of the projection picture, which may be caused by the lens of the projection device, the image quality, or the modulation plane of the projection device.
It should be understood that the shooting device may be a camera disposed on the projection device, or an external camera.
When the shooting device is an external camera and photographs the projection area under a target shooting condition, the first captured image it takes is used as the target captured image; the target shooting condition is that the optical axis of the shooting device is perpendicular to the projection area and passes through the center point of the target projection picture.
For example, when the external camera captures the target captured image, the projection area may be a translucent plate onto which the projection device projects in the forward projection state. When setting up the shooting device, the optical axes of the shooting device and of the projection device are observed through the translucent plate and aligned with each other.
It should be understood that, since the field of view of the camera is small and its lens is essentially free of distortion, any change in the curvature of the boundary line of the projection picture in a target captured image taken under the target shooting condition can be attributed to the lens of the projection device.
In some implementations, when the shooting device cannot photograph the projection area under the target shooting condition, a second captured image taken by the shooting device is acquired, and the target captured image is obtained by applying, to the coordinate information of each vertex of the second captured image, a perspective transformation matrix constructed from the target projection picture in the second captured image and the original image corresponding to the target projection picture.
In practice, shooting-space constraints may prevent the shooting device from photographing the projection area under the target shooting condition. In that case, a previously established perspective transformation matrix (also called a homography matrix) is applied to the coordinate information of each vertex of the second captured image to obtain the target captured image. It should be understood that the perspective transformation matrix reflects how pixel points of the original image map onto the captured image.
As an example, the original image corresponding to the target projection picture may be a pure-color image. The coordinate information of each vertex of the target projection picture is then determined in the second captured image, and the perspective transformation matrix is constructed from these vertex coordinates and the coordinates of the corresponding corner points of the original image.
As another example, the original image corresponding to the target projection picture may be a feature image containing feature points, such as a checkerboard image or a feature-point array image. In that case, the coordinate information of each target feature point of the target projection picture is determined in the second captured image, and the perspective transformation matrix is constructed from these coordinates and the coordinates of the corresponding feature points on the original image.
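For the pure-color case, a possible construction of the homography with OpenCV is sketched below; the vertex coordinates and the 1920x1080 resolution are hypothetical values used only for illustration.

```python
import cv2
import numpy as np

# Hypothetical vertices of the target projection picture detected in
# the second captured image: top-left, top-right, bottom-right,
# bottom-left.
src = np.float32([[412, 288], [1530, 301], [1518, 934], [405, 921]])
# Corner points of the corresponding 1920x1080 original image.
dst = np.float32([[0, 0], [1919, 0], [1919, 1079], [0, 1079]])

# Perspective transformation (homography) matrix mapping
# captured-image coordinates to original-image coordinates.
H = cv2.getPerspectiveTransform(src, dst)

# Warping the second captured image with H yields an image in which
# the projection picture is axis-aligned, i.e. a target captured image.
second_shot = cv2.imread("second_captured_image.png")
target_shot = cv2.warpPerspective(second_shot, H, (1920, 1080))
```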
It should be noted that, when the shooting device is a camera disposed on the projection device, the captured image obtained by the camera needs to be corrected according to the relative positional relationship between the camera and the projection device, so that the corrected captured image is equivalent to a target captured image obtained under the target shooting condition.
It should be understood that, in both examples above, the four vertices of the target projection picture in the resulting target captured image satisfy the following positional relationship: the upper-left and upper-right vertices share the same Y coordinate, the upper-left and lower-left vertices share the same X coordinate, the upper-right and lower-right vertices share the same X coordinate, and the lower-left and lower-right vertices share the same Y coordinate. This positional relationship is not changed by discontinuities in the curvature of the lens of the projection device.
In step 113, boundary curvature information of the first boundary line of the target projection screen is determined from the target captured image.
Here, the boundary curvature information of the first boundary line of the target projection picture is determined from the pixels of the projection picture that lie on its boundary in the target captured image.
Fig. 8 is a detailed flowchart of step 113 shown in fig. 6. As shown in fig. 8, in some possible embodiments, step 113 of determining the boundary curvature information of the first boundary line of the target projection picture from the target captured image may include the following steps.
In step 1131, the pixel coordinates of the pixel points lying on a second boundary line of the target projection picture are determined in the target captured image.
Here, the second boundary line is a boundary line of the target projection picture within the target captured image. It may refer to any of the four boundary lines of the target projection picture in the target captured image, or specifically to a boundary line that is curved, as the actual situation requires.
The pixel coordinates of the pixel points on the second boundary line are the coordinates of all those pixel points in a reference coordinate system constructed from any point of the target captured image.
In some embodiments, the pixel coordinates of the pixel points lying on the second boundary line may be determined by an image recognition model, which may be obtained by training a machine learning model on historical captured images annotated with the positions of the corresponding quadrilateral vertices.
In other embodiments, the pixel coordinates of all the pixel points on the second boundary line may be determined in the target captured image from the difference between the pixel gray-scale values of the target projection picture and those of the other image areas.
It should be appreciated that, since the target projection picture is brighter than the ambient light, the gray-scale values of the pixels on its boundary differ from those of the projection area, and the set of boundary pixel points can be determined from this difference.
It should be noted that the method for determining the pixel coordinates of the pixel points on the second boundary line is not limited to those of the foregoing embodiments. For example, the pixel coordinates of all pixel points of the target projection picture on the second boundary line may be determined in the target captured image by an edge detection algorithm: an edge-extraction convolution kernel is constructed, the target captured image is convolved with this kernel to obtain an edge-extracted image, and the pixel coordinates of all the pixel points on the second boundary line are then located in that image.
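A minimal sketch of such an edge detection pass follows, using a Laplacian-style kernel; the kernel values and the response threshold are illustrative choices rather than values prescribed by the disclosure.

```python
import cv2
import numpy as np

gray = cv2.cvtColor(cv2.imread("target_captured_image.png"),
                    cv2.COLOR_BGR2GRAY)

# One possible edge-extraction convolution kernel (Laplacian-like):
# it responds strongly where brightness changes abruptly.
kernel = np.array([[-1, -1, -1],
                   [-1,  8, -1],
                   [-1, -1, -1]], dtype=np.float32)
edges = cv2.filter2D(gray, cv2.CV_32F, kernel)

# Pixels with a strong response are candidate boundary points.
ys, xs = np.where(np.abs(edges) > 60)
boundary_pixels = np.stack([xs, ys], axis=1)   # (x, y) coordinates
```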
In step 1132, for the second boundary line, a target point on the line is taken as a reference point, and the pixel difference between each pixel point on the line and the reference point is determined from the pixel coordinates of each pixel point, where the pixel difference is the difference between the pixel point and the reference point in the direction perpendicular to the vertical reference line formed by the two vertices of the second boundary line.
Here, for each second boundary line, a target point on that line serves as the reference point. The target point may be one of the two vertices of the second boundary line, or another pixel point lying on the vertical reference line formed by those two vertices.
In some embodiments, after the pixel coordinates of all the pixel points on the second boundary line have been determined, the set of those pixel points is formed; the two first pixel points with the greatest mutual distance are found in the set, the two second pixel points with the greatest perpendicular distance from the line segment formed by the first pixel points are found, and the positions of the first and second pixel points in the target captured image are taken as the pixel coordinates of the vertices of the target projection picture.
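A sketch of this vertex search is given below; the brute-force distance matrix and the signed-distance test for finding one extreme point on each side of the diagonal are illustrative choices.

```python
import numpy as np

def find_vertices(points: np.ndarray) -> np.ndarray:
    """points: (N, 2) array of boundary pixel coordinates. Returns the
    two farthest-apart points plus, on each side of the line joining
    them, the point with the greatest perpendicular distance."""
    # First pixel points: greatest mutual distance (brute force).
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    i, j = np.unravel_index(np.argmax(d), d.shape)
    p1, p2 = points[i], points[j]

    # Signed perpendicular distance of every point to the line p1-p2.
    line = (p2 - p1).astype(np.float64)
    rel = points - p1
    signed = (rel[:, 0] * line[1] - rel[:, 1] * line[0]) / np.linalg.norm(line)

    # Second pixel points: the extremes on either side of the line.
    p3, p4 = points[np.argmax(signed)], points[np.argmin(signed)]
    return np.array([p1, p2, p3, p4])
```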
It should be understood that the four vertices of the target projection picture in the target captured image satisfy the positional relationship described above: the upper-left and upper-right vertices share the same Y coordinate, the upper-left and lower-left vertices share the same X coordinate, the upper-right and lower-right vertices share the same X coordinate, and the lower-left and lower-right vertices share the same Y coordinate. Therefore, for each pixel point on the second boundary line, its difference from the reference point in the direction perpendicular to the vertical reference line formed by the two vertices of that boundary line can be computed, giving the pixel difference between the pixel point and the reference point.
It should be noted that the pixel difference is a signed value, the sign being determined by the position of the pixel point on the second boundary line.
Fig. 9 is a schematic diagram of pixel differences according to an exemplary embodiment. As shown in fig. 9, the target captured image contains 9 pixel points belonging to the target projection picture (the small white squares in fig. 9), of which pixel points A, B, C, D, and E lie on the right boundary. Taking pixel point A as the reference point, the pixel differences between the X coordinates of pixel points A, B, C, D, and E and the X coordinate of the reference point are computed: the pixel difference of pixel point A is "0", of pixel point B is "1", of pixel point C is "2", of pixel point D is "1", and of pixel point E is "0".
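In code, the computation reduces to subtracting the reference point's X coordinate. The sketch below uses hypothetical coordinates for points A through E of fig. 9.

```python
import numpy as np

def pixel_differences(boundary: np.ndarray, ref_index: int = 0) -> np.ndarray:
    """boundary: (N, 2) array of (x, y) coordinates on one second
    boundary line. For a near-vertical boundary the vertical reference
    line is x = const, so the pixel difference is the signed X offset
    from the reference point."""
    return boundary[:, 0] - boundary[ref_index, 0]

# Hypothetical coordinates for points A..E of fig. 9.
pts = np.array([[10, 0], [11, 1], [12, 2], [11, 3], [10, 4]])
print(pixel_differences(pts))   # [0 1 2 1 0]
```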
In step 1133, the pixel differences between the pixel points contained in the second boundary line and the reference point are determined as the boundary curvature information of the corresponding first boundary line.
Here, the pixel differences computed on the second boundary line are essentially the pixel differences of the corresponding pixel points of the first boundary line as they appear in the target captured image; they can therefore be taken directly as the boundary curvature information of the corresponding first boundary line. For example, the pixel differences above (0 for pixel point A, 1 for B, 2 for C, 1 for D, and 0 for E) may be determined as the boundary curvature information of the corresponding first boundary line.
In this way, the boundary curvature information of the corresponding first boundary line can be obtained accurately and quickly by computing, in the target captured image, the pixel differences between the pixel points contained in the second boundary line and the reference point.
Two ways of constructing the target image were set forth in the above description of step 120. The following embodiments describe in detail how the target image is constructed from a corrected image; constructing the target image directly from the original projection image follows the same principle and is not described again.
FIG. 10 is a flow diagram of constructing a target image, according to an example embodiment. As shown in fig. 10, the target image may be constructed by the following steps.
In step 210, the first target pixel points to be occluded are determined on the target captured image according to the pixel differences between the pixel points contained in the second boundary line and the reference point.
Here, when the boundary curvature information of the first boundary line consists of the pixel differences between the pixel points of the corresponding second boundary line and the reference point, the first target pixel points to be occluded can be determined on the target captured image from those pixel differences.
The first target pixel points are the pixel points that make the second boundary line uneven; they too can be represented by the pixel differences between the pixel points of the second boundary line and the reference point.
Fig. 11 is a schematic diagram of constructing a target image according to an exemplary embodiment. As shown in fig. 11, sub-diagram (a) is a target captured image, sub-diagram (b) is a corrected image in the modulation plane, and sub-diagram (c) is a corrected projection picture. In sub-diagram (a), the boundary curvature information is: pixel difference "0" for pixel point A, "1" for pixel point B, "2" for pixel point C, "1" for pixel point D, and "0" for pixel point E.
From this boundary curvature information it can be determined that, counting pixel rows from top to bottom in the target captured image: the first row contains no pixels to be occluded; the second row contains 1 pixel to be occluded, namely pixel point B; the third row contains 2 pixels to be occluded, namely pixel point C and pixel point F; the fourth row contains 1 pixel to be occluded, namely pixel point D; and the fifth row contains no pixels to be occluded.
It should be understood that, for each second boundary line, the pixels to be occluded in each row are determined by the pixel difference of that row. For example, in the third row the pixel difference of pixel point C is "2", so two pixel points, C and F, are to be occluded.
It is worth mentioning that the modulation plane is the plane in which the light modulator (chip) of the projection device generates the image. The corresponding chip may be a reflective image modulation chip, such as a DMD (Digital Micromirror Device) chip or an LCOS (Liquid Crystal on Silicon) chip, or a transmissive image modulation chip, such as an LCD (Liquid Crystal Display) chip.
In some embodiments, the pixel coordinates of the standard pixel point, i.e. the pixel point with the smallest pixel difference, may be determined from the pixel differences between the pixel points of the second boundary line and the reference point, and the first target pixel points to be occluded may then be determined on the target captured image from the coordinates of the standard pixel point.
Here, the standard pixel point is the pixel point with the smallest pixel difference on the second boundary line; if several pixel points share the smallest pixel difference, one of them is chosen as the standard pixel point.
As shown in fig. 11, the boundary curvature information identifies pixel point A or pixel point E as the standard pixel point. Taking pixel point A as the standard pixel point, a target vertical line is established through it, parallel to the vertical reference line of the second boundary line; in fig. 11 this is the line segment formed by pixel points A and E. All pixel points lying outside the target vertical line, the outer side being the side nearer the second boundary line, are then taken as the first target pixel points to be occluded. In fig. 11 these are pixel points B, C, F, and D.
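The following sketch condenses this selection; it assumes, as in fig. 11, a right-hand boundary whose standard pixel point is the boundary point with the smallest X coordinate.

```python
import numpy as np

def first_target_pixels(boundary: np.ndarray) -> np.ndarray:
    """boundary: (N, 2) array of (x, y) coordinates of a right-hand
    second boundary line. Returns every pixel lying outside (to the
    right of) the target vertical line through the standard pixel
    point, i.e. the boundary point with the smallest X coordinate."""
    standard_x = boundary[:, 0].min()        # target vertical line x
    targets = []
    for x, y in boundary:
        for col in range(standard_x + 1, x + 1):
            targets.append((col, y))          # pixels to occlude
    return np.array(targets)

pts = np.array([[10, 0], [11, 1], [12, 2], [11, 3], [10, 4]])
print(first_target_pixels(pts))  # B, then C and F, then D (fig. 11)
```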
In step 220, the second target pixel points corresponding to the first target pixel points are determined on the modulation plane according to the resolution ratio between the first resolution of the modulation plane of the projection device and the second resolution of the target captured image.
Here, after the first target pixel points have been determined in the target captured image, they must be converted to the modulation plane of the projection device to determine the corresponding second target pixel points.
The number of pixel points on each second boundary line depends on the second resolution of the shooting device: the higher that resolution, the more pixel points lie on each second boundary line of the captured image. The first resolution of the modulation plane, however, is fixed, typically 1080P, so each boundary line of the original image corresponding to the target projection picture has 1080 pixel points. The first target pixel points in the target captured image must therefore be converted to the modulation plane according to the resolution ratio. For example, when the second resolution of the shooting device is 2160P and the first resolution of the modulation plane is 1080P, the resolution ratio is 1:2, and the pixel differences of the rows containing first target pixel points are converted into pixel differences for the corresponding rows of the modulation plane. If the first target pixel points are represented by pixel differences of "0" for pixel point A, "2" for pixel point B, "4" for pixel point C, "2" for pixel point D, and "0" for pixel point E, multiplying these by the resolution ratio gives pixel differences of "0", "1", "2", "1", and "0" in the corresponding rows of the modulation plane, from which the second target pixel points can be determined.
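A sketch of this conversion with the figures from the example follows; scaling of the row indices, which a full implementation would also need, is omitted for brevity.

```python
def scale_pixel_differences(diffs, camera_res=2160, modulation_res=1080):
    """Scale pixel differences measured in the captured image to the
    modulation plane using the resolution ratio (here 1080/2160)."""
    ratio = modulation_res / camera_res
    return [round(d * ratio) for d in diffs]

# Differences 0,2,4,2,0 at 2160P become 0,1,2,1,0 at 1080P.
print(scale_pixel_differences([0, 2, 4, 2, 0]))
```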
Sub-diagram (b) of fig. 11 shows the corrected image when the resolution ratio is 1:1; when the second resolution of the shooting device equals the first resolution of the modulation plane, the pixel points of the target captured image correspond one-to-one to the pixel points of the modulation plane. Pixel points B, C, F, and D of sub-diagram (a) then correspond one-to-one to the black pixel points of sub-diagram (b), which are the second target pixel points.
In step 230, a corrected image is constructed from the second target pixel points, where opaque pixels are set at the second target pixel points of the corrected image.
Here, after the second target pixel points have been determined on the modulation plane, the corrected image can be constructed from them. Specifically, an image is created with the size of the modulation plane, the second target pixel points are set to opaque pixels, and all other pixel points are set to a transparent state, yielding the corrected image shown in sub-diagram (b) of fig. 11 (white squares are transparent pixels, black squares are opaque pixels).
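A sketch of this construction, again assuming an RGBA representation:

```python
import numpy as np

def build_corrected_image(height, width, second_targets) -> np.ndarray:
    """Create an RGBA image of the modulation-plane size that is fully
    transparent except at the second target pixel points, which are
    set to opaque black."""
    corrected = np.zeros((height, width, 4), dtype=np.uint8)  # alpha 0
    for x, y in second_targets:
        corrected[y, x] = (0, 0, 0, 255)
    return corrected
```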
In step 240, the corrected image is superimposed onto the projection image to obtain the target image.
Here, once the corrected image has been constructed, it is superimposed onto the projection image to obtain the target image. The corrected image is effectively a mask layer above the projection image: during projection, the opaque pixels of the corrected image occlude the pixels of the projection image that are affected by the lens of the projection device, so those pixels are not displayed and the boundary of the projection picture produced by the target image appears as a straight line to the user. The effect is shown in sub-diagram (c) of fig. 11, where the white pixels are the projection picture and the black pixels form the boundary, equivalent to the black border of a curtain.
Thus, from the boundary curvature information determined from the target captured image, a corresponding corrected image can be constructed, and the target image built from the corrected image and the projection image, so that the boundary of the projection picture appears as a straight line to the user. The "burrs" at the boundary of the projection picture caused by the lens can be removed without correcting the lens itself, greatly reducing the production cost of the projection device. Moreover, the corrected image is constructed once and used many times.
It should be noted that steps 210 to 240 above describe constructing the target image using the resolution ratio. In other embodiments, if the second resolution of the target captured image differs from the first resolution of the modulation plane, the target captured image may be rescaled to the first resolution before step 1131 is executed, yielding a new target captured image. Steps 1131 to 1133 are then performed on the new target captured image; the resulting boundary curvature information of the first boundary line corresponds one-to-one to the modulation plane, and the resolution ratio in step 220 can be set to 1 when determining the second target pixel points.
In some implementations, transparency gradient processing may be applied to the pixels of the target image in the edge region to obtain a new target image.
Here, protruding pixels in the target image look particularly abrupt against smooth edges, giving the projection picture a noticeable jagged appearance and degrading the viewing experience. Applying transparency gradient processing to the pixels in the edge region makes the protruding pixels less abrupt and improves the display quality of the projection picture. Transparency gradient processing means grading the transparency of the pixels on the same boundary line along the direction of that boundary line, and it may be applied only to the opaque pixels on the same boundary line.
It should be noted that different gradient parameters may be used, where the gradient parameter α = 255/n, n being a constant chosen according to actual needs; typically n is 16 or 32, which softens the protruding pixels without weakening the target image's correction of burr distortion.
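One possible reading of this gradation is sketched below: walk along the opaque pixels of one boundary column and reduce the alpha value by 255/n per step. The column-wise, in-place scheme is an assumption; the disclosure fixes only the parameter α = 255/n.

```python
import numpy as np

def fade_boundary_alpha(corrected: np.ndarray, column: int, n: int = 16):
    """Grade the transparency of the opaque pixels in one (vertical)
    boundary column of an RGBA corrected image: each successive opaque
    pixel along the column loses 255/n of alpha."""
    step = 255.0 / n
    rows = np.where(corrected[:, column, 3] > 0)[0]
    for rank, row in enumerate(rows):
        corrected[row, column, 3] = max(0, int(255 - rank * step))
```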
Fig. 12 is a schematic diagram of a target image after transparency gradient processing according to an exemplary embodiment. As shown in fig. 12, in sub-diagram (a) the pixels on the right boundary receive no transparency gradient processing, and the right boundary looks abrupt. In sub-diagram (b) the right-boundary pixels are processed with n = 8, in sub-diagram (c) with n = 16, and in sub-diagram (d) with n = 1080. It can be seen that the larger the value of n, the less obtrusive the protruding pixels of the right boundary appear.
Next, the projection apparatus proposed by the embodiments of the present disclosure will be described with reference to fig. 13.
Fig. 13 is a schematic block diagram of a projection apparatus according to an exemplary embodiment. As shown in fig. 13, an embodiment of the present disclosure provides a projection apparatus 1300, including:
a determining module 1301 configured to determine boundary curvature information of a first boundary line of a target projection screen projected to a projection area by a projection apparatus in a forward projection state;
a projection module 1302 configured to project a target image, wherein an edge region of the target image includes opaque pixels set according to the boundary curvature information, so that a boundary of a projection screen on which the target image is projected in the projection region appears as a straight line in a user's vision.
Optionally, the determining module 1301 includes:
a control unit configured to control the projection device to project a target projection screen to the projection area in a forward projection state;
an acquisition unit configured to acquire a target photographic image, wherein the target photographic image is obtained by photographing a projection area by a photographing device under a preset photographing condition;
a curvature determining unit configured to determine boundary curvature information of the first boundary line of the target projection screen based on the target captured image.
Optionally, the curvature determining unit comprises:
the pixel unit is configured to determine pixel coordinates of pixel points located on a second boundary line of the target projection picture in the target shooting image;
the distance unit is configured to take a target point on the second boundary line as a reference point for the second boundary line, and determine a pixel difference value between each pixel point and the reference point according to the pixel coordinate of each pixel point on the second boundary line, wherein the pixel difference value comprises a difference value of the pixel point and the reference point in a direction perpendicular to a vertical reference line formed by two vertexes of the second boundary line;
a boundary determining unit configured to determine a pixel difference value between a pixel point included in the second boundary line and the reference point as boundary curvature information of the corresponding first boundary line.
Optionally, the projection module 1302 includes:
a shielding unit configured to determine, according to the pixel difference values between the pixel points included in the second boundary line and the reference point, first target pixel points to be shielded in the target captured image;
a conversion unit configured to determine second target pixel points corresponding to the first target pixel points on the modulation plane according to a resolution ratio between a first resolution of the modulation plane of the projection device and a second resolution of the target captured image;
an image construction unit configured to construct a corrected image according to the second target pixel points, wherein opaque pixels are arranged at the second target pixel points of the corrected image;
a superposition unit configured to superpose the corrected image and the projected image to obtain the target image.
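Purely as an illustrative sketch of the resolution scaling and superposition performed by the conversion, image construction, and superposition units; per-axis scaling between the two resolutions, an RGB projected image, and opaque black blocking pixels are assumptions of this sketch.

import numpy as np

def build_target_image(projected_rgb, first_targets, cam_res, mod_res):
    # Map the first target pixel points from the captured image onto the
    # modulation plane via the ratio between the two resolutions.
    scale = np.array([mod_res[0] / cam_res[0], mod_res[1] / cam_res[1]])
    second_targets = np.round(first_targets * scale).astype(int)

    # Corrected image: transparent everywhere except at the second
    # target pixel points, which carry opaque pixels.
    w, h = mod_res
    opaque = np.zeros((h, w), dtype=bool)
    xs = second_targets[:, 0].clip(0, w - 1)
    ys = second_targets[:, 1].clip(0, h - 1)
    opaque[ys, xs] = True

    # Superpose: opaque pixels block the projected image, i.e. they are
    # rendered black so the projector emits no light at those positions.
    target = projected_rgb.copy()
    target[opaque] = 0
    return target

Rendering the opaque pixels black keeps the light valve dark at those positions, which is what visually trims the curved "burr" back to a straight edge.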
Optionally, the shielding unit includes:
a difference unit configured to determine, according to the pixel difference values between the pixel points included in the second boundary line and the reference point, the pixel coordinates of the standard pixel point corresponding to the minimum pixel difference value;
a pixel positioning unit configured to determine the first target pixel points to be shielded in the target captured image according to the pixel coordinates of the standard pixel point.
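The selection of the standard pixel point might look like the following sketch, which assumes a right-hand boundary bowing outward, so that the minimum pixel difference value marks the innermost point of the curve and every pixel outside the vertical line through it must be shielded; both assumptions are ours, not the disclosure's.

import numpy as np

def first_target_pixels(edge_points, diffs):
    # Standard pixel point: the boundary pixel whose pixel difference
    # value from the reference point is the minimum.
    standard_x = int(edge_points[np.argmin(diffs), 0])
    shield = []
    for x, y in edge_points:
        # Shield every pixel between the straight vertical line through
        # the standard pixel point and the curved boundary itself.
        shield.extend((col, int(y)) for col in range(standard_x + 1, int(x) + 1))
    return np.array(shield, dtype=int)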
Optionally, the obtaining unit includes:
a first shooting unit configured to take a first captured image captured by the photographing device as the target captured image when the photographing device photographs the projection area under a target photographing condition, wherein the target photographing condition is that the optical axis of the photographing device is perpendicular to the projection area and passes through the center point of the target projection picture;
a second shooting unit configured to acquire a second captured image captured by the photographing device when the photographing device does not photograph the projection area under the target photographing condition;
a transformation unit configured to obtain the target captured image from the coordinate information of each vertex of the second captured image, using a perspective transformation matrix constructed from the target projection picture in the second captured image and the original image corresponding to the target projection picture.
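One plausible realization of the transformation unit, sketched here with OpenCV; the corner ordering, the output size, and warping the whole second captured image with the same matrix are assumptions of this sketch.

import cv2
import numpy as np

def rectify_second_shot(second_shot, picture_corners, original_corners):
    # Perspective transformation matrix constructed from the four corners
    # of the target projection picture as seen in the second captured
    # image and the corresponding corners of the original image.
    m = cv2.getPerspectiveTransform(np.float32(picture_corners),
                                    np.float32(original_corners))
    # Warping the full frame carries every vertex of the second captured
    # image into the rectified view, approximating a photo taken with the
    # optical axis perpendicular to the projection area.
    h, w = second_shot.shape[:2]
    return cv2.warpPerspective(second_shot, m, (w, h))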
Optionally, the apparatus 1300 further comprises:
a gradual change module configured to perform transparency gradual change processing on the pixels of the target image in the edge region to obtain a new target image.
The detailed implementation process of each functional module of the projection apparatus 1300 has been described in detail in the embodiment of the projection method, and is not repeated herein.
Referring now to fig. 14, a schematic diagram of a projection device 600 suitable for implementing embodiments of the present disclosure is shown. The projection device in the embodiments of the present disclosure may be an independent device, or may be a module used in cooperation with other intelligent terminals. The projection device 600 shown in fig. 14 is only an example and does not limit the functions or the scope of use of the embodiments of the present disclosure.
As shown in fig. 14, projection device 600 may include a processing device (e.g., central processing unit, graphics processor, etc.) 601 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 602 or a program loaded from a storage device 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the projection device 600 are also stored. The processing device 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
Generally, the following devices may be connected to the I/O interface 605: input devices 606 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 607 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 608 including, for example, tape, hard disk, etc.; and a communication device 609. The communication means 609 may allow the projection device 600 to communicate with other devices wirelessly or by wire to exchange data. While fig. 14 illustrates a projection device 600 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program carried on a non-transitory computer readable medium, the computer program containing program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 609, or may be installed from the storage means 608, or may be installed from the ROM 602. The computer program, when executed by the processing device 601, performs the above-described functions defined in the methods of the embodiments of the present disclosure.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the projection device and the camera may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the projection device; or may be separate and not incorporated into the projection device.
The computer readable medium carries one or more programs which, when executed by the projection device, cause the projection device to: determining boundary curvature information of a first boundary line of a target projection picture projected to a projection area by the projection equipment in a forward projection state; and projecting the target image, wherein the edge region of the target image comprises opaque pixels set according to the boundary curvature information, so that the boundary of a projection picture projected by the target image in the projection region appears as a straight line in the vision of a user.
Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including but not limited to object-oriented programming languages such as Java, Smalltalk, C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules described in the embodiments of the present disclosure may be implemented by software or hardware. The name of a module does not, in some cases, constitute a limitation on the module itself.
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing description is merely an illustration of the preferred embodiments of the present disclosure and of the technical principles employed. Those skilled in the art will appreciate that the scope of the disclosure is not limited to technical solutions formed by the particular combination of the features described above, and also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the concept of the disclosure, for example, technical solutions formed by replacing the above features with (but not limited to) features with similar functions disclosed in the present disclosure.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims. With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.

Claims (10)

1. A projection method, comprising:
determining boundary curvature information of a first boundary line of a target projection picture projected to a projection area by a projection device in a forward projection state;
projecting a target image, wherein an edge region of the target image comprises opaque pixels set according to the boundary curvature information, so that a boundary of a projection picture formed by projecting the target image in the projection area appears as a straight line in the user's vision.
2. The projection method according to claim 1, wherein the determining boundary curvature information of the first boundary line of the target projection picture projected by the projection device to the projection area in the forward projection state comprises:
controlling the projection device to project the target projection picture to the projection area in the forward projection state;
acquiring a target captured image, wherein the target captured image is obtained by photographing the projection area with a photographing device under a preset photographing condition;
and determining the boundary curvature information of the first boundary line of the target projection picture according to the target captured image.
3. The projection method according to claim 2, wherein the determining boundary curvature information of the first boundary line of the target projection picture from the target captured image includes:
determining pixel coordinates of pixel points located on a second boundary line of the target projection picture in the target captured image;
for the second boundary line, taking a target point on the second boundary line as a reference point, and determining a pixel difference value between each pixel point and the reference point according to the pixel coordinates of each pixel point on the second boundary line, wherein the pixel difference value comprises the difference between the pixel point and the reference point in a direction perpendicular to a vertical reference line formed by the two vertexes of the second boundary line;
and determining the pixel difference value between the pixel point contained in the second boundary line and the reference point as the boundary curvature information of the corresponding first boundary line.
4. The projection method of claim 3, wherein the target image is obtained by:
determining first target pixel points needing to be shielded on the target captured image according to the pixel difference values between the pixel points contained in the second boundary line and the reference point;
determining second target pixel points corresponding to the first target pixel points on the modulation plane according to a resolution ratio between a first resolution of the modulation plane of the projection device and a second resolution of the target captured image;
constructing a corrected image according to the second target pixel points, wherein opaque pixels are arranged at the second target pixel points of the corrected image;
and superposing the corrected image and the projected image to obtain the target image.
5. The projection method according to claim 4, wherein the determining, according to the pixel difference values between the pixel points included in the second boundary line and the reference point, the first target pixel points needing to be shielded on the target captured image comprises:
determining the pixel coordinates of the standard pixel point corresponding to the minimum pixel difference value according to the pixel difference values between the pixel points contained in the second boundary line and the reference point;
and determining the first target pixel points needing to be shielded on the target captured image according to the pixel coordinates of the standard pixel point.
6. The projection method according to claim 2, wherein the acquiring of the target captured image comprises:
when the photographing device photographs the projection area under a target photographing condition, taking a first captured image captured by the photographing device as the target captured image, wherein the target photographing condition is that the optical axis of the photographing device is perpendicular to the projection area and passes through the center point of the target projection picture;
when the photographing device does not photograph the projection area under the target photographing condition, acquiring a second captured image captured by the photographing device;
and obtaining the target captured image from the coordinate information of each vertex of the second captured image, using a perspective transformation matrix constructed from the target projection picture in the second captured image and the original image corresponding to the target projection picture.
7. The projection method according to any one of claims 1 to 6, further comprising:
performing transparency gradual change processing on the pixels of the target image in the edge region to obtain a new target image.
8. A projection apparatus, comprising:
a determination module configured to determine boundary curvature information of a first boundary line of a target projection picture projected to a projection area by a projection device in a forward projection state;
and a projection module configured to project a target image, wherein an edge region of the target image comprises opaque pixels set according to the boundary curvature information, so that the boundary of the projection picture formed by projecting the target image in the projection area appears as a straight line in the user's vision.
9. A computer-readable storage medium, on which a computer program is stored, which program, when being executed by processing means, carries out the steps of the method according to any one of claims 1 to 7.
10. A projection device, comprising:
a storage device having a computer program stored thereon;
processing means for executing the computer program in the storage means to carry out the steps of the method according to any one of claims 1 to 7.
CN202210094719.0A 2022-01-26 2022-01-26 Projection method, projection device, storage medium and projection equipment Pending CN114401388A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210094719.0A CN114401388A (en) 2022-01-26 2022-01-26 Projection method, projection device, storage medium and projection equipment

Publications (1)

Publication Number Publication Date
CN114401388A true CN114401388A (en) 2022-04-26

Family

ID=81232216

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210094719.0A Pending CN114401388A (en) 2022-01-26 2022-01-26 Projection method, projection device, storage medium and projection equipment

Country Status (1)

Country Link
CN (1) CN114401388A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120050309A1 (en) * 2010-08-31 2012-03-01 Olympus Corporation Display control apparatus and display control method
US8624911B1 (en) * 2011-01-05 2014-01-07 Google Inc. Texture-based polygon antialiasing
CN103136723A (en) * 2011-11-29 2013-06-05 方正国际软件(北京)有限公司 Image sawtooth removing method and system
CN111738955A (en) * 2020-06-23 2020-10-02 安徽海微电光电科技有限责任公司 Distortion correction method and device for projected image and computer readable storage medium
CN112449167A (en) * 2020-11-13 2021-03-05 深圳市火乐科技发展有限公司 Image sawtooth elimination and image display method and device
CN113489961A (en) * 2021-09-08 2021-10-08 深圳市火乐科技发展有限公司 Projection correction method, projection correction device, storage medium and projection equipment

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
XIAO LUO et al.: "Image distortion correction algorithm based on lattice array coordinate projection", 2010 3rd International Conference on Advanced Computer Theory and Engineering *
YANG Fan: "Research on autonomous perception and multi-projection correction technology for estimable curved surfaces", Wanfang Dissertations (in Chinese) *
CHEN Yichao et al.: "Distortion correction of ultra-large field-of-view infrared images based on an accurate model and inverse projection", Semiconductor Optoelectronics (in Chinese) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116546175A (en) * 2023-06-01 2023-08-04 深圳创疆网络科技有限公司 Intelligent control method and device for realizing projector based on automatic induction
CN116546175B (en) * 2023-06-01 2023-10-31 深圳创疆网络科技有限公司 Intelligent control method and device for realizing projector based on automatic induction

Similar Documents

Publication Publication Date Title
CN110336987B (en) Projector distortion correction method and device and projector
CN113489961B (en) Projection correction method, projection correction device, storage medium and projection equipment
CN110191326B (en) Projection system resolution expansion method and device and projection system
JP2015060012A (en) Image processing system, image processing device, image processing method and image processing program as well as display system
EP1861748A1 (en) Method of and apparatus for automatically adjusting alignement of a projector with respect to a projection screen
JP2015097350A (en) Image processing apparatus and multi-projection system
US11146768B2 (en) Projection system and projection method
CN113286135A (en) Image correction method and apparatus
CN112272292A (en) Projection correction method, apparatus and storage medium
CN114449249B (en) Image projection method, image projection device, storage medium and projection apparatus
CN114125411A (en) Projection equipment correction method and device, storage medium and projection equipment
WO2022242306A1 (en) Laser projection system, image correction method, and laser projection device
TWI443604B (en) Image correction method and image correction apparatus
JP2019220887A (en) Image processing system, image processing method, and program
JP4751084B2 (en) Mapping function generation method and apparatus, and composite video generation method and apparatus
CN114401388A (en) Projection method, projection device, storage medium and projection equipment
CN114302121A (en) Image correction inspection method, device, electronic equipment and storage medium
US20230033956A1 (en) Estimating depth based on iris size
CN202841396U (en) Digital film optimization device and digital film projection system
CN117097872A (en) Automatic trapezoid correction system and method for projection equipment
US20220292652A1 (en) Image generation method and information processing device
CN114339179A (en) Projection correction method, projection correction device, storage medium and projection equipment
KR20220162595A (en) Electronic apparatus and control method thereof
TWI497448B (en) Method and system for image enhancement
KR20110074442A (en) Image processing apparatus, image processing method and recording medium

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination