CN114401388B - Projection method, projection device, storage medium and projection apparatus

Info

Publication number
CN114401388B
CN114401388B (application CN202210094719.0A)
Authority
CN
China
Prior art keywords
projection
target
image
pixel
boundary
Prior art date
Legal status
Active
Application number
CN202210094719.0A
Other languages
Chinese (zh)
Other versions
CN114401388A (en)
Inventor
郑炯彬
张聪
胡震宇
Current Assignee
Shenzhen Huole Science and Technology Development Co Ltd
Original Assignee
Shenzhen Huole Science and Technology Development Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Huole Science and Technology Development Co Ltd filed Critical Shenzhen Huole Science and Technology Development Co Ltd
Priority to CN202210094719.0A
Publication of CN114401388A
Application granted
Publication of CN114401388B
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/12 Picture reproducers
    • H04N9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179 Video signal processing therefor
    • H04N9/3188 Scale or resolution adjustment
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G06T7/64 Analysis of geometric attributes of convexity or concavity

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Geometry (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • Projection Apparatus (AREA)

Abstract

The disclosure relates to a projection method, a projection device, a storage medium and a projection apparatus, in the technical field of projection. The method comprises the following steps: determining boundary curvature information of a first boundary line of a target projection picture projected toward a projection area by a projection device in a forward projection state; and projecting a target image, wherein an edge region of the target image includes opaque pixels set according to the boundary curvature information, so that the boundary of the projection picture formed when the target image is projected in the projection area appears as a straight line in the user's vision. The projection method provided by the disclosure thus eliminates in software the "burrs" that the lens of the projection device causes at the boundary of the projection picture, which greatly reduces the production cost of the projection device while keeping distortion correction fast and efficient.

Description

Projection method, projection device, storage medium and projection apparatus
Technical Field
The disclosure relates to the technical field of projection, and in particular relates to a projection method, a projection device, a storage medium and projection equipment.
Background
During projection, defects in the lens, the image quality, or the modulation plane of the projection device distort the edges of the presented projection picture. Such distortion usually has to be corrected manually, which drives up the production cost of the projection device.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
In a first aspect, the present disclosure provides a projection method, comprising:
determining boundary curvature information of a first boundary line of a target projection picture projected to a projection area by a projection device in a front projection state;
and projecting the target image, wherein an edge region of the target image includes opaque pixels set according to the boundary curvature information, so that the boundary of the projection picture formed when the target image is projected in the projection area appears as a straight line in the user's vision.
In a second aspect, the present disclosure provides a projection apparatus comprising:
a determining module configured to determine boundary curvature information of a first boundary line of a target projection screen projected by the projection apparatus toward the projection area in a forward projection state;
and a projection module configured to project the target image, wherein an edge region of the target image includes opaque pixels set according to the boundary curvature information, so that the boundary of the projection picture formed when the target image is projected in the projection area appears as a straight line in the user's vision.
In a third aspect, the present disclosure provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processing device, performs the steps of the method of the first aspect.
In a fourth aspect, the present disclosure provides a projection apparatus comprising:
a storage device having a computer program stored thereon; and
a processing device for executing the computer program in the storage device to carry out the steps of the method of the first aspect.
Based on the above technical solution, boundary curvature information of a first boundary line of a target projection picture projected by the projection device toward the projection area in a forward projection state is determined, and a target image is constructed from that information, its edge region including opaque pixels set according to the boundary curvature information; the target image is then projected, so that the boundary of the projection picture formed in the projection area appears as a straight line in the user's vision. This projection method eliminates in software the problem of "burrs" at the boundary of the projection picture caused by the lens of the projection device, which greatly reduces the production cost of the projection device while keeping distortion correction fast and efficient.
Additional features and advantages of the present disclosure will be set forth in the detailed description which follows.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. The same or similar reference numbers will be used throughout the drawings to refer to the same or like elements. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale. In the drawings:
fig. 1 is a flow chart of a projection method according to an exemplary embodiment.
Fig. 2 is a schematic diagram of a projected screen in a forward projection state according to an exemplary embodiment.
Fig. 3 is a schematic illustration of a first boundary line proposed according to an exemplary embodiment.
Fig. 4 is a schematic diagram of a proposed target image according to an exemplary embodiment.
Fig. 5 is a schematic diagram of a proposed target image according to another exemplary embodiment.
Fig. 6 is a specific flowchart of step 110 shown in fig. 1.
Fig. 7 is a schematic diagram of different projected picture distortions proposed in accordance with an exemplary embodiment.
Fig. 8 is a specific flowchart of step 113 shown in fig. 6.
Fig. 9 is a schematic diagram of pixel difference values according to an exemplary embodiment.
Fig. 10 is a flowchart of constructing a target image according to an exemplary embodiment.
Fig. 11 is a schematic diagram of constructing a target image according to an exemplary embodiment.
Fig. 12 is a schematic diagram of a target image after gradient processing according to an exemplary embodiment.
Fig. 13 is a schematic diagram of module connection of a projection apparatus according to an exemplary embodiment.
Fig. 14 is a schematic structural view of a projection apparatus according to an exemplary embodiment.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the accompanying drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that the present disclosure will be understood more thoroughly and completely. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order and/or performed in parallel. Furthermore, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "including" and variations thereof as used herein are intended to be open-ended, i.e., including, but not limited to. The term "based on" is based at least in part on. The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments. Related definitions of other terms will be given in the description below.
It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that references to "a" and "an" in this disclosure are illustrative rather than limiting, and those of ordinary skill in the art will appreciate that they should be understood as "one or more" unless the context clearly indicates otherwise.
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
Fig. 1 is a flow chart of a projection method according to an exemplary embodiment. The projection method proposed by the present disclosure may be performed by a projection device, and specifically by a projection apparatus that may be implemented in software and/or hardware and configured in the projection device. As shown in fig. 1, the projection method may include the following steps.
In step 110, boundary curvature information of a first boundary line of a target projection screen projected by a projection device in a forward projection state toward a projection region is determined.
Here, the projection area refers to an area that carries the projection picture; it may be a wall surface, a curtain, or the like. Projecting toward the projection area in the forward projection state means that, with the projection area perpendicular to the ground, the projection device is placed horizontally and projects with its optical axis perpendicular to the projection area. Fig. 2 is a schematic diagram of a projected screen in a forward projection state according to an exemplary embodiment. As shown in fig. 2, if the projection apparatus 20 does not introduce any distortion, the projection screen 22 projected by the projection apparatus 20 toward the projection area 21 in the forward projection state appears as a standard rectangle.
The boundary curvature information may be the position information of each pixel point on a first boundary line of the target projection screen, and it reflects the degree of curvature of that line. It should be understood that in the embodiments of the present disclosure the first boundary line may refer to all four boundary lines of the target projection screen, or only to a boundary line that exhibits curvature, as the actual situation requires. Fig. 3 is a schematic illustration of a first boundary line proposed according to an exemplary embodiment. As shown in fig. 3, the target projection screen 30 includes boundary lines AB, BD, CD and AC; if boundary line BD is the one that exhibits curvature, the boundary curvature information of the first boundary line is the boundary curvature information of boundary line BD.
It should be understood that the boundary curvature information of the first boundary line obtained in the forward projection state can be regarded as image distortion of the target projection screen caused by curvature discontinuities of the lens of the projection device. With such a lens, the edges of the target projection screen develop uneven "burrs", so that the projection picture no longer appears as a standard rectangle in the user's vision.
In some embodiments, the boundary curvature information of the first boundary line may be acquired by the projection device. For example, before the main projection is started, the projection apparatus projects a target projection screen onto the projection area in the forward projection state, and determines boundary curvature information of the first boundary line from the target projection screen.
In other embodiments, the projection device may receive the boundary curvature information of the first boundary line from an image calibration device. For example, the manufacturer may store the boundary curvature information of the first boundary line in the projection device before it leaves the factory; this information is determined by the manufacturer by having the image calibration device capture the target projection picture projected onto the projection area in the forward projection state and deriving the information from that picture. During projection, the projection device then performs projection correction directly from the stored boundary curvature information, and no additional operation is needed.
In step 120, a target image is projected, wherein an edge region of the target image includes opaque pixels set according to the boundary curvature information, so that the boundary of the projection picture formed when the target image is projected in the projection area appears as a straight line in the user's vision.
Here, the edge region of the target image includes opaque pixels set according to the boundary curvature information of the first boundary line. The opaque pixels occlude part of the pixels of the target image, so that the occluded pixels are not displayed in the projection picture; they complement the "burrs" produced by the lens of the projection device, so that the boundary of the projection picture can appear as a straight line in the user's vision.
It should be understood that the boundary of the projection picture appearing as a straight line in the user's vision means that it looks straight to the human eye; in a strict mathematical sense, the pixels on the boundary of the projection picture do not necessarily form a straight line.
As an example, the target image may be obtained by setting opaque pixels in the edge region of the projection image according to the boundary curvature information of the first boundary line. The projection image is the original image that the user wants to project, such as a picture or a video frame. During projection, for each projection image, opaque pixels may be set in its edge region according to the boundary curvature information of the first boundary line to obtain the target image.
Fig. 4 is a schematic diagram of a proposed target image according to an exemplary embodiment. As shown in fig. 4, sub-image (a) in fig. 4 includes a projection image 40, sub-image (b) includes a first projection screen 41, sub-image (c) includes a first target image 42, and sub-image (d) includes a second projection screen 43. Wherein each small square in fig. 4 represents a pixel point.
The first projection screen 41 formed when the projection image 40 is projected in the projection area is shown in sub-image (b); the first pixel 411, second pixel 412, third pixel 413 and fourth pixel 414 on its right boundary protrude, so that the right boundary of the first projection screen 41 no longer appears as a straight line. According to the boundary curvature information of the right boundary of the first projection screen 41, opaque pixels are set on the right boundary of the projection image 40 to obtain the first target image 42, in which the fifth pixel 421, sixth pixel 422, seventh pixel 423 and eighth pixel 424 on the right boundary are set to an opaque value such as black. The projection effect of the first target image 42 on the projection area is shown in the second projection screen 43, whose right boundary appears as a straight line.
It will be appreciated that the opaque pixels of the target image display as black on the projection area, so the resulting projection screen is in fact a projection screen with a black edge. Since the black edge is very thin, the distortion-correction effect is clearly visible while the user's viewing is not affected.
As another example, the target image may be obtained by superimposing the projection image and the correction image. Wherein the edge region of the corrected image includes opaque pixels set according to the boundary curvature information of the first boundary line and other pixel regions of the corrected image are in a transparent state.
It should be appreciated that image properties of the corrected image, such as its size and resolution, may be consistent with those of the projection image. The corrected image in effect acts as a mask layer over the projection image. Because the target image is obtained through the corrected image, the pixels of each frame of projection image need not be processed individually during projection, which speeds up distortion correction and saves computing resources.
Fig. 5 is a schematic diagram of a proposed target image according to another exemplary embodiment. As shown in fig. 5, the sub-image (e) in fig. 5 includes a corrected image 50, the sub-image (f) includes a second target image 51, and the sub-image (g) includes a third projection screen 52. Wherein each small square in fig. 5 represents a pixel point.
The image properties of the corrected image 50 in sub-image (e) in fig. 5 are identical to those of the projected image 40 in fig. 4. According to the boundary curvature information of the right boundary of the first projection screen 41 in fig. 4, opaque pixels are disposed on the right boundary of the corrected image 50. Wherein pixel values of a ninth pixel 501, a tenth pixel 502, an eleventh pixel 503, and a twelfth pixel 504 in the right boundary of the corrected image 50 are set to be opaque, and other pixel points are set to be transparent. By superimposing the corrected image 50 with the projected image 40 shown in fig. 4, a second target image 51 shown in the sub-image (f) in fig. 5 is obtained. The projection screen of the second target image 51 projected on the projection area is a third projection screen 52 shown in sub-image (g) in fig. 5. The display effect of the third projection screen 52 is identical to the display effect of the second projection screen 43 in fig. 4.
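To make the mask-and-composite approach concrete, the following is a minimal sketch, not part of the original disclosure, assuming the Pillow imaging library; the file names and occluded positions are hypothetical stand-ins for values derived from the boundary curvature information.

```python
from PIL import Image

# Hypothetical inputs: a frame to be projected and the positions to occlude,
# derived from the boundary curvature information (names and values invented).
projection = Image.open("frame.png").convert("RGBA")
blocked_pixels = [(1919, 539), (1919, 540), (1918, 541)]

# Corrected image: fully transparent except for opaque black pixels at the
# positions that must be occluded.
correction = Image.new("RGBA", projection.size, (0, 0, 0, 0))
for x, y in blocked_pixels:
    correction.putpixel((x, y), (0, 0, 0, 255))

# Superimposing the corrected image on the projection image gives the target image;
# the same correction layer can be reused for every frame.
target = Image.alpha_composite(projection, correction)
target.convert("RGB").save("target.png")
```

Because the correction layer depends only on the lens, it can be built once and composited over every frame, which is the efficiency benefit described above.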
It should be noted that, in the above example, only one first boundary line of the target projection frame is taken as an example to illustrate the principle of lens distortion correction, and the principle of lens distortion correction is consistent with other first boundary lines of the target projection frame, which is not described herein. In addition, in the above example, in order to facilitate understanding of the principle of the projection method proposed by the embodiment of the present disclosure, the pixels of the projection screen in fig. 4 and fig. 5 are set to correspond to the pixels of the projection image one by one, but in the actual projection process, the pixels affected by the lens distortion do not necessarily correspond to the pixels of the corresponding projection image, and the principle of lens distortion correction is consistent.
Thus, boundary curvature information of a first boundary line of the target projection screen projected by the projection device toward the projection area in the forward projection state is determined, and a target image is constructed from that information, with opaque pixels set in its edge region; the target image is then projected, so that the boundary of the projection screen formed in the projection area appears as a straight line in the user's vision. This projection method eliminates in software the "burrs" that the lens of the projection device causes at the boundary of the projection picture, which greatly reduces the production cost of the projection device while keeping distortion correction fast and efficient.
Fig. 6 is a specific flowchart of step 110 shown in fig. 1. As shown in fig. 6, determining the boundary curvature information of the first boundary line of the target projection screen in step 110 may include the following steps.
In step 111, the projection apparatus is controlled to project a target projection screen onto a projection area in a forward projection state.
Here, the specific principle of the projection device projecting the target projection screen onto the target projection area in the orthographic projection state is described in detail in the above embodiment, and will not be described herein.
In some cases the projection picture may exhibit barrel or pincushion distortion, because the lens is polished closer to a spherical surface than to a free-form surface. Fig. 7 is a schematic diagram of different projected picture distortions proposed in accordance with an exemplary embodiment. As shown in fig. 7, "barrel/pincushion distortion" mainly appears as a large deformation of the entire picture, whereas "burr distortion" mainly appears as a number of pixels on a boundary line with curvature discontinuities. Barrel/pincushion distortion whose degree of deformation is at most a preset deformation threshold can be corrected with the projection method provided by the embodiments of the present disclosure; a projection picture whose degree of deformation exceeds that threshold can be corrected by calibrating the lens, a known prior-art method that is not detailed here.
It should be understood that, in the present disclosure, in order to prevent "barrel/pincushion distortion" whose degree of deformation exceeds the preset deformation threshold from affecting the correction of "burr distortion", calibration parameters may be obtained by calibrating the lens of the projection device before the target projection screen is projected, and the target projection screen may then be projected according to those calibration parameters. The target projection screen therefore contains no "barrel/pincushion distortion" whose degree of deformation exceeds the preset deformation threshold.
In step 112, a target photographed image is acquired, wherein the target photographed image is obtained by photographing the projection region by the photographing device under preset photographing conditions.
Here, the target captured image is the image obtained when the photographing device photographs the projection area under preset shooting conditions while the projection device projects the target projection screen onto the projection area in the forward projection state. The preset shooting conditions ensure that the captured image retains only the edge defects of the projection picture, which may be caused by the lens, the image quality, or the modulation plane of the projection device.
It should be understood that the camera may be a camera provided on the projection device, or the camera may be an external camera.
When the photographing device is an external camera that photographs the projection area under the target shooting condition, the first captured image taken by the photographing device is used as the target captured image, where the target shooting condition is that the optical axis of the photographing device is perpendicular to the projection area and passes through the center point of the target projection screen.
For example, when the target captured image is taken by an external photographing device, the projection area may be a semi-transparent plate onto which the projection device projects in the forward projection state. When the photographing device is set up, its optical axis is aligned with the optical axis of the projection device by observing both axes through the semi-transparent plate.
It should be understood that since the view angle coverage of the photographing device is small and there is substantially no distortion of the lens of the photographing device, the curvature variation of the boundary line of the projection screen in the target photographed image photographed by the photographing device in the target photographing condition can be understood to be caused by the lens of the projection apparatus.
In some possible embodiments, when the photographing device cannot photograph the projection area under the target shooting condition, a second captured image taken by the photographing device is acquired, and the target captured image is obtained from a perspective transformation matrix, constructed between the target projection screen in the second captured image and the original image corresponding to the target projection screen, combined with the coordinate information of each vertex of the second captured image.
In practice, limitations of the shooting space may prevent the photographing device from photographing the projection area under the target shooting condition. In that case, a pre-established perspective transformation matrix (also referred to as a homography matrix) can be applied to the coordinate information of each vertex of the second captured image to obtain the target captured image. The perspective transformation matrix reflects how the pixel points of the original image map to positions on the captured image.
As an example, the original image corresponding to the target projection picture may be a solid color image, and then coordinate information of each vertex of the target projection picture may be determined in the second captured image, and a perspective transformation matrix may be constructed according to the coordinate information of each vertex of the target projection picture and the coordinate information of each corner of the corresponding original image.
As another example, the original image corresponding to the target projection screen may be a feature image with feature points, such as a checkerboard image, a feature point array image, or the like. At this time, the coordinate information of each target feature point in the target projection screen may be determined in the second captured image, and a perspective transformation matrix may be constructed according to the coordinate information of each target feature point in the target projection screen and the coordinate information of the target feature point on the original image.
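As an illustrative sketch only of the vertex-based variant, assuming OpenCV's `getPerspectiveTransform` and `warpPerspective` (the coordinates and file names are hypothetical), the matrix can be built from four vertex correspondences and used to rectify the second captured image:

```python
import cv2
import numpy as np

# Four vertices of the target projection screen located in the second captured
# image, ordered top-left, top-right, bottom-right, bottom-left (values made up).
captured_vertices = np.float32([[112, 80], [1492, 86], [1500, 948], [104, 940]])
# Corresponding corners of the original image (assumed 1920x1080 here).
original_corners = np.float32([[0, 0], [1919, 0], [1919, 1079], [0, 1079]])

# Perspective transformation matrix mapping captured coordinates to the view an
# ideally placed camera (target shooting condition) would have seen.
H = cv2.getPerspectiveTransform(captured_vertices, original_corners)

# Warping the second captured image with H yields an equivalent target captured image.
second_capture = cv2.imread("second_capture.png")
target_capture = cv2.warpPerspective(second_capture, H, (1920, 1080))
```

The feature-point variant works the same way, only with more correspondences (for example via `cv2.findHomography`) instead of exactly four vertices.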
It should be noted that when the photographing device is a camera disposed on the projection device, the image captured by the camera needs to be corrected according to the relative positional relationship between the camera and the optical engine of the projection device, so that the corrected image corresponds to a target captured image obtained under the target shooting condition.
It should be understood that in the above two examples, the four vertices of the target projection screen in the obtained target captured image satisfy the following positional relationship: the coordinate values of the upper left vertex and the upper right vertex are consistent with each other on the Y axis, the coordinate values of the upper left vertex and the lower left vertex are consistent with each other on the X axis, the coordinate values of the upper right vertex and the lower right vertex are consistent with each other on the X axis, and the coordinate values of the lower left vertex and the lower right vertex are consistent with each other on the Y axis. The positional relationship does not change due to discontinuous curvature change of the lens of the projection device.
In step 113, boundary curvature information of the first boundary line of the target projection screen is determined from the target captured image.
Here, boundary curvature information of the first boundary line of the target projection screen is determined from pixels located on the boundary of the projection screen in the target captured image.
Fig. 8 is a specific flowchart of step 113 shown in fig. 6. As shown in fig. 8, in some possible embodiments, in step 113, determining boundary curvature information of the first boundary line of the target projection screen according to the target captured image may include the following steps.
In step 1131, pixel coordinates of pixel points located on the second boundary line of the target projection screen are determined in the target captured image.
Here, the second boundary line refers to a boundary line of the target projection screen in the target captured image. The second boundary line may be four boundary lines of the target projection screen in the target captured image, or may be a boundary line that generates a curve in the target projection screen in the target captured image, which may be set according to actual conditions.
The pixel coordinates of the pixel points on the second boundary line are the coordinates of all pixel points located on the second boundary line in a reference coordinate system constructed with any point of the target captured image as its origin.
In some embodiments, the pixel coordinates of the pixel points located on the second boundary line of the target projection screen may be determined in the target captured image by an image recognition model. The image recognition model may be obtained by training a machine learning model on historical images annotated with the vertex positions of the corresponding quadrilaterals.
In other embodiments, pixel coordinates corresponding to all pixel points located on the second boundary line may be determined in the target captured image based on a difference between the pixel gray values of the target projection screen in the target captured image and the pixel gray values of the other image areas.
It will be appreciated that since the target projection screen has a higher brightness than the ambient light, there is a difference between the gray values of the pixels of the boundary of the target projection screen and the gray values of the pixels of the projection area, from which difference the set of pixel points can be determined.
It should be noted that the method for determining the pixel coordinates of all pixel points on the second boundary line in the target captured image is not limited to those of the above embodiments; other methods may also be used. For example, the pixel coordinates of all pixel points of the target projection screen on the second boundary line may be determined in the target captured image by an edge detection algorithm: a convolution kernel is constructed, the target captured image is convolved with it to obtain an edge-extracted image, and the pixel coordinates of all pixel points on the second boundary line are then located in that image.
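A minimal sketch of one such extraction, assuming the bright projection screen can be separated from its darker surroundings by a simple threshold (OpenCV and NumPy; the threshold value and file name are assumptions, and a Canny edge map could be used instead):

```python
import cv2
import numpy as np

# Load the target captured image and separate the bright projection screen from
# the darker surroundings; the threshold value 127 is an assumption.
gray = cv2.cvtColor(cv2.imread("target_capture.png"), cv2.COLOR_BGR2GRAY)
_, mask = cv2.threshold(gray, 127, 255, cv2.THRESH_BINARY)

# Pixel coordinates of the right boundary line: the right-most bright pixel of
# each pixel row that intersects the projection screen.
right_boundary = []
for y in range(mask.shape[0]):
    xs = np.flatnonzero(mask[y])          # screen columns in this row
    if xs.size:
        right_boundary.append((int(xs[-1]), y))   # (x, y) boundary pixel
```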
In step 1132, for the second boundary line, one target point on the line is taken as a reference point, and the pixel difference between each pixel point on the second boundary line and the reference point is determined from the pixel coordinates of that pixel point, where the pixel difference is the difference between the pixel point and the reference point in the direction perpendicular to the vertical reference line formed by the two vertices of the second boundary line.
Here, for each second boundary line, one target point on that line is taken as the reference point, and the pixel difference between each pixel point on the line and the reference point is determined from its pixel coordinates. The target point may be one of the two vertices of the second boundary line, or another pixel point located on the vertical reference line formed by those two vertices.
In some embodiments, after determining the pixel coordinates of all the pixels located on the second boundary line, a set of pixels formed by all the pixels located on the second boundary line may be obtained, then two first pixels located farthest from each other are determined from the set of pixels, two second pixels located at the largest vertical distance from the line segment formed by the first pixels are determined from the set of pixels, and the positions of the first pixels and the second pixels in the target captured image are determined as the pixel coordinates of each vertex of the target projection screen.
It should be understood that the four vertices of the target projection screen in the target captured image satisfy the following positional relationship: the coordinate values of the upper left vertex and the upper right vertex are consistent with each other on the Y axis, the coordinate values of the upper left vertex and the lower left vertex are consistent with each other on the X axis, the coordinate values of the upper right vertex and the lower right vertex are consistent with each other on the X axis, and the coordinate values of the lower left vertex and the lower right vertex are consistent with each other on the Y axis. Thus, for each pixel located on the second boundary line, the difference between the pixel and the reference point in the direction perpendicular to the vertical reference line formed by the two vertices of the second boundary line can be calculated, obtaining the pixel difference between the pixel and the reference point.
It should be noted that the pixel difference carries a sign, positive or negative, determined by the side of the vertical reference line on which the pixel point lies.
Fig. 9 is a schematic diagram of pixel difference values according to an exemplary embodiment. As shown in fig. 9, in the target captured image, there are 9 pixels belonging to the target projection screen (white small squares in fig. 9). The pixel point A, the pixel point B, the pixel point C, the pixel point D and the pixel point E are all the pixel points on the right boundary. At this time, the pixel point a may be taken as a reference point. Then, pixel differences between the pixel coordinate values of the pixel point A, the pixel point B, the pixel point C, the pixel point D and the pixel point E on the X axis and the pixel coordinate values of the reference point on the X axis are calculated respectively. The pixel difference value corresponding to the pixel point a is "0", the pixel difference value corresponding to the pixel point B is "1", the pixel difference value corresponding to the pixel point C is "2", the pixel difference value corresponding to the pixel point D is "1", and the pixel difference value corresponding to the pixel point E is "0".
In step 1133, the pixel difference between the pixel point included in the second boundary line and the reference point is determined as the boundary curvature information of the corresponding first boundary line.
Here, the pixel differences computed in the target captured image between the pixel points on the second boundary line and the reference point are essentially equal to the pixel differences of the corresponding pixel points on the first boundary line; therefore, those differences can be taken directly as the boundary curvature information of the corresponding first boundary line. For example, the pixel differences "0" for pixel point A, "1" for pixel point B, "2" for pixel point C, "1" for pixel point D and "0" for pixel point E constitute the boundary curvature information of the corresponding first boundary line, from which the degree of curvature of that line can be determined.
Thus, by calculating the pixel difference value between the pixel point included in the second boundary line and the reference point in the target captured image, the boundary curvature information of the corresponding first boundary line can be accurately and rapidly obtained.
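Continuing the sketch above, with values standing in for the five pixel points of fig. 9 (all names and coordinates are illustrative), the boundary curvature information reduces to a short list of per-row differences:

```python
# A concrete stand-in for the right boundary of fig. 9, as (x, y) pixel
# coordinates of pixel points A..E (values are illustrative).
right_boundary = [(4, 0), (5, 1), (6, 2), (5, 3), (4, 4)]

# Reference point: vertex A, the first pixel point of the boundary line.
ref_x = right_boundary[0][0]

# Per-row difference along X, i.e. perpendicular to the vertical reference line
# AE; this list is the boundary curvature information of the first boundary line.
pixel_diffs = [x - ref_x for x, _ in right_boundary]   # -> [0, 1, 2, 1, 0]
```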
In the above embodiments with respect to step 120, two methods of constructing the target image are illustrated. In the following embodiment, a principle of constructing a target image from a corrected image will be described in detail. The principle of constructing the target image from the original projection image is identical to the principle of constructing the target image from the corrected image, and will not be described in detail later.
Fig. 10 is a flowchart of constructing a target image according to an exemplary embodiment. As shown in fig. 10, the target image may be constructed by the following steps.
In step 210, a first target pixel point to be blocked is determined on the target captured image according to the pixel difference between the pixel point included in the second boundary line and the reference point.
Here, when the boundary curvature information of the first boundary line is a pixel difference value between a pixel point included in the corresponding second boundary line and the reference point, the first target pixel point to be blocked may be determined on the target captured image based on the pixel difference value.
The first target pixel points to be occluded are the pixel points that make the second boundary line uneven. They may likewise be expressed through the pixel differences between the pixel points on the second boundary line and the reference point.
Fig. 11 is a schematic diagram of a construction target image according to an exemplary embodiment. As shown in fig. 11, a sub-image (a) in fig. 11 is a target captured image, a sub-image (b) is a corrected image in the modulation plane, and a sub-image (c) is a corrected projection screen. In sub-graph (a), the boundary curvature information is: the pixel difference value corresponding to the pixel point A is 0, the pixel difference value corresponding to the pixel point B is 1, the pixel difference value corresponding to the pixel point C is 2, the pixel difference value corresponding to the pixel point D is 1, and the pixel difference value corresponding to the pixel point E is 0.
From this boundary curvature information it can be determined that, counting pixel rows from top to bottom in the target captured image, the first row contains no pixel point to be occluded, the second row contains 1, the third row contains 2, the fourth row contains 1, and the fifth row contains none.
It should be understood that, for each second boundary line, the pixel points to be occluded in each pixel row are determined from the pixel difference of that row. For example, in the third pixel row the pixel difference of pixel point C is "2", so the pixel points to be occluded in that row are two: pixel point C and pixel point F.
It should be noted that the modulation plane refers to the plane in which the light modulator (chip) of the projection device generates the image. The chip corresponding to the modulation plane may be a reflective image modulation chip, such as a DMD (Digital Micromirror Device) chip or an LCOS (Liquid Crystal on Silicon) chip, or a transmissive image modulation chip, such as an LCD (Liquid Crystal Display) chip.
In some embodiments, the pixel coordinates of the standard pixel point corresponding to the minimum pixel difference may be determined according to the pixel difference between the pixel point included in the second boundary line and the reference point, and then the first target pixel to be blocked is determined on the target captured image according to the pixel coordinates of the standard pixel point.
Here, the standard pixel point is the pixel point on the second boundary line with the smallest pixel difference; if several pixel points share the smallest pixel difference, one of them is chosen as the standard pixel point.
As shown in fig. 11, from the above boundary curvature information it can be determined that the standard pixel point is pixel point A or pixel point E. Taking pixel point A as the standard pixel point, a target vertical line can be established that passes through the standard pixel point and is parallel to the vertical reference line corresponding to the second boundary line; in fig. 11, the target vertical line is the line segment formed by pixel point A and pixel point E. Once the target vertical line is determined, all pixel points lying outside it are taken as the first target pixel points to be occluded, where the outer side of the target vertical line means the side toward the second boundary line. In fig. 11, these are pixel point B, pixel point C, pixel point F and pixel point D.
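In code this step is tiny; a hypothetical continuation of the earlier sketch, with a difference list matching fig. 11:

```python
# Boundary curvature information from the example above.
pixel_diffs = [0, 1, 2, 1, 0]

# Standard pixel point: the smallest difference (pixel point A or E here); the
# target vertical line passes through it, and each row must occlude the pixel
# points lying outside that line.
min_diff = min(pixel_diffs)
to_block = [d - min_diff for d in pixel_diffs]
# to_block[row] is the number of first target pixel points in that pixel row:
# [0, 1, 2, 1, 0] matches fig. 11 (rows 2 and 4 occlude one pixel, row 3 two).
```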
In step 220, a second target pixel point corresponding to the first target pixel point is determined on the modulation plane according to a resolution ratio between a first resolution of the modulation plane of the projection device and a second resolution of the target captured image.
Here, after determining a first target pixel point to be blocked in the target captured image, the first target pixel point needs to be converted onto a modulation plane of the projection apparatus to determine a second target pixel point corresponding to the first target pixel point on the modulation plane.
The number of pixel points on each second boundary line depends on the second resolution of the photographing device: the higher the second resolution, the more pixel points on each second boundary line. The first resolution of the modulation plane of the projection device is fixed, however; it is commonly 1080P, in which case each vertical boundary line of the original image corresponding to the target projection screen contains 1080 pixel points. The first target pixel points in the target captured image must therefore be converted onto the modulation plane according to the resolution ratio to determine the second target pixel points. For example, when the second resolution of the photographing device is 2160P and the first resolution of the modulation plane is 1080P, the resolution ratio is 1:2, and the pixel difference of each pixel row containing a first target pixel point is converted into the pixel difference of the corresponding pixel row in the modulation plane. For instance, if the pixel differences of the first target pixel points are "0" for pixel point A, "2" for pixel point B, "4" for pixel point C, "2" for pixel point D and "0" for pixel point E, multiplying each difference by the resolution ratio gives pixel differences of "0", "1", "2", "1" and "0" in the corresponding pixel rows of the modulation plane, from which the second target pixel points on the modulation plane can be determined.
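A hedged sketch of that conversion, using the example values above (the per-row correspondence between camera rows and modulation-plane rows is handled separately and omitted here):

```python
# Example differences at the camera's second resolution (2160P), per pixel point.
camera_diffs = {"A": 0, "B": 2, "C": 4, "D": 2, "E": 0}

# Resolution ratio between the modulation plane (1080P) and the camera (2160P).
ratio = 1080 / 2160                                   # i.e. 1:2

# Multiplying each difference by the ratio gives the differences in the
# corresponding pixel rows of the modulation plane.
panel_diffs = {p: round(d * ratio) for p, d in camera_diffs.items()}
# -> {'A': 0, 'B': 1, 'C': 2, 'D': 1, 'E': 0}
```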
Sub-image (b) of fig. 11 shows the corrected image for a resolution ratio of 1:1, i.e., when the second resolution of the photographing device equals the first resolution of the modulation plane, so that the pixel points of the target captured image correspond one-to-one with the pixel points of the modulation plane. Pixel point B, pixel point C, pixel point F and pixel point D in sub-image (a) then correspond one-to-one with the black pixel points in sub-image (b), and those black pixel points are the second target pixel points.
In step 230, a corrected image is constructed from the second target pixel points, wherein opaque pixels are disposed on the second target pixel points of the corrected image.
Here, after the second target pixel points are determined on the modulation plane, the corrected image can be constructed from them. Specifically, an image with the size of the modulation plane is constructed, the second target pixel points in that image are set as opaque pixels, and all other pixel points are set to a transparent state, giving the corrected image shown in sub-image (b) of fig. 11 (white squares are transparent pixels and black squares are opaque pixels).
In step 240, the corrected image is superimposed with the projected image to obtain a target image.
Here, after the corrected image is constructed, it can be superimposed with the projection image to obtain the target image. The corrected image is in effect a mask layer on top of the projection image: during projection, the pixels of the projection image affected by the lens of the projection device are occluded by the opaque pixels of the corrected image, so the occluded pixels are not displayed in the projection picture, and the boundary of the projection picture formed when the target image is projected in the projection area appears as a straight line in the user's vision. The effect is shown in sub-image (c) of fig. 11, where the white pixel points are the projection picture and the black pixel points form the boundary, comparable to the black border of a curtain.
Thus, from the boundary curvature information determined from the target captured image, a corresponding corrected image can be constructed, and the target image built from the corrected image and the projection image, so that the boundary of the projection picture formed when the target image is projected in the projection area appears as a straight line in the user's vision. The "burr" problem at the boundary of the projection picture caused by the lens of the projection device can thus be eliminated without correcting the lens itself, greatly reducing the production cost of the projection device. Moreover, the corrected image can be constructed once and reused many times.
It should be noted that, in the above steps 210 to 240, the process of constructing the target image according to the resolution ratio is described in detail. In other embodiments, before step 1131 is performed, if the second resolution of the target captured image is inconsistent with the first resolution of the modulation plane, the second resolution of the target captured image may be converted to be consistent with the first resolution of the modulation plane according to the resolution ratio, so as to obtain a new target captured image. Then, based on the new target captured image, the above steps 1131 to 1133 are performed, and the obtained boundary curvature information of the first boundary line is actually in one-to-one correspondence with the modulation plane, and in step 220, the resolution ratio may be set to 1, so as to obtain the second target pixel point.
In some implementations, a transparency gradient process may be performed on pixels of the target image that are located in the edge region to obtain a new target image.
Here, pixels protruding from an otherwise smooth edge of the target image look particularly abrupt, giving the projection picture a noticeable jagged appearance and degrading the user's viewing experience. Applying transparency gradient processing to the pixels of the target image in the edge region makes the protruding pixels less abrupt and improves the display quality of the projection picture. Transparency gradient processing means grading the transparency of the pixels on the same boundary line along the direction of that line; alternatively, only the opaque pixels on the same boundary line may be processed.
It should be noted that when transparency gradient processing is applied to the opaque pixels, different gradient parameters n may be used, where n is a constant whose value can be chosen according to actual needs; in general n is 16 or 32, so that the abruptness of the protruding pixels is reduced without weakening the target image's correction of the "burr distortion".
Fig. 12 is a schematic diagram of a target image after gradient processing according to an exemplary embodiment. As shown in fig. 12, in sub-image (a) the pixels of the right boundary receive no transparency gradient processing and the right boundary appears rather abrupt. In sub-image (b) the pixels of the right boundary are processed with n = 8, in sub-image (c) with n = 16, and in sub-image (d) with n = 1080. It can be seen that the larger the value of n, the less abrupt the protruding pixels of the right boundary appear.
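A minimal sketch of one possible gradient, assuming Pillow and a linear alpha ramp along the boundary line (the run of opaque pixels, the image size and the exact ramp shape are all hypothetical, since the disclosure does not fix a formula for n):

```python
from PIL import Image

# Corrected image with a hypothetical run of opaque pixels on the right boundary.
correction = Image.new("RGBA", (1920, 1080), (0, 0, 0, 0))
run = [(1919, y) for y in range(400, 700)]

n = 16  # gradient constant from the description; 16 and 32 are typical choices
for i, (x, y) in enumerate(run):
    # Linear ramp along the boundary line: near-transparent at the start of the
    # run, fully opaque once n pixels have passed.
    alpha = min(255, round(255 * (i + 1) / n))
    correction.putpixel((x, y), (0, 0, 0, alpha))
```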
Next, a projection apparatus according to an embodiment of the present disclosure is described with reference to fig. 13.
Fig. 13 is a schematic diagram of module connection of a projection apparatus according to an exemplary embodiment. As shown in fig. 13, an embodiment of the present disclosure provides a projection apparatus 1300 including:
A determining module 1301 configured to determine boundary curvature information of a first boundary line of a target projection screen projected to a projection region by a projection apparatus in a forward projection state;
The projection module 1302 is configured to project the target image, wherein an edge region of the target image includes opaque pixels set according to the boundary curvature information, so that the boundary of the projection picture formed when the target image is projected in the projection area appears as a straight line in the user's vision.
Optionally, the determining module 1301 includes:
A control unit configured to control the projection apparatus to project a target projection screen to the projection area in a forward projection state;
An acquisition unit configured to acquire a target photographed image, wherein the target photographed image is obtained by photographing the projection region with a preset photographing condition by the photographing device;
And a curvature determination unit configured to determine boundary curvature information of the first boundary line of the target projection screen from the target captured image.
Optionally, the curvature determining unit includes:
A pixel unit configured to determine pixel coordinates of a pixel point located on a second boundary line of the target projection screen in the target photographed image;
A distance unit configured to take one target point on a second boundary line as a reference point for the second boundary line, and determine a pixel difference value between each pixel point and the reference point according to a pixel coordinate of each pixel point located on the second boundary line, wherein the pixel difference value includes a difference value between the pixel point and the reference point in a direction perpendicular to a vertical reference line formed by two vertexes of the second boundary line;
and a boundary determining unit configured to determine a pixel difference value between the pixel point included in the second boundary line and the reference point as boundary curvature information of the corresponding first boundary line.
Optionally, the projection module 1302 includes:
A shielding unit configured to determine a first target pixel point to be shielded on the target captured image according to the pixel difference value between the pixel points contained in the second boundary line and the reference point;
A conversion unit configured to determine a second target pixel point corresponding to the first target pixel point on the modulation plane according to a resolution ratio between a first resolution of the modulation plane of the projection apparatus and a second resolution of the target captured image;
An image construction unit configured to construct a corrected image according to the second target pixel point, wherein an opaque pixel is provided on the second target pixel point of the corrected image;
A superimposing unit configured to superimpose the corrected image and the projection image to obtain the target image, as sketched below.
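The following Python sketch folds the conversion, construction, and superimposition steps into one function for brevity; it assumes the target captured image has already been rectified so that a pure resolution scaling relates the camera plane to the modulation plane, and it treats an opaque pixel as black (which the projector renders as no light). Function and parameter names are assumptions of this sketch.

```python
import numpy as np

def build_target_image(projected: np.ndarray,
                       first_targets: list[tuple[int, int]],
                       cam_res: tuple[int, int],
                       mod_res: tuple[int, int]) -> np.ndarray:
    """Superimpose a correction mask onto the image to be projected.

    ``first_targets`` holds (x, y) pixel points to shield, expressed in
    the captured image; each is mapped to the modulation plane using the
    ratio between the two resolutions, and the corresponding pixel of
    the projected image is overwritten with an opaque (black) pixel.
    """
    sx = mod_res[0] / cam_res[0]   # horizontal resolution ratio
    sy = mod_res[1] / cam_res[1]   # vertical resolution ratio
    target = projected.copy()
    for x, y in first_targets:
        u, v = int(round(x * sx)), int(round(y * sy))  # second target pixel point
        if 0 <= u < mod_res[0] and 0 <= v < mod_res[1]:
            target[v, u] = 0       # opaque pixel: no light leaves the projector here
    return target
```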
Optionally, the shielding unit includes:
A difference value unit configured to determine the pixel coordinates of the standard pixel point corresponding to the minimum pixel difference value according to the pixel difference values between the pixel points contained in the second boundary line and the reference point;
A pixel positioning unit configured to determine the first target pixel points to be shielded on the target captured image according to the pixel coordinates of the standard pixel point, as sketched below.
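One plausible reading of these two units, sketched in Python: the standard pixel point is the innermost boundary point (minimum pixel difference), and every pixel lying between the vertical line through that point and the ragged actual edge is shielded, which is what straightens the visible boundary. This reuses the hypothetical boundary_curvature_info helper from the earlier sketch and produces the first_targets consumed by the sketch above; it assumes a right boundary, where a smaller x lies further inside the picture.

```python
import numpy as np

def pixels_to_mask(boundary_pts: np.ndarray) -> list[tuple[int, int]]:
    """First target pixel points to be shielded on the captured image."""
    diffs = boundary_curvature_info(boundary_pts)
    x_std = int(boundary_pts[np.argmin(diffs), 0])  # standard pixel point's x
    targets = []
    for x, y in boundary_pts:
        # Shield the run of pixels between the standard line and the actual edge.
        targets.extend((xi, int(y)) for xi in range(x_std, int(x) + 1))
    return targets
```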
Optionally, the acquisition unit includes:
A first photographing unit configured to take a first photographed image photographed by the photographing device as a target photographed image when the photographing device photographs the projection region under a target photographing condition, wherein the target photographing condition is that an optical axis of the photographing device is perpendicular to the projection region and passes through a center point of a target projection screen;
A second photographing unit configured to acquire a second photographed image photographed by the photographing device when the photographing device does not photograph the projection region under the target photographing condition;
A transformation unit configured to obtain the target photographed image according to a perspective transformation matrix, constructed from the target projection picture in the second photographed image and the original image corresponding to the target projection picture, in combination with the coordinate information of each vertex of the second photographed image; a sketch follows this list.
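A sketch of this rectification using OpenCV, assuming the four vertexes of the target projection picture have already been detected in the second photographed image and that the matching corner coordinates of the original image are known; the function name and the choice of output size are assumptions of this sketch.

```python
import cv2
import numpy as np

def rectify_captured_image(second_shot: np.ndarray,
                           picture_corners: np.ndarray,
                           original_corners: np.ndarray) -> np.ndarray:
    """Simulate the target shooting condition from an off-axis capture.

    ``picture_corners`` (4 x 2) are the vertexes of the target projection
    picture as detected in the second photographed image; ``original_corners``
    (4 x 2) are the matching vertexes of the original image. The homography
    between them removes the oblique viewing angle, and warping the capture
    with it yields the target photographed image.
    """
    m = cv2.getPerspectiveTransform(picture_corners.astype(np.float32),
                                    original_corners.astype(np.float32))
    h, w = second_shot.shape[:2]
    return cv2.warpPerspective(second_shot, m, (w, h))
```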
Optionally, the apparatus 1300 further includes:
A gradual change module configured to perform transparency gradient processing on the pixels of the target image in the edge region to obtain a new target image.
The specific implementation of each functional module of the above projection apparatus 1300 has been described in detail in the embodiments of the projection method and will not be repeated here.
Referring now to FIG. 14, a schematic diagram of a projection device 600 suitable for use in implementing embodiments of the present disclosure is shown. The projection device in the embodiment of the disclosure may be an independent device or a module that can be used in cooperation with other intelligent terminals. The projection device 600 illustrated in fig. 14 is merely an example and should not be construed as limiting the functionality and scope of use of the disclosed embodiments.
As shown in fig. 14, the projection apparatus 600 may include a processing device (e.g., a central processing unit, a graphics processor, etc.) 601, which may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 602 or a program loaded from a storage device 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data required for the operation of the projection apparatus 600 are also stored. The processing device 601, the ROM 602, and the RAM 603 are connected to each other through a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
In general, the following devices may be connected to the I/O interface 605: input devices 606 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, and the like; an output device 607 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 608 including, for example, magnetic tape, hard disk, etc.; and a communication device 609. The communication means 609 may allow the projection device 600 to communicate with other devices wirelessly or by wire to exchange data. While fig. 14 shows projection device 600 with various means, it should be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a non-transitory computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via communication means 609, or from storage means 608, or from ROM 602. The above-described functions defined in the methods of the embodiments of the present disclosure are performed when the computer program is executed by the processing device 601.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some embodiments, the projection device and the camera may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with digital data communication in any form or medium (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the projection device, or it may exist separately without being assembled into the projection device.
The computer readable medium carries one or more programs which, when executed by the projection device, cause the projection device to: determining boundary curvature information of a first boundary line of a target projection picture projected to a projection area by a projection device in a front projection state; and projecting the target image, wherein an edge region of the target image includes opaque pixels set according to the boundary curvature information so that a boundary of a projection screen in which the target image is projected in the projection region appears as a straight line in the user's vision.
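Purely as illustrative glue, the following lines chain the hypothetical helpers from the earlier sketches into the two steps just described. The stubs, corner coordinates, and 1080p resolutions are placeholders of this sketch, not part of the disclosure or any real device API.

```python
import numpy as np

# Placeholder stubs so the glue below runs; a real system would wire these
# to an edge detector and to the projector's light engine respectively.
def detect_right_boundary(img: np.ndarray) -> np.ndarray:
    h = img.shape[0]
    return np.stack([np.full(h, img.shape[1] - 1), np.arange(h)], axis=1)

def project(img: np.ndarray) -> None:
    print("projecting frame of shape", img.shape)

second_shot = np.zeros((1080, 1920, 3), np.uint8)       # stand-in capture
frame_to_project = np.zeros((1080, 1920, 4), np.uint8)  # RGBA frame to project
picture_corners = np.array([[100, 50], [1800, 60], [1790, 1020], [110, 1010]])
original_corners = np.array([[0, 0], [1919, 0], [1919, 1079], [0, 1079]])

captured = rectify_captured_image(second_shot, picture_corners, original_corners)
boundary_pts = detect_right_boundary(captured)
first_targets = pixels_to_mask(boundary_pts)
target = build_target_image(frame_to_project, first_targets,
                            cam_res=(1920, 1080), mod_res=(1920, 1080))
target = apply_edge_alpha_ramp(target, n=16)            # soften the masked edge
project(target)
```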
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages, including, but not limited to, object-oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules described in the embodiments of the present disclosure may be implemented in software or hardware. In some cases, the name of a module does not constitute a limitation of the module itself.
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing description is only of the preferred embodiments of the present disclosure and an explanation of the principles of the technology employed. It will be appreciated by persons skilled in the art that the scope of the disclosure is not limited to the specific combinations of features described above, but also covers other embodiments formed by any combination of the above features or their equivalents without departing from the spirit of the disclosure, for example, embodiments in which the above features are interchanged with (but not limited to) technical features having similar functions disclosed in the present disclosure.
Moreover, although operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are example forms of implementing the claims. The specific manner in which the various modules perform operations in the apparatus of the above embodiments has been described in detail in the embodiments of the method and will not be detailed here.

Claims (8)

1. A projection method, comprising:
determining boundary curvature information of a first boundary line of a target projection picture projected to a projection area by a projection device in a front projection state;
projecting a target image, wherein an edge region of the target image includes opaque pixels set according to the boundary curvature information, so that a boundary of a projection picture formed by projecting the target image in the projection region appears as a straight line in the user's vision;
the determining boundary curvature information of a first boundary line of a target projection picture projected to a projection area by a projection device in a front projection state includes:
Controlling the projection equipment to project a target projection picture to a projection area in a front projection state;
Acquiring a target shooting image, wherein the target shooting image is obtained by shooting the projection area by a shooting device under preset shooting conditions;
Determining boundary curvature information of a first boundary line of the target projection picture according to the target shooting image;
the determining boundary curvature information of the first boundary line of the target projection picture according to the target shooting image includes:
Determining pixel coordinates of pixel points positioned on a second boundary line of the target projection picture in the target shooting image;
Regarding the second boundary line, taking a target point on the second boundary line as a reference point, and determining a pixel difference value between each pixel point and the reference point according to pixel coordinates of each pixel point on the second boundary line, wherein the pixel difference value comprises a difference value between the pixel point and the reference point in a direction perpendicular to a vertical reference line formed by two vertexes of the second boundary line;
And determining a pixel difference value between the pixel point contained in the second boundary line and the reference point as boundary curvature information of the corresponding first boundary line.
2. The projection method according to claim 1, wherein the target image is obtained by:
determining a first target pixel point to be shielded on the target shooting image according to the pixel difference value between the pixel point contained in the second boundary line and the reference point;
Determining a second target pixel point corresponding to the first target pixel point on a modulation plane of the projection device according to a resolution ratio between a first resolution of the modulation plane and a second resolution of the target photographed image;
Constructing a correction image according to the second target pixel point, wherein the opaque pixels are arranged on the second target pixel point of the correction image;
and superposing the corrected image and the projection image to obtain the target image.
3. The projection method according to claim 2, wherein the determining a first target pixel point to be blocked on the target captured image according to a pixel difference between the pixel point included in the second boundary line and the reference point includes:
Determining the pixel coordinates of a standard pixel point corresponding to the minimum pixel difference value according to the pixel difference value between the pixel point contained in the second boundary line and the reference point;
and determining the first target pixel point to be shielded on the target shooting image according to the pixel coordinates of the standard pixel point.
4. The projection method according to claim 1, wherein the acquiring the target photographed image includes:
When the shooting device shoots the projection area under a target shooting condition, taking a first shooting image shot by the shooting device as a target shooting image, wherein the target shooting condition is that an optical axis of the shooting device is perpendicular to the projection area and passes through a center point of a target projection picture;
acquiring a second shooting image shot by the shooting device when the shooting device does not shoot the projection area under the target shooting condition;
and obtaining the target shooting image according to a perspective transformation matrix, constructed from the target projection picture in the second shooting image and the original image corresponding to the target projection picture, in combination with the coordinate information of each vertex of the second shooting image.
5. The projection method according to any one of claims 1 to 4, characterized in that the method further comprises:
and performing transparency gradient processing on pixels of the target image in the edge area to obtain a new target image.
6. A projection apparatus, comprising:
a determining module configured to determine boundary curvature information of a first boundary line of a target projection screen projected by the projection apparatus toward the projection area in a forward projection state;
A projection module configured to project a target image, wherein an edge region of the target image includes opaque pixels set according to the boundary curvature information so that a boundary of a projection screen in which the target image is projected in the projection region appears as a straight line in a user's vision;
the determining module includes:
A control unit configured to control the projection apparatus to project a target projection screen to the projection area in a forward projection state;
An acquisition unit configured to acquire a target photographed image, wherein the target photographed image is obtained by photographing the projection area with a preset photographing condition by a photographing device;
A curvature determination unit configured to determine boundary curvature information of a first boundary line of the target projection screen from the target captured image;
The curvature determination unit includes:
A pixel unit configured to determine pixel coordinates of a pixel point located on a second boundary line of a target projection screen in the target photographed image;
A distance unit configured to take one target point on the second boundary line as a reference point for the second boundary line, and determine a pixel difference value between each pixel point and the reference point according to a pixel coordinate of each pixel point located on the second boundary line, wherein the pixel difference value includes a difference value between the pixel point and the reference point in a direction perpendicular to a vertical reference line formed by two vertexes of the second boundary line;
And a boundary determining unit configured to determine a pixel difference value between the pixel point included in the second boundary line and the reference point as boundary curvature information of the corresponding first boundary line.
7. A computer readable storage medium, on which a computer program is stored, characterized in that the program, when being executed by a processing device, carries out the steps of the method according to any one of claims 1-5.
8. A projection device, comprising:
a storage device having a computer program stored thereon;
Processing means for executing said computer program in said storage means to carry out the steps of the method according to any one of claims 1-5.
CN202210094719.0A 2022-01-26 2022-01-26 Projection method, projection device, storage medium and projection apparatus Active CN114401388B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210094719.0A CN114401388B (en) 2022-01-26 2022-01-26 Projection method, projection device, storage medium and projection apparatus

Publications (2)

Publication Number Publication Date
CN114401388A CN114401388A (en) 2022-04-26
CN114401388B true CN114401388B (en) 2024-07-23

Family

ID=81232216

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210094719.0A Active CN114401388B (en) 2022-01-26 2022-01-26 Projection method, projection device, storage medium and projection apparatus

Country Status (1)

Country Link
CN (1) CN114401388B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116546175B (en) * 2023-06-01 2023-10-31 深圳创疆网络科技有限公司 Intelligent control method and device for realizing projector based on automatic induction

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111738955A (en) * 2020-06-23 2020-10-02 安徽海微电光电科技有限责任公司 Distortion correction method and device for projected image and computer readable storage medium
CN112449167A (en) * 2020-11-13 2021-03-05 深圳市火乐科技发展有限公司 Image sawtooth elimination and image display method and device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5642457B2 (en) * 2010-08-31 2014-12-17 オリンパス株式会社 Display control apparatus and display control method
US8624911B1 (en) * 2011-01-05 2014-01-07 Google Inc. Texture-based polygon antialiasing
CN103136723A (en) * 2011-11-29 2013-06-05 方正国际软件(北京)有限公司 Image sawtooth removing method and system
CN113709431A (en) * 2021-07-26 2021-11-26 深圳市金研微科技有限公司 Apparatus and method for automatically correcting projection picture
CN113489961B (en) * 2021-09-08 2022-03-22 深圳市火乐科技发展有限公司 Projection correction method, projection correction device, storage medium and projection equipment

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111738955A (en) * 2020-06-23 2020-10-02 安徽海微电光电科技有限责任公司 Distortion correction method and device for projected image and computer readable storage medium
CN112449167A (en) * 2020-11-13 2021-03-05 深圳市火乐科技发展有限公司 Image sawtooth elimination and image display method and device

Also Published As

Publication number Publication date
CN114401388A (en) 2022-04-26

Similar Documents

Publication Publication Date Title
CN110336987B (en) Projector distortion correction method and device and projector
CN111353948B (en) Image noise reduction method, device and equipment
JP6037375B2 (en) Image projection apparatus and image processing method
CN112272292B (en) Projection correction method, apparatus and storage medium
WO2006100991A1 (en) Method of and apparatus for automatically adjusting alignement of a projector with respect to a projection screen
CN114125411B (en) Projection device correction method, projection device correction device, storage medium and projection device
JP2015060012A (en) Image processing system, image processing device, image processing method and image processing program as well as display system
JP2008288714A (en) Video projection system
JP2015097350A (en) Image processing apparatus and multi-projection system
TWI698127B (en) Projection system and projection method
WO2022242306A1 (en) Laser projection system, image correction method, and laser projection device
CN114401388B (en) Projection method, projection device, storage medium and projection apparatus
CN114449249B (en) Image projection method, image projection device, storage medium and projection apparatus
TWI443604B (en) Image correction method and image correction apparatus
CN114071104A (en) Method for realizing multi-projector projection gradual change fusion based on shader
CN202841396U (en) Digital film optimization device and digital film projection system
WO2024119619A1 (en) Correction method and apparatus for picture captured underwater, and storage medium
WO2021145913A1 (en) Estimating depth based on iris size
JP2000081593A (en) Projection type display device and video system using the same
CN114979600B (en) Laser projection apparatus and correction method of projection image
CN112073700A (en) Projection correction system and projection correction method thereof
CN114760451B (en) Projection image correction prompting method, projection image correction prompting device, projection equipment and storage medium
JP2003143621A (en) Projector with built-in circuit for correcting color and luminance unevenness
US20220292652A1 (en) Image generation method and information processing device
CN114339179A (en) Projection correction method, projection correction device, storage medium and projection equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant